Sample records for imaging interaction toolkit

  1. Implementation of an oblique-sectioning visualization tool for line-of-sight stereotactic neurosurgical navigation using the AVW toolkit

    NASA Astrophysics Data System (ADS)

    Bates, Lisa M.; Hanson, Dennis P.; Kall, Bruce A.; Meyer, Frederic B.; Robb, Richard A.

    1998-06-01

    An important clinical application of biomedical imaging and visualization techniques is the provision of image-guided neurosurgical planning and navigation using interactive computer display systems in the operating room. Current systems provide interactive display of orthogonal images and 3D surface or volume renderings integrated with and guided by the location of a surgical probe. However, structures in the 'line-of-sight' path leading to the surgical target cannot be directly visualized, making it difficult to obtain a full understanding of the 3D volumetric anatomic relationships necessary for effective neurosurgical navigation below the cortical surface. Complex vascular relationships and histologic boundaries, such as those found in arteriovenous malformations (AVMs), also contribute to the difficulty of determining optimal approaches prior to actual surgical intervention. These difficulties demonstrate the need for interactive oblique imaging methods to provide 'line-of-sight' visualization. Capabilities for 'line-of-sight' interactive oblique sectioning are present in several current neurosurgical navigation systems. However, our implementation is novel in that it utilizes a completely independent software toolkit, AVW (A Visualization Workshop), developed at the Mayo Biomedical Imaging Resource, integrated with a current neurosurgical navigation system, the COMPASS stereotactic system at Mayo Foundation. The toolkit is a comprehensive, C-callable imaging toolkit containing over 500 optimized imaging functions and structures. The powerful functionality and versatility of the AVW imaging toolkit allowed facile integration and implementation of the desired interactive oblique sectioning using a compact set of functions. The implementation of the AVW-based code resulted in higher-level functions for complete 'line-of-sight' visualization.
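
    AVW itself is a C-callable library, so the following is only a minimal NumPy/SciPy sketch of the underlying operation an oblique 'line-of-sight' section performs: resampling a plane through the volume that contains the probe direction. The volume, probe position, and direction below are made-up placeholders, not data or code from the paper.

      # Minimal sketch (not AVW/COMPASS code): resample an oblique plane from a
      # volume along a "line of sight" defined by a probe tip and direction.
      import numpy as np
      from scipy.ndimage import map_coordinates

      def oblique_slice(volume, origin, u, v, size=256, spacing=1.0):
          """Sample a size x size plane spanned by unit vectors u and v around origin."""
          u = np.asarray(u, float); u /= np.linalg.norm(u)
          v = np.asarray(v, float); v /= np.linalg.norm(v)
          r = (np.arange(size) - size / 2) * spacing
          grid_u, grid_v = np.meshgrid(r, r, indexing="ij")
          # Volume (array-index) coordinates of every pixel in the oblique plane.
          coords = (np.asarray(origin, float)[:, None, None]
                    + u[:, None, None] * grid_u
                    + v[:, None, None] * grid_v)
          return map_coordinates(volume, coords, order=1, mode="nearest")

      # Example: a plane containing the probe axis (the line of sight).
      vol = np.random.rand(128, 128, 128).astype(np.float32)   # stand-in for CT/MR data
      probe_tip = (64, 64, 64)                                  # hypothetical probe position
      sight = np.array([0.2, 0.3, 0.93])                        # hypothetical probe direction
      in_plane = np.cross(sight, [0.0, 0.0, 1.0])               # any vector not parallel to sight
      section = oblique_slice(vol, probe_tip, sight, in_plane)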

  2. [Research on Three-dimensional Medical Image Reconstruction and Interaction Based on HTML5 and Visualization Toolkit].

    PubMed

    Gao, Peng; Liu, Peng; Su, Hongsen; Qiao, Liang

    2015-04-01

    By integrating the Visualization Toolkit with the interaction, bidirectional communication, and graphics rendering capabilities provided by HTML5, we explored and experimented on the feasibility of remote medical image reconstruction and interaction purely in the Web. We proposed a server-centric method that did not require downloading large medical datasets to the local client, avoiding concerns about network transmission load and the three-dimensional (3D) rendering capability of client hardware. The method integrated remote medical image reconstruction and interaction into the Web seamlessly and was applicable to lower-end computers and mobile devices. Finally, we tested this method over the Internet and achieved real-time performance. This Web-based 3D reconstruction and interaction method, which works across Internet terminals and performance-limited devices, may be useful for remote medical assistance.

  3. MITK-OpenIGTLink for combining open-source toolkits in real-time computer-assisted interventions.

    PubMed

    Klemm, Martin; Kirchner, Thomas; Gröhl, Janek; Cheray, Dominique; Nolden, Marco; Seitel, Alexander; Hoppe, Harald; Maier-Hein, Lena; Franz, Alfred M

    2017-03-01

    Due to rapid developments in the research areas of medical imaging, medical image processing and robotics, computer-assisted interventions (CAI) are becoming an integral part of modern patient care. From a software engineering point of view, these systems are highly complex and research can benefit greatly from reusing software components. This is supported by a number of open-source toolkits for medical imaging and CAI such as the medical imaging interaction toolkit (MITK), the public software library for ultrasound imaging research (PLUS) and 3D Slicer. An independent inter-toolkit communication such as the open image-guided therapy link (OpenIGTLink) can be used to combine the advantages of these toolkits and enable an easier realization of a clinical CAI workflow. MITK-OpenIGTLink is presented as a network interface within MITK that allows easy to use, asynchronous two-way messaging between MITK and clinical devices or other toolkits. Performance and interoperability tests with MITK-OpenIGTLink were carried out considering the whole CAI workflow from data acquisition over processing to visualization. We present how MITK-OpenIGTLink can be applied in different usage scenarios. In performance tests, tracking data were transmitted with a frame rate of up to 1000 Hz and a latency of 2.81 ms. Transmission of images with typical ultrasound (US) and greyscale high-definition (HD) resolutions of [Formula: see text] and [Formula: see text] is possible at up to 512 and 128 Hz, respectively. With the integration of OpenIGTLink into MITK, this protocol is now supported by all established open-source toolkits in the field. This eases interoperability between MITK and toolkits such as PLUS or 3D Slicer and facilitates cross-toolkit research collaborations. MITK and its submodule MITK-OpenIGTLink are provided open source under a BSD-style licence ( http://mitk.org ).

  4. Integrating the visualization concept of the medical imaging interaction toolkit (MITK) into the XIP-Builder visual programming environment

    NASA Astrophysics Data System (ADS)

    Wolf, Ivo; Nolden, Marco; Schwarz, Tobias; Meinzer, Hans-Peter

    2010-02-01

    The Medical Imaging Interaction Toolkit (MITK) and the eXtensible Imaging Platform (XIP) both aim at facilitating the development of medical imaging applications, but provide support on different levels. MITK offers support from the toolkit level, whereas XIP comes with a visual programming environment. XIP is strongly based on Open Inventor. Open Inventor with its scene graph-based rendering paradigm was not specifically designed for medical imaging, but focuses on creating dedicated visualizations. MITK has a visualization concept with a model-view-controller-like design that assists in implementing multiple, consistent views on the same data, which is typically required in medical imaging. In addition, MITK defines a unified means of describing position, orientation, bounds, and (if required) local deformation of data and views, supporting e.g. images acquired with gantry tilt and curved reformations. The actual rendering is largely delegated to the Visualization Toolkit (VTK). This paper presents an approach to integrating the visualization concept of MITK with XIP, especially into the XIP-Builder. This is a first step toward combining the advantages of both platforms. It enables experimenting with algorithms in the XIP visual programming environment without requiring a detailed understanding of Open Inventor. Using MITK-based add-ons to XIP, any number of data objects (images, surfaces, etc.) produced by algorithms can simply be added to an MITK DataStorage object and rendered into any number of slice-based (2D) or 3D views. Both MITK and XIP are open-source C++ platforms. The extensions presented in this paper will be available from www.mitk.org.

  5. Analyzing microtomography data with Python and the scikit-image library.

    PubMed

    Gouillart, Emmanuelle; Nunez-Iglesias, Juan; van der Walt, Stéfan

    2017-01-01

    The exploration and processing of images is a vital aspect of the scientific workflows of many X-ray imaging modalities. Users require tools that combine interactivity, versatility, and performance. scikit-image is an open-source image processing toolkit for the Python language that supports a large variety of file formats and is compatible with 2D and 3D images. The toolkit exposes a simple programming interface, with thematic modules grouping functions according to their purpose, such as image restoration, segmentation, and measurements. scikit-image users benefit from a rich scientific Python ecosystem that contains many powerful libraries for tasks such as visualization or machine learning. scikit-image combines a gentle learning curve, versatile image processing capabilities, and the scalable performance required for the high-throughput analysis of X-ray imaging data.
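
    The abstract above describes the toolkit's thematic modules (restoration, segmentation, measurements); the hedged sketch below strings three of them together on a synthetic 3D volume to show the style of the API. All data here is generated on the fly, so nothing about the authors' microtomography workflow is implied.

      # Minimal scikit-image workflow on a synthetic 3D volume: denoise, threshold,
      # clean up, label, and measure.
      import numpy as np
      from skimage import filters, measure, morphology, restoration

      rng = np.random.default_rng(0)
      volume = rng.normal(0.0, 0.1, (64, 64, 64))
      volume[20:44, 20:44, 20:44] += 1.0                                 # a bright cubic "phase" to segment

      denoised = restoration.denoise_tv_chambolle(volume, weight=0.1)   # restoration
      binary = denoised > filters.threshold_otsu(denoised)              # segmentation
      binary = morphology.remove_small_objects(binary, min_size=64)
      labels = measure.label(binary)                                    # connected components
      for region in measure.regionprops(labels):                        # measurements
          print(region.label, region.area, region.centroid)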

  6. The Medical Imaging Interaction Toolkit: challenges and advances : 10 years of open-source development.

    PubMed

    Nolden, Marco; Zelzer, Sascha; Seitel, Alexander; Wald, Diana; Müller, Michael; Franz, Alfred M; Maleike, Daniel; Fangerau, Markus; Baumhauer, Matthias; Maier-Hein, Lena; Maier-Hein, Klaus H; Meinzer, Hans-Peter; Wolf, Ivo

    2013-07-01

    The Medical Imaging Interaction Toolkit (MITK) has been available as open-source software for almost 10 years now. In this period the requirements of software systems in the medical image processing domain have become increasingly complex. The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control. MITK provides modularization and extensibility on different levels. In addition to the original toolkit, a module system, micro services for small, system-wide features, a service-oriented architecture based on the Open Services Gateway initiative (OSGi) standard, and an extensible and configurable application framework allow MITK to be used, extended and deployed as needed. A refined software process was implemented to deliver high-quality software, ease the fulfillment of regulatory requirements, and enable teamwork in mixed-competence teams. MITK has been applied by a worldwide community and integrated into a variety of solutions, either at the toolkit level or as an application framework with custom extensions. The MITK Workbench has been released as a highly extensible and customizable end-user application. Optional support for tool tracking, image-guided therapy, diffusion imaging as well as various external packages (e.g. CTK, DCMTK, OpenCV, SOFA, Python) is available. MITK has also been used in several FDA/CE-certified applications, which demonstrates the high-quality software and rigorous development process. MITK provides a versatile platform with a high degree of modularization and interoperability and is well suited to meet the challenging tasks of today's and tomorrow's clinically motivated research.

  7. Cancer Imaging Phenomics Toolkit (CaPTK) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    CaPTk is a tool that facilitates translation of highly sophisticated methods that help us gain a comprehensive understanding of the underlying mechanisms of cancer from medical imaging research to the clinic. It replicates basic interactive functionalities of radiological workstations and is distributed under a BSD-style license.

  8. An interactive toolkit to extract phenological time series data from digital repeat photography

    NASA Astrophysics Data System (ADS)

    Seyednasrollah, B.; Milliman, T. E.; Hufkens, K.; Kosmala, M.; Richardson, A. D.

    2017-12-01

    Near-surface remote sensing and in situ photography are powerful tools to study how climate change and climate variability influence vegetation phenology and the associated seasonal rhythms of green-up and senescence. The rapidly-growing PhenoCam network has been using in situ digital repeat photography to study phenology in almost 500 locations around the world, with an emphasis on North America. However, extracting time series data from multiple years of half-hourly imagery - while each set of images may contain several regions of interest (ROIs), corresponding to different species or vegetation types - is not always straightforward. Large volumes of data require substantial processing time, and changes (either intentional or accidental) in camera field of view require adjustment of ROI masks. Here, we introduce and present "DrawROI" as an interactive web-based application for imagery from PhenoCam. DrawROI can also be used offline, as a fully independent toolkit that significantly facilitates extraction of phenological data from any stack of digital repeat photography images. DrawROI provides a responsive environment for phenological scientists to interactively a) delineate ROIs, b) handle field of view (FOV) shifts, and c) extract and export time series data characterizing image color (i.e. red, green and blue channel digital numbers for the defined ROI). The application utilizes artificial intelligence and advanced machine learning techniques and gives users the opportunity to redraw new ROIs every time an FOV shift occurs. DrawROI also offers a quality control flag to indicate noisy data and images of low quality due to the presence of fog or snow. The web-based application significantly accelerates the process of creating new ROIs and modifying pre-existing ROIs in the PhenoCam database. The offline toolkit is presented as an open-source R package that can be used with similar time-lapse photography datasets to obtain more data for studying phenology for a large community of ecologists. We will illustrate the use of the toolkit using imagery from a selection of sites within the National Ecological Observatory Network (NEON).
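
    DrawROI itself is distributed as an R package with a web front end; the Python fragment below is only a conceptual sketch of the core quantity it exports: the mean red, green, and blue digital numbers inside a fixed ROI mask for a time-ordered stack of images. The file names and the mask file are hypothetical.

      # Conceptual sketch (not the DrawROI package): per-channel ROI means over an
      # image time series.
      import glob
      import numpy as np
      from PIL import Image

      mask = np.load("roi_mask.npy").astype(bool)          # True inside the ROI

      rows = []
      for path in sorted(glob.glob("phenocam_site/*.jpg")):
          rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
          means = [rgb[..., c][mask].mean() for c in range(3)]
          rows.append((path, *means))

      for path, r, g, b in rows:
          print(f"{path}: R={r:.1f} G={g:.1f} B={b:.1f}")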

  9. The Ames MER microscopic imager toolkit

    USGS Publications Warehouse

    Sargent, R.; Deans, Matthew; Kunz, C.; Sims, M.; Herkenhoff, K.

    2005-01-01

    The Mars Exploration Rovers, Spirit and Opportunity, have spent several successful months on Mars, returning gigabytes of images and spectral data to scientists on Earth. One of the instruments on the MER rovers, the Athena Microscopic Imager (MI), is a fixed focus, megapixel camera providing a ±3 mm depth of field and a 31×31 mm field of view at a working distance of 63 mm from the lens to the object being imaged. In order to maximize the science return from this instrument, we developed the Ames MI Toolkit and supported its use during the primary mission. The MI Toolkit is a set of programs that operate on collections of MI images, with the goal of making the data more understandable to the scientists on the ground. Because of the limited depth of field of the camera, and the often highly variable topography of the terrain being imaged, MI images of a given rock are often taken as a stack, with the Instrument Deployment Device (IDD) moving along a computed normal vector, pausing every few millimeters for the MI to acquire an image. The MI Toolkit provides image registration and focal section merging, which combine these images to form a single, maximally in-focus image, while compensating for changes in lighting as well as parallax due to the motion of the camera. The MI Toolkit also provides a 3-D reconstruction of the surface being imaged using stereo and can embed 2-D MI images as texture maps into 3-D meshes produced by other imagers on board the rover to provide context. The 2-D images and 3-D meshes output from the Toolkit are easily viewed by scientists using other mission tools, such as Viz or the MI Browser. This paper describes the MI Toolkit in detail, as well as our experience using it with scientists at JPL during the primary MER mission. © 2005 IEEE.
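
    The focal section merging step described above can be illustrated independently of the Ames toolkit (which also handles registration, lighting changes, and parallax). The sketch below assumes an already-registered focus stack and keeps each pixel from the slice where a local variance-of-Laplacian sharpness measure is highest; the stack here is synthetic.

      # Conceptual focal-section-merging (focus stacking) sketch, not the MI Toolkit.
      import numpy as np
      from scipy.ndimage import laplace, uniform_filter

      def merge_focus_stack(stack, window=9):
          """stack: (n_slices, H, W) float array of registered images."""
          sharpness = np.empty_like(stack)
          for i, img in enumerate(stack):
              lap = laplace(img)
              sharpness[i] = uniform_filter(lap * lap, size=window)  # local Laplacian energy
          best = np.argmax(sharpness, axis=0)                        # sharpest slice per pixel
          rows, cols = np.indices(best.shape)
          return stack[best, rows, cols]

      stack = np.random.rand(7, 256, 256)       # stand-in for MI images taken along the normal
      merged = merge_focus_stack(stack)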

  10. The Ames MER Microscopic Imager Toolkit

    NASA Technical Reports Server (NTRS)

    Sargent, Randy; Deans, Matthew; Kunz, Clayton; Sims, Michael; Herkenhoff, Ken

    2005-01-01

    The Mars Exploration Rovers, Spirit and Opportunity, have spent several successful months on Mars, returning gigabytes of images and spectral data to scientists on Earth. One of the instruments on the MER rovers, the Athena Microscopic Imager (MI), is a fixed focus, megapixel camera providing a ±3 mm depth of field and a 31×31 mm field of view at a working distance of 63 mm from the lens to the object being imaged. In order to maximize the science return from this instrument, we developed the Ames MI Toolkit and supported its use during the primary mission. The MI Toolkit is a set of programs that operate on collections of MI images, with the goal of making the data more understandable to the scientists on the ground. Because of the limited depth of field of the camera, and the often highly variable topography of the terrain being imaged, MI images of a given rock are often taken as a stack, with the Instrument Deployment Device (IDD) moving along a computed normal vector, pausing every few millimeters for the MI to acquire an image. The MI Toolkit provides image registration and focal section merging, which combine these images to form a single, maximally in-focus image, while compensating for changes in lighting as well as parallax due to the motion of the camera. The MI Toolkit also provides a 3-D reconstruction of the surface being imaged using stereo and can embed 2-D MI images as texture maps into 3-D meshes produced by other imagers on board the rover to provide context. The 2-D images and 3-D meshes output from the Toolkit are easily viewed by scientists using other mission tools, such as Viz or the MI Browser. This paper describes the MI Toolkit in detail, as well as our experience using it with scientists at JPL during the primary MER mission.

  11. A system for rapid prototyping of hearts with congenital malformations based on the medical imaging interaction toolkit (MITK)

    NASA Astrophysics Data System (ADS)

    Wolf, Ivo; Böttger, Thomas; Rietdorf, Urte; Maleike, Daniel; Greil, Gerald; Sieverding, Ludger; Miller, Stephan; Mottl-Link, Sibylle; Meinzer, Hans-Peter

    2006-03-01

    Precise knowledge of the individual cardiac anatomy is essential for diagnosis and treatment of congenital heart disease. Complex malformations of the heart can best be comprehended not from images but from anatomic specimens. Physical models can be created from data using rapid prototyping techniques, e.g., laser sintering or 3D printing. We have developed a system for obtaining data that show the relevant cardiac anatomy from high-resolution CT/MR images and are suitable for rapid prototyping. The challenge is to preserve all relevant details unaltered in the produced models. The main anatomical structures of interest are the four heart cavities (atria, ventricles), the valves and the septum separating the cavities, and the great vessels. These can be shown either by reproducing the morphology itself or by producing a model of the blood-pool, thus creating a negative of the morphology. Algorithmically the key issue is segmentation. Practically, possibilities allowing the cardiologist or cardiac surgeon to interactively check and correct the segmentation are even more important due to the complex, irregular anatomy and imaging artefacts. The paper presents the algorithmic and interactive processing steps implemented in the system, which is based on the open-source Medical Imaging Interaction Toolkit (MITK, www.mitk.org). It is shown how the principles used in MITK enable the system to be assembled from modules (functionalities) developed independently of each other. The system allows models of the heart (and other anatomic structures) of individual patients to be produced, as well as unique specimens from pathology collections to be reproduced for teaching purposes.
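
    The system described above is built on MITK; as an independent, hedged illustration of the final step (turning a binary blood-pool segmentation into a triangle surface that a rapid prototyping workflow could consume), the sketch below uses scikit-image marching cubes and writes a plain Wavefront OBJ file. The input mask file is hypothetical, and a real model would pass the true voxel spacing so the printed part is correctly scaled.

      # Minimal segmentation-to-surface sketch (independent of the MITK-based system).
      import numpy as np
      from skimage import measure

      segmentation = np.load("blood_pool_mask.npy")          # hypothetical binary volume
      verts, faces, _, _ = measure.marching_cubes(segmentation.astype(np.uint8),
                                                  level=0.5,
                                                  spacing=(1.0, 1.0, 1.0))  # use real voxel size here

      with open("blood_pool.obj", "w") as f:
          for x, y, z in verts:
              f.write(f"v {x} {y} {z}\n")
          for a, b, c in faces + 1:                          # OBJ indices are 1-based
              f.write(f"f {a} {b} {c}\n")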

  12. Fluorescence Behavioral Imaging (FBI) Tracks Identity in Heterogeneous Groups of Drosophila

    PubMed Central

    Ramdya, Pavan; Schaffter, Thomas; Floreano, Dario; Benton, Richard

    2012-01-01

    Distinguishing subpopulations in group behavioral experiments can reveal the impact of differences in genetic, pharmacological and life-histories on social interactions and decision-making. Here we describe Fluorescence Behavioral Imaging (FBI), a toolkit that uses transgenic fluorescence to discriminate subpopulations, imaging hardware that simultaneously records behavior and fluorescence expression, and open-source software for automated, high-accuracy determination of genetic identity. Using FBI, we measure courtship partner choice in genetically mixed groups of Drosophila. PMID:23144871

  13. Fluorescence behavioral imaging (FBI) tracks identity in heterogeneous groups of Drosophila.

    PubMed

    Ramdya, Pavan; Schaffter, Thomas; Floreano, Dario; Benton, Richard

    2012-01-01

    Distinguishing subpopulations in group behavioral experiments can reveal the impact of differences in genetic, pharmacological and life-histories on social interactions and decision-making. Here we describe Fluorescence Behavioral Imaging (FBI), a toolkit that uses transgenic fluorescence to discriminate subpopulations, imaging hardware that simultaneously records behavior and fluorescence expression, and open-source software for automated, high-accuracy determination of genetic identity. Using FBI, we measure courtship partner choice in genetically mixed groups of Drosophila.

  14. Application development environment for advanced digital workstations

    NASA Astrophysics Data System (ADS)

    Valentino, Daniel J.; Harreld, Michael R.; Liu, Brent J.; Brown, Matthew S.; Huang, Lu J.

    1998-06-01

    One remaining barrier to the clinical acceptance of electronic imaging and information systems is the difficulty in providing intuitive access to the information needed for a specific clinical task (such as reaching a diagnosis or tracking clinical progress). The purpose of this research was to create a development environment that enables the design and implementation of advanced digital imaging workstations. We used formal data and process modeling to identify the diagnostic and quantitative data that radiologists use and the tasks that they typically perform to make clinical decisions. We studied a diverse range of radiology applications, including diagnostic neuroradiology in an academic medical center, pediatric radiology in a children's hospital, screening mammography in a breast cancer center, and thoracic radiology consultation for an oncology clinic. We used object-oriented analysis to develop software toolkits that enable a programmer to rapidly implement applications that closely match clinical tasks. The toolkits support browsing patient information, integrating patient images and reports, manipulating images, and making quantitative measurements on images. Collectively, we refer to these toolkits as the UCLA Digital ViewBox toolkit (ViewBox/Tk). We used the ViewBox/Tk to rapidly prototype and develop a number of diverse medical imaging applications. Our task-based toolkit approach enabled rapid and iterative prototyping of workstations that matched clinical tasks. The toolkit functionality and performance provided a 'hands-on' feeling for manipulating images, and for accessing textual information and reports. The toolkits directly support a new concept for protocol-based reading of diagnostic studies. The design supports the implementation of network-based application services (e.g., prefetching, workflow management, and post-processing) that will facilitate the development of future clinical applications.

  15. An open source toolkit for medical imaging de-identification.

    PubMed

    González, David Rodríguez; Carpenter, Trevor; van Hemert, Jano I; Wardlaw, Joanna

    2010-08-01

    Medical imaging acquired for clinical purposes can have several legitimate secondary uses in research projects and teaching libraries. No commonly accepted solution for anonymising these images exists because the amount of personal data that should be preserved varies case by case. Our objective is to provide a flexible mechanism for anonymising Digital Imaging and Communications in Medicine (DICOM) data that meets the requirements for deployment in multicentre trials. We reviewed our current de-identification practices and defined the relevant use cases to extract the requirements for the de-identification process. We then used these requirements in the design and implementation of the toolkit. Finally, we tested the toolkit taking as a reference those requirements, including a multicentre deployment. The toolkit successfully anonymised DICOM data from various sources. Furthermore, it was shown that it could forward anonymous data to remote destinations, remove burned-in annotations, and add tracking information to the header. The toolkit also implements the DICOM standard confidentiality mechanism. A DICOM de-identification toolkit that facilitates the enforcement of privacy policies was developed. It is highly extensible, provides the necessary flexibility to account for different de-identification requirements and has a low adoption barrier for new users.
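
    The toolkit described above is a full DICOM application with forwarding, burned-in annotation removal, and the standard confidentiality mechanism; the fragment below is only a minimal, hedged sketch of the basic attribute-level step using pydicom. The tag list, pseudonym, and file names are illustrative, not the toolkit's own configuration.

      # Minimal DICOM de-identification sketch with pydicom (not the toolkit above).
      import pydicom

      IDENTIFYING = ["PatientName", "PatientID", "PatientBirthDate",
                     "PatientAddress", "ReferringPhysicianName"]

      def anonymise(in_path, out_path, pseudonym="ANON-0001"):
          ds = pydicom.dcmread(in_path)
          for keyword in IDENTIFYING:
              if keyword in ds:
                  ds.data_element(keyword).value = ""      # blank the attribute
          ds.PatientName = pseudonym                        # tracking pseudonym
          ds.PatientID = pseudonym
          ds.remove_private_tags()                          # drop vendor private tags
          ds.save_as(out_path)

      anonymise("input.dcm", "anon.dcm")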

  16. JAVA Stereo Display Toolkit

    NASA Technical Reports Server (NTRS)

    Edmonds, Karina

    2008-01-01

    This toolkit provides a common interface for displaying graphical user interface (GUI) components in stereo using either specialized stereo display hardware (e.g., liquid crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard workstation displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience without sacrificing high-quality display on dedicated hardware. The toolkit is written in Java for use with the Swing GUI Toolkit and has cross-platform compatibility. It hooks into the graphics system, allowing any standard Swing component to be displayed in stereo. It uses the OpenGL graphics library to control the stereo hardware and to perform the rendering. It also supports anaglyph and special stereo hardware using the same API (application programming interface), and has the ability to simulate color stereo in anaglyph mode by combining the red band of the left image with the green/blue bands of the right image. This is a low-level toolkit that simply accomplishes the display of components (including the JadeDisplay image display component). It does not include higher-level functions such as disparity adjustment, a 3D cursor, or overlays, all of which can be built using this toolkit.
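
    The toolkit itself is Java/Swing with OpenGL; the fragment below only illustrates, in NumPy, the colour-anaglyph combination described above (red band from the left-eye image, green and blue bands from the right-eye image). It is not the toolkit's own code.

      # Sketch of the colour-anaglyph combination described in the record above.
      import numpy as np

      def anaglyph(left_rgb, right_rgb):
          """left_rgb, right_rgb: (H, W, 3) uint8 arrays of the two eye views."""
          out = np.empty_like(left_rgb)
          out[..., 0] = left_rgb[..., 0]        # red channel from the left view
          out[..., 1:] = right_rgb[..., 1:]     # green and blue from the right view
          return out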

  17. New Software Developments for Quality Mesh Generation and Optimization from Biomedical Imaging Data

    PubMed Central

    Yu, Zeyun; Wang, Jun; Gao, Zhanheng; Xu, Ming; Hoshijima, Masahiko

    2013-01-01

    In this paper we present a new software toolkit for generating and optimizing surface and volumetric meshes from three-dimensional (3D) biomedical imaging data, targeted at image-based finite element analysis of some biomedical activities in a single material domain. Our toolkit includes a series of geometric processing algorithms including surface re-meshing and quality-guaranteed tetrahedral mesh generation and optimization. All methods described have been encapsulated into a user-friendly graphical interface for easy manipulation and informative visualization of biomedical images and mesh models. Numerous examples are presented to demonstrate the effectiveness and efficiency of the described methods and toolkit. PMID:24252469

  18. Light-Field Imaging Toolkit

    NASA Astrophysics Data System (ADS)

    Bolan, Jeffrey; Hall, Elise; Clifford, Chris; Thurow, Brian

    The Light-Field Imaging Toolkit (LFIT) is a collection of MATLAB functions designed to facilitate the rapid processing of raw light field images captured by a plenoptic camera. An included graphical user interface streamlines the necessary post-processing steps associated with plenoptic images. The generation of perspective shifted views and computationally refocused images is supported, in both single image and animated formats. LFIT performs necessary calibration, interpolation, and structuring steps to enable future applications of this technology.

  19. Semantic Web Technologies for Mobile Context-Aware Services

    DTIC Science & Technology

    2006-03-01

    for Context-Aware Service Provisioning ... Communication toolkit (http, e-mail, IM, etc.), User interaction manager, Platform manager, White & yellow pages, MAS administration toolkit, NETWORK knowledge ... [NAICS] North American Industry Classification System, http://www.census.gov/epcd/www/naics.html; [OS00] Oppermann, R., and Specht, M., A Context...

  20. New software developments for quality mesh generation and optimization from biomedical imaging data.

    PubMed

    Yu, Zeyun; Wang, Jun; Gao, Zhanheng; Xu, Ming; Hoshijima, Masahiko

    2014-01-01

    In this paper we present a new software toolkit for generating and optimizing surface and volumetric meshes from three-dimensional (3D) biomedical imaging data, targeted at image-based finite element analysis of some biomedical activities in a single material domain. Our toolkit includes a series of geometric processing algorithms including surface re-meshing and quality-guaranteed tetrahedral mesh generation and optimization. All methods described have been encapsulated into a user-friendly graphical interface for easy manipulation and informative visualization of biomedical images and mesh models. Numerous examples are presented to demonstrate the effectiveness and efficiency of the described methods and toolkit. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. MITK-based segmentation of co-registered MRI for subject-related regional anesthesia simulation

    NASA Astrophysics Data System (ADS)

    Teich, Christian; Liao, Wei; Ullrich, Sebastian; Kuhlen, Torsten; Ntouba, Alexandre; Rossaint, Rolf; Ullisch, Marcus; Deserno, Thomas M.

    2008-03-01

    With a steadily increasing indication, regional anesthesia (RA) is still trained directly on the patient. To develop a virtual reality (VR)-based simulation, a patient model is needed containing several tissues, which have to be extracted from individual magnetic resonance imaging (MRI) volume datasets. Due to the given modality and the different characteristics of the single tissues, an adequate segmentation can only be achieved by using a combination of segmentation algorithms. In this paper, we present a framework for creating an individual model from MRI scans of the patient. Our work is split into two parts. First, an easy-to-use and extensible tool for handling the segmentation task on arbitrary datasets is provided. The key idea is to let the user create a segmentation for the given subject by running different processing steps in a purposive order and store them in a segmentation script for reuse on new datasets. For data handling and visualization, we utilize the Medical Imaging Interaction Toolkit (MITK), which is based on the Visualization Toolkit (VTK) and the Insight Segmentation and Registration Toolkit (ITK). The second part is to find suitable segmentation algorithms and corresponding parameters for differentiating the tissues required by the RA simulation. For this purpose, a fuzzy c-means clustering algorithm combined with mathematical morphology operators and a geometric active contour-based approach is chosen. The segmentation process itself aims at operating with minimal user interaction, and the resulting model fits the requirements of the simulation. First results are shown for both male and female MRI of the pelvis.
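
    The paper's pipeline combines fuzzy c-means clustering, mathematical morphology, and geometric active contours inside MITK. As a hedged, independent sketch of the first two steps only, the fragment below uses k-means from scikit-learn as a simple stand-in for fuzzy c-means, followed by a 3D morphological opening from SciPy; the input volume and the choice of target cluster are illustrative.

      # Hedged sketch: k-means (stand-in for fuzzy c-means) plus morphology.
      import numpy as np
      from scipy.ndimage import binary_opening
      from sklearn.cluster import KMeans

      mri = np.load("pelvis_mri.npy").astype(float)          # hypothetical MR volume
      n_tissues = 4

      labels = KMeans(n_clusters=n_tissues, n_init=10, random_state=0) \
          .fit_predict(mri.reshape(-1, 1)).reshape(mri.shape)

      # Pick the brightest cluster (illustrative choice) and clean it with a
      # 3D morphological opening.
      target = int(np.argmax([mri[labels == k].mean() for k in range(n_tissues)]))
      mask = binary_opening(labels == target, structure=np.ones((3, 3, 3)))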

  2. The Montage Image Mosaic Toolkit As A Visualization Engine.

    NASA Astrophysics Data System (ADS)

    Berriman, G. Bruce; Lerias, Angela; Good, John; Mandel, Eric; Pepper, Joshua

    2018-01-01

    The Montage toolkit has been used since 2003 to aggregate FITS images into mosaics for science analysis. It is now finding application as an engine for image visualization. One important reason is that the functionality developed for creating mosaics is also valuable in image visualization. An equally important (though perhaps less obvious) reason is that Montage is portable and is built on standard astrophysics toolkits, making it very easy to integrate into new environments. Montage models and rectifies the sky background to a common level and thus reveals faint, diffuse features; it offers an adaptive image stretching method that preserves the dynamic range of a FITS image when represented in PNG format; it provides utilities for creating cutouts of large images and downsampled versions of large images that can then be visualized on desktops or in browsers; it contains a fast reprojection algorithm intended for visualization; and it resamples and reprojects images to a common grid for subsequent multi-color visualization. This poster will highlight these visualization capabilities with the following examples: 1. Creation of down-sampled multi-color images of a 16-wavelength Infrared Atlas of the Galactic Plane, sampled at 1 arcsec when created. 2. Integration into a web-based image processing environment: JS9 is an interactive image display service for web browsers, desktops and mobile devices. It exploits the flux-preserving reprojection algorithms in Montage to transform diverse images to common image parameters for display. Select Montage programs have been compiled to JavaScript/WebAssembly using the Emscripten compiler, which allows our reprojection algorithms to run in browsers at close to native speed. 3. Creation of complex sky coverage maps: a multicolor all-sky map that shows the sky coverage of the Kepler and K2, KELT and TESS projects, overlaid on an all-sky 2MASS image. Montage is funded by the National Science Foundation under Grant Number ACI-1642453. JS9 is funded by the Chandra X-ray Center (NAS8-03060) and NASA's Universe of Learning (STScI-509913).

  3. Early Detection of Clinically Significant Prostate Cancer Using Ultrasonic Acoustic Radiation Force Impulse (ARFI) Imaging

    DTIC Science & Technology

    2017-10-01

    Image-Guided Surgery Toolkit (IGSTK) used to enable rapid 3D visualization and image volume interpretation, followed by automated transducer positioning in a user-selected image plane...

  4. Cancer Imaging Phenomics Toolkit (CaPTk) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    CaPTk is a software toolkit to facilitate translation of quantitative image analysis methods that help us obtain rich imaging phenotypic signatures of oncologic images and relate them to precision diagnostics and prediction of clinical outcomes, as well as to underlying molecular characteristics of cancer. The stand-alone graphical user interface of CaPTk brings analysis methods from the realm of medical imaging research to the clinic, and will be extended to use web-based services for computationally-demanding pipelines.

  5. VaST: A variability search toolkit

    NASA Astrophysics Data System (ADS)

    Sokolovsky, K. V.; Lebedev, A. A.

    2018-01-01

    Variability Search Toolkit (VaST) is a software package designed to find variable objects in a series of sky images. It can be run from a script or interactively using its graphical interface. VaST relies on source list matching as opposed to image subtraction. SExtractor is used to generate source lists and perform aperture or PSF-fitting photometry (with PSFEx). Variability indices that characterize scatter and smoothness of a lightcurve are computed for all objects. Candidate variables are identified as objects having high variability index values compared to other objects of similar brightness. The two distinguishing features of VaST are its ability to perform accurate aperture photometry of images obtained with non-linear detectors and handle complex image distortions. The software has been successfully applied to images obtained with telescopes ranging from 0.08 to 2.5 m in diameter equipped with a variety of detectors including CCD, CMOS, MIC and photographic plates. About 1800 variable stars have been discovered with VaST. It is used as a transient detection engine in the New Milky Way (NMW) nova patrol. The code is written in C and can be easily compiled on the majority of UNIX-like systems. VaST is free software available at http://scan.sai.msu.ru/vast/.
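
    VaST computes a family of scatter- and smoothness-based variability indices and flags objects whose index is high compared to stars of similar brightness. As a minimal, hedged illustration of one such scatter index, the function below computes the reduced chi-squared of a lightcurve about its weighted mean magnitude; it is not VaST code.

      # One scatter-based variability index: reduced chi-squared about the
      # weighted mean magnitude (illustrative, not VaST's implementation).
      import numpy as np

      def reduced_chi2(mag, mag_err):
          """mag, mag_err: 1-D arrays of magnitudes and their estimated errors."""
          w = 1.0 / mag_err**2
          mean = np.sum(w * mag) / np.sum(w)
          return np.sum(w * (mag - mean) ** 2) / (len(mag) - 1)

      # Objects whose index sits far above that of comparably bright stars are
      # flagged as candidate variables.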

  6. NiftySim: A GPU-based nonlinear finite element package for simulation of soft tissue biomechanics.

    PubMed

    Johnsen, Stian F; Taylor, Zeike A; Clarkson, Matthew J; Hipwell, John; Modat, Marc; Eiben, Bjoern; Han, Lianghao; Hu, Yipeng; Mertzanidou, Thomy; Hawkes, David J; Ourselin, Sebastien

    2015-07-01

    NiftySim, an open-source finite element toolkit, has been designed to allow incorporation of high-performance soft tissue simulation capabilities into biomedical applications. The toolkit provides the option of execution on fast graphics processing unit (GPU) hardware, numerous constitutive models and solid-element options, membrane and shell elements, and contact modelling facilities, in a simple-to-use library. The toolkit is founded on the total Lagrangian explicit dynamics (TLED) algorithm, which has been shown to be efficient and accurate for simulation of soft tissues. The base code is written in C++, and GPU execution is achieved using the nVidia CUDA framework. In most cases, interaction with the underlying solvers can be achieved through a single Simulator class, which may be embedded directly in third-party applications such as surgical guidance systems. Advanced capabilities such as contact modelling and nonlinear constitutive models are also provided, as are more experimental technologies like reduced order modelling. A consistent description of the underlying solution algorithm, its implementation with a focus on GPU execution, and examples of the toolkit's usage in biomedical applications are provided. Efficient mapping of the TLED algorithm to parallel hardware results in very high computational performance, far exceeding that available in commercial packages. The NiftySim toolkit provides high-performance soft tissue simulation capabilities using GPU technology for biomechanical simulation research applications in medical image computing, surgical simulation, and surgical guidance applications.

  7. An adaptive toolkit for image quality evaluation in system performance test of digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Zhang, Guozhi; Petrov, Dimitar; Marshall, Nicholas; Bosmans, Hilde

    2017-03-01

    Digital breast tomosynthesis (DBT) is a relatively new diagnostic imaging modality for women. Currently, various models of DBT systems are available on the market and the number of installations is rapidly increasing. EUREF, the European Reference Organization for Quality Assured Breast Screening and Diagnostic Services, has proposed a preliminary Guideline - protocol for the quality control of the physical and technical aspects of digital breast tomosynthesis systems, with an ultimate aim of providing limiting values guaranteeing proper performance for different applications of DBT. In this work, we introduce an adaptive toolkit developed in accordance with this guideline to facilitate the process of image quality evaluation in DBT performance test. This toolkit implements robust algorithms to quantify various technical parameters of DBT images and provides a convenient user interface in practice. Each test is built into a separate module with configurations set corresponding to the European guideline, which can be easily adapted to different settings and extended with additional tests. This toolkit largely improves the efficiency for image quality evaluation of DBT. It is also going to evolve with the development of protocols in quality control of DBT systems.

  8. The Application of the Montage Image Mosaic Engine To The Visualization Of Astronomical Images

    NASA Astrophysics Data System (ADS)

    Berriman, G. Bruce; Good, J. C.

    2017-05-01

    The Montage Image Mosaic Engine was designed as a scalable toolkit, written in C for performance and portability across *nix platforms, that assembles FITS images into mosaics. This code is freely available and has been widely used in the astronomy and IT communities for research, product generation, and for developing next-generation cyber-infrastructure. Recently, it has begun finding applicability in the field of visualization. This development has come about because the toolkit design allows easy integration into scalable systems that process data for subsequent visualization in a browser or client. The toolkit includes a visualization tool suitable for automation and for integration into Python: mViewer creates, with a single command, complex multi-color images overlaid with coordinate displays, labels, and observation footprints, and includes an adaptive image histogram equalization method that preserves the structure of a stretched image over its dynamic range. The Montage toolkit contains functionality originally developed to support the creation and management of mosaics, but which also offers value to visualization: a background rectification algorithm that reveals the faint structure in an image; and tools for creating cutout and downsampled versions of large images. Version 5 of Montage offers support for visualizing data written in the HEALPix sky-tessellation scheme, and functionality for processing and organizing images to comply with the TOAST sky-tessellation scheme required for consumption by the World Wide Telescope (WWT). Four online tutorials allow readers to reproduce and extend all the visualizations presented in this paper.
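
    Montage is a C toolkit with its own reprojection programs (e.g. mProject and mAdd). As a hedged illustration of the "reproject images to a common grid for multi-colour visualisation" step it provides, the sketch below performs the same operation with the Astropy-affiliated reproject package rather than Montage itself; the FITS file names are hypothetical.

      # Reprojecting one image onto another image's grid with the `reproject`
      # package (an illustration of the concept, not Montage code).
      from astropy.io import fits
      from astropy.wcs import WCS
      from reproject import reproject_interp

      hdu_r = fits.open("band_red.fits")[0]          # hypothetical input images
      hdu_b = fits.open("band_blue.fits")[0]

      target_wcs = WCS(hdu_r.header)                 # use the red image as the common grid
      blue_on_red, footprint = reproject_interp((hdu_b.data, WCS(hdu_b.header)),
                                                target_wcs,
                                                shape_out=hdu_r.data.shape)
      # hdu_r.data and blue_on_red now share a pixel grid and can be combined
      # into a two-colour preview.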

  9. Modeling the tagged-neutron UXO identification technique using the Geant4 toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou Y.; Mitra S.; Zhu X.

    2011-10-16

    It is proposed to use 14 MeV neutrons tagged by the associated particle neutron time-of-flight technique (APnTOF) to identify the fillers of unexploded ordnance (UXO) by characterizing their carbon, nitrogen and oxygen contents. To facilitate the design and construction of a prototype system, a preliminary simulation model was developed, using the Geant4 toolkit. This work established the toolkit environment for (a) generating tagged neutrons, (b) their transport and interactions within a sample to induce emission and detection of characteristic gamma-rays, and (c) 2D and 3D-image reconstruction of the interrogated object using the neutron and gamma-ray time-of-flight information. Using this model, this article demonstrates the novelty of the tagged-neutron approach for extracting useful signals with high signal-to-background discrimination of an object-of-interest from that of its environment. Simulations indicated that a UXO filled with the RDX explosive, hexogen (C3H6O6N6), can be identified to a depth of 20 cm when buried in soil.

  10. GESearch: An Interactive GUI Tool for Identifying Gene Expression Signature.

    PubMed

    Ye, Ning; Yin, Hengfu; Liu, Jingjing; Dai, Xiaogang; Yin, Tongming

    2015-01-01

    The huge amount of gene expression data generated by microarray and next-generation sequencing technologies presents challenges for exploiting its biological meaning. When searching for coexpressed genes, the data mining process is largely affected by the selection of algorithms. Thus, it is highly desirable to provide multiple algorithm options in a user-friendly analytical toolkit for exploring gene expression signatures. For this purpose, we developed GESearch, an interactive graphical user interface (GUI) toolkit, which is written in MATLAB and supports a variety of gene expression data files. This analytical toolkit provides four models, including the mean, the regression, the delegate, and the ensemble models, to identify coexpressed genes, and enables users to filter data and to select gene expression patterns by browsing the display window or by importing knowledge-based genes. Subsequently, the utility of this analytical toolkit is demonstrated by analyzing two sets of real-life microarray datasets from cell-cycle experiments. Overall, we have developed an interactive GUI toolkit that allows for choosing multiple algorithms for analyzing gene expression signatures.
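
    GESearch itself is a MATLAB GUI offering mean, regression, delegate, and ensemble models. The fragment below is only a hedged sketch of the simplest correlation-based idea behind a coexpression query: rank genes by Pearson correlation with a query gene's expression profile. The input file and query gene are hypothetical.

      # Conceptual coexpression query (not GESearch code): Pearson correlation of
      # every gene with a query gene across samples.
      import pandas as pd

      expr = pd.read_csv("expression_matrix.csv", index_col=0)   # rows: genes, columns: samples
      query = "CDC20"                                            # hypothetical query gene

      corr = expr.T.corrwith(expr.loc[query])                    # Pearson r per gene
      print(corr.sort_values(ascending=False).head(20))          # top coexpressed genes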

  11. Lens implementation on the GATE Monte Carlo toolkit for optical imaging simulation

    NASA Astrophysics Data System (ADS)

    Kang, Han Gyu; Song, Seong Hyun; Han, Young Been; Kim, Kyeong Min; Hong, Seong Jong

    2018-02-01

    Optical imaging techniques are widely used for in vivo preclinical studies, and it is well known that the Geant4 Application for Emission Tomography (GATE) can be employed for the Monte Carlo (MC) modeling of light transport inside heterogeneous tissues. However, the GATE MC toolkit is limited in that it does not yet include optical lens implementation, even though this is required for a more realistic optical imaging simulation. We describe our implementation of a biconvex lens into the GATE MC toolkit to improve both the sensitivity and spatial resolution for optical imaging simulation. The lens implemented into the GATE was validated against the ZEMAX optical simulation using a US Air Force 1951 resolution target. The ray diagrams and the charge-coupled device images of the GATE optical simulation agreed with the ZEMAX optical simulation results. In conclusion, the use of a lens on the GATE optical simulation could improve the image quality of bioluminescence and fluorescence significantly as compared with pinhole optics.
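
    GATE itself is configured through macro files, and the paper's validation is against ZEMAX; the lines below are only the textbook thin-lens relation (1/f = 1/d_o + 1/d_i) that any biconvex lens model has to reproduce, useful as a quick sanity check of where the image plane should sit and at what magnification. The focal length and object distance are arbitrary examples, not values from the paper.

      # Thin-lens image distance and magnification (a sanity check, not GATE code).
      def thin_lens_image(f_mm, object_distance_mm):
          d_i = 1.0 / (1.0 / f_mm - 1.0 / object_distance_mm)     # image distance
          return d_i, -d_i / object_distance_mm                    # (distance, magnification)

      print(thin_lens_image(f_mm=50.0, object_distance_mm=200.0))  # about (66.7, -0.33)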

  12. Designing Tracking Software for Image-Guided Surgery Applications: IGSTK Experience

    PubMed Central

    Enquobahrie, Andinet; Gobbi, David; Turek, Matt; Cheng, Patrick; Yaniv, Ziv; Lindseth, Frank; Cleary, Kevin

    2009-01-01

    Objective: Many image-guided surgery applications require tracking devices as part of their core functionality. The Image-Guided Surgery Toolkit (IGSTK) was designed and developed to interface tracking devices with software applications incorporating medical images. Methods: IGSTK was designed as an open source C++ library that provides the basic components needed for fast prototyping and development of image-guided surgery applications. This library follows a component-based architecture with several components designed for specific sets of image-guided surgery functions. At the core of the toolkit is the tracker component that handles communication between a control computer and navigation device to gather pose measurements of surgical instruments present in the surgical scene. The representations of the tracked instruments are superimposed on anatomical images to provide visual feedback to the clinician during surgical procedures. Results: The initial version of the IGSTK toolkit has been released in the public domain and several trackers are supported. The toolkit and related information are available at www.igstk.org. Conclusion: With the increased popularity of minimally invasive procedures in health care, several tracking devices have been developed for medical applications. Designing and implementing high-quality and safe software to handle these different types of trackers in a common framework is a challenging task. It requires establishing key software design principles that emphasize abstraction, extensibility, reusability, fault-tolerance, and portability. IGSTK is an open source library that satisfies these needs for the image-guided surgery community. PMID:20037671

  13. The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.

    PubMed

    Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin

    2007-11-01

    This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process that is philosophically similar to agile software methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining user and developer mailing lists, providing documentation (application programming interface reference document and book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences.
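
    Each IGSTK component is governed by a state machine so it can never reach an undefined state. The toy Python class below only illustrates that design principle (declared transitions, rejected invalid requests); it is not IGSTK's C++ API, and the state and event names are invented.

      # Toy illustration of the state-machine-governed component design (not IGSTK code).
      class TrackerComponent:
          TRANSITIONS = {
              ("Idle", "open"): "Communicating",
              ("Communicating", "start_tracking"): "Tracking",
              ("Tracking", "stop_tracking"): "Communicating",
              ("Communicating", "close"): "Idle",
          }

          def __init__(self):
              self.state = "Idle"

          def request(self, event):
              key = (self.state, event)
              if key not in self.TRANSITIONS:
                  # Invalid requests are rejected instead of corrupting the component state.
                  raise ValueError(f"event '{event}' not allowed in state '{self.state}'")
              self.state = self.TRANSITIONS[key]

      t = TrackerComponent()
      t.request("open")             # Idle -> Communicating
      t.request("start_tracking")   # Communicating -> Tracking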

  14. Technical note: DIRART--A software suite for deformable image registration and adaptive radiotherapy research.

    PubMed

    Yang, Deshan; Brame, Scott; El Naqa, Issam; Aditya, Apte; Wu, Yu; Goddu, S Murty; Mutic, Sasa; Deasy, Joseph O; Low, Daniel A

    2011-01-01

    Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse consistency algorithms to provide a consistent displacement vector field in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a good amount of options for DIR results visualization, evaluation, and validation. By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research. © 2011 American Association of Physicists in Medicine.
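
    DIRART is a MATLAB suite with its own DIR algorithm implementations; the fragment below is only an independent illustration of the kind of deformable registration step it supports, using SimpleITK's demons filter on two hypothetical CT volumes. It is not DIRART code and the file names are placeholders.

      # Deformable registration illustration with SimpleITK demons (not DIRART).
      import SimpleITK as sitk

      fixed = sitk.Cast(sitk.ReadImage("planning_ct.nii.gz"), sitk.sitkFloat32)
      moving = sitk.Cast(sitk.ReadImage("daily_ct.nii.gz"), sitk.sitkFloat32)

      demons = sitk.DemonsRegistrationFilter()
      demons.SetNumberOfIterations(50)
      demons.SetStandardDeviations(1.5)             # smoothing of the update field
      displacement = demons.Execute(fixed, moving)  # dense displacement vector field

      transform = sitk.DisplacementFieldTransform(displacement)
      warped = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
      sitk.WriteImage(warped, "daily_ct_deformed.nii.gz")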

  15. Engineering and algorithm design for an image processing Api: a technical report on ITK--the Insight Toolkit.

    PubMed

    Yoo, Terry S; Ackerman, Michael J; Lorensen, William E; Schroeder, Will; Chalana, Vikram; Aylward, Stephen; Metaxas, Dimitris; Whitaker, Ross

    2002-01-01

    We present the detailed planning and execution of the Insight Toolkit (ITK), an application programming interface (API) for the segmentation and registration of medical image data. This public resource has been developed through the NLM Visible Human Project, and is in beta test as an open-source software offering under cost-free licensing. The toolkit concentrates on 3D medical data segmentation and registration algorithms, multimodal and multiresolution capabilities, and portable, platform-independent support for Windows and Linux/Unix systems. This toolkit was built using current practices in software engineering. Specifically, we embraced the concept of generic programming during the development of these tools, working extensively with C++ templates and the freedom and flexibility they allow. Software development tools for distributed consortium-based code development have been created and are also publicly available. We discuss our assumptions, design decisions, and some lessons learned.
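
    ITK itself is a templated C++ API; the hedged sketch below uses SimpleITK, the simplified Python wrapping built on top of ITK, to show the style of read-filter-segment-write pipeline the toolkit provides. The file names are hypothetical and the filter choices are only an example.

      # Example segmentation pipeline via SimpleITK (a Python layer over ITK).
      import SimpleITK as sitk

      image = sitk.ReadImage("ct_volume.nii.gz", sitk.sitkFloat32)
      smoothed = sitk.CurvatureFlow(image1=image, timeStep=0.125, numberOfIterations=5)
      mask = sitk.OtsuThreshold(smoothed, 0, 1)     # voxels above the Otsu threshold -> 1
      labels = sitk.ConnectedComponent(mask)        # label the connected structures
      sitk.WriteImage(labels, "segmentation.nii.gz")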

  16. Lens implementation on the GATE Monte Carlo toolkit for optical imaging simulation.

    PubMed

    Kang, Han Gyu; Song, Seong Hyun; Han, Young Been; Kim, Kyeong Min; Hong, Seong Jong

    2018-02-01

    Optical imaging techniques are widely used for in vivo preclinical studies, and it is well known that the Geant4 Application for Emission Tomography (GATE) can be employed for the Monte Carlo (MC) modeling of light transport inside heterogeneous tissues. However, the GATE MC toolkit is limited in that it does not yet include optical lens implementation, even though this is required for a more realistic optical imaging simulation. We describe our implementation of a biconvex lens into the GATE MC toolkit to improve both the sensitivity and spatial resolution for optical imaging simulation. The lens implemented into the GATE was validated against the ZEMAX optical simulation using a US Air Force 1951 resolution target. The ray diagrams and the charge-coupled device images of the GATE optical simulation agreed with the ZEMAX optical simulation results. In conclusion, the use of a lens on the GATE optical simulation could improve the image quality of bioluminescence and fluorescence significantly as compared with pinhole optics. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  17. RATIO_TOOL - SOFTWARE FOR COMPUTING IMAGE RATIOS

    NASA Technical Reports Server (NTRS)

    Yates, G. L.

    1994-01-01

    Geological studies analyze spectral data in order to gain information on surface materials. RATIO_TOOL is an interactive program for viewing and analyzing large multispectral image data sets that have been created by an imaging spectrometer. While the standard approach to classification of multispectral data is to match the spectrum for each input pixel against a library of known mineral spectra, RATIO_TOOL uses ratios of spectral bands in order to spot significant areas of interest within a multispectral image. Each image band can be viewed iteratively, or a selected image band of the data set can be requested and displayed. When the image ratios are computed, the result is displayed as a gray scale image. At this point a histogram option helps in viewing the distribution of values. A thresholding option can then be used to segment the ratio image result into two to four classes. The segmented image is then color coded to indicate threshold classes and displayed alongside the gray scale image. RATIO_TOOL is written in C language for Sun series computers running SunOS 4.0 and later. It requires the XView toolkit and the OpenWindows window manager (version 2.0 or 3.0). The XView toolkit is distributed with Open Windows. A color monitor is also required. The standard distribution medium for RATIO_TOOL is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation is included on the program media. RATIO_TOOL was developed in 1992 and is a copyrighted work with all copyright vested in NASA. Sun, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
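
    RATIO_TOOL itself is a C/XView program; the fragment below is only a conceptual NumPy sketch of the operation it performs: ratio two spectral bands of an imaging spectrometer cube, then segment the ratio image into a few classes for colour-coded display. The cube file and band indices are placeholders.

      # Conceptual band-ratio and thresholding sketch (not RATIO_TOOL code).
      import numpy as np

      cube = np.load("spectrometer_cube.npy").astype(float)   # hypothetical (bands, rows, cols)
      ratio = cube[30] / np.clip(cube[10], 1e-6, None)        # band 30 over band 10

      # Threshold the ratio image into four classes for colour-coded display.
      thresholds = np.quantile(ratio, [0.25, 0.5, 0.75])
      classes = np.digitize(ratio, thresholds)                # class labels 0..3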

  18. Technical Note: DIRART – A software suite for deformable image registration and adaptive radiotherapy research

    PubMed Central

    Yang, Deshan; Brame, Scott; El Naqa, Issam; Aditya, Apte; Wu, Yu; Murty Goddu, S.; Mutic, Sasa; Deasy, Joseph O.; Low, Daniel A.

    2011-01-01

    Purpose: Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). Methods: DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse consistency algorithms to provide a consistent displacement vector field in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. Results: DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a good amount of options for DIR results visualization, evaluation, and validation. Conclusions: By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research. PMID:21361176

  19. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study the uncertainties associated with the Geant4 physics models, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. The design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
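
    The toolkit itself is built on Geant4 and an event-processing framework; the schematic below (plain Python, with a hypothetical run_simulation stub standing in for a configured Geant4 job) only illustrates the basic pattern it automates: vary one or more model parameters, rerun the simulation, and keep bookkeeping of the resulting observable so its spread can be read as a systematic uncertainty.

    ```python
    import itertools
    import statistics

    def run_simulation(params):
        """Hypothetical stand-in for a Geant4 job configured with the given
        physics-model parameters; returns one physics observable of interest."""
        # e.g. a mean energy deposit per event for this parameter choice
        return 1.00 + 0.05 * (params["alpha"] - 1.0) - 0.02 * (params["cutoff_mev"] - 2.0)

    # One or more parameters of one or more physics models, each with several variants.
    variations = {
        "alpha":      [0.9, 1.0, 1.1],
        "cutoff_mev": [1.0, 2.0, 3.0],
    }

    # Bookkeeping: record the observable for every combination of parameter values.
    records = []
    for values in itertools.product(*variations.values()):
        params = dict(zip(variations.keys(), values))
        records.append((params, run_simulation(params)))

    observables = [obs for _, obs in records]
    nominal = run_simulation({"alpha": 1.0, "cutoff_mev": 2.0})
    spread = statistics.pstdev(observables)
    print(f"nominal = {nominal:.3f}, spread over variants = {spread:.3f}")
    ```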

  20. Fragment Impact Toolkit (FIT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shevitz, Daniel Wolf; Key, Brian P.; Garcia, Daniel B.

    2017-09-05

    The Fragment Impact Toolkit (FIT) is a software package used for probabilistic consequence evaluation of fragmenting sources. The typical use case for FIT is to simulate an exploding shell and evaluate the consequence on nearby objects. FIT is written in the programming language Python and is designed as a collection of interacting software modules. Each module has a function that interacts with the other modules to produce desired results.

  1. Common Metrics for Human-Robot Interaction

    NASA Technical Reports Server (NTRS)

    Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael

    2006-01-01

    This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

  2. Comparison and assessment of semi-automatic image segmentation in computed tomography scans for image-guided kidney surgery.

    PubMed

    Glisson, Courtenay L; Altamar, Hernan O; Herrell, S Duke; Clark, Peter; Galloway, Robert L

    2011-11-01

    Image segmentation is integral to implementing intraoperative guidance for kidney tumor resection. Results seen in computed tomography (CT) data are affected by target organ physiology as well as by the segmentation algorithm used. This work studies variables involved in using level set methods found in the Insight Toolkit to segment kidneys from CT scans and applies the results to an image guidance setting. A composite algorithm drawing on the strengths of multiple level set approaches was built using the Insight Toolkit. This algorithm requires image contrast state and seed points to be identified as input, and functions independently thereafter, selecting and altering method and variable choice as needed. Semi-automatic results were compared to expert hand segmentation results directly and by the use of the resultant surfaces for registration of intraoperative data. Direct comparison using the Dice metric showed average agreement of 0.93 between semi-automatic and hand segmentation results. Use of the segmented surfaces in closest point registration of intraoperative laser range scan data yielded average closest point distances of approximately 1 mm. Application of both inverse registration transforms from the previous step to all hand segmented image space points revealed that the distance variability introduced by registering to the semi-automatically segmented surface versus the hand segmented surface was typically less than 3 mm both near the tumor target and at distal points, including subsurface points. Use of the algorithm shortened user interaction time and provided results which were comparable to the gold standard of hand segmentation. Further, the use of the algorithm's resultant surfaces in image registration provided comparable transformations to surfaces produced by hand segmentation. These data support the applicability and utility of such an algorithm as part of an image guidance workflow.
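
    The Dice agreement reported above has a simple definition. Here is a minimal NumPy sketch (illustrative only, not the paper's code) that computes it for two binary segmentation masks.

    ```python
    import numpy as np

    def dice_coefficient(mask_a, mask_b):
        """Dice similarity of two binary masks: 2|A ∩ B| / (|A| + |B|)."""
        a = np.asarray(mask_a, dtype=bool)
        b = np.asarray(mask_b, dtype=bool)
        denom = a.sum() + b.sum()
        if denom == 0:
            return 1.0  # both masks empty: treat as perfect agreement
        return 2.0 * np.logical_and(a, b).sum() / denom

    # Synthetic masks standing in for hand and semi-automatic segmentation results
    hand = np.zeros((64, 64, 64), dtype=bool); hand[10:40, 10:40, 10:40] = True
    auto = np.zeros((64, 64, 64), dtype=bool); auto[12:40, 10:38, 10:40] = True
    print(f"Dice = {dice_coefficient(hand, auto):.3f}")
    ```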

  3. Literacy Toolkit

    ERIC Educational Resources Information Center

    Center for Best Practices in Early Childhood Education, 2005

    2005-01-01

    The toolkit contains print and electronic resources, including (1) "eMERGing Literacy and Technology: Working Together", a 492-page curriculum guide; (2) "LitTECH Interactive Presents: The Beginning of Literacy", a DVD that provides an overview linking technology to the concepts of emerging literacy; (3) "Your Preschool Classroom Computer Center:…

  4. ECCE Toolkit: Prototyping Sensor-Based Interaction.

    PubMed

    Bellucci, Andrea; Aedo, Ignacio; Díaz, Paloma

    2017-02-23

    Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.

  5. ECCE Toolkit: Prototyping Sensor-Based Interaction

    PubMed Central

    Bellucci, Andrea; Aedo, Ignacio; Díaz, Paloma

    2017-01-01

    Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit. PMID:28241502

  6. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genser, Krzysztof; Hatcher, Robert; Perdue, Gabriel

    2016-11-10

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  7. Toward an Efficient Icing CFD Process Using an Interactive Software Toolkit: Smagglce 2D

    NASA Technical Reports Server (NTRS)

    Vickerman, Mary B.; Choo, Yung K.; Schilling, Herbert W.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.

    2001-01-01

    Two-dimensional CFD analysis for iced airfoils can be a labor-intensive task. The software toolkit SmaggIce 2D is being developed to help streamline the CFD process and provide the unique features needed for icing. When complete, it will include a combination of partially automated and fully interactive tools for all aspects of the tasks leading up to the flow analysis: geometry preparation, domain decomposition, block boundary demarcation, gridding, and linking with a flow solver. It also includes tools to perform ice shape characterization, an important aid in determining the relationship between ice characteristics and their effects on aerodynamic performance. Completed tools, work in progress, and planned features of the software toolkit are presented here.

  8. AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments

    NASA Astrophysics Data System (ADS)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.
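
    AutoMicromanager drives hardware through the Micro-Manager core. As a rough indication of what "access to cameras, stages, and shutters through the Micromanager project" looks like at that level, here is a small sketch using Micro-Manager's MMCorePy Python binding; the configuration file name is an assumption, and this is not AutoMicromanager's own API.

    ```python
    import numpy as np
    import MMCorePy  # Python binding shipped with Micro-Manager

    core = MMCorePy.CMMCore()
    core.loadSystemConfiguration("MMConfig_demo.cfg")  # assumed demo configuration

    core.setExposure(50)      # exposure time in milliseconds
    core.snapImage()          # acquire a single frame from the current camera
    frame = np.asarray(core.getImage())

    print("frame shape:", frame.shape, "mean intensity:", frame.mean())
    ```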

  9. AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.

    PubMed

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  10. Semi-Automated Identification of Rocks in Images

    NASA Technical Reports Server (NTRS)

    Bornstein, Benjamin; Castano, Andres; Anderson, Robert

    2006-01-01

    Rock Identification Toolkit Suite is a computer program that assists users in identifying and characterizing rocks shown in images returned by the Mars Exploration Rover mission. Included in the program are components for automated finding of rocks, interactive adjustments of outlines of rocks, active contouring of rocks, and automated analysis of shapes in two dimensions. The program assists users in evaluating the surface properties of rocks and soil and reports basic properties of rocks. The program requires either the Mac OS X operating system running on a G4 (or more capable) processor or a Linux operating system running on a Pentium (or more capable) processor, plus at least 128 MB of random-access memory.

  11. Using the Browser for Science: A Collaborative Toolkit for Astronomy

    NASA Astrophysics Data System (ADS)

    Connolly, A. J.; Smith, I.; Krughoff, K. S.; Gibson, R.

    2011-07-01

    Astronomical surveys have yielded hundreds of terabytes of catalogs and images that span many decades of the electromagnetic spectrum. Even when observatories provide user-friendly web interfaces, exploring these data resources remains a complex and daunting task. In contrast, gadgets and widgets have become popular in social networking (e.g. iGoogle, Facebook). They provide a simple way to make complex data easily accessible that can be customized based on the interest of the user. With ASCOT (an AStronomical COllaborative Toolkit) we expand on these concepts to provide a customizable and extensible gadget framework for use in science. Unlike iGoogle, where all of the gadgets are independent, the gadgets we develop communicate and share information, enabling users to visualize and interact with data through multiple, simultaneous views. With this approach, web-based applications for accessing and visualizing data can be generated easily and, by linking these tools together, integrated and powerful data analysis and discovery tools can be constructed.

  12. FAST: framework for heterogeneous medical image computing and visualization.

    PubMed

    Smistad, Erik; Bozorgi, Mohammadmehdi; Lindseth, Frank

    2015-11-01

    Computer systems are becoming increasingly heterogeneous in the sense that they consist of different processors, such as multi-core CPUs and graphics processing units. As the amount of medical image data increases, it is crucial to exploit the computational power of these processors. However, this is currently difficult due to several factors, such as driver errors, processor differences, and the need for low-level memory handling. This paper presents a novel FrAmework for heterogeneouS medical image compuTing and visualization (FAST). The framework aims to make it easier to simultaneously process and visualize medical images efficiently on heterogeneous systems. FAST uses common image processing programming paradigms and hides the details of memory handling from the user, while enabling the use of all processors and cores on a system. The framework is open-source, cross-platform and available online. Code examples and performance measurements are presented to show the simplicity and efficiency of FAST. The results are compared to the Insight Toolkit (ITK) and the Visualization Toolkit (VTK) and show that the presented framework is faster, with up to 20 times speedup on several common medical imaging algorithms. FAST enables efficient medical image computing and visualization on heterogeneous systems. Code examples and performance evaluations have demonstrated that the toolkit is both easy to use and performs better than existing frameworks, such as ITK and VTK.

  13. Facilitating interprofessional evidence-based practice in paediatric rehabilitation: development, implementation and evaluation of an online toolkit for health professionals.

    PubMed

    Glegg, Stephanie M N; Livingstone, Roslyn; Montgomery, Ivonne

    2016-01-01

    Lack of time, competencies, resources and supports are documented as barriers to evidence-based practice (EBP). This paper introduces a recently developed web-based toolkit designed to assist interprofessional clinicians in implementing EBP within a paediatric rehabilitation setting. EBP theory, models, frameworks and tools were applied or adapted in the development of the online resources, which formed the basis of a larger support strategy incorporating interactive workshops, knowledge broker facilitation and mentoring. The highly accessed toolkit contains flowcharts with embedded information sheets, resources and templates to streamline, quantify and document outcomes throughout the EBP process. Case examples relevant to occupational therapy and physical therapy highlight the utility and application of the toolkit in a clinical paediatric setting. Workshops were highly rated by learners for clinical relevance, presentation level and effectiveness. Eight evidence syntheses have been created and 79 interventions have been evaluated since the strategy's inception in January 2011. The toolkit resources streamlined and supported EBP processes, promoting consistency in the quality and presentation of outputs. The online toolkit can be a useful tool to facilitate clinicians' use of EBP in order to meet the needs of the clients and families whom they support. Implications for Rehabilitation: A comprehensive online EBP toolkit for interprofessional clinicians is available to streamline the EBP process and to support learning needs regardless of competency level. Multi-method facilitation support, including interactive education, e-learning, clinical librarian services and knowledge brokering, is a valued but cost-restrictive supplement to the implementation of online EBP resources. EBP resources are not one-size-fits-all; targeted appraisal tools, models and frameworks may be integrated to improve their utility for specific sectors, which may limit them for others.

  14. Pydpiper: a flexible toolkit for constructing novel registration pipelines.

    PubMed

    Friedel, Miriam; van Eede, Matthijs C; Pipitone, Jon; Chakravarty, M Mallar; Lerch, Jason P

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.
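
    Pydpiper's ability to eliminate duplicate stages (innovation 2 above) comes down to recognising when two pipeline stages would run the same command on the same files. The sketch below is a generic Python illustration of that idea, not Pydpiper's actual classes; the mincblur command string is only an example.

    ```python
    class Stage:
        """A pipeline stage described by the command it would execute."""
        def __init__(self, command, inputs, outputs):
            self.command, self.inputs, self.outputs = command, tuple(inputs), tuple(outputs)

        def key(self):
            # Two stages with identical commands and files are duplicates.
            return (self.command, self.inputs, self.outputs)

    class Pipeline:
        def __init__(self):
            self._stages = {}

        def add_stage(self, stage):
            """Add a stage unless an identical one is already queued."""
            return self._stages.setdefault(stage.key(), stage)

        def stages(self):
            return list(self._stages.values())

    p = Pipeline()
    p.add_stage(Stage("mincblur -fwhm 2 in.mnc blur.mnc", ["in.mnc"], ["blur.mnc"]))
    p.add_stage(Stage("mincblur -fwhm 2 in.mnc blur.mnc", ["in.mnc"], ["blur.mnc"]))  # duplicate, ignored
    print(len(p.stages()))  # -> 1
    ```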

  15. Pydpiper: a flexible toolkit for constructing novel registration pipelines

    PubMed Central

    Friedel, Miriam; van Eede, Matthijs C.; Pipitone, Jon; Chakravarty, M. Mallar; Lerch, Jason P.

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines “out-of-the-box.” In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code. PMID:25126069

  16. Toolkit for testing scientific CCD cameras

    NASA Astrophysics Data System (ADS)

    Uzycki, Janusz; Mankiewicz, Lech; Molak, Marcin; Wrochna, Grzegorz

    2006-03-01

    The CCD Toolkit (1) is a software tool for testing CCD cameras which allows the user to measure important characteristics of a camera such as readout noise, total gain, dark current, 'hot' pixels, useful area, etc. The application performs a statistical analysis of images saved in files in the FITS format commonly used in astronomy. The graphical interface is based on the ROOT package, which offers high functionality and flexibility. The program was developed in a way that ensures future compatibility with different operating systems: Windows and Linux. The CCD Toolkit was created for the "Pi of the Sky" project collaboration (2).
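
    The camera characteristics listed above can be estimated from pairs of bias and flat-field frames with the standard photon-transfer relations. The sketch below (NumPy plus astropy, with assumed FITS file names) shows the gain and readout-noise part of such an analysis plus a crude hot-pixel count; it is an illustration of the measurements, not the CCD Toolkit's ROOT-based code.

    ```python
    import numpy as np
    from astropy.io import fits

    def photon_transfer(bias1, bias2, flat1, flat2):
        """Estimate gain (e-/ADU) and readout noise (e-) from two bias and two flat frames."""
        b1, b2, f1, f2 = (np.asarray(x, dtype=float) for x in (bias1, bias2, flat1, flat2))
        var_flat_diff = np.var(f1 - f2)
        var_bias_diff = np.var(b1 - b2)
        gain = ((f1.mean() + f2.mean()) - (b1.mean() + b2.mean())) / (var_flat_diff - var_bias_diff)
        read_noise = gain * np.std(b1 - b2) / np.sqrt(2.0)
        return gain, read_noise

    # Assumed FITS file names for two bias and two flat exposures
    frames = [fits.getdata(name).astype(float)
              for name in ("bias1.fits", "bias2.fits", "flat1.fits", "flat2.fits")]
    gain, rn = photon_transfer(*frames)

    # 'Hot' pixels: significantly elevated values in a bias (or dark) frame
    hot = frames[0] > frames[0].mean() + 5 * frames[0].std()
    print(f"gain = {gain:.2f} e-/ADU, readout noise = {rn:.2f} e-, hot pixels = {hot.sum()}")
    ```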

  17. Code Parallelization with CAPO: A User Manual

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Frumkin, Michael; Yan, Jerry; Biegel, Bryan (Technical Monitor)

    2001-01-01

    A software tool has been developed to assist the parallelization of scientific codes. This tool, CAPO, extends an existing parallelization toolkit, CAPTools developed at the University of Greenwich, to generate OpenMP parallel codes for shared memory architectures. This is an interactive toolkit to transform a serial Fortran application code to an equivalent parallel version of the software - in a small fraction of the time normally required for a manual parallelization. We first discuss the way in which loop types are categorized and how efficient OpenMP directives can be defined and inserted into the existing code using the in-depth interprocedural analysis. The use of the toolkit on a number of application codes ranging from benchmark to real-world application codes is presented. This demonstrates the great potential of using the toolkit to quickly parallelize serial programs, as well as the good performance achievable on a large number of processors. The second part of the document gives references to the parameters and the graphical user interface implemented in the toolkit. Finally, a set of tutorials is included for hands-on experience with this toolkit.

  18. Development of an evidence-informed leisure time physical activity resource for adults with spinal cord injury: the SCI Get Fit Toolkit.

    PubMed

    Arbour-Nicitopoulos, K P; Martin Ginis, K A; Latimer-Cheung, A E; Bourne, C; Campbell, D; Cappe, S; Ginis, S; Hicks, A L; Pomerleau, P; Smith, K

    2013-06-01

    To systematically develop an evidence-informed leisure time physical activity (LTPA) resource for adults with spinal cord injury (SCI). Canada. The Appraisal of Guidelines, Research and Evaluation (AGREE) II protocol was used to develop a toolkit to teach and encourage adults with SCI how to make smart and informed choices about being physically active. A multidisciplinary expert panel appraised the evidence and generated specific recommendations for the content of the toolkit. Pilot testing was conducted to refine the toolkit's presentation. Recommendations emanating from the consultation process were that the toolkit be a brief, evidence-based resource that contains images of adults with tetraplegia and paraplegia, and links to more detailed online information. The content of the toolkit should include the physical activity guidelines (PAGs) for adults with SCI, activities tailored to manual and power chair users, the benefits of LTPA, and strategies to overcome common LTPA barriers for adults with SCI. The inclusion of action plans and safety tips was also recommended. These recommendations have resulted in the development of an evidence-informed LTPA resource to assist adults with SCI in meeting the PAGs. This toolkit will have important implications for consumers, health care professionals and policy makers for encouraging LTPA in the SCI community.

  19. Python-based geometry preparation and simulation visualization toolkits for STEPS

    PubMed Central

    Chen, Weiliang; De Schutter, Erik

    2014-01-01

    STEPS is a stochastic reaction-diffusion simulation engine that implements a spatial extension of Gillespie's Stochastic Simulation Algorithm (SSA) in complex tetrahedral geometries. An extensive Python-based interface is provided to STEPS so that it can interact with the large number of scientific packages in Python. However, a gap existed between the interfaces of these packages and the STEPS user interface, where supporting toolkits could reduce the amount of scripting required for research projects. This paper introduces two new supporting toolkits that support geometry preparation and visualization for STEPS simulations. PMID:24782754

  20. bioWidgets: data interaction components for genomics.

    PubMed

    Fischer, S; Crabtree, J; Brunk, B; Gibson, M; Overton, G C

    1999-10-01

    The presentation of genomics data in a perspicuous visual format is critical for its rapid interpretation and validation. Relatively few public database developers have the resources to implement sophisticated front-end user interfaces themselves. Accordingly, these developers would benefit from a reusable toolkit of user interface and data visualization components. We have designed the bioWidget toolkit as a set of JavaBean components. It includes a wide array of user interface components and defines an architecture for assembling applications. The toolkit is founded on established software engineering design patterns and principles, including componentry, Model-View-Controller, factored models and schema neutrality. As a proof of concept, we have used the bioWidget toolkit to create three extendible applications: AnnotView, BlastView and AlignView.

  1. Orthogonal Luciferase-Luciferin Pairs for Bioluminescence Imaging.

    PubMed

    Jones, Krysten A; Porterfield, William B; Rathbun, Colin M; McCutcheon, David C; Paley, Miranda A; Prescher, Jennifer A

    2017-02-15

    Bioluminescence imaging with luciferase-luciferin pairs is widely used in biomedical research. Several luciferases have been identified in nature, and many have been adapted for tracking cells in whole animals. Unfortunately, the optimal luciferases for imaging in vivo utilize the same substrate and therefore cannot easily differentiate multiple cell types in a single subject. To develop a broader set of distinguishable probes, we crafted custom luciferins that can be selectively processed by engineered luciferases. Libraries of mutant enzymes were iteratively screened with sterically modified luciferins, and orthogonal enzyme-substrate "hits" were identified. These tools produced light when complementary enzyme-substrate partners interacted both in vitro and in cultured cell models. Based on their selectivity, these designer pairs will bolster multicomponent imaging and enable the direct interrogation of cell networks not currently possible with existing tools. Our screening platform is also general and will expedite the identification of more unique luciferases and luciferins, further expanding the bioluminescence toolkit.

  2. Toolkit Approach to Integrating Library Resources into the Learning Management System

    ERIC Educational Resources Information Center

    Black, Elizabeth L.

    2008-01-01

    As use of learning management systems (LMS) increases, it is essential that librarians are there. Ohio State University Libraries took a toolkit approach to integrate library content in the LMS to facilitate creative and flexible interactions between librarians, students and faculty in Ohio State University's large and decentralized academic…

  3. Flightspeed Integral Image Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image and facilitates a wide range of fast image-processing functions. This toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational constraints. The software provides an order of magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
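
    The "integral image" trick mentioned above can be written down in a few lines. Here is a NumPy sketch (the library itself is plain C) showing how the cached cumulative sums make any rectangular sum a constant-time lookup.

    ```python
    import numpy as np

    def integral_image(img):
        """Cumulative-sum table with a zero row/column so rectangle sums need no edge cases."""
        ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
        ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
        return ii

    def rect_sum(ii, top, left, height, width):
        """Sum of pixels in the rectangle using four table lookups."""
        b, r = top + height, left + width
        return ii[b, r] - ii[top, r] - ii[b, left] + ii[top, left]

    img = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)  # 8-bit image data
    ii = integral_image(img)
    assert rect_sum(ii, 100, 200, 32, 32) == img[100:132, 200:232].sum()
    ```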

  4. Psy Toolkit: A Novel Web-Based Method for Running Online Questionnaires and Reaction-Time Experiments

    ERIC Educational Resources Information Center

    Stoet, Gijsbert

    2017-01-01

    This article reviews PsyToolkit, a free web-based service designed for setting up, running, and analyzing online questionnaires and reaction-time (RT) experiments. It comes with extensive documentation, videos, lessons, and libraries of free-to-use psychological scales and RT experiments. It provides an elaborate interactive environment to use (or…

  5. A Conceptual Framework and a Toolkit for Supporting the Rapid Prototyping of Context-Aware Applications.

    ERIC Educational Resources Information Center

    Dey, Anind K.; Abowd, Gregory D.; Salber, Daniel

    2001-01-01

    Discusses the trend toward ubiquitous computing and the challenge to enhance the behavior of any application by informing it of the context of its use. Defines context related to the interaction between humans, applications, and the surrounding environment; and presents a conceptual framework and a toolkit for supporting the rapid prototyping of…

  6. EduSpeak[R]: A Speech Recognition and Pronunciation Scoring Toolkit for Computer-Aided Language Learning Applications

    ERIC Educational Resources Information Center

    Franco, Horacio; Bratt, Harry; Rossier, Romain; Rao Gadde, Venkata; Shriberg, Elizabeth; Abrash, Victor; Precoda, Kristin

    2010-01-01

    SRI International's EduSpeak[R] system is a software development toolkit that enables developers of interactive language education software to use state-of-the-art speech recognition and pronunciation scoring technology. Automatic pronunciation scoring allows the computer to provide feedback on the overall quality of pronunciation and to point to…

  7. canvasDesigner: A versatile interactive high-resolution scientific multi-panel visualization toolkit.

    PubMed

    Zhang, Baohong; Zhao, Shanrong; Neuhaus, Isaac

    2018-05-03

    We present a bioinformatics and systems biology visualization toolkit that harmonizes real-time interactive exploration and analysis of big data, full-fledged customization of look and feel, and production of multi-panel publication-ready figures in PDF format, simultaneously. Source code and detailed user guides are available at http://canvasxpress.org, https://baohongz.github.io/canvasDesigner, and https://baohongz.github.io/canvasDesigner/demo_video.html. isaac.neuhaus@bms.com, baohong.zhang@pfizer.com, shanrong.zhao@pfizer.com. Supplementary materials are available at https://goo.gl/1uQygs.

  8. X-Ray Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

    Radiographic Image Acquisition & Processing Software for Security Markets. Used in operation of commercial x-ray scanners and manipulation of x-ray images for emergency responders including State, Local, Federal, and US Military bomb technicians and analysts.

  9. Machine learning for a Toolkit for Image Mining

    NASA Technical Reports Server (NTRS)

    Delanoy, Richard L.

    1995-01-01

    A prototype user environment is described that enables a user with very limited computer skills to collaborate with a computer algorithm to develop search tools (agents) that can be used for image analysis, creating metadata for tagging images, searching for images in an image database on the basis of image content, or as a component of computer vision algorithms. Agents are learned in an ongoing, two-way dialogue between the user and the algorithm. The user points to mistakes made in classification. The algorithm, in response, attempts to discover which image attributes are discriminating between objects of interest and clutter. It then builds a candidate agent and applies it to an input image, producing an 'interest' image highlighting features that are consistent with the set of objects and clutter indicated by the user. The dialogue repeats until the user is satisfied. The prototype environment, called the Toolkit for Image Mining (TIM), is currently capable of learning spectral and textural patterns. Learning exhibits rapid convergence to reasonable levels of performance and, when thoroughly trained, appears to be competitive in discrimination accuracy with other classification techniques.

  10. Atlas Toolkit: Fast registration of 3D morphological datasets in the absence of landmarks

    PubMed Central

    Grocott, Timothy; Thomas, Paul; Münsterberg, Andrea E.

    2016-01-01

    Image registration is a gateway technology for Developmental Systems Biology, enabling computational analysis of related datasets within a shared coordinate system. Many registration tools rely on landmarks to ensure that datasets are correctly aligned; yet suitable landmarks are not present in many datasets. Atlas Toolkit is a Fiji/ImageJ plugin collection offering elastic group-wise registration of 3D morphological datasets, guided by segmentation of the interesting morphology. We demonstrate the method by combinatorial mapping of cell signalling events in the developing eyes of chick embryos, and use the integrated datasets to predictively enumerate Gene Regulatory Network states. PMID:26864723

  11. Atlas Toolkit: Fast registration of 3D morphological datasets in the absence of landmarks.

    PubMed

    Grocott, Timothy; Thomas, Paul; Münsterberg, Andrea E

    2016-02-11

    Image registration is a gateway technology for Developmental Systems Biology, enabling computational analysis of related datasets within a shared coordinate system. Many registration tools rely on landmarks to ensure that datasets are correctly aligned; yet suitable landmarks are not present in many datasets. Atlas Toolkit is a Fiji/ImageJ plugin collection offering elastic group-wise registration of 3D morphological datasets, guided by segmentation of the interesting morphology. We demonstrate the method by combinatorial mapping of cell signalling events in the developing eyes of chick embryos, and use the integrated datasets to predictively enumerate Gene Regulatory Network states.

  12. The interactive learning toolkit: technology and the classroom

    NASA Astrophysics Data System (ADS)

    Lukoff, Brian; Tucker, Laura

    2011-04-01

    Peer Instruction (PI) and Just-in-Time-Teaching (JiTT) have been shown to increase both students' conceptual understanding and problem-solving skills. However, the time investment for the instructor to prepare appropriate conceptual questions and manage student JiTT responses is one of the main implementation hurdles. To overcome this we have developed the Interactive Learning Toolkit (ILT), a course management system specifically designed to support PI and JiTT. We are working to integrate the ILT with a fully interactive classroom system where students can use their laptops and smartphones to respond to ConcepTests in class. The goal is to use technology to engage students in conceptual thinking both in and out of the classroom.

  13. A cognitive prosthesis for complex decision-making.

    PubMed

    Tremblay, Sébastien; Gagnon, Jean-François; Lafond, Daniel; Hodgetts, Helen M; Doiron, Maxime; Jeuniaux, Patrick P J M H

    2017-01-01

    While simple heuristics can be ecologically rational and effective in naturalistic decision making contexts, complex situations require analytical decision making strategies, hypothesis-testing and learning. Sub-optimal decision strategies - using simplified as opposed to analytic decision rules - have been reported in domains such as healthcare, military operational planning, and government policy making. We investigate the potential of a computational toolkit called "IMAGE" to improve decision-making by developing structural knowledge and increasing understanding of complex situations. IMAGE is tested within the context of a complex military convoy management task through (a) interactive simulations, and (b) visualization and knowledge representation capabilities. We assess the usefulness of two versions of IMAGE (desktop and immersive) compared to a baseline. Results suggest that the prosthesis helped analysts in making better decisions, but failed to increase their structural knowledge about the situation once the cognitive prosthesis is removed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. DataViewer3D: An Open-Source, Cross-Platform Multi-Modal Neuroimaging Data Visualization Tool

    PubMed Central

    Gouws, André; Woods, Will; Millman, Rebecca; Morland, Antony; Green, Gary

    2008-01-01

    Integration and display of results from multiple neuroimaging modalities [e.g. magnetic resonance imaging (MRI), magnetoencephalography, EEG] relies on display of a diverse range of data within a common, defined coordinate frame. DataViewer3D (DV3D) is a multi-modal imaging data visualization tool offering a cross-platform, open-source solution to simultaneous data overlay visualization requirements of imaging studies. While DV3D is primarily a visualization tool, the package allows an analysis approach where results from one imaging modality can guide comparative analysis of another modality in a single coordinate space. DV3D is built on Python, a dynamic object-oriented programming language with support for integration of modular toolkits, and development of cross-platform software for neuroimaging. DV3D harnesses the power of the Visualization Toolkit (VTK) for two-dimensional (2D) and 3D rendering, calling VTK's low level C++ functions from Python. Users interact with data via an intuitive interface that uses Python to bind wxWidgets, which in turn calls the user's operating system dialogs and graphical user interface tools. DV3D currently supports NIfTI-1, ANALYZE™ and DICOM formats for MRI data display (including statistical data overlay). Formats for other data types are supported. The modularity of DV3D and ease of use of Python allows rapid integration of additional format support and user development. DV3D has been tested on Mac OSX, RedHat Linux and Microsoft Windows XP. DV3D is offered for free download with an extensive set of tutorial resources and example data. PMID:19352444
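
    DV3D builds its 2D/3D rendering on VTK called from Python. The snippet below is a generic example of that pattern (a stock VTK pipeline, not DV3D's code): a source feeds a mapper, the mapper feeds an actor, and an interactor gives the user mouse control of the scene.

    ```python
    import vtk

    # Source -> mapper -> actor: the standard VTK visualization pipeline
    source = vtk.vtkSphereSource()
    source.SetThetaResolution(32)
    source.SetPhiResolution(32)

    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(source.GetOutputPort())

    actor = vtk.vtkActor()
    actor.SetMapper(mapper)

    renderer = vtk.vtkRenderer()
    renderer.AddActor(actor)
    renderer.SetBackground(0.1, 0.1, 0.2)

    window = vtk.vtkRenderWindow()
    window.AddRenderer(renderer)

    interactor = vtk.vtkRenderWindowInteractor()
    interactor.SetRenderWindow(window)

    window.Render()
    interactor.Start()   # hand control to the user for rotation, zoom, etc.
    ```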

  15. Atrioventricular junction (AVJ) motion tracking: a software tool with ITK/VTK/Qt.

    PubMed

    Pengdong Xiao; Shuang Leng; Xiaodan Zhao; Hua Zou; Ru San Tan; Wong, Philip; Liang Zhong

    2016-08-01

    The quantitative measurement of atrioventricular junction (AVJ) motion is an important index of ventricular function over one cardiac cycle, including systole and diastole. In this paper, a software tool that can conduct AVJ motion tracking from cardiovascular magnetic resonance (CMR) images is presented, built using the Insight Segmentation and Registration Toolkit (ITK), the Visualization Toolkit (VTK) and Qt. The software tool is written in C++ using the Visual Studio Community 2013 integrated development environment (IDE), which contains both an editor and a Microsoft compiler. The software package has been successfully implemented. From this software engineering practice, it is concluded that ITK, VTK, and Qt are very handy software systems for implementing automatic image analysis functions for CMR images, such as the quantitative measurement of motion by visual tracking.

  16. DYNECHARM++: a toolkit to simulate coherent interactions of high-energy charged particles in complex structures

    NASA Astrophysics Data System (ADS)

    Bagli, Enrico; Guidi, Vincenzo

    2013-08-01

    A toolkit for the simulation of coherent interactions between high-energy charged particles and complex crystal structures, called DYNECHARM++, has been developed. The code is written in C++, taking advantage of the object-oriented programming method. It is capable of evaluating the electrical characteristics of complex atomic structures and of simulating and tracking the particle trajectory within them. A calculation method for the electrical characteristics based on their expansion in Fourier series has been adopted. Two different approaches to simulating the interaction have been adopted, relying on the full integration of particle trajectories under the continuum potential approximation and on the definition of cross-sections of coherent processes. Finally, the code has been shown to reproduce experimental results and to simulate the interaction of charged particles with complex structures.
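
    The two ingredients named above, a planar potential expanded in a Fourier series and full integration of the particle trajectory in the continuum approximation, can be illustrated with a toy one-dimensional example. The sketch below uses SciPy with made-up harmonic coefficients and beam parameters that do not correspond to any real crystal or to DYNECHARM++'s implementation.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy planar continuum potential U(x), periodic in the interplanar distance d,
    # expanded in a few Fourier harmonics (coefficients are illustrative only).
    d = 1.92e-10                      # interplanar spacing [m]
    U0 = [16.0, 4.0, 1.0]             # harmonic amplitudes [eV] (made up)

    def dUdx(x):
        return sum(-a * 2 * np.pi * (k + 1) / d * np.sin(2 * np.pi * (k + 1) * x / d)
                   for k, a in enumerate(U0))

    pv = 400e9                        # p*v of the particle [eV], roughly a 400 GeV proton

    def rhs(z, y):
        x, xp = y                     # transverse position [m] and angle [rad]
        return [xp, -dUdx(x) / pv]    # continuum approximation: pv * x'' = -dU/dx

    # Track a particle entering near a plane with a small angle over 1 mm of crystal
    sol = solve_ivp(rhs, (0.0, 1e-3), [0.2 * d, 5e-6], max_step=1e-6)
    print("transverse excursion (nm):", 1e9 * (sol.y[0].max() - sol.y[0].min()))
    ```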

  17. Clustergrammer, a web-based heatmap visualization and analysis tool for high-dimensional biological data

    PubMed Central

    Fernandez, Nicolas F.; Gundersen, Gregory W.; Rahman, Adeeb; Grimes, Mark L.; Rikova, Klarisa; Hornbeck, Peter; Ma’ayan, Avi

    2017-01-01

    Most tools developed to visualize hierarchically clustered heatmaps generate static images. Clustergrammer is a web-based visualization tool with interactive features such as: zooming, panning, filtering, reordering, sharing, performing enrichment analysis, and providing dynamic gene annotations. Clustergrammer can be used to generate shareable interactive visualizations by uploading a data table to a web-site, or by embedding Clustergrammer in Jupyter Notebooks. The Clustergrammer core libraries can also be used as a toolkit by developers to generate visualizations within their own applications. Clustergrammer is demonstrated using gene expression data from the Cancer Cell Line Encyclopedia (CCLE), original post-translational modification data collected from lung cancer cell lines by a mass spectrometry approach, and original cytometry by time of flight (CyTOF) single-cell proteomics data from blood. Clustergrammer enables producing interactive web-based visualizations for the analysis of diverse biological data. PMID:28994825

  18. MITK global tractography

    NASA Astrophysics Data System (ADS)

    Neher, Peter F.; Stieltjes, Bram; Reisert, Marco; Reicht, Ignaz; Meinzer, Hans-Peter; Fritzsche, Klaus H.

    2012-02-01

    Fiber tracking algorithms yield valuable information for neurosurgery as well as automated diagnostic approaches. However, they have not yet arrived in daily clinical practice. In this paper we present an open source integration of the global tractography algorithm proposed by Reisert et al. [1] into the open source Medical Imaging Interaction Toolkit (MITK) developed and maintained by the Division of Medical and Biological Informatics at the German Cancer Research Center (DKFZ). The integration of this algorithm into a standardized and open development environment like MITK enriches accessibility of tractography algorithms for the science community and is an important step towards bringing neuronal tractography closer to a clinical application. The MITK diffusion imaging application, downloadable from www.mitk.org, combines all the steps necessary for a successful tractography: preprocessing, reconstruction of the images, the actual tracking, live monitoring of intermediate results, postprocessing and visualization of the final tracking results. This paper presents typical tracking results and demonstrates the steps for pre- and post-processing of the images.

  19. MetPetDB: A database for metamorphic geochemistry

    NASA Astrophysics Data System (ADS)

    Spear, Frank S.; Hallett, Benjamin; Pyle, Joseph M.; Adalı, Sibel; Szymanski, Boleslaw K.; Waters, Anthony; Linder, Zak; Pearce, Shawn O.; Fyffe, Matthew; Goldfarb, Dennis; Glickenhouse, Nickolas; Buletti, Heather

    2009-12-01

    We present a data model for the initial implementation of MetPetDB, a geochemical database specific to metamorphic rock samples. The database is designed around the concept of preservation of spatial relationships, at all scales, of chemical analyses and their textural setting. Objects in the database (samples) represent physical rock samples; each sample may contain one or more subsamples with associated geochemical and image data. Samples, subsamples, geochemical data, and images are described with attributes (some required, some optional); these attributes also serve as search delimiters. All data in the database are classified as published (i.e., archived or published data), public or private. Public and published data may be freely searched and downloaded. All private data is owned; permission to view, edit, download and otherwise manipulate private data may be granted only by the data owner; all such editing operations are recorded by the database to create a data version log. The sharing of data permissions among a group of collaborators researching a common sample is done by the sample owner through the project manager. User interaction with MetPetDB is hosted by a web-based platform based upon the Java servlet application programming interface, with the PostgreSQL relational database. The database web portal includes modules that allow the user to interact with the database: registered users may save and download public and published data, upload private data, create projects, and assign permission levels to project collaborators. An Image Viewer module provides for spatial integration of image and geochemical data. A toolkit consisting of plotting and geochemical calculation software for data analysis and a mobile application for viewing the public and published data is being developed. Future issues to address include population of the database, integration with other geochemical databases, development of the analysis toolkit, creation of data models for derivative data, and building a community-wide user base. It is believed that this and other geochemical databases will enable more productive collaborations, generate more efficient research efforts, and foster new developments in basic research in the field of solid earth geochemistry.

  20. A topological framework for interactive queries on 3D models in the Web.

    PubMed

    Figueiredo, Mauro; Rodrigues, José I; Silvestre, Ivo; Veiga-Pires, Cristina

    2014-01-01

    Several technologies exist to create 3D content for the web. With X3D, WebGL, and X3DOM, it is possible to visualize and interact with 3D models in a web browser. Frequently, three-dimensional objects are stored using the X3D file format for the web. However, there is no explicit topological information, which makes it difficult to design fast algorithms for applications that require adjacency and incidence data. This paper presents a new open source toolkit TopTri (Topological model for Triangle meshes) for Web3D servers that builds the topological model for triangular meshes of manifold or nonmanifold models. Web3D client applications using this toolkit make queries to the web server to get adjacent and incidence information of vertices, edges, and faces. This paper shows the application of the topological information to get minimal local points and iso-lines in a 3D mesh in a web browser. As an application, we present also the interactive identification of stalactites in a cave chamber in a 3D web browser. Several tests show that even for large triangular meshes with millions of triangles, the adjacency and incidence information is returned in real time making the presented toolkit appropriate for interactive Web3D applications.
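
    The adjacency and incidence queries TopTri answers can be precomputed from the bare triangle list. The sketch below is a generic Python illustration of such a topological table, not TopTri's own data structure or API.

    ```python
    from collections import defaultdict

    def build_topology(triangles):
        """Build incidence tables from a list of triangles given as vertex-index triples."""
        vertex_faces = defaultdict(set)     # vertex -> incident faces
        edge_faces = defaultdict(set)       # edge   -> incident faces (1 on a border, 2 if manifold)
        vertex_vertices = defaultdict(set)  # vertex -> adjacent vertices

        for f, (a, b, c) in enumerate(triangles):
            for v in (a, b, c):
                vertex_faces[v].add(f)
            for u, v in ((a, b), (b, c), (c, a)):
                edge_faces[frozenset((u, v))].add(f)
                vertex_vertices[u].add(v)
                vertex_vertices[v].add(u)
        return vertex_faces, edge_faces, vertex_vertices

    # Two triangles sharing the edge (1, 2)
    tris = [(0, 1, 2), (2, 1, 3)]
    vf, ef, vv = build_topology(tris)
    print(ef[frozenset((1, 2))])   # -> {0, 1}: both faces are incident to the shared edge
    print(vv[1])                   # -> vertices adjacent to vertex 1
    ```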

  1. A Topological Framework for Interactive Queries on 3D Models in the Web

    PubMed Central

    Figueiredo, Mauro; Rodrigues, José I.; Silvestre, Ivo; Veiga-Pires, Cristina

    2014-01-01

    Several technologies exist to create 3D content for the web. With X3D, WebGL, and X3DOM, it is possible to visualize and interact with 3D models in a web browser. Frequently, three-dimensional objects are stored using the X3D file format for the web. However, there is no explicit topological information, which makes it difficult to design fast algorithms for applications that require adjacency and incidence data. This paper presents a new open source toolkit TopTri (Topological model for Triangle meshes) for Web3D servers that builds the topological model for triangular meshes of manifold or nonmanifold models. Web3D client applications using this toolkit make queries to the web server to get adjacent and incidence information of vertices, edges, and faces. This paper shows the application of the topological information to get minimal local points and iso-lines in a 3D mesh in a web browser. As an application, we present also the interactive identification of stalactites in a cave chamber in a 3D web browser. Several tests show that even for large triangular meshes with millions of triangles, the adjacency and incidence information is returned in real time making the presented toolkit appropriate for interactive Web3D applications. PMID:24977236

  2. A Toolkit for Forward/Inverse Problems in Electrocardiography within the SCIRun Problem Solving Environment

    PubMed Central

    Burton, Brett M; Tate, Jess D; Erem, Burak; Swenson, Darrell J; Wang, Dafang F; Steffen, Michael; Brooks, Dana H; van Dam, Peter M; Macleod, Rob S

    2012-01-01

    Computational modeling in electrocardiography often requires the examination of cardiac forward and inverse problems in order to non-invasively analyze physiological events that are otherwise inaccessible or unethical to explore. The study of these models can be performed in the open-source SCIRun problem solving environment developed at the Center for Integrative Biomedical Computing (CIBC). A new toolkit within SCIRun provides researchers with essential frameworks for constructing and manipulating electrocardiographic forward and inverse models in a highly efficient and interactive way. The toolkit contains sample networks, tutorials and documentation which direct users through SCIRun-specific approaches in the assembly and execution of these specific problems. PMID:22254301

  3. An open-source framework for testing tracking devices using Lego Mindstorms

    NASA Astrophysics Data System (ADS)

    Jomier, Julien; Ibanez, Luis; Enquobahrie, Andinet; Pace, Danielle; Cleary, Kevin

    2009-02-01

    In this paper, we present an open-source framework for testing tracking devices in surgical navigation applications. At the core of image-guided intervention systems is the tracking interface that handles communication with the tracking device and gathers tracking information. Given that the correctness of tracking information is critical for protecting patient safety and for ensuring the successful execution of an intervention, the tracking software component needs to be thoroughly tested on a regular basis. Furthermore, with widespread use of extreme programming methodology that emphasizes continuous and incremental testing of application components, testing design becomes critical. While it is easy to automate most of the testing process, it is often more difficult to test components that require manual intervention such as tracking device. Our framework consists of a robotic arm built from a set of Lego Mindstorms and an open-source toolkit written in C++ to control the robot movements and assess the accuracy of the tracking devices. The application program interface (API) is cross-platform and runs on Windows, Linux and MacOS. We applied this framework for the continuous testing of the Image-Guided Surgery Toolkit (IGSTK), an open-source toolkit for image-guided surgery and shown that regression testing on tracking devices can be performed at low cost and improve significantly the quality of the software.

  4. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research.

    PubMed

    Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM(®)) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard.
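
    One of the processing steps above, SUV normalization of the PET images, has a standard body-weight form. The sketch below (plain Python, with illustrative numbers in place of real DICOM tag values) shows the arithmetic only, not the toolkit's DICOM handling.

    ```python
    import math

    def suv_bw(activity_bq_per_ml, injected_dose_bq, body_weight_kg,
               minutes_since_injection, half_life_minutes=109.77):
        """Body-weight SUV for an F-18 tracer: tissue activity concentration divided by
        the decay-corrected injected dose per gram of body weight."""
        decayed_dose = injected_dose_bq * math.exp(
            -math.log(2) * minutes_since_injection / half_life_minutes)
        return activity_bq_per_ml / (decayed_dose / (body_weight_kg * 1000.0))

    # Illustrative numbers (not from the study): 370 MBq injected, 75 kg patient,
    # scan 60 min post-injection, voxel activity 12 kBq/mL
    print(f"SUVbw = {suv_bw(12_000.0, 370e6, 75.0, 60.0):.2f}")
    ```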

  5. Toward a VPH/Physiome ToolKit.

    PubMed

    Garny, Alan; Cooper, Jonathan; Hunter, Peter J

    2010-01-01

    The Physiome Project was officially launched in 1997 and has since brought together teams from around the world to work on the development of a computational framework for the modeling of the human body. At the European level, this effort is focused around patient-specific solutions and is known as the Virtual Physiological Human (VPH) Initiative.Such modeling is both multiscale (in space and time) and multiphysics. This, therefore, requires careful interaction and collaboration between the teams involved in the VPH/Physiome effort, if we are to produce computer models that are not only quantitative, but also integrative and predictive.In that context, several technologies and solutions are already available, developed both by groups involved in the VPH/Physiome effort, and by others. They address areas such as data handling/fusion, markup languages, model repositories, ontologies, tools (for simulation, imaging, data fitting, etc.), as well as grid, middleware, and workflow.Here, we provide an overview of resources that should be considered for inclusion in the VPH/Physiome ToolKit (i.e., the set of tools that addresses the needs and requirements of the Physiome Project and VPH Initiative) and discuss some of the challenges that we are still facing.

  6. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    PubMed

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/
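
    To give a flavour of the network analysis such a toolkit enables, the sketch below computes two simple graph measures with the generic networkx library; it does not use the Connectome Viewer's plugin API, and the regions and edge weights are invented.

```python
# Illustrative connectome-style network analysis with networkx (a generic
# sketch, not the Connectome Viewer Toolkit's plugin API).
import networkx as nx

# Hypothetical structural connectivity: edges weighted by fiber count.
g = nx.Graph()
g.add_weighted_edges_from([
    ("precentral_L", "postcentral_L", 120),
    ("precentral_L", "superiorfrontal_L", 85),
    ("postcentral_L", "superiorparietal_L", 60),
])

print(nx.degree_centrality(g))                     # hub-ness of each region
print(nx.average_clustering(g, weight="weight"))   # weighted clustering
```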

  7. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110

  8. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    PubMed

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.
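
    The two-level design described above (feature panels feeding multivariate machine-learning models) can be illustrated with a deliberately minimal sketch using numpy and scikit-learn; the features, labels and cohort below are synthetic, and none of this is CaPTk's actual API.

```python
# Generic two-level radiomics-style pipeline: simple features -> classifier.
# Illustrative only; this is not CaPTk's API, and the feature panel is minimal.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def intensity_features(roi_voxels):
    """First-order intensity features from a 1-D array of ROI voxel values."""
    return [roi_voxels.mean(), roi_voxels.std(),
            np.percentile(roi_voxels, 90), roi_voxels.max() - roi_voxels.min()]

rng = np.random.default_rng(0)
# Hypothetical cohort: 40 ROIs, each a vector of voxel intensities.
rois = [rng.normal(loc=100 + 10 * (i % 2), scale=15, size=500) for i in range(40)]
X = np.array([intensity_features(r) for r in rois])
y = np.array([i % 2 for i in range(40)])   # e.g., responder vs non-responder

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(model.predict(X[:4]))
```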

  9. Carotid stenosis assessment with multi-detector CT angiography: comparison between manual and automatic segmentation methods.

    PubMed

    Zhu, Chengcheng; Patterson, Andrew J; Thomas, Owen M; Sadat, Umar; Graves, Martin J; Gillard, Jonathan H

    2013-04-01

    Luminal stenosis is used for selecting the optimal management strategy for patients with carotid artery disease. The aim of this study was to evaluate the reproducibility of carotid stenosis quantification using manual and automated segmentation methods on submillimeter through-plane resolution Multi-Detector CT angiography (MDCTA). 35 patients with carotid artery disease and >30 % luminal stenosis, as identified by carotid duplex imaging, underwent contrast-enhanced MDCTA. Two experienced CT readers quantified carotid stenosis from axial source images, reconstructed maximum intensity projection (MIP) images and 3D carotid geometry automatically segmented by an open-source toolkit (Vascular Modelling Toolkit, VMTK), using NASCET criteria. Good agreement among the measurements using axial images, MIP and automatic segmentation was observed. The automatic segmentation method showed better inter-observer agreement between the readers (intra-class correlation coefficient (ICC): 0.99 for diameter stenosis measurement) than manual measurement of axial (ICC = 0.82) and MIP (ICC = 0.86) images. Carotid stenosis quantification using an automatic segmentation method has higher reproducibility compared with manual methods.
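
    The NASCET criterion used for these measurements reduces to a simple ratio of two diameters; a small sketch follows, with example values chosen for illustration only.

```python
def nascet_stenosis_percent(min_lumen_diameter_mm, distal_ica_diameter_mm):
    """Degree of stenosis per the NASCET definition:
    (1 - minimal residual lumen / normal distal ICA diameter) x 100."""
    return (1.0 - min_lumen_diameter_mm / distal_ica_diameter_mm) * 100.0

# Example: a 1.8 mm residual lumen against a 6.0 mm distal ICA -> 70 % stenosis
print(nascet_stenosis_percent(1.8, 6.0))
```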

  10. SimpleITK Image-Analysis Notebooks: a Collaborative Environment for Education and Reproducible Research.

    PubMed

    Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard

    2018-06-01

    Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image analysis workflows for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks .
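
    A minimal example in the spirit of the notebooks is sketched below, assuming a locally available image file; the specific filter choices are illustrative rather than taken from a particular notebook.

```python
# Minimal SimpleITK sketch: load a volume, smooth it, and segment it with Otsu
# thresholding. The file path is a placeholder.
import SimpleITK as sitk

img = sitk.ReadImage("ct_volume.nii.gz", sitk.sitkFloat32)
smoothed = sitk.CurvatureFlow(image1=img, timeStep=0.125, numberOfIterations=5)
mask = sitk.OtsuThreshold(smoothed, 0, 1)   # bright (above-threshold) voxels -> 1

stats = sitk.LabelStatisticsImageFilter()
stats.Execute(img, mask)
print("foreground mean intensity:", stats.GetMean(1))
```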

  11. Tracker Toolkit

    NASA Technical Reports Server (NTRS)

    Lewis, Steven J.; Palacios, David M.

    2013-01-01

    This software can track multiple moving objects within a video stream simultaneously, use visual features to aid in the tracking, and initiate tracks based on object detection in a subregion. A simple programmatic interface allows plugging into larger image chain modeling suites. It extracts unique visual features for aid in tracking and later analysis, and includes sub-functionality for extracting visual features about an object identified within an image frame. Tracker Toolkit utilizes a feature extraction algorithm to tag each object with metadata features about its size, shape, color, and movement. Its functionality is independent of the scale of objects within a scene. The only assumption made on the tracked objects is that they move. There are no constraints on size within the scene, shape, or type of movement. The Tracker Toolkit is also capable of following an arbitrary number of objects in the same scene, identifying and propagating the track of each object from frame to frame. Target objects may be specified for tracking beforehand, or may be dynamically discovered within a tripwire region. Initialization of the Tracker Toolkit algorithm includes two steps: initializing the data structures for tracked target objects, including targets preselected for tracking; and initializing the tripwire region. If no tripwire region is desired, this step is skipped. The tripwire region is an area within the frames that is always checked for new objects, and all new objects discovered within the region will be tracked until lost (by leaving the frame, stopping, or blending into the background).
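
    The tripwire-initiation and frame-to-frame propagation behaviour described above can be mimicked with a very small nearest-centroid tracker; the sketch below is purely illustrative and is not the Tracker Toolkit's programmatic interface.

```python
# Illustrative sketch of tripwire-based track initiation plus nearest-centroid
# track propagation. This mimics the behaviour described in the abstract but is
# not the Tracker Toolkit's actual programmatic interface.
import math
from itertools import count

_track_ids = count(1)

def inside(box, point):
    (x0, y0, x1, y1), (x, y) = box, point
    return x0 <= x <= x1 and y0 <= y <= y1

def update_tracks(tracks, detections, tripwire, max_jump=25.0):
    """tracks: {id: (x, y)}; detections: list of (x, y) centroids this frame."""
    unmatched = list(detections)
    for tid, pos in list(tracks.items()):
        if not unmatched:
            break
        nearest = min(unmatched, key=lambda d: math.dist(pos, d))
        if math.dist(pos, nearest) <= max_jump:   # propagate existing track
            tracks[tid] = nearest
            unmatched.remove(nearest)
    for det in unmatched:                          # initiate new tracks only
        if inside(tripwire, det):                  # inside the tripwire region
            tracks[next(_track_ids)] = det
    return tracks
```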

  12. A Toolkit to Study Sensitivity of the Geant4 Predictions to the Variations of the Physics Model Parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fields, Laura; Genser, Krzysztof; Hatcher, Robert

    Geant4 is the leading detector simulation toolkit used in high energy physics to design detectors and to optimize calibration and reconstruction software. It employs a set of carefully validated physics models to simulate interactions of particles with matter across a wide range of interaction energies. These models, especially the hadronic ones, rely largely on directly measured cross-sections and phenomenological predictions with physically motivated parameters estimated by theoretical calculation or measurement. Because these models are tuned to cover a very wide range of possible simulation tasks, they may not always be optimized for a given process or a given material. This raises several critical questions, e.g. how sensitive Geant4 predictions are to the variations of the model parameters, or what uncertainties are associated with a particular tune of a Geant4 physics model, or a group of models, or how to consistently derive guidance for Geant4 model development and improvement from a wide range of available experimental data. We have designed and implemented a comprehensive, modular, user-friendly software toolkit to study and address such questions. It allows one to easily modify parameters of one or several Geant4 physics models involved in the simulation, and to perform collective analysis of multiple variants of the resulting physics observables of interest and comparison against a variety of corresponding experimental data. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy-to-expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented and illustrated with results obtained with Geant4 key hadronic models.
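
    The vary-parameters/compare-to-data workflow that the toolkit automates can be sketched generically; in the toy example below the "simulation" is a stand-in exponential model rather than a Geant4 physics model, and the reference data points are invented.

```python
# Generic sketch of the vary-parameter / compare-to-data workflow the toolkit
# automates for Geant4 models. The simulate() function is a stand-in toy model,
# not a Geant4 physics model, and the reference data are synthetic.
import numpy as np

def simulate(param, energies):
    """Toy observable (e.g., a cross-section shape) depending on one parameter."""
    return np.exp(-energies / param)

energies = np.linspace(1.0, 10.0, 10)
data = np.exp(-energies / 4.2) + np.random.default_rng(1).normal(0, 0.01, 10)
sigma = np.full_like(data, 0.01)

# Scan the parameter and rank variants by chi-square against the reference data.
for param in [3.5, 4.0, 4.5, 5.0]:
    pred = simulate(param, energies)
    chi2 = np.sum(((pred - data) / sigma) ** 2)
    print(f"param={param:.1f}  chi2={chi2:.1f}")
```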

  13. Three-dimensional in vitro cancer spheroid models for Photodynamic Therapy: Strengths and Opportunities

    NASA Astrophysics Data System (ADS)

    Evans, Conor

    2015-03-01

    Three dimensional, in vitro spheroid cultures offer considerable utility for the development and testing of anticancer photodynamic therapy regimens. More complex than monolayer cultures, three-dimensional spheroid systems replicate many of the important cell-cell and cell-matrix interactions that modulate treatment response in vivo. Simple enough to be grown by the thousands and small enough to be optically interrogated, spheroid cultures lend themselves to high-content and high-throughput imaging approaches. These advantages have enabled studies investigating photosensitizer uptake, spatiotemporal patterns of therapeutic response, alterations in oxygen diffusion and consumption during therapy, and the exploration of mechanisms that underlie therapeutic synergy. The use of quantitative imaging methods, in particular, has accelerated the pace of three-dimensional in vitro photodynamic therapy studies, enabling the rapid compilation of multiple treatment response parameters in a single experiment. Improvements in model cultures, the creation of new molecular probes of cell state and function, and innovations in imaging toolkits will be important for the advancement of spheroid culture systems for future photodynamic therapy studies.

  14. Raman spectroscopy based toolkit for mapping bacterial social interactions relevant to human and plant health

    NASA Astrophysics Data System (ADS)

    Couvillion, Sheha Polisetti

    Bacteria interact and co-exist with other microbes and with higher organisms like plants and humans, playing a major role in their health and well-being. These ubiquitous single-celled organisms are so successful because they can form organized communities, called biofilms, that protect them from environmental stressors and enable communication and cooperation among members of the community. The work described in this thesis develops a toolkit of analytical techniques centered around Raman microspectroscopy and imaging, representing a powerful approach to non-invasively investigate bacterial communities and yielding molecular information at the sub-micrometer length scale. Bacterial cellular components of non-pigmented and pigmented rhizosphere strains are characterized, and regiospecific SERS is used for cases where resonantly enhanced background signals obscure the spectra. Silver nanoparticle colloids were synthesized in situ, in the presence of the cells, to form a proximal coating, and principal component analysis (PCA) revealed features attributed to flavins. SERS enabled in situ acquisition of Raman spectra and chemical images in highly autofluorescent P. aeruginosa biofilms. In combination with PCA, this allowed for non-invasive spatial mapping of bacterial communities and revealed differences between strains and nutrients in the secretion of the virulence factor pyocyanin. The rich potential of using Raman microspectroscopy to study plant-microbe interactions is demonstrated. The effect of exposure to oxidative stress on both the wild-type Pantoea sp. YR343 and the carotenoid mutant Delta crtB was assessed by following the intensity of the 1520 cm-1 and 1126 cm-1 Raman bands, respectively, after treatment with various concentrations of H2O2. Significant changes were observed in these marker bands even at concentrations (1 mM) below the point at which the traditional plate-based viability assay shows an effect (5-10 mM), thus establishing the value of Raman microspectroscopy as a tool for high-sensitivity studies of bacterial environmental stressors. The use of PCA in Raman imaging can also discriminate between spectral contributions from plant and bacterial cells. Finally, spectroscopy-compatible microfluidic corral platforms are fabricated and a simple microfluidic technique is demonstrated for capturing bacterial cells. This opens up the possibility of studying bacterial communication in settings where it is possible to control population size and microenvironment.
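
    The PCA step used throughout this work can be illustrated with a generic scikit-learn sketch on synthetic spectra; the 1520 cm-1 carotenoid band is used here only to construct the toy data, and the workflow is not taken from the thesis itself.

```python
# Generic sketch of PCA on a matrix of Raman spectra (rows = spectra,
# columns = wavenumber channels). The spectra are synthetic; in the work
# described above, PCA is applied to measured spectra and maps.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavenumbers = np.linspace(600, 1800, 600)
carotenoid_band = np.exp(-((wavenumbers - 1520) / 10) ** 2)   # ~1520 cm-1 marker
baseline = 0.1 * rng.normal(size=(50, wavenumbers.size))
spectra = baseline + np.outer(rng.uniform(0, 1, 50), carotenoid_band)

scores = PCA(n_components=3).fit_transform(spectra)
print(scores[:5, 0])   # PC1 scores track the carotenoid band intensity
```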

  15. 3CCD image segmentation and edge detection based on MATLAB

    NASA Astrophysics Data System (ADS)

    He, Yong; Pan, Jiazhi; Zhang, Yun

    2006-09-01

    This research aimed to identify weeds from crops at an early stage of field operation using image-processing technology. 3CCD images offer a greater binary-value difference between weed and crop regions than ordinary digital images taken by common cameras: the camera has three channels (green, red, infrared) that capture a snapshot of the same area, and the three images can be composed into one, which facilitates the segmentation of different areas. With the image-processing toolkit in MATLAB, the different areas in the image can be segmented clearly. Because edge detection is the first and a very important step in image processing, the results of different processing methods were compared. In particular, using the wavelet packet transform toolkit in MATLAB, an image was preprocessed and its edges then extracted, yielding a more clearly cut edge image. The segmentation methods include operations such as erosion, dilation and other algorithms to preprocess the images. Segmenting different areas of digital images in the field in real time is of great importance for precision farming, saving energy, herbicide and many other materials. At present, large-scale software such as MATLAB on a PC was used, but the computation can be reduced and integrated into a small embedded system, which means that the application of this technique in agricultural engineering is feasible and of great economic value.
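
    A Python/scikit-image analogue of the steps described above (thresholding, erosion/dilation clean-up and Sobel edge detection) is sketched below on a synthetic image; it is not the original MATLAB implementation, and the image values are made up.

```python
# Python/scikit-image analogue of the MATLAB steps described above (threshold,
# morphological clean-up, Sobel edge detection). The input image is synthetic.
import numpy as np
from skimage import filters, morphology

rng = np.random.default_rng(0)
image = rng.normal(0.2, 0.05, (128, 128))
image[40:80, 40:80] += 0.5                        # stand-in "vegetation" region

binary = image > filters.threshold_otsu(image)    # segment the bright region
binary = morphology.binary_dilation(morphology.binary_erosion(binary))
edges = filters.sobel(image)                      # edge map for boundary extraction

print("segmented pixels:", int(binary.sum()), "max edge response:", float(edges.max()))
```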

  16. Genome and epigenome engineering CRISPR toolkit for in vivo modulation of cis-regulatory interactions and gene expression in the chicken embryo.

    PubMed

    Williams, Ruth M; Senanayake, Upeka; Artibani, Mara; Taylor, Gunes; Wells, Daniel; Ahmed, Ahmed Ashour; Sauka-Spengler, Tatjana

    2018-02-23

    CRISPR/Cas9 genome engineering has revolutionised all aspects of biological research, with epigenome engineering transforming gene regulation studies. Here, we present an optimised, adaptable toolkit enabling genome and epigenome engineering in the chicken embryo, and demonstrate its utility by probing gene regulatory interactions mediated by neural crest enhancers. First, we optimise novel efficient guide-RNA mini expression vectors utilising chick U6 promoters, provide a strategy for rapid somatic gene knockout and establish a protocol for evaluation of mutational penetrance by targeted next-generation sequencing. We show that CRISPR/Cas9-mediated disruption of transcription factors causes a reduction in their cognate enhancer-driven reporter activity. Next, we assess endogenous enhancer function using both enhancer deletion and nuclease-deficient Cas9 (dCas9) effector fusions to modulate enhancer chromatin landscape, thus providing the first report of epigenome engineering in a developing embryo. Finally, we use the synergistic activation mediator (SAM) system to activate an endogenous target promoter. The novel genome and epigenome engineering toolkit developed here enables manipulation of endogenous gene expression and enhancer activity in chicken embryos, facilitating high-resolution analysis of gene regulatory interactions in vivo. © 2018. Published by The Company of Biologists Ltd.

  17. A comparative study of gamma-ray interaction and absorption in some building materials using Zeff-toolkit

    NASA Astrophysics Data System (ADS)

    Mann, Kulwinder Singh; Heer, Manmohan Singh; Rani, Asha

    2016-07-01

    The gamma-ray shielding behaviour of a material can be investigated by determining its various interaction and energy-absorption parameters (such as mass attenuation coefficients, mass energy absorption coefficients, and corresponding effective atomic numbers and electron densities). A literature review indicates that the effective atomic number (Zeff) has been used as an extensive parameter for evaluating the effects and defects in chosen materials caused by ionising radiation (X-rays and gamma-rays). A computer program (Zeff-toolkit) has been designed for obtaining the mean value of the effective atomic number calculated by three different methods. Good agreement between the results obtained with Zeff-toolkit, the Auto_Zeff software and experimentally measured values of Zeff has been observed. The Zeff-toolkit is capable of computing effective atomic numbers for both photon interaction (Zeff,PI) and energy absorption (Zeff,En) using three methods for each; no similar computer program in the literature computes these parameters simultaneously. The computed parameters have been compared and correlated over a wide energy range (0.001-20 MeV) for 10 commonly used building materials. Prominent variations in these parameters with gamma-ray photon energy have been observed due to the dominance of various absorption and scattering phenomena. The mean values of the two effective atomic numbers (Zeff,PI and Zeff,En) are equivalent at energies below 0.002 MeV and above 0.3 MeV, indicating the dominance of gamma-ray absorption (photoelectric and pair production) over scattering (Compton) at these energies. Conversely, in the energy range 0.002-0.3 MeV, Compton scattering of gamma-rays dominates over absorption. Of the 10 chosen samples of building materials, 2 soils showed better shielding behaviour than the other 8 materials.
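
    One commonly used "direct" formula for the photon-interaction effective atomic number combines atomic fractions, atomic weights and elemental mass attenuation coefficients; the sketch below implements that formula as an assumption about one of the three methods, is not the Zeff-toolkit itself, and uses placeholder coefficients rather than tabulated values.

```python
# Sketch of a "direct" effective atomic number formula:
#   Zeff = sum_i(f_i * A_i * mu_i) / sum_j(f_j * (A_j / Z_j) * mu_j),
# with f the atomic (number) fractions and mu the elemental mass attenuation
# coefficients at the photon energy of interest. Illustrative assumption only;
# the coefficients below are placeholders, not tabulated values.

def zeff_direct(elements):
    """elements: list of (atomic_fraction, Z, A, mass_atten_coeff)."""
    num = sum(f * A * mu for f, Z, A, mu in elements)
    den = sum(f * (A / Z) * mu for f, Z, A, mu in elements)
    return num / den

# Hypothetical two-element material (hydrogen and oxygen, placeholder mu values).
print(zeff_direct([(2 / 3, 1, 1.008, 0.40), (1 / 3, 8, 15.999, 0.20)]))
```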

  18. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research

    PubMed Central

    Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard. PMID:27257542

  19. The MIGenAS integrated bioinformatics toolkit for web-based sequence analysis

    PubMed Central

    Rampp, Markus; Soddemann, Thomas; Lederer, Hermann

    2006-01-01

    We describe a versatile and extensible integrated bioinformatics toolkit for the analysis of biological sequences over the Internet. The web portal offers convenient interactive access to a growing pool of chainable bioinformatics software tools and databases that are centrally installed and maintained by the RZG. Currently, supported tasks comprise sequence similarity searches in public or user-supplied databases, computation and validation of multiple sequence alignments, phylogenetic analysis and protein–structure prediction. Individual tools can be seamlessly chained into pipelines allowing the user to conveniently process complex workflows without the necessity to take care of any format conversions or tedious parsing of intermediate results. The toolkit is part of the Max-Planck Integrated Gene Analysis System (MIGenAS) of the Max Planck Society available at (click ‘Start Toolkit’). PMID:16844980

  20. The PRIDE (Partnership to Improve Diabetes Education) Toolkit: Development and Evaluation of Novel Literacy and Culturally Sensitive Diabetes Education Materials.

    PubMed

    Wolff, Kathleen; Chambers, Laura; Bumol, Stefan; White, Richard O; Gregory, Becky Pratt; Davis, Dianne; Rothman, Russell L

    2016-02-01

    Patients with low literacy, low numeracy, and/or linguistic needs can experience challenges understanding diabetes information and applying concepts to their self-management. The authors designed a toolkit of education materials that are sensitive to patients' literacy and numeracy levels, language preferences, and cultural norms and that encourage shared goal setting to improve diabetes self-management and health outcomes. The Partnership to Improve Diabetes Education (PRIDE) toolkit was developed to facilitate diabetes self-management education and support. The PRIDE toolkit includes a comprehensive set of 30 interactive education modules in English and Spanish to support diabetes self-management activities. The toolkit builds upon the authors' previously validated Diabetes Literacy and Numeracy Education Toolkit (DLNET) by adding a focus on shared goal setting, addressing the needs of Spanish-speaking patients, and including a broader range of diabetes management topics. Each PRIDE module was evaluated using the Suitability Assessment of Materials (SAM) instrument to determine the material's cultural appropriateness and its sensitivity to the needs of patients with low literacy and low numeracy. Reading grade level was also assessed using the Automated Readability Index (ARI), Coleman-Liau, Flesch-Kincaid, Fry, and SMOG formulas. The average reading grade level of the materials was 5.3 (SD 1.0), with a mean SAM of 91.2 (SD 5.4). All of the 30 modules received a "superior" score (SAM >70%) when evaluated by 2 independent raters. The PRIDE toolkit modules can be used by all members of a multidisciplinary team to assist patients with low literacy and low numeracy in managing their diabetes. © 2015 The Author(s).
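
    Two of the readability formulas named above (Flesch-Kincaid grade and the Automated Readability Index) are simple enough to sketch directly; the syllable counter below is a crude vowel-group heuristic, so the scores are only approximate and are not the scoring code used in the study.

```python
# Sketch of two of the readability formulas named above (Flesch-Kincaid grade
# and the Automated Readability Index). The syllable counter is a crude
# vowel-group heuristic, so scores are approximate.
import re

def count_syllables(word):
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words, n_chars = len(words), sum(len(w) for w in words)
    n_syll = sum(count_syllables(w) for w in words)
    fk = 0.39 * n_words / sentences + 11.8 * n_syll / n_words - 15.59
    ari = 4.71 * n_chars / n_words + 0.5 * n_words / sentences - 21.43
    return {"flesch_kincaid_grade": round(fk, 1), "ari": round(ari, 1)}

print(readability("Check your blood sugar before meals. Write the number in your log."))
```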

  1. Testing Video and Social Media for Engaging Users of the U.S. Climate Resilience Toolkit

    NASA Astrophysics Data System (ADS)

    Green, C. J.; Gardiner, N.; Niepold, F., III; Esposito, C.

    2015-12-01

    We developed a custom video production style and a method for analyzing social media behavior so that we may deliberately build and track audience growth for decision-support tools and case studies within the U.S. Climate Resilience Toolkit. The new style of video focuses quickly on decision processes; its 30-second format is well-suited for deployment through social media. We measured both traffic and engagement with video using Google Analytics. Each video included an embedded tag, allowing us to measure viewers' behavior: whether or not they entered the toolkit website; the duration of their session on the website; and the number of pages they visited in that session. Results showed that video promotion was more effective on Facebook than Twitter. Facebook links generated twice the number of visits to the toolkit. Videos also increased Facebook interaction overall. Because most Facebook users are return visitors, this campaign did not substantially draw new site visitors. We continue to research and apply these methods in a targeted engagement and outreach campaign that utilizes the theory of social diffusion and social influence strategies to grow our audience of "influential" decision-makers and people within their social networks. Our goal is to increase access to and use of the U.S. Climate Resilience Toolkit.

  2. Interactive Visualization of Computational Fluid Dynamics using Mosaic

    NASA Technical Reports Server (NTRS)

    Clucas, Jean; Watson, Velvin; Chancellor, Marisa K. (Technical Monitor)

    1994-01-01

    The Web provides new methods for accessing information world-wide, but the current text-and-pictures approach neither utilizes all the Web's possibilities nor provides for its limitations. While the inclusion of pictures and animations in a paper communicates more effectively than text alone, it is essentially an extension of the concept of "publication." Also, as use of the Web increases, putting images and animations online will quickly load even the "Information Superhighway." We need to find forms of communication that take advantage of the special nature of the Web. This paper presents one approach: the use of the Internet and the Mosaic interface for data sharing and collaborative analysis. We will describe (and, in the presentation, demonstrate) our approach: using FAST (Flow Analysis Software Toolkit), a scientific visualization package, as a data viewer and interactive tool called from Mosaic. Our intent is to stimulate the development of other tools that utilize the unique nature of electronic communication.

  3. Comparison of GEANT4 very low energy cross section models with experimental data in water.

    PubMed

    Incerti, S; Ivanchenko, A; Karamitros, M; Mantero, A; Moretto, P; Tran, H N; Mascialino, B; Champion, C; Ivanchenko, V N; Bernal, M A; Francis, Z; Villagrasa, C; Baldacchin, G; Guèye, P; Capra, R; Nieminen, P; Zacharatou, C

    2010-09-01

    The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. The GEANT4-DNA physics models available in the GEANT4 toolkit have been compared in this article to available experimental data in the water vapor phase as well as to several published recommendations on the mass stopping power. These models represent a first step in the extension of the GEANT4 Monte Carlo toolkit to the simulation of biological effects of ionizing radiation.
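
    The Kolmogorov-Smirnov comparison mentioned above can be illustrated with a generic scipy sketch on synthetic samples; this is not the dedicated statistical toolkit used by the authors, and the sample values are invented.

```python
# Generic sketch of a Kolmogorov-Smirnov compatibility check between simulated
# and measured samples, in the spirit of the comparison described above.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
simulated = rng.normal(loc=1.00, scale=0.10, size=200)   # e.g., scaled cross-sections
measured = rng.normal(loc=1.02, scale=0.12, size=150)

stat, p_value = ks_2samp(simulated, measured)
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3f}")
```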

  4. Integrating existing software toolkits into VO system

    NASA Astrophysics Data System (ADS)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. A large number of high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for the VO developers. The VO architecture greatly depends on Grid and Web services, so the general VO integration route is "Java Ready - Grid Ready - VO Ready". In the paper, we discuss the importance of VO integration for existing toolkits and possible solutions. We introduce two efforts in this field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert a Grid service to a VO service.

  5. SU-E-T-565: RAdiation Resistance of Cancer CElls Using GEANT4 DNA: RACE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perrot, Y; Payno, H; Delage, E

    2014-06-01

    Purpose: The objective of the RACE project is to develop a comparison between Monte Carlo simulation using the Geant4-DNA toolkit and measurements of radiation damage on 3D melanoma and chondrosarcoma culture cells coupled with gadolinium nanoparticles. We currently expose the status of the developments regarding simulations. Methods: Monte Carlo studies are driven using the Geant4 toolkit and the Geant4-DNA extension. In order to model the geometry of a cell population, the open-source CPOP++ program is being developed for the geometrical representation of 3D cell populations including a specific cell mesh coupled with a multi-agent system. Each cell includes cytoplasm and nucleus. The correct modeling of the cell population has been validated with confocal microscopy images of spheroids. The Geant4 Livermore physics models are used to simulate the interactions of a 250 keV X-ray beam and the production of secondaries from gadolinium nanoparticles supposed to be fixed on the cell membranes. Geant4-DNA processes are used to simulate the interactions of charged particles with the cells. An atomistic description of the DNA molecule, from PDB (Protein Data Bank) files, is provided by the so-called PDB4DNA Geant4 user application we developed to score energy depositions in DNA base pairs and sugar-phosphate groups. Results: At the microscopic level, our simulations enable assessing microscopic energy distribution in each cell compartment of a realistic 3D cell population. Dose enhancement factors due to the presence of gadolinium nanoparticles can be estimated. At the nanometer scale, direct damages on nuclear DNA are also estimated. Conclusion: We successfully simulated the impact of direct radiations on a realistic 3D cell population model compatible with microdosimetry calculations using the Geant4-DNA toolkit. Upcoming validation and the future integration of the radiochemistry module of Geant4-DNA will propose to correlate clusters of ionizations with in vitro experiments. All those developments will be released publicly. This work was supported by grants from Plan Cancer 2009-2013 French national initiative managed by INSERM (Institut National de la Sante et de la Recherche Medicale)

  6. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig

    PubMed Central

    Sahoo, Satya S.; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A.; Lhatoo, Samden D.

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This “neuroscience Big data” represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability—the ability to efficiently process increasing volumes of data; (b) Adaptability—the toolkit can be deployed across different computing configurations; and (c) Ease of programming—the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that the toolkit is highly scalable and adaptable, which makes it suitable for use in neuroscience applications as a scalable data processing toolkit. As part of the ongoing extension of NeuroPigPen, we are developing new modules to support statistical functions to analyze signal data for brain connectivity research. In addition, the toolkit is being extended to allow integration with scientific workflow systems. NeuroPigPen is released under BSD license at: https://sites.google.com/a/case.edu/neuropigpen/. PMID:27375472

  7. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    PubMed

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that the toolkit is highly scalable and adaptable, which makes it suitable for use in neuroscience applications as a scalable data processing toolkit. As part of the ongoing extension of NeuroPigPen, we are developing new modules to support statistical functions to analyze signal data for brain connectivity research. In addition, the toolkit is being extended to allow integration with scientific workflow systems. NeuroPigPen is released under BSD license at: https://sites.google.com/a/case.edu/neuropigpen/.

  8. Using CamiTK for rapid prototyping of interactive computer assisted medical intervention applications.

    PubMed

    Promayon, Emmanuel; Fouard, Céline; Bailet, Mathieu; Deram, Aurélien; Fiard, Gaëlle; Hungr, Nikolai; Luboz, Vincent; Payan, Yohan; Sarrazin, Johan; Saubat, Nicolas; Selmi, Sonia Yuki; Voros, Sandrine; Cinquin, Philippe; Troccaz, Jocelyne

    2013-01-01

    Computer Assisted Medical Intervention (CAMI hereafter) is a complex multi-disciplinary field. CAMI research requires the collaboration of experts in several fields as diverse as medicine, computer science, mathematics, instrumentation, signal processing, mechanics, modeling, automatics, optics, etc. CamiTK is a modular framework that helps researchers and clinicians to collaborate together in order to prototype CAMI applications by regrouping the knowledge and expertise from each discipline. It is an open-source, cross-platform, generic and modular tool written in C++ which can handle medical images, surgical navigation, biomedical simulations and robot control. This paper presents the Computer Assisted Medical Intervention ToolKit (CamiTK) and how it is used in various applications in our research team.

  9. LibIsopach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas

    2016-12-06

    LibIsopach is a toolkit for high performance distributed immersive visualization, leveraging modern OpenGL. It features a multi-process scenegraph, explicit instance rendering, mesh generation, and three-dimensional user interaction event processing.

  10. SLX4 Assembles a Telomere Maintenance Toolkit by Bridging Multiple Endonucleases with Telomeres

    PubMed Central

    Wan, Bingbing; Yin, Jinhu; Horvath, Kent; Sarkar, Jaya; Chen, Yong; Wu, Jian; Wan, Ke; Lu, Jian; Gu, Peili; Yu, Eun Young; Lue, Neal F.; Chang, Sandy

    2014-01-01

    Summary SLX4 interacts with several endonucleases to resolve structural barriers in DNA metabolism. SLX4 also interacts with telomeric protein TRF2 in human cells. The molecular mechanism of these interactions at telomeres remains unknown. Here, we report the crystal structure of the TRF2-binding motif of SLX4 (SLX4TBM) in complex with the TRFH domain of TRF2 (TRF2TRFH) and map the interactions of SLX4 with endonucleases SLX1, XPF, and MUS81. TRF2 recognizes a unique HxLxP motif on SLX4 via the peptide-binding site in its TRFH domain. Telomeric localization of SLX4 and associated nucleases depend on the SLX4-endonuclease and SLX4-TRF2 interactions and the protein levels of SLX4 and TRF2. SLX4 assembles an endonuclease toolkit that negatively regulates telomere length via SLX1-catalyzed nucleolytic resolution of telomere DNA structures. We propose that the SLX4-TRF2 complex serves as a double-layer scaffold bridging multiple endonucleases with telomeres for recombination-based telomere maintenance. PMID:24012755

  11. New Additions to the Toolkit for Forward/Inverse Problems in Electrocardiography within the SCIRun Problem Solving Environment.

    PubMed

    Coll-Font, Jaume; Burton, Brett M; Tate, Jess D; Erem, Burak; Swenson, Darrel J; Wang, Dafang; Brooks, Dana H; van Dam, Peter; Macleod, Rob S

    2014-09-01

    Cardiac electrical imaging often requires the examination of different forward and inverse problem formulations based on mathematical and numerical approximations of the underlying source and the intervening volume conductor that can generate the associated voltages on the surface of the body. If the goal is to recover the source on the heart from body surface potentials, the solution strategy must include numerical techniques that can incorporate appropriate constraints and recover useful solutions, even though the problem is badly posed. Creating complete software solutions to such problems is a daunting undertaking. In order to make such tools more accessible to a broad array of researchers, the Center for Integrative Biomedical Computing (CIBC) has made an ECG forward/inverse toolkit available within the open source SCIRun system. Here we report on three new methods added to the inverse suite of the toolkit. These new algorithms, namely a Total Variation method, a non-decreasing TMP inverse and a spline-based inverse, consist of two inverse methods that take advantage of the temporal structure of the heart potentials and one that leverages the spatial characteristics of the transmembrane potentials. These three methods further expand the possibilities of researchers in cardiology to explore and compare solutions to their particular imaging problem.
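
    The new inverse methods added to the toolkit are more elaborate than plain Tikhonov regularization, but a zeroth-order Tikhonov sketch shows the basic regularized-inverse structure such methods build on; the forward matrix, source and regularization weight below are synthetic stand-ins, not taken from the toolkit.

```python
# Zeroth-order Tikhonov sketch of a regularized ECG-style inverse solution:
# recover heart-surface potentials x from torso potentials y = A x + noise.
# This shows only the basic regularized-inverse structure; the toolkit's new
# methods (Total Variation, non-decreasing TMP, spline-based) are more elaborate.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(120, 60))           # stand-in forward (heart -> torso) matrix
x_true = np.sin(np.linspace(0, np.pi, 60))
y = A @ x_true + rng.normal(scale=0.05, size=120)

lam = 0.5                                 # regularization weight (hand-picked here)
x_hat = np.linalg.solve(A.T @ A + lam**2 * np.eye(60), A.T @ y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```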

  12. Iplt--image processing library and toolkit for the electron microscopy community.

    PubMed

    Philippsen, Ansgar; Schenk, Andreas D; Stahlberg, Henning; Engel, Andreas

    2003-01-01

    We present the foundation for establishing a modular, collaborative, integrated, open-source architecture for image processing of electron microscopy images, named iplt. It is designed around object oriented paradigms and implemented using the programming languages C++ and Python. In many aspects it deviates from classical image processing approaches. This paper intends to motivate developers within the community to participate in this on-going project. The iplt homepage can be found at http://www.iplt.org.

  13. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    PubMed

    Anderson, K; Griffiths, D; DeBell, L; Hancock, S; Duffy, J P; Shutler, J D; Reinhardt, W J; Griffiths, A

    2016-01-01

    This manuscript describes the development of an android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing-a high specification OnePlus One handset and a lower cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping).

  14. Development of a toolkit and glossary to aid in the adaptation of health technology assessment (HTA) reports for use in different contexts.

    PubMed

    Chase, D; Rosten, C; Turner, S; Hicks, N; Milne, R

    2009-11-01

    To develop a health technology assessment (HTA) adaptation toolkit and glossary of adaptation terms for use by HTA agencies within EU member states to support them in adapting HTA reports written for other contexts. The toolkit and glossary were developed by a partnership of 28 HTA agencies and networks across Europe (EUnetHTA work package 5), led by the UK National Coordinating Centre for Health Technology Assessment (NCCHTA). Methods employed for the two resources were literature searching, a survey of adaptation experience, two rounds of a Delphi survey, meetings of the partnership and drawing on the expertise and experience of the partnership, two rounds of review, and two rounds of quality assurance testing. All partners were requested to provide input into each stage of development. The resulting toolkit is a collection of resources, in the form of checklists of questions on relevance, reliability and transferability of data and information, and links to useful websites, that help the user assess whether data and information in existing HTA reports can be adapted for a different setting. The toolkit is designed for the adaptation of evidence synthesis rather than primary research. The accompanying glossary provides descriptions of meanings for HTA adaptation terms from HTA agencies across Europe. It seeks to highlight differences in the use and understanding of each word by HTA agencies. The toolkit and glossary are available for use by all HTA agencies and can be accessed via www.eunethta.net/. These resources have been developed to help HTA agencies make better use of HTA reports produced elsewhere. They can be used by policy-makers and clinicians to aid in understanding HTA reports written for other contexts. The main implication of this work is that there is the potential for the adaptation of HTA reports and, if utilised, this should release resources to enable the development of further HTA reports. Recommendations for the further development of the toolkit include the potential to develop an interactive web-based version and to extend the toolkit to facilitate the adaptation of HTA reports on diagnostic testing and screening.

  15. The Development and Evaluation of an Online Healthcare Toolkit for Autistic Adults and their Primary Care Providers.

    PubMed

    Nicolaidis, Christina; Raymaker, Dora; McDonald, Katherine; Kapp, Steven; Weiner, Michael; Ashkenazy, Elesia; Gerrity, Martha; Kripke, Clarissa; Platt, Laura; Baggs, Amelia

    2016-10-01

    The healthcare system is ill-equipped to meet the needs of adults on the autism spectrum. Our goal was to use a community-based participatory research (CBPR) approach to develop and evaluate tools to facilitate the primary healthcare of autistic adults. Toolkit development included cognitive interviewing and test-retest reliability studies. Evaluation consisted of a mixed-methods, single-arm pre/post-intervention comparison. A total of 259 autistic adults and 51 primary care providers (PCPs) residing in the United States. The AASPIRE Healthcare toolkit includes the Autism Healthcare Accommodations Tool (AHAT)-a tool that allows patients to create a personalized accommodations report for their PCP-and general healthcare- and autism-related information, worksheets, checklists, and resources for patients and healthcare providers. Satisfaction with patient-provider communication, healthcare self-efficacy, barriers to healthcare, and satisfaction with the toolkit's usability and utility; responses to open-ended questions. Preliminary testing of the AHAT demonstrated strong content validity and adequate test-retest stability. Almost all patient participants (>94 %) felt that the AHAT and the toolkit were easy to use, important, and useful. In pre/post-intervention comparisons, the mean number of barriers decreased (from 4.07 to 2.82, p < 0.0001), healthcare self-efficacy increased (from 37.9 to 39.4, p = 0.02), and satisfaction with PCP communication improved (from 30.9 to 32.6, p = 0.03). Patients stated that the toolkit helped clarify their needs, enabled them to self-advocate and prepare for visits more effectively, and positively influenced provider behavior. Most of the PCPs surveyed read the AHAT (97 %), rated it as moderately or very useful (82 %), and would recommend it to other patients (87 %). The CBPR process resulted in a reliable healthcare accommodation tool and a highly accessible healthcare toolkit. Patients and providers indicated that the tools positively impacted healthcare interactions. The toolkit has the potential to reduce barriers to healthcare and improve healthcare self-efficacy and patient-provider communication.

  16. Efficient Workflows for Curation of Heterogeneous Data Supporting Modeling of U-Nb Alloy Aging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ward, Logan Timothy; Hackenberg, Robert Errol

    These are slides from a presentation summarizing a graduate research associate's summer project. The following topics are covered in these slides: data challenges in materials, aging in U-Nb Alloys, Building an Aging Model, Different Phase Trans. in U-Nb, the Challenge, Storing Materials Data, Example Data Source, Organizing Data: What is a Schema?, What does a "XML Schema" look like?, Our Data Schema: Nice and Simple, Storing Data: Materials Data Curation System (MDCS), Problem with MDCS: Slow Data Entry, Getting Literature into MDCS, Staging Data in Excel Document, Final Result: MDCS Records, Analyzing Image Data, Process for Making TTT Diagram, Bottleneck Number 1: Image Analysis, Fitting a TTP Boundary, Fitting a TTP Curve: Comparable Results, How Does it Compare to Our Data?, Image Analysis Workflow, Curating Hardness Records, Hardness Data: Two Key Decisions, Before Peak Age? - Automation, Interactive Viz, Which Transformation?, Microstructure-Informed Model, Tracking the Entire Process, General Problem with Property Models, Pinyon: Toolkit for Managing Model Creation, Tracking Individual Decisions, Jupyter: Docs and Code in One File, Hardness Analysis Workflow, Workflow for Aging Models, and conclusions.

  17. GCView: the genomic context viewer for protein homology searches

    PubMed Central

    Grin, Iwan; Linke, Dirk

    2011-01-01

    Genomic neighborhood can provide important insights into evolution and function of a protein or gene. When looking at operons, changes in operon structure and composition can only be revealed by looking at the operon as a whole. To facilitate the analysis of the genomic context of a query in multiple organisms we have developed Genomic Context Viewer (GCView). GCView accepts results from one or multiple protein homology searches such as BLASTp as input. For each hit, the neighboring protein-coding genes are extracted, the regions of homology are labeled for each input and the results are presented as a clear, interactive graphical output. It is also possible to add more searches to iteratively refine the output. GCView groups outputs by the hits for different proteins. This allows for easy comparison of different operon compositions and structures. The tool is embedded in the framework of the Bioinformatics Toolkit of the Max-Planck Institute for Developmental Biology (MPI Toolkit). Job results from the homology search tools inside the MPI Toolkit can be forwarded to GCView and results can be subsequently analyzed by sequence analysis tools. Results are stored online, allowing for later reinspection. GCView is freely available at http://toolkit.tuebingen.mpg.de/gcview. PMID:21609955

  18. Technology and nursing education: an online toolkit for educators.

    PubMed

    Hart, Carolyn

    2012-09-01

    Free tools available via the Internet can successfully be used to create interactive and engaging courses designed to reach today's technologically savvy learners. Copyright 2012, SLACK Incorporated.

  19. Teaching the physician-manager role to psychiatric residents: development and implementation of a pilot curriculum.

    PubMed

    Stergiopoulos, Vicky; Maggi, Julie; Sockalingam, Sanjeev

    2009-01-01

    The authors describe a pilot physician-manager curriculum designed to address the learning needs of psychiatric residents in administrative psychiatry and health systems. The pilot curriculum includes a junior and a senior toolkit of four workshops each. The junior toolkit introduces postgraduate-year two (PGY-2) residents to the principles of teamwork, conflict resolution, quality improvement, and program planning and evaluation. The senior toolkit exposes PGY-4 residents to leadership and change management, organizational structures, mental health and addictions reform, and self and career development. Following curriculum implementation at the University of Toronto, residents rated the importance and clinical relevance of curriculum objectives and commented on the strengths and weaknesses of the workshops and areas needing improvement. The pilot curriculum was successfully introduced at the University of Toronto in 2006. Residents rated the curriculum very highly and commented that interactive learning and contextually relevant topics are essential in meeting their needs. It is possible to successfully introduce a physician-manager curriculum early during psychiatric residency training, to match the specific needs of clinical rotations. Interactive techniques and clinical illustrations may be crucial in facilitating teaching and learning the physician-manager role. The authors discuss barriers, facilitators, and critical success factors in implementing such a curriculum.

  20. Robust, Globally Consistent, and Fully-automatic Multi-image Registration and Montage Synthesis for 3-D Multi-channel Images

    PubMed Central

    Tsai, Chia-Ling; Lister, James P.; Bjornsson, Christopher J; Smith, Karen; Shain, William; Barnes, Carol A.; Roysam, Badrinath

    2013-01-01

    The need to map regions of brain tissue that are much wider than the field of view of the microscope arises frequently. One common approach is to collect a series of overlapping partial views, and align them to synthesize a montage covering the entire region of interest. We present a method that advances this approach in multiple ways. Our method (1) produces a globally consistent joint registration of an unorganized collection of 3-D multi-channel images with or without stage micrometer data; (2) produces accurate registrations withstanding changes in scale, rotation, translation and shear by using a 3-D affine transformation model; (3) achieves complete automation, and does not require any parameter settings; (4) handles low and variable overlaps (5 – 15%) between adjacent images, minimizing the number of images required to cover a tissue region; (5) has the self-diagnostic ability to recognize registration failures instead of delivering incorrect results; (6) can handle a broad range of biological images by exploiting generic alignment cues from multiple fluorescence channels without requiring segmentation; and (7) is computationally efficient enough to run on desktop computers regardless of the number of images. The algorithm was tested with several tissue samples of at least 50 image tiles, involving over 5,000 image pairs. It correctly registered all image pairs with an overlap greater than 7%, correctly recognized all failures, and successfully joint-registered all images for all tissue samples studied. This algorithm is disseminated freely to the community as included with the FARSIGHT toolkit for microscopy (www.farsight-toolkit.org). PMID:21361958
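
    The registration model described above is a 3-D affine transformation (scale, rotation, translation and shear). As a hedged illustration of how such a transform is applied to an image tile, here is a minimal NumPy/SciPy sketch; the matrix and translation values are hypothetical placeholders, not parameters from the paper.

        # Minimal sketch: apply a 3-D affine transform (rotation/scale/shear + translation)
        # to an image volume, as in montage registration. Matrix values are hypothetical.
        import numpy as np
        from scipy import ndimage

        volume = np.random.rand(64, 128, 128)          # placeholder 3-D image tile

        A = np.array([[1.02, 0.01, 0.00],              # 3x3 linear part: scale/rotation/shear
                      [0.00, 0.98, 0.03],
                      [0.01, 0.00, 1.00]])
        t = np.array([2.0, -1.5, 0.5])                 # translation in voxels

        # scipy.ndimage.affine_transform maps output coordinates to input coordinates,
        # so pass the inverse of the forward mapping y = A x + t.
        A_inv = np.linalg.inv(A)
        offset = -A_inv @ t
        aligned = ndimage.affine_transform(volume, A_inv, offset=offset, order=1)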

  1. Image Classification for Web Genre Identification

    DTIC Science & Technology

    2012-01-01

    recognition and landscape detection using the computer vision toolkit OpenCV1. For facial recognition , we researched the possibilities of using the...method for connecting these names with a face/personal photo and logo respectively. [2] METHODOLOGY For this project, we focused primarily on facial

  2. Image-guided navigation: a cost effective practical introduction using the Image-Guided Surgery Toolkit (IGSTK).

    PubMed

    Güler, Özgür; Yaniv, Ziv

    2012-01-01

    Teaching the key technical aspects of image-guided interventions using a hands-on approach is a challenging task. This is primarily due to the high cost and lack of accessibility to imaging and tracking systems. We provide a software and data infrastructure which addresses both challenges. Our infrastructure allows students, patients, and clinicians to develop an understanding of the key technologies by using them, and possibly by developing additional components and integrating them into a simple navigation system which we provide. Our approach requires minimal hardware: LEGO blocks to construct a phantom for which we provide CT scans, and a webcam which, when combined with our software, provides the functionality of a tracking system. A premise of this approach is that tracking accuracy is sufficient for our purpose. We evaluate the accuracy provided by a consumer grade webcam and show that it is sufficient for educational use. We provide an open source implementation of all the components required for basic image-guided navigation as part of the Image-Guided Surgery Toolkit (IGSTK). It has long been known that in education there is no substitute for hands-on experience; to quote Sophocles, "One must learn by doing the thing; for though you think you know it, you have no certainty, until you try." Our work provides this missing capability in the context of image-guided navigation, enabling a wide audience to learn and experience the use of a navigation system.
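
    The webcam-as-tracker idea rests on estimating the camera-to-phantom pose from known fiducial points. The sketch below is a generic illustration using OpenCV's solvePnP, not IGSTK code; all point coordinates and camera parameters are hypothetical.

        # Generic illustration (not IGSTK): recover a rigid pose from known 3-D fiducial
        # positions on a phantom and their detected 2-D locations in a webcam frame.
        import numpy as np
        import cv2

        # Hypothetical fiducial coordinates on the phantom, in millimetres.
        object_points = np.array([[0, 0, 0], [80, 0, 0], [80, 48, 0], [0, 48, 0]], dtype=np.float32)
        # Hypothetical detected pixel locations of the same fiducials in the webcam image.
        image_points = np.array([[312, 250], [470, 255], [468, 345], [310, 340]], dtype=np.float32)

        # Hypothetical intrinsic calibration (focal lengths fx, fy and principal point cx, cy).
        K = np.array([[800.0, 0.0, 320.0],
                      [0.0, 800.0, 240.0],
                      [0.0, 0.0, 1.0]])
        dist = np.zeros(5)  # assume negligible lens distortion for this sketch

        ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
        R, _ = cv2.Rodrigues(rvec)   # 3x3 rotation: phantom coordinates -> camera coordinates
        print("rotation:\n", R, "\ntranslation (mm):", tvec.ravel())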

  3. A cost effective and high fidelity fluoroscopy simulator using the Image-Guided Surgery Toolkit (IGSTK)

    NASA Astrophysics Data System (ADS)

    Gong, Ren Hui; Jenkins, Brad; Sze, Raymond W.; Yaniv, Ziv

    2014-03-01

    The skills required for obtaining informative x-ray fluoroscopy images are currently acquired while trainees provide clinical care. As a consequence, trainees and patients are exposed to higher doses of radiation. Use of simulation has the potential to reduce this radiation exposure by enabling trainees to improve their skills in a safe environment prior to treating patients. We describe a low cost, high fidelity, fluoroscopy simulation system. Our system enables operators to practice their skills using the clinical device and simulated x-rays of a virtual patient. The patient is represented using a set of temporal Computed Tomography (CT) images, corresponding to the underlying dynamic processes. Simulated x-ray images, digitally reconstructed radiographs (DRRs), are generated from the CTs using ray-casting with customizable machine specific imaging parameters. To establish the spatial relationship between the CT and the fluoroscopy device, the CT is virtually attached to a patient phantom and a web camera is used to track the phantom's pose. The camera is mounted on the fluoroscope's intensifier and the relationship between it and the x-ray source is obtained via calibration. To control image acquisition the operator moves the fluoroscope as in normal operation mode. Control of zoom, collimation and image save is done using a keypad mounted alongside the device's control panel. Implementation is based on the Image-Guided Surgery Toolkit (IGSTK), and the use of the graphics processing unit (GPU) for accelerated image generation. Our system was evaluated by 11 clinicians and was found to be sufficiently realistic for training purposes.
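
    The core of the simulator's image formation, generating a DRR from CT by integrating attenuation along rays, can be illustrated in a few lines. The following is a simplified parallel-beam sketch with hypothetical constants; the system described uses perspective ray casting with machine-specific imaging parameters.

        # Simplified parallel-beam DRR: convert CT numbers (HU) to linear attenuation,
        # integrate along one axis, and apply the Beer-Lambert law.
        import numpy as np

        ct_hu = np.random.randint(-1000, 1500, size=(200, 256, 256)).astype(np.float32)  # placeholder CT
        voxel_size_cm = 0.1          # hypothetical voxel spacing along the ray direction
        mu_water = 0.2               # hypothetical linear attenuation of water, 1/cm

        # HU -> linear attenuation coefficient (mu), clipped at 0 for air.
        mu = np.clip(mu_water * (1.0 + ct_hu / 1000.0), 0.0, None)

        # Line integral of attenuation along the anterior-posterior axis (axis 0 here).
        path_integral = mu.sum(axis=0) * voxel_size_cm

        # Beer-Lambert law: the transmitted fraction becomes the simulated fluoroscopy image.
        drr = np.exp(-path_integral)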

  4. Hue-shifted monomeric variants of Clavularia cyan fluorescent protein: identification of the molecular determinants of color and applications in fluorescence imaging

    PubMed Central

    Ai, Hui-wang; Olenych, Scott G; Wong, Peter; Davidson, Michael W; Campbell, Robert E

    2008-01-01

    Background: In the 15 years that have passed since the cloning of Aequorea victoria green fluorescent protein (avGFP), the expanding set of fluorescent protein (FP) variants has become entrenched as an indispensable toolkit for cell biology research. One of the latest additions to the toolkit is monomeric teal FP (mTFP1), a bright and photostable FP derived from Clavularia cyan FP. To gain insight into the molecular basis for the blue-shifted fluorescence emission we undertook a mutagenesis-based study of residues in the immediate environment of the chromophore. We also employed site-directed and random mutagenesis in combination with library screening to create new hues of mTFP1-derived variants with wavelength-shifted excitation and emission spectra. Results: Our results demonstrate that the protein-chromophore interactions responsible for blue-shifting the absorbance and emission maxima of mTFP1 operate independently of the chromophore structure. This conclusion is supported by the observation that the Tyr67Trp and Tyr67His mutants of mTFP1 retain a blue-shifted fluorescence emission relative to their avGFP counterparts (that is, Tyr66Trp and Tyr66His). Based on previous work with close homologs, His197 and His163 are likely to be the residues with the greatest contribution towards blue-shifting the fluorescence emission. Indeed we have identified the substitutions His163Met and Thr73Ala that abolish or disrupt the interactions of these residues with the chromophore. The mTFP1-Thr73Ala/His163Met double mutant has an emission peak that is 23 nm red-shifted from that of mTFP1 itself. Directed evolution of this double mutant resulted in the development of mWasabi, a new green fluorescing protein that offers certain advantages over enhanced avGFP (EGFP). To assess the usefulness of mTFP1 and mWasabi in live cell imaging applications, we constructed and imaged more than 20 different fusion proteins. Conclusion: Based on the results of our mutagenesis study, we conclude that the two histidine residues in close proximity to the chromophore are approximately equal determinants of the blue-shifted fluorescence emission of mTFP1. With respect to live cell imaging applications, the mTFP1-derived mWasabi should be particularly useful in two-color imaging in conjunction with a Sapphire-type variant or as a fluorescence resonance energy transfer acceptor with a blue FP donor. In all fusions attempted, both mTFP1 and mWasabi give patterns of fluorescent localization indistinguishable from that of well-established avGFP variants. PMID:18325109

  5. ImTK: an open source multi-center information management toolkit

    NASA Astrophysics Data System (ADS)

    Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.

    2008-03-01

    The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications in Medicine (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.

  6. The Insight ToolKit image registration framework

    PubMed Central

    Avants, Brian B.; Tustison, Nicholas J.; Stauffer, Michael; Song, Gang; Wu, Baohua; Gee, James C.

    2014-01-01

    Publicly available scientific resources help establish evaluation standards, provide a platform for teaching and improve reproducibility. Version 4 of the Insight ToolKit (ITK4) seeks to establish new standards in publicly available image registration methodology. ITK4 makes several advances in comparison to previous versions of ITK. ITK4 supports both multivariate images and objective functions; it also unifies high-dimensional (deformation field) and low-dimensional (affine) transformations with metrics that are reusable across transform types and with composite transforms that allow arbitrary series of geometric mappings to be chained together seamlessly. Metrics and optimizers take advantage of multi-core resources, when available. Furthermore, ITK4 reduces the parameter optimization burden via principled heuristics that automatically set scaling across disparate parameter types (rotations vs. translations). A related approach also constrains step sizes for gradient-based optimizers. The result is that tuning for different metrics and/or image pairs is rarely necessary, allowing the researcher to more easily focus on design/comparison of registration strategies. In total, the ITK4 contribution is intended as a structure to support reproducible research practices, to provide a more extensive foundation against which to evaluate new work in image registration, and to give application-level programmers a broad suite of tools on which to build. Finally, we contextualize this work with a reference registration evaluation study with application to pediatric brain labeling. PMID:24817849
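
    As a concrete illustration of the v4 registration style (composite transform initialization, Mattes mutual information, automatic parameter scaling), here is a minimal sketch using the SimpleITK Python wrapper of ITK. File names are placeholders and the settings are illustrative, not those used in the reference study.

        # Minimal rigid registration sketch with the SimpleITK wrapper of ITK
        # (illustrative settings; file names are placeholders).
        import SimpleITK as sitk

        fixed = sitk.ReadImage("fixed.nii.gz", sitk.sitkFloat32)
        moving = sitk.ReadImage("moving.nii.gz", sitk.sitkFloat32)

        # Geometry-based initialization of a rigid (Euler) transform.
        initial = sitk.CenteredTransformInitializer(
            fixed, moving, sitk.Euler3DTransform(),
            sitk.CenteredTransformInitializerFilter.GEOMETRY)

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=32)
        reg.SetMetricSamplingStrategy(reg.RANDOM)
        reg.SetMetricSamplingPercentage(0.1)
        reg.SetInterpolator(sitk.sitkLinear)
        reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0,
                                                     minStep=1e-4,
                                                     numberOfIterations=200)
        reg.SetOptimizerScalesFromPhysicalShift()   # automatic scaling of rotations vs translations
        reg.SetInitialTransform(initial, inPlace=False)

        final_transform = reg.Execute(fixed, moving)
        resampled = sitk.Resample(moving, fixed, final_transform, sitk.sitkLinear, 0.0)
        sitk.WriteImage(resampled, "moving_registered.nii.gz")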

  7. Monitoring Areal Snow Cover Using NASA Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Harshburger, Brian J.; Blandford, Troy; Moore, Brandon

    2011-01-01

    The objective of this project is to develop products and tools to assist in the hydrologic modeling process, including tools to help prepare inputs for hydrologic models and improved methods for the visualization of streamflow forecasts. In addition, this project will facilitate the use of NASA satellite imagery (primarily snow cover imagery) by other federal and state agencies with operational streamflow forecasting responsibilities. A GIS software toolkit for monitoring areal snow cover extent and producing streamflow forecasts is being developed. This toolkit will be packaged as multiple extensions for ArcGIS 9.x and an open-source GIS software package. The toolkit will provide users with a means for ingesting NASA EOS satellite imagery (snow cover analysis), preparing hydrologic model inputs, and visualizing streamflow forecasts. Primary products include a software tool for predicting the presence of snow under clouds in satellite images; a software tool for producing gridded temperature and precipitation forecasts; and a suite of tools for visualizing hydrologic model forecasting results. The toolkit will be an expert system designed for operational users who need to generate accurate streamflow forecasts in a timely manner. The Remote Sensing of Snow Cover Toolbar will ingest snow cover imagery from multiple sources, including the MODIS operational snow cover data, and convert them to gridded datasets that can be readily used. Statistical techniques will then be applied to the gridded snow cover data to predict the presence of snow under cloud cover. The toolbar has the ability to ingest both binary and fractional snow cover data. Binary mapping techniques use a set of thresholds to determine whether a pixel contains snow or no snow. Fractional mapping techniques provide information regarding the percentage of each pixel that is covered with snow. After the imagery has been ingested, physiographic data is attached to each cell in the snow cover image. This data can be obtained from a digital elevation model (DEM) for the area of interest.
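
    The binary snow-mapping step described above amounts to thresholding a per-pixel snow index. The sketch below shows the idea with NumPy, using the common NDSI formulation and a 0.4 threshold as an assumed example; the band arrays, threshold, and fractional scaling are placeholders, not the project's actual rules.

        # Hedged sketch of binary snow mapping: threshold the Normalized Difference
        # Snow Index (NDSI) computed from green and shortwave-infrared reflectance.
        import numpy as np

        green = np.random.rand(512, 512).astype(np.float32)   # placeholder reflectance bands
        swir = np.random.rand(512, 512).astype(np.float32)

        ndsi = (green - swir) / (green + swir + 1e-6)          # small epsilon avoids division by zero

        NDSI_THRESHOLD = 0.4                                   # assumed example threshold
        binary_snow = (ndsi > NDSI_THRESHOLD).astype(np.uint8) # 1 = snow, 0 = no snow

        # A fractional product would instead report percent cover per pixel, e.g. a
        # simple (purely illustrative) linear scaling of NDSI clipped to [0, 1]:
        fractional_snow = np.clip((ndsi - 0.1) / (0.7 - 0.1), 0.0, 1.0)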

  8. Development of a toolkit to enhance care processes for people with a long-term neurological condition: a qualitative descriptive study.

    PubMed

    Sezier, Ann; Mudge, Suzie; Kayes, Nicola; Kersten, Paula; Payne, Deborah; Harwood, Matire; Potter, Eden; Smith, Greta; McPherson, Kathryn M

    2018-06-30

    To (A) explore perspectives of people with a long-term neurological condition, and of their family, clinicians and other stakeholders on three key processes: two-way communication, self-management and coordination of long-term care; and (B) use these data to develop a 'Living Well Toolkit', a structural support aiming to enhance the quality of these care processes. This qualitative descriptive study drew on the principles of participatory research. Data from interviews and focus groups with participants (n=25) recruited from five hospital, rehabilitation and community settings in New Zealand were analysed using conventional content analysis. Consultation with a knowledge-user group (n=4) and an implementation champion group (n=4) provided additional operational knowledge important to toolkit development and its integration into clinical practice. Four main, and one overarching, themes were constructed: (1) tailoring care: referring to getting to know the person and their individual circumstances; (2) involving others: representing the importance of negotiating the involvement of others in the person's long-term management process; (3) exchanging knowledge: referring to acknowledging patient expertise; and (4) enabling: highlighting the importance of empowering relationships and processes. The overarching theme was: assume nothing. These themes informed the development of a toolkit comprising two parts: one to support the person with the long-term neurological condition, and one targeted at clinicians to guide interaction and support their engagement with patients. Perspectives of healthcare users, clinicians and other stakeholders were fundamental to the development of the Living Well Toolkit. The findings were used to frame toolkit specifications and highlighted potential operational issues that could prove key to its success. Further research to evaluate its use is now underway. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  9. Defining the Subcellular Interface of Nanoparticles by Live-Cell Imaging

    PubMed Central

    Hemmerich, Peter H.; von Mikecz, Anna H.

    2013-01-01

    Understanding of nanoparticle-bio-interactions within living cells requires knowledge about the dynamic behavior of nanomaterials during their cellular uptake, intracellular traffic and mutual reactions with cell organelles. Here, we introduce a protocol of combined kinetic imaging techniques that enables investigation of exemplary fluorochrome-labelled nanoparticles concerning their intracellular fate. By time-lapse confocal microscopy we observe fast, dynamin-dependent uptake of polystyrene and silica nanoparticles via the cell membrane within seconds. Fluorescence recovery after photobleaching (FRAP) experiments reveal fast and complete exchange of the investigated nanoparticles at mitochondria, cytoplasmic vesicles or the nuclear envelope. Nuclear translocation is observed within minutes by free diffusion and active transport. Fluorescence correlation spectroscopy (FCS) and raster image correlation spectroscopy (RICS) indicate diffusion coefficients of polystyrene and silica nanoparticles in the nucleus and the cytoplasm that are consistent with particle motion in living cells based on diffusion. Determination of the apparent hydrodynamic radii by FCS and RICS shows that nanoparticles exert their cytoplasmic and nuclear effects mainly as mobile, monodisperse entities. Thus, a complete toolkit of fluorescence fluctuation microscopy is presented for the investigation of nanomaterial biophysics in subcellular microenvironments that contributes to develop a framework of intracellular nanoparticle delivery routes. PMID:23637951
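
    The link between a measured diffusion coefficient and an apparent hydrodynamic radius in FCS/RICS analyses is the Stokes-Einstein relation, D = k_B T / (6 pi eta r_h). A short sketch follows; the viscosity, temperature, and diffusion coefficient values are plausible assumptions for illustration, not values taken from the study.

        # Stokes-Einstein relation: D = k_B * T / (6 * pi * eta * r_h),
        # rearranged to estimate the apparent hydrodynamic radius from a measured D.
        import math
        from scipy.constants import Boltzmann as k_B

        T = 310.0          # temperature in kelvin (approx. 37 degrees C, assumed)
        eta = 1.0e-3       # dynamic viscosity in Pa*s (water-like medium, assumed)
        D = 5.0e-12        # measured diffusion coefficient in m^2/s (placeholder)

        r_h = k_B * T / (6.0 * math.pi * eta * D)
        print(f"apparent hydrodynamic radius: {r_h * 1e9:.1f} nm")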

  10. The Virtual Physiological Human ToolKit.

    PubMed

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  11. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well-suited, the application of MSI for comparing thousands of spatially defined spotted samples has been limited. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. In this paper, we present the OpenMSI Arrayed Analysis Toolkit (OMAAT), a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different-sized soil particles, including hydrolysis of sugars, revealing a pattern of size-dependent activities. Finally, these results introduce OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.
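
    The core task OMAAT automates, locating a regular grid of spotted samples in an MSI ion image and summarizing intensity per spot, can be sketched generically. The code below is not the OMAAT API; it is an illustration with NumPy in which the grid geometry (the 450 µm pitch mapped to pixels) and image size are assumed.

        # Generic sketch (not the OMAAT API): sample a 384-spot array (16 x 24) from an
        # MSI ion image by averaging a small window around each expected spot centre.
        import numpy as np

        ion_image = np.random.rand(800, 1200)      # placeholder ion intensity image
        rows, cols = 16, 24                        # 384-well style array layout
        pitch_px = 45                              # assumed pixel spacing of the 450 um pitch
        origin = (40, 60)                          # assumed pixel position of the first spot
        half_win = 3                               # half-width of the averaging window

        spot_means = np.zeros((rows, cols))
        for r in range(rows):
            for c in range(cols):
                y = origin[0] + r * pitch_px
                x = origin[1] + c * pitch_px
                window = ion_image[y - half_win:y + half_win + 1,
                                   x - half_win:x + half_win + 1]
                spot_means[r, c] = window.mean()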

  12. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in development of an integrative risk models toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with opportunities for hands-on demonstrations. Brief descriptions of each tool are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation by using multi-type blood cell counts; GERMcode for basic physical and biophysical properties for an ion beam, and biophysical and radiobiological properties for a beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling of the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for the automated count; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  13. TerraLook: Providing easy, no-cost access to satellite images for busy people and the technologically disinclined

    USGS Publications Warehouse

    Geller, G.N.; Fosnight, E.A.; Chaudhuri, Sambhudas

    2008-01-01

    Access to satellite images has been largely limited to communities with specialized tools and expertise, even though images could also benefit other communities. This situation has resulted in underutilization of the data. TerraLook, which consists of collections of georeferenced JPEG images and an open source toolkit to use them, makes satellite images available to those lacking experience with remote sensing. Users can find, roam, and zoom images, create and display vector overlays, adjust and annotate images so they can be used as a communication vehicle, compare images taken at different times, and perform other activities useful for natural resource management, sustainable development, education, and other activities. © 2007 IEEE.

  14. TerraLook: Providing easy, no-cost access to satellite images for busy people and the technologically disinclined

    USGS Publications Warehouse

    Geller, G.N.; Fosnight, E.A.; Chaudhuri, Sambhudas

    2007-01-01

    Access to satellite images has been largely limited to communities with specialized tools and expertise, even though images could also benefit other communities. This situation has resulted in underutilization of the data. TerraLook, which consists of collections of georeferenced JPEG images and an open source toolkit to use them, makes satellite images available to those lacking experience with remote sensing. Users can find, roam, and zoom images, create and display vector overlays, adjust and annotate images so they can be used as a communication vehicle, compare images taken at different times, and perform other activities useful for natural resource management, sustainable development, education, and other activities. © 2007 IEEE.

  15. DeepInfer: open-source deep learning deployment toolkit for image-guided therapy

    NASA Astrophysics Data System (ADS)

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-03-01

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  16. DeepInfer: Open-Source Deep Learning Deployment Toolkit for Image-Guided Therapy.

    PubMed

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A; Kapur, Tina; Wells, William M; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-02-11

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  17. DeepInfer: Open-Source Deep Learning Deployment Toolkit for Image-Guided Therapy

    PubMed Central

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-01-01

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose “DeepInfer” – an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections. PMID:28615794
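
    The deployment step described in these DeepInfer abstracts, applying a pre-trained segmentation model to a new volume, reduces to loading a serialized model and running inference. The sketch below is a generic PyTorch illustration under assumed file names and preprocessing; it is not DeepInfer's actual interface.

        # Generic inference sketch (not the DeepInfer interface): run a serialized
        # segmentation model on a 3-D volume. File name and preprocessing are assumed.
        import numpy as np
        import torch

        model = torch.jit.load("prostate_segmenter.pt", map_location="cpu")  # hypothetical model file
        model.eval()

        volume = np.random.rand(64, 128, 128).astype(np.float32)             # placeholder MRI volume
        volume = (volume - volume.mean()) / (volume.std() + 1e-6)            # simple intensity normalization

        with torch.no_grad():
            x = torch.from_numpy(volume)[None, None]   # add batch and channel dimensions
            logits = model(x)                          # expected shape: (1, classes, D, H, W)
            labels = logits.argmax(dim=1).squeeze(0).numpy()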

  18. [Research and implementation of the TLS network transport security technology based on DICOM standard].

    PubMed

    Lu, Xiaoqi; Wang, Lei; Zhao, Jianfeng

    2012-02-01

    With the development of medical informatics, Picture Archiving and Communication Systems (PACS), Hospital Information Systems/Radiology Information Systems (HIS/RIS) and other medical information management systems have become widespread, and interoperability between these systems is increasingly required. These formerly closed systems will inevitably be opened up and regionalized over networks, and as this happens the security of information transmission becomes the first problem to be solved. Based on this need for network security, we investigated the Digital Imaging and Communications in Medicine (DICOM) Standard and the Transport Layer Security (TLS) Protocol, and implemented TLS transmission of DICOM medical information with the OpenSSL and DCMTK toolkits.
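
    The transport-security layer described here is standard TLS beneath the DICOM association. The snippet below illustrates only that layer, using Python's ssl module rather than the OpenSSL/DCMTK toolkits the authors used; the host name, port, and certificate paths are placeholders.

        # Illustration of the TLS layer only (the authors used OpenSSL/DCMTK):
        # wrap a TCP connection to a DICOM peer in TLS with certificate verification.
        import socket
        import ssl

        PEER_HOST, PEER_PORT = "pacs.example.org", 2762   # placeholder DICOM TLS endpoint

        context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile="ca.pem")
        context.load_cert_chain(certfile="client_cert.pem", keyfile="client_key.pem")

        with socket.create_connection((PEER_HOST, PEER_PORT)) as raw_sock:
            with context.wrap_socket(raw_sock, server_hostname=PEER_HOST) as tls_sock:
                print("negotiated protocol:", tls_sock.version())
                # DICOM association negotiation (A-ASSOCIATE-RQ, etc.) would then
                # proceed over tls_sock exactly as it would over a plain socket.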

  19. The PhenX Toolkit: Get the Most From Your Measures

    PubMed Central

    Hamilton, Carol M.; Strader, Lisa C.; Pratt, Joseph G.; Maiese, Deborah; Hendershot, Tabitha; Kwok, Richard K.; Hammond, Jane A.; Huggins, Wayne; Jackman, Dean; Pan, Huaqin; Nettles, Destiney S.; Beaty, Terri H.; Farrer, Lindsay A.; Kraft, Peter; Marazita, Mary L.; Ordovas, Jose M.; Pato, Carlos N.; Spitz, Margaret R.; Wagener, Diane; Williams, Michelle; Junkins, Heather A.; Harlan, William R.; Ramos, Erin M.; Haines, Jonathan

    2011-01-01

    The potential for genome-wide association studies to relate phenotypes to specific genetic variation is greatly increased when data can be combined or compared across multiple studies. To facilitate replication and validation across studies, RTI International (Research Triangle Park, North Carolina) and the National Human Genome Research Institute (Bethesda, Maryland) are collaborating on the consensus measures for Phenotypes and eXposures (PhenX) project. The goal of PhenX is to identify 15 high-priority, well-established, and broadly applicable measures for each of 21 research domains. PhenX measures are selected by working groups of domain experts using a consensus process that includes input from the scientific community. The selected measures are then made freely available to the scientific community via the PhenX Toolkit. Thus, the PhenX Toolkit provides the research community with a core set of high-quality, well-established, low-burden measures intended for use in large-scale genomic studies. PhenX measures will have the most impact when included at the experimental design stage. The PhenX Toolkit also includes links to standards and resources in an effort to facilitate data harmonization to legacy data. Broad acceptance and use of PhenX measures will promote cross-study comparisons to increase statistical power for identifying and replicating variants associated with complex diseases and with gene-gene and gene-environment interactions. PMID:21749974

  20. The UK Earth System Models Marine Biogeochemical Evaluation Toolkit, BGC-val

    NASA Astrophysics Data System (ADS)

    de Mora, Lee

    2017-04-01

    The Biogeochemical Validation toolkit, BGC-val, is a model- and grid-independent Python-based marine model evaluation framework that automates much of the validation of the marine component of an Earth System Model. BGC-val was initially developed to be a flexible and extensible system to evaluate the spin-up of the marine UK Earth System Model (UKESM). However, the grid independence and flexibility mean that it is straightforward to adapt the BGC-val framework to evaluate other marine models. In addition to the marine component of the UKESM, this toolkit has been adapted to compare multiple models, including models from the CMIP5 and iMarNet inter-comparison projects. The BGC-val toolkit produces multiple levels of analysis which are presented in a simple-to-use interactive HTML5 document. Level 1 contains time series analyses, showing the development over time of many important biogeochemical and physical ocean metrics, such as global primary production or the Drake Passage current. The second level of BGC-val is an in-depth spatial analysis of a single point in time: a series of point-to-point comparisons of model and data in various regions, such as a comparison of surface nitrate in the model against data from the World Ocean Atlas. The third-level analyses are specialised ad hoc packages that go in depth on a specific question, such as the development of oxygen minimum zones in the Equatorial Pacific. In addition to the three levels, the HTML5 document opens with a Level 0 table showing a summary of the status of the model run. The beta version of this toolkit is available via the Plymouth Marine Laboratory Gitlab server and uses the BSD 3-clause license.
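
    The level-2 point-to-point comparison is essentially matching a model field against gridded observations and summarizing the mismatch. A hedged NumPy sketch of that idea follows; the array names, values, and metrics are illustrative, not BGC-val code.

        # Illustrative model-vs-data comparison (not BGC-val itself): match a model
        # surface field against a gridded climatology and report simple skill metrics.
        import numpy as np

        model_nitrate = np.random.rand(180, 360) * 30.0       # placeholder model surface field
        obs_nitrate = np.random.rand(180, 360) * 30.0         # placeholder gridded climatology
        obs_nitrate[::7, ::5] = np.nan                        # pretend some cells lack observations

        valid = ~np.isnan(obs_nitrate)
        diff = model_nitrate[valid] - obs_nitrate[valid]

        bias = diff.mean()
        rmse = np.sqrt((diff ** 2).mean())
        corr = np.corrcoef(model_nitrate[valid], obs_nitrate[valid])[0, 1]
        print(f"bias={bias:.2f}  rmse={rmse:.2f}  r={corr:.2f}")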

  1. A qualitative study of clinic and community member perspectives on intervention toolkits: "Unless the toolkit is used it won't help solve the problem".

    PubMed

    Davis, Melinda M; Howk, Sonya; Spurlock, Margaret; McGinnis, Paul B; Cohen, Deborah J; Fagnan, Lyle J

    2017-07-18

    Intervention toolkits are common products of grant-funded research in public health and primary care settings. Toolkits are designed to address the knowledge translation gap by speeding implementation and dissemination of research into practice. However, few studies describe characteristics of effective intervention toolkits and their implementation. Therefore, we conducted this study to explore what clinic and community-based users want in intervention toolkits and to identify the factors that support application in practice. In this qualitative descriptive study we conducted focus groups and interviews with a purposive sample of community health coalition members, public health experts, and primary care professionals between November 2010 and January 2012. The transdisciplinary research team used thematic analysis to identify themes and a cross-case comparative analysis to explore variation by participant role and toolkit experience. Ninety-six participants representing primary care (n = 54, 56%) and community settings (n = 42, 44%) participated in 18 sessions (13 focus groups, five key informant interviews). Participants ranged from those naïve to those expert in toolkit development; many reported limited application of toolkits in actual practice. Participants wanted toolkits targeted at the right audience and demonstrated to be effective. Well-organized toolkits, often with a quick-start guide, with tools that were easy to tailor and apply, were desired. Irrespective of perceived quality, participants experienced with practice change emphasized that leadership, staff buy-in, and facilitative support were essential for intervention toolkits to be translated into changes in clinic or public health practice. Given the emphasis on toolkits in supporting implementation and dissemination of research and clinical guidelines, studies are warranted to determine when and how toolkits are used. Funders, policy makers, researchers, and leaders in primary care and public health are encouraged to allocate resources to foster both toolkit development and implementation. Support, through practice facilitation and organizational leadership, is critical for translating knowledge from intervention toolkits into practice.

  2. TH-AB-209-08: Next Generation Dedicated 3D Breast Imaging with XACT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, S; Chen, J; Samant, P

    Purpose: Exposure to radiation increases the risk of cancer. We have designed a new imaging paradigm, X-ray induced acoustic computed tomography (XACT). Applying this innovative technology to breast imaging, an X-ray exposure can generate a 3D acoustic image, which dramatically reduces the radiation dose to patients when compared to conventional breast CT. Methods: Theoretical calculations are done to determine the appropriate X-ray energy and ultrasound frequency in breast XACT imaging. A series of breast CT images along the coronal plane from a patient with calcifications in the breast tissue is used as the source data. HU-value-based segmentation is done to distinguish the skin, adipose tissue, glandular tissue, breast calcification, and chest bone in each CT image. X-ray dose deposition in each pixel is calculated based on the tissue type by using the GEANT4 Monte Carlo toolkit. The initial pressure rise caused by X-ray energy deposition is calculated according to tissue properties. Then, the X-ray induced acoustic wave propagation is simulated with the k-Wave toolkit. Breast XACT images are reconstructed from the recorded time-dependent ultrasound waves. Results: For imaging a breast of large size (16 cm in diameter at the chest wall), the photon energy of the X-ray source and the central frequency of the ultrasound detector are determined to be 20 keV and 5.5 MHz, respectively. Approximately 10 times contrast between a calcification and the breast tissue can be acquired from the XACT image. The calcification can be clearly identified in the reconstructed XACT image. Conclusion: The XACT technique takes advantage of X-ray absorption contrast and high ultrasonic resolution. With the proposed innovative technology, one can potentially reduce the radiation dose to the patient in 3D breast imaging as compared with current X-ray modalities, while still maintaining high imaging contrast and spatial resolution.
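
    The simulation chain couples a Monte Carlo dose map to an acoustic solver through the initial pressure rise; a commonly used relation is p0 = Γ · E_abs, with Γ the Grüneisen parameter and E_abs the absorbed energy density. The sketch below illustrates only that coupling step with assumed values; it stands in for the GEANT4-to-k-Wave handoff and is not the authors' code.

        # Coupling step only (illustrative): convert an absorbed-dose map from a Monte
        # Carlo transport code into an initial acoustic pressure map for a wave solver.
        import numpy as np

        dose_gy = np.random.rand(256, 256) * 1e-3    # placeholder absorbed dose per pixel, Gy (J/kg)
        density = np.full((256, 256), 1000.0)        # assumed soft-tissue density, kg/m^3
        density[100:120, 100:120] = 1500.0           # denser region standing in for a calcification

        grueneisen = 0.2                             # assumed Grueneisen parameter for soft tissue

        absorbed_energy_density = dose_gy * density  # J/m^3
        p0 = grueneisen * absorbed_energy_density    # initial pressure rise, Pa

        # p0 would then be passed to an acoustic propagation toolkit (e.g. k-Wave)
        # as the source term for time-domain wave simulation.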

  3. NAIF Toolkit - Extended

    NASA Technical Reports Server (NTRS)

    Acton, Charles H., Jr.; Bachman, Nathaniel J.; Semenov, Boris V.; Wright, Edward D.

    2010-01-01

    The Navigation and Ancillary Information Facility (NAIF) at JPL, acting under the direction of NASA's Office of Space Science, has built a data system named SPICE (Spacecraft, Planet, Instrument, C-matrix, Events) to assist scientists in planning and interpreting scientific observations. SPICE provides geometric and some other ancillary information needed to recover the full value of science instrument data, including correlation of individual instrument data sets with data from other instruments on the same or other spacecraft. This data system is used to produce space mission observation geometry data sets known as SPICE kernels. It is also used to read SPICE kernels and to compute derived quantities such as positions, orientations, lighting angles, etc. The SPICE toolkit consists of a subroutine/function library, executable programs (both large applications and simple utilities that focus on kernel management), and simple examples of using SPICE toolkit subroutines. This software is very accurate, thoroughly tested, and portable to all computers. It is extremely stable and reusable on all missions. Since the previous version, three significant capabilities have been added: an Interactive Data Language (IDL) interface, a MATLAB interface, and a geometric event finder subsystem.
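
    The kind of derived geometry the SPICE Toolkit computes (positions and light time from loaded kernels) can be reproduced through SpiceyPy, a community Python wrapper around the toolkit. The sketch below assumes a meta-kernel file listing the needed kernels; the file name, body names, and epoch are placeholders.

        # Hedged sketch using SpiceyPy (a Python wrapper around the SPICE Toolkit).
        # The meta-kernel path, target/observer names and epoch are placeholders.
        import spiceypy as spice

        spice.furnsh("mission_meta_kernel.tm")        # load SPK/PCK/LSK kernels listed in a meta-kernel

        et = spice.str2et("2004-07-01T12:00:00")      # convert UTC to ephemeris time (TDB seconds)

        # Position of Saturn relative to the Cassini spacecraft in the J2000 frame,
        # corrected for light time and stellar aberration.
        position_km, light_time_s = spice.spkpos("SATURN", et, "J2000", "LT+S", "CASSINI")
        print("range (km):", sum(c * c for c in position_km) ** 0.5)

        spice.kclear()                                # unload kernels when done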

  4. An Aerodynamic Simulation Process for Iced Lifting Surfaces and Associated Issues

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Vickerman, Mary B.; Hackenberg, Anthony W.; Rigby, David L.

    2003-01-01

    This paper discusses technologies and software tools that are being implemented in a software toolkit currently under development at NASA Glenn Research Center. Its purpose is to help study the effects of icing on airfoil performance and assist with the aerodynamic simulation process which consists of characterization and modeling of ice geometry, application of block topology and grid generation, and flow simulation. Tools and technologies for each task have been carefully chosen based on their contribution to the overall process. For the geometry characterization and modeling, we have chosen an interactive rather than automatic process in order to handle numerous ice shapes. An Appendix presents features of a software toolkit developed to support the interactive process. Approaches taken for the generation of block topology and grids, and flow simulation, though not yet implemented in the software, are discussed with reasons for why particular methods are chosen. Some of the issues that need to be addressed and discussed by the icing community are also included.

  5. Stroke

    MedlinePlus

  6. A DICOM-based 2nd generation Molecular Imaging Data Grid implementing the IHE XDS-i integration profile.

    PubMed

    Lee, Jasper; Zhang, Jianguo; Park, Ryan; Dagliyan, Grant; Liu, Brent; Huang, H K

    2012-07-01

    A Molecular Imaging Data Grid (MIDG) was developed to address current informatics challenges in archival, sharing, search, and distribution of preclinical imaging studies between animal imaging facilities and investigator sites. This manuscript presents a 2nd generation MIDG replacing the Globus Toolkit with a new system architecture that implements the IHE XDS-i integration profile. Implementation and evaluation were conducted using a 3-site interdisciplinary test-bed at the University of Southern California. The 2nd generation MIDG design architecture replaces the initial design's Globus Toolkit with dedicated web services and XML-based messaging for dedicated management and delivery of multi-modality DICOM imaging datasets. The Cross-enterprise Document Sharing for Imaging (XDS-i) integration profile from the field of enterprise radiology informatics was adopted into the MIDG design because streamlined image registration, management, and distribution dataflow are likewise needed in preclinical imaging informatics systems as in enterprise PACS application. Implementation of the MIDG is demonstrated at the University of Southern California Molecular Imaging Center (MIC) and two other sites with specified hardware, software, and network bandwidth. Evaluation of the MIDG involves data upload, download, and fault-tolerance testing scenarios using multi-modality animal imaging datasets collected at the USC Molecular Imaging Center. The upload, download, and fault-tolerance tests of the MIDG were performed multiple times using 12 collected animal study datasets. Upload and download times demonstrated reproducibility and improved real-world performance. Fault-tolerance tests showed that automated failover between Grid Node Servers has minimal impact on normal download times. Building upon the 1st generation concepts and experiences, the 2nd generation MIDG system improves accessibility of disparate animal-model molecular imaging datasets to users outside a molecular imaging facility's LAN using a new architecture, dataflow, and dedicated DICOM-based management web services. Productivity and efficiency of preclinical research for translational sciences investigators has been further streamlined for multi-center study data registration, management, and distribution.

  7. Development of a national electronic interval cancer review for breast screening

    NASA Astrophysics Data System (ADS)

    Halling-Brown, M. D.; Patel, M. N.; Wallis, M. G.; Young, K. C.

    2018-03-01

    Reviewing interval cancers and prior screening mammograms is a key measure for monitoring screening performance. Radiological analysis of the imaging features in prior mammograms and their retrospective classification are important educational tools for readers to improve individual performance. The requirements of remote, collaborative image review sessions, such as those required to run a remote interval cancer review, are variable and demand a flexible and configurable software solution that is not currently available on commercial workstations. The wide range of requirements for both collection and remote review of interval cancers has precipitated the creation of extensible medical image viewers and accompanying systems. In order to allow remote viewing, an application has been designed to allow workstation-independent, PACS-less viewing and interaction with medical images in a remote, collaborative manner, providing centralised reporting and web-based feedback. A semi-automated process, which allows the centralisation of interval cancer cases, has been developed. This stand-alone, flexible image collection toolkit provides the extremely important function of bespoke, ad-hoc image collection at sites where there is no dedicated hardware. Web interfaces have been created which allow a national or regional administrator to organise, coordinate and administer interval cancer review sessions and deploy invites to session members to participate. The same interface allows feedback to be analysed and distributed. The eICR provides a uniform process for classifying interval cancers across the NHSBSP, which facilitates rapid access to a robust 'external' review for patients and their relatives seeking answers about why their cancer was 'missed'.

  8. Varicose Veins

    MedlinePlus

  9. Tribal Green Building Toolkit

    EPA Pesticide Factsheets

    This Tribal Green Building Toolkit (Toolkit) is designed to help tribal officials, community members, planners, developers, and architects develop and adopt building codes to support green building practices. Anyone can use this toolkit!

  10. Find an Interventional Radiologist

    MedlinePlus

  11. Society of Interventional Radiology

    MedlinePlus

  12. Hereditary Hemorrhagic Telangiectasia - HHT

    MedlinePlus

  13. Child Abuse - Multiple Languages

    MedlinePlus

    Healthy Living Toolkit: Violence In the Home - English PDF

  14. Analysis of live cell images: Methods, tools and opportunities.

    PubMed

    Nketia, Thomas A; Sailem, Heba; Rohde, Gustavo; Machiraju, Raghu; Rittscher, Jens

    2017-02-15

    Advances in optical microscopy, biosensors and cell culturing technologies have transformed live cell imaging. Thanks to these advances, live cell imaging plays an increasingly important role in basic biology research as well as at all stages of drug development. Image analysis methods are needed to extract quantitative information from these vast and complex data sets. The aim of this review is to provide an overview of available image analysis methods for live cell imaging, in particular the required preprocessing, image segmentation, cell tracking and data visualisation methods. The potential opportunities provided by recent advances in machine learning, especially deep learning, and computer vision are discussed. This review includes an overview of the different available software packages and toolkits. Copyright © 2017. Published by Elsevier Inc.
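
    As a minimal example of the segmentation step such pipelines start from, the sketch below thresholds a fluorescence frame and extracts per-object measurements with scikit-image. The parameters are illustrative; real pipelines typically add denoising, seeded watershed splitting of touching cells, and tracking.

        # Minimal cell segmentation sketch with scikit-image: global Otsu threshold,
        # small-object removal, connected-component labelling, per-object measurements.
        import numpy as np
        from skimage import filters, measure, morphology

        frame = np.random.rand(512, 512)                     # placeholder fluorescence image

        threshold = filters.threshold_otsu(frame)            # global intensity threshold
        mask = frame > threshold
        mask = morphology.remove_small_objects(mask, min_size=50)   # drop debris (illustrative size)

        labels = measure.label(mask)                         # one integer label per connected object
        cells = measure.regionprops(labels, intensity_image=frame)

        for cell in cells[:5]:
            print(cell.label, cell.area, cell.centroid, cell.mean_intensity)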

  15. Wind Integration National Dataset Toolkit | Grid Modernization | NREL

    Science.gov Websites

    The Wind Integration National Dataset (WIND) Toolkit is an update and expansion of the Eastern Wind Integration Data Set and … The WIND Toolkit includes meteorological conditions and turbine power for more than …

  16. Developing a middleware to support HDF data access in ArcGIS

    NASA Astrophysics Data System (ADS)

    Sun, M.; Jiang, Y.; Yang, C. P.

    2014-12-01

    Hierarchical Data Format (HDF) is the standard data format for NASA Earth Observing System (EOS) data products, such as the MODIS level-3 data. These data have been widely used in long-term studies of the land surface, biosphere, atmosphere, and oceans of the Earth. Several toolkits have been developed to access HDF data, such as the HDF viewer and the Geospatial Data Abstraction Library (GDAL). ArcGIS integrates GDAL, providing data users with a Graphical User Interface (GUI) to read HDF data. However, there are still some problems when using these toolkits: for example, 1) the projection information is not recognized correctly, 2) the image is displayed inverted, and 3) the tools lack the capability to read the third-dimension information stored in the data subsets. Accordingly, in this study we attempt to improve the current HDF toolkits to address the aforementioned issues. Considering the wide usage of ArcGIS, we develop a middleware for ArcGIS based on GDAL to solve the particular data access problems occurring in ArcGIS, so that data users can access HDF data successfully and perform further data analysis with the ArcGIS geoprocessing tools.
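
    The middleware described above is not reproduced here; the sketch below only illustrates the underlying GDAL access pattern it builds on, using the GDAL Python bindings (the HDF granule name is a hypothetical placeholder):

```python
# Minimal sketch: list and read HDF subdatasets with the GDAL Python bindings.
# The MODIS file name below is hypothetical.
from osgeo import gdal

ds = gdal.Open("MOD13A3.A2014001.h08v05.hdf")
for name, description in ds.GetSubDatasets():
    print(name, "->", description)

# Open one subdataset by its full GDAL name and read it as a NumPy array.
sub = gdal.Open(ds.GetSubDatasets()[0][0])
array = sub.ReadAsArray()
print(array.shape)
print(sub.GetProjection())    # projection metadata the middleware must verify/correct
print(sub.GetGeoTransform())  # affine georeferencing used to orient the image
```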

  17. McIDAS-eXplorer: A version of McIDAS for planetary applications

    NASA Technical Reports Server (NTRS)

    Limaye, Sanjay S.; Saunders, R. Stephen; Sromovsky, Lawrence A.; Martin, Michael

    1994-01-01

    McIDAS-eXplorer is a set of software tools developed for analysis of planetary data published by the Planetary Data System on CD-ROMs. It is built upon McIDAS-X, an environment which has been in use for nearly two decades for Earth weather satellite data applications in research and routine operations. The environment allows convenient access, navigation, analysis, display, and animation of planetary data by utilizing the full calibration data accompanying the planetary data. Support currently exists for Voyager images of the giant planets and their satellites; Magellan radar images (F-MIDRs and C-MIDRs, global map products (GxDRs), and altimetry data (ARCDRs)); Galileo SSI images of the Earth, Moon, and Venus; Viking Mars images and MDIMs; as well as most Earth-based telescopic images of solar system objects (FITS). The NAIF/JPL SPICE kernels are used for image navigation when available. For data without SPICE kernels (such as the bulk of the Voyager Jupiter and Saturn imagery and Pioneer Orbiter images of Venus), tools based on the NAIF toolkit allow the user to navigate the images interactively. Multiple navigation types can be attached to a given image (e.g., ring navigation and planet navigation in the same image). Tools are available to perform common image processing tasks such as digital filtering, cartographic mapping, map overlays, and data extraction. It is also possible to use different planetary radii for an object such as Venus, which requires one radius for the surface and another for the cloud level. A graphical user interface based on the Tcl/Tk scripting language is provided (UNIX only at present) for using the environment and for on-line help. End users can add applications of their own to the environment at any time.

  18. Chronic pelvic pain (pelvic congestion syndrome)

    MedlinePlus

    ... GUIDELINES, CLINICAL TOPIC ACKNOWLEDGEMENTS MACRA MATTERS HEALTH POLICY, ECONOMICS, CODING REIMBURSEMENT AND APPEALS TOOLKITS UFE AWARENESS TOOLKIT ... GUIDELINES, CLINICAL TOPIC ACKNOWLEDGEMENTS MACRA MATTERS HEALTH POLICY, ECONOMICS, CODING REIMBURSEMENT AND APPEALS TOOLKITS UFE AWARENESS TOOLKIT ...

  19. A Qualitative Evaluation of Web-Based Cancer Care Quality Improvement Toolkit Use in the Veterans Health Administration.

    PubMed

    Bowman, Candice; Luck, Jeff; Gale, Randall C; Smith, Nina; York, Laura S; Asch, Steven

    2015-01-01

    Disease severity, complexity, and patient burden highlight cancer care as a target for quality improvement (QI) interventions. The Veterans Health Administration (VHA) implemented a series of disease-specific online cancer care QI toolkits. To describe characteristics of the toolkits, target users, and VHA cancer care facilities that influenced toolkit access and use and assess whether such resources were beneficial for users. Deductive content analysis of detailed notes from 94 telephone interviews with individuals from 48 VHA facilities. We evaluated toolkit access and use across cancer types, participation in learning collaboratives, and affiliation with VHA cancer care facilities. The presence of champions was identified as a strong facilitator of toolkit use, and learning collaboratives were important for spreading information about toolkit availability. Identified barriers included lack of personnel and financial resources and complicated approval processes to support tool use. Online cancer care toolkits are well received across cancer specialties and provider types. Clinicians, administrators, and QI staff may benefit from the availability of toolkits as they become more reliant on rapid access to strategies that support comprehensive delivery of evidence-based care. Toolkits should be considered as a complement to other QI approaches.

  20. SlicerRT: radiation therapy research toolkit for 3D Slicer.

    PubMed

    Pinter, Csaba; Lasso, Andras; Wang, An; Jaffray, David; Fichtinger, Gabor

    2012-10-01

    Interest in adaptive radiation therapy research is constantly growing, but software tools available for researchers are mostly either expensive, closed proprietary applications, or free open-source packages with limited scope, extensibility, reliability, or user support. To address these limitations, we propose SlicerRT, a customizable, free, and open-source radiation therapy research toolkit. SlicerRT aspires to be an open-source toolkit for RT research, providing fast computations, convenient workflows for researchers, and a general image-guided therapy infrastructure to assist clinical translation of experimental therapeutic approaches. It is a medium into which RT researchers can integrate their methods and algorithms, and conduct comparative testing. SlicerRT was implemented as an extension for the widely used 3D Slicer medical image visualization and analysis application platform. SlicerRT provides functionality specifically designed for radiation therapy research, in addition to the powerful tools that 3D Slicer offers for visualization, registration, segmentation, and data management. The feature set of SlicerRT was defined through consensus discussions with a large pool of RT researchers, including both radiation oncologists and medical physicists. The development processes used were similar to those of 3D Slicer to ensure software quality. Standardized mechanisms of 3D Slicer were applied for documentation, distribution, and user support. The testing and validation environment was configured to automatically launch a regression test upon each software change and to perform comparison with ground truth results provided by other RT applications. Modules have been created for importing and loading DICOM-RT data, computing and displaying dose volume histograms, creating accumulated dose volumes, comparing dose volumes, and visualizing isodose lines and surfaces. The effectiveness of using 3D Slicer with the proposed SlicerRT extension for radiation therapy research was demonstrated on multiple use cases. A new open-source software toolkit has been developed for radiation therapy research. SlicerRT can import treatment plans from various sources into 3D Slicer for visualization, analysis, comparison, and processing. The provided algorithms are extensively tested and they are accessible through a convenient graphical user interface as well as a flexible application programming interface.

  1. Interactive deformation registration of endorectal prostate MRI using ITK thin plate splines.

    PubMed

    Cheung, M Rex; Krishnan, Karthik

    2009-03-01

    Magnetic resonance imaging with an endorectal coil allows high-resolution imaging of prostate cancer and the surrounding normal organs. These anatomic details can be used to direct radiotherapy. However, organ deformation introduced by the endorectal coil makes it difficult to register magnetic resonance images for treatment planning. In this study, plug-ins for the volume visualization software VolView were implemented on the basis of algorithms from the National Library of Medicine's Insight Segmentation and Registration Toolkit (ITK). Magnetic resonance images of a phantom simulating human pelvic structures were obtained with and without the endorectal coil balloon inflated. The prostate not deformed by the endorectal balloon was registered to the deformed prostate using an ITK thin plate spline (TPS). This plug-in allows the use of crop planes to limit the deformable registration in the region of interest around the prostate. These crop planes restricted the support of the TPS to the area around the prostate, where most of the deformation occurred. The region outside the crop planes was anchored by grid points. The TPS was more accurate in registering the local deformation of the prostate compared with a TPS variant, the elastic body spline. The TPS was also applied to register an in vivo T2-weighted endorectal magnetic resonance image. The intraprostatic tumor was accurately registered. This could potentially guide the boosting of intraprostatic targets. The source and target landmarks were placed graphically. This TPS plug-in allows the registration to be undone. The landmarks could be added, removed, and adjusted in real time and in three dimensions between repeated registrations. This interactive TPS plug-in allows a user to obtain a high level of accuracy satisfactory to a specific application efficiently. Because it is open-source software, the imaging community will be able to validate and improve the algorithm.
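
    The VolView/ITK plug-in itself is not shown here; the following is only a conceptual sketch of landmark-driven thin-plate-spline warping, using SciPy's RBFInterpolator with a thin-plate-spline kernel and made-up landmark coordinates:

```python
# Conceptual sketch only -- not the VolView/ITK plug-in described above.
# Landmark coordinates are invented for illustration.
import numpy as np
from scipy.interpolate import RBFInterpolator

# Source landmarks (undeformed prostate) and target landmarks (deformed by the coil).
source = np.array([[10.0, 12.0, 5.0], [20.0, 15.0, 7.0], [15.0, 25.0, 6.0],
                   [12.0, 18.0, 9.0], [18.0, 22.0, 8.0]])
target = np.array([[10.5, 13.0, 5.2], [21.0, 16.5, 7.1], [15.2, 26.0, 6.4],
                   [12.4, 19.0, 9.3], [18.6, 23.0, 8.2]])

# Fit a displacement field d(x) = target - source with a thin-plate-spline kernel.
tps = RBFInterpolator(source, target - source, kernel="thin_plate_spline")

# Map an arbitrary point from the undeformed space into the deformed space.
points = np.array([[14.0, 17.0, 6.5]])
warped = points + tps(points)
print(warped)
```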

  2. Every Place Counts Leadership Academy : transportation toolkit quick guide

    DOT National Transportation Integrated Search

    2016-12-01

    This is a quick guide to the Transportation Toolkit. The Transportation Toolkit is meant to explain the transportation process to members of the public with no prior knowledge of transportation. The Toolkit is meant to demystify transportation and he...

  3. Object Toolkit Version 4.3 User’s Manual

    DTIC Science & Technology

    2016-12-31

    Object Toolkit is a finite-element model builder specifically designed for creating representations of spacecraft … Nascap-2k and EPIC, the user is not required to purchase or learn expensive finite element generators to create system models. Second, Object Toolkit …

  4. ChemDoodle Web Components: HTML5 toolkit for chemical graphics, interfaces, and informatics.

    PubMed

    Burger, Melanie C

    2015-01-01

    ChemDoodle Web Components (abbreviated CWC, iChemLabs, LLC) is a light-weight (~340 KB) JavaScript/HTML5 toolkit for chemical graphics, structure editing, interfaces, and informatics based on the proprietary ChemDoodle desktop software. The library uses canvas and WebGL technologies and other HTML5 features to provide solutions for creating chemistry-related applications for the web on desktop and mobile platforms. CWC can serve a broad range of scientific disciplines including crystallography, materials science, organic and inorganic chemistry, biochemistry and chemical biology. CWC is freely available for in-house use and is open source (GPL v3) for all other uses. Graphical abstract: Add interactive 2D and 3D chemical sketchers, graphics, and spectra to websites and apps with ChemDoodle Web Components.

  5. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones

    PubMed Central

    Anderson, K.; Griffiths, D.; DeBell, L.; Hancock, S.; Duffy, J. P.; Shutler, J. D.; Reinhardt, W. J.; Griffiths, A.

    2016-01-01

    This manuscript describes the development of an Android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding ‘scheme blocks’ framework was used to build the application (‘app’), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones, and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing: a high-specification OnePlus One handset and a lower-cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that the best results were obtained when the phone was attached to a stable single-line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high-quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for ‘UAV toolkit’ (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping). PMID:27144310
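
    The app's actual processing chain is not reproduced here; the sketch below only illustrates, under stated assumptions, how GPS-derived ground control points could be attached to an aerial photograph with the GDAL Python bindings (file names and coordinates are invented):

```python
# Generic sketch (not the app's workflow): attach GPS-derived ground control
# points to an aerial photograph and save a GeoTIFF with GDAL.
from osgeo import gdal, osr

src = gdal.Open("kite_photo.jpg")  # hypothetical input frame
dst = gdal.GetDriverByName("GTiff").CreateCopy("kite_photo_gcp.tif", src)

srs = osr.SpatialReference()
srs.ImportFromEPSG(4326)  # WGS84 lon/lat, as reported by the phone's GPS

# gdal.GCP(x, y, z, pixel, line): map coordinate <-> image coordinate pairs.
gcps = [
    gdal.GCP(-3.5340, 50.7370, 0.0, 0.0, 0.0),
    gdal.GCP(-3.5330, 50.7370, 0.0, src.RasterXSize, 0.0),
    gdal.GCP(-3.5330, 50.7360, 0.0, src.RasterXSize, src.RasterYSize),
    gdal.GCP(-3.5340, 50.7360, 0.0, 0.0, src.RasterYSize),
]
dst.SetGCPs(gcps, srs.ExportToWkt())
dst.FlushCache()
```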

  6. ASERA: A spectrum eye recognition assistant for quasar spectra

    NASA Astrophysics Data System (ADS)

    Yuan, Hailong; Zhang, Haotong; Zhang, Yanxia; Lei, Yajuan; Dong, Yiqiao; Zhao, Yongheng

    2013-11-01

    Spectral type recognition is an important and fundamental step of large sky survey projects in the data reduction for further scientific research, such as parameter measurement and statistical work. It turns out to be a huge job to manually inspect the low-quality spectra produced by a massive spectroscopic survey, for which the automatic pipeline may not provide confident type classification results. In order to improve the efficiency and effectiveness of spectral classification, we developed a semi-automated toolkit named ASERA, A Spectrum Eye Recognition Assistant. The main purpose of ASERA is to help the user with quasar spectral recognition and redshift measurement. Furthermore, it can also be used to recognize various types of spectra of stars, galaxies and AGNs (Active Galactic Nuclei). It is interactive software allowing the user to visualize observed spectra, superimpose template spectra from the Sloan Digital Sky Survey (SDSS), and interactively access related spectral line information. It is an efficient and user-friendly toolkit for the accurate classification of spectra observed by LAMOST (the Large Sky Area Multi-Object Fiber Spectroscopic Telescope). The toolkit is available in two modes: a Java standalone application and a Java applet. ASERA provides functions such as wavelength and flux scale setting, zooming in and out, redshift estimation, and spectral line identification, which help users improve spectral classification accuracy, especially for low-quality spectra, and reduce the labor of eyeball checking. The functionality and performance of this tool are demonstrated through the recognition of several quasar spectra and a late-type stellar spectrum from the LAMOST Pilot survey. Its future expansion capabilities are discussed.
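
    ASERA's implementation is not shown here; the following is only a conceptual sketch of the template-overlay idea it supports interactively, shifting a rest-frame template by trial redshifts and scoring the match against an observed spectrum (all arrays below are synthetic placeholders):

```python
# Conceptual sketch of template-based redshift estimation; data are synthetic.
import numpy as np

def redshift_template(rest_wavelength, rest_flux, z):
    """Shift a rest-frame template to the observed frame: lambda_obs = lambda_rest * (1 + z)."""
    return rest_wavelength * (1.0 + z), rest_flux

# Synthetic rest-frame template and "observed" spectrum (arbitrary units).
rest_wl = np.linspace(1000.0, 3000.0, 2000)
rest_flux = np.exp(-0.5 * ((rest_wl - 1216.0) / 15.0) ** 2)     # crude emission bump
obs_wl = np.linspace(3800.0, 9000.0, 3000)
obs_flux = np.exp(-0.5 * ((obs_wl - 1216.0 * 3.5) / 50.0) ** 2)  # same bump at z = 2.5

# Brute-force the redshift that best matches the template to the observation.
best_z, best_score = None, -np.inf
for z in np.arange(0.0, 5.0, 0.01):
    shifted_wl, shifted_flux = redshift_template(rest_wl, rest_flux, z)
    resampled = np.interp(obs_wl, shifted_wl, shifted_flux, left=0.0, right=0.0)
    score = np.dot(resampled, obs_flux)   # simple zero-lag cross-correlation
    if score > best_score:
        best_z, best_score = z, score
print("best-fitting z approx", best_z)
```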

  7. "Handy Manny" and the Emergent Literacy Technology Toolkit

    ERIC Educational Resources Information Center

    Hourcade, Jack J.; Parette, Howard P., Jr.; Boeckmann, Nichole; Blum, Craig

    2010-01-01

    This paper outlines the use of a technology toolkit to support emergent literacy curriculum and instruction in early childhood education settings. Components of the toolkit include hardware and software that can facilitate key emergent literacy skills. Implementation of the comprehensive technology toolkit enhances the development of these…

  8. Risk of Resource Failure and Toolkit Variation in Small-Scale Farmers and Herders

    PubMed Central

    Collard, Mark; Ruttle, April; Buchanan, Briggs; O’Brien, Michael J.

    2012-01-01

    Recent work suggests that global variation in toolkit structure among hunter-gatherers is driven by risk of resource failure such that as risk of resource failure increases, toolkits become more diverse and complex. Here we report a study in which we investigated whether the toolkits of small-scale farmers and herders are influenced by risk of resource failure in the same way. In the study, we applied simple linear and multiple regression analysis to data from 45 small-scale food-producing groups to test the risk hypothesis. Our results were not consistent with the hypothesis; none of the risk variables we examined had a significant impact on toolkit diversity or on toolkit complexity. It appears, therefore, that the drivers of toolkit structure differ between hunter-gatherers and small-scale food-producers. PMID:22844421
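
    As an illustration of the kind of analysis described above (not the authors' data; the variable names and risk proxies below are hypothetical), a simple and a multiple regression of toolkit diversity on risk proxies could be run in Python with statsmodels:

```python
# Illustrative regression sketch with simulated data, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "toolkit_diversity": rng.poisson(30, size=45),     # e.g. number of tool classes per group
    "growing_season_cv": rng.uniform(0.05, 0.5, 45),   # hypothetical risk proxy
    "rainfall_cv": rng.uniform(0.1, 0.6, 45),          # hypothetical risk proxy
})

simple = smf.ols("toolkit_diversity ~ growing_season_cv", data=df).fit()
multiple = smf.ols("toolkit_diversity ~ growing_season_cv + rainfall_cv", data=df).fit()
print(simple.params)
print(multiple.params)
```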

  9. The effectiveness of toolkits as knowledge translation strategies for integrating evidence into clinical care: a systematic review

    PubMed Central

    Yamada, Janet; Shorkey, Allyson; Barwick, Melanie; Widger, Kimberley; Stevens, Bonnie J

    2015-01-01

    Objectives The aim of this systematic review was to evaluate the effectiveness of toolkits as a knowledge translation (KT) strategy for facilitating the implementation of evidence into clinical care. Toolkits include multiple resources for educating and/or facilitating behaviour change. Design Systematic review of the literature on toolkits. Methods A search was conducted on MEDLINE, EMBASE, PsycINFO and CINAHL. Studies were included if they evaluated the effectiveness of a toolkit to support the integration of evidence into clinical care, and if the KT goal(s) of the study were to inform, share knowledge, build awareness, change practice, change behaviour, and/or clinical outcomes in healthcare settings, inform policy, or to commercialise an innovation. Screening of studies, assessment of methodological quality and data extraction for the included studies were conducted by at least two reviewers. Results 39 relevant studies were included for full review; 8 were rated as moderate to strong methodologically with clinical outcomes that could be somewhat attributed to the toolkit. Three of the eight studies evaluated the toolkit as a single KT intervention, while five embedded the toolkit into a multistrategy intervention. Six of the eight toolkits were partially or mostly effective in changing clinical outcomes and six studies reported on implementation outcomes. The types of resources embedded within toolkits varied but included predominantly educational materials. Conclusions Future toolkits should be informed by high-quality evidence and theory, and should be evaluated using rigorous study designs to explain the factors underlying their effectiveness and successful implementation. PMID:25869686

  10. Multimodal Virtual Environments: MAGIC Toolkit and Visual-Haptic Interaction Paradigms

    DTIC Science & Technology

    1998-01-01

    … Northwestern University [Colgate, 1994]. It is possible for a user to touch one side of a thin object and be propelled out the opposite side, because … when there is a high correlation in motion and force between the visual and haptic realms. Chapter 7 concludes with an evaluation of the application …

  11. The Mathlet Toolkit: Creating Dynamic Applets for Differential Equations and Dynamical Systems

    ERIC Educational Resources Information Center

    Decker, Robert

    2011-01-01

    Dynamic/interactive graphing applets can be used to supplement standard computer algebra systems such as Maple, Mathematica, Derive, or TI calculators, in courses such as Calculus, Differential Equations, and Dynamical Systems. The addition of this type of software can lead to discovery learning, with students developing their own conjectures, and…

  12. Geant4 hadronic physics for space radiation environment.

    PubMed

    Ivantchenko, Anton V; Ivanchenko, Vladimir N; Molina, Jose-Manuel Quesada; Incerti, Sebastien L

    2012-01-01

    To test and to develop Geant4 (Geometry And Tracking version 4) Monte Carlo hadronic models with a focus on applications in a space radiation environment. The Monte Carlo simulations have been performed using the Geant4 toolkit. The Binary cascade (BIC), its extension for incident light ions (BIC-ion) and the Bertini cascade (BERT) were used as the main Monte Carlo generators. For comparison purposes, some other models were tested too. The hadronic testing suite has been used as a primary tool for model development and validation against experimental data. The Geant4 pre-compound (PRECO) and de-excitation (DEE) models were revised and improved. Proton, neutron, pion, and ion nuclear interactions were simulated with the recent version of Geant4 9.4 and were compared with experimental data from thin and thick target experiments. The Geant4 toolkit offers a large set of models allowing effective simulation of interactions of particles with matter. We have tested different Monte Carlo generators with our hadronic testing suite and accordingly we can propose an optimal configuration of Geant4 models for the simulation of the space radiation environment.

  13. Integrating Space Systems Operations at the Marine Expeditionary Force Level

    DTIC Science & Technology

    2015-06-01

    … • GPS Interference and Navigation Tool (GIANT) for providing GPS accuracy prediction reports • Systems Toolkit (STK) analysis …

  14. Retrieval of radiology reports citing critical findings with disease-specific customization.

    PubMed

    Lacson, Ronilda; Sugarbaker, Nathanael; Prevedello, Luciano M; Ivan, Ip; Mar, Wendy; Andriole, Katherine P; Khorasani, Ramin

    2012-01-01

    Communication of critical results from diagnostic procedures between caregivers is a Joint Commission national patient safety goal. Evaluating critical result communication often requires manual analysis of voluminous data, especially when reviewing unstructured textual results of radiologic findings. Information retrieval (IR) tools can facilitate this process by enabling automated retrieval of radiology reports that cite critical imaging findings. However, IR tools that have been developed for one disease or imaging modality often need substantial reconfiguration before they can be utilized for another disease entity. THIS PAPER: 1) describes the process of customizing two Natural Language Processing (NLP) and Information Retrieval/Extraction applications - an open-source toolkit, A Nearly New Information Extraction system (ANNIE); and an application developed in-house, Information for Searching Content with an Ontology-Utilizing Toolkit (iSCOUT) - to illustrate the varying levels of customization required for different disease entities and; 2) evaluates each application's performance in identifying and retrieving radiology reports citing critical imaging findings for three distinct diseases, pulmonary nodule, pneumothorax, and pulmonary embolus. Both applications can be utilized for retrieval. iSCOUT and ANNIE had precision values between 0.90-0.98 and recall values between 0.79 and 0.94. ANNIE had consistently higher precision but required more customization. Understanding the customizations involved in utilizing NLP applications for various diseases will enable users to select the most suitable tool for specific tasks.

  15. Retrieval of Radiology Reports Citing Critical Findings with Disease-Specific Customization

    PubMed Central

    Lacson, Ronilda; Sugarbaker, Nathanael; Prevedello, Luciano M; Ivan, IP; Mar, Wendy; Andriole, Katherine P; Khorasani, Ramin

    2012-01-01

    Background: Communication of critical results from diagnostic procedures between caregivers is a Joint Commission national patient safety goal. Evaluating critical result communication often requires manual analysis of voluminous data, especially when reviewing unstructured textual results of radiologic findings. Information retrieval (IR) tools can facilitate this process by enabling automated retrieval of radiology reports that cite critical imaging findings. However, IR tools that have been developed for one disease or imaging modality often need substantial reconfiguration before they can be utilized for another disease entity. Purpose: This paper: 1) describes the process of customizing two Natural Language Processing (NLP) and Information Retrieval/Extraction applications – an open-source toolkit, A Nearly New Information Extraction system (ANNIE); and an application developed in-house, Information for Searching Content with an Ontology-Utilizing Toolkit (iSCOUT) – to illustrate the varying levels of customization required for different disease entities and; 2) evaluates each application’s performance in identifying and retrieving radiology reports citing critical imaging findings for three distinct diseases, pulmonary nodule, pneumothorax, and pulmonary embolus. Results: Both applications can be utilized for retrieval. iSCOUT and ANNIE had precision values between 0.90-0.98 and recall values between 0.79 and 0.94. ANNIE had consistently higher precision but required more customization. Conclusion: Understanding the customizations involved in utilizing NLP applications for various diseases will enable users to select the most suitable tool for specific tasks. PMID:22934127
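
    For reference, the precision and recall values quoted in the two records above follow the standard definitions; a minimal sketch with hypothetical counts (not taken from the study) is:

```python
# Precision/recall reference; the counts below are invented for illustration.
def precision(tp: int, fp: int) -> float:
    return tp / (tp + fp) if (tp + fp) else 0.0

def recall(tp: int, fn: int) -> float:
    return tp / (tp + fn) if (tp + fn) else 0.0

tp, fp, fn = 94, 4, 12   # hypothetical retrieval outcome for one disease/query
print(f"precision = {precision(tp, fp):.2f}, recall = {recall(tp, fn):.2f}")
```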

  16. Field Ground Truthing Data Collector - a Mobile Toolkit for Image Analysis and Processing

    NASA Astrophysics Data System (ADS)

    Meng, X.

    2012-07-01

    Field Ground Truthing Data Collector is one of the four key components of the NASA funded ICCaRS project, being developed in Southeast Michigan. The ICCaRS ground truthing toolkit entertains comprehensive functions: 1) Field functions, including determining locations through GPS, gathering and geo-referencing visual data, laying out ground control points for AEROKAT flights, measuring the flight distance and height, and entering observations of land cover (and use) and health conditions of ecosystems and environments in the vicinity of the flight field; 2) Server synchronization functions, such as, downloading study-area maps, aerial photos and satellite images, uploading and synchronizing field-collected data with the distributed databases, calling the geospatial web services on the server side to conduct spatial querying, image analysis and processing, and receiving the processed results in field for near-real-time validation; and 3) Social network communication functions for direct technical assistance and pedagogical support, e.g., having video-conference calls in field with the supporting educators, scientists, and technologists, participating in Webinars, or engaging discussions with other-learning portals. This customized software package is being built on Apple iPhone/iPad and Google Maps/Earth. The technical infrastructures, data models, coupling methods between distributed geospatial data processing and field data collector tools, remote communication interfaces, coding schema, and functional flow charts will be illustrated and explained at the presentation. A pilot case study will be also demonstrated.

  17. Medical image reconstruction algorithm based on the geometric information between sensor detector and ROI

    NASA Astrophysics Data System (ADS)

    Ham, Woonchul; Song, Chulgyu; Lee, Kangsan; Roh, Seungkuk

    2016-05-01

    In this paper, we propose a new image reconstruction algorithm that considers the geometric information of the acoustic sources and the sensor detector, and we review the two-step reconstruction algorithm previously proposed based on the geometrical information of the ROI (region of interest), considering the finite size of the acoustic sensor element. In the new image reconstruction algorithm, not only is the mathematical analysis very simple, but its software implementation is also very easy because we do not need to use the FFT. We verify the effectiveness of the proposed reconstruction algorithm by presenting simulation results obtained with the MATLAB k-Wave toolkit.
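
    The authors' algorithm is not reproduced here; the sketch below only illustrates the general geometry-driven, FFT-free idea with a generic delay-and-sum back-projection over an ROI grid (array geometry, sampling rate, and signals are synthetic placeholders):

```python
# Generic delay-and-sum sketch: back-project sensor signals onto an ROI grid
# using only source/detector geometry, with no FFT. Data are synthetic.
import numpy as np

c = 1500.0          # assumed speed of sound in tissue [m/s]
fs = 40e6           # sampling rate [Hz]
n_samples = 2048
sensor_x = np.linspace(-0.01, 0.01, 64)       # 64-element linear array at y = 0
signals = np.random.randn(64, n_samples)      # placeholder RF data

# ROI grid (2-D slice), 0.1 mm spacing
xs = np.linspace(-0.005, 0.005, 101)
ys = np.linspace(0.002, 0.012, 101)
image = np.zeros((ys.size, xs.size))

for i, sx in enumerate(sensor_x):
    # time of flight from every ROI pixel to this sensor element
    dist = np.sqrt((xs[None, :] - sx) ** 2 + ys[:, None] ** 2)
    idx = np.clip(np.round(dist / c * fs).astype(int), 0, n_samples - 1)
    image += signals[i][idx]                  # delay-and-sum accumulation

print(image.shape)
```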

  18. First neuronavigation experiences in Uruguay.

    PubMed

    Carbajal, Guillermo; Gomez, Alvaro; Pereyra, Gabriela; Lima, Ramiro; Preciozzi, Javier; Vazquez, Luis; Villar, Alvaro

    2010-01-01

    Neuronavigation is the application of image guidance to neurosurgery where the position of a surgical tool can be displayed on a preoperative image. Although this technique has been used worldwide in the last ten years, it was never applied in Uruguay due to its cost. In an ongoing project, the Engineering Faculty (Universidad de la República), the Hospital de Clínicas (Medicine Faculty - Universidad de la República) and the Regional Hospital of Tacuarembó are doing the first experimental trials in neuronavigation. In this project, a prototype based on optical tracking equipment and the open source software IGSTK (Image Guided Surgery Toolkit) is under development and testing.

  19. Geant4 models for simulation of hadron/ion nuclear interactions at moderate and low energies.

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Ivanchenko, Vladimir; Quesada, Jose-Manuel; Wright, Dennis

    The Geant4 toolkit is intended for Monte Carlo simulation of particle transport in media. It was initially designed for High Energy Physics purposes such as experiments at the Large Hadron Collider (LHC) at CERN. The toolkit offers a set of models allowing effective simulation of cosmic ray interactions with different materials. For moderate and low energy hadron/ion interactions with nuclei there are a number of competitive models: Binary and Bertini intra-nuclear cascade models, quantum molecular dynamic model (QMD), INCL/ABLA cascade model, and Chiral Invariant Phase Space Decay model (CHIPS). We report the status of these models for the recent version of Geant4 (release 9.3, December 2009). The Bertini cascade internal cross sections were upgraded. The native Geant4 precompound and deexcitation models were used in the Binary cascade and QMD. They were significantly improved including emission of light fragments, the Fermi break-up model, the General Evaporation Model (GEM), the multi-fragmentation model, and the fission model. Comparisons between model predictions and data for thin target experiments for neutron, proton, light ions, and isotope production are presented and discussed. The focus of these validations is concentrated on target materials important for space missions.

  20. Multimethod evaluation of the VA's peer-to-peer Toolkit for patient-centered medical home implementation.

    PubMed

    Luck, Jeff; Bowman, Candice; York, Laura; Midboe, Amanda; Taylor, Thomas; Gale, Randall; Asch, Steven

    2014-07-01

    Effective implementation of the patient-centered medical home (PCMH) in primary care practices requires training and other resources, such as online toolkits, to share strategies and materials. The Veterans Health Administration (VA) developed an online Toolkit of user-sourced tools to support teams implementing its Patient Aligned Care Team (PACT) medical home model. To present findings from an evaluation of the PACT Toolkit, including use, variation across facilities, effect of social marketing, and factors influencing use. The Toolkit is an online repository of ready-to-use tools created by VA clinic staff that physicians, nurses, and other team members may share, download, and adopt in order to more effectively implement PCMH principles and improve local performance on VA metrics. Multimethod evaluation using: (1) website usage analytics, (2) an online survey of the PACT community of practice's use of the Toolkit, and (3) key informant interviews. Survey respondents were PACT team members and coaches (n = 544) at 136 VA facilities. Interview respondents were Toolkit users and non-users (n = 32). For survey data, multivariable logistic models were used to predict Toolkit awareness and use. Interviews and open-text survey comments were coded using a "common themes" framework. The Consolidated Framework for Implementation Research (CFIR) guided data collection and analyses. The Toolkit was used by 6,745 staff in the first 19 months of availability. Among members of the target audience, 80 % had heard of the Toolkit, and of those, 70 % had visited the website. Tools had been implemented at 65 % of facilities. Qualitative findings revealed a range of user perspectives from enthusiastic support to lack of sufficient time to browse the Toolkit. An online Toolkit to support PCMH implementation was used at VA facilities nationwide. Other complex health care organizations may benefit from adopting similar online peer-to-peer resource libraries.

  1. Characterizing polarized illumination in high numerical aperture optical lithography with phase shifting masks

    NASA Astrophysics Data System (ADS)

    McIntyre, Gregory Russell

    The primary objective of this dissertation is to develop the phase shifting mask (PSM) as a precision instrument to characterize effects in optical lithography related to the use of polarized partially coherent illumination. The intent is to provide an in-situ characterization technique to add to the lithographer's toolkit to help enable the stable and repeatable mass production of integrated circuits with feature sizes approaching 1/6th the wavelength of light being used. A series of complex-valued mathematical functions have been derived from basic principles, and recent advances in photomask fabrication technology have enabled their implementation with four-phase mask making. When located in the object plane of an imaging system, these test functions serve to engineer a wavefront that interacts with one particular optical effect, creating a measurable signal in the image plane. In most cases, these test patterns leverage proximity effects to create a central image intensity and are theoretically the most sensitive to the desired effect. Five novel classes of test patterns have been developed for in-situ characterization. The first two classes, the Linear Phase Grating (LPG) and the Linear Phase Ring (LPR), both serve to characterize illumination angular distribution and uniformity by creating signals dependent on illumination angular frequency. The third class consists of the Radial Phase Grating (RPG) and Proximity Effect Polarization Analyzers (PEPA), which each create a polarization-dependent signal by taking advantage of the image reversal of one polarization component at high numerical aperture (NA). PSM Polarimetry employs a series of these patterns to form a complete polarization characterization of any arbitrary illumination scheme. The fourth and fifth classes employ sub-resolution interferometric reference probes to coherently interact with proximity effect spillover from a surrounding pattern. They measure the effective phase and transmission of the shifted regions of an alternating PSM and projection lens birefringence, respectively. A secondary objective of this dissertation has been to leverage some of these functions to extend the application of pattern matching software to rapidly identify areas in a circuit design layout that may be vulnerable to polarization and high-NA effects. Additionally, polarization aberrations have been investigated, as they may become important with hyper-NA imaging systems. Three multi-phase test reticles have been developed for this thesis and have pushed the limits of photomask fabrication. Coupled with a variety of experimental and simulation studies at 193nm wavelength, they have validated the scientific principles of the PSM monitors and have offered unique insight into implementation issues such as electromagnetic (EM) effects and mask making tolerances. Although all five classes are novel theoretical concepts, it is believed that PSM Polarimetry is commercially viable. Despite a 70% loss of sensitivity due to mask making limitations and a 20% loss due to EM effects, it can likely still monitor polarization to within 2%. Experimental results are comparable to the only other known technique, which requires special equipment. Taken collectively, the five novel classes of PSM monitors offer the lithographer an independent toolkit to ensure proper tool operation. They also provide circuit designers an understanding of the impact of imaging on layouts. Although they have been developed for optical lithography, their principles are relevant to any image-forming optical system and are likely to find applications in other fields of optics or acoustics.

  2. PEA: an integrated R toolkit for plant epitranscriptome analysis.

    PubMed

    Zhai, Jingjing; Song, Jie; Cheng, Qian; Tang, Yunjia; Ma, Chuang

    2018-05-29

    The epitranscriptome, also known as chemical modifications of RNA (CMRs), is a newly discovered layer of gene regulation, the biological importance of which emerged through analysis of only a small fraction of CMRs detected by high-throughput sequencing technologies. Understanding of the epitranscriptome is hampered by the absence of computational tools for the systematic analysis of epitranscriptome sequencing data. In addition, no tools have yet been designed for accurate prediction of CMRs in plants, or to extend epitranscriptome analysis from a fraction of the transcriptome to its entirety. Here, we introduce PEA, an integrated R toolkit to facilitate the analysis of plant epitranscriptome data. The PEA toolkit contains a comprehensive collection of functions required for read mapping, CMR calling, motif scanning and discovery, and gene functional enrichment analysis. PEA also takes advantage of machine learning technologies for transcriptome-scale CMR prediction, with high prediction accuracy, using the Positive Samples Only Learning algorithm, which addresses the two-class classification problem by using only positive samples (CMRs), in the absence of negative samples (non-CMRs). Hence PEA is a versatile epitranscriptome analysis pipeline covering CMR calling, prediction, and annotation, and we describe its application to predict N6-methyladenosine (m6A) modifications in Arabidopsis thaliana. Experimental results demonstrate that the toolkit achieved 71.6% sensitivity and 73.7% specificity, which is superior to existing m6A predictors. PEA is potentially broadly applicable to the in-depth study of epitranscriptomics. PEA Docker image is available at https://hub.docker.com/r/malab/pea, source codes and user manual are available at https://github.com/cma2015/PEA. chuangma2006@gmail.com. Supplementary data are available at Bioinformatics online.

  3. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    PubMed

    Kasahara, Kota; Kinoshita, Kengo

    2016-01-01

    Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
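
    IBiSA_tools itself is not shown here; the sketch below only illustrates how a GML ion-binding state graph of the kind it produces could be loaded and summarized with NetworkX before visualizing it in Cytoscape (the file name is a hypothetical placeholder):

```python
# Minimal sketch: load a GML graph and summarize it; the file name is hypothetical.
import networkx as nx

g = nx.read_gml("ion_binding_states.gml")   # GML output assumed from an IBiSA_tools run
print(g.number_of_nodes(), "binding states,", g.number_of_edges(), "transitions")

# Rank states by total degree as a rough indicator of frequently visited states.
for node, degree in sorted(g.degree, key=lambda kv: kv[1], reverse=True)[:5]:
    print(node, degree)
```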

  4. Exposing exposure: automated anatomy-specific CT radiation exposure extraction for quality assurance and radiation monitoring.

    PubMed

    Sodickson, Aaron; Warden, Graham I; Farkas, Cameron E; Ikuta, Ichiro; Prevedello, Luciano M; Andriole, Katherine P; Khorasani, Ramin

    2012-08-01

    To develop and validate an informatics toolkit that extracts anatomy-specific computed tomography (CT) radiation exposure metrics (volume CT dose index and dose-length product) from existing digital image archives through optical character recognition of CT dose report screen captures (dose screens) combined with Digital Imaging and Communications in Medicine attributes. This institutional review board-approved HIPAA-compliant study was performed in a large urban health care delivery network. Data were drawn from a random sample of CT encounters that occurred between 2000 and 2010; images from these encounters were contained within the enterprise image archive, which encompassed images obtained at an adult academic tertiary referral hospital and its affiliated sites, including a cancer center, a community hospital, and outpatient imaging centers, as well as images imported from other facilities. Software was validated by using 150 randomly selected encounters for each major CT scanner manufacturer, with outcome measures of dose screen retrieval rate (proportion of correctly located dose screens) and anatomic assignment precision (proportion of extracted exposure data with correctly assigned anatomic region, such as head, chest, or abdomen and pelvis). The 95% binomial confidence intervals (CIs) were calculated for discrete proportions, and CIs were derived from the standard error of the mean for continuous variables. After validation, the informatics toolkit was used to populate an exposure repository from a cohort of 54 549 CT encounters, of which 29 948 had available dose screens. Validation yielded a dose screen retrieval rate of 99% (597 of 605 CT encounters; 95% CI: 98%, 100%) and an anatomic assignment precision of 94% (summed DLP fraction correct, 563 of 600 CT encounters; 95% CI: 92%, 96%). Patient safety applications of the resulting data repository include benchmarking between institutions, CT protocol quality control and optimization, and cumulative patient- and anatomy-specific radiation exposure monitoring. Large-scale anatomy-specific radiation exposure data repositories can be created with high fidelity from existing digital image archives by using open-source informatics tools.
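
    The validated toolkit itself is not reproduced here; the sketch below only illustrates the general pattern of pairing DICOM attributes with OCR of a dose-screen capture, using pydicom and pytesseract (the file name is hypothetical, and a local Tesseract installation is assumed):

```python
# Illustrative sketch: read DICOM attributes and OCR a dose-screen capture.
import numpy as np
import pydicom
import pytesseract
from PIL import Image

ds = pydicom.dcmread("dose_screen.dcm")   # hypothetical secondary-capture object
print(ds.Modality, ds.StudyDate, ds.get("StationName", "unknown scanner"))

# Convert the pixel data to an 8-bit image and run OCR to pull CTDIvol/DLP text.
pixels = ds.pixel_array.astype(float)
pixels = (255 * (pixels - pixels.min()) / max(np.ptp(pixels), 1)).astype(np.uint8)
text = pytesseract.image_to_string(Image.fromarray(pixels))
print([line for line in text.splitlines() if "DLP" in line or "CTDI" in line])
```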

  5. A software architecture for automating operations processes

    NASA Technical Reports Server (NTRS)

    Miller, Kevin J.

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a software architecture based on an integrated toolkit approach for simplifying and automating mission operations tasks. The toolkit approach is based on building adaptable, reusable graphical tools that are integrated through a combination of libraries, scripts, and system-level user interface shells. The graphical interface shells are designed to integrate and visually guide a user through the complex steps in an operations process. They provide a user with an integrated system-level picture of an overall process, defining the required inputs and possible output through interactive on-screen graphics. The OEL has developed the software for building these process-oriented graphical user interface (GUI) shells. The OEL Shell development system (OEL Shell) is an extension of JPL's Widget Creation Library (WCL). The OEL Shell system can be used to easily build user interfaces for running complex processes, applications with extensive command-line interfaces, and tool-integration tasks. The interface shells display a logical process flow using arrows and box graphics. They also allow a user to select which output products are desired and which input sources are needed, eliminating the need to know which program and its associated command-line parameters must be executed in each case. The shells have also proved valuable for use as operations training tools because of the OEL Shell hypertext help environment. The OEL toolkit approach is guided by several principles, including the use of ASCII text file interfaces with a multimission format, Perl scripts for mission-specific adaptation code, and programs that include a simple command-line interface for batch mode processing. Projects can adapt the interface shells by simple changes to the resources configuration file. This approach has allowed the development of sophisticated, automated software systems that are easy, cheap, and fast to build. This paper will discuss our toolkit approach and the OEL Shell interface builder in the context of a real operations process example. The paper will discuss the design and implementation of a Ulysses toolkit for generating the mission sequence of events. The Sequence of Events Generation (SEG) system provides an adaptable multimission toolkit for producing a time-ordered listing and timeline display of spacecraft commands, state changes, and required ground activities.

  6. Evaluation of an Extension-Delivered Resource for Accelerating Progress in Childhood Obesity Prevention: The BEPA-Toolkit

    ERIC Educational Resources Information Center

    Gunter, Katherine B.; Abi Nader, Patrick; Armington, Amanda; Hicks, John C.; John, Deborah

    2017-01-01

    The Balanced Energy Physical Activity Toolkit, or the BEPA-Toolkit, supports physical activity (PA) programming via Extension in elementary schools. In a pilot study, we evaluated the effectiveness of the BEPA-Toolkit as used by teachers through Supplemental Nutrition Assistance Program Education partnerships. We surveyed teachers (n = 57)…

  7. Data Exploration Toolkit for serial diffraction experiments

    DOE PAGES

    Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; ...

    2015-01-23

    Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the 'diffraction before destruction' nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing.

  8. Electro-optical co-simulation for integrated CMOS photonic circuits with VerilogA.

    PubMed

    Sorace-Agaskar, Cheryl; Leu, Jonathan; Watts, Michael R; Stojanovic, Vladimir

    2015-10-19

    We present a Cadence toolkit library written in VerilogA for simulation of electro-optical systems. We have identified and described a set of fundamental photonic components at the physical level such that characteristics of composite devices (e.g. ring modulators) are created organically, by simple instantiation of fundamental primitives. Both the amplitude and phase of optical signals as well as optical-electrical interactions are simulated. We show that the results match other simulations and analytic solutions that have previously been compared to theory for both simple devices, such as ring resonators, and more complicated devices and systems such as single-sideband modulators, WDM links and Pound-Drever-Hall locking loops. We also illustrate the capability of such a toolkit for co-simulation with electronic circuits, which is a key enabler of electro-optic system development and verification.

  9. Implementing a user-driven online quality improvement toolkit for cancer care.

    PubMed

    Luck, Jeff; York, Laura S; Bowman, Candice; Gale, Randall C; Smith, Nina; Asch, Steven M

    2015-05-01

    Peer-to-peer collaboration within integrated health systems requires a mechanism for sharing quality improvement lessons. The Veterans Health Administration (VA) developed online compendia of tools linked to specific cancer quality indicators. We evaluated awareness and use of the toolkits, variation across facilities, impact of social marketing, and factors influencing toolkit use. A diffusion of innovations conceptual framework guided the collection of user activity data from the Toolkit Series SharePoint site and an online survey of potential Lung Cancer Care Toolkit users. The VA Toolkit Series site had 5,088 unique visitors in its first 22 months; 5% of users accounted for 40% of page views. Social marketing communications were correlated with site usage. Of survey respondents (n = 355), 54% had visited the site, of whom 24% downloaded at least one tool. Respondents' awareness of the lung cancer quality performance of their facility, and facility participation in quality improvement collaboratives, were positively associated with Toolkit Series site use. Facility-level lung cancer tool implementation varied widely across tool types. The VA Toolkit Series achieved widespread use and a high degree of user engagement, although use varied widely across facilities. The most active users were aware of and active in cancer care quality improvement. Toolkit use seemed to be reinforced by other quality improvement activities. A combination of user-driven tool creation and centralized toolkit development seemed to be effective for leveraging health information technology to spread disease-specific quality improvement tools within an integrated health care system. Copyright © 2015 by American Society of Clinical Oncology.

  10. Population-scale three-dimensional reconstruction and quantitative profiling of microglia arbors

    PubMed Central

    Rey-Villamizar, Nicolas; Merouane, Amine; Lu, Yanbin; Mukherjee, Amit; Trett, Kristen; Chong, Peter; Harris, Carolyn; Shain, William; Roysam, Badrinath

    2015-01-01

    Motivation: The arbor morphologies of brain microglia are important indicators of cell activation. This article fills the need for accurate, robust, adaptive and scalable methods for reconstructing 3-D microglial arbors and quantitatively mapping microglia activation states over extended brain tissue regions. Results: Thick rat brain sections (100–300 µm) were multiplex immunolabeled for IBA1 and Hoechst, and imaged by step-and-image confocal microscopy with automated 3-D image mosaicing, producing seamless images of extended brain regions (e.g. 5903 × 9874 × 229 voxels). An over-complete dictionary-based model was learned for the image-specific local structure of microglial processes. The microglial arbors were reconstructed seamlessly using an automated and scalable algorithm that exploits microglia-specific constraints. This method detected 80.1 and 92.8% more centered arbor points, and 53.5 and 55.5% fewer spurious points than existing vesselness and LoG-based methods, respectively, and the traces were 13.1 and 15.5% more accurate based on the DIADEM metric. The arbor morphologies were quantified using Scorcioni’s L-measure. Coifman’s harmonic co-clustering revealed four morphologically distinct classes that concord with known microglia activation patterns. This enabled us to map spatial distributions of microglial activation and cell abundances. Availability and implementation: Experimental protocols, sample datasets, scalable open-source multi-threaded software implementation (C++, MATLAB) in the electronic supplement, and website (www.farsight-toolkit.org). http://www.farsight-toolkit.org/wiki/Population-scale_Three-dimensional_Reconstruction_and_Quanti-tative_Profiling_of_Microglia_Arbors Contact: broysam@central.uh.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25701570

  11. A practical approach for comparing management strategies in complex forest ecosystems using meta-modelling toolkits

    Treesearch

    Andrew Fall; B. Sturtevant; M.-J. Fortin; M. Papaik; F. Doyon; D. Morgan; K. Berninger; C. Messier

    2010-01-01

    The complexity and multi-scaled nature of forests poses significant challenges to understanding and management. Models can provide useful insights into process and their interactions, and implications of alternative management options. Most models, particularly scientific models, focus on a relatively small set of processes and are designed to operate within a...

  12. Network Science Research Laboratory (NSRL) Discrete Event Toolkit

    DTIC Science & Technology

    2016-01-01

    Network Science Research Laboratory (NSRL) Discrete Event Toolkit, ARL-TR-7579, January 2016, by Theron Trout and Andrew J Toth, US Army Research Laboratory, Computational and Information Sciences Directorate.

  13. The Orthanc Ecosystem for Medical Imaging.

    PubMed

    Jodogne, Sébastien

    2018-05-03

    This paper reviews the components of Orthanc, a free and open-source, highly versatile ecosystem for medical imaging. At the core of the Orthanc ecosystem, the Orthanc server is a lightweight vendor neutral archive that provides PACS managers with a powerful environment to automate and optimize the imaging flows that are very specific to each hospital. The Orthanc server can be extended with plugins that provide solutions for teleradiology, digital pathology, or enterprise-ready databases. It is shown how software developers and research engineers can easily develop external software or Web portals dealing with medical images, with minimal knowledge of the DICOM standard, thanks to the advanced programming interface of the Orthanc server. The paper concludes by introducing the Stone of Orthanc, an innovative toolkit for the cross-platform rendering of medical images.
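
    Orthanc's REST API makes the kind of external scripting described above straightforward; the sketch below assumes a local Orthanc server on its default port 8042 with authentication disabled, and lists stored studies before downloading one as a ZIP archive:

```python
# Minimal sketch against a local Orthanc server (default port, no authentication).
import requests

base = "http://localhost:8042"
studies = requests.get(f"{base}/studies").json()
print(len(studies), "studies in the archive")

if studies:
    study_id = studies[0]
    info = requests.get(f"{base}/studies/{study_id}").json()
    print(info["MainDicomTags"].get("StudyDescription", "(no description)"))

    # Download the whole study as a ZIP archive of DICOM files.
    with open("study.zip", "wb") as f:
        f.write(requests.get(f"{base}/studies/{study_id}/archive").content)
```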

  14. Health Equity Assessment Toolkit (HEAT): software for exploring and comparing health inequalities in countries.

    PubMed

    Hosseinpoor, Ahmad Reza; Nambiar, Devaki; Schlotheuber, Anne; Reidpath, Daniel; Ross, Zev

    2016-10-19

    It is widely recognised that the pursuit of sustainable development cannot be accomplished without addressing inequality, or observed differences between subgroups of a population. Monitoring health inequalities allows for the identification of health topics where major group differences exist, dimensions of inequality that must be prioritised to effect improvements in multiple health domains, and also population subgroups that are multiply disadvantaged. While availability of data to monitor health inequalities is gradually improving, there is a commensurate need to increase, within countries, the technical capacity for analysis of these data and interpretation of results for decision-making. Prior efforts to build capacity have yielded demand for a toolkit with the computational ability to display disaggregated data and summary measures of inequality in an interactive and customisable fashion that would facilitate interpretation and reporting of health inequality in a given country. To answer this demand, the Health Equity Assessment Toolkit (HEAT), was developed between 2014 and 2016. The software, which contains the World Health Organization's Health Equity Monitor database, allows the assessment of inequalities within a country using over 30 reproductive, maternal, newborn and child health indicators and five dimensions of inequality (economic status, education, place of residence, subnational region and child's sex, where applicable). HEAT was beta-tested in 2015 as part of ongoing capacity building workshops on health inequality monitoring. This is the first and only application of its kind; further developments are proposed to introduce an upload data feature, translate it into different languages and increase interactivity of the software. This article will present the main features and functionalities of HEAT and discuss its relevance and use for health inequality monitoring.

  15. Contrast Analysis for Side-Looking Sonar

    DTIC Science & Technology

    2013-09-30

    bound for shadow depth that can be used to validate modeling tools such as SWAT (Shallow Water Acoustics Toolkit). • Adaptive Postprocessing: Tune image...

  16. A fast - Monte Carlo toolkit on GPU for treatment plan dose recalculation in proton therapy

    NASA Astrophysics Data System (ADS)

    Senzacqua, M.; Schiavi, A.; Patera, V.; Pioli, S.; Battistoni, G.; Ciocca, M.; Mairani, A.; Magro, G.; Molinelli, S.

    2017-10-01

    In the context of particle therapy, a crucial role is played by Treatment Planning Systems (TPSs), tools aimed at computing and optimizing the treatment plan. Nowadays, one of the major issues related to TPSs in particle therapy is the large CPU time needed. We developed a software toolkit (FRED) for reducing dose recalculation time by exploiting Graphics Processing Unit (GPU) hardware. Thanks to their high parallelization capability, GPUs significantly reduce the computation time, by up to a factor of 100 with respect to software running on a standard CPU. The transport of proton beams in the patient is accurately described through Monte Carlo methods. The physical processes reproduced are Multiple Coulomb Scattering, energy straggling and nuclear interactions of protons with the main nuclei composing biological tissues. The FRED toolkit does not rely on the water-equivalent translation of tissues, but exploits the Computed Tomography anatomical information by reconstructing and simulating the atomic composition of each crossed tissue. FRED can be used as an efficient tool for dose recalculation on the day of the treatment: in about one minute on standard hardware it can provide the dose map obtained by combining the treatment plan, computed earlier by the TPS, with the current patient anatomic arrangement.

  17. Backtracking behaviour in lost ants: an additional strategy in their navigational toolkit

    PubMed Central

    Wystrach, Antoine; Schwarz, Sebastian; Baniel, Alice; Cheng, Ken

    2013-01-01

    Ants use multiple sources of information to navigate, but do not integrate all this information into a unified representation of the world. Rather, the available information appears to serve three distinct main navigational systems: path integration, systematic search and the use of learnt information—mainly via vision. Here, we report on an additional behaviour that suggests a supplemental system in the ant's navigational toolkit: ‘backtracking’. Homing ants, having almost reached their nest but suddenly displaced to unfamiliar areas, did not show the characteristic undirected headings of systematic searches. Instead, these ants backtracked in the compass direction opposite to the path that they had just travelled. The ecological function of this behaviour is clear, as we show it increases the chances of returning to familiar terrain. Importantly, the mechanistic implications of this behaviour stress an extra level of cognitive complexity in ant navigation. Our results imply: (i) the presence of a type of ‘memory of the current trip’ allowing lost ants to take into account the familiar view recently experienced, and (ii) direct sharing of information across different navigational systems. We propose a revised architecture of the ant's navigational toolkit illustrating how the different systems may interact to produce adaptive behaviours. PMID:23966644

  18. TOOLKIT, Version 2. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, E.; Bagot, B.; McNeill, R.L.

    1990-05-09

    The purpose of this User's Guide is to show by example many of the features of Toolkit II. Some examples will be copies of screens as they appear while running the Toolkit. Other examples will show what the user should enter in various situations; in these instances, what the computer asserts will be in boldface and what the user responds will be in regular type. The User's Guide is divided into four sections. The first section, "FOCUS Databases", will give a broad overview of the Focus administrative databases that are available on the VAX; easy-to-use reports are available for most of them in the Toolkit. The second section, "Getting Started", will cover the steps necessary to log onto the Computer Center VAX cluster and how to start Focus and the Toolkit. The third section, "Using the Toolkit", will discuss some of the features in the Toolkit -- the available reports and how to access them, as well as some utilities. The fourth section, "Helpful Hints", will cover some useful facts about the VAX and Focus as well as some of the more common problems that can occur. The Toolkit is not set in concrete but is continually being revised and improved. If you have any opinions as to changes that you would like to see made to the Toolkit or new features that you would like included, please let us know. Since we do try to respond to the needs of the user and make periodic improvements to the Toolkit, this User's Guide may not correspond exactly to what is available in the computer. In general, changes are made to provide new options or features; rarely is an existing feature deleted.

  19. An economic toolkit for identifying the cost of emergency medical services (EMS) systems: detailed methodology of the EMS Cost Analysis Project (EMSCAP).

    PubMed

    Lerner, E Brooke; Garrison, Herbert G; Nichol, Graham; Maio, Ronald F; Lookman, Hunaid A; Sheahan, William D; Franz, Timothy R; Austad, James D; Ginster, Aaron M; Spaite, Daniel W

    2012-02-01

    Calculating the cost of an emergency medical services (EMS) system using a standardized method is important for determining the value of EMS. This article describes the development of a methodology for calculating the cost of an EMS system to its community. This includes a tool for calculating the cost of EMS (the "cost workbook") and detailed directions for determining cost (the "cost guide"). The 12-step process that was developed is consistent with current theories of health economics, applicable to prehospital care, flexible enough to be used in varying sizes and types of EMS systems, and comprehensive enough to provide meaningful conclusions. It was developed by an expert panel (the EMS Cost Analysis Project [EMSCAP] investigator team) in an iterative process that included pilot testing the process in three diverse communities. The iterative process allowed ongoing modification of the toolkit during the development phase, based upon direct, practical, ongoing interaction with the EMS systems that were using the toolkit. The resulting methodology estimates EMS system costs within a user-defined community, allowing either the number of patients treated or the estimated number of lives saved by EMS to be assessed in light of the cost of those efforts. Much controversy exists about the cost of EMS and whether the resources spent for this purpose are justified. However, the existence of a validated toolkit that provides a standardized process will allow meaningful assessments and comparisons to be made and will supply objective information to inform EMS and community officials who are tasked with determining the utilization of scarce societal resources. © 2012 by the Society for Academic Emergency Medicine.

  20. A toolkit for determining historical eco-hydrological interactions

    NASA Astrophysics Data System (ADS)

    Singer, M. B.; Sargeant, C. I.; Evans, C. M.; Vallet-Coulomb, C.

    2016-12-01

    Contemporary climate change is predicted to result in perturbations to hydroclimatic regimes across the globe, with some regions forecast to become warmer and drier. Given that water is a primary determinant of vegetative health and productivity, we can expect shifts in the availability of this critical resource to have significant impacts on forested ecosystems. The subject is particularly complex in environments where multiple sources of water are potentially available to vegetation and which may also exhibit spatial and temporal variability. To anticipate how subsurface hydrological partitioning may evolve in the future and impact overlying vegetation, we require well constrained, historical data and a modelling framework for assessing the dynamics of subsurface hydrology. We outline a toolkit to retrospectively investigate dynamic water use by trees. We describe a synergistic approach, which combines isotope dendrochronology of tree ring cellulose with a biomechanical model, detailed climatic and isotopic data in endmember waters to assess the mean isotopic composition of source water used in annual tree rings. We identify the data requirements and suggest three versions of the toolkit based on data availability. We present sensitivity analyses in order to identify the key variables required to constrain model predictions and then develop empirical relationships for constraining these parameters based on climate records. We demonstrate our methodology within a Mediterranean riparian forest site and show how it can be used along with subsurface hydrological modelling to validate source water determinations, which are fundamental to understanding climatic fluctuations and trends in subsurface hydrology. We suggest that the utility of our toolkit is applicable in riparian zones and in a range of forest environments where distinct isotopic endmembers are present.
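
    The core quantitative step described above, inferring the mix of source waters from the isotopic composition recorded in tree-ring cellulose, ultimately rests on endmember mixing. The snippet below is an illustrative sketch of a standard two-endmember mixing calculation, not the authors' biomechanical model; all delta values are invented.

```python
# Illustrative sketch only: standard two-endmember isotope mixing of the kind
# used to apportion tree source water. Delta values (e.g. delta-18O in permil)
# for the sample and the two endmember waters are hypothetical numbers.

def endmember_fraction(delta_sample: float, delta_a: float, delta_b: float) -> float:
    """Fraction of endmember A in a two-source mixture, from delta values."""
    return (delta_sample - delta_b) / (delta_a - delta_b)

# Example: tree-ring source water at -8.5 permil, groundwater at -11.0 permil,
# stream water at -6.0 permil -> fraction of groundwater in the mixture.
f_groundwater = endmember_fraction(-8.5, delta_a=-11.0, delta_b=-6.0)
print(f"groundwater fraction ~ {f_groundwater:.2f}")  # ~ 0.50
```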

  1. Local Foods, Local Places Toolkit

    EPA Pesticide Factsheets

    Toolkit to help communities that want to use local foods to spur revitalization. The toolkit gives step-by-step instructions to help communities plan and host a workshop and create an action plan to implement.

  2. Provider perceptions of an integrated primary care quality improvement strategy: The PPAQ toolkit.

    PubMed

    Beehler, Gregory P; Lilienthal, Kaitlin R

    2017-02-01

    The Primary Care Behavioral Health (PCBH) model of integrated primary care is challenging to implement with high fidelity. The Primary Care Behavioral Health Provider Adherence Questionnaire (PPAQ) was designed to assess provider adherence to essential model components and has recently been adapted into a quality improvement toolkit. The aim of this pilot project was to gather preliminary feedback on providers' perceptions of the acceptability and utility of the PPAQ toolkit for making beneficial practice changes. Twelve mental health providers working in Department of Veterans Affairs integrated primary care clinics participated in semistructured interviews to gather quantitative and qualitative data. Descriptive statistics and qualitative content analysis were used to analyze data. Providers identified several positive features of the PPAQ toolkit organization and structure that resulted in high ratings of acceptability, while also identifying several toolkit components in need of modification to improve usability. Toolkit content was considered highly representative of the PCBH model and therefore could be used as a diagnostic self-assessment of model adherence. The toolkit was considered to be high in applicability to providers regardless of their degree of prior professional preparation or current clinical setting. Additionally, providers identified several system-level contextual factors that could impact the usefulness of the toolkit. These findings suggest that frontline mental health providers working in PCBH settings may be receptive to using an adherence-focused toolkit for ongoing quality improvement. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Web based tools for visualizing imaging data and development of XNATView, a zero footprint image viewer

    PubMed Central

    Gutman, David A.; Dunn, William D.; Cobb, Jake; Stoner, Richard M.; Kalpathy-Cramer, Jayashree; Erickson, Bradley

    2014-01-01

    Advances in web technologies now allow direct visualization of imaging data sets without necessitating the download of large file sets or the installation of software. This allows centralization of file storage and facilitates image review and analysis. XNATView is a light framework recently developed in our lab to visualize DICOM images stored in The Extensible Neuroimaging Archive Toolkit (XNAT). It consists of a PyXNAT-based framework to wrap around the REST application programming interface (API) and query the data in XNAT. XNATView was developed to simplify quality assurance, help organize imaging data, and facilitate data sharing for intra- and inter-laboratory collaborations. Its zero-footprint design allows the user to connect to XNAT from a web browser, navigate through projects, experiments, and subjects, and view DICOM images with accompanying metadata all within a single viewing instance. PMID:24904399
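
    Since XNATView is described as a PyXNAT-based wrapper around XNAT's REST API, a minimal sketch of that underlying access pattern may help; the server address, credentials, and project traversal below are placeholders, not details from the paper.

```python
# Hedged sketch of the PyXNAT-style access that XNATView builds on.
# Server URL, credentials and the traversal shown are illustrative only.
from pyxnat import Interface

central = Interface(server="https://xnat.example.org",
                    user="demo", password="demo")

# Walk projects -> subjects through the REST API wrapper.
for project_id in central.select.projects().get():
    subjects = central.select.project(project_id).subjects().get()
    print(project_id, len(subjects), "subjects")
```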

  4. DICOM image quantification secondary capture (DICOM IQSC) integrated with numeric results, regions, and curves: implementation and applications in nuclear medicine

    NASA Astrophysics Data System (ADS)

    Cao, Xinhua; Xu, Xiaoyin; Voss, Stephan

    2017-03-01

    In this paper, we describe an enhanced DICOM Secondary Capture (SC) that integrates Image Quantification (IQ) results, Regions of Interest (ROIs), and Time Activity Curves (TACs) with screen shots by embedding extra medical imaging information into a standard DICOM header. A software toolkit for DICOM IQSC has been developed to implement the SC-centered information integration of quantitative analysis for routine practice of nuclear medicine. Preliminary experiments show that the DICOM IQSC method is simple and easy to implement, seamlessly integrating post-processing workstations with PACS for archiving and retrieving IQ information. Additional DICOM IQSC applications in routine nuclear medicine and clinical research are also discussed.
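
    The key idea above is embedding quantification results into the header of a DICOM secondary capture. As an illustration only (not the authors' toolkit), the pydicom sketch below stores a small JSON payload of ROI statistics in an assumed private block; the group number, creator string, file names and values are hypothetical.

```python
# Illustrative sketch: writing quantification results into private DICOM
# elements of a secondary-capture object with pydicom. All names and numbers
# below are assumptions, not values from the paper.
import json
import pydicom

ds = pydicom.dcmread("screenshot_sc.dcm")  # hypothetical secondary capture

block = ds.private_block(0x0071, "IQSC_DEMO", create=True)   # assumed private group/creator
payload = {"roi": "liver", "suv_mean": 2.4, "suv_max": 5.1}  # hypothetical ROI statistics
block.add_new(0x01, "LT", json.dumps(payload))               # store as a Long Text element

ds.save_as("screenshot_sc_iq.dcm")
```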

  5. Green Infrastructure Modeling Toolkit

    EPA Pesticide Factsheets

    EPA's Green Infrastructure Modeling Toolkit is a toolkit of 5 EPA green infrastructure models and tools, along with communication materials, that can be used as a teaching tool and a quick reference resource when making GI implementation decisions.

  6. An integrative data analysis platform for gene set analysis and knowledge discovery in a data warehouse framework.

    PubMed

    Chen, Yi-An; Tripathi, Lokesh P; Mizuguchi, Kenji

    2016-01-01

    Data analysis is one of the most critical and challenging steps in drug discovery and disease biology. A user-friendly resource to visualize and analyse high-throughput data provides a powerful medium for both experimental and computational biologists to understand vastly different biological data types and obtain a concise, simplified and meaningful output for better knowledge discovery. We have previously developed TargetMine, an integrated data warehouse optimized for target prioritization. Here we describe how upgraded and newly modelled data types in TargetMine can now survey the wider biological and chemical data space, relevant to drug discovery and development. To enhance the scope of TargetMine from target prioritization to broad-based knowledge discovery, we have also developed a new auxiliary toolkit to assist with data analysis and visualization in TargetMine. This toolkit features interactive data analysis tools to query and analyse the biological data compiled within the TargetMine data warehouse. The enhanced system enables users to discover new hypotheses interactively by performing complicated searches with no programming and obtaining the results in an easy to comprehend output format. Database URL: http://targetmine.mizuguchilab.org. © The Author(s) 2016. Published by Oxford University Press.

  7. Oak Ridge Bio-surveillance Toolkit (ORBiT): Integrating Big-Data Analytics with Visual Analysis for Public Health Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramanathan, Arvind; Pullum, Laura L; Steed, Chad A

    In this position paper, we describe the design and implementation of the Oak Ridge Bio-surveillance Toolkit (ORBiT): a collection of novel statistical and machine learning tools implemented for (1) integrating heterogeneous traditional (e.g. emergency room visits, prescription sales data, etc.) and non-traditional (social media such as Twitter and Instagram) data sources, (2) analyzing large-scale datasets and (3) presenting the results from the analytics as a visual interface for the end-user to interact and provide feedback. We present examples of how ORBiT can be used to summarize extremely large-scale datasets effectively and how user interactions can translate into the data analytics process for bio-surveillance. We also present a strategy to estimate parameters relevant to disease spread models from near real-time data feeds and show how these estimates can be integrated with disease spread models for large-scale populations. We conclude with a perspective on how integrating data and visual analytics could lead to better forecasting and prediction of disease spread as well as improved awareness of disease-susceptible regions.

  8. An integrative data analysis platform for gene set analysis and knowledge discovery in a data warehouse framework

    PubMed Central

    Chen, Yi-An; Tripathi, Lokesh P.; Mizuguchi, Kenji

    2016-01-01

    Data analysis is one of the most critical and challenging steps in drug discovery and disease biology. A user-friendly resource to visualize and analyse high-throughput data provides a powerful medium for both experimental and computational biologists to understand vastly different biological data types and obtain a concise, simplified and meaningful output for better knowledge discovery. We have previously developed TargetMine, an integrated data warehouse optimized for target prioritization. Here we describe how upgraded and newly modelled data types in TargetMine can now survey the wider biological and chemical data space, relevant to drug discovery and development. To enhance the scope of TargetMine from target prioritization to broad-based knowledge discovery, we have also developed a new auxiliary toolkit to assist with data analysis and visualization in TargetMine. This toolkit features interactive data analysis tools to query and analyse the biological data compiled within the TargetMine data warehouse. The enhanced system enables users to discover new hypotheses interactively by performing complicated searches with no programming and obtaining the results in an easy to comprehend output format. Database URL: http://targetmine.mizuguchilab.org PMID:26989145

  9. College Women's Health

    MedlinePlus

    College Women's Social Media Toolkit: use the Social Media Toolkit to share health tips with your campus. The toolkit includes resources for young women, including sample social media messages, flyers and blog posts.

  10. Every Place Counts Leadership Academy transportation toolkit

    DOT National Transportation Integrated Search

    2016-12-01

    The Transportation Toolkit is meant to explain the transportation process to members of the public with no prior knowledge of transportation. The Toolkit is meant to demystify transportation and help people engage in local transportation decision-mak...

  11. Business intelligence from social media: a study from the VAST Box Office Challenge.

    PubMed

    Lu, Yafeng; Wang, Feng; Maciejewski, Ross

    2014-01-01

    With over 16 million tweets per hour, 600 new blog posts per minute, and 400 million active users on Facebook, businesses have begun searching for ways to turn real-time consumer-based posts into actionable intelligence. The goal is to extract information from this noisy, unstructured data and use it for trend analysis and prediction. Current practices support the idea that visual analytics (VA) can help enable the effective analysis of such data. However, empirical evidence demonstrating the effectiveness of a VA solution is still lacking. A proposed VA toolkit extracts data from Bitly and Twitter to predict movie revenue and ratings. Results from the 2013 VAST Box Office Challenge demonstrate the benefit of an interactive environment for predictive analysis, compared to a purely statistical modeling approach. The VA approach used by the toolkit is generalizable to other domains involving social media data, such as sales forecasting and advertisement analysis.

  12. Integrating surgical robots into the next medical toolkit.

    PubMed

    Lai, Fuji; Entin, Eileen

    2006-01-01

    Surgical robots hold much promise for revolutionizing the field of surgery and improving surgical care. However, despite the potential advantages they offer, there are multiple barriers to adoption and integration into practice that may prevent these systems from realizing their full potential benefit. This study elucidated some of the most salient considerations that need to be addressed for integration of new technologies such as robotic systems into the operating room of the future as it evolves into a complex system of systems. We conducted in-depth interviews with operating room team members and other stakeholders to identify potential barriers in areas of workflow, teamwork, training, clinical acceptance, and human-system interaction. The findings of this study will inform an approach for the design and integration of robotics and related computer-assisted technologies into the next medical toolkit for "computer-enhanced surgery" to improve patient safety and healthcare quality.

  13. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.

  14. Children and youth with disabilities: innovative methods for single qualitative interviews.

    PubMed

    Teachman, Gail; Gibson, Barbara E

    2013-02-01

    There is a paucity of explicit literature outlining methods for single-interview studies with children, and almost none have focused on engaging children with disabilities. Drawing from a pilot study, we address these gaps by describing innovative techniques, strategies, and methods for engaging children and youth with disabilities in a single qualitative interview. In the study, we explored the beliefs, assumptions, and experiences of children and youth with cerebral palsy and their parents regarding the importance of walking. We describe three key aspects of our child-interview methodological approach: collaboration with parents, a toolkit of customizable interview techniques, and strategies to consider the power differential inherent in child-researcher interactions. Examples from our research illustrate what worked well and what was less successful. Researchers can optimize single interviews with children with disabilities by collaborating with family members and by preparing a toolkit of customizable interview techniques.

  15. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  16. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit

    PubMed Central

    O'Boyle, Noel M; Morley, Chris; Hutchison, Geoffrey R

    2008-01-01

    Background Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Results Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Conclusion Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers. PMID:18328109

  17. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit.

    PubMed

    O'Boyle, Noel M; Morley, Chris; Hutchison, Geoffrey R

    2008-03-09

    Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers.
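
    A minimal sketch of the usage pattern described in the two Pybel records above: iterating over a molecular file, computing a fingerprint, and writing SMILES. The input file name is a placeholder; with OpenBabel 3.x the import becomes `from openbabel import pybel`.

```python
# Minimal Pybel sketch; "molecules.sdf" is a placeholder file name.
import pybel  # OpenBabel 3.x: from openbabel import pybel

for mol in pybel.readfile("sdf", "molecules.sdf"):   # iterate lazily over an SD file
    fp = mol.calcfp()                                # default path-based fingerprint
    smiles = mol.write("smi").split()[0]             # write() returns "SMILES<tab>title\n"
    print(mol.title, round(mol.molwt, 2), len(fp.bits), smiles)
```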

  18. Capturing Petascale Application Characteristics with the Sequoia Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vetter, Jeffrey S; Bhatia, Nikhil; Grobelny, Eric M

    2005-01-01

    Characterization of the computation, communication, memory, and I/O demands of current scientific applications is crucial for identifying which technologies will enable petascale scientific computing. In this paper, we present the Sequoia Toolkit for characterizing HPC applications. The Sequoia Toolkit consists of the Sequoia trace capture library and the Sequoia Event Analysis Library, or SEAL, that facilitates the development of tools for analyzing Sequoia event traces. Using the Sequoia Toolkit, we have characterized the behavior of application runs with up to 2048 application processes. To illustrate the use of the Sequoia Toolkit, we present a preliminary characterization of LAMMPS, a molecular dynamics application of great interest to the computational biology community.

  19. A graphical user interface (GUI) toolkit for the calculation of three-dimensional (3D) multi-phase biological effective dose (BED) distributions including statistical analyses.

    PubMed

    Kauweloa, Kevin I; Gutierrez, Alonso N; Stathakis, Sotirios; Papanikolaou, Niko; Mavroidis, Panayiotis

    2016-07-01

    A toolkit has been developed for calculating the 3-dimensional biological effective dose (BED) distributions in multi-phase, external beam radiotherapy treatments such as those applied in liver stereotactic body radiation therapy (SBRT) and in multi-prescription treatments. This toolkit also provides a wide range of statistical results related to dose and BED distributions. MATLAB 2010a, version 7.10 was used to create this GUI toolkit. The input data consist of the dose distribution matrices, organ contour coordinates, and treatment planning parameters from the treatment planning system (TPS). The toolkit has the capability of calculating the multi-phase BED distributions using different formulas (denoted as true and approximate). Following the calculations of the BED distributions, the dose and BED distributions can be viewed in different projections (e.g. coronal, sagittal and transverse). The different elements of this toolkit are presented and the important steps for the execution of its calculations are illustrated. The toolkit is applied on brain, head & neck and prostate cancer patients, who received primary and boost phases in order to demonstrate its capability in calculating BED distributions, as well as measuring the inaccuracy and imprecision of the approximate BED distributions. Finally, the clinical situations in which the use of the present toolkit would have a significant clinical impact are indicated. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
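
    For readers unfamiliar with the quantity being mapped, the linear-quadratic biologically effective dose that such toolkits compute per voxel is BED = n·d·(1 + d/(α/β)), summed over treatment phases. The sketch below works a small example; the fractionation numbers and α/β value are illustrative, not taken from the paper.

```python
# Worked sketch of the standard linear-quadratic BED formula:
# BED = n * d * (1 + d / (alpha/beta)); multi-phase BED is the per-phase sum.
# The fractionation scheme and alpha/beta below are illustrative only.

def bed(n_fractions: int, dose_per_fraction: float, alpha_beta: float) -> float:
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

# Primary phase (25 x 2 Gy) plus boost (5 x 2 Gy) for a tissue with alpha/beta = 3 Gy.
total_bed = bed(25, 2.0, 3.0) + bed(5, 2.0, 3.0)
print(f"total BED ~ {total_bed:.1f} Gy")   # 83.3 + 16.7 = 100.0 Gy
```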

  20. The Comprehension Toolkit

    ERIC Educational Resources Information Center

    Harvey, Stephanie; Goudvis, Anne

    2005-01-01

    "The Comprehension Toolkit" focuses on reading, writing, talking, listening, and investigating, to deepen understanding of nonfiction texts. With a focus on strategic thinking, this toolkit's lessons provide a foundation for developing independent readers and learners. It also provides an alternative to the traditional assign and correct…

  1. ANTS — a simulation package for secondary scintillation Anger-camera type detector in thermal neutron imaging

    NASA Astrophysics Data System (ADS)

    Morozov, A.; Defendi, I.; Engels, R.; Fraga, F. A. F.; Fraga, M. M. F. R.; Guerard, B.; Jurkovic, M.; Kemmerling, G.; Manzin, G.; Margato, L. M. S.; Niko, H.; Pereira, L.; Petrillo, C.; Peyaud, A.; Piscitelli, F.; Raspino, D.; Rhodes, N. J.; Sacchetti, F.; Schooneveld, E. M.; Van Esch, P.; Zeitelhack, K.

    2012-08-01

    A custom and fully interactive simulation package ANTS (Anger-camera type Neutron detector: Toolkit for Simulations) has been developed to optimize the design and operation conditions of secondary scintillation Anger-camera type gaseous detectors for thermal neutron imaging. The simulation code accounts for all physical processes related to the neutron capture, energy deposition pattern, drift of electrons of the primary ionization and secondary scintillation. The photons are traced considering the wavelength-resolved refraction and transmission of the output window. Photo-detection accounts for the wavelength-resolved quantum efficiency, angular response, area sensitivity, gain and single-photoelectron spectra of the photomultipliers (PMTs). The package allows for several geometrical shapes of the PMT photocathode (round, hexagonal and square) and offers a flexible PMT array configuration: up to 100 PMTs in a custom arrangement with the square or hexagonal packing. Several read-out patterns of the PMT array are implemented. Reconstruction of the neutron capture position (projection on the plane of the light emission) is performed using the center of gravity, maximum likelihood or weighted least squares algorithm. Simulation results reproduce well the preliminary results obtained with a small-scale detector prototype. ANTS executables can be downloaded from http://coimbra.lip.pt/~andrei/.
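
    Of the reconstruction options listed above, the centre-of-gravity estimate is the simplest: the capture position is the signal-weighted mean of the PMT positions. The sketch below illustrates that calculation with invented PMT coordinates and amplitudes; it is not ANTS code.

```python
# Illustrative centre-of-gravity position reconstruction from per-PMT signals.
# PMT coordinates (mm) and signal amplitudes are made-up numbers.
import numpy as np

pmt_xy = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 30.0], [30.0, 30.0]])  # PMT centres
signals = np.array([120.0, 80.0, 60.0, 40.0])                            # recorded amplitudes

# Weighted mean of PMT positions, weights = recorded signal per PMT.
position = (signals[:, None] * pmt_xy).sum(axis=0) / signals.sum()
print(f"estimated capture position: x = {position[0]:.1f} mm, y = {position[1]:.1f} mm")
```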

  2. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levine, Aaron L

    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  3. Student Success Center Toolkit

    ERIC Educational Resources Information Center

    Jobs For the Future, 2014

    2014-01-01

    "Student Success Center Toolkit" is a compilation of materials organized to assist Student Success Center directors as they staff, launch, operate, and sustain Centers. The toolkit features materials created and used by existing Centers, such as staffing and budgeting templates, launch materials, sample meeting agendas, and fundraising…

  4. Veterinary Immunology Committee Toolkit Workshop 2010: Progress and plans

    USDA-ARS?s Scientific Manuscript database

    The Third Veterinary Immunology Committee (VIC) Toolkit Workshop took place at the Ninth International Veterinary Immunology Symposium (IVIS) in Tokyo, Japan on August 18, 2010. The Workshop built on previous Toolkit Workshops and covered various aspects of reagent development, commercialisation an...

  5. 77 FR 73023 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    DEPARTMENT OF COMMERCE, International Trade Administration: U.S. Environmental Solutions Toolkit... foreign end-users of environmental technologies that will outline U.S. approaches to a series of environmental problems and highlight participating U.S. vendors of relevant U.S. technologies. The Toolkit will...

  6. WRIST: A WRist Image Segmentation Toolkit for carpal bone delineation from MRI.

    PubMed

    Foster, Brent; Joshi, Anand A; Borgese, Marissa; Abdelhafez, Yasser; Boutin, Robert D; Chaudhari, Abhijit J

    2018-01-01

    Segmentation of the carpal bones from 3D imaging modalities, such as magnetic resonance imaging (MRI), is commonly performed for in vivo analysis of wrist morphology, kinematics, and biomechanics. This crucial task is typically carried out manually and is labor intensive, time consuming, subject to high inter- and intra-observer variability, and may result in topologically incorrect surfaces. We present a method, WRist Image Segmentation Toolkit (WRIST), for 3D semi-automated, rapid segmentation of the carpal bones of the wrist from MRI. In our method, the boundaries of the bones were iteratively found using prior known anatomical constraints and a shape-detection level set. The parameters of the method were optimized using a training dataset of 48 manually segmented carpal bones and evaluated on 112 carpal bones, which included both healthy participants without known wrist conditions and participants with thumb basilar osteoarthritis (OA). Manual segmentation by two expert human observers was considered as a reference. On the healthy subject dataset we obtained a Dice overlap of 93.0 ± 3.8, Jaccard Index of 87.3 ± 6.2, and a Hausdorff distance of 2.7 ± 3.4 mm, while on the OA dataset we obtained a Dice overlap of 90.7 ± 8.6, Jaccard Index of 83.0 ± 10.6, and a Hausdorff distance of 4.0 ± 4.4 mm. The short computational time of 20.8 s per bone (or 5.1 s per bone in the parallelized version) and the high agreement with the expert observers give WRIST the potential to be utilized in musculoskeletal research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Development of an educational 'toolkit' for health professionals and their patients with prediabetes: the WAKEUP study (Ways of Addressing Knowledge Education and Understanding in Pre-diabetes).

    PubMed

    Evans, P H; Greaves, C; Winder, R; Fearn-Smith, J; Campbell, J L

    2007-07-01

    To identify key messages about pre-diabetes and to design, develop and pilot an educational toolkit to address the information needs of patients and health professionals. Mixed qualitative methodology within an action research framework. Focus group interviews with patients and health professionals and discussion with an expert reference group aimed to identify the important messages and produce a draft toolkit. Two action research cycles were then conducted in two general practices, during which the draft toolkit was used and video-taped consultations and follow-up patient interviews provided further data. Framework analysis techniques were used to examine the data and to elicit action points for improving the toolkit. The key messages about pre-diabetes concerned the seriousness of the condition, the preventability of progression to diabetes, and the need for lifestyle change. As well as feedback on the acceptability and use of the toolkit, four main themes were identified in the data: knowledge and education needs (of both patients and health professionals); communicating knowledge and motivating change; redesign of practice systems to support pre-diabetes management and the role of the health professional. The toolkit we developed was found to be an acceptable and useful resource for both patients and health practitioners. Three key messages about pre-diabetes were identified. A toolkit of information materials for patients with pre-diabetes and the health professionals and ideas for improving practice systems for managing pre-diabetes were developed and successfully piloted. Further work is needed to establish the best mode of delivery of the WAKEUP toolkit.

  8. Toolkit of Available EPA Green Infrastructure Modeling Software. National Stormwater Calculator

    EPA Science Inventory

    This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementat...

  9. Teacher Quality Toolkit

    ERIC Educational Resources Information Center

    Lauer, Patricia A.; Dean, Ceri B.

    2004-01-01

    This Teacher Quality Toolkit aims to support the continuum of teacher learning by providing tools that institutions of higher education, districts, and schools can use to improve both preservice and inservice teacher education. The toolkit incorporates McREL?s accumulated knowledge and experience related to teacher quality and standards-based…

  10. 77 FR 73022 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    DEPARTMENT OF COMMERCE, International Trade Administration: U.S. Environmental Solutions Toolkit... Commerce continues to develop the web-based U.S. Environmental Solutions Toolkit to be used by foreign environmental officials and foreign end-users of environmental technologies that will outline U.S. approaches to...

  11. Cinfony – combining Open Source cheminformatics toolkits behind a common interface

    PubMed Central

    O'Boyle, Noel M; Hutchison, Geoffrey R

    2008-01-01

    Background Open Source cheminformatics toolkits such as OpenBabel, the CDK and the RDKit share the same core functionality but support different sets of file formats and forcefields, and calculate different fingerprints and descriptors. Despite their complementary features, using these toolkits in the same program is difficult as they are implemented in different languages (C++ versus Java), have different underlying chemical models and have different application programming interfaces (APIs). Results We describe Cinfony, a Python module that presents a common interface to all three of these toolkits, allowing the user to easily combine methods and results from any of the toolkits. In general, the run time of the Cinfony modules is almost as fast as accessing the underlying toolkits directly from C++ or Java, but Cinfony makes it much easier to carry out common tasks in cheminformatics such as reading file formats and calculating descriptors. Conclusion By providing a simplified interface and improving interoperability, Cinfony makes it easy to combine complementary features of OpenBabel, the CDK and the RDKit. PMID:19055766
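
    A hedged sketch of the cross-toolkit pattern Cinfony is built for: read a molecule with one backend, hand it to another, and use each toolkit's methods through the same interface. The module names follow the Cinfony documentation (pybel, rdk); the exact calls and the example molecule are assumptions for illustration.

```python
# Hedged Cinfony sketch: the same molecule handled by two backends through a
# common interface. Module names and methods are assumed from the Cinfony docs.
from cinfony import pybel, rdk

mol = pybel.readstring("smi", "CC(=O)Oc1ccccc1C(=O)O")  # read aspirin with the OpenBabel backend
rdkit_mol = rdk.Molecule(mol)                            # pass the same molecule to the RDKit backend

print(round(mol.molwt, 2))        # molecular weight from OpenBabel
print(len(mol.calcfp().bits))     # set bits in the default OpenBabel fingerprint
print(rdkit_mol.write("smi"))     # SMILES written back out by RDKit
```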

  12. Integrating segmentation methods from the Insight Toolkit into a visualization application.

    PubMed

    Martin, Ken; Ibáñez, Luis; Avila, Lisa; Barré, Sébastien; Kaspersen, Jon H

    2005-12-01

    The Insight Toolkit (ITK) initiative from the National Library of Medicine has provided a suite of state-of-the-art segmentation and registration algorithms ideally suited to volume visualization and analysis. A volume visualization application that effectively utilizes these algorithms provides many benefits: it allows access to ITK functionality for non-programmers, it creates a vehicle for sharing and comparing segmentation techniques, and it serves as a visual debugger for algorithm developers. This paper describes the integration of image processing functionalities provided by ITK into VolView, a visualization application for high-performance volume rendering. A free version of this visualization application is publicly available and is included with the online version of this paper. The process for developing ITK plugins for VolView according to the publicly available API is described in detail, and an application of ITK VolView plugins to the segmentation of Abdominal Aortic Aneurysms (AAAs) is presented. The source code of the ITK plugins is also publicly available and is included in the online version.

  13. Nuclear spectroscopy with Geant4. The superheavy challenge

    NASA Astrophysics Data System (ADS)

    Sarmiento, Luis G.

    2016-12-01

    The simulation toolkit Geant4 was originally developed at CERN for high-energy physics. Over the years it has become established as a Swiss army knife not only in particle physics: it has seen an accelerated expansion towards nuclear physics and, more recently, medical imaging and γ- and ion-therapy, to mention but a handful of new applications. Geant4's domain of validity is vast, spanning many particles, ions, materials, and physical processes, typically with several different models to choose from. Unfortunately, atomic nuclei with atomic number Z > 100 are not properly supported. This is likely due to the relative novelty of the field, its comparably small user base, and scarce evaluated experimental data. To circumvent this situation, different workarounds have been used over the years. In this work the simulation toolkit Geant4 will be introduced with its different components, and the effort to bring the software to the heavy and superheavy region will be described.

  14. Free DICOM de-identification tools in clinical research: functioning and safety of patient privacy.

    PubMed

    Aryanto, K Y E; Oudkerk, M; van Ooijen, P M A

    2015-12-01

    To compare non-commercial DICOM toolkits for their de-identification ability in removing a patient's personal health information (PHI) from a DICOM header. Ten DICOM toolkits were selected for de-identification tests. Tests were performed by using the system's default de-identification profile and, subsequently, the tools' best adjusted settings. We aimed to eliminate fifty elements considered to contain identifying patient information. The tools were also examined for their respective methods of customization. Only one tool was able to de-identify all required elements with the default setting. Not all of the toolkits provide a customizable de-identification profile. Six tools allowed changes by selecting the provided profiles, giving input through a graphical user interface (GUI) or configuration text file, or providing the appropriate command-line arguments. Using adjusted settings, four of those six toolkits were able to perform full de-identification. Only five tools could properly de-identify the defined DICOM elements, and in four cases, only after careful customization. Therefore, free DICOM toolkits should be used with extreme care to prevent the risk of disclosing PHI, especially when using the default configuration. In case optimal security is required, one of the five toolkits is proposed. • Free DICOM toolkits should be carefully used to prevent patient identity disclosure. • Each DICOM tool produces its own specific outcomes from the de-identification process. • In case optimal security is required, using one DICOM toolkit is proposed.
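
    The study above evaluates ten existing toolkits rather than prescribing code, but the kind of header edit being tested is easy to picture. The pydicom sketch below is purely illustrative: it blanks a small assumed subset of identifying elements and strips private tags; it is not one of the evaluated tools and does not cover the fifty elements from the study.

```python
# Illustrative DICOM header de-identification sketch with pydicom.
# The element list is a small assumed subset of identifying attributes.
import pydicom

ds = pydicom.dcmread("input.dcm")

for keyword in ("PatientName", "PatientID", "PatientBirthDate", "PatientAddress"):
    if keyword in ds:
        ds.data_element(keyword).value = ""   # blank identifying values

ds.remove_private_tags()                      # private tags often carry PHI too
ds.save_as("deidentified.dcm")
```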

  15. Bio::Phylo - phyloinformatic analysis using Perl.

    PubMed

    Vos, Rutger A; Caravas, Jason; Hartmann, Klaas; Jensen, Mark A; Miller, Chase

    2011-02-27

    Phyloinformatic analyses involve large amounts of data and metadata of complex structure. Collecting, processing, analyzing, visualizing and summarizing these data and metadata should be done in steps that can be automated and reproduced. This requires flexible, modular toolkits that can represent, manipulate and persist phylogenetic data and metadata as objects with programmable interfaces. This paper presents Bio::Phylo, a Perl5 toolkit for phyloinformatic analysis. It implements classes and methods that are compatible with the well-known BioPerl toolkit, but is independent from it (making it easy to install) and features a richer API and a data model that is better able to manage the complex relationships between different fundamental data and metadata objects in phylogenetics. It supports commonly used file formats for phylogenetic data including the novel NeXML standard, which allows rich annotations of phylogenetic data to be stored and shared. Bio::Phylo can interact with BioPerl, thereby giving access to the file formats that BioPerl supports. Many methods for data simulation, transformation and manipulation, the analysis of tree shape, and tree visualization are provided. Bio::Phylo is composed of 59 richly documented Perl5 modules. It has been deployed successfully on a variety of computer architectures (including various Linux distributions, Mac OS X versions, Windows, Cygwin and UNIX-like systems). It is available as open source (GPL) software from http://search.cpan.org/dist/Bio-Phylo.

  16. Bio::Phylo - phyloinformatic analysis using Perl

    PubMed Central

    2011-01-01

    Background Phyloinformatic analyses involve large amounts of data and metadata of complex structure. Collecting, processing, analyzing, visualizing and summarizing these data and metadata should be done in steps that can be automated and reproduced. This requires flexible, modular toolkits that can represent, manipulate and persist phylogenetic data and metadata as objects with programmable interfaces. Results This paper presents Bio::Phylo, a Perl5 toolkit for phyloinformatic analysis. It implements classes and methods that are compatible with the well-known BioPerl toolkit, but is independent from it (making it easy to install) and features a richer API and a data model that is better able to manage the complex relationships between different fundamental data and metadata objects in phylogenetics. It supports commonly used file formats for phylogenetic data including the novel NeXML standard, which allows rich annotations of phylogenetic data to be stored and shared. Bio::Phylo can interact with BioPerl, thereby giving access to the file formats that BioPerl supports. Many methods for data simulation, transformation and manipulation, the analysis of tree shape, and tree visualization are provided. Conclusions Bio::Phylo is composed of 59 richly documented Perl5 modules. It has been deployed successfully on a variety of computer architectures (including various Linux distributions, Mac OS X versions, Windows, Cygwin and UNIX-like systems). It is available as open source (GPL) software from http://search.cpan.org/dist/Bio-Phylo PMID:21352572

  17. Exposing Exposure: Automated Anatomy-specific CT Radiation Exposure Extraction for Quality Assurance and Radiation Monitoring

    PubMed Central

    Warden, Graham I.; Farkas, Cameron E.; Ikuta, Ichiro; Prevedello, Luciano M.; Andriole, Katherine P.; Khorasani, Ramin

    2012-01-01

    Purpose: To develop and validate an informatics toolkit that extracts anatomy-specific computed tomography (CT) radiation exposure metrics (volume CT dose index and dose-length product) from existing digital image archives through optical character recognition of CT dose report screen captures (dose screens) combined with Digital Imaging and Communications in Medicine attributes. Materials and Methods: This institutional review board–approved HIPAA-compliant study was performed in a large urban health care delivery network. Data were drawn from a random sample of CT encounters that occurred between 2000 and 2010; images from these encounters were contained within the enterprise image archive, which encompassed images obtained at an adult academic tertiary referral hospital and its affiliated sites, including a cancer center, a community hospital, and outpatient imaging centers, as well as images imported from other facilities. Software was validated by using 150 randomly selected encounters for each major CT scanner manufacturer, with outcome measures of dose screen retrieval rate (proportion of correctly located dose screens) and anatomic assignment precision (proportion of extracted exposure data with correctly assigned anatomic region, such as head, chest, or abdomen and pelvis). The 95% binomial confidence intervals (CIs) were calculated for discrete proportions, and CIs were derived from the standard error of the mean for continuous variables. After validation, the informatics toolkit was used to populate an exposure repository from a cohort of 54 549 CT encounters, of which 29 948 had available dose screens. Results: Validation yielded a dose screen retrieval rate of 99% (597 of 605 CT encounters; 95% CI: 98%, 100%) and an anatomic assignment precision of 94% (summed DLP fraction correct in 563 of 600 CT encounters; 95% CI: 92%, 96%). Patient safety applications of the resulting data repository include benchmarking between institutions, CT protocol quality control and optimization, and cumulative patient- and anatomy-specific radiation exposure monitoring. Conclusion: Large-scale anatomy-specific radiation exposure data repositories can be created with high fidelity from existing digital image archives by using open-source informatics tools. ©RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12111822/-/DC1 PMID:22668563
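
    As a rough sketch of the general approach described above (not the authors' validated toolkit), the snippet below reads a dose-screen secondary capture with pydicom, runs optical character recognition with pytesseract, and pulls candidate CTDIvol/DLP number pairs with a regular expression; the file name, pixel handling, and pattern are assumptions.

```python
# Hedged sketch of OCR-based dose-screen extraction; all names are placeholders.
import re
import pydicom
import pytesseract
from PIL import Image

ds = pydicom.dcmread("dose_screen.dcm")      # hypothetical CT dose report screen capture
img = Image.fromarray(ds.pixel_array)        # assumes an 8-bit secondary-capture image
text = pytesseract.image_to_string(img)

# Very rough pattern for "<CTDIvol> <DLP>" number pairs printed on a dose report line.
pairs = re.findall(r"(\d+(?:\.\d+)?)\s+(\d+(?:\.\d+)?)", text)
print("candidate (CTDIvol, DLP) pairs:", pairs)
```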

  18. Toolkits and Libraries for Deep Learning.

    PubMed

    Erickson, Bradley J; Korfiatis, Panagiotis; Akkus, Zeynettin; Kline, Timothy; Philbrick, Kenneth

    2017-08-01

    Deep learning is an important new area of machine learning which encompasses a wide range of neural network architectures designed to complete various tasks. In the medical imaging domain, example tasks include organ segmentation, lesion detection, and tumor classification. The most popular network architecture for deep learning for images is the convolutional neural network (CNN). Whereas traditional machine learning requires determination and calculation of features from which the algorithm learns, deep learning approaches learn the important features as well as the proper weighting of those features to make predictions for new data. In this paper, we will describe some of the libraries and tools that are available to aid in the construction and efficient execution of deep learning as applied to medical images.
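
    To make the architecture named above concrete, here is a minimal convolutional network written with PyTorch, one library of the kind the paper surveys; the patch size, layer widths, and two-class output are arbitrary choices for illustration.

```python
# Minimal CNN sketch (illustrative sizes only), written with PyTorch.
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    """Toy two-class classifier for 1-channel 64x64 medical image patches."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(16 * 16 * 16, 2)   # 16 channels x 16x16 spatial

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

logits = TinyCNN()(torch.randn(4, 1, 64, 64))   # batch of 4 random patches
print(logits.shape)                              # torch.Size([4, 2])
```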

  19. Progress towards a semiconductor Compton camera for prompt gamma imaging during proton beam therapy for range and dose verification

    NASA Astrophysics Data System (ADS)

    Gutierrez, A.; Baker, C.; Boston, H.; Chung, S.; Judson, D. S.; Kacperek, A.; Le Crom, B.; Moss, R.; Royle, G.; Speller, R.; Boston, A. J.

    2018-01-01

    The main objective of this work is to test a new semiconductor Compton camera for prompt gamma imaging. Our device is composed of three active layers: a Si(Li) detector as a scatterer and two high purity Germanium detectors as absorbers of high-energy gamma rays. We performed Monte Carlo simulations using the Geant4 toolkit to characterise the expected gamma field during proton beam therapy and have made experimental measurements of the gamma spectrum with a 60 MeV passive scattering beam irradiating a phantom. In this proceeding, we describe the status of the Compton camera and present the first preliminary measurements with radioactive sources and their corresponding reconstructed images.

  20. WATERSHED HEALTH ASSESSMENT TOOLS-INVESTIGATING FISHERIES (WHAT-IF): A MODELING TOOLKIT FOR WATERSHED AND FISHERIES MANAGEMENT

    EPA Science Inventory

    The Watershed Health Assessment Tools-Investigating Fisheries (WHAT-IF) is a decision-analysis modeling toolkit for personal computers that supports watershed and fisheries management. The WHAT-IF toolkit includes a relational database, help-system functions and documentation, a...

  1. Designing and Delivering Intensive Interventions: A Teacher's Toolkit

    ERIC Educational Resources Information Center

    Murray, Christy S.; Coleman, Meghan A.; Vaughn, Sharon; Wanzek, Jeanne; Roberts, Greg

    2012-01-01

    This toolkit provides activities and resources to assist practitioners in designing and delivering intensive interventions in reading and mathematics for K-12 students with significant learning difficulties and disabilities. Grounded in research, this toolkit is based on the Center on Instruction's "Intensive Interventions for Students Struggling…

  2. The National Informal STEM Education Network

    Science.gov Websites

    Evaluation and research kits: the Explore Science: Earth & Space toolkit and the Building with Biology kit. The 2018 toolkits are now available for download. The Building with Biology activity kit offers conversations and activities about synthetic biology; this emerging...

  3. 78 FR 14773 - U.S. Environmental Solutions Toolkit-Landfill Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    DEPARTMENT OF COMMERCE, International Trade Administration: U.S. Environmental Solutions Toolkit... and foreign end-users of environmental technologies that will outline U.S. approaches to a series of environmental problems and highlight participating U.S. vendors of relevant U.S. technologies. The Toolkit will...

  4. PANDA: a pipeline toolbox for analyzing brain diffusion images.

    PubMed

    Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang

    2013-01-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named "Pipeline for Analyzing braiN Diffusion imAges" (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.

  5. Accelerated GPU based SPECT Monte Carlo simulations.

    PubMed

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m) Tc, (111)In and (131)I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency of SPECT imaging simulations.

  6. 78 FR 58520 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... notice sets forth a request for input from U.S. businesses capable of exporting their goods or services... and foreign end-users of environmental technologies. The Toolkit will outline U.S. approaches to a series of environmental problems and highlight participating U.S. vendors of relevant U.S. technologies. The Toolkit will...

  7. Practitioner Data Use in Schools: Workshop Toolkit. REL 2015-043

    ERIC Educational Resources Information Center

    Bocala, Candice; Henry, Susan F.; Mundry, Susan; Morgan, Claire

    2014-01-01

    The "Practitioner Data Use in Schools: Workshop Toolkit" is designed to help practitioners systematically and accurately use data to inform their teaching practice. The toolkit includes an agenda, slide deck, participant workbook, and facilitator's guide and covers the following topics: developing data literacy, engaging in a cycle of…

  8. Language Access Toolkit: An Organizing and Advocacy Resource for Community-Based Youth Programs

    ERIC Educational Resources Information Center

    Beyersdorf, Mark Ro

    2013-01-01

    Asian American Legal Defense and Education Fund (AALDEF) developed this language access toolkit to share the expertise and experiences of National Asian American Education Advocates Network (NAAEA) member organizations with other community organizations interested in developing language access campaigns. This toolkit includes an overview of…

  9. FY17Q4 Ristra project: Release Version 1.0 of a production toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hungerford, Aimee L.; Daniel, David John

    2017-09-21

    The Next Generation Code project will release Version 1.0 of a production toolkit for multi-physics application development on advanced architectures. Features of this toolkit will include remap and link utilities, control and state manager, setup, visualization and I/O, as well as support for a variety of mesh and particle data representations. Numerical physics packages that operate atop this foundational toolkit will be employed in a multi-physics demonstration problem and released to the community along with results from the demonstration.

  10. Diagnosing turbulence for research aircraft safety using open source toolkits

    NASA Astrophysics Data System (ADS)

    Lang, T. J.; Guy, N.

    Open source software toolkits have been developed and applied to diagnose in-cloud turbulence in the vicinity of Earth science research aircraft, via analysis of ground-based Doppler radar data. Based on multiple retrospective analyses, these toolkits show promise for detecting significant turbulence well prior to cloud penetrations by research aircraft. A pilot study demonstrated the ability to provide mission scientists turbulence estimates in near real time during an actual field campaign, and thus these toolkits are recommended for usage in future cloud-penetrating aircraft field campaigns.
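
    A minimal sketch of the underlying idea, not the actual NASA toolkits referenced above: flag radar range gates whose Doppler spectrum width exceeds an assumed turbulence threshold. The threshold value and the synthetic sweep are placeholders; operational products derive calibrated eddy-dissipation-rate estimates instead.

```python
# Minimal, hedged sketch: flag range gates with large Doppler spectrum width.
import numpy as np

def flag_turbulence(spectrum_width_ms, threshold_ms=4.0):
    """Return a boolean mask of gates considered potentially turbulent.

    spectrum_width_ms : 2D array (rays x gates) of Doppler spectrum width in m/s.
    threshold_ms      : assumed threshold; real products use calibrated EDR estimates.
    """
    sw = np.asarray(spectrum_width_ms, dtype=float)
    return np.isfinite(sw) & (sw > threshold_ms)

# Example with synthetic data standing in for one radar sweep
rng = np.random.default_rng(0)
sw = rng.gamma(shape=2.0, scale=1.5, size=(360, 500))
mask = flag_turbulence(sw)
print(f"{mask.mean():.1%} of gates flagged as potentially turbulent")
```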

  11. Software reuse in spacecraft planning and scheduling systems

    NASA Technical Reports Server (NTRS)

    Mclean, David; Tuchman, Alan; Broseghini, Todd; Yen, Wen; Page, Brenda; Johnson, Jay; Bogovich, Lynn; Burkhardt, Chris; Mcintyre, James; Klein, Scott

    1993-01-01

    The use of a software toolkit and development methodology that supports software reuse is described. The toolkit includes source-code-level library modules and stand-alone tools which support such tasks as data reformatting and report generation, simple relational database applications, user interfaces, tactical planning, strategic planning and documentation. The current toolkit is written in C and supports applications that run on IBM-PC's under DOS and UNIX-based workstations under OpenLook and Motif. The toolkit is fully integrated for building scheduling systems that reuse AI knowledge base technology. A typical scheduling scenario and three examples of applications that utilize the reuse toolkit will be briefly described. In addition to the tools themselves, a description of the software evolution and reuse methodology that was used is presented.

  12. A framework for measurement and harmonization of pediatric multiple sclerosis etiologic research studies: The Pediatric MS Tool-Kit.

    PubMed

    Magalhaes, Sandra; Banwell, Brenda; Bar-Or, Amit; Fortier, Isabel; Hanwell, Heather E; Lim, Ming; Matt, Georg E; Neuteboom, Rinze F; O'Riordan, David L; Schneider, Paul K; Pugliatti, Maura; Shatenstein, Bryna; Tansey, Catherine M; Wassmer, Evangeline; Wolfson, Christina

    2018-06-01

    While studying the etiology of multiple sclerosis (MS) in children has several methodological advantages over studying etiology in adults, studies are limited by small sample sizes. Using a rigorous methodological process, we developed the Pediatric MS Tool-Kit, a measurement framework that includes a minimal set of core variables to assess etiological risk factors. We solicited input from the International Pediatric MS Study Group to select three risk factors: environmental tobacco smoke (ETS) exposure, sun exposure, and vitamin D intake. To develop the Tool-Kit, we used a Delphi study involving a working group of epidemiologists, neurologists, and content experts from North America and Europe. The Tool-Kit includes six core variables to measure ETS, six to measure sun exposure, and six to measure vitamin D intake. The Tool-Kit can be accessed online (www.maelstrom-research.org/mica/network/tool-kit). The goals of the Tool-Kit are to enhance exposure measurement in newly designed pediatric MS studies and comparability of results across studies, and in the longer term to facilitate harmonization of studies, a methodological approach that can be used to circumvent issues of small sample sizes. We believe the Tool-Kit will prove to be a valuable resource to guide pediatric MS researchers in developing study-specific questionnaires.

  13. Implementation of the Good School Toolkit in Uganda: a quantitative process evaluation of a successful violence prevention program.

    PubMed

    Knight, Louise; Allen, Elizabeth; Mirembe, Angel; Nakuti, Janet; Namy, Sophie; Child, Jennifer C; Sturgess, Joanna; Kyegombe, Nambusi; Walakira, Eddy J; Elbourne, Diana; Naker, Dipak; Devries, Karen M

    2018-05-09

    The Good School Toolkit, a complex behavioural intervention designed by Raising Voices, a Ugandan NGO, reduced past week physical violence from school staff to primary students by an average of 42% in a recent randomised controlled trial. This process evaluation quantitatively examines what was implemented across the twenty-one intervention schools, variations in school prevalence of violence after the intervention, factors that influence exposure to the intervention and factors associated with students' experience of physical violence from staff at study endline. Implementation measures were captured prospectively in the twenty-one intervention schools over four school terms from 2012 to 2014, and Toolkit exposure was captured in the student (n = 1921) and staff (n = 286) endline cross-sectional surveys in 2014. Implementation measures and the prevalence of violence are summarised across schools and are assessed for correlation using Spearman's Rank Correlation Coefficient. Regression models are used to explore individual factors associated with Toolkit exposure and with physical violence at endline. School prevalence of past week physical violence from staff against students ranged from 7% to 65% across schools at endline. Schools with higher mean levels of teacher Toolkit exposure had larger decreases in violence during the study. Students in schools categorised as implementing a 'low' number of program school-led activities reported less exposure to the Toolkit. Higher student Toolkit exposure was associated with decreased odds of experiencing physical violence from staff (OR: 0.76, 95% CI: 0.67-0.86, p-value < 0.001). Girls, students reporting poorer mental health and students in a lower grade were less exposed to the Toolkit. After the intervention, and when adjusting for individual Toolkit exposure, some students remained at increased risk of experiencing violence from staff, including girls, students reporting poorer mental health, students who experienced other violence and those reporting difficulty with self-care. Our results suggest that increasing students' and teachers' exposure to the Good School Toolkit within schools has the potential to bring about further reductions in violence. Effectiveness of the Toolkit may be increased by further targeting and supporting teachers' engagement with girls and students with mental health difficulties. The trial is registered at clinicaltrials.gov, NCT01678846, August 24th 2012.

  14. Automation of Hessian-Based Tubularity Measure Response Function in 3D Biomedical Images.

    PubMed

    Dzyubak, Oleksandr P; Ritman, Erik L

    2011-01-01

    The blood vessels and nerve trees consist of tubular objects interconnected into a complex tree- or web-like structure that spans a range of structural scales, from 5 μm diameter capillaries to the 3 cm aorta. This large scale range presents two major problems: one is simply making the measurements, and the other is the exponential increase in the number of components with decreasing scale. With the remarkable increase in the volume imaged by, and resolution of, modern-day 3D imagers, it is almost impossible to manually track the complex multiscale parameters in those large image data sets. In addition, manual tracking is quite subjective and unreliable. We propose a solution for automation: an adaptive, unsupervised system for tracking tubular objects based on a multiscale framework and a Hessian-based object shape detector, incorporating the National Library of Medicine Insight Segmentation and Registration Toolkit (ITK) image processing libraries.
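
    As an analogous, hedged illustration (not the authors' ITK implementation), the snippet below computes a multiscale Hessian-based tubularity response on a synthetic volume using scikit-image's Frangi vesselness filter; the volume, scales, and parameters are made up for demonstration.

```python
# Analogous illustration of a multiscale Hessian-based tubularity (vesselness)
# response, using scikit-image's Frangi filter on a synthetic 3D volume.
import numpy as np
from skimage.filters import frangi

# Synthetic volume with a bright tube running along the first axis
vol = np.zeros((64, 64, 64), dtype=float)
zz, yy = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
tube = (zz - 32) ** 2 + (yy - 32) ** 2 < 3 ** 2
vol[:, tube] = 1.0

# Evaluate the response over several scales (sigmas) to cover a range of radii
response = frangi(vol, sigmas=(1, 2, 3), black_ridges=False)
print("max tubularity response:", float(response.max()))
```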

  15. Simulation of nanoparticle-mediated near-infrared thermal therapy using GATE

    PubMed Central

    Cuplov, Vesna; Pain, Frédéric; Jan, Sébastien

    2017-01-01

    Application of nanotechnology for biomedicine in cancer therapy allows for direct delivery of anticancer agents to tumors. An example of such therapies is the nanoparticle-mediated near-infrared hyperthermia treatment. In order to investigate the influence of nanoparticle properties on the spatial distribution of heat in the tumor and healthy tissues, accurate simulations are required. The Geant4 Application for Emission Tomography (GATE) open-source simulation platform, based on the Geant4 toolkit, is widely used by the research community involved in molecular imaging, radiotherapy and optical imaging. We present an extension of GATE that can model nanoparticle-mediated hyperthermal therapy as well as simple heat diffusion in biological tissues. This new feature of GATE combined with optical imaging allows for the simulation of a theranostic scenario in which the patient is injected with theranostic nanosystems that can simultaneously deliver therapeutic (i.e. hyperthermia therapy) and imaging agents (i.e. fluorescence imaging). PMID:28663855
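
    The GATE extension itself lives inside the Geant4/GATE framework; as a minimal stand-alone sketch of the heat-diffusion component it models, the following explicit finite-difference loop relaxes the temperature field around a heated region in tissue. The grid size, diffusivity, and source temperature are assumed values.

```python
# Not GATE itself: a minimal explicit finite-difference sketch of 2D heat
# diffusion in tissue around a heated, nanoparticle-loaded region.
import numpy as np

nx = ny = 101
dx = 1e-4                 # 0.1 mm grid spacing (m)
alpha = 1.4e-7            # assumed thermal diffusivity of soft tissue (m^2/s)
dt = 0.2 * dx**2 / alpha  # time step within the explicit stability limit
T = np.full((nx, ny), 37.0)          # body temperature (deg C)
T[45:56, 45:56] = 45.0               # heated region (hyperthermia target)

for _ in range(2000):
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
    T = T + dt * alpha * lap
    T[45:56, 45:56] = 45.0           # keep the source at a fixed temperature

print(f"temperature about 1 mm from the source edge: {T[50, 66]:.2f} deg C")
```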

  16. New Exposure Time Calculator for NICMOS (imaging): Features, Testing and Recommendations

    NASA Astrophysics Data System (ADS)

    Arribas, S.; McLean, D.; Busko, I.; Sosey, M.

    2004-02-01

    A new NICMOS ETC for imaging mode has been developed as part of the Astronomer’s Proposal Toolkit (APT) project. This new tool fully updates the NICMOS performance for Cycles 11+, expands the functionality of the previous ETC, providing the user with more options, and homogenizes the non-instrument specific parameters (e.g., sky background, extinction laws) with other HST-ETCs. This report summarizes its main characteristics, and gives some recommendations to potential users. Details about the tool itself can be found in the documentation linked to the ETC user interface, which can be accessed from the NICMOS web site at STScI.
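
    Exposure time calculators of this kind ultimately evaluate a signal-to-noise relation of the form SNR = S·t / sqrt(S·t + n_pix·(B·t + D·t + RN²)). The sketch below evaluates that generic relation with placeholder numbers; it is not the NICMOS ETC's code and uses no instrument-specific throughputs.

```python
# Generic illustration of the CCD/IR-array signal-to-noise relation that
# exposure time calculators evaluate; all numbers below are placeholders.
import math

def point_source_snr(rate_src, rate_sky, dark, read_noise, npix, t):
    """SNR for a point source: source counts over the quadrature sum of noise terms."""
    signal = rate_src * t
    noise = math.sqrt(rate_src * t + npix * (rate_sky * t + dark * t + read_noise**2))
    return signal / noise

# Example: 50 e-/s source, 0.5 e-/s/pix sky, 0.05 e-/s/pix dark, 30 e- read noise,
# 25-pixel photometry aperture, 300 s exposure (all values assumed).
print(f"SNR ~ {point_source_snr(50, 0.5, 0.05, 30, 25, 300):.1f}")
```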

  17. The visible human project®: From body to bits.

    PubMed

    Ackerman, Michael J

    2016-08-01

    In the mid-1990s the U.S. National Library of Medicine sponsored the acquisition and development of the Visible Human Project® database. This image database contains anatomical cross-sectional images that allow the reconstruction of three-dimensional male and female anatomy to an accuracy of less than 1.0 mm. The male anatomy is contained in a 15 gigabyte database, the female in a 39 gigabyte database. This talk will describe why and how this project was accomplished and demonstrate some of the products which the Visible Human dataset has made possible. I will conclude by describing how the Visible Human Project, completed over 20 years ago, has led the National Library of Medicine to a series of image research projects including an open source image processing toolkit which is included in several commercial products.

  18. GATE: a simulation toolkit for PET and SPECT.

    PubMed

    Jan, S; Santin, G; Strul, D; Staelens, S; Assié, K; Autret, D; Avner, S; Barbier, R; Bardiès, M; Bloomfield, P M; Brasse, D; Breton, V; Bruyndonckx, P; Buvat, I; Chatziioannou, A F; Choi, Y; Chung, Y H; Comtat, C; Donnarieix, D; Ferrer, L; Glick, S J; Groiselle, C J; Guez, D; Honore, P F; Kerhoas-Cavata, S; Kirov, A S; Kohli, V; Koole, M; Krieguer, M; van der Laan, D J; Lamare, F; Largeron, G; Lartizien, C; Lazaro, D; Maas, M C; Maigne, L; Mayet, F; Melot, F; Merheb, C; Pennacchio, E; Perez, J; Pietrzyk, U; Rannou, F R; Rey, M; Schaart, D R; Schmidtlein, C R; Simon, L; Song, T Y; Vieira, J M; Visvikis, D; Van de Walle, R; Wieërs, E; Morel, C

    2004-10-07

    Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement, and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. This paper gives a detailed description of the design and development of GATE by the OpenGATE collaboration, whose continuing objective is to improve, document and validate GATE by simulating commercially available imaging systems for PET and SPECT. Large effort is also invested in the ability and the flexibility to model novel detection systems or systems still under design. A public release of GATE licensed under the GNU Lesser General Public License can be downloaded at http://www-lphe.epfl.ch/GATE/. Two benchmarks developed for PET and SPECT to test the installation of GATE and to serve as a tutorial for the users are presented. Extensive validation of the GATE simulation platform has been started, comparing simulations and measurements on commercially available acquisition systems. References to those results are listed. The future prospects towards the gridification of GATE and its extension to other domains such as dosimetry are also discussed.
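
    GATE simulations are driven by its own macro files rather than a general-purpose language; purely as a language-neutral illustration of the "source decay kinetics" feature mentioned above, the sketch below samples exponential decay times for an assumed 99mTc-like half-life and bins them into a time-activity curve.

```python
# Language-neutral sketch (not GATE's macro language): sample decay times for a
# given half-life and histogram them into a time-activity curve.
import numpy as np

half_life_s = 6.0 * 3600          # assumed 99mTc-like half-life (~6 h)
n_decays = 1_000_000
rng = np.random.default_rng(42)

# Decay times follow an exponential distribution with mean T_1/2 / ln(2)
times = rng.exponential(scale=half_life_s / np.log(2), size=n_decays)

counts, edges = np.histogram(times, bins=60, range=(0, 12 * 3600))
peak = counts[0]
print("activity near one half-life / initial activity:",
      round(counts[np.searchsorted(edges, half_life_s) - 1] / peak, 2))
```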

  19. GATE - Geant4 Application for Tomographic Emission: a simulation toolkit for PET and SPECT

    PubMed Central

    Jan, S.; Santin, G.; Strul, D.; Staelens, S.; Assié, K.; Autret, D.; Avner, S.; Barbier, R.; Bardiès, M.; Bloomfield, P. M.; Brasse, D.; Breton, V.; Bruyndonckx, P.; Buvat, I.; Chatziioannou, A. F.; Choi, Y.; Chung, Y. H.; Comtat, C.; Donnarieix, D.; Ferrer, L.; Glick, S. J.; Groiselle, C. J.; Guez, D.; Honore, P.-F.; Kerhoas-Cavata, S.; Kirov, A. S.; Kohli, V.; Koole, M.; Krieguer, M.; van der Laan, D. J.; Lamare, F.; Largeron, G.; Lartizien, C.; Lazaro, D.; Maas, M. C.; Maigne, L.; Mayet, F.; Melot, F.; Merheb, C.; Pennacchio, E.; Perez, J.; Pietrzyk, U.; Rannou, F. R.; Rey, M.; Schaart, D. R.; Schmidtlein, C. R.; Simon, L.; Song, T. Y.; Vieira, J.-M.; Visvikis, D.; Van de Walle, R.; Wieërs, E.; Morel, C.

    2012-01-01

    Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols, and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement, and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. This paper gives a detailed description of the design and development of GATE by the OpenGATE collaboration, whose continuing objective is to improve, document, and validate GATE by simulating commercially available imaging systems for PET and SPECT. Large effort is also invested in the ability and the flexibility to model novel detection systems or systems still under design. A public release of GATE licensed under the GNU Lesser General Public License can be downloaded at the address http://www-lphe.epfl.ch/GATE/. Two benchmarks developed for PET and SPECT to test the installation of GATE and to serve as a tutorial for the users are presented. Extensive validation of the GATE simulation platform has been started, comparing simulations and measurements on commercially available acquisition systems. References to those results are listed. The future prospects toward the gridification of GATE and its extension to other domains such as dosimetry are also discussed. PMID:15552416

  20. Dosimetry applications in GATE Monte Carlo toolkit.

    PubMed

    Papadimitroulas, Panagiotis

    2017-09-01

    Monte Carlo (MC) simulations are a well-established method for studying physical processes in medical physics. The purpose of this review is to present GATE dosimetry applications on diagnostic and therapeutic simulated protocols. There is a significant need for accurate quantification of the absorbed dose in several specific applications such as preclinical and pediatric studies. GATE is an open-source MC toolkit for simulating imaging, radiotherapy (RT) and dosimetry applications in a user-friendly environment, which is well validated and widely accepted by the scientific community. In RT applications, during treatment planning, it is essential to accurately assess the deposited energy and the absorbed dose per tissue/organ of interest, as well as the local statistical uncertainty. Several types of realistic dosimetric applications are described including: molecular imaging, radio-immunotherapy, radiotherapy and brachytherapy. GATE has been efficiently used in several applications, such as Dose Point Kernels, S-values, Brachytherapy parameters, and has been compared against various MC codes which are considered as standard tools for decades. Furthermore, the presented studies show reliable modeling of particle beams when comparing experimental with simulated data. Examples of different dosimetric protocols are reported for individualized dosimetry and simulations combining imaging and therapy dose monitoring, with the use of modern computational phantoms. Personalization of medical protocols can be achieved by combining GATE MC simulations with anthropomorphic computational models and clinical anatomical data. This is a review study, covering several dosimetric applications of GATE, and the different tools used for modeling realistic clinical acquisitions with accurate dose assessment. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
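
    As a hedged reminder of the bookkeeping behind such dosimetry outputs (not GATE's actual scoring code), the sketch below converts synthetic per-history energy deposits in a voxel into absorbed dose, D = E_dep/m, and estimates the relative statistical uncertainty from the history-to-history variance.

```python
# Illustration of basic dosimetry bookkeeping (not GATE's output code):
# absorbed dose D = E_dep / m, with a per-voxel statistical uncertainty.
import numpy as np

MeV_to_J = 1.602176634e-13
voxel_volume_cm3 = 0.2 ** 3          # 2 mm cubic voxel
density_g_cm3 = 1.0                  # water-equivalent tissue (assumed)
mass_kg = density_g_cm3 * voxel_volume_cm3 * 1e-3

rng = np.random.default_rng(1)
# Energy deposited in one voxel by N independent histories (MeV), synthetic data
edep_per_history = rng.exponential(scale=0.05, size=100_000)

dose_Gy = edep_per_history.sum() * MeV_to_J / mass_kg
rel_uncertainty = edep_per_history.std(ddof=1) / (
    edep_per_history.mean() * np.sqrt(edep_per_history.size))
print(f"dose = {dose_Gy:.3e} Gy  (relative uncertainty ~ {rel_uncertainty:.2%})")
```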

  1. Assessing the effectiveness of the Pesticides and Farmworker Health Toolkit: a curriculum for enhancing farmworkers' understanding of pesticide safety concepts.

    PubMed

    LePrevost, Catherine E; Storm, Julia F; Asuaje, Cesar R; Arellano, Consuelo; Cope, W Gregory

    2014-01-01

    Among agricultural workers, migrant and seasonal farmworkers have been recognized as a special risk population because these laborers encounter cultural challenges and linguistic barriers while attempting to maintain their safety and health within their working environments. The crop-specific Pesticides and Farmworker Health Toolkit (Toolkit) is a pesticide safety and health curriculum designed to communicate to farmworkers pesticide hazards commonly found in their working environments and to address Worker Protection Standard (WPS) pesticide training criteria for agricultural workers. The goal of this preliminary study was to test evaluation items for measuring knowledge increases among farmworkers and to assess the effectiveness of the Toolkit in improving farmworkers' knowledge of key WPS and risk communication concepts when the Toolkit lesson was delivered by trained trainers in the field. After receiving training on the curriculum, four participating trainers provided lessons using the Toolkit as part of their regular training responsibilities and orally administered a pre- and post-lesson evaluation instrument to 20 farmworker volunteers who were generally representative of the national farmworker population. Farmworker knowledge of pesticide safety messages significantly (P<.05) increased after participation in the lesson. Further, items with visual alternatives were found to be most useful in discriminating between more and less knowledgeable farmworkers. The pilot study suggests that the Pesticides and Farmworker Health Toolkit is an effective, research-based pesticide safety and health intervention for the at-risk farmworker population and identifies a testing format appropriate for evaluating the Toolkit and other similar interventions for farmworkers in the field.

  2. Educator Toolkits on Second Victim Syndrome, Mindfulness and Meditation, and Positive Psychology: The 2017 Resident Wellness Consensus Summit

    PubMed Central

    Smart, Jon; Zdradzinski, Michael; Roth, Sarah; Gende, Alecia; Conroy, Kylie; Battaglioli, Nicole

    2018-01-01

    Introduction Burnout, depression, and suicidality among residents of all specialties have become a critical focus of attention for the medical education community. Methods As part of the 2017 Resident Wellness Consensus Summit in Las Vegas, Nevada, resident participants from 31 programs collaborated in the Educator Toolkit workgroup. Over a seven-month period leading up to the summit, this workgroup convened virtually in the Wellness Think Tank, an online resident community, to perform a literature review and draft curricular plans on three core wellness topics. These topics were second victim syndrome, mindfulness and meditation, and positive psychology. At the live summit event, the workgroup expanded to include residents outside the Wellness Think Tank to obtain a broader consensus of the evidence-based toolkits for these three topics. Results Three educator toolkits were developed. The second victim syndrome toolkit has four modules, each with a pre-reading material and a leader (educator) guide. In the mindfulness and meditation toolkit, there are three modules with a leader guide in addition to a longitudinal, guided meditation plan. The positive psychology toolkit has two modules, each with a leader guide and a PowerPoint slide set. These toolkits provide educators the necessary resources, reading materials, and lesson plans to implement didactic sessions in their residency curriculum. Conclusion Residents from across the world collaborated and convened to reach a consensus on high-yield—and potentially high-impact—lesson plans that programs can use to promote and improve resident wellness. These lesson plans may stand alone or be incorporated into a larger wellness curriculum. PMID:29560061

  3. Educator Toolkits on Second Victim Syndrome, Mindfulness and Meditation, and Positive Psychology: The 2017 Resident Wellness Consensus Summit.

    PubMed

    Chung, Arlene S; Smart, Jon; Zdradzinski, Michael; Roth, Sarah; Gende, Alecia; Conroy, Kylie; Battaglioli, Nicole

    2018-03-01

    Burnout, depression, and suicidality among residents of all specialties have become a critical focus of attention for the medical education community. As part of the 2017 Resident Wellness Consensus Summit in Las Vegas, Nevada, resident participants from 31 programs collaborated in the Educator Toolkit workgroup. Over a seven-month period leading up to the summit, this workgroup convened virtually in the Wellness Think Tank, an online resident community, to perform a literature review and draft curricular plans on three core wellness topics. These topics were second victim syndrome, mindfulness and meditation, and positive psychology. At the live summit event, the workgroup expanded to include residents outside the Wellness Think Tank to obtain a broader consensus of the evidence-based toolkits for these three topics. Three educator toolkits were developed. The second victim syndrome toolkit has four modules, each with a pre-reading material and a leader (educator) guide. In the mindfulness and meditation toolkit, there are three modules with a leader guide in addition to a longitudinal, guided meditation plan. The positive psychology toolkit has two modules, each with a leader guide and a PowerPoint slide set. These toolkits provide educators the necessary resources, reading materials, and lesson plans to implement didactic sessions in their residency curriculum. Residents from across the world collaborated and convened to reach a consensus on high-yield-and potentially high-impact-lesson plans that programs can use to promote and improve resident wellness. These lesson plans may stand alone or be incorporated into a larger wellness curriculum.

  4. The RAPID Toolkit: Facilitating Utility-Scale Renewable Energy Development

    Science.gov Websites

    The RAPID Toolkit, developed by the National Renewable Energy Laboratory, provides information about federal, state, and local permitting and regulations for utility-scale renewable energy and bulk transmission projects.

  5. Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes

    ERIC Educational Resources Information Center

    Rama, Kondapalli, Ed.; Hope, Andrea, Ed.

    2009-01-01

    The Commonwealth of Learning is proud to partner with the Sri Lankan Ministry of Higher Education and UNESCO to produce this "Quality Assurance Toolkit for Distance Higher Education Institutions and Programmes". The Toolkit has been prepared with three features. First, it is a generic document on quality assurance, complete with a…

  6. 75 FR 35038 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-21

    ... components of the RED, as well as the needs of culturally and linguistically diverse patients; (2) To pre-test the revised RED Toolkit in ten varied hospital settings, evaluating how the RED Toolkit is... intensity of technical assistance (TA). (3) To modify the revised RED Toolkit based on pre-testing and to...

  7. Transition Toolkit 3.0: Meeting the Educational Needs of Youth Exposed to the Juvenile Justice System. Third Edition

    ERIC Educational Resources Information Center

    Clark, Heather Griller; Mathur, Sarup; Brock, Leslie; O'Cummings, Mindee; Milligan, DeAngela

    2016-01-01

    The third edition of the National Technical Assistance Center for the Education of Neglected or Delinquent Children and Youth's (NDTAC's) "Transition Toolkit" provides updated information on existing policies, practices, strategies, and resources for transition that build on field experience and research. The "Toolkit" offers…

  8. Problem posing and cultural tailoring: developing an HIV/AIDS health literacy toolkit with the African American community.

    PubMed

    Rikard, R V; Thompson, Maxine S; Head, Rachel; McNeil, Carlotta; White, Caressa

    2012-09-01

    The rate of HIV infection among African Americans is disproportionately higher than for other racial groups in the United States. Previous research suggests that a low level of health literacy (HL) is an underlying factor explaining racial disparities in the prevalence and incidence of HIV/AIDS. The present research describes a community and university project to develop a culturally tailored HIV/AIDS HL toolkit in the African American community. Paulo Freire's pedagogical philosophy and problem-posing methodology served as the guiding framework throughout the development process. Development of the HIV/AIDS HL toolkit occurred in a two-stage process. In Stage 1, a nonprofit organization and research team established a collaborative partnership to develop a culturally tailored HIV/AIDS HL toolkit. In Stage 2, African American community members participated in focus groups conducted as Freirian cultural circles to further refine the HIV/AIDS HL toolkit. In both stages, problem posing engaged participants' knowledge, experiences, and concerns to evaluate a working draft toolkit. The discussion and implications highlight how Freire's pedagogical philosophy and methodology enhance the development of culturally tailored health information.

  9. Training Software in Artificial-Intelligence Computing Techniques

    NASA Technical Reports Server (NTRS)

    Howard, Ayanna; Rogstad, Eric; Chalfant, Eugene

    2005-01-01

    The Artificial Intelligence (AI) Toolkit is a computer program for training scientists, engineers, and university students in three soft-computing techniques (fuzzy logic, neural networks, and genetic algorithms) used in artificial-intelligence applications. The program provides an easily understandable tutorial interface, including an interactive graphical component through which the user can gain hands-on experience in soft-computing techniques applied to realistic example problems. The tutorial provides step-by-step instructions on the workings of soft-computing technology, whereas the hands-on examples allow interaction and reinforcement of the techniques explained throughout the tutorial. In the fuzzy-logic example, a user can interact with a robot and an obstacle course to verify how fuzzy logic is used to command a rover traverse from an arbitrary start to the goal location. For the genetic algorithm example, the problem is to determine the minimum-length path for visiting a user-chosen set of planets in the solar system. For the neural-network example, the problem is to decide, on the basis of input data on physical characteristics, whether a person is a man, woman, or child. The AI Toolkit is compatible with the Windows 95, 98, ME, NT 4.0, 2000, and XP operating systems. A computer having a processor speed of at least 300 MHz and random-access memory of at least 56 MB is recommended for optimal performance. The program can be run on a slower computer having less memory, but some functions may not be executed properly.
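
    As a rough sketch of the genetic-algorithm example described above (not the AI Toolkit's code), the following evolves the visiting order of a handful of "planets" to minimise total path length; the coordinates, population size, and operators are arbitrary choices for illustration.

```python
# Minimal genetic-algorithm sketch: evolve the visiting order of a set of
# "planets" to minimise total path length. All parameters are illustrative.
import random

random.seed(7)
planets = {name: (random.uniform(0, 40), random.uniform(0, 40))
           for name in ["Mercury", "Venus", "Earth", "Mars", "Jupiter", "Saturn"]}

def tour_length(order):
    pts = [planets[name] for name in order]
    return sum(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
               for a, b in zip(pts, pts[1:]))

def crossover(p1, p2):
    """Order crossover: keep a slice from p1, fill the rest in p2's order."""
    i, j = sorted(random.sample(range(len(p1)), 2))
    child = p1[i:j]
    rest = [g for g in p2 if g not in child]
    return rest[:i] + child + rest[i:]

def mutate(order, rate=0.2):
    order = order[:]
    if random.random() < rate:
        a, b = random.sample(range(len(order)), 2)
        order[a], order[b] = order[b], order[a]
    return order

population = [random.sample(list(planets), len(planets)) for _ in range(60)]
for _ in range(200):
    population.sort(key=tour_length)
    parents = population[:20]                       # simple truncation selection
    population = parents + [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(40)]

best = min(population, key=tour_length)
print(" -> ".join(best), f"(length {tour_length(best):.1f})")
```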

  10. Assessing GPS Constellation Resiliency in an Urban Canyon Environment

    DTIC Science & Technology

    2015-03-26

    Taipei, Taiwan as his area of interest. His GPS constellation is modeled in the Satellite Toolkit (STK), where augmentation satellites can be added and...interaction. SEAS also provides a visual display of the simulation, which is useful for verification and debugging portions of the analysis. Furthermore...entire system. Interpreting the model is aided by the visual display of the agents moving in the region of interest. Furthermore, SEAS collects

  11. A signaling visualization toolkit to support rational design of combination therapies and biomarker discovery: SiViT.

    PubMed

    Bown, James L; Shovman, Mark; Robertson, Paul; Boiko, Andrei; Goltsov, Alexey; Mullen, Peter; Harrison, David J

    2017-05-02

    Targeted cancer therapy aims to disrupt aberrant cellular signalling pathways. Biomarkers are surrogates of pathway state, but there is limited success in translating candidate biomarkers to clinical practice due to the intrinsic complexity of pathway networks. Systems biology approaches afford better understanding of complex, dynamical interactions in signalling pathways targeted by anticancer drugs. However, adoption of dynamical modelling by clinicians and biologists is impeded by model inaccessibility. Drawing on computer games technology, we present a novel visualization toolkit, SiViT, that converts systems biology models of cancer cell signalling into interactive simulations that can be used without specialist computational expertise. SiViT allows clinicians and biologists to directly introduce for example loss of function mutations and specific inhibitors. SiViT animates the effects of these introductions on pathway dynamics, suggesting further experiments and assessing candidate biomarker effectiveness. In a systems biology model of Her2 signalling we experimentally validated predictions using SiViT, revealing the dynamics of biomarkers of drug resistance and highlighting the role of pathway crosstalk. No model is ever complete: the iteration of real data and simulation facilitates continued evolution of more accurate, useful models. SiViT will make accessible libraries of models to support preclinical research, combinatorial strategy design and biomarker discovery.
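
    For a flavour of the kind of pathway dynamics such a tool animates (this is not SiViT's internals), the sketch below integrates a toy phosphorylation/dephosphorylation cycle and shows how an assumed fractional kinase inhibition shifts the steady-state active fraction.

```python
# Illustrative only: a toy signalling model of the kind such tools animate -
# a phosphorylation cycle where a drug reduces the effective kinase activity.
import numpy as np
from scipy.integrate import solve_ivp

def cycle(t, y, k_kin, k_phos, inhibition):
    """y = [inactive protein, phosphorylated (active) protein]."""
    u, p = y
    phosphorylation = k_kin * (1.0 - inhibition) * u
    dephosphorylation = k_phos * p
    return [dephosphorylation - phosphorylation,
            phosphorylation - dephosphorylation]

t_span, y0 = (0, 60), [1.0, 0.0]
for inhibition in (0.0, 0.8):          # no drug vs. 80% kinase inhibition
    sol = solve_ivp(cycle, t_span, y0, args=(0.5, 0.2, inhibition))
    print(f"inhibition={inhibition:.0%}: steady-state active fraction "
          f"~ {sol.y[1, -1]:.2f}")
```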

  12. Simulation of orientational coherent effects via Geant4

    NASA Astrophysics Data System (ADS)

    Bagli, E.; Asai, M.; Brandt, D.; Dotti, A.; Guidi, V.; Verderi, M.; Wright, D.

    2017-10-01

    Beam manipulation of high- and very-high-energy particle beams is a hot topic in accelerator physics. Coherent effects of ultra-relativistic particles in bent crystals allow the steering of particle trajectories thanks to the strong electric field generated between atomic planes. Recently, a collimation experiment with bent crystals was carried out at the CERN-LHC, paving the way to the usage of such technology in current and future accelerators. Geant4 is a widely used object-oriented toolkit for the Monte Carlo simulation of the interaction of particles with matter in high-energy physics. Moreover, its areas of application also include nuclear and accelerator physics, as well as studies in medical and space science. We present the first Geant4 extension for the simulation of orientational effects in straight and bent crystals for high-energy charged particles. The model allows the manipulation of particle trajectories by means of straight and bent crystals and the scaling of the cross sections of hadronic and electromagnetic processes for channeled particles. Based on such a model, an extension of the Geant4 toolkit has been developed. The code and the model have been validated by comparison with published experimental data regarding the deflection efficiency via channeling and the variation of the rate of inelastic nuclear interactions.
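
    A back-of-the-envelope companion to the abstract, not part of the Geant4 extension itself: planar channeling is only possible within the Lindhard critical angle, roughly θ_c ≈ sqrt(2·U0/pv). The potential-well depth used below (≈21.6 eV for Si (110) planes) is an assumed textbook value.

```python
# Back-of-the-envelope sketch: Lindhard critical angle for planar channeling,
# theta_c ~ sqrt(2 * U0 / pv), with an assumed well depth U0 for Si (110).
import math

def critical_angle_rad(pv_GeV, U0_eV=21.6):
    """Critical channeling angle (small-angle approximation)."""
    pv_eV = pv_GeV * 1e9
    return math.sqrt(2.0 * U0_eV / pv_eV)

for pv in (400.0, 6500.0):   # e.g. SPS-like and LHC-like proton momenta (GeV)
    print(f"pv = {pv:6.0f} GeV  ->  theta_c ~ "
          f"{critical_angle_rad(pv) * 1e6:.1f} microrad")
```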

  13. Alarm rationalization: practical experience rationalizing alarm configuration for an accelerator subsystem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kasemir, Kay; Hartman, Steven M

    2009-01-01

    A new alarm system toolkit has been implemented at SNS. The toolkit handles the Central Control Room (CCR) 'annunciator', or audio alarms. For the new alarm system to be effective, the alarms must be meaningful and properly configured. Along with the implementation of the new alarm toolkit, thorough documentation and rationalization of the alarm configuration is taking place. Requirements for creating and maintaining a robust alarm configuration have been gathered from system and operations experts. In this paper we present our practical experience with the vacuum system alarm handling configuration of the alarm toolkit.

  14. Demonstration of the Health Literacy Universal Precautions Toolkit

    PubMed Central

    Mabachi, Natabhona M.; Cifuentes, Maribel; Barnard, Juliana; Brega, Angela G.; Albright, Karen; Weiss, Barry D.; Brach, Cindy; West, David

    2016-01-01

    The Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit was developed to help primary care practices assess and make changes to improve communication with and support for patients. Twelve diverse primary care practices implemented assigned tools over a 6-month period. Qualitative results revealed challenges practices experienced during implementation, including competing demands, bureaucratic hurdles, technological challenges, limited quality improvement experience, and limited leadership support. Practices used the Toolkit flexibly and recognized the efficiencies of implementing tools in tandem and in coordination with other quality improvement initiatives. Practices recommended reducing Toolkit density and making specific refinements. PMID:27232681

  15. Demonstration of the Health Literacy Universal Precautions Toolkit: Lessons for Quality Improvement.

    PubMed

    Mabachi, Natabhona M; Cifuentes, Maribel; Barnard, Juliana; Brega, Angela G; Albright, Karen; Weiss, Barry D; Brach, Cindy; West, David

    2016-01-01

    The Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit was developed to help primary care practices assess and make changes to improve communication with and support for patients. Twelve diverse primary care practices implemented assigned tools over a 6-month period. Qualitative results revealed challenges practices experienced during implementation, including competing demands, bureaucratic hurdles, technological challenges, limited quality improvement experience, and limited leadership support. Practices used the Toolkit flexibly and recognized the efficiencies of implementing tools in tandem and in coordination with other quality improvement initiatives. Practices recommended reducing Toolkit density and making specific refinements.

  16. Image Segmentation, Registration, Compression, and Matching

    NASA Technical Reports Server (NTRS)

    Yadegar, Jacob; Wei, Hai; Yadegar, Joseph; Ray, Nilanjan; Zabuawala, Sakina

    2011-01-01

    A novel computational framework was developed for 2D affine-invariant matching that exploits a parameter space. Named the affine invariant parameter space (AIPS), the technique can be applied to many image-processing and computer-vision problems, including image registration, template matching, and object tracking from image sequences. The AIPS is formed by the parameters in an affine combination of a set of feature points in the image plane. In cases where the entire image can be assumed to have undergone a single affine transformation, the new AIPS match metric and matching framework become very effective (compared with the state-of-the-art methods at the time of this reporting). No knowledge of scaling or any other transformation parameters is needed a priori to apply the AIPS framework. An automated suite of software tools has been created to provide accurate image segmentation (for data cleaning) and high-quality 2D image and 3D surface registration (for fusing multi-resolution terrain, image, and map data). These tools are capable of supporting existing GIS toolkits already in the marketplace, and will also be usable in a stand-alone fashion. The toolkit applies novel algorithmic approaches for image segmentation, feature extraction, and registration of 2D imagery and 3D surface data, which supports first-pass, batched, fully automatic feature extraction (for segmentation) and registration. A hierarchical and adaptive approach is taken for achieving automatic feature extraction, segmentation, and registration. Surface registration is the process of aligning two (or more) data sets to a common coordinate system, during which the transformation between their different coordinate systems is determined. Also developed here is a novel volumetric surface modeling and compression technique that provides both quality-guaranteed mesh surface approximations and compaction of the model sizes by efficiently coding the geometry and connectivity/topology components of the generated models. The highly efficient triangular mesh compression compacts the connectivity information at the rate of 1.5-4 bits per vertex (on average for triangle meshes), while reducing the 3D geometry by 40-50 percent. Finally, taking into consideration the characteristics of 3D terrain data, and using the innovative, regularized binary decomposition mesh modeling, a multistage, pattern-driven modeling and compression technique has been developed to provide an effective framework for compressing digital elevation model (DEM) surfaces, high-resolution aerial imagery, and other types of NASA data.
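
    Not the AIPS technique itself, but the elementary relation it builds on: a 2D affine map x' = A·x + t has six parameters and can be recovered from three or more point correspondences by linear least squares, as in the sketch below with synthetic points.

```python
# Underlying relation only (not AIPS): recover a 2D affine transform
# x' = A x + t from noisy point correspondences by linear least squares.
import numpy as np

rng = np.random.default_rng(3)
src = rng.uniform(0, 100, size=(10, 2))                 # feature points
A_true = np.array([[1.2, -0.3], [0.4, 0.9]])
t_true = np.array([5.0, -2.0])
dst = src @ A_true.T + t_true + rng.normal(0, 0.05, src.shape)  # noisy transform

# Solve [x y 1] * P = x'  for the 3x2 parameter matrix P in a least-squares sense
X = np.hstack([src, np.ones((len(src), 1))])
P, *_ = np.linalg.lstsq(X, dst, rcond=None)
A_est, t_est = P[:2].T, P[2]
print("estimated A:\n", np.round(A_est, 3))
print("estimated t:", np.round(t_est, 3))
```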

  17. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPASNBio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNamara, A; Held, K; Paganetti, H

    2016-06-15

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron-scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprised of nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex simulations.

  18. The nursing human resource planning best practice toolkit: creating a best practice resource for nursing managers.

    PubMed

    Vincent, Leslie; Beduz, Mary Agnes

    2010-05-01

    Evidence of acute nursing shortages in urban hospitals has been surfacing since 2000. Further, new graduate nurses account for more than 50% of total nurse turnover in some hospitals and between 35% and 60% of new graduates change workplace during the first year. Critical to organizational success, first line nurse managers must have the knowledge and skills to ensure the accurate projection of nursing resource requirements and to develop proactive recruitment and retention programs that are effective, promote positive nursing socialization, and provide early exposure to the clinical setting. The Nursing Human Resource Planning Best Practice Toolkit project supported the creation of a network of teaching and community hospitals to develop a best practice toolkit in nursing human resource planning targeted at first line nursing managers. The toolkit includes the development of a framework including the conceptual building blocks of planning tools, manager interventions, retention and recruitment and professional practice models. The development of the toolkit involved conducting a review of the literature for best practices in nursing human resource planning, using a mixed method approach to data collection including a survey and extensive interviews of managers and completing a comprehensive scan of human resource practices in the participating organizations. This paper will provide an overview of the process used to develop the toolkit, a description of the toolkit contents and a reflection on the outcomes of the project.

  19. A proposal for a spiritual care assessment toolkit for religious volunteers and volunteer service users.

    PubMed

    Liu, Yi-Jung

    2014-10-01

    Based on the idea that volunteer services in healthcare settings should focus on service users' best interests and provide holistic care for the body, mind, and spirit, the aim of this study was to propose an assessment toolkit for assessing the effectiveness of religious volunteers and improving their service. By analyzing and categorizing the results of previous studies, we incorporated effective care goals and methods into the proposed religious and spiritual care assessment toolkit. Two versions of the toolkit were created. The service users' version comprises 10 questions grouped into the following five dimensions: "physical care," "psychological and emotional support," "social relationships," "religious and spiritual care," and "hope restoration." Each question can be answered with either "yes" or "no". The volunteers' version contains 14 specific care goals and 31 care methods, in addition to the 10 care dimensions in the residents' version. A small sample of 25 experts was asked to judge the usefulness of each of the toolkit items for evaluating volunteers' effectiveness. Although some experts questioned volunteers' capacity, improving the capacity and effectiveness of the spiritual care provided by volunteers is the main purpose of developing this assessment toolkit. The toolkit developed in this study may not be applicable to other countries and addresses only patients' general spiritual needs.

  20. A flexible open-source toolkit for lava flow simulations

    NASA Astrophysics Data System (ADS)

    Mossoux, Sophie; Feltz, Adelin; Poppe, Sam; Canters, Frank; Kervyn, Matthieu

    2014-05-01

    Lava flow hazard modeling is a useful tool for scientists and stakeholders confronted with imminent or long term hazard from basaltic volcanoes. It can improve their understanding of the spatial distribution of volcanic hazard, influence their land use decisions and improve the city evacuation during a volcanic crisis. Although a range of empirical, stochastic and physically-based lava flow models exists, these models are rarely available or require a large amount of physical constraints. We present a GIS toolkit which models lava flow propagation from one or multiple eruptive vents, defined interactively on a Digital Elevation Model (DEM). It combines existing probabilistic (VORIS) and deterministic (FLOWGO) models in order to improve the simulation of lava flow spatial spread and terminal length. Not only is this toolkit open-source, running in Python, which allows users to adapt the code to their needs, but it also allows users to combine the models included in different ways. The lava flow paths are determined based on the probabilistic steepest slope (VORIS model - Felpeto et al., 2001) which can be constrained in order to favour concentrated or dispersed flow fields. Moreover, the toolkit allows including a corrective factor in order for the lava to overcome small topographical obstacles or pits. The lava flow terminal length can be constrained using a fixed length value, a Gaussian probability density function or can be calculated based on the thermo-rheological properties of the open-channel lava flow (FLOWGO model - Harris and Rowland, 2001). These slope-constrained properties allow estimating the velocity of the flow and its heat losses. The lava flow stops when its velocity is zero or the lava temperature reaches the solidus. Recent lava flows of Karthala volcano (Comoros islands) are here used to demonstrate the quality of lava flow simulations with the toolkit, using a quantitative assessment of the match of the simulation with the real lava flows. The influence of the different input parameters on the quality of the simulations is discussed. REFERENCES: Felpeto et al. (2001), Assessment and modelling of lava flow hazard on Lanzarote (Canary islands), Nat. Hazards, 23, 247-257. Harris and Rowland (2001), FLOWGO: a kinematic thermo-rheological model for lava flowing in a channel, Bull. Volcanol., 63, 20-44.
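
    A minimal sketch of the probabilistic steepest-slope idea behind the flow paths (not the toolkit's own code, which also handles corrective factors and FLOWGO-based run-out lengths): at each cell the flow steps to a random lower neighbour with probability proportional to the local drop. The toy DEM and step limit are arbitrary.

```python
# Minimal sketch of a probabilistic steepest-slope lava path on a toy DEM.
import numpy as np

rng = np.random.default_rng(10)
n = 60
yy, xx = np.mgrid[0:n, 0:n]
dem = 100 - 0.8 * yy + rng.normal(0, 0.3, (n, n))   # toy DEM sloping "south"

def simulate_path(dem, start, max_steps=200):
    path, (r, c) = [start], start
    for _ in range(max_steps):
        neigh = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                 if (dr or dc) and 0 <= r + dr < n and 0 <= c + dc < n]
        drops = np.array([dem[r, c] - dem[i, j] for i, j in neigh])
        drops[drops < 0] = 0.0                      # only downhill moves allowed
        if drops.sum() == 0:                        # stuck in a pit: stop
            break
        probs = drops / drops.sum()
        r, c = neigh[rng.choice(len(neigh), p=probs)]
        path.append((r, c))
    return path

path = simulate_path(dem, start=(1, n // 2))
print(f"flow path of {len(path)} cells, ending at row {path[-1][0]}")
```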

  1. EHDViz: clinical dashboard development using open-source technologies

    PubMed Central

    Badgeley, Marcus A; Shameer, Khader; Glicksberg, Benjamin S; Tomlinson, Max S; Levin, Matthew A; McCormick, Patrick J; Kasarskis, Andrew; Reich, David L; Dudley, Joel T

    2016-01-01

    Objective To design, develop and prototype clinical dashboards to integrate high-frequency health and wellness data streams using interactive and real-time data visualisation and analytics modalities. Materials and methods We developed a clinical dashboard development framework called electronic healthcare data visualization (EHDViz) toolkit for generating web-based, real-time clinical dashboards for visualising heterogeneous biomedical, healthcare and wellness data. The EHDViz is an extensible toolkit that uses R packages for data management, normalisation and producing high-quality visualisations over the web using R/Shiny web server architecture. We have developed use cases to illustrate utility of EHDViz in different scenarios of clinical and wellness setting as a visualisation aid for improving healthcare delivery. Results Using EHDViz, we prototyped clinical dashboards to demonstrate the contextual versatility of EHDViz toolkit. An outpatient cohort was used to visualise population health management tasks (n=14 221), and an inpatient cohort was used to visualise real-time acuity risk in a clinical unit (n=445), and a quantified-self example using wellness data from a fitness activity monitor worn by a single individual was also discussed (n-of-1). The back-end system retrieves relevant data from data source, populates the main panel of the application and integrates user-defined data features in real-time and renders output using modern web browsers. The visualisation elements can be customised using health features, disease names, procedure names or medical codes to populate the visualisations. The source code of EHDViz and various prototypes developed using EHDViz are available in the public domain at http://ehdviz.dudleylab.org. Conclusions Collaborative data visualisations, wellness trend predictions, risk estimation, proactive acuity status monitoring and knowledge of complex disease indicators are essential components of implementing data-driven precision medicine. As an open-source visualisation framework capable of integrating health assessment, EHDViz aims to be a valuable toolkit for rapid design, development and implementation of scalable clinical data visualisation dashboards. PMID:27013597

  2. Toolkit for Evaluating Alignment of Instructional and Assessment Materials to the Common Core State Standards

    ERIC Educational Resources Information Center

    Achieve, Inc., 2014

    2014-01-01

    In joint partnership, Achieve, The Council of Chief State School Officers, and Student Achievement Partners have developed a Toolkit for Evaluating the Alignment of Instructional and Assessment Materials to the Common Core State Standards. The Toolkit is a set of interrelated, freely available instruments for evaluating alignment to the CCSS; each…

  3. A Data Audit and Analysis Toolkit To Support Assessment of the First College Year.

    ERIC Educational Resources Information Center

    Paulson, Karen

    This "toolkit" provides a process by which institutions can identify and use information resources to enhance the experiences and outcomes of first-year students. The toolkit contains a "Technical Manual" designed for use by the technical personnel who will be conducting the data audit and associated analyses. Administrators who want more…

  4. Data visualization and analysis tools for the MAVEN mission

    NASA Astrophysics Data System (ADS)

    Harter, B.; De Wolfe, A. W.; Putnam, B.; Brain, D.; Chaffin, M.

    2016-12-01

    The Mars Atmospheric and Volatile Evolution (MAVEN) mission has been collecting data at Mars since September 2014. We have developed new software tools for exploring and analyzing the science data. Our open-source Python toolkit for working with data from MAVEN and other missions is based on the widely-used "tplot" IDL toolkit. We have replicated all of the basic tplot functionality in Python, and use the bokeh and matplotlib libraries to generate interactive line plots and spectrograms, providing additional functionality beyond the capabilities of IDL graphics. These Python tools are generalized to work with missions beyond MAVEN, and our software is available on Github. We have also been exploring 3D graphics as a way to better visualize the MAVEN science data and models. We have constructed a 3D visualization of MAVEN's orbit using the CesiumJS library, which not only allows viewing of MAVEN's orientation and position, but also allows the display of selected science data sets and their variation over time.
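
    As a generic illustration of the stacked, time-series "tplot-style" panels such toolkits produce (this is plain matplotlib with synthetic data, not the MAVEN toolkit's actual API):

```python
# Generic stacked time-series panels with synthetic data; illustrative only.
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 24, 1000)                         # hours
density = 10 ** (2 + np.sin(2 * np.pi * t / 4.5))    # fake ion density
b_field = 5 + 2 * np.sin(2 * np.pi * t / 4.5 + 1.0)  # fake magnetic field (nT)

fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True, figsize=(8, 5))
ax1.plot(t, density)
ax1.set_yscale("log")
ax1.set_ylabel("density (cm$^{-3}$)")
ax2.plot(t, b_field)
ax2.set_ylabel("|B| (nT)")
ax2.set_xlabel("time (hours)")
fig.tight_layout()
fig.savefig("stacked_panels.png")                    # or plt.show() interactively
```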

  5. A current perspective on availability of tools, resources and networks for veterinary immunology.

    PubMed

    Entrican, Gary; Lunney, Joan K; Rutten, Victor P; Baldwin, Cynthia L

    2009-03-15

    There are many diseases of fish, livestock and companion animals that impact negatively on animal health, welfare and productivity and for which there are no effective vaccines. The development of new vaccines is reliant on the availability of well-characterised immunological tools and reagents to understand host-pathogen interactions and identify protective immune responses. Veterinary immunology has always lagged behind mouse and human immunology in terms of development and availability of tools and reagents. However, several initiatives are underway to address this. The Veterinary Immunology Committee (VIC) Toolkit was initiated 6 years ago at the sixth International Veterinary Immunology Symposium (IVIS) in Uppsala and in the intervening period there have been several notable developments that have advanced reagent development and information exchange. This review will discuss advances in veterinary reagent development, networks, databases and commercial availability with particular reference to the second VIC Toolkit workshop held at the eighth IVIS in Ouro Preto, Brazil on the 15th of August 2007.

  6. Novel Passive Clearing Methods for the Rapid Production of Optical Transparency in Whole CNS Tissue.

    PubMed

    Woo, Jiwon; Lee, Eunice Yoojin; Park, Hyo-Suk; Park, Jeong Yoon; Cho, Yong Eun

    2018-05-08

    Since the development of CLARITY, a bioelectrochemical clearing technique that allows for three-dimensional phenotype mapping within transparent tissues, a multitude of novel clearing methodologies including CUBIC (clear, unobstructed brain imaging cocktails and computational analysis), SWITCH (system-wide control of interaction time and kinetics of chemicals), MAP (magnified analysis of the proteome), and PACT (passive clarity technique), have been established to further expand the existing toolkit for the microscopic analysis of biological tissues. The present study aims to improve upon and optimize the original PACT procedure for an array of intact rodent tissues, including the whole central nervous system (CNS), kidneys, spleen, and whole mouse embryos. Termed psPACT (process-separate PACT) and mPACT (modified PACT), these novel techniques provide highly efficacious means of mapping cell circuitry and visualizing subcellular structures in intact normal and pathological tissues. In the following protocol, we provide a detailed, step-by-step outline on how to achieve maximal tissue clearance with minimal invasion of their structural integrity via psPACT and mPACT.

  7. Development of an Integrated Human Factors Toolkit

    NASA Technical Reports Server (NTRS)

    Resnick, Marc L.

    2003-01-01

    An effective integration of human abilities and limitations is crucial to the success of all NASA missions. The Integrated Human Factors Toolkit facilitates this integration by assisting system designers and analysts to select the human factors tools that are most appropriate for the needs of each project. The HF Toolkit contains information about a broad variety of human factors tools addressing human requirements in the physical, information processing and human reliability domains. Analysis of each tool includes consideration of the most appropriate design stage, the amount of expertise in human factors that is required, the amount of experience with the tool and the target job tasks that are needed, and other factors that are critical for successful use of the tool. The benefits of the Toolkit include improved safety, reliability and effectiveness of NASA systems throughout the agency. This report outlines the initial stages of development for the Integrated Human Factors Toolkit.

  8. Field tests of a participatory ergonomics toolkit for Total Worker Health

    PubMed Central

    Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2018-01-01

    Growing interest in Total Worker Health® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and team-work skills of participants. PMID:28166897

  9. Fall TIPS: strategies to promote adoption and use of a fall prevention toolkit.

    PubMed

    Dykes, Patricia C; Carroll, Diane L; Hurley, Ann; Gersh-Zaremski, Ronna; Kennedy, Ann; Kurowski, Jan; Tierney, Kim; Benoit, Angela; Chang, Frank; Lipsitz, Stuart; Pang, Justine; Tsurkova, Ruslana; Zuyov, Lyubov; Middleton, Blackford

    2009-11-14

    Patient falls are serious problems in hospitals. Risk factors for falls are well understood and nurses routinely assess for fall risk on all hospitalized patients. However, the link from nursing assessment of fall risk to identification and communication of tailored interventions to prevent falls is yet to be established. The Fall TIPS (Tailoring Interventions for Patient Safety) Toolkit was developed to leverage existing practices and workflows and to employ information technology to improve fall prevention practices. The purpose of this paper is to describe the Fall TIPS Toolkit and to report on strategies used to drive adoption of the Toolkit in four acute care hospitals. Using the IHI "Framework for Spread" as a conceptual model, the research team describes the "spread" of the Fall TIPS Toolkit as a means to integrate effective fall prevention practices into the workflow of interdisciplinary caregivers, patients and family members.

  10. Next-generation analysis of cataracts: determining knowledge driven gene-gene interactions using Biofilter, and gene-environment interactions using the PhenX Toolkit.

    PubMed

    Pendergrass, Sarah A; Verma, Shefali S; Holzinger, Emily R; Moore, Carrie B; Wallace, John; Dudek, Scott M; Huggins, Wayne; Kitchner, Terrie; Waudby, Carol; Berg, Richard; McCarty, Catherine A; Ritchie, Marylyn D

    2013-01-01

    Investigating the association between biobank derived genomic data and the information of linked electronic health records (EHRs) is an emerging area of research for dissecting the architecture of complex human traits, where cases and controls for study are defined through the use of electronic phenotyping algorithms deployed in large EHR systems. For our study, 2580 cataract cases and 1367 controls were identified within the Marshfield Personalized Medicine Research Project (PMRP) Biobank and linked EHR, which is a member of the NHGRI-funded electronic Medical Records and Genomics (eMERGE) Network. Our goal was to explore potential gene-gene and gene-environment interactions within these data for 529,431 single nucleotide polymorphisms (SNPs) with minor allele frequency > 1%, in order to explore higher level associations with cataract risk beyond investigations of single SNP-phenotype associations. To build our SNP-SNP interaction models we utilized a prior-knowledge driven filtering method called Biofilter to minimize the multiple testing burden of exploring the vast array of interaction models possible from our extensive number of SNPs. Using the Biofilter, we developed 57,376 prior-knowledge directed SNP-SNP models to test for association with cataract status. We selected models that required 6 sources of external domain knowledge. We identified 5 statistically significant models with an interaction term with p-value < 0.05, as well as an overall model with p-value < 0.05 associated with cataract status. We also conducted gene-environment interaction analyses for all GWAS SNPs and a set of environmental factors from the PhenX Toolkit: smoking, UV exposure, and alcohol use; these environmental factors have been previously associated with the formation of cataracts. We found a total of 288 models that exhibit an interaction term with a p-value ≤ 1×10⁻⁴ associated with cataract status. Our results show these approaches enable advanced searches for epistasis and gene-environment interactions beyond GWAS, and that the EHR based approach provides an additional source of data for seeking these advanced explanatory models of the etiology of complex disease/outcome such as cataracts.
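
    For a sense of what testing one Biofilter-generated candidate pair looks like in practice, here is a minimal, hedged sketch: a logistic regression with a SNP-SNP interaction term fitted with statsmodels on simulated genotypes. The genotype coding, effect sizes, and sample construction are assumptions, not the study's actual pipeline.

    ```python
    # Sketch of a single SNP-SNP interaction test for a binary outcome (case/control),
    # of the kind run for each Biofilter-generated candidate pair. Genotypes here are
    # simulated 0/1/2 minor-allele counts; a real analysis reads them from biobank data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 3947  # roughly the combined case/control sample size in the abstract
    df = pd.DataFrame({
        "snp1": rng.binomial(2, 0.3, n),   # additive coding: 0, 1, or 2 minor alleles
        "snp2": rng.binomial(2, 0.2, n),
    })
    logit_p = -0.7 + 0.1 * df.snp1 + 0.1 * df.snp2 + 0.15 * df.snp1 * df.snp2
    df["case"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    # 'snp1 * snp2' expands to both main effects plus the snp1:snp2 interaction term.
    fit = smf.logit("case ~ snp1 * snp2", data=df).fit(disp=False)
    print(fit.summary())
    print("interaction p-value:", fit.pvalues["snp1:snp2"])
    ```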

  11. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    PubMed Central

    Abbasi, Arash; Berry, Jeffrey C.; Callen, Steven T.; Chavez, Leonardo; Doust, Andrew N.; Feldman, Max J.; Gilbert, Kerrigan B.; Hodge, John G.; Hoyer, J. Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony

    2017-01-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning. PMID:29209576
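
    The core step such a toolkit automates is segmenting plant from background and measuring traits from the resulting mask. The snippet below is a generic numpy illustration of that idea (an excess-green index plus a fixed threshold on a synthetic image); it does not use PlantCV's own API, and the calibration value is made up.

    ```python
    # Generic illustration of the segment-then-measure step at the heart of plant
    # phenotyping pipelines: compute an excess-green index, threshold it into a plant
    # mask, and report projected area. Plain numpy, not the PlantCV API; the synthetic
    # image stands in for a top-view RGB photo.
    import numpy as np

    rng = np.random.default_rng(1)
    h, w = 200, 200
    grey = rng.uniform(0.4, 0.6, size=(h, w))            # greyish soil background
    img = np.stack([grey, grey, grey], axis=-1)          # equal R, G, B off the plant
    yy, xx = np.mgrid[0:h, 0:w]
    plant = (yy - 100) ** 2 + (xx - 100) ** 2 < 40 ** 2  # circular "plant" region
    img[plant] = [0.2, 0.6, 0.2]                         # make it green

    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    exg = 2 * g - r - b                                  # excess-green index
    mask = exg > 0.2                                     # simple fixed threshold

    pixel_area_mm2 = 0.25                                # assumed camera calibration
    print("plant pixels:", int(mask.sum()))
    print("projected area (mm^2):", mask.sum() * pixel_area_mm2)
    ```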

  12. PlantCV v2: Image analysis software for high-throughput plant phenotyping.

    PubMed

    Gehan, Malia A; Fahlgren, Noah; Abbasi, Arash; Berry, Jeffrey C; Callen, Steven T; Chavez, Leonardo; Doust, Andrew N; Feldman, Max J; Gilbert, Kerrigan B; Hodge, John G; Hoyer, J Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony

    2017-01-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  13. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here in this paper we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  14. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    DOE PAGES

    Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash; ...

    2017-12-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here in this paper we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  15. Building Emergency Contraception Awareness among Adolescents. A Toolkit for Schools and Community-Based Organizations.

    ERIC Educational Resources Information Center

    Simkin, Linda; Radosh, Alice; Nelsesteun, Kari; Silverstein, Stacy

    This toolkit presents emergency contraception (EC) as a method to help adolescent women avoid pregnancy and abortion after unprotected sexual intercourse. The sections of this toolkit are designed to help increase your knowledge of EC and stay up to date. They provide suggestions for increasing EC awareness in the workplace, whether it is a school…

  16. The Customer Flow Toolkit: A Framework for Designing High Quality Customer Services.

    ERIC Educational Resources Information Center

    New York Association of Training and Employment Professionals, Albany.

    This document presents a toolkit to assist staff involved in the design and development of New York's one-stop system. Section 1 describes the preplanning issues to be addressed and the intended outcomes that serve as the framework for creation of the customer flow toolkit. Section 2 outlines the following strategies to assist in designing local…

  17. HIV Prevention in Schools: A Tool Kit for Education Leaders.

    ERIC Educational Resources Information Center

    Office of the Surgeon General (DHHS/PHS), Washington, DC.

    This packet of materials is Phase 1 of a toolkit designed to enlighten education leaders about the need for HIV prevention for youth, especially in communities of color. One element of the toolkit is a VHS videotape that features a brief message from former Surgeon General, Dr. David Satcher. The toolkit also includes a copy of a letter sent to…

  18. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part I: Building an Understanding of Family and Community Engagement

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2014

    2014-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  19. Toolkit for a Workshop on Building a Culture of Data Use. REL 2015-063

    ERIC Educational Resources Information Center

    Gerzon, Nancy; Guckenburg, Sarah

    2015-01-01

    The Culture of Data Use Workshop Toolkit helps school and district teams apply research to practice as they establish and support a culture of data use in their educational setting. The field-tested workshop toolkit guides teams through a set of structured activities to develop an understanding of data-use research in schools and to analyze…

  20. Implementing Project Based Survey Research Skills to Grade Six ELP Students with "The Survey Toolkit" and "TinkerPlots"[R]

    ERIC Educational Resources Information Center

    Walsh, Thomas, Jr.

    2011-01-01

    "Survey Toolkit Collecting Information, Analyzing Data and Writing Reports" (Walsh, 2009a) is discussed as a survey research curriculum used by the author's sixth grade students. The report describes the implementation of "The Survey Toolkit" curriculum and "TinkerPlots"[R] software to provide instruction to students learning a project based…

  1. Overview and Meteorological Validation of the Wind Integration National Dataset toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draxl, C.; Hodge, B. M.; Clifton, A.

    2015-04-13

    The Wind Integration National Dataset (WIND) Toolkit described in this report fulfills these requirements, and constitutes a state-of-the-art national wind resource data set covering the contiguous United States from 2007 to 2013 for use in a variety of next-generation wind integration analyses and wind power planning. The toolkit is a wind resource data set, wind forecast data set, and wind power production and forecast data set derived from the Weather Research and Forecasting (WRF) numerical weather prediction model. WIND Toolkit data are available online for over 116,000 land-based and 10,000 offshore sites representing existing and potential wind facilities.

  2. PsyToolkit: a software package for programming psychological experiments using Linux.

    PubMed

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the Gnu Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.
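
    PsyToolkit scripts its experiments in its own high-level language; purely as a hedged, language-neutral sketch of what a stimulus-response compatibility trial loop records (condition, correctness, reaction time), the Python below simulates the participant and the timing rather than driving real response hardware.

    ```python
    # Rough sketch of the data a stimulus-response compatibility (SRC) trial loop
    # produces: per-trial condition, correctness, and reaction time. Responses and RTs
    # are simulated here; PsyToolkit itself scripts this in its own language with
    # millisecond timing against real keyboards or response boxes.
    import random
    import time

    def run_trial(stimulus_side, required_response_side):
        """Present a (notional) stimulus and time the simulated response."""
        t_start = time.perf_counter()
        compatible = stimulus_side == required_response_side
        # Simulated participant: incompatible trials are ~50 ms slower on average.
        simulated_rt = random.gauss(0.45 if compatible else 0.50, 0.05)
        time.sleep(max(simulated_rt, 0.0))
        rt_ms = (time.perf_counter() - t_start) * 1000.0
        correct = random.random() > 0.05       # 5% simulated error rate
        return compatible, correct, rt_ms

    random.seed(0)
    for trial in range(8):
        stim = random.choice(["left", "right"])
        resp = stim if random.random() < 0.5 else ("right" if stim == "left" else "left")
        compatible, correct, rt = run_trial(stim, resp)
        print(f"trial {trial}: compatible={compatible} correct={correct} rt={rt:.0f} ms")
    ```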

  3. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    NASA Astrophysics Data System (ADS)

    Rit, S.; Vila Oliva, M.; Brousmiche, S.; Labarbe, R.; Sarrut, D.; Sharp, G. C.

    2014-03-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is daily checked with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-Davis-Kress reconstruction on CPU and GPU, Parker short scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is either accessible through command line tools or C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.
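
    One of the steps named above, Feldkamp-Davis-Kress reconstruction, begins by cosine-weighting each cone-beam projection before ramp filtering and backprojection. The sketch below shows only that weighting step in numpy, with an assumed detector geometry and a random stand-in projection; it does not call RTK's C++ classes or command-line tools.

    ```python
    # Minimal sketch of the cosine pre-weighting used in FDK-style cone-beam
    # reconstruction: each detector pixel (u, v) is scaled by D / sqrt(D^2 + u^2 + v^2),
    # where D is the source-to-detector distance and (u, v) are detector-plane
    # coordinates, before the projection is ramp-filtered and backprojected. The
    # geometry values and the synthetic projection are assumptions, not RTK defaults.
    import numpy as np

    D = 1000.0                                # source-to-detector distance (mm), assumed
    nu, nv = 512, 384                         # detector size in pixels
    du = dv = 0.8                             # pixel pitch (mm), assumed

    u = (np.arange(nu) - nu / 2 + 0.5) * du   # detector coordinates (mm)
    v = (np.arange(nv) - nv / 2 + 0.5) * dv
    U, V = np.meshgrid(u, v)                  # shape (nv, nu)

    weight = D / np.sqrt(D ** 2 + U ** 2 + V ** 2)

    projection = np.random.rand(nv, nu)       # stand-in for a measured projection
    weighted = projection * weight            # ready for ramp filtering + backprojection

    print("weight at detector centre:", weight[nv // 2, nu // 2])
    print("weight at detector corner:", weight[0, 0])
    ```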

  4. Development of an Online Toolkit for Measuring Performance in Health Emergency Response Exercises.

    PubMed

    Agboola, Foluso; Bernard, Dorothy; Savoia, Elena; Biddinger, Paul D

    2015-10-01

    Exercises that simulate emergency scenarios are accepted widely as an essential component of a robust Emergency Preparedness program. Unfortunately, the variability in the quality of the exercises conducted, and the lack of standardized processes to measure performance, has limited the value of exercises in measuring preparedness. In order to help health organizations improve the quality and standardization of the performance data they collect during simulated emergencies, a model online exercise evaluation toolkit was developed using performance measures tested in over 60 Emergency Preparedness exercises. The exercise evaluation toolkit contains three major components: (1) a database of measures that can be used to assess performance during an emergency response exercise; (2) a standardized data collection tool (form); and (3) a program that populates the data collection tool with the measures that have been selected by the user from the database. The evaluation toolkit was pilot tested from January through September 2014 in collaboration with 14 partnering organizations representing 10 public health agencies and four health care agencies from eight states across the US. Exercise planners from the partnering organizations were asked to use the toolkit for their exercise evaluation process and were interviewed to provide feedback on the use of the toolkit, the generated evaluation tool, and the usefulness of the data being gathered for the development of the exercise after-action report. Ninety-three percent (93%) of exercise planners reported that they found the online database of performance measures appropriate for the creation of exercise evaluation forms, and they stated that they would use it again for future exercises. Seventy-two percent (72%) liked the exercise evaluation form that was generated from the toolkit, and 93% reported that the data collected by the use of the evaluation form were useful in gauging their organization's performance during the exercise. Seventy-nine percent (79%) of exercise planners preferred the evaluation form generated by the toolkit to other forms of evaluations. Results of this project show that users found the newly developed toolkit to be user friendly and more relevant to measurement of specific public health and health care capabilities than other tools currently available. The developed toolkit may contribute to the further advancement of developing a valid approach to exercise performance measurement.

  5. Visualizing relativity: The OpenRelativity project

    NASA Astrophysics Data System (ADS)

    Sherin, Zachary W.; Cheu, Ryan; Tan, Philip; Kortemeyer, Gerd

    2016-05-01

    We present OpenRelativity, an open-source toolkit to simulate effects of special relativity within the popular Unity game engine. Intended for game developers, educators, and anyone interested in physics, OpenRelativity can help people create, test, and share experiments to explore the effects of special relativity. We describe the underlying physics and some of the implementation details of this toolset with the hope that engaging games and interactive relativistic "laboratory" experiments might be implemented.
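
    Two formulas drive most of the visual effects such a toolset renders: relativistic aberration (the "headlight" bunching of incoming light toward the direction of motion) and the relativistic Doppler shift. The hedged numpy sketch below evaluates both for an observer moving at an arbitrary example speed; it is independent of OpenRelativity's Unity shaders.

    ```python
    # Sketch of the two formulas behind relativistic visual effects of the kind
    # OpenRelativity renders: aberration and the relativistic Doppler shift. Angles
    # are measured from the observer's direction of motion; the speed is an example.
    import numpy as np

    def aberrate(theta_scene, beta):
        """Apparent angle in the moving observer's frame for light arriving at
        angle theta_scene (radians) in the scene frame."""
        c = np.cos(theta_scene)
        return np.arccos((c + beta) / (1 + beta * c))

    def doppler_factor(theta_scene, beta):
        """Ratio of observed to emitted frequency for the same geometry."""
        gamma = 1.0 / np.sqrt(1.0 - beta ** 2)
        return gamma * (1 + beta * np.cos(theta_scene))

    beta = 0.9                                 # observer moving at 0.9 c
    for deg in (0, 45, 90, 135, 180):
        th = np.radians(deg)
        print(f"scene angle {deg:3d} deg -> apparent {np.degrees(aberrate(th, beta)):6.1f} deg,"
              f" frequency x {doppler_factor(th, beta):.2f}")
    ```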

  6. Techniques in helical scanning, dynamic imaging and image segmentation for improved quantitative analysis with X-ray micro-CT

    NASA Astrophysics Data System (ADS)

    Sheppard, Adrian; Latham, Shane; Middleton, Jill; Kingston, Andrew; Myers, Glenn; Varslot, Trond; Fogden, Andrew; Sawkins, Tim; Cruikshank, Ron; Saadatfar, Mohammad; Francois, Nicolas; Arns, Christoph; Senden, Tim

    2014-04-01

    This paper reports on recent advances at the micro-computed tomography facility at the Australian National University. Since 2000 this facility has been a significant centre for developments in imaging hardware and associated software for image reconstruction, image analysis and image-based modelling. In 2010 a new instrument was constructed that utilises theoretically-exact image reconstruction based on helical scanning trajectories, allowing higher cone angles and thus better utilisation of the available X-ray flux. We discuss the technical hurdles that needed to be overcome to allow imaging with cone angles in excess of 60°. We also present dynamic tomography algorithms that enable the changes between one moment and the next to be reconstructed from a sparse set of projections, allowing higher speed imaging of time-varying samples. Researchers at the facility have also created a sizeable distributed-memory image analysis toolkit with capabilities ranging from tomographic image reconstruction to 3D shape characterisation. We show results from image registration and present some of the new imaging and experimental techniques that it enables. Finally, we discuss the crucial question of image segmentation and evaluate some recently proposed techniques for automated segmentation.

  7. More IMPATIENT: A Gridding-Accelerated Toeplitz-based Strategy for Non-Cartesian High-Resolution 3D MRI on GPUs

    PubMed Central

    Gai, Jiading; Obeid, Nady; Holtrop, Joseph L.; Wu, Xiao-Long; Lam, Fan; Fu, Maojing; Haldar, Justin P.; Hwu, Wen-mei W.; Liang, Zhi-Pei; Sutton, Bradley P.

    2013-01-01

    Several recent methods have been proposed to obtain significant speed-ups in MRI image reconstruction by leveraging the computational power of GPUs. Previously, we implemented a GPU-based image reconstruction technique called the Illinois Massively Parallel Acquisition Toolkit for Image reconstruction with ENhanced Throughput in MRI (IMPATIENT MRI) for reconstructing data collected along arbitrary 3D trajectories. In this paper, we improve IMPATIENT by removing computational bottlenecks by using a gridding approach to accelerate the computation of various data structures needed by the previous routine. Further, we enhance the routine with capabilities for off-resonance correction and multi-sensor parallel imaging reconstruction. Through implementation of optimized gridding into our iterative reconstruction scheme, speed-ups of more than a factor of 200 are provided in the improved GPU implementation compared to the previous accelerated GPU code. PMID:23682203
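
    The gridding idea referred to above is to resample non-Cartesian k-space samples onto a Cartesian grid so that a fast Fourier transform can form the image. The toy sketch below uses nearest-neighbour deposition and a synthetic trajectory purely for illustration; IMPATIENT's actual implementation uses GPU Kaiser-Bessel convolution gridding together with density compensation, off-resonance correction, and parallel imaging.

    ```python
    # Toy illustration of gridding: deposit non-Cartesian k-space samples onto a
    # Cartesian grid so the image can be formed with an inverse FFT. Real gridding
    # convolves with a Kaiser-Bessel kernel and applies proper density compensation;
    # nearest-neighbour deposition is used here only to keep the sketch short. The
    # trajectory and data are synthetic.
    import numpy as np

    N = 128                                       # Cartesian grid size
    n_samples = 20000
    rng = np.random.default_rng(2)

    # Synthetic non-Cartesian trajectory: k-space coordinates in cycles/FOV, |k| < N/2.
    radius = (N / 2) * np.sqrt(rng.uniform(0, 1, n_samples))
    angle = rng.uniform(0, 2 * np.pi, n_samples)
    kx, ky = radius * np.cos(angle), radius * np.sin(angle)
    data = np.exp(-(radius / (N / 8)) ** 2).astype(complex)   # fake k-space signal

    grid = np.zeros((N, N), dtype=complex)
    counts = np.zeros((N, N))
    ix = np.clip(np.round(kx + N / 2).astype(int), 0, N - 1)  # nearest grid cell
    iy = np.clip(np.round(ky + N / 2).astype(int), 0, N - 1)
    np.add.at(grid, (iy, ix), data)                           # accumulate samples
    np.add.at(counts, (iy, ix), 1.0)                          # samples per cell
    grid[counts > 0] /= counts[counts > 0]                    # crude density compensation

    image = np.abs(np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(grid))))
    print("reconstructed image shape:", image.shape, "peak:", image.max())
    ```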

  8. Facilitated family presence at resuscitation: effectiveness of a nursing student toolkit.

    PubMed

    Kantrowitz-Gordon, Ira; Bennett, Deborah; Wise Stauffer, Debra; Champ-Gibson, Erla; Fitzgerald, Cynthia; Corbett, Cynthia

    2013-10-01

    Facilitated family presence at resuscitation is endorsed by multiple nursing and specialty practice organizations. Implementation of this practice is not universal so there is a need to increase familiarity and competence with facilitated family presence at resuscitation during this significant life event. One strategy to promote this practice is to use a nursing student toolkit for pre-licensure and graduate nursing students. The toolkit includes short video simulations of facilitated family presence at resuscitation, a PowerPoint presentation of evidence-based practice, and questions to facilitate guided discussion. This study tested the effectiveness of this toolkit in increasing nursing students' knowledge, perceptions, and confidence in facilitated family presence at resuscitation. Nursing students from five universities in the United States completed the Family Presence Risk-Benefit Scale, Family Presence Self-Confidence Scale, and a knowledge test before and after the intervention. Implementing the facilitated family presence at resuscitation toolkit significantly increased nursing students' knowledge, perceptions, and confidence related to facilitated family presence at resuscitation (p<.001). The effect size was large for knowledge (d=.90) and perceptions (d=1.04) and moderate for confidence (d=.51). The facilitated family presence at resuscitation toolkit used in this study had a positive impact on students' knowledge, perception of benefits and risks, and self-confidence in facilitated family presence at resuscitation. The toolkit provides students a structured opportunity to consider the presence of family members at resuscitation prior to encountering this situation in clinical practice. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Behavioral Genetic Toolkits: Toward the Evolutionary Origins of Complex Phenotypes.

    PubMed

    Rittschof, C C; Robinson, G E

    2016-01-01

    The discovery of toolkit genes, which are highly conserved genes that consistently regulate the development of similar morphological phenotypes across diverse species, is one of the most well-known observations in the field of evolutionary developmental biology. Surprisingly, this phenomenon is also relevant for a wide array of behavioral phenotypes, despite the fact that these phenotypes are highly complex and regulated by many genes operating in diverse tissues. In this chapter, we review the use of the toolkit concept in the context of behavior, noting the challenges of comparing behaviors and genes across diverse species, but emphasizing the successes in identifying genetic toolkits for behavior; these successes are largely attributable to the creative research approaches fueled by advances in behavioral genomics. We have two general goals: (1) to acknowledge the groundbreaking progress in this field, which offers new approaches to the difficult but exciting challenge of understanding the evolutionary genetic basis of behaviors, some of the most complex phenotypes known, and (2) to provide a theoretical framework that encompasses the scope of behavioral genetic toolkit studies in order to clearly articulate the research questions relevant to the toolkit concept. We emphasize areas for growth and highlight the emerging approaches that are being used to drive the field forward. Behavioral genetic toolkit research has elevated the use of integrative and comparative approaches in the study of behavior, with potentially broad implications for evolutionary biologists and behavioral ecologists alike. © 2016 Elsevier Inc. All rights reserved.

  10. PANDA: a pipeline toolbox for analyzing brain diffusion images

    PubMed Central

    Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang

    2013-01-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named “Pipeline for Analyzing braiN Diffusion imAges” (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies. PMID:23439846
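
    The diffusion metrics PANDA outputs, such as MD and FA, are simple per-voxel functions of the fitted diffusion tensor's eigenvalues. The sketch below shows that calculation in numpy with made-up eigenvalues; it is not PANDA code, which wraps established packages such as FSL and the Diffusion Toolkit.

    ```python
    # How the two most common diffusion metrics follow from the eigenvalues of the
    # fitted diffusion tensor in each voxel: mean diffusivity (MD) is the average
    # eigenvalue, fractional anisotropy (FA) measures how far the eigenvalues are
    # from isotropy. The eigenvalues below are made-up values in mm^2/s.
    import numpy as np

    def md_fa(eigvals):
        lam = np.asarray(eigvals, dtype=float)
        md = lam.mean()
        fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
        return md, fa

    md, fa = md_fa([1.7e-3, 0.3e-3, 0.2e-3])           # elongated tensor -> high FA
    print(f"MD = {md:.2e} mm^2/s, FA = {fa:.2f}")

    md_iso, fa_iso = md_fa([0.8e-3, 0.8e-3, 0.8e-3])   # isotropic tensor -> FA = 0
    print(f"MD = {md_iso:.2e} mm^2/s, FA = {fa_iso:.2f}")
    ```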

  11. Wind Integration National Dataset (WIND) Toolkit; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draxl, Caroline; Hodge, Bri-Mathias

    A webinar about the Wind Integration National Dataset (WIND) Toolkit was presented by Bri-Mathias Hodge and Caroline Draxl on July 14, 2015. It was hosted by the Southern Alliance for Clean Energy. The toolkit is a grid integration data set that contains meteorological and power data at a 5-minute resolution across the continental United States for 7 years and hourly power forecasts.

  12. Toolkit of Resources for Engaging Families and the Community as Partners in Education: Part 2: Building a Cultural Bridge. REL 2016-151

    ERIC Educational Resources Information Center

    Garcia, Maria Elena; Frunzi, Kay; Dean, Ceri B.; Flores, Nieves; Miller, Kirsten B.

    2016-01-01

    The Toolkit of Resources for Engaging Families and the Community as Partners in Education is a four-part resource that brings together research, promising practices, and useful tools and resources to guide educators in strengthening partnerships with families and community members to support student learning. The toolkit defines family and…

  13. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part 3: Building Trusting Relationships with Families & Community through Effective Communication

    ERIC Educational Resources Information Center

    Regional Educational Laboratory Pacific, 2015

    2015-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  14. Designing and evaluating an interprofessional shared decision-making and goal-setting decision aid for patients with diabetes in clinical care--systematic decision aid development and study protocol.

    PubMed

    Yu, Catherine H; Stacey, Dawn; Sale, Joanna; Hall, Susan; Kaplan, David M; Ivers, Noah; Rezmovitz, Jeremy; Leung, Fok-Han; Shah, Baiju R; Straus, Sharon E

    2014-01-22

    Care of patients with diabetes often occurs in the context of other chronic illness. Competing disease priorities and competing patient-physician priorities present challenges in the provision of care for the complex patient. Guideline implementation interventions to date do not acknowledge these intricacies of clinical practice. As a result, patients and providers are left overwhelmed and paralyzed by the sheer volume of recommendations and tasks. An individualized approach to the patient with diabetes and multiple comorbid conditions using shared decision-making (SDM) and goal setting has been advocated as a patient-centred approach that may facilitate prioritization of treatment options. Furthermore, incorporating interprofessional integration into practice may overcome barriers to implementation. However, these strategies have not been taken up extensively in clinical practice. To systematically develop and test an interprofessional SDM and goal-setting toolkit for patients with diabetes and other chronic diseases, following the Knowledge to Action framework. 1. Feasibility study: Individual interviews with primary care physicians, nurses, dietitians, pharmacists, and patients with diabetes will be conducted, exploring their experiences with shared decision-making and priority-setting, including facilitators and barriers, the relevance of a decision aid and toolkit for priority-setting, and how best to integrate it into practice. 2. Toolkit development: Based on this data, an evidence-based multi-component SDM toolkit will be developed. The toolkit will be reviewed by content experts (primary care, endocrinology, geriatricians, nurses, dietitians, pharmacists, patients) for accuracy and comprehensiveness. 3. Heuristic evaluation: A human factors engineer will review the toolkit and identify, list and categorize usability issues by severity. 4. Usability testing: This will be done using cognitive task analysis. 5. Iterative refinement: Throughout the development process, the toolkit will be refined through several iterative cycles of feedback and redesign. Interprofessional shared decision-making regarding priority-setting with the use of a decision aid toolkit may help prioritize care of individuals with multiple comorbid conditions. Adhering to principles of user-centered design, we will develop and refine a toolkit to assess the feasibility of this approach.

  15. Designing and evaluating an interprofessional shared decision-making and goal-setting decision aid for patients with diabetes in clinical care - systematic decision aid development and study protocol

    PubMed Central

    2014-01-01

    Background Care of patients with diabetes often occurs in the context of other chronic illness. Competing disease priorities and competing patient-physician priorities present challenges in the provision of care for the complex patient. Guideline implementation interventions to date do not acknowledge these intricacies of clinical practice. As a result, patients and providers are left overwhelmed and paralyzed by the sheer volume of recommendations and tasks. An individualized approach to the patient with diabetes and multiple comorbid conditions using shared decision-making (SDM) and goal setting has been advocated as a patient-centred approach that may facilitate prioritization of treatment options. Furthermore, incorporating interprofessional integration into practice may overcome barriers to implementation. However, these strategies have not been taken up extensively in clinical practice. Objectives To systematically develop and test an interprofessional SDM and goal-setting toolkit for patients with diabetes and other chronic diseases, following the Knowledge to Action framework. Methods 1. Feasibility study: Individual interviews with primary care physicians, nurses, dietitians, pharmacists, and patients with diabetes will be conducted, exploring their experiences with shared decision-making and priority-setting, including facilitators and barriers, the relevance of a decision aid and toolkit for priority-setting, and how best to integrate it into practice. 2. Toolkit development: Based on this data, an evidence-based multi-component SDM toolkit will be developed. The toolkit will be reviewed by content experts (primary care, endocrinology, geriatricians, nurses, dietitians, pharmacists, patients) for accuracy and comprehensiveness. 3. Heuristic evaluation: A human factors engineer will review the toolkit and identify, list and categorize usability issues by severity. 4. Usability testing: This will be done using cognitive task analysis. 5. Iterative refinement: Throughout the development process, the toolkit will be refined through several iterative cycles of feedback and redesign. Discussion Interprofessional shared decision-making regarding priority-setting with the use of a decision aid toolkit may help prioritize care of individuals with multiple comorbid conditions. Adhering to principles of user-centered design, we will develop and refine a toolkit to assess the feasibility of this approach. PMID:24450385

  16. Estimation of water quality parameters of inland and coastal waters with the use of a toolkit for processing of remote sensing data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dekker, A.G.; Hoogenboom, H.J.; Rijkeboer, M.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air/water interface correction, and application of water quality algorithms. A prototype software environment has recently been developed that enables the user to perform and control these processing steps. Main parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code for removing atmospheric and air-water interface influences, (ii) a tool for analyzing algorithms for estimating water quality, and (iii) a spectral database containing apparent and inherent optical properties and associated water quality parameters. The use of the software is illustrated by applying implemented algorithms for estimating chlorophyll to data from a spectral library of Dutch inland waters with CHL ranging from 1 to 500 µg l⁻¹. The algorithms currently implemented in the Toolkit software are recommended for optically simple waters, but for optically complex waters development of more advanced retrieval methods is required.
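
    Chlorophyll algorithms for such waters are typically simple functions of a red/near-infrared reflectance ratio calibrated against in-situ data. The sketch below shows only that generic form; the band centres and the coefficients are placeholders, not the Toolkit's calibrated values.

    ```python
    # Generic form of a band-ratio chlorophyll-a algorithm of the kind evaluated
    # against a spectral library: CHL modelled as a linear function of a NIR / red
    # reflectance ratio. The band centres and the coefficients a, b are placeholders.
    import numpy as np

    def chl_from_ratio(r_nir, r_red, a=60.0, b=-45.0):
        """Estimate chlorophyll-a (ug/l) from reflectance at a NIR band (e.g. ~705 nm)
        and a red band (e.g. ~665 nm). a and b are calibration coefficients fitted to
        in-situ data; the values here are illustrative only."""
        return a * (np.asarray(r_nir) / np.asarray(r_red)) + b

    # Fake reflectance pairs for three water samples (dimensionless reflectance).
    r705 = np.array([0.020, 0.035, 0.060])
    r665 = np.array([0.025, 0.030, 0.032])
    print("estimated CHL (ug/l):", np.round(chl_from_ratio(r705, r665), 1))
    ```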

  17. Applications of Earth Remote Sensing for Identifying Tornado and Severe Weather Damage

    NASA Technical Reports Server (NTRS)

    Schultz, Lori; Molthan, Andrew; Burks, Jason E.; Bell, Jordan; McGrath, Kevin; Cole, Tony

    2016-01-01

    NASA SPoRT (Short-term Prediction Research and Transition Center) provided MODIS (Moderate Resolution Imaging Spectroradiometer) and ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) imagery to WFOs (Weather Forecast Offices) in Alabama to support April 27th, 2011 damage assessments across the state. SPoRT was awarded a NASA Applied Science: Disasters Feasibility award to investigate the applicability of including remote sensing imagery and derived products into the NOAA/NWS (National Oceanic and Atmospheric Administration/National Weather Service) Damage Assessment Toolkit (DAT). The proposal team was subsequently awarded the three-year project to implement a web mapping service and associated data feeds from the USGS (U.S. Geological Survey) to provide satellite imagery and derived products directly to the NWS through the DAT. In the United States, NOAA/NWS is charged with performing damage assessments when storm or tornado damage is suspected after a severe weather event. This has led to the development of the Damage Assessment Toolkit (DAT), an application for smartphones, tablets and web browsers that allows for the collection, geo-location, and aggregation of various damage indicators collected during storm surveys.

  18. SU-E-T-161: SOBP Beam Analysis Using Light Output of Scintillation Plate Acquired by CCD Camera.

    PubMed

    Cho, S; Lee, S; Shin, J; Min, B; Chung, K; Shin, D; Lim, Y; Park, S

    2012-06-01

    To analyze the individual Bragg-peak beams within a SOBP (spread-out Bragg-peak) beam using a CCD (charge-coupled device) camera - scintillation screen system. We separated each Bragg-peak beam using the light output of a high-sensitivity scintillation material acquired by the CCD camera and compared the results with Bragg-peak beams calculated by Monte Carlo simulation. In this study, the CCD camera - scintillation screen system was constructed from a high-sensitivity scintillation plate (Gd2O2S:Tb), a right-angled prismatic PMMA phantom, and a Marlin F-201B IEEE-1394 CCD camera. The SOBP beam, delivered in the double-scattering mode of a PROTEUS 235 proton therapy machine at NCC, had a width of 8 cm and a range of 13 g/cm². The gain, dose rate, and current of this beam were 50, 2 Gy/min, and 70 nA, respectively. We also simulated the light output of the scintillation plate for the SOBP beam using the Geant4 toolkit. The light output of the scintillation plate was evaluated for integration times of 0.1-1.0 sec, and images at the shortest integration time (0.1 sec) were acquired both automatically and at random times. Bragg-peak beams in the SOBP beam were analyzed from the acquired images. The SOBP beam used in this study was then calculated with the Geant4 toolkit, and its constituent Bragg-peak beams were extracted with the ROOT program. The SOBP beam consists of 13 Bragg-peak beams. The experimental results were compared with those of the simulation. We analyzed the Bragg-peak beams in the SOBP beam using the light output of the scintillation plate acquired by the CCD camera and compared them with the Geant4 simulation. We plan to study SOBP beam analysis using more effective image acquisition techniques. © 2012 American Association of Physicists in Medicine.
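
    An SOBP is a weighted sum of pristine Bragg peaks of successively shorter range, which is why the 13 constituent peaks can be separated and compared as described. The sketch below illustrates the composition side of that relationship, fitting non-negative weights so the summed dose is flat over the target region; the analytic peak shapes are crude stand-ins, not measured or Geant4-simulated depth-dose curves.

    ```python
    # Sketch of how an SOBP is composed: a weighted sum of pristine Bragg peaks with
    # successively shorter ranges, with weights chosen so the total dose is flat across
    # the modulation width. The Gaussian-peaked curves below are crude stand-ins for
    # measured or simulated depth-dose data.
    import numpy as np
    from scipy.optimize import nnls

    depth = np.linspace(0, 15, 301)              # depth in water (g/cm^2)
    ranges = np.linspace(13.0, 5.0, 13)          # 13 peaks, deepest range 13 g/cm^2

    def pristine_peak(depth, r):
        """Toy depth-dose curve: low entrance plateau plus a narrow peak at range r."""
        plateau = 0.3 * (depth < r)
        peak = np.exp(-((depth - r) ** 2) / (2 * 0.4 ** 2))
        return plateau + peak

    peaks = np.stack([pristine_peak(depth, r) for r in ranges], axis=1)  # (depths, 13)

    # Solve for non-negative weights that make the sum ~1 over the 5-13 g/cm^2 region.
    target_region = (depth >= 5.0) & (depth <= 13.0)
    w, _ = nnls(peaks[target_region], np.ones(target_region.sum()))

    sobp = peaks @ w
    flat = sobp[target_region]
    print("weights:", np.round(w, 3))
    print("SOBP ripple over target: %.1f%%" % (100 * (flat.max() - flat.min()) / flat.mean()))
    ```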

  19. Toolkit of Available EPA Green Infrastructure Modeling ...

    EPA Pesticide Factsheets

    This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementation decisions. It can also be used for low impact development design competitions. Models and tools included: Green Infrastructure Wizard (GIWiz), Watershed Management Optimization Support Tool (WMOST), Visualizing Ecosystem Land Management Assessments (VELMA) Model, Storm Water Management Model (SWMM), and the National Stormwater Calculator (SWC).

  20. Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.

    PubMed

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P

    2015-01-01

    Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.

  1. Field tests of a participatory ergonomics toolkit for Total Worker Health.

    PubMed

    Nobrega, Suzanne; Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2017-04-01

    Growing interest in Total Worker Health ® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and teamwork skills of participants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Measures of Success for Earth System Science Education: The DLESE Evaluation Services and the Evaluation Toolkit Collection

    NASA Astrophysics Data System (ADS)

    McCaffrey, M. S.; Buhr, S. M.; Lynds, S.

    2005-12-01

    Increased agency emphasis upon the integration of research and education coupled with the ability to provide students with access to digital background materials, learning activities and primary data sources has begun to revolutionize Earth science education in formal and informal settings. The DLESE Evaluation Services team and the related Evaluation Toolkit collection (http://www.dlese.org/cms/evalservices/) provide services and tools for education project leads and educators. Through the Evaluation Toolkit, educators may access high-quality digital materials to assess students' cognitive gains, examples of alternative assessments, and case studies and exemplars of authentic research. The DLESE Evaluation Services team provides support for those who are developing evaluation plans on an as-requested basis. In addition, the Toolkit provides authoritative peer-reviewed articles about evaluation research techniques and strategies of particular importance to geoscience education. This paper will provide an overview of the DLESE Evaluation Toolkit and discuss challenges and best practices for assessing student learning and evaluating Earth system sciences education in a digital world.

  3. Respiratory Protection Toolkit: Providing Guidance Without Changing Requirements-Can We Make an Impact?

    PubMed

    Bien, Elizabeth Ann; Gillespie, Gordon Lee; Betcher, Cynthia Ann; Thrasher, Terri L; Mingerink, Donna R

    2016-12-01

    International travel and infectious respiratory illnesses worldwide place health care workers (HCWs) at increasing risk of respiratory exposures. To ensure the highest quality safety initiatives, one health care system used a quality improvement model of Plan-Do-Study-Act and guidance from Occupational Safety and Health Administration's (OSHA) May 2015 Hospital Respiratory Protection Program (RPP) Toolkit to assess a current program. The toolkit aided in identification of opportunities for improvement within their well-designed RPP. One opportunity was requiring respirator use during aerosol-generating procedures for specific infectious illnesses. Observation data demonstrated opportunities to mitigate controllable risks including strap placement, user seal check, and reuse of disposable N95 filtering facepiece respirators. Subsequent interdisciplinary collaboration resulted in other ideas to decrease risks and increase protection from potentially infectious respiratory illnesses. The toolkit's comprehensive document to evaluate the program showed that while the OSHA standards have not changed, the addition of the toolkit can better protect HCWs. © 2016 The Author(s).

  4. Can a workbook work? Examining whether a practitioner evaluation toolkit can promote instrumental use.

    PubMed

    Campbell, Rebecca; Townsend, Stephanie M; Shaw, Jessica; Karim, Nidal; Markowitz, Jenifer

    2015-10-01

    In large-scale, multi-site contexts, developing and disseminating practitioner-oriented evaluation toolkits are an increasingly common strategy for building evaluation capacity. Toolkits explain the evaluation process, present evaluation design choices, and offer step-by-step guidance to practitioners. To date, there has been limited research on whether such resources truly foster the successful design, implementation, and use of evaluation findings. In this paper, we describe a multi-site project in which we developed a practitioner evaluation toolkit and then studied the extent to which the toolkit and accompanying technical assistance was effective in promoting successful completion of local-level evaluations and fostering instrumental use of the findings (i.e., whether programs directly used their findings to improve practice, see Patton, 2008). Forensic nurse practitioners from six geographically dispersed service programs completed methodologically rigorous evaluations; furthermore, all six programs used the findings to create programmatic and community-level changes to improve local practice. Implications for evaluation capacity building are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Gigavision - A weatherproof, multibillion pixel resolution time-lapse camera system for recording and tracking phenology in every plant in a landscape

    NASA Astrophysics Data System (ADS)

    Brown, T.; Borevitz, J. O.; Zimmermann, C.

    2010-12-01

    We have developed a camera system that can record hourly, gigapixel (multi-billion pixel) scale images of an ecosystem in a 360x90 degree panorama. The “Gigavision” camera system is solar-powered and can wirelessly stream data to a server. Quantitative data collection from multiyear timelapse gigapixel images is facilitated through an innovative web-based toolkit for recording time-series data on developmental stages (phenology) from any plant in the camera’s field of view. Gigapixel images enable time-series recording of entire landscapes with a resolution sufficient to record phenology from a majority of individuals in entire populations of plants. When coupled with next generation sequencing, quantitative population genomics can be performed in a landscape context linking ecology and evolution in situ and in real time. The Gigavision camera system achieves gigapixel image resolution by recording rows and columns of overlapping megapixel images. These images are stitched together into a single gigapixel resolution image using commercially available panorama software. Hardware consists of a 5-18 megapixel resolution DSLR or Network IP camera mounted on a pair of heavy-duty servo motors that provide pan-tilt capabilities. The servos and camera are controlled with a low-power Windows PC. Servo movement, power switching, and system status monitoring are enabled with Phidgets-brand sensor boards. System temperature, humidity, power usage, and battery voltage are all monitored at 5 minute intervals. All sensor data is uploaded via cellular or 802.11 wireless to an interactive online interface for easy remote monitoring of system status. Systems with direct internet connections upload the full sized images directly to our automated stitching server where they are stitched and available online for viewing within an hour of capture. Systems with cellular wireless upload an 80 megapixel “thumbnail” of each larger panorama and full-sized images are manually retrieved at bi-weekly intervals. Our longer-term goal is to make gigapixel time-lapse datasets available online in an interactive interface that layers plant-level phenology data with gigapixel resolution images, genomic sequence data from individual plants with weather and other abiotic sensor data. Co-visualization of all of these data types provides researchers with a powerful new tool for examining complex ecological interactions across scales from the individual to the ecosystem. We will present detailed phenostage data from more than 100 plants of multiple species from our Gigavision timelapse camera at our “Big Blowout East” field site in the Indiana Dunes State Park, IN. This camera has been recording three to four 700 million pixel images a day since February 28, 2010. The camera field of view covers an area of about 7 hectares resulting in an average image resolution of about 1 pixel per centimeter over the entire site. We will also discuss some of the many technological challenges with developing and maintaining these types of hardware systems, collecting quantitative data from gigapixel resolution time-lapse data and effectively managing terabyte-sized datasets of millions of images.
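
    The size of the pan-tilt capture grid follows directly from the single-frame field of view and the stitching overlap. The sketch below shows that arithmetic with assumed lens and overlap values, which are not the actual Gigavision system parameters.

    ```python
    # How many overlapping frames a 360 x 90 degree gigapixel panorama needs, given
    # the single-frame field of view and the overlap required for stitching. The lens
    # field-of-view values, 30% overlap, and 5 MP frame size are assumptions.
    import math

    def grid_size(pan_extent, tilt_extent, hfov, vfov, overlap=0.30):
        step_h = hfov * (1.0 - overlap)       # effective horizontal step per frame (deg)
        step_v = vfov * (1.0 - overlap)
        cols = math.ceil(pan_extent / step_h)
        rows = math.ceil(tilt_extent / step_v)
        return cols, rows

    cols, rows = grid_size(360.0, 90.0, hfov=16.0, vfov=12.0)   # example telephoto lens
    frames = cols * rows
    print(f"{cols} columns x {rows} rows = {frames} frames per panorama")
    print(f"~{frames * 5e6 / 1e9:.1f} gigapixels at 5 MP per frame (before overlap loss)")
    ```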

  6. Automated Neuropsychological Assessment Metrics, Version 4 (ANAM4): Examination of Select Psychometric Properties and Administration Procedures

    DTIC Science & Technology

    2016-12-01

    2017 was approved in August 2016. The supplemental project has two primary objectives: • Recommend cognitive assessment tools/approaches (toolkit) from... strategies for use in future military-relevant environments. The supplemental project has two primary deliverables: • Proposed Toolkit of cognitive... (The remainder of this record is a fragment of the project task schedule, including Task 6, "Vet Final Report and Cognitive performance recommendations through Steering Committee," and Task 7, "Provide Toolkit Report," Months 8-12.)

  7. Toolkit of Resources for Engaging Families and the Community as Partners in Education. Part 1: Building an Understanding of Family and Community Engagement. REL 2016-148

    ERIC Educational Resources Information Center

    Garcia, Maria Elena; Frunzi, Kay; Dean, Ceri B.; Flores, Nieves; Miller, Kirsten B.

    2016-01-01

    The Toolkit of Resources for Engaging Families and the Community as Partners in Education is a four-part resource that brings together research, promising practices, and useful tools and resources to guide educators in strengthening partnerships with families and community members to support student learning. The toolkit defines family and…

  8. Toolkit of Resources for Engaging Families and the Community as Partners in Education: Part 3: Building Trusting Relationships with Families and the Community through Effective Communication. REL 2016-152

    ERIC Educational Resources Information Center

    Garcia, Maria Elena; Frunzi, Kay; Dean, Ceri B.; Flores, Nieves; Miller, Kirsten B.

    2016-01-01

    The Toolkit of Resources for Engaging Families and the Community as Partners in Education is a four-part resource that brings together research, promising practices, and useful tools and resources to guide educators in strengthening partnerships with families and community members to support student learning. The toolkit defines family and…

  9. BRDF profile of Tyvek and its implementation in the Geant4 simulation toolkit.

    PubMed

    Nozka, Libor; Pech, Miroslav; Hiklova, Helena; Mandat, Dusan; Hrabovsky, Miroslav; Schovanek, Petr; Palatka, Miroslav

    2011-02-28

    Diffuse and specular characteristics of the Tyvek 1025-BL material are reported with respect to their implementation in the Geant4 Monte Carlo simulation toolkit. This toolkit incorporates the UNIFIED model. Coefficients defined by the UNIFIED model were calculated from the bidirectional reflectance distribution function (BRDF) profiles measured with a scatterometer for several angles of incidence. Results were amended with profile measurements made by a profilometer.

  10. SIGKit: Software for Introductory Geophysics Toolkit

    NASA Astrophysics Data System (ADS)

    Kruse, S.; Bank, C. G.; Esmaeili, S.; Jazayeri, S.; Liu, S.; Stoikopoulos, N.

    2017-12-01

    The Software for Introductory Geophysics Toolkit (SIGKit) affords students the opportunity to create model data and perform simple processing of field data for various geophysical methods. SIGkit provides a graphical user interface built with the MATLAB programming language, but can run even without a MATLAB installation. At this time SIGkit allows students to pick first arrivals and match a two-layer model to seismic refraction data; grid total-field magnetic data, extract a profile, and compare this to a synthetic profile; and perform simple processing steps (subtraction of a mean trace, hyperbola fit) to ground-penetrating radar data. We also have preliminary tools for gravity, resistivity, and EM data representation and analysis. SIGkit is being built by students for students, and the intent of the toolkit is to provide an intuitive interface for simple data analysis and understanding of the methods, and act as an entrance to more sophisticated software. The toolkit has been used in introductory courses as well as field courses. First reactions from students are positive. Think-aloud observations of students using the toolkit have helped identify problems and helped shape it. We are planning to compare the learning outcomes of students who have used the toolkit in a field course to students in a previous course to test its effectiveness.
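    As an illustration of the two-layer refraction model the toolkit lets students match to picked first arrivals, the short sketch below computes synthetic first-arrival times as the earlier of the direct wave and the head wave. It is not SIGkit code (SIGkit is MATLAB-based); the velocities, depth, and offsets are placeholder values.

```python
# Minimal two-layer seismic refraction sketch: the observed first arrival at
# each offset is the earlier of the direct wave and the refracted head wave.
import numpy as np

def two_layer_first_arrivals(x, v1, v2, h):
    """First-arrival times (s) at offsets x (m) for layer velocities v1 < v2
    (m/s) and an interface at depth h (m)."""
    t_direct = x / v1
    t_refract = x / v2 + 2.0 * h * np.sqrt(v2**2 - v1**2) / (v1 * v2)
    return np.minimum(t_direct, t_refract)   # earliest arrival wins

offsets = np.linspace(1.0, 100.0, 25)        # geophone offsets, placeholder
times = two_layer_first_arrivals(offsets, v1=500.0, v2=1500.0, h=5.0)
```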

  11. The development of an artificial organic networks toolkit for LabVIEW.

    PubMed

    Ponce, Hiram; Ponce, Pedro; Molina, Arturo

    2015-03-15

    Two of the most challenging problems that scientists and researchers face when they want to experiment with new cutting-edge algorithms are the time consumed in encoding them and the difficulty of linking them with other technologies and devices. In that sense, this article introduces the artificial organic networks toolkit for LabVIEW™ (AON-TL) from the implementation point of view. The toolkit is based on the framework provided by the artificial organic networks technique, giving it the potential to add new algorithms based on this technique in the future. Moreover, the toolkit inherits both the rapid prototyping and the easy-to-use characteristics of the LabVIEW™ software (e.g., graphical programming, transparent use of other software and devices, built-in event-driven programming for user interfaces) to make it simple for the end-user. In fact, the article describes the global architecture of the toolkit, with particular emphasis on the software implementation of the so-called artificial hydrocarbon networks algorithm. Lastly, the article includes two case studies for engineering purposes (i.e., sensor characterization) and chemistry applications (i.e., a blood-brain barrier partitioning data model) to show the usage of the toolkit and the potential scalability of the artificial organic networks technique. © 2015 Wiley Periodicals, Inc.

  12. SimITK: visual programming of the ITK image-processing library within Simulink.

    PubMed

    Dickinson, Andrew W L; Abolmaesumi, Purang; Gobbi, David G; Mousavi, Parvin

    2014-04-01

    The Insight Segmentation and Registration Toolkit (ITK) is a software library used for image analysis, visualization, and image-guided surgery applications. ITK is a collection of C++ classes that poses the challenge of a steep learning curve should the user not have appropriate C++ programming experience. To remove the programming complexities and facilitate rapid prototyping, an implementation of ITK within a higher-level visual programming environment is presented: SimITK. ITK functionalities are automatically wrapped into "blocks" within Simulink, the visual programming environment of MATLAB, where these blocks can be connected to form workflows: visual schematics that closely represent the structure of a C++ program. The heavily templated C++ nature of ITK does not facilitate direct interaction between Simulink and ITK; an intermediary is required to convert respective data types and allow intercommunication. As such, a SimITK "Virtual Block" has been developed that serves as a wrapper around an ITK class and is capable of resolving the ITK data types to native Simulink data types. Part of the challenge surrounding this implementation involves automatically capturing and storing the pertinent class information that needs to be refined from an initial state prior to being reflected within the final block representation. The primary result of the SimITK wrapping procedure is a set of Simulink block libraries. From these libraries, blocks are selected and interconnected to demonstrate two examples: a 3D segmentation workflow and a 3D multimodal registration workflow. Compared to their pure-code equivalents, the workflows highlight ITK usability through an alternative visual interpretation of the code that abstracts away potentially confusing technicalities.
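    For contrast with the visual workflows, the snippet below sketches what a "pure-code equivalent" of a simple ITK-style segmentation pipeline looks like when scripted. It uses SimpleITK, a separate Python wrapping of ITK rather than anything shipped with SimITK, and the file name, seed, and threshold values are placeholders.

```python
# A scripted ITK-style pipeline (via SimpleITK) for comparison with visual blocks:
# read a volume, smooth it, grow a region from a seed, and write the label map.
import SimpleITK as sitk

image = sitk.ReadImage("volume.nii.gz", sitk.sitkFloat32)        # placeholder file
smoothed = sitk.CurvatureFlow(image1=image, timeStep=0.125,
                              numberOfIterations=5)              # edge-preserving smoothing
label = sitk.ConnectedThreshold(smoothed, seedList=[(100, 100, 50)],
                                lower=80.0, upper=200.0)         # region growing
sitk.WriteImage(label, "segmentation.nii.gz")
```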

  13. Comparison of 3D reconstruction of mandible for pre-operative planning using commercial and open-source software

    NASA Astrophysics Data System (ADS)

    Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad

    2016-12-01

    3D printing of the mandible is important for pre-operative planning, diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software, which increases the cost of operation and patient management for a small clinical setting. Usage of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using the commercial Materialise Mimics software and the open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and the 3D models of the mandible were reconstructed using both the commercial Materialise Mimics and the open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. The models were compared using the Wilcoxon signed-rank test and the Hausdorff distance. No significant differences were obtained between the 3D models of the mandible produced using the Mimics and MITK software. The 3D model of the mandible produced using the MITK open-source software is comparable to that produced with the commercial Mimics software. Therefore, open-source software could be used in a clinical setting for pre-operative planning to minimise the operational cost.
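    A minimal sketch of the two comparisons the study reports, using SciPy's directed Hausdorff distance and Wilcoxon signed-rank test. This is not the authors' code; the point sets and paired measurements are synthetic placeholders standing in for vertices and morphometric values exported from the two models.

```python
# Symmetric Hausdorff distance between two surface point sets, plus a Wilcoxon
# signed-rank test on paired morphometric measurements (all data synthetic).
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
mimics_pts = rng.normal(size=(1000, 3))                          # "Mimics" vertices
mitk_pts = mimics_pts + rng.normal(scale=0.05, size=(1000, 3))   # "MITK" vertices

hausdorff = max(directed_hausdorff(mimics_pts, mitk_pts)[0],
                directed_hausdorff(mitk_pts, mimics_pts)[0])

mimics_measures = rng.normal(50.0, 5.0, size=20)                 # paired measurements
mitk_measures = mimics_measures + rng.normal(scale=0.1, size=20)
stat, p_value = wilcoxon(mimics_measures, mitk_measures)

print(f"Hausdorff distance: {hausdorff:.3f}  Wilcoxon p = {p_value:.3f}")
```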

  14. Inspection with Robotic Microscopic Imaging

    NASA Technical Reports Server (NTRS)

    Pedersen, Liam; Deans, Matthew; Kunz, Clay; Sargent, Randy; Chen, Alan; Mungas, Greg

    2005-01-01

    Future Mars rover missions will require more advanced onboard autonomy for increased scientific productivity and reduced mission operations cost. One such form of autonomy can be achieved by targeting precise science measurements to be made in a single command uplink cycle. In this paper we present an overview of our solution to the subproblems of navigating a rover into place for microscopic imaging, mapping an instrument target point selected by an operator in distant science-camera images to close-up hazard-camera images, verifying the safety of placing a contact instrument on a sample or finding nearby safe points, and analyzing the data that comes back from the rover. The system developed includes portions used in the Multiple Target Single Cycle Instrument Placement demonstration at NASA Ames in October 2004, and portions of the MI Toolkit delivered to the Athena Microscopic Imager Instrument Team for the MER mission still operating on Mars today. Some of the component technologies are also under consideration for MSL mission infusion.

  15. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs

    PubMed Central

    Mosaliganti, Kishore R.; Gelas, Arnaud; Megason, Sean G.

    2013-01-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations, which restricts future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible to different numerical (continuous, discrete, and sparse) and grid representations (point, mesh, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradients and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. For doing so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish embryo. PMID:24501592

  16. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs.

    PubMed

    Mosaliganti, Kishore R; Gelas, Arnaud; Megason, Sean G

    2013-01-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations, which restricts future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK) v4 architecture, we implemented a level-set software design that is flexible to different numerical (continuous, discrete, and sparse) and grid representations (point, mesh, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g., gradients and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. For doing so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a developing zebrafish embryo.
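    The container-of-terms design described above can be illustrated with a toy sketch: the level-set update is the sum of independently pluggable terms, so terms can be added or removed without touching the evolution loop. This illustrates the idea only, not the ITK v4 API, and the two terms below are simplified placeholders.

```python
# Toy container-of-terms pattern: a generic PDE update is the sum of pluggable
# term callables, evaluated by one evolution loop (1-D, explicit time stepping).
import numpy as np

class TermContainer:
    def __init__(self):
        self.terms = []                     # container of PDE terms

    def add(self, term):                    # term: phi -> contribution array
        self.terms.append(term)

    def evaluate(self, phi):
        return sum(term(phi) for term in self.terms)

def curvature_like_term(phi):               # placeholder smoothing term
    return 0.1 * (np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1))

def propagation_term(phi):                  # placeholder constant-speed term
    return -0.5 * np.ones_like(phi)

container = TermContainer()
container.add(curvature_like_term)
container.add(propagation_term)

phi = np.linspace(-1.0, 1.0, 100)           # 1-D level-set function
for _ in range(10):                          # explicit evolution steps
    phi = phi + 0.1 * container.evaluate(phi)
```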

  17. LabVIEW application for motion tracking using USB camera

    NASA Astrophysics Data System (ADS)

    Rob, R.; Tirian, G. O.; Panoiu, M.

    2017-05-01

    The technical state of the contact line, and of the additional equipment used in electric rail transport, is very important for the repair and maintenance of the contact line. During operation, the pantograph motion must stay within standard limits. The present paper proposes a LabVIEW application that can track the motion of a laboratory pantograph in real time and acquire the tracking images. A USB webcam connected to a computer acquires the desired images. The laboratory pantograph contains an automatic system that simulates the real motion. The tracked parameters are the horizontal motion (zigzag) and the vertical motion, which can be studied in separate diagrams. The LabVIEW application requires the appropriate vision-development toolkits. The paper therefore describes the subroutines programmed specifically for real-time image acquisition and for data processing.

  18. UTOPIA-User-Friendly Tools for Operating Informatics Applications.

    PubMed

    Pettifer, S R; Sinnott, J R; Attwood, T K

    2004-01-01

    Bioinformaticians routinely analyse vast amounts of information held both in large remote databases and in flat data files hosted on local machines. The contemporary toolkit available for this purpose consists of an ad hoc collection of data manipulation tools, scripting languages and visualization systems; these must often be combined in complex and bespoke ways, the result frequently being an unwieldy artefact capable of one specific task, which cannot easily be exploited or extended by other practitioners. Owing to the sizes of current databases and the scale of the analyses necessary, routine bioinformatics tasks are often automated, but many still require the unique experience and intuition of human researchers: this requires tools that support real-time interaction with complex datasets. Many existing tools have poor user interfaces and limited real-time performance when applied to realistically large datasets; much of the user's cognitive capacity is therefore focused on controlling the tool rather than on performing the research. The UTOPIA project is addressing some of these issues by building reusable software components that can be combined to make useful applications in the field of bioinformatics. Expertise in the fields of human computer interaction, high-performance rendering, and distributed systems is being guided by bioinformaticians and end-user biologists to create a toolkit that is both architecturally sound from a computing point of view, and directly addresses end-user and application-developer requirements.

  19. Comparison of image segmentation of lungs using methods: connected threshold, neighborhood connected, and threshold level set segmentation

    NASA Astrophysics Data System (ADS)

    Amanda, A. R.; Widita, R.

    2016-03-01

    The aim of this research is to compare several image segmentation methods for the lungs based on performance evaluation parameters (Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR)). In this study, the methods compared were connected threshold, neighborhood connected, and threshold level set segmentation on images of the lungs. These three methods require one important parameter, i.e., the threshold. The threshold interval was obtained from the histogram of the original image. The software used to segment the images was InsightToolkit-4.7.0 (ITK). Five lung images were analysed in this research. The results were then compared using the performance evaluation parameters, determined with MATLAB. A segmentation method is considered to be of good quality if it has the smallest MSE value and the highest PSNR. The results show that connected threshold best meets these criteria for four of the sample images, while threshold level set segmentation is best for the remaining one. Therefore, it can be concluded that the connected threshold method is better than the other two methods for these cases.
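    A hedged sketch of the kind of pipeline compared above: run an ITK connected-threshold segmentation (here via SimpleITK on a synthetic image, since the study's CT data and exact settings are not given) and score the result against a reference with MSE and PSNR.

```python
# Connected-threshold segmentation of a synthetic image, scored with MSE/PSNR.
import numpy as np
import SimpleITK as sitk

arr = np.zeros((64, 64), dtype=np.float32)
arr[20:40, 20:40] = 150.0                     # bright "lung" region
image = sitk.GetImageFromArray(arr)

seg = sitk.ConnectedThreshold(image, seedList=[(30, 30)], lower=100.0, upper=200.0)
seg_arr = sitk.GetArrayFromImage(seg).astype(np.float32) * 255.0

def mse(a, b):
    return float(np.mean((a - b) ** 2))

def psnr(a, b, peak=255.0):
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

reference = (arr > 100).astype(np.float32) * 255.0   # ground-truth mask
print("MSE:", mse(seg_arr, reference), "PSNR:", psnr(seg_arr, reference))
```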

  20. SIproc: an open-source biomedical data processing platform for large hyperspectral images.

    PubMed

    Berisha, Sebastian; Chang, Shengyuan; Saki, Sam; Daeinejad, Davar; He, Ziqi; Mankar, Rupali; Mayerich, David

    2017-04-10

    There has recently been significant interest within the vibrational spectroscopy community to apply quantitative spectroscopic imaging techniques to histology and clinical diagnosis. However, many of the proposed methods require collecting spectroscopic images that have a similar region size and resolution to the corresponding histological images. Since spectroscopic images contain significantly more spectral samples than traditional histology, the resulting data sets can approach hundreds of gigabytes to terabytes in size. This makes them difficult to store and process, and the tools available to researchers for handling large spectroscopic data sets are limited. Fundamental mathematical tools, such as MATLAB, Octave, and SciPy, are extremely powerful but require that the data be stored in fast memory. This memory limitation becomes impractical for even modestly sized histological images, which can be hundreds of gigabytes in size. In this paper, we propose an open-source toolkit designed to perform out-of-core processing of hyperspectral images. By taking advantage of graphical processing unit (GPU) computing combined with adaptive data streaming, our software alleviates common workstation memory limitations while achieving better performance than existing applications.
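    The out-of-core idea can be illustrated independently of SIproc's own API (which is not described here): stream a disk-backed hyperspectral cube in slabs so that memory stays bounded while each slab is processed. The cube dimensions and the per-pixel band mean below are placeholders.

```python
# Out-of-core processing sketch: reduce a disk-backed hyperspectral cube slab by
# slab so only one chunk of rows is resident in memory at a time.
import numpy as np

bands, rows, cols = 16, 512, 512
cube = np.memmap("cube.dat", dtype=np.float32, mode="w+",
                 shape=(bands, rows, cols))           # disk-backed cube
cube[:] = np.random.rand(bands, rows, cols).astype(np.float32)

mean_image = np.zeros((rows, cols), dtype=np.float32)
chunk = 64                                            # rows processed per pass
for r in range(0, rows, chunk):
    block = np.asarray(cube[:, r:r + chunk, :])       # load one slab into RAM
    mean_image[r:r + chunk, :] = block.mean(axis=0)   # reduce over bands
```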

  1. WebViz: A web browser based application for collaborative analysis of 3D data

    NASA Astrophysics Data System (ADS)

    Ruegg, C. S.

    2011-12-01

    In the age of high-speed Internet, where people can interact instantly, scientific tools have lacked technology that incorporates this style of communication on the web. To address this, a web application for geological studies has been created, tentatively titled WebViz. The application uses tools provided by the Google Web Toolkit to create an AJAX web application with features usually found only in non-web-based software. With these tools, a web application can act as a piece of desktop-like software from anywhere in the world over a reasonably fast Internet connection. One application of this technology involves data on the recent tsunami from the major Japan earthquakes. After the data are prepared for a rendering program called HVR, WebViz can request images of the tsunami data and display them to anyone with access to the application. This convenience alone makes WebViz a viable solution, but the ability to explore the data interactively with others around the world makes WebViz a serious computational tool. WebViz can also be used in any JavaScript-enabled browser, including those on modern tablets and smartphones over a fast wireless connection. Because WebViz is currently built with the Google Web Toolkit, the application is highly portable. Many developers have been involved with the project, and each has contributed to the usability and speed of the application. The most recent version includes a dramatic speed increase as well as a more efficient user interface. The speed increase has been informally observed in recent uses of the application in China and Australia, with the hosting server located at the University of Minnesota. The user interface has been improved both in appearance and in functionality. A major function of the application is rotating the 3D object using buttons; these have been replaced with a new layout whose function is easier to understand and which is also easy to use on mobile devices. With these changes, WebViz is easier to control and better suited for general use.

  2. An interactive visualization tool for mobile objects

    NASA Astrophysics Data System (ADS)

    Kobayashi, Tetsuo

    Recent advancements in mobile devices---such as Global Positioning System (GPS), cellular phones, car navigation system, and radio-frequency identification (RFID)---have greatly influenced the nature and volume of data about individual-based movement in space and time. Due to the prevalence of mobile devices, vast amounts of mobile objects data are being produced and stored in databases, overwhelming the capacity of traditional spatial analytical methods. There is a growing need for discovering unexpected patterns, trends, and relationships that are hidden in the massive mobile objects data. Geographic visualization (GVis) and knowledge discovery in databases (KDD) are two major research fields that are associated with knowledge discovery and construction. Their major research challenges are the integration of GVis and KDD, enhancing the ability to handle large volume mobile objects data, and high interactivity between the computer and users of GVis and KDD tools. This dissertation proposes a visualization toolkit to enable highly interactive visual data exploration for mobile objects datasets. Vector algebraic representation and online analytical processing (OLAP) are utilized for managing and querying the mobile object data to accomplish high interactivity of the visualization tool. In addition, reconstructing trajectories at user-defined levels of temporal granularity with time aggregation methods allows exploration of the individual objects at different levels of movement generality. At a given level of generality, individual paths can be combined into synthetic summary paths based on three similarity measures, namely, locational similarity, directional similarity, and geometric similarity functions. A visualization toolkit based on the space-time cube concept exploits these functionalities to create a user-interactive environment for exploring mobile objects data. Furthermore, the characteristics of visualized trajectories are exported to be utilized for data mining, which leads to the integration of GVis and KDD. Case studies using three movement datasets (personal travel data survey in Lexington, Kentucky, wild chicken movement data in Thailand, and self-tracking data in Utah) demonstrate the potential of the system to extract meaningful patterns from the otherwise difficult to comprehend collections of space-time trajectories.

  3. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    NASA Astrophysics Data System (ADS)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.

  4. Kekule.js: An Open Source JavaScript Chemoinformatics Toolkit.

    PubMed

    Jiang, Chen; Jin, Xi; Dong, Ying; Chen, Ming

    2016-06-27

    Kekule.js is an open-source, object-oriented JavaScript toolkit for chemoinformatics. It provides methods for many common tasks in molecular informatics, including chemical data input/output (I/O), two- and three-dimensional (2D/3D) rendering of chemical structure, stereo identification, ring perception, structure comparison, and substructure search. Encapsulated widgets to display and edit chemical structures directly in web context are also supplied. Developed with web standards, the toolkit is ideal for building chemoinformatics applications over the Internet. Moreover, it is highly platform-independent and can also be used in desktop or mobile environments. Some initial applications, such as plugins for inputting chemical structures on the web and uses in chemistry education, have been developed based on the toolkit.

  5. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for non-Gaussian problems with error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions, as well as Monte Carlo analysis capability, is included to enable statistical performance evaluations.
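    For readers unfamiliar with the class of estimators being packaged, the snippet below shows one textbook Kalman filter predict/update cycle in NumPy. It is a generic illustration, not the toolkit's API, and omits the desensitized and sigma-point extensions the record describes.

```python
# One generic Kalman filter predict/update cycle for a linear-Gaussian system.
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """Advance state estimate x (covariance P) given measurement z."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```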

  6. Using stakeholder perspectives to develop an ePrescribing toolkit for NHS Hospitals: a questionnaire study.

    PubMed

    Lee, Lisa; Cresswell, Kathrin; Slee, Ann; Slight, Sarah P; Coleman, Jamie; Sheikh, Aziz

    2014-10-01

    To evaluate how an online toolkit may support ePrescribing deployments in National Health Service hospitals, by assessing the type of knowledge-based resources currently sought by key stakeholders. Questionnaire-based survey of attendees at a national ePrescribing symposium. 2013 National ePrescribing Symposium in London, UK. Eighty-four delegates were eligible for inclusion in the survey, of whom 70 completed and returned the questionnaire. Estimate of the usefulness and type of content to be included in an ePrescribing toolkit. Interest in a toolkit designed to support the implementation and use of ePrescribing systems was high (n = 64; 91.4%). As could be expected given the current dearth of such a resource, few respondents (n = 2; 2.9%) had access to or used an ePrescribing toolkit at the time of the survey. Anticipated users for the toolkit included implementation (n = 62; 88.6%) and information technology (n = 61; 87.1%) teams, pharmacists (n = 61; 87.1%), doctors (n = 58; 82.9%) and nurses (n = 56; 80.0%). Summary guidance for every stage of the implementation (n = 48; 68.6%), planning and monitoring tools (n = 47; 67.1%) and case studies of hospitals' experiences (n = 45; 64.3%) were considered the most useful types of content. There is a clear need for reliable and up-to-date knowledge to support ePrescribing system deployments and longer term use. The findings highlight how a toolkit may become a useful instrument for the management of knowledge in the field, not least by allowing the exchange of ideas and shared learning.

  7. Does the Good Schools Toolkit Reduce Physical, Sexual and Emotional Violence, and Injuries, in Girls and Boys equally? A Cluster-Randomised Controlled Trial.

    PubMed

    Devries, Karen M; Knight, Louise; Allen, Elizabeth; Parkes, Jenny; Kyegombe, Nambusi; Naker, Dipak

    2017-10-01

    We aimed to investigate whether the Good School Toolkit reduced emotional violence, severe physical violence, sexual violence and injuries from school staff to students, as well as emotional, physical and sexual violence between peers, in Ugandan primary schools. We performed a two-arm cluster randomised controlled trial with parallel assignment. Forty-two schools in one district were allocated to intervention (n = 21) or wait-list control (n = 21) arms in 2012. We did cross-sectional baseline and endline surveys in 2012 and 2014, and the Good School Toolkit intervention was implemented for 18 months between surveys. Analyses were by intention to treat and are adjusted for clustering within schools and for baseline school-level proportions of outcomes. The Toolkit was associated with an overall reduction in any form of violence from staff and/or peers in the past week towards both male (aOR = 0.34, 95%CI 0.22-0.53) and female students (aOR = 0.55, 95%CI 0.36-0.84). Injuries as a result of violence from school staff were also lower in male (aOR = 0.36, 95%CI 0.20-0.65) and female students (aOR = 0.51, 95%CI 0.29-0.90). Although the Toolkit seems to be effective at reducing violence in both sexes, there is some suggestion that the Toolkit may have stronger effects in boys than girls. The Toolkit is a promising intervention to reduce a wide range of different forms of violence from school staff and between peers in schools, and should be urgently considered for scale-up. Further research is needed to investigate how the intervention could engage more successfully with girls.

  8. Improving the Effectiveness of Medication Review: Guidance from the Health Literacy Universal Precautions Toolkit.

    PubMed

    Weiss, Barry D; Brega, Angela G; LeBlanc, William G; Mabachi, Natabhona M; Barnard, Juliana; Albright, Karen; Cifuentes, Maribel; Brach, Cindy; West, David R

    2016-01-01

    Although routine medication reviews in primary care practice are recommended to identify drug therapy problems, it is often difficult to get patients to bring all their medications to office visits. The objective of this study was to determine whether the medication review tool in the Agency for Healthcare Research and Quality Health Literacy Universal Precautions Toolkit can help to improve medication reviews in primary care practices. The toolkit's "Brown Bag Medication Review" was implemented in a rural private practice in Missouri and an urban teaching practice in California. Practices recorded outcomes of medication reviews with 45 patients before toolkit implementation and then changed their medication review processes based on guidance in the toolkit. Six months later we conducted interviews with practice staff to identify changes made as a result of implementing the tool, and practices recorded outcomes of medication reviews with 41 additional patients. Data analyses compared differences in whether all medications were brought to visits, the number of medications reviewed, drug therapy problems identified, and changes in medication regimens before and after implementation. Interviews revealed that practices made the changes recommended in the toolkit to encourage patients to bring medications to office visits. Evaluation before and after implementation revealed a 3-fold increase in the percentage of patients who brought all their prescription medications and a 6-fold increase in the number of prescription medications brought to office visits. The percentage of reviews in which drug therapy problems were identified doubled, as did the percentage of medication regimens revised. Use of the Health Literacy Universal Precautions Toolkit can help to identify drug therapy problems. © Copyright 2016 by the American Board of Family Medicine.

  9. Using stakeholder perspectives to develop an ePrescribing toolkit for NHS Hospitals: a questionnaire study

    PubMed Central

    Cresswell, Kathrin; Slee, Ann; Slight, Sarah P; Coleman, Jamie; Sheikh, Aziz

    2014-01-01

    Objective: To evaluate how an online toolkit may support ePrescribing deployments in National Health Service hospitals, by assessing the type of knowledge-based resources currently sought by key stakeholders. Design: Questionnaire-based survey of attendees at a national ePrescribing symposium. Setting: 2013 National ePrescribing Symposium in London, UK. Participants: Eighty-four delegates were eligible for inclusion in the survey, of whom 70 completed and returned the questionnaire. Main outcome measures: Estimate of the usefulness and type of content to be included in an ePrescribing toolkit. Results: Interest in a toolkit designed to support the implementation and use of ePrescribing systems was high (n = 64; 91.4%). As could be expected given the current dearth of such a resource, few respondents (n = 2; 2.9%) had access to or used an ePrescribing toolkit at the time of the survey. Anticipated users for the toolkit included implementation (n = 62; 88.6%) and information technology (n = 61; 87.1%) teams, pharmacists (n = 61; 87.1%), doctors (n = 58; 82.9%) and nurses (n = 56; 80.0%). Summary guidance for every stage of the implementation (n = 48; 68.6%), planning and monitoring tools (n = 47; 67.1%) and case studies of hospitals’ experiences (n = 45; 64.3%) were considered the most useful types of content. Conclusions: There is a clear need for reliable and up-to-date knowledge to support ePrescribing system deployments and longer term use. The findings highlight how a toolkit may become a useful instrument for the management of knowledge in the field, not least by allowing the exchange of ideas and shared learning. PMID:25383199

  10. Pcetk: A pDynamo-based Toolkit for Protonation State Calculations in Proteins.

    PubMed

    Feliks, Mikolaj; Field, Martin J

    2015-10-26

    Pcetk (a pDynamo-based continuum electrostatic toolkit) is an open-source, object-oriented toolkit for the calculation of proton binding energetics in proteins. The toolkit is a module of the pDynamo software library, combining the versatility of the Python scripting language and the efficiency of the compiled languages, C and Cython. In the toolkit, we have connected pDynamo to the external Poisson-Boltzmann solver, extended-MEAD. Our goal was to provide a modern and extensible environment for the calculation of protonation states, electrostatic energies, titration curves, and other electrostatic-dependent properties of proteins. Pcetk is freely available under the CeCILL license, which is compatible with the GNU General Public License. The toolkit can be found on the Web at the address http://github.com/mfx9/pcetk. The calculation of protonation states in proteins requires a knowledge of pKa values of protonatable groups in aqueous solution. However, for some groups, such as protonatable ligands bound to protein, the pKa(aq) values are often difficult to obtain from experiment. As a complement to Pcetk, we revisit an earlier computational method for the estimation of pKa(aq) values that has an accuracy of ±0.5 pKa units or better. Finally, we verify the Pcetk module and the method for estimating pKa(aq) values with different model cases.

  11. Octopus-toolkit: a workflow to automate mining of public epigenomic and transcriptomic next-generation sequencing data

    PubMed Central

    Kim, Taemook; Seo, Hogyu David; Hennighausen, Lothar; Lee, Daeyoup

    2018-01-01

    Octopus-toolkit is a stand-alone application for retrieving and processing large sets of next-generation sequencing (NGS) data in a single step. Octopus-toolkit is an automated set-up-and-analysis pipeline utilizing the Aspera, SRA Toolkit, FastQC, Trimmomatic, HISAT2, STAR, Samtools, and HOMER applications. All the applications are installed on the user's computer when the program starts. Upon installation, it can automatically retrieve original files of various epigenomic and transcriptomic data sets, including ChIP-seq, ATAC-seq, DNase-seq, MeDIP-seq, MNase-seq and RNA-seq, from the Gene Expression Omnibus data repository. The downloaded files can then be sequentially processed to generate BAM and BigWig files, which are used for advanced analyses and visualization. Currently, it can process NGS data from the genomes of popular model organisms, such as human (Homo sapiens), mouse (Mus musculus), dog (Canis lupus familiaris), plant (Arabidopsis thaliana), zebrafish (Danio rerio), fruit fly (Drosophila melanogaster), worm (Caenorhabditis elegans), and budding yeast (Saccharomyces cerevisiae). With the processed files from Octopus-toolkit, the meta-analysis of various data sets, motif searches for DNA-binding proteins, and the identification of differentially expressed genes and/or protein-binding sites can be easily conducted with a few commands by users. Overall, Octopus-toolkit facilitates the systematic and integrative analysis of available epigenomic and transcriptomic NGS big data. PMID:29420797
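    A rough sketch of the kind of retrieve/QC/align/convert pipeline the toolkit automates, driven from Python via subprocess. This is not Octopus-toolkit's own code; the accession and index names are placeholders, and the exact flags of the underlying tools may differ between versions.

```python
# Illustrative NGS pipeline: download an SRA run, run QC, align, and sort/index.
import subprocess

accession, index = "SRR000001", "genome_index"      # placeholder inputs

steps = [
    ["prefetch", accession],                        # SRA Toolkit: download run
    ["fasterq-dump", accession],                    # SRA Toolkit: extract FASTQ
    ["fastqc", f"{accession}.fastq"],               # quality-control report
    ["hisat2", "-x", index, "-U", f"{accession}.fastq", "-S", f"{accession}.sam"],
    ["samtools", "sort", "-o", f"{accession}.bam", f"{accession}.sam"],
    ["samtools", "index", f"{accession}.bam"],      # BAM ready for visualization
]

for cmd in steps:
    subprocess.run(cmd, check=True)                 # stop if any step fails
```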

  12. A Toolkit to assess health needs for congenital disorders in low- and middle-income countries: an instrument for public health action.

    PubMed

    Nacul, L C; Stewart, A; Alberg, C; Chowdhury, S; Darlison, M W; Grollman, C; Hall, A; Modell, B; Moorthie, S; Sagoo, G S; Burton, H

    2014-06-01

    In 2010 the World Health Assembly called for action to improve the care and prevention of congenital disorders, noting that technical guidance would be required for this task, especially in low- and middle-income countries. Responding to this call, we have developed a freely available web-accessible Toolkit for assessing health needs for congenital disorders. Materials for the Toolkit website (http://toolkit.phgfoundation.org) were prepared by an iterative process of writing, discussion and modification by the project team, with advice from external experts. A customized database was developed using epidemiological, demographic, socio-economic and health-services data from a range of validated sources. Document-processing and data integration software combines data from the database with a template to generate topic- and country-specific Calculator documents for quantitative analysis. The Toolkit guides users through selection of topics (including both clinical conditions and relevant health services), assembly and evaluation of qualitative and quantitative information, assessment of the potential effects of selected interventions, and planning and prioritization of actions to reduce the risk or prevalence of congenital disorders. The Toolkit enables users without epidemiological or public health expertise to undertake health needs assessment as a prerequisite for strategic planning in relation to congenital disorders in their country or region. © The Author 2013. Published by Oxford University Press on behalf of Faculty of Public Health.

  13. New Careers in Nursing Scholar Alumni Toolkit: Development of an Innovative Resource for Transition to Practice.

    PubMed

    Mauro, Ann Marie P; Escallier, Lori A; Rosario-Sim, Maria G

    2016-01-01

    The transition from student to professional nurse is challenging and may be more difficult for underrepresented minority nurses. The Robert Wood Johnson Foundation New Careers in Nursing (NCIN) program supported development of a toolkit that would serve as a transition-to-practice resource to promote retention of NCIN alumni and other new nurses. Thirteen recent NCIN alumni (54% male, 23% Hispanic/Latino, 23% African Americans) from 3 schools gave preliminary content feedback. An e-mail survey was sent to a convenience sample of 29 recent NCIN alumni who evaluated the draft toolkit using a Likert scale (poor = 1; excellent = 5). Twenty NCIN alumni draft toolkit reviewers (response rate 69%) were primarily female (80%) and Hispanic/Latino (40%). Individual chapters' mean overall rating of 4.67 demonstrated strong validation. Mean scores for overall toolkit content (4.57), usability (4.5), relevance (4.79), and quality (4.71) were also excellent. Qualitative comments were analyzed using thematic content analysis and supported the toolkit's relevance and utility. A multilevel peer review process was also conducted. Peer reviewer feedback resulted in a 6-chapter document that offers resources for successful transition to practice and lays the groundwork for continued professional growth. Future research is needed to determine the ideal time to introduce this resource. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Third Party TMDL Development Toolkit

    EPA Pesticide Factsheets

    Water Environment Federation's toolkit provides basic steps in which an organization or group other than the lead water quality agency takes responsibility for developing the TMDL document and supporting analysis.

  15. An approach to improve the care of mid-life women through the implementation of a Women’s Health Assessment Tool/Clinical Decision Support toolkit

    PubMed Central

    Silvestrin, Terry M; Steenrod, Anna W; Coyne, Karin S; Gross, David E; Esinduy, Canan B; Kodsi, Angela B; Slifka, Gayle J; Abraham, Lucy; Araiza, Anna L; Bushmakin, Andrew G; Luo, Xuemei

    2016-01-01

    The objectives of this study are to describe the implementation process of the Women’s Health Assessment Tool/Clinical Decision Support toolkit and summarize patients’ and clinicians’ perceptions of the toolkit. The Women’s Health Assessment Tool/Clinical Decision Support toolkit was piloted at three clinical sites over a 4-month period in Washington State to evaluate health outcomes among mid-life women. The implementation involved a multistep process and engagement of multiple stakeholders over 18 months. Two-thirds of patients (n = 76/110) and clinicians (n = 8/12) participating in the pilot completed feedback surveys; five clinicians participated in qualitative interviews. Most patients felt more prepared for their annual visit (69.7%) and that quality of care improved (68.4%), while clinicians reported streamlined patient visits and improved communication with patients. The Women’s Health Assessment Tool/Clinical Decision Support toolkit offers a unique approach to introduce and address some of the key health issues that affect mid-life women. PMID:27558508

  16. Enhancing the sustainability and climate resiliency of health care facilities: a comparison of initiatives and toolkits.

    PubMed

    Balbus, John; Berry, Peter; Brettle, Meagan; Jagnarine-Azan, Shalini; Soares, Agnes; Ugarte, Ciro; Varangu, Linda; Prats, Elena Villalobos

    2016-09-01

    Extreme weather events have revealed the vulnerability of health care facilities and the extent of devastation to the community when they fail. With climate change anticipated to increase extreme weather and its impacts worldwide (severe droughts, floods, heat waves, and related vector-borne diseases), health care officials need to understand and address the vulnerabilities of their health care systems and take action to improve resiliency in ways that also meet sustainability goals. Generally, the health sector is among a country's largest consumers of energy and a significant source of greenhouse gas emissions. Now it has the opportunity to lead climate mitigation while reducing energy, water, and other costs. This Special Report summarizes several initiatives and compares three toolkits for implementing sustainability and resiliency measures for health care facilities: the Canadian Health Care Facility Climate Change Resiliency Toolkit, the U.S. Sustainable and Climate Resilient Health Care Facilities Toolkit, and the PAHO SMART Hospitals Toolkit of the World Health Organization/Pan American Health Organization. These tools and the lessons learned can provide a critical starting point for any health system in the Americas.

  17. CRISPR-Cas9 Toolkit for Actinomycete Genome Editing.

    PubMed

    Tong, Yaojun; Robertsen, Helene Lunde; Blin, Kai; Weber, Tilmann; Lee, Sang Yup

    2018-01-01

    Bacteria of the order Actinomycetales are one of the most important sources of bioactive natural products, which are the source of many drugs. However, many of them still lack efficient genome editing methods, and some strains cannot be genetically manipulated at all. This restricts systematic metabolic engineering approaches for boosting known and discovering novel natural products. In order to facilitate genome editing for actinomycetes, we developed a highly efficient CRISPR-Cas9 toolkit for actinomycete genome editing. This basic toolkit includes software for spacer (sgRNA) identification, a system for in-frame gene/gene cluster knockout, a system for gene loss-of-function study, a system for generating a random-size deletion library, and a system for gene knockdown. For the latter, a uracil-specific excision reagent (USER) cloning technology was adapted to simplify the CRISPR vector construction process. The application of this toolkit was successfully demonstrated by perturbation of the genomes of Streptomyces coelicolor A3(2) and Streptomyces collinus Tü 365. The CRISPR-Cas9 toolkit and related protocol described here can be widely used for metabolic engineering of actinomycetes.

  18. The PICWidget

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark

    2007-01-01

    The Plug-in Image Component Widget (PICWidget) is a software component for building digital imaging applications. The component is part of a methodology described in GIS Methodology for Planning Planetary-Rover Operations (NPO-41812), which appears elsewhere in this issue of NASA Tech Briefs. Planetary rover missions return a large number and wide variety of image data products that vary in complexity in many ways. Supported by a powerful, flexible image-data-processing pipeline, the PICWidget can process and render many types of imagery, including (but not limited to) thumbnail, subframed, downsampled, stereoscopic, and mosaic images; images coregistered with orbital data; and synthetic red/green/blue images. The PICWidget is capable of efficiently rendering images from data representing many more pixels than are available at the computer workstation where the images are to be displayed. The PICWidget is implemented as an Eclipse plug-in using the Standard Widget Toolkit, which provides a straightforward interface for re-use of the PICWidget in any number of application programs built upon the Eclipse application framework. Because the PICWidget is tile-based and performs aggressive tile caching, it has the flexibility to perform faster or slower, depending on whether more or less memory is available.

  19. X-Windows Information Sharing Protocol Widget Class

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.

    2006-01-01

    The X-Windows Information Sharing Protocol (ISP) Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing ISP graphical-user-interface (GUI) computer programs. ISP programming tasks require many method calls to identify, query, and interpret the connections and messages exchanged between a client and an ISP server. Most X-Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows ISP Widget Class encapsulates the client side of the ISP programming libraries within the framework of an X-Windows widget. Using the widget framework, X-Windows GUI programs can interact with ISP services in an abstract way and in the same manner as that of other graphical widgets, making it easier to write ISP GUI client programs. Wrapping ISP client services inside a widget framework enables a programmer to treat an ISP server interface as though it were a GUI. Moreover, an alternate subclass could implement another communication protocol in the same sort of widget.

  20. The Diabetes Literacy and Numeracy Education Toolkit (DLNET)

    PubMed Central

    Wolff, Kathleen; Cavanaugh, Kerri; Malone, Robb; Hawk, Victoria; Gregory, Becky Pratt; Davis, Dianne; Wallston, Kenneth; Rothman, Russell L.

    2009-01-01

    Diabetes education to improve patient self-management is an important component of comprehensive diabetes care. Patients with low health literacy and numeracy may have difficulty translating information from traditional diabetes educational programs and materials into effective self-care. To address this potential barrier to successful diabetes teaching and counseling, we describe the development of the Diabetes Literacy and Numeracy Education Toolkit (DLNET) and opportunities for its use in clinical practice. The DLNET is composed of 24 interactive modules covering standard diabetes care topics that can be customized to individual patient needs and utilized by all members of the multidisciplinary diabetes care team. The material’s content and formatting aims to improve the ease of use for diabetes patients with low literacy and numeracy by adhering to a lower text reading level, using illustrations for key concepts, and color-coding and other accommodations to guide patients through instructions for self-care. Individual sections of the DLNET may be provided to patients for initial teaching, as well as for reinforcement. While designed for lower literacy and numeracy skills, the DLNET provides unique materials to facilitate diabetes education for all patients. PMID:19240246

  1. iFeature: a python package and web server for features extraction and selection from protein and peptide sequences.

    PubMed

    Chen, Zhen; Zhao, Pei; Li, Fuyi; Leier, André; Marquez-Lago, Tatiana T; Wang, Yanan; Webb, Geoffrey I; Smith, A Ian; Daly, Roger J; Chou, Kuo-Chen; Song, Jiangning

    2018-03-08

    Structural and physiochemical descriptors extracted from sequence data have been widely used to represent sequences and predict structural, functional, expression and interaction profiles of proteins and peptides as well as DNAs/RNAs. Here, we present iFeature, a versatile Python-based toolkit for generating various numerical feature representation schemes for both protein and peptide sequences. iFeature is capable of calculating and extracting a comprehensive spectrum of 18 major sequence encoding schemes that encompass 53 different types of feature descriptors. It also allows users to extract specific amino acid properties from the AAindex database. Furthermore, iFeature integrates 12 different types of commonly used feature clustering, selection, and dimensionality reduction algorithms, greatly facilitating training, analysis, and benchmarking of machine-learning models. The functionality of iFeature is made freely available via an online web server and a stand-alone toolkit. http://iFeature.erc.monash.edu/; https://github.com/Superzchen/iFeature/. jiangning.song@monash.edu; kcchou@gordonlifescience.org; roger.daly@monash.edu. Supplementary data are available at Bioinformatics online.
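    As a self-contained illustration of one of the simplest descriptor types covered by such toolkits, the function below computes the amino acid composition (AAC) of a protein sequence by hand; it is not iFeature's code, and the sequence is a toy example.

```python
# Amino acid composition (AAC): the frequency of each of the 20 standard
# residues in a protein sequence, a 20-dimensional feature vector.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aac(sequence):
    """Return the 20-dimensional amino acid composition of a protein sequence."""
    sequence = sequence.upper()
    length = len(sequence)
    return [sequence.count(aa) / length for aa in AMINO_ACIDS]

features = aac("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")   # toy sequence
```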

  2. Transportation librarian's toolkit

    DOT National Transportation Integrated Search

    2007-12-01

    The Transportation Librarians Toolkit is a product of the Transportation Library Connectivity pooled fund study, TPF- 5(105), a collaborative, grass-roots effort by transportation libraries to enhance information accessibility and professional expert...

  3. The Python ARM Radar Toolkit (Py-ART), a library for working with weather radar data in the Python programming language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helmus, Jonathan J.; Collis, Scott M.

    The Python ARM Radar Toolkit is a package for reading, visualizing, correcting and analysing data from weather radars. Development began to meet the needs of the Atmospheric Radiation Measurement Climate Research Facility and has since expanded to provide a general-purpose framework for working with data from weather radars in the Python programming language. The toolkit is built on top of libraries in the Scientific Python ecosystem including NumPy, SciPy, and matplotlib, and makes use of Cython for interfacing with existing radar libraries written in C and to speed up computationally demanding algorithms. As a result, the source code for the toolkit is available on GitHub and is distributed under a BSD license.
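    A short usage sketch along the lines of Py-ART's public interface: read a radar volume and plot one field with the RadarDisplay helper. The file name is a placeholder, and exact options may vary with the toolkit version.

```python
# Read a radar volume and plot the lowest-sweep reflectivity PPI with Py-ART.
import matplotlib.pyplot as plt
import pyart

radar = pyart.io.read("radar_volume.nc")          # format detected automatically
display = pyart.graph.RadarDisplay(radar)
display.plot("reflectivity", sweep=0)             # PPI of the lowest sweep
plt.show()
```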

  4. Integrated Systems Health Management (ISHM) Toolkit

    NASA Technical Reports Server (NTRS)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  5. The Python ARM Radar Toolkit (Py-ART), a library for working with weather radar data in the Python programming language

    DOE PAGES

    Helmus, Jonathan J.; Collis, Scott M.

    2016-07-18

    The Python ARM Radar Toolkit is a package for reading, visualizing, correcting and analysing data from weather radars. Development began to meet the needs of the Atmospheric Radiation Measurement Climate Research Facility and has since expanded to provide a general-purpose framework for working with data from weather radars in the Python programming language. The toolkit is built on top of libraries in the Scientific Python ecosystem including NumPy, SciPy, and matplotlib, and makes use of Cython for interfacing with existing radar libraries written in C and to speed up computationally demanding algorithms. As a result, the source code for the toolkit is available on GitHub and is distributed under a BSD license.

  6. GPU-based multi-volume ray casting within VTK for medical applications.

    PubMed

    Bozorgi, Mohammadmehdi; Lindseth, Frank

    2015-03-01

    Multi-volume visualization is important for displaying relevant information in multimodal or multitemporal medical imaging studies. The main objective of the current study was to develop an efficient GPU-based multi-volume ray caster (MVRC) and validate the proposed visualization system in the context of image-guided surgical navigation. Ray casting can produce high-quality 2D images from 3D volume data, but the method is computationally demanding, especially when multiple volumes are involved, so a parallel GPU version has been implemented. In the proposed MVRC, imaginary rays are sent through the volumes (one ray for each pixel in the view), and at equal and short intervals along the rays, samples are collected from each volume. Samples from all the volumes are composited using front-to-back α-blending. Since all the rays can be processed simultaneously, the MVRC was implemented in parallel on the GPU to achieve acceptable interactive frame rates. The method is fully integrated within the Visualization Toolkit (VTK) pipeline, with the ability to apply different operations (e.g., transformations, clipping, and cropping) to each volume separately. The implemented method is cross-platform (Windows, Linux, and Mac OS X) and runs on different graphics cards (NVIDIA and AMD). The speed of the MVRC was tested with one to five volumes of varying sizes: 128³, 256³, and 512³ voxels. A Tesla C2070 GPU was used, and the output image size was 600 × 600 pixels. The original VTK single-volume ray caster and the MVRC were compared when rendering only one volume. The multi-volume rendering system achieved an interactive frame rate (>15 fps) when rendering five small volumes (128³ voxels), four medium-sized volumes (256³ voxels), and two large volumes (512³ voxels). When rendering single volumes, the frame rate of the MVRC was comparable to that of the original VTK ray caster for small and medium-sized datasets but was approximately 3 frames per second slower for large datasets. The MVRC was successfully integrated in an existing surgical navigation system and was shown to be clinically useful during an ultrasound-guided neurosurgical tumor resection. A GPU-based MVRC for VTK is a useful tool in medical visualization. The proposed multi-volume GPU-based ray caster for VTK provided high-quality images at reasonable frame rates, and the MVRC was effective when used in a neurosurgical navigation application.
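
    For readers unfamiliar with how a GPU ray caster plugs into the VTK pipeline, the sketch below sets up the standard single-volume GPU ray-cast mapper in VTK's Python bindings. The MVRC described above composites several volumes in a single pass; this sketch shows only the ordinary one-volume pipeline it extends, and the input file name is a placeholder.

    ```python
    # Single-volume GPU ray casting with standard VTK classes (Python bindings).
    # The MVRC in the paper extends this idea to several volumes composited per ray;
    # this sketch covers only the ordinary one-volume pipeline. Input file is a placeholder.
    import vtk

    reader = vtk.vtkMetaImageReader()
    reader.SetFileName("ct_volume.mhd")

    mapper = vtk.vtkGPUVolumeRayCastMapper()
    mapper.SetInputConnection(reader.GetOutputPort())

    color = vtk.vtkColorTransferFunction()
    color.AddRGBPoint(0.0, 0.0, 0.0, 0.0)      # low intensities -> black
    color.AddRGBPoint(1000.0, 1.0, 1.0, 1.0)   # high intensities -> white

    opacity = vtk.vtkPiecewiseFunction()
    opacity.AddPoint(0.0, 0.0)
    opacity.AddPoint(1000.0, 0.8)

    volume_property = vtk.vtkVolumeProperty()
    volume_property.SetColor(color)
    volume_property.SetScalarOpacity(opacity)
    volume_property.SetInterpolationTypeToLinear()

    volume = vtk.vtkVolume()
    volume.SetMapper(mapper)
    volume.SetProperty(volume_property)

    renderer = vtk.vtkRenderer()
    renderer.AddVolume(volume)
    window = vtk.vtkRenderWindow()
    window.AddRenderer(renderer)
    interactor = vtk.vtkRenderWindowInteractor()
    interactor.SetRenderWindow(window)

    window.Render()
    interactor.Start()
    ```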

  7. Atomic Structure

    NASA Astrophysics Data System (ADS)

    Whelan, Colm T.

    2018-04-01

    A knowledge of atomic theory should be an essential part of every physicist's and chemist's toolkit. This book provides an introduction to the basic ideas that govern our understanding of microscopic matter, and the essential features of atomic structure and spectra are presented in a direct and easily accessible manner. Semi-classical ideas are reviewed and an introduction to the quantum mechanics of one and two electron systems and their interaction with external electromagnetic fields is featured. Multielectron atoms are also introduced, and the key methods for calculating their properties reviewed.

  8. PyEvolve: a toolkit for statistical modelling of molecular evolution.

    PubMed

    Butterfield, Andrew; Vedagiri, Vivek; Lang, Edward; Lawrence, Cath; Wakefield, Matthew J; Isaev, Alexander; Huttley, Gavin A

    2004-01-05

    Examining the distribution of variation has proven an extremely profitable technique in the effort to identify sequences of biological significance. Most approaches in the field, however, evaluate only the conserved portions of sequences - ignoring the biological significance of sequence differences. A suite of sophisticated likelihood based statistical models from the field of molecular evolution provides the basis for extracting the information from the full distribution of sequence variation. The number of different problems to which phylogeny-based maximum likelihood calculations can be applied is extensive. Available software packages that can perform likelihood calculations suffer from a lack of flexibility and scalability, or employ error-prone approaches to model parameterisation. Here we describe the implementation of PyEvolve, a toolkit for the application of existing, and development of new, statistical methods for molecular evolution. We present the object architecture and design schema of PyEvolve, which includes an adaptable multi-level parallelisation schema. The approach for defining new methods is illustrated by implementing a novel dinucleotide model of substitution that includes a parameter for mutation of methylated CpG's, which required 8 lines of standard Python code to define. Benchmarking was performed using either a dinucleotide or codon substitution model applied to an alignment of BRCA1 sequences from 20 mammals, or a 10 species subset. Up to five-fold parallel performance gains over serial were recorded. Compared to leading alternative software, PyEvolve exhibited significantly better real world performance for parameter rich models with a large data set, reducing the time required for optimisation from approximately 10 days to approximately 6 hours. PyEvolve provides flexible functionality that can be used either for statistical modelling of molecular evolution, or the development of new methods in the field. The toolkit can be used interactively or by writing and executing scripts. The toolkit uses efficient processes for specifying the parameterisation of statistical models, and implements numerous optimisations that make highly parameter rich likelihood functions solvable within hours on multi-cpu hardware. PyEvolve can be readily adapted in response to changing computational demands and hardware configurations to maximise performance. PyEvolve is released under the GPL and can be downloaded from http://cbis.anu.edu.au/software.

  9. EHDViz: clinical dashboard development using open-source technologies.

    PubMed

    Badgeley, Marcus A; Shameer, Khader; Glicksberg, Benjamin S; Tomlinson, Max S; Levin, Matthew A; McCormick, Patrick J; Kasarskis, Andrew; Reich, David L; Dudley, Joel T

    2016-03-24

    To design, develop and prototype clinical dashboards to integrate high-frequency health and wellness data streams using interactive and real-time data visualisation and analytics modalities. We developed a clinical dashboard development framework called electronic healthcare data visualization (EHDViz) toolkit for generating web-based, real-time clinical dashboards for visualising heterogeneous biomedical, healthcare and wellness data. The EHDViz is an extensible toolkit that uses R packages for data management, normalisation and producing high-quality visualisations over the web using R/Shiny web server architecture. We have developed use cases to illustrate utility of EHDViz in different scenarios of clinical and wellness setting as a visualisation aid for improving healthcare delivery. Using EHDViz, we prototyped clinical dashboards to demonstrate the contextual versatility of EHDViz toolkit. An outpatient cohort was used to visualise population health management tasks (n=14,221), and an inpatient cohort was used to visualise real-time acuity risk in a clinical unit (n=445), and a quantified-self example using wellness data from a fitness activity monitor worn by a single individual was also discussed (n-of-1). The back-end system retrieves relevant data from data source, populates the main panel of the application and integrates user-defined data features in real-time and renders output using modern web browsers. The visualisation elements can be customised using health features, disease names, procedure names or medical codes to populate the visualisations. The source code of EHDViz and various prototypes developed using EHDViz are available in the public domain at http://ehdviz.dudleylab.org. Collaborative data visualisations, wellness trend predictions, risk estimation, proactive acuity status monitoring and knowledge of complex disease indicators are essential components of implementing data-driven precision medicine. As an open-source visualisation framework capable of integrating health assessment, EHDViz aims to be a valuable toolkit for rapid design, development and implementation of scalable clinical data visualisation dashboards. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  10. The Lean and Environment Toolkit

    EPA Pesticide Factsheets

    This Lean and Environment Toolkit assembles practical experience collected by the U.S. Environmental Protection Agency (EPA) and partner companies and organizations that have experience with coordinating Lean implementation and environmental management.

  11. User's manual for the two-dimensional transputer graphics toolkit

    NASA Technical Reports Server (NTRS)

    Ellis, Graham K.

    1988-01-01

    The user manual for the 2-D graphics toolkit for a transputer based parallel processor is presented. The toolkit consists of a package of 2-D display routines that can be used for the simulation visualizations. It supports multiple windows, double buffered screens for animations, and simple graphics transformations such as translation, rotation, and scaling. The display routines are written in occam to take advantage of the multiprocessing features available on transputers. The package is designed to run on a transputer separate from the graphics board.

  12. A Machine Learning and Optimization Toolkit for the Swarm

    DTIC Science & Technology

    2014-11-17

    A Machine Learning and Optimization Toolkit for the Swarm. Ilge Akkaya, Shuhei Emoto... DATES COVERED: 00-00-2014 to 00-00-2014. TITLE AND SUBTITLE: A Machine Learning and Optimization Toolkit for the Swarm. CONTRACT NUMBER: ... machine learning methodologies by providing the right interfaces between machine learning tools and...

  13. A clinical research analytics toolkit for cohort study.

    PubMed

    Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue

    2012-01-01

    This paper presents a clinical informatics toolkit that can assist physicians in conducting cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On the one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexity physicians face during their cohort studies.

  14. UQTk Version 3.0.3 User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sargsyan, Khachik; Safta, Cosmin; Chowdhary, Kamaljit Singh

    2017-05-01

    The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 3.0.3 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.
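
    To make the idea of non-intrusive forward propagation concrete, the sketch below pushes samples of two uncertain inputs through a stand-in model and summarises the output distribution. This is a generic NumPy Monte Carlo illustration, not UQTk's API; UQTk additionally offers polynomial chaos surrogates, sensitivity analysis, and Bayesian inference as described above.

    ```python
    # Generic non-intrusive uncertainty propagation (Monte Carlo sampling).
    # Plain NumPy illustration, not UQTk's API; the model and input
    # distributions below are assumed for the example.
    import numpy as np

    def model(x0, x1):
        """Stand-in computational model with a simple nonlinear response."""
        return np.sin(x0) + 0.5 * x1 ** 2

    rng = np.random.default_rng(0)
    n = 10_000
    x0 = rng.normal(1.0, 0.1, n)     # assumed Gaussian input
    x1 = rng.uniform(0.0, 1.0, n)    # assumed uniform input

    y = model(x0, x1)
    print(f"output mean = {y.mean():.4f}, output std = {y.std():.4f}")
    ```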

  15. Medical high-resolution image sharing and electronic whiteboard system: A pure-web-based system for accessing and discussing lossless original images in telemedicine.

    PubMed

    Qiao, Liang; Li, Ying; Chen, Xin; Yang, Sheng; Gao, Peng; Liu, Hongjun; Feng, Zhengquan; Nian, Yongjian; Qiu, Mingguo

    2015-09-01

    There are various medical image sharing and electronic whiteboard systems available for diagnosis and discussion purposes. However, most of these systems require clients to install special software tools or web plug-ins to support whiteboard discussion, special medical image formats, and customized decoding algorithms for the transmission of HRIs (high-resolution images). This limits the accessibility of the software across different devices and operating systems. In this paper, we propose a pure-web-page solution for lossless sharing and e-whiteboard discussion of medical HRIs, and have set up a medical HRI sharing and e-whiteboard system with a four-layered design: (1) HRI access layer: we improved a tile-pyramid model, named the unbalanced ratio pyramid structure (URPS), to rapidly share lossless HRIs and to adapt to the reading habits of users; (2) format conversion layer: we designed a format conversion engine (FCE) on the server side to convert and cache, in real time, the DICOM tiles that clients request with window-level parameters, maintaining browser compatibility and server-client response efficiency; (3) business logic layer: we built an XML behavior relationship storage structure to store and share users' behavior and support real-time co-browsing and discussion between clients; (4) web-user-interface layer: AJAX technology and the Raphael toolkit were used to combine HTML and JavaScript into a client RIA (rich Internet application), giving clients desktop-like interaction on any pure web page. This system can be used to quickly browse lossless HRIs and supports smooth discussion and co-browsing on any web browser in a diversified network environment. The proposed methods provide a way to share HRIs safely and may be used in regional health, telemedicine, and remote education at low cost. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
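
    The arithmetic behind any tile-pyramid viewer, including adaptations such as URPS, is deciding which tiles cover the current viewport at a given level. The sketch below shows that computation for a conventional fixed-size tile pyramid; it is a generic illustration and not the paper's URPS scheme, and the 256-pixel tile size is an assumption.

    ```python
    # Which tiles cover a viewport at a given pyramid level (generic tile math,
    # not the paper's URPS scheme). Tile size of 256 pixels is an assumption.
    TILE_SIZE = 256

    def tiles_for_viewport(level, x0, y0, width, height):
        """Return (level, col, row) for every tile intersecting the viewport.

        Coordinates are pixels of the image as rendered at `level`, where each
        higher level halves the resolution of the level below it.
        """
        first_col, first_row = x0 // TILE_SIZE, y0 // TILE_SIZE
        last_col = (x0 + width - 1) // TILE_SIZE
        last_row = (y0 + height - 1) // TILE_SIZE
        return [(level, col, row)
                for row in range(first_row, last_row + 1)
                for col in range(first_col, last_col + 1)]

    # Example: a 1024x768 viewport at pixel offset (3000, 2000) on level 2
    print(tiles_for_viewport(2, 3000, 2000, 1024, 768))
    ```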

  16. Food: Too Good to Waste Implementation Guide and Toolkit

    EPA Pesticide Factsheets

    The Food: Too Good to Waste (FTGTW) Implementation Guide and Toolkit is designed for community organizations, local governments, households and others interested in reducing wasteful household food management practices.

  17. A patient and public involvement (PPI) toolkit for meaningful and flexible involvement in clinical trials - a work in progress.

    PubMed

    Bagley, Heather J; Short, Hannah; Harman, Nicola L; Hickey, Helen R; Gamble, Carrol L; Woolfall, Kerry; Young, Bridget; Williamson, Paula R

    2016-01-01

    Funders of research are increasingly requiring researchers to involve patients and the public in their research. Patient and public involvement (PPI) in research can potentially help researchers make sure that the design of their research is relevant, that it is participant friendly and ethically sound. Using and sharing PPI resources can benefit those involved in undertaking PPI, but existing PPI resources are not used consistently and this can lead to duplication of effort. This paper describes how we are developing a toolkit to support clinical trials teams in a clinical trials unit. The toolkit will provide a key 'off the shelf' resource to support trial teams with limited resources, in undertaking PPI. Key activities in further developing and maintaining the toolkit are to: ● listen to the views and experience of both research teams and patient and public contributors who use the tools; ● modify the tools based on our experience of using them; ● identify the need for future tools; ● update the toolkit based on any newly identified resources that come to light; ● raise awareness of the toolkit and ● work in collaboration with others to either develop or test out PPI resources in order to reduce duplication of work in PPI. Background Patient and public involvement (PPI) in research is increasingly a funder requirement due to the potential benefits in the design of relevant, participant friendly, ethically sound research. The use and sharing of resources can benefit PPI, but available resources are not consistently used leading to duplication of effort. This paper describes a developing toolkit to support clinical trials teams to undertake effective and meaningful PPI. Methods The first phase in developing the toolkit was to describe which PPI activities should be considered in the pathway of a clinical trial and at what stage these activities should take place. This pathway was informed through review of the type and timing of PPI activities within trials coordinated by the Clinical Trials Research Centre and previously described areas of potential PPI impact in trials. In the second phase, key websites around PPI and identification of resources opportunistically, e.g. in conversation with other trialists or social media, were used to identify resources. Tools were developed where gaps existed. Results A flowchart was developed describing PPI activities that should be considered in the clinical trial pathway and the point at which these activities should happen. Three toolkit domains were identified: planning PPI; supporting PPI; recording and evaluating PPI. Four main activities and corresponding tools were identified under the planning for PPI: developing a plan; identifying patient and public contributors; allocating appropriate costs; and managing expectations. In supporting PPI, tools were developed to review participant information sheets. These tools, which require a summary of potential trial participant characteristics and circumstances help to clarify requirements and expectations of PPI review. For recording and evaluating PPI, the planned PPI interventions should be monitored in terms of impact, and a tool to monitor public contributor experience is in development. Conclusions This toolkit provides a developing 'off the shelf' resource to support trial teams with limited resources in undertaking PPI. 
Key activities in further developing and maintaining the toolkit are to: listen to the views and experience of both research teams and public contributors using the tools, to identify the need for future tools, to modify tools based on experience of their use; to update the toolkit based on any newly identified resources that come to light; to raise awareness of the toolkit and to work in collaboration with others to both develop and test out PPI resources in order to reduce duplication of work in PPI.

  18. ASERA: A Spectrum Eye Recognition Assistant

    NASA Astrophysics Data System (ADS)

    Yuan, Hailong; Zhang, Haotong; Zhang, Yanxia; Lei, Yajuan; Dong, Yiqiao; Zhao, Yongheng

    2018-04-01

    ASERA, A Spectrum Eye Recognition Assistant, aids in quasar spectral recognition and redshift measurement and can also be used to recognize various types of spectra of stars, galaxies and AGNs (Active Galactic Nuclei). This interactive software allows users to visualize observed spectra, superimpose template spectra from the Sloan Digital Sky Survey (SDSS), and interactively access related spectral line information. ASERA is an efficient and user-friendly semi-automated toolkit for the accurate classification of spectra observed by LAMOST (the Large Sky Area Multi-object Fiber Spectroscopic Telescope) and is available as a standalone Java application and as a Java applet. The software offers several functions, including wavelength and flux scale settings, zoom in and out, redshift estimation, and spectral line identification.

  19. Lean and Information Technology Toolkit

    EPA Pesticide Factsheets

    The Lean and Information Technology Toolkit is a how-to guide which provides resources to environmental agencies to help them use Lean Startup, Lean process improvement, and Agile tools to streamline and automate processes.

  20. Health Information in Kirundi (Rundi)

    MedlinePlus

    Selected resources include the Healthy Living Toolkit: Violence In the Home (English PDF) and Advice for Parents on Talking to Children About the Flu (English PDF).

  1. 78 FR 14774 - U.S. Environmental Solutions Toolkit-Universal Waste

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-07

    ... following list: (a) Mercury Recycling Technology; (b) E-Waste Recycling Technology; (c) CRT Recycling Technology; (d) Lamp Crushing Systems. For purposes of participation in the Toolkit, "United States exporter...

  2. Open source tools for standardized privacy protection of medical images

    NASA Astrophysics Data System (ADS)

    Lien, Chung-Yueh; Onken, Michael; Eichelberg, Marco; Kao, Tsair; Hein, Andreas

    2011-03-01

    In addition to the primary care context, medical images are often useful for research projects and community healthcare networks, so-called "secondary use". Patient privacy becomes an issue in such scenarios since the disclosure of personal health information (PHI) has to be prevented in a sharing environment. In general, most PHIs should be completely removed from the images according to the respective privacy regulations, but some basic and alleviated data is usually required for accurate image interpretation. Our objective is to utilize and enhance these specifications in order to provide reliable software implementations for de- and re-identification of medical images suitable for online and offline delivery. DICOM (Digital Imaging and Communications in Medicine) images are de-identified by replacing PHI-specific information with values still being reasonable for imaging diagnosis and patient indexing. In this paper, this approach is evaluated based on a prototype implementation built on top of the open source framework DCMTK (DICOM Toolkit) utilizing standardized de- and re-identification mechanisms. A set of tools has been developed for DICOM de-identification that meets privacy requirements of an offline and online sharing environment and fully relies on standard-based methods.
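
    The sketch below illustrates the attribute-level part of such de-identification, replacing a few identifying DICOM attributes with neutral values and stripping private elements. It uses the Python pydicom library purely as an illustration; the tools described above are built on DCMTK in C++, and the small tag list here is an assumed example rather than the complete standardized de-identification profile.

    ```python
    # Illustrative attribute-level DICOM de-identification with pydicom.
    # The paper's tools use DCMTK (C++); this only sketches the general idea.
    # The keyword list is a small assumed subset, not the full DICOM profile.
    import pydicom

    PHI_REPLACEMENTS = {
        "PatientName": "ANONYMOUS",
        "PatientID": "ID0000",
        "PatientBirthDate": "",
        "InstitutionName": "",
        "ReferringPhysicianName": "",
    }

    def deidentify(in_path, out_path):
        ds = pydicom.dcmread(in_path)
        for keyword, replacement in PHI_REPLACEMENTS.items():
            if keyword in ds:                 # only touch attributes that are present
                setattr(ds, keyword, replacement)
        ds.remove_private_tags()              # drop vendor-specific private elements
        ds.save_as(out_path)

    deidentify("study_original.dcm", "study_deidentified.dcm")
    ```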

  3. Sculpting 3D worlds with music: advanced texturing techniques

    NASA Astrophysics Data System (ADS)

    Greuel, Christian; Bolas, Mark T.; Bolas, Niko; McDowall, Ian E.

    1996-04-01

    Sound within the virtual environment is often considered to be secondary to the graphics. In a typical scenario, either audio cues are locally associated with specific 3D objects or a general aural ambiance is supplied in order to alleviate the sterility of an artificial experience. This paper discusses a completely different approach, in which cues are extracted from live or recorded music in order to create geometry and control object behaviors within a computer- generated environment. Advanced texturing techniques used to generate complex stereoscopic images are also discussed. By analyzing music for standard audio characteristics such as rhythm and frequency, information is extracted and repackaged for processing. With the Soundsculpt Toolkit, this data is mapped onto individual objects within the virtual environment, along with one or more predetermined behaviors. Mapping decisions are implemented with a user definable schedule and are based on the aesthetic requirements of directors and designers. This provides for visually active, immersive environments in which virtual objects behave in real-time correlation with the music. The resulting music-driven virtual reality opens up several possibilities for new types of artistic and entertainment experiences, such as fully immersive 3D `music videos' and interactive landscapes for live performance.
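
    The sketch below gives a flavour of the analysis step described above: per-frame loudness and dominant frequency are extracted from an audio buffer and mapped onto a simple object parameter. It is a generic NumPy illustration of the idea, not the Soundsculpt Toolkit itself, and the mapping (loudness to scale factor) is an arbitrary example.

    ```python
    # Generic sketch: extract per-frame loudness and dominant frequency from audio
    # and map loudness to a visual scale factor. Not the Soundsculpt Toolkit API.
    import numpy as np

    def frame_features(signal, sample_rate, frame_size=2048):
        """Yield (rms, dominant_hz) for consecutive frames of a mono signal."""
        for start in range(0, len(signal) - frame_size, frame_size):
            frame = signal[start:start + frame_size]
            rms = float(np.sqrt(np.mean(frame ** 2)))
            spectrum = np.abs(np.fft.rfft(frame))
            freqs = np.fft.rfftfreq(frame_size, d=1.0 / sample_rate)
            yield rms, float(freqs[np.argmax(spectrum)])

    def object_scale(rms, base=1.0, gain=5.0):
        """Arbitrary example mapping: louder frames make the object larger."""
        return base + gain * rms

    # Demonstration on a synthetic 440 Hz tone
    sr = 44100
    t = np.arange(sr) / sr
    tone = 0.5 * np.sin(2 * np.pi * 440 * t)
    for rms, hz in list(frame_features(tone, sr))[:3]:
        print(f"rms={rms:.3f}  dominant={hz:.1f} Hz  scale={object_scale(rms):.2f}")
    ```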

  4. Water Quality Trading Toolkit for Permit Writers

    EPA Pesticide Factsheets

    The Water Quality Trading Toolkit for Permit Writers is EPA’s first “how-to” manual on designing and implementing water quality trading programs. It helps NPDES permitting authorities incorporate trading provisions into permits.

  5. Heart Health for Women

    MedlinePlus

    The FDA Office of Women's Health offers a Heart Health Social Media Toolkit; use it to encourage women in your network to stay informed about heart health.

  6. PyBioMed: a python library for various molecular representations of chemicals, proteins and DNAs and their interactions.

    PubMed

    Dong, Jie; Yao, Zhi-Jiang; Zhang, Lin; Luo, Feijun; Lin, Qinlu; Lu, Ai-Ping; Chen, Alex F; Cao, Dong-Sheng

    2018-03-20

    With the continuing development of biotechnology and informatics, publicly available data in chemistry and biology are growing explosively. The wealth of information in these data needs to be extracted and transformed into useful knowledge by various data mining methods. Given the rate at which data accumulate in chemistry and biology, new tools that process and interpret large and complex interaction data are increasingly important. So far, there are no suitable toolkits that effectively link the chemical and biological spaces in terms of molecular representation. To further explore these complex data, an integrated toolkit for various molecular representations is needed that can be easily combined with data mining algorithms to build a full data analysis pipeline. Herein, the Python library PyBioMed is presented, which comprises functionality for the online download of various molecular objects given different IDs, the pretreatment of molecular structures, and the computation of various molecular descriptors for chemicals, proteins, DNAs, and their interactions. PyBioMed is a feature-rich and highly customizable Python library for the characterization of complex chemical and biological molecules and interaction samples. The current version of PyBioMed can calculate 775 chemical descriptors and 19 kinds of chemical fingerprints, 9920 protein descriptors based on protein sequences, more than 6000 DNA descriptors from nucleotide sequences, and interaction descriptors from pairwise samples using three different combining strategies. Several examples and five real-life applications are provided to guide users in applying PyBioMed as an integral part of data analysis projects. Using PyBioMed, users can conveniently run a full pipeline from obtaining molecular data, through pretreating molecules and computing molecular representations, to constructing machine learning models. PyBioMed provides user-friendly and highly customizable APIs for calculating features of biological molecules and complex interaction samples, with the aim of building integrated analysis pipelines from data acquisition, data checking, and descriptor calculation to modeling. PyBioMed is freely available at http://projects.scbdd.com/pybiomed.html.
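
    The sketch below illustrates the last point, forming an interaction descriptor from two per-molecule feature vectors. Concatenation and element-wise product are shown as two plausible combining strategies; this is generic NumPy code, not PyBioMed's API, and the exact strategies PyBioMed implements may differ from these.

    ```python
    # Generic illustration of pairwise interaction descriptors built from two
    # per-molecule feature vectors. Not PyBioMed's API; the combining strategies
    # shown here are examples and may differ from PyBioMed's own.
    import numpy as np

    def combine_concat(f_a, f_b):
        """Interaction descriptor as the concatenation [f_a, f_b]."""
        return np.concatenate([f_a, f_b])

    def combine_product(f_a, f_b):
        """Interaction descriptor as the element-wise product (equal-length inputs)."""
        return f_a * f_b

    # Hypothetical descriptor vectors for a chemical and a protein
    chem = np.array([0.12, 0.55, 0.03, 0.80])
    prot = np.array([0.40, 0.10, 0.66, 0.25])
    print(combine_concat(chem, prot).shape)   # (8,)
    print(combine_product(chem, prot).shape)  # (4,)
    ```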

  7. SimITK: rapid ITK prototyping using the Simulink visual programming environment

    NASA Astrophysics Data System (ADS)

    Dickinson, A. W. L.; Mousavi, P.; Gobbi, D. G.; Abolmaesumi, P.

    2011-03-01

    The Insight Segmentation and Registration Toolkit (ITK) is a long-established software package used for image analysis, visualization, and image-guided surgery applications. This package is a collection of C++ libraries, which can pose usability problems for users without C++ programming experience. To bridge the gap between the programming complexities and the required learning curve of ITK, we present a higher-level visual programming environment that represents ITK methods and classes by wrapping them into "blocks" within MATLAB's visual programming environment, Simulink. These blocks can be connected to form workflows: visual schematics that closely represent the structure of a C++ program. Due to the heavily C++ templated nature of ITK, direct interaction between Simulink and ITK requires an intermediary to convert their respective datatypes and allow intercommunication. We have developed a "Virtual Block" that serves as an intermediate wrapper around the ITK class and is responsible for resolving the templated datatypes used by ITK to native types used by Simulink. Presently, the wrapping procedure for SimITK is semi-automatic in that it requires XML descriptions of the ITK classes as a starting point, as this data is used to create all other necessary integration files. The generation of all source code and object code from the XML is done automatically by a CMake build script that yields Simulink blocks as the final result. An example 3D segmentation workflow using cranial-CT data as well as a 3D MR-to-CT registration workflow are presented as proofs of concept.

  8. An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice.

    PubMed

    Rorrer, Audrey S

    2016-04-01

    This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering Research Experiences for Undergraduates (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community with the purpose of providing targeted instructional resources and tools for quality program evaluation. Challenges were to balance the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Resulting benefits from toolkit deployment included cost-effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts. Copyright © 2016. Published by Elsevier Ltd.

  9. Health Care Facilities Resilient to Climate Change Impacts

    PubMed Central

    Paterson, Jaclyn; Berry, Peter; Ebi, Kristie; Varangu, Linda

    2014-01-01

    Climate change will increase the frequency and magnitude of extreme weather events and create risks that will impact health care facilities. Health care facilities will need to assess climate change risks and adopt adaptive management strategies to be resilient, but guidance tools are lacking. In this study, a toolkit was developed for health care facility officials to assess the resiliency of their facility to climate change impacts. A mixed methods approach was used to develop climate change resiliency indicators to inform the development of the toolkit. The toolkit consists of a checklist for officials who work in areas of emergency management, facilities management and health care services and supply chain management, a facilitator’s guide for administering the checklist, and a resource guidebook to inform adaptation. Six health care facilities representing three provinces in Canada piloted the checklist. Senior level officials with expertise in the aforementioned areas were invited to review the checklist, provide feedback during qualitative interviews and review the final toolkit at a stakeholder workshop. The toolkit helps health care facility officials identify gaps in climate change preparedness, direct allocation of adaptation resources and inform strategic planning to increase resiliency to climate change. PMID:25522050

  10. Addressing Racism in Medical Education An Interactive Training Module.

    PubMed

    White-Davis, Tanya; Edgoose, Jennifer; Brown Speights, Joedrecka S; Fraser, Kathryn; Ring, Jeffrey M; Guh, Jessica; Saba, George W

    2018-05-01

    Education of health care clinicians on racial and ethnic disparities has primarily focused on emphasizing statistics and cultural competency, with minimal attention to racism. Learning about racism and unconscious processes provides skills that reduce bias when interacting with minority patients. This paper describes the responses to a relationship-based workshop and toolkit highlighting issues that medical educators should address when teaching about racism in the context of pernicious health disparities. A multiracial, interdisciplinary team identified essential elements of teaching about racism. A 1.5-hour faculty development workshop consisted of a didactic presentation, a 3-minute video vignette depicting racial and gender microaggression within a hospital setting, small group discussion, large group debrief, and presentation of a toolkit. One hundred twenty diverse participants attended the workshop at the 2016 Society of Teachers of Family Medicine Annual Spring Conference. Qualitative information from small group facilitators and large group discussions identified some participants' emotional reactions to the video including dismay, anger, fear, and shame. A pre/postsurvey (N=72) revealed significant changes in attitude and knowledge regarding issues of racism and in participants' personal commitment to address them. Results suggest that this workshop changed knowledge and attitudes about racism and health inequities. Findings also suggest this workshop improved confidence in teaching learners to reduce racism in patient care. The authors recommend that curricula continue to be developed and disseminated nationally to equip faculty with the skills and teaching resources to effectively incorporate the discussion of racism into the education of health professionals.

  11. A Toolkit for ARB to Integrate Custom Databases and Externally Built Phylogenies

    DOE PAGES

    Essinger, Steven D.; Reichenberger, Erin; Morrison, Calvin; ...

    2015-01-21

    Researchers are perpetually amassing biological sequence data. The computational approaches employed by ecologists for organizing this data (e.g. alignment, phylogeny, etc.) typically scale nonlinearly in execution time with the size of the dataset. This often serves as a bottleneck for processing experimental data since many molecular studies are characterized by massive datasets. To keep up with experimental data demands, ecologists are forced to choose between continually upgrading expensive in-house computer hardware or outsourcing the most demanding computations to the cloud. Outsourcing is attractive since it is the least expensive option, but does not necessarily allow direct user interaction with the data for exploratory analysis. Desktop analytical tools such as ARB are indispensable for this purpose, but they do not necessarily offer a convenient solution for the coordination and integration of datasets between local and outsourced destinations. Therefore, researchers are currently left with an undesirable tradeoff between computational throughput and analytical capability. To mitigate this tradeoff we introduce a software package to leverage the utility of the interactive exploratory tools offered by ARB with the computational throughput of cloud-based resources. Our pipeline serves as middleware between the desktop and the cloud allowing researchers to form local custom databases containing sequences and metadata from multiple resources and a method for linking data outsourced for computation back to the local database. Furthermore, a tutorial implementation of the toolkit is provided in the supporting information, S1 Tutorial.

  12. A Toolkit for ARB to Integrate Custom Databases and Externally Built Phylogenies

    PubMed Central

    Essinger, Steven D.; Reichenberger, Erin; Morrison, Calvin; Blackwood, Christopher B.; Rosen, Gail L.

    2015-01-01

    Researchers are perpetually amassing biological sequence data. The computational approaches employed by ecologists for organizing this data (e.g. alignment, phylogeny, etc.) typically scale nonlinearly in execution time with the size of the dataset. This often serves as a bottleneck for processing experimental data since many molecular studies are characterized by massive datasets. To keep up with experimental data demands, ecologists are forced to choose between continually upgrading expensive in-house computer hardware or outsourcing the most demanding computations to the cloud. Outsourcing is attractive since it is the least expensive option, but does not necessarily allow direct user interaction with the data for exploratory analysis. Desktop analytical tools such as ARB are indispensable for this purpose, but they do not necessarily offer a convenient solution for the coordination and integration of datasets between local and outsourced destinations. Therefore, researchers are currently left with an undesirable tradeoff between computational throughput and analytical capability. To mitigate this tradeoff we introduce a software package to leverage the utility of the interactive exploratory tools offered by ARB with the computational throughput of cloud-based resources. Our pipeline serves as middleware between the desktop and the cloud allowing researchers to form local custom databases containing sequences and metadata from multiple resources and a method for linking data outsourced for computation back to the local database. A tutorial implementation of the toolkit is provided in the supporting information, S1 Tutorial. Availability: http://www.ece.drexel.edu/gailr/EESI/tutorial.php. PMID:25607539

  13. Current Status of VO Compliant Data Service in Japanese Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Shirasaki, Y.; Komiya, Y.; Ohishi, M.; Mizumoto, Y.; Ishihara, Y.; Tsutsumi, J.; Hiyama, T.; Nakamoto, H.; Sakamoto, M.

    2012-09-01

    In recent years, standards for building Virtual Observatory (VO) data services have been established through the efforts of the International Virtual Observatory Alliance (IVOA). We applied these newly established standards (SSAP, TAP) to our VO service toolkit, which was originally developed to implement the earlier VO standards SIAP and the (deprecated) SkyNode. The toolkit can be easily installed and provides a GUI to construct and manage a VO service. In this paper, we describe the architecture of our toolkit and how it is used to start hosting a VO service.
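
    From the client side, a service built with such a toolkit is consumed through the same IVOA protocols. The sketch below queries a TAP endpoint with the pyvo library; the service URL, table, and column names are placeholders, and the snippet reflects common pyvo usage rather than anything specific to the JVO toolkit.

    ```python
    # Sketch of querying a TAP service from the client side with pyvo.
    # Service URL, table, and column names are placeholders.
    import pyvo

    service = pyvo.dal.TAPService("https://example.org/vo/tap")
    result = service.search(
        "SELECT TOP 10 ra, dec FROM example_catalog WHERE dec > 20"
    )
    for row in result:
        print(row["ra"], row["dec"])
    ```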

  14. Loss of Coolant Accident (LOCA) / Emergency Core Coolant System (ECCS Evaluation of Risk-Informed Margins Management Strategies for a Representative Pressurized Water Reactor (PWR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szilard, Ronaldo Henriques

    A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core and fuel design and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, built under an integrated evaluation model framework, is named the LOCA toolkit for the US (LOTUS). The demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.

  15. WIND Toolkit Power Data Site Index

    DOE Data Explorer

    Draxl, Caroline; Mathias-Hodge, Bri

    2016-10-19

    This spreadsheet contains per-site metadata for the WIND Toolkit sites and serves as an index for the raw data hosted on Globus connect (nrel#globus:/globusro/met_data). Aside from the metadata, per site average power and capacity factor are given. This data was prepared by 3TIER under contract by NREL and is public domain. Authoritative documentation on the creation of the underlying dataset is at: Final Report on the Creation of the Wind Integration National Dataset (WIND) Toolkit and API: http://www.nrel.gov/docs/fy16osti/66189.pdf
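
    The sketch below shows one way such a site index might be used once exported to CSV: load it with pandas and summarise capacity factor across sites. The file name and the "site_id" and "capacity_factor" column names are assumptions about the spreadsheet layout, not documented fields.

    ```python
    # Hypothetical use of the WIND Toolkit site index exported to CSV.
    # File name and column names ("site_id", "capacity_factor") are assumptions.
    import pandas as pd

    sites = pd.read_csv("wind_toolkit_site_index.csv")
    print(sites[["site_id", "capacity_factor"]].head())
    print(f"fleet-wide mean capacity factor: {sites['capacity_factor'].mean():.3f}")
    ```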

  16. Abnormal resting-state brain activities in patients with first-episode obsessive-compulsive disorder

    PubMed Central

    Niu, Qihui; Yang, Lei; Song, Xueqin; Chu, Congying; Liu, Hao; Zhang, Lifang; Li, Yan; Zhang, Xiang; Cheng, Jingliang; Li, Youhui

    2017-01-01

    Objective This paper explores resting-state brain activity in patients with first-episode obsessive-compulsive disorder (OCD) and its correlation with disease duration, providing an imaging basis for the clinical diagnosis and pathogenesis of OCD. Methods Twenty-six patients with first-episode OCD and 25 healthy controls (HC group; matched for age, sex, and education level) underwent functional magnetic resonance imaging (fMRI) scanning at resting state. The Statistical Parametric Mapping 8, Data Processing Assistant for Resting-State fMRI, and Resting-State fMRI Data Analysis Toolkit packages were used to process the fMRI data on the Matlab 2012a platform, and differences in regional homogeneity (ReHo) values between the OCD group and HC group were detected with an independent two-sample t-test. With age as a covariate, Pearson correlation analysis was used to study the correlation between disease duration and whole-brain ReHo values. Results Compared with the HC group, ReHo values in the OCD group were decreased in several brain regions, including the left thalamus, right thalamus, right paracentral lobule, and right postcentral gyrus, and were increased in the left angular gyrus. There was a negative correlation between disease duration and the ReHo value in the bilateral orbitofrontal cortex (OFC). Conclusion OCD is a multifactorial disease generally characterized by abnormal activity of many brain regions at resting state. Reduced activity of the OFC is related to OCD duration, which provides new insight into the pathogenesis of OCD. PMID:28243104

  17. The 2016 ACCP Pharmacotherapy Didactic Curriculum Toolkit.

    PubMed

    Schwinghammer, Terry L; Crannage, Andrew J; Boyce, Eric G; Bradley, Bridget; Christensen, Alyssa; Dunnenberger, Henry M; Fravel, Michelle; Gurgle, Holly; Hammond, Drayton A; Kwon, Jennifer; Slain, Douglas; Wargo, Kurt A

    2016-11-01

    The 2016 American College of Clinical Pharmacy (ACCP) Educational Affairs Committee was charged with updating and contemporizing ACCP's 2009 Pharmacotherapy Didactic Curriculum Toolkit. The toolkit has been designed to guide schools and colleges of pharmacy in developing, maintaining, and modifying their curricula. The 2016 committee reviewed the recent medical literature and other documents to identify disease states that are responsive to drug therapy. Diseases and content topics were organized by organ system, when feasible, and grouped into tiers as defined by practice competency. Tier 1 topics should be taught in a manner that prepares all students to provide collaborative, patient-centered care upon graduation and licensure. Tier 2 topics are generally taught in the professional curriculum, but students may require additional knowledge or skills after graduation (e.g., residency training) to achieve competency in providing direct patient care. Tier 3 topics may not be taught in the professional curriculum; thus, graduates will be required to obtain the necessary knowledge and skills on their own to provide direct patient care, if required in their practice. The 2016 toolkit contains 276 diseases and content topics, of which 87 (32%) are categorized as tier 1, 133 (48%) as tier 2, and 56 (20%) as tier 3. The large number of tier 1 topics will require schools and colleges to use creative pedagogical strategies to achieve the necessary practice competencies. Almost half of the topics (48%) are tier 2, highlighting the importance of postgraduate residency training or equivalent practice experience to competently care for patients with these disorders. The Pharmacotherapy Didactic Curriculum Toolkit will continue to be updated to provide guidance to faculty at schools and colleges of pharmacy as these academic pharmacy institutions regularly evaluate and modify their curricula to keep abreast of scientific advances and associated practice changes. Access the current Pharmacotherapy Didactic Curriculum Toolkit at http://www.accp.com/docs/positions/misc/Toolkit_final.pdf. © 2016 Pharmacotherapy Publications, Inc.

  18. Patient-Centered Personal Health Record and Portal Implementation Toolkit for Ambulatory Clinics: A Feasibility Study.

    PubMed

    Nahm, Eun-Shim; Diblasi, Catherine; Gonzales, Eva; Silver, Kristi; Zhu, Shijun; Sagherian, Knar; Kongs, Katherine

    2017-04-01

    Personal health records and patient portals have been shown to be effective in managing chronic illnesses. Despite recent nationwide implementation efforts, the personal health record and patient portal adoption rates among patients are low, and the lack of support for patients using the programs remains a critical gap in most implementation processes. In this study, we implemented the Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit in a large diabetes/endocrinology center and assessed its preliminary impact on personal health record and patient portal knowledge, self-efficacy, patient-provider communication, and adherence to treatment plans. Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit is composed of Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General, clinic-level resources for clinicians, staff, and patients, and Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit Plus, an optional 4-week online resource program for patients ("MyHealthPortal"). First, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General was implemented, and all clinicians and staff were educated about the center's personal health record and patient portal. Then general patient education was initiated, while a randomized controlled trial was conducted to test the preliminary effects of "MyHealthPortal" using a small sample (n = 74) with three observations (baseline and 4 and 12 weeks). The intervention group showed significantly greater improvement than the control group in patient-provider communication at 4 weeks (t(56) = 3.00, P = .004). For other variables, the intervention group tended to show greater improvement; however, the differences were not significant. In this preliminary study, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit showed potential for filling the gap in the current personal health record and patient portal implementation process. Further studies are needed using larger samples in other settings to ascertain if these results are generalizable to other populations.

  19. Grid infrastructure for automatic processing of SAR data for flood applications

    NASA Astrophysics Data System (ADS)

    Kussul, Natalia; Skakun, Serhiy; Shelestov, Andrii

    2010-05-01

    More and more geosciences applications are being moved onto Grids. Geosciences applications are complex: they involve complex workflows, computationally intensive environmental models, and the management and integration of heterogeneous data sets, and Grid computing offers solutions to these problems. Many geosciences applications, especially those related to disaster management and mitigation, require geospatial services to be delivered in a timely manner. For example, information on flooded areas should be provided to the relevant organizations (local authorities, civil protection agencies, UN agencies, etc.) within 24 h so that they can effectively allocate the resources required to mitigate the disaster. Therefore, providing infrastructure and services that enable automatic generation of products based on the integration of heterogeneous data is a task of great importance. In this paper we present a Grid infrastructure for automatic processing of synthetic-aperture radar (SAR) satellite images to derive flood products. In particular, we use SAR data acquired by ESA's ENVISAT satellite, and neural networks to derive flood extent. The data are provided in operational mode from the ESA rolling archive (within an ESA Category-1 grant). We developed a portal that is based on the OpenLayers framework and provides an access point to the developed services. Through the portal the user can define a geographical region and search for the required data. Upon selection of data sets, a workflow is automatically generated and executed on the resources of the Grid infrastructure. For workflow execution and management we use the Karajan language. The SAR data processing workflow consists of the following steps: image calibration, image orthorectification, image processing with neural networks, topographic effects removal, geocoding and transformation to a lat/long projection, and visualisation. These steps are executed by different software and can run on different resources of the Grid system. The resulting geospatial services are available in various OGC standards such as KML and WMS. Currently, the Grid infrastructure integrates the resources of several geographically distributed organizations, in particular: the Space Research Institute NASU-NSAU (Ukraine), with deployed computational and storage nodes based on Globus Toolkit 4 (http://www.globus.org) and gLite 3 (http://glite.web.cern.ch) middleware, access to geospatial data, and a Grid portal; the Institute of Cybernetics of NASU (Ukraine), with deployed computational and storage nodes (SCIT-1/2/3 clusters) based on Globus Toolkit 4 middleware and access to computational resources (approximately 500 processors); and the Center for Earth Observation and Digital Earth, Chinese Academy of Sciences (CEODE-CAS, China), with deployed computational nodes based on Globus Toolkit 4 middleware and access to geospatial data (approximately 16 processors). We are currently adding new geospatial services based on optical satellite data, namely MODIS. This work is carried out jointly with CEODE-CAS. Using the workflow patterns developed for SAR data processing, we are building new workflows for optical data processing.
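
    The sketch below restates the processing chain as a sequence of placeholder steps in the order given above, simply to make the workflow structure explicit. In the real system each step is separate software submitted as a Grid job and orchestrated in Karajan; the Python stubs here are illustrative only.

    ```python
    # Placeholder sketch of the SAR flood-mapping chain in the order described
    # above. Each step is a stub; the real system runs these as Grid jobs
    # orchestrated with Karajan, not as local Python functions.
    def calibrate(scene):
        return scene                      # radiometric calibration

    def orthorectify(scene):
        return scene                      # terrain correction

    def classify_flood(scene):
        return scene                      # neural-network water/non-water mask

    def remove_topographic_effects(mask):
        return mask                       # mask out shadow/layover artefacts

    def geocode(mask):
        return mask                       # reproject to a lat/long projection

    def visualise(mask):
        print("publish flood mask as WMS/KML layer")

    def flood_workflow(raw_scene):
        product = raw_scene
        for step in (calibrate, orthorectify, classify_flood,
                     remove_topographic_effects, geocode):
            product = step(product)
        visualise(product)

    flood_workflow("ENVISAT_ASAR_scene_id")   # placeholder scene identifier
    ```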

  20. Ridesharing options analysis and practitioners' toolkit

    DOT National Transportation Integrated Search

    2010-12-01

    The purpose of this toolkit is to elaborate upon the recent changes in ridesharing, introduce the wide variety that exists in ridesharing programs today, and the developments in technology and funding availability that create greater incentives for p...

  1. Energy Savings Performance Contract Energy Sales Agreement Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    FEMP developed the Energy Savings Performance Contracting Energy Sales Agreement (ESPC ESA) Toolkit to provide federal agency contracting officers and other acquisition team members with information that will facilitate the timely execution of ESPC ESA projects.

  2. Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research.

    PubMed

    Gibaldi, Agostino; Vanegas, Mauricio; Bex, Peter J; Maiello, Guido

    2017-06-01

    The Tobii Eyex Controller is a new low-cost binocular eye tracker marketed for integration in gaming and consumer applications. The manufacturers claim that the system was conceived for natural eye gaze interaction, does not require continuous recalibration, and allows moderate head movements. The Controller is provided with a SDK to foster the development of new eye tracking applications. We review the characteristics of the device for its possible use in scientific research. We develop and evaluate an open source Matlab Toolkit that can be employed to interface with the EyeX device for gaze recording in behavioral experiments. The Toolkit provides calibration procedures tailored to both binocular and monocular experiments, as well as procedures to evaluate other eye tracking devices. The observed performance of the EyeX (i.e. accuracy < 0.6°, precision < 0.25°, latency < 50 ms and sampling frequency ≈55 Hz), is sufficient for some classes of research application. The device can be successfully employed to measure fixation parameters, saccadic, smooth pursuit and vergence eye movements. However, the relatively low sampling rate and moderate precision limit the suitability of the EyeX for monitoring micro-saccadic eye movements or for real-time gaze-contingent stimulus control. For these applications, research grade, high-cost eye tracking technology may still be necessary. Therefore, despite its limitations with respect to high-end devices, the EyeX has the potential to further the dissemination of eye tracking technology to a broad audience, and could be a valuable asset in consumer and gaming applications as well as a subset of basic and clinical research settings.

  3. The Biological Reference Repository (BioR): a rapid and flexible system for genomics annotation.

    PubMed

    Kocher, Jean-Pierre A; Quest, Daniel J; Duffy, Patrick; Meiners, Michael A; Moore, Raymond M; Rider, David; Hossain, Asif; Hart, Steven N; Dinu, Valentin

    2014-07-01

    The Biological Reference Repository (BioR) is a toolkit for annotating variants. BioR stores public and user-specific annotation sources in indexed JSON-encoded flat files (catalogs). The BioR toolkit provides the functionality to combine and retrieve annotation from these catalogs via the command-line interface. Several catalogs from commonly used annotation sources and instructions for creating user-specific catalogs are provided. Commands from the toolkit can be combined with other UNIX commands for advanced annotation processing. We also provide instructions for the development of custom annotation pipelines. The package is implemented in Java and makes use of external tools written in Java and Perl. The toolkit can be executed on Mac OS X 10.5 and above or any Linux distribution. The BioR application, quickstart, and user guide documents and many biological examples are available at http://bioinformaticstools.mayo.edu. © The Author 2014. Published by Oxford University Press.

  4. The GeoViz Toolkit: Using component-oriented coordination methods for geographic visualization and analysis

    PubMed Central

    Hardisty, Frank; Robinson, Anthony C.

    2010-01-01

    In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
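
    To convey the flavour of Introspective Observer Coordination, the sketch below wires components together by introspecting them for listener methods and dispatching events to every match. The original toolkit does this in Java with reflection and design patterns; this is a loose Python analogue with illustrative component and event names.

    ```python
    # Loose Python analogue of introspection-driven observer coordination.
    # The GeoViz Toolkit implements this in Java via reflection; the component
    # and event names here are illustrative only.
    import inspect

    class Coordinator:
        def __init__(self):
            self._listeners = {}   # event name -> list of bound listener methods

        def register(self, component):
            """Discover methods named on_<event> on the component and subscribe them."""
            for name, method in inspect.getmembers(component, inspect.ismethod):
                if name.startswith("on_"):
                    self._listeners.setdefault(name[3:], []).append(method)

        def fire(self, event, payload):
            for method in self._listeners.get(event, []):
                method(payload)

    class MapView:
        def on_selection(self, ids):
            print("MapView highlights", ids)

    class ScatterPlot:
        def on_selection(self, ids):
            print("ScatterPlot highlights", ids)

    coordinator = Coordinator()
    coordinator.register(MapView())
    coordinator.register(ScatterPlot())
    coordinator.fire("selection", [3, 7, 12])
    ```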

  5. Security Hardened Cyber Components for Nuclear Power Plants: Phase I SBIR Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franusich, Michael D.

    SpiralGen, Inc. built a proof-of-concept toolkit for enhancing the cyber security of nuclear power plants and other critical infrastructure with high-assurance instrumentation and control code. The toolkit is based on technology from the DARPA High-Assurance Cyber Military Systems (HACMS) program, which has focused on applying the science of formal methods to the formidable set of problems involved in securing cyber physical systems. The primary challenges beyond HACMS in developing this toolkit were to make the new technology usable by control system engineers and compatible with the regulatory and commercial constraints of the nuclear power industry. The toolkit, packaged as a Simulink add-on, allows a system designer to assemble a high-assurance component from formally specified and proven blocks and generate provably correct control and monitor code for that subsystem.

  6. Margins of safety provided by COSHH Essentials and the ILO Chemical Control Toolkit.

    PubMed

    Jones, Rachael M; Nicas, Mark

    2006-03-01

    COSHH Essentials, developed by the UK Health and Safety Executive, and the Chemical Control Toolkit (Toolkit) proposed by the International Labor Organization, are 'control banding' approaches to workplace risk management intended for use by proprietors of small and medium-sized businesses. Both systems group chemical substances into hazard bands based on toxicological endpoint and potency. COSHH Essentials uses the European Union's Risk-phrases (R-phrases), whereas the Toolkit uses R-phrases and the Globally Harmonized System (GHS) of Classification and Labeling of Chemicals. Each hazard band is associated with a range of airborne concentrations, termed exposure bands, which are to be attained by the implementation of recommended control technologies. Here we analyze the margin of safety afforded by the systems and, for each hazard band, define the minimal margin as the ratio of the minimum airborne concentration that produced the toxicological endpoint of interest in experimental animals to the maximum concentration in workplace air permitted by the exposure band. We found that the minimal margins were always <100, with some ranging to <1, and inversely related to molecular weight. The Toolkit-GHS system generally produced margins equal to or larger than COSHH Essentials, suggesting that the Toolkit-GHS system is more protective of worker health. Although these systems predict exposures comparable with current occupational exposure limits, we argue that the minimal margins are better indicators of health protection. Further, given the small margins observed, we feel it is important that revisions of these systems provide the exposure bands to users, so as to permit evaluation of control technology capture efficiency.

  7. Effects of a Short Video-Based Resident-as-Teacher Training Toolkit on Resident Teaching.

    PubMed

    Ricciotti, Hope A; Freret, Taylor S; Aluko, Ashley; McKeon, Bri Anne; Haviland, Miriam J; Newman, Lori R

    2017-10-01

    To pilot a short video-based resident-as-teacher training toolkit and assess its effect on resident teaching skills in clinical settings. A video-based resident-as-teacher training toolkit was previously developed by educational experts at Beth Israel Deaconess Medical Center, Harvard Medical School. Residents were recruited from two academic hospitals, watched two videos from the toolkit ("Clinical Teaching Skills" and "Effective Clinical Supervision"), and completed an accompanying self-study guide. A novel assessment instrument for evaluating the effect of the toolkit on teaching was created through a modified Delphi process. Before and after the intervention, residents were observed leading a clinical teaching encounter and scored using the 15-item assessment instrument. The primary outcome of interest was the change in number of skills exhibited, which was assessed using the Wilcoxon signed-rank test. Twenty-eight residents from two academic hospitals were enrolled, and 20 (71%) completed all phases of the study. More than one third of residents who volunteered to participate reported no prior formal teacher training. After completing two training modules, residents demonstrated a significant increase in the median number of teaching skills exhibited in a clinical teaching encounter, from 7.5 (interquartile range 6.5-9.5) to 10.0 (interquartile range 9.0-11.5; P<.001). Of the 15 teaching skills assessed, there were significant improvements in asking for the learner's perspective (P=.01), providing feedback (P=.005), and encouraging questions (P=.046). Using a resident-as-teacher video-based toolkit was associated with improvements in teaching skills in residents from multiple specialties.
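    A brief sketch of the paired analysis named in the abstract (Wilcoxon signed-rank test on pre- versus post-intervention skill counts). The skill counts below are illustrative placeholders, not the study's data.

```python
# Paired pre/post comparison of the number of teaching skills exhibited,
# analysed with the Wilcoxon signed-rank test as described in the study design.
from scipy.stats import wilcoxon

pre_skills  = [7, 8, 6, 9, 7, 8, 10, 6, 7, 9]    # illustrative counts only
post_skills = [10, 11, 9, 10, 9, 11, 12, 9, 10, 11]

stat, p_value = wilcoxon(pre_skills, post_skills)
print(f"Wilcoxon statistic = {stat:.1f}, p = {p_value:.4f}")
```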

  8. Innovations and Challenges of Implementing a Glucose Gel Toolkit for Neonatal Hypoglycemia.

    PubMed

    Hammer, Denise; Pohl, Carla; Jacobs, Peggy J; Kaufman, Susan; Drury, Brenda

    2018-05-24

    Transient neonatal hypoglycemia occurs most commonly in newborns who are small for gestational age, large for gestational age, infants of diabetic mothers, and late preterm infants. An exact blood glucose value has not been determined for neonatal hypoglycemia, and it is important to note that poor neurologic outcomes can occur if hypoglycemia is left untreated. Interventions that separate mothers and newborns, as well as use of formula to treat hypoglycemia, have the potential to disrupt exclusive breastfeeding. The aim was to determine whether implementation of a toolkit that includes 40% glucose gel, designed to support staff in adopting the practice change for management of newborns at risk for hypoglycemia in an obstetric unit with a level 2 nursery, would decrease admissions to the Intermediate Care Nursery and increase exclusive breastfeeding. This descriptive study used a retrospective chart review for pre/postimplementation of the Management of Newborns at Risk for Hypoglycemia Toolkit (Toolkit), using a convenience sample of at-risk newborns in the first 2 days of life, to evaluate the proposed outcomes. Following implementation of the Toolkit, at-risk newborns had a clinically but not statistically significant 6.5% increase in exclusive breastfeeding and a clinically but not statistically significant 5% decrease in admissions to the Intermediate Care Nursery. The Toolkit was designed for ease of staff use and to improve outcomes for the at-risk newborn. Future research includes replication at other level 2 and level 1 obstetric centers and investigation into the number of 40% glucose gel doses that can safely be administered.

  9. Improved quality of care for patients infected or colonised with ESBL-producing Enterobacteriaceae in a French teaching hospital: impact of an interventional prospective study and development of specific tools.

    PubMed

    Mondain, Véronique; Lieutier, Florence; Pulcini, Céline; Degand, Nicolas; Landraud, Luce; Ruimy, Raymond; Fosse, Thierry; Roger, Pierre Marie

    2018-05-01

    The increasing incidence of ESBL-producing Enterobacteriaceae (ESBL-E) in France prompted the publication of national recommendations in 2010. Based on these, we developed a toolkit and a warning system to optimise management of ESBL-E infected or colonised patients in both community and hospital settings. The impact of this initiative on quality of care was assessed in a teaching hospital. The ESBL toolkit was developed in 2011 during multidisciplinary meetings involving a regional network of hospital, private clinic and laboratory staff in Southeastern France. It includes antibiotic treatment protocols, a check list, mail templates and a patient information sheet focusing on infection control. Upon identification of ESBL-E, the warning system involves alerting the attending physician and the infectious disease (ID) advisor, with immediate, advice-based implementation of the toolkit. The procedure and toolkit were tested in our teaching hospital. Patient management was compared before and after implementation of the toolkit over two 3-month periods (July-October 2010 and 2012). Implementation of the ESBL-E warning system and ESBL-E toolkit was tested for 87 patients in 2010 and 92 patients in 2012, resulting in improved patient management: expert advice sought and followed (16 vs 97%), information provided to the patient's general practitioner (18 vs 63%) and coding of the condition in the patient's medical file (17 vs 59%), respectively. Our multidisciplinary strategy improved quality of care for in-patients infected or colonised with ESBL-E, increasing compliance with national recommendations.

  10. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin Leigh

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  11. Resource Toolkit for Working with Education Service Providers

    ERIC Educational Resources Information Center

    National Association of Charter School Authorizers (NJ1), 2005

    2005-01-01

    This resource toolkit for working with education service providers contains four sections. Section 1, "Roles, Responsibilities, and Relationships," contains: (1) "Purchasing Services from an Educational Management Organization," excerpted from "The Charter School Administrative and Governance Guide" (Massachusetts Dept. of…

  12. Panoramic Images Mapping Tools Integrated Within the ESRI ArcGIS Software

    NASA Astrophysics Data System (ADS)

    Guo, Jiao; Zhong, Ruofei; Zeng, Fanyang

    2014-03-01

    Panoramic images have attracted broad attention since the appearance of Google Street View, and beyond 360-degree viewing of streets they support many further applications. This paper presents a toolkit, plugged into ArcGIS, that allows panoramic photographs at street level to be viewed directly from ArcMap and all visible elements, such as frontages, trees and bridges, to be measured and captured. We use a series of panoramic images with absolute coordinates obtained through GPS and IMU. Two methods are used to measure objects from these panoramic images: one intersects the object position from a stereo pair; the other uses multi-image matching over three or more images that all cover the object. When a user wants to measure an object, any two panoramic images that both contain it can be chosen and displayed in ArcMap; the correlation coefficient of the two chosen panoramic images is then calculated in order to compute the coordinates of the object. Our study tests different configurations of panoramic pairs and compares the measurements with the true values of the objects so as to recommend the best choice of pairs. The article mainly elaborates the principles of calculating the correlation coefficient and of multi-image matching.
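    A small sketch of the kind of correlation score used when matching the same object across overlapping panoramic images. This is an illustration of normalized cross-correlation on toy patches, not the ArcGIS plug-in's own code.

```python
# Normalized cross-correlation between two image patches; a score near 1
# indicates a likely correspondence between views of the same object.
import numpy as np

def normalized_cross_correlation(patch_a: np.ndarray, patch_b: np.ndarray) -> float:
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

# Two toy 5x5 patches, the second a slightly perturbed copy of the first.
rng = np.random.default_rng(0)
patch = rng.random((5, 5))
print(normalized_cross_correlation(patch, patch + 0.01 * rng.random((5, 5))))
```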

  13. Verification of a Multiphysics Toolkit against the Magnetized Target Fusion Concept

    NASA Technical Reports Server (NTRS)

    Thomas, Scott; Perrell, Eric; Liron, Caroline; Chiroux, Robert; Cassibry, Jason; Adams, Robert B.

    2005-01-01

    In the spring of 2004 the Advanced Concepts team at MSFC embarked on an ambitious project to develop a suite of modeling routines that would interact with one another. The tools would each numerically model a portion of any advanced propulsion system. The tools were divided by physics categories, hence the name multiphysics toolset. Currently, most of the anticipated modeling tools have been created and integrated. Results are given in this paper for both a quarter nozzle with chemically reacting flow and the interaction of two plasma jets representative of a Magnetized Target Fusion device. The results have not been calibrated against real data as of yet, but this paper demonstrates the current capability of the multiphysics tool and planned future enhancements.

  14. Parallel Three-Dimensional Computation of Fluid Dynamics and Fluid-Structure Interactions of Ram-Air Parachutes

    NASA Technical Reports Server (NTRS)

    Tezduyar, Tayfun E.

    1998-01-01

    This is a final report as far as our work at University of Minnesota is concerned. The report describes our research progress and accomplishments in development of high performance computing methods and tools for 3D finite element computation of aerodynamic characteristics and fluid-structure interactions (FSI) arising in airdrop systems, namely ram-air parachutes and round parachutes. This class of simulations involves complex geometries, flexible structural components, deforming fluid domains, and unsteady flow patterns. The key components of our simulation toolkit are a stabilized finite element flow solver, a nonlinear structural dynamics solver, an automatic mesh moving scheme, and an interface between the fluid and structural solvers; all of these have been developed within a parallel message-passing paradigm.

  15. Improving safety on rural local and tribal roads safety toolkit.

    DOT National Transportation Integrated Search

    2014-08-01

    Rural roadway safety is an important issue for communities throughout the country and presents a challenge for state, local, and Tribal agencies. The Improving Safety on Rural Local and Tribal Roads Safety Toolkit was created to help rural local ...

  16. A methodological toolkit for field assessments of artisanally mined alluvial diamond deposits

    USGS Publications Warehouse

    Chirico, Peter G.; Malpeli, Katherine C.

    2014-01-01

    This toolkit provides a standardized checklist of critical issues relevant to artisanal mining-related field research. An integrated sociophysical geographic approach to collecting data at artisanal mine sites is outlined. The implementation and results of a multistakeholder approach to data collection, carried out in the assessment of Guinea’s artisanally mined diamond deposits, also are summarized. This toolkit, based on recent and successful field campaigns in West Africa, has been developed as a reference document to assist other government agencies or organizations in collecting the data necessary for artisanal diamond mining or similar natural resource assessments.

  17. TRSkit: A Simple Digital Library Toolkit

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is the person that must continuously and synchronously distribute anywhere from 100 - 100,000's of information units and does not have extensive resources to devote to the problem.

  18. RAVE—a Detector-independent vertex reconstruction toolkit

    NASA Astrophysics Data System (ADS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-10-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available. VERTIGO = "vertex reconstruction toolkit and interface to generic objects".

  19. Cancer-Related Analysis of Variants Toolkit (CRAVAT) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    CRAVAT is an easy to use web-based tool for analysis of cancer variants (missense, nonsense, in-frame indel, frameshift indel, splice site). CRAVAT provides scores and a variety of annotations that assist in identification of important variants. Results are provided in an interactive, highly graphical webpage and include annotated 3D structure visualization. CRAVAT is also available for local or cloud-based installation as a Docker container. MuPIT provides 3D visualization of mutation clusters and functional annotation and is now integrated with CRAVAT.

  20. Methodology for the development of a taxonomy and toolkit to evaluate health-related habits and lifestyle (eVITAL)

    PubMed Central

    2010-01-01

    Background Chronic diseases cause an ever-increasing percentage of morbidity and mortality, but many have modifiable risk factors. Many behaviors that predispose or protect an individual to chronic disease are interrelated, and therefore are best approached using an integrated model of health and the longevity paradigm, using years lived without disability as the endpoint. Findings This study used a 4-phase mixed qualitative design to create a taxonomy and related online toolkit for the evaluation of health-related habits. Core members of a working group conducted a literature review and created a framing document that defined relevant constructs. This document was revised, first by a working group and then by a series of multidisciplinary expert groups. The working group and expert panels also designed a systematic evaluation of health behaviors and risks, which was computerized and evaluated for feasibility. A demonstration study of the toolkit was performed in 11 healthy volunteers. Discussion In this protocol, we used forms of the community intelligence approach, including frame analysis, feasibility, and demonstration, to develop a clinical taxonomy and an online toolkit with standardized procedures for screening and evaluation of multiple domains of health, with a focus on longevity and the goal of integrating the toolkit into routine clinical practice. Trial Registration IMSERSO registry 200700012672 PMID:20334642

  1. Implementing a Breastfeeding Toolkit for Nursing Education.

    PubMed

    Folker-Maglaya, Catherine; Pylman, Maureen E; Couch, Kimberly A; Spatz, Diane L; Marzalik, Penny R

    All health professional organizations recommend exclusive breastfeeding for at least 6 months, with continued breastfeeding for 1 year or more after birth. Women cite lack of support from health professionals as a barrier to breastfeeding. Meanwhile, breastfeeding education is not considered essential to basic nursing education and students are not adequately prepared to support breastfeeding women. Therefore, a toolkit of comprehensive evidence-based breastfeeding educational materials was developed to provide essential breastfeeding knowledge. A study was performed to determine the effectiveness of the breastfeeding toolkit education in an associate degree nursing program. A pretest/posttest survey design with intervention and comparison groups was used. One hundred fourteen students completed pre- and posttests. Student knowledge was measured using a 12-item survey derived with minor modifications from Marzalik's 2004 instrument measuring breastfeeding knowledge. When pre- and posttests scores were compared within groups, both groups' knowledge scores increased. A change score was calculated with a significantly higher mean score for the intervention group. When regression analysis was used to control for the pretest score, belonging to the intervention group increased student scores but not significantly. The toolkit was developed to provide a curriculum that demonstrates enhanced learning to prepare nursing students for practice. The toolkit could be used in other settings, such as to educate staff nurses working with childbearing families.

  2. The Automatic Parallelisation of Scientific Application Codes Using a Computer Aided Parallelisation Toolkit

    NASA Technical Reports Server (NTRS)

    Ierotheou, C.; Johnson, S.; Leggett, P.; Cross, M.; Evans, E.; Jin, Hao-Qiang; Frumkin, M.; Yan, J.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. Historically, the lack of a programming standard for using directives and the rather limited performance due to scalability have affected the take-up of this programming model approach. Significant progress has been made in hardware and software technologies; as a result, the performance of parallel programs with compiler directives has also improved. The introduction of an industrial standard for shared-memory programming with directives, OpenMP, has also addressed the issue of portability. In this study, we have extended the computer aided parallelization toolkit (developed at the University of Greenwich) to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline the way in which loop types are categorized and how efficient OpenMP directives can be defined and placed using the in-depth interprocedural analysis that is carried out by the toolkit. We also discuss the application of the toolkit to the NAS Parallel Benchmarks and a number of real-world application codes. This work demonstrates not only the great potential of using the toolkit to quickly parallelize serial programs but also the good performance achievable on up to 300 processors for hybrid message passing and directive-based parallelizations.

  3. Tailored prevention of inpatient falls: development and usability testing of the fall TIPS toolkit.

    PubMed

    Zuyev, Lyubov; Benoit, Angela N; Chang, Frank Y; Dykes, Patricia C

    2011-02-01

    Patient falls and fall-related injuries are serious problems in hospitals. The Fall TIPS application aims to prevent patient falls by translating routine nursing fall risk assessment into a decision support intervention that communicates fall risk status and creates a tailored evidence-based plan of care that is accessible to the care team, patients, and family members. In our design and implementation of the Fall TIPS toolkit, we used the Spiral Software Development Life Cycle model. The toolkit generates three output tools: a bed poster, a plan of care, and a patient education handout. A preliminary design of the application was based on initial requirements defined by project leaders and informed by focus groups with end users. The preliminary design partially simulated the paper version of the Morse Fall Scale currently used in hospitals involved in the research study. Strengths and weaknesses of the first prototype were identified by heuristic evaluation. Usability testing was performed at the sites where the research study is being implemented. Suggestions from end users participating in the usability studies were either incorporated directly into the toolkit and output tools, incorporated in slightly modified form, or will be addressed during training. The next step is implementation of the fall prevention toolkit on the pilot testing units.

  4. Genetic Engineering of Bee Gut Microbiome Bacteria with a Toolkit for Modular Assembly of Broad-Host-Range Plasmids.

    PubMed

    Leonard, Sean P; Perutka, Jiri; Powell, J Elijah; Geng, Peng; Richhart, Darby D; Byrom, Michelle; Kar, Shaunak; Davies, Bryan W; Ellington, Andrew D; Moran, Nancy A; Barrick, Jeffrey E

    2018-05-18

    Engineering the bacteria present in animal microbiomes promises to lead to breakthroughs in medicine and agriculture, but progress is hampered by a dearth of tools for genetically modifying the diverse species that comprise these communities. Here we present a toolkit of genetic parts for the modular construction of broad-host-range plasmids built around the RSF1010 replicon. Golden Gate assembly of parts in this toolkit can be used to rapidly test various antibiotic resistance markers, promoters, fluorescent reporters, and other coding sequences in newly isolated bacteria. We demonstrate the utility of this toolkit in multiple species of Proteobacteria that are native to the gut microbiomes of honey bees (Apis mellifera) and bumble bees (Bombus sp.). Expressing fluorescent proteins in Snodgrassella alvi, Gilliamella apicola, Bartonella apis, and Serratia strains enables us to visualize how these bacteria colonize the bee gut. We also demonstrate CRISPRi repression in B. apis and use Cas9-facilitated knockout of an S. alvi adhesion gene to show that it is important for colonization of the gut. Beyond characterizing how the gut microbiome influences the health of these prominent pollinators, this bee microbiome toolkit (BTK) will be useful for engineering bacteria found in other natural microbial communities.

  5. Track structure modeling in liquid water: A review of the Geant4-DNA very low energy extension of the Geant4 Monte Carlo simulation toolkit.

    PubMed

    Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S

    2015-12-01

    Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open source access platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes the description of new physical models for the description of electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the list of Geant4-DNA examples is described. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  6. PLUS: open-source toolkit for ultrasound-guided intervention systems.

    PubMed

    Lasso, Andras; Heffter, Tamas; Rankin, Adam; Pinter, Csaba; Ungi, Tamas; Fichtinger, Gabor

    2014-10-01

    A variety of advanced image analysis methods have been under development for ultrasound-guided interventions. Unfortunately, the transition from an image analysis algorithm to clinical feasibility trials as part of an intervention system requires integration of many components, such as imaging and tracking devices, data processing algorithms, and visualization software. The objective of our paper is to provide a freely available open-source software platform, PLUS (Public software Library for Ultrasound), to facilitate rapid prototyping of ultrasound-guided intervention systems for translational clinical research. PLUS provides a variety of methods for interventional tool pose and ultrasound image acquisition from a wide range of tracking and imaging devices, spatial and temporal calibration, volume reconstruction, simulated image generation, and recording and live streaming of the acquired data. This paper introduces PLUS, explains its functionality and architecture, and presents typical uses and performance in ultrasound-guided intervention systems. PLUS fulfills the essential requirements for the development of ultrasound-guided intervention systems and it aspires to become a widely used translational research prototyping platform. PLUS is freely available as open source software under a BSD license and can be downloaded from http://www.plustoolkit.org.

  7. Marine Debris and Plastic Source Reduction Toolkit

    EPA Pesticide Factsheets

    Many plastic food service ware items originate on college and university campuses—in cafeterias, snack rooms, cafés, and eateries with take-out dining options. This Campus Toolkit is a detailed “how to” guide for reducing plastic waste on college campuses.

  8. 77 FR 73023 - U.S. Environmental Solutions Toolkit

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    Notice from the Department of Commerce, International Trade Administration, concerning the U.S. Environmental Solutions Toolkit: a resource for foreign officials and foreign end-users of environmental technologies that will outline U.S. approaches to solving environmental problems, and for U.S. companies that can export related technologies.

  9. THE EPANET PROGRAMMER'S TOOLKIT FOR ANALYSIS OF WATER DISTRIBUTION SYSTEMS

    EPA Science Inventory

    The EPANET Programmer's Toolkit is a collection of functions that helps simplify computer programming of water distribution network analyses. The functions can be used to read in a pipe network description file, modify selected component properties, run multiple hydraulic and wa...
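    A minimal sketch of calling the toolkit's C functions from Python via ctypes. The shared-library name, the input file, and the node parameter code below are assumptions for illustration; the function names follow the published epanet2.h API, but everything should be verified against your toolkit build before use.

```python
# Driving the EPANET Programmer's Toolkit from Python with ctypes (sketch).
# Assumes a shared-library build of EPANET 2 and an existing "network.inp";
# verify EN_PRESSURE and the library path against your epanet2.h / install.
import ctypes

lib = ctypes.CDLL("libepanet2.so")          # assumed library name/path
EN_PRESSURE = 11                            # node parameter code (check epanet2.h)

err = lib.ENopen(b"network.inp", b"network.rpt", b"")
if err != 0:
    raise RuntimeError(f"ENopen failed with error code {err}")

lib.ENsolveH()                              # run a complete hydraulic analysis

pressure = ctypes.c_float()
lib.ENgetnodevalue(1, EN_PRESSURE, ctypes.byref(pressure))  # node index 1
print(f"pressure at node 1: {pressure.value:.2f}")

lib.ENclose()
```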

  10. Accuracy of Assessment of Eligibility for Early Medical Abortion by Community Health Workers in Ethiopia, India and South Africa.

    PubMed

    Johnston, Heidi Bart; Ganatra, Bela; Nguyen, My Huong; Habib, Ndema; Afework, Mesganaw Fantahun; Harries, Jane; Iyengar, Kirti; Moodley, Jennifer; Lema, Hailu Yeneneh; Constant, Deborah; Sen, Swapnaleen

    2016-01-01

    To assess the accuracy of community health workers' assessment of eligibility for early medical abortion using a simple checklist toolkit. Diagnostic accuracy study. Ethiopia, India and South Africa. Two hundred seventeen women in Ethiopia, 258 in India and 236 in South Africa were enrolled into the study. A checklist toolkit to determine eligibility for early medical abortion was validated by comparing results of clinician and community health worker assessment of eligibility using the checklist toolkit with the reference standard exam. Accuracy was over 90% and the negative likelihood ratio <0.1 at all three sites when the toolkit was used by clinician assessors. Positive likelihood ratios were 4.3 in Ethiopia, 5.8 in India and 6.3 in South Africa. When used by community health workers, the overall accuracy of the toolkit was 92% in Ethiopia, 80% in India and 77% in South Africa; negative likelihood ratios were 0.08 in Ethiopia, 0.25 in India and 0.22 in South Africa; and positive likelihood ratios were 5.9 in Ethiopia and 2.0 in India and South Africa. The checklist toolkit, as used by clinicians, was excellent at ruling out participants who were not eligible, and moderately effective at ruling in participants who were eligible for medical abortion. Results were promising when used by community health workers, particularly in Ethiopia, where they had more prior experience with the use of diagnostic aids and longer professional training. The checklist toolkit assessments resulted in some participants being wrongly assessed as eligible for medical abortion, which is an area of concern. Further research is needed to streamline the components of the tool, explore the optimal duration and content of training for community health workers, and test feasibility and acceptability.
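    For readers unfamiliar with the reported metrics, a small sketch of how accuracy and likelihood ratios fall out of a 2x2 table comparing checklist results with the reference-standard exam. The counts are made up for illustration and are not the study's data.

```python
# Accuracy, LR+ and LR- from a 2x2 diagnostic table (illustrative counts only).
def diagnostic_summary(tp: int, fp: int, fn: int, tn: int) -> dict:
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return {
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "LR+": sensitivity / (1 - specificity),   # how strongly a positive rules in
        "LR-": (1 - sensitivity) / specificity,   # how strongly a negative rules out
    }

print(diagnostic_summary(tp=150, fp=20, fn=10, tn=70))
```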

  11. An evaluation of a toolkit for the early detection, management, and control of carbapenemase-producing Enterobacteriaceae: a survey of acute hospital trusts in England.

    PubMed

    Coope, C M; Verlander, N Q; Schneider, A; Hopkins, S; Welfare, W; Johnson, A P; Patel, B; Oliver, I

    2018-03-09

    Following hospital outbreaks of carbapenemase-producing Enterobacteriaceae (CPE), Public Health England published a toolkit in December 2013 to promote the early detection, management, and control of CPE colonization and infection in acute hospital settings. To examine awareness, uptake, implementation and usefulness of the CPE toolkit and identify potential barriers and facilitators to its adoption in order to inform future guidance. A cross-sectional survey of National Health Service (NHS) acute trusts was conducted in May 2016. Descriptive analysis and multivariable regression models were conducted, and narrative responses were analysed thematically and informed using behaviour change theory. Most (92%) acute trusts had a written CPE plan. Fewer (75%) reported consistent compliance with screening and isolation of CPE risk patients. Lower prioritization and weaker senior management support for CPE prevention were associated with poorer compliance. Awareness of the CPE toolkit was high and all trusts with patients infected or colonized with CPE had used the toolkit either as provided (32%), or to inform (65%) their own local CPE plan. Despite this, many respondents (80%) did not believe that the CPE toolkit guidance offered an effective means to prevent CPE or was practical to follow. CPE prevention and control requires robust IPC measures. Successful implementation can be hindered by a complex set of factors related to their practical execution, insufficient resources and a lack of confidence in the effectiveness of the guidance. Future CPE guidance would benefit from substantive user involvement, processes for ongoing feedback, and regular guidance updates. Copyright © 2018 The Healthcare Infection Society. All rights reserved.

  12. A randomized controlled trial to test the efficacy of the SCI Get Fit Toolkit on leisure-time physical activity behaviour and social-cognitive processes in adults with spinal cord injury.

    PubMed

    Arbour-Nicitopoulos, Kelly P; Sweet, Shane N; Lamontagne, Marie-Eve; Ginis, Kathleen A Martin; Jeske, Samantha; Routhier, François; Latimer-Cheung, Amy E

    2017-01-01

    Single blind, two-group randomized controlled trial. To evaluate the efficacy of the SCI Get Fit Toolkit delivered online on theoretical constructs and moderate-to-vigorous physical activity (MVPA) among adults with SCI. Ontario and Quebec, Canada. Inactive, English- and French-speaking Canadian adults with traumatic SCI with Internet access, and no self-reported cognitive or memory impairments. Participants (N=90; mean age 48.12±11.29 years; 79% male) were randomized to view the SCI Get Fit Toolkit or the Physical Activity Guidelines for adults with SCI (PAG-SCI) online. Primary (intentions) and secondary (outcome expectancies, self-efficacy, planning and MVPA behaviour) outcomes were assessed over a 1-month period. Of the 90 participants randomized, 77 were included in the analyses. Participants viewed the experimental stimuli only briefly, reading the 4-page toolkit for approximately 2.5 min longer than the 1-page guideline document. No condition effects were found for intentions, outcome expectancies, self-efficacy, and planning (ΔR² ≤ 0.03). Individuals in the toolkit condition were more likely to participate in at least one bout of 20 min of MVPA behaviour at 1-week post-intervention compared to individuals in the guidelines condition (OR=3.54, 95% CI=0.95, 13.17). However, no differences were found when examining change in weekly minutes of MVPA or comparing whether participants met the PAG-SCI. No firm conclusions can be made regarding the impact of the SCI Get Fit Toolkit in comparison to the PAG-SCI on social cognitions and MVPA behaviour. The limited online access to this resource may partially explain these null findings.

  13. Development and formative evaluation of the e-Health Implementation Toolkit (e-HIT).

    PubMed

    Murray, Elizabeth; May, Carl; Mair, Frances

    2010-10-18

    The use of Information and Communication Technology (ICT) or e-Health is seen as essential for a modern, cost-effective health service. However, there are well documented problems with implementation of e-Health initiatives, despite the existence of a great deal of research into how best to implement e-Health (an example of the gap between research and practice). This paper reports on the development and formative evaluation of an e-Health Implementation Toolkit (e-HIT) which aims to summarise and synthesise new and existing research on implementation of e-Health initiatives, and present it to senior managers in a user-friendly format. The content of the e-HIT was derived by combining data from a systematic review of reviews of barriers and facilitators to implementation of e-Health initiatives with qualitative data derived from interviews of "implementers", that is people who had been charged with implementing an e-Health initiative. These data were summarised, synthesised and combined with the constructs from the Normalisation Process Model. The software for the toolkit was developed by a commercial company (RocketScience). Formative evaluation was undertaken by obtaining user feedback. There are three components to the toolkit--a section on background and instructions for use aimed at novice users; the toolkit itself; and the report generated by completing the toolkit. It is available to download from http://www.ucl.ac.uk/pcph/research/ehealth/documents/e-HIT.xls. The e-HIT shows potential as a tool for enhancing future e-Health implementations. Further work is needed to make it fully web-enabled, and to determine its predictive potential for future implementations.

  15. PAGANI Toolkit: Parallel graph-theoretical analysis package for brain network big data.

    PubMed

    Du, Haixiao; Xia, Mingrui; Zhao, Kang; Liao, Xuhong; Yang, Huazhong; Wang, Yu; He, Yong

    2018-05-01

    The recent collection of unprecedented quantities of neuroimaging data with high spatial resolution has led to brain network big data. However, a toolkit for fast and scalable computational solutions is still lacking. Here, we developed the PArallel Graph-theoretical ANalysIs (PAGANI) Toolkit based on a hybrid central processing unit-graphics processing unit (CPU-GPU) framework with a graphical user interface to facilitate the mapping and characterization of high-resolution brain networks. Specifically, the toolkit provides flexible parameters for users to customize computations of graph metrics in brain network analyses. As an empirical example, the PAGANI Toolkit was applied to individual voxel-based brain networks with ∼200,000 nodes that were derived from a resting-state fMRI dataset of 624 healthy young adults from the Human Connectome Project. Using a personal computer, this toolbox completed all computations in ∼27 h for one subject, which is markedly less than the 118 h required with a single-thread implementation. The voxel-based functional brain networks exhibited prominent small-world characteristics and densely connected hubs, which were mainly located in the medial and lateral fronto-parietal cortices. Moreover, the female group had significantly higher modularity and nodal betweenness centrality mainly in the medial/lateral fronto-parietal and occipital cortices than the male group. Significant correlations between the intelligence quotient and nodal metrics were also observed in several frontal regions. Collectively, the PAGANI Toolkit shows high computational performance and good scalability for analyzing connectome big data and provides a friendly interface without the complicated configuration of computing environments, thereby facilitating high-resolution connectomics research in health and disease. © 2018 Wiley Periodicals, Inc.
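    The two graph metrics highlighted above (modularity and nodal betweenness centrality) can be illustrated on a toy network with NetworkX; this is only a demonstration of the metrics themselves, not the PAGANI CPU-GPU implementation.

```python
# Modularity and nodal betweenness centrality on a small stand-in network.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity

G = nx.karate_club_graph()                      # toy stand-in for a brain network

betweenness = nx.betweenness_centrality(G)      # nodal betweenness centrality
communities = greedy_modularity_communities(G)  # community partition
Q = modularity(G, communities)

top_node = max(betweenness, key=betweenness.get)
print(f"modularity Q = {Q:.3f}; most central node = {top_node}")
```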

  16. Keep It Sacred | National Native Network

    Science.gov Websites

    National Native Network website resources, including Tribal Public Health Data Toolkits, a Smoke-Free Policy Toolkit, a cancer guide, webinar and newsletter archives, and a resource library.

  17. Toolkit for US colleges/schools of pharmacy to prepare learners for careers in academia.

    PubMed

    Haines, Seena L; Summa, Maria A; Peeters, Michael J; Dy-Boarman, Eliza A; Boyle, Jaclyn A; Clifford, Kalin M; Willson, Megan N

    2017-09-01

    The objective of this article is to provide an academic toolkit for use by colleges/schools of pharmacy to prepare student pharmacists/residents for academic careers. Through the American Association of Colleges of Pharmacy (AACP) Section of Pharmacy Practice, the Student Resident Engagement Task Force (SRETF) collated teaching materials used by colleges/schools of pharmacy from a previously reported national survey. The SRETF developed a toolkit for student pharmacists/residents interested in academic pharmacy. Eighteen institutions provided materials; five provided materials describing didactic coursework; over fifteen provided materials for an academia-focused Advanced Pharmacy Practice Experience (APPE), while one provided materials for an APPE teaching-research elective. SRETF members created a syllabus template and sample lesson plan by integrating submitted resources. Submissions still needed to complete the toolkit include examples of curricular tracks and certificate programs. Pharmacy faculty vacancies still exist in pharmacy education. Engaging student pharmacists/residents in the academic pillars of teaching, scholarship, and service is critical for the future success of the academy. Published by Elsevier Inc.

  18. Hydrogen quantitative risk assessment workshop proceedings.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG] and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen Specific QRA Toolkit.

  19. Assessing Coverage of Population-Based and Targeted Fortification Programs with the Use of the Fortification Assessment Coverage Toolkit (FACT): Background, Toolkit Development, and Supplement Overview.

    PubMed

    Friesen, Valerie M; Aaron, Grant J; Myatt, Mark; Neufeld, Lynnette M

    2017-05-01

    Food fortification is a widely used approach to increase micronutrient intake in the diet. High coverage is essential for achieving impact. Data on coverage is limited in many countries, and tools to assess coverage of fortification programs have not been standardized. In 2013, the Global Alliance for Improved Nutrition developed the Fortification Assessment Coverage Toolkit (FACT) to carry out coverage assessments in both population-based (i.e., staple foods and/or condiments) and targeted (e.g., infant and young child) fortification programs. The toolkit was designed to generate evidence on program coverage and the use of fortified foods to provide timely and programmatically relevant information for decision making. This supplement presents results from FACT surveys that assessed the coverage of population-based and targeted food fortification programs across 14 countries. It then discusses the policy and program implications of the findings for the potential for impact and program improvement.

  20. A Synthetic Coiled-Coil Interactome Provides Heterospecific Modules for Molecular Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reinke, Aaron W.; Grant, Robert A.; Keating, Amy E.

    2010-06-21

    The versatile coiled-coil protein motif is widely used to induce and control macromolecular interactions in biology and materials science. Yet the types of interaction patterns that can be constructed using known coiled coils are limited. Here we greatly expand the coiled-coil toolkit by measuring the complete pairwise interactions of 48 synthetic coiled coils and 7 human bZIP coiled coils using peptide microarrays. The resulting 55-member protein 'interactome' includes 27 pairs of interacting peptides that preferentially heteroassociate. The 27 pairs can be used in combinations to assemble sets of 3 to 6 proteins that compose networks of varying topologies. Of special interest are heterospecific peptide pairs that participate in mutually orthogonal interactions. Such pairs provide the opportunity to dimerize two separate molecular systems without undesired crosstalk. Solution and structural characterization of two such sets of orthogonal heterodimers provide details of their interaction geometries. The orthogonal pairs, along with the many other network motifs discovered in our screen, provide new capabilities for synthetic biology and other applications.

  1. Protocol: a multi-level intervention program to reduce stress in 9-1-1 telecommunicators.

    PubMed

    Meischke, Hendrika; Lilly, Michelle; Beaton, Randal; Calhoun, Rebecca; Tu, Ann; Stangenes, Scott; Painter, Ian; Revere, Debra; Baseman, Janet

    2018-05-02

    Nationwide, emergency response systems depend on 9-1-1 telecommunicators to prioritize, triage, and dispatch assistance to those in distress. 9-1-1 call center telecommunicators (TCs) are challenged by acute and chronic workplace stressors: tense interactions with citizen callers in crisis; overtime; shift-work; ever-changing technologies; and negative work culture, including co-worker conflict. This workforce is also subject to routine exposures to secondary traumatization while handling calls involving emergency situations and while making time urgent, high stake decisions over the phone. Our study aims to test the effectiveness of a multi-part intervention to reduce stress in 9-1-1 TCs through an online mindfulness training and a toolkit containing workplace stressor reduction resources. The study employs a randomized controlled trial design with three data collection points. The multi-part intervention includes an individual-level online mindfulness training and a call center-level organizational stress reduction toolkit. 160 TCs will be recruited from 9-1-1 call centers, complete a baseline survey at enrollment, and are randomly assigned to an intervention or a control group. Intervention group participants will start a 7-week online mindfulness training developed in-house and tailored to 9-1-1 TCs and their call center environment; control participants will be "waitlisted" and start the training after the study period ends. Following the intervention group's completion of the mindfulness training, all participants complete a second survey. Next, the online toolkit with call-center wide stress reduction resources is made available to managers of all participating call centers. After 3 months, a third survey will be completed by all participants. The primary outcome is 9-1-1 TCs' self-reported symptoms of stress at three time points as measured by the C-SOSI (Calgary Symptoms of Stress Inventory). Secondary outcomes will include: perceptions of social work environment (measured by metrics of social support and network conflict); mindfulness; and perceptions of social work environment and mindfulness as mediators of stress reduction. This study will evaluate the effectiveness of an online mindfulness training and call center-wide stress reduction toolkit in reducing self-reported stress in 9-1-1 TCs. The results of this study will add to the growing body of research on worksite stress reduction programs. ClinicalTrials.gov Registration Number: NCT02961621 Registered on November 7, 2016 (retrospectively registered).

  2. A Toolkit for Eye Recognition of LAMOST Spectroscopy

    NASA Astrophysics Data System (ADS)

    Yuan, H.; Zhang, H.; Zhang, Y.; Lei, Y.; Dong, Y.; Zhao, Y.

    2014-05-01

    The Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST, also named the Guo Shou Jing Telescope) finished its pilot survey and began the normal survey at the end of September 2012. Millions of targets have already been observed, including thousands of quasar candidates. Because automatic identification of quasar spectra is difficult, eye recognition remains necessary and effective; however, identifying massive numbers of spectra by eye is a huge job. To improve the efficiency and effectiveness of spectral identification, a toolkit for eye recognition of LAMOST spectroscopy has been developed. Spectral cross-correlation templates from the Sloan Digital Sky Survey (SDSS) are used as references, including O star, O/B transition star, B star, A star, F/A transition star, F star, G star, K star, M1 star, M3 star, M5 star, M8 star, L1 star, magnetic white dwarf, carbon star, white dwarf, B white dwarf, low-metallicity K sub-dwarf, "early-type" galaxy, galaxy, "later-type" galaxy, Luminous Red Galaxy, QSO, QSO with some BAL activity, and high-luminosity QSO. By adjusting the redshift and flux ratio of the template spectra in an interactive graphical interface, the spectral type of the target can be determined in an easy and practical way, and the redshift is estimated at the same time with a precision of roughly one part in a thousand. The tool is particularly advantageous for low-quality spectra. Spectra from the LAMOST Pilot Survey are used as examples, and spectra from SDSS are also tested for comparison. Target spectra in both image and FITS formats are supported. For convenience, several ways of accessing spectra are provided. All spectra from the LAMOST Pilot Survey can be located and acquired via VOTable files on the internet, as recommended by the International Virtual Observatory Alliance (IVOA). After the construction of a Simple Spectral Access Protocol (SSAP) service by the Chinese Astronomical Data Center (CAsDC), spectra can be obtained and analyzed more efficiently.
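    A small sketch of the core operation behind the interactive comparison described above: shifting a rest-frame template to an observed frame and rescaling its flux before overlaying it on a target spectrum. The wavelength grid and template line are synthetic placeholders, not LAMOST or SDSS data.

```python
# Redshift a rest-frame template and rescale its flux (observed = rest * (1 + z)).
import numpy as np

def shift_template(rest_wavelength: np.ndarray, flux: np.ndarray,
                   z: float, flux_ratio: float):
    """Return the template moved to redshift z and scaled by flux_ratio."""
    observed_wavelength = rest_wavelength * (1.0 + z)
    return observed_wavelength, flux * flux_ratio

rest_wl = np.linspace(3800.0, 9200.0, 2048)                        # Angstroms
template_flux = np.exp(-0.5 * ((rest_wl - 6563.0) / 20.0) ** 2)    # toy H-alpha line

obs_wl, obs_flux = shift_template(rest_wl, template_flux, z=0.1, flux_ratio=2.0)
print(f"toy H-alpha line moves to {obs_wl[np.argmax(obs_flux)]:.0f} Angstroms")
```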

  3. Visualising crystal packing interactions in solid-state NMR: Concepts and applications

    NASA Astrophysics Data System (ADS)

    Zilka, Miri; Sturniolo, Simone; Brown, Steven P.; Yates, Jonathan R.

    2017-10-01

    In this article, we introduce and apply a methodology, based on density functional theory and the gauge-including projector augmented wave approach, to explore the effects of packing interactions on solid-state nuclear magnetic resonance (NMR) parameters. A visual map derived from a so-termed "magnetic shielding contribution field" can be made of the contributions to the magnetic shielding of a specific site—partitioning the chemical shift to specific interactions. The relation to the established approaches of examining the molecule to crystal change in the chemical shift and the nuclear independent chemical shift is established. The results are applied to a large sample of 71 molecular crystals and three further specific examples from supermolecular chemistry and pharmaceuticals. This approach extends the NMR crystallography toolkit and provides insight into the development of both cluster based approaches to the predictions of chemical shifts and for empirical predictions of chemical shifts in solids.

  4. Studying groundwater and surface water interactions using airborne remote sensing in Heihe River basin, northwest China

    NASA Astrophysics Data System (ADS)

    Liu, C.; Liu, J.; Hu, Y.; Zheng, C.

    2015-05-01

    Managing surface water and groundwater as a unified system is important for water resource exploitation and aquatic ecosystem conservation. The unified approach to water management needs accurate characterization of surface water and groundwater interactions. Temperature is a natural tracer for identifying surface water and groundwater interactions, and the use of remote sensing techniques facilitates basin-scale temperature measurement. This study focuses on the Heihe River basin, the second largest inland river basin in the arid and semi-arid northwest of China, where surface water and groundwater undergo dynamic exchanges. The spatially continuous river-surface temperature of the midstream section of the Heihe River was obtained by using an airborne pushbroom hyperspectral thermal sensor system. By using the hot spot analysis toolkit in the ArcGIS software, abnormally cold water zones were identified as indicators of the spatial pattern of groundwater discharge to the river.
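    The study itself used the ArcGIS hot spot analysis toolkit (Getis-Ord statistics); the sketch below is only a simplified z-score screen for anomalously cold pixels on a synthetic temperature field, to show the idea of flagging cold zones as candidate groundwater discharge areas.

```python
# Simplified screen for anomalously cold river-surface pixels (not the ArcGIS tool).
import numpy as np

rng = np.random.default_rng(1)
surface_temp = 18.0 + rng.normal(0.0, 0.5, size=(100, 100))   # synthetic field, degrees C
surface_temp[40:45, 60:65] -= 3.0                              # injected cold zone

z = (surface_temp - surface_temp.mean()) / surface_temp.std()
cold_mask = z < -2.0                                           # candidate discharge zones
print(f"flagged {cold_mask.sum()} of {cold_mask.size} pixels as anomalously cold")
```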

  5. Perspectives of healthcare providers and HIV-affected individuals and couples during the development of a Safer Conception Counseling Toolkit in Kenya: stigma, fears, and recommendations for the delivery of services.

    PubMed

    Mmeje, Okeoma; Njoroge, Betty; Akama, Eliud; Leddy, Anna; Breitnauer, Brooke; Darbes, Lynae; Brown, Joelle

    2016-01-01

    Reproduction is important to many HIV-affected individuals and couples and healthcare providers (HCPs) are responsible for providing resources to help them safely conceive while minimizing the risk of sexual and perinatal HIV transmission. In order to fulfill their reproductive goals, HIV-affected individuals and their partners need access to information regarding safer methods of conception. The objective of this qualitative study was to develop a Safer Conception Counseling Toolkit that can be used to train HCPs and counsel HIV-affected individuals and couples in HIV care and treatment clinics in Kenya. We conducted a two-phased qualitative study among HCPs and HIV-affected individuals and couples from eight HIV care and treatment sites in Kisumu, Kenya. We conducted in-depth interviews (IDIs) and focus group discussions (FGDs) to assess the perspectives of HCPs and HIV-affected individuals and couples in order to develop and refine the content of the Toolkit. Subsequently, IDIs were conducted among HCPs who were trained using the Toolkit and FGDs among HIV-affected individuals and couples who were counseled with the Toolkit. HIV-related stigma, fears, and recommendations for delivery of safer conception counseling were assessed during the discussions. One hundred and six individuals participated in FGDs and IDIs; 29 HCPs, 49 HIV-affected women and men, and 14 HIV-serodiscordant couples. Participants indicated that a safer conception counseling and training program for HCPs is needed and that routine provision of safer conception counseling may promote maternal and child health by enhancing reproductive autonomy among HIV-affected couples. They also reported that the Toolkit may help dispel the stigma and fears associated with reproduction in HIV-affected couples, while supporting them in achieving their reproductive goals. Additional research is needed to evaluate the Safer Conception Toolkit in order to support its implementation and use in HIV care and treatment programs in Kenya and other HIV endemic regions of sub-Saharan Africa.

  6. WIND Toolkit Offshore Summary Dataset

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draxl, Caroline; Musial, Walt; Scott, George

    This dataset contains summary statistics for offshore wind resources for the continental United States derived from the Wind Integration National Dataset (WIND) Toolkit. These data are available in two formats: GDB - Compressed geodatabases containing statistical summaries aligned with lease blocks (aliquots) stored in a GIS format. These data are partitioned into Pacific, Atlantic, and Gulf resource regions. HDF5 - Statistical summaries of all points in the offshore Pacific, Atlantic, and Gulf offshore regions. These data are located on the original WIND Toolkit grid and have not been reassigned or downsampled to lease blocks. These data were developed under contract by NREL for the Bureau of Ocean Energy Management (BOEM).
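    A hedged sketch of inspecting one of the HDF5 summary files with h5py. The file name shown is an assumption, and the dataset names inside the real WIND Toolkit offshore files should be listed before extracting anything.

```python
# Inspect an HDF5 statistical-summary file (file name below is assumed).
import h5py

with h5py.File("wtk_offshore_atlantic_summary.h5", "r") as f:
    f.visit(print)                       # list every group/dataset in the file
    # Once the real dataset name is known, pull it like:
    # mean_ws = f["windspeed_100m_mean"][...]
```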

  7. Land surface Verification Toolkit (LVT)

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  8. MAVEN Data Analysis and Visualization Toolkits

    NASA Astrophysics Data System (ADS)

    Harter, B., Jr.; DeWolfe, A. W.; Brain, D.; Chaffin, M.

    2017-12-01

    The Mars Atmospheric and Volatile Evolution (MAVEN) mission has been collecting data at Mars since September 2014. The MAVEN Science Data Center has developed software toolkits for analyzing and visualizing the science data. Our Data Intercomparison and Visualization Development Effort (DIVIDE) toolkit is written in IDL, and utilizes the widely used "tplot" IDL libraries. Recently, we have converted DIVIDE into Python in an effort to increase the accessibility of the MAVEN data. This conversion also necessitated the development of a Python version of the tplot libraries, which we have dubbed "PyTplot". PyTplot is generalized to work with missions beyond MAVEN, and our software is available on Github.
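    A brief sketch of the tplot-style workflow mentioned above, using the public pytplot package: store a named variable, then plot it. The variable and values here are synthetic; real MAVEN data would be loaded through the mission tools before being handed to store_data.

```python
# Store and plot a synthetic time series in the tplot style with pytplot.
import numpy as np
import pytplot

times = np.arange(0, 3600, 10, dtype=float)   # seconds past an arbitrary epoch
density = 5.0 + np.sin(times / 600.0)         # toy plasma density values

pytplot.store_data("toy_density", data={"x": times, "y": density})
pytplot.tplot("toy_density")                  # opens the tplot-style panel
```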

  9. An Industrial Physics Toolkit

    NASA Astrophysics Data System (ADS)

    Cummings, Bill

    2004-03-01

    Physicists possess many skills highly valued in industrial companies. However, with the exception of a decreasing number of positions in long range research at large companies, job openings in industry rarely say "Physicist Required." One key to a successful industrial career is to know what subset of your physics skills is most highly valued by a given industry and to continue to build these skills while working. This combination of skills from both academic and industrial experience becomes your "Industrial Physics Toolkit" and is a transferable resource when you change positions or companies. This presentation will describe how to build and sell your own "Industrial Physics Toolkit," using concrete examples from the speaker's industrial experience.

  10. WIRM: An Open Source Toolkit for Building Biomedical Web Applications

    PubMed Central

    Jakobovits, Rex M.; Rosse, Cornelius; Brinkley, James F.

    2002-01-01

    This article describes an innovative software toolkit that allows the creation of web applications that facilitate the acquisition, integration, and dissemination of multimedia biomedical data over the web, thereby reducing the cost of knowledge sharing. There is a lack of high-level web application development tools suitable for use by researchers, clinicians, and educators who are not skilled programmers. Our Web Interfacing Repository Manager (WIRM) is a software toolkit that reduces the complexity of building custom biomedical web applications. WIRM’s visual modeling tools enable domain experts to describe the structure of their knowledge, from which WIRM automatically generates full-featured, customizable content management systems. PMID:12386108

  11. RAPID Toolkit Creates Smooth Flow Toward New Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levine, Aaron; Young, Katherine

    2016-07-01

    Uncertainty about the duration and outcome of the permitting process has historically been seen as a deterrent to investment in renewable energy projects, including new hydropower projects. What if the process were clearer, smoother, faster? That's the purpose of the Regulatory and Permitting Information Desktop (RAPID) Toolkit, developed by the National Renewable Energy Laboratory (NREL) with funding from the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy and the Western Governors' Association. Now, the RAPID Toolkit is being expanded to include information about developing and permitting hydropower projects, with initial outreach and information gathering occurring during 2015.

  12. Open source tools and toolkits for bioinformatics: significance, and where are we?

    PubMed

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  13. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    PubMed

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.
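    As a point of reference, the kind of task the GUI exposes can also be scripted directly in PyRosetta. The short sketch below, which uses a placeholder PDB path, loads and scores a structure; it is an illustrative example and not code from the PyRosetta Toolkit itself.

        # Illustrative PyRosetta script: load a structure and score it with a
        # standard all-atom score function. "input.pdb" is a placeholder path.
        import pyrosetta

        pyrosetta.init()                                        # start Rosetta with default options
        pose = pyrosetta.pose_from_pdb("input.pdb")             # load a structure
        scorefxn = pyrosetta.create_score_function("ref2015")   # standard score function
        print("total score:", scorefxn(pose))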

  14. Filling the gaps between tools and users: a tool comparator, using protein-protein interaction as an example.

    PubMed

    Kano, Yoshinobu; Nguyen, Ngan; Saetre, Rune; Yoshida, Kazuhiro; Miyao, Yusuke; Tsuruoka, Yoshimasa; Matsubayashi, Yuichiro; Ananiadou, Sophia; Tsujii, Jun'ichi

    2008-01-01

    Recently, several text mining programs have reached a near-practical level of performance. Some systems are already being used by biologists and database curators. However, it has also been recognized that current Natural Language Processing (NLP) and Text Mining (TM) technology is not easy to deploy, since research groups tend to develop systems that cater specifically to their own requirements. One of the major reasons for the difficulty of deployment of NLP/TM technology is that re-usability and interoperability of software tools are typically not considered during development. While some effort has been invested in making interoperable NLP/TM toolkits, the developers of end-to-end systems still often struggle to reuse NLP/TM tools, and often opt to develop similar programs from scratch instead. This is particularly the case in BioNLP, since the requirements of biologists are so diverse that NLP tools have to be adapted and re-organized in a much more extensive manner than was originally expected. Although generic frameworks like UIMA (Unstructured Information Management Architecture) provide promising ways to solve this problem, the solution that they provide is only partial. In order for truly interoperable toolkits to become a reality, we also need sharable type systems and a developer-friendly environment for software integration that includes functionality for systematic comparisons of available tools, a simple I/O interface, and visualization tools. In this paper, we describe such an environment that was developed based on UIMA, and we show its feasibility through our experience in developing a protein-protein interaction (PPI) extraction system.

  15. A Toolkit to Implement Graduate Attributes in Geography Curricula

    ERIC Educational Resources Information Center

    Spronken-Smith, Rachel; McLean, Angela; Smith, Nell; Bond, Carol; Jenkins, Martin; Marshall, Stephen; Frielick, Stanley

    2016-01-01

    This article uses findings from a project on engagement with graduate outcomes across higher education institutions in New Zealand to produce a toolkit for implementing graduate attributes in geography curricula. Key facets include strong leadership; academic developers to facilitate conversations about graduate attributes and teaching towards…

  16. Developments in Geometric Metadata and Tools at the PDS Ring-Moon Systems Node

    NASA Astrophysics Data System (ADS)

    Showalter, M. R.; Ballard, L.; French, R. S.; Gordon, M. K.; Tiscareno, M. S.

    2018-04-01

    Object-Oriented Python/SPICE (OOPS) is an overlay on the SPICE toolkit that vastly simplifies and speeds up geometry calculations for planetary data products. This toolkit is the basis for much of the development at the PDS Ring-Moon Systems Node.
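    For context, the sketch below shows the kind of raw SPICE call sequence (here via the spiceypy wrapper) that an overlay such as OOPS is intended to simplify; the meta-kernel path, epoch, and body names are placeholders, and OOPS itself is not shown.

        # Bare SPICE geometry call via spiceypy: position of a target as seen
        # from an observer at a given epoch. Kernel path and names are placeholders.
        import spiceypy as spice

        spice.furnsh("kernels/mission.tm")            # load a meta-kernel (placeholder)
        et = spice.str2et("2018-04-01T00:00:00")      # UTC string -> ephemeris time
        pos, lt = spice.spkpos("SATURN", et, "J2000", "LT+S", "EARTH")
        print(pos, lt)
        spice.kclear()                                # unload kernels when done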

  17. Teacher Quality Toolkit. 2nd Edition

    ERIC Educational Resources Information Center

    Lauer, Patricia A.; Dean, Ceri B.; Martin-Glenn, Mya L.; Asensio, Margaret L.

    2005-01-01

    The Teacher Quality Toolkit addresses the continuum of teacher learning by providing tools that can be used to improve both preservice, and inservice teacher education. Each chapter provides self assessment tools that can guide progress toward improved teacher quality and describes resources for designing exemplary programs and practices. Chapters…

  18. WHU at TREC KBA Vital Filtering Track 2014

    DTIC Science & Technology

    2014-11-01

    Our approach is to view the problem as a classification problem and use the Stanford NLP Toolkit to extract necessary information. Various kinds of features are leveraged to ... profile of an entity.

  19. Basic Internet Software Toolkit.

    ERIC Educational Resources Information Center

    Buchanan, Larry

    1998-01-01

    Once schools are connected to the Internet, the next step is getting network workstations configured for Internet access. This article describes a basic toolkit comprising software currently available on the Internet for free or modest cost. Lists URLs for Web browser, Telnet, FTP, file decompression, portable document format (PDF) reader,…

  20. Integrating medical imaging analyses through a high-throughput bundled resource imaging system

    NASA Astrophysics Data System (ADS)

    Covington, Kelsie; Welch, E. Brian; Jeong, Ha-Kyu; Landman, Bennett A.

    2011-03-01

    Exploitation of advanced, PACS-centric image analysis and interpretation pipelines provides well-developed storage, retrieval, and archival capabilities along with state-of-the-art data provenance, visualization, and clinical collaboration technologies. However, pursuit of integrated medical imaging analysis through a PACS environment can be limiting in terms of the overhead required to validate, evaluate and integrate emerging research technologies. Herein, we address this challenge through presentation of a high-throughput bundled resource imaging system (HUBRIS) as an extension to the Philips Research Imaging Development Environment (PRIDE). HUBRIS enables PACS-connected medical imaging equipment to invoke tools provided by the Java Imaging Science Toolkit (JIST) so that a medical imaging platform (e.g., a magnetic resonance imaging scanner) can pass images and parameters to a server, which communicates with a grid computing facility to invoke the selected algorithms. Generated images are passed back to the server and subsequently to the imaging platform from which the images can be sent to a PACS. JIST makes use of an open application program interface layer so that research technologies can be implemented in any language capable of communicating through a system shell environment (e.g., Matlab, Java, C/C++, Perl, LISP, etc.). As demonstrated in this proof-of-concept approach, HUBRIS enables evaluation and analysis of emerging technologies within well-developed PACS systems with minimal adaptation of research software, which simplifies evaluation of new technologies in clinical research and provides a more convenient use of PACS technology by imaging scientists.
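    As an illustration of the shell-level coupling described above (not code from HUBRIS or JIST), a pipeline can hand an external research algorithm its inputs, outputs, and parameters on the command line and collect its log output; every name in the sketch below is hypothetical.

        # Hypothetical shell-level integration of an external research tool.
        import subprocess

        cmd = [
            "segment_tool",                         # hypothetical executable
            "--input", "subject01_t1.nii.gz",       # placeholder input image
            "--output", "subject01_labels.nii.gz",  # placeholder output image
            "--smoothing", "2.0",
        ]
        result = subprocess.run(cmd, capture_output=True, text=True, check=True)
        print(result.stdout)                        # log text emitted by the tool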

  1. Programmability in AIPS++

    NASA Technical Reports Server (NTRS)

    Hjellming, R. M.

    1992-01-01

    AIPS++ is an Astronomical Information Processing System being designed and implemented by an international consortium of NRAO and six other radio astronomy institutions in Australia, India, the Netherlands, the United Kingdom, Canada, and the USA. AIPS++ is intended to replace the functionality of AIPS, to be more easily programmable, and will be implemented in C++ using object-oriented techniques. Programmability in AIPS++ is planned at three levels. The first level will be that of a command-line interpreter with characteristics similar to IDL and PV-Wave, but with an intensive set of operations appropriate to telescope data handling, image formation, and image processing. The second level will be in C++ with extensive use of class libraries for both basic operations and advanced applications. The third level will allow input and output of data between external FORTRAN programs and AIPS++ telescope and image databases. In addition to summarizing the above programmability characteristics, this talk will give an overview of the classes currently being designed for telescope data calibration and editing, image formation, and the 'toolkit' of mathematical 'objects' that will perform most of the processing in AIPS++.

  2. A Platform to Monitor Tumor Cellular and Vascular Response to Radiation Therapy by Optical Coherence Tomography and Fluorescence Microscopy in vivo

    NASA Astrophysics Data System (ADS)

    Leung, Michael Ka Kit

    Radiotherapy plays a significant role in cancer treatment, and is thought to be curative by mainly killing tumor cells through damage to their genetic material. However, recent findings indicate that the tumor's vascular blood supply is also a major determinant of radiation response. The goals of this thesis are to: (1) develop an experimental platform for small animals to deliver ionizing radiation and perform high-resolution optical imaging to treatment targets, and (2) use this toolkit to longitudinally monitor the response of tumors and the associated vasculature. The thesis has achieved: (1) customization of a novel micro-irradiator for mice, (2) technical development of an improved optical coherence tomography imaging system, (3) comprehensive experimental protocol and imaging optimization for optical microscopy in a specialized animal model, and (4) completion of a feasibility study to demonstrate the capabilities of the experimental platform in monitoring the response of tumor and vasculature to radiotherapy.

  3. Optical toolkits for in vivo deep tissue laser scanning microscopy: a primer

    NASA Astrophysics Data System (ADS)

    Lee, Woei Ming; McMenamin, Thomas; Li, Yongxiao

    2018-06-01

    Life at the microscale is animated and multifaceted. The impact of dynamic in vivo microscopy in small animals has opened up opportunities to peer into a multitude of biological processes at the cellular scale in their native microenvironments. Laser scanning microscopy (LSM) coupled with targeted fluorescent proteins has become an indispensable tool to enable dynamic imaging in vivo at high temporal and spatial resolutions. In the last few decades, the technique has been translated from imaging cells in thin samples to mapping cells in the thick biological tissue of living organisms. Here, we sought to provide a concise overview of the design considerations of a LSM that enables cellular and subcellular imaging in deep tissue. Individual components under review include: long working distance microscope objectives, laser scanning technologies, adaptive optics devices, beam shaping technologies and photon detectors, with an emphasis on more recent advances. The review will conclude with the latest innovations in automated optical microscopy, which would impact tracking and quantification of heterogeneous populations of cells in vivo.

  4. Implementation of augmented reality to models sultan deli

    NASA Astrophysics Data System (ADS)

    Syahputra, M. F.; Lumbantobing, N. P.; Siregar, B.; Rahmat, R. F.; Andayani, U.

    2018-03-01

    Augmented reality is a technology that can provide visualization in the form of a 3D virtual model. By combining augmented reality with image-based modeling, a photograph of the Sultan of Deli can be restored into a three-dimensional model of the Sultan at Istana Maimun. This matters because the Sultan of Deli, one of the important figures in the history of the development of the city of Medan, is little known to the public: the surviving images of the Deli Sultanate are unclear and very old. To achieve this goal, augmented reality applications are used together with an image-processing pipeline that turns photographs into 3D models through several toolkits. The output of this method is a visitor's photograph at Maimun Palace combined with the 3D model of the Sultan of Deli, using marker detection at distances of 20-60 cm, making it easy for the public to recognize the Sultan of Deli who once ruled at Maimun Palace.

  5. Linking Neurons to Network Function and Behavior by Two-Photon Holographic Optogenetics and Volumetric Imaging.

    PubMed

    Dal Maschio, Marco; Donovan, Joseph C; Helmbrecht, Thomas O; Baier, Herwig

    2017-05-17

    We introduce a flexible method for high-resolution interrogation of circuit function, which combines simultaneous 3D two-photon stimulation of multiple targeted neurons, volumetric functional imaging, and quantitative behavioral tracking. This integrated approach was applied to dissect how an ensemble of premotor neurons in the larval zebrafish brain drives a basic motor program, the bending of the tail. We developed an iterative photostimulation strategy to identify minimal subsets of channelrhodopsin (ChR2)-expressing neurons that are sufficient to initiate tail movements. At the same time, the induced network activity was recorded by multiplane GCaMP6 imaging across the brain. From this dataset, we computationally identified activity patterns associated with distinct components of the elicited behavior and characterized the contributions of individual neurons. Using photoactivatable GFP (paGFP), we extended our protocol to visualize single functionally identified neurons and reconstruct their morphologies. Together, this toolkit enables linking behavior to circuit activity with unprecedented resolution. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Field trials of a novel toolkit for evaluating 'intangible' values-related dimensions of projects.

    PubMed

    Burford, Gemma; Velasco, Ismael; Janoušková, Svatava; Zahradnik, Martin; Hak, Tomas; Podger, Dimity; Piggot, Georgia; Harder, Marie K

    2013-02-01

    A novel toolkit has been developed, using an original approach to develop its components, for the purpose of evaluating 'soft' outcomes and processes that have previously been generally considered 'intangible': those which are specifically values based. This represents a step-wise, significant change in provision for the assessment of values-based achievements that are of absolutely key importance to most civil society organisations (CSOs) and values-based businesses, and fills a known gap in evaluation practice. In this paper, we demonstrate the significance and rigour of the toolkit by presenting an evaluation of it in three diverse scenarios where different CSOs use it to co-evaluate locally relevant outcomes and processes to obtain results which are both meaningful to them and potentially comparable across organisations. A key strength of the toolkit is its original use of a prior-generated, peer-elicited 'menu' of values-based indicators which provides a framework for user CSOs to localise. Principles of participatory, process-based and utilisation-focused evaluation are embedded in this toolkit and shown to be critical to its success, achieving high face-validity and wide applicability. The emerging contribution of this next-generation evaluation tool to other fields, such as environmental values, development and environmental sustainable development, shared values, business, education and organisational change is outlined. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Practical computational toolkits for dendrimers and dendrons structure design.

    PubMed

    Martinho, Nuno; Silva, Liana C; Florindo, Helena F; Brocchini, Steve; Barata, Teresa; Zloh, Mire

    2017-09-01

    Dendrimers and dendrons offer an excellent platform for developing novel drug delivery systems and medicines. The rational design and further development of these repetitively branched systems are restricted by difficulties in scalable synthesis and structural determination, which can be overcome by judicious use of molecular modelling and molecular simulations. A major difficulty to utilise in silico studies to design dendrimers lies in the laborious generation of their structures. Current modelling tools utilise automated assembly of simpler dendrimers or the inefficient manual assembly of monomer precursors to generate more complicated dendrimer structures. Herein we describe two novel graphical user interface toolkits written in Python that provide an improved degree of automation for rapid assembly of dendrimers and generation of their 2D and 3D structures. Our first toolkit uses the RDKit library, SMILES nomenclature of monomers and SMARTS reaction nomenclature to generate SMILES and mol files of dendrimers without 3D coordinates. These files are used for simple graphical representations and storing their structures in databases. The second toolkit assembles complex topology dendrimers from monomers to construct 3D dendrimer structures to be used as starting points for simulation using existing and widely available software and force fields. Both tools were validated for ease-of-use to prototype dendrimer structure and the second toolkit was especially relevant for dendrimers of high complexity and size.

  8. Practical computational toolkits for dendrimers and dendrons structure design

    NASA Astrophysics Data System (ADS)

    Martinho, Nuno; Silva, Liana C.; Florindo, Helena F.; Brocchini, Steve; Barata, Teresa; Zloh, Mire

    2017-09-01

    Dendrimers and dendrons offer an excellent platform for developing novel drug delivery systems and medicines. The rational design and further development of these repetitively branched systems are restricted by difficulties in scalable synthesis and structural determination, which can be overcome by judicious use of molecular modelling and molecular simulations. A major difficulty to utilise in silico studies to design dendrimers lies in the laborious generation of their structures. Current modelling tools utilise automated assembly of simpler dendrimers or the inefficient manual assembly of monomer precursors to generate more complicated dendrimer structures. Herein we describe two novel graphical user interface toolkits written in Python that provide an improved degree of automation for rapid assembly of dendrimers and generation of their 2D and 3D structures. Our first toolkit uses the RDKit library, SMILES nomenclature of monomers and SMARTS reaction nomenclature to generate SMILES and mol files of dendrimers without 3D coordinates. These files are used for simple graphical representations and storing their structures in databases. The second toolkit assembles complex topology dendrimers from monomers to construct 3D dendrimer structures to be used as starting points for simulation using existing and widely available software and force fields. Both tools were validated for ease-of-use to prototype dendrimer structure and the second toolkit was especially relevant for dendrimers of high complexity and size.
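    In the spirit of the first toolkit described above, and only as an illustration (neither the authors' code nor their chemistry), RDKit can assemble a product from monomer SMILES and a SMARTS-encoded reaction step:

        # Illustrative RDKit sketch: couple two monomers given as SMILES using a
        # SMARTS reaction (a generic amide coupling chosen for the example).
        from rdkit import Chem
        from rdkit.Chem import AllChem

        acid = Chem.MolFromSmiles("OC(=O)CCN")    # carboxylic-acid monomer (illustrative)
        amine = Chem.MolFromSmiles("NCCO")        # amine monomer (illustrative)

        rxn = AllChem.ReactionFromSmarts("[C:1](=O)[OH].[N:2]>>[C:1](=O)[N:2]")
        product = rxn.RunReactants((acid, amine))[0][0]
        Chem.SanitizeMol(product)
        print(Chem.MolToSmiles(product))          # SMILES of the coupled fragment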

  9. The ProteoRed MIAPE web toolkit: A User-friendly Framework to Connect and Share Proteomics Standards*

    PubMed Central

    Medina-Aunon, J. Alberto; Martínez-Bartolomé, Salvador; López-García, Miguel A.; Salazar, Emilio; Navajas, Rosana; Jones, Andrew R.; Paradela, Alberto; Albar, Juan P.

    2011-01-01

    The development of the HUPO-PSI's (Proteomics Standards Initiative) standard data formats and MIAPE (Minimum Information About a Proteomics Experiment) guidelines should improve proteomics data sharing within the scientific community. Proteomics journals have encouraged the use of these standards and guidelines to improve the quality of experimental reporting and ease the evaluation and publication of manuscripts. However, there is an evident lack of bioinformatics tools specifically designed to create and edit standard file formats and reports, or embed them within proteomics workflows. In this article, we describe a new web-based software suite (The ProteoRed MIAPE web toolkit) that performs several complementary roles related to proteomic data standards. First, it can verify that the reports fulfill the minimum information requirements of the corresponding MIAPE modules, highlighting inconsistencies or missing information. Second, the toolkit can convert several XML-based data standards directly into human readable MIAPE reports stored within the ProteoRed MIAPE repository. Finally, it can also perform the reverse operation, allowing users to export from MIAPE reports into XML files for computational processing, data sharing, or public database submission. The toolkit is thus the first application capable of automatically linking the PSI's MIAPE modules with the corresponding XML data exchange standards, enabling bidirectional conversions. This toolkit is freely available at http://www.proteored.org/MIAPE/. PMID:21983993

  10. The Liquid Argon Software Toolkit (LArSoft): Goals, Status and Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pordes, Rush; Snider, Erica

    LArSoft is a toolkit that provides a software infrastructure and algorithms for the simulation, reconstruction and analysis of events in Liquid Argon Time Projection Chambers (LArTPCs). It is used by the ArgoNeuT, LArIAT, MicroBooNE, DUNE (including 35ton prototype and ProtoDUNE) and SBND experiments. The LArSoft collaboration provides an environment for the development, use, and sharing of code across experiments. The ultimate goal is to develop fully automatic processes for reconstruction and analysis of LArTPC events. The toolkit is based on the art framework and has a well-defined architecture to interface to other packages, including the GEANT4 and GENIE simulation software and the Pandora software development kit for pattern recognition. It is designed to facilitate and support the evolution of algorithms including their transition to new computing platforms. The development of the toolkit is driven by the scientific stakeholders involved. The core infrastructure includes standard definitions of types and constants, means to input experiment geometries as well as meta- and event-data in several formats, and relevant general utilities. Examples of algorithms experiments have contributed to date are: photon propagation; particle identification; hit finding, track finding and fitting; electromagnetic shower identification and reconstruction. We report on the status of the toolkit and plans for future work.

  11. Organizational Context Matters: A Research Toolkit for Conducting Standardized Case Studies of Integrated Care Initiatives

    PubMed Central

    Grudniewicz, Agnes; Gray, Carolyn Steele; Wodchis, Walter P.; Carswell, Peter; Baker, G. Ross

    2017-01-01

    Introduction: The variable success of integrated care initiatives has led experts to recommend tailoring design and implementation to the organizational context. Yet, organizational contexts are rarely described, understood, or measured with sufficient depth and breadth in empirical studies or in practice. We thus lack knowledge of when and specifically how organizational contexts matter. To facilitate the accumulation of evidence, we developed a research toolkit for conducting case studies using standardized measures of the (inter-)organizational context for integrating care. Theory and Methods: We used a multi-method approach to develop the research toolkit: (1) development and validation of the Context and Capabilities for Integrating Care (CCIC) Framework, (2) identification, assessment, and selection of survey instruments, (3) development of document review methods, (4) development of interview guide resources, and (5) pilot testing of the document review guidelines, consolidated survey, and interview guide. Results: The toolkit provides a framework and measurement tools that examine 18 organizational and inter-organizational factors that affect the implementation and success of integrated care initiatives. Discussion and Conclusion: The toolkit can be used to characterize and compare organizational contexts across cases and enable comparison of results across studies. This information can enhance our understanding of the influence of organizational contexts, support the transfer of best practices, and help explain why some integrated care initiatives succeed and some fail. PMID:28970750

  12. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  13. ITK: enabling reproducible research and open science.

    PubMed

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.
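    As a small, purely illustrative example of the kind of pipeline the toolkit supports (the file names are placeholders and the filter choice is arbitrary, not taken from the article), ITK's Python wrapping can read, filter, and write an image in a few lines:

        # Illustrative ITK (Python) pipeline: read an image, apply a median
        # filter, and write the result. Paths are placeholders.
        import itk

        image = itk.imread("input_volume.nii.gz")
        smoothed = itk.median_image_filter(image, radius=2)
        itk.imwrite(smoothed, "smoothed_volume.nii.gz")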

  14. A Matlab toolkit for three-dimensional electrical impedance tomography: a contribution to the Electrical Impedance and Diffuse Optical Reconstruction Software project

    NASA Astrophysics Data System (ADS)

    Polydorides, Nick; Lionheart, William R. B.

    2002-12-01

    The objective of the Electrical Impedance and Diffuse Optical Reconstruction Software project is to develop freely available software that can be used to reconstruct electrical or optical material properties from boundary measurements. Nonlinear and ill-posed problems such as electrical impedance and optical tomography are typically approached using a finite element model for the forward calculations and a regularized nonlinear solver for obtaining a unique and stable inverse solution. Most of the commercially available finite element programs are unsuitable for solving these problems because of their conventional, inefficient way of calculating the Jacobian and their lack of accurate electrode modelling. A complete package for the two-dimensional EIT problem was officially released by Vauhkonen et al. in the second half of 2000. However, most industrial and medical electrical imaging problems are fundamentally three-dimensional. To assist the development we have developed and released a free toolkit of Matlab routines which can be employed to solve the forward and inverse EIT problems in three dimensions based on the complete electrode model, along with some basic visualization utilities, in the hope that it will stimulate further development. We also include a derivation of the formula for the Jacobian (or sensitivity) matrix based on the complete electrode model.
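    For reference, the sensitivity (Jacobian) entries that such a toolkit computes are usually written in the standard adjoint-field form below; the notation is generic rather than that of the paper, and the drive and measurement currents are assumed to be of unit amplitude.

        J_{(d,m),k} = \frac{\partial V_{d,m}}{\partial \sigma_k}
                    = - \int_{\Omega_k} \nabla u_d \cdot \nabla u_m \, d\Omega

    Here u_d is the potential field produced by the d-th current drive, u_m is the field obtained by driving the m-th measurement electrode pair, and Omega_k is the finite element whose conductivity sigma_k is perturbed.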

  15. IT infrastructure in the era of imaging 3.0.

    PubMed

    McGinty, Geraldine B; Allen, Bibb; Geis, J Raymond; Wald, Christoph

    2014-12-01

    Imaging 3.0 is a blueprint for the future of radiology modeled after the description of Web 3.0 as "more connected, more open, and more intelligent." Imaging 3.0 involves radiologists' using their expertise to manage all aspects of imaging care to improve patient safety and outcomes and to deliver high-value care. IT tools are critical elements and drivers of success as radiologists embrace the concepts of Imaging 3.0. Organized radiology, specifically the ACR, is the natural convener and resource for the development of this Imaging 3.0 toolkit. The ACR's new Imaging 3.0 Informatics Committee is actively working to develop the informatics tools radiologists need to improve efficiency, deliver more value, and provide quantitative ways to demonstrate their value in new health care delivery and payment systems. This article takes each step of the process of delivering high-value Imaging 3.0 care and outlines the tools available as well as additional resources available to support practicing radiologists. From the moment when imaging is considered through the delivery of a meaningful and actionable report that is communicated to the referring clinician and, when appropriate, to the patient, Imaging 3.0 IT tools will enable radiologists to position themselves as vital constituents in cost-effective, high-value health care. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  16. Understanding Disabilities in American Indian and Alaska Native Communities. Toolkit Guide.

    ERIC Educational Resources Information Center

    National Council on Disability, Washington, DC.

    This "toolkit" document is intended to provide a culturally appropriate set of resources to address the unique political and legal concerns of people with disabilities in American Indian/Alaska Native (AI/AN) communities. It provides information on education, health, vocational rehabilitation (VR), independent living, model approaches, and…

  17. Object Toolkit Version 4.2 Users Manual

    DTIC Science & Technology

    2014-10-31

    Excerpts from the manual's list of figures (e.g., "Geocentric Orbit Dialog Box", Figures 39 and 133). When building an object for MEM, Object Toolkit has an Orbit menu that allows the user to specify and edit a heliocentric or geocentric orbit.

  18. Educating Globally Competent Citizens: A Toolkit. Second Edition

    ERIC Educational Resources Information Center

    Elliott-Gower, Steven; Falk, Dennis R.; Shapiro, Martin

    2012-01-01

    Educating Globally Competent Citizens, a product of AASCU's American Democracy Project and its Global Engagement Initiative, introduces readers to a set of global challenges facing society based on the Center for Strategic and International Studies' 7 Revolutions. The toolkit is designed to aid faculty in incorporating global challenges into new…

  19. 75 FR 53969 - Office of Community Services: Notice To Award an Expansion Supplement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-02

    ... related to job creation and new initiatives that target careers in energy efficiency and other...; and (3) collects, develops, and disseminates resources related to job creation and careers related to... through professional consultations and peer assistance sessions; and online toolkit(s). The T/TA CAP will...

  20. The Student Writing Toolkit: Enhancing Undergraduate Teaching of Scientific Writing in the Biological Sciences

    ERIC Educational Resources Information Center

    Dirrigl, Frank J., Jr.; Noe, Mark

    2014-01-01

    Teaching scientific writing in biology classes is challenging for both students and instructors. This article offers and reviews several useful "toolkit" items that improve student writing. These include sentence and paper-length templates, funnelling and compartmentalisation, and preparing compendiums of corrections. In addition,…

  1. The Archivists' Toolkit: Another Step toward Streamlined Archival Processing

    ERIC Educational Resources Information Center

    Westbrook, Bradley D.; Mandell, Lee; Shepherd, Kelcy; Stevens, Brian; Varghese, Jason

    2006-01-01

    The Archivists' Toolkit is a software application currently in development and designed to support the creation and management of archival information. This article summarizes the development of the application, including some of the problems the application is designed to resolve. Primary emphasis is placed on describing the application's…

  2. Healthy People 2010: Oral Health Toolkit

    ERIC Educational Resources Information Center

    Isman, Beverly

    2007-01-01

    The purpose of this Toolkit is to provide guidance, technical tools, and resources to help states, territories, tribes and communities develop and implement successful oral health components of Healthy People 2010 plans as well as other oral health plans. These plans are useful for: (1) promoting, implementing and tracking oral health objectives;…

  3. Using Toolkits to Achieve STEM Enterprise Learning Outcomes

    ERIC Educational Resources Information Center

    Watts, Carys A.; Wray, Katie

    2012-01-01

    Purpose: The purpose of this paper is to evaluate the effectiveness of using several commercial tools in science, technology, engineering and maths (STEM) subjects for enterprise education at Newcastle University, UK. Design/methodology/approach: The paper provides an overview of existing toolkit use in higher education, before reviewing where and…

  4. BAT - The Bayesian analysis toolkit

    NASA Astrophysics Data System (ADS)

    Caldwell, Allen; Kollár, Daniel; Kröninger, Kevin

    2009-11-01

    We describe the development of a new toolkit for data analysis. The analysis package is based on Bayes' Theorem, and is realized with the use of Markov Chain Monte Carlo. This gives access to the full posterior probability distribution. Parameter estimation, limit setting and uncertainty propagation are implemented in a straightforward manner.
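    In generic notation (symbols mine, not necessarily the paper's), the quantity the package samples with Markov Chain Monte Carlo is the posterior given by Bayes' theorem:

        p(\lambda \mid D) = \frac{p(D \mid \lambda)\, p_0(\lambda)}{\int p(D \mid \lambda')\, p_0(\lambda')\, d\lambda'}

    Here lambda denotes the model parameters, D the data, and p_0 the prior; the MCMC samples of p(lambda | D) are then used directly for parameter estimation, limit setting, and uncertainty propagation.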

  5. A Toolkit for Teacher Engagement

    ERIC Educational Resources Information Center

    Grantmakers for Education, 2014

    2014-01-01

    Teachers are critical to the success of education grantmaking strategies, yet in talking with them we discovered that the world of philanthropy is often a mystery. GFE's Toolkit for Teacher Engagement aims to assist funders in authentically and effectively involving teachers in the education reform and innovation process. Built directly from the…

  6. The Mentoring Toolkit 2.0: Resources for Developing Programs for Incarcerated Youth. Guide

    ERIC Educational Resources Information Center

    Zaugg, Nathan; Jarjoura, Roger

    2017-01-01

    "The Mentoring Toolkit 2.0: Resources for Developing Programs for Incarcerated Youth" provides information, program descriptions, and links to important resources that can assist juvenile correctional facilities and other organizations to design effective mentoring programs for neglected and delinquent youth, particularly those who are…

  7. Making Schools the Model for Healthier Environments Toolkit: What It Is

    ERIC Educational Resources Information Center

    Robert Wood Johnson Foundation, 2012

    2012-01-01

    Healthy students perform better. Poor nutrition and inadequate physical activity can affect not only academic achievement, but also other factors such as absenteeism, classroom behavior, ability to concentrate, self-esteem, cognitive performance, and test scores. This toolkit provides information to help make schools the model for healthier…

  8. Policy to Performance Toolkit: Transitioning Adults to Opportunity

    ERIC Educational Resources Information Center

    Alamprese, Judith A.; Limardo, Chrys

    2012-01-01

    The "Policy to Performance Toolkit" is designed to provide state adult education staff and key stakeholders with guidance and tools to use in developing, implementing, and monitoring state policies and their associated practices that support an effective state adult basic education (ABE) to postsecondary education and training transition…

  9. Excellence in Teaching End-of-Life Care. A New Multimedia Toolkit for Nurse Educators.

    ERIC Educational Resources Information Center

    Wilkie, Diana J.; Judge, Kay M.; Wells, Marjorie J.; Berkley, Ila Meredith

    2001-01-01

    Describes a multimedia toolkit for teaching palliative care in nursing, which contains modules on end-of-life topics: comfort, connections, ethics, grief, impact, and well-being. Other contents include myths, definitions, pre- and postassessments, teaching materials, case studies, learning activities, and resources. (SK)

  10. SmaggIce 2.0: Additional Capabilities for Interactive Grid Generation of Iced Airfoils

    NASA Technical Reports Server (NTRS)

    Kreeger, Richard E.; Baez, Marivell; Braun, Donald C.; Schilling, Herbert W.; Vickerman, Mary B.

    2008-01-01

    The Surface Modeling and Grid Generation for Iced Airfoils (SmaggIce) software toolkit has been extended to allow interactive grid generation for multi-element iced airfoils. The essential phases of an icing effects study include geometry preparation, block creation and grid generation. SmaggIce Version 2.0 now includes these main capabilities for both single and multi-element airfoils, plus an improved flow solver interface and a variety of additional tools to enhance the efficiency and accuracy of icing effects studies. An overview of these features is given, especially the new multi-element blocking strategy using the multiple wakes method. Examples are given which illustrate the capabilities of SmaggIce for conducting an icing effects study for both single and multi-element airfoils.

  11. A Java-Enabled Interactive Graphical Gas Turbine Propulsion System Simulator

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Afjeh, Abdollah A.

    1997-01-01

    This paper describes a gas turbine simulation system which utilizes the newly developed Java language environment software system. The system provides an interactive graphical environment which allows the quick and efficient construction and analysis of arbitrary gas turbine propulsion systems. The simulation system couples a graphical user interface, developed using the Java Abstract Window Toolkit, and a transient, space-averaged, aero-thermodynamic gas turbine analysis method, both entirely coded in the Java language. The combined package provides analytical, graphical and data management tools which allow the user to construct and control engine simulations by manipulating graphical objects on the computer display screen. Distributed simulations, including parallel processing and distributed database access across the Internet and World-Wide Web (WWW), are made possible through services provided by the Java environment.

  12. Multi-Fidelity Uncertainty Propagation for Cardiovascular Modeling

    NASA Astrophysics Data System (ADS)

    Fleeter, Casey; Geraci, Gianluca; Schiavazzi, Daniele; Kahn, Andrew; Marsden, Alison

    2017-11-01

    Hemodynamic models are successfully employed in the diagnosis and treatment of cardiovascular disease with increasing frequency. However, their widespread adoption is hindered by our inability to account for uncertainty stemming from multiple sources, including boundary conditions, vessel material properties, and model geometry. In this study, we propose a stochastic framework which leverages three cardiovascular model fidelities: 3D, 1D and 0D models. 3D models are generated from patient-specific medical imaging (CT and MRI) of aortic and coronary anatomies using the SimVascular open-source platform, with fluid structure interaction simulations and Windkessel boundary conditions. 1D models consist of a simplified geometry automatically extracted from the 3D model, while 0D models are obtained from equivalent circuit representations of blood flow in deformable vessels. Multi-level and multi-fidelity estimators from Sandia's open-source DAKOTA toolkit are leveraged to reduce the variance in our estimated output quantities of interest while maintaining a reasonable computational cost. The performance of these estimators in terms of computational cost reductions is investigated for a variety of output quantities of interest, including global and local hemodynamic indicators. Sandia National Labs is a multimission laboratory managed and operated by NTESS, LLC, for the U.S. DOE under contract DE-NA0003525. Funding for this project provided by NIH-NIBIB R01 EB018302.
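    As a reminder of the general idea (written in generic notation, not DAKOTA's), a two-fidelity control-variate estimator of an output quantity Q combines N expensive high-fidelity (e.g., 3D) evaluations with the same N plus M >> N additional low-fidelity (1D or 0D) evaluations:

        \hat{Q}^{\mathrm{MF}} = \hat{Q}^{\mathrm{HF}}_{N}
            + \alpha \left( \hat{Q}^{\mathrm{LF}}_{M} - \hat{Q}^{\mathrm{LF}}_{N} \right),
        \qquad
        \alpha^{\star} = \rho \, \frac{\sigma_{\mathrm{HF}}}{\sigma_{\mathrm{LF}}}

    Here rho is the correlation between the high- and low-fidelity outputs and sigma denotes their standard deviations; the stronger the correlation, the larger the variance reduction achievable for a fixed computational budget.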

  13. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  14. Recent Developments and Applications of the MMPBSA Method

    PubMed Central

    Wang, Changhao; Greene, D'Artagnan; Xiao, Li; Qi, Ruxi; Luo, Ray

    2018-01-01

    The Molecular Mechanics Poisson-Boltzmann Surface Area (MMPBSA) approach has been widely applied as an efficient and reliable free energy simulation method to model molecular recognition, such as for protein-ligand binding interactions. In this review, we focus on recent developments and applications of the MMPBSA method. The methodology review covers solvation terms, the entropy term, extensions to membrane proteins and high-speed screening, and new automation toolkits. Recent applications in various important biomedical and chemical fields are also reviewed. We conclude with a few future directions aimed at making MMPBSA a more robust and efficient method. PMID:29367919
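    For orientation, the usual MMPBSA decomposition of a binding free energy (stated here in generic notation rather than as a formula specific to this review) is:

        \Delta G_{\mathrm{bind}} \approx \langle \Delta E_{\mathrm{MM}} \rangle
            + \langle \Delta G_{\mathrm{PB}} \rangle + \langle \Delta G_{\mathrm{SA}} \rangle - T\,\Delta S,
        \qquad
        \Delta E_{\mathrm{MM}} = \Delta E_{\mathrm{int}} + \Delta E_{\mathrm{ele}} + \Delta E_{\mathrm{vdW}},
        \qquad
        \Delta G_{\mathrm{SA}} = \gamma\,\mathrm{SASA} + b

    The angle brackets denote averages over molecular dynamics snapshots, Delta G_PB is the polar solvation term from the Poisson-Boltzmann equation, and the nonpolar term is estimated from the solvent-accessible surface area (SASA).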

  15. GEANT4 Simulation of Hadronic Interactions at 8-GeV/C to 10-GeV/C: Response to the HARP-CDP Group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uzhinsky, V.; /Dubna, JINR /CERN; Apostolakis, J.

    2011-11-21

    The results of the HARP-CDP group on the comparison of GEANT4 Monte Carlo predictions versus experimental data are discussed. It is shown that the problems observed by the group are caused by an incorrect implementation of old features at the programming level, and by a lack of the nucleon Fermi motion in the simulation of quasielastic scattering. These drawbacks are not due to the physical models used. They do not manifest themselves in the most important applications of the GEANT4 toolkit.

  16. An assessment toolkit to increase the resilience of NWE catchments to periods of drought

    NASA Astrophysics Data System (ADS)

    La Jeunesse, Isabelle; Larrue, Corinne

    2013-04-01

    In many North West European (NWE) areas the balance between water demand and availability is under pressure, leading to water scarcity. In addition, NWE areas are adversely affected by changes in the hydrological cycle and precipitation patterns, and hence by periods of drought. Over the past thirty years droughts have increased dramatically, and NWE is not immune: the drought of summer 2003 caused 10 billion euro of damage to agriculture, and in April 2012 the South West of the UK moved to environmental drought status. Water scarcity and drought problems in the EU are increasing: 11% of the European population and 17% of its territory have been affected to date. Climate change is likely to exacerbate these adverse impacts, and 50% of the NWE area is projected to be affected by 2050. Although the problems caused by drought in NWE are currently not overwhelmingly visible, early action should be taken to reduce costs and prevent damage. Adapting to drought in NWE is the transnational challenge of the DROP (governance in DROught adaPtation) project. The Commission's recent "Blueprint on European Waters" states that existing policies are good but that the problem lies in implementation, so the future challenge for NWE regions is to improve implementation, meaning both governance and measures. The problem of drought is relatively new for these regions in comparison with flooding, and it demands a different approach involving the interaction of different stakeholders: NWE countries have proven strategies for flood prevention, but no such strategies exist for drought adaptation. To address this, DROP combines science, practitioners and decision makers, realizing the science-policy window. The aim of the DROP project is thus to increase the resilience of NWE catchments to periods of drought. To tackle these issues DROP will develop a governance toolkit to be used by NWE regional water authorities and will test a few pilot measures on drought adaptation. The objectives of the project are (1) to promote the use of a European governance assessment toolkit to define regional drought adaptation; (2) to improve the effectiveness of drought adaptation measures for NWE areas; and (3) to enhance the preparedness of regional stakeholders in NWE in drought adaptation. In this presentation the authors present the assessment toolkit, which is based on a combination of five regime dimensions and four regime qualities that have been operationalized into a questionnaire. The questionnaire helps to make a regime assessment of both the static situation and the dynamics. Acknowledgments: This research is funded by the INTERREG IVB programme for North West Europe, and DROP is led by the Regge en Dinkel Water Board in the Netherlands. The toolkit is developed in collaboration with the University of Twente, in particular with Stefan Kuks, Hans Bressers, Cheryl de Boer, Joanne Vinke and Gül Özerol. We especially acknowledge the regional partners of DROP.

  17. Using an Assistive Technology Toolkit to Promote Inclusion

    ERIC Educational Resources Information Center

    Judge, Sharon; Floyd, Kim; Jeffs, Tara

    2008-01-01

    Although the use of assistive technology for young children is increasing, the lack of awareness and the lack of training continue to act as major barriers to providers using assistive technology. This article describes an assistive technology toolkit designed for use with young children with disabilities that can be easily assembled and…

  18. Roles of the Volunteer in Development: Toolkits for Building Capacity.

    ERIC Educational Resources Information Center

    Slater, Marsha; Allsman, Ava; Savage, Ron; Havens, Lani; Blohm, Judee; Raftery, Kate

    This document, which was developed to assist Peace Corps volunteers and those responsible for training them, presents an introductory booklet and six toolkits for use in the training provided to and by volunteers involved in community development. All the materials emphasize long-term participatory approaches to sustainable development and a…

  19. Alexander Meets Michotte: A Simulation Tool Based on Pattern Programming and Phenomenology

    ERIC Educational Resources Information Center

    Basawapatna, Ashok

    2016-01-01

    Simulation and modeling activities, a key point of computational thinking, are currently not being integrated into the science classroom. This paper describes a new visual programming tool entitled the Simulation Creation Toolkit. The Simulation Creation Toolkit is a high level pattern-based phenomenological approach to bringing rapid simulation…

  20. Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities. Executive Summary

    ERIC Educational Resources Information Center

    Kingsley, Chris

    2012-01-01

    This executive summary describes highlights from the report, "Building Management Information Systems to Coordinate Citywide Afterschool Programs: A Toolkit for Cities." City-led efforts to build coordinated systems of afterschool programming are an important strategy for improving the health, safety and academic preparedness of children…
