Sample records for application research toolbox

  1. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    PubMed

    Lawhern, Vernon; Hairston, W David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.
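
    The record above does not list the DETECT function names, so the sketch below only illustrates the underlying idea in generic terms: slice a multi-channel recording into fixed-length windows, compute simple features, and train a classifier on labeled windows. The data, window length and scikit-learn classifier are placeholder assumptions, not the DETECT API.

```python
# Generic windowed event classification (not the DETECT API): slice a
# multi-channel recording into fixed-length windows, compute per-channel
# mean/std features, and train a classifier on (hypothetical) window labels.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def window_features(x, win):
    """x: (channels, samples) -> (n_windows, 2*channels) feature matrix."""
    ch, n = x.shape
    n_win = n // win
    w = x[:, :n_win * win].reshape(ch, n_win, win)
    return np.hstack([w.mean(axis=2).T, w.std(axis=2).T])

rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 10_000))            # placeholder 8-channel recording
labels = rng.integers(0, 2, size=10_000 // 250)   # placeholder per-window labels

X = window_features(eeg, win=250)
clf = LinearDiscriminantAnalysis().fit(X, labels)
print(clf.predict(X[:5]))                         # predicted event class per window
```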

  2. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    PubMed Central

    Lawhern, Vernon; Hairston, W. David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration. PMID:23638169

  3. WEC Design Response Toolbox v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey

    2016-03-30

    The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.
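
    As a generic illustration of the short-term extreme response analysis mentioned above (not the WDRT API, whose modules are not listed in this record), one can fit a generalized extreme value distribution to response peaks; the data and return level below are made up.

```python
# Generic short-term extreme response sketch (not the WDRT API): fit a
# generalized extreme value (GEV) distribution to simulated response peaks
# and report a high quantile as a design response estimate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
peaks = rng.gumbel(loc=2.0, scale=0.5, size=500)   # placeholder WEC response peaks (m)

shape, loc, scale = stats.genextreme.fit(peaks)
design_response = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)
print(f"99th-percentile extreme response: {design_response:.2f} m")
```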

  4. Image processing and pattern recognition with CVIPtools MATLAB toolbox: automatic creation of masks for veterinary thermographic images

    NASA Astrophysics Data System (ADS)

    Mishra, Deependra K.; Umbaugh, Scott E.; Lama, Norsang; Dahal, Rohini; Marino, Dominic J.; Sackman, Joseph

    2016-09-01

    CVIPtools is a software package for the exploration of computer vision and image processing developed in the Computer Vision and Image Processing Laboratory at Southern Illinois University Edwardsville. CVIPtools is available in three variants - a) CVIPtools Graphical User Interface, b) CVIPtools C library and c) CVIPtools MATLAB toolbox, which makes it accessible to a variety of different users. It offers students, faculty, researchers and any user a free and easy way to explore computer vision and image processing techniques. Many functions have been implemented and are updated on a regular basis, and the library has reached a level of sophistication that makes it suitable for both educational and research purposes. In this paper, a detailed list of the functions available in the CVIPtools MATLAB toolbox is presented, along with how these functions can be used in image analysis and computer vision applications. The CVIPtools MATLAB toolbox allows the user to gain practical experience to better understand underlying theoretical problems in image processing and pattern recognition. As an example application, the algorithm for the automatic creation of masks for veterinary thermographic images is presented.
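
    A minimal sketch of the kind of automatic mask creation described above, written in plain NumPy/SciPy rather than with CVIPtools (whose MATLAB functions are not listed here); the thermographic image is synthetic and Otsu thresholding is assumed as the segmentation step.

```python
# Generic automatic mask creation sketch (not the CVIPtools MATLAB toolbox):
# separate a warm subject from a cooler background in a thermographic-style
# image using Otsu's threshold, then keep the largest connected region.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
img = rng.normal(25.0, 1.0, (128, 128))        # synthetic background temperatures (deg C)
img[40:90, 50:100] += 8.0                      # synthetic warm subject region

# Otsu's threshold via exhaustive search over histogram bins.
hist, edges = np.histogram(img, bins=256)
centers = 0.5 * (edges[:-1] + edges[1:])
w = np.cumsum(hist)
total = w[-1]
m = np.cumsum(hist * centers)
between = (m[-1] * w - m * total) ** 2 / (w * (total - w) * total ** 2 + 1e-12)
mask = img > centers[np.argmax(between)]

labels, n = ndimage.label(mask)                # keep only the largest component
sizes = ndimage.sum(mask, labels, range(1, n + 1))
mask = labels == (np.argmax(sizes) + 1)
print(mask.sum(), "pixels in the mask")
```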

  5. Software Toolbox for Low-Frequency Conductivity and Current Density Imaging Using MRI.

    PubMed

    Sajib, Saurav Z K; Katoch, Nitish; Kim, Hyung Joong; Kwon, Oh In; Woo, Eung Je

    2017-11-01

    Low-frequency conductivity and current density imaging using MRI includes magnetic resonance electrical impedance tomography (MREIT), diffusion tensor MREIT (DT-MREIT), conductivity tensor imaging (CTI), and magnetic resonance current density imaging (MRCDI). MRCDI and MREIT provide current density and isotropic conductivity images, respectively, using current-injection phase MRI techniques. DT-MREIT produces anisotropic conductivity tensor images by incorporating diffusion weighted MRI into MREIT. These current-injection techniques are finding clinical applications in diagnostic imaging and also in transcranial direct current stimulation (tDCS), deep brain stimulation (DBS), and electroporation where treatment currents can function as imaging currents. To avoid adverse effects of nerve and muscle stimulations due to injected currents, conductivity tensor imaging (CTI) utilizes B1 mapping and multi-b diffusion weighted MRI to produce low-frequency anisotropic conductivity tensor images without injecting current. This paper describes numerical implementations of several key mathematical functions for conductivity and current density image reconstructions in MRCDI, MREIT, DT-MREIT, and CTI. To facilitate experimental studies of clinical applications, we developed a software toolbox for these low-frequency conductivity and current density imaging methods. This MR-based conductivity imaging (MRCI) toolbox includes 11 toolbox functions which can be used in the MATLAB environment. The MRCI toolbox is available at http://iirc.khu.ac.kr/software.html . Its functions were tested by using several experimental datasets, which are provided together with the toolbox. Users of the toolbox can focus on experimental designs and interpretations of reconstructed images instead of developing their own image reconstruction software. We expect more toolbox functions to be added from future research outcomes.
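
    The physical relation behind MR current density imaging is Ampère's law, J = (1/μ0) ∇ × B. A minimal numerical sketch of that relation on a synthetic field follows; in practice only the Bz component is measurable and the MRCI toolbox's reconstructions are considerably more involved, so this is an illustration, not the toolbox code.

```python
# Illustration of the Ampere's-law relation J = curl(B) / mu0 that underlies
# MR current density imaging; the flux density below is synthetic (a uniform
# current density J0 along z), and this is not the MRCI toolbox implementation.
import numpy as np

mu0 = 4e-7 * np.pi                  # vacuum permeability (T*m/A)
h = 1e-3                            # 1 mm isotropic voxel spacing
x = np.arange(-0.05, 0.05, h)
X, Y, _ = np.meshgrid(x, x, x[:20], indexing="ij")

J0 = 10.0                           # assumed current density, A/m^2
Bx = -mu0 * J0 * Y / 2              # analytic B field for a uniform Jz = J0
By = mu0 * J0 * X / 2
Bz = np.zeros_like(X)

# Finite-difference curl; axes 0, 1, 2 correspond to x, y, z with spacing h.
dBy_dx = np.gradient(By, h, axis=0)
dBx_dy = np.gradient(Bx, h, axis=1)

Jz = (dBy_dx - dBx_dy) / mu0        # recovered current density
print("mean recovered Jz (A/m^2):", Jz.mean())   # ~10.0
```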

  6. Complete scanpaths analysis toolbox.

    PubMed

    Augustyniak, Piotr; Mikrut, Zbigniew

    2006-01-01

    This paper presents a complete open software environment for the control, data processing and assessment of visual experiments. Visual experiments are widely used in research on human perception physiology, and the results are applicable to various visual information-based man-machine interfacing, human-emulated automatic visual systems and scanpath-based learning of perceptual habits. The toolbox is designed for the Matlab platform and supports an infra-red reflection-based eyetracker in calibration and scanpath analysis modes. Toolbox procedures are organized in three layers: the lower one communicating with the eyetracker output file, the middle one detecting scanpath events on a physiological basis, and the upper one consisting of experiment schedule scripts, statistics and summaries. Several examples of visual experiments carried out with the use of the presented toolbox complete the paper.

  7. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG

    PubMed Central

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

    Reference electrode standardization technique (REST) has been increasingly acknowledged and applied as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERPs) community around the world in recent years. However, an easy-to-use toolbox for re-referencing scalp EEG data to zero reference has been lacking. Here, we have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version intended to be more convenient and efficient for experienced users. Both are designed to be easy to use for novice researchers while offering flexibility for experienced researchers. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, and detailed information including publications, comments and documents on REST can also be found on this website. An example of usage is given with comparative results of REST and average reference. We hope these user-friendly REST toolboxes could make the relatively novel technique of REST easier to study, especially for applications in various EEG studies. PMID:29163006
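
    A schematic sketch of the core REST operation as commonly described follows: average-referenced data are mapped toward an approximately zero (infinity) reference through a head-model lead-field matrix. The matrices below are random stand-ins rather than a real head model or real EEG, and this is not the toolbox code.

```python
# Schematic sketch of the REST idea (not the toolbox code): given a lead-field
# matrix G (sources -> electrodes, infinity reference) from a head model, map
# average-referenced data toward an approximately zero (infinity) reference.
# The matrices below are random stand-ins, not a real head model or real EEG.
import numpy as np

rng = np.random.default_rng(3)
n_elec, n_src, n_samp = 32, 300, 1000
G = rng.standard_normal((n_elec, n_src))          # stand-in lead field
V = rng.standard_normal((n_elec, n_samp))         # stand-in recordings

avg = np.eye(n_elec) - np.ones((n_elec, n_elec)) / n_elec
V_ar, G_ar = avg @ V, avg @ G                     # average-referenced data and lead field

# REST: approximate infinity-referenced potentials via the lead-field pseudoinverse.
V_rest = G @ np.linalg.pinv(G_ar) @ V_ar
print(V_rest.shape)                               # (32, 1000)
```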

  8. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG.

    PubMed

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

    Reference electrode standardization technique (REST) has been increasingly acknowledged and applied as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERPs) community around the world in recent years. However, an easy-to-use toolbox for re-referencing scalp EEG data to zero reference has been lacking. Here, we have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version intended to be more convenient and efficient for experienced users. Both are designed to be easy to use for novice researchers while offering flexibility for experienced researchers. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, and detailed information including publications, comments and documents on REST can also be found on this website. An example of usage is given with comparative results of REST and average reference. We hope these user-friendly REST toolboxes could make the relatively novel technique of REST easier to study, especially for applications in various EEG studies.

  9. Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system

    NASA Astrophysics Data System (ADS)

    Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong

    2011-06-01

    Ladar system simulation uses computer simulation of ladar models to predict the performance of a ladar system. This paper reviews domestic and international developments in laser imaging radar simulation and studies of computer simulation of ladar systems for different application requirements. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. Domestic research on imaging ladar system simulation has generally been small in scale and lacking a unified design, mostly achieving simple functional simulation based on ranging equations for ladar systems. A laser imaging radar simulation with an open and modularized structure is proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings and system controller. A unified Matlab toolbox and standard control modules have been built, with regulated inputs and outputs of the functions and defined communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system observing a space shuttle was carried out with the toolbox. The simulation result shows that the models and parameter settings of the Matlab toolbox are able to simulate the actual detection process precisely. The unified control module and pre-defined parameter settings simplify the simulation of imaging ladar detection. Its open structure enables the toolbox to be modified for specialized requirements, and the modularization gives the simulations flexibility.
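
    Since the record mentions ranging equations, here is one common form of the ladar range equation for an extended Lambertian target that fills the beam, with illustrative parameter values; it is a generic textbook relation, not a module of the toolbox described above.

```python
# One common form of the ladar range equation for an extended Lambertian target
# that fills the beam: Pr = Pt * rho * T_atm**2 * eta_sys * A_r / (pi * R**2).
# Parameter values below are illustrative placeholders, not from the paper.
import numpy as np

def received_power(Pt, rho, T_atm, eta_sys, aperture_diam, R):
    A_r = np.pi * (aperture_diam / 2) ** 2        # receiver aperture area (m^2)
    return Pt * rho * T_atm ** 2 * eta_sys * A_r / (np.pi * R ** 2)

R = np.linspace(100.0, 5000.0, 5)                 # target ranges (m)
Pr = received_power(Pt=1e3, rho=0.3, T_atm=0.9, eta_sys=0.5, aperture_diam=0.1, R=R)
for r, p in zip(R, Pr):
    print(f"R = {r:7.0f} m  ->  Pr = {p:.3e} W")
```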

  10. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Benveniste, Jérôme; Knudsen, Per

    2016-07-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models is made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of the GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, it adds the capability to compute the Simple Bouguer Anomaly (Solid Earth). During this fall, a new GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at implementing the remaining functionalities and facilitating a wider span of research in the fields of Geodesy, Oceanography and Solid Earth studies. Accordingly, GUT version 3 provides: an attractive and easy-to-use Graphical User Interface (GUI) for the toolbox; further software functionalities, such as facilities for the use of gradients, anisotropic diffusive filtering and the computation of Bouguer and isostatic gravity anomalies; and an associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.
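
    As a back-of-the-envelope illustration of the Simple Bouguer Anomaly computation mentioned above (GUT's own implementation is more complete), the Bouguer plate term 2πGρh can be subtracted from a free-air anomaly; the values below are made up and ρ = 2670 kg/m³ is the conventional reduction density.

```python
# Back-of-the-envelope simple Bouguer anomaly (the GUT toolbox's own computation
# is more complete): SBA = free-air anomaly - 2*pi*G*rho*h (Bouguer plate term).
import numpy as np

G = 6.674e-11                       # gravitational constant (m^3 kg^-1 s^-2)
rho = 2670.0                        # conventional reduction density (kg/m^3)
h = np.array([0.0, 500.0, 1500.0])  # station heights (m), illustrative
faa = np.array([12.0, 35.0, 80.0])  # hypothetical free-air anomalies (mGal)

bouguer_plate = 2 * np.pi * G * rho * h * 1e5    # convert m/s^2 to mGal
sba = faa - bouguer_plate
print(np.round(sba, 1))             # simple Bouguer anomaly (mGal)
```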

  11. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Knudsen, Per; Benveniste, Jerome

    2017-04-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models is made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of the GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, it adds the capability to compute the Simple Bouguer Anomaly (Solid Earth). During this fall, a new GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at implementing the remaining functionalities and facilitating a wider span of research in the fields of Geodesy, Oceanography and Solid Earth studies.
    Accordingly, GUT version 3 provides:
     - an attractive and easy-to-use Graphical User Interface (GUI) for the toolbox,
     - further software functionalities, such as facilities for the use of gradients, anisotropic diffusive filtering and the computation of Bouguer and isostatic gravity anomalies, and
     - an associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.

  12. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Knudsen, Per; Benveniste, Jerome; Team Gut

    2016-04-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models is made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of the GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, it adds the capability to compute the Simple Bouguer Anomaly (Solid Earth). During this fall, a new GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at implementing the remaining functionalities and facilitating a wider span of research in the fields of Geodesy, Oceanography and Solid Earth studies.
    Accordingly, GUT version 3 provides:
     - an attractive and easy-to-use Graphical User Interface (GUI) for the toolbox,
     - further software functionalities, such as facilities for the use of gradients, anisotropic diffusive filtering and the computation of Bouguer and isostatic gravity anomalies, and
     - an associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.

  13. C++ tensor toolbox user manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd D.; Kolda, Tamara Gibson

    2012-04-01

    The C++ Tensor Toolbox is a software package for computing tensor decompositions. It is based on the Matlab Tensor Toolbox, and is particularly optimized for sparse data sets. This user manual briefly overviews tensor decomposition mathematics, software capabilities, and installation of the package. Tensors (also known as multidimensional arrays or N-way arrays) are used in a variety of applications ranging from chemometrics to network analysis. The Tensor Toolbox provides classes for manipulating dense, sparse, and structured tensors in C++. The Toolbox compiles into libraries and is intended for use with custom applications written by users.
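
    A basic building block of the tensor decompositions such toolboxes compute is the mode-n unfolding (matricization); a minimal NumPy version is sketched below, rather than the C++ Tensor Toolbox classes themselves.

```python
# Generic mode-n unfolding (matricization), a standard building block of tensor
# decomposition algorithms; this is not the C++ Tensor Toolbox API.
import numpy as np

def unfold(X, mode):
    """Matricize tensor X along `mode`: result has shape (X.shape[mode], -1)."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

X = np.arange(24).reshape(2, 3, 4)   # small dense 3-way tensor
print(unfold(X, 0).shape, unfold(X, 1).shape, unfold(X, 2).shape)  # (2,12) (3,8) (4,6)
```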

  14. TRIQS: A toolbox for research on interacting quantum systems

    NASA Astrophysics Data System (ADS)

    Parcollet, Olivier; Ferrero, Michel; Ayral, Thomas; Hafermann, Hartmut; Krivenko, Igor; Messio, Laura; Seth, Priyanka

    2015-11-01

    We present the TRIQS library, a Toolbox for Research on Interacting Quantum Systems. It is an open-source, computational physics library providing a framework for the quick development of applications in the field of many-body quantum physics, and in particular, strongly-correlated electronic systems. It supplies components to develop codes in a modern, concise and efficient way: e.g. Green's function containers, a generic Monte Carlo class, and simple interfaces to HDF5. TRIQS is a C++/Python library that can be used from either language. It is distributed under the GNU General Public License (GPLv3). State-of-the-art applications based on the library, such as modern quantum many-body solvers and interfaces between density-functional-theory codes and dynamical mean-field theory (DMFT) codes are distributed along with it.
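
    As a generic illustration of the Green's function objects mentioned above, the sketch below evaluates the fermionic Matsubara Green's function of a single non-interacting level in plain NumPy; it does not use TRIQS's own containers, and the parameters are arbitrary.

```python
# Generic Matsubara Green's function of a single non-interacting level,
# G(iw_n) = 1 / (iw_n - eps), in plain NumPy (not TRIQS's Gf containers).
import numpy as np

beta, eps = 10.0, 0.5                       # inverse temperature, level energy
n = np.arange(4096)
iwn = 1j * (2 * n + 1) * np.pi / beta       # positive fermionic Matsubara frequencies
G = 1.0 / (iwn - eps)

# Occupation from the (truncated) Matsubara sum plus the 1/(i*w_n) tail term of 1/2.
occupation = 2.0 / beta * G.real.sum() + 0.5
print(f"n ~ {occupation:.4f}  (Fermi function: {1.0 / (np.exp(beta * eps) + 1):.4f})")
```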

  15. Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) Users' Workshop Presentations

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S. (Compiler)

    2018-01-01

    NASA Glenn Research Center hosted a Users' Workshop on the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) on August 21, 2017. The objective of this workshop was to update the user community on the latest features of T-MATS, and to provide a forum to present work performed using T-MATS. Presentations highlighted creative applications and the development of new features and libraries, and emphasized the flexibility and simulation power of T-MATS.

  16. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 2; Methodology Application Software Toolbox

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.

  17. A part toolbox to tune genetic expression in Bacillus subtilis

    PubMed Central

    Guiziou, Sarah; Sauveplane, Vincent; Chang, Hung-Ju; Clerté, Caroline; Declerck, Nathalie; Jules, Matthieu; Bonnet, Jerome

    2016-01-01

    Libraries of well-characterised components regulating gene expression levels are essential to many synthetic biology applications. While widely available for the Gram-negative model bacterium Escherichia coli, such libraries are lacking for the Gram-positive model Bacillus subtilis, a key organism for basic research and biotechnological applications. Here, we engineered a genetic toolbox comprising libraries of promoters, Ribosome Binding Sites (RBS), and protein degradation tags to precisely tune gene expression in B. subtilis. We first designed a modular Expression Operating Unit (EOU) facilitating parts assembly and modifications and providing a standard genetic context for gene circuits implementation. We then selected native, constitutive promoters of B. subtilis and efficient RBS sequences from which we engineered three promoters and three RBS sequence libraries exhibiting ∼14 000-fold dynamic range in gene expression levels. We also designed a collection of SsrA proteolysis tags of variable strength. Finally, by using fluorescence fluctuation methods coupled with two-photon microscopy, we quantified the absolute concentration of GFP in a subset of strains from the library. Our complete promoters and RBS sequences library comprising over 135 constructs enables tuning of GFP concentration over five orders of magnitude, from 0.05 to 700 μM. This toolbox of regulatory components will support many research and engineering applications in B. subtilis. PMID:27402159

  18. Velocity Mapping Toolbox (VMT): a processing and visualization suite for moving-vessel ADCP measurements

    USGS Publications Warehouse

    Parsons, D.R.; Jackson, P.R.; Czuba, J.A.; Engel, F.L.; Rhoads, B.L.; Oberg, K.A.; Best, J.L.; Mueller, D.S.; Johnson, K.K.; Riley, J.D.

    2013-01-01

    The use of acoustic Doppler current profilers (ADCP) for discharge measurements and three-dimensional flow mapping has increased rapidly in recent years and has been primarily driven by advances in acoustic technology and signal processing. Recent research has developed a variety of methods for processing data obtained from a range of ADCP deployments and this paper builds on this progress by describing new software for processing and visualizing ADCP data collected along transects in rivers or other bodies of water. The new utility, the Velocity Mapping Toolbox (VMT), allows rapid processing (vector rotation, projection, averaging and smoothing), visualization (planform and cross-section vector and contouring), and analysis of a range of ADCP-derived datasets. The paper documents the data processing routines in the toolbox and presents a set of diverse examples that demonstrate its capabilities. The toolbox is applicable to the analysis of ADCP data collected in a wide range of aquatic environments and is made available as open-source code along with this publication.
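
    One of the basic operations listed above is vector rotation; a minimal sketch of projecting east/north ADCP velocities onto streamwise/transverse axes defined by the mean flow direction follows, using synthetic velocities rather than VMT's own routines.

```python
# Generic example of the kind of vector rotation VMT performs: project ADCP
# east/north velocities onto streamwise/transverse axes defined by the mean
# flow direction of a cross-section. Synthetic velocities, not real ADCP data.
import numpy as np

rng = np.random.default_rng(4)
east = 0.8 + 0.05 * rng.standard_normal(200)      # m/s, synthetic ensemble velocities
north = 0.3 + 0.05 * rng.standard_normal(200)

theta = np.arctan2(north.mean(), east.mean())     # mean flow direction (radians)
streamwise = east * np.cos(theta) + north * np.sin(theta)
transverse = -east * np.sin(theta) + north * np.cos(theta)
print(f"mean streamwise {streamwise.mean():.2f} m/s, mean transverse {transverse.mean():.2f} m/s")
```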

  19. The conservation physiology toolbox: status and opportunities

    PubMed Central

    Love, Oliver P; Hultine, Kevin R

    2018-01-01

    Abstract For over a century, physiological tools and techniques have been allowing researchers to characterize how organisms respond to changes in their natural environment and how they interact with human activities or infrastructure. Over time, many of these techniques have become part of the conservation physiology toolbox, which is used to monitor, predict, conserve, and restore plant and animal populations under threat. Here, we provide a summary of the tools that currently comprise the conservation physiology toolbox. By assessing patterns in articles that have been published in ‘Conservation Physiology’ over the past 5 years that focus on introducing, refining and validating tools, we provide an overview of where researchers are placing emphasis in terms of taxa and physiological sub-disciplines. Although there is certainly diversity across the toolbox, metrics of stress physiology (particularly glucocorticoids) and studies focusing on mammals have garnered the greatest attention, with both comprising the majority of publications (>45%). We also summarize the types of validations that are actively being completed, including those related to logistics (sample collection, storage and processing), interpretation of variation in physiological traits and relevance for conservation science. Finally, we provide recommendations for future tool refinement, with suggestions for: (i) improving our understanding of the applicability of glucocorticoid physiology; (ii) linking multiple physiological and non-physiological tools; (iii) establishing a framework for plant conservation physiology; (iv) assessing links between environmental disturbance, physiology and fitness; (v) appreciating opportunities for validations in under-represented taxa; and (vi) emphasizing tool validation as a core component of research programmes. Overall, we are confident that conservation physiology will continue to increase its applicability to more taxa, develop more non-invasive techniques, delineate where limitations exist, and identify the contexts necessary for interpretation in captivity and the wild. PMID:29942517

  20. Transonic CFD applications at Boeing

    NASA Technical Reports Server (NTRS)

    Tinoco, E. N.

    1989-01-01

    The use of computational methods for three dimensional transonic flow design and analysis at the Boeing Company is presented. A range of computational tools consisting of production tools for every day use by project engineers, expert user tools for special applications by computational researchers, and an emerging tool which may see considerable use in the near future are described. These methods include full potential and Euler solvers, some coupled to three dimensional boundary layer analysis methods, for transonic flow analysis about nacelle, wing-body, wing-body-strut-nacelle, and complete aircraft configurations. As the examples presented show, such a toolbox of codes is necessary for the variety of applications typical of an industrial environment. Such a toolbox of codes makes possible aerodynamic advances not previously achievable in a timely manner, if at all.

  1. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    DTIC Science & Technology

    2013-04-24

    DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals Vernon...datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal . We have developed...As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and

  2. A CRISPR/Cas9 Toolbox for Multiplexed Plant Genome Editing and Transcriptional Regulation.

    PubMed

    Lowder, Levi G; Zhang, Dengwei; Baltes, Nicholas J; Paul, Joseph W; Tang, Xu; Zheng, Xuelian; Voytas, Daniel F; Hsieh, Tzung-Fu; Zhang, Yong; Qi, Yiping

    2015-10-01

    The relative ease, speed, and biological scope of clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated Protein9 (Cas9)-based reagents for genomic manipulations are revolutionizing virtually all areas of molecular biosciences, including functional genomics, genetics, applied biomedical research, and agricultural biotechnology. In plant systems, however, a number of hurdles currently exist that limit this technology from reaching its full potential. For example, significant plant molecular biology expertise and effort is still required to generate functional expression constructs that allow simultaneous editing, and especially transcriptional regulation, of multiple different genomic loci or multiplexing, which is a significant advantage of CRISPR/Cas9 versus other genome-editing systems. To streamline and facilitate rapid and wide-scale use of CRISPR/Cas9-based technologies for plant research, we developed and implemented a comprehensive molecular toolbox for multifaceted CRISPR/Cas9 applications in plants. This toolbox provides researchers with a protocol and reagents to quickly and efficiently assemble functional CRISPR/Cas9 transfer DNA constructs for monocots and dicots using Golden Gate and Gateway cloning methods. It comes with a full suite of capabilities, including multiplexed gene editing and transcriptional activation or repression of plant endogenous genes. We report the functionality and effectiveness of this toolbox in model plants such as tobacco (Nicotiana benthamiana), Arabidopsis (Arabidopsis thaliana), and rice (Oryza sativa), demonstrating its utility for basic and applied plant research. © 2015 American Society of Plant Biologists. All Rights Reserved.

  3. Building Interdisciplinary Research Models Through Interactive Education.

    PubMed

    Hessels, Amanda J; Robinson, Brian; O'Rourke, Michael; Begg, Melissa D; Larson, Elaine L

    2015-12-01

    Critical interdisciplinary research skills include effective communication with diverse disciplines and cultivating collaborative relationships. Acquiring these skills during graduate education may foster future interdisciplinary research quality and productivity. The project aim was to develop and evaluate an interactive Toolbox workshop approach within an interprofessional graduate level course to enhance student learning and skill in interdisciplinary research. We sought to examine the student experience of integrating the Toolbox workshop in modular format over the duration of a 14-week course. The Toolbox Health Sciences Instrument includes six modules that were introduced in a 110-minute dialogue session during the first class and then integrated into the course in a series of six individual workshops in three phases over the course of the semester. Seventeen students participated; the majority were nursing students. Three measures were used to assess project outcomes: pre-post intervention Toolbox survey, competency self-assessment, and a postcourse survey. All measures indicated the objectives were met by a change in survey responses, improved competencies, and favorable experience of the Toolbox modular intervention. Our experience indicates that incorporating this Toolbox modular approach into research curricula can enhance individual level scientific capacity, future interdisciplinary research project success, and ultimately impact on practice and policy. © 2015 Wiley Periodicals, Inc.

  4. RenderToolbox3: MATLAB tools that facilitate physically based stimulus rendering for vision research.

    PubMed

    Heasly, Benjamin S; Cottaris, Nicolas P; Lichtman, Daniel P; Xiao, Bei; Brainard, David H

    2014-02-07

    RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.

  5. Toolbox for Research, or how to facilitate a central data management in small-scale research projects.

    PubMed

    Bialke, Martin; Rau, Henriette; Thamm, Oliver C; Schuldt, Ronny; Penndorf, Peter; Blumentritt, Arne; Gött, Robert; Piegsa, Jens; Bahls, Thomas; Hoffmann, Wolfgang

    2018-01-25

    In most research projects, budget, staff and IT infrastructure are limiting resources. Especially for small-scale registries and cohort studies, professional IT support and commercial electronic data capture systems are too expensive. Consequently, these projects use simple local approaches (e.g. Excel) for data capture instead of a central data management including web-based data capture and proper research databases. This leads to manual processes to merge, analyze and, if possible, pseudonymize research data of different study sites. To support multi-site data capture, storage and analyses in small-scale research projects, corresponding requirements were analyzed within the MOSAIC project. Based on the identified requirements, the Toolbox for Research was developed as a flexible software solution for various research scenarios. Additionally, the Toolbox facilitates data integration of research data as well as metadata by performing necessary procedures automatically. Also, Toolbox modules allow the integration of device data. Moreover, separation of personally identifiable information and medical data by using only pseudonyms for storing medical data ensures compliance with data protection regulations. This pseudonymized data can then be exported in SPSS format in order to enable scientists to prepare reports and analyses. The Toolbox for Research was successfully piloted in the German Burn Registry in 2016, facilitating the documentation of 4350 burn cases at 54 study sites. The Toolbox for Research can be downloaded free of charge from the project website and automatically installed due to the use of Docker technology.
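
    A generic sketch of the pseudonymization idea described above follows: identifying data and medical data are stored separately, with the medical record keyed only by a pseudonym derived from the identifier via a keyed hash. The key, identifiers and fields are placeholders; this is not the Toolbox for Research implementation.

```python
# Generic sketch of record pseudonymization (not the Toolbox for Research code):
# identifying data stays in one store, medical data is keyed only by a pseudonym
# derived from the identifier with a keyed hash held by a trusted party.
import hmac
import hashlib

SECRET_KEY = b"held-only-by-the-trusted-third-party"   # placeholder secret

def pseudonym(patient_id: str) -> str:
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

identity_store = {"P-0042": {"name": "Jane Doe", "dob": "1980-02-01"}}   # made-up record
medical_store = {pseudonym("P-0042"): {"tbsa_percent": 18, "site": "054"}}

print(pseudonym("P-0042"), medical_store[pseudonym("P-0042")])
```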

  6. COMETS2: An advanced MATLAB toolbox for the numerical analysis of electric fields generated by transcranial direct current stimulation.

    PubMed

    Lee, Chany; Jung, Young-Jin; Lee, Sang Jun; Im, Chang-Hwan

    2017-02-01

    Since there is no way to measure electric current generated by transcranial direct current stimulation (tDCS) inside the human head through in vivo experiments, numerical analysis based on the finite element method has been widely used to estimate the electric field inside the head. In 2013, we released a MATLAB toolbox named COMETS, which has been used by a number of groups and has helped researchers to gain insight into the electric field distribution during stimulation. The aim of this study was to develop an advanced MATLAB toolbox, named COMETS2, for the numerical analysis of the electric field generated by tDCS. COMETS2 can generate rectangular pad electrodes of any size at any position on the scalp surface. To reduce the large computational burden when repeatedly testing multiple electrode locations and sizes, a new technique to decompose the global stiffness matrix was proposed. As examples of potential applications, we observed the effects of sizes and displacements of electrodes on the results of electric field analysis. The proposed mesh decomposition method significantly enhanced the overall computational efficiency. We implemented an automatic electrode modeler for the first time, and proposed a new technique to enhance the computational efficiency. In this paper, an efficient toolbox for tDCS analysis is introduced (freely available at http://www.cometstool.com). It is expected that COMETS2 will be a useful toolbox for researchers who want to benefit from the numerical analysis of electric fields generated by tDCS. Copyright © 2016. Published by Elsevier B.V.

  7. SCoT: a Python toolbox for EEG source connectivity.

    PubMed

    Billinger, Martin; Brunner, Clemens; Müller-Putz, Gernot R

    2014-01-01

    Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT, a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill-posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivity estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT.
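
    The connectivity measures mentioned above are built on vector autoregressive (VAR) models; a minimal least-squares VAR fit in NumPy is sketched below on synthetic data, as a generic illustration rather than the SCoT API.

```python
# Generic least-squares fit of a vector autoregressive model, VAR(p), the model
# class spectral connectivity measures are built on; this is not the SCoT API.
import numpy as np

def fit_var(x, p):
    """x: (channels, samples). Returns coefficient matrices of shape (p, ch, ch)."""
    ch, n = x.shape
    Y = x[:, p:]                                                  # regression targets
    Z = np.vstack([x[:, p - k:n - k] for k in range(1, p + 1)])   # stacked lagged samples
    A = Y @ np.linalg.pinv(Z)                                     # (ch, p*ch)
    return A.reshape(ch, p, ch).transpose(1, 0, 2)

rng = np.random.default_rng(5)
x = rng.standard_normal((2, 5000))
for t in range(1, 5000):                                  # simulate a ground-truth VAR(1)
    x[:, t] += np.array([[0.5, 0.3], [0.0, 0.5]]) @ x[:, t - 1]
print(np.round(fit_var(x, p=1)[0], 2))                    # ~[[0.5, 0.3], [0.0, 0.5]]
```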

  8. SCoT: a Python toolbox for EEG source connectivity

    PubMed Central

    Billinger, Martin; Brunner, Clemens; Müller-Putz, Gernot R.

    2014-01-01

    Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT—a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill-posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivity estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT. PMID:24653694

  9. Report on the Current Inventory of the Toolbox for Plant Cell Wall Analysis: Proteinaceous and Small Molecular Probes

    PubMed Central

    Rydahl, Maja G.; Hansen, Aleksander R.; Kračun, Stjepan K.; Mravec, Jozef

    2018-01-01

    Plant cell walls are highly complex structures composed of diverse classes of polysaccharides, proteoglycans, and polyphenolics, which have numerous roles throughout the life of a plant. Significant research efforts aim to understand the biology of this cellular organelle and to facilitate cell-wall-based industrial applications. To accomplish this, researchers need to be provided with a variety of sensitive and specific detection methods for separate cell wall components, and their various molecular characteristics in vitro as well as in situ. Cell wall component-directed molecular detection probes (in short: cell wall probes, CWPs) are an essential asset to the plant glycobiology toolbox. To date, a relatively large set of CWPs has been produced—mainly consisting of monoclonal antibodies, carbohydrate-binding modules, synthetic antibodies produced by phage display, and small molecular probes. In this review, we summarize the state-of-the-art knowledge about these CWPs; their classification and their advantages and disadvantages in different applications. In particular, we elaborate on the recent advances in non-conventional approaches to the generation of novel CWPs, and identify the remaining gaps in terms of target recognition. This report also highlights the addition of new “compartments” to the probing toolbox, which is filled with novel chemical biology tools, such as metabolic labeling reagents and oligosaccharide conjugates. In the end, we also forecast future developments in this dynamic field. PMID:29774041

  10. Pointing System Simulation Toolbox with Application to a Balloon Mission Simulator

    NASA Technical Reports Server (NTRS)

    Maringolo Baldraco, Rosana M.; Aretskin-Hariton, Eliot D.; Swank, Aaron J.

    2017-01-01

    The development of attitude estimation and pointing-control algorithms is necessary in order to achieve high-fidelity modeling for a Balloon Mission Simulator (BMS). A pointing system simulation toolbox was developed to enable this. The toolbox consists of a star-tracker (ST) and Inertial Measurement Unit (IMU) signal generator, a UDP (User Datagram Protocol) communication file (bridge), and an indirect-multiplicative extended Kalman filter (imEKF). This document describes the Python toolbox developed and the results of its implementation in the imEKF.
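
    The imEKF itself is an indirect multiplicative EKF on the attitude error state, which is substantially more involved than can be shown here; the sketch below only illustrates the generic EKF predict/update cycle on a toy scalar system with made-up noise parameters.

```python
# Generic extended Kalman filter predict/update sketch for a toy scalar system;
# the paper's imEKF is an indirect *multiplicative* EKF on the quaternion attitude
# error state, which is considerably more involved than this outline.
import numpy as np

def ekf_step(x, P, z, q, r):
    # Predict through the nonlinear model x_k = f(x_{k-1}) with Jacobian F.
    f, F = np.sin(x), np.cos(x)
    x_pred, P_pred = f, F * P * F + q
    # Update with measurement z = h(x) + noise, where h(x) = x so the Jacobian H = 1.
    K = P_pred / (P_pred + r)                 # Kalman gain
    return x_pred + K * (z - x_pred), (1 - K) * P_pred

rng = np.random.default_rng(6)
x_est, P, truth = 0.1, 1.0, 0.1
for _ in range(20):
    truth = np.sin(truth)                     # propagate the "true" state
    z = truth + 0.05 * rng.standard_normal()  # noisy measurement
    x_est, P = ekf_step(x_est, P, z, q=1e-4, r=0.05 ** 2)
print(f"estimate {x_est:.3f} vs truth {truth:.3f}")
```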

  11. A web-based tool for ranking landslide mitigation measures

    NASA Astrophysics Data System (ADS)

    Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.

    2012-04-01

    As part of the research done in the European project SafeLand "Living with landslide risk in Europe: Assessment, effects of global change, and risk management strategies", a compendium of structural and non-structural mitigation measures for different landslide types in Europe was prepared, and the measures were assembled into a web-based "toolbox". Emphasis was placed on providing a rational and flexible framework applicable to existing and future mitigation measures. The purpose of the web-based toolbox is to assist decision-making and to guide the user in the choice of the most appropriate mitigation measures. The mitigation measures were classified into three categories, describing whether the mitigation measures addressed the landslide hazard, the vulnerability or the elements at risk themselves. The measures considered include structural measures reducing hazard and non-structural mitigation measures, reducing either the hazard or the consequences (or vulnerability and exposure of elements at risk). The structural measures include surface protection and control of surface erosion; measures modifying the slope geometry and/or mass distribution; measures modifying surface water regime - surface drainage; measures modifying groundwater regime - deep drainage; measures modifying the mechanical characteristics of unstable mass; transfer of loads to more competent strata; retaining structures (to modify slope geometry and/or to transfer stress to competent layer); deviating the path of landslide debris; dissipating the energy of debris flows; and arresting and containing landslide debris or rock fall. The non-structural mitigation measures, reducing either the hazard or the consequences, include: early warning systems; restricting or discouraging construction activities; increasing resistance or coping capacity of elements at risk; relocation of elements at risk; sharing of risk through insurance. The measures are described in the toolbox with fact sheets providing a brief description, guidance on design, schematic details, practical examples and references for each mitigation measure. Each of the measures was given a score on its ability and applicability for different types of landslides and boundary conditions, and a decision support matrix was established. The web-based toolbox organizes the information in the compendium and provides an algorithm to rank the measures on the basis of the decision support matrix, and on the basis of the risk level estimated at the site. The toolbox includes a description of the case under study and offers a simplified option for estimating the hazard and risk levels of the slide at hand. The user selects the mitigation measures to be included in the assessment. The toolbox then ranks, with built-in assessment factors and weights and/or with user-defined ranking values and criteria, the mitigation measures included in the analysis. The toolbox includes data management, e.g. saving data half-way in an analysis, returning to an earlier case, looking up prepared examples or looking up information on mitigation measures. The toolbox also generates a report and has user-forum and help features. The presentation will give an overview of the mitigation measures considered and examples of the use of the toolbox, and will take the attendees through the application of the toolbox.
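
    A schematic example of ranking measures with a weighted decision support matrix, in the spirit of the toolbox described above, follows; the measures, criteria, scores and weights are invented for illustration and do not come from SafeLand.

```python
# Schematic weighted-score ranking of mitigation measures, in the spirit of the
# toolbox's decision support matrix; the scores and weights below are made up.
applicability = {            # hypothetical 0-10 scores per criterion
    "deep drainage":        {"hazard_reduction": 8, "cost": 4, "feasibility": 7},
    "retaining structure":  {"hazard_reduction": 9, "cost": 2, "feasibility": 5},
    "early warning system": {"hazard_reduction": 5, "cost": 8, "feasibility": 9},
}
weights = {"hazard_reduction": 0.5, "cost": 0.2, "feasibility": 0.3}   # user-defined

ranked = sorted(
    ((sum(w * scores[c] for c, w in weights.items()), name)
     for name, scores in applicability.items()),
    reverse=True,
)
for total, name in ranked:
    print(f"{total:4.1f}  {name}")
```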

  12. The Model Optimization, Uncertainty, and SEnsitivity analysis (MOUSE) toolbox: overview and application

    USDA-ARS?s Scientific Manuscript database

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  13. The Microbial Fecal Indicator Paradigm: Tools in the Toolbox Applications in Recreational Waters

    EPA Science Inventory

    Summary of ORD’s recent research to develop tools for assessing microbial water quality in recreational waters. Methods discussed include the development of health associations between microbial fecal indicators and the development of culture, and molecular methods for fec...

  14. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  15. A toolbox and a record for scientific model development

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling these tasks: (1) designing a 'Model Development Toolbox' that includes a basic set of model constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well-defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.

  16. Overview and application of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) toolbox

    USDA-ARS?s Scientific Manuscript database

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  17. SacLab: A toolbox for saccade analysis to increase usability of eye tracking systems in clinical ophthalmology practice.

    PubMed

    Cercenelli, Laura; Tiberi, Guido; Corazza, Ivan; Giannaccare, Giuseppe; Fresina, Michela; Marcelli, Emanuela

    2017-01-01

    Many open source software packages have been recently developed to expand the usability of eye tracking systems to study oculomotor behavior, but none of these is specifically designed to encompass all the main functions required for creating eye tracking tests and for providing the automatic analysis of saccadic eye movements. The aim of this study is to introduce SacLab, an intuitive, freely-available MATLAB toolbox based on Graphical User Interfaces (GUIs) that we have developed to increase the usability of the ViewPoint EyeTracker (Arrington Research, Scottsdale, AZ, USA) in clinical ophthalmology practice. SacLab consists of four processing modules that enable the user to easily create visual stimuli tests (Test Designer), record saccadic eye movements (Data Recorder), analyze the recorded data to automatically extract saccadic parameters of clinical interest (Data Analyzer) and provide an aggregate analysis from multiple eye movement recordings (Saccade Analyzer), without requiring any programming effort by the user. A demo application of SacLab to carry out eye tracking tests for the analysis of horizontal saccades was reported. We tested the usability of the SacLab toolbox with three ophthalmologists who had no programming experience; the ophthalmologists were briefly trained in the use of the SacLab GUIs and were asked to perform the demo application. The toolbox received enthusiastic feedback from all the clinicians in terms of intuitiveness, ease of use and flexibility. Test creation and data processing were accomplished in 52±21 s and 46±19 s, respectively, using the SacLab GUIs. SacLab may represent a useful tool to ease the application of the ViewPoint EyeTracker system in routine clinical ophthalmology. Copyright © 2016 Elsevier Ltd. All rights reserved.
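
    As a generic illustration of automatic saccade analysis (not SacLab's implementation, which wraps the ViewPoint EyeTracker), the sketch below applies a simple velocity threshold to a synthetic horizontal gaze trace and reports onset, duration, amplitude and peak velocity.

```python
# Generic velocity-threshold saccade detection (not SacLab's implementation):
# differentiate the gaze trace, flag samples above a velocity criterion, and
# report onset, duration, amplitude and peak velocity. Synthetic data only.
import numpy as np

fs = 1000.0                                   # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
gaze = np.where(t < 0.5, 0.0, 10.0)           # idealized 10-degree horizontal step
gaze = np.convolve(gaze, np.ones(30) / 30, mode="same")   # smooth into a saccade-like ramp
gaze += 0.01 * np.random.default_rng(7).standard_normal(t.size)

vel = np.gradient(gaze, 1 / fs)               # deg/s
above = vel > 30.0                            # 30 deg/s velocity criterion
edges = np.flatnonzero(np.diff(above.astype(int)))
for on, off in zip(edges[::2], edges[1::2]):
    print(f"onset {t[on] * 1000:.0f} ms, duration {(off - on) / fs * 1000:.0f} ms, "
          f"amplitude {gaze[off] - gaze[on]:.1f} deg, peak {vel[on:off].max():.0f} deg/s")
```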

  18. biomechZoo: An open-source toolbox for the processing, analysis, and visualization of biomechanical movement data.

    PubMed

    Dixon, Philippe C; Loh, Jonathan J; Michaud-Paquette, Yannick; Pearsall, David J

    2017-03-01

    It is common for biomechanics data sets to contain numerous dependent variables recorded over time, for many subjects, groups, and/or conditions. These data often require standard sorting, processing, and analysis operations to be performed in order to answer research questions. Visualization of these data is also crucial. This manuscript presents biomechZoo, an open-source toolbox that provides tools and graphical user interfaces to help users achieve these goals. The aims of this manuscript are to (1) introduce the main features of the toolbox, including a virtual three-dimensional environment to animate motion data (Director), a data plotting suite (Ensembler), and functions for the computation of three-dimensional lower-limb joint angles, moments, and power and (2) compare these computations to those of an existing validated system. To these ends, the steps required to process and analyze a sample data set via the toolbox are outlined. The data set comprises three-dimensional marker, ground reaction force (GRF), joint kinematic, and joint kinetic data of subjects performing straight walking and 90° turning manoeuvres. Joint kinematics and kinetics processed within the toolbox were found to be similar to outputs from a commercial system. The biomechZoo toolbox represents the work of several years and multiple contributors to provide a flexible platform to examine time-series data sets typical in the movement sciences. The toolbox has previously been used to process and analyse walking, running, and ice hockey data sets, and can integrate existing routines, such as the KineMat toolbox, for additional analyses. The toolbox can help researchers and clinicians new to programming or biomechanics to process and analyze their data through a customizable workflow, while advanced users are encouraged to contribute additional functionality to the project. Students may benefit from using biomechZoo as a learning and research tool. It is hoped that the toolbox can play a role in advancing research in the movement sciences. The biomechZoo m-files, sample data, and help repositories are available online (http://www.biomechzoo.com) under the Apache 2.0 License. The toolbox is supported for Matlab (r2014b or newer, The Mathworks Inc., Natick, USA) for Windows (Microsoft Corp., Redmond, USA) and Mac OS (Apple Inc., Cupertino, USA). Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
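
    One step such pipelines perform is expressing a joint angle as Cardan angles of the rotation between two segment coordinate systems; a minimal NumPy sketch with a synthetic 20° flexion follows, rather than biomechZoo's own (MATLAB) routines.

```python
# Generic sketch of a joint angle expressed as Cardan angles of the rotation
# between two segment coordinate systems; synthetic orientations, not biomechZoo code.
import numpy as np

def cardan_angles(R):
    """Angles (a, b, g) in radians such that R = Rz(g) @ Ry(b) @ Rx(a)."""
    beta = -np.arcsin(R[2, 0])
    alpha = np.arctan2(R[2, 1] / np.cos(beta), R[2, 2] / np.cos(beta))
    gamma = np.arctan2(R[1, 0] / np.cos(beta), R[0, 0] / np.cos(beta))
    return np.array([alpha, beta, gamma])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

thigh_R = np.eye(3)                      # segment orientations in the lab frame
shank_R = rot_x(np.deg2rad(20.0))        # shank flexed 20 degrees about x
joint_R = thigh_R.T @ shank_R            # proximal-to-distal relative rotation
print(np.rad2deg(cardan_angles(joint_R)))   # ~[20., 0., 0.]
```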

  19. An Early Years Toolbox for Assessing Early Executive Function, Language, Self-Regulation, and Social Development: Validity, Reliability, and Preliminary Norms

    PubMed Central

    Howard, Steven J.; Melhuish, Edward

    2016-01-01

    Several methods of assessing executive function (EF), self-regulation, language development, and social development in young children have been developed over previous decades. Yet new technologies make available methods of assessment not previously considered. In resolving conceptual and pragmatic limitations of existing tools, the Early Years Toolbox (EYT) offers substantial advantages for early assessment of language, EF, self-regulation, and social development. In the current study, results of our large-scale administration of this toolbox to 1,764 preschool and early primary school students indicated very good reliability, convergent validity with existing measures, and developmental sensitivity. Results were also suggestive of better capture of children’s emerging abilities relative to comparison measures. Preliminary norms are presented, showing a clear developmental trajectory across half-year age groups. The accessibility of the EYT, as well as its advantages over existing measures, offers considerably enhanced opportunities for objective measurement of young children’s abilities to enable research and educational applications. PMID:28503022

  20. An experimental toolbox for the generation of cold and ultracold polar molecules

    NASA Astrophysics Data System (ADS)

    Zeppenfeld, Martin; Gantner, Thomas; Glöckner, Rosa; Ibrügger, Martin; Koller, Manuel; Prehn, Alexander; Wu, Xing; Chervenkov, Sotir; Rempe, Gerhard

    2017-01-01

    Cold and ultracold molecules enable fascinating applications in quantum science. We present our toolbox of techniques to generate the required molecule ensembles, including buffer-gas cooling, centrifuge deceleration, and optoelectrical Sisyphus cooling. We obtain excellent control over both the motional and internal molecular degrees of freedom, allowing us to aim at various applications.

  1. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.

    2017-12-01

    The vast majority of data analyzed by climate researchers are repeated observations of a physical process, i.e., time series data. These data lend themselves to a common set of statistical techniques and models designed to determine trends and variability (e.g., seasonality) of these repeated observations. Often, the same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g., interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
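
    The pre-processing steps mentioned above (interpolation, aggregation) and a basic trend estimate can be sketched in a few lines. The following Python example uses pandas on a made-up daily stage record; it is purely illustrative of these common operations, not of the Timeseries Toolbox's actual implementation or interface.

        import numpy as np
        import pandas as pd

        # Hypothetical daily river-stage record with a gap and a weak upward trend.
        idx = pd.date_range("2000-01-01", periods=3650, freq="D")
        rng = np.random.default_rng(0)
        stage = (2.0 + 0.001 * np.arange(idx.size)
                 + 0.5 * np.sin(2 * np.pi * idx.dayofyear / 365.25)
                 + rng.normal(0, 0.1, idx.size))
        series = pd.Series(stage, index=idx)
        series.iloc[100:130] = np.nan                       # simulate missing data

        filled = series.interpolate(method="time")          # gap filling
        monthly = filled.resample("MS").mean()               # aggregation to monthly means

        # Simple linear trend (stage units per year) fitted to the monthly means.
        x_years = (monthly.index - monthly.index[0]).days / 365.25
        slope, intercept = np.polyfit(x_years, monthly.values, 1)
        print(f"estimated trend: {slope:.4f} stage units per year")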

  2. Travel demand management : a toolbox of strategies to reduce single-occupant vehicle trips and increase alternate mode usage in Arizona.

    DOT National Transportation Integrated Search

    2012-02-01

    The report provides a suite of recommended strategies to reduce single-occupant vehicle traffic in the urban areas of Phoenix and Tucson, Arizona, which are presented as a travel demand management toolbox. The toolbox includes supporting research...

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnamurthy, Dheepak

    This paper is an overview of the Power System Simulation Toolbox (psst). psst is an open-source Python application for the simulation and analysis of power system models. psst simulates the wholesale market operation by solving a DC Optimal Power Flow (DCOPF), Security Constrained Unit Commitment (SCUC) and a Security Constrained Economic Dispatch (SCED). psst also includes models for the various entities in a power system such as Generator Companies (GenCos), Load Serving Entities (LSEs) and an Independent System Operator (ISO). psst features an open, modular, object-oriented architecture that makes it useful for researchers to customize, expand, and experiment beyond solving traditional problems. psst also includes a web-based Graphical User Interface (GUI) that allows for user-friendly interaction and for implementation on remote High Performance Computing (HPC) clusters for parallelized operations. This paper also provides an illustrative application of psst and benchmarks with standard IEEE test cases to show the advanced features and the performance of the toolbox.
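
    To make the wholesale-market problems concrete, the snippet below solves a toy three-bus DC optimal power flow as a linear program with SciPy. The network data (two generators, one load, identical line susceptances and limits) are invented for illustration, and the formulation is a generic textbook DCOPF, not psst's internal model or API.

        import numpy as np
        from scipy.optimize import linprog

        # Variables: x = [p1, p2, theta2, theta3]; bus 1 is the slack (theta1 = 0).
        # Lines 1-2, 1-3, 2-3 with susceptance b = 10 and a 60 MW flow limit.
        # Generators at buses 1 and 2 (costs 10 and 30 $/MWh), 80 MW load at bus 3.
        cost = [10.0, 30.0, 0.0, 0.0]

        # Nodal balance: injection_i = sum_j b * (theta_i - theta_j)
        A_eq = np.array([
            [1.0, 0.0,  10.0, 10.0],   # bus 1: p1 + 10*t2 + 10*t3 = 0
            [0.0, 1.0, -20.0, 10.0],   # bus 2: p2 - 20*t2 + 10*t3 = 0
            [0.0, 0.0, -10.0, 20.0],   # bus 3: -10*t2 + 20*t3 = -80 (load)
        ])
        b_eq = np.array([0.0, 0.0, -80.0])

        # Line flow limits |b*(theta_i - theta_j)| <= 60, written as pairs of rows.
        A_ub = np.array([
            [0, 0, -10,   0], [0, 0, 10,   0],    # line 1-2
            [0, 0,   0, -10], [0, 0,  0,  10],    # line 1-3
            [0, 0,  10, -10], [0, 0, -10, 10],    # line 2-3
        ], dtype=float)
        b_ub = np.full(6, 60.0)

        bounds = [(0, 100), (0, 100), (None, None), (None, None)]
        res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=bounds, method="highs")
        p1, p2, t2, t3 = res.x
        print(f"dispatch: p1 = {p1:.1f} MW, p2 = {p2:.1f} MW, cost = {res.fun:.0f} $/h")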

  4. Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools

    NASA Astrophysics Data System (ADS)

    Tonini, F.; Liu, J.

    2016-12-01

    Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.

  5. Optics Program Simplifies Analysis and Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Engineers at Goddard Space Flight Center partnered with software experts at Mide Technology Corporation, of Medford, Massachusetts, through a Small Business Innovation Research (SBIR) contract to design the Disturbance-Optics-Controls-Structures (DOCS) Toolbox, a software suite for performing integrated modeling for multidisciplinary analysis and design. The DOCS Toolbox integrates various discipline models into a coupled process math model that can then predict system performance as a function of subsystem design parameters. The system can be optimized for performance; design parameters can be traded; parameter uncertainties can be propagated through the math model to develop error bounds on system predictions; and the model can be updated, based on component, subsystem, or system level data. The Toolbox also allows the definition of process parameters as explicit functions of the coupled model and includes a number of functions that analyze the coupled system model and provide for redesign. The product is being sold commercially by Nightsky Systems Inc., of Raleigh, North Carolina, a spinoff company that was formed by Mide specifically to market the DOCS Toolbox. Commercial applications include use by any contractors developing large space-based optical systems, including Lockheed Martin Corporation, The Boeing Company, and Northrop Grumman Corporation, as well as companies providing technical audit services, like General Dynamics Corporation.

  6. The Handover Toolbox: a knowledge exchange and training platform for improving patient care.

    PubMed

    Drachsler, Hendrik; Kicken, Wendy; van der Klink, Marcel; Stoyanov, Slavi; Boshuizen, Henny P A; Barach, Paul

    2012-12-01

    Safe and effective patient handovers remain a global organisational and training challenge. Limited evidence supports available handover training programmes. Customisable training is a promising approach to improve the quality and sustainability of handover training and outcomes. We present a Handover Toolbox designed in the context of the European HANDOVER Project. The Toolbox aims to support physicians, nurses, individuals in health professions training, medical educators and handover experts by providing customised handover training tools for different clinical needs and contexts. The Handover Toolbox uses the Technology Enhanced Learning Design Process (TEL-DP), which encompasses user requirements analysis; writing personas; group concept mapping; analysis of suitable software; plus, minus, interesting rating; and usability testing. TEL-DP is aligned with participatory design approaches and ensures development occurs in close collaboration with, and engagement of, key stakeholders. Application of TEL-DP confirmed that the ideal formats of handover training differs for practicing professionals versus individuals in health profession education programmes. Training experts from different countries differed in their views on the optimal content and delivery of training. Analysis of suitable software identified ready-to-use systems that provide required functionalities and can be further customised to users' needs. Interest rating and usability testing resulted in improved usability, navigation and uptake of the Handover Toolbox. The design of the Handover Toolbox was based on a carefully led stakeholder participatory design using the TEL-DP approach. The Toolbox supports a customisable learning approach that allows trainers to design training that addresses the specific information needs of the various target groups. We offer recommendations regarding the application of the Handover Toolbox to medical educators.

  7. Air Sensor Toolbox

    EPA Pesticide Factsheets

    Air Sensor Toolbox provides information to citizen scientists, researchers and developers interested in learning more about new lower-cost compact air sensor technologies and tools for measuring air quality.

  8. A toolbox for safety instrumented system evaluation based on improved continuous-time Markov chain

    NASA Astrophysics Data System (ADS)

    Wardana, Awang N. I.; Kurniady, Rahman; Pambudi, Galih; Purnama, Jaka; Suryopratomo, Kutut

    2017-08-01

    A safety instrumented system (SIS) is designed to restore a plant to a safe condition when a pre-hazardous event occurs. It plays a vital role, especially in the process industries. A SIS must meet its safety requirement specifications; to confirm this, the SIS must be evaluated. Typically, the evaluation is calculated by hand. This paper presents a toolbox for SIS evaluation, developed based on an improved continuous-time Markov chain. The toolbox supports a detailed evaluation approach. This paper also illustrates an industrial application of the toolbox to evaluate the arch burner safety system of a primary reformer. The results of the case study demonstrate that the toolbox can be used to evaluate an industrial SIS in detail and to plan the maintenance strategy.
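
    A minimal numerical illustration of continuous-time Markov chain evaluation of a SIS is sketched below in Python: a three-state model (working, dangerous detected, dangerous undetected) whose generator matrix is exponentiated to obtain the average probability of failure on demand over a proof-test interval. The failure and repair rates are hypothetical and the model is a generic textbook example, not the improved formulation used by the cited toolbox.

        import numpy as np
        from scipy.linalg import expm

        # States: 0 = OK, 1 = dangerous detected (repairable), 2 = dangerous undetected.
        lam_dd = 2e-6    # dangerous detected failure rate, per hour (assumed)
        lam_du = 1e-6    # dangerous undetected failure rate, per hour (assumed)
        mu_dd = 1 / 8.0  # repair rate once a detected failure is revealed, per hour

        # Generator (rate) matrix Q; each row sums to zero.
        Q = np.array([
            [-(lam_dd + lam_du), lam_dd, lam_du],
            [ mu_dd,            -mu_dd,  0.0   ],
            [ 0.0,               0.0,    0.0   ],  # undetected failure persists until proof test
        ])

        T = 8760.0                        # proof-test interval: one year, in hours
        p0 = np.array([1.0, 0.0, 0.0])    # system starts in the OK state

        times = np.linspace(0.0, T, 200)
        unavail = [p0 @ expm(Q * t) @ np.array([0.0, 1.0, 1.0]) for t in times]
        pfd_avg = float(np.mean(unavail))  # time-average over the interval
        print(f"PFDavg over the proof-test interval: {pfd_avg:.2e}")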

  9. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    PubMed Central

    Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano

    2009-01-01

    Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrodes locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. Conclusion The new toolbox presented here implements fast and data-robust computations of the most relevant quantities used in information theoretic analysis of neural data. The toolbox can be easily used within Matlab, the environment used by most neuroscience laboratories for the acquisition, preprocessing and plotting of neural data. It can therefore significantly enlarge the domain of application of information theory to neuroscience, and lead to new discoveries about the neural code. PMID:19607698

  10. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings.

    PubMed

    Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano

    2009-07-16

    Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrodes locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. The new toolbox presented here implements fast and data-robust computations of the most relevant quantities used in information theoretic analysis of neural data. The toolbox can be easily used within Matlab, the environment used by most neuroscience laboratories for the acquisition, preprocessing and plotting of neural data. It can therefore significantly enlarge the domain of application of information theory to neuroscience, and lead to new discoveries about the neural code.
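
    A stripped-down illustration of the plug-in information estimate and a simple first-order (Miller-Madow) bias correction is given below in Python. It only conveys the flavour of the limited-sampling problem such a toolbox addresses; the toolbox itself implements considerably more sophisticated estimators and correction procedures, and the stimulus/response data here are synthetic.

        import numpy as np

        def entropy_plugin(counts):
            """Plug-in (maximum likelihood) entropy in bits from an array of counts."""
            p = counts[counts > 0] / counts.sum()
            return -np.sum(p * np.log2(p))

        def entropy_miller_madow(counts):
            """Plug-in entropy plus the first-order Miller-Madow bias correction."""
            n = counts.sum()
            k = np.count_nonzero(counts)
            return entropy_plugin(counts) + (k - 1) / (2 * n * np.log(2))

        def mutual_information(stim, resp, entropy=entropy_miller_madow):
            """I(S;R) = H(S) + H(R) - H(S,R) from discrete stimulus/response labels."""
            joint = np.zeros((stim.max() + 1, resp.max() + 1))
            for s, r in zip(stim, resp):
                joint[s, r] += 1
            return (entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0))
                    - entropy(joint))

        # Toy experiment: 4 stimuli, responses are a noisy copy of the stimulus.
        rng = np.random.default_rng(1)
        stim = rng.integers(0, 4, size=400)
        resp = (stim + rng.integers(0, 3, size=400)) % 4
        print(f"I(S;R) ~ {mutual_information(stim, resp):.3f} bits")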

  11. Sentinel-2 data exploitation with ESA's Sentinel-2 Toolbox

    NASA Astrophysics Data System (ADS)

    Gascon, Ferran; Ramoino, Fabrizzio; deanos, Yves-louis

    2017-04-01

    The Sentinel-2 Toolbox is a project kicked off by ESA in early 2014, under the umbrella of the ESA SEOM programme, with the aim to provide a tool for visualizing, analysing, and processing the Sentinel-2 datasets. The toolbox is an extension of the SeNtinel Application Platform (SNAP), a project resulting from the effort of the developers of the Sentinel-1, Sentinel-2 and Sentinel-3 toolboxes to provide a single common application framework suited for the mixed exploitation of SAR, high resolution optical and medium resolution optical datasets. All three development teams collaborate to drive the evolution of the common SNAP framework in a developer forum. In this triplet, the Sentinel-2 toolbox is dedicated to enhancing SNAP support for high resolution optical imagery. It is a multi-mission toolbox, already providing support for Sentinel-2, RapidEye, Deimos, SPOT 1 to SPOT 5 datasets. In terms of processing algorithms, SNAP provides tools specific to the Sentinel-2 mission:
    • An atmospheric correction module, Sen2Cor, is integrated into the toolbox, and provides scene classification, atmospheric correction, cirrus detection and correction. The output L2A products can be opened seamlessly in the toolbox.
    • A multitemporal synthesis processor (L3)
    • A biophysical products processor (L2B)
    • A water processor
    • A deforestation detector
    • OTB tools integration
    • SNAP Engine for Cloud Exploitation
    These come along with a set of more generic tools for high resolution optical data exploitation. Together with the generic functionalities of SNAP this provides an ideal environment for designing multi-mission processing chains and producing value-added products from raw datasets. The use of SNAP is manifold, and the desktop tools provide a rich application for interactive visualization, analysis and processing of data. But all tools available from SNAP can be accessed via command line through the Graph Processing Framework (GPT), the kernel of the SNAP processing engine. This makes it a perfect candidate for driving the processing of data on servers for bulk processing.

  12. Prony Ringdown GUI (CERTS Prony Ringdown, part of the DSI Tool Box)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuffner, Francis; Marinovici, PNNL Laurentiu; Hauer, PNNL John

    2014-02-21

    The PNNL Prony Ringdown graphical user interface is one analysis tool included in the Dynamic System Identification toolbox (DSI Toolbox). The Dynamic System Identification toolbox is a MATLAB-based collection of tools for parsing and analyzing phasor measurement unit data, especially in regards to small signal stability. It includes tools to read the data, preprocess it, and perform small signal analysis. The DSI Toolbox is designed to provide a research environment for examining phasor measurement unit data and performing small signal stability analysis. The software uses a series of text-driven menus to help guide users and organize the toolbox features. Methods for reading in phasor measurement unit data are provided, with appropriate preprocessing options for small-signal-stability analysis. The toolbox includes the Prony Ringdown GUI and basic algorithms to estimate information on oscillatory modes of the system, such as modal frequency and damping ratio.
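
    As background on what a Prony ringdown analysis does, the sketch below fits a sum of damped sinusoids to a signal using the classical least-squares Prony method (linear prediction, root finding, then amplitude fitting) and reports modal frequency and damping ratio. It is an illustrative Python implementation of the general technique on synthetic data, not code from the DSI Toolbox; the mode parameters and sampling rate are invented.

        import numpy as np

        def prony(x, order, dt):
            """Classical Prony fit: return modal frequency (Hz), damping ratio, and
            complex amplitude for each of `order` exponential modes."""
            n = len(x)
            # 1) Linear prediction: x[k] = -a1*x[k-1] - ... - ap*x[k-p]
            A = np.column_stack([x[order - j - 1:n - j - 1] for j in range(order)])
            a = np.linalg.lstsq(A, -x[order:], rcond=None)[0]
            # 2) Poles are the roots of z^p + a1*z^(p-1) + ... + ap
            z = np.roots(np.concatenate(([1.0], a)))
            s = np.log(z) / dt                       # continuous-time poles
            freq = np.abs(s.imag) / (2 * np.pi)
            zeta = -s.real / np.abs(s)               # damping ratio
            # 3) Amplitudes from a Vandermonde least-squares fit
            V = np.vander(z, N=n, increasing=True).T
            amps = np.linalg.lstsq(V, x.astype(complex), rcond=None)[0]
            return freq, zeta, amps

        # Synthetic ringdown: a 0.3 Hz inter-area mode with 5% damping, 10 Hz samples.
        dt = 0.1
        t = np.arange(0, 20, dt)
        wn = 2 * np.pi * 0.3
        x = np.exp(-0.05 * wn * t) * np.cos(wn * np.sqrt(1 - 0.05**2) * t)
        freq, zeta, _ = prony(x, order=2, dt=dt)
        print("frequencies (Hz):", np.round(freq, 3))
        print("damping ratios:  ", np.round(zeta, 3))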

  13. A CRISPR-Based Toolbox for Studying T Cell Signal Transduction

    PubMed Central

    Chi, Shen; Weiss, Arthur; Wang, Haopeng

    2016-01-01

    CRISPR/Cas9 system is a powerful technology to perform genome editing in a variety of cell types. To facilitate the application of Cas9 in mapping T cell signaling pathways, we generated a toolbox for large-scale genetic screens in human Jurkat T cells. The toolbox has three different Jurkat cell lines expressing distinct Cas9 variants, including wild-type Cas9, dCas9-KRAB, and sunCas9. We demonstrated that the toolbox allows us to rapidly disrupt endogenous gene expression at the DNA level and to efficiently repress or activate gene expression at the transcriptional level. The toolbox, in combination with multiple currently existing genome-wide sgRNA libraries, will be useful to systematically investigate T cell signal transduction using both loss-of-function and gain-of-function genetic screens. PMID:27057542

  14. A Transcription Activator-Like Effector (TALE) Toolbox for Genome Engineering

    PubMed Central

    Sanjana, Neville E.; Cong, Le; Zhou, Yang; Cunniff, Margaret M.; Feng, Guoping; Zhang, Feng

    2013-01-01

    Transcription activator-like effectors (TALEs) are a class of naturally occurring DNA binding proteins found in the plant pathogen Xanthomonas sp. The DNA binding domain of each TALE consists of tandem 34-amino acid repeat modules that can be rearranged according to a simple cipher to target new DNA sequences. Customized TALEs can be used for a wide variety of genome engineering applications, including transcriptional modulation and genome editing. Here we describe a toolbox for rapid construction of custom TALE transcription factors (TALE-TFs) and nucleases (TALENs) using a hierarchical ligation procedure. This toolbox facilitates affordable and rapid construction of custom TALE-TFs and TALENs within one week and can be easily scaled up to construct TALEs for multiple targets in parallel. We also provide details for testing the activity in mammalian cells of custom TALE-TFs and TALENs using, respectively, qRT-PCR and Surveyor nuclease. The TALE toolbox described here will enable a broad range of biological applications. PMID:22222791

  15. Developing a congestion mitigation toolbox.

    DOT National Transportation Integrated Search

    2011-09-30

    Researchers created A Michigan Toolbox for Mitigating Traffic Congestion to be a useful desk reference for practitioners and an educational tool for elected officials acting through public policy boards to better understand the development, planning,...

  16. ESA's Multi-mission Sentinel-1 Toolbox

    NASA Astrophysics Data System (ADS)

    Veci, Luis; Lu, Jun; Foumelis, Michael; Engdahl, Marcus

    2017-04-01

    The Sentinel-1 Toolbox is new open-source software for scientific learning, research and exploitation of the large archives of Sentinel and heritage missions. The Toolbox is based on the proven BEAM/NEST architecture, inheriting all current NEST functionality including multi-mission support for most civilian satellite SAR missions. The project is funded through ESA's Scientific Exploitation of Operational Missions (SEOM). The Sentinel-1 Toolbox will strive to serve the SEOM mandate by providing leading-edge software to the science and application users in support of ESA's operational SAR mission as well as by educating and growing a SAR user community. The Toolbox consists of a collection of processing tools, data product readers and writers and a display and analysis application. A common architecture for all Sentinel Toolboxes, called the Sentinel Application Platform (SNAP), is being jointly developed by Brockmann Consult, Array Systems Computing and C-S. The SNAP architecture is ideal for Earth Observation processing and analysis due to the following technological innovations: Extensibility, Portability, Modular Rich Client Platform, Generic EO Data Abstraction, Tiled Memory Management, and a Graph Processing Framework. The project has developed new tools for working with Sentinel-1 data, in particular for working with the new Interferometric TOPSAR mode. TOPSAR Complex Coregistration and a complete Interferometric processing chain have been implemented for Sentinel-1 TOPSAR data. To accomplish this, a coregistration following the Spectral Diversity[4] method has been developed, as well as special azimuth handling in the coherence, interferogram and spectral filter operators. The Toolbox includes reading of L0, L1 and L2 products in SAFE format, calibration and de-noising, slice product assembling, TOPSAR deburst and sub-swath merging, terrain flattening radiometric normalization, and visualization for L2 OCN products. The Toolbox also provides several new tools for exploitation of polarimetric data including speckle filters, decompositions, and classifiers. The Toolbox will also include tools for large data stacks, supervised and unsupervised classification, improved vector handling and change detection. Architectural improvements such as smart memory configuration, task queuing, and optimizations for complex data will provide better support and performance for very large products and stacks. In addition, a Cloud Exploitation Platform Extension (CEP) has been developed to add the capability to smoothly utilize a cloud computing platform where EO data repositories and high performance processing capabilities are available. The extension to the SENTINEL Application Platform would facilitate entry into cloud processing services for supporting bulk processing on high performance clusters. Since December 2016, the COMET-LiCS InSAR portal (http://comet.nerc.ac.uk/COMET-LiCS-portal/) has been live, delivering interferograms and coherence estimates over the entire Alpine-Himalayan belt. The portal already contains tens of thousands of products, which can be browsed in a user-friendly portal, and downloaded for free by the general public. For our processing, we use the facilities at the Climate and Environmental Monitoring from Space (CEMS). Here we have large storage and processing facilities at our disposal, and a complete duplicate of the Sentinel-1 archive is maintained. This greatly simplifies the infrastructure we had to develop for automated processing of large areas.
Here we will give an overview of the current status of the processing system, as well as discuss future plans. We will cover the infrastructure we developed to automatically produce interferograms and its challenges, and the processing strategy for time series analysis. We will outline the objectives of the system in the near and distant future, and a roadmap for its continued development. Finally, we will highlight some of the scientific results and projects linked to the system.

  17. Turbo-Satori: a neurofeedback and brain-computer interface toolbox for real-time functional near-infrared spectroscopy.

    PubMed

    Lührs, Michael; Goebel, Rainer

    2017-10-01

    Turbo-Satori is a neurofeedback and brain-computer interface (BCI) toolbox for real-time functional near-infrared spectroscopy (fNIRS). It incorporates multiple pipelines from real-time preprocessing and analysis to neurofeedback and BCI applications. The toolbox is designed with a focus on usability, enabling a fast setup and execution of real-time experiments. Turbo-Satori uses an incremental recursive least-squares procedure for real-time general linear model calculation and support vector machine classifiers for advanced BCI applications. It communicates directly with common NIRx fNIRS hardware and was tested extensively to ensure that the calculations can be performed in real time, without a significant change in calculation times for all sampling intervals during ongoing experiments of up to 6 h of recording. Enabling immediate access to advanced processing features also makes the toolbox suitable for students and nonexperts in the field of fNIRS data acquisition and processing. Flexible network interfaces allow third-party stimulus applications to access the processed data and calculated statistics in real time, so that this information can be easily incorporated in neurofeedback or BCI presentations.
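
    The core idea of estimating a general linear model incrementally with recursive least squares can be illustrated compactly; the Python sketch below updates GLM coefficients one sample at a time and matches the batch ordinary least-squares solution at the end. The design matrix and data are synthetic, and the code is a generic RLS illustration rather than Turbo-Satori's implementation.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic single-channel fNIRS-like data: boxcar task regressor + constant.
        n = 600
        task = (np.arange(n) % 60 < 30).astype(float)      # 30-sample on/off blocks
        X = np.column_stack([task, np.ones(n)])            # design matrix (n x 2)
        y = 0.8 * task + 0.2 + rng.normal(0, 0.3, n)       # true betas: 0.8, 0.2

        # Recursive least squares: update beta and the inverse normal matrix P
        # after every incoming sample, so estimates are available in real time.
        beta = np.zeros(2)
        P = np.eye(2) * 1e6                                # large prior covariance
        for x_t, y_t in zip(X, y):
            Px = P @ x_t
            k = Px / (1.0 + x_t @ Px)                      # gain vector
            beta = beta + k * (y_t - x_t @ beta)           # innovation update
            P = P - np.outer(k, Px)                        # covariance downdate

        print("RLS estimate:", np.round(beta, 3))
        print("batch OLS:   ", np.round(np.linalg.lstsq(X, y, rcond=None)[0], 3))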

  18. FADTTSter: accelerating hypothesis testing with functional analysis of diffusion tensor tract statistics

    NASA Astrophysics Data System (ADS)

    Noel, Jean; Prieto, Juan C.; Styner, Martin

    2017-03-01

    Functional Analysis of Diffusion Tensor Tract Statistics (FADTTS) is a toolbox for analysis of white matter (WM) fiber tracts. It allows associating diffusion properties along major WM bundles with a set of covariates of interest, such as age, diagnostic status and gender, and the structure of the variability of these WM tract properties. However, to use this toolbox, a user must have an intermediate knowledge in scripting languages (MATLAB). FADTTSter was created to overcome this issue and make the statistical analysis accessible to any non-technical researcher. FADTTSter is actively being used by researchers at the University of North Carolina. FADTTSter guides non-technical users through a series of steps including quality control of subjects and fibers in order to setup the necessary parameters to run FADTTS. Additionally, FADTTSter implements interactive charts for FADTTS' outputs. This interactive chart enhances the researcher experience and facilitates the analysis of the results. FADTTSter's motivation is to improve usability and provide a new analysis tool to the community that complements FADTTS. Ultimately, by enabling FADTTS to a broader audience, FADTTSter seeks to accelerate hypothesis testing in neuroimaging studies involving heterogeneous clinical data and diffusion tensor imaging. This work is submitted to the Biomedical Applications in Molecular, Structural, and Functional Imaging conference. The source code of this application is available in NITRC.

  19. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
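
    To make the G-theory quantities concrete, the sketch below estimates variance components for a simple persons-by-trials design from two-way ANOVA mean squares and forms the dependability coefficient for the trial-averaged score. It is a generic, single-facet Python example on simulated data, not the ERA Toolbox's algorithm, which handles additional facets and unbalanced designs.

        import numpy as np

        rng = np.random.default_rng(2)
        n_p, n_t = 30, 40                                    # persons x trials (balanced)
        person_effect = rng.normal(0, 2.0, size=(n_p, 1))    # true person variance = 4
        trial_effect = rng.normal(0, 0.5, size=(1, n_t))
        scores = 5.0 + person_effect + trial_effect + rng.normal(0, 3.0, size=(n_p, n_t))

        # Mean squares for the crossed p x t design (no replication).
        grand = scores.mean()
        ms_p = n_t * np.sum((scores.mean(axis=1) - grand) ** 2) / (n_p - 1)
        ms_t = n_p * np.sum((scores.mean(axis=0) - grand) ** 2) / (n_t - 1)
        resid = (scores - scores.mean(axis=1, keepdims=True)
                 - scores.mean(axis=0, keepdims=True) + grand)
        ms_pt = np.sum(resid ** 2) / ((n_p - 1) * (n_t - 1))

        var_pt = ms_pt                                       # person x trial interaction + error
        var_p = max((ms_p - ms_pt) / n_t, 0.0)               # universe-score (person) variance
        var_t = max((ms_t - ms_pt) / n_p, 0.0)               # trial variance

        # Dependability (absolute) coefficient for a score averaged over n_t trials.
        phi = var_p / (var_p + (var_t + var_pt) / n_t)
        print(f"variance components: person={var_p:.2f}, trial={var_t:.2f}, residual={var_pt:.2f}")
        print(f"dependability coefficient (phi) for {n_t}-trial averages: {phi:.3f}")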

  20. A preclinical cognitive test battery to parallel the National Institute of Health Toolbox in humans: bridging the translational gap.

    PubMed

    Snigdha, Shikha; Milgram, Norton W; Willis, Sherry L; Albert, Marylin; Weintraub, S; Fortin, Norbert J; Cotman, Carl W

    2013-07-01

    A major goal of animal research is to identify interventions that can promote successful aging and delay or reverse age-related cognitive decline in humans. Recent advances in standardizing cognitive assessment tools for humans have the potential to bring preclinical work closer to human research in aging and Alzheimer's disease. The National Institute of Health (NIH) has led an initiative to develop a comprehensive Toolbox for Neurologic Behavioral Function (NIH Toolbox) to evaluate cognitive, motor, sensory and emotional function for use in epidemiologic and clinical studies spanning 3 to 85 years of age. This paper aims to analyze the strengths and limitations of animal behavioral tests that can be used to parallel those in the NIH Toolbox. We conclude that there are several paradigms available to define a preclinical battery that parallels the NIH Toolbox. We also suggest areas in which new tests may benefit the development of a comprehensive preclinical test battery for assessment of cognitive function in animal models of aging and Alzheimer's disease. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. A preclinical cognitive test battery to parallel the National Institute of Health Toolbox in humans: bridging the translational gap

    PubMed Central

    Snigdha, Shikha; Milgram, Norton W.; Willis, Sherry L.; Albert, Marylin; Weintraub, S.; Fortin, Norbert J.; Cotman, Carl W.

    2013-01-01

    A major goal of animal research is to identify interventions that can promote successful aging and delay or reverse age-related cognitive decline in humans. Recent advances in standardizing cognitive assessment tools for humans have the potential to bring preclinical work closer to human research in aging and Alzheimer’s disease. The National Institute of Health (NIH) has led an initiative to develop a comprehensive Toolbox for Neurologic Behavioral Function (NIH Toolbox) to evaluate cognitive, motor, sensory and emotional function for use in epidemiologic and clinical studies spanning 3 to 85 years of age. This paper aims to analyze the strengths and limitations of animal behavioral tests that can be used to parallel those in the NIH Toolbox. We conclude that there are several paradigms available to define a preclinical battery that parallels the NIH Toolbox. We also suggest areas in which new tests may benefit the development of a comprehensive preclinical test battery for assessment of cognitive function in animal models of aging and Alzheimer’s disease. PMID:23434040

  2. A Michigan toolbox for mitigating traffic congestion.

    DOT National Transportation Integrated Search

    2011-09-30

    "Researchers created A Michigan Toolbox for Mitigating Traffic Congestion to be a useful desk reference : for practitioners and an educational tool for elected officials acting through public policy boards to better : understand the development, plan...

  3. A portable toolbox to monitor and evaluate signal operations.

    DOT National Transportation Integrated Search

    2011-10-01

    Researchers from the Texas Transportation Institute developed a portable tool consisting of a field-hardened computer interfacing with the traffic signal cabinet through special enhanced Bus Interface Units. The toolbox consisted of a monitoring t...

  4. Quantitative prediction of cellular metabolism with constraint-based models: the COBRA Toolbox v2.0

    PubMed Central

    Schellenberger, Jan; Que, Richard; Fleming, Ronan M. T.; Thiele, Ines; Orth, Jeffrey D.; Feist, Adam M.; Zielinski, Daniel C.; Bordbar, Aarash; Lewis, Nathan E.; Rahmanian, Sorena; Kang, Joseph; Hyduke, Daniel R.; Palsson, Bernhard Ø.

    2012-01-01

    Over the past decade, a growing community of researchers has emerged around the use of COnstraint-Based Reconstruction and Analysis (COBRA) methods to simulate, analyze and predict a variety of metabolic phenotypes using genome-scale models. The COBRA Toolbox, a MATLAB package for implementing COBRA methods, was presented earlier. Here we present a significant update of this in silico ToolBox. Version 2.0 of the COBRA Toolbox expands the scope of computations by including in silico analysis methods developed since its original release. New functions include: (1) network gap filling, (2) 13C analysis, (3) metabolic engineering, (4) omics-guided analysis, and (5) visualization. As with the first version, the COBRA Toolbox reads and writes Systems Biology Markup Language formatted models. In version 2.0, we improved performance, usability, and the level of documentation. A suite of test scripts can now be used to learn the core functionality of the Toolbox and validate results. This Toolbox lowers the barrier of entry to use powerful COBRA methods. PMID:21886097
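
    At its core, a COBRA flux balance analysis is a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. The toy example below, written for this summary in Python with SciPy rather than with the COBRA Toolbox itself, solves such a problem for a made-up three-reaction network.

        import numpy as np
        from scipy.optimize import linprog

        # Toy network: EX_A (uptake of A), R1 (A -> B), BIO (B -> biomass sink).
        # Rows of S are metabolites [A, B]; columns are reactions [EX_A, R1, BIO].
        S = np.array([
            [ 1.0, -1.0,  0.0],   # A is produced by uptake, consumed by R1
            [ 0.0,  1.0, -1.0],   # B is produced by R1, consumed by the biomass reaction
        ])
        bounds = [(0, 10.0), (0, 1000.0), (0, 1000.0)]      # uptake of A capped at 10

        # Maximize the biomass flux (linprog minimizes, so negate the objective).
        c = np.array([0.0, 0.0, -1.0])
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        v_ex, v_r1, v_bio = res.x
        print(f"optimal fluxes: EX_A={v_ex:.1f}, R1={v_r1:.1f}, biomass={v_bio:.1f}")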

  5. Wyrm: A Brain-Computer Interface Toolbox in Python.

    PubMed

    Venthur, Bastian; Dähne, Sven; Höhne, Johannes; Heller, Hendrik; Blankertz, Benjamin

    2015-10-01

    In the last years Python has gained more and more traction in the scientific community. Projects like NumPy, SciPy, and Matplotlib have created a strong foundation for scientific computing in Python and machine learning packages like scikit-learn or packages for data analysis like Pandas are building on top of it. In this paper we present Wyrm ( https://github.com/bbci/wyrm ), an open source BCI toolbox in Python. Wyrm is applicable to a broad range of neuroscientific problems. It can be used as a toolbox for analysis and visualization of neurophysiological data and in real-time settings, like an online BCI application. In order to prevent software defects, Wyrm makes extensive use of unit testing. We will explain the key aspects of Wyrm's software architecture and design decisions for its data structure, and demonstrate and validate the use of our toolbox by presenting our approach to the classification tasks of two different data sets from the BCI Competition III. Furthermore, we will give a brief analysis of the data sets using our toolbox, and demonstrate how we implemented an online experiment using Wyrm. With Wyrm we add the final piece to our ongoing effort to provide a complete, free and open source BCI system in Python.

  6. ReTrOS: a MATLAB toolbox for reconstructing transcriptional activity from gene and protein expression data.

    PubMed

    Minas, Giorgos; Momiji, Hiroshi; Jenkins, Dafyd J; Costa, Maria J; Rand, David A; Finkenstädt, Bärbel

    2017-06-26

    Given the development of high-throughput experimental techniques, an increasing number of whole genome transcription profiling time series data sets, with good temporal resolution, are becoming available to researchers. The ReTrOS toolbox (Reconstructing Transcription Open Software) provides MATLAB-based implementations of two related methods, namely ReTrOS-Smooth and ReTrOS-Switch, for reconstructing the temporal transcriptional activity profile of a gene from given mRNA expression time series or protein reporter time series. The methods are based on fitting a differential equation model incorporating the processes of transcription, translation and degradation. The toolbox provides a framework for model fitting along with statistical analyses of the model with a graphical interface and model visualisation. We highlight several applications of the toolbox, including the reconstruction of the temporal cascade of transcriptional activity inferred from mRNA expression data and protein reporter data in the core circadian clock in Arabidopsis thaliana, and how such reconstructed transcription profiles can be used to study the effects of different cell lines and conditions. The ReTrOS toolbox allows users to analyse gene and/or protein expression time series where, with appropriate formulation of prior information about a minimum of kinetic parameters, in particular rates of degradation, users are able to infer timings of changes in transcriptional activity. Data from any organism and obtained from a range of technologies can be used as input due to the flexible and generic nature of the model and implementation. The output from this software provides a useful analysis of time series data and can be incorporated into further modelling approaches or in hypothesis generation.
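
    The class of model ReTrOS fits couples transcription, translation, and degradation; a bare-bones forward simulation of such a model is sketched below with SciPy. The switch-like transcription profile and rate constants are hypothetical, and the code only illustrates the underlying differential equations, not the toolbox's statistical inference machinery.

        import numpy as np
        from scipy.integrate import solve_ivp

        # dM/dt = tau(t) - delta_m * M      (mRNA: transcription minus degradation)
        # dP/dt = alpha * M - delta_p * P   (reporter protein: translation minus degradation)
        alpha, delta_m, delta_p = 2.0, 0.3, 0.1   # assumed rate constants

        def transcription(t):
            """Hypothetical switch-like transcriptional activity (on between t=5 and t=15)."""
            return 1.5 if 5.0 <= t <= 15.0 else 0.1

        def rhs(t, y):
            m, p = y
            return [transcription(t) - delta_m * m, alpha * m - delta_p * p]

        sol = solve_ivp(rhs, t_span=(0, 40), y0=[0.0, 0.0],
                        t_eval=np.linspace(0, 40, 9), max_step=0.1)
        for t, m, p in zip(sol.t, sol.y[0], sol.y[1]):
            print(f"t={t:5.1f}  mRNA={m:6.3f}  protein={p:7.3f}")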

  7. Silk fibroin as biomaterial for bone tissue engineering.

    PubMed

    Melke, Johanna; Midha, Swati; Ghosh, Sourabh; Ito, Keita; Hofmann, Sandra

    2016-02-01

    Silk fibroin (SF) is a fibrous protein which is produced mainly by silkworms and spiders. Its unique mechanical properties, tunable biodegradation rate and the ability to support the differentiation of mesenchymal stem cells along the osteogenic lineage, have made SF a favorable scaffold material for bone tissue engineering. SF can be processed into various scaffold forms, combined synergistically with other biomaterials to form composites and chemically modified, which provides an impressive toolbox and allows SF scaffolds to be tailored to specific applications. This review discusses and summarizes recent advancements in processing SF, focusing on different fabrication and functionalization methods and their application to grow bone tissue in vitro and in vivo. Potential areas for future research, current challenges, uncertainties and gaps in knowledge are highlighted. Silk fibroin is a natural biomaterial with remarkable biomedical and mechanical properties which make it favorable for a broad range of bone tissue engineering applications. It can be processed into different scaffold forms, combined synergistically with other biomaterials to form composites and chemically modified which provides a unique toolbox and allows silk fibroin scaffolds to be tailored to specific applications. This review discusses and summarizes recent advancements in processing silk fibroin, focusing on different fabrication and functionalization methods and their application to grow bone tissue in vitro and in vivo. Potential areas for future research, current challenges, uncertainties and gaps in knowledge are highlighted. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  8. Cementitious Barriers Partnership FY2013 End-Year Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, G. P.; Langton, C. A.; Burns, H. H.

    2013-11-01

    In FY2013, the Cementitious Barriers Partnership (CBP) demonstrated continued tangible progress toward fulfilling the objective of developing a set of software tools to improve understanding and prediction of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. In November 2012, the CBP released “Version 1.0” of the CBP Software Toolbox, a suite of software for simulating reactive transport in cementitious materials and important degradation phenomena. In addition, the CBP completed development of new software for the “Version 2.0” Toolbox to be released in early FY2014 and demonstrated use of the Version 1.0 Toolbox on DOE applications. The current primary software components in both Versions 1.0 and 2.0 are LeachXS/ORCHESTRA, STADIUM, and a GoldSim interface for probabilistic analysis of selected degradation scenarios. The CBP Software Toolbox Version 1.0 supports analysis of external sulfate attack (including damage mechanics), carbonation, and primary constituent leaching. Version 2.0 includes the additional analysis of chloride attack and dual regime flow and contaminant migration in fractured and non-fractured cementitious material. The LeachXS component embodies an extensive material property measurements database along with chemical speciation and reactive mass transport simulation cases with emphasis on leaching of major, trace and radionuclide constituents from cementitious materials used in DOE facilities, such as Saltstone (Savannah River) and Cast Stone (Hanford), tank closure grouts, and barrier concretes. STADIUM focuses on the physical and structural service life of materials and components based on chemical speciation and reactive mass transport of major cement constituents and aggressive species (e.g., chloride, sulfate, etc.). THAMES is a planned future CBP Toolbox component focused on simulation of the microstructure of cementitious materials and calculation of resultant hydraulic and constituent mass transfer parameters needed in modeling. Two CBP software demonstrations were conducted in FY2013, one to support the Saltstone Disposal Facility (SDF) at SRS and the other on a representative Hanford high-level waste tank. The CBP Toolbox demonstration on the SDF provided analysis on the most probable degradation mechanisms to the cementitious vault enclosure caused by sulfate and carbonation ingress. This analysis was documented and resulted in the issuance of a SDF Performance Assessment Special Analysis by Liquid Waste Operations this fiscal year. The two new software tools supporting chloride attack and dual-regime flow will provide additional degradation tools to better evaluate performance of DOE and commercial cementitious barriers. The CBP SRNL experimental program produced two patent applications and field data that will be used in the development and calibration of CBP software tools being developed in FY2014. The CBP software and simulation tools vary from other efforts in that all the tools are based upon specific and relevant experimental research of cementitious materials utilized in DOE applications. The CBP FY2013 program involved continuing research to improve and enhance the simulation tools as well as developing new tools that model other key degradation phenomena not addressed in Version 1.0.
    Also, efforts to verify the various simulation tools through laboratory experiments and analysis of field specimens are ongoing and will continue into FY2014 to quantify and reduce the uncertainty associated with performance assessments. This end-year report summarizes FY2013 software development efforts and the various experimental programs that are providing data for calibration and validation of the CBP-developed software.

  9. EPA EMERGENCY PLANNING TOOLBOX

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  10. Virtual acoustic environments for comprehensive evaluation of model-based hearing devices.

    PubMed

    Grimm, Giso; Luberadzka, Joanna; Hohmann, Volker

    2018-06-01

    The objective was to create virtual acoustic environments (VAEs) with interactive dynamic rendering for applications in audiology. A toolbox for the creation and rendering of dynamic virtual acoustic environments (TASCAR) that allows direct user interaction was developed for application in hearing aid research and audiology. The software architecture and the simulation methods used to produce VAEs are outlined, example environments are described and analysed, and a set of VAEs rendered with the proposed software is presented. With the proposed software, a tool for the simulation of VAEs is available.

  11. CFS MATLAB toolbox: An experiment builder for continuous flash suppression (CFS) task.

    PubMed

    Nuutinen, Mikko; Mustonen, Terhi; Häkkinen, Jukka

    2017-09-15

    CFS toolbox is an open-source collection of MATLAB functions that utilizes PsychToolbox-3 (PTB-3). It is designed to allow a researcher to create and run continuous flash suppression experiments using a variety of experimental parameters (i.e., stimulus types and locations, noise characteristics, and experiment window settings). In a CFS experiment, one of the eyes at a time is presented with a dynamically changing noise pattern, while the other eye is concurrently presented with a static target stimulus, such as a Gabor patch. Due to the strong interocular suppression created by the dominant noise pattern mask, the target stimulus is rendered invisible for an extended duration. Very little knowledge of MATLAB is required for using the toolbox; experiments are generated by modifying csv files with the required parameters, and result data are output to text files for further analysis. The open-source code is available on the project page under a Creative Commons License ( http://www.mikkonuutinen.arkku.net/CFS_toolbox/ and https://bitbucket.org/mikkonuutinen/cfs_toolbox ).

  12. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS. OVERVIEW AND APPLICATION. INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  13. A GIS tool for two-dimensional glacier-terminus change tracking

    NASA Astrophysics Data System (ADS)

    Urbanski, Jacek Andrzej

    2018-02-01

    This paper presents a Glacier Termini Tracking (GTT) toolbox for the two-dimensional analysis of glacier-terminus position changes. The input consists of a vector layer with several termini lines relating to the same glacier at different times. The output layers allow analyses to be conducted of glacier-terminus retreats, changes in retreats over time and along the ice face, and glacier-terminus fluctuations over time. The application of three tools from the toolbox is demonstrated via the analysis of eight glacier-terminus retreats and fluctuations at the Hornsund fjord in south Svalbard. It is proposed that this toolbox may also be useful in the study of other line features that change over time, like coastlines and rivers. The toolbox has been coded in Python and runs via ArcGIS.
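
    One simple way to quantify terminus change between two digitized termini lines is to sample points along the older line and measure their distance to the newer line; the Python sketch below does this with Shapely on two invented line geometries. It illustrates the general idea only and does not reproduce the GTT toolbox's ArcGIS-based algorithms or outputs.

        import numpy as np
        from shapely.geometry import LineString

        # Hypothetical glacier termini digitized in a projected coordinate system (metres);
        # the year-2 terminus sits roughly 150-250 m behind the year-1 position.
        terminus_y1 = LineString([(0, 0), (250, 120), (500, 80), (750, 160), (1000, 60)])
        terminus_y2 = LineString([(0, 180), (250, 320), (500, 300), (750, 360), (1000, 280)])

        def retreat_stats(old_line, new_line, n_samples=100):
            """Sample points along the older terminus and summarize their distance
            to the newer terminus (a simple proxy for retreat along the ice face)."""
            distances = []
            for frac in np.linspace(0.0, 1.0, n_samples):
                pt = old_line.interpolate(frac, normalized=True)
                distances.append(pt.distance(new_line))
            return np.mean(distances), np.min(distances), np.max(distances)

        mean_d, min_d, max_d = retreat_stats(terminus_y1, terminus_y2)
        print(f"mean retreat ~{mean_d:.0f} m (range {min_d:.0f}-{max_d:.0f} m along the face)")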

  14. Debates - Stochastic subsurface hydrology from theory to practice: Introduction

    NASA Astrophysics Data System (ADS)

    Rajaram, Harihar

    2016-12-01

    This paper introduces the papers in the "Debates - Stochastic Subsurface Hydrology from Theory to Practice" series. Beginning in the 1970s, the field of stochastic subsurface hydrology has been an active field of research, with over 3500 journal publications, of which over 850 have appeared in Water Resources Research. We are fortunate to have insightful contributions from four groups of distinguished authors who discuss the reasons why the advanced research framework established in stochastic subsurface hydrology has not impacted the practice of groundwater flow and transport modeling and design significantly. There is reasonable consensus that a community effort aimed at developing "toolboxes" for applications of stochastic methods will make them more accessible and encourage practical applications.

  15. Broadview Radar Altimetry Toolbox

    NASA Astrophysics Data System (ADS)

    Garcia-Mondejar, Albert; Escolà, Roger; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Naeije, Marc; Ambrózio, Américo; Restano, Marco; Benveniste, Jérôme

    2017-04-01

    The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox), which can read all previous and current altimetry missions' data, now incorporates the capability to read the upcoming Sentinel3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support the users of the future Sentinel3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the frontend for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data, bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT involving combinations of data fields that the user can save for posterior reuse, or using the already embedded formulas that include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a strong introduction to altimetry, shows its applications in different fields such as Oceanography, Cryosphere, Geodesy, Hydrology among others. Included are also "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel3 SAR Altimetry Toolbox shall benefit from the current BRAT version, and the Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. While developing the new toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry in order to train the users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open source framework, contributions from users having developed their own functions are welcome. The first release of the new Radar Altimetry Toolbox was published in September 2015. It incorporates the capability to read S3 products as well as the new CryoSat2 Baseline C. The second release of the Toolbox, published in October 2016, has a new graphical user interface and other visualisation improvements. The third release (January 2017) includes more features and solves issues from the previous versions.

  16. Planetary Geologic Mapping Python Toolbox: A Suite of Tools to Support Mapping Workflows

    NASA Astrophysics Data System (ADS)

    Hunter, M. A.; Skinner, J. A.; Hare, T. M.; Fortezzo, C. M.

    2017-06-01

    The collective focus of the Planetary Geologic Mapping Python Toolbox is to provide researchers with additional means to migrate legacy GIS data, assess the quality of data and analysis results, and simplify common mapping tasks.

  17. Improving student comprehension of the interconnectivity of the hydrologic cycle with a novel 'hydrology toolbox', integrated watershed model, and companion textbook

    NASA Astrophysics Data System (ADS)

    Huning, L. S.; Margulis, S. A.

    2013-12-01

    Concepts in introductory hydrology courses are often taught in the context of process-based modeling that ultimately is integrated into a watershed model. In an effort to reduce the learning curve associated with applying hydrologic concepts to real-world applications, we developed and incorporated a 'hydrology toolbox' that complements a new, companion textbook into introductory undergraduate hydrology courses. The hydrology toolbox contains the basic building blocks (functions coded in MATLAB) for an integrated spatially-distributed watershed model that makes hydrologic topics (e.g. precipitation, snow, radiation, evaporation, unsaturated flow, infiltration, groundwater, and runoff) more user-friendly and accessible for students. The toolbox functions can be used in a modular format so that students can study individual hydrologic processes and become familiar with the hydrology toolbox. This approach allows such courses to emphasize understanding and application of hydrologic concepts rather than computer coding or programming. While topics in introductory hydrology courses are often introduced and taught independently or semi-independently, they are inherently interconnected. These toolbox functions are therefore linked together at the end of the course to reinforce a holistic understanding of how these hydrologic processes are measured, interconnected, and modeled. They are integrated into a spatially-distributed watershed model or numerical laboratory where students can explore a range of topics such as rainfall-runoff modeling, urbanization, deforestation, watershed response to changes in parameters or forcings, etc. Model output can readily be visualized and analyzed by students to understand watershed response in a real river basin or a simple 'toy' basin. These tools complement the textbook, each of which has been well received by students in multiple hydrology courses with various disciplinary backgrounds. The same governing equations that students have studied in the textbook and used in the toolbox have been encapsulated in the watershed model. Therefore, the combination of the hydrology toolbox, integrated watershed model, and textbook tends to eliminate the potential disconnect between process-based modeling and an 'off-the-shelf' watershed model.
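
    The kind of modular building block such a hydrology toolbox links together can be illustrated with a single-bucket rainfall-runoff function; the short Python sketch below steps a toy water balance through a synthetic storm. The parameters and forcing are invented for teaching purposes, and the code is not taken from the course toolbox described above, which is MATLAB-based and spatially distributed.

        import numpy as np

        def bucket_model(precip_mm, pet_mm, capacity_mm=100.0, k_runoff=0.3, s0_mm=30.0):
            """Daily single-bucket water balance: storage fills with rain, loses water to
            evapotranspiration, spills when full, and drains linearly as runoff."""
            storage, runoff = s0_mm, []
            for p, pet in zip(precip_mm, pet_mm):
                storage += p
                storage -= min(pet, storage)             # actual ET limited by storage
                spill = max(storage - capacity_mm, 0.0)  # saturation-excess overflow
                storage -= spill
                q = k_runoff * storage                   # linear-reservoir drainage
                storage -= q
                runoff.append(q + spill)
            return np.array(runoff)

        # Synthetic 30-day forcing: a dry spell followed by a 3-day storm.
        precip = np.zeros(30)
        precip[15:18] = [40.0, 60.0, 25.0]
        pet = np.full(30, 3.0)
        q = bucket_model(precip, pet)
        print("peak runoff: %.1f mm/day on day %d" % (q.max(), int(q.argmax()) + 1))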

  18. Tensor Toolbox for MATLAB v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolda, Tamara; Bader, Brett W.; Acar Ataman, Evrim

    Tensors (also known as multidimensional arrays or N-way arrays) are used in a variety of applications ranging from chemometrics to network analysis. The Tensor Toolbox provides classes for manipulating dense, sparse, and structured tensors using MATLAB's object-oriented features. It also provides algorithms for tensor decomposition and factorization, algorithms for computing tensor eigenvalues, and methods for visualization of results.
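
    As a generic illustration of the kind of computation such a toolbox supports (not the Tensor Toolbox API, which is MATLAB), a rank-1 CP (canonical polyadic) approximation of a 3-way array can be computed with a few lines of alternating least squares:

    ```python
    import numpy as np

    def rank1_cp_als(X, n_iter=50):
        """Rank-1 CP approximation of a 3-way array via alternating least
        squares; a generic sketch, not the Tensor Toolbox implementation."""
        I, J, K = X.shape
        b = np.random.rand(J)
        c = np.random.rand(K)
        for _ in range(n_iter):
            # Each factor is the tensor contracted against the other two,
            # with the overall scale carried by the 'a' factor.
            a = np.einsum('ijk,j,k->i', X, b, c)
            b = np.einsum('ijk,i,k->j', X, a, c)
            b /= np.linalg.norm(b)
            c = np.einsum('ijk,i,j->k', X, a, b)
            c /= np.linalg.norm(c)
        return a, b, c

    # Example: recover the factors of a noiseless rank-1 tensor.
    a0, b0, c0 = np.random.rand(4), np.random.rand(5), np.random.rand(6)
    X = np.einsum('i,j,k->ijk', a0, b0, c0)
    a, b, c = rank1_cp_als(X)
    print(np.linalg.norm(X - np.einsum('i,j,k->ijk', a, b, c)))  # ~0
    ```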

  19. Screening and assessment of chronic pain among children with cerebral palsy: a process evaluation of a pain toolbox.

    PubMed

    Orava, Taryn; Provvidenza, Christine; Townley, Ashleigh; Kingsnorth, Shauna

    2018-06-08

    Though high numbers of children with cerebral palsy experience chronic pain, it remains under-recognized. This paper describes an evaluation of implementation supports and adoption of the Chronic Pain Assessment Toolbox for Children with Disabilities (the Toolbox) to enhance pain screening and assessment practices within a pediatric rehabilitation and complex continuing care hospital. A multicomponent knowledge translation strategy facilitated Toolbox adoption, inclusive of a clinical practice guideline, cerebral palsy practice points and assessment tools. Across the hospital, seven ambulatory care clinics with cerebral palsy caseloads participated in a staggered roll-out (Group 1: exclusive CP caseloads, March-December; Group 2: mixed diagnostic caseloads, August-December). Evaluation measures included client electronic medical record audit, document review and healthcare provider survey and interviews. A significant change in documentation of pain screening and assessment practice from pre-Toolbox (<2%) to post-Toolbox adoption (53%) was found. Uptake in Group 2 clinics lagged behind Group 1. Opportunities to use the Toolbox consistently (based on diagnostic caseload) and frequently (based on client appointments) were noted among contextual factors identified. Overall, the Toolbox was positively received and clinically useful. Findings affirm that the Toolbox, in conjunction with the application of integrated knowledge translation principles and an established knowledge translation framework, has potential to be a useful resource to enrich and standardize chronic pain screening and assessment practices among children with cerebral palsy. Implications for Rehabilitation It is important to engage healthcare providers in the conceptualization, development, implementation and evaluation of a knowledge-to-action best practice product. The Chronic Pain Toolbox for Children with Disabilities provides rehabilitation staff with guidance on pain screening and assessment best practice and offers a range of validated tools that can be incorporated in ambulatory clinic settings to meet varied client needs. Considering unique clinical contexts (i.e., opportunities for use, provider engagement, staffing absences/turnover) is required to optimize and sustain chronic pain screening and assessment practices in rehabilitation outpatient settings.

  20. Aerospace Toolbox--a flight vehicle design, analysis, simulation, and software development environment II: an in-depth overview

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.

    2002-07-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provided a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed included its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in Part I included flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this series takes a more in-depth look at the analysis and simulation capability and provides an update on the toolbox enhancements. It also addresses how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  1. The Multivariate Temporal Response Function (mTRF) Toolbox: A MATLAB Toolbox for Relating Neural Signals to Continuous Stimuli.

    PubMed

    Crosse, Michael J; Di Liberto, Giovanni M; Bednar, Adam; Lalor, Edmund C

    2016-01-01

    Understanding how brains process sensory signals in natural environments is one of the key goals of twenty-first century neuroscience. While brain imaging and invasive electrophysiology will play key roles in this endeavor, there is also an important role to be played by noninvasive, macroscopic techniques with high temporal resolution such as electro- and magnetoencephalography. But challenges exist in determining how best to analyze such complex, time-varying neural responses to complex, time-varying and multivariate natural sensory stimuli. There has been a long history of applying system identification techniques to relate the firing activity of neurons to complex sensory stimuli and such techniques are now seeing increased application to EEG and MEG data. One particular example involves fitting a filter, often referred to as a temporal response function, that describes a mapping between some feature(s) of a sensory stimulus and the neural response. Here, we first briefly review the history of these system identification approaches and describe a specific technique for deriving temporal response functions known as regularized linear regression. We then introduce a new open-source toolbox for performing this analysis. We describe how it can be used to derive (multivariate) temporal response functions describing a mapping between stimulus and response in both directions. We also explain the importance of regularizing the analysis and how this regularization can be optimized for a particular dataset. We then outline specifically how the toolbox implements these analyses and provide several examples of the types of results that the toolbox can produce. Finally, we consider some of the limitations of the toolbox and opportunities for future development and application.
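
    The core technique described here, estimating a temporal response function by regularized (ridge) regression on time-lagged copies of the stimulus, can be sketched as follows; this is a minimal illustration on simulated data, not the mTRF Toolbox's actual interface:

    ```python
    import numpy as np

    def lagged_design(stim, lags):
        """Design matrix whose columns are time-shifted copies of the
        stimulus, one column per lag (in samples)."""
        n = len(stim)
        X = np.zeros((n, len(lags)))
        for j, lag in enumerate(lags):
            if lag >= 0:
                X[lag:, j] = stim[:n - lag]
            else:
                X[:n + lag, j] = stim[-lag:]
        return X

    def fit_trf(stim, resp, lags, lam=1.0):
        """Forward TRF by ridge regression: w = (X'X + lam*I)^-1 X'y."""
        X = lagged_design(stim, lags)
        XtX = X.T @ X + lam * np.eye(X.shape[1])
        return np.linalg.solve(XtX, X.T @ resp)

    # Simulated example: response generated by a known 3-tap filter plus noise.
    rng = np.random.default_rng(0)
    stim = rng.standard_normal(2000)
    true_w = np.array([0.0, 1.0, 0.5])          # weights at lags 0, 1, 2 samples
    resp = lagged_design(stim, [0, 1, 2]) @ true_w + 0.1 * rng.standard_normal(2000)
    print(fit_trf(stim, resp, lags=[0, 1, 2], lam=1.0))  # close to [0.0, 1.0, 0.5]
    ```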

  2. The Multivariate Temporal Response Function (mTRF) Toolbox: A MATLAB Toolbox for Relating Neural Signals to Continuous Stimuli

    PubMed Central

    Crosse, Michael J.; Di Liberto, Giovanni M.; Bednar, Adam; Lalor, Edmund C.

    2016-01-01

    Understanding how brains process sensory signals in natural environments is one of the key goals of twenty-first century neuroscience. While brain imaging and invasive electrophysiology will play key roles in this endeavor, there is also an important role to be played by noninvasive, macroscopic techniques with high temporal resolution such as electro- and magnetoencephalography. But challenges exist in determining how best to analyze such complex, time-varying neural responses to complex, time-varying and multivariate natural sensory stimuli. There has been a long history of applying system identification techniques to relate the firing activity of neurons to complex sensory stimuli and such techniques are now seeing increased application to EEG and MEG data. One particular example involves fitting a filter—often referred to as a temporal response function—that describes a mapping between some feature(s) of a sensory stimulus and the neural response. Here, we first briefly review the history of these system identification approaches and describe a specific technique for deriving temporal response functions known as regularized linear regression. We then introduce a new open-source toolbox for performing this analysis. We describe how it can be used to derive (multivariate) temporal response functions describing a mapping between stimulus and response in both directions. We also explain the importance of regularizing the analysis and how this regularization can be optimized for a particular dataset. We then outline specifically how the toolbox implements these analyses and provide several examples of the types of results that the toolbox can produce. Finally, we consider some of the limitations of the toolbox and opportunities for future development and application. PMID:27965557

  3. DICOM router: an open source toolbox for communication and correction of DICOM objects.

    PubMed

    Hackländer, Thomas; Kleber, Klaus; Martin, Jens; Mertens, Heinrich

    2005-03-01

    Today, the exchange of medical images and clinical information is well defined by the digital imaging and communications in medicine (DICOM) and Health Level Seven (ie, HL7) standards. The interoperability among information systems is specified by the integration profiles of IHE (Integrating the Healthcare Enterprise). However, older imaging modalities frequently do not correctly support these interfaces and integration profiles, and some use cases are not yet specified by IHE. Therefore, corrections of DICOM objects are necessary to establish conformity. The aim of this project was to develop a toolbox that can automatically perform these recurrent corrections of the DICOM objects. The toolbox is composed of three main components: 1) a receiver to receive DICOM objects, 2) a processing pipeline to correct each object, and 3) one or more senders to forward each corrected object to predefined addressees. The toolbox is implemented under Java as an open source project. The processing pipeline is realized by means of plug ins. One of the plug ins can be programmed by the user via an external eXtensible Stylesheet Language (ie, XSL) file. Using this plug in, DICOM objects can also be converted into eXtensible Markup Language (ie, XML) documents or other data formats. DICOM storage services, DICOM CD-ROMs, and the local file system are defined as input and output channel. The toolbox is used clinically for different application areas. These are the automatic correction of DICOM objects from non-IHE-conforming modalities, the import of DICOM CD-ROMs into the picture archiving and communication system and the pseudo naming of DICOM images. The toolbox has been accepted by users in a clinical setting. Because of the open programming interfaces, the functionality can easily be adapted to future applications.

  4. FracPaQ: A MATLAB™ toolbox for the quantification of fracture patterns

    NASA Astrophysics Data System (ADS)

    Healy, David; Rizzo, Roberto E.; Cornwell, David G.; Farrell, Natalie J. C.; Watkins, Hannah; Timms, Nick E.; Gomez-Rivas, Enrique; Smith, Michael

    2017-02-01

    The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, and spatial distributions often exhibit some kind of order. In detail, relationships may exist among the different fracture attributes, e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture attributes and patterns. This paper describes FracPaQ, a new open source, cross-platform toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs, or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on previously published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales, rock types and tectonic settings. The implemented methods are inherently scale-independent, and a key task, where applicable, is analysing and integrating quantitative fracture pattern data from micro- to macro-scales. The toolbox was developed in MATLAB™ and the source code is publicly available on GitHub™ and the Mathworks™ FileExchange. The code runs on any computer with MATLAB installed, including PCs with Microsoft Windows, Apple Macs with Mac OS X, and machines running different flavours of Linux. The application, source code and sample input files are available in open repositories in the hope that other developers and researchers will optimise and extend the functionality for the benefit of the wider community.
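
    A minimal sketch of the kind of attribute quantification described, deriving trace lengths, orientations and a simple intensity estimate from digitized 2-D segment endpoints (the sample-window area is a hypothetical value; this is not FracPaQ code):

    ```python
    import numpy as np

    def segment_attributes(segments):
        """Trace length and orientation (degrees, folded to 0-180) for 2-D
        fracture trace segments given as (x1, y1, x2, y2) tuples."""
        seg = np.asarray(segments, dtype=float)
        dx = seg[:, 2] - seg[:, 0]
        dy = seg[:, 3] - seg[:, 1]
        lengths = np.hypot(dx, dy)
        angles = np.degrees(np.arctan2(dy, dx)) % 180.0
        return lengths, angles

    segments = [(0, 0, 1, 1), (0, 0, 2, 0), (1, 0, 1, 3)]
    lengths, angles = segment_attributes(segments)
    # Fracture intensity (P21) = total trace length per unit sample area,
    # here assuming a hypothetical 10 m^2 sample window.
    p21 = lengths.sum() / 10.0
    print(lengths, angles, p21)
    ```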

  5. The ROC Toolbox: A toolbox for analyzing receiver-operating characteristics derived from confidence ratings.

    PubMed

    Koen, Joshua D; Barrett, Frederick S; Harlow, Iain M; Yonelinas, Andrew P

    2017-08-01

    Signal-detection theory, and the analysis of receiver-operating characteristics (ROCs), has played a critical role in the development of theories of episodic memory and perception. The purpose of the current paper is to present the ROC Toolbox. This toolbox is a set of functions written in the Matlab programming language that can be used to fit various common signal detection models to ROC data obtained from confidence rating experiments. The goals for developing the ROC Toolbox were to create a tool (1) that is easy to use and easy for researchers to implement with their own data, (2) that can flexibly define models based on varying study parameters, such as the number of response options (e.g., confidence ratings) and experimental conditions, and (3) that provides optimal routines (e.g., Maximum Likelihood estimation) to obtain parameter estimates and numerous goodness-of-fit measures. The ROC Toolbox allows for various confidence scales and currently includes the models commonly used in recognition memory and perception: (1) the unequal variance signal detection (UVSD) model, (2) the dual process signal detection (DPSD) model, and (3) the mixture signal detection (MSD) model. For each model fit to a given data set the ROC Toolbox plots summary information about the best fitting model parameters and various goodness-of-fit measures. Here, we present an overview of the ROC Toolbox, illustrate how it can be used to input and analyse real data, and finish with a brief discussion on features that can be added to the toolbox.
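
    For orientation, the simplest relative of these models, the equal-variance signal detection model, can be estimated directly from hit and false-alarm counts; the sketch below is a generic illustration, not the ROC Toolbox's maximum-likelihood routines:

    ```python
    from scipy.stats import norm

    def equal_variance_sdt(hits, misses, false_alarms, correct_rejections):
        """Equal-variance signal-detection estimates from a 2x2 count table:
        d' = z(hit rate) - z(false-alarm rate), criterion c = -(zH + zF)/2.
        A simpler relative of the UVSD/DPSD/MSD models the toolbox fits."""
        # Log-linear correction avoids infinite z-scores for rates of 0 or 1.
        hr = (hits + 0.5) / (hits + misses + 1.0)
        far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        z_h, z_f = norm.ppf(hr), norm.ppf(far)
        return z_h - z_f, -(z_h + z_f) / 2.0

    d_prime, criterion = equal_variance_sdt(hits=75, misses=25,
                                            false_alarms=30, correct_rejections=70)
    print(d_prime, criterion)
    ```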

  6. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO CONTAMINATION THREATS TO DRINKING WATER SYSTEMS

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  7. Broadview Radar Altimetry Toolbox

    NASA Astrophysics Data System (ADS)

    Escolà, Roger; Garcia-Mondejar, Albert; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Naeije, Marc; Ambrozio, Americo; Restano, Marco; Benveniste, Jérôme

    2016-04-01

    The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox), which can read all previous and current altimetry missions' data, now incorporates the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the front-end for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data, bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT involving combinations of data fields that the user can save for posterior reuse or using the already embedded formulas that include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a thorough introduction to altimetry, shows its applications in different fields such as Oceanography, Cryosphere, Geodesy, Hydrology among others. Also included are "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version. While developing the toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use-cases" for SAR altimetry in order to train the users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open source framework, contributions from users having developed their own functions are welcome. The Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. While developing the new toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use-cases" for SAR altimetry in order to train the users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open source framework, contributions from users having developed their own functions are welcome. The first release of the new Radar Altimetry Toolbox was published in September 2015. It incorporates the capability to read S3 products as well as the new CryoSat-2 Baseline C. The second release of the Toolbox, planned for March 2016, will have a new graphical user interface and some visualisation improvements. The third release, planned for September 2016, will incorporate new datasets such as the lakes and rivers dataset or the Envisat reprocessed data, new features regarding data interpolation, and formula updates.

  8. Broadview Radar Altimetry Toolbox

    NASA Astrophysics Data System (ADS)

    Mondéjar, Albert; Benveniste, Jérôme; Naeije, Marc; Escolà, Roger; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Ambrózio, Américo; Restano, Marco

    2016-07-01

    The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox), which can read all previous and current altimetry missions' data, now incorporates the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Études Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the front-end for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data, bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT involving combinations of data fields that the user can save for posterior reuse or using the already embedded formulas that include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a thorough introduction to altimetry, shows its applications in different fields such as Oceanography, Cryosphere, Geodesy, Hydrology among others. Also included are "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version. While developing the toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use-cases" for SAR altimetry in order to train the users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open source framework, contributions from users having developed their own functions are welcome. The Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. While developing the new toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use-cases" for SAR altimetry in order to train the users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open source framework, contributions from users having developed their own functions are welcome. The first release of the new Radar Altimetry Toolbox was published in September 2015. It incorporates the capability to read S3 products as well as the new CryoSat-2 Baseline C. The second release of the Toolbox, planned for March 2016, will have a new graphical user interface and some visualisation improvements. The third release, planned for September 2016, will incorporate new datasets such as the lakes and rivers dataset or the Envisat reprocessed data, new features regarding data interpolation, and formula updates.

  9. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Knudsen, P.; Benveniste, J.

    2011-07-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Macs. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models are made available as well. GUT has been developed in a collaboration within the GUT Core Group: S. Dinardo, D. Serpe, B.M. Lucas, R. Floberghagen, A. Horvath (ESA), O. Andersen, M. Herceg (DTU), M.-H. Rio, S. Mulet, G. Larnicol (CLS), J. Johannessen, L. Bertino (NERSC), H. Snaith, P. Challenor (NOC), K. Haines, D. Bretherton (NCEO), C. Hughes (POL), R.J. Bingham (NU), G. Balmino, S. Niemeijer, I. Price, L. Cornejo (S&T), M. Diament, I. Panet (IPGP), C.C. Tscherning (KU), D. Stammer, F. Siegismund (UH), and T. Gruber (TUM).

  10. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    PubMed

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique to understand the dynamics of complex biochemical systems. To promote such modeling, we previously developed the CADLIVE dynamic simulator, which automatically converted a biochemical map into its associated mathematical model, simulated its dynamic behaviors and analyzed its robustness. To enhance the feasibility of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.
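
    The kind of dynamic model such a converter produces is, at its core, a system of ordinary differential equations; a minimal sketch using a hypothetical two-reaction mass-action network (not the CADLIVE model format) is:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical network: S -> P at rate k1*S, P degraded at rate k2*P.
    def rhs(t, y, k1, k2):
        s, p = y
        return [-k1 * s, k1 * s - k2 * p]

    sol = solve_ivp(rhs, t_span=(0.0, 50.0), y0=[10.0, 0.0],
                    args=(0.2, 0.05), dense_output=True)
    t = np.linspace(0, 50, 200)
    s, p = sol.sol(t)
    print(p.max())   # peak product concentration in the simulated trajectory
    ```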

  11. Emotion assessment using the NIH Toolbox

    PubMed Central

    Butt, Zeeshan; Pilkonis, Paul A.; Cyranowski, Jill M.; Zill, Nicholas; Hendrie, Hugh C.; Kupst, Mary Jo; Kelly, Morgen A. R.; Bode, Rita K.; Choi, Seung W.; Lai, Jin-Shei; Griffith, James W.; Stoney, Catherine M.; Brouwers, Pim; Knox, Sarah S.; Cella, David

    2013-01-01

    One of the goals of the NIH Toolbox for Assessment of Neurological and Behavioral Function was to identify or develop brief measures of emotion for use in prospective epidemiologic and clinical research. Emotional health has significant links to physical health and exerts a powerful effect on perceptions of life quality. Based on an extensive literature review and expert input, the Emotion team identified 4 central subdomains: Negative Affect, Psychological Well-Being, Stress and Self-Efficacy, and Social Relationships. A subsequent psychometric review identified several existing self-report and proxy measures of these subdomains with measurement characteristics that met the NIH Toolbox criteria. In cases where adequate measures did not exist, robust item banks were developed to assess concepts of interest. A population-weighted sample was recruited by an online survey panel to provide initial item calibration and measure validation data. Participants aged 8 to 85 years completed self-report measures whereas parents/guardians responded for children aged 3 to 12 years. Data were analyzed using a combination of classic test theory and item response theory methods, yielding efficient measures of emotional health concepts. An overview of the development of the NIH Toolbox Emotion battery is presented along with preliminary results. Norming activities led to further refinement of the battery, thus enhancing the robustness of emotional health measurement for researchers using the NIH Toolbox. PMID:23479549

  12. The triticeae toolbox: combining phenotype and genotype data to advance small-grains breeding

    USDA-ARS?s Scientific Manuscript database

    The Triticeae Toolbox (http://triticeaetoolbox.org; T3) is the database schema enabling plant breeders and researchers to combine, visualize, and interrogate the wealth of phenotype and genotype data generated by the Triticeae Coordinated Agricultural Project (TCAP). T3 enables users to define speci...

  13. LAB ANALYSIS OF EMERGENCY WATER SAMPLES CONTAINING UNKNOWN CONTAMINANTS: CONSIDERATIONS FROM THE USEPA RESPONSE PROTOCOL TOOLBOX

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  14. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  15. CEMENTITIOUS BARRIERS PARTNERSHIP FY13 MID-YEAR REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burns, H.; Flach, G.; Langton, C.

    2013-05-01

    In FY2013, the Cementitious Barriers Partnership (CBP) is continuing its effort to develop and enhance software tools, demonstrating tangible progress toward fulfilling the objective of developing a set of tools to improve understanding and prediction of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. In FY2012, the CBP released the initial in-house "Beta version" of the CBP Software Toolbox, a suite of software for simulating reactive transport in cementitious materials and important degradation phenomena. The current primary software components are LeachXS/ORCHESTRA, STADIUM, and a GoldSim interface for probabilistic analysis of selected degradation scenarios. THAMES is a planned future CBP Toolbox component (FY13/14) focused on simulation of the microstructure of cementitious materials and calculation of resultant hydraulic and constituent mass transfer parameters needed in modeling. This past November, the CBP Software Toolbox Version 1.0 was released; it supports analysis of external sulfate attack (including damage mechanics), carbonation, and primary constituent leaching. The LeachXS component embodies an extensive material property measurements database along with chemical speciation and reactive mass transport simulation cases, with emphasis on leaching of major, trace and radionuclide constituents from cementitious materials used in DOE facilities, such as Saltstone (Savannah River) and Cast Stone (Hanford), tank closure grouts, and barrier concretes. STADIUM focuses on the physical and structural service life of materials and components based on chemical speciation and reactive mass transport of major cement constituents and aggressive species (e.g., chloride, sulfate, etc.). The CBP issued numerous reports and other documentation that accompanied the "Version 1.0" release, including a CBP Software Toolbox User Guide and Installation Guide. These documents, as well as the presentations from the CBP Software Toolbox Demonstration and User Workshop, which are briefly described below, can be accessed from the CBP webpage at http://cementbarriers.org/. The website was recently modified to describe the CBP Software Toolbox and includes an interest form for application to use the software. The CBP FY13 program is continuing research to improve and enhance the simulation tools as well as develop new tools that model other key degradation phenomena not addressed in Version 1.0. Efforts are also ongoing to verify the various simulation tools through laboratory experiments and analysis of field specimens, in order to quantify and reduce the uncertainty associated with performance assessments. This mid-year report includes a summary of the FY13 software accomplishments, in addition to the release of Version 1.0 of the CBP Software Toolbox, and of the various experimental programs that are providing data for calibration and validation of the CBP-developed software. The focus this year for experimental studies was to measure transport in cementitious material by utilization of a leaching method and the reduction capacity of saltstone field samples. Results are being used to calibrate and validate the updated carbonation model.

  16. The Benefits of Comparing Grapefruits and Tangerines: A Toolbox for European Cross-Cultural Comparisons in Engineering Education--Using This Toolbox to Study Gendered Images of Engineering among Students

    ERIC Educational Resources Information Center

    Godfroy-Genin, Anne-Sophie; Pinault, Cloe

    2006-01-01

    The main objective of the WomEng European research project was to assess when, how and why women decide to or not to study engineering. This question was addressed through an international cross-comparison by an interdisciplinary research team in seven European countries. This article presents, in the first part, the methodological toolbox…

  17. The Decoding Toolbox (TDT): a versatile software package for multivariate analyses of functional imaging data

    PubMed Central

    Hebart, Martin N.; Görgen, Kai; Haynes, John-Dylan

    2015-01-01

    The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT) which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns. PMID:25610393
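
    As a generic sketch of the cross-validated decoding analysis such a toolbox implements, here on simulated region-of-interest patterns with scikit-learn rather than TDT's Matlab/SPM interface:

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    rng = np.random.default_rng(1)
    n_runs, trials_per_run, n_voxels = 6, 20, 100

    # Simulated ROI patterns: two conditions with a small mean difference.
    labels = np.tile(np.repeat([0, 1], trials_per_run // 2), n_runs)
    runs = np.repeat(np.arange(n_runs), trials_per_run)
    X = rng.standard_normal((n_runs * trials_per_run, n_voxels))
    X[labels == 1, :10] += 0.5          # condition effect in 10 "voxels"

    # Leave-one-run-out cross-validation, as commonly used for fMRI decoding.
    scores = cross_val_score(LinearSVC(max_iter=10000), X, labels,
                             groups=runs, cv=LeaveOneGroupOut())
    print(scores.mean())                 # decoding accuracy, chance = 0.5
    ```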

  18. A comprehensive validation toolbox for regional ocean models - Outline, implementation and application to the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Jandt, Simon; Laagemaa, Priidik; Janssen, Frank

    2014-05-01

    The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, starting from early steps in model development and ending at the quality control of model-based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged, it is often not among the most popular tasks in ocean modelling. In order to ease the validation work, a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time-series at stations, vertical profiles, surface fields or along-track satellite data, with one single program call. The validation toolbox, implemented in MATLAB, features all parts of the validation process, ranging from read-in procedures of datasets to the graphical and numerical output of statistical metrics of the comparison. The basic idea is to have only one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. read-in procedures, forms a module in which all available functions of this particular part are collected. The interface between the functions, the module and the validation schedule is highly standardized. Functions of a module are set up for certain validation tasks; new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned for each validation task in user-specific settings, which are externally stored in so-called namelists and gather all information of the used datasets as well as paths and metadata. In the framework of the MyOcean-2 project the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre. In this way, the performance of any new product version is compared with that of the previous version. Although the toolbox has so far been tested mainly for the Baltic Sea, it can easily be adapted to different datasets and parameters, regardless of the geographic region. In this presentation the usability of the toolbox is demonstrated along with several results of the validation process.
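
    The statistical metrics at the heart of such a validation, bias, RMSE and correlation between collocated model and observation values, can be sketched generically as follows (not the MyOcean toolbox code):

    ```python
    import numpy as np

    def validation_metrics(model, obs):
        """Basic skill metrics for collocated model/observation pairs:
        bias, RMSE, and Pearson correlation (NaN-aware)."""
        model, obs = np.asarray(model, float), np.asarray(obs, float)
        ok = ~np.isnan(model) & ~np.isnan(obs)
        m, o = model[ok], obs[ok]
        bias = np.mean(m - o)
        rmse = np.sqrt(np.mean((m - o) ** 2))
        corr = np.corrcoef(m, o)[0, 1]
        return {"bias": bias, "rmse": rmse, "corr": corr, "n": int(ok.sum())}

    # Example: modelled vs observed sea-surface temperature at one station.
    obs = [7.1, 7.3, np.nan, 8.0, 8.4]
    mod = [7.0, 7.5, 7.9, 8.3, 8.2]
    print(validation_metrics(mod, obs))
    ```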

  19. ESA Atmospheric Toolbox

    NASA Astrophysics Data System (ADS)

    Niemeijer, Sander

    2017-04-01

    The ESA Atmospheric Toolbox (BEAT) is one of the ESA Sentinel Toolboxes. It consists of a set of software components to read, analyze, and visualize a wide range of atmospheric data products. In addition to the upcoming Sentinel-5P mission it supports a wide range of other atmospheric data products, including those of previous ESA missions, ESA Third Party missions, Copernicus Atmosphere Monitoring Service (CAMS), ground based data, etc. The toolbox consists of three main components that are called CODA, HARP and VISAN. CODA provides interfaces for direct reading of data from earth observation data files. These interfaces consist of command line applications, libraries, direct interfaces to scientific applications (IDL and MATLAB), and direct interfaces to programming languages (C, Fortran, Python, and Java). CODA provides a single interface to access data in a wide variety of data formats, including ASCII, binary, XML, netCDF, HDF4, HDF5, CDF, GRIB, RINEX, and SP3. HARP is a toolkit for reading, processing and inter-comparing satellite remote sensing data, model data, in-situ data, and ground based remote sensing data. The main goal of HARP is to assist in the inter-comparison of datasets. By appropriately chaining calls to HARP command line tools one can pre-process datasets such that two datasets that need to be compared end up having the same temporal/spatial grid, same data format/structure, and same physical unit. The toolkit comes with its own data format conventions, the HARP format, which is based on netcdf/HDF. Ingestion routines (based on CODA) allow conversion from a wide variety of atmospheric data products to this common format. In addition, the toolbox provides a wide range of operations to perform conversions on the data such as unit conversions, quantity conversions (e.g. number density to volume mixing ratios), regridding, vertical smoothing using averaging kernels, collocation of two datasets, etc. VISAN is a cross-platform visualization and analysis application for atmospheric data and can be used to visualize and analyze the data that you retrieve using the CODA and HARP interfaces. The application uses the Python language as the means through which you provide commands to the application. The Python interfaces for CODA and HARP are included so you can directly ingest product data from within VISAN. Powerful visualization functionality for 2D plots and geographical plots in VISAN will allow you to directly visualize the ingested data. All components from the ESA Atmospheric Toolbox are Open Source and freely available. Software packages can be downloaded from the BEAT website: http://stcorp.nl/beat/

  20. EPA RESPONSE PROTOCOL TOOLBOX TO HELP EVALUATION OF CONTAMINATION THREATS & RESPONDING TO THREATS: MODULE 1-WATER UTILITY PLANNING GUIDE

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  1. NIA outreach to minority and health disparity populations: can a toolbox for recruitment and retention be far behind?

    PubMed

    Harden, J Taylor; Silverberg, Nina

    2010-01-01

    The ability to locate the right research tool at the right time for recruitment and retention of minority and health disparity populations is a challenge. This article provides an introduction to a number of recruitment and retention tools in a National Institute on Aging Health Disparities Toolbox and to this special edition on challenges and opportunities in recruitment and retention of minority populations in Alzheimer disease and dementia research. The Health Disparities Toolbox and Health Disparities Resource Persons Network are described along with other more established resource tools including the Alzheimer Disease Center Education Cores, Alzheimer Disease Education and Referral Center, and Resource Centers for Minority Aging Research. Nine featured articles are introduced. The articles address a range of concerns including what we know and do not know, conceptual and theoretical perspectives framing issues of diversity and inclusion, success as a result of sustained investment of time and community partnerships, the significant issue of mistrust, willingness to participate in research as a dynamic personal attribute, Helpline Service and the amount of resources required for success, assistance in working with Limited English Proficiency elders, and sage advice from social marketing and investigations of health literacy as a barrier to recruitment and retention. Finally, an appeal is made for scientists to share tools for the National Institute on Aging Health Disparity Toolbox and to join the Health Disparities Resource Persons Network.

  2. GISMO: A MATLAB toolbox for seismic research, monitoring, & education

    NASA Astrophysics Data System (ADS)

    Thompson, G.; Reyes, C. G.; Kempler, L. A.

    2017-12-01

    GISMO is an open-source MATLAB toolbox which provides an object-oriented framework to build workflows and applications that read, process, visualize and write seismic waveform, catalog and instrument response data. GISMO can retrieve data from a variety of sources (e.g. FDSN web services, Earthworm/Winston servers) and data formats (SAC, Seisan, etc.). It can handle waveform data that crosses file boundaries. All this alleviates one of the most time-consuming tasks for scientists developing their own codes. GISMO simplifies seismic data analysis by providing a common interface for your data, regardless of its source. Several common plots are built into GISMO, such as record section plots, spectrograms, depth-time sections, event count per unit time, energy release per unit time, etc. Other visualizations include map views and cross-sections of hypocentral data. Several common processing methods are also included, such as an extensive set of tools for correlation analysis. Support is being added to interface GISMO with ObsPy. GISMO encourages community development of an integrated set of codes and accompanying documentation, eliminating the need for seismologists to "reinvent the wheel". By sharing code, the consistency and repeatability of results can be enhanced. GISMO is hosted on GitHub with documentation both within the source code and in the project wiki. GISMO has been used at the University of South Florida and the University of Alaska Fairbanks in graduate-level courses including Seismic Data Analysis, Time Series Analysis and Computational Seismology. GISMO has also been tailored to interface with the common seismic monitoring software and data formats used by volcano observatories in the US and elsewhere. As an example, toolbox training was delivered to researchers at INETER (Nicaragua). Applications built on GISMO include IceWeb (e.g. web-based spectrograms), which has been used by the Alaska Volcano Observatory since 1998 and became the prototype for the USGS Pensive system.
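
    Since support for interfacing GISMO with ObsPy is mentioned, a rough Python/ObsPy analogue of the basic workflow, fetching a waveform from FDSN web services and plotting a spectrogram, might look like the following; the network, station and time window are arbitrary illustrative choices:

    ```python
    # Fetch an hour of broadband data from an FDSN web service, demean it,
    # and plot a spectrogram (ObsPy analogue of a GISMO-style workflow).
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("IRIS")
    t0 = UTCDateTime("2017-01-01T00:00:00")
    st = client.get_waveforms(network="IU", station="ANMO", location="00",
                              channel="BHZ", starttime=t0, endtime=t0 + 3600)
    st.detrend("demean")
    st[0].spectrogram(log=True, title="IU.ANMO.00.BHZ")
    ```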

  3. A Toolbox of Metrology-Based Techniques for Optical System Alignment

    NASA Technical Reports Server (NTRS)

    Coulter, Phillip; Ohl, Raymond G.; Blake, Peter N.; Bos, Brent J.; Casto, Gordon V.; Eichhorn, William L.; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hagopian, John G.; Hayden, Joseph E.; hide

    2016-01-01

    The NASA Goddard Space Flight Center (GSFC) and its partners have broad experience in the alignment of flight optical instruments and spacecraft structures. Over decades, GSFC developed alignment capabilities and techniques for a variety of optical and aerospace applications. In this paper, we provide an overview of a subset of the capabilities and techniques used on several recent projects in a toolbox format. We discuss a range of applications, from small-scale optical alignment of sensors to mirror and bench examples that make use of various large-volume metrology techniques. We also discuss instruments and analytical tools.

  4. A Toolbox of Metrology-Based Techniques for Optical System Alignment

    NASA Technical Reports Server (NTRS)

    Coulter, Phillip; Ohl, Raymond G.; Blake, Peter N.; Bos, Brent J.; Eichhorn, William L.; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hagopian, John G.; Hayden, Joseph E.; Hetherington, Samuel E.; hide

    2016-01-01

    The NASA Goddard Space Flight Center (GSFC) and its partners have broad experience in the alignment of flight optical instruments and spacecraft structures. Over decades, GSFC developed alignment capabilities and techniques for a variety of optical and aerospace applications. In this paper, we provide an overview of a subset of the capabilities and techniques used on several recent projects in a "toolbox" format. We discuss a range of applications, from small-scale optical alignment of sensors to mirror and bench examples that make use of various large-volume metrology techniques. We also discuss instruments and analytical tools.

  5. Citizen Science Air Monitoring in the Ironbound Community ...

    EPA Pesticide Factsheets

    The Environmental Protection Agency’s (EPA) mission is to protect human health and the environment. To move toward achieving this goal, EPA is facilitating identification of potential environmental concerns, particularly in vulnerable communities. This includes actively supporting citizen science projects and providing communities with the information and assistance they need to conduct their own air pollution monitoring efforts. The Air Sensor Toolbox for Citizen Scientists was developed as a resource to meet stakeholder needs. Examples of materials developed for the Toolbox and ultimately pilot tested in the Ironbound Community in Newark, New Jersey are reported here. The Air Sensor Toolbox for Citizen Scientists is designed as an online resource that provides information and guidance on new, low-cost compact technologies used for measuring air quality. The Toolbox features resources developed by EPA researchers that can be used by citizens to effectively collect, analyze, interpret, and communicate air quality data. The resources include information about sampling methods, how to calibrate and validate monitors, options for measuring air quality, data interpretation guidelines, and low-cost sensor performance information. This Regional Applied Research Effort (RARE) project provided an opportunity for the Office of Research and Development (ORD) to work collaboratively with EPA Region 2 to provide the Ironbound Community with a “Toolbox” specific for c

  6. Streptomyces spp. in the biocatalysis toolbox.

    PubMed

    Spasic, Jelena; Mandic, Mina; Djokic, Lidija; Nikodinovic-Runic, Jasmina

    2018-04-01

    About 20,100 research publications dated 2000-2017 were recovered searching the PubMed and Web of Science databases for Streptomyces, which are the richest known source of bioactive molecules. However, these bacteria with versatile metabolism are powerful suppliers of biocatalytic tools (enzymes) for advanced biotechnological applications such as green chemical transformations and biopharmaceutical and biofuel production. The recent technological advances, especially in DNA sequencing coupled with computational tools for protein functional and structural prediction, and the improved access to microbial diversity have enabled easier access to enzymes and the ability to engineer them to suit a wider range of biotechnological processes. The major driver behind a dramatic increase in the utilization of biocatalysis is sustainable development and the shift toward bioeconomy that will, in accordance with the UN policy agenda "Bioeconomy to 2030," become a global effort in the near future. Streptomyces spp. already play a significant role among industrial microorganisms. The intention of this minireview is to highlight the presence of Streptomyces in the toolbox of biocatalysis and to give an overview of the most important advances in novel biocatalyst discovery and applications. Judging by the steady increase in the number of recent references (228 for the 2000-2017 period), it is clear that biocatalysts from Streptomyces spp. hold promise in terms of valuable properties and applicative industrial potential.

  7. A Molecular Toolbox for Rapid Generation of Viral Vectors to Up- or Down-Regulate Neuronal Gene Expression in vivo

    PubMed Central

    White, Melanie D.; Milne, Ruth V. J.; Nolan, Matthew F.

    2011-01-01

    We introduce a molecular toolbox for manipulation of neuronal gene expression in vivo. The toolbox includes promoters, ion channels, optogenetic tools, fluorescent proteins, and intronic artificial microRNAs. The components are easily assembled into adeno-associated virus (AAV) or lentivirus vectors using recombination cloning. We demonstrate assembly of toolbox components into lentivirus and AAV vectors and use these vectors for in vivo expression of inwardly rectifying potassium channels (Kir2.1, Kir3.1, and Kir3.2) and an artificial microRNA targeted against the ion channel HCN1 (HCN1 miRNA). We show that AAV assembled to express HCN1 miRNA produces efficacious and specific in vivo knockdown of HCN1 channels. Comparison of in vivo viral transduction using HCN1 miRNA with mice containing a germ line deletion of HCN1 reveals similar physiological phenotypes in cerebellar Purkinje cells. The easy assembly and re-usability of the toolbox components, together with the ability to up- or down-regulate neuronal gene expression in vivo, may be useful for applications in many areas of neuroscience. PMID:21772812

  8. Aerospace Toolbox---a flight vehicle design, analysis, simulation ,and software development environment: I. An introduction and tutorial

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.; Wells, Randy

    2001-09-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in this part include flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models by using the Mathworks Real Time Workshop and optimization tools. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  9. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling

    PubMed Central

    Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078
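
    The closed-loop idea, monitoring a binarized event sequence and delivering a stimulus whenever a predefined code occurs, can be sketched offline as follows (a simplified illustration, not the real-time toolbox):

    ```python
    import numpy as np

    def detect_code(events, code):
        """Return indices at which the binary event sequence completes a match
        of `code`; both arguments are sequences of 0/1 values per time bin."""
        events, code = np.asarray(events), np.asarray(code)
        hits = []
        for i in range(len(code) - 1, len(events)):
            if np.array_equal(events[i - len(code) + 1:i + 1], code):
                hits.append(i)
        return hits

    def stimulate(t):
        # Placeholder for delivering the predefined stimulus in a closed loop.
        print(f"stimulus delivered at bin {t}")

    # Offline simulation: a binarized event sequence from a recording.
    events = [0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1]
    code = [1, 1, 0, 1]          # the code chosen to trigger stimulation
    for t in detect_code(events, code):
        stimulate(t)
    ```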

  10. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling.

    PubMed

    Lareo, Angel; Forlim, Caroline G; Pinto, Reynaldo D; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox.

  11. A versatile software package for inter-subject correlation based analyses of fMRI.

    PubMed

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects at corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting the widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in MATLAB for computing various ISC-based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC-based analyses are data- and computation-intensive, and the ISC Toolbox is equipped with mechanisms to execute the computations in parallel in a cluster environment, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation-time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/

  12. A versatile software package for inter-subject correlation based analyses of fMRI

    PubMed Central

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects at corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting the widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in MATLAB for computing various ISC-based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. The ISC-based analyses are data- and computation-intensive, and the ISC Toolbox is equipped with mechanisms to execute the computations in parallel in a cluster environment, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation-time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/ PMID:24550818
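
    The core ISC computation is a voxel-wise correlation of time series across subjects. The following Python sketch shows only the basic quantity (mean pairwise Pearson correlation per voxel) on synthetic data; it is a minimal illustration of the principle, not the ISC Toolbox's MATLAB implementation, and the array shapes and names are assumptions.

    import numpy as np

    def mean_pairwise_isc(data):
        """Mean pairwise inter-subject correlation per voxel.
        data: array of shape (n_subjects, n_timepoints, n_voxels)."""
        n_subj, n_time, _ = data.shape
        # z-score each subject's time series so Pearson correlation reduces to a dot product
        z = (data - data.mean(axis=1, keepdims=True)) / data.std(axis=1, keepdims=True)
        isc_sum = np.zeros(data.shape[2])
        n_pairs = 0
        for i in range(n_subj):
            for j in range(i + 1, n_subj):
                isc_sum += (z[i] * z[j]).sum(axis=0) / n_time
                n_pairs += 1
        return isc_sum / n_pairs

    # Toy usage: 5 subjects, 200 time points, 1000 voxels of random data.
    data = np.random.default_rng(0).normal(size=(5, 200, 1000))
    print(mean_pairwise_isc(data)[:5])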

  13. Acrylamide mitigation strategies: critical appraisal of the FoodDrinkEurope toolbox.

    PubMed

    Palermo, M; Gökmen, V; De Meulenaer, B; Ciesarová, Z; Zhang, Y; Pedreschi, F; Fogliano, V

    2016-06-15

    The FoodDrinkEurope Federation recently released the latest version of the Acrylamide Toolbox to support manufacturers in acrylamide reduction activities, giving indications about possible mitigation strategies. The Toolbox is intended for small and medium-sized enterprises with limited R&D resources; however, no comments about the pros and cons of the different measures were provided to advise potential users. Experts in the field are aware that not all the strategies proposed have equal value in terms of efficacy and cost/benefit ratio. This consideration prompted us to provide a qualitative, science-based ranking of the mitigation strategies proposed in the Acrylamide Toolbox, focusing on bakery and fried potato products. Five authors from different geographical areas, each with a publication record on acrylamide mitigation strategies, worked independently to rank the efficacy of the mitigation strategies, taking into account three key parameters: (i) reduction rate; (ii) side effects; and (iii) applicability and economic impact. On the basis of their own experience and selected literature from the last ten years, the authors scored the acrylamide mitigation strategies proposed in the Toolbox for each key parameter. As expected, all strategies selected in the Toolbox turned out to be useful, although not to the same degree. The use of the enzyme asparaginase and the selection of low-sugar varieties were considered the best mitigation strategies in bakery and in potato products, respectively. In the authors' opinion, most of the other mitigation strategies, although effective, either have relevant side effects on the sensory profile of the products or are not easy to implement in industrial production. The final outcome is a science-based, commented ranking that can enrich the Acrylamide Toolbox, supporting individual manufacturers in taking the best actions to reduce the acrylamide content in their specific production context.

  14. Rural ITS toolbox

    DOT National Transportation Integrated Search

    2001-11-01

    This document identifies successful rural Intelligent Transportation Systems (ITS) projects and statewide applications. These applications are referred to as "tools" and include those in the process of being tested prior to full deployment. The tools...

  15. Recent advances of molecular toolbox construction expand Pichia pastoris in synthetic biology applications.

    PubMed

    Kang, Zhen; Huang, Hao; Zhang, Yunfeng; Du, Guocheng; Chen, Jian

    2017-01-01

    Pichia pastoris (reclassified as Komagataella phaffii), a methylotrophic yeast, has been widely used for heterologous protein production because of its unique advantages, such as readily achievable high-density fermentation, tractable genetic modifications and typical eukaryotic post-translational modifications. More recently, P. pastoris has also gained much attention as a metabolic pathway engineering platform. In this mini-review, we address recent advances in the molecular toolboxes, including synthetic promoters, signal peptides, and genome engineering tools, that have been established for P. pastoris. Furthermore, the applications of P. pastoris in synthetic biology are also discussed, with prospects outlined especially in the context of genome-scale metabolic pathway analysis.

  16. An overview of recent advances in system identification

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan

    1994-01-01

    This paper presents an overview of recent advances in system identification for modal testing and control of large flexible structures. Several techniques are discussed, including Observer/Kalman Filter Identification, Observer/Controller Identification, and State-Space System Identification in the Frequency Domain. The System/Observer/Controller Toolbox developed at NASA Langley Research Center is used to show the application of these techniques to real aerospace structures such as the Hubble Space Telescope and the active flexible aircraft wing.
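
    A common building block of these identification methods is a least-squares fit of the system's Markov parameters (impulse-response samples) from measured input/output data. The Python sketch below illustrates only that generic step on synthetic single-input/single-output data; it is not the System/Observer/Controller Toolbox, and the data and parameter counts are assumptions.

    import numpy as np

    def estimate_markov_parameters(u, y, n_params):
        """Least-squares estimate of the first n_params Markov parameters
        (impulse-response samples) of a SISO system from input u and output y."""
        n = len(y)
        # Regressor of past inputs: y[k] is modeled as sum_i h[i] * u[k - i]
        U = np.zeros((n, n_params))
        for i in range(n_params):
            U[i:, i] = u[:n - i]
        h, *_ = np.linalg.lstsq(U, y, rcond=None)
        return h

    # Toy usage: recover a known short impulse response from noisy data.
    rng = np.random.default_rng(0)
    u = rng.normal(size=500)
    true_h = np.array([0.0, 1.0, 0.5, 0.25, 0.125])
    y = np.convolve(u, true_h)[:500] + 0.01 * rng.normal(size=500)
    print(estimate_markov_parameters(u, y, 5).round(3))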

  17. Visualizing flow fields using acoustic Doppler current profilers and the Velocity Mapping Toolbox

    USGS Publications Warehouse

    Jackson, P. Ryan

    2013-01-01

    The purpose of this fact sheet is to provide examples of how the U.S. Geological Survey is using acoustic Doppler current profilers for much more than routine discharge measurements. These instruments are capable of mapping complex three-dimensional flow fields within rivers, lakes, and estuaries. Using the Velocity Mapping Toolbox to process the ADCP data allows detailed visualization of the data, providing valuable information for a range of studies and applications.

  18. An Educational Model for Hands-On Hydrology Education

    NASA Astrophysics Data System (ADS)

    AghaKouchak, A.; Nakhjiri, N.; Habib, E. H.

    2014-12-01

    This presentation provides an overview of a hands-on modeling tool developed for students in civil engineering and earth science disciplines to help them learn the fundamentals of hydrologic processes, model calibration, sensitivity analysis, and uncertainty assessment, and to practice conceptual thinking in solving engineering problems. The toolbox includes two simplified hydrologic models, namely HBV-EDU and HBV-Ensemble, designed as a complement to theoretical hydrology lectures. The models provide an interdisciplinary, application-oriented learning environment that introduces the hydrologic phenomena through the use of a simplified conceptual hydrologic model. The toolbox can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching more advanced topics including uncertainty analysis and ensemble simulation. Both models have been administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of hydrology.
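
    For readers unfamiliar with conceptual hydrologic models, the Python sketch below shows a deliberately minimal single-bucket rainfall-runoff model of the same general flavor (storage, evapotranspiration, saturation excess, linear-reservoir drainage). It is a teaching-style illustration under assumed parameter values, not the HBV-EDU or HBV-Ensemble code.

    import numpy as np

    def bucket_model(precip, pet, capacity=150.0, k=0.05):
        """Minimal conceptual rainfall-runoff model: a single soil-moisture bucket.
        precip, pet: daily precipitation and potential evapotranspiration (mm).
        Returns simulated daily runoff (mm)."""
        storage = capacity / 2.0
        runoff = np.zeros(len(precip))
        for t in range(len(precip)):
            storage += precip[t]
            et = min(pet[t], storage)                 # actual ET limited by available water
            storage -= et
            overflow = max(storage - capacity, 0.0)   # saturation-excess runoff
            storage -= overflow
            baseflow = k * storage                    # linear-reservoir drainage
            storage -= baseflow
            runoff[t] = overflow + baseflow
        return runoff

    # Toy usage with synthetic forcing.
    rng = np.random.default_rng(0)
    p = rng.gamma(0.5, 8.0, 365)    # mm/day
    pet = np.full(365, 3.0)         # mm/day
    print(bucket_model(p, pet)[:10].round(2))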

  19. Adding tools to the open source toolbox: The Internet

    NASA Technical Reports Server (NTRS)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  20. CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, III, F. G.

    2016-07-29

    One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop a GoldSim-based interface to the STADIUM® code developed by SIMCO Technologies, Inc. and to LeachXS/ORCHESTRA developed by the Energy Research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM® code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM® code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results, and the code developers have provided validation test results as part of their code QA documentation; and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. Therefore, it is concluded that the codes can be used to support performance assessment. This conclusion takes into account the QA documentation produced for the partner codes and for the CBP Toolbox.

  1. Atlas - a data warehouse for integrative bioinformatics.

    PubMed

    Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire M S; Ling, John; Ouellette, B F Francis

    2005-02-21

    We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: http://bioinformatics.ubc.ca/atlas/

  2. Atlas – a data warehouse for integrative bioinformatics

    PubMed Central

    Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire MS; Ling, John; Ouellette, BF Francis

    2005-01-01

    Background We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. Description The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. Conclusion The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: PMID:15723693

  3. PredPsych: A toolbox for predictive machine learning-based approach in experimental psychology research.

    PubMed

    Koul, Atesh; Becchio, Cristina; Cavallo, Andrea

    2017-12-12

    Recent years have seen increased interest in machine learning-based predictive methods for analyzing quantitative behavioral data in experimental psychology. While these methods can achieve greater sensitivity than conventional univariate techniques, they still lack an established and accessible implementation. The aim of the current work was to build an open-source R toolbox - "PredPsych" - that makes these methods readily available to all psychologists. PredPsych is a user-friendly R toolbox based on machine-learning predictive algorithms. In this paper, we present the framework of PredPsych via the analysis of a recently published multiple-subject motion-capture dataset. In addition, we discuss examples of possible research questions that can be addressed with the machine-learning algorithms implemented in PredPsych but cannot be easily addressed with univariate statistical analysis. We anticipate that PredPsych will be of use to researchers with limited programming experience, not only in the field of psychology but also in clinical neuroscience, enabling computational assessment of putative bio-behavioral markers for both prognosis and diagnosis.
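
    PredPsych itself is an R package; as an analogy only, the short Python/scikit-learn sketch below shows the kind of cross-validated predictive classification the abstract refers to, applied to synthetic stand-in features rather than the published motion-capture dataset. The feature counts, labels, and classifier choice are assumptions, not PredPsych's defaults.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for kinematic features: 100 trials x 20 features,
    # with two experimental conditions as labels.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 20))
    y = rng.integers(0, 2, size=100)
    X[y == 1, :3] += 0.8   # make the first three features weakly discriminative

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")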

  4. Motion Simulation in the Environment for Auditory Research

    DTIC Science & Technology

    2011-08-01

    Toolbox, Centre for Digital Music, Queen Mary University of London, 2009. http://www.isophonics.net/content/spatial-audio-matlab-toolbox (accessed July 27...this work. 46 Student Bio: I studied Music Technology at Northwestern University, graduating as valedictorian of the School of... Music in 2008. In 2009, I was awarded the Gates Cambridge Scholarship to fund a postgraduate degree at the University of Cambridge. I read for a

  5. The 'Toolbox' of strategies for managing Haemonchus contortus in goats: What's in and what's out.

    PubMed

    Kearney, P E; Murray, P J; Hoy, J M; Hohenhaus, M; Kotze, A

    2016-04-15

    A dynamic and innovative approach to managing the blood-consuming nematode Haemonchus contortus in goats is critical to crack dependence on veterinary anthelmintics. H. contortus management strategies have been the subject of intense research for decades, and must be selected to create a tailored, individualized program for goat farms. Through the selection and combination of strategies from the Toolbox, an effective management program for H. contortus can be designed according to the unique conditions of each particular farm. This Toolbox investigates strategies including vaccines, bioactive forages, pasture/grazing management, behavioural management, natural immunity, FAMACHA, Refugia and strategic drenching, mineral/vitamin supplementation, copper Oxide Wire Particles (COWPs), breeding and selection/selecting resistant and resilient individuals, biological control and anthelmintic drugs. Barbervax(®), the ground-breaking Haemonchus vaccine developed and currently commercially available on a pilot scale for sheep, is prime for trialling in goats and would be an invaluable inclusion to this Toolbox. The specialised behaviours of goats, specifically their preferences to browse a variety of plants and accompanying physiological adaptations to the consumption of secondary compounds contained in browse, have long been unappreciated and thus overlooked as a valuable, sustainable strategy for Haemonchus management. These strategies are discussed in this review as to their value for inclusion into the 'Toolbox' currently, and the future implications of ongoing research for goat producers. Combining and manipulating strategies such as browsing behaviour, pasture management, bioactive forages and identifying and treating individual animals for haemonchosis, in addition to continuous evaluation of strategy effectiveness, is conducted using a model farm scenario. Selecting strategies from the Toolbox, with regard to their current availability, feasibility, economical cost and potential ease of implementation depending on the systems of production and their complementary nature, is the future of managing H. contortus in farmed goats internationally and maintaining the remaining efficacy of veterinary anthelmintics. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Disciplinary perspectives on multicultural research: Reply to Dvorakova (2016) and Yakushko et al. (2016).

    PubMed

    Hall, Gordon C Nagayama; Yip, Tiffany; Zárate, Michael A

    2016-12-01

    In their comments on Hall, Yip, and Zárate (2016), Dvorakova (2016) addresses cultural psychology methods and Yakushko, Hoffman, Consoli, and Lee (2016) address qualitative research methods. We provide evidence of the neglect of underrepresented groups in the publications of major journals in cultural psychology and qualitative psychology. We do not view any particular research method as inherently contributing to "epistemological violence" (Yakushko et al., 2016, p. 5), but it is the misguided application and/or interpretation of data generated from such methods that perpetuate oppression. We contend that best practices for representing ethnocultural diversity in research will require a diverse toolbox containing quantitative, qualitative, biological, and behavioral approaches. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  7. To label or not to label: applications of quantitative proteomics in neuroscience research.

    PubMed

    Filiou, Michaela D; Martins-de-Souza, Daniel; Guest, Paul C; Bahn, Sabine; Turck, Christoph W

    2012-02-01

    Proteomics has provided researchers with a sophisticated toolbox of labeling-based and label-free quantitative methods. These are now being applied in neuroscience research where they have already contributed to the elucidation of fundamental mechanisms and the discovery of candidate biomarkers. In this review, we evaluate and compare labeling-based and label-free quantitative proteomic techniques for applications in neuroscience research. We discuss the considerations required for the analysis of brain and central nervous system specimens, the experimental design of quantitative proteomic workflows as well as the feasibility, advantages, and disadvantages of the available techniques for neuroscience-oriented questions. Furthermore, we assess the use of labeled standards as internal controls for comparative studies in humans and review applications of labeling-based and label-free mass spectrometry approaches in relevant model organisms and human subjects. Providing a comprehensive guide of feasible and meaningful quantitative proteomic methodologies for neuroscience research is crucial not only for overcoming current limitations but also for gaining useful insights into brain function and translating proteomics from bench to bedside. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.

    PubMed

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include: • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms. • The modular approach along with the idea of lookup tables implemented help avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform. • The concept also helps prevent unnecessary computation of already known transforms thereby saving memory and processing time.
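
    For a radially symmetric function, the 2D Fourier transform in polar coordinates reduces to 2*pi times the zeroth-order Hankel transform of the radial profile, which is the kind of simpler building block the abstract alludes to. The SymPy sketch below demonstrates only that special case on an assumed test function; it is not the authors' toolbox.

    import sympy as sp

    r, k, a = sp.symbols('r k a', positive=True)

    # For a radially symmetric f(r), the 2D Fourier transform in polar coordinates is
    #   F(k) = 2*pi * Integral(f(r) * J0(k*r) * r, (r, 0, oo)),
    # i.e. 2*pi times the zeroth-order Hankel transform of f.
    f = sp.exp(-a * r)
    F = 2 * sp.pi * sp.hankel_transform(f, r, k, 0)
    print(F)   # expected: 2*pi*a/(a**2 + k**2)**(3/2)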

  9. Application of a moment tensor inversion code developed for mining-induced seismicity to fracture monitoring of civil engineering materials

    NASA Astrophysics Data System (ADS)

    Linzer, Lindsay; Mhamdi, Lassaad; Schumacher, Thomas

    2015-01-01

    A moment tensor inversion (MTI) code originally developed to compute source mechanisms from mining-induced seismicity data is now being used in the laboratory in a civil engineering research environment. Quantitative seismology methods designed for geological environments are being tested with the aim of developing techniques to assess and monitor fracture processes in structural concrete members such as bridge girders. In this paper, we highlight aspects of the MTI_Toolbox programme that make it applicable to performing inversions on acoustic emission (AE) data recorded by networks of uniaxial sensors. The influence of the configuration of a seismic network on the conditioning of the least-squares system and subsequent moment tensor results for a real, 3-D network are compared to a hypothetical 2-D version of the same network. This comparative analysis is undertaken for different cases: for networks consisting entirely of triaxial or uniaxial sensors; for both P and S-waves, and for P-waves only. The aim is to guide the optimal design of sensor configurations where only uniaxial sensors can be installed. Finally, the findings of recent laboratory experiments where the MTI_Toolbox has been applied to a concrete beam test are presented and discussed.
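
    At its core, a moment tensor inversion of this kind solves an overdetermined linear system relating observed amplitudes to the six independent moment-tensor components, and the conditioning of that system is what the network-geometry comparison probes. The Python sketch below illustrates only this generic least-squares step with a random stand-in for the Green's-function coefficient matrix; it is not the MTI_Toolbox code, and all values are hypothetical.

    import numpy as np

    def invert_moment_tensor(G, d):
        """Solve d = G @ m for the 6 independent moment-tensor components in a
        least-squares sense, and report the conditioning of the system matrix G."""
        m, *_ = np.linalg.lstsq(G, d, rcond=None)
        return m, np.linalg.cond(G)

    # Toy usage: 12 amplitude observations, 6 unknown moment-tensor components.
    rng = np.random.default_rng(0)
    G = rng.normal(size=(12, 6))          # stand-in for Green's-function coefficients
    m_true = np.array([1.0, -0.5, -0.5, 0.2, 0.0, 0.1])
    d = G @ m_true + 0.01 * rng.normal(size=12)
    m_est, cond = invert_moment_tensor(G, d)
    print(m_est.round(2), f"cond(G) = {cond:.1f}")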

  10. EEGVIS: A MATLAB Toolbox for Browsing, Exploring, and Viewing Large Datasets.

    PubMed

    Robbins, Kay A

    2012-01-01

    Recent advances in data monitoring and sensor technology have accelerated the acquisition of very large data sets. Streaming data sets from instrumentation such as multi-channel EEG recording usually must undergo substantial pre-processing and artifact removal. Even when using automated procedures, most scientists engage in laborious manual examination and processing to assure high-quality data and to identify interesting or problematic data segments. Researchers also do not have a convenient method of visually assessing the effects of applying any stage in a processing pipeline. EEGVIS is a MATLAB toolbox that allows users to quickly explore multi-channel EEG and other large array-based data sets using multi-scale drill-down techniques. Customizable summary views reveal potentially interesting sections of data, which users can explore further by clicking to examine them with detailed viewing components. The viewer and a companion browser are built on our MoBBED framework, which has a library of modular viewing components that can be mixed and matched to best reveal structure. Users can easily create new viewers for their specific data without any programming during the exploration process. These viewers automatically support pan, zoom, resizing of individual components, and cursor exploration. The toolbox can be used directly in MATLAB at any stage in a processing pipeline, as a plug-in for EEGLAB, or as a standalone precompiled application without MATLAB running. EEGVIS and its supporting packages are freely available under the GNU General Public License at http://visual.cs.utsa.edu/eegvis.

  11. System-Events Toolbox--Activating Urban Places for Social Cohesion through Designing a System of Events That Relies on Local Resources

    ERIC Educational Resources Information Center

    Fassi, Davide; Motter, Roberta

    2014-01-01

    This paper is a reflection on the use of public spaces in towns and the development of a system-events toolbox to activate them towards social cohesion. It is the result of a 1 year action research developed together with POLIMI DESIS Lab of the Department of Design to develop design solutions to open up the public spaces of the campus to the…

  12. Development of Complexity Science and Technology Tools for NextGen Airspace Research and Applications

    NASA Technical Reports Server (NTRS)

    Holmes, Bruce J.; Sawhill, Bruce K.; Herriot, James; Seehart, Ken; Zellweger, Dres; Shay, Rick

    2012-01-01

    The objective of this research by NextGen AeroSciences, LLC is twofold: 1) to deliver an initial "toolbox" of algorithms, agent-based structures, and method descriptions for introducing trajectory agency as a methodology for simulating and analyzing airspace states, including bulk properties of large numbers of heterogeneous 4D aircraft trajectories in a test airspace -- while maintaining or increasing system safety; and 2) to use these tools in a test airspace to identify possible phase transition structure to predict when an airspace will approach the limits of its capacity. These 4D trajectories continuously replan their paths in the presence of noise and uncertainty while optimizing performance measures and performing conflict detection and resolution. In this approach, trajectories are represented as extended objects endowed with pseudopotential, maintaining time and fuel-efficient paths by bending just enough to accommodate separation while remaining inside of performance envelopes. This trajectory-centric approach differs from previous aircraft-centric distributed approaches to deconfliction. The results of this project are the following: 1) we delivered a toolbox of algorithms, agent-based structures and method descriptions as pseudocode; and 2) we corroborated the existence of phase transition structure in simulation with the addition of "early warning" detected prior to "full" airspace. This research suggests that airspace "fullness" can be anticipated and remedied before the airspace becomes unsafe.

  13. Integrated system dynamics toolbox for water resources planning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reno, Marissa Devan; Passell, Howard David; Malczynski, Leonard A.

    2006-12-01

    Public mediated resource planning is quickly becoming the norm rather than the exception. Unfortunately, supporting tools are lacking that interactively engage the public in the decision-making process and integrate over the myriad values that influence water policy. In the pages of this report we document the first steps toward developing a specialized decision framework to meet this need; specifically, a modular and generic resource-planning "toolbox". The technical challenge lies in the integration of the disparate systems of hydrology, ecology, climate, demographics, economics, policy and law, each of which influence the supply and demand for water. Specifically, these systems, their associated processes, and most importantly the constitutive relations that link them must be identified, abstracted, and quantified. For this reason, the toolbox forms a collection of process modules and constitutive relations that the analyst can "swap" in and out to model the physical and social systems unique to their problem. This toolbox with all of its modules is developed within the common computational platform of system dynamics linked to a Geographical Information System (GIS). Development of this resource-planning toolbox represents an important foundational element of the proposed interagency center for Computer Aided Dispute Resolution (CADRe). The Center's mission is to manage water conflict through the application of computer-aided collaborative decision-making methods. The Center will promote the use of decision-support technologies within collaborative stakeholder processes to help stakeholders find common ground and create mutually beneficial water management solutions. The Center will also serve to develop new methods and technologies to help federal, state and local water managers find innovative and balanced solutions to the nation's most vexing water problems. The toolbox is an important step toward achieving the technology development goals of this center.

  14. HYDRORECESSION: A toolbox for streamflow recession analysis

    NASA Astrophysics Data System (ADS)

    Arciniega, S.

    2015-12-01

    Streamflow recession curves are hydrological signatures that allow the relationship between groundwater storage and baseflow and/or low flows to be studied at the catchment scale. Recent studies have shown that streamflow recession analysis can be quite sensitive to the combination of different models, extraction techniques and parameter estimation methods. In order to better characterize streamflow recession curves, new methodologies combining multiple approaches have been recommended. The HYDRORECESSION toolbox, presented here, is a MATLAB graphical user interface developed to analyse streamflow recession time series, with tools for parameterizing linear and nonlinear storage-outflow relationships through four of the most useful recession models (Maillet, Boussinesq, Coutagne and Wittenberg). The toolbox includes four parameter-fitting techniques (linear regression, lower envelope, data binning and mean squared error) and three different methods to extract hydrograph recession segments (Vogel, Brutsaert and Aksoy). In addition, the toolbox has a module that separates the baseflow component from the observed hydrograph using the inverse reservoir algorithm. Potential applications of HYDRORECESSION include model parameter analysis, hydrological regionalization and classification, baseflow index estimation, catchment-scale recharge and low-flow modelling, among others. HYDRORECESSION is freely available for non-commercial and academic purposes.
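
    Of the four recession models named above, the Maillet model is the simplest: a linear reservoir with Q(t) = Q0 * exp(-t / tau), which can be fitted by linear regression on log-transformed discharge. The Python sketch below shows only that textbook fit on synthetic data; it is an illustration of the model, not of HYDRORECESSION's own fitting routines, and the numbers are assumptions.

    import numpy as np

    def fit_maillet(t, q):
        """Fit the Maillet recession model Q(t) = Q0 * exp(-t / tau) by
        linear regression on log(Q). Returns (Q0, tau)."""
        slope, intercept = np.polyfit(t, np.log(q), 1)
        return np.exp(intercept), -1.0 / slope

    # Toy usage: synthetic recession limb with 5% multiplicative noise.
    rng = np.random.default_rng(0)
    t = np.arange(0.0, 30.0)                                        # days
    q = 12.0 * np.exp(-t / 8.0) * rng.lognormal(0, 0.05, t.size)    # m3/s
    q0, tau = fit_maillet(t, q)
    print(f"Q0 = {q0:.2f} m3/s, tau = {tau:.1f} days")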

  15. A novel toolbox for E. coli lysis monitoring.

    PubMed

    Rajamanickam, Vignesh; Wurm, David; Slouka, Christoph; Herwig, Christoph; Spadiut, Oliver

    2017-01-01

    The bacterium Escherichia coli is a well-studied recombinant host organism with a plethora of applications in biotechnology. Highly valuable biopharmaceuticals, such as antibody fragments and growth factors, are currently being produced in E. coli. However, the high metabolic burden during recombinant protein production can lead to cell death, consequent lysis, and undesired product loss. Thus, fast and precise analyzers to monitor E. coli bioprocesses and to retrieve key process information, such as the optimal time point of harvest, are needed. To date, however, such reliable monitoring tools are still scarce. In this study, we cultivated an E. coli strain producing a recombinant single-chain antibody fragment in the cytoplasm. In bioreactor cultivations, we purposely triggered cell lysis by pH ramps. We developed a novel toolbox using UV chromatograms as fingerprints and chemometric techniques to monitor these lysis events, and used flow cytometry (FCM) as a reference method to quantify viability offline. In summary, we showed that a novel toolbox comprising HPLC chromatogram fingerprinting and data-science tools allows the identification of E. coli lysis in a fast and reliable manner. We are convinced that this toolbox will not only facilitate E. coli bioprocess monitoring but will also allow enhanced process control in the future.
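
    As an illustration of the chemometric idea only, the Python sketch below applies principal component analysis to synthetic chromatogram fingerprints in which later runs contain an extra peak; runs drifting away from the early-process cluster would flag a change such as lysis. The data, peak positions, and two-component choice are assumptions, not the authors' toolbox or dataset.

    import numpy as np
    from sklearn.decomposition import PCA

    # Synthetic stand-in: 20 chromatograms x 500 retention-time points, where the
    # later runs contain an extra peak mimicking a lysis-related change.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 500)
    base_peak = np.exp(-((t - 0.4) / 0.03) ** 2)
    extra_peak = np.exp(-((t - 0.7) / 0.03) ** 2)
    chromatograms = np.array([
        base_peak + (0.5 * extra_peak if i >= 10 else 0.0) + 0.02 * rng.normal(size=t.size)
        for i in range(20)
    ])

    # Project the fingerprints onto their first two principal components; runs that
    # separate from the early-process cluster indicate a changed fingerprint.
    scores = PCA(n_components=2).fit_transform(chromatograms)
    print(scores.round(2))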

  16. Synthetic Biology Toolbox for Controlling Gene Expression in the Cyanobacterium Synechococcus sp. strain PCC 7002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markley, Andrew L.; Begemann, Matthew B.; Clarke, Ryan E.

    The application of synthetic biology requires characterized tools to precisely control gene expression. This toolbox of genetic parts previously did not exist for the industrially promising cyanobacterium, Synechococcus sp. strain PCC 7002. To address this gap, two orthogonal constitutive promoter libraries, one based on a cyanobacterial promoter and the other ported from Escherichia coli, were built and tested in PCC 7002. The libraries demonstrated 3 and 2.5 log dynamic ranges, respectively, but correlated poorly with E. coli expression levels. These promoter libraries were then combined to create and optimize a series of IPTG inducible cassettes. The resultant induction system had a 48-fold dynamic range and was shown to out-perform Ptrc constructs. Finally, a RBS library was designed and tested in PCC 7002. The presented synthetic biology toolbox will enable accelerated engineering of PCC 7002.

  17. Synthetic Biology Toolbox for Controlling Gene Expression in the Cyanobacterium Synechococcus sp. strain PCC 7002

    DOE PAGES

    Markley, Andrew L.; Begemann, Matthew B.; Clarke, Ryan E.; ...

    2014-09-12

    The application of synthetic biology requires characterized tools to precisely control gene expression. This toolbox of genetic parts previously did not exist for the industrially promising cyanobacterium, Synechococcus sp. strain PCC 7002. To address this gap, two orthogonal constitutive promoter libraries, one based on a cyanobacterial promoter and the other ported from Escherichia coli, were built and tested in PCC 7002. The libraries demonstrated 3 and 2.5 log dynamic ranges, respectively, but correlated poorly with E. coli expression levels. These promoter libraries were then combined to create and optimize a series of IPTG inducible cassettes. The resultant induction system had a 48-fold dynamic range and was shown to out-perform Ptrc constructs. Finally, a RBS library was designed and tested in PCC 7002. The presented synthetic biology toolbox will enable accelerated engineering of PCC 7002.

  18. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates

    PubMed Central

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include: • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms. • The modular approach along with the idea of lookup tables implemented help avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform. • The concept also helps prevent unnecessary computation of already known transforms thereby saving memory and processing time. PMID:26150988

  19. Soil Fumigants

    EPA Pesticide Factsheets

    This Toolbox provides training, outreach, and resource materials for applicators and handlers, communities, state/local agencies, and others interested in understanding and implementing the current requirements for safe use of these pesticides.

  20. Using genetic data to strengthen causal inference in observational research.

    PubMed

    Pingault, Jean-Baptiste; O'Reilly, Paul F; Schoeler, Tabea; Ploubidis, George B; Rijsdijk, Frühling; Dudbridge, Frank

    2018-06-05

    Causal inference is essential across the biomedical, behavioural and social sciences. By progressing from confounded statistical associations to evidence of causal relationships, causal inference can reveal complex pathways underlying traits and diseases and help to prioritize targets for intervention. Recent progress in genetic epidemiology - including statistical innovation, massive genotyped data sets and novel computational tools for deep data mining - has fostered the intense development of methods exploiting genetic data and relatedness to strengthen causal inference in observational research. In this Review, we describe how such genetically informed methods differ in their rationale, applicability and inherent limitations and outline how they should be integrated in the future to offer a rich causal inference toolbox.

  1. Qualitative methods in health services and management research: pockets of excellence and progress, but still a long way to go.

    PubMed

    Devers, Kelly J

    2011-02-01

    The 10-year systematic review of published health services and management research by Weiner et al. (2011) chronicles the contributions of qualitative methods, highlights areas of substantial progress, and identifies areas in need of more progress. This article (Devers, 2011) discusses possible reasons for the lack of progress in some areas--related to the under-supply of well-trained qualitative researchers and more tangible demand for their research--and mechanisms for future improvement. To ensure a robust health services research toolbox, the field must take additional steps to provide stronger education and training in qualitative methods and more funding and publication opportunities. Given the rapidly changing health care system following the passage of national health reform and the challenging research issues associated with it, the health services research and management field will not meet its future challenges with quantitative methods alone or with a half-empty toolbox.

  2. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation.

    PubMed

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEP)s. With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research.

  3. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation

    PubMed Central

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C.; Wong, Willy; Daskalakis, Zafiris J.; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEP)s. With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research. PMID:27774054

  4. A fractured rock geophysical toolbox method selection tool

    USGS Publications Warehouse

    Day-Lewis, F. D.; Johnson, C.D.; Slater, L.D.; Robinson, J.L.; Williams, J.H.; Boyden, C.L.; Werkema, D.D.; Lane, J.W.

    2016-01-01

    Geophysical technologies have the potential to improve site characterization and monitoring in fractured rock, but the appropriate and effective application of geophysics at a particular site strongly depends on project goals (e.g., identifying discrete fractures) and site characteristics (e.g., lithology). No method works at every site or for every goal. New approaches are needed to identify a set of geophysical methods appropriate to specific project goals and site conditions while considering budget constraints. To this end, we present the Excel-based Fractured-Rock Geophysical Toolbox Method Selection Tool (FRGT-MST). We envision the FRGT-MST (1) equipping remediation professionals with a tool to understand what is likely to be realistic and cost-effective when contracting geophysical services, and (2) reducing applications of geophysics with unrealistic objectives or where methods are likely to fail.

  5. Kinematic simulation and analysis of robot based on MATLAB

    NASA Astrophysics Data System (ADS)

    Liao, Shuhua; Li, Jiong

    2018-03-01

    The history of industrial automation is characterized by rapidly evolving technology, and the industrial robot is, without a doubt, a special kind of equipment. With the help of MATLAB's matrix and plotting capabilities, each link coordinate system is set up in the MATLAB environment using the Denavit-Hartenberg (D-H) parameter method, together with the equations of motion of the structure. The Robotics Toolbox and GUIDE are applied jointly for inverse kinematics analysis, path planning and simulation, providing a preliminary solution to the positioning problem of a mechanical arm for a student car project, so as to achieve the intended aim.
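
    As a minimal illustration of the D-H construction described above, the Python sketch below chains standard Denavit-Hartenberg link transforms to compute the forward kinematics of an assumed two-link planar arm. It is a generic textbook example, not the MATLAB Robotics Toolbox or the authors' GUI, and the link lengths and joint angles are hypothetical.

    import numpy as np

    def dh_matrix(theta, d, a, alpha):
        """Homogeneous transform between consecutive links for standard D-H parameters."""
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        return np.array([
            [ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0],
        ])

    def forward_kinematics(joint_angles, dh_table):
        """Chain the link transforms; dh_table rows are (d, a, alpha) per joint."""
        T = np.eye(4)
        for theta, (d, a, alpha) in zip(joint_angles, dh_table):
            T = T @ dh_matrix(theta, d, a, alpha)
        return T

    # Toy usage: two-link planar arm with 1 m links.
    dh_table = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
    T = forward_kinematics(np.radians([30.0, 45.0]), dh_table)
    print("end-effector position:", T[:3, 3].round(3))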

  6. III. NIH Toolbox Cognition Battery (CB): measuring episodic memory.

    PubMed

    Bauer, Patricia J; Dikmen, Sureyya S; Heaton, Robert K; Mungas, Dan; Slotkin, Jerry; Beaumont, Jennifer L

    2013-08-01

    One of the most significant domains of cognition is episodic memory, which allows for rapid acquisition and long-term storage of new information. For purposes of the NIH Toolbox, we devised a new test of episodic memory. The nonverbal NIH Toolbox Picture Sequence Memory Test (TPSMT) requires participants to reproduce the order of an arbitrarily ordered sequence of pictures presented on a computer. To adjust for ability, sequence length varies from 6 to 15 pictures. Multiple trials are administered to increase reliability. Pediatric data from the validation study revealed the TPSMT to be sensitive to age-related changes. The task also has high test-retest reliability and promising construct validity. Steps to further increase the sensitivity of the instrument to individual and age-related variability are described. © 2013 The Society for Research in Child Development, Inc.

  7. Tools in Support of Planning for Weather and Climate Extremes

    NASA Astrophysics Data System (ADS)

    Done, J.; Bruyere, C. L.; Hauser, R.; Holland, G. J.; Tye, M. R.

    2016-12-01

    A major limitation to planning for weather and climate extremes is the lack of maintained and readily available tools that can provide robust and well-communicated predictions and advice on their impacts. The National Center for Atmospheric Research is facilitating a collaborative international program to develop and support such tools within its Capacity Center for Climate and Weather Extremes aimed at improving community resilience planning and reducing weather and climate impacts. A Global Risk, Resilience and Impacts Toolbox is in development and will provide: A portable web-based interface to process work requests from a variety of users and locations; A sophisticated framework that enables specialized community tools to access a comprehensive database (public and private) of geo-located hazard, vulnerability, exposure, and loss data; A community development toolkit that enables and encourages community tool developments geared towards specific user management and planning needs; and A comprehensive community support facilitated by NCAR utilizing tutorials and a help desk. A number of applications are in development, built off the latest climate science, and in collaboration with private industry and local and state governments. Example applications will be described, including a hurricane damage tool in collaboration with the reinsurance sector, and a weather management tool for the construction industry. These examples will serve as starting points to discuss the broader potential of the toolbox.

  8. How chemistry supports cell biology: the chemical toolbox at your service.

    PubMed

    Wijdeven, Ruud H; Neefjes, Jacques; Ovaa, Huib

    2014-12-01

    Chemical biology is a young and rapidly developing scientific field. In this field, chemistry is inspired by biology to create various tools to monitor and modulate biochemical and cell biological processes. Chemical contributions such as small-molecule inhibitors and activity-based probes (ABPs) can provide new and unique insights into previously unexplored cellular processes. This review provides an overview of recent breakthroughs in chemical biology that are likely to have a significant impact on cell biology. We also discuss the application of several chemical tools in cell biology research. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    PubMed

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  10. A MATLAB toolbox for the efficient estimation of the psychometric function using the updated maximum-likelihood adaptive procedure.

    PubMed

    Shen, Yi; Dai, Wei; Richards, Virginia M

    2015-03-01

    A MATLAB toolbox for the efficient estimation of the threshold, slope, and lapse rate of the psychometric function is described. The toolbox enables the efficient implementation of the updated maximum-likelihood (UML) procedure. The toolbox uses an object-oriented architecture for organizing the experimental variables and computational algorithms, which provides experimenters with flexibility in experimental design and data management. Descriptions of the UML procedure and the UML Toolbox are provided, followed by examples of toolbox use. Finally, guidelines and recommendations for parameter configurations are given.
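
    For context, the three quantities named above parameterize a psychometric function such as a logistic with a guess rate. The Python sketch below fits threshold, slope, and lapse rate by maximum likelihood on simulated constant-stimuli data; it illustrates the parameters only and is not the UML adaptive procedure or the MATLAB toolbox, and all values are assumptions.

    import numpy as np
    from scipy.optimize import minimize

    def psychometric(x, alpha, beta, lam, gamma=0.5):
        """Logistic psychometric function with threshold alpha, slope beta,
        lapse rate lam, and guess rate gamma (0.5 for a 2AFC task)."""
        return gamma + (1.0 - gamma - lam) / (1.0 + np.exp(-beta * (x - alpha)))

    def neg_log_likelihood(params, x, n_correct, n_trials):
        alpha, beta, lam = params
        p = np.clip(psychometric(x, alpha, beta, lam), 1e-6, 1 - 1e-6)
        return -np.sum(n_correct * np.log(p) + (n_trials - n_correct) * np.log(1.0 - p))

    # Toy data: stimulus levels, trials per level, and simulated correct counts.
    rng = np.random.default_rng(0)
    x = np.linspace(-2, 2, 9)
    n_trials = np.full(9, 40)
    n_correct = rng.binomial(n_trials, psychometric(x, alpha=0.3, beta=3.0, lam=0.02))

    fit = minimize(neg_log_likelihood, x0=[0.0, 1.0, 0.05],
                   args=(x, n_correct, n_trials),
                   bounds=[(-2, 2), (0.1, 20), (0.0, 0.2)], method="L-BFGS-B")
    print("threshold, slope, lapse:", np.round(fit.x, 2))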

  11. Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis

    PubMed Central

    Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.

    2006-01-01

    In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user-friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software, relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than the widely used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
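
    The core idea, voxel-wise regression with a second imaging modality entered as a covariate, can be sketched with toy matrices in plain MATLAB as follows. This is not BPM or SPM code; the array names, sizes, and the scalar covariate are assumptions made only for the example.

        % Voxel-wise multiple regression with a second imaging modality as a covariate,
        % in the spirit of BPM. Toy data; not BPM code and no SPM dependencies.
        nSubj = 20; nVox = 1000;
        rng(3);
        fmri  = randn(nSubj, nVox);                    % primary modality (e.g., fMRI maps)
        pet   = randn(nSubj, nVox);                    % second modality, voxel-matched
        age   = 20 + 40 * rand(nSubj, 1);              % a scalar covariate
        tPet  = zeros(1, nVox);                        % t-statistic for the PET regressor
        for v = 1:nVox
            X = [ones(nSubj,1), pet(:,v), age];        % design matrix for this voxel
            b = X \ fmri(:,v);                         % least-squares fit
            res = fmri(:,v) - X * b;
            df  = nSubj - size(X,2);
            s2  = (res' * res) / df;
            covB = s2 * inv(X' * X);                   % parameter covariance
            tPet(v) = b(2) / sqrt(covB(2,2));          % t for the cross-modality regressor
        end
        fprintf('max |t| across voxels: %.2f (df = %d)\n', max(abs(tPet)), df);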

  12. ICT: isotope correction toolbox.

    PubMed

    Jungreuthmayer, Christian; Neubauer, Stefan; Mairinger, Teresa; Zanghellini, Jürgen; Hann, Stephan

    2016-01-01

    Isotope tracer experiments are an invaluable technique to analyze and study the metabolism of biological systems. However, isotope labeling experiments are often affected by naturally abundant isotopes, especially in cases where mass spectrometric methods make use of derivatization. The correction of these additive interferences, in particular for complex isotopic systems, is numerically challenging and still an emerging field of research. When positional information is generated via collision-induced dissociation, even more complex calculations for isotopic interference correction are necessary. So far, no freely available tools can handle tandem mass spectrometry data. We present isotope correction toolbox, a program that corrects tandem mass isotopomer data from tandem mass spectrometry experiments. Isotope correction toolbox is written in the multi-platform programming language Perl and, therefore, can be used on all commonly available computer platforms. Source code and documentation can be freely obtained under the Artistic License or the GNU General Public License from: https://github.com/jungreuc/isotope_correction_toolbox/ {christian.jungreuthmayer@boku.ac.at,juergen.zanghellini@boku.ac.at} Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
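
    A minimal sketch of the general principle behind natural-abundance correction, solving measured = C × corrected for a single mass isotopomer distribution, is shown below in MATLAB. It is not the ICT implementation (which is written in Perl and handles tandem MS data and derivatization); the correction matrix and measured fractions are made-up illustrative numbers.

        % Natural-abundance correction of a mass isotopomer distribution (MID) by
        % solving a linear system, measured = C * corrected. Generic illustration of
        % the principle only; ICT handles tandem MS (MS/MS) data, which this does not.
        measured = [0.62; 0.25; 0.10; 0.03];     % measured M+0..M+3 fractions (toy numbers)
        % Hypothetical correction matrix: column j gives the natural-isotope spillover
        % pattern of the pure M+j species into the measured channels.
        C = [0.90 0    0    0;
             0.08 0.90 0    0;
             0.02 0.08 0.90 0;
             0    0.02 0.08 0.90];
        corrected = C \ measured;                % solve for the underlying labeling pattern
        corrected = max(corrected, 0);           % clip small negative values from noise
        corrected = corrected / sum(corrected);  % renormalise to fractions
        disp(corrected')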

  13. Distributed Aerodynamic Sensing and Processing Toolbox

    NASA Technical Reports Server (NTRS)

    Brenner, Martin; Jutte, Christine; Mangalam, Arun

    2011-01-01

    A Distributed Aerodynamic Sensing and Processing (DASP) toolbox was designed and fabricated for flight test applications with an Aerostructures Test Wing (ATW) mounted under the fuselage of an F-15B on the Flight Test Fixture (FTF). DASP monitors and processes the aerodynamics with the structural dynamics using nonintrusive, surface-mounted, hot-film sensing. This aerodynamic measurement tool benefits programs devoted to static/dynamic load alleviation, body freedom flutter suppression, buffet control, improvement of aerodynamic efficiency through cruise control, supersonic wave drag reduction through shock control, etc. This DASP toolbox measures local and global unsteady aerodynamic load distribution with distributed sensing. It determines the correlation between aerodynamic observables (aero forces) and structural dynamics, and allows an increase in control authority through aeroelastic shaping and active flow control. It offers improvements in flutter suppression and, in particular, body freedom flutter suppression, as well as aerodynamic performance of wings for increased range/endurance of manned/unmanned flight vehicles. Other improvements include inlet performance with closed-loop active flow control, and development and validation of advanced analytical and computational tools for unsteady aerodynamics.

  14. BOLDSync: a MATLAB-based toolbox for synchronized stimulus presentation in functional MRI.

    PubMed

    Joshi, Jitesh; Saharan, Sumiti; Mandal, Pravat K

    2014-02-15

    Precise and synchronized presentation of paradigm stimuli in functional magnetic resonance imaging (fMRI) is central to obtaining accurate information about brain regions involved in a specific task. In this manuscript, we present a new MATLAB-based toolbox, BOLDSync, for synchronized stimulus presentation in fMRI. BOLDSync provides a user-friendly platform for design and presentation of visual, audio, as well as multimodal audio-visual (AV) stimuli in functional imaging experiments. We present simulation experiments that demonstrate the millisecond synchronization accuracy of BOLDSync, and also illustrate the functionalities of BOLDSync through application to an AV fMRI study. BOLDSync gains an advantage over other available proprietary and open-source toolboxes by offering a user-friendly and accessible interface that affords both precision in stimulus presentation and versatility across various types of stimulus designs and system setups. BOLDSync is a reliable, efficient, and versatile solution for synchronized stimulus presentation in fMRI studies. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Cross-species 3D virtual reality toolbox for visual and cognitive experiments.

    PubMed

    Doucet, Guillaume; Gulli, Roberto A; Martinez-Trujillo, Julio C

    2016-06-15

    Although simplified visual stimuli, such as dots or gratings presented on homogeneous backgrounds, provide strict control over the stimulus parameters during visual experiments, they fail to approximate visual stimulation in natural conditions. Adoption of virtual reality (VR) in neuroscience research has been proposed to circumvent this problem, by combining strict control of experimental variables and behavioral monitoring within complex and realistic environments. We have created a VR toolbox that maximizes experimental flexibility while minimizing implementation costs. A free VR engine (Unreal 3) has been customized to interface with any control software via text commands, allowing seamless introduction into pre-existing laboratory data acquisition frameworks. Furthermore, control functions are provided for the two most common programming languages used in visual neuroscience: Matlab and Python. The toolbox offers the millisecond time resolution necessary for electrophysiological recordings and is flexible enough to support cross-species usage across a wide range of paradigms. Unlike previously proposed VR solutions whose implementation is complex and time-consuming, our toolbox requires minimal customization or technical expertise to interface with pre-existing data acquisition frameworks as it relies on already familiar programming environments. Moreover, as it is compatible with a variety of display and input devices, identical VR testing paradigms can be used across species, from rodents to humans. This toolbox facilitates the addition of VR capabilities to any laboratory without perturbing pre-existing data acquisition frameworks, or requiring any major hardware changes. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. A toolbox to visually explore cerebellar shape changes in cerebellar disease and dysfunction.

    PubMed

    Abulnaga, S Mazdak; Yang, Zhen; Carass, Aaron; Kansal, Kalyani; Jedynak, Bruno M; Onyike, Chiadi U; Ying, Sarah H; Prince, Jerry L

    2016-02-27

    The cerebellum plays an important role in motor control and is also involved in cognitive processes. Cerebellar function is specialized by location, although the exact topographic functional relationship is not fully understood. The spinocerebellar ataxias are a group of neurodegenerative diseases that cause regional atrophy in the cerebellum, yielding distinct motor and cognitive problems. The ability to study the region-specific atrophy patterns can provide insight into the problem of relating cerebellar function to location. In an effort to study these structural change patterns, we developed a toolbox in MATLAB to provide researchers with a unique way to visually explore the correlation between cerebellar lobule shape changes and function loss, with a rich set of visualization and analysis modules. In this paper, we outline the functions and highlight the utility of the toolbox. The toolbox takes as input landmark shape representations of subjects' cerebellar substructures. A principal component analysis is used for dimension reduction. Following this, a linear discriminant analysis and a regression analysis can be performed to find the discriminant direction associated with a specific disease type, or the regression line of a specific functional measure can be generated. The characteristic structural change pattern of a disease type or of a functional score is visualized by sampling points on the discriminant or regression line. The sampled points are used to reconstruct synthetic cerebellar lobule shapes. We show a few case studies highlighting the utility of the toolbox and compare the analysis results with the literature.
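
    The analysis pattern described above (PCA for dimension reduction, regression of the scores on a functional measure, sampling along the fitted line, and reconstruction of synthetic shapes) can be sketched with toy data as follows. This is not the toolbox's code; the landmark data, the functional score, and the number of retained components are assumptions made for illustration.

        % Sketch of the analysis pattern described in the abstract: PCA on landmark
        % shapes, regression of PCA scores on a functional score, and reconstruction of
        % synthetic shapes sampled along the regression line. Toy data only.
        nSubj = 30; nLandmarks = 50;             % each shape is a flattened landmark vector
        rng(4);
        shapes = randn(nSubj, 2 * nLandmarks);   % rows = subjects, columns = (x,y) landmarks
        score  = randn(nSubj, 1);                % a functional measure (e.g., ataxia rating)
        mu = mean(shapes, 1);
        Xc = shapes - mu;                        % centre the data
        [U, S, V] = svd(Xc, 'econ');             % PCA via SVD; columns of V are components
        k = 5;                                   % keep k principal components
        pcScores = Xc * V(:, 1:k);               % low-dimensional representation
        B = [ones(nSubj,1), score] \ pcScores;   % regress PC scores on the functional score
        sampled = linspace(min(score), max(score), 5)';   % points along the regression line
        synthScores = [ones(5,1), sampled] * B;  % predicted PC scores at the sampled points
        synthShapes = synthScores * V(:, 1:k)' + mu;      % back-project to landmark space
        disp(size(synthShapes))                  % 5 synthetic shapes, one per sampled score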

  17. A toolbox to visually explore cerebellar shape changes in cerebellar disease and dysfunction

    NASA Astrophysics Data System (ADS)

    Abulnaga, S. Mazdak; Yang, Zhen; Carass, Aaron; Kansal, Kalyani; Jedynak, Bruno M.; Onyike, Chiadi U.; Ying, Sarah H.; Prince, Jerry L.

    2016-03-01

    The cerebellum plays an important role in motor control and is also involved in cognitive processes. Cerebellar function is specialized by location, although the exact topographic functional relationship is not fully understood. The spinocerebellar ataxias are a group of neurodegenerative diseases that cause regional atrophy in the cerebellum, yielding distinct motor and cognitive problems. The ability to study the region-specific atrophy patterns can provide insight into the problem of relating cerebellar function to location. In an effort to study these structural change patterns, we developed a toolbox in MATLAB to provide researchers with a unique way to visually explore the correlation between cerebellar lobule shape changes and function loss, with a rich set of visualization and analysis modules. In this paper, we outline the functions and highlight the utility of the toolbox. The toolbox takes as input landmark shape representations of subjects' cerebellar substructures. A principal component analysis is used for dimension reduction. Following this, a linear discriminant analysis and a regression analysis can be performed to find the discriminant direction associated with a specific disease type, or the regression line of a specific functional measure can be generated. The characteristic structural change pattern of a disease type or of a functional score is visualized by sampling points on the discriminant or regression line. The sampled points are used to reconstruct synthetic cerebellar lobule shapes. We show a few case studies highlighting the utility of the toolbox and compare the analysis results with the literature.

  18. Basic Radar Altimetry Toolbox: Tools to Use Radar Altimetry for Geodesy

    NASA Astrophysics Data System (ADS)

    Rosmorduc, V.; Benveniste, J. J.; Bronner, E.; Niejmeier, S.

    2010-12-01

    Radar altimetry is a technique whose applications and uses are expanding. While considerable effort has been made for oceanography users (including easy-to-use data), using those data for geodesy, especially in combination with ESA GOCE mission data, is still somewhat difficult. ESA and CNES therefore had the Basic Radar Altimetry Toolbox developed (as well as, on the ESA side, the GOCE User Toolbox, the two being linked). The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL - as processing/extraction routines, through the on-line command mode - as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. The toolbox has been available since April 2007 and has been demonstrated during training courses and scientific meetings; about 1,200 people had downloaded it as of summer 2010, with many newcomers to altimetry among them. Users' feedback, developments in altimetry, and practice showed that interesting new features could be added. Some have been added and/or improved in version 2; others are ongoing, and some are under discussion. Examples and data use cases on geodesy will be presented. BRAT is developed under contract with ESA and CNES.

  19. Sentinel-3 SAR Altimetry Toolbox - Scientific Exploitation of Operational Missions (SEOM) Program Element

    NASA Astrophysics Data System (ADS)

    Benveniste, Jérôme; Lucas, Bruno; Dinardo, Salvatore

    2014-05-01

    The prime objective of the SEOM (Scientific Exploitation of Operational Missions) element is to federate, support and expand the large international research community that the ERS, ENVISAT and the Envelope programmes have built up over the last 20 years for the future European operational Earth Observation missions, the Sentinels. Sentinel-3 builds directly on a proven heritage pioneered by ERS-1, ERS-2, Envisat and CryoSat-2, with a dual-frequency (Ku and C band) advanced Synthetic Aperture Radar Altimeter (SRAL) that provides measurements at a resolution of ~300 m in SAR mode along track. Sentinel-3 will provide exact measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The first of the Sentinel-3 series is planned for launch in early 2015. The current universal altimetry toolbox is BRAT (Basic Radar Altimetry Toolbox), which can read all previous and current altimetry missions' data but does not have the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA will endeavour to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales, the French Space Agency), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats; BratGUI is the front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT involving combinations of data fields, which the user can save for later reuse, or using the already embedded formulas, which include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial contains a thorough introduction to altimetry, showing its applications in different fields such as oceanography, the cryosphere, geodesy and hydrology, among others. Also included are "use cases", with step-by-step examples of how to use the toolbox in different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version. While developing the Sentinel-3 SAR Altimetry Toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry in order to train users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As with any open-source framework, contributions from users who have developed their own functions are welcome. The ITT is expected to be launched in Q1 2014, with the first version available before the launch of Sentinel-3.

  20. Sentinel-3 SAR Altimetry Toolbox - Scientific Exploitation of Operational Missions (SEOM) Program Element

    NASA Astrophysics Data System (ADS)

    Benveniste, Jérôme; Dinardo, Salvatore; Lucas, Bruno Manuel

    The prime objective of the SEOM (Scientific Exploitation of Operational Missions) element is to federate, support and expand the large international research community that the ERS, ENVISAT and the Envelope programmes have built up over the last 20 years for the future European operational Earth Observation missions, the Sentinels. Sentinel-3 builds directly on a proven heritage pioneered by ERS-1, ERS-2, Envisat and CryoSat-2, with a dual-frequency (Ku and C band) advanced Synthetic Aperture Radar Altimeter (SRAL) that provides measurements at a resolution of ~300 m in SAR mode along track. Sentinel-3 will provide exact measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The first of the Sentinel-3 series is planned for launch in early 2015. The current universal altimetry toolbox is BRAT (Basic Radar Altimetry Toolbox), which can read all previous and current altimetry missions' data but does not have the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA will endeavour to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales, the French Space Agency), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats; BratGUI is the front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT involving combinations of data fields, which the user can save for later reuse, or using the already embedded formulas, which include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial contains a thorough introduction to altimetry, showing its applications in different fields such as oceanography, the cryosphere, geodesy and hydrology, among others. Also included are "use cases", with step-by-step examples of how to use the toolbox in different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version. While developing the Sentinel-3 SAR Altimetry Toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry in order to train users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As with any open-source framework, contributions from users who have developed their own functions are welcome. The ITT is expected to be launched in Q1 2014, with the first version available before the launch of Sentinel-3.

  1. Sentinel-3 SAR Altimetry Toolbox

    NASA Astrophysics Data System (ADS)

    Benveniste, Jerome; Lucas, Bruno; DInardo, Salvatore

    2015-04-01

    The prime objective of the SEOM (Scientific Exploitation of Operational Missions) element is to federate, support and expand the large international research community that the ERS, ENVISAT and the Envelope programmes have built up over the last 20 years for the future European operational Earth Observation missions, the Sentinels. Sentinel-3 builds directly on the proven heritage of ERS-2, Envisat and CryoSat-2, with a dual-frequency (Ku and C band) advanced Synthetic Aperture Radar Altimeter (SRAL) that provides measurements at a resolution of ~300 m in SAR mode along track. Sentinel-3 will provide exact measurements of sea-surface height along with accurate topography measurements over sea ice, ice sheets, rivers and lakes. The first of the two Sentinel-3 satellites is expected to be launched in early 2015. The current universal altimetry toolbox is BRAT (Basic Radar Altimetry Toolbox), which can read all previous and current altimetry missions' data but does not have the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA will endeavour to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats; BratGUI is the front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with Matlab/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as netCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT involving combinations of data fields, which the user can save for later reuse, or using the already embedded formulas, which include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial contains a thorough introduction to altimetry, showing its applications in different fields such as oceanography, the cryosphere, geodesy and hydrology, among others. Also included are "use cases", with step-by-step examples of how to use the toolbox in different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version. While developing the Sentinel-3 SAR Altimetry Toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry in order to train users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As with any open-source framework, contributions from users who have developed their own functions are welcome. The kick-off is expected to happen in Q1 2015, with the first version available before the launch of Sentinel-3.

  2. OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.

    PubMed

    Purvis, Lucian A B; Clarke, William T; Biasiolli, Luca; Valkovič, Ladislav; Robson, Matthew D; Rodgers, Christopher T

    2017-01-01

    In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or for specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.

  3. Toolboxes for cyanobacteria: Recent advances and future direction.

    PubMed

    Sun, Tao; Li, Shubin; Song, Xinyu; Diao, Jinjin; Chen, Lei; Zhang, Weiwen

    2018-05-03

    Photosynthetic cyanobacteria are important primary producers and model organisms for studying photosynthesis and element cycling on earth. Due to their ability to absorb sunlight and utilize carbon dioxide, cyanobacteria have also been proposed as renewable chassis for carbon-neutral "microbial cell factories". Recent progress in cyanobacterial synthetic biology has led to the successful production of more than two dozen fuels and fine chemicals directly from CO2, demonstrating their potential for scale-up application in the future. However, compared with popular heterotrophic chassis like Escherichia coli and Saccharomyces cerevisiae, where abundant genetic tools are available for manipulations at levels from single gene and pathway to whole genome, only limited genetic tools are accessible for cyanobacteria. Consequently, this significant technical hurdle restricts both basic biological research and the further development and application of these renewable systems. Though still lagging behind the heterotrophic chassis, the vital roles of genetic tools in the tuning of gene expression, carbon flux redirection, and genome-wide manipulation have been increasingly recognized in cyanobacteria. In recent years, significant progress has been made in developing and introducing new and efficient genetic tools for cyanobacteria, including promoters, riboswitches, ribosome binding site engineering, clustered regularly interspaced short palindromic repeats/CRISPR-associated nuclease (CRISPR/Cas) systems, small RNA regulatory tools and genome-scale modeling strategies. In this review, we critically summarize recent advances in the development and application as well as the technical limitations and future directions of genetic tools in cyanobacteria. In addition, toolboxes feasible for use in large-scale cultivation are also briefly discussed. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. The Biopsychology-Toolbox: a free, open-source Matlab-toolbox for the control of behavioral experiments.

    PubMed

    Rose, Jonas; Otto, Tobias; Dittrich, Lars

    2008-10-30

    The Biopsychology-Toolbox is a free, open-source Matlab-toolbox for the control of behavioral experiments. The major aim of the project was to provide a set of basic tools that allow programming novices to control basic hardware used for behavioral experimentation without limiting the power and flexibility of the underlying programming language. The modular design of the toolbox allows porting of parts as well as entire paradigms between different types of hardware. In addition to the toolbox, this project offers a platform for the exchange of functions, hardware solutions and complete behavioral paradigms.

  5. A MATLAB toolbox for the efficient estimation of the psychometric function using the updated maximum-likelihood adaptive procedure

    PubMed Central

    Richards, V. M.; Dai, W.

    2014-01-01

    A MATLAB toolbox for the efficient estimation of the threshold, slope, and lapse rate of the psychometric function is described. The toolbox enables the efficient implementation of the updated maximum-likelihood (UML) procedure. The toolbox uses an object-oriented architecture for organizing the experimental variables and computational algorithms, which provides experimenters with flexibility in experimental design and data management. Descriptions of the UML procedure and the UML Toolbox are provided, followed by examples of toolbox use. Finally, guidelines and recommendations for parameter configurations are given. PMID:24671826

  6. Nanotechnology - An emerging technology

    USGS Publications Warehouse

    Buckingham, D.

    2007-01-01

    The science of nanotechnology is still in its infancy. However, progress is being made in research and development of potential beneficial properties of nanomaterials that could play an integral part in the development of new and changing uses for mineral commodities. Nanotechnology is a kind of toolbox that allows industry to make nanomaterials and nanostructures with special properties. New nanotechnology applications of mineral commodities in their nanoscale form are being discovered, researched and developed. At the same time, there is continued research into environmental, human health and safety concerns that inherently arise from the development of a new technology. Except for a few nanomaterials (CNTs, copper, silver and zinc oxide), widespread applications are hampered by processing and suitable commercial-scale production techniques, high manufacturing costs, product price, and environmental, human health, and safety concerns. Whether nanotechnology causes a tidal wave of change or is a long-term evolutionary process of technology, new applications of familiar mineral commodities will be created. As research and development continues, the ability to manipulate matter at the nanoscale into increasingly sophisticated nanomaterials will improve and open up new possibilities for industry that will change the flow and use of mineral commodities and the materials and products that are used.

  7. Multimodal Imaging of Brain Connectivity Using the MIBCA Toolbox: Preliminary Application to Alzheimer's Disease

    NASA Astrophysics Data System (ADS)

    Ribeiro, André Santos; Lacerda, Luís Miguel; Silva, Nuno André da; Ferreira, Hugo Alexandre

    2015-06-01

    The Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox is a fully automated all-in-one connectivity analysis toolbox that offers pre-processing, connectivity, and graph theory analysis of multimodal images such as anatomical, diffusion, and functional MRI, and PET. In this work, the MIBCA functionalities were used to study Alzheimer's Disease (AD) in a multimodal MR/PET approach. Materials and Methods: Data from 12 healthy controls, and 36 patients with EMCI, LMCI and AD (12 patients for each group) were obtained from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database (adni.loni.usc.edu), including T1-weighted (T1-w), Diffusion Tensor Imaging (DTI) data, and 18F-AV-45 (florbetapir) dynamic PET data from 40-60 min post injection (4x5 min). Both MR and PET data were automatically pre-processed for all subjects using MIBCA. T1-w data was parcellated into cortical and subcortical regions-of-interest (ROIs), and the corresponding thicknesses and volumes were calculated. DTI data was used to compute structural connectivity matrices based on fibers connecting pairs of ROIs. Lastly, dynamic PET images were summed, and the relative Standard Uptake Values calculated for each ROI. Results: An overall higher uptake of 18F-AV-45, consistent with an increased deposition of beta-amyloid, was observed for the AD group. Additionally, patients showed significant cortical atrophy (thickness and volume) especially in the entorhinal cortex and temporal areas, and a significant increase in Mean Diffusivity (MD) in the hippocampus, amygdala and temporal areas. Furthermore, patients showed a reduction of fiber connectivity with the progression of the disease, especially for intra-hemispheric connections. Conclusion: This work shows the potential of the MIBCA toolbox for the study of AD, as findings were shown to be in agreement with the literature. Here, only structural changes and beta-amyloid accumulation were considered. Yet, MIBCA is further able to process fMRI and different radiotracers, thus leading to integration of functional information, and supporting the research for new multimodal biomarkers for AD and other neurodegenerative diseases.
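
    As a rough sketch of the PET step described above (summing dynamic frames and computing relative uptake per parcellation region), the following MATLAB fragment uses toy volumes. It is not MIBCA code; the parcellation labels and the choice of reference region are purely illustrative assumptions.

        % Region-of-interest uptake summary in the spirit of the PET step above: sum
        % dynamic PET frames, average within each parcellation label, and normalise to
        % a reference region. Toy volumes only; reference region (label 1) is arbitrary.
        rng(5);
        dims   = [16 16 8];  nFrames = 4;
        pet4d  = rand([dims, nFrames]);              % dynamic PET frames (e.g., 4 x 5 min)
        labels = randi([0 10], dims);                % parcellation: 0 = background, 1..10 = ROIs
        petSum = sum(pet4d, 4);                      % summed (static) PET image
        mask   = labels > 0;
        roiMean = accumarray(labels(mask), petSum(mask), [10 1], @mean);
        suvr    = roiMean / roiMean(1);              % relative SUV, normalised to ROI 1
        disp([ (1:10)', suvr ])                      % ROI label and its relative uptake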

  8. Wave data processing toolbox manual

    USGS Publications Warehouse

    Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul

    2006-01-01

    Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP(tm)), the Sontek Argonaut, and the Nortek Aquadopp(tm) Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive and further analysis of the data becomes cumbersome. More important, these files alone do not include sufficient information pertinent to that deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox aggregates the processed files output from the proprietary software into two NetCDF files: one file contains the statistics of the burst data and the other file contains the raw burst data (additional details described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format is easy to disseminate, is portable to any computer platform, and is viewable with public-domain, freely available software. Another important advantage is that a metadata structure is embedded with the data to document pertinent information regarding the deployment and the parameters used to process the data. Using this format ensures that the relevant information about how the data was collected and converted to physical units is maintained with the actual data. EPIC-standard variable names have been utilized where appropriate. These standards, developed by the NOAA Pacific Marine Environmental Laboratory (PMEL) (http://www.pmel.noaa.gov/epic/), provide a universal vernacular allowing researchers to share data without translation.
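
    To illustrate the general approach of storing processed data together with deployment metadata in NetCDF, the sketch below uses MATLAB's built-in high-level NetCDF functions. The variable name, attribute names, and metadata values are assumptions made for the example; the actual toolbox follows EPIC conventions and its own file schema.

        % Writing wave statistics plus deployment metadata to a NetCDF file with
        % MATLAB's built-in high-level NetCDF functions. Variable and attribute names
        % here are illustrative only, not the toolbox's actual schema.
        fname = 'wave_stats_example.nc';
        if exist(fname, 'file'), delete(fname); end
        time = (0:23)';                       % hours since deployment (toy burst times)
        Hs   = 0.5 + 0.2 * rand(24, 1);       % significant wave height per burst (m)
        nccreate(fname, 'time', 'Dimensions', {'time', numel(time)});
        nccreate(fname, 'wh_4061', 'Dimensions', {'time', numel(time)});   % EPIC-style name (assumed)
        ncwrite(fname, 'time', time);
        ncwrite(fname, 'wh_4061', Hs);
        ncwriteatt(fname, 'wh_4061', 'long_name', 'Significant wave height');
        ncwriteatt(fname, 'wh_4061', 'units', 'm');
        ncwriteatt(fname, '/', 'instrument', 'RDI ADCP (example)');        % global metadata
        ncwriteatt(fname, '/', 'deployment_site', 'example nearshore tripod');
        ncdisp(fname)                         % confirm structure and embedded metadata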

  9. 40 CFR 141.720 - Inactivation toolbox components.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... between the indicated values: Log credit = (0.0397 × (1.09757)^Temp) × CT. (c) Site-specific study. The... this section on a site-specific basis. The State must base this approval on a site-specific study a... in this table are applicable only to post-filter applications of UV in filtered systems and to...
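
    For reference, the log-credit interpolation formula quoted in the excerpt above can be evaluated directly; the temperature and CT values in this minimal sketch are arbitrary example numbers, not values taken from the regulation.

        % Evaluating the interpolation formula quoted in the regulation excerpt above:
        % Log credit = (0.0397 * 1.09757^Temp) * CT. Input values are illustrative only.
        Temp = 15;                 % water temperature in degrees C (example value)
        CT   = 12;                 % concentration x contact time (mg*min/L, example value)
        logCredit = (0.0397 * 1.09757^Temp) * CT;
        fprintf('Log credit at %g degC and CT = %g: %.2f\n', Temp, CT, logCredit);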

  10. 40 CFR 141.720 - Inactivation toolbox components.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... between the indicated values: Log credit = (0.0397 × (1.09757)^Temp) × CT. (c) Site-specific study. The... this section on a site-specific basis. The State must base this approval on a site-specific study a... in this table are applicable only to post-filter applications of UV in filtered systems and to...

  11. 40 CFR 141.720 - Inactivation toolbox components.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... between the indicated values: Log credit = (0.0397 × (1.09757)^Temp) × CT. (c) Site-specific study. The... this section on a site-specific basis. The State must base this approval on a site-specific study a... in this table are applicable only to post-filter applications of UV in filtered systems and to...

  12. 40 CFR 141.720 - Inactivation toolbox components.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... between the indicated values: Log credit = (0.0397 × (1.09757)^Temp) × CT. (c) Site-specific study. The... this section on a site-specific basis. The State must base this approval on a site-specific study a... in this table are applicable only to post-filter applications of UV in filtered systems and to...

  13. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    PubMed

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.

  14. TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy

    PubMed Central

    2011-01-01

    Background Transfer entropy (TE) is a measure for the detection of directed interactions. Transfer entropy is an information theoretic implementation of Wiener's principle of observational causality. It offers an approach to the detection of neuronal interactions that is free of an explicit model of the interactions. Hence, it offers the power to analyze linear and nonlinear interactions alike. This allows for example the comprehensive analysis of directed interactions in neural networks at various levels of description. Here we present the open-source MATLAB toolbox TRENTOOL that allows the user to handle the considerable complexity of this measure and to validate the obtained results using non-parametrical statistical testing. We demonstrate the use of the toolbox and the performance of the algorithm on simulated data with nonlinear (quadratic) coupling and on local field potentials (LFP) recorded from the retina and the optic tectum of the turtle (Pseudemys scripta elegans) where a neuronal one-way connection is likely present. Results In simulated data TE detected information flow in the simulated direction reliably with false positives not exceeding the rates expected under the null hypothesis. In the LFP data we found directed interactions from the retina to the tectum, despite the complicated signal transformations between these stages. No false positive interactions in the reverse directions were detected. Conclusions TRENTOOL is an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information theoretic measure. TRENTOOL is implemented as a MATLAB toolbox and available under an open source license (GPL v3). For the use with neural data TRENTOOL seamlessly integrates with the popular FieldTrip toolbox. PMID:22098775
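
    To make the quantity concrete, the sketch below computes a plug-in (histogram) estimate of transfer entropy TE(X -> Y) for two discretized toy series in plain MATLAB. TRENTOOL itself works on continuous neural data using delay embedding, nearest-neighbour (Kraskov-type) estimators, and permutation statistics, so this is only a conceptual illustration; all variable names and the noise level are assumptions.

        % Plug-in (histogram) estimate of transfer entropy TE(X -> Y) for discretized
        % series, illustrating the quantity TRENTOOL estimates. Toy example only; not
        % TRENTOOL's embedding- and nearest-neighbour-based implementation.
        rng(6);
        N = 5000;
        x = double(rand(N,1) > 0.5);                  % driver: random binary sequence
        y = [0; x(1:end-1)];                          % target copies x with one-sample lag
        y = double(xor(y, rand(N,1) < 0.1));          % plus 10% noise
        % joint counts over (y_{t+1}, y_t, x_t), states coded as 1 or 2
        s = @(v) v + 1;
        counts = accumarray([s(y(2:end)), s(y(1:end-1)), s(x(1:end-1))], 1, [2 2 2]);
        P = counts / sum(counts(:));                  % p(y+, y, x)
        Pyx  = sum(P, 1);                             % p(y, x)   (summed over y+)
        Py   = sum(Pyx, 3);                           % p(y)
        Pyy  = sum(P, 3);                             % p(y+, y)
        TE = 0;
        for a = 1:2, for b = 1:2, for c = 1:2
            if P(a,b,c) > 0
                condFull = P(a,b,c) / Pyx(1,b,c);     % p(y+ | y, x)
                condRed  = Pyy(a,b)  / Py(1,b);       % p(y+ | y)
                TE = TE + P(a,b,c) * log2(condFull / condRed);
            end
        end, end, end
        fprintf('Estimated TE(X -> Y) = %.3f bits\n', TE);   % should be clearly > 0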

  15. Development of a CRISPR/Cas9 genome editing toolbox for Corynebacterium glutamicum.

    PubMed

    Liu, Jiao; Wang, Yu; Lu, Yujiao; Zheng, Ping; Sun, Jibin; Ma, Yanhe

    2017-11-16

    Corynebacterium glutamicum is an important industrial workhorse and advanced genetic engineering tools are urgently demanded. Recently, the clustered regularly interspaced short palindromic repeats (CRISPR) and their CRISPR-associated proteins (Cas) have revolutionized the field of genome engineering. The CRISPR/Cas9 system that utilizes NGG as protospacer adjacent motif (PAM) and has good targeting specificity can be developed into a powerful tool for efficient and precise genome editing of C. glutamicum. Herein, we developed a versatile CRISPR/Cas9 genome editing toolbox for C. glutamicum. Cas9 and gRNA expression cassettes were reconstituted to combat Cas9 toxicity and facilitate effective termination of gRNA transcription. Co-transformation of Cas9 and gRNA expression plasmids was exploited to overcome high-frequency mutation of cas9, allowing not only highly efficient gene deletion and insertion with plasmid-borne editing templates (efficiencies up to 60.0 and 62.5%, respectively) but also simple and time-saving operation. Furthermore, CRISPR/Cas9-mediated ssDNA recombineering was developed to precisely introduce small modifications and single-nucleotide changes into the genome of C. glutamicum with efficiencies over 80.0%. Notably, double-locus editing was also achieved in C. glutamicum. This toolbox works well in several C. glutamicum strains including the widely-used strains ATCC 13032 and ATCC 13869. In this study, we developed a CRISPR/Cas9 toolbox that could facilitate markerless gene deletion, gene insertion, precise base editing, and double-locus editing in C. glutamicum. The CRISPR/Cas9 toolbox holds promise for accelerating the engineering of C. glutamicum and advancing its application in the production of biochemicals and biofuels.

  16. TRENTOOL: a Matlab open source toolbox to analyse information flow in time series data with transfer entropy.

    PubMed

    Lindner, Michael; Vicente, Raul; Priesemann, Viola; Wibral, Michael

    2011-11-18

    Transfer entropy (TE) is a measure for the detection of directed interactions. Transfer entropy is an information theoretic implementation of Wiener's principle of observational causality. It offers an approach to the detection of neuronal interactions that is free of an explicit model of the interactions. Hence, it offers the power to analyze linear and nonlinear interactions alike. This allows for example the comprehensive analysis of directed interactions in neural networks at various levels of description. Here we present the open-source MATLAB toolbox TRENTOOL that allows the user to handle the considerable complexity of this measure and to validate the obtained results using non-parametrical statistical testing. We demonstrate the use of the toolbox and the performance of the algorithm on simulated data with nonlinear (quadratic) coupling and on local field potentials (LFP) recorded from the retina and the optic tectum of the turtle (Pseudemys scripta elegans) where a neuronal one-way connection is likely present. In simulated data TE detected information flow in the simulated direction reliably with false positives not exceeding the rates expected under the null hypothesis. In the LFP data we found directed interactions from the retina to the tectum, despite the complicated signal transformations between these stages. No false positive interactions in the reverse directions were detected. TRENTOOL is an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information theoretic measure. TRENTOOL is implemented as a MATLAB toolbox and available under an open source license (GPL v3). For the use with neural data TRENTOOL seamlessly integrates with the popular FieldTrip toolbox.

  17. Predictive Mining of Time Series Data

    NASA Astrophysics Data System (ADS)

    Java, A.; Perlman, E. S.

    2002-05-01

    All-sky monitors are a relatively new development in astronomy, and their data represent a largely untapped resource. Proper utilization of this resource could lead to important discoveries not only in the physics of variable objects, but in how one observes such objects. We discuss the development of a Java toolbox for astronomical time series data. Rather than using methods conventional in astronomy (e.g., power spectrum and cross-correlation analysis) we employ rule discovery techniques commonly used in analyzing stock-market data. By clustering patterns found within the data, rule discovery allows one to build predictive models, allowing one to forecast when a given event might occur or whether the occurrence of one event will trigger a second. We have tested the toolbox and accompanying display tool on datasets (representing several classes of objects) from the RXTE All Sky Monitor. We use these datasets to illustrate the methods and functionality of the toolbox. We have found predictive patterns in several ASM datasets. We also discuss problems faced in the development process, particularly the difficulties of dealing with discretized and irregularly sampled data. A possible application would be in scheduling target of opportunity observations where the astronomer wants to observe an object when a certain event or series of events occurs. By combining such a toolbox with an automatic, Java query tool which regularly gathers data on objects of interest, the astronomer or telescope operator could use the real-time datastream to efficiently predict the occurrence of (for example) a flare or other event. By combining the toolbox with dynamic time warping data-mining tools, one could predict events which may happen on variable time scales.

  18. Avanti lipid tools: connecting lipids, technology, and cell biology.

    PubMed

    Sims, Kacee H; Tytler, Ewan M; Tipton, John; Hill, Kasey L; Burgess, Stephen W; Shaw, Walter A

    2014-08-01

    Lipid research is challenging owing to the complexity and diversity of the lipidome. Here we review a set of experimental tools developed for the seasoned lipid researcher, as well as those who are new to the field of lipid research. Novel tools for probing protein-lipid interactions, applications for lipid binding antibodies, enhanced systems for the cellular delivery of lipids, improved visualization of lipid membranes using gold-labeled lipids, and advances in mass spectrometric analysis techniques will be discussed. Because lipid mediators are known to participate in a host of signal transduction and trafficking pathways within the cell, a comprehensive lipid toolbox that aids the science of lipidomics research is essential to better understand the molecular mechanisms of interactions between cellular components. This article is part of a Special Issue entitled Tools to study lipid functions. Copyright © 2014. Published by Elsevier B.V.

  19. Expanding the seat belt program strategies toolbox: a starter kit for trying new program ideas.

    DOT National Transportation Integrated Search

    2016-10-01

    The purpose of this research was to explore alternative strategies for increasing seat belt use. Researchers examined behavior change strategies proven effective in education, healthcare, advertising, and marketing, and they considered how these ...

  20. Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox

    PubMed Central

    Spee, Ton; Gillen, Matt; Lentz, Thomas J.; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-01-01

    Objectives This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Methods Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. Results This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. Conclusion The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions. PMID:22953194

  1. Review of qualitative approaches for the construction industry: designing a risk management toolbox.

    PubMed

    Zalk, David M; Spee, Ton; Gillen, Matt; Lentz, Thomas J; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-06-01

    This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions.

  2. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelt, Daniël M.; Gürsoy, Doğa; Palenstijn, Willem Jan

    2016-04-28

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy's standard reconstruction method.

  3. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    PubMed Central

    Pelt, Daniël M.; Gürsoy, Doǧa; Palenstijn, Willem Jan; Sijbers, Jan; De Carlo, Francesco; Batenburg, Kees Joost

    2016-01-01

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy’s standard reconstruction method. PMID:27140167

  4. Arc_Mat: a Matlab-based spatial data analysis toolbox

    NASA Astrophysics Data System (ADS)

    Liu, Xingjian; Lesage, James

    2010-03-01

    This article presents an overview of Arc_Mat, a Matlab-based spatial data analysis software package whose source code has been placed in the public domain. An earlier version of the Arc_Mat toolbox was developed to extract map polygon and database information from ESRI shapefiles and provide high quality mapping in the Matlab software environment. We discuss revisions to the toolbox that: utilize enhanced computing and graphing capabilities of more recent versions of Matlab, restructure the toolbox with object-oriented programming features, and provide more comprehensive functions for spatial data analysis. The Arc_Mat toolbox functionality includes basic choropleth mapping; exploratory spatial data analysis that provides exploratory views of spatial data through various graphs, for example, histogram, Moran scatterplot, three-dimensional scatterplot, density distribution plot, and parallel coordinate plots; and more formal spatial data modeling that draws on the extensive Spatial Econometrics Toolbox functions. A brief review of the design aspects of the revised Arc_Mat is described, and we provide some illustrative examples that highlight representative uses of the toolbox. Finally, we discuss programming with and customizing the Arc_Mat toolbox functionalities.

  5. ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis

    PubMed Central

    Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas

    2016-01-01

    Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons are dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we here present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/. PMID:26882475

  6. The CatchMod toolbox: easy and guided access to ICT tools for Water Framework Directive implementation.

    PubMed

    van Griensven, A; Vanrolleghem, P A

    2006-01-01

    Web-based toolboxes are handy tools to inform experienced users of existing software in their disciplines. However, for the implementation of the Water Framework Directive, a much more diverse public (water managers, consultancy firms, scientists, etc.) will ask for a very wide diversity of Information and Communication Technology (ICT) tools. It is obvious that the users of a web-based ICT-toolbox providing all this will not be experts in all of the disciplines and that a toolbox for ICT tools for Water Framework Directive implementation should thus go beyond just making interesting web-links. To deal with this issue, expert knowledge is brought to the users through the incorporation of visitor-geared guidance (materials) in the Harmoni-CA toolbox. Small workshops of expert teams were organized to deliver documents explaining why the tools are important, when they are required and what activity they support/perform, as well as a categorization of the multitude of available tools. An integration of this information in the web-based toolbox helps the users to browse through a toolbox containing tools, reports, guidance documents and interesting links. The Harmoni-CA toolbox thus provides not only a virtual toolbox, but incorporates a virtual expert as well.

  7. The FieldTrip-SimBio pipeline for EEG forward solutions.

    PubMed

    Vorwerk, Johannes; Oostenveld, Robert; Piastra, Maria Carla; Magyari, Lilla; Wolters, Carsten H

    2018-03-27

    Accurately solving the electroencephalography (EEG) forward problem is crucial for precise EEG source analysis. Previous studies have shown that the use of multicompartment head models in combination with the finite element method (FEM) can yield high accuracies both numerically and with regard to the geometrical approximation of the human head. However, the workload for the generation of multicompartment head models has often been too high and the use of publicly available FEM implementations too complicated for a wider application of FEM in research studies. In this paper, we present a MATLAB-based pipeline that aims to resolve this lack of easy-to-use integrated software solutions. The presented pipeline allows for the easy application of five-compartment head models with the FEM within the FieldTrip toolbox for EEG source analysis. The FEM from the SimBio toolbox, more specifically the St. Venant approach, was integrated into the FieldTrip toolbox. We give a short sketch of the implementation and its application, and we perform a source localization of somatosensory evoked potentials (SEPs) using this pipeline. We then evaluate the accuracy that can be achieved using the automatically generated five-compartment hexahedral head model [skin, skull, cerebrospinal fluid (CSF), gray matter, white matter] in comparison to a highly accurate tetrahedral head model that was generated on the basis of a semiautomatic segmentation with very careful and time-consuming manual corrections. The source analysis of the SEP data correctly localizes the P20 component and achieves a high goodness of fit. The subsequent comparison to the highly detailed tetrahedral head model shows that the automatically generated five-compartment head model performs about as well as a highly detailed four-compartment head model (skin, skull, CSF, brain). This is a significant improvement in comparison to a three-compartment head model, which is frequently used in praxis, since the importance of modeling the CSF compartment has been shown in a variety of studies. The presented pipeline facilitates the use of five-compartment head models with the FEM for EEG source analysis. The accuracy with which the EEG forward problem can thereby be solved is increased compared to the commonly used three-compartment head models, and more reliable EEG source reconstruction results can be obtained.

  8. Agricultural Conservation Planning Toolbox User's Manual

    USDA-ARS?s Scientific Manuscript database

    Agricultural Conservation Planning Framework (ACPF) comprises an approach for applying concepts of precision conservation to watershed planning in agricultural landscapes. To enable application of this approach, USDA/ARS has developed a set of Geographic Information System (GIS) based software tools...

  9. Basic Radar Altimetry Toolbox: Tools and Tutorial To Use Radar Altimetry For Cryosphere

    NASA Astrophysics Data System (ADS)

    Benveniste, J. J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.

    2010-12-01

    Radar altimetry is a technique whose applications keep expanding. While considerable effort has been made for oceanography users (including easy-to-use data), the use of those data for cryosphere applications, especially with the new ESA CryoSat-2 mission data, is still somewhat tedious, especially for users new to altimetry data products. ESA and CNES therefore had the Basic Radar Altimetry Toolbox developed a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL, - as processing/extraction routines, through the on-line command mode, - and as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. The Toolbox has been available since April 2007 and has been demonstrated during training courses and scientific meetings. About 1200 people had downloaded it by summer 2010, with many "newcomers" to altimetry among them, including teachers and professors. User feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in version 2; others are under development, and some are in discussion for the future. Data use cases on cryosphere applications will be presented. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  10. GridPV Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broderick, Robert; Quiroz, Jimmy; Grijalva, Santiago

    2014-07-15

    Matlab Toolbox for simulating the impact of solar energy on the distribution grid. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving GridPV Toolbox information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulations functions are included to show potential uses of the toolbox functions.

  11. ESA BRAT (Broadview Radar Altimetry Toolbox) and GUT (GOCE User Toolbox) toolboxes

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Ambrozio, A.; Restano, M.

    2016-12-01

    The Broadview Radar Altimetry Toolbox (BRAT) is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including the upcoming Sentinel-3A L1 and L2 products. A tutorial is included providing plenty of use cases. BRAT's future release (4.0.0) is planned for September 2016. Based on the community feedback, the frontend has been further improved and simplified whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain desired data bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving the combination of data fields, that can be saved for future uses, either by using embedded formulas including those from oceanographic altimetry, or by implementing ad-hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. The GOCE User Toolbox (GUT) is a compilation of tools for the use and the analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data and allows gravity field data, in conjunction and consistently with any other auxiliary data set, to be pre-processed by beginners in gravity field processing, for oceanographic and hydrologic as well as for solid earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during GRACE and GOCE missions. In the current 3.0 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements aiming at facilitating the use of gradients, the anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies have been introduced. Packaged with GUT is also GUT's VCM (Variance-Covariance Matrix) tool for analysing GOCE's variance-covariance matrices. BRAT and GUT toolboxes can be freely downloaded, along with ancillary material, at https://earth.esa.int/brat and https://earth.esa.int/gut.

  12. Introduction to TAFI - A Matlab® toolbox for analysis of flexural isostasy

    NASA Astrophysics Data System (ADS)

    Jha, S.; Harry, D. L.; Schutt, D.

    2016-12-01

    The isostatic response to vertical tectonic loads emplaced on thin elastic plates overlying an inviscid substrate, and the corresponding gravity anomalies, are commonly modeled using well-established theories and methodologies of flexural analysis. However, such analysis requires some mathematical and coding expertise on the part of users. With that in mind, we designed a new interactive Matlab® toolbox called the Toolbox for Analysis of Flexural Isostasy (TAFI). TAFI allows users to create forward models (2-D and 3-D) of flexural deformation of the lithosphere and the resulting gravity anomaly. TAFI computes Green's functions for flexure of an elastic plate subjected to point or line loads, and the analytical solution for harmonic loads. Flexure due to non-impulsive, distributed 2-D or 3-D loads is computed by convolving the appropriate Green's function with a user-supplied, spatially discretized load function. The gravity anomaly associated with each density interface is calculated by taking the Fourier transform of the flexural deflection of these interfaces and estimating the gravity in the wavenumber domain. All models created in TAFI are based on Matlab's intrinsic functions and do not require any specialized toolbox, function or library except those distributed with TAFI. Modeling functions within TAFI can be called from the Matlab workspace, from within user-written programs, or from TAFI's graphical user interface (GUI). The GUI enables the user to model the flexural deflection of the lithosphere interactively, allowing real-time comparison of model fit with observed data constraining the flexural deformation and gravity, and facilitating a rapid search for the best-fitting flexural model. TAFI is a very useful teaching and research tool and has been tested rigorously in graduate-level teaching and basic research environments.
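
    TAFI itself is a Matlab toolbox; purely as an illustrative worked example of the kind of Green's-function flexure calculation described above, the short Python/numpy sketch below evaluates the standard textbook solution for deflection of an unbroken elastic plate under a line load (parameter values are illustrative assumptions, not TAFI defaults, and the convolution and gravity steps are omitted):

      import numpy as np

      # Standard thin-elastic-plate parameters (illustrative values, not TAFI defaults).
      E, nu, Te = 70e9, 0.25, 25e3           # Young's modulus (Pa), Poisson ratio, elastic thickness (m)
      g = 9.81
      rho_m, rho_infill = 3300.0, 2700.0     # mantle and basin-infill densities (kg/m^3)
      V0 = 1.0e12                            # line-load magnitude (N per metre along strike)

      D = E * Te**3 / (12.0 * (1.0 - nu**2))                  # flexural rigidity
      alpha = (4.0 * D / ((rho_m - rho_infill) * g))**0.25    # flexural parameter

      # Analytical deflection of an unbroken plate under a line load at x = 0
      # (textbook result; distributed-load models convolve such Green's functions
      # with a discretized load, which is not reproduced here).
      x = np.linspace(-500e3, 500e3, 1001)
      xa = np.abs(x) / alpha
      w = (V0 * alpha**3 / (8.0 * D)) * np.exp(-xa) * (np.cos(xa) + np.sin(xa))

      print(f"flexural parameter alpha = {alpha/1e3:.1f} km, max deflection = {w.max():.1f} m")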

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apte, A; Veeraraghavan, H; Oh, J

    Purpose: To present an open-source and free platform to facilitate radiomics research — the "Radiomics toolbox" in CERR. Method: There is a scarcity of open-source tools that support end-to-end modeling of image features to predict patient outcomes. The "Radiomics toolbox" strives to fill the need for such a software platform. The platform supports (1) import of various image modalities such as CT, PET, MR, SPECT, and US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image-based features such as first-order statistics, gray-level co-occurrence and zone-size matrix based texture features, and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and the data management are implemented in Matlab for ease of development and readability of code for a wide audience. Open-source software developed in other programming languages is integrated to enhance various components of this toolbox, for example, Java-based DCM4CHE for DICOM import and R for statistical analysis. Results: The Radiomics toolbox will be distributed as open-source, GNU-licensed software. The toolbox was prototyped for modeling an oropharyngeal PET dataset at MSKCC; the analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the "Computational Environment for Radiotherapy Research" to the "Computational Environment for Radiological Research".
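
    The toolbox itself is Matlab-based; purely to illustrate what a gray-level co-occurrence texture feature is, the following self-contained Python sketch computes a co-occurrence matrix and its contrast for a single pixel offset (the quantization, offset, and test image are illustrative assumptions; production radiomics codes use many offsets and feature definitions):

      import numpy as np

      def glcm_contrast(img, levels=8, offset=(0, 1)):
          # Quantize the image into `levels` gray levels.
          q = np.digitize(img, np.linspace(img.min(), img.max(), levels + 1)[1:-1])
          dr, dc = offset
          glcm = np.zeros((levels, levels), dtype=float)
          rows, cols = q.shape
          # Count co-occurring gray-level pairs at the given offset.
          for r in range(max(0, -dr), rows - max(0, dr)):
              for c in range(max(0, -dc), cols - max(0, dc)):
                  glcm[q[r, c], q[r + dr, c + dc]] += 1
          glcm /= glcm.sum()                                    # joint probability
          i, j = np.indices(glcm.shape)
          return glcm, float(np.sum(glcm * (i - j) ** 2))       # contrast feature

      rng = np.random.default_rng(0)
      image = rng.random((64, 64))                              # stand-in for an image region
      _, contrast = glcm_contrast(image)
      print(f"GLCM contrast: {contrast:.3f}")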

  14. EPA Office of Research and Development - I/I Research Information Update

    EPA Science Inventory

    The Nation’s sanitary sewer infrastructure is aging, and is currently one of the top national water program priorities. The U.S. Environmental Protection Agency (EPA) developed the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox to assist communities in devel...

  15. MOEMS Modeling Using the Geometrical Matrix Toolbox

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2005-01-01

    New technologies such as MicroOptoElectro-Mechanical Systems (MOEMS) require new modeling tools. These tools must simultaneously model the optical, electrical, and mechanical domains and the interactions between these domains. To facilitate rapid prototyping of these new technologies an optical toolbox has been developed for modeling MOEMS devices. The toolbox models are constructed using MATLAB's dynamical simulator, Simulink. Modeling toolboxes will allow users to focus their efforts on system design and analysis as opposed to developing component models. This toolbox was developed to facilitate rapid modeling and design of a MOEMS based laser ultrasonic receiver system.

  16. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave.

    PubMed

    Silva, Ikaro; Moody, George B

    The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox provides access to over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare published results based on PhysioNet's software and databases.
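
    The toolbox described here targets MATLAB/Octave; for orientation, a comparable access pattern from Python is sketched below using PhysioNet's separate wfdb Python package (an assumption of this example, not part of the MATLAB/Octave toolbox). The record and database names are just examples, and network access to PhysioNet is required:

      import wfdb

      # Fetch the first 10 seconds of MIT-BIH Arrhythmia record 100 directly from
      # PhysioNet, together with its beat annotations.
      record = wfdb.rdrecord('100', pn_dir='mitdb', sampto=3600)
      annotations = wfdb.rdann('100', 'atr', pn_dir='mitdb', sampto=3600)

      print(record.sig_name, record.fs)                     # channel names and sampling frequency
      print(len(annotations.sample), 'annotated beats in the excerpt')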

  17. A Citizen Science and Government Collaboration: Developing ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (EPA) is actively involved in supporting citizen science projects and providing communities with information and assistance for conducting their own air pollution monitoring. As part of a Regional Applied Research Effort (RARE) project, EPA's Office of Research and Development (ORD) worked collaboratively with EPA Region 2 and the Ironbound Community Corporation (ICC) in Newark, New Jersey, to develop and test the “Air Sensor Toolbox for Citizen Scientists.” In this collaboration, citizen scientists measured local gaseous and particulate air pollution levels by using a customized low-cost sensor pod designed and fabricated by EPA. This citizen science air quality measurement project provided an excellent opportunity for EPA to evaluate and improve the Toolbox resources available to communities. The Air Sensor Toolbox, developed in coordination with the ICC, can serve as a template for communities across the country to use in developing their own air pollution monitoring programs in areas where air pollution is a concern. This pilot project provided an opportunity for a highly motivated citizen science organization and the EPA to work together directly to address environmental concerns within the community. Useful lessons were learned about how to improve coordination between the government and communities and the types of tools and technologies needed for conducting an effective citizen science project that can be app

  18. Evaluation of an educational "toolbox" for improving nursing staff competence and psychosocial work environment in elderly care: results of a prospective, non-randomized controlled intervention.

    PubMed

    Arnetz, J E; Hasson, H

    2007-07-01

    Lack of professional development opportunities among nursing staff is a major concern in elderly care and has been associated with work dissatisfaction and staff turnover. There is a lack of prospective, controlled studies evaluating the effects of educational interventions on nursing competence and work satisfaction. The aim of this study was to evaluate the possible effects of an educational "toolbox" intervention on nursing staff ratings of their competence, psychosocial work environment and overall work satisfaction. The study was a prospective, non-randomized, controlled intervention. Nursing staff in two municipal elderly care organizations in western Sweden. In an initial questionnaire survey, nursing staff in the intervention municipality described several areas in which they felt a need for competence development. Measurement instruments and educational materials for improving staff knowledge and work practices were then collated by researchers and managers in a "toolbox." Nursing staff ratings of their competence and work were measured pre and post-intervention by questionnaire. Staff ratings in the intervention municipality were compared to staff ratings in the reference municipality, where no toolbox was introduced. Nursing staff ratings of their competence and psychosocial work environment, including overall work satisfaction, improved significantly over time in the intervention municipality, compared to the reference group. Both competence and work environment ratings were largely unchanged among reference municipality staff. Multivariate analysis revealed a significant interaction effect between municipalities over time for nursing staff ratings of participation, leadership, performance feedback and skills' development. Staff ratings for these four scales improved significantly in the intervention municipality as compared to the reference municipality. Compared to a reference municipality, nursing staff ratings of their competence and the psychosocial work environment improved in the municipality where the toolbox was introduced.

  19. PyMVPA: A python toolbox for multivariate pattern analysis of fMRI data.

    PubMed

    Hanke, Michael; Halchenko, Yaroslav O; Sederberg, Per B; Hanson, Stephen José; Haxby, James V; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability.

  20. PyMVPA: A Python toolbox for multivariate pattern analysis of fMRI data

    PubMed Central

    Hanke, Michael; Halchenko, Yaroslav O.; Sederberg, Per B.; Hanson, Stephen José; Haxby, James V.; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine-learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability. PMID:19184561
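
    As a minimal sketch of how a PyMVPA classification analysis is typically assembled — file names are placeholders, and the preprocessing choices are assumptions rather than prescriptions from the paper — a cross-validated linear SVM on an fMRI dataset might look like this in Python:

      import numpy as np
      from mvpa2.suite import *   # PyMVPA convenience namespace

      # Load an fMRI time series plus a per-volume attribute table (placeholder paths).
      attrs = SampleAttributes('attributes.txt')
      ds = fmri_dataset('bold.nii.gz',
                        targets=attrs.targets,
                        chunks=attrs.chunks,
                        mask='mask.nii.gz')

      # Simple preprocessing and a cross-validated linear SVM classification.
      zscore(ds, chunks_attr='chunks')
      clf = LinearCSVMC()
      cv = CrossValidation(clf, NFoldPartitioner(), errorfx=mean_mismatch_error)
      errors = cv(ds)
      print('mean cross-validated error:', np.mean(errors))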

  1. Processing methods for differential analysis of LC/MS profile data

    PubMed Central

    Katajamaa, Mikko; Orešič, Matej

    2005-01-01

    Background Liquid chromatography coupled to mass spectrometry (LC/MS) has been widely used in proteomics and metabolomics research. In this context, the technology has been increasingly used for differential profiling, i.e. broad screening of biomolecular components across multiple samples in order to elucidate the observed phenotypes and discover biomarkers. One of the major challenges in this domain remains development of better solutions for processing of LC/MS data. Results We present a software package MZmine that enables differential LC/MS analysis of metabolomics data. This software is a toolbox containing methods for all data processing stages preceding differential analysis: spectral filtering, peak detection, alignment and normalization. Specifically, we developed and implemented a new recursive peak search algorithm and a secondary peak picking method for improving already aligned results, as well as a normalization tool that uses multiple internal standards. Visualization tools enable comparative viewing of data across multiple samples. Peak lists can be exported into other data analysis programs. The toolbox has already been utilized in a wide range of applications. We demonstrate its utility on an example of metabolic profiling of Catharanthus roseus cell cultures. Conclusion The software is freely available under the GNU General Public License and it can be obtained from the project web page at: http://mzmine.sourceforge.net/. PMID:16026613

  2. Processing methods for differential analysis of LC/MS profile data.

    PubMed

    Katajamaa, Mikko; Oresic, Matej

    2005-07-18

    Liquid chromatography coupled to mass spectrometry (LC/MS) has been widely used in proteomics and metabolomics research. In this context, the technology has been increasingly used for differential profiling, i.e. broad screening of biomolecular components across multiple samples in order to elucidate the observed phenotypes and discover biomarkers. One of the major challenges in this domain remains development of better solutions for processing of LC/MS data. We present a software package MZmine that enables differential LC/MS analysis of metabolomics data. This software is a toolbox containing methods for all data processing stages preceding differential analysis: spectral filtering, peak detection, alignment and normalization. Specifically, we developed and implemented a new recursive peak search algorithm and a secondary peak picking method for improving already aligned results, as well as a normalization tool that uses multiple internal standards. Visualization tools enable comparative viewing of data across multiple samples. Peak lists can be exported into other data analysis programs. The toolbox has already been utilized in a wide range of applications. We demonstrate its utility on an example of metabolic profiling of Catharanthus roseus cell cultures. The software is freely available under the GNU General Public License and it can be obtained from the project web page at: http://mzmine.sourceforge.net/.

  3. Multivariate and repeated measures (MRM): A new toolbox for dependent and multimodal group-level neuroimaging data

    PubMed Central

    McFarquhar, Martyn; McKie, Shane; Emsley, Richard; Suckling, John; Elliott, Rebecca; Williams, Stephen

    2016-01-01

    Repeated measurements and multimodal data are common in neuroimaging research. Despite this, conventional approaches to group level analysis ignore these repeated measurements in favour of multiple between-subject models using contrasts of interest. This approach has a number of drawbacks as certain designs and comparisons of interest are either not possible or complex to implement. Unfortunately, even when attempting to analyse group level data within a repeated-measures framework, the methods implemented in popular software packages make potentially unrealistic assumptions about the covariance structure across the brain. In this paper, we describe how this issue can be addressed in a simple and efficient manner using the multivariate form of the familiar general linear model (GLM), as implemented in a new MATLAB toolbox. This multivariate framework is discussed, paying particular attention to methods of inference by permutation. Comparisons with existing approaches and software packages for dependent group-level neuroimaging data are made. We also demonstrate how this method is easily adapted for dependency at the group level when multiple modalities of imaging are collected from the same individuals. Follow-up of these multimodal models using linear discriminant functions (LDA) is also discussed, with applications to future studies wishing to integrate multiple scanning techniques into investigating populations of interest. PMID:26921716
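
    The MRM toolbox is Matlab code built around the multivariate GLM; as a bare-bones illustration of the permutation-based inference idea it relies on (not of the multivariate machinery itself), here is a small Python sketch with simulated group data:

      import numpy as np

      rng = np.random.default_rng(42)

      # Two simulated groups of subject-level summary measures (illustrative data only).
      group_a = rng.normal(0.0, 1.0, size=25)
      group_b = rng.normal(0.6, 1.0, size=25)

      observed = group_a.mean() - group_b.mean()
      pooled = np.concatenate([group_a, group_b])

      # Permutation null: repeatedly shuffle group membership and recompute the statistic.
      n_perm = 10000
      null = np.empty(n_perm)
      for i in range(n_perm):
          perm = rng.permutation(pooled)
          null[i] = perm[:group_a.size].mean() - perm[group_a.size:].mean()

      p_value = (np.sum(np.abs(null) >= np.abs(observed)) + 1) / (n_perm + 1)
      print(f"observed difference = {observed:.3f}, permutation p = {p_value:.4f}")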

  4. Chemical Microsensor Development for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Xu, Jennifer C.; Hunter, Gary W.; Lukco, Dorothy; Chen, Liangyu; Biaggi-Labiosa, Azlin M.

    2013-01-01

    Numerous aerospace applications, including low-false-alarm fire detection, environmental monitoring, fuel leak detection, and engine emission monitoring, would benefit greatly from robust and low weight, cost, and power consumption chemical microsensors. NASA Glenn Research Center has been working to develop a variety of chemical microsensors with these attributes to address the aforementioned applications. Chemical microsensors using different material platforms and sensing mechanisms have been produced. Approaches using electrochemical cells, resistors, and Schottky diode platforms, combined with nano-based materials, high temperature solid electrolytes, and room temperature polymer electrolytes have been realized to enable different types of microsensors. By understanding the application needs and chemical gas species to be detected, sensing materials and unique microfabrication processes were selected and applied. The chemical microsensors were designed utilizing simple structures and the least number of microfabrication processes possible, while maintaining high yield and low cost. In this presentation, an overview of carbon dioxide (CO2), oxygen (O2), and hydrogen/hydrocarbons (H2/CxHy) microsensors and their fabrication, testing results, and applications will be described. Particular challenges associated with improving the H2/CxHy microsensor contact wire-bonding pad will be discussed. These microsensors represent our research approach and serve as major tools as we expand our sensor development toolbox. Our ultimate goal is to develop robust chemical microsensor systems for aerospace and commercial applications.

  5. Speed management toolbox for rural communities.

    DOT National Transportation Integrated Search

    2013-04-01

    The primary objective of this toolbox is to summarize various known traffic-calming treatments and their effectiveness. This toolbox focuses on roadway-based treatments for speed management, particularly for rural communities with transition zones. E...

  6. EPA SSOAP Toolbox – Evolution and Applications

    EPA Science Inventory

    The nation’s sanitary sewer infrastructure is aging, with some sewers dating back more than 100 years. Nationwide, there are more than 19,500 municipal sanitary-sewer collection systems serving an estimated 150 million people and about 40,000 sanitary sewer overflow (SSO) ...

  7. 40 CFR 141.719 - Additional filtration toolbox components.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... establish a quality control release value (QCRV) for a non-destructive performance test that demonstrates... test; and Cp = the filtrate concentration measured during the challenge test. Equivalent units must be... or the applicability of the non-destructive performance test and associated QCRV, additional...

  8. 40 CFR 141.719 - Additional filtration toolbox components.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... establish a quality control release value (QCRV) for a non-destructive performance test that demonstrates... test; and Cp = the filtrate concentration measured during the challenge test. Equivalent units must be... or the applicability of the non-destructive performance test and associated QCRV, additional...

  9. 40 CFR 141.719 - Additional filtration toolbox components.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... establish a quality control release value (QCRV) for a non-destructive performance test that demonstrates... test; and Cp = the filtrate concentration measured during the challenge test. Equivalent units must be... or the applicability of the non-destructive performance test and associated QCRV, additional...

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galperin, Michael

    The progress of experimental techniques at the nanoscale in the last decade made optical measurements in current-carrying nanojunctions a reality, thus indicating the emergence of a new field of research coined optoelectronics. Optical spectroscopy of open nonequilibrium systems is a natural meeting point for (at least) two research areas: nonlinear optical spectroscopy and quantum transport, each with its own theoretical toolbox. We review recent progress in the field, comparing theoretical treatments of optical response in nanojunctions as accepted in the nonlinear spectroscopy and quantum transport communities. A unified theoretical description of spectroscopy in nanojunctions is presented. Here, we argue that theoretical approaches of the quantum transport community (and in particular, the Green function based considerations) yield a convenient tool for optoelectronics when the radiation field is treated classically, and that differences between the toolboxes may become critical when studying the quantum radiation field in junctions.

  11. IV. NIH Toolbox Cognition Battery (CB): measuring language (vocabulary comprehension and reading decoding).

    PubMed

    Gershon, Richard C; Slotkin, Jerry; Manly, Jennifer J; Blitz, David L; Beaumont, Jennifer L; Schnipke, Deborah; Wallner-Allen, Kathleen; Golinkoff, Roberta Michnick; Gleason, Jean Berko; Hirsh-Pasek, Kathy; Adams, Marilyn Jager; Weintraub, Sandra

    2013-08-01

    Mastery of language skills is an important predictor of daily functioning and health. Vocabulary comprehension and reading decoding are relatively quick and easy to measure and correlate highly with overall cognitive functioning, as well as with success in school and work. New measures of vocabulary comprehension and reading decoding (in both English and Spanish) were developed for the NIH Toolbox Cognition Battery (CB). In the Toolbox Picture Vocabulary Test (TPVT), participants hear a spoken word while viewing four pictures, and then must choose the picture that best represents the word. This approach tests receptive vocabulary knowledge without the need to read or write, removing the literacy load for children who are developing literacy and for adults who struggle with reading and writing. In the Toolbox Oral Reading Recognition Test (TORRT), participants see a letter or word onscreen and must pronounce or identify it. The examiner determines whether it was pronounced correctly by comparing the response to the pronunciation guide on a separate computer screen. In this chapter, we discuss the importance of language during childhood and the relation of language and brain function. We also review the development of the TPVT and TORRT, including information about the item calibration process and results from a validation study. Finally, the strengths and weaknesses of the measures are discussed. © 2013 The Society for Research in Child Development, Inc.

  12. MMM: A toolbox for integrative structure modeling.

    PubMed

    Jeschke, Gunnar

    2018-01-01

    Structural characterization of proteins and their complexes may require integration of restraints from various experimental techniques. MMM (Multiscale Modeling of Macromolecules) is a Matlab-based open-source modeling toolbox for this purpose with a particular emphasis on distance distribution restraints obtained from electron paramagnetic resonance experiments on spin-labelled proteins and nucleic acids and their combination with atomistic structures of domains or whole protomers, small-angle scattering data, secondary structure information, homology information, and elastic network models. MMM does not only integrate various types of restraints, but also various existing modeling tools by providing a common graphical user interface to them. The types of restraints that can support such modeling and the available model types are illustrated by recent application examples. © 2017 The Protein Society.

  13. Advanced Thermoplastic Polymers and Additive Manufacturing Applied to ISS Columbus Toolbox: Lessons Learnt and Results

    NASA Astrophysics Data System (ADS)

    Ferrino, Marinella; Secondo, Ottaviano; Sabbagh, Amir; Della Sala, Emilio

    2014-06-01

    In the frame of the International Space Station (ISS) Exploitation Program, a new toolbox has been realized by TAS-I to accommodate the tools currently in use on the ISS Columbus Module, utilizing full-scale prototypes obtained with 3D rapid prototyping. The manufacturing of the flight hardware using the advanced thermoplastic polymer ULTEM 9085 and additive manufacturing Fused Deposition Modelling (FDM) technology represents the innovative element. In this paper, the results achieved and the lessons learned are analyzed to promote future technology know-how. The acquired experience confirmed that the additive manufacturing process saves time and cost and makes it possible to realize new shapes and features, introducing innovation in products and future design processes for space applications.

  14. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069

  15. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.

    PubMed

    Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D

    2017-04-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.

  16. Innovative Leadership: Insights from a Learning Technologist

    ERIC Educational Resources Information Center

    Campbell, Bruce

    2012-01-01

    Professor Ricardo Torres Kompen is a leading proponent for, and researcher in, personal learning environments (PLEs). During his interview, Torres Kompen clarified his research on PLEs, particularly the digital toolbox within PLEs. He elaborated on experiences with implementing PLE initiatives, personal insights on using social media and Web 2.0…

  17. Electronic audit and feedback intervention with action implementation toolbox to improve pain management in intensive care: protocol for a laboratory experiment and cluster randomised trial.

    PubMed

    Gude, Wouter T; Roos-Blom, Marie-José; van der Veer, Sabine N; de Jonge, Evert; Peek, Niels; Dongelmans, Dave A; de Keizer, Nicolette F

    2017-05-25

    Audit and feedback is often used as a strategy to improve quality of care, however, its effects are variable and often marginal. In order to learn how to design and deliver effective feedback, we need to understand their mechanisms of action. This theory-informed study will investigate how electronic audit and feedback affects improvement intentions (i.e. information-intention gap), and whether an action implementation toolbox with suggested actions and materials helps translating those intentions into action (i.e. intention-behaviour gap). The study will be executed in Dutch intensive care units (ICUs) and will be focused on pain management. We will conduct a laboratory experiment with individual ICU professionals to assess the impact of feedback on their intentions to improve practice. Next, we will conduct a cluster randomised controlled trial with ICUs allocated to feedback without or feedback with action implementation toolbox group. Participants will not be told explicitly what aspect of the intervention is randomised; they will only be aware that there are two variations of providing feedback. ICUs are eligible for participation if they submit indicator data to the Dutch National Intensive Care Evaluation (NICE) quality registry and agree to allocate a quality improvement team that spends 4 h per month on the intervention. All participating ICUs will receive access to an online quality dashboard that provides two functionalities: gaining insight into clinical performance on pain management indicators and developing action plans. ICUs with access to the toolbox can develop their action plans guided by a list of potential barriers in the care process, associated suggested actions, and supporting materials to facilitate implementation of the actions. The primary outcome measure for the laboratory experiment is the proportion of improvement intentions set by participants that are consistent with recommendations based on peer comparisons; for the randomised trial it is the proportion of patient shifts during which pain has been adequately managed. We will also conduct a process evaluation to understand how the intervention is implemented and used in clinical practice, and how implementation and use affect the intervention's impact. The results of this study will inform care providers and managers in ICU and other clinical settings how to use indicator-based performance feedback in conjunction with an action implementation toolbox to improve quality of care. Within the ICU context, this study will produce concrete and directly applicable knowledge with respect to what is or is not effective for improving pain management, and under which circumstances. The results will further guide future research that aims to understand the mechanisms behind audit and feedback and contribute to identifying the active ingredients of successful interventions. ClinicalTrials.gov NCT02922101 . Registered 26 September 2016.

  18. Laying a Firm Foundation: Embedding Evidence-Based Emergent Literacy Practices Into Early Intervention and Preschool Environments.

    PubMed

    Terrell, Pamela; Watson, Maggie

    2018-04-05

    As part of this clinical forum on curriculum-based intervention, the goal of this tutorial is to share research about the importance of language and literacy foundations in natural environments during emergent literacy skill development, from infancy through preschool. Following an overview of intervention models in schools by Powell (2018), best practices at home, in child care, and in preschool settings are discussed. Speech-language pathologists in these settings will be provided a toolbox of best emergent literacy practices. A review of published literature in speech-language pathology, early intervention, early childhood education, and literacy was completed. Subsequently, an overview of the impact of early home and preschool literacy experiences are described. Research-based implementation of best practice is supported with examples of shared book reading and child-led literacy embedded in play within the coaching model of early intervention. Finally, various aspects of emergent literacy skill development in the preschool years are discussed. These include phonemic awareness, print/alphabet awareness, oral language skills, and embedded/explicit literacy. Research indicates that rich home literacy environments and exposure to rich oral language provide an important foundation for the more structured literacy environments of school. Furthermore, there is a wealth of evidence to support a variety of direct and indirect intervention practices in the home, child care, and preschool contexts to support and enhance all aspects of oral and written literacy. Application of this "toolbox" of strategies should enable speech-language pathologists to address the prevention and intervention of literacy deficits within multiple environments during book and play activities. Additionally, clinicians will have techniques to share with parents, child care providers, and preschool teachers for evidence-based literacy instruction within all settings during typical daily activities.

  19. Open Babel: An open chemical toolbox

    PubMed Central

    2011-01-01

    Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300
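
    As a small Python sketch of the format interconversion and similarity searching mentioned above, assuming Open Babel's Python bindings are installed (the openbabel.pybel module of the 3.x series; version 2.x exposes the same module as a top-level pybel):

      from openbabel import pybel   # Open Babel 3.x Python bindings ("import pybel" on 2.x)

      # Interconvert a SMILES string into other representations.
      ethanol = pybel.readstring("smi", "CCO")
      print(ethanol.write("inchi").strip())      # InChI output
      print(ethanol.write("can").strip())        # canonical SMILES

      # A basic similarity-search ingredient: Tanimoto coefficient between fingerprints.
      propanol = pybel.readstring("smi", "CCCO")
      tanimoto = ethanol.calcfp() | propanol.calcfp()
      print(f"Tanimoto(ethanol, propanol) = {tanimoto:.2f}")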

  20. Application of COMSOL to Acoustic Imaging

    DTIC Science & Technology

    2010-10-01

    Training used MATLAB's Neural Network Toolbox. The optimization techniques considered included scaled conjugate gradient (SCG), noted as fast; a one-step secant method; and a staged sequence of Levenberg-Marquardt (LM) (2 epochs), followed by Broyden-Fletcher-Goldfarb-Shanno (BFGS) (2 epochs), followed by SCG (100 epochs).
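
    The fragment above refers to training with MATLAB's Neural Network Toolbox; as a loose Python analogue only — scikit-learn offers L-BFGS rather than the SCG or LM optimizers named in the report, and the data are invented — fitting a small network with a quasi-Newton solver could look like this:

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      # Toy regression problem standing in for the imaging-related fit (invented data);
      # solver='lbfgs' is a quasi-Newton option, a loose analogue of the BFGS/SCG
      # training runs named above (scikit-learn does not provide SCG or LM).
      rng = np.random.default_rng(0)
      X = rng.uniform(-1, 1, size=(200, 3))
      y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 - X[:, 2]

      net = MLPRegressor(hidden_layer_sizes=(16,), solver='lbfgs', max_iter=2000, random_state=0)
      net.fit(X, y)
      print("training R^2:", round(net.score(X, y), 3))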

  1. Derivation of Rabbit Embryonic Stem Cells from Vitrified–Thawed Embryos

    PubMed Central

    Chen, Chien-Hong; Li, Yi; Hu, Yeshu; An, Li-You; Yang, Lan; Zhang, Jifeng; Chen, Y. Eugene

    2015-01-01

    The rabbit is a useful animal model for regenerative medicine. We previously developed pluripotent rabbit embryonic stem cell (rbESC) lines using fresh embryos. We also successfully cryopreserved rabbit embryos by vitrification. In the present work, we combined these two technologies to derive rbESCs using vitrified–thawed (V/T) embryos. We demonstrate that V/T blastocysts (BLs) can be used to derive pluripotent rbESCs with efficiencies comparable to those using fresh BLs. These ESCs are indistinguishable from the ones derived from fresh embryos. We tested the developmental capacity of rbESCs derived from V/T embryos by BL injection experiments and produced chimeric kits. Our work adds cryopreservation to the toolbox of rabbit stem cell research and applications and will greatly expand the available research materials for regenerative medicine in a clinically relevant animal model. PMID:26579970

  2. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    PubMed

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to face those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.
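
    The PFA Toolbox itself is MATLAB code; to illustrate the interval-estimation idea it builds on (not its possibilistic framework), the Python sketch below bounds an unmeasured flux by linear programming over a toy stoichiometric network with imprecise measurements expressed as bounds (the network, values, and bounds are invented for illustration):

      import numpy as np
      from scipy.optimize import linprog

      # Toy network: v1 produces metabolite A, which is consumed by v2 and v3.
      # Steady state for A:  v1 - v2 - v3 = 0
      N = np.array([[1.0, -1.0, -1.0]])      # stoichiometric matrix (1 metabolite x 3 fluxes)
      b = np.zeros(1)

      # Imprecise measurements as bounds (invented): v1 in [9, 11], v2 in [3, 5]; v3 unmeasured.
      bounds = [(9.0, 11.0), (3.0, 5.0), (0.0, None)]

      def flux_extreme(idx, sense):
          # Minimise (sense=+1) or maximise (sense=-1) flux `idx` subject to N v = 0 and bounds.
          c = np.zeros(3)
          c[idx] = sense
          res = linprog(c, A_eq=N, b_eq=b, bounds=bounds, method="highs")
          return sense * res.fun

      v3_min = flux_extreme(2, +1)
      v3_max = flux_extreme(2, -1)
      print(f"interval estimate for v3: [{v3_min:.2f}, {v3_max:.2f}]")   # expect [4, 8]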

  3. A Data Repository and Visualization Toolbox for Metabolic Pathways and PBPK parameter prediction

    EPA Science Inventory

    NHANES is an extensive, well-structured collection of data about hundreds chemicals products of human metabolism and their concentration in human biomarkers, which includes parent to product mapping where known. Together, these data can be used to test the efficacy of application...

  4. An open-source Java-based Toolbox for environmental model evaluation: The MOUSE Software Application

    USDA-ARS?s Scientific Manuscript database

    A consequence of environmental model complexity is that the task of understanding how environmental models work and identifying their sensitivities/uncertainties, etc. becomes progressively more difficult. Comprehensive numerical and visual evaluation tools have been developed such as the Monte Carl...

  5. Robust Correlation Analyses: False Positive and Power Validation Using a New Open Source Matlab Toolbox

    PubMed Central

    Pernet, Cyril R.; Wilcox, Rand; Rousselet, Guillaume A.

    2012-01-01

    Pearson’s correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab(R) based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand. PMID:23335907

  6. Robust correlation analyses: false positive and power validation using a new open source matlab toolbox.

    PubMed

    Pernet, Cyril R; Wilcox, Rand; Rousselet, Guillaume A

    2012-01-01

    Pearson's correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab((R)) based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand.
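
    To see the outlier sensitivity the authors describe — using plain Python rather than their Matlab toolbox, and with Spearman rank correlation standing in as one simple robust alternative instead of the percentage-bend and skipped correlations the toolbox actually implements — consider this small simulated example:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)

      # Two weakly related variables (illustrative data), then the same data with a
      # single extreme bivariate outlier appended.
      x = rng.normal(size=50)
      y = 0.2 * x + rng.normal(size=50)
      x_out = np.append(x, 10.0)
      y_out = np.append(y, 10.0)

      r_clean, _ = stats.pearsonr(x, y)
      r_outlier, _ = stats.pearsonr(x_out, y_out)
      rho_outlier, _ = stats.spearmanr(x_out, y_out)

      print(f"Pearson r without outlier: {r_clean:.2f}")
      print(f"Pearson r with one outlier: {r_outlier:.2f}")   # inflated by the single point
      print(f"Spearman rho with outlier:  {rho_outlier:.2f}") # rank-based, less affected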

  7. Computational Sustainability Tools Illuminating the Triple Value Model

    EPA Science Inventory

    The National Research Council (NRC) report Sustainability and the U.S. EPA recommends development of a "sustainability toolbox" to address challenges associated with implementing sustainability at the EPA. The three pillars of sustainability - industrial, social, and en...

  8. Reconstruction and downscaling of Eastern Mediterranean OSCAR satellite surface current data using DINEOF

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Andreas; Stylianou, Stavros; Georgiou, Georgios; Hajimitsis, Diofantos; Gravanis, Elias; Akylas, Evangelos

    2015-04-01

    During the last decade, Rixen (2005) and Alvera-Azkarate (2010) presented the DINEOF (Data Interpolating Empirical Orthogonal Functions) method, an EOF-based technique to reconstruct missing data in satellite images. The DINEOF method has proved relatively successful in various experimental trials (Wang and Liu, 2013; Nikolaidis et al., 2013; 2014) and tends to be an effective and computationally affordable solution to the problem of reconstructing missing data in geophysical fields derived from satellite observations, such as chlorophyll-a, sea surface temperature or salinity. Implementing the method in a GIS provides a more complete, integrated approach and extends its applicability to a wider range of uses. This may be especially useful in studies where various data of different kinds have to be examined. For this purpose, in this study we have implemented and present a GIS toolbox that automates the use of the algorithm, incorporating the DINEOF codes provided by GHER (GeoHydrodynamics and Environment Research Group of the University of Liege) into ArcGIS®. ArcGIS® is a well-known standard in Geographical Information Systems, used over the years for various remote sensing procedures in sea and land environments alike. A case study of filling in missing satellite-derived current data in the Eastern Mediterranean Sea over a monthly period is analyzed as an example of the effectiveness and simplicity of the toolbox. The study focuses on OSCAR satellite data (http://www.oscar.noaa.gov/) collected by the NOAA/NESDIS Operational Surface Current Processing and Data Center, from the respective products of the OSCAR Project Office of the Earth and Space Research organization, which provides free online access at unfiltered (1/3 degree) resolution. All 5-day mean products were successfully reconstructed. KEY WORDS: Remote Sensing, Cyprus, Mediterranean, DINEOF, ArcGIS, data reconstruction.
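
    The core of an EOF-based gap filler like DINEOF can be summarized in a few lines: initialize the gaps, reconstruct the field from a truncated SVD, replace only the missing entries, and iterate until the gap values stop changing. The sketch below is a simplified illustration of that loop in Python, not the GHER code wrapped by the toolbox; in particular it fixes the number of modes instead of selecting it by cross-validation as DINEOF does.

    ```python
    import numpy as np

    def eof_gap_fill(field, n_modes=5, n_iter=50, tol=1e-6):
        """Simplified DINEOF-style gap filling; `field` is (space x time) with NaNs."""
        data = np.array(field, float)
        missing = np.isnan(data)
        mean = np.nanmean(data, axis=1, keepdims=True)
        data = data - mean                   # work with anomalies
        data[missing] = 0.0                  # first guess for the gaps

        prev = np.inf
        for _ in range(n_iter):
            u, s, vt = np.linalg.svd(data, full_matrices=False)
            recon = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes]
            change = np.sqrt(np.mean((recon[missing] - data[missing]) ** 2))
            data[missing] = recon[missing]   # update only the missing entries
            if abs(prev - change) < tol:
                break
            prev = change
        return data + mean
    ```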

  9. MNPBEM - A Matlab toolbox for the simulation of plasmonic nanoparticles

    NASA Astrophysics Data System (ADS)

    Hohenester, Ulrich; Trügler, Andreas

    2012-02-01

    MNPBEM is a Matlab toolbox for the simulation of metallic nanoparticles (MNP), using a boundary element method (BEM) approach. The main purpose of the toolbox is to solve Maxwell's equations for a dielectric environment where bodies with homogeneous and isotropic dielectric functions are separated by abrupt interfaces. Although the approach is in principle suited for arbitrary body sizes and photon energies, it is tested (and probably works best) for metallic nanoparticles with sizes ranging from a few to a few hundreds of nanometers, and for frequencies in the optical and near-infrared regime. The toolbox has been implemented with Matlab classes. These classes can be easily combined, which has the advantage that one can adapt the simulation programs flexibly for various applications. Program summary: Program title: MNPBEM. Catalogue identifier: AEKJ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKJ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License v2. No. of lines in distributed program, including test data, etc.: 15 700. No. of bytes in distributed program, including test data, etc.: 891 417. Distribution format: tar.gz. Programming language: Matlab 7.11.0 (R2010b). Computer: Any which supports Matlab 7.11.0 (R2010b). Operating system: Any which supports Matlab 7.11.0 (R2010b). RAM: ⩾1 GByte. Classification: 18. Nature of problem: Solve Maxwell's equations for dielectric particles with homogeneous dielectric functions separated by abrupt interfaces. Solution method: Boundary element method using electromagnetic potentials. Running time: Depending on surface discretization, between seconds and hours.

  10. Photonics and spectroscopy in nanojunctions: a theoretical insight

    DOE PAGES

    Galperin, Michael

    2017-04-11

    The progress of experimental techniques at the nanoscale in the last decade has made optical measurements in current-carrying nanojunctions a reality, thus indicating the emergence of a new field of research coined optoelectronics. Optical spectroscopy of open nonequilibrium systems is a natural meeting point for (at least) two research areas: nonlinear optical spectroscopy and quantum transport, each with its own theoretical toolbox. We review recent progress in the field, comparing theoretical treatments of optical response in nanojunctions as accepted in the nonlinear spectroscopy and quantum transport communities. A unified theoretical description of spectroscopy in nanojunctions is presented. Here, we argue that theoretical approaches of the quantum transport community (and in particular, Green function based considerations) yield a convenient tool for optoelectronics when the radiation field is treated classically, and that differences between the toolboxes may become critical when studying the quantum radiation field in junctions.

  11. A Module for Graphical Display of Model Results with the CBP Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.

    2015-04-21

    This report describes work performed by the Savannah River National Laboratory (SRNL) in fiscal year 2014 to add enhanced graphical capabilities to display model results in the Cementitious Barriers Project (CBP) Toolbox. Because Version 2.0 of the CBP Toolbox has just been released, the graphing enhancements described in this report have not yet been integrated into a new version of the Toolbox. Instead they have been tested using a standalone GoldSim model and, while they are substantially complete, may undergo further refinement before full implementation. Nevertheless, this report is issued to document the FY14 development efforts, which will provide a basis for further development of the CBP Toolbox.

  12. Expanding the UniFrac Toolbox

    PubMed Central

    2016-01-01

    The UniFrac distance metric is often used to separate groups in microbiome analysis, but requires a constant sequencing depth to work properly. Here we demonstrate that unweighted UniFrac is highly sensitive to rarefaction instance and to sequencing depth in uniform data sets with no clear structure or separation between groups. We show that this arises because of subcompositional effects. We introduce information UniFrac and ratio UniFrac, two new weightings that are not as sensitive to rarefaction and allow greater separation of outliers than classic unweighted and weighted UniFrac. With this expansion of the UniFrac toolbox, we hope to empower researchers to extract more varied information from their data. PMID:27632205

  13. Conducting a Social Network Audit

    ERIC Educational Resources Information Center

    Gumm, Jim; Hatala, John-Paul

    2007-01-01

    Current research makes it abundantly clear that career and technical educators in North America know how to give students the technical skills they need to be competitive in the 21st century marketplace. However, there may be a gap in many educators' toolboxes. The latest research findings indicate that the majority of people have a difficult time…

  14. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  15. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  16. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  17. Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) User's Guide

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) software package is an open source, MATLAB/Simulink toolbox (plug-in) that can be used by industry professionals and academics for the development of thermodynamic and controls simulations.

  18. UQTools: The Uncertainty Quantification Toolbox - Introduction and Tutorial

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.; Giesy, Daniel P.

    2012-01-01

    UQTools is the short name for the Uncertainty Quantification Toolbox, a software package designed to efficiently quantify the impact of parametric uncertainty on engineering systems. UQTools is a MATLAB-based software package and was designed to be discipline independent, employing very generic representations of the system models and uncertainty. Specifically, UQTools accepts linear and nonlinear system models and permits arbitrary functional dependencies between the system's measures of interest and the probabilistic or non-probabilistic parametric uncertainty. One of the most significant features incorporated into UQTools is the theoretical development centered on homothetic deformations and their application to set bounding and approximating failure probabilities. Beyond the set bounding technique, UQTools provides a wide range of probabilistic and uncertainty-based tools to solve key problems in science and engineering.
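
    For readers unfamiliar with the problem UQTools addresses, the simplest baseline is a plain Monte Carlo estimate of a failure probability: sample the uncertain parameters, evaluate a limit-state function, and count exceedances. The Python sketch below illustrates only that baseline; UQTools itself bounds and approximates failure probabilities via homothetic deformations rather than by sampling, and the function and parameter names here are hypothetical.

    ```python
    import numpy as np

    def mc_failure_probability(limit_state, sampler, n_samples=100_000, threshold=0.0):
        """Monte Carlo failure-probability estimate (baseline illustration only).

        limit_state : callable mapping a parameter vector to a scalar g; failure is g > threshold.
        sampler     : callable(n) returning an (n, dim) array of parameter samples.
        """
        g = np.apply_along_axis(limit_state, 1, sampler(n_samples))
        return float(np.mean(g > threshold))

    # Example: probability that a toy response exceeds 2.5 under Gaussian uncertainty.
    rng = np.random.default_rng(0)
    p_fail = mc_failure_probability(lambda x: x[0] + 0.5 * x[1] ** 2,
                                    lambda n: rng.normal(size=(n, 2)),
                                    threshold=2.5)
    ```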

  19. Peptide chemistry toolbox - Transforming natural peptides into peptide therapeutics.

    PubMed

    Erak, Miloš; Bellmann-Sickert, Kathrin; Els-Heindl, Sylvia; Beck-Sickinger, Annette G

    2018-06-01

    The development of solid phase peptide synthesis has opened up tremendous opportunities for using synthetic peptides in medicinal applications. In recent decades, peptide therapeutics have become an emerging market in the pharmaceutical industry. The need for synthetic strategies to improve peptide properties, such as longer half-life, higher bioavailability, and increased potency and efficiency, is accordingly rising. In this mini-review, we present a toolbox of modifications in peptide chemistry for overcoming the main drawbacks during the transition from natural peptides to peptide therapeutics. Modifications at the level of the peptide backbone, amino acid side chains and higher orders of structure are described. Furthermore, we discuss the future of peptide therapeutics development and their impact on the pharmaceutical market. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Wind wave analysis in depth limited water using OCEANLYZ, A MATLAB toolbox

    NASA Astrophysics Data System (ADS)

    Karimpour, Arash; Chen, Qin

    2017-09-01

    There are a number of well-established methods in the literature describing how to assess and analyze measured wind wave data. However, obtaining reliable results from these methods requires adequate knowledge of their behavior, strengths and weaknesses. A proper implementation of these methods requires a series of procedures, including pretreatment of the raw measurements and adjustment and refinement of the processed data to provide quality assurance of the outcomes; otherwise, the analysis can lead to untrustworthy results. This paper discusses potential issues in these procedures, explains which parameters are influential for the outcomes and suggests practical solutions to avoid and minimize errors in the wave results. The procedures for converting water pressure data into water surface elevation data, treating high-frequency data with a low signal-to-noise ratio, partitioning swell energy from wind sea, and estimating the peak wave frequency from the weighted integral of the wave power spectrum are described. Conversion and recovery of the data acquired by a pressure transducer, particularly in depth-limited water like estuaries and lakes, are explained in detail. To provide researchers with tools for a reliable estimation of wind wave parameters, the Ocean Wave Analyzing toolbox, OCEANLYZ, is introduced. The toolbox contains a number of MATLAB functions for estimation of wave properties in the time and frequency domains. The toolbox has been developed and examined during a number of field study projects in Louisiana's estuaries.
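
    One of the steps listed above, estimating the peak wave frequency from a weighted integral of the wave power spectrum, is easy to illustrate: weight the spectrum by a power q so the peak dominates, then take the spectrally weighted mean frequency. The Python sketch below is only an illustration of that idea; the exponent q and the use of Welch's method are assumptions here, not necessarily OCEANLYZ's exact choices.

    ```python
    import numpy as np
    from scipy.signal import welch

    def peak_frequency_weighted(eta, fs, q=5):
        """Peak wave frequency from a weighted integral of the power spectrum.

        eta : water surface elevation time series; fs : sampling frequency (Hz).
        """
        f, s = welch(eta, fs=fs, nperseg=min(len(eta), 1024))
        f, s = f[f > 0], s[f > 0]                 # drop the zero-frequency bin
        return np.trapz(f * s**q, f) / np.trapz(s**q, f)
    ```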

  1. GuidosToolbox: universal digital image object analysis

    Treesearch

    Peter Vogt; Kurt Riitters

    2017-01-01

    The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, user-friendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...

  2. EPA SSOAP Toolbox Application for Condition and Capacity Assessment of Wastewater Collection Systems - Paper

    EPA Science Inventory

    The nation’s sanitary sewer infrastructure is aging, with some sewers dating back more than 100 years. Nationwide, there are more than 19,500 municipal sanitary-sewer collection systems serving an estimated 150 million people and about 40,000 sanitary sewer overflow (SSO) events ...

  3. EPA SSOAP Toolbox Application for Condition and Capacity Assessment of Wastewater Collection Systems

    EPA Science Inventory

    The Nation’s sanitary sewer infrastructure is aging, with some sewers dating back over 100 years. Nationwide, there are more than 19,500 municipal sanitary-sewer collection systems serving an estimated 150 million people and about 40,000 sanitary sewer overflow (SSO) events per ...

  4. Advances in Sewer Condition and Capacity Assessment – Development and Applications of EPA SSOAP Toolbox

    EPA Science Inventory

    In the United States, sanitary sewer infrastructure is aging, with some sewers dating back over 100 years. Nationwide, there are more than 19,500 municipal sanitary-sewer collection systems serving an estimated 150 million people and about 40,000 sanitary sewer overflow (SSO) ev...

  5. A TOOLBOX FOR SANITARY SEWER OVERFLOW ANALYSIS AND PLANNING (SSOAP) AND APPLICATIONS

    EPA Science Inventory

    Rainfall Derived Infiltration and Inflow (RDII) into sanitary sewer systems has long been recognized as a source of operating problems in sewerage systems. RDII is the main cause of sanitary sewer overflows (SSOs) to basements, streets, or nearby streams and can also cause seriou...

  6. ASAP (Automatic Software for ASL Processing): A toolbox for processing Arterial Spin Labeling images.

    PubMed

    Mato Abad, Virginia; García-Polo, Pablo; O'Daly, Owen; Hernández-Tamames, Juan Antonio; Zelaya, Fernando

    2016-04-01

    The method of Arterial Spin Labeling (ASL) has experienced a significant rise in its application to functional imaging, since it is the only technique capable of measuring blood perfusion in a truly non-invasive manner. Currently, there are no commercial packages for processing ASL data and there is no recognized standard for normalizing ASL data to a common frame of reference. This work describes a new Automated Software for ASL Processing (ASAP) that can automatically process several ASL datasets. ASAP includes functions for all stages of image pre-processing: quantification, skull-stripping, co-registration, partial volume correction and normalization. To assess the applicability and validity of the toolbox, this work shows its application in the study of hypoperfusion in a sample of healthy subjects at risk of progressing to Alzheimer's disease. ASAP requires limited user intervention, minimizing the possibility of random and systematic errors, and produces cerebral blood flow maps that are ready for statistical group analysis. The software is easy to operate and results in excellent quality of spatial normalization. The results found in this evaluation study are consistent with previous studies that find decreased perfusion in Alzheimer's patients in similar regions and demonstrate the applicability of ASAP. Copyright © 2015 Elsevier Inc. All rights reserved.
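
    The quantification stage mentioned above typically applies a single-compartment kinetic model to the control-label difference images. The sketch below shows a widely used consensus-style pCASL formula as an illustration of that step; it is not ASAP's code, and the default parameter values (blood T1, labelling efficiency, blood-brain partition coefficient) are assumptions that vary by protocol.

    ```python
    import numpy as np

    def pcasl_cbf(delta_m, m0, pld, tau, t1_blood=1.65, alpha=0.85, lam=0.9):
        """Single-compartment pCASL perfusion quantification (illustrative sketch).

        delta_m : control - label difference image
        m0      : equilibrium magnetisation image
        pld     : post-labelling delay (s);  tau : label duration (s)
        Returns CBF in mL/100 g/min.
        """
        num = 6000.0 * lam * delta_m * np.exp(pld / t1_blood)
        den = 2.0 * alpha * t1_blood * m0 * (1.0 - np.exp(-tau / t1_blood))
        return num / den
    ```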

  7. RESPONSE PROTOCOL TOOLBOX: OVERVIEW, STATUS UPDATE, AND RELATIONSHIP TO OTHER WATER SECURITY PRODUCTS

    EPA Science Inventory

    The Response Protocol Toolbox was released by USEPA to address the complex, multi-faceted challenges of a water utility's planning and response to the threat or act of intentional contamination of drinking water (1). The Toolbox contains guidance that may be adopted voluntarily,...

  8. RESPONSE PROTOCOL TOOLBOX: OVERVIEW, STATUS UPDATE, AND RELATIONSHIP TO OTHER WATER SECURITY PRODUCTS

    EPA Science Inventory

    The Response Protocol Toolbox was released by USEPA to address the complex, multi-faceted challenges of a water utility's planning and response to the threat or act of intentional contamination of drinking water(1). The Toolbox contains guidance that may be adopted voluntarily, a...

  9. RESPONSE PROTOCOL TOOLBOX OVERVIEW, STATUS UPDATE, AND RELATIONSHIP TO OTHER WATER SECURITY PRODUCTS

    EPA Science Inventory

    The Response Protocol Toolbox was released by USEPA to address the complex, multi-faceted challenges of a water utility's planning and response to the threat or act of intentional contamination of drinking water (1). The Toolbox contains guidance that may be adopted voluntarily,...

  10. The Brain's Versatile Toolbox.

    ERIC Educational Resources Information Center

    Pinker, Steven

    1997-01-01

    Considers the role of evolution and natural selection in the functioning of the modern human brain. Natural selection equipped humans with a mental toolbox of intuitive theories about the world which were used to master rocks, tools, plants, animals, and one another. The same toolbox is used today to master the intellectual challenges of modern…

  11. Compressible Flow Toolbox

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2006-01-01

    The Compressible Flow Toolbox is primarily a MATLAB-language implementation of a set of algorithms that solve approximately 280 linear and nonlinear classical equations for compressible flow. The toolbox is useful for analysis of one-dimensional steady flow with either constant entropy, friction, heat transfer, or Mach number greater than 1. The toolbox also contains algorithms for comparing and validating the equation-solving algorithms against solutions previously published in open literature. The classical equations solved by the Compressible Flow Toolbox are as follows: the isentropic-flow equations; the Fanno flow equations (pertaining to flow of an ideal gas in a pipe with friction); the Rayleigh flow equations (pertaining to frictionless flow of an ideal gas, with heat transfer, in a pipe of constant cross section); the normal-shock equations; the oblique-shock equations; and the expansion equations.
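
    As a concrete example of the first family in that list, the isentropic-flow relations for a calorically perfect gas reduce to a few closed-form ratios in Mach number. The Python sketch below implements those standard textbook relations; it is an illustration rather than a port of the toolbox's MATLAB routines.

    ```python
    def isentropic_relations(mach, gamma=1.4):
        """Static-to-stagnation ratios and area ratio for isentropic flow of a perfect gas."""
        m2 = mach ** 2
        t_ratio = 1.0 / (1.0 + 0.5 * (gamma - 1.0) * m2)        # T / T0
        p_ratio = t_ratio ** (gamma / (gamma - 1.0))             # p / p0
        rho_ratio = t_ratio ** (1.0 / (gamma - 1.0))             # rho / rho0
        area_ratio = (1.0 / mach) * ((2.0 / (gamma + 1.0)) *     # A / A* (relative to sonic throat)
                     (1.0 + 0.5 * (gamma - 1.0) * m2)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
        return {"T/T0": t_ratio, "p/p0": p_ratio, "rho/rho0": rho_ratio, "A/A*": area_ratio}
    ```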

  12. Multivariate and repeated measures (MRM): A new toolbox for dependent and multimodal group-level neuroimaging data.

    PubMed

    McFarquhar, Martyn; McKie, Shane; Emsley, Richard; Suckling, John; Elliott, Rebecca; Williams, Stephen

    2016-05-15

    Repeated measurements and multimodal data are common in neuroimaging research. Despite this, conventional approaches to group level analysis ignore these repeated measurements in favour of multiple between-subject models using contrasts of interest. This approach has a number of drawbacks as certain designs and comparisons of interest are either not possible or complex to implement. Unfortunately, even when attempting to analyse group level data within a repeated-measures framework, the methods implemented in popular software packages make potentially unrealistic assumptions about the covariance structure across the brain. In this paper, we describe how this issue can be addressed in a simple and efficient manner using the multivariate form of the familiar general linear model (GLM), as implemented in a new MATLAB toolbox. This multivariate framework is discussed, paying particular attention to methods of inference by permutation. Comparisons with existing approaches and software packages for dependent group-level neuroimaging data are made. We also demonstrate how this method is easily adapted for dependency at the group level when multiple modalities of imaging are collected from the same individuals. Follow-up of these multimodal models using linear discriminant functions (LDA) is also discussed, with applications to future studies wishing to integrate multiple scanning techniques into investigating populations of interest. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Combining wet and dry research: experience with model development for cardiac mechano-electric structure-function studies

    PubMed Central

    Quinn, T. Alexander; Kohl, Peter

    2013-01-01

    Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypotheses-generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215

  14. Genetic modification of lymphocytes by retrovirus-based vectors.

    PubMed

    Suerth, Julia D; Schambach, Axel; Baum, Christopher

    2012-10-01

    The genetic modification of lymphocytes is an important topic in the emerging field of gene therapy. Many clinical trials targeting immunodeficiency syndromes or cancer have shown therapeutic benefit; further applications address inflammatory and infectious disorders. Retroviral vector development requires a detailed understanding of the interactions with the host. Most researchers have used simple gammaretroviral vectors to modify lymphocytes, either directly or via hematopoietic stem and progenitor cells. Lentiviral, spumaviral (foamyviral) and alpharetroviral vectors were designed to reduce the necessity for cell stimulation and to utilize potentially safer integration properties. Novel surface modifications (pseudotyping) and transgenes, built using synthetic components, expand the retroviral toolbox, altogether promising increased specificity and potency. Product consistency will be an important criterion for routine clinical use. Copyright © 2012. Published by Elsevier Ltd.

  15. Watershed Modeling Applications with the Open-Access Modular Distributed Watershed Educational Toolbox (MOD-WET) and Introductory Hydrology Textbook

    NASA Astrophysics Data System (ADS)

    Huning, L. S.; Margulis, S. A.

    2014-12-01

    Traditionally, introductory hydrology courses focus on hydrologic processes as independent or semi-independent concepts that are ultimately integrated into a watershed model near the end of the term. When an "off-the-shelf" watershed model is introduced in the curriculum, this approach can result in a potential disconnect between process-based hydrology and the inherent interconnectivity of processes within the water cycle. In order to curb this and reduce the learning curve associated with applying hydrologic concepts to complex real-world problems, we developed the open-access Modular Distributed Watershed Educational Toolbox (MOD-WET). The user-friendly, MATLAB-based toolbox contains the same physical equations for hydrological processes (i.e. precipitation, snow, radiation, evaporation, unsaturated flow, infiltration, groundwater, and runoff) that are presented in the companion e-textbook (http://aqua.seas.ucla.edu/margulis_intro_to_hydro_textbook.html) and taught in the classroom. The modular toolbox functions can be used by students to study individual hydrologic processes. These functions are integrated together to form a simple spatially-distributed watershed model, which reinforces a holistic understanding of how hydrologic processes are interconnected and modeled. Therefore when watershed modeling is introduced, students are already familiar with the fundamental building blocks that have been unified in the MOD-WET model. Extensive effort has been placed on the development of a highly modular and well-documented code that can be run on a personal computer within the commonly-used MATLAB environment. MOD-WET was designed to: 1) increase the qualitative and quantitative understanding of hydrological processes at the basin-scale and demonstrate how they vary with watershed properties, 2) emphasize applications of hydrologic concepts rather than computer programming, 3) elucidate the underlying physical processes that can often be obscured with a complicated "off-the-shelf" watershed model in an introductory hydrology course, and 4) reduce the learning curve associated with analyzing meaningful real-world problems. The open-access MOD-WET and e-textbook have already been successfully incorporated within our undergraduate curriculum.

  16. Remote Sensing Product Verification and Validation at the NASA Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas M.

    2005-01-01

    Remote sensing data product verification and validation (V&V) is critical to successful science research and applications development. People who use remote sensing products to make policy, economic, or scientific decisions require confidence in and an understanding of the products' characteristics to make informed decisions about the products' use. NASA data products of coarse to moderate spatial resolution are validated by NASA science teams. NASA's Stennis Space Center (SSC) serves as the science validation team lead for validating commercial data products of moderate to high spatial resolution. At SSC, the Applications Research Toolbox simulates sensors and targets, and the Instrument Validation Laboratory validates critical sensors. The SSC V&V Site consists of radiometric tarps, a network of ground control points, a water surface temperature sensor, an atmospheric measurement system, painted concrete radial target and edge targets, and other instrumentation. NASA's Applied Sciences Directorate participates in the Joint Agency Commercial Imagery Evaluation (JACIE) team formed by NASA, the U.S. Geological Survey, and the National Geospatial-Intelligence Agency to characterize commercial systems and imagery.

  17. ROLE OF SOURCE WATER PROTECTION IN PLANNING FOR AND RESPONDING TO CONTAMINATION THREATS TO DRINKING WATER SYSTEMS

    EPA Science Inventory

    EPA has developed a "Response Protocol Toolbox" to address the complex, multi-faceted challenges of planning and response to intentional contamination of drinking water (http://www.epa.gov/safewater/security/ertools.html#toolbox). The toolbox is designed to be applied by a numbe...

  18. Using a Toolbox of Tailored Educational Lessons to Improve Fruit, Vegetable, and Physical Activity Behaviors among African American Women in California

    ERIC Educational Resources Information Center

    Backman, Desiree; Scruggs, Valarie; Atiedu, Akpene Ama; Bowie, Shene; Bye, Larry; Dennis, Angela; Hall, Melanie; Ossa, Alexandra; Wertlieb, Stacy; Foerster, Susan B.

    2011-01-01

    Objective: Evaluate the effectiveness of the "Fruit, Vegetable, and Physical Activity Toolbox for Community Educators" ("Toolbox"), an intervention originally designed for Spanish- and English-speaking audiences, in changing knowledge, attitudes, and behavior among low-income African American women. Design: Quasi-experimental…

  19. Proposal for the design of a zero gravity tool storage device

    NASA Technical Reports Server (NTRS)

    Stuckwisch, Sue; Carrion, Carlos A.; Phillips, Lee; Laughlin, Julia; Francois, Jason

    1994-01-01

    Astronauts frequently use a variety of hand tools during space missions, especially on repair missions. A toolbox is needed to allow storage and retrieval of tools with minimal difficulties. The toolbox must contain tools during launch, landing, and on-orbit operations. The toolbox will be used in the Shuttle Bay and therefore must withstand the hazardous space environment. The three main functions of the toolbox in space are: to protect the tools from the space environment and from damaging one another; to allow for quick, one-handed access to the tools; and to minimize the heat transfer between the astronaut's hand and the tools. This proposal explores the primary design issues associated with the design of the toolbox. Included are the customer and design specifications, global and refined function structures, possible solution principles, concept variants, and finally design recommendations.

  20. CAMELOT: Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox

    NASA Astrophysics Data System (ADS)

    Di Carlo, Marilena; Romero Martin, Juan Manuel; Vasile, Massimiliano

    2018-03-01

    Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox (CAMELOT) is a toolbox for the fast preliminary design and optimisation of low-thrust trajectories. It solves highly complex combinatorial problems to plan multi-target missions characterised by long spirals including different perturbations. To do so, CAMELOT implements a novel multi-fidelity approach combining analytical surrogate modelling and accurate computational estimations of the mission cost. Decisions are then made using two optimisation engines included in the toolbox, a single-objective global optimiser, and a combinatorial optimisation algorithm. CAMELOT has been applied to a variety of case studies: from the design of interplanetary trajectories to the optimal de-orbiting of space debris and from the deployment of constellations to on-orbit servicing. In this paper, the main elements of CAMELOT are described and two examples, solved using the toolbox, are presented.

  1. JWST Wavefront Control Toolbox

    NASA Technical Reports Server (NTRS)

    Shin, Shahram Ron; Aronstein, David L.

    2011-01-01

    A Matlab-based toolbox has been developed for the wavefront control and optimization of segmented optical surfaces, to correct for possible misalignments of the James Webb Space Telescope (JWST) using influence functions. The toolbox employs both iterative and non-iterative methods to converge to an optimal solution by minimizing the cost function. It can be used in either constrained or unconstrained optimizations. The control process involves 1 to 7 degrees-of-freedom perturbations per segment of the primary mirror, in addition to the 5 degrees of freedom of the secondary mirror. The toolbox consists of a series of Matlab/Simulink functions and modules, developed based on a "wrapper" approach, that handle the interface and data flow between existing commercial optical modeling software packages such as Zemax and Code V. The limitations of the algorithm are dictated by the constraints of the moving parts in the mirrors.
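
    Influence-function control of this kind is, at its core, a (possibly constrained) least-squares problem: find segment or actuator commands whose predicted effect cancels the measured wavefront error. The Python sketch below shows only the unconstrained pseudo-inverse solution as an illustration; the matrix and variable names are hypothetical, and the toolbox's iterative and constrained variants are not reproduced.

    ```python
    import numpy as np

    def wavefront_correction(influence, wf_error, rcond=1e-3):
        """Least-squares commands from an influence-function matrix (unconstrained sketch).

        influence : (n_pixels, n_actuators) influence-function matrix
        wf_error  : (n_pixels,) measured wavefront error
        """
        commands = -np.linalg.pinv(influence, rcond=rcond) @ wf_error
        residual = wf_error + influence @ commands   # error left after applying the commands
        return commands, residual
    ```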

  2. ObsPy: A Python toolbox for seismology - Current state, applications, and ecosystem around it

    NASA Astrophysics Data System (ADS)

    Lecocq, Thomas; Megies, Tobias; Krischer, Lion; Sales de Andrade, Elliott; Barsch, Robert; Beyreuther, Moritz

    2016-04-01

    ObsPy (http://www.obspy.org) is a community-driven, open-source project offering a bridge for seismology into the scientific Python ecosystem. It provides * read and write support for essentially all commonly used waveform, station, and event metadata formats with a unified interface, * a comprehensive signal processing toolbox tuned to the needs of seismologists, * integrated access to all large data centers, web services and databases, and * convenient wrappers to third party codes like libmseed and evalresp. Python, in contrast to many other languages and tools, is simple enough to enable an exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology where research code often has to be translated to stable and production ready environments. It furthermore offers many freely available high quality scientific modules covering most needs in developing scientific software. ObsPy has been in constant development for more than 5 years and nowadays enjoys a large rate of adoption in the community with thousands of users. Successful applications include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions to name a few examples. Additionally it sparked the development of several more specialized packages slowly building a modern seismological ecosystem around it. This contribution will give a short introduction and overview of ObsPy and highlight a number of use cases and software built around it. We will furthermore discuss the issue of sustainability of scientific software.
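
    A minimal session conveys the style of the interface: fetch waveforms from an FDSN data centre, apply standard processing, and plot. The snippet below follows ObsPy's documented public API; the data centre, station codes and filter band are just an example and carry no particular significance.

    ```python
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("IRIS")                        # any FDSN web-service data centre
    t0 = UTCDateTime("2015-01-01T00:00:00")
    st = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 3600)

    st.detrend("linear")                           # basic signal processing on the Stream
    st.filter("bandpass", freqmin=0.05, freqmax=1.0)
    print(st)                                      # one-line summary per trace
    st.plot()
    ```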

  3. ObsPy: A Python toolbox for seismology - Current state, applications, and ecosystem around it

    NASA Astrophysics Data System (ADS)

    Krischer, L.; Megies, T.; Sales de Andrade, E.; Barsch, R.; Beyreuther, M.

    2015-12-01

    ObsPy (http://www.obspy.org) is a community-driven, open-source project offering a bridge for seismology into the scientific Python ecosystem. It provides read and write support for essentially all commonly used waveform, station, and event metadata formats with a unified interface, a comprehensive signal processing toolbox tuned to the needs of seismologists, integrated access to all large data centers, web services and databases, and convenient wrappers to third party codes like libmseed and evalresp. Python, in contrast to many other languages and tools, is simple enough to enable an exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology where research code often has to be translated to stable and production ready environments. It furthermore offers many freely available high quality scientific modules covering most needs in developing scientific software. ObsPy has been in constant development for more than 5 years and nowadays enjoys a large rate of adoption in the community with thousands of users. Successful applications include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions to name a few examples. Additionally it sparked the development of several more specialized packages slowly building a modern seismological ecosystem around it. This contribution will give a short introduction and overview of ObsPy and highlight a number of use cases and software built around it. We will furthermore discuss the issue of sustainability of scientific software.

  4. MOtoNMS: A MATLAB toolbox to process motion data for neuromusculoskeletal modeling and simulation.

    PubMed

    Mantoan, Alice; Pizzolato, Claudio; Sartori, Massimo; Sawacha, Zimi; Cobelli, Claudio; Reggiani, Monica

    2015-01-01

    Neuromusculoskeletal modeling and simulation enable investigation of the neuromusculoskeletal system and its role in human movement dynamics. These methods are progressively introduced into daily clinical practice. However, a major factor limiting this translation is the lack of robust tools for the pre-processing of experimental movement data for their use in neuromusculoskeletal modeling software. This paper presents MOtoNMS (matlab MOtion data elaboration TOolbox for NeuroMusculoSkeletal applications), a toolbox freely available to the community that aims to fill this gap. MOtoNMS processes experimental data from different motion analysis devices and generates input data for neuromusculoskeletal modeling and simulation software, such as OpenSim and CEINMS (Calibrated EMG-Informed NMS Modelling Toolbox). MOtoNMS implements commonly required processing steps and its generic architecture simplifies the integration of new user-defined processing components. MOtoNMS allows users to set up their laboratory configurations and processing procedures through user-friendly graphical interfaces, without requiring advanced computer skills. Finally, configuration choices can be stored, enabling the full reproduction of the processing steps. MOtoNMS is released under the GNU General Public License and is available at the SimTK website and from the GitHub repository. Motion data collected at four institutions demonstrate that, despite differences in laboratory instrumentation and procedures, MOtoNMS succeeds in processing data and producing consistent inputs for OpenSim and CEINMS. MOtoNMS fills the gap between motion analysis and neuromusculoskeletal modeling and simulation. Its support for several devices, a complete implementation of the pre-processing procedures, its simple extensibility, the available user interfaces, and its free availability can boost the translation of neuromusculoskeletal methods into daily and clinical practice.

  5. Basic Radar Altimetry Toolbox: tools to teach altimetry for ocean

    NASA Astrophysics Data System (ADS)

    Rosmorduc, Vinca; Benveniste, Jerome; Bronner, Emilie; Niemeijer, Sander; Lucas, Bruno Manuel; Dinardo, Salvatore

    2013-04-01

    The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data, including data from the next mission to be launched, CryoSat. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings. More than 2000 people have downloaded it (January 2013), with many "newcomers" to altimetry among them. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in versions 2 and 3; others are in discussion for the future, including the addition of the future Sentinel-3. The Basic Radar Altimetry Toolbox is able: - to read most distributed radar altimetry data, including data from future missions like Saral, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways, including as an educational tool with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent manners of using altimetry data. Examples from educational use will be presented, and feedback from those who used it as such will be most welcome. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  6. DAFNE: A Matlab toolbox for Bayesian multi-source remote sensing and ancillary data fusion, with application to flood mapping

    NASA Astrophysics Data System (ADS)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco P.; Pasquariello, Guido

    2018-03-01

    High-resolution, remotely sensed images of the Earth surface have been proven to be of help in producing detailed flood maps, thanks to their synoptic overview of the flooded area and frequent revisits. However, flood scenarios can be complex situations, requiring the integration of different data in order to provide accurate and robust flood information. Several processing approaches have been recently proposed to efficiently combine and integrate heterogeneous information sources. In this paper, we introduce DAFNE, a Matlab®-based, open source toolbox, conceived to produce flood maps from remotely sensed and other ancillary information, through a data fusion approach. DAFNE is based on Bayesian Networks, and is composed of several independent modules, each one performing a different task. Multi-temporal and multi-sensor data can be easily handled, with the possibility of following the evolution of an event through multi-temporal output flood maps. Each DAFNE module can be easily modified or upgraded to meet different user needs. The DAFNE suite is presented together with an example of its application.

  7. A Proteomics View of the Molecular Mechanisms and Biomarkers of Glaucomatous Neurodegeneration

    PubMed Central

    Tezel, Gülgün

    2013-01-01

    Despite improving understanding of glaucoma, key molecular players of neurodegeneration that can be targeted for treatment of glaucoma, or molecular biomarkers that can be useful for clinical testing, remain unclear. Proteomics technology offers a powerful toolbox to accomplish these important goals of the glaucoma research and is increasingly being applied to identify molecular mechanisms and biomarkers of glaucoma. Recent studies of glaucoma using proteomics analysis techniques have resulted in the lists of differentially expressed proteins in human glaucoma and animal models. The global analysis of protein expression in glaucoma has been followed by cell-specific proteome analysis of retinal ganglion cells and astrocytes. The proteomics data have also guided targeted studies to identify post-translational modifications and protein-protein interactions during glaucomatous neurodegeneration. In addition, recent applications of proteomics have provided a number of potential biomarker candidates. Proteomics technology holds great promise to move glaucoma research forward toward new treatment strategies and biomarker discovery. By reviewing the major proteomics approaches and their applications in the field of glaucoma, this article highlights the power of proteomics in translational and clinical research related to glaucoma and also provides a framework for future research to functionally test the importance of specific molecular pathways and validate candidate biomarkers. PMID:23396249

  8. Toolbox for Research and Exploration (TREX): Investigations of Fine-Grained Materials on Small Bodies

    NASA Technical Reports Server (NTRS)

    Domingue, D. L.; Allain, J.-P.; Banks, M.; Christoffersen, R.; Cintala, M.; Clark, R.; Cloutis, E.; Graps, A.; Hendrix, A. R.; Hsieh, H.

    2018-01-01

    The Toolbox for Research and Exploration (TREX) is a NASA SSERVI (Solar System Exploration Research Virtual Institute) node. TREX (trex.psi.edu) aims to decrease risk to future missions, specifically to the Moon, the Martian moons, and near-Earth asteroids, by improving mission success and assuring the safety of astronauts, their instruments, and spacecraft. TREX studies will focus on characteristics of the fine grains that cover the surfaces of these target bodies - their spectral characteristics and the potential resources (such as H2O) they may harbor. TREX studies are organized into four Themes (Laboratory Studies, Moon Studies, Small-Bodies Studies, and Field Work). In this presentation, we focus on the work targeted by the Small-Bodies Theme. The Small-Bodies Theme delves into several topics, many of which overlap or are synergistic with the other TREX Themes. The main topics include photometry, spectral modeling, laboratory simulations of space weathering processes relevant to asteroids, the assembly of an asteroid regolith database, the dichotomy between nuclear and reflectance spectroscopy, and the dynamical evolution of asteroids and the implications for the retention of volatiles.

  9. Integrating hidden Markov model and PRAAT: a toolbox for robust automatic speech transcription

    NASA Astrophysics Data System (ADS)

    Kabir, A.; Barker, J.; Giurgiu, M.

    2010-09-01

    An automatic time-aligned phone transcription toolbox for English speech corpora has been developed. The toolbox is especially useful for generating robust automatic transcriptions, and it can produce phone-level transcriptions using speaker-independent as well as speaker-dependent models without manual intervention. The system is based on the standard Hidden Markov Model (HMM) approach and was successfully tested on a large audiovisual speech corpus, namely the GRID corpus. One of the most powerful features of the toolbox is the increased flexibility in speech processing: the speech community can import the automatic transcription generated by the HMM Toolkit (HTK) into popular transcription software, PRAAT, and vice versa. The toolbox has been evaluated through statistical analysis on GRID data, which shows that the automatic transcription deviates by an average of 20 ms with respect to manual transcription.
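
    The hand-off to PRAAT amounts to writing time-aligned labels in Praat's TextGrid format. The sketch below is a minimal single-tier writer in Python, included only to illustrate that export step; it follows Praat's long TextGrid layout and is not the toolbox's own converter.

    ```python
    def labels_to_textgrid(intervals, path, tier_name="phones"):
        """Write (start_s, end_s, label) tuples, contiguous over [0, xmax], as a TextGrid."""
        xmax = intervals[-1][1]
        lines = [
            'File type = "ooTextFile"', 'Object class = "TextGrid"', "",
            "xmin = 0", f"xmax = {xmax}", "tiers? <exists>", "size = 1", "item []:",
            "    item [1]:", '        class = "IntervalTier"', f'        name = "{tier_name}"',
            "        xmin = 0", f"        xmax = {xmax}",
            f"        intervals: size = {len(intervals)}",
        ]
        for i, (start, end, label) in enumerate(intervals, 1):
            lines += [f"        intervals [{i}]:", f"            xmin = {start}",
                      f"            xmax = {end}", f'            text = "{label}"']
        with open(path, "w", encoding="utf-8") as fh:
            fh.write("\n".join(lines) + "\n")

    # Hypothetical alignment for one utterance, written to a TextGrid file.
    labels_to_textgrid([(0.0, 0.32, "sil"), (0.32, 0.45, "b"), (0.45, 0.70, "ih")], "utt1.TextGrid")
    ```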

  10. MEG/EEG Source Reconstruction, Statistical Evaluation, and Visualization with NUTMEG

    PubMed Central

    Dalal, Sarang S.; Zumer, Johanna M.; Guggisberg, Adrian G.; Trumpis, Michael; Wong, Daniel D. E.; Sekihara, Kensuke; Nagarajan, Srikantan S.

    2011-01-01

    NUTMEG is a source analysis toolbox geared towards cognitive neuroscience researchers using MEG and EEG, including intracranial recordings. Evoked and unaveraged data can be imported to the toolbox for source analysis in either the time or time-frequency domains. NUTMEG offers several variants of adaptive beamformers, probabilistic reconstruction algorithms, as well as minimum-norm techniques to generate functional maps of spatiotemporal neural source activity. Lead fields can be calculated from single and overlapping sphere head models or imported from other software. Group averages and statistics can be calculated as well. In addition to data analysis tools, NUTMEG provides a unique and intuitive graphical interface for visualization of results. Source analyses can be superimposed onto a structural MRI or headshape to provide a convenient visual correspondence to anatomy. These results can also be navigated interactively, with the spatial maps and source time series or spectrogram linked accordingly. Animations can be generated to view the evolution of neural activity over time. NUTMEG can also display brain renderings and perform spatial normalization of functional maps using SPM's engine. As a MATLAB package, the end user may easily link with other toolboxes or add customized functions. PMID:21437174
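
    Of the reconstruction families listed above, the adaptive beamformer is the easiest to sketch: for each source location, weights are built from the sensor covariance and the lead field so that activity from that location is passed while interference is suppressed. The Python snippet below shows the textbook LCMV weight formula only; it is not NUTMEG code, and the regularisation scheme shown is an assumption.

    ```python
    import numpy as np

    def lcmv_weights(cov, leadfield, reg=0.05):
        """LCMV beamformer weights for one source location (textbook formula).

        cov       : (n_channels, n_channels) sensor covariance
        leadfield : (n_channels, n_orientations) lead field for the location
        Returns weights of shape (n_orientations, n_channels).
        """
        n = cov.shape[0]
        c = cov + reg * np.trace(cov) / n * np.eye(n)     # diagonal loading
        c_inv = np.linalg.inv(c)
        gram = leadfield.T @ c_inv @ leadfield
        return np.linalg.solve(gram, leadfield.T @ c_inv)
    ```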

  11. MEG/EEG source reconstruction, statistical evaluation, and visualization with NUTMEG.

    PubMed

    Dalal, Sarang S; Zumer, Johanna M; Guggisberg, Adrian G; Trumpis, Michael; Wong, Daniel D E; Sekihara, Kensuke; Nagarajan, Srikantan S

    2011-01-01

    NUTMEG is a source analysis toolbox geared towards cognitive neuroscience researchers using MEG and EEG, including intracranial recordings. Evoked and unaveraged data can be imported to the toolbox for source analysis in either the time or time-frequency domains. NUTMEG offers several variants of adaptive beamformers, probabilistic reconstruction algorithms, as well as minimum-norm techniques to generate functional maps of spatiotemporal neural source activity. Lead fields can be calculated from single and overlapping sphere head models or imported from other software. Group averages and statistics can be calculated as well. In addition to data analysis tools, NUTMEG provides a unique and intuitive graphical interface for visualization of results. Source analyses can be superimposed onto a structural MRI or headshape to provide a convenient visual correspondence to anatomy. These results can also be navigated interactively, with the spatial maps and source time series or spectrogram linked accordingly. Animations can be generated to view the evolution of neural activity over time. NUTMEG can also display brain renderings and perform spatial normalization of functional maps using SPM's engine. As a MATLAB package, the end user may easily link with other toolboxes or add customized functions.

  12. An efficient General Transit Feed Specification (GTFS) enabled algorithm for dynamic transit accessibility analysis.

    PubMed

    Fayyaz S, S Kiavash; Liu, Xiaoyue Cathy; Zhang, Guohui

    2017-01-01

    The social functions of urbanized areas are highly dependent on and supported by convenient access to public transportation systems, particularly for less privileged populations who have restrained auto ownership. To accurately evaluate public transit accessibility, it is critical to capture the spatiotemporal variation of transit services. This can be achieved by measuring the shortest paths or minimum travel times between origin-destination (OD) pairs at each time-of-day (e.g. every minute). In recent years, General Transit Feed Specification (GTFS) data has been gaining popularity for between-station travel time estimation due to its interoperability in spatiotemporal analytics. Many software packages, such as ArcGIS, have developed toolboxes to enable travel time estimation with GTFS. They perform reasonably well in calculating travel time between OD pairs for a specific time-of-day (e.g. 8:00 AM), yet can become computationally inefficient and impractical with the increase of data dimensions (e.g. all times-of-day and large networks). In this paper, we introduce a new algorithm that is computationally elegant and mathematically efficient to address this issue. An open-source toolbox written in C++ is developed to implement the algorithm. We implemented the algorithm on the City of St. George's transit network to showcase the accessibility analysis enabled by the toolbox. The experimental evidence shows a significant reduction in computational time. The proposed algorithm and toolbox are easily transferable to other transit networks to allow transit agencies and researchers to perform high-resolution transit performance analysis.
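
    To make the underlying question concrete, the sketch below computes earliest arrival times from one origin over a set of GTFS-like timetable connections using a plain connection scan. It is a Python illustration of time-dependent transit reachability, not the algorithm proposed in the paper, and it ignores footpaths and minimum transfer times.

    ```python
    import math
    from collections import defaultdict

    def earliest_arrivals(connections, origin, departure_time):
        """Earliest arrival time at every stop reachable from `origin`.

        connections : iterable of (dep_stop, arr_stop, dep_time, arr_time),
                      with times in seconds since midnight (as in GTFS stop_times).
        """
        best = defaultdict(lambda: math.inf)
        best[origin] = departure_time
        # Scan connections in order of departure; board whenever the stop is reached in time.
        for dep_stop, arr_stop, dep_t, arr_t in sorted(connections, key=lambda c: c[2]):
            if dep_t >= best[dep_stop] and arr_t < best[arr_stop]:
                best[arr_stop] = arr_t
        return dict(best)
    ```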

  13. An efficient General Transit Feed Specification (GTFS) enabled algorithm for dynamic transit accessibility analysis

    PubMed Central

    Fayyaz S., S. Kiavash; Zhang, Guohui

    2017-01-01

    The social functions of urbanized areas are highly dependent on and supported by convenient access to public transportation systems, particularly for less privileged populations who have restrained auto ownership. To accurately evaluate public transit accessibility, it is critical to capture the spatiotemporal variation of transit services. This can be achieved by measuring the shortest paths or minimum travel times between origin-destination (OD) pairs at each time-of-day (e.g. every minute). In recent years, General Transit Feed Specification (GTFS) data has been gaining popularity for between-station travel time estimation due to its interoperability in spatiotemporal analytics. Many software packages, such as ArcGIS, have developed toolboxes to enable travel time estimation with GTFS. They perform reasonably well in calculating travel time between OD pairs for a specific time-of-day (e.g. 8:00 AM), yet can become computationally inefficient and impractical with the increase of data dimensions (e.g. all times-of-day and large networks). In this paper, we introduce a new algorithm that is computationally elegant and mathematically efficient to address this issue. An open-source toolbox written in C++ is developed to implement the algorithm. We implemented the algorithm on the City of St. George’s transit network to showcase the accessibility analysis enabled by the toolbox. The experimental evidence shows a significant reduction in computational time. The proposed algorithm and toolbox are easily transferable to other transit networks to allow transit agencies and researchers to perform high-resolution transit performance analysis. PMID:28981544

  14. EPA's Sustainable Port Communities: a Knowledge System Approach to Increasing Resiliency

    EPA Science Inventory

    Create a Sustainability Assessment Toolbox for US ports that: • Addresses multiple issues faced by ports • Builds on work from other EPA Research Programs • Complements other resiliency work from federal and NGO partners, e.g.: EPA’s Office of Transportat...

  15. Basic Radar Altimetry Toolbox: Tools and Tutorial to Use Cryosat Data

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Bronner, E.; Dinardo, S.; Lucas, B. M.; Rosmorduc, V.; Earith, D.; Niemeijer, S.

    2011-12-01

    Radar altimetry is very much a technique with expanding applications. Even if considerable effort has been invested for oceanography users, the use of altimetry data for cryosphere applications, especially with the new ESA CryoSat-2 mission data, is still somewhat tedious for new users of altimetry data products. ESA and CNES therefore developed the Basic Radar Altimetry Toolbox a few years ago, and are improving and upgrading it to fit new missions and the growing number of altimetry uses. The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions, and is ready for adaptation to Sentinel-3 products, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL, - as processing/extraction routines, through the on-line command mode, - as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent manners of using altimetry data. It is an opportunity to teach remote sensing with practical training. The Toolbox has been available since April 2007 and has been demonstrated during training courses and scientific meetings. About 2000 people have downloaded it (summer 2011), with many "newcomers" to altimetry among them, including teachers and professors worldwide. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in the recent version release (v3.0.1); others are in discussion for future development. Use cases based on CryoSat data will be presented. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  16. National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox

    USGS Publications Warehouse

    Price, Curtis

    2010-01-01

    This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells.

  17. The Diffusion Tensor Imaging Toolbox

    PubMed Central

    Alger, Jeffry R.

    2012-01-01

    During the past few years, the Journal of Neuroscience has published over 30 articles that describe investigations that used Diffusion Tensor Imaging (DTI) and related techniques as a primary observation method. This illustrates a growing interest in DTI within the basic and clinical neuroscience communities. This article summarizes DTI methodology in terms that can be immediately understood by the neuroscientist who has little previous exposure to DTI. It describes the fundamentals of water molecular diffusion coefficient measurement in brain tissue and illustrates how these fundamentals can be used to form vivid and useful depictions of white matter macroscopic and microscopic anatomy. It also describes current research applications and the technique’s attributes and limitations. It is hoped that this article will help the readers of this Journal to more effectively evaluate neuroscience studies that use DTI. PMID:22649222
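
    For readers who want the quantitative core of the method, the two most common scalar summaries derived from the eigenvalues of the fitted diffusion tensor are reproduced below. These are textbook definitions rather than anything specific to this article.

```latex
% Standard DTI summary measures: the diffusion tensor D is symmetric
% positive-definite with eigenvalues \lambda_1 \ge \lambda_2 \ge \lambda_3.
\[
\mathrm{MD} = \bar{\lambda} = \frac{\lambda_1 + \lambda_2 + \lambda_3}{3}, \qquad
\mathrm{FA} = \sqrt{\tfrac{3}{2}}\;
  \frac{\sqrt{(\lambda_1-\bar{\lambda})^2 + (\lambda_2-\bar{\lambda})^2 + (\lambda_3-\bar{\lambda})^2}}
       {\sqrt{\lambda_1^{2} + \lambda_2^{2} + \lambda_3^{2}}}
\]
```

    Mean diffusivity (MD) summarizes the overall magnitude of water diffusion, while fractional anisotropy (FA) ranges from 0 (isotropic) to 1 (maximally directional) and underlies the familiar white-matter maps.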

  18. Genetic screens and functional genomics using CRISPR/Cas9 technology.

    PubMed

    Hartenian, Ella; Doench, John G

    2015-04-01

    Functional genomics attempts to understand the genome by perturbing the flow of information from DNA to RNA to protein, in order to learn how gene dysfunction leads to disease. CRISPR/Cas9 technology is the newest tool in the geneticist's toolbox, allowing researchers to edit DNA with unprecedented ease, speed and accuracy, and representing a novel means to perform genome-wide genetic screens to discover gene function. In this review, we first summarize the discovery and characterization of CRISPR/Cas9, and then compare it to other genome engineering technologies. We discuss its initial use in screening applications, with a focus on optimizing on-target activity and minimizing off-target effects. Finally, we comment on future challenges and opportunities afforded by this technology. © 2015 FEBS.

  19. SOCIB Glider toolbox: from sensor to data repository

    NASA Astrophysics Data System (ADS)

    Pau Beltran, Joan; Heslop, Emma; Ruiz, Simón; Troupin, Charles; Tintoré, Joaquín

    2015-04-01

    Nowadays in oceanography, gliders constitute a mature, cost-effective technology for the acquisition of measurements independently of the sea state (unlike ships), providing subsurface data during sustained periods, including extreme weather events. The SOCIB glider toolbox is a set of MATLAB/Octave scripts and functions developed to manage the data collected by a glider fleet. They cover the main stages of the data management process, both in real-time and delayed-time modes: metadata aggregation, downloading, processing, and automatic generation of data products and figures. The toolbox is distributed under the GNU licence (http://www.gnu.org/copyleft/gpl.html) and is available at http://www.socib.es/users/glider/glider_toolbox.

  20. Investigation of a Graphical CONOPS Development Environment for Agile Systems Engineering - Phase 2

    DTIC Science & Technology

    2010-05-31

    Westerman, Crawshaw, Hockey, and Sauer, 1998) for practitioners. Thus, CTA extends research by providing a toolbox of methods for understanding teams...drive process and performance. Washington, DC: American Psychological Association. pp. 3-8. Shrayne, N. M., Westerman, S. J., Crawshaw, C. M

  1. The Air Sensor Citizen Science Toolbox: A Collaboration in Community Air Quality Monitoring and Mapping

    EPA Science Inventory

    Research in Action: Collect air quality data to characterize near-road/near-source hotspots; Determine potential impact on nearby residences & roadways; Case study of successful use of such data; Relationship between distance to roadways and industrial sources, exposure to...

  2. Sliding tethered ligands add topological interactions to the toolbox of ligand–receptor design

    PubMed Central

    Bauer, Martin; Kékicheff, Patrick; Iss, Jean; Fajolles, Christophe; Charitat, Thierry; Daillant, Jean; Marques, Carlos M.

    2015-01-01

    Adhesion in the biological realm is mediated by specific lock-and-key interactions between ligand–receptor pairs. These complementary moieties are ubiquitously anchored to substrates by tethers that control the interaction range and the mobility of the ligands and receptors, thus tuning the kinetics and strength of the binding events. Here we add sliding anchoring to the toolbox of ligand–receptor design by developing a family of tethered ligands for which the spacer can slide at the anchoring point. Our results show that this additional sliding degree of freedom changes the nature of the adhesive contact by extending the spatial range over which binding may sustain a significant force. By introducing sliding tethered ligands with self-regulating length, this work paves the way for the development of versatile and reusable bio-adhesive substrates with potential applications for drug delivery and tissue engineering. PMID:26350224

  3. Expanding the KATE toolbox

    NASA Technical Reports Server (NTRS)

    Thomas, Stan J.

    1993-01-01

    KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. In order to bring KATE to the level of performance, functionality, and integratability needed for firing room applications, efforts are underway to implement KATE in the C++ programming language using an X-windows interface. Two programs that were designed and added to the collection of tools comprising the KATE toolbox are described. The first tool, called the schematic viewer, gives the KATE user the capability to view digitized schematic drawings in the KATE environment. The second tool, called the model editor, gives the KATE model builder a tool for creating and editing knowledge base files. Design and implementation issues related to these two tools are discussed; this discussion will be useful to anyone maintaining or extending either the schematic viewer or the model editor.

  4. MuTE: a MATLAB toolbox to compare established and novel estimators of the multivariate transfer entropy.

    PubMed

    Montalto, Alessandro; Faes, Luca; Marinazzo, Daniele

    2014-01-01

    A challenge for physiologists and neuroscientists is to map information transfer between components of the systems that they study at different scales, in order to derive important knowledge on structure and function from the analysis of the recorded dynamics. The components of physiological networks often interact in a nonlinear way and through mechanisms which are in general not completely known. It is then safer that the method of choice for analyzing these interactions does not rely on any model or assumption on the nature of the data and their interactions. Transfer entropy has emerged as a powerful tool to quantify directed dynamical interactions. In this paper we compare different approaches to evaluate transfer entropy, some of them already proposed, some novel, and present their implementation in a freeware MATLAB toolbox. Applications to simulated and real data are presented.
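
    As background for readers new to the measure, the standard bivariate, unit-history definition of transfer entropy from a source X to a target Y is reproduced below; the toolbox compares estimators of multivariate, conditional extensions of this quantity.

```latex
% Transfer entropy from X to Y (Schreiber's definition, unit-length histories):
\[
TE_{X \to Y} \;=\; \sum_{y_{t+1},\,y_t,\,x_t} p(y_{t+1}, y_t, x_t)\,
\log \frac{p(y_{t+1} \mid y_t, x_t)}{p(y_{t+1} \mid y_t)}
\]
```

    Intuitively, TE is positive when knowing the past of X improves prediction of the next value of Y beyond what Y's own past already provides; the estimators compared in the paper differ in how the conditional probabilities are approximated from data.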

  5. MuTE: A MATLAB Toolbox to Compare Established and Novel Estimators of the Multivariate Transfer Entropy

    PubMed Central

    Montalto, Alessandro; Faes, Luca; Marinazzo, Daniele

    2014-01-01

    A challenge for physiologists and neuroscientists is to map information transfer between components of the systems that they study at different scales, in order to derive important knowledge on structure and function from the analysis of the recorded dynamics. The components of physiological networks often interact in a nonlinear way and through mechanisms which are in general not completely known. It is then safer that the method of choice for analyzing these interactions does not rely on any model or assumption on the nature of the data and their interactions. Transfer entropy has emerged as a powerful tool to quantify directed dynamical interactions. In this paper we compare different approaches to evaluate transfer entropy, some of them already proposed, some novel, and present their implementation in a freeware MATLAB toolbox. Applications to simulated and real data are presented. PMID:25314003

  6. Open source tools for the information theoretic analysis of neural data.

    PubMed

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
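
    For orientation, the core quantity such toolboxes estimate is the Shannon mutual information between a stimulus S and a neural response R; bias-correction strategies differ between implementations, but the underlying definition is the standard one below.

```latex
% Mutual information between stimulus S and response R, in bits:
\[
I(S;R) \;=\; \sum_{s,\,r} p(s, r)\,\log_2 \frac{p(s, r)}{p(s)\,p(r)}
\]
```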

  7. Expanding the metabolic engineering toolbox with directed evolution.

    PubMed

    Abatemarco, Joseph; Hill, Andrew; Alper, Hal S

    2013-12-01

    Cellular systems can be engineered into factories that produce high-value chemicals from renewable feedstock. Such an approach requires an expanded toolbox for metabolic engineering. Recently, protein engineering and directed evolution strategies have started to play a growing and critical role within metabolic engineering. This review focuses on the various ways in which directed evolution can be applied in conjunction with metabolic engineering to improve product yields. Specifically, we discuss the application of directed evolution on both catalytic and non-catalytic traits of enzymes, on regulatory elements, and on whole genomes in a metabolic engineering context. We demonstrate how the goals of metabolic pathway engineering can be achieved in part through evolving cellular parts as opposed to traditional approaches that rely on gene overexpression and deletion. Finally, we discuss the current limitations in screening technology that hinder the full implementation of a metabolic pathway-directed evolution approach. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Computational Performance of a Parallelized Three-Dimensional High-Order Spectral Element Toolbox

    NASA Astrophysics Data System (ADS)

    Bosshard, Christoph; Bouffanais, Roland; Clémençon, Christian; Deville, Michel O.; Fiétier, Nicolas; Gruber, Ralf; Kehtari, Sohrab; Keller, Vincent; Latt, Jonas

    In this paper, a comprehensive performance review of an MPI-based high-order three-dimensional spectral element method C++ toolbox is presented. The focus is on the performance evaluation of several aspects, with a particular emphasis on parallel efficiency. The performance evaluation is analyzed with the help of a time prediction model based on a parameterization of the application and the hardware resources. A tailor-made CFD computation benchmark case is introduced and used to carry out this review, stressing the particular interest for clusters with up to 8192 cores. Some problems in the parallel implementation have been detected and corrected. The theoretical complexities with respect to the number of elements, to the polynomial degree, and to communication needs are correctly reproduced. It is concluded that this type of code has a nearly perfect speed-up on machines with thousands of cores, and is ready to make the step to next-generation petaflop machines.
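
    For reference, the strong-scaling metrics implied by phrases such as "nearly perfect speed-up" are the textbook definitions below; they are not the paper's specific time prediction model.

```latex
% Speed-up and parallel efficiency on p cores, with T(p) the wall-clock time:
\[
S(p) = \frac{T(1)}{T(p)}, \qquad E(p) = \frac{S(p)}{p},
\]
% ideal scaling corresponds to S(p) \approx p and E(p) \approx 1.
```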

  9. A Data Analysis Toolbox for Modeling the Global Food-Energy-Water Nexus

    NASA Astrophysics Data System (ADS)

    AghaKouchak, A.; Sadegh, M.; Mallakpour, I.

    2017-12-01

    Water, food, and energy systems are highly interconnected. More than seventy percent of global water resources are used for food production. Water withdrawal, purification, and transfer systems are energy intensive. Furthermore, energy generation strongly depends on water availability. Therefore, considering the interactions in the nexus of water, food and energy is crucial for sustainable management of available resources. In this presentation, we introduce a user-friendly data analysis toolbox that mines the available global data on food, energy and water, and analyzes their interactions. This toolbox provides estimates of the water footprint for a wide range of food types in different countries and also approximates the required energy and water resources. The toolbox also provides estimates of the corresponding emissions and biofuel production of different crops. In summary, this toolbox allows evaluating dependencies of the food, energy, and water systems at the country scale. We present a global analysis of the interactions between water, food and energy from different perspectives, including efficiency and diversity of resource use.

  10. TopoToolbox: using sensor topography to calculate psychologically meaningful measures from event-related EEG/MEG.

    PubMed

    Tian, Xing; Poeppel, David; Huber, David E

    2011-01-01

    The open-source toolbox "TopoToolbox" is a suite of functions that use sensor topography to calculate psychologically meaningful measures (similarity, magnitude, and timing) from multisensor event-related EEG and MEG data. Using a GUI and data visualization, TopoToolbox can be used to calculate and test the topographic similarity between different conditions (Tian and Huber, 2008). This topographic similarity indicates whether different conditions involve a different distribution of underlying neural sources. Furthermore, this similarity calculation can be applied at different time points to discover when a response pattern emerges (Tian and Poeppel, 2010). Because the topographic patterns are obtained separately for each individual, these patterns are used to produce reliable measures of response magnitude that can be compared across individuals using conventional statistics (Davelaar et al. Submitted and Huber et al., 2008). TopoToolbox can be freely downloaded. It runs under MATLAB (The MathWorks, Inc.) and supports user-defined data structure as well as standard EEG/MEG data import using EEGLAB (Delorme and Makeig, 2004).
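
    As a rough illustration of what a topographic similarity measure looks like, the sketch below computes a cosine similarity between two multisensor patterns. This is a generic NumPy illustration, not necessarily the exact projection test implemented in the (MATLAB-based) toolbox, and the patterns are hypothetical.

```python
# Generic illustration: quantify the similarity of two sensor topographies as
# the cosine of the angle between them, insensitive to overall response magnitude.
import numpy as np

def topo_similarity(pattern_a, pattern_b):
    """Cosine similarity between two multisensor patterns (1D arrays)."""
    a = np.asarray(pattern_a, dtype=float)
    b = np.asarray(pattern_b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 4-sensor patterns from two experimental conditions
cond1 = [1.2, -0.4, 0.8, 0.1]
cond2 = [1.0, -0.5, 0.9, 0.2]
print(topo_similarity(cond1, cond2))  # close to 1 -> similar source distribution
```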

  11. User Guide for Compressible Flow Toolbox Version 2.1 for Use With MATLAB(Registered Trademark); Version 7

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2006-01-01

    This report provides a user guide for the Compressible Flow Toolbox, a collection of algorithms that solve almost 300 linear and nonlinear classical compressible flow relations. The algorithms, implemented in the popular MATLAB programming language, are useful for analysis of one-dimensional steady flow with constant entropy, friction, heat transfer, or shock discontinuities. The solutions do not include any gas dissociative effects. The toolbox also contains functions for comparing and validating the equation-solving algorithms against solutions previously published in the open literature. The classical equations solved by the Compressible Flow Toolbox are: isentropic-flow equations, Fanno flow equations (pertaining to flow of an ideal gas in a pipe with friction), Rayleigh flow equations (pertaining to frictionless flow of an ideal gas, with heat transfer, in a pipe of constant cross section), normal-shock equations, oblique-shock equations, and Prandtl-Meyer expansion equations. At the time this report was published, the Compressible Flow Toolbox was available without cost from the NASA Software Repository.
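
    To give a flavour of the relations involved, the sketch below evaluates the classical isentropic-flow ratios for a calorically perfect gas. The formulas are textbook results; the function name and interface are illustrative (and in Python) rather than those of the MATLAB toolbox itself.

```python
# Classical isentropic-flow ratios as a function of Mach number for a
# calorically perfect gas with ratio of specific heats gamma.
def isentropic_ratios(mach, gamma=1.4):
    """Return (T0/T, p0/p, rho0/rho) for a given Mach number."""
    t_ratio = 1.0 + 0.5 * (gamma - 1.0) * mach ** 2       # stagnation-to-static temperature
    p_ratio = t_ratio ** (gamma / (gamma - 1.0))          # stagnation-to-static pressure
    rho_ratio = t_ratio ** (1.0 / (gamma - 1.0))          # stagnation-to-static density
    return t_ratio, p_ratio, rho_ratio

print(isentropic_ratios(2.0))  # e.g. T0/T = 1.8 at Mach 2 for gamma = 1.4
```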

  12. A Model of Human Orientation and Self Motion Perception during Body Acceleration: The Orientation Modeling System

    DTIC Science & Technology

    2016-09-28

    previous research and modeling results. The OMS and Perception Toolbox were used to perform a case study of an F18 mishap. Model results imply that...

  13. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG

    PubMed Central

    Cowley, Benjamin U.; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap. PMID:29692705

  14. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG.

    PubMed

    Cowley, Benjamin U; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap.

  15. morphforge: a toolbox for simulating small networks of biologically detailed neurons in Python

    PubMed Central

    Hull, Michael J.; Willshaw, David J.

    2014-01-01

    The broad structure of a modeling study can often be explained over a cup of coffee, but converting this high-level conceptual idea into graphs of the final simulation results may require many weeks of sitting at a computer. Although models themselves can be complex, often many mental resources are wasted working around complexities of the software ecosystem such as fighting to manage files, interfacing between tools and data formats, finding mistakes in code or working out the units of variables. morphforge is a high-level, Python toolbox for building and managing simulations of small populations of multicompartmental biophysical model neurons. An entire in silico experiment, including the definition of neuronal morphologies, channel descriptions, stimuli, visualization and analysis of results can be written within a single short Python script using high-level objects. Multiple independent simulations can be created and run from a single script, allowing parameter spaces to be investigated. Consideration has been given to the reuse of both algorithmic and parameterizable components to allow both specific and stochastic parameter variations. Some other features of the toolbox include: the automatic generation of human-readable documentation (e.g., PDF files) about a simulation; the transparent handling of different biophysical units; a novel mechanism for plotting simulation results based on a system of tags; and an architecture that supports both the use of established formats for defining channels and synapses (e.g., MODL files), and the possibility to support other libraries and standards easily. We hope that this toolbox will allow scientists to quickly build simulations of multicompartmental model neurons for research and serve as a platform for further tool development. PMID:24478690

  16. Local soil quality assessment of north-central Namibia: integrating farmers' and technical knowledge

    NASA Astrophysics Data System (ADS)

    Prudat, Brice; Bloemertz, Lena; Kuhn, Nikolaus J.

    2018-02-01

    Soil degradation is a major threat to farmers of semi-arid north-central Namibia. Soil conservation practices can be promoted by the development of soil quality (SQ) evaluation toolboxes that provide ways to evaluate soil degradation. However, such toolboxes must be adapted to local conditions to reach farmers. Based on qualitative (interviews and soil descriptions) and quantitative (laboratory analyses) data, we developed a set of SQ indicators relevant for our study area that integrates farmers' field experiences (FFEs) and technical knowledge. We suggest using participatory mapping to delineate soil units (Oshikwanyama soil units, KwSUs) based on FFEs, which highlight mostly soil properties that integrate long-term productivity and soil hydrological characteristics (i.e. internal SQ). The actual SQ evaluation of a location depends on the KwSU described and is thereafter assessed by field soil texture (i.e. chemical fertility potential) and by soil colour shade (i.e. SOC status). This three-level information aims to reveal SQ improvement potential by comparing, for any location, (a) estimated clay content against median clay content (specific to the KwSU) and (b) soil organic status against calculated optimal values (which depend on clay content). The combination of farmers' and technical assessments combines the advantages of both systems of knowledge, namely the integrated long-term knowledge of the farmers and a short- and medium-term SQ status assessment. The toolbox is a suggestion for evaluating SQ and aims to help farmers, rural development planners and researchers from all fields of study understand SQ issues in north-central Namibia. This suggested SQ toolbox is adapted to a restricted area of north-central Namibia, but similar tools could be developed in most areas where small-scale agriculture prevails.
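
    A schematic sketch of the three-level comparison described above is given below. The unit names, median clay contents, and the optimal-SOC function are hypothetical placeholders; the paper derives such reference values from field descriptions and laboratory analyses of the KwSUs.

```python
# Schematic sketch of the SQ comparison logic; all reference values are hypothetical.
KWSU_MEDIAN_CLAY = {"unit_A": 8.0, "unit_B": 12.0}   # % clay per soil unit (placeholder)

def optimal_soc(clay_percent):
    """Hypothetical optimum soil organic carbon (%) as a function of clay content."""
    return 0.4 + 0.05 * clay_percent

def sq_improvement_potential(kwsu, clay_percent, soc_percent):
    """Compare a location's texture and SOC status against unit-specific references."""
    return {
        "clay_below_unit_median": clay_percent < KWSU_MEDIAN_CLAY[kwsu],
        "soc_below_optimum": soc_percent < optimal_soc(clay_percent),
    }

print(sq_improvement_potential("unit_B", clay_percent=9.0, soc_percent=0.6))
# -> {'clay_below_unit_median': True, 'soc_below_optimum': True}
```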

  17. Modular-based multiscale modeling on viscoelasticity of polymer nanocomposites

    NASA Astrophysics Data System (ADS)

    Li, Ying; Liu, Zeliang; Jia, Zheng; Liu, Wing Kam; Aldousari, Saad M.; Hedia, Hassan S.; Asiri, Saeed A.

    2017-02-01

    Polymer nanocomposites have been envisioned as advanced materials for improving the mechanical performance of neat polymers used in aerospace, petrochemical, environment and energy industries. With the filler size approaching the nanoscale, composite materials tend to demonstrate remarkable thermomechanical properties, even with addition of a small amount of fillers. These observations confront the classical composite theories and are usually attributed to the high surface-area-to-volume-ratio of the fillers, which can introduce strong nanoscale interfacial effect and relevant long-range perturbation on polymer chain dynamics. Despite decades of research aimed at understanding interfacial effect and improving the mechanical performance of composite materials, it is not currently possible to accurately predict the mechanical properties of polymer nanocomposites directly from their molecular constituents. To overcome this challenge, different theoretical, experimental and computational schemes will be used to uncover the key physical mechanisms at the relevant spatial and temporal scales for predicting and tuning constitutive behaviors in silico, thereby establishing a bottom-up virtual design principle to achieve unprecedented mechanical performance of nanocomposites. A modular-based multiscale modeling approach for viscoelasticity of polymer nanocomposites has been proposed and discussed in this study, including four modules: (A) neat polymer toolbox; (B) interphase toolbox; (C) microstructural toolbox and (D) homogenization toolbox. Integrating these modules together, macroscopic viscoelasticity of polymer nanocomposites could be directly predicted from their molecular constituents. This will maximize the computational ability to design novel polymer composites with advanced performance. More importantly, elucidating the viscoelasticity of polymer nanocomposites through fundamental studies is a critical step to generate an integrated computational material engineering principle for discovering and manufacturing new composites with transformative impact on aerospace, automobile, petrochemical industries.

  18. On Feature Extraction from Large Scale Linear LiDAR Data

    NASA Astrophysics Data System (ADS)

    Acharjee, Partha Pratim

    Airborne light detection and ranging (LiDAR) can generate co-registered elevation and intensity maps over large terrain. The co-registered 3D map and intensity information can be used efficiently for different feature extraction applications. In this dissertation, we developed two algorithms for feature extraction and applied the extracted features in practical applications. One of the developed algorithms can map still and flowing waterbody features, and the other can extract building features and estimate solar potential on rooftops and facades. Remote sensing capabilities, the distinguishing characteristics of laser returns from water surfaces, and specific data collection procedures give LiDAR data an edge in this application domain. Furthermore, water surface mapping solutions must work on extremely large datasets, from a thousand square miles to hundreds of thousands of square miles. National and state-wide map generation and updating, and hydro-flattening of LiDAR data for many other applications, are two leading needs of water surface mapping. These call for as much automation as possible. Researchers have developed many semi-automated algorithms using multiple semi-automated tools and human interventions. This work describes a consolidated algorithm and toolbox developed for large-scale, automated water surface mapping. Geometric features, such as the flatness of the water surface and the pronounced elevation change at the water-land interface, and optical properties, such as dropouts caused by specular reflection and bimodal intensity distributions, were some of the linear LiDAR features exploited for water surface mapping. Large-scale data handling capabilities are incorporated by automated and intelligent windowing, by resolving boundary issues, and by integrating all results into a single output. The whole algorithm is developed as an ArcGIS toolbox using Python libraries. Testing and validation are performed on large datasets to determine the effectiveness of the toolbox, and results are presented. Significant power demand is located in urban areas, where, theoretically, a large amount of building surface area is also available for solar panel installation. Therefore, property owners and power generation companies can benefit from a citywide solar potential map, which can provide the estimated annual solar energy available at a given location. An efficient solar potential measurement is a prerequisite for an effective solar energy system in an urban area. In addition, the calculation of solar potential from rooftops and building facades could open up a wide variety of options for solar panel installations. However, complex urban scenes make it hard to estimate the solar potential, partly because of shadows cast by the buildings. LiDAR-based 3D city models could possibly be the right technology for solar potential mapping. However, most current LiDAR-based local solar potential assessment algorithms address only rooftop potential calculation, even though building facades can contribute a significant amount of viable surface area for solar panel installation. In this work, we introduce a new algorithm to calculate the solar potential of both rooftops and building facades. The solar potential received by rooftops and facades over the year is also investigated in the test area.
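
    As a generic illustration of two of the cues mentioned above (surface flatness and weak returns over water), and not the dissertation's toolbox itself, the following sketch flags candidate water cells in small synthetic elevation and intensity rasters; all thresholds and data are hypothetical.

```python
# Generic water-mask heuristic combining a flatness cue from the elevation
# raster with a low-return cue from the intensity raster.
import numpy as np

def water_mask(elevation, intensity, slope_thresh=0.5, intensity_thresh=20.0):
    """Flag cells that are locally flat and have weak LiDAR returns."""
    dzdy, dzdx = np.gradient(elevation)            # per-cell elevation change
    slope = np.hypot(dzdx, dzdy)                   # flatness cue
    return (slope < slope_thresh) & (intensity < intensity_thresh)

# Tiny synthetic rasters: a flat, low-intensity patch in the lower-right corner
elev = np.fromfunction(lambda r, c: 10.0 + 0.1 * r * c, (50, 50))
elev[30:, 30:] = 5.0                               # flat waterbody surface
inten = np.full((50, 50), 120.0)
inten[30:, 30:] = 5.0                              # specular dropouts over water
print(water_mask(elev, inten).sum(), "cells flagged as water")
```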

  19. Sum frequency generation vibrational spectroscopy (SFG-VS) for complex molecular surfaces and interfaces: Spectral lineshape measurement and analysis plus some controversial issues

    NASA Astrophysics Data System (ADS)

    Wang, Hong-Fei

    2016-12-01

    Sum-frequency generation vibrational spectroscopy (SFG-VS) was first developed in the 1980s and has proven to be a uniquely sensitive and surface/interface selective spectroscopic probe for characterization of the structure, conformation and dynamics of molecular surfaces and interfaces. In recent years, there has been much progress in the development of methodology and instrumentation in the SFG-VS toolbox that has significantly broadened its application to complex molecular surfaces and interfaces. In this review, after presenting a unified view on the theory and methodology focusing on the SFG-VS spectral lineshape, as well as the new opportunities in SFG-VS applications with such developments, some of the controversial issues that have been puzzling the community are discussed. The aim of this review is to present to the researchers and students interested in molecular surfaces and interfacial sciences up-to-date perspectives complementary to the existing textbooks and reviews on SFG-VS.

  20. Chemical Synthesis of Hydrocarbon-Stapled Peptides for Protein Interaction Research and Therapeutic Targeting

    PubMed Central

    Bird, Gregory H.; Crannell, W. Christian; Walensky, Loren D.

    2016-01-01

    The peptide alpha-helix represents one of Nature’s most featured protein shapes and is employed in a diversity of protein architectures, spanning the very cytoskeletal infrastructure of the cell to the most intimate contact points between crucial signaling proteins. By installing an all-hydrocarbon crosslink into native sequences, we recapitulate the shape and biological activity of natural peptide alpha-helices, yielding a chemical toolbox to both interrogate the protein interactome and modulate interaction networks for potential therapeutic benefit. Here, we describe our latest approach to synthesizing Stabilized Alpha-Helices (SAH) corresponding to key protein interaction domains. We emphasize a stepwise approach to the production of crosslinking non-natural amino acids, their incorporation into peptide templates, and the application of ruthenium-catalyzed ring closing metathesis to generate hydrocarbon-stapled peptides. Through facile derivatization and functionalization steps, SAHs can be tailored for a broad range of applications in biochemical, structural, proteomic, cellular and in vivo studies. PMID:23801563

  1. Sum frequency generation vibrational spectroscopy (SFG-VS) for complex molecular surfaces and interfaces: Spectral lineshape measurement and analysis plus some controversial issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hong-Fei

    Sum-frequency generation vibrational spectroscopy (SFG-VS) was first developed in the 1980s and has proven to be a uniquely sensitive and surface/interface selective spectroscopic probe for characterization of the structure, conformation and dynamics of molecular surfaces and interfaces. In recent years, there has been significant progress in the development of methodology and instrumentation in the SFG-VS toolbox that has significantly broadened the application to complex molecular surfaces and interfaces. In this review, after presenting a unified view on the theory and methodology focusing on the SFG-VS spectral lineshape, as well as the new opportunities in SFG-VS applications with such developments, some of the controversial issues that have been puzzling the community are to be discussed. The aim of this review is to present to the researchers and students interested in molecular surfaces and interfacial sciences up-to-date perspectives complementary to the existing textbooks and reviews on SFG-VS.

  2. Interdisciplinary approach towards a systems medicine toolbox using the example of inflammatory diseases.

    PubMed

    Bauer, Christian R; Knecht, Carolin; Fretter, Christoph; Baum, Benjamin; Jendrossek, Sandra; Rühlemann, Malte; Heinsen, Femke-Anouska; Umbach, Nadine; Grimbacher, Bodo; Franke, Andre; Lieb, Wolfgang; Krawczak, Michael; Hütt, Marc-Thorsten; Sax, Ulrich

    2017-05-01

    Electronic access to multiple data types, from generic information on biological systems at different functional and cellular levels to high-throughput molecular data from human patients, is a prerequisite of successful systems medicine research. However, scientists often encounter technical and conceptual difficulties that forestall the efficient and effective use of these resources. We summarize and discuss some of these obstacles, and suggest ways to avoid or evade them. The methodological gap between data capturing and data analysis is huge in human medical research. Primary data producers often do not fully apprehend the scientific value of their data, whereas data analysts may be ignorant of the circumstances under which the data were collected. Therefore, the provision of easy-to-use data access tools not only helps to improve data quality on the part of the data producers but also is likely to foster an informed dialogue with the data analysts. We propose a means to integrate phenotypic data, questionnaire data and microbiome data with a user-friendly Systems Medicine toolbox embedded into i2b2/tranSMART. Our approach is exemplified by the integration of a basic outlier detection tool and a more advanced microbiome analysis (alpha diversity) script. Continuous discussion with clinicians, data managers, biostatisticians and systems medicine experts should serve to enrich even further the functionality of toolboxes like ours, being geared to be used by 'informed non-experts' but at the same time attuned to existing, more sophisticated analysis tools. © The Author 2016. Published by Oxford University Press.

  3. Putting tools in the toolbox: Development of a free, open-source toolbox for quantitative image analysis of porous media.

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.

    2014-12-01

    X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil and rock. Given that most synchrotron facilities have user programs which grant academic researchers access to facilities and x-ray imaging equipment free of charge, a key limitation or hindrance for small research groups interested in conducting x-ray imaging experiments is the financial cost associated with post-experiment data analysis. While the cost of high performance computing hardware continues to decrease, expenses associated with licensing commercial software packages for quantitative image analysis continue to increase, with current prices being as high as $24,000 USD for a single-user license. As construction of the Nation's newest synchrotron accelerator nears completion, a significant effort is being made here at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required for performing sophisticated quantitative analysis of multidimensional porous media data sets, collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite that is in development here at BNL, including major design decisions, a demonstration of several test cases illustrating currently available quantitative tools for analysis and characterization of multidimensional porous media image data sets, and plans for their future development.

  4. Sensor Selection from Independence Graphs using Submodularity

    DTIC Science & Technology

    2017-02-01

    Krause, B. McMahan, Guestrin C., and Gupta A., “Robust submodular observation selection,” Journal of Machine Learning Research (JMLR), vol. 9, pp. 2761...235–257. Springer Berlin Heidelberg, 1983. [10] A. Krause, “SFO: A toolbox for submodular function optimization,” J. Mach. Learn. Res., vol. 11, pp

  5. RTI Strategies That Work in the K-2 Classroom

    ERIC Educational Resources Information Center

    Johnson, Eli; Karns, Michelle

    2011-01-01

    Targeted specifically to K-2 classrooms, the 25 Response-to-Intervention (RTI) strategies in this book are research-based and perfect for teachers who want to expand their toolbox of classroom interventions that work! Contents include: (1) Listening Strategies--Help students focus and understand; (2) Reading Strategies--Help students comprehend…

  6. Strengthening the Student Toolbox: Study Strategies to Boost Learning

    ERIC Educational Resources Information Center

    Dunlosky, John

    2013-01-01

    Before the "big test" did you use the following study strategies: highlighting, rereading, and cramming? As students many of us probably did, yet research shows that while these three strategies are commonly used, they have been ineffective in retaining information. Learning strategies have been discussed in almost every textbook on…

  7. Concept Mapping: An "Instagram" of Students' Thinking

    ERIC Educational Resources Information Center

    Campbell, Laurie O.

    2016-01-01

    Minimal research has been accumulated in the field of social studies education for Novakian concept mapping, yet there are many benefits from adding this learning tool to a teacher's instructional toolbox. The article defines Novakian concept mapping and invites readers to adopt digital Novakian concept mapping into the social studies classroom as…

  8. A Situation Analysis Toolbox for Course of Action Evaluation

    DTIC Science & Technology

    2011-06-01

    Presented at the 16th International Command and Control Research and Technology Symposium (ICCRTS 2011), Québec City, Québec, Canada, June 21-23, 2011...conference on Information Fusion, Edinburgh, Scotland, July 2010. [2] Mica R. Endsley and Daniel J. Garland. Situation Awareness Analysis and Measurement

  9. The Air Sensor Citizen Science Toolbox: A Collaboration in Community Air Quality Monitoring and Mapping?

    EPA Science Inventory

    Project Goal: Develop tools Citizen Scientists can use to assist them in conducting environmental monitoring. Research Plan: Identify a citizen science project as a potential pilot study location; establish their pollutant monitoring interests; develop a sensor package to meet their needs ...

  10. The Radiology Resident iPad Toolbox: an educational and clinical tool for radiology residents.

    PubMed

    Sharpe, Emerson E; Kendrick, Michael; Strickland, Colin; Dodd, Gerald D

    2013-07-01

    Tablet computing and mobile resources are the hot topics in technology today, with that interest spilling into the medical field. To improve resident education, a fully configured iPad, referred to as the "Radiology Resident iPad Toolbox," was created and implemented at the University of Colorado. The goal was to create a portable device with comprehensive educational, clinical, and communication tools that would contain all necessary resources for an entire 4-year radiology residency. The device was distributed to a total of 34 radiology residents (8 first-year residents, 8 second-year residents, 9 third-year residents, and 9 fourth-year residents). This article describes the process used to develop and deploy the device, provides a distillation of useful applications and resources decided upon after extensive evaluation, and assesses the impact this device had on resident education. The Radiology Resident iPad Toolbox is a cost-effective, portable, educational instrument that has increased studying efficiency; improved access to study materials such as books, radiology cases, lectures, and web-based resources; and increased interactivity in educational conferences and lectures through the use of audience-response software, with questions geared toward the new ABR board format. This preconfigured tablet fully embraces the technology shift into mobile computing and represents a paradigm shift in educational strategy. Copyright © 2013 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  11. Demonstration of a Fractured Rock Geophysical Toolbox (FRGT) for Characterization and Monitoring of DNAPL Biodegradation in Fractured Rock Aquifers

    DTIC Science & Technology

    2016-01-01

    USER'S GUIDE: Demonstration of a Fractured Rock Geophysical Toolbox (FRGT) for Characterization and Monitoring of DNAPL Biodegradation in Fractured Rock Aquifers. F.D. Day-Lewis, C.D. Johnson, J.H. Williams, C.L...are doomed to failure. Keywords: DNAPL biodegradation characterization and monitoring, remediation, fractured rock aquifers.

  12. Explaining Society: An Expanded Toolbox for Social Scientists

    PubMed Central

    Bell, David C.; Atkinson-Schnell, Jodie L.; DiBacco, Aron E.

    2012-01-01

    We propose for social scientists a theoretical toolbox containing a set of motivations that neurobiologists have recently validated. We show how these motivations can be used to create a theory of society recognizably similar to existing stable societies (sustainable, self-reproducing, and largely peaceful). Using this toolbox, we describe society in terms of three institutions: economy (a source of sustainability), government (peace), and the family (reproducibility). Conducting a thought experiment in three parts, we begin with a simple theory with only two motivations. We then create successive theories that systematically add motivations, showing that each element in the toolbox makes its own contribution to explain the workings of a stable society and that the family has a critical role in this process. PMID:23082093

  13. Biomedical image analysis and processing in clouds

    NASA Astrophysics Data System (ADS)

    Bednarz, Tomasz; Szul, Piotr; Arzhaeva, Yulia; Wang, Dadong; Burdett, Neil; Khassapov, Alex; Chen, Shiping; Vallotton, Pascal; Lagerstrom, Ryan; Gureyev, Tim; Taylor, John

    2013-10-01

    The Cloud-based Image Analysis and Processing Toolbox project runs on the Australian National eResearch Collaboration Tools and Resources (NeCTAR) cloud infrastructure and gives researchers access to biomedical image processing and analysis services via remotely accessible user interfaces. By providing user-friendly access to cloud computing resources and new workflow-based interfaces, our solution enables researchers to carry out various challenging image analysis and reconstruction tasks. Several case studies will be presented during the conference.

  14. Cantera Integration with the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Lavelle, Thomas M.; Chapman, Jeffryes W.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    NASA Glenn Research Center (GRC) has recently developed a software package for modeling generic thermodynamic systems called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a library of building blocks that can be assembled to represent any thermodynamic system in the Simulink(Registered TradeMark) (The MathWorks, Inc.) environment. These elements, along with a Newton Raphson solver (also provided as part of the T-MATS package), enable users to create models of a wide variety of systems. The current version of T-MATS (v1.0.1) uses tabular data for providing information about a specific mixture of air, water (humidity), and hydrocarbon fuel in calculations of thermodynamic properties. The capabilities of T-MATS can be expanded by integrating it with the Cantera thermodynamic package. Cantera is an object-oriented analysis package that calculates thermodynamic solutions for any mixture defined by the user. Integration of Cantera with T-MATS extends the range of systems that may be modeled using the toolbox. In addition, the library of elements released with Cantera was developed using native MATLAB M-files, allowing for quicker prototyping of elements. This paper discusses how the new Cantera-based elements are created and provides examples for using T-MATS integrated with Cantera.
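
    To illustrate the kind of mixture-property calculation that Cantera contributes, the sketch below uses Cantera's Python interface rather than the T-MATS Simulink blocks; the state, composition, and mechanism file are arbitrary examples, and the `cantera` package must be installed.

```python
# Sketch of a Cantera mixture-property calculation (Python interface).
# 'gri30.yaml' ships with recent Cantera releases (older releases use 'gri30.cti').
import cantera as ct

gas = ct.Solution('gri30.yaml')                                # user-defined mixture mechanism
gas.TPX = 800.0, 5.0 * ct.one_atm, 'CH4:1, O2:2, N2:7.52'      # T [K], p [Pa], mole fractions

print('h  =', gas.enthalpy_mass, 'J/kg')                       # mixture enthalpy
print('cp =', gas.cp_mass, 'J/(kg*K)')                         # specific heat

gas.equilibrate('HP')                                          # constant-enthalpy/pressure burn
print('adiabatic flame temperature ~', gas.T, 'K')
```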

  15. Cantera Integration with the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Lavelle, Thomas M.; Chapman, Jeffryes W.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    NASA Glenn Research Center (GRC) has recently developed a software package for modeling generic thermodynamic systems called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a library of building blocks that can be assembled to represent any thermodynamic system in the Simulink (The MathWorks, Inc.) environment. These elements, along with a Newton Raphson solver (also provided as part of the T-MATS package), enable users to create models of a wide variety of systems. The current version of T-MATS (v1.0.1) uses tabular data for providing information about a specific mixture of air, water (humidity), and hydrocarbon fuel in calculations of thermodynamic properties. The capabilities of T-MATS can be expanded by integrating it with the Cantera thermodynamic package. Cantera is an object-oriented analysis package that calculates thermodynamic solutions for any mixture defined by the user. Integration of Cantera with T-MATS extends the range of systems that may be modeled using the toolbox. In addition, the library of elements released with Cantera was developed using native MATLAB M-files, allowing for quicker prototyping of elements. This paper discusses how the new Cantera-based elements are created and provides examples for using T-MATS integrated with Cantera.

  16. Integration of Lead Discovery Tactics and the Evolution of the Lead Discovery Toolbox.

    PubMed

    Leveridge, Melanie; Chung, Chun-Wa; Gross, Jeffrey W; Phelps, Christopher B; Green, Darren

    2018-06-01

    There has been much debate around the success rates of various screening strategies to identify starting points for drug discovery. Although high-throughput target-based and phenotypic screening has been the focus of this debate, techniques such as fragment screening, virtual screening, and DNA-encoded library screening are also increasingly reported as a source of new chemical equity. Here, we provide examples in which integration of more than one screening approach has improved the campaign outcome and discuss how strengths and weaknesses of various methods can be used to build a complementary toolbox of approaches, giving researchers the greatest probability of successfully identifying leads. Among others, we highlight case studies for receptor-interacting serine/threonine-protein kinase 1 and the bromo- and extra-terminal domain family of bromodomains. In each example, the unique insight or chemistries individual approaches provided are described, emphasizing the synergy of information obtained from the various tactics employed and the particular question each tactic was employed to answer. We conclude with a short prospective discussing how screening strategies are evolving, what this screening toolbox might look like in the future, how to maximize success through integration of multiple tactics, and scenarios that drive selection of one combination of tactics over another.

  17. Toolbox for Renewable Energy Project Development

    EPA Pesticide Factsheets

    The Toolbox for Renewable Energy Project Development summarizes key project development issues, addresses how to overcome major hurdles, and provides a curated directory of project development resources.

  18. Automation of lidar-based hydrologic feature extraction workflows using GIS

    NASA Astrophysics Data System (ADS)

    Borlongan, Noel Jerome B.; de la Cruz, Roel M.; Olfindo, Nestor T.; Perez, Anjillyn Mae C.

    2016-10-01

    With the advent of LiDAR technology, higher-resolution datasets have become available for use in different remote sensing and GIS applications. One significant application of LiDAR datasets in the Philippines is in resource feature extraction. Feature extraction using LiDAR datasets requires complex and repetitive workflows that can take a lot of time for researchers to execute and supervise manually. The Development of the Philippine Hydrologic Dataset for Watersheds from LiDAR Surveys (PHD), a project under the Nationwide Detailed Resources Assessment Using LiDAR (Phil-LiDAR 2) program, created a set of scripts, the PHD Toolkit, to automate the processes and workflows necessary for hydrologic feature extraction, specifically Streams and Drainages, Irrigation Network, and Inland Wetlands, using LiDAR datasets. These scripts are written in Python and can be added to the ArcGIS® environment as a toolbox. The toolkit is currently being used as an aid for researchers in hydrologic feature extraction by simplifying the workflows, eliminating human errors when providing the inputs, and providing quick and easy-to-use tools for repetitive tasks. This paper discusses the actual implementation of the different workflows developed by Phil-LiDAR 2 Project 4 for Streams, Irrigation Network, and Inland Wetlands extraction.
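
    For readers unfamiliar with how Python scripts are packaged as an ArcGIS toolbox, the skeleton below follows the general structure of an Esri Python toolbox (.pyt). The tool name, parameter, and labels are hypothetical, not the PHD Toolkit's actual tools, and running it requires an ArcGIS installation that provides the arcpy package.

```python
# Minimal ArcGIS Python toolbox (.pyt) skeleton; tool and parameter names are hypothetical.
import arcpy

class Toolbox(object):
    def __init__(self):
        self.label = "Hydrologic Feature Extraction"
        self.alias = "hydro"
        self.tools = [ExtractStreams]          # tools exposed by this toolbox

class ExtractStreams(object):
    def __init__(self):
        self.label = "Extract Streams"
        self.description = "Derive stream features from a LiDAR-derived DEM."

    def getParameterInfo(self):
        dem = arcpy.Parameter(displayName="Input DEM", name="in_dem",
                              datatype="DERasterDataset",
                              parameterType="Required", direction="Input")
        return [dem]

    def execute(self, parameters, messages):
        messages.addMessage("Processing " + parameters[0].valueAsText)
        # ... call the actual extraction workflow here ...
```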

  19. Next stop for the CRISPR revolution: RNA-guided epigenetic regulators.

    PubMed

    Vora, Suhani; Tuttle, Marcelle; Cheng, Jenny; Church, George

    2016-09-01

    Clustered regularly interspaced short palindromic repeats (CRISPRs) and CRISPR-associated (Cas) proteins offer a breakthrough platform for cheap, programmable, and effective sequence-specific DNA targeting. The CRISPR-Cas system is naturally equipped for targeted DNA cutting through its native nuclease activity. As such, groups researching a broad spectrum of biological organisms have quickly adopted the technology with groundbreaking applications to genomic sequence editing in over 20 different species. However, the biological code of life is not only encoded in genetics but also in epigenetics as well. While genetic sequence editing is a powerful ability, we must also be able to edit and regulate transcriptional and epigenetic code. Taking inspiration from work on earlier sequence-specific targeting technologies such as zinc fingers (ZFs) and transcription activator-like effectors (TALEs), researchers quickly expanded the CRISPR-Cas toolbox to include transcriptional activation, repression, and epigenetic modification. In this review, we highlight advances that extend the CRISPR-Cas toolkit for transcriptional and epigenetic regulation, as well as best practice guidelines for these tools, and a perspective on future applications. © 2016 The Authors. The FEBS Journal published by John Wiley & Sons Ltd on behalf of Federation of European Biochemical Societies.

  20. PyGaze: an open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments.

    PubMed

    Dalmaijer, Edwin S; Mathôt, Sebastiaan; Van der Stigchel, Stefan

    2014-12-01

    The PyGaze toolbox is an open-source software package for Python, a high-level programming language. It is designed for creating eyetracking experiments in Python syntax with the least possible effort, and it offers programming ease and script readability without constraining functionality and flexibility. PyGaze can be used for visual and auditory stimulus presentation; for response collection via keyboard, mouse, joystick, and other external hardware; and for the online detection of eye movements using a custom algorithm. A wide range of eyetrackers of different brands (EyeLink, SMI, and Tobii systems) are supported. The novelty of PyGaze lies in providing an easy-to-use layer on top of the many different software libraries that are required for implementing eyetracking experiments. Essentially, PyGaze is a software bridge for eyetracking research.

  1. Development of a Web-Based Quality Dashboard Including a Toolbox to Improve Pain Management in Dutch Intensive Care.

    PubMed

    Roos-Blom, Marie-José; Gude, Wouter T; de Jonge, Evert; Spijkstra, Jan Jaap; van der Veer, Sabine N; Dongelmans, Dave A; de Keizer, Nicolette F

    2017-01-01

    Audit and feedback (A&F) is a common strategy to improve quality of care. Meta-analyses have indicated that A&F may be more effective in realizing desired change when baseline performance is low, it is delivered by a supervisor or colleague, it is provided frequently and in a timely manner, it is delivered in both verbal and written formats, and it includes specific targets and an action plan. However, there is little information to guide operationalization of these factors. Researchers have consequently called for A&F interventions featuring well-described and carefully justified components, with their theoretical rationale made explicit. This paper describes the rationale and development of a quality dashboard including an improvement toolbox for four previously developed pain indicators, guided by Control Theory.

  2. An image analysis toolbox for high-throughput C. elegans assays

    PubMed Central

    Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.

    2012-01-01

    We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656

  3. FISSA: A neuropil decontamination toolbox for calcium imaging signals.

    PubMed

    Keemink, Sander W; Lowe, Scott C; Pakan, Janelle M P; Dylda, Evelyn; van Rossum, Mark C W; Rochefort, Nathalie L

    2018-02-22

    In vivo calcium imaging has become a method of choice to image neuronal population activity throughout the nervous system. These experiments generate large sequences of images. Their analysis is computationally intensive and typically involves motion correction, image segmentation into regions of interest (ROIs), and extraction of fluorescence traces from each ROI. Out-of-focus fluorescence from surrounding neuropil and other cells can strongly contaminate the signal assigned to a given ROI. In this study, we introduce the FISSA toolbox (Fast Image Signal Separation Analysis) for neuropil decontamination. Given pre-defined ROIs, the FISSA toolbox automatically extracts the surrounding local neuropil and performs blind-source separation with non-negative matrix factorization. Using both simulated and in vivo data, we show that this toolbox performs similarly to or better than existing published methods. FISSA requires little RAM and allows for fast processing of large datasets even on a standard laptop. The FISSA toolbox is available in Python, with an option for MATLAB format outputs, and can easily be integrated into existing workflows. It is available from Github and the standard Python repositories.
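
    The core idea, separating an ROI's signal from contaminating neuropil by non-negative matrix factorization of the ROI trace together with traces from surrounding regions, can be sketched generically with scikit-learn (this is not the FISSA API; the data and mixing weights below are synthetic):

        # Generic NMF-based neuropil decontamination sketch on synthetic traces.
        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(42)
        T = 3000
        # Ground-truth non-negative sources: a sparse "cell" signal and a slow "neuropil" signal.
        cell = np.convolve(rng.random(T) < 0.01, np.exp(-np.arange(50) / 10.0))[:T]
        neuropil = 0.5 + 0.3 * np.sin(np.linspace(0, 20, T)) + 0.05 * rng.random(T)

        # Observed traces: the ROI mixes cell and neuropil; surrounding regions are mostly neuropil.
        roi_trace = 1.0 * cell + 0.7 * neuropil
        surround = np.stack([0.1 * cell + w * neuropil for w in (0.9, 1.0, 1.1, 0.8)])
        X = np.vstack([roi_trace, surround])          # shape: (n_regions, T), non-negative

        # Blind source separation with NMF: X ~= W @ H, rows of H are the recovered sources.
        model = NMF(n_components=2, init="nndsvda", max_iter=1000)
        W = model.fit_transform(X)
        H = model.components_

        # Take the source with the largest relative weight in the ROI row as the cell signal.
        cell_idx = np.argmax(W[0] / (W.sum(axis=0) + 1e-12))
        decontaminated = W[0, cell_idx] * H[cell_idx]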

  4. Electrophoretic separation techniques and their hyphenation to mass spectrometry in biological inorganic chemistry.

    PubMed

    Holtkamp, Hannah; Grabmann, Gerlinde; Hartinger, Christian G

    2016-04-01

    Electrophoretic methods have been widely applied in research on the roles of metal complexes in biological systems. In particular, CE, often hyphenated to a sensitive MS detector, has provided valuable information on the modes of action of metal-based pharmaceuticals, and more recently new methods have been added to the electrophoretic toolbox. The range of applications continues to expand as a result of enhanced CE-to-MS interfacing, with sensitivity often at the picomolar level, and evolved separation modes allowing for innovative sample analysis. This article is a follow-up to previous reviews about CE methods in metallodrug research (Electrophoresis, 2003, 24, 2023-2037; Electrophoresis, 2007, 28, 3436-3446; Electrophoresis, 2012, 33, 622-634), also providing a comprehensive overview of metal species studied by electrophoretic methods hyphenated to MS. It highlights the latest CE developments, takes a sneak peek into gel electrophoresis, traces biomolecule labeling, and focuses on the importance of early-stage drug development. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. PV_LIB Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-11

    While an organized source of reference information on PV performance modeling is certainly valuable, there is nothing to match the availability of actual examples of modeling algorithms being used in practice. To meet this need, Sandia has developed a PV performance modeling toolbox (PV_LIB) for Matlab. It contains a set of well-documented, open source functions and example scripts showing the functions being used in practical examples. This toolbox is meant to help make the multi-step process of modeling a PV system more transparent and provide the means for model users to validate and understand the models they use and/or develop. It is fully integrated into Matlab's help and documentation utilities. The PV_LIB Toolbox provides more than 30 functions that are sorted into four categories.
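
    As a rough illustration of the modeling steps PV_LIB covers (solar position, clear-sky irradiance, and transposition to the plane of array), the sketch below uses pvlib-python, the open-source Python descendant of Sandia's PV_LIB; the function names follow the pvlib-python documentation and the site coordinates and dates are placeholders:

        # Clear-sky plane-of-array irradiance sketch with pvlib-python (not the Matlab PV_LIB).
        import pandas as pd
        import pvlib

        times = pd.date_range("2021-06-21 05:00", "2021-06-21 20:00", freq="15min",
                              tz="US/Mountain")
        location = pvlib.location.Location(latitude=35.05, longitude=-106.54,
                                           tz="US/Mountain", altitude=1600)

        solpos = location.get_solarposition(times)        # solar zenith/azimuth angles
        clearsky = location.get_clearsky(times)           # GHI/DNI/DHI under clear sky

        # Transpose irradiance onto a fixed tilted plane (plane-of-array irradiance).
        poa = pvlib.irradiance.get_total_irradiance(
            surface_tilt=30, surface_azimuth=180,
            solar_zenith=solpos["apparent_zenith"], solar_azimuth=solpos["azimuth"],
            dni=clearsky["dni"], ghi=clearsky["ghi"], dhi=clearsky["dhi"])
        print(poa["poa_global"].max())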

  6. The GMT/MATLAB Toolbox

    NASA Astrophysics Data System (ADS)

    Wessel, Paul; Luis, Joaquim F.

    2017-02-01

    The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.

  7. Basic Radar Altimetry Toolbox and Radar Altimetry Tutorial: Tools for all Altimetry Users

    NASA Astrophysics Data System (ADS)

    Rosmorduc, Vinca; Benveniste, J.; Breebaart, L.; Bronner, E.; Dinardo, S.; Earith, D.; Lucas, B. M.; Maheu, C.; Niejmeier, S.; Picot, N.

    2013-09-01

    The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data, including data from the next mission to be launched, Saral. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings; more than 2000 people have downloaded it (as of the end of September 2012), among them many "newcomers" to altimetry as well as teachers and students. Users' feedback, developments in altimetry, and practice showed that new features could be added; some have been added and/or improved in versions 2 to 4, while others are under development or in discussion for the future. The Basic Radar Altimetry Toolbox is able to read most distributed radar altimetry data, including data from future missions like Saral and Jason-3, to perform some processing, data editing and statistics, and to visualize the results. It can be used at several levels and in several ways, including as an educational tool through its graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/.

  8. Health policy--why research it and how: health political science.

    PubMed

    de Leeuw, Evelyne; Clavier, Carole; Breton, Eric

    2014-09-23

    The establishment of policy is key to the implementation of actions for health. We review the nature of policy and the definition and directions of health policy. In doing so, we explicitly cast a health political science gaze on setting parameters for researching policy change for health. A brief overview of core theories of the policy process for health promotion is presented, and illustrated with empirical evidence. The key arguments are that (a) policy is not an intervention, but drives intervention development and implementation; (b) understanding policy processes and their pertinent theories is pivotal for the potential to influence policy change; (c) those theories and associated empirical work need to recognise the wicked, multi-level, and incremental nature of elements in the process; and, therefore, (d) the public health, health promotion, and education research toolbox should more explicitly embrace health political science insights. The rigorous application of insights from and theories of the policy process will enhance our understanding of not just how, but also why health policy is structured and implemented the way it is.

  9. Real time wind farm emulation using SimWindFarm toolbox

    NASA Astrophysics Data System (ADS)

    Topor, Marcel

    2016-06-01

    This paper presents a wind farm emulation solution using an open source Matlab/Simulink toolbox and the National Instruments cRIO platform. This work is based on the Aeolus SimWindFarm (SWF) toolbox models developed at Aalborg University, Denmark. Using the Matlab Simulink models developed in SWF, the modeling code can be exported to a real-time model using the NI Veristand model framework, and the resulting code is integrated as hardware-in-the-loop control on the NI 9068 platform.

  10. Nucleophile Promiscuity of Natural and Engineered Aldolases.

    PubMed

    Clapes, Pere; Hernández, Karel; Szekrenyi, Anna

    2018-04-12

    The asymmetric aldol addition reaction mediated by aldolases is recognized as a green and sustainable route to carbon-carbon bond formation. Research in this line has unveiled the unprecedented synthetic potential of aldolases toward diverse new chemical structures and novel product families, and even as a technology for industrial manufacturing processes. Despite that, aldolases have long been regarded as strictly selective catalysts, particularly for the nucleophilic substrate, limiting their broad applicability. In recent years, advances in screening technologies and metagenomics have uncovered novel C-C biocatalysts from superfamilies of widely known lyases. Moreover, protein engineering revealed the extraordinary malleability of different carboligases, offering a toolbox of biocatalysts active towards a large structural diversity of nucleophile substrates. In this paper, the nucleophile ambiguity of native and engineered aldolases is discussed with recent examples proving this novel concept. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. A review of estimation of distribution algorithms in bioinformatics

    PubMed Central

    Armañanzas, Rubén; Inza, Iñaki; Santana, Roberto; Saeys, Yvan; Flores, Jose Luis; Lozano, Jose Antonio; Peer, Yves Van de; Blanco, Rosa; Robles, Víctor; Bielza, Concha; Larrañaga, Pedro

    2008-01-01

    Evolutionary search algorithms have become an essential asset in the algorithmic toolbox for solving high-dimensional optimization problems across a broad range of bioinformatics applications. Genetic algorithms, the most well-known and representative evolutionary search technique, have accounted for the major part of such applications. Estimation of distribution algorithms (EDAs) offer a novel evolutionary paradigm that constitutes a natural and attractive alternative to genetic algorithms. They make use of a probabilistic model, learnt from the promising solutions, to guide the search process. In this paper, we set out a basic taxonomy of EDA techniques, underlining the nature and complexity of the probabilistic model of each EDA variant. We review a set of innovative works that make use of EDA techniques to solve challenging bioinformatics problems, emphasizing the EDA paradigm's potential for further research in this domain. PMID:18822112
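
    The core EDA loop, sample from a probabilistic model, select promising solutions, re-estimate the model, can be illustrated with a univariate marginal (UMDA-style) sketch on the OneMax toy problem; this is a generic illustration, not an example taken from the review:

        # UMDA-style estimation of distribution algorithm on OneMax (illustrative only).
        import numpy as np

        rng = np.random.default_rng(0)
        n_bits, pop_size, n_select, n_gens = 40, 200, 60, 50
        p = np.full(n_bits, 0.5)                      # probabilistic model: P(bit_i = 1)

        for gen in range(n_gens):
            pop = (rng.random((pop_size, n_bits)) < p).astype(int)   # sample from the model
            fitness = pop.sum(axis=1)                                # OneMax fitness
            elite = pop[np.argsort(fitness)[-n_select:]]             # select promising solutions
            p = elite.mean(axis=0)                                   # re-estimate the model
            p = np.clip(p, 0.02, 0.98)                               # keep some diversity

        print("best fitness:", fitness.max(), "of", n_bits)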

  12. RTE: A UNIX library with on-line documentation and sample programs for microwave radiative transfer calculations

    NASA Astrophysics Data System (ADS)

    Reynolds, J. C.; Schroeder, J. A.

    1993-03-01

    The FORTRAN library that the NOAA Wave Propagation Laboratory (WPL) developed to perform radiative transfer calculations for an upward-looking microwave radiometer is described. Although the theory and algorithms have been used for many years in WPL radiometer research, the Radiative Transfer Equation (RTE) software has combined them into a toolbox that is portable, readable, application independent, and easy to update. RTE has been optimized for the UNIX environment. However, the FORTRAN source code can be compiled on any platform that provides a Standard FORTRAN 77 compiler. RTE allows a user to do cloud modeling, calibrate radiometers, simulate hypothetical radiometer systems, develop retrieval techniques, and compute weighting functions. The radiative transfer model used is valid for channel frequencies below 1000 GHz in clear conditions and for frequencies below 100 GHz when clouds are present.

  13. Promoting Complex Systems Learning through the Use of Conceptual Representations in Hypermedia

    ERIC Educational Resources Information Center

    Liu, Lei; Hmelo-Silver, Cindy E.

    2009-01-01

    Studying complex systems is increasingly important in many science domains. Many features of complex systems make it difficult for students to develop deep understanding. Our previous research indicated that a function-centered conceptual representation is part of the disciplinary toolbox of biologists, suggesting that it is an appropriate…

  14. Foucault's Toolbox: Critical Insights for Education and Technology Researchers

    ERIC Educational Resources Information Center

    Hope, Andrew

    2015-01-01

    Despite having been a prolific academic, whose work exerts considerable cross-discipline influence, the ideas of Foucault are largely neglected in educational technology scholarship. Having provided an initial brief overview of the sparse use of Foucault's work in this sub-field, this paper then seeks to generate new understandings, arguing that…

  15. LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox

    ERIC Educational Resources Information Center

    Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich

    2016-01-01

    To find a balance between learning analytics research and individual privacy, learning analytics initiatives need to appropriately address ethical, privacy, and data protection issues. A range of general guidelines, model codes, and principles for handling ethical issues and for appropriate data and privacy protection are available, which may…

  16. 76 FR 163 - Agency Information Collection Activities: CBP Regulations Pertaining to Customs Brokers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-03

    ... broker exam would complete CBP Form 3124E, ``Application for Customs Broker License Exam''; or to apply... U.S.C. 1641. CBP Forms 3124 and 3124E may be found at http://www.cbp.gov/xp/cgov/toolbox/forms/ . Further information about the customs broker exam and how to apply for it may be found at http://www.cbp...

  17. Smoke Ready Toolbox for Wildfires

    EPA Pesticide Factsheets

    This site provides an online Smoke Ready Toolbox for Wildfires, which lists resources and tools that provide information on health impacts from smoke exposure, current fire conditions and forecasts and strategies to reduce exposure to smoke.

  18. Grid Integrated Distributed PV (GridPV) Version 2.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reno, Matthew J.; Coogan, Kyle

    2014-12-01

    This manual provides the documentation of the MATLAB toolbox of functions for using OpenDSS to simulate the impact of solar energy on the distribution system. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions. Each function in the toolbox is documented with the function use syntax, full description, function input list, function output list, example use, and example output.

  19. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing of standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) Improved graphical display of model results. 2) Improved error analysis and reporting. 3) Increase in the default maximum model mesh size from 301 to 501 nodes. 4) The ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  20. Neural Parallel Engine: A toolbox for massively parallel neural signal processing.

    PubMed

    Tam, Wing-Kin; Yang, Zhi

    2018-05-01

    Large-scale neural recordings provide detailed information on neuronal activities and can help elicit the underlying neural mechanisms of the brain. However, the computational burden is also formidable when we try to process the huge data stream generated by such recordings. In this study, we report the development of Neural Parallel Engine (NPE), a toolbox for massively parallel neural signal processing on graphical processing units (GPUs). It offers a selection of the most commonly used routines in neural signal processing such as spike detection and spike sorting, including advanced algorithms such as exponential-component-power-component (EC-PC) spike detection and binary pursuit spike sorting. We also propose a new method for detecting peaks in parallel through a parallel compact operation. Our toolbox is able to offer a 5× to 110× speedup compared with its CPU counterparts depending on the algorithms. A user-friendly MATLAB interface is provided to allow easy integration of the toolbox into existing workflows. Previous efforts on GPU neural signal processing only focus on a few rudimentary algorithms, are not well-optimized and often do not provide a user-friendly programming interface to fit into existing workflows. There is a strong need for a comprehensive toolbox for massively parallel neural signal processing. A new toolbox for massively parallel neural signal processing has been created. It can offer significant speedup in processing signals from large-scale recordings up to thousands of channels. Copyright © 2018 Elsevier B.V. All rights reserved.
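
    The flavor of data-parallel peak detection can be conveyed with a vectorized CPU sketch: an element-wise threshold test and a local-maximum test (each sample evaluated independently, as GPU threads would), followed by a compaction of the surviving indices. This is a generic illustration of the idea, not NPE's implementation:

        # Vectorized threshold-and-local-maximum peak detection with index compaction.
        import numpy as np

        def detect_peaks(signal, thresh):
            x = np.asarray(signal, dtype=float)
            # Element-wise tests evaluate independently per sample (ideal for GPU threads).
            above = x > thresh
            local_max = np.r_[False, (x[1:-1] > x[:-2]) & (x[1:-1] >= x[2:]), False]
            mask = above & local_max
            # Stream compaction: gather indices of samples that passed both tests.
            return np.flatnonzero(mask)

        rng = np.random.default_rng(1)
        trace = rng.normal(0, 1, 30000)
        trace[rng.integers(0, 30000, 40)] += 8.0          # inject artificial spikes
        peaks = detect_peaks(trace, thresh=5.0)
        print(len(peaks), "peaks detected")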

  1. Rangeland Brush Estimation Toolbox (RaBET): An Approach for Evaluating Brush Management Conservation Efforts in Western Grazing Lands

    NASA Astrophysics Data System (ADS)

    Holifield Collins, C.; Kautz, M. A.; Skirvin, S. M.; Metz, L. J.

    2016-12-01

    There are over 180 million hectares of rangelands and grazed forests in the central and western United States. Due to the loss of perennial grasses and subsequent increased runoff and erosion that can degrade the system, woody cover species cannot be allowed to proliferate unchecked. The USDA-Natural Resources Conservation Service (NRCS) has allocated extensive resources to employ brush management (removal) as a conservation practice to control woody species encroachment. The Rangeland-Conservation Effects Assessment Project (CEAP) has been tasked with determining how effective the practice has been; however, land managers lack a cost-effective means to conduct these assessments at the necessary scale. An ArcGIS toolbox for generating large-scale, Landsat-based, spatial maps of woody cover on grazing lands in the western United States was developed through a collaboration with NRCS Rangeland-CEAP. The toolbox contains two main components of operation, image generation and temporal analysis, and utilizes simple interfaces requiring minimum user inputs. The image generation tool utilizes geographically specific algorithms developed from combining moderate-resolution (30-m) Landsat imagery and high-resolution (1-m) National Agricultural Imagery Program (NAIP) aerial photography to produce the woody cover scenes at the Major Land Resource Area (MLRA) scale. The temporal analysis tool can be used on these scenes to assess treatment effectiveness and monitor woody cover reemergence. RaBET provides rangeland managers an operational, inexpensive decision support tool to aid in applying brush removal treatments and assessing their effectiveness.

  2. Drinking Water Cyanotoxin Risk Communication Toolbox

    EPA Pesticide Factsheets

    The drinking water cyanotoxin risk communication toolbox is a ready-to-use, “one-stop-shop” to support public water systems, states, and local governments in developing, as they deem appropriate, their own risk communication materials.

  3. EPA ExpoBox Toolbox Search

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant assessment databases.

  4. 40 CFR 141.715 - Microbial toolbox options for meeting Cryptosporidium treatment requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... criteria are in § 141.716(b). Pre Filtration Toolbox Options (3) Presedimentation basin with coagulation 0... separate granular media filtration stage if treatment train includes coagulation prior to first filter...

  5. 40 CFR 141.715 - Microbial toolbox options for meeting Cryptosporidium treatment requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... criteria are in § 141.716(b). Pre Filtration Toolbox Options (3) Presedimentation basin with coagulation 0... separate granular media filtration stage if treatment train includes coagulation prior to first filter...

  6. Air Sensor Toolbox for Citizen Scientists

    EPA Pesticide Factsheets

    EPA’s Air Sensor Toolbox provides information and guidance on new low-cost compact technologies for measuring air quality. It provides information to help citizens more effectively and accurately collect air quality data in their community.

  7. Air Sensor Toolbox: Resources and Funding

    EPA Pesticide Factsheets

    EPA’s Air Sensor Toolbox provides information and guidance on new low-cost compact technologies for measuring air quality. It provides information to help citizens more effectively and accurately collect air quality data in their community.

  8. New tools for linking human and earth system models: The Toolbox for Human-Earth System Interaction & Scaling (THESIS)

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.; Kauffman, B.; Lawrence, P.

    2016-12-01

    Integrated analysis of questions regarding land, water, and energy resources often requires integration of models of different types. One type of integration is between human and earth system models, since both societal and physical processes influence these resources. For example, human processes such as changes in population, economic conditions, and policies govern the demand for land, water and energy, while the interactions of these resources with physical systems determine their availability and environmental consequences. We have begun to develop and use a toolkit for linking human and earth system models called the Toolbox for Human-Earth System Integration and Scaling (THESIS). THESIS consists of models and software tools to translate, scale, and synthesize information from and between human system models and earth system models (ESMs), with initial application to linking the NCAR integrated assessment model, iPETS, with the NCAR earth system model, CESM. Initial development is focused on urban areas and agriculture, sectors that are explicitly represented in both CESM and iPETS. Tools are being made available to the community as they are completed (see https://www2.cgd.ucar.edu/sections/tss/iam/THESIS_tools). We discuss four general types of functions that THESIS tools serve (Spatial Distribution, Spatial Properties, Consistency, and Outcome Evaluation). Tools are designed to be modular and can be combined in order to carry out more complex analyses. We illustrate their application both to the exposure of population to climate extremes and to the evaluation of climate impacts on the agriculture sector. For example, projecting exposure to climate extremes involves use of THESIS tools for spatial population, spatial urban land cover, the characteristics of both, and a tool to bring urban climate information together with spatial population information. Development of THESIS tools is continuing and open to the research community.

  9. ObsPy: A Python toolbox for seismology - Sustainability, New Features, and Applications

    NASA Astrophysics Data System (ADS)

    Krischer, L.; Megies, T.; Sales de Andrade, E.; Barsch, R.; MacCarthy, J.

    2016-12-01

    ObsPy (https://www.obspy.org) is a community-driven, open-source project dedicated to offering a bridge for seismology into the scientific Python ecosystem. Amongst other things, it provides read and write support for essentially every commonly used data format in seismology with a unified interface, covering waveform data as well as station and event meta information; a signal processing toolbox tuned to the specific needs of seismologists; integrated access to the largest data centers, web services, and databases; and wrappers around third-party codes like libmseed and evalresp. Using ObsPy enables users to take advantage of the vast scientific ecosystem that has developed around Python. In contrast to many other programming languages and tools, Python is simple enough to enable an exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology, where research code often must be translated to stable and production-ready environments, especially in the age of big data. ObsPy has seen constant development for more than six years and enjoys a large rate of adoption in the seismological community with thousands of users. Successful applications include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions, to name a few examples. Additionally, it sparked the development of several more specialized packages, slowly building a modern seismological ecosystem around it. We will present a short overview of the capabilities of ObsPy and point out several representative use cases and more specialized software built around ObsPy. Additionally, we will discuss new and upcoming features, as well as the sustainability of open-source scientific software.
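
    A minimal sketch of a typical ObsPy workflow, fetching waveforms from an FDSN web service and applying the signal-processing toolbox, is shown below; the network/station codes and dates are placeholders:

        # Fetch, process, and plot one hour of waveform data with ObsPy.
        from obspy import UTCDateTime
        from obspy.clients.fdsn import Client

        client = Client("IRIS")
        t0 = UTCDateTime("2016-01-01T00:00:00")

        # Request one hour of vertical-component data from an FDSN web service.
        st = client.get_waveforms(network="IU", station="ANMO", location="00",
                                  channel="BHZ", starttime=t0, endtime=t0 + 3600)

        st.detrend("linear")                       # signal-processing toolbox methods
        st.filter("bandpass", freqmin=0.1, freqmax=1.0)
        print(st)
        st.plot()                                  # quick-look visualization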

  10. PYCHEM: a multivariate analysis package for python.

    PubMed

    Jarvis, Roger M; Broadhurst, David; Johnson, Helen; O'Boyle, Noel M; Goodacre, Royston

    2006-10-15

    We have implemented a multivariate statistical analysis toolbox, with an optional standalone graphical user interface (GUI), using the Python scripting language. This is a free and open source project that addresses the need for a multivariate analysis toolbox in Python. Although the functionality provided does not cover the full range of multivariate tools that are available, it has a broad complement of methods that are widely used in the biological sciences. In contrast to tools like MATLAB, PyChem 2.0.0 is easily accessible and free, allows for rapid extension using a range of Python modules and is part of the growing amount of complementary and interoperable scientific software in Python based upon SciPy. One of the attractions of PyChem is that it is an open source project and so there is an opportunity, through collaboration, to increase the scope of the software and to continually evolve a user-friendly platform that has applicability across a wide range of analytical and post-genomic disciplines. http://sourceforge.net/projects/pychem

  11. Toolbox for the design of LiNbO3-based passive and active integrated quantum circuits

    NASA Astrophysics Data System (ADS)

    Sharapova, P. R.; Luo, K. H.; Herrmann, H.; Reichelt, M.; Meier, T.; Silberhorn, C.

    2017-12-01

    We present and discuss perspectives of current developments on advanced quantum optical circuits monolithically integrated in the lithium niobate platform. A set of basic components comprising photon pair sources based on parametric down conversion (PDC), passive routing elements and active electro-optically controllable switches and polarisation converters are building blocks of a toolbox which is the basis for a broad range of diverse quantum circuits. We review the state-of-the-art of these components and provide models that properly describe their performance in quantum circuits. As an example for applications of these models we discuss design issues for a circuit providing on-chip two-photon interference. The circuit comprises a PDC section for photon pair generation followed by an actively controllable modified Mach-Zehnder structure for observing Hong-Ou-Mandel interference. The performance of such a chip is simulated theoretically by taking even imperfections of the properties of the individual components into account.
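
    For context, the on-chip two-photon interference mentioned above is the Hong-Ou-Mandel effect, and the standard textbook beam-splitter calculation (not specific to this paper) shows why coincidences vanish for indistinguishable photons: a balanced beam splitter sends each input photon into an equal superposition of the two outputs, with a sign flip on one path,

        \hat a^{\dagger} \;\to\; \tfrac{1}{\sqrt{2}}\bigl(\hat c^{\dagger}+\hat d^{\dagger}\bigr), \qquad
        \hat b^{\dagger} \;\to\; \tfrac{1}{\sqrt{2}}\bigl(\hat c^{\dagger}-\hat d^{\dagger}\bigr), \qquad
        \hat a^{\dagger}\hat b^{\dagger}\lvert 0\rangle \;\to\; \tfrac{1}{2}\bigl(\hat c^{\dagger\,2}-\hat d^{\dagger\,2}\bigr)\lvert 0\rangle ,

    so the cross terms cancel, both photons exit through the same output port, and the coincidence rate dips to zero for perfectly indistinguishable photons; residual distinguishability of the PDC pair reduces the visibility of this dip.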

  12. Multivariate Copula Analysis Toolbox (MvCAT): Describing dependence and underlying uncertainty using a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Sadegh, Mojtaba; Ragno, Elisa; AghaKouchak, Amir

    2017-06-01

    We present a newly developed Multivariate Copula Analysis Toolbox (MvCAT) which includes a wide range of copula families with different levels of complexity. MvCAT employs a Bayesian framework with a residual-based Gaussian likelihood function for inferring copula parameters and estimating the underlying uncertainties. The contribution of this paper is threefold: (a) providing a Bayesian framework to approximate the predictive uncertainties of fitted copulas, (b) introducing a hybrid-evolution Markov Chain Monte Carlo (MCMC) approach designed for numerical estimation of the posterior distribution of copula parameters, and (c) enabling the community to explore a wide range of copulas and evaluate them relative to the fitting uncertainties. We show that the commonly used local optimization methods for copula parameter estimation often get trapped in local minima. The proposed method, however, addresses this limitation and improves describing the dependence structure. MvCAT also enables evaluation of uncertainties relative to the length of record, which is fundamental to a wide range of applications such as multivariate frequency analysis.
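
    For orientation, a much simpler (non-Bayesian) way to fit a one-parameter copula is to invert Kendall's tau using the standard closed-form relations for the Clayton and Gumbel families; the sketch below uses SciPy on synthetic data and is only a point-estimate alternative to MvCAT's MCMC-based uncertainty estimation:

        # Copula parameter estimation by inverting Kendall's tau (moment-style sketch).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Synthetic positively dependent data (e.g., two correlated hydrologic variables).
        z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=500)
        x, y = np.exp(z[:, 0]), np.exp(z[:, 1])

        tau, _ = stats.kendalltau(x, y)            # rank-based dependence measure
        theta_clayton = 2 * tau / (1 - tau)        # Clayton:  tau = theta / (theta + 2)
        theta_gumbel = 1 / (1 - tau)               # Gumbel:   tau = 1 - 1/theta

        print(f"Kendall tau = {tau:.2f}, Clayton theta = {theta_clayton:.2f}, "
              f"Gumbel theta = {theta_gumbel:.2f}")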

  13. An Open-Source Toolbox for Surrogate Modeling of Joint Contact Mechanics

    PubMed Central

    Eskinazi, Ilan

    2016-01-01

    Goal: Incorporation of elastic joint contact models into simulations of human movement could facilitate studying the interactions between muscles, ligaments, and bones. Unfortunately, elastic joint contact models are often too expensive computationally to be used within iterative simulation frameworks. This limitation can be overcome by using fast and accurate surrogate contact models that fit or interpolate input-output data sampled from existing elastic contact models. However, construction of surrogate contact models remains an arduous task. The aim of this paper is to introduce an open-source program called Surrogate Contact Modeling Toolbox (SCMT) that facilitates surrogate contact model creation, evaluation, and use. Methods: SCMT interacts with the third party software FEBio to perform elastic contact analyses of finite element models and uses Matlab to train neural networks that fit the input-output contact data. SCMT features sample point generation for multiple domains, automated sampling, sample point filtering, and surrogate model training and testing. Results: An overview of the software is presented along with two example applications. The first example demonstrates creation of surrogate contact models of artificial tibiofemoral and patellofemoral joints and evaluates their computational speed and accuracy, while the second demonstrates the use of surrogate contact models in a forward dynamic simulation of an open-chain leg extension-flexion motion. Conclusion: SCMT facilitates the creation of computationally fast and accurate surrogate contact models. Additionally, it serves as a bridge between FEBio and OpenSim musculoskeletal modeling software. Significance: Researchers may now create and deploy surrogate models of elastic joint contact with minimal effort. PMID:26186761
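
    The surrogate-modeling idea itself can be sketched independently of FEBio and Matlab: sample an expensive contact model over its input domain, fit a neural network to the input-output pairs, and evaluate the cheap surrogate inside the simulation loop. In the sketch below the "expensive model" is a stand-in analytic function rather than a finite-element analysis, and scikit-learn replaces the Matlab neural network training used by SCMT:

        # Neural-network surrogate fitted to samples from a stand-in contact model.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        def expensive_contact_model(pose):
            # Stand-in for a finite-element contact analysis: pose -> contact force.
            flexion, translation = pose[:, 0], pose[:, 1]
            return 1200.0 * np.exp(-((flexion - 0.4) ** 2) / 0.05) * (1 + 0.5 * translation)

        X = rng.uniform([0.0, -1.0], [1.2, 1.0], size=(2000, 2))   # sampled poses
        y = expensive_contact_model(X)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                                 random_state=0).fit(X_tr, y_tr)
        print("surrogate R^2 on held-out samples:", round(surrogate.score(X_te, y_te), 3))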

  14. SinCHet: a MATLAB toolbox for single cell heterogeneity analysis in cancer.

    PubMed

    Li, Jiannong; Smalley, Inna; Schell, Michael J; Smalley, Keiran S M; Chen, Y Ann

    2017-09-15

    Single-cell technologies allow characterization of transcriptomes and epigenomes for individual cells under different conditions and provide unprecedented resolution for researchers to investigate cellular heterogeneity in cancer. The SinCHet (Single Cell Heterogeneity) toolbox is developed in MATLAB and has a graphical user interface (GUI) for visualization and user interaction. It analyzes both continuous (e.g. mRNA expression) and binary omics data (e.g. discretized methylation data). The toolbox not only quantifies cellular heterogeneity using the Shannon Profile (SP) at different clonal resolutions but also detects heterogeneity differences between two populations using a D statistic, defined as the area under the Profile of Shannon Difference (PSD). This flexible tool provides a default clonal resolution using the change point of the PSD detected by a multivariate adaptive regression splines model; it also allows user-defined clonal resolutions for further investigation. This tool provides insights into emerging or disappearing clones between conditions, and enables the prioritization of biomarkers for follow-up experiments based on heterogeneity or marker differences between and/or within cell populations. The SinCHet software is freely available for non-profit academic use. The source code, example datasets, and the compiled package are available at http://labpages2.moffitt.org/chen/software/ . ann.chen@moffitt.org. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
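
    The Shannon Profile idea, computing a diversity index of clone proportions as the clonal resolution (number of clusters) increases, can be sketched generically with SciPy hierarchical clustering on synthetic data; this is an illustration of the concept, not the SinCHet code:

        # Shannon index of clone proportions at increasing clonal resolutions.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        def shannon_index(labels):
            _, counts = np.unique(labels, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        rng = np.random.default_rng(0)
        # Synthetic expression matrix: 300 cells x 50 genes, three underlying clones.
        centers = rng.normal(0, 3, size=(3, 50))
        cells = np.vstack([centers[i] + rng.normal(0, 1, (100, 50)) for i in range(3)])

        Z = linkage(cells, method="ward")
        profile = {k: shannon_index(fcluster(Z, t=k, criterion="maxclust"))
                   for k in range(1, 11)}          # Shannon index vs. number of clones
        print(profile)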

  15. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

    Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707

  16. ObsPy - A Python Toolbox for Seismology - and Applications

    NASA Astrophysics Data System (ADS)

    Krischer, L.; Megies, T.; Barsch, R.; MacCarthy, J.; Lecocq, T.; Koymans, M. R.; Carothers, L.; Eulenfeld, T.; Reyes, C. G.; Falco, N.; Sales de Andrade, E.

    2017-12-01

    Recent years witnessed the evolution of Python's ecosystem into one of the most powerful and productive scientific environments across disciplines. ObsPy (https://www.obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It is a Python toolbox offering: read and write support for essentially every commonly used data format in seismology with a unified interface and automatic format detection, including waveform data (MiniSEED, SAC, SEG-Y, Reftek, …) as well as station (SEED, StationXML, SC3ML, …) and event meta information (QuakeML, ZMAP, …); integrated access to the largest data centers, web services, and real-time data streams (FDSNWS, ArcLink, SeedLink, ...); a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality like travel time calculations with the TauP method, geodetic functions, and data visualizations. ObsPy has been in constant development for more than eight years and is developed and used by scientists around the world, with successful applications in all branches of seismology. Additionally, it nowadays serves as the foundation for a large number of more specialized packages. The newest features include: full interoperability of SEED and StationXML/Inventory objects; access to the Nominal Response Library (NRL) for easy and quick creation of station metadata from scratch; support for the IRIS Federated Catalog Service; improved performance of the EarthWorm client; several improvements to the MiniSEED read/write module; improved plotting capabilities for PPSD (spectrograms, PSD of discrete frequencies over time, ...); and support for reading ArcLink Inventory XML, reading the Reftek data format, writing SeisComp3 ML (SC3ML), and writing the StationTXT format. This presentation will give a short overview of the capabilities of ObsPy, point out several representative or new use cases, and show-case some projects that are based on ObsPy, e.g. seismo-live.org, Seedlink-plotter, MSNoise, and others.

  17. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet.

  18. A toolbox for computing pebble shape and roundness indexes: experimental tests and recommendations for future applications.

    NASA Astrophysics Data System (ADS)

    Cassel, M.; Piegay, H.; Lave, J.

    2016-12-01

    Pebble rounding caused by attrition is, besides chemical dissolution, breakage, and grain size segregation, one of the key processes controlling bedload downstream fining in rivers. Downstream change in pebble geometry has been a subject of consideration since Aristotle (Krynine, 1960), and its measurement has represented a challenge since the end of the 19th century, leading to a long-standing debate (Blott and Pye, 2008). A toolbox developed by Roussillon et al. (2009) automatically computes several shape and roundness indexes from images of the 2D projection plane of pebbles placed on a one meter square red board. In order to promote the tool for future applications, we tested the effects of pebble position on the board, of picture resolution, and of image treatment on three shape and roundness indexes. We also compared the downstream patterns of these indexes for two pebble samples of the same lithology collected on the Progo River (Indonesia), based on (i) field observations and (ii) experimentation. Shape and roundness were measured (i) at 8 sites distributed over a distance of 36 km along the river, and (ii) ten times on a set of particles collected at the Progo spring and transported in an annular flume over the same distance. This travel distance was monitored using a passive low-frequency RFID system. Results show that pebble position does not have a significant effect on shape and roundness indexes, but these indexes are sensitive to picture resolution and treatment, so a clear protocol must be followed to avoid any observer bias. Downstream changes in roundness indexes are very similar in field and experimental conditions, while the abrasion environments are distinct. Discontinuities observed in the downstream river pattern but not in the experimental one indicate that changes in Progo River pebble roundness are probably caused by sediment supplied from tributaries or bank erosion. These results highlight the toolbox's potential for diagnosing river system functioning.

  19. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    PubMed

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train and local field potential analysis, and behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided by NeoAnalysis, users can easily obtain publication-quality figures without writing complex code. NeoAnalysis is a powerful and valuable toolbox for users doing electrophysiological experiments.

  20. Ironbound Community Citizen Science Toolbox Fact Sheet

    EPA Pesticide Factsheets

    EPA is partnering with Newark’s Ironbound Community Corporation (ICC) to design, develop, and pilot a Citizen Science Toolbox that will enable communities to collect their own environmental data and increase their ability to understand local conditions.

  1. Enhancing Adaptability of U.S. Military Forces. Part A: Main Report

    DTIC Science & Technology

    2011-01-01

    virtual ... Research and Engineering, Rapid Capability Fielding Toolbox Study, stated that virtual environment tools can ... the "approved" basis for requirements is divorced from reality, and setting system ...

  2. Tinnitus Multimodal Imaging

    DTIC Science & Technology

    2016-12-01

    Images were segmented into gray and white matter images and spatially normalized to the MNI template (3 mm isotropic voxels) using the DARTEL toolbox in ...

  3. RTI Strategies That Work in the 3-6 Classroom

    ERIC Educational Resources Information Center

    Johnson, Eli; Karns, Michelle

    2012-01-01

    This is a must-have resource for educators committed to meeting the needs of their struggling students in Grades 3-6. Teachers get a whole toolbox filled with research-based, easy to implement RTI interventions that really work! Get strategies in five core areas--plus correlations to the Common Core State Standards and effective scaffolding tips…

  4. Governmentality as a Genealogical Toolbox in Historical Analysis

    ERIC Educational Resources Information Center

    Andersson, Janicke

    2014-01-01

    The aim of this article is to show how governmentality may be used to analyze historical events and discourses, and how this historical analysis can be used as a perspective to problematize contemporary discourses. The example used in this article is from my research on life-extension handbooks published in Sweden 1700-1930, and by this I stress…

  5. Teaching at Its Best: A Research-Based Resource for College Instructors. Second Edition.

    ERIC Educational Resources Information Center

    Nilson, Linda B.

    This handbook is meant to be a toolbox, a compilation of hundreds of practical teaching techniques, formats, classroom activities, and exercises. This edition is revised and expanded to cover more about topics relevant to today's classroom, such as technology and the Internet. The 31 chapters are grouped into these sections: (1) "Sound…

  6. New(ish) tools in the toolbox: Using in vitro bioassays for cumulative assessment of steroid hormone receptor active compounds in environmental samples

    EPA Science Inventory

    In vitro assays are currently a high priority tool within USEPA for screening chemicals and samples for biological activity. The work included in this abstract describes the usage and expertise in our research group for using in vitro assays to screen environmental samples for s...

  7. Evaluating a 2D image-based computerized approach for measuring riverine pebble roundness

    NASA Astrophysics Data System (ADS)

    Cassel, Mathieu; Piégay, Hervé; Lavé, Jérôme; Vaudor, Lise; Hadmoko Sri, Danang; Wibiwo Budi, Sandy; Lavigne, Franck

    2018-06-01

    The geometrical characteristics of pebbles are important features for studying transport pathways, sedimentary history, depositional environments, and abrasion processes, or for targeting sediment sources. Both the shape and roundness of pebbles can be described by a still growing number of metrics in 2D and 3D or by visual charts. Despite new developments, existing tools remain proprietary and no pebble roundness toolbox has been widely available within the scientific community. The toolbox developed by Roussillon et al. (2009) automatically computes the size, shape and roundness indexes of pebbles from their 2D maximal projection planes. The toolbox operates on 2D pictures, taken with a digital camera, of pebbles placed on a one square meter red board, allowing data to be collected quickly and efficiently at a large scale. Now that the toolbox is freely available for download,

  8. National Water-Quality Assessment (NAWQA) area-characterization toolbox

    USGS Publications Warehouse

    Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.

    2010-01-01

    This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells. These tools are built on top of standard functionality included in ArcGIS Desktop running at the ArcInfo license level. Most of the tools require a license for the ArcGIS Spatial Analyst extension. ArcGIS is a commercial GIS software system produced by ESRI, Inc. (http://www.esri.com). The NAWQA Area-Characterization Toolbox is not supported by ESRI, Inc. or its technical support staff. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

  9. Sharing tools and best practice in Global Sensitivity Analysis within academia and with industry

    NASA Astrophysics Data System (ADS)

    Wagener, T.; Pianosi, F.; Noacco, V.; Sarrazin, F.

    2017-12-01

    We have spent years trying to improve the use of global sensitivity analysis (GSA) in earth and environmental modelling. Our efforts included (1) the development of tools that provide easy access to widely used GSA methods, (2) the definition of workflows so that best practice is shared in an accessible way, and (3) the development of algorithms to close gaps in available GSA methods (such as moment independent strategies) and to make GSA applications more robust (such as convergence criteria). These elements have been combined in our GSA Toolbox, called SAFE (www.safetoolbox.info), which has up to now been adopted by over 1000 (largely) academic users worldwide. However, despite growing uptake in academic circles and across a wide range of application areas, transfer to industry applications has been difficult. Initial market research regarding opportunities and barriers for uptake revealed a large potential market, but also highlighted a significant lack of knowledge regarding state-of-the-art methods and their potential value for end-users. We will present examples and discuss our experience so far in trying to overcome these problems and move beyond academia in distributing GSA tools and expertise.
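
    As a concrete example of the kind of variance-based GSA workflow such toolboxes support, the sketch below computes first-order and total Sobol indices for the Ishigami benchmark function using the SALib Python package (shown here instead of SAFE itself; the sample/analyze API is assumed from SALib's documentation):

        # Variance-based global sensitivity analysis of the Ishigami function with SALib.
        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {
            "num_vars": 3,
            "names": ["x1", "x2", "x3"],
            "bounds": [[-np.pi, np.pi]] * 3,
        }

        X = saltelli.sample(problem, 1024)                 # quasi-random input samples
        Y = np.sin(X[:, 0]) + 7 * np.sin(X[:, 1]) ** 2 \
            + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0])         # Ishigami test function
        Si = sobol.analyze(problem, Y)                     # first-order and total indices
        print(Si["S1"], Si["ST"])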

  10. Traffic analysis toolbox volume XIII : integrated corridor management analysis, modeling, and simulation guide

    DOT National Transportation Integrated Search

    2017-02-01

    As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...

  11. Traffic analysis toolbox volume XIII : integrated corridor management analysis, modeling, and simulation guide.

    DOT National Transportation Integrated Search

    2017-02-01

    As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...

  12. Towards a Comprehensive Catalog of Volcanic Seismicity

    NASA Astrophysics Data System (ADS)

    Thompson, G.

    2014-12-01

    Catalogs of earthquakes located using differential travel-time techniques are a core product of volcano observatories, and while vital, they represent an incomplete perspective of volcanic seismicity. Many (often most) earthquakes are too small to locate accurately, and are omitted from available catalogs. Low frequency events, tremor and signals related to rockfalls, pyroclastic flows and lahars are not systematically catalogued, and yet from a hazard management perspective are exceedingly important. Because STA/LTA detection schemes break down in the presence of high amplitude tremor, swarms or dome collapses, catalogs may suggest low seismicity when seismicity peaks. We propose to develop a workflow and underlying software toolbox that can be applied to near-real-time and offline waveform data to produce comprehensive catalogs of volcanic seismicity. Existing tools to detect and locate phaseless signals will be adapted to fit within this framework. For this proof of concept the toolbox will be developed in MATLAB, extending the existing GISMO toolbox (an object-oriented MATLAB toolbox for seismic data analysis). Existing database schemas such as the CSS 3.0 will need to be extended to describe this wider range of volcano-seismic signals. WOVOdat may already incorporate many of the additional tables needed. Thus our framework may act as an interface between volcano observatories (or campaign-style research projects) and WOVOdat. We aim to take the further step of reducing volcano-seismic catalogs to sets of continuous metrics that are useful for recognizing data trends, and for feeding alarm systems and forecasting techniques. Previous experience has shown that frequency index, peak frequency, mean frequency, mean event rate, median event rate, and cumulative magnitude (or energy) are potentially useful metrics to generate for all catalogs at a 1-minute sample rate (directly comparable with RSAM and similar metrics derived from continuous data). Our framework includes tools to plot these metrics in a consistent manner. We work with data from unrest at Redoubt volcano and Soufriere Hills volcano to develop our framework.
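
    One of the continuous metrics listed above, the frequency index, is commonly defined as the base-10 logarithm of the ratio of mean spectral amplitude in an upper frequency band to that in a lower band; the sketch below uses that common definition with illustrative band limits (the band limits and synthetic signal are assumptions, not values from the abstract):

        # Frequency index (FI) of a seismic trace: log10 ratio of high- to low-band amplitude.
        import numpy as np

        def frequency_index(trace, fs, low=(1.0, 2.0), high=(10.0, 20.0)):
            spec = np.abs(np.fft.rfft(trace))
            freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
            a_low = spec[(freqs >= low[0]) & (freqs <= low[1])].mean()
            a_high = spec[(freqs >= high[0]) & (freqs <= high[1])].mean()
            return np.log10(a_high / a_low)

        fs = 100.0
        t = np.arange(0, 60, 1 / fs)
        lp_event = np.sin(2 * np.pi * 1.5 * t) * np.exp(-t / 20)      # synthetic low-frequency event
        print("FI of synthetic LP event:", round(frequency_index(lp_event, fs), 2))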

  13. What precision-protein-tuning and nano-resolved single molecule sciences can do for each other.

    PubMed

    Milles, Sigrid; Lemke, Edward A

    2013-01-01

    While innovations in modern microscopy, spectroscopy, and nanoscopy techniques have made single molecule observation a standard in many laboratories, the actual design of meaningful fluorescence reporter systems now hinders major scientific breakthroughs. Even though the field of chemical biology is supercharging the fluorescence toolbox, surprisingly few strategies exist that make the transition from model systems to biologically relevant applications. At the same time, the number of microscopy techniques is growing dramatically. We explain our view on how the impact of modern technologies is influenced not only by further hard- and software developments, but also by the availability and suitability of protein-engineering tools. We identify how the largely independent research fields of chemical biology and fluorescence nanoscopy can influence each other to synergistically drive future technology that can visualize the localization, structure, and dynamics of molecular function without constraints. Copyright © 2013 WILEY Periodicals, Inc.

  14. Recent Progress in Genome Editing Approaches for Inherited Cardiovascular Diseases.

    PubMed

    Kaur, Balpreet; Perea-Gil, Isaac; Karakikes, Ioannis

    2018-06-02

    This review describes the recent progress in nuclease-based therapeutic applications for inherited heart diseases in vitro, highlights the development of the most recent genome editing technologies and discusses the associated challenges for clinical translation. Inherited cardiovascular disorders are passed from generation to generation. Over the past decade, considerable progress has been made in understanding the genetic basis of inherited heart diseases. The timely emergence of genome editing technologies using engineered programmable nucleases has revolutionized the basic research of inherited cardiovascular diseases and holds great promise for the development of targeted therapies. The genome editing toolbox is rapidly expanding, and new tools have been recently added that significantly expand the capabilities of engineered nucleases. Newer classes of versatile engineered nucleases, such as the "base editors," have been recently developed, offering the potential for efficient and precise therapeutic manipulation of the human genome.

  15. Polymeric nanoparticles: A study on the preparation variables and characterization methods.

    PubMed

    Crucho, Carina I C; Barros, Maria Teresa

    2017-11-01

    Since the emergence of Nanotechnology in the past decades, the development and design of nanomaterials has become an important field of research. An emerging component in this field is nanomedicine, wherein nanoscale materials are being developed for use as imaging agents or for drug delivery applications. Much work is currently focused on the preparation of well-defined nanomaterials in terms of size and shape. These factors play a significant role in nanomaterial behavior in vivo. In this context, this review focuses on the toolbox of available methods for the preparation of polymeric nanoparticles. We highlight some recent examples from the literature that demonstrate the influence of the preparation method on the physicochemical characteristics of the nanoparticles. Additionally, in the second part, the characterization methods for this type of nanoparticles are discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Diffusion of synthetic biology: a challenge to biosafety.

    PubMed

    Schmidt, Markus

    2008-06-01

    One of the main aims of synthetic biology is to make biology easier to engineer. Major efforts in synthetic biology are made to develop a toolbox to design biological systems without having to go through a massive research and technology process. With this "de-skilling" agenda, synthetic biology might finally unleash the full potential of biotechnology and spark a wave of innovation, as more and more people have the necessary skills to engineer biology. But this ultimate domestication of biology could easily lead to unprecedented safety challenges that need to be addressed: more and more people outside the traditional biotechnology community will create self-replicating machines (life) for civil and defence applications; "biohackers" will engineer new life forms at their kitchen table; and illicit substances will be produced synthetically and far more cheaply. Such a scenario is a messy and dangerous one, and we need to think about appropriate safety standards now.

  17. A toolbox of immunoprecipitation-grade monoclonal antibodies to human transcription factors.

    PubMed

    Venkataraman, Anand; Yang, Kun; Irizarry, Jose; Mackiewicz, Mark; Mita, Paolo; Kuang, Zheng; Xue, Lin; Ghosh, Devlina; Liu, Shuang; Ramos, Pedro; Hu, Shaohui; Bayron Kain, Diane; Keegan, Sarah; Saul, Richard; Colantonio, Simona; Zhang, Hongyan; Behn, Florencia Pauli; Song, Guang; Albino, Edisa; Asencio, Lillyann; Ramos, Leonardo; Lugo, Luvir; Morell, Gloriner; Rivera, Javier; Ruiz, Kimberly; Almodovar, Ruth; Nazario, Luis; Murphy, Keven; Vargas, Ivan; Rivera-Pacheco, Zully Ann; Rosa, Christian; Vargas, Moises; McDade, Jessica; Clark, Brian S; Yoo, Sooyeon; Khambadkone, Seva G; de Melo, Jimmy; Stevanovic, Milanka; Jiang, Lizhi; Li, Yana; Yap, Wendy Y; Jones, Brittany; Tandon, Atul; Campbell, Elliot; Montelione, Gaetano T; Anderson, Stephen; Myers, Richard M; Boeke, Jef D; Fenyö, David; Whiteley, Gordon; Bader, Joel S; Pino, Ignacio; Eichinger, Daniel J; Zhu, Heng; Blackshaw, Seth

    2018-03-19

    A key component of efforts to address the reproducibility crisis in biomedical research is the development of rigorously validated and renewable protein-affinity reagents. As part of the US National Institutes of Health (NIH) Protein Capture Reagents Program (PCRP), we have generated a collection of 1,406 highly validated immunoprecipitation- and/or immunoblotting-grade mouse monoclonal antibodies (mAbs) to 737 human transcription factors, using an integrated production and validation pipeline. We used HuProt human protein microarrays as a primary validation tool to identify mAbs with high specificity for their cognate targets. We further validated PCRP mAbs by means of multiple experimental applications, including immunoprecipitation, immunoblotting, chromatin immunoprecipitation followed by sequencing (ChIP-seq), and immunohistochemistry. We also conducted a meta-analysis that identified critical variables that contribute to the generation of high-quality mAbs. All validation data, protocols, and links to PCRP mAb suppliers are available at http://proteincapture.org.

  18. Traffic intensity monitoring using multiple object detection with traffic surveillance cameras

    NASA Astrophysics Data System (ADS)

    Hamdan, H. G. Muhammad; Khalifah, O. O.

    2017-11-01

    Object detection and tracking is a field of research with many applications in the current generation, given the increasing number of cameras on the streets and the lower cost of Internet of Things (IoT) devices. In this paper, a traffic intensity monitoring system based on the Macroscopic Urban Traffic model is implemented using computer vision as its source. The input of this program is extracted from a traffic surveillance camera, which feeds another program running a neural network classifier that can identify and differentiate vehicle types. The neural network toolbox is trained with positive and negative inputs to increase accuracy. The accuracy of the program is compared to other related works, and the trends of the traffic intensity on a road are also calculated. Lastly, the limitations and future work are discussed.
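    A minimal sketch of the aggregation step described above: given per-frame vehicle counts from any detector, form a traffic-intensity time series over fixed windows. The detector itself, the frame rate and the simulated counts are assumptions; this is not the authors' implementation.

```python
import numpy as np

def traffic_intensity(detections_per_frame, fps=25.0, window_s=60.0):
    """Aggregate per-frame vehicle counts into a mean-vehicles-per-frame value per window."""
    counts = np.asarray(detections_per_frame, dtype=float)
    frames_per_window = int(fps * window_s)
    n_windows = counts.size // frames_per_window
    trimmed = counts[: n_windows * frames_per_window]
    # Mean number of vehicles visible per frame within each window.
    return trimmed.reshape(n_windows, frames_per_window).mean(axis=1)

# Hypothetical output of a frame-by-frame vehicle classifier (10 minutes at 25 fps).
rng = np.random.default_rng(1)
per_frame_counts = rng.poisson(lam=4.0, size=25 * 60 * 10)
print(traffic_intensity(per_frame_counts))
```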

  19. SSOAP Toolbox Enhancements and Case Study

    EPA Science Inventory

    Recognizing the need for tools to support the development of sanitary sewer overflow (SSO) control plans, in October 2009 the U.S. Environmental Protection Agency (EPA) released the first version of the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox. This first ve...

  20. Propulsion System Simulation Using the Toolbox for the Modeling and Analysis of Thermodynamic System (T-MATS)

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei (OA)

    2014-01-01

    A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This presentation describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this presentation is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture.
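    T-MATS itself is a MATLAB/Simulink library; as a language-neutral sketch of the core idea of wrapping component models in a numerical iterative solver, the snippet below drives the residuals of two toy component models to zero with a finite-difference Newton iteration. The component equations are invented for illustration and do not come from T-MATS.

```python
import numpy as np

def residuals(x):
    """Hypothetical steady-state residuals: mismatch between two toy component models."""
    shaft_speed, mass_flow = x
    supply = 2.0 * shaft_speed - 0.1 * shaft_speed ** 2     # toy compressor flow map
    demand = 1.5 + 0.8 * mass_flow                          # toy downstream demand
    power_balance = shaft_speed - 0.5 * mass_flow ** 2      # toy power match
    return np.array([supply - demand, power_balance])

def newton_solve(fun, x0, tol=1e-9, max_iter=50, eps=1e-6):
    """Generic Newton iteration with a finite-difference Jacobian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        f = fun(x)
        if np.linalg.norm(f) < tol:
            break
        jac = np.empty((f.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            jac[:, j] = (fun(x + dx) - f) / eps
        x = x - np.linalg.solve(jac, f)
    return x

print(newton_solve(residuals, x0=[1.0, 1.0]))
```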

  1. Understanding and managing fish populations: keeping the toolbox fit for purpose.

    PubMed

    Paris, J R; Sherman, K D; Bell, E; Boulenger, C; Delord, C; El-Mahdi, M B M; Fairfield, E A; Griffiths, A M; Gutmann Roberts, C; Hedger, R D; Holman, L E; Hooper, L H; Humphries, N E; Katsiadaki, I; King, R A; Lemopoulos, A; Payne, C J; Peirson, G; Richter, K K; Taylor, M I; Trueman, C N; Hayden, B; Stevens, J R

    2018-03-01

    Wild fish populations are currently experiencing unprecedented pressures, which are projected to intensify in the coming decades. Developing a thorough understanding of the influences of both biotic and abiotic factors on fish populations is a salient issue in contemporary fish conservation and management. During the 50th Anniversary Symposium of The Fisheries Society of the British Isles at the University of Exeter, UK, in July 2017, scientists from diverse research backgrounds gathered to discuss key topics under the broad umbrella of 'Understanding Fish Populations'. Below, the output of one such discussion group is detailed, focusing on tools used to investigate natural fish populations. Five main groups of approaches were identified: tagging and telemetry; molecular tools; survey tools; statistical and modelling tools; tissue analyses. The appraisal covered current challenges and potential solutions for each of these topics. In addition, three key themes were identified as applicable across all tool-based applications. These included data management, public engagement, and fisheries policy and governance. The continued innovation of tools and capacity to integrate interdisciplinary approaches into the future assessment and management of fish populations is highlighted as an important focus for the next 50 years of fisheries research. © 2018 The Fisheries Society of the British Isles.

  2. From gene to biorefinery: microbial β-etherases as promising biocatalysts for lignin valorization.

    PubMed

    Picart, Pere; de María, Pablo Domínguez; Schallmey, Anett

    2015-01-01

    The set-up of biorefineries for the valorization of lignocellulosic biomass will be core in the future to reach sustainability targets. In this area, biomass-degrading enzymes are attracting significant research interest for their potential in the production of chemicals and biofuels from renewable feedstock. Glutathione-dependent β-etherases are emerging enzymes for the biocatalytic depolymerization of lignin, a heterogeneous aromatic polymer abundant in nature. They selectively catalyze the reductive cleavage of β-O-4 aryl-ether bonds, which account for 45-60% of linkages present in lignin. Hence, application of β-etherases in lignin depolymerization would enable a specific lignin breakdown, selectively yielding (valuable) low-molecular-mass aromatics. Although β-etherases have been biochemically known for decades, only very recently have novel β-etherases been identified and thoroughly characterized for lignin valorization, expanding the enzyme toolbox for efficient β-O-4 aryl-ether bond cleavage. Given their emerging importance and potential, this mini-review discusses recent developments in the field of β-etherase biocatalysis, covering all aspects from enzyme identification to biocatalytic applications with real lignin samples.

  3. From gene to biorefinery: microbial β-etherases as promising biocatalysts for lignin valorization

    PubMed Central

    Picart, Pere; de María, Pablo Domínguez; Schallmey, Anett

    2015-01-01

    The set-up of biorefineries for the valorization of lignocellulosic biomass will be core in the future to reach sustainability targets. In this area, biomass-degrading enzymes are attracting significant research interest for their potential in the production of chemicals and biofuels from renewable feedstock. Glutathione-dependent β-etherases are emerging enzymes for the biocatalytic depolymerization of lignin, a heterogeneous aromatic polymer abundant in nature. They selectively catalyze the reductive cleavage of β-O-4 aryl-ether bonds, which account for 45–60% of linkages present in lignin. Hence, application of β-etherases in lignin depolymerization would enable a specific lignin breakdown, selectively yielding (valuable) low-molecular-mass aromatics. Although β-etherases have been biochemically known for decades, only very recently have novel β-etherases been identified and thoroughly characterized for lignin valorization, expanding the enzyme toolbox for efficient β-O-4 aryl-ether bond cleavage. Given their emerging importance and potential, this mini-review discusses recent developments in the field of β-etherase biocatalysis, covering all aspects from enzyme identification to biocatalytic applications with real lignin samples. PMID:26388858

  4. Application of SiPM for Modern Nuclear Physics Practical Workshop

    NASA Astrophysics Data System (ADS)

    Chepurnov, A. S.; Gavrilenko, O. I.; Caccia, Massimo; Mattone, Cristina; Oleinik, A. N.; Radchenko, V. V.

    2017-01-01

    Silicon PhotoMultipliers (SiPM) are state-of-the-art light detectors with very high single-photon sensitivity and photon-number-resolving capability, representing a breakthrough in several fundamental and applied science domains. Introducing SiPMs into education is therefore an important step toward increasing the number of specialists involved in SiPM development and application. As a result of collaborative efforts between industry and academic institutions, a modular set of instruments based on SiPM light sensors has been developed by CAEN s.p.a. It is developed mainly for educational purposes and allows a series of experiments to be performed, including photon detection, gamma spectrometry, cosmic ray observation, and beta and gamma ray absorption. In addition, educational experiments based on a SiPM set-up guide students towards a comprehensive knowledge of SiPM technology while experiencing the quantum nature of light and exploring the statistical properties of the light pulses emitted by a LED. The toolbox is actually an open platform in continuous evolution thanks to the contribution of the research community and cooperation with high schools.

  5. 3D correlative light and electron microscopy of cultured cells using serial blockface scanning electron microscopy

    PubMed Central

    Lerner, Thomas R.; Burden, Jemima J.; Nkwe, David O.; Pelchen-Matthews, Annegret; Domart, Marie-Charlotte; Durgan, Joanne; Weston, Anne; Jones, Martin L.; Peddie, Christopher J.; Carzaniga, Raffaella; Florey, Oliver; Marsh, Mark; Gutierrez, Maximiliano G.

    2017-01-01

    The processes of life take place in multiple dimensions, but imaging these processes in even three dimensions is challenging. Here, we describe a workflow for 3D correlative light and electron microscopy (CLEM) of cell monolayers using fluorescence microscopy to identify and follow biological events, combined with serial blockface scanning electron microscopy to analyse the underlying ultrastructure. The workflow encompasses all steps from cell culture to sample processing, imaging strategy, and 3D image processing and analysis. We demonstrate successful application of the workflow to three studies, each aiming to better understand complex and dynamic biological processes, including bacterial and viral infections of cultured cells and formation of entotic cell-in-cell structures commonly observed in tumours. Our workflow revealed new insight into the replicative niche of Mycobacterium tuberculosis in primary human lymphatic endothelial cells, HIV-1 in human monocyte-derived macrophages, and the composition of the entotic vacuole. The broad application of this 3D CLEM technique will make it a useful addition to the correlative imaging toolbox for biomedical research. PMID:27445312

  6. A toolbox for discrete modelling of cell signalling dynamics.

    PubMed

    Paterson, Yasmin Z; Shorthouse, David; Pleijzier, Markus W; Piterman, Nir; Bendtsen, Claus; Hall, Benjamin A; Fisher, Jasmin

    2018-06-18

    In an age where the volume of data regarding biological systems exceeds our ability to analyse it, many researchers are looking towards systems biology and computational modelling to help unravel the complexities of gene and protein regulatory networks. In particular, the use of discrete modelling allows generation of signalling networks in the absence of full quantitative descriptions of systems, which are necessary for ordinary differential equation (ODE) models. In order to make such techniques more accessible to mainstream researchers, tools such as the BioModelAnalyzer (BMA) have been developed to provide a user-friendly graphical interface for discrete modelling of biological systems. Here we use the BMA to build a library of discrete target functions of known canonical molecular interactions, translated from ordinary differential equations (ODEs). We then show that these BMA target functions can be used to reconstruct complex networks, which can correctly predict many known genetic perturbations. This new library supports the accessibility ethos behind the creation of BMA, providing a toolbox for the construction of complex cell signalling models without the need for extensive experience in computer programming or mathematical modelling, and allows for construction and simulation of complex biological systems with only small amounts of quantitative data.
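    As a hedged sketch of the kind of discrete modelling described above (not the BMA's own file format or engine), the snippet below updates a small network of bounded integer variables, each moving one step toward a target value computed from its activators and inhibitors; the "mean activators minus mean inhibitors" target and the three-node motif are assumptions.

```python
import numpy as np

def default_target(activators, inhibitors, n_max):
    """Target level: mean activator level minus mean inhibitor level, clamped to [0, n_max]."""
    act = np.mean(activators) if activators else 0
    inh = np.mean(inhibitors) if inhibitors else 0
    return int(np.clip(round(act - inh), 0, n_max))

def step(state, network, n_max=2):
    """One synchronous update: every variable moves by at most 1 toward its target value."""
    new_state = {}
    for node, (act_nodes, inh_nodes) in network.items():
        target = default_target([state[a] for a in act_nodes],
                                [state[i] for i in inh_nodes], n_max)
        new_state[node] = int(state[node] + np.sign(target - state[node]))
    return new_state

# Hypothetical three-node motif: A activates B, B activates C, C inhibits A.
network = {"A": ([], ["C"]), "B": (["A"], []), "C": (["B"], [])}
state = {"A": 2, "B": 0, "C": 0}
for _ in range(6):
    state = step(state, network)
    print(state)
```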

  7. The Self-Powered Detector Simulation 'MATiSSe' Toolbox applied to SPNDs for severe accident monitoring in PWRs

    NASA Astrophysics Data System (ADS)

    Barbot, Loïc; Villard, Jean-François; Fourrez, Stéphane; Pichon, Laurent; Makil, Hamid

    2018-01-01

    In the framework of the French National Research Agency program on nuclear safety and radioprotection, the 'DIstributed Sensing for COrium Monitoring and Safety' project aims at developing innovative instrumentation for corium monitoring in case of a severe accident in a Pressurized Water nuclear Reactor. Among others, a new under-vessel instrumentation based on Self-Powered Neutron Detectors is being developed using a numerical simulation toolbox named 'MATiSSe'. The CEA Instrumentation Sensors and Dosimetry Lab has developed MATiSSe since 2010 for Self-Powered Neutron Detector material selection and geometry design, as well as for their respective partial neutron and gamma sensitivity calculations. MATiSSe is based on a comprehensive model of the neutron and gamma interactions which take place in self-powered neutron detector components, using the MCNP6 Monte Carlo code. As a member of the project consortium, the THERMOCOAX SAS Company is currently manufacturing instrumented pole prototypes to be tested in 2017. The full severe accident monitoring equipment, including the standalone low-current acquisition system, will be tested during a joint CEA-THERMOCOAX experimental campaign under realistic irradiation conditions in the Slovenian TRIGA Mark II research reactor.

  8. EOSPEC: a complementary toolbox for MODTRAN calculations

    NASA Astrophysics Data System (ADS)

    Dion, Denis

    2016-09-01

    For more than a decade, Defence Research and Development Canada (DRDC) has been developing a library of computer models for the calculation of atmospheric effects on EO-IR sensor performance. The library, called EOSPEC-LIB (EO-IR Sensor PErformance Computation LIBrary), has been designed as a complement to MODTRAN, the radiative transfer code developed by the Air Force Research Laboratory and Spectral Science Inc. in the USA. The library comprises modules for the definition of atmospheric conditions, including aerosols, and provides modules for the calculation of turbulence and fine refraction effects. SMART (Suite for Multi-resolution Atmospheric Radiative Transfer), a key component of EOSPEC, allows one to perform fast computations of transmittances and radiances using MODTRAN through a wide-band correlated-k computational approach. In its most recent version, EOSPEC includes a MODTRAN toolbox whose functions help generate, in a format compatible with MODTRAN 5 and 6, atmospheric and aerosol profiles, user-defined refracted optical paths and inputs for configuring the MODTRAN sea radiance (BRDF) model. The paper gives an overall description of the EOSPEC features and capabilities. EOSPEC provides augmented capabilities for computations in the lower atmosphere and in maritime environments.

  9. Cognition assessment using the NIH Toolbox

    PubMed Central

    Dikmen, Sureyya S.; Heaton, Robert K.; Tulsky, David S.; Zelazo, Philip D.; Bauer, Patricia J.; Carlozzi, Noelle E.; Slotkin, Jerry; Blitz, David; Wallner-Allen, Kathleen; Fox, Nathan A.; Beaumont, Jennifer L.; Mungas, Dan; Nowinski, Cindy J.; Richler, Jennifer; Deocampo, Joanne A.; Anderson, Jacob E.; Manly, Jennifer J.; Borosh, Beth; Havlik, Richard; Conway, Kevin; Edwards, Emmeline; Freund, Lisa; King, Jonathan W.; Moy, Claudia; Witt, Ellen; Gershon, Richard C.

    2013-01-01

    Cognition is 1 of 4 domains measured by the NIH Toolbox for the Assessment of Neurological and Behavioral Function (NIH-TB), and complements modules testing motor function, sensation, and emotion. On the basis of expert panels, the cognition subdomains identified as most important for health, success in school and work, and independence in daily functioning were Executive Function, Episodic Memory, Language, Processing Speed, Working Memory, and Attention. Seven measures were designed to tap constructs within these subdomains. The instruments were validated in English, in a sample of 476 participants ranging in age from 3 to 85 years, with representation from both sexes, 3 racial/ethnic categories, and 3 levels of education. This report describes the development of the Cognition Battery and presents results on test-retest reliability, age effects on performance, and convergent and discriminant construct validity. The NIH-TB Cognition Battery is intended to serve as a brief, convenient set of measures to supplement other outcome measures in epidemiologic and longitudinal research and clinical trials. With a computerized format and national standardization, this battery will provide a “common currency” among researchers for comparisons across a wide range of studies and populations. PMID:23479546

  10. The Pharmacogenomics Research Network Translational Pharmacogenetics Program: Overcoming Challenges of Real-World Implementation

    PubMed Central

    Shuldiner, AR; Relling, MV; Peterson, JF; Hicks, JK; Freimuth, RR; Sadee, W; Pereira, NL; Roden, DM; Johnson, JA; Klein, TE

    2013-01-01

    The pace of discovery of potentially actionable pharmacogenetic variants has increased dramatically in recent years. However, the implementation of this new knowledge for individualized patient care has been slow. The Pharmacogenomics Research Network (PGRN) Translational Pharmacogenetics Program seeks to identify barriers and develop real-world solutions to implementation of evidence-based pharmacogenetic tests in diverse health-care settings. Dissemination of the resulting toolbox of “implementation best practices” will prove useful to a broad audience. PMID:23588301

  11. From magic to technology: materials integration by wafer bonding

    NASA Astrophysics Data System (ADS)

    Dragoi, Viorel

    2006-02-01

    In the last decade, wafer bonding became a very powerful technology for MEMS/MOEMS manufacturing. Because it offers a solution to overcome some problems of the standard processes used for materials integration (e.g. epitaxy, thin-film deposition), wafer bonding is nowadays considered an important item in the MEMS engineer's toolbox. Different principles governing the wafer bonding processes will be reviewed in this paper. Various types of applications will be presented as examples.

  12. The PHM-Ethics methodology: interdisciplinary technology assessment of personal health monitoring.

    PubMed

    Schmidt, Silke; Verweij, Marcel

    2013-01-01

    The contribution briefly introduces the PHM-Ethics project and the PHM methodology. Within the PHM-Ethics project, a set of tools and modules has been developed that may assist in the evaluation and assessment of new technologies for personal health monitoring, referred to as the "PHM methodology" or "PHM toolbox". An overview of this interdisciplinary methodology and its comprising modules is provided, and areas of application and intended target groups are indicated.

  13. nSTAT: Open-Source Neural Spike Train Analysis Toolbox for Matlab

    PubMed Central

    Cajigas, I.; Malik, W.Q.; Brown, E.N.

    2012-01-01

    Over the last decade there has been a tremendous advance in the analytical tools available to neuroscientists to understand and model neural function. In particular, the point-process generalized linear model (PP-GLM) framework has been applied successfully to problems ranging from neuro-endocrine physiology to neural decoding. However, the lack of freely distributed software implementations of published PP-GLM algorithms, together with problem-specific modifications required for their use, limits wide application of these techniques. In an effort to make existing PP-GLM methods more accessible to the neuroscience community, we have developed nSTAT – an open-source neural spike train analysis toolbox for Matlab®. By adopting an Object-Oriented Programming (OOP) approach, nSTAT allows users to easily manipulate data by performing operations on objects that have an intuitive connection to the experiment (spike trains, covariates, etc.), rather than by dealing with data in vector/matrix form. The algorithms implemented within nSTAT address a number of common problems including computation of peri-stimulus time histograms, quantification of the temporal response properties of neurons, and characterization of neural plasticity within and across trials. nSTAT provides a starting point for exploratory data analysis, allows for simple and systematic building and testing of point process models, and for decoding of stimulus variables based on point process models of neural function. By providing an open-source toolbox, we hope to establish a platform that can be easily used, modified, and extended by the scientific community to address limitations of current techniques and to extend available techniques to more complex problems. PMID:22981419
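    nSTAT is a MATLAB toolbox; purely as an illustration of the underlying point-process GLM idea (and not of nSTAT's API), the following Python snippet fits a Poisson GLM to simulated binned spike counts with statsmodels. The simulated rates and covariate are assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulate binned spike counts whose log-rate depends on a stimulus covariate.
n_bins = 2000
stimulus = rng.normal(size=n_bins)
true_log_rate = -1.0 + 0.7 * stimulus
counts = rng.poisson(np.exp(true_log_rate))

# Point-process / Poisson GLM: log(rate) = b0 + b1 * stimulus.
design = sm.add_constant(stimulus)
fit = sm.GLM(counts, design, family=sm.families.Poisson()).fit()
print(fit.params)   # should recover roughly [-1.0, 0.7]
```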

  14. GeoPAT: A toolbox for pattern-based information retrieval from large geospatial databases

    NASA Astrophysics Data System (ADS)

    Jasiewicz, Jarosław; Netzel, Paweł; Stepinski, Tomasz

    2015-07-01

    Geospatial Pattern Analysis Toolbox (GeoPAT) is a collection of GRASS GIS modules for carrying out pattern-based geospatial analysis of images and other spatial datasets. The need for pattern-based analysis arises when images/rasters contain rich spatial information either because of their very high resolution or their very large spatial extent. Elementary units of pattern-based analysis are scenes - patches of surface consisting of a complex arrangement of individual pixels (patterns). GeoPAT modules implement popular GIS algorithms, such as query, overlay, and segmentation, to operate on the grid of scenes. To achieve these capabilities GeoPAT includes a library of scene signatures - compact numerical descriptors of patterns, and a library of distance functions - providing numerical means of assessing dissimilarity between scenes. Ancillary GeoPAT modules use these functions to construct a grid of scenes or to assign signatures to individual scenes having regular or irregular geometries. Thus GeoPAT combines knowledge retrieval from patterns with mapping tasks within a single integrated GIS environment. GeoPAT is designed to identify and analyze complex, highly generalized classes in spatial datasets. Examples include distinguishing between different styles of urban settlements using VHR images, delineating different landscape types in land cover maps, and mapping physiographic units from DEM. The concept of pattern-based spatial analysis is explained and the roles of all modules and functions are described. A case study example pertaining to delineation of landscape types in a subregion of NLCD is given. Performance evaluation is included to highlight GeoPAT's applicability to very large datasets. The GeoPAT toolbox is available for download from
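    GeoPAT is implemented as GRASS GIS modules; as a conceptual sketch only (not GeoPAT's code), the snippet below computes a simple scene signature (a class-composition histogram) for two patches of a categorical raster and measures their dissimilarity with the Jensen-Shannon distance. The scene sizes, class counts and compositions are assumptions.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

def scene_signature(patch, n_classes):
    """Normalized histogram of class labels inside a patch of a categorical raster."""
    hist = np.bincount(patch.ravel(), minlength=n_classes).astype(float)
    return hist / hist.sum()

rng = np.random.default_rng(2)
n_classes = 5
# Two hypothetical 100x100-pixel scenes with different class compositions.
scene_a = rng.choice(n_classes, size=(100, 100), p=[0.5, 0.2, 0.1, 0.1, 0.1])
scene_b = rng.choice(n_classes, size=(100, 100), p=[0.1, 0.1, 0.2, 0.3, 0.3])

sig_a = scene_signature(scene_a, n_classes)
sig_b = scene_signature(scene_b, n_classes)
print(jensenshannon(sig_a, sig_b))   # 0 = identical composition, larger = more dissimilar
```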

  15. Focused Field Investigations for Sewer Condition Assessment with EPA SSOAP Toolbox

    EPA Science Inventory

    The Nation’s sanitary sewer infrastructure is aging, and is currently one of the top national water program priorities. The U.S. Environmental Protection Agency (EPA) developed the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox to assist communities in developing ...

  16. A Toolbox for Corrective Action: Resource Conservation and Recovery Act Facilities Investigation Remedy Selection Track

    EPA Pesticide Factsheets

    The purpose of this toolbox is to help EPA Regional staff and their partners to take advantage of the efficiency and quality gains from the Resource Conservation and Recovery Act (RCRA) Facilities Investigation Remedy Selection Track (FIRST) approach.

  17. Traffic analysis toolbox volume IX : work zone modeling and simulation, a guide for analysts

    DOT National Transportation Integrated Search

    2009-03-01

    This document is the second volume in the FHWA Traffic Analysis Toolbox: Work Zone Analysis series. Whereas the first volume provides guidance to decision-makers at agencies and jurisdictions considering the role of analytical tools in work zone plan...

  18. An analysis toolbox to explore mesenchymal migration heterogeneity reveals adaptive switching between distinct modes

    PubMed Central

    Shafqat-Abbasi, Hamdah; Kowalewski, Jacob M; Kiss, Alexa; Gong, Xiaowei; Hernandez-Varas, Pablo; Berge, Ulrich; Jafari-Mamaghani, Mehrdad; Lock, John G; Strömblad, Staffan

    2016-01-01

    Mesenchymal (lamellipodial) migration is heterogeneous, although whether this reflects progressive variability or discrete, 'switchable' migration modalities, remains unclear. We present an analytical toolbox, based on quantitative single-cell imaging data, to interrogate this heterogeneity. Integrating supervised behavioral classification with multivariate analyses of cell motion, membrane dynamics, cell-matrix adhesion status and F-actin organization, this toolbox here enables the detection and characterization of two quantitatively distinct mesenchymal migration modes, termed 'Continuous' and 'Discontinuous'. Quantitative mode comparisons reveal differences in cell motion, spatiotemporal coordination of membrane protrusion/retraction, and how cells within each mode reorganize with changed cell speed. These modes thus represent distinctive migratory strategies. Additional analyses illuminate the macromolecular- and cellular-scale effects of molecular targeting (fibronectin, talin, ROCK), including 'adaptive switching' between Continuous (favored at high adhesion/full contraction) and Discontinuous (low adhesion/inhibited contraction) modes. Overall, this analytical toolbox now facilitates the exploration of both spontaneous and adaptive heterogeneity in mesenchymal migration. DOI: http://dx.doi.org/10.7554/eLife.11384.001 PMID:26821527

  19. A Toolbox to Improve Algorithms for Insulin-Dosing Decision Support

    PubMed Central

    Donsa, K.; Plank, J.; Schaupp, L.; Mader, J. K.; Truskaller, T.; Tschapeller, B.; Höll, B.; Spat, S.; Pieber, T. R.

    2014-01-01

    Summary Background Standardized insulin order sets for subcutaneous basal-bolus insulin therapy are recommended by clinical guidelines for the inpatient management of diabetes. The algorithm based GlucoTab system electronically assists health care personnel by supporting clinical workflow and providing insulin-dose suggestions. Objective To develop a toolbox for improving clinical decision-support algorithms. Methods The toolbox has three main components. 1) Data preparation: Data from several heterogeneous sources is extracted, cleaned and stored in a uniform data format. 2) Simulation: The effects of algorithm modifications are estimated by simulating treatment workflows based on real data from clinical trials. 3) Analysis: Algorithm performance is measured, analyzed and simulated by using data from three clinical trials with a total of 166 patients. Results Use of the toolbox led to algorithm improvements as well as the detection of potential individualized subgroup-specific algorithms. Conclusion These results are a first step towards individualized algorithm modifications for specific patient subgroups. PMID:25024768

  20. The BRAT and GUT Couple: Broadview Radar Altimetry and GOCE User Toolboxes

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Restano, M.; Ambrózio, A.

    2017-12-01

    The Broadview Radar Altimetry Toolbox (BRAT) is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including Sentinel-3A L1 and L2 products. A tutorial is included providing plenty of use cases. BRAT's next release (4.2.0) is planned for October 2017. Based on community feedback, the front-end has been further improved and simplified, whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain desired data while bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving combinations of data fields that can be saved for future use, either by using embedded formulas, including those from oceanographic altimetry, or by implementing ad-hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. The GOCE User Toolbox (GUT) is a compilation of tools for the use and analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data and allows gravity field data, in conjunction and consistently with any other auxiliary data set, to be pre-processed by beginners in gravity field processing, for oceanographic and hydrologic as well as for solid earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during the GRACE and GOCE missions. In the current 3.1 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements aimed at facilitating the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies have been introduced. Also packaged with GUT is its Variance-Covariance Matrix tool (VCM). The BRAT and GUT toolboxes can be freely downloaded, along with ancillary material, at https://earth.esa.int/brat and https://earth.esa.int/gut.

  1. Implementation of the DINEOF ArcGIS Toolbox: Case study of reconstruction of Chlorophyll-a missing data over the Mediterranean using MyOcean satellite data products.

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Andreas; Stylianou, Stavros; Georgiou, Georgios; Hadjimitsis, Diofantos; Akylas, Evangelos

    2014-05-01

    ArcGIS® is a well-known standard in Geographical Information Systems, used over the years for various remote sensing procedures. During the last decade, Rixen (2003) and Azcarate (2011) presented the DINEOF (Data Interpolating Empirical Orthogonal Functions) method, an EOF-based technique to reconstruct missing data in satellite images. Recent results of the DINEOF method in various experimental trials (Wang and Liu, 2014; Nikolaidis et al., 2013, 2014) showed that this computationally affordable method leads to effective reconstruction of missing data in geophysical fields derived from satellite data, such as chlorophyll-a, sea surface temperature or salinity. Implementing the method in a GIS will lead to a complete and integrated approach, enhancing its applicability. The inclusion of statistical tools within the GIS will multiply its effectiveness, providing interoperability with other sources in the same application environment. This may be especially useful in studies where several different kinds of data are of interest. For this purpose, in this study we have implemented a new GIS toolbox that aims at automating the usage of the algorithm, incorporating the DINEOF codes provided by GHER (GeoHydrodynamics and Environment Research Group of the University of Liege) into ArcGIS®. A case study of filling missing chlorophyll-a data over the Mediterranean Sea for an 18-day period is analyzed as an example of the effectiveness and simplicity of the toolbox. More specifically, we focus on chlorophyll-a MODIS satellite data collected by CNR-ISAC (Italian National Research Council, Institute of Atmospheric Sciences and Climate), from the corresponding MyOcean2® products, which provide free online access to Level 3 data at 1 km resolution. All the daily products, with an initial data coverage of only 27%, were successfully reconstructed over the Mediterranean Sea.
    [1] Alvera-Azcárate A., Barth A., Sirjacobs D., Lenartz F., Beckers J.-M. Data Interpolating Empirical Orthogonal Functions (DINEOF): a tool for geophysical data analyses. Medit. Mar. Sci., 5-11 (2011).
    [2] Rixen M., Beckers J.-M. EOF Calculations and Data Filling from Incomplete Oceanographic Datasets. Journal of Atmospheric and Oceanic Technology, Vol. 20(12), pp. 1839-1856 (2003).
    [3] Nikolaidis A., Georgiou G., Hadjimitsis D., Akylas E. Applying a DINEOF algorithm on cloudy sea-surface temperature satellite data over the eastern Mediterranean Sea. Central European Journal of Geosciences, 6(1), pp. 1-16 (2014).
    [4] Nikolaidis A., Georgiou G., Hadjimitsis D., Akylas E. Applying DINEOF algorithm on cloudy sea-surface temperature satellite data over the eastern Mediterranean Sea. Proc. SPIE 8795, First International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2013), 87950L, 8-10 April 2013, Paphos, Cyprus. doi:10.1117/12.2029085.
    [5] Wang Y., Liu D. Reconstruction of satellite chlorophyll-a data using a modified DINEOF method: a case study in the Bohai and Yellow seas, China. International Journal of Remote Sensing, Vol. 35(1), pp. 204-217 (2014).
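    The core DINEOF idea (iteratively filling gaps with a truncated EOF/SVD reconstruction) can be sketched compactly in numpy, as below. This is not the GHER code wrapped by the toolbox; the rank, iteration count, convergence tolerance and synthetic field are assumptions.

```python
import numpy as np

def dineof_like_fill(field, rank=3, n_iter=50, tol=1e-6):
    """Fill NaN gaps in a (time x space) matrix by iterating a truncated SVD reconstruction."""
    data = np.array(field, dtype=float)
    mask = np.isnan(data)
    data[mask] = np.nanmean(field)            # initial guess for missing values
    prev = np.inf
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(data, full_matrices=False)
        recon = (u[:, :rank] * s[:rank]) @ vt[:rank, :]
        change = np.sqrt(np.mean((recon[mask] - data[mask]) ** 2))
        data[mask] = recon[mask]              # only missing entries are updated
        if abs(prev - change) < tol:
            break
        prev = change
    return data

# Synthetic chlorophyll-like field with 30% of values masked out.
rng = np.random.default_rng(3)
t = np.linspace(0, 2 * np.pi, 60)
truth = np.outer(np.sin(t), np.cos(np.linspace(0, np.pi, 200))) + 0.05 * rng.normal(size=(60, 200))
obs = truth.copy()
obs[rng.random(obs.shape) < 0.3] = np.nan
filled = dineof_like_fill(obs)
print(np.sqrt(np.mean((filled - truth) ** 2)))   # reconstruction error vs. the known truth
```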

  2. Automated riverine landscape characterization: GIS-based tools for watershed-scale research, assessment, and management.

    PubMed

    Williams, Bradley S; D'Amico, Ellen; Kastens, Jude H; Thorp, James H; Flotemersch, Joseph E; Thoms, Martin C

    2013-09-01

    River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchical classification procedures aid in HP identification by grouping sections of river based on their hydrogeomorphic character; however, collecting data required for such procedures with field-based methods is often impractical. We developed a set of GIS-based tools that facilitate rapid, low-cost riverine landscape characterization and FPZ classification. Our tools, termed RESonate, consist of a custom toolbox designed for ESRI ArcGIS®. RESonate automatically extracts 13 hydrogeomorphic variables from readily available geospatial datasets and datasets derived from modeling procedures. An advanced 2D flood model, FLDPLN, designed for MATLAB®, is used to determine valley morphology by systematically flooding river networks. When used in conjunction with other modeling procedures, RESonate and FLDPLN can assess the character of large river networks quickly and at very low costs. Here we describe tool and model functions in addition to their benefits, limitations, and applications.

  3. Brave new worlds--review and update on virtual reality assessment and treatment in psychosis.

    PubMed

    Veling, Wim; Moritz, Steffen; van der Gaag, Mark

    2014-11-01

    In recent years, virtual reality (VR) research on psychotic disorders has been initiated. Several studies showed that VR can elicit paranoid thoughts about virtual characters (avatars), both in patients with psychotic disorders and healthy individuals. Real life symptoms and VR experiences were correlated, lending further support to its validity. Neurocognitive deficits and difficulties in social behavior were found in schizophrenia patients, not only in abstract tasks but also using naturalistic virtual environments that are more relevant to daily life, such as a city or encounters with avatars. VR treatments are conceivable for most dimensions of psychotic disorders. There is a small but expanding literature on interventions for delusions, hallucinations, neurocognition, social cognition, and social skills; preliminary results are promising. VR applications for assessment and treatment of psychotic disorders are in their infancy, but appear to have a great potential for increasing our understanding of psychosis and expanding the therapeutic toolbox. © The Author 2014. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  4. The Study of Imperfection in Rough Set on the Field of Engineering and Education

    NASA Astrophysics Data System (ADS)

    Sheu, Tian-Wei; Liang, Jung-Chin; You, Mei-Li; Wen, Kun-Li

    Based on its characteristics, rough set theory overlaps with many other theories, especially fuzzy set theory, evidence theory and Boolean reasoning methods, and the rough set methodology has found many real-life applications, such as medical data analysis, finance, banking, engineering, voice recognition and image processing. To date, however, there has been little research on the imperfections of rough sets. Hence, the main purpose of this paper is to study the imperfections of rough sets in the fields of engineering and education. First, we review the mathematical model of rough sets and give two examples to illustrate our approach: one is the weighting of influence factors in a muzzle noise suppressor, and the other is the weighting of evaluation factors in English learning. We also apply Matlab to develop a complete toolbox with a human-machine interface to support the complex calculations and to verify the large amounts of data. Finally, some suggestions for future research are indicated.
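    For readers unfamiliar with the basic rough set machinery discussed here, the snippet below computes the lower and upper approximations of a target set under the indiscernibility relation induced by a set of attributes. The toy decision table is hypothetical and is not data from the paper.

```python
from collections import defaultdict

def approximations(objects, attributes, target):
    """Rough-set lower/upper approximation of `target` given indiscernibility on `attributes`."""
    # Group objects into equivalence classes by their attribute values.
    classes = defaultdict(set)
    for name, values in objects.items():
        key = tuple(values[a] for a in attributes)
        classes[key].add(name)
    lower, upper = set(), set()
    for eq_class in classes.values():
        if eq_class <= target:          # equivalence class entirely inside the target set
            lower |= eq_class
        if eq_class & target:           # equivalence class overlapping the target set
            upper |= eq_class
    return lower, upper

# Toy decision table (hypothetical): attribute values per object.
objects = {
    "x1": {"noise": "high", "size": "small"},
    "x2": {"noise": "high", "size": "small"},
    "x3": {"noise": "low",  "size": "small"},
    "x4": {"noise": "low",  "size": "large"},
}
target = {"x1", "x3"}                    # objects with the decision of interest
print(approximations(objects, ["noise", "size"], target))
```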

  5. ElectroMagnetoEncephalography Software: Overview and Integration with Other EEG/MEG Toolboxes

    PubMed Central

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section. PMID:21577273

  6. ElectroMagnetoEncephalography software: overview and integration with other EEG/MEG toolboxes.

    PubMed

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section.

  7. SBEToolbox: A Matlab Toolbox for Biological Network Analysis

    PubMed Central

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J.

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases. PMID:24027418

  8. SBEToolbox: A Matlab Toolbox for Biological Network Analysis.

    PubMed

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases.
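    SBEToolbox is a Matlab package; as a rough Python analogue of the kind of analysis it performs (not its API), the snippet below computes two centralities and a module partition on a built-in example network using networkx.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Built-in example network (Zachary's karate club), standing in for a biological network.
graph = nx.karate_club_graph()

degree = nx.degree_centrality(graph)
betweenness = nx.betweenness_centrality(graph)
modules = greedy_modularity_communities(graph)

top_hub = max(degree, key=degree.get)
print(f"top hub: node {top_hub}, degree centrality {degree[top_hub]:.2f}, "
      f"betweenness {betweenness[top_hub]:.2f}, {len(modules)} modules found")
```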

  9. UAS-NAS Live Virtual Constructive Distributed Environment (LVC): LVC Gateway, Gateway Toolbox, Gateway Data Logger (GDL), SaaProc Software Design Description

    NASA Technical Reports Server (NTRS)

    Jovic, Srboljub

    2015-01-01

    This document provides the software design description for the two core software components, the LVC Gateway and the LVC Gateway Toolbox, and two participants, the LVC Gateway Data Logger and the SAA Processor (SaaProc).

  10. Expanding the seat belt program strategies toolbox: a starter kit for trying new program ideas : traffic tech.

    DOT National Transportation Integrated Search

    2016-10-01

    The National Highway Traffic Safety Administration has just released a new resource for developing seat belt programs in the traffic safety community: Expanding the Seat Belt Program Toolbox: A Starter Kit for Trying New Program Ideas. Resea...

  11. Focused Field Investigations for Sewer Condition Assessment with EPA SSOAP Toolbox - slides

    EPA Science Inventory

    The Nation’s sanitary sewer infrastructure is aging, and is currently one of the top national water program priorities. The U.S. Environmental Protection Agency (EPA) developed the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox to assist communities in developing S...

  12. Motor assessment using the NIH Toolbox

    PubMed Central

    Magasi, Susan; McCreath, Heather E.; Bohannon, Richard W.; Wang, Ying-Chih; Bubela, Deborah J.; Rymer, William Z.; Beaumont, Jennifer; Rine, Rose Marie; Lai, Jin-Shei; Gershon, Richard C.

    2013-01-01

    Motor function involves complex physiologic processes and requires the integration of multiple systems, including neuromuscular, musculoskeletal, and cardiopulmonary, and neural motor and sensory-perceptual systems. Motor-functional status is indicative of current physical health status, burden of disease, and long-term health outcomes, and is integrally related to daily functioning and quality of life. Given its importance to overall neurologic health and function, motor function was identified as a key domain for inclusion in the NIH Toolbox for Assessment of Neurological and Behavioral Function (NIH Toolbox). We engaged in a 3-stage developmental process to: 1) identify key subdomains and candidate measures for inclusion in the NIH Toolbox, 2) pretest candidate measures for feasibility across the age span of people aged 3 to 85 years, and 3) validate candidate measures against criterion measures in a sample of healthy individuals aged 3 to 85 years (n = 340). Based on extensive literature review and input from content experts, the 5 subdomains of dexterity, strength, balance, locomotion, and endurance were recommended for inclusion in the NIH Toolbox motor battery. Based on our validation testing, valid and reliable measures that are simultaneously low-cost and portable have been recommended to assess each subdomain, including the 9-hole peg board for dexterity, grip dynamometry for upper-extremity strength, standing balance test, 4-m walk test for gait speed, and a 2-minute walk test for endurance. PMID:23479547

  13. MatTAP: A MATLAB toolbox for the control and analysis of movement synchronisation experiments.

    PubMed

    Elliott, Mark T; Welchman, Andrew E; Wing, Alan M

    2009-02-15

    Investigating movement timing and synchronisation at the sub-second range relies on an experimental setup that has high temporal fidelity, is able to deliver output cues and can capture corresponding responses. Modern, multi-tasking operating systems make this increasingly challenging when using standard PC hardware and programming languages. This paper describes a new free suite of tools (available from http://www.snipurl.com/mattap) for use within the MATLAB programming environment, compatible with Microsoft Windows and a range of data acquisition hardware. The toolbox allows flexible generation of timing cues with high temporal accuracy and the capture and automatic storage of corresponding participant responses, and includes an integrated analysis module for the rapid processing of results. A simple graphical user interface is used to navigate the toolbox, so it can be operated easily by users not familiar with programming languages. However, it is also fully extensible and customisable, allowing adaptation for individual experiments and facilitating the addition of new modules in future releases. Here we discuss the relevance of the MatTAP (MATLAB Timing Analysis Package) toolbox to current timing experiments and compare its use to alternative methods. We validate the accuracy of the analysis module through comparison to manual observation methods and replicate a previous sensorimotor synchronisation experiment to demonstrate the versatility of the toolbox features demanded by such movement synchronisation paradigms.
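    MatTAP runs in MATLAB against data-acquisition hardware; as a generic illustration of the analysis stage only (matching responses to cues and summarizing asynchronies), the snippet below uses simulated tap times. The pairing window, metronome interval and simulated data are assumptions.

```python
import numpy as np

def asynchronies(cue_times, response_times, max_lag=0.25):
    """Pair each cue with the nearest response within max_lag seconds; return response - cue."""
    diffs = []
    for cue in cue_times:
        lags = response_times - cue
        candidates = lags[np.abs(lags) <= max_lag]
        if candidates.size:
            diffs.append(candidates[np.argmin(np.abs(candidates))])
    return np.array(diffs)

# Simulated metronome at 600 ms intervals and taps that slightly anticipate each cue.
rng = np.random.default_rng(4)
cues = np.arange(0, 30, 0.6)
taps = cues - 0.03 + 0.02 * rng.standard_normal(cues.size)

asyn = asynchronies(cues, taps)
print(f"mean asynchrony {1000 * asyn.mean():.1f} ms, SD {1000 * asyn.std():.1f} ms")
```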

  14. An ethics toolbox for neurotechnology.

    PubMed

    Farah, Martha J

    2015-04-08

    Advances in neurotechnology will raise new ethical dilemmas, to which scientists and the rest of society must respond. Here I present a "toolbox" of concepts to help us analyze these issues and communicate with each other about them across differences of ethical intuition. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Wastewater Collection System Toolbox | Eliminating Sanitary ...

    EPA Pesticide Factsheets

    2017-04-10

    Communities across the United States are working to find cost-effective, long-term approaches to managing their aging wastewater infrastructure and preventing the problems that lead to sanitary sewer overflows. The Toolbox is an effort by EPA New England to provide examples of programs and educational efforts from New England and beyond.

  16. 40 CFR 141.717 - Pre-filtration treatment toolbox components.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... surface water or GWUDI source. (c) Bank filtration. Systems receive Cryptosporidium treatment credit for... Systems using bank filtration when they begin source water monitoring under § 141.701(a) must...

  17. Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach

    ERIC Educational Resources Information Center

    Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…

  18. Integrating Building Information Modeling and Health and Safety for Onsite Construction

    PubMed Central

    Ganah, Abdulkadir; John, Godfaurd A.

    2014-01-01

    Background Health and safety (H&S) on a construction site can either make or break a contractor, if not properly managed. The usage of Building Information Modeling (BIM) for H&S on construction execution has the potential to augment practitioner understanding of their sites, and by so doing reduce the probability of accidents. This research explores BIM usage within the construction industry in relation to H&S communication. Methods In addition to an extensive literature review, a questionnaire survey was conducted to gather information on the embedment of H&S planning with the BIM environment for site practitioners. Results The analysis of responses indicated that BIM will enhance the current approach of H&S planning for construction site personnel. Conclusion From the survey, toolbox talk will have to be integrated with the BIM environment, because it is the predominantly used procedure for enhancing H&S issues within construction sites. The advantage is that personnel can visually understand H&S issues as work progresses during the toolbox talk onsite. PMID:25830069

  19. Integrating building information modeling and health and safety for onsite construction.

    PubMed

    Ganah, Abdulkadir; John, Godfaurd A

    2015-03-01

    Health and safety (H&S) on a construction site can either make or break a contractor, if not properly managed. The usage of Building Information Modeling (BIM) for H&S on construction execution has the potential to augment practitioner understanding of their sites, and by so doing reduce the probability of accidents. This research explores BIM usage within the construction industry in relation to H&S communication. In addition to an extensive literature review, a questionnaire survey was conducted to gather information on the embedment of H&S planning with the BIM environment for site practitioners. The analysis of responses indicated that BIM will enhance the current approach of H&S planning for construction site personnel. From the survey, toolbox talk will have to be integrated with the BIM environment, because it is the predominantly used procedure for enhancing H&S issues within construction sites. The advantage is that personnel can visually understand H&S issues as work progresses during the toolbox talk onsite.

  20. Cutting the Distance in Distance Education: Reflections on the Use of E-Technologies in a New Zealand Social Work Program

    ERIC Educational Resources Information Center

    Stanley-Clarke, Nicky; English, Awhina; Yeung, Polly

    2018-01-01

    The development of new e-technologies and an increased focus on developing distance social work education programs has created the impetus for social work educators to consider the tools they can employ in delivering distance courses. This article reflects on an action learning research project involving the development of an online toolbox of…

  1. Creating Qungasvik (a Yup'ik intervention "toolbox"): case examples from a community-developed and culturally-driven intervention.

    PubMed

    Rasmus, Stacy M; Charles, Billy; Mohatt, Gerald V

    2014-09-01

    This paper describes the development of a Yup'ik Alaska Native approach to suicide and alcohol abuse prevention that resulted in the creation of the Qungasvik, a toolbox promoting reasons for life and sobriety among youth. The Qungasvik is made up of thirty-six modules that function as cultural scripts for creating experiences in Yup'ik communities that build strengths and protection against suicide and alcohol abuse. The Qungasvik manual represents the results of a community-based participatory research intervention development process grounded in culture and local process, and nurtured through a syncretic blending of Indigenous and Western theories and practices. This paper will provide a description of the collaborative steps taken at the community level to develop the intervention modules. This process involved university researchers and community members coming together and drawing from multiple sources of data and knowledge to inform the development of prevention activities addressing youth suicide and alcohol abuse. We will present case examples describing the development of three keystone modules: Qasgiq (The Men's House), Yup'ik Kinship Terms, and Surviving Your Feelings. Each of these modules is representative of the process that the community co-researcher team took to develop and implement protective experiences that: (1) create supportive community, (2) strengthen families, and (3) give individuals tools to be healthy and strong.

  2. An adaptive toolbox approach to the route to expertise in sport.

    PubMed

    de Oliveira, Rita F; Lobinger, Babett H; Raab, Markus

    2014-01-01

    Expertise is characterized by fast decision-making which is highly adaptive to new situations. Here we propose that athletes use a toolbox of heuristics which they develop on their route to expertise. The development of heuristics occurs within the context of the athletes' natural abilities, past experiences, developed skills, and situational context, but does not pertain to any of these factors separately. This approach is novel in that it integrates the separate factors determining expertise into a comprehensive heuristic description. It is our contention that talent identification methods and talent development models should therefore be geared toward the assessment and development of specific heuristics. Specifically, in addition to identifying and developing separate natural abilities and skills as per usual, heuristics should be identified and developed. The application of heuristics to talent and expertise models can bring the field one step away from dichotomized models of nature and nurture toward a comprehensive approach to the route to expertise.

  3. Functional Genomics Assistant (FUGA): a toolbox for the analysis of complex biological networks

    PubMed Central

    2011-01-01

    Background: Cellular constituents such as proteins, DNA, and RNA form a complex web of interactions that regulate biochemical homeostasis and determine the dynamic cellular response to external stimuli. It follows that detailed understanding of these patterns is critical for the assessment of fundamental processes in cell biology and pathology. Representation and analysis of cellular constituents through network principles is a promising and popular analytical avenue towards a deeper understanding of molecular mechanisms in a system-wide context. Findings: We present Functional Genomics Assistant (FUGA) - an extensible and portable MATLAB toolbox for the inference of biological relationships, graph topology analysis, random network simulation, network clustering, and functional enrichment statistics. In contrast to conventional differential expression analysis of individual genes, FUGA offers a framework for the study of system-wide properties of biological networks and highlights putative molecular targets using concepts of systems biology. Conclusion: FUGA offers a simple and customizable framework for network analysis in a variety of systems biology applications. It is freely available for individual or academic use at http://code.google.com/p/fuga. PMID:22035155
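
    FUGA itself is a MATLAB toolbox and its API is not described in this record; purely as a hedged illustration of the kind of graph-topology analysis and random-network comparison the abstract mentions, a minimal Python/NetworkX sketch (all graphs and parameters below are invented for demonstration):

    ```python
    # Minimal sketch (not FUGA): summarise graph topology and compare it with
    # a simple random null model that preserves node and edge counts.
    import networkx as nx

    def topology_summary(G):
        """Return a few standard topology metrics for an undirected graph."""
        return {
            "nodes": G.number_of_nodes(),
            "edges": G.number_of_edges(),
            "mean_degree": sum(d for _, d in G.degree()) / G.number_of_nodes(),
            "clustering": nx.average_clustering(G),
        }

    if __name__ == "__main__":
        # Toy stand-in for a biological network
        G = nx.connected_watts_strogatz_graph(n=200, k=6, p=0.1, seed=1)
        print("observed:", topology_summary(G))

        # Null model with the same number of nodes and edges
        G_rand = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=1)
        print("random  :", topology_summary(G_rand))
    ```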

  4. An adaptive toolbox approach to the route to expertise in sport

    PubMed Central

    de Oliveira, Rita F.; Lobinger, Babett H.; Raab, Markus

    2014-01-01

    Expertise is characterized by fast decision-making which is highly adaptive to new situations. Here we propose that athletes use a toolbox of heuristics which they develop on their route to expertise. The development of heuristics occurs within the context of the athletes’ natural abilities, past experiences, developed skills, and situational context, but does not pertain to any of these factors separately. This approach is novel in that it integrates the separate factors determining expertise into a comprehensive heuristic description. It is our contention that talent identification methods and talent development models should therefore be geared toward the assessment and development of specific heuristics. Specifically, in addition to identifying and developing separate natural abilities and skills as per usual, heuristics should be identified and developed. The application of heuristics to talent and expertise models can bring the field one step away from dichotomized models of nature and nurture toward a comprehensive approach to the route to expertise. PMID:25071673

  5. MARTI: man-machine animation real-time interface

    NASA Astrophysics Data System (ADS)

    Jones, Christian M.; Dlay, Satnam S.

    1997-05-01

    The research introduces MARTI (man-machine animation real-time interface) for the realization of natural human-machine interfacing. The system uses simple vocal sound-tracks of human speakers to provide lip synchronization of computer graphical facial models. We present novel research in a number of engineering disciplines, which include speech recognition, facial modeling, and computer animation. This interdisciplinary research utilizes the latest hybrid connectionist/hidden Markov model speech recognition system to provide very accurate phone recognition and timing for speaker independent continuous speech, and expands on knowledge from the animation industry in the development of accurate facial models and automated animation. The research has many real-world applications which include the provision of a highly accurate and 'natural' man-machine interface to assist user interactions with computer systems and communication with one another using human idiosyncrasies; a complete special effects and animation toolbox providing automatic lip synchronization without the normal constraints of head-sets, joysticks, and skilled animators; compression of video data to well below standard telecommunication channel bandwidth for video communications and multi-media systems; assisting speech training and aids for the handicapped; and facilitating player interaction for 'video gaming' and 'virtual worlds.' MARTI has introduced a new level of realism to man-machine interfacing and special effect animation which has been previously unseen.

  6. Direct Duplex Detection: An Emerging Tool in the RNA Structure Analysis Toolbox.

    PubMed

    Weidmann, Chase A; Mustoe, Anthony M; Weeks, Kevin M

    2016-09-01

    While a variety of powerful tools exists for analyzing RNA structure, identifying long-range and intermolecular base-pairing interactions has remained challenging. Recently, three groups introduced a high-throughput strategy that uses psoralen-mediated crosslinking to directly identify RNA-RNA duplexes in cells. Initial application of these methods highlights the preponderance of long-range structures within and between RNA molecules and their widespread structural dynamics. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Identification of the origin of faecal contamination in estuarine oysters using Bacteroidales and F-specific RNA bacteriophage markers.

    PubMed

    Mieszkin, S; Caprais, M P; Le Mennec, C; Le Goff, M; Edge, T A; Gourmelon, M

    2013-09-01

    The aim of this study was to identify the origin of faecal pollution impacting the Elorn estuary (Brittany, France) by applying microbial source tracking (MST) markers in both oysters and estuarine waters. The MST markers used were as follows: (i) human-, ruminant- and pig-associated Bacteroidales markers by real-time PCR and (ii) human genogroup II and animal genogroup I of F-specific RNA bacteriophages (FRNAPH) by culture/genotyping and by direct real-time reverse-transcriptase PCR. The higher occurrence of the human genogroup II of F-specific RNA bacteriophages using a culture/genotyping method, and human-associated Bacteroidales marker by real-time PCR, allowed the identification of human faecal contamination as the predominant source of contamination in oysters (total of 18 oyster batches tested) and waters (total of 24 water samples tested). The importance of using the intravalvular liquids instead of digestive tissues, when applying host-associated Bacteroidales markers in oysters, was also revealed. This study has shown that the application of a MST toolbox of diverse bacterial and viral methods can provide multiple lines of evidence to identify the predominant source of faecal contamination in shellfish from an estuarine environment. Application of this MST toolbox is a useful approach to understand the origin of faecal contamination in shellfish harvesting areas in an estuarine setting. © 2013 The Society for Applied Microbiology.

  8. Impact-oriented steering--the concept of NGO-IDEAs 'impact toolbox'.

    PubMed

    2008-03-01

    The NGO-IDEAs 'Impact Toolbox' has been developed with a group of NGOs, all of which are active in the area of savings and credit in South India. This compilation of methods to apply in impact-oriented steering was devised by the executive staff of the Indian partner NGOs, also known as the Resource Persons, in 2006 and tested from late 2006 to early 2007. At first glance, the approach may appear to be highly specialised and difficult to transfer. However, in fact it follows principles that can be adapted for several NGOs in other countries and in other sectors. The following article presents the concept of the NGO-IDEAs 'Impact Toolbox'.

  9. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing.

    PubMed

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments, each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user-friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible, as it comes with several signal generators and can be easily extended for any experiment.
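
    To illustrate the Staircase family of procedures named in the abstract, here is a hedged Python sketch of a transformed 2-down/1-up staircase (converging towards roughly 70.7% correct). The simulated listener, step size and stopping rule are invented for demonstration; this is not the PSYCHOACOUSTICS MATLAB API.

    ```python
    # Illustrative 2-down/1-up transformed staircase with a simulated listener.
    import random

    def simulated_listener(level_db, threshold_db=30.0, slope=0.5):
        """Probability of a correct response increases with the signal level."""
        p = 1.0 / (1.0 + 10 ** (-slope * (level_db - threshold_db)))
        return random.random() < p

    def staircase_2down_1up(start_db=60.0, step_db=4.0, n_reversals=10):
        level, correct_in_row = start_db, 0
        last_step, reversal_levels = None, []
        while len(reversal_levels) < n_reversals:
            if simulated_listener(level):
                correct_in_row += 1
                if correct_in_row < 2:
                    continue                      # need two correct to step down
                step, correct_in_row = -1, 0      # two correct -> harder
            else:
                step, correct_in_row = +1, 0      # one wrong -> easier
            if last_step is not None and step != last_step:
                reversal_levels.append(level)     # direction changed: a reversal
            last_step = step
            level += step * step_db
        # Threshold estimate: mean level at the last reversals
        return sum(reversal_levels[-6:]) / len(reversal_levels[-6:])

    if __name__ == "__main__":
        random.seed(0)
        print("estimated threshold ~ %.1f dB" % staircase_2down_1up())
    ```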

  10. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing

    PubMed Central

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments, each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user-friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible, as it comes with several signal generators and can be easily extended for any experiment. PMID:25101013

  11. Modern CACSD using the Robust-Control Toolbox

    NASA Technical Reports Server (NTRS)

    Chiang, Richard Y.; Safonov, Michael G.

    1989-01-01

    The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools like singular values and structured singular values, robust synthesis tools like continuous/discrete H2/H-infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods, and a variety of robust model reduction tools such as Hankel approximation, balanced truncation and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H-infinity loop-shaping and large space structure model reduction.
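
    As a hedged illustration of one of the model-reduction techniques named above (balanced truncation), here is a textbook square-root balanced-truncation sketch in Python/SciPy. It is not the toolbox's M-file code, and the example state-space system is arbitrary.

    ```python
    # Square-root balanced truncation for a stable LTI system (illustrative).
    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

    def balanced_truncation(A, B, C, r):
        # Gramians: A Wc + Wc A' + B B' = 0 and A' Wo + Wo A + C' C = 0
        Wc = solve_continuous_lyapunov(A, -B @ B.T)
        Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
        Lc = cholesky(Wc, lower=True)
        Lo = cholesky(Wo, lower=True)
        U, s, Vt = svd(Lo.T @ Lc)                 # s = Hankel singular values
        S = np.diag(1.0 / np.sqrt(s[:r]))
        T = Lc @ Vt[:r].T @ S                     # truncated balancing transform
        Ti = S @ U[:, :r].T @ Lo.T
        return Ti @ A @ T, Ti @ B, C @ T, s

    if __name__ == "__main__":
        # Arbitrary stable 4-state example reduced to 2 states
        A = np.diag([-1.0, -2.0, -3.0, -4.0])
        B = np.ones((4, 1))
        C = np.array([[1.0, 0.5, 0.25, 0.125]])
        Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
        print("Hankel singular values:", np.round(hsv, 4))
    ```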

  12. Policy Analysis for Sustainable Development: The Toolbox for the Environmental Social Scientist

    ERIC Educational Resources Information Center

    Runhaar, Hens; Dieperink, Carel; Driessen, Peter

    2006-01-01

    Purpose: The paper seeks to propose the basic competencies of environmental social scientists regarding policy analysis for sustainable development. The ultimate goal is to contribute to an improvement of educational programmes in higher education by suggesting a toolbox that should be integrated in the curriculum. Design/methodology/approach:…

  13. Rural ITS toolbox and deployment plan for Regions 2, 6, 7 and 9 : ITS toolbox for rural and small urban areas

    DOT National Transportation Integrated Search

    1998-12-01

    As a part of the Small Urban and Rural ITS Study it conducted in 4 of its more rural regions, the New York State Department of Transportation has developed a compendium of systems, devices and strategies that can enhance safety, provide information, ...

  14. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS. MODULE 4: ANALYTICAL GUIDE. INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  15. The Psychometric Toolbox: An Excel Package for Use in Measurement and Psychometrics Courses

    ERIC Educational Resources Information Center

    Ferrando, Pere J.; Masip-Cabrera, Antoni; Navarro-González, David; Lorenzo-Seva, Urbano

    2017-01-01

    The Psychometric Toolbox (PT) is a user-friendly, non-commercial package mainly intended to be used for instructional purposes in introductory courses of educational and psychological measurement, psychometrics and statistics. The PT package is organized in six separate modules or sub-programs: Data preprocessor (descriptive analyses and data…

  16. Toolbox or Adjustable Spanner? A Critical Comparison of Two Metaphors for Adaptive Decision Making

    ERIC Educational Resources Information Center

    Söllner, Anke; Bröder, Arndt

    2016-01-01

    For multiattribute decision tasks, different metaphors exist that describe the process of decision making and its adaptation to diverse problems and situations. Multiple strategy models (MSMs) assume that decision makers choose adaptively from a set of different strategies (toolbox metaphor), whereas evidence accumulation models (EAMs) hold that a…

  17. FALCON: a toolbox for the fast contextualization of logical networks

    PubMed Central

    De Landtsheer, Sébastien; Trairatphisan, Panuwat; Lucarelli, Philippe; Sauter, Thomas

    2017-01-01

    Motivation: Mathematical modelling of regulatory networks allows for the discovery of knowledge at the system level. However, existing modelling tools are often computation-heavy and do not offer intuitive ways to explore the model, to test hypotheses or to interpret the results biologically. Results: We have developed a computational approach to contextualize logical models of regulatory networks with biological measurements based on a probabilistic description of rule-based interactions between the different molecules. Here, we propose a Matlab toolbox, FALCON, to automatically and efficiently build and contextualize networks, which includes a pipeline for conducting parameter analysis, knockouts and easy and fast model investigation. The contextualized models could then provide qualitative and quantitative information about the network and suggest hypotheses about biological processes. Availability and implementation: FALCON is freely available for non-commercial users on GitHub under the GPLv3 licence. The toolbox, installation instructions, full documentation and test datasets are available at https://github.com/sysbiolux/FALCON. FALCON runs under Matlab (MathWorks) and requires the Optimization Toolbox. Contact: thomas.sauter@uni.lu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28673016

  18. FALCON: a toolbox for the fast contextualization of logical networks.

    PubMed

    De Landtsheer, Sébastien; Trairatphisan, Panuwat; Lucarelli, Philippe; Sauter, Thomas

    2017-11-01

    Mathematical modelling of regulatory networks allows for the discovery of knowledge at the system level. However, existing modelling tools are often computation-heavy and do not offer intuitive ways to explore the model, to test hypotheses or to interpret the results biologically. We have developed a computational approach to contextualize logical models of regulatory networks with biological measurements based on a probabilistic description of rule-based interactions between the different molecules. Here, we propose a Matlab toolbox, FALCON, to automatically and efficiently build and contextualize networks, which includes a pipeline for conducting parameter analysis, knockouts and easy and fast model investigation. The contextualized models could then provide qualitative and quantitative information about the network and suggest hypotheses about biological processes. FALCON is freely available for non-commercial users on GitHub under the GPLv3 licence. The toolbox, installation instructions, full documentation and test datasets are available at https://github.com/sysbiolux/FALCON. FALCON runs under Matlab (MathWorks) and requires the Optimization Toolbox. thomas.sauter@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  19. TopoToolbox: Using Sensor Topography to Calculate Psychologically Meaningful Measures from Event-Related EEG/MEG

    PubMed Central

    Tian, Xing; Poeppel, David; Huber, David E.

    2011-01-01

    The open-source toolbox “TopoToolbox” is a suite of functions that use sensor topography to calculate psychologically meaningful measures (similarity, magnitude, and timing) from multisensor event-related EEG and MEG data. Using a GUI and data visualization, TopoToolbox can be used to calculate and test the topographic similarity between different conditions (Tian and Huber, 2008). This topographic similarity indicates whether different conditions involve a different distribution of underlying neural sources. Furthermore, this similarity calculation can be applied at different time points to discover when a response pattern emerges (Tian and Poeppel, 2010). Because the topographic patterns are obtained separately for each individual, these patterns are used to produce reliable measures of response magnitude that can be compared across individuals using conventional statistics (Davelaar et al. Submitted and Huber et al., 2008). TopoToolbox can be freely downloaded. It runs under MATLAB (The MathWorks, Inc.) and supports user-defined data structure as well as standard EEG/MEG data import using EEGLAB (Delorme and Makeig, 2004). PMID:21577268
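
    As a hedged illustration of the idea of topographic similarity described above, the sketch below compares two sensor-space patterns by the cosine of the angle between their mean-removed channel vectors. The exact measure and the statistics implemented in TopoToolbox may differ; the data here are synthetic.

    ```python
    # Illustrative topographic-similarity measure between two sensor patterns.
    import numpy as np

    def topo_similarity(pattern_a, pattern_b):
        """pattern_a, pattern_b: 1-D arrays with one value per channel."""
        a = pattern_a - pattern_a.mean()
        b = pattern_b - pattern_b.mean()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        base = rng.standard_normal(64)                              # 64-channel topography
        same_source = 2.5 * base + 0.1 * rng.standard_normal(64)    # scaled copy + noise
        other_source = rng.standard_normal(64)
        print("same sources :", round(topo_similarity(base, same_source), 3))
        print("diff sources :", round(topo_similarity(base, other_source), 3))
    ```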

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinuesa, Ricardo; Fick, Lambert; Negi, Prabal

    In the present document we describe a toolbox for the spectral-element code Nek5000, aimed at computing turbulence statistics. The toolbox is presented for a small test case, namely a square duct with Lx = 2h, Ly = 2h and Lz = 4h, where x, y and z are the horizontal, vertical and streamwise directions, respectively. The number of elements in the xy-plane is 16 x 16 = 256, and the number of elements in z is 4, leading to a total of 1,024 spectral elements. A polynomial order of N = 5 is chosen, and the mesh is generated using the Nek5000 tool genbox. The toolbox presented here allows the computation of mean-velocity components, the Reynolds-stress tensor, as well as turbulent kinetic energy (TKE) and Reynolds-stress budgets. Note that the present toolbox can compute turbulence statistics in turbulent flows with one homogeneous direction (where the statistics are based on time-averaging as well as averaging in the homogeneous direction), as well as in fully three-dimensional flows (with no periodic directions, where only time-averaging is considered).
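
    A hedged numpy sketch of the averaging idea only (not the Nek5000 toolbox): for a flow with one homogeneous direction, mean velocities, Reynolds stresses and TKE are obtained by averaging over time and over that direction. The array layout assumed below is an assumption for illustration.

    ```python
    # Averaging over time and the homogeneous (streamwise) direction.
    # Assumed layout: u[time, z, y, x, component].
    import numpy as np

    def turbulence_statistics(u):
        """Return mean velocity, Reynolds-stress tensor and TKE on the xy-plane."""
        mean = u.mean(axis=(0, 1))                          # <u_i>(y, x)
        fluct = u - u.mean(axis=(0, 1), keepdims=True)      # u_i'
        # <u_i' u_j'> averaged over time and the homogeneous direction
        stresses = np.einsum('tzyxi,tzyxj->yxij', fluct, fluct) / (u.shape[0] * u.shape[1])
        tke = 0.5 * np.trace(stresses, axis1=-2, axis2=-1)
        return mean, stresses, tke

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        u = rng.standard_normal((50, 8, 16, 16, 3))   # 50 snapshots, 3 velocity components
        mean, R, k = turbulence_statistics(u)
        print(mean.shape, R.shape, k.shape)           # (16, 16, 3) (16, 16, 3, 3) (16, 16)
    ```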

  1. Optimizing detection and analysis of slow waves in sleep EEG.

    PubMed

    Mensen, Armand; Riedner, Brady; Tononi, Giulio

    2016-12-01

    Analysis of individual slow waves in EEG recordings during sleep provides both greater sensitivity and specificity compared to spectral power measures. However, parameters for detection and analysis have not been widely explored and validated. We present a new, open-source, Matlab-based toolbox for the automatic detection and analysis of slow waves, with adjustable parameter settings, as well as manual correction and exploration of the results using a multi-faceted visualization tool. We explore a large search space of parameter settings for slow wave detection and measure their effects on a selection of outcome parameters. Every choice of parameter setting had some effect on at least one outcome parameter. In general, the largest effect sizes were found when choosing the EEG reference, type of canonical waveform, and amplitude thresholding. Previously published methods accurately detect large, global waves but are conservative and miss the detection of smaller amplitude, local slow waves. The toolbox has additional benefits in terms of speed, user-interface, and visualization options to compare and contrast slow waves. The exploration of parameter settings in the toolbox highlights the importance of careful selection of detection methods. The sensitivity and specificity of the automated detection can be improved by manually adding or deleting entire waves and/or specific channels using the toolbox visualization functions. The toolbox standardizes the detection procedure, sets the stage for reliable results and comparisons and is easy to use without previous programming experience. Copyright © 2016 Elsevier B.V. All rights reserved.
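
    A hedged sketch of a basic slow-wave detector of the kind the abstract describes: band-pass the signal to the slow-wave range, find negative half-waves between zero-crossings, and keep those meeting duration and amplitude criteria. The filter settings and thresholds below are illustrative, not the toolbox's defaults.

    ```python
    # Simplified slow-wave detection on one EEG channel (illustrative only).
    import numpy as np
    from scipy.signal import butter, filtfilt

    def detect_slow_waves(eeg_uv, fs, amp_uv=-40.0, min_s=0.25, max_s=1.0):
        b, a = butter(2, [0.5, 4.0], btype="bandpass", fs=fs)
        x = filtfilt(b, a, eeg_uv)
        down = np.flatnonzero((x[:-1] >= 0) & (x[1:] < 0))   # downward zero-crossings
        up = np.flatnonzero((x[:-1] < 0) & (x[1:] >= 0))     # upward zero-crossings
        waves = []
        for d in down:
            later = up[up > d]
            if later.size == 0:
                break
            u = later[0]
            dur = (u - d) / fs
            trough = x[d:u].min()
            if min_s <= dur <= max_s and trough <= amp_uv:
                waves.append({"start_s": d / fs, "dur_s": dur, "trough_uv": trough})
        return waves

    if __name__ == "__main__":
        fs = 200
        t = np.arange(0, 30, 1 / fs)
        sim = 60 * np.sin(2 * np.pi * 0.8 * t) + 10 * np.random.randn(t.size)
        print(len(detect_slow_waves(sim, fs)), "candidate slow waves")
    ```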

  2. Online model evaluation of large-eddy simulations covering Germany with a horizontal resolution of 156 m

    NASA Astrophysics Data System (ADS)

    Hansen, Akio; Ament, Felix; Lammert, Andrea

    2017-04-01

    Large-eddy simulations have been performed for several decades, but due to computational limits most studies were restricted to small domains or idealised initial/boundary conditions. Within the High Definition Clouds and Precipitation for Advancing Climate Prediction (HD(CP)2) project, realistic, weather-forecast-like LES simulations were performed with the newly developed ICON LES model for several days. The domain covers central Europe with a horizontal resolution down to 156 m. The setup consists of more than 3 billion grid cells, so that a single 3D dump requires roughly 500 GB. A newly developed online evaluation toolbox was created to check instantaneously for realistic model simulations. The toolbox automatically combines model results with observations and generates several quicklooks for various variables. So far temperature/humidity profiles, cloud cover, integrated water vapour, precipitation and many more are included. All kinds of observations, such as aircraft observations, soundings or precipitation radar networks, are used. For each dataset a specific module is created, which allows for easy handling and extension of the toolbox. Most of the observations are automatically downloaded from the Standardized Atmospheric Measurement Database (SAMD). The evaluation tool should support scientists in monitoring computationally costly model simulations as well as give a first overview of model performance. The structure of the toolbox as well as the SAMD database are presented. Furthermore, the toolbox was applied to an ICON LES sensitivity study, and example results are shown.

  3. U.S. Geological Survey groundwater toolbox, a graphical and mapping interface for analysis of hydrologic data (version 1.0): user guide for estimation of base flow, runoff, and groundwater recharge from streamflow data

    USGS Publications Warehouse

    Barlow, Paul M.; Cunningham, William L.; Zhai, Tong; Gray, Mark

    2015-01-01

    This report is a user guide for the streamflow-hydrograph analysis methods provided with version 1.0 of the U.S. Geological Survey (USGS) Groundwater Toolbox computer program. These include six hydrograph-separation methods to determine the groundwater-discharge (base-flow) and surface-runoff components of streamflow—the Base-Flow Index (BFI; Standard and Modified), HYSEP (Fixed Interval, Sliding Interval, and Local Minimum), and PART methods—and the RORA recession-curve displacement method and associated RECESS program to estimate groundwater recharge from streamflow data. The Groundwater Toolbox is a customized interface built on the nonproprietary, open-source MapWindow geographic information system software. The program provides graphing, mapping, and analysis capabilities in a Microsoft Windows computing environment. In addition to these hydrograph-analysis methods, the Groundwater Toolbox allows for the retrieval of hydrologic time-series data (streamflow, groundwater levels, and precipitation) from the USGS National Water Information System, downloading of a suite of preprocessed geographic information system coverages and meteorological data from the National Oceanic and Atmospheric Administration National Climatic Data Center, and analysis of data with several preprocessing and postprocessing utilities. With its data retrieval and analysis tools, the Groundwater Toolbox provides methods to estimate many of the components of the water budget for a hydrologic basin, including precipitation; streamflow; base flow; runoff; groundwater recharge; and total, groundwater, and near-surface evapotranspiration.
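
    As a hedged illustration of the hydrograph-separation idea (in the spirit of the Local Minimum approach listed above, but emphatically not the USGS HYSEP/BFI/PART code), here is a simplified base-flow separation sketch: pick local minima of daily streamflow within a moving window, connect them by linear interpolation, and cap base flow at total streamflow. The window length and synthetic data are assumptions.

    ```python
    # Simplified local-minimum style base-flow separation (illustrative only).
    import numpy as np

    def baseflow_local_minimum(q, window_days=11):
        q = np.asarray(q, dtype=float)
        half = window_days // 2
        idx = [i for i in range(len(q))
               if q[i] == q[max(0, i - half):i + half + 1].min()]
        baseflow = np.interp(np.arange(len(q)), idx, q[idx])
        return np.minimum(baseflow, q)          # base flow cannot exceed streamflow

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        days = np.arange(365)
        q = 10 + 5 * np.sin(2 * np.pi * days / 365) + rng.exponential(2, 365)
        bf = baseflow_local_minimum(q)
        print("base-flow index ~ %.2f" % (bf.sum() / q.sum()))
    ```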

  4. Tracking the Sources of Fecal Contaminations: an Interdisciplinary Toolbox

    NASA Astrophysics Data System (ADS)

    Jeanneau, L.; Jarde, E.; Derrien, M.; Gruau, G.; Solecki, O.; Pourcher, A.; Marti, R.; Wéry, N.; Caprais, M.; Gourmelon, M.; Mieszkin, S.; Jadas-Hécart, A.; Communal, P.

    2011-12-01

    Fecal contaminations of inland and coastal waters pose risks to human health and cause economic losses. In order to improve water management, it is necessary to identify the sources of contamination, which implies the development of specific markers. To be considered a valuable host-specific marker, a marker must (1) be source-specific, (2) occur in high concentration in polluting matrices, (3) exhibit extra-intestinal persistence similar to fecal indicator bacteria (FIB) and (4) not grow out of the host. However, to date no single marker has fulfilled all of those criteria. Thus, it has been suggested to use a combination of markers in order to generate more reliable data. This has led to the development of a Microbial Source Tracking (MST) toolbox including FIB as well as specific microbial and chemical markers in order to differentiate between human, bovine and porcine fecal contaminations. Those specific markers are: (1) genotypes of F-specific RNA bacteriophages, (2) bacterial markers belonging to the Bacteroidales (human-specific HF183, ruminant-specific Rum-2-Bac and pig-specific Pig-2-Bac markers), to the Bifidobacterium (Bifidobacterium adolescentis) and pig-specific Lactobacillus amylovorus, (3) fecal stanols and (4) caffeine. The development of this MST toolbox was composed of four steps, from the molecular scale to the watershed scale. At the molecular scale, the specificity and the concentration of those markers were studied in cattle and pig manures and in waste water treatment plant (WWTP) effluents and influents. At the microcosm scale, the transfer of bovine and porcine specific markers was investigated by rainfall simulations on agricultural plots amended with cattle or pig manure. Moreover, the relative persistence of FIB and human, porcine and bovine specific markers was investigated in freshwater and seawater microcosms inoculated with a WWTP influent, pig manure and cow manure. Finally, the aforementioned MST toolbox has been validated at the catchment scale by analysing three rivers impacted by fecal contaminations. The development and the application of this MST toolbox have highlighted (1) the specificity of the aforementioned markers, (2) their conservative transfer from soils to rivers and (3) their differing persistence in seawater and in freshwater. Those results provide useful data for identifying and managing fecal contamination of surface waters. In the case of single-source contaminations, the markers provide coherent information: (1) the bovine or porcine markers were not detected in a river impacted by a WWTP effluent; (2) the occurrence of Rum-2-Bac and the distribution of stanols indicated a bovine contamination in a river flowing through cattle pasture. In the case of multiple-source contaminations, the combination of markers is necessary to identify the main sources, and the statistical treatment of the distribution of stanols could provide an approximation of their proportion.

  5. Pharmacogenetic and optical dissection for mechanistic understanding of Parkinson's disease: potential utilities revealed through behavioural assessment.

    PubMed

    Sharma, Puneet; Pienaar, Ilse S

    2014-11-01

    The technology toolbox by which neural elements can be selectively manipulated in vertebrate and invertebrate brains has expanded greatly in recent years, to now include sophisticated optogenetics and novel designer receptors. Application of such tools allows researchers to ascertain whether a particular behavioural phenotype associates with interrogation of a specific neural circuit. Optogenetics has already found application in the study of Parkinson's disease (PD) circuitry and therapies, whereas novel designer receptors hold promise for deepening current understanding of the mechanisms underlying parkinsonian motor and non-motor symptoms. In particular, this new generation of research tools provides a method by which significant insights can be gained into brain networks implicated in brain diseases such as PD. These tools also promise to assist in the development of novel therapies for targeting degenerated dopaminergic and non-dopaminergic neurons in the diseased basal ganglia system of PD patients, providing symptomatic relief or even reversing neurodegenerative processes. The present review discusses how such technologies, in conjunction with application of sensitive behavioural assays, continue to significantly advance our knowledge of circuit and signalling properties inherent to PD pathology. The discussion also highlights how such experimental approaches provide additional explorative avenues which may result in dramatically improved therapeutic options for PD patients. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Properties and applications of undecylprodigiosin and other bacterial prodigiosins.

    PubMed

    Stankovic, Nada; Senerovic, Lidija; Ilic-Tomic, Tatjana; Vasiljevic, Branka; Nikodinovic-Runic, Jasmina

    2014-05-01

    The growing demand to fulfill the needs of present-day medicine in terms of novel effective molecules has led to the reexamination of some old and well-known bacterial secondary metabolites. Bacterial prodigiosins (prodiginines) have a long history of being remarkable multipurpose compounds, best examined for their anticancer and antimalarial activities. Production of prodigiosin in the most common producer strain Serratia marcescens has been described in great detail. However, few reports have discussed the ecophysiological roles of these molecules in the producing strains, as well as their antibiotic and UV-protective properties. This review describes recent advances in the production process, biosynthesis, properties, and applications of bacterial prodigiosins. Special emphasis is put on undecylprodigiosin, which has generally been a less studied member of the prodigiosin family. In addition, it has been suggested that proteins involved in undecylprodigiosin synthesis, RedG and RedH, could be a useful addition to the biocatalytic toolbox, being able to mediate regio- and stereoselective oxidative cyclization. Judging by the number of recent references (216 for the 2007-2013 period), it has become clear that undecylprodigiosin and other bacterial prodigiosins still hold surprises in terms of valuable properties and application potential in medical and other industrial fields, and that they still deserve continuing research attention.

  7. A computationally efficient software application for calculating vibration from underground railways

    NASA Astrophysics Data System (ADS)

    Hussein, M. F. M.; Hunt, H. E. M.

    2009-08-01

    The PiP model is a software application with a user-friendly interface for calculating vibration from underground railways. This paper reports on the software, with a focus on its latest version and the plans for future developments. The software calculates the Power Spectral Density of vibration due to a moving train on floating-slab track with track irregularity described by typical values of spectra for tracks with good, average and bad conditions. The latest version accounts for a tunnel embedded in a half-space by employing a toolbox developed at K.U. Leuven, which calculates Green's functions for a multi-layered half-space.

  8. Current and future prospects for CRISPR-based tools in bacteria

    PubMed Central

    Luo, Michelle L.; Leenay, Ryan T.; Beisel, Chase L.

    2015-01-01

    CRISPR-Cas systems have rapidly transitioned from intriguing prokaryotic defense systems to powerful and versatile biomolecular tools. This article reviews how these systems have been translated into technologies to manipulate bacterial genetics, physiology, and communities. Recent applications in bacteria have centered on multiplexed genome editing, programmable gene regulation, and sequence-specific antimicrobials, while future applications can build on advances in eukaryotes, the rich natural diversity of CRISPR-Cas systems, and the untapped potential of CRISPR-based DNA acquisition. Overall, these systems have formed the basis of an ever-expanding genetic toolbox and hold tremendous potential for our future understanding and engineering of the bacterial world. PMID:26460902

  9. Fluorogenic Behaviour of the Hetero-Diels-Alder Ligation of 5-Alkoxyoxazoles with Maleimides and their Applications.

    PubMed

    Renault, Kévin; Jouanno, Laurie-Anne; Lizzul-Jurse, Antoine; Renard, Pierre-Yves; Sabot, Cyrille

    2016-12-19

    Fluorogenic reactions are largely underrepresented in the toolbox of chemoselective ligations despite their tremendous potential, particularly in chemical biology and biochemistry. In this respect, we have investigated in full detail the fluorescence behaviour of the azaphthalamide, a scaffold which is generated through a hetero-Diels-Alder reaction of 5-alkoxyoxazole and maleimide derivatives under mild conditions that are compatible with, among others, peptide chemistry. The scope and limitations of such a fluorogenic labelling strategy were examined through four distinct applications, which target enzymatic activities or bioorthogonal reactions. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. On-demand drawing of high aspect-ratio, microsphere-tipped elastomeric micropillars

    NASA Astrophysics Data System (ADS)

    Li, Qiang; Kim, Jaeyoun

    2017-08-01

    High aspect-ratio elastomeric micropillars are widely used in a plethora of applications, such as functional surfaces, actuators, and sensors. Their fabrication at arbitrary positions on non-planar substrates, however, has rarely been reported. Here we demonstrate a new technique for facile fabrication of high aspect-ratio, microsphere-tipped elastomeric micropillars on structures with uncommon geometries. As a proof-of-concept application, a fiber optic contact sensor is realized by integrating a micropillar onto the end facet of an optical fiber. Overall, both the fabrication technique and the resulting outcomes of this work will add new tools to the toolbox of soft-MEMS and soft robotics.

  11. Versatile Cas9-Driven Subpopulation Selection Toolbox for Lactococcus lactis.

    PubMed

    van der Els, Simon; James, Jennelle K; Kleerebezem, Michiel; Bron, Peter A

    2018-04-15

    CRISPR-Cas9 technology has been exploited for the removal or replacement of genetic elements in a wide range of prokaryotes and eukaryotes. Here, we describe the extension of the Cas9 application toolbox to the industrially important dairy species Lactococcus lactis. The Cas9 expression vector pLABTarget, encoding the Streptococcus pyogenes Cas9 under the control of a constitutive promoter, was constructed, allowing plug-and-play introduction of short guide RNA (sgRNA) sequences to target specific genetic loci. Introduction of a pepN-targeting derivative of pLABTarget into L. lactis strain MG1363 led to a strong reduction in the number of transformants obtained, which did not occur in a pepN deletion derivative of the same strain, demonstrating the specificity and lethality of the Cas9-mediated double-strand breaks in the lactococcal chromosome. Moreover, the same pLABTarget derivative allowed the selection of a pepN deletion subpopulation from its corresponding single-crossover plasmid integrant precursor, accelerating the construction and selection of gene-specific deletion derivatives in L. lactis. Finally, pLABTarget, which contained sgRNAs designed to target mobile genetic elements, allowed the effective curing of plasmids, prophages, and integrative conjugative elements (ICEs). These results establish that pLABTarget enables the effective exploitation of Cas9 targeting in L. lactis, while the broad-host-range vector used suggests that this toolbox could readily be expanded to other Gram-positive bacteria. IMPORTANCE: Mobile genetic elements in Lactococcus lactis and other lactic acid bacteria (LAB) play an important role in dairy fermentation, having both positive and detrimental effects during the production of fermented dairy products. The pLABTarget vector offers an efficient cloning platform for Cas9 application in lactic acid bacteria. Targeting Cas9 toward mobile genetic elements enabled their effective curing, which is of particular interest in the context of potentially problematic prophages present in a strain. Moreover, Cas9 targeting of other mobile genetic elements enables the deciphering of their contribution to dairy fermentation processes and further establishment of their importance for product characteristics. Copyright © 2018 American Society for Microbiology.

  12. Evidence for a Common Toolbox Based on Necrotrophy in a Fungal Lineage Spanning Necrotrophs, Biotrophs, Endophytes, Host Generalists and Specialists

    PubMed Central

    Andrew, Marion; Barua, Reeta; Short, Steven M.; Kohn, Linda M.

    2012-01-01

    The Sclerotiniaceae (Ascomycotina, Leotiomycetes) is a relatively recently evolved lineage of necrotrophic host generalists, and necrotrophic or biotrophic host specialists, some latent or symptomless. We hypothesized that they inherited a basic toolbox of genes for plant symbiosis from their common ancestor. Maintenance and evolutionary diversification of symbiosis could require selection on toolbox genes or on timing and magnitude of gene expression. The genes studied were chosen because their products have been previously investigated as pathogenicity factors in the Sclerotiniaceae. They encode proteins associated with cell wall degradation: acid protease 1 (acp1), aspartyl protease (asps), and polygalacturonases (pg1, pg3, pg5, pg6), and the oxalic acid (OA) pathway: a zinc finger transcription factor (pac1), and oxaloacetate acetylhydrolase (oah), catalyst in OA production, essential for full symptom production in Sclerotinia sclerotiorum. Site-specific likelihood analyses provided evidence for purifying selection in all 8 pathogenicity-related genes. Consistent with an evolutionary arms race model, positive selection was detected in 5 of 8 genes. Only generalists produced large, proliferating disease lesions on excised Arabidopsis thaliana leaves and oxalic acid by 72 hours in vitro. In planta expression of oah was 10–300 times greater among the necrotrophic host generalists than necrotrophic and biotrophic host specialists; pac1 was not differentially expressed. Ability to amplify 6/8 pathogenicity related genes and produce oxalic acid in all genera are consistent with the common toolbox hypothesis for this gene sample. That our data did not distinguish biotrophs from necrotrophs is consistent with 1) a common toolbox based on necrotrophy and 2) the most conservative interpretation of the 3-locus housekeeping gene phylogeny – a baseline of necrotrophy from which forms of biotrophy emerged at least twice. Early oah overexpression likely expands the host range of necrotrophic generalists in the Sclerotiniaceae, while specialists and biotrophs deploy oah, or other as-yet-unknown toolbox genes, differently. PMID:22253834

  13. A toolbox for microbore liquid chromatography tandem-high-resolution mass spectrometry analysis of albumin-adducts as novel biomarkers of organophosphorus pesticide poisoning.

    PubMed

    von der Wellen, Jens; Winterhalter, Peter; Siegert, Markus; Eyer, Florian; Thiermann, Horst; John, Harald

    2018-08-01

    Exposure to toxic organophosphorus pesticides (OPP) represents a serious problem in the public healthcare sector and could be caused deliberately in terrorist attacks. Therefore, reliable verification procedures for OPP-intoxications are required for forensic, toxicological and clinical reasons. We developed and optimized a toolbox of methods to detect adducts of human serum albumin (HSA) with OPP considered as long-term biomarkers. Human serum was incubated with diethyl-oxono and diethyl-thiono pesticides for adduct formation used as reference. Afterwards, serum was subjected to proteolysis using three proteases separately, thus yielding phosphorylated tyrosine residues (Y*) detected as a single amino acid (pronase), as the hexadecapeptide LVRY*411TKKVPQVSTPTL (pepsin) and as the tripeptide Y*411TK (trypsin), respectively. Adducts were analyzed via microbore liquid chromatography coupled to electrospray ionization (μLC-ESI) and tandem-high-resolution mass spectrometry (MS/HR MS). Using paraoxon-ethyl as model OPP for adduct formation, methods were optimized with respect to MS/HR MS-parameters, protease concentrations and incubation time for proteolysis. HSA-adducts were found to be stable in serum in vitro at +37 °C and -30 °C for at least 27 days, and the resulting biomarkers were stable in the autosampler at 15 °C for at least 24 h. Limits of identification of adducts varied between 0.25 μM and 4.0 μM with respect to the corresponding pesticide concentrations in serum. Applicability of the methods was proven by successful detection of the adducts in samples of OPP-poisoned patients, thus demonstrating the methods as a reliable toolbox for forensic and toxicological analysis. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Optical performance assessment under environmental and mechanical perturbations in large, deployable telescopes

    NASA Astrophysics Data System (ADS)

    Folley, Christopher; Bronowicki, Allen

    2005-09-01

    Prediction of optical performance for large, deployable telescopes under environmental conditions and mechanical disturbances is a crucial part of the design verification process of such instruments for all phases of design and operation: ground testing, commissioning, and on-orbit operation. A Structural-Thermal-Optical-Performance (STOP) analysis methodology is often created that integrates the output of one analysis with the input of another. The integration of thermal environment predictions with structural models is relatively well understood, while the integration of structural deformation results into optical analysis/design software is less straightforward. A Matlab toolbox has been created that effectively integrates the predictions of mechanical deformations on optical elements generated by, for example, finite element analysis, and computes optical path differences for the distorted prescription. The engine of the toolbox is a real ray-tracing algorithm that allows the optical surfaces to be defined in a single, global coordinate system, thereby allowing automatic alignment of the mechanical coordinate system with the optical coordinate system. Therefore, the physical location of the optical surfaces is identical in the optical prescription and the finite element model. The application of rigid-body displacements to optical surfaces is, however, useful beyond STOP analysis, for example in the analysis of misalignments during the commissioning process. Furthermore, all the functionality of Matlab is available for optimization and control. Since this is a new tool for use on flight programs, it has been verified against CODE V. The toolbox's functionality, to date, is described, verification results are presented, and, as an example of its utility, results of a thermal distortion analysis are presented using the James Webb Space Telescope (JWST) prescription.
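
    A hedged sketch of the kind of operation described above: applying a rigid-body displacement (rotation about a pivot plus a translation) to optical-surface node coordinates held in a single global frame. The function names, pivot and example values are hypothetical; this is not the flight toolbox's code.

    ```python
    # Rigid-body displacement of optical-surface nodes in a global frame.
    import numpy as np
    from scipy.spatial.transform import Rotation

    def displace_surface(nodes_xyz, pivot_xyz, rot_deg_xyz, translation_xyz):
        """nodes_xyz: (N, 3) array of surface node coordinates [m]."""
        R = Rotation.from_euler("xyz", rot_deg_xyz, degrees=True)
        return R.apply(nodes_xyz - pivot_xyz) + pivot_xyz + translation_xyz

    if __name__ == "__main__":
        # A small grid of nodes on a nominally flat mirror at z = 0
        x, y = np.meshgrid(np.linspace(-0.5, 0.5, 5), np.linspace(-0.5, 0.5, 5))
        nodes = np.column_stack([x.ravel(), y.ravel(), np.zeros(x.size)])
        moved = displace_surface(nodes,
                                 pivot_xyz=np.zeros(3),
                                 rot_deg_xyz=[0.0, 0.01, 0.0],      # 0.01 deg tilt
                                 translation_xyz=[0.0, 0.0, 1e-6])  # 1 micron piston
        print("max node motion: %.3e m" % np.abs(moved - nodes).max())
    ```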

  15. Signal processing and neural network toolbox and its application to failure diagnosis and prognosis

    NASA Astrophysics Data System (ADS)

    Tu, Fang; Wen, Fang; Willett, Peter K.; Pattipati, Krishna R.; Jordan, Eric H.

    2001-07-01

    Many systems are composed of components equipped with self-testing capability; however, if the system is complex, involves feedback, and the self-testing itself may occasionally be faulty, tracing faults to a single or multiple causes is difficult. Moreover, many sensors are incapable of reliable decision-making on their own. In such cases, a signal processing front-end that can match inference needs will be very helpful. The work is concerned with providing an object-oriented simulation environment for signal processing and neural network-based fault diagnosis and prognosis. In the toolbox, we implemented a wide range of spectral and statistical manipulation methods such as filters, harmonic analyzers, transient detectors, and multi-resolution decomposition to extract features for failure events from data collected by data sensors. Then we evaluated multiple learning paradigms for general classification, diagnosis and prognosis. The network models evaluated include Restricted Coulomb Energy (RCE) Neural Network, Learning Vector Quantization (LVQ), Decision Trees (C4.5), Fuzzy Adaptive Resonance Theory (FuzzyArtmap), Linear Discriminant Rule (LDR), Quadratic Discriminant Rule (QDR), Radial Basis Functions (RBF), Multiple Layer Perceptrons (MLP) and Single Layer Perceptrons (SLP). Validation techniques, such as N-fold cross-validation and bootstrap techniques, are employed for evaluating the robustness of network models. The trained networks are evaluated for their performance using test data on the basis of percent error rates obtained via cross-validation, time efficiency, and generalization ability to unseen faults. Finally, the usage of neural networks for the prediction of residual life of turbine blades with thermal barrier coatings is described and the results are shown. The neural network toolbox has also been applied to fault diagnosis in mixed-signal circuits.
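
    As a hedged illustration of the N-fold cross-validated evaluation by percent error mentioned above, the sketch below uses scikit-learn with synthetic feature data; the classifier (a standard MLP) and dataset are stand-ins, since the toolbox's own feature extractors and model implementations are not reproduced here.

    ```python
    # 10-fold cross-validated percent error for a classifier (illustrative).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                               n_classes=3, random_state=0)
    model = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                                        random_state=0))
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    acc = cross_val_score(model, X, y, cv=cv)
    print("10-fold percent error: %.1f%% +/- %.1f%%"
          % (100 * (1 - acc.mean()), 100 * acc.std()))
    ```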

  16. PreSurgMapp: a MATLAB Toolbox for Presurgical Mapping of Eloquent Functional Areas Based on Task-Related and Resting-State Functional MRI.

    PubMed

    Huang, Huiyuan; Ding, Zhongxiang; Mao, Dewang; Yuan, Jianhua; Zhu, Fangmei; Chen, Shuda; Xu, Yan; Lou, Lin; Feng, Xiaoyan; Qi, Le; Qiu, Wusi; Zhang, Han; Zang, Yu-Feng

    2016-10-01

    The main goal of brain tumor surgery is to maximize tumor resection while minimizing the risk of irreversible postoperative functional sequelae. Eloquent functional areas should be delineated preoperatively, particularly for patients with tumors near eloquent areas. Functional magnetic resonance imaging (fMRI) is a noninvasive technique that demonstrates great promise for presurgical planning. However, specialized data processing toolkits for presurgical planning remain lacking. Based on several functions in open-source software such as Statistical Parametric Mapping (SPM), Resting-State fMRI Data Analysis Toolkit (REST), Data Processing Assistant for Resting-State fMRI (DPARSF) and Multiple Independent Component Analysis (MICA), here, we introduce an open-source MATLAB toolbox named PreSurgMapp. This toolbox can reveal eloquent areas using comprehensive methods and various complementary fMRI modalities. For example, PreSurgMapp supports both model-based (general linear model, GLM, and seed correlation) and data-driven (independent component analysis, ICA) methods and processes both task-based and resting-state fMRI data. PreSurgMapp is designed for highly automatic and individualized functional mapping with a user-friendly graphical user interface (GUI) for time-saving pipeline processing. For example, sensorimotor and language-related components can be automatically identified without human intervention, using an effective and accurate component-identification algorithm based on a discriminability index. All the results generated can be further evaluated and compared by neuro-radiologists or neurosurgeons. This software has substantial value for clinical neuro-radiology and neuro-oncology, including application to patients with low- and high-grade brain tumors and those with epilepsy foci in the dominant language hemisphere who are planning to undergo a temporal lobectomy.
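
    To illustrate the seed-correlation method named among the model-based approaches above, here is a hedged numpy sketch that correlates a seed time series with every voxel's time series. The data layout (time x voxels), the synthetic data and the threshold are assumptions, not PreSurgMapp's implementation.

    ```python
    # Seed-based correlation mapping (illustrative).
    import numpy as np

    def seed_correlation_map(data, seed_ts):
        """data: (T, V) voxel time series; seed_ts: (T,) seed series. Returns r per voxel."""
        d = data - data.mean(axis=0)
        s = seed_ts - seed_ts.mean()
        num = d.T @ s
        den = np.sqrt((d ** 2).sum(axis=0) * (s ** 2).sum())
        return num / den

    if __name__ == "__main__":
        rng = np.random.default_rng(4)
        T, V = 200, 5000
        shared = rng.standard_normal(T)                   # common fluctuation
        data = rng.standard_normal((T, V))
        data[:, :100] += 0.8 * shared[:, None]            # a "connected" cluster
        seed = shared + 0.3 * rng.standard_normal(T)
        r = seed_correlation_map(data, seed)
        print("voxels with r > 0.3:", int((r > 0.3).sum()))
    ```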

  17. FracPaQ: a MATLAB™ Toolbox for the Quantification of Fracture Patterns

    NASA Astrophysics Data System (ADS)

    Healy, D.; Rizzo, R. E.; Cornwell, D. G.; Timms, N.; Farrell, N. J.; Watkins, H.; Gomez-Rivas, E.; Smith, M.

    2016-12-01

    The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, shapes and spatial distributions often exhibit some kind of order. In detail, there may be relationships among the different fracture attributes e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture patterns and fracture attributes. This presentation describes an open source toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales. Our current focus for the application of the software is on quantifying the fracture patterns in and around fault zones. There is a large body of published work on the quantification of relatively simple joint patterns, but fault zones present a bigger, and arguably more important, challenge. The method presented is inherently scale independent, and a key task will be to analyse and integrate quantitative fracture pattern data from micro- to macro-scales. Planned future releases will incorporate multi-scale analyses based on a wavelet method to look for scale transitions, and combining fracture traces from multiple 2-D images to derive the statistically equivalent 3-D fracture pattern.
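
    As a hedged illustration of two quantities named above, the sketch below computes a 2-D fracture intensity (total trace length per unit area, often called P21) and a parallel-plate ("cubic law") permeability estimate for a set of parallel fractures, k = b^3/(12 s) for aperture b and spacing s. These are standard textbook formulas with illustrative parameters, not FracPaQ's code.

    ```python
    # Fracture intensity (P21) and parallel-plate permeability (illustrative).
    import numpy as np

    def trace_lengths(segments):
        """segments: (N, 4) array of trace endpoints [x1, y1, x2, y2] in metres."""
        dx = segments[:, 2] - segments[:, 0]
        dy = segments[:, 3] - segments[:, 1]
        return np.hypot(dx, dy)

    def fracture_intensity_p21(segments, area_m2):
        return trace_lengths(segments).sum() / area_m2

    def parallel_plate_permeability(aperture_m, spacing_m):
        return aperture_m ** 3 / (12.0 * spacing_m)       # bulk permeability, m^2

    if __name__ == "__main__":
        segs = np.array([[0.0, 0.0, 1.0, 0.2],
                         [0.2, 0.5, 0.9, 0.8],
                         [0.1, 0.9, 0.6, 0.4]])
        print("P21 = %.2f m/m^2" % fracture_intensity_p21(segs, area_m2=1.0))
        print("k   = %.2e m^2" % parallel_plate_permeability(1e-4, 0.1))
    ```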

  18. Dynamical modeling and multi-experiment fitting with PottersWheel

    PubMed Central

    Maiwald, Thomas; Timmer, Jens

    2008-01-01

    Motivation: Modelers in Systems Biology need a flexible framework that allows them to easily create new dynamic models, investigate their properties and fit several experimental datasets simultaneously. Multi-experiment fitting is a powerful approach to estimate parameter values, to check the validity of a given model, and to discriminate competing model hypotheses. It requires high-performance integration of ordinary differential equations and robust optimization. Results: We here present the comprehensive modeling framework PottersWheel (PW), including novel functionalities to satisfy these requirements, with strong emphasis on the inverse problem, i.e. data-based modeling of partially observed and noisy systems like signal transduction pathways and metabolic networks. PW is designed as a MATLAB toolbox and includes numerous user interfaces. Deterministic and stochastic optimization routines are combined by fitting in logarithmic parameter space, allowing for robust parameter calibration. Model investigation includes statistical tests for model-data compliance, model discrimination, identifiability analysis and calculation of Hessian- and Monte-Carlo-based parameter confidence limits. A rich application programming interface is available for customization within the user's own MATLAB code. Within an extensive performance analysis, we identified and significantly improved an integrator–optimizer pair which decreases the fitting duration for a realistic benchmark model by a factor of over 3000 compared to MATLAB with the Optimization Toolbox. Availability: PottersWheel is freely available for academic usage at http://www.PottersWheel.de/. The website contains detailed documentation and introductory videos. The program has been intensively used since 2005 on Windows, Linux and Macintosh computers and does not require special MATLAB toolboxes. Contact: maiwald@fdm.uni-freiburg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18614583
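
    A hedged sketch of the log-parameter-space fitting idea mentioned above: optimize over theta = log10(p) so that strictly positive parameters spanning orders of magnitude are comparably scaled. The toy decay model and the use of scipy.optimize.least_squares are illustrative, not the PottersWheel framework.

    ```python
    # Fitting in logarithmic parameter space with a toy two-parameter model.
    import numpy as np
    from scipy.optimize import least_squares

    def model(t, p):
        amplitude, rate = p
        return amplitude * np.exp(-rate * t)

    def residuals_logspace(theta, t, y):
        p = 10.0 ** theta                      # back-transform to linear space
        return model(t, p) - y

    if __name__ == "__main__":
        rng = np.random.default_rng(5)
        t = np.linspace(0, 10, 40)
        y = model(t, [50.0, 0.35]) + rng.normal(0, 1.0, t.size)
        fit = least_squares(residuals_logspace, x0=np.log10([1.0, 1.0]), args=(t, y))
        print("estimated parameters:", np.round(10.0 ** fit.x, 3))
    ```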

  19. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
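
    As a hedged illustration of the variogram idea underlying the IVARS metrics described above, the sketch below estimates a directional variogram for one parameter, gamma_i(h) = 0.5 * E[(f(x + h*e_i) - f(x))^2], from paired model runs at random base points. The toy model and the simple sampling scheme are assumptions; this is not the VARS-TOOL implementation (which also provides Sobol and Morris metrics from the same samples).

    ```python
    # Directional variogram of a model response along one parameter (illustrative).
    import numpy as np

    def toy_model(x):
        # x: (..., 3) parameter array; an arbitrary nonlinear test function
        return np.sin(2 * np.pi * x[..., 0]) + 5 * x[..., 1] ** 2 + 0.1 * x[..., 2]

    def directional_variogram(model, dim, i, h, n_base=2000, seed=0):
        rng = np.random.default_rng(seed)
        base = rng.uniform(0, 1 - h, size=(n_base, dim))   # keep x_i + h inside [0, 1]
        shifted = base.copy()
        shifted[:, i] += h
        return 0.5 * np.mean((model(shifted) - model(base)) ** 2)

    if __name__ == "__main__":
        for i in range(3):
            gammas = [directional_variogram(toy_model, 3, i, h) for h in (0.1, 0.3)]
            print("parameter %d: gamma(0.1)=%.3f  gamma(0.3)=%.3f" % (i, *gammas))
    ```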

  20. Synthesis of Multispectral Bands from Hyperspectral Data: Validation Based on Images Acquired by AVIRIS, Hyperion, ALI, and ETM+

    NASA Technical Reports Server (NTRS)

    Blonski, Slawomir; Gasser, Gerald; Russell, Jeffrey; Ryan, Robert; Terrie, Greg; Zanoni, Vicki

    2001-01-01

    Multispectral data requirements for Earth science applications are not always rigorously studied before a new remote sensing system is designed. A study of the spatial resolution, spectral bandpasses, and radiometric sensitivity requirements of real-world applications would focus the design onto providing maximum benefits to the end-user community. To support systematic studies of multispectral data requirements, the Applications Research Toolbox (ART) has been developed at NASA's Stennis Space Center. The ART software allows users to create and assess simulated datasets while varying a wide range of system parameters. The simulations are based on data acquired by existing multispectral and hyperspectral instruments. The produced datasets can be further evaluated for specific end-user applications. Spectral synthesis of multispectral images from hyperspectral data is a key part of the ART software. In this process, hyperspectral image cubes are transformed into multispectral imagery without changes in spatial sampling and resolution. The transformation algorithm takes into account spectral responses of both the synthesized, broad, multispectral bands and the utilized, narrow, hyperspectral bands. To validate the spectral synthesis algorithm, simulated multispectral images are compared with images collected near-coincidentally by the Landsat 7 ETM+ and the EO-1 ALI instruments. Hyperspectral images acquired with the airborne AVIRIS instrument and with the Hyperion instrument onboard the EO-1 satellite were used as input data to the presented simulations.
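
    The spectral synthesis step described above amounts to collapsing many narrow hyperspectral bands into one broad band, weighting each narrow band by the broad band's relative spectral response. The Python sketch below shows that weighting under simplifying assumptions (top-hat response, width-weighted averaging); it illustrates the general idea only, not the ART implementation, and the band centres, widths and response function are invented.

    ```python
    import numpy as np

    def synthesize_band(hyper_cube, centers_nm, widths_nm, broad_srf):
        """Collapse narrow hyperspectral bands into one broad multispectral band.

        hyper_cube : (rows, cols, n_bands) calibrated radiances
        centers_nm : (n_bands,) band-centre wavelengths
        widths_nm  : (n_bands,) band widths (FWHM)
        broad_srf  : callable returning the broad band's relative spectral
                     response at a given wavelength
        """
        # Weight each narrow band by the broad band's response and by its width
        w = np.array([broad_srf(c) for c in centers_nm]) * np.asarray(widths_nm, float)
        if w.sum() == 0:
            raise ValueError("no hyperspectral band falls inside the broad-band response")
        w = w / w.sum()
        return np.tensordot(hyper_cube, w, axes=([2], [0]))

    # Illustration with a top-hat response standing in for a red broad band (numbers invented)
    red_srf = lambda lam: 1.0 if 630.0 <= lam <= 690.0 else 0.0
    cube = np.random.default_rng(0).random((4, 4, 20))   # placeholder hyperspectral cube
    centers = np.linspace(600.0, 790.0, 20)               # placeholder band centres [nm]
    red_band = synthesize_band(cube, centers, np.full(20, 10.0), red_srf)
    print(red_band.shape)                                  # (4, 4)
    ```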

  1. BrainNet Viewer: a network visualization tool for human brain connectomics.

    PubMed

    Xia, Mingrui; Wang, Jinhui; He, Yong

    2013-01-01

    The human brain is a complex system whose topological organization can be represented using connectomics. Recent studies have shown that human connectomes can be constructed using various neuroimaging technologies and further characterized using sophisticated analytic strategies, such as graph theory. These methods reveal the intriguing topological architectures of human brain networks in healthy populations and explore the changes throughout normal development and aging and under various pathological conditions. However, given the huge complexity of this methodology, toolboxes for graph-based network visualization are still lacking. Here, using MATLAB with a graphical user interface (GUI), we developed a graph-theoretical network visualization toolbox, called BrainNet Viewer, to illustrate human connectomes as ball-and-stick models. Within this toolbox, several combinations of defined files with connectome information can be loaded to display different combinations of brain surface, nodes and edges. In addition, display properties, such as the color and size of network elements or the layout of the figure, can be adjusted within a comprehensive but easy-to-use settings panel. Moreover, BrainNet Viewer draws the brain surface, nodes and edges in sequence and displays brain networks in multiple views, as required by the user. The figure can be manipulated with certain interaction functions to display more detailed information. Furthermore, the figures can be exported as commonly used image file formats or demonstration video for further use. BrainNet Viewer helps researchers to visualize brain networks in an easy, flexible and quick manner, and this software is freely available on the NITRC website (www.nitrc.org/projects/bnv/).

  2. Polymer application for separation/filtration of biological active compounds

    NASA Astrophysics Data System (ADS)

    Tylkowski, B.; Tsibranska, I.

    2017-06-01

    Membrane technology is an important part of the engineer's toolbox. This is especially true for industries that process food and other products with their primary source from nature. This review is focused on ongoing development work using membrane technologies for concentration and separation of biologically active compounds, such as polyphenols and flavonoids. We provide readers not only with the latest results achieved in this field but also deliver detailed information about the membrane types and the polymers used for their preparation.

  3. Water Power Data and Tools | Water Power | NREL

    Science.gov Websites

    NREL's water power research pairs computer modeling tools and data with state-of-the-art design and analysis. The site references the National Wind Technology Center's Information Portal and a WEC-Sim fact sheet. The WEC Design Response Toolbox provides extreme response and fatigue analysis tools.

  4. MOFA Software for the COBRA Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griesemer, Marc; Navid, Ali

    MOFA-COBRA is a software code for Matlab that performs Multi-Objective Flux Analysis (MOFA), the solving of linear programming problems. The leading software package for conducting different types of analyses using constraint-based models is the COBRA Toolbox for Matlab. MOFA-COBRA is an added tool for COBRA that solves multi-objective problems using a novel algorithm.

  5. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS. MODULE 1: WATER UTILITIES PLANNING GUIDE - INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  6. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS, MODULE 3: SITE CHARACTERIZATION AND SAMPLING GUIDE. INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  7. The panacea toolbox of a PhD biomedical student.

    PubMed

    Skaik, Younis

    2014-01-01

    Doing a PhD (doctor of philosophy) for the sake of contributing to knowledge should give the student immense enthusiasm throughout the PhD period. It is the time in one's life that one spends to "hit the nail on the head" in a specific area and topic of interest. A PhD consists mostly of hard work and tenacity; however, luck and genius might also play a little role. You can pass all PhD phases without having both luck and genius. The PhD student should have pre-PhD and PhD toolboxes, which are "sine quibus non" for successfully obtaining a PhD degree. In this manuscript, the toolboxes of the PhD student are discussed.

  8. A Tol2 Gateway-Compatible Toolbox for the Study of the Nervous System and Neurodegenerative Disease.

    PubMed

    Don, Emily K; Formella, Isabel; Badrock, Andrew P; Hall, Thomas E; Morsch, Marco; Hortle, Elinor; Hogan, Alison; Chow, Sharron; Gwee, Serene S L; Stoddart, Jack J; Nicholson, Garth; Chung, Roger; Cole, Nicholas J

    2017-02-01

    Currently there is a lack in fundamental understanding of disease progression of most neurodegenerative diseases, and, therefore, treatments and preventative measures are limited. Consequently, there is a great need for adaptable, yet robust model systems to both investigate elementary disease mechanisms and discover effective therapeutics. We have generated a Tol2 Gateway-compatible toolbox to study neurodegenerative disorders in zebrafish, which includes promoters for astrocytes, microglia and motor neurons, multiple fluorophores, and compatibility for the introduction of genes of interest or disease-linked genes. This toolbox will advance the rapid and flexible generation of zebrafish models to discover the biology of the nervous system and the disease processes that lead to neurodegeneration.

  9. Spectral analysis and filtering techniques in digital spatial data processing

    USGS Publications Warehouse

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. This filter toolbox provides capabilities to compute the power spectrum of a given dataset and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filter, line filter, and area filter. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes the capabilities to perform high-pass, band-pass, low-pass, and wedge filtering techniques. These filters have been applied to analyze satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
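
    As an illustration of the Gaussian-type notch (point) filtering mentioned above, the following Python sketch builds a frequency-domain Gaussian notch transfer function and applies it to an image. It is a generic textbook-style construction, not the EROS filter toolbox code, and the function and parameter names are invented.

    ```python
    import numpy as np

    def gaussian_notch_filter(shape, notch_uv, sigma):
        """Frequency-domain Gaussian notch (point) filter.

        shape    : (rows, cols) of the image
        notch_uv : (u0, v0) centre of the frequency component to suppress,
                   in cycles relative to the image centre
        sigma    : width of the notch
        """
        rows, cols = shape
        u = np.fft.fftshift(np.fft.fftfreq(rows)) * rows
        v = np.fft.fftshift(np.fft.fftfreq(cols)) * cols
        V, U = np.meshgrid(v, u)                 # U varies along rows, V along columns
        u0, v0 = notch_uv
        # Reject the chosen frequency component and its symmetric counterpart
        d1 = (U - u0)**2 + (V - v0)**2
        d2 = (U + u0)**2 + (V + v0)**2
        H = (1 - np.exp(-d1 / (2 * sigma**2))) * (1 - np.exp(-d2 / (2 * sigma**2)))
        return H

    def apply_filter(image, H):
        """Multiply the centred spectrum by H and transform back."""
        F = np.fft.fftshift(np.fft.fft2(image))
        return np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))

    # Example: suppress a periodic stripe artifact at (u0, v0) = (0, 40) cycles
    img = np.random.default_rng(0).random((256, 256))        # placeholder image
    H = gaussian_notch_filter(img.shape, (0.0, 40.0), sigma=5.0)
    cleaned = apply_filter(img, H)
    ```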

  10. The watershed and river systems management program

    USGS Publications Warehouse

    Markstrom, S.L.; Frevert, D.; Leavesley, G.H.; ,

    2005-01-01

    The Watershed and River System Management Program (WaRSMP), a joint effort between the U.S. Geological Survey (USGS) and the U.S. Bureau of Reclamation (Reclamation), is focused on research and development of decision support systems and their application to achieve an equitable balance among diverse water resource management demands. Considerations include: (1) legal and political constraints; (2) stakeholder and consensus-building; (3) sound technical knowledge; (4) flood control, consumptive use, and hydropower; (5) water transfers; (6) irrigation return flows and water quality; (7) recreation; (8) habitat for endangered species; (9) water supply and proration; (10) near-surface groundwater; and (11) water ownership, accounting, and rights. To address the interdisciplinary and multi-stakeholder needs of real-time watershed management, WaRSMP has developed a decision support system toolbox. The USGS Object User Interface facilitates the coupling of Reclamation's RiverWare reservoir operations model with the USGS Modular Modeling and Precipitation Runoff Modeling Systems through a central database. This integration is accomplished through the use of Model and Data Management Interfaces. WaRSMP applications include the Colorado River main stem and Gunnison Basin, the Yakima Basin, the Middle Rio Grande Basin, the Truckee-Carson Basin, and the Umatilla Basin.

  11. A Review of Astronomy Education Research

    NASA Astrophysics Data System (ADS)

    Bailey, Janelle M.; Slater, Timothy F.

    The field of astronomy education is rapidly growing beyond merely sharing effective activities or curriculum ideas. This paper categorizes and summarizes the literature in astronomy education research and contains more than 100 references to articles, books, and Web-based materials. Research into student understanding on a variety of topics now occupies a large part of the literature. Topics include the shape of Earth and gravity, lunar phases, seasons, astrobiology, and cosmology. The effectiveness of instructional methods is now being tested systematically, taking data beyond the anecdotal with powerful research designs and statistical analyses. Quantitative, qualitative, and mixed-methods approaches have found their places in the researcher's toolbox. In all cases, the connection between the research performed and its effect on classroom instruction is largely lacking.

  12. Extending Inferential Group Analysis in Type 2 Diabetic Patients with Multivariate GLM Implemented in SPM8.

    PubMed

    Ferreira, Fábio S; Pereira, João M S; Duarte, João V; Castelo-Branco, Miguel

    2017-01-01

    Although voxel based morphometry studies are still the standard for analyzing brain structure, their dependence on massive univariate inferential methods is a limiting factor. A better understanding of brain pathologies can be achieved by applying inferential multivariate methods, which allow the study of multiple dependent variables, e.g. different imaging modalities of the same subject. Given the widespread use of SPM software in the brain imaging community, the main aim of this work is the implementation of massive multivariate inferential analysis as a toolbox in this software package, applied here to T1 and T2 structural data from diabetic patients and controls. This implementation was compared with the traditional ANCOVA in SPM and a similar multivariate GLM toolbox (MRM). We implemented the new toolbox and tested it by investigating brain alterations in a cohort of twenty-eight type 2 diabetes patients and twenty-six matched healthy controls, using information from both T1 and T2 weighted structural MRI scans, both separately - using standard univariate VBM - and simultaneously, with multivariate analyses. Univariate VBM replicated predominantly bilateral changes in basal ganglia and insular regions in type 2 diabetes patients. On the other hand, multivariate analyses replicated key findings of univariate results, while also revealing the thalami as additional foci of pathology. While the presented algorithm must be further optimized, the proposed toolbox is the first implementation of multivariate statistics in SPM8 as a user-friendly toolbox, which shows great potential and is ready to be validated in other clinical cohorts and modalities.

  13. Extending Inferential Group Analysis in Type 2 Diabetic Patients with Multivariate GLM Implemented in SPM8

    PubMed Central

    Ferreira, Fábio S.; Pereira, João M.S.; Duarte, João V.; Castelo-Branco, Miguel

    2017-01-01

    Background: Although voxel based morphometry studies are still the standard for analyzing brain structure, their dependence on massive univariate inferential methods is a limiting factor. A better understanding of brain pathologies can be achieved by applying inferential multivariate methods, which allow the study of multiple dependent variables, e.g. different imaging modalities of the same subject. Objective: Given the widespread use of SPM software in the brain imaging community, the main aim of this work is the implementation of massive multivariate inferential analysis as a toolbox in this software package, applied here to T1 and T2 structural data from diabetic patients and controls. This implementation was compared with the traditional ANCOVA in SPM and a similar multivariate GLM toolbox (MRM). Method: We implemented the new toolbox and tested it by investigating brain alterations in a cohort of twenty-eight type 2 diabetes patients and twenty-six matched healthy controls, using information from both T1 and T2 weighted structural MRI scans, both separately - using standard univariate VBM - and simultaneously, with multivariate analyses. Results: Univariate VBM replicated predominantly bilateral changes in basal ganglia and insular regions in type 2 diabetes patients. On the other hand, multivariate analyses replicated key findings of univariate results, while also revealing the thalami as additional foci of pathology. Conclusion: While the presented algorithm must be further optimized, the proposed toolbox is the first implementation of multivariate statistics in SPM8 as a user-friendly toolbox, which shows great potential and is ready to be validated in other clinical cohorts and modalities. PMID:28761571

  14. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    NASA Astrophysics Data System (ADS)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. A growing suite of metadata-aware editing, quality control, analysis and synthesis tools are provided with the software to support managing data using graphical forms and command-line functions, as well as developing automated workflows for unattended processing. Finalized data and structured metadata can be exported in a wide variety of text and MATLAB formats or uploaded to a relational database for long-term archiving and distribution. The GCE Data Toolbox can be used as a complete, light-weight solution for environmental data and metadata management, but it can also be used in conjunction with other cyber infrastructure to provide a more comprehensive solution. For example, newly acquired data can be retrieved from a Data Turbine or Campbell LoggerNet Database server for quality control and processing, then transformed to CUAHSI Observations Data Model format and uploaded to a HydroServer for distribution through the CUAHSI Hydrologic Information System. The GCE Data Toolbox can also be leveraged in analytical workflows developed using Kepler or other systems that support MATLAB integration or tool chaining. This software can therefore be leveraged in many ways to help researchers manage, analyze and distribute the data they collect.

  15. The power of simplicity: a fast-and-frugal heuristics approach to performance science.

    PubMed

    Raab, Markus; Gigerenzer, Gerd

    2015-01-01

    Performance science is a fairly new multidisciplinary field that integrates performance domains such as sports, medicine, business, and the arts. To give its many branches a structure and its research a direction, it requires a theoretical framework. We demonstrate the applications of this framework with examples from sport and medicine. Because performance science deals mainly with situations of uncertainty rather than known risks, the needed framework can be provided by the fast-and-frugal heuristics approach. According to this approach, experts learn to rely on heuristics in an adaptive way in order to make accurate decisions. We investigate the adaptive use of heuristics in three ways: the descriptive study of the heuristics in the cognitive "adaptive toolbox;" the prescriptive study of their "ecological rationality," that is, the characterization of the situations in which a given heuristic works; and the engineering study of "intuitive design," that is, the design of transparent aids for making better decisions.

  16. Computational investigations and grid refinement study of 3D transient flow in a cylindrical tank using OpenFOAM

    NASA Astrophysics Data System (ADS)

    Mohd Sakri, F.; Mat Ali, M. S.; Sheikh Salim, S. A. Z.

    2016-10-01

    The study of the fluid physics of liquid draining inside a tank is readily accessible using numerical simulation. However, numerical simulation is expensive when the liquid draining involves a multi-phase problem. Since an accurate numerical simulation can only be obtained if a proper method for error estimation is applied, this paper provides a systematic assessment of the error due to grid convergence using OpenFOAM. OpenFOAM is an open-source CFD toolbox that is well known among researchers and institutions because it is free and ready to use. In this study, three grid resolutions are used: coarse, medium and fine. The Grid Convergence Index (GCI) is applied to estimate the error due to grid sensitivity. A monotonic convergence condition is obtained, showing that the grid convergence error is progressively reduced. The fine grid has a GCI value below 1%. The extrapolated value from Richardson extrapolation lies within the range of the GCI obtained.
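
    The GCI and Richardson extrapolation mentioned above follow a standard recipe (Roache's formulation): estimate the observed order of convergence from three systematically refined grids, then report the fine-grid error band and the extrapolated grid-independent value. A minimal Python sketch, using illustrative numbers rather than values from the paper:

    ```python
    import math

    def grid_convergence_index(f_fine, f_med, f_coarse, r=2.0, safety=1.25):
        """Roache-style GCI for three systematically refined grids (refinement ratio r)."""
        # Observed order of convergence from the three solutions
        p = math.log(abs((f_coarse - f_med) / (f_med - f_fine))) / math.log(r)
        # Relative error between the two finest grids
        e21 = abs((f_fine - f_med) / f_fine)
        gci_fine = safety * e21 / (r**p - 1.0)
        # Richardson-extrapolated estimate of the grid-independent value
        f_exact = f_fine + (f_fine - f_med) / (r**p - 1.0)
        return p, gci_fine, f_exact

    # Illustrative numbers only (not taken from the study)
    p, gci, f_ext = grid_convergence_index(0.9713, 0.9682, 0.9610)
    print(f"order p = {p:.2f}, GCI_fine = {100 * gci:.2f}%, extrapolated = {f_ext:.4f}")
    ```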

  17. The power of simplicity: a fast-and-frugal heuristics approach to performance science

    PubMed Central

    Raab, Markus; Gigerenzer, Gerd

    2015-01-01

    Performance science is a fairly new multidisciplinary field that integrates performance domains such as sports, medicine, business, and the arts. To give its many branches a structure and its research a direction, it requires a theoretical framework. We demonstrate the applications of this framework with examples from sport and medicine. Because performance science deals mainly with situations of uncertainty rather than known risks, the needed framework can be provided by the fast-and-frugal heuristics approach. According to this approach, experts learn to rely on heuristics in an adaptive way in order to make accurate decisions. We investigate the adaptive use of heuristics in three ways: the descriptive study of the heuristics in the cognitive “adaptive toolbox;” the prescriptive study of their “ecological rationality,” that is, the characterization of the situations in which a given heuristic works; and the engineering study of “intuitive design,” that is, the design of transparent aids for making better decisions. PMID:26579051

  18. Early Warning Signals of Ecological Transitions: Methods for Spatial Patterns

    PubMed Central

    Brock, William A.; Carpenter, Stephen R.; Ellison, Aaron M.; Livina, Valerie N.; Seekell, David A.; Scheffer, Marten; van Nes, Egbert H.; Dakos, Vasilis

    2014-01-01

    A number of ecosystems can exhibit abrupt shifts between alternative stable states. Because of their important ecological and economic consequences, recent research has focused on devising early warning signals for anticipating such abrupt ecological transitions. In particular, theoretical studies show that changes in spatial characteristics of the system could provide early warnings of approaching transitions. However, the empirical validation of these indicators lags behind their theoretical developments. Here, we summarize a range of currently available spatial early warning signals, suggest potential null models to interpret their trends, and apply them to three simulated spatial data sets of systems undergoing an abrupt transition. In addition to providing a step-by-step methodology for applying these signals to spatial data sets, we propose a statistical toolbox that may be used to help detect approaching transitions in a wide range of spatial data. We hope that our methodology together with the computer codes will stimulate the application and testing of spatial early warning signals on real spatial data. PMID:24658137
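
    Spatial correlation is one of the spatial early warning indicators commonly computed in such analyses. The sketch below is a generic Python implementation rather than code from the paper: it computes Moran's I on a 2-D lattice with rook (4-neighbour) weights, the kind of statistic whose rising trend over time these indicators monitor.

    ```python
    import numpy as np

    def morans_i(grid):
        """Moran's I spatial autocorrelation on a 2-D lattice with rook (4-neighbour) weights."""
        z = grid - grid.mean()
        # Sum z_i * z_j over horizontally and vertically adjacent cell pairs (both directions)
        num = 2 * np.sum(z[:, :-1] * z[:, 1:]) + 2 * np.sum(z[:-1, :] * z[1:, :])
        w_sum = 2 * (z[:, :-1].size + z[:-1, :].size)   # total number of directed adjacencies
        n = grid.size
        return (n / w_sum) * (num / np.sum(z**2))

    # A spatially random field should give Moran's I near zero
    rng = np.random.default_rng(0)
    print(f"Moran's I (random field): {morans_i(rng.normal(size=(50, 50))):.3f}")
    ```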

  19. Detrended Fluctuation Analysis: A Scale-Free View on Neuronal Oscillations

    PubMed Central

    Hardstone, Richard; Poil, Simon-Shlomo; Schiavone, Giuseppina; Jansen, Rick; Nikulin, Vadim V.; Mansvelder, Huibert D.; Linkenkaer-Hansen, Klaus

    2012-01-01

    Recent years of research have shown that the complex temporal structure of ongoing oscillations is scale-free and characterized by long-range temporal correlations. Detrended fluctuation analysis (DFA) has proven particularly useful, revealing that genetic variation, normal development, or disease can lead to differences in the scale-free amplitude modulation of oscillations. Furthermore, amplitude dynamics is remarkably independent of the time-averaged oscillation power, indicating that the DFA provides unique insights into the functional organization of neuronal systems. To facilitate understanding and encourage wider use of scaling analysis of neuronal oscillations, we provide a pedagogical explanation of the DFA algorithm and its underlying theory. Practical advice on applying DFA to oscillations is supported by MATLAB scripts from the Neurophysiological Biomarker Toolbox (NBT) and links to the NBT tutorial website http://www.nbtwiki.net/. Finally, we provide a brief overview of insights derived from the application of DFA to ongoing oscillations in health and disease, and discuss the putative relevance of criticality for understanding the mechanism underlying scale-free modulation of oscillations. PMID:23226132
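
    The DFA recipe described above has four steps: cumulative summation of the mean-removed signal, windowing, per-window detrending, and a log-log fit of the root-mean-square fluctuation against window size. The following Python sketch is a minimal textbook-style implementation for illustration, not the Neurophysiological Biomarker Toolbox code; white noise is used as placeholder input, for which the exponent should be close to 0.5.

    ```python
    import numpy as np

    def dfa(signal, window_sizes):
        """Detrended fluctuation analysis of a 1-D signal (e.g. an amplitude envelope).

        Returns the fluctuation F(n) per window size and the scaling exponent
        (slope of log F(n) vs. log n)."""
        x = np.asarray(signal, dtype=float)
        profile = np.cumsum(x - x.mean())                 # step 1: cumulative sum (profile)
        fluctuations = []
        for n in window_sizes:
            n_windows = len(profile) // n
            sq_res = []
            for i in range(n_windows):
                seg = profile[i * n:(i + 1) * n]
                t = np.arange(n)
                coeffs = np.polyfit(t, seg, 1)            # step 2/3: remove linear trend per window
                sq_res.append(np.mean((seg - np.polyval(coeffs, t))**2))
            fluctuations.append(np.sqrt(np.mean(sq_res))) # RMS fluctuation at this scale
        log_n, log_F = np.log10(window_sizes), np.log10(fluctuations)
        alpha = np.polyfit(log_n, log_F, 1)[0]            # step 4: scaling exponent
        return np.array(fluctuations), alpha

    rng = np.random.default_rng(0)
    _, alpha = dfa(rng.normal(size=10000), window_sizes=[16, 32, 64, 128, 256, 512])
    print(f"DFA exponent: {alpha:.2f}")                   # ~0.5 for white noise
    ```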

  20. Proba-V Mission Exploitation Platform

    NASA Astrophysics Data System (ADS)

    Goor, Erwin; Dries, Jeroen

    2017-04-01

    VITO and partners developed the Proba-V Mission Exploitation Platform (MEP) as an end-to-end solution to drastically improve the exploitation of the Proba-V (a Copernicus contributing mission) EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1PB) is addressed, as well as the large-scale on-demand processing of near real-time data on a powerful and scalable processing environment. Furthermore, data from the Copernicus Global Land Service is in scope of the platform. From November 2015 an operational Proba-V MEP environment, as an ESA operation service, has been gradually deployed at the VITO data center with direct access to the complete data archive. Since autumn 2016 the platform has been operational and several applications have been released to users, e.g. a time series viewer showing the evolution of Proba-V bands and derived vegetation parameters from the Copernicus Global Land Service for any area of interest; full-resolution viewing services for the complete data archive; on-demand processing chains on a powerful Hadoop/Spark backend, e.g. for the calculation of N-daily composites; and virtual machines with access to the data archive and tools to work with the data, e.g. various toolboxes (GDAL, QGIS, GrassGIS, SNAP toolbox, …) and support for R and Python, which allows users to work with the data immediately without having to install tools or download data, and to design, debug and test applications on the platform. A prototype of Jupyter Notebooks is available with worked examples that show the potential of the data. Today the platform is used by several third-party projects to perform R&D activities on the data and to develop/host data analysis toolboxes. In parallel the platform is being further improved and extended, and access to Sentinel-2 and Landsat data from the MEP will become available soon. Users can make use of powerful web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used and a distributed processing environment is built based on Hadoop. The Hadoop ecosystem offers many technologies (Spark, Yarn, Accumulo, etc.), which we integrate with several open-source components (e.g. Geotrellis). The impact of this MEP on the user community will be high and will completely change the way of working with the data, opening the large time series to a larger community of users. The presentation will address these benefits for users, discuss the technical challenges in implementing the MEP, and include demonstrations. Platform URL: https://proba-v-mep.esa.int/

  1. T-MATS Toolbox for the Modeling and Analysis of Thermodynamic Systems

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.

    2014-01-01

    The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) is a MATLAB/Simulink (The MathWorks, Inc.) plug-in for creating and simulating thermodynamic systems and controls. The package contains generic parameterized components that can be combined with a variable input iterative solver and optimization algorithm to create complex system models, such as gas turbines.

  2. Various Solution Methods, Accompanied by Dynamic Investigation, for the Same Problem as a Means for Enriching the Mathematical Toolbox

    ERIC Educational Resources Information Center

    Oxman, Victor; Stupel, Moshe

    2018-01-01

    A geometrical task is presented with multiple solutions using different methods, in order to show the connection between various branches of mathematics and to highlight the importance of providing the students with an extensive 'mathematical toolbox'. Investigation of the property that appears in the task was carried out using a computerized tool.

  3. Various solution methods, accompanied by dynamic investigation, for the same problem as a means for enriching the mathematical toolbox

    NASA Astrophysics Data System (ADS)

    Oxman, Victor; Stupel, Moshe

    2018-04-01

    A geometrical task is presented with multiple solutions using different methods, in order to show the connection between various branches of mathematics and to highlight the importance of providing the students with an extensive 'mathematical toolbox'. Investigation of the property that appears in the task was carried out using a computerized tool.

  4. A sigma factor toolbox for orthogonal gene expression in Escherichia coli

    PubMed Central

    Van Brempt, Maarten; Van Nerom, Katleen; Van Hove, Bob; Maertens, Jo; De Mey, Marjan; Charlier, Daniel

    2018-01-01

    Abstract Synthetic genetic sensors and circuits enable programmable control over timing and conditions of gene expression and, as a result, are increasingly incorporated into the control of complex and multi-gene pathways. Size and complexity of genetic circuits are growing, but stay limited by a shortage of regulatory parts that can be used without interference. Therefore, orthogonal expression and regulation systems are needed to minimize undesired crosstalk and allow for dynamic control of separate modules. This work presents a set of orthogonal expression systems for use in Escherichia coli based on heterologous sigma factors from Bacillus subtilis that recognize specific promoter sequences. Up to four of the analyzed sigma factors can be combined to function orthogonally between each other and toward the host. Additionally, the toolbox is expanded by creating promoter libraries for three sigma factors without loss of their orthogonal nature. As this set covers a wide range of transcription initiation frequencies, it enables tuning of multiple outputs of the circuit in response to different sensory signals in an orthogonal manner. This sigma factor toolbox constitutes an interesting expansion of the synthetic biology toolbox and may contribute to the assembly of more complex synthetic genetic systems in the future. PMID:29361130

  5. Cementitious Barriers Partnership (CBP): Training and Release of CBP Toolbox Software, Version 1.0 - 13480

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.

    2013-07-01

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the Office of Tank Waste Management within the Office of Environmental Management of the U.S. Department of Energy (US DOE). The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that improve understanding and predictions of the long-term hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program are intended to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to or longer than 100 years for operating facilities and longer than 1,000 years for waste management purposes. CBP software tools were made available to selected DOE Office of Environmental Management and field site users for training and evaluation based on a set of important degradation scenarios, including sulfate ingress/attack and carbonation of cementitious materials. The tools presented at two-day training workshops held at the U.S. National Institute of Standards and Technology (NIST), Savannah River, and Hanford included LeachXS™/ORCHESTRA, STADIUM®, and a CBP-developed GoldSim Dashboard interface. Collectively, these components form the CBP Software Toolbox. The new U.S. Environmental Protection Agency leaching test methods based on the Leaching Environmental Assessment Framework (LEAF) were also presented. The CBP Dashboard uses a custom dynamic-link library developed by CBP to couple to the LeachXS™/ORCHESTRA and STADIUM® codes to simulate reactive transport and degradation in cementitious materials for selected performance assessment scenarios. The first day of the workshop introduced participants to the software components via presentation materials, and the second day included hands-on tutorial exercises followed by discussions of enhancements desired by participants. Tools were revised based on feedback obtained during the workshops held from April through June 2012. The resulting improved CBP Software Toolbox, including evaluation versions of LeachXS™/ORCHESTRA and STADIUM®, has been made available to workshop participants and selected other participants for further assessment. Inquiries about future workshops and requests for access to the Toolbox software can be made via the CBP web site [1]. (authors)

  6. Development of proteome-wide binding reagents for research and diagnostics.

    PubMed

    Taussig, Michael J; Schmidt, Ronny; Cook, Elizabeth A; Stoevesandt, Oda

    2013-12-01

    Alongside MS, antibodies and other specific protein-binding molecules have a special place in proteomics as affinity reagents in a toolbox of applications for determining protein location, quantitative distribution and function (affinity proteomics). The realisation that the range of research antibodies available, while apparently vast is nevertheless still very incomplete and frequently of uncertain quality, has stimulated projects with an objective of raising comprehensive, proteome-wide sets of protein binders. With progress in automation and throughput, a remarkable number of recent publications refer to the practical possibility of selecting binders to every protein encoded in the genome. Here we review the requirements of a pipeline of production of protein binders for the human proteome, including target prioritisation, antigen design, 'next generation' methods, databases and the approaches taken by ongoing projects in Europe and the USA. While the task of generating affinity reagents for all human proteins is complex and demanding, the benefits of well-characterised and quality-controlled pan-proteome binder resources for biomedical research, industry and life sciences in general would be enormous and justify the effort. Given the technical, personnel and financial resources needed to fulfil this aim, expansion of current efforts may best be addressed through large-scale international collaboration. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Amanda M.; Daly, Don S.; Willse, Alan R.

    The Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB is a flexible, open-source microarray image analysis tool that allows the user to customize analysis of sets of microarray images. This tool provides several methods to identify and quantify spot statistics, as well as extensive diagnostic statistics and images to identify poor data quality or processing. The open nature of this software allows researchers to understand the algorithms used to provide intensity estimates and to modify them easily if desired.

  8. From black box to toolbox: Outlining device functionality, engagement activities, and the pervasive information architecture of mHealth interventions.

    PubMed

    Danaher, Brian G; Brendryen, Håvar; Seeley, John R; Tyler, Milagra S; Woolley, Tim

    2015-03-01

    mHealth interventions that deliver content via mobile phones represent a burgeoning area of health behavior change. The current paper examines two themes that can inform the underlying design of mHealth interventions: (1) mobile device functionality, which represents the technological toolbox available to intervention developers; and (2) the pervasive information architecture of mHealth interventions, which determines how intervention content can be delivered concurrently using mobile phones, personal computers, and other devices. We posit that developers of mHealth interventions will be better able to achieve the promise of this burgeoning arena by leveraging the toolbox and functionality of mobile devices in order to engage participants and encourage meaningful behavior change within the context of a carefully designed pervasive information architecture.

  9. Experimental verification, and domain definition, of structural alerts for protein binding: epoxides, lactones, nitroso, nitros, aldehydes and ketones.

    PubMed

    Nelms, M D; Cronin, M T D; Schultz, T W; Enoch, S J

    2013-01-01

    This study outlines how a combination of in chemico and Tetrahymena pyriformis data can be used to define the applicability domain of selected structural alerts within the profilers of the OECD QSAR Toolbox. Thirty-three chemicals were profiled using the OECD and OASIS profilers, enabling the applicability domain of six structural alerts to be defined, the alerts being: epoxides, lactones, nitrosos, nitros, aldehydes and ketones. Analysis of the experimental data showed the applicability domains for the epoxide, nitroso, aldehyde and ketone structural alerts to be well defined. In contrast, the data showed the applicability domains for the lactone and nitro structural alerts needed modifying. The accurate definition of the applicability domain for structural alerts within in silico profilers is important due to their use in the chemical category in predictive and regulatory toxicology. This study highlights the importance of utilizing multiple profilers in category formation.

  10. Incorporation of support vector machines in the LIBS toolbox for sensitive and robust classification amidst unexpected sample and system variability

    PubMed Central

    Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P.; Kumar, G. Manoj

    2012-01-01

    Despite the intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real world applications, e.g. quality assurance and process monitoring. Specifically, variability in sample, system and experimental parameters in LIBS studies presents a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a non-linear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), due to its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples as well as in related areas of forensic and biological sample analysis. PMID:22292496

  11. Incorporation of support vector machines in the LIBS toolbox for sensitive and robust classification amidst unexpected sample and system variability.

    PubMed

    Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P; Kumar Gundawar, Manoj

    2012-03-20

    Despite the intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real-world applications, e.g., quality assurance and process monitoring. Specifically, variability in sample, system, and experimental parameters in LIBS studies presents a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a nonlinear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that the application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), because of its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples, as well as in related areas of forensic and biological sample analysis.
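
    To make the classification setup concrete, the sketch below trains an RBF-kernel SVM with standardization and cross-validation in Python using scikit-learn. This is a generic illustration of the approach, not the authors' pipeline; the spectra and labels are random placeholders, so the reported accuracy will be near chance.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    # Placeholder data: rows are LIBS spectra (intensity per wavelength channel),
    # labels identify which product each spectrum came from.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 500))          # placeholder spectra
    y = rng.integers(0, 3, size=120)         # placeholder class labels

    # Non-linear (RBF-kernel) SVM, as proposed in the record above, with feature scaling
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    scores = cross_val_score(clf, X, y, cv=5)
    print("cross-validated accuracy:", scores.mean())
    ```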

  12. OntoCAT -- simple ontology search and integration in Java, R and REST/JavaScript

    PubMed Central

    2011-01-01

    Background Ontologies have become an essential asset in the bioinformatics toolbox and a number of ontology access resources are now available, for example, the EBI Ontology Lookup Service (OLS) and the NCBO BioPortal. However, these resources differ substantially in mode, ease of access, and ontology content. This makes it relatively difficult to access each ontology source separately, map their contents to research data, and much of this effort is being replicated across different research groups. Results OntoCAT provides a seamless programming interface to query heterogeneous ontology resources including OLS and BioPortal, as well as user-specified local OWL and OBO files. Each resource is wrapped behind easy to learn Java, Bioconductor/R and REST web service commands enabling reuse and integration of ontology software efforts despite variation in technologies. It is also available as a stand-alone MOLGENIS database and a Google App Engine application. Conclusions OntoCAT provides a robust, configurable solution for accessing ontology terms specified locally and from remote services, is available as a stand-alone tool and has been tested thoroughly in the ArrayExpress, MOLGENIS, EFO and Gen2Phen phenotype use cases. Availability http://www.ontocat.org PMID:21619703

  13. OntoCAT--simple ontology search and integration in Java, R and REST/JavaScript.

    PubMed

    Adamusiak, Tomasz; Burdett, Tony; Kurbatova, Natalja; Joeri van der Velde, K; Abeygunawardena, Niran; Antonakaki, Despoina; Kapushesky, Misha; Parkinson, Helen; Swertz, Morris A

    2011-05-29

    Ontologies have become an essential asset in the bioinformatics toolbox and a number of ontology access resources are now available, for example, the EBI Ontology Lookup Service (OLS) and the NCBO BioPortal. However, these resources differ substantially in mode, ease of access, and ontology content. This makes it relatively difficult to access each ontology source separately, map their contents to research data, and much of this effort is being replicated across different research groups. OntoCAT provides a seamless programming interface to query heterogeneous ontology resources including OLS and BioPortal, as well as user-specified local OWL and OBO files. Each resource is wrapped behind easy to learn Java, Bioconductor/R and REST web service commands enabling reuse and integration of ontology software efforts despite variation in technologies. It is also available as a stand-alone MOLGENIS database and a Google App Engine application. OntoCAT provides a robust, configurable solution for accessing ontology terms specified locally and from remote services, is available as a stand-alone tool and has been tested thoroughly in the ArrayExpress, MOLGENIS, EFO and Gen2Phen phenotype use cases. http://www.ontocat.org.

  14. Dem Generation from Close-Range Photogrammetry Using Extended Python Photogrammetry Toolbox

    NASA Astrophysics Data System (ADS)

    Belmonte, A. A.; Biong, M. M. P.; Macatulad, E. G.

    2017-10-01

    Digital elevation models (DEMs) are widely used raster data for different applications concerning terrain, such as flood modelling, viewshed analysis, mining, land development, and engineering design projects, to name a few. DEMs can be obtained through various methods, including topographic survey, LiDAR or photogrammetry, and internet sources. Terrestrial close-range photogrammetry is one of the alternative methods to produce DEMs through the processing of images using photogrammetry software. There are already powerful photogrammetry software packages that are commercially available and can produce high-accuracy DEMs; however, this entails a corresponding cost. Although some of these packages offer free or demo trials, the trials limit the usable features and usage time. One alternative is the use of free and open-source software (FOSS), such as the Python Photogrammetry Toolbox (PPT), which provides an interface for performing photogrammetric processes implemented through Python scripts. For relatively small areas such as in mining or construction excavation, a relatively inexpensive, fast and accurate method would be advantageous. In this study, PPT was used to generate 3D point cloud data from images of an open pit excavation. PPT was extended with an algorithm that converts the generated point cloud data into a usable DEM.
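
    The final step, turning a photogrammetric point cloud into a raster DEM, is essentially a gridding operation. The Python sketch below bins XYZ points into cells and averages the elevations per cell; it shows one simple way to do this and is not the algorithm added to PPT in the study, with the function and parameter names invented for illustration.

    ```python
    import numpy as np

    def point_cloud_to_dem(points, cell_size):
        """Grid an (N, 3) XYZ point cloud into a DEM by averaging Z per cell.

        Returns the DEM array (NaN where a cell holds no points) and the grid origin."""
        xyz = np.asarray(points, dtype=float)
        x0, y0 = xyz[:, 0].min(), xyz[:, 1].min()
        cols = np.floor((xyz[:, 0] - x0) / cell_size).astype(int)
        rows = np.floor((xyz[:, 1] - y0) / cell_size).astype(int)
        dem_sum = np.zeros((rows.max() + 1, cols.max() + 1))
        dem_cnt = np.zeros_like(dem_sum)
        np.add.at(dem_sum, (rows, cols), xyz[:, 2])   # accumulate elevations per cell
        np.add.at(dem_cnt, (rows, cols), 1)           # count points per cell
        with np.errstate(invalid="ignore", divide="ignore"):
            dem = dem_sum / dem_cnt                   # NaN where no points fall in a cell
        return dem, (x0, y0)

    # Example with a random placeholder point cloud (units of metres assumed)
    pts = np.random.default_rng(0).uniform([0, 0, 10], [50, 50, 20], size=(5000, 3))
    dem, origin = point_cloud_to_dem(pts, cell_size=1.0)
    print(dem.shape, origin)
    ```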

  15. Reprocessing Close Range Terrestrial and Uav Photogrammetric Projects with the Dbat Toolbox for Independent Verification and Quality Control

    NASA Astrophysics Data System (ADS)

    Murtiyoso, A.; Grussenmeyer, P.; Börlin, N.

    2017-11-01

    Photogrammetry has recently seen a rapid increase in many applications, thanks to developments in computing power and algorithms. Furthermore with the democratisation of UAVs (Unmanned Aerial Vehicles), close range photogrammetry has seen more and more use due to the easier capability to acquire aerial close range images. In terms of photogrammetric processing, many commercial software solutions exist in the market that offer results from user-friendly environments. However, in most commercial solutions, a black-box approach to photogrammetric calculations is often used. This is understandable in light of the proprietary nature of the algorithms, but it may pose a problem if the results need to be validated in an independent manner. In this paper, the Damped Bundle Adjustment Toolbox (DBAT) developed for Matlab was used to reprocess some photogrammetric projects that were processed using the commercial software Agisoft Photoscan. Several scenarios were experimented on in order to see the performance of DBAT in reprocessing terrestrial and UAV close range photogrammetric projects in several configurations of self-calibration setting. Results show that DBAT managed to reprocess PS projects and generate metrics which can be useful for project verification.

  16. eXtended CASA Line Analysis Software Suite (XCLASS)

    NASA Astrophysics Data System (ADS)

    Möller, T.; Endres, C.; Schilke, P.

    2017-02-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, taking finite source size and dust attenuation into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL using the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7

  17. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    PubMed Central

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-01-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
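
    SaSAT's workflow of stratified sampling followed by sensitivity measures can be illustrated compactly. The Python sketch below draws a simple Latin hypercube sample, runs a made-up model, and ranks parameters by the Spearman rank correlation between each input and the output, one common sensitivity measure; the model, bounds and sample size are assumptions made for the example, and this is not SaSAT code.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    def latin_hypercube(n_samples, bounds, seed=0):
        """Simple Latin hypercube sample: one stratified, shuffled draw per parameter."""
        rng = np.random.default_rng(seed)
        n_params = len(bounds)
        # One value per stratum for each parameter, then shuffle each column independently
        u = (rng.uniform(size=(n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
        for j in range(n_params):
            rng.shuffle(u[:, j])
        lo = np.array([b[0] for b in bounds])
        hi = np.array([b[1] for b in bounds])
        return lo + u * (hi - lo)

    # Hypothetical three-parameter model; rank correlations indicate parameter influence
    samples = latin_hypercube(200, bounds=[(0.1, 1.0), (0.0, 5.0), (1.0, 3.0)])
    output = samples[:, 0] * samples[:, 1]**2 + 0.1 * samples[:, 2]
    for j in range(samples.shape[1]):
        rho, _ = spearmanr(samples[:, j], output)
        print(f"parameter {j}: Spearman rank correlation = {rho:.2f}")
    ```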

  18. The growing and glowing toolbox of fluorescent and photoactive proteins

    PubMed Central

    Rodriguez, Erik A.; Campbell, Robert E.; Lin, John Y.; Lin, Michael Z.; Miyawaki, Atsushi; Palmer, Amy E.; Shu, Xiaokun; Zhang, Jin

    2016-01-01

    Over the past 20 years, protein engineering has been extensively used to improve and modify the fundamental properties of fluorescent proteins (FPs) with the goal of adapting them for a fantastic range of applications. FPs have been modified by a combination of rational design, structure-based mutagenesis, and countless cycles of directed evolution (gene diversification followed by selection of clones with desired properties) that have collectively pushed the properties to photophysical and biochemical extremes. In this review, we attempt to provide both a summary of the progress that has been made during the past two decades, and a broad overview of the current state of FP development and applications in mammalian systems. PMID:27814948

  19. Advances in Engineering the Fly Genome with the CRISPR-Cas System

    PubMed Central

    Bier, Ethan; Harrison, Melissa M.; O’Connor-Giles, Kate M.; Wildonger, Jill

    2018-01-01

    Drosophila has long been a premier model for the development and application of cutting-edge genetic approaches. The CRISPR-Cas system now adds the ability to manipulate the genome with ease and precision, providing a rich toolbox to interrogate relationships between genotype and phenotype, to delineate and visualize how the genome is organized, to illuminate and manipulate RNA, and to pioneer new gene drive technologies. Myriad transformative approaches have already originated from the CRISPR-Cas system, which will likely continue to spark the creation of tools with diverse applications. Here, we provide an overview of how CRISPR-Cas gene editing has revolutionized genetic analysis in Drosophila and highlight key areas for future advances. PMID:29301946

  20. Current and future prospects for CRISPR-based tools in bacteria.

    PubMed

    Luo, Michelle L; Leenay, Ryan T; Beisel, Chase L

    2016-05-01

    CRISPR-Cas systems have rapidly transitioned from intriguing prokaryotic defense systems to powerful and versatile biomolecular tools. This article reviews how these systems have been translated into technologies to manipulate bacterial genetics, physiology, and communities. Recent applications in bacteria have centered on multiplexed genome editing, programmable gene regulation, and sequence-specific antimicrobials, while future applications can build on advances in eukaryotes, the rich natural diversity of CRISPR-Cas systems, and the untapped potential of CRISPR-based DNA acquisition. Overall, these systems have formed the basis of an ever-expanding genetic toolbox and hold tremendous potential for our future understanding and engineering of the bacterial world. © 2015 Wiley Periodicals, Inc.
