Sample records for response protocol toolbox

  1. RESPONSE PROTOCOL TOOLBOX: OVERVIEW, STATUS UPDATE, AND RELATIONSHIP TO OTHER WATER SECURITY PRODUCTS

    EPA Science Inventory

    The Response Protocol Toolbox was released by USEPA to address the complex, multi-faceted challenges of a water utility's planning and response to the threat or act of intentional contamination of drinking water (1). The Toolbox contains guidance that may be adopted voluntarily,...

  2. RESPONSE PROTOCOL TOOLBOX: OVERVIEW, STATUS UPDATE, AND RELATIONSHIP TO OTHER WATER SECURITY PRODUCTS

    EPA Science Inventory

    The Response Protocol Toolbox was released by USEPA to address the complex, multi-faceted challenges of a water utility's planning and response to the threat or act of intentional contamination of drinking water (1). The Toolbox contains guidance that may be adopted voluntarily, a...

  3. RESPONSE PROTOCOL TOOLBOX OVERVIEW, STATUS UPDATE, AND RELATIONSHIP TO OTHER WATER SECURITY PRODUCTS

    EPA Science Inventory

    The Response Protocol Toolbox was released by USEPA to address the complex, multi-faceted challenges of a water utility's planning and response to the threat or act of intentional contamination of drinking water (1). The Toolbox contains guidance that may be adopted voluntarily,...

  4. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO CONTAMINATION THREATS TO DRINKING WATER SYSTEMS

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  5. LAB ANALYSIS OF EMERGENCY WATER SAMPLES CONTAINING UNKNOWN CONTAMINANTS: CONSIDERATIONS FROM THE USEPA RESPONSE PROTOCOL TOOLBOX

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  6. EPA RESPONSE PROTOCOL TOOLBOX TO HELP EVALUATION OF CONTAMINATION THREATS & RESPONDING TO THREATS: MODULE 1-WATER UTILITY PLANNING GUIDE

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  7. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS. OVERVIEW AND APPLICATION. INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  8. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS. MODULE 4: ANALYTICAL GUIDE. INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  9. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS. MODULE 1: WATER UTILITIES PLANNING GUIDE - INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  10. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS, MODULE 3: SITE CHARACTERIZATION AND SAMPLING GUIDE. INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  11. ROLE OF SOURCE WATER PROTECTION IN PLANNING FOR AND RESPONDING TO CONTAMINATION THREATS TO DRINKING WATER SYSTEMS

    EPA Science Inventory

    EPA has developed a "Response Protocol Toolbox" to address the complex, multi-faceted challenges of planning and response to intentional contamination of drinking water (http://www.epa.gov/safewater/security/ertools.html#toolbox). The toolbox is designed to be applied by a numbe...

  12. EPA EMERGENCY PLANNING TOOLBOX

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  13. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling

    PubMed Central

    Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078

  14. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling.

    PubMed

    Lareo, Angel; Forlim, Caroline G; Pinto, Reynaldo D; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox.
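
    The core idea above, representing a recorded signal as a sequence of binary events and delivering a stimulus whenever a predefined code appears, can be illustrated offline in a few lines of Python. The sketch below is not the authors' real-time toolbox; the threshold, the target code, and the stimulate callback are placeholders.

```python
# Minimal offline sketch of code-driven stimulation (not the authors' real-time
# toolbox): a signal is binarized into events, and a stimulus is "delivered"
# whenever a predefined code (bit pattern) appears in the event sequence.
from collections import deque

def binarize(signal, threshold):
    """Turn a sampled signal into a sequence of binary events (1 = event)."""
    return [1 if x > threshold else 0 for x in signal]

def code_driven_stimulation(events, code, stimulate):
    """Call stimulate(index) each time the trailing window matches `code`."""
    window = deque(maxlen=len(code))
    triggers = []
    for i, e in enumerate(events):
        window.append(e)
        if list(window) == list(code):
            stimulate(i)
            triggers.append(i)
    return triggers

if __name__ == "__main__":
    signal = [0.1, 0.9, 0.2, 0.8, 0.7, 0.1, 0.9, 0.8, 0.9]
    events = binarize(signal, threshold=0.5)    # e.g. [0, 1, 0, 1, 1, 0, 1, 1, 1]
    code = (1, 0, 1)                            # hypothetical target code
    hits = code_driven_stimulation(events, code,
                                   lambda i: print(f"stimulus at sample {i}"))
    print("trigger indices:", hits)
```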

  15. Pointing System Simulation Toolbox with Application to a Balloon Mission Simulator

    NASA Technical Reports Server (NTRS)

    Maringolo Baldraco, Rosana M.; Aretskin-Hariton, Eliot D.; Swank, Aaron J.

    2017-01-01

    The development of attitude estimation and pointing-control algorithms is necessary in order to achieve high-fidelity modeling for a Balloon Mission Simulator (BMS). A pointing system simulation toolbox was developed to enable this. The toolbox consists of a star-tracker (ST) and Inertial Measurement Unit (IMU) signal generator, a UDP (User Datagram Protocol) communication layer (bridge), and an indirect-multiplicative extended Kalman filter (imEKF). This document describes the Python toolbox developed and the results of its implementation in the imEKF.
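
    As a rough illustration of the UDP bridge component mentioned above, the Python sketch below packs simulated IMU-style samples into datagrams and reads them back on a local socket. The packet layout and port number are invented for the example and do not reflect the toolbox's actual protocol.

```python
# Hedged sketch of a UDP "bridge" carrying simulated sensor samples, loosely in
# the spirit of the toolbox described above (field layout and port are made up
# for illustration; the real toolbox's packet format is not shown here).
import socket
import struct
import time

PACKET = struct.Struct("<d3d3d")   # timestamp, 3 gyro rates, 3 accelerations
ADDR = ("127.0.0.1", 50007)        # hypothetical local port

def send_sample(sock, t, gyro, accel):
    sock.sendto(PACKET.pack(t, *gyro, *accel), ADDR)

def receive_sample(sock):
    data, _ = sock.recvfrom(PACKET.size)
    values = PACKET.unpack(data)
    return values[0], values[1:4], values[4:7]

if __name__ == "__main__":
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(ADDR)
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_sample(tx, time.time(), (0.01, -0.02, 0.0), (0.0, 0.0, 9.81))
    t, gyro, accel = receive_sample(rx)
    print(f"t={t:.3f}  gyro={gyro}  accel={accel}")
```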

  16. WEC Design Response Toolbox v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey

    2016-03-30

    The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.
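
    A generic flavor of the short-term extreme-response analysis the toolbox supports can be sketched with SciPy: fit an extreme-value distribution to per-realization maxima of a simulated response and read off a low-exceedance-probability level. This illustrates the statistical idea only, not the WDRT API; the simulated data and the 1% level are arbitrary.

```python
# Generic illustration (not the WDRT API) of a short-term extreme-response
# estimate: fit a Gumbel distribution to per-realization maxima of a simulated
# WEC response and read off the level with a 1% exceedance probability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Pretend each row is one simulation of a response channel (e.g. PTO force).
n_sims, n_samples = 200, 3600
response = rng.normal(0.0, 1.0, size=(n_sims, n_samples)).cumsum(axis=1) * 0.01

block_maxima = response.max(axis=1)                 # one extreme per realization
loc, scale = stats.gumbel_r.fit(block_maxima)       # fit extreme-value model
x99 = stats.gumbel_r.ppf(0.99, loc=loc, scale=scale)

print(f"fitted Gumbel: loc={loc:.3f}, scale={scale:.3f}")
print(f"response with 1% short-term exceedance probability: {x99:.3f}")
```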

  17. OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.

    PubMed

    Purvis, Lucian A B; Clarke, William T; Biasiolli, Luca; Valkovič, Ladislav; Robson, Matthew D; Rodgers, Christopher T

    2017-01-01

    In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or for specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.

  18. A Molecular Toolbox to Engineer Site-Specific DNA Replication Perturbation.

    PubMed

    Larsen, Nicolai B; Hickson, Ian D; Mankouri, Hocine W

    2018-01-01

    Site-specific arrest of DNA replication is a useful tool for analyzing cellular responses to DNA replication perturbation. The E. coli Tus-Ter replication barrier can be reconstituted in eukaryotic cells as a system to engineer an unscheduled collision between a replication fork and an "alien" impediment to DNA replication. To further develop this system as a versatile tool, we describe a set of reagents and a detailed protocol that can be used to engineer Tus-Ter barriers into any locus in the budding yeast genome. Because the Tus-Ter complex is a bipartite system with intrinsic DNA replication-blocking activity, the reagents and protocols developed and validated in yeast could also be optimized to engineer site-specific replication fork barriers into other eukaryotic cell types.

  19. Real-time Electrophysiology: Using Closed-loop Protocols to Probe Neuronal Dynamics and Beyond

    PubMed Central

    Linaro, Daniele; Couto, João; Giugliano, Michele

    2015-01-01

    Experimental neuroscience is witnessing an increased interest in the development and application of novel and often complex, closed-loop protocols, where the stimulus applied depends in real-time on the response of the system. Recent applications range from the implementation of virtual reality systems for studying motor responses both in mice [1] and in zebrafish [2], to control of seizures following cortical stroke using optogenetics [3]. A key advantage of closed-loop techniques resides in the capability of probing higher dimensional properties that are not directly accessible or that depend on multiple variables, such as neuronal excitability [4] and reliability, while at the same time maximizing the experimental throughput. In this contribution and in the context of cellular electrophysiology, we describe how to apply a variety of closed-loop protocols to the study of the response properties of pyramidal cortical neurons, recorded intracellularly with the patch clamp technique in acute brain slices from the somatosensory cortex of juvenile rats. As no commercially available or open source software provides all the features required for efficiently performing the experiments described here, a new software toolbox called LCG [5] was developed, whose modular structure maximizes reuse of computer code and facilitates the implementation of novel experimental paradigms. Stimulation waveforms are specified using a compact meta-description and full experimental protocols are described in text-based configuration files. Additionally, LCG has a command-line interface that is suited for repetition of trials and automation of experimental protocols. PMID:26132434
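
    The sketch below illustrates the general pattern of driving stimulation from a plain-text protocol description: a tiny, invented configuration format (it is not LCG's actual stimulus meta-description) is parsed into a sampled waveform that could then be played out by an acquisition system.

```python
# Sketch of the general idea behind text-configured stimulation protocols.
# The mini-format below is invented for illustration and is NOT LCG's actual
# meta-description: each line gives "duration_s kind amplitude".
import numpy as np

CONFIG = """\
0.5 dc 0.0
1.0 dc 150.0
0.5 noise 50.0
"""

def build_waveform(config_text, srate=20000.0, seed=0):
    rng = np.random.default_rng(seed)
    chunks = []
    for line in config_text.strip().splitlines():
        dur, kind, amp = line.split()
        n = int(float(dur) * srate)
        if kind == "dc":
            chunks.append(np.full(n, float(amp)))
        elif kind == "noise":
            chunks.append(rng.normal(0.0, float(amp), n))
        else:
            raise ValueError(f"unknown segment kind: {kind}")
    return np.concatenate(chunks)

if __name__ == "__main__":
    wave = build_waveform(CONFIG)        # e.g. a current command in pA
    print(wave.shape, wave[:5])
```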

  20. Adding tools to the open source toolbox: The Internet

    NASA Technical Reports Server (NTRS)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  1. The Multivariate Temporal Response Function (mTRF) Toolbox: A MATLAB Toolbox for Relating Neural Signals to Continuous Stimuli.

    PubMed

    Crosse, Michael J; Di Liberto, Giovanni M; Bednar, Adam; Lalor, Edmund C

    2016-01-01

    Understanding how brains process sensory signals in natural environments is one of the key goals of twenty-first century neuroscience. While brain imaging and invasive electrophysiology will play key roles in this endeavor, there is also an important role to be played by noninvasive, macroscopic techniques with high temporal resolution such as electro- and magnetoencephalography. But challenges exist in determining how best to analyze such complex, time-varying neural responses to complex, time-varying and multivariate natural sensory stimuli. There has been a long history of applying system identification techniques to relate the firing activity of neurons to complex sensory stimuli and such techniques are now seeing increased application to EEG and MEG data. One particular example involves fitting a filter-often referred to as a temporal response function-that describes a mapping between some feature(s) of a sensory stimulus and the neural response. Here, we first briefly review the history of these system identification approaches and describe a specific technique for deriving temporal response functions known as regularized linear regression. We then introduce a new open-source toolbox for performing this analysis. We describe how it can be used to derive (multivariate) temporal response functions describing a mapping between stimulus and response in both directions. We also explain the importance of regularizing the analysis and how this regularization can be optimized for a particular dataset. We then outline specifically how the toolbox implements these analyses and provide several examples of the types of results that the toolbox can produce. Finally, we consider some of the limitations of the toolbox and opportunities for future development and application.

  2. The Multivariate Temporal Response Function (mTRF) Toolbox: A MATLAB Toolbox for Relating Neural Signals to Continuous Stimuli

    PubMed Central

    Crosse, Michael J.; Di Liberto, Giovanni M.; Bednar, Adam; Lalor, Edmund C.

    2016-01-01

    Understanding how brains process sensory signals in natural environments is one of the key goals of twenty-first century neuroscience. While brain imaging and invasive electrophysiology will play key roles in this endeavor, there is also an important role to be played by noninvasive, macroscopic techniques with high temporal resolution such as electro- and magnetoencephalography. But challenges exist in determining how best to analyze such complex, time-varying neural responses to complex, time-varying and multivariate natural sensory stimuli. There has been a long history of applying system identification techniques to relate the firing activity of neurons to complex sensory stimuli and such techniques are now seeing increased application to EEG and MEG data. One particular example involves fitting a filter—often referred to as a temporal response function—that describes a mapping between some feature(s) of a sensory stimulus and the neural response. Here, we first briefly review the history of these system identification approaches and describe a specific technique for deriving temporal response functions known as regularized linear regression. We then introduce a new open-source toolbox for performing this analysis. We describe how it can be used to derive (multivariate) temporal response functions describing a mapping between stimulus and response in both directions. We also explain the importance of regularizing the analysis and how this regularization can be optimized for a particular dataset. We then outline specifically how the toolbox implements these analyses and provide several examples of the types of results that the toolbox can produce. Finally, we consider some of the limitations of the toolbox and opportunities for future development and application. PMID:27965557
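
    The regularized linear regression underlying a temporal response function can be written compactly in numpy: build a matrix of time-lagged copies of the stimulus and solve a ridge-penalized least-squares problem for the filter weights. The sketch below illustrates the idea with a toy kernel and an arbitrary regularization parameter; it is not the mTRF Toolbox's MATLAB interface.

```python
# Minimal numpy sketch of the regularized (ridge) regression behind a temporal
# response function: build a lagged stimulus design matrix and solve for the
# filter mapping stimulus to response. This mirrors the idea, not the mTRF API.
import numpy as np

def lagged_design(stim, lags):
    """Stack time-lagged copies of a 1-D stimulus into a design matrix."""
    n = len(stim)
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        if lag >= 0:
            X[lag:, j] = stim[:n - lag]
        else:
            X[:lag, j] = stim[-lag:]
    return X

def ridge_trf(stim, resp, lags, lam=1.0):
    X = lagged_design(stim, lags)
    XtX = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ resp)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 5000
    stim = rng.normal(size=n)
    true_trf = np.exp(-np.arange(20) / 5.0)               # toy 20-tap kernel
    resp = np.convolve(stim, true_trf)[:n] + 0.5 * rng.normal(size=n)
    w = ridge_trf(stim, resp, lags=range(20), lam=10.0)
    print("recovered kernel (first 5 taps):", np.round(w[:5], 2))
```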

  3. Response Protocol Toolbox: Planning for and Responding to Drinking Water Contamination Threats and Incidents. Response Guidelines

    DTIC Science & Technology

    2004-08-01

    Interim Final - August 2004. Only fragments of the record text are recoverable; they come from a sampling and analysis table referring to specific radionuclides (900 Series), injection and headspace sample preparation (see Module 4), analysis by GC/MS, GC, HPLC, and LC-MS, and unknown inorganics.

  4. Water Power Data and Tools | Water Power | NREL

    Science.gov Websites

    NREL's water power pages combine computer modeling tools and data with state-of-the-art design and analysis. Resources include the National Wind Technology Center's Information Portal as well as a WEC-Sim fact sheet. The WEC Design Response Toolbox provides extreme response and fatigue analysis tools specifically...

  5. TopoToolbox: using sensor topography to calculate psychologically meaningful measures from event-related EEG/MEG.

    PubMed

    Tian, Xing; Poeppel, David; Huber, David E

    2011-01-01

    The open-source toolbox "TopoToolbox" is a suite of functions that use sensor topography to calculate psychologically meaningful measures (similarity, magnitude, and timing) from multisensor event-related EEG and MEG data. Using a GUI and data visualization, TopoToolbox can be used to calculate and test the topographic similarity between different conditions (Tian and Huber, 2008). This topographic similarity indicates whether different conditions involve a different distribution of underlying neural sources. Furthermore, this similarity calculation can be applied at different time points to discover when a response pattern emerges (Tian and Poeppel, 2010). Because the topographic patterns are obtained separately for each individual, these patterns are used to produce reliable measures of response magnitude that can be compared across individuals using conventional statistics (Davelaar et al. Submitted and Huber et al., 2008). TopoToolbox can be freely downloaded. It runs under MATLAB (The MathWorks, Inc.) and supports user-defined data structure as well as standard EEG/MEG data import using EEGLAB (Delorme and Makeig, 2004).
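
    The notion of topographic similarity can be illustrated with a simple angle (cosine) measure between sensor patterns, which is insensitive to overall response magnitude. The Python sketch below uses simulated 64-sensor topographies and is only a conceptual stand-in for the toolbox's angle test and statistics.

```python
# Illustration of the core idea of topographic similarity (not TopoToolbox's
# implementation): compare the pattern of sensor values across two conditions
# with a cosine (angle) measure that ignores overall response magnitude.
import numpy as np

def topographic_similarity(pattern_a, pattern_b):
    """Cosine similarity between two sensor topographies (1-D arrays)."""
    return float(np.dot(pattern_a, pattern_b) /
                 (np.linalg.norm(pattern_a) * np.linalg.norm(pattern_b)))

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    base = rng.normal(size=64)                        # 64-sensor topography
    cond_a = 1.0 * base + 0.1 * rng.normal(size=64)
    cond_b = 2.5 * base + 0.1 * rng.normal(size=64)   # same sources, larger response
    cond_c = rng.normal(size=64)                      # different source configuration
    print("A vs B (same topography):     ", round(topographic_similarity(cond_a, cond_b), 3))
    print("A vs C (different topography):", round(topographic_similarity(cond_a, cond_c), 3))
```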

  6. The MONGOOSE Rational Arithmetic Toolbox.

    PubMed

    Le, Christopher; Chindelevitch, Leonid

    2018-01-01

    The modeling of metabolic networks has seen a rapid expansion following the complete sequencing of thousands of genomes. The constraint-based modeling framework has emerged as one of the most popular approaches to reconstructing and analyzing genome-scale metabolic models. Its main assumption is that of a quasi-steady-state, requiring that the production of each internal metabolite be balanced by its consumption. However, due to the multiscale nature of the models, the large number of reactions and metabolites, and the use of floating-point arithmetic for the stoichiometric coefficients, ensuring that this assumption holds can be challenging. The MONGOOSE toolbox addresses this problem by using rational arithmetic, thus ensuring that models are analyzed in a reproducible manner and consistently with modeling assumptions. In this chapter we present a protocol for the complete analysis of a metabolic network model using the MONGOOSE toolbox, via its newly developed GUI, and describe how it can be used as a model-checking platform both during and after the model construction process.
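
    The motivation for rational arithmetic in the steady-state check can be seen in a toy example: with exact fractions the balance S v = 0 either holds or it does not, whereas the floating-point version leaves small round-off residuals that must be judged against an arbitrary tolerance. The sketch below uses an invented three-reaction network, not MONGOOSE's code.

```python
# Toy illustration of why exact fractions help the quasi-steady-state check.
from fractions import Fraction as F

# Stoichiometric matrix (metabolites x reactions) and a candidate flux vector.
S = [
    [F(1),  F(1),  F(-1)],   # metabolite A: produced by r1 and r2, consumed by r3
    [F(-1), F(-1), F(1)],    # metabolite B: consumed by r1 and r2, produced by r3
]
v = [F(1, 10), F(1, 5), F(3, 10)]

def residuals(S, v):
    return [sum(row[j] * v[j] for j in range(len(v))) for row in S]

print("exact residuals:", residuals(S, v))      # [0, 0] -> exactly balanced

Sf = [[float(x) for x in row] for row in S]
vf = [float(x) for x in v]
print("float residuals:", residuals(Sf, vf))    # tiny round-off, e.g. ~5.6e-17
```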

  7. Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system

    NASA Astrophysics Data System (ADS)

    Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong

    2011-06-01

    Ladar system simulation uses computer modeling of ladar components to predict the performance of an imaging ladar system. This paper reviews developments in imaging ladar simulation in domestic and international research and surveys computer simulation studies of ladar systems for different application requirements. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. Domestic research on imaging ladar simulation has so far been limited in scale and non-unified in design, mostly implementing simple functional simulations based on ladar ranging equations. A laser imaging radar simulation with an open, modularized structure is therefore proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings, and system controller. A unified Matlab toolbox and standard control modules have been built with regulated function inputs and outputs and defined communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system observing a space shuttle was performed with the toolbox. The results show that the toolbox models and parameter settings can simulate the actual detection process precisely. The unified control module and predefined parameter settings simplify the simulation of imaging ladar detection, and the open, modular structure allows the toolbox to be adapted to specialized requirements.

  8. Wavefront Control Toolbox for James Webb Space Telescope Testbed

    NASA Technical Reports Server (NTRS)

    Shiri, Ron; Aronstein, David L.; Smith, Jeffery Scott; Dean, Bruce H.; Sabatke, Erin

    2007-01-01

    We have developed a Matlab toolbox for wavefront control of optical systems. We have applied this toolbox to the optical models of James Webb Space Telescope (JWST) in general and to the JWST Testbed Telescope (TBT) in particular, implementing both unconstrained and constrained wavefront optimization to correct for possible misalignments present on the segmented primary mirror or the monolithic secondary mirror. The optical models implemented in Zemax optical design program and information is exchanged between Matlab and Zemax via the Dynamic Data Exchange (DDE) interface. The model configuration is managed using the XML protocol. The optimization algorithm uses influence functions for each adjustable degree of freedom of the optical mode. The iterative and non-iterative algorithms have been developed to converge to a local minimum of the root-mean-square (rms) of wavefront error using singular value decomposition technique of the control matrix of influence functions. The toolkit is highly modular and allows the user to choose control strategies for the degrees of freedom to be adjusted on a given iteration and wavefront convergence criterion. As the influence functions are nonlinear over the control parameter space, the toolkit also allows for trade-offs between frequency of updating the local influence functions and execution speed. The functionality of the toolbox and the validity of the underlying algorithms have been verified through extensive simulations.
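
    The linear-algebra step described above, solving for alignment or actuator commands from a matrix of influence functions via singular value decomposition, can be sketched in numpy as follows. The dimensions, noise level, and truncation threshold are illustrative only and are unrelated to the actual JWST models.

```python
# Hedged sketch of SVD-based wavefront correction: given influence functions
# (wavefront change per unit actuator move), find commands that minimize the
# rms wavefront error via a truncated-SVD pseudoinverse.
import numpy as np

rng = np.random.default_rng(3)
n_pix, n_act = 500, 12                       # wavefront samples, adjustable DOFs

A = rng.normal(size=(n_pix, n_act))          # influence-function matrix
true_cmd = rng.normal(size=n_act)
wavefront = A @ true_cmd + 0.01 * rng.normal(size=n_pix)   # measured error

# Drop singular values below a relative threshold to avoid amplifying noise
# along poorly sensed directions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
keep = s > 1e-3 * s[0]
correction = -(Vt[keep].T @ ((U[:, keep].T @ wavefront) / s[keep]))

residual = wavefront + A @ correction
print(f"rms before: {np.sqrt(np.mean(wavefront**2)):.4f}")
print(f"rms after:  {np.sqrt(np.mean(residual**2)):.4f}")
```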

  9. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  10. Tunable axial gauge fields in engineered Weyl semimetals: semiclassical analysis and optical lattice implementations

    NASA Astrophysics Data System (ADS)

    Roy, Sthitadhi; Kolodrubetz, Michael; Goldman, Nathan; Grushin, Adolfo G.

    2018-04-01

    In this work, we describe a toolbox to realize and probe synthetic axial gauge fields in engineered Weyl semimetals. These synthetic electromagnetic fields, which are sensitive to the chirality associated with Weyl nodes, emerge due to spatially and temporally dependent shifts of the corresponding Weyl momenta. First, we introduce two realistic models, inspired by recent cold-atom developments, which are particularly suitable for the exploration of these synthetic axial gauge fields. Second, we describe how to realize and measure the effects of such axial fields through center-of-mass observables, based on semiclassical equations of motion and exact numerical simulations. In particular, we suggest realistic protocols to reveal an axial Hall response due to the axial electric field...

  11. Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox

    PubMed Central

    Spee, Ton; Gillen, Matt; Lentz, Thomas J.; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-01-01

    Objectives This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Methods Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. Results This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. Conclusion The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions. PMID:22953194

  12. Review of qualitative approaches for the construction industry: designing a risk management toolbox.

    PubMed

    Zalk, David M; Spee, Ton; Gillen, Matt; Lentz, Thomas J; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-06-01

    This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions.

  13. A Protocol for Safe Lithiation Reactions Using Organolithium Reagents

    PubMed Central

    Gau, Michael R.; Zdilla, Michael J.

    2016-01-01

    Organolithium reagents are powerful tools in the synthetic chemist's toolbox. However, the extreme pyrophoric nature of the most reactive reagents warrants proper technique, thorough training, and proper personal protective equipment. To aid in the training of researchers using organolithium reagents, a thorough, step-by-step protocol for the safe and effective use of tert-butyllithium on an inert gas line or within a glovebox is described. As a model reaction, preparation of lithium tert-butyl amide by the reaction of tert-butyl amine with one equivalent of tert-butyl lithium is presented. PMID:27911386

  14. The ROC Toolbox: A toolbox for analyzing receiver-operating characteristics derived from confidence ratings.

    PubMed

    Koen, Joshua D; Barrett, Frederick S; Harlow, Iain M; Yonelinas, Andrew P

    2017-08-01

    Signal-detection theory, and the analysis of receiver-operating characteristics (ROCs), has played a critical role in the development of theories of episodic memory and perception. The purpose of the current paper is to present the ROC Toolbox. This toolbox is a set of functions written in the Matlab programming language that can be used to fit various common signal detection models to ROC data obtained from confidence rating experiments. The goals for developing the ROC Toolbox were to create a tool (1) that is easy to use and easy for researchers to implement with their own data, (2) that can flexibly define models based on varying study parameters, such as the number of response options (e.g., confidence ratings) and experimental conditions, and (3) that provides optimal routines (e.g., Maximum Likelihood estimation) to obtain parameter estimates and numerous goodness-of-fit measures. The ROC toolbox allows for various different confidence scales and currently includes the models commonly used in recognition memory and perception: (1) the unequal variance signal detection (UVSD) model, (2) the dual process signal detection (DPSD) model, and (3) the mixture signal detection (MSD) model. For each model fit to a given data set the ROC toolbox plots summary information about the best fitting model parameters and various goodness-of-fit measures. Here, we present an overview of the ROC Toolbox, illustrate how it can be used to input and analyse real data, and finish with a brief discussion on features that can be added to the toolbox.
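
    For readers unfamiliar with the input to such models, the short Python sketch below shows how confidence-rating counts are turned into cumulative ROC points and computes an equal-variance d' at one operating point. It uses made-up counts and is not the ROC Toolbox's Maximum Likelihood fitting routine.

```python
# Toy illustration of the data behind confidence-rating ROCs (not the toolbox's
# fitting code): cumulative hit/false-alarm rates per criterion, plus an
# equal-variance signal-detection d' at one operating point.
import numpy as np
from scipy.stats import norm

# Counts per confidence bin, ordered from "sure old" to "sure new"
# (6-point scale), for old (target) and new (lure) items.
old_counts = np.array([120, 60, 30, 20, 10, 10])
new_counts = np.array([ 15, 25, 40, 50, 60, 60])

hit_rates = np.cumsum(old_counts) / old_counts.sum()
fa_rates  = np.cumsum(new_counts) / new_counts.sum()
print("ROC points (FA, hit):")
for fa, hit in zip(fa_rates[:-1], hit_rates[:-1]):   # last point is always (1, 1)
    print(f"  ({fa:.3f}, {hit:.3f})")

# Equal-variance d' at the middle operating point (bins 1-3 treated as "old").
hr, far = hit_rates[2], fa_rates[2]
d_prime = norm.ppf(hr) - norm.ppf(far)
print(f"d' at middle criterion: {d_prime:.3f}")
```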

  15. The Schultz MIDI Benchmarking Toolbox for MIDI interfaces, percussion pads, and sound cards.

    PubMed

    Schultz, Benjamin G

    2018-04-17

    The Musical Instrument Digital Interface (MIDI) was readily adopted for auditory sensorimotor synchronization experiments. These experiments typically use MIDI percussion pads to collect responses, a MIDI-USB converter (or MIDI-PCI interface) to record responses on a PC and manipulate feedback, and an external MIDI sound module to generate auditory feedback. Previous studies have suggested that auditory feedback latencies can be introduced by these devices. The Schultz MIDI Benchmarking Toolbox (SMIDIBT) is an open-source, Arduino-based package designed to measure the point-to-point latencies incurred by several devices used in the generation of response-triggered auditory feedback. Experiment 1 showed that MIDI messages are sent and received within 1 ms (on average) in the absence of any external MIDI device. Latencies decreased when the baud rate increased above the MIDI protocol default (31,250 bps). Experiment 2 benchmarked the latencies introduced by different MIDI-USB and MIDI-PCI interfaces. MIDI-PCI was superior to MIDI-USB, primarily because MIDI-USB is subject to USB polling. Experiment 3 tested three MIDI percussion pads. Both the audio and MIDI message latencies were significantly greater than 1 ms for all devices, and there were significant differences between percussion pads and instrument patches. Experiment 4 benchmarked four MIDI sound modules. Audio latencies were significantly greater than 1 ms, and there were significant differences between sound modules and instrument patches. These experiments suggest that millisecond accuracy might not be achievable with MIDI devices. The SMIDIBT can be used to benchmark a range of MIDI devices, thus allowing researchers to make informed decisions when choosing testing materials and to arrive at an acceptable latency at their discretion.

  16. TopoToolbox: Using Sensor Topography to Calculate Psychologically Meaningful Measures from Event-Related EEG/MEG

    PubMed Central

    Tian, Xing; Poeppel, David; Huber, David E.

    2011-01-01

    The open-source toolbox “TopoToolbox” is a suite of functions that use sensor topography to calculate psychologically meaningful measures (similarity, magnitude, and timing) from multisensor event-related EEG and MEG data. Using a GUI and data visualization, TopoToolbox can be used to calculate and test the topographic similarity between different conditions (Tian and Huber, 2008). This topographic similarity indicates whether different conditions involve a different distribution of underlying neural sources. Furthermore, this similarity calculation can be applied at different time points to discover when a response pattern emerges (Tian and Poeppel, 2010). Because the topographic patterns are obtained separately for each individual, these patterns are used to produce reliable measures of response magnitude that can be compared across individuals using conventional statistics (Davelaar et al. Submitted and Huber et al., 2008). TopoToolbox can be freely downloaded. It runs under MATLAB (The MathWorks, Inc.) and supports user-defined data structure as well as standard EEG/MEG data import using EEGLAB (Delorme and Makeig, 2004). PMID:21577268

  17. Building Interdisciplinary Research Models Through Interactive Education.

    PubMed

    Hessels, Amanda J; Robinson, Brian; O'Rourke, Michael; Begg, Melissa D; Larson, Elaine L

    2015-12-01

    Critical interdisciplinary research skills include effective communication with diverse disciplines and cultivating collaborative relationships. Acquiring these skills during graduate education may foster future interdisciplinary research quality and productivity. The project aim was to develop and evaluate an interactive Toolbox workshop approach within an interprofessional graduate level course to enhance student learning and skill in interdisciplinary research. We sought to examine the student experience of integrating the Toolbox workshop in modular format over the duration of a 14-week course. The Toolbox Health Sciences Instrument includes six modules that were introduced in a 110-minute dialogue session during the first class and then integrated into the course in a series of six individual workshops in three phases over the course of the semester. Seventeen students participated; the majority were nursing students. Three measures were used to assess project outcomes: pre-post intervention Toolbox survey, competency self-assessment, and a postcourse survey. All measures indicated the objectives were met by a change in survey responses, improved competencies, and favorable experience of the Toolbox modular intervention. Our experience indicates that incorporating this Toolbox modular approach into research curricula can enhance individual level scientific capacity, future interdisciplinary research project success, and ultimately impact on practice and policy. © 2015 Wiley Periodicals, Inc.

  18. MatTAP: A MATLAB toolbox for the control and analysis of movement synchronisation experiments.

    PubMed

    Elliott, Mark T; Welchman, Andrew E; Wing, Alan M

    2009-02-15

    Investigating movement timing and synchronisation at the sub-second range relies on an experimental setup that has high temporal fidelity, is able to deliver output cues and can capture corresponding responses. Modern, multi-tasking operating systems make this increasingly challenging when using standard PC hardware and programming languages. This paper describes a new free suite of tools (available from http://www.snipurl.com/mattap) for use within the MATLAB programming environment, compatible with Microsoft Windows and a range of data acquisition hardware. The toolbox allows flexible generation of timing cues with high temporal accuracy, the capture and automatic storage of corresponding participant responses and an integrated analysis module for the rapid processing of results. A simple graphical user interface is used to navigate the toolbox and so can be operated easily by users not familiar with programming languages. However, it is also fully extensible and customisable, allowing adaptation for individual experiments and facilitating the addition of new modules in future releases. Here we discuss the relevance of the MatTAP (MATLAB Timing Analysis Package) toolbox to current timing experiments and compare its use to alternative methods. We validate the accuracy of the analysis module through comparison to manual observation methods and replicate a previous sensorimotor synchronisation experiment to demonstrate the versatility of the toolbox features demanded by such movement synchronisation paradigms.

  19. Improving student comprehension of the interconnectivity of the hydrologic cycle with a novel 'hydrology toolbox', integrated watershed model, and companion textbook

    NASA Astrophysics Data System (ADS)

    Huning, L. S.; Margulis, S. A.

    2013-12-01

    Concepts in introductory hydrology courses are often taught in the context of process-based modeling that ultimately is integrated into a watershed model. In an effort to reduce the learning curve associated with applying hydrologic concepts to real-world applications, we developed and incorporated a 'hydrology toolbox' that complements a new, companion textbook into introductory undergraduate hydrology courses. The hydrology toolbox contains the basic building blocks (functions coded in MATLAB) for an integrated spatially-distributed watershed model that makes hydrologic topics (e.g. precipitation, snow, radiation, evaporation, unsaturated flow, infiltration, groundwater, and runoff) more user-friendly and accessible for students. The toolbox functions can be used in a modular format so that students can study individual hydrologic processes and become familiar with the hydrology toolbox. This approach allows such courses to emphasize understanding and application of hydrologic concepts rather than computer coding or programming. While topics in introductory hydrology courses are often introduced and taught independently or semi-independently, they are inherently interconnected. These toolbox functions are therefore linked together at the end of the course to reinforce a holistic understanding of how these hydrologic processes are measured, interconnected, and modeled. They are integrated into a spatially-distributed watershed model or numerical laboratory where students can explore a range of topics such as rainfall-runoff modeling, urbanization, deforestation, watershed response to changes in parameters or forcings, etc. Model output can readily be visualized and analyzed by students to understand watershed response in a real river basin or a simple 'toy' basin. These tools complement the textbook, each of which has been well received by students in multiple hydrology courses with various disciplinary backgrounds. The same governing equations that students have studied in the textbook and used in the toolbox have been encapsulated in the watershed model. Therefore, the combination of the hydrology toolbox, integrated watershed model, and textbook tends to eliminate the potential disconnect between process-based modeling and an 'off-the-shelf' watershed model.

  20. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    PubMed

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided by NeoAnalysis, users can easily obtain publication-quality figures without writing complex codes. NeoAnalysis is a powerful and valuable toolbox for users doing electrophysiological experiments.
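
    One of the routine analyses such a toolbox automates is the peri-stimulus time histogram. The self-contained numpy sketch below computes a PSTH from simulated spike and event times; it is an illustration of the analysis, not NeoAnalysis code, and the simulation parameters are arbitrary.

```python
# Small numpy sketch of a peri-stimulus time histogram (PSTH) of spike times
# aligned to trial events. Data are simulated; this is not NeoAnalysis code.
import numpy as np

def psth(spike_times, event_times, window=(-0.2, 0.5), bin_width=0.01):
    """Return bin centers and mean firing rate (Hz) across trials."""
    edges = np.arange(window[0], window[1] + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    for ev in event_times:
        rel = spike_times - ev
        rel = rel[(rel >= window[0]) & (rel < window[1])]
        counts += np.histogram(rel, bins=edges)[0]
    rate = counts / (len(event_times) * bin_width)
    centers = edges[:-1] + bin_width / 2
    return centers, rate

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    events = np.arange(1.0, 61.0, 1.0)                     # one event per second
    spikes = rng.uniform(0, 61, size=3000)                 # background spikes
    evoked = np.concatenate([ev + rng.uniform(0.05, 0.15, 5) for ev in events])
    spikes = np.sort(np.concatenate([spikes, evoked]))     # add an evoked burst
    t, rate = psth(spikes, events)
    print(f"peak rate {rate.max():.1f} Hz at t = {t[rate.argmax()]:.3f} s")
```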

  1. A CRISPR/Cas9 Toolbox for Multiplexed Plant Genome Editing and Transcriptional Regulation.

    PubMed

    Lowder, Levi G; Zhang, Dengwei; Baltes, Nicholas J; Paul, Joseph W; Tang, Xu; Zheng, Xuelian; Voytas, Daniel F; Hsieh, Tzung-Fu; Zhang, Yong; Qi, Yiping

    2015-10-01

    The relative ease, speed, and biological scope of clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated Protein9 (Cas9)-based reagents for genomic manipulations are revolutionizing virtually all areas of molecular biosciences, including functional genomics, genetics, applied biomedical research, and agricultural biotechnology. In plant systems, however, a number of hurdles currently exist that limit this technology from reaching its full potential. For example, significant plant molecular biology expertise and effort is still required to generate functional expression constructs that allow simultaneous editing, and especially transcriptional regulation, of multiple different genomic loci or multiplexing, which is a significant advantage of CRISPR/Cas9 versus other genome-editing systems. To streamline and facilitate rapid and wide-scale use of CRISPR/Cas9-based technologies for plant research, we developed and implemented a comprehensive molecular toolbox for multifaceted CRISPR/Cas9 applications in plants. This toolbox provides researchers with a protocol and reagents to quickly and efficiently assemble functional CRISPR/Cas9 transfer DNA constructs for monocots and dicots using Golden Gate and Gateway cloning methods. It comes with a full suite of capabilities, including multiplexed gene editing and transcriptional activation or repression of plant endogenous genes. We report the functionality and effectiveness of this toolbox in model plants such as tobacco (Nicotiana benthamiana), Arabidopsis (Arabidopsis thaliana), and rice (Oryza sativa), demonstrating its utility for basic and applied plant research. © 2015 American Society of Plant Biologists. All Rights Reserved.

  2. LIMO EEG: a toolbox for hierarchical LInear MOdeling of ElectroEncephaloGraphic data.

    PubMed

    Pernet, Cyril R; Chauveau, Nicolas; Gaspar, Carl; Rousselet, Guillaume A

    2011-01-01

    Magnetic- and electric-evoked brain responses have traditionally been analyzed by comparing the peaks or mean amplitudes of signals from selected channels and averaged across trials. More recently, tools have been developed to investigate single trial response variability (e.g., EEGLAB) and to test differences between averaged evoked responses over the entire scalp and time dimensions (e.g., SPM, Fieldtrip). LIMO EEG is a Matlab toolbox (EEGLAB compatible) to analyse evoked responses over all space and time dimensions, while accounting for single trial variability using a simple hierarchical linear modelling of the data. In addition, LIMO EEG provides robust parametric tests, therefore providing a new and complementary tool in the analysis of neural evoked responses.

  3. LIMO EEG: A Toolbox for Hierarchical LInear MOdeling of ElectroEncephaloGraphic Data

    PubMed Central

    Pernet, Cyril R.; Chauveau, Nicolas; Gaspar, Carl; Rousselet, Guillaume A.

    2011-01-01

    Magnetic- and electric-evoked brain responses have traditionally been analyzed by comparing the peaks or mean amplitudes of signals from selected channels and averaged across trials. More recently, tools have been developed to investigate single trial response variability (e.g., EEGLAB) and to test differences between averaged evoked responses over the entire scalp and time dimensions (e.g., SPM, Fieldtrip). LIMO EEG is a Matlab toolbox (EEGLAB compatible) to analyse evoked responses over all space and time dimensions, while accounting for single trial variability using a simple hierarchical linear modelling of the data. In addition, LIMO EEG provides robust parametric tests, therefore providing a new and complementary tool in the analysis of neural evoked responses. PMID:21403915
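
    The two-level idea, a single-trial GLM per subject followed by a group-level test on the resulting parameter estimates, can be sketched as follows. This is a conceptual numpy/SciPy illustration with simulated data and a plain one-sample t-test at the second level (LIMO EEG itself provides robust parametric tests); it is not the toolbox's implementation.

```python
# Conceptual sketch of hierarchical linear modeling of single-trial EEG:
# first-level OLS per subject at every time point, then a group-level test
# on the condition betas. Simulated data; not LIMO EEG code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n_subjects, n_trials, n_times = 12, 80, 100

group_betas = np.zeros((n_subjects, n_times))
for s in range(n_subjects):
    # Design matrix: intercept + a condition regressor (e.g. face vs. noise).
    condition = rng.integers(0, 2, n_trials)
    X = np.column_stack([np.ones(n_trials), condition])
    # Simulated single-trial EEG at one channel, with an effect at samples 30-60.
    effect = np.zeros(n_times)
    effect[30:60] = 2.0
    Y = condition[:, None] * effect[None, :] + rng.normal(size=(n_trials, n_times))
    # First level: ordinary least squares for all time points at once.
    betas = np.linalg.lstsq(X, Y, rcond=None)[0]      # shape (2, n_times)
    group_betas[s] = betas[1]                         # condition effect per time point

# Second level: one-sample t-test of the condition betas across subjects.
t_vals, p_vals = stats.ttest_1samp(group_betas, 0.0, axis=0)
print("time points with p < 0.001:", np.where(p_vals < 0.001)[0][:10], "...")
```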

  4. DETECT: Detection of Events in Continuous Time Toolbox: User’s Guide, Examples, and Function Reference Documentation

    DTIC Science & Technology

    2013-06-01

    ... benefitting from rapid, automated discrimination of specific predefined signals, and is free-standing (requiring no other plugins or packages). The ... previously labeled dataset, and comparing two labeled datasets. Subject terms: artifact, signal detection, EEG, MATLAB, toolbox.

  5. ObspyDMT: a Python toolbox for retrieving and processing large seismological data sets

    NASA Astrophysics Data System (ADS)

    Hosseini, Kasra; Sigloch, Karin

    2017-10-01

    We present obspyDMT, a free, open-source software toolbox for the query, retrieval, processing and management of seismological data sets, including very large, heterogeneous and/or dynamically growing ones. ObspyDMT simplifies and speeds up user interaction with data centers, in more versatile ways than existing tools. The user is shielded from the complexities of interacting with different data centers and data exchange protocols and is provided with powerful diagnostic and plotting tools to check the retrieved data and metadata. While primarily a productivity tool for research seismologists and observatories, easy-to-use syntax and plotting functionality also make obspyDMT an effective teaching aid. Written in the Python programming language, it can be used as a stand-alone command-line tool (requiring no knowledge of Python) or can be integrated as a module with other Python codes. It facilitates data archiving, preprocessing, instrument correction and quality control - routine but nontrivial tasks that can consume much user time. We describe obspyDMT's functionality, design and technical implementation, accompanied by an overview of its use cases. As an example of a typical problem encountered in seismogram preprocessing, we show how to check for inconsistencies in response files of two example stations. We also demonstrate the fully automated request, remote computation and retrieval of synthetic seismograms from the Synthetics Engine (Syngine) web service of the Data Management Center (DMC) at the Incorporated Research Institutions for Seismology (IRIS).
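
    The kind of request that obspyDMT automates at scale looks roughly like the plain ObsPy snippet below: fetch waveforms from an FDSN data center, remove the instrument response, and save the result. This is ordinary ObsPy usage (it requires network access and the obspy package), not obspyDMT's own command-line interface, and the station and time window are chosen only as an example.

```python
# Hedged ObsPy example of the kind of retrieval/preprocessing task obspyDMT
# automates (plain ObsPy usage, not obspyDMT's syntax).
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")
t0 = UTCDateTime("2017-01-01T00:00:00")

st = client.get_waveforms(network="IU", station="ANMO", location="00",
                          channel="BHZ", starttime=t0, endtime=t0 + 3600,
                          attach_response=True)
st.detrend("demean")
st.remove_response(output="VEL")      # convert counts to ground velocity (m/s)
st.write("IU.ANMO.00.BHZ.mseed", format="MSEED")
print(st)
```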

  6. Emotion assessment using the NIH Toolbox

    PubMed Central

    Butt, Zeeshan; Pilkonis, Paul A.; Cyranowski, Jill M.; Zill, Nicholas; Hendrie, Hugh C.; Kupst, Mary Jo; Kelly, Morgen A. R.; Bode, Rita K.; Choi, Seung W.; Lai, Jin-Shei; Griffith, James W.; Stoney, Catherine M.; Brouwers, Pim; Knox, Sarah S.; Cella, David

    2013-01-01

    One of the goals of the NIH Toolbox for Assessment of Neurological and Behavioral Function was to identify or develop brief measures of emotion for use in prospective epidemiologic and clinical research. Emotional health has significant links to physical health and exerts a powerful effect on perceptions of life quality. Based on an extensive literature review and expert input, the Emotion team identified 4 central subdomains: Negative Affect, Psychological Well-Being, Stress and Self-Efficacy, and Social Relationships. A subsequent psychometric review identified several existing self-report and proxy measures of these subdomains with measurement characteristics that met the NIH Toolbox criteria. In cases where adequate measures did not exist, robust item banks were developed to assess concepts of interest. A population-weighted sample was recruited by an online survey panel to provide initial item calibration and measure validation data. Participants aged 8 to 85 years completed self-report measures whereas parents/guardians responded for children aged 3 to 12 years. Data were analyzed using a combination of classic test theory and item response theory methods, yielding efficient measures of emotional health concepts. An overview of the development of the NIH Toolbox Emotion battery is presented along with preliminary results. Norming activities led to further refinement of the battery, thus enhancing the robustness of emotional health measurement for researchers using the NIH Toolbox. PMID:23479549

  7. Experimental plug and play quantum coin flipping.

    PubMed

    Pappa, Anna; Jouguet, Paul; Lawson, Thomas; Chailloux, André; Legré, Matthieu; Trinkler, Patrick; Kerenidis, Iordanis; Diamanti, Eleni

    2014-04-24

    Performing complex cryptographic tasks will be an essential element in future quantum communication networks. These tasks are based on a handful of fundamental primitives, such as coin flipping, where two distrustful parties wish to agree on a randomly generated bit. Although it is known that quantum versions of these primitives can offer information-theoretic security advantages with respect to classical protocols, a demonstration of such an advantage in a practical communication scenario has remained elusive. Here we experimentally implement a quantum coin flipping protocol that performs strictly better than classically possible over a distance suitable for communication over metropolitan area optical networks. The implementation is based on a practical plug and play system, developed by significantly enhancing a commercial quantum key distribution device. Moreover, we provide combined quantum coin flipping protocols that are almost perfectly secure against bounded adversaries. Our results offer a useful toolbox for future secure quantum communications.

  8. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    PubMed Central

    Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano

    2009-01-01

    Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrodes locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. Conclusion The new toolbox presented here implements fast and data-robust computations of the most relevant quantities used in information theoretic analysis of neural data. The toolbox can be easily used within Matlab, the environment used by most neuroscience laboratories for the acquisition, preprocessing and plotting of neural data. It can therefore significantly enlarge the domain of application of information theory to neuroscience, and lead to new discoveries about the neural code. PMID:19607698

  9. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings.

    PubMed

    Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano

    2009-07-16

    Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. The new toolbox presented here implements fast and data-robust computations of the most relevant quantities used in information theoretic analysis of neural data. The toolbox can be easily used within Matlab, the environment used by most neuroscience laboratories for the acquisition, preprocessing and plotting of neural data. It can therefore significantly enlarge the domain of application of information theory to neuroscience, and lead to new discoveries about the neural code.

  10. A sigma factor toolbox for orthogonal gene expression in Escherichia coli

    PubMed Central

    Van Brempt, Maarten; Van Nerom, Katleen; Van Hove, Bob; Maertens, Jo; De Mey, Marjan; Charlier, Daniel

    2018-01-01

    Abstract Synthetic genetic sensors and circuits enable programmable control over timing and conditions of gene expression and, as a result, are increasingly incorporated into the control of complex and multi-gene pathways. Size and complexity of genetic circuits are growing, but stay limited by a shortage of regulatory parts that can be used without interference. Therefore, orthogonal expression and regulation systems are needed to minimize undesired crosstalk and allow for dynamic control of separate modules. This work presents a set of orthogonal expression systems for use in Escherichia coli based on heterologous sigma factors from Bacillus subtilis that recognize specific promoter sequences. Up to four of the analyzed sigma factors can be combined to function orthogonally between each other and toward the host. Additionally, the toolbox is expanded by creating promoter libraries for three sigma factors without loss of their orthogonal nature. As this set covers a wide range of transcription initiation frequencies, it enables tuning of multiple outputs of the circuit in response to different sensory signals in an orthogonal manner. This sigma factor toolbox constitutes an interesting expansion of the synthetic biology toolbox and may contribute to the assembly of more complex synthetic genetic systems in the future. PMID:29361130

  11. A MATLAB toolbox for the efficient estimation of the psychometric function using the updated maximum-likelihood adaptive procedure.

    PubMed

    Shen, Yi; Dai, Wei; Richards, Virginia M

    2015-03-01

    A MATLAB toolbox for the efficient estimation of the threshold, slope, and lapse rate of the psychometric function is described. The toolbox enables the efficient implementation of the updated maximum-likelihood (UML) procedure. The toolbox uses an object-oriented architecture for organizing the experimental variables and computational algorithms, which provides experimenters with flexibility in experimental design and data management. Descriptions of the UML procedure and the UML Toolbox are provided, followed by toolbox use examples. Finally, guidelines and recommendations of parameter configurations are given.
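
    As a rough illustration of the idea behind an adaptive maximum-likelihood procedure (a conceptual sketch, not the UML Toolbox API), the Python code below maintains a likelihood over a grid of threshold, slope, and lapse parameters of a logistic psychometric function, updates it after every simulated trial, and places the next stimulus at the current maximum-likelihood threshold; the actual UML procedure additionally selects among several "sweet points" rather than the threshold alone.

    ```python
    import numpy as np

    # Parameter grid: threshold (alpha), slope (beta), lapse rate (lam).
    alphas = np.linspace(-5, 5, 41)
    betas = np.linspace(0.2, 3.0, 15)
    lapses = np.array([0.01, 0.03, 0.05])
    A, B, L = np.meshgrid(alphas, betas, lapses, indexing="ij")
    loglik = np.zeros_like(A)                      # flat prior / running log-likelihood

    def p_correct(x, a, b, lam, gamma=0.5):
        """Logistic psychometric function with guess rate gamma and lapse rate lam."""
        return gamma + (1 - gamma - lam) / (1 + np.exp(-b * (x - a)))

    def simulate_observer(x, true=(1.0, 1.5, 0.03)):
        return np.random.rand() < p_correct(x, *true)

    x = 0.0                                        # initial stimulus level
    for trial in range(100):
        correct = simulate_observer(x)
        p = p_correct(x, A, B, L)
        loglik += np.log(p if correct else 1 - p)  # Bernoulli likelihood update
        idx = loglik.argmax()
        a_ml, b_ml, l_ml = A.flat[idx], B.flat[idx], L.flat[idx]
        x = a_ml                                   # next trial at the ML threshold

    print(f"ML estimate after 100 trials: threshold={a_ml:.2f}, "
          f"slope={b_ml:.2f}, lapse={l_ml:.3f}")
    ```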

  12. The Biopsychology-Toolbox: a free, open-source Matlab-toolbox for the control of behavioral experiments.

    PubMed

    Rose, Jonas; Otto, Tobias; Dittrich, Lars

    2008-10-30

    The Biopsychology-Toolbox is a free, open-source Matlab-toolbox for the control of behavioral experiments. The major aim of the project was to provide a set of basic tools that allow programming novices to control basic hardware used for behavioral experimentation without limiting the power and flexibility of the underlying programming language. The modular design of the toolbox allows portation of parts as well as entire paradigms between different types of hardware. In addition to the toolbox, this project offers a platform for the exchange of functions, hardware solutions and complete behavioral paradigms.

  13. A MATLAB toolbox for the efficient estimation of the psychometric function using the updated maximum-likelihood adaptive procedure

    PubMed Central

    Richards, V. M.; Dai, W.

    2014-01-01

    A MATLAB toolbox for the efficient estimation of the threshold, slope, and lapse rate of the psychometric function is described. The toolbox enables the efficient implementation of the updated maximum-likelihood (UML) procedure. The toolbox uses an object-oriented architecture for organizing the experimental variables and computational algorithms, which provides experimenters with flexibility in experimental design and data management. Descriptions of the UML procedure and the UML Toolbox are provided, followed by toolbox use examples. Finally, guidelines and recommendations of parameter configurations are given. PMID:24671826

  14. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    PubMed

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
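
    The core computation behind such model comparisons can be sketched compactly: given each strategy's predicted choice probabilities, Bayes' rule turns a participant's observed choices into a posterior over the strategy toolbox. The Python example below uses hypothetical strategies and data purely for illustration and is not the authors' hierarchical model.

    ```python
    import numpy as np

    # Each strategy predicts P(choose option A) on every trial (hypothetical values).
    predictions = {
        "take-the-best":     np.array([0.9, 0.9, 0.1, 0.9, 0.1]),
        "weighted-additive": np.array([0.7, 0.6, 0.4, 0.8, 0.3]),
        "guessing":          np.full(5, 0.5),
    }
    choices = np.array([1, 1, 0, 1, 1])            # 1 = chose option A on that trial

    prior = {s: 1 / len(predictions) for s in predictions}
    log_evidence = {
        s: np.sum(choices * np.log(p) + (1 - choices) * np.log(1 - p))
        for s, p in predictions.items()
    }
    # Posterior over strategies, proportional to prior * likelihood.
    unnorm = {s: prior[s] * np.exp(log_evidence[s]) for s in predictions}
    z = sum(unnorm.values())
    posterior = {s: v / z for s, v in unnorm.items()}
    for s, p in posterior.items():
        print(f"P({s} | data) = {p:.3f}")
    ```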

  15. A review of polymer electrolyte membrane fuel cell durability test protocols

    NASA Astrophysics Data System (ADS)

    Yuan, Xiao-Zi; Li, Hui; Zhang, Shengsheng; Martin, Jonathan; Wang, Haijiang

    Durability is one of the major barriers to polymer electrolyte membrane fuel cells (PEMFCs) being accepted as a commercially viable product. It is therefore important to understand their degradation phenomena and analyze degradation mechanisms from the component level to the cell and stack level so that novel component materials can be developed and novel designs for cells/stacks can be achieved to mitigate insufficient fuel cell durability. It is generally impractical and costly to operate a fuel cell under its normal conditions for several thousand hours, so accelerated test methods are preferred to facilitate rapid learning about key durability issues. Based on the US Department of Energy (DOE) and US Fuel Cell Council (USFCC) accelerated test protocols, as well as degradation tests performed by researchers and published in the literature, we review degradation test protocols at both component and cell/stack levels (driving cycles), aiming to gather the available information on accelerated test methods and degradation test protocols for PEMFCs, and thereby provide practitioners with a useful toolbox to study durability issues. These protocols help prevent the prolonged test periods and high costs associated with real lifetime tests, assess the performance and durability of PEMFC components, and ensure that the generated data can be compared.

  16. TH-CD-BRA-11: Implementation and Evaluation of a New 3D Dosimetry Protocol for Validating MRI Guided Radiation Therapy Treatments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mein, S; Rankine, L; Department of Radiation Oncology, Washington University School of Medicine

    Purpose: To develop, evaluate and apply a novel high-resolution 3D remote dosimetry protocol for validation of MRI guided radiation therapy treatments (MRIdian by ViewRay™). We demonstrate the first application of the protocol (including two small but required new correction terms) utilizing radiochromic 3D plastic PRESAGE™ with optical-CT readout. Methods: A detailed study of PRESAGE™ dosimeters (2kg) was conducted to investigate the temporal and spatial stability of radiation induced optical density change (ΔOD) over 8 days. Temporal stability was investigated on 3 dosimeters irradiated with four equally-spaced square 6MV fields delivering doses between 10cGy and 300cGy. Doses were imaged (read-out) by optical-CT at multiple intervals. Spatial stability of ΔOD response was investigated on 3 other dosimeters irradiated uniformly with 15MV extended-SSD fields with doses of 15cGy, 30cGy and 60cGy. Temporal and spatial (radial) changes were investigated using CERR and MATLAB's Curve Fitting Toolbox. A protocol was developed to extrapolate measured ΔOD readings at t=48hr (the typical shipment time in remote dosimetry) to time t=1hr. Results: All dosimeters were observed to gradually darken with time (<5% per day). Consistent intra-batch sensitivity (0.0930±0.002 ΔOD/cm/Gy) and linearity (R2=0.9996) was observed at t=1hr. A small radial effect (<3%) was observed, attributed to curing thermodynamics during manufacture. The refined remote dosimetry protocol (including polynomial correction terms for temporal and spatial effects, CT and CR) was then applied to independent dosimeters irradiated with MR-IGRT treatments. Excellent line profile agreement and 3D-gamma results for 3%/3mm, 10% threshold were observed, with an average passing rate 96.5%± 3.43%. Conclusion: A novel 3D remote dosimetry protocol is presented capable of validation of advanced radiation treatments (including MR-IGRT). The protocol uses 2kg radiochromic plastic dosimeters read-out by optical-CT within a week of treatment. The protocol requires small corrections for temporal and spatially-dependent behaviors observed between irradiation and readout.
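
    The dose reconstruction step described above amounts to dividing the measured optical density change by the batch sensitivity and applying the temporal and radial corrections. The Python sketch below is schematic: only the sensitivity value is taken from the abstract, while the correction functions are hypothetical placeholders standing in for the study's fitted polynomial terms.

    ```python
    SENSITIVITY = 0.0930        # ΔOD / cm / Gy at t = 1 h (intra-batch value from the study)

    def temporal_correction(delta_od, hours_since_irradiation):
        """Hypothetical placeholder: undo the gradual darkening (<5 %/day) to
        extrapolate a reading taken at the given time back to t = 1 h."""
        darkening_per_hour = 0.05 / 24.0
        return delta_od / (1.0 + darkening_per_hour * (hours_since_irradiation - 1.0))

    def radial_correction(delta_od, r_norm):
        """Hypothetical placeholder for the small (<3 %) radial response variation,
        with r_norm the normalized distance from the dosimeter axis (0..1)."""
        return delta_od / (1.0 - 0.03 * r_norm**2)

    def dose_gy(delta_od_per_cm, hours_since_irradiation, r_norm):
        corrected = radial_correction(
            temporal_correction(delta_od_per_cm, hours_since_irradiation), r_norm)
        return corrected / SENSITIVITY

    # A reading of 0.20 ΔOD/cm taken 48 h after irradiation, halfway out radially:
    print(f"{dose_gy(0.20, 48.0, 0.5):.2f} Gy")
    ```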

  17. OCCIMA: Optical Channel Characterization in Maritime Atmospheres

    NASA Astrophysics Data System (ADS)

    Hammel, Steve; Tsintikidis, Dimitri; deGrassie, John; Reinhardt, Colin; McBryde, Kevin; Hallenborg, Eric; Wayne, David; Gibson, Kristofor; Cauble, Galen; Ascencio, Ana; Rudiger, Joshua

    2015-05-01

    The Navy is actively developing diverse optical application areas, including high-energy laser weapons and free-space optical communications, which depend on an accurate and timely knowledge of the state of the atmospheric channel. The Optical Channel Characterization in Maritime Atmospheres (OCCIMA) project is a comprehensive program to coalesce and extend the current capability to characterize the maritime atmosphere for all optical and infrared wavelengths. The program goal is the development of a unified and validated analysis toolbox. The foundational design for this program coordinates the development of sensors, measurement protocols, analytical models, and basic physics necessary to fulfill this goal.

  18. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelt, Daniël M.; Gürsoy, Dogˇa; Palenstijn, Willem Jan

    2016-04-28

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy's standard reconstruction method.
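
    Assuming the documented TomoPy-ASTRA bridge (exact option names can vary between versions), switching a reconstruction from TomoPy's standard algorithm to an ASTRA GPU method is essentially a one-line change, as in the sketch below.

    ```python
    import tomopy

    # Simulated data stand in for a real synchrotron dataset here.
    obj = tomopy.shepp3d(size=128)                   # 3-D phantom
    theta = tomopy.angles(180)                       # projection angles
    proj = tomopy.project(obj, theta)                # forward projection

    # Standard TomoPy reconstruction (CPU, gridrec).
    rec_gridrec = tomopy.recon(proj, theta, algorithm='gridrec')

    # Same call routed through the ASTRA toolbox for a GPU-based iterative method
    # (requires the astra-toolbox package and a CUDA-capable GPU; option names
    # follow the TomoPy/ASTRA integration documentation and may differ by version).
    options = {'proj_type': 'cuda', 'method': 'SIRT_CUDA', 'num_iter': 200}
    rec_astra = tomopy.recon(proj, theta, algorithm=tomopy.astra, options=options)

    print(rec_gridrec.shape, rec_astra.shape)
    ```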

  19. RenderToolbox3: MATLAB tools that facilitate physically based stimulus rendering for vision research.

    PubMed

    Heasly, Benjamin S; Cottaris, Nicolas P; Lichtman, Daniel P; Xiao, Bei; Brainard, David H

    2014-02-07

    RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.

  20. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    PubMed Central

    Pelt, Daniël M.; Gürsoy, Doǧa; Palenstijn, Willem Jan; Sijbers, Jan; De Carlo, Francesco; Batenburg, Kees Joost

    2016-01-01

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy’s standard reconstruction method. PMID:27140167

  1. Arc_Mat: a Matlab-based spatial data analysis toolbox

    NASA Astrophysics Data System (ADS)

    Liu, Xingjian; Lesage, James

    2010-03-01

    This article presents an overview of Arc_Mat, a Matlab-based spatial data analysis software package whose source code has been placed in the public domain. An earlier version of the Arc_Mat toolbox was developed to extract map polygon and database information from ESRI shapefiles and provide high quality mapping in the Matlab software environment. We discuss revisions to the toolbox that: utilize enhanced computing and graphing capabilities of more recent versions of Matlab, restructure the toolbox with object-oriented programming features, and provide more comprehensive functions for spatial data analysis. The Arc_Mat toolbox functionality includes basic choropleth mapping; exploratory spatial data analysis that provides exploratory views of spatial data through various graphs, for example, histogram, Moran scatterplot, three-dimensional scatterplot, density distribution plot, and parallel coordinate plots; and more formal spatial data modeling that draws on the extensive Spatial Econometrics Toolbox functions. A brief review of the design aspects of the revised Arc_Mat is described, and we provide some illustrative examples that highlight representative uses of the toolbox. Finally, we discuss programming with and customizing the Arc_Mat toolbox functionalities.

  2. IV. NIH Toolbox Cognition Battery (CB): measuring language (vocabulary comprehension and reading decoding).

    PubMed

    Gershon, Richard C; Slotkin, Jerry; Manly, Jennifer J; Blitz, David L; Beaumont, Jennifer L; Schnipke, Deborah; Wallner-Allen, Kathleen; Golinkoff, Roberta Michnick; Gleason, Jean Berko; Hirsh-Pasek, Kathy; Adams, Marilyn Jager; Weintraub, Sandra

    2013-08-01

    Mastery of language skills is an important predictor of daily functioning and health. Vocabulary comprehension and reading decoding are relatively quick and easy to measure and correlate highly with overall cognitive functioning, as well as with success in school and work. New measures of vocabulary comprehension and reading decoding (in both English and Spanish) were developed for the NIH Toolbox Cognition Battery (CB). In the Toolbox Picture Vocabulary Test (TPVT), participants hear a spoken word while viewing four pictures, and then must choose the picture that best represents the word. This approach tests receptive vocabulary knowledge without the need to read or write, removing the literacy load for children who are developing literacy and for adults who struggle with reading and writing. In the Toolbox Oral Reading Recognition Test (TORRT), participants see a letter or word onscreen and must pronounce or identify it. The examiner determines whether it was pronounced correctly by comparing the response to the pronunciation guide on a separate computer screen. In this chapter, we discuss the importance of language during childhood and the relation of language and brain function. We also review the development of the TPVT and TORRT, including information about the item calibration process and results from a validation study. Finally, the strengths and weaknesses of the measures are discussed. © 2013 The Society for Research in Child Development, Inc.

  3. The CatchMod toolbox: easy and guided access to ICT tools for Water Framework Directive implementation.

    PubMed

    van Griensven, A; Vanrolleghem, P A

    2006-01-01

    Web-based toolboxes are handy tools to inform experienced users of existing software in their disciplines. However, for the implementation of the Water Framework Directive, a much more diverse public (water managers, consultancy firms, scientists, etc.) will ask for a very wide diversity of Information and Communication Technology (ICT) tools. It is obvious that the users of a web-based ICT-toolbox providing all this will not be experts in all of the disciplines and that a toolbox for ICT tools for Water Framework Directive implementation should thus go beyond just making interesting web-links. To deal with this issue, expert knowledge is brought to the users through the incorporation of visitor-geared guidance (materials) in the Harmoni-CA toolbox. Small workshops of expert teams were organized to deliver documents explaining why the tools are important, when they are required and what activity they support/perform, as well as a categorization of the multitude of available tools. An integration of this information in the web-based toolbox helps the users to browse through a toolbox containing tools, reports, guidance documents and interesting links. The Harmoni-CA toolbox thus provides not only a virtual toolbox, but incorporates a virtual expert as well.

  4. Data and Tools - Alphabetical Listing | NREL

    Science.gov Websites

    A fragment of NREL's alphabetical listing of data and tools, including: Climate Action Planning Tool; Community Solar Scenario Tool; Comparative PV Levelized Cost of Energy (LCOE); Design Response Toolbox; WEC-Sim: Wave Energy Converter Simulator; West Associates Solar Monitoring Network; Design and Engineering Model

  5. GridPV Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broderick, Robert; Quiroz, Jimmy; Grijalva, Santiago

    2014-07-15

    A Matlab toolbox for simulating the impact of solar energy on the distribution grid. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving GridPV Toolbox information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions.

  6. A multiplexed microfluidic toolbox for the rapid optimization of affinity-driven partition in aqueous two phase systems.

    PubMed

    Bras, Eduardo J S; Soares, Ruben R G; Azevedo, Ana M; Fernandes, Pedro; Arévalo-Rodríguez, Miguel; Chu, Virginia; Conde, João P; Aires-Barros, M Raquel

    2017-09-15

    Antibodies and other protein products such as interferons and cytokines are biopharmaceuticals of critical importance which, in order to be safely administered, have to be thoroughly purified in a cost effective and efficient manner. The use of aqueous two-phase extraction (ATPE) is a viable option for this purification, but these systems are difficult to model and optimization procedures require lengthy and expensive screening processes. Here, a methodology for the rapid screening of antibody extraction conditions using a microfluidic channel-based toolbox is presented. A first microfluidic structure allows a simple negative-pressure driven rapid screening of up to 8 extraction conditions simultaneously, using less than 20μL of each phase-forming solution per experiment, while a second microfluidic structure allows the integration of multi-step extraction protocols based on the results obtained with the first device. In this paper, this microfluidic toolbox was used to demonstrate the potential of LYTAG fusion proteins used as affinity tags to optimize the partitioning of antibodies in ATPE processes, where a maximum partition coefficient (K) of 9.2 in a PEG 3350/phosphate system was obtained for the antibody extraction in the presence of the LYTAG-Z dual ligand. This represents an increase of approx. 3.7 fold when compared with the same conditions without the affinity molecule (K=2.5). Overall, this miniaturized and versatile approach allowed the rapid optimization of molecule partition followed by a proof-of-concept demonstration of an integrated back extraction procedure, both of which are critical procedures towards obtaining high purity biopharmaceuticals using ATPE. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Prony Ringdown GUI (CERTS Prony Ringdown, part of the DSI Tool Box)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuffner, Francis; Marinovici, PNNL Laurentiu; Hauer, PNNL John

    2014-02-21

    The PNNL Prony Ringdown graphical user interface is one analysis tool included in the Dynamic System Identification toolbox (DSI Toolbox). The Dynamic System Identification toolbox is a MATLAB-based collection of tools for parsing and analyzing phasor measurement unit data, especially with regard to small signal stability. It includes tools to read the data, preprocess it, and perform small signal analysis. The DSI Toolbox is designed to provide a research environment for examining phasor measurement unit data and performing small signal stability analysis. The software uses a series of text-driven menus to help guide users and organize the toolbox features. Methods for reading in phasor measurement unit data are provided, with appropriate preprocessing options for small-signal-stability analysis. The toolbox includes the Prony Ringdown GUI and basic algorithms to estimate information on oscillatory modes of the system, such as modal frequency and damping ratio.
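
    Prony ringdown analysis itself is compact enough to sketch: fit a linear-prediction model to the ringdown samples, take the roots of the prediction polynomial, and read modal frequency and damping from the resulting poles. The Python code below is a bare-bones illustration of that idea on a synthetic two-mode signal, not the DSI Toolbox implementation.

    ```python
    import numpy as np

    def prony_modes(y, dt, order):
        """Estimate modal frequencies (Hz) and damping ratios of a ringdown signal
        using the classical Prony / linear-prediction method."""
        n = len(y)
        # Linear prediction: y[k] ~= sum_i c[i] * y[k - 1 - i].
        A = np.column_stack([y[order - i - 1:n - i - 1] for i in range(order)])
        c, *_ = np.linalg.lstsq(A, y[order:n], rcond=None)
        # Roots of the prediction polynomial are the discrete-time poles z = exp(s * dt).
        poles = np.roots(np.concatenate(([1.0], -c)))
        s = np.log(poles) / dt
        keep = s.imag > 1e-6                       # one pole per conjugate pair
        freq = s.imag[keep] / (2 * np.pi)          # damped modal frequency [Hz]
        zeta = -s.real[keep] / np.abs(s[keep])     # damping ratio
        return freq, zeta

    # Synthetic two-mode ringdown: 0.4 Hz at ~5 % damping plus 0.9 Hz at ~10 % damping.
    dt = 0.05
    t = np.arange(0.0, 20.0, dt)
    y = (np.exp(-0.05 * 2 * np.pi * 0.4 * t) * np.cos(2 * np.pi * 0.4 * t)
         + 0.5 * np.exp(-0.10 * 2 * np.pi * 0.9 * t) * np.cos(2 * np.pi * 0.9 * t))
    freq, zeta = prony_modes(y, dt, order=4)
    print(np.round(freq, 3), np.round(zeta, 3))
    ```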

  8. MOEMS Modeling Using the Geometrical Matrix Toolbox

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2005-01-01

    New technologies such as MicroOptoElectro-Mechanical Systems (MOEMS) require new modeling tools. These tools must simultaneously model the optical, electrical, and mechanical domains and the interactions between these domains. To facilitate rapid prototyping of these new technologies an optical toolbox has been developed for modeling MOEMS devices. The toolbox models are constructed using MATLAB's dynamical simulator, Simulink. Modeling toolboxes will allow users to focus their efforts on system design and analysis as opposed to developing component models. This toolbox was developed to facilitate rapid modeling and design of a MOEMS based laser ultrasonic receiver system.

  9. C++ tensor toolbox user manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd D.; Kolda, Tamara Gibson

    2012-04-01

    The C++ Tensor Toolbox is a software package for computing tensor decompositions. It is based on the Matlab Tensor Toolbox, and is particularly optimized for sparse data sets. This user manual briefly overviews tensor decomposition mathematics, software capabilities, and installation of the package. Tensors (also known as multidimensional arrays or N-way arrays) are used in a variety of applications ranging from chemometrics to network analysis. The Tensor Toolbox provides classes for manipulating dense, sparse, and structured tensors in C++. The Toolbox compiles into libraries and is intended for use with custom applications written by users.
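
    For readers unfamiliar with tensor decompositions, the sketch below illustrates the core idea behind the CP (CANDECOMP/PARAFAC) decomposition that such toolboxes compute, using a bare-bones alternating least squares loop in NumPy rather than the C++ or Matlab Tensor Toolbox APIs.

    ```python
    import numpy as np

    def unfold(T, mode):
        """Mode-n matricization: rows indexed by `mode`, remaining axes flattened (C order)."""
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def khatri_rao(A, B):
        """Column-wise Kronecker product of two factor matrices."""
        return np.einsum("ir,jr->ijr", A, B).reshape(A.shape[0] * B.shape[0], -1)

    def cp_als(T, rank, n_iter=100, seed=0):
        """Rank-`rank` CP decomposition of a 3-way tensor by alternating least squares."""
        rng = np.random.default_rng(seed)
        factors = [rng.standard_normal((dim, rank)) for dim in T.shape]
        for _ in range(n_iter):
            for mode in range(3):
                others = [factors[m] for m in range(3) if m != mode]
                kr = khatri_rao(others[0], others[1])          # ordered to match `unfold`
                gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
                factors[mode] = unfold(T, mode) @ kr @ np.linalg.pinv(gram)
        return factors

    # Build an exactly rank-3 tensor and check that CP-ALS recovers it closely.
    rng = np.random.default_rng(1)
    A, B, C = (rng.standard_normal((d, 3)) for d in (10, 12, 14))
    T = np.einsum("ir,jr,kr->ijk", A, B, C)
    Ah, Bh, Ch = cp_als(T, rank=3)
    T_hat = np.einsum("ir,jr,kr->ijk", Ah, Bh, Ch)
    print("relative reconstruction error:", np.linalg.norm(T - T_hat) / np.linalg.norm(T))
    ```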

  10. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave.

    PubMed

    Silva, Ikaro; Moody, George B

    The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox provides access to over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.
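
    For users working in Python rather than MATLAB/Octave, PhysioNet's separately distributed wfdb Python package offers similar access; a minimal sketch is shown below, where the record and annotation identifiers are the usual MIT-BIH demo values and the keyword names are assumed from recent package versions.

    ```python
    import wfdb

    # Stream record 100 of the MIT-BIH Arrhythmia Database directly from PhysioNet.
    record = wfdb.rdrecord("100", pn_dir="mitdb", sampto=3000)
    annotation = wfdb.rdann("100", "atr", pn_dir="mitdb", sampto=3000)

    print(record.sig_name, record.fs)          # channel names and sampling frequency
    print(len(annotation.sample), "beat annotations in the first 3000 samples")
    ```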

  11. Speed management toolbox for rural communities.

    DOT National Transportation Integrated Search

    2013-04-01

    The primary objective of this toolbox is to summarize various known traffic-calming treatments and their effectiveness. This toolbox focuses on roadway-based treatments for speed management, particularly for rural communities with transition zones. E...

  12. Command-line cellular electrophysiology for conventional and real-time closed-loop experiments.

    PubMed

    Linaro, Daniele; Couto, João; Giugliano, Michele

    2014-06-15

    Current software tools for electrophysiological experiments are limited in flexibility and rarely offer adequate support for advanced techniques such as dynamic clamp and hybrid experiments, which are therefore limited to laboratories with a significant expertise in neuroinformatics. We have developed lcg, a software suite based on a command-line interface (CLI) that allows performing both standard and advanced electrophysiological experiments. Stimulation protocols for classical voltage and current clamp experiments are defined by a concise and flexible meta description that allows representing complex waveforms as a piece-wise parametric decomposition of elementary sub-waveforms, abstracting the stimulation hardware. To perform complex experiments lcg provides a set of elementary building blocks that can be interconnected to yield a large variety of experimental paradigms. We present various cellular electrophysiological experiments in which lcg has been employed, ranging from the automated application of current clamp protocols for characterizing basic electrophysiological properties of neurons, to dynamic clamp, response clamp, and hybrid experiments. We finally show how the scripting capabilities behind a CLI are suited for integrating experimental trials into complex workflows, where actual experiment, online data analysis and computational modeling seamlessly integrate. We compare lcg with two open source toolboxes, RTXI and RELACS. We believe that lcg will greatly contribute to the standardization and reproducibility of both simple and complex experiments. Additionally, on the long run the increased efficiency due to a CLI will prove a great benefit for the experimental community. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Quantitative prediction of cellular metabolism with constraint-based models: the COBRA Toolbox v2.0

    PubMed Central

    Schellenberger, Jan; Que, Richard; Fleming, Ronan M. T.; Thiele, Ines; Orth, Jeffrey D.; Feist, Adam M.; Zielinski, Daniel C.; Bordbar, Aarash; Lewis, Nathan E.; Rahmanian, Sorena; Kang, Joseph; Hyduke, Daniel R.; Palsson, Bernhard Ø.

    2012-01-01

    Over the past decade, a growing community of researchers has emerged around the use of COnstraint-Based Reconstruction and Analysis (COBRA) methods to simulate, analyze and predict a variety of metabolic phenotypes using genome-scale models. The COBRA Toolbox, a MATLAB package for implementing COBRA methods, was presented earlier. Here we present a significant update of this in silico ToolBox. Version 2.0 of the COBRA Toolbox expands the scope of computations by including in silico analysis methods developed since its original release. New functions include: (1) network gap filling, (2) 13C analysis, (3) metabolic engineering, (4) omics-guided analysis, and (5) visualization. As with the first version, the COBRA Toolbox reads and writes Systems Biology Markup Language formatted models. In version 2.0, we improved performance, usability, and the level of documentation. A suite of test scripts can now be used to learn the core functionality of the Toolbox and validate results. This Toolbox lowers the barrier of entry to use powerful COBRA methods. PMID:21886097
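
    The same style of constraint-based analysis is available to Python users through the related COBRApy package; the minimal flux balance analysis sketch below (the SBML file name is a placeholder) performs conceptually the same computation that the MATLAB Toolbox automates.

    ```python
    import cobra

    # Load a genome-scale model in SBML format (file name is a placeholder).
    model = cobra.io.read_sbml_model("e_coli_core.xml")

    # Flux balance analysis: maximize the model's objective (typically biomass).
    solution = model.optimize()
    print("growth rate:", solution.objective_value)

    # Single-gene knockout screen, a routine COBRA-style analysis.
    deletions = cobra.flux_analysis.single_gene_deletion(model)
    print(deletions.head())
    ```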

  14. Microbe-ID: An open source toolbox for microbial genotyping and species identification

    USDA-ARS?s Scientific Manuscript database

    Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user...

  15. Screening and assessment of chronic pain among children with cerebral palsy: a process evaluation of a pain toolbox.

    PubMed

    Orava, Taryn; Provvidenza, Christine; Townley, Ashleigh; Kingsnorth, Shauna

    2018-06-08

    Though high numbers of children with cerebral palsy experience chronic pain, it remains under-recognized. This paper describes an evaluation of implementation supports and adoption of the Chronic Pain Assessment Toolbox for Children with Disabilities (the Toolbox) to enhance pain screening and assessment practices within a pediatric rehabilitation and complex continuing care hospital. A multicomponent knowledge translation strategy facilitated Toolbox adoption, inclusive of a clinical practice guideline, cerebral palsy practice points and assessment tools. Across the hospital, seven ambulatory care clinics with cerebral palsy caseloads participated in a staggered roll-out (Group 1: exclusive CP caseloads, March-December; Group 2: mixed diagnostic caseloads, August-December). Evaluation measures included client electronic medical record audit, document review and healthcare provider survey and interviews. A significant change in documentation of pain screening and assessment practice from pre-Toolbox (<2%) to post-Toolbox adoption (53%) was found. Uptake in Group 2 clinics lagged behind Group 1. Opportunities to use the Toolbox consistently (based on diagnostic caseload) and frequently (based on client appointments) were noted among contextual factors identified. Overall, the Toolbox was positively received and clinically useful. Findings affirm that the Toolbox, in conjunction with the application of integrated knowledge translation principles and an established knowledge translation framework, has potential to be a useful resource to enrich and standardize chronic pain screening and assessment practices among children with cerebral palsy. Implications for Rehabilitation It is important to engage healthcare providers in the conceptualization, development, implementation and evaluation of a knowledge-to-action best practice product. The Chronic Pain Toolbox for Children with Disabilities provides rehabilitation staff with guidance on pain screening and assessment best practice and offers a range of validated tools that can be incorporated in ambulatory clinic settings to meet varied client needs. Considering unique clinical contexts (i.e., opportunities for use, provider engagement, staffing absences/turnover) is required to optimize and sustain chronic pain screening and assessment practices in rehabilitation outpatient settings.

  16. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    PubMed

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to face those scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.
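
    The interval estimates that such tools produce can be pictured with a small linear-programming exercise: for each flux, minimize and maximize it subject to the steady-state stoichiometry and the measured flux intervals. The sketch below uses plain SciPy and a hypothetical three-reaction toy network, not the PFA Toolbox functions.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: A -> B (v1), B -> C (v2), B -> D (v3); steady state S @ v = 0 for B.
    S = np.array([[1.0, -1.0, -1.0]])            # internal metabolite B only
    bounds = [(0.8, 1.2),                        # v1 measured imprecisely: 1.0 +/- 0.2
              (0.0, None),                       # v2 unmeasured
              (0.1, 0.4)]                        # v3 measured imprecisely

    for i, name in enumerate(["v1", "v2", "v3"]):
        c = np.zeros(3)
        c[i] = 1.0
        lo = linprog(c,  A_eq=S, b_eq=[0.0], bounds=bounds)   # minimize flux i
        hi = linprog(-c, A_eq=S, b_eq=[0.0], bounds=bounds)   # maximize flux i
        print(f"{name}: [{lo.fun:.2f}, {-hi.fun:.2f}]")
    ```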

  17. A Module for Graphical Display of Model Results with the CBP Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.

    2015-04-21

    This report describes work performed by the Savannah River National Laboratory (SRNL) in fiscal year 2014 to add enhanced graphical capabilities to display model results in the Cementitious Barriers Project (CBP) Toolbox. Because Version 2.0 of the CBP Toolbox has just been released, the graphing enhancements described in this report have not yet been integrated into a new version of the Toolbox. Instead they have been tested using a standalone GoldSim model and, while they are substantially complete, may undergo further refinement before full implementation. Nevertheless, this report is issued to document the FY14 development efforts which will provide a basis for further development of the CBP Toolbox.

  18. A Software Toolbox for Systematic Evaluation of Seismometer-Digitizer System Responses

    DTIC Science & Technology

    2011-09-01

    Measurement of the absolute amplitudes of a seismic signal requires accurate knowledge of the seismometer-digitizer system response. The toolbox includes tools that estimate seismic noise power spectral densities, and NOISETRAN, which generates a pseudo-amplitude response (PAR) for a seismic station.

  19. Language Measures of the NIH Toolbox Cognition Battery

    PubMed Central

    Gershon, Richard C.; Cook, Karon F.; Mungas, Dan; Manly, Jennifer J.; Slotkin, Jerry; Beaumont, Jennifer L.; Weintraub, Sandra

    2015-01-01

    Language facilitates communication and efficient encoding of thought and experience. Because of its essential role in early childhood development, in educational achievement and in subsequent life adaptation, language was included as one of the subdomains in the NIH Toolbox for the Assessment of Neurological and Behavioral Function Cognition Battery (NIHTB-CB). There are many different components of language functioning, including syntactic processing (i.e., morphology and grammar) and lexical semantics. For purposes of the NIHTB-CB, two tests of language—a picture vocabulary test and a reading recognition test—were selected by consensus based on literature reviews, iterative expert input, and a desire to assess in English and Spanish. NIHTB-CB’s picture vocabulary and reading recognition tests are administered using computer adaptive testing and scored using item response theory. Data are presented from the validation of the English versions in a sample of adults ages 20–85 years (Spanish results will be presented in a future publication). Both tests demonstrated high test–retest reliability and good construct validity compared to corresponding gold-standard measures. Scores on the NIH Toolbox measures were consistent with age-related expectations, namely, growth in language during early development, with relative stabilization into late adulthood. PMID:24960128

  20. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  1. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  2. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  3. Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) User's Guide

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) software package is an open source, MATLAB/Simulink toolbox (plug-in) that can be used by industry professionals and academics for the development of thermodynamic and controls simulations.

  4. CFS MATLAB toolbox: An experiment builder for continuous flash suppression (CFS) task.

    PubMed

    Nuutinen, Mikko; Mustonen, Terhi; Häkkinen, Jukka

    2017-09-15

    CFS toolbox is an open-source collection of MATLAB functions that utilizes PsychToolbox-3 (PTB-3). It is designed to allow a researcher to create and run continuous flash suppression experiments using a variety of experimental parameters (i.e., stimulus types and locations, noise characteristics, and experiment window settings). In a CFS experiment, one of the eyes at a time is presented with a dynamically changing noise pattern, while the other eye is concurrently presented with a static target stimulus, such as a Gabor patch. Due to the strong interocular suppression created by the dominant noise pattern mask, the target stimulus is rendered invisible for an extended duration. Very little knowledge of MATLAB is required for using the toolbox; experiments are generated by modifying csv files with the required parameters, and result data are output to text files for further analysis. The open-source code is available on the project page under a Creative Commons License ( http://www.mikkonuutinen.arkku.net/CFS_toolbox/ and https://bitbucket.org/mikkonuutinen/cfs_toolbox ).

  5. Electronic audit and feedback intervention with action implementation toolbox to improve pain management in intensive care: protocol for a laboratory experiment and cluster randomised trial.

    PubMed

    Gude, Wouter T; Roos-Blom, Marie-José; van der Veer, Sabine N; de Jonge, Evert; Peek, Niels; Dongelmans, Dave A; de Keizer, Nicolette F

    2017-05-25

    Audit and feedback is often used as a strategy to improve quality of care, however, its effects are variable and often marginal. In order to learn how to design and deliver effective feedback, we need to understand their mechanisms of action. This theory-informed study will investigate how electronic audit and feedback affects improvement intentions (i.e. information-intention gap), and whether an action implementation toolbox with suggested actions and materials helps translating those intentions into action (i.e. intention-behaviour gap). The study will be executed in Dutch intensive care units (ICUs) and will be focused on pain management. We will conduct a laboratory experiment with individual ICU professionals to assess the impact of feedback on their intentions to improve practice. Next, we will conduct a cluster randomised controlled trial with ICUs allocated to feedback without or feedback with action implementation toolbox group. Participants will not be told explicitly what aspect of the intervention is randomised; they will only be aware that there are two variations of providing feedback. ICUs are eligible for participation if they submit indicator data to the Dutch National Intensive Care Evaluation (NICE) quality registry and agree to allocate a quality improvement team that spends 4 h per month on the intervention. All participating ICUs will receive access to an online quality dashboard that provides two functionalities: gaining insight into clinical performance on pain management indicators and developing action plans. ICUs with access to the toolbox can develop their action plans guided by a list of potential barriers in the care process, associated suggested actions, and supporting materials to facilitate implementation of the actions. The primary outcome measure for the laboratory experiment is the proportion of improvement intentions set by participants that are consistent with recommendations based on peer comparisons; for the randomised trial it is the proportion of patient shifts during which pain has been adequately managed. We will also conduct a process evaluation to understand how the intervention is implemented and used in clinical practice, and how implementation and use affect the intervention's impact. The results of this study will inform care providers and managers in ICU and other clinical settings how to use indicator-based performance feedback in conjunction with an action implementation toolbox to improve quality of care. Within the ICU context, this study will produce concrete and directly applicable knowledge with respect to what is or is not effective for improving pain management, and under which circumstances. The results will further guide future research that aims to understand the mechanisms behind audit and feedback and contribute to identifying the active ingredients of successful interventions. ClinicalTrials.gov NCT02922101 . Registered 26 September 2016.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  7. Everything should be as simple as possible, but no simpler: towards a protocol for accumulating evidence regarding the active content of health behaviour change interventions.

    PubMed

    Peters, Gjalt-Jorn Ygram; de Bruin, Marijn; Crutzen, Rik

    2015-01-01

    There is a need to consolidate the evidence base underlying our toolbox of methods of behaviour change. Recent efforts to this effect have conducted meta-regressions on evaluations of behaviour change interventions, deriving each method's effectiveness from its association to intervention effect size. However, there are a range of issues that raise concern about whether this approach is actually furthering or instead obstructing the advancement of health psychology theories and the quality of health behaviour change interventions. Using examples from theory, the literature and data from previous meta-analyses, these concerns and their implications are explained and illustrated. An iterative protocol for evidence base accumulation is proposed that integrates evidence derived from both experimental and applied behaviour change research, and combines theory development in experimental settings with theory testing in applied real-life settings. As evidence gathered in this manner accumulates, a cumulative science of behaviour change can develop.

  8. Everything should be as simple as possible, but no simpler: towards a protocol for accumulating evidence regarding the active content of health behaviour change interventions

    PubMed Central

    Peters, Gjalt-Jorn Ygram; de Bruin, Marijn; Crutzen, Rik

    2015-01-01

    There is a need to consolidate the evidence base underlying our toolbox of methods of behaviour change. Recent efforts to this effect have conducted meta-regressions on evaluations of behaviour change interventions, deriving each method's effectiveness from its association to intervention effect size. However, there are a range of issues that raise concern about whether this approach is actually furthering or instead obstructing the advancement of health psychology theories and the quality of health behaviour change interventions. Using examples from theory, the literature and data from previous meta-analyses, these concerns and their implications are explained and illustrated. An iterative protocol for evidence base accumulation is proposed that integrates evidence derived from both experimental and applied behaviour change research, and combines theory development in experimental settings with theory testing in applied real-life settings. As evidence gathered in this manner accumulates, a cumulative science of behaviour change can develop. PMID:25793484

  9. Practical secure quantum communications

    NASA Astrophysics Data System (ADS)

    Diamanti, Eleni

    2015-05-01

    We review recent advances in the field of quantum cryptography, focusing in particular on practical implementations of two central protocols for quantum network applications, namely key distribution and coin flipping. The former allows two parties to share secret messages with information-theoretic security, even in the presence of a malicious eavesdropper in the communication channel, which is impossible with classical resources alone. The latter enables two distrustful parties to agree on a random bit, again with information-theoretic security, and with a cheating probability lower than the one that can be reached in a classical scenario. Our implementations rely on continuous-variable technology for quantum key distribution and on a plug and play discrete-variable system for coin flipping, and necessitate a rigorous security analysis adapted to the experimental schemes and their imperfections. In both cases, we demonstrate the protocols with provable security over record long distances in optical fibers and assess the performance of our systems as well as their limitations. The reported advances offer a powerful toolbox for practical applications of secure communications within future quantum networks.

  10. The Brain's Versatile Toolbox.

    ERIC Educational Resources Information Center

    Pinker, Steven

    1997-01-01

    Considers the role of evolution and natural selection in the functioning of the modern human brain. Natural selection equipped humans with a mental toolbox of intuitive theories about the world which were used to master rocks, tools, plants, animals, and one another. The same toolbox is used today to master the intellectual challenges of modern…

  11. Compressible Flow Toolbox

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2006-01-01

    The Compressible Flow Toolbox is primarily a MATLAB-language implementation of a set of algorithms that solve approximately 280 linear and nonlinear classical equations for compressible flow. The toolbox is useful for analysis of one-dimensional steady flow with either constant entropy, friction, heat transfer, or Mach number greater than 1. The toolbox also contains algorithms for comparing and validating the equation-solving algorithms against solutions previously published in open literature. The classical equations solved by the Compressible Flow Toolbox are as follows: The isentropic-flow equations, The Fanno flow equations (pertaining to flow of an ideal gas in a pipe with friction), The Rayleigh flow equations (pertaining to frictionless flow of an ideal gas, with heat transfer, in a pipe of constant cross section), The normal-shock equations, The oblique-shock equations, and The expansion equations.
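
    The kind of closed-form relation the toolbox solves is easy to illustrate. The Python sketch below (not the toolbox itself) evaluates the isentropic stagnation-to-static ratios and the normal-shock jump conditions for a given upstream Mach number and ratio of specific heats.

    ```python
    import math

    def isentropic_ratios(M, gamma=1.4):
        """Stagnation-to-static temperature and pressure ratios for isentropic flow."""
        T0_T = 1.0 + 0.5 * (gamma - 1.0) * M**2
        p0_p = T0_T ** (gamma / (gamma - 1.0))
        return T0_T, p0_p

    def normal_shock(M1, gamma=1.4):
        """Downstream Mach number and static pressure ratio across a normal shock."""
        M2 = math.sqrt((1.0 + 0.5 * (gamma - 1.0) * M1**2)
                       / (gamma * M1**2 - 0.5 * (gamma - 1.0)))
        p2_p1 = 1.0 + 2.0 * gamma / (gamma + 1.0) * (M1**2 - 1.0)
        return M2, p2_p1

    T0_T, p0_p = isentropic_ratios(2.0)
    M2, p2_p1 = normal_shock(2.0)
    print(f"M=2.0: T0/T={T0_T:.3f}, p0/p={p0_p:.3f}, "
          f"post-shock M2={M2:.3f}, p2/p1={p2_p1:.3f}")
    ```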

  12. A toolbox for safety instrumented system evaluation based on improved continuous-time Markov chain

    NASA Astrophysics Data System (ADS)

    Wardana, Awang N. I.; Kurniady, Rahman; Pambudi, Galih; Purnama, Jaka; Suryopratomo, Kutut

    2017-08-01

    Safety instrumented systems (SIS) are designed to restore a plant to a safe condition when a pre-hazardous event occurs. They play a vital role, especially in the process industries. A SIS must meet its safety requirement specifications, and to confirm this the SIS must be evaluated. Typically, the evaluation is calculated by hand. This paper presents a toolbox for SIS evaluation. It is developed based on an improved continuous-time Markov chain and supports a detailed approach to evaluation. This paper also illustrates an industrial application of the toolbox to evaluate the arch burner safety system of a primary reformer. The results of the case study demonstrate that the toolbox can be used to evaluate an industrial SIS in detail and to plan the maintenance strategy.
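
    A continuous-time Markov chain evaluation of this kind reduces to building a generator matrix and propagating the state probabilities. The sketch below uses hypothetical failure data for a simple one-channel (1oo1) safety function, not the toolbox's improved model, and computes the probability of being in the dangerous undetected state over a proof-test interval together with its average (PFDavg).

    ```python
    import numpy as np
    from scipy.linalg import expm

    # States: 0 = OK, 1 = dangerous undetected failure (simple 1oo1 SIS, hypothetical rates).
    lambda_du = 2e-6       # dangerous undetected failure rate [1/h]
    mu = 0.0               # no repair until the proof test reveals the failure
    Q = np.array([[-lambda_du, lambda_du],
                  [mu,         -mu]])

    T_proof = 8760.0       # proof-test interval: one year, in hours
    times = np.linspace(0.0, T_proof, 200)
    p0 = np.array([1.0, 0.0])
    pfd = np.array([(p0 @ expm(Q * t))[1] for t in times])   # P(dangerous state at t)

    pfd_avg = pfd.mean()   # uniform time grid, so the mean approximates the time average
    print(f"PFD at proof test: {pfd[-1]:.2e}, PFDavg: {pfd_avg:.2e}")
    ```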

  13. biomechZoo: An open-source toolbox for the processing, analysis, and visualization of biomechanical movement data.

    PubMed

    Dixon, Philippe C; Loh, Jonathan J; Michaud-Paquette, Yannick; Pearsall, David J

    2017-03-01

    It is common for biomechanics data sets to contain numerous dependent variables recorded over time, for many subjects, groups, and/or conditions. These data often require standard sorting, processing, and analysis operations to be performed in order to answer research questions. Visualization of these data is also crucial. This manuscript presents biomechZoo, an open-source toolbox that provides tools and graphical user interfaces to help users achieve these goals. The aims of this manuscript are to (1) introduce the main features of the toolbox, including a virtual three-dimensional environment to animate motion data (Director), a data plotting suite (Ensembler), and functions for the computation of three-dimensional lower-limb joint angles, moments, and power and (2) compare these computations to those of an existing validated system. To these ends, the steps required to process and analyze a sample data set via the toolbox are outlined. The data set comprises three-dimensional marker, ground reaction force (GRF), joint kinematic, and joint kinetic data of subjects performing straight walking and 90° turning manoeuvres. Joint kinematics and kinetics processed within the toolbox were found to be similar to outputs from a commercial system. The biomechZoo toolbox represents the work of several years and multiple contributors to provide a flexible platform to examine time-series data sets typical in the movement sciences. The toolbox has previously been used to process and analyse walking, running, and ice hockey data sets, and can integrate existing routines, such as the KineMat toolbox, for additional analyses. The toolbox can help researchers and clinicians new to programming or biomechanics to process and analyze their data through a customizable workflow, while advanced users are encouraged to contribute additional functionality to the project. Students may benefit from using biomechZoo as a learning and research tool. It is hoped that the toolbox can play a role in advancing research in the movement sciences. The biomechZoo m-files, sample data, and help repositories are available online (http://www.biomechzoo.com) under the Apache 2.0 License. The toolbox is supported for Matlab (r2014b or newer, The Mathworks Inc., Natick, USA) for Windows (Microsoft Corp., Redmond, USA) and Mac OS (Apple Inc., Cupertino, USA). Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
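
    The joint-angle computation at the heart of such pipelines is a relative rotation between two segment coordinate systems decomposed into a Cardan/Euler sequence. The sketch below (plain NumPy with a generic XYZ sequence, not biomechZoo's own functions) shows the basic operation on a synthetic flexion example.

    ```python
    import numpy as np

    def cardan_xyz(R):
        """Decompose a rotation matrix R = Rx(a) @ Ry(b) @ Rz(g) into Cardan angles (degrees)."""
        beta = np.arcsin(np.clip(R[0, 2], -1.0, 1.0))
        alpha = np.arctan2(-R[1, 2], R[2, 2])
        gamma = np.arctan2(-R[0, 1], R[0, 0])
        return np.degrees([alpha, beta, gamma])

    def joint_angles(R_proximal, R_distal):
        """Orientation of the distal segment expressed in the proximal segment frame."""
        return cardan_xyz(R_proximal.T @ R_distal)

    def rot_x(deg):
        a = np.radians(deg)
        return np.array([[1, 0, 0],
                         [0, np.cos(a), -np.sin(a)],
                         [0, np.sin(a),  np.cos(a)]])

    # Example: thigh frame aligned with the lab, shank flexed 30 deg about the joint x-axis.
    R_thigh = np.eye(3)
    R_shank = rot_x(30.0)
    print(joint_angles(R_thigh, R_shank))     # approximately [30, 0, 0]
    ```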

  14. Software Toolbox for Low-Frequency Conductivity and Current Density Imaging Using MRI.

    PubMed

    Sajib, Saurav Z K; Katoch, Nitish; Kim, Hyung Joong; Kwon, Oh In; Woo, Eung Je

    2017-11-01

    Low-frequency conductivity and current density imaging using MRI includes magnetic resonance electrical impedance tomography (MREIT), diffusion tensor MREIT (DT-MREIT), conductivity tensor imaging (CTI), and magnetic resonance current density imaging (MRCDI). MRCDI and MREIT provide current density and isotropic conductivity images, respectively, using current-injection phase MRI techniques. DT-MREIT produces anisotropic conductivity tensor images by incorporating diffusion weighted MRI into MREIT. These current-injection techniques are finding clinical applications in diagnostic imaging and also in transcranial direct current stimulation (tDCS), deep brain stimulation (DBS), and electroporation, where treatment currents can function as imaging currents. To avoid adverse effects of nerve and muscle stimulation due to injected currents, CTI utilizes B1 mapping and multi-b diffusion weighted MRI to produce low-frequency anisotropic conductivity tensor images without injecting current. This paper describes numerical implementations of several key mathematical functions for conductivity and current density image reconstructions in MRCDI, MREIT, DT-MREIT, and CTI. To facilitate experimental studies of clinical applications, we developed a software toolbox for these low-frequency conductivity and current density imaging methods. This MR-based conductivity imaging (MRCI) toolbox includes 11 toolbox functions which can be used in the MATLAB environment. The MRCI toolbox is available at http://iirc.khu.ac.kr/software.html. Its functions were tested using several experimental datasets, which are provided together with the toolbox. Users of the toolbox can focus on experimental designs and interpretations of reconstructed images instead of developing their own image reconstruction software. We expect more toolbox functions to be added from future research outcomes.
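
    For orientation, the sketch below shows the basic physical relation behind current density imaging, J = (1/μ0)∇×B (Ampère's law), computed numerically on a grid. It assumes all three field components are available, which is a simplification: MRCDI/MREIT acquisitions typically measure only Bz, and the MRCI toolbox's MATLAB reconstruction functions are not reproduced here.

```python
# Minimal sketch: current density from a measured magnetic flux density via Ampere's law.
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability [T*m/A]

def current_density(Bx, By, Bz, dx, dy, dz):
    """Return (Jx, Jy, Jz) on the same grid as the (nx, ny, nz) field arrays."""
    # np.gradient returns derivatives along each axis in array-index order (x, y, z here).
    dBx_dx, dBx_dy, dBx_dz = np.gradient(Bx, dx, dy, dz)
    dBy_dx, dBy_dy, dBy_dz = np.gradient(By, dx, dy, dz)
    dBz_dx, dBz_dy, dBz_dz = np.gradient(Bz, dx, dy, dz)
    Jx = (dBz_dy - dBy_dz) / MU0
    Jy = (dBx_dz - dBz_dx) / MU0
    Jz = (dBy_dx - dBx_dy) / MU0
    return Jx, Jy, Jz
```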

  15. RTI Strategies That Work in the K-2 Classroom

    ERIC Educational Resources Information Center

    Johnson, Eli; Karns, Michelle

    2011-01-01

    Targeted specifically to K-2 classrooms, the 25 Response-to-Intervention (RTI) strategies in this book are research-based and perfect for teachers who want to expand their toolbox of classroom interventions that work! Contents include: (1) Listening Strategies--Help students focus and understand; (2) Reading Strategies--Help students comprehend…

  16. Contributions to the Nutrient Toolbox: Identifying Drivers, Nutrient Sources, and Attribution of Exceedances

    EPA Science Inventory

    Nutrients are a leading cause of impairments in the United States, and as a result tools are needed to identify drivers of nutrients and response variables (such as chlorophyll a), nutrient sources, and identify causes of exceedances of water quality thresholds. This presentatio...

  17. Gammapy: Python toolbox for gamma-ray astronomy

    NASA Astrophysics Data System (ADS)

    Deil, Christoph; Donath, Axel; Owen, Ellis; Terrier, Regis; Bühler, Rolf; Armstrong, Thomas

    2017-11-01

    Gammapy analyzes gamma-ray data and creates sky images, spectra and lightcurves, from event lists and instrument response information; it can also determine the position, morphology and spectra of gamma-ray sources. It is used to analyze data from H.E.S.S., Fermi-LAT, and the Cherenkov Telescope Array (CTA).

  18. Capturing the 'ome': the expanding molecular toolbox for RNA and DNA library construction.

    PubMed

    Boone, Morgane; De Koker, Andries; Callewaert, Nico

    2018-04-06

    All sequencing experiments and most functional genomics screens rely on the generation of libraries to comprehensively capture pools of targeted sequences. In the past decade especially, driven by the progress in the field of massively parallel sequencing, numerous studies have comprehensively assessed the impact of particular manipulations on library complexity and quality, and characterized the activities and specificities of several key enzymes used in library construction. Fortunately, careful protocol design and reagent choice can substantially mitigate many of the biases these manipulations introduce, and enable reliable representation of sequences in libraries. This review aims to guide the reader through the vast expanse of literature on the subject to promote informed library generation, independent of the application.

  19. Using a Toolbox of Tailored Educational Lessons to Improve Fruit, Vegetable, and Physical Activity Behaviors among African American Women in California

    ERIC Educational Resources Information Center

    Backman, Desiree; Scruggs, Valarie; Atiedu, Akpene Ama; Bowie, Shene; Bye, Larry; Dennis, Angela; Hall, Melanie; Ossa, Alexandra; Wertlieb, Stacy; Foerster, Susan B.

    2011-01-01

    Objective: Evaluate the effectiveness of the "Fruit, Vegetable, and Physical Activity Toolbox for Community Educators" ("Toolbox"), an intervention originally designed for Spanish- and English-speaking audiences, in changing knowledge, attitudes, and behavior among low-income African American women. Design: Quasi-experimental…

  20. Travel demand management: a toolbox of strategies to reduce single-occupant vehicle trips and increase alternate mode usage in Arizona.

    DOT National Transportation Integrated Search

    2012-02-01

    The report provides a suite of recommended strategies to reduce single-occupant vehicle traffic in the urban areas of Phoenix and Tucson, Arizona, which are presented as a travel demand management toolbox. The toolbox includes supporting research...

  1. Proposal for the design of a zero gravity tool storage device

    NASA Technical Reports Server (NTRS)

    Stuckwisch, Sue; Carrion, Carlos A.; Phillips, Lee; Laughlin, Julia; Francois, Jason

    1994-01-01

    Astronauts frequently use a variety of hand tools during space missions, especially on repair missions. A toolbox is needed to allow storage and retrieval of tools with minimal difficulty. The toolbox must contain the tools during launch, landing, and on-orbit operations. The toolbox will be used in the Shuttle Bay and therefore must withstand the hazardous space environment. The three main functions of the toolbox in space are: to protect the tools from the space environment and from damaging one another; to allow quick, one-handed access to the tools; and to minimize the heat transfer between the astronaut's hand and the tools. This proposal explores the primary design issues associated with the design of the toolbox. Included are the customer and design specifications, global and refined function structures, possible solution principles, concept variants, and finally design recommendations.

  2. CAMELOT: Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox

    NASA Astrophysics Data System (ADS)

    Di Carlo, Marilena; Romero Martin, Juan Manuel; Vasile, Massimiliano

    2018-03-01

    Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox (CAMELOT) is a toolbox for the fast preliminary design and optimisation of low-thrust trajectories. It solves highly complex combinatorial problems to plan multi-target missions characterised by long spirals including different perturbations. To do so, CAMELOT implements a novel multi-fidelity approach combining analytical surrogate modelling and accurate computational estimations of the mission cost. Decisions are then made using two optimisation engines included in the toolbox, a single-objective global optimiser, and a combinatorial optimisation algorithm. CAMELOT has been applied to a variety of case studies: from the design of interplanetary trajectories to the optimal de-orbiting of space debris and from the deployment of constellations to on-orbit servicing. In this paper, the main elements of CAMELOT are described and two examples, solved using the toolbox, are presented.
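
    The sketch below illustrates the general multi-fidelity pattern the abstract describes, ranking many candidates with a cheap analytical surrogate and refining only a short list with an expensive model. It is not CAMELOT's actual algorithm; both cost functions are hypothetical stand-ins.

```python
# Minimal sketch of a generic multi-fidelity screening loop (illustrative only).
import random

def surrogate_cost(candidate):
    # Cheap analytical estimate (assumption: essentially free to evaluate).
    return candidate["dv_estimate"]

def high_fidelity_cost(candidate):
    # Expensive numerical propagation (assumption: seconds to minutes per call).
    return candidate["dv_estimate"] * random.uniform(0.95, 1.15)

def multi_fidelity_select(candidates, keep=5):
    # 1) screen everything with the surrogate, 2) re-evaluate the short list accurately.
    short_list = sorted(candidates, key=surrogate_cost)[:keep]
    refined = [(high_fidelity_cost(c), c) for c in short_list]
    return min(refined, key=lambda pair: pair[0])

candidates = [{"target": i, "dv_estimate": random.uniform(2.0, 9.0)} for i in range(200)]
best_cost, best = multi_fidelity_select(candidates)
print(best["target"], round(best_cost, 2))
```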

  3. JWST Wavefront Control Toolbox

    NASA Technical Reports Server (NTRS)

    Shin, Shahram Ron; Aronstein, David L.

    2011-01-01

    A Matlab-based toolbox has been developed for the wavefront control and optimization of segmented optical surfaces, using influence functions to correct for possible misalignments of the James Webb Space Telescope (JWST). The toolbox employs both iterative and non-iterative methods to converge to an optimal solution by minimizing a cost function, and it can be used for either constrained or unconstrained optimization. The control process involves 1 to 7 degrees of freedom per primary-mirror segment in addition to the 5 degrees of freedom of the secondary mirror. The toolbox consists of a series of Matlab/Simulink functions and modules, developed with a "wrapper" approach, that handle the interface and data flow between existing commercial optical modeling software packages such as Zemax and Code V. The limitations of the algorithm are dictated by the constraints of the moving parts in the mirrors.
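
    The non-iterative core of influence-function wavefront control can be summarized as a least-squares problem: with a linearized influence matrix A mapping actuator commands to wavefront change, the command that best flattens a measured wavefront w is c = -pinv(A)·w. The sketch below demonstrates this generic step with synthetic numbers; it is illustrative only and is not the JWST toolbox's algorithm or geometry.

```python
# Minimal sketch: least-squares wavefront correction with an influence matrix.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_act = 500, 18          # assumed: 500 wavefront samples, 18 rigid-body actuators

A = rng.normal(size=(n_pix, n_act))                  # influence functions (columns), assumed known
c_true = rng.normal(scale=0.1, size=n_act)           # hidden misalignment state
w = A @ c_true + rng.normal(scale=1e-3, size=n_pix)  # measured wavefront error

c_cmd = -np.linalg.pinv(A) @ w                       # non-iterative least-squares correction
residual = w + A @ c_cmd
print(f"RMS before: {np.std(w):.4f}, after: {np.std(residual):.4f}")
```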

  4. A CRISPR-Based Toolbox for Studying T Cell Signal Transduction

    PubMed Central

    Chi, Shen; Weiss, Arthur; Wang, Haopeng

    2016-01-01

    The CRISPR/Cas9 system is a powerful technology for performing genome editing in a variety of cell types. To facilitate the application of Cas9 in mapping T cell signaling pathways, we generated a toolbox for large-scale genetic screens in human Jurkat T cells. The toolbox comprises three different Jurkat cell lines expressing distinct Cas9 variants: wild-type Cas9, dCas9-KRAB, and sunCas9. We demonstrated that the toolbox allows us to rapidly disrupt endogenous gene expression at the DNA level and to efficiently repress or activate gene expression at the transcriptional level. The toolbox, in combination with multiple currently existing genome-wide sgRNA libraries, will be useful for systematically investigating T cell signal transduction using both loss-of-function and gain-of-function genetic screens. PMID:27057542

  5. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation.

    PubMed

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in the adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research.

  6. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation

    PubMed Central

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C.; Wong, Willy; Daskalakis, Zafiris J.; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in the adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research. PMID:27774054

  7. A toolbox for computing pebble shape and roundness indexes: experimental tests and recommendations for future applications.

    NASA Astrophysics Data System (ADS)

    Cassel, M.; Piegay, H.; Lave, J.

    2016-12-01

    Pebble rounding caused by attrition is, besides chemical dissolution, breakage, and grain-size segregation, one of the key processes controlling bedload downstream fining in rivers. Downstream change in pebble geometry has been a subject of consideration since Aristotle (Krynine, 1960), and its measurement has represented a challenge since the end of the 19th century, leading to a long-standing debate (Blott and Pye, 2008). A toolbox developed by Roussillon et al. (2009) automatically computes several shape and roundness indexes from images of the 2D projection of pebbles placed on a one-metre-square red board. In order to promote the tool for future applications, we tested the effects of pebble position on the board and of picture resolution and treatment on three shape and roundness indexes. We also compared the downstream patterns of these indexes on two pebble samples of the same lithology collected on the Progo River (Indonesia), based on (i) field observations and (ii) experimentation. Shape and roundness were measured (i) at 8 sites distributed over a distance of 36 km along the river, and (ii) ten times on a set of particles collected at the Progo spring and transported in an annular flume over the same distance. This travel distance was monitored using a passive low-frequency RFID system. Results show that pebble position does not have a significant effect on shape and roundness indexes, but the indexes are sensitive to picture resolution and treatment, so a clear protocol must be followed to avoid observer bias. Downstream changes in roundness indexes are very similar in field and experimental conditions, even though the abrasion environments are distinct. Discontinuities observed in the downstream river pattern but not in the experimental one indicate that changes in Progo River pebble roundness are probably also driven by sediment supplied from tributaries or bank erosion. These results highlight the toolbox's potential for diagnosing river-system function.
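
    As a simple illustration of a 2D shape index computed from a segmented pebble image, the sketch below evaluates the isoperimetric circularity 4πA/P². This is one common index, given only for orientation; the specific shape and roundness indexes implemented in the Roussillon et al. (2009) toolbox are not reproduced here.

```python
# Minimal sketch: circularity of a pebble silhouette from a binary mask.
import numpy as np
from skimage import measure

def circularity(mask):
    """mask: 2-D boolean array with a single pebble silhouette (True inside)."""
    props = measure.regionprops(mask.astype(int))[0]
    return 4.0 * np.pi * props.area / props.perimeter ** 2   # ~1.0 for a perfect disk

# Synthetic test: a filled disk should give a value close to 1.
yy, xx = np.mgrid[:200, :200]
disk = (xx - 100) ** 2 + (yy - 100) ** 2 <= 60 ** 2
print(round(circularity(disk), 3))
```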

  8. The Handover Toolbox: a knowledge exchange and training platform for improving patient care.

    PubMed

    Drachsler, Hendrik; Kicken, Wendy; van der Klink, Marcel; Stoyanov, Slavi; Boshuizen, Henny P A; Barach, Paul

    2012-12-01

    Safe and effective patient handovers remain a global organisational and training challenge. Limited evidence supports available handover training programmes. Customisable training is a promising approach to improve the quality and sustainability of handover training and outcomes. We present a Handover Toolbox designed in the context of the European HANDOVER Project. The Toolbox aims to support physicians, nurses, individuals in health professions training, medical educators and handover experts by providing customised handover training tools for different clinical needs and contexts. The Handover Toolbox uses the Technology Enhanced Learning Design Process (TEL-DP), which encompasses user requirements analysis; writing personas; group concept mapping; analysis of suitable software; plus, minus, interesting rating; and usability testing. TEL-DP is aligned with participatory design approaches and ensures development occurs in close collaboration with, and engagement of, key stakeholders. Application of TEL-DP confirmed that the ideal formats of handover training differs for practicing professionals versus individuals in health profession education programmes. Training experts from different countries differed in their views on the optimal content and delivery of training. Analysis of suitable software identified ready-to-use systems that provide required functionalities and can be further customised to users' needs. Interest rating and usability testing resulted in improved usability, navigation and uptake of the Handover Toolbox. The design of the Handover Toolbox was based on a carefully led stakeholder participatory design using the TEL-DP approach. The Toolbox supports a customisable learning approach that allows trainers to design training that addresses the specific information needs of the various target groups. We offer recommendations regarding the application of the Handover Toolbox to medical educators.

  9. Introduction to TAFI - A Matlab® toolbox for analysis of flexural isostasy

    NASA Astrophysics Data System (ADS)

    Jha, S.; Harry, D. L.; Schutt, D.

    2016-12-01

    The isostatic response to vertical tectonic loads emplaced on a thin elastic plate overlying an inviscid substrate, and the corresponding gravity anomalies, are commonly modeled using well-established theories and methodologies of flexural analysis. However, such analysis requires some mathematical and coding expertise on the part of users. With that in mind, we designed a new interactive Matlab® toolbox called the Toolbox for Analysis of Flexural Isostasy (TAFI). TAFI allows users to create forward models (2-D and 3-D) of flexural deformation of the lithosphere and the resulting gravity anomaly. TAFI computes Green's functions for flexure of an elastic plate subjected to point or line loads, and analytical solutions for harmonic loads. Flexure due to non-impulsive, distributed 2-D or 3-D loads is computed by convolving the appropriate Green's function with a user-supplied, spatially discretized load function. The gravity anomaly associated with each density interface is calculated by taking the Fourier transform of the flexural deflection of these interfaces and estimating the gravity in the wavenumber domain. All models created in TAFI are based on Matlab's intrinsic functions and do not require any specialized toolbox, function, or library except those distributed with TAFI. Modeling functions within TAFI can be called from the Matlab workspace, from within user-written programs, or from TAFI's graphical user interface (GUI). The GUI enables the user to model the flexural deflection of the lithosphere interactively, enabling real-time comparison of the model fit with the observed data constraining the flexural deformation and gravity, and facilitating a rapid search for the best-fitting flexural model. TAFI is a very useful teaching and research tool and has been tested rigorously in graduate-level teaching and basic research environments.
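
    The wavenumber-domain calculation underlying this kind of flexural modeling can be written compactly: for a 2-D (profile) load q(x) on a thin elastic plate, the deflection satisfies ŵ(k) = q̂(k) / (Dk⁴ + Δρg). The Python sketch below demonstrates that relation with FFTs and assumed parameter values; it is an independent illustration, not TAFI's Matlab code.

```python
# Minimal sketch: spectral solution of thin-plate flexure under a distributed load.
import numpy as np

E, nu, Te = 70e9, 0.25, 20e3            # Young's modulus [Pa], Poisson ratio, elastic thickness [m]
D = E * Te**3 / (12.0 * (1.0 - nu**2))  # flexural rigidity [N*m]
g, drho = 9.81, 600.0                   # gravity [m/s^2], mantle-infill density contrast [kg/m^3]

n, dx = 4096, 1000.0                    # 4096 nodes, 1 km spacing
x = (np.arange(n) - n // 2) * dx
q = np.where(np.abs(x) < 50e3, 2700.0 * 9.81 * 2000.0, 0.0)  # 100-km-wide, 2-km-high load [Pa]

k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)          # angular wavenumber [1/m]
w_hat = np.fft.fft(q) / (D * k**4 + drho * g)      # flexural response in the spectral domain
w = np.real(np.fft.ifft(w_hat))

print(f"Maximum deflection = {w.max():.0f} m beneath the load")
```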

  10. Integrating hidden Markov model and PRAAT: a toolbox for robust automatic speech transcription

    NASA Astrophysics Data System (ADS)

    Kabir, A.; Barker, J.; Giurgiu, M.

    2010-09-01

    An automatic time-aligned phone transcription toolbox for English speech corpora has been developed. The toolbox is particularly useful for generating robust automatic transcriptions and can produce phone-level transcriptions using speaker-independent as well as speaker-dependent models without manual intervention. The system is based on the standard hidden Markov model (HMM) approach and was successfully tested on a large audiovisual speech corpus, the GRID corpus. One of the most powerful features of the toolbox is the increased flexibility in speech processing: the speech community can import the automatic transcriptions generated by the HMM Toolkit (HTK) into a popular transcription software package, PRAAT, and vice versa. The toolbox has been evaluated through statistical analysis on the GRID data, which shows that the automatic transcription deviates by an average of 20 ms from the manual transcription.
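
    The kind of format bridging described above can be illustrated with a small converter from HTK label files (one "start end label" line per phone, times in 100-ns units) to a plain-text Praat TextGrid interval tier. This is an independent sketch under those format assumptions, not the toolbox's own code.

```python
# Minimal sketch: convert an HTK .lab file into a single-tier Praat TextGrid.
def htk_lab_to_textgrid(lab_path, textgrid_path, tier_name="phones"):
    with open(lab_path) as f:
        rows = [line.split() for line in f if line.strip()]
    # Convert HTK time units (100 ns) to seconds.
    intervals = [(int(s) / 1e7, int(e) / 1e7, " ".join(lab)) for s, e, *lab in rows]
    xmax = intervals[-1][1]

    lines = [
        'File type = "ooTextFile"', 'Object class = "TextGrid"', '',
        'xmin = 0', f'xmax = {xmax}', 'tiers? <exists>', 'size = 1', 'item []:',
        '    item [1]:', '        class = "IntervalTier"', f'        name = "{tier_name}"',
        '        xmin = 0', f'        xmax = {xmax}',
        f'        intervals: size = {len(intervals)}',
    ]
    for i, (s, e, lab) in enumerate(intervals, start=1):
        lines += [f'        intervals [{i}]:', f'            xmin = {s}',
                  f'            xmax = {e}', f'            text = "{lab}"']
    with open(textgrid_path, "w") as f:
        f.write("\n".join(lines) + "\n")
```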

  11. Evaluation of 14 nonlinear deformation algorithms applied to human brain MRI registration

    PubMed Central

    Klein, Arno; Andersson, Jesper; Ardekani, Babak A.; Ashburner, John; Avants, Brian; Chiang, Ming-Chang; Christensen, Gary E.; Collins, D. Louis; Gee, James; Hellier, Pierre; Song, Joo Hyun; Jenkinson, Mark; Lepage, Claude; Rueckert, Daniel; Thompson, Paul; Vercauteren, Tom; Woods, Roger P.; Mann, J. John; Parsey, Ramin V.

    2009-01-01

    All fields of neuroscience that employ brain imaging need to communicate their results with reference to anatomical regions. In particular, comparative morphometry and group analysis of functional and physiological data require coregistration of brains to establish correspondences across brain structures. It is well established that linear registration of one brain to another is inadequate for aligning brain structures, so numerous algorithms have emerged to nonlinearly register brains to one another. This study is the largest evaluation of nonlinear deformation algorithms applied to brain image registration ever conducted. Fourteen algorithms from laboratories around the world are evaluated using 8 different error measures. More than 45,000 registrations between 80 manually labeled brains were performed by algorithms including: AIR, ANIMAL, ART, Diffeomorphic Demons, FNIRT, IRTK, JRD-fluid, ROMEO, SICLE, SyN, and four different SPM5 algorithms (“SPM2-type” and regular Normalization, Unified Segmentation, and the DARTEL Toolbox). All of these registrations were preceded by linear registration between the same image pairs using FLIRT. One of the most significant findings of this study is that the relative performances of the registration methods under comparison appear to be little affected by the choice of subject population, labeling protocol, and type of overlap measure. This is important because it suggests that the findings are generalizable to new subject populations that are labeled or evaluated using different labeling protocols. Furthermore, we ranked the 14 methods according to three completely independent analyses (permutation tests, one-way ANOVA tests, and indifference-zone ranking) and derived three almost identical top rankings of the methods. ART, SyN, IRTK, and SPM's DARTEL Toolbox gave the best results according to overlap and distance measures, with ART and SyN delivering the most consistently high accuracy across subjects and label sets. Updates will be published on the http://www.mindboggle.info/papers/ website. PMID:19195496

  12. The Smithsonian-led Marine Global Earth Observatory (MarineGEO): Proposed Model for a Collaborative Network Linking Marine Biodiversity to Ecosystem Processes

    NASA Astrophysics Data System (ADS)

    Duffy, J. E.

    2016-02-01

    Biodiversity - the variety of functional types of organisms - is the engine of marine ecosystem processes, including productivity, nutrient cycling, and carbon sequestration. Biodiversity remains a black box in much of ocean science, despite wide recognition that effectively managing human interactions with marine ecosystems requires understanding both structure and functional consequences of biodiversity. Moreover, the inherent complexity of biological systems puts a premium on data-rich, comparative approaches, which are best met via collaborative networks. The Smithsonian Institution's MarineGEO program links a growing network of partners conducting parallel, comparative research to understand change in marine biodiversity and ecosystems, natural and anthropogenic drivers of that change, and the ecological processes mediating it. The focus is on nearshore, seabed-associated systems where biodiversity and human population are concentrated and interact most, yet which fall through the cracks of existing ocean observing programs. MarineGEO offers a standardized toolbox of research modules that efficiently capture key elements of biological diversity and its importance in ecological processes across a range of habitats. The toolbox integrates high-tech (DNA-based, imaging) and low-tech protocols (diver surveys, rapid assays of consumer activity) adaptable to differing institutional capacity and resources. The model for long-term sustainability involves leveraging in-kind support among partners, adoption of best practices wherever possible, engagement of students and citizen scientists, and benefits of training, networking, and global relevance as incentives for participation. Here I highlight several MarineGEO comparative research projects demonstrating the value of standardized, scalable assays and parallel experiments for measuring fish and invertebrate diversity, recruitment, benthic herbivory and generalist predation, decomposition, and carbon sequestration. Key remaining challenges include consensus on protocols; integration of historical data; data management and access; and informatics. These challenges are common to other fields and prospects for progress in the near future are good.

  13. PyGaze: an open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments.

    PubMed

    Dalmaijer, Edwin S; Mathôt, Sebastiaan; Van der Stigchel, Stefan

    2014-12-01

    The PyGaze toolbox is an open-source software package for Python, a high-level programming language. It is designed for creating eyetracking experiments in Python syntax with the least possible effort, and it offers programming ease and script readability without constraining functionality and flexibility. PyGaze can be used for visual and auditory stimulus presentation; for response collection via keyboard, mouse, joystick, and other external hardware; and for the online detection of eye movements using a custom algorithm. A wide range of eyetrackers of different brands (EyeLink, SMI, and Tobii systems) are supported. The novelty of PyGaze lies in providing an easy-to-use layer on top of the many different software libraries that are required for implementing eyetracking experiments. Essentially, PyGaze is a software bridge for eyetracking research.

  14. Photonics and spectroscopy in nanojunctions: a theoretical insight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galperin, Michael

    The progress of experimental techniques at the nanoscale in the last decade made optical measurements in current-carrying nanojunctions a reality, thus indicating the emergence of a new field of research coined optoelectronics. Optical spectroscopy of open nonequilibrium systems is a natural meeting point for (at least) two research areas: nonlinear optical spectroscopy and quantum transport, each with its own theoretical toolbox. We review recent progress in the field comparing theoretical treatments of optical response in nanojunctions as is accepted in nonlinear spectroscopy and quantum transport communities. A unified theoretical description of spectroscopy in nanojunctions is presented. Here, we argue that theoretical approaches of the quantum transport community (and in particular, the Green function based considerations) yield a convenient tool for optoelectronics when the radiation field is treated classically, and that differences between the toolboxes may become critical when studying the quantum radiation field in junctions.

  15. The Decoding Toolbox (TDT): a versatile software package for multivariate analyses of functional imaging data

    PubMed Central

    Hebart, Martin N.; Görgen, Kai; Haynes, John-Dylan

    2015-01-01

    The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT) which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns. PMID:25610393
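
    For readers unfamiliar with decoding analyses, the sketch below shows the generic cross-validated classification scheme that packages like TDT wrap (TDT itself is Matlab/SPM-based and offers much more, e.g., searchlights and representational similarity analysis). The data here are synthetic stand-ins for trial-wise activation patterns.

```python
# Minimal Python sketch of cross-validated multivoxel decoding (illustrative only).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(42)
n_runs, n_trials_per_run, n_voxels = 6, 20, 300
X = rng.normal(size=(n_runs * n_trials_per_run, n_voxels))   # one pattern per trial
y = np.tile([0, 1], n_runs * n_trials_per_run // 2)          # two conditions
X[y == 1, :10] += 0.5                                        # weak signal in a few voxels

# Leave-one-run-out style cross-validation, approximated here with stratified folds.
scores = cross_val_score(SVC(kernel="linear"), X, y,
                         cv=StratifiedKFold(n_splits=n_runs, shuffle=True, random_state=0))
print(f"Decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```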

  16. National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox

    USGS Publications Warehouse

    Price, Curtis

    2010-01-01

    This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells.

  17. Snap-, CLIP- and Halo-Tag Labelling of Budding Yeast Cells

    PubMed Central

    Stagge, Franziska; Mitronova, Gyuzel Y.; Belov, Vladimir N.; Wurm, Christian A.; Jakobs, Stefan

    2013-01-01

    Fluorescence microscopy of the localization and the spatial and temporal dynamics of specifically labelled proteins is an indispensable tool in cell biology. Besides fluorescent proteins as tags, tag-mediated labelling utilizing self-labelling proteins such as the SNAP-, CLIP-, or Halo-tag provides widely used, flexible labelling systems relying on exogenously supplied fluorophores. Unfortunately, labelling of live budding yeast cells proved to be challenging with these approaches because of the limited accessibility of the cell interior to the dyes. In this study we developed a fast and reliable electroporation-based labelling protocol for living budding yeast cells expressing SNAP-, CLIP-, or Halo-tagged fusion proteins. For the Halo-tag, we demonstrate that it is crucial to use the 6′-carboxy isomers and not the 5′-carboxy isomers of important dyes to ensure cell viability. We report on a simple rule for the analysis of 1H NMR spectra to discriminate between 6′- and 5′-carboxy isomers of fluorescein and rhodamine derivatives. We demonstrate the usability of the labelling protocol by imaging yeast cells with STED super-resolution microscopy and dual colour live cell microscopy. The large number of available fluorophores for these self-labelling proteins and the simplicity of the protocol described here expand the available toolbox for the model organism Saccharomyces cerevisiae. PMID:24205303

  18. SOCIB Glider toolbox: from sensor to data repository

    NASA Astrophysics Data System (ADS)

    Pau Beltran, Joan; Heslop, Emma; Ruiz, Simón; Troupin, Charles; Tintoré, Joaquín

    2015-04-01

    Nowadays in oceanography, gliders constitute a mature, cost-effective technology for the acquisition of measurements independently of the sea state (unlike ships), providing subsurface data during sustained periods, including extreme weather events. The SOCIB glider toolbox is a set of MATLAB/Octave scripts and functions developed to manage the data collected by a glider fleet. They cover the main stages of the data management process, in both real-time and delayed-time modes: metadata aggregation, downloading, processing, and the automatic generation of data products and figures. The toolbox is distributed under the GNU licence (http://www.gnu.org/copyleft/gpl.html) and is available at http://www.socib.es/users/glider/glider_toolbox.

  19. A Data Analysis Toolbox for Modeling the Global Food-Energy-Water Nexus

    NASA Astrophysics Data System (ADS)

    AghaKouchak, A.; Sadegh, M.; Mallakpour, I.

    2017-12-01

    Water, food, and energy systems are highly interconnected. More than seventy percent of the global water resource is used for food production. Water withdrawal, purification, and transfer systems are energy intensive. Furthermore, energy generation strongly depends on water availability. Therefore, considering the interactions in the nexus of water, food, and energy is crucial for the sustainable management of available resources. In this presentation, we introduce a user-friendly data analysis toolbox that mines the available global data on food, energy, and water and analyzes their interactions. This toolbox provides estimates of the water footprint for a wide range of food types in different countries and also approximates the required energy and water resources. The toolbox also provides estimates of the corresponding emissions and biofuel production of different crops. In summary, this toolbox allows evaluating the dependencies of the food, energy, and water systems at the country scale. We present a global analysis of the interactions between water, food, and energy from different perspectives, including the efficiency and diversity of resource use.

  20. User Guide for Compressible Flow Toolbox Version 2.1 for Use With MATLAB(Registered Trademark); Version 7

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2006-01-01

    This report provides a user guide for the Compressible Flow Toolbox, a collection of algorithms that solve almost 300 linear and nonlinear classical compressible flow relations. The algorithms, implemented in the popular MATLAB programming language, are useful for analysis of one-dimensional steady flow with constant entropy, friction, heat transfer, or shock discontinuities. The solutions do not include any gas dissociative effects. The toolbox also contains functions for comparing and validating the equation-solving algorithms against solutions previously published in the open literature. The classical equations solved by the Compressible Flow Toolbox are: isentropic-flow equations, Fanno flow equations (pertaining to flow of an ideal gas in a pipe with friction), Rayleigh flow equations (pertaining to frictionless flow of an ideal gas, with heat transfer, in a pipe of constant cross section), normal-shock equations, oblique-shock equations, and Prandtl-Meyer expansion equations. At the time this report was published, the Compressible Flow Toolbox was available without cost from the NASA Software Repository.
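
    As a short worked example of the classical relations such a toolbox solves, the sketch below evaluates the standard isentropic-flow ratios as functions of Mach number. This is an independent Python illustration of the textbook equations, not the MATLAB toolbox's own API.

```python
# Minimal sketch: isentropic stagnation ratios and area ratio for a given Mach number.
def isentropic(M, gamma=1.4):
    """Stagnation-to-static ratios and area ratio A/A* for Mach number M."""
    T0_T = 1.0 + 0.5 * (gamma - 1.0) * M**2
    p0_p = T0_T ** (gamma / (gamma - 1.0))
    rho0_rho = T0_T ** (1.0 / (gamma - 1.0))
    A_Astar = (1.0 / M) * ((2.0 / (gamma + 1.0)) * T0_T) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
    return {"T0/T": T0_T, "p0/p": p0_p, "rho0/rho": rho0_rho, "A/A*": A_Astar}

# Air (gamma = 1.4) at Mach 2: T0/T = 1.8, p0/p ~ 7.824, A/A* ~ 1.688.
for key, value in isentropic(2.0).items():
    print(f"{key:9s} = {value:.4f}")
```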

  1. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG

    PubMed Central

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

    The reference electrode standardization technique (REST) has been increasingly acknowledged and applied in recent years as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERP) community around the world. However, an easy-to-use toolbox for re-referencing scalp EEG data to the zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version intended to be more convenient and efficient for experienced users. Both are designed to provide ease of use for novice researchers and flexibility for experienced researchers. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, and detailed information, including publications, comments, and documents on REST, can also be found on this website. An example of usage is given, with comparative results of REST and the average reference. We hope these user-friendly REST toolboxes will make the relatively novel technique of REST easier to study, especially for applications in various EEG studies. PMID:29163006
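
    For context, the sketch below shows the conventional average re-referencing that REST results are typically compared against. REST itself additionally requires a head model (lead-field matrix) and is not reproduced here; the data shapes are illustrative assumptions.

```python
# Minimal sketch: average re-referencing of multi-channel EEG (the comparison baseline).
import numpy as np

def average_rereference(eeg):
    """eeg: (n_channels, n_samples) array referenced to any common electrode.

    Returns the data re-referenced to the instantaneous average of all channels."""
    return eeg - eeg.mean(axis=0, keepdims=True)

rng = np.random.default_rng(1)
eeg = rng.normal(size=(32, 1000))                     # 32 channels, 1000 samples
rereferenced = average_rereference(eeg)
print(np.allclose(rereferenced.mean(axis=0), 0.0))    # channel mean is now ~0 at every sample
```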

  2. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG.

    PubMed

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

    The reference electrode standardization technique (REST) has been increasingly acknowledged and applied in recent years as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERP) community around the world. However, an easy-to-use toolbox for re-referencing scalp EEG data to the zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version intended to be more convenient and efficient for experienced users. Both are designed to provide ease of use for novice researchers and flexibility for experienced researchers. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, and detailed information, including publications, comments, and documents on REST, can also be found on this website. An example of usage is given, with comparative results of REST and the average reference. We hope these user-friendly REST toolboxes will make the relatively novel technique of REST easier to study, especially for applications in various EEG studies.

  3. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Benveniste, Jérôme; Knudsen, Per

    2016-07-01

    The GOCE User Toolbox (GUT) is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Mac. The toolbox is supported by the GUT Algorithm Description and User Guide and the GUT Install Guide. A set of a-priori data and models is made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, it adds the capability to compute the simple Bouguer anomaly (solid Earth). This fall, a new GUT version 3 has been released. GUTv3 was further developed through a collaborative effort in which the scientific communities participate, aiming at an implementation of the remaining functionalities and facilitating a wider span of research in the fields of Geodesy, Oceanography and Solid Earth studies. Accordingly, GUT version 3 has: an attractive and easy-to-use graphical user interface (GUI) for the toolbox; enhanced software functionalities, such as facilities for the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies; and an associated GUT VCM tool for analysing the GOCE variance-covariance matrices.

  4. A versatile software package for inter-subject correlation based analyses of fMRI.

    PubMed

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting the widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with resampling-based statistical inference. ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and the implementation of the toolbox, and we demonstrate its possible use by summarizing selected example applications. We also report computation-time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
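
    The core quantity described above is simple to state: for a given brain location, correlate the time series across all subject pairs and summarize. The sketch below shows that computation on synthetic data; it is an independent illustration, not the ISC Toolbox's Matlab code, which adds statistics, time-window variants, and cluster parallelization.

```python
# Minimal sketch: mean pairwise inter-subject correlation for one voxel or region.
import numpy as np

def mean_pairwise_isc(timeseries):
    """timeseries: (n_subjects, n_timepoints) array for a single voxel or region."""
    r = np.corrcoef(timeseries)              # subject-by-subject correlation matrix
    iu = np.triu_indices_from(r, k=1)        # upper triangle = every subject pair once
    return float(r[iu].mean())

rng = np.random.default_rng(0)
shared = rng.normal(size=300)                               # stimulus-driven component
data = shared + rng.normal(scale=1.5, size=(10, 300))       # 10 subjects, 300 time points
print(f"Mean pairwise ISC = {mean_pairwise_isc(data):.2f}")
```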

  5. A versatile software package for inter-subject correlation based analyses of fMRI

    PubMed Central

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, and thus ISC is a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting the widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with resampling-based statistical inference. ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and the implementation of the toolbox, and we demonstrate its possible use by summarizing selected example applications. We also report computation-time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces the computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/ PMID:24550818

  6. Photonics and spectroscopy in nanojunctions: a theoretical insight

    DOE PAGES

    Galperin, Michael

    2017-04-11

    The progress of experimental techniques at the nanoscale in the last decade made optical measurements in current-carrying nanojunctions a reality, thus indicating the emergence of a new field of research coined optoelectronics. Optical spectroscopy of open nonequilibrium systems is a natural meeting point for (at least) two research areas: nonlinear optical spectroscopy and quantum transport, each with its own theoretical toolbox. We review recent progress in the field comparing theoretical treatments of optical response in nanojunctions as is accepted in nonlinear spectroscopy and quantum transport communities. A unified theoretical description of spectroscopy in nanojunctions is presented. Here, we argue that theoretical approaches of the quantum transport community (and in particular, the Green function based considerations) yield a convenient tool for optoelectronics when the radiation field is treated classically, and that differences between the toolboxes may become critical when studying the quantum radiation field in junctions.

  7. Open source tools for the information theoretic analysis of neural data.

    PubMed

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
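
    To make the basic quantity concrete, the sketch below implements the plug-in (direct) estimator of mutual information between a discrete stimulus and a discretized neural response. The bias corrections and advanced estimators provided by the information-theoretic toolboxes reviewed here are not included, and the data are synthetic.

```python
# Minimal sketch: plug-in mutual information between stimulus and response, in bits.
import numpy as np

def mutual_information(stimuli, responses):
    """stimuli, responses: 1-D integer arrays of equal length. Returns MI in bits."""
    joint = np.zeros((stimuli.max() + 1, responses.max() + 1))
    for s, r in zip(stimuli, responses):
        joint[s, r] += 1
    joint /= joint.sum()
    ps, pr = joint.sum(axis=1), joint.sum(axis=0)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / np.outer(ps, pr)[nz])))

rng = np.random.default_rng(0)
stim = rng.integers(0, 4, size=5000)                          # 4 stimulus classes
resp = np.clip(stim + rng.integers(-1, 2, size=5000), 0, 4)   # noisy spike-count response
print(f"I(S;R) = {mutual_information(stim, resp):.2f} bits")
```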

  8. Air Sensor Toolbox

    EPA Pesticide Factsheets

    Air Sensor Toolbox provides information to citizen scientists, researchers and developers interested in learning more about new lower-cost compact air sensor technologies and tools for measuring air quality.

  9. Demonstration of a Fractured Rock Geophysical Toolbox (FRGT) for Characterization and Monitoring of DNAPL Biodegradation in Fractured Rock Aquifers

    DTIC Science & Technology

    2016-01-01

    User's Guide: Demonstration of a Fractured Rock Geophysical Toolbox (FRGT) for Characterization and Monitoring of DNAPL Biodegradation in Fractured Rock Aquifers. F.D. Day-Lewis, C.D. Johnson, J.H. Williams, C.L... Keywords: DNAPL biodegradation characterization and monitoring, remediation, fractured rock aquifers.

  10. Explaining Society: An Expanded Toolbox for Social Scientists

    PubMed Central

    Bell, David C.; Atkinson-Schnell, Jodie L.; DiBacco, Aron E.

    2012-01-01

    We propose for social scientists a theoretical toolbox containing a set of motivations that neurobiologists have recently validated. We show how these motivations can be used to create a theory of society recognizably similar to existing stable societies (sustainable, self-reproducing, and largely peaceful). Using this toolbox, we describe society in terms of three institutions: economy (a source of sustainability), government (peace), and the family (reproducibility). Conducting a thought experiment in three parts, we begin with a simple theory with only two motivations. We then create successive theories that systematically add motivations, showing that each element in the toolbox makes its own contribution to explain the workings of a stable society and that the family has a critical role in this process. PMID:23082093

  11. Complete scanpaths analysis toolbox.

    PubMed

    Augustyniak, Piotr; Mikrut, Zbigniew

    2006-01-01

    This paper presents a complete open software environment for the control, data processing, and assessment of visual experiments. Visual experiments are widely used in research on the physiology of human perception, and the results are applicable to various visual-information-based man-machine interfaces, human-emulated automatic visual systems, and scanpath-based learning of perceptual habits. The toolbox is designed for the Matlab platform and supports an infrared reflection-based eyetracker in calibration and scanpath-analysis modes. Toolbox procedures are organized in three layers: a lower layer communicating with the eyetracker output file, a middle layer detecting scanpath events on a physiological basis, and an upper layer consisting of experiment schedule scripts, statistics, and summaries. Several examples of visual experiments carried out with the presented toolbox complete the paper.
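
    As an illustration of the kind of "middle layer" event detection mentioned above, the sketch below implements a simplified dispersion-threshold fixation detector (I-DT style). It is not the toolbox's own algorithm, and the thresholds and synthetic data are assumptions.

```python
# Minimal sketch: dispersion-threshold fixation detection on a gaze recording.
import numpy as np

def detect_fixations(x, y, t, max_dispersion=1.0, min_duration=0.1):
    """x, y: gaze position in degrees; t: time in seconds (1-D numpy arrays).

    Returns a list of fixations as (t_start, t_end, centroid_x, centroid_y)."""
    fixations, start = [], 0
    for end in range(len(t)):
        window_x, window_y = x[start:end + 1], y[start:end + 1]
        dispersion = (window_x.max() - window_x.min()) + (window_y.max() - window_y.min())
        if dispersion > max_dispersion:                   # window is no longer a fixation
            if t[end - 1] - t[start] >= min_duration:
                fixations.append((t[start], t[end - 1], x[start:end].mean(), y[start:end].mean()))
            start = end                                   # start a new candidate window
    if t[-1] - t[start] >= min_duration:                  # flush the final window
        fixations.append((t[start], t[-1], x[start:].mean(), y[start:].mean()))
    return fixations

# Synthetic example: gaze rests near 0 deg for 0.5 s, then near 5 deg for 0.5 s.
t = np.arange(0.0, 1.0, 0.01)
x = np.where(t < 0.5, 0.0, 5.0) + 0.05 * np.random.default_rng(0).normal(size=t.size)
y = np.zeros_like(t)
print(len(detect_fixations(x, y, t)))                     # expect 2 fixations
```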

  12. Political Leadership in the Time of Crises: Primum non Nocere.

    PubMed

    Burkle, Frederick M; Hanfling, Dan

    2015-05-29

    Long before the 2014 Ebola outbreak in West Africa, the United States was already experiencing a failure of confidence between politicians and scientists, primarily focused on differences of opinion on climate extremes. This ongoing clash has culminated in an environment where politicians most often no longer listen to scientists. Importation of Ebola virus to the United States prompted an immediate political fervor over travel bans, sealing off borders, and disputes over the reliability of both quarantine and treatment protocols. This demonstrated that evidence-based scientific discourse risks taking a back seat to political hyperbole and fear. The role of public health and medical expertise should be to ensure that cogent response strategies, based upon good science and accumulated knowledge and experience, are put in place to help inform the development of sound public policy. But in times of crisis, such reasoned expertise and experience are too often overlooked in favor of the partisan press "sound bite", where fear and insecurity have proved to be severely counterproductive. While scientists recognize that science cannot be entirely apolitical, the lessons from the impact of Ebola on political discourse show that there is a need for stronger engagement of the scientific community in crafting the messages required for response to such events. This includes the creation of moral and ethical standards for the press, politicians and scientists, a partnership of confidence between the three that does not now exist, and an "elected officials" toolbox that helps to translate scientific evidence and experience into readily acceptable policy and public communication.

  13. Toolbox for Renewable Energy Project Development

    EPA Pesticide Factsheets

    The Toolbox for Renewable Energy Project Development summarizes key project development issues, addresses how to overcome major hurdles, and provides a curated directory of project development resources.

  14. The Radiology Resident iPad Toolbox: an educational and clinical tool for radiology residents.

    PubMed

    Sharpe, Emerson E; Kendrick, Michael; Strickland, Colin; Dodd, Gerald D

    2013-07-01

    Tablet computing and mobile resources are the hot topics in technology today, with that interest spilling into the medical field. To improve resident education, a fully configured iPad, referred to as the "Radiology Resident iPad Toolbox," was created and implemented at the University of Colorado. The goal was to create a portable device with comprehensive educational, clinical, and communication tools that would contain all necessary resources for an entire 4-year radiology residency. The device was distributed to a total of 34 radiology residents (8 first-year residents, 8 second-year residents, 9 third-year residents, and 9 fourth-year residents). This article describes the process used to develop and deploy the device, provides a distillation of useful applications and resources decided upon after extensive evaluation, and assesses the impact this device had on resident education. The Radiology Resident iPad Toolbox is a cost-effective, portable, educational instrument that has increased studying efficiency; improved access to study materials such as books, radiology cases, lectures, and web-based resources; and increased interactivity in educational conferences and lectures through the use of audience-response software, with questions geared toward the new ABR board format. This preconfigured tablet fully embraces the technology shift into mobile computing and represents a paradigm shift in educational strategy. Copyright © 2013 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  15. Acrylamide mitigation strategies: critical appraisal of the FoodDrinkEurope toolbox.

    PubMed

    Palermo, M; Gökmen, V; De Meulenaer, B; Ciesarová, Z; Zhang, Y; Pedreschi, F; Fogliano, V

    2016-06-15

    The FoodDrinkEurope Federation recently released the latest version of the Acrylamide Toolbox to support manufacturers in acrylamide reduction activities, giving an indication of the possible mitigation strategies. The Toolbox is intended for small and medium-sized enterprises with limited R&D resources; however, no comments about the pros and cons of the different measures were provided to advise potential users. Experts in the field are aware that not all the strategies proposed have equal value in terms of efficacy and cost/benefit ratio. This consideration prompted us to provide a qualitative, science-based ranking of the mitigation strategies proposed in the Acrylamide Toolbox, focusing on bakery and fried potato products. Five authors from different geographical areas with a publication record on acrylamide mitigation strategies worked independently, ranking the efficacy of the acrylamide mitigation strategies according to three key parameters: (i) reduction rate; (ii) side effects; and (iii) applicability and economic impact. On the basis of their own experience and considering selected literature of the last ten years, the authors scored each mitigation strategy proposed in the Toolbox on each key parameter. As expected, all strategies selected in the Toolbox turned out to be useful, though not all to the same degree. The use of the enzyme asparaginase and the selection of low-sugar varieties were considered the best mitigation strategies in bakery and in potato products, respectively. In the authors' opinion, most of the other mitigation strategies, although effective, either have relevant side effects on the sensory profile of the products or are not easy to implement in industrial production. The final outcome is a science-based, commented ranking that can enrich the Acrylamide Toolbox, supporting individual manufacturers in taking the best actions to reduce the acrylamide content in their specific production context.
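
    The ranking procedure described above (independent expert scores on three parameters, aggregated into a single ordering) can be sketched in a few lines. The strategy names, score values and equal parameter weighting below are hypothetical and only illustrate the aggregation step, not the published results.

```python
# Illustrative aggregation of expert scores for mitigation strategies.
# Strategy names, scores, and equal weighting are hypothetical examples,
# not the values used in the published ranking.
from statistics import mean

# scores[strategy][parameter] = scores from the five experts (1 = poor, 5 = good)
scores = {
    "asparaginase": {"reduction": [5, 5, 4, 5, 4], "side_effects": [4, 5, 4, 4, 5], "applicability": [3, 4, 3, 4, 3]},
    "lower_baking_temperature": {"reduction": [3, 3, 2, 3, 3], "side_effects": [2, 2, 3, 2, 2], "applicability": [4, 4, 4, 3, 4]},
}

def overall_score(per_parameter):
    # Average each parameter over experts, then average the three parameters equally.
    return mean(mean(expert_scores) for expert_scores in per_parameter.values())

ranking = sorted(scores, key=lambda s: overall_score(scores[s]), reverse=True)
for strategy in ranking:
    print(f"{strategy}: {overall_score(scores[strategy]):.2f}")
```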

  16. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Knudsen, Per; Benveniste, Jerome

    2017-04-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Mac. The toolbox is supported by the GUT Algorithm Description and User Guide and the GUT Install Guide. A set of a-priori data and models is made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, added the capability to compute the Simple Bouguer Anomaly (Solid Earth). This fall a new GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at an implementation of the remaining functionalities to facilitate a wider span of research in the fields of Geodesy, Oceanography and Solid Earth studies.
Accordingly, GUT version 3 provides:
 - an attractive and easy-to-use Graphical User Interface (GUI) for the toolbox,
 - further software functionalities, such as support for the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies,
 - an associated GUT VCM tool for analysing the GOCE variance-covariance matrices.

  17. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Knudsen, Per; Benveniste, Jerome; Team Gut

    2016-04-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Mac. The toolbox is supported by the GUT Algorithm Description and User Guide and the GUT Install Guide. A set of a-priori data and models is made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, added the capability to compute the Simple Bouguer Anomaly (Solid Earth). This fall a new GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at an implementation of the remaining functionalities to facilitate a wider span of research in the fields of Geodesy, Oceanography and Solid Earth studies.
Accordingly, GUT version 3 provides:
 - an attractive and easy-to-use Graphical User Interface (GUI) for the toolbox,
 - further software functionalities, such as support for the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies,
 - an associated GUT VCM tool for analysing the GOCE variance-covariance matrices.

  18. An image analysis toolbox for high-throughput C. elegans assays

    PubMed Central

    Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.

    2012-01-01

    We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656
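
    As a rough illustration of the kind of per-object morphological measurement described above, the following sketch labels a binary mask and reports simple shape features with scikit-image. It is not the WormToolbox/CellProfiler implementation, and the synthetic mask stands in for an actual worm segmentation.

```python
# Toy per-object morphology measurement, loosely analogous to scoring worms
# in a segmented image; not the actual WormToolbox/CellProfiler code.
import numpy as np
from skimage.measure import label, regionprops

# Placeholder "segmentation": two elongated blobs standing in for worms.
mask = np.zeros((100, 100), dtype=bool)
mask[20:25, 10:70] = True   # worm-like object 1
mask[60:64, 30:90] = True   # worm-like object 2

for region in regionprops(label(mask)):
    print(f"object {region.label}: area={region.area}, "
          f"length~{region.major_axis_length:.1f}, "
          f"eccentricity={region.eccentricity:.2f}")
```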

  19. FISSA: A neuropil decontamination toolbox for calcium imaging signals.

    PubMed

    Keemink, Sander W; Lowe, Scott C; Pakan, Janelle M P; Dylda, Evelyn; van Rossum, Mark C W; Rochefort, Nathalie L

    2018-02-22

    In vivo calcium imaging has become a method of choice to image neuronal population activity throughout the nervous system. These experiments generate large sequences of images. Their analysis is computationally intensive and typically involves motion correction, image segmentation into regions of interest (ROIs), and extraction of fluorescence traces from each ROI. Out-of-focus fluorescence from surrounding neuropil and other cells can strongly contaminate the signal assigned to a given ROI. In this study, we introduce the FISSA toolbox (Fast Image Signal Separation Analysis) for neuropil decontamination. Given pre-defined ROIs, the FISSA toolbox automatically extracts the surrounding local neuropil and performs blind-source separation with non-negative matrix factorization. Using both simulated and in vivo data, we show that this toolbox performs similarly to or better than existing published methods. FISSA requires little RAM and allows for fast processing of large datasets even on a standard laptop. The FISSA toolbox is available in Python, with an option for MATLAB-format outputs, and can easily be integrated into existing workflows. It is available from GitHub and the standard Python repositories.
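
    The core idea, blind-source separation of an ROI trace from surrounding neuropil using non-negative matrix factorization, can be sketched with generic tools. The code below is not FISSA's API; it is a minimal NumPy/scikit-learn illustration of the decomposition step on synthetic traces.

```python
# Minimal illustration of NMF-based separation of an ROI signal from neuropil
# contamination, analogous in spirit to the FISSA workflow (not its API).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
t = np.arange(1000)
cell = np.exp(-((t % 200) / 30.0))          # ROI "transients"
neuropil = 0.5 + 0.3 * np.sin(t / 50.0)     # slow contaminating background

# Observed traces: the ROI trace is a mix; the surrounding neuropil trace is mostly background.
roi_trace = cell + 0.7 * neuropil + 0.05 * rng.random(t.size)
npil_trace = neuropil + 0.05 * rng.random(t.size)

X = np.vstack([roi_trace, npil_trace])       # measurements x time, non-negative
model = NMF(n_components=2, init="nndsvda", random_state=0, max_iter=500)
W = model.fit_transform(X)                   # mixing weights per measurement
H = model.components_                        # separated source traces

# The source with the larger weight in the ROI measurement is taken as the cell signal.
cell_idx = int(np.argmax(W[0]))
print("estimated cell signal shape:", H[cell_idx].shape)
```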

  20. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
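
    To make the role of trial count concrete, the sketch below estimates variance components for a single-facet (persons x trials) design with a plain ANOVA decomposition and computes a generalizability coefficient for different numbers of retained trials. This is a textbook estimator on synthetic data, not the ERA Toolbox's algorithm or MATLAB interface.

```python
# Sketch of a single-facet (persons x trials) generalizability analysis,
# illustrating how trial count enters an ERP reliability estimate.
import numpy as np

rng = np.random.default_rng(1)
n_p, n_t = 30, 40                         # persons, trials
true_person = rng.normal(0, 2.0, (n_p, 1))
scores = true_person + rng.normal(0, 3.0, (n_p, n_t))   # single-trial ERP scores

grand = scores.mean()
person_means = scores.mean(axis=1)
trial_means = scores.mean(axis=0)

ss_p = n_t * np.sum((person_means - grand) ** 2)
ss_t = n_p * np.sum((trial_means - grand) ** 2)
ss_res = np.sum((scores - person_means[:, None] - trial_means[None, :] + grand) ** 2)

ms_p = ss_p / (n_p - 1)
ms_res = ss_res / ((n_p - 1) * (n_t - 1))

var_p = max((ms_p - ms_res) / n_t, 0.0)   # person (true-score) variance component
var_res = ms_res                          # residual (person x trial) variance

for k in (5, 10, 20, 40):                 # trials retained for averaging
    g_coef = var_p / (var_p + var_res / k)
    print(f"{k:2d} trials retained -> G coefficient ~ {g_coef:.2f}")
```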

  1. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    PubMed

    Lawhern, Vernon; Hairston, W David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.
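
    The labeled-window workflow described here (train a model on multiple event classes, then score new data) can be illustrated generically. Because DETECT itself is a MATLAB toolbox, the Python sketch below is only an analog of that workflow with invented features and a stock classifier, not DETECT's interface.

```python
# Generic illustration of the labeled-window workflow described for DETECT:
# compute simple features per window, train a classifier on labeled event
# classes, then predict event labels on new windows. Not DETECT's MATLAB API.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

def window_features(x, win=100):
    # Split a (channels x samples) array into windows and compute mean/std/range per channel.
    n_win = x.shape[1] // win
    w = x[:, : n_win * win].reshape(x.shape[0], n_win, win)
    feats = [w.mean(-1), w.std(-1), w.max(-1) - w.min(-1)]
    return np.concatenate(feats).T           # shape: (n_win, 3 * n_channels)

baseline = rng.normal(0, 1, (4, 5000))                        # 4-channel "clean" signal
artifact = baseline + 5.0 * (rng.random((4, 5000)) > 0.98)    # same signal with sparse large glitches

X = np.vstack([window_features(baseline), window_features(artifact)])
y = np.array([0] * 50 + [1] * 50)            # 0 = baseline window, 1 = artifact window

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
new_windows = window_features(rng.normal(0, 1, (4, 2000)))
print("predicted labels for new windows:", clf.predict(new_windows))
```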

  2. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    PubMed Central

    Lawhern, Vernon; Hairston, W. David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration. PMID:23638169

  3. A preclinical cognitive test battery to parallel the National Institute of Health Toolbox in humans: bridging the translational gap.

    PubMed

    Snigdha, Shikha; Milgram, Norton W; Willis, Sherry L; Albert, Marylin; Weintraub, S; Fortin, Norbert J; Cotman, Carl W

    2013-07-01

    A major goal of animal research is to identify interventions that can promote successful aging and delay or reverse age-related cognitive decline in humans. Recent advances in standardizing cognitive assessment tools for humans have the potential to bring preclinical work closer to human research in aging and Alzheimer's disease. The National Institute of Health (NIH) has led an initiative to develop a comprehensive Toolbox for Neurologic Behavioral Function (NIH Toolbox) to evaluate cognitive, motor, sensory and emotional function for use in epidemiologic and clinical studies spanning 3 to 85 years of age. This paper aims to analyze the strengths and limitations of animal behavioral tests that can be used to parallel those in the NIH Toolbox. We conclude that there are several paradigms available to define a preclinical battery that parallels the NIH Toolbox. We also suggest areas in which new tests may benefit the development of a comprehensive preclinical test battery for assessment of cognitive function in animal models of aging and Alzheimer's disease. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. A preclinical cognitive test battery to parallel the National Institute of Health Toolbox in humans: bridging the translational gap

    PubMed Central

    Snigdha, Shikha; Milgram, Norton W.; Willis, Sherry L.; Albert, Marylin; Weintraub, S.; Fortin, Norbert J.; Cotman, Carl W.

    2013-01-01

    A major goal of animal research is to identify interventions that can promote successful aging and delay or reverse age-related cognitive decline in humans. Recent advances in standardizing cognitive assessment tools for humans have the potential to bring preclinical work closer to human research in aging and Alzheimer’s disease. The National Institute of Health (NIH) has led an initiative to develop a comprehensive Toolbox for Neurologic Behavioral Function (NIH Toolbox) to evaluate cognitive, motor, sensory and emotional function for use in epidemiologic and clinical studies spanning 3 to 85 years of age. This paper aims to analyze the strengths and limitations of animal behavioral tests that can be used to parallel those in the NIH Toolbox. We conclude that there are several paradigms available to define a preclinical battery that parallels the NIH Toolbox. We also suggest areas in which new tests may benefit the development of a comprehensive preclinical test battery for assessment of cognitive function in animal models of aging and Alzheimer’s disease. PMID:23434040

  5. European distributed seismological data archives infrastructure: EIDA

    NASA Astrophysics Data System (ADS)

    Clinton, John; Hanka, Winfried; Mazza, Salvatore; Pederson, Helle; Sleeman, Reinoud; Stammler, Klaus; Strollo, Angelo

    2014-05-01

    The European Integrated waveform Data Archive (EIDA) is a distributed data center system within ORFEUS that (a) securely archives seismic waveform data and related metadata gathered by European research infrastructures, and (b) provides transparent access to the archives for the geosciences research communities. EIDA was founded in 2013 by the ORFEUS Data Center, GFZ, RESIF, ETH, INGV and BGR to ensure the sustainability of a distributed archive system, the implementation of standards (e.g. FDSN StationXML, FDSN web services), and the coordination of new developments. Under the mandate of the ORFEUS Board of Directors and Executive Committee, the founding group is responsible for steering and maintaining the technical developments and organization of the European distributed seismic waveform data archive and its integration within broader multidisciplinary frameworks like EPOS. EIDA currently offers uniform access to unrestricted data from 8 European archives (www.orfeus-eu.org/eida), linked by the Arclink protocol, hosting data from 75 permanent networks (1,800+ stations) and 33 temporary networks (1,200+ stations). Moreover, each archive may also provide unique, restricted datasets. A web interface, developed at GFZ, offers interactive access to different catalogues (EMSC, GFZ, USGS) and EIDA waveform data. Clients and toolboxes like arclink_fetch and ObsPy can connect directly to any EIDA node to collect data. Current developments are directed toward the implementation of quality parameters and strong-motion parameters.
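
    Since the abstract notes that EIDA nodes implement the standard FDSN web services and that ObsPy can connect to them directly, a minimal request looks like the sketch below. The network, station, channel and time window are placeholder values; which data are actually available depends on the node and network.

```python
# Minimal waveform request against an EIDA node via the standard FDSN web
# services mentioned in the abstract, using ObsPy's FDSN client.
# Network/station/channel and time window below are placeholder values.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("GFZ")                       # one of the EIDA nodes
t0 = UTCDateTime("2014-01-01T00:00:00")
st = client.get_waveforms(network="GE", station="APE", location="*",
                          channel="BHZ", starttime=t0, endtime=t0 + 600)
print(st)                                    # Stream with the requested traces
st.plot()                                    # quick look at the waveforms
```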

  6. PV_LIB Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-11

    While an organized source of reference information on PV performance modeling is certainly valuable, there is nothing to match the availability of actual examples of modeling algorithms being used in practice. To meet this need, Sandia has developed a PV performance modeling toolbox (PV_LIB) for Matlab. It contains a set of well-documented, open source functions and example scripts showing the functions being used in practical examples. This toolbox is meant to help make the multi-step process of modeling a PV system more transparent and provide the means for model users to validate and understand the models they use and/or develop. It is fully integrated into Matlab's help and documentation utilities. The PV_LIB Toolbox provides more than 30 functions that are sorted into four categories.
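
    To give a feel for the multi-step modeling chain such a toolbox documents (irradiance to module temperature to DC power), here is a toy calculation in plain Python. The coefficients and model forms are illustrative only and are not PV_LIB functions or values.

```python
# Toy multi-step PV performance calculation illustrating the modeling chain
# the PV_LIB toolbox documents; all coefficients here are illustrative only.
def cell_temperature(poa_irradiance, air_temp, noct=45.0):
    # Simple NOCT-style estimate of cell temperature (deg C).
    return air_temp + (noct - 20.0) / 800.0 * poa_irradiance

def dc_power(poa_irradiance, cell_temp, p_rated=300.0, gamma=-0.004):
    # Rated power scaled by irradiance, derated linearly with cell temperature.
    return p_rated * (poa_irradiance / 1000.0) * (1.0 + gamma * (cell_temp - 25.0))

poa, t_air = 850.0, 28.0                   # plane-of-array irradiance (W/m^2), air temperature (C)
t_cell = cell_temperature(poa, t_air)
print(f"cell temperature ~ {t_cell:.1f} C, DC power ~ {dc_power(poa, t_cell):.0f} W")
```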

  7. A GIS tool for two-dimensional glacier-terminus change tracking

    NASA Astrophysics Data System (ADS)

    Urbanski, Jacek Andrzej

    2018-02-01

    This paper presents a Glacier Termini Tracking (GTT) toolbox for the two-dimensional analysis of glacier-terminus position changes. The input consists of a vector layer with several termini lines relating to the same glacier at different times. The output layers allow analyses to be conducted of glacier-terminus retreats, changes in retreats over time and along the ice face, and glacier-terminus fluctuations over time. The application of three tools from the toolbox is demonstrated via the analysis of eight glacier-terminus retreats and fluctuations at the Hornsund fjord in south Svalbard. It is proposed that this toolbox may also be useful in the study of other line features that change over time, like coastlines and rivers. The toolbox has been coded in Python and runs via ArcGIS.
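
    The basic retreat measurement (distance between successive terminus lines along a reference line) can be sketched with shapely. The GTT toolbox itself runs inside ArcGIS, so the lines and coordinates below are invented and the code is only a generic analog of the computation.

```python
# Generic sketch of measuring terminus change along a glacier centerline with
# shapely; the GTT toolbox itself runs in ArcGIS, so this is only an analog.
from shapely.geometry import LineString

centerline = LineString([(0, 0), (0, 1000)])            # flow line, pointing down-glacier
terminus_2000 = LineString([(-200, 800), (200, 800)])
terminus_2010 = LineString([(-200, 650), (200, 650)])

def terminus_position(terminus, centerline):
    # Distance along the centerline to its intersection with the terminus line.
    crossing = centerline.intersection(terminus)
    return centerline.project(crossing)

retreat = terminus_position(terminus_2000, centerline) - terminus_position(terminus_2010, centerline)
print(f"terminus retreat between surveys: {retreat:.0f} m")
```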

  8. The GMT/MATLAB Toolbox

    NASA Astrophysics Data System (ADS)

    Wessel, Paul; Luis, Joaquim F.

    2017-02-01

    The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.

  9. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    DTIC Science & Technology

    2013-04-24

    DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals Vernon...datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed...As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and

  10. Real time wind farm emulation using SimWindFarm toolbox

    NASA Astrophysics Data System (ADS)

    Topor, Marcel

    2016-06-01

    This paper presents a wind farm emulation solution using an open-source Matlab/Simulink toolbox and the National Instruments cRIO platform. The work is based on the Aeolus SimWindFarm (SWF) toolbox models developed at Aalborg University, Denmark. Using the Matlab/Simulink models developed in SWF, the modeling code can be exported to a real-time model using the NI VeriStand model framework, and the resulting code is integrated as a hardware-in-the-loop control on the NI 9068 platform.

  11. Sentinel-2 data exploitation with ESA's Sentinel-2 Toolbox

    NASA Astrophysics Data System (ADS)

    Gascon, Ferran; Ramoino, Fabrizzio; deanos, Yves-louis

    2017-04-01

    The Sentinel-2 Toolbox is a project kicked off by ESA in early 2014, under the umbrella of the ESA SEOM programme, with the aim of providing a tool for visualizing, analysing, and processing Sentinel-2 datasets. The toolbox is an extension of the SeNtinel Application Platform (SNAP), a project resulting from the effort of the developers of the Sentinel-1, Sentinel-2 and Sentinel-3 toolboxes to provide a single common application framework suited to the mixed exploitation of SAR, high-resolution optical and medium-resolution optical datasets. All three development teams collaborate to drive the evolution of the common SNAP framework in a developer forum. In this triplet, the Sentinel-2 Toolbox is dedicated to enhancing SNAP support for high-resolution optical imagery. It is a multi-mission toolbox, already providing support for Sentinel-2, RapidEye, Deimos, and SPOT 1 to SPOT 5 datasets. In terms of processing algorithms, SNAP provides tools specific to the Sentinel-2 mission:
 • An atmospheric correction module, Sen2Cor, integrated into the toolbox, providing scene classification, atmospheric correction, and cirrus detection and correction; the output L2A products can be opened seamlessly in the toolbox.
 • A multitemporal synthesis processor (L3).
 • A biophysical products processor (L2B).
 • A water processor.
 • A deforestation detector.
 • OTB tools integration.
 • SNAP Engine for cloud exploitation.
These come along with a set of more generic tools for high-resolution optical data exploitation. Together with the generic functionalities of SNAP, this provides an ideal environment for designing multi-mission processing chains and producing value-added products from raw datasets. The uses of SNAP are manifold: the desktop tools provide a rich application for interactive visualization, analysis and processing of data, while all tools available in SNAP can also be accessed via the command line through the Graph Processing Framework (GPT), the kernel of the SNAP processing engine. This makes it a perfect candidate for driving the processing of data on servers for bulk processing.

  12. Microfluidic "Pouch" Chips for Immunoassays and Nucleic Acid Amplification Tests.

    PubMed

    Mauk, Michael G; Liu, Changchun; Qiu, Xianbo; Chen, Dafeng; Song, Jinzhao; Bau, Haim H

    2017-01-01

    Microfluidic cassettes ("chips") for processing and analysis of clinical specimens and other sample types facilitate point-of-care (POC) immunoassays and nucleic acid based amplification tests. These single-use test chips can be self-contained and made amenable to autonomous operation-reducing or eliminating supporting instrumentation-by incorporating laminated, pliable "pouch" and membrane structures for fluid storage, pumping, mixing, and flow control. Materials and methods for integrating flexible pouch compartments and diaphragm valves into hard plastic (e.g., acrylic and polycarbonate) microfluidic "chips" for reagent storage, fluid actuation, and flow control are described. We review several versions of these pouch chips for immunoassay and nucleic acid amplification tests, and describe related fabrication techniques. These protocols thus offer a "toolbox" of methods for storage, pumping, and flow control functions in microfluidic devices.

  13. Broadview Radar Altimetry Toolbox

    NASA Astrophysics Data System (ADS)

    Garcia-Mondejar, Albert; Escolà, Roger; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Naeije, Marc; Ambrózio, Américo; Restano, Marco; Benveniste, Jérôme

    2017-04-01

    The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox), which can read data from all previous and current altimetry missions, now incorporates the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. The project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT, involving combinations of data fields that the user can save for later reuse, or using the already embedded formulas that include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a thorough introduction to altimetry, shows its applications in different fields such as Oceanography, Cryosphere, Geodesy and Hydrology, among others. Also included are "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version. The Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. While developing the new toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry, in order to train the users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open-source framework, contributions from users who have developed their own functions are welcome. The first release of the new Radar Altimetry Toolbox was published in September 2015; it incorporates the capability to read S3 products as well as the new CryoSat-2 Baseline C. The second release of the Toolbox, published in October 2016, has a new graphical user interface and other visualisation improvements. The third release (January 2017) includes more features and resolves issues from the previous versions.

  14. Smoke Ready Toolbox for Wildfires

    EPA Pesticide Factsheets

    This site provides an online Smoke Ready Toolbox for Wildfires, which lists resources and tools that provide information on health impacts from smoke exposure, current fire conditions and forecasts and strategies to reduce exposure to smoke.

  15. Developing a congestion mitigation toolbox.

    DOT National Transportation Integrated Search

    2011-09-30

    Researchers created A Michigan Toolbox for Mitigating Traffic Congestion to be a useful desk reference for practitioners and an educational tool for elected officials acting through public policy boards to better understand the development, planning,...

  16. Grid Integrated Distributed PV (GridPV) Version 2.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reno, Matthew J.; Coogan, Kyle

    2014-12-01

    This manual provides the documentation of the MATLAB toolbox of functions for using OpenDSS to simulate the impact of solar energy on the distribution system. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions. Each function in the toolbox is documented with the function use syntax, full description, function input list, function output list, example use, and example output.

  17. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing of standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) improved graphical display of model results; 2) improved error analysis and reporting; 3) an increase in the default maximum model mesh size from 301 to 501 nodes; and 4) the ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  18. Neural Parallel Engine: A toolbox for massively parallel neural signal processing.

    PubMed

    Tam, Wing-Kin; Yang, Zhi

    2018-05-01

    Large-scale neural recordings provide detailed information on neuronal activities and can help elicit the underlying neural mechanisms of the brain. However, the computational burden is also formidable when we try to process the huge data stream generated by such recordings. In this study, we report the development of Neural Parallel Engine (NPE), a toolbox for massively parallel neural signal processing on graphical processing units (GPUs). It offers a selection of the most commonly used routines in neural signal processing such as spike detection and spike sorting, including advanced algorithms such as exponential-component-power-component (EC-PC) spike detection and binary pursuit spike sorting. We also propose a new method for detecting peaks in parallel through a parallel compact operation. Our toolbox is able to offer a 5× to 110× speedup compared with its CPU counterparts depending on the algorithms. A user-friendly MATLAB interface is provided to allow easy integration of the toolbox into existing workflows. Previous efforts on GPU neural signal processing only focus on a few rudimentary algorithms, are not well-optimized and often do not provide a user-friendly programming interface to fit into existing workflows. There is a strong need for a comprehensive toolbox for massively parallel neural signal processing. A new toolbox for massively parallel neural signal processing has been created. It can offer significant speedup in processing signals from large-scale recordings up to thousands of channels. Copyright © 2018 Elsevier B.V. All rights reserved.
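
    The "parallel peak detection through a compact operation" idea maps naturally onto vectorized array operations. Below is a NumPy analog on the CPU (CuPy could be swapped in for a GPU): every sample is tested independently against a robust threshold and its neighbours, and the boolean mask is then compacted into a dense list of peak indices. This illustrates the concept only and is not the toolbox's CUDA implementation.

```python
# Vectorized threshold-crossing peak detection followed by a compaction step,
# a CPU/NumPy analog of the parallel approach described (use CuPy for a GPU).
import numpy as np

rng = np.random.default_rng(3)
signal = rng.normal(0, 1, 100_000)
signal[[5_000, 42_000, 77_000]] += 12.0                 # injected "spikes"

threshold = 5.0 * np.median(np.abs(signal)) / 0.6745    # robust noise-based threshold

# Each sample is tested independently (data-parallel): above threshold and a local maximum.
is_peak = (signal[1:-1] > threshold) & (signal[1:-1] >= signal[:-2]) & (signal[1:-1] > signal[2:])

# "Compaction": turn the sparse boolean mask into a dense list of peak indices.
peak_indices = np.flatnonzero(is_peak) + 1
print("detected peak indices:", peak_indices)
```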

  19. Toolbox for Research, or how to facilitate a central data management in small-scale research projects.

    PubMed

    Bialke, Martin; Rau, Henriette; Thamm, Oliver C; Schuldt, Ronny; Penndorf, Peter; Blumentritt, Arne; Gött, Robert; Piegsa, Jens; Bahls, Thomas; Hoffmann, Wolfgang

    2018-01-25

    In most research projects, budget, staff and IT infrastructure are limiting resources. Especially for small-scale registries and cohort studies, professional IT support and commercial electronic data capture systems are too expensive. Consequently, these projects use simple local approaches (e.g. Excel) for data capture instead of a central data management solution including web-based data capture and proper research databases. This leads to manual processes to merge, analyze and, if possible, pseudonymize research data from different study sites. To support multi-site data capture, storage and analyses in small-scale research projects, the corresponding requirements were analyzed within the MOSAIC project. Based on the identified requirements, the Toolbox for Research was developed as a flexible software solution for various research scenarios. Additionally, the Toolbox facilitates the integration of research data as well as metadata by performing the necessary procedures automatically. Toolbox modules also allow the integration of device data. Moreover, the separation of personally identifiable information and medical data, using only pseudonyms for storing medical data, ensures compliance with data protection regulations. The pseudonymized data can then be exported in SPSS format to enable scientists to prepare reports and analyses. The Toolbox for Research was successfully piloted in the German Burn Registry in 2016, facilitating the documentation of 4350 burn cases at 54 study sites. The Toolbox for Research can be downloaded free of charge from the project website and installed automatically thanks to the use of Docker technology.
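
    The pseudonymization principle described above (medical data stored only under a pseudonym, identifying data held separately) can be illustrated with a short sketch. The keyed-hash scheme, field names and data values below are invented for illustration and are not the Toolbox for Research mechanism.

```python
# Minimal illustration of storing medical data under a pseudonym so that
# identifying information stays separate; the keyed-hash scheme and field
# names are illustrative, not the Toolbox for Research implementation.
import hmac, hashlib, json

SECRET_KEY = b"keep-this-in-the-trusted-identity-service"   # placeholder secret

def pseudonym(patient_id: str) -> str:
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

identity_store = {}   # held by the trusted party only
research_store = {}   # what the research database sees

def register_case(patient_id, name, medical_record):
    pid = pseudonym(patient_id)
    identity_store[pid] = {"patient_id": patient_id, "name": name}
    research_store[pid] = medical_record

register_case("12345", "Jane Doe", {"tbsa_percent": 18, "mechanism": "scald"})
print(json.dumps(research_store, indent=2))   # contains no directly identifying data
```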

  20. A Molecular Toolbox for Rapid Generation of Viral Vectors to Up- or Down-Regulate Neuronal Gene Expression in vivo

    PubMed Central

    White, Melanie D.; Milne, Ruth V. J.; Nolan, Matthew F.

    2011-01-01

    We introduce a molecular toolbox for manipulation of neuronal gene expression in vivo. The toolbox includes promoters, ion channels, optogenetic tools, fluorescent proteins, and intronic artificial microRNAs. The components are easily assembled into adeno-associated virus (AAV) or lentivirus vectors using recombination cloning. We demonstrate assembly of toolbox components into lentivirus and AAV vectors and use these vectors for in vivo expression of inwardly rectifying potassium channels (Kir2.1, Kir3.1, and Kir3.2) and an artificial microRNA targeted against the ion channel HCN1 (HCN1 miRNA). We show that AAV assembled to express HCN1 miRNA produces efficacious and specific in vivo knockdown of HCN1 channels. Comparison of in vivo viral transduction using HCN1 miRNA with mice containing a germ line deletion of HCN1 reveals similar physiological phenotypes in cerebellar Purkinje cells. The easy assembly and re-usability of the toolbox components, together with the ability to up- or down-regulate neuronal gene expression in vivo, may be useful for applications in many areas of neuroscience. PMID:21772812

  1. A Michigan toolbox for mitigating traffic congestion.

    DOT National Transportation Integrated Search

    2011-09-30

    "Researchers created A Michigan Toolbox for Mitigating Traffic Congestion to be a useful desk reference : for practitioners and an educational tool for elected officials acting through public policy boards to better : understand the development, plan...

  2. Drinking Water Cyanotoxin Risk Communication Toolbox

    EPA Pesticide Factsheets

    The drinking water cyanotoxin risk communication toolbox is a ready-to-use, “one-stop-shop” to support public water systems, states, and local governments in developing, as they deem appropriate, their own risk communication materials.

  3. EPA ExpoBox Toolbox Search

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant assessment databases,

  4. 40 CFR 141.715 - Microbial toolbox options for meeting Cryptosporidium treatment requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... criteria are in § 141.716(b). Pre Filtration Toolbox Options (3) Presedimentation basin with coagulation 0... separate granular media filtration stage if treatment train includes coagulation prior to first filter...

  5. 40 CFR 141.715 - Microbial toolbox options for meeting Cryptosporidium treatment requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... criteria are in § 141.716(b). Pre Filtration Toolbox Options (3) Presedimentation basin with coagulation 0... separate granular media filtration stage if treatment train includes coagulation prior to first filter...

  6. Air Sensor Toolbox for Citizen Scientists

    EPA Pesticide Factsheets

    EPA’s Air Sensor Toolbox provides information and guidance on new low-cost compact technologies for measuring air quality. It provides information to help citizens more effectively and accurately collect air quality data in their community.

  7. A portable toolbox to monitor and evaluate signal operations.

    DOT National Transportation Integrated Search

    2011-10-01

    Researchers from the Texas Transportation Institute developed a portable tool consisting of a field-hardened computer interfacing with the traffic signal cabinet through special enhanced Bus Interface Units. The toolbox consisted of a monitoring t...

  8. Air Sensor Toolbox: Resources and Funding

    EPA Pesticide Factsheets

    EPA’s Air Sensor Toolbox provides information and guidance on new low-cost compact technologies for measuring air quality. It provides information to help citizens more effectively and accurately collect air quality data in their community.

  9. Ironbound Community Citizen Science Toolbox Fact Sheet

    EPA Pesticide Factsheets

    EPA is partnering with Newark’s Ironbound Community Corporation (ICC) to design, develop, and pilot a Citizen Science Toolbox that will enable communities to collect their own environmental data and increase their ability to understand local conditions.

  10. Evaluating a 2D image-based computerized approach for measuring riverine pebble roundness

    NASA Astrophysics Data System (ADS)

    Cassel, Mathieu; Piégay, Hervé; Lavé, Jérôme; Vaudor, Lise; Hadmoko Sri, Danang; Wibiwo Budi, Sandy; Lavigne, Franck

    2018-06-01

    The geometrical characteristics of pebbles are important features for studying transport pathways, sedimentary history, depositional environments and abrasion processes, or for targeting sediment sources. Both the shape and roundness of pebbles can be described by a still growing number of metrics in 2D and 3D or by visual charts. Despite new developments, existing tools remain proprietary and no pebble roundness toolbox has been widely available within the scientific community. The toolbox developed by Roussillon et al. (2009) automatically computes the size, shape and roundness indexes of pebbles from their 2D maximal projection planes. Using a digital camera, this toolbox operates on 2D pictures of pebbles placed on a one square meter red board, allowing data to be collected quickly and efficiently at a large scale. Now that the toolbox is freely available for download,
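
    The basic measurement idea (segment each pebble in a photograph, then compute a shape index from its outline) can be sketched with scikit-image. The isoperimetric circularity 4*pi*A/P^2 used below is one simple roundness-type metric and is not necessarily the index implemented by Roussillon et al.; the ellipse stands in for a segmented pebble.

```python
# Sketch of a 2D shape index from a segmented pebble outline: circularity
# 4*pi*A / P^2, one simple roundness-type metric (not necessarily the index
# implemented in the toolbox of Roussillon et al.).
import numpy as np
from skimage.draw import ellipse
from skimage.measure import label, regionprops

mask = np.zeros((200, 200), dtype=bool)
rr, cc = ellipse(100, 100, 60, 40)       # synthetic "pebble" projection
mask[rr, cc] = True

region = regionprops(label(mask))[0]
circularity = 4 * np.pi * region.area / region.perimeter ** 2
print(f"area={region.area}, perimeter={region.perimeter:.1f}, circularity={circularity:.2f}")
```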

  11. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  12. National Water-Quality Assessment (NAWQA) area-characterization toolbox

    USGS Publications Warehouse

    Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.

    2010-01-01

    This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells. These tools are built on top of standard functionality included in ArcGIS Desktop running at the ArcInfo license level. Most of the tools require a license for the ArcGIS Spatial Analyst extension. ArcGIS is a commercial GIS software system produced by ESRI, Inc. (http://www.esri.com). The NAWQA Area-Characterization Toolbox is not supported by ESRI, Inc. or its technical support staff. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

  13. Comparison of human septal nuclei MRI measurements using automated segmentation and a new manual protocol based on histology

    PubMed Central

    Butler, Tracy; Zaborszky, Laszlo; Pirraglia, Elizabeth; Li, Jinyu; Wang, Xiuyuan Hugh; Li, Yi; Tsui, Wai; Talos, Delia; Devinsky, Orrin; Kuchna, Izabela; Nowicki, Krzysztof; French, Jacqueline; Kuzniecky, Rubin; Wegiel, Jerzy; Glodzik, Lidia; Rusinek, Henry; DeLeon, Mony J.; Thesen, Thomas

    2014-01-01

    Septal nuclei, located in basal forebrain, are strongly connected with hippocampi and important in learning and memory, but have received limited research attention in human MRI studies. While probabilistic maps for estimating septal volume on MRI are now available, they have not been independently validated against manual tracing of MRI, typically considered the gold standard for delineating brain structures. We developed a protocol for manual tracing of the human septal region on MRI based on examination of neuroanatomical specimens. We applied this tracing protocol to T1 MRI scans (n=86) from subjects with temporal epilepsy and healthy controls to measure septal volume. To assess the inter-rater reliability of the protocol, a second tracer used the same protocol on 20 scans that were randomly selected from the 72 healthy controls. In addition to measuring septal volume, maximum septal thickness between the ventricles was measured and recorded. The same scans (n=86) were also analysed using septal probabilistic maps and the DARTEL toolbox in SPM. Results show that our manual tracing algorithm is reliable, and that septal volume measurements obtained via manual and automated methods correlate significantly with each other (p<.001). Both manual and automated methods detected significantly enlarged septal nuclei in patients with temporal lobe epilepsy, in accord with a proposed compensatory neuroplastic process related to the strong connections between septal nuclei and hippocampi. Septal thickness, which was simple to measure with excellent inter-rater reliability, correlated well with both manual and automated septal volume, suggesting it could serve as an easy-to-measure surrogate for septal volume in future studies. Our results call attention to the important though understudied human septal region, confirm its enlargement in temporal lobe epilepsy, and provide a reliable new manual delineation protocol that will facilitate continued study of this critical region. PMID:24736183
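
    The two basic quantities in such a tracing protocol, structure volume from a labeled mask and agreement between two tracers, reduce to very small computations. The sketch below uses synthetic masks and a placeholder voxel size, and measures overlap with a Dice coefficient; it is an illustration of the measurements, not the study's pipeline.

```python
# Sketch of the two basic measurements in a manual tracing protocol:
# structure volume from a binary mask, and inter-rater overlap (Dice).
# The masks and voxel size here are synthetic placeholders.
import numpy as np

voxel_volume_mm3 = 1.0 * 1.0 * 1.0           # isotropic 1 mm voxels (placeholder)

rater_a = np.zeros((50, 50, 50), dtype=bool)
rater_b = np.zeros((50, 50, 50), dtype=bool)
rater_a[20:30, 20:30, 20:26] = True          # tracer 1's septal label
rater_b[21:30, 20:31, 20:26] = True          # tracer 2's label, slightly different

volume_a = rater_a.sum() * voxel_volume_mm3
dice = 2 * np.logical_and(rater_a, rater_b).sum() / (rater_a.sum() + rater_b.sum())
print(f"volume (rater A): {volume_a:.0f} mm^3, inter-rater Dice: {dice:.2f}")
```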

  14. Comparison of human septal nuclei MRI measurements using automated segmentation and a new manual protocol based on histology.

    PubMed

    Butler, Tracy; Zaborszky, Laszlo; Pirraglia, Elizabeth; Li, Jinyu; Wang, Xiuyuan Hugh; Li, Yi; Tsui, Wai; Talos, Delia; Devinsky, Orrin; Kuchna, Izabela; Nowicki, Krzysztof; French, Jacqueline; Kuzniecky, Rubin; Wegiel, Jerzy; Glodzik, Lidia; Rusinek, Henry; deLeon, Mony J; Thesen, Thomas

    2014-08-15

    Septal nuclei, located in basal forebrain, are strongly connected with hippocampi and important in learning and memory, but have received limited research attention in human MRI studies. While probabilistic maps for estimating septal volume on MRI are now available, they have not been independently validated against manual tracing of MRI, typically considered the gold standard for delineating brain structures. We developed a protocol for manual tracing of the human septal region on MRI based on examination of neuroanatomical specimens. We applied this tracing protocol to T1 MRI scans (n=86) from subjects with temporal epilepsy and healthy controls to measure septal volume. To assess the inter-rater reliability of the protocol, a second tracer used the same protocol on 20 scans that were randomly selected from the 72 healthy controls. In addition to measuring septal volume, maximum septal thickness between the ventricles was measured and recorded. The same scans (n=86) were also analyzed using septal probabilistic maps and DARTEL toolbox in SPM. Results show that our manual tracing algorithm is reliable, and that septal volume measurements obtained via manual and automated methods correlate significantly with each other (p<.001). Both manual and automated methods detected significantly enlarged septal nuclei in patients with temporal lobe epilepsy in accord with a proposed compensatory neuroplastic process related to the strong connections between septal nuclei and hippocampi. Septal thickness, which was simple to measure with excellent inter-rater reliability, correlated well with both manual and automated septal volume, suggesting it could serve as an easy-to-measure surrogate for septal volume in future studies. Our results call attention to the important though understudied human septal region, confirm its enlargement in temporal lobe epilepsy, and provide a reliable new manual delineation protocol that will facilitate continued study of this critical region. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Integrating Building Information Modeling and Health and Safety for Onsite Construction

    PubMed Central

    Ganah, Abdulkadir; John, Godfaurd A.

    2014-01-01

    Background Health and safety (H&S) on a construction site can either make or break a contractor, if not properly managed. The usage of Building Information Modeling (BIM) for H&S on construction execution has the potential to augment practitioner understanding of their sites, and by so doing reduce the probability of accidents. This research explores BIM usage within the construction industry in relation to H&S communication. Methods In addition to an extensive literature review, a questionnaire survey was conducted to gather information on the embedment of H&S planning with the BIM environment for site practitioners. Results The analysis of responses indicated that BIM will enhance the current approach of H&S planning for construction site personnel. Conclusion From the survey, toolbox talk will have to be integrated with the BIM environment, because it is the predominantly used procedure for enhancing H&S issues within construction sites. The advantage is that personnel can visually understand H&S issues as work progresses during the toolbox talk onsite. PMID:25830069

  16. Integrating building information modeling and health and safety for onsite construction.

    PubMed

    Ganah, Abdulkadir; John, Godfaurd A

    2015-03-01

    Health and safety (H&S) on a construction site can either make or break a contractor, if not properly managed. The usage of Building Information Modeling (BIM) for H&S on construction execution has the potential to augment practitioner understanding of their sites, and by so doing reduce the probability of accidents. This research explores BIM usage within the construction industry in relation to H&S communication. In addition to an extensive literature review, a questionnaire survey was conducted to gather information on the embedment of H&S planning with the BIM environment for site practitioners. The analysis of responses indicated that BIM will enhance the current approach of H&S planning for construction site personnel. From the survey, toolbox talk will have to be integrated with the BIM environment, because it is the predominantly used procedure for enhancing H&S issues within construction sites. The advantage is that personnel can visually understand H&S issues as work progresses during the toolbox talk onsite.

  17. Broadview Radar Altimetry Toolbox

    NASA Astrophysics Data System (ADS)

    Escolà, Roger; Garcia-Mondejar, Albert; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Naeije, Marc; Ambrozio, Americo; Restano, Marco; Benveniste, Jérôme

    2016-04-01

    The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox), which can read data from all previous and current altimetry missions, now incorporates the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. The project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT, involving combinations of data fields that the user can save for later reuse, or using the already embedded formulas that include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a thorough introduction to altimetry, shows its applications in different fields such as Oceanography, Cryosphere, Geodesy and Hydrology, among others. Also included are "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version. The Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. While developing the new toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry, in order to train the users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open-source framework, contributions from users who have developed their own functions are welcome. The first release of the new Radar Altimetry Toolbox was published in September 2015. It incorporates the capability to read S3 products as well as the new CryoSat-2 Baseline C. The second release of the Toolbox, planned for March 2016, will have a new graphical user interface and some visualisation improvements. The third release, planned for September 2016, will incorporate new datasets, such as lake and river products and the reprocessed EnviSat data, as well as new features regarding data interpolation and formula updates.

  18. Broadview Radar Altimetry Toolbox

    NASA Astrophysics Data System (ADS)

    Mondéjar, Albert; Benveniste, Jérôme; Naeije, Marc; Escolà, Roger; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Ambrózio, Américo; Restano, Marco

    2016-07-01

    The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox), which can read data from all previous and current altimetry missions, now incorporates the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. The project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Études Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the front-end for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT, involving combinations of data fields that the user can save for later reuse, or using the already embedded formulas that include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a thorough introduction to altimetry, shows its applications in different fields such as Oceanography, Cryosphere, Geodesy and Hydrology, among others. Also included are "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version. The Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. While developing the new toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry, in order to train the users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open-source framework, contributions from users who have developed their own functions are welcome. The first release of the new Radar Altimetry Toolbox was published in September 2015. It incorporates the capability to read S3 products as well as the new CryoSat-2 Baseline C. The second release of the Toolbox, planned for March 2016, will have a new graphical user interface and some visualisation improvements. The third release, planned for September 2016, will incorporate new datasets, such as lake and river products and the reprocessed EnviSat data, as well as new features regarding data interpolation and formula updates.

  19. Traffic analysis toolbox volume XIII : integrated corridor management analysis, modeling, and simulation guide

    DOT National Transportation Integrated Search

    2017-02-01

    As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...

  20. Traffic analysis toolbox volume XIII : integrated corridor management analysis, modeling, and simulation guide.

    DOT National Transportation Integrated Search

    2017-02-01

    As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...

  1. A comprehensive validation toolbox for regional ocean models - Outline, implementation and application to the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Jandt, Simon; Laagemaa, Priidik; Janssen, Frank

    2014-05-01

    The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, starting from early steps in model development and ending at the quality control of model-based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged, it is often not among the most popular tasks in ocean modelling. In order to ease the validation work, a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time series at stations, vertical profiles, surface fields or along-track satellite data, with one single program call. The validation toolbox, implemented in MATLAB, covers all parts of the validation process, ranging from read-in procedures for the datasets to the graphical and numerical output of statistical metrics of the comparison. The basic idea is to have only one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. the read-in procedures, forms a module in which all available functions of that particular part are collected. The interface between the functions, the modules and the validation schedule is highly standardized. Functions of a module are set up for certain validation tasks, and new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned to each validation task in user-specific settings, which are stored externally in so-called namelists and gather all information on the datasets used as well as paths and metadata. In the framework of the MyOcean-2 project the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre, whereby the performance of each new product version is compared with that of the previous version. Although the toolbox has so far mainly been tested for the Baltic Sea, it can easily be adapted to different datasets and parameters, regardless of the geographic region. In this presentation the usability of the toolbox is demonstrated along with several results of the validation process.
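    To make the kind of statistical comparison such a toolbox automates concrete, here is a minimal Python sketch computing common validation metrics (bias, RMSE, correlation) between a modelled and an observed series. The toolbox itself is MATLAB; this is only a generic analogue with synthetic data, not its actual interface.

      # Illustrative model-vs-observation metrics with synthetic data.
      import numpy as np

      rng = np.random.default_rng(0)
      obs = 10 + 2 * np.sin(np.linspace(0, 6, 200))        # "observed" time series
      model = obs + rng.normal(0.3, 0.5, obs.size)         # "modelled" series with bias+noise

      bias = np.mean(model - obs)
      rmse = np.sqrt(np.mean((model - obs) ** 2))
      corr = np.corrcoef(model, obs)[0, 1]
      print(f"bias={bias:.3f}  rmse={rmse:.3f}  r={corr:.3f}")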

  2. Planetary Geologic Mapping Python Toolbox: A Suite of Tools to Support Mapping Workflows

    NASA Astrophysics Data System (ADS)

    Hunter, M. A.; Skinner, J. A.; Hare, T. M.; Fortezzo, C. M.

    2017-06-01

    The collective focus of the Planetary Geologic Mapping Python Toolbox is to provide researchers with additional means to migrate legacy GIS data, assess the quality of data and analysis results, and simplify common mapping tasks.

  3. SSOAP Toolbox Enhancements and Case Study

    EPA Science Inventory

    Recognizing the need for tools to support the development of sanitary sewer overflow (SSO) control plans, in October 2009 the U.S. Environmental Protection Agency (EPA) released the first version of the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox. This first ve...

  4. Propulsion System Simulation Using the Toolbox for the Modeling and Analysis of Thermodynamic System (T-MATS)

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei (OA)

    2014-01-01

    A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This presentation describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this presentation is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture.
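    The solver pattern described above (component models produce residuals, and a numerical iterative solver drives the residuals to zero) can be illustrated with a toy example. The sketch below is plain Python/SciPy with made-up balance equations; it is not T-MATS code and does not reflect its Simulink block structure.

      # Conceptual sketch of a steady-state balance solved iteratively.
      import numpy as np
      from scipy.optimize import fsolve

      def residuals(x):
          mdot, speed = x                        # unknown mass flow and shaft speed
          r1 = mdot - 0.02 * speed               # toy "component map" relation
          r2 = 50.0 - mdot * speed / 40.0        # toy power balance
          return [r1, r2]

      mdot, speed = fsolve(residuals, x0=[1.0, 100.0])
      print(f"converged: mdot={mdot:.2f}, speed={speed:.1f}")
      print("residuals:", np.round(residuals([mdot, speed]), 6))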

  5. FocusStack and StimServer: a new open source MATLAB toolchain for visual stimulation and analysis of two-photon calcium neuronal imaging data.

    PubMed

    Muir, Dylan R; Kampa, Björn M

    2014-01-01

    Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories.
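    One of the analyses mentioned above, the peri-stimulus time histogram, is simple enough to sketch generically. The following Python/NumPy example averages a fluorescence trace around stimulus onsets; it uses synthetic data and assumed frame numbers, and does not reproduce FocusStack's MATLAB interface.

      # Generic peri-stimulus time histogram (PSTH) on a synthetic trace.
      import numpy as np

      frame_rate = 10.0                                         # frames per second (assumed)
      trace = np.random.default_rng(1).normal(1.0, 0.1, 600)    # dF/F-like trace
      stim_frames = np.array([50, 150, 250, 350, 450])          # stimulus onset frames
      pre, post = 10, 30                                        # frames before/after onset

      segments = np.stack([trace[s - pre:s + post] for s in stim_frames])
      psth = segments.mean(axis=0)                              # trial-averaged response
      t = np.arange(-pre, post) / frame_rate                    # time axis in seconds
      print("peak response at t = %.1f s" % t[np.argmax(psth)])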

  6. FocusStack and StimServer: a new open source MATLAB toolchain for visual stimulation and analysis of two-photon calcium neuronal imaging data

    PubMed Central

    Muir, Dylan R.; Kampa, Björn M.

    2015-01-01

    Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories. PMID:25653614

  7. An Educational Model for Hands-On Hydrology Education

    NASA Astrophysics Data System (ADS)

    AghaKouchak, A.; Nakhjiri, N.; Habib, E. H.

    2014-12-01

    This presentation provides an overview of a hands-on modeling tool developed for students in civil engineering and earth science disciplines to help them learn the fundamentals of hydrologic processes, model calibration, sensitivity analysis, uncertainty assessment, and practice conceptual thinking in solving engineering problems. The toolbox includes two simplified hydrologic models, namely HBV-EDU and HBV-Ensemble, designed as a complement to theoretical hydrology lectures. The models provide an interdisciplinary application-oriented learning environment that introduces the hydrologic phenomena through the use of a simplified conceptual hydrologic model. The toolbox can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching more advanced topics including uncertainty analysis and ensemble simulation. Both models have been used in a class, for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of hydrology.
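    To give a flavour of the conceptual-model idea behind such teaching tools, here is a minimal soil-bucket water-balance step in Python. It is purely illustrative: the parameter names, values and forcing are assumptions and it does not correspond to the HBV-EDU implementation, which is a MATLAB GUI.

      # Minimal conceptual "bucket" hydrologic model step (illustrative only).
      def bucket_step(storage, precip, pet, capacity=150.0, k=0.05):
          """One daily step: rain fills the bucket, evapotranspiration and
          saturation excess deplete it, and a linear reservoir releases baseflow."""
          storage += precip
          aet = min(pet, storage)                  # actual evapotranspiration
          storage -= aet
          excess = max(0.0, storage - capacity)    # saturation excess runoff
          storage -= excess
          baseflow = k * storage                   # linear-reservoir outflow
          storage -= baseflow
          return storage, excess + baseflow

      s = 100.0
      for p, e in [(12.0, 3.0), (0.0, 4.0), (25.0, 2.5)]:     # toy daily forcing (mm)
          s, q = bucket_step(s, p, e)
          print(f"storage={s:6.1f} mm  runoff={q:5.1f} mm/day")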

  8. Wyrm: A Brain-Computer Interface Toolbox in Python.

    PubMed

    Venthur, Bastian; Dähne, Sven; Höhne, Johannes; Heller, Hendrik; Blankertz, Benjamin

    2015-10-01

    In recent years Python has gained more and more traction in the scientific community. Projects like NumPy, SciPy, and Matplotlib have created a strong foundation for scientific computing in Python, and machine learning packages like scikit-learn or packages for data analysis like Pandas are building on top of it. In this paper we present Wyrm ( https://github.com/bbci/wyrm ), an open source BCI toolbox in Python. Wyrm is applicable to a broad range of neuroscientific problems. It can be used as a toolbox for analysis and visualization of neurophysiological data and in real-time settings, like an online BCI application. In order to prevent software defects, Wyrm makes extensive use of unit testing. We will explain the key aspects of Wyrm's software architecture and design decisions for its data structure, and demonstrate and validate the use of our toolbox by presenting our approach to the classification tasks of two different data sets from the BCI Competition III. Furthermore, we will give a brief analysis of the data sets using our toolbox, and demonstrate how we implemented an online experiment using Wyrm. With Wyrm we add the final piece to our ongoing effort to provide a complete, free and open source BCI system in Python.
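    The kind of trial classification described above can be sketched in a few lines of Python using scikit-learn directly. This is a generic illustration on synthetic epochs, not Wyrm's own data structures or API, which are not reproduced here.

      # Generic EEG-epoch classification sketch with scikit-learn.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n_trials, n_channels, n_samples = 120, 8, 50
      X = rng.normal(size=(n_trials, n_channels, n_samples))   # synthetic epochs
      y = rng.integers(0, 2, n_trials)                         # two synthetic classes
      X[y == 1, 0, :] += 0.5                                   # inject a class difference

      features = X.reshape(n_trials, -1)                       # flatten each epoch
      scores = cross_val_score(LinearDiscriminantAnalysis(), features, y, cv=5)
      print("cross-validated accuracy: %.2f" % scores.mean())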

  9. A web-based tool for ranking landslide mitigation measures

    NASA Astrophysics Data System (ADS)

    Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.

    2012-04-01

    As part of the research done in the European project SafeLand "Living with landslide risk in Europe: Assessment, effects of global change, and risk management strategies", a compendium of structural and non-structural mitigation measures for different landslide types in Europe was prepared, and the measures were assembled into a web-based "toolbox". Emphasis was placed on providing a rational and flexible framework applicable to existing and future mitigation measures. The purpose of the web-based toolbox is to assist decision-making and to guide the user in the choice of the most appropriate mitigation measures. The mitigation measures were classified into three categories, describing whether they address the landslide hazard, the vulnerability or the elements at risk themselves. The measures considered include structural measures reducing the hazard and non-structural measures reducing either the hazard or the consequences (or the vulnerability and exposure of elements at risk). The structural measures include surface protection and control of surface erosion; measures modifying the slope geometry and/or mass distribution; measures modifying the surface water regime (surface drainage); measures modifying the groundwater regime (deep drainage); measures modifying the mechanical characteristics of the unstable mass; transfer of loads to more competent strata; retaining structures (to modify the slope geometry and/or to transfer stress to a competent layer); deviating the path of landslide debris; dissipating the energy of debris flows; and arresting and containing landslide debris or rock fall. The non-structural mitigation measures, reducing either the hazard or the consequences, include early warning systems; restricting or discouraging construction activities; increasing the resistance or coping capacity of elements at risk; relocation of elements at risk; and sharing of risk through insurance. The measures are described in the toolbox with fact sheets providing a brief description, guidance on design, schematic details, practical examples and references for each mitigation measure. Each of the measures was given a score on its ability and applicability for different types of landslides and boundary conditions, and a decision support matrix was established. The web-based toolbox organizes the information in the compendium and provides an algorithm to rank the measures on the basis of the decision support matrix and of the risk level estimated at the site. The toolbox includes a description of the case under study and offers a simplified option for estimating the hazard and risk levels of the slide at hand. The user selects the mitigation measures to be included in the assessment. The toolbox then ranks, with built-in assessment factors and weights and/or with user-defined ranking values and criteria, the mitigation measures included in the analysis. The toolbox includes data management features, e.g. saving data halfway through an analysis, returning to an earlier case, looking up prepared examples or looking up information on mitigation measures. The toolbox also generates a report and has user-forum and help features. The presentation will give an overview of the mitigation measures considered and examples of the use of the toolbox, and will take the attendees through the application of the toolbox.
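    The weighted-scoring idea behind a decision support matrix of this kind can be sketched briefly. The Python example below is purely illustrative: the measure names, criteria, weights and scores are made up and do not come from the SafeLand toolbox.

      # Illustrative weighted ranking of mitigation measures.
      criteria_weights = {"hazard_reduction": 0.5, "cost": 0.3, "feasibility": 0.2}
      scores = {
          "surface drainage":     {"hazard_reduction": 4, "cost": 4, "feasibility": 5},
          "retaining structure":  {"hazard_reduction": 5, "cost": 2, "feasibility": 3},
          "early warning system": {"hazard_reduction": 3, "cost": 4, "feasibility": 4},
      }

      def weighted(measure):
          # sum of criterion score times criterion weight
          return sum(criteria_weights[c] * v for c, v in scores[measure].items())

      for m in sorted(scores, key=weighted, reverse=True):
          print(f"{m:22s} score = {weighted(m):.2f}")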

  10. Aerospace Toolbox--a flight vehicle design, analysis, simulation, and software development environment II: an in-depth overview

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.

    2002-07-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provided a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed included its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in part I included flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this series provides a more in-depth look at the analysis and simulation capability and an update on the toolbox enhancements. It also addresses how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  11. DICOM router: an open source toolbox for communication and correction of DICOM objects.

    PubMed

    Hackländer, Thomas; Kleber, Klaus; Martin, Jens; Mertens, Heinrich

    2005-03-01

    Today, the exchange of medical images and clinical information is well defined by the digital imaging and communications in medicine (DICOM) and Health Level Seven (ie, HL7) standards. The interoperability among information systems is specified by the integration profiles of IHE (Integrating the Healthcare Enterprise). However, older imaging modalities frequently do not correctly support these interfaces and integration profiles, and some use cases are not yet specified by IHE. Therefore, corrections of DICOM objects are necessary to establish conformity. The aim of this project was to develop a toolbox that can automatically perform these recurrent corrections of DICOM objects. The toolbox is composed of three main components: 1) a receiver to receive DICOM objects, 2) a processing pipeline to correct each object, and 3) one or more senders to forward each corrected object to predefined addressees. The toolbox is implemented in Java as an open source project. The processing pipeline is realized by means of plug-ins. One of the plug-ins can be programmed by the user via an external eXtensible Stylesheet Language (ie, XSL) file. Using this plug-in, DICOM objects can also be converted into eXtensible Markup Language (ie, XML) documents or other data formats. DICOM storage services, DICOM CD-ROMs, and the local file system are defined as input and output channels. The toolbox is used clinically for different application areas. These are the automatic correction of DICOM objects from non-IHE-conforming modalities, the import of DICOM CD-ROMs into the picture archiving and communication system, and the pseudonymization of DICOM images. The toolbox has been accepted by users in a clinical setting. Because of the open programming interfaces, the functionality can easily be adapted to future applications.

  12. Integrated system dynamics toolbox for water resources planning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reno, Marissa Devan; Passell, Howard David; Malczynski, Leonard A.

    2006-12-01

    Public mediated resource planning is quickly becoming the norm rather than the exception. Unfortunately, supporting tools are lacking that interactively engage the public in the decision-making process and integrate over the myriad values that influence water policy. In the pages of this report we document the first steps toward developing a specialized decision framework to meet this need; specifically, a modular and generic resource-planning ''toolbox''. The technical challenge lies in the integration of the disparate systems of hydrology, ecology, climate, demographics, economics, policy and law, each of which influence the supply and demand for water. Specifically, these systems, their associated processes, and most importantly the constitutive relations that link them must be identified, abstracted, and quantified. For this reason, the toolbox forms a collection of process modules and constitutive relations that the analyst can ''swap'' in and out to model the physical and social systems unique to their problem. This toolbox with all of its modules is developed within the common computational platform of system dynamics linked to a Geographical Information System (GIS). Development of this resource-planning toolbox represents an important foundational element of the proposed interagency center for Computer Aided Dispute Resolution (CADRe). The Center's mission is to manage water conflict through the application of computer-aided collaborative decision-making methods. The Center will promote the use of decision-support technologies within collaborative stakeholder processes to help stakeholders find common ground and create mutually beneficial water management solutions. The Center will also serve to develop new methods and technologies to help federal, state and local water managers find innovative and balanced solutions to the nation's most vexing water problems. The toolbox is an important step toward achieving the technology development goals of this center.

  13. Focused Field Investigations for Sewer Condition Assessment with EPA SSOAP Toolbox

    EPA Science Inventory

    The Nation’s sanitary sewer infrastructure is aging, and is currently one of the top national water program priorities. The U.S. Environmental Protection Agency (EPA) developed the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox to assist communities in developing ...

  14. A Toolbox for Corrective Action: Resource Conservation and Recovery Act Facilities Investigation Remedy Selection Track

    EPA Pesticide Factsheets

    The purpose of this toolbox is to help EPA Regional staff and their partners to take advantage of the efficiency and quality gains from the Resource Conservation and Recovery Act (RCRA) Facilities Investigation Remedy Selection Track (FIRST) approach.

  15. Traffic analysis toolbox volume IX : work zone modeling and simulation, a guide for analysts

    DOT National Transportation Integrated Search

    2009-03-01

    This document is the second volume in the FHWA Traffic Analysis Toolbox: Work Zone Analysis series. Whereas the first volume provides guidance to decision-makers at agencies and jurisdictions considering the role of analytical tools in work zone plan...

  16. An analysis toolbox to explore mesenchymal migration heterogeneity reveals adaptive switching between distinct modes

    PubMed Central

    Shafqat-Abbasi, Hamdah; Kowalewski, Jacob M; Kiss, Alexa; Gong, Xiaowei; Hernandez-Varas, Pablo; Berge, Ulrich; Jafari-Mamaghani, Mehrdad; Lock, John G; Strömblad, Staffan

    2016-01-01

    Mesenchymal (lamellipodial) migration is heterogeneous, although whether this reflects progressive variability or discrete, 'switchable' migration modalities, remains unclear. We present an analytical toolbox, based on quantitative single-cell imaging data, to interrogate this heterogeneity. Integrating supervised behavioral classification with multivariate analyses of cell motion, membrane dynamics, cell-matrix adhesion status and F-actin organization, this toolbox here enables the detection and characterization of two quantitatively distinct mesenchymal migration modes, termed 'Continuous' and 'Discontinuous'. Quantitative mode comparisons reveal differences in cell motion, spatiotemporal coordination of membrane protrusion/retraction, and how cells within each mode reorganize with changed cell speed. These modes thus represent distinctive migratory strategies. Additional analyses illuminate the macromolecular- and cellular-scale effects of molecular targeting (fibronectin, talin, ROCK), including 'adaptive switching' between Continuous (favored at high adhesion/full contraction) and Discontinuous (low adhesion/inhibited contraction) modes. Overall, this analytical toolbox now facilitates the exploration of both spontaneous and adaptive heterogeneity in mesenchymal migration. DOI: http://dx.doi.org/10.7554/eLife.11384.001 PMID:26821527

  17. A Toolbox to Improve Algorithms for Insulin-Dosing Decision Support

    PubMed Central

    Donsa, K.; Plank, J.; Schaupp, L.; Mader, J. K.; Truskaller, T.; Tschapeller, B.; Höll, B.; Spat, S.; Pieber, T. R.

    2014-01-01

    Background: Standardized insulin order sets for subcutaneous basal-bolus insulin therapy are recommended by clinical guidelines for the inpatient management of diabetes. The algorithm-based GlucoTab system electronically assists health care personnel by supporting clinical workflow and providing insulin-dose suggestions. Objective: To develop a toolbox for improving clinical decision-support algorithms. Methods: The toolbox has three main components. 1) Data preparation: data from several heterogeneous sources are extracted, cleaned and stored in a uniform data format. 2) Simulation: the effects of algorithm modifications are estimated by simulating treatment workflows based on real data from clinical trials. 3) Analysis: algorithm performance is measured, analyzed and simulated using data from three clinical trials with a total of 166 patients. Results: Use of the toolbox led to algorithm improvements as well as the detection of potential individualized subgroup-specific algorithms. Conclusion: These results are a first step towards individualized algorithm modifications for specific patient subgroups. PMID:25024768

  18. A Transcription Activator-Like Effector (TALE) Toolbox for Genome Engineering

    PubMed Central

    Sanjana, Neville E.; Cong, Le; Zhou, Yang; Cunniff, Margaret M.; Feng, Guoping; Zhang, Feng

    2013-01-01

    Transcription activator-like effectors (TALEs) are a class of naturally occurring DNA binding proteins found in the plant pathogen Xanthomonas sp. The DNA binding domain of each TALE consists of tandem 34-amino acid repeat modules that can be rearranged according to a simple cipher to target new DNA sequences. Customized TALEs can be used for a wide variety of genome engineering applications, including transcriptional modulation and genome editing. Here we describe a toolbox for rapid construction of custom TALE transcription factors (TALE-TFs) and nucleases (TALENs) using a hierarchical ligation procedure. This toolbox facilitates affordable and rapid construction of custom TALE-TFs and TALENs within one week and can be easily scaled up to construct TALEs for multiple targets in parallel. We also provide details for testing the activity in mammalian cells of custom TALE-TFs and TALENs using, respectively, qRT-PCR and Surveyor nuclease. The TALE toolbox described here will enable a broad range of biological applications. PMID:22222791

  19. Velocity Mapping Toolbox (VMT): a processing and visualization suite for moving-vessel ADCP measurements

    USGS Publications Warehouse

    Parsons, D.R.; Jackson, P.R.; Czuba, J.A.; Engel, F.L.; Rhoads, B.L.; Oberg, K.A.; Best, J.L.; Mueller, D.S.; Johnson, K.K.; Riley, J.D.

    2013-01-01

    The use of acoustic Doppler current profilers (ADCP) for discharge measurements and three-dimensional flow mapping has increased rapidly in recent years and has been primarily driven by advances in acoustic technology and signal processing. Recent research has developed a variety of methods for processing data obtained from a range of ADCP deployments and this paper builds on this progress by describing new software for processing and visualizing ADCP data collected along transects in rivers or other bodies of water. The new utility, the Velocity Mapping Toolbox (VMT), allows rapid processing (vector rotation, projection, averaging and smoothing), visualization (planform and cross-section vector and contouring), and analysis of a range of ADCP-derived datasets. The paper documents the data processing routines in the toolbox and presents a set of diverse examples that demonstrate its capabilities. The toolbox is applicable to the analysis of ADCP data collected in a wide range of aquatic environments and is made available as open-source code along with this publication.
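    The vector-rotation step mentioned above (projecting east/north velocities onto a cross-section to obtain streamwise and cross-stream components) is easy to illustrate. The sketch below is plain Python/NumPy with synthetic numbers and an assumed transect orientation; it is not the MATLAB toolbox itself.

      # Rotate east/north ADCP velocities into streamwise/cross-stream components.
      import numpy as np

      east = np.array([0.8, 0.9, 1.1])        # m/s, synthetic ensemble-mean velocities
      north = np.array([0.3, 0.2, 0.4])
      theta = np.deg2rad(25.0)                 # assumed orientation of the cross-section

      streamwise = east * np.cos(theta) + north * np.sin(theta)
      cross_stream = -east * np.sin(theta) + north * np.cos(theta)
      print("streamwise:  ", np.round(streamwise, 3))
      print("cross-stream:", np.round(cross_stream, 3))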

  20. A task-related and resting state realistic fMRI simulator for fMRI data validation

    NASA Astrophysics Data System (ADS)

    Hill, Jason E.; Liu, Xiangyu; Nutter, Brian; Mitra, Sunanda

    2017-02-01

    After more than 25 years of published functional magnetic resonance imaging (fMRI) studies, careful scrutiny reveals that most of the reported results lack fully decisive validation. The complex nature of fMRI data generation and acquisition results in unavoidable uncertainties in the true estimation and interpretation of both task-related activation maps and resting state functional connectivity networks, despite the use of various statistical data analysis methodologies. The goal of developing the proposed STANCE (Spontaneous and Task-related Activation of Neuronally Correlated Events) simulator is to generate realistic task-related and/or resting-state 4D blood oxygenation level dependent (BOLD) signals, given the experimental paradigm and scan protocol, by using digital phantoms of twenty normal brains available from BrainWeb (http://brainweb.bic.mni.mcgill.ca/brainweb/). The proposed simulator will include estimated system and modelled physiological noise as well as motion to serve as a reference to measured brain activities. In its current form, STANCE is a MATLAB toolbox with command line functions serving as an open-source add-on to SPM8 (http://www.fil.ion.ucl.ac.uk/spm/software/spm8/). The STANCE simulator has been designed in a modular framework so that the hemodynamic response (HR) and various noise models can be iteratively improved to include evolving knowledge about such models.
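    The core of task-related BOLD simulation, convolving a task design with a hemodynamic response function and adding noise, can be sketched generically. The Python example below uses an assumed double-gamma HRF shape and made-up block timings; STANCE itself is a MATLAB/SPM8 toolbox, so this only illustrates the idea.

      # Conceptual task-related BOLD simulation: boxcar design * double-gamma HRF + noise.
      import numpy as np
      from scipy.stats import gamma

      tr, n_vols = 2.0, 120
      t = np.arange(0, 30, tr)
      hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)     # assumed double-gamma shape
      hrf /= hrf.sum()

      design = np.zeros(n_vols)
      design[10:20] = design[50:60] = design[90:100] = 1.0   # task blocks (assumed)

      bold = np.convolve(design, hrf)[:n_vols]
      noisy = bold + np.random.default_rng(0).normal(0, 0.05, n_vols)
      print("peak simulated BOLD response: %.3f" % noisy.max())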

  1. ElectroMagnetoEncephalography Software: Overview and Integration with Other EEG/MEG Toolboxes

    PubMed Central

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section. PMID:21577273

  2. ElectroMagnetoEncephalography software: overview and integration with other EEG/MEG toolboxes.

    PubMed

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section.

  3. SBEToolbox: A Matlab Toolbox for Biological Network Analysis

    PubMed Central

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J.

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases. PMID:24027418
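    Comparable centrality and clustering metrics can be computed in Python with NetworkX, as sketched below on a built-in example graph. SBEToolbox itself is a MATLAB toolbox, so this only mirrors the idea, not its functions.

      # Basic network metrics with NetworkX (illustrative analogue only).
      import networkx as nx

      g = nx.karate_club_graph()                       # built-in example network
      degree = nx.degree_centrality(g)
      betweenness = nx.betweenness_centrality(g)

      top_degree = sorted(degree, key=degree.get, reverse=True)[:3]
      print("highest-degree nodes:", top_degree)
      print("highest-betweenness node:", max(betweenness, key=betweenness.get))
      print("average clustering coefficient: %.3f" % nx.average_clustering(g))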

  4. SBEToolbox: A Matlab Toolbox for Biological Network Analysis.

    PubMed

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases.

  5. Sodium 3D COncentration MApping (COMA 3D) using 23Na and proton MRI

    NASA Astrophysics Data System (ADS)

    Truong, Milton L.; Harrington, Michael G.; Schepkin, Victor D.; Chekmenev, Eduard Y.

    2014-10-01

    Functional changes in sodium 3D MRI signals were converted into millimolar concentration changes using an open-source, fully automated MATLAB toolbox. These concentration changes are visualized via 3D sodium concentration maps, which are overlaid on conventional 3D proton images to provide high-resolution co-registration for easy correlation of functional changes with anatomical regions. Nearly 5,000 concentration maps per hour were generated on a personal computer (ca. 2012) using 21.1 T 3D sodium MRI brain images of live rats with a spatial resolution of 0.8 × 0.8 × 0.8 mm3 and imaging matrices of 60 × 60 × 60. The produced concentration maps allowed for non-invasive quantitative measurement of in vivo sodium concentration in the normal rat brain as a functional response to migraine-like conditions. The presented work can also be applied to sodium-associated changes in migraine, cancer, and other metabolic abnormalities that can be sensed by molecular imaging. The MATLAB toolbox allows for automated image analysis of 3D images acquired on the Bruker platform and can be extended to other imaging platforms. The resulting images are presented as series of 2D slices in all three dimensions in native MATLAB and PDF formats. The following is provided: (a) MATLAB source code for image processing, (b) the detailed processing procedures, (c) a description of the code and all sub-routines, and (d) example data sets of initial and processed data. The toolbox can be downloaded at: http://www.vuiis.vanderbilt.edu/~truongm/COMA3D/.

  6. nSTAT: Open-Source Neural Spike Train Analysis Toolbox for Matlab

    PubMed Central

    Cajigas, I.; Malik, W.Q.; Brown, E.N.

    2012-01-01

    Over the last decade there has been a tremendous advance in the analytical tools available to neuroscientists to understand and model neural function. In particular, the point-process generalized linear model (PP-GLM) framework has been applied successfully to problems ranging from neuro-endocrine physiology to neural decoding. However, the lack of freely distributed software implementations of published PP-GLM algorithms, together with the problem-specific modifications required for their use, limits wide application of these techniques. In an effort to make existing PP-GLM methods more accessible to the neuroscience community, we have developed nSTAT – an open source neural spike train analysis toolbox for Matlab®. By adopting an Object-Oriented Programming (OOP) approach, nSTAT allows users to easily manipulate data by performing operations on objects that have an intuitive connection to the experiment (spike trains, covariates, etc.), rather than by dealing with data in vector/matrix form. The algorithms implemented within nSTAT address a number of common problems including computation of peri-stimulus time histograms, quantification of the temporal response properties of neurons, and characterization of neural plasticity within and across trials. nSTAT provides a starting point for exploratory data analysis, allows for simple and systematic building and testing of point process models, and for decoding of stimulus variables based on point process models of neural function. By providing an open-source toolbox, we hope to establish a platform that can be easily used, modified, and extended by the scientific community to address limitations of current techniques and to extend available techniques to more complex problems. PMID:22981419
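    The point-process/GLM idea can be illustrated by modelling binned spike counts as a Poisson GLM of a stimulus covariate. The Python sketch below uses statsmodels and synthetic data; it does not reproduce nSTAT's MATLAB object model.

      # Poisson GLM of binned spike counts versus a stimulus covariate (synthetic).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      stim = rng.normal(size=500)                     # binned stimulus covariate
      rate = np.exp(0.2 + 0.8 * stim)                 # true conditional intensity
      counts = rng.poisson(rate)                      # spike counts per bin

      X = sm.add_constant(stim)
      fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
      print(fit.params)        # estimates should be close to [0.2, 0.8]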

  7. UAS-NAS Live Virtual Constructive Distributed Environment (LVC): LVC Gateway, Gateway Toolbox, Gateway Data Logger (GDL), SaaProc Software Design Description

    NASA Technical Reports Server (NTRS)

    Jovic, Srboljub

    2015-01-01

    This document provides the software design description for the two core software components, the LVC Gateway and the LVC Gateway Toolbox, and for two participants, the LVC Gateway Data Logger and the SAA Processor (SaaProc).

  8. Expanding the seat belt program strategies toolbox: a starter kit for trying new program ideas : traffic tech.

    DOT National Transportation Integrated Search

    2016-10-01

    The National Highway Traffic Safety Administration has just released a new resource for developing seat belt programs in the traffic safety community: Expanding the Seat Belt Program Toolbox: A Starter Kit for Trying New Program Ideas. Resea...

  9. Focused Field Investigations for Sewer Condition Assessment with EPA SSOAP Toolbox - slides

    EPA Science Inventory

    The Nation’s sanitary sewer infrastructure is aging, and is currently one of the top national water program priorities. The U.S. Environmental Protection Agency (EPA) developed the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox to assist communities in developing S...

  10. A Software Toolbox for Systematic Evaluation of Seismometer-Digitizer System Responses

    DTIC Science & Technology

    2010-09-01

    Award No. DE-FG02-09ER85548/Phase I. Measurement of the absolute amplitudes of a seismic signal requires accurate knowledge of ... power spectral density (PSD) estimator for background noise spectra at a seismic station. SACPSD differs from the current PSD estimator used by NEIC and IRIS. ... characteristics (e.g., borehole vs. surface installation) than the actual seismic noise characteristics. These results suggest that our best results of NOISETRAN...

  11. Motor assessment using the NIH Toolbox

    PubMed Central

    Magasi, Susan; McCreath, Heather E.; Bohannon, Richard W.; Wang, Ying-Chih; Bubela, Deborah J.; Rymer, William Z.; Beaumont, Jennifer; Rine, Rose Marie; Lai, Jin-Shei; Gershon, Richard C.

    2013-01-01

    Motor function involves complex physiologic processes and requires the integration of multiple systems, including the neuromuscular, musculoskeletal, cardiopulmonary, neural motor, and sensory-perceptual systems. Motor functional status is indicative of current physical health status, burden of disease, and long-term health outcomes, and is integrally related to daily functioning and quality of life. Given its importance to overall neurologic health and function, motor function was identified as a key domain for inclusion in the NIH Toolbox for Assessment of Neurological and Behavioral Function (NIH Toolbox). We engaged in a 3-stage developmental process to: 1) identify key subdomains and candidate measures for inclusion in the NIH Toolbox, 2) pretest candidate measures for feasibility across the age span of people aged 3 to 85 years, and 3) validate candidate measures against criterion measures in a sample of healthy individuals aged 3 to 85 years (n = 340). Based on extensive literature review and input from content experts, the 5 subdomains of dexterity, strength, balance, locomotion, and endurance were recommended for inclusion in the NIH Toolbox motor battery. Based on our validation testing, valid and reliable measures that are simultaneously low-cost and portable have been recommended to assess each subdomain, including the 9-hole peg board for dexterity, grip dynamometry for upper-extremity strength, a standing balance test, a 4-m walk test for gait speed, and a 2-minute walk test for endurance. PMID:23479547

  12. An ethics toolbox for neurotechnology.

    PubMed

    Farah, Martha J

    2015-04-08

    Advances in neurotechnology will raise new ethical dilemmas, to which scientists and the rest of society must respond. Here I present a "toolbox" of concepts to help us analyze these issues and communicate with each other about them across differences of ethical intuition. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. The triticeae toolbox: combining phenotype and genotype data to advance small-grains breeding

    USDA-ARS?s Scientific Manuscript database

    The Triticeae Toolbox (http://triticeaetoolbox.org; T3) is the database schema enabling plant breeders and researchers to combine, visualize, and interrogate the wealth of phenotype and genotype data generated by the Triticeae Coordinated Agricultural Project (TCAP). T3 enables users to define speci...

  14. Wastewater Collection System Toolbox | Eliminating Sanitary ...

    EPA Pesticide Factsheets

    2017-04-10

    Communities across the United States are working to find cost-effective, long-term approaches to managing their aging wastewater infrastructure and preventing the problems that lead to sanitary sewer overflows. The Toolbox is an effort by EPA New England to provide examples of programs and educational efforts from New England and beyond.

  15. 40 CFR 141.717 - Pre-filtration treatment toolbox components.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... surface water or GWUDI source. (c) Bank filtration. Systems receive Cryptosporidium treatment credit for ... paragraph. Systems using bank filtration when they begin source water monitoring under § 141.701(a) must ...

  16. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  17. Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach

    ERIC Educational Resources Information Center

    Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…

  18. Mechanostimulation protocols for cardiac tissue engineering.

    PubMed

    Govoni, Marco; Muscari, Claudio; Guarnieri, Carlo; Giordano, Emanuele

    2013-01-01

    Owing to the inability of a damaged myocardium to replace itself, alternative strategies to heart transplantation have been explored within the last decades, and cardiac tissue engineering/regenerative medicine is among the present challenges in biomedical research. Encouragingly, several studies attest to the constant extension of the toolbox available to engineer a fully functional, contractile, and robust cardiac tissue using different combinations of cells, template bioscaffolds, and biophysical stimuli obtained by the use of specific bioreactors. Mechanical forces influence the growth and shape of every tissue in our body, generating changes in intracellular biochemistry and gene expression. That is why bioreactors play a central role in the task of regenerating a complex tissue such as the myocardium. In the last fifteen years a large number of dynamic culture devices have been developed and many results have been collected. The aim of this brief review is to summarize, in a single streamlined paper, the state of the art in this field.

  19. Mechanostimulation Protocols for Cardiac Tissue Engineering

    PubMed Central

    Govoni, Marco; Muscari, Claudio; Guarnieri, Carlo; Giordano, Emanuele

    2013-01-01

    Owing to the inability of a damaged myocardium to replace itself, alternative strategies to heart transplantation have been explored within the last decades, and cardiac tissue engineering/regenerative medicine is among the present challenges in biomedical research. Encouragingly, several studies attest to the constant extension of the toolbox available to engineer a fully functional, contractile, and robust cardiac tissue using different combinations of cells, template bioscaffolds, and biophysical stimuli obtained by the use of specific bioreactors. Mechanical forces influence the growth and shape of every tissue in our body, generating changes in intracellular biochemistry and gene expression. That is why bioreactors play a central role in the task of regenerating a complex tissue such as the myocardium. In the last fifteen years a large number of dynamic culture devices have been developed and many results have been collected. The aim of this brief review is to summarize, in a single streamlined paper, the state of the art in this field. PMID:23936858

  20. A toolbox of immunoprecipitation-grade monoclonal antibodies to human transcription factors.

    PubMed

    Venkataraman, Anand; Yang, Kun; Irizarry, Jose; Mackiewicz, Mark; Mita, Paolo; Kuang, Zheng; Xue, Lin; Ghosh, Devlina; Liu, Shuang; Ramos, Pedro; Hu, Shaohui; Bayron Kain, Diane; Keegan, Sarah; Saul, Richard; Colantonio, Simona; Zhang, Hongyan; Behn, Florencia Pauli; Song, Guang; Albino, Edisa; Asencio, Lillyann; Ramos, Leonardo; Lugo, Luvir; Morell, Gloriner; Rivera, Javier; Ruiz, Kimberly; Almodovar, Ruth; Nazario, Luis; Murphy, Keven; Vargas, Ivan; Rivera-Pacheco, Zully Ann; Rosa, Christian; Vargas, Moises; McDade, Jessica; Clark, Brian S; Yoo, Sooyeon; Khambadkone, Seva G; de Melo, Jimmy; Stevanovic, Milanka; Jiang, Lizhi; Li, Yana; Yap, Wendy Y; Jones, Brittany; Tandon, Atul; Campbell, Elliot; Montelione, Gaetano T; Anderson, Stephen; Myers, Richard M; Boeke, Jef D; Fenyö, David; Whiteley, Gordon; Bader, Joel S; Pino, Ignacio; Eichinger, Daniel J; Zhu, Heng; Blackshaw, Seth

    2018-03-19

    A key component of efforts to address the reproducibility crisis in biomedical research is the development of rigorously validated and renewable protein-affinity reagents. As part of the US National Institutes of Health (NIH) Protein Capture Reagents Program (PCRP), we have generated a collection of 1,406 highly validated immunoprecipitation- and/or immunoblotting-grade mouse monoclonal antibodies (mAbs) to 737 human transcription factors, using an integrated production and validation pipeline. We used HuProt human protein microarrays as a primary validation tool to identify mAbs with high specificity for their cognate targets. We further validated PCRP mAbs by means of multiple experimental applications, including immunoprecipitation, immunoblotting, chromatin immunoprecipitation followed by sequencing (ChIP-seq), and immunohistochemistry. We also conducted a meta-analysis that identified critical variables that contribute to the generation of high-quality mAbs. All validation data, protocols, and links to PCRP mAb suppliers are available at http://proteincapture.org.

  1. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks.

    PubMed

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2017-11-05

    Power consumption is a primary concern in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of the applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and of the network stack while also considering their reliability. To solve this problem, we introduce a fully automatic solution for designing power-consumption-aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations, and a toolbox named EDEN that fully supports the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way.

  2. Impact-oriented steering--the concept of NGO-IDEAs 'impact toolbox'.

    PubMed

    2008-03-01

    The NGO-IDEAs 'Impact Toolbox' has been developed with a group of NGOs all of which are active in the area of saving and credit in South India. This compilation of methods to apply in impact-oriented steering was devised by the executive staff of the Indian partner NGOs, also known as the Resource Persons, in 2006 and tested from late 2006 to early 2007. At first glance, the approach may appear to be highly specialised and difficult to transfer. However, in fact it follows principles that can be adapted for several NGOs in other countries and in other sectors. The following article presents the concept of the NGO-IDEAs 'Impact Toolbox'.

  3. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing.

    PubMed

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible as it comes with several signal generators and can be easily extended for any experiment.
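    To make the adaptive-procedure idea concrete, here is a minimal 2-down/1-up transformed staircase run against a simulated listener. It is a Python illustration with assumed parameters, not the toolbox's MATLAB code or its default settings.

      # Illustrative 2-down/1-up transformed staircase with a simulated listener.
      import numpy as np

      def simulated_listener(level, threshold=30.0, slope=0.5,
                             rng=np.random.default_rng(0)):
          p = 1.0 / (1.0 + np.exp(-slope * (level - threshold)))
          return rng.random() < p                       # True = correct response

      level, step = 50.0, 4.0
      correct_in_row, reversals, last_dir = 0, [], None
      while len(reversals) < 8:
          if simulated_listener(level):
              correct_in_row += 1
              if correct_in_row == 2:                   # 2-down rule: harder after 2 correct
                  correct_in_row = 0
                  if last_dir == "up":
                      reversals.append(level)
                  level, last_dir = level - step, "down"
          else:                                          # 1-up rule: easier after an error
              correct_in_row = 0
              if last_dir == "down":
                  reversals.append(level)
              level, last_dir = level + step, "up"

      print("threshold estimate: %.1f dB" % np.mean(reversals[-6:]))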

  4. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing

    PubMed Central

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible as it comes with several signal generators and can be easily extended for any experiment. PMID:25101013

  5. Modern CACSD using the Robust-Control Toolbox

    NASA Technical Reports Server (NTRS)

    Chiang, Richard Y.; Safonov, Michael G.

    1989-01-01

    The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools such as singular values and structured singular values, robust synthesis tools such as continuous/discrete H2/H-infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods, and a variety of robust model reduction tools such as Hankel approximation, balanced truncation and balanced stochastic truncation, etc. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H-infinity loop-shaping and large space structure model reduction.
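    One of the basic analysis quantities mentioned above, the singular values of a frequency response, can be sketched with plain linear algebra. The Python example below evaluates G(jw) = C(jwI - A)^-1 B + D for a toy state-space model and reports the maximum singular value; it only illustrates the computation and is unrelated to the toolbox's MATLAB implementation.

      # Maximum singular value of a toy system's frequency response.
      import numpy as np

      A = np.array([[0.0, 1.0], [-4.0, -0.4]])      # toy 2-state system (assumed)
      B = np.array([[0.0], [1.0]])
      C = np.array([[1.0, 0.0]])
      D = np.zeros((1, 1))

      for w in np.logspace(-1, 2, 5):               # frequencies in rad/s
          G = C @ np.linalg.inv(1j * w * np.eye(2) - A) @ B + D
          sigma_max = np.linalg.svd(G, compute_uv=False)[0]
          print(f"w = {w:7.2f} rad/s   max singular value = {sigma_max:.3f}")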

  6. No slowing down of Jakobshavn Isbræ in 2014: Results from feature-tracking five Greenland outlet glaciers using Landsat-8 data and the ImGRAFT toolbox

    NASA Astrophysics Data System (ADS)

    Messerli, Alexandra; Karlsson, Nanna B.; Grinsted, Aslak

    2015-04-01

    Data from the Landsat-8 panchromatic band, spanning the period August 2013 - September 2014, have been feature-tracked to construct ice velocities and flux estimates for five major Greenland outlet glaciers: Jakobshavn Isbræ, Nioghalvfjerdsbræ, Kangerdlugssuaq, Helheim and Petermann glaciers. The outlet glaciers are responsible for draining more than 20% of the Greenland Ice Sheet, and thus have a significant impact on its mass balance. The feature tracking is performed with the newly developed ImGRAFT toolbox, a MATLAB-based, freely available software package (http://imgraft.glaciology.net). Overall, the resulting velocity fields and fluxes agree with the findings of existing studies. Notably, we find that Jakobshavn Isbræ has reached an unprecedented speed of over 50 m/day and exhibits large seasonal fluctuations. In contrast, on the east coast of Greenland, Helheim and Kangerdlugssuaq Glaciers have returned to pre-speed-up velocities, following a peak in ice flux about a decade ago. Petermann and Nioghalvfjerdsbræ show little variability in speed, with typical flow speeds of less than 5 m/day.
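    The core feature-tracking operation, matching an image chip between two scenes by normalized cross-correlation, can be sketched generically. The Python example below uses scikit-image on a synthetically shifted image; it illustrates the principle only and is not ImGRAFT, which is a MATLAB toolbox.

      # Template matching by normalized cross-correlation (synthetic displacement).
      import numpy as np
      from skimage.feature import match_template

      rng = np.random.default_rng(0)
      image_a = rng.random((200, 200))                         # synthetic "first" scene
      image_b = np.roll(image_a, shift=(3, 5), axis=(0, 1))    # copy displaced by (3, 5) pixels

      template = image_a[80:110, 80:110]                       # 30x30 chip from scene A
      corr = match_template(image_b, template)                 # correlation surface
      dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
      print("recovered offset:", dy - 80, dx - 80)             # expect (3, 5)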

  7. Functional Genomics Assistant (FUGA): a toolbox for the analysis of complex biological networks

    PubMed Central

    2011-01-01

    Background: Cellular constituents such as proteins, DNA, and RNA form a complex web of interactions that regulate biochemical homeostasis and determine the dynamic cellular response to external stimuli. It follows that detailed understanding of these patterns is critical for the assessment of fundamental processes in cell biology and pathology. Representation and analysis of cellular constituents through network principles is a promising and popular analytical avenue towards a deeper understanding of molecular mechanisms in a system-wide context. Findings: We present Functional Genomics Assistant (FUGA) - an extensible and portable MATLAB toolbox for the inference of biological relationships, graph topology analysis, random network simulation, network clustering, and functional enrichment statistics. In contrast to conventional differential expression analysis of individual genes, FUGA offers a framework for the study of system-wide properties of biological networks and highlights putative molecular targets using concepts of systems biology. Conclusion: FUGA offers a simple and customizable framework for network analysis in a variety of systems biology applications. It is freely available for individual or academic use at http://code.google.com/p/fuga. PMID:22035155

  8. Policy Analysis for Sustainable Development: The Toolbox for the Environmental Social Scientist

    ERIC Educational Resources Information Center

    Runhaar, Hens; Dieperink, Carel; Driessen, Peter

    2006-01-01

    Purpose: The paper seeks to propose the basic competencies of environmental social scientists regarding policy analysis for sustainable development. The ultimate goal is to contribute to an improvement of educational programmes in higher education by suggesting a toolbox that should be integrated in the curriculum. Design/methodology/approach:…

  9. Rural ITS toolbox and deployment plan for Regions 2, 6, 7 and 9 : ITS toolbox for rural and small urban areas

    DOT National Transportation Integrated Search

    1998-12-01

    As a part of the Small Urban and Rural ITS Study it conducted in 4 of its more rural regions, the New York State Department of Transportation has developed a compendium of systems, devices and strategies that can enhance safety, provide information, ...

  10. The Psychometric Toolbox: An Excel Package for Use in Measurement and Psychometrics Courses

    ERIC Educational Resources Information Center

    Ferrando, Pere J.; Masip-Cabrera, Antoni; Navarro-González, David; Lorenzo-Seva, Urbano

    2017-01-01

    The Psychometric Toolbox (PT) is a user-friendly, non-commercial package mainly intended to be used for instructional purposes in introductory courses of educational and psychological measurement, psychometrics and statistics. The PT package is organized in six separate modules or sub-programs: Data preprocessor (descriptive analyses and data…

  11. Toolbox or Adjustable Spanner? A Critical Comparison of Two Metaphors for Adaptive Decision Making

    ERIC Educational Resources Information Center

    Söllner, Anke; Bröder, Arndt

    2016-01-01

    For multiattribute decision tasks, different metaphors exist that describe the process of decision making and its adaptation to diverse problems and situations. Multiple strategy models (MSMs) assume that decision makers choose adaptively from a set of different strategies (toolbox metaphor), whereas evidence accumulation models (EAMs) hold that a…

  12. FALCON: a toolbox for the fast contextualization of logical networks

    PubMed Central

    De Landtsheer, Sébastien; Trairatphisan, Panuwat; Lucarelli, Philippe; Sauter, Thomas

    2017-01-01

    Motivation: Mathematical modelling of regulatory networks allows for the discovery of knowledge at the system level. However, existing modelling tools are often computation-heavy and do not offer intuitive ways to explore the model, to test hypotheses or to interpret the results biologically. Results: We have developed a computational approach to contextualize logical models of regulatory networks with biological measurements based on a probabilistic description of rule-based interactions between the different molecules. Here, we propose a Matlab toolbox, FALCON, to automatically and efficiently build and contextualize networks, which includes a pipeline for conducting parameter analysis, knockouts and easy and fast model investigation. The contextualized models could then provide qualitative and quantitative information about the network and suggest hypotheses about biological processes. Availability and implementation: FALCON is freely available for non-commercial users on GitHub under the GPLv3 licence. The toolbox, installation instructions, full documentation and test datasets are available at https://github.com/sysbiolux/FALCON. FALCON runs under Matlab (MathWorks) and requires the Optimization Toolbox. Contact: thomas.sauter@uni.lu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28673016

  13. FALCON: a toolbox for the fast contextualization of logical networks.

    PubMed

    De Landtsheer, Sébastien; Trairatphisan, Panuwat; Lucarelli, Philippe; Sauter, Thomas

    2017-11-01

    Mathematical modelling of regulatory networks allows for the discovery of knowledge at the system level. However, existing modelling tools are often computation-heavy and do not offer intuitive ways to explore the model, to test hypotheses or to interpret the results biologically. We have developed a computational approach to contextualize logical models of regulatory networks with biological measurements based on a probabilistic description of rule-based interactions between the different molecules. Here, we propose a Matlab toolbox, FALCON, to automatically and efficiently build and contextualize networks, which includes a pipeline for conducting parameter analysis, knockouts and easy and fast model investigation. The contextualized models could then provide qualitative and quantitative information about the network and suggest hypotheses about biological processes. FALCON is freely available for non-commercial users on GitHub under the GPLv3 licence. The toolbox, installation instructions, full documentation and test datasets are available at https://github.com/sysbiolux/FALCON. FALCON runs under Matlab (MathWorks) and requires the Optimization Toolbox. Contact: thomas.sauter@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  14. Image processing and pattern recognition with CVIPtools MATLAB toolbox: automatic creation of masks for veterinary thermographic images

    NASA Astrophysics Data System (ADS)

    Mishra, Deependra K.; Umbaugh, Scott E.; Lama, Norsang; Dahal, Rohini; Marino, Dominic J.; Sackman, Joseph

    2016-09-01

    CVIPtools is a software package for the exploration of computer vision and image processing, developed in the Computer Vision and Image Processing Laboratory at Southern Illinois University Edwardsville. CVIPtools is available in three variants - a) the CVIPtools Graphical User Interface, b) the CVIPtools C library and c) the CVIPtools MATLAB toolbox - which makes it accessible to a variety of different users. It offers students, faculty, researchers and any other user a free and easy way to explore computer vision and image processing techniques. Many functions have been implemented and are updated on a regular basis, and the library has reached a level of sophistication that makes it suitable for both educational and research purposes. In this paper, a detailed list of the functions available in the CVIPtools MATLAB toolbox is presented, along with how these functions can be used in image analysis and computer vision applications. The CVIPtools MATLAB toolbox allows the user to gain practical experience and to better understand the underlying theoretical problems in image processing and pattern recognition. As an example application, the algorithm for the automatic creation of masks for veterinary thermographic images is presented.
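    As a hedged illustration of the kind of automatic mask creation mentioned above (not the CVIPtools algorithm itself), the Python sketch below segments a synthetic thermographic image with Otsu thresholding followed by simple morphological clean-up; the synthetic image and parameter choices are assumptions.

    ```python
    # Sketch of automatic mask creation for a thermographic image: Otsu
    # thresholding plus hole filling and speckle removal.
    import numpy as np
    from scipy import ndimage

    def otsu_threshold(img, nbins=256):
        """Return the threshold maximizing between-class variance (Otsu's method)."""
        hist, edges = np.histogram(img.ravel(), bins=nbins)
        centers = 0.5 * (edges[:-1] + edges[1:])
        w = hist.astype(float) / hist.sum()
        best_t, best_var = centers[0], -1.0
        for k in range(1, nbins):
            w0, w1 = w[:k].sum(), w[k:].sum()
            if w0 == 0 or w1 == 0:
                continue
            mu0 = (w[:k] * centers[:k]).sum() / w0
            mu1 = (w[k:] * centers[k:]).sum() / w1
            var_between = w0 * w1 * (mu0 - mu1) ** 2
            if var_between > best_var:
                best_var, best_t = var_between, centers[k]
        return best_t

    # Synthetic "thermographic" image: warm elliptical subject on a cooler background.
    yy, xx = np.mgrid[0:128, 0:128]
    img = 20 + 15 * (((yy - 64) / 40) ** 2 + ((xx - 64) / 25) ** 2 < 1)
    img = img + np.random.default_rng(2).normal(0, 0.5, img.shape)

    mask = img > otsu_threshold(img)                  # foreground = warmer pixels
    mask = ndimage.binary_fill_holes(mask)            # fill interior holes
    mask = ndimage.binary_opening(mask, iterations=2) # remove small speckles
    print("mask covers", round(mask.mean() * 100, 1), "% of the image")
    ```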

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinuesa, Ricardo; Fick, Lambert; Negi, Prabal

    In the present document we describe a toolbox for the spectral-element code Nek5000, aimed at computing turbulence statistics. The toolbox is presented for a small test case, namely a square duct with Lx = 2h, Ly = 2h and Lz = 4h, where x, y and z are the horizontal, vertical and streamwise directions, respectively. The number of elements in the xy-plane is 16 x 16 = 256, and the number of elements in z is 4, leading to a total of 1,024 spectral elements. A polynomial order of N = 5 is chosen, and the mesh is generated using the Nek5000 tool genbox. The toolbox presented here allows the computation of mean-velocity components, the Reynolds-stress tensor, as well as turbulent kinetic energy (TKE) and Reynolds-stress budgets. Note that the present toolbox can compute turbulence statistics in turbulent flows with one homogeneous direction (where the statistics are based on time-averaging as well as averaging in the homogeneous direction), as well as in fully three-dimensional flows (with no periodic directions, where only time-averaging is considered).
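    A minimal Python sketch of the statistics described above, assuming velocity snapshots are already available as an array: mean velocities, the Reynolds-stress tensor and TKE are obtained by averaging over time and over the homogeneous streamwise direction. The array shapes and the synthetic data are assumptions; the actual toolbox operates inside Nek5000.

    ```python
    # Mean velocity and Reynolds stresses by time + homogeneous-direction averaging.
    import numpy as np

    # Synthetic velocity snapshots: (n_time, ny, nx, nz, 3) with z homogeneous.
    rng = np.random.default_rng(3)
    u = rng.normal(size=(200, 16, 16, 8, 3))
    u[..., 2] += 1.0                          # add a mean streamwise flow

    # Average over time (axis 0) and the homogeneous z direction (axis 3).
    U = u.mean(axis=(0, 3))                   # mean velocity, shape (ny, nx, 3)
    fluct = u - U[None, :, :, None, :]        # fluctuations u' = u - <u>

    # Reynolds-stress tensor <u_i' u_j'>, shape (ny, nx, 3, 3).
    R = np.einsum('tyxzi,tyxzj->yxij', fluct, fluct) / (u.shape[0] * u.shape[3])

    tke = 0.5 * np.trace(R, axis1=-2, axis2=-1)   # turbulent kinetic energy
    print("mean streamwise velocity at duct centre:", U[8, 8, 2])
    print("TKE at duct centre:", tke[8, 8])
    ```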

  16. Cross-species 3D virtual reality toolbox for visual and cognitive experiments.

    PubMed

    Doucet, Guillaume; Gulli, Roberto A; Martinez-Trujillo, Julio C

    2016-06-15

    Although simplified visual stimuli, such as dots or gratings presented on homogeneous backgrounds, provide strict control over the stimulus parameters during visual experiments, they fail to approximate visual stimulation in natural conditions. Adoption of virtual reality (VR) in neuroscience research has been proposed to circumvent this problem, by combining strict control of experimental variables and behavioral monitoring within complex and realistic environments. We have created a VR toolbox that maximizes experimental flexibility while minimizing implementation costs. A free VR engine (Unreal 3) has been customized to interface with any control software via text commands, allowing seamless introduction into pre-existing laboratory data acquisition frameworks. Furthermore, control functions are provided for the two most common programming languages used in visual neuroscience: Matlab and Python. The toolbox offers the millisecond time resolution necessary for electrophysiological recordings and is flexible enough to support cross-species usage across a wide range of paradigms. Unlike previously proposed VR solutions whose implementation is complex and time-consuming, our toolbox requires minimal customization or technical expertise to interface with pre-existing data acquisition frameworks, as it relies on already familiar programming environments. Moreover, as it is compatible with a variety of display and input devices, identical VR testing paradigms can be used across species, from rodents to humans. This toolbox facilitates the addition of VR capabilities to any laboratory without perturbing pre-existing data acquisition frameworks or requiring any major hardware changes. Copyright © 2016. All rights reserved.

  17. Optimizing detection and analysis of slow waves in sleep EEG.

    PubMed

    Mensen, Armand; Riedner, Brady; Tononi, Giulio

    2016-12-01

    Analysis of individual slow waves in EEG recordings during sleep provides greater sensitivity and specificity compared to spectral power measures. However, parameters for detection and analysis have not been widely explored and validated. We present a new, open-source, Matlab-based toolbox for the automatic detection and analysis of slow waves, with adjustable parameter settings as well as manual correction and exploration of the results using a multi-faceted visualization tool. We explore a large search space of parameter settings for slow wave detection and measure their effects on a selection of outcome parameters. Every choice of parameter setting had some effect on at least one outcome parameter. In general, the largest effect sizes were found when choosing the EEG reference, the type of canonical waveform, and the amplitude threshold. Previously published methods accurately detect large, global waves but are conservative and miss the detection of smaller-amplitude, local slow waves. The toolbox has additional benefits in terms of speed, user interface, and visualization options to compare and contrast slow waves. The exploration of parameter settings in the toolbox highlights the importance of careful selection of detection methods. The sensitivity and specificity of the automated detection can be improved by manually adding or deleting entire waves and/or specific channels using the toolbox visualization functions. The toolbox standardizes the detection procedure, sets the stage for reliable results and comparisons, and is easy to use without previous programming experience. Copyright © 2016 Elsevier B.V. All rights reserved.
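    A minimal sketch, in Python rather than the Matlab toolbox, of the detection steps the abstract discusses: delta-band filtering, locating negative half-waves between zero crossings, and amplitude thresholding. The sampling rate, filter band, threshold and synthetic signal are all assumptions.

    ```python
    # Illustrative slow-wave detection: band-pass filter, negative half-waves,
    # amplitude criterion. Not the published toolbox.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 250.0                                   # sampling rate (Hz), assumed
    t = np.arange(0, 30, 1 / fs)
    rng = np.random.default_rng(4)
    eeg = 80e-6 * np.sin(2 * np.pi * 0.8 * t) + 20e-6 * rng.normal(size=t.size)

    # 1) Band-pass 0.5-4 Hz (delta band) with a zero-phase Butterworth filter.
    b, a = butter(3, [0.5, 4.0], btype='bandpass', fs=fs)
    delta = filtfilt(b, a, eeg)

    # 2) Negative half-waves: spans between a down-going and the next up-going
    #    zero crossing.
    sign = np.sign(delta)
    down = np.where((sign[:-1] >= 0) & (sign[1:] < 0))[0]
    up = np.where((sign[:-1] < 0) & (sign[1:] >= 0))[0]

    # 3) Amplitude threshold on the trough of each negative half-wave.
    threshold = -40e-6                            # example amplitude criterion
    waves = []
    for d in down:
        nxt = up[up > d]
        if nxt.size == 0:
            continue
        seg = delta[d:nxt[0]]
        if seg.min() < threshold:
            waves.append((d / fs, seg.min() * 1e6))  # (onset time s, trough in uV)

    print(f"detected {len(waves)} candidate slow waves")
    ```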

  18. A toolbox to visually explore cerebellar shape changes in cerebellar disease and dysfunction.

    PubMed

    Abulnaga, S Mazdak; Yang, Zhen; Carass, Aaron; Kansal, Kalyani; Jedynak, Bruno M; Onyike, Chiadi U; Ying, Sarah H; Prince, Jerry L

    2016-02-27

    The cerebellum plays an important role in motor control and is also involved in cognitive processes. Cerebellar function is specialized by location, although the exact topographic functional relationship is not fully understood. The spinocerebellar ataxias are a group of neurodegenerative diseases that cause regional atrophy in the cerebellum, yielding distinct motor and cognitive problems. The ability to study the region-specific atrophy patterns can provide insight into the problem of relating cerebellar function to location. In an effort to study these structural change patterns, we developed a toolbox in MATLAB to provide researchers a unique way to visually explore the correlation between cerebellar lobule shape changes and function loss, with a rich set of visualization and analysis modules. In this paper, we outline the functions and highlight the utility of the toolbox. The toolbox takes as input landmark shape representations of subjects' cerebellar substructures. A principal component analysis is used for dimension reduction. Following this, a linear discriminant analysis and a regression analysis can be performed to find the discriminant direction associated with a specific disease type, or the regression line of a specific functional measure can be generated. The characteristic structural change pattern of a disease type or of a functional score is visualized by sampling points on the discriminant or regression line. The sampled points are used to reconstruct synthetic cerebellar lobule shapes. We present a few case studies highlighting the utility of the toolbox and compare the analysis results with the literature.

  19. A toolbox to visually explore cerebellar shape changes in cerebellar disease and dysfunction

    NASA Astrophysics Data System (ADS)

    Abulnaga, S. Mazdak; Yang, Zhen; Carass, Aaron; Kansal, Kalyani; Jedynak, Bruno M.; Onyike, Chiadi U.; Ying, Sarah H.; Prince, Jerry L.

    2016-03-01

    The cerebellum plays an important role in motor control and is also involved in cognitive processes. Cerebellar function is specialized by location, although the exact topographic functional relationship is not fully understood. The spinocerebellar ataxias are a group of neurodegenerative diseases that cause regional atrophy in the cerebellum, yielding distinct motor and cognitive problems. The ability to study the region-specific atrophy patterns can provide insight into the problem of relating cerebellar function to location. In an effort to study these structural change patterns, we developed a toolbox in MATLAB to provide researchers a unique way to visually explore the correlation between cerebellar lobule shape changes and function loss, with a rich set of visualization and analysis modules. In this paper, we outline the functions and highlight the utility of the toolbox. The toolbox takes as input landmark shape representations of subjects' cerebellar substructures. A principal component analysis is used for dimension reduction. Following this, a linear discriminant analysis and a regression analysis can be performed to find the discriminant direction associated with a specific disease type, or the regression line of a specific functional measure can be generated. The characteristic structural change pattern of a disease type or of a functional score is visualized by sampling points on the discriminant or regression line. The sampled points are used to reconstruct synthetic cerebellar lobule shapes. We present a few case studies highlighting the utility of the toolbox and compare the analysis results with the literature.
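    The pipeline described above can be sketched compactly in Python (this is not the MATLAB toolbox): PCA for dimension reduction, a discriminant direction between two groups, and sampling along that direction to reconstruct synthetic shapes. The landmark data, group labels and component count are invented, and a simple mean-difference direction stands in for a full LDA.

    ```python
    # PCA + discriminant-direction sampling to reconstruct synthetic shapes.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(5)
    n_landmarks = 60
    controls = rng.normal(0, 1, size=(30, n_landmarks * 3))        # 30 subjects
    patients = rng.normal(0, 1, size=(25, n_landmarks * 3)) + 0.3  # shifted group
    X = np.vstack([controls, patients])
    y = np.array([0] * len(controls) + [1] * len(patients))

    # 1) Dimension reduction of the landmark coordinates.
    pca = PCA(n_components=10)
    Z = pca.fit_transform(X)

    # 2) A simple discriminant direction in PCA space (difference of group means;
    #    an LDA would be the more formal choice).
    direction = Z[y == 1].mean(axis=0) - Z[y == 0].mean(axis=0)
    direction /= np.linalg.norm(direction)

    # 3) Sample points along the discriminant line and reconstruct synthetic
    #    shapes in the original landmark space.
    center = Z.mean(axis=0)
    mean_shape = X.mean(axis=0).reshape(-1, 3)
    for alpha in (-2.0, -1.0, 0.0, 1.0, 2.0):
        synthetic = pca.inverse_transform(center + alpha * direction)
        shape = synthetic.reshape(n_landmarks, 3)
        disp = np.linalg.norm(shape - mean_shape, axis=1).mean()
        print(f"alpha={alpha:+.1f}: mean landmark displacement {disp:.3f}")
    ```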

  20. Online model evaluation of large-eddy simulations covering Germany with a horizontal resolution of 156 m

    NASA Astrophysics Data System (ADS)

    Hansen, Akio; Ament, Felix; Lammert, Andrea

    2017-04-01

    Large-eddy simulations have been performed for several decades, but due to computational limits most studies were restricted to small domains or idealised initial and boundary conditions. Within the High definition clouds and precipitation for advancing climate prediction (HD(CP)2) project, realistic, weather-forecast-like LES simulations were performed with the newly developed ICON LES model for several days. The domain covers central Europe with a horizontal resolution down to 156 m. The setup consists of more than 3 billion grid cells, so that a single 3D output dump requires roughly 500 GB. A newly developed online evaluation toolbox was created to check instantaneously whether the model simulations are realistic. The toolbox automatically combines model results with observations and generates quicklooks for various variables. So far, temperature and humidity profiles, cloud cover, integrated water vapour, precipitation and many more are included. All kinds of observations, such as aircraft observations, soundings or precipitation radar networks, are used. For each dataset, a specific module is created, which allows easy handling and enhancement of the toolbox. Most of the observations are automatically downloaded from the Standardized Atmospheric Measurement Database (SAMD). The evaluation tool supports scientists in monitoring computationally costly model simulations and gives a first overview of model performance. The structure of the toolbox as well as the SAMD database are presented. Furthermore, the toolbox was applied to an ICON LES sensitivity study, for which example results are shown.

  1. U.S. Geological Survey groundwater toolbox, a graphical and mapping interface for analysis of hydrologic data (version 1.0): user guide for estimation of base flow, runoff, and groundwater recharge from streamflow data

    USGS Publications Warehouse

    Barlow, Paul M.; Cunningham, William L.; Zhai, Tong; Gray, Mark

    2015-01-01

    This report is a user guide for the streamflow-hydrograph analysis methods provided with version 1.0 of the U.S. Geological Survey (USGS) Groundwater Toolbox computer program. These include six hydrograph-separation methods to determine the groundwater-discharge (base-flow) and surface-runoff components of streamflow—the Base-Flow Index (BFI; Standard and Modified), HYSEP (Fixed Interval, Sliding Interval, and Local Minimum), and PART methods—and the RORA recession-curve displacement method and associated RECESS program to estimate groundwater recharge from streamflow data. The Groundwater Toolbox is a customized interface built on the nonproprietary, open source MapWindow geographic information system software. The program provides graphing, mapping, and analysis capabilities in a Microsoft Windows computing environment. In addition to these hydrograph-analysis methods, the Groundwater Toolbox allows for the retrieval of hydrologic time-series data (streamflow, groundwater levels, and precipitation) from the USGS National Water Information System, downloading of a suite of preprocessed geographic information system coverages and meteorological data from the National Oceanic and Atmospheric Administration National Climatic Data Center, and analysis of data with several preprocessing and postprocessing utilities. With its data retrieval and analysis tools, the Groundwater Toolbox provides methods to estimate many of the components of the water budget for a hydrologic basin, including precipitation; streamflow; base flow; runoff; groundwater recharge; and total, groundwater, and near-surface evapotranspiration.
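    A simplified sketch of the idea behind the sliding-interval style of hydrograph separation (not the USGS implementation): base flow is approximated by a moving minimum of streamflow, and a base-flow index follows directly. The window length and the synthetic streamflow record are assumptions.

    ```python
    # Crude sliding-interval base-flow separation and base-flow index.
    import numpy as np

    def sliding_interval_baseflow(q, window_days=11):
        """Base flow as the minimum streamflow in a centred sliding window."""
        q = np.asarray(q, dtype=float)
        half = window_days // 2
        base = np.empty_like(q)
        for i in range(q.size):
            lo, hi = max(0, i - half), min(q.size, i + half + 1)
            base[i] = q[lo:hi].min()
        return np.minimum(base, q)   # base flow can never exceed total streamflow

    # Synthetic daily streamflow: slow recession plus a few storm peaks.
    days = np.arange(365)
    rng = np.random.default_rng(6)
    q = 10 * np.exp(-days / 200.0) + 2
    for peak in (40, 120, 260):
        q += 25 * np.exp(-np.clip(days - peak, 0, None) / 5.0) * (days >= peak)

    base = sliding_interval_baseflow(q)
    bfi = base.sum() / q.sum()       # base-flow index for the record
    print(f"base-flow index = {bfi:.2f}")
    ```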

  2. Evidence for a Common Toolbox Based on Necrotrophy in a Fungal Lineage Spanning Necrotrophs, Biotrophs, Endophytes, Host Generalists and Specialists

    PubMed Central

    Andrew, Marion; Barua, Reeta; Short, Steven M.; Kohn, Linda M.

    2012-01-01

    The Sclerotiniaceae (Ascomycotina, Leotiomycetes) is a relatively recently evolved lineage of necrotrophic host generalists, and necrotrophic or biotrophic host specialists, some latent or symptomless. We hypothesized that they inherited a basic toolbox of genes for plant symbiosis from their common ancestor. Maintenance and evolutionary diversification of symbiosis could require selection on toolbox genes or on timing and magnitude of gene expression. The genes studied were chosen because their products have been previously investigated as pathogenicity factors in the Sclerotiniaceae. They encode proteins associated with cell wall degradation: acid protease 1 (acp1), aspartyl protease (asps), and polygalacturonases (pg1, pg3, pg5, pg6), and the oxalic acid (OA) pathway: a zinc finger transcription factor (pac1), and oxaloacetate acetylhydrolase (oah), catalyst in OA production, essential for full symptom production in Sclerotinia sclerotiorum. Site-specific likelihood analyses provided evidence for purifying selection in all 8 pathogenicity-related genes. Consistent with an evolutionary arms race model, positive selection was detected in 5 of 8 genes. Only generalists produced large, proliferating disease lesions on excised Arabidopsis thaliana leaves and oxalic acid by 72 hours in vitro. In planta expression of oah was 10–300 times greater among the necrotrophic host generalists than necrotrophic and biotrophic host specialists; pac1 was not differentially expressed. Ability to amplify 6/8 pathogenicity related genes and produce oxalic acid in all genera are consistent with the common toolbox hypothesis for this gene sample. That our data did not distinguish biotrophs from necrotrophs is consistent with 1) a common toolbox based on necrotrophy and 2) the most conservative interpretation of the 3-locus housekeeping gene phylogeny – a baseline of necrotrophy from which forms of biotrophy emerged at least twice. Early oah overexpression likely expands the host range of necrotrophic generalists in the Sclerotiniaceae, while specialists and biotrophs deploy oah, or other as-yet-unknown toolbox genes, differently. PMID:22253834

  3. The 'Toolbox' of strategies for managing Haemonchus contortus in goats: What's in and what's out.

    PubMed

    Kearney, P E; Murray, P J; Hoy, J M; Hohenhaus, M; Kotze, A

    2016-04-15

    A dynamic and innovative approach to managing the blood-consuming nematode Haemonchus contortus in goats is critical to crack dependence on veterinary anthelmintics. H. contortus management strategies have been the subject of intense research for decades, and must be selected to create a tailored, individualized program for goat farms. Through the selection and combination of strategies from the Toolbox, an effective management program for H. contortus can be designed according to the unique conditions of each particular farm. This Toolbox investigates strategies including vaccines, bioactive forages, pasture/grazing management, behavioural management, natural immunity, FAMACHA, Refugia and strategic drenching, mineral/vitamin supplementation, copper Oxide Wire Particles (COWPs), breeding and selection/selecting resistant and resilient individuals, biological control and anthelmintic drugs. Barbervax(®), the ground-breaking Haemonchus vaccine developed and currently commercially available on a pilot scale for sheep, is prime for trialling in goats and would be an invaluable inclusion to this Toolbox. The specialised behaviours of goats, specifically their preferences to browse a variety of plants and accompanying physiological adaptations to the consumption of secondary compounds contained in browse, have long been unappreciated and thus overlooked as a valuable, sustainable strategy for Haemonchus management. These strategies are discussed in this review as to their value for inclusion into the 'Toolbox' currently, and the future implications of ongoing research for goat producers. Combining and manipulating strategies such as browsing behaviour, pasture management, bioactive forages and identifying and treating individual animals for haemonchosis, in addition to continuous evaluation of strategy effectiveness, is conducted using a model farm scenario. Selecting strategies from the Toolbox, with regard to their current availability, feasibility, economical cost and potential ease of implementation depending on the systems of production and their complementary nature, is the future of managing H. contortus in farmed goats internationally and maintaining the remaining efficacy of veterinary anthelmintics. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Aerospace Toolbox---a flight vehicle design, analysis, simulation, and software development environment: I. An introduction and tutorial

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.; Wells, Randy

    2001-09-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics to be covered in this part include flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models by using the MathWorks Real-Time Workshop and optimization tools. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  5. An adaptable, low cost test-bed for unmanned vehicle systems research

    NASA Astrophysics Data System (ADS)

    Goppert, James M.

    2011-12-01

    An unmanned vehicle systems test-bed has been developed. The test-bed has been designed to accommodate hardware changes and various vehicle types and algorithms. The creation of this test-bed allows research teams to focus on algorithm development and to employ a common, well-tested experimental framework. The ArduPilotOne autopilot was developed to provide the necessary level of abstraction for multiple vehicle types. The autopilot was also designed to be highly integrated with the Mavlink protocol for Micro Air Vehicle (MAV) communication. Mavlink is the native protocol for QGroundControl, a MAV ground control program. Features were added to QGroundControl to accommodate outdoor usage. Next, the Mavsim toolbox was developed for Scicoslab to allow hardware-in-the-loop testing, control design and analysis, and estimation algorithm testing and verification. In order to obtain linear models of aircraft dynamics, the JSBSim flight dynamics engine was extended to use a probabilistic Nelder-Mead simplex method. The JSBSim aircraft dynamics were compared with collected wind-tunnel data. Finally, a structured methodology for successive loop closure control design is proposed. This methodology is demonstrated, along with the rest of the test-bed tools, on a quadrotor, a fixed-wing RC plane, and a ground vehicle. Test results for the ground vehicle are presented.

  6. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it simultaneously generates three philosophically different families of global sensitivity metrics: (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL also offers two novel features: the first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows the sample size for GSA to be increased progressively while maintaining the required sample distributional properties. The second is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features, in conjunction with bootstrapping, enable the user to monitor the stability, robustness, and convergence of GSA as the sample size increases for any given case study. VARS-TOOL has been shown to achieve robust and stable results with sample sizes (numbers of model runs) 1-2 orders of magnitude smaller than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development, and new capabilities and features are forthcoming.
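    As one concrete example of the metric families listed above, the Python sketch below computes Morris-style (derivative-based) elementary effects for a toy model using a simple one-at-a-time design. It is a textbook illustration under assumed settings, not VARS-TOOL.

    ```python
    # One-at-a-time elementary effects (Morris mu*) for a toy model.
    import numpy as np

    def model(x):
        """Toy model with unequal parameter influence."""
        return 5 * x[0] + 2 * x[1] ** 2 + 0.1 * x[2] + x[0] * x[1]

    def morris_mu_star(f, n_params, n_traj=50, delta=0.1, seed=0):
        """Mean absolute elementary effect (mu*) per parameter on [0, 1]^d."""
        rng = np.random.default_rng(seed)
        effects = np.zeros((n_traj, n_params))
        for t in range(n_traj):
            x = rng.uniform(0, 1 - delta, size=n_params)   # random base point
            fx = f(x)
            for i in rng.permutation(n_params):            # perturb one at a time
                x_step = x.copy()
                x_step[i] += delta
                effects[t, i] = (f(x_step) - fx) / delta
        return np.abs(effects).mean(axis=0)

    mu_star = morris_mu_star(model, n_params=3)
    for i, m in enumerate(mu_star):
        print(f"x{i}: mu* = {m:.2f}")
    ```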

  7. An experimental toolbox for the generation of cold and ultracold polar molecules

    NASA Astrophysics Data System (ADS)

    Zeppenfeld, Martin; Gantner, Thomas; Glöckner, Rosa; Ibrügger, Martin; Koller, Manuel; Prehn, Alexander; Wu, Xing; Chervenkov, Sotir; Rempe, Gerhard

    2017-01-01

    Cold and ultracold molecules enable fascinating applications in quantum science. We present our toolbox of techniques to generate the required molecule ensembles, including buffergas cooling, centrifuge deceleration and optoelectrical Sisyphus cooling. We obtain excellent control over both the motional and internal molecular degrees of freedom, allowing us to aim at various applications.

  8. Tensor Toolbox for MATLAB v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolda, Tamara; Bader, Brett W.; Acar Ataman, Evrim

    Tensors (also known as multidimensional arrays or N-way arrays) are used in a variety of applications ranging from chemometrics to network analysis. The Tensor Toolbox provides classes for manipulating dense, sparse, and structured tensors using MATLAB's object-oriented features. It also provides algorithms for tensor decomposition and factorization, algorithms for computing tensor eigenvalues, and methods for visualization of results.

  9. MOFA Software for the COBRA Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griesemer, Marc; Navid, Ali

    MOFA-COBRA is a software code for Matlab that performs Multi-Objective Flux Analysis (MOFA), which involves solving linear programming problems. The leading software package for conducting different types of analyses using constraint-based models is the COBRA Toolbox for Matlab. MOFA-COBRA is an added tool for COBRA that solves multi-objective problems using a novel algorithm.
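    A hedged sketch of the underlying idea only: a small multi-objective flux problem solved by weighted-sum scalarization with scipy's linear-programming solver. The toy stoichiometric matrix and the two objectives are invented; this is neither the MOFA algorithm nor the COBRA Toolbox API.

    ```python
    # Weighted-sum scalarization of a two-objective flux-balance problem.
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: 4 reactions, 2 internal metabolites at steady state (S v = 0).
    S = np.array([[1, -1, -1, 0],
                  [0,  1,  0, -1]], dtype=float)
    bounds = [(0, 10)] * 4                 # flux bounds for each reaction

    # Two objectives to maximize: biomass-like flux v4 and product-like flux v3.
    c_biomass = np.array([0, 0, 0, 1.0])
    c_product = np.array([0, 0, 1.0, 0])

    for w in (0.0, 0.25, 0.5, 0.75, 1.0):  # sweep the trade-off weight
        c = -(w * c_biomass + (1 - w) * c_product)   # linprog minimizes
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        v = res.x
        print(f"w={w:.2f}  biomass={v @ c_biomass:.2f}  product={v @ c_product:.2f}")
    ```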

  10. The panacea toolbox of a PhD biomedical student.

    PubMed

    Skaik, Younis

    2014-01-01

    Doing a PhD (doctor of philosophy) for the sake of contributing to knowledge should give the student immense enthusiasm throughout the PhD period. It is the time in one's life that one spends to "hit the nail on the head" in a specific area and topic of interest. A PhD consists mostly of hard work and tenacity; however, luck and genius might also play a little role, and all PhD phases can be passed without having both. The PhD student should have pre-PhD and PhD toolboxes, which are "sine quibus non" for successfully obtaining a PhD degree. In this manuscript, the toolboxes of the PhD student are discussed.

  11. A Tol2 Gateway-Compatible Toolbox for the Study of the Nervous System and Neurodegenerative Disease.

    PubMed

    Don, Emily K; Formella, Isabel; Badrock, Andrew P; Hall, Thomas E; Morsch, Marco; Hortle, Elinor; Hogan, Alison; Chow, Sharron; Gwee, Serene S L; Stoddart, Jack J; Nicholson, Garth; Chung, Roger; Cole, Nicholas J

    2017-02-01

    Currently there is a lack of fundamental understanding of disease progression in most neurodegenerative diseases, and therefore treatments and preventative measures are limited. Consequently, there is a great need for adaptable, yet robust model systems to both investigate elementary disease mechanisms and discover effective therapeutics. We have generated a Tol2 Gateway-compatible toolbox to study neurodegenerative disorders in zebrafish, which includes promoters for astrocytes, microglia and motor neurons, multiple fluorophores, and compatibility for the introduction of genes of interest or disease-linked genes. This toolbox will advance the rapid and flexible generation of zebrafish models to discover the biology of the nervous system and the disease processes that lead to neurodegeneration.

  12. Spectral analysis and filtering techniques in digital spatial data processing

    USGS Publications Warehouse

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. This filter toolbox provides capabilities to compute the power spectrum of a given data set and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filter, line filter, and area filter. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes the capabilities to perform high-pass, band-pass, low-pass, and wedge filtering techniques. These filters have been applied to analyze satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
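    The frequency-domain operations described above can be illustrated with a short Python sketch (not the EROS toolbox): compute a 2-D power spectrum, then apply a Gaussian low-pass and a Gaussian notch filter. The cutoff values and the synthetic striped image are arbitrary examples.

    ```python
    # 2-D power spectrum, Gaussian low-pass and Gaussian notch filtering.
    import numpy as np

    rng = np.random.default_rng(7)
    img = rng.normal(size=(256, 256))
    img += 2 * np.sin(2 * np.pi * 30 * np.arange(256) / 256)[None, :]  # striping

    F = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(F) ** 2                               # power spectrum

    ny, nx = img.shape
    v, u = np.mgrid[-ny // 2:ny // 2, -nx // 2:nx // 2]  # frequency coordinates
    r = np.hypot(u, v)

    lowpass = np.exp(-(r ** 2) / (2 * 40.0 ** 2))        # Gaussian low-pass, sigma=40

    # Gaussian notch filter: suppress the two symmetric peaks caused by striping.
    def notch(u0, v0, sigma=4.0):
        return 1 - (np.exp(-((u - u0) ** 2 + (v - v0) ** 2) / (2 * sigma ** 2)) +
                    np.exp(-((u + u0) ** 2 + (v + v0) ** 2) / (2 * sigma ** 2)))

    H = lowpass * notch(30, 0)                           # combined filter
    filtered = np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))

    power_after = np.abs(np.fft.fftshift(np.fft.fft2(filtered))) ** 2
    print("striping energy before/after:",
          round(power[ny // 2, nx // 2 + 30], 1),
          round(power_after[ny // 2, nx // 2 + 30], 1))
    ```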

  13. FAST: FAST Analysis of Sequences Toolbox

    PubMed Central

    Lawrence, Travis J.; Kauffman, Kyle T.; Amrine, Katherine C. H.; Carper, Dana L.; Lee, Raymond S.; Becich, Peter J.; Canales, Claudia J.; Ardell, David H.

    2015-01-01

    FAST (FAST Analysis of Sequences Toolbox) provides simple, powerful open source command-line tools to filter, transform, annotate and analyze biological sequence data. Modeled after the GNU (GNU's Not Unix) Textutils such as grep, cut, and tr, FAST tools such as fasgrep, fascut, and fastr make it easy to rapidly prototype expressive bioinformatic workflows in a compact and generic command vocabulary. Compact combinatorial encoding of data workflows with FAST commands can simplify the documentation and reproducibility of bioinformatic protocols, supporting better transparency in biological data science. Interface self-consistency and conformity with conventions of GNU, Matlab, Perl, BioPerl, R, and GenBank help make FAST easy and rewarding to learn. FAST automates numerical, taxonomic, and text-based sorting, selection and transformation of sequence records and alignment sites based on content, index ranges, descriptive tags, annotated features, and in-line calculated analytics, including composition and codon usage. Automated content- and feature-based extraction of sites and support for molecular population genetic statistics make FAST useful for molecular evolutionary analysis. FAST is portable, easy to install and secure thanks to the relative maturity of its Perl and BioPerl foundations, with stable releases posted to CPAN. Development as well as a publicly accessible Cookbook and Wiki are available on the FAST GitHub repository at https://github.com/tlawrence3/FAST. The default data exchange format in FAST is Multi-FastA (specifically, a restriction of BioPerl FastA format). Sanger and Illumina 1.8+ FastQ formatted files are also supported. FAST makes it easier for non-programmer biologists to interactively investigate and control biological data at the speed of thought. PMID:26042145

  14. PREVIEW Behavior Modification Intervention Toolbox (PREMIT): A Study Protocol for a Psychological Element of a Multicenter Project.

    PubMed

    Kahlert, Daniela; Unyi-Reicherz, Annelie; Stratton, Gareth; Meinert Larsen, Thomas; Fogelholm, Mikael; Raben, Anne; Schlicht, Wolfgang

    2016-01-01

    Losing excess body weight and preventing weight regain by changing lifestyle is a challenging but promising task to prevent the incidence of type-2 diabetes. To be successful, it is necessary to use evidence-based and theory-driven interventions, which also contribute to the science of behavior modification by providing a deeper understanding of successful intervention components. To develop a physical activity and dietary behavior modification intervention toolbox (PREMIT) that fulfills current requirements of being theory-driven and evidence-based, comprehensively described and feasible to evaluate. PREMIT is part of an intervention trial, which aims to prevent the onset of type-2 diabetes in pre-diabetics in eight clinical centers across the world by guiding them in changing their physical activity and dietary behavior through a group counseling approach. The program development took five progressive steps, in line with the Public Health Action Cycle: (1) Summing-up the intervention goal(s), target group and the setting, (2) uncovering the generative psychological mechanisms, (3) identifying behavior change techniques and tools, (4) preparing for evaluation and (5) implementing the intervention and assuring quality. PREMIT is based on a trans-theoretical approach referring to valid behavior modification theories, models and approaches. A major "product" of PREMIT is a matrix, constructed for use by onsite-instructors. The matrix includes objectives, tasks and activities ordered by periods. PREMIT is constructed to help instructors guide participants' behavior change. To ensure high fidelity and adherence of program-implementation across the eight intervention centers standardized operational procedures were defined and "train-the-trainer" workshops were held. In summary PREMIT is a theory-driven, evidence-based program carefully developed to change physical activity and dietary behaviors in pre-diabetic people.

  15. PREVIEW Behavior Modification Intervention Toolbox (PREMIT): A Study Protocol for a Psychological Element of a Multicenter Project

    PubMed Central

    Kahlert, Daniela; Unyi-Reicherz, Annelie; Stratton, Gareth; Meinert Larsen, Thomas; Fogelholm, Mikael; Raben, Anne; Schlicht, Wolfgang

    2016-01-01

    Background: Losing excess body weight and preventing weight regain by changing lifestyle is a challenging but promising task to prevent the incidence of type-2 diabetes. To be successful, it is necessary to use evidence-based and theory-driven interventions, which also contribute to the science of behavior modification by providing a deeper understanding of successful intervention components. Objective: To develop a physical activity and dietary behavior modification intervention toolbox (PREMIT) that fulfills current requirements of being theory-driven and evidence-based, comprehensively described and feasible to evaluate. PREMIT is part of an intervention trial, which aims to prevent the onset of type-2 diabetes in pre-diabetics in eight clinical centers across the world by guiding them in changing their physical activity and dietary behavior through a group counseling approach. Methods: The program development took five progressive steps, in line with the Public Health Action Cycle: (1) Summing-up the intervention goal(s), target group and the setting, (2) uncovering the generative psychological mechanisms, (3) identifying behavior change techniques and tools, (4) preparing for evaluation and (5) implementing the intervention and assuring quality. Results: PREMIT is based on a trans-theoretical approach referring to valid behavior modification theories, models and approaches. A major “product” of PREMIT is a matrix, constructed for use by onsite-instructors. The matrix includes objectives, tasks and activities ordered by periods. PREMIT is constructed to help instructors guide participants' behavior change. To ensure high fidelity and adherence of program-implementation across the eight intervention centers standardized operational procedures were defined and “train-the-trainer” workshops were held. In summary PREMIT is a theory-driven, evidence-based program carefully developed to change physical activity and dietary behaviors in pre-diabetic people. PMID:27559319

  16. Voxel-wise grey matter asymmetry analysis in left- and right-handers.

    PubMed

    Ocklenburg, Sebastian; Friedrich, Patrick; Güntürkün, Onur; Genç, Erhan

    2016-10-28

    Handedness is thought to originate in the brain, but identifying its structural correlates in the cortex has yielded surprisingly incoherent results. One idea proclaimed by several authors is that structural grey matter asymmetries might underlie handedness. While some authors have found significant associations with handedness in different brain areas (e.g. in the central sulcus and precentral sulcus), others have failed to identify such associations. One method used by many researchers to determine structural grey matter asymmetries is voxel based morphometry (VBM). However, it has recently been suggested that the standard VBM protocol might not be ideal to assess structural grey matter asymmetries, as it establishes accurate voxel-wise correspondence across individuals but not across both hemispheres. This could potentially lead to biased and incoherent results. Recently, a new toolbox specifically geared at assessing structural asymmetries and involving accurate voxel-wise correspondence across hemispheres has been published [F. Kurth, C. Gaser, E. Luders. A 12-step user guide for analyzing voxel-wise gray matter asymmetries in statistical parametric mapping (SPM), Nat Protoc 10 (2015), 293-304]. Here, we used this new toolbox to re-assess grey matter asymmetry differences in left- vs. right-handers and linked them to quantitative measures of hand preference and hand skill. While we identified several significant left-right asymmetries in the overall sample, no difference between left- and right-handers reached significance after correction for multiple comparisons. These findings indicate that the structural brain correlates of handedness are unlikely to be rooted in macroscopic grey matter area differences that can be assessed with VBM. Future studies should focus on other potential structural correlates of handedness, e.g. structural white matter asymmetries. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. Navy Enhanced Sierra Mechanics (NESM): Toolbox for predicting Navy shock and damage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moyer, Thomas; Stergiou, Jonathan; Reese, Garth

    Here, the US Navy is developing a new suite of computational mechanics tools (Navy Enhanced Sierra Mechanics) for the prediction of ship response, damage, and shock environments transmitted to vital systems during threat weapon encounters. NESM includes fully coupled Euler-Lagrange solvers tailored to ship shock/damage predictions. NESM is optimized to support high-performance computing architectures, providing the physics-based ship response/threat weapon damage predictions needed to support the design and assessment of highly survivable ships. NESM is being employed to support current Navy ship design and acquisition programs while being further developed for future Navy fleet needs.

  18. Navy Enhanced Sierra Mechanics (NESM): Toolbox for predicting Navy shock and damage

    DOE PAGES

    Moyer, Thomas; Stergiou, Jonathan; Reese, Garth; ...

    2016-05-25

    Here, the US Navy is developing a new suite of computational mechanics tools (Navy Enhanced Sierra Mechanics) for the prediction of ship response, damage, and shock environments transmitted to vital systems during threat weapon encounters. NESM includes fully coupled Euler-Lagrange solvers tailored to ship shock/damage predictions. NESM is optimized to support high-performance computing architectures, providing the physics-based ship response/threat weapon damage predictions needed to support the design and assessment of highly survivable ships. NESM is being employed to support current Navy ship design and acquisition programs while being further developed for future Navy fleet needs.

  19. Development of a Toolbox Using Chemical, Physical and Biological Technologies for Decontamination of Sediments to Support Strategic Army Response to Natural Disasters

    DTIC Science & Technology

    2006-11-01

    disinfection) was tested using soil microcosms and respirometry to determine diesel range and total organic compound degradation. These tests were... grease) such as benzo(a)pyrene were detected above chronic (long-term, measured in years) screening levels. Levels of diesel and oil range organics... bioremediation, and toxicity of liquid and solid samples. The Comput-OX 4R is a 4-reactor unit with no stirring modules or temperature-controlled water bath

  20. Emotion Regulation Training for Treating Warfighters with Combat-Related PTSD Using Real-Time fMRI and EEG-Assisted Neurofeedback

    DTIC Science & Technology

    2017-12-01

    response integration. J Abnorm Psychol 92, 276-306. Misaki, M., Phillips, R., Zotev, V., Wong, C.K., Wurfel, B.E., Krueger, F., Feldner, M., Bodurka, J. ...illustrated schematically in Fig. A1A. The visits were typically scheduled one week apart. Each visit involved a psychological evaluation by a... from multiple tests. Partial correlation analyses were conducted using the MATLAB Statistics Toolbox. A3. Results. A3.1 Psychological measures

  1. Extending Inferential Group Analysis in Type 2 Diabetic Patients with Multivariate GLM Implemented in SPM8.

    PubMed

    Ferreira, Fábio S; Pereira, João M S; Duarte, João V; Castelo-Branco, Miguel

    2017-01-01

    Although voxel based morphometry studies are still the standard for analyzing brain structure, their dependence on massive univariate inferential methods is a limiting factor. A better understanding of brain pathologies can be achieved by applying inferential multivariate methods, which allow the study of multiple dependent variables, e.g. different imaging modalities of the same subject. Given the widespread use of SPM software in the brain imaging community, the main aim of this work is the implementation of massive multivariate inferential analysis as a toolbox in this software package, applied to T1 and T2 structural data from diabetic patients and controls. This implementation was compared with the traditional ANCOVA in SPM and a similar multivariate GLM toolbox (MRM). We implemented the new toolbox and tested it by investigating brain alterations in a cohort of twenty-eight type 2 diabetes patients and twenty-six matched healthy controls, using information from both T1 and T2 weighted structural MRI scans, both separately (using standard univariate VBM) and simultaneously, with multivariate analyses. Univariate VBM replicated predominantly bilateral changes in basal ganglia and insular regions in type 2 diabetes patients. On the other hand, multivariate analyses replicated key findings of the univariate results, while also revealing the thalami as additional foci of pathology. While the presented algorithm must be further optimized, the proposed toolbox is the first implementation of multivariate statistics in SPM8 as a user-friendly toolbox, which shows great potential and is ready to be validated in other clinical cohorts and modalities.

  2. COMETS2: An advanced MATLAB toolbox for the numerical analysis of electric fields generated by transcranial direct current stimulation.

    PubMed

    Lee, Chany; Jung, Young-Jin; Lee, Sang Jun; Im, Chang-Hwan

    2017-02-01

    Since there is no way to measure the electric current generated by transcranial direct current stimulation (tDCS) inside the human head through in vivo experiments, numerical analysis based on the finite element method has been widely used to estimate the electric field inside the head. In 2013, we released a MATLAB toolbox named COMETS, which has been used by a number of groups and has helped researchers to gain insight into the electric field distribution during stimulation. The aim of this study was to develop an advanced MATLAB toolbox, named COMETS2, for the numerical analysis of the electric field generated by tDCS. COMETS2 can generate rectangular pad electrodes of any size at any position on the scalp surface. To reduce the large computational burden when repeatedly testing multiple electrode locations and sizes, a new technique to decompose the global stiffness matrix was proposed. As examples of potential applications, we observed the effects of electrode size and displacement on the results of the electric field analysis. The proposed mesh decomposition method significantly enhanced the overall computational efficiency. We implemented an automatic electrode modeler for the first time, and proposed a new technique to enhance the computational efficiency. In this paper, an efficient toolbox for tDCS analysis is introduced (freely available at http://www.cometstool.com). It is expected that COMETS2 will be a useful toolbox for researchers who want to benefit from the numerical analysis of electric fields generated by tDCS. Copyright © 2016. Published by Elsevier B.V.

  3. Extending Inferential Group Analysis in Type 2 Diabetic Patients with Multivariate GLM Implemented in SPM8

    PubMed Central

    Ferreira, Fábio S.; Pereira, João M.S.; Duarte, João V.; Castelo-Branco, Miguel

    2017-01-01

    Background: Although voxel based morphometry studies are still the standard for analyzing brain structure, their dependence on massive univariate inferential methods is a limiting factor. A better understanding of brain pathologies can be achieved by applying inferential multivariate methods, which allow the study of multiple dependent variables, e.g. different imaging modalities of the same subject. Objective: Given the widespread use of SPM software in the brain imaging community, the main aim of this work is the implementation of massive multivariate inferential analysis as a toolbox in this software package, applied to T1 and T2 structural data from diabetic patients and controls. This implementation was compared with the traditional ANCOVA in SPM and a similar multivariate GLM toolbox (MRM). Method: We implemented the new toolbox and tested it by investigating brain alterations in a cohort of twenty-eight type 2 diabetes patients and twenty-six matched healthy controls, using information from both T1 and T2 weighted structural MRI scans, both separately (using standard univariate VBM) and simultaneously, with multivariate analyses. Results: Univariate VBM replicated predominantly bilateral changes in basal ganglia and insular regions in type 2 diabetes patients. On the other hand, multivariate analyses replicated key findings of the univariate results, while also revealing the thalami as additional foci of pathology. Conclusion: While the presented algorithm must be further optimized, the proposed toolbox is the first implementation of multivariate statistics in SPM8 as a user-friendly toolbox, which shows great potential and is ready to be validated in other clinical cohorts and modalities. PMID:28761571
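    As a conceptual illustration of voxel-wise multivariate inference on two modalities (not the SPM8 toolbox or its algorithm), the Python sketch below runs a two-group Hotelling's T-squared test per voxel on simulated paired T1/T2 values; the group sizes, data and uncorrected threshold are invented.

    ```python
    # Voxel-wise two-group Hotelling's T^2 on two dependent variables (T1, T2).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    n_pat, n_ctl, n_vox, p = 28, 26, 500, 2          # p = two modalities (T1, T2)
    patients = rng.normal(size=(n_pat, n_vox, p))
    controls = rng.normal(size=(n_ctl, n_vox, p))
    patients[:, :50, :] += 0.8                        # simulate a true effect

    F = np.zeros(n_vox)
    df1, df2 = p, n_pat + n_ctl - p - 1
    for v in range(n_vox):
        x, y = patients[:, v, :], controls[:, v, :]
        diff = x.mean(axis=0) - y.mean(axis=0)
        # Pooled within-group covariance of the two modalities.
        Sp = ((n_pat - 1) * np.cov(x, rowvar=False) +
              (n_ctl - 1) * np.cov(y, rowvar=False)) / (n_pat + n_ctl - 2)
        t2 = (n_pat * n_ctl / (n_pat + n_ctl)) * diff @ np.linalg.solve(Sp, diff)
        F[v] = t2 * df2 / (p * (n_pat + n_ctl - 2))   # T^2 -> F conversion

    pvals = stats.f.sf(F, df1, df2)
    print("voxels with p < 0.001 (uncorrected):", int((pvals < 0.001).sum()))
    ```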

  4. T-MATS Toolbox for the Modeling and Analysis of Thermodynamic Systems

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.

    2014-01-01

    The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) is a MATLAB/Simulink (The MathWorks, Inc.) plug-in for creating and simulating thermodynamic systems and controls. The package contains generic parameterized components that can be combined with a variable input iterative solver and optimization algorithm to create complex system models, such as gas turbines.

  5. Various Solution Methods, Accompanied by Dynamic Investigation, for the Same Problem as a Means for Enriching the Mathematical Toolbox

    ERIC Educational Resources Information Center

    Oxman, Victor; Stupel, Moshe

    2018-01-01

    A geometrical task is presented with multiple solutions using different methods, in order to show the connection between various branches of mathematics and to highlight the importance of providing the students with an extensive 'mathematical toolbox'. Investigation of the property that appears in the task was carried out using a computerized tool.

  6. Various solution methods, accompanied by dynamic investigation, for the same problem as a means for enriching the mathematical toolbox

    NASA Astrophysics Data System (ADS)

    Oxman, Victor; Stupel, Moshe

    2018-04-01

    A geometrical task is presented with multiple solutions using different methods, in order to show the connection between various branches of mathematics and to highlight the importance of providing the students with an extensive 'mathematical toolbox'. Investigation of the property that appears in the task was carried out using a computerized tool.

  7. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Knudsen, P.; Benveniste, J.

    2011-07-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Macs. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a priori data and models is made available as well. GUT has been developed in a collaboration within the GUT Core Group: S. Dinardo, D. Serpe, B.M. Lucas, R. Floberghagen, A. Horvath (ESA); O. Andersen, M. Herceg (DTU); M.-H. Rio, S. Mulet, G. Larnicol (CLS); J. Johannessen, L. Bertino (NERSC); H. Snaith, P. Challenor (NOC); K. Haines, D. Bretherton (NCEO); C. Hughes (POL); R.J. Bingham (NU); G. Balmino, S. Niemeijer, I. Price, L. Cornejo (S&T); M. Diament, I. Panet (IPGP); C.C. Tscherning (KU); D. Stammer, F. Siegismund (UH); and T. Gruber (TUM).

  8. Turbo-Satori: a neurofeedback and brain-computer interface toolbox for real-time functional near-infrared spectroscopy.

    PubMed

    Lührs, Michael; Goebel, Rainer

    2017-10-01

    Turbo-Satori is a neurofeedback and brain-computer interface (BCI) toolbox for real-time functional near-infrared spectroscopy (fNIRS). It incorporates multiple pipelines from real-time preprocessing and analysis to neurofeedback and BCI applications. The toolbox is designed with a focus on usability, enabling fast setup and execution of real-time experiments. Turbo-Satori uses an incremental recursive least-squares procedure for real-time general linear model calculation and support vector machine classifiers for advanced BCI applications. It communicates directly with common NIRx fNIRS hardware and was tested extensively, ensuring that the calculations can be performed in real time, without a significant change in calculation times for any sampling interval, during ongoing experiments of up to 6 h of recording. Immediate access to advanced processing features also makes the toolbox suitable for students and nonexperts in the field of fNIRS data acquisition and processing. Flexible network interfaces allow third-party stimulus applications to access the processed data and calculated statistics in real time, so that this information can be easily incorporated into neurofeedback or BCI presentations.
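    The incremental recursive least-squares idea mentioned above can be sketched generically in Python (this is not the Turbo-Satori code): the coefficients of a simple GLM are updated sample by sample as new data arrive. The design matrix and the simulated fNIRS-like signal are assumptions.

    ```python
    # Generic recursive least-squares (RLS) update for a real-time GLM.
    import numpy as np

    def rls_update(theta, P, x, y, forgetting=1.0):
        """One RLS step: update coefficients `theta` and inverse-covariance `P`."""
        x = x.reshape(-1, 1)
        k = P @ x / (forgetting + x.T @ P @ x)        # gain vector
        theta = theta + (k * (y - x.T @ theta)).ravel()
        P = (P - k @ x.T @ P) / forgetting
        return theta, P

    # Simulated experiment: boxcar task regressor + constant, true betas (2.0, 0.5).
    rng = np.random.default_rng(9)
    n = 600
    task = ((np.arange(n) // 50) % 2).astype(float)    # on/off blocks
    X = np.column_stack([task, np.ones(n)])
    y = X @ np.array([2.0, 0.5]) + 0.3 * rng.normal(size=n)

    theta = np.zeros(2)
    P = 1e3 * np.eye(2)                                # large initial uncertainty
    for t in range(n):                                 # sample-by-sample updates
        theta, P = rls_update(theta, P, X[t], y[t])

    print("estimated betas:", np.round(theta, 3))
    ```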

  9. HYDRORECESSION: A toolbox for streamflow recession analysis

    NASA Astrophysics Data System (ADS)

    Arciniega, S.

    2015-12-01

    Streamflow recession curves are hydrological signatures that allow the relationship between groundwater storage and baseflow and/or low flows to be studied at the catchment scale. Recent studies have shown that streamflow recession analysis can be quite sensitive to the combination of different models, extraction techniques and parameter estimation methods. In order to better characterize streamflow recession curves, new methodologies combining multiple approaches have been recommended. The HYDRORECESSION toolbox, presented here, is a Matlab graphical user interface developed to analyse streamflow recession time series, with a set of tools for parameterizing linear and nonlinear storage-outflow relationships through four of the most useful recession models (Maillet, Boussinesq, Coutagne and Wittenberg). The toolbox includes four parameter-fitting techniques (linear regression, lower envelope, data binning and mean squared error) and three different methods to extract hydrograph recession segments (Vogel, Brutsaert and Aksoy). In addition, the toolbox has a module that separates the baseflow component from the observed hydrograph using the inverse reservoir algorithm. Potential applications provided by HYDRORECESSION include model parameter analysis, hydrological regionalization and classification, baseflow index estimates, catchment-scale recharge and low-flows modelling, among others. HYDRORECESSION is freely available for non-commercial and academic purposes.
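    As an illustration of one of the recession models named above, the short Python sketch below fits the Maillet (exponential) model Q(t) = Q0 exp(-t/k) to a synthetic recession segment by linear regression on log Q. It is not the HYDRORECESSION code, and the segment itself is simulated.

    ```python
    # Fit an exponential (Maillet) recession model to a synthetic recession limb.
    import numpy as np

    # Synthetic recession limb: k = 25 days, Q0 = 12 m^3/s, with mild noise.
    t = np.arange(0, 60, 1.0)                      # days since recession start
    rng = np.random.default_rng(10)
    q = 12.0 * np.exp(-t / 25.0) * np.exp(rng.normal(0, 0.02, t.size))

    # Maillet model is linear in log space: ln Q = ln Q0 - t/k.
    slope, intercept = np.polyfit(t, np.log(q), deg=1)
    k_est = -1.0 / slope
    q0_est = np.exp(intercept)
    print(f"fitted Q0 = {q0_est:.2f} m^3/s, storage constant k = {k_est:.1f} days")
    ```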

  10. Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis

    PubMed Central

    Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.

    2006-01-01

    In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely-used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
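
    The core idea of BPM is a voxel-wise GLM in which values from a second imaging modality enter as regressors. The numpy sketch below illustrates that idea on simulated, flattened images; it is not the BPM/SPM MATLAB implementation, and the modality labels and covariate are made up for the example.

```python
# Minimal numpy sketch of the biological parametric mapping idea: at each
# voxel, regress modality A across subjects on modality B plus covariates.
# Purely illustrative; the actual BPM toolbox is implemented in MATLAB/SPM.
import numpy as np

rng = np.random.default_rng(2)
n_subjects, n_voxels = 20, 1000
age = rng.uniform(20, 60, n_subjects)                      # example covariate
mod_b = rng.standard_normal((n_subjects, n_voxels))        # e.g. flattened maps from modality B
mod_a = 0.8 * mod_b + 0.02 * age[:, None] + rng.standard_normal((n_subjects, n_voxels))

# Design matrix per voxel: intercept, covariate, and the voxel's modality-B value.
base = np.column_stack([np.ones(n_subjects), age])
t_map = np.empty(n_voxels)
for v in range(n_voxels):
    X = np.column_stack([base, mod_b[:, v]])
    beta, res, *_ = np.linalg.lstsq(X, mod_a[:, v], rcond=None)
    dof = n_subjects - X.shape[1]
    sigma2 = res[0] / dof
    cov = sigma2 * np.linalg.inv(X.T @ X)
    t_map[v] = beta[-1] / np.sqrt(cov[-1, -1])             # t-statistic for the modality-B regressor

print("mean voxel-wise t for the modality-B regressor:", t_map.mean().round(2))
```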

  11. A novel toolbox for E. coli lysis monitoring.

    PubMed

    Rajamanickam, Vignesh; Wurm, David; Slouka, Christoph; Herwig, Christoph; Spadiut, Oliver

    2017-01-01

    The bacterium Escherichia coli is a well-studied recombinant host organism with a plethora of applications in biotechnology. Highly valuable biopharmaceuticals, such as antibody fragments and growth factors, are currently being produced in E. coli. However, the high metabolic burden during recombinant protein production can lead to cell death, consequent lysis, and undesired product loss. Thus, fast and precise analyzers to monitor E. coli bioprocesses and to retrieve key process information, such as the optimal time point of harvest, are needed. However, such reliable monitoring tools are still scarce to date. In this study, we cultivated an E. coli strain producing a recombinant single-chain antibody fragment in the cytoplasm. In bioreactor cultivations, we purposely triggered cell lysis by pH ramps. We developed a novel toolbox using UV chromatograms as fingerprints and chemometric techniques to monitor these lysis events and used flow cytometry (FCM) as reference method to quantify viability offline. Summarizing, we were able to show that a novel toolbox comprising HPLC chromatogram fingerprinting and data science tools allowed the identification of E. coli lysis in a fast and reliable manner. We are convinced that this toolbox will not only facilitate E. coli bioprocess monitoring but will also allow enhanced process control in the future.
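
    To make the chemometric step more concrete, the sketch below treats each UV chromatogram as a fingerprint vector and uses PCA (via SVD) to flag when lysis-related features start to drift. This is a generic stand-in for the chemometric techniques mentioned above, not the authors' pipeline, and the simulated chromatograms are invented for the example.

```python
# Generic sketch of chromatogram fingerprinting with PCA: each UV chromatogram
# is a feature vector, and drift in principal-component scores flags
# lysis-like changes. Schematic only; not the published toolbox's code.
import numpy as np

rng = np.random.default_rng(3)
n_time, n_points = 24, 300                      # 24 samples, 300 absorbance points each
baseline = np.exp(-((np.arange(n_points) - 120) / 25.0) ** 2)    # product peak
lysis_peak = np.exp(-((np.arange(n_points) - 210) / 15.0) ** 2)  # host-protein peak

chromatograms = np.empty((n_time, n_points))
for i in range(n_time):
    lysis_level = max(0.0, (i - 15) / 8.0)      # lysis begins after sample 15
    chromatograms[i] = (baseline + lysis_level * lysis_peak
                        + 0.01 * rng.standard_normal(n_points))

# PCA via SVD of the mean-centred fingerprint matrix.
centred = chromatograms - chromatograms.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
scores = u * s                                  # sample scores on the principal components

# A jump in PC1 scores marks the onset of lysis-related change.
print("PC1 scores:", np.round(scores[:, 0], 2))
```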

  12. ICT: isotope correction toolbox.

    PubMed

    Jungreuthmayer, Christian; Neubauer, Stefan; Mairinger, Teresa; Zanghellini, Jürgen; Hann, Stephan

    2016-01-01

    Isotope tracer experiments are an invaluable technique to analyze and study the metabolism of biological systems. However, isotope labeling experiments are often affected by naturally abundant isotopes, especially in cases where mass spectrometric methods make use of derivatization. The correction of these additive interferences, in particular for complex isotopic systems, is numerically challenging and still an emerging field of research. When positional information is generated via collision-induced dissociation, even more complex calculations for isotopic interference correction are necessary. So far, no freely available tools can handle tandem mass spectrometry data. We present the isotope correction toolbox, a program that corrects tandem mass isotopomer data from tandem mass spectrometry experiments. The isotope correction toolbox is written in the multi-platform programming language Perl and can therefore be used on all commonly available computer platforms. Source code and documentation can be freely obtained under the Artistic License or the GNU General Public License from https://github.com/jungreuc/isotope_correction_toolbox/ (contact: christian.jungreuthmayer@boku.ac.at, juergen.zanghellini@boku.ac.at). Supplementary data are available at Bioinformatics online.
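
    The basic (non-tandem) version of the correction problem can be written as a linear system: a correction matrix built from natural isotope abundances maps the true mass-isotopomer distribution to the measured one, and inverting it recovers the underlying distribution. The sketch below shows that idea for carbon only; ICT itself is written in Perl and additionally handles tandem-MS (positional) data, so this is only a simplified illustration.

```python
# Minimal illustration of natural-abundance isotope correction: build a
# correction matrix from a binomial model of natural 13C incorporation and
# solve for the underlying mass-isotopomer distribution (MID).
import numpy as np
from math import comb

def carbon_correction_matrix(n_carbons, p13c=0.0107):
    """M[i, j] = probability that a fragment with j labelled carbons is
    measured at mass shift i due to natural 13C in the remaining carbons."""
    m = np.zeros((n_carbons + 1, n_carbons + 1))
    for j in range(n_carbons + 1):
        for k in range(n_carbons - j + 1):       # k natural 13C among unlabelled carbons
            m[j + k, j] = comb(n_carbons - j, k) * p13c**k * (1 - p13c)**(n_carbons - j - k)
    return m

n_c = 4
true_mid = np.array([0.5, 0.1, 0.1, 0.1, 0.2])   # fractions M+0 ... M+4
measured = carbon_correction_matrix(n_c) @ true_mid

corrected = np.linalg.solve(carbon_correction_matrix(n_c), measured)
corrected = np.clip(corrected, 0, None)
corrected /= corrected.sum()                      # renormalise to fractions
print(np.round(corrected, 3))                     # recovers the true MID
```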

  13. From black box to toolbox: Outlining device functionality, engagement activities, and the pervasive information architecture of mHealth interventions.

    PubMed

    Danaher, Brian G; Brendryen, Håvar; Seeley, John R; Tyler, Milagra S; Woolley, Tim

    2015-03-01

    mHealth interventions that deliver content via mobile phones represent a burgeoning area of health behavior change. The current paper examines two themes that can inform the underlying design of mHealth interventions: (1) mobile device functionality, which represents the technological toolbox available to intervention developers; and (2) the pervasive information architecture of mHealth interventions, which determines how intervention content can be delivered concurrently using mobile phones, personal computers, and other devices. We posit that developers of mHealth interventions will be better able to achieve the promise of this burgeoning arena by leveraging the toolbox and functionality of mobile devices in order to engage participants and encourage meaningful behavior change within the context of a carefully designed pervasive information architecture.

  14. Toolbox No. 2: Expanding the Role of Foster Parents in Achieving Permanency. Toolboxes for Permanency.

    ERIC Educational Resources Information Center

    Dougherty, Susan

    Noting that over the last decade, the role of a foster parent has evolved from temporary caregiver to essential part of a professional team in determining the best long-term plan for children in their care, this guide focuses on practical ways in which best child welfare practice can be incorporated into the recruitment, training, and support of…

  15. ESA BRAT (Broadview Radar Altimetry Toolbox) and GUT (GOCE User Toolbox) toolboxes

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Ambrozio, A.; Restano, M.

    2016-12-01

    The Broadview Radar Altimetry Toolbox (BRAT) is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including the upcoming Sentinel-3A L1 and L2 products. A tutorial is included providing plenty of use cases. BRAT's future release (4.0.0) is planned for September 2016. Based on community feedback, the frontend has been further improved and simplified, whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain the desired data while bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving the combination of data fields that can be saved for future use, either by using embedded formulas, including those from oceanographic altimetry, or by implementing ad hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. The GOCE User Toolbox (GUT) is a compilation of tools for the use and the analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data and allows gravity field data, in conjunction and consistently with any other auxiliary data set, to be pre-processed by beginners in gravity field processing, for oceanographic and hydrologic as well as for solid earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during the GRACE and GOCE missions. In the current 3.0 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements aimed at facilitating the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies have been introduced. Packaged with GUT is also GUT's VCM (Variance-Covariance Matrix) tool for analysing GOCE's variance-covariance matrices. The BRAT and GUT toolboxes can be freely downloaded, along with ancillary material, at https://earth.esa.int/brat and https://earth.esa.int/gut.

  16. Simple tool for the rapid, automated quantification of glacier advance/retreat observations using multiple methods

    NASA Astrophysics Data System (ADS)

    Lea, J.

    2017-12-01

    The quantification of glacier change is a key variable within glacier monitoring, with the method used potentially being crucial to ensuring that data can be appropriately compared with environmental data. The topic and timescales of study (e.g. land/marine terminating environments; sub-annual/decadal/centennial/millennial timescales) often mean that different methods are more suitable for different problems. However, depending on the GIS/coding expertise of the user, some methods can potentially be time consuming to undertake, making large-scale studies problematic. In addition, examples exist where different users have nominally applied the same methods in different studies, though with minor methodological inconsistencies in their approach. In turn, this will have implications for data homogeneity where regional/global datasets may be constructed. Here, I present a simple toolbox scripted in a Matlab® environment that requires only glacier margin and glacier centreline data to quantify glacier length, glacier change between observations, rate of change, in addition to other metrics. The toolbox includes the option to apply the established centreline or curvilinear box methods, or a new method: the variable box method - designed for tidewater margins where box width is defined as the total width of the individual terminus observation. The toolbox is extremely flexible, and has the option to be applied as either Matlab® functions within user scripts, or via a graphical user interface (GUI) for those unfamiliar with a coding environment. In both instances, there is potential to apply the methods quickly to large datasets (100s-1000s of glaciers, with potentially similar numbers of observations each), thus ensuring large scale methodological consistency (and therefore data homogeneity) and allowing regional/global scale analyses to be achievable for those with limited GIS/coding experience. The toolbox has been evaluated against idealised scenarios demonstrating its accuracy, while feedback from undergraduate students who have trialled the toolbox is that it is intuitive and simple to use. When released, the toolbox will be free and open source allowing users to potentially modify, improve and expand upon the current version.
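
    To make the centreline-method measurement concrete, the sketch below computes glacier length as the along-centreline distance to the terminus at each observation date and differences the results into a change and a rate. The actual toolbox is a MATLAB package taking margin and centreline shapefiles (with box-method variants and a GUI); the coordinates and dates below are hypothetical.

```python
# Schematic sketch of the centreline method for glacier length change:
# distance along a digitised centreline to the terminus at each date,
# then change and rate between observations. Hypothetical data.
import numpy as np

def cumulative_distance(xy):
    """Cumulative along-path distance for an (n, 2) polyline."""
    seg = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    return np.concatenate([[0.0], np.cumsum(seg)])

# Digitised centreline (metres, local grid) and terminus observations by year.
centreline = np.array([[0, 0], [500, 100], [1000, 150], [1500, 150], [2000, 100]], float)
dist = cumulative_distance(centreline)

terminus_obs = {2000: np.array([1980.0, 105.0]), 2010: np.array([1520.0, 148.0])}
lengths = {}
for year, pt in terminus_obs.items():
    nearest = np.argmin(np.linalg.norm(centreline - pt, axis=1))  # closest centreline vertex
    lengths[year] = dist[nearest]

change = lengths[2010] - lengths[2000]          # negative = retreat
rate = change / (2010 - 2000)
print(f"length change {change:.0f} m, mean rate {rate:.1f} m/yr")
```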

  17. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    PubMed Central

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-01-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
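
    Two of the techniques listed above, Latin hypercube sampling of a parameter space and partial rank correlation coefficients (PRCC), are illustrated in the Python sketch below on a toy model. SaSAT itself is a MATLAB-based executable; the parameter names, bounds, and "epidemic" output here are invented for the example.

```python
# Minimal sketch of Latin hypercube sampling (LHS) and partial rank
# correlation coefficients (PRCC) on a toy model output. Illustrative only.
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(4)

def latin_hypercube(n_samples, bounds):
    """Stratified LHS: one sample per equal-probability stratum per parameter."""
    n_params = len(bounds)
    strata = np.argsort(rng.uniform(size=(n_samples, n_params)), axis=0)  # random permutation per column
    u = (strata + rng.uniform(size=(n_samples, n_params))) / n_samples
    lo, hi = np.array(bounds, dtype=float).T
    return lo + u * (hi - lo)

def prcc(x, y):
    """Partial rank correlation of each column of x with y."""
    rx = np.column_stack([rankdata(c) for c in x.T])
    ry = rankdata(y)
    out = np.empty(x.shape[1])
    for j in range(x.shape[1]):
        others = np.column_stack([np.ones(len(ry)), np.delete(rx, j, axis=1)])
        res_x = rx[:, j] - others @ np.linalg.lstsq(others, rx[:, j], rcond=None)[0]
        res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
        out[j] = np.corrcoef(res_x, res_y)[0, 1]
    return out

# Toy "epidemic" output: rises with transmission (beta), falls with recovery (gamma).
bounds = [(0.1, 1.0), (0.05, 0.5), (0.0, 1.0)]       # beta, gamma, dummy parameter
samples = latin_hypercube(500, bounds)
output = samples[:, 0] / samples[:, 1] + 0.1 * rng.standard_normal(500)

print("PRCC (beta, gamma, dummy):", np.round(prcc(samples, output), 2))
```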

  18. Assessing psychological well-being: self-report instruments for the NIH Toolbox.

    PubMed

    Salsman, John M; Lai, Jin-Shei; Hendrie, Hugh C; Butt, Zeeshan; Zill, Nicholas; Pilkonis, Paul A; Peterson, Christopher; Stoney, Catherine M; Brouwers, Pim; Cella, David

    2014-02-01

    Psychological well-being (PWB) has a significant relationship with physical and mental health. As a part of the NIH Toolbox for the Assessment of Neurological and Behavioral Function, we developed self-report item banks and short forms to assess PWB. Expert feedback and literature review informed the selection of PWB concepts and the development of item pools for positive affect, life satisfaction, and meaning and purpose. Items were tested with a community-dwelling US Internet panel sample of adults aged 18 and above (N = 552). Classical and item response theory (IRT) approaches were used to evaluate unidimensionality, fit of items to the overall measure, and calibrations of those items, including differential item function (DIF). IRT-calibrated item banks were produced for positive affect (34 items), life satisfaction (16 items), and meaning and purpose (18 items). Their psychometric properties were supported based on the results of factor analysis, fit statistics, and DIF evaluation. All banks measured the concepts precisely (reliability ≥0.90) for more than 98% of participants. These adult scales and item banks for PWB provide the flexibility, efficiency, and precision necessary to promote future epidemiological, observational, and intervention research on the relationship of PWB with physical and mental health.

  19. Clinical Holistic Medicine: The Case Story of Anna. II. Patient Diary as a Tool in Treatment

    PubMed Central

    Ventegodt, Sören; Clausen, Birgitte; Merrick, Joav

    2006-01-01

    In spite of extreme childhood sexual and violent abuse, a 22-year-old young woman, Anna, healed during holistic existential therapy. New and highly confrontational therapeutic tools were developed and used to help this patient (like acceptance through touch and acupressure through the vagina). Her vulva and introitus were scarred from repeated brutal rape, as was the interior of her mouth. During therapy, these scars were gently contacted and the negative emotional contents released. The healing was in accordance with the advanced holistic medical toolbox that uses (1) love, (2) trust, (3) holding, and (4) helping the patient to process and integrate old traumas. The case story clearly revealed the philosophical adjustments that Anna made during treatment in response to the severe childhood abuse. These adjustments are demonstrated by her diary, where sentences contain both the feelings and thoughts of the painful present (the gestalt) at the time of the abuse, thus containing the essence of the traumas, making the repression of the painful emotions possible through the change in the patient's philosophical perspective. Anna's case gives a unique insight into the process of traumatization (pathogenesis) and the process of healing (salutogenesis). At the end of the healing, Anna reconnected her existence to the outer world in a deep existential, suicidal crisis and faced her choice of life or death. She decided to live and, in this process, assumed existential responsibility, which made her able to step out of her mental disease. The advanced holistic toolbox seems to help patients heal even from the worst childhood abuse. In spite of the depth of the existential crisis, holistic existential therapy seems to support existential responsibility well and thus to be safe for the patients. PMID:17370000

  20. TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy

    PubMed Central

    2011-01-01

    Background: Transfer entropy (TE) is a measure for the detection of directed interactions. Transfer entropy is an information theoretic implementation of Wiener's principle of observational causality. It offers an approach to the detection of neuronal interactions that is free of an explicit model of the interactions. Hence, it offers the power to analyze linear and nonlinear interactions alike. This allows for example the comprehensive analysis of directed interactions in neural networks at various levels of description. Here we present the open-source MATLAB toolbox TRENTOOL that allows the user to handle the considerable complexity of this measure and to validate the obtained results using non-parametrical statistical testing. We demonstrate the use of the toolbox and the performance of the algorithm on simulated data with nonlinear (quadratic) coupling and on local field potentials (LFP) recorded from the retina and the optic tectum of the turtle (Pseudemys scripta elegans) where a neuronal one-way connection is likely present. Results: In simulated data TE detected information flow in the simulated direction reliably with false positives not exceeding the rates expected under the null hypothesis. In the LFP data we found directed interactions from the retina to the tectum, despite the complicated signal transformations between these stages. No false positive interactions in the reverse directions were detected. Conclusions: TRENTOOL is an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information theoretic measure. TRENTOOL is implemented as a MATLAB toolbox and available under an open source license (GPL v3). For the use with neural data TRENTOOL seamlessly integrates with the popular FieldTrip toolbox. PMID:22098775
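
    To illustrate the quantity being estimated, the sketch below computes a plug-in (histogram) estimate of transfer entropy with history length 1 between two discretised time series with a known one-way coupling. TRENTOOL itself uses delay embedding, nearest-neighbour estimators and permutation statistics in MATLAB, none of which appear in this simplified Python example.

```python
# Minimal plug-in estimator of transfer entropy TE(X -> Y), binned,
# history length 1. Illustrative only; not TRENTOOL's estimator.
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """TE from x to y (in bits) via quantile binning and plug-in probabilities."""
    xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    triples = Counter(zip(yd[1:], yd[:-1], xd[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yy = Counter(zip(yd[1:], yd[:-1]))
    pairs_yx = Counter(zip(yd[:-1], xd[:-1]))
    singles_y = Counter(yd[:-1])
    n = len(yd) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_yx = c / pairs_yx[(y0, x0)]
        p_y1_given_y = pairs_yy[(y1, y0)] / singles_y[y0]
        te += p_joint * np.log2(p_y1_given_yx / p_y1_given_y)
    return te

# Toy system with a one-way coupling x -> y.
rng = np.random.default_rng(5)
n = 20000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 1] + 0.2 * rng.standard_normal()

print(f"TE(x->y) = {transfer_entropy(x, y):.3f} bits")   # clearly > 0
print(f"TE(y->x) = {transfer_entropy(y, x):.3f} bits")   # near 0
```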

  1. Development of a CRISPR/Cas9 genome editing toolbox for Corynebacterium glutamicum.

    PubMed

    Liu, Jiao; Wang, Yu; Lu, Yujiao; Zheng, Ping; Sun, Jibin; Ma, Yanhe

    2017-11-16

    Corynebacterium glutamicum is an important industrial workhorse and advanced genetic engineering tools are urgently needed. Recently, the clustered regularly interspaced short palindromic repeats (CRISPR) and their CRISPR-associated proteins (Cas) have revolutionized the field of genome engineering. The CRISPR/Cas9 system, which utilizes NGG as the protospacer adjacent motif (PAM) and has good targeting specificity, can be developed into a powerful tool for efficient and precise genome editing of C. glutamicum. Herein, we developed a versatile CRISPR/Cas9 genome editing toolbox for C. glutamicum. Cas9 and gRNA expression cassettes were reconstituted to combat Cas9 toxicity and facilitate effective termination of gRNA transcription. Co-transformation of Cas9 and gRNA expression plasmids was exploited to overcome high-frequency mutation of cas9, allowing not only highly efficient gene deletion and insertion with plasmid-borne editing templates (efficiencies up to 60.0 and 62.5%, respectively) but also simple and time-saving operation. Furthermore, CRISPR/Cas9-mediated ssDNA recombineering was developed to precisely introduce small modifications and single-nucleotide changes into the genome of C. glutamicum with efficiencies over 80.0%. Notably, double-locus editing was also achieved in C. glutamicum. This toolbox works well in several C. glutamicum strains including the widely-used strains ATCC 13032 and ATCC 13869. In this study, we developed a CRISPR/Cas9 toolbox that could facilitate markerless gene deletion, gene insertion, precise base editing, and double-locus editing in C. glutamicum. The CRISPR/Cas9 toolbox holds promise for accelerating the engineering of C. glutamicum and advancing its application in the production of biochemicals and biofuels.

  2. Evaluation of an educational "toolbox" for improving nursing staff competence and psychosocial work environment in elderly care: results of a prospective, non-randomized controlled intervention.

    PubMed

    Arnetz, J E; Hasson, H

    2007-07-01

    Lack of professional development opportunities among nursing staff is a major concern in elderly care and has been associated with work dissatisfaction and staff turnover. There is a lack of prospective, controlled studies evaluating the effects of educational interventions on nursing competence and work satisfaction. The aim of this study was to evaluate the possible effects of an educational "toolbox" intervention on nursing staff ratings of their competence, psychosocial work environment and overall work satisfaction. The study was a prospective, non-randomized, controlled intervention conducted among nursing staff in two municipal elderly care organizations in western Sweden. In an initial questionnaire survey, nursing staff in the intervention municipality described several areas in which they felt a need for competence development. Measurement instruments and educational materials for improving staff knowledge and work practices were then collated by researchers and managers in a "toolbox." Nursing staff ratings of their competence and work were measured pre- and post-intervention by questionnaire. Staff ratings in the intervention municipality were compared to staff ratings in the reference municipality, where no toolbox was introduced. Nursing staff ratings of their competence and psychosocial work environment, including overall work satisfaction, improved significantly over time in the intervention municipality, compared to the reference group. Both competence and work environment ratings were largely unchanged among reference municipality staff. Multivariate analysis revealed a significant interaction effect between municipalities over time for nursing staff ratings of participation, leadership, performance feedback and skills development. Staff ratings for these four scales improved significantly in the intervention municipality as compared to the reference municipality. Compared to a reference municipality, nursing staff ratings of their competence and the psychosocial work environment improved in the municipality where the toolbox was introduced.

  3. ReTrOS: a MATLAB toolbox for reconstructing transcriptional activity from gene and protein expression data.

    PubMed

    Minas, Giorgos; Momiji, Hiroshi; Jenkins, Dafyd J; Costa, Maria J; Rand, David A; Finkenstädt, Bärbel

    2017-06-26

    Given the development of high-throughput experimental techniques, an increasing number of whole genome transcription profiling time series data sets, with good temporal resolution, are becoming available to researchers. The ReTrOS toolbox (Reconstructing Transcription Open Software) provides MATLAB-based implementations of two related methods, namely ReTrOS-Smooth and ReTrOS-Switch, for reconstructing the temporal transcriptional activity profile of a gene from given mRNA expression time series or protein reporter time series. The methods are based on fitting a differential equation model incorporating the processes of transcription, translation and degradation. The toolbox provides a framework for model fitting along with statistical analyses of the model with a graphical interface and model visualisation. We highlight several applications of the toolbox, including the reconstruction of the temporal cascade of transcriptional activity inferred from mRNA expression data and protein reporter data in the core circadian clock in Arabidopsis thaliana, and how such reconstructed transcription profiles can be used to study the effects of different cell lines and conditions. The ReTrOS toolbox allows users to analyse gene and/or protein expression time series where, with appropriate formulation of prior information about a minimum of kinetic parameters, in particular rates of degradation, users are able to infer timings of changes in transcriptional activity. Data from any organism and obtained from a range of technologies can be used as input due to the flexible and generic nature of the model and implementation. The output from this software provides a useful analysis of time series data and can be incorporated into further modelling approaches or in hypothesis generation.
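
    The underlying model described above, transcription feeding first-order degradation, can be written as dm/dt = tau(t) - delta*m(t), so transcriptional activity tau(t) can in principle be recovered from an mRNA time series and a known degradation rate. The sketch below shows only that basic relationship on synthetic data; ReTrOS itself fits smoothed and switch-based models with full statistical analysis in MATLAB.

```python
# Minimal sketch of the idea behind transcription reconstruction:
# with dm/dt = tau(t) - delta * m(t), recover tau(t) from an mRNA time
# series and a known degradation rate delta by finite differences.
import numpy as np

def reconstruct_tau(t, m, delta):
    """tau(t) = dm/dt + delta * m, using centred finite differences."""
    dmdt = np.gradient(m, t)
    return dmdt + delta * m

# Synthetic example: a pulse of transcription feeding first-order decay.
t = np.linspace(0, 24, 97)                      # hours, 15-min sampling
tau_true = np.where((t > 6) & (t < 12), 5.0, 0.5)
delta = 0.3                                     # degradation rate per hour
m = np.zeros_like(t)
for i in range(1, t.size):                      # forward-Euler simulation of the mRNA
    dt = t[i] - t[i - 1]
    m[i] = m[i - 1] + dt * (tau_true[i - 1] - delta * m[i - 1])

tau_hat = reconstruct_tau(t, m, delta)
print("true vs reconstructed tau at t = 9 h:", tau_true[36], round(tau_hat[36], 2))
```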

  4. TRENTOOL: a Matlab open source toolbox to analyse information flow in time series data with transfer entropy.

    PubMed

    Lindner, Michael; Vicente, Raul; Priesemann, Viola; Wibral, Michael

    2011-11-18

    Transfer entropy (TE) is a measure for the detection of directed interactions. Transfer entropy is an information theoretic implementation of Wiener's principle of observational causality. It offers an approach to the detection of neuronal interactions that is free of an explicit model of the interactions. Hence, it offers the power to analyze linear and nonlinear interactions alike. This allows for example the comprehensive analysis of directed interactions in neural networks at various levels of description. Here we present the open-source MATLAB toolbox TRENTOOL that allows the user to handle the considerable complexity of this measure and to validate the obtained results using non-parametrical statistical testing. We demonstrate the use of the toolbox and the performance of the algorithm on simulated data with nonlinear (quadratic) coupling and on local field potentials (LFP) recorded from the retina and the optic tectum of the turtle (Pseudemys scripta elegans) where a neuronal one-way connection is likely present. In simulated data TE detected information flow in the simulated direction reliably with false positives not exceeding the rates expected under the null hypothesis. In the LFP data we found directed interactions from the retina to the tectum, despite the complicated signal transformations between these stages. No false positive interactions in the reverse directions were detected. TRENTOOL is an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information theoretic measure. TRENTOOL is implemented as a MATLAB toolbox and available under an open source license (GPL v3). For the use with neural data TRENTOOL seamlessly integrates with the popular FieldTrip toolbox.

  5. A new impetus for guideline development and implementation: construction and evaluation of a toolbox.

    PubMed

    Hilbink, Mirrian A H W; Ouwens, Marielle M T J; Burgers, Jako S; Kool, Rudolf B

    2014-03-19

    In the last decade, guideline organizations faced a number of problems, including a lack of standardization in guideline development methods and suboptimal guideline implementation. To contribute to the solution of these problems, we produced a toolbox for guideline development, implementation, revision, and evaluation. All relevant guideline organizations in the Netherlands were approached to prioritize the topics. We sent out a questionnaire and discussed the results at an invitational conference. Based on consensus, twelve topics were selected for the development of new tools. Subsequently, working groups were composed for the development of the tools. After development of the tools, their draft versions were pilot tested in 40 guideline projects. Based on the results of the pilot tests, the tools were refined and their final versions were presented. The vast majority of organizations involved in pilot testing of the tools reported satisfaction with using the tools. Guideline experts involved in pilot testing of the tools proposed a variety of suggestions for the implementation of the tools. The tools are available in Dutch and in English at a web-based platform on guideline development and implementation (http://www.ha-ring.nl). A collaborative approach was used for the development and evaluation of a toolbox for development, implementation, revision, and evaluation of guidelines. This approach yielded a potentially powerful toolbox for improving the quality and implementation of Dutch clinical guidelines. Collaboration between guideline organizations within this project led to stronger linkages, which is useful for enhancing coordination of guideline development and implementation and preventing duplication of efforts. Use of the toolbox could improve quality standards in the Netherlands, and might facilitate the development of high-quality guidelines in other countries as well.

  6. Predictive Mining of Time Series Data

    NASA Astrophysics Data System (ADS)

    Java, A.; Perlman, E. S.

    2002-05-01

    All-sky monitors are a relatively new development in astronomy, and their data represent a largely untapped resource. Proper utilization of this resource could lead to important discoveries not only in the physics of variable objects, but in how one observes such objects. We discuss the development of a Java toolbox for astronomical time series data. Rather than using methods conventional in astronomy (e.g., power spectrum and cross-correlation analysis) we employ rule discovery techniques commonly used in analyzing stock-market data. By clustering patterns found within the data, rule discovery allows one to build predictive models, allowing one to forecast when a given event might occur or whether the occurrence of one event will trigger a second. We have tested the toolbox and accompanying display tool on datasets (representing several classes of objects) from the RXTE All Sky Monitor. We use these datasets to illustrate the methods and functionality of the toolbox. We have found predictive patterns in several ASM datasets. We also discuss problems faced in the development process, particularly the difficulties of dealing with discretized and irregularly sampled data. A possible application would be in scheduling target of opportunity observations where the astronomer wants to observe an object when a certain event or series of events occurs. By combining such a toolbox with an automatic, Java query tool which regularly gathers data on objects of interest, the astronomer or telescope operator could use the real-time datastream to efficiently predict the occurrence of (for example) a flare or other event. By combining the toolbox with dynamic time warping data-mining tools, one could predict events which may happen on variable time scales.

  7. Switch of Sensitivity Dynamics Revealed with DyGloSA Toolbox for Dynamical Global Sensitivity Analysis as an Early Warning for System's Critical Transition

    PubMed Central

    Baumuratova, Tatiana; Dobre, Simona; Bastogne, Thierry; Sauter, Thomas

    2013-01-01

    Systems with bifurcations may experience abrupt irreversible and often unwanted shifts in their performance, called critical transitions. For many systems like climate, economy, and ecosystems it is highly desirable to identify indicators serving as early warnings of such regime shifts. Several statistical measures were recently proposed as early warnings of critical transitions, including increased variance, autocorrelation and skewness of experimental or model-generated data. The lack of an automated tool for model-based prediction of critical transitions led to the design of DyGloSA – a MATLAB toolbox for dynamical global parameter sensitivity analysis (GPSA) of ordinary differential equations models. We suggest that the switch in dynamics of parameter sensitivities revealed by our toolbox is an early warning that a system is approaching a critical transition. We illustrate the efficiency of our toolbox by analyzing several models with bifurcations and predicting the time periods when systems can still avoid going to a critical transition by manipulating certain parameter values, which is not detectable with the existing SA techniques. DyGloSA is based on the SBToolbox2 and contains functions which dynamically compute the global sensitivity indices of the system by applying four main GPSA methods: eFAST, Sobol's ANOVA, PRCC and WALS. It includes parallelized versions of the functions, enabling a significant reduction of the computational time (up to 12 times). DyGloSA is freely available as a set of MATLAB scripts at http://bio.uni.lu/systems_biology/software/dyglosa. It requires installation of MATLAB (versions R2008b or later) and the Systems Biology Toolbox2 available at www.sbtoolbox2.org. DyGloSA can be run on 32- and 64-bit Windows and Linux systems. PMID:24367574

  8. Switch of sensitivity dynamics revealed with DyGloSA toolbox for dynamical global sensitivity analysis as an early warning for system's critical transition.

    PubMed

    Baumuratova, Tatiana; Dobre, Simona; Bastogne, Thierry; Sauter, Thomas

    2013-01-01

    Systems with bifurcations may experience abrupt irreversible and often unwanted shifts in their performance, called critical transitions. For many systems like climate, economy, and ecosystems it is highly desirable to identify indicators serving as early warnings of such regime shifts. Several statistical measures were recently proposed as early warnings of critical transitions, including increased variance, autocorrelation and skewness of experimental or model-generated data. The lack of an automated tool for model-based prediction of critical transitions led to the design of DyGloSA - a MATLAB toolbox for dynamical global parameter sensitivity analysis (GPSA) of ordinary differential equations models. We suggest that the switch in dynamics of parameter sensitivities revealed by our toolbox is an early warning that a system is approaching a critical transition. We illustrate the efficiency of our toolbox by analyzing several models with bifurcations and predicting the time periods when systems can still avoid going to a critical transition by manipulating certain parameter values, which is not detectable with the existing SA techniques. DyGloSA is based on the SBToolbox2 and contains functions which dynamically compute the global sensitivity indices of the system by applying four main GPSA methods: eFAST, Sobol's ANOVA, PRCC and WALS. It includes parallelized versions of the functions, enabling a significant reduction of the computational time (up to 12 times). DyGloSA is freely available as a set of MATLAB scripts at http://bio.uni.lu/systems_biology/software/dyglosa. It requires installation of MATLAB (versions R2008b or later) and the Systems Biology Toolbox2 available at www.sbtoolbox2.org. DyGloSA can be run on 32- and 64-bit Windows and Linux systems.

  9. Propulsion System Simulation Using the Toolbox for the Modeling and Analysis of Thermodynamic System T-MATS

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This paper describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this paper is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture. A model comparison was conducted by matching steady-state performance results from a T-MATS developed gas turbine simulation to a well-documented steady-state simulation. Transient modeling capabilities are then demonstrated when the steady-state T-MATS model is updated to run dynamically.

  10. Propulsion System Simulation Using the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This paper describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this paper is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture. A model comparison was conducted by matching steady-state performance results from a T-MATS developed gas turbine simulation to a well-documented steady-state simulation. Transient modeling capabilities are then demonstrated when the steady-state T-MATS model is updated to run dynamically.

  11. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.

    PubMed

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two-dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include: • the implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms; • a modular approach, together with lookup tables, that helps avoid the indeterminate results which may occur when attempting to directly evaluate the transform; • the avoidance of unnecessary computation of already known transforms, thereby saving memory and processing time.
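
    The "combination of two significantly simpler transforms" mentioned above is typically an angular Fourier series followed by an nth-order Hankel transform of each harmonic. The sketch below demonstrates that structure numerically (the toolbox itself is symbolic), under one common transform convention; the grids, harmonic count, and quadrature are choices made for the example.

```python
# Numeric sketch of the two-step structure: angular FFT, then an nth-order
# Hankel transform of each harmonic. Illustrative only; one convention assumed.
import numpy as np
from scipy.special import jv

def polar_fourier_2d(f_vals, r, n_harmonics=8):
    """f_vals[i, j] = f(r[i], theta[j]) on a uniform theta grid.
    Returns F(rho, psi) using F_n(rho) = 2*pi*(-1j)**n * int f_n(r) J_n(2*pi*rho*r) r dr."""
    n_theta = f_vals.shape[1]
    f_n = np.fft.fft(f_vals, axis=1) / n_theta           # angular Fourier coefficients
    dr = r[1] - r[0]
    def transform(rho, psi):
        total = 0.0 + 0.0j
        for n in list(range(0, n_harmonics)) + list(range(-n_harmonics, 0)):
            fn = f_n[:, n]                                # harmonic n (FFT index ordering)
            kernel = jv(n, 2 * np.pi * rho * r) * r
            hankel = 2 * np.pi * (-1j) ** n * np.sum(fn * kernel) * dr   # simple quadrature
            total += hankel * np.exp(1j * n * psi)
        return total
    return transform

# Check against a known pair: exp(-pi r^2) is its own 2D Fourier transform and
# is radially symmetric, so only the n = 0 harmonic contributes.
r = np.linspace(0, 8, 800)
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
f_vals = np.exp(-np.pi * r[:, None] ** 2) * np.ones_like(theta)[None, :]

F = polar_fourier_2d(f_vals, r)
print(abs(F(0.5, 0.0)), np.exp(-np.pi * 0.25))            # both ~ 0.456
```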

  12. EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.

    PubMed

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.

  13. III. NIH Toolbox Cognition Battery (CB): measuring episodic memory.

    PubMed

    Bauer, Patricia J; Dikmen, Sureyya S; Heaton, Robert K; Mungas, Dan; Slotkin, Jerry; Beaumont, Jennifer L

    2013-08-01

    One of the most significant domains of cognition is episodic memory, which allows for rapid acquisition and long-term storage of new information. For purposes of the NIH Toolbox, we devised a new test of episodic memory. The nonverbal NIH Toolbox Picture Sequence Memory Test (TPSMT) requires participants to reproduce the order of an arbitrarily ordered sequence of pictures presented on a computer. To adjust for ability, sequence length varies from 6 to 15 pictures. Multiple trials are administered to increase reliability. Pediatric data from the validation study revealed the TPSMT to be sensitive to age-related changes. The task also has high test-retest reliability and promising construct validity. Steps to further increase the sensitivity of the instrument to individual and age-related variability are described.

  14. VQone MATLAB toolbox: A graphical experiment builder for image and video quality evaluations: VQone MATLAB toolbox.

    PubMed

    Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka

    2016-03-01

    This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).

  15. Ideas Exchange: How Do You Use NASPE's Teacher Toolbox to Enhance Professional Activities with Students, Sport or Physical Education Lessons, Faculty Wellness Classes or Community Programs?

    ERIC Educational Resources Information Center

    Simpkins, Mary Ann; McNeill, Shane; Dieckman, Dale; Sissom, Mark; LoBianco, Judy; Lund, Jackie; Barney, David C.; Manson, Mara; Silva, Betsy

    2009-01-01

    NASPE's Teacher Toolbox is an instructional resource site which provides educators with a wide variety of teaching tools that focus on physical activity. This service is provided by NASPE to support instructional activities as well as promote quality programs. New monthly issues support NASPE's mission to enhance knowledge, improve professional…

  16. Motion Simulation in the Environment for Auditory Research

    DTIC Science & Technology

    2011-08-01

    Cited resource: Spatial Audio MATLAB Toolbox, Centre for Digital Music, Queen Mary University of London, 2009. http://www.isophonics.net/content/spatial-audio-matlab-toolbox (accessed July 27, ...).

  17. Development of a Dependency Theory Toolbox for Database Design.

    DTIC Science & Technology

    1987-12-01

    The theory needed to design and study relational databases exists in the form of published algorithms and theorems. However, hand simulating these algorithms can be a tedious and error-prone chore. Therefore, a toolbox of algorithms and ...

  18. PLDAPS: A Hardware Architecture and Software Toolbox for Neurophysiology Requiring Complex Visual Stimuli and Online Behavioral Control.

    PubMed

    Eastman, Kyler M; Huk, Alexander C

    2012-01-01

    Neurophysiological studies in awake, behaving primates (both human and non-human) have focused with increasing scrutiny on the temporal relationship between neural signals and behaviors. Consequently, laboratories are often faced with the problem of developing experimental equipment that can support data recording with high temporal precision and also be flexible enough to accommodate a wide variety of experimental paradigms. To this end, we have developed a MATLAB toolbox that integrates several modern pieces of equipment, but still grants experimenters the flexibility of a high-level programming language. Our toolbox takes advantage of three popular and powerful technologies: the Plexon apparatus for neurophysiological recordings (Plexon, Inc., Dallas, TX, USA), a Datapixx peripheral (Vpixx Technologies, Saint-Bruno, QC, Canada) for control of analog, digital, and video input-output signals, and the Psychtoolbox MATLAB toolbox for stimulus generation (Brainard, 1997; Pelli, 1997; Kleiner et al., 2007). The PLDAPS ("Platypus") system is designed to support the study of the visual systems of awake, behaving primates during multi-electrode neurophysiological recordings, but can be easily applied to other related domains. Despite its wide range of capabilities and support for cutting-edge video displays and neural recording systems, the PLDAPS system is simple enough for someone with basic MATLAB programming skills to design their own experiments.

  19. Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools

    NASA Astrophysics Data System (ADS)

    Tonini, F.; Liu, J.

    2016-12-01

    Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.

  20. ERPLAB: an open-source toolbox for the analysis of event-related potentials

    PubMed Central

    Lopez-Calderon, Javier; Luck, Steven J.

    2014-01-01

    ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations. PMID:24782741
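
    Two of the core operations listed above, averaging EEG epochs into per-condition ERPs and forming a difference wave, are illustrated below on simulated data. ERPLAB itself is a MATLAB/EEGLAB toolbox; the epoch counts, latencies and amplitudes in this Python sketch are made up.

```python
# Tiny numpy sketch of ERP averaging and a difference wave on simulated epochs.
import numpy as np

rng = np.random.default_rng(6)
fs = 250                                         # sampling rate, Hz
times = np.arange(-0.2, 0.8, 1 / fs)             # epoch window in seconds
n_trials = 80

def simulate_epochs(peak_amp):
    """Epochs (trials x samples) with a P300-like bump plus noise."""
    erp = peak_amp * np.exp(-((times - 0.3) / 0.05) ** 2)
    return erp + 5.0 * rng.standard_normal((n_trials, times.size))

epochs_target = simulate_epochs(peak_amp=8.0)    # rare "oddball" condition
epochs_standard = simulate_epochs(peak_amp=2.0)  # frequent condition

erp_target = epochs_target.mean(axis=0)          # averaged ERPs per condition
erp_standard = epochs_standard.mean(axis=0)
difference_wave = erp_target - erp_standard      # e.g. target minus standard

peak_idx = np.argmax(difference_wave)
print(f"difference-wave peak {difference_wave[peak_idx]:.1f} uV "
      f"at {times[peak_idx] * 1000:.0f} ms")
```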

  1. Citizen Science Air Monitoring in the Ironbound Community ...

    EPA Pesticide Factsheets

    The Environmental Protection Agency’s (EPA) mission is to protect human health and the environment. To move toward achieving this goal, EPA is facilitating identification of potential environmental concerns, particularly in vulnerable communities. This includes actively supporting citizen science projects and providing communities with the information and assistance they need to conduct their own air pollution monitoring efforts. The Air Sensor Toolbox for Citizen Scientists (1) was developed as a resource to meet stakeholder needs. Examples of materials developed for the Toolbox and ultimately pilot tested in the Ironbound Community in Newark, New Jersey are reported here. The Air Sensor Toolbox for Citizen Scientists is designed as an online resource that provides information and guidance on new, low-cost compact technologies used for measuring air quality. The Toolbox features resources developed by EPA researchers that can be used by citizens to effectively collect, analyze, interpret, and communicate air quality data. The resources include information about sampling methods, how to calibrate and validate monitors, options for measuring air quality, data interpretation guidelines, and low-cost sensor performance information. This Regional Applied Research Effort (RARE) project provided an opportunity for the Office of Research and Development (ORD) to work collaboratively with EPA Region 2 to provide the Ironbound Community with a “Toolbox” specific for c
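
    One routine task the Toolbox resources describe, calibrating and validating a low-cost sensor against a collocated reference monitor, can be illustrated with a simple linear fit. The readings below are invented for the example, and this sketch is not EPA code or a prescribed calibration procedure.

```python
# Illustrative sketch: calibrate a low-cost sensor against a collocated
# reference monitor with a linear fit, then check agreement. Hypothetical data.
import numpy as np

# Hourly PM2.5 (ug/m3): low-cost sensor vs collocated reference monitor.
sensor = np.array([14.1, 18.7, 25.3, 9.8, 31.2, 22.4, 12.9, 27.5, 16.0, 20.8])
reference = np.array([10.2, 13.5, 19.1, 7.0, 23.8, 16.6, 9.4, 20.9, 11.8, 15.3])

slope, intercept = np.polyfit(sensor, reference, 1)     # calibration line
calibrated = slope * sensor + intercept

r = np.corrcoef(calibrated, reference)[0, 1]
rmse = np.sqrt(np.mean((calibrated - reference) ** 2))
print(f"calibration: ref ~ {slope:.2f} * sensor + {intercept:.2f}")
print(f"r = {r:.3f}, RMSE = {rmse:.2f} ug/m3 after calibration")
```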

  2. Optics Program Simplifies Analysis and Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Engineers at Goddard Space Flight Center partnered with software experts at Mide Technology Corporation, of Medford, Massachusetts, through a Small Business Innovation Research (SBIR) contract to design the Disturbance-Optics-Controls-Structures (DOCS) Toolbox, a software suite for performing integrated modeling for multidisciplinary analysis and design. The DOCS Toolbox integrates various discipline models into a coupled process math model that can then predict system performance as a function of subsystem design parameters. The system can be optimized for performance; design parameters can be traded; parameter uncertainties can be propagated through the math model to develop error bounds on system predictions; and the model can be updated, based on component, subsystem, or system level data. The Toolbox also allows the definition of process parameters as explicit functions of the coupled model and includes a number of functions that analyze the coupled system model and provide for redesign. The product is being sold commercially by Nightsky Systems Inc., of Raleigh, North Carolina, a spinoff company that was formed by Mide specifically to market the DOCS Toolbox. Commercial applications include use by any contractors developing large space-based optical systems, including Lockheed Martin Corporation, The Boeing Company, and Northrop Grumman Corporation, as well as companies providing technical audit services, like General Dynamics Corporation.

  3. ERPLAB: an open-source toolbox for the analysis of event-related potentials.

    PubMed

    Lopez-Calderon, Javier; Luck, Steven J

    2014-01-01

    ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB's EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB's tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user's guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations.

  4. Using OPeNDAP's Data-Services Framework to Lift Mash-Ups above Blind Dates

    NASA Astrophysics Data System (ADS)

    Gallagher, J. H. R.; Fulker, D. W.

    2015-12-01

    OPeNDAP's data-as-service framework (Hyrax) matches diverse sources with many end-user tools and contexts. Keys to its flexibility include: a data model embracing tabular data alongside n-dimensional arrays and other structures useful in geoinformatics; a REST-like protocol that supports, via suffix notation, a growing set of output forms (netCDF, XML, etc.) plus a query syntax for subsetting, where subsetting applies to tabular data via constraints on column values and to array-style data via constraints on indices or coordinates; a handler-style architecture that admits a growing set of input types, so that community members may contribute handlers, making Hyrax effective as middleware where N sources are mapped to M outputs with order N+M effort (not NxM); and virtual aggregations of source data, enabling granularity aimed at users, not data collectors. OPeNDAP-access libraries exist in multiple languages, including Python, Java, and C++. Recent enhancements are increasing this framework's interoperability (i.e., its mash-up) potential. Extensions implemented as servlets, running adjacent to Hyrax, are enriching the forms of aggregation and enabling new protocols: user-specified aggregations, namely applying a query to (huge) lists of source granules and receiving one (large) table or zipped netCDF file; OGC (Open Geospatial Consortium) protocols, WMS and WCS; and a Webification (W10n) protocol that returns JavaScript Object Notation (JSON). Extensions to OPeNDAP's query language are reducing transfer volumes and enabling new forms of inspection. Advances underway include: functions that, for triangular-mesh sources, return sub-meshes specified via geospatial bounding boxes; functions that, for data from multiple satellite-borne sensors (with differing orbits), select observations based on coincidence; calculations of means, histograms, etc. that greatly reduce output volumes; and paths for communities to contribute new server functions (e.g., in Python) that data providers may incorporate into Hyrax via installation parameters. One could say Hyrax itself is a mash-up, but we suggest it as an instrument for a mash-up artist's toolbox. This instrument can support mash-ups built on netCDF files, OGC protocols, JavaScript Web pages, and/or programs written in Python, Java, C or C++.
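
    The REST-like access pattern described above (an output-format suffix plus a constraint expression appended to a dataset URL) can be sketched as below. The server, dataset path and variable name are hypothetical; real Hyrax endpoints follow the same pattern with their own paths and variables.

```python
# Sketch of constructing an OPeNDAP-style subset request: output-format suffix
# (.ascii here) plus an index-range constraint expression. Hypothetical dataset.
base = "https://example.org/opendap/sst/monthly.nc"    # hypothetical dataset URL
constraint = "sst[0:1:0][10:1:20][30:1:40]"            # time, lat, lon index ranges (start:stride:stop)
url = f"{base}.ascii?{constraint}"                     # ask for an ASCII rendering of the subset
print(url)

# Against a real Hyrax server, this URL could be fetched directly, e.g.:
#   import requests
#   subset_text = requests.get(url, timeout=30).text
# Swapping the suffix (for example to a netCDF response form) changes the
# output encoding without changing the constraint.
```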

  5. Introduction of A New Toolbox for Processing Digital Images From Multiple Camera Networks: FMIPROT

    NASA Astrophysics Data System (ADS)

    Melih Tanis, Cemal; Nadir Arslan, Ali

    2017-04-01

    Webcam networks intended for scientific monitoring of ecosystems provide digital images and other environmental data for various studies. Other types of camera networks can also be used for scientific purposes, e.g. traffic webcams for phenological studies, or camera networks for ski tracks and avalanche monitoring over mountains for hydrological studies. To efficiently harness the potential of these camera networks, easy-to-use software which can obtain and handle images from different networks having different protocols and standards is necessary. For the analysis of images from webcam networks, numerous software packages are freely available. These software packages have different strong features not only for analyzing but also for post-processing digital images. But specifically for ease of use, applicability and scalability, a different set of features could be added. Thus, a more customized approach would be of high value, not only for analyzing images of comprehensive camera networks, but also considering the possibility to create operational data extraction and processing with an easy-to-use toolbox. In this paper, we introduce a new toolbox, entitled Finnish Meteorological Institute Image PROcessing Tool (FMIPROT), in which such a customized approach is followed. FMIPROT currently has the following features: • straightforward installation, • no software dependencies that require extra installation, • communication with multiple camera networks, • automatic downloading and handling of images, • a user-friendly and simple user interface, • data filtering, • visualizing results on customizable plots, • plugins, which allow users to add their own algorithms. Current image analyses in FMIPROT include "Color Fraction Extraction" and "Vegetation Indices". The color fraction extraction analysis calculates the fractions of the colors in a region of interest, for red, green and blue, along with brightness and luminance parameters. The vegetation indices analysis is a collection of indices used in vegetation phenology and includes "Green Fraction" (green chromatic coordinate), "Green-Red Vegetation Index" and "Green Excess Index". A "Snow Cover Fraction" analysis, which detects snow-covered pixels in the images and georeferences them on a geospatial plane to calculate the snow cover fraction, is currently being implemented. FMIPROT is being developed during the EU Life+ MONIMET project. Altogether we mounted 28 cameras at 14 different sites in Finland as the MONIMET camera network. In this paper, we present details of FMIPROT and analysis results from the MONIMET camera network. We also discuss planned future developments of FMIPROT.
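
    The "Green Fraction" (green chromatic coordinate) index named above is simply G/(R+G+B) averaged over a region of interest. The sketch below computes it for one image with Pillow and numpy; the file path and rectangular ROI are hypothetical, and FMIPROT itself handles polygon ROIs, masking and whole time series.

```python
# Minimal sketch of the green chromatic coordinate (GCC) over a rectangular
# region of interest of a single camera image. Illustrative only.
import numpy as np
from PIL import Image

def green_chromatic_coordinate(image_path, roi):
    """GCC = G / (R + G + B) averaged over roi = (left, top, right, bottom)."""
    rgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
    left, top, right, bottom = roi
    patch = rgb[top:bottom, left:right]
    r, g, b = patch[..., 0], patch[..., 1], patch[..., 2]
    total = r + g + b
    total[total == 0] = np.nan                   # avoid division by zero on black pixels
    return float(np.nanmean(g / total))

# Hypothetical usage on one image from a phenology camera:
# gcc = green_chromatic_coordinate("canopy_2016-06-01_1200.jpg", roi=(100, 200, 900, 700))
# print(f"GCC = {gcc:.3f}")
```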

  6. Frequency Domain Identification Toolbox

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Juang, Jer-Nan; Chen, Chung-Wen

    1996-01-01

    This report documents software written in the MATLAB programming language for performing identification of systems from frequency response functions. MATLAB is a commercial software environment that allows easy manipulation of data matrices and provides other intrinsic matrix function capabilities. The algorithms programmed in this collection of subroutines have been documented elsewhere, but all references are provided in this document. A main feature of this software is the use of matrix fraction descriptions and system realization theory to identify state space models directly from test data. All subroutines have templates for the user to use as guidelines.

  7. Does confirmed pathogen transfer between sanctuary workers and great apes mean that reintroduction should not occur? Commentary on "Drug-resistant human Staphylococcus aureus findings in sanctuary apes and its threat to wild ape populations".

    PubMed

    Unwin, Steve; Robinson, Ian; Schmidt, Vanessa; Colin, Chris; Ford, Lisa; Humle, Tatyana

    2012-12-01

    This commentary discusses the findings and conclusions of the paper "Drug resistant human Staphylococcus aureus findings in sanctuary apes and its threat to wild ape populations." This paper confirms the zoonotic transfer of Staphylococcus aureus in a sanctuary setting. The assertion that this in itself is enough to reconsider the conservation potential of ape reintroduction provides an opportunity to discuss risk analysis of pathogen transmission, following IUCN guidelines, using S. aureus as an example. It is concluded that ape reintroduction projects must have disease risk mitigation strategies that include effective biosecurity protocols and pathogen surveillance. These strategies will assist with creating a well planned and executed reintroduction. This provides one way to enforce habitat protection, to minimise human encroachment and the risks from the illegal wildlife trade. Thus reintroduction must remain a useful tool in the conservation toolbox. © 2012 Wiley Periodicals, Inc.

  8. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks

    PubMed Central

    Dâmaso, Antônio; Maciel, Paulo

    2017-01-01

    Power consumption is a primary concern in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of the applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack while also considering their reliabilities. To solve this problem, we introduce a fully automatic solution for designing power-consumption-aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations, and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way. PMID:29113078
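
    In its simplest form, the quantity being evaluated is just the sum of per-state power draws times the time spent in each state. The sketch below shows that back-of-envelope calculation for a hypothetical duty-cycled node; the currents, supply voltage, and durations are illustrative numbers only, not values from the EDEN toolbox or its formal models.

    ```python
    # Back-of-envelope energy estimate for one duty cycle of a hypothetical
    # sensor node; all numbers are illustrative, not EDEN/paper values.
    SUPPLY_V = 3.0  # volts

    # state -> (current draw in amperes, time spent in state per cycle in seconds)
    states = {
        "sleep":    (5e-6,  9.0),
        "sense":    (2e-3,  0.5),
        "cpu":      (8e-3,  0.3),
        "radio_rx": (20e-3, 0.1),
        "radio_tx": (25e-3, 0.1),
    }

    energy_j = sum(i * SUPPLY_V * t for i, t in states.values())
    cycle_s = sum(t for _, t in states.values())
    avg_power_w = energy_j / cycle_s

    print(f"Energy per cycle: {energy_j*1e3:.2f} mJ, "
          f"average power: {avg_power_w*1e3:.2f} mW")
    ```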

  9. The AI Bus architecture for distributed knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain

    1991-01-01

    The AI Bus architecture is a layered, distributed, object-oriented framework developed to support the requirements of advanced technology programs for an order-of-magnitude improvement in software costs. The consequent need for highly autonomous computer systems, adaptable to new technology advances over a long lifespan, led to the design of an open architecture and toolbox for building large-scale, robust, production-quality systems. The AI Bus accommodates a mix of knowledge-based and conventional components, running in heterogeneous, distributed real-world and testbed environments. The concepts and design of the AI Bus architecture are described, along with its current implementation status as a Unix C++ library of reusable objects. Each high-level semiautonomous agent process consists of a number of knowledge sources together with interagent communication mechanisms based on shared blackboards and message-passing acquaintances. Standard interfaces and protocols are followed for combining and validating subsystems. Dynamic probes, or demons, provide an event-driven means of giving active objects shared access to resources and to each other while not violating their security.
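
    As a conceptual illustration of the blackboard-and-knowledge-source pattern described above (and not a reconstruction of the actual AI Bus C++ library), the sketch below shows a minimal shared blackboard that knowledge sources watch and post to in an event-driven fashion.

    ```python
    # Minimal, generic blackboard pattern sketch; illustrative only, not the
    # AI Bus implementation described in the abstract.
    class Blackboard:
        def __init__(self):
            self.facts = {}
            self.subscribers = []          # knowledge sources watching the board

        def post(self, key, value):
            self.facts[key] = value
            for ks in self.subscribers:    # event-driven notification
                ks.notify(self, key)

    class KnowledgeSource:
        def __init__(self, name, trigger_key, action):
            self.name, self.trigger_key, self.action = name, trigger_key, action

        def notify(self, board, key):
            if key == self.trigger_key:    # react only to the facts it cares about
                self.action(board)

    board = Blackboard()
    board.subscribers.append(
        KnowledgeSource("planner", "sensor_reading",
                        lambda b: b.post("plan", f"react to {b.facts['sensor_reading']}"))
    )
    board.post("sensor_reading", 42)
    print(board.facts)   # {'sensor_reading': 42, 'plan': 'react to 42'}
    ```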

  10. The Visible Signature Modelling and Evaluation ToolBox

    DTIC Science & Technology

    2008-12-01

    A new software suite, the Visible Signature ToolBox (VST), has been developed to model and evaluate the ... visible signatures of maritime platforms. The VST is a collection of commercial, off-the-shelf software and DSTO-developed programs and procedures. The ... suite. The VST can be utilised to model and assess visible signatures of maritime platforms. A number of examples are presented to demonstrate the ...

  11. The Benefits of Comparing Grapefruits and Tangerines: A Toolbox for European Cross-Cultural Comparisons in Engineering Education--Using This Toolbox to Study Gendered Images of Engineering among Students

    ERIC Educational Resources Information Center

    Godfroy-Genin, Anne-Sophie; Pinault, Cloe

    2006-01-01

    The main objective of the WomEng European research project was to assess when, how and why women decide to or not to study engineering. This question was addressed through an international cross-comparison by an interdisciplinary research team in seven European countries. This article presents, in the first part, the methodological toolbox…

  12. System-Events Toolbox--Activating Urban Places for Social Cohesion through Designing a System of Events That Relies on Local Resources

    ERIC Educational Resources Information Center

    Fassi, Davide; Motter, Roberta

    2014-01-01

    This paper is a reflection on the use of public spaces in towns and the development of a system-events toolbox to activate them towards social cohesion. It is the result of a 1 year action research developed together with POLIMI DESIS Lab of the Department of Design to develop design solutions to open up the public spaces of the campus to the…

  13. Evaluating 3D-printed biomaterials as scaffolds for vascularized bone tissue engineering.

    PubMed

    Wang, Martha O; Vorwald, Charlotte E; Dreher, Maureen L; Mott, Eric J; Cheng, Ming-Huei; Cinar, Ali; Mehdizadeh, Hamidreza; Somo, Sami; Dean, David; Brey, Eric M; Fisher, John P

    2015-01-07

    There is an unmet need for a consistent set of tools for the evaluation of 3D-printed constructs. A toolbox developed to design, characterize, and evaluate 3D-printed poly(propylene fumarate) scaffolds is proposed for vascularized engineered tissues. This toolbox combines modular design and non-destructive fabricated design evaluation, evaluates biocompatibility and mechanical properties, and models angiogenesis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Social Network Mapping: A New Tool For The Leadership Toolbox

    DTIC Science & Technology

    2002-04-01

    Social Network Mapping: A New Tool for the Leadership Toolbox, by Elisabeth J. Strines, Colonel, USAF. ... describes the concept of social network mapping and demonstrates how it can be used by squadron commanders and leaders at all levels to provide subtle ...

  15. Synthetic Biology Toolbox for Controlling Gene Expression in the Cyanobacterium Synechococcus sp. strain PCC 7002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markley, Andrew L.; Begemann, Matthew B.; Clarke, Ryan E.

    The application of synthetic biology requires characterized tools to precisely control gene expression. This toolbox of genetic parts previously did not exist for the industrially promising cyanobacterium, Synechococcus sp. strain PCC 7002. To address this gap, two orthogonal constitutive promoter libraries, one based on a cyanobacterial promoter and the other ported from Escherichia coli, were built and tested in PCC 7002. The libraries demonstrated 3 and 2.5 log dynamic ranges, respectively, but correlated poorly with E. coli expression levels. These promoter libraries were then combined to create and optimize a series of IPTG inducible cassettes. The resultant induction system had a 48-fold dynamic range and was shown to out-perform P trc constructs. Finally, a RBS library was designed and tested in PCC 7002. The presented synthetic biology toolbox will enable accelerated engineering of PCC 7002.

  16. Synthetic Biology Toolbox for Controlling Gene Expression in the Cyanobacterium Synechococcus sp. strain PCC 7002

    DOE PAGES

    Markley, Andrew L.; Begemann, Matthew B.; Clarke, Ryan E.; ...

    2014-09-12

    The application of synthetic biology requires characterized tools to precisely control gene expression. This toolbox of genetic parts previously did not exist for the industrially promising cyanobacterium, Synechococcus sp. strain PCC 7002. To address this gap, two orthogonal constitutive promoter libraries, one based on a cyanobacterial promoter and the other ported from Escherichia coli, were built and tested in PCC 7002. The libraries demonstrated 3 and 2.5 log dynamic ranges, respectively, but correlated poorly with E. coli expression levels. These promoter libraries were then combined to create and optimize a series of IPTG inducible cassettes. The resultant induction system had a 48-fold dynamic range and was shown to out-perform P trc constructs. Finally, a RBS library was designed and tested in PCC 7002. The presented synthetic biology toolbox will enable accelerated engineering of PCC 7002.

  17. GUIDANCE DOCUMENT ON IMPLEMENTATION OF THE ...

    EPA Pesticide Factsheets

    The Agreement in Principle for the Stage 2 M-DBP Federal Advisory Committee contains a list of treatment processes and management practices for water systems to use in meeting additional Cryptosporidium treatment requirements under the LT2ESWTR. This list, termed the microbial toolbox, includes watershed control programs, alternative intake locations, pretreatment processes, additional filtration barriers, inactivation technologies, and enhanced plant performance. The intent of the microbial toolbox is to provide water systems with broad flexibility in selecting cost-effective LT2ESWTR compliance strategies. Moreover, the toolbox allows systems that currently provide additional pathogen barriers or that can demonstrate enhanced performance to receive additional Cryptosporidium treatment credit. The document provides guidance to utilities with surface water supplies and to state drinking water programs on the use of different treatment technologies to reduce the level of Cryptosporidium in drinking water. Technologies included in the guidance manual may be used to achieve compliance with the requirements of the LT2ESWTR.

  18. ΔΔPT: a comprehensive toolbox for the analysis of protein motion

    PubMed Central

    2013-01-01

    Background Normal Mode Analysis is one of the most successful techniques for studying motions in proteins and macromolecules. It can provide information on the mechanisms of protein function, aid crystallography and NMR data reconstruction, and be used to calculate protein free energies. Results ΔΔPT is a toolbox allowing calculation of elastic network models and principal component analysis. It allows the analysis of pdb files or trajectories taken from Gromacs, Amber, and DL_POLY. As well as calculating the normal modes, it also allows comparison of the modes with experimental protein motion, variation of modes with mutation or ligand binding, and calculation of molecular dynamics entropies. Conclusions This toolbox makes the respective tools available to a wide community of potential NMA users, and gives them an unrivalled ability to analyse normal modes using a variety of techniques and current software. PMID:23758746
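
    For readers unfamiliar with elastic network models, the sketch below builds the standard anisotropic-network-model (ANM) Hessian for a set of pseudo-atom coordinates and diagonalizes it with NumPy. It is a generic textbook construction, not ΔΔPT code; the coordinates, cutoff, and spring constant are arbitrary illustrative values.

    ```python
    # Generic anisotropic network model (ANM) normal-mode sketch, not ΔΔPT code.
    import numpy as np

    def anm_modes(coords, cutoff=15.0, gamma=1.0):
        """Return eigenvalues/eigenvectors of the 3N x 3N ANM Hessian."""
        n = len(coords)
        hess = np.zeros((3 * n, 3 * n))
        for i in range(n):
            for j in range(i + 1, n):
                d = coords[j] - coords[i]
                r2 = d @ d
                if r2 > cutoff ** 2:
                    continue                               # not in contact
                block = -gamma * np.outer(d, d) / r2       # 3x3 super-element
                hess[3*i:3*i+3, 3*j:3*j+3] = block
                hess[3*j:3*j+3, 3*i:3*i+3] = block
                hess[3*i:3*i+3, 3*i:3*i+3] -= block        # diagonal blocks balance rows
                hess[3*j:3*j+3, 3*j:3*j+3] -= block
        return np.linalg.eigh(hess)

    # Illustrative random "structure" of 50 pseudo-atoms in a 30 Å box.
    rng = np.random.default_rng(0)
    vals, vecs = anm_modes(rng.uniform(0.0, 30.0, size=(50, 3)))
    print(vals[:8])   # first six ~0 (rigid-body modes), then the softest internal modes
    ```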

  19. EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing

    PubMed Central

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590

  20. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates

    PubMed Central

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include: • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms. • The modular approach along with the idea of lookup tables implemented help avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform. • The concept also helps prevent unnecessary computation of already known transforms thereby saving memory and processing time. PMID:26150988
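
    For reference, the "two significantly simpler transforms" referred to above are an angular Fourier series and an nth-order Hankel transform applied to each angular coefficient. A commonly used form of the identity is sketched below; the 2π factor and the sign of the exponent depend on the Fourier-transform convention adopted, so treat the constants as indicative rather than as the toolbox's exact definitions.

    ```latex
    % 2D Fourier transform in polar coordinates, decomposed into an angular
    % Fourier series plus per-order Hankel transforms.  Constants correspond to
    % F(rho,psi) = \int_0^{2\pi}\int_0^\infty f(r,\theta)\, e^{-i \rho r \cos(\theta-\psi)}\, r\, dr\, d\theta.
    \begin{align*}
      f(r,\theta) &= \sum_{n=-\infty}^{\infty} f_n(r)\, e^{i n \theta},
      \qquad
      f_n(r) = \frac{1}{2\pi}\int_0^{2\pi} f(r,\theta)\, e^{-i n \theta}\, d\theta,\\
      F(\rho,\psi) &= 2\pi \sum_{n=-\infty}^{\infty} (-i)^{n}\, e^{i n \psi}
                     \int_0^{\infty} f_n(r)\, J_n(\rho r)\, r\, dr ,
    \end{align*}
    % where the radial integral is the n-th order Hankel transform of f_n.
    ```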

  1. The laboratory test utilization management toolbox

    PubMed Central

    Baird, Geoffrey

    2014-01-01

    Efficiently managing laboratory test utilization requires both ensuring adequate utilization of needed tests in some patients and discouraging superfluous tests in other patients. After the difficult clinical decision is made to define the patients that do and do not need a test, a wealth of interventions are available to the clinician and laboratorian to help guide appropriate utilization. These interventions are collectively referred to here as the utilization management toolbox. Experience has shown that some tools in the toolbox are weak and others are strong, and that tools are most effective when many are used simultaneously. While the outcomes of utilization management studies are not always as concrete as may be desired, the data available in the literature indicate that strong utilization management interventions are safe and effective measures to improve patient health and reduce waste in an era of increasing financial pressure. PMID:24969916

  2. Development of an online well-being intervention for young people: an evaluation protocol.

    PubMed

    Antezana, Gaston; Bidargaddi, Niranjan; Blake, Victoria; Schrader, Geoffrey; Kaambwa, Billingsley; Quinn, Stephen; Orlowski, Simone; Winsall, Megan; Battersby, Malcolm

    2015-04-30

    Research has shown that improving well-being using positive mental health interventions can be useful for predicting and preventing mental illness. Implementing online interventions may be an effective way to reach young people, given their familiarity with technology. This study will assess the effectiveness of a website called the "Online Wellbeing Centre (OWC)," designed for the support and improvement of mental health and well-being in young Australians aged between 16 and 25 years. As the active component of the study, the OWC will introduce a self-guided app recommendation service called "The Toolbox: The best apps for your brain and body" developed by ReachOut.com. The Toolbox is a responsive website that serves as a personalized, ongoing recommendation service for technology-based tools and apps to improve well-being. It allows users to personalize their experience according to their individual needs. This study will be a two-arm, randomized controlled trial following a wait-list control design. The primary outcome will be changes in psychological well-being measured by the Mental Health Continuum Short Form. The secondary outcomes will be drawn from a subsample of participants and will include depression scores measured by the Center for Epidemiologic Studies Depression Scale, and quality of life measured by the Assessment of Quality of Life-four dimensions (AQOL-4D) index. Cost-effectiveness analysis will be conducted based on a primary outcome of cost per unique visit to the OWC. Utility-based outcomes will also be incorporated into the analysis allowing a secondary outcome to be cost per quality-adjusted life year gained (based on the AQOL-4D values). Resource use associated with both the intervention and control groups will be collected using a customized questionnaire. Online- and community-based recruitment strategies will be implemented, and the effectiveness of each approach will be analyzed. Participants will be recruited from the general Australian population and randomized online. The trial will last for 4 weeks. Small but clinically significant increases in well-being symptoms are expected to be detected in the intervention group compared with the control group. If this intervention proves to be effective, it will have an impact on the future design and implementation of online-based well-being interventions as a valid and cost-effective way to support mental health clinical treatment. Findings regarding recruitment effectiveness will also contribute to developing better ways to engage this population in research. This study is registered in the Australian New Zealand Clinical Trials Registry (ANZCTR): ACTRN12614000710628.

  3. 32 CFR Appendix A to Part 179 - Tables of the Munitions Response Site Prioritization Protocol

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false Tables of the Munitions Response Site... OF DEFENSE CLOSURES AND REALIGNMENT MUNITIONS RESPONSE SITE PRIORITIZATION PROTOCOL (MRSPP) Pt. 179, App. A Appendix A to Part 179—Tables of the Munitions Response Site Prioritization Protocol The tables...

  4. 32 CFR Appendix A to Part 179 - Tables of the Munitions Response Site Prioritization Protocol

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 1 2012-07-01 2012-07-01 false Tables of the Munitions Response Site... OF DEFENSE CLOSURES AND REALIGNMENT MUNITIONS RESPONSE SITE PRIORITIZATION PROTOCOL (MRSPP) Pt. 179, App. A Appendix A to Part 179—Tables of the Munitions Response Site Prioritization Protocol The tables...

  5. 32 CFR Appendix A to Part 179 - Tables of the Munitions Response Site Prioritization Protocol

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 1 2014-07-01 2014-07-01 false Tables of the Munitions Response Site... OF DEFENSE CLOSURES AND REALIGNMENT MUNITIONS RESPONSE SITE PRIORITIZATION PROTOCOL (MRSPP) Pt. 179, App. A Appendix A to Part 179—Tables of the Munitions Response Site Prioritization Protocol The tables...

  6. 32 CFR Appendix A to Part 179 - Tables of the Munitions Response Site Prioritization Protocol

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 1 2011-07-01 2011-07-01 false Tables of the Munitions Response Site... OF DEFENSE CLOSURES AND REALIGNMENT MUNITIONS RESPONSE SITE PRIORITIZATION PROTOCOL (MRSPP) Pt. 179, App. A Appendix A to Part 179—Tables of the Munitions Response Site Prioritization Protocol The tables...

  7. 32 CFR Appendix A to Part 179 - Tables of the Munitions Response Site Prioritization Protocol

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 1 2013-07-01 2013-07-01 false Tables of the Munitions Response Site... OF DEFENSE CLOSURES AND REALIGNMENT MUNITIONS RESPONSE SITE PRIORITIZATION PROTOCOL (MRSPP) Pt. 179, App. A Appendix A to Part 179—Tables of the Munitions Response Site Prioritization Protocol The tables...

  8. Wave data processing toolbox manual

    USGS Publications Warehouse

    Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul

    2006-01-01

    Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP(tm)), the Sontek Argonaut, and the Nortek Aquadopp(tm) Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive and further analysis of the data becomes cumbersome. More imperative is that these files alone do not include sufficient information pertinent to that deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox congregates the processed files output from the proprietary software into two NetCDF files: one file contains the statistics of the burst data and the other file contains the raw burst data (additional details described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format is easy to disseminate, is portable to any computer platform, and is viewable with public-domain freely-available software. Another important advantage is that a metadata structure is embedded with the data to document pertinent information regarding the deployment and the parameters used to process the data. Using this format ensures that the relevant information about how the data was collected and converted to physical units is maintained with the actual data. EPIC-standard variable names have been utilized where appropriate. These standards, developed by the NOAA Pacific Marine Environmental Laboratory (PMEL) (http://www.pmel.noaa.gov/epic/), provide a universal vernacular allowing researchers to share data without translation.
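
    Because the toolbox's key design choice is packaging burst statistics, raw bursts, and deployment metadata together in NetCDF, a reader can inspect everything with generic tools. The sketch below uses the netCDF4 Python package on a hypothetical output file; the file name and the EPIC-style variable name are placeholders, not the toolbox's actual naming scheme.

    ```python
    # Inspect a (hypothetical) NetCDF file produced by a wave-data workflow.
    # File, variable, and attribute names below are placeholders.
    from netCDF4 import Dataset

    with Dataset("adcp_wave_statistics.nc") as nc:
        # Global attributes carry the deployment metadata embedded with the data.
        for name in nc.ncattrs():
            print(f"{name} = {getattr(nc, name)}")

        # Variables (e.g. wave height, peak period) and their units.
        for name, var in nc.variables.items():
            units = getattr(var, "units", "unknown")
            print(f"{name}: shape={var.shape}, units={units}")

        # Pull one time series into an array for plotting or QC.
        if "wh_4061" in nc.variables:      # EPIC-style name, used here as a placeholder
            hs = nc.variables["wh_4061"][:]
            print("mean wave height:", hs.mean())
    ```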

  9. ESA's Multi-mission Sentinel-1 Toolbox

    NASA Astrophysics Data System (ADS)

    Veci, Luis; Lu, Jun; Foumelis, Michael; Engdahl, Marcus

    2017-04-01

    The Sentinel-1 Toolbox is a new open source software for scientific learning, research and exploitation of the large archives of Sentinel and heritage missions. The Toolbox is based on the proven BEAM/NEST architecture, inheriting all current NEST functionality including multi-mission support for most civilian satellite SAR missions. The project is funded through ESA's Scientific Exploitation of Operational Missions (SEOM). The Sentinel-1 Toolbox will strive to serve the SEOM mandate by providing leading-edge software to the science and application users in support of ESA's operational SAR mission as well as by educating and growing a SAR user community. The Toolbox consists of a collection of processing tools, data product readers and writers, and a display and analysis application. A common architecture for all Sentinel Toolboxes, called the Sentinel Application Platform (SNAP), is being jointly developed by Brockmann Consult, Array Systems Computing and C-S. The SNAP architecture is ideal for Earth Observation processing and analysis due to the following technological innovations: Extensibility, Portability, Modular Rich Client Platform, Generic EO Data Abstraction, Tiled Memory Management, and a Graph Processing Framework. The project has developed new tools for working with Sentinel-1 data, in particular for working with the new Interferometric TOPSAR mode. TOPSAR complex coregistration and a complete interferometric processing chain have been implemented for Sentinel-1 TOPSAR data. To accomplish this, a coregistration following the Spectral Diversity[4] method has been developed, as well as special azimuth handling in the coherence, interferogram and spectral filter operators. The Toolbox includes reading of L0, L1 and L2 products in SAFE format, calibration and de-noising, slice product assembling, TOPSAR deburst and sub-swath merging, terrain flattening radiometric normalization, and visualization for L2 OCN products. The Toolbox also provides several new tools for exploitation of polarimetric data including speckle filters, decompositions, and classifiers. The Toolbox will also include tools for large data stacks, supervised and unsupervised classification, improved vector handling and change detection. Architectural improvements such as smart memory configuration, task queuing, and optimizations for complex data will provide better support and performance for very large products and stacks. In addition, a Cloud Exploitation Platform Extension (CEP) has been developed to add the capability to smoothly utilize a cloud computing platform where EO data repositories and high performance processing capabilities are available. The extension to the SENTINEL Application Platform would facilitate entry into cloud processing services for supporting bulk processing on high performance clusters. Since December 2016, the COMET-LiCS InSAR portal (http://comet.nerc.ac.uk/COMET-LiCS-portal/) has been live, delivering interferograms and coherence estimates over the entire Alpine-Himalayan belt. The portal already contains tens of thousands of products, which can be browsed in a user-friendly portal, and downloaded for free by the general public. For our processing, we use the facilities at the Climate and Environmental Monitoring from Space (CEMS). Here we have large storage and processing facilities at our disposal, and a complete duplicate of the Sentinel-1 archive is maintained. This greatly simplifies the infrastructure we had to develop for automated processing of large areas. Here we will give an overview of the current status of the processing system, as well as discuss future plans. We will cover the infrastructure we developed to automatically produce interferograms and its challenges, and the processing strategy for time series analysis. We will outline the objectives of the system in the near and distant future, and a roadmap for its continued development. Finally, we will highlight some of the scientific results and projects linked to the system.

  10. GISMO: A MATLAB toolbox for seismic research, monitoring, & education

    NASA Astrophysics Data System (ADS)

    Thompson, G.; Reyes, C. G.; Kempler, L. A.

    2017-12-01

    GISMO is an open-source MATLAB toolbox which provides an object-oriented framework to build workflows and applications that read, process, visualize and write seismic waveform, catalog and instrument response data. GISMO can retrieve data from a variety of sources (e.g. FDSN web services, Earthworm/Winston servers) and data formats (SAC, Seisan, etc.). It can handle waveform data that crosses file boundaries. All this alleviates one of the most time-consuming parts of scientists developing their own codes. GISMO simplifies seismic data analysis by providing a common interface for your data, regardless of its source. Several common plots are built in to GISMO, such as record section plots, spectrograms, depth-time sections, event counts per unit time, energy release per unit time, etc. Other visualizations include map views and cross-sections of hypocentral data. Several common processing methods are also included, such as an extensive set of tools for correlation analysis. Support is being added to interface GISMO with ObsPy. GISMO encourages community development of an integrated set of codes and accompanying documentation, eliminating the need for seismologists to "reinvent the wheel". By sharing code, the consistency and repeatability of results can be enhanced. GISMO is hosted on GitHub with documentation both within the source code and in the project wiki. GISMO has been used at the University of South Florida and University of Alaska Fairbanks in graduate-level courses including Seismic Data Analysis, Time Series Analysis and Computational Seismology. GISMO has also been tailored to interface with the common seismic monitoring software and data formats used by volcano observatories in the US and elsewhere. As an example, toolbox training was delivered to researchers at INETER (Nicaragua). Applications built on GISMO include IceWeb (e.g. web-based spectrograms), which has been used by Alaska Volcano Observatory since 1998 and became the prototype for the USGS Pensive system.

  11. Modelling multi-pulse population dynamics from ultrafast spectroscopy.

    PubMed

    van Wilderen, Luuk J G W; Lincoln, Craig N; van Thor, Jasper J

    2011-03-21

    Current advanced laser, optics and electronics technology allows sensitive recording of molecular dynamics, from single resonance to multi-colour and multi-pulse experiments. Extracting the occurring (bio-) physical relevant pathways via global analysis of experimental data requires a systematic investigation of connectivity schemes. Here we present a Matlab-based toolbox for this purpose. The toolbox has a graphical user interface which facilitates the application of different reaction models to the data to generate the coupled differential equations. Any time-dependent dataset can be analysed to extract time-independent correlations of the observables by using gradient or direct search methods. Specific capabilities (i.e. chirp and instrument response function) for the analysis of ultrafast pump-probe spectroscopic data are included. The inclusion of an extra pulse that interacts with a transient phase can help to disentangle complex interdependent pathways. The modelling of pathways is therefore extended by new theory (which is included in the toolbox) that describes the finite bleach (orientation) effect of single and multiple intense polarised femtosecond pulses on an ensemble of randomly oriented particles in the presence of population decay. For instance, the generally assumed flat-top multimode beam profile is adapted to a more realistic Gaussian shape, exposing the need for several corrections for accurate anisotropy measurements. In addition, the (selective) excitation (photoselection) and anisotropy of populations that interact with single or multiple intense polarised laser pulses is demonstrated as function of power density and beam profile. Using example values of real world experiments it is calculated to what extent this effectively orients the ensemble of particles. Finally, the implementation includes the interaction with multiple pulses in addition to depth averaging in optically dense samples. In summary, we show that mathematical modelling is essential to model and resolve the details of physical behaviour of populations in ultrafast spectroscopy such as pump-probe, pump-dump-probe and pump-repump-probe experiments.
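
    As a minimal illustration of the kind of connectivity-scheme modelling described here (and not the toolbox's own code), the sketch below integrates a sequential A → B → C scheme pumped by a Gaussian, IRF-shaped excitation term using SciPy; the rate constants and IRF width are arbitrary illustrative values.

    ```python
    # Generic sequential kinetic model A -> B -> C driven by a Gaussian
    # IRF-shaped pump term; illustrative only, not the toolbox's implementation.
    import numpy as np
    from scipy.integrate import solve_ivp

    k1, k2 = 1 / 2.0, 1 / 20.0        # decay rates (1/ps), arbitrary values
    t0, irf = 0.0, 0.15               # pump centre and IRF width (ps)

    def pump(t):
        return np.exp(-0.5 * ((t - t0) / irf) ** 2)

    def rhs(t, y):
        a, b, c = y
        return [pump(t) - k1 * a,     # A is created by the pump, decays to B
                k1 * a - k2 * b,      # B is fed by A, decays to C
                k2 * b]               # C accumulates

    t_eval = np.linspace(-1, 100, 500)
    sol = solve_ivp(rhs, (t_eval[0], t_eval[-1]), [0.0, 0.0, 0.0],
                    t_eval=t_eval, max_step=0.05)

    # sol.y rows are the A, B, C populations; combining them with species
    # spectra would simulate time-resolved data for comparison with experiment.
    print(sol.y[:, -1])
    ```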

  12. Modelling Multi-Pulse Population Dynamics from Ultrafast Spectroscopy

    PubMed Central

    van Wilderen, Luuk J. G. W.; Lincoln, Craig N.; van Thor, Jasper J.

    2011-01-01

    Current advanced laser, optics and electronics technology allows sensitive recording of molecular dynamics, from single resonance to multi-colour and multi-pulse experiments. Extracting the occurring (bio-) physical relevant pathways via global analysis of experimental data requires a systematic investigation of connectivity schemes. Here we present a Matlab-based toolbox for this purpose. The toolbox has a graphical user interface which facilitates the application of different reaction models to the data to generate the coupled differential equations. Any time-dependent dataset can be analysed to extract time-independent correlations of the observables by using gradient or direct search methods. Specific capabilities (i.e. chirp and instrument response function) for the analysis of ultrafast pump-probe spectroscopic data are included. The inclusion of an extra pulse that interacts with a transient phase can help to disentangle complex interdependent pathways. The modelling of pathways is therefore extended by new theory (which is included in the toolbox) that describes the finite bleach (orientation) effect of single and multiple intense polarised femtosecond pulses on an ensemble of randomly oriented particles in the presence of population decay. For instance, the generally assumed flat-top multimode beam profile is adapted to a more realistic Gaussian shape, exposing the need for several corrections for accurate anisotropy measurements. In addition, the (selective) excitation (photoselection) and anisotropy of populations that interact with single or multiple intense polarised laser pulses is demonstrated as function of power density and beam profile. Using example values of real world experiments it is calculated to what extent this effectively orients the ensemble of particles. Finally, the implementation includes the interaction with multiple pulses in addition to depth averaging in optically dense samples. In summary, we show that mathematical modelling is essential to model and resolve the details of physical behaviour of populations in ultrafast spectroscopy such as pump-probe, pump-dump-probe and pump-repump-probe experiments. PMID:21445294

  13. SHynergie: Development of a virtual project laboratory for monitoring hydraulic stimulations

    NASA Astrophysics Data System (ADS)

    Renner, Jörg; Friederich, Wolfgang; Meschke, Günther; Müller, Thomas; Steeb, Holger

    2016-04-01

    Hydraulic stimulations are the primary means of developing subsurface reservoirs with regard to the extent of fluid transport in them. The associated creation or conditioning of a system of hydraulic conduits involves a range of hydraulic and mechanical processes, but chemical reactions, such as dissolution and precipitation, may also affect the stimulation result on time scales as short as hours. In light of the extent and complexity of these processes, the steering potential for the operator of a stimulation critically depends on the ability to integrate the maximum amount of site-specific information with profound process understanding and a large spectrum of experience. We report on the development of a virtual project laboratory for monitoring hydraulic stimulations within the project SHynergie (http://www.ruhr-uni-bochum.de/shynergie/). The concept envisions a product that constitutes a preparing and accompanying, rather than a post-processing, instrument, ultimately accessible to the persons responsible for a project via a web repository. The virtual laboratory consists of a database, a toolbox, and a model-building environment. Entries in the database are of two categories. On the one hand, selected mineral and rock properties are provided from the literature. On the other hand, project-specific entries of any format can be made and assigned attributes regarding their use in the stimulation problem at hand. The toolbox is interactive and allows the user to perform calculations of effective properties and simulations of different types (e.g., wave propagation in a reservoir, a hydraulic test). The model component is also hybrid: the laboratory provides a library of models reflecting a range of scenarios but also allows the user to develop a site-specific model constituting the basis for simulations. The laboratory offers the option to use its components following the typical workflow of a stimulation project. The toolbox incorporates simulation instruments developed in the course of the SHynergie project that account for the experimental and modeling results of the various sub-projects.

  14. ObsPy - A Python Toolbox for Seismology - and Applications

    NASA Astrophysics Data System (ADS)

    Krischer, L.; Megies, T.; Barsch, R.; MacCarthy, J.; Lecocq, T.; Koymans, M. R.; Carothers, L.; Eulenfeld, T.; Reyes, C. G.; Falco, N.; Sales de Andrade, E.

    2017-12-01

    Recent years witnessed the evolution of Python's ecosystem into one of the most powerful and productive scientific environments across disciplines. ObsPy (https://www.obspy.org) is a fully community driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It is a Python toolbox offering: read and write support for essentially every commonly used data format in seismology, with a unified interface and automatic format detection, covering waveform data (MiniSEED, SAC, SEG-Y, Reftek, ...) as well as station (SEED, StationXML, SC3ML, ...) and event meta information (QuakeML, ZMAP, ...); integrated access to the largest data centers, web services, and real-time data streams (FDSNWS, ArcLink, SeedLink, ...); a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality like travel time calculations with the TauP method, geodetic functions, and data visualizations. ObsPy has been in constant development for more than eight years and is developed and used by scientists around the world, with successful applications in all branches of seismology. Additionally, it nowadays serves as the foundation for a large number of more specialized packages. The newest features include: full interoperability of SEED and StationXML/Inventory objects; access to the Nominal Response Library (NRL) for easy and quick creation of station metadata from scratch; support for the IRIS Federated Catalog Service; improved performance of the EarthWorm client; several improvements to the MiniSEED read/write module; improved plotting capabilities for PPSD (spectrograms, PSD of discrete frequencies over time, ...); and support for reading ArcLink inventory XML, reading the Reftek data format, writing SeisComp3 ML (SC3ML), and writing the StationTXT format. This presentation will give a short overview of the capabilities of ObsPy, point out several representative or new use cases, and show-case some projects that are based on ObsPy, e.g. seismo-live.org, Seedlink-plotter, MSNoise, and others.
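
    A minimal end-to-end usage sketch of the toolbox's FDSN client and signal processing is shown below; the network, station, and time window are arbitrary examples, and the snippet requires network access to the chosen data centre.

    ```python
    # Fetch an hour of waveform data via FDSN web services, deconvolve the
    # instrument response, band-pass filter, and plot. Station/time are examples.
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("IRIS")
    t = UTCDateTime("2017-09-08T04:49:00")   # arbitrary example start time

    st = client.get_waveforms("IU", "ANMO", "00", "BHZ", t, t + 3600,
                              attach_response=True)
    st.remove_response(output="VEL")         # counts -> ground velocity
    st.filter("bandpass", freqmin=0.01, freqmax=1.0)
    print(st)
    st.plot()
    ```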

  15. Visualizing flow fields using acoustic Doppler current profilers and the Velocity Mapping Toolbox

    USGS Publications Warehouse

    Jackson, P. Ryan

    2013-01-01

    The purpose of this fact sheet is to provide examples of how the U.S. Geological Survey is using acoustic Doppler current profilers for much more than routine discharge measurements. These instruments are capable of mapping complex three-dimensional flow fields within rivers, lakes, and estuaries. Using the Velocity Mapping Toolbox to process the ADCP data allows detailed visualization of the data, providing valuable information for a range of studies and applications.

  16. The iRoCS Toolbox--3D analysis of the plant root apical meristem at cellular resolution.

    PubMed

    Schmidt, Thorsten; Pasternak, Taras; Liu, Kun; Blein, Thomas; Aubry-Hivet, Dorothée; Dovzhenko, Alexander; Duerr, Jasmin; Teale, William; Ditengou, Franck A; Burkhardt, Hans; Ronneberger, Olaf; Palme, Klaus

    2014-03-01

    To achieve a detailed understanding of processes in biological systems, cellular features must be quantified in the three-dimensional (3D) context of cells and organs. We described use of the intrinsic root coordinate system (iRoCS) as a reference model for the root apical meristem of plants. iRoCS enables direct and quantitative comparison between the root tips of plant populations at single-cell resolution. The iRoCS Toolbox automatically fits standardized coordinates to raw 3D image data. It detects nuclei or segments cells, automatically fits the coordinate system, and groups the nuclei/cells into the root's tissue layers. The division status of each nucleus may also be determined. The only manual step required is to mark the quiescent centre. All intermediate outputs may be refined if necessary. The ability to learn the visual appearance of nuclei by example allows the iRoCS Toolbox to be easily adapted to various phenotypes. The iRoCS Toolbox is provided as an open-source software package, licensed under the GNU General Public License, to make it accessible to a broad community. To demonstrate the power of the technique, we measured subtle changes in cell division patterns caused by modified auxin flux within the Arabidopsis thaliana root apical meristem. © 2014 The Authors The Plant Journal © 2014 John Wiley & Sons Ltd.

  17. NIA outreach to minority and health disparity populations can a toolbox for recruitment and retention be far behind?

    PubMed

    Harden, J Taylor; Silverberg, Nina

    2010-01-01

    The ability to locate the right research tool at the right time for recruitment and retention of minority and health disparity populations is a challenge. This article provides an introduction to a number of recruitment and retention tools in a National Institute on Aging Health Disparities Toolbox and to this special edition on challenges and opportunities in recruitment and retention of minority populations in Alzheimer disease and dementia research. The Health Disparities Toolbox and Health Disparities Resource Persons Network are described along with other more established resource tools including the Alzheimer Disease Center Education Cores, Alzheimer Disease Education and Referral Center, and Resource Centers for Minority Aging Research. Nine featured articles are introduced. The articles address a range of concerns including what we know and do not know, conceptual and theoretical perspectives framing issues of diversity and inclusion, success as a result of sustained investment of time and community partnerships, the significant issue of mistrust, willingness to participate in research as a dynamic personal attribute, Helpline Service and the amount of resources required for success, assistance in working with Limited English Proficiency elders, and sage advice from social marketing and investigations of health literacy as a barrier to recruitment and retention. Finally, an appeal is made for scientists to share tools for the National Institute on Aging Health Disparity Toolbox and to join the Health Disparities Resource Persons Network.

  18. HDDM: Hierarchical Bayesian estimation of the Drift-Diffusion Model in Python.

    PubMed

    Wiecki, Thomas V; Sofer, Imri; Frank, Michael J

    2013-01-01

    The diffusion model is a commonly used tool to infer latent psychological processes underlying decision-making, and to link them to neural mechanisms based on response times. Although efficient open source software has been made available to quantitatively fit the model to data, current estimation methods require an abundance of response time measurements to recover meaningful parameters, and only provide point estimates of each parameter. In contrast, hierarchical Bayesian parameter estimation methods are useful for enhancing statistical power, allowing for simultaneous estimation of individual subject parameters and the group distribution that they are drawn from, while also providing measures of uncertainty in these parameters in the posterior distribution. Here, we present a novel Python-based toolbox called HDDM (hierarchical drift diffusion model), which allows fast and flexible estimation of the drift-diffusion model and the related linear ballistic accumulator model. HDDM requires fewer data per subject/condition than non-hierarchical methods, allows for full Bayesian data analysis, and can handle outliers in the data. Finally, HDDM supports the estimation of how trial-by-trial measurements (e.g., fMRI) influence decision-making parameters. This paper will first describe the theoretical background of the drift diffusion model and Bayesian inference. We then illustrate usage of the toolbox on a real-world data set from our lab. Finally, parameter recovery studies show that HDDM beats alternative fitting methods like the χ²-quantile method as well as maximum likelihood estimation. The software and documentation can be downloaded at: http://ski.clps.brown.edu/hddm_docs/
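
    A minimal usage sketch along the lines of the toolbox's documentation is shown below; the CSV path and the condition column used in depends_on are placeholders, and the sampler settings are illustrative rather than recommendations.

    ```python
    # Minimal HDDM usage sketch; file name, column names, and sampler settings
    # are placeholders / illustrative values, not recommendations.
    import hddm

    data = hddm.load_csv("my_experiment.csv")      # expects rt / response columns

    # Let drift rate vary by stimulus condition while other parameters are
    # estimated hierarchically across subjects.
    model = hddm.HDDM(data, depends_on={"v": "stim"})
    model.find_starting_values()
    model.sample(2000, burn=200)

    model.print_stats()                            # posterior summaries per parameter
    ```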

  19. Test accommodations for individuals with neurological conditions completing the NIH Toolbox-Cognition Battery: An evaluation of frequency and appropriateness.

    PubMed

    Magasi, Susan; Harniss, Mark; Tulsky, David S; Cohen, Matthew L; Heaton, Robert K; Heinemann, Allen W

    2017-11-01

    First, to evaluate the frequency with which individuals with neurological conditions require test administration accommodations for the NIH Toolbox-Cognition Battery (NIHTB-CB). Second, to evaluate the appropriateness of accommodations provided by administrators, including adherence to NIHTB-CB Reasonable Accommodations Guidelines. Adults with traumatic brain injury, spinal cord injury, or stroke (n = 604) completed the NIHTB-CB and other assessments as part of a multisite study. We provide a descriptive, secondary analysis of test administrator notes to determine use and appropriateness of accommodations. Of the 604 participants, 450 (75%) completed the NIHTB-CB using standard administration procedures, but 137 (22.6%) encountered accessibility challenges that required accommodations. Participants with motor function impairments were most likely to receive at least 1 of 3 kinds of accommodations: (a) use of nonstandard methods of entering responses using standard input devices, (b) use of alternate input devices, or (c) help from the test administrator to enter a response. Fatigue and/or impulsivity led to nonstandard administration by 48 (7.9%) individuals. Post hoc audit of test administrator notes revealed that despite careful instructions and supervision, 49 (56.3%) of the accommodated administrations breached standardization and scores could not be interpreted using test norms. Although the NIHTB-CB was developed for individuals without neurological impairment, most individuals with neurological conditions completed the standardized administration without accommodations. When accommodations were needed, administrators did not adhere to the official Reasonable Accommodations Guidelines in more than half of the cases. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Spectral imaging toolbox: segmentation, hyperstack reconstruction, and batch processing of spectral images for the determination of cell and model membrane lipid order.

    PubMed

    Aron, Miles; Browning, Richard; Carugo, Dario; Sezgin, Erdinc; Bernardino de la Serna, Jorge; Eggeling, Christian; Stride, Eleanor

    2017-05-12

    Spectral imaging with polarity-sensitive fluorescent probes enables the quantification of cell and model membrane physical properties, including local hydration, fluidity, and lateral lipid packing, usually characterized by the generalized polarization (GP) parameter. With the development of commercial microscopes equipped with spectral detectors, spectral imaging has become a convenient and powerful technique for measuring GP and other membrane properties. The existing tools for spectral image processing, however, are insufficient for processing the large data sets afforded by this technological advancement, and are unsuitable for processing images acquired with rapidly internalized fluorescent probes. Here we present a MATLAB spectral imaging toolbox with the aim of overcoming these limitations. In addition to common operations, such as the calculation of distributions of GP values, generation of pseudo-colored GP maps, and spectral analysis, a key highlight of this tool is reliable membrane segmentation for probes that are rapidly internalized. Furthermore, handling for hyperstacks, 3D reconstruction and batch processing facilitates analysis of data sets generated by time series, z-stack, and area scan microscope operations. Finally, the object size distribution is determined, which can provide insight into the mechanisms underlying changes in membrane properties and is desirable for, e.g., studies involving model membranes and surfactant-coated particles. Analysis is demonstrated for cell membranes, cell-derived vesicles, model membranes, and microbubbles with the environmentally sensitive probes Laurdan, carboxyl-modified Laurdan (C-Laurdan), Di-4-ANEPPDHQ, and Di-4-AN(F)EPPTEA (FE), for quantification of the local lateral density of lipids or lipid packing. The Spectral Imaging Toolbox is a powerful tool for the segmentation and processing of large spectral imaging datasets, with a reliable method for membrane segmentation and no programming ability required. The Spectral Imaging Toolbox can be downloaded from https://uk.mathworks.com/matlabcentral/fileexchange/62617-spectral-imaging-toolbox.
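
    The central quantity, generalized polarization, is a simple ratio of two spectral channels, GP = (I_ordered − I_disordered) / (I_ordered + I_disordered). The sketch below computes a pixel-wise GP map from two intensity images with NumPy; the channel wavelength ranges and intensity threshold are placeholders, and this generic calculation is not the toolbox's implementation, which additionally handles segmentation, hyperstacks, and batch processing.

    ```python
    # Pixel-wise generalized polarization (GP) map from two spectral channels.
    # The channel wavelength ranges and intensity threshold are placeholders.
    import numpy as np

    def gp_map(i_ordered, i_disordered, min_intensity=50):
        i_o = i_ordered.astype(float)
        i_d = i_disordered.astype(float)
        total = i_o + i_d
        # Avoid division by zero, then mask dim, unreliable pixels.
        gp = np.where(total > 0, (i_o - i_d) / np.where(total > 0, total, 1), np.nan)
        gp[total < min_intensity] = np.nan
        return gp

    # Illustrative random "images" standing in for e.g. ~440 nm and ~490 nm channels.
    rng = np.random.default_rng(1)
    ch1 = rng.integers(0, 255, (256, 256))
    ch2 = rng.integers(0, 255, (256, 256))
    gp = gp_map(ch1, ch2)
    print(np.nanmean(gp))
    ```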

  1. A self-teaching image processing and voice-recognition-based, intelligent and interactive system to educate visually impaired children

    NASA Astrophysics Data System (ADS)

    Iqbal, Asim; Farooq, Umar; Mahmood, Hassan; Asad, Muhammad Usman; Khan, Akrama; Atiq, Hafiz Muhammad

    2010-02-01

    A self-teaching image processing and voice recognition based system is developed to educate visually impaired children, chiefly in their primary education. The system comprises a computer, a vision camera, an ear speaker and a microphone. The camera, attached to the computer system, is mounted on the ceiling opposite (at the required angle) to the desk on which the book is placed. Sample images and voices, in the form of instructions and commands for English and Urdu alphabets, numeric digits, operators and shapes, are already stored in the database. A blind child first reads the embossed character (object) with the help of fingers, then speaks the answer, the name of the character, shape etc. into the microphone. With the voice command of the blind child received by the microphone, an image is taken by the camera and processed by a MATLAB® program developed with the help of the Image Acquisition and Image Processing toolboxes, which generates a response or the required set of instructions for the child via the ear speaker, resulting in self-education of a visually impaired child. The speech recognition program is also developed in MATLAB® with the help of the Data Acquisition and Signal Processing toolboxes, and it records and processes the commands of the blind child.

  2. Physical, chemical, and metabolic state sensors expand the synthetic biology toolbox for Synechocystis sp. PCC 6803.

    PubMed

    Immethun, Cheryl M; DeLorenzo, Drew M; Focht, Caroline M; Gupta, Dinesh; Johnson, Charles B; Moon, Tae Seok

    2017-07-01

    Many under-developed organisms possess important traits that can boost the effectiveness and sustainability of microbial biotechnology. Photoautotrophic cyanobacteria can utilize the energy captured from light to fix carbon dioxide for their metabolic needs while living in environments not suited for growing crops. Various value-added compounds have been produced by cyanobacteria in the laboratory; yet, the products' titers and yields are often not industrially relevant and lag behind what have been accomplished in heterotrophic microbes. Genetic tools for biological process control are needed to take advantage of cyanobacteria's beneficial qualities, as tool development also lags behind what has been created in common heterotrophic hosts. To address this problem, we developed a suite of sensors that regulate transcription in the model cyanobacterium Synechocystis sp. PCC 6803 in response to metabolically relevant signals, including light and the cell's nitrogen status, and a family of sensors that respond to the inexpensive chemical, l-arabinose. Increasing the number of available tools enables more complex and precise control of gene expression. Expanding the synthetic biology toolbox for this cyanobacterium also improves our ability to utilize this important under-developed organism in biotechnology. Biotechnol. Bioeng. 2017;114: 1561-1569. © 2017 Wiley Periodicals, Inc.

  3. Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) Users' Workshop Presentations

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S. (Compiler)

    2018-01-01

    NASA Glenn Research Center hosted a Users' Workshop on the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) on August 21, 2017. The objective of this workshop was to update the user community on the latest features of T-MATS, and to provide a forum to present work performed using T-MATS. Presentations highlighted creative applications and the development of new features and libraries, and emphasized the flexibility and simulation power of T-MATS.

  4. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 2; Methodology Application Software Toolbox

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.

  5. CEMENTITIOUS BARRIERS PARTNERSHIP FY13 MID-YEAR REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burns, H.; Flach, G.; Langton, C.

    2013-05-01

    In FY2013, the Cementitious Barriers Partnership (CBP) is continuing its effort to develop and enhance software tools, demonstrating tangible progress toward fulfilling the objective of developing a set of tools to improve understanding and prediction of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. In FY2012, the CBP released the initial in-house “Beta version” of the CBP Software Toolbox, a suite of software for simulating reactive transport in cementitious materials and important degradation phenomena. The current primary software components are LeachXS/ORCHESTRA, STADIUM, and a GoldSim interface for probabilistic analysis of selected degradation scenarios. THAMES is a planned future CBP Toolbox component (FY13/14) focused on simulation of the microstructure of cementitious materials and calculation of the resultant hydraulic and constituent mass transfer parameters needed in modeling. This past November, the CBP Software Toolbox Version 1.0 was released, which supports analysis of external sulfate attack (including damage mechanics), carbonation, and primary constituent leaching. The LeachXS component embodies an extensive material property measurements database along with chemical speciation and reactive mass transport simulation cases, with emphasis on leaching of major, trace and radionuclide constituents from cementitious materials used in DOE facilities, such as Saltstone (Savannah River) and Cast Stone (Hanford), tank closure grouts, and barrier concretes. STADIUM focuses on the physical and structural service life of materials and components based on chemical speciation and reactive mass transport of major cement constituents and aggressive species (e.g., chloride, sulfate, etc.). The CBP issued numerous reports and other documentation accompanying the “Version 1.0” release, including a CBP Software Toolbox User Guide and Installation Guide. These documents, as well as the presentations from the CBP Software Toolbox Demonstration and User Workshop, which are briefly described below, can be accessed from the CBP webpage at http://cementbarriers.org/. The website was recently modified to describe the CBP Software Toolbox and includes an interest form for application to use the software. The CBP FY13 program is continuing research to improve and enhance the simulation tools as well as develop new tools that model other key degradation phenomena not addressed in Version 1.0. Efforts to continue to verify the various simulation tools through laboratory experiments and analysis of field specimens, in order to quantify and reduce the uncertainty associated with performance assessments, are also ongoing. This mid-year report includes a summary of the FY13 software accomplishments, in addition to the release of Version 1.0 of the CBP Software Toolbox, and of the various experimental programs that are providing data for calibration and validation of the CBP-developed software. The focus this year for experimental studies was to measure transport in cementitious material by utilization of a leaching method and the reduction capacity of saltstone field samples. Results are being used to calibrate and validate the updated carbonation model.

  6. An Early Years Toolbox for Assessing Early Executive Function, Language, Self-Regulation, and Social Development: Validity, Reliability, and Preliminary Norms

    PubMed Central

    Howard, Steven J.; Melhuish, Edward

    2016-01-01

    Several methods of assessing executive function (EF), self-regulation, language development, and social development in young children have been developed over previous decades. Yet new technologies make available methods of assessment not previously considered. In resolving conceptual and pragmatic limitations of existing tools, the Early Years Toolbox (EYT) offers substantial advantages for early assessment of language, EF, self-regulation, and social development. In the current study, results of our large-scale administration of this toolbox to 1,764 preschool and early primary school students indicated very good reliability, convergent validity with existing measures, and developmental sensitivity. Results were also suggestive of better capture of children’s emerging abilities relative to comparison measures. Preliminary norms are presented, showing a clear developmental trajectory across half-year age groups. The accessibility of the EYT, as well as its advantages over existing measures, offers considerably enhanced opportunities for objective measurement of young children’s abilities to enable research and educational applications. PMID:28503022

  7. Distributed Aerodynamic Sensing and Processing Toolbox

    NASA Technical Reports Server (NTRS)

    Brenner, Martin; Jutte, Christine; Mangalam, Arun

    2011-01-01

    A Distributed Aerodynamic Sensing and Processing (DASP) toolbox was designed and fabricated for flight test applications with an Aerostructures Test Wing (ATW) mounted under the fuselage of an F-15B on the Flight Test Fixture (FTF). DASP monitors and processes the aerodynamics with the structural dynamics using nonintrusive, surface-mounted, hot-film sensing. This aerodynamic measurement tool benefits programs devoted to static/dynamic load alleviation, body freedom flutter suppression, buffet control, improvement of aerodynamic efficiency through cruise control, supersonic wave drag reduction through shock control, etc. This DASP toolbox measures local and global unsteady aerodynamic load distribution with distributed sensing. It determines correlation between aerodynamic observables (aero forces) and structural dynamics, and allows control authority increase through aeroelastic shaping and active flow control. It offers improvements in flutter suppression and, in particular, body freedom flutter suppression, as well as aerodynamic performance of wings for increased range/endurance of manned/unmanned flight vehicles. Other improvements include inlet performance with closed-loop active flow control, and development and validation of advanced analytical and computational tools for unsteady aerodynamics.

  8. BOLDSync: a MATLAB-based toolbox for synchronized stimulus presentation in functional MRI.

    PubMed

    Joshi, Jitesh; Saharan, Sumiti; Mandal, Pravat K

    2014-02-15

    Precise and synchronized presentation of paradigm stimuli in functional magnetic resonance imaging (fMRI) is central to obtaining accurate information about brain regions involved in a specific task. In this manuscript, we present a new MATLAB-based toolbox, BOLDSync, for synchronized stimulus presentation in fMRI. BOLDSync provides a user-friendly platform for design and presentation of visual, audio, as well as multimodal audio-visual (AV) stimuli in functional imaging experiments. We present simulation experiments that demonstrate the millisecond synchronization accuracy of BOLDSync, and also illustrate the functionalities of BOLDSync through application to an AV fMRI study. BOLDSync gains an advantage over other available proprietary and open-source toolboxes by offering a user-friendly and accessible interface that affords both precision in stimulus presentation and versatility across various types of stimulus designs and system setups. BOLDSync is a reliable, efficient, and versatile solution for synchronized stimulus presentation in fMRI studies. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. A spike sorting toolbox for up to thousands of electrodes validated with ground truth recordings in vitro and in vivo

    PubMed Central

    Lefebvre, Baptiste; Deny, Stéphane; Gardella, Christophe; Stimberg, Marcel; Jetter, Florian; Zeck, Guenther; Picaud, Serge; Duebel, Jens

    2018-01-01

    In recent years, multielectrode arrays and large silicon probes have been developed to record simultaneously from hundreds to thousands of electrodes packed at high density. However, they require novel methods to extract the spiking activity of large ensembles of neurons. Here, we developed a new toolbox to sort spikes from these large-scale extracellular data. To validate our method, we performed simultaneous extracellular and loose patch recordings in rodents to obtain ‘ground truth’ data, where the solution to this sorting problem is known for one cell. The performance of our algorithm was always close to the best expected performance, over a broad range of signal-to-noise ratios, in vitro and in vivo. The algorithm is entirely parallelized and has been successfully tested on recordings with up to 4225 electrodes. Our toolbox thus offers a generic solution to sort spikes accurately for up to thousands of electrodes. PMID:29557782
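
    A minimal sketch of the first stage most such pipelines share, detecting putative spikes as crossings of a threshold derived from a robust noise estimate on a filtered extracellular trace, is given below. The threshold factor, spike polarity and refractory window are assumptions of this illustration, not parameters of the toolbox described above, which additionally clusters and template-matches events across electrodes.

        import numpy as np

        def detect_spikes(trace, fs, threshold_factor=5.0, refractory_ms=1.0):
            """Detect negative-going threshold crossings on a band-pass-filtered trace."""
            noise = np.median(np.abs(trace)) / 0.6745      # robust (MAD-based) noise estimate
            threshold = threshold_factor * noise
            crossings = np.flatnonzero(trace < -threshold)
            refractory = int(refractory_ms * 1e-3 * fs)
            spikes, last = [], -refractory
            for idx in crossings:
                if idx - last >= refractory:               # keep one event per refractory window
                    spikes.append(idx)
                    last = idx
            return np.array(spikes)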

  10. Developing a Fluorescent Toolbox To Shed Light on the Mysteries of RNA.

    PubMed

    Alexander, Seth C; Devaraj, Neal K

    2017-10-03

    Technologies that detect and image RNA have illuminated the complex roles played by RNA, redefining the traditional and superficial role first outlined by the central dogma of biology. Because there is such a wide diversity of RNA structure arising from an assortment of functions within biology, a toolbox of approaches has emerged for investigation of this important class of biomolecules. These methods are necessary to detect and elucidate the localization and dynamics of specific RNAs and, in doing so, unlock our understanding of how RNA dysregulation leads to disease. Current methods for detecting and imaging RNA include in situ hybridization techniques, fluorescent aptamers, RNA binding proteins fused to fluorescent reporters, and covalent labeling strategies. Because of the inherent diversity of these methods, each approach comes with a set of strengths and limitations that leave room for future improvement. This perspective seeks to highlight the most recent advances and remaining challenges for the wide-ranging toolbox of technologies that illuminate RNA's contribution to cellular complexity.

  11. GenSSI 2.0: multi-experiment structural identifiability analysis of SBML models.

    PubMed

    Ligon, Thomas S; Fröhlich, Fabian; Chis, Oana T; Banga, Julio R; Balsa-Canto, Eva; Hasenauer, Jan

    2018-04-15

    Mathematical modeling using ordinary differential equations is used in systems biology to improve the understanding of dynamic biological processes. The parameters of ordinary differential equation models are usually estimated from experimental data. To analyze a priori the uniqueness of the solution of the estimation problem, structural identifiability analysis methods have been developed. We introduce GenSSI 2.0, an advancement of the software toolbox GenSSI (Generating Series for testing Structural Identifiability). GenSSI 2.0 is the first toolbox for structural identifiability analysis to implement Systems Biology Markup Language import, state/parameter transformations and multi-experiment structural identifiability analysis. In addition, GenSSI 2.0 supports a range of MATLAB versions and is computationally more efficient than its previous version, enabling the analysis of more complex models. GenSSI 2.0 is an open-source MATLAB toolbox and is available at https://github.com/genssi-developer/GenSSI. thomas.ligon@physik.uni-muenchen.de or jan.hasenauer@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.

  12. Interplay between plasma hormone profiles, sex and body condition in immature hawksbill turtles (Eretmochelys imbricata) subjected to a capture stress protocol.

    PubMed

    Jessop, Tim S; Sumner, Joanna M; Limpus, Colin J; Whittier, Joan M

    2004-01-01

    We investigated plasma hormone profiles of corticosterone and testosterone in immature hawksbill turtles (Eretmochelys imbricata) in response to a capture stress protocol. Further, we examined whether sex and body condition were covariates associated with variation in the adrenocortical response of immature turtles. Hawksbill turtles responded to the capture stress protocol by significantly increasing plasma levels of corticosterone over a 5 h period. There was no significant sex difference in the corticosterone stress response of immature turtles. Plasma testosterone profiles, while significantly different between the sexes, did not exhibit a significant change during the 5 h capture stress protocol. An index of body condition was not significantly associated with a turtle's capacity to produce plasma corticosterone either prior to or during exposure to the capture stress protocol. In summary, while immature hawksbill turtles exhibited an adrenocortical response to a capture stress protocol, neither their sex nor body condition was responsible for variation in endocrine responses. This lack of interaction between the adrenocortical response and these internal factors suggests that the inactive reproductive status and the current energetic status of these immature turtles are important factors that could influence plasma hormone profiles during stress.

  13. Wind wave analysis in depth limited water using OCEANLYZ, A MATLAB toolbox

    NASA Astrophysics Data System (ADS)

    Karimpour, Arash; Chen, Qin

    2017-09-01

    There are a number of well-established methods in the literature describing how to assess and analyze measured wind wave data. However, obtaining reliable results from these methods requires adequate knowledge of their behavior, strengths and weaknesses. A proper implementation of these methods requires a series of procedures including a pretreatment of the raw measurements, and adjustment and refinement of the processed data to provide quality assurance of the outcomes; otherwise the analysis can produce untrustworthy results. This paper discusses potential issues in these procedures, explains what parameters are influential for the outcomes and suggests practical solutions to avoid and minimize the errors in the wave results. The procedures of converting the water pressure data into the water surface elevation data, treating the high frequency data with a low signal-to-noise ratio, partitioning swell energy from wind sea, and estimating the peak wave frequency from the weighted integral of the wave power spectrum are described. Conversion and recovery of the data acquired by a pressure transducer, particularly in depth-limited water like estuaries and lakes, are explained in detail. To provide researchers with tools for a reliable estimation of wind wave parameters, the Ocean Wave Analyzing toolbox, OCEANLYZ, is introduced. The toolbox contains a number of MATLAB functions for estimation of the wave properties in time and frequency domains. The toolbox has been developed and examined during a number of field study projects in Louisiana's estuaries.
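
    As a hedged illustration of one step described above, the sketch below estimates the peak wave frequency from a weighted integral of the wave power spectrum rather than from the single highest spectral bin. The spectral estimator (Welch) and the weighting exponent are assumptions of this sketch, not necessarily the choices made in OCEANLYZ.

        import numpy as np
        from scipy.signal import welch

        def weighted_peak_frequency(eta, fs, exponent=5):
            """Estimate fp = integral(f * S**n df) / integral(S**n df) from surface elevation eta."""
            f, S = welch(eta, fs=fs, nperseg=min(len(eta), 1024))
            w = S ** exponent                      # emphasize the spectral peak
            return np.trapz(f * w, f) / np.trapz(w, f)

        # Synthetic check: a 0.2 Hz swell plus noise, sampled at 4 Hz
        fs = 4.0
        t = np.arange(0, 1200, 1 / fs)
        eta = 0.5 * np.sin(2 * np.pi * 0.2 * t) + 0.05 * np.random.randn(t.size)
        print(weighted_peak_frequency(eta, fs))    # close to 0.2 Hz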

  14. ESA Atmospheric Toolbox

    NASA Astrophysics Data System (ADS)

    Niemeijer, Sander

    2017-04-01

    The ESA Atmospheric Toolbox (BEAT) is one of the ESA Sentinel Toolboxes. It consists of a set of software components to read, analyze, and visualize a wide range of atmospheric data products. In addition to the upcoming Sentinel-5P mission it supports a wide range of other atmospheric data products, including those of previous ESA missions, ESA Third Party missions, Copernicus Atmosphere Monitoring Service (CAMS), ground based data, etc. The toolbox consists of three main components that are called CODA, HARP and VISAN. CODA provides interfaces for direct reading of data from earth observation data files. These interfaces consist of command line applications, libraries, direct interfaces to scientific applications (IDL and MATLAB), and direct interfaces to programming languages (C, Fortran, Python, and Java). CODA provides a single interface to access data in a wide variety of data formats, including ASCII, binary, XML, netCDF, HDF4, HDF5, CDF, GRIB, RINEX, and SP3. HARP is a toolkit for reading, processing and inter-comparing satellite remote sensing data, model data, in-situ data, and ground based remote sensing data. The main goal of HARP is to assist in the inter-comparison of datasets. By appropriately chaining calls to HARP command line tools one can pre-process datasets such that two datasets that need to be compared end up having the same temporal/spatial grid, same data format/structure, and same physical unit. The toolkit comes with its own data format conventions, the HARP format, which is based on netcdf/HDF. Ingestion routines (based on CODA) allow conversion from a wide variety of atmospheric data products to this common format. In addition, the toolbox provides a wide range of operations to perform conversions on the data such as unit conversions, quantity conversions (e.g. number density to volume mixing ratios), regridding, vertical smoothing using averaging kernels, collocation of two datasets, etc. VISAN is a cross-platform visualization and analysis application for atmospheric data and can be used to visualize and analyze the data that you retrieve using the CODA and HARP interfaces. The application uses the Python language as the means through which you provide commands to the application. The Python interfaces for CODA and HARP are included so you can directly ingest product data from within VISAN. Powerful visualization functionality for 2D plots and geographical plots in VISAN will allow you to directly visualize the ingested data. All components from the ESA Atmospheric Toolbox are Open Source and freely available. Software packages can be downloaded from the BEAT website: http://stcorp.nl/beat/

  15. EEG and MEG data analysis in SPM8.

    PubMed

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is free and open-source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools.

  16. EEG and MEG Data Analysis in SPM8

    PubMed Central

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is free and open-source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools. PMID:21437221

  17. COBRA ATD multispectral camera response model

    NASA Astrophysics Data System (ADS)

    Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.

    2000-08-01

    A new multispectral camera response model has been developed in support of the US Marine Corps (USMC) Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) Program. This analytical model accurately estimates the response of five Xybion intensified IMC 201 multispectral cameras used for COBRA ATD airborne minefield detection. The camera model design is based on a series of camera response curves which were generated through optical laboratory tests performed by the Naval Surface Warfare Center, Dahlgren Division, Coastal Systems Station (CSS). Data fitting techniques were applied to these measured response curves to obtain nonlinear expressions which estimate digitized camera output as a function of irradiance, intensifier gain, and exposure. This COBRA Camera Response Model was proven to be very accurate, stable over a wide range of parameters, analytically invertible, and relatively simple. This practical camera model was subsequently incorporated into the COBRA sensor performance evaluation and computational tools for research analysis modeling toolbox in order to enhance COBRA modeling and simulation capabilities. Details of the camera model design and comparisons of modeled response to measured experimental data are presented.
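
    The data-fitting step described above can be illustrated with a short nonlinear least-squares sketch. The functional form, parameter names and synthetic data below are hypothetical placeholders chosen only to show the fitting procedure; they do not reproduce the actual COBRA camera response expressions.

        import numpy as np
        from scipy.optimize import curve_fit

        def response_model(X, a, b, gamma):
            """Hypothetical response: counts as a function of irradiance, gain and exposure."""
            irradiance, gain, exposure = X
            return a * (irradiance * exposure) ** gamma * np.exp(b * gain)

        # Synthetic "measured" response curves in place of the laboratory data
        rng = np.random.default_rng(0)
        irr, gain, exposure = rng.uniform(0.1, 1.0, (3, 200))
        counts = response_model((irr, gain, exposure), 800.0, 1.5, 0.9) + rng.normal(0, 5, 200)

        popt, _ = curve_fit(response_model, (irr, gain, exposure), counts, p0=[500.0, 1.0, 1.0])
        print(popt)   # recovered parameters, close to (800, 1.5, 0.9)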

  18. Evaluation of the University of Florida lomustine, vincristine, procarbazine, and prednisone chemotherapy protocol for the treatment of relapsed lymphoma in dogs: 33 cases (2003-2009).

    PubMed

    Fahey, Christine E; Milner, Rowan J; Barabas, Karri; Lurie, David; Kow, Kelvin; Parfitt, Shannon; Lyles, Sarah; Clemente, Monica

    2011-07-15

    To evaluate the toxicity and efficacy of a modification of a previously evaluated combination of lomustine, vincristine, procarbazine, and prednisone (LOPP) as a rescue protocol for refractory lymphoma in dogs. Retrospective case series. Animals: 33 dogs with a cytologic or histologic diagnosis of lymphoma that developed resistance to their induction chemotherapy protocol. Lomustine was administered on day 0 of the protocol. Vincristine was administered on day 0 and again 1 time on day 14. Procarbazine and prednisone were administered on days 0 through 13 of the protocol. This cycle was repeated every 28 days. Median time from initiation to discontinuation of the University of Florida LOPP protocol was 84 days (range, 10 to 308 days). Overall median survival time was 290 days (range, 51 to 762 days). Overall response rate with this protocol was 61% (20/33), with 36% (12) having a complete response and 24% (8) having a partial response. Toxicosis rates were lower than for the previously published LOPP protocol. The University of Florida LOPP protocol may be an acceptable alternative to the mechlorethamine, vincristine, procarbazine, and prednisone protocol as a rescue protocol for dogs with lymphoma.

  19. Why Heuristics Work.

    PubMed

    Gigerenzer, Gerd

    2008-01-01

    The adaptive toolbox is a Darwinian-inspired theory that conceives of the mind as a modular system that is composed of heuristics, their building blocks, and evolved capacities. The study of the adaptive toolbox is descriptive and analyzes the selection and structure of heuristics in social and physical environments. The study of ecological rationality is prescriptive and identifies the structure of environments in which specific heuristics either succeed or fail. Results have been used for designing heuristics and environments to improve professional decision making in the real world. © 2008 Association for Psychological Science.

  20. GIXSGUI: a MATLAB toolbox for grazing-incidence X-ray scattering data visualization and reduction, and indexing of buried three-dimensional periodic nanostructured films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Zhang

    GIXSGUI is a MATLAB toolbox that offers both a graphical user interface and script-based access to visualize and process grazing-incidence X-ray scattering data from nanostructures on surfaces and in thin films. It provides routine surface scattering data reduction methods such as geometric correction, one-dimensional intensity linecut, two-dimensional intensity reshaping, etc. Three-dimensional indexing is also implemented to determine the space group and lattice parameters of buried organized nanoscopic structures in supported thin films.

  1. Intra-individual psychological and physiological responses to acute laboratory stressors of different intensity.

    PubMed

    Skoluda, Nadine; Strahler, Jana; Schlotz, Wolff; Niederberger, Larissa; Marques, Sofia; Fischer, Susanne; Thoma, Myriam V; Spoerri, Corinne; Ehlert, Ulrike; Nater, Urs M

    2015-01-01

    The phenomenon of stress is understood as a multidimensional concept which can be captured by psychological and physiological measures. There are various laboratory stress protocols which enable stress to be investigated under controlled conditions. However, little is known about whether these protocols differ with regard to the induced psycho-physiological stress response pattern. In a within-subjects design, 20 healthy young men underwent four of the most common stress protocols (Stroop test [Stroop], cold pressor test [CPT], Trier Social Stress Test [TSST], and bicycle ergometer test [Ergometer]) and a no-stress control condition (rest) in a randomized order. For the multidimensional assessment of the stress response, perceived stress, endocrine and autonomic biomarkers (salivary cortisol, salivary alpha-amylase, and heart rate) were obtained during the experiments. All stress protocols evoked increases in perceived stress levels, with the highest levels in the TSST, followed by Ergometer, Stroop, and CPT. The highest HPA axis response was found in the TSST, followed by Ergometer, CPT, and Stroop, whilst the highest autonomic response was found in the Ergometer, followed by TSST, Stroop, and CPT. These findings suggest that different stress protocols differentially stimulate various aspects of the stress response. Physically demanding stress protocols such as the Ergometer test appear to be particularly suitable for evoking autonomic stress responses, whereas uncontrollable and social-evaluative threatening stressors (such as the TSST) are most likely to elicit HPA axis stress responses. The results of this study may help researchers in deciding which stress protocol to use, depending on the individual research question. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. A New Method for a Virtue-Based Responsible Conduct of Research Curriculum: Pilot Test Results.

    PubMed

    Berling, Eric; McLeskey, Chet; O'Rourke, Michael; Pennock, Robert T

    2018-02-03

    Drawing on Pennock's theory of scientific virtues, we are developing an alternative curriculum for training scientists in the responsible conduct of research (RCR) that emphasizes internal values rather than externally imposed rules. This approach focuses on the virtuous characteristics of scientists that lead to responsible and exemplary behavior. We have been pilot-testing one element of such a virtue-based approach to RCR training by conducting dialogue sessions, modeled upon the approach developed by Toolbox Dialogue Initiative, that focus on a specific virtue, e.g., curiosity and objectivity. During these structured discussions, small groups of scientists explore the roles they think the focus virtue plays and should play in the practice of science. Preliminary results have shown that participants strongly prefer this virtue-based model over traditional methods of RCR training. While we cannot yet definitively say that participation in these RCR sessions contributes to responsible conduct, these pilot results are encouraging and warrant continued development of this virtue-based approach to RCR training.

  3. CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, III, F. G.

    2016-07-29

    One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface, using the GoldSim software, to the STADIUM code developed by SIMCO Technologies, Inc. and to LeachXS/ORCHESTRA developed by the Energy research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results (the code developers have provided validation test results as part of their code QA documentation); and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. Therefore, it is concluded that the codes can be used to support performance assessment. This conclusion takes into account the QA documentation produced for the partner codes and for the CBP Toolbox.

  4. ESP Toolbox: A Computational Framework for Precise, Scale-Independent Analysis of Bulk Elastic and Seismic Properties

    NASA Astrophysics Data System (ADS)

    Johnson, S. E.; Vel, S. S.; Cook, A. C.; Song, W. J.; Gerbi, C. C.; Okaya, D. A.

    2014-12-01

    Owing to the abundance of highly anisotropic minerals in the crust, the Voigt and Reuss bounds on the seismic velocities can be separated by more than 1 km/s. These bounds are determined by modal mineralogy and crystallographic preferred orientations (CPO) of the constituent minerals, but where the true velocities lie between these bounds is determined by other fabric parameters such as the shapes, shape-preferred orientations, and spatial arrangements of grains. Thus, the calculation of accurate bulk stiffness relies on explicitly treating the grain-scale heterogeneity, and the same principle applies at larger scales, for example calculating accurate bulk stiffness for a crustal volume with varying proportions and distributions of folds or shear zones. We have developed stand-alone GUI software - ESP Toolbox - for the calculation of 3D bulk elastic and seismic properties of heterogeneous and polycrystalline materials using image or EBSD data. The GUI includes a number of different homogenization techniques, including Voigt, Reuss, Hill, geometric mean, self-consistent and asymptotic expansion homogenization (AEH) methods. The AEH method, which uses a finite element mesh, is most accurate since it explicitly accounts for elastic interactions of constituent minerals/phases. The user need only specify the microstructure and material properties of the minerals/phases. We use the Toolbox to explore changes in bulk elasticity and related seismic anisotropy caused by specific variables, including: (a) the quartz alpha-beta phase change in rocks with varying proportions of quartz, (b) changes in modal mineralogy and CPO fabric that occur during progressive deformation and metamorphism, and (c) shear zones of varying thickness, abundance and geometry in continental crust. The Toolbox allows rapid sensitivity analysis around these and other variables, and the resulting bulk stiffness matrices can be used to populate volumes for synthetic wave propagation experiments that allow direct visualization of how variables of interest might affect propagation at a variety of scales. Sensitivity analyses also illustrate the value of the more precise AEH method. The ESP Toolbox can be downloaded here: http://umaine.edu/mecheng/faculty-and-staff/senthil-vel/software/
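
    For orientation, the simplest of the homogenization schemes listed above (Voigt, Reuss and Hill) can be written in a few lines. The sketch below assumes each grain's 6x6 stiffness matrix has already been rotated from its crystallographic orientation into the sample frame; it is not the Toolbox's finite-element AEH method, which additionally accounts for grain-scale elastic interactions.

        import numpy as np

        def voigt_reuss_hill(stiffnesses, fractions):
            """stiffnesses: (N, 6, 6) grain stiffness matrices in the sample frame (GPa);
            fractions: (N,) volume fractions."""
            f = np.asarray(fractions, dtype=float)[:, None, None]
            f /= f.sum()
            C_voigt = np.sum(f * stiffnesses, axis=0)                 # average of stiffnesses
            S_reuss = np.sum(f * np.linalg.inv(stiffnesses), axis=0)  # average of compliances
            C_reuss = np.linalg.inv(S_reuss)
            C_hill = 0.5 * (C_voigt + C_reuss)                        # arithmetic Voigt-Reuss mean
            return C_voigt, C_reuss, C_hill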

  5. Basic Radar Altimetry Toolbox: Tools to Use Radar Altimetry for Geodesy

    NASA Astrophysics Data System (ADS)

    Rosmorduc, V.; Benveniste, J. J.; Bronner, E.; Niejmeier, S.

    2010-12-01

    Radar altimetry is very much a technique expanding its applications and uses. While quite a lot of effort has been made for oceanography users (including easy-to-use data), the use of those data for geodesy, especially combined with ESA GOCE mission data, is still somewhat hard. ESA and CNES thus had the Basic Radar Altimetry Toolbox developed (as well as, on the ESA side, the GOCE User Toolbox, both being linked). The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels/several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL - as processing/extraction routines, through the on-line command mode - as an educational and a quick-look tool, with the graphical user interface As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent manners of using altimetry data. It is an opportunity to teach remote sensing with practical training. It has been available since April 2007, and has been demonstrated during training courses and scientific meetings. About 1200 people had downloaded it by Summer 2010, with many "newcomers" to altimetry among them. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in version 2. Others are ongoing, and some are in discussion. Examples and data use cases on geodesy will be presented. BRAT is developed under contract with ESA and CNES.

  6. A Recessive Pollination Control System for Wheat Based on Intein-Mediated Protein Splicing.

    PubMed

    Gils, Mario

    2017-01-01

    A transgene-expression system for wheat that relies on the complementation of inactive precursor protein fragments through a split-intein system is described. The N- and C-terminal fragments of a barnase gene from Bacillus amyloliquifaciens were fused to intein sequences from Synechocystis sp. and transformed into wheat plants. Upon translation, both barnase fragments are assembled by an autocatalytic intein-mediated trans-splicing reaction, thus forming a cytotoxic enzyme. This chapter focuses on the use of introns and flexible polypeptide linkers to foster the expression of a split-barnase expression system in plants. The methods and protocols that were employed with the objective to test the effects of such genetic elements on transgene expression and to find the optimal design of expression vectors for use in wheat are provided. Split-inteins can be used to form an agriculturally important trait (male sterility) in wheat plants. The use of this principle for the production of hybrid wheat seed is described. The suggested toolbox will hopefully be a valuable contribution to future optimization strategies in this commercially important crop.

  7. Genetic tools for the investigation of Roseobacter clade bacteria

    PubMed Central

    2009-01-01

    Background The Roseobacter clade represents one of the most abundant, metabolically versatile and ecologically important bacterial groups found in marine habitats. A detailed molecular investigation of the regulatory and metabolic networks of these organisms is currently limited for many strains by missing suitable genetic tools. Results Conjugation and electroporation methods for the efficient and stable genetic transformation of selected Roseobacter clade bacteria including Dinoroseobacter shibae, Oceanibulbus indolifex, Phaeobacter gallaeciensis, Phaeobacter inhibens, Roseobacter denitrificans and Roseobacter litoralis were tested. For this purpose an antibiotic resistance screening was performed and suitable genetic markers were selected. Based on these transformation protocols stably maintained plasmids were identified. A plasmid encoded oxygen-independent fluorescent system was established using the flavin mononucleotide-based fluorescent protein FbFP. Finally, a chromosomal gene knockout strategy was successfully employed for the inactivation of the anaerobic metabolism regulatory gene dnr from D. shibae DFL12T. Conclusion A genetic toolbox for members of the Roseobacter clade was established. This provides a solid methodical basis for the detailed elucidation of gene regulatory and metabolic networks underlying the ecological success of this group of marine bacteria. PMID:20021642

  8. PETPVC: a toolbox for performing partial volume correction techniques in positron emission tomography

    NASA Astrophysics Data System (ADS)

    Thomas, Benjamin A.; Cuplov, Vesna; Bousse, Alexandre; Mendes, Adriana; Thielemans, Kris; Hutton, Brian F.; Erlandsson, Kjell

    2016-11-01

    Positron emission tomography (PET) images are degraded by a phenomenon known as the partial volume effect (PVE). Approaches have been developed to reduce PVEs, typically through the utilisation of structural information provided by other imaging modalities such as MRI or CT. These methods, known as partial volume correction (PVC) techniques, reduce PVEs by compensating for the effects of the scanner resolution, thereby improving the quantitative accuracy. The PETPVC toolbox described in this paper comprises a suite of methods, both classic and more recent approaches, for the purposes of applying PVC to PET data. Eight core PVC techniques are available. These core methods can be combined to create a total of 22 different PVC techniques. Simulated brain PET data are used to demonstrate the utility of the toolbox in idealised conditions, the effects of applying PVC with mismatched point-spread function (PSF) estimates and the potential of novel hybrid PVC methods to improve the quantification of lesions. All anatomy-based PVC techniques achieve complete recovery of the PET signal in cortical grey matter (GM) when performed in idealised conditions. Applying deconvolution-based approaches results in incomplete recovery due to premature termination of the iterative process. PVC techniques are sensitive to PSF mismatch, causing a bias of up to 16.7% in GM recovery when over-estimating the PSF by 3 mm. The recovery of both GM and a simulated lesion was improved by combining two PVC techniques together. The PETPVC toolbox has been written in C++, supports Windows, Mac and Linux operating systems, is open-source and publicly available.
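
    As a concrete, if simplified, example of the deconvolution-based family mentioned above, the sketch below applies van Cittert iterative deconvolution with an isotropic Gaussian PSF. The PSF model, step size and iteration count are assumptions of this illustration, not the PETPVC C++ implementation; stopping the loop early is exactly the premature termination that leaves recovery incomplete.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def van_cittert(image, psf_sigma_vox, alpha=1.0, n_iter=30):
            """Iteratively sharpen a PET volume blurred by a Gaussian PSF (sigma in voxels)."""
            estimate = image.astype(float).copy()
            for _ in range(n_iter):
                blurred = gaussian_filter(estimate, psf_sigma_vox)
                estimate = estimate + alpha * (image - blurred)   # push estimate toward the data
                np.clip(estimate, 0, None, out=estimate)          # keep activity non-negative
            return estimate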

  9. Characterization of Disease-Related Covariance Topographies with SSMPCA Toolbox: Effects of Spatial Normalization and PET Scanners

    PubMed Central

    Peng, Shichun; Ma, Yilong; Spetsieris, Phoebe G; Mattis, Paul; Feigin, Andrew; Dhawan, Vijay; Eidelberg, David

    2013-01-01

    In order to generate imaging biomarkers from disease-specific brain networks, we have implemented a general toolbox to rapidly perform scaled subprofile modeling (SSM) based on principal component analysis (PCA) on brain images of patients and normals. This SSMPCA toolbox can define spatial covariance patterns whose expression in individual subjects can discriminate patients from controls or predict behavioral measures. The technique may depend on differences in spatial normalization algorithms and brain imaging systems. We have evaluated the reproducibility of characteristic metabolic patterns generated by SSMPCA in patients with Parkinson's disease (PD). We used [18F]fluorodeoxyglucose PET scans from PD patients and normal controls. Motor-related (PDRP) and cognition-related (PDCP) metabolic patterns were derived from images spatially normalized using four versions of SPM software (spm99, spm2, spm5 and spm8). Differences between these patterns and subject scores were compared across multiple independent groups of patients and control subjects. These patterns and subject scores were highly reproducible with different normalization programs in terms of disease discrimination and cognitive correlation. Subject scores were also comparable in PD patients imaged across multiple PET scanners. Our findings confirm a very high degree of consistency among brain networks and their clinical correlates in PD using images normalized in four different SPM platforms. SSMPCA toolbox can be used reliably for generating disease-specific imaging biomarkers despite the continued evolution of image preprocessing software in the neuroimaging community. Network expressions can be quantified in individual patients independent of different physical characteristics of PET cameras. PMID:23671030

  10. Characterization of disease-related covariance topographies with SSMPCA toolbox: effects of spatial normalization and PET scanners.

    PubMed

    Peng, Shichun; Ma, Yilong; Spetsieris, Phoebe G; Mattis, Paul; Feigin, Andrew; Dhawan, Vijay; Eidelberg, David

    2014-05-01

    To generate imaging biomarkers from disease-specific brain networks, we have implemented a general toolbox to rapidly perform scaled subprofile modeling (SSM) based on principal component analysis (PCA) on brain images of patients and normals. This SSMPCA toolbox can define spatial covariance patterns whose expression in individual subjects can discriminate patients from controls or predict behavioral measures. The technique may depend on differences in spatial normalization algorithms and brain imaging systems. We have evaluated the reproducibility of characteristic metabolic patterns generated by SSMPCA in patients with Parkinson's disease (PD). We used [18F]fluorodeoxyglucose PET scans from patients with PD and normal controls. Motor-related (PDRP) and cognition-related (PDCP) metabolic patterns were derived from images spatially normalized using four versions of SPM software (spm99, spm2, spm5, and spm8). Differences between these patterns and subject scores were compared across multiple independent groups of patients and control subjects. These patterns and subject scores were highly reproducible with different normalization programs in terms of disease discrimination and cognitive correlation. Subject scores were also comparable in patients with PD imaged across multiple PET scanners. Our findings confirm a very high degree of consistency among brain networks and their clinical correlates in PD using images normalized in four different SPM platforms. SSMPCA toolbox can be used reliably for generating disease-specific imaging biomarkers despite the continued evolution of image preprocessing software in the neuroimaging community. Network expressions can be quantified in individual patients independent of different physical characteristics of PET cameras. Copyright © 2013 Wiley Periodicals, Inc.
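
    A compact sketch of the underlying SSM/PCA computation (log transform, double centering to form subject residual profiles, then principal component extraction) follows. The array shapes and the use of a plain SVD are assumptions of this illustration rather than details of the SSMPCA toolbox.

        import numpy as np

        def ssm_pca(scans, n_components=5):
            """scans: (n_subjects, n_voxels) array of strictly positive voxel values."""
            logged = np.log(scans)
            srp = logged - logged.mean(axis=1, keepdims=True)   # remove each subject's mean
            srp -= srp.mean(axis=0, keepdims=True)              # remove the group mean profile
            U, s, Vt = np.linalg.svd(srp, full_matrices=False)  # PCA via SVD of residual profiles
            patterns = Vt[:n_components]                        # spatial covariance patterns
            scores = srp @ patterns.T                           # subject expression scores
            return patterns, scores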

  11. MagPy: A Python toolbox for controlling Magstim transcranial magnetic stimulators.

    PubMed

    McNair, Nicolas A

    2017-01-30

    To date, transcranial magnetic stimulation (TMS) studies manipulating stimulation parameters have largely used blocked paradigms. However, altering these parameters on a trial-by-trial basis in Magstim stimulators is complicated by the need to send regular (1 Hz) commands to the stimulator. Additionally, effecting such control interferes with the ability to send TMS pulses or simultaneously present stimuli with high temporal precision. This manuscript presents the MagPy toolbox, a Python software package that provides full control over Magstim stimulators via the serial port. It is able to maintain this control with no impact on concurrent processing, such as stimulus delivery. In addition, a specially-designed "QuickFire" serial cable is specified that allows MagPy to trigger TMS pulses with very low latency. In a series of experimental simulations, MagPy was able to maintain uninterrupted remote control over the connected Magstim stimulator across all testing sessions. In addition, having MagPy enabled had no effect on stimulus timing - all stimuli were presented for precisely the duration specified. Finally, using the QuickFire cable, MagPy was able to elicit TMS pulses with sub-millisecond latencies. The MagPy toolbox allows for experiments that require manipulating stimulation parameters from trial to trial. Furthermore, it can achieve this in contexts that require tight control over timing, such as those seeking to combine TMS with fMRI or EEG. Together, the MagPy toolbox and QuickFire serial cable provide an effective means for controlling Magstim stimulators during experiments while ensuring high-precision timing. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. FACET - a "Flexible Artifact Correction and Evaluation Toolbox" for concurrently recorded EEG/fMRI data.

    PubMed

    Glaser, Johann; Beisteiner, Roland; Bauer, Herbert; Fischmeister, Florian Ph S

    2013-11-09

    In concurrent EEG/fMRI recordings, EEG data are impaired by the fMRI gradient artifacts, which exceed the EEG signal by several orders of magnitude. While several algorithms exist to correct the EEG data, these algorithms lack the flexibility to either leave out or add new steps. The open-source MATLAB toolbox FACET presented here is a modular toolbox for the fast and flexible correction and evaluation of imaging artifacts from concurrently recorded EEG datasets. It consists of an Analysis, a Correction and an Evaluation framework allowing the user to choose from different artifact correction methods with various pre- and post-processing steps to form flexible combinations. The quality of the chosen correction approach can then be evaluated and compared to different settings. FACET was evaluated on a dataset provided with the FMRIB plugin for EEGLAB using two different correction approaches: Averaged Artifact Subtraction (AAS, Allen et al., NeuroImage 12(2):230-239, 2000) and the FMRI Artifact Slice Template Removal (FASTR, Niazy et al., NeuroImage 28(3):720-737, 2005). The obtained results were compared to the FASTR algorithm implemented in the EEGLAB plugin FMRIB. No differences were found between the FACET implementation of FASTR and the original algorithm across all gradient-artifact-relevant performance indices. The FACET toolbox not only provides facilities for all three tasks (data analysis, artifact correction, and evaluation and documentation of the results) but also offers an easily extendable framework for development and evaluation of new approaches.
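
    For readers unfamiliar with AAS, the sketch below shows its core idea in a few lines: average EEG epochs locked to the scanner's volume or slice triggers to build an artifact template, then subtract that template from every epoch. Trigger extraction, epoch length and channel layout are assumptions of the sketch, not the FACET MATLAB implementation.

        import numpy as np

        def aas_correct(eeg, trigger_samples, epoch_len):
            """eeg: (n_channels, n_samples); trigger_samples: gradient-artifact onsets in samples."""
            n_samples = eeg.shape[1]
            onsets = [t for t in trigger_samples if t + epoch_len <= n_samples]
            epochs = np.stack([eeg[:, t:t + epoch_len] for t in onsets])
            template = epochs.mean(axis=0)            # averaged gradient-artifact template
            corrected = eeg.copy()
            for t in onsets:
                corrected[:, t:t + epoch_len] -= template
            return corrected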

  13. An efficient General Transit Feed Specification (GTFS) enabled algorithm for dynamic transit accessibility analysis.

    PubMed

    Fayyaz S, S Kiavash; Liu, Xiaoyue Cathy; Zhang, Guohui

    2017-01-01

    The social functions of urbanized areas are highly dependent on and supported by convenient access to public transportation systems, particularly for less privileged populations with restricted auto ownership. To accurately evaluate public transit accessibility, it is critical to capture the spatiotemporal variation of transit services. This can be achieved by measuring the shortest paths or minimum travel time between origin-destination (OD) pairs at each time-of-day (e.g. every minute). In recent years, General Transit Feed Specification (GTFS) data has been gaining popularity for between-station travel time estimation due to its interoperability in spatiotemporal analytics. Many software packages, such as ArcGIS, have developed toolboxes to enable travel time estimation with GTFS. They perform reasonably well in calculating travel time between OD pairs for a specific time-of-day (e.g. 8:00 AM), yet can become computationally inefficient and impractical as the data dimensions increase (e.g. all times-of-day and large networks). In this paper, we introduce a new algorithm that is computationally elegant and mathematically efficient to address this issue. An open-source toolbox written in C++ is developed to implement the algorithm. We implemented the algorithm on the City of St. George's transit network to showcase the accessibility analysis enabled by the toolbox. The experimental evidence shows a significant reduction in computational time. The proposed algorithm and toolbox are easily transferable to other transit networks, allowing transit agencies and researchers to perform high-resolution transit performance analysis.
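
    One simple way to obtain earliest-arrival times for every minute of the day from GTFS stop_times is a single pass over connections sorted by departure time, as sketched below. This connection-scan-style illustration only shows the kind of time-dependent query involved; it ignores minimum transfer times and walking links and is not the authors' C++ toolbox.

        def earliest_arrival(connections, origin, departure_time):
            """connections: (dep_stop, arr_stop, dep_time, arr_time) tuples in seconds,
            sorted by dep_time, e.g. derived from stop_times.txt."""
            best = {origin: departure_time}
            for dep_stop, arr_stop, dep_t, arr_t in connections:
                if best.get(dep_stop, float('inf')) <= dep_t and arr_t < best.get(arr_stop, float('inf')):
                    best[arr_stop] = arr_t
            return best   # earliest arrival time at every reachable stop

        # Repeating the query for each departure minute yields the time-of-day
        # accessibility profile discussed in the abstract.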

  14. An efficient General Transit Feed Specification (GTFS) enabled algorithm for dynamic transit accessibility analysis

    PubMed Central

    Fayyaz S., S. Kiavash; Zhang, Guohui

    2017-01-01

    The social functions of urbanized areas are highly dependent on and supported by convenient access to public transportation systems, particularly for less privileged populations with restricted auto ownership. To accurately evaluate public transit accessibility, it is critical to capture the spatiotemporal variation of transit services. This can be achieved by measuring the shortest paths or minimum travel time between origin-destination (OD) pairs at each time-of-day (e.g. every minute). In recent years, General Transit Feed Specification (GTFS) data has been gaining popularity for between-station travel time estimation due to its interoperability in spatiotemporal analytics. Many software packages, such as ArcGIS, have developed toolboxes to enable travel time estimation with GTFS. They perform reasonably well in calculating travel time between OD pairs for a specific time-of-day (e.g. 8:00 AM), yet can become computationally inefficient and impractical as the data dimensions increase (e.g. all times-of-day and large networks). In this paper, we introduce a new algorithm that is computationally elegant and mathematically efficient to address this issue. An open-source toolbox written in C++ is developed to implement the algorithm. We implemented the algorithm on the City of St. George’s transit network to showcase the accessibility analysis enabled by the toolbox. The experimental evidence shows a significant reduction in computational time. The proposed algorithm and toolbox are easily transferable to other transit networks, allowing transit agencies and researchers to perform high-resolution transit performance analysis. PMID:28981544

  15. MNPBEM - A Matlab toolbox for the simulation of plasmonic nanoparticles

    NASA Astrophysics Data System (ADS)

    Hohenester, Ulrich; Trügler, Andreas

    2012-02-01

    MNPBEM is a Matlab toolbox for the simulation of metallic nanoparticles (MNP), using a boundary element method (BEM) approach. The main purpose of the toolbox is to solve Maxwell's equations for a dielectric environment where bodies with homogeneous and isotropic dielectric functions are separated by abrupt interfaces. Although the approach is in principle suited for arbitrary body sizes and photon energies, it is tested (and probably works best) for metallic nanoparticles with sizes ranging from a few to a few hundred nanometers, and for frequencies in the optical and near-infrared regime. The toolbox has been implemented with Matlab classes. These classes can be easily combined, which has the advantage that one can adapt the simulation programs flexibly for various applications.

    Program summary:
    Program title: MNPBEM
    Catalogue identifier: AEKJ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKJ_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License v2
    No. of lines in distributed program, including test data, etc.: 15 700
    No. of bytes in distributed program, including test data, etc.: 891 417
    Distribution format: tar.gz
    Programming language: Matlab 7.11.0 (R2010b)
    Computer: Any which supports Matlab 7.11.0 (R2010b)
    Operating system: Any which supports Matlab 7.11.0 (R2010b)
    RAM: ⩾1 GByte
    Classification: 18
    Nature of problem: Solve Maxwell's equations for dielectric particles with homogeneous dielectric functions separated by abrupt interfaces.
    Solution method: Boundary element method using electromagnetic potentials.
    Running time: Depending on surface discretization, between seconds and hours.

  16. Geo-PUMMA: Urban and Periurban Landscape Representation Toolbox for Hydrological Distributed Modeling

    NASA Astrophysics Data System (ADS)

    Sanzana, Pedro; Gironas, Jorge; Braud, Isabelle; Branger, Flora; Rodriguez, Fabrice; Vargas, Ximena; Hitschfeld, Nancy; Francisco Munoz, Jose

    2016-04-01

    In addition to land use changes, the process of urbanization can modify the direction of the surface and sub-surface flows, generating complex environments and increasing the types of connectivity between pervious and impervious areas. Thus, hydrological pathways in urban and periurban areas are significantly affected by artificial elements like channels, pipes, streets and other elements of storm water systems. This work presents Geo-PUMMA, a new GIS toolbox to generate vectorial meshes for distributed hydrological modeling and extract the drainage network in urban and periurban terrain. Geo-PUMMA gathers spatial information maps (e.g. cadastral, soil types, geology and digital elevation models) to produce Hydrological Response Units (HRU) and Urban Hydrological Elements (UHE). Geo-PUMMA includes tools to improve the initial mesh derived from GIS layers intersection in order to respect geometrical constraints, which ensures numerical stability while preserving the shape of the initial HRUs and minimizing the small elements to lower computing times. The geometrical constraints taken into account include: elements convexity, limitation of the number of sliver elements (e.g. roads) and of very small or very large elements. This toolbox allows the representation of basins at small scales (0.1-10km2), as it takes into account the hydrological connectivity of the main elements explicitly, and improves the representation of water pathways compared with classical raster approaches. Geo-PUMMA also allows the extraction of basin morphologic properties such as the width function, the area function and the imperviousness function. We applied this new toolbox to two periurban catchments: the Mercier catchment located near Lyon, France, and the Estero El Guindo catchment located in the Andean piedmont in the Maipo River, Chile. We use the capability of Geo-PUMMA to generate three different meshes. The first one is the initial mesh derived from the direct intersection of GIS layers. The second one is based on fine triangulation of HRUs and is considered the best one we can obtain (reference mesh). The third one is the recommended mesh, preserving the shape of the initial HRUs and limiting the number of elements. The representation of the drainage network and its morphological properties is compared between the three meshes. This comparison shows that the drainage network representation is particularly improved at small to medium spatial scales when using the recommended meshes (i.e. 120-150 m for the El Guindo catchment and 80-150 m for the Mercier catchment). The results also show that the recommended mesh correctly represents the main features of the drainage network as compared to the reference mesh. KEYWORDS: GRASS-GIS, Computer-assisted mesh generation, periurban catchments

  17. A quantitative framework for flower phenotyping in cultivated carnation (Dianthus caryophyllus L.).

    PubMed

    Chacón, Borja; Ballester, Roberto; Birlanga, Virginia; Rolland-Lagan, Anne-Gaëlle; Pérez-Pérez, José Manuel

    2013-01-01

    The most important breeding goals in ornamental crops are plant appearance and flower characteristics, for which selection is performed visually on the direct offspring of crosses. We developed an image analysis toolbox for the acquisition of flower and petal images from cultivated carnation (Dianthus caryophyllus L.), which we validated through a detailed analysis of flower and petal size and shape in 78 commercial cultivars of D. caryophyllus, including 55 standard, 22 spray and 1 pot carnation cultivars. Correlation analyses allowed us to reduce the number of parameters accounting for the observed variation in flower and petal morphology. Convexity was used as a descriptor for the level of serration in flowers and petals. We used a landmark-based approach that allowed us to identify eight main principal components (PCs) accounting for most of the variance observed in petal shape. The effect and the strength of these PCs in standard and spray carnation cultivars are consistent with shared underlying mechanisms involved in the morphological diversification of petals in both subpopulations. Our results also indicate that neighbor-joining trees built with morphological data might infer certain phylogenetic relationships among carnation cultivars. Based on estimated broad-sense heritability values for some flower and petal features, different genetic determinants likely modulate the responses of flower and petal morphology to environmental cues in this species. We believe our image analysis toolbox could capture flower variation in other species of high ornamental value.
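
    The convexity descriptor mentioned above (object area divided by the area of its convex hull) is straightforward to compute from a binary flower or petal mask, as in the sketch below. The segmentation that produces the mask is assumed, and the code is an illustration rather than the authors' toolbox.

        import numpy as np
        from skimage.morphology import convex_hull_image

        def convexity(mask):
            """mask: 2-D boolean array where True marks flower or petal pixels."""
            hull = convex_hull_image(mask)
            return mask.sum() / hull.sum()   # 1.0 for a convex outline, lower for serrated margins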

  18. AnisoVis: a MATLAB™ toolbox for the visualisation of elastic anisotropy

    NASA Astrophysics Data System (ADS)

    Healy, D.; Timms, N.; Pearce, M. A.

    2016-12-01

    The elastic properties of rocks and minerals vary with direction, and this has significant consequences for their physical response to acoustic waves and natural or imposed stresses. This anisotropy of elasticity is well described mathematically by 4th rank tensors of stiffness or compliance. These tensors are not easy to visualise in a single diagram or graphic, and visualising Poisson's ratio and shear modulus presents a further challenge in that their anisotropy depends on two principal directions. Students and researchers can easily underestimate the importance of elastic anisotropy. This presentation describes an open source toolbox of MATLAB scripts that aims to visualise elastic anisotropy in rocks and minerals. The code produces linked 2-D and 3-D representations of the standard elastic constants, such as Young's modulus, Poisson's ratio and shear modulus, all from a simple GUI. The 3-D plots can be manipulated by the user (rotated, panned, zoomed), to encourage investigation and a deeper understanding of directional variations in the fundamental properties. Examples are presented of common rock forming minerals, including those with negative Poisson's ratio (auxetic behaviour). We hope that an open source code base will encourage further enhancements from the rock physics and wider geoscience communities. Eventually, we hope to generate 3-D prints of these complex and beautiful natural surfaces to provide a tactile link to the underlying physics of elastic anisotropy.
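
    One of the directional quantities such a toolbox visualises, the Young's modulus along a unit direction n, follows from the compliance tensor as E(n) = 1 / (n_i n_j n_k n_l S_ijkl). The sketch below evaluates this from a full fourth-rank compliance tensor; building that tensor from a 6x6 Voigt matrix requires the usual factor-of-2 and factor-of-4 bookkeeping and is omitted here.

        import numpy as np

        def youngs_modulus(S, direction):
            """S: 3x3x3x3 compliance tensor (1/GPa); direction: 3-vector (need not be normalised)."""
            n = np.asarray(direction, dtype=float)
            n /= np.linalg.norm(n)
            return 1.0 / np.einsum('i,j,k,l,ijkl->', n, n, n, n, S)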

  19. The RAVEN Toolbox and Its Use for Generating a Genome-scale Metabolic Model for Penicillium chrysogenum

    PubMed Central

    Agren, Rasmus; Liu, Liming; Shoaie, Saeed; Vongsangnak, Wanwipa; Nookaew, Intawat; Nielsen, Jens

    2013-01-01

    We present the RAVEN (Reconstruction, Analysis and Visualization of Metabolic Networks) Toolbox: a software suite that allows for semi-automated reconstruction of genome-scale models. It makes use of published models and/or the KEGG database, coupled with extensive gap-filling and quality control features. The software suite also contains methods for visualizing simulation results and omics data, as well as a range of methods for performing simulations and analyzing the results. The software is a useful tool for system-wide data analysis in a metabolic context and for streamlined reconstruction of metabolic networks based on protein homology. The RAVEN Toolbox workflow was applied in order to reconstruct a genome-scale metabolic model for the important microbial cell factory Penicillium chrysogenum Wisconsin54-1255. The model was validated in a bibliomic study of 440 references in total, and it comprises 1471 unique biochemical reactions and 1006 ORFs. It was then used to study the roles of ATP and NADPH in the biosynthesis of penicillin, and to identify potential metabolic engineering targets for maximization of penicillin production. PMID:23555215
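
    Genome-scale models of the kind RAVEN reconstructs are typically analysed with flux balance analysis, a linear programme that maximises a target flux subject to steady-state mass balance and flux bounds. The sketch below is a minimal, generic Python illustration of that analysis step on a three-reaction toy network; it is not RAVEN code, and the network and bounds are invented for the example.

        import numpy as np
        from scipy.optimize import linprog

        # Toy network: uptake -> A, A -> B, B -> biomass (export).
        # Rows = metabolites (A, B), columns = reactions (v1, v2, v3).
        S = np.array([[1.0, -1.0,  0.0],
                      [0.0,  1.0, -1.0]])
        bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 units

        # Maximise the biomass flux v3 (linprog minimises, hence the sign flip).
        res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds, method="highs")
        print("optimal fluxes:", res.x, "biomass:", -res.fun)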

  20. KNIME4NGS: a comprehensive toolbox for next generation sequencing analysis.

    PubMed

    Hastreiter, Maximilian; Jeske, Tim; Hoser, Jonathan; Kluge, Michael; Ahomaa, Kaarin; Friedl, Marie-Sophie; Kopetzky, Sebastian J; Quell, Jan-Dominik; Mewes, H Werner; Küffner, Robert

    2017-05-15

    Analysis of Next Generation Sequencing (NGS) data requires the processing of large datasets by chaining various tools with complex input and output formats. In order to automate data analysis, we propose to standardize NGS tasks into modular workflows. This simplifies reliable handling and processing of NGS data, and corresponding solutions become substantially more reproducible and easier to maintain. Here, we present a documented, Linux-based toolbox of 42 processing modules that are combined to construct workflows facilitating a variety of tasks such as DNAseq and RNAseq analysis. We also describe important technical extensions. The high throughput executor (HTE) helps to increase the reliability and to reduce manual interventions when processing complex datasets. We also provide a dedicated binary manager that assists users in obtaining the modules' executables and keeping them up to date. As the basis for this actively developed toolbox, we use the workflow management software KNIME. See http://ibisngs.github.io/knime4ngs for nodes and user manual (GPLv3 license). robert.kueffner@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.

  1. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling

    PubMed Central

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-01-01

    Summary: Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in the form of qualitative if–then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MATLAB™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270

  2. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    PubMed

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

    Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MATLAB™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/
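
    ADMIT itself is a MATLAB toolbox built around constraint satisfaction and convex relaxations; purely as an illustration of the guaranteed-invalidation idea, the Python sketch below poses consistency of a model that is linear in its parameters as a linear feasibility problem (no relaxation is needed in that special case). The model, parameter boxes and data are invented for the example.

        import numpy as np
        from scipy.optimize import linprog

        def invalidated(Phi, y, eps, theta_bounds):
            """Return True if no parameter in the box can reproduce y within +/- eps.

            Model assumed linear in the parameters: y_model = Phi @ theta.
            Consistency is a pure feasibility question, posed as a linear program
            with a zero objective.
            """
            A_ub = np.vstack([Phi, -Phi])                 #  Phi@theta <= y + eps
            b_ub = np.concatenate([y + eps, -(y - eps)])  # -Phi@theta <= -(y - eps)
            res = linprog(c=np.zeros(Phi.shape[1]), A_ub=A_ub, b_ub=b_ub,
                          bounds=theta_bounds, method="highs")
            return not res.success

        # data generated by theta = (2, -1); the first hypothesis box excludes it
        t = np.linspace(0, 1, 20)
        Phi = np.column_stack([t, np.ones_like(t)])
        y = 2.0 * t - 1.0
        print(invalidated(Phi, y, eps=0.05, theta_bounds=[(0, 1), (0, 1)]))   # True
        print(invalidated(Phi, y, eps=0.05, theta_bounds=[(0, 3), (-2, 0)]))  # False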

  3. Robust Correlation Analyses: False Positive and Power Validation Using a New Open Source Matlab Toolbox

    PubMed Central

    Pernet, Cyril R.; Wilcox, Rand; Rousselet, Guillaume A.

    2012-01-01

    Pearson’s correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free MATLAB®-based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down-weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand. PMID:23335907

  4. Robust correlation analyses: false positive and power validation using a new open source matlab toolbox.

    PubMed

    Pernet, Cyril R; Wilcox, Rand; Rousselet, Guillaume A

    2012-01-01

    Pearson's correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free MATLAB®-based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down-weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand.
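
    The toolbox's percentage-bend and skipped correlations are implemented in MATLAB; the short Python sketch below only illustrates the problem they address, showing how a single bivariate outlier inflates Pearson's r while a rank-based measure is less affected, followed by a crude median-distance trimming step. That trimming is a simplification for illustration, not the toolbox's robust-distance skipped correlation, and it assumes a recent SciPy (median_abs_deviation).

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        x = rng.normal(size=50)
        y = 0.3 * x + rng.normal(scale=1.0, size=50)   # weak true association

        # add a single bivariate outlier
        x_o = np.append(x, 8.0)
        y_o = np.append(y, 8.0)

        print("Pearson  without / with outlier:",
              round(stats.pearsonr(x, y)[0], 2), round(stats.pearsonr(x_o, y_o)[0], 2))
        print("Spearman without / with outlier:",
              round(stats.spearmanr(x, y)[0], 2), round(stats.spearmanr(x_o, y_o)[0], 2))

        # crude trimming: drop points far from the bulk of the data
        z = np.hypot(x_o - np.median(x_o), y_o - np.median(y_o))
        keep = z < np.median(z) + 3 * stats.median_abs_deviation(z)
        print("Pearson after trimming:", round(stats.pearsonr(x_o[keep], y_o[keep])[0], 2))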

  5. A Toolbox for Ab Initio 3-D Reconstructions in Single-particle Electron Microscopy

    PubMed Central

    Voss, Neil R; Lyumkis, Dmitry; Cheng, Anchi; Lau, Pick-Wei; Mulder, Anke; Lander, Gabriel C; Brignole, Edward J; Fellmann, Denis; Irving, Christopher; Jacovetty, Erica L; Leung, Albert; Pulokas, James; Quispe, Joel D; Winkler, Hanspeter; Yoshioka, Craig; Carragher, Bridget; Potter, Clinton S

    2010-01-01

    Structure determination of a novel macromolecular complex via single-particle electron microscopy depends upon overcoming the challenge of establishing a reliable 3-D reconstruction using only 2-D images. There are a variety of strategies that deal with this issue, but not all of them are readily accessible and straightforward to use. We have developed a “toolbox” of ab initio reconstruction techniques that provide several options for calculating 3-D volumes in an easily managed and tightly controlled work-flow that adheres to standard conventions and formats. This toolbox is designed to streamline the reconstruction process by removing the necessity for bookkeeping, while facilitating transparent data transfer between different software packages. It currently includes procedures for calculating ab initio reconstructions via random or orthogonal tilt geometry, tomograms, and common lines, all of which have been tested using the 50S ribosomal subunit. Our goal is that the accessibility of multiple independent reconstruction algorithms via this toolbox will improve the ease with which models can be generated, and provide a means of evaluating the confidence and reliability of the final reconstructed map. PMID:20018246

  6. PredPsych: A toolbox for predictive machine learning-based approach in experimental psychology research.

    PubMed

    Koul, Atesh; Becchio, Cristina; Cavallo, Andrea

    2017-12-12

    Recent years have seen an increased interest in machine learning-based predictive methods for analyzing quantitative behavioral data in experimental psychology. While these methods can achieve relatively greater sensitivity compared to conventional univariate techniques, they still lack an established and accessible implementation. The aim of the current work was to build an open-source R toolbox - "PredPsych" - that could make these methods readily available to all psychologists. PredPsych is a user-friendly R toolbox based on machine-learning predictive algorithms. In this paper, we present the framework of PredPsych via the analysis of a recently published multiple-subject motion capture dataset. In addition, we discuss examples of possible research questions that can be addressed with the machine-learning algorithms implemented in PredPsych and cannot be easily addressed with univariate statistical analysis. We anticipate that PredPsych will be of use to researchers with limited programming experience not only in the field of psychology, but also in that of clinical neuroscience, enabling computational assessment of putative bio-behavioral markers for both prognosis and diagnosis.
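
    PredPsych is an R toolbox; the Python/scikit-learn sketch below only illustrates the general predictive approach it advocates, estimating cross-validated decoding accuracy on synthetic "kinematic" features instead of running one univariate test per variable. Dataset, classifier and fold count are illustrative choices, not PredPsych's defaults.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # stand-in for motion-capture features: 120 trials x 15 kinematic variables
        X, y = make_classification(n_samples=120, n_features=15, n_informative=5,
                                   random_state=0)

        # cross-validated decoding accuracy, the kind of question univariate tests miss
        clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
        scores = cross_val_score(clf, X, y, cv=10)
        print(f"mean 10-fold accuracy: {scores.mean():.2f} (chance ~ 0.50)")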

  7. DPARSF: A MATLAB Toolbox for "Pipeline" Data Analysis of Resting-State fMRI.

    PubMed

    Chao-Gan, Yan; Yu-Feng, Zang

    2010-01-01

    Resting-state functional magnetic resonance imaging (fMRI) has attracted more and more attention because of its effectiveness, simplicity and non-invasiveness in exploration of the intrinsic functional architecture of the human brain. However, a user-friendly toolbox for "pipeline" data analysis of resting-state fMRI is still lacking. Based on some functions in Statistical Parametric Mapping (SPM) and the Resting-State fMRI Data Analysis Toolkit (REST), we have developed a MATLAB toolbox called Data Processing Assistant for Resting-State fMRI (DPARSF) for "pipeline" data analysis of resting-state fMRI. After the user arranges the Digital Imaging and Communications in Medicine (DICOM) files and clicks a few buttons to set parameters, DPARSF will then give all the preprocessed (slice timing, realign, normalize, smooth) data and results for functional connectivity, regional homogeneity, amplitude of low-frequency fluctuation (ALFF), and fractional ALFF. DPARSF can also create a report for excluding subjects with excessive head motion and generate a set of pictures for easily checking the effect of normalization. In addition, users can also use DPARSF to extract time courses from regions of interest.
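
    DPARSF computes ALFF and fALFF within SPM/REST in MATLAB; as a generic, hedged illustration of what those measures are, the Python sketch below estimates them for a single synthetic voxel time course using the conventional 0.01-0.08 Hz band. The averaging convention, TR and signal are illustrative assumptions, not DPARSF's implementation.

        import numpy as np

        def alff_falff(ts, tr_s, band=(0.01, 0.08)):
            """ALFF and fALFF for one voxel time course.

            ALFF  = mean amplitude (sqrt of power) within the low-frequency band.
            fALFF = ALFF divided by the mean amplitude over the whole spectrum.
            """
            ts = ts - ts.mean()
            freqs = np.fft.rfftfreq(ts.size, d=tr_s)
            amp = np.abs(np.fft.rfft(ts)) / ts.size
            in_band = (freqs >= band[0]) & (freqs <= band[1])
            alff = amp[in_band].mean()
            falff = alff / amp[freqs > 0].mean()
            return alff, falff

        # synthetic voxel: slow 0.03 Hz fluctuation plus noise, TR = 2 s, 200 volumes
        tr = 2.0
        t = np.arange(200) * tr
        ts = np.sin(2 * np.pi * 0.03 * t) + np.random.default_rng(0).normal(0, 0.5, t.size)
        print(alff_falff(ts, tr))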

  8. The influence of age and exercise modality on growth hormone bioactivity in women.

    PubMed

    Gordon, Scott E; Kraemer, William J; Looney, David P; Flanagan, Shawn D; Comstock, Brett A; Hymer, Wesley C

    2014-01-01

    Prior research has indicated that the loss of skeletal muscle mass and bone mineral density observed with aging is related to the prominent age-related decline in the concentration of serum growth hormone (GH). However, there are limited data on the effects of aging on GH responses to acute bouts of heavy resistance exercise (HRE) and aerobic exercise (AE). The present investigation examined the effects of a HRE protocol and an AE protocol on immunoreactive GH (IGH) and bioactive GH (BGH) in active young and old women. Older women had a diminished serum IGH response to both the HRE and AE protocols compared to the younger women; however, a similar difference was not observed in serum BGH. Additionally, the HRE protocol elicited a greater BGH response than the AE protocol exclusively in the younger group. Regardless of exercise mode, aging induces an increase in growth hormone polymerization that specifically results in a loss of serum growth hormone immunoreactivity without a concurrent loss of serum growth hormone bioactivity. The greater BGH response to the HRE protocol found in the younger group can be attributed to an unknown serum factor of molecular weight between 30 and 55 kD that either potentiated growth hormone bioactivity in response to HRE or inhibited growth hormone bioactivity in response to AE. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Development of an Online Well-Being Intervention for Young People: An Evaluation Protocol

    PubMed Central

    Bidargaddi, Niranjan; Blake, Victoria; Schrader, Geoffrey; Kaambwa, Billingsley; Quinn, Stephen; Orlowski, Simone; Winsall, Megan; Battersby, Malcolm

    2015-01-01

    Background: Research has shown that improving well-being using positive mental health interventions can be useful for predicting and preventing mental illness. Implementing online interventions may be an effective way to reach young people, given their familiarity with technology. Objective: This study will assess the effectiveness of a website called the “Online Wellbeing Centre (OWC),” designed for the support and improvement of mental health and well-being in young Australians aged between 16 and 25 years. As the active component of the study, the OWC will introduce a self-guided app recommendation service called “The Toolbox: The best apps for your brain and body” developed by ReachOut.com. The Toolbox is a responsive website that serves as a personalized, ongoing recommendation service for technology-based tools and apps to improve well-being. It allows users to personalize their experience according to their individual needs. Methods: This study will be a two-arm, randomized controlled trial following a wait-list control design. The primary outcome will be changes in psychological well-being measured by the Mental Health Continuum Short Form. The secondary outcomes will be drawn from a subsample of participants and will include depression scores measured by the Center for Epidemiologic Studies Depression Scale, and quality of life measured by the Assessment of Quality of Life-four dimensions (AQOL-4D) index. Cost-effectiveness analysis will be conducted based on a primary outcome of cost per unique visit to the OWC. Utility-based outcomes will also be incorporated into the analysis allowing a secondary outcome to be cost per quality-adjusted life year gained (based on the AQOL-4D values). Resource use associated with both the intervention and control groups will be collected using a customized questionnaire. Online- and community-based recruitment strategies will be implemented, and the effectiveness of each approach will be analyzed. Participants will be recruited from the general Australian population and randomized online. The trial will last for 4 weeks. Results: Small but clinically significant increases in well-being symptoms are expected to be detected in the intervention group compared with the control group. Conclusions: If this intervention proves to be effective, it will have an impact on the future design and implementation of online-based well-being interventions as a valid and cost-effective way to support mental health clinical treatment. Findings regarding recruitment effectiveness will also contribute to developing better ways to engage this population in research. Clinical Trial: This study is registered in the Australian New Zealand Clinical Trials Registry (ANZCTR): ACTRN12614000710628. PMID:25929201

  10. RSA-Based Password-Authenticated Key Exchange, Revisited

    NASA Astrophysics Data System (ADS)

    Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki

    The RSA-based Password-Authenticated Key Exchange (PAKE) protocols have been proposed to realize both mutual authentication and generation of secure session keys where a client is sharing his/her password only with a server and the latter should generate its RSA public/private key pair (e, n), (d, n) every time due to the lack of PKI (Public-Key Infrastructures). One of the ways to avoid a special kind of off-line (so called e-residue) attacks in the RSA-based PAKE protocols is to deploy a challenge/response method by which a client verifies the relative primality of e and φ(n) interactively with a server. However, this kind of RSA-based PAKE protocol did not give any proof of the underlying challenge/response method and therefore could not specify the exact complexity of the protocol, since there exists another security parameter needed in the challenge/response method. In this paper, we first present an RSA-based PAKE (RSA-PAKE) protocol that can deploy two different challenge/response methods (denoted by Challenge/Response Method1 and Challenge/Response Method2). The main contributions of this work include: (1) based on number theory, we prove that the Challenge/Response Method1 and the Challenge/Response Method2 are secure against e-residue attacks for any odd prime e; (2) with the security parameter for the on-line attacks, we show that the RSA-PAKE protocol is provably secure in the random oracle model where all of the off-line attacks are not more efficient than on-line dictionary attacks; and (3) by considering the Hamming weight of e and its complexity in the RSA-PAKE protocol, we search for primes to be recommended for practical use. We also compare the RSA-PAKE protocol with the previous ones mainly in terms of computation and communication complexities.
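
    The sketch below does not reproduce the PAKE protocol itself; it only illustrates, with toy primes, the number-theoretic condition the challenge/response methods are meant to guarantee: exponentiation by e is a bijection on the multiplicative group modulo n exactly when gcd(e, φ(n)) = 1, and a maliciously chosen e with gcd(e, φ(n)) > 1 produces colliding values of the kind e-residue attacks exploit.

        from math import gcd

        p, q = 11, 23                     # toy primes, far too small for real use
        n, phi = p * q, (p - 1) * (q - 1) # phi(n) = 220

        def collisions(e):
            """Count plaintext pairs that map to the same value of m^e mod n."""
            images = {}
            for m in range(1, n):
                if gcd(m, n) != 1:
                    continue
                images.setdefault(pow(m, e, n), []).append(m)
            return sum(len(v) - 1 for v in images.values())

        for e in (3, 5, 11):              # gcd(3,220)=1, gcd(5,220)=5, gcd(11,220)=11
            print(f"e={e:2d}  gcd(e, phi)={gcd(e, phi)}  collisions={collisions(e)}")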

  11. LASSIM-A network inference toolbox for genome-wide mechanistic modeling.

    PubMed

    Magnusson, Rasmus; Mariotti, Guido Pio; Köpsén, Mattias; Lövfors, William; Gawel, Danuta R; Jörnsten, Rebecka; Linde, Jörg; Nordling, Torbjörn E M; Nyman, Elin; Schulze, Sylvie; Nestor, Colm E; Zhang, Huan; Cedersund, Gunnar; Benson, Mikael; Tjärnberg, Andreas; Gustafsson, Mika

    2017-06-01

    Recent technological advancements have made time-resolved, quantitative, multi-omics data available for many model systems, which could be integrated for systems pharmacokinetic use. Here, we present large-scale simulation modeling (LASSIM), which is a novel mathematical tool for performing large-scale inference using mechanistically defined ordinary differential equations (ODEs) for gene regulatory networks (GRNs). LASSIM integrates structural knowledge about regulatory interactions and non-linear equations with multiple steady-state and dynamic response expression datasets. The rationale behind LASSIM is that biological GRNs can be simplified using a limited subset of core genes that are assumed to regulate all other gene transcription events in the network. The LASSIM method is implemented as a general-purpose toolbox using the PyGMO Python package to make the most of multicore computers and high-performance clusters, and is available at https://gitlab.com/Gustafsson-lab/lassim. As a method, LASSIM works in two steps, where it first infers a non-linear ODE system of the pre-specified core gene expression. Second, LASSIM in parallel optimizes the parameters that model the regulation of peripheral genes by core system genes. We showed the usefulness of this method by applying LASSIM to infer a large-scale non-linear model of naïve Th2 cell differentiation, made possible by integrating Th2-specific binding data and time-series measurements together with six public and six novel siRNA-mediated knock-down experiments. ChIP-seq showed significant overlap for all tested transcription factors. Next, we performed novel time-series measurements of total T-cells during differentiation towards Th2 and verified that our LASSIM model could reproduce those data significantly better than comparable models that used the same Th2 bindings. In summary, the LASSIM toolbox opens the door to a new type of model-based data analysis that combines the strengths of reliable mechanistic models with truly systems-level data. We demonstrate the power of this approach by inferring a mechanistically motivated, genome-wide model of the Th2 transcription regulatory system, which plays an important role in several immune-related diseases.
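
    LASSIM itself is a PyGMO-based toolbox for genome-wide ODE inference; the sketch below is only a minimal illustration of the underlying step of fitting a small mechanistic ODE "core" to time-series data, here a hypothetical two-gene system fitted with SciPy. Model structure, parameters and data are invented for the example.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import least_squares

        def core_odes(t, x, k):
            """Two-gene toy core: gene 1 activates gene 2, both decay linearly."""
            s1, d1, s2, d2 = k
            return [s1 - d1 * x[0],
                    s2 * x[0] / (1.0 + x[0]) - d2 * x[1]]

        def simulate(k, t_obs, x0=(0.1, 0.1)):
            sol = solve_ivp(core_odes, (t_obs[0], t_obs[-1]), x0, t_eval=t_obs, args=(k,))
            return sol.y.T

        # synthetic "expression" data from known parameters plus noise
        t_obs = np.linspace(0, 10, 15)
        k_true = np.array([1.0, 0.5, 2.0, 0.3])
        data = simulate(k_true, t_obs) + np.random.default_rng(2).normal(0, 0.02, (15, 2))

        def residuals(k):
            return (simulate(k, t_obs) - data).ravel()

        fit = least_squares(residuals, x0=np.ones(4), bounds=(1e-3, 10.0))
        print("estimated parameters:", np.round(fit.x, 2))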

  12. A Toolbox of Metrology-Based Techniques for Optical System Alignment

    NASA Technical Reports Server (NTRS)

    Coulter, Phillip; Ohl, Raymond G.; Blake, Peter N.; Bos, Brent J.; Casto, Gordon V.; Eichhorn, William L.; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hagopian, John G.; Hayden, Joseph E.

    2016-01-01

    The NASA Goddard Space Flight Center (GSFC) and its partners have broad experience in the alignment of flight optical instruments and spacecraft structures. Over decades, GSFC developed alignment capabilities and techniques for a variety of optical and aerospace applications. In this paper, we provide an overview of a subset of the capabilities and techniques used on several recent projects in a toolbox format. We discuss a range of applications, from small-scale optical alignment of sensors to mirror and bench examples that make use of various large-volume metrology techniques. We also discuss instruments and analytical tools.

  13. A Toolbox of Metrology-Based Techniques for Optical System Alignment

    NASA Technical Reports Server (NTRS)

    Coulter, Phillip; Ohl, Raymond G.; Blake, Peter N.; Bos, Brent J.; Eichhorn, William L.; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hagopian, John G.; Hayden, Joseph E.; Hetherington, Samuel E.

    2016-01-01

    The NASA Goddard Space Flight Center (GSFC) and its partners have broad experience in the alignment of flight optical instruments and spacecraft structures. Over decades, GSFC developed alignment capabilities and techniques for a variety of optical and aerospace applications. In this paper, we provide an overview of a subset of the capabilities and techniques used on several recent projects in a "toolbox" format. We discuss a range of applications, from small-scale optical alignment of sensors to mirror and bench examples that make use of various large-volume metrology techniques. We also discuss instruments and analytical tools.

  14. Getting a grip on glycans: A current overview of the metabolic oligosaccharide engineering toolbox.

    PubMed

    Sminia, Tjerk J; Zuilhof, Han; Wennekes, Tom

    2016-11-29

    This review discusses the advances in metabolic oligosaccharide engineering (MOE) from 2010 to 2016 with a focus on the structure, preparation, and reactivity of its chemical probes. A brief historical overview of MOE is followed by a comprehensive overview of the chemical probes currently available in the MOE molecular toolbox and the bioconjugation techniques they enable. The final part of the review focusses on the synthesis of a selection of probes and finishes with an outlook on recent and potential upcoming advances in the field of MOE. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Improving Cognitive Skills of the Industrial Robot

    NASA Astrophysics Data System (ADS)

    Bezák, Pavol

    2015-08-01

    At present, there are plenty of industrial robots that are programmed to do the same repetitive task all the time. Industrial robots doing this kind of job are not able to judge whether an action is correct, effective or good. Object detection, manipulation and grasping are challenging due to uncertainties in hand and object modeling, unknown contact types and object stiffness properties. In this paper, a proposal for an intelligent humanoid-hand object detection and grasping model is presented, assuming that the object properties are known. The control is simulated in MATLAB using Simulink/SimMechanics, the Neural Network Toolbox and the Computer Vision System Toolbox.

  16. ACCEPT: Introduction of the Adverse Condition and Critical Event Prediction Toolbox

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Santanu, Das; Janakiraman, Vijay Manikandan; Hosein, Stefan

    2015-01-01

    The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we introduce a generic framework developed in MATLAB® called ACCEPT (Adverse Condition and Critical Event Prediction Toolbox). ACCEPT is an architectural framework designed to compare and contrast the performance of a variety of machine learning and early warning algorithms, and to test the capability of these algorithms to robustly predict the onset of adverse events in any time-series data-generating system or process.

  17. GPELab, a Matlab toolbox to solve Gross-Pitaevskii equations II: Dynamics and stochastic simulations

    NASA Astrophysics Data System (ADS)

    Antoine, Xavier; Duboscq, Romain

    2015-08-01

    GPELab is a free Matlab toolbox for modeling and numerically solving large classes of systems of Gross-Pitaevskii equations that arise in the physics of Bose-Einstein condensates. The aim of this second paper, which follows (Antoine and Duboscq, 2014), is to first present the various pseudospectral schemes available in GPELab for computing the deterministic and stochastic nonlinear dynamics of Gross-Pitaevskii equations (Antoine, et al., 2013). Next, the corresponding GPELab functions are explained in detail. Finally, some numerical examples are provided to show how the code works for the complex dynamics of BEC problems.
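
    GPELab's pseudospectral schemes are implemented in MATLAB; the Python sketch below is only a minimal illustration of one widely used approach of this kind, a Strang-split time-splitting spectral step for the 1-D Gross-Pitaevskii equation with a harmonic trap. Grid size, interaction strength and time step are illustrative choices, not GPELab defaults.

        import numpy as np

        # 1-D GPE  i dpsi/dt = [-0.5 d2/dx2 + V(x) + g|psi|^2] psi, harmonic trap
        N, L, g, dt, steps = 256, 20.0, 50.0, 1e-3, 2000
        x = np.linspace(-L / 2, L / 2, N, endpoint=False)
        k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
        V = 0.5 * x**2

        psi = np.exp(-x**2 / 2).astype(complex)
        psi /= np.sqrt(np.sum(np.abs(psi)**2) * (L / N))      # normalise to 1

        for _ in range(steps):                                # Strang splitting
            psi *= np.exp(-0.5j * dt * (V + g * np.abs(psi)**2))
            psi = np.fft.ifft(np.exp(-0.5j * dt * k**2) * np.fft.fft(psi))
            psi *= np.exp(-0.5j * dt * (V + g * np.abs(psi)**2))

        norm = np.sum(np.abs(psi)**2) * (L / N)
        print("norm after propagation:", round(norm, 6))      # conserved, ~1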

  18. Kinematic simulation and analysis of robot based on MATLAB

    NASA Astrophysics Data System (ADS)

    Liao, Shuhua; Li, Jiong

    2018-03-01

    The history of industrial automation is characterized by rapidly changing technology, and the industrial robot is, without a doubt, a special kind of equipment. Using the matrix and plotting capabilities of the MATLAB environment, the coordinate system of each link is set up with the Denavit-Hartenberg (D-H) parameter method and the kinematic equations of the structure are derived. The Robotics Toolbox and GUIDE are then applied jointly to inverse kinematics analysis, path planning and simulation, providing a preliminary solution to the positioning problem of the students' vehicle-mounted mechanical arm so as to achieve the intended positioning.
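
    The study relies on the MATLAB Robotics Toolbox; purely as a generic illustration of the forward-kinematics step that the D-H parameter method provides, the Python sketch below chains standard D-H link transforms for an invented three-link arm. The link table and joint angles are placeholders, not the arm analysed in the paper.

        import numpy as np

        def dh_matrix(theta, d, a, alpha):
            """Homogeneous transform for one link, standard D-H convention."""
            ct, st = np.cos(theta), np.sin(theta)
            ca, sa = np.cos(alpha), np.sin(alpha)
            return np.array([[ct, -st * ca,  st * sa, a * ct],
                             [st,  ct * ca, -ct * sa, a * st],
                             [0.0,      sa,       ca,      d],
                             [0.0,     0.0,      0.0,    1.0]])

        def forward_kinematics(joint_angles, dh_table):
            """Chain the link transforms; dh_table rows are (d, a, alpha)."""
            T = np.eye(4)
            for q, (d, a, alpha) in zip(joint_angles, dh_table):
                T = T @ dh_matrix(q, d, a, alpha)
            return T

        # illustrative 3-link arm (lengths in metres)
        dh_table = [(0.1, 0.0, np.pi / 2), (0.0, 0.4, 0.0), (0.0, 0.3, 0.0)]
        T = forward_kinematics([0.3, -0.5, 0.8], dh_table)
        print("end-effector position:", np.round(T[:3, 3], 3))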

  19. DoOR 2.0 - Comprehensive Mapping of Drosophila melanogaster Odorant Responses

    NASA Astrophysics Data System (ADS)

    Münch, Daniel; Galizia, C. Giovanni

    2016-02-01

    Odors elicit complex patterns of activated olfactory sensory neurons. Knowing the complete olfactome, i.e. the responses in all sensory neurons for all relevant odorants, is desirable to understand olfactory coding. The DoOR project combines all available Drosophila odorant response data into a single consensus response matrix. Since its first release, many studies have been published: receptors have been deorphanized and several response profiles have been expanded. In this study, we add unpublished data to the odor-response profiles for four odorant receptors (Or10a, Or42b, Or47b, Or56a). We deorphanize Or69a, showing a broad response spectrum with the best ligands including 3-hydroxyhexanoate, alpha-terpineol, 3-octanol and linalool. We include all of these datasets in DoOR, provide a comprehensive update of both code and data, and new tools for data analyses and visualizations. The DoOR project has a web interface for quick queries (http://neuro.uni.kn/DoOR), and a downloadable, open source toolbox written in R, including all processed and original datasets. DoOR now gives reliable odorant responses for nearly all Drosophila olfactory responding units, listing 693 odorants, for a total of 7381 data points.

  20. Using Arduino microcontroller boards to measure response latencies.

    PubMed

    Schubert, Thomas W; D'Ausilio, Alessandro; Canto, Rosario

    2013-12-01

    Latencies of buttonpresses are a staple of cognitive science paradigms. Often keyboards are employed to collect buttonpresses, but their imprecision and variability decrease test power and increase the risk of false positives. Response boxes and data acquisition cards are precise but expensive and inflexible alternatives. We propose using open-source Arduino microcontroller boards as an inexpensive and flexible alternative. These boards connect to standard experimental software using a USB connection and a virtual serial port, or by emulating a keyboard. In our solution, an Arduino measures response latencies after being signaled the start of a trial, and communicates the latency and response back to the PC over a USB connection. We demonstrated the reliability, robustness, and precision of this communication in six studies. Test measures confirmed that the error added to the measurement had an SD of less than 1 ms. Alternatively, emulation of a keyboard results in similarly precise measurement. The Arduino performs as well as a serial response box, and better than a keyboard. In addition, our setup allows for the flexible integration of other sensors, and even actuators, to extend the cognitive science toolbox.
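
    The Arduino-side firmware and exact message format are described in the paper and are not reproduced here; the sketch below is a hypothetical host-side Python fragment using pySerial, assuming the board replies with one "latency,key" line per trial after a trial-start byte. The port name, baud rate, trigger byte and reply format are all placeholder assumptions.

        import serial  # pySerial

        # placeholders: adjust port, baud rate and trigger byte to match the firmware
        PORT, BAUD = "/dev/ttyACM0", 115200

        with serial.Serial(PORT, BAUD, timeout=2) as board:
            board.reset_input_buffer()
            board.write(b"T")                      # hypothetical "trial start" signal
            line = board.readline().decode().strip()
            if line:
                latency_ms, key = line.split(",")  # assumed "latency,key" reply format
                print(f"response '{key}' after {latency_ms} ms")
            else:
                print("no response before timeout")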

  1. A toolbox of genes, proteins, metabolites and promoters for improving drought tolerance in soybean includes the metabolite coumestrol and stomatal development genes.

    PubMed

    Tripathi, Prateek; Rabara, Roel C; Reese, R Neil; Miller, Marissa A; Rohila, Jai S; Subramanian, Senthil; Shen, Qingxi J; Morandi, Dominique; Bücking, Heike; Shulaev, Vladimir; Rushton, Paul J

    2016-02-09

    The purpose of this project was to identify metabolites, proteins, genes, and promoters associated with water stress responses in soybean. A number of these may serve as new targets for the biotechnological improvement of drought responses in soybean (Glycine max). We identified metabolites, proteins, and genes that are strongly up or down regulated during rapid water stress following removal from a hydroponics system. 163 metabolites showed significant changes during water stress in roots and 93 in leaves. The largest change was a root-specific 160-fold increase in the coumestan coumestrol making it a potential biomarker for drought and a promising target for improving drought responses. Previous reports suggest that coumestrol stimulates mycorrhizal colonization and under certain conditions mycorrhizal plants have improved drought tolerance. This suggests that coumestrol may be part of a call for help to the rhizobiome during stress. About 3,000 genes were strongly up-regulated by drought and we identified regulators such as ERF, MYB, NAC, bHLH, and WRKY transcription factors, receptor-like kinases, and calcium signaling components as potential targets for soybean improvement as well as the jasmonate and abscisic acid biosynthetic genes JMT, LOX1, and ABA1. Drought stressed soybean leaves show reduced mRNA levels of stomatal development genes including FAMA-like, MUTE-like and SPEECHLESS-like bHLH transcription factors and leaves formed after drought stress had a reduction in stomatal density of 22.34 % and stomatal index of 17.56 %. This suggests that reducing stomatal density may improve drought tolerance. MEME analyses suggest that ABRE (CACGT/CG), CRT/DRE (CCGAC) and a novel GTGCnTGC/G element play roles in transcriptional activation and these could form components of synthetic promoters to drive expression of transgenes. Using transformed hairy roots, we validated the increase in promoter activity of GmWRKY17 and GmWRKY67 during dehydration and after 20 μM ABA treatment. Our toolbox provides new targets and strategies for improving soybean drought tolerance and includes the coumestan coumestrol, transcription factors that regulate stomatal density, water stress-responsive WRKY gene promoters and a novel DNA element that appears to be enriched in water stress responsive promoters.

  2. FRACOR-software toolbox for deterministic mapping of fracture corridors in oil fields on AutoCAD platform

    NASA Astrophysics Data System (ADS)

    Ozkaya, Sait I.

    2018-03-01

    Fracture corridors are interconnected large fractures in a narrow, sub-vertical tabular array, which usually traverses the entire reservoir vertically and extends for several hundred meters laterally. Fracture corridors, with their huge conductivities, constitute an important element of many fractured reservoirs. Unlike small diffuse fractures, actual fracture corridors must be mapped deterministically for simulation or field development purposes. Fracture corridors can be identified and quantified definitively with borehole image logs and well testing. However, there are rarely sufficient image logs or well tests, and it is necessary to utilize various fracture corridor indicators with varying degrees of reliability. Integration of data from many different sources, in turn, requires a platform with powerful editing and layering capability. Available commercial reservoir characterization software packages, with layering and editing capabilities, can be cost-intensive. CAD packages are far more affordable and may easily acquire the versatility and power of commercial software packages with the addition of a small software toolbox. The objective of this communication is to present FRACOR, a software toolbox which enables deterministic 2D fracture corridor mapping and modeling on the AutoCAD platform. The FRACOR toolbox is written in AutoLISP and contains several independent routines to import and integrate available fracture corridor data from an oil field, and export results as text files. The resulting fracture corridor maps consist mainly of fracture corridors with different confidence levels, derived from a combination of static and dynamic data, and exclusion zones where no fracture corridor can exist. The exported text file of fracture corridors from FRACOR can be imported into an upscaling program to generate a fracture grid for dual-porosity simulation or used for field development and well planning.

  3. MRI Atlas-Based Measurement of Spinal Cord Injury Predicts Outcome in Acute Flaccid Myelitis.

    PubMed

    McCoy, D B; Talbott, J F; Wilson, Michael; Mamlouk, M D; Cohen-Adad, J; Wilson, Mark; Narvid, J

    2017-02-01

    Recent advances in spinal cord imaging analysis have led to the development of a robust anatomic template and atlas incorporated into an open-source platform referred to as the Spinal Cord Toolbox. Using the Spinal Cord Toolbox, we sought to correlate measures of GM, WM, and cross-sectional area pathology on T2 MR imaging with motor disability in patients with acute flaccid myelitis. Spinal cord imaging for 9 patients with acute flaccid myelitis was analyzed by using the Spinal Cord Toolbox. A semiautomated pipeline using the Spinal Cord Toolbox measured lesion involvement in GM, WM, and total spinal cord cross-sectional area. Proportions of GM, WM, and cross-sectional area affected by T2 hyperintensity were calculated across 3 ROIs: 1) center axial section of lesion; 2) full lesion segment; and 3) full cord atlas volume. Spearman rank order correlation was calculated to compare MR metrics with clinical measures of disability. Proportion of GM metrics at the center axial section significantly correlated with measures of motor impairment upon admission (r[9] = -0.78; P = .014) and at 3-month follow-up (r[9] = -0.66; P = .05). Further, proportion of GM extracted across the full lesion segment significantly correlated with initial motor impairment (r[9] = -0.74, P = .024). No significant correlation was found for proportion of WM or proportion of cross-sectional area with clinical disability. Atlas-based measures of proportion of GM T2 signal abnormality measured on a single axial MR imaging section and across the full lesion segment correlate with motor impairment and outcome in patients with acute flaccid myelitis. This is the first atlas-based study to correlate clinical outcomes with segmented measures of T2 signal abnormality in the spinal cord. © 2017 by American Journal of Neuroradiology.

  4. SacLab: A toolbox for saccade analysis to increase usability of eye tracking systems in clinical ophthalmology practice.

    PubMed

    Cercenelli, Laura; Tiberi, Guido; Corazza, Ivan; Giannaccare, Giuseppe; Fresina, Michela; Marcelli, Emanuela

    2017-01-01

    Many open source software packages have been recently developed to expand the usability of eye tracking systems to study oculomotor behavior, but none of these is specifically designed to encompass all the main functions required for creating eye tracking tests and for providing the automatic analysis of saccadic eye movements. The aim of this study is to introduce SacLab, an intuitive, freely-available MATLAB toolbox based on Graphical User Interfaces (GUIs) that we have developed to increase the usability of the ViewPoint EyeTracker (Arrington Research, Scottsdale, AZ, USA) in clinical ophthalmology practice. SacLab consists of four processing modules that enable the user to easily create visual stimuli tests (Test Designer), record saccadic eye movements (Data Recorder), analyze the recorded data to automatically extract saccadic parameters of clinical interest (Data Analyzer) and provide an aggregate analysis from multiple eye movement recordings (Saccade Analyzer), without requiring any programming effort by the user. A demo application of SacLab to carry out eye tracking tests for the analysis of horizontal saccades is reported. We tested the usability of the SacLab toolbox with three ophthalmologists who had no programming experience; the ophthalmologists were briefly trained in the use of the SacLab GUIs and were asked to perform the demo application. The toolbox received enthusiastic feedback from all the clinicians in terms of intuitiveness, ease of use and flexibility. Test creation and data processing were accomplished in 52 ± 21 s and 46 ± 19 s, respectively, using the SacLab GUIs. SacLab may represent a useful tool to ease the application of the ViewPoint EyeTracker system in clinical routine in ophthalmology. Copyright © 2016 Elsevier Ltd. All rights reserved.
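
    SacLab itself is a set of MATLAB GUIs around the ViewPoint EyeTracker; as a generic, hedged illustration of the kind of automatic saccade analysis it provides, the Python sketch below runs a simple velocity-threshold (I-VT-style) detector on a synthetic horizontal gaze trace and reports onset, duration, amplitude and peak velocity. The 30 deg/s threshold, sampling rate and signal are illustrative choices, not SacLab's parameters.

        import numpy as np

        def detect_saccades(gaze_deg, fs_hz, vel_thresh=30.0, min_dur_ms=10.0):
            """Velocity-threshold detection on a horizontal gaze trace.

            Returns a list of (onset_s, duration_ms, amplitude_deg, peak_vel_deg_s).
            """
            vel = np.gradient(gaze_deg) * fs_hz              # deg/s
            fast = np.abs(vel) > vel_thresh
            saccades, i = [], 0
            while i < fast.size:
                if fast[i]:
                    j = i
                    while j < fast.size and fast[j]:
                        j += 1
                    dur_ms = (j - i) / fs_hz * 1000.0
                    if dur_ms >= min_dur_ms:
                        saccades.append((i / fs_hz, dur_ms,
                                         abs(gaze_deg[j - 1] - gaze_deg[i]),
                                         np.max(np.abs(vel[i:j]))))
                    i = j
                else:
                    i += 1
            return saccades

        # synthetic 10-degree rightward saccade at t = 0.5 s, sampled at 220 Hz
        fs = 220.0
        t = np.arange(0, 1.0, 1 / fs)
        gaze = 10.0 / (1.0 + np.exp(-(t - 0.5) * 200))       # sigmoid step
        print(detect_saccades(gaze, fs))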

  5. The conservation physiology toolbox: status and opportunities

    PubMed Central

    Love, Oliver P; Hultine, Kevin R

    2018-01-01

    Abstract For over a century, physiological tools and techniques have been allowing researchers to characterize how organisms respond to changes in their natural environment and how they interact with human activities or infrastructure. Over time, many of these techniques have become part of the conservation physiology toolbox, which is used to monitor, predict, conserve, and restore plant and animal populations under threat. Here, we provide a summary of the tools that currently comprise the conservation physiology toolbox. By assessing patterns in articles that have been published in ‘Conservation Physiology’ over the past 5 years that focus on introducing, refining and validating tools, we provide an overview of where researchers are placing emphasis in terms of taxa and physiological sub-disciplines. Although there is certainly diversity across the toolbox, metrics of stress physiology (particularly glucocorticoids) and studies focusing on mammals have garnered the greatest attention, with both comprising the majority of publications (>45%). We also summarize the types of validations that are actively being completed, including those related to logistics (sample collection, storage and processing), interpretation of variation in physiological traits and relevance for conservation science. Finally, we provide recommendations for future tool refinement, with suggestions for: (i) improving our understanding of the applicability of glucocorticoid physiology; (ii) linking multiple physiological and non-physiological tools; (iii) establishing a framework for plant conservation physiology; (iv) assessing links between environmental disturbance, physiology and fitness; (v) appreciating opportunities for validations in under-represented taxa; and (vi) emphasizing tool validation as a core component of research programmes. Overall, we are confident that conservation physiology will continue to increase its applicability to more taxa, develop more non-invasive techniques, delineate where limitations exist, and identify the contexts necessary for interpretation in captivity and the wild. PMID:29942517

  6. MOtoNMS: A MATLAB toolbox to process motion data for neuromusculoskeletal modeling and simulation.

    PubMed

    Mantoan, Alice; Pizzolato, Claudio; Sartori, Massimo; Sawacha, Zimi; Cobelli, Claudio; Reggiani, Monica

    2015-01-01

    Neuromusculoskeletal modeling and simulation enable investigation of the neuromusculoskeletal system and its role in human movement dynamics. These methods are progressively being introduced into daily clinical practice. However, a major factor limiting this translation is the lack of robust tools for the pre-processing of experimental movement data for their use in neuromusculoskeletal modeling software. This paper presents MOtoNMS (matlab MOtion data elaboration TOolbox for NeuroMusculoSkeletal applications), a toolbox freely available to the community that aims to fill this gap. MOtoNMS processes experimental data from different motion analysis devices and generates input data for neuromusculoskeletal modeling and simulation software, such as OpenSim and CEINMS (Calibrated EMG-Informed NMS Modelling Toolbox). MOtoNMS implements commonly required processing steps and its generic architecture simplifies the integration of new user-defined processing components. MOtoNMS allows users to set up their laboratory configurations and processing procedures through user-friendly graphical interfaces, without requiring advanced computer skills. Finally, configuration choices can be stored, enabling the full reproduction of the processing steps. MOtoNMS is released under the GNU General Public License and it is available at the SimTK website and from the GitHub repository. Motion data collected at four institutions demonstrate that, despite differences in laboratory instrumentation and procedures, MOtoNMS succeeds in processing data and producing consistent inputs for OpenSim and CEINMS. MOtoNMS fills the gap between motion analysis and neuromusculoskeletal modeling and simulation. Its support for several devices, a complete implementation of the pre-processing procedures, its simple extensibility, the available user interfaces, and its free availability can boost the translation of neuromusculoskeletal methods into daily and clinical practice.

  7. morphforge: a toolbox for simulating small networks of biologically detailed neurons in Python

    PubMed Central

    Hull, Michael J.; Willshaw, David J.

    2014-01-01

    The broad structure of a modeling study can often be explained over a cup of coffee, but converting this high-level conceptual idea into graphs of the final simulation results may require many weeks of sitting at a computer. Although models themselves can be complex, often many mental resources are wasted working around complexities of the software ecosystem such as fighting to manage files, interfacing between tools and data formats, finding mistakes in code or working out the units of variables. morphforge is a high-level, Python toolbox for building and managing simulations of small populations of multicompartmental biophysical model neurons. An entire in silico experiment, including the definition of neuronal morphologies, channel descriptions, stimuli, visualization and analysis of results can be written within a single short Python script using high-level objects. Multiple independent simulations can be created and run from a single script, allowing parameter spaces to be investigated. Consideration has been given to the reuse of both algorithmic and parameterizable components to allow both specific and stochastic parameter variations. Some other features of the toolbox include: the automatic generation of human-readable documentation (e.g., PDF files) about a simulation; the transparent handling of different biophysical units; a novel mechanism for plotting simulation results based on a system of tags; and an architecture that supports both the use of established formats for defining channels and synapses (e.g., MODL files), and the possibility to support other libraries and standards easily. We hope that this toolbox will allow scientists to quickly build simulations of multicompartmental model neurons for research and serve as a platform for further tool development. PMID:24478690

  8. Local soil quality assessment of north-central Namibia: integrating farmers' and technical knowledge

    NASA Astrophysics Data System (ADS)

    Prudat, Brice; Bloemertz, Lena; Kuhn, Nikolaus J.

    2018-02-01

    Soil degradation is a major threat for farmers of semi-arid north-central Namibia. Soil conservation practices can be promoted by the development of soil quality (SQ) evaluation toolboxes that provide ways to evaluate soil degradation. However, such toolboxes must be adapted to local conditions to reach farmers. Based on qualitative (interviews and soil descriptions) and quantitative (laboratory analyses) data, we developed a set of SQ indicators relevant for our study area that integrates farmers' field experiences (FFEs) and technical knowledge. We suggest using participatory mapping to delineate soil units (Oshikwanyama soil units, KwSUs) based on FFEs, which highlight mostly soil properties that integrate long-term productivity and soil hydrological characteristics (i.e. internal SQ). The actual SQ evaluation of a location depends on the KwSU described and is thereafter assessed by field soil texture (i.e. chemical fertility potential) and by soil colour shade (i.e. SOC status). This three-level information aims to reveal SQ improvement potential by comparing, for any location, (a) estimated clay content against median clay content (specific to KwSU) and (b) soil organic status against calculated optimal values (depends on clay content). The combination of farmers' and technical assessment cumulates advantages of both systems of knowledge, namely the integrated long-term knowledge of the farmers and a short- and medium-term SQ status assessment. The toolbox is a suggestion for evaluating SQ and aims to help farmers, rural development planners and researchers from all fields of studies understanding SQ issues in north-central Namibia. This suggested SQ toolbox is adapted to a restricted area of north-central Namibia, but similar tools could be developed in most areas where small-scale agriculture prevails.

  9. Basic Radar Altimetry Toolbox: tools to teach altimetry for ocean

    NASA Astrophysics Data System (ADS)

    Rosmorduc, Vinca; Benveniste, Jerome; Bronner, Emilie; Niemeijer, Sander; Lucas, Bruno Manuel; Dinardo, Salvatore

    2013-04-01

    The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data, including data from the next mission to be launched, CryoSat. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings. More than 2000 people had downloaded it as of January 2013, with many "newcomers" to altimetry among them. Users' feedback, developments in altimetry, and practice showed that interesting new features could be added. Some have been added and/or improved in versions 2 and 3. Others are under discussion for the future, including the addition of the future Sentinel-3. The Basic Radar Altimetry Toolbox is able to read most distributed radar altimetry data, including data from future missions like Saral, to perform some processing, data editing and statistics, and to visualize the results. It can be used at several levels and in several ways, including as an educational tool via its graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. Examples from educational use will be presented, and feedback from those who have used it as such will be most welcome. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  10. Stereotaxic 18F-FDG PET and MRI templates with three-dimensional digital atlas for statistical parametric mapping analysis of tree shrew brain.

    PubMed

    Huang, Qi; Nie, Binbin; Ma, Chen; Wang, Jing; Zhang, Tianhao; Duan, Shaofeng; Wu, Shang; Liang, Shengxiang; Li, Panlong; Liu, Hua; Sun, Hua; Zhou, Jiangning; Xu, Lin; Shan, Baoci

    2018-01-01

    Tree shrews are proposed as an alternative animal model to nonhuman primates due to their close affinity to primates. Neuroimaging techniques are widely used to study brain functions and structures of humans and animals. However, tree shrews are rarely applied in the neuroimaging field, partly due to the lack of available species-specific analysis methods. In this study, 10 PET/CT and 10 MRI images of tree shrew brain were used to construct PET and MRI templates; based on a histological atlas we reconstructed a three-dimensional digital atlas with 628 structures delineated; then the digital atlas and templates were aligned into a stereotaxic space. Finally, we integrated the digital atlas and templates into a toolbox for tree shrew brain spatial normalization, statistical analysis and results localization. We validated the feasibility of the toolbox with simulated data containing lesions in the laterodorsal thalamic nucleus (LD). The lesion volumes of the simulated PET and MRI images were 12.97 ± 3.91 mm³ and 7.04 ± 0.84 mm³. Statistical results at p < 0.005 showed the lesion volumes for PET and MRI were 13.18 mm³ and 8.06 mm³ in LD. To our knowledge, we report the first PET template and digital atlas of tree shrew brain. Compared to the existing MRI templates, our MRI template was aligned into stereotaxic space. And the toolbox is the first software dedicated to tree shrew brain analysis. The templates and digital atlas of tree shrew brain, as well as the toolbox, facilitate the use of tree shrews in the neuroimaging field. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Modular-based multiscale modeling on viscoelasticity of polymer nanocomposites

    NASA Astrophysics Data System (ADS)

    Li, Ying; Liu, Zeliang; Jia, Zheng; Liu, Wing Kam; Aldousari, Saad M.; Hedia, Hassan S.; Asiri, Saeed A.

    2017-02-01

    Polymer nanocomposites have been envisioned as advanced materials for improving the mechanical performance of neat polymers used in aerospace, petrochemical, environment and energy industries. With the filler size approaching the nanoscale, composite materials tend to demonstrate remarkable thermomechanical properties, even with addition of a small amount of fillers. These observations confront the classical composite theories and are usually attributed to the high surface-area-to-volume-ratio of the fillers, which can introduce strong nanoscale interfacial effect and relevant long-range perturbation on polymer chain dynamics. Despite decades of research aimed at understanding interfacial effect and improving the mechanical performance of composite materials, it is not currently possible to accurately predict the mechanical properties of polymer nanocomposites directly from their molecular constituents. To overcome this challenge, different theoretical, experimental and computational schemes will be used to uncover the key physical mechanisms at the relevant spatial and temporal scales for predicting and tuning constitutive behaviors in silico, thereby establishing a bottom-up virtual design principle to achieve unprecedented mechanical performance of nanocomposites. A modular-based multiscale modeling approach for viscoelasticity of polymer nanocomposites has been proposed and discussed in this study, including four modules: (A) neat polymer toolbox; (B) interphase toolbox; (C) microstructural toolbox and (D) homogenization toolbox. Integrating these modules together, macroscopic viscoelasticity of polymer nanocomposites could be directly predicted from their molecular constituents. This will maximize the computational ability to design novel polymer composites with advanced performance. More importantly, elucidating the viscoelasticity of polymer nanocomposites through fundamental studies is a critical step to generate an integrated computational material engineering principle for discovering and manufacturing new composites with transformative impact on aerospace, automobile, petrochemical industries.

  12. Cortical Thickness Estimations of FreeSurfer and the CAT12 Toolbox in Patients with Alzheimer's Disease and Healthy Controls.

    PubMed

    Seiger, Rene; Ganger, Sebastian; Kranz, Georg S; Hahn, Andreas; Lanzenberger, Rupert

    2018-05-15

    Automated cortical thickness (CT) measurements are often used to assess gray matter changes in the healthy and diseased human brain. The FreeSurfer software is frequently applied for this type of analysis. The computational anatomy toolbox (CAT12) for SPM, which offers a fast and easy-to-use alternative approach, was recently made available. In this study, we compared region of interest (ROI)-wise CT estimations of the surface-based FreeSurfer 6 (FS6) software and the volume-based CAT12 toolbox for SPM using 44 elderly healthy female control subjects (HC). In addition, these 44 HCs from the cross-sectional analysis and 34 age- and sex-matched patients with Alzheimer's disease (AD) were used to assess the potential of detecting group differences for each method. Finally, a test-retest analysis was conducted using 19 HC subjects. All data were taken from the OASIS database and MRI scans were recorded at 1.5 Tesla. A strong correlation was observed between both methods in terms of ROI mean CT estimates (R² = .83). However, CAT12 delivered significantly higher CT estimations in 32 of the 34 ROIs, indicating a systematic difference between both approaches. Furthermore, both methods were able to reliably detect atrophic brain areas in AD subjects, with the highest decreases in temporal areas. Finally, FS6 as well as CAT12 showed excellent test-retest variability scores. Although CT estimations were systematically higher for CAT12, this study provides evidence that this new toolbox delivers accurate and robust CT estimates and can be considered a fast and reliable alternative to FreeSurfer. © 2018 The Authors. Journal of Neuroimaging published by Wiley Periodicals, Inc. on behalf of American Society of Neuroimaging.

  13. Orbit Determination Toolbox

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Berry, Kevin; Gregpru. Late; Speckman, Keith; Hur-Diaz, Sun; Surka, Derek; Gaylor, Dave

    2010-01-01

    The Orbit Determination Toolbox is an orbit determination (OD) analysis tool based on MATLAB and Java that provides a flexible way to do early mission analysis. The toolbox is primarily intended for advanced mission analysis such as might be performed in concept exploration, proposal, early design phase, or rapid design center environments. The emphasis is on flexibility, but it has enough fidelity to produce credible results. Insight into all flight dynamics source code is provided. MATLAB is the primary user interface and is used for piecing together measurement and dynamic models. The Java Astrodynamics Toolbox is used as an engine for things that might be slow or inefficient in MATLAB, such as high-fidelity trajectory propagation, lunar and planetary ephemeris look-ups, precession, nutation, polar motion calculations, ephemeris file parsing, and the like. The primary analysis functions are sequential filter/smoother and batch least-squares commands that incorporate Monte-Carlo data simulation, linear covariance analysis, measurement processing, and plotting capabilities at the generic level. These functions have a user interface that is based on that of the MATLAB ODE suite. To perform a specific analysis, users write MATLAB functions that implement truth and design system models. The user provides his or her models as inputs to the filter commands. The software provides a capability to publish and subscribe to a software bus that is compliant with the NASA Goddard Mission Services Evolution Center (GMSEC) standards, to exchange data with other flight dynamics tools to simplify the flight dynamics design cycle. Using the publish and subscribe approach allows for analysts in a rapid design center environment to seamlessly incorporate changes in spacecraft and mission design into navigation analysis and vice versa.
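    As a minimal illustration of the sequential-filter idea the toolbox wraps, the sketch below runs a plain linear Kalman filter on a 1-D constant-velocity state in NumPy. It is not ODTBX code and does not reflect its interface; the models and noise levels are hypothetical.

```python
# Minimal sketch of a sequential (Kalman) filter for a 1-D constant-velocity
# state, written in plain NumPy. Illustrative only; not ODTBX code.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                 # we measure position only
Q = 1e-4 * np.eye(2)                       # process noise covariance
R = np.array([[0.25]])                     # measurement noise covariance

x = np.array([0.0, 1.0])                   # initial state estimate
P = np.eye(2)                              # initial covariance

rng = np.random.default_rng(1)
truth = np.array([0.0, 1.0])
for k in range(20):
    truth = F @ truth
    z = H @ truth + rng.normal(0.0, 0.5)   # simulated measurement
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print("final state estimate:", x)
```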

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apte, A; Veeraraghavan, H; Oh, J

    Purpose: To present a free, open source platform to facilitate radiomics research — the “Radiomics toolbox” in CERR. Method: There is a scarcity of open source tools that support end-to-end modeling of image features to predict patient outcomes. The “Radiomics toolbox” strives to fill the need for such a software platform. The platform supports (1) import of various image modalities such as CT, PET, MR, SPECT, and US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image-based features such as first-order statistics, gray-level co-occurrence and zone-size matrix based texture features, and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality, including univariate correlations and Kaplan-Meier curves, and advanced functionality, including feature reduction and multivariate modeling. The graphical user interface and data management are implemented in MATLAB for ease of development and readability of code for a wide audience. Open source software developed in other programming languages is integrated to enhance various components of the toolbox, for example the Java-based DCM4CHE for DICOM import and R for statistical analysis. Results: The Radiomics toolbox will be distributed as open source software under a GNU license. The toolbox was prototyped for modeling an oropharyngeal PET dataset at MSKCC; the analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the “Computational Environment for Radiotherapy Research” to the “Computational Environment for Radiological Research”.
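    The feature-extraction step in item (3) can be sketched generically with scikit-image: first-order statistics plus a few gray-level co-occurrence (GLCM) texture features from a stand-in region of interest. This is illustrative only and is unrelated to the CERR/MATLAB implementation; the function names assume scikit-image >= 0.19 (older releases spell them greycomatrix/greycoprops).

```python
# Illustrative sketch of first-order and GLCM texture features from a ROI.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(42)
roi = rng.integers(0, 64, size=(32, 32), dtype=np.uint8)   # stand-in ROI image

# First-order statistics
first_order = {
    "mean": roi.mean(),
    "std": roi.std(),
    "skewness_proxy": ((roi - roi.mean()) ** 3).mean() / roi.std() ** 3,
}

# GLCM texture features (one offset, one angle, for brevity)
glcm = graycomatrix(roi, distances=[1], angles=[0], levels=64,
                    symmetric=True, normed=True)
texture = {
    "contrast": graycoprops(glcm, "contrast")[0, 0],
    "homogeneity": graycoprops(glcm, "homogeneity")[0, 0],
    "energy": graycoprops(glcm, "energy")[0, 0],
}
print(first_order, texture)
```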

  15. Apomorphine conditioning and sensitization: the paired/unpaired treatment order as a new major determinant of drug conditioned and sensitization effects.

    PubMed

    de Matos, Liana Wermelinger; Carey, Robert J; Carrera, Marinete Pinheiro

    2010-09-01

    Repeated treatments with psychostimulant drugs generate behavioral sensitization. In the present study we employed a paired/unpaired protocol to assess the effects of repeated apomorphine (2.0 mg/kg) treatments upon locomotion behavior. In the first experiment we assessed the effects of conditioning upon apomorphine sensitization. Neither the extinction of the conditioned response nor a counter-conditioning procedure in which we paired an inhibitory treatment (apomorphine 0.05 mg/kg) with the previously established conditioned stimulus modified the sensitization response. In the second experiment, we administered the paired/unpaired protocol in two phases. In the second phase, we reversed the paired/unpaired protocol. Following the first phase, the paired group alone exhibited conditioned locomotion in the vehicle test and a sensitization response. In the second phase, the initially unpaired group, which received 5 paired apomorphine trials during the reversal phase, did not develop a conditioned response but developed a potentiated sensitization response. This dissociation of the conditioned response from the sensitization response is attributed to an apomorphine anti-habituation effect that can generate a false positive Pavlovian conditioned response effect. The potentiated sensitization response induced by the treatment reversal protocol points to an important role for the sequential experience of the paired/unpaired protocol in behavioral sensitization. 2010 Elsevier Inc. All rights reserved.

  16. Physiological, Perceptual, and Affective Responses to Six High-Intensity Interval Training Protocols.

    PubMed

    Follador, Lucio; Alves, Ragami C; Ferreira, Sandro Dos S; Buzzachera, Cosme F; Andrade, Vinicius F Dos S; Garcia, Erick D S de A; Osiecki, Raul; Barbosa, Sara C; de Oliveira, Letícia M; da Silva, Sergio G

    2018-04-01

    This study examined the extent to which different high-intensity interval training (HIIT) and sprint interval training (SIT) protocols could influence psychophysiological responses in moderately active young men. Fourteen participants completed, in a randomized order, three cycling protocols (SIT: 4 × 30-second all-out sprints; Tabata: 7 × 20 seconds at 170% V̇O2max; and HIIT: 10 × 60 seconds at 90% HRmax) and three running HIIT protocols (4 × 4 minutes at 90%-95% HRmax, 5 × at vV̇O2max, and 4 × 1,000 meters at a rating of perceived exertion (RPE) of 8, from the OMNI-Walk/Run scale). Oxygen uptake (V̇O2), heart rate, and RPE were recorded during each interval. Affective responses were assessed before and after each trial. The Tabata protocol elicited the highest V̇O2 and RPE responses, and the least pleasant session affect among the cycling trials. The vV̇O2max protocol elicited the highest V̇O2 and RPE responses and the lowest mean session affect among the running trials. Findings highlight the limited applicability of SIT and some HIIT protocols to individuals with low fitness levels.

  17. Physiological responses to simulated firefighter exercise protocols in varying environments.

    PubMed

    Horn, Gavin P; Kesler, Richard M; Motl, Robert W; Hsiao-Wecksler, Elizabeth T; Klaren, Rachel E; Ensari, Ipek; Petrucci, Matthew N; Fernhall, Bo; Rosengren, Karl S

    2015-01-01

    For decades, research to quantify the effects of firefighting activities and personal protective equipment on physiology and biomechanics has been conducted in a variety of testing environments. It is unknown if these different environments provide similar information and comparable responses. A novel Firefighting Activities Station, which simulates four common fireground tasks, is presented for use with an environmental chamber in a controlled laboratory setting. Nineteen firefighters completed three different exercise protocols following common research practices. Simulated firefighting activities conducted in an environmental chamber or live-fire structures elicited similar physiological responses (max heart rate: 190.1 vs 188.0 bpm, core temperature response: 0.047°C/min vs 0.043°C/min) and accelerometry counts. However, the response to a treadmill protocol commonly used in laboratory settings resulted in significantly lower heart rate (178.4 vs 188.0 bpm), core temperature response (0.037°C/min vs 0.043°C/min) and physical activity counts compared with firefighting activities in the burn building. Practitioner Summary: We introduce a new approach for simulating realistic firefighting activities in a controlled laboratory environment for ergonomics assessment of fire service equipment and personnel. Physiological responses to this proposed protocol more closely replicate those from live-fire activities than a traditional treadmill protocol and are simple to replicate and standardise.

  18. Outcomes of Optimized over Standard Protocol of Rabbit Antithymocyte Globulin for Severe Aplastic Anemia: A Single-Center Experience

    PubMed Central

    Ge, Meili; Shao, Yingqi; Huang, Jinbo; Huang, Zhendong; Zhang, Jing; Nie, Neng; Zheng, Yizhou

    2013-01-01

    Background Previous reports showed that the outcome of rabbit antithymocyte globulin (rATG) as first-line therapy for severe aplastic anemia (SAA) was not satisfactory. We explored a modified schedule of rATG administration. Design and Methods Outcomes of a cohort of 175 SAA patients, including 51 patients administered the standard protocol (3.55 mg/kg/d for 5 days) and 124 cases given the optimized protocol (1.97 mg/kg/d for 9 days) of rATG plus cyclosporine (CSA), were analyzed retrospectively. Results Across all 175 patients, response rates at 3 and 6 months were 36.6% and 56.0%, respectively. The 51 cases that received the standard protocol had poor responses at 3 (25.5%) and 6 months (41.2%), whereas the 124 patients who received the optimized protocol had better responses at 3 (41.1%, P = 0.14) and 6 months (62.1%, P = 0.01). Higher incidences of infection (57.1% versus 37.9%, P = 0.02) and early mortality (17.9% versus 0.8%, P<0.001) occurred in patients who received the standard protocol compared with the optimized protocol. Five-year overall survival favored the optimized over the standard rATG protocol (76.0% versus 50.3%, P<0.001). By multivariate analysis, optimized protocol (RR = 2.21, P = 0.04), response at 3 months (RR = 10.31, P = 0.03) and shorter interval (<23 days) between diagnosis and initial dose of rATG (RR = 5.35, P = 0.002) were independent favorable predictors of overall survival. Conclusions The optimized rather than standard rATG protocol in combination with CSA remained efficacious as a first-line immunosuppressive regimen for SAA. PMID:23554855

  19. Pupillary Stroop effects

    PubMed Central

    Ørbo, Marte; Holmlund, Terje; Miozzo, Michele

    2010-01-01

    We recorded the pupil diameters of participants performing the words’ color-naming Stroop task (i.e., naming the color of a word that names a color). Non-color words were used as baseline to firmly establish the effects of semantic relatedness induced by color word distractors. We replicated the classic Stroop effects of color congruency and color incongruency with pupillary diameter recordings: relative to non-color words, pupil diameters increased for color distractors that differed from color responses and decreased for color distractors that were identical to color responses. Analyses of the time courses of pupil responses revealed further differences between color-congruent and color-incongruent distractors, with the latter inducing a steep increase in pupil size and the former a relatively smaller increase. Consistent with previous findings demonstrating that pupil size increases as task demands rise, the present results indicate that pupillometry is a robust measure of Stroop interference and a valuable addition to the cognitive scientist’s toolbox. PMID:20865297

  20. Genetic landscapes GIS Toolbox: tools to map patterns of genetic divergence and diversity.

    USGS Publications Warehouse

    Vandergast, Amy G.; Perry, William M.; Lugo, Roberto V.; Hathaway, Stacie A.

    2011-01-01

    The Landscape Genetics GIS Toolbox contains tools that run in the Geographic Information System software, ArcGIS, to map genetic landscapes and to summarize multiple genetic landscapes as average and variance surfaces. These tools can be used to visualize the distribution of genetic diversity across geographic space and to study associations between patterns of genetic diversity and geographic features or other geo-referenced environmental data sets. Together, these tools create genetic landscape surfaces directly from tables containing genetic distance or diversity data and sample location coordinates, greatly reducing the complexity of building and analyzing these raster surfaces in a Geographic Information System.

  1. Genetic and Genomic Toolbox of Zea mays

    PubMed Central

    Nannas, Natalie J.; Dawe, R. Kelly

    2015-01-01

    Maize has a long history of genetic and genomic tool development and is considered one of the most accessible higher plant systems. With a fully sequenced genome, a suite of cytogenetic tools, methods for both forward and reverse genetics, and characterized phenotype markers, maize is amenable to studying questions beyond plant biology. Major discoveries in the areas of transposons, imprinting, and chromosome biology came from work in maize. Moving forward in the post-genomic era, this classic model system will continue to be at the forefront of basic biological study. In this review, we outline the basics of working with maize and describe its rich genetic toolbox. PMID:25740912

  2. Recent advances of molecular toolbox construction expand Pichia pastoris in synthetic biology applications.

    PubMed

    Kang, Zhen; Huang, Hao; Zhang, Yunfeng; Du, Guocheng; Chen, Jian

    2017-01-01

    Pichia pastoris (reclassified as Komagataella phaffii), a methylotrophic yeast, has been widely used for heterologous protein production because of its unique advantages, such as readily achievable high-density fermentation, tractable genetic modifications and typical eukaryotic post-translational modifications. More recently, P. pastoris has also gained much attention as a metabolic pathway engineering platform. In this mini-review, we address recent advances in the molecular toolboxes, including synthetic promoters, signal peptides, and genome engineering tools, that have been established for P. pastoris. Furthermore, applications of P. pastoris in synthetic biology are also discussed and future prospects considered, especially in the context of genome-scale metabolic pathway analysis.

  3. Rad Toolbox User's Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eckerman, Keith F.; Sjoreen, Andrea L.

    2013-05-01

    The Radiological Toolbox software developed by Oak Ridge National Laboratory (ORNL) for the U.S. Nuclear Regulatory Commission (NRC) is designed to provide electronic access to the vast and varied data that underlie the field of radiation protection. These data represent physical, chemical, anatomical, physiological, and mathematical parameters detailed in the various handbooks a health physicist might consult while in his office. The initial motivation for the software was to serve the needs of health physicists away from the office and without access to their handbooks, e.g., NRC inspectors. The earlier releases of the software were widely used and accepted around the world, not only by practicing health physicists but also within educational programs. This release updates the software to accommodate changes in Windows operating systems and, in some aspects, radiation protection. It has been tested on Windows 7 and 8 and on 32- and 64-bit machines. The nuclear decay data have been updated, and thermal neutron capture cross sections and cancer risk coefficients have been included. This document and the software’s user’s guide provide further details and documentation of the information captured within the Radiological Toolbox.

  4. A toolbox and a record for scientific model development

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling these tasks: (1) designing a 'Model Development Toolbox' that includes a basic set of model-constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well-defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.

  5. spads 1.0: a toolbox to perform spatial analyses on DNA sequence data sets.

    PubMed

    Dellicour, Simon; Mardulyn, Patrick

    2014-05-01

    SPADS 1.0 (for 'Spatial and Population Analysis of DNA Sequences') is a population genetic toolbox for characterizing genetic variability within and among populations from DNA sequences. In view of the drastic increase in genetic information available through sequencing methods, spads was specifically designed to deal with multilocus data sets of DNA sequences. It computes several summary statistics from populations or groups of populations, performs input file conversions for other population genetic programs and implements locus-by-locus and multilocus versions of two clustering algorithms to study the genetic structure of populations. The toolbox also includes two MATLAB and R functions, GDISPAL and GDIVPAL, to display differentiation and diversity patterns across landscapes. These functions aim to generate interpolating surfaces based on multilocus distance and diversity indices. In the case of multiple loci, such surfaces can represent a useful alternative to the multiple pie-chart maps traditionally used in phylogeography to represent the spatial distribution of genetic diversity. These coloured surfaces can also be used to compare different data sets or different diversity and/or distance measures estimated on the same data set. © 2013 John Wiley & Sons Ltd.

  6. MEG/EEG Source Reconstruction, Statistical Evaluation, and Visualization with NUTMEG

    PubMed Central

    Dalal, Sarang S.; Zumer, Johanna M.; Guggisberg, Adrian G.; Trumpis, Michael; Wong, Daniel D. E.; Sekihara, Kensuke; Nagarajan, Srikantan S.

    2011-01-01

    NUTMEG is a source analysis toolbox geared towards cognitive neuroscience researchers using MEG and EEG, including intracranial recordings. Evoked and unaveraged data can be imported to the toolbox for source analysis in either the time or time-frequency domains. NUTMEG offers several variants of adaptive beamformers, probabilistic reconstruction algorithms, as well as minimum-norm techniques to generate functional maps of spatiotemporal neural source activity. Lead fields can be calculated from single and overlapping sphere head models or imported from other software. Group averages and statistics can be calculated as well. In addition to data analysis tools, NUTMEG provides a unique and intuitive graphical interface for visualization of results. Source analyses can be superimposed onto a structural MRI or headshape to provide a convenient visual correspondence to anatomy. These results can also be navigated interactively, with the spatial maps and source time series or spectrogram linked accordingly. Animations can be generated to view the evolution of neural activity over time. NUTMEG can also display brain renderings and perform spatial normalization of functional maps using SPM's engine. As a MATLAB package, the end user may easily link with other toolboxes or add customized functions. PMID:21437174

  7. MEG/EEG source reconstruction, statistical evaluation, and visualization with NUTMEG.

    PubMed

    Dalal, Sarang S; Zumer, Johanna M; Guggisberg, Adrian G; Trumpis, Michael; Wong, Daniel D E; Sekihara, Kensuke; Nagarajan, Srikantan S

    2011-01-01

    NUTMEG is a source analysis toolbox geared towards cognitive neuroscience researchers using MEG and EEG, including intracranial recordings. Evoked and unaveraged data can be imported to the toolbox for source analysis in either the time or time-frequency domains. NUTMEG offers several variants of adaptive beamformers, probabilistic reconstruction algorithms, as well as minimum-norm techniques to generate functional maps of spatiotemporal neural source activity. Lead fields can be calculated from single and overlapping sphere head models or imported from other software. Group averages and statistics can be calculated as well. In addition to data analysis tools, NUTMEG provides a unique and intuitive graphical interface for visualization of results. Source analyses can be superimposed onto a structural MRI or headshape to provide a convenient visual correspondence to anatomy. These results can also be navigated interactively, with the spatial maps and source time series or spectrogram linked accordingly. Animations can be generated to view the evolution of neural activity over time. NUTMEG can also display brain renderings and perform spatial normalization of functional maps using SPM's engine. As a MATLAB package, the end user may easily link with other toolboxes or add customized functions.
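    One of the reconstruction families mentioned above, the adaptive (LCMV) beamformer, reduces to a short linear-algebra step. The sketch below applies the generic textbook formula in NumPy to random stand-in data; it is not NUTMEG's implementation or interface.

```python
# Minimal sketch of an LCMV beamformer weight computation for one source.
# The lead field and data covariance below are random stand-ins.
import numpy as np

rng = np.random.default_rng(7)
n_channels = 64
data = rng.normal(size=(n_channels, 1000))           # sensor time series (channels x samples)
C = np.cov(data)                                      # data covariance
C += 1e-3 * np.trace(C) / n_channels * np.eye(n_channels)   # diagonal regularization

L = rng.normal(size=(n_channels, 1))                  # lead field for one source/orientation

Cinv = np.linalg.inv(C)
w = (Cinv @ L) / (L.T @ Cinv @ L)                     # LCMV weights, unit-gain constraint
source_ts = (w.T @ data).ravel()                      # reconstructed source time series
print(source_ts.shape)
```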

  8. Hydratools, a MATLAB® based data processing package for Sontek Hydra data

    USGS Publications Warehouse

    Martini, M.; Lightsom, F.L.; Sherwood, C.R.; Xu, Jie; Lacy, J.R.; Ramsey, A.; Horwitz, R.

    2005-01-01

    The U.S. Geological Survey (USGS) has developed a set of MATLAB tools to process and convert data collected by Sontek Hydra instruments to netCDF, which is a format used by the USGS to process and archive oceanographic time-series data. The USGS makes high-resolution current measurements within 1.5 meters of the bottom. These data are used in combination with other instrument data from sediment transport studies to develop sediment transport models. Instrument manufacturers provide software which outputs unique binary data formats. Multiple data formats are cumbersome. The USGS solution is to translate data streams into a common data format: netCDF. The Hydratools toolbox is written to create netCDF format files following EPIC conventions, complete with embedded metadata. Data are accepted from both the ADV and the PCADP. The toolbox will detect and remove bad data, substitute other sources of heading and tilt measurements if necessary, apply ambiguity corrections, calculate statistics, return information about data quality, and organize metadata. Standardized processing and archiving makes these data more easily and routinely accessible locally and over the Internet. In addition, documentation of the techniques used in the toolbox provides a baseline reference for others utilizing the data.
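    The general pattern the toolbox automates, writing a time series to netCDF with embedded metadata, can be sketched with the Python netCDF4 package. The variable names and attributes below are illustrative placeholders, not the full EPIC conventions used by the USGS.

```python
# Sketch of writing an instrument time series to netCDF with metadata.
import numpy as np
from netCDF4 import Dataset

time = np.arange(0, 3600, 60, dtype="f8")             # seconds since start of record
velocity = 0.05 * np.sin(2 * np.pi * time / 600.0)    # stand-in current data (m/s)

with Dataset("example_adv.nc", "w", format="NETCDF4") as nc:
    nc.title = "Example near-bottom current record"   # global metadata attributes
    nc.instrument = "hypothetical ADV"
    nc.createDimension("time", len(time))

    t = nc.createVariable("time", "f8", ("time",))
    t.units = "seconds since 2005-01-01 00:00:00"
    t[:] = time

    u = nc.createVariable("u", "f4", ("time",))
    u.units = "m s-1"
    u.long_name = "eastward current component"
    u[:] = velocity
```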

  9. Ovarian responses of dairy buffalo cows to timed artificial insemination protocol, using new or used progesterone devices, during the breeding season (autumn-winter).

    PubMed

    Monteiro, Bruno Moura; de Souza, Diego Cavalcante; Vasconcellos, Guilherme Souza Floriano Machado; Corrêa, Thalita Bueno; Vecchio, Domenico; de Sá Filho, Manoel Francisco; de Carvalho, Nelcio Antonio Tonizza; Baruselli, Pietro Sampaio

    2016-01-01

    This study evaluated the effect of new or used P4 devices on the ovarian responses of dairy buffalo that were administered an estradiol (E2) plus progesterone (P4)-based timed artificial insemination (TAI) protocol during the breeding season. On the first day of the TAI protocol, 142 cows were randomly assigned to receive one of the following: a new device (New; 1.0 g of P4; n = 48); a device that had previously been used for 9 days (Used1x, n = 47); or a device that had previously been used for 18 days (Used2x, n = 47). Ultrasound was used to evaluate the following: the presence of a corpus luteum (CL); the diameter of the dominant follicle (ØDF) during the protocol; ovulatory response; and pregnancies per AI (P/AI). Despite similar responses among the treatments, there was a significant positive association of the ØDF during the TAI protocol with ovulatory response and the number of pregnancies. In conclusion, satisfactory ovarian responses and a satisfactory pregnancy rate were achieved when grazing dairy buffalo were subjected to the TAI protocol during the breeding season, independent of whether a new or used P4 device was used. Furthermore, a larger follicle was associated with a higher ovulation rate and higher P/AI following TAI. © 2015 Japanese Society of Animal Science.

  10. A Quantitative Framework for Flower Phenotyping in Cultivated Carnation (Dianthus caryophyllus L.)

    PubMed Central

    Chacón, Borja; Ballester, Roberto; Birlanga, Virginia; Rolland-Lagan, Anne-Gaëlle; Pérez-Pérez, José Manuel

    2013-01-01

    The most important breeding goals in ornamental crops are plant appearance and flower characteristics, for which selection is performed visually on the direct offspring of crosses. We developed an image analysis toolbox for the acquisition of flower and petal images from cultivated carnation (Dianthus caryophyllus L.) and validated it through a detailed analysis of flower and petal size and shape in 78 commercial cultivars of D. caryophyllus, including 55 standard, 22 spray and 1 pot carnation cultivars. Correlation analyses allowed us to reduce the number of parameters accounting for the observed variation in flower and petal morphology. Convexity was used as a descriptor of the level of serration in flowers and petals. A landmark-based approach allowed us to identify eight main principal components (PCs) accounting for most of the variance observed in petal shape. The effect and strength of these PCs in standard and spray carnation cultivars are consistent with shared underlying mechanisms involved in the morphological diversification of petals in both subpopulations. Our results also indicate that neighbor-joining trees built from morphological data might recover certain phylogenetic relationships among carnation cultivars. Based on estimated broad-sense heritability values for some flower and petal features, different genetic determinants likely modulate the responses of flower and petal morphology to environmental cues in this species. We believe our image analysis toolbox could allow flower variation to be captured in other species of high ornamental value. PMID:24349209
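    The landmark-based shape analysis mentioned above can be sketched generically: flatten petal landmark coordinates and extract principal components of shape variation. The crude centering and scaling below is a simplified stand-in for a full Procrustes alignment, and the landmark data are random placeholders, not the study's measurements.

```python
# Illustrative landmark PCA sketch (simplified; no rotation alignment).
import numpy as np

rng = np.random.default_rng(3)
n_petals, n_landmarks = 200, 20
landmarks = rng.normal(size=(n_petals, n_landmarks, 2))   # (x, y) per landmark

# Remove position and size (rotation is ignored in this toy version)
centered = landmarks - landmarks.mean(axis=1, keepdims=True)
scaled = centered / np.linalg.norm(centered, axis=(1, 2), keepdims=True)

# PCA on the flattened configurations via SVD
X = scaled.reshape(n_petals, -1)
X -= X.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = S**2 / np.sum(S**2)
print("variance explained by first 8 PCs:", explained[:8].round(3))
```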

  11. BCILAB: a platform for brain-computer interface development

    NASA Astrophysics Data System (ADS)

    Kothe, Christian Andreas; Makeig, Scott

    2013-10-01

    Objective. The past two decades have seen dramatic progress in our ability to model brain signals recorded by electroencephalography, functional near-infrared spectroscopy, etc., and to derive real-time estimates of user cognitive state, response, or intent for a variety of purposes: to restore communication by the severely disabled, to effect brain-actuated control and, more recently, to augment human-computer interaction. Continuing these advances, largely achieved through increases in computational power and methods, requires software tools to streamline the creation, testing, evaluation and deployment of new data analysis methods. Approach. Here we present BCILAB, an open-source MATLAB-based toolbox built to address the need for the development and testing of brain-computer interface (BCI) methods by providing an organized collection of over 100 pre-implemented methods and method variants, an easily extensible framework for the rapid prototyping of new methods, and a highly automated framework for systematic testing and evaluation of new implementations. Main results. To validate and illustrate the use of the framework, we present two sample analyses of publicly available data sets from recent BCI competitions and from a rapid serial visual presentation task. We demonstrate the straightforward use of BCILAB to obtain results compatible with the current BCI literature. Significance. The aim of the BCILAB toolbox is to provide the BCI community a powerful toolkit for methods research and evaluation, thereby helping to accelerate the pace of innovation in the field, while complementing the existing spectrum of tools for real-time BCI experimentation, deployment and use.

  12. 4D Printed Actuators with Soft-Robotic Functions.

    PubMed

    López-Valdeolivas, María; Liu, Danqing; Broer, Dick Jan; Sánchez-Somolinos, Carlos

    2018-03-01

    Soft matter elements undergoing programed, reversible shape change can contribute to fundamental advance in areas such as optics, medicine, microfluidics, and robotics. Crosslinked liquid crystalline polymers have demonstrated huge potential to implement soft responsive elements; however, the complexity and size of the actuators are limited by the current dominant thin-film geometry processing toolbox. Using 3D printing, stimuli-responsive liquid crystalline elastomeric structures are created here. The printing process prescribes a reversible shape-morphing behavior, offering a new paradigm for active polymer system preparation. The additive character of this technology also leads to unprecedented geometries, complex functions, and sizes beyond those of typical thin-films. The fundamental concepts and devices presented therefore overcome the current limitations of actuation energy available from thin-films, thereby narrowing the gap between materials and practical applications. © 2017 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Aurally-adequate time-frequency analysis for scattered sound in auditoria

    NASA Astrophysics Data System (ADS)

    Norris, Molly K.; Xiang, Ning; Kleiner, Mendel

    2005-04-01

    The goal of this work was to apply an aurally-adequate time-frequency analysis technique to the analysis of sound scattering effects in auditoria. Time-frequency representations were developed that take binaural hearing into account, with a specific implementation of an interaural cross-correlation process. A model of the human auditory system was implemented on the MATLAB platform based on two previous models [A. Härmä and K. Palomäki, HUTear, Espoo, Finland; and M. A. Akeroyd, A Binaural Cross-correlogram Toolbox for MATLAB (2001), University of Sussex, Brighton]. Its stages include proper frequency selectivity, the conversion of the mechanical motion of the basilar membrane to neural impulses, and binaural hearing effects. The model was then used in the analysis of room impulse responses with varying scattering characteristics. This paper discusses the analysis results using simulated and measured room impulse responses. [Work supported by the Frank H. and Eva B. Buck Foundation.]

  14. Engineering a Functional Small RNA Negative Autoregulation Network with Model-Guided Design.

    PubMed

    Hu, Chelsea Y; Takahashi, Melissa K; Zhang, Yan; Lucks, Julius B

    2018-05-22

    RNA regulators are powerful components of the synthetic biology toolbox. Here, we expand the repertoire of synthetic gene networks built from these regulators by constructing a transcriptional negative autoregulation (NAR) network out of small RNAs (sRNAs). NAR network motifs are core motifs of natural genetic networks, and are known for reducing network response time and steady state signal. Here we use cell-free transcription-translation (TX-TL) reactions and a computational model to design and prototype sRNA NAR constructs. Using parameter sensitivity analysis, we design a simple set of experiments that allow us to accurately predict NAR function in TX-TL. We transfer successful network designs into Escherichia coli and show that our sRNA transcriptional network reduces both network response time and steady-state gene expression. This work broadens our ability to construct increasingly sophisticated RNA genetic networks with predictable function.
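    The response-time argument for negative autoregulation can be illustrated with a toy ODE comparison of an NAR motif against an unregulated gene. This is a generic textbook model with made-up parameters, not the paper's TX-TL model of the sRNA circuit.

```python
# Toy comparison: negative autoregulation (NAR) vs an unregulated gene.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma, K, n = 10.0, 1.0, 2.0, 2.0

def unregulated(t, x):
    return beta - gamma * x

def nar(t, x):
    return beta / (1.0 + (x / K) ** n) - gamma * x

t_eval = np.linspace(0, 6, 200)
sol_u = solve_ivp(unregulated, (0, 6), [0.0], t_eval=t_eval)
sol_n = solve_ivp(nar, (0, 6), [0.0], t_eval=t_eval)

def response_time(t, x):
    """Time to reach half of the final (steady-state) level."""
    half = 0.5 * x[-1]
    return t[np.argmax(x >= half)]

print("steady state  unregulated: %.2f  NAR: %.2f" % (sol_u.y[0][-1], sol_n.y[0][-1]))
print("response time unregulated: %.2f  NAR: %.2f" %
      (response_time(sol_u.t, sol_u.y[0]), response_time(sol_n.t, sol_n.y[0])))
```

    In this toy model the NAR circuit settles to a lower steady state and reaches half of that level sooner than the unregulated gene, mirroring the qualitative behavior described above.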

  15. COBRA ATD minefield detection model initial performance analysis

    NASA Astrophysics Data System (ADS)

    Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.

    2000-08-01

    A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.

  16. Fabrication and Demonstration of Mercury Disc-Well Probes for Stripping-Based Cyclic Voltammetry Scanning Electrochemical Microscopy.

    PubMed

    Barton, Zachary J; Rodríguez-López, Joaquín

    2017-03-07

    Scanning electrochemical microscopy (SECM) is a rising technique for the study of energy storage materials. Hg-based probes allow the extension of SECM investigations to ionic processes, but the risk of irreversible Hg amalgam saturation limits their operation to rapid timescales and dilute analyte solutions. Here, we report a novel fabrication protocol for Hg disc-well ultramicroelectrodes (UMEs), which retain access to stripping information but are less susceptible to amalgam saturation than traditional Hg sphere-caps or thin-films. The amalgamation and stripping behaviors of Hg disc-well UMEs are compared to those of traditional Hg sphere-cap UMEs and corroborated with data from finite element simulations. The improved protection against amalgam saturation allows Hg disc-wells to operate safely in highly concentrated environments at long timescales. The utility of the probes for bulk measurements extends also to SECM studies, where the disc geometry facilitates small tip-substrate gaps and improves both spatial and temporal resolution. Because they can carry out slow, high-resolution anodic stripping voltammetry approaches and imaging in concentrated solutions, Hg disc-well electrodes fill a new analytical niche for studies of ionic reactivity and are a valuable addition to the electrochemical toolbox.

  17. High resolution critical habitat mapping and classification of tidal freshwater wetlands in the ACE Basin

    NASA Astrophysics Data System (ADS)

    Strickland, Melissa Anne

    In collaboration with the South Carolina Department of Natural Resources ACE Basin National Estuarine Research Reserve (ACE Basin NERR), the tidal freshwater ecosystems along the South Edisto River in the ACE Basin are being accurately mapped and classified using a LIDAR-Remote Sensing Fusion technique that integrates LAS LIDAR data into texture images and then merges the elevation textures and multispectral imagery for very high resolution mapping. This project discusses the development and refinement of an ArcGIS Toolbox capable of automating protocols and procedures for marsh delineation and microhabitat identification. The result is a high resolution habitat and land use map used for the identification of threatened habitat. Tidal freshwater wetlands are also a critical habitat for colonial wading birds and an accurate assessment of community diversity and acreage of this habitat type in the ACE Basin will support SCDNR's conservation and protection efforts. The maps developed by this study will be used to better monitor the freshwater/saltwater interface and establish a baseline for an ACE NERR monitoring program to track the rates and extent of alterations due to projected environmental stressors. Preliminary ground-truthing in the field will provide information about the accuracy of the mapping tool.

  18. Expanding the UniFrac Toolbox

    PubMed Central

    2016-01-01

    The UniFrac distance metric is often used to separate groups in microbiome analysis, but requires a constant sequencing depth to work properly. Here we demonstrate that unweighted UniFrac is highly sensitive to rarefaction instance and to sequencing depth in uniform data sets with no clear structure or separation between groups. We show that this arises because of subcompositional effects. We introduce information UniFrac and ratio UniFrac, two new weightings that are not as sensitive to rarefaction and allow greater separation of outliers than classic unweighted and weighted UniFrac. With this expansion of the UniFrac toolbox, we hope to empower researchers to extract more varied information from their data. PMID:27632205
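    Rarefaction, the subsampling step whose influence on unweighted UniFrac is discussed above, can be sketched in a few lines of NumPy. This generic implementation is for illustration only and is unrelated to the authors' code.

```python
# Sketch of rarefaction: subsample every sample's OTU counts to a fixed depth.
import numpy as np

def rarefy(counts, depth, rng):
    """Randomly subsample a vector of OTU counts to a fixed total depth."""
    counts = np.asarray(counts)
    pool = np.repeat(np.arange(counts.size), counts)   # one entry per read
    picked = rng.choice(pool, size=depth, replace=False)
    return np.bincount(picked, minlength=counts.size)

rng = np.random.default_rng(0)
sample = np.array([500, 120, 30, 5, 0, 2])             # raw OTU counts
print(rarefy(sample, depth=200, rng=rng))
# Repeating with different random seeds gives different rarefaction
# "instances" -- the source of the sensitivity discussed above.
```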

  19. A problem solving and decision making toolbox for approaching clinical problems and decisions.

    PubMed

    Margolis, C; Jotkowitz, A; Sitter, H

    2004-08-01

    In this paper, we begin by presenting three real patients and then review all the practical conceptual tools that have been suggested for systematically analyzing clinical problems. Each of these conceptual tools (e.g. Evidence-Based Medicine, Clinical Practice Guidelines, Decision Analysis) deals mainly with a different type or aspect of clinical problems. We suggest that all of these conceptual tools can be thought of as belonging in the clinician's toolbox for solving clinical problems and making clinical decisions. A heuristic for guiding the clinician in using the tools is proposed. The heuristic is then used to analyze management of the three patients presented at the outset. Copyright 2004 Birkhäuser Verlag, Basel

  20. Phylogenetic Analyses: A Toolbox Expanding towards Bayesian Methods

    PubMed Central

    Aris-Brosou, Stéphane; Xia, Xuhua

    2008-01-01

    The reconstruction of phylogenies is becoming an increasingly simple activity. This is mainly due to two reasons: the democratization of computing power and the increased availability of sophisticated yet user-friendly software. This review describes some of the latest additions to the phylogenetic toolbox, along with some of their theoretical and practical limitations. It is shown that Bayesian methods are under heavy development, as they offer the possibility to solve a number of long-standing issues and to integrate several steps of the phylogenetic analyses into a single framework. Specific topics include not only phylogenetic reconstruction, but also the comparison of phylogenies, the detection of adaptive evolution, and the estimation of divergence times between species. PMID:18483574

  1. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    PubMed

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  2. Refining the chemical toolbox to be fit for educational and practical purpose for drug discovery in the 21st Century.

    PubMed

    Lolli, Marco; Narramore, Sarah; Fishwick, Colin W G; Pors, Klaus

    2015-08-01

    We live in a time where exploration and generation of new knowledge is occurring on a colossal scale. Medicinal chemists have traditionally taken key roles in drug discovery; however, the many unmet medical demands in the healthcare sector emphasise the need to evolve the medicinal chemistry discipline. To rise to the challenges in the 21st Century there is a necessity to refine the chemical toolbox for educational and practical reasons. This review proposes modern strategies that are beneficial to teaching in academia but are also important reminders of strategies that can potentially lead to better drugs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. A protocol for analysing thermal stress in insects using infrared thermography.

    PubMed

    Gallego, Belén; Verdú, José R; Carrascal, Luis M; Lobo, Jorge M

    2016-02-01

    The study of insect responses to thermal stress has involved a variety of protocols and methodologies that hamper the ability to compare results between studies. For that reason, the development of a protocol to standardize thermal assays is necessary. In this sense, infrared thermography solves some of these problems by allowing continuous temperature measurements to be taken without handling the individuals, an important consideration for cold-blooded organisms such as insects. Here, we present a working protocol based on infrared thermography to estimate both cold and heat thermal stress in insects. We analyse both the change in the body temperature of individuals and their behavioural response. In addition, we used partial least squares regression for the statistical analysis of our data, a technique that solves the problem of having a large number of variables and few individuals, allowing us to work with rare or endemic species. To test our protocol, we chose two species of congeneric, narrowly distributed dung beetles that are endemic to the southeastern part of the Iberian Peninsula. With our protocol we obtained five variables describing the response to cold and twelve describing the response to heat. With this methodology we discriminated between the two flightless Jekelius species through their thermal responses. In response to cold, Jekelius hernandezi showed a higher rate of cooling and reached higher temperatures of stupor and haemolymph freezing than Jekelius punctatolineatus. Both species displayed similar thermoregulation ranges before reaching lethal body temperature with heat stress. Overall, we have demonstrated that infrared thermography is a suitable method to assess insect thermal responses with a high degree of sensitivity, allowing for the discrimination between closely related species. Copyright © 2016 Elsevier Ltd. All rights reserved.
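    The statistical step described above, partial least squares regression on many correlated variables measured in few individuals, can be sketched with scikit-learn. The data below are random stand-ins, not the study's thermal measurements.

```python
# Illustrative PLS regression on a "few rows, many columns" dataset.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
n_individuals, n_variables = 15, 12                    # few individuals, many variables
X = rng.normal(size=(n_individuals, n_variables))      # thermal-response variables
y = X[:, 0] * 0.8 - X[:, 3] * 0.5 + rng.normal(scale=0.2, size=n_individuals)

pls = PLSRegression(n_components=2)
pls.fit(X, y)
print("R^2 on training data:", round(pls.score(X, y), 2))
print("variable loadings on component 1:", pls.x_loadings_[:, 0].round(2))
```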

  4. Addressing Participant Validity in a Small Internet Health Survey (The Restore Study): Protocol and Recommendations for Survey Response Validation

    PubMed Central

    Dewitt, James; Capistrant, Benjamin; Kohli, Nidhi; Mitteldorf, Darryl; Merengwa, Enyinnaya; West, William

    2018-01-01

    Background While deduplication and cross-validation protocols have been recommended for large Web-based studies, protocols for survey response validation of smaller studies have not been published. Objective This paper reports the challenges of survey validation inherent in small Web-based health survey research. Methods The subject population was North American gay and bisexual prostate cancer survivors, who represent an under-researched, hidden, difficult-to-recruit, minority-within-a-minority population. In 2015-2016, advertising on a large Web-based cancer survivor support network, using email and social media, yielded 478 completed surveys. Results Our manual deduplication and cross-validation protocol identified 289 survey submissions (289/478, 60.4%) as likely spam, most stemming from advertising on social media. The basic components of this deduplication and validation protocol are detailed. An unexpected challenge encountered was invalid survey responses evolving across the study period. This necessitated that the static detection protocol be augmented with a dynamic one. Conclusions Five recommendations for validation of Web-based samples, especially with smaller difficult-to-recruit populations, are detailed. PMID:29691203
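    A static detection pass of the kind described above might encode rules such as the following. The field names and thresholds are hypothetical and are not those used in the Restore study; this is only a sketch of how duplicate and implausible submissions could be flagged.

```python
# Hypothetical deduplication/validation checks for survey submissions.
from collections import Counter

def flag_suspect(responses, min_minutes=5):
    ip_counts = Counter(r["ip"] for r in responses)
    email_counts = Counter(r["email"] for r in responses if r["email"])
    flagged = []
    for r in responses:
        reasons = []
        if ip_counts[r["ip"]] > 1:
            reasons.append("duplicate IP")
        if r["email"] and email_counts[r["email"]] > 1:
            reasons.append("duplicate email")
        if r["duration_min"] < min_minutes:
            reasons.append("completed too quickly")
        if r["age"] < 18 and r["years_since_diagnosis"] > 5:
            reasons.append("internally inconsistent answers")
        if reasons:
            flagged.append((r["id"], reasons))
    return flagged

responses = [
    {"id": 1, "ip": "10.0.0.1", "email": "a@x.org", "duration_min": 22,
     "age": 63, "years_since_diagnosis": 4},
    {"id": 2, "ip": "10.0.0.1", "email": "b@x.org", "duration_min": 3,
     "age": 17, "years_since_diagnosis": 8},
]
print(flag_suspect(responses))
```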

  5. Improving Flood Risk Management for California's Central Valley: How the State Developed a Toolbox for Large, System-wide Studies

    NASA Astrophysics Data System (ADS)

    Pingel, N.; Liang, Y.; Bindra, A.

    2016-12-01

    More than 1 million Californians live and work in the floodplains of the Sacramento-San Joaquin Valley where flood risks are among the highest in the nation. In response to this threat to people, property and the environment, the Department of Water Resources (DWR) has been called to action to improve flood risk management. This has transpired through significant advances in development of flood information and tools, analysis, and planning. Senate Bill 5 directed DWR to prepare the Central Valley Flood Protection Plan (CVFPP) and update it every 5 years. A key component of this aggressive planning approach is answering the question: What is the current flood risk, and how would proposed improvements change flood risk throughout the system? Answering this question is a substantial challenge due to the size and complexity of the watershed and flood control system. The watershed is roughly 42,000 sq mi, and flows are controlled by numerous reservoirs, bypasses, and levees. To overcome this challenge, the State invested in development of a comprehensive analysis "tool box" through various DWR programs. Development of the tool box included: collection of hydro-meteorological, topographic, geotechnical, and economic data; development of rainfall-runoff, reservoir operation, hydraulic routing, and flood risk analysis models; and development of specialized applications and computing schemes to accelerate the analysis. With this toolbox, DWR is analyzing flood hazard, flood control system performance, exposure and vulnerability of people and property to flooding, consequence of flooding for specific events, and finally flood risk for a range of CVFPP alternatives. Based on the results, DWR will put forward a State Recommended Plan in the 2017 CVFPP. Further, the value of the analysis tool box extends beyond the CVFPP. It will serve as a foundation for other flood studies for years to come and has already been successfully applied for inundation mapping to support emergency response, reservoir operation analysis, and others.

  6. A Citizen Science and Government Collaboration: Developing ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (EPA) is actively involved in supporting citizen science projects and providing communities with information and assistance for conducting their own air pollution monitoring. As part of a Regional Applied Research Effort (RARE) project, EPA's Office of Research and Development (ORD) worked collaboratively with EPA Region 2 and the Ironbound Community Corporation (ICC) in Newark, New Jersey, to develop and test the “Air Sensor Toolbox for Citizen Scientists.” In this collaboration, citizen scientists measured local gaseous and particulate air pollution levels by using a customized low-cost sensor pod designed and fabricated by EPA. This citizen science air quality measurement project provided an excellent opportunity for EPA to evaluate and improve the Toolbox resources available to communities. The Air Sensor Toolbox, developed in coordination with the ICC, can serve as a template for communities across the country to use in developing their own air pollution monitoring programs in areas where air pollution is a concern. This pilot project provided an opportunity for a highly motivated citizen science organization and the EPA to work together directly to address environmental concerns within the community. Useful lessons were learned about how to improve coordination between the government and communities and the types of tools and technologies needed for conducting an effective citizen science project that can be app...

  7. Cantera Integration with the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Lavelle, Thomas M.; Chapman, Jeffryes W.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    NASA Glenn Research Center (GRC) has recently developed a software package for modeling generic thermodynamic systems called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a library of building blocks that can be assembled to represent any thermodynamic system in the Simulink(Registered TradeMark) (The MathWorks, Inc.) environment. These elements, along with a Newton Raphson solver (also provided as part of the T-MATS package), enable users to create models of a wide variety of systems. The current version of T-MATS (v1.0.1) uses tabular data for providing information about a specific mixture of air, water (humidity), and hydrocarbon fuel in calculations of thermodynamic properties. The capabilities of T-MATS can be expanded by integrating it with the Cantera thermodynamic package. Cantera is an object-oriented analysis package that calculates thermodynamic solutions for any mixture defined by the user. Integration of Cantera with T-MATS extends the range of systems that may be modeled using the toolbox. In addition, the library of elements released with Cantera were developed using MATLAB native M-files, allowing for quicker prototyping of elements. This paper discusses how the new Cantera-based elements are created and provides examples for using T-MATS integrated with Cantera.

  8. Cantera Integration with the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Lavelle, Thomas M.; Chapman, Jeffryes W.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    NASA Glenn Research Center (GRC) has recently developed a software package for modeling generic thermodynamic systems called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a library of building blocks that can be assembled to represent any thermodynamic system in the Simulink (The MathWorks, Inc.) environment. These elements, along with a Newton Raphson solver (also provided as part of the T-MATS package), enable users to create models of a wide variety of systems. The current version of T-MATS (v1.0.1) uses tabular data for providing information about a specific mixture of air, water (humidity), and hydrocarbon fuel in calculations of thermodynamic properties. The capabilities of T-MATS can be expanded by integrating it with the Cantera thermodynamic package. Cantera is an object-oriented analysis package that calculates thermodynamic solutions for any mixture defined by the user. Integration of Cantera with T-MATS extends the range of systems that may be modeled using the toolbox. In addition, the library of elements released with Cantera were developed using MATLAB native M-files, allowing for quicker prototyping of elements. This paper discusses how the new Cantera-based elements are created and provides examples for using T-MATS integrated with Cantera.
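    Cantera's Python interface gives a flavor of the thermodynamic calculations that the T-MATS integration exposes from MATLAB/Simulink. The sketch below is a generic property lookup using the GRI-Mech 3.0 mechanism distributed with Cantera; the state values are illustrative and this is not T-MATS code.

```python
# Generic Cantera property lookup for a fuel/air mixture (illustrative values).
import cantera as ct

gas = ct.Solution("gri30.yaml")                         # mechanism shipped with recent Cantera releases
gas.TPX = 300.0, ct.one_atm, "CH4:1, O2:2, N2:7.52"     # stoichiometric CH4/air at 1 atm, 300 K

print("h  = %.1f kJ/kg" % (gas.enthalpy_mass / 1e3))
print("cp = %.1f J/(kg K)" % gas.cp_mass)

gas.equilibrate("HP")                                   # constant enthalpy/pressure burn
print("adiabatic flame temperature ~ %.0f K" % gas.T)
```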

  9. NeAT: a toolbox for the analysis of biological networks, clusters, classes and pathways.

    PubMed

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Sand, Olivier; Janky, Rekin's; Vanderstocken, Gilles; Deville, Yves; van Helden, Jacques

    2008-07-01

    The network analysis tools (NeAT) (http://rsat.ulb.ac.be/neat/) provide user-friendly web access to a collection of modular tools for the analysis of networks (graphs) and clusters (e.g. microarray clusters, functional classes, etc.). A first set of tools supports basic operations on graphs (comparison between two graphs, neighborhood of a set of input nodes, path finding and graph randomization). Another set of programs makes the connection between networks and clusters (graph-based clustering, clique discovery and mapping of clusters onto a network). The toolbox also includes programs for detecting significant intersections between clusters/classes (e.g. clusters of co-expression versus functional classes of genes). NeAT are designed to cope with large datasets and provide a flexible toolbox for analyzing biological networks stored in various databases (protein interactions, regulation and metabolism) or obtained from high-throughput experiments (two-hybrid, mass-spectrometry and microarrays). The web interface interconnects the programs in predefined analysis flows, enabling users to address a series of questions about networks of interest. Each tool can also be used separately by entering custom data for a specific analysis. NeAT can also be used as web services (SOAP/WSDL interface), in order to design programmatic workflows and integrate them with other available resources.

  10. A part toolbox to tune genetic expression in Bacillus subtilis

    PubMed Central

    Guiziou, Sarah; Sauveplane, Vincent; Chang, Hung-Ju; Clerté, Caroline; Declerck, Nathalie; Jules, Matthieu; Bonnet, Jerome

    2016-01-01

    Libraries of well-characterised components regulating gene expression levels are essential to many synthetic biology applications. While widely available for the Gram-negative model bacterium Escherichia coli, such libraries are lacking for the Gram-positive model Bacillus subtilis, a key organism for basic research and biotechnological applications. Here, we engineered a genetic toolbox comprising libraries of promoters, Ribosome Binding Sites (RBS), and protein degradation tags to precisely tune gene expression in B. subtilis. We first designed a modular Expression Operating Unit (EOU) facilitating parts assembly and modifications and providing a standard genetic context for gene circuits implementation. We then selected native, constitutive promoters of B. subtilis and efficient RBS sequences from which we engineered three promoters and three RBS sequence libraries exhibiting ∼14 000-fold dynamic range in gene expression levels. We also designed a collection of SsrA proteolysis tags of variable strength. Finally, by using fluorescence fluctuation methods coupled with two-photon microscopy, we quantified the absolute concentration of GFP in a subset of strains from the library. Our complete promoters and RBS sequences library comprising over 135 constructs enables tuning of GFP concentration over five orders of magnitude, from 0.05 to 700 μM. This toolbox of regulatory components will support many research and engineering applications in B. subtilis. PMID:27402159

  11. Origins and Evolution of Stomatal Development

    PubMed Central

    2017-01-01

    The fossil record suggests stomata-like pores were present on the surfaces of land plants over 400 million years ago. Whether stomata arose once or whether they arose independently across newly evolving land plant lineages has long been a matter of debate. In Arabidopsis, a genetic toolbox has been identified that tightly controls stomatal development and patterning. This includes the basic helix-loop-helix (bHLH) transcription factors SPEECHLESS (SPCH), MUTE, FAMA, and ICE/SCREAMs (SCRMs), which promote stomatal formation. These factors are regulated via a signaling cascade, which includes mobile EPIDERMAL PATTERNING FACTOR (EPF) peptides to enforce stomatal spacing. Mosses and hornworts, the most ancient extant lineages to possess stomata, possess orthologs of these Arabidopsis (Arabidopsis thaliana) stomatal toolbox genes, and manipulation in the model bryophyte Physcomitrella patens has shown that the bHLH and EPF components are also required for moss stomatal development and patterning. This supports an ancient and tightly conserved genetic origin of stomata. Here, we review recent discoveries and, by interrogating newly available plant genomes, we advance the story of stomatal development and patterning across land plant evolution. Furthermore, we identify potential orthologs of the key toolbox genes in a hornwort, further supporting a single ancient genetic origin of stomata in the ancestor to all stomatous land plants. PMID:28356502

  12. Registration of in vivo MR to histology of rodent brains using blockface imaging

    NASA Astrophysics Data System (ADS)

    Uberti, Mariano; Liu, Yutong; Dou, Huanyu; Mosley, R. Lee; Gendelman, Howard E.; Boska, Michael

    2009-02-01

    Registration of MRI to histopathological sections can enhance bioimaging validation for use in pathobiologic, diagnostic, and therapeutic evaluations. However, commonly used registration methods fall short of this goal due to tissue shrinkage and tearing after brain extraction and preparation. In an attempt to overcome these limitations, we developed a software toolbox using 3D blockface imaging as the common space of reference. This toolbox includes a semi-automatic brain extraction technique using constraint level sets (CLS), 3D reconstruction methods for the blockface and MR volume, and a 2D warping technique using thin-plate splines with landmark optimization. Using this toolbox, the rodent brain volume is first extracted from the whole-head MRI using CLS. The blockface volume is reconstructed, followed by 3D brain MRI registration to the blockface volume to correct the global deformations due to brain extraction and fixation. Finally, registered MRI and histological slices are warped to corresponding blockface images to correct slice-specific deformations. The CLS brain extraction technique was validated by comparison with manual extraction, showing 94% overlap. The image warping technique was validated by calculating target registration error (TRE). Results showed a registration accuracy of TRE < 1 pixel. Lastly, the registration method and the software tools developed were used to validate cell migration in murine human immunodeficiency virus type one encephalitis.
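
    The slice-level correction described above relies on landmark-based thin-plate-spline warping. Purely as an illustration (not the authors' toolbox, which also handles blockface reconstruction and constraint level sets), a minimal 2-D version can be written with SciPy; the landmark coordinates below are invented.

    ```python
    # Minimal sketch of 2-D thin-plate-spline warping driven by landmark pairs.
    import numpy as np
    from scipy.interpolate import RBFInterpolator
    from scipy.ndimage import map_coordinates

    def tps_warp(moving, src_pts, dst_pts):
        """Warp `moving` so that src_pts map onto dst_pts (backward mapping)."""
        h, w = moving.shape
        # Fit a TPS that sends destination landmarks back to source landmarks.
        tps = RBFInterpolator(dst_pts, src_pts, kernel="thin_plate_spline")
        rows, cols = np.mgrid[0:h, 0:w]
        grid = np.column_stack([rows.ravel(), cols.ravel()]).astype(float)
        sampled = tps(grid)                       # source coordinate for every output pixel
        coords = sampled.T.reshape(2, h, w)
        return map_coordinates(moving, coords, order=1, mode="nearest")

    if __name__ == "__main__":
        img = np.random.rand(64, 64)
        src = np.array([[10, 10], [10, 50], [50, 10], [50, 50], [32, 32]], float)
        dst = src + np.random.normal(0, 1.5, src.shape)   # small synthetic deformation
        print(tps_warp(img, src, dst).shape)
    ```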

  13. MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.

    PubMed

    Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M

    2002-05-30

    Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free, open source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (version 5.3-6.1, The Mathworks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or as time windows triggered on some event.

  14. A finite-element toolbox for the stationary Gross-Pitaevskii equation with rotation

    NASA Astrophysics Data System (ADS)

    Vergez, Guillaume; Danaila, Ionut; Auliac, Sylvain; Hecht, Frédéric

    2016-12-01

    We present a new numerical system using classical finite elements with mesh adaptivity for computing stationary solutions of the Gross-Pitaevskii equation. The programs are written as a toolbox for FreeFem++ (www.freefem.org), a free finite-element software available for all existing operating systems. This offers the advantage of hiding the technical issues related to the implementation of the finite element method, making it easy to code various numerical algorithms. Two robust and optimized numerical methods were implemented to minimize the Gross-Pitaevskii energy: a steepest descent method based on Sobolev gradients and a minimization algorithm based on the state-of-the-art optimization library Ipopt. For both methods, mesh adaptivity strategies are used to reduce the computational time and increase the local spatial accuracy when vortices are present. Different run cases are made available for 2D and 3D configurations of Bose-Einstein condensates in rotation. An optional graphical user interface is also provided, allowing users to easily run predefined cases or to supply user-defined parameter files. We also provide several post-processing tools (like the identification of quantized vortices) that could help in extracting physical features from the simulations. The toolbox is extremely versatile and can be easily adapted to deal with different physical models.
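
    For readers who want a feel for the energy-minimisation step, here is a toy NumPy sketch of normalised gradient descent on a one-dimensional Gross-Pitaevskii energy without rotation; it is not the FreeFem++ toolbox (no finite elements, no mesh adaptivity, no Sobolev gradients or Ipopt), and the grid, trap and interaction strength are arbitrary choices.

    ```python
    # Toy sketch: projected gradient descent on the 1-D Gross-Pitaevskii energy
    # E[psi] = integral of 0.5|psi'|^2 + V|psi|^2 + 0.5*g|psi|^4, with ||psi|| = 1.
    import numpy as np

    def gp_ground_state(n=256, L=16.0, g=50.0, tau=1e-3, steps=20000):
        x = np.linspace(-L / 2, L / 2, n)
        dx = x[1] - x[0]
        V = 0.5 * x**2                                   # harmonic trap
        psi = np.exp(-x**2)
        psi /= np.sqrt(np.sum(psi**2) * dx)              # normalise the initial guess

        def grad(p):
            lap = (np.roll(p, -1) - 2 * p + np.roll(p, 1)) / dx**2
            return -0.5 * lap + V * p + g * p**3         # dE/dpsi for real psi

        for _ in range(steps):
            psi = psi - tau * grad(psi)
            psi /= np.sqrt(np.sum(psi**2) * dx)          # project back onto the unit sphere
        mu = np.sum(psi * grad(psi)) * dx                # chemical potential estimate
        return x, psi, mu

    if __name__ == "__main__":
        x, psi, mu = gp_ground_state()
        print("chemical potential ~", round(float(mu), 3))
    ```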

  15. Integration of Lead Discovery Tactics and the Evolution of the Lead Discovery Toolbox.

    PubMed

    Leveridge, Melanie; Chung, Chun-Wa; Gross, Jeffrey W; Phelps, Christopher B; Green, Darren

    2018-06-01

    There has been much debate around the success rates of various screening strategies to identify starting points for drug discovery. Although high-throughput target-based and phenotypic screening has been the focus of this debate, techniques such as fragment screening, virtual screening, and DNA-encoded library screening are also increasingly reported as a source of new chemical equity. Here, we provide examples in which integration of more than one screening approach has improved the campaign outcome and discuss how strengths and weaknesses of various methods can be used to build a complementary toolbox of approaches, giving researchers the greatest probability of successfully identifying leads. Among others, we highlight case studies for receptor-interacting serine/threonine-protein kinase 1 and the bromo- and extra-terminal domain family of bromodomains. In each example, the unique insight or chemistries individual approaches provided are described, emphasizing the synergy of information obtained from the various tactics employed and the particular question each tactic was employed to answer. We conclude with a short prospective discussing how screening strategies are evolving, what this screening toolbox might look like in the future, how to maximize success through integration of multiple tactics, and scenarios that drive selection of one combination of tactics over another.

  16. National Institutes of Health Toolbox Emotion Battery for English- and Spanish-speaking adults: normative data and factor-based summary scores.

    PubMed

    Babakhanyan, Ida; McKenna, Benjamin S; Casaletto, Kaitlin B; Nowinski, Cindy J; Heaton, Robert K

    2018-01-01

    The National Institutes of Health Toolbox Emotion Battery (NIHTB-EB) is a "common currency" computerized assessment developed to measure the full spectrum of emotional health. Though comprehensive, the NIHTB-EB's 17 scales may be unwieldy for users aiming to capture more global indices of emotional functioning. The NIHTB-EB was administered to 1,036 English-speaking and 408 Spanish-speaking adults as part of the NIH Toolbox norming project. We examined the factor structure of the NIHTB-EB in English- and Spanish-speaking adults and developed factor analysis-based summary scores. Census-weighted norms were presented for English speakers, and sample-weighted norms were presented for Spanish speakers. Exploratory factor analysis for both English- and Spanish-speaking cohorts resulted in the same 3-factor solution: 1) negative affect, 2) social satisfaction, and 3) psychological well-being. Confirmatory factor analysis supported similar factor structures for English- and Spanish-speaking cohorts. Model fit indices fell within the acceptable/good range, and our final solution was optimal compared to other solutions. Summary scores based upon the normative samples appear to be psychometrically supported and should be applied to clinical samples to further validate the factor structures and investigate rates of problematic emotions in medical and psychiatric populations.
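
    As an illustration of the factor-analytic step (not the NIHTB-EB scoring code; the data here are simulated stand-ins for the 17 scale scores), a three-factor solution can be extracted with scikit-learn:

    ```python
    # Simulated sketch of a three-factor exploratory factor analysis over 17 scales.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    n_people, n_scales, n_factors = 1000, 17, 3

    # Simulate scale scores that genuinely contain three latent factors.
    latent = rng.normal(size=(n_people, n_factors))
    loadings_true = rng.normal(scale=0.8, size=(n_factors, n_scales))
    scores = latent @ loadings_true + rng.normal(scale=0.5, size=(n_people, n_scales))

    fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
    factor_scores = fa.fit_transform(scores)          # per-person factor scores
    print("loadings shape:", fa.components_.shape)    # (3, 17)
    print("first person's factor scores:", np.round(factor_scores[0], 2))
    ```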

  17. The rate of high ovarian response in women identified at risk by a high serum AMH level is influenced by the type of gonadotropin.

    PubMed

    Arce, Joan-Carles; Klein, Bjarke M; La Marca, Antonio

    2014-06-01

    The aim was to compare ovarian response and clinical outcome of potential high-responders after stimulation with highly purified menotropin (HP-hMG) or recombinant follicle-stimulating hormone (rFSH) for in vitro fertilisation/intracytoplasmic sperm injection. Retrospective analysis was performed on data collected in two randomized controlled trials, one conducted following a long GnRH agonist protocol and the other with an antagonist protocol. Potential high-responders (n = 155 and n = 188 in the agonist and antagonist protocol, respectively) were defined as having an initial anti-Müllerian hormone (AMH) value >75th percentile (5.2 ng/ml). In both protocols, HP-hMG stimulation in women in the high AMH category was associated with a significantly lower occurrence of high response (≥15 oocytes retrieved) than rFSH stimulation; 33% versus 51% (p = 0.025) and 31% versus 49% (p = 0.015) in the long agonist and antagonist protocol, respectively. In the potential high-responder women, trends for improved live birth rate were observed with HP-hMG compared with rFSH (long agonist protocol: 33% versus 20%, p = 0.074; antagonist protocol: 34% versus 23%, p = 0.075; overall population: 34% versus 22%, p = 0.012). In conclusion, the type of gonadotropin used for ovarian stimulation influences high-response rates and potentially clinical outcome in women identified as potential high-responders.

  18. Acute Hematological and Inflammatory Responses to High-intensity Exercise Tests: Impact of Duration and Mode of Exercise.

    PubMed

    Minuzzi, Luciele G; Carvalho, Humberto M; Brunelli, Diego T; Rosado, Fatima; Cavaglieri, Cláudia R; Gonçalves, Carlos E; Gaspar, Joana M; Rama, Luís M; Teixeira, Ana M

    2017-07-01

    The purpose of this study was to investigate the hematological and inflammatory responses to 4 maximal high-intensity protocols, considering energy expenditure in each test. Nine healthy volunteers performed 4 high-intensity exercise tests of short [Wingate (WANT); repeated sprints (RSA)] and long duration [continuous VO2 test (VCONT); intermittent VO2 test (VINT)] on a cycle ergometer, until exhaustion. Hematological parameters and IL-6, IL-10 and creatine kinase (CK) levels were determined before (PRE) and immediately (POST), 30 min, 1, 2, 12 and 24 h after the end of the protocols. Additionally, energy expenditure was determined. Leucocytes, erythrocytes and lymphocytes increased at POST and returned to PRE values at 30 min for all protocols. Lymphocytes showed a second decrease at 2 h, and granulocytes increased at 2 h compared to PRE. Both variables returned to PRE values between 12-24 h into recovery. The magnitude of response for IL-6 was greater in VINT and for IL-10 in VCONT. There was no association of energy expenditure within each exercise protocol with the pattern of IL-6, IL-10 and CK responses to the exercise protocols. The present findings support that similar responses are observed after continuous or intermittent acute protocols when exercises are performed to volitional failure, regardless of the duration and mode of exercise. © Georg Thieme Verlag KG Stuttgart · New York.

  19. New Horizons on Molecular Pharmacology Applied to Drug Discovery: When Resonance Overcomes Radioligand Binding.

    PubMed

    Pernomian, Larissa; Gomes, Mayara Santos; Moreira, Josimar Dornelas; da Silva, Carlos Henrique Tomich de Paula; Rosa, Joaquin Maria Campos; Cardoso, Cristina Ribeiro de Barros

    2017-01-01

    One of the cornerstones of rational drug development is the measurement of molecular parameters derived from ligand-receptor interaction, which guides the definition of therapeutic windows. Over the last decades, radioligand binding has provided valuable contributions in this field as a key method for such purposes. However, its limitations spurred the development of more exquisite techniques for determining such parameters. For instance, safety risks related to radioactivity waste, expensive and controlled disposal of radioisotopes, radiotracer separation-dependence for affinity analysis, and fitting of data based on one-site mathematical models make radioligand binding a suboptimal approach in providing measures of actual affinity conformations from ligands and G protein-coupled receptors (GPCR). Current advances in high-throughput screening (HTS) assays have markedly extended the range of sensitive approaches for monitoring ligand affinity. The advent of the novel bioluminescent donor NanoLuc luciferase (Nluc), engineered from Oplophorus gracilirostris luciferase, allowed bioluminescence resonance energy transfer (BRET) to be adapted for monitoring ligand binding. This novel approach, named the Nluc-based BRET (NanoBRET) binding assay, consists of a real-time homogeneous proximity assay that overcomes radioligand binding limitations while ensuring the quality of affinity measurements. Here, we cover the main advantages of the NanoBRET protocol and the drawbacks of radioligand binding as molecular methods within the pharmacological toolbox applied to Drug Discovery. Also, we provide a novel perspective for the application of NanoBRET technology in affinity assays for multiple-state binding mechanisms involving oligomerization and/or functional biased selectivity. This new angle was proposed based on specific biophysical criteria required for the real-time homogeneity assigned to the proximity NanoBRET protocol. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  20. Critical Response Protocol

    ERIC Educational Resources Information Center

    Ellingson, Charlene; Roehrig, Gillian; Bakkum, Kris; Dubinsky, Janet M.

    2016-01-01

    This article introduces the Critical Response Protocol (CRP), an arts-based technique that engages students in equitable critical discourse and aligns with the "Next Generation Science Standards" vision for providing students opportunities for language learning while advancing science learning (NGSS Lead States 2013). CRP helps teachers…

  1. Blood flow dynamics in heart failure

    NASA Technical Reports Server (NTRS)

    Shoemaker, J. K.; Naylor, H. L.; Hogeman, C. S.; Sinoway, L. I.

    1999-01-01

    BACKGROUND: Exercise intolerance in heart failure (HF) may be due to inadequate vasodilation, augmented vasoconstriction, and/or altered muscle metabolic responses that lead to fatigue. METHODS AND RESULTS: Vascular and metabolic responses to rhythmic forearm exercise were tested in 9 HF patients and 9 control subjects (CTL) during 2 protocols designed to examine the effect of HF on the time course of oxygen delivery versus uptake (protocol 1) and on vasoconstriction during exercise with 50 mm Hg pressure about the forearm to evoke a metaboreflex (protocol 2). In protocol 1, venous lactate and H+ were greater at 4 minutes of exercise in HF versus CTL (P<0.05) despite similar blood flow and oxygen uptake responses. In protocol 2, mean arterial pressure increased similarly in each group during ischemic exercise. In CTL, forearm blood flow and vascular conductance were similar at the end of ischemic and ambient exercise. In HF, forearm blood flow and vascular conductance were reduced during ischemic exercise compared with the ambient trial. CONCLUSIONS: Intrinsic differences in skeletal muscle metabolism, not vasodilatory dynamics, must account for the augmented glycolytic metabolic responses to moderate-intensity exercise in class II and III HF. The inability to increase forearm vascular conductance during ischemic handgrip exercise, despite a normal pressor response, suggests that enhanced vasoconstriction of strenuously exercising skeletal muscle contributes to exertional fatigue in HF.

  2. Basic Radar Altimetry Toolbox and Radar Altimetry Tutorial: Tools for all Altimetry Users

    NASA Astrophysics Data System (ADS)

    Rosmorduc, Vinca; Benveniste, J.; Breebaart, L.; Bronner, E.; Dinardo, S.; Earith, D.; Lucas, B. M.; Maheu, C.; Niejmeier, S.; Picot, N.

    2013-09-01

    The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data, including data from the next mission to be launched, Saral. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings. More than 2000 people had downloaded it as of the end of September 2012, among them many "newcomers" to altimetry as well as teachers and students. Users' feedback, developments in altimetry, and practice showed that new features of interest could be added. Some have been added and/or improved in versions 2 to 4; others are under development, and some are in discussion for the future. The Basic Radar Altimetry Toolbox is able: to read most distributed radar altimetry data, including data from future missions like Saral and Jason-3; to perform some processing, data editing and statistics; and to visualize the results. It can be used at several levels and in several ways, including as an educational tool via the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  3. Towards a Comprehensive Catalog of Volcanic Seismicity

    NASA Astrophysics Data System (ADS)

    Thompson, G.

    2014-12-01

    Catalogs of earthquakes located using differential travel-time techniques are a core product of volcano observatories, and while vital, they represent an incomplete perspective of volcanic seismicity. Many (often most) earthquakes are too small to locate accurately, and are omitted from available catalogs. Low frequency events, tremor and signals related to rockfalls, pyroclastic flows and lahars are not systematically catalogued, and yet from a hazard management perspective are exceedingly important. Because STA/LTA detection schemes break down in the presence of high amplitude tremor, swarms or dome collapses, catalogs may suggest low seismicity when seismicity peaks. We propose to develop a workflow and underlying software toolbox that can be applied to near-real-time and offline waveform data to produce comprehensive catalogs of volcanic seismicity. Existing tools to detect and locate phaseless signals will be adapted to fit within this framework. For this proof of concept the toolbox will be developed in MATLAB, extending the existing GISMO toolbox (an object-oriented MATLAB toolbox for seismic data analysis). Existing database schemas such as the CSS 3.0 will need to be extended to describe this wider range of volcano-seismic signals. WOVOdat may already incorporate many of the additional tables needed. Thus our framework may act as an interface between volcano observatories (or campaign-style research projects) and WOVOdat. We aim to take the further step of reducing volcano-seismic catalogs to sets of continuous metrics that are useful for recognizing data trends, and for feeding alarm systems and forecasting techniques. Previous experience has shown that frequency index, peak frequency, mean frequency, mean event rate, median event rate, and cumulative magnitude (or energy) are potentially useful metrics to generate for all catalogs at a 1-minute sample rate (directly comparable with RSAM and similar metrics derived from continuous data). Our framework includes tools to plot these metrics in a consistent manner. We work with data from unrest at Redoubt volcano and Soufriere Hills volcano to develop our framework.
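
    As a loose illustration of two of these continuous metrics (not GISMO code; band edges, window length and the synthetic signal are arbitrary choices), an RSAM-like 1-minute amplitude and a spectral frequency index can be computed with NumPy:

    ```python
    # Illustrative continuous metrics from a waveform: 1-minute RSAM and a
    # frequency index (log10 ratio of high-band to low-band spectral energy).
    import numpy as np

    def rsam(trace, fs, window_s=60.0):
        n = int(window_s * fs)
        nwin = len(trace) // n
        return np.array([np.mean(np.abs(trace[i * n:(i + 1) * n])) for i in range(nwin)])

    def frequency_index(trace, fs, low=(1.0, 2.0), high=(10.0, 20.0)):
        freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
        power = np.abs(np.fft.rfft(trace)) ** 2
        e_low = power[(freqs >= low[0]) & (freqs < low[1])].sum()
        e_high = power[(freqs >= high[0]) & (freqs < high[1])].sum()
        return np.log10(e_high / e_low)

    if __name__ == "__main__":
        fs = 100.0                                  # samples per second
        t = np.arange(0, 600, 1.0 / fs)             # ten minutes of synthetic data
        trace = np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.sin(2 * np.pi * 12.0 * t)
        print("RSAM per minute:", np.round(rsam(trace, fs), 3))
        print("frequency index:", round(frequency_index(trace, fs), 2))
    ```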

  4. FracPaQ: A MATLAB™ toolbox for the quantification of fracture patterns

    NASA Astrophysics Data System (ADS)

    Healy, David; Rizzo, Roberto E.; Cornwell, David G.; Farrell, Natalie J. C.; Watkins, Hannah; Timms, Nick E.; Gomez-Rivas, Enrique; Smith, Michael

    2017-02-01

    The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, and spatial distributions often exhibit some kind of order. In detail, relationships may exist among the different fracture attributes, e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture attributes and patterns. This paper describes FracPaQ, a new open source, cross-platform toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on previously published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales, rock types and tectonic settings. The implemented methods are inherently scale independent, and a key task, where applicable, is analysing and integrating quantitative fracture pattern data from micro- to macro-scales. The toolbox was developed in MATLAB™ and the source code is publicly available on GitHub™ and the Mathworks™ FileExchange. The code runs on any computer with MATLAB installed, including PCs with Microsoft Windows, Apple Macs with Mac OS X, and machines running different flavours of Linux. The application, source code and sample input files are available in open repositories in the hope that other developers and researchers will optimise and extend the functionality for the benefit of the wider community.
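
    FracPaQ itself is a MATLAB™ toolbox; purely as an illustration of the kind of quantities it reports, the short Python sketch below derives trace lengths and strike orientations from 2-D segment endpoints (the coordinates are invented):

    ```python
    # Illustration only (not FracPaQ): lengths and orientations of 2-D fracture traces.
    import numpy as np

    segments = np.array([                   # (x1, y1, x2, y2) per fracture trace
        [0.0, 0.0, 1.0, 0.2],
        [0.5, 0.5, 0.6, 1.4],
        [1.0, 0.0, 1.8, 0.9],
    ])
    dx = segments[:, 2] - segments[:, 0]
    dy = segments[:, 3] - segments[:, 1]
    lengths = np.hypot(dx, dy)
    strikes = np.degrees(np.arctan2(dy, dx)) % 180.0    # orientations folded to 0-180 degrees

    print("mean trace length:", lengths.mean().round(3))
    print("orientation histogram (18 bins of 10 degrees):",
          np.histogram(strikes, bins=18, range=(0, 180))[0])
    ```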

  5. A toolbox model of evolution of metabolic pathways on networks of arbitrary topology.

    PubMed

    Pang, Tin Yau; Maslov, Sergei

    2011-05-01

    In prokaryotic genomes the number of transcriptional regulators is known to be proportional to the square of the total number of protein-coding genes. A toolbox model of evolution was recently proposed to explain this empirical scaling for metabolic enzymes and their regulators. According to its rules, the metabolic network of an organism evolves by horizontal transfer of pathways from other species. These pathways are part of a larger "universal" network formed by the union of all species-specific networks. It remained to be understood, however, how the topological properties of this universal network influence the scaling law of functional content of genomes in the toolbox model. Here we answer this question by first analyzing the scaling properties of the toolbox model on arbitrary tree-like universal networks. We prove that critical branching topology, in which the average number of upstream neighbors of a node is equal to one, is both necessary and sufficient for quadratic scaling. We further generalize the rules of the model to incorporate reactions with multiple substrates/products as well as branched and cyclic metabolic pathways. To achieve its metabolic tasks, the new model employs evolutionarily optimized pathways with a minimal number of reactions. Numerical simulations of this realistic model on the universal network of all reactions in the KEGG database produced approximately quadratic scaling between the number of regulated pathways and the size of the metabolic network. To quantify the geometrical structure of individual pathways, we investigated the relationship between their number of reactions, byproducts, intermediates, and feedback metabolites. Our results validate and explain the ubiquitous appearance of the quadratic scaling for a broad spectrum of topologies of underlying universal metabolic networks. They also demonstrate why, in spite of "small-world" topology, real-life metabolic networks are characterized by a broad distribution of pathway lengths and sizes of metabolic regulons in regulatory networks.
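
    The following rough Python sketch mimics the pathway-acquisition rule as we read it from the abstract (pick a random metabolite not yet producible and graft the path from it down to the existing network, with one dedicated regulator per acquired pathway); the Poisson(1) offspring distribution used to build a critical branching tree and the run sizes are our own assumptions, not taken from the paper.

    ```python
    # Rough simulation sketch of a toolbox-style pathway acquisition process.
    import numpy as np

    rng = np.random.default_rng(1)

    def critical_tree(n_nodes):
        """Grow a tree whose nodes have Poisson(1) offspring (critical branching)."""
        while True:
            parent, frontier = [-1], [0]
            while frontier and len(parent) < n_nodes:
                node = frontier.pop(0)
                for _ in range(rng.poisson(1.0)):
                    parent.append(node)
                    frontier.append(len(parent) - 1)
            if len(parent) >= n_nodes:          # retry if the tree died out too early
                return np.array(parent[:n_nodes])

    def acquire_pathways(parent, n_pathways):
        """Repeatedly graft the path from a random node down to the existing network."""
        in_net = np.zeros(len(parent), bool)
        in_net[0] = True                        # the root stands for core metabolism
        sizes = []
        for k in range(1, n_pathways + 1):
            target = rng.integers(len(parent))
            while in_net[target]:               # pick a metabolite not yet producible
                target = rng.integers(len(parent))
            node = target
            while not in_net[node]:             # walk toward the root, adding reactions
                in_net[node] = True
                node = parent[node]
            sizes.append((k, int(in_net.sum()) - 1))   # (regulators, reactions)
        return sizes

    if __name__ == "__main__":
        tree = critical_tree(20000)
        for n_reg, n_rxn in acquire_pathways(tree, 200)[::40]:
            print(f"pathways={n_reg:4d}  reactions={n_rxn:6d}")
    ```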

  6. Role of Gist and PHOG Features in Computer-Aided Diagnosis of Tuberculosis without Segmentation

    PubMed Central

    Chauhan, Arun; Chauhan, Devesh; Rout, Chittaranjan

    2014-01-01

    Purpose: Effective diagnosis of tuberculosis (TB) relies on accurate interpretation of radiological patterns found in a chest radiograph (CXR). Lack of skilled radiologists and other resources, especially in developing countries, hinders its efficient diagnosis. Computer-aided diagnosis (CAD) methods provide second opinion to the radiologists for their findings and thereby assist in better diagnosis of cancer and other diseases including TB. However, existing CAD methods for TB are based on the extraction of textural features from manually or semi-automatically segmented CXRs. These methods are prone to errors and cannot be implemented in X-ray machines for automated classification. Methods: Gabor, Gist, histogram of oriented gradients (HOG), and pyramid histogram of oriented gradients (PHOG) features extracted from the whole image can be implemented into existing X-ray machines to discriminate between TB and non-TB CXRs in an automated manner. Localized features were extracted for the above methods using various parameters, such as frequency range, blocks and region of interest. The performance of these features was evaluated against textural features. Two digital CXR image datasets (8-bit DA and 14-bit DB) were used for evaluating the performance of these features. Results: Gist (accuracy 94.2% for DA, 86.0% for DB) and PHOG (accuracy 92.3% for DA, 92.0% for DB) features provided better results for both the datasets. These features were implemented to develop a MATLAB toolbox, TB-Xpredict, which is freely available for academic use at http://sourceforge.net/projects/tbxpredict/. This toolbox provides both automated training and prediction modules and does not require expertise in image processing for operation. Conclusion: Since the features used in TB-Xpredict do not require segmentation, the toolbox can easily be implemented in X-ray machines. This toolbox can effectively be used for the mass screening of TB in high-burden areas with improved efficiency. PMID:25390291
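
    The whole-image scheme described here (features from unsegmented CXRs feeding a classifier) is easy to prototype; the sketch below is an illustration only, not the TB-Xpredict toolbox, and uses random noise images as stand-ins for real chest radiographs:

    ```python
    # Illustration: whole-image HOG features feeding a linear SVM, no segmentation step.
    # The images are random noise, so cross-validated accuracy will be near chance;
    # real use would load and label actual CXR files.
    import numpy as np
    from skimage.feature import hog
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    def hog_features(image):
        # 9-orientation HOG computed over the whole image.
        return hog(image, orientations=9, pixels_per_cell=(32, 32), cells_per_block=(2, 2))

    images = rng.random((40, 256, 256))          # placeholder "CXRs"
    labels = np.array([1] * 20 + [0] * 20)       # 1 = TB, 0 = non-TB (arbitrary here)
    X = np.array([hog_features(im) for im in images])

    clf = LinearSVC()
    print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
    ```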

  7. Retrospective analysis for treatment of naïve canine multicentric lymphoma with a 15-week, maintenance-free CHOP protocol.

    PubMed

    Curran, K; Thamm, D H

    2016-08-01

    Standard of care treatment of dogs with multicentric lymphoma includes combination chemotherapy with cyclophosphamide, doxorubicin, vincristine and prednisone (CHOP); however, owners may be hesitant to commit the resources necessary to complete a lengthy, multi-drug protocol. One hundred thirty-four client-owned dogs with multicentric lymphoma were treated with a 15-week CHOP chemotherapy protocol. The overall response rate was 98% with 104 dogs experiencing a complete response (CR). The median progression-free survival (PFS) time for all dogs was 176 days, and the median disease-specific overall survival time was 311 days. Prognostic factors identified on multivariate analysis as significant for PFS included substage, immunophenotype, hospitalization for adverse events, need for dose reduction, presence of neutrophilia at diagnosis, presence of anemia and experiencing a CR as best response to therapy. In conclusion, this protocol may be a viable alternative to CHOP protocols using a larger number of treatments. © 2015 John Wiley & Sons Ltd.

  8. Addressing Participant Validity in a Small Internet Health Survey (The Restore Study): Protocol and Recommendations for Survey Response Validation.

    PubMed

    Dewitt, James; Capistrant, Benjamin; Kohli, Nidhi; Rosser, B R Simon; Mitteldorf, Darryl; Merengwa, Enyinnaya; West, William

    2018-04-24

    While deduplication and cross-validation protocols have been recommended for large Web-based studies, protocols for survey response validation of smaller studies have not been published. This paper reports the challenges of survey validation inherent in small Web-based health survey research. The subject population comprised North American gay and bisexual prostate cancer survivors, who represent an under-researched, hidden, difficult-to-recruit, minority-within-a-minority population. In 2015-2016, advertising on a large Web-based cancer survivor support network, using email and social media, yielded 478 completed surveys. Our manual deduplication and cross-validation protocol identified 289 survey submissions (289/478, 60.4%) as likely spam, most stemming from advertising on social media. The basic components of this deduplication and validation protocol are detailed. An unexpected challenge was that invalid survey responses evolved across the study period. This necessitated that the static detection protocol be augmented with a dynamic one. Five recommendations for validation of Web-based samples, especially with smaller difficult-to-recruit populations, are detailed. ©James Dewitt, Benjamin Capistrant, Nidhi Kohli, B R Simon Rosser, Darryl Mitteldorf, Enyinnaya Merengwa, William West. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 24.04.2018.
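
    The abstract does not list the study's exact criteria, but static checks of the general kind such protocols use can be expressed compactly with pandas; the column names and thresholds below are assumptions for illustration, not the Restore study's actual rules:

    ```python
    # Generic static checks for a small web survey: duplicates, implausible speed,
    # failed eligibility screener. Column names and thresholds are assumptions.
    import pandas as pd

    def flag_suspect_responses(df: pd.DataFrame) -> pd.DataFrame:
        out = df.copy()
        out["dup_ip"] = out.duplicated(subset="ip_address", keep=False)
        out["dup_email"] = out.duplicated(subset="email", keep=False)
        out["too_fast"] = out["completion_minutes"] < 5           # implausibly quick
        out["bad_eligibility"] = ~out["reports_prostate_cancer"]  # fails screener item
        out["suspect"] = out[["dup_ip", "dup_email", "too_fast", "bad_eligibility"]].any(axis=1)
        return out

    if __name__ == "__main__":
        demo = pd.DataFrame({
            "ip_address": ["1.2.3.4", "1.2.3.4", "5.6.7.8"],
            "email": ["a@x.org", "b@y.org", "c@z.org"],
            "completion_minutes": [22, 3, 40],
            "reports_prostate_cancer": [True, True, True],
        })
        print(flag_suspect_responses(demo)[["suspect"]])
    ```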

  9. HEART RATE AND INDIRECT BLOOD PRESSURE RESPONSES TO FOUR DIFFERENT FIELD ANESTHETIC PROTOCOLS IN WILD-BORN CAPTIVE CHIMPANZEES (PAN TROGLODYTES).

    PubMed

    Atencia, Rebeca; Stöhr, Eric J; Drane, Aimee L; Stembridge, Mike; Howatson, Glyn; Del Rio, Pablo Rodriguez Lopez; Feltrer, Yedra; Tafon, Babila; Redrobe, Sharon; Peck, Bruce; Eng, Jaclyn; Unwin, Steve; Sanchez, Carlos R; Shave, Rob E

    2017-09-01

    Limited data are available on hemodynamic responses to anesthetic protocols in wild-born chimpanzees (Pan troglodytes). Accordingly, this study characterized the heart rate (HR) and blood pressure responses to four anesthetic protocols in 176 clinically healthy, wild-born chimpanzees undergoing routine health assessments. Animals were anesthetized with medetomidine-ketamine (MK) (n = 101), tiletamine-zolazepam (TZ) (n = 30), tiletamine-zolazepam-medetomidine (TZM) (n = 24), or medetomidine-ketamine (maintained with isoflurane) (MKI) (n = 21). During each procedure, HR, systolic blood pressure (SBP), and diastolic blood pressure (DBP) were regularly recorded. Data were grouped according to anesthetic protocol, and mean HR, SBP, and DBP were calculated. Differences between mean HR, SBP, and DBP for each anesthetic protocol were assessed using the Kruskal-Wallis test and a Dunn multiple comparisons post hoc analysis. To assess the hemodynamic time course response to each anesthetic protocol, group mean data (±95% confidence interval [CI]) were plotted against time postanesthetic induction. Mean HR (beats/min [CI]) was significantly higher in TZ (86 [80-92]) compared to MKI (69 [61-78]) and MK (62 [60-64]) and in TZM (73 [68-78]) compared to MK. The average SBP and DBP values (mm Hg [CI]) were significantly higher in MK (130 [126-134] and 94 [91-97]) compared to TZ (104 [96-112] and 58 [53-93]) and MKI (113 [103-123] and 78 [69-87]) and in TZM (128 [120-135] and 88 [83-93]) compared to TZ. Time course data were markedly different between protocols, with MKI showing the greatest decline over time. Both the anesthetic protocol adopted and the timing of measurement after injection influence hemodynamic recordings in wild-born chimpanzees and need to be considered when monitoring or assessing cardiovascular health.
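
    The group comparison named above (Kruskal-Wallis across the four protocols, with a Dunn post hoc step that would need an extra package such as scikit-posthocs) looks like this in SciPy; the heart-rate samples are simulated around the group means reported in the abstract:

    ```python
    # Kruskal-Wallis comparison of heart rate across four anesthetic protocols
    # (simulated data centred on the group means quoted in the abstract).
    import numpy as np
    from scipy.stats import kruskal

    rng = np.random.default_rng(3)
    hr = {
        "MK":  rng.normal(62, 6, 101),
        "TZ":  rng.normal(86, 8, 30),
        "TZM": rng.normal(73, 7, 24),
        "MKI": rng.normal(69, 9, 21),
    }
    stat, p = kruskal(*hr.values())
    print(f"H = {stat:.2f}, p = {p:.4g}")
    ```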

  10. Comparison of cyclophosphamide-thalidomide-dexamethasone to bortezomib-cyclophosphamide-dexamethasone as induction therapy for multiple myeloma patients in Brazil.

    PubMed

    Vigolo, Suelen; Zuckermann, Joice; Bittencourt, Rosane Isabel; Silla, Lúcia; Pilger, Diogo André

    2017-09-01

    Chemotherapy followed by autologous hematopoietic stem cell transplantation (HSCT) remains the standard treatment for multiple myeloma (MM). Thalidomide or bortezomib may be combined with cyclophosphamide and dexamethasone, in what are known as the CTD and VCD protocols, respectively. The objective of this study was to evaluate the clinical characteristics and response rates obtained with CTD and VCD, observing whether the inclusion of bortezomib in the treatment of MM patients in Brazil increases therapeutic efficiency. Forty-three MM patients treated with induction protocols CTD and VCD between January 2010 and March 2015 were included. The parameters analyzed were staging, frequency of comorbidities prior to treatment, response rates obtained at each induction cycle, progression-free survival, and overall survival of patients. Very good partial response and complete response obtained with the VCD protocol were superior, compared with the CTD treatment. The presence of comorbidities was similar in the two groups, except kidney failure, which prevailed in the VCD group. Also, 78.3% and 48.3% of patients treated with the VCD and CTD protocols underwent autologous HSCT, respectively. In patients given the VCD protocol, 45.5% had complete response before autologous HSCT. Among those given CTD, this number was only 7.1% (p=0.023). Disease progression after autologous HSCT did not differ between the two groups. VCD afforded better responses than the CTD protocol, and improved patient condition before autologous HSCT. However, more studies are necessary, including more patients and addressing various clinical conditions, as well as analyses of the cost-effectiveness of these treatments. Copyright © 2017 King Faisal Specialist Hospital & Research Centre. Published by Elsevier B.V. All rights reserved.

  11. The contemptuous separation: Facial expressions of emotion and breakups in young adulthood

    PubMed Central

    Heshmati, Saeideh; Sbarra, David A.; Mason, Ashley E.

    2017-01-01

    The importance of studying specific and expressed emotions after a stressful life event is well known, yet few studies have moved beyond assessing self-reported emotional responses to a romantic breakup. This study examined associations between computer-recognized facial expressions and self-reported breakup-related distress among recently separated college-aged young adults (N = 135; 37 men) on four visits across 9 weeks. Participants’ facial expressions were coded using the Computer Expression Recognition Toolbox while participants spoke about their breakups. Of the seven expressed emotions studied, only Contempt showed a unique association with breakup-related distress over time. At baseline, greater Contempt was associated with less breakup-related distress; however, over time, greater Contempt was associated with greater breakup-related distress. PMID:29249896

  12. The contemptuous separation: Facial expressions of emotion and breakups in young adulthood.

    PubMed

    Heshmati, Saeideh; Sbarra, David A; Mason, Ashley E

    2017-06-01

    The importance of studying specific and expressed emotions after a stressful life event is well known, yet few studies have moved beyond assessing self-reported emotional responses to a romantic breakup. This study examined associations between computer-recognized facial expressions and self-reported breakup-related distress among recently separated college-aged young adults (N = 135; 37 men) on four visits across 9 weeks. Participants' facial expressions were coded using the Computer Expression Recognition Toolbox while participants spoke about their breakups. Of the seven expressed emotions studied, only Contempt showed a unique association with breakup-related distress over time. At baseline, greater Contempt was associated with less breakup-related distress; however, over time, greater Contempt was associated with greater breakup-related distress.

  13. Image-based Analysis to Study Plant Infection with Human Pathogens

    PubMed Central

    Schikora, Marek; Schikora, Adam

    2014-01-01

    Our growing awareness that contaminated plants, fresh fruits and vegetables are responsible for a significant proportion of food poisoning with pathogenic microorganisms reinforces the demand to understand the interactions between plants and human pathogens. Today we understand that those pathogens do not merely survive on or within plants; they actively infect plant organisms by suppressing their immune system. Studies on the infection process and disease development have mainly used physiological, genetic, and molecular approaches, and image-based analysis provides yet another method for this toolbox. Employed as an observational tool, it bears the potential for objective and high-throughput approaches, and together with other methods it will very likely be part of data fusion approaches in the near future. PMID:25505501

  14. Dasymetric Toolbox

    EPA Pesticide Factsheets

    Dasymetric mapping is a geospatial technique that uses additional information, such as landcover types, to more accurately distribute data that has been assigned to arbitrary boundaries, such as census blocks.

  15. MMM: A toolbox for integrative structure modeling.

    PubMed

    Jeschke, Gunnar

    2018-01-01

    Structural characterization of proteins and their complexes may require integration of restraints from various experimental techniques. MMM (Multiscale Modeling of Macromolecules) is a Matlab-based open-source modeling toolbox for this purpose with a particular emphasis on distance distribution restraints obtained from electron paramagnetic resonance experiments on spin-labelled proteins and nucleic acids and their combination with atomistic structures of domains or whole protomers, small-angle scattering data, secondary structure information, homology information, and elastic network models. MMM integrates not only various types of restraints but also various existing modeling tools, by providing a common graphical user interface to them. The types of restraints that can support such modeling and the available model types are illustrated by recent application examples. © 2017 The Protein Society.

  16. Advanced Thermoplastic Polymers and Additive Manufacturing Applied to ISS Columbus Toolbox: Lessons Learnt and Results

    NASA Astrophysics Data System (ADS)

    Ferrino, Marinella; Secondo, Ottaviano; Sabbagh, Amir; Della Sala, Emilio

    2014-06-01

    In the frame of the International Space Station (ISS) Exploitation Program, a new toolbox has been realized by TAS-I to accommodate the tools currently in use on the ISS Columbus Module, utilizing full-scale prototypes obtained with 3D rapid prototyping. The manufacturing of the flight hardware by means of the advanced thermoplastic polymer ULTEM 9085 and additive manufacturing Fused Deposition Modelling (FDM) technology represents the innovative element. In this paper, the results achieved and the lessons learned are analyzed to promote future technology know-how. The acquired experience confirmed that the additive manufacturing process saves time and cost and allows new shapes and features to be realized, introducing innovation into products and future design processes for space applications.

  17. An Anonymous Surveying Protocol via Greenberger-Horne-Zeilinger States

    NASA Astrophysics Data System (ADS)

    Naseri, Mosayeb; Gong, Li-Hua; Houshmand, Monireh; Matin, Laleh Farhang

    2016-10-01

    A new, experimentally feasible anonymous survey protocol with authentication using Greenberger-Horne-Zeilinger (GHZ) entangled states is proposed. In this protocol, the chief executive officer (CEO) of a firm or company is trying to find out the effect of a possible action. In order to prepare a fair vote, the CEO would like to conduct an anonymous survey and is interested in the total response for the whole company rather than a partial estimate for each department. In our proposal, there are two voters, Alice and Bob, voting on a question with a response of either "yes" or "no", and a tallyman, whose responsibility is to determine whether they have cast the same vote or not. In the proposed protocol, the total response of the voters is calculated without revealing their individual votes.

  18. Neuroimaging paradigms for tonotopic mapping (II): the influence of acquisition protocol.

    PubMed

    Langers, Dave R M; Sanchez-Panchuelo, Rosa M; Francis, Susan T; Krumbholz, Katrin; Hall, Deborah A

    2014-10-15

    Numerous studies on the tonotopic organisation of auditory cortex in humans have employed a wide range of neuroimaging protocols to assess cortical frequency tuning. In the present functional magnetic resonance imaging (fMRI) study, we made a systematic comparison between acquisition protocols with variable levels of interference from acoustic scanner noise. Using sweep stimuli to evoke travelling waves of activation, we measured sound-evoked response signals using sparse, clustered, and continuous imaging protocols that were characterised by inter-scan intervals of 8.8, 2.2, or 0.0 s, respectively. With regard to sensitivity to sound-evoked activation, the sparse and clustered protocols performed similarly, and both detected more activation than the continuous method. Qualitatively, tonotopic maps in activated areas proved highly similar, in the sense that the overall pattern of tonotopic gradients was reproducible across all three protocols. However, quantitatively, we observed substantial reductions in response amplitudes to moderately low stimulus frequencies that coincided with regions of strong energy in the scanner noise spectrum for the clustered and continuous protocols compared to the sparse protocol. At the same time, extreme frequencies became over-represented for these two protocols, and high best frequencies became relatively more abundant. Our results indicate that although all three scanning protocols are suitable to determine the layout of tonotopic fields, an exact quantitative assessment of the representation of various sound frequencies is substantially confounded by the presence of scanner noise. In addition, we noticed anomalous signal dynamics in response to our travelling wave paradigm that suggest that the assessment of frequency-dependent tuning is non-trivially influenced by time-dependent (hemo)dynamics when using sweep stimuli. Copyright © 2014. Published by Elsevier Inc.

  19. Sustainable knowledge development across cultural boundaries: Experiences from the EU-project SILMAS (Toolbox for conflict solving instruments in Alpine Lake Management)

    NASA Astrophysics Data System (ADS)

    Fegerl, Michael; Wieden, Wilfried

    2013-04-01

    Increasingly, people have to communicate knowledge across cultural and language boundaries. Even though recent technologies offer powerful communication facilities, people often feel confronted with barriers which clearly reduce their chances of making their interaction a success. Concrete evidence concerning such problems derives from a number of projects in which generated knowledge often results in dead-end products. In the Alpine Space project SILMAS (Sustainable Instruments for Lake Management in Alpine Space), in which both authors were involved, a special approach (syneris®) was taken to avoid this problem and to manage project knowledge in a sustainable form. Under this approach, knowledge input and output are handled interactively: relevant knowledge can be developed continuously, and users can always access the latest state of expertise. Use of the respective tools and procedures can also assist in closing knowledge gaps and in developing innovative responses to familiar or novel problems. This contribution describes ways and means that have been found to increase the chances of successful knowledge communication across cultural boundaries. The process of trans-cultural expert discussions to find a standardized solution is highlighted, as well as the problem of disseminating expert knowledge to various stakeholders. Finally, lessons learned are made accessible; a main task here lay in the creation of a toolbox of conflict-solving instruments as a demonstrable result of the project and for the time thereafter. The interactive web-based toolbox enables lake managers to access best-practice instruments in standardized, explicit and cross-linguistic form.

  20. Microbe-ID: an open source toolbox for microbial genotyping and species identification.

    PubMed

    Tabima, Javier F; Everhart, Sydney E; Larsen, Meredith M; Weisberg, Alexandra J; Kamvar, Zhian N; Tancos, Matthew A; Smart, Christine D; Chang, Jeff H; Grünwald, Niklaus J

    2016-01-01

    Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user-friendly analytical tools for fast identification are not readily available. To address this need, we created a web-based set of applications called Microbe-ID that allow for customizing a toolbox for rapid species identification and strain genotyping using any genetic markers of choice. Two components of Microbe-ID, named Sequence-ID and Genotype-ID, implement species and genotype identification, respectively. Sequence-ID allows identification of species by using BLAST to query sequences for any locus of interest against a custom reference sequence database. Genotype-ID allows placement of an unknown multilocus marker in either a minimum spanning network or dendrogram with bootstrap support from a user-created reference database. Microbe-ID can be used for identification of any organism based on nucleotide sequences or any molecular marker type and several examples are provided. We created a public website for demonstration purposes called Microbe-ID (microbe-id.org) and provided a working implementation for the genus Phytophthora (phytophthora-id.org). In Phytophthora-ID, the Sequence-ID application allows identification based on ITS or cox spacer sequences. Genotype-ID groups individuals into clonal lineages based on simple sequence repeat (SSR) markers for the two invasive plant pathogen species P. infestans and P. ramorum. All code is open source and available on github and CRAN. Instructions for installation and use are provided at https://github.com/grunwaldlab/Microbe-ID.
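
    Microbe-ID itself is a web application, so no local code is required; purely as a sketch of the Sequence-ID idea (BLASTing a query locus against a custom reference database), the snippet below shells out to the standard blastn command line, assuming BLAST+ is installed and the reference database has already been built with makeblastdb. File and database names are placeholders.

    ```python
    # Sketch of sequence-based identification against a custom reference database
    # using the standard NCBI blastn command-line tool (tabular output format 6).
    import subprocess

    def blast_identify(query_fasta: str, reference_db: str, top_n: int = 5):
        cmd = [
            "blastn",
            "-query", query_fasta,
            "-db", reference_db,
            "-outfmt", "6 qseqid sseqid pident length evalue",
            "-max_target_seqs", str(top_n),
        ]
        result = subprocess.run(cmd, capture_output=True, text=True, check=True)
        hits = [line.split("\t") for line in result.stdout.strip().splitlines()]
        return hits  # the best hits suggest the most likely species for the query

    if __name__ == "__main__":
        for hit in blast_identify("unknown_ITS.fasta", "phytophthora_ITS_db"):
            print(hit)
    ```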

  1. The effects of traditional, superset, and tri-set resistance training structures on perceived intensity and physiological responses.

    PubMed

    Weakley, Jonathon J S; Till, Kevin; Read, Dale B; Roe, Gregory A B; Darrall-Jones, Joshua; Phibbs, Padraic J; Jones, Ben

    2017-09-01

    The aim was to investigate the acute and short-term (i.e., 24 h) effects of traditional (TRAD), superset (SS), and tri-set (TRI) resistance training protocols on perceptions of intensity and physiological responses. Fourteen male participants completed a familiarisation session and three resistance training protocols (i.e., TRAD, SS, and TRI) in a randomised-crossover design. Rating of perceived exertion, lactate concentration ([Lac]), creatine kinase concentration ([CK]), countermovement jump (CMJ), and testosterone and cortisol concentrations were measured pre, immediately post, and 24 h post the resistance training sessions, with magnitude-based inferences assessing changes/differences within/between protocols. TRI reported possible to almost certainly greater efficiency and rate of perceived exertion, although session perceived load was very likely lower. SS and TRI had very likely to almost certainly greater lactate responses during the protocols, with changes in [CK] being very likely and likely increased at 24 h, respectively. At 24-h post-training, CMJ variables in the TRAD protocol had returned to baseline; however, SS and TRI were still possibly to likely reduced. Possible increases in testosterone immediately post SS and TRI protocols were reported, with SS showing possible increases at 24-h post-training. TRAD and SS showed almost certain and likely decreases in cortisol immediately post, respectively, with TRAD reporting likely decreases at 24-h post-training. SS and TRI can enhance training efficiency and reduce training time. However, acute and short-term physiological responses differ between protocols. Athletes can utilise SS and TRI resistance training, but may require additional recovery post-training to minimise effects of fatigue.

  2. Development of protocols for confined extension/creep testing of geosynthetics for highway applications

    DOT National Transportation Integrated Search

    1998-03-01

    This report presents the development and verification of a testing protocol and protocol equipment for confined extension testing and confined creep testing for geosynthetic reinforcement materials. The developed data indicate that confined response ...

  3. Internal Stakeholder Engagement for Renewable Projects

    EPA Pesticide Factsheets

    The Toolbox for Renewable Energy Project Development's Internal Stakeholder Engagement for Renewable Projects page discusses the importance of garnering stakeholder buy-in for renewable energy project success.

  4. Validating a two-high-threshold measurement model for confidence rating data in recognition.

    PubMed

    Bröder, Arndt; Kellen, David; Schütz, Julia; Rohrmeier, Constanze

    2013-01-01

    Signal Detection models as well as the Two-High-Threshold model (2HTM) have been used successfully as measurement models in recognition tasks to disentangle memory performance and response biases. A popular method in recognition memory is to elicit confidence judgements about the presumed old/new status of an item, allowing for the easy construction of ROCs. Since the 2HTM assumes fewer latent memory states than response options are available in confidence ratings, the 2HTM has to be extended by a mapping function which models individual rating scale usage. Unpublished data from 2 experiments in Bröder and Schütz (2009) validate the core memory parameters of the model, and 3 new experiments show that the response mapping parameters are selectively affected by manipulations intended to affect rating scale use, and this is independent of overall old/new bias. Comparisons with SDT show that both models behave similarly, a case that highlights the notion that both modelling approaches can be valuable (and complementary) elements in a researcher's toolbox.

  5. Generation of insulin-producing cells from human bone marrow-derived mesenchymal stem cells: comparison of three differentiation protocols.

    PubMed

    Gabr, Mahmoud M; Zakaria, Mahmoud M; Refaie, Ayman F; Khater, Sherry M; Ashamallah, Sylvia A; Ismail, Amani M; El-Badri, Nagwa; Ghoneim, Mohamed A

    2014-01-01

    Many protocols have been utilized for directed differentiation of mesenchymal stem cells (MSCs) to form insulin-producing cells (IPCs). We compared the relative efficiency of three differentiation protocols. Human bone marrow-derived MSCs (HBM-MSCs) were obtained from three insulin-dependent type 2 diabetic patients. Differentiation into IPCs was carried out by three protocols: conophylline-based (one-step protocol), trichostatin-A-based (two-step protocol), and β-mercaptoethanol-based (three-step protocol). At the end of differentiation, cells were evaluated by immunolabeling for insulin production, expression of pancreatic endocrine genes, and release of insulin and c-peptide in response to increasing glucose concentrations. By immunolabeling, the proportion of generated IPCs was modest (≃3%) in all three protocols. All relevant pancreatic endocrine genes, insulin, glucagon, and somatostatin, were expressed. There was a stepwise increase in insulin and c-peptide release in response to glucose challenge, but the released amounts were low when compared with those of pancreatic islets. The yield of functional IPCs following directed differentiation of HBM-MSCs was modest and was comparable among the three tested protocols. Protocols for directed differentiation of MSCs need further optimization in order to be clinically meaningful. To this end, addition of an extracellular matrix and/or a suitable template should be attempted.

  6. [Usefulness of a protocol for carotid sinus massage in supine and erect postures in patients with syncope without other cardiovascular or neurological diseases].

    PubMed

    Bocchiardo, M; Alciati, M; Buscemi, A; Cravetto, A; Richiardi, E; Gaita, F

    1995-05-01

    Carotid sinus massage is a first-level test when investigating the cause of syncope. It is normally performed in the supine and erect positions. However, there is no standard, complete protocol, so we devised a new protocol to evaluate the utility of carotid sinus massage in different postures and the influence of patient age on the response. Two groups of subjects were selected: a group of 167 patients (mean age 50 ± 18 years, 105 males, 62 females) with a history of syncope without cardiovascular and neurological disease and 20 asymptomatic control subjects (mean age 52 ± 13 years, 11 males, 9 females). Carotid sinus massage was performed supine, just after passive tilt, after 5 minutes of tilt and just after passive return to supine. If a pause > 3" was detected, the protocol was repeated after atropine i.v. injection. Responses were classified as follows: borderline vasodepressor: blood pressure reduction > 30 but < 50 mm Hg without symptoms; vasodepressor: blood pressure reduction > 50 mm Hg or > 30 mm Hg with symptoms like dizziness, vertigo or syncope; cardioinhibitory: pause > 3"; mixed: cardioinhibitory with blood pressure reduction > 30 mm Hg after atropine. Carotid sinus massage gave all the information in the supine position in 14 (12%) patients, after passive tilt in 67 (57%), after 5 minutes of tilt in 30 (26%), and after return to supine in 6 (5%). The responses were: 13 (8%) borderline vasodepressor, 32 (19%) vasodepressor, 2 (1%) cardioinhibitory, 70 (42%) mixed, 50 (30%) negative. Positive responses were more frequent in patients over 45 years (90% versus 43%). In the control group only 3 (15%) positive responses were elicited (2 borderline vasodepressor, and 1 vasodepressor, all in subjects over 45). This protocol for carotid sinus massage elicited positive responses in 70% of patients with syncope without cardiovascular and neurological disease; cardioinhibitory responses are rare (2%); positive responses are more frequent in patients over 45 years; the protocol specificity was 85%.
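
    The response categories are defined explicitly enough to encode directly; the small function below does so using the thresholds quoted in the abstract (the function and argument names are ours, for illustration only):

    ```python
    # Encode the carotid-sinus-massage response categories from the abstract's thresholds.
    from typing import Optional

    def classify_csm_response(max_pause_s: float,
                              sbp_drop_mmhg: float,
                              symptoms: bool,
                              sbp_drop_after_atropine_mmhg: Optional[float] = None) -> str:
        if max_pause_s > 3.0:
            # Cardioinhibitory; "mixed" if a >30 mm Hg drop persists after atropine.
            if sbp_drop_after_atropine_mmhg is not None and sbp_drop_after_atropine_mmhg > 30:
                return "mixed"
            return "cardioinhibitory"
        if sbp_drop_mmhg > 50 or (sbp_drop_mmhg > 30 and symptoms):
            return "vasodepressor"
        if 30 < sbp_drop_mmhg < 50 and not symptoms:
            return "borderline vasodepressor"
        return "negative"

    if __name__ == "__main__":
        print(classify_csm_response(max_pause_s=4.2, sbp_drop_mmhg=20, symptoms=False,
                                    sbp_drop_after_atropine_mmhg=35))   # -> "mixed"
    ```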

  7. Soil Fumigants

    EPA Pesticide Factsheets

    This Toolbox provides training, outreach, and resource materials for applicators and handlers, communities, state/local agencies, and others interested in understanding and implementing the current requirements for safe use of these pesticides.

  8. CalRecycle Home Page

    Science.gov Websites

    The CalRecycle website provides access to regulations, data, the Facility Information Toolbox (FacIT), and grant, payment, and loan programs that benefit the environment, particularly in disadvantaged communities, including the Greenhouse Gas Reduction Grant and Loan programs.

  9. What happens in the lab does not stay in the lab [corrected]: Applying midstream modulation to enhance critical reflection in the laboratory.

    PubMed

    Schuurbiers, Daan

    2011-12-01

    In response to widespread policy prescriptions for responsible innovation, social scientists and engineering ethicists, among others, have sought to engage natural scientists and engineers at the 'midstream': building interdisciplinary collaborations to integrate social and ethical considerations with research and development processes. Two 'laboratory engagement studies' have explored how applying the framework of midstream modulation could enhance the reflections of natural scientists on the socio-ethical context of their work. The results of these interdisciplinary collaborations confirm the utility of midstream modulation in encouraging both first- and second-order reflective learning. The potential for second-order reflective learning, in which underlying value systems become the object of reflection, is particularly significant with respect to addressing social responsibility in research practices. Midstream modulation served to render the socio-ethical context of research visible in the laboratory and helped enable research participants to more critically reflect on this broader context. While lab-based collaborations would benefit from being carried out in concert with activities at institutional and policy levels, midstream modulation could prove a valuable asset in the toolbox of interdisciplinary methods aimed at responsible innovation.

  10. SCoT: a Python toolbox for EEG source connectivity.

    PubMed

    Billinger, Martin; Brunner, Clemens; Müller-Putz, Gernot R

    2014-01-01

    Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT, a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill-posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivity estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT.
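
    Independent of SCoT's actual API (which is not reproduced here), the core idea of VAR-based connectivity can be sketched with NumPy alone: fit a first-order VAR model by least squares and read directed influence off the off-diagonal coefficients. This is a generic illustration under that simplification, not SCoT code; the function name and toy data are made up.

        import numpy as np

        def fit_var1(x):
            """Least-squares fit of a first-order VAR model
            x[t] = A @ x[t-1] + e[t], where x has shape (n_samples, n_channels)."""
            past, present = x[:-1], x[1:]
            # Solve present ~= past @ A.T in the least-squares sense.
            A_T, *_ = np.linalg.lstsq(past, present, rcond=None)
            residuals = present - past @ A_T
            return A_T.T, np.cov(residuals.T)

        # Toy data: channel 0 drives channel 1 with a one-sample lag.
        rng = np.random.default_rng(0)
        x = rng.standard_normal((1000, 3))
        x[1:, 1] += 0.5 * x[:-1, 0]
        A, noise_cov = fit_var1(x)
        print(np.round(A, 2))  # A[1, 0] should be close to 0.5

    Single-trial estimation, regularization, and the spectral connectivity measures mentioned in the abstract all build on this kind of model fit.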

  11. MTpy: A Python toolbox for magnetotellurics

    NASA Astrophysics Data System (ADS)

    Krieger, Lars; Peacock, Jared R.

    2014-11-01

    We present the software package MTpy, which allows handling, processing, and imaging of magnetotelluric (MT) data sets. Written in Python, the code is open source and contains sub-packages and modules for various tasks within the standard MT data processing and handling scheme. Besides the independent definition of classes and functions, MTpy provides wrappers and convenience scripts to call standard external data processing and modelling software. In its current state, the modules and functions of MTpy work on raw and pre-processed MT data. However, rather than providing a static compilation of software, we prefer to introduce MTpy as a flexible software toolbox, whose contents can be combined and utilised according to the respective needs of the user. Just as the overall functionality of a mechanical toolbox can be extended by adding new tools, MTpy is a flexible framework that will be dynamically extended in the future. Furthermore, it can help to unify and extend existing codes and algorithms within the (academic) MT community. In this paper, we introduce the structure and concept of MTpy. Additionally, we show some examples from an everyday workflow of MT data processing: the generation of standard EDI data files from raw electric (E-) and magnetic flux density (B-) field time series as input, the conversion into MiniSEED data format, and the generation of a graphical data representation in the form of a phase tensor pseudosection.
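
    As a rough illustration of the kind of processing step described above, and not of MTpy's own interface, the sketch below estimates a scalar impedance from paired electric and magnetic time series via cross-spectra and converts it to apparent resistivity and phase. It ignores the tensor nature of the MT impedance and all robust or remote-reference processing; the function name, arguments, and units are assumptions.

        import numpy as np

        MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

        def scalar_apparent_resistivity(e_field, b_field, dt):
            """Single-component estimate: impedance Z from the ratio of cross-
            to auto-spectra of E (V/m) and H = B / mu0 (A/m), then apparent
            resistivity rho_a = |Z|**2 / (omega * mu0) and phase in degrees."""
            freqs = np.fft.rfftfreq(len(e_field), d=dt)
            E = np.fft.rfft(e_field)
            H = np.fft.rfft(b_field) / MU0
            Z = (E * np.conj(H)) / (np.abs(H) ** 2 + 1e-30)
            omega = 2 * np.pi * freqs
            rho_a = np.abs(Z) ** 2 / np.where(omega > 0, omega * MU0, np.inf)
            phase = np.degrees(np.angle(Z))
            return freqs, rho_a, phase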

  12. CERENA: ChEmical REaction Network Analyzer--A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics.

    PubMed

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction, and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed; however, tools to automatically generate these descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduce CERENA, a toolbox for the analysis of stochastic chemical kinetics using approximations of the chemical master equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass-action kinetics. By providing SBML import, symbolic model generation, and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/.
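
    CERENA itself is a MATLAB toolbox; as a language-neutral illustration of the microscopic description it implements, the following minimal Python sketch runs Gillespie's stochastic simulation algorithm for a birth-death process (production at rate k_prod, degradation at rate k_deg per molecule). The function and parameter names are hypothetical and the example is not CERENA code.

        import numpy as np

        def ssa_birth_death(k_prod, k_deg, x0, t_end, rng=None):
            """Gillespie stochastic simulation of a birth-death process:
            0 -> X at rate k_prod, X -> 0 at rate k_deg * x."""
            rng = rng or np.random.default_rng()
            t, x = 0.0, x0
            times, states = [t], [x]
            while t < t_end:
                a = np.array([k_prod, k_deg * x])  # reaction propensities
                a0 = a.sum()
                if a0 == 0:
                    break
                t += rng.exponential(1.0 / a0)     # waiting time to next reaction
                x += 1 if rng.random() < a[0] / a0 else -1
                times.append(t)
                states.append(x)
            return np.array(times), np.array(states)

        t, x = ssa_birth_death(k_prod=10.0, k_deg=0.1, x0=0, t_end=100.0)
        print(x[-1])  # fluctuates around the steady-state mean k_prod / k_deg = 100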

  13. ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis

    PubMed Central

    Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas

    2016-01-01

    Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons are dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we here present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of the community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without an advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and are reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/. PMID:26882475

  14. SCoT: a Python toolbox for EEG source connectivity

    PubMed Central

    Billinger, Martin; Brunner, Clemens; Müller-Putz, Gernot R.

    2014-01-01

    Analysis of brain connectivity has become an important research tool in neuroscience. Connectivity can be estimated between cortical sources reconstructed from the electroencephalogram (EEG). Such analysis often relies on trial averaging to obtain reliable results. However, some applications such as brain-computer interfaces (BCIs) require single-trial estimation methods. In this paper, we present SCoT, a source connectivity toolbox for Python. This toolbox implements routines for blind source decomposition and connectivity estimation with the MVARICA approach. Additionally, a novel extension called CSPVARICA is available for labeled data. SCoT estimates connectivity from various spectral measures relying on vector autoregressive (VAR) models. Optionally, these VAR models can be regularized to facilitate ill-posed applications such as single-trial fitting. We demonstrate basic usage of SCoT on motor imagery (MI) data. Furthermore, we show simulation results of utilizing SCoT for feature extraction in a BCI application. These results indicate that CSPVARICA and correct regularization can significantly improve MI classification. While SCoT was mainly designed for application in BCIs, it contains useful tools for other areas of neuroscience. SCoT is a software package that (1) brings combined source decomposition and connectivity estimation to the open Python platform, and (2) offers tools for single-trial connectivity estimation. The source code is released under the MIT license and is available online at github.com/SCoT-dev/SCoT. PMID:24653694

  15. SinCHet: a MATLAB toolbox for single cell heterogeneity analysis in cancer.

    PubMed

    Li, Jiannong; Smalley, Inna; Schell, Michael J; Smalley, Keiran S M; Chen, Y Ann

    2017-09-15

    Single-cell technologies allow characterization of transcriptomes and epigenomes for individual cells under different conditions and provide unprecedented resolution for researchers to investigate cellular heterogeneity in cancer. The SinCHet (Single Cell Heterogeneity) toolbox is developed in MATLAB and has a graphical user interface (GUI) for visualization and user interaction. It analyzes both continuous (e.g. mRNA expression) and binary omics data (e.g. discretized methylation data). The toolbox not only quantifies cellular heterogeneity using the Shannon Profile (SP) at different clonal resolutions but also detects heterogeneity differences between two populations using a D statistic, defined as the area under the Profile of Shannon Difference (PSD). This flexible tool provides a default clonal resolution using the change point of the PSD detected by a multivariate adaptive regression splines model; it also allows user-defined clonal resolutions for further investigation. The tool provides insights into emerging or disappearing clones between conditions and enables the prioritization of biomarkers for follow-up experiments based on heterogeneity or marker differences between and/or within cell populations. The SinCHet software is freely available for non-profit academic use. The source code, example datasets, and the compiled package are available at http://labpages2.moffitt.org/chen/software/. ann.chen@moffitt.org. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
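
    SinCHet is distributed as MATLAB code; the quantity its Shannon Profile tracks is, at each clonal resolution, the Shannon entropy of the clone-size distribution. The snippet below is a minimal, toolbox-independent illustration in Python; the helper name and the toy labels are made up, and the PSD itself would compare such indices between two populations across resolutions.

        import numpy as np

        def shannon_index(cluster_labels):
            """Shannon entropy of the clone-size distribution implied by a
            clustering of single cells at one clonal resolution."""
            _, counts = np.unique(cluster_labels, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log(p))

        # Toy example: two populations clustered at the same resolution.
        pre  = shannon_index([0, 0, 0, 0, 1, 1])   # two clones, uneven sizes
        post = shannon_index([0, 0, 1, 1, 2, 2])   # three clones, even sizes
        print(pre, post)  # heterogeneity is higher in the second toy population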

  16. Vectorized data acquisition and fast triple-correlation integrals for Fluorescence Triple Correlation Spectroscopy

    PubMed Central

    Ridgeway, William K; Millar, David P; Williamson, James R

    2013-01-01

    Fluorescence Correlation Spectroscopy (FCS) is widely used to quantitate reaction rates and concentrations of molecules in vitro and in vivo. We recently reported Fluorescence Triple Correlation Spectroscopy (F3CS), which correlates three signals together instead of two. F3CS can analyze the stoichiometries of complex mixtures and detect irreversible processes by identifying time-reversal asymmetries. Here we report the computational developments that were required for the realization of F3CS and present the results as the Triple Correlation Toolbox suite of programs. Triple Correlation Toolbox is a complete data analysis pipeline capable of acquiring, correlating and fitting large data sets. Each segment of the pipeline handles error estimates for accurate error-weighted global fitting. Data acquisition was accelerated with a combination of off-the-shelf counter-timer chips and vectorized operations on 128-bit registers. This allows desktop computers with inexpensive data acquisition cards to acquire hours of multiple-channel data with sub-microsecond time resolution. Off-line correlation integrals were implemented as a two delay time multiple-tau scheme that scales efficiently with multiple processors and provides an unprecedented view of linked dynamics. Global fitting routines are provided to fit FCS and F3CS data to models containing up to ten species. Triple Correlation Toolbox is a complete package that enables F3CS to be performed on existing microscopes. PMID:23525193
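
    Setting aside the multiple-tau scheme and the vectorized acquisition described above, the correlation integral at the heart of F3CS is a third-order average over two lag times. The naive estimator below is only meant to show what is being computed; the normalization is one common convention (an assumption here), it is far too slow for real data, and it is not code from the Triple Correlation Toolbox.

        import numpy as np

        def triple_correlation(f1, f2, f3, tau1, tau2):
            """Naive estimator of the third-order correlation
            G(tau1, tau2) = <dF1(t) dF2(t+tau1) dF3(t+tau2)> / (<F1><F2><F3>),
            with dFi = Fi - <Fi> and lags given in samples."""
            d1, d2, d3 = (f - f.mean() for f in (f1, f2, f3))
            n = len(f1) - max(tau1, tau2)
            num = np.mean(d1[:n] * d2[tau1:tau1 + n] * d3[tau2:tau2 + n])
            return num / (f1.mean() * f2.mean() * f3.mean())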

  17. Report on the Current Inventory of the Toolbox for Plant Cell Wall Analysis: Proteinaceous and Small Molecular Probes

    PubMed Central

    Rydahl, Maja G.; Hansen, Aleksander R.; Kračun, Stjepan K.; Mravec, Jozef

    2018-01-01

    Plant cell walls are highly complex structures composed of diverse classes of polysaccharides, proteoglycans, and polyphenolics, which have numerous roles throughout the life of a plant. Significant research efforts aim to understand the biology of this cellular organelle and to facilitate cell-wall-based industrial applications. To accomplish this, researchers need to be provided with a variety of sensitive and specific detection methods for separate cell wall components, and their various molecular characteristics in vitro as well as in situ. Cell wall component-directed molecular detection probes (in short: cell wall probes, CWPs) are an essential asset to the plant glycobiology toolbox. To date, a relatively large set of CWPs has been produced—mainly consisting of monoclonal antibodies, carbohydrate-binding modules, synthetic antibodies produced by phage display, and small molecular probes. In this review, we summarize the state-of-the-art knowledge about these CWPs; their classification and their advantages and disadvantages in different applications. In particular, we elaborate on the recent advances in non-conventional approaches to the generation of novel CWPs, and identify the remaining gaps in terms of target recognition. This report also highlights the addition of new “compartments” to the probing toolbox, which is filled with novel chemical biology tools, such as metabolic labeling reagents and oligosaccharide conjugates. In the end, we also forecast future developments in this dynamic field. PMID:29774041

  18. CERENA: ChEmical REaction Network Analyzer—A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics

    PubMed Central

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J.; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction, and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed; however, tools to automatically generate these descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduce CERENA, a toolbox for the analysis of stochastic chemical kinetics using approximations of the chemical master equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass-action kinetics. By providing SBML import, symbolic model generation, and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/. PMID:26807911

  19. National Sample Assessment Protocols

    ERIC Educational Resources Information Center

    Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2012

    2012-01-01

    These protocols represent a working guide for planning and implementing national sample assessments in connection with the national Key Performance Measures (KPMs). The protocols are intended for agencies involved in planning or conducting national sample assessments and personnel responsible for administering associated tenders or contracts,…

  20. Potential Projective Material on the Rorschach: Comparing Comprehensive System Protocols to Their Modeled R-Optimized Administration Counterparts.

    PubMed

    Pianowski, Giselle; Meyer, Gregory J; Villemor-Amaral, Anna Elisa de

    2016-01-01

    Exner (1989) and Weiner (2003) identified 3 types of Rorschach codes that are most likely to contain personally relevant projective material: Distortions, Movement, and Embellishments. We examine how often these types of codes occur in normative data and whether their frequency changes for the 1st, 2nd, 3rd, 4th, or last response to a card. We also examine the impact on these variables of the Rorschach Performance Assessment System's (R-PAS) statistical modeling procedures that convert the distribution of responses (R) from Comprehensive System (CS) administered protocols to match the distribution of R found in protocols obtained using R-optimized administration guidelines. In 2 normative reference databases, the results indicated that about 40% of responses (M = 39.25) have 1 type of code, 15% have 2 types, and 1.5% have all 3 types, with frequencies not changing by response number. In addition, there were no mean differences in the original CS and R-optimized modeled records (M Cohen's d = -0.04 in both databases). When considered alongside findings showing minimal differences between the protocols of people randomly assigned to CS or R-optimized administration, the data suggest R-optimized administration should not alter the extent to which potential projective material is present in a Rorschach protocol.
