Sample records for robust identification toolbox

  1. Structured Uncertainty Bound Determination From Data for Control and Performance Validation

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.

    2003-01-01

This report attempts to document the broad scope of issues that must be satisfactorily resolved before one can expect to methodically obtain, with reasonable confidence, near-optimal robust closed-loop performance in physical applications. These include elements of signal processing, noise identification, system identification, model validation, and uncertainty modeling. Based on a recently developed methodology involving a parameterization of all model-validating uncertainty sets for a given linear fractional transformation (LFT) structure and noise allowance, new software, the Uncertainty Bound Identification (UBID) toolbox, which conveniently executes model validation tests and determines uncertainty bounds from data, has been designed and is currently available. This toolbox also serves to benchmark the current state of the art in uncertainty bound determination and in turn facilitates benchmarking of robust control technology. To help clarify the methodology and use of the new software, two tutorial examples are provided. The first involves the uncertainty characterization of the dynamics of a flexible structure, and the second involves closed-loop performance validation of a ducted fan based on an uncertainty bound obtained from data. These examples, along with other simulation and experimental results, also help describe the many factors and assumptions that determine the degree of success in applying robust control theory to practical problems.

  2. Modern CACSD using the Robust-Control Toolbox

    NASA Technical Reports Server (NTRS)

    Chiang, Richard Y.; Safonov, Michael G.

    1989-01-01

The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools such as singular values and structured singular values; robust synthesis tools such as continuous/discrete H(exp 2)/H infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods; and a variety of robust model reduction tools such as Hankel approximation, balanced truncation, and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H infinity loop-shaping, and large space structure model reduction.
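As an aside for readers new to these tools, the singular-value analysis at the heart of such robust-analysis functions amounts to an SVD of the frequency response matrix at each frequency. A Python/NumPy sketch with an invented two-input, two-output plant (the matrices below are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical 2x2 state-space plant; A, B, C, D are made-up illustrative values.
A = np.array([[-1.0, 2.0], [-2.0, -1.0]])
B = np.eye(2)
C = np.eye(2)
D = np.zeros((2, 2))

def sigma_plot_data(A, B, C, D, freqs):
    """Largest/smallest singular values of G(jw) = C (jwI - A)^-1 B + D."""
    n = A.shape[0]
    smax, smin = [], []
    for w in freqs:
        G = C @ np.linalg.solve(1j * w * np.eye(n) - A, B) + D
        s = np.linalg.svd(G, compute_uv=False)   # descending order
        smax.append(s[0])
        smin.append(s[-1])
    return np.array(smax), np.array(smin)

freqs = np.logspace(-2, 2, 50)
smax, smin = sigma_plot_data(A, B, C, D, freqs)
# The peak of smax over frequency approximates the H-infinity norm on this grid.
print(smax.max())
```

The same per-frequency SVD underlies loop-shaping plots; structured singular value analysis replaces the plain SVD with a (harder) structured optimization.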

  3. Prony Ringdown GUI (CERTS Prony Ringdown, part of the DSI Tool Box)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuffner, Francis; Marinovici, PNNL Laurentiu; Hauer, PNNL John

    2014-02-21

The PNNL Prony Ringdown graphical user interface is one analysis tool included in the Dynamic System Identification Toolbox (DSI Toolbox). The DSI Toolbox is a MATLAB-based collection of tools for parsing and analyzing phasor measurement unit data, especially with regard to small signal stability. It includes tools to read the data, preprocess it, and perform small signal analysis. The DSI Toolbox is designed to provide a research environment for examining phasor measurement unit data and performing small signal stability analysis. The software uses a series of text-driven menus to help guide users and organize the toolbox features. Methods for reading in phasor measurement unit data are provided, with appropriate preprocessing options for small-signal-stability analysis. The toolbox includes the Prony Ringdown GUI and basic algorithms to estimate information on oscillatory modes of the system, such as modal frequency and damping ratio.
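Prony ringdown analysis itself is compact enough to sketch: fit a linear predictor to the ringdown samples, take the roots of the prediction polynomial, and convert them to modal frequency and damping. The following is an illustrative NumPy version, not the PNNL implementation:

```python
import numpy as np

def prony(y, p, dt):
    """Classical Prony fit of order p: y[n] ~ sum_k a_k * z_k**n.
    Returns modal frequencies (Hz) and damping ratios. Illustrative sketch."""
    N = len(y)
    # Linear prediction: y[n] = c1*y[n-1] + ... + cp*y[n-p], solved by least squares.
    A = np.column_stack([y[p - 1 - k: N - 1 - k] for k in range(p)])
    b = y[p:]
    c, *_ = np.linalg.lstsq(A, b, rcond=None)
    z = np.roots(np.concatenate(([1.0], -c)))   # discrete-time poles
    s = np.log(z) / dt                          # continuous-time poles
    freq = np.abs(s.imag) / (2 * np.pi)
    damp = -s.real / np.abs(s)                  # damping ratio
    return freq, damp

# Synthetic ringdown: 1.5 Hz mode decaying at 0.1 nepers/s.
t = np.arange(0, 2, 0.01)
y = np.exp(-0.1 * t) * np.cos(2 * np.pi * 1.5 * t)
freq, damp = prony(y, 2, 0.01)   # both entries ~1.5 Hz, damping ratio ~0.011
```

Real PMU data would need the preprocessing (detrending, filtering) the record mentions before such a fit is meaningful.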

  4. Integrating hidden Markov model and PRAAT: a toolbox for robust automatic speech transcription

    NASA Astrophysics Data System (ADS)

    Kabir, A.; Barker, J.; Giurgiu, M.

    2010-09-01

An automatic time-aligned phone transcription toolbox for English speech corpora has been developed. The toolbox is particularly useful for generating robust automatic transcriptions, and it can produce phone-level transcriptions using speaker-independent as well as speaker-dependent models without manual intervention. The system is based on the standard hidden Markov model (HMM) approach and was successfully tested on a large audiovisual speech corpus, the GRID corpus. One of the most powerful features of the toolbox is its flexibility in speech processing: the speech community can import automatic transcriptions generated by the HMM Toolkit (HTK) into the popular transcription software PRAAT, and vice versa. The toolbox has been evaluated through statistical analysis on GRID data, which shows that the automatic transcription deviates from manual transcription by an average of 20 ms.

  5. Exploration of available feature detection and identification systems and their performance on radiographs

    NASA Astrophysics Data System (ADS)

    Wantuch, Andrew C.; Vita, Joshua A.; Jimenez, Edward S.; Bray, Iliana E.

    2016-10-01

Despite object detection, recognition, and identification being very active areas of computer vision research, many of the available tools to aid in these processes are designed with only photographs in mind. Although some algorithms used specifically for feature detection and identification may not take explicit advantage of the colors available in the image, they still underperform on radiographs, which are grayscale images. We are especially interested in the robustness of these algorithms, specifically their performance on a preexisting database of X-ray radiographs in compressed JPEG form, with multiple ways of describing pixel information. We will review various aspects of the performance of available feature detection and identification systems, including MATLAB's Computer Vision Toolbox, VLFeat, and OpenCV, on our non-ideal database. In the process, we will explore possible reasons for the algorithms' lessened ability to detect and identify features in the X-ray radiographs.

  6. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    DTIC Science & Technology

    2013-04-24

DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals Vernon...datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed...As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and

  7. FDA Escherichia coli Identification (FDA-ECID) Microarray: a Pangenome Molecular Toolbox for Serotyping, Virulence Profiling, Molecular Epidemiology, and Phylogeny

    PubMed Central

    Patel, Isha R.; Gangiredla, Jayanthi; Lacher, David W.; Mammel, Mark K.; Jackson, Scott A.; Lampel, Keith A.

    2016-01-01

Most Escherichia coli strains are nonpathogenic. However, for clinical diagnosis and food safety analysis, current identification methods for pathogenic E. coli either are time-consuming and/or provide limited information. Here, we utilized a custom DNA microarray with informative genetic features extracted from 368 sequence sets for rapid and high-throughput pathogen identification. The FDA Escherichia coli Identification (FDA-ECID) platform contains three sets of molecularly informative features that together stratify strain identification and relatedness. First, 53 known flagellin alleles, 103 alleles of wzx and wzy, and 5 alleles of wzm provide molecular serotyping utility. Second, 41,932 probe sets representing the pan-genome of E. coli provide strain-level gene content information. Third, approximately 125,000 single nucleotide polymorphisms (SNPs) of available whole-genome sequences (WGS) were distilled to 9,984 SNPs capable of recapitulating the E. coli phylogeny. We analyzed 103 diverse E. coli strains with available WGS data, including those associated with past foodborne illnesses, to determine robustness and accuracy. The array was able to accurately identify the molecular O and H serotypes, potentially correcting serological failures and providing better resolution for H-nontypeable/nonmotile phenotypes. In addition, molecular risk assessment was possible with key virulence marker identifications. Epidemiologically, each strain had a unique comparative genomic fingerprint that was extended to an additional 507 food and clinical isolates. Finally, a 99.7% phylogenetic concordance was established between microarray analysis and WGS using SNP-level data for advanced genome typing. Our study demonstrates FDA-ECID as a powerful tool for epidemiology and molecular risk assessment with the capacity to profile the global landscape and diversity of E. coli.
IMPORTANCE: This study describes a robust, state-of-the-art platform developed from available whole-genome sequences of E. coli and Shigella spp. by distilling useful signatures for epidemiology and molecular risk assessment into one assay. The FDA-ECID microarray contains features that enable comprehensive molecular serotyping and virulence profiling along with genome-scale genotyping and SNP analysis. Hence, it is a molecular toolbox that stratifies strain identification and pathogenic potential in the contexts of epidemiology and phylogeny. We applied this tool to strains from food, environmental, and clinical sources, resulting in significantly greater phylogenetic and strain-specific resolution than previously reported for available typing methods. PMID:27037122

  8. A finite-element toolbox for the stationary Gross-Pitaevskii equation with rotation

    NASA Astrophysics Data System (ADS)

    Vergez, Guillaume; Danaila, Ionut; Auliac, Sylvain; Hecht, Frédéric

    2016-12-01

We present a new numerical system using classical finite elements with mesh adaptivity for computing stationary solutions of the Gross-Pitaevskii equation. The programs are written as a toolbox for FreeFem++ (www.freefem.org), a free finite-element software package available for all existing operating systems. This offers the advantage of hiding all technical issues related to the implementation of the finite element method, making it easy to code various numerical algorithms. Two robust and optimized numerical methods were implemented to minimize the Gross-Pitaevskii energy: a steepest descent method based on Sobolev gradients and a minimization algorithm based on the state-of-the-art optimization library Ipopt. For both methods, mesh adaptivity strategies are used to reduce the computational time and increase the local spatial accuracy when vortices are present. Different run cases are made available for 2D and 3D configurations of Bose-Einstein condensates in rotation. An optional graphical user interface is also provided, making it easy to run predefined cases or user-defined parameter files. We also provide several post-processing tools (such as the identification of quantized vortices) that can help in extracting physical features from the simulations. The toolbox is extremely versatile and can be easily adapted to deal with different physical models.
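The steepest-descent idea can be illustrated outside FreeFem++ as well. Below is a finite-difference (not finite-element) 1D analogue that minimizes the Gross-Pitaevskii energy by plain gradient steps followed by renormalization; the trap, interaction strength, grid, and step size are made-up illustrative values, and no rotation term is included:

```python
import numpy as np

# 1D Gross-Pitaevskii ground state by normalized gradient (imaginary-time) descent.
# Finite differences stand in for the paper's finite elements; all parameters assumed.
L, n = 16.0, 256
x = np.linspace(-L / 2, L / 2, n)
h = x[1] - x[0]
V = 0.5 * x**2          # harmonic trap
g = 50.0                # interaction strength (illustrative)
dt = 1e-3               # descent step, within the explicit-scheme stability limit

psi = np.exp(-x**2)     # initial guess
psi /= np.sqrt(h * np.sum(psi**2))

def H_apply(psi):
    """Gradient of the GP energy: -0.5 psi'' + V psi + g |psi|^2 psi."""
    lap = (np.roll(psi, 1) - 2 * psi + np.roll(psi, -1)) / h**2
    return -0.5 * lap + V * psi + g * psi**2 * psi

def energy(psi):
    grad = np.gradient(psi, h)
    return h * np.sum(0.5 * grad**2 + V * psi**2 + 0.5 * g * psi**4)

for _ in range(5000):
    psi = psi - dt * H_apply(psi)
    psi /= np.sqrt(h * np.sum(psi**2))  # project back onto the unit-norm sphere

print(energy(psi))  # roughly 5-6 for these parameters (near the Thomas-Fermi estimate)
```

The Sobolev-gradient variant in the paper preconditions the same descent direction to accelerate convergence; the projection step above is the 1D analogue of keeping the wavefunction normalized.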

  9. Emotion assessment using the NIH Toolbox

    PubMed Central

    Butt, Zeeshan; Pilkonis, Paul A.; Cyranowski, Jill M.; Zill, Nicholas; Hendrie, Hugh C.; Kupst, Mary Jo; Kelly, Morgen A. R.; Bode, Rita K.; Choi, Seung W.; Lai, Jin-Shei; Griffith, James W.; Stoney, Catherine M.; Brouwers, Pim; Knox, Sarah S.; Cella, David

    2013-01-01

    One of the goals of the NIH Toolbox for Assessment of Neurological and Behavioral Function was to identify or develop brief measures of emotion for use in prospective epidemiologic and clinical research. Emotional health has significant links to physical health and exerts a powerful effect on perceptions of life quality. Based on an extensive literature review and expert input, the Emotion team identified 4 central subdomains: Negative Affect, Psychological Well-Being, Stress and Self-Efficacy, and Social Relationships. A subsequent psychometric review identified several existing self-report and proxy measures of these subdomains with measurement characteristics that met the NIH Toolbox criteria. In cases where adequate measures did not exist, robust item banks were developed to assess concepts of interest. A population-weighted sample was recruited by an online survey panel to provide initial item calibration and measure validation data. Participants aged 8 to 85 years completed self-report measures whereas parents/guardians responded for children aged 3 to 12 years. Data were analyzed using a combination of classic test theory and item response theory methods, yielding efficient measures of emotional health concepts. An overview of the development of the NIH Toolbox Emotion battery is presented along with preliminary results. Norming activities led to further refinement of the battery, thus enhancing the robustness of emotional health measurement for researchers using the NIH Toolbox. PMID:23479549

  10. Robust Correlation Analyses: False Positive and Power Validation Using a New Open Source Matlab Toolbox

    PubMed Central

    Pernet, Cyril R.; Wilcox, Rand; Rousselet, Guillaume A.

    2012-01-01

    Pearson’s correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab(R) based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand. PMID:23335907
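The percentage-bend correlation the authors implement follows Wilcox's published recipe: down-weight points far from a robust location estimate, then correlate the bent scores. A compact Python rendering of that recipe (bend fraction beta = 0.2; a sketch for illustration, not the toolbox code):

```python
import numpy as np

def pb_loc(x, beta=0.2):
    """Percentage-bend measure of location (Wilcox's recipe; illustrative sketch)."""
    med = np.median(x)
    om = np.sort(np.abs(x - med))[int(np.floor((1 - beta) * len(x))) - 1]
    psi = (x - med) / om
    lo, hi = psi < -1, psi > 1            # points bent down on each side
    sx = np.where(lo | hi, 0.0, x)
    return (sx.sum() + om * (hi.sum() - lo.sum())) / (len(x) - lo.sum() - hi.sum())

def pbcor(x, y, beta=0.2):
    """Percentage-bend correlation between two samples."""
    def scores(v):
        om = np.sort(np.abs(v - np.median(v)))[int(np.floor((1 - beta) * len(v))) - 1]
        return np.clip((v - pb_loc(v, beta)) / om, -1.0, 1.0)
    a, b = scores(x), scores(y)
    return (a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum())
```

Because extreme scores are clipped at plus/minus 1, a single wild outlier moves the estimate far less than it moves Pearson's r, which is the behavior the abstract describes.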

  11. Robust correlation analyses: false positive and power validation using a new open source matlab toolbox.

    PubMed

    Pernet, Cyril R; Wilcox, Rand; Rousselet, Guillaume A

    2012-01-01

Pearson's correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free Matlab(R) based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped-correlations. After illustrating how to use the toolbox, we show that robust methods, where outliers are down weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false positive control and without loss of power. The different correlation methods were tested with normal data and normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false positive rate and power, and advise on which technique to use depending on the data at hand.

  12. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    PubMed

    Lawhern, Vernon; Hairston, W David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.
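DETECT itself trains classifiers on user-labeled event classes; as a much simpler stand-in, the core idea of flagging time windows whose properties deviate from a baseline can be sketched as follows (function names, window length, and threshold are all illustrative, not DETECT's API):

```python
import numpy as np

def detect_events(signal, fs, win_sec=0.5, thresh=3.0):
    """Flag windows whose RMS exceeds the baseline by more than `thresh`
    robust standard deviations. A toy stand-in for the kind of windowed
    event labeling DETECT performs with trained models."""
    w = int(win_sec * fs)
    n = len(signal) // w
    rms = np.array([np.sqrt(np.mean(signal[i * w:(i + 1) * w] ** 2))
                    for i in range(n)])
    med = np.median(rms)
    mad = np.median(np.abs(rms - med)) * 1.4826   # robust sigma estimate
    flagged = np.nonzero(rms > med + thresh * mad)[0]
    return [(i * w / fs, (i + 1) * w / fs) for i in flagged]

# Illustrative use: a high-amplitude artifact injected into quiet background.
fs = 100
rng = np.random.default_rng(0)
eeg = 0.5 * rng.standard_normal(10 * fs)
eeg[300:400] += 8.0          # artifact between t = 3 s and t = 4 s
print(detect_events(eeg, fs))
```

The value DETECT adds over such a heuristic is exactly what the abstract stresses: multi-class models, accuracy assessment, and agreement scoring against manual labels.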

  13. DETECT: A MATLAB Toolbox for Event Detection and Identification in Time Series, with Applications to Artifact Detection in EEG Signals

    PubMed Central

    Lawhern, Vernon; Hairston, W. David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration. PMID:23638169

  14. System/observer/controller identification toolbox

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Horta, Lucas G.; Phan, Minh

    1992-01-01

System identification is the process of constructing a mathematical model from input and output data for a system under testing, and characterizing the system uncertainties and measurement noises. The mathematical model structure can take various forms depending upon the intended use. The SYSTEM/OBSERVER/CONTROLLER IDENTIFICATION TOOLBOX (SOCIT) is a collection of functions, written in the MATLAB language and expressed in M-files, that implements a variety of modern system identification techniques. For an open-loop system, the central features of SOCIT are functions for identification of a system model and its corresponding forward and backward observers directly from input and output data. The system and observers are represented by a discrete model. The identified model and observers may be used for controller design of linear systems as well as for identification of modal parameters such as damping, frequencies, and mode shapes. For a closed-loop system, an observer and its corresponding controller gain can be identified directly from input and output data.
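One building block of this family of methods, the Eigensystem Realization Algorithm, is compact enough to sketch: stack impulse-response (Markov) parameters into a Hankel matrix, take an SVD, and read off a reduced-order state-space model. The following is a simplified single-input single-output version for illustration, not the SOCIT implementation:

```python
import numpy as np

def era(markov, order, m=20):
    """Eigensystem Realization Algorithm (SISO sketch).
    markov[k] = C A^k B are impulse-response samples; returns (A, B, C)."""
    H0 = np.array([[markov[i + j] for j in range(m)] for i in range(m)])
    H1 = np.array([[markov[i + j + 1] for j in range(m)] for i in range(m)])
    U, s, Vt = np.linalg.svd(H0)
    Ur = U[:, :order]
    Sr = np.diag(np.sqrt(s[:order]))
    Vr = Vt[:order]
    # Balanced realization from the shifted Hankel matrix.
    A = np.linalg.inv(Sr) @ Ur.T @ H1 @ Vr.T @ np.linalg.inv(Sr)
    B = (Sr @ Vr)[:, :1]
    C = (Ur @ Sr)[:1, :]
    return A, B, C
```

Modal parameters then follow from the eigenvalues of the identified A (frequencies and damping) and its eigenvectors (mode shapes), as the record describes.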

  15. A Tol2 Gateway-Compatible Toolbox for the Study of the Nervous System and Neurodegenerative Disease.

    PubMed

    Don, Emily K; Formella, Isabel; Badrock, Andrew P; Hall, Thomas E; Morsch, Marco; Hortle, Elinor; Hogan, Alison; Chow, Sharron; Gwee, Serene S L; Stoddart, Jack J; Nicholson, Garth; Chung, Roger; Cole, Nicholas J

    2017-02-01

    Currently there is a lack in fundamental understanding of disease progression of most neurodegenerative diseases, and, therefore, treatments and preventative measures are limited. Consequently, there is a great need for adaptable, yet robust model systems to both investigate elementary disease mechanisms and discover effective therapeutics. We have generated a Tol2 Gateway-compatible toolbox to study neurodegenerative disorders in zebrafish, which includes promoters for astrocytes, microglia and motor neurons, multiple fluorophores, and compatibility for the introduction of genes of interest or disease-linked genes. This toolbox will advance the rapid and flexible generation of zebrafish models to discover the biology of the nervous system and the disease processes that lead to neurodegeneration.

  16. The Multivariate Temporal Response Function (mTRF) Toolbox: A MATLAB Toolbox for Relating Neural Signals to Continuous Stimuli.

    PubMed

    Crosse, Michael J; Di Liberto, Giovanni M; Bednar, Adam; Lalor, Edmund C

    2016-01-01

    Understanding how brains process sensory signals in natural environments is one of the key goals of twenty-first century neuroscience. While brain imaging and invasive electrophysiology will play key roles in this endeavor, there is also an important role to be played by noninvasive, macroscopic techniques with high temporal resolution such as electro- and magnetoencephalography. But challenges exist in determining how best to analyze such complex, time-varying neural responses to complex, time-varying and multivariate natural sensory stimuli. There has been a long history of applying system identification techniques to relate the firing activity of neurons to complex sensory stimuli and such techniques are now seeing increased application to EEG and MEG data. One particular example involves fitting a filter-often referred to as a temporal response function-that describes a mapping between some feature(s) of a sensory stimulus and the neural response. Here, we first briefly review the history of these system identification approaches and describe a specific technique for deriving temporal response functions known as regularized linear regression. We then introduce a new open-source toolbox for performing this analysis. We describe how it can be used to derive (multivariate) temporal response functions describing a mapping between stimulus and response in both directions. We also explain the importance of regularizing the analysis and how this regularization can be optimized for a particular dataset. We then outline specifically how the toolbox implements these analyses and provide several examples of the types of results that the toolbox can produce. Finally, we consider some of the limitations of the toolbox and opportunities for future development and application.
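The regularized linear regression the abstract names reduces to ridge regression on a time-lagged design matrix. A minimal sketch (function names and the toy kernel below are invented for illustration, not the mTRF Toolbox API):

```python
import numpy as np

def lag_matrix(stim, lags):
    """Design matrix with one column per time lag of the stimulus."""
    n = len(stim)
    X = np.zeros((n, len(lags)))
    for j, L in enumerate(lags):
        if L >= 0:
            X[L:, j] = stim[:n - L]
        else:
            X[:L, j] = stim[-L:]
    return X

def fit_trf(stim, resp, lags, lam=1.0):
    """Ridge-regression temporal response function: w = (X'X + lam*I)^-1 X'y."""
    X = lag_matrix(stim, lags)
    XtX = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ resp)

# Toy forward model: the response is the stimulus filtered by kernel [0, 1, 0.5, 0].
rng = np.random.default_rng(1)
stim = rng.standard_normal(2000)
resp = np.zeros(2000)
resp[1:] += 1.0 * stim[:-1]
resp[2:] += 0.5 * stim[:-2]
w = fit_trf(stim, resp, range(4), lam=1e-3)   # recovers ~[0, 1, 0.5, 0]
```

The regularization parameter lam is what the abstract says must be optimized per dataset; negative lags in the same design matrix give the backward (decoding) direction.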

  17. The Multivariate Temporal Response Function (mTRF) Toolbox: A MATLAB Toolbox for Relating Neural Signals to Continuous Stimuli

    PubMed Central

    Crosse, Michael J.; Di Liberto, Giovanni M.; Bednar, Adam; Lalor, Edmund C.

    2016-01-01

    Understanding how brains process sensory signals in natural environments is one of the key goals of twenty-first century neuroscience. While brain imaging and invasive electrophysiology will play key roles in this endeavor, there is also an important role to be played by noninvasive, macroscopic techniques with high temporal resolution such as electro- and magnetoencephalography. But challenges exist in determining how best to analyze such complex, time-varying neural responses to complex, time-varying and multivariate natural sensory stimuli. There has been a long history of applying system identification techniques to relate the firing activity of neurons to complex sensory stimuli and such techniques are now seeing increased application to EEG and MEG data. One particular example involves fitting a filter—often referred to as a temporal response function—that describes a mapping between some feature(s) of a sensory stimulus and the neural response. Here, we first briefly review the history of these system identification approaches and describe a specific technique for deriving temporal response functions known as regularized linear regression. We then introduce a new open-source toolbox for performing this analysis. We describe how it can be used to derive (multivariate) temporal response functions describing a mapping between stimulus and response in both directions. We also explain the importance of regularizing the analysis and how this regularization can be optimized for a particular dataset. We then outline specifically how the toolbox implements these analyses and provide several examples of the types of results that the toolbox can produce. Finally, we consider some of the limitations of the toolbox and opportunities for future development and application. PMID:27965557

  18. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    PubMed

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique to understand the dynamics of complex biochemical systems. To promote the modeling, we had developed the CADLIVE dynamic simulator that automatically converted a biochemical map into its associated mathematical model, simulated its dynamic behaviors and analyzed its robustness. To enhance the feasibility by CADLIVE and extend its functions, we propose the CADLIVE toolbox available for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to the research of systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with an instruction.

  19. ACCEPT: Introduction of the Adverse Condition and Critical Event Prediction Toolbox

    NASA Technical Reports Server (NTRS)

    Martin, Rodney A.; Santanu, Das; Janakiraman, Vijay Manikandan; Hosein, Stefan

    2015-01-01

The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we introduce a generic framework developed in MATLAB(R) called ACCEPT (Adverse Condition and Critical Event Prediction Toolbox). ACCEPT is an architectural framework designed to compare and contrast the performance of a variety of machine learning and early warning algorithms, and tests the capability of these algorithms to robustly predict the onset of adverse events in any time-series data generating systems or processes.

  20. OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.

    PubMed

    Purvis, Lucian A B; Clarke, William T; Biasiolli, Luca; Valkovič, Ladislav; Robson, Matthew D; Rodgers, Christopher T

    2017-01-01

    In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or for specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.

  1. An overview of recent advances in system identification

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan

    1994-01-01

This paper presents an overview of the recent advances in system identification for modal testing and control of large flexible structures. Several techniques are discussed, including Observer/Kalman Filter Identification, Observer/Controller Identification, and State-Space System Identification in the Frequency Domain. The System/Observer/Controller Toolbox developed at NASA Langley Research Center is used to show the applications of these techniques to real aerospace structures such as the Hubble Space Telescope and an active flexible aircraft wing.

  2. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
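Of the three metric families listed, the derivative-based elementary effects (the Morris approach) are the simplest to sketch. The toy version below uses random one-at-a-time perturbations, which is not VARS-TOOL's actual sampling strategy (VARS-TOOL uses structured designs such as PLHS); the model and parameter names are invented:

```python
import numpy as np

def elementary_effects(f, k, r=50, delta=0.1, seed=0):
    """Mean absolute elementary effect per parameter over r random base points.
    A sketch of the Morris (derivative-based) family, not VARS-TOOL itself."""
    rng = np.random.default_rng(seed)
    mu_star = np.zeros(k)
    for _ in range(r):
        x = rng.uniform(0, 1 - delta, size=k)   # base point in the unit cube
        fx = f(x)
        for i in range(k):
            xp = x.copy()
            xp[i] += delta                      # perturb one factor at a time
            mu_star[i] += abs(f(xp) - fx) / delta
    return mu_star / r

# Toy model: x0 dominates, x1 is weak, x2 is inactive.
model = lambda x: 4 * x[0] + 0.5 * x[1]
print(elementary_effects(model, 3))   # ranks x0 > x1 > x2
```

The variogram-based IVARS metrics generalize this idea by examining response differences across a whole range of perturbation scales rather than a single delta.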

  3. 0-6775 : NTCIP-based traffic signal evaluation and optimization toolbox.

    DOT National Transportation Integrated Search

    2014-08-01

    Routine maintenance of traffic signals requires : identification and resolution of hardware faults : and operational inefficiencies. Like most : agencies in charge of operating and maintaining : traffic signals scattered over large geographic : regio...

  4. Sensor Selection from Independence Graphs using Submodularity

    DTIC Science & Technology

    2017-02-01

    Krause, B. McMahan, Guestrin C., and Gupta A., “Robust submodular observation selection,” Journal of Machine Learning Research (JMLR), vol. 9, pp. 2761...235–257. Springer Berlin Heidelberg, 1983. [10] A. Krause, “SFO: A toolbox for submodular function optimization,” J. Mach. Learn. Res., vol. 11, pp
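
    The citation fragment above concerns submodular observation (sensor) selection and Krause's SFO toolbox. The core idea, greedy maximization of a monotone submodular objective such as coverage, can be sketched as follows; the sensor sets are hypothetical and this is not SFO code.

    ```python
    def greedy_max_coverage(sets, k):
        """Greedily pick k sets maximizing coverage (a submodular objective).

        For monotone submodular objectives the greedy rule achieves a
        (1 - 1/e) approximation guarantee (Nemhauser et al.).
        """
        chosen, covered = [], set()
        for _ in range(k):
            best, best_gain = None, -1
            for name, s in sets.items():
                if name in chosen:
                    continue
                gain = len(s - covered)   # marginal coverage gain
                if gain > best_gain:
                    best, best_gain = name, gain
            chosen.append(best)
            covered |= sets[best]
        return chosen, covered

    # Hypothetical sensors and the regions each one observes
    sensors = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6, 7}, "d": {1, 7}}
    picked, covered = greedy_max_coverage(sensors, 2)
    ```

    Here the greedy rule first picks "c" (gain 4), then "a" (gain 3), covering all seven regions with two sensors.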

  5. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    PubMed Central

    Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano

    2009-01-01

    Background: Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results: Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. 
Conclusion: The new toolbox presented here implements fast and data-robust computations of the most relevant quantities used in information theoretic analysis of neural data. The toolbox can be easily used within Matlab, the environment used by most neuroscience laboratories for the acquisition, preprocessing and plotting of neural data. It can therefore significantly enlarge the domain of application of information theory to neuroscience, and lead to new discoveries about the neural code. PMID:19607698
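
    As a rough illustration of the quantities such a toolbox computes, here is a minimal plug-in mutual information estimator with the simple Miller-Madow bias correction. The toolbox described above implements far more sophisticated, computationally optimized bias corrections; this sketch is not its code.

    ```python
    import numpy as np

    def mutual_information(x, y, correct_bias=True):
        """Plug-in mutual information (in bits) between two discrete sequences,
        optionally applying the Miller-Madow bias correction."""
        x, y = np.asarray(x), np.asarray(y)
        n = x.size
        _, xi = np.unique(x, return_inverse=True)
        _, yi = np.unique(y, return_inverse=True)
        joint = np.zeros((xi.max() + 1, yi.max() + 1))
        np.add.at(joint, (xi, yi), 1)          # joint histogram
        joint /= n
        px, py = joint.sum(1), joint.sum(0)
        nz = joint > 0
        mi = np.sum(joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz]))
        if correct_bias:
            # Miller-Madow: (m_x - 1) + (m_y - 1) - (m_xy - 1) over 2n, in bits
            mi += ((px > 0).sum() + (py > 0).sum() - nz.sum() - 1) / (2 * n * np.log(2))
        return mi

    # Two perfectly dependent uniform binary signals carry exactly 1 bit
    x = np.array([0, 1] * 50)
    mi = mutual_information(x, x, correct_bias=False)
    ```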

  6. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings.

    PubMed

    Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano

    2009-07-16

    Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. 
The new toolbox presented here implements fast and data-robust computations of the most relevant quantities used in information theoretic analysis of neural data. The toolbox can be easily used within Matlab, the environment used by most neuroscience laboratories for the acquisition, preprocessing and plotting of neural data. It can therefore significantly enlarge the domain of application of information theory to neuroscience, and lead to new discoveries about the neural code.

  7. ERPLAB: an open-source toolbox for the analysis of event-related potentials

    PubMed Central

    Lopez-Calderon, Javier; Luck, Steven J.

    2014-01-01

    ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations. PMID:24782741
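
    The averaging and difference-wave operations that ERPLAB exposes through its GUI reduce, at their core, to simple array arithmetic. The sketch below illustrates the idea on toy data; the names and shapes are invented and this is not ERPLAB's API.

    ```python
    import numpy as np

    def average_erp(epochs):
        """Average single-trial epochs (trials x channels x samples) into an ERP."""
        return np.asarray(epochs, float).mean(axis=0)

    def difference_wave(erp_a, erp_b):
        """Difference wave between two averaged ERPs (e.g. deviant minus standard)."""
        return erp_a - erp_b

    # Toy data: 2 trials, 1 channel, 3 samples; trial noise cancels in the average
    epochs_a = np.array([[[1., 2., 3.]],
                         [[3., 2., 1.]]])
    epochs_b = np.zeros_like(epochs_a)
    diff = difference_wave(average_erp(epochs_a), average_erp(epochs_b))
    ```

    Averaging the two trials of condition A yields a flat [2, 2, 2] waveform, so the difference wave against the zero-condition B is [2, 2, 2] as well.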

  8. ERPLAB: an open-source toolbox for the analysis of event-related potentials.

    PubMed

    Lopez-Calderon, Javier; Luck, Steven J

    2014-01-01

    ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB's EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB's tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user's guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations.

  9. LIMO EEG: a toolbox for hierarchical LInear MOdeling of ElectroEncephaloGraphic data.

    PubMed

    Pernet, Cyril R; Chauveau, Nicolas; Gaspar, Carl; Rousselet, Guillaume A

    2011-01-01

    Magnetic- and electric-evoked brain responses have traditionally been analyzed by comparing the peaks or mean amplitudes of signals from selected channels and averaged across trials. More recently, tools have been developed to investigate single trial response variability (e.g., EEGLAB) and to test differences between averaged evoked responses over the entire scalp and time dimensions (e.g., SPM, Fieldtrip). LIMO EEG is a Matlab toolbox (EEGLAB compatible) to analyse evoked responses over all space and time dimensions, while accounting for single trial variability using a simple hierarchical linear modelling of the data. In addition, LIMO EEG provides robust parametric tests, therefore providing a new and complementary tool in the analysis of neural evoked responses.

  10. LIMO EEG: A Toolbox for Hierarchical LInear MOdeling of ElectroEncephaloGraphic Data

    PubMed Central

    Pernet, Cyril R.; Chauveau, Nicolas; Gaspar, Carl; Rousselet, Guillaume A.

    2011-01-01

    Magnetic- and electric-evoked brain responses have traditionally been analyzed by comparing the peaks or mean amplitudes of signals from selected channels and averaged across trials. More recently, tools have been developed to investigate single trial response variability (e.g., EEGLAB) and to test differences between averaged evoked responses over the entire scalp and time dimensions (e.g., SPM, Fieldtrip). LIMO EEG is a Matlab toolbox (EEGLAB compatible) to analyse evoked responses over all space and time dimensions, while accounting for single trial variability using a simple hierarchical linear modelling of the data. In addition, LIMO EEG provides robust parametric tests, therefore providing a new and complementary tool in the analysis of neural evoked responses. PMID:21403915
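
    The first (single-trial) level of such a hierarchical linear model is a mass-univariate GLM: an ordinary least-squares fit at every channel and timepoint. A minimal sketch with an invented design matrix and data layout (not LIMO EEG's implementation):

    ```python
    import numpy as np

    def mass_univariate_glm(trials, design):
        """OLS GLM fitted independently at every channel/timepoint.

        trials: (n_trials, n_channels, n_samples) single-trial amplitudes
        design: (n_trials, n_regressors) design matrix
        Returns betas of shape (n_regressors, n_channels, n_samples).
        """
        n_trials, n_ch, n_s = trials.shape
        y = trials.reshape(n_trials, -1)                  # flatten space-time
        betas, *_ = np.linalg.lstsq(design, y, rcond=None)
        return betas.reshape(-1, n_ch, n_s)

    # Toy example: trial amplitude = 0.5 + 2 * stimulus covariate, no noise
    covariate = np.arange(10.0)
    design = np.column_stack([np.ones(10), covariate])    # intercept + covariate
    trials = 0.5 + 2.0 * covariate[:, None, None] * np.ones((10, 2, 3))
    betas = mass_univariate_glm(trials, design)
    ```

    With noiseless data the fit is exact: the intercept map is 0.5 and the covariate map is 2.0 at every channel/timepoint; LIMO EEG then takes such per-subject betas to a second, group level.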

  11. A novel toolbox for E. coli lysis monitoring.

    PubMed

    Rajamanickam, Vignesh; Wurm, David; Slouka, Christoph; Herwig, Christoph; Spadiut, Oliver

    2017-01-01

    The bacterium Escherichia coli is a well-studied recombinant host organism with a plethora of applications in biotechnology. Highly valuable biopharmaceuticals, such as antibody fragments and growth factors, are currently being produced in E. coli. However, the high metabolic burden during recombinant protein production can lead to cell death, consequent lysis, and undesired product loss. Thus, fast and precise analyzers to monitor E. coli bioprocesses and to retrieve key process information, such as the optimal time point of harvest, are needed. However, such reliable monitoring tools are still scarce to date. In this study, we cultivated an E. coli strain producing a recombinant single-chain antibody fragment in the cytoplasm. In bioreactor cultivations, we purposely triggered cell lysis by pH ramps. We developed a novel toolbox using UV chromatograms as fingerprints and chemometric techniques to monitor these lysis events and used flow cytometry (FCM) as reference method to quantify viability offline. Summarizing, we were able to show that a novel toolbox comprising HPLC chromatogram fingerprinting and data science tools allowed the identification of E. coli lysis in a fast and reliable manner. We are convinced that this toolbox will not only facilitate E. coli bioprocess monitoring but will also allow enhanced process control in the future.

  12. Microbe-ID: An open source toolbox for microbial genotyping and species identification

    USDA-ARS?s Scientific Manuscript database

    Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user...

  13. Microbe-ID: an open source toolbox for microbial genotyping and species identification.

    PubMed

    Tabima, Javier F; Everhart, Sydney E; Larsen, Meredith M; Weisberg, Alexandra J; Kamvar, Zhian N; Tancos, Matthew A; Smart, Christine D; Chang, Jeff H; Grünwald, Niklaus J

    2016-01-01

    Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user-friendly analytical tools for fast identification are not readily available. To address this need, we created a web-based set of applications called Microbe-ID that allow for customizing a toolbox for rapid species identification and strain genotyping using any genetic markers of choice. Two components of Microbe-ID, named Sequence-ID and Genotype-ID, implement species and genotype identification, respectively. Sequence-ID allows identification of species by using BLAST to query sequences for any locus of interest against a custom reference sequence database. Genotype-ID allows placement of an unknown multilocus marker in either a minimum spanning network or dendrogram with bootstrap support from a user-created reference database. Microbe-ID can be used for identification of any organism based on nucleotide sequences or any molecular marker type and several examples are provided. We created a public website for demonstration purposes called Microbe-ID (microbe-id.org) and provided a working implementation for the genus Phytophthora (phytophthora-id.org). In Phytophthora-ID, the Sequence-ID application allows identification based on ITS or cox spacer sequences. Genotype-ID groups individuals into clonal lineages based on simple sequence repeat (SSR) markers for the two invasive plant pathogen species P. infestans and P. ramorum. All code is open source and available on github and CRAN. Instructions for installation and use are provided at https://github.com/grunwaldlab/Microbe-ID.
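
    A minimum spanning network over pairwise genetic distances is one of the placement methods Genotype-ID offers. The underlying idea can be sketched with Prim's algorithm on a distance matrix; the distances below are hypothetical, and Microbe-ID itself builds on R packages rather than this code.

    ```python
    import numpy as np

    def minimum_spanning_tree_edges(dist):
        """Prim's algorithm over a symmetric distance matrix; returns MST edges."""
        n = dist.shape[0]
        in_tree = {0}
        edges = []
        while len(in_tree) < n:
            best = None
            for i in in_tree:                       # grow tree by cheapest edge
                for j in range(n):
                    if j not in in_tree:
                        if best is None or dist[i, j] < dist[best[0], best[1]]:
                            best = (i, j)
            edges.append(best)
            in_tree.add(best[1])
        return edges

    # Hypothetical pairwise distances between four multilocus genotypes
    d = np.array([[0, 1, 4, 5],
                  [1, 0, 2, 6],
                  [4, 2, 0, 3],
                  [5, 6, 3, 0]], float)
    edges = minimum_spanning_tree_edges(d)
    ```

    The resulting chain 0-1-2-3 connects each unknown genotype to its closest relative, which is exactly how a minimum spanning network places an unknown sample relative to a reference database.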

  14. Citizen Science Air Monitoring in the Ironbound Community ...

    EPA Pesticide Factsheets

    The Environmental Protection Agency’s (EPA) mission is to protect human health and the environment. To move toward achieving this goal, EPA is facilitating identification of potential environmental concerns, particularly in vulnerable communities. This includes actively supporting citizen science projects and providing communities with the information and assistance they need to conduct their own air pollution monitoring efforts. The Air Sensor Toolbox for Citizen Scientists was developed as a resource to meet stakeholder needs. Examples of materials developed for the Toolbox and ultimately pilot tested in the Ironbound Community in Newark, New Jersey are reported here. The Air Sensor Toolbox for Citizen Scientists is designed as an online resource that provides information and guidance on new, low-cost compact technologies used for measuring air quality. The Toolbox features resources developed by EPA researchers that can be used by citizens to effectively collect, analyze, interpret, and communicate air quality data. The resources include information about sampling methods, how to calibrate and validate monitors, options for measuring air quality, data interpretation guidelines, and low-cost sensor performance information. This Regional Applied Research Effort (RARE) project provided an opportunity for the Office of Research and Development (ORD) to work collaboratively with EPA Region 2 to provide the Ironbound Community with a “Toolbox” specific for c

  15. Robust signal peptides for protein secretion in Yarrowia lipolytica: identification and characterization of novel secretory tags.

    PubMed

    Celińska, Ewelina; Borkowska, Monika; Białas, Wojciech; Korpys, Paulina; Nicaud, Jean-Marc

    2018-06-01

    Upon expression of a given protein in an expression host, its secretion into the culture medium or cell-surface display is frequently advantageous in both research and industrial contexts. Hence, engineering strategies targeting folding, trafficking, and secretion of the proteins gain considerable interest. Yarrowia lipolytica has emerged as an efficient protein expression platform, repeatedly proved to be a competitive secretor of proteins. Although the key role of signal peptides (SPs) in secretory overexpression of proteins and their direct effect on the final protein titers are widely known, the number of reports on manipulation with SPs in Y. lipolytica is rather scattered. In this study, we assessed the potential of ten different SPs for secretion of two heterologous proteins in Y. lipolytica. Genomic and transcriptomic data mining allowed us to select five novel, previously undescribed SPs for recombinant protein secretion in Y. lipolytica. Their secretory potential was assessed in comparison with known, widely exploited SPs. We took advantage of Golden Gate approach, for construction of expression cassettes, and micro-volume enzymatic assays, for functional screening of large libraries of recombinant strains. Based on the adopted strategy, we identified novel secretory tags, characterized their secretory capacity, indicated the most potent SPs, and suggested a consensus sequence of a potentially robust synthetic SP to expand the molecular toolbox for engineering Y. lipolytica.

  16. Dynamical modeling and multi-experiment fitting with PottersWheel

    PubMed Central

    Maiwald, Thomas; Timmer, Jens

    2008-01-01

    Motivation: Modelers in Systems Biology need a flexible framework that allows them to easily create new dynamic models, investigate their properties and fit several experimental datasets simultaneously. Multi-experiment-fitting is a powerful approach to estimate parameter values, to check the validity of a given model, and to discriminate competing model hypotheses. It requires high-performance integration of ordinary differential equations and robust optimization. Results: We here present the comprehensive modeling framework PottersWheel (PW) including novel functionalities to satisfy these requirements with strong emphasis on the inverse problem, i.e. data-based modeling of partially observed and noisy systems like signal transduction pathways and metabolic networks. PW is designed as a MATLAB toolbox and includes numerous user interfaces. Deterministic and stochastic optimization routines are combined by fitting in logarithmic parameter space allowing for robust parameter calibration. Model investigation includes statistical tests for model-data-compliance, model discrimination, identifiability analysis and calculation of Hessian- and Monte-Carlo-based parameter confidence limits. A rich application programming interface is available for customization within own MATLAB code. Within an extensive performance analysis, we identified and significantly improved an integrator–optimizer pair which decreases the fitting duration for a realistic benchmark model by a factor of over 3000 compared to MATLAB with optimization toolbox. Availability: PottersWheel is freely available for academic usage at http://www.PottersWheel.de/. The website contains a detailed documentation and introductory videos. The program has been intensively used since 2005 on Windows, Linux and Macintosh computers and does not require special MATLAB toolboxes. Contact: maiwald@fdm.uni-freiburg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18614583
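
    Fitting in logarithmic parameter space, as PottersWheel does, keeps rate constants positive and often improves optimizer conditioning. Below is a minimal SciPy sketch on an invented one-parameter decay model fitted against two simulated experiments; this is an illustration of the strategy, not PottersWheel code.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import minimize

    def simulate(k, t_eval, y0):
        """Integrate the toy model dy/dt = -k*y from initial value y0."""
        sol = solve_ivp(lambda t, y: -k * y, (0, t_eval[-1]), [y0], t_eval=t_eval)
        return sol.y[0]

    def cost(log_k, experiments):
        """Sum of squared residuals across all experiments, fitted in log space."""
        k = np.exp(log_k[0])
        return sum(np.sum((simulate(k, t, y0) - data) ** 2)
                   for t, y0, data in experiments)

    t = np.linspace(0, 2, 20)
    true_k = 0.7
    experiments = [(t, 1.0, np.exp(-true_k * t)),        # experiment 1, y0 = 1
                   (t, 2.0, 2.0 * np.exp(-true_k * t))]  # experiment 2, y0 = 2
    fit = minimize(cost, x0=[np.log(0.1)], args=(experiments,),
                   method="Nelder-Mead")
    k_hat = float(np.exp(fit.x[0]))
    ```

    Optimizing log k rather than k guarantees k = exp(log k) > 0 for any optimizer step, and both experiments constrain the single shared parameter simultaneously.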

  17. Robust blood-glucose control using Mathematica.

    PubMed

    Kovács, Levente; Paláncz, Béla; Benyó, Balázs; Török, László; Benyó, Zoltán

    2006-01-01

    A robust control design in the frequency domain using Mathematica is presented for regulation of the glucose level in type I diabetes patients under intensive care. The method originally proposed under Mathematica by Helton and Merino (now with an improved disturbance rejection constraint inequality) is employed, using a three-state minimal patient model. The robustness of the resulting high-order linear controller is demonstrated by nonlinear closed-loop simulation in state space for standard meal disturbances, and is compared with an H infinity design implemented with the mu-toolbox of Matlab. The controller, designed with the model parameters representing the most favorable plant dynamics from the point of view of control purposes, can operate properly even with the parameter values of the worst-case scenario.

  18. iELVis: An open source MATLAB toolbox for localizing and visualizing human intracranial electrode data.

    PubMed

    Groppe, David M; Bickel, Stephan; Dykstra, Andrew R; Wang, Xiuyuan; Mégevand, Pierre; Mercier, Manuel R; Lado, Fred A; Mehta, Ashesh D; Honey, Christopher J

    2017-04-01

    Intracranial electrical recordings (iEEG) and brain stimulation (iEBS) are invaluable human neuroscience methodologies. However, the value of such data is often unrealized as many laboratories lack tools for localizing electrodes relative to anatomy. To remedy this, we have developed a MATLAB toolbox for intracranial electrode localization and visualization, iELVis. NEW METHOD: iELVis uses existing tools (BioImage Suite, FSL, and FreeSurfer) for preimplant magnetic resonance imaging (MRI) segmentation, neuroimaging coregistration, and manual identification of electrodes in postimplant neuroimaging. Subsequently, iELVis implements methods for correcting electrode locations for postimplant brain shift with millimeter-scale accuracy and provides interactive visualization on 3D surfaces or in 2D slices with optional functional neuroimaging overlays. iELVis also localizes electrodes relative to FreeSurfer-based atlases and can combine data across subjects via the FreeSurfer average brain. It takes 30-60 min of user time and 12-24 h of computer time to localize and visualize electrodes from one brain. We demonstrate iELVis's functionality by showing that three methods for mapping primary hand somatosensory cortex (iEEG, iEBS, and functional MRI) provide highly concordant results. COMPARISON WITH EXISTING METHODS: iELVis is the first public software for electrode localization that corrects for brain shift, maps electrodes to an average brain, and supports neuroimaging overlays. Moreover, its interactive visualizations are powerful and its tutorial material is extensive. iELVis promises to speed the progress and enhance the robustness of intracranial electrode research. The software and extensive tutorial materials are freely available as part of the EpiSurg software project: https://github.com/episurg/episurg. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Microbe-ID: an open source toolbox for microbial genotyping and species identification

    PubMed Central

    Tabima, Javier F.; Everhart, Sydney E.; Larsen, Meredith M.; Weisberg, Alexandra J.; Kamvar, Zhian N.; Tancos, Matthew A.; Smart, Christine D.; Chang, Jeff H.

    2016-01-01

    Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user-friendly analytical tools for fast identification are not readily available. To address this need, we created a web-based set of applications called Microbe-ID that allow for customizing a toolbox for rapid species identification and strain genotyping using any genetic markers of choice. Two components of Microbe-ID, named Sequence-ID and Genotype-ID, implement species and genotype identification, respectively. Sequence-ID allows identification of species by using BLAST to query sequences for any locus of interest against a custom reference sequence database. Genotype-ID allows placement of an unknown multilocus marker in either a minimum spanning network or dendrogram with bootstrap support from a user-created reference database. Microbe-ID can be used for identification of any organism based on nucleotide sequences or any molecular marker type and several examples are provided. We created a public website for demonstration purposes called Microbe-ID (microbe-id.org) and provided a working implementation for the genus Phytophthora (phytophthora-id.org). In Phytophthora-ID, the Sequence-ID application allows identification based on ITS or cox spacer sequences. Genotype-ID groups individuals into clonal lineages based on simple sequence repeat (SSR) markers for the two invasive plant pathogen species P. infestans and P. ramorum. All code is open source and available on github and CRAN. Instructions for installation and use are provided at https://github.com/grunwaldlab/Microbe-ID. PMID:27602267

  20. Forensic Odontology: Automatic Identification of Persons Comparing Antemortem and Postmortem Panoramic Radiographs Using Computer Vision.

    PubMed

    Heinrich, Andreas; Güttler, Felix; Wendt, Sebastian; Schenkl, Sebastian; Hubig, Michael; Wagner, Rebecca; Mall, Gita; Teichgräber, Ulf

    2018-06-18

    In forensic odontology, the comparison between antemortem and postmortem panoramic radiographs (PRs) is a reliable method for person identification. The purpose of this study was to improve and automate identification of unknown people by comparison between antemortem and postmortem PRs using computer vision. The study includes 43 467 PRs from 24 545 patients (46 % females/54 % males). All PRs were filtered and evaluated with Matlab R2014b, including the Image Processing and Computer Vision System toolboxes. The matching process used SURF features to find the corresponding points between two PRs (unknown person and database entry) out of the whole database. From 40 randomly selected persons, 34 persons (85 %) could be reliably identified by corresponding PR matching points between an already existing scan in the database and the most recent PR. The systematic matching yielded a maximum of 259 points for a successful identification between two different PRs of the same person and a maximum of 12 corresponding matching points for other, non-identical persons in the database. Hence, 12 matching points serve as the threshold for reliable assignment. Operating with an automatic PR system and computer vision could be a successful and reliable tool for identification purposes. The applied method distinguishes itself by virtue of its fast and reliable identification of persons by PR. This identification method is suitable even if dental characteristics were removed or added in the past. The system seems to be robust for large amounts of data. · Computer vision allows an automated antemortem and postmortem comparison of panoramic radiographs (PRs) for person identification. · The present method is able to find identical matching partners among huge datasets (big data) in a short computing time. · The identification method is suitable even if dental characteristics were removed or added. · Heinrich A, Güttler F, Wendt S et al. 
Forensic Odontology: Automatic Identification of Persons Comparing Antemortem and Postmortem Panoramic Radiographs Using Computer Vision. Fortschr Röntgenstr 2018; DOI: 10.1055/a-0632-4744. © Georg Thieme Verlag KG Stuttgart · New York.
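
    The matching step described above, counting feature correspondences and declaring identity when the count exceeds a threshold, can be illustrated with a simple nearest-neighbor descriptor matcher using Lowe's ratio test. The descriptors below are random stand-ins, not SURF features, and the code is unrelated to the study's Matlab implementation.

    ```python
    import numpy as np

    def count_matches(desc_a, desc_b, ratio=0.75):
        """Count descriptor correspondences passing Lowe's ratio test:
        accept a match only if the nearest neighbor is much closer than
        the second-nearest."""
        matches = 0
        for d in desc_a:
            dists = np.sort(np.linalg.norm(desc_b - d, axis=1))
            if len(dists) >= 2 and dists[0] < ratio * dists[1]:
                matches += 1
        return matches

    rng = np.random.default_rng(1)
    person = rng.random((40, 32))                       # hypothetical descriptors
    same = person + rng.normal(0, 0.01, person.shape)   # re-scan of same person
    other = rng.random((40, 32))                        # different person
    # Mimic the study's decision rule: identity requires > 12 matching points
    same_person = count_matches(person, same) > 12
    diff_person = count_matches(person, other) > 12
    ```

    Slightly perturbed copies of the same descriptors pass the ratio test almost everywhere, while unrelated random descriptors rarely do, so only the genuine pair clears the 12-point threshold.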

  1. Incorporation of support vector machines in the LIBS toolbox for sensitive and robust classification amidst unexpected sample and system variability

    PubMed Central

    Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P.; Kumar, G. Manoj

    2012-01-01

    Despite the intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real world applications, e.g. quality assurance and process monitoring. Specifically, variability in sample, system and experimental parameters in LIBS studies presents a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a non-linear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), due to its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples as well as in related areas of forensic and biological sample analysis. PMID:22292496

  2. Incorporation of support vector machines in the LIBS toolbox for sensitive and robust classification amidst unexpected sample and system variability.

    PubMed

    Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P; Kumar Gundawar, Manoj

    2012-03-20

    Despite the intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real-world applications, e.g., quality assurance and process monitoring. Specifically, variability in sample, system, and experimental parameters in LIBS studies presents a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a nonlinear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that the application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), because of its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples, as well as in related areas of forensic and biological sample analysis.
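
    The contrast between a kernel SVM and a linear classifier can be reproduced on synthetic data with a nonlinear class boundary. The sketch below uses scikit-learn with invented ring-shaped two-class data, not LIBS spectra, and linear discriminant analysis stands in for the linear chemometric baselines (SIMCA and PLS-DA themselves are not in scikit-learn).

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Synthetic two-class data: class depends on the radius |x|, so no single
    # linear boundary can separate the classes, but an RBF kernel can.
    rng = np.random.default_rng(0)
    X = rng.normal(0, 1, (200, 2))
    y = (np.linalg.norm(X, axis=1) > 1.0).astype(int)   # ring-shaped class 1

    svm = SVC(kernel="rbf").fit(X, y)                   # nonlinear classifier
    lda = LinearDiscriminantAnalysis().fit(X, y)        # linear baseline
    svm_acc = svm.score(X, y)
    lda_acc = lda.score(X, y)
    ```

    On this data the RBF SVM traces the circular boundary and scores far above the linear model, which cannot do better than roughly the majority-class rate; the same qualitative gap is what the abstract reports for LIBS spectra.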

  3. Robust stochastic stability of discrete-time fuzzy Markovian jump neural networks.

    PubMed

    Arunkumar, A; Sakthivel, R; Mathiyalagan, K; Park, Ju H

    2014-07-01

    This paper focuses on the issue of robust stochastic stability for a class of uncertain fuzzy Markovian jumping discrete-time neural networks (FMJDNNs) with various activation functions and mixed time delays. By employing the Lyapunov technique and the linear matrix inequality (LMI) approach, a new set of delay-dependent sufficient conditions is established for the robust stochastic stability of uncertain FMJDNNs. More precisely, the parameter uncertainties are assumed to be time varying, unknown, and norm bounded. The obtained stability conditions are expressed in terms of LMIs, which can be easily checked using the efficient MATLAB LMI toolbox. Finally, numerical examples with simulation results are provided to illustrate the effectiveness and reduced conservatism of the obtained results. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
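For a nominal (uncertainty-free) discrete-time linear system, the kind of certificate an LMI solver produces reduces to finding a positive definite P with A'PA - P < 0. A minimal sketch, with a hypothetical 2x2 system matrix not taken from the paper, solves the corresponding discrete Lyapunov equation directly:

```python
import numpy as np

# Hypothetical 2-state discrete-time system matrix (not from the paper).
A = np.array([[0.5, 0.1],
              [0.2, 0.6]])
Q = np.eye(2)

# Solve the discrete Lyapunov equation A' P A - P = -Q via vectorization:
# (I - kron(A', A')) vec(P) = vec(Q).  A positive definite P certifies
# stability, the same feasibility question an LMI solver answers.
n = A.shape[0]
vecP = np.linalg.solve(np.eye(n * n) - np.kron(A.T, A.T), Q.flatten())
P = vecP.reshape(n, n)

eigP = np.linalg.eigvalsh((P + P.T) / 2)   # eigenvalues of symmetrized P
spectral_radius = max(abs(np.linalg.eigvals(A)))
print(eigP, spectral_radius)
```

The eigenvalues of P come out positive exactly when the spectral radius of A is below one, consistent with the Lyapunov stability condition the paper's LMIs generalize to the uncertain, delayed, Markovian-jump setting.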

  4. Qualitative methods in health services and management research: pockets of excellence and progress, but still a long way to go.

    PubMed

    Devers, Kelly J

    2011-02-01

    The 10-year systematic review of published health services and management research by Weiner et al. (2011) chronicles the contributions of qualitative methods, highlights areas of substantial progress, and identifies areas in need of more progress. This article (Devers, 2011) discusses possible reasons for lack of progress in some areas--related to the under-supply of well-trained qualitative researchers and more tangible demand for their research--and mechanisms for future improvement. To ensure a robust health services research toolbox, the field must take additional steps to provide stronger education and training in qualitative methods and more funding and publication opportunities. Given the rapidly changing health care system following the passage of national health reform, and the challenging research issues associated with it, the health services research and management field will not meet its future challenges with quantitative methods alone or with a half-empty toolbox.

  5. ParTIES: a toolbox for Paramecium interspersed DNA elimination studies.

    PubMed

    Denby Wilkes, Cyril; Arnaiz, Olivier; Sperling, Linda

    2016-02-15

    Developmental DNA elimination occurs in a wide variety of multicellular organisms, but ciliates are the only single-celled eukaryotes in which this phenomenon has been reported. Despite considerable interest in ciliates as models for DNA elimination, no standard methods for identification and characterization of the eliminated sequences are currently available. We present the Paramecium Toolbox for Interspersed DNA Elimination Studies (ParTIES), designed for Paramecium species, that (i) identifies eliminated sequences, (ii) measures their presence in a sequencing sample and (iii) detects rare elimination polymorphisms. ParTIES is multi-threaded Perl software available at https://github.com/oarnaiz/ParTIES. ParTIES is distributed under the GNU General Public Licence v3. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Frequency Domain Identification Toolbox

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Juang, Jer-Nan; Chen, Chung-Wen

    1996-01-01

    This report documents software written in the MATLAB programming language for identifying systems from frequency response functions. MATLAB is a commercial software environment that allows easy manipulation of data matrices and provides intrinsic matrix function capabilities. The algorithms programmed in this collection of subroutines have been documented elsewhere, but all references are provided in this document. A main feature of this software is the use of matrix fraction descriptions and system realization theory to identify state-space models directly from test data. All subroutines include templates for the user to follow as guidelines.
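System realization theory, the core idea the report builds on, can be illustrated in the time domain by the classical Ho-Kalman/ERA construction: factor a block Hankel matrix of Markov parameters by SVD to recover a state-space model. This is a sketch with an assumed 2-state system, not the toolbox's frequency-domain matrix-fraction algorithm.

```python
import numpy as np

# Assumed "true" SISO system (illustrative): x+ = A x + B u, y = C x.
A = np.array([[0.8, 0.2], [0.0, 0.5]])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 0.0]])

# Markov parameters h_k = C A^k B (the impulse response).
h = [C @ np.linalg.matrix_power(A, k) @ B for k in range(20)]

# Ho-Kalman / ERA: factor the block Hankel matrix by SVD.
r, c = 8, 8
H0 = np.block([[h[i + j] for j in range(c)] for i in range(r)])
H1 = np.block([[h[i + j + 1] for j in range(c)] for i in range(r)])
U, s, Vt = np.linalg.svd(H0)
nx = 2  # model order, read off from the two dominant singular values
S = np.diag(np.sqrt(s[:nx]))
O, Ctr = U[:, :nx] @ S, S @ Vt[:nx]        # observability / controllability
Ahat = np.linalg.pinv(O) @ H1 @ np.linalg.pinv(Ctr)
Bhat, Chat = Ctr[:, :1], O[:1, :]

# The identified realization reproduces the Markov parameters.
err = max(abs((Chat @ np.linalg.matrix_power(Ahat, k) @ Bhat
               - h[k]).item()) for k in range(10))
print(err)
```

Because the Hankel matrix has exact rank 2 here, the recovered (Ahat, Bhat, Chat) matches the true response to machine precision; with noisy test data, the singular value spectrum is also how the model order is chosen.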

  7. Robust master-slave synchronization for general uncertain delayed dynamical model based on adaptive control scheme.

    PubMed

    Wang, Tianbo; Zhou, Wuneng; Zhao, Shouwei; Yu, Weiqin

    2014-03-01

    In this paper, the robust exponential synchronization problem for a class of uncertain delayed master-slave dynamical systems is investigated using the adaptive control method. Unlike some existing master-slave models, the considered master-slave system includes bounded unmodeled dynamics. In order to compensate for the effect of the unmodeled dynamics and effectively achieve synchronization, a novel adaptive controller with simple update laws is proposed. Moreover, the results are given in terms of LMIs, which can be easily solved with the LMI Toolbox in MATLAB. A numerical example is given to illustrate the effectiveness of the method. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  8. MRI Atlas-Based Measurement of Spinal Cord Injury Predicts Outcome in Acute Flaccid Myelitis.

    PubMed

    McCoy, D B; Talbott, J F; Wilson, Michael; Mamlouk, M D; Cohen-Adad, J; Wilson, Mark; Narvid, J

    2017-02-01

    Recent advances in spinal cord imaging analysis have led to the development of a robust anatomic template and atlas incorporated into an open-source platform referred to as the Spinal Cord Toolbox. Using the Spinal Cord Toolbox, we sought to correlate measures of GM, WM, and cross-sectional area pathology on T2 MR imaging with motor disability in patients with acute flaccid myelitis. Spinal cord imaging for 9 patients with acute flaccid myelitis was analyzed by using the Spinal Cord Toolbox. A semiautomated pipeline using the Spinal Cord Toolbox measured lesion involvement in GM, WM, and total spinal cord cross-sectional area. Proportions of GM, WM, and cross-sectional area affected by T2 hyperintensity were calculated across 3 ROIs: 1) center axial section of lesion; 2) full lesion segment; and 3) full cord atlas volume. Spearman rank order correlation was calculated to compare MR metrics with clinical measures of disability. Proportion of GM metrics at the center axial section significantly correlated with measures of motor impairment upon admission (r[9] = -0.78; P = .014) and at 3-month follow-up (r[9] = -0.66; P = .05). Further, proportion of GM extracted across the full lesion segment significantly correlated with initial motor impairment (r[9] = -0.74; P = .024). No significant correlation was found for proportion of WM or proportion of cross-sectional area with clinical disability. Atlas-based measures of proportion of GM T2 signal abnormality measured on a single axial MR imaging section and across the full lesion segment correlate with motor impairment and outcome in patients with acute flaccid myelitis. This is the first atlas-based study to correlate clinical outcomes with segmented measures of T2 signal abnormality in the spinal cord. © 2017 by American Journal of Neuroradiology.
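Spearman rank-order correlation, the statistic used throughout the study, is simply the Pearson correlation of the ranks. A minimal implementation with invented patient values (not the study's data; tie handling omitted for brevity):

```python
import numpy as np

def spearman_rho(x, y):
    # Spearman's rho: Pearson correlation of the ranks.
    # (argsort-of-argsort ranking; ties are ignored for brevity.)
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical values for 9 patients (illustrative only): proportion of
# gray matter affected vs. a motor-strength score; more GM involvement
# should track lower strength, i.e. a negative rho, as in the reported
# r[9] = -0.78.
gm_fraction = np.array([0.05, 0.10, 0.20, 0.25, 0.40, 0.45, 0.60, 0.70, 0.90])
strength    = np.array([4.8,  5.0,  4.2,  3.9,  3.1,  3.5,  2.2,  2.0,  0.8])

rho = spearman_rho(gm_fraction, strength)
print(rho)
```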

  9. MOtoNMS: A MATLAB toolbox to process motion data for neuromusculoskeletal modeling and simulation.

    PubMed

    Mantoan, Alice; Pizzolato, Claudio; Sartori, Massimo; Sawacha, Zimi; Cobelli, Claudio; Reggiani, Monica

    2015-01-01

    Neuromusculoskeletal modeling and simulation enable investigation of the neuromusculoskeletal system and its role in human movement dynamics. These methods are progressively being introduced into daily clinical practice. However, a major factor limiting this translation is the lack of robust tools for pre-processing experimental movement data for use in neuromusculoskeletal modeling software. This paper presents MOtoNMS (matlab MOtion data elaboration TOolbox for NeuroMusculoSkeletal applications), a toolbox freely available to the community that aims to fill this gap. MOtoNMS processes experimental data from different motion analysis devices and generates input data for neuromusculoskeletal modeling and simulation software, such as OpenSim and CEINMS (Calibrated EMG-Informed NMS Modelling Toolbox). MOtoNMS implements commonly required processing steps, and its generic architecture simplifies the integration of new user-defined processing components. MOtoNMS allows users to set up their laboratory configurations and processing procedures through user-friendly graphical interfaces, without requiring advanced computer skills. Finally, configuration choices can be stored, enabling full reproduction of the processing steps. MOtoNMS is released under the GNU General Public License and is available at the SimTK website and from the GitHub repository. Motion data collected at four institutions demonstrate that, despite differences in laboratory instrumentation and procedures, MOtoNMS succeeds in processing data and producing consistent inputs for OpenSim and CEINMS. MOtoNMS fills the gap between motion analysis and neuromusculoskeletal modeling and simulation. Its support for several devices, complete implementation of the pre-processing procedures, simple extensibility, available user interfaces, and free availability can boost the translation of neuromusculoskeletal methods into daily and clinical practice.

  10. Cortical Thickness Estimations of FreeSurfer and the CAT12 Toolbox in Patients with Alzheimer's Disease and Healthy Controls.

    PubMed

    Seiger, Rene; Ganger, Sebastian; Kranz, Georg S; Hahn, Andreas; Lanzenberger, Rupert

    2018-05-15

    Automated cortical thickness (CT) measurements are often used to assess gray matter changes in the healthy and diseased human brain. The FreeSurfer software is frequently applied for this type of analysis. The computational anatomy toolbox (CAT12) for SPM, which offers a fast and easy-to-use alternative approach, was recently made available. In this study, we compared region of interest (ROI)-wise CT estimations of the surface-based FreeSurfer 6 (FS6) software and the volume-based CAT12 toolbox for SPM using 44 elderly healthy female control subjects (HC). In addition, these 44 HCs from the cross-sectional analysis and 34 age- and sex-matched patients with Alzheimer's disease (AD) were used to assess the potential of detecting group differences for each method. Finally, a test-retest analysis was conducted using 19 HC subjects. All data were taken from the OASIS database and MRI scans were recorded at 1.5 Tesla. A strong correlation was observed between both methods in terms of ROI mean CT estimates (R² = 0.83). However, CAT12 delivered significantly higher CT estimations in 32 of the 34 ROIs, indicating a systematic difference between both approaches. Furthermore, both methods were able to reliably detect atrophic brain areas in AD subjects, with the highest decreases in temporal areas. Finally, FS6 as well as CAT12 showed excellent test-retest variability scores. Although CT estimations were systematically higher for CAT12, this study provides evidence that this new toolbox delivers accurate and robust CT estimates and can be considered a fast and reliable alternative to FreeSurfer. © 2018 The Authors. Journal of Neuroimaging published by Wiley Periodicals, Inc. on behalf of American Society of Neuroimaging.
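The study's central observation, a strong between-method correlation coexisting with a systematic offset, is easy to reproduce on synthetic thickness data. The values below are invented stand-ins, not OASIS measurements:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two methods measuring the same 34 ROI thicknesses (mm), one with a
# systematic positive offset (synthetic "FreeSurfer-like" and
# "CAT12-like" estimates; offset and noise levels are assumptions).
true_ct  = rng.uniform(2.0, 3.5, 34)
method_a = true_ct + rng.normal(0.0, 0.05, 34)           # unbiased
method_b = true_ct + 0.15 + rng.normal(0.0, 0.05, 34)    # biased upward

# High correlation can coexist with a systematic difference, so both
# the correlation and the paired mean difference should be checked.
r = np.corrcoef(method_a, method_b)[0, 1]
mean_diff = np.mean(method_b - method_a)
print(r, mean_diff)
```

The correlation alone would suggest near-equivalence; only the paired difference reveals the offset, which is why the abstract reports both.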

  11. Research on Robust Control Strategies for VSC-HVDC

    NASA Astrophysics Data System (ADS)

    Zhu, Kaicheng; Bao, Hai

    2018-01-01

    In the control system of VSC-HVDC, the phase-locked loop (PLL) provides phase signals for voltage vector control and trigger pulses to generate the required reference phase. The PLL is a typical second-order system. When the system is in an unstable state, it oscillates, shifts the trigger angle, produces harmonics, and couples the active and reactive power. Thus, considering the external disturbances introduced by the PLL into the VSC-HVDC control system, the parameter perturbations of the controller, and the model uncertainties, an H∞ robust controller is designed by solving a mixed-sensitivity optimization problem with the Hinf function provided by the robust control toolbox. It is then compared with a proportional-integral controller in MATLAB simulation experiments. With the H∞ robust controller, the active and reactive power of the converter station track changes in the reference values more accurately and quickly, with reduced overshoot. When a step change in active or reactive power occurs, mutual influence is reduced and better independent regulation is achieved.
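The ringing and overshoot attributed to a poorly damped second-order PLL can be quantified with the standard percent-overshoot formula for a second-order step response. A sketch with illustrative damping ratios (not values from the paper):

```python
import numpy as np

def overshoot(zeta):
    # Peak overshoot of a standard underdamped second-order system:
    # M_p = exp(-pi * zeta / sqrt(1 - zeta^2)), valid for 0 < zeta < 1.
    return np.exp(-np.pi * zeta / np.sqrt(1.0 - zeta ** 2))

# A lightly damped loop rings badly; added damping (what a well-tuned
# H-infinity design effectively buys) nearly eliminates the overshoot.
weak, strong = overshoot(0.2), overshoot(0.8)
print(weak, strong)
```

At a damping ratio of 0.2 the step response overshoots by more than 50%, while at 0.8 the overshoot is under 2%, matching the abstract's qualitative claim that the robust design reduces overshoot.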

  12. iElectrodes: A Comprehensive Open-Source Toolbox for Depth and Subdural Grid Electrode Localization.

    PubMed

    Blenkmann, Alejandro O; Phillips, Holly N; Princich, Juan P; Rowe, James B; Bekinschtein, Tristan A; Muravchik, Carlos H; Kochen, Silvia

    2017-01-01

    The localization of intracranial electrodes is a fundamental step in the analysis of invasive electroencephalography (EEG) recordings in research and clinical practice. The conclusions reached from the analysis of these recordings rely on the accuracy of electrode localization in relationship to brain anatomy. However, currently available techniques for localizing electrodes from magnetic resonance (MR) and/or computerized tomography (CT) images are time consuming and/or limited to particular electrode types or shapes. Here we present iElectrodes, an open-source toolbox that provides robust and accurate semi-automatic localization of both subdural grids and depth electrodes. Using pre- and post-implantation images, the method takes 2-3 min to localize the coordinates in each electrode array and automatically number the electrodes. The proposed pre-processing pipeline allows one to work in a normalized space and to automatically obtain anatomical labels of the localized electrodes without neuroimaging experts. We validated the method with data from 22 patients implanted with a total of 1,242 electrodes. We show that localization distances were within 0.56 mm of those achieved by experienced manual evaluators. iElectrodes provided additional advantages in terms of robustness (even with severe perioperative cerebral distortions), speed (less than half the operator time compared to expert manual localization), simplicity, utility across multiple electrode types (surface and depth electrodes) and all brain regions.

  13. iElectrodes: A Comprehensive Open-Source Toolbox for Depth and Subdural Grid Electrode Localization

    PubMed Central

    Blenkmann, Alejandro O.; Phillips, Holly N.; Princich, Juan P.; Rowe, James B.; Bekinschtein, Tristan A.; Muravchik, Carlos H.; Kochen, Silvia

    2017-01-01

    The localization of intracranial electrodes is a fundamental step in the analysis of invasive electroencephalography (EEG) recordings in research and clinical practice. The conclusions reached from the analysis of these recordings rely on the accuracy of electrode localization in relationship to brain anatomy. However, currently available techniques for localizing electrodes from magnetic resonance (MR) and/or computerized tomography (CT) images are time consuming and/or limited to particular electrode types or shapes. Here we present iElectrodes, an open-source toolbox that provides robust and accurate semi-automatic localization of both subdural grids and depth electrodes. Using pre- and post-implantation images, the method takes 2–3 min to localize the coordinates in each electrode array and automatically number the electrodes. The proposed pre-processing pipeline allows one to work in a normalized space and to automatically obtain anatomical labels of the localized electrodes without neuroimaging experts. We validated the method with data from 22 patients implanted with a total of 1,242 electrodes. We show that localization distances were within 0.56 mm of those achieved by experienced manual evaluators. iElectrodes provided additional advantages in terms of robustness (even with severe perioperative cerebral distortions), speed (less than half the operator time compared to expert manual localization), simplicity, utility across multiple electrode types (surface and depth electrodes) and all brain regions. PMID:28303098

  14. Robust stability for stochastic bidirectional associative memory neural networks with time delays

    NASA Astrophysics Data System (ADS)

    Shu, H. S.; Lv, Z. W.; Wei, G. L.

    2008-02-01

    In this paper, the asymptotic stability is considered for a class of uncertain stochastic bidirectional associative memory neural networks with time delays and parameter uncertainties. The delays are time-invariant, and the norm-bounded uncertainties enter all network parameters. The aim of this paper is to establish easily verifiable conditions under which the delayed neural network is robustly asymptotically stable in the mean square for all admissible parameter uncertainties. By employing a Lyapunov-Krasovskii functional and conducting stochastic analysis, a linear matrix inequality (LMI) approach is developed to derive the stability criteria. The proposed criteria can be easily checked with the MATLAB LMI toolbox. A numerical example is given to demonstrate the usefulness of the proposed criteria.

  15. Verification of ARMA identification for modelling temporal correlation of GPS observations using the toolbox ARMASA

    NASA Astrophysics Data System (ADS)

    Luo, Xiaoguang; Mayer, Michael; Heck, Bernhard

    2010-05-01

    One essential deficiency of the stochastic model used in many GNSS (Global Navigation Satellite Systems) software products consists in neglecting temporal correlation of GNSS observations. By analysing appropriately detrended time series of observation residuals resulting from GPS (Global Positioning System) data processing, the temporal correlation behaviour of GPS observations can be sufficiently described by means of so-called autoregressive moving average (ARMA) processes. Using the toolbox ARMASA, which is available free of charge on MATLAB® Central (the open exchange platform for the MATLAB® and SIMULINK® user community), a well-fitting time series model can be identified automatically in three steps. Firstly, AR, MA, and ARMA models are computed up to some user-specified maximum order. Subsequently, for each model type, the best-fitting model is selected using the combined information criterion (for AR processes) or the generalised information criterion (for MA and ARMA processes). The final model identification among the best-fitting AR, MA, and ARMA models is performed based on the minimum prediction error, which characterises the discrepancies between the given data and the fitted model. The ARMA coefficients are computed using Burg's maximum entropy algorithm (for AR processes) and Durbin's first (for MA processes) and second (for ARMA processes) methods. This paper verifies the performance of the automated ARMA identification using the toolbox ARMASA. For this purpose, a representative data base is generated by means of ARMA simulation with respect to sample size, correlation level, and model complexity. The model error, defined as a transform of the prediction error, is used as the measure of deviation between the true and the estimated model. The results of the study show that the recognition rates of the underlying true processes increase with increasing sample size and decrease with rising model complexity. For large sample sizes, the true underlying processes can be correctly recognised for nearly 80% of the analysed data sets. Additionally, the model errors of first-order AR or MA processes converge clearly more rapidly to the corresponding asymptotic values than those of high-order ARMA processes.
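The order-selection step of such automated identification can be sketched with a deliberately simplified stand-in: least-squares AR fits ranked by BIC, rather than ARMASA's Burg estimation and combined/generalised information criteria. The simulated process and its coefficients are invented, not GPS residuals.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a stationary AR(2) process (hypothetical coefficients).
a_true = np.array([0.75, -0.5])
n = 2000
x = np.zeros(n)
e = rng.standard_normal(n)
for t in range(2, n):
    x[t] = a_true @ x[t - 2:t][::-1] + e[t]

def bic_order(x, pmax=6):
    # Fit AR(p) by least squares for p = 1..pmax and pick the order
    # minimizing BIC (a consistent criterion, unlike plain AIC).
    n = len(x)
    bics = []
    for p in range(1, pmax + 1):
        X = np.column_stack([x[p - j:n - j] for j in range(1, p + 1)])
        y = x[p:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        var = np.mean((y - X @ coef) ** 2)
        bics.append(len(y) * np.log(var) + p * np.log(len(y)))
    return int(np.argmin(bics)) + 1

best_p = bic_order(x)
print(best_p)
```

With 2000 samples the criterion recovers the true order 2, mirroring the paper's finding that recognition rates improve with sample size.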

  16. Development of a model using the MATLAB System identification toolbox to estimate (222)Rn equilibrium factor from CR-39 based passive measurements.

    PubMed

    Abo-Elmagd, M; Sadek, A M

    2014-12-01

    The Can-and-Bare method is a widely used passive method for measuring the equilibrium factor F through the determination of the track density ratio between bare (D) and filtered (Do) detectors. The dimensions of the diffusion chamber used alter the deposition ratios of Po isotopes on the chamber walls as well as the ratios of the alpha emitters present in air. The measured filtered track density, and therefore the resultant equilibrium factor, thus changes with the diffusion chamber dimensions. For this reason, high uncertainty is expected in F measured using different diffusion chambers. In the present work, F is derived as a function of both the track density ratio (D/Do) and the dimensions of the diffusion chamber used (its volume relative to the total internal surface area, V/A). The accuracy of the derived formula was verified using the black-box modeling technique via the MATLAB System Identification Toolbox. The results show that the uncertainty of F calculated with the derived formula F(D/Do, V/A) is only 5%. This uncertainty confirms the suitability of the derived function for calculating F using diffusion chambers with a wide range of dimensions. Copyright © 2014 Elsevier Ltd. All rights reserved.
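The black-box fitting of F(D/Do, V/A) can be sketched with an ordinary least-squares surrogate. Both the calibration data and the linear-in-parameters form below are assumptions for illustration, not the paper's identified model or the MATLAB System Identification Toolbox machinery:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic calibration set (hypothetical): an assumed "true" mapping from
# track-density ratio R = D/Do and chamber geometry G = V/A to the
# equilibrium factor F, plus measurement noise.
R = rng.uniform(1.2, 4.0, 300)
G = rng.uniform(0.5, 3.0, 300)
F_true = 0.1 + 0.25 * (R - 1.0) - 0.05 * G
F_meas = F_true + rng.normal(0.0, 0.01, 300)

# Black-box model F ~ c0 + c1*R + c2*G, fit by least squares.
X = np.column_stack([np.ones_like(R), R, G])
coef, *_ = np.linalg.lstsq(X, F_meas, rcond=None)
resid = F_meas - X @ coef
rel_err = np.std(resid) / np.mean(F_meas)
print(coef, rel_err)
```

The fit recovers the generating coefficients, and the relative residual scatter stays below the 5% level the abstract quotes for its own derived formula.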

  17. Robust synthetic biology design: stochastic game theory approach.

    PubMed

    Chen, Bor-Sen; Chang, Chia-Hung; Lee, Hsiao-Ching

    2009-07-15

    Synthetic biology engineers artificial biological systems to investigate natural biological phenomena and for a variety of applications. However, the development of synthetic gene networks is still difficult, and most newly created gene networks are non-functional due to uncertain initial conditions and disturbances from the extracellular environment of the host cell. At present, how to design a robust synthetic gene network that works properly under these uncertain factors is the most important topic in synthetic biology. A robust regulation design is proposed for a stochastic synthetic gene network to achieve the prescribed steady states under these uncertain factors from the minimax regulation perspective. This minimax regulation design problem can be transformed into an equivalent stochastic game problem. Since it is not easy to solve the robust regulation design problem of synthetic gene networks by nonlinear stochastic game methods directly, the Takagi-Sugeno (T-S) fuzzy model is proposed to approximate the nonlinear synthetic gene network via the linear matrix inequality (LMI) technique through the Robust Control Toolbox in MATLAB. Finally, an in silico example is given to illustrate the design procedure and to confirm the efficiency and efficacy of the proposed robust gene design method. http://www.ee.nthu.edu.tw/bschen/SyntheticBioDesign_supplement.pdf.

  18. PetIGA-MF: A multi-field high-performance toolbox for structure-preserving B-splines spaces

    DOE PAGES

    Sarmiento, Adel; Cortes, Adriano; Garcia, Daniel; ...

    2016-10-07

    We describe the development of a high-performance solution framework for isogeometric discrete differential forms based on B-splines: PetIGA-MF. Built on top of PetIGA, PetIGA-MF is a general multi-field discretization tool. To test the capabilities of our implementation, we solve different viscous flow problems such as the Darcy, Stokes, Brinkman, and Navier-Stokes equations. Several convergence benchmarks based on manufactured solutions are presented, confirming optimal convergence rates of the approximations and demonstrating the accuracy and robustness of our solver.

  19. Pareto Design of State Feedback Tracking Control of a Biped Robot via Multiobjective PSO in Comparison with Sigma Method and Genetic Algorithms: Modified NSGAII and MATLAB's Toolbox

    PubMed Central

    Mahmoodabadi, M. J.; Taherkhorsandi, M.; Bagheri, A.

    2014-01-01

    An optimal robust state feedback tracking controller is introduced to control a biped robot. In the literature, the parameters of such controllers are usually determined by a tedious trial-and-error process. To eliminate this process and design the parameters of the proposed controller, multiobjective evolutionary algorithms, namely the proposed method, modified NSGAII, the Sigma method, and MATLAB's Toolbox MOGA, are employed in this study. Among the evolutionary optimization algorithms used to design the controller for biped robots, the proposed method performs best, since it provides ample opportunities for designers to choose the most appropriate point based upon the design criteria. Three points are chosen from the nondominated solutions of the obtained Pareto front based on two conflicting objective functions, that is, the normalized summation of angle errors and the normalized summation of control effort. The obtained results demonstrate the efficiency of the proposed controller in controlling a biped robot. PMID:24616619
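Nondominated (Pareto) filtering, the step common to all the multiobjective algorithms compared here, keeps exactly those candidate designs that no other candidate beats on every objective. A minimal sketch with invented objective values, both minimized (as with the angle-error and control-effort criteria):

```python
import numpy as np

def pareto_front(points):
    # Return indices of nondominated points when all objectives are
    # minimized: a point is dominated if some other point is no worse
    # on every objective and strictly better on at least one.
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        dominated = any(np.all(q <= p) and np.any(q < p)
                        for j, q in enumerate(pts) if j != i)
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical (angle error, control effort) pairs for five candidate
# controller parameterizations.
pts = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 2.5)]
front = pareto_front(pts)
print(front)
```

Here the point (3.0, 4.0) is dominated by (2.0, 3.0) and drops out; the designer then picks among the surviving trade-off points, as the abstract describes.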

  20. DAFNE: A Matlab toolbox for Bayesian multi-source remote sensing and ancillary data fusion, with application to flood mapping

    NASA Astrophysics Data System (ADS)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco P.; Pasquariello, Guido

    2018-03-01

    High-resolution, remotely sensed images of the Earth surface have been proven to be of help in producing detailed flood maps, thanks to their synoptic overview of the flooded area and frequent revisits. However, flood scenarios can be complex situations, requiring the integration of different data in order to provide accurate and robust flood information. Several processing approaches have been recently proposed to efficiently combine and integrate heterogeneous information sources. In this paper, we introduce DAFNE, a Matlab®-based, open source toolbox, conceived to produce flood maps from remotely sensed and other ancillary information, through a data fusion approach. DAFNE is based on Bayesian Networks, and is composed of several independent modules, each one performing a different task. Multi-temporal and multi-sensor data can be easily handled, with the possibility of following the evolution of an event through multi-temporal output flood maps. Each DAFNE module can be easily modified or upgraded to meet different user needs. The DAFNE suite is presented together with an example of its application.
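For a single pixel and conditionally independent sources, the Bayesian-network fusion at DAFNE's core reduces to repeated Bayes updates of a flood probability. A minimal sketch with illustrative prior and likelihood values (assumptions, not DAFNE's actual network parameters):

```python
# Minimal Bayesian evidence fusion of two sources (illustrative numbers).
prior = 0.10                      # prior probability a pixel is flooded

def update(p, likelihood_flood, likelihood_dry):
    # Bayes' rule for one observation, assuming conditional independence
    # between sources given the flood state.
    num = likelihood_flood * p
    return num / (num + likelihood_dry * (1.0 - p))

p = update(prior, 0.9, 0.2)       # dark SAR backscatter: favors "flooded"
p = update(p, 0.8, 0.3)           # low elevation (ancillary): also favors it
print(round(p, 3))
```

Two individually weak cues combine into a posterior well above the prior, the mechanism by which multi-source fusion yields more robust flood maps than any single sensor.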

  1. A MATLAB toolbox for the efficient estimation of the psychometric function using the updated maximum-likelihood adaptive procedure.

    PubMed

    Shen, Yi; Dai, Wei; Richards, Virginia M

    2015-03-01

    A MATLAB toolbox for the efficient estimation of the threshold, slope, and lapse rate of the psychometric function is described. The toolbox enables the efficient implementation of the updated maximum-likelihood (UML) procedure. The toolbox uses an object-oriented architecture for organizing the experimental variables and computational algorithms, which provides experimenters with flexibility in experimental design and data management. Descriptions of the UML procedure and the UML Toolbox are provided, followed by toolbox use examples. Finally, guidelines and recommendations of parameter configurations are given.
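The psychometric-function model the UML procedure estimates, and a maximum-likelihood fit of its threshold and slope, can be sketched as follows. A brute-force grid search over simulated trials replaces the UML's trial-by-trial adaptive updating, and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Logistic psychometric function with guess rate (gamma, 0.5 for 2AFC)
# and lapse rate (lam); alpha is the threshold, beta the slope.
def pf(x, alpha, beta, gamma=0.5, lam=0.02):
    return gamma + (1.0 - gamma - lam) / (1.0 + np.exp(-beta * (x - alpha)))

# Simulate a 2AFC observer with threshold 0 and slope 2 (assumed values).
levels = rng.uniform(-3, 3, 400)
resp = rng.random(400) < pf(levels, 0.0, 2.0)

# Maximum-likelihood estimate of (alpha, beta) on a coarse grid; the UML
# procedure maintains such a likelihood and updates it after every trial.
alphas = np.linspace(-2, 2, 81)
betas = np.linspace(0.5, 4, 36)
best, best_ll = None, -np.inf
for a in alphas:
    for b in betas:
        prob = np.clip(pf(levels, a, b), 1e-9, 1 - 1e-9)
        ll = np.sum(np.where(resp, np.log(prob), np.log(1 - prob)))
        if ll > best_ll:
            best, best_ll = (a, b), ll
print(best)
```

With 400 trials the grid-search estimate lands close to the generating threshold; the UML procedure reaches comparable precision far more efficiently by placing each trial where it is most informative.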

  2. Identification of the origin of faecal contamination in estuarine oysters using Bacteroidales and F-specific RNA bacteriophage markers.

    PubMed

    Mieszkin, S; Caprais, M P; Le Mennec, C; Le Goff, M; Edge, T A; Gourmelon, M

    2013-09-01

    The aim of this study was to identify the origin of faecal pollution impacting the Elorn estuary (Brittany, France) by applying microbial source tracking (MST) markers in both oysters and estuarine waters. The MST markers used were as follows: (i) human-, ruminant- and pig-associated Bacteroidales markers by real-time PCR and (ii) human genogroup II and animal genogroup I of F-specific RNA bacteriophages (FRNAPH) by culture/genotyping and by direct real-time reverse-transcriptase PCR. The higher occurrence of the human genogroup II of F-specific RNA bacteriophages using a culture/genotyping method, and human-associated Bacteroidales marker by real-time PCR, allowed the identification of human faecal contamination as the predominant source of contamination in oysters (total of 18 oyster batches tested) and waters (total of 24 water samples tested). The importance of using the intravalvular liquids instead of digestive tissues, when applying host-associated Bacteroidales markers in oysters, was also revealed. This study has shown that the application of a MST toolbox of diverse bacterial and viral methods can provide multiple lines of evidence to identify the predominant source of faecal contamination in shellfish from an estuarine environment. Application of this MST toolbox is a useful approach to understand the origin of faecal contamination in shellfish harvesting areas in an estuarine setting. © 2013 The Society for Applied Microbiology.

  3. Four simple rules that are sufficient to generate the mammalian blastocyst

    PubMed Central

    Nissen, Silas Boye; Perera, Marta; Gonzalez, Javier Martin; Morgani, Sophie M.; Jensen, Mogens H.; Sneppen, Kim; Brickman, Joshua M.

    2017-01-01

    Early mammalian development is both highly regulative and self-organizing. It involves the interplay of cell position, predetermined gene regulatory networks, and environmental interactions to generate the physical arrangement of the blastocyst with precise timing. However, this process occurs in the absence of maternal information and in the presence of transcriptional stochasticity. How does the preimplantation embryo ensure robust, reproducible development in this context? It utilizes a versatile toolbox that includes complex intracellular networks coupled to cell-cell communication, segregation by differential adhesion, and apoptosis. Here, we ask whether a minimal set of developmental rules based on this toolbox is sufficient for successful blastocyst development, and to what extent these rules can explain mutant and experimental phenotypes. We implemented experimentally reported mechanisms for polarity, cell-cell signaling, adhesion, and apoptosis as a set of developmental rules in an agent-based in silico model of physically interacting cells. We find that this model quantitatively reproduces specific mutant phenotypes and provides an explanation for the emergence of heterogeneity without requiring any initial transcriptional variation. It also suggests that a fixed time point for the cells' competence of fibroblast growth factor (FGF)/extracellular signal-regulated kinase (ERK) sets an embryonic clock that enables certain scaling phenomena, a concept that we evaluate quantitatively by manipulating embryos in vitro. Based on these observations, we conclude that the minimal set of rules enables the embryo to experiment with stochastic gene expression and could provide the robustness necessary for the evolutionary diversification of the preimplantation gene regulatory network. PMID:28700688

  4. A Pathway to Freedom: An Evaluation of Screening Tools for the Identification of Trafficking Victims.

    PubMed

    Bespalova, Nadejda; Morgan, Juliet; Coverdale, John

    2016-02-01

    Because training residents and faculty to identify human trafficking victims is a major public health priority, the authors review existing assessment tools. PubMed and Google were searched using combinations of search terms including human, trafficking, sex, labor, screening, identification, and tool. Nine screening tools that met the inclusion criteria were found. They varied greatly in length, format, target demographic, supporting resources, and other parameters. Only two tools were designed specifically for healthcare providers. Only one tool was formally assessed to be valid and reliable in a pilot project in trafficking victim service organizations, although it has not been validated in the healthcare setting. This toolbox should facilitate the education of resident physicians and faculty in screening for trafficking victims, assist educators in assessing screening skills, and promote future research on the identification of trafficking victims.

  5. The Biopsychology-Toolbox: a free, open-source Matlab-toolbox for the control of behavioral experiments.

    PubMed

    Rose, Jonas; Otto, Tobias; Dittrich, Lars

    2008-10-30

    The Biopsychology-Toolbox is a free, open-source Matlab-toolbox for the control of behavioral experiments. The major aim of the project was to provide a set of basic tools that allow programming novices to control basic hardware used for behavioral experimentation without limiting the power and flexibility of the underlying programming language. The modular design of the toolbox allows portation of parts as well as entire paradigms between different types of hardware. In addition to the toolbox, this project offers a platform for the exchange of functions, hardware solutions and complete behavioral paradigms.

  6. A MATLAB toolbox for the efficient estimation of the psychometric function using the updated maximum-likelihood adaptive procedure

    PubMed Central

    Richards, V. M.; Dai, W.

    2014-01-01

    A MATLAB toolbox for the efficient estimation of the threshold, slope, and lapse rate of the psychometric function is described. The toolbox enables the efficient implementation of the updated maximum-likelihood (UML) procedure. The toolbox uses an object-oriented architecture for organizing the experimental variables and computational algorithms, which provides experimenters with flexibility in experimental design and data management. Descriptions of the UML procedure and the UML Toolbox are provided, followed by toolbox use examples. Finally, guidelines and recommendations of parameter configurations are given. PMID:24671826
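    The core idea of an updated maximum-likelihood adaptive procedure can be sketched in a few lines: maintain a likelihood over a grid of candidate thresholds and place each new trial at the current best estimate. The sketch below is not the UML Toolbox's API; the psychometric function, parameter values, and simulated observer are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def psychometric(x, alpha, beta=2.0, gamma=0.5, lam=0.02):
        # logistic psychometric function with guess rate gamma and lapse rate lam
        return gamma + (1.0 - gamma - lam) / (1.0 + np.exp(-beta * (x - alpha)))

    true_alpha = 1.0                     # simulated observer's true threshold
    alphas = np.linspace(-2, 4, 121)     # candidate threshold grid
    log_lik = np.zeros_like(alphas)      # flat prior over the grid

    x = 0.0                              # first stimulus level
    for _ in range(200):
        r = rng.random() < psychometric(x, true_alpha)   # simulated response
        p = psychometric(x, alphas)
        log_lik += np.log(p if r else 1.0 - p)           # update the grid
        x = alphas[np.argmax(log_lik)]   # next trial at the ML threshold

    estimate = alphas[np.argmax(log_lik)]
    ```

    A full UML implementation also tracks slope and lapse rate and chooses stimulus placement ("sweet points") to maximize information about all parameters; the grid update above shows only the threshold dimension.
    
    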

  7. Identifying unknown by-products in drinking water using comprehensive two-dimensional gas chromatography-quadrupole mass spectrometry and in silico toxicity assessment.

    PubMed

    Li, Chunmei; Wang, Donghong; Li, Na; Luo, Qian; Xu, Xiong; Wang, Zijian

    2016-11-01

    Improvements in extraction and detection technologies have increased our abilities to identify new disinfection by-products (DBPs) over the last 40 years. However, most previous studies combined DBP identification and measurement efforts with toxicology to address concerns on a few expected DBPs, making it difficult to better define the health risk from the individual DBPs. In this study, a nontargeted screening method involving comprehensive two-dimensional gas chromatography-quadrupole mass spectrometry (GC × GC-qMS) combined with OECD QSAR Toolbox Ver. 3.2 was developed for identifying and prioritizing volatile and semi-volatile DBPs in drinking water. The method was successfully applied to analyze DBPs formed during chlorination, chloramination or ozonation of the raw water. Over 500 compounds were tentatively identified in each sample, showing the superior performance of this analytical technique. A total of 170 volatile and semi-volatile DBPs representing fourteen chemical classes were then identified, according to the criterion that the DBP was present in duplicate treated samples. The genotoxicity and carcinogenicity of the DBPs were evaluated using Toolbox, and 58 DBPs were found to be actual or potential genotoxicants. The accuracy of the compound identification was determined by comparing 47 identified compounds with commercially available standards. About 90% (41 of the 47) of the compounds that were automatically identified using the library were correct. The results show that GC×GC-qMS coupled with a quantitative structure-activity relationship model is a powerful and fast nontargeted screening technique for compounds. The method and results suggest a new approach to the identification and prioritization of DBPs.

  8. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    PubMed

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
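    A minimal version of the Bayesian model comparison the authors advocate can be sketched as a Bayes factor between two candidate strategies, computed by integrating a Bernoulli likelihood over a strategy-application error rate. The data, strategy names, and uniform prior below are hypothetical, not from the paper.

    ```python
    import numpy as np

    # hypothetical data: of n = 20 choices, how many agree with each strategy's prediction
    n = 20
    agree = {"take-the-best": 18, "weighted-additive": 12}

    # marginal likelihood of a strategy: integrate the Bernoulli likelihood over a
    # uniform prior on the application-error rate eps in (0, 0.5]
    eps = np.linspace(1e-4, 0.5, 2000)

    def marginal_likelihood(k):
        lik = (1.0 - eps) ** k * eps ** (n - k)
        return lik.mean()     # grid average approximates the integral under a uniform prior

    bf = marginal_likelihood(agree["take-the-best"]) / marginal_likelihood(agree["weighted-additive"])
    # bf > 1 favors take-the-best for this (hypothetical) participant
    ```

    The hierarchical extension discussed in the abstract would place a group-level prior over which strategy each individual uses, rather than scoring a single participant in isolation.
    
    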

  9. Fluorescence lifetime assays: current advances and applications in drug discovery.

    PubMed

    Pritz, Stephan; Doering, Klaus; Woelcke, Julian; Hassiepen, Ulrich

    2011-06-01

    Fluorescence lifetime assays complement the portfolio of established assay formats available in drug discovery, particularly with the recent advances in microplate readers and the commercial availability of novel fluorescent labels. Fluorescence lifetime assists in lowering complexity of compound screening assays, affording a modular, toolbox-like approach to assay development and yielding robust homogeneous assays. To date, materials and procedures have been reported for biochemical assays on proteases, as well as on protein kinases and phosphatases. This article gives an overview of two assay families, distinguished by the origin of the fluorescence signal modulation. The pharmaceutical industry demands techniques with a robust, integrated compound profiling process and short turnaround times. Fluorescence lifetime assays have already helped the drug discovery field, in this sense, by enhancing productivity during the hit-to-lead and lead optimization phases. Future work will focus on covering other biochemical molecular modifications by investigating the detailed photo-physical mechanisms underlying the fluorescence signal.

  10. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelt, Daniël M.; Gürsoy, Dogˇa; Palenstijn, Willem Jan

    2016-04-28

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy's standard reconstruction method.

  11. RenderToolbox3: MATLAB tools that facilitate physically based stimulus rendering for vision research.

    PubMed

    Heasly, Benjamin S; Cottaris, Nicolas P; Lichtman, Daniel P; Xiao, Bei; Brainard, David H

    2014-02-07

    RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.

  12. Integration of TomoPy and the ASTRA toolbox for advanced processing and reconstruction of tomographic synchrotron data

    PubMed Central

    Pelt, Daniël M.; Gürsoy, Doǧa; Palenstijn, Willem Jan; Sijbers, Jan; De Carlo, Francesco; Batenburg, Kees Joost

    2016-01-01

    The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy’s standard reconstruction method. PMID:27140167
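    The reconstruction task these toolboxes accelerate can be illustrated from scratch. The sketch below is not the TomoPy or ASTRA API; it is a minimal unfiltered backprojection on a toy phantom, showing why more advanced (filtered or iterative) methods, such as those the ASTRA toolbox provides on the GPU, are needed for quality.

    ```python
    import numpy as np
    from scipy.ndimage import rotate

    # toy phantom: a bright square in a 64x64 field
    phantom = np.zeros((64, 64))
    phantom[24:40, 24:40] = 1.0

    # forward projection: rotate and sum columns to build a sinogram
    angles = np.arange(0, 180, 10)
    sinogram = np.array([rotate(phantom, a, reshape=False, order=1).sum(axis=0)
                         for a in angles])

    # unfiltered backprojection: smear each projection across the image, rotate back
    recon = np.zeros_like(phantom)
    for a, proj in zip(angles, sinogram):
        recon += rotate(np.tile(proj, (64, 1)), -a, reshape=False, order=1)
    recon /= len(angles)

    # without a ramp filter the result is blurred but still correlates with the phantom
    r = np.corrcoef(phantom.ravel(), recon.ravel())[0, 1]
    ```

    Filtered backprojection applies a ramp filter to each projection before the smearing step, which removes the characteristic 1/r blur visible in this naive reconstruction.
    
    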

  13. Arc_Mat: a Matlab-based spatial data analysis toolbox

    NASA Astrophysics Data System (ADS)

    Liu, Xingjian; Lesage, James

    2010-03-01

    This article presents an overview of Arc_Mat, a Matlab-based spatial data analysis software package whose source code has been placed in the public domain. An earlier version of the Arc_Mat toolbox was developed to extract map polygon and database information from ESRI shapefiles and provide high quality mapping in the Matlab software environment. We discuss revisions to the toolbox that: utilize enhanced computing and graphing capabilities of more recent versions of Matlab, restructure the toolbox with object-oriented programming features, and provide more comprehensive functions for spatial data analysis. The Arc_Mat toolbox functionality includes basic choropleth mapping; exploratory spatial data analysis that provides exploratory views of spatial data through various graphs, for example, histogram, Moran scatterplot, three-dimensional scatterplot, density distribution plot, and parallel coordinate plots; and more formal spatial data modeling that draws on the extensive Spatial Econometrics Toolbox functions. A brief review of the design aspects of the revised Arc_Mat is described, and we provide some illustrative examples that highlight representative uses of the toolbox. Finally, we discuss programming with and customizing the Arc_Mat toolbox functionalities.

  14. The CatchMod toolbox: easy and guided access to ICT tools for Water Framework Directive implementation.

    PubMed

    van Griensven, A; Vanrolleghem, P A

    2006-01-01

    Web-based toolboxes are handy tools to inform experienced users of existing software in their disciplines. However, for the implementation of the Water Framework Directive, a much more diverse public (water managers, consultancy firms, scientists, etc.) will ask for a very wide diversity of Information and Communication Technology (ICT) tools. It is obvious that the users of a web-based ICT-toolbox providing all this will not be experts in all of the disciplines and that a toolbox for ICT tools for Water Framework Directive implementation should thus go beyond just making interesting web-links. To deal with this issue, expert knowledge is brought to the users through the incorporation of visitor-geared guidance (materials) in the Harmoni-CA toolbox. Small workshops of expert teams were organized to deliver documents explaining why the tools are important, when they are required and what activity they support/perform, as well as a categorization of the multitude of available tools. An integration of this information in the web-based toolbox helps the users to browse through a toolbox containing tools, reports, guidance documents and interesting links. The Harmoni-CA toolbox thus provides not only a virtual toolbox, but incorporates a virtual expert as well.

  15. Identification and robust control of an experimental servo motor.

    PubMed

    Adam, E J; Guestrin, E D

    2002-04-01

    In this work, the design of a robust controller for an experimental laboratory-scale position control system based on a dc motor drive as well as the corresponding identification and robust stability analysis are presented. In order to carry out the robust design procedure, first, a classic closed-loop identification technique is applied and then, the parametrization by internal model control is used. The model uncertainty is evaluated under both parametric and global representation. For the latter case, an interesting discussion about the conservativeness of this description is presented by means of a comparison between the uncertainty disk and the critical perturbation radius approaches. Finally, conclusions about the performance of the experimental system with the robust controller are discussed using comparative graphics of the controlled variable and the Nyquist stability margin as a robustness measurement.
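    The identification step can be illustrated with a least-squares fit of a discrete first-order motor model. This is a hedged sketch: the paper uses a closed-loop identification technique, whereas the example below is a simple open-loop ARX regression on simulated data with assumed parameter values.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # simulated first-order DC-motor model: y[k] = a*y[k-1] + b*u[k-1] + noise
    a_true, b_true, N = 0.9, 0.5, 500
    u = rng.standard_normal(N)
    y = np.zeros(N)
    for k in range(1, N):
        y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.standard_normal()

    # least-squares ARX identification: regress y[k] on (y[k-1], u[k-1])
    Phi = np.column_stack([y[:-1], u[:-1]])
    theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
    a_hat, b_hat = theta
    ```

    The residuals of such a fit are what feed the uncertainty description (parametric or disk-shaped) that the robust stability analysis then has to cover.
    
    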

  16. GridPV Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broderick, Robert; Quiroz, Jimmy; Grijalva, Santiago

    2014-07-15

    Matlab Toolbox for simulating the impact of solar energy on the distribution grid. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving GridPV Toolbox information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulations functions are included to show potential uses of the toolbox functions.

  17. Signal processing and neural network toolbox and its application to failure diagnosis and prognosis

    NASA Astrophysics Data System (ADS)

    Tu, Fang; Wen, Fang; Willett, Peter K.; Pattipati, Krishna R.; Jordan, Eric H.

    2001-07-01

    Many systems are comprised of components equipped with self-testing capability; however, if the system is complex involving feedback and the self-testing itself may occasionally be faulty, tracing faults to a single or multiple causes is difficult. Moreover, many sensors are incapable of reliable decision-making on their own. In such cases, a signal processing front-end that can match inference needs will be very helpful. This work provides an object-oriented simulation environment for signal processing and neural network-based fault diagnosis and prognosis. In the toolbox, we implemented a wide range of spectral and statistical manipulation methods such as filters, harmonic analyzers, transient detectors, and multi-resolution decomposition to extract features for failure events from data collected by sensors. We then evaluated multiple learning paradigms for general classification, diagnosis and prognosis. The network models evaluated include Restricted Coulomb Energy (RCE) Neural Network, Learning Vector Quantization (LVQ), Decision Trees (C4.5), Fuzzy Adaptive Resonance Theory (FuzzyArtmap), Linear Discriminant Rule (LDR), Quadratic Discriminant Rule (QDR), Radial Basis Functions (RBF), Multiple Layer Perceptrons (MLP) and Single Layer Perceptrons (SLP). Validation techniques, such as N-fold cross-validation and bootstrap techniques, are employed for evaluating the robustness of network models. The trained networks are evaluated for their performance using test data on the basis of percent error rates obtained via cross-validation, time efficiency, and generalization ability to unseen faults. Finally, the usage of neural networks for the prediction of residual life of turbine blades with thermal barrier coatings is described and the results are shown. The neural network toolbox has also been applied to fault diagnosis in mixed-signal circuits.
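    The N-fold cross-validation used to evaluate these classifiers can be sketched with a toy nearest-mean classifier on synthetic "fault signature" data. The data, classifier, and fold count are illustrative assumptions, not the toolbox's implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # synthetic two-class fault-signature data, 50 samples per class in 4 features
    X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(2, 1, (50, 4))])
    y = np.array([0] * 50 + [1] * 50)

    def nearest_mean_error(Xtr, ytr, Xte, yte):
        # classify each test point by its nearest class-mean, return percent error
        m0, m1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
        pred = (np.linalg.norm(Xte - m1, axis=1)
                < np.linalg.norm(Xte - m0, axis=1)).astype(int)
        return (pred != yte).mean()

    # 5-fold cross-validation of the error rate
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, 5)
    errs = []
    for i in range(5):
        te = folds[i]
        tr = np.concatenate([folds[j] for j in range(5) if j != i])
        errs.append(nearest_mean_error(X[tr], y[tr], X[te], y[te]))
    cv_err = float(np.mean(errs))
    ```

    Bootstrap validation follows the same pattern but resamples the training set with replacement instead of partitioning it into disjoint folds.
    
    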

  18. An adaptive toolbox approach to the route to expertise in sport.

    PubMed

    de Oliveira, Rita F; Lobinger, Babett H; Raab, Markus

    2014-01-01

    Expertise is characterized by fast decision-making which is highly adaptive to new situations. Here we propose that athletes use a toolbox of heuristics which they develop on their route to expertise. The development of heuristics occurs within the context of the athletes' natural abilities, past experiences, developed skills, and situational context, but does not pertain to any of these factors separately. The novelty of this approach lies in integrating the separate factors that determine expertise into a comprehensive heuristic description. It is our contention that talent identification methods and talent development models should therefore be geared toward the assessment and development of specific heuristics. Specifically, in addition to identifying and developing separate natural abilities and skills as per usual, heuristics should be identified and developed. The application of heuristics to talent and expertise models can bring the field one step away from dichotomized models of nature and nurture toward a comprehensive approach to the route to expertise.

  19. ALEA: a toolbox for allele-specific epigenomics analysis.

    PubMed

    Younesy, Hamid; Möller, Torsten; Heravi-Moussavi, Alireza; Cheng, Jeffrey B; Costello, Joseph F; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2014-04-15

    The assessment of expression and epigenomic status using sequencing based methods provides an unprecedented opportunity to identify and correlate allelic differences with epigenomic status. We present ALEA, a computational toolbox for allele-specific epigenomics analysis, which incorporates allelic variation data within existing resources, allowing for the identification of significant associations between epigenetic modifications and specific allelic variants in human and mouse cells. ALEA provides a customizable pipeline of command line tools for allele-specific analysis of next-generation sequencing data (ChIP-seq, RNA-seq, etc.) that takes the raw sequencing data and produces separate allelic tracks ready to be viewed on genome browsers. The pipeline has been validated using human and hybrid mouse ChIP-seq and RNA-seq data. The package, test data and usage instructions are available online at http://www.bcgsc.ca/platform/bioinfo/software/alea. Contact: mkarimi1@interchange.ubc.ca or sjones@bcgsc.ca. Supplementary information: Supplementary data are available at Bioinformatics online.

  20. An adaptive toolbox approach to the route to expertise in sport

    PubMed Central

    de Oliveira, Rita F.; Lobinger, Babett H.; Raab, Markus

    2014-01-01

    Expertise is characterized by fast decision-making which is highly adaptive to new situations. Here we propose that athletes use a toolbox of heuristics which they develop on their route to expertise. The development of heuristics occurs within the context of the athletes’ natural abilities, past experiences, developed skills, and situational context, but does not pertain to any of these factors separately. The novelty of this approach lies in integrating the separate factors that determine expertise into a comprehensive heuristic description. It is our contention that talent identification methods and talent development models should therefore be geared toward the assessment and development of specific heuristics. Specifically, in addition to identifying and developing separate natural abilities and skills as per usual, heuristics should be identified and developed. The application of heuristics to talent and expertise models can bring the field one step away from dichotomized models of nature and nurture toward a comprehensive approach to the route to expertise. PMID:25071673

  1. MOEMS Modeling Using the Geometrical Matrix Toolbox

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2005-01-01

    New technologies such as MicroOptoElectro-Mechanical Systems (MOEMS) require new modeling tools. These tools must simultaneously model the optical, electrical, and mechanical domains and the interactions between these domains. To facilitate rapid prototyping of these new technologies an optical toolbox has been developed for modeling MOEMS devices. The toolbox models are constructed using MATLAB's dynamical simulator, Simulink. Modeling toolboxes will allow users to focus their efforts on system design and analysis as opposed to developing component models. This toolbox was developed to facilitate rapid modeling and design of a MOEMS based laser ultrasonic receiver system.

  2. C++ Tensor Toolbox user manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd D.; Kolda, Tamara Gibson

    2012-04-01

    The C++ Tensor Toolbox is a software package for computing tensor decompositions. It is based on the Matlab Tensor Toolbox, and is particularly optimized for sparse data sets. This user manual briefly overviews tensor decomposition mathematics, software capabilities, and installation of the package. Tensors (also known as multidimensional arrays or N-way arrays) are used in a variety of applications ranging from chemometrics to network analysis. The Tensor Toolbox provides classes for manipulating dense, sparse, and structured tensors in C++. The Toolbox compiles into libraries and is intended for use with custom applications written by users.
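    A core operation in tensor decomposition libraries of this kind is mode-n unfolding (matricization), which rearranges an N-way array into a matrix. The sketch below is a generic NumPy illustration of the concept, not the C++ Tensor Toolbox's interface.

    ```python
    import numpy as np

    def unfold(X, mode):
        # mode-n unfolding: bring the given mode to the front, flatten the rest
        return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

    X = np.arange(24).reshape(2, 3, 4)      # a small 2 x 3 x 4 tensor
    U0, U1, U2 = unfold(X, 0), unfold(X, 1), unfold(X, 2)
    ```

    Decomposition algorithms such as alternating least squares for CP repeatedly apply unfoldings like these, multiplying each one against Khatri-Rao products of the factor matrices.
    
    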

  3. An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave.

    PubMed

    Silva, Ikaro; Moody, George B

    The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox provides access to over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.

  4. Modeling stochasticity and robustness in gene regulatory networks.

    PubMed

    Garg, Abhishek; Mohanram, Kartik; Di Cara, Alessandro; De Micheli, Giovanni; Xenarios, Ioannis

    2009-06-15

    Understanding gene regulation in biological processes and modeling the robustness of underlying regulatory networks is an important problem that is currently being addressed by computational systems biologists. Lately, there has been a renewed interest in Boolean modeling techniques for gene regulatory networks (GRNs). However, due to their deterministic nature, it is often difficult to identify whether these modeling approaches are robust to the addition of stochastic noise that is widespread in gene regulatory processes. Stochasticity in Boolean models of GRNs has been addressed relatively sparingly in the past, mainly by flipping the expression of genes between different expression levels with a predefined probability. This stochasticity in nodes (SIN) model leads to over-representation of noise in GRNs and hence non-correspondence with biological observations. In this article, we introduce the stochasticity in functions (SIF) model for simulating stochasticity in Boolean models of GRNs. By providing biological motivation behind the use of the SIF model and applying it to the T-helper and T-cell activation networks, we show that the SIF model provides more biologically robust results than the existing SIN model of stochasticity in GRNs. Algorithms are made available under our Boolean modeling toolbox, GenYsis. The software binaries can be downloaded from http://si2.epfl.ch/~garg/genysis.html.
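    The SIN model the abstract describes can be sketched on a toy Boolean network: after each synchronous update, every node flips with a fixed probability. The three-gene network and flip probability below are illustrative assumptions, not the T-helper network from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # toy 3-gene Boolean network: C represses A, A activates B, B activates C
    def update(state):
        a, b, c = state
        return np.array([1 - c, a, b])

    def sin_step(state, p=0.05):
        # SIN model: after the synchronous update, each node flips with probability p
        nxt = update(state)
        flips = rng.random(state.size) < p
        return np.where(flips, 1 - nxt, nxt)

    state = np.array([1, 0, 0])
    trajectory = [state.copy()]
    for _ in range(100):
        state = sin_step(state)
        trajectory.append(state.copy())
    trajectory = np.array(trajectory)
    ```

    The SIF model instead perturbs the update functions themselves (e.g., occasionally evaluating an alternative rule), which injects noise more selectively than flipping every node's output with equal probability.
    
    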

  5. Speed management toolbox for rural communities.

    DOT National Transportation Integrated Search

    2013-04-01

    The primary objective of this toolbox is to summarize various known traffic-calming treatments and their effectiveness. This toolbox focuses on roadway-based treatments for speed management, particularly for rural communities with transition zones. E...

  6. Living matter—nexus of physics and biology in the 21st century

    PubMed Central

    Gardel, Margaret L.

    2012-01-01

    Cells are made up of complex assemblies of cytoskeletal proteins that facilitate force transmission from the molecular to cellular scale to regulate cell shape and force generation. The “living matter” formed by the cytoskeleton facilitates versatile and robust behaviors of cells, including their migration, adhesion, division, and morphology, that ultimately determine tissue architecture and mechanics. Elucidating the underlying physical principles of such living matter provides great opportunities in both biology and physics. For physicists, the cytoskeleton provides an exceptional toolbox to study materials far from equilibrium. For biologists, these studies will provide new understanding of how molecular-scale processes determine cell morphological changes. PMID:23112229

  7. Operational Research: Evaluating Multimodel Implementations for 24/7 Runtime Environments

    NASA Astrophysics Data System (ADS)

    Burkhart, J. F.; Helset, S.; Abdella, Y. S.; Lappegard, G.

    2016-12-01

    We present a new open source framework for operational hydrologic rainfall-runoff modeling. The Statkraft Hydrologic Forecasting Toolbox (Shyft) is distinguished from existing frameworks in that two primary goals are to provide: i) modern, professionally developed source code, and ii) a platform that is robust and ready for operational deployment. Developed jointly between Statkraft AS and The University of Oslo, the framework is currently in operation in both private and academic environments. The hydrology presently available in the distribution is simple and proven. Shyft provides a platform for distributed hydrologic modeling in a highly efficient manner. In its current operational deployment at Statkraft, Shyft is used to provide daily 10-day forecasts for critical reservoirs. In a research setting, we have developed a novel implementation of the SNICAR model to assess the impact of aerosol deposition on snow packs. Several well known rainfall-runoff algorithms are available for use, allowing for intercomparison of different approaches based on available data and the geographical environment. The well known HBV model is a default option, and other routines with more localized methods handling snow and evapotranspiration, or simplifications of catchment scale processes are included. For the latter, we have implemented the Kirchner response routine. As the framework was developed in Norway, a variety of snow-melt routines, including simplified degree-day models or more advanced energy balance models, may be selected. Ensemble forecasts, multi-model implementations, and statistical post-processing routines enable a robust toolbox for investigating optimal model configurations in an operational setting. The Shyft core is written in modern templated C++ and has Python wrappers developed for easy access to module sub-routines. The code is developed such that the modules that make up a "method stack" are easy to modify and customize, allowing one to create new methods and test them rapidly.
Due to the simple architecture and ease of access to the module routines, we see Shyft as an optimal choice to evaluate new hydrologic routines in an environment requiring robust, professionally developed software and welcome further community participation.

  8. Quantitative prediction of cellular metabolism with constraint-based models: the COBRA Toolbox v2.0

    PubMed Central

    Schellenberger, Jan; Que, Richard; Fleming, Ronan M. T.; Thiele, Ines; Orth, Jeffrey D.; Feist, Adam M.; Zielinski, Daniel C.; Bordbar, Aarash; Lewis, Nathan E.; Rahmanian, Sorena; Kang, Joseph; Hyduke, Daniel R.; Palsson, Bernhard Ø.

    2012-01-01

    Over the past decade, a growing community of researchers has emerged around the use of COnstraint-Based Reconstruction and Analysis (COBRA) methods to simulate, analyze and predict a variety of metabolic phenotypes using genome-scale models. The COBRA Toolbox, a MATLAB package for implementing COBRA methods, was presented earlier. Here we present a significant update of this in silico ToolBox. Version 2.0 of the COBRA Toolbox expands the scope of computations by including in silico analysis methods developed since its original release. New functions include: (1) network gap filling, (2) 13C analysis, (3) metabolic engineering, (4) omics-guided analysis, and (5) visualization. As with the first version, the COBRA Toolbox reads and writes Systems Biology Markup Language formatted models. In version 2.0, we improved performance, usability, and the level of documentation. A suite of test scripts can now be used to learn the core functionality of the Toolbox and validate results. This Toolbox lowers the barrier of entry to use powerful COBRA methods. PMID:21886097
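    The flux balance analysis at the heart of COBRA methods is a linear program over steady-state flux constraints. The sketch below is a generic SciPy illustration on a hypothetical three-reaction pathway, not the COBRA Toolbox's MATLAB interface.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # toy linear pathway: v1 (uptake) -> A, v2: A -> B, v3: B -> biomass
    # stoichiometric matrix S (metabolites x reactions); steady state requires S v = 0
    S = np.array([[1, -1,  0],    # metabolite A: made by v1, consumed by v2
                  [0,  1, -1]])   # metabolite B: made by v2, consumed by v3
    bounds = [(0, 10), (0, None), (0, None)]   # uptake v1 capped at 10

    # flux balance analysis: maximize biomass flux v3 (linprog minimizes, so negate)
    res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds, method="highs")
    v = res.x   # optimal flux distribution; the whole pathway runs at the uptake limit
    ```

    Genome-scale models work the same way, only with thousands of reactions, and the Toolbox's additional methods (gap filling, 13C analysis, omics-guided analysis) layer further constraints or objectives onto this core LP.
    
    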

  9. The ROC Toolbox: A toolbox for analyzing receiver-operating characteristics derived from confidence ratings.

    PubMed

    Koen, Joshua D; Barrett, Frederick S; Harlow, Iain M; Yonelinas, Andrew P

    2017-08-01

    Signal-detection theory, and the analysis of receiver-operating characteristics (ROCs), has played a critical role in the development of theories of episodic memory and perception. The purpose of the current paper is to present the ROC Toolbox. This toolbox is a set of functions written in the Matlab programming language that can be used to fit various common signal detection models to ROC data obtained from confidence rating experiments. The goals for developing the ROC Toolbox were to create a tool (1) that is easy to use and easy for researchers to implement with their own data, (2) that can flexibly define models based on varying study parameters, such as the number of response options (e.g., confidence ratings) and experimental conditions, and (3) that provides optimal routines (e.g., Maximum Likelihood estimation) to obtain parameter estimates and numerous goodness-of-fit measures. The ROC toolbox allows for various different confidence scales and currently includes the models commonly used in recognition memory and perception: (1) the unequal variance signal detection (UVSD) model, (2) the dual process signal detection (DPSD) model, and (3) the mixture signal detection (MSD) model. For each model fit to a given data set the ROC toolbox plots summary information about the best fitting model parameters and various goodness-of-fit measures. Here, we present an overview of the ROC Toolbox, illustrate how it can be used to input and analyse real data, and finish with a brief discussion on features that can be added to the toolbox.
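    Before any model fitting, an empirical ROC is built by cumulating confidence-rating counts from the strictest criterion to the most lenient. The sketch below uses hypothetical rating counts and plain NumPy; it is not the ROC Toolbox's API.

    ```python
    import numpy as np

    # hypothetical rating counts (1 = "sure new" ... 6 = "sure old")
    target_counts = np.array([ 4,  6, 10, 20, 25, 35])   # studied items
    lure_counts   = np.array([30, 25, 20, 15,  7,  3])   # unstudied items

    # cumulate from the strictest criterion ("sure old") to the most lenient
    hit_rates = np.cumsum(target_counts[::-1]) / target_counts.sum()
    fa_rates = np.cumsum(lure_counts[::-1]) / lure_counts.sum()

    # area under the empirical ROC, trapezoid rule anchored at the origin
    x = np.concatenate([[0.0], fa_rates])
    y = np.concatenate([[0.0], hit_rates])
    auc = float(np.sum(np.diff(x) * (y[1:] + y[:-1]) / 2.0))
    ```

    Models such as UVSD or DPSD are then fit by maximum likelihood to the full set of rating counts, and their predicted ROCs are compared against these empirical (hit rate, false-alarm rate) points.
    
    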

  10. WEC Design Response Toolbox v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey

    2016-03-30

    The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.

  11. A constrained robust least squares approach for contaminant release history identification

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Painter, Scott L.; Wittmeyer, Gordon W.

    2006-04-01

    Contaminant source identification is an important type of inverse problem in groundwater modeling and is subject to both data and model uncertainty. Model uncertainty has rarely been considered in previous studies. In this work, a robust framework for solving contaminant source recovery problems is introduced. The contaminant source identification problem is first cast into one of solving uncertain linear equations, where the response matrix is constructed using a superposition technique. The formulation presented here is general and is applicable to any porous media flow and transport solver. The robust least squares (RLS) estimator, which originated in the field of robust identification, directly accounts for errors arising from model uncertainty and has been shown to significantly reduce the sensitivity of the optimal solution to perturbations in model and data. In this work, a new variant of RLS, the constrained robust least squares (CRLS), is formulated for solving uncertain linear equations. CRLS allows additional constraints, such as nonnegativity, to be imposed. The performance of CRLS is demonstrated through one- and two-dimensional test problems. When the system is ill-conditioned and uncertain, CRLS is found to give much better performance than its classical counterpart, the nonnegative least squares. The source identification framework developed in this work thus constitutes a reliable tool for recovering source release histories in real applications.
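For unstructured norm-bounded uncertainty in the response matrix, the RLS estimator is known to reduce to a norm-regularized least-squares problem; a minimal sketch of a CRLS-style estimator under that simplifying assumption (the paper's structured formulation is more general), with nonnegativity imposed through bounds:

```python
import numpy as np
from scipy.linalg import hilbert
from scipy.optimize import minimize

def crls(A, b, rho):
    """CRLS sketch: for unstructured uncertainty ||dA|| <= rho, the robust
    problem reduces to  min_x ||A x - b||_2 + rho * ||x||_2,  here solved
    subject to the physical constraint x >= 0."""
    obj = lambda x: np.linalg.norm(A @ x - b) + rho * np.linalg.norm(x)
    x0 = np.ones(A.shape[1])
    res = minimize(obj, x0, bounds=[(0.0, None)] * A.shape[1], method="L-BFGS-B")
    return res.x

# ill-conditioned test problem: recover a nonnegative "release history"
rng = np.random.default_rng(0)
A = hilbert(8)                        # condition number ~1e10
x_true = np.linspace(0.5, 2.0, 8)
b = A @ x_true + 1e-3 * rng.standard_normal(8)
x_crls = crls(A, b, 0.1)
x_ls = np.linalg.lstsq(A, b, rcond=None)[0]   # classical least squares
```

On such an ill-conditioned system the classical least-squares solution is dominated by noise amplification, while the robust, nonnegativity-constrained estimate stays close to the true profile.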

  12. Screening and assessment of chronic pain among children with cerebral palsy: a process evaluation of a pain toolbox.

    PubMed

    Orava, Taryn; Provvidenza, Christine; Townley, Ashleigh; Kingsnorth, Shauna

    2018-06-08

    Though high numbers of children with cerebral palsy experience chronic pain, it remains under-recognized. This paper describes an evaluation of implementation supports and adoption of the Chronic Pain Assessment Toolbox for Children with Disabilities (the Toolbox) to enhance pain screening and assessment practices within a pediatric rehabilitation and complex continuing care hospital. A multicomponent knowledge translation strategy facilitated Toolbox adoption, inclusive of a clinical practice guideline, cerebral palsy practice points and assessment tools. Across the hospital, seven ambulatory care clinics with cerebral palsy caseloads participated in a staggered roll-out (Group 1: exclusive CP caseloads, March-December; Group 2: mixed diagnostic caseloads, August-December). Evaluation measures included client electronic medical record audit, document review and healthcare provider survey and interviews. A significant change in documentation of pain screening and assessment practice from pre-Toolbox (<2%) to post-Toolbox adoption (53%) was found. Uptake in Group 2 clinics lagged behind Group 1. Opportunities to use the Toolbox consistently (based on diagnostic caseload) and frequently (based on client appointments) were noted among contextual factors identified. Overall, the Toolbox was positively received and clinically useful. Findings affirm that the Toolbox, in conjunction with the application of integrated knowledge translation principles and an established knowledge translation framework, has potential to be a useful resource to enrich and standardize chronic pain screening and assessment practices among children with cerebral palsy. Implications for Rehabilitation: It is important to engage healthcare providers in the conceptualization, development, implementation and evaluation of a knowledge-to-action best practice product.
The Chronic Pain Toolbox for Children with Disabilities provides rehabilitation staff with guidance on pain screening and assessment best practice and offers a range of validated tools that can be incorporated in ambulatory clinic settings to meet varied client needs. Considering unique clinical contexts (i.e., opportunities for use, provider engagement, staffing absences/turnover) is required to optimize and sustain chronic pain screening and assessment practices in rehabilitation outpatient settings.

  13. DeepNeuron: an open deep learning toolbox for neuron tracing.

    PubMed

    Zhou, Zhi; Kuo, Hsien-Chi; Peng, Hanchuan; Long, Fuhui

    2018-06-06

    Reconstructing three-dimensional (3D) morphology of neurons is essential for understanding brain structures and functions. Over the past decades, a number of neuron tracing tools, including manual, semiautomatic, and fully automatic approaches, have been developed to extract and analyze 3D neuronal structures. Nevertheless, most of them were developed based on coding certain rules to extract and connect structural components of a neuron, showing limited performance on complicated neuron morphology. Recently, deep learning has outperformed many other machine learning methods in a wide range of image analysis and computer vision tasks. Here we developed a new open-source toolbox, DeepNeuron, which uses deep learning networks to learn features and rules from data and trace neuron morphology in light microscopy images. DeepNeuron provides a family of modules to solve basic yet challenging problems in neuron tracing. These problems include, but are not limited to: (1) detecting neuron signal under different image conditions, (2) connecting neuronal signals into tree(s), (3) pruning and refining tree morphology, (4) quantifying the quality of morphology, and (5) classifying dendrites and axons in real time. We have tested DeepNeuron using light microscopy images including bright-field and confocal images of human and mouse brain, on which DeepNeuron demonstrates robustness and accuracy in neuron tracing.

  14. Delft Dashboard: a quick setup tool for coastal and estuarine models

    NASA Astrophysics Data System (ADS)

    Nederhoff, C., III; Van Dongeren, A.; Van Ormondt, M.; Veeramony, J.

    2016-02-01

    We developed the easy-to-use Delft DashBoard (DDB) software for the rapid set-up of coastal and estuarine hydrodynamic and basic morphological numerical models. With the "Model Maker" toolbox, users can set up Delft3D models for any location in the world in a minimal amount of time (on the order of an hour). DDB draws upon public internet data sources of bathymetry and tides to construct the model. With additional toolboxes, these models can be forced with parameterized hurricane wind fields and with uplift of the sea surface due to tsunamis, and can be nested in publicly available ocean models and forced with meteorological data (wind speed, pressure, temperature). In this presentation we show the skill of a model set up with Delft Dashboard and compare it to well-calibrated benchmark models. These latter models have been set up using detailed input data and boundary conditions. We have tested the functionality of Delft DashBoard and evaluated the performance and robustness of the DDB model system on a variety of cases, ranging from coastal to basin models. Furthermore, we have performed a sensitivity study to investigate the most critical physical and numerical processes. The software can benefit operational modellers, as well as scientists and consultants.

  15. PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.

    PubMed

    Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco

    2016-07-11

    Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise. This is very common in industrial environments. The PFA Toolbox can be used to address such scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) It provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise. (b) It provides tools to easily plot the results as interval estimates or flux distributions. (c) It is composed of simple functions that MATLAB users can apply in flexible ways. (d) It includes a Graphical User Interface (GUI), which provides a visual representation of the measurements and their uncertainty. (e) It can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available Toolbox that is able to perform Interval and Possibilistic MFA estimations.
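As a rough illustration of the interval-MFA idea (not the PFA Toolbox's MATLAB interface), the tightest interval for any flux consistent with the stoichiometry and the measurement bounds can be obtained from two linear programs per flux:

```python
import numpy as np
from scipy.optimize import linprog

def flux_interval(S, j, lb, ub):
    """Interval MFA sketch: min and max of flux j subject to the steady-state
    stoichiometric constraint S v = 0 and interval bounds on each flux."""
    m, n = S.shape
    c = np.zeros(n)
    c[j] = 1.0
    bounds = list(zip(lb, ub))
    lo = linprog(c, A_eq=S, b_eq=np.zeros(m), bounds=bounds)   # minimize v_j
    hi = linprog(-c, A_eq=S, b_eq=np.zeros(m), bounds=bounds)  # maximize v_j
    return lo.fun, -hi.fun

# toy network: v1 -> M -> v2; v1 measured in [1, 2], v2 unmeasured in [0, 10]
S = np.array([[1.0, -1.0]])
lo, hi = flux_interval(S, 1, [1.0, 0.0], [2.0, 10.0])
```

Here the mass balance forces v2 = v1, so the unmeasured flux inherits the measured interval [1, 2]; the possibilistic extension additionally grades how plausible each value within such an interval is.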

  16. Pointing System Simulation Toolbox with Application to a Balloon Mission Simulator

    NASA Technical Reports Server (NTRS)

    Maringolo Baldraco, Rosana M.; Aretskin-Hariton, Eliot D.; Swank, Aaron J.

    2017-01-01

    The development of attitude estimation and pointing-control algorithms is necessary in order to achieve high-fidelity modeling for a Balloon Mission Simulator (BMS). A pointing system simulation toolbox was developed to enable this. The toolbox consists of a star-tracker (ST) and Inertial Measurement Unit (IMU) signal generator, a UDP (User Datagram Protocol) communication file (bridge), and an indirect-multiplicative extended Kalman filter (imEKF). This document describes the Python toolbox developed and the results of its implementation in the imEKF.

  17. A generic open-source toolbox to help long term irrigation monitoring for integrated water management in semi-arid Mediterranean areas.

    NASA Astrophysics Data System (ADS)

    Le Page, Michel; Gosset, Cindy; Oueslati, Ines; Calvez, Roger; Zribi, Mehrez; Lili Chabaane, Zohra

    2016-04-01

    In semi-arid areas, irrigated plains are often the major consumer of water, well beyond other water demands. Traditionally fed by surface water, irrigation has massively shifted to a more reliable resource: groundwater. This shift, which occurred over the last thirty years, has also provoked an extension and intensification of irrigation, often translated into impressive declines of the groundwater table. Integrated water management needs a systematic and robust way to estimate the water demands of the agricultural sector. We propose a generic toolbox based on the FAO-56 method and the Crop Coefficient/NDVI approach used in remote sensing. The toolbox can be separated into three main areas: 1) It facilitates the preparation of different input datasets: download, domain extraction, homogenization of formats, or spatial interpolation. 2) A collection of algorithms based on the analysis of NDVI time series is proposed: separation of irrigated vs non-irrigated areas, a simplified annual land cover classification, Crop Coefficient, Fraction Cover and Efficient Rainfall. 3) Synthesis against points or areas produces the output data at the desired spatial and temporal resolution for integrated water modeling or for data analysis and comparison. The toolbox has been used to build a WEAP21 model of the Merguellil basin in Tunisia for the period 2000-2014. Different meteorological forcings were easily used and compared: WFDEI, AGRI4CAST, MED-CORDEX. A local rain gauge database was used to produce a daily gridded rainfall dataset. MODIS MOD13Q1 (16 days, 250 m) data were used to produce the NDVI-derived datasets (Kc, Fc, RainEff). Punctual evapotranspiration was compared to actual measurements obtained by flux towers on wheat and barley, showing good agreement on a daily basis (r2=0.77). Finally, a comparison to monthly statistics of three irrigated commands was performed over 4 years.
This latter comparison showed poor agreement, which led us to suppose two things. First, the simple approach of estimating irrigation as evapotranspiration minus efficient rainfall at the monthly time step is not pertinent, because only a subset of the irrigated commands is actually irrigated; hence, remote sensing imagery of higher spatial resolution is needed. Second, in this particular area, farmers have a different rationale about rainfall and irrigation water needs. Those two aspects need to be further investigated. The toolbox has proven to be an interesting tool for integrating different sources of data, processing them efficiently and easily producing input data for the WEAP21 model over a long-term range. Yet some new challenges have been raised.
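The Kc/NDVI step of such a toolbox can be sketched as follows; the linear Kc-NDVI coefficients below are illustrative placeholders, since the actual values are crop- and site-specific:

```python
import numpy as np

def kc_from_ndvi(ndvi, a=1.25, b=-0.1):
    """Linear Kc-NDVI relation (a, b are illustrative, crop/site dependent);
    Kc is clipped to a plausible FAO-56 range."""
    return np.clip(a * np.asarray(ndvi, float) + b, 0.0, 1.2)

def irrigation_demand(ndvi, et0, rain_eff):
    """FAO-56 style estimate: ETc = Kc * ET0; irrigation ~ ETc minus effective rain."""
    etc = kc_from_ndvi(ndvi) * np.asarray(et0, float)
    return np.maximum(etc - np.asarray(rain_eff, float), 0.0)

# e.g. NDVI = 0.6, reference ET = 5 mm/day, effective rain = 1 mm/day
demand = irrigation_demand(0.6, 5.0, 1.0)
```

This is exactly the (ETc minus effective rainfall) simplification whose limits the abstract discusses: it assumes the whole pixel is actually irrigated.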

  18. T2N as a new tool for robust electrophysiological modeling demonstrated for mature and adult-born dentate granule cells

    PubMed Central

    Mongiat, Lucas Alberto; Schwarzacher, Stephan Wolfgang

    2017-01-01

    Compartmental models are the theoretical tool of choice for understanding single neuron computations. However, many models are incomplete, built ad hoc and require tuning for each novel condition rendering them of limited usability. Here, we present T2N, a powerful interface to control NEURON with Matlab and TREES toolbox, which supports generating models stable over a broad range of reconstructed and synthetic morphologies. We illustrate this for a novel, highly detailed active model of dentate granule cells (GCs) replicating a wide palette of experiments from various labs. By implementing known differences in ion channel composition and morphology, our model reproduces data from mouse or rat, mature or adult-born GCs as well as pharmacological interventions and epileptic conditions. This work sets a new benchmark for detailed compartmental modeling. T2N is suitable for creating robust models useful for large-scale networks that could lead to novel predictions. We discuss possible T2N application in degeneracy studies. PMID:29165247

  19. A Module for Graphical Display of Model Results with the CBP Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.

    2015-04-21

    This report describes work performed by the Savannah River National Laboratory (SRNL) in fiscal year 2014 to add enhanced graphical capabilities to display model results in the Cementitious Barriers Project (CBP) Toolbox. Because Version 2.0 of the CBP Toolbox has just been released, the graphing enhancements described in this report have not yet been integrated into a new version of the Toolbox. Instead they have been tested using a standalone GoldSim model and, while they are substantially complete, may undergo further refinement before full implementation. Nevertheless, this report is issued to document the FY14 development efforts which will provide a basis for further development of the CBP Toolbox.

  20. A robust interrupted time series model for analyzing complex health care intervention data.

    PubMed

    Cruz, Maricela; Bender, Miriam; Ombao, Hernando

    2017-12-20

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be "interrupted" by a change in a particular method of health care delivery. Interrupted time series (ITS) is a robust quasi-experimental design with the ability to infer the effectiveness of an intervention that accounts for data dependency. Current standardized methods for analyzing ITS data do not model changes in variation and correlation following the intervention. This is a key limitation since it is plausible for data variability and dependency to change because of the intervention. Moreover, present methodology either assumes a prespecified interruption time point with an instantaneous effect or removes data for which the effect of intervention is not fully realized. In this paper, we describe and develop a novel robust interrupted time series (robust-ITS) model that overcomes these omissions and limitations. The robust-ITS model formally performs inference on (1) identifying the change point; (2) differences in preintervention and postintervention correlation; (3) differences in the outcome variance preintervention and postintervention; and (4) differences in the mean preintervention and postintervention. We illustrate the proposed method by analyzing patient satisfaction data from a hospital that implemented and evaluated a new nursing care delivery model as the intervention of interest. The robust-ITS model is implemented in an R Shiny toolbox, which is freely available to the community. Copyright © 2017 John Wiley & Sons, Ltd.
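A minimal sketch of the change-point part of such a model, assuming independent Gaussian segments (the robust-ITS model additionally performs inference on pre/post correlation, which is omitted here):

```python
import numpy as np

def fit_change_point(y, min_seg=5):
    """Profile all candidate change points; pick the split minimizing the total
    Gaussian negative log-likelihood of the two segments, with mean and
    variance fitted separately per segment."""
    y = np.asarray(y, float)
    best_nll, best_t = np.inf, None
    for t in range(min_seg, len(y) - min_seg):
        pre, post = y[:t], y[t:]
        nll = (len(pre) * np.log(pre.std() + 1e-12)
               + len(post) * np.log(post.std() + 1e-12))
        if nll < best_nll:
            best_nll, best_t = nll, t
    t = best_t
    return t, y[:t].mean(), y[t:].mean(), y[:t].var(), y[t:].var()

# simulated "interrupted" series with a mean shift at t = 50
rng = np.random.default_rng(3)
y = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(3.0, 1.0, 50)])
t_hat, m_pre, m_post, v_pre, v_post = fit_change_point(y)
```

Because the per-segment variance enters the likelihood, the same profiling picks up changes in variability as well as in level, which is one of the limitations of standard ITS methods the abstract highlights.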

  1. A resilience-oriented approach for quantitatively assessing recurrent spatial-temporal congestion on urban roads.

    PubMed

    Tang, Junqing; Heinimann, Hans Rudolf

    2018-01-01

    Traffic congestion brings not only delay and inconvenience but also other associated national concerns, such as greenhouse gases, air pollutants, road safety issues and risks. Identification, measurement, tracking, and control of urban recurrent congestion are vital for building a livable and smart community. A considerable amount of work has contributed to tackling the problem. Several methods, such as time-based approaches and level of service, can be effective for characterizing congestion on urban streets. However, studies taking a systemic perspective on congestion quantification have been scarce. Resilience, on the other hand, is an emerging concept that focuses on comprehensive systemic performance and characterizes the ability of a system to cope with disturbance and to recover its functionality. In this paper, we treated recurrent congestion as an internal disturbance and proposed a modified metric inspired by the well-applied "R4" resilience-triangle framework. We constructed the metric with generic dimensions from both resilience engineering and transport science to quantify recurrent congestion based on spatial-temporal traffic patterns, and compared it with two other approaches in freeway and signal-controlled arterial cases. Results showed that the metric can effectively capture congestion patterns in the study area and provides a quantitative benchmark for comparison. They also suggested not only the good comparative performance of the proposed metric but also its capability of accounting for the discharging process in congestion. Sensitivity tests showed that the proposed metric is robust against parameter perturbation within the Robustness Range (RR), although the number of identified congestion patterns can be influenced by the existence of ϵ. In addition, the Elasticity Threshold (ET) and the spatial dimension of the cell-based platform significantly affect the congestion results, in both the detected number and the intensity.
By tackling this conventional problem with an emerging concept, our metric provides a systemic alternative approach and enriches the toolbox for congestion assessment. Future work will be conducted on a larger scale with multiplex scenarios in various traffic conditions.
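The classic resilience-triangle computation that such metrics build on quantifies loss as the area between baseline and degraded performance over time; a generic sketch (this is the textbook construction, not the paper's modified metric):

```python
import numpy as np

def resilience_loss(t, perf, baseline=1.0):
    """Area between baseline and observed performance (the resilience triangle),
    by trapezoidal integration; a larger area means deeper or longer degradation."""
    t = np.asarray(t, float)
    gap = np.maximum(baseline - np.asarray(perf, float), 0.0)
    return float(np.sum((gap[:-1] + gap[1:]) / 2.0 * np.diff(t)))

# a disturbance that halves performance at t = 1 and fully recovers by t = 2
loss = resilience_loss([0.0, 1.0, 2.0], [1.0, 0.5, 1.0])
```

Applied to traffic, "performance" would be a normalized speed or flow indicator, so recurrent congestion episodes appear as recurring triangles whose area can be tracked and compared across sites.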

  2. Robust power spectral estimation for EEG data

    PubMed Central

    Melman, Tamar; Victor, Jonathan D.

    2016-01-01

    Background: Typical electroencephalogram (EEG) recordings often contain substantial artifact. These artifacts, often large and intermittent, can interfere with quantification of the EEG via its power spectrum. To reduce the impact of artifact, EEG records are typically cleaned by a preprocessing stage that removes individual segments or components of the recording. However, such preprocessing can introduce bias, discard available signal, and be labor-intensive. With this motivation, we present a method that uses robust statistics to reduce dependence on preprocessing by minimizing the effect of large intermittent outliers on the spectral estimates. New method: Using the multitaper method [1] as a starting point, we replaced the final step of the standard power spectrum calculation with a quantile-based estimator, and the Jackknife approach to confidence intervals with a Bayesian approach. The method is implemented in provided MATLAB modules, which extend the widely used Chronux toolbox. Results: Using both simulated and human data, we show that in the presence of large intermittent outliers, the robust method produces improved estimates of the power spectrum, and that the Bayesian confidence intervals yield close-to-veridical coverage factors. Comparison with existing method: The robust method, as compared to the standard method, is less affected by artifact: inclusion of outliers produces fewer changes in the shape of the power spectrum as well as in the coverage factor. Conclusion: In the presence of large intermittent outliers, the robust method can reduce dependence on data preprocessing as compared to standard methods of spectral estimation. PMID:27102041

  3. Robust power spectral estimation for EEG data.

    PubMed

    Melman, Tamar; Victor, Jonathan D

    2016-08-01

    Typical electroencephalogram (EEG) recordings often contain substantial artifact. These artifacts, often large and intermittent, can interfere with quantification of the EEG via its power spectrum. To reduce the impact of artifact, EEG records are typically cleaned by a preprocessing stage that removes individual segments or components of the recording. However, such preprocessing can introduce bias, discard available signal, and be labor-intensive. With this motivation, we present a method that uses robust statistics to reduce dependence on preprocessing by minimizing the effect of large intermittent outliers on the spectral estimates. Using the multitaper method (Thomson, 1982) as a starting point, we replaced the final step of the standard power spectrum calculation with a quantile-based estimator, and the Jackknife approach to confidence intervals with a Bayesian approach. The method is implemented in provided MATLAB modules, which extend the widely used Chronux toolbox. Using both simulated and human data, we show that in the presence of large intermittent outliers, the robust method produces improved estimates of the power spectrum, and that the Bayesian confidence intervals yield close-to-veridical coverage factors. The robust method, as compared to the standard method, is less affected by artifact: inclusion of outliers produces fewer changes in the shape of the power spectrum as well as in the coverage factor. In the presence of large intermittent outliers, the robust method can reduce dependence on data preprocessing as compared to standard methods of spectral estimation. Copyright © 2016 Elsevier B.V. All rights reserved.
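The provided modules are MATLAB extensions of the Chronux toolbox; purely as a sketch of the quantile-based idea, here is a Python version that replaces the usual mean across tapers and segments with a bias-corrected median (the paper's Bayesian confidence intervals are omitted):

```python
import numpy as np
from scipy.signal.windows import dpss

def robust_mtm_psd(x, fs=1.0, nw=3.0, n_tapers=5, seg_len=256):
    """Multitaper PSD with a quantile-based (median) combination step.
    Each eigenspectrum is approximately exponentially distributed around the
    true spectrum, so the median across all (segment, taper) pairs is divided
    by ln 2 to remove its bias; large intermittent outliers perturb this
    median far less than the conventional mean."""
    tapers = dpss(seg_len, nw, n_tapers)                        # (K, seg_len)
    starts = range(0, len(x) - seg_len + 1, seg_len)
    segs = np.array([x[i:i + seg_len] for i in starts])         # (M, seg_len)
    eig = np.abs(np.fft.rfft(segs[:, None, :] * tapers[None, :, :], axis=-1)) ** 2
    eig = eig.reshape(-1, eig.shape[-1]) / fs                   # all eigenspectra
    return np.median(eig, axis=0) / np.log(2.0)

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)                                  # white-noise check
psd = robust_mtm_psd(x)
```

The ln 2 rescaling follows from the median of an exponential distribution being ln 2 times its mean; for white noise the resulting estimate is flat across frequency, and spiking a few segments with large artifacts shifts the median far less than it would shift the mean.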

  4. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  5. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  6. 40 CFR 141.716 - Source toolbox components.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...

  7. Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) User's Guide

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) software package is an open-source, MATLAB/Simulink toolbox (plug-in) that can be used by industry professionals and academics for the development of thermodynamic and controls simulations.

  8. CFS MATLAB toolbox: An experiment builder for continuous flash suppression (CFS) task.

    PubMed

    Nuutinen, Mikko; Mustonen, Terhi; Häkkinen, Jukka

    2017-09-15

    CFS toolbox is an open-source collection of MATLAB functions that utilizes PsychToolbox-3 (PTB-3). It is designed to allow a researcher to create and run continuous flash suppression experiments using a variety of experimental parameters (i.e., stimulus types and locations, noise characteristics, and experiment window settings). In a CFS experiment, one of the eyes at a time is presented with a dynamically changing noise pattern, while the other eye is concurrently presented with a static target stimulus, such as a Gabor patch. Due to the strong interocular suppression created by the dominant noise pattern mask, the target stimulus is rendered invisible for an extended duration. Very little knowledge of MATLAB is required for using the toolbox; experiments are generated by modifying csv files with the required parameters, and result data are output to text files for further analysis. The open-source code is available on the project page under a Creative Commons License ( http://www.mikkonuutinen.arkku.net/CFS_toolbox/ and https://bitbucket.org/mikkonuutinen/cfs_toolbox ).

  9. Simulating electron energy loss spectroscopy with the MNPBEM toolbox

    NASA Astrophysics Data System (ADS)

    Hohenester, Ulrich

    2014-03-01

    Within the MNPBEM toolbox, we show how to simulate electron energy loss spectroscopy (EELS) of plasmonic nanoparticles using a boundary element method approach. The methodology underlying our approach closely follows the concepts developed by García de Abajo and coworkers (Garcia de Abajo, 2010). We introduce two classes eelsret and eelsstat that allow in combination with our recently developed MNPBEM toolbox for a simple, robust, and efficient computation of EEL spectra and maps. The classes are accompanied by a number of demo programs for EELS simulation of metallic nanospheres, nanodisks, and nanotriangles, and for electron trajectories passing by or penetrating through the metallic nanoparticles. We also discuss how to compute electric fields induced by the electron beam and cathodoluminescence. Catalogue identifier: AEKJ_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKJ_v2_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 38886 No. of bytes in distributed program, including test data, etc.: 1222650 Distribution format: tar.gz Programming language: Matlab 7.11.0 (R2010b). Computer: Any which supports Matlab 7.11.0 (R2010b). Operating system: Any which supports Matlab 7.11.0 (R2010b). RAM:≥1 GB Classification: 18. Catalogue identifier of previous version: AEKJ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 370 External routines: MESH2D available at www.mathworks.com Does the new version supersede the previous version?: Yes Nature of problem: Simulation of electron energy loss spectroscopy (EELS) for plasmonic nanoparticles. Solution method: Boundary element method using electromagnetic potentials. 
Reasons for new version: The new version of the toolbox includes two additional classes for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles, and corrects a few minor bugs and inconsistencies. Summary of revisions: New classes “eelsstat” and “eelsret” for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles have been added. A few minor errors in the implementation of dipole excitation have been corrected. Running time: Depending on surface discretization between seconds and hours.

  10. A toolbox for microbore liquid chromatography tandem-high-resolution mass spectrometry analysis of albumin-adducts as novel biomarkers of organophosphorus pesticide poisoning.

    PubMed

    von der Wellen, Jens; Winterhalter, Peter; Siegert, Markus; Eyer, Florian; Thiermann, Horst; John, Harald

    2018-08-01

    Exposure to toxic organophosphorus pesticides (OPP) represents a serious problem in the public healthcare sector and might also be exploited in terrorist attacks. Therefore, reliable verification procedures for OPP intoxications are required for forensic, toxicological and clinical reasons. We developed and optimized a toolbox of methods to detect adducts of human serum albumin (HSA) with OPP, considered as long-term biomarkers. Human serum was incubated with diethyl-oxono and diethyl-thiono pesticides for adduct formation, used as reference. Afterwards, the serum was subjected to proteolysis using three proteases separately, thus yielding phosphorylated tyrosine residues (Y*) detected as a single amino acid (pronase), as the hexadecapeptide LVRY*411TKKVPQVSTPTL (pepsin) and as the tripeptide Y*411TK (trypsin), respectively. Adducts were analyzed via microbore liquid chromatography coupled to electrospray ionization (μLC-ESI) and tandem-high-resolution mass spectrometry (MS/HR MS). Using paraoxon-ethyl as a model OPP for adduct formation, methods were optimized with respect to MS/HR MS parameters, protease concentrations and incubation time for proteolysis. HSA adducts were found to be stable in serum in vitro at +37 °C and -30 °C for at least 27 days, and the resulting biomarkers were stable in the autosampler at 15 °C for at least 24 h. Limits of identification of adducts varied between 0.25 μM and 4.0 μM with respect to the corresponding pesticide concentrations in serum. The applicability of the methods was proven by the successful detection of the adducts in samples of OPP-poisoned patients, thus demonstrating the methods as a reliable toolbox for forensic and toxicological analysis. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. PreSurgMapp: a MATLAB Toolbox for Presurgical Mapping of Eloquent Functional Areas Based on Task-Related and Resting-State Functional MRI.

    PubMed

    Huang, Huiyuan; Ding, Zhongxiang; Mao, Dewang; Yuan, Jianhua; Zhu, Fangmei; Chen, Shuda; Xu, Yan; Lou, Lin; Feng, Xiaoyan; Qi, Le; Qiu, Wusi; Zhang, Han; Zang, Yu-Feng

    2016-10-01

    The main goal of brain tumor surgery is to maximize tumor resection while minimizing the risk of irreversible postoperative functional sequelae. Eloquent functional areas should be delineated preoperatively, particularly for patients with tumors near eloquent areas. Functional magnetic resonance imaging (fMRI) is a noninvasive technique that demonstrates great promise for presurgical planning. However, specialized data processing toolkits for presurgical planning remain lacking. Based on several functions in open-source software such as Statistical Parametric Mapping (SPM), Resting-State fMRI Data Analysis Toolkit (REST), Data Processing Assistant for Resting-State fMRI (DPARSF) and Multiple Independent Component Analysis (MICA), here, we introduce an open-source MATLAB toolbox named PreSurgMapp. This toolbox can reveal eloquent areas using comprehensive methods and various complementary fMRI modalities. For example, PreSurgMapp supports both model-based (general linear model, GLM, and seed correlation) and data-driven (independent component analysis, ICA) methods and processes both task-based and resting-state fMRI data. PreSurgMapp is designed for highly automatic and individualized functional mapping with a user-friendly graphical user interface (GUI) for time-saving pipeline processing. For example, sensorimotor and language-related components can be automatically identified, without human interference, by an effective and accurate component identification algorithm based on a discriminability index. All the results generated can be further evaluated and compared by neuro-radiologists or neurosurgeons. This software has substantial value for clinical neuro-radiology and neuro-oncology, including application to patients with low- and high-grade brain tumors and those with epilepsy foci in the dominant language hemisphere who are planning to undergo a temporal lobectomy.

  12. Lipid A structural modifications in extreme conditions and identification of unique modifying enzymes to define the Toll-like receptor 4 structure-activity relationship.

    PubMed

    Scott, Alison J; Oyler, Benjamin L; Goodlett, David R; Ernst, Robert K

    2017-11-01

    Strategies utilizing Toll-like receptor 4 (TLR4) agonists for treatment of cancer, infectious diseases, and other targets report promising results. Potent TLR4 antagonists are also gaining attention as therapeutic leads. Though some principles for TLR4 modulation by lipid A have been described, a thorough understanding of the structure-activity relationship (SAR) is lacking. Only through a complete definition of lipid A-TLR4 SAR is it possible to predict TLR4 signaling effects of discrete lipid A structures, rendering them more pharmacologically relevant. A limited 'toolbox' of lipid A-modifying enzymes has been defined and is largely composed of enzymes from mesophile human and zoonotic pathogens. Expansion of this 'toolbox' will result from extending the search into lipid A biosynthesis and modification by bacteria living at the extremes. Here, we review the fundamentals of lipid A structure, advances in lipid A uses in TLR4 modulation, and the search for novel lipid A-modifying systems in extremophile bacteria. This article is part of a Special Issue entitled: Bacterial Lipids edited by Russell E. Bishop. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Robust stability for uncertain stochastic fuzzy BAM neural networks with time-varying delays

    NASA Astrophysics Data System (ADS)

    Syed Ali, M.; Balasubramaniam, P.

    2008-07-01

    In this Letter, by utilizing a Lyapunov functional combined with the linear matrix inequality (LMI) approach, we analyze the global asymptotic stability of uncertain stochastic fuzzy Bidirectional Associative Memory (BAM) neural networks with time-varying delays, represented by Takagi-Sugeno (TS) fuzzy models. A new class of uncertain stochastic fuzzy BAM neural networks with time-varying delays is studied, and sufficient conditions are derived that yield less conservative results in stochastic settings. The developed results are more general than those reported in the earlier literature. In addition, numerical examples are provided to illustrate the applicability of the results using the LMI toolbox in MATLAB.
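    The role of the LMI in this kind of analysis can be illustrated with a small numerical check. The sketch below uses a hypothetical two-state delayed system (not the BAM networks studied in the Letter) and verifies a standard delay-independent stability certificate of the form [[A'P + PA + Q, PB], [B'P, -Q]] < 0; in practice an LMI solver such as MATLAB's feasp searches for P and Q, whereas here they are supplied by hand for illustration:

```python
import numpy as np

# Hypothetical two-state delayed system: xdot = A x(t) + B x(t - tau)
A = np.array([[-2.0, 0.0], [0.0, -3.0]])
B = 0.5 * np.eye(2)

# Candidate Lyapunov-Krasovskii matrices (an LMI solver would find these;
# here they are chosen by hand for this toy example)
P = np.eye(2)
Q = np.eye(2)

# Delay-independent LMI: the block matrix below must be negative definite
M = np.block([[A.T @ P + P @ A + Q, P @ B],
              [B.T @ P,            -Q]])

eigs = np.linalg.eigvalsh(M)
is_certified = bool(eigs.max() < 0)
print("max eigenvalue:", eigs.max(), "certified stable:", is_certified)
```

For this system the largest eigenvalue of the block matrix is negative, so the chosen P and Q certify delay-independent stability; the fuzzy and stochastic terms of the Letter add further blocks to the LMI but the feasibility check has the same shape.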

  14. RESPONSE PROTOCOL TOOLBOX: OVERVIEW, STATUS UPDATE, AND RELATIONSHIP TO OTHER WATER SECURITY PRODUCTS

    EPA Science Inventory

    The Response Protocol Toolbox was released by USEPA to address the complex, multi-faceted challenges of a water utility's planning and response to the threat or act of intentional contamination of drinking water (1). The Toolbox contains guidance that may be adopted voluntarily,...

  15. RESPONSE PROTOCOL TOOLBOX: OVERVIEW, STATUS UPDATE, AND RELATIONSHIP TO OTHER WATER SECURITY PRODUCTS

    EPA Science Inventory

    The Response Protocol Toolbox was released by USEPA to address the complex, multi-faceted challenges of a water utility's planning and response to the threat or act of intentional contamination of drinking water(1). The Toolbox contains guidance that may be adopted voluntarily, a...

  16. RESPONSE PROTOCOL TOOLBOX OVERVIEW, STATUS UPDATE, AND RELATIONSHIP TO OTHER WATER SECURITY PRODUCTS

    EPA Science Inventory

    The Response Protocol Toolbox was released by USEPA to address the complex, multi-faceted challenges of a water utility's planning and response to the threat or act of intentional contamination of drinking water (1). The Toolbox contains guidance that may be adopted voluntarily,...

  17. The Brain's Versatile Toolbox.

    ERIC Educational Resources Information Center

    Pinker, Steven

    1997-01-01

    Considers the role of evolution and natural selection in the functioning of the modern human brain. Natural selection equipped humans with a mental toolbox of intuitive theories about the world which were used to master rocks, tools, plants, animals, and one another. The same toolbox is used today to master the intellectual challenges of modern…

  18. Compressible Flow Toolbox

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2006-01-01

    The Compressible Flow Toolbox is primarily a MATLAB-language implementation of a set of algorithms that solve approximately 280 linear and nonlinear classical equations for compressible flow. The toolbox is useful for analysis of one-dimensional steady flow with constant entropy, friction, heat transfer, or Mach number greater than 1. The toolbox also contains algorithms for comparing and validating the equation-solving algorithms against solutions previously published in the open literature. The classical equations solved by the Compressible Flow Toolbox are as follows: the isentropic-flow equations; the Fanno flow equations (pertaining to flow of an ideal gas in a pipe with friction); the Rayleigh flow equations (pertaining to frictionless flow of an ideal gas, with heat transfer, in a pipe of constant cross section); the normal-shock equations; the oblique-shock equations; and the expansion equations.
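    Several of the classical relations named above are compact enough to state directly. The fragment below (plain Python rather than the toolbox's MATLAB, using the standard air value gamma = 1.4) evaluates the isentropic stagnation ratios and the normal-shock jump relations for an ideal gas:

```python
import math

gamma = 1.4  # ratio of specific heats for air

def isentropic_ratios(M, g=gamma):
    """Stagnation-to-static temperature and pressure ratios for isentropic flow."""
    t_ratio = 1.0 + 0.5 * (g - 1.0) * M**2   # T0/T
    p_ratio = t_ratio ** (g / (g - 1.0))     # p0/p
    return t_ratio, p_ratio

def normal_shock(M1, g=gamma):
    """Downstream Mach number and static pressure ratio across a normal shock."""
    M2 = math.sqrt((1.0 + 0.5 * (g - 1.0) * M1**2) /
                   (g * M1**2 - 0.5 * (g - 1.0)))
    p2_p1 = 1.0 + 2.0 * g / (g + 1.0) * (M1**2 - 1.0)
    return M2, p2_p1

t0_t, p0_p = isentropic_ratios(2.0)   # T0/T = 1.8, p0/p ~ 7.82 at Mach 2
M2, p2_p1 = normal_shock(2.0)         # M2 ~ 0.577, p2/p1 = 4.5
print(t0_t, p0_p, M2, p2_p1)
```

These are the textbook forms of two of the six equation families the toolbox implements; the Fanno, Rayleigh, oblique-shock, and expansion relations follow the same pattern but generally require iterative solution, which is where the toolbox's nonlinear solvers come in.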

  19. A toolbox for safety instrumented system evaluation based on improved continuous-time Markov chain

    NASA Astrophysics Data System (ADS)

    Wardana, Awang N. I.; Kurniady, Rahman; Pambudi, Galih; Purnama, Jaka; Suryopratomo, Kutut

    2017-08-01

    A safety instrumented system (SIS) is designed to restore a plant to a safe condition when a pre-hazardous event occurs. It has a vital role, especially in the process industries. A SIS shall meet its safety requirement specification, and to confirm this, the SIS must be evaluated. Typically, the evaluation is calculated by hand. This paper presents a toolbox for SIS evaluation, developed on the basis of an improved continuous-time Markov chain. The toolbox supports a detailed evaluation approach. This paper also illustrates an industrial application of the toolbox to evaluate the arch burner safety system of a primary reformer. The results of the case study demonstrate that the toolbox can be used to evaluate an industrial SIS in detail and to plan the maintenance strategy.
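    The core of such an evaluation can be sketched without the toolbox itself. The following fragment (a minimal illustration, not the paper's improved method; the failure rate and proof-test interval are hypothetical) builds a two-state continuous-time Markov chain for a single dangerous-undetected failure mode and averages the probability of failure on demand (PFD) over the proof-test interval:

```python
import numpy as np
from scipy.linalg import expm

lam_du = 1e-6      # dangerous-undetected failure rate, per hour (hypothetical)
T_proof = 8760.0   # proof-test interval, hours (one year)

# Generator of the two-state CTMC: state 0 = working, state 1 = failed dangerous.
# A dangerous-undetected failure is only revealed and repaired at the proof
# test, so there is no repair transition inside the interval.
Q = np.array([[-lam_du, lam_du],
              [0.0,     0.0]])

ts = np.linspace(0.0, T_proof, 201)
pfd = np.array([expm(Q * t)[0, 1] for t in ts])  # P(failed at t | working at 0)

# Trapezoidal time average over the proof-test interval
pfd_avg = float(((pfd[:-1] + pfd[1:]) / 2 * np.diff(ts)).sum() / T_proof)
print("PFD_avg:", pfd_avg)  # ~ lam_du * T_proof / 2 for small lam_du * T_proof
```

A realistic SIS model adds states for detected failures, common-cause failures, and voted architectures (1oo2, 2oo3, and so on), which is exactly the bookkeeping a toolbox automates instead of the hand calculation the abstract mentions.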

  20. biomechZoo: An open-source toolbox for the processing, analysis, and visualization of biomechanical movement data.

    PubMed

    Dixon, Philippe C; Loh, Jonathan J; Michaud-Paquette, Yannick; Pearsall, David J

    2017-03-01

    It is common for biomechanics data sets to contain numerous dependent variables recorded over time, for many subjects, groups, and/or conditions. These data often require standard sorting, processing, and analysis operations to be performed in order to answer research questions. Visualization of these data is also crucial. This manuscript presents biomechZoo, an open-source toolbox that provides tools and graphical user interfaces to help users achieve these goals. The aims of this manuscript are to (1) introduce the main features of the toolbox, including a virtual three-dimensional environment to animate motion data (Director), a data plotting suite (Ensembler), and functions for the computation of three-dimensional lower-limb joint angles, moments, and power and (2) compare these computations to those of an existing validated system. To these ends, the steps required to process and analyze a sample data set via the toolbox are outlined. The data set comprises three-dimensional marker, ground reaction force (GRF), joint kinematic, and joint kinetic data of subjects performing straight walking and 90° turning manoeuvres. Joint kinematics and kinetics processed within the toolbox were found to be similar to outputs from a commercial system. The biomechZoo toolbox represents the work of several years and multiple contributors to provide a flexible platform to examine time-series data sets typical in the movement sciences. The toolbox has previously been used to process and analyse walking, running, and ice hockey data sets, and can integrate existing routines, such as the KineMat toolbox, for additional analyses. The toolbox can help researchers and clinicians new to programming or biomechanics to process and analyze their data through a customizable workflow, while advanced users are encouraged to contribute additional functionality to the project. Students may benefit from using biomechZoo as a learning and research tool. 
It is hoped that the toolbox can play a role in advancing research in the movement sciences. The biomechZoo m-files, sample data, and help repositories are available online (http://www.biomechzoo.com) under the Apache 2.0 License. The toolbox is supported for Matlab (r2014b or newer, The Mathworks Inc., Natick, USA) for Windows (Microsoft Corp., Redmond, USA) and Mac OS (Apple Inc., Cupertino, USA). Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
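    As a minimal illustration of the kind of computation involved, the snippet below derives an included joint angle from three marker positions; this is a deliberate simplification of biomechZoo's full three-dimensional (Cardan-angle) joint kinematics, and the marker coordinates are invented:

```python
import numpy as np

def joint_angle(prox, joint, dist):
    """Included angle (degrees) at `joint` formed by markers prox-joint-dist."""
    u = np.asarray(prox, float) - np.asarray(joint, float)
    v = np.asarray(dist, float) - np.asarray(joint, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against tiny floating-point excursions outside [-1, 1]
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Hypothetical hip, knee, and ankle marker positions (metres)
hip, knee, ankle = [0.0, 0.0, 1.0], [0.0, 0.05, 0.55], [0.0, 0.0, 0.1]
print(joint_angle(hip, knee, ankle))  # included angle at the knee
```

Applied frame-by-frame to time-series marker data, this yields a joint-angle curve; anatomical joint angles reported by systems like the one validated in the paper additionally require segment-embedded coordinate systems and a rotation-sequence convention.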

  1. Software Toolbox for Low-Frequency Conductivity and Current Density Imaging Using MRI.

    PubMed

    Sajib, Saurav Z K; Katoch, Nitish; Kim, Hyung Joong; Kwon, Oh In; Woo, Eung Je

    2017-11-01

    Low-frequency conductivity and current density imaging using MRI includes magnetic resonance electrical impedance tomography (MREIT), diffusion tensor MREIT (DT-MREIT), conductivity tensor imaging (CTI), and magnetic resonance current density imaging (MRCDI). MRCDI and MREIT provide current density and isotropic conductivity images, respectively, using current-injection phase MRI techniques. DT-MREIT produces anisotropic conductivity tensor images by incorporating diffusion weighted MRI into MREIT. These current-injection techniques are finding clinical applications in diagnostic imaging and also in transcranial direct current stimulation (tDCS), deep brain stimulation (DBS), and electroporation where treatment currents can function as imaging currents. To avoid adverse effects of nerve and muscle stimulations due to injected currents, conductivity tensor imaging (CTI) utilizes B1 mapping and multi-b diffusion weighted MRI to produce low-frequency anisotropic conductivity tensor images without injecting current. This paper describes numerical implementations of several key mathematical functions for conductivity and current density image reconstructions in MRCDI, MREIT, DT-MREIT, and CTI. To facilitate experimental studies of clinical applications, we developed a software toolbox for these low-frequency conductivity and current density imaging methods. This MR-based conductivity imaging (MRCI) toolbox includes 11 toolbox functions which can be used in the MATLAB environment. The MRCI toolbox is available at http://iirc.khu.ac.kr/software.html. Its functions were tested by using several experimental datasets, which are provided together with the toolbox. Users of the toolbox can focus on experimental designs and interpretations of reconstructed images instead of developing their own image reconstruction software. We expect more toolbox functions to be added from future research outcomes.

  2. ROLE OF SOURCE WATER PROTECTION IN PLANNING FOR AND RESPONDING TO CONTAMINATION THREATS TO DRINKING WATER SYSTEMS

    EPA Science Inventory

    EPA has developed a "Response Protocol Toolbox" to address the complex, multi-faceted challenges of planning and response to intentional contamination of drinking water (http://www.epa.gov/safewater/security/ertools.html#toolbox). The toolbox is designed to be applied by a numbe...

  3. Using a Toolbox of Tailored Educational Lessons to Improve Fruit, Vegetable, and Physical Activity Behaviors among African American Women in California

    ERIC Educational Resources Information Center

    Backman, Desiree; Scruggs, Valarie; Atiedu, Akpene Ama; Bowie, Shene; Bye, Larry; Dennis, Angela; Hall, Melanie; Ossa, Alexandra; Wertlieb, Stacy; Foerster, Susan B.

    2011-01-01

    Objective: Evaluate the effectiveness of the "Fruit, Vegetable, and Physical Activity Toolbox for Community Educators" ("Toolbox"), an intervention originally designed for Spanish- and English-speaking audiences, in changing knowledge, attitudes, and behavior among low-income African American women. Design: Quasi-experimental…

  4. Travel demand management: a toolbox of strategies to reduce single-occupant vehicle trips and increase alternate mode usage in Arizona.

    DOT National Transportation Integrated Search

    2012-02-01

    The report provides a suite of recommended strategies to reduce single-occupant vehicle traffic in the urban : areas of Phoenix and Tucson, Arizona, which are presented as a travel demand management toolbox. The : toolbox includes supporting research...

  5. Robust synchronization control scheme of a population of nonlinear stochastic synthetic genetic oscillators under intrinsic and extrinsic molecular noise via quorum sensing.

    PubMed

    Chen, Bor-Sen; Hsu, Chih-Yuan

    2012-10-26

    Collective rhythms of gene regulatory networks have been a subject of considerable interest for biologists and theoreticians, in particular the synchronization of dynamic cells mediated by intercellular communication. Synchronization of a population of synthetic genetic oscillators is an important design goal in practical applications, because such a population distributed over different host cells needs to exploit molecular phenomena simultaneously in order to produce an emergent biological phenomenon. However, this synchronization may be corrupted by intrinsic kinetic parameter fluctuations and extrinsic environmental molecular noise. Therefore, robust synchronization is an important design topic for nonlinear stochastic coupled synthetic genetic oscillators with intrinsic kinetic parameter fluctuations and extrinsic molecular noise. Initially, the condition for robust synchronization of synthetic genetic oscillators was derived based on the Hamilton-Jacobi inequality (HJI). We found that if the synchronization robustness can confer enough intrinsic robustness to tolerate intrinsic parameter fluctuation and enough extrinsic robustness to filter the environmental noise, then robust synchronization of the coupled synthetic genetic oscillators is guaranteed. If the synchronization robustness of a population of nonlinear stochastic coupled synthetic genetic oscillators distributed over different host cells cannot be maintained, then robust synchronization can be enhanced by an external control input through quorum sensing molecules. 
    To simplify the analysis and design of robust synchronization of nonlinear stochastic synthetic genetic oscillators, the fuzzy interpolation method was employed to interpolate several local linear stochastic coupled systems to approximate the nonlinear stochastic coupled system, so that the HJI-based synchronization design problem could be replaced by a simple linear matrix inequality (LMI)-based design problem, which can easily be solved with the help of the LMI toolbox in MATLAB. If the synchronization robustness criterion is satisfied, i.e. synchronization robustness ≥ intrinsic robustness + extrinsic robustness, then the stochastic coupled synthetic oscillators can be robustly synchronized in spite of intrinsic parameter fluctuation and extrinsic noise. If the synchronization robustness criterion is violated, an external control scheme that adds an inducer can be designed to improve the synchronization robustness of the coupled synthetic genetic oscillators. The investigated robust synchronization criteria and the proposed external control method are useful for a population of coupled synthetic networks with emergent synchronization behavior, especially for multi-cellular, engineered networks.

  6. Robust synchronization control scheme of a population of nonlinear stochastic synthetic genetic oscillators under intrinsic and extrinsic molecular noise via quorum sensing

    PubMed Central

    2012-01-01

    Background Collective rhythms of gene regulatory networks have been a subject of considerable interest for biologists and theoreticians, in particular the synchronization of dynamic cells mediated by intercellular communication. Synchronization of a population of synthetic genetic oscillators is an important design goal in practical applications, because such a population distributed over different host cells needs to exploit molecular phenomena simultaneously in order to produce an emergent biological phenomenon. However, this synchronization may be corrupted by intrinsic kinetic parameter fluctuations and extrinsic environmental molecular noise. Therefore, robust synchronization is an important design topic for nonlinear stochastic coupled synthetic genetic oscillators with intrinsic kinetic parameter fluctuations and extrinsic molecular noise. Results Initially, the condition for robust synchronization of synthetic genetic oscillators was derived based on the Hamilton-Jacobi inequality (HJI). We found that if the synchronization robustness can confer enough intrinsic robustness to tolerate intrinsic parameter fluctuation and enough extrinsic robustness to filter the environmental noise, then robust synchronization of the coupled synthetic genetic oscillators is guaranteed. If the synchronization robustness of a population of nonlinear stochastic coupled synthetic genetic oscillators distributed over different host cells cannot be maintained, then robust synchronization can be enhanced by an external control input through quorum sensing molecules. 
    To simplify the analysis and design of robust synchronization of nonlinear stochastic synthetic genetic oscillators, the fuzzy interpolation method was employed to interpolate several local linear stochastic coupled systems to approximate the nonlinear stochastic coupled system, so that the HJI-based synchronization design problem could be replaced by a simple linear matrix inequality (LMI)-based design problem, which can easily be solved with the help of the LMI toolbox in MATLAB. Conclusion If the synchronization robustness criterion is satisfied, i.e. synchronization robustness ≥ intrinsic robustness + extrinsic robustness, then the stochastic coupled synthetic oscillators can be robustly synchronized in spite of intrinsic parameter fluctuation and extrinsic noise. If the synchronization robustness criterion is violated, an external control scheme that adds an inducer can be designed to improve the synchronization robustness of the coupled synthetic genetic oscillators. The investigated robust synchronization criteria and the proposed external control method are useful for a population of coupled synthetic networks with emergent synchronization behavior, especially for multi-cellular, engineered networks. PMID:23101662

  7. Proposal for the design of a zero gravity tool storage device

    NASA Technical Reports Server (NTRS)

    Stuckwisch, Sue; Carrion, Carlos A.; Phillips, Lee; Laughlin, Julia; Francois, Jason

    1994-01-01

    Astronauts frequently use a variety of hand tools during space missions, especially on repair missions. A toolbox is needed to allow storage and retrieval of tools with minimal difficulty. The toolbox must contain the tools during launch, landing, and on-orbit operations. The toolbox will be used in the Shuttle Bay and therefore must withstand the hazardous space environment. The three main functions of the toolbox in space are: to protect the tools from the space environment and from damaging one another; to allow for quick, one-handed access to the tools; and to minimize the heat transfer between the astronaut's hand and the tools. This proposal explores the primary design issues associated with the design of the toolbox. Included are the customer and design specifications, global and refined function structures, possible solution principles, concept variants, and finally design recommendations.

  8. CAMELOT: Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox

    NASA Astrophysics Data System (ADS)

    Di Carlo, Marilena; Romero Martin, Juan Manuel; Vasile, Massimiliano

    2018-03-01

    Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox (CAMELOT) is a toolbox for the fast preliminary design and optimisation of low-thrust trajectories. It solves highly complex combinatorial problems to plan multi-target missions characterised by long spirals including different perturbations. To do so, CAMELOT implements a novel multi-fidelity approach combining analytical surrogate modelling and accurate computational estimations of the mission cost. Decisions are then made using two optimisation engines included in the toolbox, a single-objective global optimiser, and a combinatorial optimisation algorithm. CAMELOT has been applied to a variety of case studies: from the design of interplanetary trajectories to the optimal de-orbiting of space debris and from the deployment of constellations to on-orbit servicing. In this paper, the main elements of CAMELOT are described and two examples, solved using the toolbox, are presented.

  9. JWST Wavefront Control Toolbox

    NASA Technical Reports Server (NTRS)

    Shin, Shahram Ron; Aronstein, David L.

    2011-01-01

    A MATLAB-based toolbox has been developed for the wavefront control and optimization of segmented optical surfaces, to correct for possible misalignments of the James Webb Space Telescope (JWST) using influence functions. The toolbox employs both iterative and non-iterative methods to converge to an optimal solution by minimizing a cost function, and it can be used in both constrained and unconstrained optimizations. The control process involves 1 to 7 degrees-of-freedom perturbations per segment of the primary mirror, in addition to the 5 degrees of freedom of the secondary mirror. The toolbox consists of a series of MATLAB/Simulink functions and modules, developed using a "wrapper" approach, that handle the interface and data flow between existing commercial optical modeling software packages such as Zemax and Code V. The limitations of the algorithm are dictated by the constraints of the moving parts in the mirrors.

  10. A CRISPR-Based Toolbox for Studying T Cell Signal Transduction

    PubMed Central

    Chi, Shen; Weiss, Arthur; Wang, Haopeng

    2016-01-01

    The CRISPR/Cas9 system is a powerful technology for performing genome editing in a variety of cell types. To facilitate the application of Cas9 in mapping T cell signaling pathways, we generated a toolbox for large-scale genetic screens in human Jurkat T cells. The toolbox comprises three different Jurkat cell lines expressing distinct Cas9 variants: wild-type Cas9, dCas9-KRAB, and sunCas9. We demonstrated that the toolbox allows us to rapidly disrupt endogenous gene expression at the DNA level and to efficiently repress or activate gene expression at the transcriptional level. The toolbox, in combination with multiple currently existing genome-wide sgRNA libraries, will be useful for systematically investigating T cell signal transduction using both loss-of-function and gain-of-function genetic screens. PMID:27057542

  11. The Handover Toolbox: a knowledge exchange and training platform for improving patient care.

    PubMed

    Drachsler, Hendrik; Kicken, Wendy; van der Klink, Marcel; Stoyanov, Slavi; Boshuizen, Henny P A; Barach, Paul

    2012-12-01

    Safe and effective patient handovers remain a global organisational and training challenge. Limited evidence supports available handover training programmes. Customisable training is a promising approach to improve the quality and sustainability of handover training and outcomes. We present a Handover Toolbox designed in the context of the European HANDOVER Project. The Toolbox aims to support physicians, nurses, individuals in health professions training, medical educators and handover experts by providing customised handover training tools for different clinical needs and contexts. The Handover Toolbox uses the Technology Enhanced Learning Design Process (TEL-DP), which encompasses user requirements analysis; writing personas; group concept mapping; analysis of suitable software; plus, minus, interesting rating; and usability testing. TEL-DP is aligned with participatory design approaches and ensures development occurs in close collaboration with, and engagement of, key stakeholders. Application of TEL-DP confirmed that the ideal formats of handover training differs for practicing professionals versus individuals in health profession education programmes. Training experts from different countries differed in their views on the optimal content and delivery of training. Analysis of suitable software identified ready-to-use systems that provide required functionalities and can be further customised to users' needs. Interest rating and usability testing resulted in improved usability, navigation and uptake of the Handover Toolbox. The design of the Handover Toolbox was based on a carefully led stakeholder participatory design using the TEL-DP approach. The Toolbox supports a customisable learning approach that allows trainers to design training that addresses the specific information needs of the various target groups. We offer recommendations regarding the application of the Handover Toolbox to medical educators.

  12. Building Interdisciplinary Research Models Through Interactive Education.

    PubMed

    Hessels, Amanda J; Robinson, Brian; O'Rourke, Michael; Begg, Melissa D; Larson, Elaine L

    2015-12-01

    Critical interdisciplinary research skills include effective communication with diverse disciplines and cultivating collaborative relationships. Acquiring these skills during graduate education may foster future interdisciplinary research quality and productivity. The project aim was to develop and evaluate an interactive Toolbox workshop approach within an interprofessional graduate level course to enhance student learning and skill in interdisciplinary research. We sought to examine the student experience of integrating the Toolbox workshop in modular format over the duration of a 14-week course. The Toolbox Health Sciences Instrument includes six modules that were introduced in a 110-minute dialogue session during the first class and then integrated into the course in a series of six individual workshops in three phases over the course of the semester. Seventeen students participated; the majority were nursing students. Three measures were used to assess project outcomes: pre-post intervention Toolbox survey, competency self-assessment, and a postcourse survey. All measures indicated the objectives were met by a change in survey responses, improved competencies, and favorable experience of the Toolbox modular intervention. Our experience indicates that incorporating this Toolbox modular approach into research curricula can enhance individual level scientific capacity, future interdisciplinary research project success, and ultimately impact on practice and policy. © 2015 Wiley Periodicals, Inc.

  13. The Decoding Toolbox (TDT): a versatile software package for multivariate analyses of functional imaging data

    PubMed Central

    Hebart, Martin N.; Görgen, Kai; Haynes, John-Dylan

    2015-01-01

    The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT) which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns. PMID:25610393
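    TDT itself is a Matlab package, but the cross-validated decoding scheme at its core can be sketched in a few lines. The snippet below is an illustrative stand-in, not TDT's actual API: it classifies two simulated conditions from synthetic "voxel patterns" with a simple nearest-centroid classifier under leave-one-run-out cross-validation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_trials, n_vox = 6, 8, 20

# Synthetic patterns: two conditions with distinct mean patterns across voxels
mean_a = rng.normal(0, 1, n_vox)
mean_b = mean_a + 1.5  # condition B shifted relative to A

# data[(run, condition)] -> (n_trials, n_vox) array of noisy trial patterns
data = {(r, c): rng.normal(m, 1.0, (n_trials, n_vox))
        for r in range(n_runs) for c, m in ((0, mean_a), (1, mean_b))}

correct = total = 0
# Leave-one-run-out cross-validation with a nearest-centroid classifier
for test_run in range(n_runs):
    train = lambda c: np.vstack([data[r, c] for r in range(n_runs) if r != test_run])
    cent = {c: train(c).mean(axis=0) for c in (0, 1)}
    for c in (0, 1):
        for trial in data[test_run, c]:
            pred = min((0, 1), key=lambda k: np.linalg.norm(trial - cent[k]))
            correct += int(pred == c)
            total += 1

accuracy = correct / total
print("decoding accuracy:", accuracy)  # well above the 0.5 chance level
```

TDT wraps this same train/test logic around SPM beta images and repeats it per region of interest or per searchlight sphere, with more capable classifiers and the cross-validation bookkeeping handled automatically.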

  14. Tools in Support of Planning for Weather and Climate Extremes

    NASA Astrophysics Data System (ADS)

    Done, J.; Bruyere, C. L.; Hauser, R.; Holland, G. J.; Tye, M. R.

    2016-12-01

    A major limitation to planning for weather and climate extremes is the lack of maintained and readily available tools that can provide robust and well-communicated predictions and advice on their impacts. The National Center for Atmospheric Research is facilitating a collaborative international program to develop and support such tools within its Capacity Center for Climate and Weather Extremes, aimed at improving community resilience planning and reducing weather and climate impacts. A Global Risk, Resilience and Impacts Toolbox is in development and will provide: a portable web-based interface to process work requests from a variety of users and locations; a sophisticated framework that enables specialized community tools to access a comprehensive database (public and private) of geo-located hazard, vulnerability, exposure, and loss data; a community development toolkit that enables and encourages community tool developments geared towards specific user management and planning needs; and comprehensive community support facilitated by NCAR utilizing tutorials and a help desk. A number of applications are in development, built on the latest climate science and in collaboration with private industry and local and state governments. Example applications will be described, including a hurricane damage tool developed in collaboration with the reinsurance sector and a weather management tool for the construction industry. These examples will serve as starting points to discuss the broader potential of the toolbox.

  15. PDB2Graph: A toolbox for identifying critical amino acids map in proteins based on graph theory.

    PubMed

    Niknam, Niloofar; Khakzad, Hamed; Arab, Seyed Shahriar; Naderi-Manesh, Hossein

    2016-05-01

    The integrative and cooperative nature of protein structure involves the assessment of topological and global features of its constituent parts. The network concept takes full advantage of both of these properties concomitantly. High compatibility with structural concepts and physicochemical properties, in addition to a remarkable simplification of the system, has made networks an ideal tool for exploring biological systems. There are numerous examples in which different protein structural and functional characteristics have been clarified by the network approach. Here, we present an interactive and user-friendly Matlab-based toolbox, PDB2Graph, devoted to protein structure network construction, visualization, and analysis. Moreover, PDB2Graph is an appropriate tool for identifying critical nodes involved in protein structural robustness and function based on centrality indices. It maps critical amino acids in protein networks and can greatly aid structural biologists in selecting proper amino acid candidates for manipulating protein structures in a more reasonable and rational manner. To introduce the capability and efficiency of PDB2Graph in detail, the structural modification of Calmodulin through allosteric binding of Ca(2+) is considered. In addition, a mutational analysis of three well-identified model proteins, Phage T4 lysozyme, Barnase and Ribonuclease HI, was performed to inspect the influence of mutating important central residues on protein activity. Copyright © 2016 Elsevier Ltd. All rights reserved.
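    The simplest centrality index of the kind PDB2Graph reports is degree centrality on a residue-contact graph. The sketch below (plain Python, not the toolbox's code) builds a toy contact network and flags the most central residue; the residue names and contact list are illustrative.

```python
# Hedged sketch: degree centrality on an undirected residue-contact graph,
# one of the centrality indices used to flag critical amino acids.

def degree_centrality(edges):
    """Normalized degree centrality for an undirected contact graph."""
    nodes = set()
    for u, v in edges:
        nodes.update((u, v))
    deg = {n: 0 for n in nodes}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    n = len(nodes)
    # normalize by the maximum possible degree (n - 1)
    return {node: d / (n - 1) for node, d in deg.items()}

# Toy contact network: residue "GLU31" touches everything (a hub).
contacts = [("GLU31", "LYS12"), ("GLU31", "ASP45"),
            ("GLU31", "PHE90"), ("LYS12", "ASP45")]
central = degree_centrality(contacts)
hub = max(central, key=central.get)  # -> "GLU31"
```

Betweenness and closeness centrality rank hubs differently on larger graphs, which is why toolboxes typically offer several indices side by side.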

  16. A robust firearm identification algorithm of forensic ballistics specimens

    NASA Astrophysics Data System (ADS)

    Chuan, Z. L.; Jemain, A. A.; Liong, C.-Y.; Ghani, N. A. M.; Tan, L. K.

    2017-09-01

    Existing firearm identification algorithms suffer from several inherent difficulties, including the need for physical interpretation and their time-consuming nature. The aim of this study is therefore to propose a robust firearm identification algorithm based on extracting a set of informative features from a segmented region of interest (ROI) in simulated noisy center-firing pin impression images. The proposed algorithm comprises a Laplacian sharpening filter, clustering-based threshold selection, an unweighted least-squares estimator, and segmentation of a square ROI from the noisy images. A total of 250 simulated noisy images collected from five different pistols of the same make, model and caliber are used to evaluate the robustness of the proposed algorithm. This study found that the proposed algorithm is able to perform identification on noisy images with noise levels as high as 70%, while maintaining a firearm identification accuracy rate of over 90%.
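    One plausible reading of the "clustering-based threshold selection" step is an Otsu-style threshold, which clusters gray levels into two classes by maximizing between-class variance. The sketch below is a generic implementation of that classic technique, not the authors' code, and the pixel data are illustrative.

```python
# Hedged sketch: Otsu-style clustering-based threshold selection on a
# grayscale histogram (an assumption about the paper's step, plainly labeled).

def otsu_threshold(pixels, levels=256):
    """Return the gray level maximizing between-class variance."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(levels):
        w0 += hist[t]            # pixels at or below t (background class)
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        w1 = total - w0
        mu0, mu1 = sum0 / w0, (total_sum - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

On a bimodal image (e.g. half the pixels at gray level 10, half at 200) the method lands the threshold between the two clusters, which is exactly the behavior needed to segment an impression mark from its background.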

  17. VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data

    PubMed Central

    Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel

    2014-01-01

    This work is in line with an ongoing effort toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198

  18. Face pose tracking using the four-point algorithm

    NASA Astrophysics Data System (ADS)

    Fung, Ho Yin; Wong, Kin Hong; Yu, Ying Kin; Tsui, Kwan Pang; Kam, Ho Chuen

    2017-06-01

    In this paper, we have developed an algorithm to track the pose of a human face robustly and efficiently. Face pose estimation is very useful in many applications such as building virtual reality systems and creating an alternative input method for the disabled. Firstly, we have modified a face detection toolbox called DLib for the detection of a face in front of a camera. The detected face features are passed to a pose estimation method, known as the four-point algorithm, for pose computation. The theory applied and the technical problems encountered during system development are discussed in the paper. It is demonstrated that the system is able to track the pose of a face in real time using a consumer grade laptop computer.

  19. National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox

    USGS Publications Warehouse

    Price, Curtis

    2010-01-01

    This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells.

  20. Robust Speaker Authentication Based on Combined Speech and Voiceprint Recognition

    NASA Astrophysics Data System (ADS)

    Malcangi, Mario

    2009-08-01

    Personal authentication is becoming increasingly important in many applications that have to protect proprietary data. Passwords and personal identification numbers (PINs) prove not to be robust enough to ensure that unauthorized people do not use them. Biometric authentication technology may offer a secure, convenient, accurate solution but sometimes fails due to its intrinsically fuzzy nature. This research aims to demonstrate that combining two basic speech processing methods, voiceprint identification and speech recognition, can provide a very high degree of robustness, especially if fuzzy decision logic is used.
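    The paper's fuzzy decision logic is not specified in the abstract; the sketch below shows only the general idea of soft score fusion, combining a voiceprint score and a speech-recognition score with a weighted rule. The weights, threshold, and function name are illustrative assumptions, not the authors' design.

```python
# Hedged sketch: soft fusion of two biometric scores (an illustration of
# combined decision logic, not the paper's actual fuzzy rules).

def authenticate(voiceprint_score, speech_score,
                 w_voice=0.6, w_speech=0.4, threshold=0.7):
    """Accept when the weighted combination of the two scores is high enough.

    Both scores are assumed normalized to [0, 1].
    """
    combined = w_voice * voiceprint_score + w_speech * speech_score
    return combined >= threshold, combined

accepted, score = authenticate(0.9, 0.8)  # accepted (combined score near 0.86)
```

The appeal of fusion is that a borderline result on one modality can be rescued or rejected by the other, rather than each being thresholded in isolation.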

  1. SOCIB Glider toolbox: from sensor to data repository

    NASA Astrophysics Data System (ADS)

    Pau Beltran, Joan; Heslop, Emma; Ruiz, Simón; Troupin, Charles; Tintoré, Joaquín

    2015-04-01

    In oceanography, gliders now constitute a mature, cost-effective technology for the acquisition of measurements independently of the sea state (unlike ships), providing subsurface data over sustained periods, including during extreme weather events. The SOCIB glider toolbox is a set of MATLAB/Octave scripts and functions developed to manage the data collected by a glider fleet. They cover the main stages of the data management process, in both real-time and delayed-time modes: metadata aggregation, downloading, processing, and automatic generation of data products and figures. The toolbox is distributed under the GNU licence (http://www.gnu.org/copyleft/gpl.html) and is available at http://www.socib.es/users/glider/glider_toolbox.

  2. The heritability of the functional connectome is robust to common nonlinear registration methods

    NASA Astrophysics Data System (ADS)

    Hafzalla, George W.; Prasad, Gautam; Baboyan, Vatche G.; Faskowitz, Joshua; Jahanshad, Neda; McMahon, Katie L.; de Zubicaray, Greig I.; Wright, Margaret J.; Braskie, Meredith N.; Thompson, Paul M.

    2016-03-01

    Nonlinear registration algorithms are routinely used in brain imaging to align data for inter-subject and group comparisons, and for voxelwise statistical analyses. To understand how the choice of registration method affects maps of functional brain connectivity in a sample of 611 twins, we evaluated three popular nonlinear registration methods: Advanced Normalization Tools (ANTs), Automatic Registration Toolbox (ART), and FMRIB's Nonlinear Image Registration Tool (FNIRT). Using both structural and functional MRI, we used each of the three methods to align the MNI152 brain template, and 80 regions of interest (ROIs), to each subject's T1-weighted (T1w) anatomical image. We then transformed each subject's ROIs onto the associated resting state functional MRI (rs-fMRI) scans and computed a connectivity network, or functional connectome, for each subject. Given the different degrees of genetic similarity between pairs of monozygotic (MZ) and same-sex dizygotic (DZ) twins, we used structural equation modeling to estimate the additive genetic influences on the elements of the functional networks, i.e., their heritability. The functional connectome and derived statistics were relatively robust to nonlinear registration effects.
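    The study used structural equation modeling; a much simpler classical approximation of additive heritability from twin correlations is Falconer's formula, h² = 2(r_MZ − r_DZ), sketched below with illustrative numbers (not the study's results).

```python
# Hedged sketch: Falconer's formula, a rough stand-in for the SEM-based
# heritability estimate used in the paper. Input correlations are made up.

def falconer_h2(r_mz, r_dz):
    """Rough additive-genetic heritability from MZ and DZ twin correlations."""
    h2 = 2.0 * (r_mz - r_dz)
    return max(0.0, min(1.0, h2))  # clamp to the interpretable [0, 1] range

h2 = falconer_h2(r_mz=0.62, r_dz=0.38)  # -> about 0.48
```

The logic: MZ twins share essentially all additive genetic variance and DZ twins about half, so twice the excess MZ similarity estimates the genetic share of trait variance.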

  3. A Data Analysis Toolbox for Modeling the Global Food-Energy-Water Nexus

    NASA Astrophysics Data System (ADS)

    AghaKouchak, A.; Sadegh, M.; Mallakpour, I.

    2017-12-01

    Water, food, and energy systems are highly interconnected. More than seventy percent of global water resources are used for food production, and water withdrawal, purification, and transfer systems are energy intensive. Furthermore, energy generation strongly depends on water availability. Considering the interactions in the nexus of water, food and energy is therefore crucial for sustainable management of available resources. In this presentation, we introduce a user-friendly data analysis toolbox that mines the available global data on food, energy and water, and analyzes their interactions. This toolbox provides estimates of the water footprint for a wide range of food types in different countries, and also approximates the required energy and water resources. The toolbox additionally provides estimates of the corresponding emissions and biofuel production of different crops. In summary, this toolbox allows evaluating dependencies of the food, energy, and water systems at the country scale. We present a global analysis of the interactions between water, food and energy from different perspectives, including efficiency and diversity of resource use.
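    The country-scale bookkeeping such a toolbox performs can be sketched as combining per-crop water footprints with production volumes. The figures below are illustrative placeholders, not values from the toolbox's database.

```python
# Hedged sketch: summing water demand over crops from unit water footprints.
# Production and footprint numbers are hypothetical.

def total_water_footprint(production_tonnes, footprint_m3_per_tonne):
    """Total water demand (m^3) given per-crop production and unit footprints."""
    return sum(production_tonnes[crop] * footprint_m3_per_tonne[crop]
               for crop in production_tonnes)

production = {"wheat": 1000.0, "rice": 500.0}   # tonnes (hypothetical)
footprints = {"wheat": 1800.0, "rice": 2500.0}  # m^3 per tonne (illustrative)
demand = total_water_footprint(production, footprints)  # -> 3,050,000 m^3
```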

  4. TopoToolbox: using sensor topography to calculate psychologically meaningful measures from event-related EEG/MEG.

    PubMed

    Tian, Xing; Poeppel, David; Huber, David E

    2011-01-01

    The open-source toolbox "TopoToolbox" is a suite of functions that use sensor topography to calculate psychologically meaningful measures (similarity, magnitude, and timing) from multisensor event-related EEG and MEG data. Using a GUI and data visualization, TopoToolbox can be used to calculate and test the topographic similarity between different conditions (Tian and Huber, 2008). This topographic similarity indicates whether different conditions involve a different distribution of underlying neural sources. Furthermore, this similarity calculation can be applied at different time points to discover when a response pattern emerges (Tian and Poeppel, 2010). Because the topographic patterns are obtained separately for each individual, these patterns are used to produce reliable measures of response magnitude that can be compared across individuals using conventional statistics (Davelaar et al. Submitted and Huber et al., 2008). TopoToolbox can be freely downloaded. It runs under MATLAB (The MathWorks, Inc.) and supports user-defined data structure as well as standard EEG/MEG data import using EEGLAB (Delorme and Makeig, 2004).
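    A topographic similarity of the kind TopoToolbox computes can be sketched as the Pearson correlation between two sensor-space patterns; the toolbox's exact normalization may differ, and the function name below is illustrative.

```python
# Hedged sketch: topographic similarity as Pearson correlation between two
# equal-length sensor patterns (the spirit of the measure, not TDT/TopoToolbox API).
import math

def topographic_similarity(pattern_a, pattern_b):
    """Pearson correlation between two equal-length sensor patterns."""
    n = len(pattern_a)
    ma = sum(pattern_a) / n
    mb = sum(pattern_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(pattern_a, pattern_b))
    sa = math.sqrt(sum((a - ma) ** 2 for a in pattern_a))
    sb = math.sqrt(sum((b - mb) ** 2 for b in pattern_b))
    return cov / (sa * sb)

# Identical topographies correlate perfectly regardless of overall magnitude:
r = topographic_similarity([1, 2, 3, 4], [2, 4, 6, 8])  # close to 1.0
```

Because the correlation ignores overall response magnitude, it isolates whether the *distribution* of activity over sensors differs between conditions, which is the toolbox's central question.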

  5. User Guide for Compressible Flow Toolbox Version 2.1 for Use With MATLAB(Registered Trademark); Version 7

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2006-01-01

    This report provides a user guide for the Compressible Flow Toolbox, a collection of algorithms that solve almost 300 linear and nonlinear classical compressible flow relations. The algorithms, implemented in the popular MATLAB programming language, are useful for analysis of one-dimensional steady flow with constant entropy, friction, heat transfer, or shock discontinuities. The solutions do not include any gas dissociative effects. The toolbox also contains functions for comparing and validating the equation-solving algorithms against solutions previously published in the open literature. The classical equations solved by the Compressible Flow Toolbox are: isentropic-flow equations, Fanno flow equations (pertaining to flow of an ideal gas in a pipe with friction), Rayleigh flow equations (pertaining to frictionless flow of an ideal gas, with heat transfer, in a pipe of constant cross section), normal-shock equations, oblique-shock equations, and Prandtl-Meyer expansion equations. At the time this report was published, the Compressible Flow Toolbox was available without cost from the NASA Software Repository.
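    One of the classical isentropic-flow relations the toolbox solves is the total-to-static temperature ratio, T0/T = 1 + ((γ − 1)/2)M². The Python sketch below (the toolbox itself is MATLAB, and this function name is not its API) evaluates it for air.

```python
# Hedged sketch of a standard isentropic-flow relation; the interface is
# illustrative, but the formula is the classical one.

def total_to_static_temperature_ratio(mach, gamma=1.4):
    """Isentropic total-to-static temperature ratio for an ideal gas."""
    return 1.0 + 0.5 * (gamma - 1.0) * mach ** 2

ratio = total_to_static_temperature_ratio(mach=1.0)  # -> 1.2 for air
```

At Mach 1 with γ = 1.4 the ratio is exactly 1.2, a standard check value from the open literature of the kind the toolbox's validation functions compare against.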

  6. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG

    PubMed Central

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

    The reference electrode standardization technique (REST) has in recent years been increasingly acknowledged and applied in the electroencephalography/event-related potentials (EEG/ERP) community worldwide as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones. However, an easy-to-use toolbox for re-referencing scalp EEG data to the zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version intended to be more convenient and efficient for experienced users. Both are designed to provide ease of use for novice researchers and flexibility for experienced researchers. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, where detailed information including publications, comments and documents on REST can also be found. An example of usage is given with comparative results of REST and the average reference. We hope these user-friendly REST toolboxes make the relatively novel technique of REST easier to study, especially for applications in various EEG studies. PMID:29163006
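    REST itself requires a head model, but the baseline it is compared against, the common average reference, is simple: subtract the mean across channels at every sample. A minimal sketch (plain Python, not the toolbox's code):

```python
# Hedged sketch: common average reference, the baseline re-referencing scheme
# the REST toolboxes compare against (not REST itself, which needs a head model).

def average_reference(eeg):
    """eeg: list of channels, each a list of samples; returns a re-referenced copy."""
    n_ch = len(eeg)
    n_samp = len(eeg[0])
    # mean over channels at each time point
    means = [sum(ch[t] for ch in eeg) / n_ch for t in range(n_samp)]
    return [[ch[t] - means[t] for t in range(n_samp)] for ch in eeg]

# After re-referencing, the channels sum to zero at every sample:
reref = average_reference([[1.0, 2.0], [3.0, 6.0]])
```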

  8. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Benveniste, Jérôme; Knudsen, Per

    2016-07-01

    The GOCE User Toolbox (GUT) is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in geodesy, oceanography and solid Earth physics, and the GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Mac. The toolbox is supported by the GUT Algorithm Description and User Guide and the GUT Install Guide, and a set of a-priori data and models are made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, added the capability to compute the Simple Bouguer Anomaly (solid Earth). This fall, GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at implementing the remaining functionalities to facilitate a wider span of research in the fields of geodesy, oceanography and solid Earth studies. Accordingly, GUT version 3 has: an attractive and easy-to-use graphical user interface (GUI) for the toolbox; further software functionalities, such as support for the use of gradients, anisotropic diffusive filtering, and computation of Bouguer and isostatic gravity anomalies; and an associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.
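    The "Simple Bouguer Anomaly" mentioned above rests on a classical reduction: a free-air correction plus a Bouguer slab term 2πGρh. The sketch below uses standard constants; GUT's actual processing chain from GOCE Level 2 products is considerably more involved, and the function name is illustrative.

```python
# Hedged sketch of the classical simple Bouguer reduction (standard textbook
# formula, not GUT's implementation).
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
RHO_CRUST = 2670.0   # standard reduction density, kg/m^3
FREE_AIR = 0.3086    # free-air gradient, mGal per metre

def simple_bouguer_anomaly(g_obs_mgal, g_normal_mgal, height_m, rho=RHO_CRUST):
    """Simple Bouguer anomaly in mGal at elevation height_m."""
    # infinite-slab attraction 2*pi*G*rho*h, converted from m/s^2 to mGal
    slab_mgal = 2.0 * math.pi * G * rho * height_m * 1e5
    return g_obs_mgal - g_normal_mgal + FREE_AIR * height_m - slab_mgal
```

At 100 m elevation the free-air term adds about 30.9 mGal and the slab removes about 11.2 mGal, so topography alone contributes roughly 19.7 mGal before any observed-minus-normal difference.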

  9. A versatile software package for inter-subject correlation based analyses of fMRI.

    PubMed

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects at corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, making ISC a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments, using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
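    The core ISC computation at a single voxel, average pairwise Pearson correlation between subjects' time series, can be sketched as follows (plain Python; the toolbox adds resampling statistics, windowing, and cluster parallelism on top, and these function names are illustrative).

```python
# Hedged sketch of the basic per-voxel ISC: mean pairwise correlation
# across subjects' time series.
import math
from itertools import combinations

def pearson(x, y):
    """Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def mean_isc(subject_timeseries):
    """Mean correlation over all subject pairs for a single voxel."""
    pairs = list(combinations(subject_timeseries, 2))
    return sum(pearson(a, b) for a, b in pairs) / len(pairs)
```

Repeating this over every voxel (and, for time-window ISC, over sliding windows) is what makes the analysis computation intensive and worth parallelizing.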

  11. Phenology research for natural resource management in the United States.

    PubMed

    Enquist, Carolyn A F; Kellermann, Jherime L; Gerst, Katharine L; Miller-Rushing, Abraham J

    2014-05-01

    Natural resource professionals in the United States recognize that climate-induced changes in phenology can substantially affect resource management. This is reflected in national climate change response plans recently released by major resource agencies. However, on-the-ground managers are often unclear about how to use phenological information to inform their management practices. Until recently, this was at least partially due to the lack of broad-based, standardized phenology data collection across taxa and geographic regions. Such efforts are now underway, albeit in very early stages. Nonetheless, a major hurdle still exists: phenology-linked climate change research has focused more on describing broad ecological changes than on making direct connections to local and regional management concerns. To help researchers better design relevant research for use in conservation and management decision-making processes, we describe phenology-related research topics that facilitate "actionable" science. Examples include research on evolution and phenotypic plasticity related to vulnerability, the demographic consequences of trophic mismatch, the role of invasive species, and building robust ecological forecast models. Such efforts will increase phenology literacy among on-the-ground resource managers and provide information relevant for short- and long-term decision-making, particularly as related to climate response planning and implementing climate-informed monitoring in the context of adaptive management. In sum, we argue that phenological information is a crucial component of the resource management toolbox that facilitates identification and evaluation of strategies that will reduce the vulnerability of natural systems to climate change. Management-savvy researchers can play an important role in reaching this goal.

  12. Air Sensor Toolbox

    EPA Pesticide Factsheets

    Air Sensor Toolbox provides information to citizen scientists, researchers and developers interested in learning more about new lower-cost compact air sensor technologies and tools for measuring air quality.

  13. Demonstration of a Fractured Rock Geophysical Toolbox (FRGT) for Characterization and Monitoring of DNAPL Biodegradation in Fractured Rock Aquifers

    DTIC Science & Technology

    2016-01-01

    USER'S GUIDE: Demonstration of a Fractured Rock Geophysical Toolbox (FRGT) for Characterization and Monitoring of DNAPL Biodegradation in Fractured Rock Aquifers. F.D. Day-Lewis, C.D. Johnson, J.H. Williams, C.L... Keywords: DNAPL biodegradation characterization and monitoring, remediation, fractured rock aquifers.

  14. Explaining Society: An Expanded Toolbox for Social Scientists

    PubMed Central

    Bell, David C.; Atkinson-Schnell, Jodie L.; DiBacco, Aron E.

    2012-01-01

    We propose for social scientists a theoretical toolbox containing a set of motivations that neurobiologists have recently validated. We show how these motivations can be used to create a theory of society recognizably similar to existing stable societies (sustainable, self-reproducing, and largely peaceful). Using this toolbox, we describe society in terms of three institutions: economy (a source of sustainability), government (peace), and the family (reproducibility). Conducting a thought experiment in three parts, we begin with a simple theory with only two motivations. We then create successive theories that systematically add motivations, showing that each element in the toolbox makes its own contribution to explain the workings of a stable society and that the family has a critical role in this process. PMID:23082093

  15. Complete scanpaths analysis toolbox.

    PubMed

    Augustyniak, Piotr; Mikrut, Zbigniew

    2006-01-01

    This paper presents a complete open software environment for the control, data processing and assessment of visual experiments. Visual experiments are widely used in research on human perception physiology, and the results are applicable to visual information-based man-machine interfacing, human-emulated automatic visual systems, and scanpath-based learning of perceptual habits. The toolbox is designed for the Matlab platform and supports an infra-red reflection-based eye tracker in calibration and scanpath analysis modes. Toolbox procedures are organized in three layers: the lower one communicating with the eye tracker's output file, the middle one detecting scanpath events on a physiological background, and the upper one consisting of experiment schedule scripts, statistics and summaries. Several examples of visual experiments carried out using the presented toolbox complete the paper.
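    A common way a middle layer like this detects scanpath events is a dispersion-threshold (I-DT style) fixation detector; the sketch below is a generic version of that classic algorithm, not the toolbox's code, and the thresholds are illustrative.

```python
# Hedged sketch: I-DT style fixation detection. A fixation is a maximal run
# of at least min_length gaze samples whose spatial dispersion stays small.

def _dispersion(window):
    """Sum of x- and y-extents of a window of (x, y) gaze samples."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(gaze, max_dispersion=1.0, min_length=3):
    """Return (start, end) sample-index pairs of detected fixations."""
    fixations = []
    i, n = 0, len(gaze)
    while i <= n - min_length:
        j = i + min_length
        if _dispersion(gaze[i:j]) <= max_dispersion:
            # grow the window while the samples stay tightly clustered
            while j < n and _dispersion(gaze[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j - 1))
            i = j
        else:
            i += 1
    return fixations
```

Samples between detected fixations are then treated as saccades, giving the event sequence from which a scanpath is built.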

  16. Toolbox for Renewable Energy Project Development

    EPA Pesticide Factsheets

    The Toolbox for Renewable Energy Project Development summarizes key project development issues, addresses how to overcome major hurdles, and provides a curated directory of project development resources.

  17. Acrylamide mitigation strategies: critical appraisal of the FoodDrinkEurope toolbox.

    PubMed

    Palermo, M; Gökmen, V; De Meulenaer, B; Ciesarová, Z; Zhang, Y; Pedreschi, F; Fogliano, V

    2016-06-15

    The FoodDrinkEurope Federation recently released the latest version of its Acrylamide Toolbox to support manufacturers in acrylamide reduction activities, indicating possible mitigation strategies. The Toolbox is intended for small and medium-size enterprises with limited R&D resources; however, no comments on the pros and cons of the different measures are provided to advise potential users. Experts in the field are aware that not all the strategies proposed have equal value in terms of efficacy and cost/benefit ratio. This consideration prompted us to provide a qualitative, science-based ranking of the mitigation strategies proposed in the Acrylamide Toolbox, focusing on bakery and fried potato products. Five authors from different geographical areas, each with a publication record on acrylamide mitigation strategies, worked independently to rank the efficacy of the mitigation strategies against three key parameters: (i) reduction rate; (ii) side effects; and (iii) applicability and economic impact. On the basis of their own experience, and considering selected literature of the last ten years, the authors scored each mitigation strategy proposed in the Toolbox on each key parameter. As expected, all strategies selected in the Toolbox turned out to be useful, but not at the same level. The use of the enzyme asparaginase and the selection of low-sugar varieties were considered the best mitigation strategies for bakery and potato products, respectively. In the authors' opinion, most of the other mitigation strategies, although effective, either have relevant side effects on the sensory profile of the products or are not easy to implement in industrial production. The final outcome is a science-based, commented ranking that can enrich the Acrylamide Toolbox, supporting individual manufacturers in taking the best actions to reduce the acrylamide content in their specific production context.

  18. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Knudsen, Per; Benveniste, Jerome

    2017-04-01

    The GOCE User Toolbox (GUT) is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in geodesy, oceanography and solid Earth physics, and the GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Mac. The toolbox is supported by the GUT Algorithm Description and User Guide and the GUT Install Guide, and a set of a-priori data and models are made available as well. Without any doubt, the development of the GOCE User Toolbox has played a major role in paving the way to successful use of GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, added the capability to compute the Simple Bouguer Anomaly (solid Earth). This fall, GUT version 3 was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming at implementing the remaining functionalities to facilitate a wider span of research in the fields of geodesy, oceanography and solid Earth studies. Accordingly, GUT version 3 has:
 - an attractive and easy-to-use graphical user interface (GUI) for the toolbox;
 - further software functionalities, such as support for the use of gradients, anisotropic diffusive filtering, and computation of Bouguer and isostatic gravity anomalies;
 - an associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.

  19. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Knudsen, Per; Benveniste, Jerome; Team Gut

    2016-04-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products.
GUT support applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information
and guidance in how to use the toolbox for a variety of applications. GUT consists of a series of advanced
computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations,
and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT
Install Guide. A set of a-priori data and models is made available as well. Without any doubt, the development
of the GOCE User Toolbox has played a major role in paving the way to successful use of the GOCE data for
oceanography. GUT version 2.2 was released in April 2014; besides some bug fixes, it added the capability to compute the Simple Bouguer Anomaly (Solid Earth). This fall, a new version, GUT v3, was released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming
at implementing the remaining functionalities and facilitating a wider span of research in the fields of Geodesy,
Oceanography and Solid Earth studies.
Accordingly, GUT version 3 provides:
 - an attractive and easy-to-use Graphical User Interface (GUI) for the toolbox,
 - further software functionalities, such as facilitated use of gradients, anisotropic diffusive filtering and computation of Bouguer and isostatic gravity anomalies,
 - an associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.
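The Simple Bouguer Anomaly mentioned above rests on textbook arithmetic: the infinite-slab correction 2πGρh removes the attraction of the rock between station and datum. A minimal sketch (function names are hypothetical; GUT's actual implementation is far more sophisticated):

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
RHO_CRUST = 2670.0     # conventional crustal density, kg/m^3

def simple_bouguer_correction_mgal(height_m, rho=RHO_CRUST):
    """Infinite-slab Bouguer correction 2*pi*G*rho*h, in mGal (1 mGal = 1e-5 m/s^2)."""
    return 2.0 * math.pi * G * rho * height_m / 1e-5

def simple_bouguer_anomaly_mgal(free_air_anomaly_mgal, height_m, rho=RHO_CRUST):
    """Simple Bouguer anomaly: free-air anomaly minus the slab effect."""
    return free_air_anomaly_mgal - simple_bouguer_correction_mgal(height_m, rho)
```

With the standard density of 2670 kg/m³ the correction works out to roughly 0.112 mGal per metre of elevation.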

  20. An image analysis toolbox for high-throughput C. elegans assays

    PubMed Central

    Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.

    2012-01-01

    We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656
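As a rough illustration of the kind of whole-animal scoring such a toolbox automates, here is a toy segmentation step, thresholding plus connected-component labeling with SciPy; the real WormToolbox additionally untangles touching worms and measures per-worm phenotypes:

```python
import numpy as np
from scipy import ndimage

def segment_objects(image, threshold, min_size=3):
    """Toy segmentation: threshold a grayscale image, label connected
    foreground components, and discard tiny specks below min_size pixels."""
    mask = image > threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_size))
    labels, n = ndimage.label(keep)
    return labels, n

# Synthetic "plate" with two bright worm-like blobs on a dark background.
img = np.zeros((20, 20))
img[2:4, 2:12] = 1.0    # worm 1
img[10:12, 5:15] = 1.0  # worm 2
labels, n = segment_objects(img, 0.5)
```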

  1. Using novel acoustic and visual mapping tools to predict the small-scale spatial distribution of live biogenic reef framework in cold-water coral habitats

    NASA Astrophysics Data System (ADS)

    De Clippele, L. H.; Gafeira, J.; Robert, K.; Hennige, S.; Lavaleye, M. S.; Duineveld, G. C. A.; Huvenne, V. A. I.; Roberts, J. M.

    2017-03-01

    Cold-water corals form substantial biogenic habitats on continental shelves and in deep-sea areas with topographic highs, such as banks and seamounts. In the Atlantic, many reef and mound complexes are engineered by Lophelia pertusa, the dominant framework-forming coral. In this study, a variety of mapping approaches were used at a range of scales to map the distribution of both cold-water coral habitats and individual coral colonies at the Mingulay Reef Complex (west Scotland). The new ArcGIS-based British Geological Survey (BGS) seabed mapping toolbox semi-automatically delineated over 500 Lophelia reef `mini-mounds' from bathymetry data with 2-m resolution. The morphometric and acoustic characteristics of the mini-mounds were also automatically quantified and captured using this toolbox. Coral presence data were derived from high-definition remotely operated vehicle (ROV) records and high-resolution microbathymetry collected by a ROV-mounted multibeam echosounder. With a resolution of 0.35 × 0.35 m, the microbathymetry covers 0.6 km2 in the centre of the study area and allowed identification of individual live coral colonies in acoustic data for the first time. Maximum water depth, maximum rugosity, mean rugosity, bathymetric positioning index and maximum current speed were identified as the environmental variables that contributed most to the prediction of live coral presence. These variables were used to create a predictive map of the likelihood of presence of live cold-water coral colonies in the area of the Mingulay Reef Complex covered by the 2-m resolution data set. Predictive maps of live corals across the reef will be especially valuable for future long-term monitoring surveys, including those needed to understand the impacts of global climate change. 
This is the first study using the newly developed BGS seabed mapping toolbox and an ROV-based microbathymetric grid to explore the environmental variables that control coral growth on cold-water coral reefs.
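The presence-likelihood prediction described above can be illustrated with a generic logistic model on a synthetic rugosity predictor; this is a plain-NumPy sketch, not the study's actual modeling pipeline:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Logistic regression by batch gradient descent (bias term included)."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_presence(X, w):
    """Predicted probability of (coral) presence for new predictor values."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

# Synthetic data: presence becomes more likely with higher rugosity.
rng = np.random.default_rng(0)
rugosity = rng.uniform(0, 1, 200)
presence = (rugosity + 0.1 * rng.normal(size=200) > 0.5).astype(float)
w = fit_logistic(rugosity[:, None], presence)
probs = predict_presence(np.array([[0.1], [0.9]]), w)
```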

  2. FISSA: A neuropil decontamination toolbox for calcium imaging signals.

    PubMed

    Keemink, Sander W; Lowe, Scott C; Pakan, Janelle M P; Dylda, Evelyn; van Rossum, Mark C W; Rochefort, Nathalie L

    2018-02-22

In vivo calcium imaging has become a method of choice for imaging neuronal population activity throughout the nervous system. These experiments generate large sequences of images, and their analysis is computationally intensive, typically involving motion correction, image segmentation into regions of interest (ROIs), and extraction of fluorescence traces from each ROI. Out-of-focus fluorescence from the surrounding neuropil and from other cells can strongly contaminate the signal assigned to a given ROI. In this study, we introduce the FISSA toolbox (Fast Image Signal Separation Analysis) for neuropil decontamination. Given pre-defined ROIs, the FISSA toolbox automatically extracts the surrounding local neuropil and performs blind-source separation with non-negative matrix factorization. Using both simulated and in vivo data, we show that this toolbox performs similarly to or better than existing published methods. FISSA requires little RAM and allows fast processing of large datasets even on a standard laptop. The FISSA toolbox is available in Python, with an option for MATLAB-format outputs, and can easily be integrated into existing workflows. It is available from GitHub and the standard Python repositories.
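The core separation step, non-negative matrix factorization of a stack of traces (ROI plus surrounding neuropil), can be sketched with Lee-Seung multiplicative updates; FISSA's actual pipeline differs in detail, so treat this only as an illustration of the technique:

```python
import numpy as np

def nmf(V, rank, n_iter=500, eps=1e-9, seed=0):
    """Non-negative matrix factorization V ~ W @ H via Lee-Seung
    multiplicative updates, minimising squared reconstruction error."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Mixed traces: a transient "cell" signal plus a contaminating "neuropil" signal.
t = np.linspace(0, 1, 100)
cell = np.exp(-((t - 0.3) ** 2) / 0.001)
neuropil = 0.5 + 0.5 * np.sin(2 * np.pi * t) ** 2
V = np.vstack([cell + 0.3 * neuropil, neuropil])  # observed ROI + surround
W, H = nmf(V, rank=2)
```

Because the observed matrix here is exactly non-negative rank 2, the updates recover it almost perfectly.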

  3. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. A preclinical cognitive test battery to parallel the National Institute of Health Toolbox in humans: bridging the translational gap.

    PubMed

    Snigdha, Shikha; Milgram, Norton W; Willis, Sherry L; Albert, Marylin; Weintraub, S; Fortin, Norbert J; Cotman, Carl W

    2013-07-01

    A major goal of animal research is to identify interventions that can promote successful aging and delay or reverse age-related cognitive decline in humans. Recent advances in standardizing cognitive assessment tools for humans have the potential to bring preclinical work closer to human research in aging and Alzheimer's disease. The National Institute of Health (NIH) has led an initiative to develop a comprehensive Toolbox for Neurologic Behavioral Function (NIH Toolbox) to evaluate cognitive, motor, sensory and emotional function for use in epidemiologic and clinical studies spanning 3 to 85 years of age. This paper aims to analyze the strengths and limitations of animal behavioral tests that can be used to parallel those in the NIH Toolbox. We conclude that there are several paradigms available to define a preclinical battery that parallels the NIH Toolbox. We also suggest areas in which new tests may benefit the development of a comprehensive preclinical test battery for assessment of cognitive function in animal models of aging and Alzheimer's disease. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. A preclinical cognitive test battery to parallel the National Institute of Health Toolbox in humans: bridging the translational gap

    PubMed Central

    Snigdha, Shikha; Milgram, Norton W.; Willis, Sherry L.; Albert, Marylin; Weintraub, S.; Fortin, Norbert J.; Cotman, Carl W.

    2013-01-01

    A major goal of animal research is to identify interventions that can promote successful aging and delay or reverse age-related cognitive decline in humans. Recent advances in standardizing cognitive assessment tools for humans have the potential to bring preclinical work closer to human research in aging and Alzheimer’s disease. The National Institute of Health (NIH) has led an initiative to develop a comprehensive Toolbox for Neurologic Behavioral Function (NIH Toolbox) to evaluate cognitive, motor, sensory and emotional function for use in epidemiologic and clinical studies spanning 3 to 85 years of age. This paper aims to analyze the strengths and limitations of animal behavioral tests that can be used to parallel those in the NIH Toolbox. We conclude that there are several paradigms available to define a preclinical battery that parallels the NIH Toolbox. We also suggest areas in which new tests may benefit the development of a comprehensive preclinical test battery for assessment of cognitive function in animal models of aging and Alzheimer’s disease. PMID:23434040

  6. An Autonomous Star Identification Algorithm Based on One-Dimensional Vector Pattern for Star Sensors

    PubMed Central

    Luo, Liyan; Xu, Luping; Zhang, Hua

    2015-01-01

    In order to enhance the robustness and accelerate the recognition speed of star identification, an autonomous star identification algorithm for star sensors is proposed based on the one-dimensional vector pattern (one_DVP). In the proposed algorithm, the space geometry information of the observed stars is used to form the one-dimensional vector pattern of the observed star. The one-dimensional vector pattern of the same observed star remains unchanged when the stellar image rotates, so the problem of star identification is simplified as the comparison of the two feature vectors. The one-dimensional vector pattern is adopted to build the feature vector of the star pattern, which makes it possible to identify the observed stars robustly. The characteristics of the feature vector and the proposed search strategy for the matching pattern make it possible to achieve the recognition result as quickly as possible. The simulation results demonstrate that the proposed algorithm can effectively accelerate the star identification. Moreover, the recognition accuracy and robustness by the proposed algorithm are better than those by the pyramid algorithm, the modified grid algorithm, and the LPT algorithm. The theoretical analysis and experimental results show that the proposed algorithm outperforms the other three star identification algorithms. PMID:26198233

  7. An Autonomous Star Identification Algorithm Based on One-Dimensional Vector Pattern for Star Sensors.

    PubMed

    Luo, Liyan; Xu, Luping; Zhang, Hua

    2015-07-07

    In order to enhance the robustness and accelerate the recognition speed of star identification, an autonomous star identification algorithm for star sensors is proposed based on the one-dimensional vector pattern (one_DVP). In the proposed algorithm, the space geometry information of the observed stars is used to form the one-dimensional vector pattern of the observed star. The one-dimensional vector pattern of the same observed star remains unchanged when the stellar image rotates, so the problem of star identification is simplified as the comparison of the two feature vectors. The one-dimensional vector pattern is adopted to build the feature vector of the star pattern, which makes it possible to identify the observed stars robustly. The characteristics of the feature vector and the proposed search strategy for the matching pattern make it possible to achieve the recognition result as quickly as possible. The simulation results demonstrate that the proposed algorithm can effectively accelerate the star identification. Moreover, the recognition accuracy and robustness by the proposed algorithm are better than those by the pyramid algorithm, the modified grid algorithm, and the LPT algorithm. The theoretical analysis and experimental results show that the proposed algorithm outperforms the other three star identification algorithms.
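The key idea, a feature that is invariant to image rotation because it is built from inter-star distances, can be shown in miniature. Planar distances stand in for the angular distances used with real star sensors, and the paper's one_DVP construction itself is more elaborate:

```python
import numpy as np

def distance_feature(points, ref_idx):
    """Rotation-invariant 1-D feature for one observed star: the sorted
    distances from the reference star to all other stars in the image."""
    d = np.linalg.norm(points - points[ref_idx], axis=1)
    return np.sort(d[d > 0])

def rotate(points, theta):
    """Rotate 2-D points about the origin by angle theta."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return points @ R.T

rng = np.random.default_rng(1)
stars = rng.uniform(-1, 1, (8, 2))   # star centroids in the image plane
f0 = distance_feature(stars, 0)
f1 = distance_feature(rotate(stars, 0.7), 0)
```

Because rotation preserves distances, the feature vector is unchanged, so matching reduces to comparing two vectors, exactly the simplification the abstract describes.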

  8. PV_LIB Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-11

While an organized source of reference information on PV performance modeling is certainly valuable, there is nothing to match the availability of actual examples of modeling algorithms being used in practice. To meet this need, Sandia has developed a PV performance modeling toolbox (PV_LIB) for Matlab. It contains a set of well-documented, open source functions and example scripts showing the functions being used in practical examples. This toolbox is meant to help make the multi-step process of modeling a PV system more transparent and provide the means for model users to validate and understand the models they use and/or develop. It is fully integrated into Matlab's help and documentation utilities. The PV_LIB Toolbox provides more than 30 functions that are sorted into four categories.
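As an example of the kind of building block such a library collects, here is a first-order module power model. This is illustrative only: it is not the PV_LIB API, and PV_LIB's functions implement far more detailed models:

```python
def pv_dc_power(irradiance_w_m2, cell_temp_c,
                p_stc_w=250.0, gamma_per_c=-0.004):
    """First-order PV module DC power: output scales linearly with
    plane-of-array irradiance and derates with cell temperature above
    the 25 C standard test condition."""
    return (p_stc_w * irradiance_w_m2 / 1000.0
            * (1.0 + gamma_per_c * (cell_temp_c - 25.0)))

p = pv_dc_power(800.0, 45.0)   # 800 W/m2 on a hot cell
```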

  9. A GIS tool for two-dimensional glacier-terminus change tracking

    NASA Astrophysics Data System (ADS)

    Urbanski, Jacek Andrzej

    2018-02-01

    This paper presents a Glacier Termini Tracking (GTT) toolbox for the two-dimensional analysis of glacier-terminus position changes. The input consists of a vector layer with several termini lines relating to the same glacier at different times. The output layers allow analyses to be conducted of glacier-terminus retreats, changes in retreats over time and along the ice face, and glacier-terminus fluctuations over time. The application of three tools from the toolbox is demonstrated via the analysis of eight glacier-terminus retreats and fluctuations at the Hornsund fjord in south Svalbard. It is proposed that this toolbox may also be useful in the study of other line features that change over time, like coastlines and rivers. The toolbox has been coded in Python and runs via ArcGIS.
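A crude version of measuring terminus change between two digitized termini lines can be sketched as a nearest-vertex distance averaged along the front. The GTT toolbox performs a much richer two-dimensional analysis; `mean_retreat` is a hypothetical name:

```python
import numpy as np

def mean_retreat(line_a, line_b):
    """Crude terminus-change metric: for each vertex of line_a, the
    distance to the nearest vertex of line_b, averaged along the front."""
    d = np.linalg.norm(line_a[:, None, :] - line_b[None, :, :], axis=2)
    return d.min(axis=1).mean()

# Two synthetic termini: the second has retreated 100 m up-fjord.
x = np.linspace(0, 1000, 50)
t2005 = np.column_stack([x, np.zeros_like(x)])
t2015 = np.column_stack([x, np.full_like(x, 100.0)])
retreat = mean_retreat(t2005, t2015)
```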

  10. The GMT/MATLAB Toolbox

    NASA Astrophysics Data System (ADS)

    Wessel, Paul; Luis, Joaquim F.

    2017-02-01

    The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.

  11. Finite-time robust passive control for a class of switched reaction-diffusion stochastic complex dynamical networks with coupling delays and impulsive control

    NASA Astrophysics Data System (ADS)

    Syed Ali, M.; Yogambigai, J.; Kwon, O. M.

    2018-03-01

Finite-time boundedness and finite-time passivity for a class of switched stochastic complex dynamical networks (CDNs) with coupling delays, parameter uncertainties, a reaction-diffusion term and impulsive control are studied. Novel finite-time synchronisation criteria are derived based on passivity theory. This paper proposes a CDN consisting of N linearly and diffusively coupled identical reaction-diffusion neural networks. By constructing a suitable Lyapunov-Krasovskii functional and utilising Jensen's inequality and Wirtinger's inequality, new finite-time passivity criteria for the networks are established in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Finally, two interesting numerical examples are given to show the effectiveness of the theoretical results.
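Outside MATLAB, an LMI-type certificate of this flavour can also be checked numerically. A generic SciPy sketch (not the paper's finite-time passivity LMIs): for a Hurwitz A, solving the Lyapunov equation A^T P + P A = -Q with Q > 0 and verifying that P is positive definite certifies feasibility of the LMI pair A^T P + P A < 0, P > 0:

```python
import numpy as np
from scipy import linalg

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # Hurwitz example matrix (eigenvalues -1, -2)
Q = np.eye(2)

# solve_continuous_lyapunov(a, q) solves a X + X a^T = q, so passing A.T
# and -Q yields the P with A^T P + P A = -Q.
P = linalg.solve_continuous_lyapunov(A.T, -Q)
eigs = np.linalg.eigvalsh((P + P.T) / 2)  # symmetrise before the eig check
```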

  12. Using Compilers to Enhance Cryptographic Product Development

    NASA Astrophysics Data System (ADS)

    Bangerter, E.; Barbosa, M.; Bernstein, D.; Damgård, I.; Page, D.; Pagter, J. I.; Sadeghi, A.-R.; Sovio, S.

    Developing high-quality software is hard in the general case, and it is significantly more challenging in the case of cryptographic software. A high degree of new skill and understanding must be learnt and applied without error to avoid vulnerability and inefficiency. This is often beyond the financial, manpower or intellectual resources avail-able. In this paper we present the motivation for the European funded CACE (Computer Aided Cryptography Engineering) project The main objective of CACE is to provide engineers (with limited or no expertise in cryptography) with a toolbox that allows them to generate robust and efficient implementations of cryptographic primitives. We also present some preliminary results already obtained in the early stages of this project, and discuss the relevance of the project as perceived by stakeholders in the mobile device arena.

  13. Real time wind farm emulation using SimWindFarm toolbox

    NASA Astrophysics Data System (ADS)

    Topor, Marcel

    2016-06-01

    This paper presents a wind farm emulation solution using an open source Matlab/Simulink toolbox and the National Instruments cRIO platform. This work is based on the Aeolus SimWindFarm (SWF) toolbox models developed at Aalborg university, Denmark. Using the Matlab Simulink models developed in SWF, the modeling code can be exported to a real time model using the NI Veristand model framework and the resulting code is integrated as a hardware in the loop control on the NI 9068 platform.

  14. Efficient estimation and large-scale evaluation of lateral chromatic aberration for digital image forensics

    NASA Astrophysics Data System (ADS)

    Gloe, Thomas; Borowka, Karsten; Winkler, Antje

    2010-01-01

The analysis of lateral chromatic aberration forms another ingredient for a well-equipped toolbox of an image forensic investigator. Previous work proposed its application to forgery detection [1] and image source identification [2]. This paper takes a closer look at the current state-of-the-art method for analysing lateral chromatic aberration and presents a new approach to estimate lateral chromatic aberration in a runtime-efficient way. Employing a set of 11 different camera models comprising 43 devices, the characteristics of lateral chromatic aberration are investigated at a large scale. The reported results point to general difficulties that have to be considered in real-world investigations.
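Lateral chromatic aberration is commonly modelled as a radial scaling of one colour channel relative to another about the optical centre, r' = (1 + a) r. Fitting the scale factor by least squares from matched points is then straightforward; this is a simplified sketch, not the paper's estimator:

```python
import numpy as np

def fit_radial_scaling(pts_ref, pts_shifted, centre):
    """Least-squares fit of 'a' in the radial model r' = (1 + a) r,
    given matched point coordinates from two colour channels.
    Minimising sum ||r' - (1+a) r||^2 gives a = (sum r.r') / (sum r.r) - 1."""
    r = pts_ref - centre
    rp = pts_shifted - centre
    return float(np.sum(r * rp) / np.sum(r * r)) - 1.0

rng = np.random.default_rng(2)
centre = np.array([0.0, 0.0])
green = rng.uniform(-500, 500, (100, 2))   # matched points, green channel
red = 1.002 * green                        # simulated 0.2% radial expansion
a = fit_radial_scaling(green, red, centre)
```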

  15. Improving student comprehension of the interconnectivity of the hydrologic cycle with a novel 'hydrology toolbox', integrated watershed model, and companion textbook

    NASA Astrophysics Data System (ADS)

    Huning, L. S.; Margulis, S. A.

    2013-12-01

    Concepts in introductory hydrology courses are often taught in the context of process-based modeling that ultimately is integrated into a watershed model. In an effort to reduce the learning curve associated with applying hydrologic concepts to real-world applications, we developed and incorporated a 'hydrology toolbox' that complements a new, companion textbook into introductory undergraduate hydrology courses. The hydrology toolbox contains the basic building blocks (functions coded in MATLAB) for an integrated spatially-distributed watershed model that makes hydrologic topics (e.g. precipitation, snow, radiation, evaporation, unsaturated flow, infiltration, groundwater, and runoff) more user-friendly and accessible for students. The toolbox functions can be used in a modular format so that students can study individual hydrologic processes and become familiar with the hydrology toolbox. This approach allows such courses to emphasize understanding and application of hydrologic concepts rather than computer coding or programming. While topics in introductory hydrology courses are often introduced and taught independently or semi-independently, they are inherently interconnected. These toolbox functions are therefore linked together at the end of the course to reinforce a holistic understanding of how these hydrologic processes are measured, interconnected, and modeled. They are integrated into a spatially-distributed watershed model or numerical laboratory where students can explore a range of topics such as rainfall-runoff modeling, urbanization, deforestation, watershed response to changes in parameters or forcings, etc. Model output can readily be visualized and analyzed by students to understand watershed response in a real river basin or a simple 'toy' basin. These tools complement the textbook, each of which has been well received by students in multiple hydrology courses with various disciplinary backgrounds. 
The same governing equations that students have studied in the textbook and used in the toolbox have been encapsulated in the watershed model. Therefore, the combination of the hydrology toolbox, integrated watershed model, and textbook tends to eliminate the potential disconnect between process-based modeling and an 'off-the-shelf' watershed model.
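A toolbox function in this modular spirit might look like the following single-bucket linear reservoir (names are hypothetical, not the actual course toolbox API):

```python
def linear_reservoir(precip_mm, k=0.3, s0_mm=0.0):
    """Minimal bucket model: each time step, runoff is a fixed fraction k
    of current storage and the remainder is carried over.
    Returns the runoff series in mm per step."""
    s, runoff = s0_mm, []
    for p in precip_mm:
        s += p          # add precipitation to storage
        q = k * s       # release a fraction of storage as runoff
        s -= q
        runoff.append(q)
    return runoff

q = linear_reservoir([10.0, 0.0, 0.0, 0.0])  # response to a single pulse
```

Functions like this can be studied in isolation, then chained into a spatially distributed watershed model, which is exactly the pedagogical progression described above.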

  16. Sentinel-2 data exploitation with ESA's Sentinel-2 Toolbox

    NASA Astrophysics Data System (ADS)

    Gascon, Ferran; Ramoino, Fabrizzio; deanos, Yves-louis

    2017-04-01

The Sentinel-2 Toolbox is a project kicked off by ESA in early 2014, under the umbrella of the ESA SEOM programme, with the aim of providing a tool for visualizing, analysing, and processing Sentinel-2 datasets. The toolbox is an extension of the SeNtinel Application Platform (SNAP), a project resulting from the effort of the developers of the Sentinel-1, Sentinel-2 and Sentinel-3 toolboxes to provide a single common application framework suited for the mixed exploitation of SAR, high-resolution optical and medium-resolution optical datasets. All three development teams collaborate to drive the evolution of the common SNAP framework in a developer forum. In this triplet, the Sentinel-2 toolbox is dedicated to enhancing SNAP support for high-resolution optical imagery. It is a multi-mission toolbox, already providing support for Sentinel-2, RapidEye, Deimos, and SPOT 1 to SPOT 5 datasets. In terms of processing algorithms, SNAP provides tools specific to the Sentinel-2 mission:
 • An atmospheric correction module, Sen2Cor, integrated into the toolbox, which provides scene classification, atmospheric correction, and cirrus detection and correction; the output L2A products can be opened seamlessly in the toolbox
 • A multitemporal synthesis processor (L3)
 • A biophysical products processor (L2B)
 • A water processor
 • A deforestation detector
 • OTB tools integration
 • SNAP Engine for cloud exploitation
along with a set of more generic tools for high-resolution optical data exploitation. Together with the generic functionalities of SNAP, this provides an ideal environment for designing multi-mission processing chains and producing value-added products from raw datasets. The uses of SNAP are manifold: the desktop tool provides a rich application for interactive visualization, analysis and processing of data, but all tools available from SNAP can also be accessed via the command line through the Graph Processing Framework (GPT), the kernel of the SNAP processing engine.
This makes it a perfect candidate for driving the processing of data on servers for bulk processing.

  17. Automated identification of stream-channel geomorphic features from high‑resolution digital elevation models in West Tennessee watersheds

    USGS Publications Warehouse

    Cartwright, Jennifer M.; Diehl, Timothy H.

    2017-01-17

    High-resolution digital elevation models (DEMs) derived from light detection and ranging (lidar) enable investigations of stream-channel geomorphology with much greater precision than previously possible. The U.S. Geological Survey has developed the DEM Geomorphology Toolbox, containing seven tools to automate the identification of sites of geomorphic instability that may represent sediment sources and sinks in stream-channel networks. These tools can be used to modify input DEMs on the basis of known locations of stormwater infrastructure, derive flow networks at user-specified resolutions, and identify possible sites of geomorphic instability including steep banks, abrupt changes in channel slope, or areas of rough terrain. Field verification of tool outputs identified several tool limitations but also demonstrated their overall usefulness in highlighting likely sediment sources and sinks within channel networks. In particular, spatial clusters of outputs from multiple tools can be used to prioritize field efforts to assess and restore eroding stream reaches.
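A simplified stand-in for the steep-bank idea, slope computed from the DEM gradient and thresholded in degrees, can be sketched as follows (the actual DEM Geomorphology Toolbox tools are ArcGIS-based and more involved):

```python
import numpy as np

def steep_mask(dem, cell_size, slope_threshold_deg):
    """Flag potentially unstable cells: slope from the DEM gradient
    exceeding a threshold, in degrees."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    return slope_deg > slope_threshold_deg

# Synthetic DEM: a flat terrace dropping 2 m across one 1-m cell (a steep bank).
dem = np.zeros((5, 8))
dem[:, 4:] = -2.0
mask = steep_mask(dem, cell_size=1.0, slope_threshold_deg=30.0)
```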

  18. Label-Free Discovery Array Platform for the Characterization of Glycan Binding Proteins and Glycoproteins.

    PubMed

    Gray, Christopher J; Sánchez-Ruíz, Antonio; Šardzíková, Ivana; Ahmed, Yassir A; Miller, Rebecca L; Reyes Martinez, Juana E; Pallister, Edward; Huang, Kun; Both, Peter; Hartmann, Mirja; Roberts, Hannah N; Šardzík, Robert; Mandal, Santanu; Turnbull, Jerry E; Eyers, Claire E; Flitsch, Sabine L

    2017-04-18

    The identification of carbohydrate-protein interactions is central to our understanding of the roles of cell-surface carbohydrates (the glycocalyx), fundamental for cell-recognition events. Therefore, there is a need for fast high-throughput biochemical tools to capture the complexity of these biological interactions. Here, we describe a rapid method for qualitative label-free detection of carbohydrate-protein interactions on arrays of simple synthetic glycans, more complex natural glycosaminoglycans (GAG), and lectins/carbohydrate binding proteins using matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry. The platform can unequivocally identify proteins that are captured from either purified or complex sample mixtures, including biofluids. Identification of proteins bound to the functionalized array is achieved by analyzing either the intact protein mass or, after on-chip proteolytic digestion, the peptide mass fingerprint and/or tandem mass spectrometry of selected peptides, which can yield highly diagnostic sequence information. The platform described here should be a valuable addition to the limited analytical toolbox that is currently available for glycomics.
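Peptide mass fingerprinting rests on simple mass bookkeeping: a peptide's monoisotopic mass is the sum of its residue masses plus one water for the termini. A sketch with a small subset of residues (a real search also handles modifications, adducts and charge states):

```python
# Monoisotopic residue masses (Da) for a few common amino acids.
RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "L": 113.08406, "K": 128.09496, "R": 156.10111,
}
WATER = 18.01056  # mass of H2O added for the peptide termini

def peptide_monoisotopic_mass(sequence):
    """Neutral monoisotopic mass [M] of a peptide: residue masses + water."""
    return sum(RESIDUE_MASS[aa] for aa in sequence) + WATER

m = peptide_monoisotopic_mass("GG")  # diglycine
```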

  19. Broadview Radar Altimetry Toolbox

    NASA Astrophysics Data System (ADS)

    Garcia-Mondejar, Albert; Escolà, Roger; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Naeije, Marc; Ambrózio, Américo; Restano, Marco; Benveniste, Jérôme

    2017-04-01

The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox), which can read all previous and current altimetry missions' data, now incorporates the capability to read the upcoming Sentinel3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support the users of the future Sentinel3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the frontend for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT, involving combinations of data fields that the user can save for later reuse, or using the already embedded formulas, which include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a thorough introduction to altimetry, shows its applications in different fields such as Oceanography, Cryosphere, Geodesy and Hydrology, among others. Also included are "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel3 SAR Altimetry Toolbox shall benefit from the current BRAT version.
The Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. While developing the new toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry, in order to train the users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open source framework, contributions from users who have developed their own functions are welcome. The first release of the new Radar Altimetry Toolbox was published in September 2015. It incorporates the capability to read S3 products as well as the new CryoSat2 Baseline C. The second release of the Toolbox, published in October 2016, has a new graphical user interface and other visualisation improvements. The third release (January 2017) includes more features and resolves issues from the previous versions.
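The "standard oceanographic altimetry formulas" mentioned above boil down to simple along-track bookkeeping; a minimal sketch with a hypothetical function name and toy numbers:

```python
def sea_level_anomaly(altitude_m, range_m, corrections_m, mean_sea_surface_m):
    """Along-track altimetry bookkeeping: sea surface height (SSH) is
    satellite altitude minus the corrected range; the sea level anomaly
    (SLA) is SSH minus a mean sea surface."""
    ssh = altitude_m - (range_m + sum(corrections_m))
    return ssh - mean_sea_surface_m

# Toy numbers (metres): ~800 km orbit, small geophysical range corrections.
sla = sea_level_anomaly(800_000.0, 799_955.0, [2.3, 0.1, 0.05], 42.5)
```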

  20. Smoke Ready Toolbox for Wildfires

    EPA Pesticide Factsheets

    This site provides an online Smoke Ready Toolbox for Wildfires, which lists resources and tools that provide information on health impacts from smoke exposure, current fire conditions and forecasts and strategies to reduce exposure to smoke.

  1. Developing a congestion mitigation toolbox.

    DOT National Transportation Integrated Search

    2011-09-30

    Researchers created "A Michigan Toolbox for Mitigating Traffic Congestion" to be a useful desk reference for practitioners and an educational tool for elected officials acting through public policy boards to better understand the development, planning,...

  2. Grid Integrated Distributed PV (GridPV) Version 2.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reno, Matthew J.; Coogan, Kyle

    2014-12-01

    This manual provides the documentation of the MATLAB toolbox of functions for using OpenDSS to simulate the impact of solar energy on the distribution system. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions. Each function in the toolbox is documented with the function use syntax, full description, function input list, function output list, example use, and example output.

  3. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to interfacing the standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) improved graphical display of model results; 2) improved error analysis and reporting; 3) an increase in the default maximum model mesh size from 301 to 501 nodes; and 4) the ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  4. Neural Parallel Engine: A toolbox for massively parallel neural signal processing.

    PubMed

    Tam, Wing-Kin; Yang, Zhi

    2018-05-01

    Large-scale neural recordings provide detailed information on neuronal activities and can help elucidate the underlying neural mechanisms of the brain. However, the computational burden is also formidable when we try to process the huge data stream generated by such recordings. In this study, we report the development of Neural Parallel Engine (NPE), a toolbox for massively parallel neural signal processing on graphical processing units (GPUs). It offers a selection of the most commonly used routines in neural signal processing such as spike detection and spike sorting, including advanced algorithms such as exponential-component-power-component (EC-PC) spike detection and binary pursuit spike sorting. We also propose a new method for detecting peaks in parallel through a parallel compact operation. Our toolbox is able to offer a 5× to 110× speedup compared with its CPU counterparts depending on the algorithm. A user-friendly MATLAB interface is provided to allow easy integration of the toolbox into existing workflows. Previous efforts on GPU neural signal processing focused on only a few rudimentary algorithms, were not well-optimized and often did not provide a user-friendly programming interface that fits into existing workflows. There is a strong need for a comprehensive toolbox for massively parallel neural signal processing. A new toolbox for massively parallel neural signal processing has been created. It can offer significant speedup in processing signals from large-scale recordings up to thousands of channels. Copyright © 2018 Elsevier B.V. All rights reserved.
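    The peak-detection step can be illustrated with a serial sketch (an illustration of the idea only, not the EC-PC algorithm or the toolbox's GPU kernel): a sample is flagged as a peak when it exceeds a threshold and its immediate neighbours.

```python
def detect_peaks(signal, threshold):
    """Return indices of local maxima exceeding a threshold.

    A serial sketch of the peak-detection step; a GPU implementation
    would evaluate the same comparison for all samples in parallel.
    """
    peaks = []
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1]):
            peaks.append(i)
    return peaks

# Example: a noisy trace with two clear spikes
trace = [0.1, 0.2, 3.5, 0.3, 0.1, -0.2, 4.1, 0.2, 0.0]
print(detect_peaks(trace, threshold=1.0))  # -> [2, 6]
```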

  5. Toolbox for Research, or how to facilitate a central data management in small-scale research projects.

    PubMed

    Bialke, Martin; Rau, Henriette; Thamm, Oliver C; Schuldt, Ronny; Penndorf, Peter; Blumentritt, Arne; Gött, Robert; Piegsa, Jens; Bahls, Thomas; Hoffmann, Wolfgang

    2018-01-25

    In most research projects, budget, staff and IT infrastructure are limiting resources. Especially for small-scale registries and cohort studies, professional IT support and commercial electronic data capture systems are too expensive. Consequently, these projects use simple local approaches (e.g. Excel) for data capture instead of a central data management solution including web-based data capture and proper research databases. This leads to manual processes to merge, analyze and, if possible, pseudonymize research data from different study sites. To support multi-site data capture, storage and analyses in small-scale research projects, the corresponding requirements were analyzed within the MOSAIC project. Based on the identified requirements, the Toolbox for Research was developed as a flexible software solution for various research scenarios. Additionally, the Toolbox facilitates integration of research data as well as metadata by performing the necessary procedures automatically. Also, Toolbox modules allow the integration of device data. Moreover, the separation of personally identifiable information and medical data, by using only pseudonyms for storing medical data, ensures compliance with data protection regulations. This pseudonymized data can then be exported in SPSS format in order to enable scientists to prepare reports and analyses. The Toolbox for Research was successfully piloted in the German Burn Registry in 2016, facilitating the documentation of 4350 burn cases at 54 study sites. The Toolbox for Research can be downloaded free of charge from the project website and installed automatically thanks to the use of Docker technology.
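    The pseudonymization idea can be sketched in a few lines (a hypothetical illustration, not the Toolbox's actual mechanism: the key, store names and field names below are all assumptions). Medical data is stored only under a pseudonym, while identifying data lives in a separate store.

```python
import hmac
import hashlib

# Hypothetical sketch: a keyed hash stands in for the pseudonymization
# service; in a real deployment the key is held by a trusted third party.
SECRET_KEY = b"trusted-third-party-key"  # assumption, for illustration only

def pseudonym(patient_id: str) -> str:
    """Derive a stable pseudonym from a patient identifier."""
    digest = hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256)
    return "PSN-" + digest.hexdigest()[:12].upper()

identity_store = {}  # personally identifiable information, kept separately
medical_store = {}   # research data, stored under pseudonyms only

def register_case(patient_id, name, diagnosis):
    psn = pseudonym(patient_id)
    identity_store[psn] = {"patient_id": patient_id, "name": name}
    medical_store[psn] = {"diagnosis": diagnosis}
    return psn

psn = register_case("P001", "Jane Doe", "second-degree burn")
# The medical store never contains identifying fields:
print(psn in medical_store and "name" not in medical_store[psn])  # -> True
```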

  6. Correlation techniques to determine model form in robust nonlinear system realization/identification

    NASA Technical Reports Server (NTRS)

    Stry, Greselda I.; Mook, D. Joseph

    1991-01-01

    The fundamental challenge in identification of nonlinear dynamic systems is determining the appropriate form of the model. A robust technique is presented which essentially eliminates this problem for many applications. The technique is based on the Minimum Model Error (MME) optimal estimation approach. A detailed literature review is included in which fundamental differences between the current approach and previous work are described. The most significant feature is the ability to identify nonlinear dynamic systems without prior assumption regarding the form of the nonlinearities, in contrast to existing nonlinear identification approaches which usually require detailed assumptions of the nonlinearities. Model form is determined via statistical correlation of the MME optimal state estimates with the MME optimal model error estimates. The example illustrations indicate that the method is robust with respect to prior ignorance of the model, and with respect to measurement noise, measurement frequency, and measurement record length.
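    The correlation step can be illustrated with a toy sketch (an illustration of the idea only, not the MME estimator itself): candidate basis functions of the estimated state are correlated with the estimated model error, and a high correlation flags which nonlinearity belongs in the model.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    vy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (vx * vy)

# Synthetic example: the "unknown" model error is cubic in the state
states = [x / 10 for x in range(-20, 21)]
model_error = [0.5 * x ** 3 for x in states]

# Candidate nonlinearities; the cubic term correlates most strongly
candidates = {
    "x": states,
    "x^2": [x ** 2 for x in states],
    "x^3": [x ** 3 for x in states],
}
best = max(candidates, key=lambda k: abs(pearson(candidates[k], model_error)))
print(best)  # -> x^3
```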

  7. A Molecular Toolbox for Rapid Generation of Viral Vectors to Up- or Down-Regulate Neuronal Gene Expression in vivo

    PubMed Central

    White, Melanie D.; Milne, Ruth V. J.; Nolan, Matthew F.

    2011-01-01

    We introduce a molecular toolbox for manipulation of neuronal gene expression in vivo. The toolbox includes promoters, ion channels, optogenetic tools, fluorescent proteins, and intronic artificial microRNAs. The components are easily assembled into adeno-associated virus (AAV) or lentivirus vectors using recombination cloning. We demonstrate assembly of toolbox components into lentivirus and AAV vectors and use these vectors for in vivo expression of inwardly rectifying potassium channels (Kir2.1, Kir3.1, and Kir3.2) and an artificial microRNA targeted against the ion channel HCN1 (HCN1 miRNA). We show that AAV assembled to express HCN1 miRNA produces efficacious and specific in vivo knockdown of HCN1 channels. Comparison of in vivo viral transduction using HCN1 miRNA with mice containing a germ line deletion of HCN1 reveals similar physiological phenotypes in cerebellar Purkinje cells. The easy assembly and re-usability of the toolbox components, together with the ability to up- or down-regulate neuronal gene expression in vivo, may be useful for applications in many areas of neuroscience. PMID:21772812

  8. Fecal pollution source tracking toolbox for identification, evaluation and characterization of fecal contamination in receiving urban surface waters and groundwater.

    PubMed

    Tran, Ngoc Han; Gin, Karina Yew-Hoong; Ngo, Huu Hao

    2015-12-15

    The quality of surface waters/groundwater of a geographical region can be affected by anthropogenic activities, land use patterns and fecal pollution sources from humans and animals. Therefore, the development of an efficient fecal pollution source tracking toolbox for identifying the origin of the fecal pollution sources in surface waters/groundwater is especially helpful for improving management efforts and remediation actions of water resources in a more cost-effective and efficient manner. This review summarizes the updated knowledge on the use of fecal pollution source tracking markers for detecting, evaluating and characterizing fecal pollution sources in receiving surface waters and groundwater. The suitability of using chemical markers (i.e. fecal sterols, fluorescent whitening agents, pharmaceuticals and personal care products, and artificial sweeteners) and/or microbial markers (e.g. F+RNA coliphages, enteric viruses, and host-specific anaerobic bacterial 16S rDNA genetic markers) for tracking fecal pollution sources in receiving water bodies is discussed. In addition, this review also provides a comprehensive approach, which is based on the detection ratios (DR), detection frequencies (DF), and fate of potential microbial and chemical markers. DR and DF are considered as the key criteria for selecting appropriate markers for identifying and evaluating the impacts of fecal contamination in surface waters/groundwater. Copyright © 2015 Elsevier B.V. All rights reserved.
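    The screening criteria can be sketched numerically. Detection frequency (DF) is the share of samples in which a marker is found at or above its detection limit; one plausible reading of the detection ratio (DR) — the review's exact definition may differ, so treat this as an assumption — compares a marker's DF at impacted sites with its DF at reference sites. All data below is hypothetical.

```python
def detection_frequency(concentrations, detection_limit):
    """DF: fraction of samples with the marker at or above the detection limit."""
    hits = sum(1 for c in concentrations if c >= detection_limit)
    return hits / len(concentrations)

def detection_ratio(df_impacted, df_reference):
    """One plausible form of DR (assumption): DF at impacted sites
    over DF at reference sites; higher values suggest a stronger
    association with the suspected fecal pollution source."""
    return df_impacted / df_reference if df_reference else float("inf")

# Hypothetical monitoring data for a sewage-associated chemical marker (ng/L)
impacted = [12.0, 0.0, 8.5, 3.1, 0.0, 0.0, 25.4, 1.2]  # urban sites
reference = [0.0, 0.0, 1.3, 0.0]                        # background sites

df_i = detection_frequency(impacted, detection_limit=1.0)
df_r = detection_frequency(reference, detection_limit=1.0)
print(f"DF impacted = {df_i:.3f}, DF reference = {df_r:.3f}, "
      f"DR = {detection_ratio(df_i, df_r):.1f}")
```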

  9. A Michigan toolbox for mitigating traffic congestion.

    DOT National Transportation Integrated Search

    2011-09-30

    Researchers created A Michigan Toolbox for Mitigating Traffic Congestion to be a useful desk reference for practitioners and an educational tool for elected officials acting through public policy boards to better understand the development, plan...

  10. Drinking Water Cyanotoxin Risk Communication Toolbox

    EPA Pesticide Factsheets

    The drinking water cyanotoxin risk communication toolbox is a ready-to-use, “one-stop-shop” to support public water systems, states, and local governments in developing, as they deem appropriate, their own risk communication materials.

  11. EPA ExpoBox Toolbox Search

    EPA Pesticide Factsheets

    EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant assessment data bases,

  12. 40 CFR 141.715 - Microbial toolbox options for meeting Cryptosporidium treatment requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... criteria are in § 141.716(b). Pre Filtration Toolbox Options (3) Presedimentation basin with coagulation 0... separate granular media filtration stage if treatment train includes coagulation prior to first filter...

  13. 40 CFR 141.715 - Microbial toolbox options for meeting Cryptosporidium treatment requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... criteria are in § 141.716(b). Pre Filtration Toolbox Options (3) Presedimentation basin with coagulation 0... separate granular media filtration stage if treatment train includes coagulation prior to first filter...

  14. Air Sensor Toolbox for Citizen Scientists

    EPA Pesticide Factsheets

    EPA’s Air Sensor Toolbox provides information and guidance on new low-cost compact technologies for measuring air quality. It provides information to help citizens more effectively and accurately collect air quality data in their community.

  15. A portable toolbox to monitor and evaluate signal operations.

    DOT National Transportation Integrated Search

    2011-10-01

    Researchers from the Texas Transportation Institute developed a portable tool consisting of a field-hardened computer interfacing with the traffic signal cabinet through special enhanced Bus Interface Units. The toolbox consisted of a monitoring t...

  16. Air Sensor Toolbox: Resources and Funding

    EPA Pesticide Factsheets

    EPA’s Air Sensor Toolbox provides information and guidance on new low-cost compact technologies for measuring air quality. It provides information to help citizens more effectively and accurately collect air quality data in their community.

  17. System modelling for LISA Pathfinder

    NASA Astrophysics Data System (ADS)

    Diaz-Aguiló, Marc; Grynagier, Adrien; Rais, Boutheina

    LISA Pathfinder is the technology demonstrator for LISA, a space-borne gravitational waves observatory. The goal of the mission is to characterise the dynamics of the LISA Technology Package (LTP) to prove that on-board experimental conditions are compatible with the detection of gravitational waves. The LTP is a drag-free dynamics experiment which includes a control loop with sensors (interferometric and capacitive), actuators (capacitive actuators and thrusters), controlled disturbances (magnetic coils, heaters) and which is subject to various endogenous or exogenous noise sources such as infrared pressure or solar wind. The LTP experiment features new hardware which was never flown in space. The mission has a tight operation timeline as it is constrained to about 100 days. It is therefore vital to have efficient and precise means of investigation and diagnostics to be used during the on-orbit operations. These will be conducted using the LTP Data Analysis toolbox (LTPDA) which allows for simulation, parameter identification and various analyses (covariance analysis, state estimation) given an experimental model. The LTPDA toolbox therefore contains a series of models which are state-space representations of each component in the LTP. The State-Space Models (SSM) are objects of a state-space class within the LTPDA toolbox especially designed to address all the requirements of this tool. The user has access to a set of linear models which represent every satellite subsystem; the models are available in different forms representing 1D, 2D and 3D systems, each with settable symbolic and numeric parameters. To limit the possible errors, the models can be automatically linked to produce composite systems and closed-loops of the LTP. Finally, for the sake of completeness, accuracy and maintainability of the tool, the models contain all the physical information they mimic (i.e. variable units, description of parameters, description of inputs/outputs, etc).
Models developed for this work include the fixed-point linearized equations of motion for the LTP and the linear models for sensors and actuators with their noise modelling blocks issued from the analysis of the individual actuators. The drag-free controller model includes the discrete delays expected in the hardware. In this work we briefly describe the software architecture and then concentrate on the physical description of the models. This is supported by an overview of different user scenarios and some examples of model analysis that highlight the advantages of this high-level programming engineering toolbox for space mission data analysis and calibration.

  18. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    PubMed

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

    Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for application of principal component analysis (PCA) in mass spectrometry and focused on two whole spectrum based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities, the ms-alone, a python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis, were implemented. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For three tested cultivation media only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied on data normalized by two different normalization techniques. Results from matched peak data and subsequent detailed full spectrum analysis identified only two different metabolic patterns - a cultivation on Enterobacter sakazakii Isolation Agar showed significant differences to the cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved the dependence on cultivation time. Both whole spectrum based normalization techniques together with the full spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding any problems with improper identification of peaks or emphasis on below-threshold peak data. The amount of processed data remains manageable.
Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms. Copyright © 2018 John Wiley & Sons, Ltd.
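    Whole-spectrum normalization can be sketched with total-ion-current (TIC) scaling, one common whole-spectrum technique (an assumption here: the paper does not name its two techniques in this abstract). Every intensity is divided by the summed intensity of the spectrum, so spectra of different absolute abundance become comparable before PCA.

```python
def tic_normalize(spectrum):
    """Scale intensities so the spectrum sums to 1 (total ion current).

    A sketch of one common whole-spectrum normalization; the paper's
    exact pair of techniques is not specified in the abstract.
    """
    total = sum(spectrum)
    if total == 0:
        raise ValueError("empty spectrum")
    return [x / total for x in spectrum]

# Two spectra with the same relative pattern but different absolute intensity
s1 = [2.0, 6.0, 12.0]
s2 = [4.0, 12.0, 24.0]
print(tic_normalize(s1))                        # -> [0.1, 0.3, 0.6]
print(tic_normalize(s1) == tic_normalize(s2))   # -> True
```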

  19. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    PubMed

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train and local field potential analysis, and behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. 
With the multitude of general-purpose functions provided by NeoAnalysis, users can easily obtain publication-quality figures without writing complex code. NeoAnalysis is a powerful and valuable toolbox for users doing electrophysiological experiments.
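    The standardization idea can be sketched as merging heterogeneous recordings into one trial-indexed structure, so that later analyses see a single format (all field names below are hypothetical, not NeoAnalysis's actual schema).

```python
# Sketch only: merge spike times, LFP traces and behavioral events per trial,
# mimicking the idea of combining data types into one standardized structure.
def combine_trials(spikes, lfp, behavior):
    trials = {}
    for trial_id in sorted(spikes):
        trials[trial_id] = {
            "spike_times": spikes[trial_id],          # from the spike channel
            "lfp": lfp.get(trial_id, []),             # from the field-potential channel
            "saccade_onset": behavior.get(trial_id),  # detected behavioral response
        }
    return trials

trials = combine_trials(
    spikes={1: [0.12, 0.48], 2: [0.05]},
    lfp={1: [0.1, -0.2, 0.3]},
    behavior={1: 0.3, 2: 0.27},
)
print(trials[2])  # -> {'spike_times': [0.05], 'lfp': [], 'saccade_onset': 0.27}
```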

  20. Autonomous Modelling of X-ray Spectra Using Robust Global Optimization Methods

    NASA Astrophysics Data System (ADS)

    Rogers, Adam; Safi-Harb, Samar; Fiege, Jason

    2015-08-01

    The standard approach to model fitting in X-ray astronomy is by means of local optimization methods. However, these local optimizers suffer from a number of problems, such as a tendency for the fit parameters to become trapped in local minima, and can require an involved process of detailed user intervention to guide them through the optimization process. In this work we introduce a general GUI-driven global optimization method for fitting models to X-ray data, written in MATLAB, which searches for optimal models with minimal user interaction. We directly interface with the commonly used XSPEC libraries to access the full complement of pre-existing spectral models that describe a wide range of physics appropriate for modelling astrophysical sources, including supernova remnants and compact objects. Our algorithm is powered by the Ferret genetic algorithm and Locust particle swarm optimizer from the Qubist Global Optimization Toolbox, which are robust at finding families of solutions and identifying degeneracies. This technique will be particularly instrumental for multi-parameter models and high-fidelity data. In this presentation, we provide details of the code and use our techniques to analyze X-ray data obtained from a variety of astrophysical sources.
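    The appeal of a global optimizer over a local one can be illustrated with a minimal genetic-algorithm sketch (an illustration of the general technique only; Ferret's actual operators, parameters and fit statistics differ, and the function below is a toy stand-in for an XSPEC fit statistic).

```python
import random

def genetic_minimize(f, lo, hi, pop_size=30, generations=60, seed=1):
    """Minimal real-valued genetic algorithm: selection, crossover, mutation."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)
        survivors = pop[: pop_size // 2]              # selection: keep best half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = 0.5 * (a + b)                     # crossover: midpoint
            child += rng.gauss(0, 0.05 * (hi - lo))   # mutation: Gaussian jitter
            children.append(min(hi, max(lo, child)))
        pop = survivors + children
    return min(pop, key=f)

# Toy fit statistic with two equally good minima at x = -2 and x = +2;
# a population-based search finds one without careful initialization.
def fit_stat(x):
    return (x * x - 4) ** 2

best = genetic_minimize(fit_stat, -4.0, 4.0)
print(f"best-fit parameter: {best:.2f}")
```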

  1. EPA EMERGENCY PLANNING TOOLBOX

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  2. Ironbound Community Citizen Science Toolbox Fact Sheet

    EPA Pesticide Factsheets

    EPA is partnering with Newark’s Ironbound Community Corporation (ICC) to design, develop, and pilot a Citizen Science Toolbox that will enable communities to collect their own environmental data and increase their ability to understand local conditions.

  3. Evaluating a 2D image-based computerized approach for measuring riverine pebble roundness

    NASA Astrophysics Data System (ADS)

    Cassel, Mathieu; Piégay, Hervé; Lavé, Jérôme; Vaudor, Lise; Hadmoko Sri, Danang; Wibiwo Budi, Sandy; Lavigne, Franck

    2018-06-01

    The geometrical characteristics of pebbles are important features to study transport pathways, sedimentary history, depositional environments, and abrasion processes, or to target sediment sources. Both the shape and roundness of pebbles can be described by a still-growing number of metrics in 2D and 3D or by visual charts. Despite new developments, existing tools remain proprietary and no pebble roundness toolbox has been widely available within the scientific community. The toolbox developed by Roussillon et al. (2009) automatically computes the size, shape and roundness indexes of pebbles from their 2D maximal projection plans. Using a digital camera, this toolbox operates on 2D pictures taken of pebbles placed on a one-square-meter red board, allowing data collection to be quickly and efficiently acquired at a large scale. Now that the toolbox is freely available for download,

  4. System engineering toolbox for design-oriented engineers

    NASA Technical Reports Server (NTRS)

    Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.

    1994-01-01

    This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.

  5. National Water-Quality Assessment (NAWQA) area-characterization toolbox

    USGS Publications Warehouse

    Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.

    2010-01-01

    This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells. These tools are built on top of standard functionality included in ArcGIS Desktop running at the ArcInfo license level. Most of the tools require a license for the ArcGIS Spatial Analyst extension. ArcGIS is a commercial GIS software system produced by ESRI, Inc. (http://www.esri.com). The NAWQA Area-Characterization Toolbox is not supported by ESRI, Inc. or its technical support staff. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

  6. Discriminative and robust zero-watermarking scheme based on completed local binary pattern for authentication and copyright identification of medical images

    NASA Astrophysics Data System (ADS)

    Liu, Xiyao; Lou, Jieting; Wang, Yifan; Du, Jingyu; Zou, Beiji; Chen, Yan

    2018-03-01

    Authentication and copyright identification are two critical security issues for medical images. Although zero-watermarking schemes can provide durable, reliable and distortion-free protection for medical images, the existing zero-watermarking schemes for medical images still face two problems. On one hand, they rarely considered the distinguishability for medical images, which is critical because different medical images are sometimes similar to each other. On the other hand, their robustness against geometric attacks, such as cropping, rotation and flipping, is insufficient. In this study, a novel discriminative and robust zero-watermarking (DRZW) scheme is proposed to address these two problems. In DRZW, content-based features of medical images are first extracted based on the completed local binary pattern (CLBP) operator to ensure the distinguishability and robustness, especially against geometric attacks. Then, master shares and ownership shares are generated from the content-based features and watermark according to (2,2) visual cryptography. Finally, the ownership shares are stored for authentication and copyright identification. For queried medical images, their content-based features are extracted and master shares are generated. Their watermarks for authentication and copyright identification are recovered by stacking the generated master shares and stored ownership shares. 200 different medical images of 5 types were collected as the testing data, and our experimental results demonstrate that DRZW ensures both the accuracy and reliability of authentication and copyright identification. When fixing the false positive rate to 1.00%, the average value of false negative rates by using DRZW is only 1.75% under 20 common attacks with different parameters.
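    The share-generation and stacking idea can be sketched with a simplified XOR variant (an assumption: actual (2,2) visual cryptography uses a pixel-expansion construction, and DRZW's feature extraction is the CLBP operator, not the toy bit vectors below). Neither share alone reveals the watermark; combining both recovers it, and only an image with matching features regenerates the correct master share.

```python
def make_shares(feature_bits, watermark_bits):
    """Generate a master share from content features and an ownership share
    from features and watermark (XOR stand-in for (2,2) visual cryptography)."""
    master = list(feature_bits)  # derived from image content (e.g. CLBP features)
    ownership = [f ^ w for f, w in zip(feature_bits, watermark_bits)]
    return master, ownership     # only the ownership share is stored

def recover_watermark(master, ownership):
    """Stack (XOR) the regenerated master share with the stored ownership share."""
    return [m ^ o for m, o in zip(master, ownership)]

wm = [1, 0, 1, 1, 0, 0, 1, 0]    # binary watermark for copyright identification
feat = [0, 1, 1, 0, 1, 0, 0, 1]  # hypothetical content-based feature bits
master, ownership = make_shares(feat, wm)

# A queried image regenerates its master share from its own features:
print(recover_watermark(master, ownership) == wm)  # -> True
# An image with different features fails to recover the watermark:
tampered = recover_watermark([1 - f for f in feat], ownership)
print(tampered == wm)  # -> False
```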

  7. A voting-based star identification algorithm utilizing local and global distribution

    NASA Astrophysics Data System (ADS)

    Fan, Qiaoyun; Zhong, Xuyang; Sun, Junhua

    2018-03-01

    A novel star identification algorithm based on voting scheme is presented in this paper. In the proposed algorithm, the global distribution and local distribution of sensor stars are fully utilized, and the stratified voting scheme is adopted to obtain the candidates for sensor stars. The database optimization is employed to reduce its memory requirement and improve the robustness of the proposed algorithm. The simulation shows that the proposed algorithm exhibits 99.81% identification rate with 2-pixel standard deviations of positional noises and 0.322-Mv magnitude noises. Compared with two similar algorithms, the proposed algorithm is more robust towards noise, and the average identification time and required memory is less. Furthermore, the real sky test shows that the proposed algorithm performs well on the real star images.
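    The voting idea can be sketched as follows (a toy illustration: the paper's stratified scheme and its use of local and global distributions are more elaborate, and the catalog entries below are hypothetical). Each observed inter-star distance casts a vote for every catalog star consistent with it, and the candidate with the most votes wins.

```python
from collections import Counter

# Hypothetical catalog: star id -> inter-star distances it participates in
CATALOG = {
    "HIP_A": {1.0, 2.2, 3.1},
    "HIP_B": {1.0, 4.0, 2.2},
    "HIP_C": {3.1, 4.0, 5.5},
}

def identify(sensor_distances, tolerance=0.05):
    """Vote for catalog stars consistent with the observed distances."""
    votes = Counter()
    for d in sensor_distances:
        for star, dists in CATALOG.items():
            if any(abs(d - cd) <= tolerance for cd in dists):
                votes[star] += 1  # this catalog star matches distance d
    return votes.most_common(1)[0][0]

# Observed (noisy) distances from the sensor star to its neighbours
print(identify([1.02, 2.18, 3.08]))  # -> HIP_A
```

The tolerance parameter plays the role of the noise allowance: widening it makes the vote more robust to positional noise at the cost of more spurious candidate matches.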

  8. Broadview Radar Altimetry Toolbox

    NASA Astrophysics Data System (ADS)

    Escolà, Roger; Garcia-Mondejar, Albert; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Naeije, Marc; Ambrozio, Americo; Restano, Marco; Benveniste, Jérôme

    2016-04-01

    The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox) which can read all previous and current altimetry missions' data, incorporates now the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the front-end for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data, bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT involving combinations of data fields that the user can save for posterior reuse or using the already embedded formulas that include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, that contains a strong introduction to altimetry, shows its applications in different fields such as Oceanography, Cryosphere, Geodesy, Hydrology among others. Included are also "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version. 
While developing the toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use-cases" for SAR altimetry, in order to train users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open source framework, contributions from users who have developed their own functions are welcome. The Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. The first release of the new Radar Altimetry Toolbox was published in September 2015. It incorporates the capability to read S3 products as well as the new CryoSat-2 Baseline C. The second release of the Toolbox, planned for March 2016, will have a new graphical user interface and some visualisation improvements. The third release, planned for September 2016, will incorporate new datasets, such as the lakes and rivers products or the reprocessed Envisat data, along with new features regarding data interpolation and formula updates.

  9. Broadview Radar Altimetry Toolbox

    NASA Astrophysics Data System (ADS)

    Mondéjar, Albert; Benveniste, Jérôme; Naeije, Marc; Escolà, Roger; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Ambrózio, Américo; Restano, Marco

    2016-07-01

    The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox), which can read data from all previous and current altimetry missions, now incorporates the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support the users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. The project started in 2005 as a joint effort of ESA (European Space Agency) and CNES (Centre National d'Études Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or from C/C++/Fortran via a programming API, allowing the user to obtain the desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT, involving combinations of data fields that the user can save for later reuse, or using the already embedded formulas, which include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a thorough introduction to altimetry, shows its applications in different fields such as oceanography, the cryosphere, geodesy and hydrology, among others. Also included are "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel-3 SAR Altimetry Toolbox will benefit from the current BRAT version. 
The Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. While developing the new toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry, in order to train users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open-source framework, contributions from users who have developed their own functions are welcome. The first release of the new Radar Altimetry Toolbox was published in September 2015. It incorporates the capability to read S3 products as well as the new CryoSat-2 Baseline C. The second release of the Toolbox, planned for March 2016, will have a new graphical user interface and some visualisation improvements. The third release, planned for September 2016, will incorporate new datasets, such as the lake and river data and the reprocessed Envisat data, as well as new data-interpolation features and formula updates.
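The "standard oceanographic altimetry formulas" mentioned above reduce, at their core, to combining the satellite's altitude with a corrected radar range. A minimal sketch of that relation (the numbers and correction terms below are purely illustrative, and the function is not BRAT's actual API):

```python
def sea_surface_height(altitude, raw_range, corrections):
    """Basic altimetry relation: sea surface height above the reference
    ellipsoid is the satellite altitude minus the radar range after the
    propagation/geophysical corrections have been added to it."""
    corrected_range = raw_range + sum(corrections.values())
    return altitude - corrected_range

# Illustrative values in metres (not taken from any real satellite pass)
ssh = sea_surface_height(
    altitude=800_000.00,
    raw_range=799_970.00,
    corrections={"dry_troposphere": 2.30, "wet_troposphere": 0.15, "ionosphere": 0.02},
)
```

In BRAT such expressions are built from the data fields of the loaded products, either interactively or via the embedded formulas.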

  10. Traffic analysis toolbox volume XIII : integrated corridor management analysis, modeling, and simulation guide

    DOT National Transportation Integrated Search

    2017-02-01

    As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...

  12. A comprehensive validation toolbox for regional ocean models - Outline, implementation and application to the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Jandt, Simon; Laagemaa, Priidik; Janssen, Frank

    2014-05-01

    The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, starting from early steps in model development and ending at the quality control of model based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged it is often not among the most popular tasks in ocean modelling. In order to ease the validation work a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time-series at stations, vertical profiles, surface fields or along track satellite data, with one single program call. The validation toolbox, implemented in MATLAB, features all parts of the validation process - ranging from read-in procedures of datasets to the graphical and numerical output of statistical metrics of the comparison. The basic idea is to have only one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. read-in procedures, forms a module in which all available functions of this particular part are collected. The interface between the functions, the module and the validation schedule is highly standardized. Functions of a module are set up for certain validation tasks, new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned for each validation task in user specific settings, which are externally stored in so-called namelists and gather all information of the used datasets as well as paths and metadata. In the framework of the MyOcean-2 project the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre. 
In this way, the performance of each new product version is compared with that of the previous version. Although the toolbox has so far mainly been tested for the Baltic Sea, it can easily be adapted to different datasets and parameters, regardless of the geographic region. In this presentation the usability of the toolbox is demonstrated along with several results of the validation process.
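The statistical comparison such a validation module performs can be sketched as follows (the toolbox itself is implemented in MATLAB; this is an illustrative Python sketch with hypothetical function names, not the toolbox's API):

```python
import numpy as np

def validation_metrics(model, obs):
    """Basic comparison statistics between co-located model and observation series."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    diff = model - obs
    return {
        "bias": diff.mean(),                    # mean error (model minus obs)
        "rmse": np.sqrt((diff ** 2).mean()),    # root-mean-square error
        "corr": np.corrcoef(model, obs)[0, 1],  # Pearson correlation
    }

# Example: a model that systematically overestimates a smooth signal by 0.1
obs = np.sin(np.linspace(0.0, 6.0, 50))
model = obs + 0.1
m = validation_metrics(model, obs)
```

In the toolbox described above, metrics like these would be computed per station, profile or field, driven by the externally stored namelist settings.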

  13. Chiral thiazoline and thiazole building blocks for the synthesis of peptide-derived natural products.

    PubMed

    Just-Baringo, Xavier; Albericio, Fernando; Alvarez, Mercedes

    2014-01-01

    Thiazoline and thiazole heterocycles are privileged motifs found in numerous peptide-derived natural products of biological interest. During the last decades, the synthesis of optically pure building blocks has been addressed by numerous groups, which have developed a plethora of strategies to that end. Efficient and reliable methodologies that are compatible with the intricate and capricious architectures of natural products are a must for the further development of their science. Structure confirmation, structure-activity relationship studies and industrial production are fields of paramount importance that require these robust methodologies in order to successfully bring natural products into the clinic. Today's chemist's toolbox is stocked with many powerful methods for chiral thiazoline and thiazole synthesis; ranging from biomimetic approaches to stereoselective alkylations, one is likely to find a method suitable for one's needs.

  14. Engineering emergent multicellular behavior through synthetic adhesion

    NASA Astrophysics Data System (ADS)

    Glass, David; Riedel-Kruse, Ingmar

    Over more than a decade, synthetic biology has developed increasingly robust gene networks within single cells, but has constructed very few systems that demonstrate multicellular spatio-temporal dynamics. We are filling this gap in synthetic biology's toolbox by developing an E. coli self-assembly platform based on modular cell-cell adhesion. We developed a system in which adhesive selectivity is provided by a library of outer-membrane-displayed peptides with intra-library specificities, while affinity is provided by consistent expression across the entire library. We further provide a biophysical model to help understand the parameter regimes in which this tool can be used to self-assemble cells into clusters, filaments, or meshes. The combined platform will enable future development of synthetic multicellular systems for use in consortia-based metabolic engineering, in living materials, and in the controlled study of minimal multicellular systems. Stanford Bio-X Bowes Fellowship.

  15. Real-time Neuroimaging and Cognitive Monitoring Using Wearable Dry EEG

    PubMed Central

    Mullen, Tim R.; Kothe, Christian A.E.; Chi, Mike; Ojeda, Alejandro; Kerth, Trevor; Makeig, Scott; Jung, Tzyy-Ping; Cauwenberghs, Gert

    2015-01-01

    Goal: We present and evaluate a wearable high-density dry-electrode EEG system and an open-source software framework for online neuroimaging and state classification. Methods: The system integrates a 64-channel dry EEG form factor with wireless data streaming for online analysis. A real-time software framework is applied, including adaptive artifact rejection, cortical source localization, multivariate effective connectivity inference, data visualization, and cognitive state classification from connectivity features using a constrained logistic regression approach (ProxConn). We evaluate the system identification methods on simulated 64-channel EEG data. Then we evaluate system performance, using ProxConn and a benchmark ERP method, in classifying response errors in 9 subjects using the dry EEG system. Results: Simulations yielded high accuracy (AUC = 0.97±0.021) for real-time cortical connectivity estimation. Response error classification using cortical effective connectivity (sdDTF) was significantly above chance, with similar performance (AUC) for cLORETA (0.74±0.09) and LCMV (0.72±0.08) source localization. Cortical ERP-based classification was equivalent to ProxConn for cLORETA (0.74±0.16) but significantly better for LCMV (0.82±0.12). Conclusion: We demonstrated the feasibility of real-time cortical connectivity analysis and cognitive state classification from high-density wearable dry EEG. Significance: This paper is the first validated application of these methods to 64-channel dry EEG. The work addresses a need for robust real-time measurement and interpretation of complex brain activity in the dynamic environment of the wearable setting. Such advances can have broad impact in research, medicine, and brain-computer interfaces. The pipelines are made freely available in the open-source SIFT and BCILAB toolboxes. PMID:26415149
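The AUC values quoted above measure how well a classifier's scores separate the two classes. The rank-based computation behind them fits in a few lines (a generic sketch, not the SIFT/BCILAB implementation):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank-sum formulation: the
    probability that a positive example outscores a negative one,
    with ties counting as half a win."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Perfectly separated scores give AUC = 1.0; overlapping scores lower it
a = auc([0.9, 0.8, 0.7], [0.3, 0.2, 0.1])
b = auc([0.9, 0.2], [0.8, 0.1])
```

An AUC of 0.5 corresponds to chance level, which is the baseline the classification results above are compared against.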

  16. Aeroservoelastic Uncertainty Model Identification from Flight Data

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.

    2001-01-01

    Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. To date, aeroservoelastic data analysis has paid too little attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge to this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to get input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.

  17. Planetary Geologic Mapping Python Toolbox: A Suite of Tools to Support Mapping Workflows

    NASA Astrophysics Data System (ADS)

    Hunter, M. A.; Skinner, J. A.; Hare, T. M.; Fortezzo, C. M.

    2017-06-01

    The collective focus of the Planetary Geologic Mapping Python Toolbox is to provide researchers with additional means to migrate legacy GIS data, assess the quality of data and analysis results, and simplify common mapping tasks.

  18. SSOAP Toolbox Enhancements and Case Study

    EPA Science Inventory

    Recognizing the need for tools to support the development of sanitary sewer overflow (SSO) control plans, in October 2009 the U.S. Environmental Protection Agency (EPA) released the first version of the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox. This first ve...

  19. Propulsion System Simulation Using the Toolbox for the Modeling and Analysis of Thermodynamic System (T-MATS)

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei (OA)

    2014-01-01

    A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This presentation describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this presentation is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture.
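The "numerical iterative solver" at the heart of such a framework typically drives a set of model residuals to zero. A minimal scalar sketch using Newton iteration with a finite-difference derivative (illustrative only, not T-MATS's actual solver, and the residual function below is hypothetical):

```python
def newton_solve(residual, x0, dx=1e-6, tol=1e-9, max_iter=50):
    """Scalar Newton iteration with a finite-difference Jacobian:
    repeatedly corrects x until the model residual is (nearly) zero."""
    x = x0
    for _ in range(max_iter):
        r = residual(x)
        if abs(r) < tol:
            break
        drdx = (residual(x + dx) - r) / dx  # finite-difference derivative
        x -= r / drdx                       # Newton correction step
    return x

# Toy balance: find the operating point x where produced and required
# power match, i.e. where the mismatch 0.5*x^2 - 8 vanishes
mismatch = lambda x: 0.5 * x**2 - 8.0
x = newton_solve(mismatch, x0=1.0)
```

In a full thermodynamic simulation the residual would be a vector (e.g. flow and power balances across components) and the Jacobian a matrix, but the iteration has the same shape.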

  20. An Educational Model for Hands-On Hydrology Education

    NASA Astrophysics Data System (ADS)

    AghaKouchak, A.; Nakhjiri, N.; Habib, E. H.

    2014-12-01

    This presentation provides an overview of a hands-on modeling tool developed for students in civil engineering and earth science disciplines to help them learn the fundamentals of hydrologic processes, model calibration, sensitivity analysis, uncertainty assessment, and practice conceptual thinking in solving engineering problems. The toolbox includes two simplified hydrologic models, namely HBV-EDU and HBV-Ensemble, designed as a complement to theoretical hydrology lectures. The models provide an interdisciplinary application-oriented learning environment that introduces the hydrologic phenomena through the use of a simplified conceptual hydrologic model. The toolbox can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching more advanced topics including uncertainty analysis, and ensemble simulation. Both models have been administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of hydrology.
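The conceptual rainfall-runoff idea behind HBV-type teaching models can be sketched with a single storage bucket (a drastic simplification for illustration only; HBV-EDU itself is a MATLAB tool with several stores, snow and soil routines, and calibration support):

```python
def bucket_model(precip, evap, capacity=100.0, k=0.1, s0=0.0):
    """Single-bucket conceptual runoff model: storage fills with rain,
    loses water to evaporation, spills when saturated, and drains as a
    linear reservoir (runoff proportional to storage)."""
    s, runoff = s0, []
    for p, e in zip(precip, evap):
        s = max(s + p - e, 0.0)          # add rain, remove evaporation
        spill = max(s - capacity, 0.0)   # saturation excess runs off at once
        s -= spill
        q = k * s                        # linear-reservoir outflow
        s -= q
        runoff.append(q + spill)
    return runoff

# Ten wet days followed by ten dry days: runoff builds up, then recedes
q = bucket_model(precip=[10.0] * 10 + [0.0] * 10, evap=[1.0] * 20)
```

Calibrating `capacity` and `k` against observed discharge, and running many parameter sets as an ensemble, mirrors the calibration and uncertainty exercises the toolbox is designed to teach.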

  1. Wyrm: A Brain-Computer Interface Toolbox in Python.

    PubMed

    Venthur, Bastian; Dähne, Sven; Höhne, Johannes; Heller, Hendrik; Blankertz, Benjamin

    2015-10-01

    In recent years, Python has gained more and more traction in the scientific community. Projects like NumPy, SciPy, and Matplotlib have created a strong foundation for scientific computing in Python, and machine learning packages like scikit-learn or data analysis packages like Pandas are building on top of it. In this paper we present Wyrm ( https://github.com/bbci/wyrm ), an open source BCI toolbox in Python. Wyrm is applicable to a broad range of neuroscientific problems. It can be used as a toolbox for analysis and visualization of neurophysiological data and in real-time settings, like an online BCI application. In order to prevent software defects, Wyrm makes extensive use of unit testing. We will explain the key aspects of Wyrm's software architecture and design decisions for its data structure, and demonstrate and validate the use of our toolbox by presenting our approach to the classification tasks of two different data sets from the BCI Competition III. Furthermore, we will give a brief analysis of the data sets using our toolbox, and demonstrate how we implemented an online experiment using Wyrm. With Wyrm we add the final piece to our ongoing effort to provide a complete, free and open source BCI system in Python.
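A typical offline BCI classification pass of the kind such a toolbox supports can be sketched with generic NumPy code (the feature and classifier choices below are hypothetical stand-ins, not Wyrm's actual API):

```python
import numpy as np

def band_power_features(epochs):
    """Log-variance of each channel per epoch, a common and very simple
    stand-in for band-power features in motor-imagery BCIs."""
    return np.log(epochs.var(axis=-1))

def nearest_mean_classify(train_X, train_y, test_X):
    """Toy nearest-class-mean classifier standing in for the LDA or
    logistic-regression classifiers a BCI toolbox would provide."""
    means = {c: train_X[train_y == c].mean(axis=0) for c in np.unique(train_y)}
    return np.array([min(means, key=lambda c: np.linalg.norm(x - means[c]))
                     for x in test_X])

# Synthetic two-class data: class 1 has higher variance on channel 0
rng = np.random.default_rng(0)
def make_epochs(scale0, n):  # n epochs x 2 channels x 100 samples
    e = rng.normal(size=(n, 2, 100))
    e[:, 0, :] *= scale0
    return e

X = band_power_features(np.concatenate([make_epochs(1.0, 20), make_epochs(3.0, 20)]))
y = np.array([0] * 20 + [1] * 20)
pred = nearest_mean_classify(X, y, X)
```

In a real toolbox the same steps (epoching, feature extraction, classification) run on continuous multichannel recordings, online as well as offline.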

  2. Cereal powdery mildew effectors: a complex toolbox for an obligate pathogen.

    PubMed

    Bourras, Salim; Praz, Coraline R; Spanu, Pietro D; Keller, Beat

    2018-02-15

    Cereal powdery mildews are major pathogens of cultivated monocot crops, and all are obligate biotrophic fungi that can only grow and reproduce on living hosts. This lifestyle is combined with extreme host specialization where every mildew subspecies (referred to as forma specialis) can only infect one plant species. Recently there has been much progress in our understanding of the possible roles effectors play in this complex host-pathogen interaction. Here, we review current knowledge on the origin, evolution, and mode of action of cereal mildew effectors, with a particular focus on recent advances in the identification of bona fide effectors and avirulence effector proteins from wheat and barley powdery mildews. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. An experimental study of nonlinear dynamic system identification

    NASA Technical Reports Server (NTRS)

    Stry, Greselda I.; Mook, D. Joseph

    1990-01-01

    A technique for robust identification of nonlinear dynamic systems is developed and illustrated using both simulations and analog experiments. The technique is based on the Minimum Model Error optimal estimation approach. A detailed literature review is included in which fundamental differences between the current approach and previous work are described. The most significant feature of the current work is the ability to identify nonlinear dynamic systems without prior assumptions regarding the form of the nonlinearities, in contrast to existing nonlinear identification approaches which usually require detailed assumptions about the nonlinearities. The example illustrations indicate that the method is robust with respect to prior ignorance of the model, and with respect to measurement noise, measurement frequency, and measurement record length.

  4. A web-based tool for ranking landslide mitigation measures

    NASA Astrophysics Data System (ADS)

    Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.

    2012-04-01

    As part of the research done in the European project SafeLand "Living with landslide risk in Europe: Assessment, effects of global change, and risk management strategies", a compendium of structural and non-structural mitigation measures for different landslide types in Europe was prepared, and the measures were assembled into a web-based "toolbox". Emphasis was placed on providing a rational and flexible framework applicable to existing and future mitigation measures. The purpose of the web-based toolbox is to assist decision-making and to guide the user in the choice of the most appropriate mitigation measures. The mitigation measures were classified into three categories, describing whether they address the landslide hazard, the vulnerability or the elements at risk themselves. The structural measures, which reduce the hazard, include: surface protection and control of surface erosion; measures modifying the slope geometry and/or mass distribution; measures modifying the surface water regime (surface drainage); measures modifying the groundwater regime (deep drainage); measures modifying the mechanical characteristics of the unstable mass; transfer of loads to more competent strata; retaining structures (to modify slope geometry and/or to transfer stress to a competent layer); deviating the path of landslide debris; dissipating the energy of debris flows; and arresting and containing landslide debris or rock fall. The non-structural mitigation measures, which reduce either the hazard or the consequences (the vulnerability and exposure of elements at risk), include: early warning systems; restricting or discouraging construction activities; increasing the resistance or coping capacity of elements at risk; relocation of elements at risk; and sharing of risk through insurance. 
The measures are described in the toolbox with fact sheets providing a brief description, guidance on design, schematic details, practical examples and references for each mitigation measure. Each of the measures was given a score on its ability and applicability for different types of landslides and boundary conditions, and a decision support matrix was established. The web-based toolbox organizes the information in the compendium and provides an algorithm to rank the measures on the basis of the decision support matrix, and on the basis of the risk level estimated at the site. The toolbox includes a description of the case under study and offers a simplified option for estimating the hazard and risk levels of the slide at hand. The user selects the mitigation measures to be included in the assessment. The toolbox then ranks, with built-in assessment factors and weights and/or with user-defined ranking values and criteria, the mitigation measures included in the analysis. The toolbox includes data management, e.g. saving data half-way in an analysis, returning to an earlier case, looking up prepared examples or looking up information on mitigation measures. The toolbox also generates a report and has user-forum and help features. The presentation will give an overview of the mitigation measures considered and examples of the use of the toolbox, and will take the attendees through the application of the toolbox.
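The ranking logic of such a decision-support matrix can be sketched as a weighted score per measure (the measure names, criteria, scores and weights below are hypothetical examples, not SafeLand's actual matrix):

```python
def rank_measures(scores, weights):
    """Rank mitigation measures by their weighted score over the
    assessment criteria, highest total first."""
    totals = {m: sum(weights[c] * s[c] for c in weights) for m, s in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

# Applicability scores per criterion (0-3, as in a simple decision matrix)
scores = {
    "deep drainage":  {"debris flow": 1, "rotational slide": 3, "cost": 2},
    "retaining wall": {"debris flow": 0, "rotational slide": 2, "cost": 1},
    "early warning":  {"debris flow": 3, "rotational slide": 2, "cost": 3},
}
ranked = rank_measures(scores, {"debris flow": 1.0, "rotational slide": 1.0, "cost": 0.5})
```

In the toolbox, the built-in scores and weights can be overridden by user-defined ranking values and criteria, which amounts to changing the `weights` (and `scores`) fed into the same kind of computation.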

  5. MEIGO: an open-source software suite based on metaheuristics for global optimization in systems biology and bioinformatics.

    PubMed

    Egea, Jose A; Henriques, David; Cokelaer, Thomas; Villaverde, Alejandro F; MacNamara, Aidan; Danciu, Diana-Patricia; Banga, Julio R; Saez-Rodriguez, Julio

    2014-05-10

    Optimization is the key to solving many problems in computational biology. Global optimization methods, which provide a robust methodology, and metaheuristics in particular have proven to be the most efficient methods for many applications. Despite their utility, there is a limited availability of metaheuristic tools. We present MEIGO, an R and Matlab optimization toolbox (also available in Python via a wrapper of the R version), that implements metaheuristics capable of solving diverse problems arising in systems biology and bioinformatics. The toolbox includes the enhanced scatter search method (eSS) for continuous nonlinear programming (cNLP) and mixed-integer programming (MINLP) problems, and variable neighborhood search (VNS) for Integer Programming (IP) problems. Additionally, the R version includes BayesFit for parameter estimation by Bayesian inference. The eSS and VNS methods can be run on a single thread or in parallel using a cooperative strategy. The code is supplied under GPLv3 and is available at http://www.iim.csic.es/~gingproc/meigo.html. Documentation and examples are included. The R package has been submitted to BioConductor. We evaluate MEIGO against optimization benchmarks, and illustrate its applicability to a series of case studies in bioinformatics and systems biology where it outperforms other state-of-the-art methods. MEIGO provides a free, open-source platform for optimization that can be applied to multiple domains of systems biology and bioinformatics. It includes efficient state-of-the-art metaheuristics, and its open and modular structure allows the addition of further methods.
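The variable neighborhood search (VNS) idea mentioned above can be sketched in a few lines (a schematic illustration of the metaheuristic on a toy integer program, not MEIGO's implementation):

```python
import random

def vns(f, x0, moves, iters=300):
    """Minimal variable neighborhood search: try moves of increasing
    size, accept strict improvements, and return to the smallest
    neighborhood after every success."""
    best, fbest = list(x0), f(x0)
    for _ in range(iters):
        k = 0
        while k < len(moves):
            cand = moves[k](best)
            fc = f(cand)
            if fc < fbest:
                best, fbest = cand, fc
                k = 0              # restart from the smallest neighborhood
            else:
                k += 1             # escalate to a larger neighborhood
    return best, fbest

def step(size):
    """Neighborhood move: change one randomly chosen coordinate by +/- size."""
    def m(x):
        y = list(x)
        i = random.randrange(len(y))
        y[i] += random.choice([-size, size])
        return y
    return m

random.seed(7)
# Toy integer program: minimize sum((x_i - 3)^2) over integer vectors
f = lambda x: sum((xi - 3) ** 2 for xi in x)
best, fbest = vns(f, [10, -5, 0], [step(1), step(3), step(5)])
```

Real metaheuristic toolboxes add much more (constraint handling, local solvers, cooperative parallel runs), but the shake-and-improve loop is the core of the method.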

  6. MEIGO: an open-source software suite based on metaheuristics for global optimization in systems biology and bioinformatics

    PubMed Central

    2014-01-01

    Background: Optimization is the key to solving many problems in computational biology. Global optimization methods, which provide a robust methodology, and metaheuristics in particular have proven to be the most efficient methods for many applications. Despite their utility, there is a limited availability of metaheuristic tools. Results: We present MEIGO, an R and Matlab optimization toolbox (also available in Python via a wrapper of the R version), that implements metaheuristics capable of solving diverse problems arising in systems biology and bioinformatics. The toolbox includes the enhanced scatter search method (eSS) for continuous nonlinear programming (cNLP) and mixed-integer programming (MINLP) problems, and variable neighborhood search (VNS) for Integer Programming (IP) problems. Additionally, the R version includes BayesFit for parameter estimation by Bayesian inference. The eSS and VNS methods can be run on a single thread or in parallel using a cooperative strategy. The code is supplied under GPLv3 and is available at http://www.iim.csic.es/~gingproc/meigo.html. Documentation and examples are included. The R package has been submitted to BioConductor. We evaluate MEIGO against optimization benchmarks, and illustrate its applicability to a series of case studies in bioinformatics and systems biology where it outperforms other state-of-the-art methods. Conclusions: MEIGO provides a free, open-source platform for optimization that can be applied to multiple domains of systems biology and bioinformatics. It includes efficient state-of-the-art metaheuristics, and its open and modular structure allows the addition of further methods. PMID:24885957

  7. Determining the transport mechanism of an enzyme-catalytic complex metabolic network based on biological robustness.

    PubMed

    Wang, Lei

    2013-04-01

    Understanding the transport mechanism of 1,3-propanediol (1,3-PD) is of critical importance for further research on gene regulation. Due to the lack of intracellular information, and on the basis of the enzyme-catalytic system, we present a system identification model that uses biological robustness as the performance index to infer the most plausible transport mechanism of 1,3-PD; the performance index consists of the relative error of the extracellular substance concentrations and the biological robustness of the intracellular substance concentrations. We do not use a Boolean framework but prefer a model description based on ordinary differential equations. Among other advantages, this also facilitates the robustness analysis, which is the main goal of this paper. An algorithm is constructed to seek the solution of the identification model. Numerical results show that the most plausible transport mode is active transport coupled with passive diffusion.
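The flavour of such an identification model, a fit term plus a robustness term evaluated on an ODE system, can be sketched on a toy one-state model (entirely illustrative; the paper's actual enzyme-catalytic system and performance index are far richer):

```python
def simulate(k, x0=1.0, dt=0.01, n=200):
    """Forward-Euler solution of the toy ODE dx/dt = -k*x, standing in
    for the intracellular/extracellular concentration dynamics."""
    xs, x = [], x0
    for _ in range(n):
        xs.append(x)
        x += dt * (-k * x)
    return xs

def performance(k, data, eps=0.05):
    """Illustrative performance index: mean relative error against the
    data, plus a robustness penalty measuring how much the trajectory
    shifts under a small parameter perturbation."""
    sim = simulate(k)
    fit = sum(abs(s - d) / abs(d) for s, d in zip(sim, data)) / len(data)
    pert = simulate(k * (1 + eps))
    robust = sum(abs(s - p) for s, p in zip(sim, pert)) / len(sim)
    return fit + robust

data = simulate(1.5)  # synthetic "observations" generated with k = 1.5
best_k = min([0.5, 1.0, 1.5, 2.0], key=lambda k: performance(k, data))
```

In the paper, candidate transport mechanisms (active, passive, or combined) play the role of the candidate parameterizations, and the mechanism minimizing the combined index is selected.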

  8. A fast iterative recursive least squares algorithm for Wiener model identification of highly nonlinear systems.

    PubMed

    Kazemi, Mahdi; Arefi, Mohammad Mehdi

    2017-03-01

    In this paper, an online identification algorithm is presented for nonlinear systems in the presence of output colored noise. The proposed method is based on extended recursive least squares (ERLS) algorithm, where the identified system is in polynomial Wiener form. To this end, an unknown intermediate signal is estimated by using an inner iterative algorithm. The iterative recursive algorithm adaptively modifies the vector of parameters of the presented Wiener model when the system parameters vary. In addition, to increase the robustness of the proposed method against variations, a robust RLS algorithm is applied to the model. Simulation results are provided to show the effectiveness of the proposed approach. Results confirm that the proposed method has a fast convergence rate with robust characteristics, which increases the efficiency of the proposed model and identification approach. For instance, the FIT criterion reaches 92% in a CSTR process where about 400 data points are used. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
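The recursive least squares update underlying such schemes is standard; a generic sketch with a forgetting factor is shown below (plain RLS on a linear regression, not the paper's exact ERLS/Wiener formulation):

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive-least-squares step with forgetting factor lam:
    updates the parameter estimate theta and covariance P given the
    regressor phi and the new measurement y."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + phi.T @ P @ phi)            # gain vector
    theta = theta + (k * (y - phi.T @ theta)).ravel()  # correct by prediction error
    P = (P - k @ phi.T @ P) / lam                    # covariance update
    return theta, P

# Identify y = 2*u1 - 3*u2 from a stream of noisy measurements
rng = np.random.default_rng(1)
theta, P = np.zeros(2), np.eye(2) * 100.0
for _ in range(200):
    phi = rng.normal(size=2)
    y = 2.0 * phi[0] - 3.0 * phi[1] + 0.01 * rng.normal()
    theta, P = rls_update(theta, P, phi, y)
```

The forgetting factor (lam < 1) discounts old data, which is what lets the recursive estimate track parameters that vary over time, as described above.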

  9. Aerospace Toolbox--a flight vehicle design, analysis, simulation, and software development environment II: an in-depth overview

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.

    2002-07-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provided a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program; from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed included its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics that were covered in Part I included flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this series will take a more in-depth look at the analysis and simulation capability and provide an update on the toolbox enhancements. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  10. DICOM router: an open source toolbox for communication and correction of DICOM objects.

    PubMed

    Hackländer, Thomas; Kleber, Klaus; Martin, Jens; Mertens, Heinrich

    2005-03-01

    Today, the exchange of medical images and clinical information is well defined by the digital imaging and communications in medicine (DICOM) and Health Level Seven (ie, HL7) standards. The interoperability among information systems is specified by the integration profiles of IHE (Integrating the Healthcare Enterprise). However, older imaging modalities frequently do not correctly support these interfaces and integration profiles, and some use cases are not yet specified by IHE. Therefore, corrections of DICOM objects are necessary to establish conformity. The aim of this project was to develop a toolbox that can automatically perform these recurrent corrections of DICOM objects. The toolbox is composed of three main components: 1) a receiver to receive DICOM objects, 2) a processing pipeline to correct each object, and 3) one or more senders to forward each corrected object to predefined addressees. The toolbox is implemented in Java as an open source project. The processing pipeline is realized by means of plug-ins. One of the plug-ins can be programmed by the user via an external eXtensible Stylesheet Language (ie, XSL) file. Using this plug-in, DICOM objects can also be converted into eXtensible Markup Language (ie, XML) documents or other data formats. DICOM storage services, DICOM CD-ROMs, and the local file system are defined as input and output channels. The toolbox is used clinically for different application areas. These are the automatic correction of DICOM objects from non-IHE-conforming modalities, the import of DICOM CD-ROMs into the picture archiving and communication system, and the pseudonymization of DICOM images. The toolbox has been accepted by users in a clinical setting. Because of the open programming interfaces, the functionality can easily be adapted to future applications.
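
    The three-part architecture described above (receiver, correction pipeline, senders) can be sketched in a few lines of Python. Everything here is illustrative: the plug-in name and the plain dicts standing in for DICOM objects are hypothetical, and the real toolbox is a Java application speaking actual DICOM services.

```python
# Minimal sketch of the receiver -> plugin pipeline -> senders pattern the
# abstract describes. All names are hypothetical; plain dicts stand in for
# DICOM objects here.

def fix_missing_patient_id(obj):
    """Example correction plugin: fill a required tag if it is absent."""
    obj.setdefault("PatientID", "UNKNOWN")
    return obj

def route(objects, plugins, senders):
    """Pass every received object through each plugin, then to every sender."""
    for obj in objects:
        for plugin in plugins:
            obj = plugin(obj)
        for send in senders:
            send(obj)

received = [{"Modality": "CR"}, {"Modality": "CT", "PatientID": "42"}]
archive = []  # one "sender" is simply appending to a local archive list
route(received, [fix_missing_patient_id], [archive.append])
print(archive)
```

    Because the pipeline is just a list of callables, swapping in a user-programmed correction (as the XSL plug-in allows in the real toolbox) means adding one more function to the list.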

  11. Integrated system dynamics toolbox for water resources planning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reno, Marissa Devan; Passell, Howard David; Malczynski, Leonard A.

    2006-12-01

    Public mediated resource planning is quickly becoming the norm rather than the exception. Unfortunately, supporting tools are lacking that interactively engage the public in the decision-making process and integrate over the myriad values that influence water policy. In the pages of this report we document the first steps toward developing a specialized decision framework to meet this need; specifically, a modular and generic resource-planning "toolbox". The technical challenge lies in the integration of the disparate systems of hydrology, ecology, climate, demographics, economics, policy and law, each of which influences the supply and demand for water. Specifically, these systems, their associated processes, and most importantly the constitutive relations that link them must be identified, abstracted, and quantified. For this reason, the toolbox forms a collection of process modules and constitutive relations that the analyst can "swap" in and out to model the physical and social systems unique to their problem. This toolbox with all of its modules is developed within the common computational platform of system dynamics linked to a Geographical Information System (GIS). Development of this resource-planning toolbox represents an important foundational element of the proposed interagency center for Computer Aided Dispute Resolution (CADRe). The Center's mission is to manage water conflict through the application of computer-aided collaborative decision-making methods. The Center will promote the use of decision-support technologies within collaborative stakeholder processes to help stakeholders find common ground and create mutually beneficial water management solutions. The Center will also serve to develop new methods and technologies to help federal, state and local water managers find innovative and balanced solutions to the nation's most vexing water problems. The toolbox is an important step toward achieving the technology development goals of this center.
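
    The "swap in and out" idea can be illustrated with a minimal sketch (not the Sandia toolbox itself, which is a system dynamics platform): two interchangeable process modules behind one calling convention, so the analyst picks whichever fits the study. The module names and dynamics are invented for illustration.

```python
# Illustrative swappable-module pattern: each module maps (storage, inflow)
# to (new_storage, outflow), so the simulation loop never needs to know
# which physics it is running.

def linear_reservoir(storage, inflow, k=0.1):
    """One hydrology module: outflow proportional to current storage."""
    outflow = k * storage
    return storage + inflow - outflow, outflow

def constant_demand(storage, inflow, demand=2.0):
    """An alternative module: fixed withdrawal, limited by what is available."""
    outflow = min(demand, storage + inflow)
    return storage + inflow - outflow, outflow

def simulate(module, storage, inflows):
    """Run whichever module the analyst has swapped in."""
    series = []
    for q in inflows:
        storage, out = module(storage, q)
        series.append(round(out, 3))
    return series

print(simulate(linear_reservoir, 100.0, [0, 0, 0]))
print(simulate(constant_demand, 100.0, [0, 0, 0]))
```
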

  12. Focused Field Investigations for Sewer Condition Assessment with EPA SSOAP Toolbox

    EPA Science Inventory

    The Nation’s sanitary sewer infrastructure is aging, and is currently one of the top national water program priorities. The U.S. Environmental Protection Agency (EPA) developed the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox to assist communities in developing ...

  13. A Toolbox for Corrective Action: Resource Conservation and Recovery Act Facilities Investigation Remedy Selection Track

    EPA Pesticide Factsheets

    The purpose of this toolbox is to help EPA Regional staff and their partners to take advantage of the efficiency and quality gains from the Resource Conservation and Recovery Act (RCRA) Facilities Investigation Remedy Selection Track (FIRST) approach.

  14. Traffic analysis toolbox volume IX : work zone modeling and simulation, a guide for analysts

    DOT National Transportation Integrated Search

    2009-03-01

    This document is the second volume in the FHWA Traffic Analysis Toolbox: Work Zone Analysis series. Whereas the first volume provides guidance to decision-makers at agencies and jurisdictions considering the role of analytical tools in work zone plan...

  15. An analysis toolbox to explore mesenchymal migration heterogeneity reveals adaptive switching between distinct modes

    PubMed Central

    Shafqat-Abbasi, Hamdah; Kowalewski, Jacob M; Kiss, Alexa; Gong, Xiaowei; Hernandez-Varas, Pablo; Berge, Ulrich; Jafari-Mamaghani, Mehrdad; Lock, John G; Strömblad, Staffan

    2016-01-01

    Mesenchymal (lamellipodial) migration is heterogeneous, although whether this reflects progressive variability or discrete, 'switchable' migration modalities, remains unclear. We present an analytical toolbox, based on quantitative single-cell imaging data, to interrogate this heterogeneity. Integrating supervised behavioral classification with multivariate analyses of cell motion, membrane dynamics, cell-matrix adhesion status and F-actin organization, this toolbox here enables the detection and characterization of two quantitatively distinct mesenchymal migration modes, termed 'Continuous' and 'Discontinuous'. Quantitative mode comparisons reveal differences in cell motion, spatiotemporal coordination of membrane protrusion/retraction, and how cells within each mode reorganize with changed cell speed. These modes thus represent distinctive migratory strategies. Additional analyses illuminate the macromolecular- and cellular-scale effects of molecular targeting (fibronectin, talin, ROCK), including 'adaptive switching' between Continuous (favored at high adhesion/full contraction) and Discontinuous (low adhesion/inhibited contraction) modes. Overall, this analytical toolbox now facilitates the exploration of both spontaneous and adaptive heterogeneity in mesenchymal migration. DOI: http://dx.doi.org/10.7554/eLife.11384.001 PMID:26821527

  16. A Toolbox to Improve Algorithms for Insulin-Dosing Decision Support

    PubMed Central

    Donsa, K.; Plank, J.; Schaupp, L.; Mader, J. K.; Truskaller, T.; Tschapeller, B.; Höll, B.; Spat, S.; Pieber, T. R.

    2014-01-01

    Background: Standardized insulin order sets for subcutaneous basal-bolus insulin therapy are recommended by clinical guidelines for the inpatient management of diabetes. The algorithm-based GlucoTab system electronically assists health care personnel by supporting clinical workflow and providing insulin-dose suggestions. Objective: To develop a toolbox for improving clinical decision-support algorithms. Methods: The toolbox has three main components. 1) Data preparation: Data from several heterogeneous sources is extracted, cleaned and stored in a uniform data format. 2) Simulation: The effects of algorithm modifications are estimated by simulating treatment workflows based on real data from clinical trials. 3) Analysis: Algorithm performance is measured, analyzed and simulated by using data from three clinical trials with a total of 166 patients. Results: Use of the toolbox led to algorithm improvements as well as the detection of potential individualized subgroup-specific algorithms. Conclusion: These results are a first step towards individualized algorithm modifications for specific patient subgroups. PMID:25024768

  17. A Transcription Activator-Like Effector (TALE) Toolbox for Genome Engineering

    PubMed Central

    Sanjana, Neville E.; Cong, Le; Zhou, Yang; Cunniff, Margaret M.; Feng, Guoping; Zhang, Feng

    2013-01-01

    Transcription activator-like effectors (TALEs) are a class of naturally occurring DNA binding proteins found in the plant pathogen Xanthomonas sp. The DNA binding domain of each TALE consists of tandem 34-amino acid repeat modules that can be rearranged according to a simple cipher to target new DNA sequences. Customized TALEs can be used for a wide variety of genome engineering applications, including transcriptional modulation and genome editing. Here we describe a toolbox for rapid construction of custom TALE transcription factors (TALE-TFs) and nucleases (TALENs) using a hierarchical ligation procedure. This toolbox facilitates affordable and rapid construction of custom TALE-TFs and TALENs within one week and can be easily scaled up to construct TALEs for multiple targets in parallel. We also provide details for testing the activity in mammalian cells of custom TALE-TFs and TALENs using, respectively, qRT-PCR and Surveyor nuclease. The TALE toolbox described here will enable a broad range of biological applications. PMID:22222791
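
    The "simple cipher" the abstract refers to pairs each target DNA base with a repeat-variable diresidue (RVD); one common assignment is NI=A, HD=C, NG=T, NN=G. A minimal sketch of that lookup follows; real TALE design also handles the 5' T requirement and the hierarchical assembly chemistry, which are not modeled here.

```python
# One commonly used RVD-to-base assignment for TALE repeats.
RVD_FOR_BASE = {"A": "NI", "C": "HD", "T": "NG", "G": "NN"}

def tale_repeats(target):
    """Return the RVD module order needed to bind the given DNA target."""
    return [RVD_FOR_BASE[base] for base in target.upper()]

print(tale_repeats("ACGT"))
```
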

  18. Velocity Mapping Toolbox (VMT): a processing and visualization suite for moving-vessel ADCP measurements

    USGS Publications Warehouse

    Parsons, D.R.; Jackson, P.R.; Czuba, J.A.; Engel, F.L.; Rhoads, B.L.; Oberg, K.A.; Best, J.L.; Mueller, D.S.; Johnson, K.K.; Riley, J.D.

    2013-01-01

    The use of acoustic Doppler current profilers (ADCP) for discharge measurements and three-dimensional flow mapping has increased rapidly in recent years and has been primarily driven by advances in acoustic technology and signal processing. Recent research has developed a variety of methods for processing data obtained from a range of ADCP deployments and this paper builds on this progress by describing new software for processing and visualizing ADCP data collected along transects in rivers or other bodies of water. The new utility, the Velocity Mapping Toolbox (VMT), allows rapid processing (vector rotation, projection, averaging and smoothing), visualization (planform and cross-section vector and contouring), and analysis of a range of ADCP-derived datasets. The paper documents the data processing routines in the toolbox and presents a set of diverse examples that demonstrate its capabilities. The toolbox is applicable to the analysis of ADCP data collected in a wide range of aquatic environments and is made available as open-source code along with this publication.
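
    The vector-rotation step mentioned above amounts to projecting east/north velocity components onto streamwise/transverse axes. A toy stand-in (VMT itself is MATLAB code with many more options; the helper name and the heading convention here are assumptions):

```python
# Rotate an east/north velocity pair into streamwise/transverse components,
# given the heading of the streamwise axis measured in degrees from east.
import math

def rotate_to_streamwise(ve, vn, heading_deg):
    th = math.radians(heading_deg)
    vs = ve * math.cos(th) + vn * math.sin(th)    # streamwise component
    vt = -ve * math.sin(th) + vn * math.cos(th)   # transverse component
    return round(vs, 6), round(vt, 6)

# A purely eastward current seen with an eastward streamwise axis:
print(rotate_to_streamwise(1.0, 0.0, 0.0))
```
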

  19. ElectroMagnetoEncephalography Software: Overview and Integration with Other EEG/MEG Toolboxes

    PubMed Central

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section. PMID:21577273

  20. ElectroMagnetoEncephalography software: overview and integration with other EEG/MEG toolboxes.

    PubMed

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section.

  1. SBEToolbox: A Matlab Toolbox for Biological Network Analysis

    PubMed Central

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J.

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases. PMID:24027418
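
    As a minimal analogue of one of the centralities such a toolbox computes, degree centrality can be derived directly from an edge list (illustrative Python, not SBEToolbox code, which is MATLAB and offers many more metrics):

```python
# Degree centrality: each node's degree divided by (n - 1) possible neighbours.

def degree_centrality(edges):
    """Map each node to its normalised degree for an undirected edge list."""
    adjacency = {}
    for a, b in edges:
        adjacency.setdefault(a, set()).add(b)
        adjacency.setdefault(b, set()).add(a)
    n = len(adjacency)
    return {node: len(neigh) / (n - 1) for node, neigh in adjacency.items()}

star = [("hub", "a"), ("hub", "b"), ("hub", "c")]
print(degree_centrality(star))
```
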

  2. SBEToolbox: A Matlab Toolbox for Biological Network Analysis.

    PubMed

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases.

  3. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO CONTAMINATION THREATS TO DRINKING WATER SYSTEMS

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  4. UAS-NAS Live Virtual Constructive Distributed Environment (LVC): LVC Gateway, Gateway Toolbox, Gateway Data Logger (GDL), SaaProc Software Design Description

    NASA Technical Reports Server (NTRS)

    Jovic, Srboljub

    2015-01-01

    This document provides the software design description for the two core software components, the LVC Gateway, the LVC Gateway Toolbox, and two participants, the LVC Gateway Data Logger and the SAA Processor (SaaProc).

  5. Expanding the seat belt program strategies toolbox: a starter kit for trying new program ideas : traffic tech.

    DOT National Transportation Integrated Search

    2016-10-01

    The National Highway Traffic Safety Administration has just released a new resource for developing seat belt programs in the traffic safety community: Expanding the Seat Belt Program Toolbox: A Starter Kit for Trying New Program Ideas. Resea...

  6. Focused Field Investigations for Sewer Condition Assessment with EPA SSOAP Toolbox - slides

    EPA Science Inventory

    The Nation’s sanitary sewer infrastructure is aging, and is currently one of the top national water program priorities. The U.S. Environmental Protection Agency (EPA) developed the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox to assist communities in developing S...

  7. Motor assessment using the NIH Toolbox

    PubMed Central

    Magasi, Susan; McCreath, Heather E.; Bohannon, Richard W.; Wang, Ying-Chih; Bubela, Deborah J.; Rymer, William Z.; Beaumont, Jennifer; Rine, Rose Marie; Lai, Jin-Shei; Gershon, Richard C.

    2013-01-01

    Motor function involves complex physiologic processes and requires the integration of multiple systems, including neuromuscular, musculoskeletal, and cardiopulmonary, and neural motor and sensory-perceptual systems. Motor-functional status is indicative of current physical health status, burden of disease, and long-term health outcomes, and is integrally related to daily functioning and quality of life. Given its importance to overall neurologic health and function, motor function was identified as a key domain for inclusion in the NIH Toolbox for Assessment of Neurological and Behavioral Function (NIH Toolbox). We engaged in a 3-stage developmental process to: 1) identify key subdomains and candidate measures for inclusion in the NIH Toolbox, 2) pretest candidate measures for feasibility across the age span of people aged 3 to 85 years, and 3) validate candidate measures against criterion measures in a sample of healthy individuals aged 3 to 85 years (n = 340). Based on extensive literature review and input from content experts, the 5 subdomains of dexterity, strength, balance, locomotion, and endurance were recommended for inclusion in the NIH Toolbox motor battery. Based on our validation testing, valid and reliable measures that are simultaneously low-cost and portable have been recommended to assess each subdomain, including the 9-hole peg board for dexterity, grip dynamometry for upper-extremity strength, standing balance test, 4-m walk test for gait speed, and a 2-minute walk test for endurance. PMID:23479547

  8. MatTAP: A MATLAB toolbox for the control and analysis of movement synchronisation experiments.

    PubMed

    Elliott, Mark T; Welchman, Andrew E; Wing, Alan M

    2009-02-15

    Investigating movement timing and synchronisation at the sub-second range relies on an experimental setup that has high temporal fidelity, is able to deliver output cues and can capture corresponding responses. Modern, multi-tasking operating systems make this increasingly challenging when using standard PC hardware and programming languages. This paper describes a new free suite of tools (available from http://www.snipurl.com/mattap) for use within the MATLAB programming environment, compatible with Microsoft Windows and a range of data acquisition hardware. The toolbox allows flexible generation of timing cues with high temporal accuracy, the capture and automatic storage of corresponding participant responses and an integrated analysis module for the rapid processing of results. A simple graphical user interface is used to navigate the toolbox and so can be operated easily by users not familiar with programming languages. However, it is also fully extensible and customisable, allowing adaptation for individual experiments and facilitating the addition of new modules in future releases. Here we discuss the relevance of the MatTAP (MATLAB Timing Analysis Package) toolbox to current timing experiments and compare its use to alternative methods. We validate the accuracy of the analysis module through comparison to manual observation methods and replicate a previous sensorimotor synchronisation experiment to demonstrate the versatility of the toolbox features demanded by such movement synchronisation paradigms.
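
    The core of a synchronisation analysis is pairing each output cue with the nearest captured response and summarising the signed offsets. A hedged Python sketch of that step (MatTAP is MATLAB code and additionally handles cue generation, capture hardware and data storage):

```python
# Pair each metronome cue with the nearest tap and report signed
# asynchronies (tap time minus cue time, in milliseconds).

def asynchronies(cues, taps):
    """For each cue time, signed offset of the nearest response tap."""
    return [min(taps, key=lambda t: abs(t - c)) - c for c in cues]

cues = [500, 1000, 1500, 2000]
taps = [480, 1010, 1470, 2005]
offsets = asynchronies(cues, taps)
mean_async = sum(offsets) / len(offsets)
print(offsets, mean_async)
```

    A negative mean asynchrony, as in this toy data, is the classic finding that taps anticipate the cue.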

  9. Data Driven Model Development for the Supersonic Semispan Transport (S(sup 4)T)

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2011-01-01

    We investigate two common approaches to model development for robust control synthesis in the aerospace community; namely, reduced-order aeroservoelastic modelling based on structural finite-element and computational fluid dynamics based aerodynamic models, and a data-driven system identification procedure. It is shown via analysis of experimental SuperSonic SemiSpan Transport (S4T) wind-tunnel data using a system identification approach that it is possible to estimate a model at a fixed Mach number which is parsimonious and robust across varying dynamic pressures.
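
    The data-driven route can be illustrated with the simplest possible case: fitting a first-order ARX model y[k] = a*y[k-1] + b*u[k-1] by least squares, with the 2x2 normal equations solved by hand to stay dependency-free. The data below are synthetic, not the S4T wind-tunnel records.

```python
# Least-squares fit of y[k] = a*y[k-1] + b*u[k-1] from input/output records.

def fit_arx1(u, y):
    s11 = sum(yp * yp for yp in y[:-1])
    s12 = sum(yp * up for yp, up in zip(y[:-1], u[:-1]))
    s22 = sum(up * up for up in u[:-1])
    r1 = sum(yp * yk for yp, yk in zip(y[:-1], y[1:]))
    r2 = sum(up * yk for up, yk in zip(u[:-1], y[1:]))
    det = s11 * s22 - s12 * s12
    a = (r1 * s22 - r2 * s12) / det
    b = (r2 * s11 - r1 * s12) / det
    return round(a, 6), round(b, 6)

# Noise-free data generated from a = 0.8, b = 0.5; the fit recovers them.
u = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0, 1.0, -1.0]
y = [0.0]
for k in range(1, len(u)):
    y.append(0.8 * y[k - 1] + 0.5 * u[k - 1])
print(fit_arx1(u, y))
```
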

  10. An ethics toolbox for neurotechnology.

    PubMed

    Farah, Martha J

    2015-04-08

    Advances in neurotechnology will raise new ethical dilemmas, to which scientists and the rest of society must respond. Here I present a "toolbox" of concepts to help us analyze these issues and communicate with each other about them across differences of ethical intuition. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. The triticeae toolbox: combining phenotype and genotype data to advance small-grains breeding

    USDA-ARS?s Scientific Manuscript database

    The Triticeae Toolbox (http://triticeaetoolbox.org; T3) is the database schema enabling plant breeders and researchers to combine, visualize, and interrogate the wealth of phenotype and genotype data generated by the Triticeae Coordinated Agricultural Project (TCAP). T3 enables users to define speci...

  12. Wastewater Collection System Toolbox | Eliminating Sanitary ...

    EPA Pesticide Factsheets

    2017-04-10

    Communities across the United States are working to find cost-effective, long-term approaches to managing their aging wastewater infrastructure and preventing the problems that lead to sanitary sewer overflows. The Toolbox is an effort by EPA New England to provide examples of programs and educational efforts from New England and beyond.

  13. LAB ANALYSIS OF EMERGENCY WATER SAMPLES CONTAINING UNKNOWN CONTAMINANTS: CONSIDERATIONS FROM THE USEPA RESPONSE PROTOCOL TOOLBOX

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  14. 40 CFR 141.717 - Pre-filtration treatment toolbox components.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... surface water or GWUDI source. (c) Bank filtration. Systems receive Cryptosporidium treatment credit for... paragraph. Systems using bank filtration when they begin source water monitoring under § 141.701(a) must...

  15. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  16. Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach

    ERIC Educational Resources Information Center

    Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…

  17. Multi-damage identification based on joint approximate diagonalisation and robust distance measure

    NASA Astrophysics Data System (ADS)

    Cao, S.; Ouyang, H.

    2017-05-01

    Mode shapes or operational deflection shapes are highly sensitive to damage and can be used for multi-damage identification. Nevertheless, one drawback of such methods is that the extracted spatial shape features tend to be compromised by noise, which degrades their damage identification accuracy, especially for incipient damage. To overcome this, joint approximate diagonalisation (JAD), also known as simultaneous diagonalisation, is investigated to estimate mode shapes (MSs) statistically. The major advantage of the JAD method is that it efficiently provides the common eigenstructure of a set of power spectral density matrices. In this paper, a new criterion in terms of the coefficient of variation (CV) is utilised to numerically demonstrate the better noise robustness and accuracy of the JAD method over the traditional frequency domain decomposition (FDD) method. Another original contribution is that a new robust damage index (DI) is proposed, which comprises local MS distortions of several modes weighted by their associated vibration participation factors. The advantage of doing this is to include fair contributions from the changes of all modes concerned. Moreover, the proposed DI provides a measure of damage-induced changes in 'modal vibration energy' in terms of the selected mode shapes. Finally, an experimental study is presented to verify the efficiency and noise robustness of the JAD method and the proposed DI. The results show that the proposed DI is effective and robust under random vibration situations, which indicates that it has the potential to be applied to practical engineering structures with ambient excitations.
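
    A schematic version of a participation-weighted damage index: at each node, sum the absolute mode-shape change over the selected modes, weighting each mode by its normalised participation factor. The shapes and weights below are synthetic, and the paper's exact DI definition differs in detail.

```python
# Participation-weighted damage index sketch over synthetic mode shapes.

def damage_index(phi_undamaged, phi_damaged, participation):
    """phi_*: list of mode shapes, each a list of values over the nodes."""
    total = sum(participation)
    weights = [p / total for p in participation]  # normalised factors
    n_nodes = len(phi_undamaged[0])
    return [
        round(sum(w * abs(pd[j] - pu[j])
                  for w, pu, pd in zip(weights, phi_undamaged, phi_damaged)), 6)
        for j in range(n_nodes)
    ]

phi_u = [[0.0, 0.5, 1.0], [0.0, 1.0, -1.0]]   # two modes, three nodes
phi_d = [[0.0, 0.6, 1.0], [0.0, 1.2, -1.0]]   # distortion localised at node 1
di = damage_index(phi_u, phi_d, participation=[3.0, 1.0])
print(di)
```

    The index peaks at the node where both modes distort, which is the localisation behaviour a DI of this family is designed to exhibit.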

  18. Mechanostimulation protocols for cardiac tissue engineering.

    PubMed

    Govoni, Marco; Muscari, Claudio; Guarnieri, Carlo; Giordano, Emanuele

    2013-01-01

    Owing to the inability of self-replacement by a damaged myocardium, alternative strategies to heart transplantation have been explored within the last decades, and cardiac tissue engineering/regenerative medicine is among the present challenges in biomedical research. Encouragingly, several studies attest to the constant extension of the toolbox available to engineer a fully functional, contractile, and robust cardiac tissue using different combinations of cells, template bioscaffolds, and biophysical stimuli obtained by the use of specific bioreactors. Mechanical forces influence the growth and shape of every tissue in our body, generating changes in intracellular biochemistry and gene expression. That is why bioreactors play a central role in the task of regenerating a complex tissue such as the myocardium. In the last fifteen years a large number of dynamic culture devices have been developed and many results have been collected. The aim of this brief review is to summarize, in a single streamlined paper, the state of the art in this field.

  19. Mechanostimulation Protocols for Cardiac Tissue Engineering

    PubMed Central

    Govoni, Marco; Muscari, Claudio; Guarnieri, Carlo; Giordano, Emanuele

    2013-01-01

    Owing to the inability of self-replacement by a damaged myocardium, alternative strategies to heart transplantation have been explored within the last decades, and cardiac tissue engineering/regenerative medicine is among the present challenges in biomedical research. Encouragingly, several studies attest to the constant extension of the toolbox available to engineer a fully functional, contractile, and robust cardiac tissue using different combinations of cells, template bioscaffolds, and biophysical stimuli obtained by the use of specific bioreactors. Mechanical forces influence the growth and shape of every tissue in our body, generating changes in intracellular biochemistry and gene expression. That is why bioreactors play a central role in the task of regenerating a complex tissue such as the myocardium. In the last fifteen years a large number of dynamic culture devices have been developed and many results have been collected. The aim of this brief review is to summarize, in a single streamlined paper, the state of the art in this field. PMID:23936858

  20. A convolutional neural network approach to calibrating the rotation axis for X-ray computed tomography.

    PubMed

    Yang, Xiaogang; De Carlo, Francesco; Phatak, Charudatta; Gürsoy, Doğa

    2017-03-01

    This paper presents an algorithm to calibrate the center-of-rotation for X-ray tomography by using a machine learning approach, the Convolutional Neural Network (CNN). The algorithm shows excellent accuracy from the evaluation of synthetic data with various noise ratios. It is further validated with experimental data of four different shale samples measured at the Advanced Photon Source and at the Swiss Light Source. The results are as good as those determined by visual inspection and show better robustness than conventional methods. CNN has also great potential for reducing or removing other artifacts caused by instrument instability, detector non-linearity, etc. An open-source toolbox, which integrates the CNN methods described in this paper, is freely available through GitHub at tomography/xlearn and can be easily integrated into existing computational pipelines available at various synchrotron facilities. Source code, documentation and information on how to contribute are also provided.
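
    One conventional baseline that such a CNN approach is compared against exploits the fact that, for parallel-beam data, the 180-degree projection mirrors the 0-degree projection about the rotation axis; registering one projection against the flip of the other locates the axis. A brute-force integer-shift sketch on synthetic one-dimensional projections (not the paper's code):

```python
# Estimate the rotation-axis column by aligning p0 with the flipped p180.

def center_from_projections(p0, p180):
    flipped = p180[::-1]
    n = len(p0)
    best_shift, best_score = 0, float("inf")
    for s in range(-n // 2, n // 2):
        # Sum of squared differences over the overlapping detector columns.
        score = sum((p0[i] - flipped[i - s]) ** 2
                    for i in range(max(0, s), min(n, n + s)))
        if score < best_score:
            best_shift, best_score = s, score
    # A shift of s means the axis sits s/2 columns right of detector center.
    return (n - 1 + best_shift) / 2.0

n = 64
axis = 35   # true rotation-axis column (detector center would be 31.5)
obj = 40    # feature position in the 0-degree projection
p0 = [1.0 if i == obj else 0.0 for i in range(n)]
p180 = [1.0 if i == 2 * axis - obj else 0.0 for i in range(n)]  # mirrored
print(center_from_projections(p0, p180))
```

    As the abstract notes, image-registration baselines like this degrade with noise, which is the motivation for the learned approach.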

  1. In vivo measurement of muscle output in intact Drosophila.

    PubMed

    Elliott, Christopher J H; Sparrow, John C

    2012-01-01

    We describe our methods for analysing muscle function in a whole intact small insect, taking advantage of a simple flexible optical beam to produce an inexpensive transducer with wide application. We review our previous data measuring the response to a single-action-potential-driven muscle twitch, used to explore jumping behaviour in Drosophila melanogaster. In the fruitfly, where the sophisticated and powerful genetic toolbox is being widely employed to investigate neuromuscular function, we further demonstrate the use of the apparatus to analyse in detail, within whole flies, neuronal and muscle mutations affecting activation of muscle contraction in the jump muscle. We have now extended the use of the apparatus to record the muscle forces during larval and other aspects of adult locomotion. The robustness, simplicity and versatility of the apparatus are key to these measurements. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Design And Implementation Of PID Controller Using Relay Feedback On TRMS (Twin Rotor MIMO System)

    NASA Astrophysics Data System (ADS)

    Shah, Dipesh H.

    2011-12-01

    Today, many process control problems can be adequately and routinely solved by conventional PID control strategies. The overriding reason that the PID controller is so widely accepted is its simple structure, which has proved to be very robust against many commonly met process control problems, for instance disturbances and nonlinearities. Relay feedback methods have been widely used in tuning proportional-integral-derivative controllers due to their closed-loop nature. In this work, a relay-based PID controller is designed and successfully implemented on a TRMS (Twin Rotor MIMO System) in SISO and MIMO configurations. The performance of the relay-based PID controller for control of the TRMS is investigated and found to be satisfactory. The system shares some features with a helicopter, such as important interactions between the vertical and horizontal motions. The RTWT (Real-Time Windows Target) toolbox in the MATLAB environment is used to perform real-time experiments.
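
    A relay experiment yields the ultimate gain Ku = 4d/(pi*a) (relay amplitude d, measured process oscillation amplitude a) and the ultimate period Tu of the induced limit cycle, from which the classic Ziegler-Nichols PID settings follow. A sketch with the textbook constants; for the actual TRMS, a and Tu would come from the relay test rather than the placeholder values used here.

```python
# Ziegler-Nichols PID settings from a relay-feedback experiment.
import math

def zn_pid_from_relay(d, a, Tu):
    """d: relay amplitude, a: oscillation amplitude, Tu: ultimate period."""
    Ku = 4.0 * d / (math.pi * a)   # describing-function estimate of Ku
    Kp = 0.6 * Ku                  # classic ZN PID rules
    Ti = Tu / 2.0
    Td = Tu / 8.0
    return round(Kp, 4), round(Ti, 4), round(Td, 4)

print(zn_pid_from_relay(d=1.0, a=0.5, Tu=2.0))
```
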

  3. Fault-tolerant Control of a Cyber-physical System

    NASA Astrophysics Data System (ADS)

    Roxana, Rusu-Both; Eva-Henrietta, Dulf

    2017-10-01

    Cyber-physical systems represent a new emerging field in automatic control. Fault handling is a key component, because modern, large-scale processes must meet high standards of performance, reliability and safety. Fault propagation in large-scale chemical processes can lead to loss of production, energy, raw materials and even environmental hazard. The present paper develops a multi-agent fault-tolerant control architecture using robust fractional-order controllers for a (13C) cryogenic separation column cascade. The JADE (Java Agent DEvelopment Framework) platform was used to implement the multi-agent fault-tolerant control system, while the operational model of the process was implemented in the Matlab/SIMULINK environment. The MACSimJX (Multiagent Control Using Simulink with Jade Extension) toolbox was used to link the control system and the process model. In order to verify the performance and to prove the feasibility of the proposed control architecture, several fault simulation scenarios were performed.

  4. Robust nano-fabrication of an integrated platform for spin control in a tunable microcavity

    NASA Astrophysics Data System (ADS)

    Bogdanović, Stefan; Liddy, Madelaine S. Z.; van Dam, Suzanne B.; Coenen, Lisanne C.; Fink, Thomas; Lončar, Marko; Hanson, Ronald

    2017-12-01

    Coupling nitrogen-vacancy (NV) centers in diamonds to optical cavities is a promising way to enhance the efficiency of diamond-based quantum networks. An essential aspect of the full toolbox required for the operation of these networks is the ability to achieve microwave control of the electron spin associated with this defect within the cavity framework. Here, we report on the fabrication of an integrated platform for the microwave control of an NV center electron spin in an open, tunable Fabry-Pérot microcavity. Measurements of the cavity's finesse reveal that the presented fabrication process does not compromise its optical properties. We provide a method to incorporate a thin diamond slab into the cavity architecture and demonstrate the control of the NV center spin. These results show the promise of this design for future cavity-enhanced NV center spin-photon entanglement experiments.

  5. Impact-oriented steering--the concept of NGO-IDEAs 'impact toolbox'.

    PubMed

    2008-03-01

    The NGO-IDEAs 'Impact Toolbox' has been developed with a group of NGOs all of which are active in the area of saving and credit in South India. This compilation of methods to apply in impact-oriented steering was devised by the executive staff of the Indian partner NGOs, also known as the Resource Persons, in 2006 and tested from late 2006 to early 2007. At first glance, the approach may appear to be highly specialised and difficult to transfer. However, in fact it follows principles that can be adapted for several NGOs in other countries and in other sectors. The following article presents the concept of the NGO-IDEAs 'Impact Toolbox'.

  6. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing.

    PubMed

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments, each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user-friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible as it comes with several signal generators and can be easily extended for any experiment.
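    As an illustration of the Staircase family mentioned above, a 2-down/1-up transformed up-down rule lowers the stimulus level after two consecutive correct responses and raises it after each error (converging near the 70.7%-correct point). A minimal sketch, independent of the toolbox's actual implementation:

```python
def staircase_2down1up(start, step, responses):
    """Track a 2-down/1-up transformed up-down staircase.

    start     : initial stimulus level
    step      : fixed step size
    responses : iterable of booleans (True = correct)
    Returns the level presented on each trial (including the next one).
    """
    levels = [start]
    consecutive_correct = 0
    for correct in responses:
        level = levels[-1]
        if correct:
            consecutive_correct += 1
            if consecutive_correct == 2:   # two correct in a row -> harder
                level -= step
                consecutive_correct = 0
        else:                              # one error -> easier
            level += step
            consecutive_correct = 0
        levels.append(level)
    return levels

# Two correct answers lower the level; a single error raises it.
print(staircase_2down1up(40, 5, [True, True, True, False]))  # -> [40, 40, 35, 35, 40]
```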

  7. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing

    PubMed Central

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments, each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user-friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible as it comes with several signal generators and can be easily extended for any experiment. PMID:25101013

  8. ViennaNGS: A toolbox for building efficient next-generation sequencing analysis pipelines

    PubMed Central

    Wolfinger, Michael T.; Fallmann, Jörg; Eggenhofer, Florian; Amman, Fabian

    2015-01-01

    Recent achievements in next-generation sequencing (NGS) technologies have led to a high demand for reusable software components to easily compile customized analysis workflows for big genomics data. We present ViennaNGS, an integrated collection of Perl modules focused on building efficient pipelines for NGS data processing. It comes with functionality for extracting and converting features from common NGS file formats, computation and evaluation of read mapping statistics, as well as normalization of RNA abundance. Moreover, ViennaNGS provides software components for identification and characterization of splice junctions from RNA-seq data, parsing and condensing sequence motif data, automated construction of Assembly and Track Hubs for the UCSC genome browser, as well as wrapper routines for a set of commonly used NGS command line tools. PMID:26236465
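    ViennaNGS itself is a Perl library; as a language-neutral illustration of the kind of feature extraction from common NGS file formats it performs, here is a hedged Python sketch that parses a 6-column BED record (field layout follows the public BED convention, not ViennaNGS's API):

```python
def parse_bed_line(line):
    """Parse one line of a (minimal, up to 6-column) BED record.
    BED coordinates are 0-based, half-open, so length = end - start."""
    fields = line.rstrip("\n").split("\t")
    chrom, start, end = fields[0], int(fields[1]), int(fields[2])
    record = {"chrom": chrom, "start": start, "end": end,
              "length": end - start}
    if len(fields) >= 6:
        record.update(name=fields[3], score=float(fields[4]),
                      strand=fields[5])
    return record

rec = parse_bed_line("chr1\t100\t250\tpeak1\t13.5\t+")
print(rec["length"], rec["strand"])  # -> 150 +
```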

  9. Approaches for assessing and discovering protein interactions in cancer

    PubMed Central

    Mohammed, Hisham; Carroll, Jason S.

    2013-01-01

    Significant insight into the function of proteins can be delineated by discovering and characterising interacting proteins. There are numerous methods for the discovery of unknown associated protein networks, with purification of the bait (the protein of interest) followed by Mass Spectrometry (MS) as a common theme. In recent years, advances have permitted the purification of endogenous proteins and methods for scaling down starting material. As such, approaches for rapid, unbiased identification of protein interactomes are becoming a standard tool in the researcher's toolbox, rather than a technique that is only available to specialists. This review will highlight some of the recent technical advances in proteomics-based discovery approaches, the pros and cons of various methods and some of the key findings in cancer related systems. PMID:24072816

  10. Engineering a robust DNA split proximity circuit with minimized circuit leakage

    PubMed Central

    Ang, Yan Shan; Tong, Rachel; Yung, Lin-Yue Lanry

    2016-01-01

    DNA circuits are a versatile and highly programmable toolbox that can potentially be used for the autonomous sensing of dynamic events, such as biomolecular interactions. However, the experimental implementation of in silico circuit designs has been hindered by the problem of circuit leakage. Here, we systematically analyzed the sources and characteristics of various types of leakage in a split proximity circuit which was engineered to spatially probe for target sites held within close proximity. Direct evidence that 3′-truncated oligonucleotides were the major impurity contributing to circuit leakage is presented. More importantly, a unique strategy of translocating a single nucleotide between domains, termed ‘inter-domain bridging’, was introduced to eliminate toehold-independent leakages while enhancing the strand displacement kinetics across a three-way junction. We also analyzed the dynamics of intermediate complexes involved in the circuit computation in order to define the working range of domain lengths for the reporter toehold and association region, respectively. The final circuit design was successfully implemented on a model streptavidin-biotin system and demonstrated to be robust against both circuit leakage and biological interferences. We anticipate that this simple signal transduction strategy can be used to probe for diverse biomolecular interactions when used in conjunction with specific target recognition moieties. PMID:27207880

  11. How can sensitivity analysis improve the robustness of mathematical models utilized by the re/insurance industry?

    NASA Astrophysics Data System (ADS)

    Noacco, V.; Wagener, T.; Pianosi, F.; Philp, T.

    2017-12-01

    Insurance companies provide insurance against a wide range of threats, such as natural catastrophes, nuclear incidents and terrorism. To quantify risk and support investment decisions, mathematical models are used, for example to set the premiums charged to clients that protect from financial loss, should deleterious events occur. While these models are essential tools for adequately assessing the risk attached to an insurer's portfolio, their development is costly and their value for decision-making may be limited by an incomplete understanding of uncertainty and sensitivity. Aside from the business need to understand risk and uncertainty, the insurance sector also faces regulation which requires them to test their models in such a way that uncertainties are appropriately captured and that plans are in place to assess the risks and their mitigation. Building and testing models is a costly and time-intensive activity for insurance companies. This study uses an established global sensitivity analysis toolbox (SAFE) to more efficiently capture the uncertainties and sensitivities embedded in models used by a leading re/insurance firm, with structured approaches to validate these models and test the impact of assumptions on the model predictions. It is hoped that this in turn will lead to better-informed and more robust business decisions.
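    A flavor of the sensitivity measures such a toolbox computes can be sketched with one-at-a-time elementary effects, EE_i = (f(x + Δe_i) − f(x))/Δ. This is a simplified illustration of the idea, not SAFE's actual API:

```python
def elementary_effects(f, base, delta):
    """One-at-a-time elementary effects of f at a base point:
    EE_i = (f(x + delta * e_i) - f(x)) / delta for each input i."""
    y0 = f(base)
    effects = []
    for i in range(len(base)):
        x = list(base)
        x[i] += delta          # perturb one input, hold the rest fixed
        effects.append((f(x) - y0) / delta)
    return effects

# For a linear model the effects recover the coefficients (up to
# floating-point rounding), which makes the method easy to sanity-check.
model = lambda x: 3.0 * x[0] - 2.0 * x[1] + 0.5 * x[2]
print(elementary_effects(model, [1.0, 1.0, 1.0], 0.1))
```

    SAFE's global methods average many such effects over randomized trajectories through the input space; the single-point version above only conveys the core finite-difference idea.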

  12. Chemical Microsensor Development for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Xu, Jennifer C.; Hunter, Gary W.; Lukco, Dorothy; Chen, Liangyu; Biaggi-Labiosa, Azlin M.

    2013-01-01

    Numerous aerospace applications, including low-false-alarm fire detection, environmental monitoring, fuel leak detection, and engine emission monitoring, would benefit greatly from chemical microsensors that are robust and low in weight, cost, and power consumption. NASA Glenn Research Center has been working to develop a variety of chemical microsensors with these attributes to address the aforementioned applications. Chemical microsensors using different material platforms and sensing mechanisms have been produced. Approaches using electrochemical cells, resistors, and Schottky diode platforms, combined with nano-based materials, high temperature solid electrolytes, and room temperature polymer electrolytes have been realized to enable different types of microsensors. By understanding the application needs and chemical gas species to be detected, sensing materials and unique microfabrication processes were selected and applied. The chemical microsensors were designed utilizing simple structures and the fewest microfabrication processes possible, while maintaining high yield and low cost. In this presentation, an overview of carbon dioxide (CO2), oxygen (O2), and hydrogen/hydrocarbons (H2/CxHy) microsensors and their fabrication, testing results, and applications will be given. Particular challenges associated with improving the H2/CxHy microsensor contact wire-bonding pad will be discussed. These microsensors represent our research approach and serve as major tools as we expand our sensor development toolbox. Our ultimate goal is to develop robust chemical microsensor systems for aerospace and commercial applications.

  13. Striking against bioterrorism with advanced proteomics and reference methods.

    PubMed

    Armengaud, Jean

    2017-01-01

    The intentional use by terrorists of biological toxins as weapons has been of great concern for many years. Among the numerous toxins produced by plants, animals, algae, fungi, and bacteria, ricin is one of the most scrutinized by the media because it has already been used in biocrimes and acts of bioterrorism. Improving the analytical toolbox of national authorities to monitor these potential bioweapons all at once is of the utmost interest. MS/MS allows their absolute quantitation and exhibits advantageous sensitivity, discriminative power, multiplexing possibilities, and speed. In this issue of Proteomics, Gilquin et al. (Proteomics 2017, 17, 1600357) present a robust multiplex assay to quantify a set of eight toxins in the presence of a complex food matrix. This MS/MS reference method is based on scheduled SRM and high-quality standards consisting of isotopically labeled versions of these toxins. Their results demonstrate robust reliability based on rather loose scheduling of SRM transitions and good sensitivity for the eight toxins, lower than their oral median lethal doses. In the face of an increased threat from terrorism, relevant reference assays based on advanced proteomics and high-quality companion toxin standards provide reliable and firm answers. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    NASA Astrophysics Data System (ADS)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. 
A growing suite of metadata-aware editing, quality control, analysis and synthesis tools is provided with the software to support managing data using graphical forms and command-line functions, as well as developing automated workflows for unattended processing. Finalized data and structured metadata can be exported in a wide variety of text and MATLAB formats or uploaded to a relational database for long-term archiving and distribution. The GCE Data Toolbox can be used as a complete, lightweight solution for environmental data and metadata management, but it can also be used in conjunction with other cyber infrastructure to provide a more comprehensive solution. For example, newly acquired data can be retrieved from a Data Turbine or Campbell LoggerNet Database server for quality control and processing, then transformed to CUAHSI Observations Data Model format and uploaded to a HydroServer for distribution through the CUAHSI Hydrologic Information System. The GCE Data Toolbox can also be leveraged in analytical workflows developed using Kepler or other systems that support MATLAB integration or tool chaining. This software can therefore be leveraged in many ways to help researchers manage, analyze and distribute the data they collect.
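    The metadata-driven quality control described above boils down to applying declarative rules to data columns and recording qualifier flags alongside the values. A minimal illustration in Python (the flag codes and rule format here are assumptions for the sketch, not the GCE Data Toolbox's own conventions):

```python
def apply_qc_rules(values, rules):
    """Flag each value against simple range rules, mimicking
    metadata-driven QC: returns a parallel list of qualifier flags
    ('' = ok, 'Q' = questionable/out of range, 'I' = invalid/missing)."""
    lo, hi = rules["min"], rules["max"]
    flags = []
    for v in values:
        if v is None:
            flags.append("I")
        elif v < lo or v > hi:
            flags.append("Q")
        else:
            flags.append("")
    return flags

# e.g. a water-temperature column with a plausible range of 0-40 degC
print(apply_qc_rules([12.1, 55.0, None, 18.3], {"min": 0, "max": 40}))
# -> ['', 'Q', 'I', '']
```

    Keeping the flags in a structure parallel to the data, rather than deleting suspect values, is what lets later processing steps revisit or override earlier QC decisions.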

  15. De-identification of health records using Anonym: effectiveness and robustness across datasets.

    PubMed

    Zuccon, Guido; Kotzur, Daniel; Nguyen, Anthony; Bergheim, Anton

    2014-07-01

    Evaluate the effectiveness and robustness of Anonym, a tool for de-identifying free-text health records based on conditional random fields classifiers informed by linguistic and lexical features, as well as features extracted by pattern matching techniques. De-identification of personal health information in electronic health records is essential for the sharing and secondary usage of clinical data. De-identification tools that adapt to different sources of clinical data are attractive as they would require minimal intervention to guarantee high effectiveness. The effectiveness and robustness of Anonym are evaluated across multiple datasets, including the widely adopted Integrating Biology and the Bedside (i2b2) dataset, used for evaluation in a de-identification challenge. The datasets used here vary in type of health records, source of data, and their quality, with one of the datasets containing optical character recognition errors. Anonym identifies and removes up to 96.6% of personal health identifiers (recall) with a precision of up to 98.2% on the i2b2 dataset, outperforming the best system proposed in the i2b2 challenge. The effectiveness of Anonym across datasets is found to depend on the amount of information available for training. Findings show that Anonym compares to the best approach from the 2006 i2b2 shared task. It is easy to retrain Anonym with new datasets; if retrained, the system is robust to variations of training size, data type and quality in the presence of sufficient training data. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
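    The pattern-matching features mentioned above can be illustrated with a few regular-expression rules that replace identifier spans with category tags. Real systems such as Anonym combine patterns like these with trained conditional random fields classifiers; the patterns below are hypothetical examples only:

```python
import re

# Hypothetical pattern set for three identifier categories.
PATTERNS = [
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
    (re.compile(r"\b\d{3}-\d{3}-\d{4}\b"), "[PHONE]"),
    (re.compile(r"\bDr\.\s+[A-Z][a-z]+\b"), "[NAME]"),
]

def deidentify(text):
    """Replace pattern-matched personal health identifiers with tags."""
    for pattern, tag in PATTERNS:
        text = pattern.sub(tag, text)
    return text

print(deidentify("Dr. Smith saw the patient on 3/14/2014, call 555-123-4567."))
# -> [NAME] saw the patient on [DATE], call [PHONE].
```

    Pure pattern rules achieve high precision on well-formed identifiers but miss context-dependent ones (e.g. bare surnames), which is exactly the gap the statistical classifier fills.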

  16. Evaluation of mass spectrometric data using principal component analysis for determination of the effects of organic lakes on protein binder identification.

    PubMed

    Hrdlickova Kuckova, Stepanka; Rambouskova, Gabriela; Hynek, Radovan; Cejnar, Pavel; Oltrogge, Doris; Fuchs, Robert

    2015-11-01

    Matrix-assisted laser desorption/ionisation-time of flight (MALDI-TOF) mass spectrometry is commonly used for the identification of proteinaceous binders and their mixtures in artworks. The determination of protein binders is based on a comparison between the m/z values of tryptic peptides in the unknown sample and a reference one (egg, casein, animal glues etc.), but the method has greater potential to study changes due to ageing and the influence of organic/inorganic components on protein identification. However, it is then necessary to carry out statistical evaluation on the obtained data. Until now, it has been complicated to routinely import mass spectrometric data into a statistical programme and to extract and match the appropriate peaks; only a few 'homemade' computer programmes, lacking user-friendly interfaces, have been available for these purposes. In this paper, we present our completely new, publicly available, non-commercial software, ms-alone and multiMS-toolbox, for principal component analysis of MALDI-TOF MS data in R, and its application to the study of the influence of heterogeneous matrices (organic lakes) on protein identification. Using this new software, we determined the main factors that influence the protein analyses of artificially aged model mixtures of organic lakes and fish glue, prepared according to historical recipes that were used for book illumination, using MALDI-TOF peptide mass mapping. Copyright © 2015 John Wiley & Sons, Ltd.
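    The principal component analysis step itself is standard: mean-centre the samples × peaks intensity matrix and project it onto the leading right singular vectors. A minimal numpy sketch (the example matrix is synthetic, not data from the paper):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Principal component scores of a samples x peaks intensity
    matrix via SVD of the mean-centred data."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T   # project onto leading components

# Tiny illustrative matrix: 4 "spectra" x 3 "peaks"
X = np.array([[1.0, 0.0, 0.0],
              [2.0, 0.1, 0.0],
              [0.0, 1.0, 1.0],
              [0.1, 2.0, 1.1]])
scores = pca_scores(X)
print(scores.shape)  # -> (4, 2)
```

    In the binder-identification setting, the rows would be peak-matched spectra of samples and references, and clustering of scores along the leading components is what reveals matrix (organic lake) effects.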

  17. Policy Analysis for Sustainable Development: The Toolbox for the Environmental Social Scientist

    ERIC Educational Resources Information Center

    Runhaar, Hens; Dieperink, Carel; Driessen, Peter

    2006-01-01

    Purpose: The paper seeks to propose the basic competencies of environmental social scientists regarding policy analysis for sustainable development. The ultimate goal is to contribute to an improvement of educational programmes in higher education by suggesting a toolbox that should be integrated in the curriculum. Design/methodology/approach:…

  18. EPA RESPONSE PROTOCOL TOOLBOX TO HELP EVALUATION OF CONTAMINATION THREATS & RESPONDING TO THREATS: MODULE 1-WATER UTILITY PLANNING GUIDE

    EPA Science Inventory

    EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...

  19. Rural ITS toolbox and deployment plan for Regions 2, 6, 7 and 9 : ITS toolbox for rural and small urban areas

    DOT National Transportation Integrated Search

    1998-12-01

    As a part of the Small Urban and Rural ITS Study it conducted in 4 of its more rural regions, the New York State Department of Transportation has developed a compendium of systems, devices and strategies that can enhance safety, provide information, ...

  20. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS. OVERVIEW AND APPLICATION. INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  1. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS. MODULE 4: ANALYTICAL GUIDE. INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  2. The Psychometric Toolbox: An Excel Package for Use in Measurement and Psychometrics Courses

    ERIC Educational Resources Information Center

    Ferrando, Pere J.; Masip-Cabrera, Antoni; Navarro-González, David; Lorenzo-Seva, Urbano

    2017-01-01

    The Psychometric Toolbox (PT) is a user-friendly, non-commercial package mainly intended to be used for instructional purposes in introductory courses of educational and psychological measurement, psychometrics and statistics. The PT package is organized in six separate modules or sub-programs: Data preprocessor (descriptive analyses and data…

  3. Toolbox or Adjustable Spanner? A Critical Comparison of Two Metaphors for Adaptive Decision Making

    ERIC Educational Resources Information Center

    Söllner, Anke; Bröder, Arndt

    2016-01-01

    For multiattribute decision tasks, different metaphors exist that describe the process of decision making and its adaptation to diverse problems and situations. Multiple strategy models (MSMs) assume that decision makers choose adaptively from a set of different strategies (toolbox metaphor), whereas evidence accumulation models (EAMs) hold that a…

  4. FALCON: a toolbox for the fast contextualization of logical networks

    PubMed Central

    De Landtsheer, Sébastien; Trairatphisan, Panuwat; Lucarelli, Philippe; Sauter, Thomas

    2017-01-01

    Motivation: Mathematical modelling of regulatory networks allows for the discovery of knowledge at the system level. However, existing modelling tools are often computation-heavy and do not offer intuitive ways to explore the model, to test hypotheses or to interpret the results biologically. Results: We have developed a computational approach to contextualize logical models of regulatory networks with biological measurements based on a probabilistic description of rule-based interactions between the different molecules. Here, we propose a Matlab toolbox, FALCON, to automatically and efficiently build and contextualize networks, which includes a pipeline for conducting parameter analysis, knockouts and easy and fast model investigation. The contextualized models could then provide qualitative and quantitative information about the network and suggest hypotheses about biological processes. Availability and implementation: FALCON is freely available for non-commercial users on GitHub under the GPLv3 licence. The toolbox, installation instructions, full documentation and test datasets are available at https://github.com/sysbiolux/FALCON. FALCON runs under Matlab (MathWorks) and requires the Optimization Toolbox. Contact: thomas.sauter@uni.lu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28673016

  5. FALCON: a toolbox for the fast contextualization of logical networks.

    PubMed

    De Landtsheer, Sébastien; Trairatphisan, Panuwat; Lucarelli, Philippe; Sauter, Thomas

    2017-11-01

    Mathematical modelling of regulatory networks allows for the discovery of knowledge at the system level. However, existing modelling tools are often computation-heavy and do not offer intuitive ways to explore the model, to test hypotheses or to interpret the results biologically. We have developed a computational approach to contextualize logical models of regulatory networks with biological measurements based on a probabilistic description of rule-based interactions between the different molecules. Here, we propose a Matlab toolbox, FALCON, to automatically and efficiently build and contextualize networks, which includes a pipeline for conducting parameter analysis, knockouts and easy and fast model investigation. The contextualized models could then provide qualitative and quantitative information about the network and suggest hypotheses about biological processes. FALCON is freely available for non-commercial users on GitHub under the GPLv3 licence. The toolbox, installation instructions, full documentation and test datasets are available at https://github.com/sysbiolux/FALCON. FALCON runs under Matlab (MathWorks) and requires the Optimization Toolbox. thomas.sauter@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
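    The probabilistic description of rule-based interactions can be given a toy flavour: node activities live in [0, 1], OR-like gates mix inputs with non-negative weights summing to one, and AND-like gates multiply activities. This is an illustrative simplification for intuition, not FALCON's actual formulation:

```python
def update_node(inputs, weights, gate="OR"):
    """One probabilistic logic update: inputs and output are in [0, 1].

    'OR'  : mix inputs with weights that sum to 1 (weighted average)
    'AND' : multiply input activities (joint activation)
    """
    if gate == "OR":
        assert abs(sum(weights) - 1.0) < 1e-9, "OR weights must sum to 1"
        return sum(w * x for w, x in zip(weights, inputs))
    if gate == "AND":
        out = 1.0
        for x in inputs:
            out *= x
        return out
    raise ValueError("unknown gate: %s" % gate)

print(update_node([0.8, 0.5], [0.25, 0.75], "OR"))   # weighted mix of the inputs
print(update_node([0.8, 0.5], None, "AND"))          # -> 0.4
```

    Contextualization then amounts to fitting the gate weights so that steady-state node activities match the biological measurements.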

  6. TopoToolbox: Using Sensor Topography to Calculate Psychologically Meaningful Measures from Event-Related EEG/MEG

    PubMed Central

    Tian, Xing; Poeppel, David; Huber, David E.

    2011-01-01

    The open-source toolbox “TopoToolbox” is a suite of functions that use sensor topography to calculate psychologically meaningful measures (similarity, magnitude, and timing) from multisensor event-related EEG and MEG data. Using a GUI and data visualization, TopoToolbox can be used to calculate and test the topographic similarity between different conditions (Tian and Huber, 2008). This topographic similarity indicates whether different conditions involve a different distribution of underlying neural sources. Furthermore, this similarity calculation can be applied at different time points to discover when a response pattern emerges (Tian and Poeppel, 2010). Because the topographic patterns are obtained separately for each individual, these patterns are used to produce reliable measures of response magnitude that can be compared across individuals using conventional statistics (Davelaar et al. Submitted and Huber et al., 2008). TopoToolbox can be freely downloaded. It runs under MATLAB (The MathWorks, Inc.) and supports user-defined data structure as well as standard EEG/MEG data import using EEGLAB (Delorme and Makeig, 2004). PMID:21577268
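    The topographic similarity idea reduces to comparing two sensor patterns up to scale, e.g. via the Pearson correlation (the cosine of the angle between mean-removed topographies). A minimal sketch of that measure, not TopoToolbox's own code:

```python
import math

def topo_similarity(p, q):
    """Pearson correlation between two sensor topographies, i.e. the
    cosine of the angle between the mean-removed pattern vectors."""
    mp = sum(p) / len(p)
    mq = sum(q) / len(q)
    a = [x - mp for x in p]
    b = [x - mq for x in q]
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a) * sum(y * y for y in b))
    return num / den

pattern = [1.0, -0.5, 0.25, -0.75]
# Scaling the whole pattern changes response magnitude, not topography,
# so the similarity stays (numerically) at 1.
print(topo_similarity(pattern, [2 * x for x in pattern]))
```

    Because the measure ignores overall magnitude, it can separate "same sources, stronger response" from "different source configuration", which is the distinction the toolbox exploits.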

  7. Image processing and pattern recognition with CVIPtools MATLAB toolbox: automatic creation of masks for veterinary thermographic images

    NASA Astrophysics Data System (ADS)

    Mishra, Deependra K.; Umbaugh, Scott E.; Lama, Norsang; Dahal, Rohini; Marino, Dominic J.; Sackman, Joseph

    2016-09-01

    CVIPtools is a software package for the exploration of computer vision and image processing developed in the Computer Vision and Image Processing Laboratory at Southern Illinois University Edwardsville. CVIPtools is available in three variants - a) CVIPtools Graphical User Interface, b) CVIPtools C library and c) CVIPtools MATLAB toolbox - which makes it accessible to a variety of different users. It offers students, faculty, researchers and any user a free and easy way to explore computer vision and image processing techniques. Many functions have been implemented and are updated on a regular basis, and the library has reached a level of sophistication that makes it suitable for both educational and research purposes. In this paper, a detailed list of the functions available in the CVIPtools MATLAB toolbox is presented, along with how these functions can be used in image analysis and computer vision applications. The CVIPtools MATLAB toolbox allows the user to gain practical experience to better understand underlying theoretical problems in image processing and pattern recognition. As an example application, the algorithm for the automatic creation of masks for veterinary thermographic images is presented.
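    Automatic mask creation of the kind described above is often bootstrapped from an automatic threshold such as Otsu's method, which picks the gray level maximizing the between-class variance of the histogram. A hedged sketch of that building block (the paper's thermographic-image specifics are not reproduced here):

```python
def otsu_threshold(pixels, levels=256):
    """Automatic threshold selection by maximizing between-class
    variance (Otsu's method) -- a common basis for creating binary
    masks from single-channel images."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var, w0, sum0 = 0, -1.0, 0, 0.0
    for t in range(levels):
        w0 += hist[t]                     # pixels in class 0 (<= t)
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        m0 = sum0 / w0                    # class-0 mean
        m1 = (sum_all - sum0) / (total - w0)   # class-1 mean
        var_between = w0 * (total - w0) * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# A clearly bimodal "image": dark pixels near 10, bright near 200
pixels = [10, 12, 11, 9, 200, 198, 201, 199]
t = otsu_threshold(pixels)
mask = [1 if p > t else 0 for p in pixels]
print(t, mask)
```

    For thermographic images, such a binary mask would isolate the warm subject from the cooler background before any region-based analysis.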

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinuesa, Ricardo; Fick, Lambert; Negi, Prabal

    In the present document we describe a toolbox for the spectral-element code Nek5000, aimed at computing turbulence statistics. The toolbox is presented for a small test case, namely a square duct with Lx = 2h, Ly = 2h and Lz = 4h, where x, y and z are the horizontal, vertical and streamwise directions, respectively. The number of elements in the xy-plane is 16 × 16 = 256, and the number of elements in z is 4, leading to a total of 1,024 spectral elements. A polynomial order of N = 5 is chosen, and the mesh is generated using the Nek5000 tool genbox. The toolbox presented here allows one to compute mean-velocity components, the Reynolds-stress tensor as well as turbulent kinetic energy (TKE) and Reynolds-stress budgets. Note that the present toolbox allows one to compute turbulence statistics in turbulent flows with one homogeneous direction (where the statistics are based on time-averaging as well as averaging in the homogeneous direction), as well as in fully three-dimensional flows (with no periodic directions, where only time-averaging is considered).
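    The time-averaged quantities listed above are defined from velocity samples as means U_i = ⟨u_i⟩, Reynolds stresses ⟨u'_i u'_j⟩ of the fluctuations u'_i = u_i − U_i, and TKE k = ½(⟨u'u'⟩ + ⟨v'v'⟩ + ⟨w'w'⟩). A minimal single-point sketch of these definitions (unrelated to Nek5000's internals):

```python
def turbulence_stats(u, v, w):
    """Time-averaged mean velocity, Reynolds stresses and TKE from
    velocity-component samples at a single point."""
    mean = lambda x: sum(x) / len(x)
    U, V, W = mean(u), mean(v), mean(w)
    up = [x - U for x in u]               # fluctuations u' = u - <u>
    vp = [x - V for x in v]
    wp = [x - W for x in w]
    uu = mean([a * a for a in up])        # normal stresses <u'u'> etc.
    vv = mean([a * a for a in vp])
    ww = mean([a * a for a in wp])
    uv = mean([a * b for a, b in zip(up, vp)])   # one shear component <u'v'>
    tke = 0.5 * (uu + vv + ww)
    return {"U": U, "uu": uu, "vv": vv, "ww": ww, "uv": uv, "k": tke}

stats = turbulence_stats(u=[1.0, 3.0], v=[0.0, 2.0], w=[5.0, 5.0])
print(stats["U"], stats["k"], stats["uv"])  # -> 2.0 1.0 1.0
```

    In the duct case above, averaging would additionally run over the homogeneous streamwise direction z, reducing statistical error for a given simulation time.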

  9. Robust finger vein ROI localization based on flexible segmentation.

    PubMed

    Lu, Yu; Xie, Shan Juan; Yoon, Sook; Yang, Jucheng; Park, Dong Sun

    2013-10-24

    Finger veins have been proven to be an effective biometric for personal identification in recent years. However, finger vein images are easily affected by influences such as image translation, orientation, scale, scattering, finger structure, complicated background, uneven illumination, and collection posture. All these factors may contribute to inaccurate region of interest (ROI) definition, and so degrade the performance of a finger vein identification system. To address this problem, in this paper, we propose a finger vein ROI localization method that has high effectiveness and robustness against the above factors. The proposed method consists of a set of steps to localize ROIs accurately, namely segmentation, orientation correction, and ROI detection. Accurate finger region segmentation and correctly calculated orientation can support each other to produce higher accuracy in localizing ROIs. Extensive experiments have been performed on the finger vein image database, MMCBNU_6000, to verify the robustness of the proposed method. The proposed method shows a segmentation accuracy of 100%. Furthermore, the average processing time of the proposed method is 22 ms for an acquired image, which satisfies the criterion of a real-time finger vein identification system.

  10. Robust Finger Vein ROI Localization Based on Flexible Segmentation

    PubMed Central

    Lu, Yu; Xie, Shan Juan; Yoon, Sook; Yang, Jucheng; Park, Dong Sun

    2013-01-01

    Finger veins have been proven to be an effective biometric for personal identification in recent years. However, finger vein images are easily affected by influences such as image translation, orientation, scale, scattering, finger structure, complicated background, uneven illumination, and collection posture. All these factors may contribute to inaccurate region of interest (ROI) definition, and so degrade the performance of a finger vein identification system. To address this problem, in this paper, we propose a finger vein ROI localization method that has high effectiveness and robustness against the above factors. The proposed method consists of a set of steps to localize ROIs accurately, namely segmentation, orientation correction, and ROI detection. Accurate finger region segmentation and correctly calculated orientation can support each other to produce higher accuracy in localizing ROIs. Extensive experiments have been performed on the finger vein image database, MMCBNU_6000, to verify the robustness of the proposed method. The proposed method shows a segmentation accuracy of 100%. Furthermore, the average processing time of the proposed method is 22 ms for an acquired image, which satisfies the criterion of a real-time finger vein identification system. PMID:24284769

  11. Cross-species 3D virtual reality toolbox for visual and cognitive experiments.

    PubMed

    Doucet, Guillaume; Gulli, Roberto A; Martinez-Trujillo, Julio C

    2016-06-15

    Although simplified visual stimuli, such as dots or gratings presented on homogeneous backgrounds, provide strict control over the stimulus parameters during visual experiments, they fail to approximate visual stimulation in natural conditions. Adoption of virtual reality (VR) in neuroscience research has been proposed to circumvent this problem by combining strict control of experimental variables and behavioral monitoring within complex and realistic environments. We have created a VR toolbox that maximizes experimental flexibility while minimizing implementation costs. A free VR engine (Unreal 3) has been customized to interface with any control software via text commands, allowing seamless introduction into pre-existing laboratory data acquisition frameworks. Furthermore, control functions are provided for the two most common programming languages used in visual neuroscience: Matlab and Python. The toolbox offers the millisecond time resolution necessary for electrophysiological recordings and is flexible enough to support cross-species usage across a wide range of paradigms. Unlike previously proposed VR solutions whose implementation is complex and time-consuming, our toolbox requires minimal customization or technical expertise to interface with pre-existing data acquisition frameworks, as it relies on already familiar programming environments. Moreover, as it is compatible with a variety of display and input devices, identical VR testing paradigms can be used across species, from rodents to humans. This toolbox facilitates the addition of VR capabilities to any laboratory without perturbing pre-existing data acquisition frameworks or requiring any major hardware changes. Copyright © 2016 Elsevier B.V. All rights reserved.
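    Because the record describes the engine interface only as "text commands", the following sketch shows one plausible shape such a control layer could take from Python; the command grammar and function names here are hypothetical, not the toolbox's actual API:

```python
import socket

def build_command(name, **params):
    """Format a text command of the hypothetical form 'NAME key=value;...'.
    The actual Unreal-side command grammar of the toolbox is not specified
    in the record, so this is only illustrative."""
    body = ";".join(f"{k}={v}" for k, v in sorted(params.items()))
    return f"{name} {body}\n"

def send_command(host, port, cmd):
    """Deliver one text command to the VR engine over TCP."""
    with socket.create_connection((host, port), timeout=1.0) as s:
        s.sendall(cmd.encode("ascii"))

print(build_command("MOVE_AVATAR", x=1.0, y=2.5))  # MOVE_AVATAR x=1.0;y=2.5
```

A Matlab equivalent would format the same strings and write them to a tcpclient object.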

  12. Optimizing detection and analysis of slow waves in sleep EEG.

    PubMed

    Mensen, Armand; Riedner, Brady; Tononi, Giulio

    2016-12-01

    Analysis of individual slow waves in EEG recordings during sleep provides both greater sensitivity and specificity compared to spectral power measures. However, parameters for detection and analysis have not been widely explored and validated. We present a new, open-source, MATLAB-based toolbox for the automatic detection and analysis of slow waves, with adjustable parameter settings as well as manual correction and exploration of the results using a multi-faceted visualization tool. We explore a large search space of parameter settings for slow wave detection and measure their effects on a selection of outcome parameters. Every choice of parameter setting had some effect on at least one outcome parameter. In general, the largest effect sizes were found when choosing the EEG reference, the type of canonical waveform, and the amplitude threshold. Previously published methods accurately detect large, global waves but are conservative and miss smaller-amplitude, local slow waves. The toolbox has additional benefits in terms of speed, user interface, and visualization options to compare and contrast slow waves. The exploration of parameter settings in the toolbox highlights the importance of careful selection of detection methods. The sensitivity and specificity of the automated detection can be improved by manually adding or deleting entire waves and/or specific channels using the toolbox visualization functions. The toolbox standardizes the detection procedure, sets the stage for reliable results and comparisons, and is easy to use without previous programming experience. Copyright © 2016 Elsevier B.V. All rights reserved.
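    As a rough sketch of what an amplitude- and duration-based slow-wave detector involves (the toolbox's actual criteria differ and its defaults are adjustable), consider:

```python
import numpy as np

def detect_slow_waves(eeg, fs, amp=75.0, dur_range=(0.25, 1.0)):
    """Minimal slow-wave detector sketch: find negative half-waves between
    consecutive positive-to-negative and negative-to-positive zero crossings,
    then keep those exceeding an amplitude threshold (in uV) and falling
    within a duration window (in s). Parameter values are illustrative only."""
    x = np.asarray(eeg, float)
    sign = (x < 0).astype(int)
    down = np.flatnonzero(np.diff(sign) == 1)    # crossings into the negative half-wave
    up = np.flatnonzero(np.diff(sign) == -1)     # crossings back to positive
    waves = []
    for d in down:
        later = up[up > d]
        if later.size == 0:
            break
        u = later[0]
        dur = (u - d) / fs
        if x[d:u + 1].min() <= -amp and dur_range[0] <= dur <= dur_range[1]:
            waves.append((d, u))
    return waves

fs = 100
t = np.arange(0, 2, 1 / fs)
sig = -100.0 * np.sin(2 * np.pi * t + 0.1)  # one deep negative half-wave fully inside the record
print(len(detect_slow_waves(sig, fs)))      # prints 1
```

Choices such as the EEG reference and the canonical waveform, which the paper found most influential, would enter upstream of a detector like this.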

  13. A toolbox to visually explore cerebellar shape changes in cerebellar disease and dysfunction.

    PubMed

    Abulnaga, S Mazdak; Yang, Zhen; Carass, Aaron; Kansal, Kalyani; Jedynak, Bruno M; Onyike, Chiadi U; Ying, Sarah H; Prince, Jerry L

    2016-02-27

    The cerebellum plays an important role in motor control and is also involved in cognitive processes. Cerebellar function is specialized by location, although the exact topographic functional relationship is not fully understood. The spinocerebellar ataxias are a group of neurodegenerative diseases that cause regional atrophy in the cerebellum, yielding distinct motor and cognitive problems. The ability to study the region-specific atrophy patterns can provide insight into the problem of relating cerebellar function to location. In an effort to study these structural change patterns, we developed a toolbox in MATLAB to provide researchers with a unique way to visually explore the correlation between cerebellar lobule shape changes and function loss, with a rich set of visualization and analysis modules. In this paper, we outline the functions and highlight the utility of the toolbox. The toolbox takes as input landmark shape representations of subjects' cerebellar substructures. A principal component analysis is used for dimension reduction. Following this, a linear discriminant analysis and a regression analysis can be performed to find the discriminant direction associated with a specific disease type, or the regression line of a specific functional measure can be generated. The characteristic structural change pattern of a disease type or of a functional score is visualized by sampling points on the discriminant or regression line. The sampled points are used to reconstruct synthetic cerebellar lobule shapes. We present a few case studies highlighting the utility of the toolbox and compare the analysis results with the literature.
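    The pipeline described above (PCA reduction, a discriminant direction, and synthetic shapes sampled along that direction) can be sketched as follows; the data are synthetic and the dimensions illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 40 subjects x 60 landmark coordinates, two groups
X = rng.normal(size=(40, 60))
X[20:] += 0.8                       # group 2 shifted to mimic regional atrophy
y = np.repeat([0, 1], 20)

# PCA for dimension reduction (keep k components)
mu = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
k = 5
scores = (X - mu) @ Vt[:k].T

# Fisher discriminant direction in the reduced space
m0, m1 = scores[y == 0].mean(axis=0), scores[y == 1].mean(axis=0)
Sw = np.cov(scores[y == 0].T) + np.cov(scores[y == 1].T)
w = np.linalg.solve(Sw, m1 - m0)
w /= np.linalg.norm(w)

# Sample points along the discriminant line and map back to landmark
# space to synthesize characteristic shape changes
for tval in (-2.0, 0.0, 2.0):
    shape = mu + (tval * w) @ Vt[:k]   # one synthetic landmark shape per sample
```

Replacing the discriminant direction with a regression slope against a functional score gives the toolbox's other visualization mode.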

  14. A toolbox to visually explore cerebellar shape changes in cerebellar disease and dysfunction

    NASA Astrophysics Data System (ADS)

    Abulnaga, S. Mazdak; Yang, Zhen; Carass, Aaron; Kansal, Kalyani; Jedynak, Bruno M.; Onyike, Chiadi U.; Ying, Sarah H.; Prince, Jerry L.

    2016-03-01

    The cerebellum plays an important role in motor control and is also involved in cognitive processes. Cerebellar function is specialized by location, although the exact topographic functional relationship is not fully understood. The spinocerebellar ataxias are a group of neurodegenerative diseases that cause regional atrophy in the cerebellum, yielding distinct motor and cognitive problems. The ability to study the region-specific atrophy patterns can provide insight into the problem of relating cerebellar function to location. In an effort to study these structural change patterns, we developed a toolbox in MATLAB to provide researchers with a unique way to visually explore the correlation between cerebellar lobule shape changes and function loss, with a rich set of visualization and analysis modules. In this paper, we outline the functions and highlight the utility of the toolbox. The toolbox takes as input landmark shape representations of subjects' cerebellar substructures. A principal component analysis is used for dimension reduction. Following this, a linear discriminant analysis and a regression analysis can be performed to find the discriminant direction associated with a specific disease type, or the regression line of a specific functional measure can be generated. The characteristic structural change pattern of a disease type or of a functional score is visualized by sampling points on the discriminant or regression line. The sampled points are used to reconstruct synthetic cerebellar lobule shapes. We present a few case studies highlighting the utility of the toolbox and compare the analysis results with the literature.

  15. Online model evaluation of large-eddy simulations covering Germany with a horizontal resolution of 156 m

    NASA Astrophysics Data System (ADS)

    Hansen, Akio; Ament, Felix; Lammert, Andrea

    2017-04-01

    Large-eddy simulations (LES) have been performed for several decades, but due to computational limits most studies were restricted to small domains or idealised initial and boundary conditions. Within the High Definition Clouds and Precipitation for advancing Climate Prediction (HD(CP)2) project, realistic weather-forecast-like LES runs were performed with the newly developed ICON LES model for several days. The domain covers central Europe with a horizontal resolution down to 156 m. The setup consists of more than 3 billion grid cells, so that a single 3D dump requires roughly 500 GB. A newly developed online evaluation toolbox was created to check instantaneously whether the model simulations are realistic. The toolbox automatically combines model results with observations and generates quicklooks for various variables. So far, temperature and humidity profiles, cloud cover, integrated water vapour, precipitation, and many more are included. All kinds of observations, such as aircraft observations, soundings, and precipitation radar networks, are used. For each dataset, a specific module is created, which allows for easy handling and extension of the toolbox. Most of the observations are automatically downloaded from the Standardized Atmospheric Measurement Database (SAMD). The evaluation tool is intended to support scientists in monitoring computationally costly model simulations and to give a first overview of model performance. The structure of the toolbox as well as the SAMD database are presented. Furthermore, the toolbox was applied to an ICON LES sensitivity study, for which example results are shown.

  16. Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system

    NASA Astrophysics Data System (ADS)

    Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong

    2011-06-01

    Ladar system simulation uses computer simulation technology to model a ladar system in order to predict its performance. This paper reviews developments in laser imaging radar simulation in domestic and overseas studies, and surveys computer simulation of ladar systems for different application requirements. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. Domestic research in imaging ladar system simulation has been limited in scale and non-unified in design, mostly achieving simple functional simulation based on ranging equations. A laser imaging radar simulation with an open and modularized structure is therefore proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings, and system controller. A unified Matlab toolbox and standard control modules have been built, with regulated inputs and outputs of the functions and communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system observing a space shuttle was made with the toolbox. The simulation results show that the models and parameter settings of the Matlab toolbox are able to simulate the actual detection process precisely. The unified control module and pre-defined parameter settings simplify the simulation of imaging ladar detection. Its open structure enables the toolbox to be modified for specialized requirements, and the modularization gives the simulations flexibility.
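    A ranging-equation module of the kind such simulations are built on can be sketched with a textbook monostatic form for a diffuse extended target; the symbols and defaults below follow common conventions, not the toolbox's interface:

```python
import numpy as np

def ladar_return_power(P_t, rho, D_rx, R, eta_sys=0.8, alpha=1e-4):
    """Illustrative monostatic ladar range equation for a diffuse
    (Lambertian) extended target:
        P_r = P_t * rho * (D^2 / (4 R^2)) * eta_sys * exp(-2 alpha R)
    with transmit power P_t [W], target reflectivity rho, receiver
    aperture D_rx [m], range R [m], system efficiency eta_sys, and
    atmospheric extinction alpha [1/m]. Textbook form, not a specific
    toolbox API."""
    return P_t * rho * (D_rx ** 2 / (4.0 * R ** 2)) * eta_sys * np.exp(-2 * alpha * R)

# Received power falls off with range, as expected
p_near = ladar_return_power(1.0, 0.3, 0.1, 500.0)
p_far = ladar_return_power(1.0, 0.3, 0.1, 1000.0)
```

In a modular toolbox, the emitter, atmosphere, target, and receiver terms of this product would each live in their own module with regulated inputs and outputs.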

  17. U.S. Geological Survey groundwater toolbox, a graphical and mapping interface for analysis of hydrologic data (version 1.0): user guide for estimation of base flow, runoff, and groundwater recharge from streamflow data

    USGS Publications Warehouse

    Barlow, Paul M.; Cunningham, William L.; Zhai, Tong; Gray, Mark

    2015-01-01

    This report is a user guide for the streamflow-hydrograph analysis methods provided with version 1.0 of the U.S. Geological Survey (USGS) Groundwater Toolbox computer program. These include six hydrograph-separation methods to determine the groundwater-discharge (base-flow) and surface-runoff components of streamflow—the Base-Flow Index (BFI; Standard and Modified), HYSEP (Fixed Interval, Sliding Interval, and Local Minimum), and PART methods—and the RORA recession-curve displacement method and associated RECESS program to estimate groundwater recharge from streamflow data. The Groundwater Toolbox is a customized interface built on the nonproprietary, open source MapWindow geographic information system software. The program provides graphing, mapping, and analysis capabilities in a Microsoft Windows computing environment. In addition to the hydrograph-analysis methods, the Groundwater Toolbox allows for the retrieval of hydrologic time-series data (streamflow, groundwater levels, and precipitation) from the USGS National Water Information System, downloading of a suite of preprocessed geographic information system coverages and meteorological data from the National Oceanic and Atmospheric Administration National Climatic Data Center, and analysis of data with several preprocessing and postprocessing utilities. With its data retrieval and analysis tools, the Groundwater Toolbox provides methods to estimate many of the components of the water budget for a hydrologic basin, including precipitation; streamflow; base flow; runoff; groundwater recharge; and total, groundwater, and near-surface evapotranspiration.
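    To illustrate the flavor of hydrograph separation, the following is a simplified sketch of a HYSEP-style Local Minimum method; the Groundwater Toolbox's own implementation selects the interval from drainage area and differs in detail:

```python
import numpy as np

def local_minimum_baseflow(q, n_star=5):
    """Sketch of a Local Minimum separation: a day is a turning point if it
    is the minimum flow within roughly half the 2N* interval on either side;
    base flow is linearly interpolated between turning points and capped at
    streamflow. Simplified from the USGS HYSEP procedure."""
    q = np.asarray(q, float)
    half = int(0.5 * (2 * n_star - 1)) or 1
    idx = [i for i in range(len(q))
           if q[i] == q[max(0, i - half):i + half + 1].min()]
    bf = np.interp(np.arange(len(q)), idx, q[idx])
    return np.minimum(bf, q)

# A storm peak over a steady base flow of 1: separation recovers the base
q = np.array([1, 1, 1, 5, 9, 5, 2, 1, 1, 1], float)
bf = local_minimum_baseflow(q, n_star=2)
```

Surface runoff for each day is then simply streamflow minus the separated base flow.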

  18. Robust time and frequency domain estimation methods in adaptive control

    NASA Technical Reports Server (NTRS)

    Lamaire, Richard Orville

    1987-01-01

    A robust identification method was developed for use in an adaptive control system. The estimator is called the robust estimator, since it is robust to the effects of both unmodeled dynamics and an unmeasurable disturbance. The development of the robust estimator was motivated by a need to provide guarantees in the identification part of an adaptive controller. To enable the design of a robust control system, a nominal model as well as a frequency-domain bounding function on the modeling uncertainty associated with this nominal model must be provided. Two estimation methods are presented for finding parameter estimates and, hence, a nominal model. The first is based on the well-developed field of time-domain parameter estimation. The second finds parameter estimates by a type of weighted least-squares fit to a frequency-domain estimated model. The frequency-domain estimator is shown to perform better, in general, than the time-domain parameter estimator. In addition, a methodology for finding a frequency-domain bounding function on the disturbance is used to compute a frequency-domain bounding function on the additive modeling error due to the effects of the disturbance and the use of finite-length data. The performance of the robust estimator in both open-loop and closed-loop situations is examined through the use of simulations.
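    The idea of weighted least-squares fitting to a frequency-domain estimated model can be illustrated on a first-order system; this sketch linearizes the fit and is not the report's exact estimator:

```python
import numpy as np

def wls_first_order_fit(w, G, weight):
    """Weighted least-squares fit of a first-order model b / (jw + a) to
    frequency-response samples G(jw_i), obtained by linearizing
    b - a*G = jw*G and stacking real and imaginary parts. A simplified
    stand-in for the report's frequency-domain estimator; the weights let
    uncertain frequencies count less."""
    A = np.column_stack([-G, np.ones_like(G)])      # unknowns: [a, b]
    rhs = 1j * w * G
    Wr = np.sqrt(np.concatenate([weight, weight]))
    A_ri = np.vstack([A.real, A.imag]) * Wr[:, None]
    r_ri = np.concatenate([rhs.real, rhs.imag]) * Wr
    a, b = np.linalg.lstsq(A_ri, r_ri, rcond=None)[0]
    return a, b

w = np.logspace(-1, 2, 50)
G = 3.0 / (1j * w + 2.0)                  # true system: a = 2, b = 3
a_hat, b_hat = wls_first_order_fit(w, G, np.ones_like(w))
print(round(a_hat, 3), round(b_hat, 3))   # 2.0 3.0
```

With noisy frequency-response data, down-weighting frequencies where the disturbance bound is large is what makes the fit "robust" in the sense of the report.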

  19. Evidence for a Common Toolbox Based on Necrotrophy in a Fungal Lineage Spanning Necrotrophs, Biotrophs, Endophytes, Host Generalists and Specialists

    PubMed Central

    Andrew, Marion; Barua, Reeta; Short, Steven M.; Kohn, Linda M.

    2012-01-01

    The Sclerotiniaceae (Ascomycotina, Leotiomycetes) is a relatively recently evolved lineage of necrotrophic host generalists, and necrotrophic or biotrophic host specialists, some latent or symptomless. We hypothesized that they inherited a basic toolbox of genes for plant symbiosis from their common ancestor. Maintenance and evolutionary diversification of symbiosis could require selection on toolbox genes or on the timing and magnitude of gene expression. The genes studied were chosen because their products have been previously investigated as pathogenicity factors in the Sclerotiniaceae. They encode proteins associated with cell wall degradation: acid protease 1 (acp1), aspartyl protease (asps), and polygalacturonases (pg1, pg3, pg5, pg6), and the oxalic acid (OA) pathway: a zinc finger transcription factor (pac1), and oxaloacetate acetylhydrolase (oah), catalyst in OA production, essential for full symptom production in Sclerotinia sclerotiorum. Site-specific likelihood analyses provided evidence for purifying selection in all 8 pathogenicity-related genes. Consistent with an evolutionary arms race model, positive selection was detected in 5 of 8 genes. Only generalists produced large, proliferating disease lesions on excised Arabidopsis thaliana leaves and oxalic acid by 72 hours in vitro. In planta expression of oah was 10–300 times greater among the necrotrophic host generalists than among necrotrophic and biotrophic host specialists; pac1 was not differentially expressed. The ability to amplify 6/8 pathogenicity-related genes and to produce oxalic acid in all genera is consistent with the common toolbox hypothesis for this gene sample. That our data did not distinguish biotrophs from necrotrophs is consistent with 1) a common toolbox based on necrotrophy and 2) the most conservative interpretation of the 3-locus housekeeping gene phylogeny – a baseline of necrotrophy from which forms of biotrophy emerged at least twice. Early oah overexpression likely expands the host range of necrotrophic generalists in the Sclerotiniaceae, while specialists and biotrophs deploy oah, or other as-yet-unknown toolbox genes, differently. PMID:22253834

  20. The 'Toolbox' of strategies for managing Haemonchus contortus in goats: What's in and what's out.

    PubMed

    Kearney, P E; Murray, P J; Hoy, J M; Hohenhaus, M; Kotze, A

    2016-04-15

    A dynamic and innovative approach to managing the blood-consuming nematode Haemonchus contortus in goats is critical to break dependence on veterinary anthelmintics. H. contortus management strategies have been the subject of intense research for decades, and must be selected to create a tailored, individualized program for goat farms. Through the selection and combination of strategies from the Toolbox, an effective management program for H. contortus can be designed according to the unique conditions of each particular farm. This Toolbox covers strategies including vaccines, bioactive forages, pasture/grazing management, behavioural management, natural immunity, FAMACHA, refugia and strategic drenching, mineral/vitamin supplementation, copper oxide wire particles (COWPs), breeding and selection of resistant and resilient individuals, biological control, and anthelmintic drugs. Barbervax(®), the ground-breaking Haemonchus vaccine developed for sheep and currently commercially available on a pilot scale, is a prime candidate for trialling in goats and would be an invaluable inclusion in this Toolbox. The specialised behaviours of goats, specifically their preference to browse a variety of plants and the accompanying physiological adaptations to the consumption of secondary compounds contained in browse, have long been unappreciated and thus overlooked as a valuable, sustainable strategy for Haemonchus management. These strategies are discussed in this review as to their current value for inclusion in the 'Toolbox', along with the future implications of ongoing research for goat producers. Combining and manipulating strategies such as browsing behaviour, pasture management, bioactive forages, and identifying and treating individual animals for haemonchosis, together with continuous evaluation of strategy effectiveness, is demonstrated using a model farm scenario. Selecting strategies from the Toolbox with regard to their current availability, feasibility, economic cost, and potential ease of implementation, depending on the systems of production and their complementary nature, is the future of managing H. contortus in farmed goats internationally and of maintaining the remaining efficacy of veterinary anthelmintics. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Aerospace Toolbox---a flight vehicle design, analysis, simulation, and software development environment: I. An introduction and tutorial

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.; Wells, Randy

    2001-09-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics to be covered in this part include flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models by using the Mathworks Real Time Workshop and optimization tools.
It will also address how the Toolbox can be used as a design hub for Internet based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  2. FZUImageReg: A toolbox for medical image registration and dose fusion in cervical cancer radiotherapy

    PubMed Central

    Bai, Penggang; Du, Min; Ni, Xiaolei; Ke, Dongzhong; Tong, Tong

    2017-01-01

    The combination of external-beam radiotherapy (EBRT) and high-dose-rate brachytherapy (HDR-BT) is a standard form of treatment for patients with locally advanced uterine cervical cancer. Personalized radiotherapy in cervical cancer requires efficient and accurate dose planning and assessment across these types of treatment. To achieve radiation dose assessment, accurate mapping of the dose distribution from HDR-BT onto EBRT is extremely important. However, few systems can achieve robust dose fusion and determine the accumulated dose distribution over the entire course of treatment. We have therefore developed FZUImageReg, a user-friendly dose fusion system based on hybrid image registration for radiation dose assessment in cervical cancer radiotherapy. The main part of the software consists of a collection of medical image registration algorithms and a modular design with a user-friendly interface, which allows users to quickly configure, test, monitor, and compare different registration methods for a specific application. Owing to the large deformations involved, direct application of conventional state-of-the-art image registration methods is not sufficient for accurate alignment of EBRT and HDR-BT images. To solve this problem, a multi-phase non-rigid registration method using local landmark-based free-form deformation is proposed for the locally large deformation between EBRT and HDR-BT images, followed by intensity-based free-form deformation. With the resulting transformation, the software also provides a dose mapping function according to the deformation field. The total dose distribution over the entire course of treatment can then be presented. Experimental results show that the proposed system achieves accurate registration between EBRT and HDR-BT images and provides radiation dose warping and fusion for dose assessment in cervical cancer radiotherapy with high accuracy and efficiency. PMID:28388623
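    The dose mapping step, in which a dose grid is warped through an estimated deformation field, can be sketched as follows; the registration that produces the field is not reproduced here:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_dose(dose, disp_y, disp_x):
    """Map a 2-D dose grid through a dense displacement field (illustrative
    only; the toolbox estimates its field with landmark- and intensity-based
    free-form deformation). disp_y/disp_x give, per output voxel, where in
    the moving dose image to sample."""
    yy, xx = np.meshgrid(np.arange(dose.shape[0]),
                         np.arange(dose.shape[1]), indexing="ij")
    coords = np.stack([yy + disp_y, xx + disp_x])
    return map_coordinates(dose, coords, order=1, mode="nearest")

# Dose increases down the rows; a unit downward displacement field
# samples each output voxel one row further down (clamped at the edge)
dose = np.outer(np.arange(8.0), np.ones(8))
warped = warp_dose(dose, np.ones((8, 8)), np.zeros((8, 8)))
```

Accumulated dose is then the voxel-wise sum of the fixed-image dose and the warped moving-image dose.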

  3. Data Driven Model Development for the SuperSonic SemiSpan Transport (S(sup 4)T)

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2011-01-01

    In this report, we will investigate two common approaches to model development for robust control synthesis in the aerospace community; namely, reduced order aeroservoelastic modelling based on structural finite-element and computational fluid dynamics based aerodynamic models, and a data-driven system identification procedure. It is shown via analysis of experimental SuperSonic SemiSpan Transport (S4T) wind-tunnel data that by using a system identification approach it is possible to estimate a model at a fixed Mach, which is parsimonious and robust across varying dynamic pressures.

  4. An experimental toolbox for the generation of cold and ultracold polar molecules

    NASA Astrophysics Data System (ADS)

    Zeppenfeld, Martin; Gantner, Thomas; Glöckner, Rosa; Ibrügger, Martin; Koller, Manuel; Prehn, Alexander; Wu, Xing; Chervenkov, Sotir; Rempe, Gerhard

    2017-01-01

    Cold and ultracold molecules enable fascinating applications in quantum science. We present our toolbox of techniques to generate the required molecule ensembles, including buffergas cooling, centrifuge deceleration and optoelectrical Sisyphus cooling. We obtain excellent control over both the motional and internal molecular degrees of freedom, allowing us to aim at various applications.

  5. Water Power Data and Tools | Water Power | NREL

    Science.gov Websites

    NREL pairs computer modeling tools and data with state-of-the-art design and analysis. Resources include the National Wind Technology Center's Information Portal and a WEC-Sim fact sheet. The WEC Design Response Toolbox provides extreme response and fatigue analysis tools.

  6. Tensor Toolbox for MATLAB v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolda, Tamara; Bader, Brett W.; Acar Ataman, Evrim

    Tensors (also known as multidimensional arrays or N-way arrays) are used in a variety of applications ranging from chemometrics to network analysis. The Tensor Toolbox provides classes for manipulating dense, sparse, and structured tensors using MATLAB's object-oriented features. It also provides algorithms for tensor decomposition and factorization, algorithms for computing tensor eigenvalues, and methods for visualization of results.
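    A small example of the kind of operation such a toolbox provides is mode-n unfolding, the basic matricization behind CP and Tucker decompositions; this sketch uses plain NumPy rather than the Tensor Toolbox's MATLAB classes:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding (matricization) of a dense ndarray: the chosen mode
    becomes the rows, all remaining modes are flattened into the columns."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# A rank-1 tensor a (outer) b (outer) c has rank-1 unfoldings in every mode
a, b, c = np.arange(2.0) + 1, np.arange(3.0) + 1, np.arange(4.0) + 1
T = np.einsum("i,j,k->ijk", a, b, c)
for m in range(3):
    print(np.linalg.matrix_rank(unfold(T, m)))  # prints 1 for each mode
```

Sparse and structured tensor classes in the toolbox exist precisely so that operations like this scale beyond small dense arrays.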

  7. MOFA Software for the COBRA Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griesemer, Marc; Navid, Ali

    MOFA-COBRA is a software code for Matlab that performs Multi-Objective Flux Analysis (MOFA) by solving linear programming problems. The leading software package for conducting different types of analyses using constraint-based models is the COBRA Toolbox for Matlab. MOFA-COBRA is an add-on tool for COBRA that solves multi-objective problems using a novel algorithm.
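    The record does not describe MOFA's novel algorithm, but a classical way to pose a multi-objective flux problem is weighted-sum scalarization of the underlying linear program; the toy network below is purely illustrative, not MOFA's method:

```python
import numpy as np
from scipy.optimize import linprog

def weighted_sum_flux(S, lb, ub, objectives, weights):
    """Weighted-sum scalarization of a multi-objective flux balance problem:
    maximize sum_i w_i * c_i^T v subject to S v = 0 and flux bounds.
    One classical way to trace a Pareto surface; generic illustration only."""
    c = -sum(w * np.asarray(ci, float) for w, ci in zip(weights, objectives))
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=list(zip(lb, ub)), method="highs")
    return res.x

# Toy network: v0 splits into v1 and v2 (steady state: v0 = v1 + v2)
S = np.array([[1.0, -1.0, -1.0]])
v = weighted_sum_flux(S, [0, 0, 0], [10, 10, 10],
                      objectives=[[0, 1, 0], [0, 0, 1]],
                      weights=[0.5, 0.5])
```

Sweeping the weights over the simplex traces out Pareto-optimal flux distributions between the competing objectives.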

  8. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS. MODULE 1: WATER UTILITIES PLANNING GUIDE - INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  9. RESPONSE PROTOCOL TOOLBOX: PLANNING FOR AND RESPONDING TO DRINKING WATER CONTAMINATION THREATS AND INCIDENTS, MODULE 3: SITE CHARACTERIZATION AND SAMPLING GUIDE. INTERIM FINAL - DECEMBER 2003

    EPA Science Inventory

    The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...

  10. The panacea toolbox of a PhD biomedical student.

    PubMed

    Skaik, Younis

    2014-01-01

    Doing a PhD (doctor of philosophy) for the sake of contributing to knowledge should give the student immense enthusiasm throughout the PhD period. It is the time in one's life that one spends trying to "hit the nail on the head" in a specific area and topic of interest. A PhD consists mostly of hard work and tenacity; luck and genius might also play a small role, but one can pass all PhD phases without either. The PhD student should have pre-PhD and PhD toolboxes, which are "sine quibus non" for successfully obtaining a PhD degree. In this manuscript, the toolboxes of the PhD student are discussed.

  11. Spectral analysis and filtering techniques in digital spatial data processing

    USGS Publications Warehouse

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. The toolbox provides capabilities to compute the power spectrum of a given dataset and to design various filters in the frequency domain. Three types of filters are available: point filters, line filters, and area filters. Both the point and line filters employ Gaussian-type notch filters, and the area filter provides high-pass, band-pass, low-pass, and wedge filtering techniques. These filters have been applied to analyze satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
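    A Gaussian-type notch filter of the kind used by the point and line filters can be sketched in the frequency domain as follows; conventions such as the centered spectrum and symmetric notch pair are typical choices, not taken from the report:

```python
import numpy as np

def gaussian_notch(shape, u0, v0, sigma):
    """Frequency-domain Gaussian notch (point) filter: attenuates a narrow
    band around the conjugate-symmetric pair (+/-u0, +/-v0) cycles, as used
    to remove periodic noise from gridded data."""
    u = (np.fft.fftshift(np.fft.fftfreq(shape[0])) * shape[0])[:, None]
    v = (np.fft.fftshift(np.fft.fftfreq(shape[1])) * shape[1])[None, :]
    d2a = (u - u0) ** 2 + (v - v0) ** 2
    d2b = (u + u0) ** 2 + (v + v0) ** 2
    return (1 - np.exp(-d2a / (2 * sigma ** 2))) * (1 - np.exp(-d2b / (2 * sigma ** 2)))

def notch_filter(img, u0, v0, sigma=2.0):
    """Apply the notch to an image via the centered 2-D FFT."""
    F = np.fft.fftshift(np.fft.fft2(img))
    H = gaussian_notch(img.shape, u0, v0, sigma)
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))

# Pure periodic striping at 8 cycles across the image is removed entirely
y, x = np.mgrid[0:64, 0:64]
img = np.cos(2 * np.pi * 8 * x / 64)
clean = notch_filter(img, u0=0, v0=8)
print(np.abs(clean).max() < 1e-8)  # prints True
```

A line filter extends the same idea by placing the notch along an entire line of frequencies rather than at a single conjugate pair.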

  12. Performance enhancement for audio-visual speaker identification using dynamic facial muscle model.

    PubMed

    Asadpour, Vahid; Towhidkhah, Farzad; Homayounpour, Mohammad Mehdi

    2006-10-01

    The science of human identification using physiological characteristics, or biometry, has been of great concern in security systems. However, robust multimodal identification systems based on audio-visual information have not been thoroughly investigated yet. The aim of this work is therefore to propose a model-based feature extraction method that employs the physiological characteristics of the facial muscles producing lip movements. This approach adopts intrinsic muscle properties such as viscosity, elasticity, and mass, which are extracted from a dynamic lip model. These parameters are exclusively dependent on the neuro-muscular properties of the speaker; consequently, imitation of valid speakers could be reduced to a large extent. The parameters are applied to a hidden Markov model (HMM) audio-visual identification system. In this work, a combination of audio and video features has been employed by adopting a multistream pseudo-synchronized HMM training method. Noise-robust audio features such as Mel-frequency cepstral coefficients (MFCC), spectral subtraction (SS), and relative spectra perceptual linear prediction (J-RASTA-PLP) have been used to evaluate the performance of the multimodal system once efficient audio feature extraction methods had been utilized. The superior performance of the proposed system is demonstrated on a large multispeaker database of continuously spoken digits, along with a sentence that is phonetically rich. To evaluate the robustness of the algorithms, some experiments were performed on genetically identical twins. Furthermore, changes in speaker voice were simulated with drug inhalation tests. At a 3 dB signal-to-noise ratio (SNR), the dynamic muscle model improved the identification rate of the audio-visual system from 91 to 98%. Results on identical twins revealed an apparent improvement in performance for the dynamic muscle model-based system, whose audio-visual identification rate was enhanced from 87 to 96%.
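    Of the noise-robust front ends named above, spectral subtraction (SS) is the simplest to sketch; the textbook magnitude-domain form with a spectral floor is shown below, with illustrative parameter values:

```python
import numpy as np

def spectral_subtraction(frames_mag, noise_mag, floor=0.02):
    """Classic magnitude spectral subtraction: subtract a noise-magnitude
    estimate (e.g. averaged over non-speech frames) from each frame's
    spectrum, flooring the result at a small fraction of the original
    magnitude to avoid negative values and reduce 'musical noise'.
    Textbook form, not the dissertation-specific variant."""
    cleaned = frames_mag - noise_mag
    return np.maximum(cleaned, floor * frames_mag)

# Two frames x two frequency bins; the first bin is dominated by noise
mags = np.array([[1.0, 2.0],
                 [1.0, 0.5]])
noise = np.array([1.0, 0.4])
denoised = spectral_subtraction(mags, noise)
```

The denoised magnitudes would then feed the MFCC computation before HMM training.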

  13. Advancements in robust algorithm formulation for speaker identification of whispered speech

    NASA Astrophysics Data System (ADS)

    Fan, Xing

    Whispered speech is an alternative speech production mode to neutral speech, used intentionally by talkers in natural conversational scenarios to protect privacy and to keep certain content from being overheard or made public. Due to the profound differences between whispered and neutral speech in the production mechanism, and to the absence of whispered adaptation data, the performance of speaker identification systems trained on neutral speech degrades significantly. This dissertation therefore focuses on developing a robust closed-set speaker recognition system for whispered speech that uses no, or only limited, whispered adaptation data from non-target speakers. It proposes the concept of "High"/"Low" performance whispered data for the purpose of speaker identification, and identifies a variety of acoustic properties that contribute to the quality of whispered data. An acoustic analysis is also conducted to compare the phoneme and speaker dependency of the differences between whispered and neutral data in the feature domain. The observations from these acoustic analyses are new in this area and serve as guidance for developing robust speaker identification systems for whispered speech. The dissertation further proposes two systems for speaker identification of whispered speech. The first focuses on front-end processing: a two-dimensional feature space is proposed to search for "Low"-quality whispered utterances, and separate feature mapping functions are applied to vowels and consonants in order to retain the speaker information shared between whispered and neutral speech. The second focuses on speech-mode-independent model training: pseudo-whispered features are generated from neutral features using the statistical information contained in a whispered universal background model (UBM) trained on additional whispered data collected from non-target speakers.
    Four modeling methods are proposed for estimating the transformation that generates the pseudo-whispered features. Both systems demonstrate a significant improvement over the baseline on the evaluation data. This dissertation has therefore contributed a scientific understanding of the differences between whispered and neutral speech, as well as improved front-end processing and modeling methods for speaker identification of whispered speech. Such advancements will ultimately help improve the robustness of speech processing systems.

  14. Robust Fault Detection and Isolation for Stochastic Systems

    NASA Technical Reports Server (NTRS)

    George, Jemin; Gregory, Irene M.

    2010-01-01

    This paper outlines the formulation of a robust fault detection and isolation scheme that can precisely detect and isolate simultaneous actuator and sensor faults in uncertain linear stochastic systems. The scheme, based on a discontinuous robust observer approach, is able to distinguish between model uncertainties and actuator failures and therefore eliminates the problem of false alarms. Since the approach involves precise reconstruction of sensor faults, it can also be used for sensor fault identification and for recovering the true outputs from faulty sensor measurements. Simulation results validate the effectiveness of the robust fault detection and isolation system.
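
As a toy illustration of the observer-residual idea (a plain Luenberger observer, not the paper's discontinuous robust observer; the plant, gains, and injected bias are all invented for this sketch):

```python
def detect_sensor_fault(y_meas, u, a=0.9, b=1.0, L=0.5, threshold=0.5):
    """Run the observer x_hat[k+1] = a*x_hat[k] + b*u[k] + L*r[k] with
    residual r[k] = y[k] - x_hat[k]; flag samples where |r| is too large."""
    x_hat = y_meas[0]
    flags = []
    for y, uk in zip(y_meas, u):
        r = y - x_hat
        flags.append(abs(r) > threshold)
        x_hat = a * x_hat + b * uk + L * r
    return flags

# simulate the true plant x[k+1] = a*x[k] + b*u[k] with output y = x,
# then inject a constant +2.0 sensor bias from sample 5 onward
a, b = 0.9, 1.0
u = [0.1] * 10
x = [0.0]
for uk in u[:-1]:
    x.append(a * x[-1] + b * uk)
y = [xk + (2.0 if k >= 5 else 0.0) for k, xk in enumerate(x)]
flags = detect_sensor_fault(y, u)
```

Because the observer gradually absorbs a constant bias, the flag fires at the fault onset and for the first samples after it; robust schemes like the paper's reconstruct the fault instead of merely flagging it.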

  15. Extending Inferential Group Analysis in Type 2 Diabetic Patients with Multivariate GLM Implemented in SPM8.

    PubMed

    Ferreira, Fábio S; Pereira, João M S; Duarte, João V; Castelo-Branco, Miguel

    2017-01-01

    Although voxel-based morphometry studies are still the standard for analyzing brain structure, their dependence on massive univariate inferential methods is a limiting factor. A better understanding of brain pathologies can be achieved by applying inferential multivariate methods, which allow the study of multiple dependent variables, e.g. different imaging modalities of the same subject. Given the widespread use of the SPM software in the brain imaging community, the main aim of this work is the implementation of massive multivariate inferential analysis as a toolbox in this software package, applied to T1 and T2 structural data from diabetic patients and controls. This implementation was compared with the traditional ANCOVA in SPM and a similar multivariate GLM toolbox (MRM). We implemented the new toolbox and tested it by investigating brain alterations in a cohort of twenty-eight type 2 diabetes patients and twenty-six matched healthy controls, using information from both T1- and T2-weighted structural MRI scans, both separately (using standard univariate VBM) and simultaneously, with multivariate analyses. Univariate VBM replicated predominantly bilateral changes in basal ganglia and insular regions in type 2 diabetes patients. Multivariate analyses replicated key findings of the univariate results, while also revealing the thalami as additional foci of pathology. While the presented algorithm must be further optimized, the proposed toolbox is the first implementation of multivariate statistics in SPM8 as a user-friendly toolbox; it shows great potential and is ready to be validated in other clinical cohorts and modalities.

  16. Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox

    PubMed Central

    Spee, Ton; Gillen, Matt; Lentz, Thomas J.; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-01-01

    Objectives This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Methods Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. Results This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. Conclusion The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions. PMID:22953194

  17. COMETS2: An advanced MATLAB toolbox for the numerical analysis of electric fields generated by transcranial direct current stimulation.

    PubMed

    Lee, Chany; Jung, Young-Jin; Lee, Sang Jun; Im, Chang-Hwan

    2017-02-01

    Since there is no way to measure electric current generated by transcranial direct current stimulation (tDCS) inside the human head through in vivo experiments, numerical analysis based on the finite element method has been widely used to estimate the electric field inside the head. In 2013, we released a MATLAB toolbox named COMETS, which has been used by a number of groups and has helped researchers to gain insight into the electric field distribution during stimulation. The aim of this study was to develop an advanced MATLAB toolbox, named COMETS2, for the numerical analysis of the electric field generated by tDCS. COMETS2 can generate any sizes of rectangular pad electrodes on any positions on the scalp surface. To reduce the large computational burden when repeatedly testing multiple electrode locations and sizes, a new technique to decompose the global stiffness matrix was proposed. As examples of potential applications, we observed the effects of sizes and displacements of electrodes on the results of electric field analysis. The proposed mesh decomposition method significantly enhanced the overall computational efficiency. We implemented an automatic electrode modeler for the first time, and proposed a new technique to enhance the computational efficiency. In this paper, an efficient toolbox for tDCS analysis is introduced (freely available at http://www.cometstool.com). It is expected that COMETS2 will be a useful toolbox for researchers who want to benefit from the numerical analysis of electric fields generated by tDCS. Copyright © 2016. Published by Elsevier B.V.

  18. Review of qualitative approaches for the construction industry: designing a risk management toolbox.

    PubMed

    Zalk, David M; Spee, Ton; Gillen, Matt; Lentz, Thomas J; Garrod, Andrew; Evans, Paul; Swuste, Paul

    2011-06-01

    This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions.

  19. Extending Inferential Group Analysis in Type 2 Diabetic Patients with Multivariate GLM Implemented in SPM8

    PubMed Central

    Ferreira, Fábio S.; Pereira, João M.S.; Duarte, João V.; Castelo-Branco, Miguel

    2017-01-01

    Background: Although voxel-based morphometry studies are still the standard for analyzing brain structure, their dependence on massive univariate inferential methods is a limiting factor. A better understanding of brain pathologies can be achieved by applying inferential multivariate methods, which allow the study of multiple dependent variables, e.g. different imaging modalities of the same subject. Objective: Given the widespread use of the SPM software in the brain imaging community, the main aim of this work is the implementation of massive multivariate inferential analysis as a toolbox in this software package, applied to T1 and T2 structural data from diabetic patients and controls. This implementation was compared with the traditional ANCOVA in SPM and a similar multivariate GLM toolbox (MRM). Method: We implemented the new toolbox and tested it by investigating brain alterations in a cohort of twenty-eight type 2 diabetes patients and twenty-six matched healthy controls, using information from both T1- and T2-weighted structural MRI scans, both separately (using standard univariate VBM) and simultaneously, with multivariate analyses. Results: Univariate VBM replicated predominantly bilateral changes in basal ganglia and insular regions in type 2 diabetes patients. Multivariate analyses replicated key findings of the univariate results, while also revealing the thalami as additional foci of pathology. Conclusion: While the presented algorithm must be further optimized, the proposed toolbox is the first implementation of multivariate statistics in SPM8 as a user-friendly toolbox; it shows great potential and is ready to be validated in other clinical cohorts and modalities. PMID:28761571

  20. T-MATS Toolbox for the Modeling and Analysis of Thermodynamic Systems

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.

    2014-01-01

    The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) is a MATLAB/Simulink (The MathWorks, Inc.) plug-in for creating and simulating thermodynamic systems and controls. The package contains generic parameterized components that can be combined with a variable input iterative solver and optimization algorithm to create complex system models, such as gas turbines.

  1. Various Solution Methods, Accompanied by Dynamic Investigation, for the Same Problem as a Means for Enriching the Mathematical Toolbox

    ERIC Educational Resources Information Center

    Oxman, Victor; Stupel, Moshe

    2018-01-01

    A geometrical task is presented with multiple solutions using different methods, in order to show the connection between various branches of mathematics and to highlight the importance of providing the students with an extensive 'mathematical toolbox'. Investigation of the property that appears in the task was carried out using a computerized tool.

  2. Various solution methods, accompanied by dynamic investigation, for the same problem as a means for enriching the mathematical toolbox

    NASA Astrophysics Data System (ADS)

    Oxman, Victor; Stupel, Moshe

    2018-04-01

    A geometrical task is presented with multiple solutions using different methods, in order to show the connection between various branches of mathematics and to highlight the importance of providing the students with an extensive 'mathematical toolbox'. Investigation of the property that appears in the task was carried out using a computerized tool.

  3. GOCE User Toolbox and Tutorial

    NASA Astrophysics Data System (ADS)

    Knudsen, P.; Benveniste, J.

    2011-07-01

    The GOCE User Toolbox (GUT) is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in geodesy, oceanography and solid Earth physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Macs. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models is made available as well. GUT has been developed in a collaboration within the GUT Core Group: S. Dinardo, D. Serpe, B.M. Lucas, R. Floberghagen, A. Horvath (ESA); O. Andersen, M. Herceg (DTU); M.-H. Rio, S. Mulet, G. Larnicol (CLS); J. Johannessen, L. Bertino (NERSC); H. Snaith, P. Challenor (NOC); K. Haines, D. Bretherton (NCEO); C. Hughes (POL); R.J. Bingham (NU); G. Balmino, S. Niemeijer, I. Price, L. Cornejo (S&T); M. Diament, I. Panet (IPGP); C.C. Tscherning (KU); D. Stammer, F. Siegismund (UH); T. Gruber (TUM).

  4. Turbo-Satori: a neurofeedback and brain-computer interface toolbox for real-time functional near-infrared spectroscopy.

    PubMed

    Lührs, Michael; Goebel, Rainer

    2017-10-01

    Turbo-Satori is a neurofeedback and brain-computer interface (BCI) toolbox for real-time functional near-infrared spectroscopy (fNIRS). It incorporates multiple pipelines, from real-time preprocessing and analysis to neurofeedback and BCI applications. The toolbox is designed with a focus on usability, enabling fast setup and execution of real-time experiments. Turbo-Satori uses an incremental recursive least-squares procedure for real-time general linear model calculation and support vector machine classifiers for advanced BCI applications. It communicates directly with common NIRx fNIRS hardware and was tested extensively, ensuring that the calculations can be performed in real time, without a significant change in calculation times for any sampling interval, during ongoing experiments of up to 6 h of recording. Immediate access to advanced processing features also makes the toolbox usable by students and nonexperts in fNIRS data acquisition and processing. Flexible network interfaces allow third-party stimulus applications to access the processed data and calculated statistics in real time, so that this information can easily be incorporated into neurofeedback or BCI presentations.
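
The incremental recursive least-squares idea behind such real-time GLM estimation can be illustrated generically. The sketch below is a textbook RLS update, not Turbo-Satori's code, and the toy regression it fits is an assumption:

```python
def rls_update(beta, P, x, y):
    """One recursive least-squares step: fold a new sample (x, y) into the
    coefficient vector beta and inverse-Gram matrix P without refitting
    the whole design matrix."""
    n = len(x)
    Px = [sum(P[i][j] * x[j] for j in range(n)) for i in range(n)]
    denom = 1.0 + sum(x[i] * Px[i] for i in range(n))
    gain = [v / denom for v in Px]
    err = y - sum(beta[i] * x[i] for i in range(n))
    beta = [beta[i] + gain[i] * err for i in range(n)]
    P = [[P[i][j] - gain[i] * Px[j] for j in range(n)] for i in range(n)]
    return beta, P

# stream in samples of y = 2 + 3*t one at a time
beta = [0.0, 0.0]
P = [[1e6, 0.0], [0.0, 1e6]]      # large P ~ uninformative start
for t in range(10):
    beta, P = rls_update(beta, P, [1.0, float(t)], 2.0 + 3.0 * t)
```

Each update costs O(n^2) in the number of regressors, which is what makes per-sample GLM refits feasible during an ongoing experiment.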

  5. HYDRORECESSION: A toolbox for streamflow recession analysis

    NASA Astrophysics Data System (ADS)

    Arciniega, S.

    2015-12-01

    Streamflow recession curves are hydrological signatures that allow studying the relationship between groundwater storage and baseflow and/or low flows at the catchment scale. Recent studies have shown that streamflow recession analysis can be quite sensitive to the combination of different models, extraction techniques and parameter estimation methods. In order to better characterize streamflow recession curves, new methodologies combining multiple approaches have been recommended. The HYDRORECESSION toolbox, presented here, is a Matlab graphical user interface for analysing streamflow recession time series, with tools for parameterizing linear and nonlinear storage-outflow relationships through four of the most useful recession models (Maillet, Boussinesq, Coutagne and Wittenberg). The toolbox includes four parameter-fitting techniques (linear regression, lower envelope, data binning and mean squared error) and three different methods to extract hydrograph recession segments (Vogel, Brutsaert and Aksoy). In addition, the toolbox has a module that separates the baseflow component from the observed hydrograph using the inverse reservoir algorithm. Potential applications of HYDRORECESSION include model parameter analysis, hydrological regionalization and classification, baseflow index estimates, catchment-scale recharge and low-flow modelling, among others. HYDRORECESSION is freely available for non-commercial and academic purposes.
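
Of the four recession models listed, the Maillet (linear-reservoir) form Q(t) = Q0 * exp(-t/k) is the simplest to fit. The sketch below fits it by linear regression on ln Q; this illustrates the technique in spirit only, not the toolbox's actual code, and the synthetic recession limb is an assumption.

```python
import math

def fit_maillet(t, q):
    """Fit Q(t) = Q0*exp(-t/k) by ordinary least squares on ln Q vs t;
    the slope gives -1/k and the intercept gives ln Q0."""
    n = len(t)
    y = [math.log(v) for v in q]
    tm = sum(t) / n
    ym = sum(y) / n
    slope = (sum((ti - tm) * (yi - ym) for ti, yi in zip(t, y))
             / sum((ti - tm) ** 2 for ti in t))
    q0 = math.exp(ym - slope * tm)
    k = -1.0 / slope
    return q0, k

# synthetic recession limb: Q0 = 10 m^3/s, storage constant k = 5 days
t = list(range(8))
q = [10.0 * math.exp(-ti / 5.0) for ti in t]
q0, k = fit_maillet(t, q)
```

The nonlinear models (Boussinesq, Coutagne, Wittenberg) require iterative fitting instead of this log-linearization.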

  6. Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis

    PubMed Central

    Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.

    2006-01-01

    In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user-friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software, relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than the widely used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented, demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709

  7. A sigma factor toolbox for orthogonal gene expression in Escherichia coli

    PubMed Central

    Van Brempt, Maarten; Van Nerom, Katleen; Van Hove, Bob; Maertens, Jo; De Mey, Marjan; Charlier, Daniel

    2018-01-01

    Synthetic genetic sensors and circuits enable programmable control over the timing and conditions of gene expression and, as a result, are increasingly incorporated into the control of complex and multi-gene pathways. The size and complexity of genetic circuits are growing but remain limited by a shortage of regulatory parts that can be used without interference. Therefore, orthogonal expression and regulation systems are needed to minimize undesired crosstalk and allow for dynamic control of separate modules. This work presents a set of orthogonal expression systems for use in Escherichia coli based on heterologous sigma factors from Bacillus subtilis that recognize specific promoter sequences. Up to four of the analyzed sigma factors can be combined to function orthogonally between each other and toward the host. Additionally, the toolbox is expanded by creating promoter libraries for three sigma factors without loss of their orthogonal nature. As this set covers a wide range of transcription initiation frequencies, it enables tuning of multiple outputs of the circuit in response to different sensory signals in an orthogonal manner. This sigma factor toolbox constitutes an interesting expansion of the synthetic biology toolbox and may contribute to the assembly of more complex synthetic genetic systems in the future. PMID:29361130

  8. ICT: isotope correction toolbox.

    PubMed

    Jungreuthmayer, Christian; Neubauer, Stefan; Mairinger, Teresa; Zanghellini, Jürgen; Hann, Stephan

    2016-01-01

    Isotope tracer experiments are an invaluable technique for analyzing and studying the metabolism of biological systems. However, isotope labeling experiments are often affected by naturally abundant isotopes, especially where mass spectrometric methods make use of derivatization. The correction of these additive interferences--in particular for complex isotopic systems--is numerically challenging and still an emerging field of research. When positional information is generated via collision-induced dissociation, even more complex calculations for isotopic interference correction are necessary. So far, no freely available tool can handle tandem mass spectrometry data. We present isotope correction toolbox (ICT), a program that corrects tandem mass isotopomer data from tandem mass spectrometry experiments. ICT is written in the multi-platform programming language Perl and can therefore be used on all commonly available computer platforms. Source code and documentation can be freely obtained under the Artistic License or the GNU General Public License from: https://github.com/jungreuc/isotope_correction_toolbox/ {christian.jungreuthmayer@boku.ac.at,juergen.zanghellini@boku.ac.at} Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
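
The basic (non-tandem) natural-abundance correction that such tools generalize can be sketched as a triangular linear system: each tracer mass isotopomer leaks into higher masses with binomial probability set by the natural 13C abundance. The fragment sizes and the distribution below are made up for illustration; ICT itself additionally handles the tandem-MS case.

```python
from math import comb

P13C = 0.0107  # natural abundance of 13C

def correction_matrix(n_labeled, n_natural, p=P13C):
    """Lower-triangular C with C[i][j] = probability that a molecule
    carrying j tracer labels is detected at mass shift i, because i-j of
    the n_natural unlabeled carbons happen to be natural 13C."""
    size = n_labeled + 1
    C = [[0.0] * size for _ in range(size)]
    for j in range(size):
        for i in range(j, size):
            k = i - j
            if k <= n_natural:
                C[i][j] = comb(n_natural, k) * p**k * (1 - p)**(n_natural - k)
    return C

def correct(measured, n_labeled, n_natural):
    """Recover the tracer distribution by forward substitution of C x = m."""
    C = correction_matrix(n_labeled, n_natural)
    x = []
    for i in range(len(measured)):
        x.append((measured[i] - sum(C[i][j] * x[j] for j in range(i))) / C[i][i])
    return x

# synthetic fragment: 2 tracked carbons plus 3 derivatization carbons
true = [0.5, 0.3, 0.2]
C = correction_matrix(2, 3)
measured = [sum(C[i][j] * true[j] for j in range(3)) for i in range(3)]
estimated = correct(measured, 2, 3)
```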

  9. Robust detection-isolation-accommodation for sensor failures

    NASA Technical Reports Server (NTRS)

    Weiss, J. L.; Pattipati, K. R.; Willsky, A. S.; Eterno, J. S.; Crawford, J. T.

    1985-01-01

    The results are presented of a one-year study to: (1) develop a theory for robust failure detection and identification (FDI) in the presence of model uncertainty; (2) develop a design methodology which utilizes the robust FDI theory; (3) apply the methodology to a sensor FDI problem for the F-100 jet engine; and (4) demonstrate the application of the theory to the evaluation of alternative FDI schemes. Theoretical results in statistical discrimination are used to evaluate the robustness of residual signals (or parity relations) in terms of their usefulness for FDI. Furthermore, optimally robust parity relations are derived through the optimization of robustness metrics. The result is viewed as decentralization of the FDI process. A general structure for decentralized FDI is proposed, and robustness metrics are used for determining various parameters of the algorithm.

  10. From black box to toolbox: Outlining device functionality, engagement activities, and the pervasive information architecture of mHealth interventions.

    PubMed

    Danaher, Brian G; Brendryen, Håvar; Seeley, John R; Tyler, Milagra S; Woolley, Tim

    2015-03-01

    mHealth interventions that deliver content via mobile phones represent a burgeoning area of health behavior change. The current paper examines two themes that can inform the underlying design of mHealth interventions: (1) mobile device functionality, which represents the technological toolbox available to intervention developers; and (2) the pervasive information architecture of mHealth interventions, which determines how intervention content can be delivered concurrently using mobile phones, personal computers, and other devices. We posit that developers of mHealth interventions will be better able to achieve the promise of this burgeoning arena by leveraging the toolbox and functionality of mobile devices in order to engage participants and encourage meaningful behavior change within the context of a carefully designed pervasive information architecture.

  11. Audio fingerprint extraction for content identification

    NASA Astrophysics Data System (ADS)

    Shiu, Yu; Yeh, Chia-Hung; Kuo, C. C. J.

    2003-11-01

    In this work, we present an audio content identification system that identifies unknown audio material by comparing its fingerprint with those extracted off-line and saved in a music database. We describe in detail the procedure to extract audio fingerprints and demonstrate that they are robust to noise and content-preserving manipulations. The main feature in the proposed system is the zero-crossing rate extracted with an octave-band filter bank. The zero-crossing rate describes the dominant frequency in each subband at a very low computational cost. The resulting audio fingerprint is small and can be stored efficiently alongside the compressed files in the database. It is also robust to many modifications such as tempo change and time-alignment distortion. In addition, the octave-band filter bank enhances robustness to distortions, especially those localized in certain frequency regions.
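
A minimal sketch of the zero-crossing-rate feature described in this record, applied full-band only; the octave-band filter bank is omitted, and the sampling rate and tone are illustrative values, not the paper's parameters:

```python
import math

def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs whose signs differ; for a
    narrowband signal this tracks the dominant frequency very cheaply."""
    flips = sum(1 for a, b in zip(frame, frame[1:]) if (a >= 0) != (b >= 0))
    return flips / (len(frame) - 1)

fs = 8000                      # sampling rate, Hz
tone = [math.sin(2 * math.pi * 440 * n / fs) for n in range(800)]
zcr = zero_crossing_rate(tone)
est_freq = zcr * fs / 2        # a sine crosses zero twice per cycle
```

In the fingerprinting system this statistic would be computed per octave band and per frame, then quantized into the compact fingerprint.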

  12. Smart phones: platform enabling modular, chemical, biological, and explosives sensing

    NASA Astrophysics Data System (ADS)

    Finch, Amethist S.; Coppock, Matthew; Bickford, Justin R.; Conn, Marvin A.; Proctor, Thomas J.; Stratis-Cullum, Dimitra N.

    2013-05-01

    Reliable, robust, and portable technologies are needed for the rapid identification and detection of chemical, biological, and explosive (CBE) materials. A key to addressing the persistent threat to U.S. troops in the current war on terror is the rapid detection and identification of the precursor materials used in development of improvised explosive devices, homemade explosives, and bio-warfare agents. However, a universal methodology for detection and prevention of CBE materials in the use of these devices has proven difficult. Herein, we discuss our efforts towards the development of a modular, robust, inexpensive, pervasive, archival, and compact platform (android based smart phone) enabling the rapid detection of these materials.

  13. Leaf epidermis images for robust identification of plants

    PubMed Central

    da Silva, Núbia Rosa; Oliveira, Marcos William da Silva; Filho, Humberto Antunes de Almeida; Pinheiro, Luiz Felipe Souza; Rossatto, Davi Rodrigo; Kolb, Rosana Marta; Bruno, Odemir Martinez

    2016-01-01

    This paper proposes a methodology for plant analysis and identification based on extracting texture features from microscopic images of the leaf epidermis. All the experiments were carried out using 32 plant species with 309 epidermal samples captured by an optical microscope coupled to a digital camera. The results of the computational methods using texture features were compared to the conventional approach, where quantitative measurements of stomatal traits (density, length and width) were obtained manually. Epidermis image classification using texture achieved a success rate of over 96%, while the success rate for the manual quantitative measurements was around 60%. Furthermore, we verified the robustness of our method with respect to the natural phenotypic plasticity of stomata, analysing samples from the same species grown in different environments. The texture methods remained robust even under phenotypic plasticity of stomatal traits, with a decrease of only 20% in the success rate, whereas the quantitative measurements proved fully sensitive to it, with a decrease of 77%. The comparison between the computational approach and conventional quantitative measurements shows how advantageous and promising computational systems are for solving problems in botany, such as species identification. PMID:27217018

  14. Toolbox No. 2: Expanding the Role of Foster Parents in Achieving Permanency. Toolboxes for Permanency.

    ERIC Educational Resources Information Center

    Dougherty, Susan

    Noting that over the last decade, the role of a foster parent has evolved from temporary caregiver to essential part of a professional team in determining the best long-term plan for children in their care, this guide focuses on practical ways in which best child welfare practice can be incorporated into the recruitment, training, and support of…

  15. Analytical redundancy and the design of robust failure detection systems

    NASA Technical Reports Server (NTRS)

    Chow, E. Y.; Willsky, A. S.

    1984-01-01

    The Failure Detection and Identification (FDI) process is viewed as consisting of two stages: residual generation and decision making. It is argued that a robust FDI system can be achieved by designing a robust residual generation process. Analytical redundancy, the basis for residual generation, is characterized in terms of a parity space. Using the concept of parity relations, residuals can be generated in a number of ways, and the design of a robust residual generation process can be formulated as a minimax optimization problem. An example is included to illustrate this design methodology. Previously announced in STAR as N83-20653.
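
A toy instance of the parity-relation idea, using simple triplex hardware redundancy rather than the paper's analytical redundancy and minimax design; the threshold and sensor readings are invented:

```python
def isolate_failed_sensor(y, threshold=0.5):
    """Evaluate the pairwise parity relations r_ij = y_i - y_j of three
    redundant sensors measuring the same quantity; a single failed sensor
    fires exactly the two residuals it participates in."""
    r01 = abs(y[0] - y[1]) > threshold
    r12 = abs(y[1] - y[2]) > threshold
    r02 = abs(y[0] - y[2]) > threshold
    if r01 and r02 and not r12:
        return 0
    if r01 and r12 and not r02:
        return 1
    if r02 and r12 and not r01:
        return 2
    return None  # readings mutually consistent: no fault isolated

healthy = isolate_failed_sensor([5.0, 5.1, 4.9])
failed = isolate_failed_sensor([7.0, 5.1, 4.9])
```

Analytical redundancy generalizes this: the parity relations are built from the plant model rather than from duplicate sensors, and robust design chooses the relations least sensitive to model uncertainty.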

  16. A nonlinear disturbance-decoupled elevation axis controller for the Multiple Mirror Telescope

    NASA Astrophysics Data System (ADS)

    Clark, Dusty; Trebisky, Tom; Powell, Keith

    2008-07-01

    The Multiple Mirror Telescope (MMT), upgraded in 2000 to a monolithic 6.5m primary mirror from its original array of six 1.8m primary mirrors, was commissioned with axis controllers designed early in the upgrade process without regard to structural resonances or the possibility of the need for digital filtering of the control axis signal path. Post-commissioning performance issues led us to investigate replacement of the original control system with a more modern digital controller offering full control over the system filters and gain paths. This work, from system identification through controller design iteration by simulation to pre-deployment hardware-in-the-loop testing, was performed using latest-generation tools with Matlab® and Simulink®. Using Simulink's Real-Time Workshop toolbox to automatically generate C source code for the controller from the Simulink diagram and a custom target build script, we were able to deploy the new controller into our existing software infrastructure running Wind River's VxWorks™ real-time operating system. This paper describes the process of the controller design, including system identification data collection, with discussion of the implementation of non-linear control modes and disturbance decoupling, which became necessary to obtain acceptable wind buffeting rejection.
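Digital filtering of a control axis around a structural resonance, as described above, is typically done with a notch filter. The hand-rolled sketch below places unit-circle zeros at an assumed resonance frequency and nearby poles to restore gain elsewhere; the paper does not publish the MMT's filter coefficients, so the 18 Hz resonance and 200 Hz loop rate are invented placeholders.

```python
import numpy as np

fs = 200.0          # servo loop sample rate, Hz (assumed)
f_notch = 18.0      # structural resonance to suppress, Hz (assumed)
r = 0.95            # pole radius: closer to 1 gives a narrower notch

w0 = 2 * np.pi * f_notch / fs
# Zeros on the unit circle at +-w0 kill the resonance; the nearby poles
# keep the gain close to unity away from the notch.
b = np.array([1.0, -2 * np.cos(w0), 1.0])
a = np.array([1.0, -2 * r * np.cos(w0), r ** 2])

def gain(f):
    """Magnitude response |H(e^{jw})| of the notch at frequency f (Hz)."""
    z = np.exp(1j * 2 * np.pi * f / fs)
    return abs(np.polyval(b, z) / np.polyval(a, z))
```

Evaluating `gain` at the notch frequency gives essentially zero, while frequencies well away from it pass with near-unity gain.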

  17. Fission gas bubble identification using MATLAB's image processing toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collette, R.; King, J.; Keiser, Jr., D.

    Automated image processing routines have the potential to aid in the fuel performance evaluation process by eliminating bias in human judgment that may vary from person-to-person or sample-to-sample. This study presents several MATLAB-based image analysis routines designed for fission gas void identification in post-irradiation examination of uranium molybdenum (U–Mo) monolithic-type plate fuels. Frequency domain filtration, enlisted as a pre-processing technique, can eliminate artifacts from the image without compromising the critical features of interest. This process is coupled with a bilateral filter, an edge-preserving noise removal technique aimed at preparing the image for optimal segmentation. Adaptive thresholding proved to be the most consistent gray-level feature segmentation technique for U–Mo fuel microstructures. The Sauvola adaptive threshold technique segments the image based on histogram weighting factors in stable contrast regions and local statistics in variable contrast regions. Once all processing is complete, the algorithm outputs the total fission gas void count, the mean void size, and the average porosity. The final results demonstrate an ability to extract fission gas void morphological data faster, more consistently, and at least as accurately as manual segmentation methods.
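The Sauvola adaptive threshold named above computes a per-pixel threshold T = m·(1 + k·(s/R − 1)) from the local mean m and local standard deviation s. The study's routines are MATLAB-based; the sketch below is a Python/SciPy rendering for illustration, with conventional default values for the window size, k, and R rather than the study's settings, and with an assumed polarity (voids darker than the fuel matrix).

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sauvola_threshold(img, window=15, k=0.2, R=128.0):
    """Sauvola adaptive threshold: T = m * (1 + k*((s/R) - 1))."""
    img = img.astype(float)
    m = uniform_filter(img, window)             # local mean
    m2 = uniform_filter(img * img, window)      # local mean of squares
    s = np.sqrt(np.clip(m2 - m * m, 0, None))   # local standard deviation
    T = m * (1 + k * ((s / R) - 1))
    # True = above-threshold (assumed fuel matrix), False = dark void.
    return img > T

def porosity(void_mask):
    """Average porosity = fraction of pixels classified as void."""
    return float(void_mask.mean())
```

On a synthetic bright plate with one small dark inclusion, the routine recovers the inclusion as void and reports the corresponding porosity fraction.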

  18. Identification of Dynapenia in Older Adults Through the Use of Grip Strength T-Scores

    PubMed Central

    Bohannon, Richard W; Magasi, Susan

    2014-01-01

    Objective To generate reference values and t-scores (1.0 to 2.5 standard deviations below average) for grip strength for healthy young adults and to examine the utility of t-scores from this group for the identification of dynapenia in older adults. Design Secondary analysis of cross-sectional grip strength data from the NIH Toolbox norming sample. Setting Population-based general community sample. Participants Community-dwelling adults between the ages of 20 and 40 years (n=558) and 60 to 85 years (n=390). Main Outcome Measures Grip strength measured with a Jamar plus dynamometer. Results Maximum grip strengths were consistent over the 20–40 year age span: for men, 108.0 lbs (S.D. 22.6); for women, 65.8 lbs (S.D. 14.6). Comparison of older participant grip strengths to those of the younger reference group revealed (depending on age strata) that 46.2–87.1% of older men and 50.0–82.4% of older women could be designated as dynapenic on the basis of t-scores. Conclusion The use of reference value t-scores from younger adults is a promising method for determining dynapenia in older adults. PMID:24729356
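The t-score logic is simple enough to state directly: a grip strength is expressed in standard deviations relative to the young-adult reference mean, and values at or below 1.0 SD under the mean fall in the dynapenic band. The reference means and SDs below are taken from the abstract; the exact cutoff used for classification is an assumption based on the stated 1.0–2.5 SD band.

```python
# Young-adult reference values from the abstract: men 108.0 lb (SD 22.6),
# women 65.8 lb (SD 14.6).
REFERENCE = {"men": (108.0, 22.6), "women": (65.8, 14.6)}

def grip_t_score(strength_lb, sex):
    """Grip strength expressed in SDs from the young-adult mean."""
    mean, sd = REFERENCE[sex]
    return (strength_lb - mean) / sd

def is_dynapenic(strength_lb, sex, cutoff=-1.0):
    """Dynapenia if the t-score is at or below the cutoff (assumed -1.0 SD)."""
    return grip_t_score(strength_lb, sex) <= cutoff
```

For example, a man gripping 80 lb scores about -1.24 SD and would be flagged, while a woman gripping 70 lb scores above the reference mean and would not.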

  19. Identification of dynapenia in older adults through the use of grip strength t-scores.

    PubMed

    Bohannon, Richard W; Magasi, Susan

    2015-01-01

    The aim of this study was to generate reference values and t-scores (1.0-2.5 standard deviations below average) for grip strength for healthy young adults and to examine the utility of t-scores from this group for the identification of dynapenia in older adults. Our investigation was a population-based, general community secondary analysis of cross-sectional grip strength data utilizing the NIH Toolbox Assessment norming sample. Participants consisted of community-dwelling adults, with age ranges of 20-40 years (n = 558) and 60-85 years (n = 390). The main outcome measure was grip strength using a Jamar plus dynamometer. Maximum grip strengths were consistent over the 20-40-year age group [men 108.0 (SD 22.6) pounds, women 65.8 (SD 14.6) pounds]. Comparison of older group grip strengths to those of the younger reference group revealed (depending on age strata) that 46.2-87.1% of older men and 50.0-82.4% of older women could be designated as dynapenic on the basis of t-scores. The use of reference value t-scores from younger adults is a promising method for determining dynapenia in older adults. © 2014 Wiley Periodicals, Inc.

  20. Fission gas bubble identification using MATLAB's image processing toolbox

    DOE PAGES

    Collette, R.; King, J.; Keiser, Jr., D.; ...

    2016-06-08

    Automated image processing routines have the potential to aid in the fuel performance evaluation process by eliminating bias in human judgment that may vary from person-to-person or sample-to-sample. This study presents several MATLAB-based image analysis routines designed for fission gas void identification in post-irradiation examination of uranium molybdenum (U–Mo) monolithic-type plate fuels. Frequency domain filtration, enlisted as a pre-processing technique, can eliminate artifacts from the image without compromising the critical features of interest. This process is coupled with a bilateral filter, an edge-preserving noise removal technique aimed at preparing the image for optimal segmentation. Adaptive thresholding proved to be the most consistent gray-level feature segmentation technique for U–Mo fuel microstructures. The Sauvola adaptive threshold technique segments the image based on histogram weighting factors in stable contrast regions and local statistics in variable contrast regions. Once all processing is complete, the algorithm outputs the total fission gas void count, the mean void size, and the average porosity. The final results demonstrate an ability to extract fission gas void morphological data faster, more consistently, and at least as accurately as manual segmentation methods.

  1. High resolution critical habitat mapping and classification of tidal freshwater wetlands in the ACE Basin

    NASA Astrophysics Data System (ADS)

    Strickland, Melissa Anne

    In collaboration with the South Carolina Department of Natural Resources ACE Basin National Estuarine Research Reserve (ACE Basin NERR), the tidal freshwater ecosystems along the South Edisto River in the ACE Basin are being accurately mapped and classified using a LIDAR-Remote Sensing Fusion technique that integrates LAS LIDAR data into texture images and then merges the elevation textures and multispectral imagery for very high resolution mapping. This project discusses the development and refinement of an ArcGIS Toolbox capable of automating protocols and procedures for marsh delineation and microhabitat identification. The result is a high resolution habitat and land use map used for the identification of threatened habitat. Tidal freshwater wetlands are also a critical habitat for colonial wading birds and an accurate assessment of community diversity and acreage of this habitat type in the ACE Basin will support SCDNR's conservation and protection efforts. The maps developed by this study will be used to better monitor the freshwater/saltwater interface and establish a baseline for an ACE NERR monitoring program to track the rates and extent of alterations due to projected environmental stressors. Preliminary ground-truthing in the field will provide information about the accuracy of the mapping tool.

  2. ESA BRAT (Broadview Radar Altimetry Toolbox) and GUT (GOCE User Toolbox) toolboxes

    NASA Astrophysics Data System (ADS)

    Benveniste, J.; Ambrozio, A.; Restano, M.

    2016-12-01

    The Broadview Radar Altimetry Toolbox (BRAT) is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including the upcoming Sentinel-3A L1 and L2 products. A tutorial providing a range of use cases is included. BRAT's future release (4.0.0) is planned for September 2016. Based on community feedback, the frontend has been further improved and simplified, whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain the desired data while bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving the combination of data fields, which can be saved for future use, either by using embedded formulas including those from oceanographic altimetry, or by implementing ad-hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. The GOCE User Toolbox (GUT) is a compilation of tools for the use and the analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data and allows gravity field data to be pre-processed, in conjunction and consistently with any other auxiliary data set, by beginners in gravity field processing, for oceanographic and hydrologic as well as solid earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during the GRACE and GOCE missions. In the current 3.0 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements aiming at facilitating the use of gradients, the anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies have been introduced. 
Packaged with GUT is also GUT's VCM (Variance-Covariance Matrix) tool for analysing GOCE's variance-covariance matrices. BRAT and GUT toolboxes can be freely downloaded, along with ancillary material, at https://earth.esa.int/brat and https://earth.esa.int/gut.

  3. Simple tool for the rapid, automated quantification of glacier advance/retreat observations using multiple methods

    NASA Astrophysics Data System (ADS)

    Lea, J.

    2017-12-01

    The quantification of glacier change is a key variable within glacier monitoring, with the method used potentially being crucial to ensuring that data can be appropriately compared with environmental data. The topic and timescales of study (e.g. land/marine terminating environments; sub-annual/decadal/centennial/millennial timescales) often mean that different methods are more suitable for different problems. However, depending on the GIS/coding expertise of the user, some methods can potentially be time consuming to undertake, making large-scale studies problematic. In addition, examples exist where different users have nominally applied the same methods in different studies, though with minor methodological inconsistencies in their approach. In turn, this will have implications for data homogeneity where regional/global datasets may be constructed. Here, I present a simple toolbox scripted in a Matlab® environment that requires only glacier margin and glacier centreline data to quantify glacier length, glacier change between observations, rate of change, in addition to other metrics. The toolbox includes the option to apply the established centreline or curvilinear box methods, or a new method: the variable box method - designed for tidewater margins where box width is defined as the total width of the individual terminus observation. The toolbox is extremely flexible, and has the option to be applied as either Matlab® functions within user scripts, or via a graphical user interface (GUI) for those unfamiliar with a coding environment. In both instances, there is potential to apply the methods quickly to large datasets (100s-1000s of glaciers, with potentially similar numbers of observations each), thus ensuring large scale methodological consistency (and therefore data homogeneity) and allowing regional/global scale analyses to be achievable for those with limited GIS/coding experience. 
The toolbox has been evaluated against idealised scenarios demonstrating its accuracy, while feedback from undergraduate students who have trialled the toolbox is that it is intuitive and simple to use. When released, the toolbox will be free and open source allowing users to potentially modify, improve and expand upon the current version.
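The box methods mentioned above share one core computation: terminus change is the area enclosed between two dated margin traces divided by the box width, which averages retreat across the full terminus rather than sampling a single centreline point. The sketch below is a simplified Python illustration of that idea (the actual toolbox is MATLAB-based, and its variable box method derives the width from each terminus observation).

```python
import numpy as np

def polygon_area(x, y):
    """Shoelace area of a closed polygon given vertex coordinate arrays."""
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def box_method_length_change(margin_old, margin_new, box_width):
    """Terminus length change = (area between margins) / (box width).

    Margins are (N, 2) arrays of map coordinates ordered across the
    terminus; the second margin is reversed to close the polygon.
    """
    x = np.concatenate([margin_old[:, 0], margin_new[::-1, 0]])
    y = np.concatenate([margin_old[:, 1], margin_new[::-1, 1]])
    return polygon_area(x, y) / box_width
```

Dividing the length change by the time between observations then gives the retreat/advance rate the abstract lists among the toolbox outputs.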

  4. TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy

    PubMed Central

    2011-01-01

    Background Transfer entropy (TE) is a measure for the detection of directed interactions. Transfer entropy is an information theoretic implementation of Wiener's principle of observational causality. It offers an approach to the detection of neuronal interactions that is free of an explicit model of the interactions. Hence, it offers the power to analyze linear and nonlinear interactions alike. This allows for example the comprehensive analysis of directed interactions in neural networks at various levels of description. Here we present the open-source MATLAB toolbox TRENTOOL that allows the user to handle the considerable complexity of this measure and to validate the obtained results using non-parametrical statistical testing. We demonstrate the use of the toolbox and the performance of the algorithm on simulated data with nonlinear (quadratic) coupling and on local field potentials (LFP) recorded from the retina and the optic tectum of the turtle (Pseudemys scripta elegans) where a neuronal one-way connection is likely present. Results In simulated data TE detected information flow in the simulated direction reliably with false positives not exceeding the rates expected under the null hypothesis. In the LFP data we found directed interactions from the retina to the tectum, despite the complicated signal transformations between these stages. No false positive interactions in the reverse directions were detected. Conclusions TRENTOOL is an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information theoretic measure. TRENTOOL is implemented as a MATLAB toolbox and available under an open source license (GPL v3). For the use with neural data TRENTOOL seamlessly integrates with the popular FieldTrip toolbox. PMID:22098775
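The definition behind TRENTOOL can be illustrated with a plug-in histogram estimator: TE from X to Y measures how much the past of X improves prediction of Y beyond Y's own past. The sketch below uses discrete histograms and history length 1 for clarity; TRENTOOL itself uses nearest-neighbour estimators and optimised embeddings, so this is an illustration of the quantity, not of the toolbox's algorithm.

```python
import numpy as np

def transfer_entropy(x, y, bins=2):
    """TE from x to y (history length 1), in bits, via joint histograms."""
    x = np.asarray(x); y = np.asarray(y)
    xt, yt, y1 = x[:-1], y[:-1], y[1:]
    joint, _ = np.histogramdd((y1, yt, xt), bins=bins)  # p(y_{t+1}, y_t, x_t)
    joint /= joint.sum()
    p_y1_yt = joint.sum(axis=2)       # p(y_{t+1}, y_t)
    p_yt_xt = joint.sum(axis=0)       # p(y_t, x_t)
    p_yt = joint.sum(axis=(0, 2))     # p(y_t)
    te = 0.0
    for i in range(bins):
        for j in range(bins):
            for k in range(bins):
                p = joint[i, j, k]
                if p > 0:
                    num = p / p_yt_xt[j, k]        # p(y_{t+1} | y_t, x_t)
                    den = p_y1_yt[i, j] / p_yt[j]  # p(y_{t+1} | y_t)
                    te += p * np.log2(num / den)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.roll(x, 1)  # y copies x with one step of lag: X drives Y
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
```

For the lagged copy, TE in the driving direction approaches 1 bit while the reverse direction stays near zero, mirroring the one-way retina-to-tectum result described in the abstract.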

  5. Development of a CRISPR/Cas9 genome editing toolbox for Corynebacterium glutamicum.

    PubMed

    Liu, Jiao; Wang, Yu; Lu, Yujiao; Zheng, Ping; Sun, Jibin; Ma, Yanhe

    2017-11-16

    Corynebacterium glutamicum is an important industrial workhorse and advanced genetic engineering tools are urgently needed. Recently, the clustered regularly interspaced short palindromic repeats (CRISPR) and their CRISPR-associated proteins (Cas) have revolutionized the field of genome engineering. The CRISPR/Cas9 system, which utilizes NGG as its protospacer adjacent motif (PAM) and has good targeting specificity, can be developed into a powerful tool for efficient and precise genome editing of C. glutamicum. Herein, we developed a versatile CRISPR/Cas9 genome editing toolbox for C. glutamicum. Cas9 and gRNA expression cassettes were reconstituted to combat Cas9 toxicity and facilitate effective termination of gRNA transcription. Co-transformation of Cas9 and gRNA expression plasmids was exploited to overcome high-frequency mutation of cas9, allowing not only highly efficient gene deletion and insertion with plasmid-borne editing templates (efficiencies up to 60.0 and 62.5%, respectively) but also simple and time-saving operation. Furthermore, CRISPR/Cas9-mediated ssDNA recombineering was developed to precisely introduce small modifications and single-nucleotide changes into the genome of C. glutamicum with efficiencies over 80.0%. Notably, double-locus editing was also achieved in C. glutamicum. This toolbox works well in several C. glutamicum strains including the widely-used strains ATCC 13032 and ATCC 13869. In this study, we developed a CRISPR/Cas9 toolbox that could facilitate markerless gene deletion, gene insertion, precise base editing, and double-locus editing in C. glutamicum. The CRISPR/Cas9 toolbox holds promise for accelerating the engineering of C. glutamicum and advancing its application in the production of biochemicals and biofuels.
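The NGG PAM rule mentioned above is easy to illustrate: a SpCas9 target site is a 20-nt protospacer immediately followed by any nucleotide and two guanines. The sketch below scans only the forward strand for such sites; real gRNA design (as in the toolbox described) also considers the reverse strand, off-target scores, and sequence composition.

```python
import re

def find_cas9_targets(seq):
    """Return (position, protospacer, PAM) for forward-strand NGG sites.

    Uses a lookahead so overlapping candidate sites are all reported.
    """
    seq = seq.upper()
    targets = []
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
        targets.append((m.start(1), m.group(1), m.group(2)))
    return targets
```

For example, a 20-nt stretch followed by "AGG" is reported as one candidate target with its PAM.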

  6. Evaluation of an educational "toolbox" for improving nursing staff competence and psychosocial work environment in elderly care: results of a prospective, non-randomized controlled intervention.

    PubMed

    Arnetz, J E; Hasson, H

    2007-07-01

    Lack of professional development opportunities among nursing staff is a major concern in elderly care and has been associated with work dissatisfaction and staff turnover. There is a lack of prospective, controlled studies evaluating the effects of educational interventions on nursing competence and work satisfaction. The aim of this study was to evaluate the possible effects of an educational "toolbox" intervention on nursing staff ratings of their competence, psychosocial work environment and overall work satisfaction. The study was a prospective, non-randomized, controlled intervention involving nursing staff in two municipal elderly care organizations in western Sweden. In an initial questionnaire survey, nursing staff in the intervention municipality described several areas in which they felt a need for competence development. Measurement instruments and educational materials for improving staff knowledge and work practices were then collated by researchers and managers in a "toolbox." Nursing staff ratings of their competence and work were measured pre- and post-intervention by questionnaire. Staff ratings in the intervention municipality were compared to staff ratings in the reference municipality, where no toolbox was introduced. Nursing staff ratings of their competence and psychosocial work environment, including overall work satisfaction, improved significantly over time in the intervention municipality, compared to the reference group. Both competence and work environment ratings were largely unchanged among reference municipality staff. Multivariate analysis revealed a significant interaction effect between municipalities over time for nursing staff ratings of participation, leadership, performance feedback and skills development. Staff ratings for these four scales improved significantly in the intervention municipality as compared to the reference municipality. 
Compared to a reference municipality, nursing staff ratings of their competence and the psychosocial work environment improved in the municipality where the toolbox was introduced.

  7. ReTrOS: a MATLAB toolbox for reconstructing transcriptional activity from gene and protein expression data.

    PubMed

    Minas, Giorgos; Momiji, Hiroshi; Jenkins, Dafyd J; Costa, Maria J; Rand, David A; Finkenstädt, Bärbel

    2017-06-26

    Given the development of high-throughput experimental techniques, an increasing number of whole genome transcription profiling time series data sets, with good temporal resolution, are becoming available to researchers. The ReTrOS toolbox (Reconstructing Transcription Open Software) provides MATLAB-based implementations of two related methods, namely ReTrOS-Smooth and ReTrOS-Switch, for reconstructing the temporal transcriptional activity profile of a gene from given mRNA expression time series or protein reporter time series. The methods are based on fitting a differential equation model incorporating the processes of transcription, translation and degradation. The toolbox provides a framework for model fitting along with statistical analyses of the model with a graphical interface and model visualisation. We highlight several applications of the toolbox, including the reconstruction of the temporal cascade of transcriptional activity inferred from mRNA expression data and protein reporter data in the core circadian clock in Arabidopsis thaliana, and how such reconstructed transcription profiles can be used to study the effects of different cell lines and conditions. The ReTrOS toolbox allows users to analyse gene and/or protein expression time series where, with appropriate formulation of prior information about a minimum of kinetic parameters, in particular rates of degradation, users are able to infer timings of changes in transcriptional activity. Data from any organism and obtained from a range of technologies can be used as input due to the flexible and generic nature of the model and implementation. The output from this software provides a useful analysis of time series data and can be incorporated into further modelling approaches or in hypothesis generation.
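The differential equation model named above couples transcription, translation and degradation: dM/dt = τ(t) − δ_M·M for mRNA and dP/dt = β·M − δ_P·P for the reporter protein. The forward simulation below illustrates that model with invented rate values and an assumed step input in transcriptional activity; ReTrOS solves the harder inverse problem of reconstructing τ(t) from the observed M or P profiles.

```python
import numpy as np

beta, delta_M, delta_P = 2.0, 0.5, 0.1  # illustrative rates, not fitted values

def tau(t):
    """Assumed transcriptional activity: a step switching on at t = 10."""
    return 1.0 if t >= 10 else 0.0

def simulate(t_end=50.0, dt=0.001):
    """Forward-Euler integration of dM/dt = tau - dM*M, dP/dt = beta*M - dP*P."""
    n = int(t_end / dt)
    M = P = 0.0
    ts, Ms, Ps = [], [], []
    for i in range(n):
        t = i * dt
        M += dt * (tau(t) - delta_M * M)
        P += dt * (beta * M - delta_P * P)
        ts.append(t); Ms.append(M); Ps.append(P)
    return np.array(ts), np.array(Ms), np.array(Ps)

ts, Ms, Ps = simulate()
```

The mRNA settles quickly to τ/δ_M = 2 while the slower-degrading protein lags toward β·M/δ_P = 40, which is why knowing the degradation rates (as ReTrOS requires) lets one infer the timing of transcriptional switches from delayed reporter data.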

  8. TRENTOOL: a Matlab open source toolbox to analyse information flow in time series data with transfer entropy.

    PubMed

    Lindner, Michael; Vicente, Raul; Priesemann, Viola; Wibral, Michael

    2011-11-18

    Transfer entropy (TE) is a measure for the detection of directed interactions. Transfer entropy is an information theoretic implementation of Wiener's principle of observational causality. It offers an approach to the detection of neuronal interactions that is free of an explicit model of the interactions. Hence, it offers the power to analyze linear and nonlinear interactions alike. This allows for example the comprehensive analysis of directed interactions in neural networks at various levels of description. Here we present the open-source MATLAB toolbox TRENTOOL that allows the user to handle the considerable complexity of this measure and to validate the obtained results using non-parametrical statistical testing. We demonstrate the use of the toolbox and the performance of the algorithm on simulated data with nonlinear (quadratic) coupling and on local field potentials (LFP) recorded from the retina and the optic tectum of the turtle (Pseudemys scripta elegans) where a neuronal one-way connection is likely present. In simulated data TE detected information flow in the simulated direction reliably with false positives not exceeding the rates expected under the null hypothesis. In the LFP data we found directed interactions from the retina to the tectum, despite the complicated signal transformations between these stages. No false positive interactions in the reverse directions were detected. TRENTOOL is an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information theoretic measure. TRENTOOL is implemented as a MATLAB toolbox and available under an open source license (GPL v3). For the use with neural data TRENTOOL seamlessly integrates with the popular FieldTrip toolbox.

  9. A new impetus for guideline development and implementation: construction and evaluation of a toolbox.

    PubMed

    Hilbink, Mirrian A H W; Ouwens, Marielle M T J; Burgers, Jako S; Kool, Rudolf B

    2014-03-19

    In the last decade, guideline organizations faced a number of problems, including a lack of standardization in guideline development methods and suboptimal guideline implementation. To contribute to the solution of these problems, we produced a toolbox for guideline development, implementation, revision, and evaluation. All relevant guideline organizations in the Netherlands were approached to prioritize the topics. We sent out a questionnaire and discussed the results at an invitational conference. Based on consensus, twelve topics were selected for the development of new tools. Subsequently, working groups were composed for the development of the tools. After development of the tools, their draft versions were pilot tested in 40 guideline projects. Based on the results of the pilot tests, the tools were refined and their final versions were presented. The vast majority of organizations involved in pilot testing of the tools reported satisfaction with using the tools. Guideline experts involved in pilot testing of the tools proposed a variety of suggestions for the implementation of the tools. The tools are available in Dutch and in English at a web-based platform on guideline development and implementation (http://www.ha-ring.nl). A collaborative approach was used for the development and evaluation of a toolbox for development, implementation, revision, and evaluation of guidelines. This approach yielded a potentially powerful toolbox for improving the quality and implementation of Dutch clinical guidelines. Collaboration between guideline organizations within this project led to stronger linkages, which is useful for enhancing coordination of guideline development and implementation and preventing duplication of efforts. Use of the toolbox could improve quality standards in the Netherlands, and might facilitate the development of high-quality guidelines in other countries as well.

  10. Predictive Mining of Time Series Data

    NASA Astrophysics Data System (ADS)

    Java, A.; Perlman, E. S.

    2002-05-01

    All-sky monitors are a relatively new development in astronomy, and their data represent a largely untapped resource. Proper utilization of this resource could lead to important discoveries not only in the physics of variable objects, but in how one observes such objects. We discuss the development of a Java toolbox for astronomical time series data. Rather than using methods conventional in astronomy (e.g., power spectrum and cross-correlation analysis) we employ rule discovery techniques commonly used in analyzing stock-market data. By clustering patterns found within the data, rule discovery allows one to build predictive models, allowing one to forecast when a given event might occur or whether the occurrence of one event will trigger a second. We have tested the toolbox and accompanying display tool on datasets (representing several classes of objects) from the RXTE All Sky Monitor. We use these datasets to illustrate the methods and functionality of the toolbox. We have found predictive patterns in several ASM datasets. We also discuss problems faced in the development process, particularly the difficulties of dealing with discretized and irregularly sampled data. A possible application would be in scheduling target of opportunity observations where the astronomer wants to observe an object when a certain event or series of events occurs. By combining such a toolbox with an automatic, Java query tool which regularly gathers data on objects of interest, the astronomer or telescope operator could use the real-time datastream to efficiently predict the occurrence of (for example) a flare or other event. By combining the toolbox with dynamic time warping data-mining tools, one could predict events which may happen on variable time scales.
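The rule-discovery idea sketched above, discretizing a light curve into events and measuring how reliably one event predicts another, can be reduced to a toy example. The sketch below (in Python rather than the toolbox's Java) scores the rule "an above-threshold event is followed by another within a fixed horizon"; the actual toolbox clusters richer patterns and handles irregular sampling.

```python
import numpy as np

def rule_confidence(series, threshold, horizon=5):
    """Confidence of the rule 'event now -> another event within `horizon` steps'.

    Events are samples exceeding `threshold`; confidence is the fraction
    of events that are followed by a later event inside the horizon.
    """
    events = np.flatnonzero(np.asarray(series) > threshold)
    if len(events) == 0:
        return 0.0
    hits = 0
    for t in events:
        later = events[(events > t) & (events <= t + horizon)]
        if len(later):
            hits += 1
    return hits / len(events)
```

A strictly periodic flare train scores near-perfect confidence (only the final event lacks a successor), whereas an isolated event scores zero, which is the kind of predictive pattern the abstract describes for scheduling target-of-opportunity observations.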

  11. Switch of Sensitivity Dynamics Revealed with DyGloSA Toolbox for Dynamical Global Sensitivity Analysis as an Early Warning for System's Critical Transition

    PubMed Central

    Baumuratova, Tatiana; Dobre, Simona; Bastogne, Thierry; Sauter, Thomas

    2013-01-01

    Systems with bifurcations may experience abrupt irreversible and often unwanted shifts in their performance, called critical transitions. For many systems like climate, economy, ecosystems it is highly desirable to identify indicators serving as early warnings of such regime shifts. Several statistical measures were recently proposed as early warnings of critical transitions including increased variance, autocorrelation and skewness of experimental or model-generated data. The lack of an automated tool for model-based prediction of critical transitions led to the design of DyGloSA – a MATLAB toolbox for dynamical global parameter sensitivity analysis (GPSA) of ordinary differential equation models. We suggest that the switch in dynamics of parameter sensitivities revealed by our toolbox is an early warning that a system is approaching a critical transition. We illustrate the efficiency of our toolbox by analyzing several models with bifurcations and predicting the time periods when systems can still avoid going to a critical transition by manipulating certain parameter values, which is not detectable with the existing SA techniques. DyGloSA is based on the SBToolbox2 and contains functions that dynamically compute the global sensitivity indices of the system by applying four main GPSA methods: eFAST, Sobol's ANOVA, PRCC and WALS. It includes parallelized versions of the functions, enabling a significant reduction of the computational time (up to 12 times). DyGloSA is freely available as a set of MATLAB scripts at http://bio.uni.lu/systems_biology/software/dyglosa. It requires installation of MATLAB (versions R2008b or later) and the Systems Biology Toolbox2 available at www.sbtoolbox2.org. DyGloSA can be run on 32- and 64-bit Windows and Linux systems. PMID:24367574
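One of the four GPSA methods listed, PRCC, can be sketched compactly: rank-transform parameters and output, regress out all other parameters, and correlate the residuals. The Python sketch below (DyGloSA itself is MATLAB) uses an invented two-parameter model where only the first parameter drives the output; DyGloSA computes such indices dynamically along the solution trajectory rather than at a single snapshot.

```python
import numpy as np

def prcc(X, y):
    """Partial rank correlation of each column of X with output y."""
    rank = lambda a: np.argsort(np.argsort(a, axis=0), axis=0).astype(float)
    Xr, yr = rank(X), rank(y.reshape(-1, 1)).ravel()
    n, k = Xr.shape
    out = []
    for i in range(k):
        others = np.column_stack([np.ones(n), np.delete(Xr, i, axis=1)])
        # Residuals after regressing out all other (ranked) parameters.
        rx = Xr[:, i] - others @ np.linalg.lstsq(others, Xr[:, i], rcond=None)[0]
        ry = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
        out.append(float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry))))
    return out

rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 2))
y = 3.0 * X[:, 0] + 0.1 * rng.normal(size=500)  # only parameter 0 matters
s0, s1 = prcc(X, y)
```

The influential parameter scores near 1 while the irrelevant one scores near 0; a switch in which parameter dominates over time is the early-warning signature the abstract describes.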

  12. Switch of sensitivity dynamics revealed with DyGloSA toolbox for dynamical global sensitivity analysis as an early warning for system's critical transition.

    PubMed

    Baumuratova, Tatiana; Dobre, Simona; Bastogne, Thierry; Sauter, Thomas

    2013-01-01

    Systems with bifurcations may experience abrupt irreversible and often unwanted shifts in their performance, called critical transitions. For many systems like climate, economy, ecosystems it is highly desirable to identify indicators serving as early warnings of such regime shifts. Several statistical measures were recently proposed as early warnings of critical transitions including increased variance, autocorrelation and skewness of experimental or model-generated data. The lack of an automated tool for model-based prediction of critical transitions led to the design of DyGloSA - a MATLAB toolbox for dynamical global parameter sensitivity analysis (GPSA) of ordinary differential equation models. We suggest that the switch in dynamics of parameter sensitivities revealed by our toolbox is an early warning that a system is approaching a critical transition. We illustrate the efficiency of our toolbox by analyzing several models with bifurcations and predicting the time periods when systems can still avoid going to a critical transition by manipulating certain parameter values, which is not detectable with the existing SA techniques. DyGloSA is based on the SBToolbox2 and contains functions that dynamically compute the global sensitivity indices of the system by applying four main GPSA methods: eFAST, Sobol's ANOVA, PRCC and WALS. It includes parallelized versions of the functions, enabling a significant reduction of the computational time (up to 12 times). DyGloSA is freely available as a set of MATLAB scripts at http://bio.uni.lu/systems_biology/software/dyglosa. It requires installation of MATLAB (versions R2008b or later) and the Systems Biology Toolbox2 available at www.sbtoolbox2.org. DyGloSA can be run on 32- and 64-bit Windows and Linux systems.

  13. FATSLiM: a fast and robust software to analyze MD simulations of membranes.

    PubMed

    Buchoux, Sébastien

    2017-01-01

    When studying biological membranes, Molecular Dynamics (MD) simulations prove to be quite complementary to experimental techniques. Because the simulated systems keep increasing in both size and complexity, the analysis of MD trajectories needs to be computationally efficient while being robust enough to handle membranes that may be curved or deformed due to their size and/or protein-lipid interactions. This work presents a new software package named FATSLiM ('Fast Analysis Toolbox for Simulations of Lipid Membranes') that can extract physical properties from MD simulations of membranes (with or without interacting proteins). Because it relies on the calculation of local normals, FATSLiM does not depend on the bilayer morphology and can thus handle, for instance, vesicles with the same accuracy. Thanks to efficiency-driven development, it is also fast and consumes a rather low amount of memory. FATSLiM (http://fatslim.github.io) is a stand-alone software package written in Python. Source code is released under the GNU GPLv3 and is freely available at https://github.com/FATSLiM/fatslim. A complete online documentation, including instructions for platform-independent installation, is available at http://pythonhosted.org/fatslim. Contact: sebastien.buchoux@u-picardie.fr. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
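
    The local-normal idea the abstract leans on can be sketched with a toy principal-component computation; the point cloud and neighborhood below are invented, and this is not FATSLiM's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
# Fake planar patch of lipid head-group positions: spread in x-y, z ~ noise
pts = np.column_stack([rng.uniform(0, 10, 100),
                       rng.uniform(0, 10, 100),
                       rng.normal(0, 0.1, 100)])

# The patch normal is the smallest-variance principal axis of the neighbors
centered = pts - pts.mean(axis=0)
cov = centered.T @ centered / len(pts)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
normal = eigvecs[:, 0]                      # smallest-variance direction

print(abs(normal[2]) > 0.99)                # aligned with z for a flat patch
```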

  14. Propulsion System Simulation Using the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This paper describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this paper is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture. A model comparison was conducted by matching steady-state performance results from a T-MATS developed gas turbine simulation to a well-documented steady-state simulation. Transient modeling capabilities are then demonstrated when the steady-state T-MATS model is updated to run dynamically.
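
    The role of the numerical iterative solver can be illustrated with a toy power balance; the component "maps" below are invented, and this is only a sketch of the Newton-type iteration such frameworks wrap around component models:

```python
# Toy shaft power balance: find the speed at which turbine power
# delivered equals compressor power absorbed (both maps invented).
def residual(shaft_speed):
    turbine_power = 50.0 * shaft_speed**0.5   # toy turbine map
    compressor_power = 2.0 * shaft_speed      # toy compressor load
    return turbine_power - compressor_power

def newton(f, x, tol=1e-10, h=1e-6):
    # Newton's method with a finite-difference slope
    for _ in range(50):
        fx = f(x)
        if abs(fx) < tol:
            break
        dfdx = (f(x + h) - fx) / h
        x -= fx / dfdx
    return x

speed = newton(residual, x=400.0)
print(round(speed, 3))              # 625.0, since 50*sqrt(625) == 2*625
```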

  16. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.

    PubMed

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two-dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions, and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include:
    • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms.
    • The modular approach, along with the lookup tables implemented, helps avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform.
    • The concept also helps prevent unnecessary computation of already-known transforms, thereby saving memory and processing time.
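
    For a radially symmetric function, composing the 2D polar Fourier transform from simpler transforms reduces to a zeroth-order Hankel transform; a small sketch using SymPy's hankel_transform (not the authors' toolbox), with an invented test function:

```python
from sympy import symbols, exp, pi, hankel_transform

# For radially symmetric f(r), the 2D Fourier transform in polar
# coordinates is F(k) = 2*pi * H0{f}(k), with H0 the order-0 Hankel transform.
r, k = symbols('r k', positive=True)
a = symbols('a', positive=True)

f = exp(-a*r)                       # radially symmetric test function
F = 2*pi*hankel_transform(f, r, k, 0)
print(F)                            # equals 2*pi*a/(a**2 + k**2)**(3/2)
```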

  17. EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.

    PubMed

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.

  18. III. NIH Toolbox Cognition Battery (CB): measuring episodic memory.

    PubMed

    Bauer, Patricia J; Dikmen, Sureyya S; Heaton, Robert K; Mungas, Dan; Slotkin, Jerry; Beaumont, Jennifer L

    2013-08-01

    One of the most significant domains of cognition is episodic memory, which allows for rapid acquisition and long-term storage of new information. For purposes of the NIH Toolbox, we devised a new test of episodic memory. The nonverbal NIH Toolbox Picture Sequence Memory Test (TPSMT) requires participants to reproduce the order of an arbitrarily ordered sequence of pictures presented on a computer. To adjust for ability, sequence length varies from 6 to 15 pictures. Multiple trials are administered to increase reliability. Pediatric data from the validation study revealed the TPSMT to be sensitive to age-related changes. The task also has high test-retest reliability and promising construct validity. Steps to further increase the sensitivity of the instrument to individual and age-related variability are described. © 2013 The Society for Research in Child Development, Inc.

  19. VQone MATLAB toolbox: A graphical experiment builder for image and video quality evaluations.

    PubMed

    Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka

    2016-03-01

    This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).

  20. Topologically preserving straightening of spinal cord MRI.

    PubMed

    De Leener, Benjamin; Mangeat, Gabriel; Dupont, Sara; Martin, Allan R; Callot, Virginie; Stikov, Nikola; Fehlings, Michael G; Cohen-Adad, Julien

    2017-10-01

    To propose a robust and accurate method for straightening magnetic resonance (MR) images of the spinal cord, based on spinal cord segmentation, that preserves spinal cord topology and that works for any MRI contrast, in the context of spinal cord template-based analysis. The spinal cord curvature was computed using an iterative Non-Uniform Rational B-Spline (NURBS) approximation. Forward and inverse deformation fields for straightening were computed by analytically solving the straightening equations for each image voxel. Computational speed-up was accomplished by solving all voxel equation systems as one single system. Straightening accuracy (mean and maximum distance from a straight line), computational time, and robustness to spinal cord length were evaluated for the proposed and the standard straightening method (label-based spline deformation) on 3T T2- and T1-weighted images from 57 healthy subjects and 33 patients with spinal cord compression due to degenerative cervical myelopathy (DCM). The proposed algorithm was more accurate, more robust, and faster than the standard method (mean distance = 0.80 vs. 0.83 mm, maximum distance = 1.49 vs. 1.78 mm, time = 71 vs. 174 sec for the healthy population, and mean distance = 0.65 vs. 0.68 mm, maximum distance = 1.28 vs. 1.55 mm, time = 32 vs. 60 sec for the DCM population). A novel image straightening method that enables template-based analysis of quantitative spinal cord MRI data is introduced. This algorithm works for any MRI contrast and was validated on healthy and patient populations. The presented method is implemented in the Spinal Cord Toolbox, an open-source software package for processing spinal cord MRI data. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2017;46:1209-1219. © 2017 International Society for Magnetic Resonance in Medicine.
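
    The straightness metrics reported above (mean and maximum distance from a straight line) are easy to sketch; the polynomial fit below is a crude stand-in for the paper's NURBS centerline, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
z = np.linspace(0.0, 100.0, 200)                         # axial position (mm)
x = 5.0 * np.sin(z / 30.0) + rng.normal(0, 0.2, z.size)  # curved cord + noise

coeffs = np.polyfit(z, x, deg=5)     # smooth centerline fit (NURBS stand-in)
x_fit = np.polyval(coeffs, z)

# Distance of each centerline point from the endpoint-to-endpoint line
line = x_fit[0] + (x_fit[-1] - x_fit[0]) * (z - z[0]) / (z[-1] - z[0])
dist = np.abs(x_fit - line)
print(round(dist.mean(), 2), round(dist.max(), 2))  # mean and max distance
```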

  1. Ideas Exchange: How Do You Use NASPE's Teacher Toolbox to Enhance Professional Activities with Students, Sport or Physical Education Lessons, Faculty Wellness Classes or Community Programs?

    ERIC Educational Resources Information Center

    Simpkins, Mary Ann; McNeill, Shane; Dieckman, Dale; Sissom, Mark; LoBianco, Judy; Lund, Jackie; Barney, David C.; Manson, Mara; Silva, Betsy

    2009-01-01

    NASPE's Teacher Toolbox is an instructional resource site which provides educators with a wide variety of teaching tools that focus on physical activity. This service is provided by NASPE to support instructional activities as well as promote quality programs. New monthly issues support NASPE's mission to enhance knowledge, improve professional…

  2. Motion Simulation in the Environment for Auditory Research

    DTIC Science & Technology

    2011-08-01

    Cites the Spatial Audio Matlab Toolbox, Centre for Digital Music, Queen Mary University of London, 2009: http://www.isophonics.net/content/spatial-audio-matlab-toolbox (accessed July 27...). From the student bio: the author studied Music Technology at Northwestern University, graduating as valedictorian of the School of Music in 2008, and in 2009 was awarded the Gates Cambridge Scholarship to fund a postgraduate degree at the University of Cambridge.

  3. DETECT: Detection of Events in Continuous Time Toolbox: User’s Guide, Examples, and Function Reference Documentation

    DTIC Science & Technology

    2013-06-01

    ...benefitting from rapid, automated discrimination of specific predefined signals, and is free-standing (requiring no other plugins or packages). ...previously labeled dataset, and comparing two labeled datasets. Subject terms: artifact, signal detection, EEG, MATLAB, toolbox.

  4. Development of a Dependency Theory Toolbox for Database Design.

    DTIC Science & Technology

    1987-12-01

    Much of the knowledge required to design and study relational databases exists in the form of published algorithms and theorems. However, hand simulating these algorithms can be a tedious and error-prone chore. Therefore, a toolbox of algorithms and…

  5. PLDAPS: A Hardware Architecture and Software Toolbox for Neurophysiology Requiring Complex Visual Stimuli and Online Behavioral Control.

    PubMed

    Eastman, Kyler M; Huk, Alexander C

    2012-01-01

    Neurophysiological studies in awake, behaving primates (both human and non-human) have focused with increasing scrutiny on the temporal relationship between neural signals and behaviors. Consequently, laboratories are often faced with the problem of developing experimental equipment that can support data recording with high temporal precision and also be flexible enough to accommodate a wide variety of experimental paradigms. To this end, we have developed a MATLAB toolbox that integrates several modern pieces of equipment, but still grants experimenters the flexibility of a high-level programming language. Our toolbox takes advantage of three popular and powerful technologies: the Plexon apparatus for neurophysiological recordings (Plexon, Inc., Dallas, TX, USA), a Datapixx peripheral (Vpixx Technologies, Saint-Bruno, QC, Canada) for control of analog, digital, and video input-output signals, and the Psychtoolbox MATLAB toolbox for stimulus generation (Brainard, 1997; Pelli, 1997; Kleiner et al., 2007). The PLDAPS ("Platypus") system is designed to support the study of the visual systems of awake, behaving primates during multi-electrode neurophysiological recordings, but can be easily applied to other related domains. Despite its wide range of capabilities and support for cutting-edge video displays and neural recording systems, the PLDAPS system is simple enough for someone with basic MATLAB programming skills to design their own experiments.

  6. Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools

    NASA Astrophysics Data System (ADS)

    Tonini, F.; Liu, J.

    2016-12-01

    Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.

  7. Optics Program Simplifies Analysis and Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Engineers at Goddard Space Flight Center partnered with software experts at Mide Technology Corporation, of Medford, Massachusetts, through a Small Business Innovation Research (SBIR) contract to design the Disturbance-Optics-Controls-Structures (DOCS) Toolbox, a software suite for performing integrated modeling for multidisciplinary analysis and design. The DOCS Toolbox integrates various discipline models into a coupled process math model that can then predict system performance as a function of subsystem design parameters. The system can be optimized for performance; design parameters can be traded; parameter uncertainties can be propagated through the math model to develop error bounds on system predictions; and the model can be updated based on component, subsystem, or system level data. The Toolbox also allows the definition of process parameters as explicit functions of the coupled model and includes a number of functions that analyze the coupled system model and provide for redesign. The product is being sold commercially by Nightsky Systems Inc., of Raleigh, North Carolina, a spinoff company that was formed by Mide specifically to market the DOCS Toolbox. Commercial applications include use by any contractors developing large space-based optical systems, including Lockheed Martin Corporation, The Boeing Company, and Northrop Grumman Corporation, as well as companies providing technical audit services, like General Dynamics Corporation.

  8. Wavefront Control Toolbox for James Webb Space Telescope Testbed

    NASA Technical Reports Server (NTRS)

    Shiri, Ron; Aronstein, David L.; Smith, Jeffery Scott; Dean, Bruce H.; Sabatke, Erin

    2007-01-01

    We have developed a Matlab toolbox for wavefront control of optical systems. We have applied this toolbox to the optical models of the James Webb Space Telescope (JWST) in general and to the JWST Testbed Telescope (TBT) in particular, implementing both unconstrained and constrained wavefront optimization to correct for possible misalignments present on the segmented primary mirror or the monolithic secondary mirror. The optical models are implemented in the Zemax optical design program, and information is exchanged between Matlab and Zemax via the Dynamic Data Exchange (DDE) interface. The model configuration is managed using the XML protocol. The optimization algorithm uses influence functions for each adjustable degree of freedom of the optical model. Iterative and non-iterative algorithms have been developed that converge to a local minimum of the root-mean-square (rms) wavefront error using a singular value decomposition of the control matrix of influence functions. The toolkit is highly modular and allows the user to choose control strategies for the degrees of freedom to be adjusted on a given iteration and a wavefront convergence criterion. As the influence functions are nonlinear over the control parameter space, the toolkit also allows for trade-offs between the frequency of updating the local influence functions and execution speed. The functionality of the toolbox and the validity of the underlying algorithms have been verified through extensive simulations.
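
    One step of SVD-based wavefront control can be sketched in a few lines; the influence matrix and wavefront below are random placeholders, and this is not the NASA toolbox's implementation:

```python
import numpy as np

# A[i, j] is the influence of actuator j on wavefront sample i;
# w is the measured wavefront error to correct.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 6))        # influence-function (control) matrix
w = rng.normal(size=50)             # wavefront error

# Truncated pseudo-inverse: discard singular values below a threshold
# so weakly controllable modes are not amplified.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
keep = s > 1e-6 * s[0]
cmd = -(Vt[keep].T * (1.0 / s[keep])) @ (U[:, keep].T @ w)

residual = w + A @ cmd              # wavefront after applying the commands
print(np.linalg.norm(residual) < np.linalg.norm(w))  # rms error decreased
```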

  9. Structure Computation of Quiet Spike[Trademark] Flight-Test Data During Envelope Expansion

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2008-01-01

    System identification or mathematical modeling is used in the aerospace community for development of simulation models for robust control law design. These models are often described as linear time-invariant processes. Nevertheless, it is well known that the underlying process is often nonlinear. The reason for using a linear approach has been due to the lack of a proper set of tools for the identification of nonlinear systems. Over the past several decades, the controls and biomedical communities have made great advances in developing tools for the identification of nonlinear systems. These approaches are robust and readily applicable to aerospace systems. In this paper, we show the application of one such nonlinear system identification technique, structure detection, for the analysis of F-15B Quiet Spike(TradeMark) aeroservoelastic flight-test data. Structure detection is concerned with the selection of a subset of candidate terms that best describe the observed output. This is a necessary procedure to compute an efficient system description that may afford greater insight into the functionality of the system or a simpler controller design. Structure computation as a tool for black-box modeling may be of critical importance for the development of robust parsimonious models for the flight-test community. Moreover, this approach may lead to efficient strategies for rapid envelope expansion, which may save significant development time and costs. The objectives of this study are to demonstrate via analysis of F-15B Quiet Spike aeroservoelastic flight-test data for several flight conditions that 1) linear models are inefficient for modeling aeroservoelastic data, 2) nonlinear identification provides a parsimonious model description while providing a high percent fit for cross-validated data, and 3) the model structure and parameters vary as the flight condition is altered.

  10. Estimating serial correlation and self-similarity in financial time series-A diversification approach with applications to high frequency data

    NASA Astrophysics Data System (ADS)

    Gerlich, Nikolas; Rostek, Stefan

    2015-09-01

    We derive a heuristic method to estimate the degree of self-similarity and serial correlation in financial time series. In particular, we advocate the use of a tailor-made selection of different estimation techniques that are used in various fields of time series analysis but have until now not consistently found their way into the finance literature. Following the idea of portfolio diversification, we show that considerable improvements with respect to robustness and unbiasedness can be achieved by using a basket of estimation methods. With this methodological toolbox at hand, we investigate real market data to show that noticeable deviations from the assumptions of constant self-similarity and absence of serial correlation occur during certain periods. On the one hand, this may shed new light on seemingly ambiguous scientific findings concerning serial correlation of financial time series. On the other hand, a proven time-changing degree of self-similarity may help to explain high-volatility clusters of stock price indices.
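
    In the same diversification spirit, here is a toy Python sketch that averages two simple aggregated-moment estimators of the self-similarity (Hurst) exponent; the estimators and data are illustrative, not the paper's basket:

```python
import numpy as np

def agg_moment_slope(x, sizes, power):
    # log-log slope of the aggregated absolute central moment vs block size
    pts = []
    for m in sizes:
        blocks = x[: len(x) // m * m].reshape(-1, m).mean(axis=1)
        pts.append((np.log(m),
                    np.log(np.mean(np.abs(blocks - x.mean()) ** power))))
    lx, ly = np.array(pts).T
    return np.polyfit(lx, ly, 1)[0]

rng = np.random.default_rng(1)
x = rng.normal(size=8192)            # iid noise: true H = 0.5
sizes = [2, 4, 8, 16, 32, 64, 128]

h_var = 1.0 + agg_moment_slope(x, sizes, 2) / 2.0   # aggregated variance
h_abs = 1.0 + agg_moment_slope(x, sizes, 1)         # aggregated absolute moment
h = 0.5 * (h_var + h_abs)                           # "portfolio" average
print(round(h, 2))                   # close to 0.5 for white noise
```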

  11. Why Early Prevention of Childhood Obesity Is More Than a Medical Concern: A Health Economic Approach.

    PubMed

    Sonntag, Diana

    2017-01-01

    Childhood overweight and obesity are an undeniable health concern with increasing economic attention. International studies provide robust evidence about substantial lifetime excess costs due to childhood obesity, thereby underscoring the urgent need to implement potent obesity prevention programs in early childhood. Fortunately, this is happening more and more, as evidenced by the increase in well-conducted interventions. Nevertheless, an important piece of the puzzle is often missing, that is, health economic evaluations. There are 3 main reasons for this: an insufficient number of economic approaches which consider the complexity of childhood obesity, a lack of (significant) long-term effect sizes of an intervention, and inadequate planning of health economic evaluations in the design phase of an intervention. Key Messages: It is advisable to involve health economists during the design phase of an intervention. Equally necessary is the development of a tailored toolbox for efficient data acquisition. © 2017 S. Karger AG, Basel.

  12. Issues around radiological protection of the environment and its integration with protection of humans: promoting debate on the way forward.

    PubMed

    Brownless, G P

    2007-12-01

    This paper explores issues to consider around integrating direct, explicit protection of the environment into the current system of radiological protection, which is focused on the protection of humans. Many issues around environmental radiological protection have been discussed, and ready-to-use toolboxes have been constructed for assessing harm to non-human biota, but it is not clear how (or even if) these should be fitted into the current system of protection. Starting from the position that the current approach to protecting the environment (namely that it follows from adequately protecting humans) is generally effective, this paper considers how explicit radiological protection of the environment can be integrated with the current system, through developing a 'worked example' of how this could be done and highlighting issues peculiar to protection of the environment. The aim of the paper is to promote debate on this topic, with the ultimate aim of ensuring that any changes to the system are consensual and robust.

  13. Pupillary Stroop effects

    PubMed Central

    Ørbo, Marte; Holmlund, Terje; Miozzo, Michele

    2010-01-01

    We recorded the pupil diameters of participants performing the color-naming Stroop task (i.e., naming the color in which a word that itself names a color is displayed). Non-color words were used as a baseline to firmly establish the effects of semantic relatedness induced by color-word distractors. We replicated the classic Stroop effects of color congruency and color incongruency with pupillary diameter recordings: relative to non-color words, pupil diameters increased for color distractors that differed from color responses, whereas they decreased for color distractors that were identical to color responses. Analyses of the time courses of pupil responses revealed further differences between color-congruent and color-incongruent distractors, with the latter inducing a steep increase in pupil size and the former a relatively smaller increase. Consistent with previous findings demonstrating that pupil size increases as task demands rise, the present results indicate that pupillometry is a robust measure of Stroop interference and a valuable addition to the cognitive scientist's toolbox. PMID:20865297

  14. A functional metagenomic approach for expanding the synthetic biology toolbox for biomass conversion

    PubMed Central

    Sommer, Morten OA; Church, George M; Dantas, Gautam

    2010-01-01

    Sustainable biofuel alternatives to fossil fuel energy are hampered by recalcitrance and toxicity of biomass substrates to microbial biocatalysts. To address this issue, we present a culture-independent functional metagenomic platform for mining Nature's vast enzymatic reservoir and show its relevance to biomass conversion. We performed functional selections on 4.7 Gb of metagenomic fosmid libraries and show that genetic elements conferring tolerance toward seven important biomass inhibitors can be identified. We select two metagenomic fosmids that improve the growth of Escherichia coli by 5.7- and 6.9-fold in the presence of inhibitory concentrations of syringaldehyde and 2-furoic acid, respectively, and identify the individual genes responsible for these tolerance phenotypes. Finally, we combine the individual genes to create a three-gene construct that confers tolerance to mixtures of these important biomass inhibitors. This platform presents a route for expanding the repertoire of genetic elements available to synthetic biology and provides a starting point for efforts to engineer robust strains for biofuel generation. PMID:20393580

  15. GO2OGS 1.0: a versatile workflow to integrate complex geological information with fault data into numerical simulation models

    NASA Astrophysics Data System (ADS)

    Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.

    2015-11-01

    We offer a versatile workflow to convert geological models built with the Paradigm™ GOCAD© (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for usage in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a way of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.

  16. The AI Bus architecture for distributed knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain

    1991-01-01

    The AI Bus architecture is a layered, distributed, object-oriented framework developed to support the requirements of advanced technology programs for an order-of-magnitude improvement in software costs. The consequent need for highly autonomous computer systems, adaptable to new technology advances over a long lifespan, led to the design of an open architecture and toolbox for building large-scale, robust, production-quality systems. The AI Bus accommodates a mix of knowledge-based and conventional components, running in heterogeneous, distributed real-world and testbed environments. The concepts and design of the AI Bus architecture are described, along with its current implementation status as a Unix C++ library of reusable objects. Each high-level semiautonomous agent process consists of a number of knowledge sources together with interagent communication mechanisms based on shared blackboards and message-passing acquaintances. Standard interfaces and protocols are followed for combining and validating subsystems. Dynamic probes, or demons, provide an event-driven means of giving active objects shared access to resources and to each other without violating their security.

  17. A convolutional neural network approach to calibrating the rotation axis for X-ray computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaogang; De Carlo, Francesco; Phatak, Charudatta

    This paper presents an algorithm to calibrate the center of rotation for X-ray tomography by using a machine learning approach, the Convolutional Neural Network (CNN). The algorithm shows excellent accuracy in evaluations on synthetic data with various noise ratios. It is further validated with experimental data from four different shale samples measured at the Advanced Photon Source and at the Swiss Light Source. The results are as good as those determined by visual inspection and show better robustness than conventional methods. CNN also has great potential for reducing or removing other artifacts caused by instrument instability, detector non-linearity, etc. An open-source toolbox, which integrates the CNN methods described in this paper, is freely available through GitHub at tomography/xlearn and can be easily integrated into existing computational pipelines available at various synchrotron facilities. Source code, documentation, and information on how to contribute are also provided.

  18. ArcAtlas in the Classroom: Pattern Identification, Description, and Explanation

    ERIC Educational Resources Information Center

    DeMers, Michael N.; Vincent, Jeffrey S.

    2007-01-01

    The use of geographic information systems (GIS) in the classroom provides a robust and effective method of teaching the primary spatial skills of identification, description, and explanation of spatial pattern. A major handicap for the development of GIS-based learning experiences, especially for non-GIS specialist educators, is the availability…

  19. Integration of system identification and robust controller designs for flexible structures in space

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Lew, Jiann-Shiun

    1990-01-01

    An approach is developed for using experimental data to identify a reduced-order model and its model error for robust controller design. There are three steps in the approach. First, an approximately balanced model is identified using the Eigensystem Realization Algorithm. Second, the model error is calculated and described in the frequency domain in terms of the H(infinity) norm. Third, a pole placement technique in combination with an H(infinity) control method is applied to design a controller for the system under consideration. A set of experimental data from an existing setup, namely the Mini-Mast system, is used to illustrate and verify the approach.
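
    The identification step above can be sketched with the textbook form of the Eigensystem Realization Algorithm: stack the impulse-response (Markov) parameters into Hankel matrices, truncate an SVD, and read off a reduced-order realization. This is a generic SISO ERA sketch in numpy, not the authors' code.

```python
import numpy as np

# Textbook ERA (SISO): H0 and H1 are Hankel matrices of Markov parameters;
# a rank-r truncated SVD of H0 yields a balanced-like realization (A, B, C).

def era(h, r, m=10):
    """h: impulse response samples Y(1), Y(2), ...; r: model order; m: Hankel size."""
    H0 = np.array([[h[i + j] for j in range(m)] for i in range(m)])
    H1 = np.array([[h[i + j + 1] for j in range(m)] for i in range(m)])
    U, s, Vt = np.linalg.svd(H0)
    U, s, Vt = U[:, :r], s[:r], Vt[:r, :]
    S_isq = np.diag(1.0 / np.sqrt(s))
    S_sq = np.diag(np.sqrt(s))
    A = S_isq @ U.T @ H1 @ Vt.T @ S_isq   # shifted-Hankel relation gives A
    B = (S_sq @ Vt)[:, :1]                # first input column
    C = (U @ S_sq)[:1, :]                 # first output row
    return A, B, C

# Check on a known first-order system: Y(k) = C a^(k-1) B with a=0.9, B=C=1
a = 0.9
h = [a ** k for k in range(40)]   # h[0] = Y(1) = 1, h[1] = Y(2) = 0.9, ...
A, B, C = era(h, r=1)
```

    For the model-error step, the mismatch between this reduced model and the data would then be bounded in the frequency domain, as the abstract describes.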

  20. The 32nd CDC: System identification using interval dynamic models

    NASA Technical Reports Server (NTRS)

    Keel, L. H.; Lew, J. S.; Bhattacharyya, S. P.

    1992-01-01

    Motivated by the recent explosive development of results in the area of parametric robust control, a new technique to identify a family of uncertain systems is introduced. The technique takes frequency domain input and output data obtained from experimental test signals and produces an 'interval transfer function' that contains the complete frequency domain behavior with respect to the test signals. The interval transfer function is one of the key concepts in the parametric robust control approach, and identification with such an interval model allows one to predict worst-case performance and stability margins using recent results on interval systems. The algorithm is illustrated by applying it to an 18-bay Mini-Mast truss structure.

  1. Precision pointing and control of flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Bantell, M. H., Jr.

    1987-01-01

    The problem and long-term objectives for the precision pointing and control of flexible spacecraft are given. The four basic objectives are stated in terms of two principal tasks. Under Task 1, robust low-order controllers, improved structural modeling methods for control applications, and identification methods for structural dynamics are being developed. Under Task 2, a laboratory test experiment for verification of control laws and system identification algorithms is being developed. For Task 1, work has focused on robust low-order controller design and some initial considerations for structural modeling in control applications. For Task 2, work has focused on experiment design and fabrication, along with sensor selection and initial digital controller implementation. Conclusions are given.

  2. A Method for the Direct Identification of Differentiating Muscle Cells by a Fluorescent Mitochondrial Dye

    PubMed Central

    Miyake, Tetsuaki; McDermott, John C.; Gramolini, Anthony O.

    2011-01-01

    Identification of differentiating muscle cells generally requires fixation, antibodies directed against muscle specific proteins, and lengthy staining processes or, alternatively, transfection of muscle specific reporter genes driving GFP expression. In this study, we examined the possibility of using the robust mitochondrial network seen in maturing muscle cells as a marker of cellular differentiation. The mitochondrial fluorescent tracking dye, MitoTracker, which is a cell-permeable, low toxicity, fluorescent dye, allowed us to distinguish and track living differentiating muscle cells visually by epi-fluorescence microscopy. MitoTracker staining provides a robust and simple detection strategy for living differentiating cells in culture without the need for fixation or biochemical processing. PMID:22174849

  3. High performance data acquisition, identification, and monitoring for active magnetic bearings

    NASA Technical Reports Server (NTRS)

    Herzog, Raoul; Siegwart, Roland

    1994-01-01

    Future active magnetic bearing (AMB) systems must feature easier on-site tuning, higher stiffness and damping, better robustness with respect to undesirable vibrations in housing and foundation, and enhanced monitoring and identification abilities. To get closer to these goals, we developed a fast parallel link from the digitally controlled AMB to Matlab, which runs on a host computer for data processing, identification, and controller layout. This enables the magnetic bearing to measure its own frequency responses without any additional measurement equipment. These measurements can be used for AMB identification.
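
    Taking a frequency response from logged input/output data, as the AMB setup does in Matlab, commonly uses the H1 spectral estimator: divide the block-averaged cross-spectrum of output and input by the input auto-spectrum. The numpy sketch below illustrates that estimator under made-up test data; it is not the authors' implementation.

```python
import numpy as np

# H1 frequency-response estimator: H(f) = Syu(f) / Suu(f), averaged over
# non-overlapping FFT blocks of the excitation u and the response y.

def frf_h1(u, y, nfft=256):
    nblocks = len(u) // nfft
    Suu = np.zeros(nfft)
    Syu = np.zeros(nfft, dtype=complex)
    for b in range(nblocks):
        U = np.fft.fft(u[b * nfft:(b + 1) * nfft])
        Y = np.fft.fft(y[b * nfft:(b + 1) * nfft])
        Suu += (U * np.conj(U)).real      # input auto-spectrum
        Syu += Y * np.conj(U)             # output/input cross-spectrum
    return Syu / Suu

# Check against a known FIR system y[n] = u[n] + 0.5 u[n-1]
rng = np.random.default_rng(0)
u = rng.standard_normal(256 * 64)
y = u + 0.5 * np.roll(u, 1)               # one-sample delay (long record)
H = frf_h1(u, y)
freqs = np.arange(256) / 256.0
H_true = 1 + 0.5 * np.exp(-2j * np.pi * freqs)
```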

  4. Applications of a morphological scene change detection (MSCD) for visual leak and failure identification in process and chemical engineering

    NASA Astrophysics Data System (ADS)

    Tickle, Andrew J.; Harvey, Paul K.; Smith, Jeremy S.

    2010-10-01

    Morphological Scene Change Detection (MSCD) is a process typically tasked with detecting relevant changes in a guarded environment for security applications. It can be implemented on a Field Programmable Gate Array (FPGA) by a combination of binary differences based around exclusive-OR (XOR) gates, mathematical morphology, and a crucial threshold setting. The additional ability to set up the system in virtually any location, thanks to the FPGA, makes it ideal for insertion into an autonomous mobile robot for patrol duties. However, security is not the only potential use of this robust algorithm. This paper details how such a system can be used to detect leaks in piping in the process and chemical industries, deployed in the manner stated above. The test substance in this work was water, which was pumped either as a liquid or as low-pressure steam through a simple pipe configuration with holes at set points to simulate the leaks. These holes were situated randomly either at the center of a pipe (to simulate an impact to it) or at a joint or corner (to simulate a failed weld). Imagery of the resultant leaks, visualised as drips or accumulations of steam, was analysed using MATLAB to determine pixel volume and thereby calibrate the trigger for the MSCD. The triggering mechanism is adaptive, so that in principle the type of leak can be determined from the number of pixels above threshold in the image, with a numerical output signal stating which leak situation is being observed. The system was designed using the DSP Builder package from Altera so that its graphical nature is easily comprehensible to the non-embedded-system designer. Furthermore, all the data from the DSP Builder simulation were verified against MATLAB comparisons using the image processing toolbox in order to validate the results.
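
    The MSCD core described above (binarise, XOR against a reference frame, clean with a morphological opening, then count changed pixels against a trigger threshold) can be sketched as follows. This is an illustrative numpy rendering, not the FPGA/DSP Builder design; the 2x2 structuring element and thresholds are assumptions.

```python
import numpy as np

# MSCD sketch: XOR of binarised frames, 2x2 morphological opening to reject
# isolated noise pixels, pixel-count trigger. Thresholds are illustrative.

def open2x2(mask):
    """Morphological opening with a 2x2 structuring element (numpy only)."""
    er = mask[:-1, :-1] & mask[1:, :-1] & mask[:-1, 1:] & mask[1:, 1:]
    dil = np.zeros_like(mask)
    dil[:-1, :-1] |= er
    dil[1:, :-1] |= er
    dil[:-1, 1:] |= er
    dil[1:, 1:] |= er
    return dil

def mscd(reference, current, intensity_thresh=0.5, pixel_trigger=20):
    """Return (changed-pixel count, trigger flag) for one frame pair."""
    diff = (reference > intensity_thresh) ^ (current > intensity_thresh)
    cleaned = open2x2(diff)
    count = int(cleaned.sum())
    return count, count >= pixel_trigger

ref = np.zeros((64, 64))
cur = ref.copy()
cur[20:26, 30:36] = 1.0        # simulated 6x6 leak blob
cur[5, 5] = 1.0                # single-pixel noise, removed by the opening
count, alarm = mscd(ref, cur)
```

    In the paper's adaptive scheme, the count itself (not just the flag) would be compared against calibrated bands to distinguish drip-type from steam-type leaks.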

  5. Design and implementation of robust controllers for a gait trainer.

    PubMed

    Wang, F C; Yu, C H; Chou, T Y

    2009-08-01

    This paper applies robust algorithms to control an active gait trainer for children with walking disabilities. Compared with traditional rehabilitation procedures, in which two or three trainers are required to assist the patient, a motor-driven mechanism was constructed to improve the efficiency of the procedures. First, a six-bar mechanism was designed and constructed to mimic the trajectory of children's ankles in walking. Second, system identification techniques were applied to obtain system transfer functions at different operating points from experiments. Third, robust control algorithms were used to design H-infinity robust controllers for the system. Finally, the designed controllers were implemented to experimentally verify system performance. From the results, the proposed robust control strategies are shown to be effective.

  6. The Visible Signature Modelling and Evaluation ToolBox

    DTIC Science & Technology

    2008-12-01

    DSTO–TR–2212 ABSTRACT: A new software suite, the Visible Signature ToolBox (VST), has been developed to model and evaluate the...visible signatures of maritime platforms. The VST is a collection of commercial, off-the-shelf software and DSTO-developed programs and procedures. The...suite. The VST can be utilised to model and assess visible signatures of maritime platforms. A number of examples are presented to demonstrate the

  7. The Benefits of Comparing Grapefruits and Tangerines: A Toolbox for European Cross-Cultural Comparisons in Engineering Education--Using This Toolbox to Study Gendered Images of Engineering among Students

    ERIC Educational Resources Information Center

    Godfroy-Genin, Anne-Sophie; Pinault, Cloe

    2006-01-01

    The main objective of the WomEng European research project was to assess when, how and why women decide to or not to study engineering. This question was addressed through an international cross-comparison by an interdisciplinary research team in seven European countries. This article presents, in the first part, the methodological toolbox…

  8. System-Events Toolbox--Activating Urban Places for Social Cohesion through Designing a System of Events That Relies on Local Resources

    ERIC Educational Resources Information Center

    Fassi, Davide; Motter, Roberta

    2014-01-01

    This paper is a reflection on the use of public spaces in towns and the development of a system-events toolbox to activate them towards social cohesion. It is the result of a 1 year action research developed together with POLIMI DESIS Lab of the Department of Design to develop design solutions to open up the public spaces of the campus to the…

  9. Evaluating 3D-printed biomaterials as scaffolds for vascularized bone tissue engineering.

    PubMed

    Wang, Martha O; Vorwald, Charlotte E; Dreher, Maureen L; Mott, Eric J; Cheng, Ming-Huei; Cinar, Ali; Mehdizadeh, Hamidreza; Somo, Sami; Dean, David; Brey, Eric M; Fisher, John P

    2015-01-07

    There is an unmet need for a consistent set of tools for the evaluation of 3D-printed constructs. A toolbox developed to design, characterize, and evaluate 3D-printed poly(propylene fumarate) scaffolds is proposed for vascularized engineered tissues. This toolbox combines modular design and non-destructive fabricated design evaluation, evaluates biocompatibility and mechanical properties, and models angiogenesis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Social Network Mapping: A New Tool For The Leadership Toolbox

    DTIC Science & Technology

    2002-04-01

    SOCIAL NETWORK MAPPING: A NEW TOOL FOR THE LEADERSHIP TOOLBOX. By Elisabeth J. Strines, Colonel, USAF...describes the concept of social network mapping and demonstrates how it can be used by squadron commanders and leaders at all levels to provide subtle

  11. Synthetic Biology Toolbox for Controlling Gene Expression in the Cyanobacterium Synechococcus sp. strain PCC 7002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markley, Andrew L.; Begemann, Matthew B.; Clarke, Ryan E.

    The application of synthetic biology requires characterized tools to precisely control gene expression. This toolbox of genetic parts previously did not exist for the industrially promising cyanobacterium, Synechococcus sp. strain PCC 7002. To address this gap, two orthogonal constitutive promoter libraries, one based on a cyanobacterial promoter and the other ported from Escherichia coli, were built and tested in PCC 7002. The libraries demonstrated 3 and 2.5 log dynamic ranges, respectively, but correlated poorly with E. coli expression levels. These promoter libraries were then combined to create and optimize a series of IPTG inducible cassettes. The resultant induction system had a 48-fold dynamic range and was shown to out-perform P trc constructs. Finally, an RBS library was designed and tested in PCC 7002. The presented synthetic biology toolbox will enable accelerated engineering of PCC 7002.

  12. Synthetic Biology Toolbox for Controlling Gene Expression in the Cyanobacterium Synechococcus sp. strain PCC 7002

    DOE PAGES

    Markley, Andrew L.; Begemann, Matthew B.; Clarke, Ryan E.; ...

    2014-09-12

    The application of synthetic biology requires characterized tools to precisely control gene expression. This toolbox of genetic parts previously did not exist for the industrially promising cyanobacterium, Synechococcus sp. strain PCC 7002. To address this gap, two orthogonal constitutive promoter libraries, one based on a cyanobacterial promoter and the other ported from Escherichia coli, were built and tested in PCC 7002. The libraries demonstrated 3 and 2.5 log dynamic ranges, respectively, but correlated poorly with E. coli expression levels. These promoter libraries were then combined to create and optimize a series of IPTG inducible cassettes. The resultant induction system had a 48-fold dynamic range and was shown to out-perform P trc constructs. Finally, an RBS library was designed and tested in PCC 7002. The presented synthetic biology toolbox will enable accelerated engineering of PCC 7002.

  13. GUIDANCE DOCUMENT ON IMPLEMENTATION OF THE ...

    EPA Pesticide Factsheets

    The Agreement in Principle for the Stage 2 M-DBP Federal Advisory Committee contains a list of treatment processes and management practices for water systems to use in meeting additional Cryptosporidium treatment requirements under the LT2ESWTR. This list, termed the microbial toolbox, includes watershed control programs, alternative intake locations, pretreatment processes, additional filtration barriers, inactivation technologies, and enhanced plant performance. The intent of the microbial toolbox is to provide water systems with broad flexibility in selecting cost-effective LT2ESWTR compliance strategies. Moreover, the toolbox allows systems that currently provide additional pathogen barriers, or that can demonstrate enhanced performance, to receive additional Cryptosporidium treatment credit. The document provides guidance to utilities with surface water supplies, and to state drinking water programs, on the use of different treatment technologies to reduce the level of Cryptosporidium in drinking water. Technologies included in the guidance manual may be used to achieve compliance with the requirements of the LT2ESWTR.

  14. ΔΔPT: a comprehensive toolbox for the analysis of protein motion

    PubMed Central

    2013-01-01

    Background Normal Mode Analysis is one of the most successful techniques for studying motions in proteins and macromolecules. It can provide information on the mechanisms of protein function, aid crystallographic and NMR data reconstruction, and be used to calculate protein free energies. Results ΔΔPT is a toolbox for calculating elastic network models and performing principal component analysis. It allows the analysis of pdb files or of trajectories taken from Gromacs, Amber, and DL_POLY. Besides calculating the normal modes, it also allows comparison of the modes with experimental protein motion, variation of modes with mutation or ligand binding, and calculation of molecular dynamics entropies. Conclusions This toolbox makes the respective tools available to a wide community of potential NMA users, and allows them unrivalled ability to analyse normal modes using a variety of techniques and current software. PMID:23758746

  15. EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing

    PubMed Central

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590

  16. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates

    PubMed Central

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include: • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms. • The modular approach along with the idea of lookup tables implemented help avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform. • The concept also helps prevent unnecessary computation of already known transforms thereby saving memory and processing time. PMID:26150988
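
    The "combination of two significantly simpler transforms" can be illustrated numerically: an angular Fourier-series expansion reduces the 2D polar Fourier transform to a Hankel transform per angular order, and for a circularly symmetric function only the zeroth-order Hankel transform survives. The numpy sketch below checks that reduction against the closed-form transform of a Gaussian; it is a numeric illustration, not the symbolic toolbox itself.

```python
import numpy as np

# For circularly symmetric f(r), the 2D Fourier transform reduces to the
# zeroth-order Hankel transform  F(rho) = 2*pi * Int f(r) J0(2*pi*r*rho) r dr.
# With f(r) = exp(-pi r^2) the closed form is F(rho) = exp(-pi rho^2).

def trap(y, x):
    """Trapezoidal rule on a uniform grid (over the last axis)."""
    dx = x[1] - x[0]
    return dx * (y.sum(axis=-1) - 0.5 * (y[..., 0] + y[..., -1]))

def j0(x):
    """Bessel J0 via its integral form J0(x) = (1/pi) Int_0^pi cos(x sin t) dt."""
    t = np.linspace(0.0, np.pi, 2001)
    return trap(np.cos(np.outer(x, np.sin(t))), t) / np.pi

def hankel0(f, rho, r_max=6.0, n=4000):
    """Zeroth-order Hankel transform of f, evaluated at a single rho."""
    r = np.linspace(0.0, r_max, n)
    return 2.0 * np.pi * trap(f(r) * j0(2.0 * np.pi * r * rho) * r, r)

f = lambda r: np.exp(-np.pi * r ** 2)
val = hankel0(f, rho=0.5)
expected = np.exp(-np.pi * 0.25)   # closed-form 2D Fourier transform at rho=0.5
```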

  17. The laboratory test utilization management toolbox

    PubMed Central

    Baird, Geoffrey

    2014-01-01

    Efficiently managing laboratory test utilization requires both ensuring adequate utilization of needed tests in some patients and discouraging superfluous tests in other patients. After the difficult clinical decision is made to define which patients do and do not need a test, a wealth of interventions is available to the clinician and laboratorian to help guide appropriate utilization. These interventions are collectively referred to here as the utilization management toolbox. Experience has shown that some tools in the toolbox are weak and others are strong, and that tools are most effective when many are used simultaneously. While the outcomes of utilization management studies are not always as concrete as may be desired, the data available in the literature indicate that strong utilization management interventions are safe and effective measures to improve patient health and reduce waste in an era of increasing financial pressure. PMID:24969916

  18. Challenges and solutions for the analysis of in situ , in crystallo micro-spectrophotometric data

    DOE PAGES

    Dworkowski, Florian S. N.; Hough, Michael A.; Pompidor, Guillaume; ...

    2015-01-01

    Combining macromolecular crystallography with in crystallo micro-spectrophotometry yields valuable complementary information on the sample, including the redox states of metal cofactors, the identification of bound ligands and the onset and strength of undesired photochemistry, also known as radiation damage. However, the analysis and processing of the resulting data differs significantly from the approaches used for solution spectrophotometric data. The varying size and shape of the sample, together with the suboptimal sample environment, the lack of proper reference signals and the general influence of the X-ray beam on the sample have to be considered and carefully corrected for. In the present article, we discuss how to characterize and treat these sample-dependent artefacts in a reproducible manner and we demonstrate the SLS-APE in situ, in crystallo optical spectroscopy data-analysis toolbox.

  19. Protein recognition by a pattern-generating fluorescent molecular probe.

    PubMed

    Pode, Zohar; Peri-Naor, Ronny; Georgeson, Joseph M; Ilani, Tal; Kiss, Vladimir; Unger, Tamar; Markus, Barak; Barr, Haim M; Motiei, Leila; Margulies, David

    2017-12-01

    Fluorescent molecular probes have become valuable tools in protein research; however, the current methods for using these probes are less suitable for analysing specific populations of proteins in their native environment. In this study, we address this gap by developing a unimolecular fluorescent probe that combines the properties of small-molecule-based probes and cross-reactive sensor arrays (the so-called chemical 'noses/tongues'). On the one hand, the probe can detect different proteins by generating unique identification (ID) patterns, akin to cross-reactive arrays. On the other hand, its unimolecular scaffold and selective binding enable this ID-generating probe to identify combinations of specific protein families within complex mixtures and to discriminate among isoforms in living cells, which macroscopic arrays cannot access. The ability to recycle the molecular device and use it to track several binding interactions simultaneously further demonstrates how this approach could expand the fluorescent toolbox currently used to detect and image proteins.

  20. Identification and characterization of multiple rubisco activases in chemoautotrophic bacteria

    PubMed Central

    Tsai, Yi-Chin Candace; Lapina, Maria Claribel; Bhushan, Shashi; Mueller-Cajar, Oliver

    2015-01-01

    Ribulose-1,5-bisphosphate carboxylase/oxygenase (rubisco) is responsible for almost all biological CO2 assimilation, but forms inhibited complexes with its substrate ribulose-1,5-bisphosphate (RuBP) and other sugar phosphates. The distantly related AAA+ proteins rubisco activase and CbbX remodel inhibited rubisco complexes to effect inhibitor release in plants and α-proteobacteria, respectively. Here we characterize a third class of rubisco activase in the chemolithoautotroph Acidithiobacillus ferrooxidans. Two sets of isoforms of CbbQ and CbbO form hetero-oligomers that function as specific activases for two structurally diverse rubisco forms. Mutational analysis supports a model wherein the AAA+ protein CbbQ functions as motor and CbbO is a substrate adaptor that binds rubisco via a von Willebrand factor A domain. Understanding the mechanisms employed by nature to overcome rubisco's shortcomings will increase our toolbox for engineering photosynthetic carbon dioxide fixation. PMID:26567524

  1. Protein recognition by a pattern-generating fluorescent molecular probe

    NASA Astrophysics Data System (ADS)

    Pode, Zohar; Peri-Naor, Ronny; Georgeson, Joseph M.; Ilani, Tal; Kiss, Vladimir; Unger, Tamar; Markus, Barak; Barr, Haim M.; Motiei, Leila; Margulies, David

    2017-12-01

    Fluorescent molecular probes have become valuable tools in protein research; however, the current methods for using these probes are less suitable for analysing specific populations of proteins in their native environment. In this study, we address this gap by developing a unimolecular fluorescent probe that combines the properties of small-molecule-based probes and cross-reactive sensor arrays (the so-called chemical 'noses/tongues'). On the one hand, the probe can detect different proteins by generating unique identification (ID) patterns, akin to cross-reactive arrays. On the other hand, its unimolecular scaffold and selective binding enable this ID-generating probe to identify combinations of specific protein families within complex mixtures and to discriminate among isoforms in living cells, which macroscopic arrays cannot access. The ability to recycle the molecular device and use it to track several binding interactions simultaneously further demonstrates how this approach could expand the fluorescent toolbox currently used to detect and image proteins.

  2. Fractal dimension based damage identification incorporating multi-task sparse Bayesian learning

    NASA Astrophysics Data System (ADS)

    Huang, Yong; Li, Hui; Wu, Stephen; Yang, Yongchao

    2018-07-01

    Sensitivity to damage and robustness to noise are critical requirements for effective structural damage detection. In this study, a two-stage damage identification method based on fractal dimension analysis and multi-task Bayesian learning is presented. A damage index based on Higuchi's fractal dimension (HFD) is first proposed, directly examining the time-frequency characteristics of local free vibration data of structures, motivated by the irregularity sensitivity and noise robustness of HFD. Katz's fractal dimension is then used to analyze abrupt irregularity changes of the spatial curve of the displacement mode shape along the structure. At the second stage, the multi-task sparse Bayesian learning technique is employed to infer the final damage localization vector; it borrows strength across the two fractal-dimension-based damage indicators and incorporates the prior knowledge that, short of collapse, structural damage occurs at a limited number of locations in a structure. To validate the capability of the proposed method, a steel beam and a bridge, the Yonghe Bridge, are analyzed as illustrative examples. The damage identification results demonstrate that the proposed method is capable of localizing single and multiple damage regardless of severity, and shows superior robustness under heavy noise.
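
    The Higuchi fractal dimension underlying the first-stage damage index is a standard, well-defined algorithm: build k-subsampled curve lengths L(k) and fit the slope of log L(k) against log(1/k). The numpy sketch below implements that standard algorithm, not the authors' exact damage-index pipeline; the kmax choice is an assumption.

```python
import numpy as np

# Higuchi fractal dimension: for each delay k, average the normalised length
# of the k-subsampled curves over all starting offsets m, then fit
# log L(k) ~ D * log(1/k). A smooth line gives D ~ 1; white noise gives D ~ 2.

def higuchi_fd(x, kmax=8):
    x = np.asarray(x, dtype=float)
    N = len(x)
    lengths = []
    for k in range(1, kmax + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)
            if len(idx) < 2:
                continue
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (N - 1) / ((len(idx) - 1) * k)   # Higuchi normalisation
            Lk.append(dist * norm / k)
        lengths.append(np.mean(Lk))
    ks = np.arange(1, kmax + 1)
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
    return slope

# Sanity checks: a straight line has dimension ~1; white noise ~2
line = np.linspace(0.0, 1.0, 1000)
rng = np.random.default_rng(1)
noise = rng.standard_normal(5000)
d_line = higuchi_fd(line)
d_noise = higuchi_fd(noise)
```

    In the damage-index setting, a rise of the HFD computed over short windows of free-vibration response flags increased signal irregularity near damage.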

  3. Périgord black truffle genome uncovers evolutionary origins and mechanisms of symbiosis.

    PubMed

    Martin, Francis; Kohler, Annegret; Murat, Claude; Balestrini, Raffaella; Coutinho, Pedro M; Jaillon, Olivier; Montanini, Barbara; Morin, Emmanuelle; Noel, Benjamin; Percudani, Riccardo; Porcel, Bettina; Rubini, Andrea; Amicucci, Antonella; Amselem, Joelle; Anthouard, Véronique; Arcioni, Sergio; Artiguenave, François; Aury, Jean-Marc; Ballario, Paola; Bolchi, Angelo; Brenna, Andrea; Brun, Annick; Buée, Marc; Cantarel, Brandi; Chevalier, Gérard; Couloux, Arnaud; Da Silva, Corinne; Denoeud, France; Duplessis, Sébastien; Ghignone, Stefano; Hilselberger, Benoît; Iotti, Mirco; Marçais, Benoît; Mello, Antonietta; Miranda, Michele; Pacioni, Giovanni; Quesneville, Hadi; Riccioni, Claudia; Ruotolo, Roberta; Splivallo, Richard; Stocchi, Vilberto; Tisserant, Emilie; Viscomi, Arturo Roberto; Zambonelli, Alessandra; Zampieri, Elisa; Henrissat, Bernard; Lebrun, Marc-Henri; Paolocci, Francesco; Bonfante, Paola; Ottonello, Simone; Wincker, Patrick

    2010-04-15

    The Périgord black truffle (Tuber melanosporum Vittad.) and the Piedmont white truffle dominate today's truffle market. The hypogeous fruiting body of T. melanosporum is a gastronomic delicacy produced by an ectomycorrhizal symbiont endemic to calcareous soils in southern Europe. The worldwide demand for this truffle has fuelled intense efforts at cultivation. Identification of processes that condition and trigger fruit body and symbiosis formation, ultimately leading to efficient crop production, will be facilitated by a thorough analysis of truffle genomic traits. In the ectomycorrhizal Laccaria bicolor, the expansion of gene families may have acted as a 'symbiosis toolbox'. This feature may however reflect evolution of this particular taxon and not a general trait shared by all ectomycorrhizal species. To get a better understanding of the biology and evolution of the ectomycorrhizal symbiosis, we report here the sequence of the haploid genome of T. melanosporum, which at approximately 125 megabases is the largest and most complex fungal genome sequenced so far. This expansion results from a proliferation of transposable elements accounting for approximately 58% of the genome. In contrast, this genome only contains approximately 7,500 protein-coding genes with very rare multigene families. It lacks large sets of carbohydrate cleaving enzymes, but a few of them involved in degradation of plant cell walls are induced in symbiotic tissues. The latter feature and the upregulation of genes encoding for lipases and multicopper oxidases suggest that T. melanosporum degrades its host cell walls during colonization. Symbiosis induces an increased expression of carbohydrate and amino acid transporters in both L. bicolor and T. melanosporum, but the comparison of genomic traits in the two ectomycorrhizal fungi showed that genetic predispositions for symbiosis-'the symbiosis toolbox'-evolved along different ways in ascomycetes and basidiomycetes.

  4. OPTESIM, a Versatile Toolbox for Numerical Simulation of Electron Spin Echo Envelope Modulation (ESEEM) that Features Hybrid Optimization and Statistical Assessment of Parameters

    PubMed Central

    Sun, Li; Hernandez-Guzman, Jessica; Warncke, Kurt

    2009-01-01

    Electron spin echo envelope modulation (ESEEM) is a technique of pulsed electron paramagnetic resonance (EPR) spectroscopy. The analysis of ESEEM data to extract information about the nuclear and electronic structure of a disordered (powder) paramagnetic system requires accurate and efficient numerical simulations. A single coupled nucleus of known nuclear g value (gN) and spin I=1 can have up to eight adjustable parameters in the nuclear part of the spin Hamiltonian. We have developed OPTESIM, an ESEEM simulation toolbox, for automated numerical simulation of powder two- and three-pulse one-dimensional ESEEM for an arbitrary number (N) and type (I, gN) of coupled nuclei, and arbitrary mutual orientations of the hyperfine tensor principal axis systems for N>1. OPTESIM is based in the Matlab environment and includes the following features: (1) a fast algorithm for translation of the spin Hamiltonian into simulated ESEEM; (2) different optimization methods that can be hybridized to achieve an efficient coarse-to-fine-grained search of the parameter space and convergence to a global minimum; (3) statistical analysis of the simulation parameters, which allows the identification of simultaneous confidence regions at specific confidence levels. OPTESIM also includes a geometry-preserving spherical averaging algorithm as the default for N>1, and global optimization over multiple experimental conditions, such as the dephasing time for three-pulse ESEEM and the external magnetic field value. Application examples for simulation of 14N coupling (N=1, N=2) in biological and chemical model paramagnets are included. Automated, optimized simulations using OPTESIM converge on dramatically shorter time scales relative to manual simulations. PMID:19553148

  5. Integrated structural control design of large space structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, J.J.; Lauffer, J.P.

    1995-01-01

    Active control of structures has been under intensive development for the last ten years. Reference 2 reviews much of the identification and control technology for structural control developed during this time. The technology was initially focused on space structure and weapon applications; however, recently it has also been directed toward applications in manufacturing and transportation. Much of this technology focused on multiple-input/multiple-output (MIMO) identification and control methodology, because many applications require coordinated control involving multiple disturbances and control objectives, where multiple actuators and sensors are necessary for high performance. Many optimal robust control methods have been developed for the design of MIMO robust control laws; however, there appears to be a significant gap between the theoretical development and the experimental evaluation of control and identification methods for structural control applications. Many methods have been developed for MIMO identification and control of structures, such as the Eigensystem Realization Algorithm (ERA) and Q-Markov Covariance Equivalent Realization (Q-Markov COVER) for identification, and Linear Quadratic Gaussian (LQG), frequency-weighted LQG, and H-infinity/mu-synthesis methods for control. Upon implementation, many of the identification and control methods have shown limitations, such as the excitation of unmodelled dynamics and sensitivity to system parameter variations. As a result, research on methods which address these problems has been conducted.

  6. A robust star identification algorithm with star shortlisting

    NASA Astrophysics Data System (ADS)

    Mehta, Deval Samirbhai; Chen, Shoushun; Low, Kay Soon

    2018-05-01

    A star tracker provides the most accurate attitude solution in terms of arc seconds compared to the other existing attitude sensors. When no prior attitude information is available, it operates in "Lost-In-Space (LIS)" mode. Star pattern recognition, also known as star identification algorithm, forms the most crucial part of a star tracker in the LIS mode. Recognition reliability and speed are the two most important parameters of a star pattern recognition technique. In this paper, a novel star identification algorithm with star ID shortlisting is proposed. Firstly, the star IDs are shortlisted based on worst-case patch mismatch, and later stars are identified in the image by an initial match confirmed with a running sequential angular match technique. The proposed idea is tested on 16,200 simulated star images having magnitude uncertainty, noise stars, positional deviation, and varying size of the field of view. The proposed idea is also benchmarked with the state-of-the-art star pattern recognition techniques. Finally, the real-time performance of the proposed technique is tested on the 3104 real star images captured by a star tracker SST-20S currently mounted on a satellite. The proposed technique can achieve an identification accuracy of 98% and takes only 8.2 ms for identification on real images. Simulation and real-time results depict that the proposed technique is highly robust and achieves a high speed of identification suitable for actual space applications.
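
    The running sequential angular match step can be sketched as follows. The three-star catalog, the tolerance, and the helper functions are hypothetical stand-ins, and the worst-case patch-mismatch shortlisting stage is omitted; this shows only the geometric core of confirming a candidate star by a sequence of angular distances.

```python
import math

# Hedged sketch: confirm a shortlisted reference star by checking that
# the measured angle to every other imaged star matches a catalog angle
# within tolerance. Catalog and tolerance are illustrative.

def unit(v):
    n = math.sqrt(sum(a * a for a in v))
    return tuple(a / n for a in v)

def angle(v1, v2):
    """Angular separation (radians) between two unit vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    return math.acos(max(-1.0, min(1.0, dot)))

catalog = {                        # star ID -> inertial unit direction
    101: unit((1.0, 0.0, 0.0)),
    102: unit((1.0, 0.1, 0.0)),
    103: unit((1.0, 0.0, 0.2)),
}

def sequential_angular_match(meas_ref, meas_others, catalog, ref_id, tol=1e-3):
    ref = catalog[ref_id]
    ids = []
    for vec in meas_others:
        a_meas = angle(meas_ref, vec)
        err, sid = min((abs(angle(ref, c) - a_meas), sid)
                       for sid, c in catalog.items() if sid != ref_id)
        if err > tol:
            return None            # angle sequence broken: reject candidate
        ids.append(sid)
    return ids

# With a perfectly aligned camera, measured vectors equal catalog
# directions and the candidate is confirmed:
matched = sequential_angular_match(catalog[101],
                                   [catalog[102], catalog[103]],
                                   catalog, ref_id=101)
```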

  7. Methodology for creating dedicated machine and algorithm on sunflower counting

    NASA Astrophysics Data System (ADS)

    Muracciole, Vincent; Plainchault, Patrick; Mannino, Maria-Rosaria; Bertrand, Dominique; Vigouroux, Bertrand

    2007-09-01

    In order to sell grain lots in European countries, seed industries need a government certification. This certification requires purity testing, seed counting in order to quantify specified seed species and other impurities in lots, and germination testing. These analyses are carried out within the framework of international trade according to the methods of the International Seed Testing Association. Presently, these different analyses are still performed manually by skilled operators. Previous works have already shown that seeds can be characterized by around 110 visual features (morphology, colour, texture), and have presented several identification algorithms. Until now, most of the work in this domain has been computer based. The approach presented in this article is based on the design of a dedicated electronic vision machine aimed at identifying and sorting seeds. This machine is composed of an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), and a PC hosting the graphical user interface of the system. Its operation relies on the stroboscopic image acquisition of a seed falling in front of a camera. A first machine was designed according to this approach in order to simulate the entire vision chain (image acquisition, feature extraction, identification) under the Matlab environment. To make this task portable to dedicated hardware, all of these algorithms were developed without the use of Matlab toolboxes. The objective of this article is to present a design methodology for a special-purpose identification algorithm, based on distance between groups, in a dedicated hardware machine for seed counting.
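
    A minimal sketch of identification "based on distance between groups" is a nearest-centroid classifier, shown below with made-up two-feature seed measurements; the real system uses around 110 morphology, colour, and texture features.

```python
# Hedged sketch: nearest-centroid classification over hypothetical
# 2-feature measurements (area, mean colour). Groups and values are
# illustrative, not from the article.

def centroid(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Hypothetical training groups: sunflower seeds vs. an impurity class.
groups = {
    "sunflower": [[9.0, 0.30], [10.0, 0.35], [11.0, 0.32]],
    "impurity":  [[3.0, 0.60], [2.5, 0.55], [3.5, 0.65]],
}
centroids = {k: centroid(v) for k, v in groups.items()}

def classify(x):
    """Assign x to the group whose centroid is nearest."""
    return min(centroids, key=lambda k: dist2(x, centroids[k]))

label = classify([9.5, 0.33])
```

    A distance-to-centroid rule like this maps cleanly onto FPGA/DSP hardware, since it needs only multiply-accumulate operations and a comparison per class.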

  8. The Activation of Embedded Words in Spoken Word Identification Is Robust but Constrained: Evidence from the Picture-Word Interference Paradigm

    ERIC Educational Resources Information Center

    Bowers, Jeffrey S.; Davis, Colin J.; Mattys, Sven L.; Damian, Markus F.; Hanley, Derek

    2009-01-01

    Three picture-word interference (PWI) experiments assessed the extent to which embedded subset words are activated during the identification of spoken superset words (e.g., "bone" in "trombone"). Participants named aloud pictures (e.g., "brain") while spoken distractors were presented. In the critical condition,…

  9. Wave data processing toolbox manual

    USGS Publications Warehouse

    Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul

    2006-01-01

    Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP(tm)), the Sontek Argonaut, and the Nortek Aquadopp(tm) Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive, and further analysis of the data becomes cumbersome. More importantly, these files alone do not include sufficient information pertinent to the deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox aggregates the processed files output by the proprietary software into two NetCDF files: one containing the statistics of the burst data and the other containing the raw burst data (additional details are described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format are easy to disseminate, portable to any computer platform, and viewable with public-domain, freely available software.
Another important advantage is that a metadata structure is embedded with the data to document pertinent information regarding the deployment and the parameters used to process the data. Using this format ensures that the relevant information about how the data were collected and converted to physical units is maintained with the actual data. EPIC-standard variable names have been utilized where appropriate. These standards, developed by the NOAA Pacific Marine Environmental Laboratory (PMEL) (http://www.pmel.noaa.gov/epic/), provide a universal vernacular, allowing researchers to share data without translation.
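
    As a hedged illustration of the embedded-metadata idea, a NetCDF header for such a file might look like the following CDL fragment. The variable name, attributes, and values are illustrative inventions, not the toolbox's actual schema; the authoritative EPIC codes and conventions are those defined at the PMEL link above.

```
netcdf adcp_waves {                   // illustrative CDL, not the toolbox's actual schema
dimensions:
        time = UNLIMITED ;            // one record per wave burst
variables:
        double time(time) ;
                time:units = "days since 1968-05-23 00:00:00 UTC" ;
        float wh_4061(time) ;
                wh_4061:long_name = "Significant Wave Height" ;   // EPIC-style name
                wh_4061:units = "m" ;

// global attributes (deployment metadata embedded with the data):
        :MOORING = "8371" ;
        :WATER_DEPTH = 14.5f ;
        :INST_TYPE = "RD Instruments ADCP" ;
}
```

    Because the metadata travel inside the file, a reader opening the NetCDF years later still sees the mooring, water depth, and instrument type alongside the measurements.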

  10. Robust Selection Algorithm (RSA) for Multi-Omic Biomarker Discovery; Integration with Functional Network Analysis to Identify miRNA Regulated Pathways in Multiple Cancers.

    PubMed

    Sehgal, Vasudha; Seviour, Elena G; Moss, Tyler J; Mills, Gordon B; Azencott, Robert; Ram, Prahlad T

    2015-01-01

    MicroRNAs (miRNAs) play a crucial role in the maintenance of cellular homeostasis by regulating the expression of their target genes. As such, the dysregulation of miRNA expression has been frequently linked to cancer. With rapidly accumulating molecular data linked to patient outcome, the identification of robust multi-omic molecular markers is critical for clinical impact. While previous bioinformatic tools have been developed to identify potential biomarkers in cancer, these methods do not allow for rapid classification of oncogenes versus tumor suppressors while taking into account robust differential expression, cutoffs, p-values, and non-normality of the data. Here, we propose a methodology, the Robust Selection Algorithm (RSA), that addresses these important problems in big-data omics analysis. The robustness of the survival analysis is ensured by identification of optimal cutoff values of omics expression, strengthened by p-values computed through intensive random resampling that takes into account any non-normality in the data, and by integration into multi-omic functional networks. Here we have analyzed pan-cancer miRNA patient data to identify functional pathways involved in cancer progression that are associated with the miRNAs selected by RSA. Our approach demonstrates how existing survival analysis techniques can be integrated with a functional network analysis framework to efficiently identify promising biomarkers and novel therapeutic candidates across diseases.
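
    The cutoff-plus-resampling idea can be sketched as below. The data are synthetic, the outcome is simplified to a binary label rather than survival times, and the split statistic is a plain group difference; RSA itself is considerably more elaborate, so this is only the shape of the argument.

```python
import random

# Hedged sketch: scan candidate expression cutoffs for the split that
# best separates outcomes, then judge it by a permutation (resampling)
# p-value instead of assuming normality.

random.seed(0)
expr = ([random.gauss(0, 1) for _ in range(40)] +
        [random.gauss(2, 1) for _ in range(40)])   # synthetic expression
outcome = [0] * 40 + [1] * 40                      # hypothetical outcome labels

def split_stat(expr, outcome, cut):
    """Absolute difference in outcome rate above vs. below the cutoff."""
    hi = [o for e, o in zip(expr, outcome) if e > cut]
    lo = [o for e, o in zip(expr, outcome) if e <= cut]
    if not hi or not lo:
        return 0.0
    return abs(sum(hi) / len(hi) - sum(lo) / len(lo))

cuts = sorted(expr)[1:-1]                          # interior candidate cutoffs
best_cut = max(cuts, key=lambda c: split_stat(expr, outcome, c))
observed = split_stat(expr, outcome, best_cut)

# Permutation p-value: how often does a shuffled outcome do as well?
worse = 0
for _ in range(200):
    perm = outcome[:]
    random.shuffle(perm)
    if split_stat(expr, perm, best_cut) >= observed:
        worse += 1
p_value = worse / 200
```

    Because the null distribution comes from shuffling the labels themselves, the test remains valid for skewed or otherwise non-normal expression data.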

  11. ESA's Multi-mission Sentinel-1 Toolbox

    NASA Astrophysics Data System (ADS)

    Veci, Luis; Lu, Jun; Foumelis, Michael; Engdahl, Marcus

    2017-04-01

    The Sentinel-1 Toolbox is a new open-source software package for scientific learning, research, and exploitation of the large archives of Sentinel and heritage missions. The Toolbox is based on the proven BEAM/NEST architecture, inheriting all current NEST functionality, including multi-mission support for most civilian satellite SAR missions. The project is funded through ESA's Scientific Exploitation of Operational Missions (SEOM). The Sentinel-1 Toolbox will strive to serve the SEOM mandate by providing leading-edge software to science and application users in support of ESA's operational SAR mission, as well as by educating and growing a SAR user community. The Toolbox consists of a collection of processing tools, data product readers and writers, and a display and analysis application. A common architecture for all Sentinel Toolboxes, called the Sentinel Application Platform (SNAP), is being jointly developed by Brockmann Consult, Array Systems Computing, and C-S. The SNAP architecture is ideal for Earth Observation processing and analysis due to the following technological innovations: extensibility, portability, a modular rich client platform, generic EO data abstraction, tiled memory management, and a graph processing framework. The project has developed new tools for working with Sentinel-1 data, in particular with the new interferometric TOPSAR mode. TOPSAR complex coregistration and a complete interferometric processing chain have been implemented for Sentinel-1 TOPSAR data. To accomplish this, a coregistration following the spectral diversity [4] method has been developed, as well as special azimuth handling in the coherence, interferogram, and spectral filter operators. The Toolbox includes reading of L0, L1, and L2 products in SAFE format, calibration and de-noising, slice product assembling, TOPSAR deburst and sub-swath merging, terrain-flattening radiometric normalization, and visualization of L2 OCN products.
The Toolbox also provides several new tools for exploitation of polarimetric data, including speckle filters, decompositions, and classifiers. The Toolbox will also include tools for large data stacks, supervised and unsupervised classification, improved vector handling, and change detection. Architectural improvements such as smart memory configuration, task queuing, and optimizations for complex data will provide better support and performance for very large products and stacks. In addition, a Cloud Exploitation Platform Extension (CEP) has been developed to add the capability to smoothly utilize a cloud computing platform where EO data repositories and high-performance processing capabilities are available. The extension to the Sentinel Application Platform would facilitate entry into cloud processing services supporting bulk processing on high-performance clusters. Since December 2016, the COMET-LiCS InSAR portal (http://comet.nerc.ac.uk/COMET-LiCS-portal/) has been live, delivering interferograms and coherence estimates over the entire Alpine-Himalayan belt. The portal already contains tens of thousands of products, which can be browsed in a user-friendly portal and downloaded for free by the general public. For our processing, we use the facilities at the Climate and Environmental Monitoring from Space (CEMS) facility. Here we have large storage and processing facilities at our disposal, and a complete duplicate of the Sentinel-1 archive is maintained. This greatly simplifies the infrastructure we had to develop for automated processing of large areas. Here we will give an overview of the current status of the processing system, as well as discuss future plans. We will cover the infrastructure we developed to automatically produce interferograms and its challenges, and the processing strategy for time-series analysis. We will outline the objectives of the system in the near and distant future, and a roadmap for its continued development.
Finally, we will highlight some of the scientific results and projects linked to the system.

  12. Visualizing flow fields using acoustic Doppler current profilers and the Velocity Mapping Toolbox

    USGS Publications Warehouse

    Jackson, P. Ryan

    2013-01-01

    The purpose of this fact sheet is to provide examples of how the U.S. Geological Survey is using acoustic Doppler current profilers for much more than routine discharge measurements. These instruments are capable of mapping complex three-dimensional flow fields within rivers, lakes, and estuaries. Using the Velocity Mapping Toolbox to process the ADCP data allows detailed visualization of the data, providing valuable information for a range of studies and applications.

  13. IV. NIH Toolbox Cognition Battery (CB): measuring language (vocabulary comprehension and reading decoding).

    PubMed

    Gershon, Richard C; Slotkin, Jerry; Manly, Jennifer J; Blitz, David L; Beaumont, Jennifer L; Schnipke, Deborah; Wallner-Allen, Kathleen; Golinkoff, Roberta Michnick; Gleason, Jean Berko; Hirsh-Pasek, Kathy; Adams, Marilyn Jager; Weintraub, Sandra

    2013-08-01

    Mastery of language skills is an important predictor of daily functioning and health. Vocabulary comprehension and reading decoding are relatively quick and easy to measure and correlate highly with overall cognitive functioning, as well as with success in school and work. New measures of vocabulary comprehension and reading decoding (in both English and Spanish) were developed for the NIH Toolbox Cognition Battery (CB). In the Toolbox Picture Vocabulary Test (TPVT), participants hear a spoken word while viewing four pictures, and then must choose the picture that best represents the word. This approach tests receptive vocabulary knowledge without the need to read or write, removing the literacy load for children who are developing literacy and for adults who struggle with reading and writing. In the Toolbox Oral Reading Recognition Test (TORRT), participants see a letter or word onscreen and must pronounce or identify it. The examiner determines whether it was pronounced correctly by comparing the response to the pronunciation guide on a separate computer screen. In this chapter, we discuss the importance of language during childhood and the relation of language and brain function. We also review the development of the TPVT and TORRT, including information about the item calibration process and results from a validation study. Finally, the strengths and weaknesses of the measures are discussed. © 2013 The Society for Research in Child Development, Inc.

  14. The iRoCS Toolbox--3D analysis of the plant root apical meristem at cellular resolution.

    PubMed

    Schmidt, Thorsten; Pasternak, Taras; Liu, Kun; Blein, Thomas; Aubry-Hivet, Dorothée; Dovzhenko, Alexander; Duerr, Jasmin; Teale, William; Ditengou, Franck A; Burkhardt, Hans; Ronneberger, Olaf; Palme, Klaus

    2014-03-01

    To achieve a detailed understanding of processes in biological systems, cellular features must be quantified in the three-dimensional (3D) context of cells and organs. We describe the use of the intrinsic root coordinate system (iRoCS) as a reference model for the root apical meristem of plants. iRoCS enables direct and quantitative comparison between the root tips of plant populations at single-cell resolution. The iRoCS Toolbox automatically fits standardized coordinates to raw 3D image data. It detects nuclei or segments cells, automatically fits the coordinate system, and groups the nuclei/cells into the root's tissue layers. The division status of each nucleus may also be determined. The only manual step required is to mark the quiescent centre. All intermediate outputs may be refined if necessary. The ability to learn the visual appearance of nuclei by example allows the iRoCS Toolbox to be easily adapted to various phenotypes. The iRoCS Toolbox is provided as an open-source software package, licensed under the GNU General Public License, to make it accessible to a broad community. To demonstrate the power of the technique, we measured subtle changes in cell division patterns caused by modified auxin flux within the Arabidopsis thaliana root apical meristem. © 2014 The Authors The Plant Journal © 2014 John Wiley & Sons Ltd.

  15. NIA outreach to minority and health disparity populations can a toolbox for recruitment and retention be far behind?

    PubMed

    Harden, J Taylor; Silverberg, Nina

    2010-01-01

    The ability to locate the right research tool at the right time for recruitment and retention of minority and health disparity populations is a challenge. This article provides an introduction to a number of recruitment and retention tools in a National Institute on Aging Health Disparities Toolbox and to this special edition on challenges and opportunities in recruitment and retention of minority populations in Alzheimer disease and dementia research. The Health Disparities Toolbox and Health Disparities Resource Persons Network are described along with other more established resource tools including the Alzheimer Disease Center Education Cores, Alzheimer Disease Education and Referral Center, and Resource Centers for Minority Aging Research. Nine featured articles are introduced. The articles address a range of concerns including what we know and do not know, conceptual and theoretical perspectives framing issues of diversity and inclusion, success as a result of sustained investment of time and community partnerships, the significant issue of mistrust, willingness to participate in research as a dynamic personal attribute, Helpline Service and the amount of resources required for success, assistance in working with Limited English Proficiency elders, and sage advice from social marketing and investigations of health literacy as a barrier to recruitment and retention. Finally, an appeal is made for scientists to share tools for the National Institute on Aging Health Disparity Toolbox and to join the Health Disparities Resource Persons Network.

  16. Development of a Robust star identification technique for use in attitude determination of the ACE spacecraft

    NASA Technical Reports Server (NTRS)

    Woodard, Mark; Rohrbaugh, Dave

    1995-01-01

    The Advanced Composition Explorer (ACE) spacecraft is designed to fly in a spin-stabilized attitude. The spacecraft will carry two attitude sensors - a digital fine Sun sensor and a charge coupled device (CCD) star tracker - to allow ground-based determination of the spacecraft attitude and spin rate. Part of the processing that must be performed on the CCD star tracker data is star identification. Star data received from the spacecraft must be matched with star information in the SKYMAP catalog to determine exactly which stars the sensor is tracking. This information, along with the Sun vector measured by the Sun sensor, is used to determine the spacecraft attitude. Several existing star identification (star ID) systems were examined to determine whether they could be modified for use on the ACE mission. Star ID systems that exist for three-axis stabilized spacecraft tend to be complex in nature, and many require fairly good knowledge of the spacecraft attitude, making them excessive for use on ACE. Star ID systems used for spinners carrying traditional slit star sensors would have to be modified to model the CCD star tracker. The ACE star ID algorithm must also be robust, in that it must correctly identify stars even when the attitude is not known to a high degree of accuracy, and it must be very efficient to allow real-time star identification. The paper presents the star ID algorithm that was developed for ACE. Results from prototype testing are also presented to demonstrate the efficiency, accuracy, and robustness of the algorithm.

  17. Linear control of oscillator and amplifier flows

    NASA Astrophysics Data System (ADS)

    Schmid, Peter J.; Sipp, Denis

    2016-08-01

    Linear control applied to fluid systems near an equilibrium point has important applications for many flows of industrial or fundamental interest. In this article we give an exposition of tools and approaches for the design of control strategies for globally stable or unstable flows. For unstable oscillator flows, a feedback configuration and a model-based approach are proposed, while for stable noise-amplifier flows a feedforward setup and an approach based on system identification are advocated. Model reduction and robustness issues are addressed for the oscillator case; statistical learning techniques are emphasized for the amplifier case. Effective suppression of global and convective instabilities could be demonstrated in either case, even though the system-identification approach results in superior robustness to off-design conditions.

  18. Identification of Terrestrial Reflectance From Remote Sensing

    NASA Technical Reports Server (NTRS)

    Alter-Gartenberg, Rachel; Nolf, Scott R.; Stacy, Kathryn (Technical Monitor)

    2000-01-01

    Correcting for atmospheric effects is an essential part of surface-reflectance recovery from radiance measurements. Model-based atmospheric correction techniques enable an accurate identification and classification of terrestrial reflectances from multi-spectral imagery. Successful and efficient removal of atmospheric effects from remote-sensing data is a key factor in the success of Earth observation missions. This report assesses the performance, robustness and sensitivity of two atmospheric-correction and reflectance-recovery techniques as part of an end-to-end simulation of hyper-spectral acquisition, identification and classification.

  19. A robust set of black walnut microsatellites for parentage and clonal identification

    Treesearch

    Rodney L. Robichaud; Jeffrey C. Glaubitz; Olin E. Rhodes; Keith Woeste

    2006-01-01

    We describe the development of a robust and powerful suite of 12 microsatellite marker loci for use in genetic investigations of black walnut and related species. These 12 loci were chosen from a set of 17 candidate loci used to genotype 222 trees sampled from a 38-year-old black walnut progeny test. The 222 genotypes represent a sampling from the broad geographic...

  20. Wavelet Filtering to Reduce Conservatism in Aeroservoelastic Robust Stability Margins

    NASA Technical Reports Server (NTRS)

    Brenner, Marty; Lind, Rick

    1998-01-01

    Wavelet analysis for filtering and system identification was used to improve the estimation of aeroservoelastic stability margins. The conservatism of the robust stability margins was reduced with parametric and nonparametric time-frequency analysis of flight data in the model validation process. Nonparametric wavelet processing of data was used to reduce the effects of external disturbances and unmodeled dynamics. Parametric estimates of modal stability were also extracted using the wavelet transform. Computation of robust stability margins for stability boundary prediction depends on uncertainty descriptions derived from the data for model validation. F-18 High Alpha Research Vehicle aeroservoelastic flight test data demonstrated improved robust stability prediction by extension of the stability boundary beyond the flight regime.
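
    A toy version of the nonparametric wavelet filtering step, assuming a one-level Haar transform with soft thresholding; the actual analyses use deeper multiresolution transforms on flight data rather than the synthetic sinusoid below, and the noise level is assumed known here.

```python
import math
import random

# Hedged sketch: one-level Haar wavelet denoising with soft thresholding
# of the detail coefficients. Signal, noise level, and threshold rule
# are illustrative.

def haar_forward(x):
    approx = [(a + b) / math.sqrt(2) for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / math.sqrt(2) for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)])
    return out

def soft(v, t):
    """Soft threshold: shrink toward zero by t."""
    return math.copysign(max(abs(v) - t, 0.0), v)

random.seed(1)
n = 256
clean = [math.sin(2 * math.pi * 4 * i / n) for i in range(n)]
noisy = [c + random.gauss(0, 0.3) for c in clean]

approx, detail = haar_forward(noisy)
thr = 0.3 * math.sqrt(2 * math.log(n))   # universal threshold, sigma assumed known
denoised = haar_inverse(approx, [soft(d, thr) for d in detail])

def rmse(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))
```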

  1. Spectral imaging toolbox: segmentation, hyperstack reconstruction, and batch processing of spectral images for the determination of cell and model membrane lipid order.

    PubMed

    Aron, Miles; Browning, Richard; Carugo, Dario; Sezgin, Erdinc; Bernardino de la Serna, Jorge; Eggeling, Christian; Stride, Eleanor

    2017-05-12

    Spectral imaging with polarity-sensitive fluorescent probes enables the quantification of cell and model membrane physical properties, including local hydration, fluidity, and lateral lipid packing, usually characterized by the generalized polarization (GP) parameter. With the development of commercial microscopes equipped with spectral detectors, spectral imaging has become a convenient and powerful technique for measuring GP and other membrane properties. The existing tools for spectral image processing, however, are insufficient for processing the large data sets afforded by this technological advancement, and are unsuitable for processing images acquired with rapidly internalized fluorescent probes. Here we present a MATLAB spectral imaging toolbox with the aim of overcoming these limitations. In addition to common operations, such as the calculation of distributions of GP values, generation of pseudo-colored GP maps, and spectral analysis, a key highlight of this tool is reliable membrane segmentation for probes that are rapidly internalized. Furthermore, handling for hyperstacks, 3D reconstruction, and batch processing facilitates analysis of data sets generated by time series, z-stack, and area scan microscope operations. Finally, the object size distribution is determined, which can provide insight into the mechanisms underlying changes in membrane properties and is desirable for, e.g., studies involving model membranes and surfactant-coated particles. Analysis is demonstrated for cell membranes, cell-derived vesicles, model membranes, and microbubbles with the environmentally sensitive probes Laurdan, carboxyl-modified Laurdan (C-Laurdan), Di-4-ANEPPDHQ, and Di-4-AN(F)EPPTEA (FE), for quantification of the local lateral density of lipids or lipid packing. The Spectral Imaging Toolbox is a powerful tool for the segmentation and processing of large spectral imaging datasets, with a reliable method for membrane segmentation, and requires no programming ability.
The Spectral Imaging Toolbox can be downloaded from https://uk.mathworks.com/matlabcentral/fileexchange/62617-spectral-imaging-toolbox .
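
    The central GP computation is simple enough to sketch. The definition below follows the common convention GP = (I_ord - I_dis)/(I_ord + I_dis); the band assignments and the toy flattened "images" are illustrative assumptions, not the toolbox's defaults.

```python
# Hedged sketch: per-pixel generalized polarization (GP) from two
# spectral channels (e.g. roughly 440 nm and 490 nm emission for
# Laurdan). Intensities below are made up.

def gp(i_ordered, i_disordered):
    """GP in [-1, 1]; 0 for empty pixels to avoid division by zero."""
    total = i_ordered + i_disordered
    return (i_ordered - i_disordered) / total if total else 0.0

# Flattened 2x2 pixel intensities for the two channels:
ch_ordered = [120.0, 80.0, 30.0, 0.0]
ch_disordered = [40.0, 80.0, 90.0, 0.0]

gp_map = [gp(a, b) for a, b in zip(ch_ordered, ch_disordered)]
```

    Higher GP indicates a more ordered, tightly packed membrane environment; a pseudo-colored GP map is just this quantity rendered per pixel.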

  2. Performance study of LMS based adaptive algorithms for unknown system identification

    NASA Astrophysics Data System (ADS)

    Javed, Shazia; Ahmad, Noor Atinah

    2014-07-01

    Adaptive filtering techniques have gained much popularity in the modeling of the unknown system identification problem. These techniques can be classified as either iterative or direct. Iterative techniques include the stochastic descent method and its improved versions in affine space. In this paper we present a comparative study of the least mean square (LMS) algorithm and some improved versions of LMS, more precisely the normalized LMS (NLMS), LMS-Newton, transform domain LMS (TDLMS), and affine projection algorithm (APA). The performance evaluation of these algorithms is carried out using an adaptive system identification (ASI) model with random input signals, in which the unknown (measured) signal is assumed to be contaminated by output noise. Simulation results are recorded to compare the performance in terms of convergence speed, robustness, misalignment, and sensitivity to the spectral properties of input signals. The main objective of this comparative study is to observe the effects of the fast convergence rate of improved versions of LMS algorithms on their robustness and misalignment.
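
    The LMS and NLMS updates under the ASI model can be sketched as follows. The 4-tap unknown system, the step sizes, and the noise level are illustrative assumptions, not the paper's experimental settings.

```python
import numpy as np

# Hedged sketch of adaptive system identification (ASI): estimate an
# unknown FIR system from noisy input/output data with LMS and NLMS,
# then compare misalignment.

rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.2, 0.1])          # unknown system
n, taps = 5000, 4
x = rng.standard_normal(n)                        # random input signal
d = np.convolve(x, w_true)[:n] + 0.01 * rng.standard_normal(n)  # noisy output

def adapt(x, d, taps, mu, normalized=False, eps=1e-8):
    """Run (N)LMS over the data and return the final weight estimate."""
    w = np.zeros(taps)
    for k in range(taps, len(x)):
        u = x[k - taps + 1:k + 1][::-1]           # newest sample first
        e = d[k] - w @ u                          # a-priori error
        step = mu / (eps + u @ u) if normalized else mu
        w = w + step * e * u
    return w

w_lms = adapt(x, d, taps, mu=0.01)
w_nlms = adapt(x, d, taps, mu=0.5, normalized=True)

def misalignment(w):
    """Relative weight-error norm, a standard comparison metric."""
    return np.linalg.norm(w - w_true) / np.linalg.norm(w_true)
```

    The normalization by the input energy `u @ u` is what makes NLMS insensitive to the input's scale, one of the robustness properties the study compares.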

  3. Performance study of LMS based adaptive algorithms for unknown system identification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Javed, Shazia; Ahmad, Noor Atinah

    Adaptive filtering techniques have gained much popularity in the modeling of the unknown system identification problem. These techniques can be classified as either iterative or direct. Iterative techniques include the stochastic descent method and its improved versions in affine space. In this paper we present a comparative study of the least mean square (LMS) algorithm and some improved versions of LMS, more precisely the normalized LMS (NLMS), LMS-Newton, transform domain LMS (TDLMS), and affine projection algorithm (APA). The performance evaluation of these algorithms is carried out using an adaptive system identification (ASI) model with random input signals, in which the unknown (measured) signal is assumed to be contaminated by output noise. Simulation results are recorded to compare the performance in terms of convergence speed, robustness, misalignment, and sensitivity to the spectral properties of input signals. The main objective of this comparative study is to observe the effects of the fast convergence rate of improved versions of LMS algorithms on their robustness and misalignment.

  4. Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) Users' Workshop Presentations

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S. (Compiler)

    2018-01-01

    NASA Glenn Research Center hosted a Users' Workshop on the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) on August 21, 2017. The objective of this workshop was to update the user community on the latest features of T-MATS, and to provide a forum to present work performed using T-MATS. Presentations highlighted creative applications and the development of new features and libraries, and emphasized the flexibility and simulation power of T-MATS.

  5. Multirate Flutter Suppression System Design for the Benchmark Active Controls Technology Wing. Part 2; Methodology Application Software Toolbox

    NASA Technical Reports Server (NTRS)

    Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek

    2002-01-01

    To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.

  6. Identification of Differential Item Functioning in Multiple-Group Settings: A Multivariate Outlier Detection Approach

    ERIC Educational Resources Information Center

    Magis, David; De Boeck, Paul

    2011-01-01

    We focus on the identification of differential item functioning (DIF) when more than two groups of examinees are considered. We propose to consider items as elements of a multivariate space, where DIF items are outlying elements. Following this approach, the situation of multiple groups is a quite natural case. A robust statistics technique is…

  7. Study of environmental sound source identification based on hidden Markov model for robust speech recognition

    NASA Astrophysics Data System (ADS)

    Nishiura, Takanobu; Nakamura, Satoshi

    2003-10-01

    Humans communicate with each other through speech by focusing on the target speech among environmental sounds in real acoustic environments. We can easily identify the target sound from other environmental sounds. For hands-free speech recognition, the identification of the target speech from environmental sounds is imperative. This mechanism may also be important for a self-moving robot to sense the acoustic environments and communicate with humans. Therefore, this paper first proposes hidden Markov model (HMM)-based environmental sound source identification. Environmental sounds are modeled by HMMs with three states and evaluated using 92 kinds of environmental sounds. The identification accuracy was 95.4%. This paper also proposes a new HMM composition method that composes speech HMMs and an HMM of categorized environmental sounds for robust environmental-sound-added speech recognition. As a result of the evaluation experiments, we confirmed that the proposed HMM composition outperforms the conventional HMM composition with speech HMMs and a noise (environmental sound) HMM trained using noise periods prior to the target speech in a captured signal. [Work supported by the Ministry of Public Management, Home Affairs, Posts and Telecommunications of Japan.]
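
    The identification step can be sketched with the scaled forward algorithm over discrete observations: score the sequence under each class HMM and pick the most likely class. The two tiny two-state, discrete-output models below are illustrative stand-ins for trained environmental-sound HMMs (the paper's models have three states and real acoustic features).

```python
import math

# Hedged sketch: HMM-based sound-source identification via the scaled
# forward algorithm. Models, symbols, and sequences are made up.

def forward_loglik(obs, pi, A, B):
    """Log P(obs | model) for a discrete-output HMM, with scaling."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    c = sum(alpha)
    loglik = math.log(c)
    alpha = [a / c for a in alpha]
    for o in obs[1:]:
        alpha = [B[s][o] * sum(alpha[t] * A[t][s] for t in range(n))
                 for s in range(n)]
        c = sum(alpha)
        loglik += math.log(c)          # scaling avoids numerical underflow
        alpha = [a / c for a in alpha]
    return loglik

# (initial probs, transition matrix, emission matrix) per class;
# symbol 0 ~ "voiced frame", symbol 1 ~ "impulsive frame" (hypothetical).
models = {
    "speech": ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]],
               [[0.8, 0.2], [0.6, 0.4]]),
    "impact": ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]],
               [[0.2, 0.8], [0.1, 0.9]]),
}

def identify(obs):
    """Pick the class whose HMM assigns the observation the highest likelihood."""
    return max(models, key=lambda m: forward_loglik(obs, *models[m]))

label = identify([0, 0, 1, 0, 0, 0])
```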

  8. Soft and Robust Identification of Body Fluid Using Fourier Transform Infrared Spectroscopy and Chemometric Strategies for Forensic Analysis.

    PubMed

    Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko; Ozawa, Takeaki

    2018-05-31

    Body fluid (BF) identification is a critical part of a criminal investigation because of its ability to suggest how the crime was committed and to provide reliable origins of DNA. In contrast to current methods using serological and biochemical techniques, vibrational spectroscopic approaches provide alternative advantages for forensic BF identification, such as nondestructive analysis and versatility across various BF types and analytical interests. However, unexplored issues remain for their practical application to forensics; for example, a specific BF needs to be discriminated from all other suspicious materials as well as from other BFs, and the method should be applicable even to aged BF samples. Herein, we describe an innovative modeling method for discriminating the ATR FT-IR spectra of various BFs, including peripheral blood, saliva, semen, urine and sweat, to meet the practical demands described above. Spectra from unexpected non-BF samples were efficiently excluded as outliers by adopting the Q-statistics technique. The robustness of the models against aged BFs was significantly improved by using the discrimination scheme of a dichotomous classification tree with hierarchical clustering. The present study advances the use of vibrational spectroscopy and a chemometric strategy for forensic BF identification.

  9. Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Prazenica, Richard J.

    2003-01-01

    Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics, assumptions and mechanics of the estimation procedures, noise, and nonlinearity. A fundamental requirement for reliable and robust model development is an attempt to account for each of these sources of error, in particular for model validation, robust stability prediction, and flight control system development. This paper is concerned with data processing procedures for uncertainty reduction in model validation for stability estimation and nonlinear identification. F/A-18 Active Aeroelastic Wing (AAW) aircraft data is used to demonstrate signal representation effects on uncertain model development, stability estimation, and nonlinear identification. Data is decomposed using adaptive orthonormal best-basis and wavelet-basis signal decompositions for denoising before input to linear and nonlinear identification algorithms. Nonlinear identification from a wavelet-based Volterra kernel procedure is used to extract nonlinear dynamics from aeroelastic responses, and to assist model development and uncertainty reduction for model validation and stability prediction by removing a class of nonlinearity from the uncertainty.

  10. CEMENTITIOUS BARRIERS PARTNERSHIP FY13 MID-YEAR REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burns, H.; Flach, G.; Langton, C.

    2013-05-01

    In FY2013, the Cementitious Barriers Partnership (CBP) is continuing its effort to develop and enhance software tools, demonstrating tangible progress toward fulfilling the objective of developing a set of tools to improve understanding and prediction of the long-term structural, hydraulic, and chemical performance of cementitious barriers used in nuclear applications. In FY2012, the CBP released the initial in-house “Beta version” of the CBP Software Toolbox, a suite of software for simulating reactive transport in cementitious materials and important degradation phenomena. The current primary software components are LeachXS/ORCHESTRA, STADIUM, and a GoldSim interface for probabilistic analysis of selected degradation scenarios. THAMES is a planned future CBP Toolbox component (FY13/14) focused on simulation of the microstructure of cementitious materials and calculation of the resultant hydraulic and constituent mass transfer parameters needed in modeling. This past November, CBP Software Toolbox Version 1.0 was released, which supports analysis of external sulfate attack (including damage mechanics), carbonation, and primary constituent leaching. The LeachXS component embodies an extensive material property measurement database along with chemical speciation and reactive mass transport simulation cases, with emphasis on leaching of major, trace, and radionuclide constituents from cementitious materials used in DOE facilities, such as Saltstone (Savannah River) and Cast Stone (Hanford), tank closure grouts, and barrier concretes. STADIUM focuses on the physical and structural service life of materials and components based on chemical speciation and reactive mass transport of major cement constituents and aggressive species (e.g., chloride, sulfate, etc.). The CBP issued numerous reports and other documentation accompanying the Version 1.0 release, including a CBP Software Toolbox User Guide and Installation Guide.
    These documents, as well as the presentations from the CBP Software Toolbox Demonstration and User Workshop, which are briefly described below, can be accessed from the CBP webpage at http://cementbarriers.org/. The website was recently modified to describe the CBP Software Toolbox and includes an interest form for application to use the software. The CBP FY13 program is continuing research to improve and enhance the simulation tools, as well as to develop new tools that model other key degradation phenomena not addressed in Version 1.0. Efforts to verify the various simulation tools through laboratory experiments and analysis of field specimens are also ongoing, to quantify and reduce the uncertainty associated with performance assessments. This mid-year report includes a summary of the FY13 software accomplishments, including the release of Version 1.0 of the CBP Software Toolbox, and of the various experimental programs that are providing data for calibration and validation of the CBP-developed software. The focus this year for experimental studies was to measure transport in cementitious material using a leaching method and the reduction capacity of saltstone field samples. Results are being used to calibrate and validate the updated carbonation model.

  11. Identification and differentiation of food-related bacteria: A comparison of FTIR spectroscopy and MALDI-TOF mass spectrometry.

    PubMed

    Wenning, Mareike; Breitenwieser, Franziska; Konrad, Regina; Huber, Ingrid; Busch, Ulrich; Scherer, Siegfried

    2014-08-01

    The food industry requires easy, accurate, and cost-effective techniques for microbial identification to ensure safe products and identify microbial contaminations. In this work, FTIR spectroscopy and MALDI-TOF mass spectrometry were assessed for their suitability and applicability for routine microbial diagnostics of food-related microorganisms by analyzing their robustness against changes in incubation time and medium, their identification accuracy, and their ability to differentiate isolates down to the strain level. Changes in the protocol led to significantly impaired performance of FTIR spectroscopy, whereas they had only minor effects on MALDI-TOF MS. Identification accuracy was tested using 174 food-related bacteria (93 species) from an in-house strain collection and 40 fresh isolates from routine food analyses. For MALDI-TOF MS, weaknesses in the identification of bacilli and pseudomonads were observed; FTIR spectroscopy had most difficulties in identifying pseudomonads and enterobacteria. In general, MALDI-TOF MS obtained better results (52-85% correct at species level), since the analysis of mainly ribosomal proteins is more robust and appears more reliable. FTIR spectroscopy suffers from the fact that it generates a whole-cell fingerprint, and intraspecies diversity may lead to overlapping species borders, which complicates identification. In the present study, correct species identification rates between 56% and 67% were obtained. Conversely, this high sensitivity offers the opportunity of typing below the species level, which was not possible using MALDI-TOF MS. Using fresh isolates from routine diagnostics, both techniques performed well, with 88% (MALDI-TOF) and 75% (FTIR) correct identifications at species level, respectively. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Robust nonlinear system identification: Bayesian mixture of experts using the t-distribution

    NASA Astrophysics Data System (ADS)

    Baldacchino, Tara; Worden, Keith; Rowson, Jennifer

    2017-02-01

    A novel variational Bayesian mixture of experts model for robust regression of bifurcating and piece-wise continuous processes is introduced. The mixture of experts model is a powerful model which probabilistically splits the input space, allowing different models to operate in the separate regions. However, current methods have no fail-safe against outliers. In this paper, a robust mixture of experts model is proposed which consists of Student-t mixture models at the gates and Student-t distributed experts, trained via Bayesian inference. The Student-t distribution has heavier tails than the Gaussian distribution, and so it is more robust to outliers, noise, and non-normality in the data. Using both simulated data and real data obtained from the Z24 bridge, this robust mixture of experts model performs better than its Gaussian counterpart when outliers are present. In particular, it provides robustness to outliers in two forms: unbiased parameter regression models, and robustness to overfitting/complex models.
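    The heavier-tails argument can be illustrated without the full variational mixture model: fit a location parameter to data containing one gross outlier under a Gaussian versus a Student-t likelihood. This is a simplified sketch (fixed unit scale, 3 degrees of freedom, synthetic data), not the paper's model:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm, t as student_t

# Six points centred near zero plus one gross outlier.
data = np.array([-0.3, -0.1, 0.0, 0.1, 0.2, 0.3, 15.0])

def nll_gauss(loc):
    # Negative log-likelihood under a Gaussian with fixed unit scale.
    return -norm.logpdf(data, loc=loc, scale=1.0).sum()

def nll_t(loc):
    # Negative log-likelihood under a Student-t (3 d.o.f., unit scale).
    return -student_t.logpdf(data, 3, loc=loc, scale=1.0).sum()

gauss_loc = minimize_scalar(nll_gauss, bounds=(-5, 5), method="bounded").x
t_loc = minimize_scalar(nll_t, bounds=(-5, 5), method="bounded").x

# The Gaussian estimate is dragged toward the outlier; the t estimate stays
# with the bulk of the data.
print(round(gauss_loc, 2), round(t_loc, 2))
```

    The same effect is what keeps the t-distributed experts' regression parameters unbiased when outliers are present.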

  13. Shape detection of Gaborized outline versions of everyday objects

    PubMed Central

    Sassi, Michaël; Machilsen, Bart; Wagemans, Johan

    2012-01-01

    We previously tested the identifiability of six versions of Gaborized outlines of everyday objects, differing in the orientations assigned to elements inside and outside the outline. We found significant differences in identifiability between the versions, and related a number of stimulus metrics to identifiability [Sassi, M., Vancleef, K., Machilsen, B., Panis, S., & Wagemans, J. (2010). Identification of everyday objects on the basis of Gaborized outline versions. i-Perception, 1(3), 121–142]. In this study, after retesting the identifiability of new variants of three of the stimulus versions, we tested their robustness to local orientation jitter in a detection experiment. In general, our results replicated the key findings from the previous study, and allowed us to substantiate our earlier interpretations of the effects of our stimulus metrics and of the performance differences between the different stimulus versions. The results of the detection task revealed a different ranking order of stimulus versions than the identification task. By examining the parallels and differences between the effects of our stimulus metrics in the two tasks, we found evidence for a trade-off between shape detectability and identifiability. The generally simple and smooth shapes that yield the strongest contour integration and most robust detectability tend to lack the distinguishing features necessary for clear-cut identification. Conversely, contours that do contain such identifying features tend to be inherently more complex and, therefore, yield weaker integration and less robust detectability. PMID:23483752

  14. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling

    PubMed Central

    Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes, and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals by representing them as temporal sequences of binary events. A specific sequence of these events (a code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real-time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a distinct, characteristic response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078
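    The closed-loop idea can be sketched in a few lines: scan the binary event stream and fire the stimulation callback whenever the chosen code completes. The stream, code, and callback below are hypothetical, for illustration only:

```python
from collections import deque

def code_driven_stimulation(events, code, stimulate):
    # Slide a window of len(code) over the binary event stream and call
    # `stimulate` each time the target code completes in the stream.
    window = deque(maxlen=len(code))
    triggers = []
    for i, bit in enumerate(events):
        window.append(bit)
        if list(window) == list(code):
            triggers.append(i)  # stimulus delivered at this event index
            stimulate(i)
    return triggers

# Hypothetical stream: 1 = event (e.g. an electric-organ discharge), 0 = silence.
stream = [0, 1, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1]
code = [1, 1, 0]                # the code chosen to drive stimulation
fired = code_driven_stimulation(stream, code, lambda i: None)
print(fired)
```

    In the real-time toolbox the same matching must run within the inter-event deadline, so the stimulus lands immediately after the code completes.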

  15. An Early Years Toolbox for Assessing Early Executive Function, Language, Self-Regulation, and Social Development: Validity, Reliability, and Preliminary Norms

    PubMed Central

    Howard, Steven J.; Melhuish, Edward

    2016-01-01

    Several methods of assessing executive function (EF), self-regulation, language development, and social development in young children have been developed over previous decades. Yet new technologies make available methods of assessment not previously considered. In resolving conceptual and pragmatic limitations of existing tools, the Early Years Toolbox (EYT) offers substantial advantages for early assessment of language, EF, self-regulation, and social development. In the current study, results of our large-scale administration of this toolbox to 1,764 preschool and early primary school students indicated very good reliability, convergent validity with existing measures, and developmental sensitivity. Results were also suggestive of better capture of children’s emerging abilities relative to comparison measures. Preliminary norms are presented, showing a clear developmental trajectory across half-year age groups. The accessibility of the EYT, as well as its advantages over existing measures, offers considerably enhanced opportunities for objective measurement of young children’s abilities to enable research and educational applications. PMID:28503022

  16. Distributed Aerodynamic Sensing and Processing Toolbox

    NASA Technical Reports Server (NTRS)

    Brenner, Martin; Jutte, Christine; Mangalam, Arun

    2011-01-01

    A Distributed Aerodynamic Sensing and Processing (DASP) toolbox was designed and fabricated for flight test applications with an Aerostructures Test Wing (ATW) mounted under the fuselage of an F-15B on the Flight Test Fixture (FTF). DASP monitors and processes the aerodynamics with the structural dynamics using nonintrusive, surface-mounted, hot-film sensing. This aerodynamic measurement tool benefits programs devoted to static/dynamic load alleviation, body freedom flutter suppression, buffet control, improvement of aerodynamic efficiency through cruise control, supersonic wave drag reduction through shock control, etc. This DASP toolbox measures local and global unsteady aerodynamic load distribution with distributed sensing. It determines correlation between aerodynamic observables (aero forces) and structural dynamics, and allows control authority increase through aeroelastic shaping and active flow control. It offers improvements in flutter suppression and, in particular, body freedom flutter suppression, as well as aerodynamic performance of wings for increased range/endurance of manned/ unmanned flight vehicles. Other improvements include inlet performance with closed-loop active flow control, and development and validation of advanced analytical and computational tools for unsteady aerodynamics.

  17. The MONGOOSE Rational Arithmetic Toolbox.

    PubMed

    Le, Christopher; Chindelevitch, Leonid

    2018-01-01

    The modeling of metabolic networks has seen a rapid expansion following the complete sequencing of thousands of genomes. The constraint-based modeling framework has emerged as one of the most popular approaches to reconstructing and analyzing genome-scale metabolic models. Its main assumption is that of a quasi-steady-state, requiring that the production of each internal metabolite be balanced by its consumption. However, due to the multiscale nature of the models, the large number of reactions and metabolites, and the use of floating-point arithmetic for the stoichiometric coefficients, ensuring that this assumption holds can be challenging. The MONGOOSE toolbox addresses this problem by using rational arithmetic, thus ensuring that models are analyzed in a reproducible manner and consistently with modeling assumptions. In this chapter we present a protocol for the complete analysis of a metabolic network model using the MONGOOSE toolbox, via its newly developed GUI, and describe how it can be used as a model-checking platform both during and after the model construction process.
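    The quasi-steady-state assumption is the requirement S·v = 0 for the stoichiometric matrix S and flux vector v, and checking it exactly is where rational arithmetic pays off. A minimal sketch using Python's `fractions` module on a hypothetical two-metabolite network (not MONGOOSE's own API):

```python
from fractions import Fraction

# Stoichiometric matrix of a hypothetical network.
# Rows: metabolites A, B; columns: reactions r1 (-> A), r2 (A -> 3B), r3 (B ->).
S = [[Fraction(1), Fraction(-1), Fraction(0)],
     [Fraction(0), Fraction(3), Fraction(-1)]]

def is_balanced(S, v):
    # Exact quasi-steady-state check: S @ v == 0 in rational arithmetic,
    # with no floating-point tolerance to tune.
    return all(sum(a * x for a, x in zip(row, v)) == 0 for row in S)

# A flux vector with rational entries that floating point would only
# represent approximately.
v = [Fraction(1, 3), Fraction(1, 3), Fraction(1)]
print(is_balanced(S, v))
```

    With floats, the same check needs an arbitrary tolerance, and for multiscale models a single tolerance can both mask true imbalances and flag balanced fluxes — the reproducibility problem rational arithmetic removes.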

  18. Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling.

    PubMed

    Lareo, Angel; Forlim, Caroline G; Pinto, Reynaldo D; Varona, Pablo; Rodriguez, Francisco de Borja

    2016-01-01

    Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes, and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals by representing them as temporal sequences of binary events. A specific sequence of these events (a code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real-time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a distinct, characteristic response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox.

  19. BOLDSync: a MATLAB-based toolbox for synchronized stimulus presentation in functional MRI.

    PubMed

    Joshi, Jitesh; Saharan, Sumiti; Mandal, Pravat K

    2014-02-15

    Precise and synchronized presentation of paradigm stimuli in functional magnetic resonance imaging (fMRI) is central to obtaining accurate information about brain regions involved in a specific task. In this manuscript, we present a new MATLAB-based toolbox, BOLDSync, for synchronized stimulus presentation in fMRI. BOLDSync provides a user-friendly platform for the design and presentation of visual, audio, and multimodal audio-visual (AV) stimuli in functional imaging experiments. We present simulation experiments that demonstrate the millisecond synchronization accuracy of BOLDSync, and also illustrate its functionalities through application to an AV fMRI study. BOLDSync gains an advantage over other available proprietary and open-source toolboxes by offering a user-friendly and accessible interface that affords both precision in stimulus presentation and versatility across various types of stimulus designs and system setups. BOLDSync is a reliable, efficient, and versatile solution for synchronized stimulus presentation in fMRI studies. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. A spike sorting toolbox for up to thousands of electrodes validated with ground truth recordings in vitro and in vivo

    PubMed Central

    Lefebvre, Baptiste; Deny, Stéphane; Gardella, Christophe; Stimberg, Marcel; Jetter, Florian; Zeck, Guenther; Picaud, Serge; Duebel, Jens

    2018-01-01

    In recent years, multielectrode arrays and large silicon probes have been developed to record simultaneously from hundreds to thousands of densely packed electrodes. However, they require novel methods to extract the spiking activity of large ensembles of neurons. Here, we developed a new toolbox to sort spikes from these large-scale extracellular data. To validate our method, we performed simultaneous extracellular and loose-patch recordings in rodents to obtain 'ground truth' data, where the solution to the sorting problem is known for one cell. The performance of our algorithm was always close to the best expected performance, over a broad range of signal-to-noise ratios, in vitro and in vivo. The algorithm is entirely parallelized and has been successfully tested on recordings with up to 4225 electrodes. Our toolbox thus offers a generic solution to accurately sort spikes from up to thousands of electrodes. PMID:29557782
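    Ground-truth validation of a sorter reduces to matching detected spike times against the patch-recorded train within a small tolerance window. A minimal sketch with hypothetical spike times and tolerance (not the toolbox's own scoring code):

```python
def spike_match_score(detected, truth, tol=0.001):
    # Precision/recall of detected spike times against a ground-truth train
    # (e.g. from a loose-patch recording), matching each true spike to at
    # most one detection within +/- tol seconds.
    used = set()
    hits = 0
    for t_true in truth:
        for j, t_det in enumerate(detected):
            if j not in used and abs(t_det - t_true) <= tol:
                used.add(j)
                hits += 1
                break
    precision = hits / len(detected) if detected else 0.0
    recall = hits / len(truth) if truth else 0.0
    return precision, recall

truth = [0.010, 0.050, 0.120, 0.300]            # patch-recorded spikes (s)
detected = [0.0102, 0.051, 0.200, 0.2995]       # sorter output: one false
p, r = spike_match_score(detected, truth)       # positive, one missed spike
print(p, r)
```

    Scores like these, computed across signal-to-noise ratios, are what support the "close to the best expected performance" claim.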

  1. Developing a Fluorescent Toolbox To Shed Light on the Mysteries of RNA.

    PubMed

    Alexander, Seth C; Devaraj, Neal K

    2017-10-03

    Technologies that detect and image RNA have illuminated the complex roles played by RNA, redefining the traditional and superficial role first outlined by the central dogma of biology. Because there is such a wide diversity of RNA structure, arising from an assortment of functions within biology, a toolbox of approaches has emerged for the investigation of this important class of biomolecules. These methods are necessary to detect and elucidate the localization and dynamics of specific RNAs, and in doing so they unlock our understanding of how RNA dysregulation leads to disease. Current methods for detecting and imaging RNA include in situ hybridization techniques, fluorescent aptamers, RNA-binding proteins fused to fluorescent reporters, and covalent labeling strategies. Because of the inherent diversity of these methods, each approach comes with a set of strengths and limitations that leave room for future improvement. This perspective seeks to highlight the most recent advances and remaining challenges for the wide-ranging toolbox of technologies that illuminate RNA's contribution to cellular complexity.

  2. GenSSI 2.0: multi-experiment structural identifiability analysis of SBML models.

    PubMed

    Ligon, Thomas S; Fröhlich, Fabian; Chis, Oana T; Banga, Julio R; Balsa-Canto, Eva; Hasenauer, Jan

    2018-04-15

    Mathematical modeling using ordinary differential equations is used in systems biology to improve the understanding of dynamic biological processes. The parameters of ordinary differential equation models are usually estimated from experimental data. To analyze a priori the uniqueness of the solution of the estimation problem, structural identifiability analysis methods have been developed. We introduce GenSSI 2.0, an advancement of the software toolbox GenSSI (Generating Series for testing Structural Identifiability). GenSSI 2.0 is the first toolbox for structural identifiability analysis to implement Systems Biology Markup Language import, state/parameter transformations, and multi-experiment structural identifiability analysis. In addition, GenSSI 2.0 supports a range of MATLAB versions and is computationally more efficient than its previous version, enabling the analysis of more complex models. GenSSI 2.0 is an open-source MATLAB toolbox available at https://github.com/genssi-developer/GenSSI. Contact: thomas.ligon@physik.uni-muenchen.de or jan.hasenauer@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.

  3. Closed-Loop Evaluation of an Integrated Failure Identification and Fault Tolerant Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine; Khong, Thuan

    2006-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems developed for failure detection, identification, and reconfiguration, as well as upset recovery, need to be evaluated over broad regions of the flight envelope or under extreme flight conditions, and should include various sources of uncertainty. To apply formal robustness analysis, formulation of linear fractional transformation (LFT) models of complex parameter-dependent systems is required, which represent system uncertainty due to parameter uncertainty and actuator faults. This paper describes a detailed LFT model formulation procedure from the nonlinear model of a transport aircraft by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The closed-loop system is evaluated over the entire flight envelope based on the generated LFT model which can cover nonlinear dynamics. The robustness analysis results of the closed-loop fault tolerant control system of a transport aircraft are presented. A reliable flight envelope (safe flight regime) is also calculated from the robust performance analysis results, over which the closed-loop system can achieve the desired performance of command tracking and failure detection.

  4. Wind wave analysis in depth limited water using OCEANLYZ, A MATLAB toolbox

    NASA Astrophysics Data System (ADS)

    Karimpour, Arash; Chen, Qin

    2017-09-01

    There are a number of well-established methods in the literature describing how to assess and analyze measured wind wave data. However, obtaining reliable results from these methods requires adequate knowledge of their behavior, strengths, and weaknesses. A proper implementation of these methods requires a series of procedures, including pretreatment of the raw measurements and adjustment and refinement of the processed data, to provide quality assurance of the outcomes; otherwise, the analysis can lead to untrustworthy results. This paper discusses potential issues in these procedures, explains which parameters are influential for the outcomes, and suggests practical solutions to avoid and minimize errors in the wave results. The procedures of converting water pressure data into water surface elevation data, treating high-frequency data with a low signal-to-noise ratio, partitioning swell energy from wind sea, and estimating the peak wave frequency from a weighted integral of the wave power spectrum are described. Conversion and recovery of data acquired by a pressure transducer, particularly in depth-limited water such as estuaries and lakes, are explained in detail. To provide researchers with tools for a reliable estimation of wind wave parameters, the Ocean Wave Analyzing toolbox, OCEANLYZ, is introduced. The toolbox contains a number of MATLAB functions for the estimation of wave properties in the time and frequency domains. The toolbox has been developed and examined during a number of field study projects in Louisiana's estuaries.
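    One common form of a weighted-integral peak-frequency estimate (an assumption here; the abstract does not give OCEANLYZ's exact formula) is fp = Σ f·S(f)^q / Σ S(f)^q, which is far less sensitive to spectral noise than taking the argmax of a raw measured spectrum:

```python
import numpy as np

def weighted_peak_frequency(f, S, q=5):
    # Peak frequency from a weighted integral of the spectrum on a uniform
    # frequency grid: fp = sum(f * S**q) / sum(S**q).  The exponent q > 1
    # concentrates the weight around the spectral peak.
    w = S ** q
    return float((f * w).sum() / w.sum())

# Synthetic single-peaked spectrum with its maximum at 0.1 Hz.
f = np.linspace(0.01, 1.0, 500)
S = f ** -4 * np.exp(-((0.1 / f) ** 4))
fp = weighted_peak_frequency(f, S)
print(round(fp, 3))
```

    With a noisy measured spectrum, the argmax can jump between neighboring bins from one burst to the next, while the weighted integral varies smoothly.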

  5. A CRISPR/Cas9 Toolbox for Multiplexed Plant Genome Editing and Transcriptional Regulation.

    PubMed

    Lowder, Levi G; Zhang, Dengwei; Baltes, Nicholas J; Paul, Joseph W; Tang, Xu; Zheng, Xuelian; Voytas, Daniel F; Hsieh, Tzung-Fu; Zhang, Yong; Qi, Yiping

    2015-10-01

    The relative ease, speed, and biological scope of clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated Protein9 (Cas9)-based reagents for genomic manipulations are revolutionizing virtually all areas of molecular biosciences, including functional genomics, genetics, applied biomedical research, and agricultural biotechnology. In plant systems, however, a number of hurdles currently exist that limit this technology from reaching its full potential. For example, significant plant molecular biology expertise and effort is still required to generate functional expression constructs that allow simultaneous editing, and especially transcriptional regulation, of multiple different genomic loci or multiplexing, which is a significant advantage of CRISPR/Cas9 versus other genome-editing systems. To streamline and facilitate rapid and wide-scale use of CRISPR/Cas9-based technologies for plant research, we developed and implemented a comprehensive molecular toolbox for multifaceted CRISPR/Cas9 applications in plants. This toolbox provides researchers with a protocol and reagents to quickly and efficiently assemble functional CRISPR/Cas9 transfer DNA constructs for monocots and dicots using Golden Gate and Gateway cloning methods. It comes with a full suite of capabilities, including multiplexed gene editing and transcriptional activation or repression of plant endogenous genes. We report the functionality and effectiveness of this toolbox in model plants such as tobacco (Nicotiana benthamiana), Arabidopsis (Arabidopsis thaliana), and rice (Oryza sativa), demonstrating its utility for basic and applied plant research. © 2015 American Society of Plant Biologists. All Rights Reserved.

  6. F-15B QuietSpike(TM) Aeroservoelastic Flight Test Data Analysis

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2007-01-01

    System identification or mathematical modelling is utilised in the aerospace community for the development of simulation models for robust control law design. These models are often described as linear, time-invariant processes and assumed to be uniform throughout the flight envelope. Nevertheless, it is well known that the underlying process is inherently nonlinear. The reason for utilising a linear approach has been the lack of a proper set of tools for the identification of nonlinear systems. Over the past several decades the controls and biomedical communities have made great advances in developing tools for the identification of nonlinear systems. These approaches are robust and readily applicable to aerospace systems. In this paper, we show the application of one such nonlinear system identification technique, structure detection, to the analysis of F-15B QuietSpike(TM) aeroservoelastic flight test data. Structure detection is concerned with the selection of a subset of candidate terms that best describe the observed output. This is a necessary procedure to compute an efficient system description which may afford greater insight into the functionality of the system or a simpler controller design. Structure computation as a tool for black-box modelling may be of critical importance for the development of robust, parsimonious models for the flight-test community. Moreover, this approach may lead to efficient strategies for rapid envelope expansion which may save significant development time and costs.
    The objectives of this study are to demonstrate, via analysis of F-15B QuietSpike(TM) aeroservoelastic flight test data for several flight conditions (Mach number), that (i) linear models are inefficient for modelling aeroservoelastic data, (ii) nonlinear identification provides a parsimonious model description whilst providing a high percent fit for cross-validated data, and (iii) the model structure and parameters vary as the flight condition is altered.

  7. Development and experimental verification of a robust active noise control system for a diesel engine in submarines

    NASA Astrophysics Data System (ADS)

    Sachau, D.; Jukkert, S.; Hövelmann, N.

    2016-08-01

    This paper presents the development and experimental validation of an ANC (active noise control) system designed for a particular application in the exhaust line of a submarine. Tonal components of the exhaust noise in the frequency band from 75 Hz to 120 Hz are reduced by more than 30 dB. The ANC system is based on the feedforward leaky FxLMS algorithm. The observability of the sound pressure in the standing wave field is ensured by using two error microphones. A noninvasive online plant identification method is used to increase the robustness of the controller. The online plant identification is extended by a time-varying convergence gain to improve performance in the presence of slight errors in the frequency of the reference signal.
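    The feedforward FxLMS update can be sketched for a single channel. This sketch assumes a perfectly known secondary path and a clean tonal reference — the paper instead identifies the plant online and noninvasively, and adds leakage and a time-varying convergence gain:

```python
import numpy as np

def fxlms(ref, disturbance, sec_path, n_taps=16, mu=0.01):
    # Single-channel feedforward FxLMS.  `sec_path` is the (assumed known)
    # secondary-path impulse response; the reference filtered through it
    # drives the weight update w -= mu * e * x_filtered.
    w = np.zeros(n_taps)
    x_hist = np.zeros(n_taps)                   # reference history
    y_hist = np.zeros(len(sec_path))            # controller-output history
    xf = np.convolve(ref, sec_path)[:len(ref)]  # filtered reference
    xf_hist = np.zeros(n_taps)
    errors = np.empty(len(ref))
    for n in range(len(ref)):
        x_hist = np.roll(x_hist, 1); x_hist[0] = ref[n]
        y = w @ x_hist                          # anti-noise sample
        y_hist = np.roll(y_hist, 1); y_hist[0] = y
        e = disturbance[n] + sec_path @ y_hist  # residual at the error mic
        xf_hist = np.roll(xf_hist, 1); xf_hist[0] = xf[n]
        w -= mu * e * xf_hist                   # LMS update, minimising e**2
        errors[n] = e
    return errors

fs = 1000
t = np.arange(4000) / fs
ref = np.sin(2 * np.pi * 100 * t)               # 100 Hz tonal reference
sec_path = np.array([0.0, 0.8, 0.3])            # assumed secondary path
disturbance = 0.9 * np.sin(2 * np.pi * 100 * t + 0.6)
errors = fxlms(ref, disturbance, sec_path)
before = np.mean(errors[:200] ** 2)
after = np.mean(errors[-200:] ** 2)
print(before, after)
```

    The residual tone decays by orders of magnitude once the weights converge; in the paper the same principle is applied per tonal component of the diesel exhaust.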

  8. Identification and MS-assisted interpretation of genetically influenced NMR signals in human plasma

    PubMed Central

    2013-01-01

    Nuclear magnetic resonance spectroscopy (NMR) provides robust readouts of many metabolic parameters in one experiment. However, identification of clinically relevant markers in 1H NMR spectra is a major challenge. Association of NMR-derived quantities with genetic variants can uncover biologically relevant metabolic traits. Using NMR data of plasma samples from 1,757 individuals from the KORA study together with 655,658 genetic variants, we show that ratios between NMR intensities at two chemical shift positions can provide informative and robust biomarkers. We report seven loci of genetic association with NMR-derived traits (APOA1, CETP, CPS1, GCKR, FADS1, LIPC, PYROXD2) and characterize these traits biochemically using mass spectrometry. These ratios may now be used in clinical studies. PMID:23414815
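    The core idea, that a ratio of intensities at two chemical shift positions can itself serve as a trait, can be sketched as a brute-force scan over all bin pairs, scoring each log-ratio by its correlation with a genotype dosage. This toy Python illustration is an assumption about the mechanics, not the study's actual GWAS pipeline:

```python
import numpy as np

def best_ratio_trait(intensities, genotype):
    """Hypothetical sketch: scan all pairwise log-ratios of NMR bin
    intensities (which must be positive) and return the pair most
    correlated with a genotype dosage vector (0/1/2).
    intensities: array of shape (n_samples, n_bins)."""
    n = intensities.shape[1]
    log_i = np.log(intensities)
    best, best_r = None, 0.0
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            trait = log_i[:, a] - log_i[:, b]   # log of the ratio
            r = np.corrcoef(trait, genotype)[0, 1]
            if abs(r) > abs(best_r):
                best, best_r = (a, b), r
    return best, best_r
```

    On synthetic data where one bin's intensity depends on genotype, the scan recovers a ratio involving that bin.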

  9. Improving Leishmania Species Identification in Different Types of Samples from Cutaneous Lesions

    PubMed Central

    Cruz-Barrera, Mónica L.; Ovalle-Bracho, Clemencia; Ortegon-Vergara, Viviana; Pérez-Franco, Jairo E.

    2015-01-01

    The discrimination of Leishmania species from patient samples has epidemiological and clinical relevance. In this study, different gene target PCR-restriction fragment length polymorphism (RFLP) protocols were evaluated for their robustness as Leishmania species discriminators in 61 patients with cutaneous leishmaniasis. We modified the hsp70-PCR-RFLP protocol and found it to be the most reliable protocol for species identification. PMID:25609727

  10. ESA Atmospheric Toolbox

    NASA Astrophysics Data System (ADS)

    Niemeijer, Sander

    2017-04-01

    The ESA Atmospheric Toolbox (BEAT) is one of the ESA Sentinel Toolboxes. It consists of a set of software components to read, analyze, and visualize a wide range of atmospheric data products. In addition to the upcoming Sentinel-5P mission it supports a wide range of other atmospheric data products, including those of previous ESA missions, ESA Third Party missions, Copernicus Atmosphere Monitoring Service (CAMS), ground based data, etc. The toolbox consists of three main components that are called CODA, HARP and VISAN. CODA provides interfaces for direct reading of data from earth observation data files. These interfaces consist of command line applications, libraries, direct interfaces to scientific applications (IDL and MATLAB), and direct interfaces to programming languages (C, Fortran, Python, and Java). CODA provides a single interface to access data in a wide variety of data formats, including ASCII, binary, XML, netCDF, HDF4, HDF5, CDF, GRIB, RINEX, and SP3. HARP is a toolkit for reading, processing and inter-comparing satellite remote sensing data, model data, in-situ data, and ground based remote sensing data. The main goal of HARP is to assist in the inter-comparison of datasets. By appropriately chaining calls to HARP command line tools one can pre-process datasets such that two datasets that need to be compared end up having the same temporal/spatial grid, same data format/structure, and same physical unit. The toolkit comes with its own data format conventions, the HARP format, which is based on netcdf/HDF. Ingestion routines (based on CODA) allow conversion from a wide variety of atmospheric data products to this common format. In addition, the toolbox provides a wide range of operations to perform conversions on the data such as unit conversions, quantity conversions (e.g. number density to volume mixing ratios), regridding, vertical smoothing using averaging kernels, collocation of two datasets, etc. 
VISAN is a cross-platform visualization and analysis application for atmospheric data and can be used to visualize and analyze the data that you retrieve using the CODA and HARP interfaces. The application uses the Python language as the means through which you provide commands to the application. The Python interfaces for CODA and HARP are included so you can directly ingest product data from within VISAN. Powerful visualization functionality for 2D plots and geographical plots in VISAN will allow you to directly visualize the ingested data. All components from the ESA Atmospheric Toolbox are Open Source and freely available. Software packages can be downloaded from the BEAT website: http://stcorp.nl/beat/

  11. Substructure System Identification for Finite Element Model Updating

    NASA Technical Reports Server (NTRS)

    Craig, Roy R., Jr.; Blades, Eric L.

    1997-01-01

    This report summarizes research conducted under a NASA grant on the topic 'Substructure System Identification for Finite Element Model Updating.' The research concerns ongoing development of the Substructure System Identification Algorithm (SSID Algorithm), a system identification algorithm that can be used to obtain mathematical models of substructures, like Space Shuttle payloads. In the present study, particular attention was given to the following topics: making the algorithm robust to noisy test data, extending the algorithm to accept experimental FRF data that covers a broad frequency bandwidth, and developing a test analytical model (TAM) for use in relating test data to reduced-order finite element models.

  12. Implementation of facial recognition with Microsoft Kinect v2 sensor for patient verification.

    PubMed

    Silverstein, Evan; Snyder, Michael

    2017-06-01

    The aim of this study was to present a straightforward implementation of facial recognition using the Microsoft Kinect v2 sensor for patient identification in a radiotherapy setting. A facial recognition system was created with the Microsoft Kinect v2 using a facial mapping library distributed with the Kinect v2 SDK as a basis for the algorithm. The system extracts 31 fiducial points representing various facial landmarks which are used in both the creation of a reference data set and subsequent evaluations of real-time sensor data in the matching algorithm. To test the algorithm, a database of 39 faces was created, each with 465 vectors derived from the fiducial points, and a one-to-one matching procedure was performed to obtain sensitivity and specificity data of the facial identification system. ROC curves were plotted to display system performance and identify thresholds for match determination. In addition, system performance as a function of ambient light intensity was tested. Using optimized parameters in the matching algorithm, the sensitivity of the system for 5299 trials was 96.5% and the specificity was 96.7%. The results indicate a fairly robust methodology for verifying, in real-time, a specific face through comparison from a precollected reference data set. In its current implementation, the process of data collection for each face and subsequent matching session averaged approximately 30 s, which may be too onerous to provide a realistic supplement to patient identification in a clinical setting. Despite the time commitment, the data collection process was well tolerated by all participants and most robust when consistent ambient light conditions were maintained across both the reference recording session and subsequent real-time identification sessions. A facial recognition system can be implemented for patient identification using the Microsoft Kinect v2 sensor and the distributed SDK. 
In its present form, the system is accurate, if time consuming, and further iterations of the method could provide a robust, easy-to-implement, and cost-effective supplement to traditional patient identification methods. © 2017 American Association of Physicists in Medicine.
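    Notably, 465 is the number of unordered pairs among 31 landmarks (31·30/2 = 465), which suggests the vectors are pairwise differences between fiducial points. The sketch below assumes that construction and a cosine-similarity match score against an ROC-tuned threshold, purely as an illustration of the pipeline; the exact vector definition and threshold are assumptions:

```python
import numpy as np
from itertools import combinations

def face_signature(points):
    """Sketch: 31 (x, y, z) fiducial points -> 465 pairwise
    difference vectors (465 = C(31, 2); the actual vector
    construction in the paper is assumed, not confirmed)."""
    return np.array([points[j] - points[i]
                     for i, j in combinations(range(len(points)), 2)])

def match_score(sig_a, sig_b):
    # mean cosine similarity between corresponding pair vectors
    num = np.sum(sig_a * sig_b, axis=1)
    den = np.linalg.norm(sig_a, axis=1) * np.linalg.norm(sig_b, axis=1)
    return float(np.mean(num / den))

def is_match(sig_a, sig_b, threshold=0.98):
    # the threshold would be tuned from an ROC curve, as in the study
    return match_score(sig_a, sig_b) >= threshold
```

    A signature compared against a slightly perturbed copy of itself scores near 1.0, while unrelated point sets score near 0.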

  13. A novel method for accurate needle-tip identification in trans-rectal ultrasound-based high-dose-rate prostate brachytherapy.

    PubMed

    Zheng, Dandan; Todor, Dorin A

    2011-01-01

    In real-time trans-rectal ultrasound (TRUS)-based high-dose-rate prostate brachytherapy, the accurate identification of needle-tip position is critical for treatment planning and delivery. Currently, needle-tip identification on ultrasound images can be subject to large uncertainty and errors because of ultrasound image quality and imaging artifacts. To address this problem, we developed a method based on physical measurements with simple and practical implementation to improve the accuracy and robustness of needle-tip identification. Our method uses measurements of the residual needle length and an off-line pre-established coordinate transformation factor, to calculate the needle-tip position on the TRUS images. The transformation factor was established through a one-time systematic set of measurements of the probe and template holder positions, applicable to all patients. To compare the accuracy and robustness of the proposed method and the conventional method (ultrasound detection), based on the gold-standard X-ray fluoroscopy, extensive measurements were conducted in water and gel phantoms. In water phantom, our method showed an average tip-detection accuracy of 0.7 mm compared with 1.6 mm of the conventional method. In gel phantom (more realistic and tissue-like), our method maintained its level of accuracy while the uncertainty of the conventional method was 3.4 mm on average, with maximum values of over 10 mm because of imaging artifacts. A novel method based on simple physical measurements was developed to accurately detect the needle-tip position for TRUS-based high-dose-rate prostate brachytherapy. The method demonstrated much improved accuracy and robustness over the conventional method. Copyright © 2011 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
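    The measurement idea reduces to simple arithmetic: the inserted length is the total needle length minus the measured residual length, and the pre-established transformation maps that physical length into TRUS image coordinates. A hedged sketch (the parameter names and the affine form of the transformation are assumptions for illustration, not the paper's notation):

```python
def needle_tip_depth(total_length_mm, residual_length_mm,
                     template_offset_mm, scale=1.0):
    """Illustrative sketch of tip localization from physical
    measurements: inserted length = total - residual, then a
    pre-calibrated scale/offset (the 'transformation factor')
    maps it into TRUS image coordinates."""
    inserted = total_length_mm - residual_length_mm
    return scale * inserted - template_offset_mm
```

    For example, a 200 mm needle with 150 mm remaining outside and a 10 mm calibrated offset places the tip at 40 mm in image coordinates.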

  14. Autonomous frequency domain identification: Theory and experiment

    NASA Technical Reports Server (NTRS)

    Yam, Yeung; Bayard, D. S.; Hadaegh, F. Y.; Mettler, E.; Milman, M. H.; Scheid, R. E.

    1989-01-01

    The analysis, design, and on-orbit tuning of robust controllers require more information about the plant than simply a nominal estimate of the plant transfer function. Information is also required concerning the uncertainty in the nominal estimate, or more generally, the identification of a model set within which the true plant is known to lie. The identification methodology that was developed and experimentally demonstrated makes use of a simple but useful characterization of the model uncertainty based on the output error. This is a characterization of the additive uncertainty in the plant model, which has found considerable use in many robust control analysis and synthesis techniques. The identification process is initiated by a stochastic input u which is applied to the plant p, giving rise to the output y. The spectral estimate ĥ = P_uy/P_uu is used as an estimate of p, and the model order is estimated using the product moment matrix (PMM) method. A parametric model p̂ is then determined by curve fitting the spectral estimate to a rational transfer function. The additive uncertainty δ_m = p − p̂ is then estimated by the cross-spectral estimate δ̂ = P_ue/P_uu, where e = y − ŷ is the output error and ŷ = p̂u is the computed output of the parametric model subjected to the actual input u. The experimental results demonstrate that the curve-fitting algorithm produces the reduced-order plant model which minimizes the additive uncertainty. The nominal transfer function estimate p̂ and the estimate δ̂ of the additive uncertainty δ_m are subsequently available to be used for optimization of robust controller performance and stability.
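    With modern tooling, the two spectral estimates above can be formed in a few lines. The sketch below uses Welch/CSD estimators from SciPy (an assumption about tooling; the original work predates it) to compute ĥ = P_uy/P_uu and the additive-uncertainty estimate δ̂ = P_ue/P_uu:

```python
import numpy as np
from scipy import signal

def spectral_id(u, y, y_model, fs=1.0, nperseg=256):
    """Sketch of the report's estimates: h_hat = P_uy / P_uu for the
    plant, and delta_hat = P_ue / P_uu for the additive uncertainty,
    where e = y - y_model is the output error of the fitted model."""
    f, p_uu = signal.welch(u, fs=fs, nperseg=nperseg)
    _, p_uy = signal.csd(u, y, fs=fs, nperseg=nperseg)
    e = y - y_model                       # output error
    _, p_ue = signal.csd(u, e, fs=fs, nperseg=nperseg)
    return f, p_uy / p_uu, p_ue / p_uu
```

    For a static-gain plant y = 2u modelled as ŷ = 1.9u, the estimates recover |ĥ| = 2 and |δ̂| = 0.1 at every frequency.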

  15. Robust identification of transcriptional regulatory networks using a Gibbs sampler on outlier sum statistic.

    PubMed

    Gu, Jinghua; Xuan, Jianhua; Riggins, Rebecca B; Chen, Li; Wang, Yue; Clarke, Robert

    2012-08-01

    Identification of transcriptional regulatory networks (TRNs) is of significant importance in computational biology for cancer research, providing a critical building block to unravel disease pathways. However, existing methods for TRN identification suffer from the inclusion of excessive 'noise' in microarray data and false-positives in binding data, especially when applied to human tumor-derived cell line studies. More robust methods that can counteract the imperfection of data sources are therefore needed for reliable identification of TRNs in this context. In this article, we propose to establish a link between the quality of one target gene to represent its regulator and the uncertainty of its expression to represent other target genes. Specifically, an outlier sum statistic was used to measure the aggregated evidence for regulation events between target genes and their corresponding transcription factors. A Gibbs sampling method was then developed to estimate the marginal distribution of the outlier sum statistic, hence, to uncover underlying regulatory relationships. To evaluate the effectiveness of our proposed method, we compared its performance with that of an existing sampling-based method using both simulation data and yeast cell cycle data. The experimental results show that our method consistently outperforms the competing method in different settings of signal-to-noise ratio and network topology, indicating its robustness for biological applications. Finally, we applied our method to breast cancer cell line data and demonstrated its ability to extract biologically meaningful regulatory modules related to estrogen signaling and action in breast cancer. The Gibbs sampler MATLAB package is freely available at http://www.cbil.ece.vt.edu/software.htm. xuan@vt.edu Supplementary data are available at Bioinformatics online.
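    The outlier-sum statistic itself (due to Tibshirani and Hastie) is simple to compute; the paper's contribution is estimating its marginal distribution with a Gibbs sampler. A Python sketch of the statistic, with one common standardization convention assumed:

```python
import numpy as np

def outlier_sum(x, k=1.0):
    """Outlier-sum statistic sketch: median/MAD-standardize, then sum
    the standardized values exceeding q75 + k * IQR. Using it as an
    aggregate evidence score for regulation events, as in the paper,
    would layer the Gibbs sampler on top of this."""
    med = np.median(x)
    mad = np.median(np.abs(x - med)) or 1.0   # guard against MAD = 0
    z = (x - med) / mad
    q25, q75 = np.percentile(z, [25, 75])
    cut = q75 + k * (q75 - q25)
    return float(np.sum(z[z > cut]))
```

    Samples with no outliers score zero, while a few extreme values produce a large positive score.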

  16. Robust identification of transcriptional regulatory networks using a Gibbs sampler on outlier sum statistic

    PubMed Central

    Gu, Jinghua; Xuan, Jianhua; Riggins, Rebecca B.; Chen, Li; Wang, Yue; Clarke, Robert

    2012-01-01

    Motivation: Identification of transcriptional regulatory networks (TRNs) is of significant importance in computational biology for cancer research, providing a critical building block to unravel disease pathways. However, existing methods for TRN identification suffer from the inclusion of excessive ‘noise’ in microarray data and false-positives in binding data, especially when applied to human tumor-derived cell line studies. More robust methods that can counteract the imperfection of data sources are therefore needed for reliable identification of TRNs in this context. Results: In this article, we propose to establish a link between the quality of one target gene to represent its regulator and the uncertainty of its expression to represent other target genes. Specifically, an outlier sum statistic was used to measure the aggregated evidence for regulation events between target genes and their corresponding transcription factors. A Gibbs sampling method was then developed to estimate the marginal distribution of the outlier sum statistic, hence, to uncover underlying regulatory relationships. To evaluate the effectiveness of our proposed method, we compared its performance with that of an existing sampling-based method using both simulation data and yeast cell cycle data. The experimental results show that our method consistently outperforms the competing method in different settings of signal-to-noise ratio and network topology, indicating its robustness for biological applications. Finally, we applied our method to breast cancer cell line data and demonstrated its ability to extract biologically meaningful regulatory modules related to estrogen signaling and action in breast cancer. Availability and implementation: The Gibbs sampler MATLAB package is freely available at http://www.cbil.ece.vt.edu/software.htm. Contact: xuan@vt.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22595208

  17. DARPA super resolution vision system (SRVS) robust turbulence data collection and analysis

    NASA Astrophysics Data System (ADS)

    Espinola, Richard L.; Leonard, Kevin R.; Thompson, Roger; Tofsted, David; D'Arcy, Sean

    2014-05-01

    Atmospheric turbulence degrades the range performance of military imaging systems, specifically those intended for long range, ground-to-ground target identification. The recent Defense Advanced Research Projects Agency (DARPA) Super Resolution Vision System (SRVS) program developed novel post-processing system components to mitigate turbulence effects on visible and infrared sensor systems. As part of the program, the US Army RDECOM CERDEC NVESD and the US Army Research Laboratory Computational & Information Sciences Directorate (CISD) collaborated on a field collection and atmospheric characterization of a two-handed weapon identification dataset through a diurnal cycle for a variety of ranges and sensor systems. The robust dataset is useful in developing new models and simulations of turbulence, as well as for providing a standard baseline for comparison of sensor systems in the presence of turbulence degradation and mitigation. In this paper, we describe the field collection and atmospheric characterization and present the robust dataset to the defense, sensing, and security community. In addition, we present an expanded model validation of turbulence degradation using the field collected video sequences.

  18. Performance analysis of robust road sign identification

    NASA Astrophysics Data System (ADS)

    Ali, Nursabillilah M.; Mustafah, Y. M.; Rashid, N. K. A. M.

    2013-12-01

    This study describes performance analysis of a robust system for road sign identification that incorporated two stages of different algorithms. The proposed algorithms consist of HSV color filtering and PCA techniques, used respectively in the detection and recognition stages. The proposed algorithms are able to detect the three standard types of colored images, namely Red, Yellow and Blue. The hypothesis of the study is that road sign images can be used to detect and identify signs even in the presence of occlusions and rotational changes. PCA is a feature extraction technique that reduces dimensional size. The sign image can be easily recognized and identified by the PCA method, as it has been used in many application areas. The experimental results show that HSV filtering is robust in road sign detection, with minimum success rates of 88% and 77% for non-occluded and partially occluded images, respectively. Recognition rates using PCA were in the range of 94-98%. All classes were recognized successfully at occlusion levels between 5% and 10%.
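    The two stages can be sketched compactly: a hue/saturation/value threshold for the detection stage and a principal-component projection ("eigensigns") for recognition. The thresholds and component counts below are illustrative assumptions, not the study's tuned values:

```python
import numpy as np

def red_mask(hsv):
    """Sketch of the detection stage: binary mask for red signs from
    an HSV image (H in [0, 360), S and V in [0, 1]; thresholds are
    illustrative)."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    hue_red = (h < 15) | (h > 345)        # red wraps around hue 0
    return hue_red & (s > 0.5) & (v > 0.3)

def pca_features(signs, n_components=8):
    """Sketch of the recognition stage: project flattened sign images
    onto their leading principal components via SVD."""
    X = signs.reshape(len(signs), -1).astype(float)
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = vt[:n_components]
    return (X - mean) @ basis.T, mean, basis
```

    A new sign would be classified by projecting it with the stored mean and basis and comparing to the training projections (e.g., nearest neighbour).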

  19. An adaptive deep learning approach for PPG-based identification.

    PubMed

    Jindal, V; Birjandtalab, J; Pouyan, M Baran; Nourani, M

    2016-08-01

    Wearable biosensors have become increasingly popular in healthcare due to their capabilities for low-cost and long-term biosignal monitoring. This paper presents a novel two-stage technique to offer biometric identification using these biosensors through Deep Belief Networks and Restricted Boltzmann Machines. Our identification approach improves robustness in current monitoring procedures within clinical, e-health and fitness environments using Photoplethysmography (PPG) signals through deep learning classification models. The approach is tested on the TROIKA dataset using 10-fold cross validation and achieved an accuracy of 96.1%.

  20. Adding tools to the open source toolbox: The Internet

    NASA Technical Reports Server (NTRS)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial databases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  1. Why Heuristics Work.

    PubMed

    Gigerenzer, Gerd

    2008-01-01

    The adaptive toolbox is a Darwinian-inspired theory that conceives of the mind as a modular system that is composed of heuristics, their building blocks, and evolved capacities. The study of the adaptive toolbox is descriptive and analyzes the selection and structure of heuristics in social and physical environments. The study of ecological rationality is prescriptive and identifies the structure of environments in which specific heuristics either succeed or fail. Results have been used for designing heuristics and environments to improve professional decision making in the real world. © 2008 Association for Psychological Science.

  2. GIXSGUI : a MATLAB toolbox for grazing-incidence X-ray scattering data visualization and reduction, and indexing of buried three-dimensional periodic nanostructured films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Zhang

    GIXSGUI is a MATLAB toolbox that offers both a graphical user interface and script-based access to visualize and process grazing-incidence X-ray scattering data from nanostructures on surfaces and in thin films. It provides routine surface scattering data reduction methods such as geometric correction, one-dimensional intensity linecut, two-dimensional intensity reshaping, etc. Three-dimensional indexing is also implemented to determine the space group and lattice parameters of buried organized nanoscopic structures in supported thin films.

  3. Designing a Robust Micromixer Based on Fluid Stretching

    NASA Astrophysics Data System (ADS)

    Mott, David; Gautam, Dipesh; Voth, Greg; Oran, Elaine

    2010-11-01

    A metric for measuring fluid stretching based on finite-time Lyapunov exponents is described, and the use of this metric for optimizing mixing in microfluidic components is explored. The metric is implemented within an automated design approach called the Computational Toolbox (CTB). The CTB designs components by adding geometric features, such as grooves of various shapes, to a microchannel. The transport produced by each of these features in isolation was pre-computed and stored as an "advection map" for that feature, and the flow through a composite geometry that combines these features is calculated rapidly by applying the corresponding maps in sequence. A genetic algorithm search then chooses the feature combination that optimizes a user-specified metric. Metrics based on the variance of concentration generally require the user to specify the fluid distributions at inflow, which leads to different mixer designs for different inflow arrangements. The stretching metric is independent of the fluid arrangement at inflow. Mixers designed using the stretching metric are compared to those designed using a variance of concentration metric and show excellent performance across a variety of inflow distributions and diffusivities.
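    The finite-time Lyapunov exponent at a seed point measures the exponential separation rate of nearby tracers under the flow map. A generic Python sketch (not the CTB implementation) using a finite-difference deformation gradient and the Cauchy-Green strain tensor:

```python
import numpy as np

def ftle(flow_map, x0, y0, T, eps=1e-4):
    """Finite-time Lyapunov exponent at one seed point of a 2D flow
    map (x0, y0, T) -> (x, y)."""
    # deformation gradient by central differences over perturbed seeds
    xpx, ypx = flow_map(x0 + eps, y0, T)
    xmx, ymx = flow_map(x0 - eps, y0, T)
    xpy, ypy = flow_map(x0, y0 + eps, T)
    xmy, ymy = flow_map(x0, y0 - eps, T)
    F = np.array([[xpx - xmx, xpy - xmy],
                  [ypx - ymx, ypy - ymy]]) / (2 * eps)
    C = F.T @ F                          # Cauchy-Green strain tensor
    lam_max = np.linalg.eigvalsh(C)[-1]  # largest principal stretch^2
    return np.log(lam_max) / (2 * T)     # = (1/T) * ln(sqrt(lam_max))
```

    For the linear stretching map (x, y) -> (x e^{aT}, y e^{-aT}), the sketch recovers the exact exponent a.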

  4. Cellular neural networks, the Navier-Stokes equation, and microarray image reconstruction.

    PubMed

    Zineddin, Bachar; Wang, Zidong; Liu, Xiaohui

    2011-11-01

    Although the last decade has witnessed a great deal of improvements achieved for the microarray technology, many major developments in all the main stages of this technology, including image processing, are still needed. Some hardware implementations of microarray image processing have been proposed in the literature and proved to be promising alternatives to the currently available software systems. However, the main drawback of those proposed approaches is that they do not address quantification of the gene spot in a realistic way, i.e., without any assumption about the image surface. Our aim in this paper is to present a new image-reconstruction algorithm using the cellular neural network that solves the Navier-Stokes equation. This algorithm offers a robust method for estimating the background signal within the gene-spot region. The MATCNN toolbox for Matlab is used to test the proposed method. Quantitative comparisons are carried out, i.e., in terms of objective criteria, between our approach and some other available methods. It is shown that the proposed algorithm gives highly accurate and realistic measurements in a fully automated manner within a remarkably efficient time.
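    The underlying idea, reconstructing the background inside the spot region from the surrounding image surface, can be illustrated with a much simpler stand-in: harmonic inpainting, i.e., relaxing Laplace's equation inside the masked spot with the surrounding background as boundary data. This sketch only illustrates the "fill from around the spot" concept; the paper's method is a cellular neural network solving the Navier-Stokes equation, not this:

```python
import numpy as np

def inpaint_background(img, mask, n_iter=2000):
    """Estimate the background inside the spot region (mask == True)
    by Jacobi relaxation of Laplace's equation, with the pixels
    outside the mask held fixed as boundary data. The mask must not
    touch the image border (np.roll wraps around)."""
    out = img.astype(float).copy()
    out[mask] = out[~mask].mean()          # initial guess
    for _ in range(n_iter):
        avg = 0.25 * (np.roll(out, 1, 0) + np.roll(out, -1, 0)
                      + np.roll(out, 1, 1) + np.roll(out, -1, 1))
        out[mask] = avg[mask]              # relax interior only
    return out
```

    On a synthetic image with a linear background gradient and a bright square spot, the relaxed interior reproduces the gradient (a linear surface is harmonic, so it is recovered exactly in the limit).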

  5. Using Arduino microcontroller boards to measure response latencies.

    PubMed

    Schubert, Thomas W; D'Ausilio, Alessandro; Canto, Rosario

    2013-12-01

    Latencies of buttonpresses are a staple of cognitive science paradigms. Often keyboards are employed to collect buttonpresses, but their imprecision and variability decreases test power and increases the risk of false positives. Response boxes and data acquisition cards are precise, but expensive and inflexible, alternatives. We propose using open-source Arduino microcontroller boards as an inexpensive and flexible alternative. These boards connect to standard experimental software using a USB connection and a virtual serial port, or by emulating a keyboard. In our solution, an Arduino measures response latencies after being signaled the start of a trial, and communicates the latency and response back to the PC over a USB connection. We demonstrated the reliability, robustness, and precision of this communication in six studies. Test measures confirmed that the error added to the measurement had an SD of less than 1 ms. Alternatively, emulation of a keyboard results in similarly precise measurement. The Arduino performs as well as a serial response box, and better than a keyboard. In addition, our setup allows for the flexible integration of other sensors, and even actuators, to extend the cognitive science toolbox.
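    On the host side, the described setup reduces to reading the latency the Arduino reports back over the virtual serial port. A Python sketch of that host logic, where the `RESP <button> <latency_us>` message format is purely hypothetical (the paper does not specify its protocol) and `ser` is assumed to be a pyserial `serial.Serial` instance:

```python
import time

def parse_latency_line(line):
    """Parse a hypothetical 'RESP <button> <latency_us>' message that
    an Arduino sketch could print over the USB serial link; returns
    (button, latency in milliseconds)."""
    tag, button, latency_us = line.strip().split()
    assert tag == "RESP"
    return int(button), int(latency_us) / 1000.0

def await_response(ser, timeout_s=5.0):
    """Poll the open serial port until a response message arrives,
    or return None on timeout."""
    t_end = time.monotonic() + timeout_s
    while time.monotonic() < t_end:
        line = ser.readline().decode("ascii", errors="ignore")
        if line.startswith("RESP"):
            return parse_latency_line(line)
    return None
```

    Keeping the timing measurement on the microcontroller, and only parsing the result on the PC, is what removes the host operating system's scheduling jitter from the latency estimate.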

  6. Homo heuristicus: why biased minds make better inferences.

    PubMed

    Gigerenzer, Gerd; Brighton, Henry

    2009-01-01

    Heuristics are efficient cognitive processes that ignore information. In contrast to the widely held view that less processing reduces accuracy, the study of heuristics shows that less information, computation, and time can in fact improve accuracy. We review the major progress made so far: (a) the discovery of less-is-more effects; (b) the study of the ecological rationality of heuristics, which examines in which environments a given strategy succeeds or fails, and why; (c) an advancement from vague labels to computational models of heuristics; (d) the development of a systematic theory of heuristics that identifies their building blocks and the evolved capacities they exploit, and views the cognitive system as relying on an "adaptive toolbox;" and (e) the development of an empirical methodology that accounts for individual differences, conducts competitive tests, and has provided evidence for people's adaptive use of heuristics. Homo heuristicus has a biased mind and ignores part of the available information, yet a biased mind can handle uncertainty more efficiently and robustly than an unbiased mind relying on more resource-intensive and general-purpose processing strategies. Copyright © 2009 Cognitive Science Society, Inc.
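    One of the computational models referred to in (c) is take-the-best, which decides between two alternatives by inspecting cues in order of validity and stopping at the first cue that discriminates. A minimal sketch (cue names are illustrative):

```python
def take_the_best(cues_a, cues_b, validity_order):
    """Take-the-best heuristic: inspect binary cues in descending
    order of validity and decide on the first one that discriminates,
    ignoring all remaining information."""
    for cue in validity_order:
        if cues_a[cue] != cues_b[cue]:
            return "a" if cues_a[cue] > cues_b[cue] else "b"
    return None  # no cue discriminates: guess
```

    This is the "less-is-more" point in miniature: the decision uses only the first discriminating cue, yet in suitable environments such lexicographic rules predict as well as, or better than, full linear models.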

  7. Biochemical Engineering Approaches for Increasing Viability and Functionality of Probiotic Bacteria

    PubMed Central

    Nguyen, Huu-Thanh; Truong, Dieu-Hien; Kouhoundé, Sonagnon; Ly, Sokny; Razafindralambo, Hary; Delvigne, Frank

    2016-01-01

    The literature presents a growing body of evidence demonstrating the positive effect of probiotics on health. Probiotic consumption levels are rising quickly in the world despite the fluctuation of their viability and functionality. Technological methods aiming at improving probiotic characteristics are thus highly sought after. However, a microbial metabolic engineering toolbox is not available for this kind of application. On the other hand, basic microbiology teaches us that bacteria are able to exhibit adaptation to external stresses. It is known that adequately applied sub-lethal stress, i.e., controlled in amplitude and frequency at a given stage of the culture, is able to enhance microbial robustness. This property could be potentially used to improve the viability of probiotic bacteria, but some technical challenges still need to be overcome before any industrial implementation. This review paper investigates the different technical tools that can be used in order to define the proper condition for improving viability of probiotic bacteria and their implementation at the industrial scale. Based on the example of Bifidobacterium bifidum, potentialities for simultaneously improving viability, but also functionality of probiotics will be described. PMID:27271598

  8. Graceful Failure, Engineering, and Planning for Extremes: The Engineering for Climate Extremes Partnership (ECEP)

    NASA Astrophysics Data System (ADS)

    Bruyere, C. L.; Tye, M. R.; Holland, G. J.; Done, J.

    2015-12-01

    Graceful failure acknowledges that all systems will fail at some level and incorporates the potential for failure as a key component of engineering design, community planning, and the associated research and development. This is a fundamental component of the ECEP, an interdisciplinary partnership bringing together scientific, engineering, cultural, business and government expertise to develop robust, well-communicated predictions and advice on the impacts of weather and climate extremes in support of decision-making. A feature of the partnership is the manner in which basic and applied research and development is conducted in direct collaboration with the end user. A major ECEP focus is the Global Risk and Resilience Toolbox (GRRT), which is aimed at developing a public-domain risk-modeling and response data and planning system in support of engineering design, and community planning and adaptation activities. In this presentation I will outline the overall ECEP and GRRT activities, and expand on the 'graceful failure' concept. Specific examples for direct assessment and prediction of hurricane impacts and damage potential will be included.

  9. Support vector machines to detect physiological patterns for EEG and EMG-based human-computer interaction: a review

    NASA Astrophysics Data System (ADS)

    Quitadamo, L. R.; Cavrini, F.; Sbernini, L.; Riillo, F.; Bianchi, L.; Seri, S.; Saggio, G.

    2017-02-01

    Support vector machines (SVMs) are widely used classifiers for detecting physiological patterns in human-computer interaction (HCI). Their success is due to their versatility, robustness and large availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameters selection are reported, making it impossible to reproduce study analysis and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the applications of SVM. The aim of this paper is to provide a review of the usage of SVM in the determination of brain and muscle patterns for HCI, by focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning reviewed papers are listed in tables and statistics of SVM use in the literature are presented. Suitability of SVM for HCI is discussed and critical comparisons with other classifiers are reported.
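    The reproducibility point the review makes, that implementations should state their parameters explicitly, can be illustrated with a toy linear SVM trained by the Pegasos sub-gradient method. This is a stand-in, not one of the library toolboxes the review surveys; every hyperparameter is named so a reader could reproduce the run:

```python
import numpy as np

def pegasos_svm(X, y, lam=0.01, n_iter=2000, seed=0):
    """Minimal linear SVM via the Pegasos sub-gradient method.
    X: (n, d) features (append a ones column for a bias term),
    y: labels in {-1, +1}. lam, n_iter, seed stated explicitly."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for t in range(1, n_iter + 1):
        i = rng.integers(len(X))
        eta = 1.0 / (lam * t)             # decaying step size
        if y[i] * (w @ X[i]) < 1:         # hinge-loss margin violated
            w = (1 - eta * lam) * w + eta * y[i] * X[i]
        else:
            w = (1 - eta * lam) * w
    return w

def predict(w, X):
    return np.sign(X @ w)
```

    Kernelized SVMs, as typically used for EEG/EMG features, add a kernel choice and its parameters (e.g., an RBF width) to the list of settings that must be reported.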

  10. Augmenting the Genetic Toolbox for Sulfolobus islandicus with a Stringent Positive Selectable Marker for Agmatine Prototrophy

    PubMed Central

    Cooper, Tara E.; Krause, David J.

    2013-01-01

    Sulfolobus species have become the model organisms for studying the unique biology of the crenarchaeal division of the archaeal domain. In particular, Sulfolobus islandicus provides a powerful opportunity to explore natural variation via experimental functional genomics. To support these efforts, we further expanded genetic tools for S. islandicus by developing a stringent positive selection for agmatine prototrophs in strains in which the argD gene, encoding arginine decarboxylase, has been deleted. Strains with deletions in argD were shown to be auxotrophic for agmatine even in nutrient-rich medium, but growth could be restored by either supplementation of exogenous agmatine or reintroduction of a functional copy of the argD gene from S. solfataricus P2 into the ΔargD host. Using this stringent selection, a robust targeted gene knockout system was established via an improved next generation of the MID (marker insertion and unmarked target gene deletion) method. Application of this novel system was validated by targeted knockout of the upsEF genes involved in UV-inducible cell aggregation formation. PMID:23835176

  11. A new technique for quantifying symmetry and opening angles in quartz c-axis pole figures: Implications for interpreting the kinematic and thermal properties of rocks

    NASA Astrophysics Data System (ADS)

    Hunter, N. J. R.; Weinberg, R. F.; Wilson, C. J. L.; Law, R. D.

    2018-07-01

    Variations in flow kinematics influence the type of crystallographic preferred orientations (CPOs) in plastically deformed quartz, yet we currently lack a robust means of quantifying the diagnostic symmetries that develop in the c-axis (0001) pole figure. In this contribution, we demonstrate how the symmetry of common c-axis topologies may be quantified by analysing the intensity distribution across a line transect of the pole figure margin. A symmetry value (S) measures the relative difference in intensities between marginal girdle maxima in the pole figure, and thus the degree to which the pole figure defines orthorhombic or monoclinic end member symmetries. This provides a semi-quantitative depiction of whether the rocks underwent coaxial or non-coaxial flow, respectively, and may subsequently be used to quantify other topological properties, such as the opening angle of girdle maxima. The open source Matlab® toolbox MTEX is used to quantify pole figure symmetries in quartzite samples from the Main Central Thrust (NW Himalaya) and the Moine Thrust (NW Scotland).
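
    The symmetry measure can be sketched as follows; since the paper's exact formula is not reproduced here, the ratio-of-maxima definition below (S = 1 for equal marginal girdle maxima, S approaching 0 for a single dominant maximum) is an illustrative assumption, as are the toy transect intensities.

```python
def marginal_maxima(transect):
    """Return (angle_deg, intensity) local maxima along a 0-180 degree
    transect of the pole figure margin, strongest first."""
    n = len(transect)
    step = 180.0 / n
    peaks = []
    for i in range(n):
        prev, nxt = transect[i - 1], transect[(i + 1) % n]
        if transect[i] > prev and transect[i] >= nxt:
            peaks.append((i * step, transect[i]))
    return sorted(peaks, key=lambda p: -p[1])[:2]

def symmetry_value(transect):
    """Hypothetical symmetry value: intensity ratio of the two strongest
    marginal girdle maxima (1 = orthorhombic, -> 0 = monoclinic end member)."""
    peaks = marginal_maxima(transect)
    if len(peaks) < 2:
        return 0.0
    (_, i1), (_, i2) = peaks
    return min(i1, i2) / max(i1, i2)

symmetric_transect = [0, 1, 5, 1, 0, 0, 1, 1, 5, 1, 0, 0]   # coaxial-flow-like
asymmetric_transect = [0, 1, 5, 1, 0, 0, 1, 1, 2, 1, 0, 0]  # non-coaxial-like
```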

  12. Effects of thermal blooming on systems comprised of tiled subapertures

    NASA Astrophysics Data System (ADS)

    Leakeas, Charles L.; Bartell, Richard J.; Krizo, Matthew J.; Fiorino, Steven T.; Cusumano, Salvatore J.; Whiteley, Matthew R.

    2010-04-01

    Laser weapon systems comprised of tiled subapertures are rapidly emerging in the directed energy community. The Air Force Institute of Technology Center for Directed Energy (AFIT/CDE), under sponsorship of the HEL Joint Technology Office, has developed performance models of such laser weapon system configurations consisting of tiled arrays of both slab and fiber subapertures. These performance models are based on results of detailed wave-optics analyses conducted using WaveTrain. Previous performance model versions developed in this effort represent system characteristics such as subaperture shape, aperture fill factor, subaperture intensity profile, subaperture placement in the primary aperture, subaperture mutual coherence (piston), subaperture differential jitter (tilt), and the beam-quality wave-front error associated with each subaperture. The current work is a prerequisite for the development of robust performance models for turbulence and thermal blooming effects for tiled systems. Emphasis is placed on low-altitude tactical scenarios. The enhanced performance model developed will be added to AFIT/CDE's HELEEOS parametric one-on-one engagement-level model via the Scaling for High Energy Laser and Relay Engagement (SHaRE) toolbox.

  13. Robust nonlinear control of vectored thrust aircraft

    NASA Technical Reports Server (NTRS)

    Doyle, John C.; Murray, Richard; Morris, John

    1993-01-01

    An interdisciplinary program in robust control for nonlinear systems with applications to a variety of engineering problems is outlined. Major emphasis will be placed on flight control, with both experimental and analytical studies. This program builds on recent new results in control theory for stability, stabilization, robust stability, robust performance, synthesis, and model reduction in a unified framework using Linear Fractional Transformations (LFTs), Linear Matrix Inequalities (LMIs), and the structured singular value μ. Most of these new advances have been accomplished by the Caltech controls group independently or in collaboration with researchers in other institutions. These recent results offer a new and remarkably unified framework for all aspects of robust control, but what is particularly important for this program is that they also have important implications for system identification and control of nonlinear systems. This combines well with Caltech's expertise in nonlinear control theory, both in geometric methods and methods for systems with constraints and saturations.

  14. The Gini Coefficient as a Tool for Image Family Identification in Strong Lensing Systems with Multiple Images

    NASA Astrophysics Data System (ADS)

    Florian, Michael K.; Gladders, Michael D.; Li, Nan; Sharon, Keren

    2016-01-01

    The sample of cosmological strong lensing systems has been steadily growing in recent years and with the advent of the next generation of space-based survey telescopes, the sample will reach into the thousands. The accuracy of strong lens models relies on robust identification of multiple image families of lensed galaxies. For the most massive lenses, often more than one background galaxy is magnified and multiply imaged, and even in the cases of only a single lensed source, identification of counter images is not always robust. Recently, we have shown that the Gini coefficient in space-telescope-quality imaging is a measurement of galaxy morphology that is relatively well-preserved by strong gravitational lensing. Here, we investigate its usefulness as a diagnostic for the purposes of image family identification and show that it can remove some of the degeneracies encountered when using color as the sole diagnostic, and can do so without the need for additional observations since whenever a color is available, two Gini coefficients are as well.
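
    The diagnostic can be sketched in a few lines: the Gini coefficient of a lensed image's pixel fluxes, computed here with the standard sorted-value (Glasser) formulation commonly used in galaxy morphology work. The toy pixel arrays are invented for illustration.

```python
def gini(pixel_values):
    """Gini coefficient of non-negative pixel fluxes: 0 for perfectly
    uniform light, approaching 1 when flux concentrates in one pixel."""
    xs = sorted(pixel_values)
    n = len(xs)
    mean = sum(xs) / n
    if mean == 0:
        return 0.0
    # Glasser formulation over sorted values (1-indexed rank i).
    total = sum((2 * i - n - 1) * x for i, x in enumerate(xs, start=1))
    return total / (mean * n * (n - 1))

uniform = [3.0] * 100               # flat flux profile -> G near 0
concentrated = [0.0] * 99 + [5.0]   # all flux in a single pixel -> G near 1
```

    Because the statistic depends only on the ranked flux distribution, not on pixel positions, it is relatively insensitive to the image distortions introduced by lensing, which is the property exploited in the paper.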

  15. A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification

    DOE PAGES

    Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; ...

    2013-01-01

    Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches: a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification.

  16. A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification

    PubMed Central

    Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Varnum, Susan M.; Brown, Joseph N.; Riensche, Roderick M.; Adkins, Joshua N.; Jacobs, Jon M.; Hoidal, John R.; Scholand, Mary Beth; Pounds, Joel G.; Blackburn, Michael R.; Rodland, Karin D.; McDermott, Jason E.

    2013-01-01

    Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches: a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification. PMID:24223463

  17. A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.

    2013-10-01

    Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches: a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification.

  18. F-15B Quiet Spike(TradeMark) Aeroservoelastic Flight-Test Data Analysis

    NASA Technical Reports Server (NTRS)

    Kukreja, Sunil L.

    2007-01-01

    System identification is utilized in the aerospace community for development of simulation models for robust control law design. These models are often described as linear, time-invariant processes and assumed to be uniform throughout the flight envelope. Nevertheless, it is well known that the underlying process is inherently nonlinear. Over the past several decades the controls and biomedical communities have made great advances in developing tools for the identification of nonlinear systems. In this report, we show the application of one such nonlinear system identification technique, structure detection, for the analysis of Quiet Spike(TradeMark) (Gulfstream Aerospace Corporation, Savannah, Georgia) aeroservoelastic flight-test data. Structure detection is concerned with the selection of a subset of candidate terms that best describe the observed output. Structure computation as a tool for black-box modeling may be of critical importance for the development of robust, parsimonious models for the flight-test community. The objectives of this study are to demonstrate, via analysis of Quiet Spike(TradeMark) aeroservoelastic flight-test data for several flight conditions, that: linear models are inefficient for modelling aeroservoelastic data; nonlinear identification provides a parsimonious model description whilst providing a high percent fit for cross-validated data; and the model structure and parameters vary as the flight condition is altered.

  19. Robust volcano plot: identification of differential metabolites in the presence of outliers.

    PubMed

    Kumar, Nishith; Hoque, Md Aminul; Sugimoto, Masahiro

    2018-04-11

    The identification of differential metabolites in metabolomics is still a big challenge and plays a prominent role in metabolomics data analyses. Metabolomics datasets often contain outliers because of analytical, experimental, and biological ambiguity, but the currently available differential metabolite identification techniques are sensitive to outliers. We propose a kernel weight based outlier-robust volcano plot for identifying differential metabolites from noisy metabolomics datasets. Two numerical experiments are used to evaluate the performance of the proposed technique against nine existing techniques, including the t-test and the Kruskal-Wallis test. Artificially generated data with outliers reveal that the proposed method results in a lower misclassification error rate and a greater area under the receiver operating characteristic curve compared with existing methods. An experimentally measured breast cancer dataset to which outliers were artificially added reveals that our proposed method produces only two non-overlapping differential metabolites whereas the other nine methods produced between seven and 57 non-overlapping differential metabolites. Our data analyses show that the performance of the proposed differential metabolite identification technique is better than that of existing methods. Thus, the proposed method can contribute to analysis of metabolomics data with outliers. The R package and user manual of the proposed method are available at https://github.com/nishithkumarpaul/Rvolcano .
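
    The core idea of kernel-weight-based robustness can be sketched as follows: samples far from the group median are down-weighted before computing the location estimate used in fold-change calculations. The Gaussian-kernel/MAD weighting scheme, constants, and toy data below are illustrative assumptions, not the authors' exact formulation (their implementation is in the R package linked above).

```python
import math

def kernel_weighted_mean(xs, bandwidth_scale=3.0):
    """Location estimate that down-weights outliers using a Gaussian kernel
    centred on the median and scaled by the MAD (illustrative scheme)."""
    xs = list(xs)
    med = sorted(xs)[len(xs) // 2]                      # simple upper median
    mad = sorted(abs(x - med) for x in xs)[len(xs) // 2]
    scale = bandwidth_scale * mad + 1e-9                # guard against MAD == 0
    weights = [math.exp(-0.5 * ((x - med) / scale) ** 2) for x in xs]
    return sum(w * x for w, x in zip(weights, xs)) / sum(weights)

# Metabolite intensities for one group, with one gross outlier.
group = [10.1, 9.9, 10.0, 10.2, 9.8, 100.0]
```

    The ordinary mean of `group` is 25.0, dominated by the outlier, while the kernel-weighted mean stays near 10; plugging such robust locations into the fold-change axis is what keeps the volcano plot stable under contamination.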

  20. Comparison of frequency-domain and time-domain rotorcraft vibration control methods

    NASA Technical Reports Server (NTRS)

    Gupta, N. K.

    1984-01-01

    Active control of rotor-induced vibration in rotorcraft has received significant attention recently. Two classes of techniques have been proposed. The more developed approach works with harmonic analysis of measured time histories and is called the frequency-domain approach. The more recent approach computes the control input directly using the measured time history data and is called the time-domain approach. The report summarizes the results of a theoretical investigation to compare the two approaches. Five specific areas were addressed: (1) techniques to derive models needed for control design (system identification methods), (2) robustness with respect to errors, (3) transient response, (4) susceptibility to noise, and (5) implementation difficulties. The system identification methods are more difficult for the time-domain models. The time-domain approach is more robust (e.g., has higher gain and phase margins) than the frequency-domain approach. It might thus be possible to avoid doing real-time system identification in the time-domain approach by storing models at a number of flight conditions. The most significant error source is the variation in open-loop vibrations caused by pilot inputs, maneuvers or gusts. The implementation requirements are similar except that the time-domain approach can be much simpler to implement if real-time system identification were not necessary.
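
    The harmonic analysis underlying the frequency-domain approach can be sketched as a single-bin discrete Fourier transform that extracts the amplitude and phase of one rotor harmonic from a measured time history; the synthetic 4/rev vibration record below is invented for illustration.

```python
import math

def harmonic_component(samples, k):
    """Amplitude and phase of the k-th harmonic over one rotor revolution,
    computed with a single-bin discrete Fourier transform."""
    n = len(samples)
    a = (2.0 / n) * sum(x * math.cos(2 * math.pi * k * i / n)
                        for i, x in enumerate(samples))
    b = (2.0 / n) * sum(x * math.sin(2 * math.pi * k * i / n)
                        for i, x in enumerate(samples))
    return math.hypot(a, b), math.atan2(b, a)

# Synthetic vibration record: mean offset plus a 4/rev component of amplitude 3.
n = 64
signal = [5.0 + 3.0 * math.cos(2 * math.pi * 4 * i / n + 0.7) for i in range(n)]
amp, phase = harmonic_component(signal, 4)
```

    A frequency-domain controller updates its control input from such per-harmonic amplitudes and phases once per revolution, whereas the time-domain approach works on the raw samples directly.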

  1. Identification of filamentous fungi isolates by MALDI-TOF mass spectrometry: clinical evaluation of an extended reference spectra library.

    PubMed

    Becker, Pierre T; de Bel, Annelies; Martiny, Delphine; Ranque, Stéphane; Piarroux, Renaud; Cassagne, Carole; Detandt, Monique; Hendrickx, Marijke

    2014-11-01

    The identification of filamentous fungi by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) relies mainly on a robust and extensive database of reference spectra. To this end, a large in-house library containing 760 strains and representing 472 species was built and evaluated on 390 clinical isolates by comparing MALDI-TOF MS with the classical identification method based on morphological observations. The use of MALDI-TOF MS resulted in the correct identification of 95.4% of the isolates at species level, without considering LogScore values. Taking into account Bruker's cutoff value for reliability (LogScore >1.70), 85.6% of the isolates were correctly identified. For a number of isolates, microscopic identification was limited to the genus, resulting in only 61.5% of the isolates correctly identified at species level while the correctness reached 94.6% at genus level. Using this extended in-house database, MALDI-TOF MS thus appears superior to morphology in order to obtain a robust and accurate identification of filamentous fungi. A continuous extension of the library is however necessary to further improve its reliability. Indeed, 15 isolates were still not represented while an additional three isolates were not recognized, probably because of a lack of intraspecific variability of the corresponding species in the database. © The Author 2014. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, III, F. G.

    2016-07-29

    One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface using the GoldSim software to the STADIUM® code developed by SIMCO Technologies, Inc. and LeachXS/ORCHESTRA developed by the Energy research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM® code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM® code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results (the code developers have provided validation test results as part of their code QA documentation); and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. It is therefore concluded that the codes can be used to support performance assessment. This conclusion takes into account the QA documentation produced for the partner codes and for the CBP Toolbox.

  3. ESP Toolbox: A Computational Framework for Precise, Scale-Independent Analysis of Bulk Elastic and Seismic Properties

    NASA Astrophysics Data System (ADS)

    Johnson, S. E.; Vel, S. S.; Cook, A. C.; Song, W. J.; Gerbi, C. C.; Okaya, D. A.

    2014-12-01

    Owing to the abundance of highly anisotropic minerals in the crust, the Voigt and Reuss bounds on the seismic velocities can be separated by more than 1 km/s. These bounds are determined by modal mineralogy and crystallographic preferred orientations (CPO) of the constituent minerals, but where the true velocities lie between these bounds is determined by other fabric parameters such as the shapes, shape-preferred orientations, and spatial arrangements of grains. Thus, the calculation of accurate bulk stiffness relies on explicitly treating the grain-scale heterogeneity, and the same principle applies at larger scales, for example calculating accurate bulk stiffness for a crustal volume with varying proportions and distributions of folds or shear zones. We have developed stand-alone GUI software - ESP Toolbox - for the calculation of 3D bulk elastic and seismic properties of heterogeneous and polycrystalline materials using image or EBSD data. The GUI includes a number of different homogenization techniques, including Voigt, Reuss, Hill, geometric mean, self-consistent and asymptotic expansion homogenization (AEH) methods. The AEH method, which uses a finite element mesh, is most accurate since it explicitly accounts for elastic interactions of constituent minerals/phases. The user need only specify the microstructure and material properties of the minerals/phases. We use the Toolbox to explore changes in bulk elasticity and related seismic anisotropy caused by specific variables, including: (a) the quartz alpha-beta phase change in rocks with varying proportions of quartz, (b) changes in modal mineralogy and CPO fabric that occur during progressive deformation and metamorphism, and (c) shear zones of varying thickness, abundance and geometry in continental crust. 
The Toolbox allows rapid sensitivity analysis around these and other variables, and the resulting bulk stiffness matrices can be used to populate volumes for synthetic wave propagation experiments that allow direct visualization of how variables of interest might affect propagation at a variety of scales. Sensitivity analyses also illustrate the value of the more precise AEH method. The ESP Toolbox can be downloaded here: http://umaine.edu/mecheng/faculty-and-staff/senthil-vel/software/
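
    The simplest of the homogenization schemes listed above can be sketched directly: for an isotropic aggregate, the Voigt (iso-strain) and Reuss (iso-stress) bounds and the Voigt-Reuss-Hill estimate reduce to the averages below. The two-phase modal fractions and moduli are hypothetical values chosen only for illustration.

```python
def voigt(fractions, moduli):
    """Upper (iso-strain) bound: arithmetic volume average of the moduli."""
    return sum(f * m for f, m in zip(fractions, moduli))

def reuss(fractions, moduli):
    """Lower (iso-stress) bound: harmonic volume average of the moduli."""
    return 1.0 / sum(f / m for f, m in zip(fractions, moduli))

def hill(fractions, moduli):
    """Voigt-Reuss-Hill estimate: midpoint of the two bounds."""
    return 0.5 * (voigt(fractions, moduli) + reuss(fractions, moduli))

# Hypothetical two-phase rock: 60% phase A (shear modulus 44 GPa),
# 40% phase B (shear modulus 25 GPa).
f = [0.6, 0.4]
g = [44.0, 25.0]
```

    Where the true effective modulus falls between these bounds depends on the grain-scale fabric, which is why the Toolbox's mesh-based AEH method, accounting explicitly for elastic grain interactions, is the more accurate option.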

  4. Basic Radar Altimetry Toolbox: Tools to Use Radar Altimetry for Geodesy

    NASA Astrophysics Data System (ADS)

    Rosmorduc, V.; Benveniste, J. J.; Bronner, E.; Niejmeier, S.

    2010-12-01

    Radar altimetry is very much a technique expanding its applications and uses. While considerable effort has been made for oceanography users (including easy-to-use data), using those data for geodesy, especially in combination with ESA GOCE mission data, is still somewhat hard. ESA and CNES thus had the Basic Radar Altimetry Toolbox developed (as well as, on the ESA side, the GOCE User Toolbox, the two being linked). The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL - as processing/extraction routines, through the on-line command mode - as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. The Toolbox has been available since April 2007 and has been demonstrated during training courses and scientific meetings. About 1200 people had downloaded it by summer 2010, with many newcomers to altimetry among them. User feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in version 2; others are ongoing, and some are in discussion. Examples and data use cases on geodesy will be presented. BRAT is developed under contract with ESA and CNES.

  5. The Role of Ethnic and National Identifications in Perceived Discrimination for Asian Americans: Toward a Better Understanding of the Buffering Effect of Group Identifications on Psychological Distress

    PubMed Central

    Huynh, Que-Lam; Devos, Thierry; Goldberg, Robyn

    2013-01-01

    A robust relationship between perceived racial discrimination and psychological distress has been established. Yet, mixed evidence exists regarding the extent to which ethnic identification moderates this relationship, and scarce attention has been paid to the moderating role of national identification. We propose that the role of group identifications in the perceived discrimination–psychological distress relationship is best understood by simultaneously and interactively considering ethnic and national identifications. A sample of 259 Asian American students completed measures of perceived discrimination, group identifications (specific ethnic identification stated by respondents and national or “mainstream American” identification), and psychological distress (anxiety and depression symptoms). Regression analyses revealed a significant three-way interaction of perceived discrimination, ethnic identification, and national identification on psychological distress. Simple-slope analyses indicated that dual identification (strong ethnic and national identifications) was linked to a weaker relationship between perceived discrimination and psychological distress compared with other group identification configurations. These findings underscore the need to consider the interconnections between ethnic and national identifications to better understand the circumstances under which group identifications are likely to buffer individuals against the adverse effects of racial discrimination. PMID:25258674

  6. PETPVC: a toolbox for performing partial volume correction techniques in positron emission tomography

    NASA Astrophysics Data System (ADS)

    Thomas, Benjamin A.; Cuplov, Vesna; Bousse, Alexandre; Mendes, Adriana; Thielemans, Kris; Hutton, Brian F.; Erlandsson, Kjell

    2016-11-01

    Positron emission tomography (PET) images are degraded by a phenomenon known as the partial volume effect (PVE). Approaches have been developed to reduce PVEs, typically through the utilisation of structural information provided by other imaging modalities such as MRI or CT. These methods, known as partial volume correction (PVC) techniques, reduce PVEs by compensating for the effects of the scanner resolution, thereby improving the quantitative accuracy. The PETPVC toolbox described in this paper comprises a suite of methods, both classic and more recent approaches, for the purposes of applying PVC to PET data. Eight core PVC techniques are available. These core methods can be combined to create a total of 22 different PVC techniques. Simulated brain PET data are used to demonstrate the utility of the toolbox in idealised conditions, the effects of applying PVC with mismatched point-spread function (PSF) estimates and the potential of novel hybrid PVC methods to improve the quantification of lesions. All anatomy-based PVC techniques achieve complete recovery of the PET signal in cortical grey matter (GM) when performed in idealised conditions. Applying deconvolution-based approaches results in incomplete recovery due to premature termination of the iterative process. PVC techniques are sensitive to PSF mismatch, causing a bias of up to 16.7% in GM recovery when over-estimating the PSF by 3 mm. The recovery of both GM and a simulated lesion was improved by combining two PVC techniques together. The PETPVC toolbox has been written in C++, supports Windows, Mac and Linux operating systems, is open-source and publicly available.
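
    One classic region-based PVC idea, the geometric transfer matrix (GTM) approach, can be sketched in miniature: observed regional means are modelled as a known spill-over matrix applied to the true means, and corrected values are recovered by inverting that matrix. The two-region spill-over fractions below are invented; real implementations derive the matrix from segmented anatomy and the scanner PSF.

```python
def gtm_correct(observed, transfer):
    """Solve t from o = W t for two regions by direct 2x2 inversion.
    transfer[i][j] = fraction of region j's true activity seen in region i."""
    (a, b), (c, d) = transfer
    det = a * d - b * c
    o1, o2 = observed
    return [(d * o1 - b * o2) / det, (-c * o1 + a * o2) / det]

# Hypothetical spill-over matrix: each region retains 80% of its own signal
# and receives 15% of the other region's.
W = [[0.8, 0.15], [0.15, 0.8]]
true_activity = [4.0, 1.0]
observed = [sum(W[i][j] * true_activity[j] for j in range(2)) for i in range(2)]
recovered = gtm_correct(observed, W)
```

    With more regions the same step is a general linear solve; the anatomy-based methods in toolboxes of this kind differ mainly in how the transfer weights are constructed and whether correction is applied region-wise or voxel-wise.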

  7. Characterization of Disease-Related Covariance Topographies with SSMPCA Toolbox: Effects of Spatial Normalization and PET Scanners

    PubMed Central

    Peng, Shichun; Ma, Yilong; Spetsieris, Phoebe G; Mattis, Paul; Feigin, Andrew; Dhawan, Vijay; Eidelberg, David

    2013-01-01

    In order to generate imaging biomarkers from disease-specific brain networks, we have implemented a general toolbox to rapidly perform scaled subprofile modeling (SSM) based on principal component analysis (PCA) on brain images of patients and normals. This SSMPCA toolbox can define spatial covariance patterns whose expression in individual subjects can discriminate patients from controls or predict behavioral measures. The technique may depend on differences in spatial normalization algorithms and brain imaging systems. We have evaluated the reproducibility of characteristic metabolic patterns generated by SSMPCA in patients with Parkinson's disease (PD). We used [18F]fluorodeoxyglucose PET scans from PD patients and normal controls. Motor-related (PDRP) and cognition-related (PDCP) metabolic patterns were derived from images spatially normalized using four versions of SPM software (spm99, spm2, spm5 and spm8). Differences between these patterns and subject scores were compared across multiple independent groups of patients and control subjects. These patterns and subject scores were highly reproducible with different normalization programs in terms of disease discrimination and cognitive correlation. Subject scores were also comparable in PD patients imaged across multiple PET scanners. Our findings confirm a very high degree of consistency among brain networks and their clinical correlates in PD using images normalized in four different SPM platforms. SSMPCA toolbox can be used reliably for generating disease-specific imaging biomarkers despite the continued evolution of image preprocessing software in the neuroimaging community. Network expressions can be quantified in individual patients independent of different physical characteristics of PET cameras. PMID:23671030
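
    The PCA computation at the heart of such a toolbox can be sketched in miniature: mean-centre a subjects-by-regions data matrix, extract the leading principal component as the covariance pattern, and project each subject onto it to obtain a subject score. The power-iteration implementation and toy data below are illustrative assumptions; the full SSM procedure additionally log-transforms and double-centres the data.

```python
def first_pattern(data, iters=200):
    """Leading principal component of a mean-centred subjects x regions
    matrix via power iteration; returns (pattern, subject_scores)."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    x = [[row[j] - means[j] for j in range(p)] for row in data]
    v = [1.0] * p
    for _ in range(iters):
        xv = [sum(row[j] * v[j] for j in range(p)) for row in x]        # X v
        v = [sum(x[i][j] * xv[i] for i in range(n)) for j in range(p)]  # X^T (X v)
        norm = sum(c * c for c in v) ** 0.5
        v = [c / norm for c in v]
    scores = [sum(row[j] * v[j] for j in range(p)) for row in x]
    return v, scores

# Toy data: three "patients" (first rows) express a two-region covariance
# pattern more strongly than three "controls" (last rows).
data = [[5.0, 1.0, 4.8], [5.2, 0.9, 5.1], [4.9, 1.1, 5.0],
        [1.0, 1.0, 1.2], [0.8, 1.0, 0.9], [1.1, 0.9, 1.0]]
pattern, scores = first_pattern(data)
```

    The subject scores separate the two toy groups cleanly, which is the sense in which pattern expression serves as an individual-level imaging biomarker.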

  8. Characterization of disease-related covariance topographies with SSMPCA toolbox: effects of spatial normalization and PET scanners.

    PubMed

    Peng, Shichun; Ma, Yilong; Spetsieris, Phoebe G; Mattis, Paul; Feigin, Andrew; Dhawan, Vijay; Eidelberg, David

    2014-05-01

    To generate imaging biomarkers from disease-specific brain networks, we have implemented a general toolbox to rapidly perform scaled subprofile modeling (SSM) based on principal component analysis (PCA) on brain images of patients and normals. This SSMPCA toolbox can define spatial covariance patterns whose expression in individual subjects can discriminate patients from controls or predict behavioral measures. The technique may depend on differences in spatial normalization algorithms and brain imaging systems. We have evaluated the reproducibility of characteristic metabolic patterns generated by SSMPCA in patients with Parkinson's disease (PD). We used [18F]fluorodeoxyglucose PET scans from patients with PD and normal controls. Motor-related (PDRP) and cognition-related (PDCP) metabolic patterns were derived from images spatially normalized using four versions of SPM software (spm99, spm2, spm5, and spm8). Differences between these patterns and subject scores were compared across multiple independent groups of patients and control subjects. These patterns and subject scores were highly reproducible with different normalization programs in terms of disease discrimination and cognitive correlation. Subject scores were also comparable in patients with PD imaged across multiple PET scanners. Our findings confirm a very high degree of consistency among brain networks and their clinical correlates in PD using images normalized in four different SPM platforms. SSMPCA toolbox can be used reliably for generating disease-specific imaging biomarkers despite the continued evolution of image preprocessing software in the neuroimaging community. Network expressions can be quantified in individual patients independent of different physical characteristics of PET cameras. Copyright © 2013 Wiley Periodicals, Inc.

  9. MagPy: A Python toolbox for controlling Magstim transcranial magnetic stimulators.

    PubMed

    McNair, Nicolas A

    2017-01-30

    To date, transcranial magnetic stimulation (TMS) studies manipulating stimulation parameters have largely used blocked paradigms. However, altering these parameters on a trial-by-trial basis in Magstim stimulators is complicated by the need to send regular (1 Hz) commands to the stimulator. Additionally, effecting such control interferes with the ability to send TMS pulses or simultaneously present stimuli with high temporal precision. This manuscript presents the MagPy toolbox, a Python software package that provides full control over Magstim stimulators via the serial port. It is able to maintain this control with no impact on concurrent processing, such as stimulus delivery. In addition, a specially designed "QuickFire" serial cable is specified that allows MagPy to trigger TMS pulses with very low latency. In a series of experimental simulations, MagPy was able to maintain uninterrupted remote control over the connected Magstim stimulator across all testing sessions. In addition, having MagPy enabled had no effect on stimulus timing - all stimuli were presented for precisely the duration specified. Finally, using the QuickFire cable, MagPy was able to elicit TMS pulses with sub-millisecond latencies. The MagPy toolbox allows for experiments that require manipulating stimulation parameters from trial to trial. Furthermore, it can achieve this in contexts that require tight control over timing, such as those seeking to combine TMS with fMRI or EEG. Together, the MagPy toolbox and QuickFire serial cable provide an effective means for controlling Magstim stimulators during experiments while ensuring high-precision timing. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. FACET - a "Flexible Artifact Correction and Evaluation Toolbox" for concurrently recorded EEG/fMRI data.

    PubMed

    Glaser, Johann; Beisteiner, Roland; Bauer, Herbert; Fischmeister, Florian Ph S

    2013-11-09

    In concurrent EEG/fMRI recordings, EEG data are impaired by fMRI gradient artifacts, which exceed the EEG signal by several orders of magnitude. While several algorithms exist to correct the EEG data, they lack the flexibility to leave out or add processing steps. The open-source MATLAB toolbox FACET presented here is a modular toolbox for the fast and flexible correction and evaluation of imaging artifacts in concurrently recorded EEG datasets. It consists of an Analysis, a Correction and an Evaluation framework, allowing the user to choose from different artifact correction methods with various pre- and post-processing steps to form flexible combinations. The quality of the chosen correction approach can then be evaluated and compared across different settings. FACET was evaluated on a dataset provided with the FMRIB plugin for EEGLAB using two different correction approaches: Averaged Artifact Subtraction (AAS, Allen et al., NeuroImage 12(2):230-239, 2000) and FMRI Artifact Slice Template Removal (FASTR, Niazy et al., NeuroImage 28(3):720-737, 2005). The obtained results were compared to those of the FASTR algorithm implemented in the EEGLAB plugin FMRIB. No differences were found between the FACET implementation of FASTR and the original algorithm across all gradient-artifact-relevant performance indices. The FACET toolbox not only provides facilities for all three tasks - data analysis, artifact correction, and evaluation and documentation of the results - but also offers an easily extendable framework for developing and evaluating new approaches.
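
    Of the two correction approaches evaluated, Averaged Artifact Subtraction is the simpler to illustrate: epochs aligned to the gradient artifact are averaged into a template, which is then subtracted from each epoch. The sketch below assumes perfectly aligned, equal-length epochs; real implementations (including FACET's) add alignment, filtering, and adaptive weighting around this core.

```python
def average_artifact_subtraction(samples, epoch_len):
    """Averaged Artifact Subtraction (AAS): cut the recording into
    equal-length epochs aligned to the gradient artifact, average them
    into a template, and subtract the template from every epoch.
    Trailing samples beyond the last full epoch are dropped."""
    n_epochs = len(samples) // epoch_len
    epochs = [samples[i * epoch_len:(i + 1) * epoch_len] for i in range(n_epochs)]
    # template = sample-wise mean across epochs
    template = [sum(e[j] for e in epochs) / n_epochs for j in range(epoch_len)]
    # subtract the template from each epoch and re-flatten
    corrected = [v - template[j] for e in epochs for j, v in enumerate(e)]
    return corrected, template
```

    Because the artifact repeats identically while brain activity does not, averaging reinforces the artifact and attenuates the signal in the template, so subtraction removes mostly artifact.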

  11. An efficient General Transit Feed Specification (GTFS) enabled algorithm for dynamic transit accessibility analysis.

    PubMed

    Fayyaz S., S. Kiavash; Liu, Xiaoyue Cathy; Zhang, Guohui

    2017-01-01

    The social functions of urbanized areas are highly dependent on and supported by the convenient access to public transportation systems, particularly for the less privileged populations who have restrained auto ownership. To accurately evaluate public transit accessibility, it is critical to capture the spatiotemporal variation of transit services. This can be achieved by measuring the shortest paths or minimum travel time between origin-destination (OD) pairs at each time-of-day (e.g. every minute). In recent years, General Transit Feed Specification (GTFS) data has been gaining popularity for between-station travel time estimation due to its interoperability in spatiotemporal analytics. Many software packages, such as ArcGIS, provide toolboxes for travel time estimation with GTFS. They perform reasonably well in calculating travel time between OD pairs for a specific time-of-day (e.g. 8:00 AM), yet can become computationally inefficient and impractical as data dimensions increase (e.g. all times-of-day and large networks). In this paper, we introduce a new algorithm that is computationally elegant and mathematically efficient to address this issue. An open-source toolbox written in C++ is developed to implement the algorithm. We implemented the algorithm on the City of St. George's transit network to showcase the accessibility analysis enabled by the toolbox. The experimental evidence shows a significant reduction in computational time. The proposed algorithm and toolbox are easily transferable to other transit networks, allowing transit agencies and researchers to perform high-resolution transit performance analysis.
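
    The kind of query such an algorithm must answer - earliest arrival at a destination given a departure time and a set of scheduled transit legs - can be sketched with a Dijkstra-style label-setting search. The stop names and schedule structure below are hypothetical, and the published algorithm's optimizations for sweeping all times-of-day are not shown.

```python
import heapq

def earliest_arrival(legs, origin, dest, depart_time):
    """Earliest-arrival query on a schedule. `legs` maps each stop to a list
    of (departure_time, arrival_time, next_stop) entries; a leg is usable
    only if we reach the stop at or before its departure. Dijkstra-style
    label setting on arrival times."""
    best = {origin: depart_time}
    pq = [(depart_time, origin)]
    while pq:
        t, stop = heapq.heappop(pq)
        if stop == dest:
            return t
        if t > best.get(stop, float("inf")):
            continue                      # stale queue entry
        for dep, arr, nxt in legs.get(stop, ()):
            if dep >= t and arr < best.get(nxt, float("inf")):
                best[nxt] = arr
                heapq.heappush(pq, (arr, nxt))
    return None                           # destination unreachable
```

    Running this once per departure minute is exactly the brute force the paper's algorithm improves on; the sketch shows why the naive cost grows with both network size and the number of departure times.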

  12. An efficient General Transit Feed Specification (GTFS) enabled algorithm for dynamic transit accessibility analysis

    PubMed Central

    Fayyaz S., S. Kiavash; Zhang, Guohui

    2017-01-01

    The social functions of urbanized areas are highly dependent on and supported by the convenient access to public transportation systems, particularly for the less privileged populations who have restrained auto ownership. To accurately evaluate public transit accessibility, it is critical to capture the spatiotemporal variation of transit services. This can be achieved by measuring the shortest paths or minimum travel time between origin-destination (OD) pairs at each time-of-day (e.g. every minute). In recent years, General Transit Feed Specification (GTFS) data has been gaining popularity for between-station travel time estimation due to its interoperability in spatiotemporal analytics. Many software packages, such as ArcGIS, provide toolboxes for travel time estimation with GTFS. They perform reasonably well in calculating travel time between OD pairs for a specific time-of-day (e.g. 8:00 AM), yet can become computationally inefficient and impractical as data dimensions increase (e.g. all times-of-day and large networks). In this paper, we introduce a new algorithm that is computationally elegant and mathematically efficient to address this issue. An open-source toolbox written in C++ is developed to implement the algorithm. We implemented the algorithm on the City of St. George’s transit network to showcase the accessibility analysis enabled by the toolbox. The experimental evidence shows a significant reduction in computational time. The proposed algorithm and toolbox are easily transferable to other transit networks, allowing transit agencies and researchers to perform high-resolution transit performance analysis. PMID:28981544

  13. MNPBEM - A Matlab toolbox for the simulation of plasmonic nanoparticles

    NASA Astrophysics Data System (ADS)

    Hohenester, Ulrich; Trügler, Andreas

    2012-02-01

    MNPBEM is a Matlab toolbox for the simulation of metallic nanoparticles (MNP), using a boundary element method (BEM) approach. The main purpose of the toolbox is to solve Maxwell's equations for a dielectric environment where bodies with homogeneous and isotropic dielectric functions are separated by abrupt interfaces. Although the approach is in principle suited for arbitrary body sizes and photon energies, it is tested (and probably works best) for metallic nanoparticles with sizes ranging from a few to a few hundreds of nanometers, and for frequencies in the optical and near-infrared regime. The toolbox has been implemented with Matlab classes. These classes can be easily combined, which has the advantage that one can adapt the simulation programs flexibly for various applications.
    Program summary
    Program title: MNPBEM
    Catalogue identifier: AEKJ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKJ_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License v2
    No. of lines in distributed program, including test data, etc.: 15 700
    No. of bytes in distributed program, including test data, etc.: 891 417
    Distribution format: tar.gz
    Programming language: Matlab 7.11.0 (R2010b)
    Computer: Any which supports Matlab 7.11.0 (R2010b)
    Operating system: Any which supports Matlab 7.11.0 (R2010b)
    RAM: ⩾ 1 GByte
    Classification: 18
    Nature of problem: Solve Maxwell's equations for dielectric particles with homogeneous dielectric functions separated by abrupt interfaces.
    Solution method: Boundary element method using electromagnetic potentials.
    Running time: Depending on surface discretization, between seconds and hours.

  14. Low-order auditory Zernike moment: a novel approach for robust music identification in the compressed domain

    NASA Astrophysics Data System (ADS)

    Li, Wei; Xiao, Chuan; Liu, Yaduo

    2013-12-01

    Audio identification via fingerprint has been an active research field for years. However, most previously reported methods work on the raw audio format, despite the fact that nowadays compressed-format audio, especially MP3 music, has become the dominant way to store music on personal computers and/or transmit it over the Internet. It would be valuable if a compressed unknown audio fragment could be recognized directly against the database without first decompressing it into the wave format. So far, very few algorithms run directly in the compressed domain for music information retrieval, and most of them rely on modified discrete cosine transform (MDCT) coefficients or derived cepstrum and energy features. As a first attempt, we propose in this paper utilizing the compressed-domain auditory Zernike moment, adapted from image processing techniques, as the key feature to devise a novel robust audio identification algorithm. Such a fingerprint exhibits strong robustness, owing to its statistically stable nature, against various audio signal distortions such as recompression, noise contamination, echo adding, equalization, band-pass filtering, pitch shifting, and slight time scale modification. Experimental results show that in a music database composed of 21,185 MP3 songs, a 10-s music segment can identify its original near-duplicate recording with an average top-5 hit rate of 90% or above, even under severe audio signal distortions.
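
    A Zernike moment Z_{n,m} of a 2-D feature map (here an "auditory image" built from compressed-domain coefficients) is a projection onto a radial polynomial times a complex exponential; the magnitude |Z_{n,m}| is rotation-invariant, which underlies the feature's statistical stability. A minimal, unoptimized sketch for a square image mapped onto the unit disk (the normalization by pixel count is one common discrete choice; the continuous definition uses (n+1)/pi):

```python
import cmath
import math

def zernike_moment(img, n, m):
    """Zernike moment Z_{n,m} of a square image mapped onto the unit disk.
    The magnitude |Z_{n,m}| is invariant under rotation of the image."""
    size = len(img)
    total = 0.0 + 0.0j
    count = 0
    for y in range(size):
        for x in range(size):
            # map pixel centre into [-1, 1] x [-1, 1]
            u = (2 * x + 1) / size - 1
            v = (2 * y + 1) / size - 1
            r = math.hypot(u, v)
            if r > 1:                     # keep only pixels on the unit disk
                continue
            theta = math.atan2(v, u)
            # radial polynomial R_{n,|m|}(r)
            rad = sum(
                (-1) ** s * math.factorial(n - s)
                / (math.factorial(s)
                   * math.factorial((n + abs(m)) // 2 - s)
                   * math.factorial((n - abs(m)) // 2 - s))
                * r ** (n - 2 * s)
                for s in range((n - abs(m)) // 2 + 1)
            )
            total += img[y][x] * rad * cmath.exp(-1j * m * theta)
            count += 1
    return (n + 1) / count * total
```

    Rotation invariance of the magnitude can be checked by comparing an image with its 90-degree rotation, which permutes the disk pixels exactly.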

  15. GPU-accelerated automatic identification of robust beam setups for proton and carbon-ion radiotherapy

    NASA Astrophysics Data System (ADS)

    Ammazzalorso, F.; Bednarz, T.; Jelen, U.

    2014-03-01

    We demonstrate acceleration on graphics processing units (GPU) of the automatic identification of robust particle therapy beam setups, minimizing the negative dosimetric effects of Bragg peak displacement caused by treatment-time patient positioning errors. Our particle therapy research toolkit, RobuR, was extended with OpenCL support and used to implement GPU calculation of the Port Homogeneity Index, a metric scoring irradiation port robustness through analysis of tissue density patterns prior to dose optimization and computation. Results were benchmarked against an independent native CPU implementation, and the two were in numerical agreement. For 10 skull base cases, the GPU-accelerated implementation was employed to select beam setups for proton and carbon ion treatment plans, which proved dosimetrically robust when recomputed in the presence of various simulated positioning errors. In terms of performance, the average running time on the GPU was at least one order of magnitude lower than on the CPU, rendering the GPU-accelerated analysis a feasible step in an interactive clinical treatment planning session. In conclusion, selection of robust particle therapy beam setups can be effectively accelerated on a GPU and become an unintrusive part of the particle therapy treatment planning workflow. Additionally, the speed gain opens new usage scenarios, such as interactive manipulation of the analysis (e.g. constraining some setups) and re-execution. Finally, through OpenCL's portable parallelism, the new implementation is also suitable for CPU-only use, taking advantage of multiple cores, and can potentially exploit types of accelerators other than GPUs.
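
    The exact Port Homogeneity Index is defined in the paper; as an illustration of the general idea - scoring a beam direction by how much the radiological depth varies laterally, before any dose computation - one can cast parallel rays through a density grid and measure the spread of the path integrals. The sketch below is a hypothetical stand-in for the published metric, in pure Python rather than OpenCL:

```python
import math
import statistics

def port_heterogeneity(density, angle_deg):
    """Illustrative robustness score for one irradiation port (NOT the
    published Port Homogeneity Index): cast parallel rays through a 2-D
    density grid at the given angle, accumulate density along each ray,
    and score the port by the spread of those path integrals. Low spread
    means lateral setup errors change the radiological depth little."""
    size = len(density)
    a = math.radians(angle_deg)
    dx, dy = math.cos(a), math.sin(a)       # ray direction
    ox, oy = -dy, dx                        # lateral offset direction
    centre = (size - 1) / 2.0
    integrals = []
    for lateral in range(-size // 2, size // 2 + 1):
        total, steps = 0.0, 0
        t = -size
        while t <= size:                    # march in half-pixel steps
            x = centre + ox * lateral + dx * t
            y = centre + oy * lateral + dy * t
            i, j = int(round(y)), int(round(x))
            if 0 <= i < size and 0 <= j < size:
                total += density[i][j] * 0.5
                steps += 1
            t += 0.5
        if steps:                           # skip rays that miss the grid
            integrals.append(total)
    return statistics.pstdev(integrals)
```

    A uniform phantom scores zero regardless of angle, while a grid with a dense block scores higher for ports whose rays cross the density boundary laterally.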

  16. The RAVEN Toolbox and Its Use for Generating a Genome-scale Metabolic Model for Penicillium chrysogenum

    PubMed Central

    Agren, Rasmus; Liu, Liming; Shoaie, Saeed; Vongsangnak, Wanwipa; Nookaew, Intawat; Nielsen, Jens

    2013-01-01

    We present the RAVEN (Reconstruction, Analysis and Visualization of Metabolic Networks) Toolbox: a software suite that allows for semi-automated reconstruction of genome-scale models. It makes use of published models and/or the KEGG database, coupled with extensive gap-filling and quality control features. The software suite also contains methods for visualizing simulation results and omics data, as well as a range of methods for performing simulations and analyzing the results. The software is a useful tool for system-wide data analysis in a metabolic context and for streamlined reconstruction of metabolic networks based on protein homology. The RAVEN Toolbox workflow was applied in order to reconstruct a genome-scale metabolic model for the important microbial cell factory Penicillium chrysogenum Wisconsin54-1255. The model was validated in a bibliomic study of in total 440 references, and it comprises 1471 unique biochemical reactions and 1006 ORFs. It was then used to study the roles of ATP and NADPH in the biosynthesis of penicillin, and to identify potential metabolic engineering targets for maximization of penicillin production. PMID:23555215
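
    One of the quality-control steps that reconstruction suites of this kind depend on is finding dead-end metabolites - species a draft network can only produce or only consume - since these mark the gaps that gap-filling must close. A minimal sketch over a toy reaction list (the data structure here is hypothetical, not RAVEN's):

```python
def dead_end_metabolites(reactions):
    """Find dead-end metabolites in a draft network: species that are only
    produced or only consumed, the usual starting point for gap-filling in
    genome-scale reconstruction. `reactions` is a list of
    (substrates, products, reversible) tuples of metabolite sets."""
    produced, consumed = set(), set()
    for subs, prods, rev in reactions:
        consumed.update(subs)
        produced.update(prods)
        if rev:                       # reversible: roles can swap
            consumed.update(prods)
            produced.update(subs)
    species = produced | consumed
    return {m for m in species if m not in produced or m not in consumed}
```

    Exchange reactions for nutrients and secreted products must be added before this check, otherwise every boundary metabolite shows up as a dead end.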

  17. KNIME4NGS: a comprehensive toolbox for next generation sequencing analysis.

    PubMed

    Hastreiter, Maximilian; Jeske, Tim; Hoser, Jonathan; Kluge, Michael; Ahomaa, Kaarin; Friedl, Marie-Sophie; Kopetzky, Sebastian J; Quell, Jan-Dominik; Mewes, H Werner; Küffner, Robert

    2017-05-15

    Analysis of Next Generation Sequencing (NGS) data requires the processing of large datasets by chaining various tools with complex input and output formats. In order to automate data analysis, we propose to standardize NGS tasks into modular workflows. This simplifies reliable handling and processing of NGS data, and corresponding solutions become substantially more reproducible and easier to maintain. Here, we present a documented, Linux-based toolbox of 42 processing modules that are combined to construct workflows facilitating a variety of tasks such as DNAseq and RNAseq analysis. We also describe important technical extensions. The high throughput executor (HTE) helps to increase reliability and to reduce manual interventions when processing complex datasets. We also provide a dedicated binary manager that assists users in obtaining the modules' executables and keeping them up to date. As the basis for this actively developed toolbox, we use the workflow management software KNIME. Nodes and user manual are available at http://ibisngs.github.io/knime4ngs (GPLv3 license). Contact: robert.kueffner@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.
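
    The benefit of standardizing tasks into modules with declared inputs and outputs is that workflows become simple, checkable chains. The sketch below is a generic illustration of that idea in Python, not KNIME's execution model; the module names and payload keys are made up:

```python
def run_workflow(modules, payload):
    """Minimal sketch of a modular NGS-style workflow: each module is a
    callable with a declared input key and output key, so modules can only
    be chained when their formats match - the standardization such a
    toolbox relies on. `modules` is a list of (name, in_key, out_key, fn)."""
    for name, in_key, out_key, fn in modules:
        if in_key not in payload:
            raise ValueError(f"module {name!r} expects missing input {in_key!r}")
        payload[out_key] = fn(payload[in_key])
    return payload
```

    Declaring keys up front means a broken chain fails immediately with a clear error, rather than deep inside a tool with a cryptic format complaint.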

  18. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling

    PubMed Central

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-01-01

    Summary: Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if–then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab(TM)-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270
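
    The guaranteed-invalidation idea can be illustrated with interval arithmetic on a toy static model y = a*x + b: propagate the parameter bounds through the model, and if the set of reachable outputs misses the measurement bounds for even one sample, no admissible parameter choice can explain the data, so the hypothesis is certifiably invalid. ADMIT's actual machinery (constraint satisfaction with convex relaxations, covering dynamic models) is far more general; this is a minimal sketch:

```python
def interval_mul(a, b):
    """Interval product [a] * [b]: bounds are among the corner products."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def invalidated(a_bounds, b_bounds, data):
    """Guaranteed invalidation for the toy model y = a*x + b with interval
    parameters: if, for any sample, the interval of reachable outputs has
    empty intersection with the measurement bounds, the model hypothesis
    is certifiably invalid. `data` is a list of (x, (y_lo, y_hi))."""
    for x, (y_lo, y_hi) in data:
        lo, hi = interval_mul(a_bounds, (x, x))
        lo, hi = lo + b_bounds[0], hi + b_bounds[1]
        if hi < y_lo or lo > y_hi:        # empty intersection -> invalid
            return True
    return False
```

    The key property is the guarantee: a `True` result is a certificate of invalidity, whereas `False` only means the data failed to rule the model out.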

  19. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    PubMed

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

    Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab(TM)-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/

  20. A Toolbox for Ab Initio 3-D Reconstructions in Single-particle Electron Microscopy

    PubMed Central

    Voss, Neil R; Lyumkis, Dmitry; Cheng, Anchi; Lau, Pick-Wei; Mulder, Anke; Lander, Gabriel C; Brignole, Edward J; Fellmann, Denis; Irving, Christopher; Jacovetty, Erica L; Leung, Albert; Pulokas, James; Quispe, Joel D; Winkler, Hanspeter; Yoshioka, Craig; Carragher, Bridget; Potter, Clinton S

    2010-01-01

    Structure determination of a novel macromolecular complex via single-particle electron microscopy depends upon overcoming the challenge of establishing a reliable 3-D reconstruction using only 2-D images. There are a variety of strategies that deal with this issue, but not all of them are readily accessible and straightforward to use. We have developed a “toolbox” of ab initio reconstruction techniques that provide several options for calculating 3-D volumes in an easily managed and tightly controlled work-flow that adheres to standard conventions and formats. This toolbox is designed to streamline the reconstruction process by removing the necessity for bookkeeping, while facilitating transparent data transfer between different software packages. It currently includes procedures for calculating ab initio reconstructions via random or orthogonal tilt geometry, tomograms, and common lines, all of which have been tested using the 50S ribosomal subunit. Our goal is that the accessibility of multiple independent reconstruction algorithms via this toolbox will improve the ease with which models can be generated, and provide a means of evaluating the confidence and reliability of the final reconstructed map. PMID:20018246
