CAMELOT: Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox
NASA Astrophysics Data System (ADS)
Di Carlo, Marilena; Romero Martin, Juan Manuel; Vasile, Massimiliano
2018-03-01
Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox (CAMELOT) is a toolbox for the fast preliminary design and optimisation of low-thrust trajectories. It solves highly complex combinatorial problems to plan multi-target missions characterised by long, perturbed low-thrust spirals. To do so, CAMELOT implements a novel multi-fidelity approach combining analytical surrogate modelling with accurate computational estimation of the mission cost. Decisions are then made using two optimisation engines included in the toolbox: a single-objective global optimiser and a combinatorial optimisation algorithm. CAMELOT has been applied to a variety of case studies, from the design of interplanetary trajectories to the optimal de-orbiting of space debris, and from the deployment of constellations to on-orbit servicing. In this paper, the main elements of CAMELOT are described and two examples solved using the toolbox are presented.
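The multi-fidelity idea described above, using a cheap analytical surrogate to rank candidates and reserving the expensive computational model for a shortlist, can be sketched in a few lines of Python. Everything here (the cost functions, the encoding of candidates as (Δv estimate, time-of-flight) pairs) is a hypothetical illustration, not CAMELOT's actual models:

```python
import math

def surrogate_cost(dv_estimate, tof_days):
    """Cheap analytical proxy for transfer cost (hypothetical model)."""
    return dv_estimate + 0.001 * tof_days

def accurate_cost(dv_estimate, tof_days):
    """Stand-in for an expensive numerical propagation (hypothetical)."""
    return dv_estimate * (1.0 + 0.05 * math.sin(tof_days / 50.0)) + 0.001 * tof_days

def multi_fidelity_select(candidates, keep=3):
    """Rank all candidates with the surrogate, then re-evaluate only
    the most promising ones with the accurate model."""
    ranked = sorted(candidates, key=lambda c: surrogate_cost(*c))
    shortlist = ranked[:keep]
    return min(shortlist, key=lambda c: accurate_cost(*c))

# hypothetical (delta-v estimate [km/s], time of flight [days]) candidates
candidates = [(3.2, 120), (2.9, 400), (3.0, 200), (4.1, 90), (2.8, 600)]
best = multi_fidelity_select(candidates)
```

The point of the pattern is that the accurate model is called only `keep` times, however many candidates the combinatorial search proposes.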
PeTTSy: a computational tool for perturbation analysis of complex systems biology models.
Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A
2016-03-10
Over the last decade, sensitivity analysis techniques have proven very useful for analysing complex and high-dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses the analysis of oscillatory systems. It performs sensitivity analysis of models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting: the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models, including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix, it allows one to efficiently combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters.
PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems. It allows for simulation and analysis of models under a variety of environmental conditions and for experimental optimisation of complex combined experiments. With its unique set of tools it makes a valuable addition to the current library of sensitivity analysis toolboxes. We believe that this software will be of great use to the wider biological, systems biology and modelling communities.
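The core operation such a toolbox performs, measuring how an ODE model's output responds to a parameter perturbation, can be illustrated with a minimal finite-difference sketch. The toy model dx/dt = -k·x and the explicit-Euler integrator are illustrative assumptions, not PeTTSy's implementation:

```python
def simulate(k, x0=1.0, t_end=2.0, dt=1e-3):
    """Explicit-Euler integration of the toy model dx/dt = -k*x."""
    x, t = x0, 0.0
    while t < t_end:
        x += dt * (-k * x)
        t += dt
    return x

def sensitivity(k, eps=1e-6):
    """Central finite-difference sensitivity of the final state
    with respect to the parameter k: d(output)/dk."""
    return (simulate(k + eps) - simulate(k - eps)) / (2 * eps)

s = sensitivity(0.5)  # analytic value is -t_end * exp(-k*t_end) ≈ -0.7358
```

Toolboxes like PeTTSy go far beyond this, e.g. by computing sensitivities of period and phase for oscillatory systems and by supporting time-varying perturbation shapes, but the finite-difference idea is the same.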
Optimising the Parallelisation of OpenFOAM Simulations
2014-06-01
Shannon Keough, Maritime Division, Defence Science and Technology Organisation, DSTO-TR-2987. ABSTRACT: The OpenFOAM computational fluid dynamics toolbox allows parallel computation of ... performance of a given high performance computing cluster with several OpenFOAM cases, running using a combination of MPI libraries and corresponding MPI
Automated model optimisation using the Cylc workflow engine (Cyclops v1.0)
NASA Astrophysics Data System (ADS)
Gorman, Richard M.; Oliver, Hilary J.
2018-06-01
Most geophysical models include many parameters that are not fully determined by theory, and can be tuned
to improve the model's agreement with available data. We might attempt to automate this tuning process in an objective way by employing an optimisation algorithm to find the set of parameters that minimises a cost function derived from comparing model outputs with measurements. A number of algorithms are available for solving optimisation problems, in various programming languages, but interfacing such software to a complex geophysical model simulation presents certain challenges. To tackle this problem, we have developed an optimisation suite (Cyclops) based on the Cylc workflow engine that implements a wide selection of optimisation algorithms from the NLopt Python toolbox (Johnson, 2014). The Cyclops optimisation suite can be used to calibrate any modelling system that has itself been implemented as a (separate) Cylc model suite, provided it includes computation and output of the desired scalar cost function. A growing number of institutions are using Cylc to orchestrate complex distributed suites of interdependent cycling tasks within their operational forecast systems, and in such cases application of the optimisation suite is particularly straightforward. As a test case, we applied Cyclops to calibrate a global implementation of the WAVEWATCH III (v4.18) third-generation spectral wave model, forced by ERA-Interim input fields. This was calibrated over a 1-year period (1997), before applying the calibrated model to a full (1979-2016) wave hindcast. The chosen error metric was the spatial average of the root mean square error of hindcast significant wave height compared with collocated altimeter records. We describe the results of a calibration in which up to 19 parameters were optimised.
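The pattern described above, wrapping a model run in a scalar cost function and handing it to a derivative-free optimiser, can be sketched in pure Python. The pattern-search routine below stands in for the NLopt algorithms the suite actually uses, and `model_cost` stands in for a full model run and comparison with observations; both are illustrative assumptions:

```python
def model_cost(params):
    """Stand-in for a full model run plus comparison with observations:
    here just a smooth function with its minimum at (1.0, -2.0)."""
    a, b = params
    return (a - 1.0) ** 2 + (b + 2.0) ** 2

def pattern_search(cost, x0, step=0.5, tol=1e-6, max_iter=10_000):
    """Minimal derivative-free pattern search: probe +/- step along each
    axis, move whenever a probe improves the cost, otherwise halve the
    step until it falls below tol."""
    x = list(x0)
    fx = cost(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = cost(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5
            if step < tol:
                break
    return x, fx

best, best_cost = pattern_search(model_cost, [0.0, 0.0])
```

In the real suite each cost evaluation is itself a (possibly distributed) workflow run, which is exactly why orchestration by a workflow engine such as Cylc is attractive.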
A Data Analysis Toolbox for Modeling the Global Food-Energy-Water Nexus
NASA Astrophysics Data System (ADS)
AghaKouchak, A.; Sadegh, M.; Mallakpour, I.
2017-12-01
Water, food, and energy systems are highly interconnected. More than seventy percent of global water resources are used for food production. Water withdrawal, purification, and transfer systems are energy intensive. Furthermore, energy generation strongly depends on water availability. Considering the interactions in the nexus of water, food and energy is therefore crucial for sustainable management of available resources. In this presentation, we introduce a user-friendly data analysis toolbox that mines the available global data on food, energy and water and analyzes their interactions. This toolbox provides estimates of the water footprint for a wide range of food types in different countries and also approximates the required energy and water resources. The toolbox also provides estimates of the corresponding emissions and biofuel production of different crops. In summary, this toolbox allows evaluation of the dependencies of the food, energy, and water systems at the country scale. We present a global analysis of the interactions between water, food and energy from different perspectives, including efficiency and diversity of resource use.
Pizzolato, Claudio; Lloyd, David G.; Sartori, Massimo; Ceseracciu, Elena; Besier, Thor F.; Fregly, Benjamin J.; Reggiani, Monica
2015-01-01
Personalized neuromusculoskeletal (NMS) models can represent the neurological, physiological, and anatomical characteristics of an individual and can be used to estimate the forces generated inside the human body. Currently, publicly available software to calculate muscle forces are restricted to static and dynamic optimisation methods, or limited to isometric tasks only. We have created and made freely available for the research community the Calibrated EMG-Informed NMS Modelling Toolbox (CEINMS), an OpenSim plug-in that enables investigators to predict different neural control solutions for the same musculoskeletal geometry and measured movements. CEINMS comprises EMG-driven and EMG-informed algorithms that have been previously published and tested. It operates on dynamic skeletal models possessing any number of degrees of freedom and musculotendon units and can be calibrated to the individual to predict measured joint moments and EMG patterns. In this paper we describe the components of CEINMS and its integration with OpenSim. We then analyse how EMG-driven, EMG-assisted, and static optimisation neural control solutions affect the estimated joint moments, muscle forces, and muscle excitations, including muscle co-contraction. PMID:26522621
Proposal for the design of a zero gravity tool storage device
NASA Technical Reports Server (NTRS)
Stuckwisch, Sue; Carrion, Carlos A.; Phillips, Lee; Laughlin, Julia; Francois, Jason
1994-01-01
Astronauts frequently use a variety of hand tools during space missions, especially on repair missions. A toolbox is needed to allow storage and retrieval of tools with minimal difficulty. The toolbox must contain tools during launch, landing, and on-orbit operations. The toolbox will be used in the Shuttle Bay and therefore must withstand the hazardous space environment. The three main functions of the toolbox in space are: to protect the tools from the space environment and from damaging one another; to allow quick, one-handed access to the tools; and to minimize heat transfer between the astronaut's hand and the tools. This proposal explores the primary design issues associated with the design of the toolbox. Included are the customer and design specifications, global and refined function structures, possible solution principles, concept variants, and finally design recommendations.
NASA Astrophysics Data System (ADS)
Isken, Marius P.; Sudhaus, Henriette; Heimann, Sebastian; Steinberg, Andreas; Bathke, Hannes M.
2017-04-01
We present a modular open-source software framework (pyrocko, kite, grond; http://pyrocko.org) for rapid InSAR data post-processing and modelling of tectonic and volcanic displacement fields derived from satellite data. Our aim is to ease and streamline the joint optimisation of earthquake observations from InSAR and GPS data together with seismological waveforms for an improved estimation of the ruptures' parameters. Through this approach we can provide finite models of earthquake ruptures and therefore contribute to a timely and better understanding of earthquake kinematics. The new kite module enables fast processing of unwrapped InSAR scenes for source modelling: the spatial sub-sampling and data error/noise estimation for the interferogram is evaluated automatically and interactively. The rupture's near-field surface displacement data are then combined with seismic far-field waveforms and jointly modelled using the pyrocko.gf framework, which allows for fast forward modelling based on pre-calculated elastodynamic and elastostatic Green's functions. Lastly, the grond module supplies a bootstrap-based probabilistic (Monte Carlo) joint optimisation to estimate the parameters and uncertainties of a finite-source earthquake rupture model. We describe the developed and applied methods as an effort to establish a semi-automatic processing and modelling chain. The framework is applied to Sentinel-1 data from the 2016 Central Italy earthquake sequence, where we present the earthquake mechanism and rupture model from which we derive regions of increased Coulomb stress. The open-source software framework is developed at GFZ Potsdam and at the University of Kiel, Germany; it is written in the Python and C programming languages. The toolbox architecture is modular and independent, and can be utilized flexibly for a variety of geophysical problems.
This work is conducted within the BridGeS project (http://www.bridges.uni-kiel.de) funded by the German Research Foundation DFG through an Emmy-Noether grant.
Baumuratova, Tatiana; Dobre, Simona; Bastogne, Thierry; Sauter, Thomas
2013-01-01
Systems with bifurcations may experience abrupt, irreversible and often unwanted shifts in their performance, called critical transitions. For many systems, such as climate, the economy and ecosystems, it is highly desirable to identify indicators serving as early warnings of such regime shifts. Several statistical measures were recently proposed as early warnings of critical transitions, including increased variance, autocorrelation and skewness of experimental or model-generated data. The lack of an automated tool for model-based prediction of critical transitions led to the design of DyGloSA, a MATLAB toolbox for dynamical global parameter sensitivity analysis (GPSA) of ordinary differential equation models. We suggest that the switch in the dynamics of parameter sensitivities revealed by our toolbox is an early warning that a system is approaching a critical transition. We illustrate the efficiency of our toolbox by analyzing several models with bifurcations and predicting the time periods when systems can still avoid going to a critical transition by manipulating certain parameter values, which is not detectable with existing SA techniques. DyGloSA is based on the SBToolbox2 and contains functions that dynamically compute the global sensitivity indices of the system by applying four main GPSA methods: eFAST, Sobol's ANOVA, PRCC and WALS. It includes parallelized versions of the functions, enabling a significant reduction of the computational time (up to 12 times). DyGloSA is freely available as a set of MATLAB scripts at http://bio.uni.lu/systems_biology/software/dyglosa. It requires installation of MATLAB (version R2008b or later) and the Systems Biology Toolbox2 available at www.sbtoolbox2.org. DyGloSA can be run on Windows and Linux systems, both 32-bit and 64-bit. PMID:24367574
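A crude version of the variance-based first-order sensitivity indices that GPSA methods such as eFAST and Sobol's ANOVA compute can be sketched by binning Monte Carlo samples on one input and comparing the variance of the binned conditional means with the total output variance. The toy model and the binned estimator below are illustrative and far simpler than DyGloSA's implementations:

```python
import random

def model(p1, p2):
    """Toy model: the output depends strongly on p1, weakly on p2."""
    return 3.0 * p1 + 0.3 * p2 ** 2

def first_order_index(factor, n=20_000, bins=20, seed=1):
    """Crude variance-based first-order sensitivity index for one input
    (0 -> p1, 1 -> p2), both sampled uniformly on [0, 1)."""
    rng = random.Random(seed)
    xs = [(rng.random(), rng.random()) for _ in range(n)]
    ys = [model(a, b) for a, b in xs]
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / n
    groups = [[] for _ in range(bins)]
    for (a, b), y in zip(xs, ys):
        v = a if factor == 0 else b
        groups[min(int(v * bins), bins - 1)].append(y)
    cond_means = [sum(g) / len(g) for g in groups if g]
    var_of_means = sum((m - mean) ** 2 for m in cond_means) / len(cond_means)
    return var_of_means / var

s1 = first_order_index(0)  # large: p1 drives most of the output variance
s2 = first_order_index(1)  # small: p2 contributes little
```

DyGloSA's contribution is computing such indices *dynamically*, i.e. as functions of time along the model trajectory, so that a switch in their ranking can flag an approaching bifurcation.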
NASA Astrophysics Data System (ADS)
Harré, Michael S.
2013-02-01
Two aspects of modern economic theory have dominated the recent discussion on the state of the global economy: Crashes in financial markets and whether or not traditional notions of economic equilibrium have any validity. We have all seen the consequences of market crashes: plummeting share prices, businesses collapsing and considerable uncertainty throughout the global economy. This seems contrary to what might be expected of a system in equilibrium where growth dominates the relatively minor fluctuations in prices. Recent work from within economics as well as by physicists, psychologists and computational scientists has significantly improved our understanding of the more complex aspects of these systems. With this interdisciplinary approach in mind, a behavioural economics model of local optimisation is introduced and three general properties are proven. The first is that under very specific conditions local optimisation leads to a conventional macro-economic notion of a global equilibrium. The second is that if both global optimisation and economic growth are required then under very mild assumptions market catastrophes are an unavoidable consequence. Third, if only local optimisation and economic growth are required then there is sufficient parametric freedom for macro-economic policy makers to steer an economy around catastrophes without overtly disrupting local optimisation.
A novel global Harmony Search method based on Ant Colony Optimisation algorithm
NASA Astrophysics Data System (ADS)
Fouad, Allouani; Boukhetala, Djamel; Boudjema, Fares; Zenger, Kai; Gao, Xiao-Zhi
2016-03-01
The Global-best Harmony Search (GHS) is a recently developed stochastic optimisation algorithm which hybridises the Harmony Search (HS) method with the concept of swarm intelligence from particle swarm optimisation (PSO) to enhance its performance. In this article, a new optimisation algorithm called GHSACO is developed by incorporating the GHS with the Ant Colony Optimisation algorithm (ACO). Our method introduces a novel improvisation process, which differs from that of the GHS in the following aspects: (i) a modified harmony memory (HM) representation and design; (ii) the use of a global random switching mechanism to govern the choice between the ACO and GHS; and (iii) an additional memory consideration selection rule using the ACO random proportional transition rule with a pheromone trail update mechanism. The proposed GHSACO algorithm has been applied to various benchmark functions and constrained optimisation problems. Simulation results demonstrate that it can find significantly better solutions when compared with the original HS and some of its variants.
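For readers unfamiliar with the base algorithm, a minimal plain Harmony Search (without the GHS or ACO extensions described above) can be sketched as follows; the parameter values and the sphere test function are illustrative choices:

```python
import random

def harmony_search(cost, dim, bounds, hms=10, hmcr=0.9, par=0.3,
                   bw=0.05, iters=5000, seed=42):
    """Minimal plain Harmony Search (not the GHSACO hybrid): keep a
    memory of good solutions, improvise new harmonies from it, and
    replace the worst member whenever the new harmony is better."""
    rng = random.Random(seed)
    lo, hi = bounds
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    scores = [cost(h) for h in memory]
    for _ in range(iters):
        new = []
        for i in range(dim):
            if rng.random() < hmcr:          # memory consideration
                v = rng.choice(memory)[i]
                if rng.random() < par:       # pitch adjustment
                    v += rng.uniform(-bw, bw)
            else:                            # random selection
                v = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, v)))
        f = cost(new)
        worst = max(range(hms), key=lambda j: scores[j])
        if f < scores[worst]:
            memory[worst], scores[worst] = new, f
    best = min(range(hms), key=lambda j: scores[j])
    return memory[best], scores[best]

sphere = lambda x: sum(v * v for v in x)
sol, val = harmony_search(sphere, dim=3, bounds=(-5.0, 5.0))
```

GHS replaces the pitch-adjustment step with a move towards the best harmony in memory (the PSO-inspired element), and GHSACO additionally routes some improvisations through an ACO transition rule with pheromone updates.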
Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools
NASA Astrophysics Data System (ADS)
Tonini, F.; Liu, J.
2016-12-01
Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.
The Handover Toolbox: a knowledge exchange and training platform for improving patient care.
Drachsler, Hendrik; Kicken, Wendy; van der Klink, Marcel; Stoyanov, Slavi; Boshuizen, Henny P A; Barach, Paul
2012-12-01
Safe and effective patient handovers remain a global organisational and training challenge. Limited evidence supports available handover training programmes. Customisable training is a promising approach to improving the quality and sustainability of handover training and outcomes. We present a Handover Toolbox designed in the context of the European HANDOVER Project. The Toolbox aims to support physicians, nurses, individuals in health professions training, medical educators and handover experts by providing customised handover training tools for different clinical needs and contexts. The Handover Toolbox uses the Technology Enhanced Learning Design Process (TEL-DP), which encompasses user requirements analysis; writing personas; group concept mapping; analysis of suitable software; plus/minus/interesting (PMI) rating; and usability testing. TEL-DP is aligned with participatory design approaches and ensures development occurs in close collaboration with, and engagement of, key stakeholders. Application of TEL-DP confirmed that the ideal format of handover training differs for practicing professionals versus individuals in health profession education programmes. Training experts from different countries differed in their views on the optimal content and delivery of training. Analysis of suitable software identified ready-to-use systems that provide the required functionalities and can be further customised to users' needs. PMI rating and usability testing resulted in improved usability, navigation and uptake of the Handover Toolbox. The design of the Handover Toolbox was based on a carefully led stakeholder participatory design using the TEL-DP approach. The Toolbox supports a customisable learning approach that allows trainers to design training that addresses the specific information needs of the various target groups. We offer recommendations regarding the application of the Handover Toolbox to medical educators.
FracPaQ: A MATLAB™ toolbox for the quantification of fracture patterns
NASA Astrophysics Data System (ADS)
Healy, David; Rizzo, Roberto E.; Cornwell, David G.; Farrell, Natalie J. C.; Watkins, Hannah; Timms, Nick E.; Gomez-Rivas, Enrique; Smith, Michael
2017-02-01
The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, and spatial distributions often exhibit some kind of order. In detail, relationships may exist among the different fracture attributes, e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture attributes and patterns. This paper describes FracPaQ, a new open source, cross-platform toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on previously published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales, rock types and tectonic settings. The implemented methods are inherently scale-independent, and a key task where applicable is analysing and integrating quantitative fracture pattern data from micro- to macro-scales. The toolbox was developed in MATLAB™ and the source code is publicly available on GitHub™ and the Mathworks™ FileExchange. The code runs on any computer with MATLAB installed, including PCs with Microsoft Windows, Apple Macs with Mac OS X, and machines running different flavours of Linux.
The application, source code and sample input files are available in open repositories in the hope that other developers and researchers will optimise and extend the functionality for the benefit of the wider community.
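The most basic of the fracture-attribute calculations mentioned above, trace lengths and orientations from digitised 2-D segment endpoints, can be sketched in a few lines. This is an illustrative reimplementation in Python, not FracPaQ's MATLAB code:

```python
import math

def trace_stats(segments):
    """Given fracture traces as ((x1, y1), (x2, y2)) endpoint pairs,
    return (length, strike_degrees) per trace, with strike measured
    clockwise from north (the y axis) and folded into [0, 180)."""
    stats = []
    for (x1, y1), (x2, y2) in segments:
        dx, dy = x2 - x1, y2 - y1
        length = math.hypot(dx, dy)
        strike = math.degrees(math.atan2(dx, dy)) % 180.0
        stats.append((length, strike))
    return stats

traces = [((0, 0), (0, 2)),   # north-south trace, length 2
          ((0, 0), (1, 1)),   # NE-SW trace
          ((0, 0), (3, 0))]   # east-west trace, length 3
stats = trace_stats(traces)
```

From such per-trace attributes, the published methods FracPaQ implements derive rose diagrams, length distributions, and intensity/density estimators.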
NASA Astrophysics Data System (ADS)
Lea, J.
2017-12-01
The quantification of glacier change is a key variable within glacier monitoring, and the method used can be crucial to ensuring that data can be appropriately compared with environmental data. The topic and timescales of study (e.g. land/marine terminating environments; sub-annual/decadal/centennial/millennial timescales) often mean that different methods are more suitable for different problems. However, depending on the GIS/coding expertise of the user, some methods can be time consuming to undertake, making large-scale studies problematic. In addition, examples exist where different users have nominally applied the same methods in different studies, though with minor methodological inconsistencies in their approach. In turn, this has implications for data homogeneity where regional/global datasets may be constructed. Here, I present a simple toolbox scripted in a Matlab® environment that requires only glacier margin and glacier centreline data to quantify glacier length, glacier change between observations, and rate of change, in addition to other metrics. The toolbox includes the option to apply the established centreline or curvilinear box methods, or a new method, the variable box method, designed for tidewater margins, where box width is defined as the total width of the individual terminus observation. The toolbox is extremely flexible, and can be applied either as Matlab® functions within user scripts or via a graphical user interface (GUI) for those unfamiliar with a coding environment. In both instances, there is potential to apply the methods quickly to large datasets (100s-1000s of glaciers, with potentially similar numbers of observations each), thus ensuring large-scale methodological consistency (and therefore data homogeneity) and allowing regional/global scale analyses to be achievable for those with limited GIS/coding experience.
The toolbox has been evaluated against idealised scenarios demonstrating its accuracy, while feedback from undergraduate students who have trialled it is that it is intuitive and simple to use. When released, the toolbox will be free and open source, allowing users to modify, improve and expand upon the current version.
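The bookkeeping at the heart of such a toolbox, turning a centreline clipped at the terminus into a length, a length change and a rate of change, can be sketched as follows. The function names and the synthetic coordinates are illustrative, not the toolbox's actual interface:

```python
import math

def polyline_length(points):
    """Cumulative length of a polyline given as (x, y) vertices."""
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

def length_change(centreline_t0, centreline_t1, dt_years):
    """Glacier length change and annual rate between two observations,
    each given as the centreline clipped at the terminus position."""
    l0 = polyline_length(centreline_t0)
    l1 = polyline_length(centreline_t1)
    return l1 - l0, (l1 - l0) / dt_years

obs_1990 = [(0, 0), (3000, 0), (3000, 4000)]   # metres; 7 km centreline
obs_2000 = [(0, 0), (3000, 0), (3000, 3500)]   # terminus retreated 500 m
change, rate = length_change(obs_1990, obs_2000, dt_years=10)
```

The box methods mentioned above refine this by averaging the margin position across a (fixed or variable) box width rather than sampling it at a single centreline intersection, which is less sensitive to irregular terminus geometry.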
MacBean, Natasha; Maignan, Fabienne; Bacour, Cédric; Lewis, Philip; Peylin, Philippe; Guanter, Luis; Köhler, Philipp; Gómez-Dans, Jose; Disney, Mathias
2018-01-31
Accurate terrestrial biosphere model (TBM) simulations of gross carbon uptake (gross primary productivity, GPP) are essential for reliable future terrestrial carbon sink projections. However, uncertainties in TBM GPP estimates remain. Newly available satellite-derived sun-induced chlorophyll fluorescence (SIF) data offer a promising direction for addressing this issue by constraining regional-to-global scale modelled GPP. Here, we use monthly 0.5° GOME-2 SIF data from 2007 to 2011 to optimise GPP parameters of the ORCHIDEE TBM. The optimisation reduces GPP magnitude across all vegetation types except C4 plants. Global mean annual GPP therefore decreases from 194 ± 57 PgC yr-1 to 166 ± 10 PgC yr-1, bringing the model more in line with an up-scaled flux tower estimate of 133 PgC yr-1. The strongest reductions in GPP are seen in boreal forests: the result is a shift in the global GPP distribution, with a ~50% increase in the tropical-to-boreal productivity ratio. The optimisation resulted in a greater reduction in GPP than similar ORCHIDEE parameter optimisation studies using satellite-derived NDVI from MODIS and eddy covariance measurements of net CO2 fluxes from the FLUXNET network. Our study shows that SIF data will be instrumental in constraining TBM GPP estimates, with a consequent improvement in global carbon cycle projections.
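The simplest conceivable form of the parameter optimisation described above, fitting a single scaling factor of modelled GPP to observations by least squares, can be sketched as follows; the closed-form fit and the synthetic monthly values are illustrative and far simpler than the ORCHIDEE parameter optimisation:

```python
def fit_scale(modelled, observed):
    """Closed-form least-squares scale factor k minimising
    sum((k*m - o)**2), i.e. k = sum(m*o) / sum(m*m)."""
    num = sum(m * o for m, o in zip(modelled, observed))
    den = sum(m * m for m in modelled)
    return num / den

# hypothetical monthly GPP series: the model overestimates by roughly 20%
modelled = [10.0, 14.0, 18.0, 12.0]
observed = [8.4, 11.5, 15.1, 10.0]
k = fit_scale(modelled, observed)
rescaled = [k * m for m in modelled]
```

The actual study optimises many process parameters per vegetation type against gridded SIF fields, but the principle of pulling modelled productivity towards an observational constraint is the same.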
NASA Astrophysics Data System (ADS)
Wang, Hui; Chen, Huansheng; Wu, Qizhong; Lin, Junmin; Chen, Xueshun; Xie, Xinwei; Wang, Rongrong; Tang, Xiao; Wang, Zifa
2017-08-01
The Global Nested Air Quality Prediction Modeling System (GNAQPMS) is the global version of the Nested Air Quality Prediction Modeling System (NAQPMS), which is a multi-scale chemical transport model used for air quality forecast and atmospheric environmental research. In this study, we present the porting and optimisation of GNAQPMS on a second-generation Intel Xeon Phi processor, codenamed Knights Landing
(KNL). Compared with the first-generation Xeon Phi coprocessor (codenamed Knights Corner, KNC), KNL has many new hardware features, such as a bootable processor, high-performance in-package memory and ISA compatibility with Intel Xeon processors. In particular, we describe the five optimisations we applied to the key modules of GNAQPMS, including the CBM-Z gas-phase chemistry, advection, convection and wet deposition modules. These optimisations work well on both the KNL 7250 processor and the Intel Xeon E5-2697 V4 processor. They include (1) updating the pure Message Passing Interface (MPI) parallel mode to the hybrid parallel mode with MPI and OpenMP in the emission, advection, convection and gas-phase chemistry modules; (2) fully employing the 512-bit-wide vector processing units (VPUs) on the KNL platform; (3) reducing unnecessary memory access to improve cache efficiency; (4) reducing the thread local storage (TLS) in the CBM-Z gas-phase chemistry module to improve its OpenMP performance; and (5) changing the global communication from writing/reading interface files to MPI functions to improve the performance and the parallel scalability. These optimisations greatly improved the GNAQPMS performance. The same optimisations also work well for the Intel Xeon Broadwell processor, specifically the E5-2697 v4. Compared with the baseline version of GNAQPMS, the optimised version was 3.51× faster on KNL and 2.77× faster on the CPU. Moreover, the optimised version ran at 26% lower average power on KNL than on the CPU. With the combined performance and energy improvement, the KNL platform was 37.5% more efficient in power consumption compared with the CPU platform. The optimisations also enabled much greater parallel scalability on both the CPU cluster and the KNL cluster, scaling to 40 CPU nodes and 30 KNL nodes with parallel efficiencies of 70.4% and 42.2%, respectively.
Multiobjective optimisation of bogie suspension to boost speed on curves
NASA Astrophysics Data System (ADS)
Milad Mousavi-Bideleh, Seyed; Berbyuk, Viktor
2016-01-01
To improve safety and the maximum admissible speed in different operational scenarios, multiobjective optimisation of the bogie suspension components of a one-car railway vehicle model is considered. The vehicle model has 50 degrees of freedom and is developed in the multibody dynamics software SIMPACK. Track shift force, running stability and risk of derailment are selected as safety objective functions. The improved maximum admissible speeds of the vehicle on curves are determined based on track plane accelerations of up to 1.5 m/s². To reduce the number of design parameters for optimisation and improve computational efficiency, a global sensitivity analysis is performed using the multiplicative dimensional reduction method (M-DRM). A multistep optimisation routine based on a genetic algorithm (GA) and MATLAB/SIMPACK co-simulation is executed at three levels. The conventional secondary and primary suspension components of the bogie are chosen as the design parameters in the first two steps, respectively. The last step focuses on semi-active suspension: the input electrical current to magnetorheological yaw dampers is optimised to guarantee an appropriate safety level. Semi-active controllers are also applied and their effects on bogie dynamics are explored. The safety Pareto-optimised results are compared with those associated with in-service values. The global sensitivity analysis and multistep approach significantly reduced the number of design parameters and improved the computational efficiency of the optimisation. Furthermore, using the optimised values of the design parameters makes it possible to run the vehicle up to 13% faster on curves while guaranteeing a satisfactory safety level. The results obtained can be used in Pareto optimisation and active bogie suspension design problems.
A new effective operator for the hybrid algorithm for solving global optimisation problems
NASA Astrophysics Data System (ADS)
Duc, Le Anh; Li, Kenli; Nguyen, Tien Trong; Yen, Vu Minh; Truong, Tung Khac
2018-04-01
Hybrid algorithms have recently been used to solve complex single-objective optimisation problems. The ultimate goal is to find the global optimum using these algorithms. Building on existing algorithms (HP_CRO, PSO, RCCRO), this study proposes a new hybrid algorithm called MPC (Mean-PSO-CRO), which utilises a new Mean-Search Operator. This operator improves the search ability in areas of the solution space that the operators of previous algorithms do not explore, helping to find better solutions than the other algorithms. Moreover, the authors propose two parameters for balancing local and global search, as well as between different types of local search. In addition, three versions of this operator, which use different constraints, are introduced. Experimental results on 23 benchmark functions used in previous works show that our framework finds better optimal or close-to-optimal solutions with faster convergence for most of the benchmark functions, especially the high-dimensional ones. Thus, the proposed algorithm is more effective in solving single-objective optimisation problems than the other existing algorithms.
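The core idea of a mean-search operator can be sketched as follows. This is an illustrative reconstruction, not the MPC operator itself: the candidate placement, the Gaussian perturbation width and the greedy replacement rule are all assumptions made for the example.

```python
import random

# Illustrative mean-search-style operator for a population-based optimiser:
# propose a candidate near the mean of the current population, and greedily
# replace the worst member if the candidate improves on it. The perturbation
# width and replacement rule are assumptions for this sketch.

def sphere(x):
    return sum(v * v for v in x)

def mean_search_step(population, fitness, sigma=0.1, rng=random):
    dim = len(population[0])
    mean = [sum(ind[i] for ind in population) / len(population) for i in range(dim)]
    candidate = [m + rng.gauss(0.0, sigma) for m in mean]
    worst = max(range(len(population)), key=lambda i: fitness(population[i]))
    if fitness(candidate) < fitness(population[worst]):
        population[worst] = candidate

random.seed(0)
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
best_before = min(sphere(ind) for ind in pop)
for _ in range(200):
    mean_search_step(pop, sphere)
best_after = min(sphere(ind) for ind in pop)
print(best_before, "->", best_after)
```

Because candidates are only accepted when they beat the current worst member, the best fitness in the population never degrades.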
Optimizing detection and analysis of slow waves in sleep EEG.
Mensen, Armand; Riedner, Brady; Tononi, Giulio
2016-12-01
Analysis of individual slow waves in EEG recordings during sleep provides greater sensitivity and specificity than spectral power measures. However, parameters for detection and analysis have not been widely explored and validated. We present a new, open-source, MATLAB-based toolbox for the automatic detection and analysis of slow waves, with adjustable parameter settings as well as manual correction and exploration of the results using a multi-faceted visualization tool. We explore a large search space of parameter settings for slow wave detection and measure their effects on a selection of outcome parameters. Every choice of parameter setting had some effect on at least one outcome parameter. In general, the largest effect sizes were found when choosing the EEG reference, the type of canonical waveform, and amplitude thresholding. Previously published methods accurately detect large, global waves but are conservative and miss smaller-amplitude, local slow waves. The toolbox has additional benefits in terms of speed, user interface, and visualization options to compare and contrast slow waves. The exploration of parameter settings in the toolbox highlights the importance of careful selection of detection methods. The sensitivity and specificity of the automated detection can be improved by manually adding or deleting entire waves and/or specific channels using the toolbox visualization functions. The toolbox standardizes the detection procedure, sets the stage for reliable results and comparisons, and is easy to use without previous programming experience.
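Amplitude-threshold detection of negative half-waves, the building block such toolboxes elaborate on, can be sketched as follows. The threshold value, sampling rate and synthetic trace are hypothetical; real detectors add band-pass filtering, canonical-waveform matching and referencing choices first.

```python
import numpy as np

# Negative half-wave detection by amplitude threshold: find spans between a
# downward and the next upward zero crossing and keep those whose trough
# exceeds the criterion. Threshold, sampling rate and the synthetic trace are
# hypothetical; real detectors add filtering and waveform criteria first.

def detect_negative_halfwaves(signal, threshold=-40.0):
    """Return (start, end, trough) for negative half-waves below threshold."""
    sign = np.signbit(signal).astype(int)       # 1 where the signal is negative
    edges = np.diff(sign)
    starts = np.where(edges == 1)[0] + 1        # positive -> negative crossings
    ends = np.where(edges == -1)[0] + 1         # negative -> positive crossings
    waves = []
    for s in starts:
        later = ends[ends > s]
        if later.size == 0:
            break                               # wave truncated at the edge
        e = int(later[0])
        trough = float(signal[s:e].min())
        if trough <= threshold:                 # amplitude criterion (uV)
            waves.append((int(s), e, trough))
    return waves

# Synthetic 100 Hz trace: one large (-60 uV) and one small (-20 uV) deflection;
# only the large one passes the default -40 uV criterion.
t = np.arange(0, 2.5, 0.01)
sig = 60 * np.sin(2 * np.pi * t) * (t < 1) + 20 * np.sin(2 * np.pi * (t - 1)) * (t >= 1)
print(detect_negative_halfwaves(sig))
```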
Distributed Aerodynamic Sensing and Processing Toolbox
NASA Technical Reports Server (NTRS)
Brenner, Martin; Jutte, Christine; Mangalam, Arun
2011-01-01
A Distributed Aerodynamic Sensing and Processing (DASP) toolbox was designed and fabricated for flight test applications with an Aerostructures Test Wing (ATW) mounted under the fuselage of an F-15B on the Flight Test Fixture (FTF). DASP monitors and processes the aerodynamics together with the structural dynamics using nonintrusive, surface-mounted, hot-film sensing. This aerodynamic measurement tool benefits programs devoted to static/dynamic load alleviation, body freedom flutter suppression, buffet control, improvement of aerodynamic efficiency through cruise control, supersonic wave drag reduction through shock control, etc. The DASP toolbox measures local and global unsteady aerodynamic load distribution with distributed sensing. It determines the correlation between aerodynamic observables (aero forces) and structural dynamics, and allows control authority to be increased through aeroelastic shaping and active flow control. It offers improvements in flutter suppression, in particular body freedom flutter suppression, as well as in the aerodynamic performance of wings for increased range/endurance of manned/unmanned flight vehicles. Other improvements include inlet performance with closed-loop active flow control, and the development and validation of advanced analytical and computational tools for unsteady aerodynamics.
Ridgeway, William K; Millar, David P; Williamson, James R
2013-01-01
Fluorescence Correlation Spectroscopy (FCS) is widely used to quantitate reaction rates and concentrations of molecules in vitro and in vivo. We recently reported Fluorescence Triple Correlation Spectroscopy (F3CS), which correlates three signals together instead of two. F3CS can analyze the stoichiometries of complex mixtures and detect irreversible processes by identifying time-reversal asymmetries. Here we report the computational developments that were required for the realization of F3CS and present the results as the Triple Correlation Toolbox suite of programs. Triple Correlation Toolbox is a complete data analysis pipeline capable of acquiring, correlating and fitting large data sets. Each segment of the pipeline handles error estimates for accurate error-weighted global fitting. Data acquisition was accelerated with a combination of off-the-shelf counter-timer chips and vectorized operations on 128-bit registers. This allows desktop computers with inexpensive data acquisition cards to acquire hours of multiple-channel data with sub-microsecond time resolution. Off-line correlation integrals were implemented as a two delay time multiple-tau scheme that scales efficiently with multiple processors and provides an unprecedented view of linked dynamics. Global fitting routines are provided to fit FCS and F3CS data to models containing up to ten species. Triple Correlation Toolbox is a complete package that enables F3CS to be performed on existing microscopes. PMID:23525193
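A naive estimator of a third-order correlation conveys the flavour of F3CS. Normalising by the cubed mean is one common convention; the real estimator uses a two-delay multiple-tau scheme with error weighting, which this brute-force version omits.

```python
import numpy as np

# Naive estimator of a third-order correlation. Normalising by <F>^3 is one
# common convention; the real estimator uses a two-delay multiple-tau scheme
# with error weighting, which this brute-force version omits.

def triple_correlation(f, tau1, tau2):
    """Estimate G3(tau1, tau2) = <dF(t) dF(t+tau1) dF(t+tau2)> / <F>^3."""
    df = f - f.mean()
    n = len(f) - max(tau1, tau2)
    num = np.mean(df[:n] * df[tau1:tau1 + n] * df[tau2:tau2 + n])
    return float(num / f.mean() ** 3)

# A periodic, non-Gaussian toy trace: its third-order correlation is non-zero,
# unlike that of symmetric Gaussian noise.
f = np.tile([5.0, 1.0, 2.0, 3.0], 50)
print(triple_correlation(f, 1, 2))
```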
VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox
NASA Astrophysics Data System (ADS)
Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.
2016-12-01
VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
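The variogram idea at the heart of VARS can be illustrated with a brute-force Monte Carlo estimate of the directional variogram gamma_i(h) = 0.5 * E[(y(x + h*e_i) - y(x))^2]. The toy model and sample size below are assumptions for the example, and VARS-TOOL's star-based sampling is far more efficient than this sketch.

```python
import numpy as np

# Brute-force Monte Carlo estimate of the directional variogram
# gamma_i(h) = 0.5 * E[(y(x + h*e_i) - y(x))^2] for each parameter i.
# Larger values at a given scale h indicate higher sensitivity; IVARS
# integrates gamma_i over a range of scales. Toy model and sample size
# are illustrative choices, not part of VARS-TOOL.

def directional_variograms(model, n_dim, h, n_samples=2000, seed=42):
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0 - h, size=(n_samples, n_dim))
    gammas = []
    for i in range(n_dim):
        x_shift = x.copy()
        x_shift[:, i] += h
        gammas.append(0.5 * float(np.mean((model(x_shift) - model(x)) ** 2)))
    return gammas

# Toy model: strongly sensitive to x0, weakly to x1.
model = lambda x: 10.0 * x[:, 0] + 0.1 * x[:, 1] ** 2
g = directional_variograms(model, n_dim=2, h=0.1)
print(g)
```

For the linear term the variogram at h = 0.1 is exactly 0.5 * (10 * 0.1)^2 = 0.5, dwarfing the quadratic term's contribution.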
Lee, Chany; Jung, Young-Jin; Lee, Sang Jun; Im, Chang-Hwan
2017-02-01
Since there is no way to measure the electric current generated by transcranial direct current stimulation (tDCS) inside the human head through in vivo experiments, numerical analysis based on the finite element method has been widely used to estimate the electric field inside the head. In 2013, we released a MATLAB toolbox named COMETS, which has been used by a number of groups and has helped researchers gain insight into the electric field distribution during stimulation. The aim of this study was to develop an advanced MATLAB toolbox, named COMETS2, for the numerical analysis of the electric field generated by tDCS. COMETS2 can generate rectangular pad electrodes of any size at any position on the scalp surface. To reduce the large computational burden of repeatedly testing multiple electrode locations and sizes, a new technique to decompose the global stiffness matrix was proposed. As examples of potential applications, we observed the effects of electrode sizes and displacements on the results of the electric field analysis. The proposed mesh decomposition method significantly enhanced the overall computational efficiency. We implemented an automatic electrode modeler for the first time and proposed a new technique to enhance computational efficiency. In this paper, an efficient toolbox for tDCS analysis is introduced (freely available at http://www.cometstool.com). It is expected that COMETS2 will be a useful toolbox for researchers who want to benefit from the numerical analysis of electric fields generated by tDCS.
Quadratic Optimisation with One Quadratic Equality Constraint
2010-06-01
This report presents a theoretical framework for minimising a quadratic objective function subject to a quadratic equality constraint. The first part of the report gives a detailed algorithm that computes the global minimiser without calling special nonlinear optimisation solvers. The second part shows how the developed theory can be applied to solve the time-of-arrival geolocation problem.
Global exponential stability of BAM neural networks with time-varying delays: The discrete-time case
NASA Astrophysics Data System (ADS)
Raja, R.; Marshal Anthoni, S.
2011-02-01
This paper deals with the problem of stability analysis for a class of discrete-time bidirectional associative memory (BAM) neural networks with time-varying delays. By employing a Lyapunov functional and the linear matrix inequality (LMI) approach, new sufficient conditions are proposed for the global exponential stability of discrete-time BAM neural networks. The proposed LMI-based results can be easily checked using the MATLAB LMI Control Toolbox. Moreover, an example is provided to demonstrate the effectiveness of the proposed method.
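Searching for LMI decision variables requires a semidefinite-programming solver such as the MATLAB toolbox mentioned above, but verifying a given candidate reduces to definiteness checks. A sketch with a hypothetical stable discrete-time system, not taken from the paper:

```python
import numpy as np

# Verifying an LMI-type condition for a fixed candidate reduces to matrix
# definiteness checks (finding the candidate in the first place needs an SDP
# solver, e.g. the MATLAB LMI Control Toolbox). The system matrix below is a
# hypothetical stable example, not taken from the paper.

def is_positive_definite(m, tol=1e-10):
    """Check positive definiteness via eigenvalues of the symmetric part."""
    return bool(np.all(np.linalg.eigvalsh((m + m.T) / 2.0) > tol))

A = np.array([[0.5, 0.1],
              [0.0, 0.4]])        # hypothetical stable discrete-time system
P = np.eye(2)                     # candidate Lyapunov matrix, P > 0
M = A.T @ P @ A - P               # discrete Lyapunov condition requires M < 0
print(is_positive_definite(P), is_positive_definite(-M))
```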
Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B; Schürmann, Felix; Segev, Idan; Markram, Henry
2016-01-01
At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases.
ESA BRAT (Broadview Radar Altimetry Toolbox) and GUT (GOCE User Toolbox) toolboxes
NASA Astrophysics Data System (ADS)
Benveniste, J.; Ambrozio, A.; Restano, M.
2016-12-01
The Broadview Radar Altimetry Toolbox (BRAT) is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including the upcoming Sentinel-3A L1 and L2 products. A tutorial covering many use cases is included. BRAT's next release (4.0.0) is planned for September 2016. Based on community feedback, the frontend has been further improved and simplified, whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain the desired data while bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving combinations of data fields, which can be saved for future use, either by using embedded formulas, including those from oceanographic altimetry, or by implementing ad hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. The GOCE User Toolbox (GUT) is a compilation of tools for the use and analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data and allows gravity field data, in conjunction and consistently with any other auxiliary data set, to be pre-processed by beginners in gravity field processing, for oceanographic and hydrologic as well as solid earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during the GRACE and GOCE missions. In the current 3.0 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements aiming at facilitating the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies have been introduced.
Packaged with GUT is also GUT's VCM (Variance-Covariance Matrix) tool for analysing GOCE's variance-covariance matrices. BRAT and GUT toolboxes can be freely downloaded, along with ancillary material, at https://earth.esa.int/brat and https://earth.esa.int/gut.
A web-based tool for ranking landslide mitigation measures
NASA Astrophysics Data System (ADS)
Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.
2012-04-01
As part of the research done in the European project SafeLand "Living with landslide risk in Europe: Assessment, effects of global change, and risk management strategies", a compendium of structural and non-structural mitigation measures for different landslide types in Europe was prepared, and the measures were assembled into a web-based "toolbox". Emphasis was placed on providing a rational and flexible framework applicable to existing and future mitigation measures. The purpose of the web-based toolbox is to assist decision-making and to guide the user in the choice of the most appropriate mitigation measures. The mitigation measures were classified into three categories, describing whether they address the landslide hazard, the vulnerability, or the elements at risk themselves. The structural measures, which reduce the hazard, include: surface protection and control of surface erosion; measures modifying the slope geometry and/or mass distribution; measures modifying the surface water regime (surface drainage); measures modifying the groundwater regime (deep drainage); measures modifying the mechanical characteristics of the unstable mass; transfer of loads to more competent strata; retaining structures (to modify slope geometry and/or to transfer stress to a competent layer); deviating the path of landslide debris; dissipating the energy of debris flows; and arresting and containing landslide debris or rock fall. The non-structural mitigation measures, which reduce either the hazard or the consequences (the vulnerability and exposure of elements at risk), include: early warning systems; restricting or discouraging construction activities; increasing the resistance or coping capacity of elements at risk; relocation of elements at risk; and sharing of risk through insurance.
The measures are described in the toolbox with fact sheets providing a brief description, guidance on design, schematic details, practical examples and references for each mitigation measure. Each of the measures was given a score on its ability and applicability for different types of landslides and boundary conditions, and a decision support matrix was established. The web-based toolbox organizes the information in the compendium and provides an algorithm to rank the measures on the basis of the decision support matrix, and on the basis of the risk level estimated at the site. The toolbox includes a description of the case under study and offers a simplified option for estimating the hazard and risk levels of the slide at hand. The user selects the mitigation measures to be included in the assessment. The toolbox then ranks, with built-in assessment factors and weights and/or with user-defined ranking values and criteria, the mitigation measures included in the analysis. The toolbox includes data management, e.g. saving data half-way in an analysis, returning to an earlier case, looking up prepared examples or looking up information on mitigation measures. The toolbox also generates a report and has user-forum and help features. The presentation will give an overview of the mitigation measures considered and examples of the use of the toolbox, and will take the attendees through the application of the toolbox.
Stability analysis for stochastic BAM nonlinear neural network with delays
NASA Astrophysics Data System (ADS)
Lv, Z. W.; Shu, H. S.; Wei, G. L.
2008-02-01
In this paper, stochastic bidirectional associative memory (BAM) neural networks with constant or time-varying delays are considered. Based on a Lyapunov-Krasovskii functional and stochastic stability analysis theory, we derive several sufficient conditions that guarantee global asymptotic stability in the mean square. Our investigation shows that the stochastic BAM neural networks are globally asymptotically stable in the mean square if there are solutions to certain linear matrix inequalities (LMIs). Hence, the global asymptotic stability of the stochastic BAM neural networks can be easily checked using the MATLAB LMI toolbox. A numerical example is given to demonstrate the usefulness of the proposed global asymptotic stability criteria.
The Toolbox for Local and Global Plagiarism Detection
ERIC Educational Resources Information Center
Butakov, Sergey; Scherbinin, Vladislav
2009-01-01
Digital plagiarism is a problem for educators all over the world. There are many software tools on the market for uncovering digital plagiarism. Most of them can work only with text submissions. In this paper, we present a new architecture for a plagiarism detection tool that can work with many different kinds of digital submissions, from plain or…
BiKEGG: a COBRA toolbox extension for bridging the BiGG and KEGG databases.
Jamialahmadi, Oveis; Motamedian, Ehsan; Hashemi-Najafabadi, Sameereh
2016-10-18
Development of an interface tool between the Biochemical, Genetic and Genomic (BiGG) and KEGG databases is necessary for simultaneous access to the features of both databases. For this purpose, we present the BiKEGG toolbox, an open-source COBRA toolbox extension providing a set of functions to infer the reaction correspondences between KEGG reaction identifiers and those in the BiGG knowledgebase using a combination of manual verification and computational methods. Inferred reaction correspondences using this approach are supported by evidence from the literature, which provides a higher number of reconciled reactions between these two databases compared with the MetaNetX and MetRxn databases. This set of equivalent reactions is then used to automatically superimpose the fluxes predicted using COBRA methods on classical KEGG pathway maps or to create a customized metabolic map based on the KEGG global metabolic pathway, and to find the corresponding reactions in BiGG based on the genome annotation of an organism in the KEGG database. Customized metabolic maps can be created for a set of pathways of interest, for the whole KEGG global map, or exclusively for pathways containing at least one flux-carrying reaction. This flexibility in visualization enables BiKEGG to indicate reaction directionality as well as to visualize reaction fluxes for different static or dynamic conditions in an animated manner. BiKEGG allows the user to export (1) the visualized metabolic maps to various standard image formats, or save them as a video or animated GIF file, and (2) the equivalent reactions for an organism as an Excel spreadsheet.
Measurement, monitoring, and verification: make it work!
Coeli M. Hoover
2011-01-01
The capacity of forests to absorb and store carbon is certainly, as the authors note, an important tool in the greenhouse gas mitigation toolbox. Our understanding of what elements can make forest carbon offset projects successful has grown a great deal over time, as the global community has come to understand that forest degradation and conversion are the result of a...
Optimisation by hierarchical search
NASA Astrophysics Data System (ADS)
Zintchenko, Ilia; Hastings, Matthew; Troyer, Matthias
2015-03-01
Finding optimal values for a set of variables relative to a cost function gives rise to some of the hardest problems in physics, computer science and applied mathematics. Although often very simple in their formulation, these problems have a complex cost function landscape which prevents currently known algorithms from efficiently finding the global optimum. Countless techniques have been proposed to partially circumvent this problem, but an efficient method is yet to be found. We present a heuristic, general-purpose approach to potentially improve the performance of conventional algorithms or special-purpose hardware devices by optimising groups of variables in a hierarchical way. We apply this approach to problems in combinatorial optimisation, machine learning and other fields.
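Hierarchical group optimisation can be sketched as block-wise local search: repeatedly pick a group of variables and improve it while the rest stay fixed. The group construction, local search and test function below are illustrative choices, not the authors' exact scheme.

```python
import random

# Block-wise sketch of hierarchical group optimisation: pick a group of
# variables and locally optimise it with the remaining variables frozen.
# Group construction, local search and the test function are illustrative.

def local_search(x, group, f, steps=50, sigma=0.2, rng=random):
    best = f(x)
    for _ in range(steps):
        trial = list(x)
        for i in group:
            trial[i] += rng.gauss(0.0, sigma)    # perturb only this group
        if f(trial) < best:
            x, best = trial, f(trial)            # accept improvements only
    return x

def hierarchical_optimise(f, x, group_size=2, rounds=30, rng=random):
    for _ in range(rounds):
        start = rng.randrange(len(x))
        group = [(start + k) % len(x) for k in range(group_size)]
        x = local_search(x, group, f, rng=rng)
    return x

def rosenbrock(x):
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2
               for i in range(len(x) - 1))

random.seed(1)
x0 = [random.uniform(-5, 5) for _ in range(4)]
x_opt = hierarchical_optimise(rosenbrock, list(x0))
print(rosenbrock(x0), "->", rosenbrock(x_opt))
```

Since only improving moves are accepted, the cost is guaranteed not to increase; the hierarchical variant in the paper additionally nests such groups across scales.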
Cultural-based particle swarm for dynamic optimisation problems
NASA Astrophysics Data System (ADS)
Daneshyari, Moayed; Yen, Gary G.
2012-07-01
Many practical optimisation problems involve uncertainties; a significant number of these belong to the dynamic optimisation problem (DOP) category, in which the fitness function changes through time. In this study, we propose a cultural-based particle swarm optimisation (PSO) to solve DOPs. A cultural framework is adopted that incorporates the required information from the PSO into five sections of the belief space, namely situational, temporal, domain, normative and spatial knowledge. The stored information is used to detect changes in the environment and assists the response to change through diversity-based repulsion among particles and migration among swarms in the population space; it also helps in selecting the leading particles at three different levels: personal, swarm and global. Comparison of the proposed heuristics over several difficult dynamic benchmark problems demonstrates better or equal performance with respect to most of the other selected state-of-the-art dynamic PSO heuristics.
A management and optimisation model for water supply planning in water deficit areas
NASA Astrophysics Data System (ADS)
Molinos-Senante, María; Hernández-Sancho, Francesc; Mocholí-Arce, Manuel; Sala-Garrido, Ramón
2014-07-01
The integrated water resources management approach has proven to be a suitable option for efficient, equitable and sustainable water management. In water-poor regions experiencing acute and/or chronic shortages, optimisation techniques are a useful tool for supporting the decision process of water allocation. In order to maximise the value of water use, an optimisation model was developed which involves multiple supply sources (conventional and non-conventional) and multiple users. Penalties, representing monetary losses in the event of an unfulfilled water demand, have been incorporated into the objective function. This model represents a novel approach which considers water distribution efficiency and the physical connections between water supply and demand points. Subsequent empirical testing using data from a Spanish Mediterranean river basin demonstrated the usefulness of the global optimisation model to solve existing water imbalances at the river basin level.
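The role of penalties for unmet demand can be illustrated with a deliberately simplified greedy allocation. The paper's model is a full optimisation with distribution-efficiency and connectivity constraints; all names and numbers below are hypothetical.

```python
# Greedy sketch of penalty-aware allocation: serve the users with the highest
# shortage penalties first, from the cheapest sources. The real model is an
# optimisation over supply-demand connections with distribution-efficiency
# terms; all names and numbers here are hypothetical.

def allocate(sources, users):
    """sources: {name: (capacity, unit_cost)}; users: {name: (demand, penalty)}."""
    remaining = {s: cap for s, (cap, _) in sources.items()}
    plan, unmet = {}, {}
    # Highest penalty for unmet demand gets served first.
    for u, (demand, penalty) in sorted(users.items(), key=lambda kv: -kv[1][1]):
        need = demand
        # Draw from cheaper sources first.
        for s, (_, cost) in sorted(sources.items(), key=lambda kv: kv[1][1]):
            take = min(need, remaining[s])
            if take > 0:
                plan[(s, u)] = take
                remaining[s] -= take
                need -= take
        unmet[u] = need
    return plan, unmet

sources = {"reservoir": (50, 0.1), "desalination": (30, 0.8)}
users = {"urban": (40, 5.0), "agriculture": (60, 1.0)}
plan, unmet = allocate(sources, users)
print(plan, unmet)
```

With these hypothetical figures the high-penalty urban demand is fully met from the cheap reservoir, and the shortfall is pushed onto the lower-penalty agricultural user, mirroring how penalty terms shape the optimal allocation.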
Registration of in vivo MR to histology of rodent brains using blockface imaging
NASA Astrophysics Data System (ADS)
Uberti, Mariano; Liu, Yutong; Dou, Huanyu; Mosley, R. Lee; Gendelman, Howard E.; Boska, Michael
2009-02-01
Registration of MRI to histopathological sections can enhance bioimaging validation for use in pathobiologic, diagnostic, and therapeutic evaluations. However, commonly used registration methods fall short of this goal due to tissue shrinkage and tearing after brain extraction and preparation. In an attempt to overcome these limitations, we developed a software toolbox using 3D blockface imaging as the common space of reference. This toolbox includes a semi-automatic brain extraction technique using constraint level sets (CLS), 3D reconstruction methods for the blockface and MR volumes, and a 2D warping technique using thin-plate splines with landmark optimization. Using this toolbox, the rodent brain volume is first extracted from the whole-head MRI using CLS. The blockface volume is reconstructed, followed by 3D registration of the brain MRI to the blockface volume to correct the global deformations due to brain extraction and fixation. Finally, the registered MRI and histological slices are warped to the corresponding blockface images to correct slice-specific deformations. The CLS brain extraction technique was validated by comparison with manual results, showing 94% overlap. The image warping technique was validated by calculating the target registration error (TRE); results showed a registration accuracy of TRE < 1 pixel. Lastly, the registration method and the software tools developed were used to validate cell migration in murine human immunodeficiency virus type one encephalitis.
Babakhanyan, Ida; McKenna, Benjamin S; Casaletto, Kaitlin B; Nowinski, Cindy J; Heaton, Robert K
2018-01-01
The National Institutes of Health Toolbox Emotion Battery (NIHTB-EB) is a "common currency", computerized assessment developed to measure the full spectrum of emotional health. Though comprehensive, the NIHTB-EB's 17 scales may be unwieldy for users aiming to capture more global indices of emotional functioning. The NIHTB-EB was administered to 1,036 English-speaking and 408 Spanish-speaking adults as part of the NIH Toolbox norming project. We examined the factor structure of the NIHTB-EB in English- and Spanish-speaking adults and developed factor analysis-based summary scores. Census-weighted norms were presented for English speakers, and sample-weighted norms were presented for Spanish speakers. Exploratory factor analysis for both English- and Spanish-speaking cohorts resulted in the same 3-factor solution: 1) negative affect, 2) social satisfaction, and 3) psychological well-being. Confirmatory factor analysis supported similar factor structures for the English- and Spanish-speaking cohorts. Model fit indices fell within the acceptable/good range, and our final solution was optimal compared with other solutions. Summary scores based upon the normative samples appear to be psychometrically supported and should be applied to clinical samples to further validate the factor structures and investigate rates of problematic emotions in medical and psychiatric populations.
Shen, Yi; Dai, Wei; Richards, Virginia M
2015-03-01
A MATLAB toolbox for the efficient estimation of the threshold, slope, and lapse rate of the psychometric function is described. The toolbox enables the efficient implementation of the updated maximum-likelihood (UML) procedure. The toolbox uses an object-oriented architecture for organizing the experimental variables and computational algorithms, which provides experimenters with flexibility in experimental design and data management. Descriptions of the UML procedure and the UML Toolbox are provided, followed by toolbox use examples. Finally, guidelines and recommendations of parameter configurations are given.
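To illustrate the estimation problem the toolbox addresses (not its actual MATLAB API), here is a minimal Python sketch that fits the threshold, slope, and lapse rate of a logistic psychometric function by brute-force maximum likelihood over a coarse grid; the adaptive UML procedure instead refines these estimates trial by trial. Function names, grid values, and the simulated data are illustrative assumptions.

```python
import math

def psychometric(x, alpha, beta, lam):
    # Logistic psychometric function: alpha = threshold, beta = slope,
    # lam = lapse rate (floor of lam/2, ceiling of 1 - lam/2).
    return lam / 2 + (1 - lam) / (1 + math.exp(-beta * (x - alpha)))

def neg_log_likelihood(data, alpha, beta, lam):
    # data: (stimulus level, number correct, number of trials) triples
    nll = 0.0
    for x, k, n in data:
        p = min(max(psychometric(x, alpha, beta, lam), 1e-9), 1 - 1e-9)
        nll -= k * math.log(p) + (n - k) * math.log(1 - p)
    return nll

def fit_grid(data):
    # Brute-force maximum-likelihood fit over a coarse parameter grid
    # (the UML procedure updates estimates adaptively instead).
    candidates = ((a / 10, b, l)
                  for a in range(-20, 21)
                  for b in (0.5, 1, 2, 4, 8)
                  for l in (0.0, 0.02, 0.05))
    return min(candidates, key=lambda t: neg_log_likelihood(data, *t))

# Idealised data from a listener with threshold 0.5, slope 4, 2% lapses:
# 100 trials at each of 21 stimulus levels, correct counts rounded to
# their expected values so the fit is deterministic.
truth = (0.5, 4, 0.02)
data = [(x / 5, round(100 * psychometric(x / 5, *truth)), 100)
        for x in range(-10, 11)]
alpha_hat, beta_hat, lam_hat = fit_grid(data)
print(alpha_hat, beta_hat, lam_hat)
```

With idealised data and the true parameters on the grid, the fit recovers the generating threshold and slope.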
Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki
2014-09-01
Mathematical modeling has become a standard technique to understand the dynamics of complex biochemical systems. To promote such modeling, we had previously developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors and analyzes its robustness. To enhance the feasibility of CADLIVE and extend its functions, we propose the CADLIVE toolbox, available for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with instructions.
Rose, Jonas; Otto, Tobias; Dittrich, Lars
2008-10-30
The Biopsychology-Toolbox is a free, open-source Matlab toolbox for the control of behavioral experiments. The major aim of the project was to provide a set of basic tools that allow programming novices to control basic hardware used for behavioral experimentation without limiting the power and flexibility of the underlying programming language. The modular design of the toolbox allows porting of individual parts as well as entire paradigms between different types of hardware. In addition to the toolbox, this project offers a platform for the exchange of functions, hardware solutions and complete behavioral paradigms.
Testing adaptive toolbox models: a Bayesian hierarchical approach.
Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
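A toy illustration of the Bayesian approach described above, under assumptions not taken from the paper: a hypothetical two-strategy toolbox in which the agent applies strategy 1 with unknown probability w and responds with a 5% error rate. Integrating the likelihood over w (uniform prior, simple grid rule) gives the marginal likelihood of the toolbox model, and a Bayes factor against random guessing quantifies the evidence.

```python
import math

def marginal_likelihood(choices, pred1, pred2, grid=200):
    # Toolbox model: on each trial the agent applies strategy 1 with
    # probability w and strategy 2 otherwise, with a 5% response-error
    # rate. Integrate the likelihood over w under a uniform prior,
    # using a midpoint grid rule.
    eps = 0.05
    total = 0.0
    for i in range(1, grid + 1):
        w = (i - 0.5) / grid
        ll = 0.0
        for c, p1, p2 in zip(choices, pred1, pred2):
            p = (w * (1 - eps if c == p1 else eps)
                 + (1 - w) * (1 - eps if c == p2 else eps))
            ll += math.log(p)
        total += math.exp(ll) / grid
    return total

# Hypothetical data: 20 trials where the agent mostly follows strategy 1.
pred1 = [1] * 20
pred2 = [0] * 20
choices = [1] * 17 + [0] * 3
m_toolbox = marginal_likelihood(choices, pred1, pred2)
m_guess = 0.5 ** 20          # baseline: random guessing
bayes_factor = m_toolbox / m_guess
print(bayes_factor)
```

Because the marginal likelihood averages over w, strategy sprawl is automatically penalised: extra strategies that do not earn their keep dilute the prior mass and lower the model's evidence.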
NASA Astrophysics Data System (ADS)
Zhang, Langwen; Xie, Wei; Wang, Jingcheng
2017-11-01
In this work, synthesis of robust distributed model predictive control (MPC) is presented for a class of linear systems subject to structured time-varying uncertainties. By decomposing a global system into smaller dimensional subsystems, a set of distributed MPC controllers, instead of a centralised controller, are designed. To ensure the robust stability of the closed-loop system with respect to model uncertainties, distributed state feedback laws are obtained by solving a min-max optimisation problem. The design of robust distributed MPC is then transformed into solving a minimisation optimisation problem with linear matrix inequality constraints. An iterative online algorithm with adjustable maximum iteration is proposed to coordinate the distributed controllers to achieve a global performance. The simulation results show the effectiveness of the proposed robust distributed MPC algorithm.
The conception of life in synthetic biology.
Deplazes-Zemp, Anna
2012-12-01
The phrase 'synthetic biology' is used to describe a set of different scientific and technological disciplines, which share the objective to design and produce new life forms. This essay addresses the following questions: What conception of life stands behind this ambitious objective? In what relation does this conception of life stand to that of traditional biology and biotechnology? And, could such a conception of life raise ethical concerns? Three different observations that provide useful indications for the conception of life in synthetic biology will be discussed in detail: 1. Synthetic biologists focus on different features of living organisms in order to design new life forms, 2. Synthetic biologists want to contribute to the understanding of life, and 3. Synthetic biologists want to modify life through a rational design, which implies the notions of utilising, minimising/optimising, varying and overcoming life. These observations indicate a tight connection between science and technology, a focus on selected aspects of life, a production-oriented approach to life, and a design-oriented understanding of life. It will be argued that through this conception of life synthetic biologists present life in a different light. This conception of life will be illustrated by the metaphor of a toolbox. According to the notion of life as a toolbox, the different features of living organisms are perceived as various rationally designed instruments that can be used for the production of the living organism itself or secondary products made by the organism. According to certain ethical positions this conception of life might raise ethical concerns related to the status of the organism, the motives of the scientists and the role of technology in our society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelt, Daniël M.; Gürsoy, Doğa; Palenstijn, Willem Jan
2016-04-28
The processing of tomographic synchrotron data requires advanced and efficient software to be able to produce accurate results in reasonable time. In this paper, the integration of two software toolboxes, TomoPy and the ASTRA toolbox, which, together, provide a powerful framework for processing tomographic data, is presented. The integration combines the advantages of both toolboxes, such as the user-friendliness and CPU-efficient methods of TomoPy and the flexibility and optimized GPU-based reconstruction methods of the ASTRA toolbox. It is shown that both toolboxes can be easily installed and used together, requiring only minor changes to existing TomoPy scripts. Furthermore, it is shown that the efficient GPU-based reconstruction methods of the ASTRA toolbox can significantly decrease the time needed to reconstruct large datasets, and that advanced reconstruction methods can improve reconstruction quality compared with TomoPy's standard reconstruction method.
Heasly, Benjamin S; Cottaris, Nicolas P; Lichtman, Daniel P; Xiao, Bei; Brainard, David H
2014-02-07
RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.
Arc_Mat: a Matlab-based spatial data analysis toolbox
NASA Astrophysics Data System (ADS)
Liu, Xingjian; Lesage, James
2010-03-01
This article presents an overview of Arc_Mat, a Matlab-based spatial data analysis software package whose source code has been placed in the public domain. An earlier version of the Arc_Mat toolbox was developed to extract map polygon and database information from ESRI shapefiles and provide high-quality mapping in the Matlab software environment. We discuss revisions to the toolbox that utilize the enhanced computing and graphing capabilities of more recent versions of Matlab, restructure the toolbox with object-oriented programming features, and provide more comprehensive functions for spatial data analysis. The Arc_Mat toolbox functionality includes basic choropleth mapping; exploratory spatial data analysis that provides exploratory views of spatial data through various graphs, for example, histograms, Moran scatterplots, three-dimensional scatterplots, density distribution plots, and parallel coordinate plots; and more formal spatial data modeling that draws on the extensive Spatial Econometrics Toolbox functions. We briefly review the design aspects of the revised Arc_Mat and provide some illustrative examples that highlight representative uses of the toolbox. Finally, we discuss programming with and customizing the Arc_Mat toolbox functionalities.
van Griensven, A; Vanrolleghem, P A
2006-01-01
Web-based toolboxes are handy tools to inform experienced users of existing software in their disciplines. However, for the implementation of the Water Framework Directive, a much more diverse public (water managers, consultancy firms, scientists, etc.) will ask for a very wide diversity of Information and Communication Technology (ICT) tools. It is obvious that the users of a web-based ICT-toolbox providing all this will not be experts in all of the disciplines and that a toolbox for ICT tools for Water Framework Directive implementation should thus go beyond just making interesting web-links. To deal with this issue, expert knowledge is brought to the users through the incorporation of visitor-geared guidance (materials) in the Harmoni-CA toolbox. Small workshops of expert teams were organized to deliver documents explaining why the tools are important, when they are required and what activity they support/perform, as well as a categorization of the multitude of available tools. An integration of this information in the web-based toolbox helps the users to browse through a toolbox containing tools, reports, guidance documents and interesting links. The Harmoni-CA toolbox thus provides not only a virtual toolbox, but incorporates a virtual expert as well.
Lévy flight artificial bee colony algorithm
NASA Astrophysics Data System (ADS)
Sharma, Harish; Bansal, Jagdish Chand; Arya, K. V.; Yang, Xin-She
2016-08-01
Artificial bee colony (ABC) optimisation algorithm is a relatively simple and recent population-based probabilistic approach for global optimisation. The solution search equation of ABC is significantly influenced by a random quantity which helps in exploration at the cost of exploitation of the search space. In the ABC, there is a high chance to skip the true solution due to its large step sizes. In order to balance between diversity and convergence in the ABC, a Lévy flight inspired search strategy is proposed and integrated with ABC. The proposed strategy, named Lévy Flight ABC (LFABC), has both local and global search capability simultaneously, achieved by tuning the Lévy flight parameters and thus automatically tuning the step sizes. In the LFABC, new solutions are generated around the best solution, which helps to enhance the exploitation capability of ABC. Furthermore, to improve the exploration capability, the number of scout bees is increased. The experiments on 20 test problems of different complexities and five real-world engineering optimisation problems show that the proposed strategy outperforms the basic ABC and recent variants of ABC, namely Gbest-guided ABC, best-so-far ABC and modified ABC, in most of the experiments.
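Mantegna's algorithm is a standard way to draw Lévy-distributed step lengths, and a short sketch shows why such steps balance exploitation and exploration: most draws are small, but occasional very long jumps occur. The `levy_search_move` below is an illustrative move in the spirit of LFABC, not the paper's exact update equation.

```python
import math, random

def levy_step(beta=1.5):
    # Mantegna's algorithm for a Levy-distributed step length with
    # stability index beta (1 < beta <= 2): u / |v|^(1/beta) with
    # u ~ N(0, sigma^2), v ~ N(0, 1).
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def levy_search_move(bee, best, beta=1.5, scale=0.01):
    # Illustrative LFABC-style move (an assumption, not the published
    # equation): perturb a candidate towards the best-so-far solution
    # with Levy-distributed step sizes.
    return [x + scale * levy_step(beta) * (b - x) for x, b in zip(bee, best)]

random.seed(0)
steps = [abs(levy_step()) for _ in range(10000)]
med = sorted(steps)[len(steps) // 2]
mx = max(steps)
print(med, mx)  # typical step is small; the heavy tail produces rare long jumps
```

The heavy-tailed step distribution is what lets a Lévy-flight search exploit locally most of the time while still escaping poor regions with occasional long jumps.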
Modelling the protocol stack in NCS with deterministic and stochastic petri net
NASA Astrophysics Data System (ADS)
Hui, Chen; Chunjie, Zhou; Weifeng, Zhu
2011-06-01
The protocol stack is the basis of networked control systems (NCS). Full or partial reconfiguration of the protocol stack offers both optimised communication service and system performance. Nowadays, field testing is an unrealistic way to determine the performance of a reconfigurable protocol stack, and the Petri net formal description technique offers the best combination of intuitive representation, tool support and analytical capabilities. Traditionally, separation between the different layers of the OSI model has been a common practice. Nevertheless, such a layered modelling analysis framework of the protocol stack leads to the lack of global optimisation for protocol reconfiguration. In this article, we propose a general modelling analysis framework for NCS based on the cross-layer concept, which establishes an efficient system scheduling model by abstracting the time constraint, the task interrelation, the processor and the bus sub-models from the upper and lower layers (application, data link and physical layer). Cross-layer design can help to overcome the inadequacy of global optimisation based on information sharing between protocol layers. To illustrate the framework, we take controller area network (CAN) as a case study. The simulation results of the deterministic and stochastic Petri net (DSPN) model can help us adjust the message scheduling scheme and obtain better system performance.
Stochastic optimisation of water allocation on a global scale
NASA Astrophysics Data System (ADS)
Schmitz, Oliver; Straatsma, Menno; Karssenberg, Derek; Bierkens, Marc F. P.
2014-05-01
Climate change, increasing population and further economic developments are expected to increase water scarcity for many regions of the world. Optimal water management strategies are required to minimise the water gap between water supply and domestic, industrial and agricultural water demand. A crucial aspect of water allocation is the spatial scale of optimisation. Blue water supply peaks at the upstream parts of large catchments, whereas demands are often largest at the industrialised downstream parts. Two extremes exist in water allocation: (i) 'First come, first served,' which allows the upstream water demands to be fulfilled without considerations of downstream demands, and (ii) 'All for one, one for all' that satisfies water allocation over the whole catchment. In practice, water treaties govern intermediate solutions. The objective of this study is to determine the effect of these two end members on water allocation optimisation with respect to water scarcity. We conduct this study on a global scale with the year 2100 as temporal horizon. Water supply is calculated using the hydrological model PCR-GLOBWB, operating at a 5 arcminutes resolution and a daily time step. PCR-GLOBWB is forced with temperature and precipitation fields from the HadGEM2-ES general circulation model that participated in the latest Coupled Model Intercomparison Project (CMIP5). Water demands are calculated for representative concentration pathway 6.0 (RCP 6.0) and shared socio-economic pathway scenario 2 (SSP2). To enable the fast computation of the optimisation, we developed a hydrologically correct network of 1800 basin segments with an average size of 100 000 square kilometres. The maximum number of nodes in a network was 140 for the Amazon Basin. Water demands and supplies are aggregated to cubic kilometres per month per segment. A new open source implementation of the water allocation is developed for the stochastic optimisation of the water allocation.
We apply a genetic algorithm for each segment to estimate the set of parameters that distributes the water supply over each node. We use the Python programming language and a flexible software architecture that allows us to straightforwardly (1) exchange the process description for the nodes such that different water allocation schemes can be tested, (2) exchange the objective function, (3) apply the optimisation either to the whole catchment or to different sub-levels, and (4) use multi-core CPUs concurrently, thereby reducing computation time. We demonstrate the application of the scientific workflow to the model outputs of PCR-GLOBWB and present first results on how water scarcity depends on the choice between the two extremes in water allocation.
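A minimal sketch of per-segment optimisation in this spirit, assuming total unmet demand as the objective to minimise: a real-coded genetic algorithm with tournament selection, arithmetic crossover and Gaussian mutation evolves the allocation weights for a segment's nodes. The operators, constants and toy data are illustrative assumptions and do not reproduce the PCR-GLOBWB workflow.

```python
import random

def scarcity(weights, supply, demands):
    # Normalise weights into allocation fractions and score the total
    # unmet demand (the quantity the GA minimises).
    total = sum(weights) or 1.0
    alloc = [supply * w / total for w in weights]
    return sum(max(d - a, 0.0) for d, a in zip(demands, alloc))

def tournament(pop, supply, demands):
    # Binary tournament selection: keep the fitter of two random picks.
    a, b = random.sample(pop, 2)
    return a if scarcity(a, supply, demands) < scarcity(b, supply, demands) else b

def evolve(supply, demands, pop_size=40, gens=60):
    # Minimal real-coded GA: tournament selection, arithmetic crossover,
    # Gaussian mutation, clamped to non-negative weights.
    n = len(demands)
    pop = [[random.random() for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        new = []
        for _ in range(pop_size):
            p1 = tournament(pop, supply, demands)
            p2 = tournament(pop, supply, demands)
            a = random.random()
            child = [max(0.0, a * x + (1 - a) * y + random.gauss(0, 0.05))
                     for x, y in zip(p1, p2)]
            new.append(child)
        pop = new
    return min(pop, key=lambda w: scarcity(w, supply, demands))

random.seed(2)
supply, demands = 10.0, [6.0, 3.0, 1.0]   # toy segment: supply covers demand
best = evolve(supply, demands)
score = scarcity(best, supply, demands)
print(round(score, 3))
```

Because supply exactly covers demand here, the optimum is a proportional split with zero scarcity, and the GA should drive the objective close to it.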
DOE Office of Scientific and Technical Information (OSTI.GOV)
Broderick, Robert; Quiroz, Jimmy; Grijalva, Santiago
2014-07-15
Matlab Toolbox for simulating the impact of solar energy on the distribution grid. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving GridPV Toolbox information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions.
Prony Ringdown GUI (CERTS Prony Ringdown, part of the DSI Tool Box)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuffner, Francis; Marinovici, Laurentiu (PNNL); Hauer, John (PNNL)
2014-02-21
The PNNL Prony Ringdown graphical user interface is one analysis tool included in the Dynamic System Identification toolbox (DSI Toolbox). The DSI Toolbox is a MATLAB-based collection of tools for parsing and analyzing phasor measurement unit data, especially in regards to small signal stability, and is designed to provide a research environment for examining such data and performing small-signal-stability analysis. It includes tools to read the data, preprocess it, and perform small signal analysis. The software uses a series of text-driven menus to help guide users and organize the toolbox features. Methods for reading in phasor measurement unit data are provided, with appropriate preprocessing options for small-signal-stability analysis. The toolbox includes the Prony Ringdown GUI and basic algorithms to estimate information on oscillatory modes of the system, such as modal frequency and damping ratio.
MOEMS Modeling Using the Geometrical Matrix Toolbox
NASA Technical Reports Server (NTRS)
Wilson, William C.; Atkinson, Gary M.
2005-01-01
New technologies such as Micro-Opto-Electro-Mechanical Systems (MOEMS) require new modeling tools. These tools must simultaneously model the optical, electrical, and mechanical domains and the interactions between these domains. To facilitate rapid prototyping of these new technologies, an optical toolbox has been developed for modeling MOEMS devices. The toolbox models are constructed using MATLAB's dynamic simulation environment, Simulink. Modeling toolboxes allow users to focus their efforts on system design and analysis as opposed to developing component models. This toolbox was developed to facilitate rapid modeling and design of a MOEMS-based laser ultrasonic receiver system.
C++ Tensor Toolbox user manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plantenga, Todd D.; Kolda, Tamara Gibson
2012-04-01
The C++ Tensor Toolbox is a software package for computing tensor decompositions. It is based on the Matlab Tensor Toolbox, and is particularly optimized for sparse data sets. This user manual briefly overviews tensor decomposition mathematics, software capabilities, and installation of the package. Tensors (also known as multidimensional arrays or N-way arrays) are used in a variety of applications ranging from chemometrics to network analysis. The Tensor Toolbox provides classes for manipulating dense, sparse, and structured tensors in C++. The Toolbox compiles into libraries and is intended for use with custom applications written by users.
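To illustrate the data structure involved (not the Toolbox's actual C++ classes), a sparse tensor can be held in coordinate (COO) form, a map from index tuples to nonzero values, with a tensor-times-vector contraction as a representative kernel. Class and method names are illustrative.

```python
class SparseTensor:
    # Sparse tensor in coordinate (COO) form: a map from index tuples
    # to nonzero values -- a typical layout for sparse tensor data.
    def __init__(self, shape):
        self.shape = shape
        self.vals = {}

    def __setitem__(self, idx, v):
        self.vals[idx] = v

    def norm(self):
        # Frobenius norm: only the stored nonzeros contribute.
        return sum(v * v for v in self.vals.values()) ** 0.5

    def ttv(self, vec, mode):
        # Tensor-times-vector along one mode: contract with `vec`,
        # dropping that mode (a core kernel in CP-style decomposition
        # algorithms). Returns the result in COO form.
        out = {}
        for idx, v in self.vals.items():
            key = idx[:mode] + idx[mode + 1:]
            out[key] = out.get(key, 0.0) + v * vec[idx[mode]]
        return out

t = SparseTensor((2, 2, 2))
t[0, 0, 0] = 1.0
t[1, 1, 1] = 2.0
contracted = t.ttv([1.0, 1.0], 2)
print(t.norm(), contracted)
```

Because only nonzeros are stored and touched, both kernels run in time proportional to the number of nonzeros rather than the full tensor size, which is the point of a sparse-optimised toolbox.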
An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave.
Silva, Ikaro; Moody, George B
The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet. The toolbox provides access to over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.
Speed management toolbox for rural communities.
DOT National Transportation Integrated Search
2013-04-01
The primary objective of this toolbox is to summarize various known traffic-calming treatments and their effectiveness. This toolbox focuses on roadway-based treatments for speed management, particularly for rural communities with transition zones. E...
Convex relaxations for gas expansion planning
Borraz-Sanchez, Conrado; Bent, Russell Whitford; Backhaus, Scott N.; ...
2016-01-01
Expansion of natural gas networks is a critical process involving substantial capital expenditures with complex decision-support requirements. Here, given the non-convex nature of gas transmission constraints, global optimality and infeasibility guarantees can only be offered by global optimisation approaches. Unfortunately, state-of-the-art global optimisation solvers are unable to scale up to real-world size instances. In this study, we present a convex mixed-integer second-order cone relaxation for the gas expansion planning problem under steady-state conditions. The underlying model offers tight lower bounds with high computational efficiency. In addition, the optimal solution of the relaxation can often be used to derive high-quality solutions to the original problem, leading to provably tight optimality gaps and, in some cases, global optimal solutions. The convex relaxation is based on a few key ideas, including the introduction of flux direction variables, exact McCormick relaxations, on/off constraints, and integer cuts. Numerical experiments are conducted on the traditional Belgian gas network, as well as other, larger real networks. The results demonstrate both the accuracy and computational speed of the relaxation and its ability to produce high-quality solutions.
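The McCormick relaxation mentioned above replaces each bilinear term w = x·y with four linear inequalities that form the exact convex envelope over a bounding box; any feasible w must lie between them, which is what makes the relaxed problem convex yet valid. A small numeric check (illustrative values, not a gas-network instance):

```python
def mccormick_bounds(x, y, xl, xu, yl, yu):
    # McCormick envelope of the bilinear term w = x*y over the box
    # [xl, xu] x [yl, yu]: the four standard linear under- and
    # over-estimators, collapsed to a (lower, upper) interval at (x, y).
    lower = max(xl * y + yl * x - xl * yl,
                xu * y + yu * x - xu * yu)
    upper = min(xu * y + yl * x - xu * yl,
                xl * y + yu * x - xl * yu)
    return lower, upper

# The true product always lies inside the envelope, so replacing
# w = x*y with these linear constraints yields a valid convex
# relaxation -- the same device applied to bilinear terms in the
# gas expansion model.
lo, hi = mccormick_bounds(2.0, 3.0, 1.0, 4.0, 2.0, 5.0)
print(lo, 2.0 * 3.0, hi)  # envelope brackets the true product 6.0
```

Tighter variable bounds shrink the envelope, which is why bound-tightening and integer cuts combine well with McCormick relaxations in practice.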
NASA Astrophysics Data System (ADS)
Grimminck, Dennis L. A. G.; Vasa, Suresh K.; Meerts, W. Leo; Kentgens, P. M.
2011-06-01
A global optimisation scheme for phase-modulated proton homonuclear decoupling sequences in solid-state NMR is presented. Phase modulations, parameterised by DUMBO Fourier coefficients, were optimised using a Covariance Matrix Adaptation Evolution Strategy algorithm. Our method, denoted EASY-GOING homonuclear decoupling, starts with featureless spectra and optimises proton-proton decoupling during either proton or carbon signal detection. On the one hand, our solutions closely resemble (e)DUMBO for moderate sample spinning frequencies and medium radio-frequency (rf) field strengths. On the other hand, the EASY-GOING approach resulted in a superior solution, achieving significantly better resolved proton spectra at a very high rf field strength of 680 kHz.
NASA Astrophysics Data System (ADS)
Mallick, S.; Kar, R.; Mandal, D.; Ghoshal, S. P.
2016-07-01
This paper proposes a novel hybrid optimisation algorithm which combines the recently proposed evolutionary algorithm Backtracking Search Algorithm (BSA) with another widely accepted evolutionary algorithm, namely, Differential Evolution (DE). The proposed algorithm, called BSA-DE, is employed for the optimal designs of two commonly used analogue circuits, namely Complementary Metal Oxide Semiconductor (CMOS) differential amplifier circuit with current mirror load and CMOS two-stage operational amplifier (op-amp) circuit. BSA has a simple structure that is effective, fast and capable of solving multimodal problems. DE is a stochastic, population-based heuristic approach, having the capability to solve global optimisation problems. In this paper, the transistors' sizes are optimised using the proposed BSA-DE to minimise the areas occupied by the circuits and to improve the performances of the circuits. The simulation results justify the superiority of BSA-DE in global convergence properties and fine tuning ability, and prove it to be a promising candidate for the optimal design of the analogue CMOS amplifier circuits. The simulation results obtained for both the amplifier circuits prove the effectiveness of the proposed BSA-DE-based approach over DE, harmony search (HS), artificial bee colony (ABC) and particle swarm optimisation (PSO) in terms of convergence speed, design specifications and design parameters of the optimal design of the analogue CMOS amplifier circuits. It is shown that the BSA-DE-based design technique for each amplifier circuit yields the least MOS transistor area, and each designed circuit is shown to have the best performance parameters such as gain, power dissipation, etc., as compared with those of other recently reported literature.
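For reference, the DE ingredient of such a hybrid is the classic DE/rand/1 donor-vector rule v = x_r1 + F·(x_r2 − x_r3); the sketch below shows only this DE half, with the BSA half, the crossover step and the circuit-sizing objective omitted. Names and constants are illustrative.

```python
import random

def de_mutation(pop, i, f=0.5):
    # Classic DE/rand/1 donor vector: v = x_r1 + F * (x_r2 - x_r3),
    # built from three distinct population members other than i.
    # This is the DE ingredient that BSA-DE hybridises with BSA's
    # historical-population search (sketch of the DE half only).
    r1, r2, r3 = random.sample([j for j in range(len(pop)) if j != i], 3)
    return [a + f * (b - c) for a, b, c in zip(pop[r1], pop[r2], pop[r3])]

random.seed(3)
# Toy 2-parameter "design" population (e.g. two transistor widths).
pop = [[random.uniform(-1.0, 1.0) for _ in range(2)] for _ in range(6)]
donor = de_mutation(pop, 0)
print(donor)
```

The scale factor F controls how aggressively the difference vector perturbs the base vector; in a full DE loop the donor would then be crossed with the target vector and kept only if it improves the objective.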
Quantitative prediction of cellular metabolism with constraint-based models: the COBRA Toolbox v2.0
Schellenberger, Jan; Que, Richard; Fleming, Ronan M. T.; Thiele, Ines; Orth, Jeffrey D.; Feist, Adam M.; Zielinski, Daniel C.; Bordbar, Aarash; Lewis, Nathan E.; Rahmanian, Sorena; Kang, Joseph; Hyduke, Daniel R.; Palsson, Bernhard Ø.
2012-01-01
Over the past decade, a growing community of researchers has emerged around the use of COnstraint-Based Reconstruction and Analysis (COBRA) methods to simulate, analyze and predict a variety of metabolic phenotypes using genome-scale models. The COBRA Toolbox, a MATLAB package for implementing COBRA methods, was presented earlier. Here we present a significant update of this in silico ToolBox. Version 2.0 of the COBRA Toolbox expands the scope of computations by including in silico analysis methods developed since its original release. New functions include: (1) network gap filling, (2) 13C analysis, (3) metabolic engineering, (4) omics-guided analysis, and (5) visualization. As with the first version, the COBRA Toolbox reads and writes Systems Biology Markup Language formatted models. In version 2.0, we improved performance, usability, and the level of documentation. A suite of test scripts can now be used to learn the core functionality of the Toolbox and validate results. This Toolbox lowers the barrier of entry to use powerful COBRA methods. PMID:21886097
NASA Astrophysics Data System (ADS)
Jin, Chenxia; Li, Fachao; Tsang, Eric C. C.; Bulysheva, Larissa; Kataev, Mikhail Yu
2017-01-01
In many real industrial applications, the integration of raw data with a methodology can support economically sound decision-making. Furthermore, most of these tasks involve complex optimisation problems, so seeking better solutions is critical. As an intelligent search optimisation algorithm, genetic algorithm (GA) is an important technique for complex system optimisation, but it has internal drawbacks such as low computation efficiency and prematurity. Improving the performance of GA is a vital topic in academic and applications research. In this paper, a new real-coded crossover operator, called compound arithmetic crossover operator (CAC), is proposed. CAC is used in conjunction with a uniform mutation operator to define a new genetic algorithm CAC10-GA. This GA is compared with an existing genetic algorithm (AC10-GA) that comprises an arithmetic crossover operator and a uniform mutation operator. To judge the performance of CAC10-GA, two kinds of analysis are performed: first, the convergence of CAC10-GA is analysed using Markov chain theory; second, a pair-wise comparison is carried out between CAC10-GA and AC10-GA through two test problems available in the global optimisation literature. The overall comparative study shows that the CAC performs quite well and that CAC10-GA outperforms AC10-GA.
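The standard arithmetic crossover used by AC10-GA, which CAC builds on, produces children as convex combinations of the parents; since the abstract does not specify the compound form, the sketch below shows only the standard operator together with a uniform mutation operator. All names and rates are illustrative.

```python
import random

def arithmetic_crossover(p1, p2):
    # Standard real-coded arithmetic crossover: with a random weight
    # a in [0, 1], children are the two convex combinations of the
    # parents (the base operator that CAC compounds).
    a = random.random()
    c1 = [a * x + (1 - a) * y for x, y in zip(p1, p2)]
    c2 = [(1 - a) * x + a * y for x, y in zip(p1, p2)]
    return c1, c2

def uniform_mutation(chrom, lo, hi, rate=0.1):
    # Uniform mutation: each gene is reset to a uniform draw from its
    # bounds with probability `rate`.
    return [random.uniform(lo, hi) if random.random() < rate else g
            for g in chrom]

random.seed(0)
c1, c2 = arithmetic_crossover([0.0, 0.0], [1.0, 1.0])
mutated = uniform_mutation(c1, 0.0, 1.0)
print(c1, c2)  # each gene of c1 lies between the parent genes
```

A useful property for convergence analysis: each child gene stays inside the interval spanned by the parents, so crossover alone can never leave the feasible box, while mutation restores the ability to explore it.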
Genetic algorithm-based improved DOA estimation using fourth-order cumulants
NASA Astrophysics Data System (ADS)
Ahmed, Ammar; Tufail, Muhammad
2017-05-01
Genetic algorithm (GA)-based direction of arrival (DOA) estimation is proposed using fourth-order cumulants (FOC) and the ESPRIT principle, which results in the Multiple Invariance Cumulant ESPRIT algorithm. In the existing FOC ESPRIT formulations, only one invariance is utilised to estimate DOAs. The unused multiple invariances (MIs) must be exploited simultaneously in order to improve the estimation accuracy. In this paper, a fitness function based on a carefully designed cumulant matrix is developed which incorporates the MIs present in the sensor array. Better DOA estimation can be achieved by minimising this fitness function. Moreover, the effectiveness of Newton's method as well as GA for this optimisation problem has been illustrated. Simulation results show that the proposed algorithm provides improved estimation accuracy compared to existing algorithms, especially in the case of low SNR, fewer snapshots, closely spaced sources and high signal and noise correlation. Moreover, it is observed that the optimisation using Newton's method is more likely to converge to false local optima, resulting in erroneous results. However, GA-based optimisation has been found attractive due to its global optimisation capability.
Intermeuse: The Meuse Reconnected
NASA Astrophysics Data System (ADS)
Geilen, N.; Pedroli, B.; van Looy, K.; Krebs, L.
In the coming years decision makers are confronted with the question of how to combine aims for sustainable flood protection of river systems with floodplain rehabilitation in the best possible way. Both topics deal with spatial planning aspects and the dimensions of measures. On this basis an evaluation method was developed within the IRMA/SPONGE project INTERMEUSE and illustrated for a number of hypothetical situations in the Meuse basin. The integration of flood protection and floodplain rehabilitation can be performed on two interrelated scale levels: global, for (large parts of) a stream basin, or local, for a specific site. Both scale levels are elaborated within INTERMEUSE: a link with flood protection measures and/or strategies is made via changed abiotic conditions, resulting in indications of opportunities to link flood protection goals to ecosystem rehabilitation goals. The ecological aspects under study were spatial cohesion and habitat configuration (global level) and habitat quality (local level). Based on the results of the analyses, an integration approach was constructed that can be used in different parts of the planning cycle: toolboxes for the planning phase, the actual evaluation, and guidelines on how to use these toolboxes in practice. The results of this study clearly show that there is a good chance of combining floodplain rehabilitation aims with flood protection activities, on both a local and an international scale. In practice, in both cases close co-operation of the parties involved is an important prerequisite.
Koen, Joshua D; Barrett, Frederick S; Harlow, Iain M; Yonelinas, Andrew P
2017-08-01
Signal-detection theory, and the analysis of receiver-operating characteristics (ROCs), has played a critical role in the development of theories of episodic memory and perception. The purpose of the current paper is to present the ROC Toolbox, a set of functions written in the Matlab programming language that can be used to fit various common signal-detection models to ROC data obtained from confidence-rating experiments. The goals in developing the ROC Toolbox were to create a tool (1) that is easy to use and easy for researchers to apply to their own data, (2) that can flexibly define models based on varying study parameters, such as the number of response options (e.g., confidence ratings) and experimental conditions, and (3) that provides optimisation routines (e.g., maximum-likelihood estimation) to obtain parameter estimates and numerous goodness-of-fit measures. The ROC Toolbox allows for various confidence scales and currently includes the models commonly used in recognition memory and perception: (1) the unequal-variance signal detection (UVSD) model, (2) the dual-process signal detection (DPSD) model, and (3) the mixture signal detection (MSD) model. For each model fit to a given data set the toolbox plots summary information about the best-fitting model parameters and various goodness-of-fit measures. Here, we present an overview of the ROC Toolbox, illustrate how it can be used to input and analyse real data, and finish with a brief discussion of features that could be added to the toolbox.
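As a hedged sketch of the underlying computation (not the toolbox's Matlab API), the following shows how empirical ROC points are derived from confidence-rating counts: responses are cumulated from the most confident "old" rating downwards, yielding hit and false-alarm rates at each criterion. The function name and argument layout are our own.

```python
def roc_points(old_counts, new_counts):
    """Cumulative (false-alarm rate, hit rate) pairs from confidence-rating
    counts, ordered from most confident 'old' to most confident 'new'.

    old_counts[i] / new_counts[i]: number of old/new trials given rating i.
    """
    n_old, n_new = sum(old_counts), sum(new_counts)
    hits = fas = 0
    points = []
    for h, f in zip(old_counts, new_counts):
        hits += h   # cumulative hits at this criterion
        fas += f    # cumulative false alarms at this criterion
        points.append((fas / n_new, hits / n_old))
    return points
```

Fitting a model such as UVSD then amounts to choosing parameters whose predicted cumulative rates best match these points, e.g. by maximum likelihood over the rating counts.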
An Optimised System for Generating Multi-Resolution Dtms Using NASA Mro Datasets
NASA Astrophysics Data System (ADS)
Tao, Y.; Muller, J.-P.; Sidiropoulos, P.; Veitch-Michaelis, J.; Yershov, V.
2016-06-01
Within the EU FP-7 iMars project, a fully automated multi-resolution DTM processing chain, called Co-registration ASP-Gotcha Optimised (CASP-GO), has been developed, based on the open-source NASA Ames Stereo Pipeline (ASP). CASP-GO includes tiepoint-based multi-resolution image co-registration and an adaptive least-squares correlation-based sub-pixel refinement method called Gotcha. The implemented system guarantees global geo-referencing compliance with respect to HRSC (and thence to MOLA) and refines the completeness and accuracy of stereo matching based on the ASP normalised cross-correlation. We summarise issues discovered while experimenting with the open-source ASP DTM processing chain and introduce our working solutions. These issues include global co-registration accuracy, de-noising, dealing with matching failures, matching confidence estimation, outlier definition and rejection, various DTM artefacts, uncertainty estimation, and quality-efficiency trade-offs.
Optimisation of assembly scheduling in VCIM systems using genetic algorithm
NASA Astrophysics Data System (ADS)
Dao, Son Duy; Abhary, Kazem; Marian, Romeo
2017-09-01
Assembly plays an important role in any production system, as it constitutes a significant portion of the lead time and cost of a product. The virtual computer-integrated manufacturing (VCIM) system is a modern production system being conceptually developed to extend the application of the traditional computer-integrated manufacturing (CIM) system to a global level. Assembly scheduling in VCIM systems is quite different from that in traditional production systems because of the difference in the working principles of the two systems. In this article, the assembly scheduling problem in VCIM systems is modelled, and an integrated approach based on a genetic algorithm (GA) is proposed to search for a globally optimised solution. Because of the dynamic nature of the scheduling problem, a novel GA with a unique chromosome representation and modified genetic operations is developed herein. The robustness of the proposed approach is verified by a numerical example.
WEC Design Response Toolbox v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey
2016-03-30
The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.
Orava, Taryn; Provvidenza, Christine; Townley, Ashleigh; Kingsnorth, Shauna
2018-06-08
Though high numbers of children with cerebral palsy experience chronic pain, it remains under-recognized. This paper describes an evaluation of implementation supports and adoption of the Chronic Pain Assessment Toolbox for Children with Disabilities (the Toolbox) to enhance pain screening and assessment practices within a pediatric rehabilitation and complex continuing care hospital. A multicomponent knowledge translation strategy facilitated Toolbox adoption, inclusive of a clinical practice guideline, cerebral palsy practice points and assessment tools. Across the hospital, seven ambulatory care clinics with cerebral palsy caseloads participated in a staggered roll-out (Group 1: exclusive CP caseloads, March-December; Group 2: mixed diagnostic caseloads, August-December). Evaluation measures included client electronic medical record audit, document review and healthcare provider survey and interviews. A significant change in documentation of pain screening and assessment practice from pre-Toolbox (<2%) to post-Toolbox adoption (53%) was found. Uptake in Group 2 clinics lagged behind Group 1. Opportunities to use the Toolbox consistently (based on diagnostic caseload) and frequently (based on client appointments) were noted among contextual factors identified. Overall, the Toolbox was positively received and clinically useful. Findings affirm that the Toolbox, in conjunction with the application of integrated knowledge translation principles and an established knowledge translation framework, has potential to be a useful resource to enrich and standardize chronic pain screening and assessment practices among children with cerebral palsy. Implications for Rehabilitation It is important to engage healthcare providers in the conceptualization, development, implementation and evaluation of a knowledge-to-action best practice product. 
The Chronic Pain Toolbox for Children with Disabilities provides rehabilitation staff with guidance on pain screening and assessment best practice and offers a range of validated tools that can be incorporated in ambulatory clinic settings to meet varied client needs. Considering unique clinical contexts (i.e., opportunities for use, provider engagement, staffing absences/turnover) is required to optimize and sustain chronic pain screening and assessment practices in rehabilitation outpatient settings.
PFA toolbox: a MATLAB tool for Metabolic Flux Analysis.
Morales, Yeimy; Bosque, Gabriel; Vehí, Josep; Picó, Jesús; Llaneras, Francisco
2016-07-11
Metabolic Flux Analysis (MFA) is a methodology that has been successfully applied to estimate metabolic fluxes in living cells. However, traditional frameworks based on this approach have some limitations, particularly when measurements are scarce and imprecise, as is very common in industrial environments. The PFA Toolbox can be used to address such scenarios. Here we present the PFA (Possibilistic Flux Analysis) Toolbox for MATLAB, which simplifies the use of Interval and Possibilistic Metabolic Flux Analysis. The main features of the PFA Toolbox are the following: (a) it provides reliable MFA estimations in scenarios where only a few fluxes can be measured or those available are imprecise; (b) it provides tools to easily plot the results as interval estimates or flux distributions; (c) it is composed of simple functions that MATLAB users can apply in flexible ways; (d) it includes a graphical user interface (GUI), which provides a visual representation of the measurements and their uncertainty; and (e) it can use stoichiometric models in COBRA format. In addition, the PFA Toolbox includes a User's Guide with a thorough description of its functions and several examples. The PFA Toolbox for MATLAB is a freely available toolbox that is able to perform Interval and Possibilistic MFA estimations.
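To illustrate the interval reasoning behind Interval MFA (this is a toy sketch of the idea, not the toolbox's MATLAB interface; the node and flux values are invented), consider a single metabolite node at steady state, v_in = v_out1 + v_out2. If v_in and v_out1 are only known as intervals, the unmeasured flux v_out2 is bounded by interval subtraction:

```python
def add_iv(a, b):
    """Interval addition [a] + [b]."""
    return (a[0] + b[0], a[1] + b[1])

def sub_iv(a, b):
    """Interval subtraction [a] - [b]: worst-case bounds."""
    return (a[0] - b[1], a[1] - b[0])

# Steady-state balance at one node: v_in = v_out1 + v_out2.
# Interval measurements of v_in and v_out1 bound the unmeasured v_out2.
v_in = (9.0, 11.0)      # hypothetical measured uptake flux
v_out1 = (3.5, 4.5)     # hypothetical measured excretion flux
v_out2 = sub_iv(v_in, v_out1)
```

A full interval MFA solves this kind of bounding problem simultaneously over the whole stoichiometric network (typically via linear programming); the possibilistic variant additionally grades how plausible each flux value is given the measurement uncertainty.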
A novel Laser Ion Mobility Spectrometer
NASA Astrophysics Data System (ADS)
Göbel, J.; Kessler, M.; Langmeier, A.
2009-05-01
IMS is a well-known technology in the range of security-based applications. Its main advantages lie in its simplicity of measurement, along with a fast and sensitive detection method. Contemporary technology often fails due to interfering substances, in conjunction with saturation effects and a low dynamic detection range. High-throughput facilities, such as airports, require the analysis of many samples at low detection limits within a very short timeframe, and high detection reliability is a requirement for safe and secure operation. In the present work we developed a laser-based ion-mobility sensor which shows several advantages over known IMS sensor technology. The goal of our research was to increase the sensitivity beyond the range of 63Ni-based instruments. This was achieved with an optimised geometric drift-tube design and a pulsed UV laser system at an efficient intensity. In this intensity range multi-photon ionisation is possible, which leads to higher selectivity in the ion-formation process itself. After high-speed capture of detection samples, a custom-designed pattern-recognition software toolbox provides reliable auto-detection capability with a learning algorithm and a graphical user interface.
Pointing System Simulation Toolbox with Application to a Balloon Mission Simulator
NASA Technical Reports Server (NTRS)
Maringolo Baldraco, Rosana M.; Aretskin-Hariton, Eliot D.; Swank, Aaron J.
2017-01-01
The development of attitude estimation and pointing-control algorithms is necessary in order to achieve high-fidelity modeling for a Balloon Mission Simulator (BMS). A pointing system simulation toolbox was developed to enable this. The toolbox consists of a star-tracker (ST) and Inertial Measurement Unit (IMU) signal generator, a UDP (User Datagram Protocol) communication file (bridge), and an indirect-multiplicative extended Kalman filter (imEKF). This document describes the Python toolbox developed and the results of its implementation in the imEKF.
A Module for Graphical Display of Model Results with the CBP Toolbox
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, F.
2015-04-21
This report describes work performed by the Savannah River National Laboratory (SRNL) in fiscal year 2014 to add enhanced graphical capabilities to display model results in the Cementitious Barriers Project (CBP) Toolbox. Because Version 2.0 of the CBP Toolbox has just been released, the graphing enhancements described in this report have not yet been integrated into a new version of the Toolbox. Instead they have been tested using a standalone GoldSim model and, while they are substantially complete, may undergo further refinement before full implementation. Nevertheless, this report is issued to document the FY14 development efforts, which will provide a basis for further development of the CBP Toolbox.
40 CFR 141.716 - Source toolbox components.
Code of Federal Regulations, 2012 CFR
2012-07-01
... for Microbial Toolbox Components § 141.716 Source toolbox components. (a) Watershed control program. Systems receive 0.5-log Cryptosporidium treatment credit for implementing a watershed control program that meets the requirements of this section. (1) Systems that intend to apply for the watershed control...
Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) User's Guide
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei
2014-01-01
The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) software package is an open-source MATLAB/Simulink toolbox (plug-in) that can be used by industry professionals and academics for the development of thermodynamic and controls simulations.
CFS MATLAB toolbox: An experiment builder for continuous flash suppression (CFS) task.
Nuutinen, Mikko; Mustonen, Terhi; Häkkinen, Jukka
2017-09-15
CFS toolbox is an open-source collection of MATLAB functions that utilizes PsychToolbox-3 (PTB-3). It is designed to allow a researcher to create and run continuous flash suppression experiments using a variety of experimental parameters (i.e., stimulus types and locations, noise characteristics, and experiment window settings). In a CFS experiment, one of the eyes at a time is presented with a dynamically changing noise pattern, while the other eye is concurrently presented with a static target stimulus, such as a Gabor patch. Due to the strong interocular suppression created by the dominant noise pattern mask, the target stimulus is rendered invisible for an extended duration. Very little knowledge of MATLAB is required for using the toolbox; experiments are generated by modifying csv files with the required parameters, and result data are output to text files for further analysis. The open-source code is available on the project page under a Creative Commons License ( http://www.mikkonuutinen.arkku.net/CFS_toolbox/ and https://bitbucket.org/mikkonuutinen/cfs_toolbox ).
Dynamic least-cost optimisation of wastewater system remedial works requirements.
Vojinovic, Z; Solomatine, D; Price, R K
2006-01-01
In recent years, there has been increasing concern about wastewater system failure and the identification of an optimal set of remedial works. Several methodologies have been developed and applied in asset management activities by water companies worldwide, but often with limited success. Several research projects have explored algorithms for optimising remedial works, but mostly for drinking water supply systems; very limited work has been carried out for wastewater assets. The major deficiencies of commonly used methods lie in one or more of the following aspects: inadequate representation of system complexity, the incorporation of a dynamic model into the decision-making loop, the choice of an appropriate optimisation technique, and experience in applying that technique. This paper addresses these issues and discusses a new approach for the optimisation of wastewater system remedial works. It is proposed that the search for optimal solutions is performed by a global optimisation tool (with various random search algorithms), while system performance is simulated by a hydrodynamic pipe network model. The work of assembling all the required elements and developing appropriate interface protocols between the two tools, which decode potential remedial solutions into the pipe network model and calculate the corresponding scenario costs, is currently underway.
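The coupling described above can be sketched as a simple random-search loop (one of the algorithm families the abstract mentions). This is an illustrative skeleton under our own naming, not the paper's implementation: in the real setting `cost` would decode a candidate remedial plan into the hydrodynamic pipe network model and return the scenario cost.

```python
import random

def random_search(cost, sample, n_iter=1000, seed=0):
    """Pure random search: repeatedly draw candidate remedial plans and
    keep the cheapest one seen.

    cost:   callable evaluating a candidate (here a stand-in; in the
            paper's setting, a run of the hydrodynamic network model).
    sample: callable drawing a random candidate from the search space.
    """
    rng = random.Random(seed)
    best_x, best_c = None, float("inf")
    for _ in range(n_iter):
        x = sample(rng)   # candidate plan
        c = cost(x)       # scenario cost
        if c < best_c:
            best_x, best_c = x, c
    return best_x, best_c

# Toy usage: minimise a sum-of-squares "cost" over a 3-dimensional box.
plan, plan_cost = random_search(
    cost=lambda v: sum(t * t for t in v),
    sample=lambda rng: [rng.uniform(-5.0, 5.0) for _ in range(3)])
```

More sophisticated random-search variants (e.g. adaptive or evolutionary schemes) fit the same interface, which is what makes decoupling the optimiser from the simulation model attractive.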
RESPONSE PROTOCOL TOOLBOX OVERVIEW, STATUS UPDATE, AND RELATIONSHIP TO OTHER WATER SECURITY PRODUCTS
The Response Protocol Toolbox was released by USEPA to address the complex, multi-faceted challenges of a water utility's planning and response to the threat or act of intentional contamination of drinking water (1). The Toolbox contains guidance that may be adopted voluntarily,...
The Brain's Versatile Toolbox.
ERIC Educational Resources Information Center
Pinker, Steven
1997-01-01
Considers the role of evolution and natural selection in the functioning of the modern human brain. Natural selection equipped humans with a mental toolbox of intuitive theories about the world which were used to master rocks, tools, plants, animals, and one another. The same toolbox is used today to master the intellectual challenges of modern…
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
2006-01-01
The Compressible Flow Toolbox is primarily a MATLAB-language implementation of a set of algorithms that solve approximately 280 linear and nonlinear classical equations for compressible flow. The toolbox is useful for analysis of one-dimensional steady flow with either constant entropy, friction, heat transfer, or Mach number greater than 1. The toolbox also contains algorithms for comparing and validating the equation-solving algorithms against solutions previously published in the open literature. The classical equations solved by the Compressible Flow Toolbox are the isentropic-flow equations; the Fanno flow equations (flow of an ideal gas in a pipe with friction); the Rayleigh flow equations (frictionless flow of an ideal gas, with heat transfer, in a pipe of constant cross section); the normal-shock equations; the oblique-shock equations; and the expansion equations.
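As a hedged sketch of the first equation family mentioned (the isentropic-flow relations, which are standard and independent of the toolbox's own MATLAB interface), the stagnation-to-static temperature and pressure ratios follow directly from the Mach number:

```python
def isentropic_ratios(mach, gamma=1.4):
    """Isentropic-flow relations for a calorically perfect gas:
    stagnation-to-static temperature ratio T0/T and pressure ratio p0/p
    as functions of Mach number (gamma = ratio of specific heats)."""
    t_ratio = 1.0 + 0.5 * (gamma - 1.0) * mach ** 2   # T0/T
    p_ratio = t_ratio ** (gamma / (gamma - 1.0))      # p0/p
    return t_ratio, p_ratio
```

For air (gamma = 1.4) at Mach 1 this gives T0/T = 1.2 and p0/p of roughly 1.893, the familiar sonic-condition values.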
A toolbox for safety instrumented system evaluation based on improved continuous-time Markov chain
NASA Astrophysics Data System (ADS)
Wardana, Awang N. I.; Kurniady, Rahman; Pambudi, Galih; Purnama, Jaka; Suryopratomo, Kutut
2017-08-01
A safety instrumented system (SIS) is designed to restore a plant to a safe condition when a pre-hazardous event occurs. It has a vital role, especially in the process industries. A SIS shall meet its safety requirement specifications, and to confirm this, the SIS shall be evaluated. Typically, the evaluation is calculated by hand. This paper presents a toolbox for SIS evaluation, developed on the basis of an improved continuous-time Markov chain, which supports a detailed evaluation approach. The paper also illustrates an industrial application of the toolbox to evaluate the arch burner safety system of a primary reformer. The results of the case study demonstrate that the toolbox can be used to evaluate an industrial SIS in detail and to plan the maintenance strategy.
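As a hedged illustration of the kind of continuous-time Markov chain calculation involved (a textbook two-state example, not the paper's improved model), consider a repairable component with failure rate λ and repair rate μ. The Kolmogorov forward equations for this chain have a closed-form transient solution for the probability of being in the failed state:

```python
import math

def unavailability(lmbda, mu, t):
    """Transient unavailability of a repairable component modelled as a
    two-state CTMC (working <-> failed) with failure rate lmbda and
    repair rate mu, starting in the working state:

        U(t) = lmbda/(lmbda+mu) * (1 - exp(-(lmbda+mu) t))

    which is the closed-form solution of the Kolmogorov forward
    equations for this chain."""
    s = lmbda + mu
    return (lmbda / s) * (1.0 - math.exp(-s * t))
```

U(t) rises from 0 towards the steady-state value λ/(λ+μ); quantities such as the average probability of failure on demand are obtained by averaging U(t) over the proof-test interval. Multi-state SIS models generalise this by solving the same forward equations numerically (e.g. via the matrix exponential).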
Dixon, Philippe C; Loh, Jonathan J; Michaud-Paquette, Yannick; Pearsall, David J
2017-03-01
It is common for biomechanics data sets to contain numerous dependent variables recorded over time, for many subjects, groups, and/or conditions. These data often require standard sorting, processing, and analysis operations to be performed in order to answer research questions. Visualization of these data is also crucial. This manuscript presents biomechZoo, an open-source toolbox that provides tools and graphical user interfaces to help users achieve these goals. The aims of this manuscript are to (1) introduce the main features of the toolbox, including a virtual three-dimensional environment to animate motion data (Director), a data plotting suite (Ensembler), and functions for the computation of three-dimensional lower-limb joint angles, moments, and power and (2) compare these computations to those of an existing validated system. To these ends, the steps required to process and analyze a sample data set via the toolbox are outlined. The data set comprises three-dimensional marker, ground reaction force (GRF), joint kinematic, and joint kinetic data of subjects performing straight walking and 90° turning manoeuvres. Joint kinematics and kinetics processed within the toolbox were found to be similar to outputs from a commercial system. The biomechZoo toolbox represents the work of several years and multiple contributors to provide a flexible platform to examine time-series data sets typical in the movement sciences. The toolbox has previously been used to process and analyse walking, running, and ice hockey data sets, and can integrate existing routines, such as the KineMat toolbox, for additional analyses. The toolbox can help researchers and clinicians new to programming or biomechanics to process and analyze their data through a customizable workflow, while advanced users are encouraged to contribute additional functionality to the project. Students may benefit from using biomechZoo as a learning and research tool. 
It is hoped that the toolbox can play a role in advancing research in the movement sciences. The biomechZoo m-files, sample data, and help repositories are available online (http://www.biomechzoo.com) under the Apache 2.0 License. The toolbox is supported for Matlab (r2014b or newer, The Mathworks Inc., Natick, USA) for Windows (Microsoft Corp., Redmond, USA) and Mac OS (Apple Inc., Cupertino, USA). Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Software Toolbox for Low-Frequency Conductivity and Current Density Imaging Using MRI.
Sajib, Saurav Z K; Katoch, Nitish; Kim, Hyung Joong; Kwon, Oh In; Woo, Eung Je
2017-11-01
Low-frequency conductivity and current density imaging using MRI includes magnetic resonance electrical impedance tomography (MREIT), diffusion tensor MREIT (DT-MREIT), conductivity tensor imaging (CTI), and magnetic resonance current density imaging (MRCDI). MRCDI and MREIT provide current density and isotropic conductivity images, respectively, using current-injection phase MRI techniques. DT-MREIT produces anisotropic conductivity tensor images by incorporating diffusion-weighted MRI into MREIT. These current-injection techniques are finding clinical applications in diagnostic imaging and also in transcranial direct current stimulation (tDCS), deep brain stimulation (DBS), and electroporation, where treatment currents can function as imaging currents. To avoid adverse effects of nerve and muscle stimulation due to injected currents, CTI utilizes B1 mapping and multi-b diffusion-weighted MRI to produce low-frequency anisotropic conductivity tensor images without injecting current. This paper describes numerical implementations of several key mathematical functions for conductivity and current density image reconstructions in MRCDI, MREIT, DT-MREIT, and CTI. To facilitate experimental studies of clinical applications, we developed a software toolbox for these low-frequency conductivity and current density imaging methods. This MR-based conductivity imaging (MRCI) toolbox includes 11 toolbox functions which can be used in the MATLAB environment. The MRCI toolbox is available at http://iirc.khu.ac.kr/software.html . Its functions were tested by using several experimental datasets, which are provided together with the toolbox. Users of the toolbox can focus on experimental designs and interpretations of reconstructed images instead of developing their own image reconstruction software. We expect more toolbox functions to be added from future research outcomes.
A New Computational Technique for the Generation of Optimised Aircraft Trajectories
NASA Astrophysics Data System (ADS)
Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto
2017-12-01
A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. The technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations, including the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented, allowing an efficient solution of problems in which two or more performance indices are to be minimised simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two simultaneous objectives. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
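The ɛ-constraint idea at the heart of the method can be sketched on a toy problem (this is a generic illustration of the scalarisation, not the paper's adaptive bisection variant or its optimal-control discretisation): one objective is minimised while the other is turned into a constraint f2(x) ≤ ɛ, and sweeping ɛ traces out the Pareto front.

```python
def eps_constraint(f1, f2, candidates, eps):
    """epsilon-constraint scalarisation: minimise the primary objective f1
    over candidates whose secondary objective satisfies f2(x) <= eps.
    Returns None when no candidate is feasible."""
    feasible = [x for x in candidates if f2(x) <= eps]
    return min(feasible, key=f1) if feasible else None

# Toy biobjective problem: f1(x) = x^2 vs f2(x) = (x - 2)^2 on a grid.
grid = [i / 10 for i in range(21)]          # x in [0, 2]
front = [eps_constraint(lambda x: x * x,
                        lambda x: (x - 2) ** 2,
                        grid, eps)
         for eps in (4.0, 1.0, 0.25)]       # tightening eps walks the front
```

Tightening ɛ forces the solution towards the second objective's optimum; the bisection refinement in the paper adaptively chooses where along the ɛ range to place these cuts so that the front is resolved efficiently.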
NASA Astrophysics Data System (ADS)
Luo, Xiaoguang; Mayer, Michael; Heck, Bernhard
2010-05-01
One essential deficiency of the stochastic model used in many GNSS (Global Navigation Satellite Systems) software products is that it neglects the temporal correlation of GNSS observations. By analysing appropriately detrended time series of observation residuals from GPS (Global Positioning System) data processing, the temporal correlation behaviour of GPS observations can be adequately described by means of so-called autoregressive moving average (ARMA) processes. Using the toolbox ARMASA, which is available free of charge on MATLAB® Central (the open exchange platform for the MATLAB® and SIMULINK® user community), a well-fitting time series model can be identified automatically in three steps. First, AR, MA, and ARMA models are computed up to some user-specified maximum order. Subsequently, for each model type, the best-fitting model is selected using the combined (for AR processes) or generalised (for MA and ARMA processes) information criterion. The final model identification among the best-fitting AR, MA, and ARMA models is performed on the basis of the minimum prediction error, which characterises the discrepancies between the given data and the fitted model. The ARMA coefficients are computed using Burg's maximum entropy algorithm (for AR processes) and Durbin's first (for MA processes) and second (for ARMA processes) methods. This paper verifies the performance of the automated ARMA identification using the toolbox ARMASA. For this purpose, a representative database is generated by means of ARMA simulation with respect to sample size, correlation level, and model complexity. The model error, defined as a transform of the prediction error, is used as a measure of the deviation between the true and the estimated model. The results of the study show that the recognition rates of the underlying true processes increase with increasing sample size and decrease with rising model complexity.
Considering large sample sizes, the true underlying processes can be correctly recognised for nearly 80% of the analysed data sets. Additionally, the model errors of first-order AR and MA processes converge to their asymptotic values clearly more rapidly than those of high-order ARMA processes.
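The order-selection logic described above can be illustrated on the simplest possible case (this toy sketch is ours, not ARMASA's Burg/Durbin machinery): simulate an AR(1) process, fit it by least squares, and let a Gaussian AIC on the one-step prediction errors decide between a white-noise model and the AR(1) model.

```python
import math
import random

def ar1_coef(x):
    """Lag-1 least-squares estimate of the AR(1) coefficient."""
    num = sum(a * b for a, b in zip(x[1:], x[:-1]))
    den = sum(b * b for b in x[:-1])
    return num / den

def gaussian_aic(rss, n_eff, k):
    """AIC for a Gaussian model with k parameters, computed from the
    residual sum of squares over n_eff one-step predictions."""
    return n_eff * math.log(rss / n_eff) + 2 * k

# Simulate an AR(1) process x_t = 0.8 x_{t-1} + e_t, e_t ~ N(0, 1).
rng = random.Random(42)
x, prev = [], 0.0
for _ in range(2000):
    prev = 0.8 * prev + rng.gauss(0.0, 1.0)
    x.append(prev)

phi = ar1_coef(x)
res1 = [a - phi * b for a, b in zip(x[1:], x[:-1])]   # AR(1) residuals
rss1 = sum(e * e for e in res1)
rss0 = sum(a * a for a in x[1:])                      # AR(0): predict zero
n_eff = len(x) - 1
best_order = 0 if gaussian_aic(rss0, n_eff, 0) < gaussian_aic(rss1, n_eff, 1) else 1
```

With 2000 samples the criterion reliably picks the AR(1) model; as the abstract notes, recognition becomes harder at small sample sizes and for higher-order mixed ARMA processes, which is exactly what the paper quantifies.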
EPA has developed a "Response Protocol Toolbox" to address the complex, multi-faceted challenges of planning and response to intentional contamination of drinking water (http://www.epa.gov/safewater/security/ertools.html#toolbox). The toolbox is designed to be applied by a numbe...
ERIC Educational Resources Information Center
Backman, Desiree; Scruggs, Valarie; Atiedu, Akpene Ama; Bowie, Shene; Bye, Larry; Dennis, Angela; Hall, Melanie; Ossa, Alexandra; Wertlieb, Stacy; Foerster, Susan B.
2011-01-01
Objective: Evaluate the effectiveness of the "Fruit, Vegetable, and Physical Activity Toolbox for Community Educators" ("Toolbox"), an intervention originally designed for Spanish- and English-speaking audiences, in changing knowledge, attitudes, and behavior among low-income African American women. Design: Quasi-experimental…
DOT National Transportation Integrated Search
2012-02-01
The report provides a suite of recommended strategies to reduce single-occupant vehicle traffic in the urban areas of Phoenix and Tucson, Arizona, which are presented as a travel demand management toolbox. The toolbox includes supporting research...
Healy, Sinead; McMahon, Jill; Owens, Peter; Dockery, Peter; FitzGerald, Una
2018-02-01
Image segmentation is often imperfect, particularly in complex image sets such as z-stack micrographs of slice cultures, and sufficient detail about the parameters used in quantitative image analysis is needed to allow independent repeatability and appraisal. For the first time, we have critically evaluated, quantified and validated the performance of different segmentation methodologies using z-stack images of ex vivo glial cells. The BioVoxxel toolbox plugin, available in FIJI, was used to measure the relative quality, accuracy, specificity and sensitivity of 16 global and 9 local automatic thresholding algorithms. Automatic thresholding yields improved binary representations of glial cells compared with the conventional user-chosen single-threshold approach for confocal z-stacks acquired from ex vivo slice cultures. The performance of threshold algorithms varies considerably in quality, specificity, accuracy and sensitivity, with entropy-based thresholds scoring highest for fluorescent staining. We have used the BioVoxxel toolbox to correctly and consistently select the best automated threshold algorithm to segment z-projected images of ex vivo glial cells for downstream digital image analysis and to define segmentation quality. The automated OLIG2 cell count was validated using stereology. As image segmentation and feature extraction can quite critically affect the performance of successive steps in the image analysis workflow, it is becoming increasingly necessary to consider the quality of digital segmentation methodologies. Here, we have applied, validated and extended an existing performance-check methodology in the BioVoxxel toolbox to z-projected images of ex vivo glial cells.
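To illustrate what a single global automatic thresholding algorithm of the kind benchmarked above computes, here is a minimal Python sketch of Otsu's method (the toy pixel data are invented; the BioVoxxel plugin itself runs inside FIJI/ImageJ and is not reproduced here):

```python
def otsu_threshold(pixels, levels=256):
    """Return the grey level t that maximises the between-class variance.

    Pixels <= t are classed as background, pixels > t as foreground.
    """
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))

    best_t, best_var = 0, -1.0
    w0 = 0          # background pixel count so far
    sum0 = 0.0      # background intensity sum so far
    for t in range(levels - 1):
        w0 += hist[t]
        sum0 += t * hist[t]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0 = sum0 / w0
        mu1 = (total_sum - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# bimodal "image": dim background around 20, bright cells around 200
img = [18, 20, 20, 22, 25, 198, 200, 202, 205]
print(otsu_threshold(img))   # lands between the two intensity modes
```

A performance check of the kind the BioVoxxel toolbox performs would run many such algorithms over the same stack and score each binary result against a reference.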
Strain, William David; Paldánius, Päivi M
2017-08-01
The last decade has witnessed the role of dipeptidyl peptidase-4 (DPP-4) inhibitors in producing a conceptual change in early management of type 2 diabetes mellitus (T2DM) by shifting emphasis from a gluco-centric approach to holistically treating underlying pathophysiological processes. DPP-4 inhibitors highlighted the importance of acknowledging hypoglycaemia and weight gain as barriers to optimised care in T2DM. These complications were an integral part of diabetes management before the introduction of DPP-4 inhibitors. During the development of DPP-4 inhibitors, regulatory requirements for introducing new agents underwent substantial changes, with increased emphasis on safety. This led to the systematic collection of adjudicated cardiovascular (CV) safety data, and, where 95% confidence of a lack of harm could not be demonstrated, the standardised CV safety studies. Furthermore, the growing awareness of the worldwide extent of T2DM demanded a more diverse approach to recruitment and participation in clinical trials. Finally, the global financial crisis placed a new awareness on the health economics of diabetes, which rapidly became the most expensive disease in the world. This review encompasses unique developments in the global landscape, and the role DPP-4 inhibitors, specifically vildagliptin, have played in research advancement and optimisation of diabetes care in a diverse population with T2DM worldwide.
JWST Wavefront Control Toolbox
NASA Technical Reports Server (NTRS)
Shin, Shahram Ron; Aronstein, David L.
2011-01-01
A Matlab-based toolbox has been developed for the wavefront control and optimization of segmented optical surfaces, using influence functions to correct for possible misalignments of the James Webb Space Telescope (JWST). The toolbox employs both iterative and non-iterative methods to converge to an optimal solution by minimizing a cost function, and it can be used for both constrained and unconstrained optimization. The control process involves 1 to 7 degrees of freedom of perturbation per primary mirror segment, in addition to the 5 degrees of freedom of the secondary mirror. The toolbox consists of a series of Matlab/Simulink functions and modules, developed using a "wrapper" approach, that handle the interface and data flow between existing commercial optical modeling software packages such as Zemax and Code V. The limitations of the algorithm are dictated by the constraints of the moving parts in the mirrors.
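The core operation described above, finding segment commands that minimize a cost function built from influence functions, can be illustrated with a toy Python sketch (the 2x2 influence matrix and the plain gradient-descent solver below are illustrative assumptions, not the toolbox's algorithm or data):

```python
def wavefront_residual(A, x, w):
    """Residual r = A @ x - w for influence matrix A, commands x, wavefront error w."""
    return [sum(A[i][j] * x[j] for j in range(len(x))) - w[i]
            for i in range(len(w))]

def solve_commands(A, w, step=0.05, iters=2000):
    """Iteratively minimize the quadratic cost ||A x - w||^2 by gradient descent."""
    x = [0.0] * len(A[0])
    for _ in range(iters):
        r = wavefront_residual(A, x, w)
        # gradient of the cost with respect to x is 2 * A^T r
        for j in range(len(x)):
            g = 2.0 * sum(A[i][j] * r[i] for i in range(len(r)))
            x[j] -= step * g
    return x

# hypothetical influence matrix: wavefront response per unit actuator command
A = [[1.0, 0.2],
     [0.0, 1.5]]
w = [1.2, 3.0]        # measured wavefront error to cancel
x = solve_commands(A, w)
print(x)              # converges to the commands [0.8, 2.0] that reproduce w
```

A non-iterative method would instead solve the same normal equations in one step (e.g. via a pseudo-inverse of the influence matrix), which is the distinction the abstract draws.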
A CRISPR-Based Toolbox for Studying T Cell Signal Transduction
Chi, Shen; Weiss, Arthur; Wang, Haopeng
2016-01-01
The CRISPR/Cas9 system is a powerful technology for performing genome editing in a variety of cell types. To facilitate the application of Cas9 to mapping T cell signaling pathways, we generated a toolbox for large-scale genetic screens in human Jurkat T cells. The toolbox comprises three different Jurkat cell lines expressing distinct Cas9 variants: wild-type Cas9, dCas9-KRAB, and sunCas9. We demonstrated that the toolbox allows us to rapidly disrupt endogenous gene expression at the DNA level and to efficiently repress or activate gene expression at the transcriptional level. The toolbox, in combination with multiple currently existing genome-wide sgRNA libraries, will be useful for systematically investigating T cell signal transduction using both loss-of-function and gain-of-function genetic screens. PMID:27057542
NASA Astrophysics Data System (ADS)
Shoemaker, C. A.; Pang, M.; Akhtar, T.; Bindel, D.
2016-12-01
New parallel surrogate global optimization algorithms are developed and applied to objective functions that are expensive simulations (possibly with multiple local minima). The algorithms can be applied to most geophysical simulations, including those with nonlinear partial differential equations, and the optimization does not require that the simulations themselves be parallelized. Asynchronous (and synchronous) parallel execution is available in the optimization toolbox "pySOT". The parallel algorithms are modified from their serial counterparts to eliminate fine-grained parallelism. The optimization is computed with the open-source software pySOT, a Surrogate Global Optimization Toolbox that allows the user to pick the type of surrogate (or ensembles), the search procedure on the surrogate, and the type of parallelism (synchronous or asynchronous). pySOT also allows the user to develop new algorithms by modifying parts of the code. In the applications here, the objective function takes up to 30 minutes for one simulation, and serial optimization can take over 200 hours. Results from the Yellowstone (NSF) and NCSS (Singapore) supercomputers are given for groundwater contaminant hydrology simulations, with applications to model parameter estimation and decontamination management. All results are compared with alternatives. The first results are for optimization of pumping at many wells to reduce the cost of decontaminating groundwater at a superfund site. The optimization runs with up to 128 processors. Superlinear speed-up is obtained for up to 16 processors, and efficiency with 64 processors is over 80%. Each evaluation of the objective function requires the solution of nonlinear partial differential equations to describe the impact of spatially distributed pumping and model parameters on model predictions for the spatial and temporal distribution of groundwater contaminants. The second application uses asynchronous parallel global optimization for groundwater quality model calibration.
The time for a single objective function evaluation varies unpredictably, so efficiency is improved with asynchronous parallel calculations to improve load balancing. The third application (done at NCSS) incorporates new global surrogate multi-objective parallel search algorithms into pySOT and applies it to a large watershed calibration problem.
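The asynchronous scheduling idea described above — submitting a new evaluation the moment any worker finishes, rather than waiting for a whole batch — can be sketched with the Python standard library. This illustrates only the dispatch pattern, not the pySOT API; the uniform random candidate sampling below is a stand-in for pySOT's surrogate-guided search, and the objective is a trivial placeholder for an expensive simulation:

```python
import concurrent.futures as cf
import random

def expensive_objective(x):
    # stand-in for a simulation that may run for minutes, with variable runtime
    return (x - 3.0) ** 2

def async_random_search(n_workers=4, budget=200, seed=7):
    """Asynchronous dispatch: as soon as any evaluation finishes, record it and
    submit the next candidate, so no worker idles waiting for a batch to close."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    submitted = 0
    with cf.ThreadPoolExecutor(max_workers=n_workers) as pool:
        pending = set()

        def submit_one():
            nonlocal submitted
            x = rng.uniform(-10.0, 10.0)   # surrogate search step would go here
            pending.add(pool.submit(lambda v=x: (v, expensive_objective(v))))
            submitted += 1

        for _ in range(n_workers):          # fill the worker pool
            submit_one()
        while pending:
            done, pending = cf.wait(pending, return_when=cf.FIRST_COMPLETED)
            for fut in done:
                x, f = fut.result()
                if f < best_f:
                    best_x, best_f = x, f
                if submitted < budget:      # immediately backfill the free worker
                    submit_one()
    return best_x, best_f

best_x, best_f = async_random_search()
print(best_x, best_f)   # best_x should be near the minimiser at 3.0
```

With highly variable simulation runtimes, this backfilling is what improves load balancing relative to synchronous batches, as the abstract notes for the calibration application.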
NASA Astrophysics Data System (ADS)
Vanhuyse, Johan; Deckers, Elke; Jonckheere, Stijn; Pluymers, Bert; Desmet, Wim
2016-02-01
The Biot theory is commonly used for the simulation of the vibro-acoustic behaviour of poroelastic materials. However, it relies on a number of material parameters which can be hard to characterise and require dedicated measurement setups, yielding a time-consuming and costly characterisation. This paper presents a characterisation method that is able to identify all material parameters using only an impedance tube. The method relies on the assumptions that the sample is clamped within the tube, that the shear wave is excited and that the acoustic field is no longer one-dimensional. This paper numerically shows the potential of the developed method by performing a sensitivity analysis of the quantification parameters, i.e. reflection coefficients and relative pressures, and a parameter estimation using global optimisation methods. A 3-step procedure is developed and validated. It is shown that even in the presence of numerically simulated noise this procedure leads to a robust parameter estimation.
Integrating hidden Markov model and PRAAT: a toolbox for robust automatic speech transcription
NASA Astrophysics Data System (ADS)
Kabir, A.; Barker, J.; Giurgiu, M.
2010-09-01
An automatic time-aligned phone transcription toolbox for English speech corpora has been developed. The toolbox is particularly useful for generating robust automatic transcriptions and can produce phone-level transcriptions using speaker-independent as well as speaker-dependent models without manual intervention. The system is based on the standard Hidden Markov Model (HMM) approach and was successfully tested on a large audiovisual speech corpus, the GRID corpus. One of the most powerful features of the toolbox is its flexibility in speech processing: the speech community can import the automatic transcription generated by the HMM Toolkit (HTK) into a popular transcription software package, PRAAT, and vice versa. The toolbox has been evaluated through statistical analysis on GRID data, which shows that the automatic transcription deviates by an average of 20 ms with respect to manual transcription.
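The HTK-to-PRAAT hand-over described above is, at its core, a format conversion. Below is a minimal Python sketch of one direction, turning HTK label lines (times in 100 ns units) into the text of a single-tier PRAAT TextGrid (the label data and tier name are illustrative; this is not the toolbox's code):

```python
def htk_lab_to_textgrid(lab_lines, tier_name="phones"):
    """Convert HTK label lines ("start end label", times in 100 ns units)
    into the text of a one-tier PRAAT TextGrid (IntervalTier)."""
    intervals = []
    for line in lab_lines:
        start, end, label = line.split()[:3]
        intervals.append((int(start) / 1e7, int(end) / 1e7, label))
    xmax = intervals[-1][1]
    out = [
        'File type = "ooTextFile"',
        'Object class = "TextGrid"',
        '',
        'xmin = 0',
        'xmax = %g' % xmax,
        'tiers? <exists>',
        'size = 1',
        'item []:',
        '    item [1]:',
        '        class = "IntervalTier"',
        '        name = "%s"' % tier_name,
        '        xmin = 0',
        '        xmax = %g' % xmax,
        '        intervals: size = %d' % len(intervals),
    ]
    for i, (t0, t1, label) in enumerate(intervals, 1):
        out += [
            '        intervals [%d]:' % i,
            '            xmin = %g' % t0,
            '            xmax = %g' % t1,
            '            text = "%s"' % label,
        ]
    return "\n".join(out)

# hypothetical HTK-style phone alignment (times in 100 ns units)
lab = ["0 2000000 sil", "2000000 3500000 b", "3500000 5200000 ih",
       "5200000 7000000 n"]
print(htk_lab_to_textgrid(lab))
```

The reverse direction (TextGrid to HTK label file) is a similarly mechanical parse, which is why round-tripping between HTK and PRAAT can be automated.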
NASA Astrophysics Data System (ADS)
Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan
2016-02-01
In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in ecological and economic terms. Ensuring stock sustainability requires the provision of crucial information, such as the species' spatial distribution or unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provides a priori unbiased estimates of the spatial structure, global abundance and precision for autocorrelated data. However, its application to non-Gaussian data introduces difficulties into the analysis and can compromise robustness and unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust, and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.
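The first step of such a geostatistical analysis is estimating the spatial structure via an empirical variogram. Here is a minimal Python sketch of the classical (Matheron) estimator on an invented toy transect, not the anchovy survey data:

```python
import math

def empirical_variogram(points, values, lag_width=1.0, n_lags=5):
    """Classical (Matheron) semivariance estimator:
    gamma(h) = 1/(2 N(h)) * sum over pairs at distance ~h of (z_i - z_j)^2.
    Bin k collects pair distances in [k*lag_width, (k+1)*lag_width)."""
    sums = [0.0] * n_lags
    counts = [0] * n_lags
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            k = int(d / lag_width)
            if k < n_lags:
                sums[k] += (values[i] - values[j]) ** 2
                counts[k] += 1
    return [s / (2 * c) if c else None for s, c in zip(sums, counts)]

# toy transect: value increases linearly along x, so semivariance grows with lag
pts = [(float(i), 0.0) for i in range(10)]
z = [float(i) for i in range(10)]
gamma = empirical_variogram(pts, z, lag_width=1.0, n_lags=4)
print(gamma)   # → [None, 0.5, 2.0, 4.5]
```

Fitting a variogram model to these binned semivariances is what then feeds the kriging maps and global precision estimates discussed above.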
The BRAT and GUT Couple: Broadview Radar Altimetry and GOCE User Toolboxes
NASA Astrophysics Data System (ADS)
Benveniste, J.; Restano, M.; Ambrózio, A.
2017-12-01
The Broadview Radar Altimetry Toolbox (BRAT) is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including Sentinel-3A L1 and L2 products. A tutorial providing plenty of use cases is included. BRAT's next release (4.2.0) is planned for October 2017. Based on community feedback, the front-end has been further improved and simplified, whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain the desired data while bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving combinations of data fields, which can be saved for future use, either by using embedded formulas, including those from oceanographic altimetry, or by implementing ad hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. The GOCE User Toolbox (GUT) is a compilation of tools for the use and analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data and allows gravity field data, in conjunction and consistently with any other auxiliary data set, to be pre-processed by beginners in gravity field processing, for oceanographic and hydrologic as well as solid earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during the GRACE and GOCE missions. In the current 3.1 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements facilitating the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies have been introduced. Also packaged with GUT is its Variance-Covariance Matrix (VCM) tool.
BRAT and GUT toolboxes can be freely downloaded, along with ancillary material, at https://earth.esa.int/brat and https://earth.esa.int/gut.
NASA Astrophysics Data System (ADS)
van Haveren, Rens; Ogryczak, Włodzimierz; Verduijn, Gerda M.; Keijzer, Marleen; Heijmen, Ben J. M.; Breedveld, Sebastiaan
2017-06-01
Previously, we have proposed Erasmus-iCycle, an algorithm for fully automated IMRT plan generation based on prioritised (lexicographic) multi-objective optimisation with the 2-phase ɛ-constraint (2pɛc) method. For each patient, the output of Erasmus-iCycle is a clinically favourable, Pareto optimal plan. The 2pɛc method uses a list of objective functions that are consecutively optimised, following a strict, user-defined prioritisation. The novel lexicographic reference point method (LRPM) is capable of solving multi-objective problems in a single optimisation, using a fuzzy prioritisation of the objectives. Trade-offs are made globally, aiming for large favourable gains for lower prioritised objectives at the cost of only slight degradations for higher prioritised objectives, or vice versa. In this study, the LRPM is validated for 15 head and neck cancer patients receiving bilateral neck irradiation. The generated plans using the LRPM are compared with the plans resulting from the 2pɛc method. Both methods were capable of automatically generating clinically relevant treatment plans for all patients. For some patients, the LRPM allowed large favourable gains in some treatment plan objectives at the cost of only small degradations for the others. Moreover, because of the applied single optimisation instead of multiple optimisations, the LRPM reduced the average computation time from 209.2 to 9.5 min, a speed-up factor of 22 relative to the 2pɛc method.
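The prioritised optimisation scheme described above can be illustrated on a toy discrete problem: optimise the highest-priority objective first, then each subsequent objective subject to the earlier ones staying within a small slip of their best achievable values. The Python sketch below is a simplified stand-in only — Erasmus-iCycle solves continuous IMRT optimisation problems, and the plans, objectives and slip value here are invented:

```python
def lexicographic_eps_constraint(plans, objectives, slip=0.02):
    """Prioritised optimisation over a finite set of candidate plans:
    minimise objective 1, then minimise objective 2 subject to objective 1
    staying within a small relative slip of its best value, and so on."""
    feasible = list(plans)
    for obj in objectives:
        best = min(obj(p) for p in feasible)
        # epsilon-constraint: keep only plans close to the best achievable value
        feasible = [p for p in feasible
                    if obj(p) <= best + slip * abs(best) + 1e-12]
    return feasible[0]

# toy "plans": (target dose error, organ-at-risk dose), both to be minimised,
# with target dose error having the higher priority
plans = [(0.10, 30.0), (0.10, 22.0), (0.11, 15.0), (0.25, 5.0)]
chosen = lexicographic_eps_constraint(
    plans, objectives=[lambda p: p[0], lambda p: p[1]], slip=0.15)
print(chosen)   # → (0.11, 15.0)
```

Note how the slip allows a slight degradation of the higher-priority objective (0.10 to 0.11) in exchange for a large gain in the lower-priority one (22.0 to 15.0) — the kind of global trade-off the LRPM aims for in a single optimisation.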
Automation of route identification and optimisation based on data-mining and chemical intuition.
Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G
2017-09-21
Data-mining of Reaxys and network analysis of the combined literature and in-house reactions set were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of data provides a rich knowledge-base for generation of the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of the continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and generation of environmental and other performance indicators, such as cost indicators. However, the identified further challenge is to automate model generation to evolve optimal multi-step chemical routes and optimal process configurations.
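Route generation over a mined reaction network is, at its core, path enumeration in a directed graph. A minimal Python sketch follows; the network fragment below is invented for illustration (the intermediate names are plausible terpene chemistry, but they do not reproduce the routes reported in the paper):

```python
from collections import deque

def find_routes(reactions, start, target, max_len=5):
    """Enumerate reaction routes from a starting material to a target by
    breadth-first search over a substance graph (edges = known reactions)."""
    routes = []
    queue = deque([[start]])
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            routes.append(path)
            continue
        if len(path) > max_len:
            continue
        for product in reactions.get(path[-1], []):
            if product not in path:          # avoid cycles
                queue.append(path + [product])
    return routes

# hypothetical fragment of a literature-mined network (names illustrative)
reactions = {
    "limonene": ["carvone", "limonene oxide"],
    "carvone": ["carvacrol"],
    "limonene oxide": ["carvacrol"],
    "carvacrol": ["4-acetamidophenol"],   # stand-in for the remaining steps
}
print(find_routes(reactions, "limonene", "4-acetamidophenol"))
```

Ranking the enumerated routes by yield, cost or step count is then the point where the chemical intuition and the flowsheet simulation described above come in.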
Building Interdisciplinary Research Models Through Interactive Education.
Hessels, Amanda J; Robinson, Brian; O'Rourke, Michael; Begg, Melissa D; Larson, Elaine L
2015-12-01
Critical interdisciplinary research skills include effective communication with diverse disciplines and cultivating collaborative relationships. Acquiring these skills during graduate education may foster future interdisciplinary research quality and productivity. The project aim was to develop and evaluate an interactive Toolbox workshop approach within an interprofessional graduate-level course to enhance student learning and skill in interdisciplinary research. We sought to examine the student experience of integrating the Toolbox workshop in modular format over the duration of a 14-week course. The Toolbox Health Sciences Instrument includes six modules, which were introduced in a 110-minute dialogue session during the first class and then integrated into the course as a series of six individual workshops in three phases over the semester. Seventeen students participated; the majority were nursing students. Three measures were used to assess project outcomes: a pre-post intervention Toolbox survey, a competency self-assessment, and a postcourse survey. All measures indicated that the objectives were met, as shown by a change in survey responses, improved competencies, and a favorable experience of the Toolbox modular intervention. Our experience indicates that incorporating this Toolbox modular approach into research curricula can enhance individual-level scientific capacity and future interdisciplinary research project success, and ultimately have an impact on practice and policy.
Hebart, Martin N.; Görgen, Kai; Haynes, John-Dylan
2015-01-01
The multivariate analysis of brain signals has recently sparked a great amount of interest, yet accessible and versatile tools to carry out decoding analyses are scarce. Here we introduce The Decoding Toolbox (TDT) which represents a user-friendly, powerful and flexible package for multivariate analysis of functional brain imaging data. TDT is written in Matlab and equipped with an interface to the widely used brain data analysis package SPM. The toolbox allows running fast whole-brain analyses, region-of-interest analyses and searchlight analyses, using machine learning classifiers, pattern correlation analysis, or representational similarity analysis. It offers automatic creation and visualization of diverse cross-validation schemes, feature scaling, nested parameter selection, a variety of feature selection methods, multiclass capabilities, and pattern reconstruction from classifier weights. While basic users can implement a generic analysis in one line of code, advanced users can extend the toolbox to their needs or exploit the structure to combine it with external high-performance classification toolboxes. The toolbox comes with an example data set which can be used to try out the various analysis methods. Taken together, TDT offers a promising option for researchers who want to employ multivariate analyses of brain activity patterns. PMID:25610393
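The basic decoding loop that TDT automates — train a classifier on most activity patterns, test on held-out ones, repeat over folds — can be sketched in Python with a simple nearest-centroid classifier (the two-voxel patterns and condition labels are invented toy data; TDT itself is a Matlab package with far more machinery, including searchlights and feature selection):

```python
def nearest_centroid_predict(train, labels, test_pattern):
    """Classify a test activity pattern by the label of the nearest
    class centroid (Euclidean distance)."""
    centroids = {}
    for lab in set(labels):
        rows = [p for p, l in zip(train, labels) if l == lab]
        centroids[lab] = [sum(col) / len(rows) for col in zip(*rows)]

    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))

    return min(centroids, key=lambda lab: dist2(centroids[lab], test_pattern))

def leave_one_out_accuracy(patterns, labels):
    """Cross-validated decoding accuracy: each pattern is predicted by a
    classifier trained on all remaining patterns."""
    hits = 0
    for i in range(len(patterns)):
        train = patterns[:i] + patterns[i + 1:]
        tr_labels = labels[:i] + labels[i + 1:]
        hits += nearest_centroid_predict(train, tr_labels, patterns[i]) == labels[i]
    return hits / len(patterns)

# toy "voxel patterns" for two experimental conditions (hypothetical data)
patterns = [[1.0, 0.1], [0.9, 0.2], [1.1, 0.0],
            [0.1, 1.0], [0.2, 0.9], [0.0, 1.1]]
labels = ["faces", "faces", "faces", "houses", "houses", "houses"]
print(leave_one_out_accuracy(patterns, labels))   # → 1.0
```

In real fMRI decoding the folds follow scanner runs rather than single patterns, and the classifier is typically a linear SVM or correlation-based scheme, but the cross-validation structure is the same.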
The seasonal behaviour of carbon fluxes in the Amazon: fusion of FLUXNET data and the ORCHIDEE model
NASA Astrophysics Data System (ADS)
Verbeeck, H.; Peylin, P.; Bacour, C.; Ciais, P.
2009-04-01
Eddy covariance measurements at the Santarém (km 67) site revealed an unexpected seasonal pattern in carbon fluxes which could not be simulated by existing state-of-the-art global ecosystem models (Saleska et al., Science 2003). An unexpectedly high carbon uptake was measured during the dry season; in contrast, carbon release was observed in the wet season. There are several possible (and possibly combined) underlying mechanisms for this phenomenon: (1) increased soil respiration due to higher soil moisture in the wet season, and (2) increased photosynthesis during the dry season due to deep rooting, hydraulic lift, increased radiation and/or a leaf flush. The objective of this study is to optimise the ORCHIDEE model using eddy covariance data in order to mimic the seasonal response of carbon fluxes to dry/wet conditions in tropical forest ecosystems, and thereby to identify the underlying mechanisms of this seasonal response. The ORCHIDEE model is a state-of-the-art mechanistic global vegetation model that can be run at local or global scale. It calculates the carbon and water cycles in the different soil and vegetation pools and resolves the diurnal cycle of fluxes. ORCHIDEE is built on the concept of plant functional types (PFTs) to describe vegetation. To bring the different carbon pool sizes to realistic values, spin-up runs are used. ORCHIDEE uses climate variables as drivers, together with a number of ecosystem parameters that have been assessed from laboratory and in situ experiments. These parameters are still associated with large uncertainty and may vary between and within PFTs in a way that is currently not informed or captured by the model. Recently developed assimilation techniques allow the objective use of eddy covariance data to improve our knowledge of these parameters in a statistically coherent approach. We use a Bayesian optimisation approach.
This approach is based on the minimisation of a cost function containing the mismatch between simulated model output and observations as well as the mismatch between a priori and optimised parameters. The parameters can be optimised on different time scales (annually, monthly, daily). For this study the model is optimised at local scale for five eddy flux sites: four sites in Brazil and one in French Guiana. The seasonal behaviour of carbon fluxes in response to wet and dry conditions differs among these sites. Key processes that are optimised include the effect of soil water on heterotrophic soil respiration, the effect of soil water availability on stomatal conductance and photosynthesis, and phenology. By optimising several key parameters we could significantly improve the simulation of the seasonal pattern of NEE. Nevertheless, posterior parameters should be interpreted with care, because the resulting parameter values might compensate for uncertainties in the model structure or in other parameters. Moreover, several critical issues appeared during this study, e.g. how should latent and sensible heat data be assimilated when the energy balance is not closed in the data? Optimisation of the Q10 parameter showed that at some sites respiration was not sensitive at all to temperature, which shows only small variations in this region. Considering this, one could question the reliability of the partitioned fluxes (GPP/Reco) at these sites. This study also tests whether there is coherence between optimised parameter values of different sites within the tropical forest PFT and whether the forward model response to climate variations is similar between sites.
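The cost function referred to above is not written out in the abstract; in the usual Bayesian (variational) formulation it takes the form

```latex
J(\mathbf{p}) = \tfrac{1}{2}\,(\mathbf{p}-\mathbf{p}_b)^{\mathrm{T}}\,\mathbf{B}^{-1}\,(\mathbf{p}-\mathbf{p}_b)
              + \tfrac{1}{2}\,\bigl(H(\mathbf{p})-\mathbf{y}\bigr)^{\mathrm{T}}\,\mathbf{R}^{-1}\,\bigl(H(\mathbf{p})-\mathbf{y}\bigr)
```

where p_b is the a priori parameter vector, B its error covariance, H(p) the model-simulated fluxes for parameters p, y the eddy covariance observations and R the observation error covariance. The first term is the parameter-mismatch penalty and the second the data-mismatch penalty mentioned in the abstract.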
Global health diplomacy, 'smart power', and the new world order.
Kevany, Sebastian
2014-01-01
Both the theory and practice of foreign policy and diplomacy, including systems of hard and soft power, are undergoing paradigm shifts, with an increasing number of innovative actors and strategies contributing to international relations outcomes in the 'New World Order'. Concurrently, global health programmes continue to ascend the political spectrum in scale, scope and influence. This concatenation of circumstances has demanded a re-examination of the existing and potential effectiveness of global health programmes in the 'smart power' context, based on adherence to a range of design, implementation and assessment criteria, which may simultaneously optimise their humanitarian, foreign policy and diplomatic effectiveness. A synthesis of contemporary characteristics of 'global health diplomacy' and 'global health as foreign policy', grouped by common themes and generated in the context of related field experiences, are presented in the form of 'Top Ten' criteria lists for optimising both diplomatic and foreign policy effectiveness of global health programmes, and criteria are presented in concert with an examination of implications for programme design and delivery. Key criteria for global health programmes that are sensitised to both diplomatic and foreign policy goals include visibility, sustainability, geostrategic considerations, accountability, effectiveness and alignment with broader policy objectives. Though diplomacy is a component of foreign policy, criteria for 'diplomatically-sensitised' versus 'foreign policy-sensitised' global health programmes were not always consistent, and were occasionally in conflict, with each other. 
The desirability of making diplomatic and foreign policy criteria explicit, rather than implicit, in the context of global health programme design, delivery and evaluation are reflected in the identified implications for (1) international security, (2) programme evaluation, (3) funding and resource allocation decisions, (4) approval systems and (5) training. On this basis, global health programmes are shown to provide a valuable, yet underutilised, tool for diplomacy and foreign policy purposes, including their role in the pursuit of benign international influence. A corresponding alignment of resources between 'hard' and 'smart' power options is encouraged.
NASA Astrophysics Data System (ADS)
Nickless, A.; Rayner, P. J.; Erni, B.; Scholes, R. J.
2018-05-01
The design of an optimal network of atmospheric monitoring stations for the observation of carbon dioxide (CO2) concentrations can be obtained by applying an optimisation algorithm to a cost function based on minimising the posterior uncertainty in the CO2 fluxes obtained from a Bayesian inverse modelling solution. Two candidate optimisation methods were assessed: an evolutionary algorithm, the genetic algorithm (GA), and a deterministic algorithm, the incremental optimisation (IO) routine. This paper assessed the ability of the IO routine, in comparison with the more computationally demanding GA routine, to optimise the placement of a five-member network of CO2 monitoring sites located in South Africa. The comparison considered the reduction in uncertainty of the overall flux estimate, the spatial similarity of solutions, and computational requirements. Although the IO routine failed to find the solution with the global maximum uncertainty reduction, the resulting solution had only fractionally lower uncertainty reduction than the GA solution, at only a quarter of the computational resources used by the smallest specified GA configuration. The GA solution set showed more inconsistency if the number of iterations or the population size was small, and more so for a complex prior flux covariance matrix. When the GA completed with a sub-optimal solution, these solutions were similar in fitness to the best available solution. Two additional scenarios were considered, with the objective of creating circumstances under which the GA might outperform the IO. The first scenario considered an established network, where the optimisation was required to add five stations to an existing five-member network. In the second scenario the optimisation was based only on the uncertainty reduction within a subregion of the domain. The GA was able to find a better solution than the IO under both scenarios, but with only a marginal improvement in the uncertainty reduction.
These results suggest that, for the network design problem, resources would be better spent on improving the prior estimates of the flux uncertainties than on running a complex evolutionary optimisation algorithm. The authors recommend that, if time and computational resources allow, multiple optimisation techniques be used as part of a comprehensive suite of sensitivity tests when performing such an optimisation exercise. This will provide a selection of best solutions which can be ranked based on their utility and practicality.
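The IO routine discussed above is a greedy scheme: at each step it adds the station giving the largest marginal reduction in posterior uncertainty. The structure of such an incremental design can be sketched in Python, with the real uncertainty-reduction criterion replaced by an invented set-coverage stand-in (both the candidate stations and their "covered cells" are hypothetical):

```python
def incremental_design(candidates, coverage, k):
    """Incremental (greedy) network design: repeatedly add the station with
    the largest marginal gain. Here the gain is the number of newly covered
    flux grid cells, a simple submodular stand-in for the posterior
    uncertainty reduction used in the paper."""
    network, covered = [], set()
    for _ in range(k):
        best = max((c for c in candidates if c not in network),
                   key=lambda c: len(coverage[c] - covered))
        network.append(best)
        covered |= coverage[best]
    return network, len(covered)

# hypothetical candidate stations and the grid cells each one constrains
coverage = {
    "A": {1, 2, 3, 4},
    "B": {3, 4, 5},
    "C": {6, 7},
    "D": {1, 2},
}
network, n_covered = incremental_design(list(coverage), coverage, k=2)
print(network, n_covered)   # → ['A', 'C'] 6
```

The example also shows why IO can miss the global optimum: each choice is locally best and is never revisited, whereas a GA explores whole networks at once at a much higher computational cost.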
National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox
Price, Curtis
2010-01-01
This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells.
SOCIB Glider toolbox: from sensor to data repository
NASA Astrophysics Data System (ADS)
Pau Beltran, Joan; Heslop, Emma; Ruiz, Simón; Troupin, Charles; Tintoré, Joaquín
2015-04-01
In present-day oceanography, gliders constitute a mature, cost-effective technology for acquiring measurements independently of the sea state (unlike ships), providing subsurface data over sustained periods, including during extreme weather events. The SOCIB glider toolbox is a set of MATLAB/Octave scripts and functions developed to manage the data collected by a glider fleet. They cover the main stages of the data management process, in both real-time and delayed-time modes: metadata aggregation, downloading, processing, and automatic generation of data products and figures. The toolbox is distributed under the GNU General Public License (http://www.gnu.org/copyleft/gpl.html) and is available at http://www.socib.es/users/glider/glider_toolbox.
NASA Astrophysics Data System (ADS)
Folley, Christopher; Bronowicki, Allen
2005-09-01
Prediction of optical performance for large, deployable telescopes under environmental conditions and mechanical disturbances is a crucial part of the design-verification process for such instruments in all phases of design and operation: ground testing, commissioning, and on-orbit operation. A structural-thermal-optical-performance (STOP) analysis methodology is often created that integrates the output of one analysis with the input of another. The integration of thermal-environment predictions with structural models is relatively well understood, while the integration of structural-deformation results into optical analysis/design software is less straightforward. A Matlab toolbox has been created that effectively integrates predictions of mechanical deformations of optical elements, generated by, for example, finite element analysis, and computes optical path differences for the distorted prescription. The engine of the toolbox is a real ray-tracing algorithm that allows the optical surfaces to be defined in a single, global coordinate system, thereby allowing automatic alignment of the mechanical coordinate system with the optical coordinate system; the physical location of the optical surfaces is therefore identical in the optical prescription and the finite element model. The application of rigid-body displacements to optical surfaces is, however, more general than STOP analysis alone, supporting, for example, the analysis of misalignments during the commissioning process. Furthermore, all the functionality of Matlab is available for optimization and control. Since this is a new tool for use on flight programs, it has been verified against CODE V. The toolbox's functionality to date is described, verification results are presented, and, as an example of its utility, results of a thermal-distortion analysis using the James Webb Space Telescope (JWST) prescription are presented.
Tian, Xing; Poeppel, David; Huber, David E
2011-01-01
The open-source toolbox TopoToolbox is a suite of functions that use sensor topography to calculate psychologically meaningful measures (similarity, magnitude, and timing) from multisensor event-related EEG and MEG data. Using a GUI and data visualization, TopoToolbox can calculate and test the topographic similarity between different conditions (Tian and Huber, 2008). This topographic similarity indicates whether different conditions involve a different distribution of underlying neural sources. The similarity calculation can also be applied at different time points to discover when a response pattern emerges (Tian and Poeppel, 2010). Because the topographic patterns are obtained separately for each individual, they yield reliable measures of response magnitude that can be compared across individuals using conventional statistics (Davelaar et al., submitted; Huber et al., 2008). TopoToolbox can be freely downloaded. It runs under MATLAB (The MathWorks, Inc.) and supports user-defined data structures as well as standard EEG/MEG data import using EEGLAB (Delorme and Makeig, 2004).
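The topographic-similarity measure described above amounts to comparing the pattern of values across sensors between two conditions. A minimal sketch using the Pearson correlation of two topographies follows; it is an illustrative stand-in, not TopoToolbox's exact implementation, and the sensor values are invented:

```python
import numpy as np

def topo_similarity(pattern_a, pattern_b):
    """Pearson correlation between two sensor topographies (one value per
    sensor). Illustrative sketch of a topographic-similarity measure."""
    a = np.asarray(pattern_a, float)
    b = np.asarray(pattern_b, float)
    a = a - a.mean()                       # remove each pattern's mean level
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Two conditions whose topographies differ only in overall scale
# produce maximal similarity: the source distribution is the same.
same = topo_similarity([1, 2, 3, 4], [2, 4, 6, 8])  # 1.0
```

A high similarity across conditions suggests the same underlying source configuration at different strengths; a low similarity points to a different distribution of sources.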
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
2006-01-01
This report provides a user guide for the Compressible Flow Toolbox, a collection of algorithms that solve almost 300 linear and nonlinear classical compressible-flow relations. The algorithms, implemented in the popular MATLAB programming language, are useful for the analysis of one-dimensional steady flow with constant entropy, friction, heat transfer, or shock discontinuities. The solutions do not include any gas-dissociation effects. The toolbox also contains functions for comparing and validating the equation-solving algorithms against solutions previously published in the open literature. The classical equations solved by the Compressible Flow Toolbox are: the isentropic-flow equations, the Fanno flow equations (pertaining to flow of an ideal gas in a pipe with friction), the Rayleigh flow equations (pertaining to frictionless flow of an ideal gas, with heat transfer, in a pipe of constant cross section), the normal-shock equations, the oblique-shock equations, and the Prandtl-Meyer expansion equations. At the time this report was published, the Compressible Flow Toolbox was available without cost from the NASA Software Repository.
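For concreteness, two of the isentropic-flow relations such a toolbox solves can be written down directly. The sketch below implements the textbook stagnation-to-static temperature and pressure ratios for a calorically perfect gas; these are the classical relations, not the toolbox's MATLAB code:

```python
def isentropic_T0_over_T(mach, gamma=1.4):
    """Stagnation-to-static temperature ratio for isentropic flow of a
    calorically perfect gas: T0/T = 1 + (gamma - 1)/2 * M^2."""
    return 1.0 + 0.5 * (gamma - 1.0) * mach**2

def isentropic_p0_over_p(mach, gamma=1.4):
    """Stagnation-to-static pressure ratio: (T0/T)**(gamma/(gamma-1))."""
    return isentropic_T0_over_T(mach, gamma) ** (gamma / (gamma - 1.0))

print(isentropic_T0_over_T(1.0))  # 1.2 for air (gamma = 1.4) at Mach 1
```

The "nonlinear" relations in the toolbox are typically the inverse problems, e.g. recovering the Mach number from a given pressure ratio, which require an iterative root-finder rather than a closed form.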
MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG
Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong
2017-01-01
The reference electrode standardization technique (REST) has been increasingly acknowledged and applied in the electroencephalography/event-related potentials (EEG/ERP) community in recent years as a re-referencing technique that transforms actual multi-channel recordings into approximately zero-reference ones. However, an easy-to-use toolbox for re-referencing scalp EEG data to the zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data; the other is a batch version that is more convenient and efficient for experienced users. Both are designed to provide ease of use for novice researchers and flexibility for experienced ones. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, where detailed information, including publications, comments, and documentation on REST, can also be found. An example of usage is given, with comparative results for REST and the average reference. We hope these user-friendly toolboxes will make the relatively novel REST technique easier to study, especially for applications in various EEG studies. PMID:29163006
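For contrast with REST, the average reference mentioned in the abstract is simple enough to sketch directly: every channel has the instantaneous mean across channels subtracted from it. The function below is an illustrative Python sketch with invented toy data; REST itself additionally requires a head model and lead-field matrix, which are not shown here:

```python
import numpy as np

def average_rereference(eeg):
    """Re-reference multi-channel EEG to the common average: subtract the
    instantaneous mean across channels from every channel. Shown for
    contrast with REST; this is NOT the REST algorithm itself."""
    eeg = np.asarray(eeg, float)                 # shape (channels, samples)
    return eeg - eeg.mean(axis=0, keepdims=True)

data = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # 3 channels, 2 samples
rereferenced = average_rereference(data)
# After re-referencing, each time point sums to zero across channels.
```

The average reference only approximates a neutral reference when the electrode montage densely covers the head; REST's head-model-based mapping is designed to relax that assumption.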
GOCE User Toolbox and Tutorial
NASA Astrophysics Data System (ADS)
Benveniste, Jérôme; Knudsen, Per
2016-07-01
The GOCE User Toolbox (GUT) is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in geodesy, oceanography, and solid-Earth physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Macs. The toolbox is supported by the GUT Algorithm Description and User Guide and the GUT Install Guide, and a set of a priori data and models is made available as well. The development of the GOCE User Toolbox has undoubtedly played a major role in paving the way to the successful use of GOCE data for oceanography. GUT version 2.2 was released in April 2014 and, besides some bug fixes, added the capability to compute the simple Bouguer anomaly (solid Earth). A new version, GUT version 3, has since been released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming to implement the remaining functionalities and facilitate a wider span of research in geodesy, oceanography, and solid-Earth studies. Accordingly, GUT version 3 provides: an attractive and easy-to-use graphical user interface (GUI) for the toolbox; further software functionalities, such as support for gradients, anisotropic diffusive filtering, and computation of Bouguer and isostatic gravity anomalies; and an associated GUT VCM tool for analysing the GOCE variance-covariance matrices.
A versatile software package for inter-subject correlation based analyses of fMRI.
Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi
2014-01-01
In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the subjects' fMRI time series at corresponding brain locations. ISC can therefore be used to analyze fMRI data without explicitly modeling the stimulus, making it a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC approach for analyzing complex fMRI data, no generic software tools had been made available for this purpose, limiting the widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, the ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. The toolbox supports many advanced computations, such as comparison of ISCs between different stimuli, time-window ISC, and inter-subject phase synchronization. The analyses are coupled with resampling-based statistical inference. ISC-based analyses are data- and computation-intensive, and the ISC Toolbox is equipped with mechanisms to execute parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based environments (Oracle Grid Engine, Son of Grid Engine, or Open Grid Scheduler) and Slurm are supported. We present a detailed account of the methods behind the ISC Toolbox and its implementation, and demonstrate its possible use by summarizing selected example applications. We also report computation-time experiments using a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
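The core ISC computation described above, the mean pairwise correlation of subjects' time series at one brain location, can be sketched in a few lines. This is an illustrative sketch of the basic measure with invented toy data, not the ISC Toolbox's code, which adds permutation inference, time-window ISC, and phase synchronization:

```python
import numpy as np

def inter_subject_correlation(timeseries):
    """Mean pairwise Pearson correlation across subjects for one location.
    `timeseries` has shape (n_subjects, n_timepoints)."""
    ts = np.asarray(timeseries, float)
    n = ts.shape[0]
    r = np.corrcoef(ts)                    # subject-by-subject correlations
    upper = r[np.triu_indices(n, k=1)]     # unique pairs, diagonal excluded
    return float(upper.mean())

# Three subjects with identical responses give an ISC of 1.
sync = np.array([[1., 2., 3., 2.],
                 [1., 2., 3., 2.],
                 [1., 2., 3., 2.]])
isc = inter_subject_correlation(sync)
```

In a whole-brain analysis this computation is repeated per voxel, which is why the toolbox's cluster parallelization matters.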
Egea, Jose A; Henriques, David; Cokelaer, Thomas; Villaverde, Alejandro F; MacNamara, Aidan; Danciu, Diana-Patricia; Banga, Julio R; Saez-Rodriguez, Julio
2014-05-10
Optimization is key to solving many problems in computational biology. Global optimization methods provide a robust methodology, and metaheuristics in particular have proven to be the most efficient for many applications. Despite their utility, the availability of metaheuristic tools is limited. We present MEIGO, an R and Matlab optimization toolbox (also available in Python via a wrapper of the R version) that implements metaheuristics capable of solving diverse problems arising in systems biology and bioinformatics. The toolbox includes the enhanced scatter search method (eSS) for continuous nonlinear programming (cNLP) and mixed-integer nonlinear programming (MINLP) problems, and variable neighborhood search (VNS) for integer programming (IP) problems. Additionally, the R version includes BayesFit for parameter estimation by Bayesian inference. The eSS and VNS methods can be run on a single thread or in parallel using a cooperative strategy. The code is supplied under GPLv3 and is available at http://www.iim.csic.es/~gingproc/meigo.html, with documentation and examples included. The R package has been submitted to Bioconductor. We evaluate MEIGO against optimization benchmarks and illustrate its applicability to a series of case studies in bioinformatics and systems biology, where it outperforms other state-of-the-art methods. MEIGO provides a free, open-source platform for optimization that can be applied to multiple domains of systems biology and bioinformatics. It includes efficient state-of-the-art metaheuristics, and its open, modular structure allows the addition of further methods.
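As a hedged illustration of the variable neighborhood search (VNS) metaheuristic the toolbox includes, a generic VNS loop on a toy integer problem might look like the sketch below. This is not MEIGO's implementation; the `shake` neighborhoods and the sum-of-squares objective are invented for the example:

```python
import random

def vns_minimize(objective, x0, neighborhoods, iters=200, seed=0):
    """Basic variable neighborhood search: shake the incumbent within
    successively larger neighborhoods, restarting from the smallest
    neighborhood whenever an improvement is found."""
    rng = random.Random(seed)
    best, best_f = list(x0), objective(x0)
    for _ in range(iters):
        k = 0
        while k < len(neighborhoods):
            cand = neighborhoods[k](best, rng)
            f = objective(cand)
            if f < best_f:
                best, best_f = cand, f
                k = 0                      # improvement: back to smallest move
            else:
                k += 1                     # no luck: widen the neighborhood
    return best, best_f

def shake(step):
    """Neighborhood: perturb one randomly chosen coordinate by +/- step."""
    def move(x, rng):
        i = rng.randrange(len(x))
        y = list(x)
        y[i] += rng.choice([-step, step])
        return y
    return move

# Toy integer problem: minimise the sum of squares over integer vectors.
sol, val = vns_minimize(lambda x: sum(v * v for v in x), [7, -4, 3],
                        [shake(1), shake(2), shake(4)])
```

The escalation through neighborhoods is what distinguishes VNS from plain local search: when small moves stall, larger moves give the search a chance to escape the basin.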
Air Sensor Toolbox provides information to citizen scientists, researchers and developers interested in learning more about new lower-cost compact air sensor technologies and tools for measuring air quality.
2016-01-01
User's Guide: Demonstration of a Fractured Rock Geophysical Toolbox (FRGT) for Characterization and Monitoring of DNAPL Biodegradation in Fractured Rock Aquifers
F.D. Day-Lewis, C.D. Johnson, J.H. Williams, C.L...
Keywords: DNAPL biodegradation characterization and monitoring, remediation, fractured rock aquifers.
Explaining Society: An Expanded Toolbox for Social Scientists
Bell, David C.; Atkinson-Schnell, Jodie L.; DiBacco, Aron E.
2012-01-01
We propose for social scientists a theoretical toolbox containing a set of motivations that neurobiologists have recently validated. We show how these motivations can be used to create a theory of society recognizably similar to existing stable societies (sustainable, self-reproducing, and largely peaceful). Using this toolbox, we describe society in terms of three institutions: economy (a source of sustainability), government (peace), and the family (reproducibility). Conducting a thought experiment in three parts, we begin with a simple theory with only two motivations. We then create successive theories that systematically add motivations, showing that each element in the toolbox makes its own contribution to explain the workings of a stable society and that the family has a critical role in this process. PMID:23082093
Complete scanpaths analysis toolbox.
Augustyniak, Piotr; Mikrut, Zbigniew
2006-01-01
This paper presents a complete open software environment for the control, data processing, and assessment of visual experiments. Visual experiments are widely used in research on human perception physiology, and the results are applicable to various visual information-based man-machine interfaces, human-emulated automatic visual systems, and scanpath-based learning of perceptual habits. The toolbox is designed for the Matlab platform and supports an infrared reflection-based eye tracker in calibration and scanpath-analysis modes. Toolbox procedures are organized in three layers: the lower one communicates with the eye tracker's output file, the middle one detects scanpath events on a physiological background, and the upper one consists of experiment schedule scripts, statistics, and summaries. Several examples of visual experiments carried out using the presented toolbox conclude the paper.
Toolbox for Renewable Energy Project Development
The Toolbox for Renewable Energy Project Development summarizes key project development issues, addresses how to overcome major hurdles, and provides a curated directory of project development resources.
Global Corporate Priorities and Demand-Led Learning Strategies
ERIC Educational Resources Information Center
Dealtry, Richard
2008-01-01
Purpose: The purpose of this article is to start the process of exploring how to optimise connections between the strategic needs of an organisation as directed by top management and its learning management structures and strategies. Design/methodology/approach: The article takes a broad brush approach to a complex and large subject area that is…
Optimising Meritocratic Advantage with the International Baccalaureate Diploma in Australian Schools
ERIC Educational Resources Information Center
Doherty, Catherine
2012-01-01
This paper explores two of the tensions Tarc identifies in the history of the International Baccalaureate (IB) Diploma: firstly, between its design for meritocratic competition and its internationalist vision and, secondly, between the IB as a global commodity and its localised interpretations. Using data from three case studies of Australian…
Acrylamide mitigation strategies: critical appraisal of the FoodDrinkEurope toolbox.
Palermo, M; Gökmen, V; De Meulenaer, B; Ciesarová, Z; Zhang, Y; Pedreschi, F; Fogliano, V
2016-06-15
The FoodDrinkEurope Federation recently released the latest version of the Acrylamide Toolbox to support manufacturers in acrylamide-reduction activities, giving an indication of possible mitigation strategies. The Toolbox is intended for small and medium-sized enterprises with limited R&D resources; however, no comments on the pros and cons of the different measures were provided to advise potential users. Experts in the field are aware that not all the strategies proposed have equal value in terms of efficacy and cost/benefit ratio. This consideration prompted us to provide a qualitative, science-based ranking of the mitigation strategies proposed in the Acrylamide Toolbox, focusing on bakery and fried potato products. Five authors from different geographical areas, each with a publication record on acrylamide mitigation strategies, worked independently to rank the efficacy of the strategies according to three key parameters: (i) reduction rate; (ii) side effects; and (iii) applicability and economic impact. On the basis of their own experience, and considering selected literature from the last ten years, the authors scored each mitigation strategy proposed in the Toolbox on each key parameter. As expected, all the strategies selected in the Toolbox turned out to be useful, although not equally so. The use of the enzyme asparaginase and the selection of low-sugar varieties were considered the best mitigation strategies for bakery and potato products, respectively. In the authors' opinion, most of the other mitigation strategies, although effective, either have relevant side effects on the sensory profile of the products or are not easy to implement in industrial production. The final outcome is a science-based, commented ranking that can enrich the Acrylamide Toolbox, supporting individual manufacturers in taking the best actions to reduce the acrylamide content in their specific production context.
An image analysis toolbox for high-throughput C. elegans assays
Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.
2012-01-01
We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656
FISSA: A neuropil decontamination toolbox for calcium imaging signals.
Keemink, Sander W; Lowe, Scott C; Pakan, Janelle M P; Dylda, Evelyn; van Rossum, Mark C W; Rochefort, Nathalie L
2018-02-22
In vivo calcium imaging has become a method of choice for imaging neuronal population activity throughout the nervous system. These experiments generate large sequences of images whose analysis is computationally intensive and typically involves motion correction, image segmentation into regions of interest (ROIs), and extraction of fluorescence traces from each ROI. Out-of-focus fluorescence from the surrounding neuropil and other cells can strongly contaminate the signal assigned to a given ROI. In this study, we introduce the FISSA toolbox (Fast Image Signal Separation Analysis) for neuropil decontamination. Given pre-defined ROIs, the FISSA toolbox automatically extracts the surrounding local neuropil and performs blind source separation with non-negative matrix factorization. Using both simulated and in vivo data, we show that this toolbox performs similarly to or better than existing published methods. FISSA requires little RAM and allows fast processing of large datasets even on a standard laptop. The FISSA toolbox is available in Python, with an option for MATLAB-format outputs, and can easily be integrated into existing workflows. It is available from GitHub and the standard Python repositories.
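The blind-source-separation step mentioned above rests on non-negative matrix factorization (NMF): the observed signals are modeled as a non-negative mixture of non-negative sources. A minimal Lee-Seung multiplicative-update sketch follows; it is not FISSA's own code, and the toy sources and mixing matrix are invented for the example:

```python
import numpy as np

def nmf(V, rank, iters=500, seed=0):
    """Non-negative matrix factorisation V ~ W @ H via Lee-Seung
    multiplicative updates (Frobenius objective). Minimal sketch of the
    separation technique FISSA applies to ROI + neuropil signals."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 1e-3       # non-negative initial factors
    H = rng.random((rank, m)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)   # update sources
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)   # update mixing weights
    return W, H

# Two mixed non-negative sources observed on three "channels".
t = np.linspace(0, 1, 50)
sources = np.vstack([np.abs(np.sin(8 * t)), t])
mixing = np.array([[1.0, 0.2], [0.3, 1.0], [0.5, 0.5]])
V = mixing @ sources
W, H = nmf(V, rank=2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The non-negativity constraint is what makes the decomposition appropriate here: fluorescence signals and mixing weights are physically non-negative, unlike in ICA-style separation.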
Clayson, Peter E; Miller, Gregory A
2017-01-01
Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
Lawhern, Vernon; Hairston, W David; Robbins, Kay
2013-01-01
Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.
Snigdha, Shikha; Milgram, Norton W; Willis, Sherry L; Albert, Marylin; Weintraub, S; Fortin, Norbert J; Cotman, Carl W
2013-07-01
A major goal of animal research is to identify interventions that can promote successful aging and delay or reverse age-related cognitive decline in humans. Recent advances in standardizing cognitive assessment tools for humans have the potential to bring preclinical work closer to human research in aging and Alzheimer's disease. The National Institute of Health (NIH) has led an initiative to develop a comprehensive Toolbox for Neurologic Behavioral Function (NIH Toolbox) to evaluate cognitive, motor, sensory and emotional function for use in epidemiologic and clinical studies spanning 3 to 85 years of age. This paper aims to analyze the strengths and limitations of animal behavioral tests that can be used to parallel those in the NIH Toolbox. We conclude that there are several paradigms available to define a preclinical battery that parallels the NIH Toolbox. We also suggest areas in which new tests may benefit the development of a comprehensive preclinical test battery for assessment of cognitive function in animal models of aging and Alzheimer's disease. Copyright © 2013 Elsevier Inc. All rights reserved. PMID:23434040
Guillaume, Y C; Peyrin, E
2000-03-06
A chemometric methodology is proposed to study the separation of seven p-hydroxybenzoic esters in reversed phase liquid chromatography (RPLC). Fifteen experiments were found to be necessary to find a mathematical model which linked a novel chromatographic response function (CRF) with the column temperature, the water fraction in the mobile phase and its flow rate. The CRF optimum was determined using a new algorithm based on Glover's taboo search (TS). A flow-rate of 0.9 ml min(-1) with a water fraction of 0.64 in the ACN-water mixture and a column temperature of 10 degrees C gave the most efficient separation conditions. The usefulness of TS was compared with the pure random search (PRS) and simplex search (SS). As demonstrated by calculations, the algorithm avoids entrapment in local minima and continues the search to give a near-optimal final solution. Unlike other methods of global optimisation, this procedure is generally applicable, easy to implement, derivative free, conceptually simple and could be used in the future for much more complex optimisation problems.
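The essence of the taboo (tabu) search used above is that the algorithm always moves to the best non-tabu neighbour, even uphill, and a short memory of recent solutions prevents cycling. A minimal sketch on a discrete search space (illustrative parameters; not the authors' implementation):

```python
# Minimal tabu search on a discrete neighbourhood. The tabu list stores
# recently visited solutions so the search cannot immediately revisit them,
# which is how it escapes local minima.
def tabu_search(f, start, neighbors, n_iter=50, tabu_len=10):
    current, best = start, start
    best_val = f(start)
    tabu = [start]
    for _ in range(n_iter):
        cands = [n for n in neighbors(current) if n not in tabu]
        if not cands:
            break  # neighbourhood exhausted
        current = min(cands, key=f)        # best move, even if uphill
        tabu.append(current)
        tabu = tabu[-tabu_len:]            # bounded memory
        if f(current) < best_val:
            best, best_val = current, f(current)
    return best, best_val
```

On a one-dimensional grid with a single optimum the search walks straight to it; on multimodal responses such as the chromatographic response function, the uphill moves let it leave local optima and continue toward a near-optimal solution.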
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-09-11
While an organized source of reference information on PV performance modeling is certainly valuable, there is nothing to match the availability of actual examples of modeling algorithms being used in practice. To meet this need, Sandia has developed a PV performance modeling toolbox (PV_LIB) for Matlab. It contains a set of well-documented, open-source functions and example scripts showing the functions being used in practical examples. This toolbox is meant to help make the multi-step process of modeling a PV system more transparent and to provide the means for model users to validate and understand the models they use and/or develop. It is fully integrated into Matlab's help and documentation utilities. The PV_LIB Toolbox provides more than 30 functions that are sorted into four categories.
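One of the simplest steps in such a modeling chain, DC power as a function of plane-of-array irradiance and cell temperature, can be sketched as below. The function name, rating and temperature coefficient are illustrative assumptions; PV_LIB implements far more detailed, validated models:

```python
# Toy PV DC power model: output scales linearly with plane-of-array
# irradiance (relative to 1000 W/m^2) and derates linearly with cell
# temperature above 25 C. Illustrative only, not a PV_LIB function.
def pv_power(poa_irradiance, cell_temp, p_dc0=250.0, gamma=-0.004):
    """poa_irradiance in W/m^2, cell_temp in C, p_dc0 = rated DC watts,
    gamma = temperature coefficient of power (1/C)."""
    return p_dc0 * (poa_irradiance / 1000.0) * (1.0 + gamma * (cell_temp - 25.0))
```

At standard test conditions (1000 W/m^2, 25 C) the model returns the nameplate rating; a hot cell at 50 C loses 10% with this assumed coefficient.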
A GIS tool for two-dimensional glacier-terminus change tracking
NASA Astrophysics Data System (ADS)
Urbanski, Jacek Andrzej
2018-02-01
This paper presents a Glacier Termini Tracking (GTT) toolbox for the two-dimensional analysis of glacier-terminus position changes. The input consists of a vector layer with several termini lines relating to the same glacier at different times. The output layers allow analyses to be conducted of glacier-terminus retreats, changes in retreats over time and along the ice face, and glacier-terminus fluctuations over time. The application of three tools from the toolbox is demonstrated via the analysis of eight glacier-terminus retreats and fluctuations at the Hornsund fjord in south Svalbard. It is proposed that this toolbox may also be useful in the study of other line features that change over time, like coastlines and rivers. The toolbox has been coded in Python and runs via ArcGIS.
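The core two-dimensional measurement behind such an analysis, the distance between two digitized termini lines, can be sketched as follows. This is a generic mean point-to-polyline distance, not the GTT toolbox's code, and the function names are illustrative:

```python
# Sketch of a 2-D terminus-change measure: mean distance from the vertices
# of the newer terminus line to the older terminus line (point-to-segment
# distances). Coordinates are assumed to be in a projected CRS (metres).
import math

def point_segment_dist(p, a, b):
    """Euclidean distance from point p to segment ab."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    L2 = dx * dx + dy * dy
    if L2 == 0.0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / L2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def mean_terminus_change(new_line, old_line):
    """Mean distance from each vertex of the newer terminus polyline to the
    older terminus polyline."""
    dists = [min(point_segment_dist(p, old_line[i], old_line[i + 1])
                 for i in range(len(old_line) - 1))
             for p in new_line]
    return sum(dists) / len(dists)
```

Applied to termini lines of the same glacier at successive dates, this yields a retreat rate; the same measure transfers directly to coastlines or river channels, as the abstract suggests.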
NASA Astrophysics Data System (ADS)
Wessel, Paul; Luis, Joaquim F.
2017-02-01
The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.
Global Soil Information Facilities - Component Worldgrids.org
NASA Astrophysics Data System (ADS)
Reuter, H. I.; Hengl, T.
2012-04-01
GSIF (Global Soil Information Facilities) is ISRIC's framework for the production of open soil data. It has been inspired by global environmental data initiatives (e.g. OneGeology, GBIF). The main practical motivation for GSIF is to build a cyber-infrastructure to collate legacy (i.e., historic) soil data currently under threat of being lost forever and to generate new soil information. The objective of the Worldgrids component is a (de)centralized repository for collecting, storing, accessing and interacting with gridded data sets of global soil covariate data for production mapping, as part of the larger GSIF. It is the physical implementation of the expectation that ISRIC would lead and coordinate a project to assemble a core data set of global environmental covariates to (partly) support local efforts to produce global soil property maps. Currently, over 100 layers at 5 km and 1 km resolution with global coverage can be accessed via www.worldgrids.org. Three different functionalities are implemented to extract data in an OGC-compliant manner: i) single-point overlay; ii) mass point overlay; iii) zone grid overlay with reporting of different statistical parameters. The presentation will focus on datasets, functionalities, and access via the R project and the ArcGIS globalsoilmap.net Toolbox, as well as on future enhancements to the Worldgrids platform.
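The single-point overlay functionality amounts to sampling a georeferenced grid at a coordinate. A minimal nearest-cell sketch (generic raster arithmetic, not the Worldgrids service API):

```python
# Nearest-cell point overlay on a north-up grid. `grid` is a 2-D list with
# rows ordered from the top; (x0, y0) is the upper-left corner and `cell`
# the cell size, all in the grid's coordinate units (e.g. degrees).
def sample_grid(grid, x0, y0, cell, lon, lat):
    col = int((lon - x0) / cell)
    row = int((y0 - lat) / cell)
    return grid[row][col]
```

Mass point overlay is the same operation mapped over many coordinates, and zone overlay aggregates cell values falling inside each zone before reporting statistics.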
Real time wind farm emulation using SimWindFarm toolbox
NASA Astrophysics Data System (ADS)
Topor, Marcel
2016-06-01
This paper presents a wind farm emulation solution using an open-source Matlab/Simulink toolbox and the National Instruments cRIO platform. This work is based on the Aeolus SimWindFarm (SWF) toolbox models developed at Aalborg University, Denmark. Using the Matlab Simulink models developed in SWF, the modeling code can be exported to a real-time model using the NI VeriStand model framework, and the resulting code is integrated as hardware-in-the-loop control on the NI 9068 platform.
A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings
Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano
2009-01-01
Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrodes locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. 
Conclusion The new toolbox presented here implements fast and data-robust computations of the most relevant quantities used in information theoretic analysis of neural data. The toolbox can be easily used within Matlab, the environment used by most neuroscience laboratories for the acquisition, preprocessing and plotting of neural data. It can therefore significantly enlarge the domain of application of information theory to neuroscience, and lead to new discoveries about the neural code. PMID:19607698
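The plugin (maximum-likelihood) estimator that such toolboxes start from, and whose limited-sampling bias they correct, can be sketched for discrete stimulus/response sequences as follows. This is the naive estimator only; the bias-correction techniques the toolbox implements are deliberately omitted:

```python
# Plugin estimate of mutual information I(S;R) in bits between two discrete
# sequences. With few trials this estimator is biased upward; the toolbox
# described above implements corrections for exactly this problem.
import math
from collections import Counter

def plugin_mi(stim, resp):
    n = len(stim)
    ps = Counter(stim)                 # empirical stimulus counts
    pr = Counter(resp)                 # empirical response counts
    pj = Counter(zip(stim, resp))      # empirical joint counts
    mi = 0.0
    for (s, r), c in pj.items():
        p_sr = c / n
        # p_sr * log2( p_sr / (p_s * p_r) ), with probabilities as counts/n
        mi += p_sr * math.log2(p_sr * n * n / (ps[s] * pr[r]))
    return mi
```

A perfectly informative binary response yields 1 bit; statistically independent sequences yield 0, and any positive value obtained from small samples should be treated with the bias caveats the abstract describes.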
NASA Astrophysics Data System (ADS)
Huning, L. S.; Margulis, S. A.
2013-12-01
Concepts in introductory hydrology courses are often taught in the context of process-based modeling that is ultimately integrated into a watershed model. In an effort to reduce the learning curve associated with applying hydrologic concepts to real-world applications, we developed a 'hydrology toolbox', which complements a new companion textbook, and incorporated it into introductory undergraduate hydrology courses. The hydrology toolbox contains the basic building blocks (functions coded in MATLAB) for an integrated spatially-distributed watershed model, making hydrologic topics (e.g. precipitation, snow, radiation, evaporation, unsaturated flow, infiltration, groundwater, and runoff) more user-friendly and accessible for students. The toolbox functions can be used in a modular format so that students can study individual hydrologic processes and become familiar with the hydrology toolbox. This approach allows such courses to emphasize understanding and application of hydrologic concepts rather than computer coding or programming. While topics in introductory hydrology courses are often introduced and taught independently or semi-independently, they are inherently interconnected. These toolbox functions are therefore linked together at the end of the course to reinforce a holistic understanding of how these hydrologic processes are measured, interconnected, and modeled. They are integrated into a spatially-distributed watershed model or numerical laboratory where students can explore a range of topics such as rainfall-runoff modeling, urbanization, deforestation, and watershed response to changes in parameters or forcings. Model output can readily be visualized and analyzed by students to understand watershed response in a real river basin or a simple 'toy' basin. These tools complement the textbook, and each has been well received by students in multiple hydrology courses with various disciplinary backgrounds.
The same governing equations that students have studied in the textbook and used in the toolbox have been encapsulated in the watershed model. Therefore, the combination of the hydrology toolbox, integrated watershed model, and textbook tends to eliminate the potential disconnect between process-based modeling and an 'off-the-shelf' watershed model.
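The kind of modular building block described above can be illustrated with the simplest rainfall-runoff component, a linear reservoir, in which outflow is proportional to storage. The function and parameters are a generic textbook sketch, not a function from the hydrology toolbox itself:

```python
# Linear-reservoir runoff: at each time step, storage gains the rainfall
# depth and releases a fixed fraction k as outflow. This is the classic
# single-bucket building block of process-based hydrology teaching.
def linear_reservoir(precip, k=0.2, s0=0.0):
    """precip: rainfall depth per step; k: outflow coefficient (1/step);
    s0: initial storage. Returns the outflow hydrograph."""
    s, q = s0, []
    for p in precip:
        s += p          # rainfall enters storage
        out = k * s     # outflow proportional to storage
        s -= out
        q.append(out)
    return q
```

A single rainfall pulse produces the expected exponentially recessing hydrograph, and chaining such components (snow, infiltration, groundwater) is exactly the "linking together" the course builds toward.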
Sentinel-2 data exploitation with ESA's Sentinel-2 Toolbox
NASA Astrophysics Data System (ADS)
Gascon, Ferran; Ramoino, Fabrizzio; deanos, Yves-louis
2017-04-01
The Sentinel-2 Toolbox is a project kicked off by ESA in early 2014, under the umbrella of the ESA SEOM programme, with the aim to provide a tool for visualizing, analysing, and processing Sentinel-2 datasets. The toolbox is an extension of the SeNtinel Application Platform (SNAP), a project resulting from the effort of the developers of the Sentinel-1, Sentinel-2 and Sentinel-3 toolboxes to provide a single common application framework suited for the mixed exploitation of SAR, high resolution optical and medium resolution optical datasets. All three development teams collaborate to drive the evolution of the common SNAP framework in a developer forum. In this triplet, the Sentinel-2 Toolbox is dedicated to enhancing SNAP support for high resolution optical imagery. It is a multi-mission toolbox, already providing support for Sentinel-2, RapidEye, Deimos, and SPOT 1 to SPOT 5 datasets. In terms of processing algorithms, SNAP provides tools specific to the Sentinel-2 mission:
• An atmospheric correction module, Sen2Cor, integrated into the toolbox, providing scene classification, atmospheric correction, and cirrus detection and correction. The output L2A products can be opened seamlessly in the toolbox.
• A multitemporal synthesis processor (L3)
• A biophysical products processor (L2B)
• A water processor
• A deforestation detector
• OTB tools integration
• SNAP Engine for cloud exploitation
along with a set of more generic tools for high resolution optical data exploitation. Together with the generic functionalities of SNAP, this provides an ideal environment for designing multi-mission processing chains and producing value-added products from raw datasets. The uses of SNAP are manifold, and the desktop tool provides a rich application for interactive visualization, analysis and processing of data. All tools available from SNAP can also be accessed via the command line through the Graph Processing Framework (GPT), the kernel of the SNAP processing engine.
This makes it a perfect candidate for driving the processing of data on servers for bulk processing.
Broadview Radar Altimetry Toolbox
NASA Astrophysics Data System (ADS)
Garcia-Mondejar, Albert; Escolà, Roger; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Naeije, Marc; Ambrózio, Américo; Restano, Marco; Benveniste, Jérôme
2017-04-01
The universal altimetry toolbox, BRAT (Broadview Radar Altimetry Toolbox) which can read all previous and current altimetry missions' data, incorporates now the capability to read the upcoming Sentinel3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support the users of the future Sentinel3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. This project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the frontend for the powerful command line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or in C/C++/Fortran via a programming API, allowing the user to obtain desired data, bypassing the dataformatting hassle. BRAT can be used simply to visualise data quickly, or to translate the data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT involving combinations of data fields that the user can save for posterior reuse or using the already embedded formulas that include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, that contains a strong introduction to altimetry, shows its applications in different fields such as Oceanography, Cryosphere, Geodesy, Hydrology among others. Included are also "use cases", with step-by-step examples, on how to use the toolbox in the different contexts. The Sentinel3 SAR Altimetry Toolbox shall benefit from the current BRAT version. 
The Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. While developing the new toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry, in order to train users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open-source framework, contributions from users who have developed their own functions are welcome. The first release of the new Radar Altimetry Toolbox was published in September 2015. It incorporates the capability to read S3 products as well as the new CryoSat-2 Baseline C. The second release of the Toolbox, published in October 2016, has a new graphical user interface and other visualisation improvements. The third release (January 2017) includes more features and solves issues from the previous versions.
NASA Astrophysics Data System (ADS)
Hoell, Simon; Omenzetter, Piotr
2018-02-01
To advance the concept of smart structures in large systems, such as wind turbines (WTs), it is desirable to be able to detect structural damage early while using minimal instrumentation. Data-driven vibration-based damage detection methods can be competitive in that respect because global vibrational responses encompass the entire structure. Multivariate damage sensitive features (DSFs) extracted from acceleration responses enable to detect changes in a structure via statistical methods. However, even though such DSFs contain information about the structural state, they may not be optimised for the damage detection task. This paper addresses the shortcoming by exploring a DSF projection technique specialised for statistical structural damage detection. High dimensional initial DSFs are projected onto a low-dimensional space for improved damage detection performance and simultaneous computational burden reduction. The technique is based on sequential projection pursuit where the projection vectors are optimised one by one using an advanced evolutionary strategy. The approach is applied to laboratory experiments with a small-scale WT blade under wind-like excitations. Autocorrelation function coefficients calculated from acceleration signals are employed as DSFs. The optimal numbers of projection vectors are identified with the help of a fast forward selection procedure. To benchmark the proposed method, selections of original DSFs as well as principal component analysis scores from these features are additionally investigated. The optimised DSFs are tested for damage detection on previously unseen data from the healthy state and a wide range of damage scenarios. It is demonstrated that using selected subsets of the initial and transformed DSFs improves damage detectability compared to the full set of features. Furthermore, superior results can be achieved by projecting autocorrelation coefficients onto just a single optimised projection vector.
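The final detection step, projecting multivariate damage-sensitive features onto a vector and thresholding the deviation from the healthy state, can be sketched as follows. Here the projection vector is supplied by the caller and the function names are illustrative; in the paper it is optimised by an evolutionary strategy:

```python
# Statistical damage detection via a single projection of multivariate
# damage-sensitive features (DSFs). Healthy-state training data define the
# reference distribution; test projections beyond `threshold` standard
# deviations are flagged as potential damage.
import numpy as np

def detect_damage(train_feats, test_feats, w, threshold=3.0):
    w = np.asarray(w, dtype=float)
    w = w / np.linalg.norm(w)          # unit projection vector
    tr = train_feats @ w               # healthy-state projections
    mu, sd = tr.mean(), tr.std(ddof=1)
    z = np.abs((test_feats @ w) - mu) / sd
    return z > threshold
```

Reducing a high-dimensional DSF (e.g. many autocorrelation coefficients) to one optimised projection both cuts the computational burden and, as the paper reports, can improve detectability over using the full feature set.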
Global optimization of multimode interference structure for ratiometric wavelength measurement
NASA Astrophysics Data System (ADS)
Wang, Qian; Farrell, Gerald; Hatta, Agus Muhamad
2007-07-01
The multimode interference structure is conventionally used as a splitter/combiner. In this paper, it is optimised as an edge filter for ratiometric wavelength measurement, which can be used in the demodulation of fiber Bragg grating sensors. A global optimization algorithm, adaptive simulated annealing, is introduced for the design of the multimode interference structure, including the length and width of the multimode waveguide section and the positions of the input and output waveguides. The designed structure shows a suitable spectral response for wavelength measurement and a good fabrication tolerance.
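The idea behind the algorithm can be shown with plain simulated annealing on a generic objective; adaptive simulated annealing additionally re-anneals and tunes per-parameter step sizes and temperature schedules, which this sketch omits:

```python
# Plain simulated annealing over a continuous parameter vector: random
# moves are always accepted when they improve the objective, and accepted
# with probability exp(-delta/T) otherwise, with T decreasing over time.
import math
import random

def simulated_annealing(f, x0, step=1.0, t0=1.0, cooling=0.95,
                        n_iter=500, seed=1):
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, best_f = list(x), fx
    t = t0
    for _ in range(n_iter):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc            # accept (possibly uphill) move
            if fx < best_f:
                best, best_f = list(x), fx
        t *= cooling                    # geometric cooling schedule
    return best, best_f
```

In the paper's setting the parameter vector would hold the multimode section's length and width and the waveguide positions, and the objective would score the simulated spectral response.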
Smoke Ready Toolbox for Wildfires
This site provides an online Smoke Ready Toolbox for Wildfires, which lists resources and tools that provide information on health impacts from smoke exposure, current fire conditions and forecasts and strategies to reduce exposure to smoke.
Developing a congestion mitigation toolbox.
DOT National Transportation Integrated Search
2011-09-30
Researchers created A Michigan Toolbox for Mitigating Traffic Congestion to be a useful desk reference for practitioners and an educational tool for elected officials acting through public policy boards to better understand the development, planning,...
Grid Integrated Distributed PV (GridPV) Version 2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reno, Matthew J.; Coogan, Kyle
2014-12-01
This manual provides the documentation of the MATLAB toolbox of functions for using OpenDSS to simulate the impact of solar energy on the distribution system. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions. Each function in the toolbox is documented with the function use syntax, full description, function input list, function output list, example use, and example output.
CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, F.; Flach, G.; BROWN, K.
2013-06-01
This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing of standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) improved graphical display of model results; 2) improved error analysis and reporting; 3) an increase in the default maximum model mesh size from 301 to 501 nodes; and 4) the ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.
Neural Parallel Engine: A toolbox for massively parallel neural signal processing.
Tam, Wing-Kin; Yang, Zhi
2018-05-01
Large-scale neural recordings provide detailed information on neuronal activities and can help elucidate the underlying neural mechanisms of the brain. However, the computational burden is also formidable when we try to process the huge data stream generated by such recordings. In this study, we report the development of Neural Parallel Engine (NPE), a toolbox for massively parallel neural signal processing on graphical processing units (GPUs). It offers a selection of the most commonly used routines in neural signal processing such as spike detection and spike sorting, including advanced algorithms such as exponential-component-power-component (EC-PC) spike detection and binary pursuit spike sorting. We also propose a new method for detecting peaks in parallel through a parallel compact operation. Our toolbox is able to offer a 5× to 110× speedup compared with its CPU counterparts depending on the algorithms. A user-friendly MATLAB interface is provided to allow easy integration of the toolbox into existing workflows. Previous efforts on GPU neural signal processing focused only on a few rudimentary algorithms, were not well-optimized and often did not provide a user-friendly programming interface that fits into existing workflows. There is a strong need for a comprehensive toolbox for massively parallel neural signal processing. A new toolbox for massively parallel neural signal processing has been created. It can offer significant speedup in processing signals from large-scale recordings up to thousands of channels. Copyright © 2018 Elsevier B.V. All rights reserved.
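The data-parallel flavour of GPU peak detection can be conveyed with a vectorized sketch: every sample is tested independently against its neighbours, so the same elementwise comparisons map naturally onto GPU threads. This NumPy version is illustrative only and is not NPE's parallel compact operation:

```python
# Data-parallel peak detection: a sample is a peak if it exceeds a
# threshold and both of its neighbours. All comparisons are elementwise
# (one logical "thread" per sample); the surviving indices correspond to
# what a GPU compact/stream-compaction step would gather.
import numpy as np

def detect_peaks(x, threshold):
    x = np.asarray(x, dtype=float)
    mid = x[1:-1]
    is_peak = (mid > threshold) & (mid > x[:-2]) & (mid > x[2:])
    return np.flatnonzero(is_peak) + 1   # +1: offset for the trimmed edge
```

On a GPU, the boolean mask would be reduced with a parallel compaction rather than `flatnonzero`, which is where the reported speedups come from.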
Bialke, Martin; Rau, Henriette; Thamm, Oliver C; Schuldt, Ronny; Penndorf, Peter; Blumentritt, Arne; Gött, Robert; Piegsa, Jens; Bahls, Thomas; Hoffmann, Wolfgang
2018-01-25
In most research projects, budget, staff and IT infrastructure are limiting resources. Especially for small-scale registries and cohort studies, professional IT support and commercial electronic data capture systems are too expensive. Consequently, these projects use simple local approaches (e.g. Excel) for data capture instead of central data management including web-based data capture and proper research databases. This leads to manual processes to merge, analyze and, if possible, pseudonymize research data from different study sites. To support multi-site data capture, storage and analyses in small-scale research projects, corresponding requirements were analyzed within the MOSAIC project. Based on the identified requirements, the Toolbox for Research was developed as a flexible software solution for various research scenarios. Additionally, the Toolbox facilitates the integration of research data as well as metadata by performing the necessary procedures automatically. Toolbox modules also allow the integration of device data. Moreover, the separation of personally identifiable information from medical data, with only pseudonyms used for storing medical data, ensures compliance with data protection regulations. The pseudonymized data can then be exported in SPSS format to enable scientists to prepare reports and analyses. The Toolbox for Research was successfully piloted in the German Burn Registry in 2016, facilitating the documentation of 4350 burn cases at 54 study sites. The Toolbox for Research can be downloaded free of charge from the project website and installed automatically thanks to the use of Docker technology.
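The pseudonymization pattern described above, storing medical data under a pseudonym and keeping identities in a separate table, can be sketched as follows. The record fields and the keyed-hash pseudonym scheme are illustrative assumptions, not the Toolbox for Research's actual implementation:

```python
# Sketch: split records into an identity table and a medical table linked
# only by a keyed pseudonym (HMAC-SHA256 of the patient ID), so medical
# data alone cannot be re-identified without the secret key.
import hashlib
import hmac

def pseudonymize(records, secret):
    """records: dicts with 'patient_id', 'name', 'diagnosis'.
    secret: bytes key held by the trusted party only."""
    identities, medical = {}, []
    for rec in records:
        pid = hmac.new(secret, rec["patient_id"].encode(),
                       hashlib.sha256).hexdigest()[:16]
        identities[pid] = {"patient_id": rec["patient_id"],
                           "name": rec["name"]}
        medical.append({"pseudonym": pid, "diagnosis": rec["diagnosis"]})
    return identities, medical
```

Because the pseudonym is deterministic for a given key, data arriving from different study sites for the same patient can still be merged without ever exporting names alongside medical data.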
ERIC Educational Resources Information Center
Yasumoto, Seiko
2014-01-01
"Blended learning" has been attracting academic interest catalysed by the advance of mixed-media technology and has significance for the global educational community and evolutionary development of pedagogical approaches to optimise student learning. This paper examines one aspect of blended teaching of Japanese language and culture in…
White, Melanie D.; Milne, Ruth V. J.; Nolan, Matthew F.
2011-01-01
We introduce a molecular toolbox for manipulation of neuronal gene expression in vivo. The toolbox includes promoters, ion channels, optogenetic tools, fluorescent proteins, and intronic artificial microRNAs. The components are easily assembled into adeno-associated virus (AAV) or lentivirus vectors using recombination cloning. We demonstrate assembly of toolbox components into lentivirus and AAV vectors and use these vectors for in vivo expression of inwardly rectifying potassium channels (Kir2.1, Kir3.1, and Kir3.2) and an artificial microRNA targeted against the ion channel HCN1 (HCN1 miRNA). We show that AAV assembled to express HCN1 miRNA produces efficacious and specific in vivo knockdown of HCN1 channels. Comparison of in vivo viral transduction using HCN1 miRNA with mice containing a germ line deletion of HCN1 reveals similar physiological phenotypes in cerebellar Purkinje cells. The easy assembly and re-usability of the toolbox components, together with the ability to up- or down-regulate neuronal gene expression in vivo, may be useful for applications in many areas of neuroscience. PMID:21772812
Sun, Li; Hernandez-Guzman, Jessica; Warncke, Kurt
2009-01-01
Electron spin echo envelope modulation (ESEEM) is a technique of pulsed electron paramagnetic resonance (EPR) spectroscopy. The analysis of ESEEM data to extract information about the nuclear and electronic structure of a disordered (powder) paramagnetic system requires accurate and efficient numerical simulations. A single coupled nucleus of known nuclear g value (gN) and spin I=1 can have up to eight adjustable parameters in the nuclear part of the spin Hamiltonian. We have developed OPTESIM, an ESEEM simulation toolbox, for automated numerical simulation of powder two- and three-pulse one-dimensional ESEEM for an arbitrary number (N) and type (I, gN) of coupled nuclei, and arbitrary mutual orientations of the hyperfine tensor principal axis systems for N>1. OPTESIM is based in the MATLAB environment and includes the following features: (1) a fast algorithm for translation of the spin Hamiltonian into simulated ESEEM, (2) different optimization methods that can be hybridized to achieve an efficient coarse-to-fine-grained search of the parameter space and convergence to a global minimum, (3) statistical analysis of the simulation parameters, which allows the identification of simultaneous confidence regions at specific confidence levels. OPTESIM also includes a geometry-preserving spherical averaging algorithm as the default for N>1, and global optimization over multiple experimental conditions, such as the dephasing time (τ) for three-pulse ESEEM and external magnetic field values. Application examples for simulation of 14N coupling (N=1, N=2) in biological and chemical model paramagnets are included. Automated, optimized simulations using OPTESIM converge on dramatically shorter time scales relative to manual simulations. PMID:19553148
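The hybridized coarse-to-fine search described in feature (2) can be sketched generically: a coarse grid scan locates the basin of the global minimum, then a local optimizer refines it. The objective below is a toy multimodal function standing in for an ESEEM fitting residual; none of this is OPTESIM's actual code.

```python
import numpy as np
from scipy.optimize import minimize

def coarse_to_fine(residual, bounds, n_grid=25):
    """Coarse grid scan followed by local (Nelder-Mead) refinement.

    A toy version of the hybrid coarse-to-fine strategy; the real toolbox
    hybridizes several optimizers to reach a global minimum.
    """
    grids = [np.linspace(lo, hi, n_grid) for lo, hi in bounds]
    mesh = np.meshgrid(*grids, indexing="ij")
    pts = np.stack([m.ravel() for m in mesh], axis=1)
    best = pts[np.argmin([residual(p) for p in pts])]        # coarse stage
    return minimize(residual, best, method="Nelder-Mead").x  # fine stage

# Toy "residual": a quadratic bowl near (1.5, -0.5) with sinusoidal ripples,
# standing in for a fit residual with local minima.
res = lambda p: (p[0] - 1.5) ** 2 + (p[1] + 0.5) ** 2 + 0.1 * np.sin(5 * p[0]) ** 2
x = coarse_to_fine(res, bounds=[(-3, 3), (-3, 3)])
```

The coarse stage is what prevents the local optimizer from being trapped in a ripple far from the global basin.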
System analysis tools for an ELT at ESO
NASA Astrophysics Data System (ADS)
Mueller, Michael; Koch, Franz
2006-06-01
Engineering of complex, large-scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics and control engineering. However, even analyzing a single component of the telescope, like the telescope structure, necessitates a system approach to evaluate the structural effects on the optical performance. This paper presents several software tools developed by the European Southern Observatory (ESO) which focus on the system approach in the analyses: Using modal results of a finite element analysis, the SMI-toolbox allows easy generation of structural models with different sizes and levels of accuracy for control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and interfacing with a structural model. Within BeamWarrior, displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition, a sparse state-space model object was developed for MATLAB to gain computational efficiency and reduce memory requirements, exploiting the sparsity pattern of both the structural models and the control architecture. As a result, these tools allow building an integrated model in order to reliably simulate interactions, cross-coupling effects and system responses, and to evaluate global performance. In order to evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS which performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.
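The payoff of a sparse state-space object can be sketched with SciPy: a discrete-time update x[k+1] = A x[k] + B u[k], y[k] = C x[k] with the matrices in CSR format costs O(nnz) per step instead of O(n²). This is a generic illustration of the design choice, not ESO's MATLAB object.

```python
import numpy as np
from scipy import sparse

def simulate(A, B, C, u_seq, x0):
    """Discrete-time state-space simulation: y[k] = C x[k], x[k+1] = A x[k] + B u[k].

    With A, B, C stored as scipy.sparse CSR matrices, each step touches only
    the nonzero entries -- the same gain the sparse state-space object
    exploits for large, sparsely coupled structural/control models.
    """
    x, ys = x0, []
    for u in u_seq:
        ys.append(C @ x)
        x = A @ x + B @ u
    return np.array(ys)

n = 1000
A = sparse.eye(n, format="csr") * 0.99                         # weakly damped modes
B = sparse.csr_matrix(([1.0], ([0], [0])), shape=(n, 1))       # input drives state 0
C = sparse.csr_matrix(([1.0], ([0], [0])), shape=(1, n))       # observe state 0
y = simulate(A, B, C, u_seq=[np.array([1.0])] * 3, x0=np.zeros(n))
```

For a structural model with thousands of modes but only local couplings, the CSR storage also keeps memory proportional to the number of nonzeros rather than n².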
A Michigan toolbox for mitigating traffic congestion.
DOT National Transportation Integrated Search
2011-09-30
"Researchers created A Michigan Toolbox for Mitigating Traffic Congestion to be a useful desk reference for practitioners and an educational tool for elected officials acting through public policy boards to better understand the development, plan...
Drinking Water Cyanotoxin Risk Communication Toolbox
The drinking water cyanotoxin risk communication toolbox is a ready-to-use, “one-stop-shop” to support public water systems, states, and local governments in developing, as they deem appropriate, their own risk communication materials.
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant assessment data bases,
40 CFR 141.715 - Microbial toolbox options for meeting Cryptosporidium treatment requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... criteria are in § 141.716(b). Pre Filtration Toolbox Options (3) Presedimentation basin with coagulation 0... separate granular media filtration stage if treatment train includes coagulation prior to first filter...
40 CFR 141.715 - Microbial toolbox options for meeting Cryptosporidium treatment requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... criteria are in § 141.716(b). Pre Filtration Toolbox Options (3) Presedimentation basin with coagulation 0... separate granular media filtration stage if treatment train includes coagulation prior to first filter...
Air Sensor Toolbox for Citizen Scientists
EPA’s Air Sensor Toolbox provides information and guidance on new low-cost compact technologies for measuring air quality. It provides information to help citizens more effectively and accurately collect air quality data in their community.
A portable toolbox to monitor and evaluate signal operations.
DOT National Transportation Integrated Search
2011-10-01
Researchers from the Texas Transportation Institute developed a portable tool consisting of a field-hardened computer interfacing with the traffic signal cabinet through special enhanced Bus Interface Units. The toolbox consisted of a monitoring t...
Air Sensor Toolbox: Resources and Funding
EPA’s Air Sensor Toolbox provides information and guidance on new low-cost compact technologies for measuring air quality. It provides information to help citizens more effectively and accurately collect air quality data in their community.
NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.
Zhang, Bo; Dai, Ji; Zhang, Tao
2017-11-13
In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed together with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, a highly integrated toolbox that can process multiple types of data regardless of input format and perform basic analysis for general electrophysiological experiments would be extremely useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine the different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. NeoAnalysis can then perform regular analog signal processing, spike train and local field potential analysis, and behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. In particular, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level.
With the multitude of general-purpose functions provided by NeoAnalysis, users can easily obtain publication-quality figures without writing complex code. NeoAnalysis is a powerful and valuable toolbox for users conducting electrophysiological experiments.
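The trial-by-trial organisation of spikes around events can be sketched as a peri-stimulus time histogram: align spike times to each event onset, bin, and average. This is a generic NumPy illustration of the kind of analysis described above; the function name and signature are assumptions, not NeoAnalysis's actual interface.

```python
import numpy as np

def psth(spike_times, event_times, window=(-0.2, 0.5), bin_width=0.05):
    """Peri-stimulus time histogram: mean firing rate around each event.

    Spikes are aligned trial-by-trial to each event onset and binned,
    mirroring the per-trial tables described above.  Illustrative sketch
    only -- not NeoAnalysis's API.
    """
    n_bins = int(round((window[1] - window[0]) / bin_width))
    edges = np.linspace(window[0], window[1], n_bins + 1)
    counts = np.zeros(n_bins)
    for t0 in event_times:
        counts += np.histogram(spike_times - t0, bins=edges)[0]
    return edges[:-1], counts / (len(event_times) * bin_width)  # spikes/s

spikes = np.array([0.05, 0.07, 1.05, 2.04, 2.06])   # spike times (s)
events = np.array([0.0, 1.0, 2.0])                  # trial onsets (s)
bins, rate = psth(spikes, events)
```

Averaging over events rather than concatenating them is what turns raw counts into a rate estimate comparable across conditions with different trial numbers.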
EPA EMERGENCY PLANNING TOOLBOX
EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...
Ironbound Community Citizen Science Toolbox Fact Sheet
EPA is partnering with Newark’s Ironbound Community Corporation (ICC) to design, develop, and pilot a Citizen Science Toolbox that will enable communities to collect their own environmental data and increase their ability to understand local conditions.
Evaluating a 2D image-based computerized approach for measuring riverine pebble roundness
NASA Astrophysics Data System (ADS)
Cassel, Mathieu; Piégay, Hervé; Lavé, Jérôme; Vaudor, Lise; Hadmoko Sri, Danang; Wibiwo Budi, Sandy; Lavigne, Franck
2018-06-01
The geometrical characteristics of pebbles are important features for studying transport pathways, sedimentary history, depositional environments and abrasion processes, or for targeting sediment sources. Both the shape and roundness of pebbles can be described by a still-growing number of metrics in 2D and 3D, or by visual charts. Despite new developments, existing tools remain proprietary, and no pebble roundness toolbox has been widely available within the scientific community. The toolbox developed by Roussillon et al. (2009) automatically computes the size, shape and roundness indexes of pebbles from their 2D maximal projection planes. The toolbox operates on 2D pictures, taken with a digital camera, of pebbles placed on a one-square-meter red board, allowing data to be collected quickly and efficiently at a large scale. Now that the toolbox is freely available for download,
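A minimal example of a 2D shape metric computed from a pebble's projected outline is the circularity index 4πA/P², which equals 1 for a circle and decreases for angular shapes. This is a deliberately simple stand-in for the richer roundness metrics of Roussillon et al. (2009), not their actual method.

```python
import numpy as np

def circularity(xy):
    """Circularity 4*pi*A/P**2 (1.0 for a circle) of a closed polygon
    given as an (n, 2) vertex array -- a simple generic shape index,
    not the roundness metric of the toolbox described above."""
    x, y = xy[:, 0], xy[:, 1]
    # Shoelace formula for the area; np.roll closes the polygon.
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    perim = np.sum(np.hypot(np.diff(x, append=x[0]), np.diff(y, append=y[0])))
    return 4 * np.pi * area / perim ** 2

theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
disc = np.column_stack([np.cos(theta), np.sin(theta)])        # near-circle
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)    # -> pi/4
```

In an image-based pipeline, the vertex array would come from the extracted pebble contour after segmenting the pebble from the red board.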
System engineering toolbox for design-oriented engineers
NASA Technical Reports Server (NTRS)
Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.
1994-01-01
This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function; a methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Each tool's relationship to project phase and its primary functional usage are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.
National Water-Quality Assessment (NAWQA) area-characterization toolbox
Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.
2010-01-01
This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells. These tools are built on top of standard functionality included in ArcGIS Desktop running at the ArcInfo license level. Most of the tools require a license for the ArcGIS Spatial Analyst extension. ArcGIS is a commercial GIS software system produced by ESRI, Inc. (http://www.esri.com). The NAWQA Area-Characterization Toolbox is not supported by ESRI, Inc. or its technical support staff. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.
Streptomyces spp. in the biocatalysis toolbox.
Spasic, Jelena; Mandic, Mina; Djokic, Lidija; Nikodinovic-Runic, Jasmina
2018-04-01
About 20,100 research publications dated 2000-2017 were recovered by searching the PubMed and Web of Science databases for Streptomyces, which are the richest known source of bioactive molecules. These bacteria with versatile metabolism are also powerful suppliers of biocatalytic tools (enzymes) for advanced biotechnological applications such as green chemical transformations and biopharmaceutical and biofuel production. Recent technological advances, especially in DNA sequencing coupled with computational tools for protein functional and structural prediction, together with improved access to microbial diversity, have enabled easier access to enzymes and the ability to engineer them to suit a wider range of biotechnological processes. The major driver behind the dramatic increase in the utilization of biocatalysis is sustainable development and the shift toward a bioeconomy that will, in accordance with the UN policy agenda "Bioeconomy to 2030," become a global effort in the near future. Streptomyces spp. already play a significant role among industrial microorganisms. The intention of this minireview is to highlight the presence of Streptomyces in the biocatalysis toolbox and to give an overview of the most important advances in novel biocatalyst discovery and applications. Judging by the steady increase in the number of recent references (228 for the 2000-2017 period), it is clear that biocatalysts from Streptomyces spp. hold promise in terms of valuable properties and applicative industrial potential.
Broadview Radar Altimetry Toolbox
NASA Astrophysics Data System (ADS)
Escolà, Roger; Garcia-Mondejar, Albert; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Naeije, Marc; Ambrozio, Americo; Restano, Marco; Benveniste, Jérôme
2016-04-01
The universal altimetry toolbox BRAT (Broadview Radar Altimetry Toolbox), which can read data from all previous and current altimetry missions, now incorporates the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. The project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Etudes Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or from C/C++/Fortran via a programming API, allowing users to obtain the desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT, involving combinations of data fields that the user can save for later reuse, or using the already embedded formulas, which include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a thorough introduction to altimetry, shows its applications in different fields such as oceanography, cryosphere, geodesy and hydrology, among others. Also included are "use cases" with step-by-step examples of how to use the toolbox in the different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version.
The Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. While developing the new toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry, in order to train users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open-source framework, contributions from users who have developed their own functions are welcome. The first release of the new Radar Altimetry Toolbox was published in September 2015; it incorporates the capability to read S3 products as well as the new CryoSat-2 Baseline C. The second release, planned for March 2016, will have a new graphical user interface and some visualisation improvements. The third release, planned for September 2016, will incorporate new datasets, such as the lakes and rivers products and the reprocessed EnviSat data, along with new features for data interpolation and formula updates.
Broadview Radar Altimetry Toolbox
NASA Astrophysics Data System (ADS)
Mondéjar, Albert; Benveniste, Jérôme; Naeije, Marc; Escolà, Roger; Moyano, Gorka; Roca, Mònica; Terra-Homem, Miguel; Friaças, Ana; Martinho, Fernando; Schrama, Ernst; Ambrózio, Américo; Restano, Marco
2016-07-01
The universal altimetry toolbox BRAT (Broadview Radar Altimetry Toolbox), which can read data from all previous and current altimetry missions, now incorporates the capability to read the upcoming Sentinel-3 L1 and L2 products. ESA endeavoured to develop and supply this capability to support users of the future Sentinel-3 SAR Altimetry Mission. BRAT is a collection of tools and tutorial documents designed to facilitate the processing of radar altimetry data. The project started in 2005 from the joint efforts of ESA (European Space Agency) and CNES (Centre National d'Études Spatiales), and it is freely available at http://earth.esa.int/brat. The tools enable users to interact with the most common altimetry data formats. The BratGUI is the front-end for the powerful command-line tools that are part of the BRAT suite. BRAT can also be used in conjunction with MATLAB/IDL (via reading routines) or from C/C++/Fortran via a programming API, allowing users to obtain the desired data while bypassing the data-formatting hassle. BRAT can be used simply to visualise data quickly, or to translate data into other formats such as NetCDF, ASCII text files, KML (Google Earth) and raster images (JPEG, PNG, etc.). Several kinds of computations can be done within BRAT, involving combinations of data fields that the user can save for later reuse, or using the already embedded formulas, which include the standard oceanographic altimetry formulas. The Radar Altimeter Tutorial, which contains a thorough introduction to altimetry, shows its applications in different fields such as oceanography, cryosphere, geodesy and hydrology, among others. Also included are "use cases" with step-by-step examples of how to use the toolbox in the different contexts. The Sentinel-3 SAR Altimetry Toolbox shall benefit from the current BRAT version.
The Broadview Radar Altimetry Toolbox is a continuation of the Basic Radar Altimetry Toolbox. While developing the new toolbox we will revamp the Graphical User Interface and provide, among other enhancements, support for reading the upcoming S3 datasets and specific "use cases" for SAR altimetry, in order to train users and make them aware of the great potential of SAR altimetry for coastal and inland applications. As for any open-source framework, contributions from users who have developed their own functions are welcome. The first release of the new Radar Altimetry Toolbox was published in September 2015; it incorporates the capability to read S3 products as well as the new CryoSat-2 Baseline C. The second release, planned for March 2016, will have a new graphical user interface and some visualisation improvements. The third release, planned for September 2016, will incorporate new datasets, such as the lakes and rivers products and the reprocessed EnviSat data, along with new features for data interpolation and formula updates.
DOT National Transportation Integrated Search
2017-02-01
As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...
NASA Astrophysics Data System (ADS)
Jandt, Simon; Laagemaa, Priidik; Janssen, Frank
2014-05-01
The systematic and objective comparison between output from a numerical ocean model and a set of observations, called validation in the context of this presentation, is a beneficial activity at several stages, from early steps in model development to the quality control of model-based products delivered to customers. Even though the importance of this kind of validation work is widely acknowledged, it is often not among the most popular tasks in ocean modelling. In order to ease the validation work, a comprehensive toolbox has been developed in the framework of the MyOcean-2 project. The objective of this toolbox is to carry out validation integrating different data sources, e.g. time series at stations, vertical profiles, surface fields or along-track satellite data, with one single program call. The validation toolbox, implemented in MATLAB, covers all parts of the validation process, ranging from read-in procedures for the datasets to graphical and numerical output of the statistical metrics of the comparison. The basic idea is to have only one well-defined validation schedule for all applications, in which all parts of the validation process are executed. Each part, e.g. the read-in procedures, forms a module in which all available functions of that particular part are collected. The interface between the functions, the modules and the validation schedule is highly standardized. Functions of a module are set up for certain validation tasks, and new functions can be implemented into the appropriate module without affecting the functionality of the toolbox. The functions are assigned to each validation task in user-specific settings, which are stored externally in so-called namelists and gather all information about the datasets used, as well as paths and metadata. In the framework of the MyOcean-2 project the toolbox is frequently used to validate the forecast products of the Baltic Sea Marine Forecasting Centre.
In this way the performance of any new product version is compared with that of the previous version. Although the toolbox has so far been tested mainly for the Baltic Sea, it can easily be adapted to different datasets and parameters, regardless of the geographic region. In this presentation the usability of the toolbox is demonstrated, along with several results of the validation process.
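The one-schedule/many-modules design can be sketched as function registries wired together by a namelist-style settings dictionary: readers and metrics plug into their modules, and a single `validate` call executes the whole schedule. All names and the namelist fields are illustrative assumptions, not the MATLAB toolbox's actual interface.

```python
import numpy as np

# Module registries: new functions plug in without touching the schedule.
READERS, METRICS = {}, {}

def reader(name):
    def deco(fn):
        READERS[name] = fn
        return fn
    return deco

def metric(name):
    def deco(fn):
        METRICS[name] = fn
        return fn
    return deco

@reader("station_timeseries")
def read_station(cfg):
    # Stand-in for reading model output and observations from files.
    return np.array(cfg["model"]), np.array(cfg["obs"])

@metric("bias")
def bias(m, o):
    return float(np.mean(m - o))

@metric("rmse")
def rmse(m, o):
    return float(np.sqrt(np.mean((m - o) ** 2)))

def validate(namelist):
    """One well-defined schedule: read the data, then compute every
    requested metric -- a single call covers the whole validation."""
    m, o = READERS[namelist["reader"]](namelist)
    return {k: METRICS[k](m, o) for k in namelist["metrics"]}

namelist = {"reader": "station_timeseries", "metrics": ["bias", "rmse"],
            "model": [1.0, 2.0, 4.0], "obs": [1.0, 2.0, 2.0]}
report = validate(namelist)
```

Adding a new data source or metric means registering one function; the schedule itself never changes, which is the standardization the abstract describes.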
3. How comprehensive can we be in the economic assessment of vaccines?
2017-01-01
ABSTRACT In two previous papers we argued that the current economic assessment of vaccines is not fully comprehensive when using the incremental cost-utility analysis normally applied to treatments. Many differences exist between vaccines and drug treatments, making the economic evaluation of vaccines more cumbersome. Four challenges overwhelmingly present in vaccine assessment are less important for treatments: requirements for population, societal perspectives, budget impact evaluation, and time-focused objectives (control or elimination). Consequently, economic analyses of vaccines may need to be presented to many different stakeholders with various evaluation preferences, in addition to the stakeholders currently involved in drug treatment assessment. We may therefore need a tool that makes the inventory of the different vaccine health-economic assessment programmes more comprehensive. The cauliflower value toolbox was developed with that aim, and its use is illustrated here with rotavirus vaccine. Given the broader perspectives for vaccine assessment, it provides better value and cost evaluations. Cost-benefit analysis may be the preferred economic assessment method when considering substitution from treatment to active medical prevention. Other economic evaluation methods (e.g. optimisation modelling, return on investment) can be selected when project prioritisation is the main focus and when stakeholders would like to influence the development of the healthcare programme. PMID:29785253
Resource acquisition, distribution and end-use efficiencies and the growth of industrial society
NASA Astrophysics Data System (ADS)
Jarvis, A. J.; Jarvis, S. J.; Hewitt, C. N.
2015-10-01
A key feature of the growth of industrial society is the acquisition of increasing quantities of resources from the environment and their distribution for end-use. With respect to energy, the growth of industrial society appears to have been near-exponential for the last 160 years. We provide evidence that indicates that the global distribution of resources that underpins this growth may be facilitated by the continual development and expansion of near-optimal directed networks (roads, railways, flight paths, pipelines, cables etc.). However, despite this continual striving for optimisation, the distribution efficiencies of these networks must decline over time as they expand due to path lengths becoming longer and more tortuous. Therefore, to maintain long-term exponential growth the physical limits placed on the distribution networks appear to be counteracted by innovations deployed elsewhere in the system, namely at the points of acquisition and end-use of resources. We postulate that the maintenance of the growth of industrial society, as measured by global energy use, at the observed rate of ~ 2.4 % yr-1 stems from an implicit desire to optimise patterns of energy use over human working lifetimes.
Prediction of road traffic death rate using neural networks optimised by genetic algorithm.
Jafari, Seyed Ali; Jahandideh, Sepideh; Jahandideh, Mina; Asadabadi, Ebrahim Barzegari
2015-01-01
Road traffic injuries (RTIs) are recognised as a major cause of public health problems at the global, regional and national levels, so prediction of the road traffic death rate is helpful for its management. Based on this, we used an artificial neural network model optimised through a genetic algorithm to predict mortality. In this study, a five-fold cross-validation procedure on a data set containing a total of 178 countries was used to verify the performance of the models. The best-fit model was selected according to the root mean square error (RMSE). The genetic algorithm, which has not previously been applied to mortality prediction to this extent, showed high performance: the lowest RMSE obtained was 0.0808. These satisfactory results can be attributed to the use of the genetic algorithm as a powerful optimiser that selects the best set of input features to be fed into the neural networks. Seven factors were identified with high accuracy as having the strongest effect on the road traffic mortality rate. The results show that our model is very promising and may play a useful role in developing a better method for assessing the influence of road traffic mortality risk factors.
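GA-based input-feature selection can be sketched as evolution over binary feature masks with tournament selection, uniform crossover and bit-flip mutation. To keep the sketch short, a least-squares linear model on synthetic data stands in for the paper's neural network; all parameters and data here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(mask, X, y, penalty=0.05):
    """Training RMSE of a least-squares fit on the selected inputs, plus a
    small complexity penalty per feature.  A linear model stands in for the
    paper's neural network (illustrative assumption)."""
    if not mask.any():
        return np.inf
    A = np.column_stack([X[:, mask], np.ones(len(X))])
    coef = np.linalg.lstsq(A, y, rcond=None)[0]
    return float(np.sqrt(np.mean((A @ coef - y) ** 2))) + penalty * mask.sum()

def ga_select(X, y, pop=30, gens=40, p_mut=0.1):
    """Genetic algorithm over binary feature masks: binary-tournament
    selection, uniform crossover, bit-flip mutation."""
    n = X.shape[1]
    P = rng.random((pop, n)) < 0.5
    for _ in range(gens):
        f = np.array([fitness(m, X, y) for m in P])
        def pick():                               # binary tournament
            i, j = rng.integers(pop, size=2)
            return P[i] if f[i] < f[j] else P[j]
        P = np.array([np.where(rng.random(n) < 0.5, pick(), pick())  # crossover
                      ^ (rng.random(n) < p_mut)                      # mutation
                      for _ in range(pop)])
    f = np.array([fitness(m, X, y) for m in P])
    return P[np.argmin(f)]

# Synthetic data: the target depends only on features 0 and 2.
X = rng.normal(size=(200, 6))
y = 2.0 * X[:, 0] - X[:, 2] + 0.1 * rng.normal(size=200)
best = ga_select(X, y)
```

The complexity penalty matters: without it, training error alone would always favour keeping every feature, and the GA would have nothing to prune.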
Planetary Geologic Mapping Python Toolbox: A Suite of Tools to Support Mapping Workflows
NASA Astrophysics Data System (ADS)
Hunter, M. A.; Skinner, J. A.; Hare, T. M.; Fortezzo, C. M.
2017-06-01
The collective focus of the Planetary Geologic Mapping Python Toolbox is to provide researchers with additional means to migrate legacy GIS data, assess the quality of data and analysis results, and simplify common mapping tasks.
SSOAP Toolbox Enhancements and Case Study
Recognizing the need for tools to support the development of sanitary sewer overflow (SSO) control plans, in October 2009 the U.S. Environmental Protection Agency (EPA) released the first version of the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox. This first ve...
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei (OA)
2014-01-01
A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This presentation describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this presentation is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture.
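The role of the numerical iterative solver in closing a thermodynamic component loop can be sketched with a Newton iteration: the solver adjusts the free states until the inter-component residuals (flow continuity, power balance) vanish. The component "maps" below are made-up algebraic stand-ins, and the solver is a generic sketch, not T-MATS's actual implementation.

```python
import numpy as np

def newton_solve(residuals, x0, tol=1e-9, max_iter=50):
    """Newton iteration with a finite-difference Jacobian -- the kind of
    solver that closes the loop between component models (sketch only)."""
    x = np.asarray(x0, float)
    for _ in range(max_iter):
        r = residuals(x)
        if np.max(np.abs(r)) < tol:
            break
        J = np.empty((len(r), len(x)))
        for j in range(len(x)):                  # finite-difference Jacobian
            dx = np.zeros_like(x)
            dx[j] = 1e-6 * max(1.0, abs(x[j]))
            J[:, j] = (residuals(x + dx) - r) / dx[j]
        x = x - np.linalg.solve(J, r)
    return x

# Toy flow balance: find shaft speed N and pressure ratio PR such that the
# compressor's delivered flow matches the nozzle's swallowed flow, with a
# toy power-balance constraint (made-up algebraic component maps).
def residuals(v):
    N, PR = v
    w_comp = 0.8 * N - 0.1 * PR          # compressor flow "map"
    w_nozz = 0.5 * PR                    # nozzle flow "map"
    return np.array([w_comp - w_nozz,    # continuity
                     N ** 2 - 2.0 * PR]) # toy power balance

sol = newton_solve(residuals, x0=[3.0, 3.0])
```

In a real engine model the residual vector would couple many components (compressor, burner, turbine, nozzle) through shared flow, speed and energy states, but the solver structure is the same.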
An Educational Model for Hands-On Hydrology Education
NASA Astrophysics Data System (ADS)
AghaKouchak, A.; Nakhjiri, N.; Habib, E. H.
2014-12-01
This presentation provides an overview of a hands-on modeling tool developed for students in civil engineering and earth science disciplines to help them learn the fundamentals of hydrologic processes, model calibration, sensitivity analysis, and uncertainty assessment, and to practice conceptual thinking in solving engineering problems. The toolbox includes two simplified hydrologic models, namely HBV-EDU and HBV-Ensemble, designed as a complement to theoretical hydrology lectures. The models provide an interdisciplinary, application-oriented learning environment that introduces hydrologic phenomena through the use of a simplified conceptual hydrologic model. The toolbox can be used for in-class lab practices, homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching more advanced topics, including uncertainty analysis and ensemble simulation. Both models have been used in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of hydrology.
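HBV-EDU is distributed as MATLAB code and its exact equations are not reproduced here; the single-bucket conceptual idea behind such teaching models can be sketched (with illustrative parameter values, not the toolbox's own) as:

```python
def bucket_model(precip, pet, capacity=100.0, k=0.1, s0=0.0):
    """Single linear-reservoir rainfall-runoff model.

    precip, pet : daily precipitation and potential evapotranspiration (mm)
    capacity    : soil storage capacity (mm); excess becomes direct runoff
    k           : recession coefficient of the linear reservoir (1/day)
    """
    storage, runoff = s0, []
    for p, e in zip(precip, pet):
        storage += p                   # rainfall fills the bucket
        et = min(e, storage)           # actual ET limited by available water
        storage -= et
        excess = max(0.0, storage - capacity)
        storage -= excess              # saturation-excess direct runoff
        q = k * storage                # linear-reservoir baseflow
        storage -= q
        runoff.append(q + excess)
    return runoff, storage

# Ten identical wet days: 20 mm of rain and 5 mm of potential ET per day.
q, s = bucket_model(precip=[20.0] * 10, pet=[5.0] * 10)
```

Because every flux is removed from the same store, total rainfall equals total ET plus total runoff plus final storage, which is exactly the water-balance check such exercises ask students to verify.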
Emotion assessment using the NIH Toolbox
Butt, Zeeshan; Pilkonis, Paul A.; Cyranowski, Jill M.; Zill, Nicholas; Hendrie, Hugh C.; Kupst, Mary Jo; Kelly, Morgen A. R.; Bode, Rita K.; Choi, Seung W.; Lai, Jin-Shei; Griffith, James W.; Stoney, Catherine M.; Brouwers, Pim; Knox, Sarah S.; Cella, David
2013-01-01
One of the goals of the NIH Toolbox for Assessment of Neurological and Behavioral Function was to identify or develop brief measures of emotion for use in prospective epidemiologic and clinical research. Emotional health has significant links to physical health and exerts a powerful effect on perceptions of life quality. Based on an extensive literature review and expert input, the Emotion team identified 4 central subdomains: Negative Affect, Psychological Well-Being, Stress and Self-Efficacy, and Social Relationships. A subsequent psychometric review identified several existing self-report and proxy measures of these subdomains with measurement characteristics that met the NIH Toolbox criteria. In cases where adequate measures did not exist, robust item banks were developed to assess concepts of interest. A population-weighted sample was recruited by an online survey panel to provide initial item calibration and measure validation data. Participants aged 8 to 85 years completed self-report measures whereas parents/guardians responded for children aged 3 to 12 years. Data were analyzed using a combination of classic test theory and item response theory methods, yielding efficient measures of emotional health concepts. An overview of the development of the NIH Toolbox Emotion battery is presented along with preliminary results. Norming activities led to further refinement of the battery, thus enhancing the robustness of emotional health measurement for researchers using the NIH Toolbox. PMID:23479549
Wyrm: A Brain-Computer Interface Toolbox in Python.
Venthur, Bastian; Dähne, Sven; Höhne, Johannes; Heller, Hendrik; Blankertz, Benjamin
2015-10-01
In recent years, Python has gained increasing traction in the scientific community. Projects like NumPy, SciPy, and Matplotlib have created a strong foundation for scientific computing in Python, and machine learning packages like scikit-learn or data analysis packages like Pandas are building on top of it. In this paper we present Wyrm ( https://github.com/bbci/wyrm ), an open source BCI toolbox in Python. Wyrm is applicable to a broad range of neuroscientific problems. It can be used as a toolbox for analysis and visualization of neurophysiological data and in real-time settings, such as an online BCI application. In order to prevent software defects, Wyrm makes extensive use of unit testing. We explain the key aspects of Wyrm's software architecture and design decisions for its data structure, and demonstrate and validate the use of our toolbox by presenting our approach to the classification tasks of two different data sets from the BCI Competition III. Furthermore, we give a brief analysis of the data sets using our toolbox, and demonstrate how we implemented an online experiment using Wyrm. With Wyrm we add the final piece to our ongoing effort to provide a complete, free and open source BCI system in Python.
Modelling multi-pulse population dynamics from ultrafast spectroscopy.
van Wilderen, Luuk J G W; Lincoln, Craig N; van Thor, Jasper J
2011-03-21
Current advanced laser, optics and electronics technology allows sensitive recording of molecular dynamics, from single-resonance to multi-colour and multi-pulse experiments. Extracting the underlying (bio)physically relevant pathways via global analysis of experimental data requires a systematic investigation of connectivity schemes. Here we present a Matlab-based toolbox for this purpose. The toolbox has a graphical user interface which facilitates the application of different reaction models to the data to generate the coupled differential equations. Any time-dependent dataset can be analysed to extract time-independent correlations of the observables by using gradient or direct search methods. Specific capabilities for the analysis of ultrafast pump-probe spectroscopic data (e.g. chirp and instrument response function) are included. The inclusion of an extra pulse that interacts with a transient phase can help to disentangle complex interdependent pathways. The modelling of pathways is therefore extended by new theory (included in the toolbox) that describes the finite bleach (orientation) effect of single and multiple intense polarised femtosecond pulses on an ensemble of randomly oriented particles in the presence of population decay. For instance, the generally assumed flat-top multimode beam profile is adapted to a more realistic Gaussian shape, exposing the need for several corrections for accurate anisotropy measurements. In addition, the (selective) excitation (photoselection) and anisotropy of populations that interact with single or multiple intense polarised laser pulses is demonstrated as a function of power density and beam profile. Using example values from real-world experiments, we calculate the extent to which this effectively orients the ensemble of particles. Finally, the implementation includes the interaction with multiple pulses in addition to depth averaging in optically dense samples.
In summary, we show that mathematical modelling is essential for resolving the details of the physical behaviour of populations in ultrafast spectroscopy, such as in pump-probe, pump-dump-probe and pump-repump-probe experiments.
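The toolbox generates coupled differential equations from a chosen connectivity scheme; as a hedged illustration (not the toolbox's own code), a sequential A → B → C photocycle model with assumed rate constants can be integrated with a simple forward-Euler sketch:

```python
def simulate_sequential(k1, k2, t_end=10.0, dt=1e-4):
    """Integrate dA/dt = -k1*A, dB/dt = k1*A - k2*B, dC/dt = k2*B."""
    a, b, c = 1.0, 0.0, 0.0          # all population starts in state A
    t = 0.0
    while t < t_end:
        da = -k1 * a                 # A decays into B
        db = k1 * a - k2 * b         # B is fed by A, decays into C
        dc = k2 * b                  # C is the terminal state
        a, b, c = a + da * dt, b + db * dt, c + dc * dt
        t += dt
    return a, b, c

# Illustrative rate constants (arbitrary units, not fitted values).
a, b, c = simulate_sequential(k1=2.0, k2=0.5)
```

Because the rate terms cancel pairwise, total population is conserved at every step, which is the sanity check one would apply before fitting such a scheme to transient-absorption data.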
NASA Astrophysics Data System (ADS)
De Clippele, L. H.; Gafeira, J.; Robert, K.; Hennige, S.; Lavaleye, M. S.; Duineveld, G. C. A.; Huvenne, V. A. I.; Roberts, J. M.
2017-03-01
Cold-water corals form substantial biogenic habitats on continental shelves and in deep-sea areas with topographic highs, such as banks and seamounts. In the Atlantic, many reef and mound complexes are engineered by Lophelia pertusa, the dominant framework-forming coral. In this study, a variety of mapping approaches were used at a range of scales to map the distribution of both cold-water coral habitats and individual coral colonies at the Mingulay Reef Complex (west Scotland). The new ArcGIS-based British Geological Survey (BGS) seabed mapping toolbox semi-automatically delineated over 500 Lophelia reef `mini-mounds' from bathymetry data with 2-m resolution. The morphometric and acoustic characteristics of the mini-mounds were also automatically quantified and captured using this toolbox. Coral presence data were derived from high-definition remotely operated vehicle (ROV) records and high-resolution microbathymetry collected by a ROV-mounted multibeam echosounder. With a resolution of 0.35 × 0.35 m, the microbathymetry covers 0.6 km2 in the centre of the study area and allowed identification of individual live coral colonies in acoustic data for the first time. Maximum water depth, maximum rugosity, mean rugosity, bathymetric positioning index and maximum current speed were identified as the environmental variables that contributed most to the prediction of live coral presence. These variables were used to create a predictive map of the likelihood of presence of live cold-water coral colonies in the area of the Mingulay Reef Complex covered by the 2-m resolution data set. Predictive maps of live corals across the reef will be especially valuable for future long-term monitoring surveys, including those needed to understand the impacts of global climate change. This is the first study using the newly developed BGS seabed mapping toolbox and an ROV-based microbathymetric grid to explore the environmental variables that control coral growth on cold-water coral reefs.
NASA Astrophysics Data System (ADS)
Christian, Paul M.
2002-07-01
This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provided a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed included its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in Part I included flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this series will take a more in-depth look at the analysis and simulation capability and provide an update on the toolbox enhancements. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).
Crosse, Michael J; Di Liberto, Giovanni M; Bednar, Adam; Lalor, Edmund C
2016-01-01
Understanding how brains process sensory signals in natural environments is one of the key goals of twenty-first century neuroscience. While brain imaging and invasive electrophysiology will play key roles in this endeavor, there is also an important role to be played by noninvasive, macroscopic techniques with high temporal resolution such as electro- and magnetoencephalography. But challenges exist in determining how best to analyze such complex, time-varying neural responses to complex, time-varying and multivariate natural sensory stimuli. There has been a long history of applying system identification techniques to relate the firing activity of neurons to complex sensory stimuli, and such techniques are now seeing increased application to EEG and MEG data. One particular example involves fitting a filter, often referred to as a temporal response function, that describes a mapping between some feature(s) of a sensory stimulus and the neural response. Here, we first briefly review the history of these system identification approaches and describe a specific technique for deriving temporal response functions known as regularized linear regression. We then introduce a new open-source toolbox for performing this analysis. We describe how it can be used to derive (multivariate) temporal response functions describing a mapping between stimulus and response in both directions. We also explain the importance of regularizing the analysis and how this regularization can be optimized for a particular dataset. We then outline specifically how the toolbox implements these analyses and provide several examples of the types of results that the toolbox can produce. Finally, we consider some of the limitations of the toolbox and opportunities for future development and application.
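The temporal response function analysis described above is implemented in MATLAB; a minimal numpy sketch of the core computation (a time-lagged design matrix plus ridge regression, with synthetic data and arbitrary lag and regularisation values) might look like:

```python
import numpy as np

def lag_matrix(stim, lags):
    """Build a design matrix whose columns are time-shifted copies of stim."""
    n = len(stim)
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        if lag >= 0:
            X[lag:, j] = stim[: n - lag]     # stimulus lagged by `lag` samples
        else:
            X[:lag, j] = stim[-lag:]         # negative lags (leading stimulus)
    return X

def fit_trf(stim, resp, lags, lam=1.0):
    """Ridge-regularised least squares: w = (X'X + lam*I)^-1 X'y."""
    X = lag_matrix(stim, lags)
    I = np.eye(X.shape[1])
    return np.linalg.solve(X.T @ X + lam * I, X.T @ resp)

# Synthetic example: the "neural response" is the stimulus delayed by two
# samples, so the recovered filter weight at lag 2 should dominate.
rng = np.random.default_rng(0)
stim = rng.standard_normal(500)
resp = np.roll(stim, 2)
resp[:2] = 0.0
w = fit_trf(stim, resp, lags=[0, 1, 2, 3], lam=1.0)
```

In practice the regularisation parameter `lam` is chosen by cross-validation rather than fixed, which is exactly the optimisation step the abstract emphasises.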
DICOM router: an open source toolbox for communication and correction of DICOM objects.
Hackländer, Thomas; Kleber, Klaus; Martin, Jens; Mertens, Heinrich
2005-03-01
Today, the exchange of medical images and clinical information is well defined by the digital imaging and communications in medicine (DICOM) and Health Level Seven (ie, HL7) standards. The interoperability among information systems is specified by the integration profiles of IHE (Integrating the Healthcare Enterprise). However, older imaging modalities frequently do not correctly support these interfaces and integration profiles, and some use cases are not yet specified by IHE. Therefore, corrections of DICOM objects are necessary to establish conformity. The aim of this project was to develop a toolbox that can automatically perform these recurrent corrections of DICOM objects. The toolbox is composed of three main components: 1) a receiver to receive DICOM objects, 2) a processing pipeline to correct each object, and 3) one or more senders to forward each corrected object to predefined addressees. The toolbox is implemented in Java as an open source project. The processing pipeline is realized by means of plug-ins. One of the plug-ins can be programmed by the user via an external eXtensible Stylesheet Language (ie, XSL) file. Using this plug-in, DICOM objects can also be converted into eXtensible Markup Language (ie, XML) documents or other data formats. DICOM storage services, DICOM CD-ROMs, and the local file system are defined as input and output channels. The toolbox is used clinically for different application areas: the automatic correction of DICOM objects from non-IHE-conforming modalities, the import of DICOM CD-ROMs into the picture archiving and communication system, and the pseudonymisation of DICOM images. The toolbox has been accepted by users in a clinical setting. Because of the open programming interfaces, the functionality can easily be adapted to future applications.
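The router is a Java application; the receiver / processing-pipeline / sender pattern it describes can be sketched in Python with hypothetical plug-in names (none of these class names or tag-handling details come from the toolbox itself):

```python
class TagFixPlugin:
    """Plug-in that repairs a (hypothetical) missing attribute."""
    def process(self, obj):
        fixed = dict(obj)
        fixed.setdefault("PatientID", "UNKNOWN")   # fill a missing tag
        return fixed

class PseudonymisePlugin:
    """Plug-in that replaces identifying data with a stable pseudonym."""
    def __init__(self):
        self._ids = {}
    def process(self, obj):
        fixed = dict(obj)
        name = fixed.get("PatientName", "")
        # assign one pseudonym per original name, numbered in arrival order
        fixed["PatientName"] = "Anon%03d" % self._ids.setdefault(name, len(self._ids) + 1)
        return fixed

class Router:
    """Receive objects, run each through the plug-in pipeline, forward to senders."""
    def __init__(self, plugins, senders):
        self.plugins, self.senders = plugins, senders
    def receive(self, obj):
        for plugin in self.plugins:
            obj = plugin.process(obj)
        for send in self.senders:
            send(obj)

outbox = []
router = Router([TagFixPlugin(), PseudonymisePlugin()], [outbox.append])
router.receive({"PatientName": "Doe^John", "Modality": "CR"})
```

The point of the pattern is that each correction is an isolated, swappable step, so site-specific fixes can be added without touching the receiver or sender code.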
Integrated system dynamics toolbox for water resources planning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reno, Marissa Devan; Passell, Howard David; Malczynski, Leonard A.
2006-12-01
Public mediated resource planning is quickly becoming the norm rather than the exception. Unfortunately, supporting tools are lacking that interactively engage the public in the decision-making process and integrate over the myriad values that influence water policy. In the pages of this report we document the first steps toward developing a specialized decision framework to meet this need; specifically, a modular and generic resource-planning "toolbox". The technical challenge lies in the integration of the disparate systems of hydrology, ecology, climate, demographics, economics, policy and law, each of which influences the supply and demand for water. Specifically, these systems, their associated processes, and most importantly the constitutive relations that link them must be identified, abstracted, and quantified. For this reason, the toolbox forms a collection of process modules and constitutive relations that the analyst can "swap" in and out to model the physical and social systems unique to their problem. This toolbox with all of its modules is developed within the common computational platform of system dynamics linked to a Geographical Information System (GIS). Development of this resource-planning toolbox represents an important foundational element of the proposed interagency center for Computer Aided Dispute Resolution (CADRe). The Center's mission is to manage water conflict through the application of computer-aided collaborative decision-making methods. The Center will promote the use of decision-support technologies within collaborative stakeholder processes to help stakeholders find common ground and create mutually beneficial water management solutions. The Center will also serve to develop new methods and technologies to help federal, state and local water managers find innovative and balanced solutions to the nation's most vexing water problems. The toolbox is an important step toward achieving the technology development goals of this center.
Focused Field Investigations for Sewer Condition Assessment with EPA SSOAP Toolbox
The Nation’s sanitary sewer infrastructure is aging, and is currently one of the top national water program priorities. The U.S. Environmental Protection Agency (EPA) developed the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox to assist communities in developing ...
The purpose of this toolbox is to help EPA Regional staff and their partners to take advantage of the efficiency and quality gains from the Resource Conservation and Recovery Act (RCRA) Facilities Investigation Remedy Selection Track (FIRST) approach.
Traffic analysis toolbox volume IX : work zone modeling and simulation, a guide for analysts
DOT National Transportation Integrated Search
2009-03-01
This document is the second volume in the FHWA Traffic Analysis Toolbox: Work Zone Analysis series. Whereas the first volume provides guidance to decision-makers at agencies and jurisdictions considering the role of analytical tools in work zone plan...
Shafqat-Abbasi, Hamdah; Kowalewski, Jacob M; Kiss, Alexa; Gong, Xiaowei; Hernandez-Varas, Pablo; Berge, Ulrich; Jafari-Mamaghani, Mehrdad; Lock, John G; Strömblad, Staffan
2016-01-01
Mesenchymal (lamellipodial) migration is heterogeneous, although whether this reflects progressive variability or discrete, 'switchable' migration modalities, remains unclear. We present an analytical toolbox, based on quantitative single-cell imaging data, to interrogate this heterogeneity. Integrating supervised behavioral classification with multivariate analyses of cell motion, membrane dynamics, cell-matrix adhesion status and F-actin organization, this toolbox here enables the detection and characterization of two quantitatively distinct mesenchymal migration modes, termed 'Continuous' and 'Discontinuous'. Quantitative mode comparisons reveal differences in cell motion, spatiotemporal coordination of membrane protrusion/retraction, and how cells within each mode reorganize with changed cell speed. These modes thus represent distinctive migratory strategies. Additional analyses illuminate the macromolecular- and cellular-scale effects of molecular targeting (fibronectin, talin, ROCK), including 'adaptive switching' between Continuous (favored at high adhesion/full contraction) and Discontinuous (low adhesion/inhibited contraction) modes. Overall, this analytical toolbox now facilitates the exploration of both spontaneous and adaptive heterogeneity in mesenchymal migration. DOI: http://dx.doi.org/10.7554/eLife.11384.001 PMID:26821527
A Toolbox to Improve Algorithms for Insulin-Dosing Decision Support
Donsa, K.; Plank, J.; Schaupp, L.; Mader, J. K.; Truskaller, T.; Tschapeller, B.; Höll, B.; Spat, S.; Pieber, T. R.
2014-01-01
Summary. Background: Standardized insulin order sets for subcutaneous basal-bolus insulin therapy are recommended by clinical guidelines for the inpatient management of diabetes. The algorithm-based GlucoTab system electronically assists health care personnel by supporting clinical workflow and providing insulin-dose suggestions. Objective: To develop a toolbox for improving clinical decision-support algorithms. Methods: The toolbox has three main components. 1) Data preparation: data from several heterogeneous sources are extracted, cleaned and stored in a uniform data format. 2) Simulation: the effects of algorithm modifications are estimated by simulating treatment workflows based on real data from clinical trials. 3) Analysis: algorithm performance is measured, analyzed and simulated using data from three clinical trials with a total of 166 patients. Results: Use of the toolbox led to algorithm improvements as well as the detection of potential individualized subgroup-specific algorithms. Conclusion: These results are a first step towards individualized algorithm modifications for specific patient subgroups. PMID:25024768
A Transcription Activator-Like Effector (TALE) Toolbox for Genome Engineering
Sanjana, Neville E.; Cong, Le; Zhou, Yang; Cunniff, Margaret M.; Feng, Guoping; Zhang, Feng
2013-01-01
Transcription activator-like effectors (TALEs) are a class of naturally occurring DNA binding proteins found in the plant pathogen Xanthomonas sp. The DNA binding domain of each TALE consists of tandem 34-amino acid repeat modules that can be rearranged according to a simple cipher to target new DNA sequences. Customized TALEs can be used for a wide variety of genome engineering applications, including transcriptional modulation and genome editing. Here we describe a toolbox for rapid construction of custom TALE transcription factors (TALE-TFs) and nucleases (TALENs) using a hierarchical ligation procedure. This toolbox facilitates affordable and rapid construction of custom TALE-TFs and TALENs within one week and can be easily scaled up to construct TALEs for multiple targets in parallel. We also provide details for testing the activity in mammalian cells of custom TALE-TFs and TALENs using, respectively, qRT-PCR and Surveyor nuclease. The TALE toolbox described here will enable a broad range of biological applications. PMID:22222791
Parsons, D.R.; Jackson, P.R.; Czuba, J.A.; Engel, F.L.; Rhoads, B.L.; Oberg, K.A.; Best, J.L.; Mueller, D.S.; Johnson, K.K.; Riley, J.D.
2013-01-01
The use of acoustic Doppler current profilers (ADCP) for discharge measurements and three-dimensional flow mapping has increased rapidly in recent years and has been primarily driven by advances in acoustic technology and signal processing. Recent research has developed a variety of methods for processing data obtained from a range of ADCP deployments and this paper builds on this progress by describing new software for processing and visualizing ADCP data collected along transects in rivers or other bodies of water. The new utility, the Velocity Mapping Toolbox (VMT), allows rapid processing (vector rotation, projection, averaging and smoothing), visualization (planform and cross-section vector and contouring), and analysis of a range of ADCP-derived datasets. The paper documents the data processing routines in the toolbox and presents a set of diverse examples that demonstrate its capabilities. The toolbox is applicable to the analysis of ADCP data collected in a wide range of aquatic environments and is made available as open-source code along with this publication.
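VMT itself is open-source MATLAB code; the vector-rotation step it performs when projecting east/north velocity components onto a cross-section's streamwise/transverse axes can be sketched as follows (the sign convention and values here are illustrative, not VMT's documented convention):

```python
import math

def rotate_velocity(v_east, v_north, section_azimuth_deg):
    """Project east/north velocity components onto a cross-section.

    section_azimuth_deg is the compass azimuth of the section's normal
    (the streamwise direction), measured clockwise from north.
    Returns (streamwise, transverse) components.
    """
    theta = math.radians(section_azimuth_deg)
    streamwise = v_east * math.sin(theta) + v_north * math.cos(theta)
    transverse = v_east * math.cos(theta) - v_north * math.sin(theta)
    return streamwise, transverse

# A purely eastward 1 m/s current through a section whose normal points
# east (azimuth 90 degrees) should be entirely streamwise.
s, t = rotate_velocity(1.0, 0.0, 90.0)
```

Averaging many such rotated ensembles onto a regular grid is then what produces the planform and cross-section vector maps the toolbox visualises.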
Aslan, Mikail; Davis, Jack B A; Johnston, Roy L
2016-03-07
The global optimisation of small bimetallic Pd-Co binary nanoalloys is systematically investigated using the Birmingham Cluster Genetic Algorithm (BCGA). The effects of size and composition on the structures, stability, and magnetic and electronic properties of Pd-Co binary nanoalloys, including binding energies, second finite difference energies and mixing energies, are discussed. A detailed analysis of Pd-Co structural motifs and segregation effects is also presented. The maximal mixing energy corresponds to Pd compositions for which the number of mixed Pd-Co bonds is maximised. Global minimum clusters are distinguished from transition states by vibrational frequency analysis. HOMO-LUMO gap, electric dipole moment and vibrational frequency analyses are performed to enable correlation with future experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fair, Jeanne M.
It is often said about infectious diseases that a “threat anywhere is a threat everywhere,” and the recent outbreaks of Ebola in West Africa and Zika virus in South America have proven that pathogens know no borders. Not only are they transboundary, pathogens do not discriminate in whom they infect. In addition to the natural increase in emerging zoonotic infectious diseases worldwide due to changing environmental conditions and globalization, the use of infectious diseases as warfare agents is a threat in today’s world. Early detection remains one of the best ways to prevent small outbreaks from becoming epidemics and pandemics. We find that accurate diagnosis, detection, and reporting of diseases are important components of mitigating outbreaks, and biosurveillance remains the top tool in our toolbox. And while vaccines have been important for controlling more common infectious virus diseases, they are less feasible for less common diseases, emerging pathogens, and rapidly evolving microbes. Furthermore, due to globalization and increased travel, emigration, and migration, biosurveillance is critical throughout the world, not just in pockets of more developed regions.
ERIC Educational Resources Information Center
Reinfried, Sibylle; Tempelmann, Sebastian
2014-01-01
This paper provides a video-based learning process study that investigates the kinds of mental models 13-year-old learners have of the atmospheric greenhouse effect and how these mental models change within a learning environment optimised with regard to instructional psychology. The objective of this explorative study was to observe and…
ElectroMagnetoEncephalography Software: Overview and Integration with Other EEG/MEG Toolboxes
Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus
2011-01-01
EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section. PMID:21577273
SBEToolbox: A Matlab Toolbox for Biological Network Analysis
Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J.
2013-01-01
We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases. PMID:24027418
EPA's Office of Research and Development and Office of Water/Water Security Division have jointly developed a Response Protocol Toolbox (RPTB) to address the complex, multi-faceted challenges of a water utility's planning and response to intentional contamination of drinking wate...
NASA Technical Reports Server (NTRS)
Jovic, Srboljub
2015-01-01
This document provides the software design description for two core software components, the LVC Gateway and the LVC Gateway Toolbox, and two participants, the LVC Gateway Data Logger and the SAA Processor (SaaProc).
DOT National Transportation Integrated Search
2016-10-01
The National Highway Traffic Safety Administration has just released a new resource for developing seat belt programs in the traffic safety community: Expanding the Seat Belt Program Toolbox: A Starter Kit for Trying New Program Ideas. Resea...
Focused Field Investigations for Sewer Condition Assessment with EPA SSOAP Toolbox - slides
The Nation’s sanitary sewer infrastructure is aging, and is currently one of the top national water program priorities. The U.S. Environmental Protection Agency (EPA) developed the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox to assist communities in developing S...
Motor assessment using the NIH Toolbox
Magasi, Susan; McCreath, Heather E.; Bohannon, Richard W.; Wang, Ying-Chih; Bubela, Deborah J.; Rymer, William Z.; Beaumont, Jennifer; Rine, Rose Marie; Lai, Jin-Shei; Gershon, Richard C.
2013-01-01
Motor function involves complex physiologic processes and requires the integration of multiple systems, including neuromuscular, musculoskeletal, and cardiopulmonary, and neural motor and sensory-perceptual systems. Motor-functional status is indicative of current physical health status, burden of disease, and long-term health outcomes, and is integrally related to daily functioning and quality of life. Given its importance to overall neurologic health and function, motor function was identified as a key domain for inclusion in the NIH Toolbox for Assessment of Neurological and Behavioral Function (NIH Toolbox). We engaged in a 3-stage developmental process to: 1) identify key subdomains and candidate measures for inclusion in the NIH Toolbox, 2) pretest candidate measures for feasibility across the age span of people aged 3 to 85 years, and 3) validate candidate measures against criterion measures in a sample of healthy individuals aged 3 to 85 years (n = 340). Based on extensive literature review and input from content experts, the 5 subdomains of dexterity, strength, balance, locomotion, and endurance were recommended for inclusion in the NIH Toolbox motor battery. Based on our validation testing, valid and reliable measures that are simultaneously low-cost and portable have been recommended to assess each subdomain, including the 9-hole peg board for dexterity, grip dynamometry for upper-extremity strength, standing balance test, 4-m walk test for gait speed, and a 2-minute walk test for endurance. PMID:23479547
MatTAP: A MATLAB toolbox for the control and analysis of movement synchronisation experiments.
Elliott, Mark T; Welchman, Andrew E; Wing, Alan M
2009-02-15
Investigating movement timing and synchronisation at the sub-second range relies on an experimental setup that has high temporal fidelity, is able to deliver output cues and can capture corresponding responses. Modern, multi-tasking operating systems make this increasingly challenging when using standard PC hardware and programming languages. This paper describes a new free suite of tools (available from http://www.snipurl.com/mattap) for use within the MATLAB programming environment, compatible with Microsoft Windows and a range of data acquisition hardware. The toolbox allows flexible generation of timing cues with high temporal accuracy, the capture and automatic storage of corresponding participant responses and an integrated analysis module for the rapid processing of results. A simple graphical user interface is used to navigate the toolbox and so can be operated easily by users not familiar with programming languages. However, it is also fully extensible and customisable, allowing adaptation for individual experiments and facilitating the addition of new modules in future releases. Here we discuss the relevance of the MatTAP (MATLAB Timing Analysis Package) toolbox to current timing experiments and compare its use to alternative methods. We validate the accuracy of the analysis module through comparison to manual observation methods and replicate a previous sensorimotor synchronisation experiment to demonstrate the versatility of the toolbox features demanded by such movement synchronisation paradigms.
Analysis and optimisation of a mixed fluid cascade (MFC) process
NASA Astrophysics Data System (ADS)
Ding, He; Sun, Heng; Sun, Shoujun; Chen, Cheng
2017-04-01
A mixed fluid cascade (MFC) process comprising three refrigeration cycles has great capacity for large-scale LNG production, which consumes a great amount of energy. Therefore, any performance enhancement of the liquefaction process will significantly reduce the energy consumption. The MFC process is simulated and analysed using the proprietary software Aspen HYSYS. The effects of feed gas pressure, LNG storage pressure, water-cooler outlet temperature, different pre-cooling regimes, and liquefaction and sub-cooling refrigerant composition on MFC performance are investigated and presented. The excellent numerical calculation capability and user-friendly interface of MATLAB™ are combined with the powerful thermo-physical property package of Aspen HYSYS, and a genetic algorithm is then invoked to optimise the MFC process globally. After optimisation, the unit power consumption can be reduced to 4.655 kW h/kmol or 4.366 kW h/kmol on condition that the compressor adiabatic efficiency is 80% or 85%, respectively. Additionally, to further improve the thermodynamic efficiency of the process, configuration optimisation is conducted for the MFC process and several configurations are established. By analysing heat transfer and thermodynamic performance, the configuration entailing a pre-cooling cycle with three pressure levels, liquefaction, and a sub-cooling cycle with one pressure level is identified as the most efficient and thus optimal: its unit power consumption is 4.205 kW h/kmol. Additionally, the mechanism responsible for the weak performance of the suggested liquefaction cycle configuration is identified as the unbalanced distribution of cold energy over the liquefaction temperature range.
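The global search described above can be illustrated with a minimal genetic algorithm. This is an illustrative Python sketch, not the paper's MATLAB/HYSYS implementation; the objective function is a hypothetical stand-in for the simulated unit power consumption, with 4.2 kW h/kmol as an arbitrary floor and four normalised composition variables.

```python
import random

def unit_power(x):
    """Toy stand-in for the simulated unit power consumption (kW h/kmol).
    In the paper this value comes from an Aspen HYSYS flowsheet run; here
    it is a simple convex function with a known minimum, for illustration."""
    return 4.2 + sum((xi - 0.5) ** 2 for xi in x)

def genetic_minimise(f, dim, pop_size=30, generations=60, seed=1):
    """Minimise f over [0, 1]^dim with truncation selection,
    one-point crossover and Gaussian point mutation (dim >= 2 assumed)."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)                      # rank population by objective
        parents = pop[: pop_size // 2]       # keep the better half (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim)      # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(dim)           # mutate one gene, clamp to [0, 1]
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return min(pop, key=f)

best = genetic_minimise(unit_power, dim=4)
```

In the actual workflow, evaluating the objective would mean running the Aspen HYSYS flowsheet for each candidate refrigerant composition; the selection, crossover and mutation loop is unchanged.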
An ethics toolbox for neurotechnology.
Farah, Martha J
2015-04-08
Advances in neurotechnology will raise new ethical dilemmas, to which scientists and the rest of society must respond. Here I present a "toolbox" of concepts to help us analyze these issues and communicate with each other about them across differences of ethical intuition. Copyright © 2015 Elsevier Inc. All rights reserved.
The triticeae toolbox: combining phenotype and genotype data to advance small-grains breeding
USDA-ARS?s Scientific Manuscript database
The Triticeae Toolbox (http://triticeaetoolbox.org; T3) is the database schema enabling plant breeders and researchers to combine, visualize, and interrogate the wealth of phenotype and genotype data generated by the Triticeae Coordinated Agricultural Project (TCAP). T3 enables users to define speci...
Wastewater Collection System Toolbox | Eliminating Sanitary ...
2017-04-10
Communities across the United States are working to find cost-effective, long-term approaches to managing their aging wastewater infrastructure and preventing the problems that lead to sanitary sewer overflows. The Toolbox is an effort by EPA New England to provide examples of programs and educational efforts from New England and beyond.
40 CFR 141.717 - Pre-filtration treatment toolbox components.
Code of Federal Regulations, 2011 CFR
2011-07-01
... surface water or GWUDI source. (c) Bank filtration. Systems receive Cryptosporidium treatment credit for... paragraph. Systems using bank filtration when they begin source water monitoring under § 141.701(a) must...
USDA-ARS?s Scientific Manuscript database
This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...
Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach
ERIC Educational Resources Information Center
Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…
Bock, I; Raveh-Amit, H; Losonczi, E; Carstea, A C; Feher, A; Mashayekhi, K; Matyas, S; Dinnyes, A; Pribenszky, C
2016-04-01
The efficiency of various assisted reproductive techniques can be improved by preconditioning the gametes and embryos with sublethal hydrostatic pressure treatment. However, the underlying molecular mechanism responsible for this protective effect remains unknown and requires further investigation. Here, we studied the effect of optimised hydrostatic pressure treatment on the global gene expression of mouse oocytes after embryonic genome activation. Based on a gene expression microarray analysis, a significant effect of treatment was observed in 4-cell embryos derived from treated oocytes, revealing a transcriptional footprint of hydrostatic pressure-affected genes. Functional analysis identified numerous genes involved in protein synthesis that were downregulated in 4-cell embryos in response to hydrostatic pressure treatment, suggesting that regulation of translation has a major role in optimised hydrostatic pressure-induced stress tolerance. We present a comprehensive microarray analysis and further delineate a potential mechanism responsible for the protective effect of hydrostatic pressure treatment.
Global preamplification simplifies targeted mRNA quantification
Kroneis, Thomas; Jonasson, Emma; Andersson, Daniel; Dolatabadi, Soheila; Ståhlberg, Anders
2017-01-01
The need to perform gene expression profiling using next generation sequencing and quantitative real-time PCR (qPCR) on small sample sizes and single cells is rapidly expanding. However, to analyse few molecules, preamplification is required. Here, we studied global and target-specific preamplification using 96 optimised qPCR assays. To evaluate the preamplification strategies, we monitored the reactions in real-time using SYBR Green I detection chemistry followed by melting curve analysis. Next, we compared yield and reproducibility of global preamplification to that of target-specific preamplification by qPCR using the same amount of total RNA. Global preamplification generated 9.3-fold lower yield and 1.6-fold lower reproducibility than target-specific preamplification. However, the performance of global preamplification is sufficient for most downstream applications and offers several advantages over target-specific preamplification. To demonstrate the potential of global preamplification we analysed the expression of 15 genes in 60 single cells. In conclusion, we show that global preamplification simplifies targeted gene expression profiling of small sample sizes by a flexible workflow. We outline the pros and cons for global preamplification compared to target-specific preamplification. PMID:28332609
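The fold differences quoted above relate directly to qPCR quantification cycles. The sketch below shows that standard relation in illustrative Python (assuming ideal 100% amplification efficiency, i.e. doubling per cycle); it is not code from the study.

```python
import math

def fold_change(cq_a, cq_b, efficiency=2.0):
    """Fold difference in starting material implied by a Cq difference,
    assuming a fixed amplification factor per cycle (2.0 = ideal)."""
    return efficiency ** (cq_b - cq_a)

# The reported 9.3-fold yield gap between global and target-specific
# preamplification corresponds to about log2(9.3) ~ 3.2 cycles:
extra_cycles = math.log(9.3, 2)
```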
Hosseini, S M Hadi; Hoeft, Fumiko; Kesler, Shelli R
2012-01-01
In recent years, graph theoretical analyses of neuroimaging data have increased our understanding of the organization of large-scale structural and functional brain networks. However, tools for the pipeline application of graph theory to the analysis of brain network topology are still lacking. In this report, we describe the development of a graph-analysis toolbox (GAT) that facilitates the analysis and comparison of structural and functional brain networks. GAT provides a graphical user interface (GUI) that facilitates construction and analysis of brain networks, comparison of regional and global topological properties between networks, analysis of network hubs and modules, and analysis of the resilience of the networks to random failure and targeted attacks. Area under a curve (AUC) and functional data analyses (FDA), in conjunction with permutation testing, are employed for testing differences in network topologies; these analyses are less sensitive to the thresholding process. We demonstrate the capabilities of GAT by investigating differences in the organization of regional gray-matter correlation networks in survivors of acute lymphoblastic leukemia (ALL) and healthy matched controls (CON). The results revealed an alteration in the small-world characteristics of the brain networks in the ALL survivors, an observation that confirms our hypothesis of widespread neurobiological injury in ALL survivors. Along with demonstrating the capabilities of GAT, this is the first report of altered large-scale structural brain networks in ALL survivors.
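Small-world characteristics of the kind GAT tests rest on simple graph quantities such as the local clustering coefficient. A self-contained Python sketch of that quantity on a hand-built toy network (illustrative only; GAT itself is a MATLAB GUI toolbox):

```python
def clustering_coefficient(adj, node):
    """Local clustering coefficient: the fraction of a node's neighbour
    pairs that are themselves connected. adj maps node -> set of neighbours."""
    nbrs = list(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

# Toy 5-node network: a triangle (a, b, c) plus a chain c-d-e.
adj = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"},
       "d": {"c", "e"}, "e": {"d"}}
```

High average clustering combined with short path lengths, relative to matched random networks, is the standard small-world signature that analyses like the one above feed into.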
Tools in Support of Planning for Weather and Climate Extremes
NASA Astrophysics Data System (ADS)
Done, J.; Bruyere, C. L.; Hauser, R.; Holland, G. J.; Tye, M. R.
2016-12-01
A major limitation to planning for weather and climate extremes is the lack of maintained and readily available tools that can provide robust and well-communicated predictions and advice on their impacts. The National Center for Atmospheric Research is facilitating a collaborative international program to develop and support such tools within its Capacity Center for Climate and Weather Extremes, aimed at improving community resilience planning and reducing weather and climate impacts. A Global Risk, Resilience and Impacts Toolbox is in development and will provide: a portable web-based interface to process work requests from a variety of users and locations; a sophisticated framework that enables specialized community tools to access a comprehensive database (public and private) of geo-located hazard, vulnerability, exposure, and loss data; a community development toolkit that enables and encourages community tool developments geared towards specific user management and planning needs; and comprehensive community support facilitated by NCAR utilizing tutorials and a help desk. A number of applications are in development, built on the latest climate science and in collaboration with private industry and local and state governments. Example applications will be described, including a hurricane damage tool developed in collaboration with the reinsurance sector and a weather management tool for the construction industry. These examples will serve as starting points to discuss the broader potential of the toolbox.
PDB2Graph: A toolbox for identifying critical amino acids map in proteins based on graph theory.
Niknam, Niloofar; Khakzad, Hamed; Arab, Seyed Shahriar; Naderi-Manesh, Hossein
2016-05-01
The integrative and cooperative nature of protein structure involves the assessment of topological and global features of its constituent parts. The network concept takes complete advantage of both of these properties in the analysis concomitantly. High compatibility with structural concepts and physicochemical properties, in addition to a remarkable simplification of the system, has made networks an ideal tool for exploring biological systems. There are numerous examples in which different protein structural and functional characteristics have been clarified by the network approach. Here, we present an interactive and user-friendly Matlab-based toolbox, PDB2Graph, devoted to protein structure network construction, visualization, and analysis. Moreover, PDB2Graph is an appropriate tool for identifying critical nodes involved in protein structural robustness and function based on centrality indices. It maps critical amino acids in protein networks and can greatly aid structural biologists in selecting proper amino acid candidates for manipulating protein structures in a more reasonable and rational manner. To introduce the capability and efficiency of PDB2Graph in detail, the structural modification of Calmodulin through allosteric binding of Ca(2+) is considered. In addition, a mutational analysis of three well-identified model proteins, Phage T4 lysozyme, Barnase and Ribonuclease HI, was performed to inspect the influence of mutating important central residues on protein activity. Copyright © 2016 Elsevier Ltd. All rights reserved.
Robust stability for uncertain stochastic fuzzy BAM neural networks with time-varying delays
NASA Astrophysics Data System (ADS)
Syed Ali, M.; Balasubramaniam, P.
2008-07-01
In this Letter, by utilizing a Lyapunov functional combined with the linear matrix inequality (LMI) approach, we analyze the global asymptotic stability of uncertain stochastic fuzzy Bidirectional Associative Memory (BAM) neural networks with time-varying delays represented by Takagi-Sugeno (TS) fuzzy models. A new class of uncertain stochastic fuzzy BAM neural networks with time-varying delays is studied and sufficient conditions are derived to obtain a conservative result in stochastic settings. The developed results are more general than those reported in the earlier literature. In addition, numerical examples are provided to illustrate the applicability of the results using the LMI toolbox in MATLAB.
Adding Remote Sensing Data Products to the Nutrient Management Decision Support Toolbox
NASA Technical Reports Server (NTRS)
Lehrter, John; Schaeffer, Blake; Hagy, Jim; Spiering, Bruce; Blonski, Slawek; Underwood, Lauren; Ellis, Chris
2011-01-01
Some of the primary issues that manifest from nutrient enrichment and eutrophication (Figure 1) may be observed from satellites. For example, remotely sensed estimates of chlorophyll a (chla), total suspended solids (TSS), and light attenuation (Kd) or water clarity, which are often associated with elevated nutrient inputs, are data products collected daily and globally for coastal systems from satellites such as NASA's MODIS (Figure 2). The objective of this project is to inform water quality decision making activities using remotely sensed water quality data. In particular, we seek to inform the development of numeric nutrient criteria. In this poster we demonstrate an approach for developing nutrient criteria based on remotely sensed chla.
Impact-oriented steering--the concept of NGO-IDEAs 'impact toolbox'.
2008-03-01
The NGO-IDEAs 'Impact Toolbox' has been developed with a group of NGOs all of which are active in the area of saving and credit in South India. This compilation of methods to apply in impact-oriented steering was devised by the executive staff of the Indian partner NGOs, also known as the Resource Persons, in 2006 and tested from late 2006 to early 2007. At first glance, the approach may appear to be highly specialised and difficult to transfer. However, in fact it follows principles that can be adapted for several NGOs in other countries and in other sectors. The following article presents the concept of the NGO-IDEAs 'Impact Toolbox'.
PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing.
Soranzo, Alessandro; Grassi, Massimo
2014-01-01
PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible as it comes with several signal generators and can be easily extended for any experiment.
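The transformed up-down family implemented by PSYCHOACOUSTICS can be sketched in a few lines. The following is an illustrative Python simulation of a 2-down/1-up staircase with a deterministic toy observer, not the toolbox's MATLAB code; all parameter names and values here are invented for the example.

```python
import random

def two_down_one_up(threshold, start=60.0, step=4.0, trials=200, seed=7):
    """Transformed up-down staircase (2-down/1-up), which converges on the
    ~70.7%-correct point of the psychometric function. The simulated
    listener responds correctly whenever the level is above `threshold`
    (a deterministic stand-in for a real observer)."""
    rng = random.Random(seed)          # kept for a future stochastic observer
    level, correct_in_row, track = start, 0, []
    for _ in range(trials):
        correct = level > threshold    # deterministic toy observer
        if correct:
            correct_in_row += 1
            if correct_in_row == 2:    # two correct in a row -> harder
                level -= step
                correct_in_row = 0
        else:
            level += step              # one wrong -> easier
            correct_in_row = 0
        track.append(level)
    return sum(track[-20:]) / 20       # mean of late trials ~ threshold

est = two_down_one_up(threshold=40.0)
```

With the deterministic observer the track simply oscillates just above the true threshold of 40, so the late-trial mean lands slightly above it; a real (probabilistic) observer would instead bracket the 70.7%-correct level.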
Modern CACSD using the Robust-Control Toolbox
NASA Technical Reports Server (NTRS)
Chiang, Richard Y.; Safonov, Michael G.
1989-01-01
The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools such as singular values and structured singular values, robust synthesis tools such as continuous/discrete H2/H-infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods, and a variety of robust model reduction tools such as Hankel approximation, balanced truncation and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H-infinity loop-shaping and large space structure model reduction.
On simulated annealing phase transitions in phylogeny reconstruction.
Strobl, Maximilian A R; Barker, Daniel
2016-08-01
Phylogeny reconstruction with global criteria is NP-complete or NP-hard, hence in general requires a heuristic search. We investigate the powerful, physically inspired, general-purpose heuristic simulated annealing, applied to phylogeny reconstruction. Simulated annealing mimics the physical process of annealing, where a liquid is gently cooled to form a crystal. During the search, periods of elevated specific heat occur, analogous to physical phase transitions. These simulated annealing phase transitions play a crucial role in the outcome of the search. Nevertheless, they have received comparably little attention, for phylogeny or other optimisation problems. We analyse simulated annealing phase transitions during searches for the optimal phylogenetic tree for 34 real-world multiple alignments. In the same way in which melting temperatures differ between materials, we observe distinct specific heat profiles for each input file. We propose this reflects differences in the search landscape and can serve as a measure for problem difficulty and for suitability of the algorithm's parameters. We discuss application in algorithmic optimisation and as a diagnostic to assess parameterisation before computationally costly, large phylogeny reconstructions are launched. Whilst the focus here lies on phylogeny reconstruction under maximum parsimony, it is plausible that our results are more widely applicable to optimisation procedures in science and industry. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
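The specific-heat diagnostic discussed above is cheap to record during any Metropolis-style anneal. Below is an illustrative Python sketch on a toy one-dimensional landscape, not the authors' phylogeny code; C(T) = var(E)/T^2 follows the standard statistical-mechanics analogy, and all parameter values are invented.

```python
import math
import random

def anneal(energy, neighbour, x0, t0=2.0, cooling=0.95,
           sweeps=60, samples=200, seed=3):
    """Simulated annealing that also records the specific heat
    C(T) = var(E) / T^2 at each temperature, the quantity whose
    peaks mark the search's phase transitions."""
    rng = random.Random(seed)
    x, t, profile = x0, t0, []
    for _ in range(sweeps):
        energies = []
        for _ in range(samples):
            y = neighbour(x, rng)
            de = energy(y) - energy(x)
            if de <= 0 or rng.random() < math.exp(-de / t):
                x = y                       # Metropolis acceptance rule
            energies.append(energy(x))
        mean = sum(energies) / samples
        var = sum((e - mean) ** 2 for e in energies) / samples
        profile.append((t, var / t ** 2))   # specific heat at this T
        t *= cooling                        # geometric cooling schedule
    return x, profile

# Toy landscape: a 1-D double well stands in for the tree-search landscape.
energy = lambda x: (x * x - 1.0) ** 2
neighbour = lambda x, rng: x + rng.gauss(0, 0.2)
best, profile = anneal(energy, neighbour, x0=3.0)
```

For phylogeny reconstruction, `energy` would be a tree's parsimony score and `neighbour` a tree rearrangement move; peaks in the recorded profile then flag the phase transitions the authors analyse.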
Policy Analysis for Sustainable Development: The Toolbox for the Environmental Social Scientist
ERIC Educational Resources Information Center
Runhaar, Hens; Dieperink, Carel; Driessen, Peter
2006-01-01
Purpose: The paper seeks to propose the basic competencies of environmental social scientists regarding policy analysis for sustainable development. The ultimate goal is to contribute to an improvement of educational programmes in higher education by suggesting a toolbox that should be integrated in the curriculum. Design/methodology/approach:…
DOT National Transportation Integrated Search
1998-12-01
As a part of the Small Urban and Rural ITS Study it conducted in 4 of its more rural regions, the New York State Department of Transportation has developed a compendium of systems, devices and strategies that can enhance safety, provide information, ...
The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...
The Psychometric Toolbox: An Excel Package for Use in Measurement and Psychometrics Courses
ERIC Educational Resources Information Center
Ferrando, Pere J.; Masip-Cabrera, Antoni; Navarro-González, David; Lorenzo-Seva, Urbano
2017-01-01
The Psychometric Toolbox (PT) is a user-friendly, non-commercial package mainly intended to be used for instructional purposes in introductory courses of educational and psychological measurement, psychometrics and statistics. The PT package is organized in six separate modules or sub-programs: Data preprocessor (descriptive analyses and data…
Toolbox or Adjustable Spanner? A Critical Comparison of Two Metaphors for Adaptive Decision Making
ERIC Educational Resources Information Center
Söllner, Anke; Bröder, Arndt
2016-01-01
For multiattribute decision tasks, different metaphors exist that describe the process of decision making and its adaptation to diverse problems and situations. Multiple strategy models (MSMs) assume that decision makers choose adaptively from a set of different strategies (toolbox metaphor), whereas evidence accumulation models (EAMs) hold that a…
FALCON: a toolbox for the fast contextualization of logical networks
De Landtsheer, Sébastien; Trairatphisan, Panuwat; Lucarelli, Philippe; Sauter, Thomas
2017-01-01
Motivation: Mathematical modelling of regulatory networks allows for the discovery of knowledge at the system level. However, existing modelling tools are often computation-heavy and do not offer intuitive ways to explore the model, to test hypotheses or to interpret the results biologically. Results: We have developed a computational approach to contextualize logical models of regulatory networks with biological measurements based on a probabilistic description of rule-based interactions between the different molecules. Here, we propose a Matlab toolbox, FALCON, to automatically and efficiently build and contextualize networks, which includes a pipeline for conducting parameter analysis, knockouts and easy and fast model investigation. The contextualized models can then provide qualitative and quantitative information about the network and suggest hypotheses about biological processes. Availability and implementation: FALCON is freely available for non-commercial users on GitHub under the GPLv3 licence. The toolbox, installation instructions, full documentation and test datasets are available at https://github.com/sysbiolux/FALCON. FALCON runs under Matlab (MathWorks) and requires the Optimization Toolbox. Contact: thomas.sauter@uni.lu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28673016
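The probabilistic description of rule-based interactions underlying such contextualization can be illustrated with continuous AND/OR gates over activation probabilities. This is a simplified Python sketch of the idea, not FALCON's MATLAB code, and the tiny A/B/C/D network is invented for the example.

```python
def gate_and(p_inputs):
    """Probability that all inputs are active (independence assumed)."""
    p = 1.0
    for pi in p_inputs:
        p *= pi
    return p

def gate_or(p_inputs):
    """Probability that at least one input is active
    (complement of 'all inactive')."""
    q = 1.0
    for pi in p_inputs:
        q *= (1.0 - pi)
    return 1.0 - q

# Tiny network: C is activated by A OR B; D requires both C AND A.
p = {"A": 0.9, "B": 0.5}
p["C"] = gate_or([p["A"], p["B"]])
p["D"] = gate_and([p["C"], p["A"]])
```

Contextualization then amounts to fitting the gate parameters so that the propagated probabilities match measured node activities.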
FALCON: a toolbox for the fast contextualization of logical networks.
De Landtsheer, Sébastien; Trairatphisan, Panuwat; Lucarelli, Philippe; Sauter, Thomas
2017-11-01
Mathematical modelling of regulatory networks allows for the discovery of knowledge at the system level. However, existing modelling tools are often computation-heavy and do not offer intuitive ways to explore the model, to test hypotheses or to interpret the results biologically. We have developed a computational approach to contextualize logical models of regulatory networks with biological measurements based on a probabilistic description of rule-based interactions between the different molecules. Here, we propose a Matlab toolbox, FALCON, to automatically and efficiently build and contextualize networks, which includes a pipeline for conducting parameter analysis, knockouts and easy and fast model investigation. The contextualized models could then provide qualitative and quantitative information about the network and suggest hypotheses about biological processes. FALCON is freely available for non-commercial users on GitHub under the GPLv3 licence. The toolbox, installation instructions, full documentation and test datasets are available at https://github.com/sysbiolux/FALCON. FALCON runs under Matlab (MathWorks) and requires the Optimization Toolbox. Contact: thomas.sauter@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
Tian, Xing; Poeppel, David; Huber, David E.
2011-01-01
The open-source toolbox “TopoToolbox” is a suite of functions that use sensor topography to calculate psychologically meaningful measures (similarity, magnitude, and timing) from multisensor event-related EEG and MEG data. Using a GUI and data visualization, TopoToolbox can be used to calculate and test the topographic similarity between different conditions (Tian and Huber, 2008). This topographic similarity indicates whether different conditions involve a different distribution of underlying neural sources. Furthermore, this similarity calculation can be applied at different time points to discover when a response pattern emerges (Tian and Poeppel, 2010). Because the topographic patterns are obtained separately for each individual, these patterns can be used to produce reliable measures of response magnitude that can be compared across individuals using conventional statistics (Davelaar et al., submitted; Huber et al., 2008). TopoToolbox can be freely downloaded. It runs under MATLAB (The MathWorks, Inc.) and supports user-defined data structures as well as standard EEG/MEG data import using EEGLAB (Delorme and Makeig, 2004). PMID:21577268
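The toolbox runs under MATLAB; as an illustrative stand-in (function names and data invented for the example, not TopoToolbox's API), the two core measures — topographic similarity between conditions and projection-based response magnitude — can be sketched in a few lines of Python:

```python
import math

def topo_similarity(p, q):
    """Cosine similarity ('angle test') between two sensor topographies;
    values near 1 suggest the same underlying source distribution."""
    dot = sum(x * y for x, y in zip(p, q))
    norm_p = math.sqrt(sum(x * x for x in p))
    norm_q = math.sqrt(sum(y * y for y in q))
    return dot / (norm_p * norm_q)

def projected_magnitude(data, template):
    """Response magnitude: projection of single-condition sensor data
    onto an individual's unit-normalised template pattern."""
    norm = math.sqrt(sum(t * t for t in template))
    return sum(d * t for d, t in zip(data, template)) / norm

# A scaled copy of a pattern has similarity 1 and twice the magnitude.
pattern = [0.2, -0.5, 0.1, 0.7]
scaled = [2 * x for x in pattern]
ratio = projected_magnitude(scaled, pattern) / projected_magnitude(pattern, pattern)
print(round(topo_similarity(pattern, scaled), 3), round(ratio, 3))
```

Because the projection is taken against each individual's own template, the resulting magnitudes are comparable across individuals with conventional statistics, as the abstract describes.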
NASA Astrophysics Data System (ADS)
Mishra, Deependra K.; Umbaugh, Scott E.; Lama, Norsang; Dahal, Rohini; Marino, Dominic J.; Sackman, Joseph
2016-09-01
CVIPtools is a software package for the exploration of computer vision and image processing developed in the Computer Vision and Image Processing Laboratory at Southern Illinois University Edwardsville. CVIPtools is available in three variants - a) CVIPtools Graphical User Interface, b) CVIPtools C library and c) CVIPtools MATLAB toolbox - which makes it accessible to a variety of different users. It offers students, faculty, researchers and any other user a free and easy way to explore computer vision and image processing techniques. Many functions have been implemented and are updated on a regular basis, and the library has reached a level of sophistication that makes it suitable for both educational and research purposes. In this paper, a detailed list of the functions available in the CVIPtools MATLAB toolbox is presented, along with how these functions can be used in image analysis and computer vision applications. The CVIPtools MATLAB toolbox allows the user to gain practical experience to better understand underlying theoretical problems in image processing and pattern recognition. As an example application, the algorithm for the automatic creation of masks for veterinary thermographic images is presented.
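The mask-creation example can be illustrated with a minimal sketch. The snippet below is a toy grey-level thresholding in Python, not CVIPtools' actual algorithm (which is more involved, e.g. with morphological clean-up of the thresholded result); it only shows the basic idea of turning a thermal image into a binary region-of-interest mask.

```python
# Toy sketch of automatic mask creation by grey-level thresholding
# (illustrative only; not the CVIPtools veterinary-thermography algorithm).

def make_mask(image, threshold):
    """Binary mask: 1 where the pixel is warmer than the threshold."""
    return [[1 if px > threshold else 0 for px in row] for row in image]

# Hypothetical 3x3 thermal image (degrees C); warm region in the corner.
thermal = [[20, 21, 35],
           [22, 36, 37],
           [19, 20, 21]]
print(make_mask(thermal, 30))
```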
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vinuesa, Ricardo; Fick, Lambert; Negi, Prabal
In the present document we describe a toolbox for the spectral-element code Nek5000, aimed at computing turbulence statistics. The toolbox is presented for a small test case, namely a square duct with Lx = 2h, Ly = 2h and Lz = 4h, where x, y and z are the horizontal, vertical and streamwise directions, respectively. The number of elements in the xy-plane is 16 × 16 = 256, and the number of elements in z is 4, leading to a total of 1,024 spectral elements. A polynomial order of N = 5 is chosen, and the mesh is generated using the Nek5000 tool genbox. The toolbox presented here allows the user to compute mean-velocity components, the Reynolds-stress tensor, as well as turbulent kinetic energy (TKE) and Reynolds-stress budgets. Note that the present toolbox supports turbulence statistics in turbulent flows with one homogeneous direction (where the statistics are based on time-averaging as well as averaging in the homogeneous direction), as well as in fully three-dimensional flows (with no periodic directions, where only time-averaging is considered).
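The averaging strategy the abstract describes — time-averaging combined with averaging over one homogeneous (streamwise) direction — can be sketched independently of Nek5000. The Python snippet below is illustrative only (toy data, not the toolbox's implementation): it forms the Reynolds decomposition and one off-diagonal Reynolds-stress component from velocity samples indexed by time and streamwise location.

```python
# Sketch of turbulence statistics with one homogeneous direction:
# average over time samples t AND streamwise locations z, then form
# fluctuations u' = u - U to get the Reynolds stress <u'v'>.

def mean_and_stress(u_samples, v_samples):
    """u_samples[t][z], v_samples[t][z]: velocity components at time t,
    streamwise location z. Returns (U, V, <u'v'>)."""
    n = sum(len(row) for row in u_samples)
    U = sum(map(sum, u_samples)) / n          # time + homogeneous average
    V = sum(map(sum, v_samples)) / n
    uv = sum((u - U) * (v - V)
             for ur, vr in zip(u_samples, v_samples)
             for u, v in zip(ur, vr)) / n
    return U, V, uv

# Toy samples: 2 time steps x 2 streamwise points.
stats = mean_and_stress([[1.0, 3.0], [1.0, 3.0]], [[2.0, 2.0], [0.0, 4.0]])
print(stats)
```

For a fully three-dimensional flow with no periodic direction, the inner averaging over z would be dropped and only the time average kept, as the abstract notes.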
Saint-Pierre, S
2012-01-01
Over the last few decades, the steady progress achieved in reducing planned exposures of both workers and the public has been admirable in the nuclear sector. However, the disproportionate focus on tiny public exposures and radioactive discharges associated with normal operations came at a high price, and the quasi-denial of a risk of major accident and related weaknesses in emergency preparedness and response came at an even higher price. Fukushima has unfortunately taught us that radiological protection (RP) for emergency and post-emergency situations can be much more than a simple evacuation that lasts 24-48 h, with people returning safely to their homes soon afterwards. On optimisation of emergency and post-emergency exposures, the only 'show in town' in terms of international RP policy improvements has been the issuance of the 2007 Recommendations of the International Commission on Radiological Protection (ICRP). However, no matter how genuine these improvements are, they have not been 'road tested' on the practical reality of severe accidents. Post-Fukushima, there is a compelling case to review the practical adequacy of key RP notions such as optimisation, evacuation, sheltering, and reference levels for workers and the public, and to amend these notions with a view to making the international RP system more useful in the event of a severe accident. On optimisation of planned exposures, the reality is that, nowadays, margins for further reductions of public doses in the nuclear sector are very small, and the smaller the dose, the greater the extra effort needed to reduce the dose further. If sufficient caution is not exercised in the use of RP notions such as dose constraints, there is a real risk of challenging nuclear power technologies beyond safety reasons. For nuclear new build, it is the optimisation of key operational parameters of nuclear power technologies (not RP) that is of paramount importance to improve their overall efficiency. 
In pursuing further improvements in the international RP system, it should be clearly borne in mind that the system is generally based on protection against the risk of cancer and hereditary diseases. The system also protects against deterministic non-cancer effects on tissues and organs. In seeking refinements of such protective notions, ICRP is invited to pay increased attention to the fact that a continued balance must be struck between beneficial activities that cause exposures and protection. The global nuclear industry is committed to help overcome these key RP issues as part of the RP community's upcoming international deliberations towards a more efficient international RP system. Copyright © 2012. Published by Elsevier Ltd.
Cross-species 3D virtual reality toolbox for visual and cognitive experiments.
Doucet, Guillaume; Gulli, Roberto A; Martinez-Trujillo, Julio C
2016-06-15
Although simplified visual stimuli, such as dots or gratings presented on homogeneous backgrounds, provide strict control over the stimulus parameters during visual experiments, they fail to approximate visual stimulation in natural conditions. Adoption of virtual reality (VR) in neuroscience research has been proposed to circumvent this problem, by combining strict control of experimental variables and behavioral monitoring within complex and realistic environments. We have created a VR toolbox that maximizes experimental flexibility while minimizing implementation costs. A free VR engine (Unreal 3) has been customized to interface with any control software via text commands, allowing seamless introduction into pre-existing laboratory data acquisition frameworks. Furthermore, control functions are provided for the two most common programming languages used in visual neuroscience: Matlab and Python. The toolbox offers the millisecond time resolution necessary for electrophysiological recordings and is flexible enough to support cross-species usage across a wide range of paradigms. Unlike previously proposed VR solutions whose implementation is complex and time-consuming, our toolbox requires minimal customization or technical expertise to interface with pre-existing data acquisition frameworks as it relies on already familiar programming environments. Moreover, as it is compatible with a variety of display and input devices, identical VR testing paradigms can be used across species, from rodents to humans. This toolbox facilitates the addition of VR capabilities to any laboratory without perturbing pre-existing data acquisition frameworks, or requiring any major hardware changes. Copyright © 2016. All rights reserved.
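The "text commands over a socket" integration style can be sketched generically. The command verb and field format below are hypothetical (the toolbox's actual protocol is not specified in the abstract); the loopback socket pair merely stands in for the control software on one end and the VR engine on the other.

```python
# Sketch of a text-command interface to a VR engine (command names and
# field syntax are hypothetical, not the toolbox's actual protocol).
import socket

def send_command(sock, verb, **params):
    """Format and send a newline-terminated command, e.g. 'MOVE x=1.0 y=2.0'."""
    line = verb + "".join(f" {k}={v}" for k, v in sorted(params.items()))
    sock.sendall(line.encode() + b"\n")

# Loopback pair standing in for control software (ctrl) and engine.
ctrl, engine = socket.socketpair()
send_command(ctrl, "MOVE", x=1.0, y=2.0)
received = engine.recv(64).decode().strip()
print(received)
ctrl.close(); engine.close()
```

Because the protocol is plain text, any environment that can write to a socket (MATLAB, Python, or a lab's existing acquisition software) can drive the engine without code changes on the engine side.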
A toolbox to visually explore cerebellar shape changes in cerebellar disease and dysfunction.
Abulnaga, S Mazdak; Yang, Zhen; Carass, Aaron; Kansal, Kalyani; Jedynak, Bruno M; Onyike, Chiadi U; Ying, Sarah H; Prince, Jerry L
2016-02-27
The cerebellum plays an important role in motor control and is also involved in cognitive processes. Cerebellar function is specialized by location, although the exact topographic functional relationship is not fully understood. The spinocerebellar ataxias are a group of neurodegenerative diseases that cause regional atrophy in the cerebellum, yielding distinct motor and cognitive problems. The ability to study the region-specific atrophy patterns can provide insight into the problem of relating cerebellar function to location. In an effort to study these structural change patterns, we developed a toolbox in MATLAB to provide researchers a unique way to visually explore the correlation between cerebellar lobule shape changes and function loss, with a rich set of visualization and analysis modules. In this paper, we outline the functions and highlight the utility of the toolbox. The toolbox takes as input landmark shape representations of subjects' cerebellar substructures. A principal component analysis is used for dimension reduction. Following this, a linear discriminant analysis and a regression analysis can be performed to find the discriminant direction associated with a specific disease type, or the regression line of a specific functional measure can be generated. The characteristic structural change pattern of a disease type or of a functional score is visualized by sampling points on the discriminant or regression line. The sampled points are used to reconstruct synthetic cerebellar lobule shapes. We showed a few case studies highlighting the utility of the toolbox and we compare the analysis results with the literature.
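The analysis chain the abstract outlines (PCA for dimension reduction, a discriminant or regression direction, then sampling along that line to reconstruct synthetic shapes) can be sketched on toy data. The snippet below is an illustrative Python stand-in, not the MATLAB toolbox itself: landmark "shapes" are random vectors with a simulated disease-related shift, and a two-class Fisher discriminant plays the role of the linear discriminant analysis.

```python
# Sketch of the toolbox's pipeline on toy data: PCA -> Fisher discriminant
# -> sample the discriminant line -> map samples back to shape space.
import numpy as np

rng = np.random.default_rng(0)
shapes = rng.normal(size=(20, 6))            # 20 subjects x 6 landmark coords
labels = np.array([0] * 10 + [1] * 10)       # healthy vs disease (simulated)
shapes[labels == 1] += 1.5                   # simulated atrophy-related shift

# PCA via SVD on centred data; keep 2 components.
mu = shapes.mean(axis=0)
U, S, Vt = np.linalg.svd(shapes - mu, full_matrices=False)
scores = (shapes - mu) @ Vt[:2].T

# Fisher discriminant direction w = Sw^-1 (m1 - m0) in PCA space.
m0 = scores[labels == 0].mean(axis=0)
m1 = scores[labels == 1].mean(axis=0)
Sw = np.cov(scores[labels == 0].T) + np.cov(scores[labels == 1].T)
w = np.linalg.solve(Sw, m1 - m0)
w /= np.linalg.norm(w)

# Sample points along the discriminant line and reconstruct synthetic
# shapes in the original landmark space (the toolbox visualises these).
synthetic = [mu + (t * w) @ Vt[:2] for t in np.linspace(-3, 3, 5)]
print(len(synthetic), synthetic[0].shape)
```

Walking along the sampled line from negative to positive t morphs the reconstructed shape along the characteristic structural change pattern, which is exactly what the visualization modules display.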
A toolbox to visually explore cerebellar shape changes in cerebellar disease and dysfunction
NASA Astrophysics Data System (ADS)
Abulnaga, S. Mazdak; Yang, Zhen; Carass, Aaron; Kansal, Kalyani; Jedynak, Bruno M.; Onyike, Chiadi U.; Ying, Sarah H.; Prince, Jerry L.
2016-03-01
The cerebellum plays an important role in motor control and is also involved in cognitive processes. Cerebellar function is specialized by location, although the exact topographic functional relationship is not fully understood. The spinocerebellar ataxias are a group of neurodegenerative diseases that cause regional atrophy in the cerebellum, yielding distinct motor and cognitive problems. The ability to study the region-specific atrophy patterns can provide insight into the problem of relating cerebellar function to location. In an effort to study these structural change patterns, we developed a toolbox in MATLAB to provide researchers a unique way to visually explore the correlation between cerebellar lobule shape changes and function loss, with a rich set of visualization and analysis modules. In this paper, we outline the functions and highlight the utility of the toolbox. The toolbox takes as input landmark shape representations of subjects' cerebellar substructures. A principal component analysis is used for dimension reduction. Following this, a linear discriminant analysis and a regression analysis can be performed to find the discriminant direction associated with a specific disease type, or the regression line of a specific functional measure can be generated. The characteristic structural change pattern of a disease type or of a functional score is visualized by sampling points on the discriminant or regression line. The sampled points are used to reconstruct synthetic cerebellar lobule shapes. We showed a few case studies highlighting the utility of the toolbox and we compare the analysis results with the literature.
NASA Astrophysics Data System (ADS)
Hansen, Akio; Ament, Felix; Lammert, Andrea
2017-04-01
Large-eddy simulations (LES) have been performed for several decades, but due to computational limits most studies were restricted to small domains or idealised initial and boundary conditions. Within the High Definition Clouds and Precipitation for Advancing Climate Prediction (HD(CP)2) project, realistic weather-forecasting-like LES simulations were performed with the newly developed ICON LES model for several days. The domain covers central Europe with a horizontal resolution down to 156 m. The setup consists of more than 3 billion grid cells, so that a single 3D dump requires roughly 500 GB. A newly developed online evaluation toolbox was created to check instantaneously whether the model simulations are realistic. The toolbox automatically combines model results with observations and generates quicklooks for various variables. So far temperature and humidity profiles, cloud cover, integrated water vapour, precipitation and many more are included. All kinds of observations, such as aircraft observations, soundings or precipitation radar networks, are used. For each dataset, a specific module is created, which allows for easy handling and enhancement of the toolbox. Most of the observations are automatically downloaded from the Standardized Atmospheric Measurement Database (SAMD). The evaluation tool should support scientists in monitoring computationally costly model simulations and give a first overview of the model's performance. The structure of the toolbox as well as the SAMD database are presented. Furthermore, the toolbox was applied to an ICON LES sensitivity study, for which example results are shown.
Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system
NASA Astrophysics Data System (ADS)
Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong
2011-06-01
Ladar system simulation uses computer simulation technology to model a ladar system in order to predict its performance. This paper reviews developments in laser imaging radar simulation in domestic and overseas studies, and surveys computer simulations of ladar systems with different application requirements. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. Domestic research in imaging ladar system simulation has so far been characterised by small simulation scale and non-unified designs and applications, mostly achieving simple functional simulation based on ranging equations. A laser imaging radar simulation with an open and modularised structure is therefore proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings and system controller. A unified Matlab toolbox and standard control modules have been built with regulated inputs and outputs of the functions, and with communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system observing a space shuttle was carried out with the toolbox. The simulation results show that the models and parameter settings of the Matlab toolbox are able to simulate the actual detection process precisely. The unified control module and pre-defined parameter settings simplify the simulation of imaging ladar detection, the open structure enables the toolbox to be modified for specialised requests, and the modularisation gives the simulations flexibility.
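The "unified modules with regulated inputs and outputs" design can be sketched generically. The module names, fields and numbers below are illustrative, not the toolbox's actual interfaces (which are MATLAB functions): each stage consumes and returns the same signal record, so the system controller can chain arbitrary module combinations.

```python
# Sketch of an open, modularised ladar simulation chain: every module
# takes and returns a signal dict with a regulated schema, so modules
# can be swapped or extended independently (all values are toy numbers).

def emitter(sig):
    sig["power_w"] = 10.0                     # laser output power
    return sig

def atmosphere(sig, transmittance=0.8):
    sig["power_w"] *= transmittance ** 2      # two-way path loss
    return sig

def target(sig, reflectivity=0.3):
    sig["power_w"] *= reflectivity            # target return
    return sig

def receiver(sig, gain=5.0):
    sig["power_w"] *= gain                    # e.g. ICCD gain modulation
    return sig

# System controller: chain the unified modules in order.
signal = {"power_w": 0.0}
for module in (emitter, atmosphere, target, receiver):
    signal = module(signal)
print(round(signal["power_w"], 3))
```

Swapping in a different atmosphere or receiver model only requires honoring the same input/output schema, which is the point of the regulated-interface design the paper proposes.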
Barlow, Paul M.; Cunningham, William L.; Zhai, Tong; Gray, Mark
2015-01-01
This report is a user guide for the streamflow-hydrograph analysis methods provided with version 1.0 of the U.S. Geological Survey (USGS) Groundwater Toolbox computer program. These include six hydrograph-separation methods to determine the groundwater-discharge (base-flow) and surface-runoff components of streamflow—the Base-Flow Index (BFI; Standard and Modified), HYSEP (Fixed Interval, Sliding Interval, and Local Minimum), and PART methods—and the RORA recession-curve displacement method and associated RECESS program to estimate groundwater recharge from streamflow data. The Groundwater Toolbox is a customized interface built on the nonproprietary, open source MapWindow geographic information system software. The program provides graphing, mapping, and analysis capabilities in a Microsoft Windows computing environment. In addition to the four hydrograph-analysis methods, the Groundwater Toolbox allows for the retrieval of hydrologic time-series data (streamflow, groundwater levels, and precipitation) from the USGS National Water Information System, downloading of a suite of preprocessed geographic information system coverages and meteorological data from the National Oceanic and Atmospheric Administration National Climatic Data Center, and analysis of data with several preprocessing and postprocessing utilities. With its data retrieval and analysis tools, the Groundwater Toolbox provides methods to estimate many of the components of the water budget for a hydrologic basin, including precipitation; streamflow; base flow; runoff; groundwater recharge; and total, groundwater, and near-surface evapotranspiration.
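The hydrograph-separation idea behind methods such as HYSEP (Fixed Interval) can be sketched in a few lines. The snippet below is a deliberately simplified Python illustration, not the Groundwater Toolbox implementation: in the real method the interval width is derived from the basin's drainage area, whereas here it is simply a user-supplied parameter.

```python
# Illustrative fixed-interval base-flow separation in the spirit of HYSEP
# (simplified; not the USGS Groundwater Toolbox code): the base-flow
# component within each n-day block is taken as that block's minimum flow.

def fixed_interval_baseflow(q, n):
    """q: daily streamflow series; n: interval width in days."""
    base = []
    for i in range(0, len(q), n):
        block = q[i:i + n]
        base += [min(block)] * len(block)     # block minimum = base flow
    return base

# Toy daily streamflow with one storm peak on days 3-4.
streamflow = [5, 7, 20, 60, 35, 18, 9, 6, 5, 5]
print(fixed_interval_baseflow(streamflow, 3))
```

Surface runoff for each day is then the difference between total streamflow and the separated base flow, which is how the water-budget components described above are assembled.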
Andrew, Marion; Barua, Reeta; Short, Steven M.; Kohn, Linda M.
2012-01-01
The Sclerotiniaceae (Ascomycotina, Leotiomycetes) is a relatively recently evolved lineage of necrotrophic host generalists, and necrotrophic or biotrophic host specialists, some latent or symptomless. We hypothesized that they inherited a basic toolbox of genes for plant symbiosis from their common ancestor. Maintenance and evolutionary diversification of symbiosis could require selection on toolbox genes or on timing and magnitude of gene expression. The genes studied were chosen because their products have been previously investigated as pathogenicity factors in the Sclerotiniaceae. They encode proteins associated with cell wall degradation: acid protease 1 (acp1), aspartyl protease (asps), and polygalacturonases (pg1, pg3, pg5, pg6), and the oxalic acid (OA) pathway: a zinc finger transcription factor (pac1), and oxaloacetate acetylhydrolase (oah), catalyst in OA production, essential for full symptom production in Sclerotinia sclerotiorum. Site-specific likelihood analyses provided evidence for purifying selection in all 8 pathogenicity-related genes. Consistent with an evolutionary arms race model, positive selection was detected in 5 of 8 genes. Only generalists produced large, proliferating disease lesions on excised Arabidopsis thaliana leaves and oxalic acid by 72 hours in vitro. In planta expression of oah was 10–300 times greater among the necrotrophic host generalists than necrotrophic and biotrophic host specialists; pac1 was not differentially expressed. The ability to amplify 6/8 pathogenicity-related genes and produce oxalic acid in all genera is consistent with the common toolbox hypothesis for this gene sample. That our data did not distinguish biotrophs from necrotrophs is consistent with 1) a common toolbox based on necrotrophy and 2) the most conservative interpretation of the 3-locus housekeeping gene phylogeny – a baseline of necrotrophy from which forms of biotrophy emerged at least twice.
Early oah overexpression likely expands the host range of necrotrophic generalists in the Sclerotiniaceae, while specialists and biotrophs deploy oah, or other as-yet-unknown toolbox genes, differently. PMID:22253834
The 'Toolbox' of strategies for managing Haemonchus contortus in goats: What's in and what's out.
Kearney, P E; Murray, P J; Hoy, J M; Hohenhaus, M; Kotze, A
2016-04-15
A dynamic and innovative approach to managing the blood-consuming nematode Haemonchus contortus in goats is critical to break dependence on veterinary anthelmintics. H. contortus management strategies have been the subject of intense research for decades, and must be selected to create a tailored, individualized program for goat farms. Through the selection and combination of strategies from the Toolbox, an effective management program for H. contortus can be designed according to the unique conditions of each particular farm. This Toolbox covers strategies including vaccines, bioactive forages, pasture/grazing management, behavioural management, natural immunity, FAMACHA, refugia and strategic drenching, mineral/vitamin supplementation, copper oxide wire particles (COWPs), breeding and selection of resistant and resilient individuals, biological control and anthelmintic drugs. Barbervax(®), the ground-breaking Haemonchus vaccine developed and currently commercially available on a pilot scale for sheep, is primed for trialling in goats and would be an invaluable inclusion in this Toolbox. The specialised behaviours of goats, specifically their preference to browse a variety of plants and the accompanying physiological adaptations to the consumption of secondary compounds contained in browse, have long been unappreciated and thus overlooked as a valuable, sustainable strategy for Haemonchus management. These strategies are discussed in this review as to their current value for inclusion in the 'Toolbox', along with the future implications of ongoing research for goat producers. Combining and manipulating strategies such as browsing behaviour, pasture management, bioactive forages, and identifying and treating individual animals for haemonchosis, together with continuous evaluation of strategy effectiveness, is demonstrated using a model farm scenario.
Selecting strategies from the Toolbox, with regard to their current availability, feasibility, economic cost and potential ease of implementation depending on the systems of production and their complementary nature, is the future of managing H. contortus in farmed goats internationally and of maintaining the remaining efficacy of veterinary anthelmintics. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Christian, Paul M.; Wells, Randy
2001-09-01
This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program; from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in this part include flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models by using the Mathworks Real Time Workshop and optimization tools.
It will also address how the Toolbox can be used as a design hub for Internet based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).
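The built-in Monte Carlo capability mentioned above can be illustrated with a deliberately tiny example. This is a toy 3-DOF-style point-mass calculation in Python, not an Aerospace Toolbox model: launch speed is dispersed with Gaussian noise and range statistics are collected, which is the basic pattern of a Monte Carlo dispersion study.

```python
# Toy Monte Carlo dispersion analysis (illustrative; not an Aerospace
# Toolbox model): disperse launch speed and collect range statistics
# for a simple vacuum ballistic trajectory.
import math
import random
import statistics

def ballistic_range(v, elev_deg, g=9.81):
    """Ideal (drag-free) range of a projectile on flat ground."""
    return v ** 2 * math.sin(2 * math.radians(elev_deg)) / g

random.seed(1)  # reproducible dispersion draws
# Nominal 100 m/s launch speed, 2 m/s (1-sigma) dispersion, 45 deg elevation.
ranges = [ballistic_range(random.gauss(100.0, 2.0), 45.0) for _ in range(2000)]
print(round(statistics.mean(ranges)), round(statistics.stdev(ranges)))
```

A real dispersion study would perturb many model parameters at once (mass properties, sensor errors, environment models) and examine full trajectory ensembles rather than a single scalar output.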
Toolbox for Urban Mobility Simulation: High Resolution Population Dynamics for Global Cities
NASA Astrophysics Data System (ADS)
Bhaduri, B. L.; Lu, W.; Liu, C.; Thakur, G.; Karthik, R.
2015-12-01
In this rapidly urbanizing world, the unprecedented rate of population growth is not only mirrored by increasing demand for energy, food, water, and other natural resources, but also has detrimental impacts on environmental and human security. Transportation simulations are frequently used for mobility assessment in urban planning, traffic operation, and emergency management. Previous research, ranging from purely analytical techniques to simulations capturing behavior, has investigated questions and scenarios regarding the relationships among energy, emissions, air quality, and transportation. Primary limitations of past attempts have been the availability of input data, useful "energy and behavior focused" models, validation data, and adequate computational capability that allows adequate understanding of the interdependencies of our transportation system. With the increasing availability and quality of traditional and crowdsourced data, we have utilized the OpenStreetMap road network and integrated high-resolution population data with traffic simulation to create a Toolbox for Urban Mobility Simulations (TUMS) at global scale. TUMS consists of three major components: data processing, traffic simulation models, and Internet-based visualizations. It integrates OpenStreetMap, LandScanTM population, and other open data (Census Transportation Planning Products, National Household Travel Survey, etc.) to generate both normal traffic operation and emergency evacuation scenarios. TUMS integrates TRANSIMS and MITSIM as traffic simulation engines, which are open-source and widely accepted for scalable traffic simulations. A consistent data and simulation platform allows quick adaptation to various geographic areas, as has been demonstrated for multiple cities across the world.
We are combining the strengths of geospatial data sciences, high performance simulations, transportation planning, and emissions, vehicle and energy technology development to design and develop a simulation framework to assist decision makers at all levels - local, state, regional, and federal. Using Cleveland, Tennessee as an example, in this presentation, we illustrate how emerging cities could easily assess future land use scenario driven impacts on energy and environment utilizing such a capability.
NASA Astrophysics Data System (ADS)
Duffy, J. E.
2016-02-01
Biodiversity - the variety of functional types of organisms - is the engine of marine ecosystem processes, including productivity, nutrient cycling, and carbon sequestration. Biodiversity remains a black box in much of ocean science, despite wide recognition that effectively managing human interactions with marine ecosystems requires understanding both structure and functional consequences of biodiversity. Moreover, the inherent complexity of biological systems puts a premium on data-rich, comparative approaches, which are best met via collaborative networks. The Smithsonian Institution's MarineGEO program links a growing network of partners conducting parallel, comparative research to understand change in marine biodiversity and ecosystems, natural and anthropogenic drivers of that change, and the ecological processes mediating it. The focus is on nearshore, seabed-associated systems where biodiversity and human population are concentrated and interact most, yet which fall through the cracks of existing ocean observing programs. MarineGEO offers a standardized toolbox of research modules that efficiently capture key elements of biological diversity and its importance in ecological processes across a range of habitats. The toolbox integrates high-tech (DNA-based, imaging) and low-tech protocols (diver surveys, rapid assays of consumer activity) adaptable to differing institutional capacity and resources. The model for long-term sustainability involves leveraging in-kind support among partners, adoption of best practices wherever possible, engagement of students and citizen scientists, and benefits of training, networking, and global relevance as incentives for participation. Here I highlight several MarineGEO comparative research projects demonstrating the value of standardized, scalable assays and parallel experiments for measuring fish and invertebrate diversity, recruitment, benthic herbivory and generalist predation, decomposition, and carbon sequestration. 
Key remaining challenges include consensus on protocols; integration of historical data; data management and access; and informatics. These challenges are common to other fields and prospects for progress in the near future are good.
An experimental toolbox for the generation of cold and ultracold polar molecules
NASA Astrophysics Data System (ADS)
Zeppenfeld, Martin; Gantner, Thomas; Glöckner, Rosa; Ibrügger, Martin; Koller, Manuel; Prehn, Alexander; Wu, Xing; Chervenkov, Sotir; Rempe, Gerhard
2017-01-01
Cold and ultracold molecules enable fascinating applications in quantum science. We present our toolbox of techniques to generate the required molecule ensembles, including buffer-gas cooling, centrifuge deceleration and optoelectrical Sisyphus cooling. We obtain excellent control over both the motional and internal molecular degrees of freedom, allowing us to aim at various applications.
Water Power Data and Tools | Water Power | NREL
computer modeling tools and data with state-of-the-art design and analysis. The WEC Design Response Toolbox provides extreme response and fatigue analysis tools specifically
Tensor Toolbox for MATLAB v. 3.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolda, Tamara; Bader, Brett W.; Acar Ataman, Evrim NMN
Tensors (also known as multidimensional arrays or N-way arrays) are used in a variety of applications ranging from chemometrics to network analysis. The Tensor Toolbox provides classes for manipulating dense, sparse, and structured tensors using MATLAB's object-oriented features. It also provides algorithms for tensor decomposition and factorization, algorithms for computing tensor eigenvalues, and methods for visualization of results.
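The tensor-decomposition idea can be illustrated with a minimal example. The Tensor Toolbox itself is MATLAB and handles sparse and structured tensors at arbitrary rank; the Python sketch below only fits a rank-1 CP (canonical polyadic) model to a tiny dense 3-way tensor by alternating least squares, to show what "tensor factorization" means concretely.

```python
# Sketch of rank-1 CP decomposition by alternating least squares on a
# small dense tensor (illustrative; not the Tensor Toolbox's algorithms).
import numpy as np

# Build an exactly rank-1 tensor X = a0 (outer) b0 (outer) c0.
X = np.einsum('i,j,k->ijk', [1.0, 2.0], [3.0, 1.0], [1.0, 4.0])

# Alternating updates: fix two factors, solve for the third in closed form.
a, b, c = np.ones(2), np.ones(2), np.ones(2)
for _ in range(20):
    a = np.einsum('ijk,j,k->i', X, b, c) / ((b @ b) * (c @ c))
    b = np.einsum('ijk,i,k->j', X, a, c) / ((a @ a) * (c @ c))
    c = np.einsum('ijk,i,j->k', X, a, b) / ((a @ a) * (b @ b))

approx = np.einsum('i,j,k->ijk', a, b, c)
print(round(float(np.abs(X - approx).max()), 6))
```

For an exactly rank-1 tensor the iteration recovers the factorization (up to scaling spread across the factors); higher-rank CP and Tucker fits follow the same alternating pattern over factor matrices rather than vectors.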
MOFA Software for the COBRA Toolbox
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griesemer, Marc; Navid, Ali
MOFA-COBRA is a software code for Matlab that performs Multi-Objective Flux Analysis (MOFA) by solving linear programming problems. The leading software package for conducting different types of analyses using constraint-based models is the COBRA Toolbox for Matlab. MOFA-COBRA is an added tool for COBRA that solves multi-objective problems using a novel algorithm.
The interim final Response Protocol Toolbox: Planning for and Responding to Contamination Threats to Drinking Water Systems is designed to help the water sector effectively and appropriately respond to intentional contamination threats and incidents. It was produced by EPA, buil...
The panacea toolbox of a PhD biomedical student.
Skaik, Younis
2014-01-01
Doing a PhD (doctor of philosophy) for the sake of contributing to knowledge should give the student immense enthusiasm throughout the PhD period. It is the time in one's life that one spends to "hit the nail on the head" in a specific area and topic of interest. A PhD consists mostly of hard work and tenacity; however, luck and genius might also play a little role: one can pass all PhD phases without either. The PhD student should have pre-PhD and PhD toolboxes, which are "sine quibus non" for successfully obtaining a PhD degree. In this manuscript, the toolboxes of the PhD student are discussed.
A Tol2 Gateway-Compatible Toolbox for the Study of the Nervous System and Neurodegenerative Disease.
Don, Emily K; Formella, Isabel; Badrock, Andrew P; Hall, Thomas E; Morsch, Marco; Hortle, Elinor; Hogan, Alison; Chow, Sharron; Gwee, Serene S L; Stoddart, Jack J; Nicholson, Garth; Chung, Roger; Cole, Nicholas J
2017-02-01
Currently there is a lack in fundamental understanding of disease progression of most neurodegenerative diseases, and, therefore, treatments and preventative measures are limited. Consequently, there is a great need for adaptable, yet robust model systems to both investigate elementary disease mechanisms and discover effective therapeutics. We have generated a Tol2 Gateway-compatible toolbox to study neurodegenerative disorders in zebrafish, which includes promoters for astrocytes, microglia and motor neurons, multiple fluorophores, and compatibility for the introduction of genes of interest or disease-linked genes. This toolbox will advance the rapid and flexible generation of zebrafish models to discover the biology of the nervous system and the disease processes that lead to neurodegeneration.
Spectral analysis and filtering techniques in digital spatial data processing
Pan, Jeng-Jong
1989-01-01
A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. This filter toolbox provides capabilities to compute the power spectrum of a given data set and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filters, line filters, and area filters. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes capabilities for high-pass, band-pass, low-pass, and wedge filtering. These filters have been applied to analyze satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
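The abstract describes Gaussian-type notch filters applied in the frequency domain. As a hedged illustration of that idea (not the EROS toolbox code, whose exact parameterization we do not know), a frequency-domain Gaussian notch can be sketched in NumPy as follows; the function names and the symmetric-conjugate handling are our own choices:

```python
import numpy as np

def gaussian_notch_filter(shape, u0, v0, sigma):
    """Gaussian notch that suppresses frequency (u0, v0) and its conjugate."""
    # Frequency grids in FFT ordering, in cycles per full array length
    u = np.fft.fftfreq(shape[0]) * shape[0]
    v = np.fft.fftfreq(shape[1]) * shape[1]
    U, V = np.meshgrid(u, v, indexing="ij")
    d1 = (U - u0) ** 2 + (V - v0) ** 2
    d2 = (U + u0) ** 2 + (V + v0) ** 2   # conjugate notch keeps output real
    return (1 - np.exp(-d1 / (2 * sigma**2))) * (1 - np.exp(-d2 / (2 * sigma**2)))

def apply_filter(image, H):
    """Filter a 2-D array by pointwise multiplication in the frequency domain."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))
```

A sinusoid at the notched frequency is removed almost entirely, while components away from the notch pass through essentially unchanged.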
Global Existence Results for Viscoplasticity at Finite Strain
NASA Astrophysics Data System (ADS)
Mielke, Alexander; Rossi, Riccarda; Savaré, Giuseppe
2018-01-01
We study a model for rate-dependent gradient plasticity at finite strain based on the multiplicative decomposition of the strain tensor, and investigate the existence of global-in-time solutions to the related PDE system. We reveal its underlying structure as a generalized gradient system, where the driving energy functional is highly nonconvex and features the geometric nonlinearities related to finite-strain elasticity as well as the multiplicative decomposition of finite-strain plasticity. Moreover, the dissipation potential depends on the left-invariant plastic rate, and thus depends on the plastic state variable. The existence theory is developed for a class of abstract, nonsmooth, and nonconvex gradient systems, for which we introduce suitable notions of solutions, namely energy-dissipation-balance and energy-dissipation-inequality solutions. Hence, we resort to the toolbox of the direct method of the calculus of variations to check that the specific energy and dissipation functionals for our viscoplastic models comply with the conditions of the general theory.
Tsipa, Argyro; Koutinas, Michalis; Usaku, Chonlatep; Mantalaris, Athanasios
2018-05-02
Currently, design and optimisation of biotechnological bioprocesses is performed either through exhaustive experimentation and/or with the use of empirical, unstructured growth kinetics models. Although elaborate systems biology approaches have recently been explored, mixed-substrate utilisation is predominantly ignored despite its significance in enhancing bioprocess performance. Herein, bioprocess optimisation for an industrially-relevant bioremediation process involving a mixture of highly toxic substrates, m-xylene and toluene, was achieved through application of a novel experimental-modelling gene regulatory network - growth kinetic (GRN-GK) hybrid framework. The GRN model described the TOL and ortho-cleavage pathways in Pseudomonas putida mt-2 and captured the transcriptional kinetics expression patterns of the promoters. The GRN model informed the formulation of the growth kinetics model, replacing the empirical and unstructured Monod kinetics. The GRN-GK framework's predictive capability, and its potential as a systematic optimal bioprocess design tool, were demonstrated by its effective prediction of bioprocess performance, in agreement with experimental values, whereas four commonly used models deviated significantly from the experimental values. Significantly, a fed-batch biodegradation process was designed and optimised through the model-based control of TOL Pr promoter expression, resulting in 61% and 60% enhanced pollutant removal and biomass formation, respectively, compared to the batch process. This provides strong evidence of model-based bioprocess optimisation at the gene level, rendering the GRN-GK framework a novel and applicable approach to optimal bioprocess design. Finally, model analysis using global sensitivity analysis (GSA) suggests an alternative, systematic approach for model-driven strain modification for synthetic biology and metabolic engineering applications. Copyright © 2018. Published by Elsevier Inc.
[The acute phase, a time which determines the outcome of a patient with a head trauma].
Jeauneaux, Olivier; Bony, Maylis; Giroud, Olivier; Chabert, Flavien; Pagnier, Daniel; Mansuy, Charlène; Quélin, Pauline; Lemperrière, Héloïse; Grodecœur, Caroline; Armonia, Cécile
2017-03-01
As soon as their prehospital care begins, patients with a serious head injury are given intensive care to offset the systemic failures observed and minimise secondary brain damage. In intensive care, monitoring is continuous and neuroprotection optimised. While the prognosis of the patient remains uncertain, their family are included and involved in their global care. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Savini, H; Maugey, N; Aletti, M; Facon, A; Koulibaly, F; Cotte, J; Janvier, F; Cordier, P Y; Dampierre, H; Ramade, S; Foissaud, V; Granier, H; Sagui, E; Carmoi, T
2016-10-01
The Healthcare Workers Treatment Center of Conakry, Guinea, was inaugurated in January 2015. It is dedicated to the diagnosis and treatment of healthcare workers with probable or confirmed Ebola virus disease. It is staffed by the French Army Medical Service. The French military team may reconcile their medical practice and the ethno-cultural imperatives to optimise patient adherence during hospitalization.
Martens, Pim; Akin, Su-Mia; Huynen, Maud; Raza, Mohsin
2010-09-17
It is clear that globalization is something more than a purely economic phenomenon manifesting itself on a global scale. Among the visible manifestations of globalization are the greater international movement of goods and services, financial capital, information and people. In addition, there are technological developments, more transboundary cultural exchanges, facilitated by the freer trade of more differentiated products as well as by tourism and immigration, changes in the political landscape and ecological consequences. In this paper, we link the Maastricht Globalization Index with health indicators to analyse if more globalized countries are doing better in terms of infant mortality rate, under-five mortality rate, and adult mortality rate. The results indicate a positive association between a high level of globalization and low mortality rates. In view of the arguments that globalization provides winners and losers, and might be seen as a disequalizing process, we should perhaps be careful in interpreting the observed positive association as simple evidence that globalization is mostly good for our health. It is our hope that a further analysis of health impacts of globalization may help in adjusting and optimising the process of globalization on every level in the direction of a sustainable and healthy development for all.
Is globalization healthy: a statistical indicator analysis of the impacts of globalization on health
2010-01-01
It is clear that globalization is something more than a purely economic phenomenon manifesting itself on a global scale. Among the visible manifestations of globalization are the greater international movement of goods and services, financial capital, information and people. In addition, there are technological developments, more transboundary cultural exchanges, facilitated by the freer trade of more differentiated products as well as by tourism and immigration, changes in the political landscape and ecological consequences. In this paper, we link the Maastricht Globalization Index with health indicators to analyse if more globalized countries are doing better in terms of infant mortality rate, under-five mortality rate, and adult mortality rate. The results indicate a positive association between a high level of globalization and low mortality rates. In view of the arguments that globalization provides winners and losers, and might be seen as a disequalizing process, we should perhaps be careful in interpreting the observed positive association as simple evidence that globalization is mostly good for our health. It is our hope that a further analysis of health impacts of globalization may help in adjusting and optimising the process of globalization on every level in the direction of a sustainable and healthy development for all. PMID:20849605
Ferreira, Fábio S; Pereira, João M S; Duarte, João V; Castelo-Branco, Miguel
2017-01-01
Although voxel based morphometry studies are still the standard for analyzing brain structure, their dependence on massive univariate inferential methods is a limiting factor. A better understanding of brain pathologies can be achieved by applying inferential multivariate methods, which allow the study of multiple dependent variables, e.g. different imaging modalities of the same subject. Given the widespread use of SPM software in the brain imaging community, the main aim of this work is the implementation of massive multivariate inferential analysis as a toolbox in this software package, applied to the use of T1 and T2 structural data from diabetic patients and controls. This implementation was compared with the traditional ANCOVA in SPM and a similar multivariate GLM toolbox (MRM). We implemented the new toolbox and tested it by investigating brain alterations on a cohort of twenty-eight type 2 diabetes patients and twenty-six matched healthy controls, using information from both T1 and T2 weighted structural MRI scans, both separately - using standard univariate VBM - and simultaneously, with multivariate analyses. Univariate VBM replicated predominantly bilateral changes in basal ganglia and insular regions in type 2 diabetes patients. On the other hand, multivariate analyses replicated key findings of univariate results, while also revealing the thalami as additional foci of pathology. While the presented algorithm must be further optimized, the proposed toolbox is the first implementation of multivariate statistics in SPM8 as a user-friendly toolbox, which shows great potential and is ready to be validated in other clinical cohorts and modalities.
Review of Qualitative Approaches for the Construction Industry: Designing a Risk Management Toolbox
Spee, Ton; Gillen, Matt; Lentz, Thomas J.; Garrod, Andrew; Evans, Paul; Swuste, Paul
2011-01-01
Objectives: This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Methods: Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB), has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. Results: This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. Conclusion: The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions. PMID:22953194
Review of qualitative approaches for the construction industry: designing a risk management toolbox.
Zalk, David M; Spee, Ton; Gillen, Matt; Lentz, Thomas J; Garrod, Andrew; Evans, Paul; Swuste, Paul
2011-06-01
This paper presents the framework and protocol design for a construction industry risk management toolbox. The construction industry needs a comprehensive, systematic approach to assess and control occupational risks. These risks span several professional health and safety disciplines, emphasized by multiple international occupational research agenda projects including: falls, electrocution, noise, silica, welding fumes, and musculoskeletal disorders. Yet, the International Social Security Association says, "whereas progress has been made in safety and health, the construction industry is still a high risk sector." Small- and medium-sized enterprises (SMEs) employ about 80% of the world's construction workers. In recent years a strategy for qualitative occupational risk management, known as Control Banding (CB) has gained international attention as a simplified approach for reducing work-related risks. CB groups hazards into stratified risk 'bands', identifying commensurate controls to reduce the level of risk and promote worker health and safety. We review these qualitative solutions-based approaches and identify strengths and weaknesses toward designing a simplified CB 'toolbox' approach for use by SMEs in construction trades. This toolbox design proposal includes international input on multidisciplinary approaches for performing a qualitative risk assessment determining a risk 'band' for a given project. Risk bands are used to identify the appropriate level of training to oversee construction work, leading to commensurate and appropriate control methods to perform the work safely. The Construction Toolbox presents a review-generated format to harness multiple solutions-based national programs and publications for controlling construction-related risks with simplified approaches across the occupational safety, health and hygiene professions.
Ferreira, Fábio S.; Pereira, João M.S.; Duarte, João V.; Castelo-Branco, Miguel
2017-01-01
Background: Although voxel based morphometry studies are still the standard for analyzing brain structure, their dependence on massive univariate inferential methods is a limiting factor. A better understanding of brain pathologies can be achieved by applying inferential multivariate methods, which allow the study of multiple dependent variables, e.g. different imaging modalities of the same subject. Objective: Given the widespread use of SPM software in the brain imaging community, the main aim of this work is the implementation of massive multivariate inferential analysis as a toolbox in this software package, applied to the use of T1 and T2 structural data from diabetic patients and controls. This implementation was compared with the traditional ANCOVA in SPM and a similar multivariate GLM toolbox (MRM). Method: We implemented the new toolbox and tested it by investigating brain alterations on a cohort of twenty-eight type 2 diabetes patients and twenty-six matched healthy controls, using information from both T1 and T2 weighted structural MRI scans, both separately - using standard univariate VBM - and simultaneously, with multivariate analyses. Results: Univariate VBM replicated predominantly bilateral changes in basal ganglia and insular regions in type 2 diabetes patients. On the other hand, multivariate analyses replicated key findings of univariate results, while also revealing the thalami as additional foci of pathology. Conclusion: While the presented algorithm must be further optimized, the proposed toolbox is the first implementation of multivariate statistics in SPM8 as a user-friendly toolbox, which shows great potential and is ready to be validated in other clinical cohorts and modalities. PMID:28761571
Autonomous Modelling of X-ray Spectra Using Robust Global Optimization Methods
NASA Astrophysics Data System (ADS)
Rogers, Adam; Safi-Harb, Samar; Fiege, Jason
2015-08-01
The standard approach to model fitting in X-ray astronomy is by means of local optimization methods. However, these local optimizers suffer from a number of problems, such as a tendency for the fit parameters to become trapped in local minima, and can require an involved process of detailed user intervention to guide them through the optimization process. In this work we introduce a general GUI-driven global optimization method for fitting models to X-ray data, written in MATLAB, which searches for optimal models with minimal user interaction. We directly interface with the commonly used XSPEC libraries to access the full complement of pre-existing spectral models that describe a wide range of physics appropriate for modelling astrophysical sources, including supernova remnants and compact objects. Our algorithm is powered by the Ferret genetic algorithm and Locust particle swarm optimizer from the Qubist Global Optimization Toolbox, which are robust at finding families of solutions and identifying degeneracies. This technique will be particularly instrumental for multi-parameter models and high-fidelity data. In this presentation, we provide details of the code and use our techniques to analyze X-ray data obtained from a variety of astrophysical sources.
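The Ferret and Locust optimizers in the Qubist toolbox are proprietary MATLAB tools. To illustrate the general idea of population-based global fitting that the abstract relies on, here is a minimal real-coded genetic algorithm in Python; this is entirely our own sketch (tournament selection, blend crossover, Gaussian mutation, elitism), not the actual Ferret implementation:

```python
import numpy as np

def ga_fit(cost, bounds, pop_size=50, n_gen=300, seed=0):
    """Minimize cost(params) over box bounds with a simple genetic algorithm."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    costs = np.array([cost(p) for p in pop])
    for _ in range(n_gen):
        # Tournament selection: each parent is the better of two random picks
        idx = rng.integers(pop_size, size=(pop_size, 2))
        winners = np.where(costs[idx[:, 0]] < costs[idx[:, 1]], idx[:, 0], idx[:, 1])
        parents = pop[winners]
        # Blend crossover with a shuffled partner, then Gaussian mutation
        partners = parents[rng.permutation(pop_size)]
        alpha = rng.uniform(size=(pop_size, 1))
        children = alpha * parents + (1 - alpha) * partners
        children += rng.normal(0.0, 0.02 * (hi - lo), children.shape)
        children = np.clip(children, lo, hi)
        child_costs = np.array([cost(p) for p in children])
        # Elitism: the best individual so far replaces the worst child
        best, worst = np.argmin(costs), np.argmax(child_costs)
        children[worst], child_costs[worst] = pop[best], costs[best]
        pop, costs = children, child_costs
    best = np.argmin(costs)
    return pop[best], costs[best]
```

On a toy two-parameter power-law fit, the population converges to the neighborhood of the generating parameters without any user-supplied starting guess, which is the practical appeal of such global methods for multi-parameter spectral models.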
T-MATS Toolbox for the Modeling and Analysis of Thermodynamic Systems
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.
2014-01-01
The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) is a MATLAB/Simulink (The MathWorks, Inc.) plug-in for creating and simulating thermodynamic systems and controls. The package contains generic parameterized components that can be combined with a variable input iterative solver and optimization algorithm to create complex system models, such as gas turbines.
ERIC Educational Resources Information Center
Oxman, Victor; Stupel, Moshe
2018-01-01
A geometrical task is presented with multiple solutions using different methods, in order to show the connection between various branches of mathematics and to highlight the importance of providing the students with an extensive 'mathematical toolbox'. Investigation of the property that appears in the task was carried out using a computerized tool.
NASA Astrophysics Data System (ADS)
Oxman, Victor; Stupel, Moshe
2018-04-01
A geometrical task is presented with multiple solutions using different methods, in order to show the connection between various branches of mathematics and to highlight the importance of providing the students with an extensive 'mathematical toolbox'. Investigation of the property that appears in the task was carried out using a computerized tool.
GOCE User Toolbox and Tutorial
NASA Astrophysics Data System (ADS)
Knudsen, P.; Benveniste, J.
2011-07-01
The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux Workstations, and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models are made available as well. GUT has been developed in a collaboration within the GUT Core Group. The GUT Core Group: S. Dinardo, D. Serpe, B.M. Lucas, R. Floberghagen, A. Horvath (ESA), O. Andersen, M. Herceg (DTU), M.-H. Rio, S. Mulet, G. Larnicol (CLS), J. Johannessen, L. Bertino (NERSC), H. Snaith, P. Challenor (NOC), K. Haines, D. Bretherton (NCEO), C. Hughes (POL), R.J. Bingham (NU), G. Balmino, S. Niemeijer, I. Price, L. Cornejo (S&T), M. Diament, I. Panet (IPGP), C.C. Tscherning (KU), D. Stammer, F. Siegismund (UH), T. Gruber (TUM).
Lührs, Michael; Goebel, Rainer
2017-10-01
Turbo-Satori is a neurofeedback and brain-computer interface (BCI) toolbox for real-time functional near-infrared spectroscopy (fNIRS). It incorporates multiple pipelines from real-time preprocessing and analysis to neurofeedback and BCI applications. The toolbox is designed with a focus on usability, enabling a fast setup and execution of real-time experiments. Turbo-Satori uses an incremental recursive least-squares procedure for real-time general linear model calculation and support vector machine classifiers for advanced BCI applications. It communicates directly with common NIRx fNIRS hardware and was tested extensively, ensuring that the calculations can be performed in real time without a significant change in calculation times for all sampling intervals during ongoing experiments of up to 6 h of recording. Enabling immediate access to advanced processing features also allows the use of this toolbox by students and nonexperts in the field of fNIRS data acquisition and processing. Flexible network interfaces allow third party stimulus applications to access the processed data and calculated statistics in real time so that this information can be easily incorporated in neurofeedback or BCI presentations.
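Turbo-Satori's incremental GLM estimator is proprietary, but the textbook recursive least-squares (RLS) update it names processes one sample at a time in constant memory, which is what makes real-time fitting feasible. A generic Python sketch of that standard update (our illustration, not the toolbox's code):

```python
import numpy as np

def rls_init(n_params, delta=1e6):
    """Start with zero coefficients and a large covariance (weak prior)."""
    return np.zeros(n_params), np.eye(n_params) * delta

def rls_update(beta, P, x, y):
    """One recursive least-squares step for a new sample (x, y)."""
    Px = P @ x
    k = Px / (1.0 + x @ Px)            # gain vector
    beta = beta + k * (y - x @ beta)   # prediction-error correction
    P = P - np.outer(k, Px)            # covariance downdate
    return beta, P
```

After streaming through the samples, the coefficients agree with the batch least-squares solution to within the prior's bias, so statistics can be reported continuously during acquisition.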
HYDRORECESSION: A toolbox for streamflow recession analysis
NASA Astrophysics Data System (ADS)
Arciniega, S.
2015-12-01
Streamflow recession curves are hydrological signatures that allow studying the relationship between groundwater storage and baseflow and/or low flows at the catchment scale. Recent studies have shown that streamflow recession analysis can be quite sensitive to the combination of different models, extraction techniques and parameter estimation methods. In order to better characterize streamflow recession curves, new methodologies combining multiple approaches have been recommended. The HYDRORECESSION toolbox, presented here, is a Matlab graphical user interface developed to analyse streamflow recession time series, with tools for parameterizing linear and nonlinear storage-outflow relationships through four of the most useful recession models (Maillet, Boussinesq, Coutagne and Wittenberg). The toolbox includes four parameter-fitting techniques (linear regression, lower envelope, data binning and mean squared error) and three different methods to extract hydrograph recession segments (Vogel, Brutsaert and Aksoy). In addition, the toolbox has a module that separates the baseflow component from the observed hydrograph using the inverse reservoir algorithm. Potential applications provided by HYDRORECESSION include model parameter analysis, hydrological regionalization and classification, baseflow index estimates, catchment-scale recharge and low-flow modelling, among others. HYDRORECESSION is freely available for non-commercial and academic purposes.
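Of the four recession models named in the abstract, the Maillet model is the simplest: Q(t) = Q0 exp(-t/k), which becomes linear after a log transform and so pairs naturally with the toolbox's linear-regression fitting option. A minimal Python sketch of that fit (our illustration, not HYDRORECESSION's Matlab code; the function name is ours):

```python
import numpy as np

def fit_maillet(t, Q):
    """Fit Q(t) = Q0 * exp(-t/k) by linear regression on ln Q.

    ln Q = ln Q0 - t/k, so the slope gives -1/k and the intercept ln Q0.
    Returns (Q0, k), with k the storage (recession) constant in units of t.
    """
    slope, intercept = np.polyfit(t, np.log(Q), 1)
    return np.exp(intercept), -1.0 / slope
```

On a synthetic recession limb the regression recovers both the initial discharge and the recession constant exactly; on real hydrographs the choice of extraction method (Vogel, Brutsaert or Aksoy) determines which points enter this fit.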
Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis
Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.
2006-01-01
In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely-used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
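BPM's core idea, entering a second imaging modality as a voxel-wise regressor in a general linear model, can be sketched generically in Python. This is illustrative only (BPM itself is a MATLAB/SPM toolbox, and the function name and array layout here are our own assumptions):

```python
import numpy as np

def bpm_voxelwise(t1, t2, design):
    """Voxel-wise GLM with a second modality as a regressor.

    t1, t2: arrays of shape (n_subjects, n_voxels), one value per subject
    per voxel for each modality; design: (n_subjects, n_covariates).
    At each voxel, regress t1 on [design, t2_at_that_voxel].
    Returns betas of shape (n_covariates + 1, n_voxels).
    """
    n_sub, n_vox = t1.shape
    betas = np.empty((design.shape[1] + 1, n_vox))
    for v in range(n_vox):
        X = np.column_stack([design, t2[:, v]])  # modality 2 as regressor
        betas[:, v], *_ = np.linalg.lstsq(X, t1[:, v], rcond=None)
    return betas
```

Because the second-modality regressor differs at every voxel, this cannot be expressed as a single design matrix shared across the brain, which is exactly why a dedicated toolbox is needed on top of standard SPM analyses.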
A novel toolbox for E. coli lysis monitoring.
Rajamanickam, Vignesh; Wurm, David; Slouka, Christoph; Herwig, Christoph; Spadiut, Oliver
2017-01-01
The bacterium Escherichia coli is a well-studied recombinant host organism with a plethora of applications in biotechnology. Highly valuable biopharmaceuticals, such as antibody fragments and growth factors, are currently being produced in E. coli. However, the high metabolic burden during recombinant protein production can lead to cell death, consequent lysis, and undesired product loss. Thus, fast and precise analyzers to monitor E. coli bioprocesses and to retrieve key process information, such as the optimal time point of harvest, are needed. However, such reliable monitoring tools are still scarce to date. In this study, we cultivated an E. coli strain producing a recombinant single-chain antibody fragment in the cytoplasm. In bioreactor cultivations, we purposely triggered cell lysis by pH ramps. We developed a novel toolbox using UV chromatograms as fingerprints and chemometric techniques to monitor these lysis events, and used flow cytometry (FCM) as a reference method to quantify viability offline. Summarizing, we were able to show that a novel toolbox comprising HPLC chromatogram fingerprinting and data science tools allowed the identification of E. coli lysis in a fast and reliable manner. We are convinced that this toolbox will not only facilitate E. coli bioprocess monitoring but will also allow enhanced process control in the future.
A sigma factor toolbox for orthogonal gene expression in Escherichia coli
Van Brempt, Maarten; Van Nerom, Katleen; Van Hove, Bob; Maertens, Jo; De Mey, Marjan; Charlier, Daniel
2018-01-01
Abstract Synthetic genetic sensors and circuits enable programmable control over timing and conditions of gene expression and, as a result, are increasingly incorporated into the control of complex and multi-gene pathways. Size and complexity of genetic circuits are growing, but stay limited by a shortage of regulatory parts that can be used without interference. Therefore, orthogonal expression and regulation systems are needed to minimize undesired crosstalk and allow for dynamic control of separate modules. This work presents a set of orthogonal expression systems for use in Escherichia coli based on heterologous sigma factors from Bacillus subtilis that recognize specific promoter sequences. Up to four of the analyzed sigma factors can be combined to function orthogonally between each other and toward the host. Additionally, the toolbox is expanded by creating promoter libraries for three sigma factors without loss of their orthogonal nature. As this set covers a wide range of transcription initiation frequencies, it enables tuning of multiple outputs of the circuit in response to different sensory signals in an orthogonal manner. This sigma factor toolbox constitutes an interesting expansion of the synthetic biology toolbox and may contribute to the assembly of more complex synthetic genetic systems in the future. PMID:29361130
ICT: isotope correction toolbox.
Jungreuthmayer, Christian; Neubauer, Stefan; Mairinger, Teresa; Zanghellini, Jürgen; Hann, Stephan
2016-01-01
Isotope tracer experiments are an invaluable technique to analyze and study the metabolism of biological systems. However, isotope labeling experiments are often affected by naturally abundant isotopes, especially in cases where mass spectrometric methods make use of derivatization. The correction of these additive interferences, in particular for complex isotopic systems, is numerically challenging and still an emerging field of research. When positional information is generated via collision-induced dissociation, even more complex calculations for isotopic interference correction are necessary. So far, no freely available tools can handle tandem mass spectrometry data. We present Isotope Correction Toolbox, a program that corrects tandem mass isotopomer data from tandem mass spectrometry experiments. Isotope Correction Toolbox is written in the multi-platform programming language Perl and can therefore be used on all commonly available computer platforms. Source code and documentation can be freely obtained under the Artistic License or the GNU General Public License from: https://github.com/jungreuc/isotope_correction_toolbox/ {christian.jungreuthmayer@boku.ac.at,juergen.zanghellini@boku.ac.at} Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
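ICT itself is written in Perl and handles the harder tandem-MS case. The underlying single-MS principle, deconvolving natural 13C abundance from a measured mass-isotopomer distribution with a binomial correction matrix, can be sketched in Python as a simplified illustration (our own simplification, not ICT's algorithm; it ignores derivatization and other elements):

```python
import numpy as np
from math import comb

def correction_matrix(n_carbons, p13=0.0107):
    """C[i+k, i]: probability that a fragment with i tracer-labeled carbons
    is observed k mass units heavier due to natural 13C in the remaining
    n_carbons - i positions (binomial distribution)."""
    n = n_carbons
    C = np.zeros((n + 1, n + 1))
    for i in range(n + 1):
        for k in range(n - i + 1):
            C[i + k, i] = comb(n - i, k) * p13**k * (1 - p13)**(n - i - k)
    return C

def correct_mid(measured, n_carbons, p13=0.0107):
    """Recover the tracer-only mass-isotopomer distribution from the
    measured one by solving measured = C @ x in the least-squares sense."""
    C = correction_matrix(n_carbons, p13)
    x, *_ = np.linalg.lstsq(C, measured, rcond=None)
    x = np.clip(x, 0.0, None)
    return x / x.sum()
```

Because the correction matrix is lower triangular with positive diagonal, the deconvolution is well posed for noise-free data; real measurements additionally need the non-negativity clipping shown above.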
Marine ferromanganese encrustations: Archives of changing oceans
Koschinsky, Andrea; Hein, James
2017-01-01
Marine iron–manganese oxide coatings occur in many shallow and deep-water areas of the global ocean and can form in three ways: 1) Fe–Mn crusts can precipitate from seawater onto rocks on seamounts; 2) Fe–Mn nodules can form on the sediment surface around a nucleus by diagenetic processes in sediment pore water; 3) encrustations can precipitate from hydrothermal fluids. These oxide coatings have been growing for thousands to tens of millions of years. They represent a vast archive of how oceans have changed, including variations of climate, ocean currents, geological activity, erosion processes on land, and even anthropogenic impact. A growing toolbox of age-dating methods and element and isotopic signatures is being used to exploit these archives.
Early warning signal for interior crises in excitable systems.
Karnatak, Rajat; Kantz, Holger; Bialonski, Stephan
2017-10-01
The ability to reliably predict critical transitions in dynamical systems is a long-standing goal of diverse scientific communities. Previous work focused on early warning signals related to local bifurcations (critical slowing down) and nonbifurcation-type transitions. We extend this toolbox and report on a characteristic scaling behavior (critical attractor growth) which is indicative of an impending global bifurcation, an interior crisis in excitable systems. We demonstrate our early warning signal in a conceptual climate model as well as in a model of coupled neurons known to exhibit extreme events. We observed critical attractor growth prior to interior crises of chaotic as well as strange-nonchaotic attractors. These observations promise to extend the classes of transitions that can be predicted via early warning signals.
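The classical precursor mentioned above, critical slowing down, is typically tracked with rolling-window indicators such as lag-1 autocorrelation, which rises toward 1 as a local bifurcation is approached. A minimal sketch of that classical indicator (not the authors' attractor-growth signal, which requires reconstructing the attractor):

```python
def lag1_autocorr(window):
    """Lag-1 autocorrelation of one observation window, the classic
    critical-slowing-down indicator."""
    n = len(window)
    mean = sum(window) / n
    var = sum((v - mean) ** 2 for v in window)
    cov = sum((window[i] - mean) * (window[i + 1] - mean) for i in range(n - 1))
    return cov / var

def rolling_indicator(series, width):
    """Evaluate the indicator over sliding windows of the observed series."""
    return [lag1_autocorr(series[i:i + width])
            for i in range(len(series) - width + 1)]
```

A sustained upward trend in the rolling indicator is then read as an early warning.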
Danaher, Brian G; Brendryen, Håvar; Seeley, John R; Tyler, Milagra S; Woolley, Tim
2015-03-01
mHealth interventions that deliver content via mobile phones represent a burgeoning area of health behavior change. The current paper examines two themes that can inform the underlying design of mHealth interventions: (1) mobile device functionality, which represents the technological toolbox available to intervention developers; and (2) the pervasive information architecture of mHealth interventions, which determines how intervention content can be delivered concurrently using mobile phones, personal computers, and other devices. We posit that developers of mHealth interventions will be better able to achieve the promise of this burgeoning arena by leveraging the toolbox and functionality of mobile devices in order to engage participants and encourage meaningful behavior change within the context of a carefully designed pervasive information architecture.
Ghosh, Ranadhir; Yearwood, John; Ghosh, Moumita; Bagirov, Adil
2006-06-01
In this paper we investigate a hybrid model based on the Discrete Gradient method and an evolutionary strategy for determining the weights in a feed-forward artificial neural network, and we discuss different variants of such hybrid models. The Discrete Gradient method has the advantage of being able to jump over many local minima and find very deep local minima. However, earlier research has shown that a good starting point for the Discrete Gradient method can improve the quality of the solution point. Evolutionary algorithms are well suited to global optimisation problems, but they suffer from long training times and are often unsuitable for real-world applications. For optimisation problems such as weight optimisation for ANNs in real-world applications, the dimensions are large and time complexity is critical. Hence a hybrid model can be a suitable option. In this paper we propose different fusion strategies for hybrid models combining the evolutionary strategy with the Discrete Gradient method to obtain an optimal solution much more quickly. Three fusion strategies are discussed: a linear hybrid model, an iterative hybrid model and a restricted local search hybrid model. Comparative results on a range of standard datasets are provided for the different hybrid models.
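The linear hybrid idea (evolutionary stage supplies a starting point, local stage refines it) can be sketched as follows. Both stages here are deliberate simplifications: a crude (1+1) evolution strategy stands in for the paper's evolutionary stage, and coordinate-wise finite-difference descent stands in for the Discrete Gradient method.

```python
import random

def es_seed(f, dim, iters=200, sigma=0.5, seed=0):
    """Crude (1+1) evolution strategy: global exploration to find a good
    starting point (stand-in for the evolutionary stage)."""
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = f(x)
    for _ in range(iters):
        y = [xi + rng.gauss(0, sigma) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
    return x

def local_descent(f, x, step=0.1, tol=1e-6):
    """Coordinate-wise descent with shrinking steps (stand-in for the
    Discrete Gradient local stage)."""
    x = list(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                if f(y) < f(x):
                    x = y
                    improved = True
        if not improved:
            step /= 2
    return x

def linear_hybrid(f, dim):
    """Linear hybrid: the evolutionary stage feeds its best point to the
    local stage."""
    return local_descent(f, es_seed(f, dim))
```

On a convex test function the hybrid reaches the minimum far faster than either stage alone would at comparable accuracy.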
Moss and peat hydraulic properties are optimized to maximise peatland water use efficiency
NASA Astrophysics Data System (ADS)
Kettridge, Nicholas; Tilak, Amey; Devito, Kevin; Petrone, Rich; Mendoza, Carl; Waddington, Mike
2016-04-01
Peatland ecosystems are globally important carbon and terrestrial surface water stores that have formed over millennia. These ecosystems have likely optimised their ecohydrological function over the long-term development of their soil hydraulic properties. Through a theoretical ecosystem approach, applying hydrological modelling integrated with known ecological thresholds and concepts, the optimisation of peat hydraulic properties is examined to determine which of the following conditions peatland ecosystems target during this development: i) maximise carbon accumulation, ii) maximise water storage, or iii) balance carbon profit across hydrological disturbances. Saturated hydraulic conductivity (Ks) and the empirical van Genuchten water retention parameter α are shown to provide a first-order control on simulated water tensions. Across parameter space, peat profiles with hypothetical combinations of Ks and α show a strong binary tendency towards targeting either water or carbon storage. Actual hydraulic properties from five northern peatlands fall at the interface between these goals, balancing the competing demands of carbon accumulation and water storage. We argue that peat hydraulic properties are thus optimised to maximise water use efficiency and that this optimisation occurs over a centennial to millennial timescale as the peatland develops. This provides a new conceptual framework to characterise peat hydraulic properties across climate zones and between a range of different disturbances, which can be used to provide benchmarks for peatland design and reclamation.
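The retention parameter α above enters the standard van Genuchten closed-form retention curve relating water content to soil-water tension; a small sketch of that curve (the parameter values used in testing are illustrative, not fitted peat properties):

```python
def van_genuchten_theta(psi, theta_r, theta_s, alpha, n):
    """Van Genuchten (1980) retention curve: volumetric water content as a
    function of soil-water tension psi (psi >= 0, in units of 1/alpha).
    theta_r / theta_s are residual / saturated water contents."""
    if psi <= 0:
        return theta_s
    m = 1 - 1 / n
    se = (1 + (alpha * psi) ** n) ** (-m)  # effective saturation
    return theta_r + (theta_s - theta_r) * se
```

Larger α shifts air entry to lower tensions, which is why α exerts such strong control on simulated water tensions in the peat profiles.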
Optimising predictor domains for spatially coherent precipitation downscaling
NASA Astrophysics Data System (ADS)
Radanovics, S.; Vidal, J.-P.; Sauquet, E.; Ben Daoud, A.; Bontron, G.
2013-10-01
Statistical downscaling is widely used to overcome the scale gap between predictors from numerical weather prediction models or global circulation models and predictands like local precipitation, required for example for medium-term operational forecasts or climate change impact studies. The predictors are considered over a given spatial domain which is rarely optimised with respect to the target predictand location. In this study, an extended version of the growing rectangular domain algorithm is proposed to provide an ensemble of near-optimum predictor domains for a statistical downscaling method. This algorithm is applied to find five-member ensembles of near-optimum geopotential predictor domains for an analogue downscaling method for 608 individual target zones covering France. Results first show that very similar downscaling performances based on the continuous ranked probability score (CRPS) can be achieved by different predictor domains for any specific target zone, demonstrating the need for considering alternative domains in this context of high equifinality. A second result is the large diversity of optimised predictor domains over the country, which questions the commonly made hypothesis of a common predictor domain for large areas. The domain centres mainly follow the geographical location of the target zone, but there are apparent differences between the windward and the lee side of mountain ridges. Moreover, domains for target zones located in southeastern France are centred further east and south than those for target zones at the same longitude. The size of the optimised domains tends to be larger in the southeastern part of the country, while domains with a very small meridional extent can be found in an east-west band around 47° N.
Finally, sensitivity experiments show that results are rather insensitive to the starting point of the optimisation algorithm, except for zones located in the transition area north of this east-west band. Results also appear generally robust with respect to the archive length considered for the analogue method, except for zones with high interannual variability such as the Cévennes area. This study paves the way for defining regions with homogeneous geopotential predictor domains for precipitation downscaling over France, thereby ensuring the spatial coherence required for hydrological applications.
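The CRPS used to score the analogue ensembles has a simple empirical form for a finite ensemble (mean absolute error of the members against the observation, minus half the mean absolute spread between members); a minimal sketch:

```python
def crps_ensemble(members, obs):
    """Empirical CRPS of an ensemble forecast against one observation:
    mean_i |x_i - y| - 0.5 * mean_{i,j} |x_i - x_j|.
    Lower is better; for a single member it reduces to absolute error."""
    m = len(members)
    t1 = sum(abs(x - obs) for x in members) / m
    t2 = sum(abs(a - b) for a in members for b in members) / (2 * m * m)
    return t1 - t2
```

Averaging this score over many forecast dates gives the kind of criterion the domain-growing algorithm optimises.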
ERIC Educational Resources Information Center
Dougherty, Susan
Noting that over the last decade, the role of a foster parent has evolved from temporary caregiver to essential part of a professional team in determining the best long-term plan for children in their care, this guide focuses on practical ways in which best child welfare practice can be incorporated into the recruitment, training, and support of…
Storms, S M; Feltus, A; Barker, A R; Joly, M-A; Girard, M
2009-03-01
Measurement of somatropin charged variants by isoelectric focusing was replaced with capillary zone electrophoresis in the January 2006 European Pharmacopoeia Supplement 5.3, based on results from an interlaboratory collaborative study. Due to incompatibilities and method-robustness issues encountered prior to verification, a number of method parameters required optimisation. As the use of a diode array detector at 195 nm or 200 nm led to a loss of resolution, a variable wavelength detector using a 200 nm filter was employed. Improved injection repeatability was obtained by increasing the injection time and pressure, and changing the sample diluent from water to running buffer. Finally, definition of capillary pre-treatment and rinse procedures resulted in more consistent separations over time. Method verification data are presented demonstrating linearity, specificity, repeatability, intermediate precision, limit of quantitation, sample stability, solution stability, and robustness. Based on these experiments, several modifications to the current method have been recommended and incorporated into the European Pharmacopoeia to help improve method performance across laboratories globally.
NASA Astrophysics Data System (ADS)
Barberis, Stefano; Carminati, Leonardo; Leveraro, Franco; Mazza, Simone Michele; Perini, Laura; Perlz, Francesco; Rebatto, David; Tura, Ruggero; Vaccarossa, Luca; Villaplana, Miguel
2015-12-01
We present the approach of the University of Milan Physics Department and the local unit of INFN to allow and encourage the sharing among different research areas of computing, storage and networking resources (the largest ones being those composing the Milan WLCG Tier-2 centre and tailored to the needs of the ATLAS experiment). Computing resources are organised as independent HTCondor pools, with a global master in charge of monitoring them and optimising their usage. The configuration has to provide satisfactory throughput for both serial and parallel (multicore, MPI) jobs. A combination of local, remote and cloud storage options are available. The experience of users from different research areas operating on this shared infrastructure is discussed. The promising direction of improving scientific computing throughput by federating access to distributed computing and storage also seems to fit very well with the objectives listed in the European Horizon 2020 framework for research and development.
Development of a CRISPR/Cas9 genome editing toolbox for Corynebacterium glutamicum.
Liu, Jiao; Wang, Yu; Lu, Yujiao; Zheng, Ping; Sun, Jibin; Ma, Yanhe
2017-11-16
Corynebacterium glutamicum is an important industrial workhorse and advanced genetic engineering tools are urgently demanded. Recently, the clustered regularly interspaced short palindromic repeats (CRISPR) and their CRISPR-associated proteins (Cas) have revolutionized the field of genome engineering. The CRISPR/Cas9 system that utilizes NGG as protospacer adjacent motif (PAM) and has good targeting specificity can be developed into a powerful tool for efficient and precise genome editing of C. glutamicum. Herein, we developed a versatile CRISPR/Cas9 genome editing toolbox for C. glutamicum. Cas9 and gRNA expression cassettes were reconstituted to combat Cas9 toxicity and facilitate effective termination of gRNA transcription. Co-transformation of Cas9 and gRNA expression plasmids was exploited to overcome high-frequency mutation of cas9, allowing not only highly efficient gene deletion and insertion with plasmid-borne editing templates (efficiencies up to 60.0 and 62.5%, respectively) but also simple and time-saving operation. Furthermore, CRISPR/Cas9-mediated ssDNA recombineering was developed to precisely introduce small modifications and single-nucleotide changes into the genome of C. glutamicum with efficiencies over 80.0%. Notably, double-locus editing was also achieved in C. glutamicum. This toolbox works well in several C. glutamicum strains including the widely-used strains ATCC 13032 and ATCC 13869. In this study, we developed a CRISPR/Cas9 toolbox that could facilitate markerless gene deletion, gene insertion, precise base editing, and double-locus editing in C. glutamicum. The CRISPR/Cas9 toolbox holds promise for accelerating the engineering of C. glutamicum and advancing its application in the production of biochemicals and biofuels.
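The NGG PAM requirement mentioned above makes candidate target selection a simple sequence scan; a forward-strand-only sketch (real gRNA design would also scan the reverse complement and screen for off-targets and GC content, none of which is shown here):

```python
def find_cas9_targets(seq, guide_len=20):
    """Scan a DNA sequence for SpCas9 protospacers: guide_len bases
    immediately followed by an NGG PAM, forward strand only.
    Returns (position, protospacer, PAM) tuples."""
    seq = seq.upper()
    hits = []
    for i in range(len(seq) - guide_len - 2):
        pam = seq[i + guide_len : i + guide_len + 3]
        if pam[1:] == "GG":
            hits.append((i, seq[i : i + guide_len], pam))
    return hits
```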
Arnetz, J E; Hasson, H
2007-07-01
Lack of professional development opportunities among nursing staff is a major concern in elderly care and has been associated with work dissatisfaction and staff turnover. There is a lack of prospective, controlled studies evaluating the effects of educational interventions on nursing competence and work satisfaction. The aim of this study was to evaluate the possible effects of an educational "toolbox" intervention on nursing staff ratings of their competence, psychosocial work environment and overall work satisfaction. The study was a prospective, non-randomized, controlled intervention. Participants were nursing staff in two municipal elderly care organizations in western Sweden. In an initial questionnaire survey, nursing staff in the intervention municipality described several areas in which they felt a need for competence development. Measurement instruments and educational materials for improving staff knowledge and work practices were then collated by researchers and managers into a "toolbox." Nursing staff ratings of their competence and work were measured pre- and post-intervention by questionnaire. Staff ratings in the intervention municipality were compared to staff ratings in the reference municipality, where no toolbox was introduced. Nursing staff ratings of their competence and psychosocial work environment, including overall work satisfaction, improved significantly over time in the intervention municipality compared to the reference group. Both competence and work environment ratings were largely unchanged among reference municipality staff. Multivariate analysis revealed a significant interaction effect between municipalities over time for nursing staff ratings of participation, leadership, performance feedback and skills development. Staff ratings for these four scales improved significantly in the intervention municipality as compared to the reference municipality.
Compared to a reference municipality, nursing staff ratings of their competence and the psychosocial work environment improved in the municipality where the toolbox was introduced.
Minas, Giorgos; Momiji, Hiroshi; Jenkins, Dafyd J; Costa, Maria J; Rand, David A; Finkenstädt, Bärbel
2017-06-26
Given the development of high-throughput experimental techniques, an increasing number of whole genome transcription profiling time series data sets, with good temporal resolution, are becoming available to researchers. The ReTrOS toolbox (Reconstructing Transcription Open Software) provides MATLAB-based implementations of two related methods, namely ReTrOS-Smooth and ReTrOS-Switch, for reconstructing the temporal transcriptional activity profile of a gene from given mRNA expression time series or protein reporter time series. The methods are based on fitting a differential equation model incorporating the processes of transcription, translation and degradation. The toolbox provides a framework for model fitting along with statistical analyses of the model with a graphical interface and model visualisation. We highlight several applications of the toolbox, including the reconstruction of the temporal cascade of transcriptional activity inferred from mRNA expression data and protein reporter data in the core circadian clock in Arabidopsis thaliana, and how such reconstructed transcription profiles can be used to study the effects of different cell lines and conditions. The ReTrOS toolbox allows users to analyse gene and/or protein expression time series where, with appropriate formulation of prior information about a minimum of kinetic parameters, in particular rates of degradation, users are able to infer timings of changes in transcriptional activity. Data from any organism and obtained from a range of technologies can be used as input due to the flexible and generic nature of the model and implementation. The output from this software provides a useful analysis of time series data and can be incorporated into further modelling approaches or in hypothesis generation.
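The underlying model described above (transcription, translation, degradation) can be written as two linear ODEs, dM/dt = τ(t) − δ_M·M and dP/dt = β·M − δ_P·P; ReTrOS-style inference inverts this forward map to recover the transcription profile τ(t) from observed M or P. A forward-Euler sketch of the forward map only (parameter values are illustrative, and this is not ReTrOS's code):

```python
def simulate_reporter(tau, beta, delta_m, delta_p, t_end, dt=0.01):
    """Forward-Euler simulation of the two-stage reporter model:
    dM/dt = tau(t) - delta_m*M   (mRNA)
    dP/dt = beta*M  - delta_p*P  (reporter protein)
    Returns a list of (time, mRNA, protein) samples."""
    m = p = 0.0
    t = 0.0
    out = []
    while t < t_end:
        m += dt * (tau(t) - delta_m * m)
        p += dt * (beta * m - delta_p * p)
        t += dt
        out.append((t, m, p))
    return out
```

With constant τ the trajectory relaxes to the steady state M* = τ/δ_M, P* = βM*/δ_P, which is why accurate degradation rates are the minimum prior information needed to invert the map.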
Lindner, Michael; Vicente, Raul; Priesemann, Viola; Wibral, Michael
2011-11-18
Transfer entropy (TE) is a measure for the detection of directed interactions. Transfer entropy is an information theoretic implementation of Wiener's principle of observational causality. It offers an approach to the detection of neuronal interactions that is free of an explicit model of the interactions. Hence, it offers the power to analyze linear and nonlinear interactions alike. This allows for example the comprehensive analysis of directed interactions in neural networks at various levels of description. Here we present the open-source MATLAB toolbox TRENTOOL that allows the user to handle the considerable complexity of this measure and to validate the obtained results using non-parametrical statistical testing. We demonstrate the use of the toolbox and the performance of the algorithm on simulated data with nonlinear (quadratic) coupling and on local field potentials (LFP) recorded from the retina and the optic tectum of the turtle (Pseudemys scripta elegans) where a neuronal one-way connection is likely present. In simulated data TE detected information flow in the simulated direction reliably with false positives not exceeding the rates expected under the null hypothesis. In the LFP data we found directed interactions from the retina to the tectum, despite the complicated signal transformations between these stages. No false positive interactions in the reverse directions were detected. TRENTOOL is an implementation of transfer entropy and mutual information analysis that aims to support the user in the application of this information theoretic measure. TRENTOOL is implemented as a MATLAB toolbox and available under an open source license (GPL v3). For the use with neural data TRENTOOL seamlessly integrates with the popular FieldTrip toolbox.
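TRENTOOL's estimators are considerably more sophisticated (embedding optimisation, nearest-neighbour estimators, permutation statistics), but the definition of transfer entropy itself is compact. A plug-in sketch for discrete-valued series with history length 1, TE(X→Y) = Σ p(y₁,y₀,x₀) log₂[p(y₁|y₀,x₀)/p(y₁|y₀)]:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y), history length 1, for two
    equal-length discrete sequences. Measures how much knowing x(t)
    improves prediction of y(t+1) beyond knowing y(t)."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_next, y_now, x_now)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te
```

For a target that simply copies the source with a one-step delay, TE in the true direction approaches the source entropy while the reverse direction stays near zero, mirroring the one-way retina-to-tectum result.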
Hilbink, Mirrian A H W; Ouwens, Marielle M T J; Burgers, Jako S; Kool, Rudolf B
2014-03-19
In the last decade, guideline organizations faced a number of problems, including a lack of standardization in guideline development methods and suboptimal guideline implementation. To contribute to the solution of these problems, we produced a toolbox for guideline development, implementation, revision, and evaluation. All relevant guideline organizations in the Netherlands were approached to prioritize the topics. We sent out a questionnaire and discussed the results at an invitational conference. Based on consensus, twelve topics were selected for the development of new tools. Subsequently, working groups were composed for the development of the tools. After development of the tools, their draft versions were pilot tested in 40 guideline projects. Based on the results of the pilot tests, the tools were refined and their final versions were presented. The vast majority of organizations involved in pilot testing of the tools reported satisfaction with using the tools. Guideline experts involved in pilot testing of the tools proposed a variety of suggestions for the implementation of the tools. The tools are available in Dutch and in English at a web-based platform on guideline development and implementation (http://www.ha-ring.nl). A collaborative approach was used for the development and evaluation of a toolbox for development, implementation, revision, and evaluation of guidelines. This approach yielded a potentially powerful toolbox for improving the quality and implementation of Dutch clinical guidelines. Collaboration between guideline organizations within this project led to stronger linkages, which is useful for enhancing coordination of guideline development and implementation and preventing duplication of efforts. Use of the toolbox could improve quality standards in the Netherlands, and might facilitate the development of high-quality guidelines in other countries as well.
Predictive Mining of Time Series Data
NASA Astrophysics Data System (ADS)
Java, A.; Perlman, E. S.
2002-05-01
All-sky monitors are a relatively new development in astronomy, and their data represent a largely untapped resource. Proper utilization of this resource could lead to important discoveries not only in the physics of variable objects, but in how one observes such objects. We discuss the development of a Java toolbox for astronomical time series data. Rather than using methods conventional in astronomy (e.g., power spectrum and cross-correlation analysis), we employ rule discovery techniques commonly used in analyzing stock-market data. By clustering patterns found within the data, rule discovery allows one to build predictive models that forecast when a given event might occur or whether the occurrence of one event will trigger a second. We have tested the toolbox and accompanying display tool on datasets (representing several classes of objects) from the RXTE All Sky Monitor. We use these datasets to illustrate the methods and functionality of the toolbox. We have found predictive patterns in several ASM datasets. We also discuss problems faced in the development process, particularly the difficulties of dealing with discretized and irregularly sampled data. A possible application would be in scheduling target-of-opportunity observations, where the astronomer wants to observe an object when a certain event or series of events occurs. By combining such a toolbox with an automatic Java query tool which regularly gathers data on objects of interest, the astronomer or telescope operator could use the real-time datastream to efficiently predict the occurrence of (for example) a flare or other event. By combining the toolbox with dynamic time warping data-mining tools, one could predict events which may happen on variable time scales.
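The rule-discovery step can be illustrated on a discretised (symbolised) light curve: count how often a candidate antecedent pattern is followed by the consequent event within some horizon. This is a toy sketch of the concept, not the authors' actual stock-market-style algorithm, and the symbols are assumed to come from a prior discretisation step.

```python
def rule_confidence(series, antecedent, consequent, horizon):
    """Confidence of the rule 'after pattern `antecedent`, symbol
    `consequent` occurs within `horizon` steps' on a discretised series."""
    n = len(antecedent)
    matches = fires = 0
    for i in range(len(series) - n - horizon + 1):
        if tuple(series[i:i + n]) == tuple(antecedent):
            matches += 1
            if consequent in series[i + n : i + n + horizon]:
                fires += 1
    return fires / matches if matches else 0.0
```

High-confidence, high-support rules are exactly what would let a scheduler trigger a target-of-opportunity observation when the antecedent pattern appears in the real-time datastream.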
Improving Vector Evaluated Particle Swarm Optimisation by Incorporating Nondominated Solutions
Lim, Kian Sheng; Ibrahim, Zuwairie; Buyamin, Salinda; Ahmad, Anita; Naim, Faradila; Ghazali, Kamarul Hawari; Mokhtar, Norrima
2013-01-01
The Vector Evaluated Particle Swarm Optimisation algorithm is widely used to solve multiobjective optimisation problems. This algorithm optimises one objective using a swarm of particles whose movements are guided by the best solution found by another swarm. However, the best solution of a swarm is only updated when a newly generated solution has better fitness than the best solution at the objective function optimised by that swarm, yielding poor solutions for multiobjective optimisation problems. Thus, an improved Vector Evaluated Particle Swarm Optimisation algorithm is introduced that uses the nondominated solutions as the guidance for a swarm rather than the best solution from another swarm. In this paper, the performance of the improved Vector Evaluated Particle Swarm Optimisation algorithm is investigated using performance measures such as the number of nondominated solutions found, the generational distance, the spread, and the hypervolume. The results suggest that the improved Vector Evaluated Particle Swarm Optimisation algorithm has impressive performance compared with the conventional Vector Evaluated Particle Swarm Optimisation algorithm. PMID:23737718
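The nondominated-solution guidance at the heart of the improvement rests on a plain Pareto filter; a sketch of that concept under the minimisation convention (this is the building block, not the paper's full swarm algorithm):

```python
def dominates(a, b):
    """a dominates b (minimisation): no worse in every objective and
    strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Extract the nondominated set, i.e. the candidate guides used in
    place of a single per-swarm best solution."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]
```

Each swarm can then pick its guide from this set instead of trusting one other swarm's single best, which is what improves coverage of the Pareto front.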
NASA Astrophysics Data System (ADS)
Tsai, Jinn-Tsong; Chou, Ping-Yi; Chou, Jyh-Horng
2015-11-01
The aim of this study is to generate vector quantisation (VQ) codebooks by integrating the principal component analysis (PCA) algorithm, the Linde-Buzo-Gray (LBG) algorithm, and evolutionary algorithms (EAs). The EAs include genetic algorithm (GA), particle swarm optimisation (PSO), honey bee mating optimisation (HBMO), and firefly algorithm (FF). The study provides performance comparisons between PCA-EA-LBG and PCA-LBG-EA approaches. The PCA-EA-LBG approaches contain PCA-GA-LBG, PCA-PSO-LBG, PCA-HBMO-LBG, and PCA-FF-LBG, while the PCA-LBG-EA approaches contain PCA-LBG, PCA-LBG-GA, PCA-LBG-PSO, PCA-LBG-HBMO, and PCA-LBG-FF. All training vectors of the test images are grouped according to PCA. The PCA-EA-LBG approaches use the vectors grouped by PCA as initial individuals, and the best solution obtained by the EAs is passed to LBG to derive a codebook. The PCA-LBG approach uses PCA to select vectors as initial individuals for LBG to find a codebook. The PCA-LBG-EA approaches use the final result of PCA-LBG as an initial individual for the EAs to find a codebook. The search schemes in PCA-EA-LBG apply global search first and then local search, while those in PCA-LBG-EA apply local search first and then global search. The results verify that PCA-EA-LBG indeed gains superior results compared to PCA-LBG-EA, because PCA-EA-LBG explores a global area to find a solution and then exploits a better one from the local area of that solution. Furthermore, the proposed PCA-EA-LBG approaches to designing VQ codebooks outperform existing approaches in the literature.
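The LBG stage shared by all of these pipelines is a k-means-style alternation between nearest-codeword assignment and centroid update; one iteration, sketched in plain Python (the PCA and EA stages are omitted, and the example data in the test is illustrative):

```python
def lbg_step(vectors, codebook):
    """One LBG iteration: assign each training vector to its nearest
    codeword (squared Euclidean distance), then move each codeword to
    the centroid of its cell. Empty cells keep their codeword."""
    cells = [[] for _ in codebook]
    for v in vectors:
        i = min(range(len(codebook)),
                key=lambda k: sum((a - b) ** 2 for a, b in zip(v, codebook[k])))
        cells[i].append(v)
    return [
        [sum(col) / len(cell) for col in zip(*cell)] if cell else cw
        for cell, cw in zip(cells, codebook)
    ]
```

Iterating `lbg_step` to convergence from an EA-supplied starting codebook is exactly the local refinement the PCA-EA-LBG ordering exploits.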
Perestrelo, Rosa; Barros, António S; Rocha, Sílvia M; Câmara, José S
2011-09-15
The volatile organic compounds (VOCs) and semi-volatile organic compounds (SVOCs) responsible for aroma are mainly present in the skin of grape varieties. Thus, the present investigation is directed towards the optimisation of a solvent-free methodology based on headspace solid-phase microextraction (HS-SPME) combined with gas chromatography-quadrupole mass spectrometry (GC-qMS) in order to establish the global volatile composition in the pulp and skin of Bual and Bastardo Vitis vinifera L. varieties. A detailed study of the extraction-influencing parameters was performed, and the best results, expressed as GC peak area, number of identified compounds and reproducibility, were obtained using 4 g of sample homogenised in 5 mL of ultra-pure Milli-Q water in a 20 mL glass vial with addition of 2 g of sodium chloride (NaCl). A divinylbenzene/carboxen/polydimethylsiloxane fibre was selected for extraction at 60°C for 45 min under continuous stirring at 800 rpm. More than 100 VOCs and SVOCs, including 27 monoterpenoids, 27 sesquiterpenoids, 21 carbonyl compounds, 17 alcohols (of which 2 aromatic), 10 C(13) norisoprenoids and 5 acids were identified. The results showed that, for both grape varieties, the levels and number of volatiles in the skin were considerably higher than those observed in the pulp. The global volatile signature of the grape and the relationship between different parts of the grape (pulp and skin), established by principal component analysis (PCA), may be a useful tool to support winemakers' decisions in defining vinification procedures that improve the organoleptic characteristics of the corresponding wines, and may consequently contribute to economic valorisation and consumer acceptance. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei
2014-01-01
A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This paper describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this paper is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture. A model comparison was conducted by matching steady-state performance results from a T-MATS developed gas turbine simulation to a well-documented steady-state simulation. Transient modeling capabilities are then demonstrated when the steady-state T-MATS model is updated to run dynamically.
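T-MATS couples its component blocks through a numerical iterative solver that drives the balance equations' residuals to zero; the core idea in one dimension is a Newton iteration with a finite-difference derivative (a generic sketch of the solver concept, not T-MATS's actual multi-variable solver):

```python
def newton_solve(residual, x0, h=1e-6, tol=1e-9, max_iter=50):
    """Scalar Newton iteration with finite-difference derivative:
    repeatedly step x by -residual(x)/residual'(x) until the residual
    is within tol. Returns the last iterate."""
    x = x0
    for _ in range(max_iter):
        r = residual(x)
        if abs(r) < tol:
            return x
        d = (residual(x + h) - r) / h
        x -= r / d
    return x
```

In a steady-state engine model the unknown would be, say, a shaft speed and the residual a power imbalance between compressor and turbine.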
Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.
Dovlo, Edem; Baddour, Natalie
2015-01-01
The development of a symbolic computer algebra toolbox for the computation of two dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include:
• The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms.
• The modular approach, along with the lookup tables implemented, helps avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform.
• The concept also helps prevent unnecessary computation of already known transforms, thereby saving memory and processing time.
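The "combination of two significantly simpler transforms" can be sketched numerically: for a radially symmetric function, the angular part of the 2D polar Fourier transform is a trivial Fourier series, and what remains is an order-0 Hankel transform in the radius. The sketch below (our own quadrature code, not the toolbox's symbolic implementation) verifies the classic result that a Gaussian transforms to a Gaussian.

```python
import numpy as np

def trap(y, dx, axis=-1):
    """Trapezoidal rule on a uniform grid."""
    return (y.sum(axis=axis) - 0.5 * (y.take(0, axis=axis) + y.take(-1, axis=axis))) * dx

def bessel_j0(x, n=2000):
    """J0 via its integral representation (1/pi) * int_0^pi cos(x sin t) dt."""
    t = np.linspace(0.0, np.pi, n)
    return trap(np.cos(np.outer(x, np.sin(t))), t[1] - t[0], axis=1) / np.pi

def hankel0(f, k, r_max=10.0, n=4000):
    """Order-0 Hankel transform F(k) = int_0^inf f(r) J0(k r) r dr by quadrature."""
    r = np.linspace(0.0, r_max, n)
    fr = f(r) * r
    return np.array([trap(fr * bessel_j0(k_i * r), r[1] - r[0])
                     for k_i in np.atleast_1d(k)])

# A radially symmetric Gaussian exp(-r^2/2) transforms (up to convention)
# to another Gaussian: F(k) = exp(-k^2/2).
k = np.array([0.0, 1.0, 2.0])
F = hankel0(lambda r: np.exp(-r**2 / 2), k)
```

For a general (non-symmetric) function each angular Fourier coefficient is transformed with the Hankel transform of matching order, which is exactly the decomposition into two simpler transforms that the toolbox exploits symbolically.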
EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.
Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott
2011-01-01
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.
III. NIH Toolbox Cognition Battery (CB): measuring episodic memory.
Bauer, Patricia J; Dikmen, Sureyya S; Heaton, Robert K; Mungas, Dan; Slotkin, Jerry; Beaumont, Jennifer L
2013-08-01
One of the most significant domains of cognition is episodic memory, which allows for rapid acquisition and long-term storage of new information. For purposes of the NIH Toolbox, we devised a new test of episodic memory. The nonverbal NIH Toolbox Picture Sequence Memory Test (TPSMT) requires participants to reproduce the order of an arbitrarily ordered sequence of pictures presented on a computer. To adjust for ability, sequence length varies from 6 to 15 pictures. Multiple trials are administered to increase reliability. Pediatric data from the validation study revealed the TPSMT to be sensitive to age-related changes. The task also has high test-retest reliability and promising construct validity. Steps to further increase the sensitivity of the instrument to individual and age-related variability are described. © 2013 The Society for Research in Child Development, Inc.
Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka
2016-03-01
This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).
ERIC Educational Resources Information Center
Simpkins, Mary Ann; McNeill, Shane; Dieckman, Dale; Sissom, Mark; LoBianco, Judy; Lund, Jackie; Barney, David C.; Manson, Mara; Silva, Betsy
2009-01-01
NASPE's Teacher Toolbox is an instructional resource site which provides educators with a wide variety of teaching tools that focus on physical activity. This service is provided by NASPE to support instructional activities as well as promote quality programs. New monthly issues support NASPE's mission to enhance knowledge, improve professional…
Motion Simulation in the Environment for Auditory Research
2011-08-01
The excerpt cites the Spatial Audio Matlab Toolbox, Centre for Digital Music, Queen Mary University of London, 2009 (http://www.isophonics.net/content/spatial-audio-matlab-toolbox, accessed July 27). Student bio: I studied Music Technology at Northwestern University, graduating as valedictorian of the School of Music in 2008. In 2009, I was awarded the Gates Cambridge Scholarship to fund a postgraduate degree at the University of Cambridge.
2013-06-01
The toolbox benefits from rapid, automated discrimination of specific predefined signals, and is free-standing (requiring no other plugins or packages). It supports analysing a previously labeled dataset and comparing two labeled datasets. Subject terms: artifact, signal detection, EEG, MATLAB, toolbox.
Development of a Dependency Theory Toolbox for Database Design.
1987-12-01
A large body of published algorithms and theorems exists for designing and studying relational databases, but hand simulating these algorithms can be a tedious and error-prone chore. Therefore, a toolbox of these algorithms and theorems was developed to support database design.
Eastman, Kyler M; Huk, Alexander C
2012-01-01
Neurophysiological studies in awake, behaving primates (both human and non-human) have focused with increasing scrutiny on the temporal relationship between neural signals and behaviors. Consequently, laboratories are often faced with the problem of developing experimental equipment that can support data recording with high temporal precision and also be flexible enough to accommodate a wide variety of experimental paradigms. To this end, we have developed a MATLAB toolbox that integrates several modern pieces of equipment, but still grants experimenters the flexibility of a high-level programming language. Our toolbox takes advantage of three popular and powerful technologies: the Plexon apparatus for neurophysiological recordings (Plexon, Inc., Dallas, TX, USA), a Datapixx peripheral (Vpixx Technologies, Saint-Bruno, QC, Canada) for control of analog, digital, and video input-output signals, and the Psychtoolbox MATLAB toolbox for stimulus generation (Brainard, 1997; Pelli, 1997; Kleiner et al., 2007). The PLDAPS ("Platypus") system is designed to support the study of the visual systems of awake, behaving primates during multi-electrode neurophysiological recordings, but can be easily applied to other related domains. Despite its wide range of capabilities and support for cutting-edge video displays and neural recording systems, the PLDAPS system is simple enough for someone with basic MATLAB programming skills to design their own experiments.
ERPLAB: an open-source toolbox for the analysis of event-related potentials
Lopez-Calderon, Javier; Luck, Steven J.
2014-01-01
ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations. PMID:24782741
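The core operations described above (averaging epochs into ERPs and forming difference waves) can be sketched on synthetic data. This is illustrative only; ERPLAB's own routines operate on EEGLAB datasets, and the component shape, amplitudes and trial counts below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 500, 200
t = np.linspace(-0.2, 0.6, n_samples)                 # seconds relative to stimulus onset
component = 5e-6 * np.exp(-((t - 0.3) / 0.05) ** 2)   # synthetic 5 uV component at 300 ms

# Two conditions: "target" epochs contain the component, "standard" epochs do not.
targets = component + 5e-6 * rng.standard_normal((n_trials, n_samples))
standards = 5e-6 * rng.standard_normal((n_trials, n_samples))

erp_target = targets.mean(axis=0)     # averaging attenuates noise by sqrt(n_trials)
erp_standard = standards.mean(axis=0)
difference_wave = erp_target - erp_standard
peak_latency = t[np.argmax(difference_wave)]          # should recover roughly 0.3 s
```

The difference wave isolates condition-specific activity, which is why latency and amplitude measures in ERP work are so often quantified on difference waves rather than on the raw averages.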
Citizen Science Air Monitoring in the Ironbound Community ...
The Environmental Protection Agency’s (EPA) mission is to protect human health and the environment. To move toward achieving this goal, EPA is facilitating identification of potential environmental concerns, particularly in vulnerable communities. This includes actively supporting citizen science projects and providing communities with the information and assistance they need to conduct their own air pollution monitoring efforts. The Air Sensor Toolbox for Citizen Scientists was developed as a resource to meet stakeholder needs. Examples of materials developed for the Toolbox and ultimately pilot tested in the Ironbound Community in Newark, New Jersey are reported here. The Air Sensor Toolbox for Citizen Scientists is designed as an online resource that provides information and guidance on new, low-cost compact technologies used for measuring air quality. The Toolbox features resources developed by EPA researchers that can be used by citizens to effectively collect, analyze, interpret, and communicate air quality data. The resources include information about sampling methods, how to calibrate and validate monitors, options for measuring air quality, data interpretation guidelines, and low-cost sensor performance information. This Regional Applied Research Effort (RARE) project provided an opportunity for the Office of Research and Development (ORD) to work collaboratively with EPA Region 2 to provide the Ironbound Community with a “Toolbox” specific for c
Optics Program Simplifies Analysis and Design
NASA Technical Reports Server (NTRS)
2007-01-01
Engineers at Goddard Space Flight Center partnered with software experts at Mide Technology Corporation, of Medford, Massachusetts, through a Small Business Innovation Research (SBIR) contract to design the Disturbance-Optics-Controls-Structures (DOCS) Toolbox, a software suite for performing integrated modeling for multidisciplinary analysis and design. The DOCS Toolbox integrates various discipline models into a coupled process math model that can then predict system performance as a function of subsystem design parameters. The system can be optimized for performance; design parameters can be traded; parameter uncertainties can be propagated through the math model to develop error bounds on system predictions; and the model can be updated based on component, subsystem, or system level data. The Toolbox also allows the definition of process parameters as explicit functions of the coupled model and includes a number of functions that analyze the coupled system model and provide for redesign. The product is being sold commercially by Nightsky Systems Inc., of Raleigh, North Carolina, a spinoff company that was formed by Mide specifically to market the DOCS Toolbox. Commercial applications include use by any contractors developing large space-based optical systems, including Lockheed Martin Corporation, The Boeing Company, and Northrop Grumman Corporation, as well as companies providing technical audit services, like General Dynamics Corporation.
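The "parameter uncertainties propagated through the math model to develop error bounds" step can be sketched with standard first-order (linear) uncertainty propagation. The toy performance metric, parameter names and numbers below are invented, not taken from DOCS.

```python
import numpy as np

def performance(p):
    """Toy pointing-jitter metric as a function of design parameters p (made up)."""
    stiffness, damping, mass = p
    return 1.0 / (stiffness * damping) + 0.01 * mass

p0 = np.array([50.0, 0.02, 300.0])            # nominal design point
sigma_p = np.diag([2.0, 0.002, 10.0]) ** 2    # parameter covariance (uncorrelated)

# Finite-difference Jacobian of the metric with respect to each parameter.
h = 1e-6 * p0
J = np.array([(performance(p0 + h * e) - performance(p0)) / h[i]
              for i, e in enumerate(np.eye(3))])

var_y = J @ sigma_p @ J.T                     # propagated variance: J Sigma J^T
error_bound = 3.0 * np.sqrt(var_y)            # 3-sigma bound on the prediction
```

The same linearisation is what makes parameter trades cheap: once the Jacobian of the coupled model is known, the sensitivity of the performance prediction to every subsystem parameter is available at once.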
Wavefront Control Toolbox for James Webb Space Telescope Testbed
NASA Technical Reports Server (NTRS)
Shiri, Ron; Aronstein, David L.; Smith, Jeffery Scott; Dean, Bruce H.; Sabatke, Erin
2007-01-01
We have developed a Matlab toolbox for wavefront control of optical systems. We have applied this toolbox to the optical models of the James Webb Space Telescope (JWST) in general and to the JWST Testbed Telescope (TBT) in particular, implementing both unconstrained and constrained wavefront optimization to correct for possible misalignments present on the segmented primary mirror or the monolithic secondary mirror. The optical models are implemented in the Zemax optical design program, and information is exchanged between Matlab and Zemax via the Dynamic Data Exchange (DDE) interface. The model configuration is managed using the XML protocol. The optimization algorithm uses influence functions for each adjustable degree of freedom of the optical model. Iterative and non-iterative algorithms have been developed to converge to a local minimum of the root-mean-square (rms) wavefront error using a singular value decomposition of the control matrix of influence functions. The toolkit is highly modular and allows the user to choose control strategies for the degrees of freedom to be adjusted on a given iteration and a wavefront convergence criterion. As the influence functions are nonlinear over the control parameter space, the toolkit also allows for trade-offs between the frequency of updating the local influence functions and execution speed. The functionality of the toolbox and the validity of the underlying algorithms have been verified through extensive simulations.
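The SVD-based step, minimising the rms wavefront error through the control matrix of influence functions, is standard linear least squares and can be sketched on random data. The matrix sizes below are arbitrary; real influence functions come from the optical model, not from random numbers.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_dof = 500, 8        # wavefront samples, adjustable degrees of freedom

A = rng.standard_normal((n_pix, n_dof))   # columns = influence functions (toy data)
x_true = rng.standard_normal(n_dof)       # the misalignment we want to recover
w = A @ x_true                            # measured wavefront error

# SVD least squares: the commands that minimise the rms residual,
# truncating tiny singular values for numerical robustness.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
s_inv = np.where(s > 1e-8 * s[0], 1.0 / s, 0.0)
x_cmd = Vt.T @ (s_inv * (U.T @ w))

rms_before = np.sqrt(np.mean(w**2))
rms_after = np.sqrt(np.mean((w - A @ x_cmd)**2))
```

Because real influence functions are nonlinear in the control parameters, the linear solve above is applied iteratively, relinearising (recomputing A) as often as the speed/accuracy trade-off allows, which is exactly the trade-off the toolbox exposes.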
Harms and benefits from social imitation
NASA Astrophysics Data System (ADS)
Slanina, František
2001-10-01
We study the role of imitation within a model of economics with adaptive agents. The basic ingredients are those of the minority game. We add the possibility of local information exchange and imitation of a neighbour's strategy. Imitators pay a fee to the imitated. Connected groups are formed, which act as if they were single players. Coherent spatial areas of rich and poor agents result, leading to a decrease in local social tensions. The size and stability of these areas depend on the parameters of the model. Global performance, measured by the attendance volatility, is optimised at a certain value of the imitation probability. The social tensions are suppressed for large imitation probability, but, due to the price paid by the imitators, the requirements of high global effectivity and low social tensions are in conflict, as are the requirements of low global and low local wealth differences.
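The "basic ingredients of the minority game" referred to above can be sketched in a few lines: agents with a small memory each hold fixed random strategies, score them virtually, and play their best one; the volatility of the attendance is the global performance measure. This minimal version omits the paper's imitation, fee and spatial-network extensions, and all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
N, m, S, T = 101, 3, 2, 2000       # agents (odd), memory, strategies each, rounds
P = 2 ** m                          # number of distinct histories
strategies = rng.choice([-1, 1], size=(N, S, P))   # action table per history
scores = np.zeros((N, S))           # virtual scores of each strategy
history = 0
attendance = []

for _ in range(T):
    best = scores.argmax(axis=1)                        # each agent's best strategy
    actions = strategies[np.arange(N), best, history]
    A = actions.sum()                                   # attendance
    attendance.append(A)
    # Strategies that predicted the minority side (-sign(A)) gain a point.
    scores -= np.sign(A) * strategies[:, :, history]
    history = (history << 1 | (A < 0)) % P              # update binary history

volatility = np.var(attendance) / N                     # sigma^2 / N
```

In the paper's extension, neighbouring agents may copy each other's strategies for a fee, and the volatility is then studied as a function of the imitation probability.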
NASA Astrophysics Data System (ADS)
Fouladi, Ehsan; Mojallali, Hamed
2018-01-01
In this paper, an adaptive backstepping controller has been tuned to synchronise two chaotic Colpitts oscillators in a master-slave configuration. The parameters of the controller are determined using the shark smell optimisation (SSO) algorithm. Numerical results are presented and compared with those of the particle swarm optimisation (PSO) algorithm. Simulation results show better performance, in terms of accuracy and convergence, for the proposed optimised method compared to the PSO-optimised controller or a non-optimised backstepping controller.
The Visible Signature Modelling and Evaluation ToolBox
2008-12-01
Defence Science and Technology Organisation, DSTO-TR-2212. ABSTRACT: A new software suite, the Visible Signature ToolBox (VST), has been developed to model and evaluate the visible signatures of maritime platforms. The VST is a collection of commercial, off-the-shelf software and DSTO-developed programs and procedures. It can be utilised to model and assess the visible signatures of maritime platforms, and a number of examples are presented to demonstrate the suite.
ERIC Educational Resources Information Center
Godfroy-Genin, Anne-Sophie; Pinault, Cloe
2006-01-01
The main objective of the WomEng European research project was to assess when, how and why women decide to or not to study engineering. This question was addressed through an international cross-comparison by an interdisciplinary research team in seven European countries. This article presents, in the first part, the methodological toolbox…
ERIC Educational Resources Information Center
Fassi, Davide; Motter, Roberta
2014-01-01
This paper is a reflection on the use of public spaces in towns and the development of a system-events toolbox to activate them towards social cohesion. It is the result of a 1 year action research developed together with POLIMI DESIS Lab of the Department of Design to develop design solutions to open up the public spaces of the campus to the…
Evaluating 3D-printed biomaterials as scaffolds for vascularized bone tissue engineering.
Wang, Martha O; Vorwald, Charlotte E; Dreher, Maureen L; Mott, Eric J; Cheng, Ming-Huei; Cinar, Ali; Mehdizadeh, Hamidreza; Somo, Sami; Dean, David; Brey, Eric M; Fisher, John P
2015-01-07
There is an unmet need for a consistent set of tools for the evaluation of 3D-printed constructs. A toolbox developed to design, characterize, and evaluate 3D-printed poly(propylene fumarate) scaffolds is proposed for vascularized engineered tissues. This toolbox combines modular design and non-destructive fabricated design evaluation, evaluates biocompatibility and mechanical properties, and models angiogenesis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Social Network Mapping: A New Tool For The Leadership Toolbox
2002-04-01
By Elisabeth J. Strines, Colonel, USAF, Alexandria. The paper describes the concept of social network mapping and demonstrates how it can be used by squadron commanders and leaders at all levels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Markley, Andrew L.; Begemann, Matthew B.; Clarke, Ryan E.
2014-09-12
The application of synthetic biology requires characterized tools to precisely control gene expression. This toolbox of genetic parts previously did not exist for the industrially promising cyanobacterium, Synechococcus sp. strain PCC 7002. To address this gap, two orthogonal constitutive promoter libraries, one based on a cyanobacterial promoter and the other ported from Escherichia coli, were built and tested in PCC 7002. The libraries demonstrated 3 and 2.5 log dynamic ranges, respectively, but correlated poorly with E. coli expression levels. These promoter libraries were then combined to create and optimize a series of IPTG-inducible cassettes. The resultant induction system had a 48-fold dynamic range and was shown to outperform Ptrc constructs. Finally, an RBS library was designed and tested in PCC 7002. The presented synthetic biology toolbox will enable accelerated engineering of PCC 7002.
GUIDANCE DOCUMENT ON IMPLEMENTATION OF THE ...
The Agreement in Principle for the Stage 2 M-DBP Federal Advisory Committee contains a list of treatment processes and management practices for water systems to use in meeting additional Cryptosporidium treatment requirements under the LT2ESWTR. This list, termed the microbial toolbox, includes watershed control programs, alternative intake locations, pretreatment processes, additional filtration barriers, inactivation technologies, and enhanced plant performance. The intent of the microbial toolbox is to provide water systems with broad flexibility in selecting cost-effective LT2ESWTR compliance strategies. Moreover, the toolbox allows systems that currently provide additional pathogen barriers or that can demonstrate enhanced performance to receive additional Cryptosporidium treatment credit. The document provides guidance to utilities with surface water supplies and to state drinking water programs on the use of different treatment technologies to reduce the level of Cryptosporidium in drinking water. Technologies included in the guidance manual may be used to achieve compliance with the requirements of the LT2ESWTR.
ΔΔPT: a comprehensive toolbox for the analysis of protein motion
2013-01-01
Background Normal Mode Analysis is one of the most successful techniques for studying motions in proteins and macromolecules. It can provide information on the mechanism of protein functions, be used to aid crystallography and NMR data reconstruction, and calculate protein free energies. Results ΔΔPT is a toolbox allowing calculation of elastic network models and principal component analysis. It allows the analysis of PDB files or trajectories taken from Gromacs, Amber, and DL_POLY. As well as calculating the normal modes, it also allows comparison of the modes with experimental protein motion, variation of modes with mutation or ligand binding, and calculation of molecular dynamics entropies. Conclusions This toolbox makes the respective tools available to a wide community of potential NMA users, and allows them unrivalled ability to analyse normal modes using a variety of techniques and current software. PMID:23758746
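The elastic-network calculation at the heart of such toolboxes can be sketched with the simplest variant, a Gaussian network model: build the Kirchhoff (contact) matrix from C-alpha positions and diagonalise it. The synthetic chain-like coordinates and the 8 Å cutoff below are illustrative stand-ins; ΔΔPT builds its models from real PDB files or trajectories.

```python
import numpy as np

# Fake, chain-like C-alpha trace: a random walk with ~3.8 A steps, which
# guarantees a connected contact graph at an 8 A cutoff.
rng = np.random.default_rng(3)
steps = rng.standard_normal((30, 3))
steps = 3.8 * steps / np.linalg.norm(steps, axis=1, keepdims=True)
coords = np.cumsum(steps, axis=0)
cutoff = 8.0

# Kirchhoff (connectivity) matrix: -1 for residue pairs within the cutoff,
# diagonal entries equal to each residue's contact count.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
contact = (d < cutoff) & (d > 0)
kirchhoff = -contact.astype(float)
np.fill_diagonal(kirchhoff, contact.sum(axis=1))

eigvals, eigvecs = np.linalg.eigh(kirchhoff)
# The first eigenvalue is ~0 (rigid-body mode); the next few low-frequency
# eigenvectors are the slow, collective modes.
slow_modes = eigvecs[:, 1:4]
msf = (eigvecs[:, 1:] ** 2 / eigvals[1:]).sum(axis=1)  # relative mean-square fluctuations
```

An anisotropic network model replaces the N x N Kirchhoff matrix with a 3N x 3N Hessian and yields directional modes; the diagonalise-and-discard-rigid-body-modes workflow is the same.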
OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.
Purvis, Lucian A B; Clarke, William T; Biasiolli, Luca; Valkovič, Ladislav; Robson, Matthew D; Rodgers, Christopher T
2017-01-01
In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or for specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.
The laboratory test utilization management toolbox
Baird, Geoffrey
2014-01-01
Efficiently managing laboratory test utilization requires both ensuring adequate utilization of needed tests in some patients and discouraging superfluous tests in other patients. After the difficult clinical decision is made to define the patients that do and do not need a test, a wealth of interventions are available to the clinician and laboratorian to help guide appropriate utilization. These interventions are collectively referred to here as the utilization management toolbox. Experience has shown that some tools in the toolbox are weak and others are strong, and that tools are most effective when many are used simultaneously. While the outcomes of utilization management studies are not always as concrete as may be desired, the data available in the literature indicate that strong utilization management interventions are safe and effective measures to improve patient health and reduce waste in an era of increasing financial pressure. PMID:24969916
NASA Astrophysics Data System (ADS)
Ridgeway, William K.; Millar, David P.; Williamson, James R.
2013-04-01
Fluorescence Correlation Spectroscopy (FCS) is widely used to quantify reaction rates and concentrations of molecules in vitro and in vivo. We recently reported Fluorescence Triple Correlation Spectroscopy (F3CS), which correlates three signals together instead of two. F3CS can analyze the stoichiometries of complex mixtures and detect irreversible processes by identifying time-reversal asymmetries. Here we report the computational developments that were required for the realization of F3CS and present the results as the Triple Correlation Toolbox suite of programs. Triple Correlation Toolbox is a complete data analysis pipeline capable of acquiring, correlating and fitting large data sets. Each segment of the pipeline handles error estimates for accurate error-weighted global fitting. Data acquisition was accelerated with a combination of off-the-shelf counter-timer chips and vectorized operations on 128-bit registers. This allows desktop computers with inexpensive data acquisition cards to acquire hours of multiple-channel data with sub-microsecond time resolution. Off-line correlation integrals were implemented as a two delay time multiple-tau scheme that scales efficiently with multiple processors and provides an unprecedented view of linked dynamics. Global fitting routines are provided to fit FCS and F3CS data to models containing up to ten species. Triple Correlation Toolbox is a complete package that enables F3CS to be performed on existing microscopes.
Catalogue identifier: AEOP_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOP_v1_0.html
Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 50189
No. of bytes in distributed program, including test data, etc.: 6135283
Distribution format: tar.gz
Programming language: C/Assembly
Computer: Any with GCC and library support
Operating system: Linux and OS X (data acquisition for Linux only due to library availability); not tested on Windows
RAM: ≥512 MB
Classification: 16.4
External routines: NIDAQmx (National Instruments), GNU Scientific Library, GTK+, PLplot (optional)
Nature of problem: Fluorescence Triple Correlation Spectroscopy required three things: data acquisition at faster speeds than were possible without expensive custom hardware, triple-correlation routines that could process 1/2 TB data sets rapidly, and fitting routines capable of handling several to a hundred fit parameters and 14,000+ data points, each with error estimates.
Solution method: A novel data acquisition concept mixed signal processing with off-the-shelf hardware and data-parallel processing using 128-bit registers found in desktop CPUs. Correlation algorithms used fractal data structures and multithreading to reduce data analysis times. Global fitting was implemented with robust minimization routines and provides feedback that allows the user to critically inspect initial guesses and fits.
Restrictions: Data acquisition requires only a National Instruments data acquisition card (tested on Linux using card PCIe-6251) and a simple home-built circuit.
Unusual features: Hand-coded x86-64 assembly for data acquisition loops (platform-independent C code also provided).
Additional comments: A complete collection of tools to perform Fluorescence Triple Correlation Spectroscopy, from data acquisition to two-tau correlation of large data sets, to model fitting.
Running time: 1-5 h of data analysis per hour of data collected. Varies depending on data-acquisition length, time resolution, data density and the number of cores used for correlation integrals.
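The quantity being computed, a correlation over two delay times, can be sketched with a naive (non multiple-tau) estimator. The Triple Correlation Toolbox's optimised C/assembly routines compute the same statistic over vastly larger data; the Poisson test signal below simply mimics uncorrelated photon counts.

```python
import numpy as np

def triple_correlation(x, max_lag):
    """Naive estimator of G(tau1, tau2) = <x(t) x(t+tau1) x(t+tau2)> (unnormalised)."""
    n = len(x) - max_lag
    G = np.empty((max_lag + 1, max_lag + 1))
    for t1 in range(max_lag + 1):
        for t2 in range(max_lag + 1):
            G[t1, t2] = np.mean(x[:n] * x[t1:t1 + n] * x[t2:t2 + n])
    return G

# Uncorrelated Poisson "photon counts": G(0,0) -> E[x^3], and for distinct
# lags G approaches E[x]^3 since the samples are independent.
rng = np.random.default_rng(4)
x = rng.poisson(5.0, size=20000).astype(float)
G = triple_correlation(x, max_lag=4)
```

A multiple-tau scheme replaces this dense double loop with progressively coarser time bins, which is what makes half-terabyte data sets tractable.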
NASA Astrophysics Data System (ADS)
Hadade, Ioan; di Mare, Luca
2016-08-01
Modern multicore and manycore processors exhibit multiple levels of parallelism through a wide range of architectural features such as SIMD for data-parallel execution or threads for core parallelism. The exploitation of multi-level parallelism is therefore crucial for achieving superior performance on current and future processors. This paper presents the performance tuning of a multiblock CFD solver on Intel Sandy Bridge and Haswell multicore CPUs and the Intel Xeon Phi Knights Corner coprocessor. Code optimisations have been applied to two computational kernels exhibiting different computational patterns: the update of flow variables and the evaluation of the Roe numerical fluxes. We discuss in detail the code transformations required for achieving efficient SIMD computations for both kernels across the selected devices, including SIMD shuffles and transpositions for flux stencil computations and global memory transformations. Core parallelism is expressed through threading based on a number of domain decomposition techniques, together with optimisations for alleviating the NUMA effects found in multi-socket compute nodes. Results are correlated with the Roofline performance model in order to assess their efficiency for each distinct architecture. We report significant speedups for single-thread execution across both kernels: 2-5X on the multicore CPUs and 14-23X on the Xeon Phi coprocessor. Computations at full node and chip concurrency deliver a factor-of-three speedup on the multicore processors and up to 24X on the Xeon Phi manycore coprocessor.
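The Roofline model used above bounds a kernel's attainable throughput by the lesser of the machine's peak compute rate and the product of memory bandwidth and the kernel's arithmetic intensity. A minimal sketch, with illustrative (not measured) machine numbers:

```python
def roofline_gflops(peak_gflops, peak_bw_gbs, intensity):
    """Attainable performance (GFLOP/s) under the Roofline model for a
    kernel with `intensity` FLOPs per byte of memory traffic, on a
    machine with the given compute and bandwidth ceilings."""
    return min(peak_gflops, peak_bw_gbs * intensity)
```

For example, a stencil kernel at 0.5 FLOPs/byte on a hypothetical 100 GB/s, 500 GFLOP/s node is bandwidth-bound at 50 GFLOP/s; only above 5 FLOPs/byte does it become compute-bound.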
Wave data processing toolbox manual
Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul
2006-01-01
Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP(tm)), the Sontek Argonaut, and the Nortek Aquadopp(tm) Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive, and further analysis of the data becomes cumbersome. More importantly, these files alone do not include sufficient information pertinent to the deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox aggregates the processed files output by the proprietary software into two NetCDF files: one containing the statistics of the burst data and the other containing the raw burst data (additional details are described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format is easy to disseminate, is portable to any computer platform, and is viewable with freely available public-domain software.
Another important advantage is that a metadata structure is embedded with the data to document pertinent information regarding the deployment and the parameters used to process the data. Using this format ensures that the relevant information about how the data was collected and converted to physical units is maintained with the actual data. EPIC-standard variable names have been utilized where appropriate. These standards, developed by the NOAA Pacific Marine Environmental Laboratory (PMEL) (http://www.pmel.noaa.gov/epic/), provide a universal vernacular allowing researchers to share data without translation.
Yu, Teresa; Korgaonkar, Mayuresh S; Grieve, Stuart M
2017-04-01
This study examined patterns of cerebellar volumetric gray matter (GM) loss across the adult lifespan in a large cross-sectional sample. Four hundred and seventy-nine healthy participants (age range: 7-86 years) were drawn from the Brain Resource International Database who provided T1-weighted MRI scans. The spatially unbiased infratentorial template (SUIT) toolbox in SPM8 was used for normalisation of the cerebellum structures. Global volumetric and voxel-based morphometry analyses were performed to evaluate age-associated trends and gender-specific age-patterns. Global cerebellar GM shows a cross-sectional reduction with advancing age of 2.5 % per decade-approximately half the rate seen in the whole brain. The male cerebellum is larger with a lower percentage of GM, however, after controlling for total brain volume, no gender difference was detected. Analysis of age-related changes in GM volume revealed large bilateral clusters involving the vermis and cerebellar crus where regional loss occurred at nearly twice the average cerebellar rate. No gender-specific patterns were detected. These data confirm that regionally specific GM loss occurs in the cerebellum with age, and form a solid base for further investigation to find functional correlates for this global and focal loss.
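The cross-sectional loss rate quoted above (percent of GM per decade) can be expressed as a linear fit of volume against age, normalised to the sample mean volume. A hedged sketch under that assumption (the study's exact normalisation may differ; the data below are synthetic):

```python
def percent_loss_per_decade(ages, volumes):
    """Ordinary-least-squares slope of volume vs. age, expressed as the
    percentage of the mean volume lost per 10 years of age."""
    n = len(ages)
    mean_age, mean_vol = sum(ages) / n, sum(volumes) / n
    slope = (sum((a - mean_age) * (v - mean_vol) for a, v in zip(ages, volumes))
             / sum((a - mean_age) ** 2 for a in ages))
    return -100.0 * slope * 10.0 / mean_vol
```

A positive return value indicates volume decline with age; applied per voxel or per region, the same arithmetic distinguishes focal loss from the global cerebellar average.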
Smaggus, Andrew; Mrkobrada, Marko; Marson, Alanna; Appleton, Andrew
2018-01-01
The quality and safety movement has reinvigorated interest in optimising morbidity and mortality (M&M) rounds. We performed a systematic review to identify effective means of updating M&M rounds to (1) identify and address quality and safety issues, and (2) address contemporary educational goals. Relevant databases (Medline, Embase, PubMed, Education Resource Information Centre, Cumulative Index to Nursing and Allied Health Literature, Healthstar, and Global Health) were searched to identify primary sources. Studies were included if they (1) investigated an intervention applied to M&M rounds, (2) reported outcomes relevant to the identification of quality and safety issues, or educational outcomes relevant to quality improvement (QI), patient safety or general medical education and (3) included a control group. Study quality was assessed using the Medical Education Research Study Quality Instrument and Newcastle-Ottawa Scale-Education instruments. Given the heterogeneity of interventions and outcome measures, results were analysed thematically. The final analysis included 19 studies. We identified multiple effective strategies (updating objectives, standardising elements of rounds and attaching rounds to a formal quality committee) to optimise M&M rounds for a QI/safety purpose. These efforts were associated with successful integration of quality and safety content into rounds, and increased implementation of QI interventions. Consistent effects on educational outcomes were difficult to identify, likely due to the use of methodologies ill-fitted for educational research. These results are encouraging for those seeking to optimise the quality and safety mission of M&M rounds. However, the inability to identify consistent educational effects suggests the investigation of M&M rounds could benefit from additional methodologies (qualitative, mixed methods) in order to understand the complex mechanisms driving learning at M&M rounds. 
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
ESA's Multi-mission Sentinel-1 Toolbox
NASA Astrophysics Data System (ADS)
Veci, Luis; Lu, Jun; Foumelis, Michael; Engdahl, Marcus
2017-04-01
The Sentinel-1 Toolbox is new open-source software for scientific learning, research and exploitation of the large archives of Sentinel and heritage missions. The Toolbox is based on the proven BEAM/NEST architecture, inheriting all current NEST functionality including multi-mission support for most civilian satellite SAR missions. The project is funded through ESA's Scientific Exploitation of Operational Missions (SEOM). The Sentinel-1 Toolbox will strive to serve the SEOM mandate by providing leading-edge software to science and application users in support of ESA's operational SAR mission, as well as by educating and growing a SAR user community. The Toolbox consists of a collection of processing tools, data product readers and writers, and a display and analysis application. A common architecture for all Sentinel Toolboxes, called the Sentinel Application Platform (SNAP), is being jointly developed by Brockmann Consult, Array Systems Computing and C-S. The SNAP architecture is ideal for Earth Observation processing and analysis due to the following technological innovations: Extensibility, Portability, Modular Rich Client Platform, Generic EO Data Abstraction, Tiled Memory Management, and a Graph Processing Framework. The project has developed new tools for working with Sentinel-1 data, in particular for working with the new Interferometric TOPSAR mode. TOPSAR Complex Coregistration and a complete interferometric processing chain have been implemented for Sentinel-1 TOPSAR data. To accomplish this, a coregistration following the Spectral Diversity [4] method has been developed, as well as special azimuth handling in the coherence, interferogram and spectral filter operators. The Toolbox includes reading of L0, L1 and L2 products in SAFE format, calibration and de-noising, slice product assembling, TOPSAR deburst and sub-swath merging, terrain flattening radiometric normalization, and visualization for L2 OCN products.
The Toolbox also provides several new tools for exploitation of polarimetric data, including speckle filters, decompositions, and classifiers. The Toolbox will also include tools for large data stacks, supervised and unsupervised classification, improved vector handling and change detection. Architectural improvements such as smart memory configuration, task queuing, and optimizations for complex data will provide better support and performance for very large products and stacks. In addition, a Cloud Exploitation Platform Extension (CEP) has been developed to add the capability to smoothly utilize a cloud computing platform where EO data repositories and high-performance processing capabilities are available. The extension to the SENTINEL Application Platform would facilitate entry into cloud processing services for supporting bulk processing on high-performance clusters. Since December 2016, the COMET-LiCS InSAR portal (http://comet.nerc.ac.uk/COMET-LiCS-portal/) has been live, delivering interferograms and coherence estimates over the entire Alpine-Himalayan belt. The portal already contains tens of thousands of products, which can be browsed in a user-friendly portal, and downloaded for free by the general public. For our processing, we use the facilities at the Climate and Environmental Monitoring from Space (CEMS). Here we have large storage and processing facilities at our disposal, and a complete duplicate of the Sentinel-1 archive is maintained. This greatly simplifies the infrastructure we had to develop for automated processing of large areas. Here we will give an overview of the current status of the processing system, as well as discuss future plans. We will cover the infrastructure we developed to automatically produce interferograms and its challenges, and the processing strategy for time series analysis. We will outline the objectives of the system in the near and distant future, and a roadmap for its continued development.
Finally, we will highlight some of the scientific results and projects linked to the system.
Proba-V Mission Exploitation Platform
NASA Astrophysics Data System (ADS)
Goor, Erwin; Dries, Jeroen
2017-04-01
VITO and partners developed the Proba-V Mission Exploitation Platform (MEP) as an end-to-end solution to drastically improve the exploitation of the Proba-V (a Copernicus contributing mission) EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION, and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1 PB) is addressed, as well as the large-scale on-demand processing of near-real-time data on a powerful and scalable processing environment. Furthermore, data from the Copernicus Global Land Service is in scope of the platform. From November 2015, an operational Proba-V MEP environment, as an ESA operations service, was gradually deployed at the VITO data center with direct access to the complete data archive. Since autumn 2016, the platform has been operational, and several applications have already been released to the users, e.g.: - A time series viewer, showing the evolution of Proba-V bands and derived vegetation parameters from the Copernicus Global Land Service for any area of interest. - Full-resolution viewing services for the complete data archive. - On-demand processing chains on a powerful Hadoop/Spark backend, e.g. for the calculation of N-daily composites. - Virtual Machines can be provided with access to the data archive and tools to work with this data, e.g. various toolboxes (GDAL, QGIS, GRASS GIS, SNAP toolbox, …) and support for R and Python. This allows users to work with the data immediately, without having to install tools or download data, as well as to design, debug and test applications on the platform. - A prototype of Jupyter Notebooks is available, with worked examples that show the potential of the data. Today the platform is used by several third-party projects to perform R&D activities on the data and to develop/host data analysis toolboxes. In parallel, the platform is being further improved and extended.
From the Proba-V MEP, access to Sentinel-2 and Landsat data will soon be available as well. Users can make use of powerful Web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used and a distributed processing environment has been built based on Hadoop. The Hadoop ecosystem offers a lot of technologies (Spark, Yarn, Accumulo, etc.) which we integrate with several open-source components (e.g. GeoTrellis). The impact of this MEP on the user community will be high: it will completely change the way of working with the data and hence open these large time series to a larger community of users. The presentation will address these benefits for the users and discuss the technical challenges in implementing this MEP. Furthermore, demonstrations will be given. Platform URL: https://proba-v-mep.esa.int/
Visualizing flow fields using acoustic Doppler current profilers and the Velocity Mapping Toolbox
Jackson, P. Ryan
2013-01-01
The purpose of this fact sheet is to provide examples of how the U.S. Geological Survey is using acoustic Doppler current profilers for much more than routine discharge measurements. These instruments are capable of mapping complex three-dimensional flow fields within rivers, lakes, and estuaries. Using the Velocity Mapping Toolbox to process the ADCP data allows detailed visualization of the data, providing valuable information for a range of studies and applications.
Gershon, Richard C; Slotkin, Jerry; Manly, Jennifer J; Blitz, David L; Beaumont, Jennifer L; Schnipke, Deborah; Wallner-Allen, Kathleen; Golinkoff, Roberta Michnick; Gleason, Jean Berko; Hirsh-Pasek, Kathy; Adams, Marilyn Jager; Weintraub, Sandra
2013-08-01
Mastery of language skills is an important predictor of daily functioning and health. Vocabulary comprehension and reading decoding are relatively quick and easy to measure and correlate highly with overall cognitive functioning, as well as with success in school and work. New measures of vocabulary comprehension and reading decoding (in both English and Spanish) were developed for the NIH Toolbox Cognition Battery (CB). In the Toolbox Picture Vocabulary Test (TPVT), participants hear a spoken word while viewing four pictures, and then must choose the picture that best represents the word. This approach tests receptive vocabulary knowledge without the need to read or write, removing the literacy load for children who are developing literacy and for adults who struggle with reading and writing. In the Toolbox Oral Reading Recognition Test (TORRT), participants see a letter or word onscreen and must pronounce or identify it. The examiner determines whether it was pronounced correctly by comparing the response to the pronunciation guide on a separate computer screen. In this chapter, we discuss the importance of language during childhood and the relation of language and brain function. We also review the development of the TPVT and TORRT, including information about the item calibration process and results from a validation study. Finally, the strengths and weaknesses of the measures are discussed. © 2013 The Society for Research in Child Development, Inc.
The iRoCS Toolbox--3D analysis of the plant root apical meristem at cellular resolution.
Schmidt, Thorsten; Pasternak, Taras; Liu, Kun; Blein, Thomas; Aubry-Hivet, Dorothée; Dovzhenko, Alexander; Duerr, Jasmin; Teale, William; Ditengou, Franck A; Burkhardt, Hans; Ronneberger, Olaf; Palme, Klaus
2014-03-01
To achieve a detailed understanding of processes in biological systems, cellular features must be quantified in the three-dimensional (3D) context of cells and organs. We describe the use of the intrinsic root coordinate system (iRoCS) as a reference model for the root apical meristem of plants. iRoCS enables direct and quantitative comparison between the root tips of plant populations at single-cell resolution. The iRoCS Toolbox automatically fits standardized coordinates to raw 3D image data. It detects nuclei or segments cells, automatically fits the coordinate system, and groups the nuclei/cells into the root's tissue layers. The division status of each nucleus may also be determined. The only manual step required is to mark the quiescent centre. All intermediate outputs may be refined if necessary. The ability to learn the visual appearance of nuclei by example allows the iRoCS Toolbox to be easily adapted to various phenotypes. The iRoCS Toolbox is provided as an open-source software package, licensed under the GNU General Public License, to make it accessible to a broad community. To demonstrate the power of the technique, we measured subtle changes in cell division patterns caused by modified auxin flux within the Arabidopsis thaliana root apical meristem. © 2014 The Authors The Plant Journal © 2014 John Wiley & Sons Ltd.
Harden, J Taylor; Silverberg, Nina
2010-01-01
The ability to locate the right research tool at the right time for recruitment and retention of minority and health disparity populations is a challenge. This article provides an introduction to a number of recruitment and retention tools in a National Institute on Aging Health Disparities Toolbox and to this special edition on challenges and opportunities in recruitment and retention of minority populations in Alzheimer disease and dementia research. The Health Disparities Toolbox and Health Disparities Resource Persons Network are described along with other more established resource tools including the Alzheimer Disease Center Education Cores, Alzheimer Disease Education and Referral Center, and Resource Centers for Minority Aging Research. Nine featured articles are introduced. The articles address a range of concerns including what we know and do not know, conceptual and theoretical perspectives framing issues of diversity and inclusion, success as a result of sustained investment of time and community partnerships, the significant issue of mistrust, willingness to participate in research as a dynamic personal attribute, Helpline Service and the amount of resources required for success, assistance in working with Limited English Proficiency elders, and sage advice from social marketing and investigations of health literacy as a barrier to recruitment and retention. Finally, an appeal is made for scientists to share tools for the National Institute on Aging Health Disparity Toolbox and to join the Health Disparities Resource Persons Network.
Self-Efficacy Buffers the Relationship between Educational Disadvantage and Executive Functioning.
Zahodne, Laura B; Nowinski, Cindy J; Gershon, Richard C; Manly, Jennifer J
2015-04-01
Previous studies showed that control beliefs are more strongly related to global cognition and mortality among adults with low education, providing preliminary evidence that self-efficacy buffers against the negative impact of educational disadvantage on physical and cognitive health. The current study extends these findings to a nationally representative sample of men and women aged 30 to 85 and explores which cognitive domains are most strongly associated with self-efficacy, educational attainment, and their interaction. Data were obtained from 1032 adult (30-85) participants in the United States norming study for the NIH Toolbox. Self-efficacy, executive functioning, working memory, processing speed, episodic memory, and vocabulary were assessed with the NIH Toolbox. Multivariate analysis of covariance and follow-up regressions tested the hypothesis that self-efficacy would be more strongly related to cognitive performance among individuals with lower education, controlling for age, sex, race, ethnicity, education, reading level, testing language, and depressive symptoms. Higher education was associated with higher self-efficacy and better performance on all cognitive tests. Higher self-efficacy was associated with better set-switching and attention/inhibition. Significant self-efficacy by education interactions indicated that associations between self-efficacy and executive abilities were stronger for individuals with lower education. Specifically, individuals with low education but high self-efficacy performed similarly to individuals with high education. This study provides evidence that self-efficacy beliefs buffer against the negative effects of low educational attainment on executive functioning. These results have implications for future policy and/or intervention work aimed at reducing the deleterious effects of educational disadvantage on later cognitive health.
NASA Astrophysics Data System (ADS)
Krätli, Saverio; Kaufmann, Brigitte; Roba, Hassan; Hiernaux, Pierre; Li, Wenjun; Easdale, Marcos H.; Huelsebusch, Christian
2016-04-01
The theoretical understanding of drylands and pastoral systems has undergone a U-turn from the initial perspective rooted in classical ecology. The shift has hinged on the way asymmetric variability is represented: from a disturbance in an ecosystem that naturally tends towards uniformity and stability, to a constitutive part of a dynamic ecosystem. Operationalising the new, reversed perspective, including the need to update the methodological infrastructure used to plan for dryland and pastoral development, remains a challenge. Underlying assumptions about stability and uniformity, a legacy of equilibrium thinking, remain embedded in the toolbox of pastoral development, starting from the technical language used to talk about the subject. This effectively gets in the way of operationalising state-of-the-art understanding of pastoral systems and the drylands. Unless these barriers are identified, unpacked and managed, even the present calls for increasing the rigour and intensity of data collection - for example as part of the ongoing global process to revise and improve agricultural data - cannot deliver a realistic representation of pastoral systems in statistics and policy making. This contribution presents the case for understanding variability as an asset, and provides a range of examples of methodological barriers, including classifications of livestock systems, scale of observation, key parameters in animal production, indicators in the measurement of ecological efficiency, and concepts of ecological fragility, natural resources, and pastoral risk. The need to update this legacy is a pressing challenge for policy makers concerned with both modernisation and resilience in the drylands.
Optimisation of logistics processes of energy grass collection
NASA Astrophysics Data System (ADS)
Bányai, Tamás.
2010-05-01
The collection of energy grass is a logistics-intensive process [1]. The optimal design and control of transportation and collection subprocesses is a critical point of the supply chain. To avoid ill-founded decisions made purely on the basis of experience and intuition, the optimisation and analysis of collection processes using mathematical models and methods is the scientifically sound approach. Within the frame of this work, the author focuses on the optimisation possibilities of the collection processes, especially from the point of view of transportation and related warehousing operations. Although the optimisation methods developed in the literature [2] take into account the harvesting processes, county-specific yields, transportation distances, erosion constraints, machinery specifications, and other key variables, the possibility of multiple collection points and multi-level collection was not taken into consideration. The possible uses of energy grass are very wide (energy production, biogas and bio-alcohol production, the paper and textile industries, industrial fibre material, foddering purposes, biological soil protection [3], etc.), so not only a single-level but also a multi-level collection system with several collection and production facilities has to be taken into consideration. The input parameters of the optimisation problem are the following: total amount of energy grass to be harvested in each region; specific facility costs of collection, warehousing and production units; specific costs of transportation resources; pre-scheduling of the harvesting process; specific transportation and warehousing costs; pre-scheduling of the processing of energy grass at each facility (exclusive warehousing). The model takes into consideration the following assumptions: (1) cooperative relations exist among processing and production facilities; (2) capacity constraints are not ignored; (3) the cost function of transportation is non-linear; (4) the drivers' conditions are ignored.
The objective function of the optimisation is the maximisation of profit, i.e. the maximisation of the difference between revenue and cost. The objective function trades off the income of the assigned transportation demands against the logistics costs. The constraints are the following: (1) the free capacity of the assigned transportation resource is not less than the requested capacity of the transportation demand; (2) the calculated arrival time of the transportation resource at the harvesting place is not later than its requested arrival time; (3) the calculated arrival time of the transportation demand at the processing and production facility is not later than the requested arrival time; (4) one transportation demand is assigned to one transportation resource, and one resource is assigned to one transportation demand. The decision variables of the optimisation problem are the set of scheduling variables and the assignment of resources to transportation demands. The evaluation parameters of the optimised system are the following: total costs of the collection process; utilisation of transportation resources and warehouses; efficiency of production and/or processing facilities. The multidimensional heuristic optimisation method is based on a genetic algorithm, while the routing sequence is optimised with an ant colony algorithm: the optimal routes are calculated by the ant colony algorithm as a subroutine of the global optimisation method, and the optimal assignment is given by the genetic algorithm. An important part of the mathematical method is the sensitivity analysis of the objective function, which shows the influence of the different input parameters. Acknowledgements This research was implemented within the frame of the project entitled "Development and operation of the Technology and Knowledge Transfer Centre of the University of Miskolc"
with support by the European Union and co-funding of the European Social Fund. References [1] P. R. Daniel: The Economics of Harvesting and Transporting Corn Stover for Conversion to Fuel Ethanol: A Case Study for Minnesota. University of Minnesota, Department of Applied Economics. 2006. http://ideas.repec.org/p/ags/umaesp/14213.html [2] T. G. Douglas, J. Brendan, D. Erin & V.-D. Becca: Energy and Chemicals from Native Grasses: Production, Transportation and Processing Technologies Considered in the Northern Great Plains. University of Minnesota, Department of Applied Economics. 2006. http://ideas.repec.org/p/ags/umaesp/13838.html [3] Homepage of energygrass. www.energiafu.hu
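The assignment stage of the method above (a genetic algorithm maximising revenue minus a nonlinear transport cost under capacity constraints, with routing delegated to an ant colony subroutine) can be sketched without the routing layer. Everything below is an illustrative assumption, not the paper's implementation: the demand/resource encoding, the toy quadratic cost function, and all GA parameter values.

```python
import random

def profit(assign, demands, resources, cost_fn):
    """Revenue minus (nonlinear) transport cost; -inf if any resource
    capacity is exceeded. assign[i] is a resource index or None."""
    load = [0.0] * len(resources)
    total = 0.0
    for (need, revenue, dist), r in zip(demands, assign):
        if r is None:
            continue  # demand left unserved
        load[r] += need
        if load[r] > resources[r]:
            return float('-inf')  # infeasible assignment
        total += revenue - cost_fn(dist)
    return total

def ga_assign(demands, resources, cost_fn, pop=40, gens=60, seed=1):
    """Tiny elitist GA over demand-to-resource assignments."""
    rng = random.Random(seed)
    genes = [None] + list(range(len(resources)))
    fitness = lambda a: profit(a, demands, resources, cost_fn)
    population = [[rng.choice(genes) for _ in demands] for _ in range(pop)]
    for _ in range(gens):
        elite = sorted(population, key=fitness, reverse=True)[:pop // 4]
        population = list(elite)  # elitism: best individuals survive
        while len(population) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, len(demands)) if len(demands) > 1 else 0
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < 0.3:             # point mutation
                child[rng.randrange(len(demands))] = rng.choice(genes)
            population.append(child)
    return max(population, key=fitness)
```

In the full method, `cost_fn` would be replaced by route costs returned by the ant colony subroutine, and the chromosome extended with scheduling variables.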
Aron, Miles; Browning, Richard; Carugo, Dario; Sezgin, Erdinc; Bernardino de la Serna, Jorge; Eggeling, Christian; Stride, Eleanor
2017-05-12
Spectral imaging with polarity-sensitive fluorescent probes enables the quantification of cell and model membrane physical properties, including local hydration, fluidity, and lateral lipid packing, usually characterized by the generalized polarization (GP) parameter. With the development of commercial microscopes equipped with spectral detectors, spectral imaging has become a convenient and powerful technique for measuring GP and other membrane properties. The existing tools for spectral image processing, however, are insufficient for processing the large data sets afforded by this technological advancement, and are unsuitable for processing images acquired with rapidly internalized fluorescent probes. Here we present a MATLAB spectral imaging toolbox with the aim of overcoming these limitations. In addition to common operations, such as the calculation of distributions of GP values, generation of pseudo-colored GP maps, and spectral analysis, a key highlight of this tool is reliable membrane segmentation for probes that are rapidly internalized. Furthermore, handling for hyperstacks, 3D reconstruction and batch processing facilitates analysis of data sets generated by time series, z-stack, and area scan microscope operations. Finally, the object size distribution is determined, which can provide insight into the mechanisms underlying changes in membrane properties and is desirable for, e.g., studies involving model membranes and surfactant-coated particles. Analysis is demonstrated for cell membranes, cell-derived vesicles, model membranes, and microbubbles with the environmentally sensitive probes Laurdan, carboxyl-modified Laurdan (C-Laurdan), Di-4-ANEPPDHQ, and Di-4-AN(F)EPPTEA (FE), for quantification of the local lateral density of lipids or lipid packing. The Spectral Imaging Toolbox is a powerful tool for the segmentation and processing of large spectral imaging datasets, with a reliable method for membrane segmentation and no programming ability required.
The Spectral Imaging Toolbox can be downloaded from https://uk.mathworks.com/matlabcentral/fileexchange/62617-spectral-imaging-toolbox .
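In its standard form, the generalized polarization (GP) parameter computed per pixel by such tools is the normalised difference of the emission intensities in an ordered-phase and a disordered-phase spectral channel. A minimal sketch (the channel naming and the zero-pixel convention are illustrative):

```python
def generalized_polarization(i_ordered, i_disordered):
    """GP = (I_ord - I_dis) / (I_ord + I_dis), bounded in [-1, 1].
    Returns 0.0 for empty pixels to avoid division by zero."""
    total = i_ordered + i_disordered
    return (i_ordered - i_disordered) / total if total else 0.0
```

Higher GP indicates tighter lipid packing; applying the function over a segmented membrane mask yields the GP distributions and pseudo-colored maps described above.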
NASA Astrophysics Data System (ADS)
Hazwan, M. H. M.; Shayfull, Z.; Sharif, S.; Nasir, S. M.; Zainal, N.
2017-09-01
In the injection moulding process, quality and productivity are notably important and must be controlled for each product type produced. Quality is measured as the extent of warpage of the moulded parts, while productivity is measured as the duration of the moulding cycle time. To control quality, many researchers have introduced various optimisation approaches, which have been proven to enhance the quality of the moulded parts produced. To improve the productivity of the injection moulding process, some researchers have proposed the application of conformal cooling channels, which have been proven to reduce the duration of the moulding cycle time. Therefore, this paper presents an application of an alternative optimisation approach, Response Surface Methodology (RSM) with Glowworm Swarm Optimisation (GSO), to moulded parts with straight-drilled and conformal cooling channel moulds. This study examined the warpage condition of the moulded parts before and after the optimisation work was applied for both cooling channel types. A front panel housing was selected as a specimen, and the performance of the proposed optimisation approach was analysed on conventional straight-drilled cooling channels compared to Milled Groove Square Shape (MGSS) conformal cooling channels by simulation analysis using Autodesk Moldflow Insight (AMI) 2013. Based on the results, melt temperature is the most significant factor contributing to warpage for the straight-drilled cooling channels, for which warpage improved by 39.1% after optimisation, while cooling time is the most significant factor contributing to warpage for the MGSS conformal cooling channels, for which warpage improved by 38.7% after optimisation. In addition, the findings show that applying the optimisation work to the conformal cooling channels offers better quality and productivity of the moulded part produced.
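Glowworm Swarm Optimisation, as used above, maintains a swarm of agents whose luciferin level tracks solution quality; each agent moves a small step towards a randomly chosen brighter neighbour within its sensing radius. A minimal continuous-domain sketch (all parameter values are illustrative, and the paper couples GSO to an RSM surrogate of the warpage response rather than a raw objective as here):

```python
import math
import random

def gso_minimise(f, dim, n=30, iters=100, seed=0, rho=0.4, gamma=0.6,
                 step=0.05, radius=2.0, bounds=(-3.0, 3.0)):
    """Minimal Glowworm Swarm Optimisation sketch (minimisation:
    luciferin grows with -f). Returns the best position seen."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    luc = [5.0] * n
    best = min(pos, key=f)
    for _ in range(iters):
        # luciferin update: decay plus fitness-proportional deposit
        luc = [(1 - rho) * l + gamma * (-f(p)) for l, p in zip(luc, pos)]
        nxt = []
        for i in range(n):
            nbrs = [j for j in range(n)
                    if luc[j] > luc[i] and math.dist(pos[i], pos[j]) < radius]
            if nbrs:
                j = rng.choice(nbrs)  # move towards a brighter neighbour
                d = math.dist(pos[i], pos[j]) or 1e-12
                nxt.append([pi + step * (pj - pi) / d
                            for pi, pj in zip(pos[i], pos[j])])
            else:
                nxt.append(pos[i])
        pos = nxt
        cand = min(pos, key=f)
        if f(cand) < f(best):
            best = cand
    return best
```

In the moulding application the decision variables would be the process parameters (e.g. melt temperature, cooling time) and `f` the RSM-predicted warpage.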
Using Optimisation Techniques to Granulise Rough Set Partitions
NASA Astrophysics Data System (ADS)
Crossingham, Bodie; Marwala, Tshilidzi
2007-11-01
This paper presents an approach to optimising rough set partition sizes using various optimisation techniques. Three optimisation techniques are implemented to perform the granularisation process, namely genetic algorithm (GA), hill climbing (HC) and simulated annealing (SA). These optimisation methods maximise the classification accuracy of the rough sets. The proposed rough set partition method is tested on a set of demographic properties of individuals obtained from the South African antenatal survey. The three techniques are compared in terms of their computational time, accuracy and number of rules produced when applied to the Human Immunodeficiency Virus (HIV) data set. The results of the optimised methods are compared with a well-known non-optimised discretisation method, equal-width-bin (EWB) partitioning. The accuracies achieved after optimising the partitions using GA, HC and SA are 66.89%, 65.84% and 65.48% respectively, compared with 59.86% for EWB. In addition to providing the plausibilities of the estimated HIV status, the rough sets also provide linguistic rules describing how the demographic parameters drive the risk of HIV.
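The simulated annealing variant of this granularisation can be sketched as follows (a minimal illustration only, not the authors' implementation; the `accuracy` callback, which would wrap the rough-set classifier here, is hypothetical):

```python
import math
import random

def equal_width_bins(lo, hi, k):
    """Baseline EWB discretisation: k equal-width intervals over [lo, hi]."""
    w = (hi - lo) / k
    return [lo + i * w for i in range(1, k)]

def anneal_partitions(accuracy, lo, hi, k, steps=2000, t0=1.0, seed=0):
    """Simulated annealing over the k-1 interior cut points of one attribute,
    maximising the classification accuracy of the induced partition."""
    rng = random.Random(seed)
    cur = equal_width_bins(lo, hi, k)           # start from the EWB baseline
    cur_acc = accuracy(cur)
    best, best_acc = list(cur), cur_acc
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9      # linear cooling schedule
        cand = sorted(min(hi, max(lo, c + rng.gauss(0, 0.05 * (hi - lo))))
                      for c in cur)             # perturb cuts, keep them ordered
        cand_acc = accuracy(cand)
        # always accept improvements; accept worse cuts with Boltzmann probability
        if cand_acc > cur_acc or rng.random() < math.exp((cand_acc - cur_acc) / t):
            cur, cur_acc = cand, cand_acc
            if cur_acc > best_acc:
                best, best_acc = list(cur), cur_acc
    return best, best_acc
```

Hill climbing is the same loop with the Boltzmann acceptance removed; a GA would instead recombine whole cut-point vectors.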
A density functional global optimisation study of neutral 8-atom Cu-Ag and Cu-Au clusters
NASA Astrophysics Data System (ADS)
Heard, Christopher J.; Johnston, Roy L.
2013-02-01
The effect of doping on the energetics and dimensionality of eight-atom coinage metal subnanometre particles is fully resolved using a genetic algorithm in tandem with on-the-fly density functional theory calculations to determine the global minima (GM) for Cu(n)Ag(8-n) and Cu(n)Au(8-n) clusters. Comparisons are made to previous ab initio work on mono- and bimetallic clusters, with excellent agreement found. Charge transfer and geometric arguments are considered to rationalise the stability of the particular permutational isomers found. An interesting transition between three-dimensional and two-dimensional GM structures is observed for copper-gold clusters, which is sharper and appears earlier in the doping series than is known for gold-silver particles.
Dolan, Gerry
2017-10-01
The 7th Haemophilia Global Summit was held in Madrid, Spain, in September 2016. With a programme designed, for the 6th consecutive year, by a Scientific Steering Committee of haemophilia experts, the aim of the summit was to share optimal management strategies for haemophilia at all life stages and to provide an opportunity for specialists from across the haemophilia multidisciplinary care team to engage in discussion and debate with leading international experts on current and future areas of research. Topics covered included the optimisation of haemophilia management, emerging issues in clinical care, practical approaches and future perspectives, as well as patient engagement and empowerment in modern haemophilia care. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Litt, Jonathan S. (Compiler)
2018-01-01
NASA Glenn Research Center hosted a Users' Workshop on the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) on August 21, 2017. The objective of this workshop was to update the user community on the latest features of T-MATS, and to provide a forum to present work performed using T-MATS. Presentations highlighted creative applications and the development of new features and libraries, and emphasized the flexibility and simulation power of T-MATS.
NASA Technical Reports Server (NTRS)
Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek
2002-01-01
To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.
Relative electronic and free energies of octane's unique conformations
NASA Astrophysics Data System (ADS)
Kirschner, Karl N.; Heiden, Wolfgang; Reith, Dirk
2017-06-01
This study reports the geometries and electronic energies of n-octane's unique conformations using perturbation methods that best mimic CCSD(T) results. In total, the fully optimised minima of n-butane (2 conformations), n-pentane (4 conformations), n-hexane (12 conformations) and n-octane (96 conformations) were investigated at several different theory levels and basis sets. We find that DF-MP2.5/aug-cc-pVTZ is in very good agreement with the more expensive CCSD(T) results. At this level, we can clearly confirm the 96 stable minima which were previously found using a reparameterised density functional theory (DFT). Excellent agreement was found between their DFT results and our DF-MP2.5 perturbation results. Subsequent Gibbs free energy calculations, using scaled MP2/aug-cc-pVTZ zero-point vibrational energy and frequencies, indicate a significant temperature dependency of the relative energies, with a change in the predicted global minimum. The results of this work will be important for future computational investigations of fuel-related octane reactions and for optimisation of molecular force fields (e.g. lipids).
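The reported temperature dependence of the conformer distribution can be illustrated by Boltzmann-weighting relative Gibbs free energies (a generic sketch, not tied to the paper's data):

```python
import math

R = 8.31446e-3  # gas constant in kJ/(mol*K)

def boltzmann_populations(rel_g, temperature):
    """Fractional conformer populations from relative Gibbs free energies
    (kJ/mol). As temperature changes the weights shift, which is how the
    predicted global minimum can switch between conformations."""
    weights = [math.exp(-g / (R * temperature)) for g in rel_g]
    z = sum(weights)                            # partition function
    return [w / z for w in weights]
```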
Sensor selection cost optimisation for tracking structurally cyclic systems: a P-order solution
NASA Astrophysics Data System (ADS)
Doostmohammadian, M.; Zarrabi, H.; Rabiee, H. R.
2017-08-01
Measurements and sensing implementations impose certain costs in sensor networks. Sensor selection cost optimisation is the problem of minimising the sensing cost of monitoring a physical (or cyber-physical) system. Consider a given set of sensors tracking the states of a dynamical system for estimation purposes. For each sensor, assume different costs to measure different (realisable) states. The idea is to assign sensors to measure states such that the global cost is minimised. The number and selection of sensor measurements need to ensure observability, so that the dynamic state of the system can be tracked with bounded estimation error. The main question we address is how to select the state measurements to minimise the cost while satisfying the observability conditions. Relaxing the observability condition for structurally cyclic systems, the main contribution is a graph-theoretic approach that solves the problem in polynomial time. Note that polynomial-time algorithms are suitable for large-scale systems, as their running time is upper-bounded by a polynomial expression in the size of the algorithm's input. We frame the problem as a linear sum assignment with solution complexity of ?.
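The linear sum assignment formulation can be sketched on a toy example (an illustrative brute-force solver for clarity; in practice the exhaustive loop is replaced by a polynomial-time method such as the Hungarian algorithm, e.g. `scipy.optimize.linear_sum_assignment`; the cost values below are invented):

```python
from itertools import permutations

def assign_min_cost(cost):
    """Exhaustive linear sum assignment for a small square cost matrix,
    where cost[i][j] is the cost of sensor i measuring state j.
    Unrealisable sensor/state pairs can be given a prohibitively large cost."""
    n = len(cost)
    best_perm, best_total = None, float("inf")
    for perm in permutations(range(n)):         # every one-to-one pairing
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_total:
            best_perm, best_total = perm, total
    return best_perm, best_total
```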
CEMENTITIOUS BARRIERS PARTNERSHIP FY13 MID-YEAR REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burns, H.; Flach, G.; Langton, C.
2013-05-01
In FY2013, the Cementitious Barriers Partnership (CBP) is continuing its effort to develop and enhance software tools, demonstrating tangible progress toward fulfilling the objective of developing a set of tools to improve understanding and prediction of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. In FY2012, the CBP released the initial in-house “Beta-version” of the CBP Software Toolbox, a suite of software for simulating reactive transport in cementitious materials and important degradation phenomena. The current primary software components are LeachXS/ORCHESTRA, STADIUM, and a GoldSim interface for probabilistic analysis of selected degradation scenarios. THAMES is a planned future CBP Toolbox component (FY13/14) focused on simulation of the microstructure of cementitious materials and calculation of the resultant hydraulic and constituent mass transfer parameters needed in modeling. This past November, CBP Software Toolbox Version 1.0 was released, which supports analysis of external sulfate attack (including damage mechanics), carbonation, and primary constituent leaching. The LeachXS component embodies an extensive material property measurements database along with chemical speciation and reactive mass transport simulation cases, with emphasis on leaching of major, trace and radionuclide constituents from cementitious materials used in DOE facilities, such as Saltstone (Savannah River) and Cast Stone (Hanford), tank closure grouts, and barrier concretes. STADIUM focuses on the physical and structural service life of materials and components based on chemical speciation and reactive mass transport of major cement constituents and aggressive species (e.g., chloride, sulfate, etc.). The CBP issued numerous reports and other documentation that accompanied the Version 1.0 release, including a CBP Software Toolbox User Guide and Installation Guide.
These documents, as well as the presentations from the CBP Software Toolbox Demonstration and User Workshop, which are briefly described below, can be accessed from the CBP webpage at http://cementbarriers.org/. The website was recently modified to describe the CBP Software Toolbox and includes an interest form for applying to use the software. The CBP FY13 program is continuing research to improve and enhance the simulation tools, as well as to develop new tools that model other key degradation phenomena not addressed in Version 1.0. Efforts are also ongoing to verify the various simulation tools through laboratory experiments and analysis of field specimens, in order to quantify and reduce the uncertainty associated with performance assessments. This mid-year report includes a summary of the FY13 software accomplishments, including the release of Version 1.0 of the CBP Software Toolbox, and of the various experimental programs that are providing data for calibration and validation of the CBP-developed software. The focus this year for experimental studies was to measure transport in cementitious material by utilization of a leaching method, and to measure the reduction capacity of saltstone field samples. Results are being used to calibrate and validate the updated carbonation model.
NASA Astrophysics Data System (ADS)
Kaliszewski, M.; Mazuro, P.
2016-09-01
The Simulated Annealing optimisation method is tested on sealing piston ring geometry. The aim of the optimisation is to develop a ring geometry that exerts the demanded pressure on the cylinder simply by being bent to fit it. An FEM analysis method for an arbitrary piston ring geometry is implemented in ANSYS software. The demanded pressure function (based on formulae presented by A. Iskra) as well as the objective function are introduced. A geometry definition constructed from polynomials in a radial coordinate system is presented and discussed. A possible application of the Simulated Annealing method to the piston ring optimisation task is proposed and visualised. Difficulties leading to a possible lack of convergence of the optimisation are presented. An example of an unsuccessful optimisation performed in APDL is discussed. A possible direction for further improvement of the optimisation is proposed.
A world without bacterial meningitis: how genomic epidemiology can inform vaccination strategy.
Rodrigues, Charlene M C; Maiden, Martin C J
2018-01-01
Bacterial meningitis remains an important cause of global morbidity and mortality. Although effective vaccinations exist and are being increasingly used worldwide, bacterial diversity threatens their impact and the ultimate goal of eliminating the disease. Through genomic epidemiology, we can appreciate bacterial population structure and its consequences for transmission dynamics, virulence, antimicrobial resistance, and development of new vaccines. Here, we review what we have learned through genomic epidemiological studies, following the rapid implementation of whole genome sequencing that can help to optimise preventative strategies for bacterial meningitis.
NASA Astrophysics Data System (ADS)
Kutsch, W. L.; Zhao, Z.; Hardisty, A.; Hellström, M.; Chin, Y.; Magagna, B.; Asmi, A.; Papale, D.; Pfeil, B.; Atkinson, M.
2017-12-01
Environmental Research Infrastructures (ENVRIs) are expected to become important pillars not only for supporting their own scientific communities, but also a) for inter-disciplinary research and b) for the European Earth Observation Program Copernicus as a contribution to the Global Earth Observation System of Systems (GEOSS) or global thematic data networks. As such, it is very important that data-related activities of the ENVRIs will be well integrated. This requires common policies, models and e-infrastructure to optimise technological implementation, define workflows, and ensure coordination, harmonisation, integration and interoperability of data, applications and other services. The key is interoperating common metadata systems (utilising a richer metadata model as the 'switchboard' for interoperation with formal syntax and declared semantics). The metadata characterises data, services, users and ICT resources (including sensors and detectors). The European Cluster Project ENVRIplus has developed a reference model (ENVRI RM) for common data infrastructure architecture to promote interoperability among ENVRIs. The presentation will provide an overview of recent progress and give examples for the integration of ENVRI data in global integration networks.
Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling
Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja
2016-01-01
Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078
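The core representation — a signal as a binary event sequence, with stimulation triggered when a chosen code appears — can be sketched offline as follows (a simplified illustration; the actual protocol runs in a real-time toolbox):

```python
def binarize(event_times, t_end, bin_ms=10):
    """Represent a recording as a binary sequence: 1 if at least one event
    (e.g. an electric organ discharge) falls inside a time bin."""
    n = int(t_end // bin_ms) + 1
    seq = [0] * n
    for t in event_times:
        seq[int(t // bin_ms)] = 1
    return seq

def code_driven_triggers(seq, code):
    """Return the bin indices at which the chosen code completes, i.e. the
    instants at which a closed-loop system would deliver its stimulus."""
    k = len(code)
    return [i + k - 1 for i in range(len(seq) - k + 1) if seq[i:i + k] == code]
```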
Howard, Steven J.; Melhuish, Edward
2016-01-01
Several methods of assessing executive function (EF), self-regulation, language development, and social development in young children have been developed over previous decades. Yet new technologies make available methods of assessment not previously considered. In resolving conceptual and pragmatic limitations of existing tools, the Early Years Toolbox (EYT) offers substantial advantages for early assessment of language, EF, self-regulation, and social development. In the current study, results of our large-scale administration of this toolbox to 1,764 preschool and early primary school students indicated very good reliability, convergent validity with existing measures, and developmental sensitivity. Results were also suggestive of better capture of children’s emerging abilities relative to comparison measures. Preliminary norms are presented, showing a clear developmental trajectory across half-year age groups. The accessibility of the EYT, as well as its advantages over existing measures, offers considerably enhanced opportunities for objective measurement of young children’s abilities to enable research and educational applications. PMID:28503022
The MONGOOSE Rational Arithmetic Toolbox.
Le, Christopher; Chindelevitch, Leonid
2018-01-01
The modeling of metabolic networks has seen a rapid expansion following the complete sequencing of thousands of genomes. The constraint-based modeling framework has emerged as one of the most popular approaches to reconstructing and analyzing genome-scale metabolic models. Its main assumption is that of a quasi-steady-state, requiring that the production of each internal metabolite be balanced by its consumption. However, due to the multiscale nature of the models, the large number of reactions and metabolites, and the use of floating-point arithmetic for the stoichiometric coefficients, ensuring that this assumption holds can be challenging. The MONGOOSE toolbox addresses this problem by using rational arithmetic, thus ensuring that models are analyzed in a reproducible manner and consistently with modeling assumptions. In this chapter we present a protocol for the complete analysis of a metabolic network model using the MONGOOSE toolbox, via its newly developed GUI, and describe how it can be used as a model-checking platform both during and after the model construction process.
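The role of rational arithmetic can be illustrated with a minimal quasi-steady-state check using Python's exact `fractions` module (a sketch of the principle, not the MONGOOSE API):

```python
from fractions import Fraction

def steady_state_violations(S, v):
    """Check the quasi-steady-state assumption S*v = 0 exactly.
    S is the stoichiometric matrix (rows = metabolites, columns = reactions)
    and v the flux vector. Floating-point arithmetic can mask or invent tiny
    imbalances; exact rationals cannot."""
    return [i for i, row in enumerate(S)
            if sum(Fraction(a) * Fraction(x) for a, x in zip(row, v)) != 0]
```

With fluxes given as exact Fractions like 1/3 every metabolite balances, whereas float inputs such as 0.1 and 0.2 expose their hidden binary round-off as an imbalance.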
BOLDSync: a MATLAB-based toolbox for synchronized stimulus presentation in functional MRI.
Joshi, Jitesh; Saharan, Sumiti; Mandal, Pravat K
2014-02-15
Precise and synchronized presentation of paradigm stimuli in functional magnetic resonance imaging (fMRI) is central to obtaining accurate information about the brain regions involved in a specific task. In this manuscript, we present a new MATLAB-based toolbox, BOLDSync, for synchronized stimulus presentation in fMRI. BOLDSync provides a user-friendly platform for the design and presentation of visual, audio, as well as multimodal audio-visual (AV) stimuli in functional imaging experiments. We present simulation experiments that demonstrate the millisecond synchronization accuracy of BOLDSync, and also illustrate its functionalities through application to an AV fMRI study. BOLDSync gains an advantage over other available proprietary and open-source toolboxes by offering a user-friendly and accessible interface that affords both precision in stimulus presentation and versatility across various types of stimulus designs and system setups. BOLDSync is a reliable, efficient, and versatile solution for synchronized stimulus presentation in fMRI studies. Copyright © 2013 Elsevier B.V. All rights reserved.
Lefebvre, Baptiste; Deny, Stéphane; Gardella, Christophe; Stimberg, Marcel; Jetter, Florian; Zeck, Guenther; Picaud, Serge; Duebel, Jens
2018-01-01
In recent years, multielectrode arrays and large silicon probes have been developed to record simultaneously from between hundreds and thousands of densely packed electrodes. However, they require novel methods to extract the spiking activity of large ensembles of neurons. Here, we developed a new toolbox to sort spikes from these large-scale extracellular data. To validate our method, we performed simultaneous extracellular and loose patch recordings in rodents to obtain ‘ground truth’ data, where the solution to this sorting problem is known for one cell. The performance of our algorithm was always close to the best expected performance, over a broad range of signal-to-noise ratios, in vitro and in vivo. The algorithm is entirely parallelized and has been successfully tested on recordings with up to 4225 electrodes. Our toolbox thus offers a generic solution to accurately sort spikes for up to thousands of electrodes. PMID:29557782
Developing a Fluorescent Toolbox To Shed Light on the Mysteries of RNA.
Alexander, Seth C; Devaraj, Neal K
2017-10-03
Technologies that detect and image RNA have illuminated the complex roles played by RNA, redefining the traditional and superficial role first outlined by the central dogma of biology. Because there is such a wide diversity of RNA structure arising from an assortment of functions within biology, a toolbox of approaches has emerged for investigation of this important class of biomolecules. These methods are necessary to detect and elucidate the localization and dynamics of specific RNAs and in doing so unlock our understanding of how RNA dysregulation leads to disease. Current methods for detecting and imaging RNA include in situ hybridization techniques, fluorescent aptamers, RNA binding proteins fused to fluorescent reporters, and covalent labeling strategies. Because of the inherent diversity of these methods, each approach comes with a set of strengths and limitations that leave room for future improvement. This perspective seeks to highlight the most recent advances and remaining challenges for the wide-ranging toolbox of technologies that illuminate RNA's contribution to cellular complexity.
GenSSI 2.0: multi-experiment structural identifiability analysis of SBML models.
Ligon, Thomas S; Fröhlich, Fabian; Chis, Oana T; Banga, Julio R; Balsa-Canto, Eva; Hasenauer, Jan
2018-04-15
Mathematical modeling using ordinary differential equations is used in systems biology to improve the understanding of dynamic biological processes. The parameters of ordinary differential equation models are usually estimated from experimental data. To analyze a priori the uniqueness of the solution of the estimation problem, structural identifiability analysis methods have been developed. We introduce GenSSI 2.0, an advancement of the software toolbox GenSSI (Generating Series for testing Structural Identifiability). GenSSI 2.0 is the first toolbox for structural identifiability analysis to implement Systems Biology Markup Language import, state/parameter transformations and multi-experiment structural identifiability analysis. In addition, GenSSI 2.0 supports a range of MATLAB versions and is computationally more efficient than its previous version, enabling the analysis of more complex models. GenSSI 2.0 is an open-source MATLAB toolbox and available at https://github.com/genssi-developer/GenSSI. thomas.ligon@physik.uni-muenchen.de or jan.hasenauer@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.
Comparative and Quantitative Global Proteomics Approaches: An Overview
Deracinois, Barbara; Flahaut, Christophe; Duban-Deweer, Sophie; Karamanos, Yannis
2013-01-01
Proteomics has become a key tool for the study of biological systems. The comparison between two different physiological states allows unravelling the cellular and molecular mechanisms involved in a biological process. Proteomics can confirm the presence of proteins suggested by their mRNA content and provides a direct measure of the quantity present in a cell. Global and targeted proteomics strategies can be applied. Targeted proteomics strategies limit the number of features that will be monitored and then optimise the methods to obtain the highest sensitivity and throughput for a huge number of samples. The advantage of global proteomics strategies is that no hypothesis is required, other than a measurable difference in one or more protein species between the samples. Global proteomics methods attempt to separate, quantify and identify all the proteins from a given sample. This review highlights only the different techniques of separation and quantification of proteins and peptides, in view of a comparative and quantitative global proteomics analysis. The in-gel and off-gel quantification of proteins will be discussed, as well as the corresponding mass spectrometry technology. The overview is focused on the widespread techniques, while keeping in mind that each approach is modular and often overlaps with the others. PMID:28250403
Advanced functional network analysis in the geosciences: The pyunicorn package
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen
2013-04-01
Functional networks are a powerful tool for analyzing large geoscientific datasets such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in the language Python. It allows for constructing functional networks (aka climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory such as measures for networks of interacting networks, node-weighted statistics or network surrogates. Additionally, pyunicorn makes it possible to study the complex dynamics of geoscientific systems as recorded by time series by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined drawing on several examples from climatology.
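The basic construction of such a functional (climate) network can be sketched as follows (a minimal illustration of the idea, not the pyunicorn API):

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def climate_network(series, threshold=0.5):
    """Functional network: each node is a grid point; an undirected edge links
    two nodes when the absolute correlation of their series exceeds threshold."""
    n = len(series)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(pearson(series[i], series[j])) >= threshold}
```

On the resulting edge set, complex-network measures (degree, clustering, betweenness) can then be computed as usual.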
Wind wave analysis in depth limited water using OCEANLYZ, A MATLAB toolbox
NASA Astrophysics Data System (ADS)
Karimpour, Arash; Chen, Qin
2017-09-01
There are a number of well-established methods in the literature describing how to assess and analyze measured wind wave data. However, obtaining reliable results from these methods requires adequate knowledge of their behavior, strengths and weaknesses. A proper implementation of these methods requires a series of procedures, including a pretreatment of the raw measurements, and adjustment and refinement of the processed data to provide quality assurance of the outcomes; otherwise it can lead to untrustworthy results. This paper discusses potential issues in these procedures, explains which parameters are influential for the outcomes, and suggests practical solutions to avoid and minimize errors in the wave results. The procedures of converting the water pressure data into water surface elevation data, treating the high-frequency data with a low signal-to-noise ratio, partitioning swell energy from wind sea, and estimating the peak wave frequency from the weighted integral of the wave power spectrum are described. Conversion and recovery of the data acquired by a pressure transducer, particularly in depth-limited water like estuaries and lakes, are explained in detail. To provide researchers with tools for a reliable estimation of wind wave parameters, the Ocean Wave Analyzing toolbox, OCEANLYZ, is introduced. The toolbox contains a number of MATLAB functions for estimation of the wave properties in time and frequency domains. The toolbox has been developed and examined during a number of field study projects in Louisiana's estuaries.
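The pressure-to-surface-elevation conversion mentioned above rests on linear wave theory: each spectral component measured by a submerged sensor is divided by a pressure response factor. A minimal sketch follows (an illustration of the principle, not the OCEANLYZ code; the under-relaxed fixed-point dispersion solver is one common choice):

```python
import math

GRAV = 9.81  # gravitational acceleration, m/s^2

def wavenumber(freq, depth):
    """Solve the linear dispersion relation (2*pi*f)^2 = g*k*tanh(k*h) for k
    by under-relaxed fixed-point iteration from the deep-water first guess."""
    omega = 2.0 * math.pi * freq
    k = omega ** 2 / GRAV
    for _ in range(200):
        k = 0.5 * (k + omega ** 2 / (GRAV * math.tanh(k * depth)))
    return k

def pressure_response_factor(freq, depth, sensor_height):
    """Kp = cosh(k*z_b)/cosh(k*h), with z_b the sensor height above the bed.
    Dividing a pressure-derived amplitude by Kp recovers the surface amplitude;
    at high frequencies Kp becomes tiny, which is why the low signal-to-noise
    tail of the spectrum must be treated with care."""
    k = wavenumber(freq, depth)
    return math.cosh(k * sensor_height) / math.cosh(k * depth)
```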
A CRISPR/Cas9 Toolbox for Multiplexed Plant Genome Editing and Transcriptional Regulation.
Lowder, Levi G; Zhang, Dengwei; Baltes, Nicholas J; Paul, Joseph W; Tang, Xu; Zheng, Xuelian; Voytas, Daniel F; Hsieh, Tzung-Fu; Zhang, Yong; Qi, Yiping
2015-10-01
The relative ease, speed, and biological scope of clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated Protein9 (Cas9)-based reagents for genomic manipulations are revolutionizing virtually all areas of molecular biosciences, including functional genomics, genetics, applied biomedical research, and agricultural biotechnology. In plant systems, however, a number of hurdles currently exist that limit this technology from reaching its full potential. For example, significant plant molecular biology expertise and effort is still required to generate functional expression constructs that allow simultaneous editing, and especially transcriptional regulation, of multiple different genomic loci or multiplexing, which is a significant advantage of CRISPR/Cas9 versus other genome-editing systems. To streamline and facilitate rapid and wide-scale use of CRISPR/Cas9-based technologies for plant research, we developed and implemented a comprehensive molecular toolbox for multifaceted CRISPR/Cas9 applications in plants. This toolbox provides researchers with a protocol and reagents to quickly and efficiently assemble functional CRISPR/Cas9 transfer DNA constructs for monocots and dicots using Golden Gate and Gateway cloning methods. It comes with a full suite of capabilities, including multiplexed gene editing and transcriptional activation or repression of plant endogenous genes. We report the functionality and effectiveness of this toolbox in model plants such as tobacco (Nicotiana benthamiana), Arabidopsis (Arabidopsis thaliana), and rice (Oryza sativa), demonstrating its utility for basic and applied plant research. © 2015 American Society of Plant Biologists. All Rights Reserved.
Optimisation of vertical trajectories by the harmony search method
NASA Astrophysics Data System (ADS)
Ruby, Margaux
Face au rechauffement climatique, les besoins de trouver des solutions pour reduire les emissions de CO2 sont urgentes. L'optimisation des trajectoires est un des moyens pour reduire la consommation de carburant lors d'un vol. Afin de determiner la trajectoire optimale de l'avion, differents algorithmes ont ete developpes. Le but de ces algorithmes est de reduire au maximum le cout total d'un vol d'un avion qui est directement lie a la consommation de carburant et au temps de vol. Un autre parametre, nomme l'indice de cout est considere dans la definition du cout de vol. La consommation de carburant est fournie via des donnees de performances pour chaque phase de vol. Dans le cas de ce memoire, les phases d'un vol complet, soit, une phase de montee, une phase de croisiere et une phase de descente, sont etudies. Des " marches de montee " etaient definies comme des montees de 2 000ft lors de la phase de croisiere sont egalement etudiees. L'algorithme developpe lors de ce memoire est un metaheuristique, nomme la recherche de l'harmonie, qui, concilie deux types de recherches : la recherche locale et la recherche basee sur une population. Cet algorithme se base sur l'observation des musiciens lors d'un concert, ou plus exactement sur la capacite de la musique a trouver sa meilleure harmonie, soit, en termes d'optimisation, le plus bas cout. Differentes donnees d'entrees comme le poids de l'avion, la destination, la vitesse de l'avion initiale et le nombre d'iterations doivent etre, entre autre, fournies a l'algorithme pour qu'il soit capable de determiner la solution optimale qui est definie comme : [Vitesse de montee, Altitude, Vitesse de croisiere, Vitesse de descente]. L'algorithme a ete developpe a l'aide du logiciel MATLAB et teste pour plusieurs destinations et plusieurs poids pour un seul type d'avion. 
For validation, the results obtained by this algorithm were first compared with the results of an exhaustive search over all possible combinations. The exhaustive search provides the global optimum; the algorithm's solution must therefore come as close as possible to the exhaustive-search result in order to show that it yields results close to the global optimum. A second comparison was made between the algorithm's results and those of the Flight Management System (FMS), the avionics system in the aircraft cockpit that provides the route to follow in order to optimise the trajectory. The goal is to show that the harmony search algorithm gives better results than the algorithm implemented in the FMS.
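The thesis's MATLAB implementation is not reproduced in the abstract; a minimal Python sketch of the harmony search loop it describes (memory consideration, pitch adjustment, random improvisation) might look as follows, with a toy two-variable stand-in for the flight-cost function. The speeds, bounds and coefficients below are illustrative only, not the thesis's performance data.

```python
import random

def harmony_search(cost, bounds, hms=10, hmcr=0.9, par=0.3, iters=2000, seed=1):
    """Minimise `cost` over the box `bounds` with a basic harmony search.
    hms: harmony memory size; hmcr: memory consideration rate;
    par: pitch adjustment rate."""
    rng = random.Random(seed)
    memory = sorted(([rng.uniform(lo, hi) for lo, hi in bounds]
                     for _ in range(hms)), key=cost)
    for _ in range(iters):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                 # take a note from memory
                x = rng.choice(memory)[j]
                if rng.random() < par:              # ...and pitch-adjust it
                    x += rng.uniform(-1.0, 1.0) * 0.01 * (hi - lo)
            else:                                   # or improvise a new note
                x = rng.uniform(lo, hi)
            new.append(min(max(x, lo), hi))
        if cost(new) < cost(memory[-1]):            # replace the worst harmony
            memory[-1] = new
            memory.sort(key=cost)
    return memory[0]

# Toy stand-in for the flight-cost function: two decision variables
# (a Mach-like cruise speed and a descent speed in knots), both invented.
cost = lambda v: (v[0] - 0.78) ** 2 + 2.0 * (v[1] - 250.0) ** 2 / 1e4
best = harmony_search(cost, [(0.70, 0.84), (220.0, 290.0)])
```

Each iteration improvises a new "harmony" either from memory (with rate `hmcr`), optionally pitch-adjusted, or at random, and keeps it only if it is cheaper than the worst stored solution.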
NASA Astrophysics Data System (ADS)
Fritzsche, Matthias; Kittel, Konstantin; Blankenburg, Alexander; Vajna, Sándor
2012-08-01
The focus of this paper is to present a method of multidisciplinary design optimisation based on the autogenetic design theory (ADT), whose methods are partially implemented in the optimisation software described here. The main thesis of the ADT is that biological evolution and the process of developing products are essentially similar, i.e. procedures from biological evolution can be transferred into product development. In order to fulfil requirements and boundary conditions of any kind (which may change at any time), both biological evolution and product development look for appropriate solution possibilities in a certain area and try to optimise those that are actually promising by varying parameters and combinations of these solutions. As the time necessary for multidisciplinary design optimisations is a critical aspect in product development, distributing the optimisation process to make effective use of idle computing capacity can reduce the optimisation time drastically. Finally, a practical example shows how ADT methods and distributed optimisation are applied to improve a product.
NASA Astrophysics Data System (ADS)
Niemeijer, Sander
2017-04-01
The ESA Atmospheric Toolbox (BEAT) is one of the ESA Sentinel Toolboxes. It consists of a set of software components to read, analyze, and visualize a wide range of atmospheric data products. In addition to the upcoming Sentinel-5P mission it supports a wide range of other atmospheric data products, including those of previous ESA missions, ESA Third Party missions, Copernicus Atmosphere Monitoring Service (CAMS), ground based data, etc. The toolbox consists of three main components that are called CODA, HARP and VISAN. CODA provides interfaces for direct reading of data from earth observation data files. These interfaces consist of command line applications, libraries, direct interfaces to scientific applications (IDL and MATLAB), and direct interfaces to programming languages (C, Fortran, Python, and Java). CODA provides a single interface to access data in a wide variety of data formats, including ASCII, binary, XML, netCDF, HDF4, HDF5, CDF, GRIB, RINEX, and SP3. HARP is a toolkit for reading, processing and inter-comparing satellite remote sensing data, model data, in-situ data, and ground based remote sensing data. The main goal of HARP is to assist in the inter-comparison of datasets. By appropriately chaining calls to HARP command line tools one can pre-process datasets such that two datasets that need to be compared end up having the same temporal/spatial grid, same data format/structure, and same physical unit. The toolkit comes with its own data format conventions, the HARP format, which is based on netcdf/HDF. Ingestion routines (based on CODA) allow conversion from a wide variety of atmospheric data products to this common format. In addition, the toolbox provides a wide range of operations to perform conversions on the data such as unit conversions, quantity conversions (e.g. number density to volume mixing ratios), regridding, vertical smoothing using averaging kernels, collocation of two datasets, etc. 
VISAN is a cross-platform visualization and analysis application for atmospheric data; it can be used to visualize and analyze the data retrieved through the CODA and HARP interfaces. The application uses the Python language as the means through which you provide commands to the application. The Python interfaces for CODA and HARP are included, so you can directly ingest product data from within VISAN. Powerful visualization functionality for 2D plots and geographical plots in VISAN allows you to directly visualize the ingested data. All components of the ESA Atmospheric Toolbox are Open Source and freely available. Software packages can be downloaded from the BEAT website: http://stcorp.nl/beat/
Derimay, François; Souteyrand, Geraud; Motreff, Pascal; Rioufol, Gilles; Finet, Gerard
2017-10-13
The rePOT (proximal optimisation technique) sequence proved significantly more effective than final kissing balloon (FKB) with two drug-eluting stents (DES) in a bench test. We sought to validate efficacy experimentally in a large range of latest-generation DES. On left main fractal coronary bifurcation bench models, five samples of each of the six main latest-generation DES (Coroflex ISAR, Orsiro, Promus PREMIER, Resolute Integrity, Ultimaster, XIENCE Xpedition) were implanted on rePOT (initial POT, side branch inflation, final POT). Proximal elliptical ratio, side branch obstruction (SBO), stent overstretch and strut malapposition were quantified on 2D and 3D OCT. Results were compared to FKB with Promus PREMIER. Whatever the design, rePOT maintained vessel circularity compared to FKB: elliptical ratio, 1.02±0.01 to 1.04±0.01 vs. 1.26±0.02 (p<0.05). Global strut malapposition was much lower: 2.6±1.4% to 0.1±0.2% vs. 40.4±8.4% for FKB (p<0.05). However, only Promus PREMIER and XIENCE Xpedition achieved significantly less SBO: respectively, 5.6±3.5% and 10.0±5.3% vs. 23.5±5.7% for FKB (p<0.05). Platform design differences had little influence on the excellent results of rePOT versus FKB. RePOT optimised strut apposition without proximal elliptical deformation in the six main latest-generation DES. Thickness and design characteristics seemed relevant for optimising SBO.
Integration of environmental aspects in modelling and optimisation of water supply chains.
Koleva, Mariya N; Calderón, Andrés J; Zhang, Di; Styan, Craig A; Papageorgiou, Lazaros G
2018-04-26
Climate change is becoming increasingly relevant in the context of water systems planning. Tools are necessary to provide the most economic investment option considering the reliability of the infrastructure from technical and environmental perspectives. Accordingly, in this work, an optimisation approach, formulated as a spatially-explicit multi-period Mixed Integer Linear Programming (MILP) model, is proposed for the design of water supply chains at regional and national scales. The optimisation framework encompasses decisions such as installation of new purification plants, capacity expansion, and raw water trading schemes. The objective is to minimise the total cost incurred from capital and operating expenditures. Assessment of available resources for withdrawal is performed based on hydrological balances, governmental rules and sustainable limits. In light of the increasing importance of reliability of water supply, a second objective, seeking to maximise the reliability of the supply chains, is introduced. The epsilon-constraint method is used as a solution procedure for the multi-objective formulation. A Nash bargaining approach is applied to investigate fair trade-offs between the two objectives and to find Pareto-optimal solutions. The models' capability is demonstrated through a case study based on Australia. The impact of variability in key input parameters is tackled through the implementation of a rigorous global sensitivity analysis (GSA). The findings suggest that variations in water demand can be more disruptive for the water supply chain than scenarios in which rainfall is reduced. The frameworks can facilitate governmental multi-aspect decision-making processes for adequate and strategic investment in regional water supply infrastructure. Copyright © 2018. Published by Elsevier B.V.
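The paper's MILP is not given in the abstract, but the epsilon-constraint idea can be illustrated on a toy discrete version of the cost-versus-reliability trade-off (the designs and numbers below are entirely hypothetical): minimise cost subject to reliability >= epsilon, then sweep epsilon to trace out the Pareto front.

```python
# Hypothetical design alternatives: (total cost, supply reliability).
designs = [(10, 0.60), (14, 0.75), (18, 0.85), (25, 0.92), (40, 0.97)]

def epsilon_constraint(designs, eps_grid):
    """For each epsilon level, minimise cost subject to reliability >= eps;
    collecting the minimisers traces out the Pareto front."""
    front = []
    for eps in eps_grid:
        feasible = [d for d in designs if d[1] >= eps]
        if feasible:
            best = min(feasible, key=lambda d: d[0])
            if best not in front:
                front.append(best)
    return front

pareto = epsilon_constraint(designs, [0.60, 0.70, 0.80, 0.90, 0.95])
# Each retained design is non-dominated: no cheaper design is as reliable.
```

In the paper's setting each inner minimisation is itself a MILP solve rather than a list scan, but the structure of the method is the same.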
Global reaction mechanism for the auto-ignition of full boiling range gasoline and kerosene fuels
NASA Astrophysics Data System (ADS)
Vandersickel, A.; Wright, Y. M.; Boulouchos, K.
2013-12-01
Compact reaction schemes capable of predicting auto-ignition are a prerequisite for the development of strategies to control and optimise homogeneous charge compression ignition (HCCI) engines. In particular, for full boiling range fuels exhibiting two-stage ignition, a tremendous demand exists in the engine development community. The present paper therefore meticulously assesses a previous 7-step reaction scheme developed to predict auto-ignition for four hydrocarbon blends and proposes an important extension of the model constant optimisation procedure, allowing the model to capture not only ignition delays but also the evolutions of representative intermediates and heat release rates for a variety of full boiling range fuels. Additionally, an extensive validation of the latter evolutions by means of various detailed n-heptane reaction mechanisms from the literature is presented, both for perfectly homogeneous and for non-premixed/stratified HCCI conditions. Finally, the model's potential to simulate the auto-ignition of various full boiling range fuels is demonstrated by means of experimental shock tube data for six strongly differing fuels, containing e.g. up to 46.7% cyclo-alkanes, 20% naphthalenes or complex branched aromatics such as methyl- or ethyl-naphthalene. The good predictive capability observed for each of the validation cases, as well as the successful parameterisation for each of the six fuels, indicates that the model could, in principle, be applied to any hydrocarbon fuel, provided suitable adjustments to the model parameters are carried out. Combined with the optimisation strategy presented, the model therefore constitutes a major step towards the inclusion of real fuel kinetics into full-scale HCCI engine simulations.
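The 7-step scheme itself is not listed in the abstract. As a much simpler illustration of how auto-ignition is often correlated, here is a single-step Arrhenius ignition-delay expression; the constants A, Ea and the pressure exponent n below are purely illustrative, not the paper's optimised values.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def ignition_delay(T, p, A=1e-9, Ea=1.5e5, n=1.0):
    """One-step Arrhenius ignition-delay correlation:
    tau = A * p**-n * exp(Ea / (R*T)), with T in K and p in bar.
    Constants are illustrative placeholders, not fitted values."""
    return A * p ** (-n) * math.exp(Ea / (R * T))

tau_900 = ignition_delay(900.0, 40.0)    # cooler charge
tau_1100 = ignition_delay(1100.0, 40.0)  # hotter charge ignites sooner
```

Note that a single Arrhenius step is strictly monotonic in temperature and so cannot reproduce two-stage ignition or negative-temperature-coefficient behaviour, which is exactly why multi-step global schemes such as the paper's 7-step model are needed.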
Reyes, Antonio Jose; Ramcharan, Kanterpersad
2016-08-02
We report a patient-driven home care system that successfully assisted 24/7 with the management of a 68-year-old woman after a stroke, a global illness. The patient's caregiver and physician used computer devices, smartphones and internet access for information exchange. Patient, caregiver, family and physician satisfaction, coupled with outcome and cost, were indicators of quality of care. The novelty of this basic model of teleneurology lies in implementing a patient/caregiver-driven system designed to improve access to cost-efficient neurological care, which has potential for use in primary, secondary and tertiary levels of healthcare in rural and underserved regions of the world. We suggest involvement of healthcare stakeholders in teleneurology to address this global problem of limited access to neurological care. This model can facilitate the management of neurological diseases, impact on outcome, reduce the frequency of consultations and hospitalisations, facilitate teaching of healthcare workers and promote research. 2016 BMJ Publishing Group Ltd.
Oral health promotion and education messages in Live.Learn.Laugh. projects.
Horn, Virginie; Phantumvanit, Prathip
2014-10-01
The FDI-Unilever Live.Learn.Laugh. phase 2 partnership involved dissemination of the key oral health message of encouraging 'twice-daily toothbrushing with fluoride toothpaste' and education of people worldwide by FDI, National Dental Associations, the Unilever Oral Care global team and local brands. The dissemination and education process used different methodologies, each targeting specific groups, namely: mother and child (Project option A); schoolchildren (Project option B); dentists and patients (Project option C); and specific communities (Project option D). Altogether, the partnership implemented 29 projects in 27 countries. These consisted of educational interventions, evaluations including (in some cases) clinical assessment, together with communication activities at both global and local levels, to increase the reach of the message to a broader population worldwide. The phase 2 experience reveals the strength of such a public-private partnership approach in tackling global oral health issues by creating synergies between partners and optimising the promotion and education process. © 2014 FDI World Dental Federation.
Farrer, Emily C; Ashton, Isabel W; Knape, Jonas; Suding, Katharine N
2014-04-01
Two sources of complexity make predicting plant community response to global change particularly challenging. First, realistic global change scenarios involve multiple drivers of environmental change that can interact with one another to produce non-additive effects. Second, in addition to these direct effects, global change drivers can indirectly affect plants by modifying species interactions. In order to tackle both of these challenges, we propose a novel population modeling approach, requiring only measurements of abundance and climate over time. To demonstrate the applicability of this approach, we model population dynamics of eight abundant plant species in a multifactorial global change experiment in alpine tundra where we manipulated nitrogen, precipitation, and temperature over 7 years. We test whether indirect and interactive effects are important to population dynamics and whether explicitly incorporating species interactions can change predictions when models are forecast under future climate change scenarios. For three of the eight species, population dynamics were best explained by direct effect models, for one species neither direct nor indirect effects were important, and for the other four species indirect effects mattered. Overall, global change had negative effects on species population growth, although species responded to different global change drivers, and single-factor effects were slightly more common than interactive direct effects. When the fitted population dynamic models were extrapolated under changing climatic conditions to the end of the century, forecasts of community dynamics and diversity loss were largely similar using direct effect models that do not explicitly incorporate species interactions or best-fit models; however, inclusion of species interactions was important in refining the predictions for two of the species. 
The modeling approach proposed here is a powerful way of analyzing readily available datasets, and it should be added to our toolbox for teasing apart complex drivers of global change. © 2013 John Wiley & Sons Ltd.
Optimisation in radiotherapy. III: Stochastic optimisation algorithms and conclusions.
Ebert, M
1997-12-01
This is the final article in a three part examination of optimisation in radiotherapy. Previous articles have established the bases and form of the radiotherapy optimisation problem, and examined certain types of optimisation algorithm, namely, those which perform some form of ordered search of the solution space (mathematical programming), and those which attempt to find the closest feasible solution to the inverse planning problem (deterministic inversion). The current paper examines algorithms which search the space of possible irradiation strategies by stochastic methods. The resulting iterative search methods move about the solution space by sampling random variates, which gradually become more constricted as the algorithm converges upon the optimal solution. This paper also discusses the implementation of optimisation in radiotherapy practice.
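As a concrete example of the class of algorithms the article surveys, here is a minimal simulated-annealing sketch applied to a toy two-beam weight problem. The dose matrix, target and penalty are invented for illustration; real treatment-planning objectives involve thousands of beamlets and dose-volume constraints.

```python
import math
import random

def simulated_annealing(cost, x0, step=0.1, t0=1.0, cooling=0.995,
                        iters=3000, seed=0):
    """Basic simulated annealing: random perturbations are accepted when
    they improve the objective, and accepted with probability exp(-delta/T)
    otherwise; the temperature T shrinks every iteration, so the random
    variates sampled become progressively more constricted."""
    rng = random.Random(seed)
    x, fx, t = list(x0), cost(x0), t0
    best, fbest = list(x), fx
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        fc = cost(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling
    return best, fbest

# Invented toy problem: choose two beam weights w so the delivered "dose"
# A.w matches a prescribed target, penalising negative weights.
target = [1.0, 0.8]
A = [[0.6, 0.4], [0.3, 0.7]]
def dose_cost(w):
    dose = [sum(a * wi for a, wi in zip(row, w)) for row in A]
    miss = sum((d - t) ** 2 for d, t in zip(dose, target))
    return miss + sum(max(0.0, -wi) ** 2 for wi in w)

best_w, best_cost = simulated_annealing(dose_cost, [0.0, 0.0])
```

The occasional acceptance of uphill moves at high temperature is what lets the search escape local minima before the cooling schedule constricts it onto a single basin.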
Adding tools to the open source toolbox: The Internet
NASA Technical Reports Server (NTRS)
Porth, Tricia
1994-01-01
The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial databases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.
Gigerenzer, Gerd
2008-01-01
The adaptive toolbox is a Darwinian-inspired theory that conceives of the mind as a modular system that is composed of heuristics, their building blocks, and evolved capacities. The study of the adaptive toolbox is descriptive and analyzes the selection and structure of heuristics in social and physical environments. The study of ecological rationality is prescriptive and identifies the structure of environments in which specific heuristics either succeed or fail. Results have been used for designing heuristics and environments to improve professional decision making in the real world. © 2008 Association for Psychological Science.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Zhang
GIXSGUI is a MATLAB toolbox that offers both a graphical user interface and script-based access to visualize and process grazing-incidence X-ray scattering data from nanostructures on surfaces and in thin films. It provides routine surface scattering data reduction methods such as geometric correction, one-dimensional intensity linecut, two-dimensional intensity reshaping, etc. Three-dimensional indexing is also implemented to determine the space group and lattice parameters of buried organized nanoscopic structures in supported thin films.
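GIXSGUI's own reduction routines are MATLAB functions; as a language-neutral illustration of what a one-dimensional intensity linecut does, the following self-contained Python sketch averages a 2D detector image into a radial profile. This is a simplification: real GIXS reductions work in reciprocal space with geometric corrections, and the function name here is invented.

```python
import math

def radial_linecut(image, center, nbins):
    """Reduce a 2D intensity image to a 1D radial profile by averaging
    all pixels whose distance from `center` falls into the same bin."""
    cy, cx = center
    rmax = max(math.hypot(y - cy, x - cx)
               for y in range(len(image)) for x in range(len(image[0])))
    sums, counts = [0.0] * nbins, [0] * nbins
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            b = min(int(math.hypot(y - cy, x - cx) / rmax * nbins), nbins - 1)
            sums[b] += val
            counts[b] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# Synthetic test image whose intensity falls off with radius from (8, 8).
img = [[1.0 / (1.0 + math.hypot(y - 8, x - 8)) for x in range(17)]
       for y in range(17)]
profile = radial_linecut(img, (8, 8), 5)
```

For a radially decaying synthetic image, the resulting profile decreases monotonically across the bins, as expected.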
Influence of some design parameters on the thermal performance of domestic refrigerator appliances
NASA Astrophysics Data System (ADS)
Rebora, Alessandro; Senarega, Maurizio; Tagliafico, Luca A.
2006-07-01
This paper presents a thermal study on chest-freezers, the small refrigerators used in domestic and supermarket applications. A thermal and energy model of a particular kind of these refrigerators, the “hot-wall” (or “skin condenser”) refrigerator, is developed and used to perform sensitivity and design optimisation analysis for given working temperatures and useful volume of the refrigerated cell. A finite-element heat transfer model of the refrigerator box is coupled to the complete thermodynamic model of the refrigerating plant, including real working conditions (compressor efficiency, friction pressure losses and so on). A sensitivity study of the main design parameters affecting the global refrigerator performance has been developed (for fixed working temperatures) with reference to the thickness of the metallic plates, to the evaporator and condenser tube diameters and to the evaporator tube pitch (with fixed evaporator-to-condenser tube pitch ratio). The results obtained show that the proposed sensitivity analysis can yield quite reliable results (in comparison with much more complex, albeit more accurate mathematical optimisation algorithms) using small computational resources. The great importance of 2-D heat conduction in the metallic plates is shown, evidencing how the plate thickness and the evaporator and condenser tube diameters affect the global performance of the system according to the well-known “fin efficiency” effect. The influence of the evaporator and condenser tube diameters on the friction pressure losses is also outlined. Some practical suggestions are made in conclusion, regarding the criteria which should be adopted in the thermal design of a hot-wall refrigerator.
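The "fin efficiency" effect mentioned above has a classical closed form for a thin straight fin: eta = tanh(mL)/(mL) with m = sqrt(2h/(k t)) for a plate of thickness t. The sketch below evaluates it for illustrative material values only, not the paper's design parameters; it shows why the plate thickness matters so much for hot-wall performance.

```python
import math

def fin_efficiency(h, k, thickness, length):
    """Efficiency of a thin straight fin: eta = tanh(mL) / (mL),
    with m = sqrt(2h / (k * t)) for a plate of thickness t.
    h: convection coefficient W/(m^2 K); k: conductivity W/(m K);
    thickness and length in metres."""
    m = math.sqrt(2.0 * h / (k * thickness))
    mL = m * length
    return math.tanh(mL) / mL

# Illustrative values (not from the paper): a thin vs. a thicker steel plate.
eta_thin = fin_efficiency(h=10.0, k=50.0, thickness=0.0005, length=0.05)
eta_thick = fin_efficiency(h=10.0, k=50.0, thickness=0.0020, length=0.05)
```

A thicker plate conducts heat better along its plane (smaller mL), so its fin efficiency is higher, which matches the paper's observation that plate thickness and tube spacing drive the global performance.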
NASA Astrophysics Data System (ADS)
Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Sazli, M.; Yahya, Z. R.
2017-09-01
This study presents the application of optimisation methods to reduce the warpage of a side arm part. Autodesk Moldflow Insight software was used to analyse the warpage. A design of experiments (DOE) for response surface methodology (RSM) was constructed, and using the equation from RSM, particle swarm optimisation (PSO) was applied. The optimisation yields processing parameters with minimum warpage. Mould temperature, melt temperature, packing pressure, packing time and cooling time were selected as the variable parameters. Parameter selection was based on the factors most significantly affecting warpage as reported by previous researchers. The results show that warpage was improved by 28.16% with RSM and 28.17% with PSO; PSO improves on RSM by only 0.01%. Thus, optimisation using RSM is already sufficient to give the best combination of parameters and an optimum warpage value for the side arm part. The most significant parameter affecting warpage is packing pressure.
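Neither the fitted RSM equation nor the PSO settings appear in the abstract. A minimal PSO sketch minimising a hypothetical quadratic response surface for warpage (the coefficients and parameter ranges are invented for illustration) shows the mechanics of the second optimisation stage:

```python
import random

def pso(cost, bounds, n=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=3):
    """Minimal particle swarm optimiser over box constraints: each particle
    is pulled towards its personal best and the swarm's global best."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pcost = [cost(p) for p in pos]
    g = min(range(n), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n):
            for j, (lo, hi) in enumerate(bounds):
                vel[i][j] = (w * vel[i][j]
                             + c1 * rng.random() * (pbest[i][j] - pos[i][j])
                             + c2 * rng.random() * (gbest[j] - pos[i][j]))
                pos[i][j] = min(max(pos[i][j] + vel[i][j], lo), hi)
            c = cost(pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return gbest, gcost

# Hypothetical RSM surrogate: warpage (mm) as a quadratic in mould
# temperature (deg C) and packing pressure (MPa); coefficients invented.
warpage = lambda x: 0.5 + 0.002 * (x[0] - 60) ** 2 + 0.004 * (x[1] - 85) ** 2
best, w_min = pso(warpage, [(40, 80), (60, 100)])
```

This two-stage pattern (fit a cheap surrogate with DOE/RSM, then run a metaheuristic on the surrogate instead of on the full Moldflow simulation) is what makes the approach computationally affordable.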
Metaheuristic optimisation methods for approximate solving of singular boundary value problems
NASA Astrophysics Data System (ADS)
Sadollah, Ali; Yadav, Neha; Gao, Kaizhou; Su, Rong
2017-07-01
This paper presents a novel approximation technique based on metaheuristics and weighted residual function (WRF) for tackling singular boundary value problems (BVPs) arising in engineering and science. With the aid of certain fundamental concepts of mathematics, Fourier series expansion, and metaheuristic optimisation algorithms, singular BVPs can be approximated as an optimisation problem with boundary conditions as constraints. The target is to minimise the WRF (i.e. error function) constructed in approximation of BVPs. The scheme involves generational distance metric for quality evaluation of the approximate solutions against exact solutions (i.e. error evaluator metric). Four test problems including two linear and two non-linear singular BVPs are considered in this paper to check the efficiency and accuracy of the proposed algorithm. The optimisation task is performed using three different optimisers including the particle swarm optimisation, the water cycle algorithm, and the harmony search algorithm. Optimisation results obtained show that the suggested technique can be successfully applied for approximate solving of singular BVPs.
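As a small self-contained illustration of the weighted-residual idea, the sketch below uses a regular (non-singular) toy BVP, y'' + y = 0 with y(0) = 0 and y(1) = sin(1), rather than the paper's singular test problems, and a plain stochastic search in place of PSO, the water cycle algorithm or harmony search. The trial solution is a linear term plus a truncated Fourier sine series, so the boundary conditions hold by construction and only the residual needs minimising.

```python
import math
import random

def wrf(coeffs, xs):
    """Sum of squared residuals of y'' + y = 0 at collocation points xs,
    with trial y(x) = sin(1)*x + sum_k c_k sin(k*pi*x), which satisfies
    y(0) = 0 and y(1) = sin(1) by construction (exact solution: y = sin x)."""
    err = 0.0
    for x in xs:
        y, ypp = math.sin(1.0) * x, 0.0
        for k, c in enumerate(coeffs, start=1):
            s = math.sin(k * math.pi * x)
            y += c * s
            ypp -= c * (k * math.pi) ** 2 * s
        err += (ypp + y) ** 2
    return err

rng = random.Random(7)
xs = [i / 20 for i in range(1, 20)]            # interior collocation points
best, step = [0.0, 0.0, 0.0], 0.02
fbest = wrf(best, xs)
for _ in range(4000):                          # plain stochastic descent on WRF
    # The residual's curvature grows roughly like k**4, so propose
    # smaller perturbations for the higher Fourier modes.
    cand = [c + rng.gauss(0.0, step / (k + 1) ** 2) for k, c in enumerate(best)]
    f = wrf(cand, xs)
    if f < fbest:
        best, fbest = cand, f
    step *= 0.999                              # anneal the step size
y_mid = math.sin(1.0) * 0.5 + sum(c * math.sin((k + 1) * math.pi * 0.5)
                                  for k, c in enumerate(best))
```

The recovered midpoint value `y_mid` approaches sin(0.5); the paper applies the same recipe (basis expansion, residual-as-objective, metaheuristic optimiser) to genuinely singular BVPs where classical solvers struggle.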
Optimisation study of a vehicle bumper subsystem with fuzzy parameters
NASA Astrophysics Data System (ADS)
Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.
2012-10-01
This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. The automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and regarding the environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, this way facilitating the automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system level of failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for the uncertainty treatment in crash simulations. This process is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).
Sentinel-3 coverage-driven mission design: Coupling of orbit selection and instrument design
NASA Astrophysics Data System (ADS)
Cornara, S.; Pirondini, F.; Palmade, J. L.
2017-11-01
The first satellite of the Sentinel-3 series was launched in February 2016. Sentinel-3 payload suite encompasses the Ocean and Land Colour Instrument (OLCI) with a swath of 1270 km, the Sea and Land Surface Temperature Radiometer (SLSTR) yielding a dual-view scan with swaths of 1420 km (nadir) and 750 km (oblique view), the Synthetic Aperture Radar Altimeter (SRAL) working in Ku-band and C-band, and the dual-frequency Microwave Radiometer (MWR). In the early stages of mission and system design, the main driver for the Sentinel-3 reference orbit selection was the requirement to achieve a revisit time of two days or less globally over ocean areas with two satellites (i.e. 4-day global coverage with one satellite). The orbit selection was seamlessly coupled with the OLCI instrument design in terms of field of view (FoV) definition driven by the observation zenith angle (OZA) and sunglint constraints applied to ocean observations. The criticality of the global coverage requirement for ocean monitoring derives from the sunglint phenomenon, i.e. the impact on visible channels of the solar ray reflection on the water surface. This constraint was finally overcome thanks to the concurrent optimisation of the orbit parameters, notably the Local Time at Descending Node (LTDN), and the OLCI instrument FoV definition. The orbit selection process started with the identification of orbits with short repeat cycle (2-4 days), firstly to minimise the time required to achieve global coverage with existing constraints, and then to minimise the swath required to obtain global coverage and the maximum required OZA. This step yielded the selection of a 4-day repeat cycle orbit, thus allowing 2-day coverage with two adequately spaced satellites. Then suitable candidate orbits with higher repeat cycles were identified in the proximity of the selected altitudes and the reference orbit was ultimately chosen. 
The rationale was to keep the swath for global coverage as close as possible to the previous optimum value, but to tailor the repeat cycle length (i.e. the ground-track grid) to optimise the topography mission performance. The final choice converged on the sun-synchronous orbit 14 + 7/27, reference altitude ∼800 km, LTDN = 10h00. Extensive coverage analyses were carried out to characterise the mission performance and the fulfilment of the requirements, encompassing revisit time, number of acquisitions, observation viewing geometry and swath properties. This paper presents a comprehensive overview of the Sentinel-3 orbit selection, starting from coverage requirements and highlighting the close interaction with the instrument design activity.
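The arithmetic behind a "14 + 7/27" repeat orbit can be checked directly: 14 + 7/27 revolutions per day over a 27-day cycle gives 385 ground tracks, whose equatorial spacing follows immediately, and whose approximate altitude follows from Kepler's third law. The altitude estimate below uses the plain Keplerian period and ignores nodal regression and other perturbations, so it only roughly reproduces the true sun-synchronous repeat altitude.

```python
import math

MU = 398600.4418      # Earth's gravitational parameter, km^3/s^2
R_EQ = 6378.137       # Earth's equatorial radius, km

revs_per_day = 14 + 7 / 27                          # the "14 + 7/27" pattern
cycle_days = 27
orbits_per_cycle = round(revs_per_day * cycle_days)  # 14*27 + 7 = 385

# Equatorial spacing of the repeating ground-track grid.
spacing_deg = 360.0 / orbits_per_cycle
spacing_km = math.radians(spacing_deg) * R_EQ

# Rough altitude from the Keplerian orbital period.
period_s = 86400.0 / revs_per_day
semi_major_km = (MU * period_s ** 2 / (4.0 * math.pi ** 2)) ** (1.0 / 3.0)
altitude_km = semi_major_km - R_EQ
```

The ~104 km equatorial track spacing and ~800 km altitude that come out of this back-of-the-envelope calculation are consistent with the reference orbit quoted in the paper.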
Self-Efficacy Buffers the Relationship between Educational Disadvantage and Executive Functioning
Zahodne, Laura B.; Nowinski, Cindy J.; Gershon, Richard C.; Manly, Jennifer J.
2016-01-01
Objective Previous studies showed that control beliefs are more strongly related to global cognition and mortality among adults with low education, providing preliminary evidence that self-efficacy buffers against the negative impact of educational disadvantage on physical and cognitive health. The current study extends these findings to a nationally-representative sample of men and women aged 30 to 85 and explores which cognitive domains are most strongly associated with self-efficacy, educational attainment, and their interaction. Method Data were obtained from 1,032 adult (30-85) participants in the United States norming study for the NIH Toolbox. Self-efficacy, executive functioning, working memory, processing speed, episodic memory, and vocabulary were assessed with the NIH Toolbox. Multivariate analysis of covariance and follow-up regressions tested the hypothesis that self-efficacy would be more strongly related to cognitive performance among individuals with lower education, controlling for age, sex, race, ethnicity, education, reading level, testing language, and depressive symptoms. Results Higher education was associated with higher self-efficacy and better performance on all cognitive tests. Higher self-efficacy was associated with better set-switching and attention/inhibition. Significant self-efficacy by education interactions indicated that associations between self-efficacy and executive abilities were stronger for individuals with lower education. Specifically, individuals with low education but high self-efficacy performed similarly to individuals with high education. Conclusions This study provides evidence that self-efficacy beliefs buffer against the negative effects of low educational attainment on executive functioning. These results have implications for future policy and/or intervention work aimed at reducing the deleterious effects of educational disadvantage on later cognitive health. PMID:25877284
Chemical methods in the development of eco-efficient wood-based pellet production and technology.
Kuokkanen, Matti; Kuokkanen, Toivo; Stoor, Tuomas; Niinimäki, Jouko; Pohjonen, Veli
2009-09-01
Up to 20 million tons of waste wood biomass per year is left unused in Finland, mainly in the forests during forestry operations, because supply and demand do not meet. As a consequence of high heat energy prices, the looming threat of climate change and the greenhouse effect, and due to global as well as national demands to considerably increase the proportion of renewable energy, there is currently tremendous enthusiasm in Finland to substantially increase pellet production. As part of this European objective to increase the eco- and cost-efficient utilization of bio-energy from the European forest belt, the aim of our research group is, by means of multidisciplinary research and especially chemical methods, to promote the development of Nordic wood-based pellet production in both the qualitative and the quantitative sense. Wood-based pellets are classified as an emission-neutral fuel, which means that they are exempt from emission trading in the European Union. The main fields of pellet research and the chemical toolbox that has been developed for these studies, which includes a new specific staining and optical microscope method designed to determine the cross-linking of pellets in the presence of various binding compounds, are described in this paper. As model examples illustrating the benefits of this toolbox, experimental data are presented concerning Finnish wood pellets and corresponding wood-based pellets that use starch-containing waste potato peel residue and commercial lignosulfonate as binding materials. The initial results concerning the use of the developed and optimized specific staining and microscopic method with starch-containing potato peel residue as binding material are presented.
NASA Astrophysics Data System (ADS)
Courault, Romain; Franclet, Alexiane; Bourrand, Kévin; Bilodeau, Clélia; Saïd, Sonia; Cohen, Marianne
2018-05-01
More than others, arctic ecosystems are affected by the consequences of global climate change. Herbivores play numerous roles in both Scandinavian natural and cultural landscapes (Forbes et al., 2007). Wild reindeer (Rangifer tarandus L.) herds on the Hardangervidda plateau (Norway) constitute one of the isolated populations along the Fennoscandian mountain range. The study aims to understand the temporal and spatial variability of intra- and inter-annual home range extents and geophysical properties. We then characterize phenological variability using Corine Land Cover ecological habitat assessment and the bi-monthly NDVI index (MODIS 13Q1, 250 m). Thirdly, we test relationships between estimated reindeer densities and geophysical factors. Throughout the study, a Python toolbox ("GRiD") was developed and refined to fit biogeographical requirements. The toolbox lets the user choose inputs and then facilitates the assembly of raster datasets with a given clipping extent and resolution. The grid generation and cell extraction produce a single tabular output, which allows complex geostatistical analyses to be computed easily with regular spreadsheets. Results are based on reindeer home ranges, the associated extent (MODIS tile) and spatial resolution (250 m). A spatial mismatch of 0.6% was found between ecological habitats when comparing the raw (100 m) and new (250 m) datasets. Inter-annual home range analysis describes differences between inter-seasonal migrations (early spring, end of summer) and calving or capitalizing times. For intra-annual home ranges, significant correlations were found between estimated reindeer densities and both altitude and phenology. GRiD's performance and the biogeographical results suggest (1) enhancing geometric accuracy and (2) further examining links between estimated densities and NDVI.
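GRiD itself does not appear to be publicly released; the following hypothetical Python sketch shows the core grid-generation/cell-extraction idea the abstract describes, aggregating a raster into regular grid cells and emitting one tabular row per cell, ready for spreadsheet-based geostatistics. The function and variable names are invented.

```python
def grid_extract(raster, cell_size):
    """Aggregate a 2D raster (list of rows) into cell_size x cell_size
    blocks and return one (grid_row, grid_col, mean_value) tuple per cell."""
    rows = []
    for i in range(0, len(raster), cell_size):
        for j in range(0, len(raster[0]), cell_size):
            block = [raster[y][x]
                     for y in range(i, min(i + cell_size, len(raster)))
                     for x in range(j, min(j + cell_size, len(raster[0])))]
            rows.append((i // cell_size, j // cell_size,
                         sum(block) / len(block)))
    return rows

# Example: a 4x4 "NDVI" raster aggregated to 2x2 grid cells.
ndvi = [[0.1, 0.2, 0.5, 0.6],
        [0.3, 0.2, 0.7, 0.6],
        [0.4, 0.4, 0.8, 0.8],
        [0.4, 0.4, 0.9, 0.7]]
table = grid_extract(ndvi, 2)
```

Flattening each cell to one tabular row is the step that lets downstream correlation analyses (e.g. density vs. NDVI) be run in an ordinary spreadsheet rather than a GIS.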
Finite element modelling of the foot for clinical application: A systematic review.
Behforootan, Sara; Chatzistergos, Panagiotis; Naemi, Roozbeh; Chockalingam, Nachiappan
2017-01-01
Over the last two decades finite element modelling has been widely used to give new insight into foot and footwear biomechanics. However, its actual contribution to improving the therapeutic outcome of different pathological conditions of the foot, such as the diabetic foot, remains relatively limited. This is mainly because finite element modelling has only been used within the research domain. Clinically applicable finite element modelling can open the way for novel diagnostic techniques and novel methods for treatment planning/optimisation which would significantly enhance clinical practice. In this context, this review aims to provide an overview of modelling techniques in the field of foot and footwear biomechanics and to investigate their applicability in a clinical setting. Even though no integrated modelling system exists that could be directly used in the clinic and considerable progress is still required, the current literature offers a comprehensive toolbox for future work towards clinically applicable finite element modelling. The key challenges include collecting the information that is needed for geometry design, material property assignment and loading on a patient-specific basis, in a cost-effective and non-invasive way. The ultimate challenge for the implementation of any computational system into clinical practice is to ensure that it can produce reliable results for any person who belongs to the population for which it was developed. Consequently, this highlights the need for thorough and extensive validation of each individual step of the modelling process as well as for the overall validation of the final integrated system. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
Strategies for efficient resolution analysis in full-waveform inversion
NASA Astrophysics Data System (ADS)
Fichtner, A.; van Leeuwen, T.; Trampert, J.
2016-12-01
Full-waveform inversion is developing into a standard method in the seismological toolbox. It combines numerical wave propagation for heterogeneous media with adjoint techniques in order to improve tomographic resolution. However, resolution becomes increasingly difficult to quantify because of the enormous computational requirements. Here we present two families of methods that can be used for efficient resolution analysis in full-waveform inversion. They are based on the targeted extraction of resolution proxies from the Hessian matrix, which is too large to store and to compute explicitly. Fourier methods rest on the application of the Hessian to Earth models with harmonic oscillations. This yields the Fourier spectrum of the Hessian for few selected wave numbers, from which we can extract properties of the tomographic point-spread function for any point in space. Random probing methods use uncorrelated, random test models instead of harmonic oscillations. Auto-correlating the Hessian-model applications for sufficiently many test models also characterises the point-spread function. Both Fourier and random probing methods provide a rich collection of resolution proxies. These include position- and direction-dependent resolution lengths, and the volume of point-spread functions as indicator of amplitude recovery and inter-parameter trade-offs. The computational requirements of these methods are equivalent to approximately 7 conjugate-gradient iterations in full-waveform inversion. This is significantly less than the optimisation itself, which may require tens to hundreds of iterations to reach convergence. In addition to the theoretical foundations of the Fourier and random probing methods, we show various illustrative examples from real-data full-waveform inversion for crustal and mantle structure.
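The random probing idea can be conveyed with a minimal matrix-free sketch (illustrative only, not the authors' implementation; in full-waveform inversion `apply_H` would be an adjoint-based Hessian-vector product rather than an explicit matrix, and the proxies extracted are richer than a diagonal):

```python
import numpy as np

def probe_diagonal(apply_H, n, n_probes, rng):
    """Hutchinson-style random probing: applying H to random Rademacher
    test models and correlating input with output estimates diag(H),
    the simplest resolution proxy, without ever forming H."""
    est = np.zeros(n)
    for _ in range(n_probes):
        m = rng.choice([-1.0, 1.0], size=n)  # uncorrelated random test model
        est += m * apply_H(m)                # m * (Hm) has expectation diag(H)
    return est / n_probes

# Stand-in "Hessian": a small symmetric positive semi-definite matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
H = A @ A.T
diag_est = probe_diagonal(lambda m: H @ m, 6, 20000, rng)
```

The cost is a fixed number of operator applications, which is why such probing stays far cheaper than the inversion itself.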
NASA Astrophysics Data System (ADS)
Mérola, Fabienne; Erard, Marie; Fredj, Asma; Pasquier, Hélène
2016-03-01
New fluorescent proteins (FPs) are constantly discovered from natural sources, and submitted to intensive engineering based on random mutagenesis and directed evolution. However, most of these newly developed FPs fail to achieve all the performances required for their bioimaging applications. The design of highly optimised FP-based reporters, simultaneously displaying appropriate colour, multimeric state, chromophore maturation, brightness, photostability and environmental sensitivity will require a better understanding of the structural and dynamic determinants of FP photophysics. The recent development of cyan fluorescent proteins (CFPs) like mCerulean3, mTurquoise2 and Aquamarine brings a different view on these questions, as in this particular case, a step by step evaluation of critical mutations has been performed within a family of spectrally identical and evolutionary close variants. These efforts have led to CFPs with quantum yields close to unity, near single exponential emission decays, high photostability and complete insensitivity to pH, making them ideal choices as energy transfer donors in FRET and FLIM imaging applications. During this process, it was found that a proper amino-acid choice at only two positions (148 and 65) is sufficient to transform the performances of CFPs: with the help of structural and theoretical investigations, we rationalise here how these two positions critically control the CFP photophysics, in the context of FPs derived from the Aequorea victoria species. Today, these results provide a useful toolbox for upgrading the different CFP donors carried by FRET biosensors. They also trace the route towards the de novo design of FP-based optogenetic devices that will be perfectly tailored to dedicated imaging and sensing applications.
Hitchhiker's guide to multi-dimensional plant pathology.
Saunders, Diane G O
2015-02-01
Filamentous pathogens pose a substantial threat to global food security. One central question in plant pathology is how pathogens cause infection and manage to evade or suppress plant immunity to promote disease. With many technological advances over the past decade, including DNA sequencing technology, an array of new tools has become embedded within the toolbox of next-generation plant pathologists. By employing a multidisciplinary approach plant pathologists can fully leverage these technical advances to answer key questions in plant pathology, aimed at achieving global food security. This review discusses the impact of: cell biology and genetics on progressing our understanding of infection structure formation on the leaf surface; biochemical and molecular analysis to study how pathogens subdue plant immunity and manipulate plant processes through effectors; genomics and DNA sequencing technologies on all areas of plant pathology; and new forms of collaboration on accelerating exploitation of big data. As we embark on the next phase in plant pathology, the integration of systems biology promises to provide a holistic perspective of plant–pathogen interactions from big data and only once we fully appreciate these complexities can we design truly sustainable solutions to preserve our resources.
Sharing tools and best practice in Global Sensitivity Analysis within academia and with industry
NASA Astrophysics Data System (ADS)
Wagener, T.; Pianosi, F.; Noacco, V.; Sarrazin, F.
2017-12-01
We have spent years trying to improve the use of global sensitivity analysis (GSA) in earth and environmental modelling. Our efforts included (1) the development of tools that provide easy access to widely used GSA methods, (2) the definition of workflows so that best practice is shared in an accessible way, and (3) the development of algorithms to close gaps in available GSA methods (such as moment independent strategies) and to make GSA applications more robust (such as convergence criteria). These elements have been combined in our GSA Toolbox, called SAFE (www.safetoolbox.info), which has up to now been adopted by over 1000 (largely) academic users worldwide. However, despite growing uptake in academic circles and across a wide range of application areas, transfer to industry applications has been difficult. Initial market research regarding opportunities and barriers for uptake revealed a large potential market, but also highlighted a significant lack of knowledge regarding state-of-the-art methods and their potential value for end-users. We will present examples and discuss our experience so far in trying to overcome these problems and move beyond academia in distributing GSA tools and expertise.
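As a flavour of the methods bundled in such toolboxes, here is a minimal sketch of Morris-style elementary effects screening on a toy model (illustrative only; it does not use SAFE's actual API, and the function names are assumptions):

```python
import numpy as np

def elementary_effects(model, n_params, n_samples, delta, rng):
    """Morris-style screening: average absolute one-at-a-time finite
    differences (elementary effects) rank the influence of each input."""
    mu_star = np.zeros(n_params)
    for _ in range(n_samples):
        x = rng.uniform(0, 1 - delta, size=n_params)  # random base point
        fx = model(x)
        for i in range(n_params):
            xp = x.copy()
            xp[i] += delta                            # perturb one input
            mu_star[i] += abs(model(xp) - fx) / delta
    return mu_star / n_samples                        # mean |EE| per input

model = lambda x: 5.0 * x[0] + 0.1 * x[1] + x[2] ** 2
rng = np.random.default_rng(1)
mu = elementary_effects(model, 3, 200, 0.1, rng)
```

For this toy model the screening correctly ranks the first input as dominant and the second as negligible; convergence criteria of the kind mentioned above would monitor how `mu` stabilises as `n_samples` grows.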
CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, III, F. G.
2016-07-29
One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface, using the GoldSim software, to the STADIUM® code developed by SIMCO Technologies, Inc. and to LeachXS/ORCHESTRA developed by the Energy research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM® code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM® code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results, with the code developers having provided validation test results as part of their code QA documentation; and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes.
Therefore, it is concluded that the codes can be used to support performance assessment. This conclusion takes into account the QA documentation produced for the partner codes and for the CBP Toolbox.
NASA Astrophysics Data System (ADS)
Johnson, S. E.; Vel, S. S.; Cook, A. C.; Song, W. J.; Gerbi, C. C.; Okaya, D. A.
2014-12-01
Owing to the abundance of highly anisotropic minerals in the crust, the Voigt and Reuss bounds on the seismic velocities can be separated by more than 1 km/s. These bounds are determined by modal mineralogy and crystallographic preferred orientations (CPO) of the constituent minerals, but where the true velocities lie between these bounds is determined by other fabric parameters such as the shapes, shape-preferred orientations, and spatial arrangements of grains. Thus, the calculation of accurate bulk stiffness relies on explicitly treating the grain-scale heterogeneity, and the same principle applies at larger scales, for example calculating accurate bulk stiffness for a crustal volume with varying proportions and distributions of folds or shear zones. We have developed stand-alone GUI software - ESP Toolbox - for the calculation of 3D bulk elastic and seismic properties of heterogeneous and polycrystalline materials using image or EBSD data. The GUI includes a number of different homogenization techniques, including Voigt, Reuss, Hill, geometric mean, self-consistent and asymptotic expansion homogenization (AEH) methods. The AEH method, which uses a finite element mesh, is most accurate since it explicitly accounts for elastic interactions of constituent minerals/phases. The user need only specify the microstructure and material properties of the minerals/phases. We use the Toolbox to explore changes in bulk elasticity and related seismic anisotropy caused by specific variables, including: (a) the quartz alpha-beta phase change in rocks with varying proportions of quartz, (b) changes in modal mineralogy and CPO fabric that occur during progressive deformation and metamorphism, and (c) shear zones of varying thickness, abundance and geometry in continental crust. 
The Toolbox allows rapid sensitivity analysis around these and other variables, and the resulting bulk stiffness matrices can be used to populate volumes for synthetic wave propagation experiments that allow direct visualization of how variables of interest might affect propagation at a variety of scales. Sensitivity analyses also illustrate the value of the more precise AEH method. The ESP Toolbox can be downloaded here: http://umaine.edu/mecheng/faculty-and-staff/senthil-vel/software/
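The outermost of the homogenisations mentioned above have simple closed forms: the Voigt bound averages phase stiffnesses (uniform strain) and the Reuss bound averages compliances (uniform stress). A minimal sketch (illustrative; the ESP Toolbox works from image or EBSD data and CPO, not hand-built isotropic phases):

```python
import numpy as np

def voigt_reuss_bounds(stiffnesses, fractions):
    """Voigt and Reuss bounds on aggregate stiffness from 6x6 phase
    stiffness matrices (Voigt notation) and volume fractions."""
    C_voigt = sum(f * C for f, C in zip(fractions, stiffnesses))
    S_reuss = sum(f * np.linalg.inv(C) for f, C in zip(fractions, stiffnesses))
    return C_voigt, np.linalg.inv(S_reuss)

def iso_stiffness(lam, mu):
    """Toy isotropic stiffness matrix from Lame parameters (GPa)."""
    C = np.zeros((6, 6))
    C[:3, :3] = lam
    C[np.arange(3), np.arange(3)] = lam + 2 * mu
    C[np.arange(3, 6), np.arange(3, 6)] = mu
    return C

C_v, C_r = voigt_reuss_bounds([iso_stiffness(50, 30), iso_stiffness(20, 10)],
                              [0.6, 0.4])
```

The true aggregate stiffness lies between the two bounds; placing it there is exactly what the grain-scale homogenisation schemes such as AEH are for.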
Basic Radar Altimetry Toolbox: Tools to Use Radar Altimetry for Geodesy
NASA Astrophysics Data System (ADS)
Rosmorduc, V.; Benveniste, J. J.; Bronner, E.; Niejmeier, S.
2010-12-01
Radar altimetry is a technique whose applications and uses are steadily expanding. While considerable effort has been devoted to oceanography users (including easy-to-use data), using these data for geodesy, especially in combination with data from ESA's GOCE mission, remains somewhat difficult. ESA and CNES therefore had the Basic Radar Altimetry Toolbox developed (as well as, on the ESA side, the GOCE User Toolbox, the two being linked). The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data. The software is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and the future Saral missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways: - as a data reading tool, with APIs for C, Fortran, Matlab and IDL, - as processing/extraction routines, through the on-line command mode, - as an educational and quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. It is an opportunity to teach remote sensing with practical training. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings. About 1200 people had downloaded it by Summer 2010, with many "newcomers" to altimetry among them. User feedback, developments in altimetry, and practical experience showed that new interesting features could be added.
Some have been added and/or improved in version 2; others are ongoing or under discussion. Examples and data use cases on geodesy will be presented. BRAT is developed under contract with ESA and CNES.
NASA Astrophysics Data System (ADS)
Thomas, Benjamin A.; Cuplov, Vesna; Bousse, Alexandre; Mendes, Adriana; Thielemans, Kris; Hutton, Brian F.; Erlandsson, Kjell
2016-11-01
Positron emission tomography (PET) images are degraded by a phenomenon known as the partial volume effect (PVE). Approaches have been developed to reduce PVEs, typically through the utilisation of structural information provided by other imaging modalities such as MRI or CT. These methods, known as partial volume correction (PVC) techniques, reduce PVEs by compensating for the effects of the scanner resolution, thereby improving the quantitative accuracy. The PETPVC toolbox described in this paper comprises a suite of methods, both classic and more recent approaches, for the purposes of applying PVC to PET data. Eight core PVC techniques are available. These core methods can be combined to create a total of 22 different PVC techniques. Simulated brain PET data are used to demonstrate the utility of the toolbox in idealised conditions, the effects of applying PVC with mismatched point-spread function (PSF) estimates and the potential of novel hybrid PVC methods to improve the quantification of lesions. All anatomy-based PVC techniques achieve complete recovery of the PET signal in cortical grey matter (GM) when performed in idealised conditions. Applying deconvolution-based approaches results in incomplete recovery due to premature termination of the iterative process. PVC techniques are sensitive to PSF mismatch, causing a bias of up to 16.7% in GM recovery when over-estimating the PSF by 3 mm. The recovery of both GM and a simulated lesion was improved by combining two PVC techniques together. The PETPVC toolbox has been written in C++, supports Windows, Mac and Linux operating systems, is open-source and publicly available.
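Among the deconvolution-based approaches, the Van Cittert iteration is the simplest to state: repeatedly add back the residual between the observed image and the re-blurred current estimate. A 1-D sketch under idealised conditions (illustrative only; PETPVC's implementations are more elaborate, e.g. reblurred and regularised variants in 3-D):

```python
import numpy as np

def gauss_psf(n, fwhm):
    """Normalised 1-D Gaussian point-spread function."""
    sigma = fwhm / 2.355
    x = np.arange(n) - n // 2
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def van_cittert(observed, psf, alpha=1.0, n_iter=30):
    """Van Cittert deconvolution: f <- f + alpha * (g - h*f), partially
    undoing the scanner PSF; truncating the iterations leaves the
    incomplete recovery noted in the abstract."""
    f = observed.copy()
    for _ in range(n_iter):
        blurred = np.convolve(f, psf, mode="same")
        f = f + alpha * (observed - blurred)
    return f

truth = np.zeros(64)
truth[31] = 1.0                                    # a point-like 'hot' lesion
psf = gauss_psf(21, fwhm=4.0)
observed = np.convolve(truth, psf, mode="same")    # PVE-degraded signal
recovered = van_cittert(observed, psf)
```

After a finite number of iterations the peak is only partially restored, which is exactly why deconvolution-based PVC stops short of complete recovery.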
Peng, Shichun; Ma, Yilong; Spetsieris, Phoebe G; Mattis, Paul; Feigin, Andrew; Dhawan, Vijay; Eidelberg, David
2013-01-01
In order to generate imaging biomarkers from disease-specific brain networks, we have implemented a general toolbox to rapidly perform scaled subprofile modeling (SSM) based on principal component analysis (PCA) on brain images of patients and normals. This SSMPCA toolbox can define spatial covariance patterns whose expression in individual subjects can discriminate patients from controls or predict behavioral measures. The technique may depend on differences in spatial normalization algorithms and brain imaging systems. We have evaluated the reproducibility of characteristic metabolic patterns generated by SSMPCA in patients with Parkinson's disease (PD). We used [18F]fluorodeoxyglucose PET scans from PD patients and normal controls. Motor-related (PDRP) and cognition-related (PDCP) metabolic patterns were derived from images spatially normalized using four versions of SPM software (spm99, spm2, spm5 and spm8). Differences between these patterns and subject scores were compared across multiple independent groups of patients and control subjects. These patterns and subject scores were highly reproducible with different normalization programs in terms of disease discrimination and cognitive correlation. Subject scores were also comparable in PD patients imaged across multiple PET scanners. Our findings confirm a very high degree of consistency among brain networks and their clinical correlates in PD using images normalized in four different SPM platforms. SSMPCA toolbox can be used reliably for generating disease-specific imaging biomarkers despite the continued evolution of image preprocessing software in the neuroimaging community. Network expressions can be quantified in individual patients independent of different physical characteristics of PET cameras. PMID:23671030
MagPy: A Python toolbox for controlling Magstim transcranial magnetic stimulators.
McNair, Nicolas A
2017-01-30
To date, transcranial magnetic stimulation (TMS) studies manipulating stimulation parameters have largely used blocked paradigms. However, altering these parameters on a trial-by-trial basis in Magstim stimulators is complicated by the need to send regular (1 Hz) commands to the stimulator. Additionally, effecting such control interferes with the ability to send TMS pulses or simultaneously present stimuli with high temporal precision. This manuscript presents the MagPy toolbox, a Python software package that provides full control over Magstim stimulators via the serial port. It is able to maintain this control with no impact on concurrent processing, such as stimulus delivery. In addition, a specially-designed "QuickFire" serial cable is specified that allows MagPy to trigger TMS pulses with very low latency. In a series of experimental simulations, MagPy was able to maintain uninterrupted remote control over the connected Magstim stimulator across all testing sessions. In addition, having MagPy enabled had no effect on stimulus timing - all stimuli were presented for precisely the duration specified. Finally, using the QuickFire cable, MagPy was able to elicit TMS pulses with sub-millisecond latencies. The MagPy toolbox allows for experiments that require manipulating stimulation parameters from trial to trial. Furthermore, it can achieve this in contexts that require tight control over timing, such as those seeking to combine TMS with fMRI or EEG. Together, the MagPy toolbox and QuickFire serial cable provide an effective means for controlling Magstim stimulators during experiments while ensuring high-precision timing. Copyright © 2016 Elsevier B.V. All rights reserved.
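The core difficulty described above, sending regular keep-alive commands without blocking stimulus presentation, maps naturally onto a background thread. A minimal sketch of that pattern with a stand-in `send` callable (an assumption for illustration; this is not MagPy's actual API, and a real implementation would write to the serial port and parse replies):

```python
import threading
import time

class KeepAlive:
    """Background thread that sends a poll at a fixed interval so a serial
    device stays under remote control while the main thread remains free
    for stimulus delivery."""
    def __init__(self, send, interval=1.0):
        self.send = send
        self.interval = interval
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def _run(self):
        # Event.wait doubles as a sleep that can be interrupted by stop().
        while not self._stop.wait(self.interval):
            self.send(b"maintain")          # periodic poll keeps remote control

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()

sent = []
ka = KeepAlive(sent.append, interval=0.05)  # short interval for the demo
ka.start()
time.sleep(0.22)                            # main thread free for other work
ka.stop()
```

The main thread never blocks on the polling, which is the property the abstract reports MagPy achieves for concurrent stimulus presentation.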
Glaser, Johann; Beisteiner, Roland; Bauer, Herbert; Fischmeister, Florian Ph S
2013-11-09
In concurrent EEG/fMRI recordings, EEG data are impaired by fMRI gradient artifacts which exceed the EEG signal by several orders of magnitude. While several algorithms exist to correct the EEG data, these algorithms lack the flexibility to either leave out or add new steps. The open-source MATLAB toolbox FACET presented here is a modular toolbox for the fast and flexible correction and evaluation of imaging artifacts in concurrently recorded EEG datasets. It consists of an Analysis, a Correction and an Evaluation framework, allowing the user to choose from different artifact correction methods with various pre- and post-processing steps to form flexible combinations. The quality of the chosen correction approach can then be evaluated and compared across different settings. FACET was evaluated on a dataset provided with the FMRIB plugin for EEGLAB using two different correction approaches: Averaged Artifact Subtraction (AAS; Allen et al., NeuroImage 12(2):230-239, 2000) and FMRI Artifact Slice Template Removal (FASTR; Niazy et al., NeuroImage 28(3):720-737, 2005). The obtained results were compared to the FASTR algorithm implemented in the EEGLAB plugin FMRIB. No differences were found between the FACET implementation of FASTR and the original algorithm across all gradient-artifact-relevant performance indices. The FACET toolbox not only provides facilities for all three tasks - data analysis, artifact correction, and evaluation and documentation of the results - but also offers an easily extendable framework for the development and evaluation of new approaches.
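The AAS method referenced above is conceptually simple: epoch the recording at the gradient-artifact onsets, average the epochs into a template, and subtract the template from every occurrence. A minimal NumPy sketch (illustrative; FACET is a MATLAB toolbox and its pipeline adds alignment, filtering and further pre- and post-processing):

```python
import numpy as np

def aas_correct(eeg, onsets, length):
    """Averaged Artifact Subtraction in its simplest form: the gradient
    artifact repeats with the scanner timing, so the epoch average is an
    artifact template that can be subtracted from each occurrence."""
    epochs = np.stack([eeg[o:o + length] for o in onsets])
    template = epochs.mean(axis=0)             # average artifact template
    corrected = eeg.copy()
    for o in onsets:
        corrected[o:o + length] -= template    # remove template per epoch
    return corrected

rng = np.random.default_rng(2)
n, length = 1000, 100
onsets = np.arange(0, n, length)
artifact = 50.0 * np.sin(np.linspace(0, 6 * np.pi, length))  # repeating artifact
eeg_true = rng.standard_normal(n)                            # underlying EEG
eeg = eeg_true + np.tile(artifact, n // length)
cleaned = aas_correct(eeg, onsets, length)
```

Because the artifact repeats exactly in this toy example, subtraction leaves only a small residual from averaging the EEG itself into the template.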
Fayyaz S, S Kiavash; Liu, Xiaoyue Cathy; Zhang, Guohui
2017-01-01
The social functions of urbanized areas are highly dependent on and supported by convenient access to public transportation systems, particularly for less privileged populations with restrained auto ownership. To accurately evaluate public transit accessibility, it is critical to capture the spatiotemporal variation of transit services. This can be achieved by measuring the shortest paths or minimum travel times between origin-destination (OD) pairs at each time-of-day (e.g. every minute). In recent years, General Transit Feed Specification (GTFS) data has been gaining popularity for between-station travel time estimation due to its interoperability in spatiotemporal analytics. Many software packages, such as ArcGIS, have developed toolboxes to enable travel time estimation with GTFS. They perform reasonably well in calculating travel time between OD pairs for a specific time-of-day (e.g. 8:00 AM), yet can become computationally inefficient and impractical as the data dimensions increase (e.g. all times-of-day and large networks). In this paper, we introduce a new algorithm that is computationally elegant and mathematically efficient to address this issue. An open-source toolbox written in C++ is developed to implement the algorithm. We implemented the algorithm on the City of St. George's transit network to showcase the accessibility analysis enabled by the toolbox. The experimental evidence shows a significant reduction in computational time. The proposed algorithm and toolbox are easily transferable to other transit networks to allow transit agencies and researchers to perform high-resolution transit performance analysis.
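The flavour of efficient timetable queries can be conveyed with the Connection Scan Algorithm, which computes earliest arrivals in one pass over departure-sorted connections (a standard technique shown here for illustration, not necessarily the authors' algorithm; transfers and footpaths are omitted):

```python
def earliest_arrival(connections, source, target, depart_time):
    """Connection Scan Algorithm: scan timetable connections in departure
    order; a connection is usable if its departure stop has already been
    reached by its departure time."""
    INF = float("inf")
    best = {source: depart_time}
    for dep_stop, arr_stop, dep_t, arr_t in sorted(connections,
                                                   key=lambda c: c[2]):
        if best.get(dep_stop, INF) <= dep_t and arr_t < best.get(arr_stop, INF):
            best[arr_stop] = arr_t       # reachable connection that improves
    return best.get(target, INF)

# Toy timetable: (from, to, depart, arrive) in minutes after midnight,
# the kind of connections derivable from GTFS stop_times.
timetable = [
    ("A", "B", 480, 490),   # 8:00 -> 8:10
    ("B", "C", 495, 505),   # catchable after arriving at B
    ("A", "C", 485, 520),   # direct but slower
    ("B", "C", 488, 498),   # departs before we reach B -- unusable
]
```

One scan answers a query for a single departure time; repeating it per minute of the day is what makes naive tools slow and motivates the paper's more efficient algorithm.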
MNPBEM - A Matlab toolbox for the simulation of plasmonic nanoparticles
NASA Astrophysics Data System (ADS)
Hohenester, Ulrich; Trügler, Andreas
2012-02-01
MNPBEM is a Matlab toolbox for the simulation of metallic nanoparticles (MNP), using a boundary element method (BEM) approach. The main purpose of the toolbox is to solve Maxwell's equations for a dielectric environment where bodies with homogeneous and isotropic dielectric functions are separated by abrupt interfaces. Although the approach is in principle suited for arbitrary body sizes and photon energies, it is tested (and probably works best) for metallic nanoparticles with sizes ranging from a few to a few hundreds of nanometers, and for frequencies in the optical and near-infrared regime. The toolbox has been implemented with Matlab classes. These classes can be easily combined, which has the advantage that one can adapt the simulation programs flexibly for various applications.
Program summary
Program title: MNPBEM
Catalogue identifier: AEKJ_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKJ_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License v2
No. of lines in distributed program, including test data, etc.: 15 700
No. of bytes in distributed program, including test data, etc.: 891 417
Distribution format: tar.gz
Programming language: Matlab 7.11.0 (R2010b)
Computer: Any which supports Matlab 7.11.0 (R2010b)
Operating system: Any which supports Matlab 7.11.0 (R2010b)
RAM: ⩾1 GByte
Classification: 18
Nature of problem: Solve Maxwell's equations for dielectric particles with homogeneous dielectric functions separated by abrupt interfaces.
Solution method: Boundary element method using electromagnetic potentials.
Running time: Depending on surface discretization, between seconds and hours.
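For spheres much smaller than the wavelength, the quasistatic dipole limit has a closed form that is often used as a sanity check for BEM solvers of this kind. A sketch in toy units (c = 1, frequencies in units of the plasma frequency; illustrative only, not part of MNPBEM):

```python
import numpy as np

def drude_eps(omega, omega_p, gamma):
    """Drude dielectric function of a free-electron metal."""
    return 1.0 - omega_p**2 / (omega * (omega + 1j * gamma))

def quasistatic_ext(omega, radius, eps, c=1.0):
    """Extinction cross-section of a small sphere in vacuum in the
    quasistatic (dipole) limit: sigma = k * Im(alpha), with the
    Clausius-Mossotti polarizability alpha = 4*pi*a^3*(eps-1)/(eps+2)."""
    k = omega / c
    alpha = 4 * np.pi * radius**3 * (eps - 1.0) / (eps + 2.0)
    return k * np.imag(alpha)

omega = np.linspace(0.2, 1.0, 400)          # frequencies in units of omega_p
eps = drude_eps(omega, omega_p=1.0, gamma=0.02)
sigma = quasistatic_ext(omega, radius=0.05, eps=eps)
peak = omega[np.argmax(sigma)]              # dipole plasmon resonance
```

For a Drude metal in vacuum the dipole plasmon sits where Re(eps) = -2, i.e. near omega_p/sqrt(3); a full BEM result should approach this spectrum as the particle radius shrinks.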
Boundary element based multiresolution shape optimisation in electrostatics
NASA Astrophysics Data System (ADS)
Bandara, Kosala; Cirak, Fehmi; Of, Günther; Steinbach, Olaf; Zapletal, Jan
2015-09-01
We consider the shape optimisation of high-voltage devices subject to electrostatic field equations by combining fast boundary elements with multiresolution subdivision surfaces. The geometry of the domain is described with subdivision surfaces and different resolutions of the same geometry are used for optimisation and analysis. The primal and adjoint problems are discretised with the boundary element method using a sufficiently fine control mesh. For shape optimisation the geometry is updated starting from the coarsest control mesh with increasingly finer control meshes. The multiresolution approach effectively prevents the appearance of non-physical geometry oscillations in the optimised shapes. Moreover, there is no need for mesh regeneration or smoothing during the optimisation due to the absence of a volume mesh. We present several numerical experiments and one industrial application to demonstrate the robustness and versatility of the developed approach.
Tail mean and related robust solution concepts
NASA Astrophysics Data System (ADS)
Ogryczak, Włodzimierz
2014-01-01
Robust optimisation may be viewed as a multicriteria optimisation problem in which the objectives correspond to the scenarios, although their probabilities are unknown or imprecise. The simplest robust solution concept represents a conservative approach focused on optimising the worst-case scenario result. A softer concept allows one to optimise the tail mean, thus combining performances under multiple worst scenarios. We show that, for robust models that allow the probabilities to vary only within given intervals, the tail mean represents the robust solution only when the probabilities are upper bounded. For arbitrary intervals of probabilities the corresponding robust solution may be expressed by optimising an appropriately combined mean and tail mean criterion, thus remaining easily implementable with auxiliary linear inequalities. Moreover, we use the tail mean concept to develop linear-programming-implementable robust solution concepts related to risk-averse optimisation criteria.
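In the discrete equiprobable-scenario case, the tail mean is simply the average of the worst beta-fraction of outcomes, and the LP-implementable form alluded to above corresponds to the Rockafellar-Uryasev representation of CVaR. A small sketch (the notation is illustrative, not the paper's):

```python
def tail_mean(losses, beta):
    """Mean of the worst beta-fraction of scenario losses: the tail mean,
    interpolating between the worst case (beta -> 0) and the plain mean
    (beta = 1)."""
    m = len(losses)
    k = beta * m                          # (possibly fractional) tail size
    worst = sorted(losses, reverse=True)
    full, frac = int(k), k - int(k)
    total = sum(worst[:full]) + (frac * worst[full] if frac else 0.0)
    return total / k

def tail_mean_rockafellar(losses, beta):
    """Same quantity via the auxiliary-variable form
    min_eta eta + (1/(beta*m)) * sum(max(0, l - eta)), which becomes a set
    of linear inequalities inside an LP. The objective is piecewise linear
    in eta with breakpoints at the losses, so scanning them is exact."""
    m = len(losses)
    return min(eta + sum(max(0.0, l - eta) for l in losses) / (beta * m)
               for eta in losses)

losses = [4.0, 1.0, 7.0, 3.0, 5.0]
```

Both routines agree, which is precisely why the tail mean criterion remains implementable with auxiliary linear inequalities inside a larger optimisation model.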
Agren, Rasmus; Liu, Liming; Shoaie, Saeed; Vongsangnak, Wanwipa; Nookaew, Intawat; Nielsen, Jens
2013-01-01
We present the RAVEN (Reconstruction, Analysis and Visualization of Metabolic Networks) Toolbox: a software suite that allows for semi-automated reconstruction of genome-scale models. It makes use of published models and/or the KEGG database, coupled with extensive gap-filling and quality control features. The software suite also contains methods for visualizing simulation results and omics data, as well as a range of methods for performing simulations and analyzing the results. The software is a useful tool for system-wide data analysis in a metabolic context and for streamlined reconstruction of metabolic networks based on protein homology. The RAVEN Toolbox workflow was applied in order to reconstruct a genome-scale metabolic model for the important microbial cell factory Penicillium chrysogenum Wisconsin 54-1255. The model was validated in a bibliomic study of 440 references in total, and it comprises 1471 unique biochemical reactions and 1006 ORFs. It was then used to study the roles of ATP and NADPH in the biosynthesis of penicillin, and to identify potential metabolic engineering targets for maximization of penicillin production. PMID:23555215
KNIME4NGS: a comprehensive toolbox for next generation sequencing analysis.
Hastreiter, Maximilian; Jeske, Tim; Hoser, Jonathan; Kluge, Michael; Ahomaa, Kaarin; Friedl, Marie-Sophie; Kopetzky, Sebastian J; Quell, Jan-Dominik; Mewes, H Werner; Küffner, Robert
2017-05-15
Analysis of Next Generation Sequencing (NGS) data requires the processing of large datasets by chaining various tools with complex input and output formats. In order to automate data analysis, we propose to standardize NGS tasks into modular workflows. This simplifies reliable handling and processing of NGS data, and the corresponding solutions become substantially more reproducible and easier to maintain. Here, we present a documented, Linux-based toolbox of 42 processing modules that can be combined into workflows facilitating a variety of tasks such as DNAseq and RNAseq analysis. We also describe important technical extensions. The high-throughput executor (HTE) helps to increase reliability and to reduce manual interventions when processing complex datasets. We also provide a dedicated binary manager that assists users in obtaining the modules' executables and keeping them up to date. As the basis for this actively developed toolbox we use the workflow management software KNIME. See http://ibisngs.github.io/knime4ngs for nodes and user manual (GPLv3 license). robert.kueffner@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.
ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling
Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf
2012-01-01
Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MATLAB-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated, and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270
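The guaranteed-invalidation idea can be illustrated with a toy interval model. This sketch uses plain interval arithmetic, not ADMIT's constraint-relaxation machinery; the model y = p*x and all numbers below are hypothetical:

```python
def output_interval(x, p_lo, p_hi):
    # Reachable outputs of the toy model y = p * x for p in [p_lo, p_hi].
    lo, hi = p_lo * x, p_hi * x
    return (min(lo, hi), max(lo, hi))

def invalidated(data, p_lo, p_hi):
    # data: list of (x, y_lo, y_hi) uncertain measurements.
    # If any measured interval cannot intersect the reachable output interval
    # for ANY admissible parameter, that is a certificate of invalidity.
    for x, y_lo, y_hi in data:
        lo, hi = output_interval(x, p_lo, p_hi)
        if hi < y_lo or lo > y_hi:
            return True
    return False
```

The guarantee comes from reasoning over whole sets of parameters and outputs at once: a model is rejected only when no admissible parameter could have produced the data.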
Pernet, Cyril R.; Wilcox, Rand; Rousselet, Guillaume A.
2012-01-01
Pearson's correlation measures the strength of the association between two variables. The technique is, however, restricted to linear associations and is overly sensitive to outliers. Indeed, a single outlier can result in a highly inaccurate summary of the data. Yet, it remains the most commonly used measure of association in psychology research. Here we describe a free MATLAB-based toolbox (http://sourceforge.net/projects/robustcorrtool/) that computes robust measures of association between two or more random variables: the percentage-bend correlation and skipped correlations. After illustrating how to use the toolbox, we show that robust methods, in which outliers are down-weighted or removed and accounted for in significance testing, provide better estimates of the true association with accurate false-positive control and without loss of power. The different correlation methods were tested with normal data and with normal data contaminated with marginal or bivariate outliers. We report estimates of effect size, false-positive rate and power, and advise on which technique to use depending on the data at hand. PMID:23335907
A Toolbox for Ab Initio 3-D Reconstructions in Single-particle Electron Microscopy
Voss, Neil R; Lyumkis, Dmitry; Cheng, Anchi; Lau, Pick-Wei; Mulder, Anke; Lander, Gabriel C; Brignole, Edward J; Fellmann, Denis; Irving, Christopher; Jacovetty, Erica L; Leung, Albert; Pulokas, James; Quispe, Joel D; Winkler, Hanspeter; Yoshioka, Craig; Carragher, Bridget; Potter, Clinton S
2010-01-01
Structure determination of a novel macromolecular complex via single-particle electron microscopy depends upon overcoming the challenge of establishing a reliable 3-D reconstruction using only 2-D images. There are a variety of strategies that deal with this issue, but not all of them are readily accessible and straightforward to use. We have developed a “toolbox” of ab initio reconstruction techniques that provide several options for calculating 3-D volumes in an easily managed and tightly controlled work-flow that adheres to standard conventions and formats. This toolbox is designed to streamline the reconstruction process by removing the necessity for bookkeeping, while facilitating transparent data transfer between different software packages. It currently includes procedures for calculating ab initio reconstructions via random or orthogonal tilt geometry, tomograms, and common lines, all of which have been tested using the 50S ribosomal subunit. Our goal is that the accessibility of multiple independent reconstruction algorithms via this toolbox will improve the ease with which models can be generated, and provide a means of evaluating the confidence and reliability of the final reconstructed map. PMID:20018246
Koul, Atesh; Becchio, Cristina; Cavallo, Andrea
2017-12-12
Recent years have seen an increased interest in machine learning-based predictive methods for analyzing quantitative behavioral data in experimental psychology. While these methods can achieve relatively greater sensitivity compared to conventional univariate techniques, they still lack an established and accessible implementation. The aim of the current work was to build an open-source R toolbox, "PredPsych", that makes these methods readily available to all psychologists. PredPsych is a user-friendly R toolbox based on machine-learning predictive algorithms. In this paper, we present the framework of PredPsych via the analysis of a recently published multiple-subject motion-capture dataset. In addition, we discuss examples of possible research questions that can be addressed with the machine-learning algorithms implemented in PredPsych but cannot easily be addressed with univariate statistical analysis. We anticipate that PredPsych will be of use to researchers with limited programming experience, not only in the field of psychology but also in clinical neuroscience, enabling computational assessment of putative bio-behavioral markers for both prognosis and diagnosis.
DPARSF: A MATLAB Toolbox for "Pipeline" Data Analysis of Resting-State fMRI.
Chao-Gan, Yan; Yu-Feng, Zang
2010-01-01
Resting-state functional magnetic resonance imaging (fMRI) has attracted increasing attention because of its effectiveness, simplicity and non-invasiveness in exploring the intrinsic functional architecture of the human brain. However, a user-friendly toolbox for "pipeline" data analysis of resting-state fMRI has been lacking. Based on functions in Statistical Parametric Mapping (SPM) and the Resting-State fMRI Data Analysis Toolkit (REST), we have developed a MATLAB toolbox called Data Processing Assistant for Resting-State fMRI (DPARSF) for "pipeline" data analysis of resting-state fMRI. After the user arranges the Digital Imaging and Communications in Medicine (DICOM) files and clicks a few buttons to set parameters, DPARSF produces all the preprocessed (slice-timing corrected, realigned, normalized, smoothed) data and results for functional connectivity, regional homogeneity, amplitude of low-frequency fluctuation (ALFF), and fractional ALFF. DPARSF can also create a report for excluding subjects with excessive head motion and generate a set of pictures for easily checking the effect of normalization. In addition, users can use DPARSF to extract time courses from regions of interest.
Almén, Anja; Båth, Magnus
2016-06-01
The overall aim of the present work was to develop a conceptual framework for managing radiation dose in diagnostic radiology with the intention of supporting optimisation. An optimisation process was first derived. The framework for managing radiation dose, based on the derived optimisation process, was then outlined. The optimisation process starts from four stages: providing equipment, establishing methodology, performing examinations and ensuring quality. It comprises a series of activities and actions at these stages. The current system of diagnostic reference levels is an activity in the last stage, ensuring quality; the system is thus a reactive activity that engages the core activity of the radiology department, performing examinations, only to a certain extent. Three reference dose levels (possible, expected and established) were assigned to the first three stages of the optimisation process, excluding ensuring quality. A reasonably achievable dose range is also derived, indicating an acceptable deviation from the established dose level; a reasonable radiation dose for a single patient lies within this range. The suggested framework for managing radiation dose should be regarded as one part of the optimisation process, which comprises a variety of complementary activities. This emphasises the need for a holistic approach that integrates the optimisation process into different clinical activities.
Gordon, G T; McCann, B P
2015-01-01
This paper describes the basis of a stakeholder-based sustainable optimisation indicator (SOI) system to be developed for small-to-medium sized activated sludge (AS) wastewater treatment plants (WwTPs) in the Republic of Ireland (ROI). Key technical publications relating to best practice plant operation, performance audits and optimisation, and indicator and benchmarking systems for wastewater services are identified. Optimisation studies were developed at a number of Irish AS WwTPs and key findings are presented. A national AS WwTP manager/operator survey was carried out to verify the applied operational findings and identify the key operator stakeholder requirements for this proposed SOI system. It was found that most plants require more consistent operational data-based decision-making, monitoring and communication structures to facilitate optimised, sustainable and continuous performance improvement. The applied optimisation and stakeholder consultation phases form the basis of the proposed stakeholder-based SOI system. This system will allow for continuous monitoring and rating of plant performance, facilitate optimised operation and encourage the prioritisation of performance improvement through tracking key operational metrics. Plant optimisation has become a major focus due to the transfer of all ROI water services to a national water utility from individual local authorities and the implementation of the EU Water Framework Directive.
Parasite vulnerability to climate change: an evidence-based functional trait approach
Cizauskas, Carrie A.; Clements, Chris F.; Dougherty, Eric R.; Harris, Nyeema C.; Phillips, Anna J.
2017-01-01
Despite the number of virulent pathogens that are projected to benefit from global change and to spread in the next century, we suggest that a combination of coextinction risk and climate sensitivity could make parasites at least as extinction prone as any other trophic group. However, the existing interdisciplinary toolbox for identifying species threatened by climate change is inadequate or inappropriate when considering parasites as conservation targets. A functional trait approach can be used to connect parasites' ecological role to their risk of disappearance, but this is complicated by the taxonomic and functional diversity of many parasite clades. Here, we propose biological traits that may render parasite species particularly vulnerable to extinction (including high host specificity, complex life cycles and narrow climatic tolerance), and identify critical gaps in our knowledge of parasite biology and ecology. By doing so, we provide criteria to identify vulnerable parasite species and triage parasite conservation efforts. PMID:28280551
BurnMan: Towards a multidisciplinary toolkit for reproducible deep Earth science
NASA Astrophysics Data System (ADS)
Myhill, R.; Cottaar, S.; Heister, T.; Rose, I.; Unterborn, C. T.; Dannberg, J.; Martin-Short, R.
2016-12-01
BurnMan (www.burnman.org) is an open-source toolbox to compute thermodynamic and thermoelastic properties as a function of pressure and temperature using published mineral physics parameters and equations of state. The framework is user-friendly, written in Python, and modular, allowing the user to implement their own equations of state, endmember and solution model libraries, geotherms, and averaging schemes. Here we introduce various new modules, which can be used to: fit thermodynamic variables to data from high-pressure static and shock-wave experiments; calculate equilibrium assemblages for a given bulk composition, pressure and temperature; calculate chemical potentials and oxygen fugacities for given assemblages; compute 3D synthetic seismic models using output from geodynamic models and compare these results with global seismic tomographic models; and create input files for synthetic seismogram codes. Users can contribute scripts that reproduce the results from peer-reviewed articles and practical demonstrations (e.g. Cottaar et al., 2014).
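As an illustration of the equation-of-state calculations such a toolbox performs, here is a sketch of the third-order Birch-Murnaghan equation of state; the K0 = 160 GPa, K0' = 4 values in the usage below are illustrative MgO-like numbers, not taken from the abstract:

```python
def birch_murnaghan_pressure(V, V0, K0, K0p):
    # Third-order Birch-Murnaghan equation of state: pressure as a function
    # of volume V, reference volume V0, bulk modulus K0 and its pressure
    # derivative K0p. P is returned in the units of K0.
    f = (V0 / V) ** (2.0 / 3.0)           # (V0/V)^(2/3)
    return 1.5 * K0 * (f ** 3.5 - f ** 2.5) * (1.0 + 0.75 * (K0p - 4.0) * (f - 1.0))
```

At V = V0 the pressure is zero by construction, and compressing the volume raises the pressure monotonically, which is the qualitative behaviour any equation-of-state module must reproduce.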
Classification With Truncated Distance Kernel.
Huang, Xiaolin; Suykens, Johan A K; Wang, Shuning; Hornegger, Joachim; Maier, Andreas
2018-05-01
This brief proposes a truncated distance (TL1) kernel, which results in a classifier that is nonlinear in the global region but linear in each subregion. With this kernel, the subregion structure can be trained using all the training data and local linear classifiers can be established simultaneously. The TL1 kernel adapts well to nonlinearity and is suitable for problems that require different nonlinearities in different areas. Although the TL1 kernel is not positive semidefinite, some classical kernel learning methods are still applicable, which means that the TL1 kernel can be used directly in standard toolboxes by replacing the kernel evaluation. In numerical experiments, the TL1 kernel with a pre-given parameter achieves similar or better performance than the radial basis function kernel with its parameter tuned by cross-validation, implying that the TL1 kernel is a promising nonlinear kernel for classification tasks.
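The TL1 kernel is commonly written as k(u, v) = max(rho - ||u - v||_1, 0); a minimal sketch under that assumption, with a Gram-matrix helper of the kind a standard toolbox would consume as a precomputed kernel:

```python
def tl1_kernel(u, v, rho):
    # Truncated L1-distance kernel: max(rho - ||u - v||_1, 0).
    # Piecewise linear in u - v, and exactly zero once the L1 distance
    # exceeds rho, which yields the locally linear behaviour described above.
    d = sum(abs(a - b) for a, b in zip(u, v))
    return max(rho - d, 0.0)

def gram(X, rho):
    # Kernel (Gram) matrix over a dataset X; the "replace the kernel
    # evaluation" route for using TL1 in standard kernel toolboxes.
    return [[tl1_kernel(x, y, rho) for y in X] for x in X]
```

Because the kernel vanishes beyond distance rho, each training point only influences its neighbourhood, which is the mechanism behind the subregion-local linear classifiers.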
Kassem, Abdulsalam M; Ibrahim, Hany M; Samy, Ahmed M
2017-05-01
The objective of this study was to develop and optimise a self-nanoemulsifying drug delivery system (SNEDDS) of atorvastatin calcium (ATC) for improving the dissolution rate and, eventually, oral bioavailability. Ternary phase diagrams were constructed on the basis of solubility and emulsification studies. The composition of ATC-SNEDDS was optimised using a Box-Behnken optimisation design. The optimised ATC-SNEDDS was characterised for various physicochemical properties. Pharmacokinetic, pharmacodynamic and histological studies were performed in rats. The optimised ATC-SNEDDS resulted in a droplet size of 5.66 nm, a zeta potential of -19.52 mV and a t90 of 5.43 min, and completely released ATC within 30 min irrespective of the pH of the medium. The area under the curve of optimised ATC-SNEDDS in rats was 2.34-fold higher than that of ATC suspension. Pharmacodynamic studies revealed a significant reduction in serum lipids of rats with fatty liver. Photomicrographs showed improvement in hepatocyte structure. In this study, we confirmed that ATC-SNEDDS would be a promising approach for improving the oral bioavailability of ATC.
Li, Jinyan; Fong, Simon; Wong, Raymond K; Millham, Richard; Wong, Kelvin K L
2017-06-28
Due to the high-dimensional characteristics of such datasets, we propose a new method based on the Wolf Search Algorithm (WSA) for optimising the feature selection problem. The proposed approach uses the natural strategy established by Charles Darwin, namely that 'it is not the strongest of the species that survives, but the most adaptable'. This means that, in the evolution of a swarm, the elitists are motivated to quickly obtain more and better resources. The memory function helps the proposed method avoid repeat searches of the worst positions in order to enhance the effectiveness of the search, while the binary strategy simplifies the feature selection problem into a similar problem of function optimisation. Furthermore, the wrapper strategy couples these strengthened wolves with an extreme learning machine classifier to find a sub-dataset with a reasonable number of features that offers the maximum correctness of global classification models. Experimental results on the six public high-dimensional bioinformatics datasets tested demonstrate that the proposed method can outperform some conventional feature selection methods by up to 29% in classification accuracy, and outperform previous WSAs by up to 99.81% in computational time.
Use of game-theoretical methods in biochemistry and biophysics.
Schuster, Stefan; Kreft, Jan-Ulrich; Schroeter, Anja; Pfeiffer, Thomas
2008-04-01
Evolutionary game theory can be considered as an extension of the theory of evolutionary optimisation in that two or more organisms (or more generally, units of replication) tend to optimise their properties in an interdependent way. Thus, the outcome of the strategy adopted by one species (e.g., as a result of mutation and selection) depends on the strategy adopted by the other species. In this review, the use of evolutionary game theory for analysing biochemical and biophysical systems is discussed. The presentation is illustrated by a number of instructive examples such as the competition between microorganisms using different metabolic pathways for adenosine triphosphate production, the secretion of extracellular enzymes, the growth of trees and photosynthesis. These examples show that, due to conflicts of interest, the global optimum (in the sense of being the best solution for the whole system) is not always obtained. For example, some yeast species use metabolic pathways that waste nutrients, and in a dense tree canopy, trees grow taller than would be optimal for biomass productivity. From the viewpoint of game theory, the examples considered can be described by the Prisoner's Dilemma, snowdrift game, Tragedy of the Commons and rock-scissors-paper game.
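The Prisoner's Dilemma mentioned above can be made concrete with a small sketch; the payoff values T > R > P > S below are the standard illustrative choice, not numbers taken from the review:

```python
# Payoffs for the symmetric Prisoner's Dilemma: C = cooperate, D = defect.
# Standard ordering T > R > P > S (temptation, reward, punishment, sucker).
R, S, T, P = 3, 0, 5, 1
payoff = {('C', 'C'): (R, R), ('C', 'D'): (S, T),
          ('D', 'C'): (T, S), ('D', 'D'): (P, P)}

def best_response(opponent):
    # Row player's best reply; by symmetry the same for the column player.
    return max('CD', key=lambda a: payoff[(a, opponent)][0])

def is_nash(a, b):
    # A strategy pair is a Nash equilibrium when each move is a best
    # response to the other.
    return best_response(b) == a and best_response(a) == b
```

Mutual defection is the unique equilibrium even though mutual cooperation yields a higher total payoff, which is exactly the gap between individual and global optimality that the review highlights for wasteful metabolic strategies.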
Optimisation of 12 MeV electron beam simulation using variance reduction technique
NASA Astrophysics Data System (ADS)
Jayamani, J.; Termizi, N. A. S. Mohd; Kamarulzaman, F. N. Mohd; Aziz, M. Z. Abdul
2017-05-01
Monte Carlo (MC) simulation for electron beam radiotherapy requires long computation times. A variance reduction technique (VRT) was implemented in the MC code to shorten them. This work focused on optimising the VRT parameters, namely electron range rejection and the number of particle histories. The EGSnrc MC source code was used to simulate (BEAMnrc code) and validate (DOSXYZnrc code) the Siemens Primus linear accelerator model without VRT. The validated MC model was then re-simulated applying range rejection, controlled by a global electron cut-off energy of 1, 2 and 5 MeV, using 20 x 10^7 particle histories. The 5 MeV range rejection generated the fastest MC simulation, with a 50% reduction in computation time compared to the non-VRT simulation. The 5 MeV range rejection was therefore used in the particle-history analysis, which ranged from 7.5 x 10^7 to 20 x 10^7 histories. With the 5 MeV electron cut-off and 10 x 10^7 particle histories, the simulation was four times faster than the non-VRT calculation with only 1% deviation. Proper understanding and use of VRT can significantly reduce MC electron beam calculation time while preserving accuracy.
A Toolbox of Metrology-Based Techniques for Optical System Alignment
NASA Technical Reports Server (NTRS)
Coulter, Phillip; Ohl, Raymond G.; Blake, Peter N.; Bos, Brent J.; Casto, Gordon V.; Eichhorn, William L.; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hagopian, John G.; Hayden, Joseph E.;
2016-01-01
The NASA Goddard Space Flight Center (GSFC) and its partners have broad experience in the alignment of flight optical instruments and spacecraft structures. Over decades, GSFC developed alignment capabilities and techniques for a variety of optical and aerospace applications. In this paper, we provide an overview of a subset of the capabilities and techniques used on several recent projects in a toolbox format. We discuss a range of applications, from small-scale optical alignment of sensors to mirror and bench examples that make use of various large-volume metrology techniques. We also discuss instruments and analytical tools.
Getting a grip on glycans: A current overview of the metabolic oligosaccharide engineering toolbox.
Sminia, Tjerk J; Zuilhof, Han; Wennekes, Tom
2016-11-29
This review discusses the advances in metabolic oligosaccharide engineering (MOE) from 2010 to 2016 with a focus on the structure, preparation, and reactivity of its chemical probes. A brief historical overview of MOE is followed by a comprehensive overview of the chemical probes currently available in the MOE molecular toolbox and the bioconjugation techniques they enable. The final part of the review focuses on the synthesis of a selection of probes and finishes with an outlook on recent and potential upcoming advances in the field of MOE.
Improving Cognitive Skills of the Industrial Robot
NASA Astrophysics Data System (ADS)
Bezák, Pavol
2015-08-01
At present, there are plenty of industrial robots that are programmed to perform the same repetitive task all the time. Industrial robots doing such jobs are not able to judge whether an action is correct, effective or good. Object detection, manipulation and grasping are challenging due to uncertainties in hand and object modelling, unknown contact types and object stiffness properties. In this paper, a model for intelligent object detection and grasping with a humanoid hand is proposed, assuming that the object properties are known. The control is simulated in Matlab using Simulink/SimMechanics, the Neural Network Toolbox and the Computer Vision System Toolbox.
ACCEPT: Introduction of the Adverse Condition and Critical Event Prediction Toolbox
NASA Technical Reports Server (NTRS)
Martin, Rodney A.; Santanu, Das; Janakiraman, Vijay Manikandan; Hosein, Stefan
2015-01-01
The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we introduce a generic framework developed in MATLAB® called ACCEPT (Adverse Condition and Critical Event Prediction Toolbox). ACCEPT is an architectural framework designed to compare and contrast the performance of a variety of machine learning and early warning algorithms, and it tests the capability of these algorithms to robustly predict the onset of adverse events in any time-series data-generating system or process.
GPELab, a Matlab toolbox to solve Gross-Pitaevskii equations II: Dynamics and stochastic simulations
NASA Astrophysics Data System (ADS)
Antoine, Xavier; Duboscq, Romain
2015-08-01
GPELab is a free Matlab toolbox for modeling and numerically solving large classes of systems of Gross-Pitaevskii equations that arise in the physics of Bose-Einstein condensates. The aim of this second paper, which follows (Antoine and Duboscq, 2014), is to first present the various pseudospectral schemes available in GPELab for computing the deterministic and stochastic nonlinear dynamics of Gross-Pitaevskii equations (Antoine, et al., 2013). Next, the corresponding GPELab functions are explained in detail. Finally, some numerical examples are provided to show how the code works for the complex dynamics of BEC problems.
Kinematic simulation and analysis of robot based on MATLAB
NASA Astrophysics Data System (ADS)
Liao, Shuhua; Li, Jiong
2018-03-01
The history of industrial automation is characterised by rapidly changing technology; the industrial robot, however, is undoubtedly a special kind of equipment. With the help of MATLAB's matrix and plotting capabilities, each link coordinate system is established in the MATLAB environment using the Denavit-Hartenberg (D-H) parameter method, together with the equations of motion of the structure. The Robotics Toolbox and GUIDE are applied jointly to inverse kinematics analysis, path planning and simulation, providing a preliminary solution to the positioning problem of a student-built robotic arm.
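The D-H construction described above can be sketched outside MATLAB as well. The following minimal pure-Python sketch (function names are ours) composes standard Denavit-Hartenberg link transforms to obtain the end-effector pose:

```python
import math

def dh_matrix(theta, d, a, alpha):
    # Standard Denavit-Hartenberg link transform:
    # Rot_z(theta) * Trans_z(d) * Trans_x(a) * Rot_x(alpha).
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [[ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0]]

def matmul(A, B):
    # 4x4 homogeneous-transform multiplication.
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_rows):
    # dh_rows: one (theta, d, a, alpha) tuple per link.
    # The translation column T[0][3], T[1][3], T[2][3] is the end-effector position.
    T = [[float(i == j) for j in range(4)] for i in range(4)]
    for row in dh_rows:
        T = matmul(T, dh_matrix(*row))
    return T
```

For a planar two-link arm with unit link lengths this reproduces the textbook position (a1*cos(t1) + a2*cos(t1 + t2), a1*sin(t1) + a2*sin(t1 + t2)).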
Application of Three Existing Stope Boundary Optimisation Methods in an Operating Underground Mine
NASA Astrophysics Data System (ADS)
Erdogan, Gamze; Yavuz, Mahmut
2017-12-01
The underground mine planning and design optimisation process has received little attention because of the complexity and variability of problems in underground mines. Although a number of optimisation studies and software tools are available, and some of them in particular have been implemented effectively to determine the ultimate pit limits of open-pit mines, there is still a lack of studies on the optimisation of ultimate stope boundaries in underground mines. The proposed approaches for this purpose aim at maximizing the economic profit by selecting the best possible layout under operational, technical and physical constraints. In this paper, three existing heuristic techniques, the Floating Stope Algorithm, the Maximum Value Algorithm and the Mineable Shape Optimiser (MSO), are examined for the optimisation of stope layout in a case study. Each technique is assessed in terms of applicability, algorithm capabilities and limitations, considering the underground mine planning challenges. Finally, the results are evaluated and compared.
Design Optimisation of a Magnetic Field Based Soft Tactile Sensor
Raske, Nicholas; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Culmer, Peter; Hewson, Robert
2017-01-01
This paper investigates the design optimisation of a magnetic field based soft tactile sensor, comprised of a magnet and a Hall effect module separated by an elastomer. The aim was to minimise the sensitivity of the output force with respect to the input magnetic field; this was achieved by varying the geometry and material properties. Finite element simulations determined the magnetic field and the structural behaviour under load. Genetic programming produced phenomenological expressions describing these responses. Optimisation studies constrained by a measurable force and stable loading conditions were conducted; these produced Pareto sets of designs from which the optimal sensor characteristics were selected. The optimisation demonstrated a compromise between sensitivity and the measurable force, and a fabricated version of the optimised sensor validated the improvements made using this methodology. The approach presented can be applied in general for optimising soft tactile sensor designs over a range of applications and sensing modes. PMID:29099787
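The Pareto sets mentioned above are obtained by non-dominated filtering of candidate designs. A minimal sketch follows; the objective values in the usage below are illustrative numbers (both objectives to be minimised), not the paper's actual design data:

```python
def pareto_front(designs):
    # Return the non-dominated designs. Each design is a tuple of objective
    # values, all of which are to be minimised.
    def dominates(a, b):
        # a dominates b when it is no worse in every objective and strictly
        # better in at least one.
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [d for d in designs if not any(dominates(o, d) for o in designs if o != d)]
```

The surviving set is exactly the trade-off curve from which a designer then picks one point, mirroring how the optimal sensor characteristics were selected from the Pareto sets.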
Robles, A; Ruano, M V; Ribes, J; Seco, A; Ferrer, J
2014-04-01
The results of a global sensitivity analysis of a filtration model for submerged anaerobic MBRs (AnMBRs) are assessed in this paper. This study aimed to (1) identify the less- (or non-) influential factors of the model in order to facilitate model calibration and (2) validate the modelling approach (i.e. to determine the need for each of the proposed factors to be included in the model). The sensitivity analysis was conducted using a revised version of the Morris screening method. The dynamic simulations were conducted using long-term data obtained from an AnMBR plant fitted with industrial-scale hollow-fibre membranes. Of the 14 factors in the model, six were identified as influential, i.e. those calibrated using off-line protocols. A dynamic calibration (based on optimisation algorithms) of these influential factors was conducted. The resulting estimated model factors accurately predicted membrane performance. Copyright © 2014 Elsevier Ltd. All rights reserved.
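The screening idea behind the Morris method can be sketched as a one-at-a-time elementary-effects computation. This is a simplified radial design, not the revised Morris method used in the study, and the toy model is an assumption for illustration only.

```python
# Sketch of the elementary-effects idea underlying Morris screening:
# perturb one factor at a time and average the absolute effect on the output.
import random

def elementary_effects(f, n_factors, n_trajectories=50, delta=0.25, seed=1):
    """Mean absolute elementary effect per factor (mu*), a screening measure."""
    rng = random.Random(seed)
    mu_star = [0.0] * n_factors
    for _ in range(n_trajectories):
        # random base point, kept inside [0, 1 - delta] so x + delta stays valid
        x = [rng.random() * (1 - delta) for _ in range(n_factors)]
        base = f(x)
        for i in range(n_factors):
            xp = list(x)
            xp[i] += delta
            mu_star[i] += abs(f(xp) - base) / delta
    return [m / n_trajectories for m in mu_star]

# Toy model (an assumption): factor 0 is influential, factor 2 is inert.
model = lambda x: 10 * x[0] + 2 * x[1] + 0 * x[2]
mu = elementary_effects(model, 3)
print(mu)  # factor 0 has the largest mu*, factor 2 is ~0
```

Factors with mu* near zero would be fixed at nominal values, as done for the eight non-influential factors of the filtration model.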
NASA Astrophysics Data System (ADS)
Ozkaya, Sait I.
2018-03-01
Fracture corridors are interconnected large fractures in a narrow, sub-vertical tabular array, which usually traverse the entire reservoir vertically and extend for several hundred meters laterally. Fracture corridors, with their huge conductivities, constitute an important element of many fractured reservoirs. Unlike small diffuse fractures, actual fracture corridors must be mapped deterministically for simulation or field development purposes. Fracture corridors can be identified and quantified definitively with borehole image logs and well testing. However, there are rarely sufficient image logs or well tests, and it is necessary to utilize various fracture corridor indicators with varying degrees of reliability. Integration of data from many different sources, in turn, requires a platform with powerful editing and layering capability. Available commercial reservoir characterization software packages with layering and editing capabilities can be cost intensive. CAD packages are far more affordable and can easily acquire the versatility and power of commercial software packages with the addition of a small software toolbox. The objective of this communication is to present FRACOR, a software toolbox that enables deterministic 2D fracture corridor mapping and modeling on the AutoCAD platform. The FRACOR toolbox is written in AutoLISP and contains several independent routines to import and integrate available fracture corridor data from an oil field and export the results as text files. The resulting fracture corridor maps consist mainly of fracture corridors with different confidence levels, derived from a combination of static and dynamic data, and exclusion zones where no fracture corridor can exist. The exported text file of fracture corridors from FRACOR can be imported into upscaling programs to generate a fracture grid for dual-porosity simulation, or used for field development and well planning.
MRI Atlas-Based Measurement of Spinal Cord Injury Predicts Outcome in Acute Flaccid Myelitis.
McCoy, D B; Talbott, J F; Wilson, Michael; Mamlouk, M D; Cohen-Adad, J; Wilson, Mark; Narvid, J
2017-02-01
Recent advances in spinal cord imaging analysis have led to the development of a robust anatomic template and atlas incorporated into an open-source platform referred to as the Spinal Cord Toolbox. Using the Spinal Cord Toolbox, we sought to correlate measures of GM, WM, and cross-sectional area pathology on T2 MR imaging with motor disability in patients with acute flaccid myelitis. Spinal cord imaging for 9 patients with acute flaccid myelitis was analyzed using the Spinal Cord Toolbox. A semiautomated pipeline using the Spinal Cord Toolbox measured lesion involvement in GM, WM, and total spinal cord cross-sectional area. Proportions of GM, WM, and cross-sectional area affected by T2 hyperintensity were calculated across 3 ROIs: 1) the center axial section of the lesion; 2) the full lesion segment; and 3) the full cord atlas volume. Spearman rank-order correlation was calculated to compare MR metrics with clinical measures of disability. The proportion of GM affected at the center axial section correlated significantly with measures of motor impairment on admission (r[9] = -0.78; P = .014) and at 3-month follow-up (r[9] = -0.66; P = .05). Further, the proportion of GM affected across the full lesion segment correlated significantly with initial motor impairment (r[9] = -0.74; P = .024). No significant correlation was found for the proportion of WM or the proportion of cross-sectional area with clinical disability. Atlas-based measures of the proportion of GM T2 signal abnormality, measured on a single axial MR imaging section and across the full lesion segment, correlate with motor impairment and outcome in patients with acute flaccid myelitis. This is the first atlas-based study to correlate clinical outcomes with segmented measures of T2 signal abnormality in the spinal cord. © 2017 by American Journal of Neuroradiology.
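As an illustration of the statistic used here, Spearman rank-order correlation can be sketched in a self-contained form. The GM-proportion and motor-score values below are invented for demonstration, not patient data.

```python
# Sketch: Spearman rank-order correlation (Pearson correlation on ranks),
# as used to relate the proportion of GM T2 abnormality to motor scores.

def rank(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

gm_proportion = [0.1, 0.4, 0.2, 0.8, 0.6]  # hypothetical lesion GM proportions
motor_score   = [4.5, 3.0, 4.0, 1.0, 2.0]  # hypothetical strength scores
print(round(spearman(gm_proportion, motor_score), 3))  # → -1.0 (perfectly inverse)
```

A negative coefficient, as in the study, indicates that greater GM involvement goes with lower motor scores.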
Introduction to TAFI - A Matlab® toolbox for analysis of flexural isostasy
NASA Astrophysics Data System (ADS)
Jha, S.; Harry, D. L.; Schutt, D.
2016-12-01
The isostatic response to vertical tectonic loads emplaced on a thin elastic plate overlying an inviscid substrate, and the corresponding gravity anomalies, are commonly modeled using well-established theories and methodologies of flexural analysis. However, such analysis requires some mathematical and coding expertise on the part of users. With that in mind, we designed a new interactive Matlab® toolbox called the Toolbox for Analysis of Flexural Isostasy (TAFI). TAFI allows users to create forward models (2-D and 3-D) of flexural deformation of the lithosphere and the resulting gravity anomaly. TAFI computes Green's functions for flexure of an elastic plate subjected to point or line loads, and analytical solutions for harmonic loads. Flexure due to non-impulsive, distributed 2-D or 3-D loads is computed by convolving the appropriate Green's function with a user-supplied, spatially discretized load function. The gravity anomaly associated with each density interface is calculated by taking the Fourier transform of the flexural deflection of these interfaces and estimating the gravity in the wavenumber domain. All models created in TAFI are based on Matlab's intrinsic functions and do not require any specialized toolbox, function or library except those distributed with TAFI. Modeling functions within TAFI can be called from the Matlab workspace, from within user-written programs, or from TAFI's graphical user interface (GUI). The GUI enables the user to model the flexural deflection of the lithosphere interactively, enabling real-time comparison of the model fit with observed data constraining the flexural deformation and gravity, and facilitating a rapid search for the best-fitting flexural model. TAFI is a very useful teaching and research tool and has been tested rigorously in graduate-level teaching and basic research environments.
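The Green's-function convolution that TAFI performs can be sketched for the 2-D (line-load) case as follows. The plate parameters, densities and load values are illustrative assumptions, not TAFI defaults, and the standard line-load solution for an infinite elastic plate stands in for TAFI's own implementation.

```python
# Sketch: deflection of a thin elastic plate under a distributed load, computed
# by convolving the line-load Green's function with a discretised load.
import math

E, nu, g = 70e9, 0.25, 9.81    # Young's modulus (Pa), Poisson's ratio, gravity
Te = 20e3                      # elastic thickness (m), an assumed value
drho = 3300 - 2700             # mantle minus infill density contrast (kg/m^3)
D = E * Te**3 / (12 * (1 - nu**2))    # flexural rigidity
alpha = (4 * D / (drho * g)) ** 0.25  # flexural parameter

def greens(x):
    """Deflection at x due to a unit line load at the origin (infinite plate)."""
    u = abs(x) / alpha
    return alpha**3 / (8 * D) * math.exp(-u) * (math.cos(u) + math.sin(u))

# Discretised load (line-load magnitude in N/m at each node) and node spacing
dx = 5e3
load = [0, 0, 1e9, 1e9, 1e9, 0, 0]  # a 15-km-wide block load (assumed values)
w = [sum(load[j] * greens((i - j) * dx) for j in range(len(load)))
     for i in range(len(load))]
print(max(w))  # peak deflection (m), under the load centre
```

The deflection profile is symmetric about the load and peaks beneath its centre, as expected for a flexural response.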
Cercenelli, Laura; Tiberi, Guido; Corazza, Ivan; Giannaccare, Giuseppe; Fresina, Michela; Marcelli, Emanuela
2017-01-01
Many open source software packages have been recently developed to expand the usability of eye tracking systems to study oculomotor behavior, but none of these is specifically designed to encompass all the main functions required for creating eye tracking tests and for providing the automatic analysis of saccadic eye movements. The aim of this study is to introduce SacLab, an intuitive, freely available MATLAB toolbox based on Graphical User Interfaces (GUIs) that we have developed to increase the usability of the ViewPoint EyeTracker (Arrington Research, Scottsdale, AZ, USA) in clinical ophthalmology practice. SacLab consists of four processing modules that enable the user to easily create visual stimuli tests (Test Designer), record saccadic eye movements (Data Recorder), analyze the recorded data to automatically extract saccadic parameters of clinical interest (Data Analyzer), and provide an aggregate analysis from multiple eye movement recordings (Saccade Analyzer), without requiring any programming effort by the user. A demo application of SacLab to carry out eye tracking tests for the analysis of horizontal saccades is reported. We tested the usability of the SacLab toolbox with three ophthalmologists who had no programming experience; the ophthalmologists were briefly trained in the use of the SacLab GUIs and were asked to perform the demo application. The toolbox received enthusiastic feedback from all the clinicians in terms of intuitiveness, ease of use and flexibility. Test creation and data processing were accomplished in 52 ± 21 s and 46 ± 19 s, respectively, using the SacLab GUIs. SacLab may represent a useful tool to ease the application of the ViewPoint EyeTracker system in routine clinical ophthalmology. Copyright © 2016 Elsevier Ltd. All rights reserved.
The conservation physiology toolbox: status and opportunities
Love, Oliver P; Hultine, Kevin R
2018-01-01
For over a century, physiological tools and techniques have been allowing researchers to characterize how organisms respond to changes in their natural environment and how they interact with human activities or infrastructure. Over time, many of these techniques have become part of the conservation physiology toolbox, which is used to monitor, predict, conserve, and restore plant and animal populations under threat. Here, we provide a summary of the tools that currently comprise the conservation physiology toolbox. By assessing patterns in articles that have been published in ‘Conservation Physiology’ over the past 5 years that focus on introducing, refining and validating tools, we provide an overview of where researchers are placing emphasis in terms of taxa and physiological sub-disciplines. Although there is certainly diversity across the toolbox, metrics of stress physiology (particularly glucocorticoids) and studies focusing on mammals have garnered the greatest attention, with both comprising the majority of publications (>45%). We also summarize the types of validations that are actively being completed, including those related to logistics (sample collection, storage and processing), interpretation of variation in physiological traits and relevance for conservation science. Finally, we provide recommendations for future tool refinement, with suggestions for: (i) improving our understanding of the applicability of glucocorticoid physiology; (ii) linking multiple physiological and non-physiological tools; (iii) establishing a framework for plant conservation physiology; (iv) assessing links between environmental disturbance, physiology and fitness; (v) appreciating opportunities for validations in under-represented taxa; and (vi) emphasizing tool validation as a core component of research programmes. 
Overall, we are confident that conservation physiology will continue to increase its applicability to more taxa, develop more non-invasive techniques, delineate where limitations exist, and identify the contexts necessary for interpretation in captivity and the wild. PMID:29942517
MOtoNMS: A MATLAB toolbox to process motion data for neuromusculoskeletal modeling and simulation.
Mantoan, Alice; Pizzolato, Claudio; Sartori, Massimo; Sawacha, Zimi; Cobelli, Claudio; Reggiani, Monica
2015-01-01
Neuromusculoskeletal modeling and simulation enable investigation of the neuromusculoskeletal system and its role in human movement dynamics. These methods are progressively being introduced into daily clinical practice. However, a major factor limiting this translation is the lack of robust tools for the pre-processing of experimental movement data for use in neuromusculoskeletal modeling software. This paper presents MOtoNMS (matlab MOtion data elaboration TOolbox for NeuroMusculoSkeletal applications), a toolbox freely available to the community that aims to fill this gap. MOtoNMS processes experimental data from different motion analysis devices and generates input data for neuromusculoskeletal modeling and simulation software, such as OpenSim and CEINMS (Calibrated EMG-Informed NMS Modelling Toolbox). MOtoNMS implements commonly required processing steps, and its generic architecture simplifies the integration of new user-defined processing components. MOtoNMS allows users to set up their laboratory configurations and processing procedures through user-friendly graphical interfaces, without requiring advanced computer skills. Finally, configuration choices can be stored, enabling the full reproduction of the processing steps. MOtoNMS is released under the GNU General Public License and is available at the SimTK website and from the GitHub repository. Motion data collected at four institutions demonstrate that, despite differences in laboratory instrumentation and procedures, MOtoNMS succeeds in processing data and producing consistent inputs for OpenSim and CEINMS. MOtoNMS fills the gap between motion analysis and neuromusculoskeletal modeling and simulation. Its support for several devices, complete implementation of the pre-processing procedures, simple extensibility, available user interfaces, and free availability can boost the translation of neuromusculoskeletal methods into daily clinical practice.
morphforge: a toolbox for simulating small networks of biologically detailed neurons in Python
Hull, Michael J.; Willshaw, David J.
2014-01-01
The broad structure of a modeling study can often be explained over a cup of coffee, but converting this high-level conceptual idea into graphs of the final simulation results may require many weeks of sitting at a computer. Although models themselves can be complex, often many mental resources are wasted working around complexities of the software ecosystem such as fighting to manage files, interfacing between tools and data formats, finding mistakes in code or working out the units of variables. morphforge is a high-level, Python toolbox for building and managing simulations of small populations of multicompartmental biophysical model neurons. An entire in silico experiment, including the definition of neuronal morphologies, channel descriptions, stimuli, visualization and analysis of results can be written within a single short Python script using high-level objects. Multiple independent simulations can be created and run from a single script, allowing parameter spaces to be investigated. Consideration has been given to the reuse of both algorithmic and parameterizable components to allow both specific and stochastic parameter variations. Some other features of the toolbox include: the automatic generation of human-readable documentation (e.g., PDF files) about a simulation; the transparent handling of different biophysical units; a novel mechanism for plotting simulation results based on a system of tags; and an architecture that supports both the use of established formats for defining channels and synapses (e.g., MODL files), and the possibility to support other libraries and standards easily. We hope that this toolbox will allow scientists to quickly build simulations of multicompartmental model neurons for research and serve as a platform for further tool development. PMID:24478690
Local soil quality assessment of north-central Namibia: integrating farmers' and technical knowledge
NASA Astrophysics Data System (ADS)
Prudat, Brice; Bloemertz, Lena; Kuhn, Nikolaus J.
2018-02-01
Soil degradation is a major threat for farmers of semi-arid north-central Namibia. Soil conservation practices can be promoted by the development of soil quality (SQ) evaluation toolboxes that provide ways to evaluate soil degradation. However, such toolboxes must be adapted to local conditions to reach farmers. Based on qualitative (interviews and soil descriptions) and quantitative (laboratory analyses) data, we developed a set of SQ indicators relevant for our study area that integrates farmers' field experiences (FFEs) and technical knowledge. We suggest using participatory mapping to delineate soil units (Oshikwanyama soil units, KwSUs) based on FFEs, which highlight mostly soil properties that integrate long-term productivity and soil hydrological characteristics (i.e. internal SQ). The actual SQ evaluation of a location depends on the KwSU described and is thereafter assessed by field soil texture (i.e. chemical fertility potential) and by soil colour shade (i.e. SOC status). This three-level information aims to reveal SQ improvement potential by comparing, for any location, (a) estimated clay content against median clay content (specific to the KwSU) and (b) soil organic status against calculated optimal values (depending on clay content). The combination of farmers' and technical assessments cumulates the advantages of both systems of knowledge, namely the integrated long-term knowledge of the farmers and a short- and medium-term SQ status assessment. The toolbox is a suggestion for evaluating SQ and aims to help farmers, rural development planners and researchers from all fields of study understand SQ issues in north-central Namibia. This suggested SQ toolbox is adapted to a restricted area of north-central Namibia, but similar tools could be developed in most areas where small-scale agriculture prevails.
Basic Radar Altimetry Toolbox: tools to teach altimetry for ocean
NASA Astrophysics Data System (ADS)
Rosmorduc, Vinca; Benveniste, Jerome; Bronner, Emilie; Niemeijer, Sander; Lucas, Bruno Manuel; Dinardo, Salvatore
2013-04-01
The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data, including data from the next mission to be launched, CryoSat. It has been available since April 2007 and has been demonstrated during training courses and scientific meetings. More than 2000 people had downloaded it as of January 2013, with many "newcomers" to altimetry among them. User feedback, developments in altimetry, and practice showed that interesting new features could be added. Some have been added and/or improved in versions 2 and 3; others are under discussion for the future, including support for the upcoming Sentinel-3 mission. The Basic Radar Altimetry Toolbox is able: - to read most distributed radar altimetry data, including data from future missions such as Saral, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways, including as an educational tool via the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data and additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, showing the basic methods for some of the most frequent ways of using altimetry data. Examples from educational use will be presented, and feedback from those who have used it as such will be most welcome. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/
Huang, Qi; Nie, Binbin; Ma, Chen; Wang, Jing; Zhang, Tianhao; Duan, Shaofeng; Wu, Shang; Liang, Shengxiang; Li, Panlong; Liu, Hua; Sun, Hua; Zhou, Jiangning; Xu, Lin; Shan, Baoci
2018-01-01
Tree shrews are proposed as an alternative animal model to nonhuman primates due to their close affinity to primates. Neuroimaging techniques are widely used to study brain functions and structures of humans and animals. However, tree shrews are rarely used in the neuroimaging field, partly due to the lack of available species-specific analysis methods. In this study, 10 PET/CT and 10 MRI images of tree shrew brain were used to construct PET and MRI templates; based on a histological atlas, we reconstructed a three-dimensional digital atlas with 628 structures delineated; the digital atlas and templates were then aligned into a stereotaxic space. Finally, we integrated the digital atlas and templates into a toolbox for tree shrew brain spatial normalization, statistical analysis and results localization. We validated the feasibility of the toolbox using simulated data with lesions in the laterodorsal thalamic nucleus (LD). The lesion volumes of simulated PET and MRI images were (12.97 ± 3.91) mm³ and (7.04 ± 0.84) mm³. Statistical results at p<0.005 showed the lesion volumes of PET and MRI were 13.18 mm³ and 8.06 mm³ in LD. To our knowledge, we report the first PET template and digital atlas of tree shrew brain. Compared to the existing MRI templates, our MRI template was aligned into stereotaxic space, and the toolbox is the first software dedicated to tree shrew brain analysis. The templates and digital atlas of tree shrew brain, as well as the toolbox, facilitate the use of tree shrews in the neuroimaging field. Copyright © 2017 Elsevier B.V. All rights reserved.
Modular-based multiscale modeling on viscoelasticity of polymer nanocomposites
NASA Astrophysics Data System (ADS)
Li, Ying; Liu, Zeliang; Jia, Zheng; Liu, Wing Kam; Aldousari, Saad M.; Hedia, Hassan S.; Asiri, Saeed A.
2017-02-01
Polymer nanocomposites have been envisioned as advanced materials for improving the mechanical performance of neat polymers used in aerospace, petrochemical, environment and energy industries. With the filler size approaching the nanoscale, composite materials tend to demonstrate remarkable thermomechanical properties, even with addition of a small amount of fillers. These observations confront the classical composite theories and are usually attributed to the high surface-area-to-volume-ratio of the fillers, which can introduce strong nanoscale interfacial effect and relevant long-range perturbation on polymer chain dynamics. Despite decades of research aimed at understanding interfacial effect and improving the mechanical performance of composite materials, it is not currently possible to accurately predict the mechanical properties of polymer nanocomposites directly from their molecular constituents. To overcome this challenge, different theoretical, experimental and computational schemes will be used to uncover the key physical mechanisms at the relevant spatial and temporal scales for predicting and tuning constitutive behaviors in silico, thereby establishing a bottom-up virtual design principle to achieve unprecedented mechanical performance of nanocomposites. A modular-based multiscale modeling approach for viscoelasticity of polymer nanocomposites has been proposed and discussed in this study, including four modules: (A) neat polymer toolbox; (B) interphase toolbox; (C) microstructural toolbox and (D) homogenization toolbox. Integrating these modules together, macroscopic viscoelasticity of polymer nanocomposites could be directly predicted from their molecular constituents. This will maximize the computational ability to design novel polymer composites with advanced performance. 
More importantly, elucidating the viscoelasticity of polymer nanocomposites through fundamental studies is a critical step to generate an integrated computational material engineering principle for discovering and manufacturing new composites with transformative impact on aerospace, automobile, petrochemical industries.
Seiger, Rene; Ganger, Sebastian; Kranz, Georg S; Hahn, Andreas; Lanzenberger, Rupert
2018-05-15
Automated cortical thickness (CT) measurements are often used to assess gray matter changes in the healthy and diseased human brain. The FreeSurfer software is frequently applied for this type of analysis. The computational anatomy toolbox (CAT12) for SPM, which offers a fast and easy-to-use alternative approach, was recently made available. In this study, we compared region of interest (ROI)-wise CT estimations of the surface-based FreeSurfer 6 (FS6) software and the volume-based CAT12 toolbox for SPM using 44 elderly healthy female control subjects (HC). In addition, these 44 HCs from the cross-sectional analysis and 34 age- and sex-matched patients with Alzheimer's disease (AD) were used to assess the potential of detecting group differences for each method. Finally, a test-retest analysis was conducted using 19 HC subjects. All data were taken from the OASIS database and MRI scans were recorded at 1.5 Tesla. A strong correlation was observed between both methods in terms of ROI mean CT estimates (R² = .83). However, CAT12 delivered significantly higher CT estimations in 32 of the 34 ROIs, indicating a systematic difference between both approaches. Furthermore, both methods were able to reliably detect atrophic brain areas in AD subjects, with the highest decreases in temporal areas. Finally, FS6 as well as CAT12 showed excellent test-retest variability scores. Although CT estimations were systematically higher for CAT12, this study provides evidence that this new toolbox delivers accurate and robust CT estimates and can be considered a fast and reliable alternative to FreeSurfer. © 2018 The Authors. Journal of Neuroimaging published by Wiley Periodicals, Inc. on behalf of American Society of Neuroimaging.
NASA Technical Reports Server (NTRS)
Carpenter, James R.; Berry, Kevin; Gregpru. Late; Speckman, Keith; Hur-Diaz, Sun; Surka, Derek; Gaylor, Dave
2010-01-01
The Orbit Determination Toolbox is an orbit determination (OD) analysis tool based on MATLAB and Java that provides a flexible way to do early mission analysis. The toolbox is primarily intended for advanced mission analysis such as might be performed in concept exploration, proposal, early design phase, or rapid design center environments. The emphasis is on flexibility, but it has enough fidelity to produce credible results. Insight into all flight dynamics source code is provided. MATLAB is the primary user interface and is used for piecing together measurement and dynamic models. The Java Astrodynamics Toolbox is used as an engine for things that might be slow or inefficient in MATLAB, such as high-fidelity trajectory propagation, lunar and planetary ephemeris look-ups, precession, nutation, polar motion calculations, ephemeris file parsing, and the like. The primary analysis functions are sequential filter/smoother and batch least-squares commands that incorporate Monte-Carlo data simulation, linear covariance analysis, measurement processing, and plotting capabilities at the generic level. These functions have a user interface that is based on that of the MATLAB ODE suite. To perform a specific analysis, users write MATLAB functions that implement truth and design system models. The user provides his or her models as inputs to the filter commands. The software provides a capability to publish and subscribe to a software bus that is compliant with the NASA Goddard Mission Services Evolution Center (GMSEC) standards, to exchange data with other flight dynamics tools to simplify the flight dynamics design cycle. Using the publish and subscribe approach allows for analysts in a rapid design center environment to seamlessly incorporate changes in spacecraft and mission design into navigation analysis and vice versa.
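The batch least-squares principle behind the toolbox's estimator can be sketched with a toy linear problem. A 1-D constant-velocity fit stands in for a real orbit-determination model; none of this reflects the actual ODTBX API.

```python
# Sketch of batch least squares: stack the measurement partials H and the
# measurements z, then solve the normal equations (H^T H) x = H^T z.
# Here the state is (x0, v) and the measurement model is z = x0 + v*t.

def batch_lsq(times, measurements):
    """Fit z = x0 + v*t via the normal equations; returns (x0, v)."""
    # H rows are [1, t]; form N = H^T H and b = H^T z element-wise
    n00 = len(times)
    n01 = sum(times)
    n11 = sum(t * t for t in times)
    b0 = sum(measurements)
    b1 = sum(t * z for t, z in zip(times, measurements))
    det = n00 * n11 - n01 * n01          # 2x2 inversion by Cramer's rule
    x0 = (n11 * b0 - n01 * b1) / det
    v = (n00 * b1 - n01 * b0) / det
    return x0, v

ts = [0.0, 1.0, 2.0, 3.0]
zs = [1.0, 3.0, 5.0, 7.0]                # noise-free z = 1 + 2t
x0, v = batch_lsq(ts, zs)
print(x0, v)  # → 1.0 2.0
```

In a real OD problem the partials come from a high-fidelity trajectory propagation, but the estimation step has the same structure.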
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apte, A; Veeraraghavan, H; Oh, J
Purpose: To present an open source and free platform to facilitate radiomics research: the “Radiomics toolbox” in CERR. Method: There is a scarcity of open source tools that support end-to-end modeling of image features to predict patient outcomes. The “Radiomics toolbox” strives to fill the need for such a software platform. The platform supports (1) import of various kinds of image modalities, like CT, PET, MR, SPECT and US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image-based features, like first-order statistics, gray-scale co-occurrence and zone-size matrix based texture features, and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and the data management are implemented in Matlab for ease of development and code readability for a wide audience. Open-source software developed in other programming languages is integrated to enhance various components of this toolbox, for example Java-based DCM4CHE for import of DICOM and R for statistical analysis. Results: The Radiomics toolbox will be distributed as open source software under the GNU license. The toolbox was prototyped for modeling an oropharyngeal PET dataset at MSKCC; the analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the “Computational Environment for Radiotherapy Research” to the “Computational Environment for Radiological Research”.
Ribera, Esteban; Martínez-Sesmero, José Manuel; Sánchez-Rubio, Javier; Rubio, Rafael; Pasquau, Juan; Poveda, José Luis; Pérez-Mitru, Alejandro; Roldán, Celia; Hernández-Novoa, Beatriz
2018-03-01
The objective of this study is to estimate the economic impact associated with the optimisation of triple antiretroviral treatment (ART) in patients with undetectable viral load according to the recommendations from the GeSIDA/PNS (2015) Consensus and their applicability in the Spanish clinical practice. A pharmacoeconomic model was developed based on data from a National Hospital Prescription Survey on ART (2014) and the A-I evidence recommendations for the optimisation of ART from the GeSIDA/PNS (2015) consensus. The optimisation model took into account the willingness to optimise a particular regimen and other assumptions, and the results were validated by an expert panel in HIV infection (Infectious Disease Specialists and Hospital Pharmacists). The analysis was conducted from the NHS perspective, considering the annual wholesale price and accounting for deductions stated in the RD-Law 8/2010 and the VAT. The expert panel selected six optimisation strategies, and estimated that 10,863 (13.4%) of the 80,859 patients in Spain currently on triple ART, would be candidates to optimise their ART, leading to savings of €15.9M/year (2.4% of total triple ART drug cost). The most feasible strategies (>40% of patients candidates for optimisation, n=4,556) would be optimisations to ATV/r+3TC therapy. These would produce savings between €653 and €4,797 per patient per year depending on baseline triple ART. Implementation of the main optimisation strategies recommended in the GeSIDA/PNS (2015) Consensus into Spanish clinical practice would lead to considerable savings, especially those based in dual therapy with ATV/r+3TC, thus contributing to the control of pharmaceutical expenditure and NHS sustainability. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
Optimisation of the hybrid renewable energy system by HOMER, PSO and CPSO for the study area
NASA Astrophysics Data System (ADS)
Khare, Vikas; Nema, Savita; Baredar, Prashant
2017-04-01
This study is based on the simulation and optimisation of the renewable energy system of the police control room at Sagar in central India. To analyse this hybrid system, meteorological data of solar insolation and hourly wind speeds at Sagar in central India (longitude 78°45′ and latitude 23°50′) have been considered. The pattern of load consumption is studied and suitably modelled for optimisation of the hybrid energy system using the HOMER software. The results are compared with those of the particle swarm optimisation (PSO) and chaotic particle swarm optimisation (CPSO) algorithms. The use of these two algorithms to optimise the hybrid system leads to higher-quality results with faster convergence. Based on the optimisation results, it has been found that replacing conventional energy sources with the solar-wind hybrid renewable energy system is a feasible solution for the distribution of electric power as a stand-alone application at the police control room. This system is more environmentally friendly than the conventional diesel generator, and fuel costs are reduced by approximately 70-80% compared with the conventional diesel generator.
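The PSO step used in the comparison above can be sketched in minimal form. The inertia and acceleration coefficients, and the toy cost surrogate, are assumptions for illustration; the study's actual hybrid-system cost model is far richer.

```python
# Minimal particle swarm optimisation sketch: each particle is pulled toward
# its personal best and the swarm's global best with random weights.
import random

def pso(f, dim, n=20, iters=200, seed=0, w=0.7, c1=1.5, c2=1.5):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [list(p) for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = list(pbest[g]), pbest_f[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = list(pos[i]), fi
                if fi < gbest_f:
                    gbest, gbest_f = list(pos[i]), fi
    return gbest, gbest_f

# Toy "cost of energy" surrogate (an assumption): minimum at sizing (1, 2)
cost = lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2
best, best_f = pso(cost, 2)
print(best, best_f)
```

The chaotic variant (CPSO) replaces the pseudo-random draws with a chaotic map to improve exploration, which is the refinement the study compares against.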
Intelligent optimisation algorithm for large-scale structural design
NASA Astrophysics Data System (ADS)
Dominique, Stephane
The implementation of an automated decision support system in the field of design and structural optimisation can give a significant advantage to any industry working on mechanical designs. Indeed, by providing solution ideas to a designer or by upgrading existing design solutions while the designer is not at work, the system may reduce the project cycle time, or allow more time to produce a better design. This thesis presents a new approach to automate a design process based on Case-Based Reasoning (CBR), in combination with a new genetic algorithm named Genetic Algorithm with Territorial core Evolution (GATE). This approach was developed in order to reduce the operating cost of the process. However, as the system implementation cost is quite high, the approach is better suited to large-scale design problems, particularly design problems that the designer plans to solve for many different specification sets. First, the CBR process uses a databank filled with every known solution to similar design problems. Then, the closest solutions to the current problem in terms of specifications are selected. After this, during the adaptation phase, an artificial neural network (ANN) interpolates amongst known solutions to produce an additional solution to the current problem using the current specifications as inputs. Each solution produced and selected by the CBR is then used to initialize the population of an island of the genetic algorithm. The algorithm optimises the solution further during the refinement phase. Using progressive refinement, the algorithm starts with only the most important variables for the problem. Then, as the optimisation progresses, the remaining variables are gradually introduced, layer by layer. The genetic algorithm that is used is a new algorithm specifically created during this thesis to solve optimisation problems from the field of mechanical device structural design.
The algorithm is named GATE, and is essentially a real-number genetic algorithm that prevents new individuals from being born too close to previously evaluated solutions. The restricted area becomes smaller or larger during the optimisation to allow global or local search when necessary. Also, a new search operator named the Substitution Operator is incorporated in GATE. This operator allows an ANN surrogate model to guide the algorithm toward the most promising areas of the design space. The suggested CBR approach and GATE were tested on several simple test problems, as well as on the industrial problem of designing a gas turbine engine rotor's disc. These results are compared to results obtained for the same problems by many other popular optimisation algorithms, such as (depending on the problem) gradient algorithms, a binary genetic algorithm, a real-number genetic algorithm, a genetic algorithm using multiple-parent crossovers, a differential evolution genetic algorithm, the Hooke & Jeeves generalized pattern search method and POINTER from the software I-SIGHT 3.5. Results show that GATE is quite competitive, giving the best results for 5 of the 6 constrained optimisation problems. GATE also provided the best results of all on problems produced by a Maximum Set Gaussian landscape generator. Finally, GATE provided a disc 4.3% lighter than the best other tested algorithm (POINTER) for the gas turbine engine rotor's disc problem. One drawback of GATE is its lower efficiency on highly multimodal unconstrained problems, for which it gave quite poor results with respect to its implementation cost. To conclude, according to the preliminary results obtained during this thesis, the suggested CBR process, combined with GATE, seems to be a very good candidate to automate and accelerate the structural design of mechanical devices, potentially reducing significantly the cost of industrial preliminary design processes.
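GATE's core idea, rejecting candidates born inside the "territory" of previously evaluated solutions, with the territory radius shrinking to shift from global to local search, can be illustrated with a simplified sketch. The operators, radius schedule and retry cap below are illustrative choices, not the thesis's actual implementation:

```python
import random, math

# Simplified illustration of GATE's territorial exclusion: a real-coded GA
# that rejects children too close to any previously evaluated point.

def sphere(x):  # toy objective (minimise)
    return sum(v * v for v in x)

def gate_like(f, dim=2, pop_size=16, gens=40, lo=-5.0, hi=5.0):
    evaluated = []                           # archive of all evaluated points
    def too_close(x, radius):
        return any(math.dist(x, e) < radius for e in evaluated)
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    evaluated.extend(pop)
    fit = [f(p) for p in pop]
    for g in range(gens):
        radius = 0.5 * (1.0 - g / gens)      # territory shrinks: global -> local
        children, attempts = [], 0
        while len(children) < pop_size:
            attempts += 1
            a, b = random.sample(range(pop_size), 2)
            child = [(pop[a][d] + pop[b][d]) / 2 + random.gauss(0, 0.3)
                     for d in range(dim)]
            child = [min(max(v, lo), hi) for v in child]
            # reject children inside an existing territory; the attempt cap
            # prevents stalls once territories saturate the search space
            if attempts < 300 and too_close(child, radius):
                continue
            evaluated.append(child)
            children.append(child)
        pool = pop + children                # elitist (mu + lambda) selection
        pool_fit = fit + [f(c) for c in children]
        order = sorted(range(len(pool)), key=lambda i: pool_fit[i])[:pop_size]
        pop = [pool[i] for i in order]
        fit = [pool_fit[i] for i in order]
    return pop[0], fit[0]

best, best_val = gate_like(sphere)
```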
Reverse engineering a gene network using an asynchronous parallel evolution strategy
2010-01-01
Background The use of reverse engineering methods to infer gene regulatory networks by fitting mathematical models to gene expression data is becoming increasingly popular and successful. However, increasing model complexity means that more powerful global optimisation techniques are required for model fitting. The parallel Lam Simulated Annealing (pLSA) algorithm has been used in such approaches, but recent research has shown that island Evolutionary Strategies can produce faster, more reliable results. However, no parallel island Evolutionary Strategy (piES) has yet been demonstrated to be effective for this task. Results Here, we present synchronous and asynchronous versions of the piES algorithm, and apply them to a real reverse engineering problem: inferring parameters in the gap gene network. We find that the asynchronous piES exhibits very little communication overhead, and shows significant speed-up for up to 50 nodes: the piES running on 50 nodes is nearly 10 times faster than the best serial algorithm. We compare the asynchronous piES to pLSA on the same test problem, measuring the time required to reach particular levels of residual error, and show that it converges much faster than pLSA across all optimisation conditions tested. Conclusions Our results demonstrate that the piES is consistently faster and more reliable than the pLSA algorithm on this problem, and scales better with increasing numbers of nodes. In addition, the piES is especially well suited to further improvements and adaptations: Firstly, the algorithm's fast initial descent speed and high reliability make it a good candidate for being used as part of a global/local search hybrid algorithm. Secondly, it has the potential to be used as part of a hierarchical evolutionary algorithm, which takes advantage of modern multi-core computing architectures. PMID:20196855
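The key property of the asynchronous scheme, candidate evaluations run on a worker pool and the population is updated as each result arrives, with no per-generation barrier, can be sketched with a steady-state evolution strategy on a thread pool. The objective, operators and all parameters are illustrative placeholders, not the paper's gap-gene model or actual piES:

```python
import random
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

# Sketch of an asynchronous ES: workers never idle at a generation barrier;
# each finished evaluation immediately triggers a replacement and a new child.

def objective(x):
    return sum(v * v for v in x)             # toy fitness (minimise)

def async_es(dim=3, mu=10, evaluations=400, sigma=0.3, workers=4):
    pop = [[random.uniform(-3, 3) for _ in range(dim)] for _ in range(mu)]
    pop = [(x, objective(x)) for x in pop]   # (genome, fitness) pairs
    done_evals = 0
    with ThreadPoolExecutor(max_workers=workers) as ex:
        def spawn():
            parent = min(random.sample(pop, 3), key=lambda p: p[1])[0]  # tournament
            child = [v + random.gauss(0, sigma) for v in parent]        # mutation
            return ex.submit(lambda c=child: (c, objective(c)))
        pending = {spawn() for _ in range(workers)}
        while done_evals < evaluations:
            finished, pending = wait(pending, return_when=FIRST_COMPLETED)
            for fut in finished:
                child, fit = fut.result()
                done_evals += 1
                worst = max(range(mu), key=lambda i: pop[i][1])
                if fit < pop[worst][1]:
                    pop[worst] = (child, fit)  # steady-state replacement
                if done_evals < evaluations:
                    pending.add(spawn())
        for fut in pending:
            fut.cancel()
    return min(pop, key=lambda p: p[1])

best_x, best_fit = async_es()
```

In the paper's setting each "worker" would be a cluster node evaluating a full ODE model fit, where removing the synchronisation barrier is what yields the near-linear scaling to 50 nodes.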
Optimisation of nano-silica modified self-compacting high-Volume fly ash mortar
NASA Astrophysics Data System (ADS)
Achara, Bitrus Emmanuel; Mohammed, Bashar S.; Fadhil Nuruddin, Muhd
2017-05-01
The effects of nano-silica amount and superplasticizer (SP) dosage on the compressive strength, porosity and slump flow of high-volume fly ash self-consolidating mortar were investigated. A multiobjective optimisation technique using Design-Expert software was applied to obtain a solution based on a desirability function that simultaneously optimises the variables and the responses. A desirability value of 0.811 gave the optimised solution. The experimental and predicted results showed minimal errors in all the measured responses.
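The desirability-function approach used by multiobjective optimisers such as Design-Expert maps each response to a desirability d in [0, 1] and scores a candidate by the geometric mean of the individual desirabilities. A minimal sketch follows; the target ranges and response values are illustrative, not the study's actual settings:

```python
# Derringer-Suich-style desirability functions (one-sided, power-law ramps).

def d_maximise(y, lo, hi, weight=1.0):
    # 0 below lo, 1 above hi, ramp in between
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return ((y - lo) / (hi - lo)) ** weight

def d_minimise(y, lo, hi, weight=1.0):
    # mirror image: 1 below lo, 0 above hi
    return d_maximise(-y, -hi, -lo, weight)

def overall_desirability(ds):
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))           # geometric mean

# Hypothetical candidate mix: strength 52 MPa (maximise over 40-60),
# porosity 12 % (minimise over 8-20), slump flow 260 mm (maximise over 240-280).
D = overall_desirability([
    d_maximise(52, 40, 60),
    d_minimise(12, 8, 20),
    d_maximise(260, 240, 280),
])
```

The optimiser then searches the factor space (here, nano-silica amount and SP dosage) for the settings whose predicted responses maximise D.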
Advanced treatment planning using direct 4D optimisation for pencil-beam scanned particle therapy
NASA Astrophysics Data System (ADS)
Bernatowicz, Kinga; Zhang, Ye; Perrin, Rosalind; Weber, Damien C.; Lomax, Antony J.
2017-08-01
We report on the development of a new four-dimensional (4D) optimisation approach for scanned proton beams, which incorporates both irregular motion patterns and the delivery dynamics of the treatment machine into the plan optimiser. Furthermore, we assess the effectiveness of this technique in reducing dose to critical structures in proximity to moving targets, while maintaining effective target dose homogeneity and coverage. The proposed approach has been tested using both a simulated phantom and a clinical liver cancer case, and allows for realistic 4D calculations and optimisation using irregular breathing patterns extracted from, e.g., 4DCT-MRI (4D computed tomography-magnetic resonance imaging). 4D dose distributions resulting from our 4D optimisation can achieve almost the same quality as static plans, independent of the studied geometry/anatomy or selected motion (regular and irregular). Additionally, the current implementation of the 4D optimisation approach requires less than 3 min to find the solution for a single field planned on the 4DCT of a liver cancer patient. Although 4D optimisation allows for realistic calculations using irregular breathing patterns, it is very sensitive to variations from the planned motion. Based on a sensitivity analysis, target dose homogeneity comparable to static plans (D5-D95 <5%) has been found only for differences in amplitude of up to 1 mm, for changes in respiratory phase <200 ms and for changes in the breathing period of <20 ms in comparison to the motions used during optimisation. As such, methods to robustly deliver 4D optimised plans employing 4D intensity-modulated delivery are discussed.
Bahia, Daljit; Cheung, Robert; Buchs, Mirjam; Geisse, Sabine; Hunt, Ian
2005-01-01
This report describes a method to culture insect cells in 24 deep-well blocks for the routine small-scale optimisation of baculovirus-mediated protein expression experiments. Miniaturisation of this process provides the necessary reduction in terms of resource allocation, reagents, and labour to allow extensive and rapid optimisation of expression conditions, with a concomitant reduction in lead-time before commencement of large-scale bioreactor experiments. This greatly simplifies the optimisation process and allows the use of liquid handling robotics in much of the initial optimisation stages, thereby greatly increasing the throughput of the laboratory. We present several examples of the use of deep-well block expression studies in the optimisation of therapeutically relevant protein targets. We also discuss how the enhanced throughput offered by this approach can be adapted to robotic handling systems and the implications this has for the capacity to conduct multi-parallel protein expression studies.
Mutual information-based LPI optimisation for radar network
NASA Astrophysics Data System (ADS)
Shi, Chenguang; Zhou, Jianjiang; Wang, Fei; Chen, Jun
2015-07-01
A radar network can offer significant performance improvements for target detection and information extraction by employing spatial diversity. For a fixed number of radars, the achievable mutual information (MI) for estimating the target parameters may extend beyond a predefined threshold with full power transmission. In this paper, an effective low probability of intercept (LPI) optimisation algorithm is presented to improve the LPI performance of a radar network. Based on the radar network system model, we first provide the Schleher intercept factor for the radar network as an optimisation metric for LPI performance. Then, a novel LPI optimisation algorithm is presented in which, for a predefined MI threshold, the Schleher intercept factor for the radar network is minimised by optimising the transmission power allocation among radars in the network, so that enhanced LPI performance for the radar network can be achieved. The genetic algorithm based on nonlinear programming (GA-NP) is employed to solve the resulting nonconvex and nonlinear optimisation problem. Simulations demonstrate that the proposed algorithm is valuable and effective in improving the LPI performance of the radar network.
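A toy version of the power-allocation subproblem can be solved in closed form under strong simplifying assumptions: network MI additive across radars with MI_i = log(1 + g_i P_i), and intercept risk growing with total transmitted power, so minimising intercept risk reduces to minimising total power subject to the MI threshold. Under these assumptions the optimum is a water-filling allocation, found here by bisecting on the water level; the gains g_i are made up, and the actual paper solves a nonconvex problem with GA-NP rather than this convex surrogate:

```python
import math

# Water-filling sketch: minimise sum(P_i) subject to
# sum(log(1 + g_i * P_i)) >= mi_min. Stationarity gives
# P_i = max(0, level - 1/g_i); bisect on the water level.

def waterfill(gains, mi_min, tol=1e-10):
    def alloc(level):
        p = [max(0.0, level - 1.0 / g) for g in gains]
        mi = sum(math.log(1.0 + g * pi) for g, pi in zip(gains, p))
        return p, mi
    lo, hi = 0.0, 1.0
    while alloc(hi)[1] < mi_min:             # grow bracket until feasible
        hi *= 2.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if alloc(mid)[1] < mi_min:
            lo = mid
        else:
            hi = mid
    return alloc(hi)[0]

powers = waterfill(gains=[2.0, 1.0, 0.5], mi_min=2.0)
total_power = sum(powers)
```

Note how the weakest radar (g = 0.5) receives no power: concentrating power on the radars with the best target returns meets the MI threshold at the lowest total emission, which is exactly the LPI intuition.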
Zarb, Francis; McEntee, Mark F; Rainford, Louise
2015-06-01
To evaluate visual grading characteristics (VGC) and ordinal regression analysis during head CT optimisation as a potential alternative to visual grading assessment (VGA), traditionally employed to score anatomical visualisation. Patient images (n = 66) were obtained using current and optimised imaging protocols from two CT suites: a 16-slice scanner at the national Maltese centre for trauma and a 64-slice scanner in a private centre. Local resident radiologists (n = 6) performed VGA followed by VGC and ordinal regression analysis. VGC alone indicated that optimised protocols had image quality similar to that of current protocols. Ordinal logistic regression analysis provided an in-depth evaluation, criterion by criterion, allowing the selective implementation of the protocols. The local radiology review panel supported the implementation of optimised protocols for brain CT examinations (including trauma) in one centre, achieving radiation dose reductions ranging from 24 % to 36 %. In the second centre a 29 % reduction in radiation dose was achieved for follow-up cases. The combined use of VGC and ordinal logistic regression analysis led to clinical decisions being taken on the implementation of the optimised protocols. This improved method of image quality analysis provided the evidence to support imaging protocol optimisation, resulting in significant radiation dose savings. • There is need for scientifically based image quality evaluation during CT optimisation. • VGC and ordinal regression analysis in combination led to better informed clinical decisions. • VGC and ordinal regression analysis led to dose reductions without compromising diagnostic efficacy.
Genetic landscapes GIS Toolbox: tools to map patterns of genetic divergence and diversity.
Vandergast, Amy G.; Perry, William M.; Lugo, Roberto V.; Hathaway, Stacie A.
2011-01-01
The Landscape Genetics GIS Toolbox contains tools that run in the Geographic Information System software, ArcGIS, to map genetic landscapes and to summarize multiple genetic landscapes as average and variance surfaces. These tools can be used to visualize the distribution of genetic diversity across geographic space and to study associations between patterns of genetic diversity and geographic features or other geo-referenced environmental data sets. Together, these tools create genetic landscape surfaces directly from tables containing genetic distance or diversity data and sample location coordinates, greatly reducing the complexity of building and analyzing these raster surfaces in a Geographic Information System.
Genetic and Genomic Toolbox of Zea mays
Nannas, Natalie J.; Dawe, R. Kelly
2015-01-01
Maize has a long history of genetic and genomic tool development and is considered one of the most accessible higher plant systems. With a fully sequenced genome, a suite of cytogenetic tools, methods for both forward and reverse genetics, and characterized phenotype markers, maize is amenable to studying questions beyond plant biology. Major discoveries in the areas of transposons, imprinting, and chromosome biology came from work in maize. Moving forward in the post-genomic era, this classic model system will continue to be at the forefront of basic biological study. In this review, we outline the basics of working with maize and describe its rich genetic toolbox. PMID:25740912
Kang, Zhen; Huang, Hao; Zhang, Yunfeng; Du, Guocheng; Chen, Jian
2017-01-01
Pichia pastoris (reclassified as Komagataella phaffii), a methylotrophic yeast, has been widely used for heterologous protein production because of its unique advantages, such as readily achievable high-density fermentation, tractable genetic modifications and typical eukaryotic post-translational modifications. More recently, P. pastoris has also gained much attention as a metabolic pathway engineering platform. In this mini-review, we address recent advances in the molecular toolboxes established for P. pastoris, including synthetic promoters, signal peptides, and genome engineering tools. Furthermore, the applications of P. pastoris in synthetic biology are discussed and future prospects considered, especially in the context of genome-scale metabolic pathway analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eckerman, Keith F.; Sjoreen, Andrea L.
2013-05-01
The Radiological Toolbox software developed by Oak Ridge National Laboratory (ORNL) for the U.S. Nuclear Regulatory Commission (NRC) is designed to provide electronic access to the vast and varied data that underlie the field of radiation protection. These data represent physical, chemical, anatomical, physiological, and mathematical parameters detailed in the various handbooks a health physicist might consult while in his office. The initial motivation for the software was to serve the needs of the health physicist away from his office and without access to his handbooks, e.g., NRC inspectors. The earlier releases of the software were widely used and accepted around the world, not only by practicing health physicists but also within educational programs. This release updates the software to accommodate changes in Windows operating systems and, in some aspects, radiation protection. This release has been tested on Windows 7 and 8 and on 32- and 64-bit machines. The nuclear decay data has been updated, and thermal neutron capture cross sections and cancer risk coefficients have been included. This document and the software's user's guide provide further details and documentation of the information captured within the Radiological Toolbox.
A toolbox and a record for scientific model development
NASA Technical Reports Server (NTRS)
Ellman, Thomas
1994-01-01
Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling them: (1) designing a 'Model Development Toolbox' that includes a basic set of model-constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well-defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models, and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.
spads 1.0: a toolbox to perform spatial analyses on DNA sequence data sets.
Dellicour, Simon; Mardulyn, Patrick
2014-05-01
SPADS 1.0 (for 'Spatial and Population Analysis of DNA Sequences') is a population genetic toolbox for characterizing genetic variability within and among populations from DNA sequences. In view of the drastic increase in genetic information available through sequencing methods, SPADS was specifically designed to deal with multilocus data sets of DNA sequences. It computes several summary statistics from populations or groups of populations, performs input file conversions for other population genetic programs and implements locus-by-locus and multilocus versions of two clustering algorithms to study the genetic structure of populations. The toolbox also includes two MATLAB and R functions, GDISPAL and GDIVPAL, to display differentiation and diversity patterns across landscapes. These functions aim to generate interpolating surfaces based on multilocus distance and diversity indices. In the case of multiple loci, such surfaces can represent a useful alternative to the multiple pie-chart maps traditionally used in phylogeography to represent the spatial distribution of genetic diversity. These coloured surfaces can also be used to compare different data sets or different diversity and/or distance measures estimated on the same data set. © 2013 John Wiley & Sons Ltd.
MEG/EEG Source Reconstruction, Statistical Evaluation, and Visualization with NUTMEG
Dalal, Sarang S.; Zumer, Johanna M.; Guggisberg, Adrian G.; Trumpis, Michael; Wong, Daniel D. E.; Sekihara, Kensuke; Nagarajan, Srikantan S.
2011-01-01
NUTMEG is a source analysis toolbox geared towards cognitive neuroscience researchers using MEG and EEG, including intracranial recordings. Evoked and unaveraged data can be imported to the toolbox for source analysis in either the time or time-frequency domains. NUTMEG offers several variants of adaptive beamformers, probabilistic reconstruction algorithms, as well as minimum-norm techniques to generate functional maps of spatiotemporal neural source activity. Lead fields can be calculated from single and overlapping sphere head models or imported from other software. Group averages and statistics can be calculated as well. In addition to data analysis tools, NUTMEG provides a unique and intuitive graphical interface for visualization of results. Source analyses can be superimposed onto a structural MRI or headshape to provide a convenient visual correspondence to anatomy. These results can also be navigated interactively, with the spatial maps and source time series or spectrogram linked accordingly. Animations can be generated to view the evolution of neural activity over time. NUTMEG can also display brain renderings and perform spatial normalization of functional maps using SPM's engine. As a MATLAB package, the end user may easily link with other toolboxes or add customized functions. PMID:21437174
Language Measures of the NIH Toolbox Cognition Battery
Gershon, Richard C.; Cook, Karon F.; Mungas, Dan; Manly, Jennifer J.; Slotkin, Jerry; Beaumont, Jennifer L.; Weintraub, Sandra
2015-01-01
Language facilitates communication and efficient encoding of thought and experience. Because of its essential role in early childhood development, in educational achievement and in subsequent life adaptation, language was included as one of the subdomains in the NIH Toolbox for the Assessment of Neurological and Behavioral Function Cognition Battery (NIHTB-CB). There are many different components of language functioning, including syntactic processing (i.e., morphology and grammar) and lexical semantics. For purposes of the NIHTB-CB, two tests of language—a picture vocabulary test and a reading recognition test—were selected by consensus based on literature reviews, iterative expert input, and a desire to assess in English and Spanish. NIHTB-CB’s picture vocabulary and reading recognition tests are administered using computer adaptive testing and scored using item response theory. Data are presented from the validation of the English versions in a sample of adults ages 20–85 years (Spanish results will be presented in a future publication). Both tests demonstrated high test–retest reliability and good construct validity compared to corresponding gold-standard measures. Scores on the NIH Toolbox measures were consistent with age-related expectations, namely, growth in language during early development, with relative stabilization into late adulthood. PMID:24960128
Hydratools, a MATLAB® based data processing package for Sontek Hydra data
Martini, M.; Lightsom, F.L.; Sherwood, C.R.; Xu, Jie; Lacy, J.R.; Ramsey, A.; Horwitz, R.
2005-01-01
The U.S. Geological Survey (USGS) has developed a set of MATLAB tools to process and convert data collected by Sontek Hydra instruments to netCDF, which is a format used by the USGS to process and archive oceanographic time-series data. The USGS makes high-resolution current measurements within 1.5 meters of the bottom. These data are used in combination with other instrument data from sediment transport studies to develop sediment transport models. Instrument manufacturers provide software which outputs unique binary data formats. Multiple data formats are cumbersome. The USGS solution is to translate data streams into a common data format: netCDF. The Hydratools toolbox is written to create netCDF format files following EPIC conventions, complete with embedded metadata. Data are accepted from both the ADV and the PCADP. The toolbox will detect and remove bad data, substitute other sources of heading and tilt measurements if necessary, apply ambiguity corrections, calculate statistics, return information about data quality, and organize metadata. Standardized processing and archiving makes these data more easily and routinely accessible locally and over the Internet. In addition, documentation of the techniques used in the toolbox provides a baseline reference for others utilizing the data.
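One quality-control step of the kind Hydratools applies before writing netCDF can be sketched in a toolbox-agnostic way: flag ADV velocity samples whose beam correlation falls below a threshold, then patch the flagged gaps by interpolation. The 70 % correlation cutoff, the sample data, and the interpolation choice are illustrative only, not Hydratools' actual criteria:

```python
# Sketch of a bad-data detection pass: flag low-correlation velocity
# samples and fill the gaps by linear interpolation between good neighbours.

def qc_velocity(vel, corr, corr_min=70.0):
    """Return (cleaned velocities, list of flagged indices)."""
    flagged = [i for i, c in enumerate(corr) if c < corr_min]
    clean = list(vel)
    for i in flagged:
        # nearest good neighbours on each side of the flagged sample
        left = next((j for j in range(i - 1, -1, -1) if j not in flagged), None)
        right = next((j for j in range(i + 1, len(vel)) if j not in flagged), None)
        if left is not None and right is not None:
            frac = (i - left) / (right - left)
            clean[i] = vel[left] + frac * (vel[right] - vel[left])
        elif left is not None:
            clean[i] = vel[left]          # trailing gap: hold last good value
        elif right is not None:
            clean[i] = vel[right]         # leading gap: back-fill
    return clean, flagged

vel = [0.10, 0.12, 0.95, 0.14, 0.15]      # spike at index 2 (m/s)
corr = [92.0, 90.0, 35.0, 88.0, 91.0]     # low correlation marks the spike
clean, flagged = qc_velocity(vel, corr)
```

In the real toolbox the flag decisions, thresholds and counts of edited points would also be recorded in the netCDF metadata so the processing history stays with the archived data.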
On the design and optimisation of new fractal antenna using PSO
NASA Astrophysics Data System (ADS)
Rani, Shweta; Singh, A. P.
2013-10-01
An optimisation technique for a newly shaped fractal structure using particle swarm optimisation with curve fitting is presented in this article. The aim of the particle swarm optimisation is to find the geometry of the antenna for the required user-defined frequency. To assess the effectiveness of the presented method, a set of representative numerical simulations has been performed and the results are compared with measurements from experimental prototypes built according to the design specifications coming from the optimisation procedure. The proposed fractal antenna resonates at the 5.8 GHz industrial, scientific and medical band, which is suitable for wireless telemedicine applications. The antenna characteristics have been studied using extensive numerical simulations and are experimentally verified. The antenna exhibits well-defined radiation patterns over the band.
Bryant, Maria; Burton, Wendy; Cundill, Bonnie; Farrin, Amanda J; Nixon, Jane; Stevens, June; Roberts, Kim; Foy, Robbie; Rutter, Harry; Hartley, Suzanne; Tubeuf, Sandy; Collinson, Michelle; Brown, Julia
2017-01-24
Family-based interventions to prevent childhood obesity depend upon parents' taking action to improve diet and other lifestyle behaviours in their families. Programmes that attract and retain high numbers of parents provide an enhanced opportunity to improve public health and are also likely to be more cost-effective than those that do not. We have developed a theory-informed optimisation intervention to promote parent engagement within an existing childhood obesity prevention group programme, HENRY (Health Exercise Nutrition for the Really Young). Here, we describe a proposal to evaluate the effectiveness of this optimisation intervention in regard to the engagement of parents and cost-effectiveness. The Optimising Family Engagement in HENRY (OFTEN) trial is a cluster randomised controlled trial being conducted across 24 local authorities (approximately 144 children's centres) which currently deliver HENRY programmes. The primary outcome will be parental enrolment and attendance at the HENRY programme, assessed using routinely collected process data. Cost-effectiveness will be presented in terms of primary outcomes using acceptability curves and through eliciting the willingness to pay for the optimisation from HENRY commissioners. Secondary outcomes include the longitudinal impact of the optimisation, parent-reported infant intake of fruits and vegetables (as a proxy to compliance) and other parent-reported family habits and lifestyle. This innovative trial will provide evidence on the implementation of a theory-informed optimisation intervention to promote parent engagement in HENRY, a community-based childhood obesity prevention programme. The findings will be generalisable to other interventions delivered to parents in other community-based environments. This research meets the expressed needs of commissioners, children's centres and parents to optimise the potential impact that HENRY has on obesity prevention. 
A subsequent cluster randomised controlled pilot trial is planned to determine the practicality of undertaking a definitive trial to robustly evaluate the effectiveness and cost-effectiveness of the optimised intervention on childhood obesity prevention. ClinicalTrials.gov identifier: NCT02675699 . Registered on 4 February 2016.
Enhancing the stress responses of probiotics for a lifestyle from gut to product and back again.
Mills, Susan; Stanton, Catherine; Fitzgerald, Gerald F; Ross, R Paul
2011-08-30
Before a probiotic bacterium can even begin to fulfill its biological role, it must survive a battery of environmental stresses imposed during food processing and passage through the gastrointestinal tract (GIT). Food processing stresses include extremes in temperature, as well as osmotic, oxidative and food matrix stresses. Passage through the GIT is a hazardous journey for any bacteria with deleterious lows in pH encountered in the stomach to the detergent-like properties of bile in the duodenum. However, bacteria are equipped with an array of defense mechanisms to counteract intracellular damage or to enhance the robustness of the cell to withstand lethal external environments. Understanding these mechanisms in probiotic bacteria and indeed other bacterial groups has resulted in the development of a molecular toolbox to augment the technological and gastrointestinal performance of probiotics. This has been greatly aided by studies which examine the global cellular responses to stress highlighting distinct regulatory networks and which also identify novel mechanisms used by cells to cope with hazardous environments. This review highlights the latest studies which have exploited the bacterial stress response with a view to producing next-generation probiotic cultures and highlights the significance of studies which view the global bacterial stress response from an integrative systems biology perspective.
Alejo, L; Corredoira, E; Sánchez-Muñoz, F; Huerga, C; Aza, Z; Plaza-Núñez, R; Serrada, A; Bret-Zurita, M; Parrón, M; Prieto-Areyano, C; Garzón-Moll, G; Madero, R; Guibelalde, E
2018-04-09
Objective: The new 2013/59 EURATOM Directive (ED) demands dosimetric optimisation procedures without undue delay. The aim of this study was to optimise paediatric conventional radiology examinations, applying the ED without compromising the clinical diagnosis. Automatic dose management software (ADMS) was used to analyse 2678 studies of children from birth to 5 years of age, obtaining local diagnostic reference levels (DRLs) in terms of entrance surface air kerma. As the local DRL for infant chest examinations exceeded the European Commission (EC) DRL, an optimisation was performed by decreasing the kVp and applying automatic exposure control. To assess image quality, an analysis of high-contrast spatial resolution (HCSR), signal-to-noise ratio (SNR) and figure of merit (FOM) was performed, as well as a blind test based on the generalised estimating equations method. For newborns and chest examinations, the local DRL exceeded the EC DRL by 113%. After the optimisation, a reduction of 54% was obtained. No significant differences were found in the image quality blind test. A decrease in SNR (-37%) and HCSR (-68%), and an increase in FOM (42%), was observed. ADMS allows the fast calculation of local DRLs and the performance of optimisation procedures in babies without delay. However, physical and clinical analyses of image quality are still needed to ensure diagnostic integrity after the optimisation process. Advances in knowledge: ADMS are useful to detect radiation protection problems and to perform optimisation procedures in paediatric conventional imaging without undue delay, as the ED requires.
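The trade-off the study reports, lower SNR but a higher figure of merit at much lower dose, follows from one common definition of dose efficiency in radiographic optimisation, FOM = SNR² / K, with K the entrance surface air kerma. Whether this matches the study's exact FOM definition is an assumption, and the numbers below are illustrative, not the study's measurements:

```python
# Dose-efficiency figure of merit: halving dose at constant SNR doubles FOM,
# so SNR can drop after optimisation while FOM still rises.

def fom(snr, kerma):
    return snr ** 2 / kerma

baseline = fom(snr=40.0, kerma=60.0)       # hypothetical pre-optimisation
optimised = fom(snr=30.0, kerma=25.0)      # lower SNR, but much lower dose
improvement = optimised / baseline - 1.0   # fractional FOM gain
```

Here SNR falls by 25 % yet the FOM rises by 35 %, the same qualitative pattern as the reported -37 % SNR with +42 % FOM.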
Birch, A Nicholas E; Begg, Graham S; Squire, Geoffrey R
2011-06-01
Drivers behind food security and crop protection issues are discussed in relation to food losses caused by pests. Pests globally consume food estimated to be enough to feed an additional one billion people. Key drivers include rapid human population increase, climate change, loss of beneficial on-farm biodiversity, reduction in per capita cropped land, water shortages, and EU pesticide withdrawals under policies relating to Directive 91/414/EEC. IPM (Integrated Pest Management) will be compulsory for all EU agriculture by 2014 and is also being widely adopted globally. IPM offers a 'toolbox' of complementary crop- and region-specific crop protection solutions to address these rising pressures, and aims for more sustainable solutions by using complementary technologies. The applied research challenge now is to reduce selection pressure on single-solution strategies by creating additive or synergistic interactions between IPM components. IPM is compatible with organic, conventional and GM cropping systems and is flexible, allowing regional fine-tuning. It reduces pests below economic thresholds by utilizing key 'ecological services', particularly biocontrol. A recent global review demonstrates that IPM can reduce pesticide use and increase yields in most of the major crops studied. Landscape-scale 'ecological engineering', together with genetic improvement of new crop varieties, will enhance the durability of pest-resistant cultivars (conventional and GM). IPM will also promote compatibility with semiochemicals, biopesticides, precision pest monitoring tools and rapid diagnostics. These combined strategies are urgently needed and are best achieved via multi-disciplinary research, including complex spatio-temporal modelling at farm and landscape scales. Integrative and synergistic use of existing and new IPM technologies will help meet future food production needs more sustainably in developed and developing countries, in an era of reduced pesticide availability.
Current IPM research gaps are identified and discussed.
Asselineau, Charles-Alexis; Zapata, Jose; Pye, John
2015-06-01
A stochastic optimisation method adapted to illumination and radiative heat transfer problems involving Monte-Carlo ray-tracing is presented. A solar receiver shape optimisation case study illustrates the advantages and potential of the method: efficient receivers are identified at moderate computational cost.
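The essential difficulty such a method must handle is that every objective evaluation is a noisy Monte-Carlo estimate. A minimal sketch of that coupling, using a toy 'efficiency' curve and a simple (1+1) evolution strategy as illustrative stand-ins for the authors' ray-tracer and optimiser:

```python
import random

def noisy_objective(x, n_rays=20000):
    """Stand-in for a Monte-Carlo ray-tracing estimate of receiver
    efficiency: a toy true value plus sampling noise whose standard
    deviation shrinks as more rays are traced (hypothetical model)."""
    true_eff = 1.0 - (x - 0.3) ** 2
    noise = random.gauss(0.0, 1.0 / n_rays ** 0.5)
    return true_eff + noise

def one_plus_one_es(x0, steps=200, sigma=0.05, seed=1):
    """(1+1) evolution strategy: accept a mutated shape parameter
    only if its (noisy) efficiency estimate improves."""
    random.seed(seed)
    x, fx = x0, noisy_objective(x0)
    for _ in range(steps):
        cand = x + random.gauss(0.0, sigma)
        fc = noisy_objective(cand)
        if fc > fx:
            x, fx = cand, fc
    return x
```

Because the sampling noise scales as the inverse square root of the ray count, the optimiser can only resolve shape differences whose true efficiency gap exceeds that noise floor, which is why a moderate, rather than exhaustive, ray budget per evaluation is workable.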
Topology optimisation for natural convection problems
NASA Astrophysics Data System (ADS)
Alexandersen, Joe; Aage, Niels; Andreasen, Casper Schousboe; Sigmund, Ole
2014-12-01
This paper demonstrates the application of the density-based topology optimisation approach for the design of heat sinks and micropumps based on natural convection effects. The problems are modelled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equations coupled to the convection-diffusion equation through the Boussinesq approximation. In order to facilitate topology optimisation, the Brinkman approach is taken to penalise velocities inside the solid domain and the effective thermal conductivity is interpolated in order to accommodate differences in thermal conductivity of the solid and fluid phases. The governing equations are discretised using stabilised finite elements and topology optimisation is performed for two different problems using discrete adjoint sensitivity analysis. The study shows that topology optimisation is a viable approach for designing heat sink geometries cooled by natural convection and micropumps powered by natural convection.
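The Brinkman penalisation and conductivity interpolation described above are commonly implemented with RAMP- and SIMP-style functions; a minimal sketch, with illustrative parameter values and the convention that the design field rho = 1 denotes fluid and rho = 0 denotes solid (an assumption for this sketch, not necessarily the paper's convention):

```python
def brinkman_alpha(rho, alpha_max=1e5, alpha_min=0.0, q=0.01):
    """Brinkman inverse permeability: large in the solid (rho = 0)
    so velocities are penalised towards zero, and ~0 in the fluid
    (rho = 1). RAMP-style interpolation; values are illustrative."""
    return alpha_max + (alpha_min - alpha_max) * rho * (1 + q) / (rho + q)

def effective_conductivity(rho, k_solid=10.0, k_fluid=1.0, p=3):
    """Effective thermal conductivity interpolated between the solid
    and fluid phases (SIMP-style power law; values illustrative)."""
    return k_solid + (k_fluid - k_solid) * rho ** p
```

The convexity parameter q controls how sharply intermediate densities are penalised, which steers the optimiser towards crisp solid/fluid designs.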
A supportive architecture for CFD-based design optimisation
NASA Astrophysics Data System (ADS)
Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong
2014-03-01
Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO that requires the analysis of fluid dynamics poses a special challenge because of its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has led to a rise in their application in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, alongside analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. An integrated architecture for CFD-based design optimisation is therefore desirable. However, our review of existing work found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or at the data level to fully utilise the capabilities of the different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation.
To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided; the results show that the proposed architecture and algorithms deal successfully and efficiently with a design optimisation problem involving over 200 design variables.
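The data-level integration of parallel CFD evaluations into an optimisation loop can be sketched as follows; the objective stub, population size and mutation scheme here are hypothetical stand-ins for the paper's solvers and algorithms, with only the problem scale (over 200 design variables) taken from the abstract:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def cfd_objective(design):
    """Stub standing in for a full CFD evaluation (e.g. drag) of a
    high-dimensional design vector (hypothetical objective)."""
    return sum((x - 0.5) ** 2 for x in design)

def evaluate_population(population, workers=4):
    """Integration layer: dispatch independent CFD runs in parallel
    and gather their objective values."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(cfd_objective, population))

def evolve(pop_size=20, dim=200, generations=30, seed=0):
    """Minimal elitist evolutionary loop over dim design variables."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    scores = evaluate_population(pop)
    initial_best = min(scores)
    for _ in range(generations):
        best = pop[scores.index(min(scores))]
        # keep the incumbent (elitism) and mutate around it
        pop = [best] + [[x + rng.gauss(0.0, 0.02) for x in best]
                        for _ in range(pop_size - 1)]
        scores = evaluate_population(pop)
    return initial_best, min(scores)
```

Because each CFD run is independent within a generation, the evaluations parallelise trivially; in a real deployment each call would launch an external solver process rather than a thread-local function.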
Optimisation of SOA-REAMs for hybrid DWDM-TDMA PON applications.
Naughton, Alan; Antony, Cleitus; Ossieur, Peter; Porto, Stefano; Talli, Giuseppe; Townsend, Paul D
2011-12-12
We demonstrate how loss-optimised, gain-saturated SOA-REAM-based reflective modulators can reduce the burst-to-burst power variations caused by differential access loss in the upstream path of carrier-distributed passive optical networks by 18 dB compared with fixed linear-gain modulators. We also show that the loss-optimised device has a high tolerance to input power variations and can operate in deep saturation with minimal patterning penalties. Finally, we demonstrate that an optimised device can operate across the C-band and over a transmission distance of 80 km. © 2011 Optical Society of America
Reactive power planning under high penetration of wind energy using Benders decomposition
Xu, Yan; Wei, Yanli; Fang, Xin; ...
2015-11-05
This study addresses the optimal allocation of reactive power volt-ampere reactive (VAR) sources under the paradigm of high penetration of wind energy. Reactive power planning (RPP) in this condition involves a high level of uncertainty because of the characteristics of wind power. To properly model wind generation uncertainty, a multi-scenario optimal power flow framework is developed that considers the voltage stability constraint under the worst wind scenario and transmission N-1 contingency. The objective of RPP in this study is to minimise the total cost, including the VAR investment cost and the expected generation cost. RPP under this condition is therefore modelled as a two-stage stochastic programming problem: the VAR locations and sizes are optimised in one stage, the fuel cost is minimised in the other, and the global optimal RPP result is found iteratively. Benders decomposition is used to solve this model, with an upper-level problem (master problem) for VAR allocation optimisation and a lower-level problem (sub-problem) for generation cost minimisation. The impact of potential reactive power support from doubly-fed induction generators (DFIGs) is also analysed. Lastly, case studies on the IEEE 14-bus and 118-bus systems are provided to verify the proposed method.
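The Benders loop described above, in which the master problem proposes VAR installations and the subproblem returns an operating cost plus the dual information used to build a cut, can be sketched on a toy example with two candidate buses; all costs here are illustrative stand-ins, not values from the study:

```python
from itertools import product

def subproblem(y):
    """Operation subproblem: given binary VAR installation decisions y,
    return the generation cost and a subgradient (dual information)
    for the Benders cut. Toy closed-form stand-in: each installed VAR
    source lowers the generation cost by a fixed amount."""
    cost = 100.0 - (30.0 * y[0] + 20.0 * y[1])
    grad = (-30.0, -20.0)
    return cost, grad

def benders(invest=(25.0, 25.0), max_iter=10, tol=1e-6):
    """Benders decomposition: the master picks installations y and a
    cost estimate bounded below by the accumulated cuts; the
    subproblem evaluates the true operating cost and adds a cut."""
    cuts, ub, best_y = [], float("inf"), None
    y = (0, 0)
    for _ in range(max_iter):
        cost, grad = subproblem(y)
        total = sum(c * yi for c, yi in zip(invest, y)) + cost
        if total < ub:
            ub, best_y = total, y
        cuts.append((cost, grad, y))

        def theta_lb(cand):
            # lower bound on operating cost implied by all cuts so far
            return max(c + sum(g * (ci - yi)
                               for g, ci, yi in zip(gr, cand, yh))
                       for c, gr, yh in cuts)

        # master problem: enumerate the small binary investment space
        y, lb = min(((cand, sum(i * ci for i, ci in zip(invest, cand))
                      + theta_lb(cand))
                     for cand in product((0, 1), repeat=2)),
                    key=lambda t: t[1])
        if ub - lb < tol:
            break
    return best_y, ub
```

In this toy instance, installing at the first bus alone (investment 25, saving 30) is optimal; because both problems are linear, the upper and lower bounds close after two iterations, whereas a realistic RPP model would need many cuts per wind scenario.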