Sample records for utility functions optimisation

  1. Game Theory of Mind

    PubMed Central

    Yoshida, Wako; Dolan, Ray J.; Friston, Karl J.

    2008-01-01

    This paper introduces a model of ‘theory of mind’, namely, how we represent the intentions and goals of others to optimise our mutual interactions. We draw on ideas from optimum control and game theory to provide a ‘game theory of mind’. First, we consider the representations of goals in terms of value functions that are prescribed by utility or rewards. Critically, the joint value functions and ensuing behaviour are optimised recursively, under the assumption that I represent your value function, your representation of mine, your representation of my representation of yours, and so on ad infinitum. However, if we assume that the degree of recursion is bounded, then players need to estimate the opponent's degree of recursion (i.e., sophistication) to respond optimally. This induces a problem of inferring the opponent's sophistication, given behavioural exchanges. We show it is possible to deduce whether players make inferences about each other and quantify their sophistication on the basis of choices in sequential games. This rests on comparing generative models of choices with, and without, inference. Model comparison is demonstrated using simulated and real data from a ‘stag-hunt’. Finally, we note that exactly the same sophisticated behaviour can be achieved by optimising the utility function itself (through prosocial utility), producing unsophisticated but apparently altruistic agents. This may be relevant ethologically in hierarchical game theory and coevolution. PMID:19112488

  2. Optimal control of LQG problem with an explicit trade-off between mean and variance

    NASA Astrophysics Data System (ADS)

    Qian, Fucai; Xie, Guo; Liu, Ding; Xie, Wenfang

    2011-12-01

    For discrete-time linear-quadratic Gaussian (LQG) control problems, a utility function of the expectation and the variance of the conventional performance index is considered. The utility function is viewed as an overall objective of the system and can perform the optimal trade-off between the mean and the variance of the performance index. The nonlinear utility function is first converted into an auxiliary parameter optimisation problem over the expectation and the variance. Then an optimal closed-loop feedback controller for the nonseparable mean-variance minimisation problem is designed by nonlinear mathematical programming. Finally, simulation results are given to verify the effectiveness of the algorithm developed in this article.
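
    A hedged illustration of the mean-variance trade-off described above (not the controller synthesis of the article): the closed-loop cost of a scalar linear-Gaussian system is sampled by Monte Carlo under a fixed state-feedback gain, and a hypothetical utility combining the mean and the variance of the cost is compared across a grid of gains. The system, horizon and weight lam below are assumptions made purely for illustration; the sketch is written in Python.

      # Illustrative sketch only: Monte Carlo evaluation of a mean-variance utility
      # for a scalar discrete-time linear-Gaussian system under state feedback.
      # The system, horizon and trade-off weight lam are assumptions, not the
      # article's model.
      import numpy as np

      rng = np.random.default_rng(0)
      a, b, q, r = 1.0, 1.0, 1.0, 0.1    # x_{k+1} = a*x_k + b*u_k + w_k, stage cost q*x^2 + r*u^2
      sigma_w, horizon, runs = 0.5, 30, 500
      lam = 0.2                          # weight on the variance of the cost

      def cost_samples(k_gain):
          """Simulate closed-loop runs with u_k = -k_gain * x_k and return total costs."""
          costs = np.empty(runs)
          for i in range(runs):
              x, j = 1.0, 0.0
              for _ in range(horizon):
                  u = -k_gain * x
                  j += q * x**2 + r * u**2
                  x = a * x + b * u + sigma_w * rng.standard_normal()
              costs[i] = j
          return costs

      def utility(k_gain):
          j = cost_samples(k_gain)
          return j.mean() + lam * j.var()   # hypothetical mean-variance utility

      gains = np.linspace(0.1, 1.5, 15)
      best = min(gains, key=utility)
      print(f"best gain on the grid: {best:.2f}, utility: {utility(best):.1f}")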

  3. Geo-information processing service composition for concurrent tasks: A QoS-aware game theory approach

    NASA Astrophysics Data System (ADS)

    Li, Haifeng; Zhu, Qing; Yang, Xiaoxia; Xu, Linrong

    2012-10-01

    Concurrent tasks, such as those found in disaster rapid response, are typical of remote sensing applications. Existing approaches to composing geographical information processing service chains search for an optimal solution for each chain in isolation, in what can be deemed a "selfish" way. This leads to conflicts amongst concurrent tasks and degrades the performance of all service chains. In this study, a non-cooperative game-based mathematical model is proposed to analyse the competitive relationships between tasks. A best-response function is used to ensure that each task optimises its utility by taking the composition strategies of the other tasks into account and quantifying the conflicts between tasks. Based on this, an iterative algorithm that converges to a Nash equilibrium is presented, the aim being to provide good convergence and maximise the utility of all tasks under concurrent-task conditions. Theoretical analyses and experiments showed that the proposed method offers better practical utility across all tasks than existing service composition methods.
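
    For readers unfamiliar with best-response dynamics, the following toy Python sketch iterates best responses between two tasks that each choose one of two services; the latencies and the congestion penalty are invented stand-ins for the paper's QoS model. The loop stops once neither task wants to deviate, i.e. at a Nash equilibrium.

      # Toy best-response iteration for two concurrent tasks choosing between two
      # services; latencies and the congestion penalty are invented for illustration
      # and are not the paper's QoS model.
      base_latency = {"s1": 2.0, "s2": 3.0}
      congestion_penalty = 2.5       # extra latency when both tasks pick the same service

      def cost(my_choice, other_choice):
          return base_latency[my_choice] + (congestion_penalty if my_choice == other_choice else 0.0)

      def best_response(other_choice):
          return min(base_latency, key=lambda s: cost(s, other_choice))

      choices = {"task_a": "s1", "task_b": "s1"}          # start from a conflicting assignment
      for _ in range(10):                                  # sequential best-response updates
          changed = False
          for task, other in (("task_a", "task_b"), ("task_b", "task_a")):
              new_choice = best_response(choices[other])
              if new_choice != choices[task]:
                  choices[task] = new_choice
                  changed = True
          if not changed:                                  # no task wants to deviate: Nash equilibrium
              break

      print(choices)   # e.g. {'task_a': 's2', 'task_b': 's1'}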

  4. AstroBus On-Board Software

    NASA Astrophysics Data System (ADS)

    Biscarros, D.; Cantenot, C.; Séronie-Vivien, J.; Schmidt, G.

    AstroBus on-board software is customisable software for ERC32-based avionics implementing standard ESA Packet Utilization Standard (PUS) functions. Its architecture, based on generic design templates and relying on a library providing the standard PUS telecommand (TC), telemetry (TM) and event services, enhances its reusability across programs. Finally, the AstroBus on-board software development and validation environment is based on latest-generation tools providing an optimised customisation environment.

  5. Analysis of optimisation method for a two-stroke piston ring using the Finite Element Method and the Simulated Annealing Method

    NASA Astrophysics Data System (ADS)

    Kaliszewski, M.; Mazuro, P.

    2016-09-01

    The Simulated Annealing Method is tested for optimisation of the sealing piston ring geometry. The aim of the optimisation is to develop a ring geometry that exerts the demanded pressure on the cylinder while being bent to fit it. A method for FEM analysis of an arbitrary piston ring geometry is implemented in ANSYS. The demanded pressure function (based on formulae presented by A. Iskra) as well as the objective function are introduced. A geometry definition constructed from polynomials in a radial coordinate system is presented and discussed. A possible application of the Simulated Annealing Method to the piston ring optimisation task is proposed and visualised. Difficulties that can lead to a lack of convergence of the optimisation are presented. An example of an unsuccessful optimisation performed in APDL is discussed. A possible line of further improvement of the optimisation is proposed.
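
    A generic simulated annealing loop is sketched below in Python to make the optimisation step concrete; the placeholder objective (squared deviation of a few coefficients from a target) and the cooling schedule are assumptions and do not reproduce the ring-geometry parameterisation or the FEM/APDL model used in the paper.

      # Generic simulated annealing sketch; the objective is a stand-in for a
      # "deviation from the demanded contact pressure" measure, not the paper's
      # FEM-based objective.
      import math
      import random

      random.seed(1)

      def objective(params):
          # Placeholder: squared deviation of a few polynomial coefficients from a target.
          target = [1.0, -0.5, 0.25]
          return sum((p - t) ** 2 for p, t in zip(params, target))

      def neighbour(params, step=0.1):
          return [p + random.uniform(-step, step) for p in params]

      def simulated_annealing(x0, t0=1.0, cooling=0.95, iters=2000):
          x, fx = x0, objective(x0)
          best, fbest = x, fx
          t = t0
          for _ in range(iters):
              cand = neighbour(x)
              fcand = objective(cand)
              # Always accept improvements; accept worse moves with Boltzmann probability.
              if fcand < fx or random.random() < math.exp(-(fcand - fx) / t):
                  x, fx = cand, fcand
                  if fx < fbest:
                      best, fbest = x, fx
              t *= cooling
          return best, fbest

      print(simulated_annealing([0.0, 0.0, 0.0]))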

  6. Optimisation of nano-silica modified self-compacting high-Volume fly ash mortar

    NASA Astrophysics Data System (ADS)

    Achara, Bitrus Emmanuel; Mohammed, Bashar S.; Fadhil Nuruddin, Muhd

    2017-05-01

    The effects of nano-silica content and superplasticizer (SP) dosage on the compressive strength, porosity and slump flow of high-volume fly ash self-consolidating mortar were investigated. A multiobjective optimisation technique using Design-Expert software was applied to obtain a solution based on a desirability function that simultaneously optimises the variables and the responses. A desirability value of 0.811 gave the optimised solution. The experimental and predicted results showed minimal errors in all the measured responses.

  7. Metaheuristic optimisation methods for approximate solving of singular boundary value problems

    NASA Astrophysics Data System (ADS)

    Sadollah, Ali; Yadav, Neha; Gao, Kaizhou; Su, Rong

    2017-07-01

    This paper presents a novel approximation technique based on metaheuristics and a weighted residual function (WRF) for tackling singular boundary value problems (BVPs) arising in engineering and science. With the aid of certain fundamental concepts of mathematics, Fourier series expansion, and metaheuristic optimisation algorithms, singular BVPs can be approximated as an optimisation problem with the boundary conditions as constraints. The target is to minimise the WRF (i.e. the error function) constructed in the approximation of the BVPs. The scheme uses the generational distance metric to evaluate the quality of the approximate solutions against the exact solutions (i.e. an error evaluator metric). Four test problems, including two linear and two non-linear singular BVPs, are considered in this paper to check the efficiency and accuracy of the proposed algorithm. The optimisation task is performed using three different optimisers: particle swarm optimisation, the water cycle algorithm, and the harmony search algorithm. The optimisation results show that the suggested technique can be successfully applied to the approximate solution of singular BVPs.
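
    The weighted-residual idea can be illustrated on a much simpler, non-singular two-point boundary value problem: a sine-series trial function that satisfies the boundary conditions is adjusted by a metaheuristic (differential evolution here, standing in for the three optimisers named above) so that the summed squared residual is minimised. The test equation, series length and optimiser settings are assumptions for illustration only.

      # Hedged sketch of the weighted-residual idea: approximate y'' = -pi^2 sin(pi*x),
      # y(0) = y(1) = 0, with a sine-series trial function and minimise the summed
      # squared residual with a metaheuristic. Exact solution: y = sin(pi*x).
      import numpy as np
      from scipy.optimize import differential_evolution

      x = np.linspace(0.01, 0.99, 50)          # collocation points
      n_terms = 3
      k = np.arange(1, n_terms + 1)

      def residual_norm(c):
          # The trial solution sum_k c_k sin(k*pi*x) satisfies the boundary conditions.
          ypp = -np.sum(c[:, None] * (k[:, None] * np.pi) ** 2 * np.sin(k[:, None] * np.pi * x), axis=0)
          r = ypp + np.pi ** 2 * np.sin(np.pi * x)   # residual of y'' + pi^2 sin(pi*x) = 0
          return float(np.sum(r ** 2))

      res = differential_evolution(residual_norm, bounds=[(-2, 2)] * n_terms, seed=0, tol=1e-8)
      print(np.round(res.x, 3))                # expected to be close to [1, 0, 0]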

  8. Comparison of the genetic algorithm and incremental optimisation routines for a Bayesian inverse modelling based network design

    NASA Astrophysics Data System (ADS)

    Nickless, A.; Rayner, P. J.; Erni, B.; Scholes, R. J.

    2018-05-01

    The design of an optimal network of atmospheric monitoring stations for the observation of carbon dioxide (CO2) concentrations can be obtained by applying an optimisation algorithm to a cost function based on minimising the posterior uncertainty in the CO2 fluxes obtained from a Bayesian inverse modelling solution. Two candidate optimisation methods were assessed: an evolutionary algorithm, the genetic algorithm (GA), and a deterministic algorithm, the incremental optimisation (IO) routine. This paper assessed the ability of the IO routine, in comparison with the more computationally demanding GA routine, to optimise the placement of a five-member network of CO2 monitoring sites located in South Africa. The comparison considered the reduction in uncertainty of the overall flux estimate, the spatial similarity of solutions, and computational requirements. Although the IO routine failed to find the solution with the global maximum uncertainty reduction, the resulting solution had only fractionally lower uncertainty reduction than the GA solution, at only a quarter of the computational resources used by the smallest GA configuration specified. The GA solution set showed more inconsistency when the number of iterations or the population size was small, and more so for a complex prior flux covariance matrix. When the GA completed with a sub-optimal solution, that solution was similar in fitness to the best available solution. Two additional scenarios were considered, with the objective of creating circumstances under which the GA might outperform the IO. The first scenario considered an established network, where the optimisation was required to add five stations to an existing five-member network. In the second scenario the optimisation was based only on the uncertainty reduction within a subregion of the domain. The GA was able to find a better solution than the IO under both scenarios, but with only a marginal improvement in the uncertainty reduction. These results suggest that, for the network design problem, resources would be better spent on improving the prior estimates of the flux uncertainties than on running a complex evolutionary optimisation algorithm. The authors recommend that, if time and computational resources allow, multiple optimisation techniques be used as part of a comprehensive suite of sensitivity tests when performing such an optimisation exercise. This will provide a selection of best solutions which can be ranked based on their utility and practicality.
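
    A toy illustration of the incremental (greedy) optimisation idea, not of the study's transport model: candidate stations are added one at a time, each time choosing the site that most reduces the trace of the Bayesian posterior flux covariance. The sensitivity matrix and covariances below are synthetic assumptions.

      # Greedy (incremental) network design on synthetic data: at each step add the
      # candidate station that most reduces the trace of the Bayesian posterior
      # covariance. H and the covariances are invented, not atmospheric transport.
      import numpy as np

      rng = np.random.default_rng(0)
      n_fluxes, n_candidates, n_pick = 20, 12, 5

      H_all = rng.normal(size=(n_candidates, n_fluxes))   # candidate-site sensitivities
      P0 = np.eye(n_fluxes)                               # prior flux covariance
      obs_var = 0.5                                       # observation error variance

      def posterior_trace(site_indices):
          if not site_indices:
              return float(np.trace(P0))
          H = H_all[list(site_indices)]
          post = np.linalg.inv(np.linalg.inv(P0) + H.T @ H / obs_var)
          return float(np.trace(post))

      chosen = []
      for _ in range(n_pick):
          remaining = [i for i in range(n_candidates) if i not in chosen]
          best = min(remaining, key=lambda i: posterior_trace(chosen + [i]))
          chosen.append(best)

      print("chosen sites:", chosen, "posterior trace:", round(posterior_trace(chosen), 3))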

  9. Basis for the development of sustainable optimisation indicators for activated sludge wastewater treatment plants in the Republic of Ireland.

    PubMed

    Gordon, G T; McCann, B P

    2015-01-01

    This paper describes the basis of a stakeholder-based sustainable optimisation indicator (SOI) system to be developed for small-to-medium sized activated sludge (AS) wastewater treatment plants (WwTPs) in the Republic of Ireland (ROI). Key technical publications relating to best practice plant operation, performance audits and optimisation, and indicator and benchmarking systems for wastewater services are identified. Optimisation studies were developed at a number of Irish AS WwTPs and key findings are presented. A national AS WwTP manager/operator survey was carried out to verify the applied operational findings and identify the key operator stakeholder requirements for this proposed SOI system. It was found that most plants require more consistent operational data-based decision-making, monitoring and communication structures to facilitate optimised, sustainable and continuous performance improvement. The applied optimisation and stakeholder consultation phases form the basis of the proposed stakeholder-based SOI system. This system will allow for continuous monitoring and rating of plant performance, facilitate optimised operation and encourage the prioritisation of performance improvement through tracking key operational metrics. Plant optimisation has become a major focus due to the transfer of all ROI water services to a national water utility from individual local authorities and the implementation of the EU Water Framework Directive.

  10. Group search optimiser-based optimal bidding strategies with no Karush-Kuhn-Tucker optimality conditions

    NASA Astrophysics Data System (ADS)

    Yadav, Naresh Kumar; Kumar, Mukesh; Gupta, S. K.

    2017-03-01

    The general strategic bidding procedure has been formulated in the literature as a bi-level search problem, in which the offer curve tends to minimise the market clearing function and to maximise the profit. Computationally, this is complex; hence, researchers have adopted Karush-Kuhn-Tucker (KKT) optimality conditions to transform the model into a single-level maximisation problem. However, the profit maximisation problem with KKT optimality conditions poses a great challenge to classical optimisation algorithms, and the problem becomes more complex after the inclusion of transmission constraints. This paper simplifies the profit maximisation problem as a minimisation function in which the transmission constraints, the operating limits and the ISO market clearing functions are considered with no KKT optimality conditions. The derived function is solved using the group search optimiser (GSO), a robust population-based optimisation algorithm. Experimental investigation is carried out on the IEEE 14-bus as well as IEEE 30-bus systems and the performance is compared against differential evolution-based, genetic algorithm-based and particle swarm optimisation-based strategic bidding methods. The simulation results demonstrate that the profit obtained through GSO-based bidding strategies is higher than with the other three methods.

  11. Improving Vector Evaluated Particle Swarm Optimisation by Incorporating Nondominated Solutions

    PubMed Central

    Lim, Kian Sheng; Ibrahim, Zuwairie; Buyamin, Salinda; Ahmad, Anita; Naim, Faradila; Ghazali, Kamarul Hawari; Mokhtar, Norrima

    2013-01-01

    The Vector Evaluated Particle Swarm Optimisation algorithm is widely used to solve multiobjective optimisation problems. This algorithm optimises one objective using a swarm of particles whose movements are guided by the best solution found by another swarm. However, the best solution of a swarm is only updated when a newly generated solution has better fitness than the best solution at the objective function optimised by that swarm, yielding poor solutions for the multiobjective optimisation problems. Thus, an improved Vector Evaluated Particle Swarm Optimisation algorithm is introduced by incorporating the nondominated solutions as the guidance for a swarm rather than using the best solution from another swarm. In this paper, the performance of the improved Vector Evaluated Particle Swarm Optimisation algorithm is investigated using performance measures such as the number of nondominated solutions found, the generational distance, the spread, and the hypervolume. The results suggest that the improved Vector Evaluated Particle Swarm Optimisation algorithm has impressive performance compared with the conventional Vector Evaluated Particle Swarm Optimisation algorithm. PMID:23737718

  12. Improving Vector Evaluated Particle Swarm Optimisation by incorporating nondominated solutions.

    PubMed

    Lim, Kian Sheng; Ibrahim, Zuwairie; Buyamin, Salinda; Ahmad, Anita; Naim, Faradila; Ghazali, Kamarul Hawari; Mokhtar, Norrima

    2013-01-01

    The Vector Evaluated Particle Swarm Optimisation algorithm is widely used to solve multiobjective optimisation problems. This algorithm optimises one objective using a swarm of particles whose movements are guided by the best solution found by another swarm. However, the best solution of a swarm is only updated when a newly generated solution has better fitness than the best solution at the objective function optimised by that swarm, yielding poor solutions for the multiobjective optimisation problems. Thus, an improved Vector Evaluated Particle Swarm Optimisation algorithm is introduced by incorporating the nondominated solutions as the guidance for a swarm rather than using the best solution from another swarm. In this paper, the performance of the improved Vector Evaluated Particle Swarm Optimisation algorithm is investigated using performance measures such as the number of nondominated solutions found, the generational distance, the spread, and the hypervolume. The results suggest that the improved Vector Evaluated Particle Swarm Optimisation algorithm has impressive performance compared with the conventional Vector Evaluated Particle Swarm Optimisation algorithm.
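
    The "nondominated solutions" used as guides in the improved algorithm can be illustrated with a minimal Pareto-dominance filter (minimisation assumed); this is generic bookkeeping written for illustration, not the authors' implementation.

      # Minimal Pareto-dominance filter (minimisation): keep the nondominated
      # objective vectors that a nondominated-solution archive would use as guides.
      def dominates(a, b):
          """True if a is no worse than b in every objective and strictly better in one."""
          return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

      def nondominated(points):
          return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

      solutions = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 3.5)]
      print(nondominated(solutions))   # [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)]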

  13. Crystal structure optimisation using an auxiliary equation of state

    NASA Astrophysics Data System (ADS)

    Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T.; Walsh, Aron

    2015-11-01

    Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy-volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other "beyond" density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.
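
    As a hedged illustration of fitting an equation of state to energy-volume data, the sketch below fits a Birch-Murnaghan form to a handful of synthetic points and reads off the equilibrium volume; the data are invented, and the paper's single-point prediction scheme is not reproduced.

      # Fit a Birch-Murnaghan equation of state to synthetic (V, E) points and read
      # off the equilibrium volume. The data are invented for illustration.
      import numpy as np
      from scipy.optimize import curve_fit

      def birch_murnaghan(v, e0, v0, b0, b0p):
          eta = (v0 / v) ** (2.0 / 3.0)
          return e0 + 9.0 * v0 * b0 / 16.0 * ((eta - 1.0) ** 3 * b0p
                                              + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta))

      # Synthetic "calculated" energies around a minimum near V = 40.
      v_data = np.array([36.0, 38.0, 40.0, 42.0, 44.0])
      e_data = birch_murnaghan(v_data, -10.0, 40.0, 0.6, 4.5) \
               + 1e-4 * np.random.default_rng(0).normal(size=v_data.size)

      popt, _ = curve_fit(birch_murnaghan, v_data, e_data, p0=[-10.0, 40.0, 0.5, 4.0])
      print(f"fitted equilibrium volume: {popt[1]:.2f}")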

  14. Multi-Objectivising Combinatorial Optimisation Problems by Means of Elementary Landscape Decompositions.

    PubMed

    Ceberio, Josu; Calvo, Borja; Mendiburu, Alexander; Lozano, Jose A

    2018-02-15

    In the last decade, many works in combinatorial optimisation have shown that, due to the advances in multi-objective optimisation, the algorithms from this field could be used for solving single-objective problems as well. In this sense, a number of papers have proposed multi-objectivising single-objective problems in order to use multi-objective algorithms in their optimisation. In this article, we follow up this idea by presenting a methodology for multi-objectivising combinatorial optimisation problems based on elementary landscape decompositions of their objective function. Under this framework, each of the elementary landscapes obtained from the decomposition is considered as an independent objective function to optimise. In order to illustrate this general methodology, we consider four problems from different domains: the quadratic assignment problem and the linear ordering problem (permutation domain), the 0-1 unconstrained quadratic optimisation problem (binary domain), and the frequency assignment problem (integer domain). We implemented two widely known multi-objective algorithms, NSGA-II and SPEA2, and compared their performance with that of a single-objective GA. The experiments conducted on a large benchmark of instances of the four problems show that the multi-objective algorithms clearly outperform the single-objective approaches. Furthermore, a discussion on the results suggests that the multi-objective space generated by this decomposition enhances the exploration ability, thus permitting NSGA-II and SPEA2 to obtain better results in the majority of the tested instances.

  15. Optimisation study of a vehicle bumper subsystem with fuzzy parameters

    NASA Astrophysics Data System (ADS)

    Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.

    2012-10-01

    This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. The automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and regarding the environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, thereby facilitating automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system level of failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for the uncertainty treatment in crash simulations. This process is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).

  16. Strategic optimisation of microgrid by evolving a unitised regenerative fuel cell system operational criterion

    NASA Astrophysics Data System (ADS)

    Bhansali, Gaurav; Singh, Bhanu Pratap; Kumar, Rajesh

    2016-09-01

    In this paper, the problem of microgrid optimisation with storage is addressed in a broader way than loss minimisation alone. Unitised regenerative fuel cell (URFC) systems have been studied and employed in microgrids to store energy and feed it back into the system when required. A value function dependent on line losses, URFC system operational cost and the energy stored at the end of the day is defined here. The function is highly complex, nonlinear and multidimensional in nature. Therefore, heuristic optimisation techniques in combination with load flow analysis are used here to resolve the network and time-domain complexity of the problem. Particle swarm optimisation with the forward/backward sweep algorithm ensures optimal operation of the microgrid, thereby minimising its operational cost. Results are shown and are found to improve consistently with the evolution of the solution strategy.

  17. Natural Erosion of Sandstone as Shape Optimisation.

    PubMed

    Ostanin, Igor; Safonov, Alexander; Oseledets, Ivan

    2017-12-11

    Natural arches, pillars and other exotic sandstone formations have always attracted attention for their unusual shapes and the amazing mechanical balance that leaves a strong impression of intelligent design rather than of the result of a stochastic process. It has recently been demonstrated that these shapes could have been the result of the negative feedback between stress and erosion that originates in fundamental laws of friction between the rock's constituent particles. Here we present a deeper analysis of this idea and bridge it with the approaches utilized in shape and topology optimisation. It appears that the processes of natural erosion, driven by stochastic surface forces and the Mohr-Coulomb law of dry friction, can be viewed within the framework of local optimisation for minimum elastic strain energy. Our hypothesis is confirmed by numerical simulations of the erosion using a topological-shape optimisation model. Our work contributes to a better understanding of stochastic erosion and of the feasible landscape formations that could be found on Earth and beyond.

  18. Hybrid insulation coordination and optimisation for 1 MV operation of pulsed electron accelerator KALI-30GW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Senthil, K.; Mitra, S.; Sandeep, S., E-mail: sentilk@barc.gov.in

    In a multi-gigawatt pulsed power system like KALI-30 GW, insulation coordination is required to achieve high voltages ranging from 0.3 MV to 1 MV. At the same time, optimisation of the insulation parameters is required to minimize the inductance of the system, so that nanosecond output can be achieved. The KALI-30GW pulsed power system utilizes a combination of Perspex, delrin, epoxy, transformer oil, nitrogen/SF6 gas and vacuum insulation at its various stages in compressing a DC high voltage into a nanosecond pulse. This paper describes the operation and performance of the system from 400 kV to 1030 kV output voltage pulse and the insulation parameters utilized for obtaining the peak 1 MV output.

  19. Pre-operative optimisation of lung function

    PubMed Central

    Azhar, Naheed

    2015-01-01

    The anaesthetic management of patients with pre-existing pulmonary disease is a challenging task. It is associated with increased morbidity in the form of post-operative pulmonary complications. Pre-operative optimisation of lung function helps in reducing these complications. Patients are advised to stop smoking for a period of 4–6 weeks. This reduces airway reactivity, improves mucociliary function and decreases carboxy-haemoglobin. The widely used incentive spirometry may be useful only when combined with other respiratory muscle exercises. Volume-based inspiratory devices have the best results. Pharmacotherapy of asthma and chronic obstructive pulmonary disease must be optimised before considering the patient for elective surgery. Beta-2 agonists, inhaled corticosteroids and systemic corticosteroids are the main drugs used for this, and several other drugs play an adjunctive role in medical therapy. A graded approach has been suggested to manage these patients for elective surgery with the aim of achieving optimal pulmonary function. PMID:26556913

  20. Automated model optimisation using the Cylc workflow engine (Cyclops v1.0)

    NASA Astrophysics Data System (ADS)

    Gorman, Richard M.; Oliver, Hilary J.

    2018-06-01

    Most geophysical models include many parameters that are not fully determined by theory, and these can be tuned to improve the model's agreement with available data. We might attempt to automate this tuning process in an objective way by employing an optimisation algorithm to find the set of parameters that minimises a cost function derived from comparing model outputs with measurements. A number of algorithms are available for solving optimisation problems, in various programming languages, but interfacing such software to a complex geophysical model simulation presents certain challenges. To tackle this problem, we have developed an optimisation suite (Cyclops) based on the Cylc workflow engine that implements a wide selection of optimisation algorithms from the NLopt Python toolbox (Johnson, 2014). The Cyclops optimisation suite can be used to calibrate any modelling system that has itself been implemented as a (separate) Cylc model suite, provided it includes computation and output of the desired scalar cost function. A growing number of institutions are using Cylc to orchestrate complex distributed suites of interdependent cycling tasks within their operational forecast systems, and in such cases application of the optimisation suite is particularly straightforward. As a test case, we applied Cyclops to calibrate a global implementation of the WAVEWATCH III (v4.18) third-generation spectral wave model, forced by ERA-Interim input fields. The model was calibrated over a 1-year period (1997) before the calibrated configuration was applied to a full (1979-2016) wave hindcast. The chosen error metric was the spatial average of the root mean square error of hindcast significant wave height compared with collocated altimeter records. We describe the results of a calibration in which up to 19 parameters were optimised.
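
    A minimal stand-in for the optimisation step only: a scalar cost (here a quadratic placeholder imitating the hindcast error metric) is handed to a derivative-free NLopt algorithm from Python. In Cyclops the cost comes from a full Cylc model-suite run; the bounds, starting point and algorithm choice below are assumptions.

      # Minimal stand-in: pass a scalar cost to an NLopt derivative-free algorithm.
      # The quadratic "cost" imitates the scalar output of a model-suite run.
      import nlopt
      import numpy as np

      def run_model_and_score(params):
          # Placeholder for "run the wave model with these parameters and return
          # the spatially averaged RMSE against altimeter data".
          target = np.array([0.8, 1.2])
          return float(np.sum((np.asarray(params) - target) ** 2))

      def objective(x, grad):            # NLopt passes a gradient array even for
          return run_model_and_score(x)  # derivative-free algorithms; unused here.

      opt = nlopt.opt(nlopt.LN_SBPLX, 2)         # Subplex, a gradient-free choice
      opt.set_min_objective(objective)
      opt.set_lower_bounds([0.0, 0.0])
      opt.set_upper_bounds([2.0, 2.0])
      opt.set_xtol_rel(1e-4)
      x_best = opt.optimize([1.0, 1.0])
      print("calibrated parameters:", np.round(x_best, 3), "cost:", opt.last_optimum_value())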

  1. Optimisation by hierarchical search

    NASA Astrophysics Data System (ADS)

    Zintchenko, Ilia; Hastings, Matthew; Troyer, Matthias

    2015-03-01

    Finding optimal values for a set of variables relative to a cost function gives rise to some of the hardest problems in physics, computer science and applied mathematics. Although often very simple in their formulation, these problems have a complex cost function landscape which prevents currently known algorithms from efficiently finding the global optimum. Countless techniques have been proposed to partially circumvent this problem, but an efficient method is yet to be found. We present a heuristic, general purpose approach to potentially improve the performance of conventional algorithms or special purpose hardware devices by optimising groups of variables in a hierarchical way. We apply this approach to problems in combinatorial optimisation, machine learning and other fields.

  2. Optimality principles in the regulation of metabolic networks.

    PubMed

    Berkhout, Jan; Bruggeman, Frank J; Teusink, Bas

    2012-08-29

    One of the challenging tasks in systems biology is to understand how molecular networks give rise to emergent functionality and whether universal design principles apply to molecular networks. To achieve this, the biophysical, evolutionary and physiological constraints that act on those networks need to be identified in addition to the characterisation of the molecular components and interactions. Then, the cellular "task" of the network-its function-should be identified. A network contributes to organismal fitness through its function. The premise is that the same functions are often implemented in different organisms by the same type of network; hence, the concept of design principles. In biology, due to the strong forces of selective pressure and natural selection, network functions can often be understood as the outcome of fitness optimisation. The hypothesis of fitness optimisation to understand the design of a network has proven to be a powerful strategy. Here, we outline the use of several optimisation principles applied to biological networks, with an emphasis on metabolic regulatory networks. We discuss the different objective functions and constraints that are considered and the kind of understanding that they provide.

  3. A novel global Harmony Search method based on Ant Colony Optimisation algorithm

    NASA Astrophysics Data System (ADS)

    Fouad, Allouani; Boukhetala, Djamel; Boudjema, Fares; Zenger, Kai; Gao, Xiao-Zhi

    2016-03-01

    The Global-best Harmony Search (GHS) is a recently developed stochastic optimisation algorithm that hybridises the Harmony Search (HS) method with the concept of swarm intelligence from particle swarm optimisation (PSO) to enhance its performance. In this article, a new optimisation algorithm called GHSACO is developed by incorporating the GHS with the Ant Colony Optimisation algorithm (ACO). Our method introduces a novel improvisation process, which differs from that of the GHS in the following aspects: (i) a modified harmony memory (HM) representation and conception; (ii) the use of a global random switching mechanism to monitor the choice between the ACO and GHS; and (iii) an additional memory consideration selection rule using the ACO random proportional transition rule with a pheromone trail update mechanism. The proposed GHSACO algorithm has been applied to various benchmark functions and constrained optimisation problems. Simulation results demonstrate that it can find significantly better solutions when compared with the original HS and some of its variants.

  4. A new effective operator for the hybrid algorithm for solving global optimisation problems

    NASA Astrophysics Data System (ADS)

    Duc, Le Anh; Li, Kenli; Nguyen, Tien Trong; Yen, Vu Minh; Truong, Tung Khac

    2018-04-01

    Hybrid algorithms have recently been used to solve complex single-objective optimisation problems. The ultimate goal is to find an optimised global solution by using these algorithms. Based on the existing algorithms (HP_CRO, PSO, RCCRO), this study proposes a new hybrid algorithm called MPC (Mean-PSO-CRO), which utilises a new Mean-Search Operator. By employing this new operator, the proposed algorithm improves the search ability in areas of the solution space that the operators of previous algorithms do not explore. Specifically, the Mean-Search Operator helps find better solutions than the other algorithms. Moreover, two parameters are proposed for balancing local and global search, as well as between various types of local search. In addition, three versions of this operator, which use different constraints, are introduced. The experimental results on 23 benchmark functions, which were used in previous works, show that our framework can find better optimal or close-to-optimal solutions with faster convergence speed for most of the benchmark functions, especially the high-dimensional functions. Thus, the proposed algorithm is more effective in solving single-objective optimisation problems than the other existing algorithms.
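
    One plausible reading of a mean-search style operator, offered purely as an illustration and not as the MPC algorithm itself: propose a candidate near the mean of the current elite solutions and keep it if it beats the worst member of the population on a benchmark sphere function. The population size, step size and benchmark are assumptions.

      # Toy interpretation of a mean-search style operator on the sphere benchmark:
      # propose a candidate around the elite mean and replace the worst member if
      # the candidate improves on it. Not the MPC (Mean-PSO-CRO) algorithm.
      import numpy as np

      rng = np.random.default_rng(2)
      dim, pop_size, elites, iters = 10, 30, 5, 300

      def sphere(x):
          return float(np.sum(x ** 2))

      pop = rng.uniform(-5, 5, size=(pop_size, dim))
      fitness = np.array([sphere(x) for x in pop])

      for _ in range(iters):
          order = np.argsort(fitness)
          elite_mean = pop[order[:elites]].mean(axis=0)
          candidate = elite_mean + rng.normal(scale=0.1, size=dim)   # mean-search proposal
          f_cand = sphere(candidate)
          worst = order[-1]
          if f_cand < fitness[worst]:                                # replace the worst member
              pop[worst], fitness[worst] = candidate, f_cand

      print("best fitness after mean-search iterations:", round(float(fitness.min()), 4))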

  5. A Method for Decentralised Optimisation in Networks

    NASA Astrophysics Data System (ADS)

    Saramäki, Jari

    2005-06-01

    We outline a method for distributed Monte Carlo optimisation of computational problems in networks of agents, such as peer-to-peer networks of computers. The optimisation and messaging procedures are inspired by gossip protocols and epidemic data dissemination, and are decentralised, i.e. no central overseer is required. In the outlined method, each agent follows simple local rules and seeks for better solutions to the optimisation problem by Monte Carlo trials, as well as by querying other agents in its local neighbourhood. With proper network topology, good solutions spread rapidly through the network for further improvement. Furthermore, the system retains its functionality even in realistic settings where agents are randomly switched on and off.
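
    A toy version of the decentralised scheme described above: agents on a ring each hold a candidate solution, improve it with local Monte Carlo trials, and occasionally adopt a better solution from a queried neighbour, with no central overseer. The one-dimensional objective and the network size are placeholders.

      # Toy decentralised optimisation on a ring of agents: local Monte Carlo trials
      # plus gossip-style copying of a better neighbouring solution.
      import random

      random.seed(3)
      n_agents, rounds = 20, 200

      def objective(x):
          return (x - 3.0) ** 2          # stand-in problem: minimise a parabola

      solutions = [random.uniform(-10, 10) for _ in range(n_agents)]

      for _ in range(rounds):
          for i in range(n_agents):
              # Local Monte Carlo trial.
              trial = solutions[i] + random.gauss(0.0, 0.5)
              if objective(trial) < objective(solutions[i]):
                  solutions[i] = trial
              # Gossip step: query one ring neighbour and adopt a better solution.
              neighbour = (i + random.choice([-1, 1])) % n_agents
              if objective(solutions[neighbour]) < objective(solutions[i]):
                  solutions[i] = solutions[neighbour]

      print("best solution found:", round(min(solutions, key=objective), 3))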

  6. Semantic distance as a critical factor in icon design for in-car infotainment systems.

    PubMed

    Silvennoinen, Johanna M; Kujala, Tuomo; Jokinen, Jussi P P

    2017-11-01

    In-car infotainment systems require icons that enable fluent cognitive information processing and safe interaction while driving. An important issue is how to find an optimised set of icons for different functions in terms of semantic distance. In an optimised icon set, every icon needs to be semantically as close as possible to the function it visually represents and semantically as far as possible from the other functions represented concurrently. In three experiments (N = 21 each), semantic distances of 19 icons to four menu functions were studied with preference rankings, verbal protocols, and the primed product comparisons method. The results show that the primed product comparisons method can be efficiently utilised for finding an optimised set of icons for time-critical applications out of a larger set of icons. The findings indicate the benefits of the novel methodological perspective into the icon design for safety-critical contexts in general.

  7. Optimality Principles in the Regulation of Metabolic Networks

    PubMed Central

    Berkhout, Jan; Bruggeman, Frank J.; Teusink, Bas

    2012-01-01

    One of the challenging tasks in systems biology is to understand how molecular networks give rise to emergent functionality and whether universal design principles apply to molecular networks. To achieve this, the biophysical, evolutionary and physiological constraints that act on those networks need to be identified in addition to the characterisation of the molecular components and interactions. Then, the cellular “task” of the network—its function—should be identified. A network contributes to organismal fitness through its function. The premise is that the same functions are often implemented in different organisms by the same type of network; hence, the concept of design principles. In biology, due to the strong forces of selective pressure and natural selection, network functions can often be understood as the outcome of fitness optimisation. The hypothesis of fitness optimisation to understand the design of a network has proven to be a powerful strategy. Here, we outline the use of several optimisation principles applied to biological networks, with an emphasis on metabolic regulatory networks. We discuss the different objective functions and constraints that are considered and the kind of understanding that they provide. PMID:24957646

  8. Distributed optimisation problem with communication delay and external disturbance

    NASA Astrophysics Data System (ADS)

    Tran, Ngoc-Tu; Xiao, Jiang-Wen; Wang, Yan-Wu; Yang, Wu

    2017-12-01

    This paper investigates the distributed optimisation problem for multi-agent systems (MASs) in the simultaneous presence of an external disturbance and a communication delay. To solve this problem, a two-step design scheme is introduced. In the first step, based on the internal model principle, an internal model term is constructed to compensate for the disturbance asymptotically. In the second step, a distributed optimisation algorithm is designed to solve the distributed optimisation problem for the MASs in the simultaneous presence of disturbance and communication delay. Moreover, in the proposed algorithm, each agent interacts with its neighbours through the connected topology and the delay occurs during the information exchange. By utilising a Lyapunov-Krasovskii functional, delay-dependent conditions are derived for both slowly and fast time-varying delays to ensure the convergence of the algorithm to the optimal solution of the optimisation problem. Several numerical simulation examples are provided to illustrate the effectiveness of the theoretical results.

  9. Using social media to facilitate knowledge transfer in complex engineering environments: a primer for educators

    NASA Astrophysics Data System (ADS)

    Murphy, Glen; Salomone, Sonia

    2013-03-01

    While highly cohesive groups are potentially advantageous, they are also often correlated with the emergence of knowledge and information silos based around those same functional or occupational clusters. Consequently, an essential challenge for engineering organisations wishing to overcome informational silos is to implement mechanisms that facilitate, encourage and sustain interactions between otherwise disconnected groups. This paper acts as a primer for those seeking to gain an understanding of the design, functionality and utility of a suite of software tools generically termed social media technologies in the context of optimising the management of tacit engineering knowledge. Underpinned by knowledge management theory and using detailed case examples, this paper explores how social media technologies achieve such goals, allowing for the transfer of knowledge by tapping into the tacit and explicit knowledge of disparate groups in complex engineering environments.

  10. Promise and pitfalls of molecular markers of thyroid nodules

    PubMed Central

    Jadhav, S.; Lila, Anurag; Bandgar, Tushar; Shah, Nalini

    2012-01-01

    Thyroid nodules are common in the general population, with a prevalence of 5-7%. The initial evaluation of thyroid nodules commonly involves thyroid function tests, an ultrasound (USG) and fine needle aspiration biopsy (FNAB). The optimal management of patients with thyroid nodules with indeterminate cytology is plagued by the lack of highly sensitive and specific diagnostic modalities. In this article we attempt to review the available literature on the molecular markers which are increasingly being studied for their diagnostic utility in assessing thyroid nodules. The various molecular markers consist of gene mutations, gene rearrangements, RNA-based assays and immunohistochemical markers. These molecular markers would definitely help to optimise the management of such patients. PMID:23565369

  11. 3D printed fluidics with embedded analytic functionality for automated reaction optimisation

    PubMed Central

    Capel, Andrew J; Wright, Andrew; Harding, Matthew J; Weaver, George W; Li, Yuqi; Harris, Russell A; Edmondson, Steve; Goodridge, Ruth D

    2017-01-01

    Additive manufacturing or ‘3D printing’ is being developed as a novel manufacturing process for the production of bespoke micro- and milliscale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multifunctional fluidic devices with embedded reaction monitoring capability. The selectively laser melted parts are the first published examples of multifunctional 3D printed metal fluidic devices. These devices allow high temperature and pressure chemistry to be performed in solvent systems destructive to the majority of devices manufactured via stereolithography, polymer jetting and fused deposition modelling processes previously utilised for this application. These devices were integrated with commercially available flow chemistry, chromatographic and spectroscopic analysis equipment, allowing automated online and inline optimisation of the reaction medium. This set-up allowed the optimisation of two reactions, a ketone functional group interconversion and a fused polycyclic heterocycle formation, via spectroscopic and chromatographic analysis. PMID:28228852

  12. Genetic algorithm-based improved DOA estimation using fourth-order cumulants

    NASA Astrophysics Data System (ADS)

    Ahmed, Ammar; Tufail, Muhammad

    2017-05-01

    Genetic algorithm (GA)-based direction of arrival (DOA) estimation is proposed using fourth-order cumulants (FOC) and the ESPRIT principle, which results in the Multiple Invariance Cumulant ESPRIT algorithm. In existing FOC ESPRIT formulations, only one invariance is utilised to estimate DOAs. The unused multiple invariances (MIs) must be exploited simultaneously in order to improve the estimation accuracy. In this paper, a fitness function based on a carefully designed cumulant matrix is developed which incorporates the MIs present in the sensor array. Better DOA estimation can be achieved by minimising this fitness function. Moreover, the effectiveness of Newton's method as well as the GA for this optimisation problem is illustrated. Simulation results show that the proposed algorithm provides improved estimation accuracy compared with existing algorithms, especially in the case of low SNR, a small number of snapshots, closely spaced sources and high signal and noise correlation. Moreover, it is observed that optimisation using Newton's method is more likely to converge to false local optima, resulting in erroneous results. However, GA-based optimisation has been found attractive due to its global optimisation capability.

  13. On some properties of bone functional adaptation phenomenon useful in mechanical design.

    PubMed

    Nowak, Michał

    2010-01-01

    The paper discusses some unique properties of the trabecular bone functional adaptation phenomenon that are useful in mechanical design. On the basis of observations of the biological process and the principle of constant strain energy density on the surface of the structure, a generic structural optimisation system has been developed. Such an approach allows the mechanical theorem for the stiffest design to be fulfilled, comprising size, shape and topology optimisation, using concepts known from biomechanical studies. The biomimetic solution of multiple-load problems is also presented.

  14. Consideration of plant behaviour in optimal servo-compensator design

    NASA Astrophysics Data System (ADS)

    Moase, W. H.; Manzie, C.

    2016-07-01

    Where the most prevalent optimal servo-compensator formulations penalise the behaviour of an error system, this paper considers the problem of additionally penalising the actual states and inputs of the plant. Doing so has the advantage of enabling the penalty function to better resemble an economic cost. This is especially true of problems where control effort needs to be sensibly allocated across weakly redundant inputs or where one wishes to use penalties to soft-constrain certain states or inputs. It is shown that, although the resulting cost function grows unbounded as its horizon approaches infinity, it is possible to formulate an equivalent optimisation problem with a bounded cost. The resulting optimisation problem is similar to those in earlier studies but has an additional 'correction term' in the cost function, and a set of equality constraints that arise when there are redundant inputs. A numerical approach to solve the resulting optimisation problem is presented, followed by simulations on a micro-macro positioner that illustrate the benefits of the proposed servo-compensator design approach.

  15. A Bayesian Approach for Sensor Optimisation in Impact Identification

    PubMed Central

    Mallardo, Vincenzo; Sharif Khodaei, Zahra; Aliabadi, Ferri M. H.

    2016-01-01

    This paper presents a Bayesian approach for optimizing the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on the genetic algorithm is proposed to find the best sensor combination for locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM) system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both uniform and non-uniform probabilities of impact occurrence. PMID:28774064

  16. Structural-electrical coupling optimisation for radiating and scattering performances of active phased array antenna

    NASA Astrophysics Data System (ADS)

    Wang, Congsi; Wang, Yan; Wang, Zhihai; Wang, Meng; Yuan, Shuai; Wang, Weifeng

    2018-04-01

    It is well known that calculating and reducing the radar cross section (RCS) of an active phased array antenna (APAA) is both difficult and complicated, and balancing radiating and scattering performance while the RCS is reduced remains an unresolved problem. Therefore, this paper develops a coupled structure and scattering array factor model of the APAA based on the phase errors of the radiated elements generated by structural distortion and installation errors of the array. To obtain optimal radiating and scattering performance, an integrated optimisation model is built to optimise the installation height of all radiated elements in the normal direction of the array, in which the particle swarm optimisation method is adopted and the gain loss and scattering array factor are selected as the fitness function. Simulations indicate that the proposed coupling model and integrated optimisation method can effectively decrease the RCS while the necessary radiating performance is simultaneously guaranteed, which demonstrates an important application value in the engineering design and structural evaluation of APAAs.

  17. A Cost-Utility Analysis of Prostate Cancer Screening in Australia.

    PubMed

    Keller, Andrew; Gericke, Christian; Whitty, Jennifer A; Yaxley, John; Kua, Boon; Coughlin, Geoff; Gianduzzo, Troy

    2017-02-01

    The Göteborg randomised population-based prostate cancer screening trial demonstrated that prostate-specific antigen (PSA)-based screening reduces prostate cancer deaths compared with an age-matched control group. Utilising the prostate cancer detection rates from this study, we investigated the clinical and cost effectiveness of a similar PSA-based screening strategy for an Australian population of men aged 50-69 years. A decision model that incorporated Markov processes was developed from a health system perspective. The base-case scenario compared a population-based screening programme with current opportunistic screening practices. Costs, utility values, treatment patterns and background mortality rates were derived from Australian data. All costs were adjusted to reflect July 2015 Australian dollars (A$). An alternative scenario compared systematic with opportunistic screening but with optimisation of active surveillance (AS) uptake in both groups. A discount rate of 5 % for costs and benefits was utilised. Univariate and probabilistic sensitivity analyses were performed to assess the effect of variable uncertainty on model outcomes. Our model very closely replicated the number of deaths from both prostate cancer and background mortality in the Göteborg study. The incremental cost per quality-adjusted life-year (QALY) for PSA screening was A$147,528. However, for years of life gained (LYGs), PSA-based screening (A$45,890/LYG) appeared more favourable. Our alternative scenario with optimised AS improved cost utility to A$45,881/QALY, with screening becoming cost effective at a 92 % AS uptake rate. Both modelled scenarios were most sensitive to the utility of patients before and after intervention, and the discount rate used. PSA-based screening is not cost effective compared with Australia's assumed willingness-to-pay threshold of A$50,000/QALY. It appears more cost effective if LYGs are used as the relevant outcome, and is more cost effective than the established Australian breast cancer screening programme on this basis. Optimised utilisation of AS increases the cost effectiveness of prostate cancer screening dramatically.
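
    The headline figures above are ratios of discounted incremental costs to discounted incremental QALYs. The short Python sketch below shows that arithmetic only; the 5% discount rate matches the study, but the per-year costs and utilities are invented and do not come from the study's Markov model.

      # Worked arithmetic only: incremental cost-effectiveness ratio (ICER) from
      # discounted costs and QALYs of two strategies. All yearly numbers are invented.
      def discounted(values, rate=0.05):
          return sum(v / (1 + rate) ** t for t, v in enumerate(values))

      # Per-person yearly costs (A$) and QALYs for an invented "screening" and
      # "no screening" strategy over five years.
      cost_screen, cost_none = [300, 120, 120, 120, 120], [50, 60, 80, 150, 200]
      qaly_screen, qaly_none = [0.92, 0.91, 0.90, 0.90, 0.89], [0.92, 0.90, 0.88, 0.86, 0.85]

      d_cost = discounted(cost_screen) - discounted(cost_none)
      d_qaly = discounted(qaly_screen) - discounted(qaly_none)
      print(f"ICER: A${d_cost / d_qaly:,.0f} per QALY gained")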

  18. Multiobjective optimisation of bogie suspension to boost speed on curves

    NASA Astrophysics Data System (ADS)

    Milad Mousavi-Bideleh, Seyed; Berbyuk, Viktor

    2016-01-01

    To improve safety and the maximum admissible speed in different operational scenarios, multiobjective optimisation of the bogie suspension components of a one-car railway vehicle model is considered. The vehicle model has 50 degrees of freedom and is developed in the multibody dynamics software SIMPACK. Track shift force, running stability, and risk of derailment are selected as safety objective functions. The improved maximum admissible speeds of the vehicle on curves are determined based on track plane accelerations up to 1.5 m/s2. To reduce the number of design parameters for optimisation and improve the computational efficiency, a global sensitivity analysis is accomplished using the multiplicative dimensional reduction method (M-DRM). A multistep optimisation routine based on a genetic algorithm (GA) and MATLAB/SIMPACK co-simulation is executed at three levels. The conventional secondary and primary suspension components of the bogie are chosen as the design parameters in the first two steps, respectively. In the last step, semi-active suspension is in focus: the input electrical current to magnetorheological yaw dampers is optimised to guarantee an appropriate safety level. Semi-active controllers are also applied and their effects on bogie dynamics are explored. The safety Pareto-optimised results are compared with those associated with in-service values. The global sensitivity analysis and multistep approach significantly reduced the number of design parameters and improved the computational efficiency of the optimisation. Furthermore, using the optimised values of the design parameters makes it possible to run the vehicle up to 13% faster on curves while a satisfactory safety level is guaranteed. The results obtained can be used in Pareto optimisation and active bogie suspension design problems.

  19. Development of the hard and soft constraints based optimisation model for unit sizing of the hybrid renewable energy system designed for microgrid applications

    NASA Astrophysics Data System (ADS)

    Sundaramoorthy, Kumaravel

    2017-02-01

    Electricity generation based on hybrid energy systems (HESs) has become a more attractive solution for rural electrification. Economically feasible and technically reliable HESs rest on a sound optimisation stage. This article discusses an optimal unit-sizing model whose objective function is to minimise the total cost of the HES. Three typical rural sites from the southern part of India have been selected for the application of the developed optimisation methodology. Feasibility studies and a sensitivity analysis of the optimal HES are discussed in detail in this article. A comparison has been carried out with the Hybrid Optimization Model for Electric Renewables (HOMER) for the three sites. The optimal HES is found to have a lower total net present cost and cost of energy than that of the existing method.

  20. Designing synthetic networks in silico: a generalised evolutionary algorithm approach.

    PubMed

    Smith, Robert W; van Sluijs, Bob; Fleck, Christian

    2017-12-02

    Evolution has led to the development of biological networks that are shaped by environmental signals. Elucidating, understanding and then reconstructing important network motifs is one of the principal aims of Systems & Synthetic Biology. Consequently, previous research has focused on finding optimal network structures and reaction rates that respond to pulses or produce stable oscillations. In this work we present a generalised in silico evolutionary algorithm that simultaneously finds network structures and reaction rates (genotypes) that can satisfy multiple defined objectives (phenotypes). The key step in our approach is to translate a schema/binary-based description of biological networks into systems of ordinary differential equations (ODEs). The ODEs can then be solved numerically to provide dynamic information about an evolved network's functionality. Initially we benchmark algorithm performance by finding optimal networks that can recapitulate concentration time-series data and by performing parameter optimisation on the oscillatory dynamics of the Repressilator. We go on to show the utility of our algorithm by finding new designs for robust synthetic oscillators, and by performing multi-objective optimisation to find a set of oscillators and feed-forward loops that are optimal at balancing different system properties. In sum, our results not only confirm and build on previous observations but also provide new designs of synthetic oscillators for experimental construction. In this work we have presented and tested an evolutionary algorithm that can design a biological network to produce a desired output. Given that previous designs of synthetic networks have been limited to subregions of network- and parameter-space, the use of our evolutionary optimisation algorithm will enable Synthetic Biologists to construct new systems with the potential to display a wider range of complex responses.

  1. Dwell time-based stabilisation of switched delay systems using free-weighting matrices

    NASA Astrophysics Data System (ADS)

    Koru, Ahmet Taha; Delibaşı, Akın; Özbay, Hitay

    2018-01-01

    In this paper, we present a quasi-convex optimisation method to minimise an upper bound on the dwell time for stability of switched delay systems. Piecewise Lyapunov-Krasovskii functionals are introduced, and the upper bound for the derivative of the Lyapunov functionals is estimated by the free-weighting matrices method to investigate the non-switching stability of each candidate subsystem. Then, a sufficient condition on the dwell time is derived to guarantee the asymptotic stability of the switched delay system. Once these conditions are represented by a set of linear matrix inequalities, the dwell time optimisation problem can be formulated as a standard quasi-convex optimisation problem. Numerical examples are given to illustrate the improvements over previously obtained dwell time bounds. Using the results obtained in the stability case, we present a nonlinear minimisation algorithm to synthesise dwell-time-minimising controllers. The algorithm solves the problem with successive linearisation of the nonlinear conditions.

  2. Quadratic Optimisation with One Quadratic Equality Constraint

    DTIC Science & Technology

    2010-06-01

    This report presents a theoretical framework for minimising a quadratic objective function subject to a quadratic equality constraint. The first part of the report gives a detailed algorithm which computes the global minimiser without calling special nonlinear optimisation solvers. The second part of the report shows how the developed theory can be applied to solve the time of arrival geolocation problem.
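
    As a purely illustrative complement (the report itself derives a dedicated global algorithm rather than relying on generic solvers), the snippet below merely states the problem class numerically - a quadratic objective with a single quadratic equality constraint - and hands it to a general-purpose constrained optimiser; the matrices are made-up example data.

    ```python
    import numpy as np
    from scipy.optimize import minimize, NonlinearConstraint

    # Hypothetical instance: minimise 0.5 x'Ax + b'x  subject to  x'Cx = 1.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, -2.0])
    C = np.eye(2)

    objective = lambda x: 0.5 * x @ A @ x + b @ x
    constraint = NonlinearConstraint(lambda x: x @ C @ x, 1.0, 1.0)   # equality: lb == ub

    res = minimize(objective, x0=np.array([1.0, 0.0]),
                   constraints=[constraint], method="trust-constr")
    print(res.x, objective(res.x))   # a local solution; the report targets the global minimiser
    ```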

  3. Genome and epigenome engineering CRISPR toolkit for in vivo modulation of cis-regulatory interactions and gene expression in the chicken embryo.

    PubMed

    Williams, Ruth M; Senanayake, Upeka; Artibani, Mara; Taylor, Gunes; Wells, Daniel; Ahmed, Ahmed Ashour; Sauka-Spengler, Tatjana

    2018-02-23

    CRISPR/Cas9 genome engineering has revolutionised all aspects of biological research, with epigenome engineering transforming gene regulation studies. Here, we present an optimised, adaptable toolkit enabling genome and epigenome engineering in the chicken embryo, and demonstrate its utility by probing gene regulatory interactions mediated by neural crest enhancers. First, we optimise novel efficient guide-RNA mini expression vectors utilising chick U6 promoters, provide a strategy for rapid somatic gene knockout and establish a protocol for evaluation of mutational penetrance by targeted next-generation sequencing. We show that CRISPR/Cas9-mediated disruption of transcription factors causes a reduction in their cognate enhancer-driven reporter activity. Next, we assess endogenous enhancer function using both enhancer deletion and nuclease-deficient Cas9 (dCas9) effector fusions to modulate enhancer chromatin landscape, thus providing the first report of epigenome engineering in a developing embryo. Finally, we use the synergistic activation mediator (SAM) system to activate an endogenous target promoter. The novel genome and epigenome engineering toolkit developed here enables manipulation of endogenous gene expression and enhancer activity in chicken embryos, facilitating high-resolution analysis of gene regulatory interactions in vivo. © 2018. Published by The Company of Biologists Ltd.

  4. Optimal control of Formula One car energy recovery systems

    NASA Astrophysics Data System (ADS)

    Limebeer, D. J. N.; Perantoni, G.; Rao, A. V.

    2014-10-01

    The utility of orthogonal collocation methods in the solution of optimal control problems relating to Formula One racing is demonstrated. These methods can be used to optimise driver controls such as the steering, braking and throttle usage, and to optimise vehicle parameters such as the aerodynamic down force and mass distributions. Of particular interest is the optimal usage of energy recovery systems (ERSs). Contemporary kinetic energy recovery systems are studied and compared with future hybrid kinetic and thermal/heat ERSs known as ERS-K and ERS-H, respectively. It is demonstrated that these systems, when properly controlled, can produce contemporary lap times using approximately two-thirds of the fuel required by earlier generation (2013 and prior) vehicles.

  5. Cultural-based particle swarm for dynamic optimisation problems

    NASA Astrophysics Data System (ADS)

    Daneshyari, Moayed; Yen, Gary G.

    2012-07-01

    Many practical optimisation problems involve uncertainties, and a significant number of these belong to the dynamic optimisation problem (DOP) category, in which the fitness function changes through time. In this study, we propose a cultural-based particle swarm optimisation (PSO) to solve DOPs. A cultural framework is adopted that incorporates the required information from the PSO into five sections of the belief space, namely situational, temporal, domain, normative and spatial knowledge. The stored information is used to detect changes in the environment and assists the response to change through a diversity-based repulsion among particles and migration among swarms in the population space; it also helps in selecting the leading particles at three different levels: personal, swarm and global. Comparison of the proposed heuristic over several difficult dynamic benchmark problems demonstrates performance better than or equal to that of most other selected state-of-the-art dynamic PSO heuristics.

  6. A management and optimisation model for water supply planning in water deficit areas

    NASA Astrophysics Data System (ADS)

    Molinos-Senante, María; Hernández-Sancho, Francesc; Mocholí-Arce, Manuel; Sala-Garrido, Ramón

    2014-07-01

    The integrated water resources management approach has proven to be a suitable option for efficient, equitable and sustainable water management. In water-poor regions experiencing acute and/or chronic shortages, optimisation techniques are a useful tool for supporting the decision process of water allocation. In order to maximise the value of water use, an optimisation model was developed which involves multiple supply sources (conventional and non-conventional) and multiple users. Penalties, representing monetary losses in the event of an unfulfilled water demand, have been incorporated into the objective function. This model represents a novel approach which considers water distribution efficiency and the physical connections between water supply and demand points. Subsequent empirical testing using data from a Spanish Mediterranean river basin demonstrated the usefulness of the global optimisation model to solve existing water imbalances at the river basin level.
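
    A minimal sketch, on assumed data, of the allocation principle described above: deliveries from multiple sources to multiple users are chosen to maximise the value of water use, with penalty terms for unfulfilled demand entering the objective. Here it is posed as a small linear programme with SciPy, whereas the published model is considerably richer (distribution efficiency, physical connections between supply and demand points, and so on).

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Two sources (surface, reclaimed) and two users (urban, agriculture) - illustrative data.
    supply = np.array([60.0, 40.0])                 # hm3 available per source
    demand = np.array([50.0, 70.0])                 # hm3 demanded per user
    value = np.array([[3.0, 2.0],                   # value of delivering source i to user j
                      [1.5, 1.8]])
    penalty = np.array([4.0, 2.5])                  # monetary loss per hm3 of unmet demand

    # Variables: deliveries x[i, j] (4 of them) followed by shortfall s[j] per user (2).
    c = np.concatenate([-value.ravel(), penalty])   # linprog minimises, so negate the value
    A_ub = np.zeros((2, 6)); A_ub[0, [0, 1]] = 1; A_ub[1, [2, 3]] = 1
    b_ub = supply                                   # each source cannot exceed its supply
    A_eq = np.zeros((2, 6))
    A_eq[0, [0, 2, 4]] = 1; A_eq[1, [1, 3, 5]] = 1  # deliveries + shortfall = demand
    b_eq = demand

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    deliveries = res.x[:4].reshape(2, 2)            # optimal source-to-user allocation
    shortfall = res.x[4:]                           # unmet demand, penalised in the objective
    ```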

  7. Improving target coverage and organ-at-risk sparing in intensity-modulated radiotherapy for cervical oesophageal cancer using a simple optimisation method.

    PubMed

    Lu, Jia-Yang; Cheung, Michael Lok-Man; Huang, Bao-Tian; Wu, Li-Li; Xie, Wen-Jia; Chen, Zhi-Jian; Li, De-Rui; Xie, Liang-Xi

    2015-01-01

    To assess the performance of a simple optimisation method for improving target coverage and organ-at-risk (OAR) sparing in intensity-modulated radiotherapy (IMRT) for cervical oesophageal cancer. For 20 selected patients, clinically acceptable original IMRT plans (Original plans) were created, and two optimisation methods were adopted to improve the plans: 1) a base dose function (BDF)-based method, in which the treatment plans were re-optimised based on the original plans, and 2) a dose-controlling structure (DCS)-based method, in which the original plans were re-optimised by assigning additional constraints for hot and cold spots. The Original, BDF-based and DCS-based plans were compared with regard to target dose homogeneity, conformity, OAR sparing, planning time and monitor units (MUs). Dosimetric verifications were performed and delivery times were recorded for the BDF-based and DCS-based plans. The BDF-based plans provided significantly superior dose homogeneity and conformity compared with both the DCS-based and Original plans. The BDF-based method further reduced the doses delivered to the OARs by approximately 1-3%. The re-optimisation time was reduced by approximately 28%, but the MUs and delivery time were slightly increased. All verification tests were passed and no significant differences were found. The BDF-based method for the optimisation of IMRT for cervical oesophageal cancer can achieve significantly better dose distributions with better planning efficiency at the expense of slightly more MUs.

  8. A framework for the computer-aided planning and optimisation of manufacturing processes for components with functional graded properties

    NASA Astrophysics Data System (ADS)

    Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.

    2014-05-01

    In this contribution a framework for the computer-aided planning and optimisation of functionally graded components is presented. The framework is divided into three modules - the "Component Description", the "Expert System" for the synthesis of several process chains and the "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model by a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating, cooling and forming processes. The dependencies between the component and the applied manufacturing processes, as well as between the processes themselves, need to be considered. The Expert System utilises an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information of the knowledge base via relations. The third module performs the evaluation of the generated process chains. To accomplish this, the parameters of each process are optimised with respect to the component specification, whereby the result of the best parameterisation is used as the representative value. Finally, the process chain which is capable of manufacturing a functionally graded component in an optimal way with regard to the property distributions of the component description is presented by means of a dedicated specification technique.

  9. Energy landscapes for a machine learning application to series data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballard, Andrew J.; Stevenson, Jacob D.; Das, Ritankar

    2016-03-28

    Methods developed to explore and characterise potential energy landscapes are applied to the corresponding landscapes obtained from optimisation of a cost function in machine learning. We consider neural network predictions for the outcome of local geometry optimisation in a triatomic cluster, where four distinct local minima exist. The accuracy of the predictions is compared for fits using data from single and multiple points in the series of atomic configurations resulting from local geometry optimisation and for alternative neural networks. The machine learning solution landscapes are visualised using disconnectivity graphs, and signatures in the effective heat capacity are analysed in terms of distributions of local minima and their properties.

  10. Improving Vector Evaluated Particle Swarm Optimisation Using Multiple Nondominated Leaders

    PubMed Central

    Lim, Kian Sheng; Buyamin, Salinda; Ahmad, Anita; Shapiai, Mohd Ibrahim; Naim, Faradila; Mubin, Marizan; Kim, Dong Hwa

    2014-01-01

    The vector evaluated particle swarm optimisation (VEPSO) algorithm was previously improved by incorporating nondominated solutions for solving multiobjective optimisation problems. However, the obtained solutions did not converge close to the Pareto front and also did not distribute evenly over the Pareto front. Therefore, in this study, the concept of multiple nondominated leaders is incorporated to further improve the VEPSO algorithm. Hence, multiple nondominated solutions that are best at a respective objective function are used to guide particles in finding optimal solutions. The improved VEPSO is measured by the number of nondominated solutions found, generational distance, spread, and hypervolume. The results from the conducted experiments show that the proposed VEPSO significantly improved the existing VEPSO algorithms. PMID:24883386

  11. Modelling soil water retention using support vector machines with genetic algorithm optimisation.

    PubMed

    Lamorski, Krzysztof; Sławiński, Cezary; Moreno, Felix; Barna, Gyöngyi; Skierucha, Wojciech; Arrue, José L

    2014-01-01

    This work presents point pedotransfer function (PTF) models of the soil water retention curve. The developed models allow estimation of the soil water content for the specified soil water potentials -0.98, -3.10, -9.81, -31.02, -491.66 and -1554.78 kPa, based on the following soil characteristics: soil granulometric composition, total porosity and bulk density. Support Vector Machines (SVM) methodology was used for model development. A new methodology for the elaboration of retention function models is proposed. In contrast to previous attempts known from the literature, the ν-SVM method was used for model development and the results were compared with the formerly used C-SVM method. Genetic algorithms were used as the optimisation framework for the search of the models' parameters. A new form of the aim function used for the parameter search is proposed, which allowed the development of models with better prediction capabilities. This new aim function avoids the overestimation of models that is typically encountered when the root mean squared error is used as an aim function. The elaborated models showed good agreement with measured soil water retention data, with coefficients of determination in the range 0.67-0.92. The study demonstrated the usability of the ν-SVM methodology together with genetic algorithm optimisation for retention modelling, which gave better-performing models than the other tested approaches.
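
    The following is a hedged sketch of the general recipe, not the authors' exact aim function: a ν-SVM regressor for water content whose hyper-parameters (ν, C, γ) are searched by a tiny genetic algorithm scored with cross-validated error. The data are synthetic stand-ins for the granulometric, porosity and bulk density predictors.

    ```python
    import numpy as np
    from sklearn.svm import NuSVR
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.random((120, 5))                        # stand-in predictors (texture, porosity, ...)
    y = 0.4 - 0.3 * X[:, 0] + 0.1 * X[:, 3] + rng.normal(0, 0.02, 120)  # synthetic retention data

    def score(params):
        nu, C, gamma = params
        model = NuSVR(nu=nu, C=C, gamma=gamma)
        # Cross-validated mean squared error; lower is better.
        return -cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

    # Initial population of (nu, C, gamma) triples.
    pop = np.column_stack([rng.uniform(0.05, 0.95, 20),    # nu
                           rng.uniform(0.1, 100.0, 20),    # C
                           rng.uniform(0.01, 10.0, 20)])   # gamma
    for _ in range(15):
        fitness = np.array([score(p) for p in pop])
        parents = pop[np.argsort(fitness)[:10]]            # truncation selection
        children = parents[rng.integers(0, 10, 10)] * rng.normal(1.0, 0.1, (10, 3))
        children[:, 0] = np.clip(children[:, 0], 0.05, 0.95)   # keep nu valid
        children[:, 1:] = np.clip(children[:, 1:], 1e-3, None)  # keep C, gamma positive
        pop = np.vstack([parents, children])

    best = pop[np.argmin([score(p) for p in pop])]
    ```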

  12. Optimisation of warpage on plastic injection moulding part using response surface methodology (RSM) and genetic algorithm method (GA)

    NASA Astrophysics Data System (ADS)

    Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    In this study, computer-aided engineering was used to simulate the injection moulding process. A design of experiments (DOE) based on a Latin square orthogonal array was employed, and the relationship between the injection moulding parameters and warpage was identified from the experimental data. Response surface methodology (RSM) was used to validate the model accuracy. The RSM and GA methods were then combined to determine the optimum injection moulding process parameters. The optimisation of injection moulding is thereby substantially improved, and the results show increased accuracy and reliability. The proposed method of combining RSM and GA also contributes to minimising the warpage that would otherwise occur.

  13. Optimisation of the Management of Higher Activity Waste in the UK - 13537

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walsh, Ciara; Buckley, Matthew

    2013-07-01

    The Upstream Optioneering project was created in the Nuclear Decommissioning Authority (UK) to support the development and implementation of significant opportunities to optimise activities across all the phases of the Higher Activity Waste management life cycle (i.e. retrieval, characterisation, conditioning, packaging, storage, transport and disposal). The objective of the Upstream Optioneering project is to work in conjunction with other functions within NDA and the waste producers to identify and deliver solutions to optimise the management of higher activity waste. Historically, optimisation may have occurred on aspects of the waste life cycle (considered here to include retrieval, conditioning, treatment, packaging, interim storage, transport to final end state, which may be geological disposal). By considering the waste life cycle as a whole, critical analysis of assumed constraints may lead to cost savings for the UK Tax Payer. For example, it may be possible to challenge the requirements for packaging wastes for disposal to deliver an optimised waste life cycle. It is likely that the challenges faced in the UK are shared in other countries. It is therefore likely that the opportunities identified may also apply elsewhere, with the potential for sharing information to enable value to be shared. (authors)

  14. Design optimisation of powers-of-two FIR filter using self-organising random immigrants GA

    NASA Astrophysics Data System (ADS)

    Chandra, Abhijit; Chattopadhyay, Sudipta

    2015-01-01

    In this communication, we propose a novel design strategy of multiplier-less low-pass finite impulse response (FIR) filter with the aid of a recent evolutionary optimisation technique, known as the self-organising random immigrants genetic algorithm. Individual impulse response coefficients of the proposed filter have been encoded as sum of signed powers-of-two. During the formulation of the cost function for the optimisation algorithm, both the frequency response characteristic and the hardware cost of the discrete coefficient FIR filter have been considered. The role of crossover probability of the optimisation technique has been evaluated on the overall performance of the proposed strategy. For this purpose, the convergence characteristic of the optimisation technique has been included in the simulation results. In our analysis, two design examples of different specifications have been taken into account. In order to substantiate the efficiency of our proposed structure, a number of state-of-the-art design strategies of multiplier-less FIR filter have also been included in this article for the purpose of comparison. Critical analysis of the result unambiguously establishes the usefulness of our proposed approach for the hardware efficient design of digital filter.
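
    A minimal sketch, under assumed filter specifications, of the evaluation step such a strategy needs: each coefficient is decoded from a signed powers-of-two chromosome, and the cost blends the frequency-response error against an ideal low-pass with a hardware term counting the nonzero power-of-two contributions (i.e. the adders required).

    ```python
    import numpy as np
    from scipy.signal import freqz

    word = 8                                    # power-of-two terms from 2^-1 .. 2^-word (assumed)

    def decode(chromosome):
        """chromosome: (n_taps, word) array with entries in {-1, 0, +1}."""
        weights = 2.0 ** -np.arange(1, word + 1)
        return chromosome @ weights             # sum of signed powers of two per tap

    def cost(chromosome, cutoff=0.3, weight_hw=0.01):
        h = decode(chromosome)
        w, H = freqz(h, worN=512)               # frequency response on [0, pi)
        ideal = (w / np.pi <= cutoff).astype(float)   # ideal low-pass magnitude
        response_err = np.mean((np.abs(H) - ideal) ** 2)
        hardware = np.count_nonzero(chromosome)       # nonzero power-of-two terms
        return response_err + weight_hw * hardware
    ```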

  15. Optimisation of warpage on thin shell part by using particle swarm optimisation (PSO)

    NASA Astrophysics Data System (ADS)

    Norshahira, R.; Shayfull, Z.; Nasir, S. M.; Saad, S. M. Sazli; Fathullah, M.

    2017-09-01

    As products move towards ever thinner designs, the production of plastic parts faces increasing difficulties, because the likelihood of defects rises as the wall thickness decreases. This drives the demand for techniques that reduce such defects, which arise from several factors in the injection moulding process. In this study, Moldflow software was used to simulate the injection moulding process, while RSM was used to produce the mathematical model that serves as the fitness function in Matlab. The particle swarm optimisation (PSO) technique was then used to optimise the processing conditions to reduce the shrinkage and warpage of the plastic part. The results show warpage reductions of 17.60% in the x direction, 18.15% in the y direction and 10.25% in the z direction, demonstrating the reliability of this method in minimising product warpage.
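
    A minimal particle swarm optimisation sketch in the spirit of the study, assuming a hypothetical quadratic response-surface model of warpage as the fitness function (the study's actual model is fitted from Moldflow experiments); the inertia and acceleration constants are the usual textbook choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def warpage(x):
        # Hypothetical quadratic response surface in (melt temperature, packing pressure).
        return 0.5 + 0.01 * (x[:, 0] - 230) ** 2 + 0.002 * (x[:, 1] - 80) ** 2

    lo, hi = np.array([200.0, 60.0]), np.array([260.0, 100.0])   # assumed process bounds
    pos = rng.uniform(lo, hi, (30, 2)); vel = np.zeros((30, 2))
    pbest = pos.copy(); pbest_val = warpage(pos)
    gbest = pbest[np.argmin(pbest_val)]

    for _ in range(100):
        r1, r2 = rng.random((30, 2)), rng.random((30, 2))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)          # keep particles inside the process window
        val = warpage(pos)
        better = val < pbest_val                  # update personal bests
        pbest[better], pbest_val[better] = pos[better], val[better]
        gbest = pbest[np.argmin(pbest_val)]       # update global best
    ```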

  16. Optimisation of oxygen ion transport in materials for ceramic membrane devices.

    PubMed

    Kilner, J A

    2007-01-01

    Oxygen transport in ceramic oxide materials has received much attention over the past few decades. Much of this interest has stemmed from the desire to construct high temperature electrochemical devices for energy conversion, an example being the solid oxide fuel cell. In order to achieve high performance for these devices, insights are needed into how to achieve optimum performance from the functional components such as the electrolytes and electrodes. This includes the optimisation of oxygen transport through the crystal lattice of electrode and electrolyte materials and across the homogeneous (grain boundary) and heterogeneous interfaces that exist in real devices. Strategies are discussed for the optimisation of these quantities, and current problems in the characterisation of interfacial transport are explored.

  17. Optimisation of logistics processes of energy grass collection

    NASA Astrophysics Data System (ADS)

    Bányai, Tamás.

    2010-05-01

    The collection of energy grass is a logistics-intensive process [1]. The optimal design and control of the transportation and collection subprocesses is a critical point of the supply chain. To avoid ill-founded decisions made only on the basis of experience and intuition, the optimisation and analysis of collection processes based on mathematical models and methods is the scientifically sound way forward. Within the frame of this work, the author focuses on the optimisation possibilities of the collection processes, especially from the point of view of transportation and related warehousing operations. Although the optimisation methods developed in the literature [2] take into account harvesting processes, county-specific yields, transportation distances, erosion constraints, machinery specifications and other key variables, the possibility of multiple collection points and multi-level collection has not been considered. The possible uses of energy grass are very wide (energetic use, biogas and bio-alcohol production, paper and textile industry, industrial fibre material, foddering purposes, biological soil protection [3], etc.), so not only a single-level but also a multi-level collection system with several collection and production facilities has to be taken into consideration. The input parameters of the optimisation problem are the following: the total amount of energy grass to be harvested in each region; the specific facility costs of collection, warehousing and production units; the specific costs of transportation resources; the pre-scheduling of the harvesting process; the specific transportation and warehousing costs; and the pre-scheduling of the processing of energy grass at each facility (exclusive warehousing). The model takes the following assumptions into consideration: (1) there is a cooperative relation among processing and production facilities; (2) capacity constraints are not ignored; (3) the cost function of transportation is non-linear; (4) the drivers' conditions are ignored. The objective function of the optimisation is the maximisation of profit, i.e. the maximisation of the difference between revenue and cost; it trades off the income of the assigned transportation demands against the logistics costs. The constraints are the following: (1) the free capacity of the assigned transportation resource exceeds the requested capacity of the transportation demand; (2) the calculated arrival time of the transportation resource at the harvesting place is not later than its requested arrival time; (3) the calculated arrival time of the transportation demand at the processing and production facility is not later than the requested arrival time; (4) each transportation demand is assigned to exactly one transportation resource and each resource to one demand. The decision variables of the optimisation problem are the set of scheduling variables and the assignment of resources to transportation demands. The evaluation parameters of the optimised system are the following: the total cost of the collection process; the utilisation of transportation resources and warehouses; and the efficiency of the production and/or processing facilities. The multidimensional heuristic optimisation method is based on a genetic algorithm, while the routing sequence is optimised with an ant colony algorithm. The optimal routes are calculated with the aid of the ant colony algorithm as a subroutine of the global optimisation method, and the optimal assignment is given by the genetic algorithm. One important part of the mathematical method is the sensitivity analysis of the objective function, which shows the degree of influence of the different input parameters. Acknowledgements: This research was implemented within the frame of the project entitled "Development and operation of the Technology and Knowledge Transfer Centre of the University of Miskolc", with support from the European Union and co-funding from the European Social Fund. References [1] P. R. Daniel: The Economics of Harvesting and Transporting Corn Stover for Conversion to Fuel Ethanol: A Case Study for Minnesota. University of Minnesota, Department of Applied Economics. 2006. http://ideas.repec.org/p/ags/umaesp/14213.html [2] T. G. Douglas, J. Brendan, D. Erin & V.-D. Becca: Energy and Chemicals from Native Grasses: Production, Transportation and Processing Technologies Considered in the Northern Great Plains. University of Minnesota, Department of Applied Economics. 2006. http://ideas.repec.org/p/ags/umaesp/13838.html [3] Homepage of energy grass: www.energiafu.hu

  18. Investigating the Trade-Off Between Power Generation and Environmental Impact of Tidal-Turbine Arrays Using Array Layout Optimisation and Habitat Sustainability Modelling.

    NASA Astrophysics Data System (ADS)

    du Feu, R. J.; Funke, S. W.; Kramer, S. C.; Hill, J.; Piggott, M. D.

    2016-12-01

    The installation of tidal turbines into the ocean will inevitably affect the environment around them. However, due to the relative infancy of this sector the extent and severity of such effects is unknown. The layout of an array of turbines is an important factor in determining not only the array's final yield but also how it will influence regional hydrodynamics. This in turn could affect, for example, sediment transportation or habitat suitability. The two potentially competing objectives of extracting energy from the tidal current, and of limiting any environmental impact consequent to influencing that current, are investigated here. This relationship is posed as a multi-objective optimisation problem. OpenTidalFarm, an array layout optimisation tool, and MaxEnt, habitat sustainability modelling software, are used to evaluate scenarios off the coast of the UK. MaxEnt is used to estimate the likelihood of finding a species in a given location based upon environmental input data and presence data of the species. Environmental features which are known to impact habitat, specifically those affected by the presence of an array, such as bed shear stress, are chosen as inputs. MaxEnt then uses a maximum-entropy modelling approach to estimate population distribution across the modelled area. OpenTidalFarm is used to maximise the power generated by an array, or multiple arrays, through adjusting the position and number of turbines within them. It uses a 2D shallow water model with turbine arrays represented as adjustable friction fields. It has the capability to also optimise for user created functionals that can be expressed mathematically. This work uses two functionals; power extracted by the array, and the suitability of habitat as predicted by MaxEnt. A gradient-based local optimisation is used to adjust the array layout at each iteration. This work presents arrays that are optimised for both yield and the viability of habitat for chosen species. In each scenario studied, a range of array formations is found expressing varying preferences for either functional. Further analyses then allow for the identification of trade-offs between the two key societal objectives of energy production and conservation. This in turn produces information valuable to stakeholders and policymakers when making decisions on array design.

  19. Optimal integrated management of groundwater resources and irrigated agriculture in arid coastal regions

    NASA Astrophysics Data System (ADS)

    Grundmann, J.; Schütze, N.; Heck, V.

    2014-09-01

    Groundwater systems in arid coastal regions are particularly at risk due to limited potential for groundwater replenishment and increasing water demand, caused by a continuously growing population. For ensuring a sustainable management of those regions, we developed a new simulation-based integrated water management system. The management system unites process modelling with artificial intelligence tools and evolutionary optimisation techniques for managing both water quality and water quantity of a strongly coupled groundwater-agriculture system. Due to the large number of decision variables, a decomposition approach is applied to separate the original large optimisation problem into smaller, independent optimisation problems which finally allow for faster and more reliable solutions. It consists of an analytical inner optimisation loop to achieve the most profitable agricultural production for a given amount of water and an outer simulation-based optimisation loop to find the optimal groundwater abstraction pattern. Thereby, the behaviour of farms is described by crop-water production functions and the aquifer response, including the seawater interface, is simulated by an artificial neural network. The methodology is applied exemplarily to the south Batinah region of Oman, which is affected by saltwater intrusion into a coastal aquifer system due to excessive groundwater withdrawal for irrigated agriculture. Due to contradicting objectives, such as profit-oriented agriculture versus aquifer sustainability, a multi-objective optimisation is performed which can provide sustainable solutions for water and agricultural management over long-term periods at farm and regional scales with respect to water resources, environment and socio-economic development.

  20. Optimisation of Fabric Reinforced Polymer Composites Using a Variant of Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Axinte, Andrei; Taranu, Nicolae; Bejan, Liliana; Hudisteanu, Iuliana

    2017-12-01

    Fabric reinforced polymeric composites are high performance materials with a rather complex fabric geometry. Modelling this type of material is therefore a cumbersome task, especially when efficient use is targeted. One of the most important issues in its design process is the optimisation of the individual laminae and of the laminated structure as a whole. To this end, a parametric model of the material has been defined, emphasising the many geometric variables that need to be correlated in the complex process of optimisation. The input parameters involved in this work include the widths or heights of the tows and the laminate stacking sequence, which are discrete variables, and the gaps between adjacent tows and the height of the neat matrix, which are continuous variables. This work is one of the first attempts to use a genetic algorithm (GA) to optimise the geometrical parameters of satin-reinforced multi-layer composites. Given the mixed type of the input parameters involved, an original software package called SOMGA (Satin Optimisation with a Modified Genetic Algorithm) has been conceived and utilised in this work. The main goal is to find the best possible solution to the problem of designing a composite material that is able to withstand a given set of external in-plane loads. The optimisation process has been performed using a fitness function which can analyse and compare the mechanical behaviour of different fabric reinforced composites, the results being correlated with the ultimate strains, which demonstrates the efficiency of the composite structure.

  1. Advantages of Task-Specific Multi-Objective Optimisation in Evolutionary Robotics.

    PubMed

    Trianni, Vito; López-Ibáñez, Manuel

    2015-01-01

    The application of multi-objective optimisation to evolutionary robotics is receiving increasing attention. A survey of the literature reveals the different possibilities it offers to improve the automatic design of efficient and adaptive robotic systems, and points to the successful demonstrations available for both task-specific and task-agnostic approaches (i.e., with or without reference to the specific design problem to be tackled). However, the advantages of multi-objective approaches over single-objective ones have not been clearly spelled out and experimentally demonstrated. This paper fills this gap for task-specific approaches: starting from well-known results in multi-objective optimisation, we discuss how to tackle commonly recognised problems in evolutionary robotics. In particular, we show that multi-objective optimisation (i) allows evolving a more varied set of behaviours by exploring multiple trade-offs of the objectives to optimise, (ii) supports the evolution of the desired behaviour through the introduction of objectives as proxies, (iii) avoids the premature convergence to local optima possibly introduced by multi-component fitness functions, and (iv) solves the bootstrap problem exploiting ancillary objectives to guide evolution in the early phases. We present an experimental demonstration of these benefits in three different case studies: maze navigation in a single robot domain, flocking in a swarm robotics context, and a strictly collaborative task in collective robotics.
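
    A small sketch of the Pareto-based selection that underlies such multi-objective approaches: rather than collapsing the objectives into one weighted fitness, the non-dominated set of candidate behaviours is retained; the two objectives here are placeholders.

    ```python
    import numpy as np

    def non_dominated(scores):
        """scores: (n, m) array, larger is better on every objective.
        Returns the indices of the non-dominated (Pareto-optimal) candidates."""
        keep = np.ones(len(scores), dtype=bool)
        for i, s in enumerate(scores):
            if keep[i]:
                # A point is dominated by s if it is no better anywhere and worse somewhere.
                dominated = np.all(scores <= s, axis=1) & np.any(scores < s, axis=1)
                keep &= ~dominated
                keep[i] = True
        return np.where(keep)[0]

    rng = np.random.default_rng(5)
    scores = rng.random((50, 2))        # e.g. (task performance, behavioural diversity)
    front = non_dominated(scores)       # candidates kept for the next generation
    ```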

  2. Improving fMRI reliability in presurgical mapping for brain tumours.

    PubMed

    Stevens, M Tynan R; Clarke, David B; Stroink, Gerhard; Beyea, Steven D; D'Arcy, Ryan Cn

    2016-03-01

    Functional MRI (fMRI) is becoming increasingly integrated into clinical practice for presurgical mapping. Current efforts are focused on validating data quality, with reliability being a major factor. In this paper, we demonstrate the utility of a recently developed approach that uses receiver operating characteristic-reliability (ROC-r) to: (1) identify reliable versus unreliable data sets; (2) automatically select processing options to enhance data quality; and (3) automatically select individualised thresholds for activation maps. Presurgical fMRI was conducted in 16 patients undergoing surgical treatment for brain tumours. Within-session test-retest fMRI was conducted, and ROC-reliability of the patient group was compared to a previous healthy control cohort. Individually optimised preprocessing pipelines were determined to improve reliability. Spatial correspondence was assessed by comparing the fMRI results to intraoperative cortical stimulation mapping, in terms of the distance to the nearest active fMRI voxel. The average ROC-r reliability for the patients was 0.58±0.03, as compared to 0.72±0.02 in healthy controls. For the patient group, this increased significantly to 0.65±0.02 by adopting optimised preprocessing pipelines. Co-localisation of the fMRI maps with cortical stimulation was significantly better for more reliable versus less reliable data sets (8.3±0.9 vs 29±3 mm, respectively). We demonstrated ROC-r analysis for identifying reliable fMRI data sets, choosing optimal postprocessing pipelines, and selecting patient-specific thresholds. Data sets with higher reliability also showed closer spatial correspondence to cortical stimulation. ROC-r can thus identify poor fMRI data at time of scanning, allowing for repeat scans when necessary. ROC-r analysis provides optimised and automated fMRI processing for improved presurgical mapping. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  3. Software Toolbox Development for Rapid Earthquake Source Optimisation Combining InSAR Data and Seismic Waveforms

    NASA Astrophysics Data System (ADS)

    Isken, Marius P.; Sudhaus, Henriette; Heimann, Sebastian; Steinberg, Andreas; Bathke, Hannes M.

    2017-04-01

    We present a modular open-source software framework (pyrocko, kite, grond; http://pyrocko.org) for rapid InSAR data post-processing and modelling of tectonic and volcanic displacement fields derived from satellite data. Our aim is to ease and streamline the joint optimisation of earthquake observations from InSAR and GPS data together with seismological waveforms for an improved estimation of the ruptures' parameters. Through this approach we can provide finite models of earthquake ruptures and therefore contribute to a timely and better understanding of earthquake kinematics. The new kite module enables fast processing of unwrapped InSAR scenes for source modelling: the spatial sub-sampling and data error/noise estimation for the interferogram is evaluated automatically and interactively. The rupture's near-field surface displacement data are then combined with seismic far-field waveforms and jointly modelled using the pyrocko.gf framework, which allows for fast forward modelling based on pre-calculated elastodynamic and elastostatic Green's functions. Lastly, the grond module supplies a bootstrap-based probabilistic (Monte Carlo) joint optimisation to estimate the parameters and uncertainties of a finite-source earthquake rupture model. We describe the developed and applied methods as an effort to establish a semi-automatic processing and modelling chain. The framework is applied to Sentinel-1 data from the 2016 Central Italy earthquake sequence, where we present the earthquake mechanism and rupture model from which we derive regions of increased Coulomb stress. The open source software framework is developed at GFZ Potsdam and at the University of Kiel, Germany; it is written in the Python and C programming languages. The toolbox architecture is modular and independent, and can be utilised flexibly for a variety of geophysical problems. This work is conducted within the BridGeS project (http://www.bridges.uni-kiel.de) funded by the German Research Foundation DFG through an Emmy Noether grant.

  4. Optimisation of reconstruction--reprojection-based motion correction for cardiac SPECT.

    PubMed

    Kangasmaa, Tuija S; Sohlberg, Antti O

    2014-07-01

    Cardiac motion is a challenging cause of image artefacts in myocardial perfusion SPECT. A wide range of motion correction methods have been developed over the years, and so far automatic algorithms based on the reconstruction--reprojection principle have proved to be the most effective. However, these methods have not been fully optimised in terms of their free parameters and implementational details. Two slightly different implementations of reconstruction--reprojection-based motion correction techniques were optimised for effective, good-quality motion correction and then compared with each other. The first of these methods (Method 1) was the traditional reconstruction-reprojection motion correction algorithm, where the motion correction is done in projection space, whereas the second algorithm (Method 2) performed motion correction in reconstruction space. The parameters that were optimised include the type of cost function (squared difference, normalised cross-correlation and mutual information) that was used to compare measured and reprojected projections, and the number of iterations needed. The methods were tested with motion-corrupt projection datasets, which were generated by adding three different types of motion (lateral shift, vertical shift and vertical creep) to motion-free cardiac perfusion SPECT studies. Method 2 performed slightly better overall than Method 1, but the difference between the two implementations was small. The execution time for Method 2 was much longer than for Method 1, which limits its clinical usefulness. The mutual information cost function gave clearly the best results for all three motion sets for both correction methods. Three iterations were sufficient for a good quality correction using Method 1. The traditional reconstruction--reprojection-based method with three update iterations and mutual information cost function is a good option for motion correction in clinical myocardial perfusion SPECT.
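
    For illustration, a minimal implementation of the mutual-information cost that performed best in the comparison, computed from a joint histogram of measured and reprojected projections; the toy search over integer shifts below is only meant to show how such a cost could drive a motion estimate, and is not the paper's algorithm.

    ```python
    import numpy as np

    def mutual_information(a, b, bins=32):
        """Mutual information between two images, from their joint intensity histogram."""
        joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0                                      # avoid log(0)
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    # Toy example: the "reprojected" view is the measured projection shifted by 3 pixels;
    # the best shift is the one that maximises mutual information with the reprojection.
    measured = np.random.default_rng(4).random((64, 64))
    reprojected = np.roll(measured, 3, axis=0)
    best_shift = max(range(-5, 6),
                     key=lambda s: mutual_information(np.roll(measured, s, axis=0), reprojected))
    ```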

  5. Using modified fruit fly optimisation algorithm to perform the function test and case studies

    NASA Astrophysics Data System (ADS)

    Pan, Wen-Tsao

    2013-06-01

    Evolutionary computation is a computing paradigm established by simulating natural evolutionary processes based on Darwinian theory, and it is a common research method. The main contribution of this paper was to reinforce the ability of the fruit fly optimization algorithm (FOA) to search for the optimal solution while avoiding entrapment in local extrema. Evolutionary computation has grown to include the concepts of animal foraging behaviour and group behaviour. This study discussed three common evolutionary computation methods and compared them with the modified fruit fly optimization algorithm (MFOA). It further investigated the ability to compute extreme values of three mathematical functions, the algorithm execution speed, and the forecast ability of the forecasting model built using the optimised general regression neural network (GRNN) parameters. The findings indicated that there was no obvious difference between particle swarm optimization and the MFOA in regards to the ability to compute extreme values; however, they were both better than the artificial fish swarm algorithm and the FOA. In addition, the MFOA performed better than particle swarm optimization in regards to the algorithm execution speed, and the forecast ability of the forecasting model built using the MFOA's GRNN parameters was better than that of the other three forecasting models.

  6. Escalated convergent artificial bee colony

    NASA Astrophysics Data System (ADS)

    Jadon, Shimpi Singh; Bansal, Jagdish Chand; Tiwari, Ritu

    2016-03-01

    The artificial bee colony (ABC) optimisation algorithm is a recent, fast and easy-to-implement population-based metaheuristic for optimisation. ABC has proved to be a rival to some popular swarm intelligence-based algorithms such as particle swarm optimisation, the firefly algorithm and ant colony optimisation. The solution search equation of ABC is influenced by a random quantity which helps its search process in exploration at the cost of exploitation. In order to obtain faster convergence of ABC while maintaining its exploitation capability, in this paper the basic ABC is modified in two ways. First, to improve exploitation capability, two local search strategies, namely a classical unidimensional local search and a levy flight random walk-based local search, are incorporated into ABC. Furthermore, a new solution search strategy, namely stochastic diffusion scout search, is proposed and incorporated into the scout bee phase to give solutions that are about to be abandoned more chance to improve themselves. The efficiency of the proposed algorithm is tested on 20 benchmark test functions of different complexities and characteristics. The results are very promising and prove it to be a competitive algorithm in the field of swarm intelligence-based algorithms.

  7. UAV path planning using artificial potential field method updated by optimal control theory

    NASA Astrophysics Data System (ADS)

    Chen, Yong-bo; Luo, Guan-chen; Mei, Yue-song; Yu, Jian-qiao; Su, Xiao-long

    2016-04-01

    The unmanned aerial vehicle (UAV) path planning problem is an important assignment in UAV mission planning. Starting from the artificial potential field (APF) UAV path planning method, the problem is reconstructed as a constrained optimisation problem by introducing an additional control force. The constrained optimisation problem is translated into an unconstrained optimisation problem with the help of slack variables in this paper. The functional optimisation method is applied to reformulate this problem as an optimal control problem. The whole transformation process is deduced in detail, based on a discrete UAV dynamic model. Then, the path planning problem is solved with the help of the optimal control method. A path following process based on a six-degrees-of-freedom simulation model of a quadrotor helicopter is introduced to verify the practicability of this method. Finally, the simulation results show that the improved method is more effective in path planning. In the planning space, the calculated path is shorter and smoother than that obtained with the traditional APF method. In addition, the improved method can solve the dead point problem effectively.
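
    As a reference point for the method the paper builds on, the sketch below implements the classical artificial potential field step (attractive pull towards the goal plus repulsive push from nearby obstacles); the paper's additional control force and optimal-control reformulation are not reproduced here, and the gains and geometry are illustrative.

    ```python
    import numpy as np

    def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=5.0, step=0.1):
        """One potential-field step: move along the combined force direction."""
        force = k_att * (goal - pos)                      # attractive force towards the goal
        for obs in obstacles:
            diff = pos - obs
            d = np.linalg.norm(diff)
            if d < d0:                                    # repulsion acts only inside radius d0
                force += k_rep * (1.0 / d - 1.0 / d0) / d ** 3 * diff
        return pos + step * force / np.linalg.norm(force)

    pos, goal = np.array([0.0, 0.0]), np.array([20.0, 15.0])
    obstacles = [np.array([10.0, 8.0])]
    path = [pos]
    for _ in range(300):
        pos = apf_step(pos, goal, obstacles)
        path.append(pos)
        if np.linalg.norm(goal - pos) < 0.5:              # stop when close to the goal
            break
    ```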

  8. Optimisation of a machine learning algorithm in human locomotion using principal component and discriminant function analyses.

    PubMed

    Bisele, Maria; Bencsik, Martin; Lewis, Martin G C; Barnett, Cleveland T

    2017-01-01

    Assessment methods in human locomotion often involve the description of normalised graphical profiles and/or the extraction of discrete variables. Whilst useful, these approaches may not represent the full complexity of gait data. Multivariate statistical methods, such as Principal Component Analysis (PCA) and Discriminant Function Analysis (DFA), have been adopted since they have the potential to overcome these data handling issues. The aim of the current study was to develop and optimise a specific machine learning algorithm for processing human locomotion data. Twenty participants ran at a self-selected speed across a 15m runway in barefoot and shod conditions. Ground reaction forces (BW) and kinematics were measured at 1000 Hz and 100 Hz, respectively from which joint angles (°), joint moments (N.m.kg-1) and joint powers (W.kg-1) for the hip, knee and ankle joints were calculated in all three anatomical planes. Using PCA and DFA, power spectra of the kinematic and kinetic variables were used as a training database for the development of a machine learning algorithm. All possible combinations of 10 out of 20 participants were explored to find the iteration of individuals that would optimise the machine learning algorithm. The results showed that the algorithm was able to successfully predict whether a participant ran shod or barefoot in 93.5% of cases. To the authors' knowledge, this is the first study to optimise the development of a machine learning algorithm.

  9. Optimisation of a machine learning algorithm in human locomotion using principal component and discriminant function analyses

    PubMed Central

    Bisele, Maria; Bencsik, Martin; Lewis, Martin G. C.

    2017-01-01

    Assessment methods in human locomotion often involve the description of normalised graphical profiles and/or the extraction of discrete variables. Whilst useful, these approaches may not represent the full complexity of gait data. Multivariate statistical methods, such as Principal Component Analysis (PCA) and Discriminant Function Analysis (DFA), have been adopted since they have the potential to overcome these data handling issues. The aim of the current study was to develop and optimise a specific machine learning algorithm for processing human locomotion data. Twenty participants ran at a self-selected speed across a 15m runway in barefoot and shod conditions. Ground reaction forces (BW) and kinematics were measured at 1000 Hz and 100 Hz, respectively from which joint angles (°), joint moments (N.m.kg-1) and joint powers (W.kg-1) for the hip, knee and ankle joints were calculated in all three anatomical planes. Using PCA and DFA, power spectra of the kinematic and kinetic variables were used as a training database for the development of a machine learning algorithm. All possible combinations of 10 out of 20 participants were explored to find the iteration of individuals that would optimise the machine learning algorithm. The results showed that the algorithm was able to successfully predict whether a participant ran shod or barefoot in 93.5% of cases. To the authors’ knowledge, this is the first study to optimise the development of a machine learning algorithm. PMID:28886059
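
    A hedged sketch of the analysis pipeline on synthetic stand-in data: power-spectrum features reduced with PCA and classified with a linear discriminant, evaluated by cross-validation; the feature construction and dimensions are assumptions made for the example.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    n_trials, n_freq_bins = 40, 256
    X = rng.random((n_trials, n_freq_bins))          # stand-in power spectra per running trial
    X[20:, :30] += 0.5                               # shod trials differ at low frequencies (toy effect)
    y = np.array([0] * 20 + [1] * 20)                # 0 = barefoot, 1 = shod

    clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
    accuracy = cross_val_score(clf, X, y, cv=5).mean()
    print(f"cross-validated accuracy: {accuracy:.2f}")
    ```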

  10. The Influence of Organisational Commitment, Job Involvement and Utility Perceptions on Trainees' Motivation to Improve Work through Learning

    ERIC Educational Resources Information Center

    von Treuer, Kathryn; McHardy, Katherine; Earl, Celisha

    2013-01-01

    Workplace training is a key strategy often used by organisations to optimise performance. Further, trainee motivation is a key determinant of the degree to which the material learned in a training programme will be transferred to the workplace, enhancing the performance of the trainee. This study investigates the relationship between several…

  11. Automatic optimisation of gamma dose rate sensor networks: The DETECT Optimisation Tool

    NASA Astrophysics Data System (ADS)

    Helle, K. B.; Müller, T. O.; Astrup, P.; Dyve, J. E.

    2014-05-01

    Fast delivery of comprehensive information on the radiological situation is essential for decision-making in nuclear emergencies. Most national radiological agencies in Europe employ gamma dose rate sensor networks to monitor radioactive pollution of the atmosphere. Sensor locations were often chosen using regular grids or according to administrative constraints. Nowadays, however, the choice can be based on more realistic risk assessment, as it is possible to simulate potential radioactive plumes. To support sensor planning, we developed the DETECT Optimisation Tool (DOT) within the scope of the EU FP 7 project DETECT. It evaluates the gamma dose rates that a proposed set of sensors might measure in an emergency and uses this information to optimise the sensor locations. The gamma dose rates are taken from a comprehensive library of simulations of atmospheric radioactive plumes from 64 source locations. These simulations cover the whole European Union, so the DOT allows evaluation and optimisation of sensor networks for all EU countries, as well as evaluation of fencing sensors around possible sources. Users can choose from seven cost functions to evaluate the capability of a given monitoring network for early detection of radioactive plumes or for the creation of dose maps. The DOT is implemented as a stand-alone easy-to-use JAVA-based application with a graphical user interface and an R backend. Users can run evaluations and optimisations, and display, store and download the results. The DOT runs on a server and can be accessed via common web browsers; it can also be installed locally.
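
    An illustrative greedy sketch of the underlying placement question that the DOT addresses with its cost functions: given a library of simulated plumes, choose sensor locations that maximise the number of plumes detected above a threshold. The dose-rate library here is randomly generated, not the DETECT simulations, and the greedy rule is only one simple stand-in for the tool's optimisation.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n_candidates, n_plumes = 200, 64
    # Simulated gamma dose rates at each candidate location for each plume (stand-in data).
    dose = rng.lognormal(mean=-2.0, sigma=1.5, size=(n_candidates, n_plumes))
    detect = dose > 0.5                     # which candidate location detects which plume

    chosen, covered = [], np.zeros(n_plumes, dtype=bool)
    for _ in range(10):                     # assumed budget of 10 sensors
        gains = (detect & ~covered).sum(axis=1)   # newly detected plumes per candidate
        best = int(np.argmax(gains))
        if gains[best] == 0:
            break
        chosen.append(best)
        covered |= detect[best]
    ```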

  12. Disease activity-guided dose optimisation of adalimumab and etanercept is a cost-effective strategy compared with non-tapering tight control rheumatoid arthritis care: analyses of the DRESS study.

    PubMed

    Kievit, Wietske; van Herwaarden, Noortje; van den Hoogen, Frank Hj; van Vollenhoven, Ronald F; Bijlsma, Johannes Wj; van den Bemt, Bart Jf; van der Maas, Aatke; den Broeder, Alfons A

    2016-11-01

    A disease activity-guided dose optimisation strategy of adalimumab or etanercept (TNFi (tumour necrosis factor inhibitors)) has shown to be non-inferior in maintaining disease control in patients with rheumatoid arthritis (RA) compared with usual care. However, the cost-effectiveness of this strategy is still unknown. This is a preplanned cost-effectiveness analysis of the Dose REduction Strategy of Subcutaneous TNF inhibitors (DRESS) study, a randomised controlled, open-label, non-inferiority trial performed in two Dutch rheumatology outpatient clinics. Patients with low disease activity using TNF inhibitors were included. Total healthcare costs were measured and quality adjusted life years (QALY) were based on EQ5D utility scores. Decremental cost-effectiveness analyses were performed using bootstrap analyses; incremental net monetary benefit (iNMB) was used to express cost-effectiveness. 180 patients were included, and 121 were allocated to the dose optimisation strategy and 59 to control. The dose optimisation strategy resulted in a mean cost saving of -€12 280 (95 percentile -€10 502; -€14 104) per patient per 18 months. There is an 84% chance that the dose optimisation strategy results in a QALY loss with a mean QALY loss of -0.02 (-0.07 to 0.02). The decremental cost-effectiveness ratio (DCER) was €390 493 (€5 085 184; dominant) of savings per QALY lost. The mean iNMB was €10 467 (€6553-€14 037). Sensitivity analyses using 30% and 50% lower prices for TNFi remained cost-effective. Disease activity-guided dose optimisation of TNFi results in considerable cost savings while no relevant loss of quality of life was observed. When the minimal QALY loss is compensated with the upper limit of what society is willing to pay or accept in the Netherlands, the net savings are still high. NTR3216; Post-results. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
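
    A small worked sketch of the health-economic quantities reported above, using rounded illustrative inputs rather than the trial's bootstrap output; the willingness-to-accept threshold is an assumption chosen only to show how the numbers combine.

    ```python
    # Cost and effect differences of the dose-optimisation strategy versus usual care.
    delta_cost = -12280.0          # mean cost difference per patient (negative = savings)
    delta_qaly = -0.02             # mean QALY difference (a small loss)
    wtp = 80000.0                  # assumed willingness-to-accept threshold per QALY (euros)

    # Incremental net monetary benefit: value the QALY change at the threshold, subtract
    # the cost change; positive values favour the dose-optimisation strategy.
    inmb = wtp * delta_qaly - delta_cost
    # Decremental cost-effectiveness ratio: savings obtained per QALY lost.
    dcer = delta_cost / delta_qaly
    print(f"iNMB = {inmb:.0f} euros, DCER = {dcer:.0f} euros saved per QALY lost")
    ```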

  13. Power law-based local search in spider monkey optimisation for lower order system modelling

    NASA Astrophysics Data System (ADS)

    Sharma, Ajay; Sharma, Harish; Bhargava, Annapurna; Sharma, Nirmala

    2017-01-01

    Nature-inspired algorithms (NIAs) have shown efficiency in solving many complex real-world optimisation problems. The efficiency of NIAs is measured by their ability to find adequate results within a reasonable amount of time, rather than an ability to guarantee the optimal solution. This paper presents a solution for lower order system modelling using the spider monkey optimisation (SMO) algorithm to obtain a better approximation for lower order systems that reflects almost all of the original higher order system's characteristics. Further, a local search strategy, namely power law-based local search, is incorporated into SMO. The proposed strategy is named power law-based local search in SMO (PLSMO). The efficiency, accuracy and reliability of the proposed algorithm are tested on 20 well-known benchmark functions. The PLSMO algorithm is then applied to solve the lower order system modelling problem.

  14. An integrated modelling and multicriteria analysis approach to managing nitrate diffuse pollution: 2. A case study for a chalk catchment in England.

    PubMed

    Koo, B K; O'Connell, P E

    2006-04-01

    The site-specific land use optimisation methodology, suggested by the authors in the first part of this two-part paper, has been applied to the River Kennet catchment at Marlborough, Wiltshire, UK, for a case study. The Marlborough catchment (143 km²) is an agriculture-dominated rural area over a deep chalk aquifer that is vulnerable to nitrate pollution from agricultural diffuse sources. For evaluation purposes, the catchment was discretised into a network of 1 km x 1 km grid cells. For each of the arable-land grid cells, seven land use alternatives (four arable-land alternatives and three grassland alternatives) were evaluated for their environmental and economic potential. For environmental evaluation, nitrate leaching rates of land use alternatives were estimated using SHETRAN simulations and groundwater pollution potential was evaluated using the DRASTIC index. For economic evaluation, economic gross margins were estimated using a simple agronomic model based on nitrogen response functions and agricultural land classification grades. In order to see whether the site-specific optimisation is efficient at the catchment scale, land use optimisation was carried out for four optimisation schemes (i.e. using four sets of criterion weights). Consequently, four land use scenarios were generated and the site-specifically optimised land use scenario was evaluated as the best compromise solution between long-term nitrate pollution and agronomy at the catchment scale.

  15. Advantages of Task-Specific Multi-Objective Optimisation in Evolutionary Robotics

    PubMed Central

    Trianni, Vito; López-Ibáñez, Manuel

    2015-01-01

    The application of multi-objective optimisation to evolutionary robotics is receiving increasing attention. A survey of the literature reveals the different possibilities it offers to improve the automatic design of efficient and adaptive robotic systems, and points to the successful demonstrations available for both task-specific and task-agnostic approaches (i.e., with or without reference to the specific design problem to be tackled). However, the advantages of multi-objective approaches over single-objective ones have not been clearly spelled out and experimentally demonstrated. This paper fills this gap for task-specific approaches: starting from well-known results in multi-objective optimisation, we discuss how to tackle commonly recognised problems in evolutionary robotics. In particular, we show that multi-objective optimisation (i) allows evolving a more varied set of behaviours by exploring multiple trade-offs of the objectives to optimise, (ii) supports the evolution of the desired behaviour through the introduction of objectives as proxies, (iii) avoids the premature convergence to local optima possibly introduced by multi-component fitness functions, and (iv) solves the bootstrap problem exploiting ancillary objectives to guide evolution in the early phases. We present an experimental demonstration of these benefits in three different case studies: maze navigation in a single robot domain, flocking in a swarm robotics context, and a strictly collaborative task in collective robotics. PMID:26295151

  16. A robust optimisation approach to the problem of supplier selection and allocation in outsourcing

    NASA Astrophysics Data System (ADS)

    Fu, Yelin; Keung Lai, Kin; Liang, Liang

    2016-03-01

    We formulate the supplier selection and allocation problem in outsourcing under an uncertain environment as a stochastic programming problem. Both the decision-maker's attitude towards risk and the penalty parameters for demand deviation are considered in the objective function. A service level agreement, upper bound for each selected supplier's allocation and the number of selected suppliers are considered as constraints. A novel robust optimisation approach is employed to solve this problem under different economic situations. Illustrative examples are presented with managerial implications highlighted to support decision-making.

  17. Current limitations of the assessment of haemostasis in adult extracorporeal membrane oxygenation patients and the role of point-of-care testing.

    PubMed

    Venkatesh, K; Nair, P S; Hoechter, D J; Buscher, H

    2016-11-01

    Haemostatic perturbations are commonly seen in extracorporeal membrane oxygenation (ECMO) patients and remain a clinical challenge, contributing significantly to morbidity and mortality. The approach to anticoagulation monitoring and the management of bleeding varies considerably across ECMO centres. Routine laboratory tests have their limitations in terms of turnaround time and specificity of information provided. Newer point-of-care testing (POCT) for coagulation may overcome these issues, as it provides information about the entire coagulation pathway from clot initiation to lysis. It is also possible to obtain qualitative information on platelet function from these tests. Furthermore, the ability to incorporate these results into a goal-directed algorithm to manage bleeding with targeted transfusion strategies appears particularly attractive and cost effective. Further studies are required to evaluate the utility of POCT to optimise bleeding and anticoagulation management in these complex patients.

  18. Structural optimisation of cage induction motors using finite element analysis

    NASA Astrophysics Data System (ADS)

    Palko, S.

    The current trend in motor design is towards highly efficient, low-noise, low-cost and modular motors with a high power factor. High-torque motors are useful in applications such as servo motors, lifts, cranes and rolling mills. This report contains a detailed review of different optimisation methods applicable to various design problems. Special attention is given to the performance of the different methods when they are used with finite element analysis (FEA) as the objective function, and to the accuracy problems arising from the numerical simulations. An effective method for designing high-starting-torque, high-efficiency motors is also presented. The method described in this work utilises FEA combined with algorithms for the optimisation of the slot geometry. The optimisation algorithm modifies the positions of the nodal points in the element mesh. The number of independent variables ranges from 14 to 140 in this work.

  19. Multi-photon absorption limits to heralded single photon sources

    PubMed Central

    Husko, Chad A.; Clark, Alex S.; Collins, Matthew J.; De Rossi, Alfredo; Combrié, Sylvain; Lehoucq, Gaëlle; Rey, Isabella H.; Krauss, Thomas F.; Xiong, Chunle; Eggleton, Benjamin J.

    2013-01-01

    Single photons are of paramount importance to future quantum technologies, including quantum communication and computation. Nonlinear photonic devices using parametric processes offer a straightforward route to generating photons, however additional nonlinear processes may come into play and interfere with these sources. Here we analyse spontaneous four-wave mixing (SFWM) sources in the presence of multi-photon processes. We conduct experiments in silicon and gallium indium phosphide photonic crystal waveguides which display inherently different nonlinear absorption processes, namely two-photon (TPA) and three-photon absorption (ThPA), respectively. We develop a novel model capturing these diverse effects which is in excellent quantitative agreement with measurements of brightness, coincidence-to-accidental ratio (CAR) and second-order correlation function g(2)(0), showing that TPA imposes an intrinsic limit on heralded single photon sources. We build on these observations to devise a new metric, the quantum utility (QMU), enabling further optimisation of single photon sources. PMID:24186400

  20. Use of a genetic algorithm to improve the rail profile on Stockholm underground

    NASA Astrophysics Data System (ADS)

    Persson, Ingemar; Nilsson, Rickard; Bik, Ulf; Lundgren, Magnus; Iwnicki, Simon

    2010-12-01

    In this paper, a genetic algorithm optimisation method has been used to develop an improved rail profile for Stockholm underground. An inverted penalty index based on a number of key performance parameters was generated as a fitness function and vehicle dynamics simulations were carried out with the multibody simulation package Gensys. The effectiveness of each profile produced by the genetic algorithm was assessed using the roulette wheel method. The method has been applied to the rail profile on the Stockholm underground, where problems with rolling contact fatigue on wheels and rails are currently managed by grinding. From a starting point of the original BV50 and the UIC60 rail profiles, an optimised rail profile with some shoulder relief has been produced. The optimised profile seems similar to measured rail profiles on the Stockholm underground network and although initial grinding is required, maintenance of the profile will probably not require further grinding.
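
    A minimal sketch of the fitness-proportionate ("roulette wheel") selection step described above, assuming the fitness is an inverted penalty index so that lower-penalty rail profiles occupy a larger slice of the wheel; the penalty values below are invented and this is not the Gensys-based implementation.

      import numpy as np

      def roulette_select(penalties, n_parents, rng):
          """Return indices of selected parents; penalties are positive, smaller is better."""
          fitness = 1.0 / penalties              # invert so that low penalty -> high fitness
          probs = fitness / fitness.sum()
          return rng.choice(penalties.size, size=n_parents, p=probs)

      rng = np.random.default_rng(1)
      penalty_index = rng.uniform(0.5, 5.0, size=20)   # hypothetical penalty per candidate profile
      parents = roulette_select(penalty_index, n_parents=10, rng=rng)
      print("selected candidates:", parents)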

  1. rPM6 parameters for phosphorous and sulphur-containing open-shell molecules

    NASA Astrophysics Data System (ADS)

    Saito, Toru; Takano, Yu

    2018-03-01

    In this article, we have introduced a reparameterisation of PM6 (rPM6) for phosphorus and sulphur to achieve a better description of open-shell species containing the two elements. Two sets of parameters have been optimised separately using our training sets. The performance of the spin-unrestricted rPM6 (UrPM6) method with the optimised parameters is evaluated against 14 radical species, each containing either a phosphorus or a sulphur atom, and compared with the original UPM6 and spin-unrestricted density functional theory (UDFT) methods. The standard UPM6 calculations fail to describe the adiabatic singlet-triplet energy gaps correctly, and may cause significant structural mismatches with UDFT-optimised geometries. Leaving aside three difficult cases, tests on 11 open-shell molecules strongly indicate the superior performance of UrPM6, which provides much better agreement with the results of UDFT methods for geometric and electronic properties.

  2. Hybrid real-code ant colony optimisation for constrained mechanical design

    NASA Astrophysics Data System (ADS)

    Pholdee, Nantiwat; Bureerat, Sujin

    2016-01-01

    This paper proposes a hybrid meta-heuristic based on integrating a local search simplex downhill (SDH) method into the search procedure of real-code ant colony optimisation (ACOR). This hybridisation leads to five hybrid algorithms where a Monte Carlo technique, a Latin hypercube sampling technique (LHS) and a translational propagation Latin hypercube design (TPLHD) algorithm are used to generate an initial population. Also, two numerical schemes for selecting an initial simplex are investigated. The original ACOR and its hybrid versions along with a variety of established meta-heuristics are implemented to solve 17 constrained test problems where a fuzzy set theory penalty function technique is used to handle design constraints. The comparative results show that the hybrid algorithms are the top performers. Using the TPLHD technique gives better results than the other sampling techniques. The hybrid optimisers are a powerful design tool for constrained mechanical design problems.
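
    For illustration, the snippet below generates an initial population by Latin hypercube sampling, one of the three initialisation schemes compared above, and scales it into the bounds of a hypothetical design problem; it uses SciPy's quasi-Monte Carlo module rather than the authors' code.

      import numpy as np
      from scipy.stats import qmc

      dim, pop_size = 5, 30
      lower = np.zeros(dim)                          # hypothetical lower bounds
      upper = np.array([1.0, 2.0, 5.0, 10.0, 3.0])   # hypothetical upper bounds

      sampler = qmc.LatinHypercube(d=dim, seed=42)
      unit_sample = sampler.random(n=pop_size)           # points in [0, 1)^dim
      population = qmc.scale(unit_sample, lower, upper)  # stretch to the design bounds
      print(population.shape)   # (30, 5)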

  3. A robust algorithm for optimisation and customisation of fractal dimensions of time series modified by nonlinearly scaling their time derivatives: mathematical theory and practical applications.

    PubMed

    Fuss, Franz Konstantin

    2013-01-01

    Standard methods for computing the fractal dimensions of time series are usually tested with continuous nowhere differentiable functions, but not benchmarked with actual signals. Therefore they can produce opposite results in extreme signals. These methods also use different scaling methods, that is, different amplitude multipliers, which makes it difficult to compare fractal dimensions obtained from different methods. The purpose of this research was to develop an optimisation method that computes the fractal dimension of a normalised (dimensionless) and modified time series signal with a robust algorithm and a running average method, and that maximises the difference between two fractal dimensions, for example, a minimum and a maximum one. The signal is modified by transforming its amplitude by a multiplier, which has a non-linear effect on the signal's time derivative. The optimisation method identifies the optimal multiplier of the normalised amplitude for targeted decision making based on fractal dimensions. The optimisation method provides an additional filter effect and makes the fractal dimensions less noisy. The method is exemplified by, and explained with, different signals, such as human movement, EEG, and acoustic signals.

  4. A Robust Algorithm for Optimisation and Customisation of Fractal Dimensions of Time Series Modified by Nonlinearly Scaling Their Time Derivatives: Mathematical Theory and Practical Applications

    PubMed Central

    2013-01-01

    Standard methods for computing the fractal dimensions of time series are usually tested with continuous nowhere differentiable functions, but not benchmarked with actual signals. Therefore they can produce opposite results in extreme signals. These methods also use different scaling methods, that is, different amplitude multipliers, which makes it difficult to compare fractal dimensions obtained from different methods. The purpose of this research was to develop an optimisation method that computes the fractal dimension of a normalised (dimensionless) and modified time series signal with a robust algorithm and a running average method, and that maximises the difference between two fractal dimensions, for example, a minimum and a maximum one. The signal is modified by transforming its amplitude by a multiplier, which has a non-linear effect on the signal's time derivative. The optimisation method identifies the optimal multiplier of the normalised amplitude for targeted decision making based on fractal dimensions. The optimisation method provides an additional filter effect and makes the fractal dimensions less noisy. The method is exemplified by, and explained with, different signals, such as human movement, EEG, and acoustic signals. PMID:24151522

  5. Comparisons of the utility of researcher-defined and participant-defined successful ageing.

    PubMed

    Brown, Lynsey J; Bond, Malcolm J

    2016-03-01

    To investigate the impact of different approaches for measuring 'successful ageing', four alternative researcher and participant definitions were compared, including a novel measure informed by cluster analysis. Rates of successful ageing were explored, as were their relative associations with age and measures of successful adaptation, to assess construct validity. Participants, aged over 65, were recruited from community-based organisations. Questionnaires (assessing successful ageing, lifestyle activities and selective optimisation with compensation) were completed by 317 individuals. Successful ageing ranged from 11.4% to 87.4%, with higher rates evident from participant definitions. Though dependent upon the definition, successful agers were typically younger, reported greater engagement with lifestyle activities and more frequent optimisation. While the current study suggested an improved classification algorithm using a common research definition, future research should explore how subjective and objective aspects of successful ageing may be combined to derive a measure relevant to policy and practice. © 2016 AJA Inc.

  6. Midbond basis functions for weakly bound complexes

    NASA Astrophysics Data System (ADS)

    Shaw, Robert A.; Hill, J. Grant

    2018-06-01

    Weakly bound systems present a difficult problem for conventional atom-centred basis sets due to large separations, necessitating the use of large, computationally expensive bases. This can be remedied by placing a small number of functions in the region between molecules in the complex. We present compact sets of optimised midbond functions for a range of complexes involving noble gases, alkali metals and small molecules for use in high accuracy coupled-cluster calculations, along with a more robust procedure for their optimisation. It is shown that excellent results are possible with double-zeta quality orbital basis sets when a few midbond functions are added, improving both the interaction energy and the equilibrium bond lengths of a series of noble gas dimers by 47% and 8%, respectively. When used in conjunction with explicitly correlated methods, near complete basis set limit accuracy is readily achievable at a fraction of the cost that using a large basis would entail. General purpose auxiliary sets are developed to allow explicitly correlated midbond function studies to be carried out, making it feasible to perform very high accuracy calculations on weakly bound complexes.

  7. Fast and fuzzy multi-objective radiotherapy treatment plan generation for head and neck cancer patients with the lexicographic reference point method (LRPM)

    NASA Astrophysics Data System (ADS)

    van Haveren, Rens; Ogryczak, Włodzimierz; Verduijn, Gerda M.; Keijzer, Marleen; Heijmen, Ben J. M.; Breedveld, Sebastiaan

    2017-06-01

    Previously, we have proposed Erasmus-iCycle, an algorithm for fully automated IMRT plan generation based on prioritised (lexicographic) multi-objective optimisation with the 2-phase ɛ-constraint (2pɛc) method. For each patient, the output of Erasmus-iCycle is a clinically favourable, Pareto optimal plan. The 2pɛc method uses a list of objective functions that are consecutively optimised, following a strict, user-defined prioritisation. The novel lexicographic reference point method (LRPM) is capable of solving multi-objective problems in a single optimisation, using a fuzzy prioritisation of the objectives. Trade-offs are made globally, aiming for large favourable gains for lower prioritised objectives at the cost of only slight degradations for higher prioritised objectives, or vice versa. In this study, the LRPM is validated for 15 head and neck cancer patients receiving bilateral neck irradiation. The generated plans using the LRPM are compared with the plans resulting from the 2pɛc method. Both methods were capable of automatically generating clinically relevant treatment plans for all patients. For some patients, the LRPM allowed large favourable gains in some treatment plan objectives at the cost of only small degradations for the others. Moreover, because of the applied single optimisation instead of multiple optimisations, the LRPM reduced the average computation time from 209.2 to 9.5 min, a speed-up factor of 22 relative to the 2pɛc method.
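
    The toy sketch below illustrates the prioritised idea behind the 2-phase ε-constraint approach on two invented objectives (it is not the Erasmus-iCycle implementation): the top-priority objective is optimised first, then the next objective is optimised while the first is constrained to stay within a small slack of its optimum.

      import numpy as np
      from scipy.optimize import minimize

      def f_high(x):   # hypothetical high-priority objective (e.g. target dose deviation)
          return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

      def f_low(x):    # hypothetical lower-priority objective (e.g. organ-at-risk dose)
          return x[0] ** 2 + x[1] ** 2

      x0 = np.zeros(2)
      phase1 = minimize(f_high, x0, method="SLSQP")
      slack = 0.05 * (1.0 + abs(phase1.fun))     # allowed degradation of the first objective

      cons = {"type": "ineq", "fun": lambda x: phase1.fun + slack - f_high(x)}
      phase2 = minimize(f_low, phase1.x, method="SLSQP", constraints=[cons])
      print("phase 1:", round(phase1.fun, 4),
            "phase 2:", round(f_high(phase2.x), 4), round(f_low(phase2.x), 4))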

  8. Treatment planning optimisation in proton therapy

    PubMed Central

    McGowan, S E; Burnet, N G; Lomax, A J

    2013-01-01

    The goal of radiotherapy is to achieve uniform target coverage while sparing normal tissue. In proton therapy, the same sources of geometric uncertainty are present as in conventional radiotherapy. However, an important and fundamental difference in proton therapy is that protons have a finite range, highly dependent on the electron density of the material they are traversing, resulting in a steep dose gradient at the distal edge of the Bragg peak. Therefore, an accurate knowledge of the sources and magnitudes of the uncertainties affecting the proton range is essential for producing plans which are robust to these uncertainties. This review describes the current knowledge of the geometric uncertainties and discusses their impact on proton dose plans. Patient-specific validation is essential, and in cases of complex intensity-modulated proton therapy plans the use of a planning target volume (PTV) may fail to ensure coverage of the target. In cases where a PTV cannot be used, other methods of quantifying plan quality have been investigated. A promising option is to incorporate uncertainties directly into the optimisation algorithm. A further development is the inclusion of robustness into a multicriteria optimisation framework, allowing a multi-objective Pareto optimisation function to balance robustness and conformity. The question remains as to whether adaptive therapy can become an integral part of proton therapy, to allow re-optimisation during the course of a patient's treatment. The challenge of ensuring that plans are robust to range uncertainties in proton therapy remains, although these methods can provide practical solutions. PMID:23255545

  9. Machine learning prediction for classification of outcomes in local minimisation

    NASA Astrophysics Data System (ADS)

    Das, Ritankar; Wales, David J.

    2017-01-01

    Machine learning schemes are employed to predict which local minimum will result from local energy minimisation of random starting configurations for a triatomic cluster. The input data consists of structural information at one or more of the configurations in optimisation sequences that converge to one of four distinct local minima. The ability to make reliable predictions, in terms of the energy or other properties of interest, could save significant computational resources in sampling procedures that involve systematic geometry optimisation. Results are compared for two energy minimisation schemes, and for neural network and quadratic functions of the inputs.

  10. Feasibility and Clinical Utility of High-definition Transcranial Direct Current Stimulation in the Treatment of Persistent Hallucinations in Schizophrenia.

    PubMed

    Bose, A; Shivakumar, V; Chhabra, H; Parlikar, R; Sreeraj, V S; Dinakaran, D; Narayanaswamy, J C; Venkatasubramanian, G

    2017-12-01

    Persistent auditory verbal hallucination is a clinically significant problem in schizophrenia. Recent studies suggest a promising role for add-on transcranial direct current stimulation (tDCS) in treatment. An optimised version of tDCS, namely high-definition tDCS (HD-tDCS), uses smaller electrodes arranged in a 4x1 ring configuration and may offer more focal and predictable neuromodulation than conventional tDCS. This case report illustrates the feasibility and clinical utility of add-on HD-tDCS over the left temporoparietal junction in a 4x1 ring configuration to treat persistent auditory verbal hallucination in schizophrenia.

  11. GNAQPMS v1.1: accelerating the Global Nested Air Quality Prediction Modeling System (GNAQPMS) on Intel Xeon Phi processors

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Chen, Huansheng; Wu, Qizhong; Lin, Junmin; Chen, Xueshun; Xie, Xinwei; Wang, Rongrong; Tang, Xiao; Wang, Zifa

    2017-08-01

    The Global Nested Air Quality Prediction Modeling System (GNAQPMS) is the global version of the Nested Air Quality Prediction Modeling System (NAQPMS), which is a multi-scale chemical transport model used for air quality forecast and atmospheric environmental research. In this study, we present the porting and optimisation of GNAQPMS on a second-generation Intel Xeon Phi processor, codenamed Knights Landing (KNL). Compared with the first-generation Xeon Phi coprocessor (codenamed Knights Corner, KNC), KNL has many new hardware features such as a bootable processor, high-performance in-package memory and ISA compatibility with Intel Xeon processors. In particular, we describe the five optimisations we applied to the key modules of GNAQPMS, including the CBM-Z gas-phase chemistry, advection, convection and wet deposition modules. These optimisations work well on both the KNL 7250 processor and the Intel Xeon E5-2697 V4 processor. They include (1) updating the pure Message Passing Interface (MPI) parallel mode to the hybrid parallel mode with MPI and OpenMP in the emission, advection, convection and gas-phase chemistry modules; (2) fully employing the 512 bit wide vector processing units (VPUs) on the KNL platform; (3) reducing unnecessary memory access to improve cache efficiency; (4) reducing the thread local storage (TLS) in the CBM-Z gas-phase chemistry module to improve its OpenMP performance; and (5) changing the global communication from writing/reading interface files to MPI functions to improve the performance and the parallel scalability. These optimisations greatly improved the GNAQPMS performance. The same optimisations also work well for the Intel Xeon Broadwell processor, specifically E5-2697 v4. Compared with the baseline version of GNAQPMS, the optimised version was 3.51 × faster on KNL and 2.77 × faster on the CPU. Moreover, the optimised version ran at 26 % lower average power on KNL than on the CPU. With the combined performance and energy improvement, the KNL platform was 37.5 % more efficient on power consumption compared with the CPU platform. The optimisations also enabled much further parallel scalability on both the CPU cluster and the KNL cluster scaled to 40 CPU nodes and 30 KNL nodes, with a parallel efficiency of 70.4 and 42.2 %, respectively.

  12. Fractures in sport: Optimising their management and outcome

    PubMed Central

    Robertson, Greg AJ; Wood, Alexander M

    2015-01-01

    Fractures in sport are a specialised cohort of fracture injuries, occurring in a high functioning population, in which the goals are rapid restoration of function and return to play with the minimal symptom profile possible. While the general principles of fracture management, namely accurate fracture reduction, appropriate immobilisation and timely rehabilitation, guide the treatment of these injuries, management of fractures in athletic populations can differ significantly from those in the general population, due to the need to facilitate a rapid return to high demand activities. However, despite fractures comprising up to 10% of all sporting injuries, dedicated research into the management and outcome of sport-related fractures is limited. In order to assess the optimal methods of treating such injuries, and so allow optimisation of their outcome, the evidence for the management of each specific sport-related fracture type requires assessment and analysis. We present and review the current evidence directing management of fractures in athletes with an aim to promote valid innovative methods and optimise the outcome of such injuries. From this, key recommendations are provided for the management of the common fracture types seen in the athlete. Six case reports are also presented to illustrate the management planning and application of sport-focussed fracture management in the clinical setting. PMID:26716081

  13. Demonstrating the suitability of genetic algorithms for driving microbial ecosystems in desirable directions.

    PubMed

    Vandecasteele, Frederik P J; Hess, Thomas F; Crawford, Ronald L

    2007-07-01

    The functioning of natural microbial ecosystems is determined by biotic interactions, which are in turn influenced by abiotic environmental conditions. Direct experimental manipulation of such conditions can be used to purposefully drive ecosystems toward exhibiting desirable functions. When a set of environmental conditions can be manipulated to be present at a discrete number of levels, finding the right combination of conditions to obtain the optimal desired effect becomes a typical combinatorial optimisation problem. Genetic algorithms are a class of robust and flexible search and optimisation techniques from the field of computer science that may be very suitable for such a task. To verify this idea, datasets containing growth levels of the total microbial community of four different natural microbial ecosystems in response to all possible combinations of a set of five chemical supplements were obtained. Subsequently, the ability of a genetic algorithm to search this parameter space for combinations of supplements driving the microbial communities to high levels of growth was compared to that of a random search, a local search, and a hill-climbing algorithm, three intuitive alternative optimisation approaches. The results indicate that a genetic algorithm is very suitable for driving microbial ecosystems in desirable directions, which opens opportunities for both fundamental ecological research and industrial applications.

  14. Production of biosolid fuels from municipal sewage sludge: Technical and economic optimisation.

    PubMed

    Wzorek, Małgorzata; Tańczuk, Mariusz

    2015-08-01

    The article presents the technical and economic analysis of the production of fuels from municipal sewage sludge. The analysis involved the production of two types of fuel compositions: sewage sludge with sawdust (PBT fuel) and sewage sludge with meat and bone meal (PBM fuel). The technology of the production line for these sewage fuels was proposed and analysed. The main objective of the study is to find the optimal production capacity. The optimisation analysis was performed for the adopted technical and economic parameters under Polish conditions. The objective function was set as the maximum of the net present value index, and the optimisation procedure was carried out for fuel production line input capacities from 0.5 to 3 t h⁻¹, using a search step of 0.5 t h⁻¹. On the basis of the technical and economic assumptions, economic efficiency indexes of the investment were determined for the case of optimal line productivity. The results of the optimisation analysis show that under appropriate conditions, such as prices of components and prices of produced fuels, the production of fuels from sewage sludge can be profitable. In the case of PBT fuel, the calculated economic indexes show the best profitability for plant capacities over 1.5 t h⁻¹, while production of PBM fuel is beneficial for a plant with the maximum of the searched capacities: 3.0 t h⁻¹. Sensitivity analyses carried out during the investigation show that the influence of both the technical and economic assessments on the location of the maximum of the objective function (net present value) is significant. © The Author(s) 2015.
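
    A hedged sketch of the search loop described above: scan the production line input capacity from 0.5 to 3.0 t/h in 0.5 t/h steps and keep the capacity with the highest net present value. All cost and revenue figures are invented placeholders, not values from the study.

      import numpy as np

      def npv(capacity_tph, years=15, rate=0.08):
          """Toy NPV model: investment and cash flow scale with capacity (hypothetical numbers)."""
          investment = 1.2e6 * capacity_tph ** 0.8            # capex with economies of scale
          annual_cash_flow = 4.0e5 * capacity_tph - 1.5e5     # revenue minus fixed costs
          discounted = sum(annual_cash_flow / (1 + rate) ** t for t in range(1, years + 1))
          return discounted - investment

      capacities = np.arange(0.5, 3.0 + 1e-9, 0.5)
      best = max(capacities, key=npv)
      print({round(float(c), 1): round(npv(c)) for c in capacities})
      print("best capacity [t/h]:", best)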

  15. Is Using the Strengths and Difficulties Questionnaire in a Community Sample the Optimal Way to Assess Mental Health Functioning?

    PubMed

    Vaz, Sharmila; Cordier, Reinie; Boyes, Mark; Parsons, Richard; Joosten, Annette; Ciccarelli, Marina; Falkmer, Marita; Falkmer, Torbjorn

    2016-01-01

    An important characteristic of a screening tool is its discriminant ability, or the measure's accuracy in distinguishing between those with and without mental health problems. The current study examined the inter-rater agreement and screening concordance of the parent and teacher versions of the SDQ at scale, subscale and item levels, with a view to identifying the items that have the most informant discrepancies, and determining whether the concordance between parent and teacher reports on some items has the potential to influence decision making. Cross-sectional data from parent and teacher reports of the mental health functioning of a community sample of 299 students with and without disabilities from 75 different primary schools in Perth, Western Australia were analysed. The study found that: a) intraclass correlations between parent and teacher ratings of children's mental health using the SDQ were fair at the individual child level; b) the SDQ only demonstrated clinical utility when there was agreement between teacher and parent reports using the possible or 90% dichotomisation system; and c) three individual items had positive likelihood ratio scores indicating clinical utility. Of note was the finding that the negative likelihood ratio, or the likelihood of disregarding the absence of a condition when both parents and teachers rate the item as absent, was not significant. Taken together, these findings suggest that the SDQ is not optimised for use in community samples and that further psychometric evaluation of the SDQ in this context is clearly warranted.

  16. Functional testing of topical skin formulations using an optimised ex vivo skin organ culture model.

    PubMed

    Sidgwick, G P; McGeorge, D; Bayat, A

    2016-07-01

    A number of equivalent-skin models are available for investigation of the ex vivo effect of topical application of drugs and cosmeceuticals onto skin; however, many have their drawbacks. With the March 2013 ban on animal models for cosmetic testing of products or ingredients for sale in the EU, their utility for testing toxicity and effect on skin becomes more relevant. The aim of this study was to demonstrate proof of principle that altered expression of key gene and protein markers could be quantified in an optimised whole tissue biopsy culture model. Topical formulations containing green tea catechins (GTC) were investigated in a skin biopsy culture model (n = 11). Punch biopsies were harvested at 3, 7 and 10 days, and analysed using qRT-PCR, histology and HPLC to determine gene and protein expression, and transdermal delivery of compounds of interest. Reduced gene expression of α-SMA, fibronectin, mast cell tryptase, mast cell chymase, TGF-β1, CTGF and PAI-1 was observed after 7 and 10 days compared with treated controls (p < 0.05). Histological analysis indicated a reduction in mast cell tryptase and chymase positive cell numbers in treated biopsies compared with untreated controls at day 7 and day 10 (p < 0.05). Determination of transdermal uptake indicated that GTCs were detected in the biopsies. This model could be adapted to study a range of different topical formulations in both normal and diseased skin, negating the requirement for animal models in this context, prior to study in a clinical trial environment.

  17. A stable solution-processed polymer semiconductor with record high-mobility for printed transistors

    PubMed Central

    Li, Jun; Zhao, Yan; Tan, Huei Shuan; Guo, Yunlong; Di, Chong-An; Yu, Gui; Liu, Yunqi; Lin, Ming; Lim, Suo Hon; Zhou, Yuhua; Su, Haibin; Ong, Beng S.

    2012-01-01

    Microelectronic circuits/arrays produced via high-speed printing instead of traditional photolithographic processes offer an appealing approach to creating the long-sought after, low-cost, large-area flexible electronics. Foremost among critical enablers to propel this paradigm shift in manufacturing is a stable, solution-processable, high-performance semiconductor for printing functionally capable thin-film transistors — fundamental building blocks of microelectronics. We report herein the processing and optimisation of solution-processable polymer semiconductors for thin-film transistors, demonstrating very high field-effect mobility, high on/off ratio, and excellent shelf-life and operating stabilities under ambient conditions. Exceptionally high-gain inverters and functional ring oscillator devices on flexible substrates have been demonstrated. This optimised polymer semiconductor represents a significant progress in semiconductor development, dispelling prevalent skepticism surrounding practical usability of organic semiconductors for high-performance microelectronic devices, opening up application opportunities hitherto functionally or economically inaccessible with silicon technologies, and providing an excellent structural framework for fundamental studies of charge transport in organic systems. PMID:23082244

  18. DC and analog/RF performance optimisation of source pocket dual work function TFET

    NASA Astrophysics Data System (ADS)

    Raad, Bhagwan Ram; Sharma, Dheeraj; Kondekar, Pravin; Nigam, Kaushal; Baronia, Sagar

    2017-12-01

    We present a systematic study of a source pocket tunnel field-effect transistor (SP TFET) with a dual work function of a single gate material, using uniform and Gaussian doping profiles in the drain region, for ultra-low-power, high-frequency, high-speed applications. An n+ doped region is created near the source/channel junction to decrease the depletion width, which improves the ON-state current. The dual work function of the double gate is used to enhance the device performance in terms of DC and analog/RF parameters. Further, to improve the high-frequency performance of the device, a Gaussian doping profile with different characteristic lengths is considered in the drain region; this decreases the gate-to-drain capacitance and leads to a marked improvement in analog/RF figures of merit. Furthermore, the optimisation is performed for different concentrations of the uniform and Gaussian drain doping profiles and for various sectional lengths of the lower-work-function gate electrode. Finally, the effect of temperature variation on the device performance is demonstrated.

  19. The robust model predictive control based on mixed H2/H∞ approach with separated performance formulations and its ISpS analysis

    NASA Astrophysics Data System (ADS)

    Li, Dewei; Li, Jiwei; Xi, Yugeng; Gao, Furong

    2017-12-01

    In practical applications, systems are always influenced by parameter uncertainties and external disturbances. Both the H2 performance and the H∞ performance are important in real applications. For a constrained system, previous designs of mixed H2/H∞ robust model predictive control (RMPC) optimise one performance with the other performance requirement as a constraint, so the two performances cannot be optimised at the same time. In this paper, an improved design of mixed H2/H∞ RMPC for polytopic uncertain systems with external disturbances is proposed to optimise them simultaneously. In the proposed design, the original uncertain system is decomposed into two subsystems using the additivity of linear systems. Two different Lyapunov functions are used to separately formulate the two performance indices for the two subsystems. Then, the proposed RMPC is designed to optimise both performances by the weighting method while satisfying the H∞ performance requirement. Meanwhile, to make the design more practical, a simplified design is also developed. The recursive feasibility conditions of the proposed RMPC are discussed and closed-loop input-to-state practical stability (ISpS) is proven. The numerical examples reflect the enlarged feasible region and the improved performance of the proposed design.

  20. Impact of the calibration period on the conceptual rainfall-runoff model parameter estimates

    NASA Astrophysics Data System (ADS)

    Todorovic, Andrijana; Plavsic, Jasna

    2015-04-01

    A conceptual rainfall-runoff model is defined by its structure and parameters, which are commonly inferred through model calibration. Parameter estimates depend on the objective function(s), the optimisation method and the calibration period. Model calibration over different periods may result in dissimilar parameter estimates, while model efficiency decreases outside the calibration period. The problem of model (parameter) transferability, which conditions the reliability of hydrologic simulations, has been investigated for decades. In this paper, the dependence of the parameter estimates and model performance on the calibration period is analysed. The main question addressed is: are there any changes in optimised parameters and model efficiency that can be linked to changes in hydrologic or meteorological variables (flow, precipitation and temperature)? The conceptual, semi-distributed HBV-light model is calibrated over five-year periods shifted by a year (sliding time windows). The length of the calibration periods is selected to enable identification of all parameters. One water year of model warm-up precedes every simulation, which starts with the beginning of a water year. The model is calibrated using the built-in GAP optimisation algorithm. The objective function used for calibration is composed of the Nash-Sutcliffe coefficient for flows, the Nash-Sutcliffe coefficient for logarithms of flows, and the volumetric error, all of which participate in the composite objective function with approximately equal weights. The same prior parameter ranges are used in all simulations. The model is calibrated against flows observed at the Slovac stream gauge on the Kolubara River in Serbia (records from 1954 to 2013). There are no trends in precipitation or flows; however, there is a statistically significant increasing trend in temperatures at this catchment. Parameter variability across the calibration periods is quantified in terms of standard deviations of normalised parameters, enabling detection of the most variable parameters. Correlation coefficients between optimised model parameters and total precipitation P, mean temperature T and mean flow Q are calculated to give an insight into parameter dependence on the hydrometeorological drivers. The results reveal high sensitivity of almost all model parameters to the calibration period. The highest variability is displayed by the refreezing coefficient, the water holding capacity and the temperature gradient. The only statistically significant (decreasing) trend is detected in the evapotranspiration reduction threshold. Statistically significant correlations are detected between the precipitation gradient and precipitation depth, and between the time-area histogram base and flows. All other correlations are not statistically significant, implying that changes in optimised parameters cannot generally be linked to the changes in P, T or Q. As for the model performance, the model reproduces the observed runoff satisfactorily, though the runoff is slightly overestimated in wet periods. The Nash-Sutcliffe efficiency coefficient (NSE) ranges from 0.44 to 0.79. Higher NSE values are obtained over wetter periods, which is supported by a statistically significant correlation between NSE and flows. Overall, no systematic variations in parameters or in model performance are detected. Parameter variability may therefore rather be attributed to errors in data or inadequacies in the model structure. Further research is required to examine the impact of the calibration strategy or model structure on the variability in optimised parameters in time.
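
    The composite objective described above can be written down compactly; the sketch below (ours, with synthetic flows) combines the Nash-Sutcliffe efficiency of flows, the Nash-Sutcliffe efficiency of log-flows and a volumetric-error term with approximately equal weights into a single score to maximise.

      import numpy as np

      def nse(obs, sim):
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def composite_objective(obs, sim, eps=0.01):
          nse_q = nse(obs, sim)
          nse_logq = nse(np.log(obs + eps), np.log(sim + eps))
          vol_err = 1.0 - abs(np.sum(sim - obs)) / np.sum(obs)   # 1 means no volume bias
          return (nse_q + nse_logq + vol_err) / 3.0              # approximately equal weights

      rng = np.random.default_rng(3)
      q_obs = np.abs(rng.gamma(2.0, 5.0, size=365))              # hypothetical daily flows
      q_sim = q_obs * (1.0 + 0.1 * rng.standard_normal(365))     # imperfect simulation
      print("composite objective:", round(composite_objective(q_obs, q_sim), 3))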

  1. Thermal-economic optimisation of a CHP gas turbine system by applying a fit-problem genetic algorithm

    NASA Astrophysics Data System (ADS)

    Ferreira, Ana C. M.; Teixeira, Senhorinha F. C. F.; Silva, Rui G.; Silva, Ângela M.

    2018-04-01

    Cogeneration allows the optimal use of primary energy sources and significant reductions in carbon emissions. Its use has great potential for applications in the residential sector. This study aims to develop a methodology for thermal-economic optimisation of a small-scale micro-gas turbine for cogeneration purposes, able to fulfil domestic energy needs with a thermal power output of 125 kW. A constrained non-linear optimisation model was built. The objective function is the maximisation of the annual worth of the combined heat and power system, representing the balance between the annual incomes and expenditures, subject to physical and economic constraints. A genetic algorithm coded in the Java programming language was developed. An optimal micro-gas turbine able to produce 103.5 kW of electrical power with a positive annual profit (i.e. 11,925 €/year) was identified. The investment can be recovered in 4 years and 9 months, which is less than half of the system's expected lifetime.
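
    As a back-of-the-envelope check (not the authors' economic model), an annual profit of 11,925 €/year and a simple, undiscounted payback of 4.75 years would imply an investment of roughly 56,600 €; the investment figure is inferred here purely for illustration.

      annual_profit = 11_925.0     # EUR/year, from the abstract
      payback_years = 4.75         # 4 years and 9 months, assuming no discounting

      implied_investment = annual_profit * payback_years
      print(f"implied investment ~ {implied_investment:,.0f} EUR")
      print(f"payback check: {implied_investment / annual_profit:.2f} years")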

  2. Vehicle trajectory linearisation to enable efficient optimisation of the constant speed racing line

    NASA Astrophysics Data System (ADS)

    Timings, Julian P.; Cole, David J.

    2012-06-01

    A driver model is presented that is capable of optimising the trajectory of a simple dynamic nonlinear vehicle, at constant forward speed, so that progression along a predefined track is maximised as a function of time. In doing so, the model is able to continually operate a vehicle at its lateral-handling limit, maximising vehicle performance. The technique forms part of the solution to the motor racing objective of minimising lap time. A new approach to formulating the minimum lap time problem is motivated by the need for a more computationally efficient and robust tool-set for understanding on-the-limit driving behaviour. This has been achieved through set-point-dependent linearisation of the vehicle model and coupling of the vehicle-track system using an intrinsic coordinate description. Through this, the geometric vehicle trajectory has been linearised relative to the track reference, leading to a new path optimisation algorithm which can be posed as a computationally efficient convex quadratic programming problem.

  3. Fault-tolerant optimised tracking control for unknown discrete-time linear systems using a combined reinforcement learning and residual compensation methodology

    NASA Astrophysics Data System (ADS)

    Han, Ke-Zhen; Feng, Jian; Cui, Xiaohong

    2017-10-01

    This paper considers the fault-tolerant optimised tracking control (FTOTC) problem for an unknown discrete-time linear system. A research scheme is proposed on the basis of data-based parity space identification, reinforcement learning and residual compensation techniques. The main characteristic of this scheme lies in the parity-space-identification-based simultaneous tracking control and residual compensation. The technical approach consists of four main elements: apply a subspace-aided method to design an observer-based residual generator; use a reinforcement Q-learning approach to solve the optimised tracking control policy; rely on robust H∞ theory to achieve noise attenuation; and adopt fault estimation triggered by the residual generator to perform fault compensation. To clarify the design and implementation procedures, an integrated algorithm is further constructed to link these four functional units. Detailed analysis and proof are subsequently given to explain the guaranteed FTOTC performance of the proposed scheme. Finally, a case simulation is provided to verify its effectiveness.

  4. Optimisation of substrate blends in anaerobic co-digestion using adaptive linear programming.

    PubMed

    García-Gen, Santiago; Rodríguez, Jorge; Lema, Juan M

    2014-12-01

    Anaerobic co-digestion of multiple substrates has the potential to enhance biogas productivity by making use of the complementary characteristics of different substrates. A blending strategy based on a linear programming optimisation method is proposed, aiming at maximising COD conversion into methane while simultaneously maintaining digestate and biogas quality. The method incorporates experimental and heuristic information to define the objective function and the linear restrictions. The active constraints are continuously adapted (by relaxing the restriction boundaries) such that further optimisations in terms of methane productivity can be achieved. The feasibility of the blends calculated with this methodology was previously tested and accurately predicted with an ADM1-based co-digestion model. This was validated in a continuously operated pilot plant, treating different mixtures of glycerine, gelatine and pig manure for several months at organic loading rates from 1.50 to 4.93 gCOD/Ld and hydraulic retention times between 32 and 40 days under mesophilic conditions. Copyright © 2014 Elsevier Ltd. All rights reserved.
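
    A minimal sketch of a blending LP in the spirit of the approach above (the real constraint set is built from experimental and heuristic information): choose substrate fractions that maximise the expected methane yield subject to an upper bound on blend nitrogen and the fractions summing to one. All per-substrate coefficients are invented.

      import numpy as np
      from scipy.optimize import linprog

      # hypothetical per-substrate properties for glycerine, gelatine and pig manure
      methane_yield = np.array([0.60, 0.45, 0.25])   # m3 CH4 per kg COD fed
      nitrogen = np.array([0.00, 0.12, 0.05])        # kg N per kg COD fed
      n_limit = 0.06                                 # allowed nitrogen content of the blend

      res = linprog(
          c=-methane_yield,                    # linprog minimises, so negate the yield
          A_ub=[nitrogen], b_ub=[n_limit],     # keep blend nitrogen below the limit
          A_eq=[[1.0, 1.0, 1.0]], b_eq=[1.0],  # fractions sum to one
          bounds=[(0.0, 1.0)] * 3,
          method="highs",
      )
      print("optimal fractions:", np.round(res.x, 3), "methane yield:", round(-res.fun, 3))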

  5. Cosmos caudatus as a potential source of polyphenolic compounds: optimisation of oven drying conditions and characterisation of its functional properties.

    PubMed

    Mediani, Ahmed; Abas, Faridah; Khatib, Alfi; Tan, Chin Ping

    2013-08-29

    The aim of the study was to analyze the influence of oven thermal processing of Cosmos caudatus on the total polyphenolic content (TPC) and antioxidant capacity (DPPH) of two different solvent extracts (80% methanol, and 80% ethanol). Sonication was used to extract bioactive compounds from this herb. The results showed that the optimised conditions for the oven drying method for 80% methanol and 80% ethanol were 44.5 °C for 4 h with an IC₅₀ of 0.045 mg/mL and 43.12 °C for 4.05 h with an IC₅₀ of 0.055 mg/mL, respectively. The predicted values for TPC under the optimised conditions for 80% methanol and 80% ethanol were 16.5 and 15.8 mg GAE/100 g DW, respectively. The results obtained from this study demonstrate that Cosmos caudatus can be used as a potential source of antioxidants for food and medicinal applications.

  6. Quantum chemical calculations of Cr2O3/SnO2 using density functional theory method

    NASA Astrophysics Data System (ADS)

    Jawaher, K. Rackesh; Indirajith, R.; Krishnan, S.; Robert, R.; Das, S. Jerome

    2018-03-01

    Quantum chemical calculations have been employed to study the molecular effects produced by the optimised Cr2O3/SnO2 structure. The theoretical parameters of the transparent conducting metal oxides were calculated using the DFT/B3LYP/LANL2DZ method. The optimised bond parameters, such as bond lengths, bond angles and dihedral angles, were calculated at the same level of theory. The non-linear optical property of the title compound was evaluated through a first-order hyperpolarisability calculation. The calculated HOMO-LUMO analysis explains the charge-transfer interaction within the molecule. In addition, MEP and Mulliken atomic charges were also calculated and analysed.

  7. Control of a flexible link by shaping the closed loop frequency response function through optimised feedback filters

    NASA Astrophysics Data System (ADS)

    Del Vescovo, D.; D'Ambrogio, W.

    1995-01-01

    A frequency domain method is presented to design a closed-loop control for vibration reduction in flexible mechanisms. The procedure is developed on a single-link flexible arm, driven by a servomotor with one rotary degree of freedom, although the same technique may be applied to similar systems such as supports for aerospace antennae or solar panels. The method uses the structural frequency response functions (FRFs), thus avoiding system identification, which introduces modelling uncertainties. Two closed loops are implemented: the inner loop uses acceleration feedback with the aim of making the FRF similar to that of an equivalent rigid link; the outer loop feeds back displacements to achieve a fast positioning response and zero steady-state error. In both cases, the controller type is established a priori, while its actual characteristics are defined by an optimisation procedure in which the relevant FRF is constrained into prescribed bounds and stability is taken into account.

  8. Petri-net-based 2D design of DNA walker circuits.

    PubMed

    Gilbert, David; Heiner, Monika; Rohr, Christian

    2018-01-01

    We consider localised DNA computation, where a DNA strand walks along a binary decision graph to compute a binary function. One of the challenges for the design of reliable walker circuits consists in leakage transitions, which occur when a walker jumps into another branch of the decision graph. We automatically identify leakage transitions, which allows for a detailed qualitative and quantitative assessment of circuit designs, design comparison, and design optimisation. The ability to identify leakage transitions is an important step in the process of optimising DNA circuit layouts where the aim is to minimise the computational error inherent in a circuit while minimising the area of the circuit. Our 2D modelling approach of DNA walker circuits relies on coloured stochastic Petri nets which enable functionality, topology and dimensionality all to be integrated in one two-dimensional model. Our modelling and analysis approach can be easily extended to 3-dimensional walker systems.

  9. Protecting complex infrastructures against multiple strategic attackers

    NASA Astrophysics Data System (ADS)

    Hausken, Kjell

    2011-01-01

    Infrastructures are analysed subject to defence by a strategic defender and attack by multiple strategic attackers. A framework is developed where each agent determines how much to invest in defending versus attacking each of multiple targets. A target can have economic, human and symbolic values, which generally vary across agents. Investment expenditure functions for each agent can be linear in the investment effort, concave, convex, logistic, can increase incrementally, or can be subject to budget constraints. Contest success functions (e.g., ratio and difference forms) determine the probability of a successful attack on each target, dependent on the relative investments of the defender and attackers on each target, and on characteristics of the contest. Targets can be in parallel, in series, interlinked, interdependent or independent. The defender minimises the expected damage plus the defence expenditures. Each attacker maximises the expected damage minus the attack expenditures. The number of free choice variables equals the number of agents times the number of targets, or lower if there are budget constraints. Each agent is interested in how his investments vary across the targets, and the impact on his utilities. Alternative optimisation programmes are discussed, together with repeated games, dynamic games and incomplete information. An example is provided for illustration.
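
    The ratio-form contest success function mentioned above has a simple closed form; the sketch below (our illustration of the standard form, not the paper's full multi-target model) computes the attack success probability and one attacker's expected utility for a single target.

      def ratio_contest_success(attack, defence, m=1.0):
          """p = attack^m / (attack^m + defence^m); returns 0 if neither side invests."""
          if attack == 0 and defence == 0:
              return 0.0
          return attack ** m / (attack ** m + defence ** m)

      def attacker_utility(attack, defence, target_value, m=1.0):
          """Expected damage inflicted on the target minus the attack expenditure."""
          return ratio_contest_success(attack, defence, m) * target_value - attack

      for a in (0.0, 1.0, 2.0, 5.0):
          print(a, round(ratio_contest_success(a, 2.0), 3),
                round(attacker_utility(a, 2.0, target_value=10.0), 3))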

  10. Biochemical methane potential (BMP) tests: Reducing test time by early parameter estimation.

    PubMed

    Da Silva, C; Astals, S; Peces, M; Campos, J L; Guerrero, L

    2018-01-01

    The biochemical methane potential (BMP) test is a key analytical technique to assess the implementation and optimisation of anaerobic biotechnologies. However, this technique is characterised by long testing times (from 20 to >100 days), which is not suitable for waste utilities, consulting companies or plant operators whose decision-making processes cannot be held up for such a long time. This study develops a statistically robust mathematical strategy using sensitivity functions for early prediction of BMP first-order model parameters, i.e. the methane yield (B0) and the kinetic rate constant (k). The minimum testing time for early parameter estimation showed a potential correlation with the k value, where (i) slowly biodegradable substrates (k ≤ 0.1 d⁻¹) have a minimum testing time of ≥15 days, (ii) moderately biodegradable substrates (0.1
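
    A sketch of the underlying first-order model (not the authors' sensitivity-function procedure): fit B(t) = B0*(1 - exp(-k*t)) to the early part of a cumulative methane curve to obtain preliminary estimates of B0 and k from a short test; the "measured" data below are synthetic.

      import numpy as np
      from scipy.optimize import curve_fit

      def first_order(t, b0, k):
          return b0 * (1.0 - np.exp(-k * t))

      t_days = np.arange(0, 16)                                  # hypothetical 15-day test
      rng = np.random.default_rng(7)
      b_meas = first_order(t_days, 350.0, 0.15) + rng.normal(0.0, 5.0, t_days.size)

      (b0_hat, k_hat), _ = curve_fit(first_order, t_days, b_meas, p0=(300.0, 0.1))
      print(f"B0 ~ {b0_hat:.0f} mL CH4/g VS, k ~ {k_hat:.3f} 1/d")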

  11. Estimating economic value of agricultural water under changing conditions and the effects of spatial aggregation.

    PubMed

    Medellín-Azuara, Josué; Harou, Julien J; Howitt, Richard E

    2010-11-01

    Given the high proportion of water used for agriculture in certain regions, the economic value of agricultural water can be an important tool for water management and policy development. This value is quantified using economic demand curves for irrigation water. Such demand functions show the incremental contribution of water to agricultural production. Water demand curves are estimated using econometric or optimisation techniques. Calibrated agricultural optimisation models allow the derivation of demand curves using smaller datasets than econometric models. This paper introduces these subject areas then explores the effect of spatial aggregation (upscaling) on the valuation of water for irrigated agriculture. A case study from the Rio Grande-Rio Bravo Basin in North Mexico investigates differences in valuation at farm and regional aggregated levels under four scenarios: technological change, warm-dry climate change, changes in agricultural commodity prices, and water costs for agriculture. The scenarios consider changes due to external shocks or new policies. Positive mathematical programming (PMP), a calibrated optimisation method, is the deductive valuation method used. An exponential cost function is compared to the quadratic cost functions typically used in PMP. Results indicate that the economic value of water at the farm level and the regionally aggregated level are similar, but that the variability and distributional effects of each scenario are affected by aggregation. Moderately aggregated agricultural production models are effective at capturing average-farm adaptation to policy changes and external shocks. Farm-level models best reveal the distribution of scenario impacts. Copyright © 2009 Elsevier B.V. All rights reserved.

  12. Imaging basal ganglia function

    PubMed Central

    BROOKS, DAVID J.

    2000-01-01

    In this review, the value of functional imaging for providing insight into the role of the basal ganglia in motor control is reviewed. Brain activation findings in normal subjects and Parkinson's disease patients are examined and evidence supporting the existence for functionally independent distributed basal ganglia-frontal loops is presented. It is argued that the basal ganglia probably act to focus and filter cortical output, optimising the running of motor programs. PMID:10923986

  13. Reinforcement learning or active inference?

    PubMed

    Friston, Karl J; Daunizeau, Jean; Kiebel, Stefan J

    2009-07-29

    This paper questions the need for reinforcement learning or control theory when optimising behaviour. We show that it is fairly simple to teach an agent complicated and adaptive behaviours using a free-energy formulation of perception. In this formulation, agents adjust their internal states and sampling of the environment to minimize their free-energy. Such agents learn causal structure in the environment and sample it in an adaptive and self-supervised fashion. This results in behavioural policies that reproduce those optimised by reinforcement learning and dynamic programming. Critically, we do not need to invoke the notion of reward, value or utility. We illustrate these points by solving a benchmark problem in dynamic programming; namely the mountain-car problem, using active perception or inference under the free-energy principle. The ensuing proof-of-concept may be important because the free-energy formulation furnishes a unified account of both action and perception and may speak to a reappraisal of the role of dopamine in the brain.

  14. An integrated framework for the optimisation of sport and athlete development: a practitioner approach.

    PubMed

    Gulbin, Jason P; Croser, Morag J; Morley, Elissa J; Weissensteiner, Juanita R

    2013-01-01

    This paper introduces a new sport and athlete development framework that has been generated by multidisciplinary sport practitioners. By combining current theoretical research perspectives with extensive empirical observations from one of the world's leading sport agencies, the proposed FTEM (Foundations, Talent, Elite, Mastery) framework offers broad utility to researchers and sporting stakeholders alike. FTEM is unique in comparison with alternative models and frameworks, because it: integrates general and specialised phases of development for participants within the active lifestyle, sport participation and sport excellence pathways; typically doubles the number of developmental phases (n = 10) in order to better understand athlete transition; avoids chronological and training prescriptions; more optimally establishes a continuum between participation and elite; and allows full inclusion of many developmental support drivers at the sport and system levels. The FTEM framework offers a viable and more flexible alternative for those sporting stakeholders interested in managing, optimising, and researching sport and athlete development pathways.

  15. H2/H∞ control for grid-feeding converter considering system uncertainty

    NASA Astrophysics Data System (ADS)

    Li, Zhongwen; Zang, Chuanzhi; Zeng, Peng; Yu, Haibin; Li, Shuhui; Fu, Xingang

    2017-05-01

    Three-phase grid-feeding converters (GFCs) are key components for integrating distributed generation and renewable power sources into the power utility. Conventionally, proportional integral and proportional resonant-based control strategies are applied to control the output power or current of a GFC. However, those control strategies have poor transient performance and are not robust against uncertainties and volatilities in the system. This paper proposes an H2/H∞-based control strategy which can mitigate these restrictions. Uncertainty and disturbance are included in the formulation of the GFC state-space model, making it a more accurate reflection of practical system conditions. The paper uses a convex optimisation method to design the H2/H∞-based optimal controller. Instead of using a guess-and-check method, the paper uses particle swarm optimisation to search for an H2/H∞-optimal controller. Several case studies, implemented in both simulation and experiment, verify the superiority of the proposed control strategy over traditional PI control methods, especially under dynamic and variable system conditions.
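
    A minimal particle swarm optimisation sketch in the spirit of the controller search described above; the real design tunes the H2/H∞ controller on the converter model, whereas the cost function here is a stand-in quadratic with a known optimum near gains (3, 0.5).

      import numpy as np

      def pso(cost, lower, upper, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5, seed=0):
          rng = np.random.default_rng(seed)
          dim = lower.size
          x = rng.uniform(lower, upper, size=(n_particles, dim))
          v = np.zeros_like(x)
          p_best, p_cost = x.copy(), np.array([cost(p) for p in x])
          g_best = p_best[p_cost.argmin()].copy()
          for _ in range(n_iter):
              r1, r2 = rng.random((2, n_particles, dim))
              v = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
              x = np.clip(x + v, lower, upper)
              costs = np.array([cost(p) for p in x])
              improved = costs < p_cost
              p_best[improved], p_cost[improved] = x[improved], costs[improved]
              g_best = p_best[p_cost.argmin()].copy()
          return g_best, p_cost.min()

      cost = lambda g: (g[0] - 3.0) ** 2 + 10.0 * (g[1] - 0.5) ** 2   # hypothetical controller cost
      best, best_cost = pso(cost, np.array([0.0, 0.0]), np.array([10.0, 5.0]))
      print("best gains:", np.round(best, 3), "cost:", round(best_cost, 4))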

  16. Review shows that parental reassurance and nutritional advice help to optimise the management of functional gastrointestinal disorders in infants.

    PubMed

    Salvatore, Silvia; Abkari, Abdelhak; Cai, Wei; Catto-Smith, Anthony; Cruchet, Sylvia; Gottrand, Frederic; Hegar, Badriul; Lifschitz, Carlos; Ludwig, Thomas; Shah, Neil; Staiano, Annamaria; Szajewska, Hania; Treepongkaruna, Suporn; Vandenplas, Yvan

    2018-04-30

    Regurgitation, infantile colic and functional constipation are common functional gastrointestinal disorders (FGIDs) during infancy. Our aim was to carry out a concise review of the literature, evaluate the impact of these common FGIDs on infants and their families, and provide an overview of national and international guidelines and peer-reviewed expert recommendations on their management. National and international guidelines and peer-reviewed expert recommendations on the management of regurgitation, infantile colic and functional constipation were examined and summarised. Regurgitation, infantile colic and functional constipation cause frequent parental concerns, lead to heavy personal and economic costs for families and impose a financial burden on public healthcare systems. Guidelines emphasise that the first-line management of these common FGIDs should focus on parental education, reassurance and nutritional advice. Nutritional advice should stress the benefits of continuing breastfeeding, while special infant formulas may be considered for non-breastfed infants with common FGIDs. Drug treatment is seldom required, with the exception of functional constipation. By providing complete and updated parental education, reassurance and nutritional advice, healthcare professionals can optimise the management of FGIDs and related symptoms and reduce the inappropriate use of medication or dietary interventions. ©2018 The Authors. Acta Paediatrica published by John Wiley & Sons Ltd on behalf of Foundation Acta Paediatrica.

  17. Design of optimised backstepping controller for the synchronisation of chaotic Colpitts oscillator using shark smell algorithm

    NASA Astrophysics Data System (ADS)

    Fouladi, Ehsan; Mojallali, Hamed

    2018-01-01

    In this paper, an adaptive backstepping controller has been tuned to synchronise two chaotic Colpitts oscillators in a master-slave configuration. The parameters of the controller are determined using the shark smell optimisation (SSO) algorithm. Numerical results are presented and compared with those of the particle swarm optimisation (PSO) algorithm. Simulation results show better accuracy and convergence for the proposed method compared to a PSO-optimised controller or a non-optimised backstepping controller.

  18. Improving the Fit of a Land-Surface Model to Data Using its Adjoint

    NASA Astrophysics Data System (ADS)

    Raoult, Nina; Jupp, Tim; Cox, Peter; Luke, Catherine

    2016-04-01

    Land-surface models (LSMs) are crucial components of the Earth System Models (ESMs) which are used to make coupled climate-carbon cycle projections for the 21st century. The Joint UK Land Environment Simulator (JULES) is the land-surface model used in the climate and weather forecast models of the UK Met Office. In this study, JULES is automatically differentiated using commercial software from FastOpt, resulting in an analytical gradient, or adjoint, of the model. Using this adjoint, the adJULES parameter estimation system has been developed, to search for locally optimum parameter sets by calibrating against observations. We present an introduction to the adJULES system and demonstrate its ability to improve the model-data fit using eddy covariance measurements of gross primary production (GPP) and latent heat (LE) fluxes. adJULES also has the ability to calibrate over multiple sites simultaneously. This feature is used to define new optimised parameter values for the 5 Plant Functional Types (PFTs) in JULES. The optimised PFT-specific parameters improve the performance of JULES at over 90% of the FLUXNET sites used in the study. These reductions in error are shown and compared to reductions found due to site-specific optimisations. Finally, we show that calculation of the 2nd derivative of JULES allows us to produce posterior probability density functions of the parameters and to quantify how knowledge of parameter values is constrained by observations.

  19. Improved packing of protein side chains with parallel ant colonies.

    PubMed

    Quan, Lijun; Lü, Qiang; Li, Haiou; Xia, Xiaoyan; Wu, Hongjie

    2014-01-01

    The accurate packing of protein side chains is important for many computational biology problems, such as ab initio protein structure prediction, homology modelling, and protein design and ligand docking applications. Many existing solutions model the packing task as a computational optimisation problem. As well as the design of search algorithms, most solutions suffer from an inaccurate energy function for judging whether a prediction is good or bad. Even if the search has found the lowest energy, there is no certainty of obtaining the protein structures with correct side chains. We present a side-chain modelling method, pacoPacker, which uses a parallel ant colony optimisation strategy based on sharing a single pheromone matrix. This parallel approach combines different sources of energy functions and generates protein side-chain conformations with the lowest energies jointly determined by the various energy functions. We further optimised the selected rotamers to construct subrotamers by rotamer minimisation, which reasonably improved the discreteness of the rotamer library. We focused on improving the accuracy of side-chain conformation prediction. For a testing set of 442 proteins, 87.19% of X1 and 77.11% of X12 angles were predicted correctly within 40° of the X-ray positions. We compared the accuracy of pacoPacker with state-of-the-art methods, such as CIS-RR and SCWRL4. We analysed the results from different perspectives, in terms of protein chain and individual residues. In this comprehensive benchmark testing, 51.5% of proteins within a length of 400 amino acids predicted by pacoPacker were superior to the results of CIS-RR and SCWRL4 simultaneously. Finally, we also showed the advantage of using the subrotamers strategy. All results confirmed that our parallel approach is competitive with state-of-the-art solutions for packing side chains. This parallel approach combines various sources of searching intelligence and energy functions to pack protein side chains. It provides a framework for combining different inaccuracy/usefulness objective functions by designing parallel heuristic search algorithms.
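
    As a rough illustration of the search strategy described above, the toy sketch below runs a colony of ants over a shared pheromone matrix to pick one rotamer per residue; the energy table is random and merely stands in for the combined force-field terms used by pacoPacker.

    ```python
    import numpy as np

    def aco_pack(energy, n_res, n_rot, ants=20, iters=100, rho=0.1, seed=0):
        """Toy ant colony search: pick one rotamer per residue to minimise an
        assignment energy. `energy(assign)` scores a full assignment."""
        rng = np.random.default_rng(seed)
        pher = np.ones((n_res, n_rot))          # shared pheromone matrix
        best_assign, best_e = None, np.inf
        for _ in range(iters):
            for _ in range(ants):
                p = pher / pher.sum(axis=1, keepdims=True)
                assign = np.array([rng.choice(n_rot, p=p[i]) for i in range(n_res)])
                e = energy(assign)
                if e < best_e:
                    best_assign, best_e = assign, e
            pher *= (1.0 - rho)                                  # evaporation
            pher[np.arange(n_res), best_assign] += 1.0           # reinforce best tour
        return best_assign, best_e

    # Placeholder energy table standing in for the combined force-field terms.
    rng = np.random.default_rng(1)
    table = rng.normal(size=(10, 5))             # 10 residues, 5 rotamers each
    demo_energy = lambda a: float(table[np.arange(10), a].sum())
    print(aco_pack(demo_energy, n_res=10, n_rot=5))
    ```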

  20. Sampling design optimisation for rainfall prediction using a non-stationary geostatistical model

    NASA Astrophysics Data System (ADS)

    Wadoux, Alexandre M. J.-C.; Brus, Dick J.; Rico-Ramirez, Miguel A.; Heuvelink, Gerard B. M.

    2017-09-01

    The accuracy of spatial predictions of rainfall by merging rain-gauge and radar data is partly determined by the sampling design of the rain-gauge network. Optimising the locations of the rain-gauges may increase the accuracy of the predictions. Existing spatial sampling design optimisation methods are based on minimisation of the spatially averaged prediction error variance under the assumption of intrinsic stationarity. Over the past years, substantial progress has been made to deal with non-stationary spatial processes in kriging. Various well-documented geostatistical models relax the assumption of stationarity in the mean, while recent studies show the importance of considering non-stationarity in the variance for environmental processes occurring in complex landscapes. We optimised the sampling locations of rain-gauges using an extension of the Kriging with External Drift (KED) model for prediction of rainfall fields. The model incorporates both non-stationarity in the mean and in the variance, which are modelled as functions of external covariates such as radar imagery, distance to radar station and radar beam blockage. Spatial predictions are made repeatedly over time, each time recalibrating the model. The space-time averaged KED variance was minimised by Spatial Simulated Annealing (SSA). The methodology was tested using a case study predicting daily rainfall in the north of England for a one-year period. Results show that (i) the proposed non-stationary variance model outperforms the stationary variance model, and (ii) a small but significant decrease of the rainfall prediction error variance is obtained with the optimised rain-gauge network. In particular, it pays off to place rain-gauges at locations where the radar imagery is inaccurate, while keeping the distribution over the study area sufficiently uniform.
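
    A minimal sketch of spatial simulated annealing for gauge placement is given below; because the full KED variance requires the calibrated geostatistical model, the objective here is a simple space-filling stand-in (mean distance from prediction points to the nearest gauge), and all constants are assumptions.

    ```python
    import numpy as np

    def criterion(gauges, grid):
        """Stand-in objective: mean distance from prediction points to the nearest
        gauge (the paper minimises the space-time averaged KED variance instead)."""
        d = np.linalg.norm(grid[:, None, :] - gauges[None, :, :], axis=2)
        return d.min(axis=1).mean()

    def ssa_optimise(n_gauges=20, iters=2000, t0=0.1, seed=0):
        rng = np.random.default_rng(seed)
        xs = np.linspace(0, 1, 30)
        grid = np.array([(x, y) for x in xs for y in xs])       # prediction grid
        gauges = rng.random((n_gauges, 2))                      # initial network
        obj = criterion(gauges, grid)
        for k in range(iters):
            temp = t0 * (1.0 - k / iters)                       # linear cooling
            cand = gauges.copy()
            i = rng.integers(n_gauges)
            cand[i] = np.clip(cand[i] + rng.normal(scale=0.05, size=2), 0, 1)
            new = criterion(cand, grid)
            if new < obj or rng.random() < np.exp(-(new - obj) / max(temp, 1e-9)):
                gauges, obj = cand, new                         # accept the move
        return gauges, obj

    gauges, obj = ssa_optimise()
    print(f"final mean nearest-gauge distance: {obj:.4f}")
    ```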

  1. Sequential projection pursuit for optimised vibration-based damage detection in an experimental wind turbine blade

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2018-02-01

    To advance the concept of smart structures in large systems, such as wind turbines (WTs), it is desirable to be able to detect structural damage early while using minimal instrumentation. Data-driven vibration-based damage detection methods can be competitive in that respect because global vibrational responses encompass the entire structure. Multivariate damage sensitive features (DSFs) extracted from acceleration responses enable changes in a structure to be detected via statistical methods. However, even though such DSFs contain information about the structural state, they may not be optimised for the damage detection task. This paper addresses this shortcoming by exploring a DSF projection technique specialised for statistical structural damage detection. High-dimensional initial DSFs are projected onto a low-dimensional space for improved damage detection performance and simultaneous computational burden reduction. The technique is based on sequential projection pursuit where the projection vectors are optimised one by one using an advanced evolutionary strategy. The approach is applied to laboratory experiments with a small-scale WT blade under wind-like excitations. Autocorrelation function coefficients calculated from acceleration signals are employed as DSFs. The optimal numbers of projection vectors are identified with the help of a fast forward selection procedure. To benchmark the proposed method, selections of original DSFs as well as principal component analysis scores from these features are additionally investigated. The optimised DSFs are tested for damage detection on previously unseen data from the healthy state and a wide range of damage scenarios. It is demonstrated that using selected subsets of the initial and transformed DSFs improves damage detectability compared to the full set of features. Furthermore, superior results can be achieved by projecting autocorrelation coefficients onto just a single optimised projection vector.
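
    A toy version of the idea, under assumed synthetic data, is sketched below: autocorrelation coefficients are extracted as DSFs and a single unit-norm projection vector is optimised with a simple (1+1) evolution strategy to maximise a Fisher-type separation between healthy and damaged records. The paper's advanced evolutionary strategy and experimental data are not reproduced here.

    ```python
    import numpy as np

    def acf_features(signal, n_lags=20):
        """Autocorrelation-coefficient damage-sensitive features for one record."""
        s = signal - signal.mean()
        full = np.correlate(s, s, mode="full")[len(s) - 1:]
        return full[1:n_lags + 1] / full[0]

    def fisher_ratio(w, healthy, damaged):
        """Separation of the two classes along projection w."""
        ph, pd = healthy @ w, damaged @ w
        return (ph.mean() - pd.mean()) ** 2 / (ph.var() + pd.var() + 1e-12)

    def optimise_projection(healthy, damaged, iters=500, sigma=0.2, seed=0):
        """(1+1) evolution strategy over a single unit-norm projection vector."""
        rng = np.random.default_rng(seed)
        w = rng.normal(size=healthy.shape[1]); w /= np.linalg.norm(w)
        best = fisher_ratio(w, healthy, damaged)
        for _ in range(iters):
            cand = w + sigma * rng.normal(size=w.size)
            cand /= np.linalg.norm(cand)
            val = fisher_ratio(cand, healthy, damaged)
            if val > best:
                w, best = cand, val
        return w, best

    # Synthetic records: damage modelled as a small shift in spectral content.
    rng = np.random.default_rng(1)
    t = np.arange(2000)
    healthy = np.array([acf_features(np.sin(0.10 * t) + rng.normal(size=t.size)) for _ in range(30)])
    damaged = np.array([acf_features(np.sin(0.11 * t) + rng.normal(size=t.size)) for _ in range(30)])
    w, sep = optimise_projection(healthy, damaged)
    print(f"Fisher ratio along the optimised projection: {sep:.2f}")
    ```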

  2. Optimal coordinated voltage control in active distribution networks using backtracking search algorithm

    PubMed Central

    Tengku Hashim, Tengku Juhana; Mohamed, Azah

    2017-01-01

    The growing interest in distributed generation (DG) in recent years has led to a number of generators being connected to a distribution system. The integration of DGs in a distribution system has resulted in a network known as an active distribution network, due to the existence of bidirectional power flow in the system. The voltage rise issue is one of the most important technical issues to be addressed when DGs exist in an active distribution network. This paper presents the application of the backtracking search algorithm (BSA), which is a relatively new optimisation technique, to determine the optimal settings of coordinated voltage control in a distribution system. The coordinated voltage control considers power factor, on-load tap-changer and generation curtailment control to manage the voltage rise issue. A multi-objective function is formulated to minimise total losses and voltage deviation in a distribution system. The proposed BSA is compared with particle swarm optimisation (PSO) so as to evaluate its effectiveness in determining the optimal settings of power factor, tap-changer and percentage of active power generation to be curtailed. The load flow algorithm from MATPOWER is integrated in the MATLAB environment to solve the multi-objective optimisation problem. Both the BSA and PSO optimisation techniques have been tested on a radial 13-bus distribution system and the results show that the BSA performs better than PSO by providing a better fitness value and convergence rate. PMID:28991919
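
    The multi-objective function described above can be pictured as a weighted sum of losses and voltage deviation, as in the hedged sketch below; the weights and the per-unit voltages are illustrative, since in the paper these quantities come from a MATPOWER load-flow solution for each candidate control setting.

    ```python
    import numpy as np

    def fitness(total_loss_mw, bus_voltages_pu, w_loss=0.5, w_volt=0.5, v_ref=1.0):
        """Weighted objective combining network losses and voltage deviation.
        The weights and the per-unit inputs are illustrative; in the paper the
        losses and voltages come from a MATPOWER load-flow solution for a given
        setting of power factor, tap position and curtailment level."""
        v = np.asarray(bus_voltages_pu, dtype=float)
        voltage_deviation = np.sum((v - v_ref) ** 2)
        return w_loss * total_loss_mw + w_volt * voltage_deviation

    # Example: 13 buses with a mild voltage rise near the DG connection point.
    volts = np.array([1.0, 1.0, 1.01, 1.01, 1.02, 1.03, 1.04, 1.05,
                      1.05, 1.04, 1.03, 1.02, 1.01])
    print(fitness(total_loss_mw=0.35, bus_voltages_pu=volts))
    ```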

  3. Optimal coordinated voltage control in active distribution networks using backtracking search algorithm.

    PubMed

    Tengku Hashim, Tengku Juhana; Mohamed, Azah

    2017-01-01

    The growing interest in distributed generation (DG) in recent years has led to a number of generators being connected to a distribution system. The integration of DGs in a distribution system has resulted in a network known as an active distribution network, due to the existence of bidirectional power flow in the system. The voltage rise issue is one of the most important technical issues to be addressed when DGs exist in an active distribution network. This paper presents the application of the backtracking search algorithm (BSA), which is a relatively new optimisation technique, to determine the optimal settings of coordinated voltage control in a distribution system. The coordinated voltage control considers power factor, on-load tap-changer and generation curtailment control to manage the voltage rise issue. A multi-objective function is formulated to minimise total losses and voltage deviation in a distribution system. The proposed BSA is compared with particle swarm optimisation (PSO) so as to evaluate its effectiveness in determining the optimal settings of power factor, tap-changer and percentage of active power generation to be curtailed. The load flow algorithm from MATPOWER is integrated in the MATLAB environment to solve the multi-objective optimisation problem. Both the BSA and PSO optimisation techniques have been tested on a radial 13-bus distribution system and the results show that the BSA performs better than PSO by providing a better fitness value and convergence rate.

  4. Optimal control of LQR for discrete time-varying systems with input delays

    NASA Astrophysics Data System (ADS)

    Yin, Yue-Zhu; Yang, Zhong-Lian; Yin, Zhi-Xiang; Xu, Feng

    2018-04-01

    In this work, we consider the optimal control problem of linear quadratic regulation for discrete time-varying systems with a single input and multiple input delays. An innovative and simple method to derive the optimal controller is given. The studied problem is first equivalently converted into a problem subject to a constraint condition. Then, using the established duality, the problem is transformed into a static mathematical optimisation problem without input delays. The optimal control input that minimises the performance index function is derived by solving this optimisation problem with two methods. A numerical simulation example is carried out and its results show that both approaches are feasible and very effective.
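
    For orientation, the sketch below solves the standard delay-free discrete-time LQR problem via the algebraic Riccati equation using SciPy; it deliberately omits the duality-based treatment of input delays that is the paper's contribution, and the example system matrices are invented.

    ```python
    import numpy as np
    from scipy.linalg import solve_discrete_are

    def dlqr(A, B, Q, R):
        """Standard (delay-free) discrete-time LQR gain via the algebraic Riccati
        equation; the paper's duality-based handling of input delays is omitted."""
        P = solve_discrete_are(A, B, Q, R)
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # u_k = -K x_k
        return K, P

    # Illustrative second-order system (not taken from the paper).
    A = np.array([[1.0, 0.1], [0.0, 1.0]])
    B = np.array([[0.0], [0.1]])
    Q = np.eye(2)
    R = np.array([[1.0]])
    K, P = dlqr(A, B, Q, R)
    print("LQR gain:", K)
    ```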

  5. Chromatic perception of non-invasive lighting of cave paintings

    NASA Astrophysics Data System (ADS)

    Zoido, Jesús; Vazquez, Daniel; Álvarez, Antonio; Bernabeu, Eusebio; García, Ángel; Herraez, Juán A.; del Egido, Marian

    2009-08-01

    This work deals with the problems which arise when illuminating Paleolithic cave paintings. We have carried out the spectral and colorimetric characterization of some paintings located in the Murcielagos (bats) cave (Zuheros, Córdoba, Spain). From this characterization, the chromatic changes produced under different lighting conditions are analysed. The damage function is also computed for the different illuminants used. From the results obtained, an illuminant is proposed whose spectral distribution diminishes the damage by minimizing the absorption of radiation and optimises the color perception of the paintings in this cave. The procedure followed in this study can be applied to optimise the lighting systems used when illuminating any other artwork.

  6. A model for nematode locomotion in soil

    USGS Publications Warehouse

    Hunt, H. William; Wall, Diana H.; DeCrappeo, Nicole; Brenner, John S.

    2001-01-01

    Locomotion of nematodes in soil is important for both practical and theoretical reasons. We constructed a model for rate of locomotion. The first model component is a simple simulation of nematode movement among finite cells by both random and directed behaviours. Optimisation procedures were used to fit the simulation output to data from published experiments on movement along columns of soil or washed sand, and thus to estimate the values of the model's movement coefficients. The coefficients then provided an objective means to compare rates of locomotion among studies done under different experimental conditions. The second component of the model is an equation to predict the movement coefficients as a function of controlling factors that have been addressed experimentally: soil texture, bulk density, water potential, temperature, trophic group of nematode, presence of an attractant or physical gradient and the duration of the experiment. Parameters of the equation were estimated by optimisation to achieve a good fit to the estimated movement coefficients. Bulk density, which has been reported in a minority of published studies, is predicted to have an important effect on rate of locomotion, at least in fine-textured soils. Soil sieving, which appears to be a universal practice in laboratory studies of nematode movement, is predicted to negatively affect locomotion. Slower movement in finer textured soils would be expected to increase isolation among local populations, and thus to promote species richness. Future additions to the model that might improve its utility include representing heterogeneity within populations in rate of movement, development of gradients of chemical attractants, trade-offs between random and directed components of movement, species differences in optimal temperature and water potential, and interactions among factors controlling locomotion.
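
    A toy version of the first model component might look like the sketch below: nematodes hop between finite cells with a random component and a directed bias towards an attractant; the movement probabilities are invented for illustration rather than fitted coefficients.

    ```python
    import numpy as np

    def simulate_column(n_cells=20, n_worms=1000, steps=200,
                        p_move=0.3, bias=0.1, seed=0):
        """Toy cell-hopping model: each step a nematode moves to an adjacent cell
        with probability `p_move`; `bias` shifts moves towards the attractant end
        (cell n_cells-1). Coefficients are illustrative, not fitted values."""
        rng = np.random.default_rng(seed)
        pos = np.zeros(n_worms, dtype=int)                 # all start in cell 0
        for _ in range(steps):
            moving = rng.random(n_worms) < p_move
            direction = np.where(rng.random(n_worms) < 0.5 + bias, 1, -1)
            pos = np.clip(pos + moving * direction, 0, n_cells - 1)
        return np.bincount(pos, minlength=n_cells)         # worms per cell

    counts = simulate_column()
    print("distribution along the column:", counts)
    ```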

  7. Obstacle evasion in free-space optical communications utilizing Airy beams

    NASA Astrophysics Data System (ADS)

    Zhu, Guoxuan; Wen, Yuanhui; Wu, Xiong; Chen, Yujie; Liu, Jie; Yu, Siyuan

    2018-03-01

    A high speed free-space optical communication system capable of self-bending signal transmission around line-of-sight obstacles is proposed and demonstrated. Airy beams are generated and controlled to achieve different propagating trajectories, and the signal transmission characteristics of these beams around the obstacle are investigated. Our results confirm that, by optimising their ballistic trajectories, Airy beams are able to bypass obstacles with more signal energy and thus improve the communication performance compared with normal Gaussian beams.

  8. Optimisation of flight dynamic control based on many-objectives meta-heuristic: a comparative study

    NASA Astrophysics Data System (ADS)

    Bureerat, Sujin; Pholdee, Nantiwat; Radpukdee, Thana

    2018-05-01

    The development of many-objective meta-heuristics (MnMHs) is currently a topic of interest, as they are suited to real optimisation applications, which usually involve many objectives. However, most MnMHs have been developed and tested on standard test functions, while their use in real applications is rare. Therefore, in this work, MnMHs are applied to the optimisation design of flight dynamic control. The design problem is posed as finding control gains that minimise the control effort, the spiral root, the damping-in-roll root and the sideslip angle deviation, and maximise the damping ratio of the Dutch-roll complex pair, the Dutch-roll frequency and the bank angle at the pre-specified times of 1 second and 2.8 seconds, subject to several constraints based on the Military Specifications (1969) requirements. Several established MnMHs are used to solve the problem and their performance for flight control is compared. The results obtained will be the baseline for future development of flight dynamics and control.

  9. Optimisation of ultrasound-assisted reverse micelles dispersive liquid-liquid micro-extraction by Box-Behnken design for determination of acetoin in butter followed by high performance liquid chromatography.

    PubMed

    Roosta, Mostafa; Ghaedi, Mehrorang; Daneshfar, Ali

    2014-10-15

    A novel approach, ultrasound-assisted reverse micelles dispersive liquid-liquid microextraction (USA-RM-DLLME) followed by high performance liquid chromatography (HPLC), was developed for the selective determination of acetoin in butter. The melted butter sample was diluted with n-hexane and homogenised with Triton X-100. Subsequently, 400 μL of distilled water was added and the microextraction was accelerated by 4 min of sonication. After 8.5 min of centrifugation, the sedimented (surfactant-rich) phase was withdrawn by microsyringe and injected into the HPLC system for analysis. The influence of the effective variables was optimised using a Box-Behnken design (BBD) combined with a desirability function (DF). Under the optimised experimental conditions, the calibration graph was linear over the range of 0.6-200 mg L(-1). The detection limit of the method was 0.2 mg L(-1) and the coefficient of determination was 0.9992. The relative standard deviations (RSDs) were less than 5% (n = 5) while the recoveries were in the range of 93.9-107.8%. Copyright © 2014. Published by Elsevier Ltd.
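
    The desirability-function step can be illustrated with a Derringer-type construction, as sketched below; the response values, limits and weights are hypothetical and are not the study's optimised conditions.

    ```python
    import numpy as np

    def desirability_larger_is_better(y, low, high, weight=1.0):
        """Derringer-type individual desirability for a response to be maximised:
        0 below `low`, 1 above `high`, a power ramp in between."""
        d = np.clip((y - low) / (high - low), 0.0, 1.0)
        return d ** weight

    def overall_desirability(ds):
        """Geometric mean of the individual desirabilities."""
        ds = np.asarray(ds, dtype=float)
        return float(np.prod(ds) ** (1.0 / len(ds)))

    # Hypothetical responses for one design point: recovery (%) and peak area.
    d1 = desirability_larger_is_better(101.2, low=90.0, high=110.0)
    d2 = desirability_larger_is_better(0.85, low=0.5, high=1.0)
    print(overall_desirability([d1, d2]))
    ```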

  10. Optimising mobile phase composition, its flow-rate and column temperature in HPLC using taboo search.

    PubMed

    Guillaume, Y C; Peyrin, E

    2000-03-06

    A chemometric methodology is proposed to study the separation of seven p-hydroxybenzoic esters in reversed phase liquid chromatography (RPLC). Fifteen experiments were found to be necessary to find a mathematical model which linked a novel chromatographic response function (CRF) with the column temperature, the water fraction in the mobile phase and its flow rate. The CRF optimum was determined using a new algorithm based on Glover's taboo search (TS). A flow-rate of 0.9 ml min(-1) with a water fraction of 0.64 in the ACN-water mixture and a column temperature of 10 degrees C gave the most efficient separation conditions. The usefulness of TS was compared with the pure random search (PRS) and simplex search (SS). As demonstrated by calculations, the algorithm avoids entrapment in local minima and continues the search to give a near-optimal final solution. Unlike other methods of global optimisation, this procedure is generally applicable, easy to implement, derivative free, conceptually simple and could be used in the future for much more complex optimisation problems.
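
    A minimal tabu search over a discretised grid of water fraction, flow rate and temperature is sketched below; the chromatographic response function is a smooth placeholder centred near the reported optimum, not the fitted model from the fifteen experiments.

    ```python
    import numpy as np

    # Discretised factor levels (illustrative ranges close to those reported).
    WATER = np.round(np.arange(0.55, 0.76, 0.01), 2)       # water fraction
    FLOW = np.round(np.arange(0.5, 1.31, 0.1), 1)          # mL/min
    TEMP = np.arange(5, 41, 5)                              # degrees C

    def crf(w, f, t):
        """Placeholder chromatographic response function (to be maximised);
        the real CRF would come from the fitted model in the paper."""
        return -(w - 0.64) ** 2 - 0.1 * (f - 0.9) ** 2 - 0.001 * (t - 10) ** 2

    def tabu_search(iters=200, tabu_len=15, seed=0):
        rng = np.random.default_rng(seed)
        cur = (rng.choice(WATER), rng.choice(FLOW), rng.choice(TEMP))
        best, best_val, tabu = cur, crf(*cur), [cur]
        for _ in range(iters):
            # Neighbours: move one factor one level up or down.
            neigh = []
            for i, levels in enumerate((WATER, FLOW, TEMP)):
                j = int(np.argmin(np.abs(levels - cur[i])))
                for k in (j - 1, j + 1):
                    if 0 <= k < len(levels):
                        cand = list(cur)
                        cand[i] = levels[k]
                        neigh.append(tuple(cand))
            # Best non-tabu neighbour (aspiration: accept tabu moves that beat the best).
            scored = sorted(neigh, key=lambda c: -crf(*c))
            cur = next((c for c in scored if c not in tabu or crf(*c) > best_val), scored[0])
            tabu = (tabu + [cur])[-tabu_len:]
            if crf(*cur) > best_val:
                best, best_val = cur, crf(*cur)
        return best, best_val

    print(tabu_search())
    ```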

  11. Warpage optimisation on the moulded part with straight-drilled and conformal cooling channels using response surface methodology (RSM) and glowworm swarm optimisation (GSO)

    NASA Astrophysics Data System (ADS)

    Hazwan, M. H. M.; Shayfull, Z.; Sharif, S.; Nasir, S. M.; Zainal, N.

    2017-09-01

    In the injection moulding process, quality and productivity are notably important and must be controlled for each product type produced. Quality is measured as the extent of warpage of the moulded parts, while productivity is measured as the duration of the moulding cycle. To control quality, many researchers have introduced various optimisation approaches which have been proven to enhance the quality of the moulded part produced. To improve the productivity of the injection moulding process, some researchers have proposed the application of conformal cooling channels, which have been proven to reduce the duration of the moulding cycle. Therefore, this paper presents an application of an alternative optimisation approach, Response Surface Methodology (RSM) with Glowworm Swarm Optimisation (GSO), to moulded parts produced with straight-drilled and conformal cooling channel moulds. This study examined the warpage of the moulded parts before and after the optimisation work for both cooling channel types. A front panel housing was selected as the specimen and the performance of the proposed optimisation approach was analysed for conventional straight-drilled cooling channels compared to Milled Groove Square Shape (MGSS) conformal cooling channels by simulation analysis using Autodesk Moldflow Insight (AMI) 2013. Based on the results, melt temperature is the most significant factor contributing to warpage, and warpage was reduced by 39.1% after optimisation for the straight-drilled cooling channels; cooling time is the most significant factor contributing to warpage, and warpage was reduced by 38.7% after optimisation for the MGSS conformal cooling channels. In addition, the findings show that applying the optimisation work to the conformal cooling channels gives better quality and productivity of the moulded part produced.

  12. Optimising the quantification of cytokines present at low concentrations in small human mucosal tissue samples using Luminex assays

    PubMed Central

    Staples, Emily; Ingram, Richard James Michael; Atherton, John Christopher; Robinson, Karen

    2013-01-01

    Sensitive measurement of multiple cytokine profiles from small mucosal tissue biopsies, for example human gastric biopsies obtained through an endoscope, is technically challenging. Multiplex methods such as Luminex assays offer an attractive solution but standard protocols are not available for tissue samples. We assessed the utility of three commercial Luminex kits (VersaMAP, Bio-Plex and MILLIPLEX) to measure interleukin-17A (IL-17) and interferon-gamma (IFNγ) concentrations in human gastric biopsies and we optimised preparation of mucosal samples for this application. First, we assessed the technical performance, limits of sensitivity and linear dynamic ranges for each kit. Next we spiked human gastric biopsies with recombinant IL-17 and IFNγ at a range of concentrations (1.5 to 1000 pg/mL) and assessed kit accuracy for spiked cytokine recovery and intra-assay precision. We also evaluated the impact of different tissue processing methods and extraction buffers on our results. Finally we assessed recovery of endogenous cytokines in unspiked samples. In terms of sensitivity, all of the kits performed well within the manufacturers' recommended standard curve ranges but the MILLIPLEX kit provided most consistent sensitivity for low cytokine concentrations. In the spiking experiments, the MILLIPLEX kit performed most consistently over the widest range of concentrations. For tissue processing, manual disruption provided significantly improved cytokine recovery over automated methods. Our selected kit and optimised protocol were further validated by measurement of relative cytokine levels in inflamed and uninflamed gastric mucosa using Luminex and real-time polymerase chain reaction. In summary, with proper optimisation Luminex kits (and for IL-17 and IFNγ the MILLIPLEX kit in particular) can be used for the sensitive detection of cytokines in mucosal biopsies. Our results should help other researchers seeking to quantify multiple low concentration cytokines in small tissue samples. PMID:23644159

  13. Using Optimisation Techniques to Granulise Rough Set Partitions

    NASA Astrophysics Data System (ADS)

    Crossingham, Bodie; Marwala, Tshilidzi

    2007-11-01

    This paper presents an approach to optimise rough set partition sizes using various optimisation techniques. Three optimisation techniques are implemented to perform the granularisation process, namely, genetic algorithm (GA), hill climbing (HC) and simulated annealing (SA). These optimisation methods maximise the classification accuracy of the rough sets. The proposed rough set partition method is tested on a set of demographic properties of individuals obtained from the South African antenatal survey. The three techniques are compared in terms of their computational time, accuracy and number of rules produced when applied to the Human Immunodeficiency Virus (HIV) data set. The results of the optimised methods are compared to a well-known non-optimised discretisation method, equal-width-bin partitioning (EWB). The accuracies achieved after optimising the partitions using GA, HC and SA are 66.89%, 65.84% and 65.48% respectively, compared to the accuracy of EWB of 59.86%. In addition to rough sets providing the plausibilities of the estimated HIV status, they also provide the linguistic rules describing how the demographic parameters drive the risk of HIV.

  14. Multi-objective optimisation of aircraft flight trajectories in the ATM and avionics context

    NASA Astrophysics Data System (ADS)

    Gardi, Alessandro; Sabatini, Roberto; Ramasamy, Subramanian

    2016-05-01

    The continuous increase of air transport demand worldwide and the push for a more economically viable and environmentally sustainable aviation are driving significant evolutions of aircraft, airspace and airport systems design and operations. Although extensive research has been performed on the optimisation of aircraft trajectories and very efficient algorithms were widely adopted for the optimisation of vertical flight profiles, it is only in the last few years that higher levels of automation were proposed for integrated flight planning and re-routing functionalities of innovative Communication Navigation and Surveillance/Air Traffic Management (CNS/ATM) and Avionics (CNS+A) systems. In this context, the implementation of additional environmental targets and of multiple operational constraints introduces the need to efficiently deal with multiple objectives as part of the trajectory optimisation algorithm. This article provides a comprehensive review of Multi-Objective Trajectory Optimisation (MOTO) techniques for transport aircraft flight operations, with a special focus on the recent advances introduced in the CNS+A research context. In the first section, a brief introduction is given, together with an overview of the main international research initiatives where this topic has been studied, and the problem statement is provided. The second section introduces the mathematical formulation and the third section reviews the numerical solution techniques, including discretisation and optimisation methods for the specific problem formulated. The fourth section summarises the strategies to articulate the preferences and to select optimal trajectories when multiple conflicting objectives are introduced. The fifth section introduces a number of models defining the optimality criteria and constraints typically adopted in MOTO studies, including fuel consumption, air pollutant and noise emissions, operational costs, condensation trails, airspace and airport operations. A brief overview of atmospheric and weather modelling is also included. Key equations describing the optimality criteria are presented, with a focus on the latest advancements in the respective application areas. In the sixth section, a number of MOTO implementations in the CNS+A systems context are mentioned with relevant simulation case studies addressing different operational tasks. The final section draws some conclusions and outlines guidelines for future research on MOTO and associated CNS+A system implementations.

  15. The use of surrogates for an optimal management of coupled groundwater-agriculture hydrosystems

    NASA Astrophysics Data System (ADS)

    Grundmann, J.; Schütze, N.; Brettschneider, M.; Schmitz, G. H.; Lennartz, F.

    2012-04-01

    To ensure optimal and sustainable water resources management in arid coastal environments, we develop a new simulation-based integrated water management system. It aims at achieving the best possible solutions for groundwater withdrawals for agricultural and municipal water use, including saline water management, together with a substantial increase in water use efficiency in irrigated agriculture. To achieve robust and fast operation of the management system with respect to water quality and quantity, we develop appropriate surrogate models by combining physically based process modelling with methods of artificial intelligence. We use an artificial neural network to model the aquifer response, including the seawater interface, trained on a scenario database generated by a numerical density-dependent groundwater flow model. To simulate the behaviour of highly productive agricultural farms, crop water production functions are generated by means of soil-vegetation-atmosphere-transport (SVAT) models, adapted to the regional climate conditions, and a novel evolutionary optimisation algorithm for optimal irrigation scheduling and control. We apply both surrogates exemplarily within a simulation-based optimisation environment using the characteristics of the south Batinah region in the Sultanate of Oman, which is affected by saltwater intrusion into the coastal aquifer due to excessive groundwater withdrawal for irrigated agriculture. We demonstrate the effectiveness of our methodology for the evaluation and optimisation of different irrigation practices, cropping patterns and the resulting abstraction scenarios. Owing to conflicting objectives, such as profit-oriented agriculture versus aquifer sustainability, a multi-criteria optimisation is performed.

  16. Optimisation of MSW collection routes for minimum fuel consumption using 3D GIS modelling.

    PubMed

    Tavares, G; Zsigraiova, Z; Semiao, V; Carvalho, M G

    2009-03-01

    Collection of municipal solid waste (MSW) may account for more than 70% of the total waste management budget, most of which is for fuel costs. It is therefore crucial to optimise the routing network used for waste collection and transportation. This paper proposes the use of geographical information systems (GIS) 3D route modelling software for waste collection and transportation, which adds one more degree of freedom to the system and allows driving routes to be optimised for minimum fuel consumption. The model takes into account the effects of road inclination and vehicle weight. It is applied to two different cases: routing waste collection vehicles in the city of Praia, the capital of Cape Verde, and routing the transport of waste from different municipalities of Santiago Island to an incineration plant. For the Praia city region, the 3D model that minimised fuel consumption yielded cost savings of 8% as compared with an approach that simply calculated the shortest 3D route. Remarkably, this was true despite the fact that the GIS-recommended fuel reduction route was actually 1.8% longer than the shortest possible travel distance. For the Santiago Island case, the difference was even more significant: a 12% fuel reduction for a similar total travel distance. These figures indicate the importance of considering both the relief of the terrain and fuel consumption in selecting a suitable cost function to optimise vehicle routing.
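
    The idea of routing on fuel consumption rather than distance can be sketched with an illustrative edge-cost model and a shortest-path query, as below; the consumption coefficients and the toy network are assumptions, not the calibrated GIS model.

    ```python
    import networkx as nx

    def fuel_cost(length_m, rise_m, vehicle_kg, base_l_per_km=0.35, grade_coeff=3.0):
        """Illustrative fuel model: base consumption per km, increased in
        proportion to road gradient and vehicle weight (coefficients invented)."""
        grade = rise_m / max(length_m, 1.0)
        weight_factor = vehicle_kg / 10000.0
        litres = (length_m / 1000.0) * base_l_per_km * (1.0 + grade_coeff * max(grade, 0.0)) * weight_factor
        return litres

    G = nx.DiGraph()
    # Edges: (from, to, length in m, rise in m); a small toy network.
    edges = [("depot", "a", 1200, 15), ("a", "plant", 900, -5),
             ("depot", "b", 800, 40), ("b", "plant", 700, 10)]
    for u, v, length, rise in edges:
        G.add_edge(u, v, fuel=fuel_cost(length, rise, vehicle_kg=12000))

    route = nx.dijkstra_path(G, "depot", "plant", weight="fuel")
    print(route)
    ```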

  17. Chemical study, antioxidant, anti-hypertensive, and cytotoxic/cytoprotective activities of Centaurea cyanus L. petals aqueous extract.

    PubMed

    Escher, Graziela Bragueto; Santos, Jânio Sousa; Rosso, Neiva Deliberali; Marques, Mariza Boscacci; Azevedo, Luciana; do Carmo, Mariana Araújo Vieira; Daguer, Heitor; Molognoni, Luciano; Prado-Silva, Leonardo do; Sant'Ana, Anderson S; da Silva, Marcia Cristina; Granato, Daniel

    2018-05-19

    This study aimed to optimise the experimental conditions of extraction of the phytochemical compounds and functional properties of Centaurea cyanus petals. The following parameters were determined: the chemical composition (LC-ESI-MS/MS), the effects of pH on the stability and antioxidant activity of anthocyanins, the inhibition of lipid peroxidation, antioxidant activity, anti-hemolytic activity, antimicrobial, anti-hypertensive, and cytotoxic/cytoprotective effect, and the measurements of intracellular reactive oxygen species. Results showed that the temperature and time influenced (p ≤ 0.05) the content of flavonoids, anthocyanins, and FRAP. Only the temperature influenced the total phenolic content, non-anthocyanin flavonoids, and antioxidant activity (DPPH). The statistical approach made it possible to obtain the optimised experimental extraction conditions to increase the level of bioactive compounds. Chlorogenic, caffeic, ferulic, and p-coumaric acids, isoquercitrin, and coumarin were identified as the major compounds in the optimised extract. The optimised extract presented anti-hemolytic and anti-hypertensive activity in vitro, in addition to showing stability and reversibility of anthocyanins and antioxidant activity with pH variation. The C. cyanus petals aqueous extract exhibited high IC 50 and GI 50 (>900 μg/mL) values for all cell lines, meaning low cytotoxicity. Based on the stress oxidative assay, the extract exhibited pro-oxidant action (10-100 μg/mL) but did not cause damage or cell death. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Multidisciplinary design optimisation of a recurve bow based on applications of the autogenetic design theory and distributed computing

    NASA Astrophysics Data System (ADS)

    Fritzsche, Matthias; Kittel, Konstantin; Blankenburg, Alexander; Vajna, Sándor

    2012-08-01

    This paper presents a method of multidisciplinary design optimisation based on the autogenetic design theory (ADT), whose methods are partially implemented in the optimisation software described here. The main thesis of the ADT is that biological evolution and the process of developing products are broadly similar, i.e. procedures from biological evolution can be transferred into product development. In order to fulfil requirements and boundary conditions of any kind (which may change at any time), both biological evolution and product development look for appropriate solution possibilities in a certain area and try to optimise those that are actually promising by varying parameters and combinations of these solutions. As the time required for multidisciplinary design optimisation is a critical aspect of product development, distributing the optimisation process to make effective use of idle computing capacity can reduce the optimisation time drastically. Finally, a practical example shows how ADT methods and distributed optimisation are applied to improve a product.

  19. Optimisation in radiotherapy. III: Stochastic optimisation algorithms and conclusions.

    PubMed

    Ebert, M

    1997-12-01

    This is the final article in a three part examination of optimisation in radiotherapy. Previous articles have established the bases and form of the radiotherapy optimisation problem, and examined certain types of optimisation algorithm, namely, those which perform some form of ordered search of the solution space (mathematical programming), and those which attempt to find the closest feasible solution to the inverse planning problem (deterministic inversion). The current paper examines algorithms which search the space of possible irradiation strategies by stochastic methods. The resulting iterative search methods move about the solution space by sampling random variates, which gradually become more constricted as the algorithm converges upon the optimal solution. This paper also discusses the implementation of optimisation in radiotherapy practice.

  20. Improved packing of protein side chains with parallel ant colonies

    PubMed Central

    2014-01-01

    Introduction The accurate packing of protein side chains is important for many computational biology problems, such as ab initio protein structure prediction, homology modelling, and protein design and ligand docking applications. Many existing solutions model the packing task as a computational optimisation problem. As well as the design of search algorithms, most solutions suffer from an inaccurate energy function for judging whether a prediction is good or bad. Even if the search has found the lowest energy, there is no certainty of obtaining the protein structures with correct side chains. Methods We present a side-chain modelling method, pacoPacker, which uses a parallel ant colony optimisation strategy based on sharing a single pheromone matrix. This parallel approach combines different sources of energy functions and generates protein side-chain conformations with the lowest energies jointly determined by the various energy functions. We further optimised the selected rotamers to construct subrotamers by rotamer minimisation, which reasonably improved the discreteness of the rotamer library. Results We focused on improving the accuracy of side-chain conformation prediction. For a testing set of 442 proteins, 87.19% of X1 and 77.11% of X12 angles were predicted correctly within 40° of the X-ray positions. We compared the accuracy of pacoPacker with state-of-the-art methods, such as CIS-RR and SCWRL4. We analysed the results from different perspectives, in terms of protein chain and individual residues. In this comprehensive benchmark testing, 51.5% of proteins within a length of 400 amino acids predicted by pacoPacker were superior to the results of CIS-RR and SCWRL4 simultaneously. Finally, we also showed the advantage of using the subrotamers strategy. All results confirmed that our parallel approach is competitive with state-of-the-art solutions for packing side chains. Conclusions This parallel approach combines various sources of searching intelligence and energy functions to pack protein side chains. It provides a framework for combining different inaccuracy/usefulness objective functions by designing parallel heuristic search algorithms. PMID:25474164

  1. Optimising the production of succinate and lactate in Escherichia coli using a hybrid of artificial bee colony algorithm and minimisation of metabolic adjustment.

    PubMed

    Tang, Phooi Wah; Choon, Yee Wen; Mohamad, Mohd Saberi; Deris, Safaai; Napis, Suhaimi

    2015-03-01

    Metabolic engineering is a research field that focuses on the design of models for metabolism, and uses computational procedures to suggest genetic manipulation. It aims to improve the yield of particular chemical or biochemical products. Several traditional metabolic engineering methods are commonly used to increase the production of a desired target, but the products are always far below their theoretical maximums. Numerical optimisation algorithms used to identify gene knockouts may stall at a local minimum of a multivariable function. This paper proposes a hybrid of the artificial bee colony (ABC) algorithm and the minimisation of metabolic adjustment (MOMA) to predict an optimal set of solutions in order to optimise the production rate of succinate and lactate. The dataset used in this work was from the iJO1366 Escherichia coli metabolic network. The experimental results include the production rate, growth rate and a list of knockout genes. From the comparative analysis, ABCMOMA produced better results compared to previous works, showing potential for solving genetic engineering problems. Copyright © 2014 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  2. A novel swarm intelligence algorithm for finding DNA motifs.

    PubMed

    Lei, Chengwei; Ruan, Jianhua

    2009-01-01

    Discovering DNA motifs from co-expressed or co-regulated genes is an important step towards deciphering complex gene regulatory networks and understanding gene functions. Despite significant improvement in the last decade, it still remains one of the most challenging problems in computational molecular biology. In this work, we propose a novel motif finding algorithm that finds consensus patterns using a population-based stochastic optimisation technique called Particle Swarm Optimisation (PSO), which has been shown to be effective in optimising difficult multidimensional problems in continuous domains. We propose to use a word dissimilarity graph to remap the neighborhood structure of the solution space of DNA motifs, and propose a modification of the naive PSO algorithm to accommodate discrete variables. In order to improve efficiency, we also propose several strategies for escaping from local optima and for automatically determining the termination criteria. Experimental results on simulated challenge problems show that our method is both more efficient and more accurate than several existing algorithms. Applications to several sets of real promoter sequences also show that our approach is able to detect known transcription factor binding sites, and outperforms two of the most popular existing algorithms.

  3. Optimisation of active suspension control inputs for improved vehicle ride performance

    NASA Astrophysics Data System (ADS)

    Čorić, Mirko; Deur, Joško; Xu, Li; Tseng, H. Eric; Hrovat, Davor

    2016-07-01

    A collocation-type control variable optimisation method is used in the paper to analyse to what extent the fully active suspension (FAS) can improve the vehicle ride comfort while preserving the wheel holding ability. The method is first applied to a cosine-shaped bump road disturbance of different heights, for both quarter-car and full 10 degree-of-freedom vehicle models. A nonlinear anti-wheel hop constraint is considered, and the influence of the bump preview time period is analysed. The analysis is then extended to the case of a square- or cosine-shaped pothole of different lengths, using the quarter-car model. In this case, the cost function is extended with FAS energy consumption and wheel damage resilience costs. The FAS action is found to be such as to make the wheel hop over the pothole, in order to avoid or minimise the damage at the pothole trailing edge. In the case of a long pothole, when the FAS cannot provide the wheel hop, the wheel travels over the pothole bottom and then hops over the pothole trailing edge. The numerical optimisation results are accompanied by a simplified algebraic analysis.

  4. Formulation of multiparticulate systems as lyophilised orally disintegrating tablets.

    PubMed

    Alhusban, Farhan; Perrie, Yvonne; Mohammed, Afzal R

    2011-11-01

    The current study aimed to exploit the electrostatic associative interaction between carrageenan and gelatin to optimise a formulation of lyophilised orally disintegrating tablets (ODTs) suitable for multiparticulate delivery. A central composite face centred (CCF) design was applied to study the influence of formulation variables (gelatin, carrageenan and alanine concentrations) on the crucial responses of the formulation (disintegration time, hardness, viscosity and pH). The disintegration time and viscosity were controlled by the associative interaction between gelatin and carrageenan upon hydration, which forms a strong complex that increases the viscosity of the stock solution and forms a tablet with higher resistance to disintegration in aqueous medium. Therefore, the levels of carrageenan, gelatin and their interaction in the formulation were the significant factors. In terms of hardness, increasing gelatin and alanine concentration was the most effective way to improve tablet hardness. Accordingly, optimum concentrations of these excipients were needed to find the best balance that fulfilled all formulation requirements. The revised model showed a high degree of predictability and optimisation reliability and was therefore successful in developing an ODT formulation with optimised properties that was able to deliver enteric-coated multiparticulates of omeprazole without compromising their functionality. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. The Use of Mathematical Modelling for Improving the Tissue Engineering of Organs and Stem Cell Therapy.

    PubMed

    Lemon, Greg; Sjoqvist, Sebastian; Lim, Mei Ling; Feliu, Neus; Firsova, Alexandra B; Amin, Risul; Gustafsson, Ylva; Stuewer, Annika; Gubareva, Elena; Haag, Johannes; Jungebluth, Philipp; Macchiarini, Paolo

    2016-01-01

    Regenerative medicine is a multidisciplinary field where continued progress relies on the incorporation of a diverse set of technologies from a wide range of disciplines within medicine, science and engineering. This review describes how one such technique, mathematical modelling, can be utilised to improve the tissue engineering of organs and stem cell therapy. Several case studies, taken from research carried out by our group, ACTREM, demonstrate the utility of mechanistic mathematical models to help aid the design and optimisation of protocols in regenerative medicine.

  6. Productivity of pure- and crossbred cattle in a subtropical environment

    NASA Astrophysics Data System (ADS)

    van Zyl, J. G. E.; Schoeman, S. J.; Coertze, R. J.; Groeneveld, H. T.

    1991-06-01

    The influence of different breeds of sire and dam types on cow productivity in an arid, subtropical environment was studied. Cows with calves sired by Simmentaler, Hereford and Bonsmara bulls were more productive (P < 0.05) than those with calves sired by Afrikaner bulls. Simmentaler sires were superior (P < 0.05) to Bonsmara sires. Crossbred cows of predominant (>50%) Bos taurus breeding were generally superior to crossbreds of predominant B. indicus breeding and purebreds. Crossbreeding systems to utilize breed effects to optimise cow productivity within environmental constraints are discussed.

  7. Optimisation on processing parameters for minimising warpage on side arm using response surface methodology (RSM) and particle swarm optimisation (PSO)

    NASA Astrophysics Data System (ADS)

    Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Sazli, M.; Yahya, Z. R.

    2017-09-01

    This study presents the application of an optimisation method to reduce the warpage of a side arm part. Autodesk Moldflow Insight software was integrated into this study to analyse the warpage. A Design of Experiments (DOE) for Response Surface Methodology (RSM) was constructed and, using the equation from the RSM, Particle Swarm Optimisation (PSO) was applied. The optimisation method results in optimised processing parameters with minimum warpage. Mould temperature, melt temperature, packing pressure, packing time and cooling time were selected as the variable parameters. Parameter selection was based on the most significant factors affecting warpage reported by previous researchers. The results show that warpage was improved by 28.16% for RSM and 28.17% for PSO. The further improvement of PSO over RSM is thus only 0.01%. The optimisation using RSM is therefore already sufficient to give the best parameter combination and optimum warpage value for the side arm part. The most significant parameter affecting warpage is packing pressure.
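
    The RSM-then-optimise workflow can be sketched as fitting a second-order response surface to DOE data and then minimising the fitted surface, as below; the data are hypothetical, and SciPy's differential evolution stands in for the PSO step reported in the paper.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    # Hypothetical DOE results: (melt temperature degC, packing pressure MPa) -> warpage (mm).
    X = np.array([[220, 60], [220, 90], [260, 60], [260, 90],
                  [240, 75], [220, 75], [260, 75], [240, 60], [240, 90]], dtype=float)
    y = np.array([0.92, 0.81, 0.88, 0.95, 0.74, 0.85, 0.90, 0.83, 0.86])

    def design_matrix(X):
        """Full quadratic (second-order RSM) model in two factors."""
        x1, x2 = X[:, 0], X[:, 1]
        return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

    beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)   # least-squares fit

    def predicted_warpage(x):
        return float((design_matrix(np.array([x])) @ beta)[0])

    # Global search over the fitted surface (differential evolution is used here
    # as a stand-in for the PSO step reported in the paper).
    result = differential_evolution(predicted_warpage, bounds=[(220, 260), (60, 90)], seed=0)
    print("optimised settings:", result.x, "predicted warpage:", result.fun)
    ```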

  8. X-ray backscatter radiography with lower open fraction coded masks

    NASA Astrophysics Data System (ADS)

    Muñoz, André A. M.; Vella, Anna; Healy, Matthew J. F.; Lane, David W.; Jupp, Ian; Lockley, David

    2017-09-01

    Single-sided radiographic imaging would find great utility in medical, aerospace and security applications. While coded apertures can be used to form such an image from backscattered X-rays, they suffer from near-field limitations that introduce noise. Several theoretical studies have indicated that, for an extended source, the image's signal-to-noise ratio may be optimised by using a low open fraction (<0.5) mask. However, few experimental results have been published for such low open fraction patterns, and details of their formulation are often unavailable or ambiguous. In this paper we address this process for two types of low open fraction mask, the dilute URA and the Singer set array. For the dilute URA, the procedure for producing multiple 2D array patterns from given 1D binary sequences (Barker codes) is explained. Their point spread functions are calculated and their imaging properties are critically reviewed. These results are then compared to those from the Singer set, and experimental exposures are presented for both types of pattern; their prospects for near-field imaging are discussed.

  9. Close packing in curved space by simulated annealing

    NASA Astrophysics Data System (ADS)

    Wille, L. T.

    1987-12-01

    The problem of packing spheres of a maximum radius on the surface of a four-dimensional hypersphere is considered. It is shown how near-optimal solutions can be obtained by packing soft spheres, modelled as classical particles interacting under an inverse power potential, followed by a subsequent hardening of the interaction. In order to avoid trapping in high-lying local minima, the simulated annealing method is used to optimise the soft-sphere packing. Several improvements over other work (based on local optimisation of random initial configurations of hard spheres) have been found. The freezing behaviour of this system is discussed as a function of particle number, softness of the potential and cooling rate. Apart from their geometric interest, these results are useful in the study of topological frustration, metallic glasses and quasicrystals.

  10. Identification and characterisation of midbrain nuclei using optimised functional magnetic resonance imaging

    PubMed Central

    Limbrick-Oldfield, Eve H.; Brooks, Jonathan C.W.; Wise, Richard J.S.; Padormo, Francesco; Hajnal, Jo V.; Beckmann, Christian F.; Ungless, Mark A.

    2012-01-01

    Localising activity in the human midbrain with conventional functional MRI (fMRI) is challenging because the midbrain nuclei are small and located in an area that is prone to physiological artefacts. Here we present a replicable and automated method to improve the detection and localisation of midbrain fMRI signals. We designed a visual fMRI task that was predicted to activate the superior colliculi (SC) bilaterally. A limited number of coronal slices were scanned, orientated along the long axis of the brainstem, whilst simultaneously recording cardiac and respiratory traces. A novel anatomical registration pathway was used to optimise the localisation of the small midbrain nuclei in stereotactic space. Two additional structural scans were used to improve registration between functional and structural T1-weighted images: an echo-planar image (EPI) that matched the functional data but had whole-brain coverage, and a whole-brain T2-weighted image. This pathway was compared to conventional registration pathways, and was shown to significantly improve midbrain registration. To reduce the physiological artefacts in the functional data, we estimated and removed structured noise using a modified version of a previously described physiological noise model (PNM). Whereas a conventional analysis revealed only unilateral SC activity, the PNM analysis revealed the predicted bilateral activity. We demonstrate that these methods improve the measurement of a biologically plausible fMRI signal. Moreover, they could be used to investigate the function of other midbrain nuclei. PMID:21867762

  11. Optimising the Inflammatory Bowel Disease Unit to Improve Quality of Care: Expert Recommendations.

    PubMed

    Louis, Edouard; Dotan, Iris; Ghosh, Subrata; Mlynarsky, Liat; Reenaers, Catherine; Schreiber, Stefan

    2015-08-01

    The best care setting for patients with inflammatory bowel disease [IBD] may be in a dedicated unit. Whereas not all gastroenterology units have the same resources to develop dedicated IBD facilities and services, there are steps that can be taken by any unit to optimise patients' access to interdisciplinary expert care. A series of pragmatic recommendations relating to IBD unit optimisation have been developed through discussion among a large panel of international experts. Suggested recommendations were extracted through systematic search of published evidence and structured requests for expert opinion. Physicians [n = 238] identified as IBD specialists by publications or clinical focus on IBD were invited for discussion and recommendation modification [Barcelona, Spain; 2014]. Final recommendations were voted on by the group. Participants also completed an online survey to evaluate their own experience related to IBD units. A total of 60% of attendees completed the survey, with 15% self-classifying their centre as a dedicated IBD unit. Only half of respondents indicated that they had a defined IBD treatment algorithm in place. Key recommendations included the need to develop a multidisciplinary team covering specifically-defined specialist expertise in IBD, to instil processes that facilitate cross-functional communication and to invest in shared care models of IBD management. Optimising the setup of IBD units will require progressive leadership and willingness to challenge the status quo in order to provide better quality of care for our patients. IBD units are an important step towards harmonising care for IBD across Europe and for establishing standards for disease management programmes. © European Crohn’s and Colitis Organisation 2015.

  12. Optimising the Inflammatory Bowel Disease Unit to Improve Quality of Care: Expert Recommendations

    PubMed Central

    Dotan, Iris; Ghosh, Subrata; Mlynarsky, Liat; Reenaers, Catherine; Schreiber, Stefan

    2015-01-01

    Introduction: The best care setting for patients with inflammatory bowel disease [IBD] may be in a dedicated unit. Whereas not all gastroenterology units have the same resources to develop dedicated IBD facilities and services, there are steps that can be taken by any unit to optimise patients’ access to interdisciplinary expert care. A series of pragmatic recommendations relating to IBD unit optimisation have been developed through discussion among a large panel of international experts. Methods: Suggested recommendations were extracted through systematic search of published evidence and structured requests for expert opinion. Physicians [n = 238] identified as IBD specialists by publications or clinical focus on IBD were invited for discussion and recommendation modification [Barcelona, Spain; 2014]. Final recommendations were voted on by the group. Participants also completed an online survey to evaluate their own experience related to IBD units. Results: A total of 60% of attendees completed the survey, with 15% self-classifying their centre as a dedicated IBD unit. Only half of respondents indicated that they had a defined IBD treatment algorithm in place. Key recommendations included the need to develop a multidisciplinary team covering specifically-defined specialist expertise in IBD, to instil processes that facilitate cross-functional communication and to invest in shared care models of IBD management. Conclusions: Optimising the setup of IBD units will require progressive leadership and willingness to challenge the status quo in order to provide better quality of care for our patients. IBD units are an important step towards harmonising care for IBD across Europe and for establishing standards for disease management programmes. PMID:25987349

  13. Protocol investigating the clinical utility of an objective measure of attention, impulsivity and activity (QbTest) for optimising medication management in children and young people with ADHD 'QbTest Utility for Optimising Treatment in ADHD' (QUOTA): a feasibility randomised controlled trial.

    PubMed

    Hall, Charlotte L; James, Marilyn; Brown, Sue; Martin, Jennifer L; Brown, Nikki; Selby, Kim; Clarke, Julie; Vijayan, Hena; Guo, Boliang; Sayal, Kapil; Hollis, Chris; Groom, Madeleine J

    2018-02-15

    Attention-deficit hyperactivity disorder (ADHD) is characterised by symptoms of inattention, hyperactivity and impulsivity. To improve outcomes, the National Institute for Health and Care Excellence ADHD guidelines recommend regular monitoring of symptoms when children commence medication. However, research suggests that routine monitoring rarely happens, and clinicians often rely on subjective information such as reports from parents and teachers to ascertain improvement. These sources can be unreliable and difficult to obtain. The addition of an objective test of attention and activity (QbTest) may improve the objectivity, reliability and speed of clinical decision-making and so reduce the time to identify the optimal medication dose. This study aims to assess the feasibility and acceptability of a QbTest medication management protocol delivered in routine healthcare services for children with ADHD. This multisite feasibility randomised controlled trial (RCT) will recruit 60 young people (aged 6-17 years old), diagnosed with ADHD, and starting stimulant medication who are seen by Child and Adolescent Mental Health Services or Community Paediatric services. Participants will be randomised into one of two arms. In the experimental arm (QbTest protocol), the participant will complete a QbTest at baseline (prior to medication initiation), and two follow-up QbTests on medication (2-4 weeks and 8-10 weeks later). In the control arm, participants will receive treatment as usual, with at least two follow-up consultations. Measures of parent-, teacher- and clinician-rated symptoms and global functioning will be completed at each time point. Health economic measures will be completed. Clinicians will record treatment decision-making. Acceptability and feasibility of the protocol will be assessed alongside outcome measure completion rates. Qualitative interviews will be conducted. The findings will be used to inform the development of a fully powered RCT. The results will be submitted for publication in peer-reviewed journals. The study has ethical approval. NCT03368573; Pre-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  14. Influence of denture improvement on the nutritional status and quality of life of geriatric patients.

    PubMed

    Wöstmann, Bernd; Michel, Karin; Brinkert, Bernd; Melchheier-Weskott, Andrea; Rehmann, Peter; Balkenhol, Markus

    2008-10-01

    Recent research suggests that there is a correlation between nutrition, oral health, dietary habits, patients' satisfaction and their socio-economic status. However, the dependent and independent variables have remained unclear. This exploratory interventional study aimed to identify the impact of denture improvement on the nutritional status as well as the oral health-related quality of life in geriatric patients. Forty-seven patients who were capable of feeding themselves (minimum age: 60 years) and with dentures requiring repair or replacement were selected from a random sample of 100 residents of two nursing homes. Before and 6 months after the dentures were optimised, a Mini Nutritional Assessment (MNA) and a masticatory function test were carried out. Nutritional markers (pre-albumin, serum albumin, zinc) were determined and an OHIP-G14 (Oral Health Impact Profile, German version) was recorded in order to determine the effect of the optimised oral situation on the patient's nutritional status and oral health-related quality of life. Despite the highly significant improvement in masticatory ability after the optimisation of the dentures, no general improvement in nutritional status was observed: the albumin, zinc and MNA values remained unchanged and pre-albumin even decreased. Prosthetic measures alone apparently cannot effect a lasting improvement in nutritional status, as masticatory ability and masticatory efficiency are not the only factors of influence. Nutrition is not only a matter of masticatory function, but also depends on other influencing factors (e.g. habits, taste and cultural customs, as well as financial and organisational aspects).

  15. Reciprocal peer review for quality improvement: an ethnographic case study of the Improving Lung Cancer Outcomes Project.

    PubMed

    Aveling, Emma-Louise; Martin, Graham; Jiménez García, Senai; Martin, Lisa; Herbert, Georgia; Armstrong, Natalie; Dixon-Woods, Mary; Woolhouse, Ian

    2012-12-01

    Peer review offers a promising way of promoting improvement in health systems, but the optimal model is not yet clear. We aimed to describe a specific peer review model-reciprocal peer-to-peer review (RP2PR)-to identify the features that appeared to support optimal functioning. We conducted an ethnographic study involving observations, interviews and documentary analysis of the Improving Lung Cancer Outcomes Project, which involved 30 paired multidisciplinary lung cancer teams participating in facilitated reciprocal site visits. Analysis was based on the constant comparative method. Fundamental features of the model include multidisciplinary participation, a focus on discussion and observation of teams in action, rather than paperwork; facilitated reflection and discussion on data and observations; support to develop focused improvement plans. Five key features were identified as important in optimising this model: peers and pairing methods; minimising logistic burden; structure of visits; independent facilitation; and credibility of the process. Facilitated RP2PR was generally a positive experience for participants, but implementing improvement plans was challenging and required substantial support. RP2PR appears to be optimised when it is well organised; a safe environment for learning is created; credibility is maximised; implementation and impact are supported. RP2PR is seen as credible and legitimate by lung cancer teams and can act as a powerful stimulus to produce focused quality improvement plans and to support implementation. Our findings have identified how RP2PR functioned and may be optimised to provide a constructive, open space for identifying opportunities for improvement and solutions.

  16. The seasonal behaviour of carbon fluxes in the Amazon: fusion of FLUXNET data and the ORCHIDEE model

    NASA Astrophysics Data System (ADS)

    Verbeeck, H.; Peylin, P.; Bacour, C.; Ciais, P.

    2009-04-01

    Eddy covariance measurements at the Santarém (km 67) site revealed an unexpected seasonal pattern in carbon fluxes which could not be simulated by existing state-of-the-art global ecosystem models (Saleska et al., Science, 2003). An unexpectedly high carbon uptake was measured during the dry season. In contrast, carbon release was observed in the wet season. There are several possible (combined) underlying mechanisms of this phenomenon: (1) increased soil respiration due to soil moisture in the wet season, (2) increased photosynthesis during the dry season due to deep rooting, hydraulic lift, increased radiation and/or a leaf flush. The objective of this study is to optimise the ORCHIDEE model using eddy covariance data in order to be able to mimic the seasonal response of carbon fluxes to dry/wet conditions in tropical forest ecosystems. By doing this, we try to identify the underlying mechanisms of this seasonal response. The ORCHIDEE model is a state-of-the-art mechanistic global vegetation model that can be run at local or global scale. It calculates the carbon and water cycle in the different soil and vegetation pools and resolves the diurnal cycle of fluxes. ORCHIDEE is built on the concept of plant functional types (PFT) to describe vegetation. To bring the different carbon pool sizes to realistic values, spin-up runs are used. ORCHIDEE uses climate variables as drivers together with a number of ecosystem parameters that have been assessed from laboratory and in situ experiments. These parameters are still associated with a large uncertainty and may vary between and within PFTs in a way that is currently not informed or captured by the model. Recent developments in assimilation techniques allow the objective use of eddy covariance data to improve our knowledge of these parameters in a statistically coherent approach. We use a Bayesian optimisation approach. This approach is based on the minimisation of a cost function containing the mismatch between simulated model output and observations as well as the mismatch between a priori and optimised parameters. The parameters can be optimised on different time scales (annually, monthly, daily). For this study the model is optimised at local scale for five eddy flux sites: four sites in Brazil and one in French Guiana. The seasonal behaviour of C fluxes in response to wet and dry conditions differs among these sites. Key processes that are optimised include: the effect of soil water on heterotrophic soil respiration, the effect of soil water availability on stomatal conductance and photosynthesis, and phenology. By optimising several key parameters we could improve the simulation of the seasonal pattern of NEE significantly. Nevertheless, posterior parameters should be interpreted with care, because the resulting parameter values might compensate for uncertainties in the model structure or other parameters. Moreover, several critical issues appeared during this study, e.g. how to assimilate latent and sensible heat data when the energy balance is not closed in the data. Optimisation of the Q10 parameter showed that at some sites respiration was not sensitive at all to temperature, which shows only small variations in this region. Considering this, one could question the reliability of the partitioned fluxes (GPP/Reco) at these sites. This study also tests whether there is coherence between optimised parameter values of different sites within the tropical forest PFT and whether the forward model response to climate variations is similar between sites.
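
    The cost function in a Bayesian optimisation of this kind balances the model-observation mismatch against the departure of the parameters from their priors, each weighted by its uncertainty. The sketch below shows this form of cost function on a toy Q10-type respiration model with scalar error terms; it is not the ORCHIDEE assimilation code, and the model, values and bounds are assumptions made for illustration.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def bayesian_cost(params, model, obs, obs_err, prior, prior_err):
        """J(p) = 0.5*sum(((y - H(p))/sigma_obs)^2) + 0.5*sum(((p - p_prior)/sigma_prior)^2)"""
        resid_obs = (obs - model(params)) / obs_err
        resid_prior = (params - prior) / prior_err
        return 0.5 * np.sum(resid_obs**2) + 0.5 * np.sum(resid_prior**2)

    # Toy example: fit a Q10-type respiration model R = R0 * Q10**((T - 10)/10)
    # to pseudo-observations (all values are hypothetical).
    def resp_model_factory(temps):
        return lambda p: p[0] * p[1] ** ((temps - 10.0) / 10.0)

    temps = np.linspace(15.0, 30.0, 50)              # daily mean temperature (deg C)
    true = np.array([2.0, 1.8])                      # "true" R0 and Q10
    obs = resp_model_factory(temps)(true) + 0.2 * np.random.randn(temps.size)
    prior = np.array([1.5, 2.0])                     # prior parameter values
    prior_err = np.array([1.0, 0.5])                 # prior uncertainties

    result = minimize(bayesian_cost, x0=prior,
                      args=(resp_model_factory(temps), obs, 0.2, prior, prior_err),
                      method="L-BFGS-B", bounds=[(0.1, 10.0), (1.0, 4.0)])
    print("posterior parameters:", result.x)
    ```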

  17. CAMELOT: Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox

    NASA Astrophysics Data System (ADS)

    Di Carlo, Marilena; Romero Martin, Juan Manuel; Vasile, Massimiliano

    2018-03-01

    Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox (CAMELOT) is a toolbox for the fast preliminary design and optimisation of low-thrust trajectories. It solves highly complex combinatorial problems to plan multi-target missions characterised by long spirals including different perturbations. To do so, CAMELOT implements a novel multi-fidelity approach combining analytical surrogate modelling and accurate computational estimations of the mission cost. Decisions are then made using two optimisation engines included in the toolbox, a single-objective global optimiser, and a combinatorial optimisation algorithm. CAMELOT has been applied to a variety of case studies: from the design of interplanetary trajectories to the optimal de-orbiting of space debris and from the deployment of constellations to on-orbit servicing. In this paper, the main elements of CAMELOT are described and two examples, solved using the toolbox, are presented.

  18. Boundary element based multiresolution shape optimisation in electrostatics

    NASA Astrophysics Data System (ADS)

    Bandara, Kosala; Cirak, Fehmi; Of, Günther; Steinbach, Olaf; Zapletal, Jan

    2015-09-01

    We consider the shape optimisation of high-voltage devices subject to electrostatic field equations by combining fast boundary elements with multiresolution subdivision surfaces. The geometry of the domain is described with subdivision surfaces and different resolutions of the same geometry are used for optimisation and analysis. The primal and adjoint problems are discretised with the boundary element method using a sufficiently fine control mesh. For shape optimisation the geometry is updated starting from the coarsest control mesh with increasingly finer control meshes. The multiresolution approach effectively prevents the appearance of non-physical geometry oscillations in the optimised shapes. Moreover, there is no need for mesh regeneration or smoothing during the optimisation due to the absence of a volume mesh. We present several numerical experiments and one industrial application to demonstrate the robustness and versatility of the developed approach.

  19. Tail mean and related robust solution concepts

    NASA Astrophysics Data System (ADS)

    Ogryczak, Włodzimierz

    2014-01-01

    Robust optimisation might be viewed as a multicriteria optimisation problem where objectives correspond to the scenarios, although their probabilities are unknown or imprecise. The simplest robust solution concept represents a conservative approach focused on optimisation of the worst-case scenario results. A softer concept allows one to optimise the tail mean, thus combining performances under multiple worst scenarios. We show that, when considering robust models allowing the probabilities to vary only within given intervals, the tail mean represents the robust solution only for upper-bounded probabilities. For arbitrary intervals of probabilities the corresponding robust solution may be expressed by the optimisation of appropriately combined mean and tail mean criteria, thus remaining easily implementable with auxiliary linear inequalities. Moreover, we use the tail mean concept to develop linear programming implementable robust solution concepts related to risk-averse optimisation criteria.
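
    To make the tail mean concrete: for a minimisation problem it is the average of the worst (largest-cost) fraction of scenario outcomes, and the combined criterion is a convex combination of the ordinary mean and this tail mean. The sketch below computes both directly by sorting; the LP formulation with auxiliary variables is not shown, and the scenario costs are hypothetical.

    ```python
    import numpy as np

    def tail_mean(costs, beta):
        """Mean of the worst (largest) beta-fraction of scenario costs.
        For beta = 1 this is the ordinary mean; as beta -> 0 it tends to the worst case."""
        costs = np.sort(np.asarray(costs))[::-1]          # worst scenarios first
        k = max(1, int(np.ceil(beta * costs.size)))
        return costs[:k].mean()

    def combined_criterion(costs, beta, lam):
        """Convex combination of mean and tail mean, as used for
        interval-probability robust models (lam in [0, 1])."""
        return lam * np.mean(costs) + (1.0 - lam) * tail_mean(costs, beta)

    # Scenario costs of one candidate solution (hypothetical values).
    scenario_costs = np.array([3.0, 5.5, 2.1, 7.8, 4.4, 6.2])
    print(tail_mean(scenario_costs, beta=0.3))            # mean of the two worst scenarios
    print(combined_criterion(scenario_costs, beta=0.3, lam=0.5))
    ```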

  20. A CONCEPTUAL FRAMEWORK FOR MANAGING RADIATION DOSE TO PATIENTS IN DIAGNOSTIC RADIOLOGY USING REFERENCE DOSE LEVELS.

    PubMed

    Almén, Anja; Båth, Magnus

    2016-06-01

    The overall aim of the present work was to develop a conceptual framework for managing radiation dose in diagnostic radiology with the intention to support optimisation. An optimisation process was first derived. The framework for managing radiation dose, based on the derived optimisation process, was then outlined. The optimisation process starts from four stages: providing equipment, establishing methodology, performing examinations and ensuring quality. The optimisation process comprises a series of activities and actions at these stages. The current system of diagnostic reference levels is an activity in the last stage, ensuring quality. The system thus becomes a reactive activity, engaging only to a certain extent the core activity of the radiology department, performing examinations. Three reference dose levels (possible, expected and established) were assigned to the three stages in the optimisation process, excluding ensuring quality. A reasonably achievable dose range is also derived, indicating an acceptable deviation from the established dose level. A reasonable radiation dose for a single patient is within this range. The suggested framework for managing radiation dose should be regarded as one part of the optimisation process. The optimisation process constitutes a variety of complementary activities, where managing radiation dose is only one part. This emphasises the need to take a holistic approach integrating the optimisation process in different clinical activities. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Methods of increasing the performance of radionuclide generators used in nuclear medicine: daughter nuclide build-up optimisation, elution-purification-concentration integration, and effective control of radionuclidic purity.

    PubMed

    Le, Van So; Do, Zoe Phuc-Hien; Le, Minh Khoi; Le, Vicki; Le, Natalie Nha-Truc

    2014-06-10

    Methods of increasing the performance of radionuclide generators used in nuclear medicine radiotherapy and SPECT/PET imaging were developed and detailed for the 99Mo/99mTc and 68Ge/68Ga radionuclide generators as case studies. Optimisation methods for the daughter nuclide build-up versus stand-by time and/or specific activity, using mean progress functions, were developed for increasing the performance of radionuclide generators. As a result of this optimisation, the separation of the daughter nuclide from its parent should be performed at a defined optimal time to avoid deterioration in the specific activity of the daughter nuclide and wasting of the generator's stand-by time, while the daughter nuclide yield is maintained to a reasonably high extent. A new characteristic parameter of the formation-decay kinetics of the parent/daughter nuclide system was found and effectively used in the practice of generator production and utilisation. A method of "early elution schedule" was also developed for increasing the daughter nuclide production yield and specific radioactivity, thus saving the cost of the generator and improving the quality of the daughter radionuclide solution. These newly developed optimisation methods, in combination with a recently developed integrated elution-purification-concentration system for radionuclide generators, are the most suitable way to operate the generator effectively on the basis of economic use and improvement of the purposely suitable quality and specific activity of the produced daughter radionuclides. All these features benefit the economic use of the generator, the improved quality of labelling/scanning, and the lowered cost of nuclear medicine procedures. Besides, a new method of quality control protocol set-up for post-delivery testing of radionuclidic purity has been developed, based on the relationship between the gamma-ray spectrometric detection limit, the required limit of impure radionuclide activity and its measurement certainty, with respect to optimising the decay/measurement time and the product sample activity used for quality control. The optimisation ensures certainty of measurement of the specific impure radionuclide and avoids wasting a useful amount of the valuable purified/concentrated daughter nuclide product. This process is important for the spectrometric measurement of very low activities of impure radionuclide contamination in radioisotope products of much higher activity used in medical imaging and targeted radiotherapy.
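
    The build-up optimisation rests on simple parent-daughter formation-decay kinetics: after an elution, the daughter activity grows, peaks and then decays with the parent, so there is a well-defined elution time that maximises yield without wasting stand-by time. The sketch below evaluates the standard Bateman relation for a 99Mo/99mTc generator under the simplifying assumptions of a freshly eluted column and a branching ratio of 1; it does not reproduce the mean progress functions developed in the paper.

    ```python
    import numpy as np

    def daughter_activity(t, parent_half_life, daughter_half_life, parent_activity0=1.0):
        """Daughter activity grown in on a freshly eluted column
        (simple parent -> daughter Bateman relation, branching ratio 1)."""
        lp = np.log(2.0) / parent_half_life
        ld = np.log(2.0) / daughter_half_life
        return parent_activity0 * ld / (ld - lp) * (np.exp(-lp * t) - np.exp(-ld * t))

    def optimal_elution_time(parent_half_life, daughter_half_life):
        """Time of maximum daughter activity after the previous elution."""
        lp = np.log(2.0) / parent_half_life
        ld = np.log(2.0) / daughter_half_life
        return np.log(ld / lp) / (ld - lp)

    # 99Mo/99mTc generator (half-lives in hours).
    t_opt = optimal_elution_time(66.0, 6.01)
    print(f"maximum 99mTc build-up at about {t_opt:.1f} h after elution")  # roughly 23 h
    ```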

  2. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience.

    PubMed

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases.
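
    BluePyOpt wraps existing evolutionary-algorithm tools rather than reimplementing them; rather than reproduce its API here, the sketch below shows only the underlying idea of an evolutionary search over bounded parameters against a scalar fitness. The objective, bounds and algorithm settings are toy assumptions, not BluePyOpt code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def toy_fitness(params):
        """Stand-in for an electrophysiology objective: distance of the
        simulated features from target features (lower is better)."""
        target = np.array([2.0, -0.5, 1.3])
        return float(np.sum((params - target) ** 2))

    def evolve(bounds, pop_size=40, n_gen=50, sigma=0.1, elite_frac=0.25):
        """Minimal evolutionary search within box bounds."""
        lo, hi = bounds[:, 0], bounds[:, 1]
        pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
        for _ in range(n_gen):
            fitness = np.array([toy_fitness(ind) for ind in pop])
            elite = pop[np.argsort(fitness)[: int(elite_frac * pop_size)]]
            # Offspring: mutate randomly chosen elite parents, clip to bounds.
            parents = elite[rng.integers(0, len(elite), size=pop_size)]
            pop = np.clip(parents + sigma * (hi - lo) * rng.standard_normal(parents.shape), lo, hi)
        return pop[np.argmin([toy_fitness(ind) for ind in pop])]

    bounds = np.array([[-5.0, 5.0]] * 3)   # hypothetical parameter bounds
    print("best parameters:", evolve(bounds))
    ```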

  3. Optimisation of groundwater level monitoring networks using geostatistical modelling based on the Spartan family variogram and a genetic algorithm method

    NASA Astrophysics Data System (ADS)

    Parasyris, Antonios E.; Spanoudaki, Katerina; Kampanis, Nikolaos A.

    2016-04-01

    Groundwater level monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation for agricultural and domestic use. Given the high maintenance costs of these networks, development of tools which can be used by regulators for efficient network design is essential. In this work, a monitoring network optimisation tool is presented. The network optimisation tool couples geostatistical modelling based on the Spartan family variogram with a genetic algorithm method and is applied to the Mires basin in Crete, Greece, an area of high socioeconomic and agricultural interest which suffers from groundwater overexploitation leading to a dramatic decrease of groundwater levels. The purpose of the optimisation tool is to determine which wells to exclude from the monitoring network because they add little or no beneficial information to groundwater level mapping of the area. Unlike previous relevant investigations, the network optimisation tool presented here uses Ordinary Kriging with the recently established non-differentiable Spartan variogram for groundwater level mapping, which, based on a previous geostatistical study in the area, leads to optimal groundwater level mapping. Seventy boreholes operate in the area for groundwater abstraction and water level monitoring. The Spartan variogram gives overall the most accurate groundwater level estimates, followed closely by the power-law model. The geostatistical model is coupled to an integer genetic algorithm method programmed in MATLAB 2015a. The algorithm is used to find the set of wells whose removal leads to the minimum error between the original water level mapping using all the available wells in the network and the groundwater level mapping using the reduced well network (error is defined as the 2-norm of the difference between the original mapping matrix with 70 wells and the mapping matrix of the reduced well network). The solution to the optimisation problem (the best wells to retain in the monitoring network) depends on the total number of wells removed; this number is a management decision. The water level monitoring network of the Mires basin has been optimised six times, by removing 5, 8, 12, 15, 20 and 25 wells from the original network. In order to achieve the optimum solution in the minimum possible computational time, a stall-generations criterion was set for each optimisation scenario. An improvement made to the classic genetic algorithm was to change the mutation and crossover fractions with respect to the change in the mean fitness value. This introduces randomness into reproduction when the solution converges, in order to avoid local minima, or more directed reproduction (a higher crossover ratio) when there is a larger change in the mean fitness value. The choice of the integer genetic algorithm in MATLAB 2015a imposes the need to add custom selection and crossover-mutation functions. Therefore, custom population and crossover-mutation-selection functions were created to set the initial population type to custom and to allow the mutation and crossover probabilities to change with respect to the convergence of the genetic algorithm, thus achieving higher accuracy. The application of the network optimisation tool to the Mires basin indicates that 25 wells can be removed with a relatively small deterioration of the groundwater level map.
The results indicate the robustness of the network optimisation tool: wells were removed from high well-density areas while preserving the spatial pattern of the original groundwater level map.
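
    A minimal sketch of the same idea is given below: a small genetic algorithm searches over which wells to drop, scoring each candidate by the 2-norm between the full-network map and the reduced-network map. Inverse-distance weighting stands in for the Ordinary Kriging/Spartan-variogram mapping, and the coordinates, levels and algorithm settings are synthetic assumptions rather than the Mires basin data or the MATLAB implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def idw_map(xy_wells, levels, xy_grid, power=2.0):
        """Inverse-distance-weighted interpolation, used here as a simple
        stand-in for the Ordinary Kriging mapping of the study."""
        d = np.linalg.norm(xy_grid[:, None, :] - xy_wells[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, 1e-9) ** power
        return (w @ levels) / w.sum(axis=1)

    def map_error(keep_mask, xy_wells, levels, xy_grid, full_map):
        reduced = idw_map(xy_wells[keep_mask], levels[keep_mask], xy_grid)
        return np.linalg.norm(full_map - reduced)          # 2-norm of the map difference

    def optimise_network(xy_wells, levels, xy_grid, n_remove, pop_size=30, n_gen=60):
        """Tiny genetic algorithm over which wells to remove (fixed count)."""
        n = len(levels)
        full_map = idw_map(xy_wells, levels, xy_grid)
        def random_mask():
            mask = np.ones(n, dtype=bool)
            mask[rng.choice(n, n_remove, replace=False)] = False
            return mask
        pop = [random_mask() for _ in range(pop_size)]
        for _ in range(n_gen):
            errs = [map_error(m, xy_wells, levels, xy_grid, full_map) for m in pop]
            order = np.argsort(errs)
            parents = [pop[i] for i in order[: pop_size // 2]]
            children = []
            for _ in range(pop_size - len(parents)):
                child = parents[rng.integers(len(parents))].copy()
                # Mutation: swap one removed well with one retained well.
                removed, kept = np.where(~child)[0], np.where(child)[0]
                child[rng.choice(removed)], child[rng.choice(kept)] = True, False
                children.append(child)
            pop = parents + children
        errs = [map_error(m, xy_wells, levels, xy_grid, full_map) for m in pop]
        return pop[int(np.argmin(errs))]

    # Toy data: 70 wells and a coarse mapping grid (synthetic coordinates and levels).
    xy_wells = rng.uniform(0, 10, size=(70, 2))
    levels = 50 - 2.0 * xy_wells[:, 0] + rng.normal(0, 0.5, 70)
    gx, gy = np.meshgrid(np.linspace(0, 10, 20), np.linspace(0, 10, 20))
    xy_grid = np.column_stack([gx.ravel(), gy.ravel()])
    best_mask = optimise_network(xy_wells, levels, xy_grid, n_remove=5)
    print("wells to remove:", np.where(~best_mask)[0])
    ```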

  4. Development and optimisation of atorvastatin calcium loaded self-nanoemulsifying drug delivery system (SNEDDS) for enhancing oral bioavailability: in vitro and in vivo evaluation.

    PubMed

    Kassem, Abdulsalam M; Ibrahim, Hany M; Samy, Ahmed M

    2017-05-01

    The objective of this study was to develop and optimise a self-nanoemulsifying drug delivery system (SNEDDS) of atorvastatin calcium (ATC) for improving the dissolution rate and eventually the oral bioavailability. Ternary phase diagrams were constructed on the basis of solubility and emulsification studies. The composition of ATC-SNEDDS was optimised using the Box-Behnken optimisation design. Optimised ATC-SNEDDS was characterised for various physicochemical properties. Pharmacokinetic, pharmacodynamic and histological studies were performed in rats. Optimised ATC-SNEDDS resulted in a droplet size of 5.66 nm, zeta potential of -19.52 mV, t90 of 5.43 min, and completely released ATC within 30 min irrespective of the pH of the medium. The area under the curve of optimised ATC-SNEDDS in rats was 2.34-fold higher than that of ATC suspension. Pharmacodynamic studies revealed a significant reduction in serum lipids of rats with fatty liver. Photomicrographs showed improvement in hepatocyte structure. In this study, we confirmed that ATC-SNEDDS would be a promising approach for improving the oral bioavailability of ATC.
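
    For reference, a coded Box-Behnken design for three factors consists of the 12 runs with each pair of factors at ±1 and the remaining factor at 0, plus centre points. The sketch below constructs that matrix; the choice of three factors and three centre points is an assumption for illustration, not the authors' exact design.

    ```python
    import numpy as np
    from itertools import combinations

    def box_behnken(n_factors, n_center=3):
        """Coded Box-Behnken design: all +/-1 combinations for each pair of
        factors with the remaining factors at 0, plus centre points."""
        runs = []
        for i, j in combinations(range(n_factors), 2):
            for a in (-1, 1):
                for b in (-1, 1):
                    row = np.zeros(n_factors)
                    row[i], row[j] = a, b
                    runs.append(row)
        runs.extend(np.zeros(n_factors) for _ in range(n_center))
        return np.array(runs)

    # Three formulation factors (e.g. oil, surfactant, co-surfactant levels): 15 runs.
    design = box_behnken(3)
    print(design.shape)   # (15, 3) -> 12 edge points + 3 centre points
    ```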

  5. Statistical methods for convergence detection of multi-objective evolutionary algorithms.

    PubMed

    Trautmann, H; Wagner, T; Naujoks, B; Preuss, M; Mehnen, J

    2009-01-01

    In this paper, two approaches for estimating the generation in which a multi-objective evolutionary algorithm (MOEA) shows statistically significant signs of convergence are introduced. A set-based perspective is taken where convergence is measured by performance indicators. The proposed techniques fulfill the requirements of proper statistical assessment on the one hand and efficient optimisation for real-world problems on the other hand. The first approach accounts for the stochastic nature of the MOEA by repeating the optimisation runs for increasing generation numbers and analysing the performance indicators using statistical tools. This technique results in a very robust offline procedure. Moreover, an online convergence detection method is introduced as well. This method automatically stops the MOEA when either the variance of the performance indicators falls below a specified threshold or a stagnation of their overall trend is detected. Both methods are analysed and compared for two MOEAs and on different classes of benchmark functions. It is shown that the methods successfully operate on all stated problems, needing fewer function evaluations while preserving good approximation quality at the same time.
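
    The online stopping rule can be sketched as follows: track a performance indicator (e.g. hypervolume) per generation and stop once its variance over a moving window drops below a threshold or its linear trend stagnates. The window size, thresholds and the placeholder driver functions below are assumptions, not the statistics used in the paper.

    ```python
    import numpy as np

    def should_stop(indicator_history, window=10, var_threshold=1e-6, slope_threshold=1e-4):
        """Online stopping rule: stop when, over the last `window` generations,
        the performance indicator's variance is below a threshold or its
        linear trend (slope per generation) has stagnated."""
        if len(indicator_history) < window:
            return False
        recent = np.asarray(indicator_history[-window:])
        if recent.var() < var_threshold:
            return True
        slope = np.polyfit(np.arange(window), recent, 1)[0]
        return abs(slope) < slope_threshold

    # Usage inside an MOEA loop (pseudo-driver; `run_one_generation` and
    # `hypervolume` are placeholders for the user's own implementations):
    # history = []
    # while not should_stop(history):
    #     population = run_one_generation(population)
    #     history.append(hypervolume(population))
    ```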

  6. Dual ant colony operational modal analysis parameter estimation method

    NASA Astrophysics Data System (ADS)

    Sitarz, Piotr; Powałka, Bartosz

    2018-01-01

    Operational Modal Analysis (OMA) is a common technique used to examine the dynamic properties of a system. Contrary to experimental modal analysis, the input signal is generated in the object's ambient environment. Operational modal analysis mainly aims at determining the number of pole pairs and at estimating modal parameters. Many methods are used for parameter identification. Some methods operate in the time domain while others operate in the frequency domain. The former use correlation functions, the latter spectral density functions. However, while some methods require the user to select poles from a stabilisation diagram, others try to automate the selection process. The dual ant colony operational modal analysis parameter estimation method (DAC-OMA) presents a new approach to the problem, avoiding the issues involved in the stabilisation diagram. The presented algorithm is fully automated. It uses deterministic methods to define the intervals of the estimated parameters, thus reducing the problem to an optimisation task which is conducted with dedicated software based on an ant colony optimisation algorithm. The combination of deterministic methods restricting parameter intervals and artificial intelligence yields very good results, also for closely spaced modes and significantly varied mode shapes within one measurement point.

  7. Application of Three Existing Stope Boundary Optimisation Methods in an Operating Underground Mine

    NASA Astrophysics Data System (ADS)

    Erdogan, Gamze; Yavuz, Mahmut

    2017-12-01

    The underground mine planning and design optimisation process has received little attention because of the complexity and variability of problems in underground mines. Although a number of optimisation studies and software tools are available, and some of them, in particular, have been implemented effectively to determine the ultimate pit limits in open pit mines, there is still a lack of studies on the optimisation of ultimate stope boundaries in underground mines. The proposed approaches for this purpose aim at maximising the economic profit by selecting the best possible layout under operational, technical and physical constraints. In this paper, three existing heuristic techniques, the Floating Stope Algorithm, the Maximum Value Algorithm and the Mineable Shape Optimiser (MSO), are examined for optimisation of the stope layout in a case study. Each technique is assessed in terms of applicability, algorithm capabilities and limitations, considering the underground mine planning challenges. Finally, the results are evaluated and compared.

  8. Design Optimisation of a Magnetic Field Based Soft Tactile Sensor

    PubMed Central

    Raske, Nicholas; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Culmer, Peter; Hewson, Robert

    2017-01-01

    This paper investigates the design optimisation of a magnetic field based soft tactile sensor, comprised of a magnet and Hall effect module separated by an elastomer. The aim was to minimise sensitivity of the output force with respect to the input magnetic field; this was achieved by varying the geometry and material properties. Finite element simulations determined the magnetic field and structural behaviour under load. Genetic programming produced phenomenological expressions describing these responses. Optimisation studies constrained by a measurable force and stable loading conditions were conducted; these produced Pareto sets of designs from which the optimal sensor characteristics were selected. The optimisation demonstrated a compromise between sensitivity and the measurable force; a fabricated version of the optimised sensor validated the improvements made using this methodology. The approach presented can be applied in general for optimising soft tactile sensor designs over a range of applications and sensing modes. PMID:29099787

  9. Flow chemistry meets advanced functional materials.

    PubMed

    Myers, Rebecca M; Fitzpatrick, Daniel E; Turner, Richard M; Ley, Steven V

    2014-09-22

    Flow chemistry and continuous processing techniques are beginning to have a profound impact on the production of functional materials ranging from quantum dots, nanoparticles and metal organic frameworks to polymers and dyes. These techniques provide robust procedures which not only enable accurate control of the product material's properties but they are also ideally suited to conducting experiments on scale. The modular nature of flow and continuous processing equipment rapidly facilitates reaction optimisation and variation in function of the products. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Study protocol for the optimisation, feasibility testing and pilot cluster randomised trial of Positive Choices: a school-based social marketing intervention to promote sexual health, prevent unintended teenage pregnancies and address health inequalities in England.

    PubMed

    Ponsford, Ruth; Allen, Elizabeth; Campbell, Rona; Elbourne, Diana; Hadley, Alison; Lohan, Maria; Melendez-Torres, G J; Mercer, Catherine H; Morris, Steve; Young, Honor; Bonell, Chris

    2018-01-01

    Since the introduction of the Teenage Pregnancy Strategy (TPS), England's under-18 conception rate has fallen by 55%, but a continued focus on prevention is needed to maintain and accelerate progress. The teenage birth rate remains higher in the UK than in comparable Western European countries. Previous trials indicate that school-based social marketing interventions are a promising approach to addressing teenage pregnancy and improving sexual health. Such interventions are yet to be trialled in the UK. This study aims to optimise and establish the feasibility and acceptability of one such intervention: Positive Choices. Design: Optimisation, feasibility testing and pilot cluster randomised trial. Interventions: The Positive Choices intervention comprises a student needs survey, a student/staff-led School Health Promotion Council (SHPC), a classroom curriculum for year nine students covering social and emotional skills and sex education, student-led social marketing activities, parent information and a review of school sexual health services. Systematic optimisation of Positive Choices will be carried out with the National Children's Bureau Sex Education Forum (NCB SEF), one state secondary school in England and other youth and policy stakeholders. Feasibility testing will involve the same state secondary school and will assess progression criteria to advance to the pilot cluster RCT. The pilot cluster RCT with integral process evaluation will involve six different state secondary schools (four intervention and two control schools) and will assess the feasibility and utility of progressing to a full effectiveness trial. The following outcome measures will be trialled as part of the pilot: self-reported pregnancy and unintended pregnancy (initiation of pregnancy for boys) and sexually transmitted infections; age of sexual debut, number of sexual partners, use of contraception at first and last sex, and non-volitional sex; and educational attainment. The feasibility of linking administrative data on births and terminations to self-report survey data to measure our primary outcome (unintended teenage pregnancy) will also be tested. This will be the first UK-based pilot trial of a school-wide social marketing intervention to reduce unintended teenage pregnancy and improve sexual health. If this study indicates feasibility and acceptability of the optimised Positive Choices intervention in English secondary schools, plans will be initiated for a phase III trial and economic evaluation of the intervention. ISRCTN registry (ISCTN12524938; registered 03/07/2017).

  11. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience

    PubMed Central

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B.; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases. PMID:27375471

  12. Generalised form of a power law threshold function for rainfall-induced landslides

    NASA Astrophysics Data System (ADS)

    Cepeda, Jose; Díaz, Manuel Roberto; Nadim, Farrokh; Høeg, Kaare; Elverhøi, Anders

    2010-05-01

    The following new function is proposed for estimating thresholds for rainfall-triggered landslides: $I = \alpha_1 A_n^{\alpha_2} D^{\beta}$, where $I$ is rainfall intensity in mm/h, $D$ is rainfall duration in h, $A_n$ is the n-hour or n-day antecedent precipitation, and $\alpha_1$, $\alpha_2$, $\beta$ and $n$ are threshold parameters. A threshold model that combines two functions with different durations of antecedent precipitation is also introduced. A storm observation exceeds the threshold when the storm parameters are located at or above the two functions simultaneously. A novel optimisation procedure for estimating the threshold parameters is proposed using Receiver Operating Characteristics (ROC) analysis. The new threshold function and optimisation procedure are applied to estimating thresholds for the triggering of debris flows in the Western Metropolitan Area of San Salvador (AMSS), El Salvador, where up to 500 casualties were produced by a single event. The resulting thresholds are $I = 2322\,A_{7\mathrm{d}}^{-1} D^{-0.43}$ and $I = 28534\,A_{150\mathrm{d}}^{-1} D^{-0.43}$ for debris flows having volumes greater than 3000 m³. Thresholds are also derived for debris flows greater than 200 000 m³ and for hyperconcentrated flows initiating in burned areas caused by forest fires. The new thresholds show an improved performance compared to the traditional formulations, indicated by a reduction in false alarms from 51 to 5 for the 3000 m³ thresholds and from 6 to 0 false alarms for the 200 000 m³ thresholds.
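
    A minimal version of the ROC-based calibration can be sketched as a grid search: each candidate parameter set classifies storms as exceeding the threshold or not, and the set that best separates landslide-triggering from non-triggering storms (here scored by hit rate minus false-alarm rate) is retained. The storm catalogue, parameter grids and scoring rule below are illustrative assumptions, not the AMSS data or the paper's exact procedure.

    ```python
    import numpy as np
    from itertools import product

    def exceeds(intensity, duration, antecedent, a1, a2, beta):
        """True when a storm lies at or above the threshold I = a1 * An**a2 * D**beta."""
        return intensity >= a1 * antecedent**a2 * duration**beta

    def roc_score(params, storms, landslide):
        """Youden-type score = hit rate - false-alarm rate for one parameter set."""
        a1, a2, beta = params
        pred = exceeds(storms[:, 0], storms[:, 1], storms[:, 2], a1, a2, beta)
        hits = np.sum(pred & landslide) / max(landslide.sum(), 1)
        false_alarms = np.sum(pred & ~landslide) / max((~landslide).sum(), 1)
        return hits - false_alarms

    def calibrate(storms, landslide, a1_grid, a2_grid, beta_grid):
        grid = list(product(a1_grid, a2_grid, beta_grid))
        scores = [roc_score(p, storms, landslide) for p in grid]
        return grid[int(np.argmax(scores))]

    # Toy catalogue: columns are intensity (mm/h), duration (h), antecedent precipitation (mm).
    rng = np.random.default_rng(2)
    storms = np.column_stack([rng.uniform(1, 100, 200),
                              rng.uniform(1, 48, 200),
                              rng.uniform(10, 300, 200)])
    landslide = storms[:, 0] * storms[:, 1] > 800          # synthetic "observed" triggering
    best = calibrate(storms, landslide,
                     a1_grid=np.logspace(1, 4, 20),
                     a2_grid=[-1.0, -0.5],
                     beta_grid=[-0.6, -0.43, -0.3])
    print("best (a1, a2, beta):", best)
    ```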

  13. Economic impact of optimising antiretroviral treatment in human immunodeficiency virus-infected adults with suppressed viral load in Spain, by implementing the grade A-1 evidence recommendations of the 2015 GESIDA/National AIDS Plan.

    PubMed

    Ribera, Esteban; Martínez-Sesmero, José Manuel; Sánchez-Rubio, Javier; Rubio, Rafael; Pasquau, Juan; Poveda, José Luis; Pérez-Mitru, Alejandro; Roldán, Celia; Hernández-Novoa, Beatriz

    2018-03-01

    The objective of this study is to estimate the economic impact associated with the optimisation of triple antiretroviral treatment (ART) in patients with undetectable viral load according to the recommendations from the GeSIDA/PNS (2015) Consensus and their applicability in the Spanish clinical practice. A pharmacoeconomic model was developed based on data from a National Hospital Prescription Survey on ART (2014) and the A-I evidence recommendations for the optimisation of ART from the GeSIDA/PNS (2015) consensus. The optimisation model took into account the willingness to optimise a particular regimen and other assumptions, and the results were validated by an expert panel in HIV infection (Infectious Disease Specialists and Hospital Pharmacists). The analysis was conducted from the NHS perspective, considering the annual wholesale price and accounting for deductions stated in the RD-Law 8/2010 and the VAT. The expert panel selected six optimisation strategies, and estimated that 10,863 (13.4%) of the 80,859 patients in Spain currently on triple ART, would be candidates to optimise their ART, leading to savings of €15.9M/year (2.4% of total triple ART drug cost). The most feasible strategies (>40% of patients candidates for optimisation, n=4,556) would be optimisations to ATV/r+3TC therapy. These would produce savings between €653 and €4,797 per patient per year depending on baseline triple ART. Implementation of the main optimisation strategies recommended in the GeSIDA/PNS (2015) Consensus into Spanish clinical practice would lead to considerable savings, especially those based in dual therapy with ATV/r+3TC, thus contributing to the control of pharmaceutical expenditure and NHS sustainability. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  14. Optimisation of the hybrid renewable energy system by HOMER, PSO and CPSO for the study area

    NASA Astrophysics Data System (ADS)

    Khare, Vikas; Nema, Savita; Baredar, Prashant

    2017-04-01

    This study is based on simulation and optimisation of the renewable energy system of the police control room at Sagar in central India. To analyse this hybrid system, the meteorological data of solar insolation and hourly wind speeds of Sagar in central India (longitude 78°45‧ and latitude 23°50‧) have been considered. The pattern of load consumption is studied and suitably modelled for optimisation of the hybrid energy system using the HOMER software. The results are compared with those of the particle swarm optimisation and the chaotic particle swarm optimisation algorithms. The use of these two algorithms to optimise the hybrid system leads to a higher quality result with faster convergence. Based on the optimisation results, it has been found that replacing conventional energy sources by the solar-wind hybrid renewable energy system would be a feasible solution for the distribution of electric power as a stand-alone application at the police control room. This system is more environmentally friendly than the conventional diesel generator. Fuel costs are reduced by approximately 70-80% compared with the conventional diesel generator.
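
    The particle swarm step can be illustrated with a minimal optimiser over component sizes against a cost function that penalises capital cost and unmet demand. The sketch below is a generic PSO, not the HOMER or CPSO implementation, and the toy cost model, bounds and coefficients are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def pso(cost, bounds, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5):
        """Minimal particle swarm optimiser over box-bounded decision variables."""
        lo, hi = bounds[:, 0], bounds[:, 1]
        x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
        v = np.zeros_like(x)
        pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
        gbest = pbest[np.argmin(pbest_cost)]
        for _ in range(n_iter):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            costs = np.array([cost(p) for p in x])
            improved = costs < pbest_cost
            pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
            gbest = pbest[np.argmin(pbest_cost)]
        return gbest, pbest_cost.min()

    # Toy sizing problem: decision variables are PV area (m^2) and number of wind turbines;
    # the cost function (capital cost plus a penalty for unmet load) is entirely hypothetical.
    def system_cost(x):
        pv_area, n_turbines = x
        energy = 150.0 * pv_area + 4000.0 * n_turbines        # kWh/year produced (toy model)
        unmet = max(0.0, 120000.0 - energy)                    # annual demand of 120 MWh
        return 90.0 * pv_area + 15000.0 * n_turbines + 0.5 * unmet

    best, best_cost = pso(system_cost, bounds=np.array([[0.0, 1000.0], [0.0, 20.0]]))
    print("optimal sizing:", best, "cost:", best_cost)
    ```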

  15. A new paradigm in personal dosimetry using LiF:Mg,Cu,P.

    PubMed

    Cassata, J R; Moscovitch, M; Rotunda, J E; Velbeck, K J

    2002-01-01

    The United States Navy has been monitoring personnel for occupational exposure to ionising radiation since 1947. Film was used exclusively until 1973, when thermoluminescence dosemeters were introduced; these have been used to the present time. In 1994, a joint research project between the Naval Dosimetry Center, Georgetown University, and Saint Gobain Crystals and Detectors (formerly Bicron RMP, formerly Harshaw TLD) began to develop a state-of-the-art thermoluminescent dosimetry system. The study was conducted from a large-scale dosimetry processor point of view, with emphasis on a systems approach. Significant improvements were achieved by replacing the LiF:Mg,Ti elements with LiF:Mg,Cu,P TL elements, owing to the significant increase in sensitivity, the linearity and the negligible fading. Dosemeter filters were optimised for gamma and X ray energy discrimination using Monte Carlo modelling (MCNP), resulting in significant improvement in accuracy and precision. Further improvements were achieved through the use of neural-network-based dose calculation algorithms. Both back-propagation and functional-link methods were implemented and the data compared, with essentially the same results. Several operational aspects of the system are discussed, including (1) background subtraction using control dosemeters, (2) selection criteria for control dosemeters, (3) optimisation of the TLD readers, (4) calibration methodology, and (5) the optimisation of the heating profile.

  16. Experimental design-based isotope-dilution SPME-GC/MS method development for the analysis of smoke flavouring products.

    PubMed

    Giri, Anupam; Zelinkova, Zuzana; Wenzl, Thomas

    2017-12-01

    For the implementation of Regulation (EC) No 2065/2003, related to smoke flavourings used or intended for use in or on foods, a method based on solid-phase micro-extraction (SPME) GC/MS was developed for the characterisation of liquid smoke products. A statistically based experimental design (DoE) was used for method optimisation. The best general conditions to quantitatively analyse the liquid smoke compounds were obtained with a polydimethylsiloxane/divinylbenzene (PDMS/DVB) fibre, 60°C extraction temperature, 30 min extraction time, 250°C desorption temperature, 180 s desorption time, 15 s agitation time, and 250 rpm agitation speed. Under the optimised conditions, 119 wood pyrolysis products including furan/pyran derivatives, phenols, guaiacol, syringol, benzenediol and their derivatives, cyclic ketones, and several other heterocyclic compounds were identified. The proposed method was repeatable (RSD <5%) and the calibration functions were linear for all compounds under study. Nine isotopically labelled internal standards were used for improving quantification of analytes by compensating for matrix effects that might affect headspace equilibrium and extractability of compounds. The optimised isotope dilution SPME-GC/MS based analytical method proved to be fit for purpose, allowing the rapid identification and quantification of volatile compounds in liquid smoke flavourings.

  17. Challenges and Potential Solutions – Individualised Antibiotic Dosing at the Bedside for Critically Ill Patients: a structured review

    PubMed Central

    Roberts, Jason A.; Aziz, Mohd Hafiz Abdul; Lipman, Jeffrey; Mouton, Johan W.; Vinks, Alexander A.; Felton, Timothy W.; Hope, William W.; Farkas, Andras; Neely, Michael N.; Schentag, Jerome J.; Drusano, George; Frey, Otto R.; Theuretzbacher, Ursula; Kuti, Joseph L.

    2014-01-01

    Summary: Infections in critically ill patients are associated with persistently poor clinical outcomes. These patients have severely altered and variable antibiotic pharmacokinetics and are infected by less susceptible pathogens. Antibiotic dosing that does not account for these features is likely to result in sub-optimal outcomes. In this paper, we review the patient- and pathogen-related challenges that contribute to inadequate antibiotic dosing and discuss how a process for individualised antibiotic therapy, which increases the accuracy of dosing, can be implemented to further optimise care for the critically ill patient. The process for optimised antibiotic dosing firstly requires determination of the physiological derangements in the patient that can alter antibiotic concentrations, including altered fluid status, microvascular failure, serum albumin concentrations as well as altered renal and hepatic function. Secondly, the susceptibility of the infecting pathogen should be determined through liaison with the microbiology laboratory. The patient and pathogen challenges can then be addressed by combining susceptibility data with measured antibiotic concentration data (where possible) in clinical dosing software. Such software uses pharmacokinetic-pharmacodynamic (PK/PD) models from critically ill patients to accurately predict the dosing requirements for the individual patient, with the aim of optimising antibiotic exposure and maximising effectiveness. PMID:24768475

  18. Simulation studies promote technological development of radiofrequency phased array hyperthermia.

    PubMed

    Wust, P; Seebass, M; Nadobny, J; Deuflhard, P; Mönich, G; Felix, R

    1996-01-01

    A treatment planning program package for radiofrequency hyperthermia has been developed. It consists of software modules for processing three-dimensional computerized tomography (CT) data sets, manual segmentation, generation of tetrahedral grids, numerical calculation and optimisation of three-dimensional E field distributions using a volume surface integral equation algorithm as well as temperature distributions using an adaptive multilevel finite-element code, and graphical tools for simultaneous representation of CT data and simulation results. Heat treatments are limited by hot spots in healthy tissues caused by E field maxima at electrical interfaces (bone/muscle). In order to reduce or avoid hot spots, suitable objective functions are derived from power deposition patterns and temperature distributions, and are utilised to optimise antenna parameters (phases, amplitudes). The simulation and optimisation tools have been applied to estimate the improvements that could be reached by upgrades of the clinically used SIGMA-60 applicator (consisting of a single ring of four antenna pairs). The investigated upgrades are an increased number of antennas and channels (a triple ring of 3 x 8 antennas) and variation of antenna inclination. Significant improvement of index temperatures (1-2 degrees C) is achieved by upgrading the single ring to a triple ring with free phase selection for every antenna or antenna pair. Antenna amplitudes and inclinations proved to be less important parameters.

  19. Ant colony optimisation-direct cover: a hybrid ant colony direct cover technique for multi-level synthesis of multiple-valued logic functions

    NASA Astrophysics Data System (ADS)

    Abd-El-Barr, Mostafa

    2010-12-01

    The use of non-binary (multiple-valued) logic in the synthesis of digital systems can lead to savings in chip area. Advances in very large scale integration (VLSI) technology have enabled the successful implementation of multiple-valued logic (MVL) circuits. A number of heuristic algorithms for the synthesis of (near) minimal sum-of-products (two-level) realisations of MVL functions have been reported in the literature. The direct cover (DC) technique is one such algorithm. The ant colony optimisation (ACO) algorithm is a meta-heuristic that uses constructive greediness to explore a large solution space in finding (near) optimal solutions. The ACO algorithm mimics the behaviour of real ants, which find the shortest path to reach food sources. We have previously introduced an ACO-based heuristic for the synthesis of two-level MVL functions. In this article, we introduce the ACO-DC hybrid technique for the synthesis of multi-level MVL functions. The basic idea is to use an ant to decompose a given MVL function into a number of levels and then synthesise each sub-function using a DC-based technique. The results obtained using the proposed approach are compared to those obtained using existing techniques reported in the literature. A benchmark set consisting of 50,000 randomly generated 2-variable 4-valued functions is used in the comparison. The proposed ACO-DC technique is shown to produce efficient realisations in terms of the average number of gates (as a measure of chip area) needed for the synthesis of a given MVL function.
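
    The ACO ingredient can be illustrated in miniature: artificial ants sample candidate solutions from pheromone values, the iteration-best solution reinforces the pheromone after evaporation, and the search gradually concentrates on good assignments. The sketch below applies this to a toy binary objective standing in for gate count; it is not the MVL synthesis algorithm of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def aco_binary(cost, n_bits, n_ants=20, n_iter=100, rho=0.1):
        """Generic ant colony optimisation over binary decision strings:
        each bit keeps a pheromone value giving the probability of choosing 1."""
        pheromone = np.full(n_bits, 0.5)
        best, best_cost = None, np.inf
        for _ in range(n_iter):
            ants = (rng.random((n_ants, n_bits)) < pheromone).astype(int)
            costs = np.array([cost(a) for a in ants])
            i = int(np.argmin(costs))
            if costs[i] < best_cost:
                best, best_cost = ants[i].copy(), costs[i]
            # Evaporate, then reinforce the iteration-best solution.
            pheromone = (1 - rho) * pheromone + rho * ants[i]
            pheromone = np.clip(pheromone, 0.05, 0.95)
        return best, best_cost

    # Toy objective standing in for "number of gates": distance to a hidden target string.
    target = rng.integers(0, 2, 16)
    best, best_cost = aco_binary(lambda a: int(np.sum(a != target)), n_bits=16)
    print(best_cost)   # 0 when the target assignment is recovered
    ```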

  20. A support vector machine for predicting defibrillation outcomes from waveform metrics.

    PubMed

    Howe, Andrew; Escalona, Omar J; Di Maio, Rebecca; Massot, Bertrand; Cromie, Nick A; Darragh, Karen M; Adgey, Jennifer; McEneaney, David J

    2014-03-01

    Algorithms to predict shock success based on VF waveform metrics could significantly enhance resuscitation by optimising the timing of defibrillation. The aim was to investigate robust methods of predicting defibrillation success in VF cardiac arrest patients using a support vector machine (SVM) optimisation approach. Frequency-domain (AMSA, dominant frequency and median frequency) and time-domain (slope and RMS amplitude) VF waveform metrics were calculated in a 4.1Y window prior to defibrillation. Conventional prediction test validity assessment of each waveform parameter was conducted, using AUC>0.6 as the criterion for inclusion as a corroborative attribute processed by the SVM classification model. The latter used a Gaussian radial-basis-function (RBF) kernel and the error penalty factor C was fixed to 1. A two-fold cross-validation resampling technique was employed. A total of 41 patients had 115 defibrillation instances. The AMSA, slope and RMS waveform metrics passed test validation with AUC>0.6 for predicting termination of VF and return to organised rhythm. Predictive accuracy of the optimised SVM design for termination of VF was 81.9% (± 1.24 SD); positive and negative predictivity were respectively 84.3% (± 1.98 SD) and 77.4% (± 1.24 SD); sensitivity and specificity were 87.6% (± 2.69 SD) and 71.6% (± 9.38 SD) respectively. AMSA, slope and RMS were the best VF waveform frequency-time parameters for predicting termination of VF according to the test validity assessment. This a priori knowledge can be used for a simplified optimised SVM design that combines the predictive attributes of these VF waveform metrics for improved prediction accuracy and generalisation performance without requiring the definition of any threshold value on waveform metrics. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
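
    A sketch of this pipeline using scikit-learn is given below: candidate metrics are screened by univariate AUC > 0.6, and the retained ones feed an RBF-kernel SVM with C = 1 assessed by two-fold cross-validation. The data are synthetic stand-ins generated for illustration, not the patient dataset.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(5)

    # Synthetic stand-in data: rows are defibrillation attempts, columns are
    # the candidate VF waveform metrics; y = 1 for termination of VF.
    names = ["AMSA", "dominant_freq", "median_freq", "slope", "RMS"]
    X = rng.normal(size=(115, 5))
    y = (X[:, 0] + 0.8 * X[:, 3] + 0.6 * X[:, 4] + rng.normal(0, 1, 115) > 0).astype(int)

    # Step 1: univariate test validity; keep metrics with AUC > 0.6.
    keep = [j for j in range(X.shape[1]) if roc_auc_score(y, X[:, j]) > 0.6]
    print("retained metrics:", [names[j] for j in keep])

    # Step 2: RBF-kernel SVM with error penalty C = 1, assessed by two-fold CV.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    acc = cross_val_score(clf, X[:, keep], y, cv=2, scoring="accuracy")
    print("two-fold CV accuracy:", acc.mean())
    ```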

  1. Optimisation of 12 MeV electron beam simulation using variance reduction technique

    NASA Astrophysics Data System (ADS)

    Jayamani, J.; Termizi, N. A. S. Mohd; Kamarulzaman, F. N. Mohd; Aziz, M. Z. Abdul

    2017-05-01

    Monte Carlo (MC) simulation for electron beam radiotherapy consumes a long computation time. An algorithm called the variance reduction technique (VRT) was implemented in the MC calculation to shorten this duration. This work focused on optimisation of the VRT parameters, namely electron range rejection and the number of particle histories. The EGSnrc MC source code was used to simulate (BEAMnrc code) and validate (DOSXYZnrc code) the Siemens Primus linear accelerator model with non-VRT parameters. The validated MC model simulation was repeated applying the VRT parameter (electron range rejection), controlled by a global electron cut-off energy of 1, 2 and 5 MeV, using 20 × 10^7 particle histories. The 5 MeV range rejection generated the fastest MC simulation, with a 50% reduction in computation time compared to the non-VRT simulation. Thus, 5 MeV electron range rejection was used in the particle-history analysis, which ranged from 7.5 × 10^7 to 20 × 10^7 histories. In this study, with a 5 MeV electron cut-off and 10 × 10^7 particle histories, the simulation was four times faster than the non-VRT calculation with 1% deviation. Proper understanding and use of VRT can significantly reduce the MC electron beam calculation duration while preserving its accuracy.

  2. Advanced treatment planning using direct 4D optimisation for pencil-beam scanned particle therapy

    NASA Astrophysics Data System (ADS)

    Bernatowicz, Kinga; Zhang, Ye; Perrin, Rosalind; Weber, Damien C.; Lomax, Antony J.

    2017-08-01

    We report on the development of a new four-dimensional (4D) optimisation approach for scanned proton beams, which incorporates both irregular motion patterns and the delivery dynamics of the treatment machine into the plan optimiser. Furthermore, we assess the effectiveness of this technique to reduce dose to critical structures in proximity to moving targets, while maintaining effective target dose homogeneity and coverage. The proposed approach has been tested using both a simulated phantom and a clinical liver cancer case, and allows for realistic 4D calculations and optimisation using irregular breathing patterns extracted from e.g. 4DCT-MRI (4D computed tomography-magnetic resonance imaging). 4D dose distributions resulting from our 4D optimisation can achieve almost the same quality as static plans, independent of the studied geometry/anatomy or selected motion (regular and irregular). Additionally, the current implementation of the 4D optimisation approach requires less than 3 min to find the solution for a single field planned on the 4DCT of a liver cancer patient. Although 4D optimisation allows for realistic calculations using irregular breathing patterns, it is very sensitive to variations from the planned motion. Based on a sensitivity analysis, target dose homogeneity comparable to static plans (D5-D95 < 5%) was found only for differences in amplitude of up to 1 mm, for changes in respiratory phase < 200 ms and for changes in the breathing period of < 20 ms in comparison to the motions used during optimisation. As such, methods to robustly deliver 4D optimised plans employing 4D intensity-modulated delivery are discussed.

  3. Critical aspects for the reliable headspace analysis of plants cultivated in vitro.

    PubMed

    Maes, K; Vercammen, J; Pham-Tuan, H; Sandra, P; Debergh, P C

    2001-01-01

    Various factors controlling the recovery of volatile organic compounds during in vitro headspace analysis of tomato plants (Lycopersicon esculentum Mill. 'Moneymaker'), sampled using solid phase micro-extraction, were evaluated and optimised. The variations in headspace composition were determined as a function of time, and following in vitro wounding of the plant.

  4. Efficient methods for enol phosphate synthesis using carbon-centred magnesium bases.

    PubMed

    Kerr, William J; Lindsay, David M; Patel, Vipulkumar K; Rajamanickam, Muralikrishnan

    2015-10-28

    Efficient conversion of ketones into kinetic enol phosphates under mild and accessible conditions has been realised using the developed methods with di-tert-butylmagnesium and bismesitylmagnesium. Optimisation of the quench protocol resulted in high yields of enol phosphates from a range of cyclohexanones and aryl methyl ketones, with tolerance of a range of additional functional units.

  5. Optimisation of insect cell growth in deep-well blocks: development of a high-throughput insect cell expression screen.

    PubMed

    Bahia, Daljit; Cheung, Robert; Buchs, Mirjam; Geisse, Sabine; Hunt, Ian

    2005-01-01

    This report describes a method to culture insect cells in 24 deep-well blocks for the routine small-scale optimisation of baculovirus-mediated protein expression experiments. Miniaturisation of this process provides the necessary reduction in resource allocation, reagents, and labour to allow extensive and rapid optimisation of expression conditions, with a concomitant reduction in lead time before commencement of large-scale bioreactor experiments. This greatly simplifies the optimisation process and allows the use of liquid handling robotics in much of the initial optimisation stages, thereby greatly increasing the throughput of the laboratory. We present several examples of the use of deep-well block expression studies in the optimisation of therapeutically relevant protein targets. We also discuss how the enhanced throughput offered by this approach can be adapted to robotic handling systems and the implications this has for the capacity to conduct multi-parallel protein expression studies.

  6. Mutual information-based LPI optimisation for radar network

    NASA Astrophysics Data System (ADS)

    Shi, Chenguang; Zhou, Jianjiang; Wang, Fei; Chen, Jun

    2015-07-01

    A radar network can offer significant performance improvements for target detection and information extraction by exploiting spatial diversity. For a fixed number of radars, the achievable mutual information (MI) for estimating the target parameters may extend beyond a predefined threshold under full power transmission. In this paper, an effective low probability of intercept (LPI) optimisation algorithm is presented to improve the LPI performance of a radar network. Based on the radar network system model, we first adopt the Schleher intercept factor of the radar network as the optimisation metric for LPI performance. Then, a novel LPI optimisation algorithm is presented in which, for a predefined MI threshold, the Schleher intercept factor of the radar network is minimised by optimising the transmission power allocation among the radars in the network, so that enhanced LPI performance can be achieved. A genetic algorithm based on nonlinear programming (GA-NP) is employed to solve the resulting nonconvex and nonlinear optimisation problem. Simulations demonstrate that the proposed algorithm is effective in improving the LPI performance of the radar network.
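
    The optimisation described above, minimising radiated power (and hence the Schleher intercept factor) subject to a mutual-information threshold, can be illustrated with a much-simplified sketch. The paper solves a nonconvex problem with a GA-NP hybrid; the version below swaps in SciPy's SLSQP solver and a toy per-radar MI model, so the channel gains, MI expression and power bounds are all assumptions rather than the paper's formulation.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Simplified stand-in model (not the paper's exact expressions):
    # each radar i contributes MI_i = 0.5*log2(1 + g_i * p_i), and intercept
    # risk is taken as proportional to the total radiated power.
    g = np.array([4.0, 2.5, 6.0, 3.0])       # hypothetical channel/SNR gains
    mi_min = 3.0                             # required total mutual information
    p_max = 1.0                              # per-radar power limit (normalised)

    total_power = lambda p: p.sum()
    mi_total = lambda p: 0.5 * np.log2(1.0 + g * p).sum()

    res = minimize(total_power, x0=np.full(4, 0.5), method="SLSQP",
                   bounds=[(1e-6, p_max)] * 4,
                   constraints=[{"type": "ineq",
                                 "fun": lambda p: mi_total(p) - mi_min}])
    print("power allocation:", res.x, "MI:", mi_total(res.x))
    ```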

  7. Visual grading characteristics and ordinal regression analysis during optimisation of CT head examinations.

    PubMed

    Zarb, Francis; McEntee, Mark F; Rainford, Louise

    2015-06-01

    To evaluate visual grading characteristics (VGC) and ordinal regression analysis during head CT optimisation as a potential alternative to visual grading assessment (VGA), which is traditionally employed to score anatomical visualisation. Patient images (n = 66) were obtained using current and optimised imaging protocols from two CT suites: a 16-slice scanner at the national Maltese centre for trauma and a 64-slice scanner in a private centre. Local resident radiologists (n = 6) performed VGA, followed by VGC and ordinal regression analysis. VGC alone indicated that the optimised protocols had image quality similar to the current protocols. Ordinal logistic regression analysis provided an in-depth, criterion-by-criterion evaluation, allowing the selective implementation of the protocols. The local radiology review panel supported the implementation of optimised protocols for brain CT examinations (including trauma) in one centre, achieving radiation dose reductions ranging from 24% to 36%. In the second centre a 29% reduction in radiation dose was achieved for follow-up cases. The combined use of VGC and ordinal logistic regression analysis led to clinical decisions being taken on the implementation of the optimised protocols. This improved method of image quality analysis provided the evidence to support imaging protocol optimisation, resulting in significant radiation dose savings. • There is a need for scientifically based image quality evaluation during CT optimisation. • VGC and ordinal regression analysis in combination led to better informed clinical decisions. • VGC and ordinal regression analysis led to dose reductions without compromising diagnostic efficacy.

  8. Optimising experimental design for MEG resting state functional connectivity measurement.

    PubMed

    Liuzzi, Lucrezia; Gascoyne, Lauren E; Tewarie, Prejaas K; Barratt, Eleanor L; Boto, Elena; Brookes, Matthew J

    2017-07-15

    The study of functional connectivity using magnetoencephalography (MEG) is an expanding area of neuroimaging, and adds an extra dimension to the more common assessments made using fMRI. The importance of such metrics is growing, with recent demonstrations of their utility in clinical research; however, previous reports suggest that whilst group-level resting state connectivity is robust, single-session recordings lack repeatability. Such robustness is critical if MEG measures in individual subjects are to prove clinically valuable. In the present paper, we test how practical aspects of experimental design affect the intra-subject repeatability of MEG findings; specifically, we assess the effect of co-registration method and data recording duration. We show that the use of a foam head-cast, which is known to improve co-registration accuracy, significantly increased the between-session repeatability of both beamformer reconstruction and connectivity estimation. We also show that recording duration is a critical parameter, with large improvements in repeatability apparent when using ten-minute, compared to five-minute, recordings. Further analyses suggest that the origin of this latter effect is not underpinned by technical aspects of source reconstruction, but rather by a genuine effect of brain state; short recordings are simply inefficient at capturing the canonical MEG network in a single subject. Our results provide important insights on experimental design and will prove valuable for future MEG connectivity studies. Copyright © 2016. Published by Elsevier Inc.

  9. On the design and optimisation of new fractal antenna using PSO

    NASA Astrophysics Data System (ADS)

    Rani, Shweta; Singh, A. P.

    2013-10-01

    An optimisation technique for a newly shaped fractal structure using particle swarm optimisation with curve fitting is presented in this article. The aim of the particle swarm optimisation is to find the antenna geometry for the required user-defined frequency. To assess the effectiveness of the presented method, a set of representative numerical simulations was carried out and the results were compared with measurements from experimental prototypes built according to the design specifications produced by the optimisation procedure. The proposed fractal antenna resonates in the 5.8 GHz industrial, scientific and medical band, which is suitable for wireless telemedicine applications. The antenna characteristics have been studied using extensive numerical simulations and are experimentally verified. The antenna exhibits well-defined radiation patterns over the band.
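
    Particle swarm optimisation of the kind used above searches a box-bounded geometry space for parameters whose simulated resonance matches the 5.8 GHz target. Below is a minimal global-best PSO loop in Python; the `resonance` function is a made-up stand-in for the electromagnetic simulation, and the particle count, inertia and acceleration coefficients are generic defaults rather than the paper's settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def pso_minimise(cost, bounds, n_particles=30, n_iter=100,
                     w=0.7, c1=1.5, c2=1.5):
        """Minimal global-best PSO over box-bounded continuous variables."""
        lo, hi = bounds[:, 0], bounds[:, 1]
        x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
        v = np.zeros_like(x)
        pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
        g = pbest[pbest_f.argmin()].copy()
        for _ in range(n_iter):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            f = np.array([cost(p) for p in x])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], f[improved]
            g = pbest[pbest_f.argmin()].copy()
        return g, pbest_f.min()

    # Toy stand-in: drive a (hypothetical) resonant-frequency model towards 5.8 GHz.
    resonance = lambda geom: 3.0 + 4.0 * geom[0] / (1.0 + geom[1])   # GHz, made up
    best, err = pso_minimise(lambda p: (resonance(p) - 5.8) ** 2,
                             bounds=np.array([[0.1, 2.0], [0.1, 2.0]]))
    print(best, err)
    ```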

  10. Stochastic optimisation of water allocation on a global scale

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; Straatsma, Menno; Karssenberg, Derek; Bierkens, Marc F. P.

    2014-05-01

    Climate change, an increasing population and further economic development are expected to increase water scarcity in many regions of the world. Optimal water management strategies are required to minimise the gap between water supply and domestic, industrial and agricultural water demand. A crucial aspect of water allocation is the spatial scale of optimisation. Blue water supply peaks in the upstream parts of large catchments, whereas demands are often largest in the industrialised downstream parts. Two extremes exist in water allocation: (i) 'first come, first served', which allows upstream water demands to be fulfilled without consideration of downstream demands, and (ii) 'all for one, one for all', which optimises water allocation over the whole catchment. In practice, water treaties govern intermediate solutions. The objective of this study is to determine the effect of these two end members on water allocation optimisation with respect to water scarcity. We conduct this study on a global scale with the year 2100 as the temporal horizon. Water supply is calculated using the hydrological model PCR-GLOBWB, operating at a 5 arcminute resolution and a daily time step. PCR-GLOBWB is forced with temperature and precipitation fields from the HadGEM2-ES global circulation model that participated in the latest Coupled Model Intercomparison Project (CMIP5). Water demands are calculated for representative concentration pathway 6.0 (RCP 6.0) and shared socio-economic pathway scenario 2 (SSP2). To enable fast computation of the optimisation, we developed a hydrologically correct network of 1800 basin segments with an average size of 100,000 square kilometres; the maximum number of nodes in a network was 140, for the Amazon Basin. Water demands and supplies are aggregated to cubic kilometres per month per segment. A new open-source implementation was developed for the stochastic optimisation of the water allocation. We apply a genetic algorithm for each segment to estimate the set of parameters that distributes the water supply for each node. We use the Python programming language and a flexible software architecture that allows us to straightforwardly (1) exchange the process description for the nodes, so that different water allocation schemes can be tested, (2) exchange the objective function, (3) apply the optimisation either to the whole catchment or to different sub-levels, and (4) use multi-core CPUs concurrently, thereby reducing computation time. We demonstrate the application of the scientific workflow to the model outputs of PCR-GLOBWB and present first results on how water scarcity depends on the choice between the two extremes in water allocation.

  11. Controlled hydrostatic pressure stress downregulates the expression of ribosomal genes in preimplantation embryos: a possible protection mechanism?

    PubMed

    Bock, I; Raveh-Amit, H; Losonczi, E; Carstea, A C; Feher, A; Mashayekhi, K; Matyas, S; Dinnyes, A; Pribenszky, C

    2016-04-01

    The efficiency of various assisted reproductive techniques can be improved by preconditioning the gametes and embryos with sublethal hydrostatic pressure treatment. However, the underlying molecular mechanism responsible for this protective effect remains unknown and requires further investigation. Here, we studied the effect of optimised hydrostatic pressure treatment on the global gene expression of mouse oocytes after embryonic genome activation. Based on a gene expression microarray analysis, a significant effect of treatment was observed in 4-cell embryos derived from treated oocytes, revealing a transcriptional footprint of hydrostatic pressure-affected genes. Functional analysis identified numerous genes involved in protein synthesis that were downregulated in 4-cell embryos in response to hydrostatic pressure treatment, suggesting that regulation of translation has a major role in optimised hydrostatic pressure-induced stress tolerance. We present a comprehensive microarray analysis and further delineate a potential mechanism responsible for the protective effect of hydrostatic pressure treatment.

  12. Heterologous expression of Aspergillus terreus fructosyltransferase in Kluyveromyces lactis.

    PubMed

    Spohner, Sebastian C; Czermak, Peter

    2016-06-25

    Fructo-oligosaccharides are prebiotic and hypocaloric sweeteners that are usually extracted from chicory. They can also be produced from sucrose using fructosyltransferases, but the only commercial enzyme suitable for this purpose is Pectinex Ultra, which is produced with Aspergillus aculeatus. Here we used the yeast Kluyveromyces lactis to express a secreted recombinant fructosyltransferase from the inulin-producing fungus Aspergillus terreus. A synthetic codon-optimised version of the putative β-fructofuranosidase ATEG 04996 (XP 001214174.1) from A. terreus NIH2624 was secreted as a functional protein into the extracellular medium. At 60°C, the purified A. terreus enzyme generated the same pattern of oligosaccharides as Pectinex Ultra, but at lower temperatures it also produced oligomers with up to seven units. We achieved activities of up to 986.4U/mL in high-level expression experiments, which is better than previous reports of optimised Aspergillus spp. fermentations. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Hypertrophic scarring: the greatest unmet challenge following burn injury

    PubMed Central

    Finnerty, Celeste C; Jeschke, Marc G; Branski, Ludwik K; Barret, Juan P.; Dziewulski, Peter; Herndon, David N

    2017-01-01

    Summary Improvements in acute burn care have enabled patients to survive massive burns which would have once been fatal. Now up to 70% of patients develop hypertrophic scars following burns. The functional and psychosocial sequelae remain a major rehabilitative challenge, decreasing quality of life and delaying reintegration into society. The current approach is to optimise the healing potential of the burn wound using targeted wound care and surgery in order to minimise the development of hypertrophic scarring. This approach often fails, and modulation of established scar is continued although the optimal indication, timing, and combination of therapies have yet to be established. The need for novel treatments is paramount, and future efforts to improve outcomes and quality of life should include optimisation of wound healing to attenuate or prevent hypertrophic scarring, well-designed trials to confirm treatment efficacy, and further elucidation of molecular mechanisms to allow development of new preventative and therapeutic strategies. PMID:27707499

  14. Suppressing unsteady flow in arterio-venous fistulae

    NASA Astrophysics Data System (ADS)

    Grechy, L.; Iori, F.; Corbett, R. W.; Shurey, S.; Gedroyc, W.; Duncan, N.; Caro, C. G.; Vincent, P. E.

    2017-10-01

    Arterio-Venous Fistulae (AVF) are regarded as the "gold standard" method of vascular access for patients with end-stage renal disease who require haemodialysis. However, a large proportion of AVF do not mature, and hence fail, as a result of various pathologies such as Intimal Hyperplasia (IH). Unphysiological flow patterns, including high-frequency flow unsteadiness, associated with the unnatural and often complex geometries of AVF are believed to be implicated in the development of IH. In the present study, we employ a Mesh Adaptive Direct Search optimisation framework, computational fluid dynamics simulations, and a new cost function to design a novel non-planar AVF configuration that can suppress high-frequency unsteady flow. A prototype device for holding an AVF in the optimal configuration is then fabricated, and proof-of-concept is demonstrated in a porcine model. Results constitute the first use of numerical optimisation to design a device for suppressing potentially pathological high-frequency flow unsteadiness in AVF.

  15. Magnetic resonance imaging-guided surgical design: can we optimise the Fontan operation?

    PubMed

    Haggerty, Christopher M; Yoganathan, Ajit P; Fogel, Mark A

    2013-12-01

    The Fontan procedure, although an imperfect solution for children born with a single functional ventricle, is the only reconstruction at present short of transplantation. The haemodynamics associated with the total cavopulmonary connection, the modern approach to Fontan, are severely altered from the normal biventricular circulation and may contribute to the long-term complications that are frequently noted. Through recent technological advances, spear-headed by advances in medical imaging, it is now possible to virtually model these surgical procedures and evaluate the patient-specific haemodynamics as part of the pre-operative planning process. This is a novel paradigm with the potential to revolutionise the approach to Fontan surgery, help to optimise the haemodynamic results, and improve patient outcomes. This review provides a brief overview of these methods, presents preliminary results of their clinical usage, and offers insights into its potential future directions.

  16. Shape and energy consistent pseudopotentials for correlated electron systems

    PubMed Central

    Needs, R. J.

    2017-01-01

    A method is developed for generating pseudopotentials for use in correlated-electron calculations. The paradigms of shape and energy consistency are combined and defined in terms of correlated-electron wave functions. The resulting energy-consistent correlated-electron pseudopotentials (eCEPPs) are constructed for H, Li–F, Sc–Fe, and Cu. Their accuracy is quantified by comparing the relaxed molecular geometries and dissociation energies which they provide with all-electron results, with all quantities evaluated using coupled cluster singles, doubles, and triples calculations. Errors inherent in the pseudopotentials are also compared with those arising from a number of approximations commonly used with pseudopotentials. The eCEPPs provide a significant improvement in optimised geometries and dissociation energies for small molecules, with errors for the latter being an order of magnitude smaller than for Hartree-Fock-based pseudopotentials available in the literature. Gaussian basis sets are optimised for use with these pseudopotentials. PMID:28571391

  17. 3D Reconstruction of human bones based on dictionary learning.

    PubMed

    Zhang, Binkai; Wang, Xiang; Liang, Xiao; Zheng, Jinjin

    2017-11-01

    An effective method is proposed for reconstructing a 3D model of human bones from computed tomography (CT) image data based on dictionary learning. In this study, the dictionary comprises the vertices of triangular meshes, and the sparse coefficient matrix indicates the connectivity information. For better reconstruction performance, we propose a balance coefficient between the approximation and regularisation terms, together with a method for its optimisation. Moreover, we apply a local updating strategy and a mesh-optimisation method to update the dictionary and the sparse matrix, respectively. The two updating steps are iterated alternately until the objective function converges, so that a reconstructed mesh is obtained with high accuracy and regularity. The experimental results show that the proposed method has the potential to obtain high-precision, high-quality triangular meshes for rapid prototyping, medical diagnosis, and tissue engineering. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  18. Hardware Design of the Energy Efficient Fall Detection Device

    NASA Astrophysics Data System (ADS)

    Skorodumovs, A.; Avots, E.; Hofmanis, J.; Korāts, G.

    2016-04-01

    Health issues in elderly people may lead to injuries sustained during simple activities of daily living. Potentially the most dangerous are unintentional falls, which may be critical or even lethal for some patients due to the high risk of serious injury. In the project "Wireless Sensor Systems in Telecare Application for Elderly People", we have developed a robust fall detection algorithm for a wearable wireless sensor. To optimise the algorithm for hardware performance and test it in the field, we have designed an accelerometer-based wireless fall detector. Our main considerations were: a) functionality, so that the algorithm can be applied to the chosen hardware, and b) power efficiency, so that it can run for a very long time. We selected and tested the components, built a prototype, optimised the firmware for the lowest consumption, tested the performance and measured the consumption parameters. In this paper, we discuss our design choices and present the results of our work.

  19. Effectiveness of an implementation optimisation intervention aimed at increasing parent engagement in HENRY, a childhood obesity prevention programme - the Optimising Family Engagement in HENRY (OFTEN) trial: study protocol for a randomised controlled trial.

    PubMed

    Bryant, Maria; Burton, Wendy; Cundill, Bonnie; Farrin, Amanda J; Nixon, Jane; Stevens, June; Roberts, Kim; Foy, Robbie; Rutter, Harry; Hartley, Suzanne; Tubeuf, Sandy; Collinson, Michelle; Brown, Julia

    2017-01-24

    Family-based interventions to prevent childhood obesity depend upon parents' taking action to improve diet and other lifestyle behaviours in their families. Programmes that attract and retain high numbers of parents provide an enhanced opportunity to improve public health and are also likely to be more cost-effective than those that do not. We have developed a theory-informed optimisation intervention to promote parent engagement within an existing childhood obesity prevention group programme, HENRY (Health Exercise Nutrition for the Really Young). Here, we describe a proposal to evaluate the effectiveness of this optimisation intervention in regard to the engagement of parents and cost-effectiveness. The Optimising Family Engagement in HENRY (OFTEN) trial is a cluster randomised controlled trial being conducted across 24 local authorities (approximately 144 children's centres) which currently deliver HENRY programmes. The primary outcome will be parental enrolment and attendance at the HENRY programme, assessed using routinely collected process data. Cost-effectiveness will be presented in terms of primary outcomes using acceptability curves and through eliciting the willingness to pay for the optimisation from HENRY commissioners. Secondary outcomes include the longitudinal impact of the optimisation, parent-reported infant intake of fruits and vegetables (as a proxy to compliance) and other parent-reported family habits and lifestyle. This innovative trial will provide evidence on the implementation of a theory-informed optimisation intervention to promote parent engagement in HENRY, a community-based childhood obesity prevention programme. The findings will be generalisable to other interventions delivered to parents in other community-based environments. This research meets the expressed needs of commissioners, children's centres and parents to optimise the potential impact that HENRY has on obesity prevention. A subsequent cluster randomised controlled pilot trial is planned to determine the practicality of undertaking a definitive trial to robustly evaluate the effectiveness and cost-effectiveness of the optimised intervention on childhood obesity prevention. ClinicalTrials.gov identifier: NCT02675699 . Registered on 4 February 2016.

  20. Rapid detection of Ganoderma-infected oil palms by microwave ergosterol extraction with HPLC and TLC.

    PubMed

    Muniroh, M S; Sariah, M; Zainal Abidin, M A; Lima, N; Paterson, R R M

    2014-05-01

    Detection of basal stem rot (BSR) of oil palms caused by Ganoderma has been based on foliar symptoms and the production of basidiomata. Enzyme-linked immunosorbent assay with polyclonal antibodies (ELISA-PAB) and PCR have been proposed as early detection methods for the disease, but these techniques are complex, time consuming and have accuracy limitations. An ergosterol method was developed that correlated well with the degree of infection in oil palms, including samples growing in plantations; however, the method was capable of further optimisation. The current study was designed to develop a simpler, more rapid and efficient ergosterol method with utility in the field, involving the use of microwave extraction. The optimised procedure involved extracting a small amount of Ganoderma, or Ganoderma-infected oil palm, suspended in a low volume of solvent, followed by irradiation in a conventional microwave oven at 70°C and medium-high power for 30 s, resulting in simultaneous extraction and saponification. Ergosterol was detected by thin layer chromatography (TLC) and quantified using high performance liquid chromatography with diode array detection. The TLC method was novel and provided a simple, inexpensive method with utility in the field. The new method was particularly effective at extracting high yields of ergosterol from infected oil palm and enables rapid analysis of field samples on site, allowing infected oil palms to be treated or culled very rapidly. Some limitations of the method are discussed herein. The procedures lend themselves to controlling the disease more effectively and allowing more effective use of land currently employed to grow oil palms, thereby reducing pressure to develop new plantations. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Radiation dose optimisation for conventional imaging in infants and newborns using automatic dose management software: an application of the new 2013/59 EURATOM directive.

    PubMed

    Alejo, L; Corredoira, E; Sánchez-Muñoz, F; Huerga, C; Aza, Z; Plaza-Núñez, R; Serrada, A; Bret-Zurita, M; Parrón, M; Prieto-Areyano, C; Garzón-Moll, G; Madero, R; Guibelalde, E

    2018-04-09

    Objective: The new 2013/59 EURATOM Directive (ED) demands dosimetric optimisation procedures without undue delay. The aim of this study was to optimise paediatric conventional radiology examinations in application of the ED without compromising clinical diagnosis. Automatic dose management software (ADMS) was used to analyse 2678 studies of children from birth to 5 years of age, obtaining local diagnostic reference levels (DRLs) in terms of entrance surface air kerma. Because the local DRL for infants undergoing chest examinations exceeded the European Commission (EC) DRL, an optimisation was performed by decreasing the kVp and applying automatic exposure control. To assess image quality, an analysis of high-contrast spatial resolution (HCSR), signal-to-noise ratio (SNR) and figure of merit (FOM) was performed, as well as a blind test based on the generalised estimating equations method. For newborns undergoing chest examinations, the local DRL exceeded the EC DRL by 113%. After the optimisation, a reduction of 54% was obtained. No significant differences were found in the image quality blind test. A decrease in SNR (-37%) and HCSR (-68%), and an increase in FOM (42%), were observed. ADMS allows the fast calculation of local DRLs and the performance of optimisation procedures in babies without delay. However, physical and clinical analyses of image quality are still needed to ensure diagnostic integrity after the optimisation process. Advances in knowledge: ADMS is useful for detecting radiation protection problems and performing optimisation procedures in paediatric conventional imaging without undue delay, as the ED requires.
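
    Local DRLs of the kind derived above are, by convention, usually set at the third quartile (75th percentile) of the dose-indicator distribution for a given examination and age group; the abstract does not state which statistic the ADMS applies, so the sketch below simply illustrates the percentile calculation on made-up entrance surface air kerma values.

    ```python
    import numpy as np

    # Hypothetical entrance surface air kerma values (uGy) for one exam type
    # and age group, collected by the dose management software.
    esak = np.random.default_rng(0).lognormal(mean=4.0, sigma=0.4, size=300)

    local_drl = np.percentile(esak, 75)   # third quartile, the usual DRL convention
    print(f"local DRL = {local_drl:.0f} uGy")
    ```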

  2. Integration of Monte-Carlo ray tracing with a stochastic optimisation method: application to the design of solar receiver geometry.

    PubMed

    Asselineau, Charles-Alexis; Zapata, Jose; Pye, John

    2015-06-01

    A stochastic optimisation method adapted to illumination and radiative heat transfer problems involving Monte-Carlo ray-tracing is presented. A solar receiver shape optimisation case study illustrates the advantages of the method and its potential: efficient receivers are identified using a moderate computational cost.

  3. Impact of field number and beam angle on functional image-guided lung cancer radiotherapy planning

    NASA Astrophysics Data System (ADS)

    Tahir, Bilal A.; Bragg, Chris M.; Wild, Jim M.; Swinscoe, James A.; Lawless, Sarah E.; Hart, Kerry A.; Hatton, Matthew Q.; Ireland, Rob H.

    2017-09-01

    To investigate the effect of beam angles and field number on functionally guided intensity modulated radiotherapy (IMRT) normal-lung-avoidance treatment plans that incorporate hyperpolarised helium-3 magnetic resonance imaging (3He MRI) ventilation data. Eight non-small cell lung cancer patients had pre-treatment 3He MRI that was registered to inspiration breath-hold radiotherapy planning computed tomography. IMRT plans that minimised the volume of total lung receiving ⩾20 Gy (V20) were compared with plans that minimised the 3He MRI-defined functional lung receiving ⩾20 Gy (fV20). Coplanar IMRT plans using 5-field manually optimised beam angles and 9-field equidistant plans were also evaluated. For each pair of plans, the Wilcoxon signed ranks test was used to compare fV20 and the percentage of the planning target volume (PTV) receiving 90% of the prescription dose (PTV90). Incorporation of 3He MRI led to median reductions in fV20 of 1.3% (range: 0.2-9.3%; p = 0.04) and 0.2% (range: 0 to 4.1%; p = 0.012) for the 5- and 9-field arrangements, respectively. There was no clinically significant difference in target coverage. Functionally guided IMRT plans incorporating hyperpolarised 3He MRI information can reduce the dose received by ventilated lung without compromising PTV coverage. The effect was greater for optimised beam angles than for uniformly spaced fields.
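
    The per-patient comparison described above is a paired, non-parametric test. A minimal sketch with SciPy is shown below; the fV20 numbers are invented purely to show the call, not taken from the study.

    ```python
    import numpy as np
    from scipy.stats import wilcoxon

    # Hypothetical paired per-patient values (%) of functional lung receiving
    # >= 20 Gy for anatomical vs functionally guided 5-field plans (n = 8).
    fv20_anatomical = np.array([12.1, 18.4, 9.7, 22.3, 15.0, 11.2, 19.8, 14.6])
    fv20_functional = np.array([11.0, 17.9, 9.5, 13.0, 13.8, 10.9, 18.2, 13.9])

    stat, p = wilcoxon(fv20_anatomical, fv20_functional)
    print("median reduction:", np.median(fv20_anatomical - fv20_functional), "p =", p)
    ```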

  4. 'Intelligent' system's cost-cutting power.

    PubMed

    Dodge, Jeremy

    2010-05-01

    Jeremy Dodge, business manager at Marshall Tufflex Energy Management, explains how a voltage optimisation system can help major electricity users cut their bills "by as much as 25%". In a claimed industry first, the system uses "auto-transformers" to reduce the incoming mains voltage so that electrical equipment receives precisely the "outgoing feed" it needs to function optimally and no more, thus significantly reducing wastage.

  5. Topology optimisation for natural convection problems

    NASA Astrophysics Data System (ADS)

    Alexandersen, Joe; Aage, Niels; Andreasen, Casper Schousboe; Sigmund, Ole

    2014-12-01

    This paper demonstrates the application of the density-based topology optimisation approach for the design of heat sinks and micropumps based on natural convection effects. The problems are modelled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equations coupled to the convection-diffusion equation through the Boussinesq approximation. In order to facilitate topology optimisation, the Brinkman approach is taken to penalise velocities inside the solid domain and the effective thermal conductivity is interpolated in order to accommodate differences in thermal conductivity of the solid and fluid phases. The governing equations are discretised using stabilised finite elements and topology optimisation is performed for two different problems using discrete adjoint sensitivity analysis. The study shows that topology optimisation is a viable approach for designing heat sink geometries cooled by natural convection and micropumps powered by natural convection.

  6. Elitist Binary Wolf Search Algorithm for Heuristic Feature Selection in High-Dimensional Bioinformatics Datasets.

    PubMed

    Li, Jinyan; Fong, Simon; Wong, Raymond K; Millham, Richard; Wong, Kelvin K L

    2017-06-28

    To address the high-dimensional characteristics of bioinformatics datasets, we propose a new method based on the Wolf Search Algorithm (WSA) for optimising the feature selection problem. The proposed approach uses the natural strategy attributed to Charles Darwin, namely that 'it is not the strongest of the species that survives, but the most adaptable'. This means that, in the evolution of a swarm, the elitists are motivated to quickly obtain more and better resources. A memory function helps the proposed method to avoid repeated searches of the worst positions in order to enhance search effectiveness, while a binary strategy reduces the feature selection problem to an analogous function optimisation problem. Furthermore, a wrapper strategy couples these strengthened wolves with an extreme learning machine classifier to find a sub-dataset with a reasonable number of features that offers the maximum correctness of global classification models. The experimental results on six public high-dimensional bioinformatics datasets demonstrate that the proposed method can outperform some conventional feature selection methods by up to 29% in classification accuracy, and outperform previous WSAs by up to 99.81% in computational time.

  7. Optimisation of oat milk formulation to obtain fermented derivatives by using probiotic Lactobacillus reuteri microorganisms.

    PubMed

    Bernat, N; Cháfer, M; González-Martínez, C; Rodríguez-García, J; Chiralt, A

    2015-03-01

    The functional advantages of probiotics, combined with the interesting composition of oats, were considered as an alternative to dairy products. In this study, fermentation of oat milk with Lactobacillus reuteri and Streptococcus thermophilus was analysed to develop a new probiotic product. A central composite design with response surface methodology was used to analyse the effect of different factors (glucose, fructose, inulin and starters) on the probiotic population in the product. The optimised formulation was characterised throughout storage time at 4 °C in terms of pH, acidity, β-glucan and oligosaccharide contents, colour and rheological behaviour. All formulations studied were adequate to produce fermented foods, and the minimum dose of each factor was considered optimum. The selected formulation allowed starter survival above 10⁷ cfu/ml, the level required to be considered a functional food, which was maintained over the 28 days monitored. β-glucans remained in the final product, with a positive effect on viscosity. Therefore, a new probiotic non-dairy milk was successfully developed in which high probiotic survival was assured throughout the typical yoghurt-like shelf life. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  8. Anti-predator adaptations in a great scallop (Pecten maximus) - a palaeontological perspective

    NASA Astrophysics Data System (ADS)

    Brom, Krzysztof Roman; Szopa, Krzysztof; Krzykawski, Tomasz; Brachaniec, Tomasz; Salamon, Mariusz Andrzej

    2015-12-01

    Shelly fauna was exposed to increased pressure from shell-crushing durophagous predators during the so-called Mesozoic Marine Revolution, which was initiated in the Triassic. As a result of this evolutionary 'arms race', prey animals such as bivalves developed many adaptations to reduce predation pressure (e.g. they changed lifestyle and shell morphology in order to increase their mechanical strength). For instance, it has been suggested that the Pectinidae acquired the ability to actively swim to avoid predator attack during the early Mesozoic. However, pectinids are also known to have a specific shell microstructure that may effectively protect them against predators. We highlight that the shells of some recent pectinid species (e.g. Pecten maximus) display cross-lamellar structures in the middle shell layer that play a significant role in energy dissipation and improve mechanical strength. In contrast, the outer layers of these bivalves are highly porous, which allows them to swim more efficiently by reducing shell weight. Pectinids are thus perfect examples of animals optimising their skeletons for several functions. We suggest that such an optimisation of their skeletons for multiple functions likely occurred as a result of increased predation pressure during the so-called Mesozoic Marine Revolution.

  9. Optimisation of chromatographic resolution using objective functions including both time and spectral information.

    PubMed

    Torres-Lapasió, J R; Pous-Torres, S; Ortiz-Bolsico, C; García-Alvarez-Coque, M C

    2015-01-16

    The optimisation of resolution in high-performance liquid chromatography is traditionally performed using only time-domain information. However, even under the optimal conditions, some peak pairs may remain unresolved. Such incompletely resolved peaks can still be recovered by deconvolution, which can be carried out with a greater guarantee of success by including spectral information. In this work, two-way chromatographic objective functions (COFs) that incorporate both time and spectral information were tested, based on the concepts of peak purity (the analyte peak fraction free of overlap) and multivariate selectivity (a figure of merit derived from the net analyte signal). These COFs are sensitive to situations where the components that coelute in a mixture show some spectral differences, and are therefore useful for finding experimental conditions under which the spectrochromatograms can be recovered by deconvolution. Two-way multivariate selectivity yielded the best performance and was applied to the separation, using diode-array detection, of a mixture of 25 phenolic compounds, which remained unresolved in the time domain using linear and multi-linear gradients of acetonitrile-water. Peak deconvolution was carried out using the combination of the orthogonal projection approach and alternating least squares. Copyright © 2014 Elsevier B.V. All rights reserved.
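
    The final step mentioned above, recovering coeluted elution profiles and spectra by alternating least squares, can be sketched very compactly. The example below is a bare-bones non-negativity-constrained ALS on a synthetic two-component data matrix; the real workflow initialises the profiles with the orthogonal projection approach, which is not reproduced here, so the crude initial guess is an assumption.

    ```python
    import numpy as np

    def als_deconvolve(D, C0, n_iter=200):
        """Alternating least squares for D ~= C @ S.T with non-negativity,
        starting from an initial estimate of the elution profiles C0."""
        C = C0.copy()
        for _ in range(n_iter):
            S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0.0, None)    # spectra
            C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0.0, None)  # profiles
        return C, S

    # Synthetic two-component coelution: Gaussian elution profiles x distinct spectra.
    t = np.linspace(0, 1, 120)[:, None]
    C_true = np.hstack([np.exp(-((t - 0.45) / 0.07) ** 2),
                        np.exp(-((t - 0.55) / 0.07) ** 2)])
    S_true = np.abs(np.random.default_rng(0).normal(size=(60, 2)))            # spectra
    D = C_true @ S_true.T + 0.01 * np.random.default_rng(1).normal(size=(120, 60))

    C_hat, S_hat = als_deconvolve(D, C0=C_true + 0.05)   # crude initial guess
    print("reconstruction error:",
          np.linalg.norm(D - C_hat @ S_hat.T) / np.linalg.norm(D))
    ```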

  10. Developing a musculoskeletal model of the primate skull: predicting muscle activations, bite force, and joint reaction forces using multibody dynamics analysis and advanced optimisation methods.

    PubMed

    Shi, Junfen; Curtis, Neil; Fitton, Laura C; O'Higgins, Paul; Fagan, Michael J

    2012-10-07

    An accurate, dynamic, functional model of the skull that can be used to predict muscle forces, bite forces, and joint reaction forces would have many uses across a broad range of disciplines. One major issue with musculoskeletal analyses, however, is muscle activation pattern indeterminacy: a very large number of possible muscle force combinations will satisfy a particular functional task, which makes predicting physiological muscle recruitment patterns difficult. Here we describe in detail the development of a complex multibody computer model of a primate skull (Macaca fascicularis) that aims to predict muscle recruitment patterns during biting. Using optimisation criteria based on minimisation of muscle stress, we predict working-to-balancing side muscle force ratios, peak bite forces, and joint reaction forces during unilateral biting. Validation of such models is problematic; however, we have shown working-to-balancing muscle activity and TMJ reaction ratios during biting comparable to those observed in vivo, and peak predicted bite forces that compare well to published experimental data. To our knowledge, the complexity of the musculoskeletal model is greater than any previously reported for a primate. This complexity, when compared to simpler representations, provides more nuanced insights into the functioning of the masticatory muscles. Thus, we have shown muscle activity to vary throughout individual muscle groups, which enables them to function optimally during specific masticatory tasks. This model will be utilised in future studies of the functioning of the masticatory apparatus. Copyright © 2012 Elsevier Ltd. All rights reserved.
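
    A common way to resolve the activation indeterminacy described above is static optimisation: minimise a muscle-stress cost subject to the constraint that the muscle forces reproduce the required joint moment. The sketch below shows this pattern for a made-up four-muscle, single-joint case; the moment arms, PCSAs, maximum-stress value and squared-stress cost are illustrative assumptions, not values from the macaque model.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical 4-muscle planar example: moment arms r (m), physiological
    # cross-sectional areas pcsa (cm^2), and a required joint moment (N m).
    r = np.array([0.035, 0.028, 0.040, 0.022])
    pcsa = np.array([6.0, 4.5, 8.0, 3.0])
    required_moment = 20.0
    f_max = 35.0 * pcsa            # assumed max force ~ 35 N per cm^2 of PCSA

    objective = lambda f: np.sum((f / pcsa) ** 2)   # summed squared muscle stress
    cons = [{"type": "eq", "fun": lambda f: r @ f - required_moment}]
    res = minimize(objective, x0=np.full(4, 100.0), method="SLSQP",
                   bounds=[(0.0, fm) for fm in f_max], constraints=cons)
    print("muscle forces (N):", res.x.round(1))
    ```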

  11. A supportive architecture for CFD-based design optimisation

    NASA Astrophysics Data System (ADS)

    Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong

    2014-03-01

    Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has led to a rise in their application in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, comparable to analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. It is therefore desirable to have an integrated architecture for CFD-based design optimisation. However, our review of existing work found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or at the data level to fully utilise the capabilities of different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided; the results show that the proposed architecture and the developed algorithms perform successfully and efficiently on a design optimisation problem with over 200 design variables.

  12. Optimisation of SOA-REAMs for hybrid DWDM-TDMA PON applications.

    PubMed

    Naughton, Alan; Antony, Cleitus; Ossieur, Peter; Porto, Stefano; Talli, Giuseppe; Townsend, Paul D

    2011-12-12

    We demonstrate how loss-optimised, gain-saturated SOA-REAM based reflective modulators can reduce the burst to burst power variations due to differential access loss in the upstream path in carrier distributed passive optical networks by 18 dB compared to fixed linear gain modulators. We also show that the loss optimised device has a high tolerance to input power variations and can operate in deep saturation with minimal patterning penalties. Finally, we demonstrate that an optimised device can operate across the C-Band and also over a transmission distance of 80 km. © 2011 Optical Society of America

  13. Optimising ICT Effectiveness in Instruction and Learning: Multilevel Transformation Theory and a Pilot Project in Secondary Education

    ERIC Educational Resources Information Center

    Mooij, Ton

    2004-01-01

    Specific combinations of educational and ICT conditions, including computer use, may optimise learning processes, particularly for learners at risk. This position paper asks which curricular, instructional, and ICT characteristics can be expected to optimise learning processes and outcomes, and how best to achieve this optimisation. A theoretical…

  14. Study design and rationale for Optimal aNtiplatelet pharmacotherapy guided by bedSIDE genetic or functional TESTing in elective percutaneous coronary intervention patients (ONSIDE TEST): a prospective, open-label, randomised parallel-group multicentre trial (NCT01930773).

    PubMed

    Kołtowski, Łukasz; Aradi, Daniel; Huczek, Zenon; Tomaniak, Mariusz; Sibbing, Dirk; Filipiak, Krzysztof J; Kochman, Janusz; Balsam, Paweł; Opolski, Grzegorz

    2016-01-01

    High platelet reactivity (HPR) and presence of CYP2C19 loss-of-function alleles are associated with higher risk for periprocedural myocardial infarction in clopidogrel-treated patients undergoing percutaneous coronary intervention (PCI). It is unknown whether personalised treatment based on platelet function testing or genotyping can prevent such complications. The ONSIDE-TEST is a multicentre, prospective, open-label, randomised controlled clinical trial aiming to assess if optimisation of antiplatelet therapy based on either phenotyping or genotyping is superior to conventional care. Patients will be randomised into phenotyping, genotyping, or control arms. In the phenotyping group, patients will be tested with the VerifyNow P2Y12 assay before PCI, and patients with a platelet reactivity unit greater than 208 will be switched over to prasugrel, while others will continue on clopidogrel therapy. In the genotyping group, carriers of the *2 loss-of-function allele will receive prasugrel for PCI, while wild-type subjects will be treated with clopidogrel. Patients in the control arm will be treated with standard-dose clopidogrel. The primary endpoint of the study is the prevalence of periprocedural myocardial injury within 24 h after PCI in the controls as compared to the phenotyping and genotyping group. Secondary endpoints include cardiac death, myocardial infarction, definite or probable stent thrombosis, or urgent repeat revascularisation within 30 days of PCI. Primary safety outcome is Bleeding Academic Research Consortium (BARC) type 3 and 5 bleeding during 30 days of PCI. The ONSIDE TEST trial is expected to verify the clinical utility of an individualised antiplatelet strategy in preventing periprocedural myocardial injury by either phenotyping or genotyping. ClinicalTrials.gov: NCT01930773.

  15. Modelling of human walking to optimise the function of ankle-foot orthosis in Guillan-Barré patients with drop foot.

    PubMed

    Jamshidi, N; Rostami, M; Najarian, S; Menhaj, M B; Saadatnia, M; Firooz, S

    2009-04-01

    This paper deals with the dynamic modelling of human walking. The main focus of this research was to optimise the function of the orthosis in patients with neuropathic feet, based on kinematic data from different categories of neuropathic patients. The patient's body was modelled in the sagittal plane to calculate the torques generated at the joints. The kinematic data required for mathematical modelling of the patients were obtained from films of the patients captured by a high-speed camera; the films were then analysed using motion analysis software. An inverse dynamic model was used to estimate the spring coefficient. In our dynamic model, the role of the muscles was substituted by a spring-damper added between the shank and ankle, which could compensate for muscle weakness through the design of ankle-foot orthoses based on the kinematic data obtained from the patients. The torque generated at the ankle was varied by changing the spring constant. It was therefore possible to decrease the torque that the muscles must generate, which could lead to the design of more comfortable and efficient orthoses. In this research, unlike previous studies that examined abnormal gait or modelled the ankle-foot orthosis separately, the function of the ankle-foot orthosis during abnormal gait has been quantitatively improved through a correction of the torque.
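
    The spring-damper substitution described above can be sketched by sweeping candidate spring constants and checking how much residual torque the weakened muscles would still have to supply along a measured ankle trajectory. The angle and required-torque curves below are entirely made up, and the linear spring-damper law is the simplest possible stand-in for the paper's model.

    ```python
    import numpy as np

    # Hypothetical sketch: torque contributed by a passive spring-damper across
    # the ankle, tau = -k*(theta - theta0) - c*theta_dot, evaluated along a
    # sampled ankle-angle trajectory (rad) from motion analysis.
    t = np.linspace(0.0, 1.0, 101)                    # one gait cycle (s)
    theta = 0.15 * np.sin(2 * np.pi * t)              # made-up ankle angle (rad)
    theta_dot = np.gradient(theta, t)

    def orthosis_torque(theta, theta_dot, k, c, theta0=0.0):
        return -k * (theta - theta0) - c * theta_dot

    tau_required = -45.0 * np.sin(2 * np.pi * t)      # made-up required torque (N m)
    for k in (50.0, 150.0, 300.0):                    # candidate spring constants (N m/rad)
        residual = tau_required - orthosis_torque(theta, theta_dot, k, c=2.0)
        print(f"k = {k:5.0f}  peak residual muscle torque = {np.abs(residual).max():.1f} N m")
    ```

    With these made-up curves, the residual muscle torque shrinks as the spring constant approaches the value that matches the demanded ankle stiffness, which is the qualitative effect the paper exploits.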

  16. Reliability of clinical impact grading by healthcare professionals of common prescribing error and optimisation cases in critical care patients.

    PubMed

    Bourne, Richard S; Shulman, Rob; Tomlin, Mark; Borthwick, Mark; Berry, Will; Mills, Gary H

    2017-04-01

    To identify the between- and within-profession rater reliability of clinical impact grading for common critical care prescribing error and optimisation cases, and to identify representative clinical impact grades for each individual case. Electronic questionnaire. Five UK NHS Trusts. Thirty critical care healthcare professionals (doctors, pharmacists and nurses). Participants graded the severity of clinical impact (5-point categorical scale) of 50 error and 55 optimisation cases. Case between- and within-profession rater reliability and modal clinical impact grading. Between- and within-profession rater reliability analyses used a linear mixed model and intraclass correlation, respectively. The majority of error and optimisation cases (both 76%) had a modal clinical severity grade of moderate or higher. Error cases: doctors graded clinical impact significantly lower than pharmacists (-0.25; P < 0.001) and nurses (-0.53; P < 0.001), with nurses grading significantly higher than pharmacists (0.28; P < 0.001). Optimisation cases: doctors graded clinical impact significantly lower than nurses and pharmacists (-0.39 and -0.5; P < 0.001, respectively). Within-profession reliability was excellent for pharmacists (0.88 and 0.89; P < 0.001) and doctors (0.79 and 0.83; P < 0.001), but only fair to good for nurses (0.43 and 0.74; P < 0.001), for optimisation and error cases, respectively. Representative clinical impact grades for over 100 common prescribing error and optimisation cases are reported for potential clinical practice and research application. The between-profession variability highlights the importance of multidisciplinary perspectives in the assessment of medication error and optimisation cases in clinical practice and research. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  17. Laboratory evaluation of an optimised internet-based speech-in-noise test for occupational high-frequency hearing loss screening: Occupational Earcheck.

    PubMed

    Sheikh Rashid, Marya; Leensen, Monique C J; de Laat, Jan A P M; Dreschler, Wouter A

    2017-11-01

    The "Occupational Earcheck" (OEC) is a Dutch online self-screening speech-in-noise test developed for the detection of occupational high-frequency hearing loss (HFHL). This study evaluates an optimised version of the test and determines the most appropriate masking noise. The original OEC was improved by homogenisation of the speech material, and shortening the test. A laboratory-based cross-sectional study was performed in which the optimised OEC in five alternative masking noise conditions was evaluated. The study was conducted on 18 normal-hearing (NH) adults, and 15 middle-aged listeners with HFHL. The OEC in a low-pass (LP) filtered stationary background noise (test version LP 3: with a cut-off frequency of 1.6 kHz, and a noise floor of -12 dB) was the most accurate version tested. The test showed a reasonable sensitivity (93%), and specificity (94%) and test reliability (intra-class correlation coefficient: 0.84, mean within-subject standard deviation: 1.5 dB SNR, slope of psychometric function: 13.1%/dB SNR). The improved OEC, with homogenous word material in a LP filtered noise, appears to be suitable for the discrimination between younger NH listeners and older listeners with HFHL. The appropriateness of the OEC for screening purposes in an occupational setting will be studied further.

  18. Moss and peat hydraulic properties are optimized to maximise peatland water use efficiency

    NASA Astrophysics Data System (ADS)

    Kettridge, Nicholas; Tilak, Amey; Devito, Kevin; Petrone, Rich; Mendoza, Carl; Waddington, Mike

    2016-04-01

    Peatland ecosystems are globally important carbon and terrestrial surface water stores that have formed over millennia. These ecosystems have likely optimised their ecohydrological function over the long-term development of their soil hydraulic properties. Through a theoretical ecosystem approach, applying hydrological modelling integrated with known ecological thresholds and concepts, the optimisation of peat hydraulic properties is examined to determine which of the following conditions peatland ecosystems target during this development: i) maximise carbon accumulation, ii) maximise water storage, or iii) balance carbon profit across hydrological disturbances. Saturated hydraulic conductivity (Ks) and empirical van Genuchten water retention parameter α are shown to provide a first order control on simulated water tensions. Across parameter space, peat profiles with hypothetical combinations of Ks and α show a strong binary tendency towards targeting either water or carbon storage. Actual hydraulic properties from five northern peatlands fall at the interface between these goals, balancing the competing demands of carbon accumulation and water storage. We argue that peat hydraulic properties are thus optimized to maximise water use efficiency and that this optimisation occurs over a centennial to millennial timescale as the peatland develops. This provides a new conceptual framework to characterise peat hydraulic properties across climate zones and between a range of different disturbances, and which can be used to provide benchmarks for peatland design and reclamation.
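
    The water-retention behaviour discussed above is governed by the van Genuchten parameter α (together with n) through the standard closed-form retention curve. The short sketch below evaluates that curve for illustrative moss-like parameter values; the numbers are not taken from the five peatlands in the study.

    ```python
    import numpy as np

    def van_genuchten_theta(psi, theta_r, theta_s, alpha, n):
        """Van Genuchten (1980) water retention: volumetric water content as a
        function of matric suction psi (positive, in the same units as 1/alpha)."""
        m = 1.0 - 1.0 / n
        return theta_r + (theta_s - theta_r) / (1.0 + (alpha * np.abs(psi)) ** n) ** m

    # Hypothetical near-surface moss parameters (illustrative values only).
    psi = np.linspace(0.1, 100.0, 5)            # suction (cm of water)
    print(van_genuchten_theta(psi, theta_r=0.05, theta_s=0.90, alpha=0.3, n=1.6))
    ```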

  19. An optimisation methodology of artificial neural network models for predicting solar radiation: a case study

    NASA Astrophysics Data System (ADS)

    Rezrazi, Ahmed; Hanini, Salah; Laidi, Maamar

    2016-02-01

    The right design and high efficiency of solar energy systems require accurate information on the availability of solar radiation. Due to the cost of purchasing and maintaining radiometers, these data are not readily available, so there is a need to develop alternative ways of generating them. Artificial neural networks (ANNs) are excellent and effective tools for learning, pinpointing or generalising data regularities, as they have the ability to model nonlinear functions and can cope with complex 'noisy' data. The main objective of this paper is to show how to reach an optimal ANN model for the prediction of solar radiation. Measured data from the year 2007 in Ghardaïa city (Algeria) are used to demonstrate the optimisation methodology. The performance evaluation and the comparison of the ANN model results with measured data are made on the basis of the mean absolute percentage error (MAPE). It is found that the MAPE of the optimal ANN model reaches 1.17%. The model also yields a root mean square error (RMSE) of 14.06% and a mean bias error (MBE) of 0.12. The accuracy of the outputs exceeded 97% and reached up to 99.29%. The results obtained indicate that the optimisation strategy satisfies practical requirements; it can be generalised to any location in the world and used in fields other than solar radiation estimation.
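
    The model selection above is driven by MAPE, with RMSE and MBE as secondary checks. These metrics are one-liners to compute; the sketch below uses invented observation/prediction pairs, and note that the paper quotes RMSE as a percentage, which suggests a normalised form rather than the absolute RMSE shown here.

    ```python
    import numpy as np

    def mape(obs, pred):
        return 100.0 * np.mean(np.abs((obs - pred) / obs))

    def rmse(obs, pred):
        return np.sqrt(np.mean((obs - pred) ** 2))

    def mbe(obs, pred):
        return np.mean(pred - obs)

    # Hypothetical daily global solar radiation (MJ m-2) vs ANN predictions.
    obs = np.array([18.2, 21.5, 24.9, 26.1, 23.4, 19.8])
    pred = np.array([18.0, 21.9, 24.5, 26.4, 23.1, 20.2])
    print(f"MAPE = {mape(obs, pred):.2f} %, "
          f"RMSE = {rmse(obs, pred):.2f}, MBE = {mbe(obs, pred):.2f}")
    ```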

  20. Land-surface parameter optimisation using data assimilation techniques: the adJULES system V1.0

    NASA Astrophysics Data System (ADS)

    Raoult, Nina M.; Jupp, Tim E.; Cox, Peter M.; Luke, Catherine M.

    2016-08-01

    Land-surface models (LSMs) are crucial components of the Earth system models (ESMs) that are used to make coupled climate-carbon cycle projections for the 21st century. The Joint UK Land Environment Simulator (JULES) is the land-surface model used in the climate and weather forecast models of the UK Met Office. JULES is also extensively used offline as a land-surface impacts tool, forced with climatologies into the future. In this study, JULES is automatically differentiated with respect to JULES parameters using commercial software from FastOpt, resulting in an analytical gradient, or adjoint, of the model. Using this adjoint, the adJULES parameter estimation system has been developed to search for locally optimum parameters by calibrating against observations. This paper describes adJULES in a data assimilation framework and demonstrates its ability to improve the model-data fit using eddy-covariance measurements of gross primary production (GPP) and latent heat (LE) fluxes. adJULES also has the ability to calibrate over multiple sites simultaneously. This feature is used to define new optimised parameter values for the five plant functional types (PFTs) in JULES. The optimised PFT-specific parameters improve the performance of JULES at over 85 % of the sites used in the study, at both the calibration and evaluation stages. The new improved parameters for JULES are presented along with the associated uncertainties for each parameter.

  1. Optimisation of lateral car dynamics taking into account parameter uncertainties

    NASA Astrophysics Data System (ADS)

    Busch, Jochen; Bestle, Dieter

    2014-02-01

    Simulation studies on an active all-wheel-steering car show that disturbances of vehicle parameters have a high influence on lateral car dynamics. This motivates the need for robust design against such parameter uncertainties. A specific parametrisation is established combining deterministic, velocity-dependent steering control parameters with partly uncertain, velocity-independent vehicle parameters for simultaneous use in a numerical optimisation process. Model-based objectives are formulated and summarised in a multi-objective optimisation problem in which especially the lateral steady-state behaviour is improved by an adaptation strategy based on measurable uncertainties. The normally distributed uncertainties are generated by optimal Latin hypercube sampling, and a response-surface-based strategy helps to cut down time-consuming model evaluations, which makes it possible to use a genetic optimisation algorithm. Optimisation results are discussed in different criterion spaces and the achieved improvements confirm the validity of the proposed procedure.
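
    As an illustration of the sampling step described above, the sketch below draws a plain (non-optimised) Latin hypercube sample and maps it to normally distributed parameters; the parameter names and values are hypothetical, and the paper's optimal Latin hypercube additionally optimises the spread of the points, which is not shown here.

      import numpy as np
      from scipy.stats import norm

      def latin_hypercube_normal(n_samples, means, stds, rng=None):
          """Plain Latin hypercube sample with normal margins: one uniform draw per
          equal-probability stratum and dimension, then the inverse normal CDF."""
          rng = np.random.default_rng(rng)
          d = len(means)
          u = np.empty((n_samples, d))
          for j in range(d):
              # one draw inside each of the n equal-probability strata, then shuffle
              strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
              u[:, j] = rng.permutation(strata)
          return norm.ppf(u) * np.asarray(stds) + np.asarray(means)

      # e.g. 50 samples of two hypothetical uncertain vehicle parameters
      # (mass in kg, cornering stiffness in N/rad)
      samples = latin_hypercube_normal(50, means=[1500.0, 80000.0], stds=[50.0, 4000.0])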

  2. An effective pseudospectral method for constraint dynamic optimisation problems with characteristic times

    NASA Astrophysics Data System (ADS)

    Xiao, Long; Liu, Xinggao; Ma, Liang; Zhang, Zeyin

    2018-03-01

    Dynamic optimisation problems with characteristic times arise widely in many areas and are at the frontier of dynamic optimisation research. This paper considers a class of dynamic optimisation problems with constraints that depend on interior points, either fixed or variable, and presents a novel direct pseudospectral method using Legendre-Gauss (LG) collocation points for solving them. The formula for the state at the terminal time of each subdomain is derived, which expresses it as a linear combination of the states at the LG points in the subdomain so as to avoid a complex nonlinear integral. The sensitivities of the state at the collocation points with respect to the variable characteristic times are derived to improve the efficiency of the method. Three well-known characteristic time dynamic optimisation problems are solved and compared in detail with methods reported in the literature. The research results show the effectiveness of the proposed method.
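
    The sketch below only illustrates the Legendre-Gauss collocation structure mentioned above: an integral over a subdomain is approximated as a weighted linear combination of integrand values at the LG points. It is a generic quadrature example, not the authors' terminal-state formula.

      import numpy as np

      # Legendre-Gauss nodes and weights on [-1, 1]; n nodes integrate polynomials
      # of degree up to 2n - 1 exactly.
      n = 5
      tau, w = np.polynomial.legendre.leggauss(n)

      # Map to a subdomain [t0, tf] and approximate the integral of g over it as a
      # weighted (linear) combination of the values of g at the LG points.
      t0, tf = 0.0, 2.0
      t = 0.5 * (tf - t0) * tau + 0.5 * (tf + t0)
      g = lambda s: 3.0 * s**2                      # toy integrand, exact integral = 8
      approx = 0.5 * (tf - t0) * np.sum(w * g(t))
      print(approx)                                  # ~8.0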

  3. Medicines optimisation: priorities and challenges.

    PubMed

    Kaufman, Gerri

    2016-03-23

    Medicines optimisation is promoted in a guideline published in 2015 by the National Institute for Health and Care Excellence. Four guiding principles underpin medicines optimisation: aim to understand the patient's experience; ensure evidence-based choice of medicines; ensure medicines use is as safe as possible; and make medicines optimisation part of routine practice. Understanding the patient experience is important to improve adherence to medication regimens. This involves communication, shared decision making and respect for patient preferences. Evidence-based choice of medicines is important for clinical and cost effectiveness. Systems and processes for the reporting of medicines-related safety incidents have to be improved if medicines use is to be as safe as possible. Ensuring safe practice in medicines use when patients are transferred between organisations, and managing the complexities of polypharmacy are imperative. A medicines use review can help to ensure that medicines optimisation forms part of routine practice.

  4. Multi-Optimisation Consensus Clustering

    NASA Astrophysics Data System (ADS)

    Li, Jian; Swift, Stephen; Liu, Xiaohui

    Ensemble Clustering has been developed to provide an alternative way of obtaining more stable and accurate clustering results. It aims to avoid the biases of individual clustering algorithms. However, it is still a challenge to develop an efficient and robust method for Ensemble Clustering. Based on an existing ensemble clustering method, Consensus Clustering (CC), this paper introduces an advanced Consensus Clustering algorithm called Multi-Optimisation Consensus Clustering (MOCC), which utilises an optimised Agreement Separation criterion and a Multi-Optimisation framework to improve the performance of CC. Fifteen different data sets are used for evaluating the performance of MOCC. The results reveal that MOCC can generate more accurate clustering results than the original CC algorithm.

  5. Optimised collision avoidance for an ultra-close rendezvous with a failed satellite based on the Gauss pseudospectral method

    NASA Astrophysics Data System (ADS)

    Chu, Xiaoyu; Zhang, Jingrui; Lu, Shan; Zhang, Yao; Sun, Yue

    2016-11-01

    This paper presents a trajectory planning algorithm to optimise the collision avoidance of a chasing spacecraft operating in ultra-close proximity to a failed satellite. The complex configuration and the tumbling motion of the failed satellite are considered. The two-spacecraft rendezvous dynamics are formulated in the target body frame, and the collision avoidance constraints are detailed, particularly concerning the uncertainties. An optimised solution of the approach problem is generated using the Gauss pseudospectral method. A closed-loop controller is used to track the optimised trajectory. Numerical results are provided to demonstrate the effectiveness of the proposed algorithms.

  6. Thermal time constant: optimising the skin temperature predictive modelling in lower limb prostheses using Gaussian processes

    PubMed Central

    Buis, Arjan

    2016-01-01

    Elevated skin temperature at the body/device interface of lower-limb prostheses is one of the major factors that affect tissue health. The heat dissipation in prosthetic sockets is greatly influenced by the thermal conductive properties of the hard socket and liner material employed. However, monitoring of the interface temperature at skin level in lower-limb prostheses is notoriously complicated. This is due to the flexible nature of the interface liners used, which requires consistent positioning of sensors during donning and doffing. Predicting the residual limb temperature by monitoring the temperature between socket and liner rather than skin and liner could be an important step in alleviating complaints about increased temperature and perspiration in prosthetic sockets. To predict the residual limb temperature, a machine learning algorithm, Gaussian processes, is employed, which utilises the thermal time constant values of commonly used socket and liner materials. This Letter highlights the relevance of the thermal time constant of prosthetic materials in the Gaussian processes technique, which would be useful in addressing the challenge of non-invasively monitoring the residual limb skin temperature. With the introduction of the thermal time constant, the model can be optimised and generalised for a given prosthetic setup, thereby making the predictions more reliable. PMID:27695626

  7. Thermal time constant: optimising the skin temperature predictive modelling in lower limb prostheses using Gaussian processes.

    PubMed

    Mathur, Neha; Glesk, Ivan; Buis, Arjan

    2016-06-01

    Elevated skin temperature at the body/device interface of lower-limb prostheses is one of the major factors that affect tissue health. The heat dissipation in prosthetic sockets is greatly influenced by the thermal conductive properties of the hard socket and liner material employed. However, monitoring of the interface temperature at skin level in lower-limb prostheses is notoriously complicated. This is due to the flexible nature of the interface liners used, which requires consistent positioning of sensors during donning and doffing. Predicting the residual limb temperature by monitoring the temperature between socket and liner rather than skin and liner could be an important step in alleviating complaints about increased temperature and perspiration in prosthetic sockets. To predict the residual limb temperature, a machine learning algorithm, Gaussian processes, is employed, which utilises the thermal time constant values of commonly used socket and liner materials. This Letter highlights the relevance of the thermal time constant of prosthetic materials in the Gaussian processes technique, which would be useful in addressing the challenge of non-invasively monitoring the residual limb skin temperature. With the introduction of the thermal time constant, the model can be optimised and generalised for a given prosthetic setup, thereby making the predictions more reliable.
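
    A minimal sketch of the kind of Gaussian-process regression described above is given below, assuming hypothetical training data in which the socket/liner temperature and the liner's thermal time constant are the inputs and the skin temperature is the target; scikit-learn is used for convenience and the feature choice is illustrative, not the Letter's exact model.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      # Hypothetical training data: columns = [socket/liner temperature (deg C),
      # thermal time constant of the liner material (s)]; target = skin temperature.
      X_train = np.array([[28.0, 120.0], [29.5, 120.0], [31.0, 300.0], [33.0, 300.0]])
      y_train = np.array([30.1, 31.4, 32.0, 33.8])

      # RBF kernel for the smooth temperature response plus a white-noise term.
      kernel = 1.0 * RBF(length_scale=[1.0, 100.0]) + WhiteKernel(noise_level=0.1)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

      # Predict skin temperature (with uncertainty) for a new socket reading.
      mean, std = gp.predict(np.array([[30.0, 120.0]]), return_std=True)
      print(mean[0], std[0])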

  8. Pure random search for ambient sensor distribution optimisation in a smart home environment.

    PubMed

    Poland, Michael P; Nugent, Chris D; Wang, Hui; Chen, Liming

    2011-01-01

    Smart homes are living spaces facilitated with technology to allow individuals to remain in their own homes for longer, rather than be institutionalised. Sensors are the fundamental physical layer within any smart home, as the data they generate is used to inform decision support systems, facilitating appropriate actuator actions. Positioning of sensors is therefore a fundamental characteristic of a smart home. Contemporary smart home sensor distribution follows either (a) a total coverage approach or (b) a human assessment approach. These methods for sensor arrangement are not data-driven strategies; they are unempirical and frequently irrational. This study hypothesised that sensor deployment directed by an optimisation method that utilises inhabitants' spatial frequency data as the search space would produce better sensor distributions than the current method of sensor deployment by engineers. Seven human engineers were tasked with creating sensor distributions based on perceived utility for 9 deployment scenarios. A Pure Random Search (PRS) algorithm was then tasked with creating matched sensor distributions. The PRS method produced superior distributions in 98.4% of test cases (n=64) compared with human-engineer-instructed deployments when the engineers had no access to the spatial frequency data, and in 92.0% of test cases (n=64) when engineers had full access to these data. These results thus confirmed the hypothesis.
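
    A minimal sketch of a pure random search over sensor layouts is shown below; the occupancy-frequency grid, the fitness function (total frequency captured by the chosen cells) and the sensor count are illustrative assumptions rather than the study's actual search space.

      import numpy as np

      def pure_random_search(freq_grid, n_sensors, n_iters=10_000, rng=None):
          """Pure random search: repeatedly draw a random sensor layout and keep the
          one whose cells capture the largest share of the inhabitant's spatial
          frequency counts (a stand-in for the study's fitness function)."""
          rng = np.random.default_rng(rng)
          cells = freq_grid.ravel()
          best_layout, best_score = None, -np.inf
          for _ in range(n_iters):
              layout = rng.choice(cells.size, size=n_sensors, replace=False)
              score = cells[layout].sum()
              if score > best_score:
                  best_layout, best_score = layout, score
          return np.unravel_index(best_layout, freq_grid.shape), best_score

      # e.g. a hypothetical 10 x 10 occupancy-frequency grid and 5 sensors
      grid = np.random.default_rng(1).integers(0, 50, size=(10, 10))
      print(pure_random_search(grid, n_sensors=5, n_iters=2000, rng=1))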

  9. Response surface methodology investigation into the interactions between arsenic and humic acid in water during the coagulation process.

    PubMed

    Watson, Malcolm Alexander; Tubić, Aleksandra; Agbaba, Jasmina; Nikić, Jasmina; Maletić, Snežana; Molnar Jazić, Jelena; Dalmacija, Božo

    2016-07-15

    Interactions between arsenic and natural organic matter (NOM) are key limiting factors during the optimisation of drinking water treatment when significant amounts of both must be removed. This work uses Response Surface Methodology (RSM) to investigate how they interact during their simultaneous removal by iron chloride coagulation, using humic acid (HA) as a model NOM substance. Using a three-factor Box-Behnken experimental design, As and HA removals were modelled, as well as a combined removal response. ANOVA results showed the significance of the coagulant dose for all three responses. At high initial arsenic concentrations (200 μg/l), As removal was significantly hindered by the presence of HA. In contrast, the HA removal response was found to be largely independent of the initial As concentration, with the optimum coagulant dose increasing with increasing HA concentrations. The combined response was similar to the HA removal response, and the interactions evident are most interesting in terms of optimising treatment processes during the preparation of drinking water, highlighting the importance of utilising RSM for such investigations. The combined response model was successfully validated with two different groundwaters used for drinking water supply in the Republic of Serbia, showing excellent agreement under similar experimental conditions. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Fabrication of Organic Radar Absorbing Materials: A Report on the TIF Project

    DTIC Science & Technology

    2005-05-01

    thickness, permittivity and permeability. The ability to measure the permittivity and permeability is an essential requirement for designing an optimised...absorber. And good optimisation codes are required in order to achieve the best possible absorber designs. In this report, the results from a...through measurement of their conductivity and permittivity at microwave frequencies. Methods were then developed for optimising the design of

  11. Thermal buckling optimisation of composite plates using firefly algorithm

    NASA Astrophysics Data System (ADS)

    Kamarian, S.; Shakeri, M.; Yas, M. H.

    2017-07-01

    Composite plates play a very important role in engineering applications, especially in the aerospace industry. Thermal buckling of such components is of great importance and must be known to achieve an appropriate design. This paper deals with stacking sequence optimisation of laminated composite plates for maximising the critical buckling temperature using a powerful meta-heuristic algorithm called the firefly algorithm (FA), which is based on the flashing behaviour of fireflies. The main objective of the present work was to show the ability of the FA in the optimisation of composite structures. The performance of the FA is compared with results reported in previously published works using other algorithms, which demonstrates the efficiency of the FA in stacking sequence optimisation of laminated composite structures.
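
    The sketch below shows the core firefly movement rule (attractiveness decaying with distance, dimmer fireflies moving towards brighter ones) on a generic continuous test function; the paper's application to discrete stacking sequences requires an encoding of ply angles that is not reproduced here.

      import numpy as np

      def firefly_minimise(f, bounds, n_fireflies=20, n_gens=100,
                           beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
          """Basic firefly algorithm for continuous minimisation (brightness = -f)."""
          rng = np.random.default_rng(rng)
          lo, hi = np.asarray(bounds, dtype=float).T
          x = rng.uniform(lo, hi, size=(n_fireflies, lo.size))
          cost = np.array([f(xi) for xi in x])
          for _ in range(n_gens):
              for i in range(n_fireflies):
                  for j in range(n_fireflies):
                      if cost[j] < cost[i]:                      # j is brighter
                          r2 = np.sum((x[i] - x[j]) ** 2)
                          beta = beta0 * np.exp(-gamma * r2)     # attractiveness
                          x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(lo.size) - 0.5)
                          x[i] = np.clip(x[i], lo, hi)
                          cost[i] = f(x[i])
              alpha *= 0.97                                      # cool the random walk
          best = np.argmin(cost)
          return x[best], cost[best]

      # e.g. minimise the sphere function in 2-D
      print(firefly_minimise(lambda v: np.sum(v ** 2), bounds=[(-5, 5), (-5, 5)], rng=0))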

  12. Distributed convex optimisation with event-triggered communication in networked systems

    NASA Astrophysics Data System (ADS)

    Liu, Jiayun; Chen, Weisheng

    2016-12-01

    This paper studies the distributed convex optimisation problem over directed networks. Motivated by practical considerations, we propose a novel distributed zero-gradient-sum optimisation algorithm with event-triggered communication. Communication and control updates therefore occur only at discrete instants when a predefined condition is satisfied. Compared with time-driven distributed optimisation algorithms, the proposed algorithm thus has the advantages of lower energy consumption and lower communication cost. Based on Lyapunov approaches, we show that the proposed algorithm makes the system states converge to the solution of the problem exponentially fast and that Zeno behaviour is excluded. Finally, a simulation example is given to illustrate the effectiveness of the proposed algorithm.

  13. Female Mate Choice Can Drive the Evolution of High Frequency Echolocation in Bats: A Case Study with Rhinolophus mehelyi

    PubMed Central

    Puechmaille, Sébastien J.; Borissov, Ivailo M.; Zsebok, Sándor; Allegrini, Benjamin; Hizem, Mohammed; Kuenzel, Sven; Schuchmann, Maike; Teeling, Emma C.

    2014-01-01

    Animals employ an array of signals (i.e. visual, acoustic, olfactory) for communication. Natural selection favours signals, receptors, and signalling behaviour that optimise the received signal relative to background noise. When the signal is used for more than one function, antagonisms amongst the different signalling functions may constrain the optimisation of the signal for any one function. Sexual selection through mate choice can strongly modify the effects of natural selection on signalling systems, ultimately causing maladaptive signals to evolve. Echolocating bats represent a fascinating group in which to study the evolution of signalling systems as, unlike bird songs or frog calls, echolocation has a dual role in foraging and communication. The function of bat echolocation is to generate echoes that the calling bat uses for orientation and food detection, with call characteristics being directly related to the exploitation of particular ecological niches. Therefore, it is commonly assumed that echolocation has been shaped by ecology via natural selection. Here we demonstrate for the first time using a novel combined behavioural, ecological and genetic approach that in a bat species, Rhinolophus mehelyi: (1) echolocation peak frequency is an honest signal of body size; (2) females preferentially select males with high frequency calls during the mating season; (3) high frequency males sire more offspring, providing evidence that echolocation calls may play a role in female mate choice. Our data refute the sole role of ecology in the evolution of echolocation and highlight the antagonistic interplay between natural and sexual selection in shaping acoustic signals. PMID:25075972

  14. Female mate choice can drive the evolution of high frequency echolocation in bats: a case study with Rhinolophus mehelyi.

    PubMed

    Puechmaille, Sébastien J; Borissov, Ivailo M; Zsebok, Sándor; Allegrini, Benjamin; Hizem, Mohammed; Kuenzel, Sven; Schuchmann, Maike; Teeling, Emma C; Siemers, Björn M

    2014-01-01

    Animals employ an array of signals (i.e. visual, acoustic, olfactory) for communication. Natural selection favours signals, receptors, and signalling behaviour that optimise the received signal relative to background noise. When the signal is used for more than one function, antagonisms amongst the different signalling functions may constrain the optimisation of the signal for any one function. Sexual selection through mate choice can strongly modify the effects of natural selection on signalling systems, ultimately causing maladaptive signals to evolve. Echolocating bats represent a fascinating group in which to study the evolution of signalling systems as, unlike bird songs or frog calls, echolocation has a dual role in foraging and communication. The function of bat echolocation is to generate echoes that the calling bat uses for orientation and food detection, with call characteristics being directly related to the exploitation of particular ecological niches. Therefore, it is commonly assumed that echolocation has been shaped by ecology via natural selection. Here we demonstrate for the first time using a novel combined behavioural, ecological and genetic approach that in a bat species, Rhinolophus mehelyi: (1) echolocation peak frequency is an honest signal of body size; (2) females preferentially select males with high frequency calls during the mating season; (3) high frequency males sire more offspring, providing evidence that echolocation calls may play a role in female mate choice. Our data refute the sole role of ecology in the evolution of echolocation and highlight the antagonistic interplay between natural and sexual selection in shaping acoustic signals.

  15. Optimising operational amplifiers by evolutionary algorithms and gm/Id method

    NASA Astrophysics Data System (ADS)

    Tlelo-Cuautle, E.; Sanabria-Borbon, A. C.

    2016-10-01

    The evolutionary algorithm called the non-dominated sorting genetic algorithm (NSGA-II) is applied herein to the optimisation of operational transconductance amplifiers. NSGA-II is accelerated by applying the gm/Id method to estimate reduced search spaces associated with the widths (W) and lengths (L) of the metal-oxide-semiconductor field-effect transistors (MOSFETs), and to guarantee their appropriate bias conditions. In addition, we introduce an integer encoding for the W/L sizes of the MOSFETs to avoid a post-processing step for rounding off their values to multiples allowed by the integrated circuit fabrication technology. Finally, from the feasible solutions generated by NSGA-II, we introduce a second optimisation stage to guarantee that the final feasible W/L size solutions support process, voltage and temperature (PVT) variations. The optimisation results lead us to conclude that the gm/Id method and integer encoding are quite useful to accelerate the convergence of the evolutionary algorithm NSGA-II, while the second optimisation stage guarantees robustness of the feasible solutions to PVT variations.

  16. Optimisation of active suspension control inputs for improved vehicle handling performance

    NASA Astrophysics Data System (ADS)

    Čorić, Mirko; Deur, Joško; Kasać, Josip; Tseng, H. Eric; Hrovat, Davor

    2016-11-01

    Active suspension is commonly considered under the framework of vertical vehicle dynamics control aimed at improvements in ride comfort. This paper uses a collocation-type control variable optimisation tool to investigate to what extent the fully active suspension (FAS) application can be broadened to the task of vehicle handling/cornering control. The optimisation approach is first applied solely to FAS actuator configurations and three types of double lane-change manoeuvres. The obtained optimisation results are used to gain insights into different control mechanisms that are used by FAS to improve the handling performance in terms of path-following error reduction. For the same manoeuvres the FAS performance is compared with the performance of different active steering and active differential actuators. The optimisation study is finally extended to combined FAS and active front- and/or rear-steering configurations to investigate whether they can use their complementary control authorities (over the vertical and lateral vehicle dynamics, respectively) to further improve the handling performance.

  17. DryLab® optimised two-dimensional high performance liquid chromatography for differentiation of ephedrine and pseudoephedrine based methamphetamine samples.

    PubMed

    Andrighetto, Luke M; Stevenson, Paul G; Pearson, James R; Henderson, Luke C; Conlan, Xavier A

    2014-11-01

    In-silico optimised two-dimensional high performance liquid chromatographic (2D-HPLC) separations of a model methamphetamine seizure sample are described, where an excellent match between simulated and real separations was observed. Targeted separation of model compounds was completed with significantly reduced method development time. This separation was completed in the heart-cutting mode of 2D-HPLC, in which C18 columns were used in both dimensions, taking advantage of the selectivity difference of methanol and acetonitrile as the mobile phases. This method development protocol is most significant when optimising the separation of chemically similar compounds, as it eliminates potentially hours of trial-and-error injections to identify the optimised experimental conditions. After only four screening injections the gradient profile for both 2D-HPLC dimensions could be optimised via simulations, ensuring the baseline resolution of diastereomers (ephedrine and pseudoephedrine) in 9.7 min. Depending on which diastereomer is present, the potential synthetic pathway can be categorised.

  18. Shape Optimisation of Holes in Loaded Plates by Minimisation of Multiple Stress Peaks

    DTIC Science & Technology

    2015-04-01

    Shape Optimisation of Holes in Loaded Plates by Minimisation of Multiple Stress Peaks, Witold Waldman and Manfred...minimising the peak tangential stresses on multiple segments around the boundary of a hole in a uniaxially-loaded or biaxially-loaded plate. It is based...Executive Summary: Aerospace

  19. Navigating catastrophes: Local but not global optimisation allows for macro-economic navigation of crises

    NASA Astrophysics Data System (ADS)

    Harré, Michael S.

    2013-02-01

    Two aspects of modern economic theory have dominated the recent discussion on the state of the global economy: Crashes in financial markets and whether or not traditional notions of economic equilibrium have any validity. We have all seen the consequences of market crashes: plummeting share prices, businesses collapsing and considerable uncertainty throughout the global economy. This seems contrary to what might be expected of a system in equilibrium where growth dominates the relatively minor fluctuations in prices. Recent work from within economics as well as by physicists, psychologists and computational scientists has significantly improved our understanding of the more complex aspects of these systems. With this interdisciplinary approach in mind, a behavioural economics model of local optimisation is introduced and three general properties are proven. The first is that under very specific conditions local optimisation leads to a conventional macro-economic notion of a global equilibrium. The second is that if both global optimisation and economic growth are required then under very mild assumptions market catastrophes are an unavoidable consequence. Third, if only local optimisation and economic growth are required then there is sufficient parametric freedom for macro-economic policy makers to steer an economy around catastrophes without overtly disrupting local optimisation.

  20. Optimisation techniques in vaginal cuff brachytherapy.

    PubMed

    Tuncel, N; Garipagaoglu, M; Kizildag, A U; Andic, F; Toy, A

    2009-11-01

    The aim of this study was to explore whether an in-house dosimetry protocol and optimisation method are able to produce a homogeneous dose distribution in the target volume, and how often optimisation is required in vaginal cuff brachytherapy. Treatment planning was carried out for 109 fractions in 33 patients who underwent high dose rate iridium-192 (Ir-192) brachytherapy using Fletcher ovoids. Dose prescription and normalisation were performed to catheter-oriented lateral dose points (dps) within a range of 90-110% of the prescribed dose. The in-house vaginal apex point (Vk), alternative vaginal apex point (Vk'), International Commission on Radiation Units and Measurements (ICRU) rectal point (Rg) and bladder point (Bl) doses were calculated. Time-position optimisations were made considering dps, Vk and Rg doses. Keeping the Vk dose higher than 95% and the Rg dose less than 85% of the prescribed dose was intended. Target dose homogeneity, optimisation frequency and the relationship between prescribed dose, Vk, Vk', Rg and ovoid diameter were investigated. The mean target dose was 99 ± 7.4% of the prescription dose. Optimisation was required in 92 out of 109 (83%) fractions. Ovoid diameter had a significant effect on Rg (p = 0.002), Vk (p = 0.018), Vk' (p = 0.034), minimum dps (p = 0.021) and maximum dps (p<0.001). Rg, Vk and Vk' doses with 2.5 cm diameter ovoids were significantly higher than with 2 cm and 1.5 cm ovoids. Catheter-oriented dose point normalisation provided a homogeneous dose distribution with a 99 ± 7.4% mean dose within the target volume, requiring time-position optimisation.

  1. Effect of preventive (beta blocker) treatment, behavioural migraine management, or their combination on outcomes of optimised acute treatment in frequent migraine: randomised controlled trial.

    PubMed

    Holroyd, Kenneth A; Cottrell, Constance K; O'Donnell, Francis J; Cordingley, Gary E; Drew, Jana B; Carlson, Bruce W; Himawan, Lina

    2010-09-29

    To determine whether the addition of preventive drug treatment (β blocker), brief behavioural migraine management, or their combination improves the outcome of optimised acute treatment in the management of frequent migraine. Randomised placebo controlled trial over 16 months from July 2001 to November 2005. Two outpatient sites in Ohio, USA. 232 adults (mean age 38 years; 79% female) with diagnosis of migraine with or without aura according to International Headache Society classification of headache disorders criteria, who recorded at least three migraines with disability per 30 days (mean 5.5 migraines/30 days), during an optimised run-in of acute treatment. Addition of one of four preventive treatments to optimised acute treatment: β blocker (n=53), matched placebo (n=55), behavioural migraine management plus placebo (n=55), or behavioural migraine management plus β blocker (n=69). The primary outcome was change in migraines/30 days; secondary outcomes included change in migraine days/30 days and change in migraine specific quality of life scores. Mixed model analysis showed statistically significant (P≤0.05) differences in outcomes among the four added treatments for both the primary outcome (migraines/30 days) and the two secondary outcomes (change in migraine days/30 days and change in migraine specific quality of life scores). The addition of combined β blocker and behavioural migraine management (-3.3 migraines/30 days, 95% confidence interval -3.2 to -3.5), but not the addition of β blocker alone (-2.1 migraines/30 days, -1.9 to -2.2) or behavioural migraine management alone (-2.2 migraines/30 days, -2.0 to -2.4), improved outcomes compared with optimised acute treatment alone (-2.1 migraines/30 days, -1.9 to -2.2). For a clinically significant reduction (≥50%) in migraines/30 days, the number needed to treat for optimised acute treatment plus combined β blocker and behavioural migraine management was 3.1 compared with optimised acute treatment alone, 2.6 compared with optimised acute treatment plus β blocker, and 3.1 compared with optimised acute treatment plus behavioural migraine management. Results were consistent for the two secondary outcomes, and at both month 10 (the primary endpoint) and month 16. The addition of combined β blocker plus behavioural migraine management, but not the addition of β blocker alone or behavioural migraine management alone, improved outcomes of optimised acute treatment. Combined β blocker treatment and behavioural migraine management may improve outcomes in the treatment of frequent migraine. Clinical trials NCT00910689.

  2. Separable projection integrals for higher-order correlators of the cosmic microwave sky: Acceleration by factors exceeding 100

    NASA Astrophysics Data System (ADS)

    Briggs, J. P.; Pennycook, S. J.; Fergusson, J. R.; Jäykkä, J.; Shellard, E. P. S.

    2016-04-01

    We present a case study describing efforts to optimise and modernise "Modal", the simulation and analysis pipeline used by the Planck satellite experiment for constraining general non-Gaussian models of the early universe via the bispectrum (or three-point correlator) of the cosmic microwave background radiation. We focus on one particular element of the code: the projection of bispectra from the end of inflation to the spherical shell at decoupling, which defines the CMB we observe today. This code involves a three-dimensional inner product between two functions, one of which requires an integral, on a non-rectangular domain containing a sparse grid. We show that by employing separable methods this calculation can be reduced to a one-dimensional summation plus two integrations, reducing the overall dimensionality from four to three. The introduction of separable functions also solves the issue of the non-rectangular sparse grid. This separable method can become unstable in certain scenarios and so the slower non-separable integral must be calculated instead. We present a discussion of the optimisation of both approaches. We demonstrate significant speed-ups of ≈100×, arising from a combination of algorithmic improvements and architecture-aware optimisations targeted at improving thread and vectorisation behaviour. The resulting MPI/OpenMP hybrid code is capable of executing on clusters containing processors and/or coprocessors, with strong-scaling efficiency of 98.6% on up to 16 nodes. We find that a single coprocessor outperforms two processor sockets by a factor of 1.3× and that running the same code across a combination of both microarchitectures improves performance-per-node by a factor of 3.38×. By making bispectrum calculations competitive with those for the power spectrum (or two-point correlator) we are now able to consider joint analysis for cosmological science exploitation of new data.

  3. Investigation of Light Manipulation by the Ultrastructure of Marine Diatoms

    DTIC Science & Technology

    2009-11-13

    added effect of the semiconductor EL emission is to be identified or its function optimised. In other biological organisms, such as insecta...nanopatterned ultrastructures comprising periodic or quasi-periodic spatial variations in refractive index, give rise to strong photonic effects. These... effects are well documented across a broad range of species through many detailed optical studies [13-15]. A number of them have gone on to inspire

  4. Production, optimisation and characterisation of angiotensin converting enzyme inhibitory peptides from sea cucumber (Stichopus japonicus) gonad.

    PubMed

    Zhong, Chan; Sun, Le-Chang; Yan, Long-Jie; Lin, Yi-Chen; Liu, Guang-Ming; Cao, Min-Jie

    2018-01-24

    In this study, production of bioactive peptides with angiotensin converting enzyme (ACE) inhibitory activity from sea cucumber (Stichopus japonicus) gonad using commercial Protamex was optimised by response surface methodology (RSM). As a result, the optimal condition to achieve the highest ACE inhibitory activity in sea cucumber gonad hydrolysate (SCGH) was hydrolysis for 1.95 h and E/S of 0.75%. For further characterisation, three individual peptides (EIYR, LF and NAPHMR) were purified and identified. The peptide NAPHMR showed the highest ACE inhibitory activity with an IC50 of 260.22 ± 3.71 μM. NAPHMR was stable against simulated gastrointestinal digestion and revealed no significant cytotoxicity toward Caco-2 cells. A molecular docking study suggested that Arg, His and Asn residues in NAPHMR interact with the S2 pocket or Zn2+-binding motifs of ACE via hydrogen or π-bonds, potentially contributing to the ACE inhibitory effect. Sea cucumber gonad is thus a potential resource to produce ACE inhibitory peptides for the preparation of functional foods.

  5. Image charge multi-role and function detectors

    NASA Astrophysics Data System (ADS)

    Milnes, James; Lapington, Jon S.; Jagutzki, Ottmar; Howorth, Jon

    2009-06-01

    The image charge technique used with microchannel plate imaging tubes provides several operational and practical benefits by serving to isolate the electronic image readout from the detector. The simple dielectric interface between detector and readout provides vacuum isolation and no vacuum electrical feed-throughs are required. Since the readout is mechanically separate from the detector, an image tube of generic design can be simply optimised for various applications by attaching it to different readout devices and electronics. We present imaging performance results using a single image tube with a variety of readout devices suited to differing applications: (a) A four electrode charge division tetra wedge anode, optimised for best spatial resolution in photon counting mode. (b) A cross delay line anode, enabling higher count rate, and the possibility of discriminating near co-incident events, and an event timing resolution of better than 1 ns. (c) A multi-anode readout connected, either to a multi-channel oscilloscope for analogue measurements of fast optical pulses, or alternately, to a multi-channel time correlated single photon counting (TCSPC) card.

  6. Clinical utility of an optimised multiplex real-time PCR assay for the identification of pathogens causing sepsis in Vietnamese patients.

    PubMed

    Tat Trung, Ngo; Van Tong, Hoang; Lien, Tran Thi; Van Son, Trinh; Thanh Huyen, Tran Thi; Quyen, Dao Thanh; Hoan, Phan Quoc; Meyer, Christian G; Song, Le Huu

    2018-02-01

    For the identification of bacterial pathogens, blood culture is still the gold standard diagnostic method. However, several disadvantages apply to blood cultures, such as the time and the rather large blood sample volumes required. We have previously established an optimised multiplex real-time PCR method in order to diagnose bloodstream infections. In the present study, we evaluated the diagnostic performance of this optimised multiplex RT-PCR in blood samples collected from 110 septicaemia patients enrolled at the 108 Military Central Hospital, Hanoi, Vietnam. Positive results were obtained by blood culture, the LightCycler-based SeptiFast® assay and our multiplex RT-PCR in 35 (32%), 31 (28%), and 31 (28%) samples, respectively. Combined use of the three methods confirmed 50 (45.5%) positive cases of bloodstream infection, a rate significantly higher compared to the exclusive use of one of the three methods (P=0.052, 0.012 and 0.012, respectively). The sensitivity, specificity and area under the curve (AUC) of our assay were higher compared to those of the SeptiFast® assay (77.4%, 86.1% and 0.8 vs. 67.7%, 82.3% and 0.73, respectively). Combined use of blood culture and multiplex RT-PCR assay showed a superior diagnostic performance, as the sensitivity, specificity, and AUC reached 83.3%, 100%, and 0.95, respectively. The concordance between blood culture and the multiplex RT-PCR assay was highest for Klebsiella pneumoniae (100%), followed by Streptococcus spp. (77.8%), Escherichia coli (66.7%), Staphylococcus spp. (50%) and Salmonella spp. (50%). In addition, the use of the newly established multiplex RT-PCR assay increased the spectrum of identifiable agents (Acinetobacter baumannii, 1/32; Proteus mirabilis, 1/32). The combination of culture and the multiplex RT-PCR assay provided excellent diagnostic performance and significantly supported the identification of causative pathogens in clinical samples obtained from septic patients. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  7. Optimisation of wire-cut EDM process parameter by Grey-based response surface methodology

    NASA Astrophysics Data System (ADS)

    Kumar, Amit; Soota, Tarun; Kumar, Jitendra

    2018-03-01

    Wire electric discharge machining (WEDM) is one of the advanced machining processes. Response surface methodology coupled with the grey relational analysis method is proposed and used to optimise the machining parameters of WEDM. A face-centred cubic design is used for conducting experiments on high-speed steel (HSS) M2 grade workpiece material. The regression model of significant factors such as pulse-on time, pulse-off time, peak current, and wire feed is considered for optimising the response variables: material removal rate (MRR), surface roughness and kerf width. The optimal machining parameter condition was obtained using the grey relational grade. ANOVA is applied to determine the significance of the input parameters for the grey relational grade.
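
    A minimal sketch of the grey relational grade computation is given below, with a larger-the-better normalisation for MRR and smaller-the-better normalisations for surface roughness and kerf width; the distinguishing coefficient of 0.5, the equal response weights and the example data are common defaults and placeholders, not values from the paper.

      import numpy as np

      def grey_relational_grade(responses, larger_better, zeta=0.5, weights=None):
          """Grey relational analysis: normalise each response, measure its deviation
          from the ideal (1.0), convert to grey relational coefficients and average
          them into a single grade per experiment."""
          r = np.asarray(responses, dtype=float)          # shape: (experiments, responses)
          norm = np.empty_like(r)
          for k in range(r.shape[1]):
              col = r[:, k]
              if larger_better[k]:
                  norm[:, k] = (col - col.min()) / (col.max() - col.min())
              else:
                  norm[:, k] = (col.max() - col) / (col.max() - col.min())
          delta = 1.0 - norm                               # deviation from the ideal sequence
          coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
          w = np.full(r.shape[1], 1.0 / r.shape[1]) if weights is None else np.asarray(weights)
          return coeff @ w                                 # grey relational grade per run

      # e.g. three hypothetical WEDM runs: [MRR, surface roughness, kerf width]
      runs = [[4.2, 2.1, 0.31], [5.0, 2.6, 0.35], [3.6, 1.8, 0.29]]
      print(grey_relational_grade(runs, larger_better=[True, False, False]))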

  8. Land-surface parameter optimisation using data assimilation techniques: the adJULES system V1.0

    DOE PAGES

    Raoult, Nina M.; Jupp, Tim E.; Cox, Peter M.; ...

    2016-08-25

    Land-surface models (LSMs) are crucial components of the Earth system models (ESMs) that are used to make coupled climate–carbon cycle projections for the 21st century. The Joint UK Land Environment Simulator (JULES) is the land-surface model used in the climate and weather forecast models of the UK Met Office. JULES is also extensively used offline as a land-surface impacts tool, forced with climatologies into the future. In this study, JULES is automatically differentiated with respect to JULES parameters using commercial software from FastOpt, resulting in an analytical gradient, or adjoint, of the model. Using this adjoint, the adJULES parameter estimation system has been developed to search for locally optimum parameters by calibrating against observations. This paper describes adJULES in a data assimilation framework and demonstrates its ability to improve the model–data fit using eddy-covariance measurements of gross primary production (GPP) and latent heat (LE) fluxes. adJULES also has the ability to calibrate over multiple sites simultaneously. This feature is used to define new optimised parameter values for the five plant functional types (PFTs) in JULES. The optimised PFT-specific parameters improve the performance of JULES at over 85 % of the sites used in the study, at both the calibration and evaluation stages. Furthermore, the new improved parameters for JULES are presented along with the associated uncertainties for each parameter.

  9. Optimisation of Over-Expression in E. coli and Biophysical Characterisation of Human Membrane Protein Synaptogyrin 1

    PubMed Central

    Löw, Christian; Jegerschöld, Caroline; Kovermann, Michael; Moberg, Per; Nordlund, Pär

    2012-01-01

    Progress in functional and structural studies of integral membrane proteins (IMPs) lags behind that of their soluble counterparts due to the great challenge in producing stable and homogeneous IMPs. Low natural abundance, toxicity when over-expressed and potential lipid requirements of IMPs are only a few reasons for the limited progress. Here, we describe an optimised workflow for the recombinant over-expression of the human tetraspan vesicle protein (TVP) synaptogyrin in Escherichia coli and its biophysical characterisation. TVPs are ubiquitous and abundant components of vesicles. They are believed to be involved in various aspects of the synaptic vesicle cycle, including vesicle biogenesis, exocytosis and endocytotic recycling. Even though TVPs are found in most cell types, high-resolution structural information for this class of membrane proteins is still missing. The optimisation of the N-terminal sequence of the gene, together with the use of the recently developed Lemo21(DE3) strain, which allows the translation rate to be balanced against the membrane insertion rate, led to a 50-fold increased expression rate compared to the classical BL21(DE3) strain. The protein was soluble and stable in a variety of mild detergents and multiple biophysical methods confirmed the folded state of the protein. Crosslinking experiments suggest an oligomeric architecture of at least four subunits. The protein stability is significantly improved in the presence of cholesteryl hemisuccinate as judged by differential light scattering. The approach described here can easily be adapted to other eukaryotic IMPs. PMID:22675529

  10. Text analysis of MEDLINE for discovering functional relationships among genes: evaluation of keyword extraction weighting schemes.

    PubMed

    Liu, Ying; Navathe, Shamkant B; Pivoshenko, Alex; Dasigi, Venu G; Dingledine, Ray; Ciliax, Brian J

    2006-01-01

    One of the key challenges of microarray studies is to derive biological insights from the gene-expression patterns. Clustering genes by functional keyword association can provide direct information about the functional links among genes. However, the quality of the keyword lists significantly affects the clustering results. We compared two keyword weighting schemes: normalised z-score and term frequency-inverse document frequency (TFIDF). Two gene sets were tested to evaluate the effectiveness of the weighting schemes for keyword extraction for gene clustering. Using established measures of cluster quality, the results produced from TFIDF-weighted keywords outperformed those produced from normalised z-score weighted keywords. The optimised algorithms should be useful for partitioning genes from microarray lists into functionally discrete clusters.
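
    For illustration, a small sketch of the TFIDF weighting scheme compared in the study is given below; the gene-associated keyword lists are hypothetical, and the normalised z-score alternative is not reproduced.

      import math
      from collections import Counter

      def tfidf_weights(doc_keywords, corpus_keywords):
          """TF-IDF weight of each keyword for one gene's document, given the keyword
          lists of all gene documents in the corpus."""
          tf = Counter(doc_keywords)
          n_docs = len(corpus_keywords)
          weights = {}
          for term, count in tf.items():
              df = sum(1 for doc in corpus_keywords if term in doc)   # document frequency
              idf = math.log(n_docs / df)                              # inverse document frequency
              weights[term] = (count / len(doc_keywords)) * idf
          return weights

      # e.g. three hypothetical gene-associated keyword lists mined from MEDLINE
      corpus = [["kinase", "receptor", "signalling"],
                ["receptor", "channel", "transport"],
                ["kinase", "phosphorylation", "signalling"]]
      print(tfidf_weights(corpus[0], corpus))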

  11. Fabrication, characterisation and voltammetric studies of gold amalgam nanoparticle modified electrodes.

    PubMed

    Welch, Christine M; Nekrassova, Olga; Dai, Xuan; Hyde, Michael E; Compton, Richard G

    2004-09-20

    The fabrication, characterisation, and electroanalytical application of gold and gold amalgam nanoparticles on glassy carbon electrodes are examined. Once the deposition parameters for gold nanoparticle electrodes were optimised, the analytical utility of the electrodes was examined in Cr(III) electroanalysis. It was found that gold nanoparticle modified (Au-NM) electrodes possess higher sensitivity than gold macroelectrodes. In addition, gold amalgam nanoparticle modified (AuHg-NM) electrodes were fabricated and characterised. The response of those electrodes was recorded in the presence of important environmental analytes (heavy metal cations). It was found that AuHg-NM electrodes demonstrate a unique voltammetric behaviour and can be applied for electroanalysis when enhanced sensitivity is crucial.

  12. Resonant tunneling based graphene quantum dot memristors.

    PubMed

    Pan, Xuan; Skafidas, Efstratios

    2016-12-08

    In this paper, we model two-terminal all graphene quantum dot (GQD) based resistor-type memory devices (memristors). The resistive switching is achieved by resonant electron tunneling. We show that parallel GQDs can be used to create multi-state memory circuits. The number of states can be optimised with additional voltage sources, whilst the noise margin for each state can be controlled by appropriately choosing the branch resistance. A three-terminal GQD device configuration is also studied. The addition of an isolated gate terminal can be used to add further or modify the states of the memory device. The proposed devices provide a promising route towards volatile memory devices utilizing only atomically thin two-dimensional graphene.

  13. Systemic solutions for multi-benefit water and environmental management.

    PubMed

    Everard, Mark; McInnes, Robert

    2013-09-01

    The environmental and financial costs of inputs to, and unintended consequences arising from narrow consideration of outputs from, water and environmental management technologies highlight the need for low-input solutions that optimise outcomes across multiple ecosystem services. Case studies examining the inputs and outputs associated with several ecosystem-based water and environmental management technologies reveal a range from those that differ little from conventional electro-mechanical engineering techniques through methods, such as integrated constructed wetlands (ICWs), designed explicitly as low-input systems optimising ecosystem service outcomes. All techniques present opportunities for further optimisation of outputs, and hence for greater cumulative public value. We define 'systemic solutions' as "…low-input technologies using natural processes to optimise benefits across the spectrum of ecosystem services and their beneficiaries". They contribute to sustainable development by averting unintended negative impacts and optimising benefits to all ecosystem service beneficiaries, increasing net economic value. Legacy legislation addressing issues in a fragmented way, associated 'ring-fenced' budgets and established management assumptions represent obstacles to implementing 'systemic solutions'. However, flexible implementation of legacy regulations recognising their primary purpose, rather than slavish adherence to detailed sub-clauses, may achieve greater overall public benefit through optimisation of outcomes across ecosystem services. Systemic solutions are not a panacea if applied merely as 'downstream' fixes, but are part of, and a means to accelerate, broader culture change towards more sustainable practice. This necessarily entails connecting a wider network of interests in the formulation and design of mutually-beneficial systemic solutions, including for example spatial planners, engineers, regulators, managers, farming and other businesses, and researchers working on ways to quantify and optimise delivery of ecosystem services. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Topology optimisation of micro fluidic mixers considering fluid-structure interactions with a coupled Lattice Boltzmann algorithm

    NASA Astrophysics Data System (ADS)

    Munk, David J.; Kipouros, Timoleon; Vio, Gareth A.; Steven, Grant P.; Parks, Geoffrey T.

    2017-11-01

    Recently, the study of micro fluidic devices has gained much interest in various fields from biology to engineering. In the constant development cycle, the need to optimise the topology of the interior of these devices, where there are two or more optimality criteria, is always present. This work considers twin physical situations in which optimal fluid mixing, in the form of vorticity maximisation, is accompanied by the requirement that the casing in which the mixing takes place has the best structural performance in terms of the greatest specific stiffness. In the steady state of mixing this also means that the stresses in the casing are as uniform as possible, thus giving a desired operating life with minimum weight. The ultimate aim of this research is to couple two key disciplines, fluids and structures, into a topology optimisation framework, which shows fast convergence for multidisciplinary optimisation problems. This is achieved by developing a bi-directional evolutionary structural optimisation algorithm that is directly coupled to the Lattice Boltzmann method, used for simulating the flow in the micro fluidic device, for the objectives of minimum compliance and maximum vorticity. The need to explore larger design spaces and to produce innovative designs makes meta-heuristic algorithms, such as genetic algorithms, particle swarms and Tabu Searches, less efficient for this task. The multidisciplinary topology optimisation framework presented in this article is shown to increase the stiffness of the structure from the datum case and produce physically acceptable designs. Furthermore, the topology optimisation method outperforms a Tabu Search algorithm in designing the baffle to maximise the mixing of the two fluids.
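
    A toy sketch of the bi-directional evolutionary structural optimisation update is given below: elements are ranked by sensitivity and the design is moved towards a target volume fraction by a small evolutionary rate each step. In the actual framework the sensitivities would be recomputed from the coupled structural and Lattice Boltzmann analyses every iteration; here they are held fixed and purely illustrative.

      import numpy as np

      def beso_update(density, sensitivity, target_frac, evo_rate=0.02):
          """One BESO step on a 0/1 element density field: move the current volume
          fraction towards target_frac by at most evo_rate, then keep the elements
          with the highest sensitivities solid."""
          current = density.mean()
          step = np.clip(target_frac, current - evo_rate, current + evo_rate)
          n_keep = int(round(step * density.size))
          threshold = np.sort(sensitivity.ravel())[::-1][n_keep - 1]
          return (sensitivity >= threshold).astype(float)

      # e.g. a hypothetical 20 x 20 design domain, removing material towards 60% volume;
      # the sensitivity field is random here and held fixed for illustration only.
      rng = np.random.default_rng(0)
      rho = np.ones((20, 20))
      sens = rng.random((20, 20))
      for _ in range(30):
          rho = beso_update(rho, sens, target_frac=0.6)
      print(rho.mean())   # approaches 0.6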

  15. Person-centred medicines optimisation policy in England: an agenda for research on polypharmacy.

    PubMed

    Heaton, Janet; Britten, Nicky; Krska, Janet; Reeve, Joanne

    2017-01-01

    Aim: To examine how patient perspectives and person-centred care values have been represented in documents on medicines optimisation policy in England. There has been growing support in England for a policy of medicines optimisation as a response to the rise of problematic polypharmacy. Conceptually, medicines optimisation differs from the medicines management model of prescribing in being based around the patient rather than processes and systems. This critical examination of current official and independent policy documents questions how central the patient is in them and whether relevant evidence has been utilised in their development. A documentary analysis of reports on medicines optimisation published by the Royal Pharmaceutical Society (RPS), The King's Fund and the National Institute for Health and Care Excellence since 2013. The analysis draws on a non-systematic review of research on patient experiences of using medicines. Findings: The reports varied in their inclusion of patient perspectives and person-centred care values, and in the extent to which they drew on evidence from research on patients' experiences of polypharmacy and medicines use. In the RPS report, medicines optimisation is represented as being a 'step change' from medicines management, in contrast to the other documents which suggest that it is facilitated by the systems and processes that comprise the latter model. Only The King's Fund report considered evidence from qualitative studies of people's use of medicines. However, these studies are not without their limitations. We suggest five ways in which researchers could improve this evidence base and so inform the development of future policy: by facilitating reviews of existing research; conducting studies of patient experiences of polypharmacy and multimorbidity; evaluating medicines optimisation interventions; making better use of relevant theories, concepts and tools; and improving patient and public involvement in research and in guideline development.

  16. Optimising health care within given budgets: primary prevention of cardiovascular disease in different regions of Sweden.

    PubMed

    Löfroth, Emil; Lindholm, Lars; Wilhelmsen, Lars; Rosén, Måns

    2006-01-01

    This study investigated the consequences of applying strict health maximisation to the choice between three different interventions with a defined budget. We analysed three interventions for preventing cardiovascular disease: doctors' advice on smoking cessation, blood-pressure-lowering drugs, and lipid-lowering drugs. A state transition model was used to estimate the cost-utility ratios for the entire population in three different county councils in Sweden, where the populations were stratified into mutually exclusive risk groups. The incremental cost-utility ratios are presented in a league table and combined with the local resources and the local epidemiological data as a proxy for need for treatment. All interventions with an incremental cost-utility ratio exceeding the threshold ratios are excluded from being funded. The threshold varied between 1687 Euro and 6192 Euro. The general reallocation of resources between the three interventions was a 60% reduction of blood-pressure-lowering drugs with redistribution of resources to advice on smoking cessation and to lipid-lowering drugs. One advantage of this method is that the results are very concrete. Recommendations can thereby be more precise, which will hopefully create a public debate between decision-makers, practising physicians and patient groups.
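
    The league-table logic described above can be sketched as follows; the programme names, costs and effects are hypothetical, and the sketch ignores the risk-group stratification and incremental comparisons used in the study.

      def allocate_budget(interventions, budget):
          """Fund interventions in order of increasing cost-utility ratio (cost per
          QALY) until the budget is exhausted; the ratio of the last funded
          intervention acts as the threshold."""
          ranked = sorted(interventions, key=lambda iv: iv["cost"] / iv["qalys"])
          funded, spent = [], 0.0
          for iv in ranked:
              if spent + iv["cost"] <= budget:
                  funded.append(iv["name"])
                  spent += iv["cost"]
          return funded, spent

      # e.g. three hypothetical prevention programmes (costs in Euro, effects in QALYs)
      programmes = [
          {"name": "smoking cessation advice", "cost": 200_000, "qalys": 120},
          {"name": "blood-pressure-lowering drugs", "cost": 900_000, "qalys": 300},
          {"name": "lipid-lowering drugs", "cost": 600_000, "qalys": 250},
      ]
      print(allocate_budget(programmes, budget=1_000_000))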

  17. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry.

    PubMed

    Schutyser, M A I; Straatsma, J; Keijzer, P M; Verschueren, M; De Jong, P

    2008-11-30

    In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. It can be applied to existing products and processes but also to reduce time to market for new products. Important aspects of the tool are its user-friendliness and its specifications customised to the needs of small dairy companies. To challenge the web-based tool it was applied for optimisation of thermal treatments in 16 dairy companies producing yoghurt, fresh cream, chocolate milk and cheese. Optimisation with WebSim-MILQ resulted in concrete improvements with respect to risk of microbial contamination, cheese yield, fouling and production costs. In this paper we illustrate the use of WebSim-MILQ for optimisation of a cheese milk pasteurisation process where we could increase the cheese yield (1 extra cheese for each 100 produced cheeses from the same amount of milk) and reduced the risk of contamination of pasteurised cheese milk with thermoresistent streptococci from critical to negligible. In another case we demonstrate the advantage for changing from an indirect to a direct heating method for a UHT process resulting in 80% less fouling, while improving product quality and maintaining product safety.

  18. A brief understanding of process optimisation in microwave-assisted extraction of botanical materials: options and opportunities with chemometric tools.

    PubMed

    Das, Anup Kumar; Mandal, Vivekananda; Mandal, Subhash C

    2014-01-01

    Extraction forms the very first step in research on natural products for drug discovery, and a poorly planned and optimised extraction methodology can jeopardise the entire mission. The objective of this review is to provide a vivid picture of the different chemometric tools and the planning involved in process optimisation and method development for the extraction of botanical materials, with emphasis on microwave-assisted extraction (MAE). A review of studies involving the application of chemometric tools in combination with MAE of botanical materials was undertaken in order to identify the significant extraction factors. To optimise a response by fine-tuning those factors, experimental design, or statistical design of experiments (DoE), which is a core area of study in chemometrics, was then used for statistical analysis and interpretation. In this review, a brief explanation of the different aspects and methodologies related to MAE of botanical materials that were subjected to experimental design, along with some general chemometric tools and the steps involved in the practice of MAE, is presented. A detailed study of the various factors and responses involved in the optimisation is also presented. This article will assist in obtaining a better insight into the chemometric strategies of process optimisation and method development, which will in turn improve the decision-making process in selecting influential extraction parameters. Copyright © 2013 John Wiley & Sons, Ltd.

  19. Implementation and comparative analysis of the optimisations produced by evolutionary algorithms for the parameter extraction of PSP MOSFET model

    NASA Astrophysics Data System (ADS)

    Hadia, Sarman K.; Thakker, R. A.; Bhatt, Kirit R.

    2016-05-01

    The study proposes an application of evolutionary algorithms, specifically an artificial bee colony (ABC), a variant ABC and particle swarm optimisation (PSO), to extract the parameters of a metal-oxide-semiconductor field-effect transistor (MOSFET) model. These algorithms are applied to the MOSFET parameter extraction problem using the PSP surface-potential model. MOSFET parameter extraction procedures involve reducing the error between measured and modelled data. This study shows that the ABC algorithm optimises the parameter values based on the intelligent foraging behaviour of honey bee swarms. Some modifications have also been applied to the basic ABC algorithm. Particle swarm optimisation is a population-based stochastic optimisation method inspired by bird flocking behaviour. The performances of these algorithms are compared with respect to the quality of the solutions. The simulation results of this study show that the PSO algorithm performs better than the variant ABC and the basic ABC algorithm for the parameter extraction of the MOSFET model; the implementation of the ABC algorithm is also shown to be simpler than that of the PSO algorithm.
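    As a concrete illustration of how such population-based extraction works, the sketch below fits two parameters of a toy square-law MOSFET current model to noisy synthetic data using a plain global-best PSO. The toy model, parameter bounds and PSO settings are assumptions made for illustration; the PSP model used in the study is far more elaborate, and this is not the authors' implementation.

```python
# Minimal PSO sketch for device parameter extraction: minimise the error
# between "measured" and modelled drain currents over two parameters.
import numpy as np

rng = np.random.default_rng(0)

def model(vgs, k, vth):                 # toy long-channel saturation current
    return k * np.maximum(vgs - vth, 0.0) ** 2

vgs = np.linspace(0.0, 1.8, 40)
measured = model(vgs, k=2.0e-4, vth=0.45) * (1 + 0.02 * rng.standard_normal(vgs.size))

def sse(p):                             # objective: sum of squared errors
    return np.sum((measured - model(vgs, *p)) ** 2)

lo, hi = np.array([1e-5, 0.1]), np.array([1e-3, 1.0])   # bounds on (k, Vth)
n, dim, w, c1, c2 = 30, 2, 0.7, 1.5, 1.5                # swarm size and PSO constants
x = rng.uniform(lo, hi, (n, dim)); v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([sse(p) for p in x])
g = pbest[np.argmin(pbest_f)]
for _ in range(200):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)   # velocity update
    x = np.clip(x + v, lo, hi)                              # keep particles in bounds
    f = np.array([sse(p) for p in x])
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    g = pbest[np.argmin(pbest_f)]
print("extracted (k, Vth):", g)
```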

  20. Reference voltage calculation method based on zero-sequence component optimisation for a regional compensation DVR

    NASA Astrophysics Data System (ADS)

    Jian, Le; Cao, Wang; Jintao, Yang; Yinge, Wang

    2018-04-01

    This paper describes the design of a dynamic voltage restorer (DVR) that can simultaneously protect several sensitive loads from voltage sags in a region of an MV distribution network. A novel reference voltage calculation method based on zero-sequence voltage optimisation is proposed for this DVR to optimise cost-effectiveness in compensation of voltage sags with different characteristics in an ungrounded neutral system. Based on a detailed analysis of the characteristics of voltage sags caused by different types of faults and the effect of the wiring mode of the transformer on these characteristics, the optimisation target of the reference voltage calculation is presented with several constraints. The reference voltages under all types of voltage sags are calculated by optimising the zero-sequence component, which can reduce the degree of swell in the phase-to-ground voltage after compensation to the maximum extent and can improve the symmetry degree of the output voltages of the DVR, thereby effectively increasing the compensation ability. The validity and effectiveness of the proposed method are verified by simulation and experimental results.

  1. Computational aero-acoustics for fan duct propagation and radiation. Current status and application to turbofan liner optimisation

    NASA Astrophysics Data System (ADS)

    Astley, R. J.; Sugimoto, R.; Mustafi, P.

    2011-08-01

    Novel techniques are presented to reduce noise from turbofan aircraft engines by optimising the acoustic treatment in engine ducts. The application of Computational Aero-Acoustics (CAA) to predict acoustic propagation and absorption in turbofan ducts is reviewed and a critical assessment of performance indicates that validated and accurate techniques are now available for realistic engine predictions. A procedure for integrating CAA methods with state of the art optimisation techniques is proposed in the remainder of the article. This is achieved by embedding advanced computational methods for noise prediction within automated and semi-automated optimisation schemes. Two different strategies are described and applied to realistic nacelle geometries and fan sources to demonstrate the feasibility of this approach for industry scale problems.

  2. The utility of estimating population-level trajectories of terminal wellbeing decline within a growth mixture modelling framework.

    PubMed

    Burns, R A; Byles, J; Magliano, D J; Mitchell, P; Anstey, K J

    2015-03-01

    Mortality-related decline has been identified across multiple domains of human functioning, including mental health and wellbeing. The current study utilised a growth mixture modelling framework to establish whether a single population-level trajectory best describes mortality-related changes in both wellbeing and mental health, or whether subpopulations report quite different mortality-related changes. Participants were older-aged (M = 69.59 years; SD = 8.08 years) deceased females (N = 1,862) from the dynamic analyses to optimise ageing (DYNOPTA) project. Growth mixture models analysed participants' responses on measures of mental health and wellbeing for up to 16 years from death. Multi-level models confirmed overall terminal decline and terminal drop in both mental health and wellbeing. However, modelling data from the same participants within a latent class growth mixture framework indicated that most participants reported stability in mental health (90.3 %) and wellbeing (89.0 %) in the years preceding death. Whilst confirming other population-level analyses which support terminal decline and drop hypotheses in both mental health and wellbeing, we subsequently identified that most of this effect is driven by a small, but significant minority of the population. Instead, most individuals report stable levels of mental health and wellbeing in the years preceding death.

  3. Is ICRP guidance on the use of reference levels consistent?

    PubMed

    Hedemann-Jensen, Per; McEwan, Andrew C

    2011-12-01

    In ICRP 103, which has replaced ICRP 60, it is stated that no fundamental changes have been introduced compared with ICRP 60. This is true except that the application of reference levels in emergency and existing exposure situations seems to be applied inconsistently, and also in the related publications ICRP 109 and ICRP 111. ICRP 103 emphasises that focus should be on the residual doses after the implementation of protection strategies in emergency and existing exposure situations. If possible, the result of an optimised protection strategy should bring the residual dose below the reference level. Thus the reference level represents the maximum acceptable residual dose after an optimised protection strategy has been implemented. It is not an 'off-the-shelf item' that can be set free of the prevailing situation. It should be determined as part of the process of optimising the protection strategy. If not, protection would be sub-optimised. However, in ICRP 103 some inconsistent concepts have been introduced, e.g. in paragraph 279 which states: 'All exposures above or below the reference level should be subject to optimisation of protection, and particular attention should be given to exposures above the reference level'. If, in fact, all exposures above and below reference levels are subject to the process of optimisation, reference levels appear superfluous. It could be considered that if optimisation of protection below a fixed reference level is necessary, then the reference level has been set too high at the outset. Up until the last phase of the preparation of ICRP 103 the concept of a dose constraint was recommended to constrain the optimisation of protection in all types of exposure situations. In the final phase, the term 'dose constraint' was changed to 'reference level' for emergency and existing exposure situations. However, it seems as if in ICRP 103 it was not fully recognised that dose constraints and reference levels are conceptually different. The use of reference levels in radiological protection is reviewed. It is concluded that the recommendations in ICRP 103 and related ICRP publications seem to be inconsistent regarding the use of reference levels in existing and emergency exposure situations.

  4. OzPythonPlex: An optimised forensic STR multiplex assay set for the Australasian carpet python (Morelia spilota).

    PubMed

    Ciavaglia, Sherryn; Linacre, Adrian

    2018-05-01

    Reptile species, and in particular snakes, are protected by national and international agreements yet are commonly handled illegally. To aid in the enforcement of such legislation, we report on the development of three 11-plex assays from the genome of the carpet python to type 24 loci of tetra-nucleotide and penta-nucleotide repeat motifs (pure, compound and complex included). The loci range in size between 70 and 550 bp. Seventeen of the loci are newly characterised, with the inclusion of seven previously developed loci to facilitate cross-comparison with previous carpet python genotyping studies. Assays were optimised in accordance with human forensic profiling kits using one nanogram of template DNA. Three loci are included in all three of the multiplex reactions as quality assurance markers, to ensure sample identity and genotyping accuracy are maintained across the three profiling assays. Allelic ladders have been developed for the three assays to ensure consistent and precise allele designation. A DNA reference database of allele frequencies is presented based on 249 samples collected from throughout the species' native range. A small number of validation tests are conducted to demonstrate the utility of these multiplex assays. We suggest further appropriate validation tests that should be conducted prior to the application of the multiplex assays in criminal investigations involving carpet pythons. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Employing Solid Phase Microextraction as Extraction Tool for Pesticide Residues in Traditional Medicinal Plants

    PubMed Central

    Gondo, Thamani T.; Mmualefe, Lesego C.; Okatch, Harriet

    2016-01-01

    HS-SPME was optimised using a blank plant sample for the analysis of organochlorine pesticides (OCPs) of varying polarities in selected medicinal plants obtained from the northern part of Botswana, where OCPs such as DDT and endosulfan have historically been applied to control disease-carrying vectors (mosquitoes and tsetse flies). The optimised SPME parameters were used to isolate analytes from root samples of five medicinal plants obtained from Maun and Kasane, Botswana. The final analyte determination was performed by gas chromatography with electron capture detection (GC-ECD), and analyte identity was confirmed by electron ionisation mass spectrometry (GC-MS). Dieldrin was the only pesticide detected, and it was confirmed by MS in the Terminalia sericea sample obtained from Kasane. The method was validated and the analyte recoveries ranged from 69.58 ± 7.20 to 113 ± 15.44%, with RSDs ranging from 1.19 to 17.97%. The method indicated good linearity (R² > 0.9900) in the range of 2 to 100 ng g⁻¹. The method also proved to be sensitive, with low limits of detection (LODs) ranging from 0.48 ± 0.16 to 1.50 ± 0.50 ng g⁻¹. It can be concluded that SPME was successfully utilised as a sampling and extraction tool for pesticides of diverse polarities in root samples of medicinal plants. PMID:27725893

  6. Optimisation of Embryonic and Larval ECG Measurement in Zebrafish for Quantifying the Effect of QT Prolonging Drugs

    PubMed Central

    Dhillon, Sundeep Singh; Dóró, Éva; Magyary, István; Egginton, Stuart; Sík, Attila; Müller, Ferenc

    2013-01-01

    Effective chemical compound toxicity screening is of paramount importance for safe cardiac drug development. Using mammals in preliminary screening for the detection of cardiac dysfunction by electrocardiography (ECG) is costly and requires a large number of animals. Alternatively, zebrafish embryos can be used, as the ECG waveform is similar to that of mammals, a minimal amount of chemical is necessary for drug testing, and embryos are abundant, inexpensive and represent a replacement in animal research with reduced bioethical concerns. We demonstrate here the utility of pre-feeding-stage zebrafish larvae in the detection of cardiac dysfunction by electrocardiography. We have optimised an ECG recording system by addressing key parameters such as the form of immobilisation, recording temperature, electrode positioning and developmental age. Furthermore, analysis of 3 days post-fertilisation (dpf) zebrafish embryos treated with known QT-prolonging drugs such as terfenadine, verapamil and haloperidol led to reproducible detection of QT prolongation, as previously shown for adult zebrafish. In addition, calculation of Z-factor scores revealed that the assay was sensitive and specific enough to detect large drug-induced changes in QTc intervals. Thus, the ECG recording system is a useful drug-screening tool to detect alterations to cardiac cycle components and secondary effects such as heart block and arrhythmias in zebrafish larvae before the free-feeding stage, and it provides a suitable replacement for mammalian experimentation. PMID:23579446

  7. Comparison of two optimization algorithms for fuzzy finite element model updating for damage detection in a wind turbine blade

    NASA Astrophysics Data System (ADS)

    Turnbull, Heather; Omenzetter, Piotr

    2018-03-01

    Difficulties associated with current health monitoring and inspection practices, combined with the harsh, often remote, operational environments of wind turbines, highlight the requirement for a non-destructive evaluation system capable of remotely monitoring the current structural state of turbine blades. This research adopted a physics-based structural health monitoring methodology through calibration of a finite element model using inverse techniques. A 2.36 m blade from a 5 kW turbine was used as an experimental specimen, with operational modal analysis techniques utilised to obtain the modal properties of the system. Modelling the experimental responses as fuzzy numbers using the sub-level technique, uncertainty in the response parameters was propagated back through the model and into the updating parameters. Initially, experimental responses of the blade were obtained, and a numerical model of the blade was created and updated. Deterministic updating was carried out through formulation and minimisation of a deterministic objective function using both the firefly algorithm and the virus optimisation algorithm. Uncertainty in the experimental responses was modelled using triangular membership functions, allowing membership functions of the updating parameters (Young's modulus and shear modulus) to be obtained. The firefly algorithm and virus optimisation algorithm were again utilised, this time in the solution of fuzzy objective functions. This enabled the uncertainty associated with the updating parameters to be quantified. Varying damage location and severity were simulated experimentally through the addition of small masses to the structure, intended to cause a structural alteration. A damaged model was created, modelling four variable-magnitude non-structural masses at predefined points, and updated to provide a deterministic damage prediction and information on the uncertainty of the parameters via fuzzy updating.

  8. Whole-brain high in-plane resolution fMRI using accelerated EPIK for enhanced characterisation of functional areas at 3T

    PubMed Central

    Yun, Seong Dae

    2017-01-01

    The relatively high imaging speed of EPI has led to its widespread use in dynamic MRI studies such as functional MRI. An approach to improve the performance of EPI, EPI with Keyhole (EPIK), has been previously presented and its use in fMRI was verified at 1.5T as well as 3T. The method has been proven to achieve a higher temporal resolution and smaller image distortions when compared to single-shot EPI. Furthermore, the performance of EPIK in the detection of functional signals was shown to be comparable to that of EPI. For these reasons, we were motivated to employ EPIK here for high-resolution imaging. The method was optimised to offer the highest possible in-plane resolution and slice coverage under the given imaging constraints: fixed TR/TE, FOV and acceleration factors for parallel imaging and partial Fourier techniques. The performance of EPIK was evaluated in direct comparison to the optimised protocol obtained from EPI. The two imaging methods were applied to visual fMRI experiments involving sixteen subjects. The results showed that enhanced spatial resolution with a whole-brain coverage was achieved by EPIK (1.00 mm × 1.00 mm; 32 slices) when compared to EPI (1.25 mm × 1.25 mm; 28 slices). As a consequence, enhanced characterisation of functional areas has been demonstrated in EPIK particularly for relatively small brain regions such as the lateral geniculate nucleus (LGN) and superior colliculus (SC); overall, a significantly increased t-value and activation area were observed from EPIK data. Lastly, the use of EPIK for fMRI was validated with the simulation of different types of data reconstruction methods. PMID:28945780

  9. A new early cognitive screening measure to detect cognitive side-effects of electroconvulsive therapy?

    PubMed

    Martin, Donel M; Katalinic, Natalie; Ingram, Anna; Schweitzer, Isaac; Smith, Deidre J; Hadzi-Pavlovic, Dusan; Loo, Colleen K

    2013-12-01

    Cognitive side-effects from electroconvulsive therapy (ECT) can be distressing for patients, and early detection may have an important role in guiding treatment decisions over the ECT course. This prospective study examined the utility of an early cognitive screening battery for predicting cognitive side-effects that develop later in the ECT course. The screening battery, together with the Mini-Mental State Examination (MMSE), was administered to 123 patients at baseline and after 3 ECT treatments. A more detailed cognitive battery was administered at baseline, after six treatments (post ECT 6) and after the last ECT treatment (post treatment) to assess cognitive side-effects across several domains: global cognition, anterograde memory, executive function, speed and concentration, and retrograde memory. Multivariate analyses examined the predictive utility of change on items from the screening battery for later cognitive changes at post ECT 6 and post treatment. Results showed that changes on a combination of items from the screening battery were predictive of later cognitive changes at post treatment, particularly for anterograde memory (p < 0.01), after controlling for patient and treatment factors. Change on the MMSE predicted cognitive changes at post ECT 6 but not at post treatment. A scoring method for the new screening battery was tested for discriminative ability in a sub-sample of patients. This study provides preliminary evidence that a simple and easy-to-administer measure may potentially be used to help guide clinical treatment decisions to optimise efficacy and cognitive outcomes. Further development of this measure and validation in a more representative ECT clinical population is required. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Optimisation of a Generic Ionic Model of Cardiac Myocyte Electrical Activity

    PubMed Central

    Guo, Tianruo; Al Abed, Amr; Lovell, Nigel H.; Dokos, Socrates

    2013-01-01

    A generic cardiomyocyte ionic model, whose complexity lies between a simple phenomenological formulation and a biophysically detailed ionic membrane current description, is presented. The model provides a user-defined number of ionic currents, employing two-gate Hodgkin-Huxley type kinetics. Its generic nature allows accurate reconstruction of action potential waveforms recorded experimentally from a range of cardiac myocytes. Using a multiobjective optimisation approach, the generic ionic model was optimised to accurately reproduce multiple action potential waveforms recorded from central and peripheral sinoatrial nodes and right atrial and left atrial myocytes from rabbit cardiac tissue preparations, under different electrical stimulus protocols and pharmacological conditions. When fitted simultaneously to multiple datasets, the time course of several physiologically realistic ionic currents could be reconstructed. Model behaviours tend to be well identified when extra experimental information is incorporated into the optimisation. PMID:23710254

  11. Load-sensitive dynamic workflow re-orchestration and optimisation for faster patient healthcare.

    PubMed

    Meli, Christopher L; Khalil, Ibrahim; Tari, Zahir

    2014-01-01

    Hospital waiting times are long, with no signs of reducing any time soon. A number of factors, including population growth, the ageing population and a lack of new infrastructure, are expected to further exacerbate waiting times in the near future. In this work, we show how healthcare services can be modelled as queueing nodes, together with healthcare service workflows, such that these workflows can be optimised during execution in order to reduce patient waiting times. Services such as X-ray, computed tomography and magnetic resonance imaging often form queues; thus, by taking into account the waiting times of each service, the workflow can be re-orchestrated and optimised. Experimental results indicate that average waiting time reductions are achievable by optimising workflows using dynamic re-orchestration. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.
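    To make the queueing-node framing concrete, the sketch below models each imaging service as an M/M/1 queue and routes each workflow step to the interchangeable provider with the lowest expected sojourn time 1/(mu - lambda). The provider names, arrival rates and service rates are invented for illustration, and this simplification is not the paper's re-orchestration algorithm.

```python
# Hedged sketch of load-sensitive routing across queueing nodes.
def expected_sojourn(arrival_rate, service_rate):
    """M/M/1 expected time in system (queueing + service); requires lambda < mu."""
    if arrival_rate >= service_rate:
        return float("inf")              # unstable queue: never route here
    return 1.0 / (service_rate - arrival_rate)

# current load and capacity (patients per hour) per hypothetical provider
providers = {
    "x-ray": [("xray-room-1", 5.0, 6.0), ("xray-room-2", 2.0, 4.0)],
    "ct":    [("ct-scanner-1", 3.5, 4.0), ("ct-scanner-2", 1.0, 3.0)],
    "mri":   [("mri-1", 1.5, 2.0)],
}

workflow = ["x-ray", "ct", "mri"]        # ordered steps of one patient's pathway

total = 0.0
for step in workflow:
    name, t = min(
        ((p, expected_sojourn(lam, mu)) for p, lam, mu in providers[step]),
        key=lambda pair: pair[1],
    )
    total += t
    print(f"{step}: route to {name}, expected {t:.2f} h")
print(f"expected pathway time: {total:.2f} h")
```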

  12. Robustness analysis of bogie suspension components Pareto optimised values

    NASA Astrophysics Data System (ADS)

    Mousavi Bideleh, Seyed Milad

    2017-08-01

    The bogie suspension system of high-speed trains can significantly affect vehicle performance. Multiobjective optimisation problems are often formulated and solved to find the Pareto optimised values of the suspension components and improve cost efficiency in railway operations from different perspectives. Uncertainties in the design parameters of the suspension system can negatively influence the dynamic behaviour of railway vehicles. In this regard, robustness analysis of the bogie dynamic response with respect to uncertainties in the suspension design parameters is considered. A one-car railway vehicle model with 50 degrees of freedom and wear/comfort Pareto optimised values of bogie suspension components is chosen for the analysis. Longitudinal and lateral primary stiffnesses, longitudinal and vertical secondary stiffnesses, as well as yaw damping are considered as the five design parameters. The effects of parameter uncertainties on wear, ride comfort, track shift force, stability, and risk of derailment are studied by varying the design parameters around their respective Pareto optimised values according to a lognormal distribution with different coefficients of variation (COVs). The robustness analysis is carried out based on the maximum entropy concept. The multiplicative dimensional reduction method is utilised to simplify the calculation of fractional moments and improve the computational efficiency. The results showed that the dynamic response of the vehicle with wear/comfort Pareto optimised values of bogie suspension is robust against uncertainties in the design parameters, and the probability of failure is small for parameter uncertainties with COV up to 0.1.

  13. A new empirical potential energy function for Ar2

    NASA Astrophysics Data System (ADS)

    Myatt, Philip T.; Dham, Ashok K.; Chandrasekhar, Pragna; McCourt, Frederick R. W.; Le Roy, Robert J.

    2018-06-01

    A critical re-analysis of all available spectroscopic and virial coefficient data for Ar2 has been used to determine an improved empirical analytic potential energy function that has been 'tuned' to optimise its agreement with viscosity, diffusion and thermal diffusion data, and whose short-range behaviour is in reasonably good agreement with the most recent ab initio calculations for this system. The recommended Morse/long-range potential function is smooth and differentiable at all distances, and incorporates both the correct theoretically predicted long-range behaviour and the correct limiting short-range functional behaviour. The resulting value of the well depth is ? cm⁻¹ and the associated equilibrium distance is r_e = 3.766 (±0.002) Å, while the ⁴⁰Ar s-wave scattering length is -714 Å.

  14. Reservoir optimisation using El Niño information. Case study of Daule Peripa (Ecuador)

    NASA Astrophysics Data System (ADS)

    Gelati, Emiliano; Madsen, Henrik; Rosbjerg, Dan

    2010-05-01

    The optimisation of water resources systems requires the ability to produce runoff scenarios that are consistent with available climatic information. We approach stochastic runoff modelling with a Markov-modulated autoregressive model with exogenous input, which belongs to the class of Markov-switching models. The model assumes runoff parameterisation to be conditioned on a hidden climatic state following a Markov chain, whose state transition probabilities depend on climatic information. This approach allows stochastic modelling of non-stationary runoff, as runoff anomalies are described by a mixture of autoregressive models with exogenous input, each one corresponding to a climate state. We calibrate the model on the inflows of the Daule Peripa reservoir located in western Ecuador, where the occurrence of El Niño leads to anomalously heavy rainfall caused by positive sea surface temperature anomalies along the coast. El Niño-Southern Oscillation (ENSO) information is used to condition the runoff parameterisation. Inflow predictions are realistic, especially at the occurrence of El Niño events. The Daule Peripa reservoir serves a hydropower plant and a downstream water supply facility. Using historical ENSO records, synthetic monthly inflow scenarios are generated for the period 1950-2007. These scenarios are used as input to perform stochastic optimisation of the reservoir rule curves with a multi-objective Genetic Algorithm (MOGA). The optimised rule curves are assumed to be the reservoir base policy. ENSO standard indices are currently forecasted at a monthly time scale with a nine-month lead time. These forecasts are used to perform stochastic optimisation of reservoir releases at each monthly time step according to the following procedure: (i) nine-month inflow forecast scenarios are generated using ENSO forecasts; (ii) a MOGA is set up to optimise the upcoming nine monthly releases; (iii) the optimisation is carried out by simulating the releases on the inflow forecasts, and by applying the base policy on a subsequent synthetic inflow scenario in order to account for long-term costs; (iv) the optimised release for the first month is implemented; (v) the state of the system is updated and (i), (ii), (iii), and (iv) are iterated for the following time step. The results highlight the advantages of using a climate-driven stochastic model to produce inflow scenarios and forecasts for reservoir optimisation, showing potential improvements with respect to the current management. Dynamic programming was used to find the best possible release time series given the inflow observations, in order to benchmark any possible operational improvement.
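    The rolling procedure (i)-(v) can be illustrated with a stripped-down, single-objective stand-in for the MOGA. In the sketch below, the inflow forecasts, storage dynamics, bounds and cost terms are all toy assumptions, and only the first optimised release of each horizon is implemented before rolling forward, mirroring the structure (but not the detail) of the approach described above.

```python
# Receding-horizon reservoir release sketch (toy numbers, not Daule Peripa data).
import numpy as np
from scipy.optimize import minimize

H, S_MAX, DEMAND = 9, 5000.0, 300.0              # horizon (months), storage cap, target release

def cost(releases, s0, inflows):
    """Toy mass balance; penalises squared supply deficits and spills over the horizon."""
    s, total = s0, 0.0
    for r, q in zip(releases, inflows):
        s = s + q - r
        spill = max(s - S_MAX, 0.0)
        s = min(max(s, 0.0), S_MAX)
        total += max(DEMAND - r, 0.0) ** 2 + 0.1 * spill
    return total

def first_release(s0, inflow_forecast):
    """Optimise H monthly releases against one forecast scenario; implement month 1 only."""
    res = minimize(cost, np.full(H, DEMAND), args=(s0, inflow_forecast),
                   bounds=[(0.0, 800.0)] * H, method="L-BFGS-B")
    return res.x[0]

rng = np.random.default_rng(1)
storage = 3000.0
for month in range(6):                            # receding-horizon loop
    forecast = np.maximum(250.0 + 80.0 * rng.standard_normal(H), 0.0)  # ENSO-conditioned in the paper
    release = first_release(storage, forecast)
    observed_inflow = max(250.0 + 80.0 * rng.standard_normal(), 0.0)
    storage = min(max(storage + observed_inflow - release, 0.0), S_MAX)
    print(f"month {month}: release {release:.0f}, storage {storage:.0f}")
```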

  15. On the dynamic rounding-off in analogue and RF optimal circuit sizing

    NASA Astrophysics Data System (ADS)

    Kotti, Mouna; Fakhfakh, Mourad; Fino, Maria Helena

    2014-04-01

    Frequently used approaches to solving discrete multivariable optimisation problems consist of computing solutions using a continuous optimisation technique. Then, using heuristics, the variables are rounded off to their nearest available discrete values to obtain a discrete solution. Indeed, in many engineering problems, and particularly in analogue circuit design, component values, such as the geometric dimensions of the transistors, the number of fingers in an integrated capacitor or the number of turns in an integrated inductor, cannot be chosen arbitrarily since they have to obey some technology sizing constraints. However, rounding off the variable values a posteriori can lead to infeasible solutions (solutions that are located too close to the feasible solution frontier) or degradation of the obtained results (expulsion from the neighbourhood of a 'sharp' optimum), depending on how the added perturbation affects the solution. Discrete optimisation techniques, such as the dynamic rounding-off (DRO) technique, are therefore needed to overcome the previously mentioned situation. In this paper, we deal with an improvement of the DRO technique. We propose a particle swarm optimisation (PSO)-based DRO technique, and we show, via some analogue and RF examples, the necessity of implementing such a routine within continuous optimisation algorithms.
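    A minimal sketch of the rounding-at-evaluation idea follows: every candidate is snapped to the nearest allowed discrete value (a width grid and an integer finger count, both invented for illustration) before the objective is computed, so the search only ever compares manufacturable designs. The toy objective and the simple stochastic search stand in for the paper's PSO-based DRO routine.

```python
# Dynamic rounding sketch: snap to the discrete grid at every evaluation,
# rather than rounding a continuous optimum once at the end.
import numpy as np

W_GRID = np.arange(0.2, 10.01, 0.05)      # allowed transistor widths (um), hypothetical
FINGERS = np.arange(1, 33)                # allowed number of fingers (integers)

def snap(value, grid):
    """Return the grid point nearest to a continuous candidate value."""
    return grid[np.argmin(np.abs(grid - value))]

def objective(x):
    w = snap(x[0], W_GRID)                # dynamic rounding at evaluation time
    nf = snap(x[1], FINGERS)
    return (w * nf - 48.0) ** 2 + 0.01 * nf   # toy sizing cost

rng = np.random.default_rng(2)
best_x, best_f = None, np.inf
x = np.array([5.0, 8.0])
for _ in range(2000):                     # simple stochastic local search
    cand = x + rng.normal(0.0, [0.3, 1.0])
    cand = np.clip(cand, [W_GRID[0], FINGERS[0]], [W_GRID[-1], FINGERS[-1]])
    f = objective(cand)
    if f < best_f:
        best_x, best_f, x = cand, f, cand
print("snapped optimum:", snap(best_x[0], W_GRID), int(snap(best_x[1], FINGERS)), best_f)
```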

  16. PGA/MOEAD: a preference-guided evolutionary algorithm for multi-objective decision-making problems with interval-valued fuzzy preferences

    NASA Astrophysics Data System (ADS)

    Luo, Bin; Lin, Lin; Zhong, ShiSheng

    2018-02-01

    In this research, we propose a preference-guided optimisation algorithm for multi-criteria decision-making (MCDM) problems with interval-valued fuzzy preferences. First, the interval-valued fuzzy preferences are decomposed into a series of precise and evenly distributed preference vectors (reference directions) for the objectives to be optimised, on the basis of a uniform design strategy. Then the preference information is further incorporated into the preference vectors using the boundary intersection approach, while the MCDM problem with interval-valued fuzzy preferences is reformulated into a series of single-objective optimisation sub-problems (each sub-problem corresponding to a decomposed preference vector). Finally, a preference-guided optimisation algorithm based on MOEA/D (multi-objective evolutionary algorithm based on decomposition) is proposed to solve the sub-problems in a single run. The proposed algorithm incorporates the preference vectors within the optimisation process to guide the search procedure towards a more promising subset of the efficient solutions matching the interval-valued fuzzy preferences. Numerous test instances and an engineering application are employed to validate the performance of the proposed algorithm, and the results demonstrate its effectiveness and feasibility.

  17. Optimisation of active suspension control inputs for improved performance of active safety systems

    NASA Astrophysics Data System (ADS)

    Čorić, Mirko; Deur, Joško; Xu, Li; Tseng, H. Eric; Hrovat, Davor

    2018-01-01

    A collocation-type control variable optimisation method is used to investigate the extent to which the fully active suspension (FAS) can be applied to improve the vehicle electronic stability control (ESC) performance and reduce the braking distance. First, the optimisation approach is applied to the scenario of vehicle stabilisation during the sine-with-dwell manoeuvre. The results are used to provide insights into different FAS control mechanisms for vehicle performance improvements related to responsiveness and yaw rate error reduction indices. The FAS control performance is compared to performances of the standard ESC system, optimal active brake system and combined FAS and ESC configuration. Second, the optimisation approach is employed to the task of FAS-based braking distance reduction for straight-line vehicle motion. Here, the scenarios of uniform and longitudinally or laterally non-uniform tyre-road friction coefficient are considered. The influences of limited anti-lock braking system (ABS) actuator bandwidth and limit-cycle ABS behaviour are also analysed. The optimisation results indicate that the FAS can provide competitive stabilisation performance and improved agility when compared to the ESC system, and that it can reduce the braking distance by up to 5% for distinctively non-uniform friction conditions.

  18. A New Multiconstraint Method for Determining the Optimal Cable Stresses in Cable-Stayed Bridges

    PubMed Central

    Asgari, B.; Osman, S. A.; Adnan, A.

    2014-01-01

    Cable-stayed bridges are one of the most popular types of long-span bridges. The structural behaviour of cable-stayed bridges is sensitive to the load distribution between the girder, pylons, and cables. The determination of pretensioning cable stresses is critical in the cable-stayed bridge design procedure. By finding the optimum stresses in cables, the load and moment distribution of the bridge can be improved. In recent years, different research works have studied iterative and modern methods to find optimum stresses of cables. However, most of the proposed methods have limitations in optimising the structural performance of cable-stayed bridges. This paper presents a multiconstraint optimisation method to specify the optimum cable forces in cable-stayed bridges. The proposed optimisation method produces less bending moments and stresses in the bridge members and requires shorter simulation time than other proposed methods. The results of comparative study show that the proposed method is more successful in restricting the deck and pylon displacements and providing uniform deck moment distribution than unit load method (ULM). The final design of cable-stayed bridges can be optimised considerably through proposed multiconstraint optimisation method. PMID:25050400

  19. A new multiconstraint method for determining the optimal cable stresses in cable-stayed bridges.

    PubMed

    Asgari, B; Osman, S A; Adnan, A

    2014-01-01

    Cable-stayed bridges are one of the most popular types of long-span bridges. The structural behaviour of cable-stayed bridges is sensitive to the load distribution between the girder, pylons, and cables. The determination of pretensioning cable stresses is critical in the cable-stayed bridge design procedure. By finding the optimum stresses in cables, the load and moment distribution of the bridge can be improved. In recent years, different research works have studied iterative and modern methods to find optimum stresses of cables. However, most of the proposed methods have limitations in optimising the structural performance of cable-stayed bridges. This paper presents a multiconstraint optimisation method to specify the optimum cable forces in cable-stayed bridges. The proposed optimisation method produces less bending moments and stresses in the bridge members and requires shorter simulation time than other proposed methods. The results of comparative study show that the proposed method is more successful in restricting the deck and pylon displacements and providing uniform deck moment distribution than unit load method (ULM). The final design of cable-stayed bridges can be optimised considerably through proposed multiconstraint optimisation method.

  20. Optimisation and validation of a rapid and efficient microemulsion liquid chromatographic (MELC) method for the determination of paracetamol (acetaminophen) content in a suppository formulation.

    PubMed

    McEvoy, Eamon; Donegan, Sheila; Power, Joe; Altria, Kevin

    2007-05-09

    A rapid and efficient oil-in-water microemulsion liquid chromatographic method has been optimised and validated for the analysis of paracetamol in a suppository formulation. Excellent linearity, accuracy, precision and assay results were obtained. Lengthy sample pre-treatment/extraction procedures were eliminated due to the solubilising power of the microemulsion, and rapid analysis times were achieved. The method was optimised to achieve rapid analysis times and relatively high peak efficiencies. A standard microemulsion composition of 33 g SDS, 66 g butan-1-ol and 8 g n-octane in 1 l of 0.05% TFA modified with acetonitrile has been shown to be suitable for the rapid analysis of paracetamol in highly hydrophobic preparations under isocratic conditions. Validated assay results and the overall analysis time of the optimised method were compared to British Pharmacopoeia reference methods. Sample preparation and analysis times for the MELC analysis of paracetamol in a suppository were extremely rapid compared to the reference method, and similar assay results were achieved. A gradient MELC method using the same microemulsion has been optimised for the resolution of paracetamol and five of its related substances in approximately 7 min.

  1. Optimisation of the supercritical extraction of toxic elements in fish oil.

    PubMed

    Hajeb, P; Jinap, S; Shakibazadeh, Sh; Afsah-Hejri, L; Mohebbi, G H; Zaidul, I S M

    2014-01-01

    This study aims to optimise the operating conditions for the supercritical fluid extraction (SFE) of toxic elements from fish oil. The SFE operating parameters of pressure, temperature, CO₂ flow rate and extraction time were optimised using a central composite design (CCD) of response surface methodology (RSM). High coefficients of determination (R²) (0.897-0.988) for the predicted response surface models confirmed a satisfactory adjustment of the polynomial regression models to the operating conditions. The results showed that the linear and quadratic terms of pressure and temperature were the most significant (p < 0.05) variables affecting the overall responses. The optimum conditions for the simultaneous elimination of toxic elements comprised a pressure of 61 MPa, a temperature of 39.8 °C, a CO₂ flow rate of 3.7 ml min⁻¹ and an extraction time of 4 h. These optimised SFE conditions were able to produce fish oil with the contents of lead, cadmium, arsenic and mercury reduced by up to 98.3%, 96.1%, 94.9% and 93.7%, respectively. The fish oil extracted under the optimised SFE operating conditions was of good quality in terms of its fatty acid constituents.

  2. A statistical approach to determine fluxapyroxad and its three metabolites in soils, sediment and sludge based on a combination of chemometric tools and a modified quick, easy, cheap, effective, rugged and safe method.

    PubMed

    Li, Shasha; Liu, Xingang; Zhu, Yulong; Dong, Fengshou; Xu, Jun; Li, Minmin; Zheng, Yongquan

    2014-09-05

    An effective method for the quantification of fluxapyroxad and its three metabolites in soils, sediment and sludge was developed using ultra-high-performance liquid chromatography coupled with tandem mass spectrometry (UHPLC-MS/MS). Both the extraction and clean-up steps of the QuEChERS procedure were optimised using a chemometric tool, which was expected to facilitate rapid analysis with minimal procedures. Several operating parameters (MeCN/acetic acid ratio in the extraction solution (i.e., acetic acid percentage), water volume, extraction time, PSA amount, C18 amount, and GCB amount) were investigated using a Plackett-Burman (P-B) screening design. Afterwards, the significant factors obtained (acetic acid percentage, water volume, and PSA amount) were optimised using a central composite design (CCD) combined with the desirability function (DF) to determine the optimum experimental conditions. The optimised procedure provides high-level linearity for all studied compounds, with correlation coefficients ranging between 0.9972 and 0.9999. The detection limits were in the range of 0.1 to 1.0 μg/kg and the limits of quantitation (LOQs) were between 0.5 and 3.4 μg/kg, with relative standard deviations (RSDs) between 2.3% and 9.6% (n = 6). Therefore, the developed protocol can serve as a simple and sensitive tool for monitoring fluxapyroxad and its three metabolites in soil, sediment and sludge samples. Copyright © 2014 Elsevier B.V. All rights reserved.
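    For readers unfamiliar with these designs, the sketch below builds a face-centred central composite design for three coded factors and fits a full quadratic response surface by least squares. The factor names follow the abstract, but the simulated "recovery" response and the coded ranges are placeholders rather than the published data, and the screening and desirability steps are omitted.

```python
# Face-centred CCD plus quadratic response-surface fit (illustrative only).
import itertools
import numpy as np

rng = np.random.default_rng(0)

# coded levels: 2^3 factorial corners, face-centred axial points, one centre point
corners = np.array(list(itertools.product([-1.0, 1.0], repeat=3)))
axial = np.vstack([np.eye(3), -np.eye(3)])
design = np.vstack([corners, axial, np.zeros((1, 3))])   # 8 + 6 + 1 = 15 runs

def run_experiment(x):
    """Stand-in for a real extraction run (coded: acid %, water volume, PSA amount)."""
    a, w, p = x
    return 90 - 3*a**2 - 2*w**2 - 4*p**2 + 1.5*a*w + rng.normal(0, 0.3)

y = np.array([run_experiment(row) for row in design])

def quad_terms(x):
    a, w, p = x
    return [1, a, w, p, a*w, a*p, w*p, a*a, w*w, p*p]     # full quadratic model

X = np.array([quad_terms(row) for row in design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted response-surface coefficients:", np.round(coef, 2))
```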

  3. Selecting a climate model subset to optimise key ensemble properties

    NASA Astrophysics Data System (ADS)

    Herger, Nadja; Abramowitz, Gab; Knutti, Reto; Angélil, Oliver; Lehmann, Karsten; Sanderson, Benjamin M.

    2018-02-01

    End users studying impacts and risks caused by human-induced climate change are often presented with large multi-model ensembles of climate projections whose composition and size are arbitrarily determined. An efficient and versatile method that finds a subset which maintains certain key properties from the full ensemble is needed, but very little work has been done in this area. Therefore, users typically make their own somewhat subjective subset choices and commonly use the equally weighted model mean as a best estimate. However, different climate model simulations cannot necessarily be regarded as independent estimates due to the presence of duplicated code and shared development history. Here, we present an efficient and flexible tool that makes better use of the ensemble as a whole by finding a subset with improved mean performance compared to the multi-model mean while at the same time maintaining the spread and addressing the problem of model interdependence. Out-of-sample skill and reliability are demonstrated using model-as-truth experiments. This approach is illustrated with one set of optimisation criteria but we also highlight the flexibility of cost functions, depending on the focus of different users. The technique is useful for a range of applications that, for example, minimise present-day bias to obtain an accurate ensemble mean, reduce dependence in ensemble spread, maximise future spread, ensure good performance of individual models in an ensemble, reduce the ensemble size while maintaining important ensemble characteristics, or optimise several of these at the same time. As in any calibration exercise, the final ensemble is sensitive to the metric, observational product, and pre-processing steps used.
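    A toy version of the subset-selection idea is sketched below: for a small synthetic "ensemble", a brute-force search picks the member subset whose mean has the lowest RMSE against pseudo-observations while keeping the subset spread close to that of the full ensemble. The data, cost weights and ensemble size are placeholders; the published tool additionally handles model interdependence and scales to real ensembles where exhaustive search would be infeasible.

```python
# Brute-force illustration of performance-plus-spread subset selection.
import itertools
import numpy as np

rng = np.random.default_rng(3)
n_models, n_grid = 10, 200
obs = rng.standard_normal(n_grid)                          # pseudo-observations
models = obs + 0.5 * rng.standard_normal((n_models, n_grid)) \
             + rng.normal(0, 0.3, (n_models, 1))           # biased pseudo-models

full_spread = models.std(axis=0).mean()

def cost(idx):
    sub = models[list(idx)]
    rmse = np.sqrt(np.mean((sub.mean(axis=0) - obs) ** 2))        # mean performance
    spread_penalty = abs(sub.std(axis=0).mean() - full_spread)    # keep the spread
    return rmse + spread_penalty            # equal weights; a real cost function is user-defined

best = min(
    (c for k in range(3, 7) for c in itertools.combinations(range(n_models), k)),
    key=cost,
)
print("selected members:", best, "cost:", round(cost(best), 3))
```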

  4. Perspectives on hand function in girls and women with Rett syndrome.

    PubMed

    Downs, Jenny; Parkinson, Stephanie; Ranelli, Sonia; Leonard, Helen; Diener, Pamela; Lotan, Meir

    2014-06-01

    Rett syndrome is a rare neurodevelopmental disorder that is usually associated with a mutation on the X-linked MECP2 gene. Hand function is particularly affected and we discuss theoretical and practical perspectives for optimising hand function in Rett syndrome. We reviewed the literature pertaining to hand function and stereotypies in Rett syndrome and developed a toolkit for their assessment and treatment. There is little published information on management of hand function in Rett syndrome. We suggest assessment and treatment strategies based on available literature, clinical experience and grounded in theories of motor control and motor learning. Additional studies are needed to determine the best treatments for hand function in Rett syndrome. Meanwhile, clinical needs can be addressed by supplementing the evidence base with an understanding of the complexities of Rett syndrome, clinical experience, environmental enrichment animal studies and theories of motor control and motor learning.

  5. Functional Land Management: Bridging the Think-Do-Gap using a multi-stakeholder science policy interface.

    PubMed

    O'Sullivan, Lilian; Wall, David; Creamer, Rachel; Bampa, Francesca; Schulte, Rogier P O

    2018-03-01

    Functional Land Management (FLM) is proposed as an integrator for sustainability policies and assesses the functional capacity of the soil and land to deliver primary productivity, water purification and regulation, carbon cycling and storage, habitat for biodiversity and recycling of nutrients. This paper presents the catchment challenge as a method to bridge the gap between science, stakeholders and policy for the effective management of soils to deliver these functions. Two challenges were completed by a wide range of stakeholders focused around a physical catchment model: (1) to design an optimised catchment based on soil function targets, and (2) to identify gaps in the implementation of the proposed design. In challenge 1, a high level of consensus between different stakeholders emerged on the soil and management measures to be implemented to achieve soil function targets. Key gaps, including knowledge, a mix of market and voluntary incentives, and mandatory measures, were identified in challenge 2.

  6. Evolving aerodynamic airfoils for wind turbines through a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Hernández, J. J.; Gómez, E.; Grageda, J. I.; Couder, C.; Solís, A.; Hanotel, C. L.; Ledesma, JI

    2017-01-01

    Nowadays, genetic algorithms stand out for airfoil optimisation, due to the virtues of their mutation and crossover techniques. In this work we propose a genetic algorithm with arithmetic crossover rules. The optimisation criteria are taken to be the maximisation of both aerodynamic efficiency and lift coefficient, while minimising the drag coefficient. Such an algorithm shows great improvements in computational cost, as well as high performance, obtaining airfoils optimised for Mexico City's specific wind conditions from generic wind turbines designed for higher Reynolds numbers in only a few iterations.
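    The sketch below shows the arithmetic crossover mechanism in a minimal real-coded genetic algorithm: each offspring is a random convex combination of two parents, followed by a small Gaussian mutation. A generic quadratic test function stands in for the aerodynamic evaluation (which in the study comes from airfoil lift and drag computations), and the population size, generation count and rates are arbitrary choices, not the authors' settings.

```python
# Minimal real-coded GA with arithmetic crossover (illustrative stand-in objective).
import numpy as np

rng = np.random.default_rng(4)
DIM, POP, GEN = 6, 40, 150                 # e.g. 6 shape parameters describing an airfoil

def fitness(x):                            # placeholder for -efficiency = -Cl/Cd (minimised)
    return np.sum((x - 0.3) ** 2)

pop = rng.uniform(-1, 1, (POP, DIM))
for _ in range(GEN):
    f = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(f)[: POP // 2]]      # truncation selection
    children = []
    while len(children) < POP - len(parents):
        i, j = rng.integers(0, len(parents), 2)
        alpha = rng.random()                      # arithmetic crossover weight
        child = alpha * parents[i] + (1 - alpha) * parents[j]
        child += rng.normal(0, 0.05, DIM)         # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, children])
best = pop[np.argmin([fitness(ind) for ind in pop])]
print("best parameters:", best.round(3))
```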

  7. Examples of the use of optimisation techniques in reactor structural analysis (Exemples d’utilisation des techniques d’optimisation en calcul de structures de réacteurs)

    DTIC Science & Technology

    2003-03-01

    3.4 Geometric optimisation (fixed architecture). Unlike the automotive sector and the aircraft manufacturers, most reactor components ... use non-linear material behaviour laws as well as large-displacement assumptions. The optimisation study consists of minimising ... a single disc, and it was decided to select three parameters that influence rupture: the thickness of the disc web (E1), the height (L3) and the ...

  8. Self-optimisation and model-based design of experiments for developing a C-H activation flow process.

    PubMed

    Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek; Lapkin, Alexei

    2017-01-01

    A recently described C(sp³)-H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the least number of experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled a rapid generation of a process model.

  9. [Inter-disciplinary approach of a mobile team specialised in geriatric oncology].

    PubMed

    Benyahia, Stéphanie; Cudennec, Tristan

    2015-01-01

    Ageing is an individual process. Chronological age does not reflect life expectancy or functional capacity. That is why, in geriatric oncology, the estimation of this capacity is a determining factor. An inter-disciplinary approach is necessary in order to coordinate the different players in the care and optimise the hospitalisation of elderly patients with multiple pathologies, all the more so when they are suffering from cancer. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  10. McStas event logger: Definition and applications

    NASA Astrophysics Data System (ADS)

    Bergbäck Knudsen, Erik; Bryndt Klinkby, Esben; Kjær Willendrup, Peter

    2014-02-01

    Functionality is added to the McStas neutron ray-tracing code, which allows individual neutron states before and after a scattering to be temporarily stored, and analysed. This logging mechanism has multiple uses, including studies of longitudinal intensity loss in neutron guides and guide coating design optimisations. Furthermore, the logging method enables the cold/thermal neutron induced gamma background along the guide to be calculated from the un-reflected neutron, using a recently developed MCNPX-McStas interface.

  11. Monotonicity of fitness landscapes and mutation rate control.

    PubMed

    Belavkin, Roman V; Channon, Alastair; Aston, Elizabeth; Aston, John; Krašovec, Rok; Knight, Christopher G

    2016-12-01

    A common view in evolutionary biology is that mutation rates are minimised. However, studies in combinatorial optimisation and search have shown a clear advantage of using variable mutation rates as a control parameter to optimise the performance of evolutionary algorithms. Much biological theory in this area is based on the work of Ronald Fisher, who used Euclidean geometry to study the relation between mutation size and the expected fitness of the offspring in infinite phenotypic spaces. Here we reconsider this theory based on the alternative geometry of discrete and finite spaces of DNA sequences. First, we consider the geometric case of fitness being isomorphic to distance from an optimum, and show how problems of optimal mutation rate control can be solved exactly or approximately, depending on additional constraints of the problem. Then we consider the general case of fitness communicating only partial information about the distance. We define weak monotonicity of fitness landscapes and prove that this property holds in all landscapes that are continuous and open at the optimum. This theoretical result motivates our hypothesis that optimal mutation rate functions in such landscapes will increase when fitness decreases in some neighbourhood of an optimum, resembling the control functions derived in the geometric case. We test this hypothesis experimentally by analysing approximately optimal mutation rate control functions in 115 complete landscapes of binding scores between DNA sequences and transcription factors. Our findings support the hypothesis and show that the increase of mutation rate is more rapid in landscapes that are less monotonic (more rugged). We discuss the relevance of these findings to living organisms.
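    The geometric intuition can be illustrated numerically. In the sketch below, fitness is taken to be the negative Hamming distance to an optimum on binary strings (an assumption chosen for illustration, not the paper's transcription-factor landscapes), and Monte Carlo sampling estimates which per-bit mutation rate maximises the probability that an offspring lands strictly closer to the optimum; the favoured rate grows as the parent's fitness worsens.

```python
# Monte Carlo illustration: higher optimal mutation rates for less fit parents.
import numpy as np

rng = np.random.default_rng(5)
L, TRIALS = 100, 5000            # string length, samples per estimate

def p_improve(d, rate):
    """P(offspring strictly closer to the optimum) for a parent at Hamming distance d."""
    flips = rng.random((TRIALS, L)) < rate
    # without loss of generality, the first d bits are the ones that differ from the optimum
    new_d = d - flips[:, :d].sum(axis=1) + flips[:, d:].sum(axis=1)
    return np.mean(new_d < d)

rates = np.linspace(0.005, 0.25, 25)
for d in (5, 20, 40):            # parents of decreasing fitness
    best_rate = rates[np.argmax([p_improve(d, r) for r in rates])]
    print(f"parent at distance {d}: best per-bit mutation rate ~ {best_rate:.3f}")
```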

  12. Thin-film X-ray filters on microstructured substrates and their thermophysical properties

    NASA Astrophysics Data System (ADS)

    Mitrofanov, A. V.

    2018-02-01

    It is shown that structured substrates having micron- or submicron-sized through holes and coated with an ultrathin organic film can be used for the fabrication of thin-film X-ray filters via direct growth of functional layers on a substrate by sputter deposition, without additional complex processing steps. An optimised process is considered for the fabrication of X-ray filters on support structures in the form of electroplated fine nickel grids and on track-etched polymer membranes with micron- and submicron-diameter through pores. 'Optimisation' is here taken to mean matching the sputter deposition conditions with the properties of the substrates so as to avoid overheating. The filters in question are intended for both imaging and single-channel detectors operating in the soft X-ray and vacuum UV spectral regions, at wavelengths from 10 to 60 nm. Thermal calculations are presented for the heating of ultrathin layers of organic films and thin-film support substrates during the sputter deposition of aluminium or other functional materials. The paper discusses approaches for cooling thin-film composites during the sputter deposition process and during the service of the filters in experiments, and gives a brief overview of the studies that utilised filters produced by the described technique on microstructured substrates, including orbital solar X-ray research in the framework of the CORONAS programme and laboratory laser plasma experiments.

  13. Robust imaging and gene delivery to study human lymphoblastoid cell lines.

    PubMed

    Jolly, Lachlan A; Sun, Ying; Carroll, Renée; Homan, Claire C; Gecz, Jozef

    2018-06-20

    Lymphoblastoid cell lines (LCLs) have been by far the most prevalent cell type used to study the genetics underlying normal and disease-relevant human phenotypic variation, across personal to epidemiological scales. In contrast, only few studies have explored the use of LCLs in functional genomics and mechanistic studies. Two major reasons are technical, as (1) interrogating the sub-cellular spatial information of LCLs is challenged by their non-adherent nature, and (2) LCLs are refractory to gene transfection. Methodological details relating to techniques that overcome these limitations are scarce, largely inadequate (without additional knowledge and expertise), and optimisation has never been described. Here we compare, optimise, and convey such methods in-depth. We provide a robust method to adhere LCLs to coverslips, which maintained cellular integrity, morphology, and permitted visualisation of sub-cellular structures and protein localisation. Next, we developed the use of lentiviral-based gene delivery to LCLs. Through empirical and combinatorial testing of multiple transduction conditions, we improved transduction efficiency from 3% up to 48%. Furthermore, we established strategies to purify transduced cells, to achieve sustainable cultures containing >85% transduced cells. Collectively, our methodologies provide a vital resource that enables the use of LCLs in functional cell and molecular biology experiments. Potential applications include the characterisation of genetic variants of unknown significance, the interrogation of cellular disease pathways and mechanisms, and high-throughput discovery of genetic modifiers of disease states among others.

  14. [Improving pre- and perioperative hospital care : Major elective surgery].

    PubMed

    Punt, Ilona M; van der Most, Roel; Bongers, Bart C; Didden, Anouk; Hulzebos, Erik H J; Dronkers, Jaap J; van Meeteren, Nico L U

    2017-04-01

    Surgery is aimed at improving a patient's health. However, surgery carries a risk of negative consequences, such as perioperative complications and prolonged hospitalisation. Also, achieving preoperative levels of physical functionality may be delayed. Above all, the "waiting" period before the operation and the period of hospitalisation endanger the state of health, especially in frail patients. The Better in Better out™ (BiBo™) strategy is aimed at reducing the risk of a complicated postoperative course through the optimisation and professionalisation of perioperative treatment strategies in a physiotherapy activating context. BiBo™ includes four steps towards optimising personalised health care in patients scheduled for elective surgery: 1) preoperative risk assessment, 2) preoperative patient education, 3) preoperative exercise therapy for high-risk patients (prehabilitation) and 4) postoperative mobilisation and functional exercise therapy. Preoperative screening is aimed at identifying frail, high-risk patients at an early stage, and advising these high-risk patients to participate in outpatient exercise training (prehabilitation) as soon as possible. By improving preoperative physical fitness, a patient is better able to withstand the impact of major surgery, and this will lead to both a reduced risk of negative side effects and better short-term outcomes. Besides prehabilitation, the treatment culture and infrastructure should change in such a way that patients stay as active as they can, socially, mentally and physically, after discharge.

  15. Conjugate gradient minimisation approach to generating holographic traps for ultracold atoms.

    PubMed

    Harte, Tiffany; Bruce, Graham D; Keeling, Jonathan; Cassettari, Donatella

    2014-11-03

    Direct minimisation of a cost function can in principle provide a versatile and highly controllable route to computational hologram generation. Here we show that the careful design of cost functions, combined with numerically efficient conjugate gradient minimisation, establishes a practical method for the generation of holograms for a wide range of target light distributions. This results in a guided optimisation process, with a crucial advantage illustrated by the ability to circumvent optical vortex formation during hologram calculation. We demonstrate the implementation of the conjugate gradient method for both discrete and continuous intensity distributions and discuss its applicability to optical trapping of ultracold atoms.
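    A toy version of this cost-function approach is sketched below: a tiny phase-only hologram is optimised with SciPy's conjugate-gradient routine so that its far-field intensity (computed with an FFT) matches a two-spot target. The array size, target pattern and simple quadratic cost are illustrative assumptions; practical hologram calculation uses analytic gradients, carefully designed cost terms of the kind discussed by the authors, and far larger arrays.

```python
# Conjugate-gradient minimisation of a toy hologram cost function.
import numpy as np
from scipy.optimize import minimize

N = 8                                             # tiny hologram, for illustration only
target = np.zeros((N, N))
target[2, 3] = target[5, 5] = 1.0                 # two desired trap spots
target /= target.sum()

def cost(phases):
    field = np.exp(1j * phases.reshape(N, N))     # unit-amplitude, phase-only plane
    intensity = np.abs(np.fft.fft2(field)) ** 2   # far-field intensity
    intensity /= intensity.sum()
    return np.sum((intensity - target) ** 2)      # simple quadratic cost function

x0 = np.random.default_rng(6).uniform(0.0, 2.0 * np.pi, N * N)
result = minimize(cost, x0, method="CG", options={"maxiter": 200})
print("final cost:", result.fun)
```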

  16. Synthesis and Multiple Incorporations of 2′‐O‐Methyl‐5‐hydroxymethylcytidine, 5‐Hydroxymethylcytidine and 5‐Formylcytidine Monomers into RNA Oligonucleotides

    PubMed Central

    Tanpure, Arun A.

    2017-01-01

    Abstract The synthesis of 2′‐O‐methyl‐5‐hydroxymethylcytidine (hm5Cm), 5‐hydroxymethylcytidine (hm5C) and 5‐formylcytidine (f5C) phosphoramidite monomers has been developed. Optimisation of mild post‐synthetic deprotection conditions enabled the synthesis of RNA containing all four naturally occurring cytosine modifications (hm5Cm, hm5C, f5C plus 5‐methylcytosine). Given the considerable interest in RNA modifications and epitranscriptomics, the availability of synthetic monomers and RNAs containing these modifications will be valuable for elucidating their biological function(s). PMID:28901692

  17. Identification of the contribution of contact and aerial biomechanical parameters in acrobatic performance

    PubMed Central

    Haering, Diane; Huchez, Aurore; Barbier, Franck; Holvoët, Patrice; Begon, Mickaël

    2017-01-01

    Introduction Teaching acrobatic skills with a minimal amount of repetition is a major challenge for coaches. Biomechanical, statistical or computer simulation tools can help them identify the most determinant factors of performance. Release parameters, change in moment of inertia and segmental momentum transfers were identified in the prediction of acrobatics success. The purpose of the present study was to evaluate the relative contribution of these parameters in performance throughout expertise or optimisation based improvements. The counter movement forward in flight (CMFIF) was chosen for its intrinsic dichotomy between the accessibility of its attempt and complexity of its mastery. Methods Three repetitions of the CMFIF performed by eight novice and eight advanced female gymnasts were recorded using a motion capture system. Optimal aerial techniques that maximise rotation potential at regrasp were also computed. A 14-segment-multibody-model defined through the Rigid Body Dynamics Library was used to compute recorded and optimal kinematics, and biomechanical parameters. A stepwise multiple linear regression was used to determine the relative contribution of these parameters in novice recorded, novice optimised, advanced recorded and advanced optimised trials. Finally, fixed effects of expertise and optimisation were tested through a mixed-effects analysis. Results and discussion Variation in release state only contributed to performances in novice recorded trials. Moment of inertia contribution to performance increased from novice recorded, to novice optimised, advanced recorded, and advanced optimised trials. Contribution to performance of momentum transfer to the trunk during the flight prevailed in all recorded trials. Although optimisation decreased transfer contribution, momentum transfer to the arms appeared. Conclusion Findings suggest that novices should be coached on both contact and aerial technique. Inversely, mainly improved aerial technique helped advanced gymnasts increase their performance. For both, reduction of the moment of inertia should be focused on. The method proposed in this article could be generalized to any aerial skill learning investigation. PMID:28422954

  18. Total knee replacement plus physical and medical therapy or treatment with physical and medical therapy alone: a randomised controlled trial in patients with knee osteoarthritis (the MEDIC-study).

    PubMed

    Skou, Soren T; Roos, Ewa M; Laursen, Mogens B; Rathleff, Michael S; Arendt-Nielsen, Lars; Simonsen, Ole H; Rasmussen, Sten

    2012-05-09

    There is a lack of high quality evidence concerning the efficacy of total knee arthroplasty (TKA). According to international evidence-based guidelines, treatment of knee osteoarthritis (KOA) should include patient education, exercise and weight loss. Insoles and pharmacological treatment can be included as supplementary treatments. If the combination of these non-surgical treatment modalities is ineffective, TKA may be indicated. The purpose of this randomised controlled trial is to examine whether TKA provides further improvement in pain, function and quality of life in addition to optimised non-surgical treatment in patients with KOA defined as definite radiographic OA and up to moderate pain. The study will be conducted in The North Denmark Region. 100 participants with radiographic KOA (K-L grade ≥2) and mean pain during the previous week of ≤ 60 mm (0-100, best to worst scale) who are considered eligible for TKA by an orthopaedic surgeon will be included. The treatment will consist of 12 weeks of optimised non-surgical treatment consisting of patient education, exercise, diet, insoles, analgesics and/or NSAIDs. Patients will be randomised to either receiving or not receiving a TKA in addition to the optimised non-surgical treatment. The primary outcome will be the change from baseline to 12 months on the Knee Injury and Osteoarthritis Outcome Score (KOOS)(4) defined as the average score for the subscale scores for pain, symptoms, activities of daily living, and quality of life. Secondary outcomes include the five individual KOOS subscale scores, EQ-5D, pain on a 100 mm Visual Analogue Scale, self-efficacy, pain pressure thresholds, and isometric knee flexion and knee extension strength. This is the first randomised controlled trial to investigate the efficacy of TKA as an adjunct treatment to optimised non-surgical treatment in patients with KOA. The results will significantly contribute to evidence-based recommendations for the treatment of patients with KOA. Clinicaltrials.gov reference: NCT01410409.

  19. Total knee replacement plus physical and medical therapy or treatment with physical and medical therapy alone: a randomised controlled trial in patients with knee osteoarthritis (the MEDIC-study)

    PubMed Central

    2012-01-01

    Background There is a lack of high quality evidence concerning the efficacy of total knee arthroplasty (TKA). According to international evidence-based guidelines, treatment of knee osteoarthritis (KOA) should include patient education, exercise and weight loss. Insoles and pharmacological treatment can be included as supplementary treatments. If the combination of these non-surgical treatment modalities is ineffective, TKA may be indicated. The purpose of this randomised controlled trial is to examine whether TKA provides further improvement in pain, function and quality of life in addition to optimised non-surgical treatment in patients with KOA defined as definite radiographic OA and up to moderate pain. Methods/Design The study will be conducted in The North Denmark Region. 100 participants with radiographic KOA (K-L grade ≥2) and mean pain during the previous week of ≤ 60 mm (0–100, best to worst scale) who are considered eligible for TKA by an orthopaedic surgeon will be included. The treatment will consist of 12 weeks of optimised non-surgical treatment consisting of patient education, exercise, diet, insoles, analgesics and/or NSAIDs. Patients will be randomised to either receiving or not receiving a TKA in addition to the optimised non-surgical treatment. The primary outcome will be the change from baseline to 12 months on the Knee Injury and Osteoarthritis Outcome Score (KOOS)4 defined as the average score for the subscale scores for pain, symptoms, activities of daily living, and quality of life. Secondary outcomes include the five individual KOOS subscale scores, EQ-5D, pain on a 100 mm Visual Analogue Scale, self-efficacy, pain pressure thresholds, and isometric knee flexion and knee extension strength. Discussion This is the first randomised controlled trial to investigate the efficacy of TKA as an adjunct treatment to optimised non-surgical treatment in patients with KOA. The results will significantly contribute to evidence-based recommendations for the treatment of patients with KOA. Trial registration Clinicaltrials.gov reference: NCT01410409 PMID:22571284

  20. Eatwell Guide: modelling the dietary and cost implications of incorporating new sugar and fibre guidelines.

    PubMed

    Scarborough, Peter; Kaur, Asha; Cobiac, Linda; Owens, Paul; Parlesak, Alexandr; Sweeney, Kate; Rayner, Mike

    2016-12-21

    To model food group consumption and price of diet associated with achieving UK dietary recommendations while deviating as little as possible from the current UK diet, in order to support the redevelopment of the UK food-based dietary guidelines (now called the Eatwell Guide). Optimisation modelling, minimising an objective function of the difference between population mean modelled and current consumption of 125 food groups, and constraints of nutrient and food-based recommendations. The UK. Adults aged 19 years and above from the National Diet and Nutrition Survey 2008-2011. Proportion of diet consisting of major foods groups and price of the optimised diet. The optimised diet has an increase in consumption of 'potatoes, bread, rice, pasta and other starchy carbohydrates' (+69%) and 'fruit and vegetables' (+54%) and reductions in consumption of 'beans, pulses, fish, eggs, meat and other proteins' (-24%), 'dairy and alternatives' (-21%) and 'foods high in fat and sugar' (-53%). Results within food groups show considerable variety (eg, +90% for beans and pulses, -78% for red meat). The modelled diet would cost £5.99 (£5.93 to £6.05) per adult per day, very similar to the cost of the current diet: £6.02 (£5.96 to £6.08). The optimised diet would result in increased consumption of n-3 fatty acids and most micronutrients (including iron and folate), but decreased consumption of zinc and small decreases in consumption of calcium and riboflavin. To achieve the UK dietary recommendations would require large changes in the average diet of UK adults, including in food groups where current average consumption is well within the recommended range (eg, processed meat) or where there are no current recommendations (eg, dairy). These large changes in the diet will not lead to significant changes in the price of the diet. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
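
    The following toy sketch illustrates the general optimisation-modelling idea described above: choose food-group consumption that deviates as little as possible from the current diet while satisfying linear nutrient constraints. The four food groups, nutrient coefficients and limits are made-up illustrative numbers, not the study's 125-group data.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint, Bounds

# Made-up numbers for four food groups (g/day) and two nutrient coefficients.
current = np.array([250.0, 300.0, 150.0, 200.0])   # starchy, fruit & veg, protein, fat/sugar
nutrients = np.array([[0.02, 0.05, 0.01, 0.30],    # g free sugar per g of food
                      [0.06, 0.03, 0.02, 0.01]])   # g fibre per g of food

def deviation(x):
    return np.sum(((x - current) / current) ** 2)  # squared relative deviation from current diet

constraint = LinearConstraint(nutrients, lb=[-np.inf, 30.0], ub=[30.0, np.inf])
result = minimize(deviation, x0=current, method="trust-constr",
                  constraints=[constraint], bounds=Bounds(0.0, np.inf))
print(np.round(result.x, 1))                       # modelled consumption per food group
```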

  1. Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction.

    PubMed

    O'Boyle, Noel M; Palmer, David S; Nigsch, Florian; Mitchell, John Bo

    2008-10-29

    We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024-1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581-590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset. Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors which has an RMSE on an external test set of 46.6 degrees C and R2 of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, epsilon of 0.21) and an RMSE of 45.1 degrees C and R2 of 0.54. This model outperforms a kNN model (RMSE of 48.3 degrees C, R2 of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5 degrees C, R2 of 0.55). However it is much less prone to bias at the extremes of the range of melting points as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM, -0.53 for Random Forest. With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors.
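
    The sketch below illustrates ant-colony-style feature selection in a heavily simplified form: per-feature selection probabilities act as pheromones, candidate subsets are sampled and scored by cross-validation, and features in the best subset are reinforced. It is not the published WAAC algorithm (no winnowing step, no simultaneous parameter tuning), and the ridge-regression data are synthetic.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Synthetic data: 30 descriptors, only the first two are informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

pheromone = np.full(30, 0.5)                 # per-feature selection probabilities
best_score, best_mask = -np.inf, np.ones(30, dtype=bool)
for generation in range(20):
    for ant in range(10):
        mask = rng.random(30) < pheromone    # each ant samples a feature subset
        if not mask.any():
            continue
        score = cross_val_score(Ridge(), X[:, mask], y, cv=3).mean()
        if score > best_score:
            best_score, best_mask = score, mask
    pheromone = 0.9 * pheromone + 0.1 * best_mask   # evaporate, reinforce best subset

print("selected features:", np.flatnonzero(best_mask), "CV R2:", round(best_score, 3))
```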

  2. Multi-objective optimisation and decision-making of space station logistics strategies

    NASA Astrophysics Data System (ADS)

    Zhu, Yue-he; Luo, Ya-zhong

    2016-10-01

    Space station logistics strategy optimisation is a complex engineering problem with multiple objectives. Finding a decision-maker-preferred compromise solution becomes more significant when solving such a problem. However, the designer-preferred solution is not easy to determine using the traditional method. Thus, a hybrid approach that combines the multi-objective evolutionary algorithm, physical programming, and differential evolution (DE) algorithm is proposed to deal with the optimisation and decision-making of space station logistics strategies. A multi-objective evolutionary algorithm is used to acquire a Pareto frontier and help determine the range parameters of the physical programming. Physical programming is employed to convert the four-objective problem into a single-objective problem, and a DE algorithm is applied to solve the resulting physical programming-based optimisation problem. Five kinds of objective preference are simulated and compared. The simulation results indicate that the proposed approach can produce good compromise solutions corresponding to different decision-makers' preferences.
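
    A minimal sketch of the "aggregate the objectives, then solve with differential evolution" pattern is given below. The four toy objectives and the simple weighted-sum aggregation are illustrative stand-ins; they are not the paper's logistics objectives nor its physical programming preference functions.

```python
import numpy as np
from scipy.optimize import differential_evolution

def objectives(x):
    # Illustrative stand-ins for four competing logistics objectives.
    f1 = np.sum((x - 1.0) ** 2)     # e.g. propellant consumption
    f2 = np.sum((x + 1.0) ** 2)     # e.g. launch cost
    f3 = np.sum(np.abs(x))          # e.g. schedule slack used
    f4 = np.var(x)                  # e.g. load imbalance
    return np.array([f1, f2, f3, f4])

weights = np.array([0.4, 0.3, 0.2, 0.1])        # assumed decision-maker preference
def aggregate(x):
    return float(weights @ objectives(x))        # single-objective surrogate

result = differential_evolution(aggregate, bounds=[(-2.0, 2.0)] * 5, seed=1)
print(np.round(result.x, 3), round(result.fun, 3))
```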

  3. A shrinking hypersphere PSO for engineering optimisation problems

    NASA Astrophysics Data System (ADS)

    Yadav, Anupam; Deep, Kusum

    2016-03-01

    Many real-world and engineering design problems can be formulated as constrained optimisation problems (COPs). Swarm intelligence techniques are a good approach to solve COPs. In this paper, an efficient shrinking hypersphere-based particle swarm optimisation (SHPSO) algorithm is proposed for constrained optimisation. The proposed SHPSO is designed in such a way that each particle moves under the influence of shrinking hyperspheres. A parameter-free approach is used to handle the constraints. The performance of the SHPSO is compared against the state-of-the-art algorithms for a set of 24 benchmark problems. An exhaustive comparison of the results is provided statistically as well as graphically. Moreover, three engineering design problems, namely the welded beam design, compression spring design and pressure vessel design problems, are solved using SHPSO and the results are compared with the state-of-the-art algorithms.
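
    For reference, the sketch below shows a minimal canonical particle swarm optimiser with a global-best topology on an unconstrained test function; it omits the paper's shrinking-hypersphere mechanism and its parameter-free constraint handling.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_particles, iters = 10, 30, 200
w, c1, c2 = 0.72, 1.49, 1.49                        # common default coefficients

def sphere(x):
    return np.sum(x ** 2, axis=-1)                  # unconstrained test function

pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), sphere(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = sphere(pos)
    better = val < pbest_val
    pbest[better], pbest_val[better] = pos[better], val[better]
    gbest = pbest[np.argmin(pbest_val)]

print("best value found:", sphere(gbest))
```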

  4. Achieving optimal SERS through enhanced experimental design

    PubMed Central

    Fisk, Heidi; Westley, Chloe; Turner, Nicholas J.

    2016-01-01

    One of the current limitations surrounding surface‐enhanced Raman scattering (SERS) is the perceived lack of reproducibility. SERS is indeed challenging, and for analyte detection, it is vital that the analyte interacts with the metal surface. However, as this is analyte dependent, there is not a single set of SERS conditions that are universal. This means that experimental optimisation for optimum SERS response is vital. Most researchers optimise one factor at a time, where a single parameter is altered first before going onto optimise the next. This is a very inefficient way of searching the experimental landscape. In this review, we explore the use of more powerful multivariate approaches to SERS experimental optimisation based on design of experiments and evolutionary computational methods. We particularly focus on colloidal‐based SERS rather than thin film preparations as a result of their popularity. © 2015 The Authors. Journal of Raman Spectroscopy published by John Wiley & Sons, Ltd. PMID:27587905

  5. Microfluidic converging/diverging channels optimised for homogeneous extensional deformation.

    PubMed

    Zografos, K; Pimenta, F; Alves, M A; Oliveira, M S N

    2016-07-01

    In this work, we optimise microfluidic converging/diverging geometries in order to produce constant strain-rates along the centreline of the flow, for performing studies under homogeneous extension. The design is examined for both two-dimensional and three-dimensional flows where the effects of aspect ratio and dimensionless contraction length are investigated. Initially, pressure driven flows of Newtonian fluids under creeping flow conditions are considered, which is a reasonable approximation in microfluidics, and the limits of the applicability of the design in terms of Reynolds numbers are investigated. The optimised geometry is then used for studying the flow of viscoelastic fluids and the practical limitations in terms of Weissenberg number are reported. Furthermore, the optimisation strategy is also applied for electro-osmotic driven flows, where the development of a plug-like velocity profile allows for a wider region of homogeneous extensional deformation in the flow field.

  6. Topology Optimisation of Wideband Coaxial-to-Waveguide Transitions

    NASA Astrophysics Data System (ADS)

    Hassan, Emadeldeen; Noreland, Daniel; Wadbro, Eddie; Berggren, Martin

    2017-03-01

    To maximize the matching between a coaxial cable and rectangular waveguides, we present a computational topology optimisation approach that decides for each point in a given domain whether to hold a good conductor or a good dielectric. The conductivity is determined by a gradient-based optimisation method that relies on finite-difference time-domain solutions to the 3D Maxwell’s equations. Unlike previously reported results in the literature for this kind of problems, our design algorithm can efficiently handle tens of thousands of design variables that can allow novel conceptual waveguide designs. We demonstrate the effectiveness of the approach by presenting optimised transitions with reflection coefficients lower than -15 dB over more than a 60% bandwidth, both for right-angle and end-launcher configurations. The performance of the proposed transitions is cross-verified with a commercial software, and one design case is validated experimentally.

  7. Topology Optimisation of Wideband Coaxial-to-Waveguide Transitions.

    PubMed

    Hassan, Emadeldeen; Noreland, Daniel; Wadbro, Eddie; Berggren, Martin

    2017-03-23

    To maximize the matching between a coaxial cable and rectangular waveguides, we present a computational topology optimisation approach that decides for each point in a given domain whether to hold a good conductor or a good dielectric. The conductivity is determined by a gradient-based optimisation method that relies on finite-difference time-domain solutions to the 3D Maxwell's equations. Unlike previously reported results in the literature for this kind of problems, our design algorithm can efficiently handle tens of thousands of design variables that can allow novel conceptual waveguide designs. We demonstrate the effectiveness of the approach by presenting optimised transitions with reflection coefficients lower than -15 dB over more than a 60% bandwidth, both for right-angle and end-launcher configurations. The performance of the proposed transitions is cross-verified with a commercial software, and one design case is validated experimentally.

  8. Optimal design and operation of a photovoltaic-electrolyser system using particle swarm optimisation

    NASA Astrophysics Data System (ADS)

    Sayedin, Farid; Maroufmashat, Azadeh; Roshandel, Ramin; Khavas, Sourena Sattari

    2016-07-01

    In this study, hydrogen generation is maximised by optimising the size and the operating conditions of an electrolyser (EL) directly connected to a photovoltaic (PV) module at different irradiance. Due to the variations of maximum power points of the PV module during a year and the complexity of the system, a nonlinear approach is considered. A mathematical model has been developed to determine the performance of the PV/EL system. The optimisation methodology presented here is based on the particle swarm optimisation algorithm. By this method, for the given number of PV modules, the optimal size and operating condition of a PV/EL system are achieved. The approach can be applied for different sizes of PV systems, various ambient temperatures and different locations with various climatic conditions. The results show that for the given location and the PV system, the energy transfer efficiency of the PV/EL system can reach up to 97.83%.

  9. Assessment of grid optimisation measures for the German transmission grid using open source grid data

    NASA Astrophysics Data System (ADS)

    Böing, F.; Murmann, A.; Pellinger, C.; Bruckmeier, A.; Kern, T.; Mongin, T.

    2018-02-01

    The expansion of capacities in the German transmission grid is a necessity for further integration of renewable energy sources into the electricity sector. In this paper, the grid optimisation measures ‘Overhead Line Monitoring’, ‘Power-to-Heat’ and ‘Demand Response in the Industry’ are evaluated and compared against conventional grid expansion for the year 2030. Initially, the methodological approach of the simulation model is presented and detailed descriptions are provided of the grid model and of the grid data used, which partly originate from open-source platforms. Further, this paper explains how ‘Curtailment’ and ‘Redispatch’ can be reduced by implementing grid optimisation measures and how the depreciation of economic costs can be determined considering construction costs. The simulations show that conventional grid expansion is more efficient and provides greater grid-relieving effects than the evaluated grid optimisation measures.

  10. Topology Optimisation of Wideband Coaxial-to-Waveguide Transitions

    PubMed Central

    Hassan, Emadeldeen; Noreland, Daniel; Wadbro, Eddie; Berggren, Martin

    2017-01-01

    To maximize the matching between a coaxial cable and rectangular waveguides, we present a computational topology optimisation approach that decides for each point in a given domain whether to hold a good conductor or a good dielectric. The conductivity is determined by a gradient-based optimisation method that relies on finite-difference time-domain solutions to the 3D Maxwell’s equations. Unlike previously reported results in the literature for this kind of problems, our design algorithm can efficiently handle tens of thousands of design variables that can allow novel conceptual waveguide designs. We demonstrate the effectiveness of the approach by presenting optimised transitions with reflection coefficients lower than −15 dB over more than a 60% bandwidth, both for right-angle and end-launcher configurations. The performance of the proposed transitions is cross-verified with a commercial software, and one design case is validated experimentally. PMID:28332585

  11. VLSI Technology for Cognitive Radio

    NASA Astrophysics Data System (ADS)

    VIJAYALAKSHMI, B.; SIDDAIAH, P.

    2017-08-01

    One of the most challenging tasks in cognitive radio is achieving an efficient spectrum sensing scheme to overcome the spectrum scarcity problem. The most popular and widely used spectrum sensing technique is the energy detection scheme, as it is very simple and does not require any prior information about the signal. We propose one such approach: an optimised spectrum sensing scheme with a reduced filter structure. The optimisation is done in terms of the area and power performance of the spectrum sensing design. The simulations of the VLSI structure of the optimised flexible spectrum sensing scheme are carried out in Verilog using the XILINX ISE software. Our method achieves a 13% reduction in area and a 66% reduction in power consumption in comparison to the flexible spectrum sensing scheme. All the results are tabulated and comparisons are made. Our model thus opens up a new scheme for optimised and effective spectrum sensing.

  12. Achieving optimal SERS through enhanced experimental design.

    PubMed

    Fisk, Heidi; Westley, Chloe; Turner, Nicholas J; Goodacre, Royston

    2016-01-01

    One of the current limitations surrounding surface-enhanced Raman scattering (SERS) is the perceived lack of reproducibility. SERS is indeed challenging, and for analyte detection, it is vital that the analyte interacts with the metal surface. However, as this is analyte dependent, there is not a single set of SERS conditions that are universal. This means that experimental optimisation for optimum SERS response is vital. Most researchers optimise one factor at a time, where a single parameter is altered first before going onto optimise the next. This is a very inefficient way of searching the experimental landscape. In this review, we explore the use of more powerful multivariate approaches to SERS experimental optimisation based on design of experiments and evolutionary computational methods. We particularly focus on colloidal-based SERS rather than thin film preparations as a result of their popularity. © 2015 The Authors. Journal of Raman Spectroscopy published by John Wiley & Sons, Ltd.

  13. Optimisation of decentralisation for effective Disaster Risk Reduction (DRR) through the case study of Indonesia

    NASA Astrophysics Data System (ADS)

    Grady, A.; Makarigakis, A.; Gersonius, B.

    2015-09-01

    This paper investigates how to optimise decentralisation for effective disaster risk reduction (DRR) in developing states. There is currently limited literature on empirical analysis of decentralisation for DRR. This paper evaluates decentralised governance for DRR in the case study of Indonesia and provides recommendations for its optimisation. Wider implications are drawn to optimise decentralisation for DRR in developing states more generally. A framework to evaluate the institutional and policy setting was developed which necessitated the use of a gap analysis, desk study and field investigation. Key challenges to decentralised DRR include capacity gaps at lower levels, low compliance with legislation, disconnected policies, issues in communication and coordination and inadequate resourcing. DRR authorities should lead coordination and advocacy on DRR. Sustainable multistakeholder platforms and civil society organisations should fill the capacity gap at lower levels. Dedicated and regulated resources for DRR should be compulsory.

  14. Microfluidic converging/diverging channels optimised for homogeneous extensional deformation

    PubMed Central

    Zografos, K.; Oliveira, M. S. N.

    2016-01-01

    In this work, we optimise microfluidic converging/diverging geometries in order to produce constant strain-rates along the centreline of the flow, for performing studies under homogeneous extension. The design is examined for both two-dimensional and three-dimensional flows where the effects of aspect ratio and dimensionless contraction length are investigated. Initially, pressure driven flows of Newtonian fluids under creeping flow conditions are considered, which is a reasonable approximation in microfluidics, and the limits of the applicability of the design in terms of Reynolds numbers are investigated. The optimised geometry is then used for studying the flow of viscoelastic fluids and the practical limitations in terms of Weissenberg number are reported. Furthermore, the optimisation strategy is also applied for electro-osmotic driven flows, where the development of a plug-like velocity profile allows for a wider region of homogeneous extensional deformation in the flow field. PMID:27478523

  15. Optimisation of confinement in a fusion reactor using a nonlinear turbulence model

    NASA Astrophysics Data System (ADS)

    Highcock, E. G.; Mandell, N. R.; Barnes, M.

    2018-04-01

    The confinement of heat in the core of a magnetic fusion reactor is optimised using a multidimensional optimisation algorithm. For the first time in such a study, the loss of heat due to turbulence is modelled at every stage using first-principles nonlinear simulations which accurately capture the turbulent cascade and large-scale zonal flows. The simulations utilise a novel approach, with gyrofluid treatment of the small-scale drift waves and gyrokinetic treatment of the large-scale zonal flows. A simple near-circular equilibrium with standard parameters is chosen as the initial condition. The figure of merit, fusion power per unit volume, is calculated, and then two control parameters, the elongation and triangularity of the outer flux surface, are varied, with the algorithm seeking to optimise the chosen figure of merit. A twofold increase in the plasma power per unit volume is achieved by moving to higher elongation and strongly negative triangularity.
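
    The outer loop of such a study can be sketched as a bounded optimisation over the two shape parameters, as below. The figure-of-merit function here is a cheap, made-up surrogate used only for illustration; in the study each evaluation is a first-principles nonlinear turbulence simulation.

```python
import numpy as np
from scipy.optimize import minimize

def fusion_power_per_volume(shape):
    # Cheap, made-up surrogate for the figure of merit; rewards elongation and
    # negative triangularity. In the study this is a turbulence simulation.
    kappa, delta = shape
    return kappa ** 1.5 * (1.0 + 0.5 * max(-delta, 0.0)) / (1.0 + 0.1 * kappa ** 3)

result = minimize(lambda s: -fusion_power_per_volume(s),   # maximise via negation
                  x0=[1.6, 0.2],                           # near-circular-ish start
                  method="Nelder-Mead",                     # bounded variant needs SciPy >= 1.7
                  bounds=[(1.0, 2.5), (-0.6, 0.6)])
print("elongation, triangularity:", np.round(result.x, 2))
```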

  16. Lead-acid batteries for micro- and mild-hybrid applications

    NASA Astrophysics Data System (ADS)

    Valenciano, J.; Fernández, M.; Trinidad, F.; Sanz, L.

    Car manufacturers have announced the launch in the coming months of vehicles with reduced emissions due to the introduction of new functions like stop-start and regenerative braking. Initial performance requirements of automotive lead-acid batteries are becoming more and more demanding and, in addition to this, cycle-life tests with new accelerated ageing profiles are being proposed in order to determine the influence of the new functions on expected battery life. This paper will show how different lead-acid battery technologies comply with these new demands, from an improved version of the conventional flooded SLI battery to the high performance of the spiral-wound valve-regulated lead-acid (VRLA) battery. Different approaches have been studied for improving conventional flooded batteries, i.e., either by the addition of new additives for reducing electrolyte stratification or by optimisation of the battery design to extend cycling life in partial state of charge conditions. With respect to VRLA technology, two different battery designs have been compared. The spiral-wound design combines excellent power capability and cycle life under different depth of discharge (DoD) cycling conditions, but the flat-plate design outperforms it in energy density due to better utilisation of the space available in a prismatic enclosure. The latter design is more adequate for high-end class vehicles with high electrical energy demand, whereas the spiral-wound design is better suited to the high-power/long-life demands of commercial vehicles. High-temperature behaviour (75 °C) is rather poor for both designs due to water loss, so VRLA batteries should preferably be located outside the engine compartment.

  17. One-step growth of thin film SnS with large grains using MOCVD.

    PubMed

    Clayton, Andrew J; Charbonneau, Cecile M E; Tsoi, Wing C; Siderfin, Peter J; Irvine, Stuart J C

    2018-01-01

    Thin films of tin sulphide (SnS) were produced with grain sizes greater than 1 μm using a one-step metal organic chemical vapour deposition process. Tin-doped indium oxide (ITO) was used as the substrate, having a similar work function to the molybdenum typically used as the back contact, but with the potential to exploit its transparency for bifacial illumination. Tetraethyltin and ditertiarybutylsulphide were used as precursors with process temperatures of 430-470 °C to promote film growth with large grains. The film stoichiometry was controlled by varying the precursor partial pressure ratios and characterised with energy dispersive X-ray spectroscopy to optimise the SnS composition. X-ray diffraction and Raman spectroscopy were used to determine the phases present in the film and revealed that small amounts of ottemannite Sn2S3 were present when SnS was deposited onto the ITO using the optimised growth parameters. Interaction at the SnS/ITO interface to form Sn2S3 was deduced to have occurred under all growth conditions.

  18. Total centralisation and optimisation of an oncology management suite via Citrix®

    NASA Astrophysics Data System (ADS)

    James, C.; Frantzis, J.; Ripps, L.; Fenton, P.

    2014-03-01

    The management of patient information and treatment planning is traditionally an intra-departmental requirement of a radiation oncology service. Epworth Radiation Oncology systems must support the transient nature of Visiting Medical Officers (VMOs). This unique work practice created challenges when implementing the vision of a completely paperless solution that allows for a responsive and efficient service delivery. ARIA® and EclipseTM (Varian Medical Systems, Palo Alto, CA, USA) have been deployed across four dedicated Citrix® (Citrix Systems, Santa Clara, CA, USA) servers allowing VMOs to access these applications remotely. A range of paperless solutions were developed within ARIA® to facilitate clinical and organisational management whilst optimising efficient work practices. The IT infrastructure and paperless workflow has enabled VMOs to securely access the VarianTM (Varian Medical Systems, Palo Alto, CA, USA) oncology software and experience full functionality from any location on multiple devices. This has enhanced access to patient information and improved the responsiveness of the service. Epworth HealthCare has developed a unique solution to enable remote access to a centralised oncology management suite, while maintaining a secure and paperless working environment.

  19. Automated Sperm Head Detection Using Intersecting Cortical Model Optimised by Particle Swarm Optimization.

    PubMed

    Tan, Weng Chun; Mat Isa, Nor Ashidi

    2016-01-01

    In human sperm motility analysis, sperm segmentation plays an important role in determining the location of multiple sperm. To ensure an improved segmentation result, the Laplacian of Gaussian filter is implemented as a kernel in a pre-processing step before applying the image segmentation process to automatically segment and detect human spermatozoa. This study proposes an intersecting cortical model (ICM), which was derived from several visual cortex models, to segment the sperm head region. However, the proposed method suffers from sensitivity to parameter selection; thus, the ICM network is optimised using particle swarm optimization, where feature mutual information is introduced as the new fitness function. The final results showed that the proposed method is more accurate and robust than four state-of-the-art segmentation methods. The proposed method resulted in rates of 98.14%, 98.82%, 86.46% and 99.81% in accuracy, sensitivity, specificity and precision, respectively, after testing on 1200 sperm. The proposed algorithm is expected to be applied in sperm motility analysis because of its robustness and capability.

  20. Phospholipid fingerprints of milk from different mammalians determined by 31P NMR: towards specific interest in human health.

    PubMed

    Garcia, Cyrielle; Lutz, Norbert W; Confort-Gouny, Sylviane; Cozzone, Patrick J; Armand, Martine; Bernard, Monique

    2012-12-01

    Our objective was to identify and quantify phospholipids in milk from different species (human HM, cow CoM, camel CaM, and mare MM) using an optimised (31)P NMR spectroscopy procedure. The phospholipid fingerprints were species-specific with a broader variety of classes found in HM and MM; HM and CaM were richer in sphingomyelin (78.3 and 117.5μg/ml) and plasmalogens (27.3 and 24μg/ml), possibly important for infant development. Total phospholipid content was higher in CaM (0.503mM) and lower in MM (0.101mM) compared to HM (0.324mM) or CoM (0.265mM). Our optimised method showed good sensitivity, high resolution, and easy sample preparation with minimal loss of target molecules. It is suitable for determining the accurate composition of a large number of bioactive phospholipids with putative health benefits, including plasmalogens, and should aid in selecting appropriate ingredient sources for infant milk substitutes or fortifiers, and for functional foods dedicated to adults. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Relative electronic and free energies of octane's unique conformations

    NASA Astrophysics Data System (ADS)

    Kirschner, Karl N.; Heiden, Wolfgang; Reith, Dirk

    2017-06-01

    This study reports the geometries and electronic energies of n-octane's unique conformations using perturbation methods that best mimic CCSD(T) results. In total, the fully optimised minima of n-butane (2 conformations), n-pentane (4 conformations), n-hexane (12 conformations) and n-octane (96 conformations) were investigated at several different theory levels and basis sets. We find that DF-MP2.5/aug-cc-pVTZ is in very good agreement with the more expensive CCSD(T) results. At this level, we can clearly confirm the 96 stable minima which were previously found using a reparameterised density functional theory (DFT). Excellent agreement was found between their DFT results and our DF-MP2.5 perturbation results. Subsequent Gibbs free energy calculations, using scaled MP2/aug-cc-pVTZ zero-point vibrational energy and frequencies, indicate a significant temperature dependency of the relative energies, with a change in the predicted global minimum. The results of this work will be important for future computational investigations of fuel-related octane reactions and for optimisation of molecular force fields (e.g. lipids).

  2. Gait as solution, but what is the problem? Exploring cost, economy and compromise in locomotion.

    PubMed

    Bertram, John E A

    2013-12-01

    Many studies have examined how legged mammals move, defining 'what' happens in locomotion. However, few ask 'why' those motions occur as they do. The energetic and functional constraints acting on an animal require that locomotion should be metabolically 'cost effective' and this in large part determines the strategies available to accomplish the task. Understanding the gaits utilised, within the spectrum of gaits possible, and determination of the value of specific relationships among speed, stride length, stride frequency and morphology, depends on identifying the fundamental costs involved and the effects of different movement strategies on those costs. It is argued here that a fundamental loss associated with moving on limbs (centre of mass momentum and energy loss) and two costs involved with controlling and replacing that loss (muscular work of the supporting limb during stance and muscular work of repositioning the limbs during swing) interact to determine the cost trade-offs involved and the optimisation strategies available for each species and speed. These optimisation strategies are what has been observed and characterised as gait. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Coil optimisation for transcranial magnetic stimulation in realistic head geometry.

    PubMed

    Koponen, Lari M; Nieminen, Jaakko O; Mutanen, Tuomas P; Stenroos, Matti; Ilmoniemi, Risto J

    Transcranial magnetic stimulation (TMS) allows focal, non-invasive stimulation of the cortex. A TMS pulse is inherently weakly coupled to the cortex; thus, magnetic stimulation requires both high current and high voltage to reach sufficient intensity. These requirements limit, for example, the maximum repetition rate and the maximum number of consecutive pulses with the same coil due to the rise of its temperature. To develop methods to optimise, design, and manufacture energy-efficient TMS coils in realistic head geometry with an arbitrary overall coil shape. We derive a semi-analytical integration scheme for computing the magnetic field energy of an arbitrary surface current distribution, compute the electric field induced by this distribution with a boundary element method, and optimise a TMS coil for focal stimulation. Additionally, we introduce a method for manufacturing such a coil by using Litz wire and a coil former machined from polyvinyl chloride. We designed, manufactured, and validated an optimised TMS coil and applied it to brain stimulation. Our simulations indicate that this coil requires less than half the power of a commercial figure-of-eight coil, with a 41% reduction due to the optimised winding geometry and a partial contribution due to our thinner coil former and reduced conductor height. With the optimised coil, the resting motor threshold of abductor pollicis brevis was reached with the capacitor voltage below 600 V and peak current below 3000 A. The described method allows designing practical TMS coils that have considerably higher efficiency than conventional figure-of-eight coils. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Optimisation of rocker sole footwear for prevention of first plantar ulcer: comparison of group-optimised and individually-selected footwear designs.

    PubMed

    Preece, Stephen J; Chapman, Jonathan D; Braunstein, Bjoern; Brüggemann, Gert-Peter; Nester, Christopher J

    2017-01-01

    Appropriate footwear for individuals with diabetes but no ulceration history could reduce the risk of first ulceration. However, individuals who deem themselves at low risk are unlikely to seek out bespoke footwear which is personalised. Therefore, our primary aim was to investigate whether group-optimised footwear designs, which could be prefabricated and delivered in a retail setting, could achieve appropriate pressure reduction, or whether footwear selection must be on a patient-by-patient basis. A second aim was to compare responses to footwear design between healthy participants and people with diabetes in order to understand the transferability of previous footwear research, performed in healthy populations. Plantar pressures were recorded from 102 individuals with diabetes, considered at low risk of ulceration. This cohort included 17 individuals with peripheral neuropathy. We also collected data from 66 healthy controls. Each participant walked in 8 rocker shoe designs (4 apex positions × 2 rocker angles). ANOVA analysis was then used to understand the effect of two design features and descriptive statistics used to identify the group-optimised design. Using 200 kPa as a target, this group-optimised design was then compared to the design identified as the best for each participant (using plantar pressure data). Peak plantar pressure increased significantly as apex position was moved distally and rocker angle reduced ( p  < 0.001). The group-optimised design incorporated an apex at 52% of shoe length, a 20° rocker angle and an apex angle of 95°. With this design 71-81% of peak pressures were below the 200 kPa threshold, both in the full cohort of individuals with diabetes and also in the neuropathic subgroup. Importantly, only small increases (<5%) in this proportion were observed when participants wore footwear which was individually selected. In terms of optimised footwear designs, healthy participants demonstrated the same response as participants with diabetes, despite having lower plantar pressures. This is the first study demonstrating that a group-optimised, generic rocker shoe might perform almost as well as footwear selected on a patient by patient basis in a low risk patient group. This work provides a starting point for clinical evaluation of generic versus personalised pressure reducing footwear.

  5. Heterologous expression of an α-amylase inhibitor from common bean (Phaseolus vulgaris) in Kluyveromyces lactis and Saccharomyces cerevisiae.

    PubMed

    Brain-Isasi, Stephanie; Álvarez-Lueje, Alejandro; Higgins, Thomas Joseph V

    2017-06-15

    Phaseolamin or α-amylase inhibitor 1 (αAI) is a glycoprotein from common beans (Phaseolus vulgaris L.) that inhibits some insect and mammalian α-amylases. Several clinical studies support the beneficial use of bean αAI for control of diabetes and obesity. Commercial extracts of P. vulgaris are available but their efficacy is still under question, mainly because some of these extracts contain antinutritional impurities naturally present in bean seeds and also exhibit a lower specific αAI activity. The production of recombinant αAI makes it possible to overcome these disadvantages and provides a platform for the large-scale production of pure and functional αAI protein for biotechnological and pharmaceutical applications. A synthetic gene encoding αAI from the common bean (Phaseolus vulgaris cv. Pinto) was codon-optimised for expression in yeasts (αAI-OPT) and cloned into the protein expression vectors pKLAC2 and pYES2. The yeasts Kluyveromyces lactis GG799 (and protease-deficient derivatives such as YCT390) and Saccharomyces cerevisiae YPH499 were transformed with the optimised genes and transformants were screened for expression by antibody dot blot. Recombinant colonies of K. lactis YCT390 that expressed and secreted functional αAI into the culture supernatants were selected for further analyses. Recombinant αAI from K. lactis YCT390 was purified using anion-exchange and affinity resins, leading to the recovery of a functional inhibitor. The identity of the purified αAI was confirmed by mass spectrometry. Recombinant clones of S. cerevisiae YPH499 expressed functional αAI intracellularly, but did not secrete the protein. This is the first report describing the heterologous expression of the α-amylase inhibitor 1 (αAI) from P. vulgaris in yeasts. We demonstrated that recombinant strains of K. lactis and S. cerevisiae expressed and processed the αAI precursor into mature and active protein and also showed that K. lactis secretes functional αAI.

  6. A new compound arithmetic crossover-based genetic algorithm for constrained optimisation in enterprise systems

    NASA Astrophysics Data System (ADS)

    Jin, Chenxia; Li, Fachao; Tsang, Eric C. C.; Bulysheva, Larissa; Kataev, Mikhail Yu

    2017-01-01

    In many real industrial applications, the integration of raw data with a methodology can support economically sound decision-making. Furthermore, most of these tasks involve complex optimisation problems, so seeking better solutions is critical. As an intelligent search optimisation algorithm, the genetic algorithm (GA) is an important technique for complex system optimisation, but it has internal drawbacks such as low computational efficiency and premature convergence. Improving the performance of GAs is a vital topic in academic and applied research. In this paper, a new real-coded crossover operator, called the compound arithmetic crossover operator (CAC), is proposed. CAC is used in conjunction with a uniform mutation operator to define a new genetic algorithm, CAC10-GA. This GA is compared with an existing genetic algorithm (AC10-GA) that comprises an arithmetic crossover operator and a uniform mutation operator. To judge the performance of CAC10-GA, two kinds of analysis are performed: first, the convergence of CAC10-GA is analysed using Markov chain theory; second, a pair-wise comparison is carried out between CAC10-GA and AC10-GA on two test problems available in the global optimisation literature. The overall comparative study shows that CAC performs quite well and that CAC10-GA outperforms AC10-GA.
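
    For context, the baseline arithmetic crossover for real-coded GAs, together with a uniform mutation operator, can be sketched as below; the paper's compound arithmetic crossover (CAC) builds on this operator but is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def arithmetic_crossover(parent1, parent2):
    lam = rng.random()                            # blending coefficient in [0, 1]
    child1 = lam * parent1 + (1.0 - lam) * parent2
    child2 = (1.0 - lam) * parent1 + lam * parent2
    return child1, child2

def uniform_mutation(child, low, high, rate=0.1):
    mask = rng.random(child.size) < rate          # genes reset uniformly at random
    child = child.copy()
    child[mask] = rng.uniform(low, high, mask.sum())
    return child

p1, p2 = rng.uniform(-5, 5, 4), rng.uniform(-5, 5, 4)
c1, c2 = arithmetic_crossover(p1, p2)
print(uniform_mutation(c1, -5, 5), uniform_mutation(c2, -5, 5))
```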

  7. A rapid, automated approach to optimisation of multiple reaction monitoring conditions for quantitative bioanalytical mass spectrometry.

    PubMed

    Higton, D M

    2001-01-01

    An improvement to the procedure for the rapid optimisation of mass spectrometry (PROMS), for the development of multiple reaction monitoring (MRM) methods for quantitative bioanalytical liquid chromatography/tandem mass spectrometry (LC/MS/MS), is presented. PROMS is an automated protocol that uses flow-injection analysis (FIA) and AppleScripts to create methods and acquire the data for optimisation. The protocol determines the optimum orifice potential and the MRM conditions for each compound, and finally creates the MRM methods needed for sample analysis. The sensitivities of the MRM methods created by PROMS approach those created manually. MRM method development using PROMS currently takes less than three minutes per compound compared to at least fifteen minutes manually. To further enhance throughput, approaches to MRM optimisation using one injection per compound, two injections per pool of five compounds and one injection per pool of five compounds have been investigated. No significant difference in the optimised instrumental parameters for MRM methods was found between the original PROMS approach and these new methods, which are up to ten times faster. The time taken for an AppleScript to determine the optimum conditions and build the MRM methods is the same with all approaches. Copyright 2001 John Wiley & Sons, Ltd.

  8. Optimised analytical models of the dielectric properties of biological tissue.

    PubMed

    Salahuddin, Saqib; Porter, Emily; Krewer, Finn; O' Halloran, Martin

    2017-05-01

    The interaction of electromagnetic fields with the human body is quantified by the dielectric properties of biological tissues. These properties are incorporated into complex numerical simulations using parametric models such as Debye and Cole-Cole, for the computational investigation of electromagnetic wave propagation within the body. These parameters can be acquired through a variety of optimisation algorithms to achieve an accurate fit to measured data sets. A number of different optimisation techniques have been proposed, but these are often limited by the requirement for initial value estimations or by the large overall error (often up to several percentage points). In this work, a novel two-stage genetic algorithm proposed by the authors is applied to optimise the multi-pole Debye parameters for 54 types of human tissues. The performance of the two-stage genetic algorithm has been examined through a comparison with five other existing algorithms. The experimental results demonstrate that the two-stage genetic algorithm produces an accurate fit to a range of experimental data and efficiently out-performs all other optimisation algorithms under consideration. Accurate values of the three-pole Debye models for 54 types of human tissues, over 500 MHz to 20 GHz, are also presented for reference. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
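
    The sketch below illustrates the fitting task in miniature: a two-pole Debye model is fitted to synthetic complex-permittivity data with SciPy's differential evolution as a stand-in global optimiser. It is not the authors' two-stage genetic algorithm, and the omitted static-conductivity term and the "tissue-like" parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

def debye(params, w):
    # Two-pole Debye model; the static-conductivity term is omitted for brevity.
    eps_inf, d1, tau1, d2, tau2 = params
    return eps_inf + d1 / (1 + 1j * w * tau1) + d2 / (1 + 1j * w * tau2)

freq = np.logspace(np.log10(0.5e9), np.log10(20e9), 60)    # 0.5-20 GHz
w = 2 * np.pi * freq
true_params = [4.0, 35.0, 8e-12, 10.0, 80e-12]             # made-up "tissue-like" values
data = debye(true_params, w)

def misfit(params):
    return np.mean(np.abs(debye(params, w) - data) ** 2)   # complex residual magnitude

bounds = [(1, 10), (1, 60), (1e-12, 1e-10), (1, 30), (1e-11, 1e-9)]
result = differential_evolution(misfit, bounds, seed=3)
print(result.x)
```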

  9. Optimisation of phenolic extraction from Averrhoa carambola pomace by response surface methodology and its microencapsulation by spray and freeze drying.

    PubMed

    Saikia, Sangeeta; Mahnot, Nikhil Kumar; Mahanta, Charu Lata

    2015-03-15

    Optimisation of the extraction of polyphenols from star fruit (Averrhoa carambola) pomace using response surface methodology was carried out. Two variables, viz. temperature (°C) and ethanol concentration (%), with 5 levels (-1.414, -1, 0, +1 and +1.414) were used to design the optimisation model using a central composite rotatable design, where -1.414 and +1.414 are the axial points, -1 and +1 the factorial points, and 0 the centre point of the design. A temperature of 40 °C and an ethanol concentration of 65% were the optimised conditions for the response variables of total phenolic content, ferric reducing antioxidant capacity and 2,2-diphenyl-1-picrylhydrazyl scavenging activity. The reverse phase-high pressure liquid chromatography chromatogram of the polyphenol extract showed eight phenolic acids and ascorbic acid. The extract was then encapsulated with maltodextrin (⩽ DE 20) by spray and freeze drying methods at three different concentrations. The highest encapsulation efficiency was obtained in freeze-dried encapsulates (78-97%). The obtained optimised model could be used for polyphenol extraction from star fruit pomace, and the microencapsulates can be incorporated in different food systems to enhance their antioxidant property. Copyright © 2014 Elsevier Ltd. All rights reserved.
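
    The coded points of a two-factor central composite rotatable design, as described above, can be generated as in the sketch below. The decoding of coded levels into actual temperature and ethanol values assumes, purely for illustration, a design centre of 40 °C / 65% and step sizes that are not stated in the abstract.

```python
import itertools
import numpy as np

alpha = np.sqrt(2)                                    # rotatable axial distance for 2 factors
factorial = list(itertools.product([-1.0, 1.0], repeat=2))
axial = [(-alpha, 0.0), (alpha, 0.0), (0.0, -alpha), (0.0, alpha)]
centre = [(0.0, 0.0)]
coded = np.array(factorial + axial + centre)

# Assumed decoding for illustration: centre 40 degC / 65% ethanol, steps 10 degC / 15%.
temperature = 40 + 10 * coded[:, 0]
ethanol = 65 + 15 * coded[:, 1]
for t, e in zip(temperature, ethanol):
    print(f"temperature {t:5.1f} degC, ethanol {e:5.1f} %")
```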

  10. Analysis of power gating in different hierarchical levels of 2MB cache, considering variation

    NASA Astrophysics Data System (ADS)

    Jafari, Mohsen; Imani, Mohsen; Fathipour, Morteza

    2015-09-01

    This article reintroduces the power gating technique at different hierarchical levels of static random-access memory (SRAM) design, including the cell, row, bank and entire cache memory, in 16 nm fin field-effect transistor (FinFET) technology. Different SRAM cell structures such as 6T, 8T, 9T and 10T are used in the design of a 2MB cache memory. The power reduction of the entire cache memory employing cell-level optimisation is 99.7%, at the expense of area and other stability overheads. The power saving of the cell-level optimisation is 3× (1.2×) higher than power gating at the cache (bank) level due to its superior selectivity. The access delay times are allowed to increase by 4% at the same energy-delay product to achieve the best power reduction for each supply voltage and optimisation level. The results show that row-level power gating is best for optimising the power of the entire cache with the lowest drawbacks. Comparisons of cells show that the cells whose bodies have higher power consumption are the best candidates for the power gating technique in row-level optimisation. The technique has the lowest percentage of saving at the minimum energy point (MEP) of the design. The power gating also improves the variation of power in all structures by at least 70%.

  11. Exopolysaccharide from Lactobacillus fermentum Lf2 and its functional characterization as a yogurt additive.

    PubMed

    Ale, Elisa C; Perezlindo, Marcos J; Burns, Patricia; Tabacman, Eduardo; Reinheimer, Jorge A; Binetti, Ana G

    2016-11-01

    Lactobacillus fermentum Lf2 is a strain which is able to produce high levels (approximately 1 g/l) of crude exopolysaccharide (EPS) when it is grown in optimised conditions. The aim of this work was to characterize the functional aspects of this EPS extract, focusing on its application as a dairy food additive. Our findings are consistent with an EPS extract that acts as a moderate immunomodulator, modifying s-IgA and IL-6 levels in the small intestine when added to yogurt and milk, respectively. Furthermore, this EPS extract, at a dose feasible for use as a food additive, provides protection against Salmonella infection in a murine model, thus representing a mode of action to elicit positive health benefits. In addition, it contributes to the rheological characteristics of yogurt, and could function as a food additive with both technological and functional roles, making possible the production of a new functional yogurt with improved texture.

  12. Statistical modelling for precision agriculture: A case study in optimal environmental schedules for Agaricus Bisporus production via variable domain functional regression.

    PubMed

    Panayi, Efstathios; Peters, Gareth W; Kyriakides, George

    2017-01-01

    Quantifying the effects of environmental factors over the duration of the growing process on Agaricus Bisporus (button mushroom) yields has been difficult, as common functional data analysis approaches require fixed length functional data. The data available from commercial growers, however, is of variable duration, due to commercial considerations. We employ a recently proposed regression technique termed Variable-Domain Functional Regression in order to be able to accommodate these irregular-length datasets. In this way, we are able to quantify the contribution of covariates such as temperature, humidity and water spraying volumes across the growing process, and for different lengths of growing processes. Our results indicate that optimal oxygen and temperature levels vary across the growing cycle and we propose environmental schedules for these covariates to optimise overall yields.

  14. A cost-utility analysis of a comprehensive orthogeriatric care for hip fracture patients, compared with standard of care treatment.

    PubMed

    Ginsberg, Gary; Adunsky, Abraham; Rasooly, Iris

    2013-01-01

    The economic burden associated with hip fractures calls for the investigation of innovative new forms of organisation and integration of services for these patients. The aim was to carry out a cost-utility analysis integrating epidemiological and economic aspects for hip fracture patients treated within a comprehensive orthogeriatric model (COGM) of care, as compared with a standard of care model (SOCM). A demonstration study was conducted in a major tertiary medical centre, operating both a COGM ward and standard orthopaedic and rehabilitation wards. Data were collected on the clinical outcomes and health care costs of the two treatment modalities, in order to calculate the ratio of absolute cost to disability-adjusted life years (DALYs). The COGM model used 23% fewer resources per patient ($14,919 vs. $19,363) than the SOCM model and averted an additional 0.226 DALYs per patient, mainly as a result of lower 1-year mortality rates among COGM patients (14.8% vs. 17.3%). A comprehensive orthogeriatric care modality is therefore more cost-effective, providing additional quality-adjusted life years (QALYs) while using fewer resources compared with the standard of care approach. The results should assist health policy-makers in optimising healthcare use and healthcare planning.
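
    Using only the figures quoted above, a few lines make the cost-utility logic explicit: because the COGM arm both costs less and averts more DALYs, it dominates and no incremental cost-effectiveness ratio is needed. This is a generic dominance check, not the authors' full epidemiological-economic model.

      # Figures taken from the abstract; the dominance check itself is generic.
      cost_cogm, cost_socm = 14_919.0, 19_363.0      # cost per patient (USD)
      dalys_averted = 0.226                          # extra DALYs averted per patient by COGM

      cost_saving = cost_socm - cost_cogm
      relative_saving = cost_saving / cost_socm

      print(f"cost saving per patient: ${cost_saving:,.0f} ({relative_saving:.0%} fewer resources)")
      if cost_saving > 0 and dalys_averted > 0:
          # Lower cost and better outcomes: COGM dominates, so no ICER needs to be computed.
          print("COGM dominates SOCM (cheaper and more effective).")
      else:
          icer = -cost_saving / dalys_averted        # incremental cost per DALY averted
          print(f"ICER: ${icer:,.0f} per DALY averted")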

  15. Hybrid supply chain model for material requirement planning under financial constraints: A case study

    NASA Astrophysics Data System (ADS)

    Curci, Vita; Dassisti, Michele; Josefa, Mula Bru; Manuel, Díaz Madroñero

    2014-10-01

    Supply chain models (SCM) can potentially integrate different aspects of decision making to support enterprise management tasks. The aim of this paper is to propose a hybrid mathematical programming model for the optimisation of production requirements and resources planning. The preliminary model was conceived bottom-up from a real industrial case and is oriented towards maximising cash flow. Despite the intense computational effort required to converge to a solution, the optimisation produced good results for the objective function.
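
    The abstract does not give the model itself, so the following is only a toy production-planning linear programme in the same spirit: maximise the cash contribution of a multi-period plan subject to demand, capacity and a per-period cash budget. All figures and variable names are hypothetical.

      import numpy as np
      from scipy.optimize import linprog

      # Hypothetical 4-period plan for one product (all numbers invented for illustration).
      periods = 4
      price, unit_cost = 120.0, 80.0
      demand = np.array([50, 70, 60, 90], dtype=float)   # maximum sellable units per period
      capacity = 75.0                                    # production capacity per period
      budget = 5200.0                                    # cash available for production costs per period

      margin = price - unit_cost
      c = -margin * np.ones(periods)                     # linprog minimises, so negate the cash margin

      # Per-period constraints: x_t <= demand_t, x_t <= capacity, unit_cost * x_t <= budget
      A_ub = np.vstack([np.eye(periods), np.eye(periods), unit_cost * np.eye(periods)])
      b_ub = np.concatenate([demand, capacity * np.ones(periods), budget * np.ones(periods)])

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * periods, method="highs")
      print("production plan:", np.round(res.x, 1))
      print("cash contribution:", -res.fun)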

  16. hydroPSO: A Versatile Particle Swarm Optimisation R Package for Calibration of Environmental Models

    NASA Astrophysics Data System (ADS)

    Zambrano-Bigiarini, M.; Rojas, R.

    2012-04-01

    Particle Swarm Optimisation (PSO) is a recent and powerful population-based stochastic optimisation technique inspired by the social behaviour of bird flocking, which shares similarities with other evolutionary techniques such as Genetic Algorithms (GA). In PSO, however, each individual of the population, known as a particle in PSO terminology, adjusts its flying trajectory on the multi-dimensional search space according to its own experience (best-known personal position) and that of its neighbours in the swarm (best-known local position). PSO has recently received a surge of attention given its flexibility, ease of programming, low memory and CPU requirements, and efficiency. Despite these advantages, PSO may still get trapped in sub-optimal solutions, or suffer from swarm explosion or premature convergence. Thus, the development of enhancements to the "canonical" PSO is an active area of research. To date, several modifications to the canonical PSO have been proposed in the literature, resulting in a large and dispersed collection of codes and algorithms which might well be used for similar if not identical purposes. In this work we present hydroPSO, a platform-independent R package implementing several enhancements to the canonical PSO that we consider of utmost importance to bring this technique to the attention of a broader community of scientists and practitioners. hydroPSO is model-independent, allowing the user to interface any model code with the calibration engine without having to invest considerable effort in customizing PSO to a new calibration problem. Some of the controlling options to fine-tune hydroPSO are: four alternative topologies, several types of inertia weight, time-variant acceleration coefficients, time-variant maximum velocity, regrouping of particles when premature convergence is detected, different types of boundary conditions and many others. Additionally, hydroPSO implements recent PSO variants such as Improved Particle Swarm Optimisation (IPSO), Fully Informed Particle Swarm (FIPS), and weighted FIPS (wFIPS). Finally, an advanced sensitivity analysis using the Latin Hypercube One-At-a-Time (LH-OAT) method and user-friendly plotting summaries facilitate the interpretation and assessment of the calibration/optimisation results. We validate hydroPSO against the standard PSO algorithm (SPSO-2007) employing five test functions commonly used to assess the performance of optimisation algorithms. Additionally, we illustrate how the performance of the optimisation/calibration engine is boosted by using several of the fine-tuning options included in hydroPSO. Finally, we show how to interface SWAT-2005 with hydroPSO to calibrate a semi-distributed hydrological model for the Ega River basin in Spain, and how to interface MODFLOW-2000 with hydroPSO to calibrate a groundwater flow model for the regional aquifer of the Pampa del Tamarugal in Chile. We limit the applications of hydroPSO to case studies dealing with surface water and groundwater models as these two are the authors' areas of expertise. However, given the flexibility of hydroPSO, we believe this package can be applied to any model code requiring some form of parameter estimation.
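
    hydroPSO itself is an R package; purely as an illustration of the canonical update the abstract describes (inertia plus personal-best and swarm-best attraction), a minimal global-best PSO in Python might look as follows. The test function, swarm size and coefficients are arbitrary choices, not the package defaults.

      import numpy as np

      def sphere(x):
          """A common test function: minimum 0 at the origin."""
          return float(np.sum(x ** 2))

      def pso(objective, dim=5, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
          rng = np.random.default_rng(seed)
          pos = rng.uniform(-5.0, 5.0, size=(n_particles, dim))
          vel = np.zeros_like(pos)
          pbest = pos.copy()
          pbest_val = np.array([objective(p) for p in pos])
          gbest = pbest[np.argmin(pbest_val)].copy()

          for _ in range(iters):
              r1 = rng.random((n_particles, dim))
              r2 = rng.random((n_particles, dim))
              # velocity update: inertia + cognitive (personal best) + social (global best) terms
              vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
              pos = pos + vel
              vals = np.array([objective(p) for p in pos])
              improved = vals < pbest_val
              pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
              gbest = pbest[np.argmin(pbest_val)].copy()
          return gbest, float(np.min(pbest_val))

      best_x, best_f = pso(sphere)
      print("best value found:", best_f)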

  17. Multiple response optimisation of processing and formulation parameters of pH sensitive sustained release pellets of capecitabine for targeting colon.

    PubMed

    Pandey, Sonia; Swamy, S M Vijayendra; Gupta, Arti; Koli, Akshay; Patel, Swagat; Maulvi, Furqan; Vyas, Bhavin

    2018-04-29

    To optimise Eudragit/Surelease®-coated pH-sensitive pellets for controlled and targeted drug delivery to the colon tissue and to avoid the frequent high dosing and associated side effects which restrict its use in colorectal-cancer therapy. The pellets were prepared using an extrusion-spheronisation technique. Box-Behnken and 3² full factorial designs were applied to optimise the process parameters (extruder sieve size, spheroniser speed and spheroniser time) and the coating levels (%w/v of Eudragit S100/Eudragit L100 and Surelease®), respectively, to achieve smooth pellets of optimised size with sustained drug delivery and no premature drug release in the upper gastrointestinal tract (GIT). The design proposed the optimised batch with the independent variables set at extruder sieve size (X1 = 1 mm), spheroniser speed (X2 = 900 revolutions per minute, rpm) and spheroniser time (X3 = 15 min), achieving a pellet size of 0.96 mm, an aspect ratio of 0.98 and a roundness of 97.42%. The 16% w/v coating strength of Surelease® and 13% w/v coating strength of Eudragit showed pH-dependent sustained release up to 22.35 h (t99%). The organ distribution study showed the absence of the drug in the upper GIT tissue and the presence of a high level of capecitabine in the caecum and colon tissue. Thus, the outer Eudragit coat prevents drug release in the stomach and the inner Surelease® coat provides sustained drug release in the colon tissue. The study demonstrates the potential of the optimised Eudragit/Surelease®-coated capecitabine pellets as an effective colon-targeted delivery system to avoid frequent high dosing and the associated systemic side effects of the drug.

  18. Multiple utility constrained multi-objective programs using Bayesian theory

    NASA Astrophysics Data System (ADS)

    Abbasian, Pooneh; Mahdavi-Amiri, Nezam; Fazlollahtabar, Hamed

    2018-03-01

    A utility function is an important tool for representing a decision maker's (DM's) preferences. We adjoin utility functions to multi-objective optimization problems. In current studies, usually one utility function is used for each objective function. Situations may arise, however, in which a goal has multiple utility functions. Here, we consider a constrained multi-objective problem with each objective having multiple utility functions. We induce the probability of the utilities for each objective function using Bayesian theory. Illustrative examples considering dependence and independence of variables are worked through to demonstrate the usefulness of the proposed model.
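
    A toy reading of the idea, not the authors' formulation: each objective carries several candidate utility functions with (assumed) posterior probabilities, and the problem is scalarised by maximising the posterior-expected utility. The objectives, utilities and probabilities below are all invented.

      import numpy as np
      from scipy.optimize import minimize_scalar

      # Two objectives of a single decision variable x (both to be maximised), purely illustrative.
      def f1(x): return -(x - 1.0) ** 2 + 1.0
      def f2(x): return -(x - 3.0) ** 2 + 1.0

      # Several candidate utility functions per objective, with posterior probabilities
      # (assumed to come from a Bayesian analysis of the decision maker's preferences).
      utilities_1 = [(0.7, lambda v: v), (0.3, lambda v: 1 - np.exp(-v))]    # objective 1
      utilities_2 = [(0.5, lambda v: v), (0.5, lambda v: np.tanh(v))]        # objective 2

      def expected_utility(x):
          eu1 = sum(p * u(f1(x)) for p, u in utilities_1)
          eu2 = sum(p * u(f2(x)) for p, u in utilities_2)
          return eu1 + eu2                                # equal weights across objectives

      res = minimize_scalar(lambda x: -expected_utility(x), bounds=(0.0, 4.0), method="bounded")
      print("compromise solution x* =", round(res.x, 3))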

  19. Least cost pathways to a low carbon electricity system for Australia: impacts of transmission augmentation and extension

    NASA Astrophysics Data System (ADS)

    Dargaville, R. J.

    2016-12-01

    Designing the pathway to a low carbon energy system is complex, requiring consideration of the variable nature of renewables at the hourly timescale, the emission intensity and ramp rate constraints of dispatchable technologies (both fossil and renewable), and transmission and distribution network limitations. In this work, an optimisation framework taking these considerations into account has been applied to find the lowest cost ways to reduce carbon emissions by either 80% or 100% by 2050 while keeping the system operating reliably along the way. Technologies included are existing and advanced coal and gas technologies (with and without carbon capture and storage), rooftop PV, utility scale PV, concentrating solar thermal, hydro with and without pumped storage, bioenergy, and nuclear. In this study we also allow the optimisation to increase transmission capacity along existing lines, and to extend key trunk lines into currently unserved areas. These augmentations and extensions come at a cost: the optimisation chooses these options when the benefits of accessing high quality renewable energy resources outweigh the costs. Results show that for the 80% emission reduction case, there is limited need for transmission capacity increases, and the existing grid copes well with the increased flows due to conversion to distributed renewable energy resources. However, in the 100% case the increased reliance on renewables means that significant transmission augmentation is beneficial to the overall cost. This strongly suggests that it is important to understand the long term emission target early so that infrastructure investments can be optimised.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Mingyuan; Mead, James; Wu, Yueqin

    In this study, a nanoindentation-based microcantilever bending technique was utilized to investigate the interfacial properties of a β-Mg17Al12/AZ91 Mg alloy film/substrate system under tensile loading conditions. Finite element analysis (FEA) was first undertaken to optimise the design of cantilever structures for inducing high tensile stresses at the interface. Cantilevers consisting of a necked region or notch at the interface were determined to be the most successful designs. Microcantilevers containing the β-Mg17Al12/AZ91 interface were then made using a focused ion beam (FIB) milling technique. Necks were made in the cantilevers to intensify the tension at the interface, and notches were used to introduce a stress concentration to the interface. During bending, the cantilevers were deflected to failure. Subsequent analysis of the deformed cantilevers using electron microscopy revealed that plastic deformation, and subsequent ductile rupture, of the AZ91 phase was the dominant failure mechanism. When the β-Mg17Al12/AZ91 film/substrate system was subjected to tension, the softer AZ91 phase failed prior to interfacial delamination, demonstrating that the strength of the interface exceeded the stresses that caused ductile failure in the substrate material. - Highlights: •Microcantilever bending was used to study the property of the film/substrate interface. •FEA was used to optimise cantilever design for achieving high interfacial tension. •The intermetallic coatings on the AZ91 substrate have strong interfacial adhesion.

  1. Optimisation of process parameters on thin shell part using response surface methodology (RSM)

    NASA Astrophysics Data System (ADS)

    Faiz, J. M.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.

    2017-09-01

    This study focuses on the optimisation of process parameters by simulation using Autodesk Moldflow Insight (AMI) software. The process parameters are taken as the input in order to analyse the warpage value, which is the output of this study. The significant parameters used are melt temperature, mould temperature, packing pressure and cooling time. A plastic part made of polypropylene (PP) has been selected as the study part. Optimisation of the process parameters is carried out in Design Expert software with the aim of minimising the warpage value. Response Surface Methodology (RSM) has been applied in this study together with Analysis of Variance (ANOVA) in order to investigate the interactions between parameters that are significant to the warpage value. Thus, an optimised warpage value can be obtained from the model designed using RSM, owing to its minimal error value. The study shows that the warpage value is improved by using RSM.
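
    A minimal sketch of the RSM step, assuming only two of the four parameters and synthetic data in place of the Moldflow/Design Expert results: fit a second-order polynomial response surface to (parameter, warpage) observations and minimise the fitted surface within the parameter bounds.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(1)

      # Hypothetical DOE results: (melt temperature [deg C], packing pressure [MPa]) -> warpage [mm]
      def true_warp(t, p):
          return 0.5 + 0.0004 * (t - 225) ** 2 + 0.0008 * (p - 75) ** 2 - 1e-5 * (t - 225) * (p - 75)

      X = rng.uniform([200, 60], [240, 100], size=(20, 2))
      y = true_warp(X[:, 0], X[:, 1]) + rng.normal(0, 0.01, size=20)

      # Second-order (quadratic) response surface: 1, t, p, t^2, p^2, t*p
      def design(X):
          t, p = X[:, 0], X[:, 1]
          return np.column_stack([np.ones_like(t), t, p, t ** 2, p ** 2, t * p])

      beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)

      def predicted_warpage(x):
          return float((design(np.atleast_2d(x)) @ beta)[0])

      res = minimize(predicted_warpage, x0=[220.0, 80.0], bounds=[(200, 240), (60, 100)])
      print("predicted optimum (melt temp, packing pressure):", np.round(res.x, 1))
      print("predicted minimum warpage:", round(float(res.fun), 3))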

  2. Letter to the editor concerning the article "Effects of acoustic feedback training in elite-standard Para-Rowing" by Schaffert and Mattes (2015).

    PubMed

    Hill, Holger

    2015-01-01

    In a case study, Schaffert and Mattes reported the application of acoustic feedback (sonification) to optimise the time course of boat acceleration. The authors attributed an increased boat speed in the feedback condition to an optimised boat acceleration (mainly during the recovery phase). However, in rowing it is biomechanically impossible to increase the boat speed significantly by reducing the fluctuations in boat acceleration during the rowing cycle. To assess such a potentially small optimising effect experimentally, the confounding variables must be controlled very accurately (in particular, the propulsive forces must be kept constant between experimental conditions, or the differences in propulsive forces between conditions must be much smaller than the effects on boat speed resulting from an optimised movement pattern). However, this was not controlled adequately by the authors. Instead, the presented boat acceleration data show that the increased boat speed under acoustic feedback was due to increased propulsive forces.

  3. Optimisation of composite bone plates for ulnar transverse fractures.

    PubMed

    Chakladar, N D; Harper, L T; Parsons, A J

    2016-04-01

    Metallic bone plates are commonly used for arm bone fractures where conservative treatment (casts) cannot provide adequate support and compression at the fracture site. These plates, made of stainless steel or titanium alloys, tend to shield stress transfer at the fracture site and delay the bone healing rate. This study investigates the feasibility of adopting advanced composite materials to overcome stress shielding effects by optimising the geometry and mechanical properties of the plate to match the bone more closely. An ulnar transverse fracture is characterised and finite element techniques are employed to investigate the feasibility of a composite-plated fractured bone construct over a stainless steel equivalent. Numerical models of intact and fractured bones are analysed and the mechanical behaviour is found to agree with experimental data. The mechanical properties are tailored to produce an optimised composite plate, offering a 25% reduction in length and a 70% reduction in mass. The optimised design may help to reduce stress shielding and increase bone healing rates. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Thermal energy and economic analysis of a PCM-enhanced household envelope considering different climate zones in Morocco

    NASA Astrophysics Data System (ADS)

    Kharbouch, Yassine; Mimet, Abdelaziz; El Ganaoui, Mohammed; Ouhsaine, Lahoucine

    2018-07-01

    This study investigates the thermal energy potential and economic feasibility of integrating phase change materials (PCMs) into an air-conditioned family household, considering different climate zones in Morocco. A simulation-based optimisation was carried out in order to define the optimal design of a PCM-enhanced household envelope for the thermal energy effectiveness and cost-effectiveness of predefined candidate solutions. The optimisation methodology is based on coupling Energyplus® as a dynamic simulation tool and GenOpt® as an optimisation tool. Considering the obtained optimum design strategies, a thermal energy and economic analysis is carried out to investigate the feasibility of integrating PCMs in Moroccan constructions. The results show that the PCM-integrated household envelope reduces the cooling/heating thermal energy demand relative to a reference household without PCM. For the cost-effectiveness optimisation, however, it has been deduced that economic feasibility is still insufficient under current PCM market conditions. The optimal design parameter results are also analysed.

  5. Paediatric CT protocol optimisation: a design of experiments to support the modelling and optimisation process.

    PubMed

    Rani, K; Jahnen, A; Noel, A; Wolf, D

    2015-07-01

    In the last decade, several studies have emphasised the need to understand and optimise computed tomography (CT) procedures in order to reduce the radiation dose applied to paediatric patients. To evaluate the influence of the technical parameters on the radiation dose and the image quality, a statistical model has been developed using the design of experiments (DOE) method, which has been successfully used in various fields (industry, biology and finance), applied here to CT procedures for the abdomen of paediatric patients. A Box-Behnken DOE was used in this study. Three mathematical models (contrast-to-noise ratio, noise and CTDIvol) depending on three factors (tube current, tube voltage and level of iterative reconstruction) were developed and validated. They will serve as a basis for the development of a CT protocol optimisation model. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
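
    For reference, a three-factor Box-Behnken design consists of the twelve edge mid-points of the coded cube plus centre runs; the sketch below constructs it directly. The factor names follow the abstract, while the number of centre points is an assumption.

      from itertools import combinations
      import numpy as np

      def box_behnken(n_factors=3, n_center=3):
          """Coded Box-Behnken design: for each pair of factors take the four (+/-1) combinations,
          keep the remaining factors at 0, then append centre runs."""
          runs = []
          for i, j in combinations(range(n_factors), 2):
              for a in (-1, 1):
                  for b in (-1, 1):
                      row = [0] * n_factors
                      row[i], row[j] = a, b
                      runs.append(row)
          runs.extend([[0] * n_factors for _ in range(n_center)])
          return np.array(runs)

      design = box_behnken()
      factors = ["tube current", "tube voltage", "iterative reconstruction level"]  # from the abstract
      print(factors)
      print(design)          # 12 factorial runs + 3 centre points = 15 runs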

  6. Evolving optimised decision rules for intrusion detection using particle swarm paradigm

    NASA Astrophysics Data System (ADS)

    Sivatha Sindhu, Siva S.; Geetha, S.; Kannan, A.

    2012-12-01

    The aim of this article is to construct a practical intrusion detection system (IDS) that properly analyses the statistics of network traffic patterns and classifies them as normal or anomalous. The objective of this article is to prove that the choice of effective network traffic features and a proficient machine-learning paradigm enhances the detection accuracy of an IDS. In this article, a rule-based approach with a family of six decision tree classifiers, namely Decision Stump, C4.5, Naive Bayes Tree, Random Forest, Random Tree and Representative Tree models, is introduced to perform the detection of anomalous network patterns. In particular, the proposed swarm optimisation-based approach selects the instances that compose the training set, and an optimised decision tree operates over this training set, producing classification rules with improved coverage, classification capability and generalisation ability. Experiments with the Knowledge Discovery and Data Mining (KDD) data set, which contains information on traffic patterns during normal and intrusive behaviour, show that the proposed algorithm produces optimised decision rules and outperforms other machine-learning algorithms.

  7. Multi-objective thermodynamic optimisation of supercritical CO2 Brayton cycles integrated with solar central receivers

    NASA Astrophysics Data System (ADS)

    Vasquez Padilla, Ricardo; Soo Too, Yen Chean; Benito, Regano; McNaughton, Robbie; Stein, Wes

    2018-01-01

    In this paper, optimisation of supercritical CO2 Brayton cycles integrated with a solar receiver, which provides heat input to the cycle, was performed. Four S-CO2 Brayton cycle configurations were analysed and optimum operating conditions were obtained by using a multi-objective thermodynamic optimisation. Four different sets, each including two objective parameters, were considered individually. The individual multi-objective optimisation was performed by using the Non-dominated Sorting Genetic Algorithm. The effect of reheating, solar receiver pressure drop and cycle parameters on the overall exergy and cycle thermal efficiency was analysed. The results showed that, for all configurations, the overall exergy efficiency of the solarised systems reached a maximum value between 700°C and 750°C and the optimum value is adversely affected by the solar receiver pressure drop. In addition, the optimum cycle high pressure was in the range of 24.2-25.9 MPa, depending on the configuration and reheat condition.
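
    The multi-objective step rests on non-dominated (Pareto) sorting. The sketch below extracts the Pareto front of randomly generated candidate designs scored on two objectives to be maximised; it is not the NSGA-II implementation used in the study, and the candidate scores are placeholders.

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical candidate cycle designs scored on two objectives to be maximised
      # (e.g. overall exergy efficiency and cycle thermal efficiency).
      scores = rng.uniform(0.3, 0.6, size=(200, 2))

      def pareto_front(points):
          """Return the indices of non-dominated points (both objectives maximised)."""
          idx = []
          for i, p in enumerate(points):
              dominated = np.any(np.all(points >= p, axis=1) & np.any(points > p, axis=1))
              if not dominated:
                  idx.append(i)
          return np.array(idx)

      front = pareto_front(scores)
      print(f"{front.size} non-dominated designs out of {scores.shape[0]}")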

  8. An integrated and dynamic optimisation model for the multi-level emergency logistics network in anti-bioterrorism system

    NASA Astrophysics Data System (ADS)

    Liu, Ming; Zhao, Lindu

    2012-08-01

    Demand for emergency resources is usually uncertain and varies quickly in an anti-bioterrorism system. Moreover, emergency resources that have been allocated to the epidemic areas in the early rescue cycle will affect the demand later. In this article, an integrated and dynamic optimisation model with time-varying demand based on the epidemic diffusion rule is constructed. The heuristic algorithm coupled with the MATLAB mathematical programming solver is adopted to solve the optimisation model. In what follows, the application of the optimisation model as well as a short sensitivity analysis of the key parameters in the time-varying demand forecast model is presented. The results show that both the model and the solution algorithm are useful in practice, and that both objectives of inventory level and emergency rescue cost can be controlled effectively. Thus, it can provide guidelines for decision makers when coping with emergency rescue problems under uncertain demand, and offers an excellent reference when issues pertain to bioterrorism.

  9. Losses of nutrients and anti-nutrients in red and white sorghum cultivars after decorticating in optimised conditions.

    PubMed

    Galán, María Gimena; Llopart, Emilce Elina; Drago, Silvina Rosa

    2018-05-01

    The aims were to optimise the pearling process of red and white sorghum by assessing the effects of pearling time and grain moisture on endosperm yield and flour ash content, and to assess the nutrient and anti-nutrient losses produced by pearling different cultivars under optimised conditions. Both variables significantly affected both responses. Losses of ashes (58%), proteins (9.5%), lipids (54.5%), Na (37%), Mg (48.5%) and phenolic compounds (43%) were similar among red and white hybrids. However, losses of P (30% vs. 51%), phytic acid (47% vs. 66%), Fe (22% vs. 55%), Zn (32% vs. 62%), Ca (60% vs. 66%), K (46% vs. 61%) and Cu (51% vs. 71%) were lower for red than for white sorghum due to different degrees of extraction and distribution of components in the grain. The optimised pearling conditions were extrapolated to other hybrids, indicating that these criteria could be applied at an industrial level to obtain refined flours with proper quality and good endosperm yields.

  10. Optimisation of sensing time and transmission time in cognitive radio-based smart grid networks

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Fu, Yuli; Yang, Junjie

    2016-07-01

    Cognitive radio (CR)-based smart grid (SG) networks have been widely recognised as emerging communication paradigms in power grids. However, sufficient spectrum resources and reliability are two major challenges for real-time applications in CR-based SG networks. In this article, we study the traffic data collection problem. Based on the two-stage power pricing model, the power price is associated with the effectively received traffic data in a meter data management system (MDMS). In order to minimise the system power price, a wideband hybrid access strategy is proposed and analysed to share the spectrum between the SG nodes and CR networks. The sensing time and transmission time are jointly optimised, while both the interference to primary users and the spectrum opportunity loss of secondary users are considered. Two algorithms are proposed to solve the joint optimisation problem. Simulation results show that the proposed joint optimisation algorithms outperform the fixed-parameter (sensing time and transmission time) algorithms, and the power cost is reduced effectively.

  11. A generic methodology for the optimisation of sewer systems using stochastic programming and self-optimizing control.

    PubMed

    Mauricio-Iglesias, Miguel; Montero-Castro, Ignacio; Mollerup, Ane L; Sin, Gürkan

    2015-05-15

    The design of sewer system control is a complex task given the large size of sewer networks, the transient dynamics of the water flow and the stochastic nature of rainfall. This contribution presents a generic methodology for the design of a self-optimising controller in sewer systems. Such a controller aims to keep the system close to optimal performance through an optimal selection of controlled variables. Optimal performance was defined by a two-stage optimisation (stochastic and deterministic) that takes into account both the overflow during the current rain event and the expected overflow given the probability of a future rain event. The methodology is successfully applied to design an optimising control strategy for a subcatchment area in Copenhagen. The results are promising and are expected to contribute to advances in the operation and control of sewer systems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Optimisation of synergistic biomass-degrading enzyme systems for efficient rice straw hydrolysis using an experimental mixture design.

    PubMed

    Suwannarangsee, Surisa; Bunterngsook, Benjarat; Arnthong, Jantima; Paemanee, Atchara; Thamchaipenet, Arinthip; Eurwilaichitr, Lily; Laosiripojana, Navadol; Champreda, Verawat

    2012-09-01

    A synergistic enzyme system for the hydrolysis of alkali-pretreated rice straw was optimised based on the synergy of crude fungal enzyme extracts with a commercial cellulase (Celluclast™). Among 13 enzyme extracts, the enzyme preparation from Aspergillus aculeatus BCC 199 exhibited the highest level of synergy with Celluclast™. This synergy was based on the complementary cellulolytic and hemicellulolytic activities of the BCC 199 enzyme extract. A mixture design was used to optimise the ternary enzyme complex based on the synergistic enzyme mixture with Bacillus subtilis expansin. Using the full cubic model, the optimal formulation of the enzyme mixture was predicted to be Celluclast™:BCC 199:expansin = 41.4:37.0:21.6 (%), which produced 769 mg reducing sugar/g biomass using 2.82 FPU/g of enzymes. This work demonstrated the use of a systematic approach for the design and optimisation of a synergistic enzyme mixture of fungal enzymes and expansin for lignocellulosic degradation. Copyright © 2012 Elsevier Ltd. All rights reserved.
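
    A sketch of the mixture-optimisation step under stated assumptions: the three component proportions must sum to one, and an invented Scheffé-type quadratic blending model stands in for the fitted full cubic model; the coefficients and the predicted yield are therefore illustrative only.

      import numpy as np
      from scipy.optimize import minimize

      # Invented Scheffé-type quadratic blending model for reducing-sugar yield (mg/g biomass):
      # y = b1*x1 + b2*x2 + b3*x3 + b12*x1*x2 + b13*x1*x3 + b23*x2*x3,
      # with x = (Celluclast, BCC 199, expansin) proportions.
      b = dict(b1=650.0, b2=600.0, b3=150.0, b12=500.0, b13=300.0, b23=400.0)

      def yield_model(x):
          x1, x2, x3 = x
          return (b["b1"] * x1 + b["b2"] * x2 + b["b3"] * x3
                  + b["b12"] * x1 * x2 + b["b13"] * x1 * x3 + b["b23"] * x2 * x3)

      cons = ({"type": "eq", "fun": lambda x: np.sum(x) - 1.0},)   # proportions sum to 1
      bounds = [(0.0, 1.0)] * 3

      res = minimize(lambda x: -yield_model(x), x0=[1 / 3] * 3, bounds=bounds, constraints=cons)
      print("optimal proportions (Celluclast, BCC 199, expansin):", np.round(res.x, 3))
      print("predicted yield:", round(-res.fun, 1))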

  13. Metric optimisation for analogue forecasting by simulated annealing

    NASA Astrophysics Data System (ADS)

    Bliefernicht, J.; Bárdossy, A.

    2009-04-01

    It is well known that weather patterns tend to recur from time to time. This property of the atmosphere is used by analogue forecasting techniques. They have a long history in weather forecasting, and there are many applications predicting hydrological variables at the local scale for different lead times. The basic idea of the technique is to identify past weather situations which are similar (analogue) to the predicted one and to take the local conditions of the analogues as the forecast. But the forecast performance of the analogue method depends on user-defined criteria like the choice of the distance function and the size of the predictor domain. In this study we propose a new methodology of optimising both criteria by minimising the forecast error with simulated annealing. The performance of the methodology is demonstrated for the probability forecast of daily areal precipitation. It is compared with a traditional analogue forecasting algorithm, which is used operationally as an element of a hydrological forecasting system. The study is performed for several meso-scale catchments located in the Rhine basin in Germany. The methodology is validated by a jack-knife method in a perfect prognosis framework for a period of 48 years (1958-2005). The predictor variables are derived from the NCEP/NCAR reanalysis data set. The Brier skill score and the economic value are determined to evaluate the forecast skill and value of the technique. In this presentation we will present the concept of the optimisation algorithm and the outcome of the comparison. It will also be demonstrated how a decision maker should apply a probability forecast to maximise the economic benefit from it.
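
    A toy version of the optimisation idea, not the operational setup: simulated annealing adjusts the weights of a weighted Euclidean distance used to select the nearest analogue, so as to minimise the forecast error on a hold-out set. The predictors, precipitation series and annealing schedule are synthetic placeholders.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic archive: 500 past days with 4 standardised predictors and areal precipitation.
      predictors = rng.normal(size=(500, 4))
      precip = (2.0 * np.maximum(predictors[:, 0], 0)
                + 0.5 * np.maximum(predictors[:, 1], 0)
                + rng.gamma(1.0, 0.3, size=500))
      train, test = np.arange(0, 400), np.arange(400, 500)

      def forecast_error(weights):
          """Mean absolute error of 1-nearest-analogue forecasts under a weighted Euclidean metric."""
          w = np.abs(weights)
          err = 0.0
          for t in test:
              d = np.sqrt(((predictors[train] - predictors[t]) ** 2 * w).sum(axis=1))
              err += abs(precip[train][np.argmin(d)] - precip[t])
          return err / len(test)

      # Simulated annealing over the metric weights.
      w = np.ones(4)
      best_w, best_e = w.copy(), forecast_error(w)
      current_e, temp = best_e, 1.0
      for step in range(300):
          cand = np.abs(w + rng.normal(0.0, 0.2, size=4))
          e = forecast_error(cand)
          if e < current_e or rng.random() < np.exp(-(e - current_e) / temp):
              w, current_e = cand, e
              if e < best_e:
                  best_w, best_e = cand.copy(), e
          temp *= 0.99                               # geometric cooling schedule

      print("optimised metric weights:", np.round(best_w, 2), " MAE:", round(best_e, 3))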

  14. A method to incorporate the effect of beam quality on image noise in a digitally reconstructed radiograph (DRR) based computer simulation for optimisation of digital radiography

    NASA Astrophysics Data System (ADS)

    Moore, Craig S.; Wood, Tim J.; Saunderson, John R.; Beavis, Andrew W.

    2017-09-01

    The use of computer simulated digital x-radiographs for optimisation purposes has become widespread in recent years. To make these optimisation investigations effective, it is vital that simulated radiographs contain accurate anatomical and system noise. Computer algorithms that simulate radiographs based solely on the incident detector x-ray intensity (‘dose’) have been reported extensively in the literature. However, while it has been established for digital mammography that x-ray beam quality is an important factor when modelling noise in simulated images, there are no such studies for diagnostic imaging of the chest, abdomen and pelvis. This study investigates the influence of beam quality on image noise in a digital radiography (DR) imaging system, and incorporates these effects into a digitally reconstructed radiograph (DRR) computer simulator. Image noise was measured on a real DR imaging system as a function of dose (absorbed energy) over a range of clinically relevant beam qualities. Simulated ‘absorbed energy’ and ‘beam quality’ DRRs were then created for each patient and tube voltage under investigation. Simulated noise images, corrected for dose and beam quality, were subsequently produced from the absorbed energy and beam quality DRRs, using the measured noise, absorbed energy and beam quality relationships. The noise images were superimposed onto the noiseless absorbed energy DRRs to create the final images. Signal-to-noise measurements in simulated chest, abdomen and spine images were within 10% of the corresponding measurements in real images. This compares favourably to our previous algorithm, where images corrected for dose only were all within 20%.

  15. Focal ratio degradation: a new perspective

    NASA Astrophysics Data System (ADS)

    Haynes, Dionne M.; Withford, Michael J.; Dawes, Judith M.; Haynes, Roger; Bland-Hawthorn, Joss

    2008-07-01

    We have developed an alternative FRD empirical model for the parallel laser beam technique which can accommodate contributions from both scattering and modal diffusion. It is consistent with scattering inducing a Lorentzian contribution and modal diffusion inducing a Gaussian contribution. The convolution of these two functions produces a Voigt function which is shown to better simulate the observed behavior of the FRD distribution and provides a greatly improved fit over the standard Gaussian fitting approach. The Voigt model can also be used to quantify the amount of energy displaced by FRD, therefore allowing astronomical instrument scientists to identify, quantify and potentially minimize the various sources of FRD, and optimise the fiber and instrument performance.
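
    The Voigt function is the convolution of a Gaussian and a Lorentzian and is available as scipy.special.voigt_profile; the sketch below fits it to a synthetic intensity profile to illustrate separating the two contributions. The data are simulated, not measured FRD profiles.

      import numpy as np
      from scipy.special import voigt_profile
      from scipy.optimize import curve_fit

      def voigt(x, amplitude, center, sigma, gamma):
          """Voigt profile: convolution of a Gaussian (sigma) and a Lorentzian (gamma)."""
          return amplitude * voigt_profile(x - center, sigma, gamma)

      # Synthetic far-field intensity profile with Gaussian and Lorentzian broadening plus noise.
      rng = np.random.default_rng(4)
      x = np.linspace(-5.0, 5.0, 400)
      y = voigt(x, 1.0, 0.0, 0.6, 0.3) + rng.normal(0.0, 0.005, size=x.size)

      popt, _ = curve_fit(voigt, x, y, p0=[1.0, 0.0, 1.0, 0.1],
                          bounds=([0.0, -1.0, 0.0, 0.0], [10.0, 1.0, 5.0, 5.0]))
      amp, cen, sigma, gamma = popt
      print(f"fitted sigma (modal diffusion / Gaussian part): {sigma:.3f}")
      print(f"fitted gamma (scattering / Lorentzian part):    {gamma:.3f}")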

  16. Pearson's Functions to Describe FSW Weld Geometry

    NASA Astrophysics Data System (ADS)

    Lacombe, D.; Gutierrez-Orrantia, M. E.; Coupard, D.; Tcherniaeff, S.; Girot, F.

    2011-01-01

    Friction stir welding (FSW) is a relatively new joining technique, particularly for aluminium alloys that are difficult to fusion weld. In this study, the geometry of the weld has been investigated and modelled using Pearson's functions. It has been demonstrated that the Pearson's parameters (mean, standard deviation, skewness, kurtosis and geometric constant) can be used to characterize the weld geometry and the tensile strength of the weld assembly. Pearson's parameters and process parameters are strongly correlated, allowing a process control procedure to be defined for FSW assemblies which makes radiographic or ultrasonic inspection unnecessary. Finally, an optimisation using a Generalized Gradient Method allows the weld geometry that maximises the tensile strength of the assembly to be determined.

  17. Optimisation of cavity parameters for lasers based on AlGaInAsP/InP solid solutions (λ = 1470 nm)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veselov, D A; Ayusheva, K R; Shashkin, I S

    2015-10-31

    We have studied the effect of laser cavity parameters on the light–current characteristics of lasers based on the AlGaInAs/GaInAsP/InP solid solution system that emit in the spectral range 1400 – 1600 nm. It has been shown that optimisation of cavity parameters (chip length and front facet reflectivity) allows one to improve heat removal from the laser, without changing other laser characteristics. An increase in the maximum output optical power of the laser by 0.5 W has been demonstrated due to cavity design optimisation. (lasers)

  18. Vertical transportation systems embedded on shuffled frog leaping algorithm for manufacturing optimisation problems in industries.

    PubMed

    Aungkulanon, Pasura; Luangpaiboon, Pongchanun

    2016-01-01

    Response surface methods via first- or second-order models are important in manufacturing processes. This study, however, proposes differently structured mechanisms of vertical transportation systems (VTS) embedded in a shuffled frog leaping-based approach. There are three VTS scenarios: a motion reaching a normal operating velocity, and motions both reaching and not reaching transitional motion. These variants were applied to simultaneously inspect multiple responses affected by machining parameters in multi-pass turning processes. The numerical results of two machining optimisation problems demonstrated the high performance measures of the proposed methods when compared to other optimisation algorithms for an actual deep cut design.

  19. Linear-scaling time-dependent density-functional theory beyond the Tamm-Dancoff approximation: Obtaining efficiency and accuracy with in situ optimised local orbitals.

    PubMed

    Zuehlsdorff, T J; Hine, N D M; Payne, M C; Haynes, P D

    2015-11-28

    We present a solution of the full time-dependent density-functional theory (TDDFT) eigenvalue equation in the linear response formalism exhibiting a linear-scaling computational complexity with system size, without relying on the simplifying Tamm-Dancoff approximation (TDA). The implementation relies on representing the occupied and unoccupied subspaces with two different sets of in situ optimised localised functions, yielding a very compact and efficient representation of the transition density matrix of the excitation with the accuracy associated with a systematic basis set. The TDDFT eigenvalue equation is solved using a preconditioned conjugate gradient algorithm that is very memory-efficient. The algorithm is validated on a small test molecule and a good agreement with results obtained from standard quantum chemistry packages is found, with the preconditioner yielding a significant improvement in convergence rates. The method developed in this work is then used to reproduce experimental results of the absorption spectrum of bacteriochlorophyll in an organic solvent, where it is demonstrated that the TDA fails to reproduce the main features of the low energy spectrum, while the full TDDFT equation yields results in good qualitative agreement with experimental data. Furthermore, the need for explicitly including parts of the solvent into the TDDFT calculations is highlighted, making the treatment of large system sizes necessary that are well within reach of the capabilities of the algorithm introduced here. Finally, the linear-scaling properties of the algorithm are demonstrated by computing the lowest excitation energy of bacteriochlorophyll in solution. The largest systems considered in this work are of the same order of magnitude as a variety of widely studied pigment-protein complexes, opening up the possibility of studying their properties without having to resort to any semiclassical approximations to parts of the protein environment.

  20. Optimisation of novel method for the extraction of steviosides from Stevia rebaudiana leaves.

    PubMed

    Puri, Munish; Sharma, Deepika; Barrow, Colin J; Tiwary, A K

    2012-06-01

    Stevioside, a diterpene glycoside, is well known for its intense sweetness and is used as a non-caloric sweetener. Its potential widespread use requires an easy and effective extraction method. Enzymatic extraction of stevioside from Stevia rebaudiana leaves with cellulase, pectinase and hemicellulase, using various parameters, such as concentration of enzyme, incubation time and temperature, was optimised. Hemicellulase was observed to give the highest stevioside yield (369.23±0.11 μg) in 1 h in comparison to cellulase (359±0.30 μg) and pectinases (333±0.55 μg). Extraction from leaves under optimised conditions showed a remarkable increase in the yield (35 times) compared with a control experiment. The extraction conditions were further optimised using response surface methodology (RSM). A central composite design (CCD) was used for experimental design and analysis of the results to obtain optimal extraction conditions. Based on RSM analysis, temperature of 51-54°C, time of 36-45 min and the cocktail of pectinase, cellulase and hemicellulase, set at 2% each, gave the best results. Under the optimised conditions, the experimental values were in close agreement with the prediction model and resulted in a three times yield enhancement of stevioside. The isolated stevioside was characterised through 1H-NMR spectroscopy, by comparison with a stevioside standard. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Dynamic least-cost optimisation of wastewater system remedial works requirements.

    PubMed

    Vojinovic, Z; Solomatine, D; Price, R K

    2006-01-01

    In recent years, there has been increasing concern about wastewater system failure and the identification of an optimal set of remedial works requirements. So far, several methodologies have been developed and applied in asset management activities by various water companies worldwide, but often with limited success. To fill the gap, several research projects have been undertaken to explore various algorithms to optimise remedial works requirements, but mostly for drinking water supply systems, and very limited work has been carried out for wastewater assets. Some of the major deficiencies of commonly used methods can be found in one or more of the following aspects: inadequate representation of system complexity, incorporation of a dynamic model into the decision-making loop, the choice of an appropriate optimisation technique and experience in applying that technique. This paper is oriented towards resolving these issues and discusses a new approach for the optimisation of wastewater system remedial works requirements. It is proposed that the optimal problem search is performed by a global optimisation tool (with various random search algorithms) and the system performance is simulated by a hydrodynamic pipe network model. The work on assembling all the required elements and developing appropriate interface protocols between the two tools, aimed at decoding the potential remedial solutions into the pipe network model and calculating the corresponding scenario costs, is currently underway.

  2. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    PubMed Central

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards of known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), and suitable for high-throughput routine analysis (35 samples a day) of specimens with a starch content between 21 µg and 40 mg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  3. As reliable as the sun

    NASA Astrophysics Data System (ADS)

    Leijtens, J. A. P.

    2017-11-01

    Fortunately, there is almost nothing as reliable as the sun, which can consequently be utilized as a very reliable source of spacecraft power. In order to harvest this power, the solar panels have to be pointed towards the sun as accurately and reliably as possible. To this end, sun sensors are available on almost every satellite to support vital sun-pointing capability throughout the mission, even in the deployment and safe mode phases of the satellite's life. Given the criticality of the application, one would expect that after more than 50 years of sun sensor utilisation, such sensors would be fully matured and optimised. In actual fact, though, the majority of sun sensors employed are still coarse sun sensors, which have proven extremely reliable but present major issues regarding albedo sensitivity and pointing accuracy.

  4. Design and experimentation of an empirical multistructure framework for accurate, sharp and reliable hydrological ensembles

    NASA Astrophysics Data System (ADS)

    Seiller, G.; Anctil, F.; Roy, R.

    2017-09-01

    This paper outlines the design and experimentation of an Empirical Multistructure Framework (EMF) for lumped conceptual hydrological modeling. This concept is inspired by modular frameworks, empirical model development, and multimodel applications, and encompasses the overproduce and select paradigm. The EMF concept aims to reduce subjectivity in conceptual hydrological modeling practice and includes model selection in the optimisation steps, reducing initial assumptions on the prior perception of the dominant rainfall-runoff transformation processes. EMF generates thousands of new modeling options from, for now, twelve parent models that share their functional components and parameters. Optimisation resorts to ensemble calibration, ranking and selection of individual child time series based on optimal bias and reliability trade-offs, as well as accuracy and sharpness improvement of the ensemble. Results on 37 snow-dominated Canadian catchments and 20 climatically-diversified American catchments reveal the excellent potential of the EMF in generating new individual model alternatives, with high respective performance values, which may be pooled efficiently into ensembles of seven to sixty constitutive members, with low bias and high accuracy, sharpness, and reliability. A group of 1446 new models is highlighted to offer good potential on other catchments or applications, based on their individual and collective interests. An analysis of the preferred functional components reveals the importance of the production and total flow elements. Overall, results from this research confirm the added value of ensemble and flexible approaches for hydrological applications, especially in uncertain contexts, and open up new modeling possibilities.

  5. Optimisation of biocide dose as a function of residual biocide in a heat exchanger pilot plant effluent.

    PubMed

    Eguía, Emilio; Trueba, Alfredo; Girón, Alfredo; Río-Calonge, Belén; Otero, Félix; Bielva, Carlos

    2007-01-01

    Biofouling is one of the most serious problems facing numerous industrial processes. In the case of a heat exchanger unit, biological deposits adhering to the inside surface of its tubes reduce heat transfer and, thus, the thermal performance of the cycle. Control of this phenomenon is proving fundamental for both land and marine equipment to operate in optimum working conditions. Hence, it is necessary to apply antifouling methods capable of keeping surfaces free of any kind of biofouling. This paper reports on the behaviour resulting from use of the flow inversion method vs that obtained by using various chemical treatments. The study compares the effectiveness of certain chemical treatments (Na hypochlorite, peracetic acid and a compound formed by Na bromide + Na hypochlorite) for removing a biofouling film that has already formed on the inside surfaces of tubes in a heat exchanger pilot plant. The paper also addresses the issue of optimising the concentration of biocide dose as a function of the residual biocide in order to minimise the environmental impact caused by effluent from industrial plants. The results indicate that it is possible to eliminate a biofilm formed on the inside surfaces of tubes by the use of intermittent doses of chemical treatments at low concentrations and over long application times. Furthermore, once the stabilisation phase is reached 6 d after starting the treatment, it is possible to maintain the conditions achieved using only 20% of the initial dosage.

  6. Minimising back reflections from the common path objective in a fundus camera

    NASA Astrophysics Data System (ADS)

    Swat, A.

    2016-11-01

    Eliminating back reflections is critical in the design of a fundus camera with an internal illuminating system. As there is very little light reflected from the retina, even excellent antireflective coatings do not provide sufficient suppression of ghost reflections, therefore the number of surfaces in the optics common to the illuminating and imaging paths shall be minimised. Typically a single aspheric objective is used. In this paper an alternative approach, an objective with all spherical surfaces, is presented. As more surfaces are required, a more sophisticated method is needed to get rid of back reflections. Typically, back-reflection analysis comprises treating subsequent objective surfaces as mirrors, and reflections from the objective surfaces are traced back through the imaging path. This approach can be applied in both sequential and non-sequential ray tracing. It is good enough for a system check but not very suitable for the early optimisation process in the optical system design phase. Standard ghost-control merit function operands are also available in sequential ray tracing, for example in the Zemax system, but these do not allow a back ray-trace along an alternative optical path (illumination vs. imaging). What is proposed in this paper is a complete method to incorporate ghost-reflected energy into the ray-tracing system merit function for sequential mode, which is more efficient in the optimisation process. Although developed for the specific case of a fundus camera, the method might be utilised in a wider range of applications where ghost control is critical.

  7. Review: The effect of nutrition on timing of pubertal onset and subsequent fertility in the bull.

    PubMed

    Kenny, D A; Byrne, C J

    2018-06-01

    The advent of genomic selection has led to increased interest within the cattle breeding industry to market semen from young bulls as early as possible. However, both the quantity and quality of such semen is dictated by the age at which these animals reach puberty. Enhancing the early life plane of nutrition of the bull stimulates a complex biochemical interplay involving metabolic and neuroendocrine signalling and culminating in enhanced testicular growth and development and earlier onset of sexual maturation. Recent evidence suggests that an enhanced plane of nutrition leads to an advancement of testicular development in bulls at 18 weeks of age. However, as yet, many of the neuronal mechanisms regulating these developmental processes remain to be elucidated in the bull. While early life nutrition clearly affects the sexual maturation process in bulls, there is little evidence for latent effects on semen traits post-puberty. Equally, the influence of prevailing nutritional status on the fertility of mature bulls is unclear, though management practices that result in clinical or even subclinical metabolic disease can undoubtedly impact upon normal sexual function. Dietary supplements enriched with various polyunsaturated fatty acids or fortified with trace elements do not consistently affect reproductive function in the bull, certainly where animals are already adequately nourished. Further insight into how nutrition mediates the biochemical interaction between neuroendocrine and testicular processes will facilitate the optimisation of nutritional regimens for sexual maturation and subsequent semen production in bulls.

  8. Nursing implications: symptom presentation and quality of life in rectal cancer patients.

    PubMed

    O'Gorman, Claire; Barry, Amanda; Denieffe, Suzanne; Sasiadek, Wojciech; Gooney, Martina

    2016-05-01

    To determine the changes in symptoms experienced by rectal cancer patients during preoperative chemoradiotherapy, with a specific focus on fatigue, and to explore how symptoms impact the quality of life. Rectal cancer continues to be a healthcare issue internationally, despite advances in management strategies, which include the administration of preoperative chemoradiotherapy to improve locoregional control. It is known that this treatment may cause adverse effects; however, there is a paucity of literature that specifically examines fatigue, symptoms and quality of life in this patient cohort. A prospective, quantitative correlational design using purposive sampling was adopted. Symptoms and quality of life were measured with validated questionnaires in 35 patients at four time points. Symptoms that changed significantly over time, as examined using repeated-measures ANOVA, include fatigue, bowel function issues, nutritional issues, pain, dermatological issues and urinary function issues. Findings indicate that fatigue leads to poorer quality of life, with constipation, bloating, stool frequency, appetite loss, weight worry, nausea and vomiting, dry mouth and pain also identified as influencing factors on quality of life. Findings have highlighted the importance of thorough symptom assessment and management of patients receiving preoperative chemoradiotherapy, particularly midway through treatment, in order to optimise quality of life and minimise interruptions to treatment. Close monitoring of symptoms during preoperative chemoradiotherapy, particularly at week 4, will enable the implementation of timely interventions so that interruptions to treatment are prevented and the quality of life is optimised, which may hasten postoperative recovery times. © 2016 John Wiley & Sons Ltd.

  9. On Optimal Development and Becoming an Optimiser

    ERIC Educational Resources Information Center

    de Ruyter, Doret J.

    2012-01-01

    The article aims to provide a justification for the claim that optimal development and becoming an optimiser are educational ideals that parents should pursue in raising their children. Optimal development is conceptualised as enabling children to grow into flourishing persons, that is persons who have developed (and are still developing) their…

  10. Using non-empirically tuned range-separated functionals with simulated emission bands to model fluorescence lifetimes.

    PubMed

    Wong, Z C; Fan, W Y; Chwee, T S; Sullivan, Michael B

    2017-08-09

    Fluorescence lifetimes were evaluated using TD-DFT under different approximations for the emitting molecule and various exchange-correlation functionals, such as B3LYP, BMK, CAM-B3LYP, LC-BLYP, M06, M06-2X, M11, PBE0, ωB97, ωB97X, LC-BLYP*, and ωB97X* where the range-separation parameters in the last two functionals were tuned in a non-empirical fashion. Changes in the optimised molecular geometries between the ground and electronically excited states were found to affect the quality of the calculated lifetimes significantly, while the inclusion of vibronic features led to further improvements over the assumption of a vertical electronic transition. The LC-BLYP* functional was found to return the most accurate fluorescence lifetimes with unsigned errors that are mostly within 1.5 ns of experimental values.

  11. A Galerkin discretisation-based identification for parameters in nonlinear mechanical systems

    NASA Astrophysics Data System (ADS)

    Liu, Zuolin; Xu, Jian

    2018-04-01

    In this paper, a new parameter identification method is proposed for mechanical systems. Based on the idea of the Galerkin finite-element method, the displacement over the time history is approximated by piecewise linear functions, and the second-order terms in the model equation are eliminated by integrating by parts. In this way, a loss function of integral form is derived. Unlike existing methods, the loss function is actually a quadratic sum of integrals over the whole time history. Then, for linear or nonlinear systems, the optimisation of the loss function can be carried out with the traditional least-squares algorithm or an iterative one, respectively. Such a method can be used to effectively identify parameters in linear and arbitrary nonlinear mechanical systems. Simulation results show that even under the condition of sparse data or low sampling frequency, this method can still guarantee high accuracy in identifying linear and nonlinear parameters.

  12. Discovery and optimisation studies of antimalarial phenotypic hits

    PubMed Central

    Mital, Alka; Murugesan, Dinakaran; Kaiser, Marcel; Yeates, Clive; Gilbert, Ian H.

    2015-01-01

    There is an urgent need for the development of new antimalarial compounds. As a result of a phenotypic screen, several compounds with potent activity against the parasite Plasmodium falciparum were identified. Characterization of these compounds is discussed, along with approaches to optimise the physicochemical properties. The in vitro antimalarial activity of these compounds against P. falciparum K1 had EC50 values in the range of 0.09–29 μM, and generally good selectivity (typically >100-fold) compared to a mammalian cell line (L6). One example showed no significant activity against a rodent model of malaria, and more work is needed to optimise these compounds. PMID:26408453

  13. Design and optimisation of wheel-rail profiles for adhesion improvement

    NASA Astrophysics Data System (ADS)

    Liu, B.; Mei, T. X.; Bruni, S.

    2016-03-01

    This paper describes a study on the optimisation of the wheel profile in the wheel-rail system to increase the overall level of adhesion available at the contact interface, in particular to investigate how the wheel and rail profile combination may be designed to ensure the improved delivery of tractive/braking forces even in poor contact conditions. The research focuses on the geometric combination of both wheel and rail profiles to establish how the contact interface may be optimised to increase the adhesion level, and also to investigate how changes in the contact mechanics at the wheel-rail interface may lead to changes in the vehicle dynamic behaviour.

  14. Optimal Reward Functions in Distributed Reinforcement Learning

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Tumer, Kagan

    2000-01-01

    We consider the design of multi-agent systems so as to optimize an overall world utility function when (1) those systems lack centralized communication and control, and (2) each agent runs a distinct Reinforcement Learning (RL) algorithm. A crucial issue in such design problems is how to initialize/update each agent's private utility function so as to induce the best possible world utility. Traditional 'team game' solutions to this problem sidestep this issue and simply assign to each agent the world utility as its private utility function. In previous work we used the 'Collective Intelligence' framework to derive a better choice of private utility functions, one that results in world utility performance up to orders of magnitude superior to that ensuing from use of the team game utility. In this paper we extend these results. We derive the general class of private utility functions that both are easy for the individual agents to learn and that, if learned well, result in high world utility. We demonstrate experimentally that using these new utility functions can result in significantly improved performance over that of our previously proposed utility, over and above that previous utility's superiority to the conventional team game utility.
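
    One well-known member of this family of private utilities is the difference utility, which rewards an agent with the world utility minus the world utility obtained when that agent's action is replaced by a fixed default. The sketch below is purely illustrative: the congestion-style world utility, the "clamping" default and all names are assumptions made for exposition, not the utilities derived in the paper.

    ```python
    import numpy as np

    def world_utility(actions):
        # Hypothetical world utility: reward the group for covering distinct "niches"
        # (a congestion-style problem); actions are integers in {0, ..., 4}.
        counts = np.bincount(actions, minlength=5)
        return float(np.sum(np.minimum(counts, 1)))   # number of distinct niches covered

    def team_game_utility(actions, i):
        # "Team game": every agent is simply handed the world utility.
        return world_utility(actions)

    def difference_utility(actions, i, default_action=0):
        # Difference utility: world utility minus the world utility obtained when
        # agent i's action is clamped to a default.  Contributions of the other
        # agents largely cancel, so the learning signal for agent i is cleaner.
        clamped = actions.copy()
        clamped[i] = default_action
        return world_utility(actions) - world_utility(clamped)

    actions = np.array([0, 0, 1, 2, 2])
    print([team_game_utility(actions, i) for i in range(5)])   # identical for all agents
    print([difference_utility(actions, i) for i in range(5)])  # reflects each agent's own impact
    ```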

  15. Optimal Earth's reentry disposal of the Galileo constellation

    NASA Astrophysics Data System (ADS)

    Armellin, Roberto; San-Juan, Juan F.

    2018-02-01

    There is now international consensus that space activities must be managed to minimise debris generation and risk. This paper presents a method for the end-of-life (EoL) disposal of spacecraft in Medium Earth Orbit (MEO). The problem is formulated as a multiobjective optimisation, which is solved with an evolutionary algorithm. An impulsive manoeuvre is optimised to reenter the spacecraft into Earth's atmosphere within 100 years. Pareto optimal solutions are obtained using the manoeuvre Δv and the time-to-reentry as the objective functions to be minimised. To best explore the search space, a semi-analytical orbit propagator, which can propagate an orbit over 100 years in a few seconds, is adopted. An in-depth analysis of the results is carried out to understand the conditions leading to a fast reentry with minimum propellant. To this end, a new way of representing the disposal solutions is introduced: with a single 2D plot we are able to fully describe the time evolution of all the relevant orbital parameters and identify the conditions that enable the eccentricity build-up. The EoL disposal of the Galileo constellation is used as the test case.
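
    As a small illustration of the bi-objective set-up (both the manoeuvre Δv and the time-to-reentry are minimised), the sketch below extracts the non-dominated (Pareto) front from a set of candidate disposal solutions. The candidate values are hypothetical, and neither the evolutionary search nor the semi-analytical orbit propagation used in the paper is reproduced here:

    ```python
    import numpy as np

    def pareto_front(objectives):
        """Return indices of non-dominated points; both objectives are minimised."""
        obj = np.asarray(objectives, dtype=float)
        keep = []
        for i, p in enumerate(obj):
            # p is dominated if another point is <= in both objectives and < in at least one
            dominated = np.any(np.all(obj <= p, axis=1) & np.any(obj < p, axis=1))
            if not dominated:
                keep.append(i)
        return keep

    # Hypothetical candidates: (delta-v in m/s, time-to-reentry in years)
    candidates = [(120.0, 95.0), (150.0, 40.0), (90.0, 99.0), (200.0, 25.0), (160.0, 60.0)]
    print([candidates[i] for i in pareto_front(candidates)])
    ```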

  16. Variational Bayesian identification and prediction of stochastic nonlinear dynamic causal models.

    PubMed

    Daunizeau, J; Friston, K J; Kiebel, S J

    2009-11-01

    In this paper, we describe a general variational Bayesian approach for approximate inference on nonlinear stochastic dynamic models. This scheme extends established approximate inference on hidden-states to cover: (i) nonlinear evolution and observation functions, (ii) unknown parameters and (precision) hyperparameters and (iii) model comparison and prediction under uncertainty. Model identification or inversion entails the estimation of the marginal likelihood or evidence of a model. This difficult integration problem can be finessed by optimising a free-energy bound on the evidence using results from variational calculus. This yields a deterministic update scheme that optimises an approximation to the posterior density on the unknown model variables. We derive such a variational Bayesian scheme in the context of nonlinear stochastic dynamic hierarchical models, for both model identification and time-series prediction. The computational complexity of the scheme is comparable to that of an extended Kalman filter, which is critical when inverting high dimensional models or long time-series. Using Monte-Carlo simulations, we assess the estimation efficiency of this variational Bayesian approach using three stochastic variants of chaotic dynamic systems. We also demonstrate the model comparison capabilities of the method, its self-consistency and its predictive power.
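
    For reference, the free-energy bound referred to above can be written compactly in standard variational-Bayes notation (a generic form, not the paper's specific hierarchical parameterisation): for data y, unknowns θ (hidden states, parameters and hyperparameters) and model m,

    $$
    \ln p(y \mid m) \;=\; \underbrace{\big\langle \ln p(y,\theta \mid m) - \ln q(\theta) \big\rangle_{q(\theta)}}_{F[q]} \;+\; \mathrm{KL}\big[\,q(\theta)\,\|\,p(\theta \mid y, m)\,\big] \;\ge\; F[q],
    $$

    so maximising the free energy F with respect to the approximate posterior q(θ) simultaneously tightens the bound on the log-evidence ln p(y|m) used for model comparison and drives q(θ) towards the true posterior.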

  17. Sybil--efficient constraint-based modelling in R.

    PubMed

    Gelius-Dietrich, Gabriel; Desouki, Abdelmoneim Amer; Fritzemeier, Claus Jonathan; Lercher, Martin J

    2013-11-13

    Constraint-based analyses of metabolic networks are widely used to simulate the properties of genome-scale metabolic networks. Publicly available implementations tend to be slow, impeding large scale analyses such as the genome-wide computation of pairwise gene knock-outs, or the automated search for model improvements. Furthermore, available implementations cannot easily be extended or adapted by users. Here, we present sybil, an open source software library for constraint-based analyses in R; R is a free, platform-independent environment for statistical computing and graphics that is widely used in bioinformatics. Among other functions, sybil currently provides efficient methods for flux-balance analysis (FBA), MOMA, and ROOM that are about ten times faster than previous implementations when calculating the effect of whole-genome single gene deletions in silico on a complete E. coli metabolic model. Due to the object-oriented architecture of sybil, users can easily build analysis pipelines in R or even implement their own constraint-based algorithms. Based on its highly efficient communication with different mathematical optimisation programs, sybil facilitates the exploration of high-dimensional optimisation problems on small time scales. Sybil and all its dependencies are open source. Sybil and its documentation are available for download from the comprehensive R archive network (CRAN).
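
    The core flux-balance problem that sybil and similar tools solve is a linear programme; in generic notation (not sybil's own interface),

    $$
    \max_{v}\; c^{\mathsf T} v \quad \text{subject to} \quad S\,v = 0, \qquad v^{\min} \le v \le v^{\max},
    $$

    where S is the stoichiometric matrix, v the vector of reaction fluxes and c selects the objective (typically the biomass reaction). A single-gene deletion is then simulated by constraining the fluxes of the affected reactions to zero and re-solving, which is the operation repeated genome-wide in the timing comparison above.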

  18. An interval-parameter mixed integer multi-objective programming for environment-oriented evacuation management

    NASA Astrophysics Data System (ADS)

    Wu, C. Z.; Huang, G. H.; Yan, X. P.; Cai, Y. P.; Li, Y. P.

    2010-05-01

    Large crowds are increasingly common at political, social, economic, cultural and sports events in urban areas. This has led to attention on the management of evacuations under such situations. In this study, we optimise an approximation method for vehicle allocation and route planning in case of an evacuation. This method, based on an interval-parameter multi-objective optimisation model, has potential for use in a flexible decision support system for evacuation management. The modeling solutions are obtained by sequentially solving two sub-models corresponding to lower- and upper-bounds for the desired objective function value. The interval solutions are feasible and stable in the given decision space, and this may reduce the negative effects of uncertainty, thereby improving decision makers' estimates under different conditions. The resulting model can be used for a systematic analysis of the complex relationships among evacuation time, cost and environmental considerations. The results of a case study used to validate the proposed model show that the model does generate useful solutions for planning evacuation management and practices. Furthermore, these results are useful for evacuation planners, not only in making vehicle allocation decisions but also for providing insight into the tradeoffs among evacuation time, environmental considerations and economic objectives.

  19. Aquatic therapy for boys with Duchenne muscular dystrophy (DMD): an external pilot randomised controlled trial.

    PubMed

    Hind, Daniel; Parkin, James; Whitworth, Victoria; Rex, Saleema; Young, Tracey; Hampson, Lisa; Sheehan, Jennie; Maguire, Chin; Cantrill, Hannah; Scott, Elaine; Epps, Heather; Main, Marion; Geary, Michelle; McMurchie, Heather; Pallant, Lindsey; Woods, Daniel; Freeman, Jennifer; Lee, Ellen; Eagle, Michelle; Willis, Tracey; Muntoni, Francesco; Baxter, Peter

    2017-01-01

    Standard treatment of Duchenne muscular dystrophy (DMD) includes regular physiotherapy. There are no data to show whether adding aquatic therapy (AT) to land-based exercises helps maintain motor function. We assessed the feasibility of recruiting and collecting data from boys with DMD in a parallel-group pilot randomised trial (primary objective), also assessing how intervention and trial procedures work. Ambulant boys with DMD aged 7-16 years established on steroids, with North Star Ambulatory Assessment (NSAA) score ≥8, who were able to complete a 10-m walk test without aids or assistance, were randomly allocated (1:1) to 6 months of either optimised land-based exercises 4 to 6 days/week, defined by local community physiotherapists, or the same 4 days/week plus AT 2 days/week. Those unable to commit to a programme, with >20% variation between NSAA scores 4 weeks apart, or with contraindications to AT were excluded. The main outcome measures included feasibility of recruiting 40 participants in 6 months from six UK centres, clinical outcomes including NSAA, independent assessment of treatment optimisation, participant/therapist views on acceptability of intervention and research protocols, value of information (VoI) analysis and cost-impact analysis. Over 6 months, 348 boys were screened: most lived too far from centres or were enrolled in other trials; 12 (30% of the target) were randomised to AT (n = 8) or control (n = 4). The mean change in NSAA at 6 months was -5.5 (SD 7.8) in the control arm and -2.8 (SD 4.1) in the AT arm. Harms included fatigue in two boys, pain in one. Physiotherapists and parents valued AT but believed it should be delivered in community settings. Randomisation was unattractive to families, who had already decided that AT was useful and who often preferred to enrol in drug studies. The AT prescription was considered to be optimised for three boys, with other boys given programmes that were too extensive and insufficiently focused. Recruitment was insufficient for VoI analysis. Neither a UK-based RCT of AT nor twice-weekly AT therapy delivered at tertiary centres is feasible. Our study will help in the optimisation of AT service provision and the design of future research. ISRCTN41002956.

  20. Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction

    PubMed Central

    O'Boyle, Noel M; Palmer, David S; Nigsch, Florian; Mitchell, John BO

    2008-01-01

    Background We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45: 1024–1029). We test the ability of the algorithm to develop a predictive partial least squares model for the Karthikeyan dataset (J Chem Inf Model 2005, 45: 581–590) of melting point values. We also test its ability to perform feature selection on a support vector machine model for the same dataset. Results Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors which has an RMSE on an external test set of 46.6°C and R2 of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, ε of 0.21) and an RMSE of 45.1°C and R2 of 0.54. This model outperforms a kNN model (RMSE of 48.3°C, R2 of 0.47) for the same data and has similar performance to a Random Forest model (RMSE of 44.5°C, R2 of 0.55). However it is much less prone to bias at the extremes of the range of melting points as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM, -0.53 for Random Forest. Conclusion With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors. PMID:18959785

  1. Multiple Criteria Evaluation of Quality and Optimisation of e-Learning System Components

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Dagiene, Valentina

    2010-01-01

    The main research object of the paper is the investigation and proposal of a comprehensive Learning Object Repositories (LORs) quality evaluation tool suitable for their multiple criteria decision analysis, evaluation and optimisation. Both LORs' "internal quality" and "quality in use" evaluation (decision-making) criteria are analysed in the paper.…

  2. Optimising the Parallelisation of OpenFOAM Simulations

    DTIC Science & Technology

    2014-06-01

    Shannon Keough, Maritime Division, Defence Science and Technology Organisation, DSTO-TR-2987. The OpenFOAM computational fluid dynamics toolbox allows parallel computation of... performance of a given high performance computing cluster with several OpenFOAM cases, running using a combination of MPI libraries and corresponding MPI...

  3. Optimising Microbial Growth with a Bench-Top Bioreactor

    ERIC Educational Resources Information Center

    Baker, A. M. R.; Borin, S. L.; Chooi, K. P.; Huang, S. S.; Newgas, A. J. S.; Sodagar, D.; Ziegler, C. A.; Chan, G. H. T.; Walsh, K. A. P.

    2006-01-01

    The effects of impeller size, agitation and aeration on the rate of yeast growth were investigated using bench-top bioreactors. This exercise, carried out over a six-month period, served as an effective demonstration of the influence of different operating parameters on cell growth and provided a means of determining the optimisation conditions…

  4. Engaging Homeless Individuals in Discussion about Their Food Experiences to Optimise Wellbeing: A Pilot Study

    ERIC Educational Resources Information Center

    Pettinger, Clare; Parsons, Julie M.; Cunningham, Miranda; Withers, Lyndsey; D'Aprano, Gia; Letherby, Gayle; Sutton, Carole; Whiteford, Andrew; Ayres, Richard

    2017-01-01

    Objective: High levels of social and economic deprivation are apparent in many UK cities, where there is evidence of certain "marginalised" communities suffering disproportionately from poor nutrition, threatening health. Finding ways to engage with these communities is essential to identify strategies to optimise wellbeing and life…

  5. Discontinuous permeable adsorptive barrier design and cost analysis: a methodological approach to optimisation.

    PubMed

    Santonastaso, Giovanni Francesco; Bortone, Immacolata; Chianese, Simeone; Di Nardo, Armando; Di Natale, Michele; Erto, Alessandro; Karatza, Despina; Musmarra, Dino

    2017-09-19

    This paper presents a method to optimise a discontinuous permeable adsorptive barrier (PAB-D). The method is based on the comparison of different PAB-D configurations obtained by changing some of the main PAB-D design parameters. In particular, the well diameters, the distance between two consecutive passive wells and the distance between two consecutive well lines were varied, and a cost analysis for each configuration was carried out in order to define the best performing and most cost-effective PAB-D configuration. As a case study, a benzene-contaminated aquifer located in an urban area in the north of Naples (Italy) was considered. The PAB-D configuration with a well diameter of 0.8 m proved to be the best layout in terms of performance and cost-effectiveness. Moreover, in order to identify the best configuration for the remediation of the aquifer studied, a comparison with a continuous permeable adsorptive barrier (PAB-C) was carried out; this showed a 40% reduction in total remediation costs when the optimised PAB-D was used.

  6. Optimization of physical conditions for the production of thermostable T1 lipase in Pichia guilliermondii strain SO using response surface methodology.

    PubMed

    Abu, Mary Ladidi; Nooh, Hisham Mohd; Oslan, Siti Nurbaya; Salleh, Abu Bakar

    2017-11-10

    Pichia guilliermondii was found capable of expressing the recombinant thermostable lipase without methanol under the control of the methanol-dependent alcohol oxidase 1 promoter (AOXp1). In this study, statistical approaches were employed for the screening and optimisation of physical conditions for T1 lipase production in P. guilliermondii. Screening of six physical conditions by Plackett-Burman Design identified pH, inoculum size and incubation time as exerting significant effects on lipase production. These three conditions were further optimised using the Box-Behnken Design of Response Surface Methodology (RSM), which predicted an optimum of pH 6, 24 h incubation time and 2% inoculum size. T1 lipase activity of 2.0 U/mL was produced with a biomass of OD600 23.0. Optimisation with RSM yielded a 3-fold increase in T1 lipase production over the unoptimised conditions. These results show that T1 lipase can be produced at a higher yield in P. guilliermondii.

  7. Performance Analysis and Discussion on the Thermoelectric Element Footprint for PV-TE Maximum Power Generation

    NASA Astrophysics Data System (ADS)

    Li, Guiqiang; Zhao, Xudong; Jin, Yi; Chen, Xiao; Ji, Jie; Shittu, Samson

    2018-06-01

    Geometrical optimisation is a valuable way to improve the efficiency of a thermoelectric element (TE). In a hybrid photovoltaic-thermoelectric (PV-TE) system, the photovoltaic (PV) and thermoelectric (TE) components have a relatively complex relationship; their individual effects mean that geometrical optimisation of the TE element alone may not be sufficient to optimise the entire PV-TE hybrid system. In this paper, we introduce a parametric optimisation of the geometry of the thermoelectric element footprint for a PV-TE system. A uni-couple TE model was built for the PV-TE using the finite element method and temperature-dependent thermoelectric material properties. Two types of PV cells were investigated, and the performance of the PV-TE with different TE element lengths and different footprint areas was analysed. The results showed that, regardless of the TE element length and footprint area, the maximum power output occurs when An/Ap = 1. This finding is useful, as it provides a reference whenever PV-TE optimisation is investigated.

  8. Joint optimisation of arbitrage profits and battery life degradation for grid storage application of battery electric vehicles

    NASA Astrophysics Data System (ADS)

    Kies, Alexander

    2018-02-01

    To meet European decarbonisation targets by 2050, the electrification of the transport sector is mandatory. Most electric vehicles rely on lithium-ion batteries, because they have a higher energy/power density and longer life span compared to other practical batteries such as zinc-carbon batteries. Electric vehicles can thus provide energy storage to support the system integration of generation from highly variable renewable sources, such as wind and photovoltaics (PV). However, charging/discharging causes batteries to degrade progressively, reducing their capacity. In this study, we investigate the impact of the joint optimisation of arbitrage revenue and battery degradation of electric vehicle batteries in a simplified setting, where historical prices allow for market participation of battery electric vehicle owners. It is shown that optimising both jointly leads to stronger gains than the sum of the two separate optimisation strategies, and that including battery degradation in the model avoids states of charge close to the maximum at certain times. It can be concluded that degradation is an important aspect to consider in power system models that incorporate any kind of lithium-ion battery storage.
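
    A minimal sketch of such a joint optimisation is given below, assuming hypothetical hourly prices, a toy battery model and a simple throughput-proportional ageing penalty in place of the paper's detailed degradation model (all parameter values and names are illustrative). It is formulated as a small linear programme using the cvxpy modelling library:

    ```python
    import numpy as np
    import cvxpy as cp

    # Hypothetical hourly day-ahead prices (EUR/MWh) and a toy battery model.
    price = np.array([30, 25, 22, 20, 24, 35, 50, 60, 55, 45, 40, 38,
                      36, 34, 33, 35, 45, 65, 70, 60, 50, 42, 36, 32], dtype=float)
    T = len(price)
    cap, p_max, eta = 0.05, 0.01, 0.95      # MWh capacity, MW power limit, efficiency
    deg_cost = 20.0                          # EUR per MWh of throughput (ageing proxy)

    charge = cp.Variable(T, nonneg=True)
    discharge = cp.Variable(T, nonneg=True)
    soc = cp.Variable(T + 1)

    constraints = [soc[0] == 0.5 * cap, soc[T] == 0.5 * cap,
                   soc >= 0, soc <= cap,
                   charge <= p_max, discharge <= p_max]
    for t in range(T):
        # Simple state-of-charge dynamics with a single round-trip efficiency
        constraints.append(soc[t + 1] == soc[t] + eta * charge[t] - discharge[t] / eta)

    revenue = cp.sum(cp.multiply(price, discharge - charge))        # arbitrage profit
    degradation = deg_cost * cp.sum(charge + discharge)             # throughput penalty
    problem = cp.Problem(cp.Maximize(revenue - degradation), constraints)
    problem.solve()
    print("profit net of degradation (EUR):", round(problem.value, 2))
    ```

    Dropping the degradation term recovers a pure arbitrage strategy, which is exactly the comparison the study makes; the penalty also discourages the battery from sitting at a high state of charge purely to capture rare price spikes.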

  9. Modelling of auctioning mechanism for solar photovoltaic capacity

    NASA Astrophysics Data System (ADS)

    Poullikkas, Andreas

    2016-10-01

    In this work, a modified optimisation model for the integration of renewable energy sources for power generation (RES-E) in power-generation systems on a unit commitment basis is developed. The purpose of the modified optimisation procedure is to account for RES-E capacity auctions at different solar photovoltaic (PV) capacity electricity prices. The model uses a genetic algorithm (GA) technique to calculate the required RES-E levy (or green tax) in the electricity bills. The procedure also enables estimation of the adequate (or eligible) feed-in tariff to be offered to future RES-E systems that do not participate in the capacity auctioning procedure. To demonstrate the applicability of the optimisation procedure, the case of PV capacity auctioning for commercial systems is examined. The results indicate that the green tax required to promote the use of RES-E technologies, which is charged to customers through their electricity bills, is reduced as the final auctioning price falls. This has a significant effect in terms of reducing electricity bills.

  10. An integrated portfolio optimisation procedure based on data envelopment analysis, artificial bee colony algorithm and genetic programming

    NASA Astrophysics Data System (ADS)

    Hsu, Chih-Ming

    2014-12-01

    Portfolio optimisation is an important issue in the field of investment/financial decision-making and has received considerable attention from both researchers and practitioners. However, besides portfolio optimisation, a complete investment procedure should also include the selection of profitable investment targets and the determination of the optimal timing for buying/selling those targets. In this study, an integrated procedure using data envelopment analysis (DEA), artificial bee colony (ABC) and genetic programming (GP) is proposed to resolve a portfolio optimisation problem. The proposed procedure is evaluated through a case study on investing in stocks in the semiconductor sub-section of the Taiwan stock market for 4 years. The potential average 6-month return on investment of 9.31% from 1 November 2007 to 31 October 2011 indicates that the proposed procedure can be considered a feasible and effective tool for making outstanding investment plans, and thus making profits in the Taiwan stock market. Moreover, it is a strategy that can help investors to make profits even when the overall stock market suffers a loss.

  11. Characterisation and optimisation of flexible transfer lines for liquid helium. Part I: Experimental results

    NASA Astrophysics Data System (ADS)

    Dittmar, N.; Haberstroh, Ch.; Hesse, U.; Krzyzowski, M.

    2016-04-01

    The transfer of liquid helium (LHe) into mobile dewars or transport vessels is a common and unavoidable process at LHe decant stations. During this transfer, appreciable amounts of LHe evaporate due to heat leak and pressure drop. The helium gas generated in this way needs to be collected and reliquefied, which requires a large amount of electrical energy. Therefore, the design of the transfer lines used at LHe decant stations has been optimised to establish an LHe transfer with minor evaporation losses, which increases the overall efficiency and capacity of LHe decant stations. This paper presents the experimental results achieved during the thermohydraulic optimisation of a flexible LHe transfer line. An extensive measurement campaign with a set of dedicated transfer lines equipped with pressure and temperature sensors led to unique experimental data for this specific transfer process. The experimental results cover the heat leak, the pressure drop, the transfer rate, the outlet quality, and the cool-down and warm-up behaviour of the examined transfer lines. Based on the results obtained, the design of the flexible transfer line under consideration has been optimised, featuring reduced heat leak and pressure drop.

  12. Synthesis of concentric circular antenna arrays using dragonfly algorithm

    NASA Astrophysics Data System (ADS)

    Babayigit, B.

    2018-05-01

    Due to the strongly non-linear relationship between the array factor and the array elements, the concentric circular antenna array (CCAA) synthesis problem is challenging. Nature-inspired optimisation techniques have been playing an important role in solving array synthesis problems. The dragonfly algorithm (DA) is a novel nature-inspired optimisation technique based on the static and dynamic swarming behaviours of dragonflies in nature. This paper presents the design of CCAAs with low sidelobes using DA. The effectiveness of DA is investigated for two three-ring CCAA designs (with 4-, 6-, 8-element or 8-, 10-, 12-element rings), each considered with and without a centre element. The radiation pattern of each design case is obtained by finding the optimal excitation weights of the array elements using DA. Simulation results show that the proposed algorithm outperforms other state-of-the-art techniques (symbiotic organisms search, biogeography-based optimisation, sequential quadratic programming, opposition-based gravitational search algorithm, cat swarm optimisation, firefly algorithm, evolutionary programming) for all design cases. DA can be a promising technique for electromagnetic problems.
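
    The quantity being minimised in such designs is the peak sidelobe level of the array factor. The sketch below evaluates the broadside array factor of a three-ring CCAA for a given set of excitation weights (uniform and hypothetical here, as are the ring radii); it does not reproduce the dragonfly search itself, which would wrap a loop proposing new weights around exactly this kind of evaluation:

    ```python
    import numpy as np

    def ccaa_array_factor(radii_wl, elements_per_ring, weights, theta, centre_weight=0.0):
        """Broadside (phi = 0 cut) array factor of a concentric circular antenna array.

        radii_wl: ring radii in wavelengths; weights: per-ring excitation-amplitude arrays.
        """
        k = 2.0 * np.pi                                   # wavenumber with radii in wavelengths
        af = np.full_like(theta, centre_weight, dtype=complex)
        for r, n_el, w in zip(radii_wl, elements_per_ring, weights):
            phi_el = 2.0 * np.pi * np.arange(n_el) / n_el       # element azimuths on the ring
            af += np.sum(w[None, :] *
                         np.exp(1j * k * r * np.sin(theta)[:, None] * np.cos(phi_el)[None, :]),
                         axis=1)
        return af

    # Hypothetical three-ring 4-, 6-, 8-element design, uniform excitation, no centre element
    theta = np.linspace(-np.pi / 2, np.pi / 2, 4001)
    radii, n_el = [0.5, 1.0, 1.5], [4, 6, 8]
    weights = [np.ones(n) for n in n_el]

    pattern = np.abs(ccaa_array_factor(radii, n_el, weights, theta))
    pattern_db = 20.0 * np.log10(pattern / pattern.max() + 1e-12)
    sll = pattern_db[np.abs(theta) > 0.35].max()   # crude peak-sidelobe estimate outside the main beam
    print("peak sidelobe level (dB):", round(float(sll), 2))
    ```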

  13. VizieR Online Data Catalog: Proper motions of PM2000 open clusters (Krone-Martins+, 2010)

    NASA Astrophysics Data System (ADS)

    Krone-Martins, A.; Soubiran, C.; Ducourant, C.; Teixeira, R.; Le Campion, J. F.

    2010-04-01

    We present lists of proper-motions and kinematic membership probabilities in the region of 49 open clusters or possible open clusters. The stellar proper motions were taken from the Bordeaux PM2000 catalogue. The segregation between cluster and field stars and the assignment of membership probabilities was accomplished by applying a fully automated method based on parametrisations for the probability distribution functions and genetic algorithm optimisation heuristics associated with a derivative-based hill climbing algorithm for the likelihood optimization. (3 data files).

  14. High risk of hypogonadism after traumatic brain injury: clinical implications.

    PubMed

    Agha, Amar; Thompson, Christopher J

    2005-01-01

    Several recent studies have convincingly documented a close association between traumatic brain injury (TBI) and pituitary dysfunction. Post-traumatic hypogonadism is very common in the acute post-TBI phase, though most cases recover within six to twelve months following trauma. The functional significance of early hypogonadism, which may reflect adaptation to acute illness, is not known. Hypogonadism persists, however, in 10-17% of long-term survivors. Sex steroid deficiency has implications beyond psychosexual function and fertility for survivors of TBI. Muscle weakness may impair functional recovery from trauma and osteoporosis may be exacerbated by immobility secondary to trauma. Identification and appropriate and timely management of post-traumatic hypogonadism is important in order to optimise patient recovery from head trauma, improve quality of life and avoid the long-term adverse consequences of untreated sex steroid deficiency.

  15. Discrete bacteria foraging optimization algorithm for graph based problems - a transition from continuous to discrete

    NASA Astrophysics Data System (ADS)

    Sur, Chiranjib; Shukla, Anupam

    2018-03-01

    The Bacteria Foraging Optimisation Algorithm is a collective-behaviour-based meta-heuristic search that relies on the social influence of bacterial co-agents in the problem's search space. Its application to discrete and graph-based problems is severely hindered by the bias of its mathematical modelling and by its dynamic structure. This motivated the introduction of a discrete form, the Discrete Bacteria Foraging Optimisation (DBFO) Algorithm, for discrete problems, which in real life outnumber the continuous-domain problems representable by mathematical and numerical equations. In this work, we mainly simulate a graph-based multi-objective road optimisation problem and discuss the prospects for using DBFO in other similar optimisation problems and graph-based problems. The various solution representations that can be handled by DBFO are also discussed. The implications and dynamics of the various parameters used in DBFO are illustrated from the point of view of the problems, combining both exploration and exploitation. The results of DBFO are compared with the Ant Colony Optimisation and Intelligent Water Drops algorithms. An important feature of DBFO is that the bacteria agents do not depend on local heuristic information but instead devise new exploration schemes based on previous experience and analysis of the covered path. This makes the algorithm better at generating combinations for graph-based problems and for NP-hard problems.

  16. Optimisation of shape kernel and threshold in image-processing motion analysers.

    PubMed

    Pedrocchi, A; Baroni, G; Sada, S; Marcon, E; Pedotti, A; Ferrigno, G

    2001-09-01

    The aim of the work is to optimise the image processing of a motion analyser. This is to improve accuracy, which is crucial for neurophysiological and rehabilitation applications. A new motion analyser, ELITE-S2, for installation on the International Space Station is described, with the focus on image processing. Important improvements are expected in the hardware of ELITE-S2 compared with ELITE and previous versions (ELITE-S and Kinelite). The core algorithm for marker recognition was based on the current ELITE version, using the cross-correlation technique. This technique was based on the matching of the expected marker shape, the so-called kernel, with image features. Optimisation of the kernel parameters was achieved using a genetic algorithm, taking into account noise rejection and accuracy. Optimisation was achieved by performing tests on six highly precise grids (with marker diameters ranging from 1.5 to 4 mm), representing all allowed marker image sizes, and on a noise image. The results of comparing the optimised kernels and the current ELITE version showed a great improvement in marker recognition accuracy, while noise rejection characteristics were preserved. An average increase in marker co-ordinate accuracy of +22% was achieved, corresponding to a mean accuracy of 0.11 pixel in comparison with 0.14 pixel, measured over all grids. An improvement of +37%, corresponding to an improvement from 0.22 pixel to 0.14 pixel, was observed over the grid with the biggest markers.
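
    A toy version of the kernel-matching step is sketched below: a circular "expected marker shape" kernel is cross-correlated with the image, and local maxima above a threshold are reported as markers. The kernel geometry and the threshold are exactly the quantities a genetic algorithm would tune; the synthetic image, kernel shape and threshold value here are illustrative assumptions, not the ELITE-S2 implementation:

    ```python
    import numpy as np
    from scipy.signal import correlate2d

    def marker_kernel(size=7, radius=2.5):
        """Circular 'expected marker shape' kernel (the object tuned by the GA in the paper)."""
        y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
        return (x**2 + y**2 <= radius**2).astype(float)

    def detect_markers(image, kernel, threshold):
        """Simplified matched filter: cross-correlate and keep local maxima above a threshold."""
        k = kernel - kernel.mean()                    # zero-mean kernel rejects flat background
        score = correlate2d(image, k, mode="same")
        peaks = []
        for (r, c), s in np.ndenumerate(score):
            if s > threshold and s == score[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2].max():
                peaks.append((r, c, round(float(s), 2)))
        return peaks

    # Synthetic image: two bright circular markers on a noisy background
    rng = np.random.default_rng(0)
    img = rng.normal(0.0, 0.05, (60, 60))
    yy, xx = np.mgrid[:60, :60]
    for cy, cx in [(15, 20), (40, 45)]:
        img += ((yy - cy)**2 + (xx - cx)**2 <= 6).astype(float)

    print(detect_markers(img, marker_kernel(), threshold=3.0))
    ```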

  17. Optimisation of solar synoptic observations

    NASA Astrophysics Data System (ADS)

    Klvaña, Miroslav; Sobotka, Michal; Švanda, Michal

    2012-09-01

    The development of instrumental and computer technologies is accompanied by steadily increasing needs for archiving large data volumes. The current trend to meet this requirement relies on data compression and growth of storage capacities. This approach, however, has technical and practical limits. A further reduction of the archived data volume can be achieved by optimising the archiving itself, i.e. by selecting data without losing the useful information. We describe a method of optimised archiving of solar images, based on the selection of images that contain new information. The new information content is evaluated by analysing the changes detected in the images. We present characteristics of different kinds of image changes and divide them into fictitious changes, which have a disturbing effect, and real changes, which provide new information. In block diagrams describing the selection and archiving, we demonstrate the influence of clouds, the recording of images during an active event on the Sun (including a period before the event onset), and the archiving of the long-term history of solar activity. The described optimisation technique is not suitable for helioseismology, because it does not preserve a uniform time step in the archived sequence and removes the information about solar oscillations. For long-term synoptic observations, the optimised archiving can save a large amount of storage capacity. The actual saving will depend on the setting of the change-detection sensitivity and on the capability to exclude the fictitious changes.

  18. State-of-the-Art in UAV Remote Sensing Survey - First Insights into Applications of UAV Sensing Systems

    NASA Astrophysics Data System (ADS)

    Aasen, H.

    2017-08-01

    UAVs are increasingly adopted as remote sensing platforms. Together with specialised sensors, they become powerful sensing systems for environmental monitoring and surveying. Spectral data have great potential for gathering information about biophysical and biochemical properties. Still, capturing meaningful spectral data in a reproducible way is not trivial. In the last few years, small and lightweight spectral sensors that can be carried on small flexible platforms have become available. With their adoption in the community, the responsibility for ensuring data quality is increasingly shifting from specialised companies and agencies to individual researchers or research teams. Because spectral data acquisition is complex, this poses a challenge for the community, and standardised protocols, metadata and best-practice procedures are needed to make data intercomparable. In November 2016, the ESSEM COST action Innovative optical Tools for proximal sensing of ecophysiological processes (OPTIMISE; http://optimise.dcs.aber.ac.uk/) held a workshop on best practices for UAV spectral sampling. The objective of this meeting was to trace the way from particle to pixel, identify influences on data quality and reliability, and establish how well the community is currently doing with spectral sampling from UAVs and how it can improve. Additionally, a survey was designed for distribution within the community to obtain an overview of current practices and raise awareness of the topic. This talk introduces the approach of the OPTIMISE community towards best practices in UAV spectral sampling and presents the first results of the survey (http://optimise.dcs.aber.ac.uk/uav-survey/), giving some insights into the responses of the interviewees.

  19. Conception et optimisation d'une peau en composite pour une aile adaptative [Design and optimisation of a composite skin for a morphing wing]

    NASA Astrophysics Data System (ADS)

    Michaud, Francois

    Economic and environmental concerns are major drivers for the development of new technologies in aeronautics. It is in this context that the MDO-505 project, entitled Morphing Architectures and Related Technologies for Wing Efficiency Improvement, was launched. The objective of this project is to design an active morphing wing that improves laminarity and thereby reduces the aircraft's fuel consumption and emissions. The research carried out made it possible to design and optimise an adaptive composite skin that improves laminarity while preserving its structural integrity. First, a three-step optimisation method was developed with the objective of minimising the mass of the composite skin while ensuring that, through active control of the deformable surface, it conforms to the desired aerodynamic profiles. The optimisation process also included strength, stability and stiffness constraints on the composite skin. Following the optimisation, the optimised skin was simplified to facilitate manufacturing and to comply with the design rules of Bombardier Aeronautique. This optimisation process produced a composite skin whose shape deviations or errors were greatly reduced, so as to match the optimised aerodynamic profiles as closely as possible. Aerodynamic analyses based on these shapes predicted good improvements in laminarity. Subsequently, a series of analytical validations was carried out to verify the structural integrity of the composite skin following the methods generally used by Bombardier Aeronautique. First, a comparative finite-element analysis validated that the morphing wing has a stiffness equivalent to that of the original wing section. The finite-element model was then coupled with calculation spreadsheets to validate the stability and strength of the composite skin for real aerodynamic load cases. Finally, a bolted-joint analysis was performed using an in-house tool named LJ 85 BJSFM GO.v9 developed by Bombardier Aeronautique. These analyses numerically validated the structural integrity of the composite skin for typical aeronautical loads and material allowables.

  20. Terminal spacecraft rendezvous and capture with LASSO model predictive control

    NASA Astrophysics Data System (ADS)

    Hartley, Edward N.; Gallieri, Marco; Maciejowski, Jan M.

    2013-11-01

    The recently investigated ℓasso model predictive control (MPC) is applied to the terminal phase of a spacecraft rendezvous and capture mission. The interaction between the cost function and the treatment of minimum impulse bit is also investigated. The propellant consumption with ℓasso MPC for the considered scenario is noticeably less than with a conventional quadratic cost and control actions are sparser in time. Propellant consumption and sparsity are competitive with those achieved using a zone-based ℓ1 cost function, whilst requiring fewer decision variables in the optimisation problem than the latter. The ℓasso MPC is demonstrated to meet tighter specifications on control precision and also avoids the risk of undesirable behaviours often associated with pure ℓ1 stage costs.
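
    The distinguishing ingredient of ℓasso MPC is an ℓ1 penalty on the control inputs added to a conventional quadratic stage cost. In generic form (the weights, horizon and constraint set here are placeholders, not the exact formulation used in the paper), the finite-horizon objective is

    $$
    J \;=\; \sum_{k=0}^{N-1}\Big( x_k^{\mathsf T} Q\, x_k \;+\; u_k^{\mathsf T} R\, u_k \;+\; \lambda \lVert u_k \rVert_1 \Big) \;+\; x_N^{\mathsf T} P\, x_N,
    $$

    minimised subject to the rendezvous dynamics, thrust limits and terminal constraints. The ℓ1 term drives many of the optimal thrust commands exactly to zero, which is what produces the sparse, propellant-efficient firing profiles, while requiring fewer decision variables than the zone-based ℓ1 cost mentioned above.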

  1. H∞ control problem of linear periodic piecewise time-delay systems

    NASA Astrophysics Data System (ADS)

    Xie, Xiaochen; Lam, James; Li, Panshuo

    2018-04-01

    This paper investigates the H∞ control problem based on exponential stability and weighted L2-gain analyses for a class of continuous-time linear periodic piecewise systems with time delay. A periodic piecewise Lyapunov-Krasovskii functional is developed by integrating a discontinuous time-varying matrix function with two global terms. By applying the improved constraints to the stability and L2-gain analyses, sufficient delay-dependent exponential stability and weighted L2-gain criteria are proposed for the periodic piecewise time-delay system. Based on these analyses, an H∞ control scheme is designed under the considerations of periodic state feedback control input and iterative optimisation. Finally, numerical examples are presented to illustrate the effectiveness of our proposed conditions.

  2. DMS Advanced Applications for Accommodating High Penetrations of DERs and Microgrids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pratt, Annabelle; Veda, Santosh; Maitra, Arindam

    Efficient and effective management of the electric distribution system requires an integrated approach to allow various systems to work in harmony, including distribution management systems (DMS), distributed energy resources (DERs), distributed energy resources management systems, and microgrids. This study highlights some outcomes from a recent project sponsored by the US Department of Energy, Office of Electricity Delivery and Energy Reliability, including information about (i) the architecture of these integrated systems and (ii) expanded functions of two example DMS applications to accommodate DERs: volt-var optimisation and fault location, isolation, and service restoration. In addition, the relevant DER group functions necessary to support communications between the DMS and a microgrid controller in grid-tied mode are identified.

  3. Neutron scattering, solid state NMR and quantum chemistry studies of 11-keto-progesterone

    NASA Astrophysics Data System (ADS)

    Szyczewski, A.; Hołderna-Natkaniec, K.; Natkaniec, I.

    2004-07-01

    The molecular geometry and the frequencies and intensities of the IINS and IR vibrational bands of 11-ketoprogesterone have been obtained by HF, PM3 and density functional theory (DFT) calculations with the B3LYP functional and the 6-31G(d,p) basis set. The optimised bond lengths and bond angles of the steroid skeleton are in good agreement with the X-ray data. The IR and IINS spectra of ketoprogesterone, computed at the DFT level, reproduce the vibrational wavenumbers and intensities with an accuracy that allows reliable vibrational assignments. The molecular dynamics study by 1H NMR confirmed the sequence of onset of reorientations of the successive methyl groups indicated by the quantum chemistry calculations and the INS spectra.

  4. Design and testing of a rotational brake with shear thickening fluids

    NASA Astrophysics Data System (ADS)

    Tian, Tongfei; Nakano, Masami

    2017-03-01

    A rotational brake working with a shear thickening fluid (STF) was designed and tested in this study. Thanks to the optimised design, most of the STF in the brake experiences the same shear rate when the brake rotates. The parts of this brake were fabricated with a 3D printer and then assembled manually. Three types of STFs with various carrier fluids and different particles were fabricated and tested with a rheometer. The brake with each STF was then tested separately with the rheometer. The estimated and measured torques as functions of the angular velocity agree well with each other. The stability of the rotational STF brake was investigated in repeated tests, which demonstrated that the brake remains functional over long periods of operation.

  5. Refractive index dependence of L3 photonic crystal nano-cavities.

    PubMed

    Adawi, A M; Chalcraft, A R; Whittaker, D M; Lidzey, D G

    2007-10-29

    We model the optical properties of L3 photonic crystal nano-cavities as a function of the photonic crystal membrane refractive index n using a guided mode expansion method. Band structure calculations revealed that a TE-like full band-gap exists for materials of refractive index as low as 1.6. The Q-factor of such cavities showed a super-linear increase with refractive index. By adjusting the relative position of the cavity side holes, the Q-factor was optimised as a function of the photonic crystal membrane refractive index n over the range 1.6 to 3.4. Q-factors in the range 3000-8000 were predicted from absorption free materials in the visible range with refractive index between 2.45 and 2.8.

  6. A density difference based analysis of orbital-dependent exchange-correlation functionals

    NASA Astrophysics Data System (ADS)

    Grabowski, Ireneusz; Teale, Andrew M.; Fabiano, Eduardo; Śmiga, Szymon; Buksztel, Adam; Della Sala, Fabio

    2014-03-01

    We present a density difference based analysis for a range of orbital-dependent Kohn-Sham functionals. Results for atoms, some members of the neon isoelectronic series and small molecules are reported and compared with ab initio wave function calculations. Particular attention is paid to the quality of approximations to the exchange-only optimised effective potential (OEP) approach: we consider both the localised Hartree-Fock as well as the Krieger-Li-Iafrate methods. Analysis of density differences at the exchange-only level reveals the impact of the approximations on the resulting electronic densities. These differences are further quantified in terms of the ground state energies, frontier orbital energy differences and highest occupied orbital energies obtained. At the correlated level, an OEP approach based on a perturbative second-order correlation energy expression is shown to deliver results comparable with those from traditional wave function approaches, making it suitable for use as a benchmark against which to compare standard density functional approximations.

  7. Optimising Service Delivery of AAC AT Devices and Compensating AT for Dyslexia.

    PubMed

    Roentgen, Uta R; Hagedoren, Edith A V; Horions, Katrien D L; Dalemans, Ruth J P

    2017-01-01

    To promote the successful use of Assistive Technology (AT) supporting Augmentative and Alternative Communication (AAC) and compensating for dyslexia, the last steps of their provision (delivery and instruction, use, maintenance and evaluation) were optimised. In co-creation with all stakeholders, and based on a list of requirements, an integral method and tools were developed.

  8. 76 FR 21087 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-14

    ... developed an enhanced technology trading platform (the "Optimise platform"). To assure a smooth transition... Optimise trading platform and will continue to do so up to the launch of the new technology and during the... tested and is available for the launch. The Exchange believes that it will be less disruptive to members...

  9. High School Learners' Mental Construction during Solving Optimisation Problems in Calculus: A South African Case Study

    ERIC Educational Resources Information Center

    Brijlall, Deonarain; Ndlovu, Zanele

    2013-01-01

    This qualitative case study in a rural school in Umgungundlovu District in KwaZulu-Natal, South Africa, explored Grade 12 learners' mental constructions of mathematical knowledge during engagement with optimisation problems. Ten Grade 12 learners who do pure Mathematics participated, and data were collected through structured activity sheets and…

  10. Optimising the Blended Learning Environment: The Arab Open University Experience

    ERIC Educational Resources Information Center

    Hamdi, Tahrir; Abu Qudais, Mohammed

    2018-01-01

    This paper will offer some insights into possible ways to optimise the blended learning environment based on experience with this modality of teaching at Arab Open University/Jordan branch and also by reflecting upon the results of several meta-analytical studies, which have shown blended learning environments to be more effective than their face…

  11. Critical review of membrane bioreactor models--part 2: hydrodynamic and integrated models.

    PubMed

    Naessens, W; Maere, T; Ratkovich, N; Vedantam, S; Nopens, I

    2012-10-01

    Membrane bioreactor technology exists for a couple of decades, but has not yet overwhelmed the market due to some serious drawbacks of which operational cost due to fouling is the major contributor. Knowledge buildup and optimisation for such complex systems can heavily benefit from mathematical modelling. In this paper, the vast literature on hydrodynamic and integrated MBR modelling is critically reviewed. Hydrodynamic models are used at different scales and focus mainly on fouling and only little on system design/optimisation. Integrated models also focus on fouling although the ones including costs are leaning towards optimisation. Trends are discussed, knowledge gaps identified and interesting routes for further research suggested. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Dopamine reward prediction error responses reflect marginal utility.

    PubMed

    Stauffer, William R; Lak, Armin; Schultz, Wolfram

    2014-11-03

    Optimal choices require an accurate neuronal representation of economic value. In economics, utility functions are mathematical representations of subjective value that can be constructed from choices under risk. Utility usually exhibits a nonlinear relationship to physical reward value that corresponds to risk attitudes and reflects the increasing or decreasing marginal utility obtained with each additional unit of reward. Accordingly, neuronal reward responses coding utility should robustly reflect this nonlinearity. In two monkeys, we measured utility as a function of physical reward value from meaningful choices under risk (that adhered to first- and second-order stochastic dominance). The resulting nonlinear utility functions predicted the certainty equivalents for new gambles, indicating that the functions' shapes were meaningful. The monkeys were risk seeking (convex utility function) for low reward and risk avoiding (concave utility function) with higher amounts. Critically, the dopamine prediction error responses at the time of reward itself reflected the nonlinear utility functions measured at the time of choices. In particular, the reward response magnitude depended on the first derivative of the utility function and thus reflected the marginal utility. Furthermore, dopamine responses recorded outside of the task reflected the marginal utility of unpredicted reward. Accordingly, these responses were sufficient to train reinforcement learning models to predict the behaviorally defined expected utility of gambles. These data suggest a neuronal manifestation of marginal utility in dopamine neurons and indicate a common neuronal basis for fundamental explanatory constructs in animal learning theory (prediction error) and economic decision theory (marginal utility). Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
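
    A compact way to state the relationship reported above (standard notation, not the authors' own formalism): for a delivered reward r drawn from a gamble with outcome distribution p, the utility prediction error is

    $$
    \delta \;=\; U(r) \;-\; \mathbb{E}_{p}\big[\,U(R)\,\big], \qquad \delta \;\approx\; U'(r_0)\,\Delta r \;\; \text{for a small deviation } \Delta r \text{ about an anticipated amount } r_0,
    $$

    so the magnitude of the response scales with the slope U' of the utility function, i.e. with the marginal utility, which is the dependence on the first derivative described above.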

  13. Sampling with poling-based flux balance analysis: optimal versus sub-optimal flux space analysis of Actinobacillus succinogenes.

    PubMed

    Binns, Michael; de Atauri, Pedro; Vlysidis, Anestis; Cascante, Marta; Theodoropoulos, Constantinos

    2015-02-18

    Flux balance analysis is traditionally implemented to identify the maximum theoretical flux for some specified reaction and a single distribution of flux values for all the reactions present which achieve this maximum value. However it is well known that the uncertainty in reaction networks due to branches, cycles and experimental errors results in a large number of combinations of internal reaction fluxes which can achieve the same optimal flux value. In this work, we have modified the applied linear objective of flux balance analysis to include a poling penalty function, which pushes each new set of reaction fluxes away from previous solutions generated. Repeated poling-based flux balance analysis generates a sample of different solutions (a characteristic set), which represents all the possible functionality of the reaction network. Compared to existing sampling methods, for the purpose of generating a relatively "small" characteristic set, our new method is shown to obtain a higher coverage than competing methods under most conditions. The influence of the linear objective function on the sampling (the linear bias) constrains optimisation results to a subspace of optimal solutions all producing the same maximal fluxes. Visualisation of reaction fluxes plotted against each other in 2 dimensions with and without the linear bias indicates the existence of correlations between fluxes. This method of sampling is applied to the organism Actinobacillus succinogenes for the production of succinic acid from glycerol. A new method of sampling for the generation of different flux distributions (sets of individual fluxes satisfying constraints on the steady-state mass balances of intermediates) has been developed using a relatively simple modification of flux balance analysis to include a poling penalty function inside the resulting optimisation objective function. This new methodology can achieve a high coverage of the possible flux space and can be used with and without linear bias to show optimal versus sub-optimal solution spaces. Basic analysis of the Actinobacillus succinogenes system using sampling shows that in order to achieve the maximal succinic acid production CO₂ must be taken into the system. Solutions involving release of CO₂ all give sub-optimal succinic acid production.
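
    A highly simplified sketch of the poling idea is given below for a toy network with two parallel routes, so that the FBA optimum is degenerate. The network, the inverse-squared-distance penalty form and the weights are illustrative assumptions rather than the paper's formulation; in the paper the target flux can additionally be pinned at its FBA maximum to sample only the optimal space, or relaxed to explore sub-optimal solutions:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy network with two parallel routes from A to B (degenerate optimum):
    #   R1: -> A,  R2: A -> B,  R3: A -> B,  R4: B -> (objective)
    S = np.array([[1.0, -1.0, -1.0, 0.0],     # metabolite A balance
                  [0.0,  1.0,  1.0, -1.0]])   # metabolite B balance
    bounds = [(0.0, 10.0)] * 4
    c = np.array([0.0, 0.0, 0.0, 1.0])        # maximise the export flux v4

    def solve(previous, weight=5.0, eps=1e-3):
        # Maximise c.v minus a poling penalty that repels the solution from earlier ones.
        def objective(v):
            poling = sum(1.0 / (np.sum((v - p) ** 2) + eps) for p in previous)
            return -(c @ v) + weight * poling
        cons = [{"type": "eq", "fun": lambda v: S @ v}]     # steady-state mass balances
        res = minimize(objective, x0=np.full(4, 1.0), bounds=bounds,
                       constraints=cons, method="SLSQP")
        return res.x

    samples = []
    for _ in range(4):
        samples.append(solve(samples))
    print(np.round(np.array(samples), 2))     # similar optimal v4, different v2/v3 splits
    ```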

  14. Simultaneous determination of selected biogenic amines in alcoholic beverage samples by isotachophoretic and chromatographic methods.

    PubMed

    Jastrzębska, Aneta; Piasta, Anna; Szłyk, Edward

    2014-01-01

    A simple and useful method for the determination of biogenic amines in beverage samples based on isotachophoretic separation is described. The proposed procedure permitted simultaneous analysis of histamine, tyramine, cadaverine, putrescine, tryptamine, 2-phenylethylamine, spermine and spermidine. The data presented demonstrate the utility, simplicity, flexibility, sensitivity and environmentally friendly character of the proposed method. The precision of the method, expressed as coefficients of variation, ranged from 0.1% to 5.9% for beverage samples, whereas recoveries ranged from 91% to 101%. The results for the determination of biogenic amines were compared with an HPLC procedure based on a pre-column derivatisation reaction of biogenic amines with dansyl chloride. Furthermore, the derivatisation procedure was optimised by examining the buffer concentration and pH, the addition of organic solvents, and the reaction time and temperature.

  15. Dopamine Reward Prediction Error Responses Reflect Marginal Utility

    PubMed Central

    Stauffer, William R.; Lak, Armin; Schultz, Wolfram

    2014-01-01

    Summary Background Optimal choices require an accurate neuronal representation of economic value. In economics, utility functions are mathematical representations of subjective value that can be constructed from choices under risk. Utility usually exhibits a nonlinear relationship to physical reward value that corresponds to risk attitudes and reflects the increasing or decreasing marginal utility obtained with each additional unit of reward. Accordingly, neuronal reward responses coding utility should robustly reflect this nonlinearity. Results In two monkeys, we measured utility as a function of physical reward value from meaningful choices under risk (that adhered to first- and second-order stochastic dominance). The resulting nonlinear utility functions predicted the certainty equivalents for new gambles, indicating that the functions’ shapes were meaningful. The monkeys were risk seeking (convex utility function) for low reward and risk avoiding (concave utility function) with higher amounts. Critically, the dopamine prediction error responses at the time of reward itself reflected the nonlinear utility functions measured at the time of choices. In particular, the reward response magnitude depended on the first derivative of the utility function and thus reflected the marginal utility. Furthermore, dopamine responses recorded outside of the task reflected the marginal utility of unpredicted reward. Accordingly, these responses were sufficient to train reinforcement learning models to predict the behaviorally defined expected utility of gambles. Conclusions These data suggest a neuronal manifestation of marginal utility in dopamine neurons and indicate a common neuronal basis for fundamental explanatory constructs in animal learning theory (prediction error) and economic decision theory (marginal utility). PMID:25283778

  16. Efficiency of "Prescribe Vida Saludable", a health promotion innovation. Pilot phase.

    PubMed

    Sanz-Guinea, Aitor; Espinosa, Maite; Grandes, Gonzalo; Sáncheza, Álvaro; Martínez, Catalina; Pombo, Haizea; Bully, Paola; Cortada, Josep

    "Prescribe Vida Saludable" (PVS) is an organisational innovation designed to optimise the promotion of multiple healthy habits in primary healthcare. It aims to estimate the cost effectiveness and cost-utility of prescribing physical activity in the pilot phase of the PVS programme, compared to the routine clinical practice of promoting physical activity in primary healthcare. An economic evaluation of the quasi-experimental pilot phase of PVS was carried out. In the four control centres, a systematic sample was selected of 194 patients who visited the centre in a single year and who did not comply with physical activity recommendations. In the four intervention centres, 122 patients who received their first physical activity prescription were consecutively enrolled. The costs were evaluated from the perspective of the PVS programme using bottom-up methodology. The effectiveness (proportion of patients who changed their physical activity) as well as the utility were evaluated at baseline and after 3 months. The incremental cost-utility ratio (ICUR) and the incremental cost-effectiveness ratio (ICER) were calculated and a sensitivity analysis was performed with bootstrapping and 1,000 replications. Information was obtained from 35% of control cases and 62% of intervention cases. The ICUR was €1,234.66/Quality Adjusted Life Years (QALY) and the ICER was €4.12. In 98.3% of the simulations, the ICUR was below the €30,000/QALY threshold. The prescription of physical activity was demonstrably within acceptable cost-utility limits in the pilot PVS phase, even from a conservative perspective. Copyright © 2017 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.

  17. A Mariner Transposon-Based Signature-Tagged Mutagenesis System for the Analysis of Oral Infection by Listeria monocytogenes

    PubMed Central

    Cummins, Joanne; Casey, Pat G.; Joyce, Susan A.; Gahan, Cormac G. M.

    2013-01-01

    Listeria monocytogenes is a Gram-positive foodborne pathogen and the causative agent of listeriosis, a disease that manifests predominately as meningitis in the non-pregnant individual or as infection of the fetus and spontaneous abortion in pregnant women. Common-source outbreaks of foodborne listeriosis are associated with significant morbidity and mortality. However, relatively little is known concerning the mechanisms that govern infection via the oral route. In order to aid functional genetic analysis of the gastrointestinal phase of infection, we designed a novel signature-tagged mutagenesis (STM) system based upon the invasive L. monocytogenes 4b serotype H7858 strain. To overcome the limitations of gastrointestinal infection by L. monocytogenes in the mouse model, we created a H7858 strain that is genetically optimised for oral infection in mice. Furthermore, our STM system was based upon a mariner transposon to favour numerous and random transposition events throughout the L. monocytogenes genome. Use of the STM bank to investigate oral infection by L. monocytogenes identified 21 insertion mutants that demonstrated significantly reduced potential for infection in our model. The sites of transposon insertion included lmOh7858_0671 (encoding an internalin homologous to Lmo0610), lmOh7858_0898 (encoding a putative surface-expressed LPXTG protein homologous to Lmo0842), lmOh7858_2579 (encoding the HupDGC hemin transport system) and lmOh7858_0399 (encoding a putative fructose-specific phosphotransferase system). We propose that this represents an optimised STM system for functional genetic analysis of foodborne/oral infection by L. monocytogenes. PMID:24069416

  18. Factor VIII organisation on nanodiscs with different lipid composition.

    PubMed

    Grushin, Kirill; Miller, Jaimy; Dalm, Daniela; Stoilova-McPhie, Svetla

    2015-04-01

    Nanodiscs (ND) are lipid bilayer membrane patches, approximately 10 nm in diameter, held together by amphiphilic membrane scaffolding proteins (MSP). Nanodiscs have been developed as lipid nanoplatforms for structural and functional studies of membrane and membrane-associated proteins. Their size and monodispersity have rendered them unique for electron microscopy (EM) and single particle analysis studies of proteins and complexes either spanning or associated with the ND membrane. Binding of blood coagulation factors and complexes, such as Factor VIII (FVIII) and the Factor VIIIa - Factor IXa (intrinsic tenase) complex, to the negatively charged activated platelet membrane is required for normal haemostasis. In this study we present our work on optimising ND specifically designed to bind FVIII at close to physiological conditions. The binding of FVIII to the negatively charged ND rich in phosphatidylserine (PS) was followed by electron microscopy at three different PS compositions and two different membrane scaffolding protein (MSP1D1) to lipid ratios. Our results show that the ND with the highest PS content (80%) and lowest MSP1D1 to lipid ratio (1:47) are the most suitable for structure determination of membrane-bound FVIII by single particle EM. Our preliminary 3D reconstruction of FVIII bound to PS-containing ND demonstrates the suitability of the optimised ND for structural studies by EM. Further assembly of the activated FVIII form (FVIIIa) and the whole FVIIIa-FIXa complex on ND, followed by EM and single particle reconstruction, will help to identify the protein-protein and protein-membrane interfaces critical for intrinsic tenase complex assembly and function.

  19. [Cross-sectoral quality assurance in ambulatory care].

    PubMed

    Albrecht, Martin; Loos, Stefan; Otten, Marcus

    2013-01-01

    Overcoming rigid sectoral segmentation in healthcare has also become a health policy target in quality assurance. With the Act to Enhance Competition in Statutory Health Insurance (GKV-WSG) coming into effect, quality assurance measures are to be designed in a cross-sectoral fashion for the in- and outpatient sectors equally. An independent institution is currently mandated to develop specific quality indicators for eleven indications; for three of these, operating tests have already been commissioned by the Federal Joint Committee. This article depicts the major results of a feasibility study, including a compliance cost estimate, for the aforementioned indications of cross-sectoral quality assurance (cQA). In conclusion, a number of basic practical and conceptual challenges are still to be resolved prior to the full implementation of cQA, such as a sufficient specification to activate documentation requirements and an inspection system capable of separating actual quality problems from documentary deficits. So far, a comprehensive cost-utility analysis of cQA has not been provided, in particular in comparison to existing QA systems. In order to optimise the cost and utility of cQA, an evidence-based approach is required both for the extension of cQA areas and for QA provisions. Copyright © 2013. Published by Elsevier GmbH.

  20. The optimisation of low-acceleration interstellar relativistic rocket trajectories using genetic algorithms

    NASA Astrophysics Data System (ADS)

    Fung, Kenneth K. H.; Lewis, Geraint F.; Wu, Xiaofeng

    2017-04-01

    A vast wealth of literature exists on the topic of rocket trajectory optimisation, particularly in the area of interplanetary trajectories due to its relevance today. Studies on optimising interstellar and intergalactic trajectories are usually performed in flat spacetime using an analytical approach, with very little focus on optimising interstellar trajectories in a general relativistic framework. This paper examines the use of low-acceleration rockets to reach galactic destinations in the least possible time, with a genetic algorithm being employed for the optimisation process. The fuel required for each journey was calculated for various types of propulsion systems to determine the viability of low-acceleration rockets to colonise the Milky Way. The results showed that to limit the amount of fuel carried on board, an antimatter propulsion system would likely be the minimum technological requirement to reach star systems tens of thousands of light years away. However, using a low-acceleration rocket would require several hundreds of thousands of years to reach these star systems, with minimal time dilation effects since maximum velocities only reached about 0.2c. Such transit times are clearly impractical, and thus any kind of colonisation using low-acceleration rockets would be difficult. High accelerations, on the order of 1 g, are likely required to complete interstellar journeys within a reasonable time frame, though they may require prohibitively large amounts of fuel. So for now, it appears that humanity's ultimate goal of a galactic empire may only be possible at significantly higher accelerations, though the propulsion technology required for a journey that uses realistic amounts of fuel remains to be determined.
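
    The sketch below shows the genetic-algorithm structure on a deliberately simplified stand-in problem: a one-parameter accelerate-then-decelerate profile scored with the standard special-relativistic constant-proper-acceleration kinematics and a photon-rocket fuel penalty. The distance, fuel weighting and GA settings are invented for illustration, and the paper's general relativistic galactic model is not reproduced.

    ```python
    # Illustrative sketch only: a small genetic algorithm over a one-parameter
    # accelerate-then-decelerate profile, using the standard special-relativistic
    # constant-proper-acceleration kinematics and a photon-rocket fuel penalty.
    # All weights and bounds are made-up; only the GA structure is of interest.
    import numpy as np

    C = 2.998e8                  # speed of light, m/s
    LY = 9.461e15                # one light year, m
    D = 1.0e4 * LY               # hypothetical 10,000 light-year journey
    G0 = 9.81

    def trip(accel):
        """Coordinate time (yr) and photon-rocket mass ratio for accel/decel at 'accel' m/s^2."""
        half = D / 2.0
        t_coord = 2.0 * (C / accel) * np.sqrt((1.0 + accel * half / C**2) ** 2 - 1.0)
        tau = 2.0 * (C / accel) * np.arccosh(1.0 + accel * half / C**2)
        mass_ratio = np.exp(accel * tau / C)
        return t_coord / 3.156e7, mass_ratio

    def fitness(accel):
        """Lower is better: travel time plus an (arbitrary) penalty on fuel mass ratio."""
        t_yr, mass_ratio = trip(accel)
        return t_yr + 1.0e4 * np.log(mass_ratio)

    rng = np.random.default_rng(1)
    pop = rng.uniform(1e-5 * G0, 2.0 * G0, size=40)               # accelerations in m/s^2
    for generation in range(60):
        scores = np.array([fitness(a) for a in pop])
        parents = pop[np.argsort(scores)[:20]]                    # truncation selection
        children = 0.5 * (parents + rng.permutation(parents))     # arithmetic crossover
        children *= rng.lognormal(0.0, 0.1, size=children.shape)  # multiplicative mutation
        pop = np.concatenate([parents, children])

    best = pop[np.argmin([fitness(a) for a in pop])]
    t_yr, mass_ratio = trip(best)
    print(f"best acceleration {best / G0:.2e} g, trip {t_yr:,.0f} yr, mass ratio {mass_ratio:,.1f}")
    ```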

  1. AlphaMate: a program for optimising selection, maintenance of diversity, and mate allocation in breeding programs.

    PubMed

    Gorjanc, Gregor; Hickey, John M

    2018-05-02

    AlphaMate is a flexible program that optimises selection, maintenance of genetic diversity, and mate allocation in breeding programs. It can be used in animal and cross- and self-pollinating plant populations. These populations can be subject to selective breeding or conservation management. The problem is formulated as a multi-objective optimisation of a valid mating plan that is solved with an evolutionary algorithm. A valid mating plan is defined by a combination of mating constraints (the number of matings, the maximal number of parents, the minimal/equal/maximal number of contributions per parent, or allowance for selfing) that are gender-specific or generic. The optimisation can maximize genetic gain, minimize group coancestry, minimize inbreeding of individual matings, or maximize genetic gain for a given increase in group coancestry or inbreeding. Users provide a list of candidate individuals with associated gender and selection criteria information (if applicable) and a coancestry matrix. Selection criteria and the coancestry matrix can be based on pedigree or genome-wide markers. Additional individual- or mating-specific information can be included to enrich the optimisation objectives. An example of rapid recurrent genomic selection in wheat demonstrates how AlphaMate can double the efficiency of converting genetic diversity into genetic gain compared to truncation selection. Another example demonstrates the use of genome editing to expand the gain-diversity frontier. Executable versions of AlphaMate for Windows, Mac, and Linux platforms are available at http://www.AlphaGenes.roslin.ed.ac.uk/AlphaMate. gregor.gorjanc@roslin.ed.ac.uk.
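
    A generic sketch of the underlying idea (not AlphaMate's algorithm, constraints or file formats) is given below: an evolutionary loop searches contribution vectors that trade genetic gain (c'·EBV) against group coancestry (c'Ac/2), with a synthetic set of breeding values, a synthetic relationship matrix and an arbitrary penalty weight.

    ```python
    # Generic sketch of the gain-versus-coancestry trade-off (not AlphaMate itself):
    # an evolutionary loop over contribution vectors with synthetic data.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 30                                   # candidate parents
    ebv = rng.normal(0.0, 1.0, n)            # estimated breeding values
    L = rng.normal(0.0, 0.1, (n, n))
    A = np.eye(n) + L @ L.T                  # synthetic positive-definite relationship matrix

    def normalise(c):
        c = np.clip(c, 0.0, None)
        return c / c.sum()

    def objective(c, penalty=5.0):
        """Maximise gain while penalising group coancestry (penalty weight is arbitrary)."""
        gain = c @ ebv
        coancestry = 0.5 * c @ A @ c
        return gain - penalty * coancestry

    pop = [normalise(rng.random(n)) for _ in range(60)]
    for _ in range(300):
        scores = [objective(c) for c in pop]
        elite = [pop[i] for i in np.argsort(scores)[-20:]]                      # keep the best
        offspring = [normalise(e + rng.normal(0.0, 0.02, n)) for e in elite for _ in range(2)]
        pop = elite + offspring

    best = max(pop, key=objective)
    print("gain:", round(best @ ebv, 3), "coancestry:", round(0.5 * best @ A @ best, 3))
    print("selected parents (contribution > 1%):", np.where(best > 0.01)[0].tolist())
    ```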

  2. Analysis of the car body stability performance after coupler jack-knifing during braking

    NASA Astrophysics Data System (ADS)

    Guo, Lirong; Wang, Kaiyun; Chen, Zaigang; Shi, Zhiyong; Lv, Kaikai; Ji, Tiancheng

    2018-06-01

    This paper aims to improve car body stability performance by optimising locomotive parameters when coupler jack-knifing occurs during braking. In order to prevent car body instability caused by coupler jack-knifing, a multi-locomotive simulation model and a series of field braking tests were developed to analyse the influence of the secondary suspension and the secondary lateral stopper on car body stability during braking. According to the simulation and test results, increasing the secondary lateral stiffness contributes to limiting the car body yaw angle during braking; however, it seriously affects the dynamic performance of the locomotive. For the secondary lateral stopper, its lateral stiffness and free clearance have a significant influence on improving the car body stability capacity, and have less effect on the dynamic performance of the locomotive. An optimised measure was proposed and adopted on the test locomotive. For the optimised locomotive, the lateral stiffness of the secondary lateral stopper is increased to 7875 kN/m, while its free clearance is decreased to 10 mm. The optimised locomotive has excellent dynamic and safety performance. Compared with the original locomotive, the maximum car body yaw angle and coupler rotation angle of the optimised locomotive were reduced by 59.25% and 53.19%, respectively, according to the practical application. The maximum derailment coefficient was 0.32, and the maximum wheelset lateral force was 39.5 kN. Hence, reasonable parameters of the secondary lateral stopper can improve the car body stability capacity and the running safety of the heavy haul locomotive.

  3. Dietary optimisation with omega-3 and omega-6 fatty acids for 12-23-month-old overweight and obese children in urban Jakarta.

    PubMed

    Cahyaningrum, Fitrianna; Permadhi, Inge; Ansari, Muhammad Ridwan; Prafiantini, Erfi; Rachman, Purnawati Hustina; Agustina, Rina

    2016-12-01

    Diets with a specific omega-6/omega-3 fatty acid ratio have been reported to have favourable effects in controlling obesity in adults. However, the development of a locally based diet that considers the ratio of these fatty acids to improve the nutritional status of overweight and obese children is lacking. Therefore, using linear programming, we developed an affordable optimised diet focusing on the ratio of omega-6/omega-3 fatty acid intake for obese children aged 12-23 months. A cross-sectional study was conducted in two subdistricts of East Jakarta involving 42 normal-weight and 29 overweight and obese children, grouped on the basis of their body mass index-for-age Z scores and selected through multistage random sampling. A 24-h recall was performed for 3 non-consecutive days to assess the children's dietary intake levels and food patterns. We conducted group and structured interviews as well as market surveys to identify food availability, accessibility and affordability. Three types of affordable optimised 7-day diet meal plans were developed on the basis of breastfeeding status. The optimised diet plan fulfilled energy and macronutrient intake requirements within the acceptable macronutrient distribution range. The omega-6/omega-3 fatty acid ratio in the children was between 4 and 10. Moreover, the micronutrient intake level was within the range of the recommended daily allowance or estimated average recommendation and the tolerable upper intake level. The optimisation model used in this study provides a mathematical solution for economical diet meal plans that approximate the nutrient requirements for overweight and obese children.
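
    The linear-programming formulation can be illustrated with a toy food table (invented servings, costs and fatty-acid contents, not the Jakarta data): minimise daily cost subject to an energy target while keeping the omega-6/omega-3 ratio between 4 and 10, the ratio being linearised as two inequality constraints.

    ```python
    # Minimal linear-programming sketch (toy food table, not the study's data):
    # minimise daily cost subject to an energy target and a linearised
    # omega-6/omega-3 ratio constraint 4 <= n6/n3 <= 10.
    import numpy as np
    from scipy.optimize import linprog

    foods = ["rice", "tempeh", "fish", "oil", "banana"]
    cost = np.array([0.02, 0.05, 0.15, 0.10, 0.03])     # cost per 10 g serving (arbitrary units)
    kcal = np.array([36.0, 19.0, 12.0, 88.0, 9.0])      # energy per serving (illustrative)
    n6 = np.array([0.01, 0.35, 0.02, 2.0, 0.01])        # omega-6 g per serving (illustrative)
    n3 = np.array([0.00, 0.02, 0.25, 0.9, 0.00])        # omega-3 g per serving (illustrative)

    # Inequalities in the form A_ub @ x <= b_ub.
    A_ub = [-kcal,            # energy >= 900 kcal  ->  -kcal @ x <= -900
            n6 - 10.0 * n3,   # n6/n3 <= 10         ->  n6 - 10*n3 <= 0
            4.0 * n3 - n6]    # n6/n3 >= 4          ->  4*n3 - n6 <= 0
    b_ub = [-900.0, 0.0, 0.0]

    res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, 30)] * len(foods),        # at most 30 servings of any food
                  method="highs")
    for name, servings in zip(foods, res.x):
        print(f"{name:7s} {servings:5.1f} servings")
    print(f"cost {res.fun:.2f}, kcal {kcal @ res.x:.0f}, n6/n3 {(n6 @ res.x) / (n3 @ res.x):.1f}")
    ```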

  4. The Optimisation of the Expression of Recombinant Surface Immunogenic Protein of Group B Streptococcus in Escherichia coli by Response Surface Methodology Improves Humoral Immunity.

    PubMed

    Díaz-Dinamarca, Diego A; Jerias, José I; Soto, Daniel A; Soto, Jorge A; Díaz, Natalia V; Leyton, Yessica Y; Villegas, Rodrigo A; Kalergis, Alexis M; Vásquez, Abel E

    2018-03-01

    Group B Streptococcus (GBS) is the leading cause of neonatal meningitis and a common pathogen in the livestock and aquaculture industries around the world. Conjugate polysaccharide and protein-based vaccines are under development. The surface immunogenic protein (SIP) is conserved in all GBS serotypes and has been shown to be a good target for vaccine development. The expression of recombinant proteins in Escherichia coli cells has been shown to be useful in the development of vaccines, and protein purification is a factor affecting their immunogenicity. The response surface methodology (RSM) and the Box-Behnken design can optimise the performance of recombinant protein expression. However, the biological effect in mice immunised with an immunogenic protein that is optimised by RSM and purified by low-affinity chromatography is unknown. In this study, we used RSM to optimise the expression of rSIP, and we evaluated the SIP-specific humoral response and the ability to decrease GBS colonisation in the vaginal tract in female mice. Ni-NTA chromatography showed that RSM increases the yield of rSIP expression, enabling a better purification process. This improvement in rSIP purification suggests a better induction of the IgG anti-SIP immune response and a positive effect on decreasing GBS intravaginal colonisation. RSM applied to optimise the expression of recombinant proteins with immunogenic capacity is an interesting alternative in the evaluation of vaccines in the preclinical phase, which could improve their immune response.
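
    A generic response-surface sketch is given below with synthetic yields (the factors named in the comments are hypothetical, not the rSIP expression conditions): a full second-order model is fitted by least squares to a three-level design and the predicted optimum is located on a grid. A Box-Behnken design would use a subset of these runs.

    ```python
    # Generic response-surface sketch (synthetic data, not the rSIP measurements):
    # fit a full second-order model to coded factors and locate the predicted optimum.
    import itertools
    import numpy as np

    # Three coded factors (e.g. inducer concentration, temperature, induction time are
    # hypothetical here), each at levels -1, 0, +1 (full three-level factorial for
    # simplicity; a Box-Behnken design would use a subset of these runs).
    rng = np.random.default_rng(3)
    design = np.array([p for p in itertools.product([-1, 0, 1], repeat=3)], dtype=float)
    true = lambda x: 10 + 2*x[:, 0] - 1.5*x[:, 1] + x[:, 2] - 2*x[:, 0]**2 - x[:, 1]**2 - x[:, 2]**2
    yield_obs = true(design) + rng.normal(0, 0.2, len(design))   # observed protein yield

    def features(x):
        """Intercept, linear, squared and two-way interaction terms."""
        x1, x2, x3 = x[:, 0], x[:, 1], x[:, 2]
        return np.column_stack([np.ones(len(x)), x1, x2, x3,
                                x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

    beta, *_ = np.linalg.lstsq(features(design), yield_obs, rcond=None)

    grid = np.array([p for p in itertools.product(np.linspace(-1, 1, 21), repeat=3)])
    pred = features(grid) @ beta
    best = grid[np.argmax(pred)]
    print("fitted coefficients:", beta.round(2))
    print("predicted optimum (coded units):", best, "predicted yield:", round(pred.max(), 2))
    ```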

  5. Optimising the Collaborative Practice of Nurses in Primary Care Settings Using a Knowledge Translation Approach

    ERIC Educational Resources Information Center

    Oelke, Nelly; Wilhelm, Amanda; Jackson, Karen

    2016-01-01

    The role of nurses in primary care is poorly understood and many are not working to their full scope of practice. Building on previous research, this knowledge translation (KT) project's aim was to facilitate nurses' capacity to optimise their practice in these settings. A Summit engaging Alberta stakeholders in a deliberative discussion was the…

  6. Optimising fuel treatments over time and space

    Treesearch

    Woodam Chung; Greg Jones; Kurt Krueger; Jody Bramel; Marco Contreras

    2013-01-01

    Fuel treatments have been widely used as a tool to reduce catastrophic wildland fire risks in many forests around the world. However, it is a challenging task for forest managers to prioritise where, when and how to implement fuel treatments across a large forest landscape. In this study, an optimisation model was developed for long-term fuel management decisions at a...

  7. Huffman coding in advanced audio coding standard

    NASA Astrophysics Data System (ADS)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures for the Advanced Audio Coding (AAC) Huffman noiseless encoder, their optimisations and a working implementation. Much attention has been paid to optimising the demand on hardware resources, especially memory size. The aim of the design was to produce as short a binary stream as possible within this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.
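
    For context, the sketch below is the textbook Huffman construction over a symbol sequence. It illustrates the noiseless-coding step only; the AAC standard uses fixed, predefined codebooks rather than building trees on the fly, and the paper's hardware architecture is not reproduced here.

    ```python
    # Textbook Huffman code construction (illustrative of the noiseless-coding step;
    # AAC itself uses predefined codebooks rather than per-stream trees).
    import heapq
    from collections import Counter

    def huffman_code(symbols):
        """Return a prefix-free code (symbol -> bitstring) for the given sequence."""
        freq = Counter(symbols)
        # Each heap entry: (weight, tie-breaker, [(symbol, code_so_far), ...])
        heap = [(w, i, [(s, "")]) for i, (s, w) in enumerate(freq.items())]
        heapq.heapify(heap)
        if len(heap) == 1:                      # degenerate single-symbol input
            return {heap[0][2][0][0]: "0"}
        tie = len(heap)
        while len(heap) > 1:
            w1, _, left = heapq.heappop(heap)
            w2, _, right = heapq.heappop(heap)
            merged = [(s, "0" + c) for s, c in left] + [(s, "1" + c) for s, c in right]
            heapq.heappush(heap, (w1 + w2, tie, merged))
            tie += 1
        return dict(heap[0][2])

    data = "abracadabra"
    code = huffman_code(data)
    bitstream = "".join(code[s] for s in data)
    print(code)
    print(f"{len(bitstream)} bits vs {8 * len(data)} bits uncoded")
    ```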

  8. Fox-7 for Insensitive Boosters

    DTIC Science & Technology

    2010-08-01

    Particle processing techniques using ultrasound, designed to optimise FOX-7 crystal size and morphology to improve booster formulations, are described, including the method of ultrasound delivery and the frequencies required for cavitation, and therefore nucleation, to occur at each frequency, together with results from this work as applied to FOX-7 booster formulations.

  9. Estimation of Power Consumption in the Circular Sawing of Stone Based on Tangential Force Distribution

    NASA Astrophysics Data System (ADS)

    Huang, Guoqin; Zhang, Meiqin; Huang, Hui; Guo, Hua; Xu, Xipeng

    2018-04-01

    Circular sawing is an important method for the processing of natural stone. The ability to predict sawing power is important in the optimisation, monitoring and control of the sawing process. In this paper, a predictive model (PFD) of sawing power, based on the tangential force distribution at the sawing contact zone, was proposed, experimentally validated and modified. With regard to the influence of sawing speed on tangential force distribution, the modified PFD (MPFD) performed with high predictive accuracy across a wide range of sawing parameters, including sawing speed. The mean maximum absolute error rate was within 6.78%, and the maximum absolute error rate was within 11.7%. The practicability of predicting sawing power with the MPFD from few initial experimental samples was proved in case studies. On the premise of high sample measurement accuracy, only two samples are required for a fixed sawing speed. The feasibility of applying the MPFD to optimise sawing parameters while lowering the energy consumption of the sawing system was also validated: the case study shows that energy use was reduced by 28% by optimising the sawing parameters. The MPFD model can be used to predict sawing power, optimise sawing parameters and control energy consumption.

  10. A New Computational Technique for the Generation of Optimised Aircraft Trajectories

    NASA Astrophysics Data System (ADS)

    Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto

    2017-12-01

    A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for a global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented to allow an efficient solution of problems in which two or more performance indices are to be minimized simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two objectives simultaneously. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in-depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
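
    The ɛ-constraint idea can be illustrated on a toy bi-objective problem (two invented quadratic objectives, not the pseudospectral trajectory formulation): minimise f1 subject to f2 ≤ ɛ and sweep ɛ to trace a Pareto front. The paper adaptively bisects the ɛ range; a uniform sweep is used here for brevity.

    ```python
    # Toy epsilon-constraint sketch (not the PSD trajectory formulation):
    # minimise f1 subject to f2 <= eps and sweep eps to trace a Pareto front.
    import numpy as np
    from scipy.optimize import minimize

    # Two conflicting objectives of a 2-D decision vector (stand-ins for, e.g., fuel and time).
    f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2
    f2 = lambda x: x[0] ** 2 + (x[1] - 1.0) ** 2

    pareto = []
    for eps in np.linspace(0.0, 2.0, 9):
        con = {"type": "ineq", "fun": lambda x, e=eps: e - f2(x)}   # enforces f2(x) <= eps
        res = minimize(f1, x0=np.array([0.5, 0.5]), constraints=[con], method="SLSQP")
        if res.success:
            pareto.append((f1(res.x), f2(res.x)))

    for p1, p2 in pareto:
        print(f"f1 = {p1:.3f}  f2 = {p2:.3f}")
    ```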

  11. Optimisation of the formulation of a bubble bath by a chemometric approach: market segmentation and optimisation.

    PubMed

    Marengo, Emilio; Robotti, Elisa; Gennaro, Maria Carla; Bertetto, Mariella

    2003-03-01

    The optimisation of the formulation of a commercial bubble bath was performed by chemometric analysis of Panel Test results. A first Panel Test was performed to choose the best essence among four proposed to the consumers; the chosen essence was used in the revised commercial bubble bath. Afterwards, the effect of changing the amounts of four components of the bubble bath (the primary surfactant, the essence, the hydrating agent and the colouring agent) was studied by a fractional factorial design. The segmentation of the bubble bath market was performed by a second Panel Test, in which the consumers were requested to evaluate the samples coming from the experimental design. The results were then treated by Principal Component Analysis. The market had two segments: people preferring a product with a rich formulation and people preferring a leaner one. The final target, i.e. the optimisation of the formulation for each segment, was achieved by calculating regression models relating the subjective evaluations given by the Panel to the compositions of the samples. The regression models allowed the best formulations for the two segments of the market to be identified.
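
    A minimal sketch of the chemometric pipeline is given below with synthetic panel scores (not the bubble-bath data): PCA of the consumer score matrix separates two segments by the sign of their first-component loading, and a per-segment least-squares model then relates mean liking to the four coded formulation factors.

    ```python
    # Minimal sketch with synthetic data (not the Panel Test results): PCA to reveal
    # two consumer segments, then a per-segment regression of liking on composition.
    import numpy as np

    rng = np.random.default_rng(4)
    n_samples, n_consumers = 16, 40
    # Coded levels of surfactant, essence, hydrating agent and colouring agent.
    X = rng.choice([-1.0, 1.0], size=(n_samples, 4))
    richness = X.sum(axis=1)                                   # "rich" vs "poor" formulation
    prefers_rich = np.arange(n_consumers) < 25                 # hypothetical segment membership
    scores = np.where(prefers_rich, richness[:, None], -richness[:, None])
    scores = scores + rng.normal(0.0, 0.5, (n_samples, n_consumers))

    # PCA on the score matrix: consumers with opposite signs on PC1 form two segments.
    centred = scores - scores.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    segment = vt[0] > 0
    print("segment sizes:", segment.sum(), (~segment).sum())

    # Per-segment regression of mean liking on the formulation factors (with intercept).
    design = np.column_stack([np.ones(n_samples), X])
    for label, members in [("segment A", segment), ("segment B", ~segment)]:
        y = scores[:, members].mean(axis=1)
        beta, *_ = np.linalg.lstsq(design, y, rcond=None)
        print(label, "coefficients:", beta.round(2))
    ```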

  12. Subject Specific Optimisation of the Stiffness of Footwear Material for Maximum Plantar Pressure Reduction.

    PubMed

    Chatzistergos, Panagiotis E; Naemi, Roozbeh; Healy, Aoife; Gerth, Peter; Chockalingam, Nachiappan

    2017-08-01

    Current selection of cushioning materials for therapeutic footwear and orthoses is based on empirical and anecdotal evidence. The aim of this investigation is to assess the biomechanical properties of carefully selected cushioning materials and to establish the basis for patient-specific material optimisation. For this purpose, bespoke cushioning materials with qualitatively similar mechanical behaviour but different stiffness were produced. Healthy volunteers were asked to stand and walk on materials with varying stiffness, and the materials' capacity for pressure reduction was assessed. Mechanical testing using a surrogate heel model was employed to investigate the effect of loading on optimum stiffness. Results indicated that optimising the stiffness of cushioning materials improved pressure reduction during standing and walking by at least 16% and 19%, respectively. Moreover, the optimum stiffness was strongly correlated with body mass (BM) and body mass index (BMI), with stiffer materials needed for people with higher BM or BMI. Mechanical testing confirmed that optimum stiffness increases with the magnitude of compressive loading. For the first time, this study provides quantitative data to support the importance of stiffness optimisation in cushioning materials and sets the basis for methods to inform optimum material selection in the clinic.

  13. Design of distributed PID-type dynamic matrix controller for fractional-order systems

    NASA Astrophysics Data System (ADS)

    Wang, Dawei; Zhang, Ridong

    2018-01-01

    With the continuously increasing requirements for product quality and safe operation in industrial production, complex large-scale processes are difficult to describe with integer-order differential equations; fractional differential equations, however, may precisely represent the intrinsic characteristics of such systems. In this paper, a distributed PID-type dynamic matrix control method for fractional-order systems is proposed. First, a high-order integer-order approximation of the model is obtained using the Oustaloup method. Then, the step response model vectors of the plant are obtained on the basis of the high-order model, and the online optimisation of the multivariable process is transformed into the optimisation of each small-scale subsystem, each regarded as a sub-plant controlled within the distributed framework. Furthermore, the PID operator is introduced into the performance index of each subsystem and a fractional-order PID-type dynamic matrix controller is designed based on a Nash optimisation strategy. Information exchange among the subsystems is realised through the distributed control structure so as to complete the optimisation task for the whole large-scale system. Finally, the control performance of the designed controller is verified by an example.
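
    The Oustaloup step replaces the fractional operator s^α by a band-limited rational approximation. One commonly used form is sketched below; the paper may use a different variant or parameterisation (ω_b and ω_h are the fit band edges and N sets the approximation order).

    ```latex
    % One commonly used form of the Oustaloup recursive approximation of s^{\alpha}
    % over a frequency band [\omega_b, \omega_h]; other variants exist.
    \[
      s^{\alpha} \;\approx\; K \prod_{k=-N}^{N} \frac{s + \omega'_k}{s + \omega_k},
      \qquad
      \omega'_k = \omega_b \left(\tfrac{\omega_h}{\omega_b}\right)^{\frac{k+N+\frac{1}{2}(1-\alpha)}{2N+1}},
      \quad
      \omega_k = \omega_b \left(\tfrac{\omega_h}{\omega_b}\right)^{\frac{k+N+\frac{1}{2}(1+\alpha)}{2N+1}},
      \quad
      K = \omega_h^{\alpha}.
    \]
    ```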

  14. Model fitting for small skin permeability data sets: hyperparameter optimisation in Gaussian Process Regression.

    PubMed

    Ashrafi, Parivash; Sun, Yi; Davey, Neil; Adams, Roderick G; Wilkinson, Simon C; Moss, Gary Patrick

    2018-03-01

    The aim of this study was to investigate how to improve predictions from Gaussian Process models by optimising the model hyperparameters. Optimisation methods, including Grid Search, Conjugate Gradient, Random Search, Evolutionary Algorithm and Hyper-prior, were evaluated and applied to previously published data. Data sets were also altered in a structured manner to reduce their size while retaining the range, or 'chemical space', of the key descriptors in order to assess the effect of the data range on model quality. The Hyper-prior Smoothbox kernel resulted in the best models for the majority of data sets, and these exhibited significantly better performance than benchmark quantitative structure-permeability relationship (QSPR) models. When the data sets were systematically reduced in size, the different optimisation methods generally retained their statistical quality, whereas the benchmark QSPR models performed poorly. The design of the data set, and possibly also the approach to validation of the model, is critical in the development of improved models. The size of the data set, if carefully controlled, was not generally a significant factor for these models, and models of excellent statistical quality could be produced from substantially smaller data sets. © 2018 Royal Pharmaceutical Society.
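
    A self-contained sketch of the hyperparameter search is given below, with synthetic one-dimensional data standing in for the permeability sets: the Gaussian Process log marginal likelihood for an RBF kernel plus noise is written out with numpy and maximised by a grid search and by a random search with the same evaluation budget.

    ```python
    # Self-contained sketch (synthetic data, not the skin-permeability sets): the GP
    # log marginal likelihood for an RBF-plus-noise kernel, maximised by grid search
    # and by random search over the length-scale and noise hyperparameters.
    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.uniform(-3, 3, (25, 1))                     # stand-in molecular descriptor
    y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 25)        # stand-in log permeability

    def log_marginal_likelihood(lengthscale, noise):
        d2 = (X - X.T) ** 2                             # squared distances (n x n)
        K = np.exp(-0.5 * d2 / lengthscale**2) + noise**2 * np.eye(len(X))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        return (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
                - 0.5 * len(X) * np.log(2 * np.pi))

    # Grid search over 20 x 20 = 400 hyperparameter pairs.
    grid = [(l, s) for l in np.logspace(-1, 1, 20) for s in np.logspace(-2, 0, 20)]
    best_grid = max(grid, key=lambda p: log_marginal_likelihood(*p))

    # Random search with the same evaluation budget.
    candidates = [(10 ** rng.uniform(-1, 1), 10 ** rng.uniform(-2, 0)) for _ in range(400)]
    best_rand = max(candidates, key=lambda p: log_marginal_likelihood(*p))

    for name, (l, s) in [("grid", best_grid), ("random", best_rand)]:
        print(f"{name:6s} lengthscale={l:.3f} noise={s:.3f} "
              f"logML={log_marginal_likelihood(l, s):.2f}")
    ```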

  15. Implementation of the multi-channel monolith reactor in an optimisation procedure for heterogeneous oxidation catalysts based on genetic algorithms.

    PubMed

    Breuer, Christian; Lucas, Martin; Schütze, Frank-Walter; Claus, Peter

    2007-01-01

    A multi-criteria optimisation procedure based on genetic algorithms is carried out in search of advanced heterogeneous catalysts for total oxidation. Simple but flexible software routines have been created to be applied within a search space of more than 150,000 individuals. The general catalyst design includes mono-, bi- and trimetallic compositions assembled out of 49 different metals and deposited on an Al2O3 support at up to nine amount levels. As an efficient tool for high-throughput screening, perfectly matched to the requirements of heterogeneous gas-phase catalysis - especially for applications technically run in honeycomb structures - the multi-channel monolith reactor is implemented to evaluate the catalyst performances. From a multi-component feed gas, the conversion rates of carbon monoxide (CO) and a model hydrocarbon (HC) are monitored in parallel. In combination with further restrictions on preparation and pre-treatment, a primary screening can be conducted that promises to provide results close to technically applied catalysts. Presented are the resulting performances of the optimisation process for the first catalyst generations and the prospect of its auto-adaptation to specified optimisation goals.

  16. Orbital optimisation in the perfect pairing hierarchy: applications to full-valence calculations on linear polyacenes

    NASA Astrophysics Data System (ADS)

    Lehtola, Susi; Parkhill, John; Head-Gordon, Martin

    2018-03-01

    We describe the implementation of orbital optimisation for the models in the perfect pairing hierarchy. Orbital optimisation, which is generally necessary to obtain reliable results, is pursued at perfect pairing (PP) and perfect quadruples (PQ) levels of theory for applications on linear polyacenes, which are believed to exhibit strong correlation in the π space. While local minima and σ-π symmetry breaking solutions were found for PP orbitals, no such problems were encountered for PQ orbitals. The PQ orbitals are used for single-point calculations at PP, PQ and perfect hextuples (PH) levels of theory, both only in the π subspace, as well as in the full σπ valence space. It is numerically demonstrated that the inclusion of single excitations is necessary also when optimised orbitals are used. PH is found to yield good agreement with previously published density matrix renormalisation group data in the π space, capturing over 95% of the correlation energy. Full-valence calculations made possible by our novel, efficient code reveal that strong correlations are weaker when larger basis sets or active spaces are employed than in previous calculations. The largest full-valence PH calculations presented correspond to a (192e,192o) problem.

  17. Modulation aware cluster size optimisation in wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Sriram Naik, M.; Kumar, Vinay

    2017-07-01

    Wireless sensor networks (WSNs) play a great role because of their numerous advantages to mankind. The main challenge with WSNs is energy efficiency. In this paper, we focus on energy minimisation through cluster size optimisation, taking into account the effect of modulation when the nodes are unable to communicate using a baseband technique. Cluster size optimisation is an important technique for improving the performance of WSNs: it provides improvements in energy efficiency, network scalability, network lifetime and latency. We propose an analytical expression for cluster size optimisation using the traditional sensing model of nodes for a square sensing field, with the effects of modulation taken into account. Energy minimisation can be achieved by changing the modulation scheme (e.g. BPSK, QPSK, 16-QAM, 64-QAM), so we consider the effect of different modulation techniques on cluster formation. The nodes in the sensing field are randomly and uniformly deployed. It is also observed that placing the base station at the centre of the field allows only a small number of modulation schemes to operate in an energy-efficient manner, whereas placing it at a corner of the sensing field allows a larger number of modulation schemes to do so.
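
    The sketch below uses the widely cited first-order radio model (free-space amplifier for intra-cluster links, multipath for the cluster-head-to-base-station link) and finds the cluster count minimising per-round energy by direct numerical search. The parameter values are the standard textbook ones; the paper's modulation-dependent energy terms and its analytical expression are not reproduced.

    ```python
    # First-order radio model sketch (standard textbook constants, not the paper's
    # modulation-dependent model): numerically find the cluster count that minimises
    # the total energy spent per data-gathering round.
    import numpy as np

    N, M = 200, 100.0            # nodes, side length of the square field (m)
    L_BITS = 4000                # bits per data packet
    E_ELEC = 50e-9               # J/bit for transmit/receive electronics
    E_DA = 5e-9                  # J/bit for data aggregation at the cluster head
    EPS_FS = 10e-12              # J/bit/m^2 (free-space amplifier)
    EPS_MP = 0.0013e-12          # J/bit/m^4 (multipath amplifier)
    D_BS = 120.0                 # average cluster-head-to-base-station distance (m)

    def energy_per_round(k):
        d2_to_ch = M**2 / (2.0 * np.pi * k)             # expected squared member-to-CH distance
        e_member = L_BITS * (E_ELEC + EPS_FS * d2_to_ch)
        e_head = L_BITS * (E_ELEC * (N / k - 1) + E_DA * N / k
                           + E_ELEC + EPS_MP * D_BS**4)
        return k * e_head + (N - k) * e_member

    ks = np.arange(1, 41)
    energies = np.array([energy_per_round(k) for k in ks])
    print("optimal number of clusters:", ks[np.argmin(energies)],
          f"({energies.min() * 1e3:.3f} mJ per round)")
    ```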

  18. Optimisation of SIW bandpass filter with wide and sharp stopband using space mapping

    NASA Astrophysics Data System (ADS)

    Xu, Juan; Bi, Jun Jian; Li, Zhao Long; Chen, Ru shan

    2016-12-01

    This work presents a substrate integrated waveguide (SIW) bandpass filter with a wide and sharp stopband, which differs from filters with a direct input/output coupling structure. Higher modes in the SIW cavities are used to generate the finite transmission zeros for improved stopband performance. The design of SIW filters requires full-wave electromagnetic simulation and extensive optimisation; if a full-wave solver is used for optimisation, the design process is very time consuming. The space mapping (SM) approach has been called upon to alleviate this problem. In this case, the coarse model is optimised using an equivalent-circuit representation of the structure for fast computation, while the design is verified with an accurate fine-model full-wave simulation. A fourth-order filter with a passband of 12.0-12.5 GHz is fabricated on a single-layer Rogers RT/Duroid 5880 substrate. The return loss is better than 17.4 dB in the passband and the rejection is more than 40 dB in the stopband, which extends from 2 to 11 GHz and from 13.5 to 17.3 GHz, demonstrating wide stopband performance.

  19. A reliability-based maintenance technicians' workloads optimisation model with stochastic consideration

    NASA Astrophysics Data System (ADS)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2016-06-01

    The growing interest in research on technicians' workloads is probably associated with the recent surge in competition, prompted by unprecedented technological development that triggers changes in customer tastes and preferences for industrial goods. In a quest for business improvement, this intense worldwide competition has stimulated theories and practical frameworks that seek to optimise performance in workplaces. In line with this drive, the present paper proposes an optimisation model that considers technicians' reliability alongside factory information on technicians' productivity and earned values, within a multi-objective modelling approach. Since technicians are expected to carry out both routine and stochastic maintenance work, these workloads are treated as constraints. The influence of training, fatigue and the experiential knowledge of technicians on workload management was considered. These workloads were combined with the maintenance policy in optimising reliability, productivity and earned values using a goal programming approach. It was observed that the model was able to generate information that practising maintenance engineers can apply in making more informed decisions on the management of technicians.
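
    A tiny weighted goal-programming sketch with invented figures (not the paper's model or data) is given below: weekly hours are allocated to three technicians so that a completed-jobs goal and a total-hours goal are met as closely as possible, with the deviation variables penalised asymmetrically.

    ```python
    # Tiny weighted goal-programming sketch with synthetic figures (not the paper's
    # model): allocate weekly hours to three technicians to approach two goals.
    import numpy as np
    from scipy.optimize import linprog

    rate = np.array([1.2, 1.0, 0.8])      # jobs completed per hour, per technician
    jobs_goal, hours_goal = 120.0, 105.0

    # Decision vector: [x1, x2, x3, d1-, d1+, d2-, d2+]
    #   jobs:  rate @ x + d1- - d1+ = jobs_goal
    #   hours: sum(x)   + d2- - d2+ = hours_goal
    A_eq = np.array([[*rate, 1, -1, 0, 0],
                     [1, 1, 1, 0, 0, 1, -1]])
    b_eq = np.array([jobs_goal, hours_goal])

    # Penalise a jobs shortfall (d1-) heavily and overtime (d2+) lightly.
    c = np.array([0, 0, 0, 5, 0, 0, 1])
    bounds = [(0, 40)] * 3 + [(0, None)] * 4

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    x, dev = res.x[:3], res.x[3:]
    print("hours per technician:", x.round(1))
    print("jobs shortfall:", round(dev[0], 1), "overtime hours:", round(dev[3], 1))
    ```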

  20. Honeybee economics: optimisation of foraging in a variable world.

    PubMed

    Stabentheiner, Anton; Kovac, Helmut

    2016-06-20

    In honeybees, fast and efficient exploitation of nectar and pollen sources is achieved by persistent endothermy throughout the foraging cycle, which entails extremely high energy costs. The need for food promotes maximisation of the intake rate, and the high costs call for energetic optimisation. Experiments on how honeybees resolve this conflict have to consider that foraging takes place in a variable environment with respect to microclimate and food quality and availability. Here we report, from simultaneous measurements of energy costs, gains, intake rate and efficiency, how honeybee foragers manage this challenge in their highly variable environment. If possible, during unlimited sucrose flow, they follow an 'investment-guided' ('time is honey') economic strategy promising increased returns. They maximise net intake rate by investing both their own heat production and solar heat to increase body temperature to a level which guarantees a high suction velocity. They switch to an 'economising' ('save the honey') optimisation of energetic efficiency if the intake rate is restricted by the food source, when an increased body temperature would not guarantee a high intake rate. With this flexible and graded change between economic strategies, honeybees can both maximise colony intake rate and optimise foraging efficiency in response to environmental variation.

  1. Off-Axis Nulling Transfer Function Measurement: A First Assessment

    NASA Technical Reports Server (NTRS)

    Vedova, G. Dalla; Menut, J.-L.; Millour, F.; Petrov, R.; Cassaing, F.; Danchi, W. C.; Jacquinod, S.; Lhome, E.; Lopez, B.; Lozi, J.; hide

    2013-01-01

    We want to study a polychromatic inverse problem method with nulling interferometers to obtain information on the structures of exozodiacal light. For this reason, during the first semester of 2013, thanks to the support of the PERSEE consortium, we launched a campaign of laboratory measurements with the nulling interferometric test bench PERSEE, operating with 9 spectral channels between the J and K bands. Our objective is to characterise the transfer function, i.e. the map of the null as a function of wavelength for an off-axis source, the null being optimised on the central source or on the source photocenter. We were able to reach on-axis null depths better than 10^-4. This work is part of a broader project aimed at creating a simulator of a nulling interferometer in which the typical noises of a real instrument are introduced. We present here our first results.

  2. Acoustic Resonator Optimisation for Airborne Particle Manipulation

    NASA Astrophysics Data System (ADS)

    Devendran, Citsabehsan; Billson, Duncan R.; Hutchins, David A.; Alan, Tuncay; Neild, Adrian

    Advances in micro-electromechanical systems (MEMS) technology and biomedical research necessitate micro-machined manipulators to capture, handle and position delicate micron-sized particles. To this end, a parallel-plate acoustic resonator system has been investigated for the manipulation and entrapment of micron-sized particles in air. Numerical and finite element modelling was performed to optimise the design of the layered acoustic resonator. To obtain an optimised resonator design, careful consideration of the effects of layer thickness and material properties is required. Furthermore, the frequency-dependent effect of acoustic attenuation is also considered within this study, leading to an optimum operational frequency range. Finally, experimental results demonstrated good levitation and capture of particles of various properties and sizes, down to 14.8 μm.

  3. An improved design method based on polyphase components for digital FIR filters

    NASA Astrophysics Data System (ADS)

    Kumar, A.; Kuldeep, B.; Singh, G. K.; Lee, Heung No

    2017-11-01

    This paper presents an efficient design method for digital finite impulse response (FIR) filters based on polyphase components and swarm optimisation techniques (SOTs). For this purpose, the design problem is formulated as the mean squared error between the actual and ideal responses in the frequency domain, using the polyphase components of a prototype filter. To achieve a more precise frequency response at specified frequencies, fractional derivative constraints (FDCs) have been applied, and optimal FDCs are computed using SOTs such as the cuckoo search and modified cuckoo search algorithms. A comparative study with well-established swarm optimisation methods, namely particle swarm optimisation and the artificial bee colony algorithm, is made. The proposed method is evaluated using several important filter attributes, and the comparative study evidences its effectiveness for FIR filter design.
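
    A bare-bones particle swarm sketch is given below: the swarm minimises the frequency-domain mean squared error of a symmetric (linear-phase) lowpass FIR filter against an ideal brick-wall response. The polyphase decomposition, fractional derivative constraints and cuckoo search variants of the paper are not reproduced, and all swarm settings are conventional defaults.

    ```python
    # Bare-bones PSO sketch: minimise the frequency-domain MSE of a symmetric
    # (linear-phase) lowpass FIR filter against an ideal brick-wall response.
    import numpy as np

    rng = np.random.default_rng(6)
    n_taps, half = 21, 11                   # odd-length symmetric filter -> 11 free taps
    w = np.linspace(0, np.pi, 256)
    ideal = (w <= 0.4 * np.pi).astype(float)

    def response(half_taps):
        h = np.concatenate([half_taps[:0:-1], half_taps])   # mirror to the full impulse response
        n = np.arange(n_taps) - (n_taps - 1) / 2            # tap indices centred at zero
        return np.abs(np.cos(np.outer(w, n)) @ h)           # amplitude response on the grid

    def cost(half_taps):
        return np.mean((response(half_taps) - ideal) ** 2)

    # Particle swarm: each particle's position holds the 11 free taps of a candidate filter.
    n_particles = 30
    pos = rng.normal(0, 0.1, (n_particles, half))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_cost)].copy()

    for _ in range(300):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()

    print("final MSE:", round(cost(gbest), 5))
    ```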

  4. Optimisation of process parameters on thin shell part using response surface methodology (RSM) and genetic algorithm (GA)

    NASA Astrophysics Data System (ADS)

    Faiz, J. M.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    This study simulates the optimisation of injection moulding process parameters using Autodesk Moldflow Insight (AMI) software. Four process parameters (melt temperature, mould temperature, packing pressure and cooling time) were varied in order to analyse the warpage of the part, which is made of polypropylene (PP). The combinations of process parameters were analysed using Analysis of Variance (ANOVA), and the optimised values were obtained using Response Surface Methodology (RSM). RSM and a Genetic Algorithm (GA) were applied in the Design Expert software in order to minimise the warpage value. The outcome of this study shows that the warpage value was improved by using RSM and GA.

  5. A joint swarm intelligence algorithm for multi-user detection in MIMO-OFDM system

    NASA Astrophysics Data System (ADS)

    Hu, Fengye; Du, Dakun; Zhang, Peng; Wang, Zhijun

    2014-11-01

    In the multi-input multi-output orthogonal frequency division multiplexing (MIMO-OFDM) system, traditional multi-user detection (MUD) algorithms, usually used to suppress multiple-access interference, find it difficult to balance detection performance against algorithmic complexity. To solve this problem, this paper proposes a joint swarm intelligence algorithm, called Ant Colony and Particle Swarm Optimisation (AC-PSO), that integrates the particle swarm optimisation (PSO) and ant colony optimisation (ACO) algorithms. Simulation results show that, with low computational complexity, MUD for the MIMO-OFDM system based on the AC-PSO algorithm achieves detection performance comparable to that of the maximum likelihood algorithm. Thus, the proposed AC-PSO algorithm provides a satisfactory trade-off between computational complexity and detection performance.

  6. Model-based Utility Functions

    NASA Astrophysics Data System (ADS)

    Hibbard, Bill

    2012-05-01

    Orseau and Ring, as well as Dewey, have recently described problems, including self-delusion, with the behavior of agents using various definitions of utility functions. An agent's utility function is defined in terms of the agent's history of interactions with its environment. This paper argues, via two examples, that the behavior problems can be avoided by formulating the utility function in two steps: 1) inferring a model of the environment from interactions, and 2) computing utility as a function of the environment model. Basing a utility function on a model that the agent must learn implies that the utility function must initially be expressed in terms of specifications to be matched to structures in the learned model. These specifications constitute prior assumptions about the environment, so this approach will not work with arbitrary environments. But the approach should work for agents designed by humans to act in the physical world. The paper also addresses the issue of self-modifying agents and shows that, if provided with the possibility of modifying their utility functions, agents will not choose to do so under some usual assumptions.

  7. A novel, modernized Golgi-Cox stain optimized for CLARITY cleared tissue.

    PubMed

    Kassem, Mustafa S; Fok, Sandra Y Y; Smith, Kristie L; Kuligowski, Michael; Balleine, Bernard W

    2018-01-15

    High-resolution neuronal information is extraordinarily useful in understanding the brain's functionality. The development of the Golgi-Cox stain allowed observation of the neuron in its entirety with unrivalled detail. Tissue clearing techniques, e.g., CLARITY and CUBIC, provide the potential to observe entire neuronal circuits intact within tissue, without the previous restrictions on section thickness. Here we describe an improved Golgi-Cox stain method, optimised for use with CLARITY and CUBIC, that can be used in both fresh and fixed tissue. Using this method, we were able to observe neurons in their entirety within a fraction of the time traditionally taken to clear tissue (48 h). We were also able to show for the first time that Golgi-stained tissue is fluorescent when visualized using a multi-photon microscope, allowing us to image synaptic spines with a detail previously unachievable. These novel methods provide cheap and easy-to-use techniques to investigate the morphology of cellular processes in the brain at a new-found depth, speed, utility and detail, without the previous restrictions of time, tissue type and section thickness. This is the first application of a Golgi-Cox stain to cleared brain tissue; it is investigated and discussed in detail, with descriptions of the different methodologies that may be used, a comparison between the different clearing techniques and, lastly, the novel interaction of these techniques with this ultra-rapid stain. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Improving the pharmacokinetic properties of biologics by fusion to an anti-HSA shark VNAR domain

    PubMed Central

    Müller, Mischa R.; Saunders, Kenneth; Grace, Christopher; Jin, Macy; Piche-Nicholas, Nicole; Steven, John; O’Dwyer, Ronan; Wu, Leeying; Khetemenee, Lam; Vugmeyster, Yulia; Hickling, Timothy P.; Tchistiakova, Lioudmila; Olland, Stephane; Gill, Davinder; Jensen, Allan; Barelle, Caroline J.

    2012-01-01

    Advances in recombinant antibody technology and protein engineering have provided the opportunity to reduce antibodies to their smallest binding domain components and have concomitantly driven the requirement for devising strategies to increase serum half-life to optimise drug exposure, thereby increasing therapeutic efficacy. In this study, we adopted an immunization route to raise picomolar affinity shark immunoglobulin new antigen receptors (IgNARs) to target human serum albumin (HSA). From our model shark species, Squalus acanthias, a phage display library encompassing the variable binding domain of IgNAR (VNAR) was constructed, screened against target, and positive clones were characterized for affinity and specificity. N-terminal and C-terminal molecular fusions of our lead hit in complex with a naïve VNAR domain were expressed, purified and exhibited the retention of high affinity binding to HSA, but also cross-selectivity to mouse, rat and monkey serum albumin both in vitro and in vivo. Furthermore, the naïve VNAR had enhanced pharmacokinetic (PK) characteristics in both N- and C-terminal orientations and when tested as a three domain construct with naïve VNAR flanking the HSA binding domain at both the N and C termini. Molecules derived from this platform technology also demonstrated the potential for clinical utility by being available via the subcutaneous route of delivery. This study thus demonstrates the first in vivo functional efficacy of a VNAR binding domain with the ability to enhance PK properties and support delivery of multifunctional therapies. PMID:23676205

  9. Improving the pharmacokinetic properties of biologics by fusion to an anti-HSA shark VNAR domain.

    PubMed

    Müller, Mischa R; Saunders, Kenneth; Grace, Christopher; Jin, Macy; Piche-Nicholas, Nicole; Steven, John; O'Dwyer, Ronan; Wu, Leeying; Khetemenee, Lam; Vugmeyster, Yulia; Hickling, Timothy P; Tchistiakova, Lioudmila; Olland, Stephane; Gill, Davinder; Jensen, Allan; Barelle, Caroline J

    2012-01-01

    Advances in recombinant antibody technology and protein engineering have provided the opportunity to reduce antibodies to their smallest binding domain components and have concomitantly driven the requirement for devising strategies to increase serum half-life to optimise drug exposure, thereby increasing therapeutic efficacy. In this study, we adopted an immunization route to raise picomolar affinity shark immunoglobulin new antigen receptors (IgNARs) to target human serum albumin (HSA). From our model shark species, Squalus acanthias, a phage display library encompassing the variable binding domain of IgNAR (VNAR) was constructed, screened against target, and positive clones were characterized for affinity and specificity. N-terminal and C-terminal molecular fusions of our lead hit in complex with a naïve VNAR domain were expressed, purified and exhibited the retention of high affinity binding to HSA, but also cross-selectivity to mouse, rat and monkey serum albumin both in vitro and in vivo. Furthermore, the naïve VNAR had enhanced pharmacokinetic (PK) characteristics in both N- and C-terminal orientations and when tested as a three domain construct with naïve VNAR flanking the HSA binding domain at both the N and C termini. Molecules derived from this platform technology also demonstrated the potential for clinical utility by being available via the subcutaneous route of delivery. This study thus demonstrates the first in vivo functional efficacy of a VNAR binding domain with the ability to enhance PK properties and support delivery of multifunctional therapies.

  10. Global Topology Optimisation

    DTIC Science & Technology

    2016-10-31

    Excerpts: "... statistical physics. Sec. IV includes several examples of the application of the stochastic method, including matching of a shape to a fixed design ..."; "... an important part of any future application of this method. Second, re-initialization of the level set can lead to small but significant movements of ..."; "... of engineering design problems [6, 17]. However, many of the relevant applications involve non-convex optimisation problems with multiple locally ..."

  11. Sentient Structures: Optimising Sensor Layouts for Direct Measurement of Discrete Variables

    DTIC Science & Technology

    2008-11-01

    Report to the US Air Force (contract FA48690714045); author: Donald Price. Excerpt: "The identification of optimal sensor placements is an important requirement for the development of sentient structures. An optimal sensor layout is attained when a limited ..."

  12. An Optimisation Procedure for the Conceptual Analysis of Different Aerodynamic Configurations

    DTIC Science & Technology

    2000-06-01

    G. Lombardi, G. Mengali (Department of Aerospace Engineering, University of Pisa, Via Diotisalvi 2, 56126 Pisa, Italy); F. Beux (Scuola Normale Superiore). Excerpts: "... to obtain configurations with improved performances with respect to a ..."; "... engines, gears and various systems; their weights and centre of gravity positions ..."; "... design parameters have been arranged for cruise: payload, velocity, range, cruise height, engine ..."; "The optimisation process includes the following steps: ..."

  13. A simulation and optimisation procedure to model daily suppression resource transfers during a fire season in Colorado

    Treesearch

    Yu Wei; Erin J. Belval; Matthew P. Thompson; Dave E. Calkin; Crystal S. Stonesifer

    2016-01-01

    Sharing fire engines and crews between fire suppression dispatch zones may help improve the utilisation of fire suppression resources. Using the Resource Ordering and Status System, the Predictive Services’ Fire Potential Outlooks and the Rocky Mountain Region Preparedness Levels from 2010 to 2013, we tested a simulation and optimisation procedure to transfer crews and...

  14. Evaluation and optimisation of phenomenological multi-step soot model for spray combustion under diesel engine-like operating conditions

    NASA Astrophysics Data System (ADS)

    Pang, Kar Mun; Jangi, Mehdi; Bai, Xue-Song; Schramm, Jesper

    2015-05-01

    In this work, a two-dimensional computational fluid dynamics study is reported of an n-heptane combustion event and the associated soot formation process in a constant volume combustion chamber. The key interest here is to evaluate the sensitivity of the chemical kinetics and the submodels of a semi-empirical soot model in predicting the associated events. Numerical computation is performed using an open-source code, and a chemistry coordinate mapping approach is used to expedite the calculation. A library consisting of various phenomenological multi-step soot models is constructed and integrated with the spray combustion solver. Prior to the soot modelling, combustion simulations are carried out. Numerical results show that the ignition delay times and lift-off lengths exhibit good agreement with the experimental measurements across a wide range of operating conditions, apart from the cases with ambient temperature lower than 850 K. The variation of soot precursor production with respect to the change of ambient oxygen levels qualitatively agrees with that of the conceptual models when the skeletal n-heptane mechanism is integrated with a reduced pyrene chemistry. Subsequently, a comprehensive sensitivity analysis is carried out to appraise the existing soot formation and oxidation submodels. It is revealed that the soot formation is captured when the surface growth rate is calculated using a square root function of the soot specific surface area and when a pressure-dependent model constant is considered. An optimised soot model is then proposed based on the knowledge gained through this exercise. With the optimised model implemented, the simulated soot onset and transport phenomena before reaching quasi-steady state agree reasonably well with the experimental observations. Also, the variation of the spatial soot distribution and of the soot mass produced at oxygen molar fractions ranging from 10.0 to 21.0%, for both low- and high-density conditions, is reproduced.

  15. Epigenetically regulated imprinted genes and foetal programming.

    PubMed

    Keverne, Eric B

    2010-11-01

    Genomic imprinting is a widespread epigenetic phenomenon in mammals, and many imprinted genes are expressed in the developing hypothalamus and placenta. The placenta and brain are very different structures with very different roles, but in the pregnant mother they functionally interact, coordinating and ensuring the provision of nutrients, the timing of parturition and the priming of the hypothalamus for maternal care and nurturing. This interaction has been evolutionarily fine-tuned to optimise infant survival, such that when resources are poor the mother 'informs' the foetus of this condition, producing a thrifty phenotype that is adapted to survive scarce resources after birth.

  16. How can psychiatrists offer psychotherapeutic leadership in the public sector?

    PubMed

    Cammell, Paul; Amos, Jackie; Baigent, Michael

    2016-06-01

    This article reviews the forms that psychotherapeutic leadership can take for psychiatrists attempting to optimise outcomes for individuals receiving treatment in the public mental health sector. It explores a range of roles and functions that psychiatrists can take on as psychotherapy leaders, and how these can be applied in clinical, administrative and research contexts. Psychiatrists need to play an increasing role in clinical, administrative and academic settings to advance service provision, resource allocation, training and research directed at psychotherapies in the public health sector. © The Royal Australian and New Zealand College of Psychiatrists 2016.

  17. Modelling the endothelial blood-CNS barriers: a method for the production of robust in vitro models of the rat blood-brain barrier and blood-spinal cord barrier

    PubMed Central

    2013-01-01

    Background Modelling the blood-CNS barriers of the brain and spinal cord in vitro continues to provide a considerable challenge for research studying the passage of large and small molecules in and out of the central nervous system, both within the context of basic biology and for pharmaceutical drug discovery. Although there has been considerable success over the previous two decades in establishing useful in vitro primary endothelial cell cultures from the blood-CNS barriers, no model fully mimics the high electrical resistance, low paracellular permeability and selective influx/efflux characteristics of the in vivo situation. Furthermore, such primary-derived cultures are typically labour-intensive and generate low yields of cells, limiting scope for experimental work. We thus aimed to establish protocols for the high yield isolation and culture of endothelial cells from both rat brain and spinal cord. Our aim was to optimise in vitro conditions for inducing phenotypic characteristics in these cells that were reminiscent of the in vivo situation, such that they developed into tight endothelial barriers suitable for performing investigative biology and permeability studies. Methods Brain and spinal cord tissue was taken from the same rats and used to specifically isolate endothelial cells to reconstitute as in vitro blood-CNS barrier models. Isolated endothelial cells were cultured to expand the cellular yield and then passaged onto cell culture inserts for further investigation. Cell culture conditions were optimised using commercially available reagents and the resulting barrier-forming endothelial monolayers were characterised by functional permeability experiments and in vitro phenotyping by immunocytochemistry and western blotting. Results Using a combination of modified handling techniques and cell culture conditions, we have established and optimised a protocol for the in vitro culture of brain and, for the first time in rat, spinal cord endothelial cells. High yields of both CNS endothelial cell types can be obtained, and these can be passaged onto large numbers of cell culture inserts for in vitro permeability studies. The passaged brain and spinal cord endothelial cells are pure and express endothelial markers, tight junction proteins and intracellular transport machinery. Further, both models exhibit tight, functional barrier characteristics that are discriminating against large and small molecules in permeability assays and show functional expression of the pharmaceutically important P-gp efflux transporter. Conclusions Our techniques allow the provision of high yields of robust sister cultures of endothelial cells that accurately model the blood-CNS barriers in vitro. These models are ideally suited for use in studying the biology of the blood-brain barrier and blood-spinal cord barrier in vitro and for pre-clinical drug discovery. PMID:23773766

  18. Modelling the endothelial blood-CNS barriers: a method for the production of robust in vitro models of the rat blood-brain barrier and blood-spinal cord barrier.

    PubMed

    Watson, P Marc D; Paterson, Judy C; Thom, George; Ginman, Ulrika; Lundquist, Stefan; Webster, Carl I

    2013-06-18

    Modelling the blood-CNS barriers of the brain and spinal cord in vitro continues to provide a considerable challenge for research studying the passage of large and small molecules in and out of the central nervous system, both within the context of basic biology and for pharmaceutical drug discovery. Although there has been considerable success over the previous two decades in establishing useful in vitro primary endothelial cell cultures from the blood-CNS barriers, no model fully mimics the high electrical resistance, low paracellular permeability and selective influx/efflux characteristics of the in vivo situation. Furthermore, such primary-derived cultures are typically labour-intensive and generate low yields of cells, limiting scope for experimental work. We thus aimed to establish protocols for the high yield isolation and culture of endothelial cells from both rat brain and spinal cord. Our aim was to optimise in vitro conditions for inducing phenotypic characteristics in these cells that were reminiscent of the in vivo situation, such that they developed into tight endothelial barriers suitable for performing investigative biology and permeability studies. Brain and spinal cord tissue was taken from the same rats and used to specifically isolate endothelial cells to reconstitute as in vitro blood-CNS barrier models. Isolated endothelial cells were cultured to expand the cellular yield and then passaged onto cell culture inserts for further investigation. Cell culture conditions were optimised using commercially available reagents and the resulting barrier-forming endothelial monolayers were characterised by functional permeability experiments and in vitro phenotyping by immunocytochemistry and western blotting. Using a combination of modified handling techniques and cell culture conditions, we have established and optimised a protocol for the in vitro culture of brain and, for the first time in rat, spinal cord endothelial cells. High yields of both CNS endothelial cell types can be obtained, and these can be passaged onto large numbers of cell culture inserts for in vitro permeability studies. The passaged brain and spinal cord endothelial cells are pure and express endothelial markers, tight junction proteins and intracellular transport machinery. Further, both models exhibit tight, functional barrier characteristics that are discriminating against large and small molecules in permeability assays and show functional expression of the pharmaceutically important P-gp efflux transporter. Our techniques allow the provision of high yields of robust sister cultures of endothelial cells that accurately model the blood-CNS barriers in vitro. These models are ideally suited for use in studying the biology of the blood-brain barrier and blood-spinal cord barrier in vitro and for pre-clinical drug discovery.

  19. The Integration of Environmental Constraints into Tidal Array Optimisation

    NASA Astrophysics Data System (ADS)

    du Feu, Roan; de Trafford, Sebastian; Culley, Dave; Hill, Jon; Funke, Simon W.; Kramer, Stephan C.; Piggott, Matthew D.

    2015-04-01

    It has been estimated by The Carbon Trust that the marine renewable energy sector, of which tidal stream turbines are projected to form a large part, could produce 20% of the UK's present electricity requirements. This has led to the important question of how this technology can be deployed in an economically and environmentally friendly manner. Work is currently under way to understand how the tidal turbines that constitute an array can be arranged to maximise the total power generated by that array. The work presented here continues this through the inclusion of environmental constraints. The benefits of the renewable energy sector to our environment at large are not in question. However, the question remains as to the effects this burgeoning sector will have on local environments, and how to mitigate these effects if they are detrimental. For example, the presence of tidal arrays can, through altering current velocity, drastically change the sediment transport into and out of an area along with re-suspending existing sediment. This can have the effects of scouring or submerging habitat, mobilising contaminants within the existing sediment, reducing food supply and altering the turbidity of the water, all of which greatly impact any fauna in the affected region. This work pays particular attention to the destruction of habitat of benthic fauna, as this is quantifiable as a direct result of change in the current speed, a primary factor in determining sediment accumulation on the sea floor. OpenTidalFarm is an open source tool that maximises the power generated by an array through repositioning the turbines within it. It currently uses a 2D shallow water model with turbines represented as bump functions of increased friction. The functional of interest, the power extracted by the array, is evaluated from the flow field, which is calculated at each iteration using a finite element method. A gradient-based local optimisation is then performed by solving the associated adjoint equations, and the turbines are repositioned accordingly. The use of local optimisation drastically reduces the number of iterations, thereby allowing each iteration to be more expensive. This means that this technique can be readily applied to large arrays and also that there is enough leeway in computational cost that additional constraints or functionals can be introduced without the model becoming impractical to apply. The work presented here utilises OpenTidalFarm and incorporates into it ecological and sedimentological constraints that limit the extent to which the array can alter the current speed in specified locations. The addition of these constraints will likely affect the total power generated by the array, and this work details our first steps in investigating the trade-off between the maximisation of power generation and the limitation of the array's impact upon its environment.
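
    The abstract above describes a gradient-based layout optimisation driven by adjoint gradients of a shallow-water model. As a purely illustrative sketch of that optimise-the-positions loop (not OpenTidalFarm itself), the toy below replaces the PDE solve with a hypothetical analytic power model in which nearby turbines shadow each other; the site size, turbine count and Gaussian wake term are all assumptions made for illustration.

    ```python
    """Toy sketch of gradient-based tidal-array layout optimisation (not OpenTidalFarm)."""
    import numpy as np
    from scipy.optimize import minimize

    N_TURBINES = 8
    BOUNDS = [(0.0, 1000.0)] * (2 * N_TURBINES)   # assumed 1 km x 1 km site

    def neg_power(xy_flat):
        """Negative array power: base power per turbine minus Gaussian wake losses."""
        xy = xy_flat.reshape(-1, 2)
        base = 1.0 * len(xy)                       # arbitrary units per turbine
        d2 = np.sum((xy[:, None, :] - xy[None, :, :]) ** 2, axis=-1)
        np.fill_diagonal(d2, np.inf)               # no self-interaction
        losses = 0.5 * np.exp(-d2 / (2 * 100.0 ** 2)).sum() / 2.0  # pairwise shadowing
        return -(base - losses)

    rng = np.random.default_rng(0)
    x0 = rng.uniform(0.0, 1000.0, size=2 * N_TURBINES)

    # Gradient-based local optimisation; here the gradient is a finite-difference one,
    # whereas the real tool obtains exact gradients from the adjoint shallow-water equations.
    result = minimize(neg_power, x0, method="L-BFGS-B", bounds=BOUNDS)
    print("optimised array power (toy units):", -result.fun)
    ```

    In the real tool the adjoint gradients are what keep each (expensive) iteration count low enough for large arrays, which is the point the abstract makes about adding further constraints without the model becoming impractical.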

  20. A density functional global optimisation study of neutral 8-atom Cu-Ag and Cu-Au clusters

    NASA Astrophysics Data System (ADS)

    Heard, Christopher J.; Johnston, Roy L.

    2013-02-01

    The effect of doping on the energetics and dimensionality of eight-atom coinage metal subnanometre particles is fully resolved using a genetic algorithm in tandem with on-the-fly density functional theory calculations to determine the global minima (GM) for Cu_nAg_(8-n) and Cu_nAu_(8-n) clusters. Comparisons are made to previous ab initio work on mono- and bimetallic clusters, with excellent agreement found. Charge transfer and geometric arguments are considered to rationalise the stability of the particular permutational isomers found. An interesting transition between three-dimensional and two-dimensional GM structures is observed for copper-gold clusters, which is sharper and appears earlier in the doping series than is known for gold-silver particles.
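
    As a rough illustration of the genetic-algorithm global-minimum search described above, a minimal loop might look like the sketch below; the on-the-fly DFT energies are replaced by a simple Lennard-Jones surrogate, the cut-and-splice crossover is reduced to mutation plus selection, and every parameter is an assumption.

    ```python
    """Minimal GM-search sketch for an 8-atom cluster with a Lennard-Jones surrogate energy."""
    import numpy as np

    N_ATOMS, POP, GENERATIONS = 8, 20, 200
    rng = np.random.default_rng(1)

    def lj_energy(pos):
        """Pairwise Lennard-Jones energy of a configuration (stand-in for DFT)."""
        d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
        r = d[np.triu_indices(N_ATOMS, k=1)]
        return np.sum(4.0 * (r ** -12 - r ** -6))

    population = [rng.uniform(-1.5, 1.5, (N_ATOMS, 3)) for _ in range(POP)]
    for _ in range(GENERATIONS):
        energies = np.array([lj_energy(p) for p in population])
        parents = [population[i] for i in np.argsort(energies)[: POP // 2]]   # keep fittest half
        children = [p + rng.normal(0, 0.1, p.shape) for p in parents]          # mutate
        population = parents + children

    best = min(population, key=lj_energy)
    print("lowest surrogate energy found:", lj_energy(best))
    ```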

  1. Scoring functions for protein-protein interactions.

    PubMed

    Moal, Iain H; Moretti, Rocco; Baker, David; Fernández-Recio, Juan

    2013-12-01

    The computational evaluation of protein-protein interactions will play an important role in organising the wealth of data being generated by high-throughput initiatives. Here we discuss future applications, report recent developments and identify areas requiring further investigation. Many functions have been developed to quantify the structural and energetic properties of interacting proteins, finding use in interrelated challenges revolving around the relationship between sequence, structure and binding free energy. These include loop modelling, side-chain refinement, docking, multimer assembly, affinity prediction, affinity change upon mutation, hotspot location and interface design. Information derived from models optimised for one of these challenges can be used to benefit the others, and can be unified within the theoretical frameworks of multi-task learning and Pareto-optimal multi-objective learning. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Bacterial Unculturability and the Formation of Intercellular Metabolic Networks.

    PubMed

    Pande, Samay; Kost, Christian

    2017-05-01

    The majority of known bacterial species cannot be cultivated under laboratory conditions. Here we argue that the adaptive emergence of obligate metabolic interactions in natural bacterial communities can explain this pattern. Bacteria commonly release metabolites into the external environment. Accumulating pools of extracellular metabolites create an ecological niche that benefits auxotrophic mutants, which have lost the ability to autonomously produce the corresponding metabolites. In addition to a diffusion-based metabolite transfer, auxotrophic cells can use contact-dependent means to obtain nutrients from other co-occurring cells. Spatial colocalisation and a continuous coevolution further increase the nutritional dependency and optimise fluxes through combined metabolic networks. Thus, bacteria likely function as networks of interacting cells that reciprocally exchange nutrients and biochemical functions rather than as physiologically autonomous units. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Optimal and robust control of a class of nonlinear systems using dynamically re-optimised single network adaptive critic design

    NASA Astrophysics Data System (ADS)

    Tiwari, Shivendra N.; Padhi, Radhakant

    2018-01-01

    Following the philosophy of adaptive optimal control, a neural network-based state feedback optimal control synthesis approach is presented in this paper. First, accounting for a nominal system model, a single network adaptive critic (SNAC) based multi-layered neural network (called NN1) is synthesised offline. Next, another linear-in-weight neural network (called NN2) is trained online and augmented to NN1 in such a manner that their combined output represents the desired optimal costate for the actual plant. To do this, the nominal model needs to be updated online to adapt to the actual plant, which is done by synthesising yet another linear-in-weight neural network (called NN3) online. Training of NN3 is done by utilising the error information between the nominal and actual states and carrying out the necessary Lyapunov stability analysis using a Sobolev norm based Lyapunov function. This helps in training NN2 successfully to capture the required optimal relationship. The overall architecture is named the 'dynamically re-optimised single network adaptive critic (DR-SNAC)'. Numerical results for two motivating illustrative problems are presented, including comparison studies with the closed-form solution for one problem, which clearly demonstrate the effectiveness and benefit of the proposed approach.

  4. To what extent can ecosystem services motivate protecting biodiversity?

    PubMed

    Dee, Laura E; De Lara, Michel; Costello, Christopher; Gaines, Steven D

    2017-08-01

    Society increasingly focuses on managing nature for the services it provides people rather than for the existence of particular species. How much biodiversity protection would result from this modified focus? Although biodiversity contributes to ecosystem services, the details of which species are critical, and whether they will go functionally extinct in the future, are fraught with uncertainty. Explicitly considering this uncertainty, we develop an analytical framework to determine how much biodiversity protection would arise solely from optimising net value from an ecosystem service. Using stochastic dynamic programming, we find that protecting a threshold number of species is optimal, and uncertainty surrounding how biodiversity produces services makes it optimal to protect more species than are presumed critical. We define conditions under which the economically optimal protection strategy is to protect all species, no species, and cases in between. We show how the optimal number of species to protect depends upon different relationships between species and services, including considering multiple services. Our analysis provides simple criteria to evaluate when managing for particular ecosystem services could warrant protecting all species, given uncertainty. Evaluating this criterion with empirical estimates from different ecosystems suggests that optimising some services will be more likely to protect most species than others. © 2017 John Wiley & Sons Ltd/CNRS.
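
    The stochastic-dynamic-programming result above can be illustrated with a much cruder calculation: if a service of known value requires K critical species whose identity is uncertain, the expected net value of protecting n species can be enumerated directly. The numbers and the hypergeometric model below are illustrative assumptions, not the paper's model.

    ```python
    """Toy illustration: expected net value of protecting n of S species when the service
    needs K critical species whose identity is uncertain (uniformly distributed)."""
    from math import comb

    S, K = 10, 4              # assumed: 10 species, 4 unknown critical ones
    VALUE, COST = 100.0, 3.0  # assumed service value and per-species protection cost

    def expected_net_value(n):
        # Probability that all K critical species are among the n protected species.
        p = comb(n, K) / comb(S, K) if n >= K else 0.0
        return p * VALUE - COST * n

    values = [expected_net_value(n) for n in range(S + 1)]
    best_n = max(range(S + 1), key=lambda n: values[n])
    print("optimal number of species to protect:", best_n)
    ```

    With these (assumed) numbers the optimum is to protect all species even though only four are presumed critical, echoing the paper's point that uncertainty pushes the optimal threshold above the critical number.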

  5. Alkalizing Reactions Streamline Cellular Metabolism in Acidogenic Microorganisms

    PubMed Central

    Arioli, Stefania; Ragg, Enzio; Scaglioni, Leonardo; Fessas, Dimitrios; Signorelli, Marco; Karp, Matti; Daffonchio, Daniele; De Noni, Ivano; Mulas, Laura; Oggioni, Marco; Guglielmetti, Simone; Mora, Diego

    2010-01-01

    An understanding of the integrated relationships among the principal cellular functions that govern the bioenergetic reactions of an organism is necessary to determine how cells remain viable and optimise their fitness in the environment. Urease is a complex enzyme that catalyzes the hydrolysis of urea to ammonia and carbonic acid. While the induction of urease activity by several microorganisms has been predominantly considered a stress-response that is initiated to generate a nitrogen source in response to a low environmental pH, here we demonstrate a new role of urease in the optimisation of cellular bioenergetics. We show that urea hydrolysis increases the catabolic efficiency of Streptococcus thermophilus, a lactic acid bacterium that is widely used in the industrial manufacture of dairy products. By modulating the intracellular pH and thereby increasing the activity of β-galactosidase, glycolytic enzymes and lactate dehydrogenase, urease increases the overall change in enthalpy generated by the bioenergetic reactions. A cooperative altruistic behaviour of urease-positive microorganisms on the urease-negative microorganisms within the same environment was also observed. The physiological role of a single enzymatic activity demonstrates a novel and unexpected view of the non-transcriptional regulatory mechanisms that govern the bioenergetics of a bacterial cell, highlighting a new role for cytosol-alkalizing biochemical pathways in acidogenic microorganisms. PMID:21152088

  6. Synthesis and characterisation of PEG modified chitosan nanocapsules loaded with thymoquinone.

    PubMed

    Vignesh Kumar, Suresh Kumar; Renuka Devi, Ponnuswamy; Harish, Saru; Hemananthan, Eswaran

    2017-02-01

    Thymoquinone (TQ), a major bioactive compound of Nigella sativa seeds, has several therapeutic properties. The main drawback in bringing TQ to therapeutic application is that it has poor stability and bioavailability. Hence a suitable carrier is essential for TQ delivery. Recent studies indicate biodegradable polymers are potentially good carriers of bioactive compounds. In this study, polyethylene glycol (PEG) modified chitosan (Cs) nanocapsules were developed as a carrier for TQ. Aqueous-soluble low-molecular-weight Cs and PEG were selected from among different biodegradable polymers based on their biocompatibility and efficacy as carriers. Synthesis of the nanocapsules was optimised based on particle size, PDI, encapsulation efficiency and process yield. A positive zeta potential of +48 mV, indicating good stability, was observed. Scanning electron microscopy and atomic-force microscopy analysis revealed spherical, smooth-surfaced nanocapsules with sizes between 100 and 300 nm. The molecular dispersion of the TQ in Cs PEG nanocapsules was studied using X-ray powder diffraction. The Fourier transform infrared spectrum of the optimised nanocapsules exhibited functional groups of both polymer and drug, confirming the presence of Cs, PEG and TQ. In vitro drug release studies showed that PEG modified Cs nanocapsules loaded with TQ had a slow and sustained release.

  7. Silicone rod extraction followed by liquid desorption-large volume injection-programmable temperature vaporiser-gas chromatography-mass spectrometry for trace analysis of priority organic pollutants in environmental water samples.

    PubMed

    Delgado, Alejandra; Posada-Ureta, Oscar; Olivares, Maitane; Vallejo, Asier; Etxebarria, Nestor

    2013-12-15

    In this study, priority organic pollutants usually found in environmental water samples were considered in two extraction and analysis approaches. These included organochlorine compounds, pesticides, phthalates, phenols and residues of pharmaceutical and personal care products. The extraction and analysis steps were based on silicone rod extraction (SR) followed by liquid desorption in combination with large volume injection-programmable temperature vaporiser (LVI-PTV) and gas chromatography-mass spectrometry (GC-MS). Variables affecting the analytical response as a function of the programmable temperature vaporiser (PTV) parameters were first optimised following an experimental design approach. The SR extraction and desorption conditions were assessed afterwards, including matrix modification, extraction time and stripping solvent composition. Subsequently, the possibility of performing membrane enclosed sorptive coating extraction (MESCO) as a modified extraction approach was also evaluated. The optimised method showed low method detection limits (3-35 ng L(-1)), acceptable accuracy (78-114%) and precision values (<13%) for most of the studied analytes regardless of the aqueous matrix. Finally, the developed approach was successfully applied to the determination of target analytes in aqueous environmental matrices including estuarine and wastewater samples. © 2013 Elsevier B.V. All rights reserved.

  8. Development of Porous Piezoceramics for Medical and Sensor Applications.

    PubMed

    Ringgaard, Erling; Lautzenhiser, Frans; Bierregaard, Louise M; Zawada, Tomasz; Molz, Eric

    2015-12-21

    The use of porosity to modify the functional properties of piezoelectric ceramics is well known in the scientific literature as well as by the industry, and porous ceramic can be seen as a 2-phase composite. In the present work, examples are given of applications where controlled porosity is exploited in order to optimise the dielectric, piezoelectric and acoustic properties of the piezoceramics. For the optimisation efforts it is important to note that the thickness coupling coefficient k_t will be maximised for some non-zero value of the porosity that could be above 20%. On the other hand, to a good approximation, the acoustic velocity decreases linearly with increasing porosity, which is obviously also the case for the density. Consequently, the acoustic impedance shows a rather strong decrease with porosity, and in practice a reduction of more than 50% may be obtained for an engineered porous ceramic. The significance of the acoustic impedance is associated with the transmission of acoustic signals through the interface between the piezoceramic and some medium of propagation, but when the porous ceramic is used as a substrate for a piezoceramic thick film, the attenuation may be equally important. In the case of open porosity it is possible to introduce a liquid into the pores, and examples of modifying the properties in this way are given.
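
    The statement that the impedance falls faster than either the density or the velocity follows directly from Z = ρc. A back-of-the-envelope check, assuming illustrative dense-ceramic values and strictly linear trends for both quantities (both are assumptions, not measured data), reproduces the "more than 50%" reduction quoted above at roughly 30% porosity.

    ```python
    """If density and velocity both fall roughly linearly with porosity, Z = rho * c falls faster."""
    rho0, c0 = 7800.0, 4600.0      # illustrative dense-PZT values in kg/m^3 and m/s (assumed)
    for porosity in (0.0, 0.2, 0.3, 0.4):
        rho = rho0 * (1.0 - porosity)          # density scales with the solid fraction
        c = c0 * (1.0 - porosity)              # assumed linear velocity decrease
        Z = rho * c / 1e6                      # acoustic impedance in MRayl
        reduction = 1 - Z / (rho0 * c0 / 1e6)
        print(f"porosity {porosity:.0%}: Z ≈ {Z:.1f} MRayl ({reduction:.0%} reduction)")
    ```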

  9. Kinematic models of the upper limb joints for multibody kinematics optimisation: An overview.

    PubMed

    Duprey, Sonia; Naaim, Alexandre; Moissenet, Florent; Begon, Mickaël; Chèze, Laurence

    2017-09-06

    Soft tissue artefact (STA), i.e. the motion of the skin, fat and muscles gliding on the underlying bone, may lead to a marker position error reaching up to 8.7 cm for the particular case of the scapula. Multibody kinematics optimisation (MKO) is one of the most efficient approaches used to reduce STA. It consists of minimising the distance between the positions of experimental markers on a subject's skin and the simulated positions of the same markers embedded on a kinematic model. However, the efficiency of MKO directly relies on the chosen kinematic model. This paper proposes an overview of the different upper limb models available in the literature and a discussion about their applicability to MKO. The advantages of each joint model with respect to its biofidelity to functional anatomy are detailed both for the shoulder and the forearm areas. The models' capabilities for personalisation and adaptation to pathological cases are also discussed. Concerning model efficiency in terms of STA reduction in MKO algorithms, a lack of quantitative assessment in the literature is noted. As a priority, future studies should concern the evaluation and quantification of STA reduction depending on upper limb joint constraints. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Multidetector CT radiation dose optimisation in adults: short- and long-term effects of a clinical audit.

    PubMed

    Tack, Denis; Jahnen, Andreas; Kohler, Sarah; Harpes, Nico; De Maertelaer, Viviane; Back, Carlo; Gevenois, Pierre Alain

    2014-01-01

    To report short- and long-term effects of an audit process intended to optimise the radiation dose from multidetector row computed tomography (MDCT). A survey of radiation dose from all eight MDCT departments in the state of Luxembourg performed in 2007 served as baseline, and involved the most frequently imaged regions (head, sinus, cervical spine, thorax, abdomen, and lumbar spine). CT dose index volume (CTDIvol), dose-length product per acquisition (DLP/acq), and DLP per examination (DLP/exa) were recorded, and their mean, median, 25th and 75th percentiles compared. In 2008, an audit conducted in each department helped to optimise doses. In 2009 and 2010, two further surveys evaluated the audit's impact on the dose delivered. Between 2007 and 2009, DLP/exa significantly decreased by 32-69 % for all regions (P < 0.001) except the lumbar spine (5 %, P = 0.455). Between 2009 and 2010, DLP/exa significantly decreased by 13-18 % for sinus, cervical and lumbar spine (P ranging from 0.016 to less than 0.001). Between 2007 and 2010, DLP/exa significantly decreased for all regions (18-75 %, P < 0.001). Collective dose decreased by 30 % and the 75th percentile (diagnostic reference level, DRL) by 20-78 %. The audit process resulted in long-lasting dose reduction, with DRLs reduced by 20-78 %, mean DLP/examination by 18-75 %, and collective dose by 30 %. • External support through clinical audit may optimise default parameters of routine CT. • Reduction of 75th percentiles used as reference diagnostic levels is 18-75 %. • The effect of this audit is sustainable over time. • Dose savings through optimisation can be added to those achievable through CT.

  11. Medical imaging dose optimisation from ground up: expert opinion of an international summit.

    PubMed

    Samei, Ehsan; Järvinen, Hannu; Kortesniemi, Mika; Simantirakis, George; Goh, Charles; Wallace, Anthony; Vano, Eliseo; Bejan, Adrian; Rehani, Madan; Vassileva, Jenia

    2018-05-17

    As in any medical intervention, there is either a known or an anticipated benefit to the patient from undergoing a medical imaging procedure. This benefit is generally significant, as demonstrated by the manner in which medical imaging has transformed clinical medicine. At the same time, when it comes to imaging that deploys ionising radiation, there is a potential associated risk from radiation. Radiation risk has been recognised as a key liability in the practice of medical imaging, creating a motivation for radiation dose optimisation. The level of radiation dose and risk in imaging varies but is generally low. Thus, from the epidemiological perspective, this makes the estimation of the precise level of associated risk highly uncertain. However, in spite of the low magnitude and high uncertainty of this risk, its possibility cannot easily be refuted. Therefore, given the moral obligation of healthcare providers, 'first, do no harm,' there is an ethical obligation to mitigate this risk. Precisely how to achieve this goal scientifically and practically within a coherent system has been an open question. To address this need, in 2016, the International Atomic Energy Agency (IAEA) organised a summit to clarify the role of Diagnostic Reference Levels to optimise imaging dose, summarised into an initial report (Järvinen et al 2017 Journal of Medical Imaging 4 031214). Through a consensus building exercise, the summit further concluded that the imaging optimisation goal goes beyond dose alone, and should include image quality as a means to include both the benefit and the safety of the exam. The present, second report details the deliberation of the summit on imaging optimisation.

  12. Application of snakes and dynamic programming optimisation technique in modeling of buildings in informal settlement areas

    NASA Astrophysics Data System (ADS)

    Rüther, Heinz; Martine, Hagai M.; Mtalo, E. G.

    This paper presents a novel approach to semiautomatic building extraction in informal settlement areas from aerial photographs. The proposed approach uses a strategy of delineating buildings by optimising their approximate building contour position. Approximate building contours are derived automatically by locating elevation blobs in digital surface models. Building extraction is then effected by means of the snakes algorithm and the dynamic programming optimisation technique. With dynamic programming, the building contour optimisation problem is realized through a discrete multistage process and solved by the "time-delayed" algorithm, as developed in this work. The proposed building extraction approach is a semiautomatic process, with user-controlled operations linking fully automated subprocesses. Inputs into the proposed building extraction system are ortho-images and digital surface models, the latter being generated through image matching techniques. Buildings are modeled as "lumps" or elevation blobs in digital surface models, which are derived by altimetric thresholding of digital surface models. Initial windows for building extraction are provided by projecting the elevation blobs centre points onto an ortho-image. In the next step, approximate building contours are extracted from the ortho-image by region growing constrained by edges. Approximate building contours thus derived are inputs into the dynamic programming optimisation process in which final building contours are established. The proposed system is tested on two study areas: Marconi Beam in Cape Town, South Africa, and Manzese in Dar es Salaam, Tanzania. Sixty percent of buildings in the study areas have been extracted and verified and it is concluded that the proposed approach contributes meaningfully to the extraction of buildings in moderately complex and crowded informal settlement areas.
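
    The multistage contour optimisation described above can be sketched as a Viterbi-style dynamic programme over candidate offsets along each contour point's normal. The snippet below is a simplified open-chain version with a random stand-in for the image/edge cost; it does not reproduce the paper's "time-delayed" algorithm or its energy terms, and all array names and sizes are assumptions.

    ```python
    """Sketch of dynamic-programming refinement of an approximate building contour."""
    import numpy as np

    rng = np.random.default_rng(2)
    n_points, n_offsets = 40, 9            # contour vertices x candidate normal offsets (assumed)
    edge_cost = rng.random((n_points, n_offsets))   # hypothetical image term (e.g. -edge strength)
    smooth = 0.5 * np.abs(np.arange(n_offsets)[:, None] - np.arange(n_offsets)[None, :])

    # Multistage optimisation: best[j] is the lowest total cost of reaching offset j at stage i.
    best = edge_cost[0].copy()
    back = np.zeros((n_points, n_offsets), dtype=int)
    for i in range(1, n_points):
        trans = best[:, None] + smooth                 # previous offset -> current offset
        back[i] = np.argmin(trans, axis=0)
        best = trans[back[i], np.arange(n_offsets)] + edge_cost[i]

    # Backtrack the optimal offset for each contour point.
    path = np.empty(n_points, dtype=int)
    path[-1] = int(np.argmin(best))
    for i in range(n_points - 1, 0, -1):
        path[i - 1] = back[i, path[i]]
    print("optimised offsets along the contour:", path)
    ```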

  13. Predicting emergency coronary artery bypass graft following PCI: application of a computational model to refer patients to hospitals with and without onsite surgical backup

    PubMed Central

    Syed, Zeeshan; Moscucci, Mauro; Share, David; Gurm, Hitinder S

    2015-01-01

    Background: Clinical tools to stratify patients for emergency coronary artery bypass graft (ECABG) after percutaneous coronary intervention (PCI) create the opportunity to selectively assign patients undergoing procedures to hospitals with and without onsite surgical facilities for dealing with potential complications while balancing load across providers. The goal of our study was to investigate the feasibility of a computational model directly optimised for cohort-level performance to predict ECABG in PCI patients for this application. Methods: Blue Cross Blue Shield of Michigan Cardiovascular Consortium registry data with 69 pre-procedural and angiographic risk variables from 68 022 PCI procedures in 2004–2007 were used to develop a support vector machine (SVM) model for ECABG. The SVM model was optimised for the area under the receiver operating characteristic curve (AUROC) at the level of the training cohort and validated on 42 310 PCI procedures performed in 2008–2009. Results: There were 87 cases of ECABG (0.21%) in the validation cohort. The SVM model achieved an AUROC of 0.81 (95% CI 0.76 to 0.86). Patients in the predicted top decile were at a significantly increased risk relative to the remaining patients (OR 9.74, 95% CI 6.39 to 14.85, p<0.001) for ECABG. The SVM model optimised for the AUROC on the training cohort significantly improved discrimination, net reclassification and calibration over logistic regression and traditional SVM classification optimised for univariate performance. Conclusions: Computational risk stratification directly optimising cohort-level performance holds the potential of high levels of discrimination for ECABG following PCI. This approach has value in selectively referring PCI patients to hospitals with and without onsite surgery. PMID:26688738
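
    For readers unfamiliar with this style of evaluation, the sketch below reproduces it on synthetic data: fit an SVM risk score, report the AUROC on a held-out set, and compare the predicted top decile against the rest. It uses a standard SVC rather than the cohort-level AUROC-optimised SVM of the study, and the data are simulated, not registry data.

    ```python
    """Hedged sketch of AUROC and top-decile odds-ratio evaluation of an SVM risk score."""
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    X = rng.normal(size=(5000, 10))
    # Rare synthetic outcome driven by the first two covariates (assumed model).
    y = (rng.random(5000) < 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1] - 4)))).astype(int)

    model = SVC(kernel="rbf", C=1.0).fit(X[:3000], y[:3000])
    score = model.decision_function(X[3000:])           # continuous risk score
    y_val = y[3000:]
    print("AUROC:", roc_auc_score(y_val, score))

    top = score >= np.quantile(score, 0.9)              # predicted top decile
    e_top, n_top = y_val[top].sum(), (y_val[top] == 0).sum()
    e_rest, n_rest = y_val[~top].sum(), (y_val[~top] == 0).sum()
    print("top-decile odds ratio:", (e_top / n_top) / (e_rest / n_rest))
    ```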

  14. Optimisation of asymmetric flow field-flow fractionation for the characterisation of nanoparticles in coated polydisperse TiO2 with applications in food and feed.

    PubMed

    Omar, J; Boix, A; Kerckhove, G; von Holst, C

    2016-12-01

    Titanium dioxide (TiO2) has various applications in consumer products and is also used as an additive in food and feeding stuffs. For the characterisation of this product, including the determination of nanoparticles, there is a strong need for the availability of corresponding methods of analysis. This paper presents an optimisation process for the characterisation of polydisperse-coated TiO2 nanoparticles. As a first step, probe ultrasonication was optimised using a central composite design in which the amplitude and time were the selected variables to disperse, i.e., to break up agglomerates and/or aggregates of the material. The results showed that high amplitudes (60%) favoured a better dispersion and time was fixed at a mid-value (5 min). In the next step, key factors of asymmetric flow field-flow fractionation (AF4), namely cross-flow (CF), detector flow (DF), exponential decay of the cross-flow (CFexp) and focus time (Ft), were studied through experimental design. Firstly, a full-factorial design was employed to establish the statistically significant factors (p < 0.05). Then, the information obtained from the full-factorial design was utilised by applying a central composite design to obtain the following optimum conditions of the system: CF, 1.6 ml min-1; DF, 0.4 ml min-1; Ft, 5 min; and CFexp, 0.6. Once the optimum conditions were obtained, the stability of the dispersed sample was measured for 24 h by analysing 10 replicates with AF4 in order to assess the performance of the optimised dispersion protocol. Finally, the recovery of the optimised method, particle shape and particle size distribution were estimated.
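
    In coded (-1/0/+1) units, the two designs mentioned above are easy to write down explicitly. The sketch below builds a 2-level full factorial for the four AF4 factors and a face-centred central composite design (axial distance 1; three centre points are an assumption); mapping coded levels to real flow rates and times is left to the analyst.

    ```python
    """Library-free full-factorial and face-centred central composite designs in coded units."""
    from itertools import product

    factors = ["CF", "DF", "CFexp", "Ft"]

    # Full factorial screening design: every combination of low (-1) and high (+1).
    full_factorial = list(product([-1, 1], repeat=len(factors)))
    print(len(full_factorial), "screening runs")

    # Face-centred CCD = factorial points + axial (star) points + centre points.
    axial = []
    for i in range(len(factors)):
        for a in (-1, 1):
            point = [0] * len(factors)
            point[i] = a
            axial.append(tuple(point))
    centre = [(0,) * len(factors)] * 3        # number of centre replicates is assumed
    ccd = full_factorial + axial + centre
    print(len(ccd), "CCD runs")
    ```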

  15. Optimisation of asymmetric flow field-flow fractionation for the characterisation of nanoparticles in coated polydisperse TiO2 with applications in food and feed

    PubMed Central

    Omar, J.; Boix, A.; Kerckhove, G.; von Holst, C.

    2016-01-01

    Titanium dioxide (TiO2) has various applications in consumer products and is also used as an additive in food and feeding stuffs. For the characterisation of this product, including the determination of nanoparticles, there is a strong need for the availability of corresponding methods of analysis. This paper presents an optimisation process for the characterisation of polydisperse-coated TiO2 nanoparticles. As a first step, probe ultrasonication was optimised using a central composite design in which the amplitude and time were the selected variables to disperse, i.e., to break up agglomerates and/or aggregates of the material. The results showed that high amplitudes (60%) favoured a better dispersion and time was fixed at a mid-value (5 min). In the next step, key factors of asymmetric flow field-flow fractionation (AF4), namely cross-flow (CF), detector flow (DF), exponential decay of the cross-flow (CFexp) and focus time (Ft), were studied through experimental design. Firstly, a full-factorial design was employed to establish the statistically significant factors (p < 0.05). Then, the information obtained from the full-factorial design was utilised by applying a central composite design to obtain the following optimum conditions of the system: CF, 1.6 ml min–1; DF, 0.4 ml min–1; Ft, 5 min; and CFexp, 0.6. Once the optimum conditions were obtained, the stability of the dispersed sample was measured for 24 h by analysing 10 replicates with AF4 in order to assess the performance of the optimised dispersion protocol. Finally, the recovery of the optimised method, particle shape and particle size distribution were estimated. PMID:27650879

  16. Biodegradation of free cyanide and subsequent utilisation of biodegradation by-products by Bacillus consortia: optimisation using response surface methodology.

    PubMed

    Mekuto, Lukhanyo; Ntwampe, Seteno Karabo Obed; Jackson, Vanessa Angela

    2015-07-01

    A mesophilic alkali-tolerant bacterial consortium belonging to the Bacillus genus was evaluated for its ability to biodegrade high free cyanide (CN(-)) concentrations (up to 500 mg CN(-)/L), with subsequent oxidation of the formed ammonium and nitrates, in a continuous bioreactor system solely supplemented with whey waste. Furthermore, an optimisation study for successful cyanide biodegradation by this consortium was carried out in batch bioreactors (BBs) using response surface methodology (RSM). The input variables, that is, pH, temperature and whey-waste concentration, were optimised using a numerical optimisation technique, where the optimum conditions were found to be as follows: pH 9.88, temperature 33.60 °C and whey-waste concentration of 14.27 g/L, under which the microbial species can biodegrade 206.53 mg CN(-)/L in 96 h from an initial cyanide concentration of 500 mg CN(-)/L. Furthermore, using the optimised data, cyanide biodegradation in a continuous mode was evaluated in a dual-stage packed-bed bioreactor (PBB) connected in series to a pneumatic bioreactor system (PBS) used for simultaneous nitrification, including aerobic denitrification. The whey-supported Bacillus sp. culture was not inhibited by the free cyanide concentration of up to 500 mg CN(-)/L, with an overall degradation efficiency of ≥ 99 % with subsequent nitrification and aerobic denitrification of the formed ammonium and nitrates over a period of 80 days. This is the first study to report free cyanide biodegradation at concentrations of up to 500 mg CN(-)/L in a continuous system using whey waste as a microbial feedstock. The results showed that the process has the potential for the bioremediation of cyanide-containing wastewaters.

  17. The optimization of treatment and management of schizophrenia in Europe (OPTiMiSE) trial: rationale for its methodology and a review of the effectiveness of switching antipsychotics.

    PubMed

    Leucht, Stefan; Winter-van Rossum, Inge; Heres, Stephan; Arango, Celso; Fleischhacker, W Wolfgang; Glenthøj, Birte; Leboyer, Marion; Leweke, F Markus; Lewis, Shôn; McGuire, Phillip; Meyer-Lindenberg, Andreas; Rujescu, Dan; Kapur, Shitij; Kahn, René S; Sommer, Iris E

    2015-05-01

    Most of the 13 542 trials contained in the Cochrane Schizophrenia Group's register just tested the general efficacy of pharmacological or psychosocial interventions. Studies on the subsequent treatment steps, which are essential to guide clinicians, are largely missing. This knowledge gap leaves important questions unanswered. For example, when a first antipsychotic failed, is switching to another drug effective? And when should we use clozapine? The aim of this article is to review the efficacy of switching antipsychotics in case of nonresponse. We also present the European Commission sponsored "Optimization of Treatment and Management of Schizophrenia in Europe" (OPTiMiSE) trial which aims to provide a treatment algorithm for patients with a first episode of schizophrenia. We searched Pubmed (October 29, 2014) for randomized controlled trials (RCTs) that examined switching the drug in nonresponders to another antipsychotic. We described important methodological choices of the OPTiMiSE trial. We found 10 RCTs on switching antipsychotic drugs. No trial was conclusive and none was concerned with first-episode schizophrenia. In OPTiMiSE, 500 first episode patients are treated with amisulpride for 4 weeks, followed by a 6-week double-blind RCT comparing continuation of amisulpride with switching to olanzapine and ultimately a 12-week clozapine treatment in nonremitters. A subsequent 1-year RCT validates psychosocial interventions to enhance adherence. Current literature fails to provide basic guidance for the pharmacological treatment of schizophrenia. The OPTiMiSE trial is expected to provide a basis for clinical guidelines to treat patients with a first episode of schizophrenia. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  18. Determination of optimal ultrasound planes for the initialisation of image registration during endoscopic ultrasound-guided procedures.

    PubMed

    Bonmati, Ester; Hu, Yipeng; Gibson, Eli; Uribarri, Laura; Keane, Geri; Gurusami, Kurinchi; Davidson, Brian; Pereira, Stephen P; Clarkson, Matthew J; Barratt, Dean C

    2018-06-01

    Navigation of endoscopic ultrasound (EUS)-guided procedures of the upper gastrointestinal (GI) system can be technically challenging due to the small fields-of-view of ultrasound and optical devices, as well as the anatomical variability and limited number of orienting landmarks during navigation. Co-registration of an EUS device and a pre-procedure 3D image can enhance the ability to navigate. However, the fidelity of this contextual information depends on the accuracy of registration. The purpose of this study was to develop and test the feasibility of a simulation-based planning method for pre-selecting patient-specific EUS-visible anatomical landmark locations to maximise the accuracy and robustness of a feature-based multimodality registration method. A registration approach was adopted in which landmarks are registered to anatomical structures segmented from the pre-procedure volume. The predicted target registration errors (TREs) of EUS-CT registration were estimated using simulated visible anatomical landmarks and a Monte Carlo simulation of landmark localisation error. The optimal planes were selected based on the 90th percentile of TREs, which provide a robust and more accurate EUS-CT registration initialisation. The method was evaluated by comparing the accuracy and robustness of registrations initialised using optimised planes versus non-optimised planes using manually segmented CT images and simulated ([Formula: see text]) or retrospective clinical ([Formula: see text]) EUS landmarks. The results show a lower 90th percentile TRE when registration is initialised using the optimised planes compared with a non-optimised initialisation approach (p value [Formula: see text]). The proposed simulation-based method to find optimised EUS planes and landmarks for EUS-guided procedures may have the potential to improve registration accuracy. Further work will investigate applying the technique in a clinical setting.
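
    The Monte Carlo TRE estimate can be sketched compactly: perturb the chosen landmarks with the assumed localisation error, solve the rigid registration by the standard Kabsch/SVD method, and take the 90th percentile of the resulting error at a target. The geometry, error magnitude and landmark count below are placeholders, not the clinical data or the paper's simulation settings.

    ```python
    """Monte Carlo estimate of the 90th-percentile target registration error (synthetic geometry)."""
    import numpy as np

    rng = np.random.default_rng(4)
    landmarks = rng.uniform(-50, 50, (6, 3))      # hypothetical landmark positions (mm)
    target = np.array([10.0, 0.0, 30.0])          # hypothetical target location (mm)
    sigma = 2.0                                   # assumed landmark localisation error (mm)

    def rigid_fit(src, dst):
        """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
        cs, cd = src.mean(0), dst.mean(0)
        H = (src - cs).T @ (dst - cd)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # avoid reflections
        R = Vt.T @ D @ U.T
        return R, cd - R @ cs

    tres = []
    for _ in range(2000):
        noisy = landmarks + rng.normal(0, sigma, landmarks.shape)
        R, t = rigid_fit(noisy, landmarks)        # register the noisy landmarks back
        tres.append(np.linalg.norm((R @ target + t) - target))
    print("90th percentile TRE (mm):", np.percentile(tres, 90))
    ```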

  19. Providing Effective Access to Shared Resources: A COIN Approach

    NASA Technical Reports Server (NTRS)

    Airiau, Stephane; Wolpert, David H.

    2004-01-01

    Managers of systems of shared resources typically have many separate goals. Examples are efficient utilization of the resources among its users and ensuring no user's satisfaction in the system falls below a preset minimal level. Since such goals will usually conflict with one another, either implicitly or explicitly the manager must determine the relative importance of the goals, encapsulating that into an overall utility function rating the possible behaviors of the entire system. Here we demonstrate a distributed, robust, and adaptive way to optimize that overall function. Our approach is to interpose adaptive agents between each user and the system, where each such agent is working to maximize its own private utility function. In turn, each such agent's function should be both relatively easy for the agent to learn to optimize, and "aligned" with the overall utility function of the system manager - an overall function that is based on but in general different from the satisfaction functions of the individual users. To ensure this we enhance the Collective INtelligence (COIN) framework to incorporate user satisfaction functions in the overall utility function of the system manager and accordingly in the associated private utility functions assigned to the users' agents. We present experimental evaluations of different COIN-based private utility functions and demonstrate that those COIN-based functions outperform some natural alternatives.

  20. Providing Effective Access to Shared Resources: A COIN Approach

    NASA Technical Reports Server (NTRS)

    Airiau, Stephane; Wolpert, David H.; Sen, Sandip; Tumer, Kagan

    2003-01-01

    Managers of systems of shared resources typically have many separate goals. Examples are efficient utilization of the resources among its users and ensuring no user's satisfaction in the system falls below a preset minimal level. Since such goals will usually conflict with one another, either implicitly or explicitly the manager must determine the relative importance of the goals, encapsulating that into an overall utility function rating the possible behaviors of the entire system. Here we demonstrate a distributed, robust, and adaptive way to optimize that overall function. Our approach is to interpose adaptive agents between each user and the system, where each such agent is working to maximize its own private utility function. In turn, each such agent's function should be both relatively easy for the agent to learn to optimize, and 'aligned' with the overall utility function of the system manager - an overall function that is based on but in general different from the satisfaction functions of the individual users. To ensure this we enhance the COllective INtelligence (COIN) framework to incorporate user satisfaction functions in the overall utility function of the system manager and accordingly in the associated private utility functions assigned to the users' agents. We present experimental evaluations of different COIN-based private utility functions and demonstrate that those COIN-based functions outperform some natural alternatives.
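
    A minimal numerical rendering of the "aligned private utility" idea in these two abstracts is the difference utility: each agent is rewarded with G(z) minus G evaluated with its own action clamped to a fixed baseline, so selfish learning on the private utility cannot pull against the manager's overall function. The congestion-style world utility, the baseline choice and the learning rule below are illustrative assumptions, not the experiments of the papers.

    ```python
    """Toy difference-utility (COIN-style) learning on a shared-resource congestion model."""
    import numpy as np

    N_AGENTS, N_RESOURCES, CAPACITY = 20, 4, 6     # all values assumed for illustration

    def world_utility(actions):
        """Toy G: reward usage of each resource up to capacity, penalise overload."""
        load = np.bincount(actions, minlength=N_RESOURCES)
        return float(np.minimum(load, CAPACITY).sum() - 2.0 * np.maximum(load - CAPACITY, 0).sum())

    def difference_utility(actions, agent, baseline=0):
        """Private utility G(z) - G(z with this agent's action clamped to a baseline resource)."""
        clamped = actions.copy()
        clamped[agent] = baseline
        return world_utility(actions) - world_utility(clamped)

    rng = np.random.default_rng(5)
    actions = rng.integers(0, N_RESOURCES, N_AGENTS)

    # Simple best-response learning on the private (difference) utilities.
    for _ in range(50):
        agent = rng.integers(N_AGENTS)
        payoffs = []
        for a in range(N_RESOURCES):
            trial = actions.copy()
            trial[agent] = a
            payoffs.append(difference_utility(trial, agent))
        actions[agent] = int(np.argmax(payoffs))
    print("world utility after learning:", world_utility(actions))
    ```

    Because the clamped term does not depend on the agent's actual choice, maximising the private utility is equivalent to maximising G with respect to that agent's action, which is the alignment property the abstracts refer to.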

  1. Linear-scaling time-dependent density-functional theory beyond the Tamm-Dancoff approximation: Obtaining efficiency and accuracy with in situ optimised local orbitals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuehlsdorff, T. J., E-mail: tjz21@cam.ac.uk; Payne, M. C.; Hine, N. D. M.

    2015-11-28

    We present a solution of the full time-dependent density-functional theory (TDDFT) eigenvalue equation in the linear response formalism exhibiting a linear-scaling computational complexity with system size, without relying on the simplifying Tamm-Dancoff approximation (TDA). The implementation relies on representing the occupied and unoccupied subspaces with two different sets of in situ optimised localised functions, yielding a very compact and efficient representation of the transition density matrix of the excitation with the accuracy associated with a systematic basis set. The TDDFT eigenvalue equation is solved using a preconditioned conjugate gradient algorithm that is very memory-efficient. The algorithm is validated on a small test molecule and a good agreement with results obtained from standard quantum chemistry packages is found, with the preconditioner yielding a significant improvement in convergence rates. The method developed in this work is then used to reproduce experimental results of the absorption spectrum of bacteriochlorophyll in an organic solvent, where it is demonstrated that the TDA fails to reproduce the main features of the low energy spectrum, while the full TDDFT equation yields results in good qualitative agreement with experimental data. Furthermore, the need for explicitly including parts of the solvent into the TDDFT calculations is highlighted, making the treatment of large system sizes necessary that are well within reach of the capabilities of the algorithm introduced here. Finally, the linear-scaling properties of the algorithm are demonstrated by computing the lowest excitation energy of bacteriochlorophyll in solution. The largest systems considered in this work are of the same order of magnitude as a variety of widely studied pigment-protein complexes, opening up the possibility of studying their properties without having to resort to any semiclassical approximations to parts of the protein environment.
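
    The abstract mentions a memory-efficient preconditioned conjugate gradient solver. Purely to illustrate that solver class (not the TDDFT eigenvalue implementation, whose operator and preconditioner are far more involved), a generic Jacobi-preconditioned CG for a symmetric positive-definite system looks like this; the test matrix is an arbitrary assumption.

    ```python
    """Generic Jacobi-preconditioned conjugate gradient for an SPD system Ax = b."""
    import numpy as np

    def pcg(A, b, tol=1e-10, max_iter=500):
        M_inv = 1.0 / np.diag(A)                 # Jacobi (diagonal) preconditioner
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_inv * r
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = M_inv * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    rng = np.random.default_rng(6)
    B = rng.normal(size=(100, 100))
    A = B @ B.T + 100 * np.eye(100)              # well-conditioned SPD test matrix (assumed)
    b = rng.normal(size=100)
    x = pcg(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))
    ```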

  2. Modelling and strategy optimisation for a kind of networked evolutionary games with memories under the bankruptcy mechanism

    NASA Astrophysics Data System (ADS)

    Fu, Shihua; Li, Haitao; Zhao, Guodong

    2018-05-01

    This paper investigates the evolutionary dynamic and strategy optimisation for a kind of networked evolutionary games whose strategy updating rules incorporate 'bankruptcy' mechanism, and the situation that each player's bankruptcy is due to the previous continuous low profits gaining from the game is considered. First, by using semi-tensor product of matrices method, the evolutionary dynamic of this kind of games is expressed as a higher order logical dynamic system and then converted into its algebraic form, based on which, the evolutionary dynamic of the given games can be discussed. Second, the strategy optimisation problem is investigated, and some free-type control sequences are designed to maximise the total payoff of the whole game. Finally, an illustrative example is given to show that our new results are very effective.
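
    The semi-tensor product that underpins the algebraic form used above is defined, for A of size m×n and B of size p×q, as A ⋉ B = (A ⊗ I_{t/n})(B ⊗ I_{t/p}) with t = lcm(n, p); it reduces to the ordinary matrix product when n = p. A direct implementation is short (the example matrices below are arbitrary placeholders, not a game from the paper).

    ```python
    """Semi-tensor product (STP) of two matrices via Kronecker products."""
    import numpy as np
    from math import lcm

    def stp(A, B):
        n, p = A.shape[1], B.shape[0]
        t = lcm(n, p)
        return np.kron(A, np.eye(t // n)) @ np.kron(B, np.eye(t // p))

    # Example: STP of a 2x2 matrix with a 4x1 joint-strategy vector.
    A = np.array([[1.0, 0.0], [0.0, 1.0]])
    x = np.arange(4.0).reshape(4, 1)
    print(stp(A, x).shape)   # (4, 1): dimensions are matched automatically
    ```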

  3. Easy-Going On-Spectrometer Optimisation of Phase Modulated Homonuclear Decoupling Sequences in Solid-State NMR

    NASA Astrophysics Data System (ADS)

    Grimminck, Dennis L. A. G.; Vasa, Suresh K.; Meerts, W. Leo; Kentgens, P. M.

    2011-06-01

    A global optimisation scheme for phase modulated proton homonuclear decoupling sequences in solid-state NMR is presented. Phase modulations, parameterised by DUMBO Fourier coefficients, were optimised using a Covariance Matrix Adaptation Evolution Strategy (CMA-ES) algorithm. Our method, denoted EASY-GOING homonuclear decoupling, starts with featureless spectra and optimises proton-proton decoupling during either proton or carbon signal detection. On the one hand, our solutions closely resemble (e)DUMBO for moderate sample spinning frequencies and medium radio-frequency (rf) field strengths. On the other hand, the EASY-GOING approach resulted in a superior solution, achieving significantly better resolved proton spectra at a very high rf field strength of 680 kHz. References: N. Hansen and A. Ostermeier, Evol. Comput. 9 (2001) 159-195; B. Elena, G. de Paepe and L. Emsley, Chem. Phys. Lett. 398 (2004) 532-538.
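
    On the spectrometer the fitness is a measure of the decoupled line width and the optimiser is the full CMA-ES; as a stripped-down stand-in, the sketch below runs a simple (mu, lambda) evolution strategy over DUMBO-style Fourier coefficients against an arbitrary analytic surrogate objective. Everything in it (coefficient count, population sizes, the surrogate target) is an assumption made for illustration only.

    ```python
    """Toy (mu, lambda) evolution strategy over Fourier coefficients of a phase modulation."""
    import numpy as np

    rng = np.random.default_rng(7)
    N_COEFF, MU, LAM, SIGMA = 12, 5, 20, 0.3

    def phase(t, coeffs):
        """Phase modulation built from a truncated Fourier series over one cycle."""
        k = np.arange(1, N_COEFF // 2 + 1)[:, None]
        a, b = coeffs[: N_COEFF // 2, None], coeffs[N_COEFF // 2 :, None]
        return (a * np.cos(2 * np.pi * k * t) + b * np.sin(2 * np.pi * k * t)).sum(0)

    t = np.linspace(0, 1, 200, endpoint=False)
    target = np.sin(2 * np.pi * t) ** 3            # arbitrary surrogate "good" modulation

    def fitness(coeffs):                           # on-spectrometer this would be a line-width measure
        return -np.mean((phase(t, coeffs) - target) ** 2)

    mean = np.zeros(N_COEFF)
    for _ in range(300):
        pop = mean + SIGMA * rng.normal(size=(LAM, N_COEFF))
        scores = np.array([fitness(c) for c in pop])
        mean = pop[np.argsort(scores)[-MU:]].mean(0)   # recombine the mu best samples
    print("surrogate fitness of optimised coefficients:", fitness(mean))
    ```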

  4. Bright-White Beetle Scales Optimise Multiple Scattering of Light

    NASA Astrophysics Data System (ADS)

    Burresi, Matteo; Cortese, Lorenzo; Pattelli, Lorenzo; Kolle, Mathias; Vukusic, Peter; Wiersma, Diederik S.; Steiner, Ullrich; Vignolini, Silvia

    2014-08-01

    Whiteness arises from diffuse and broadband reflection of light typically achieved through optical scattering in randomly structured media. In contrast to structural colour due to coherent scattering, white appearance generally requires a relatively thick system comprising randomly positioned high refractive-index scattering centres. Here, we show that the exceptionally bright white appearance of Cyphochilus and Lepidiota stigma beetles arises from a remarkably optimised anisotropy of intra-scale chitin networks, which act as dense scattering media. Using time-resolved measurements, we show that light propagating in the scales of the beetles undergoes pronounced multiple scattering that is associated with the lowest transport mean free path reported to date for low-refractive-index systems. Our light-transport investigation unveils a high level of optimisation that achieves high-brightness white in a thin, low-mass-per-unit-area anisotropic disordered nanostructure.

  5. Utilization of Historic Information in an Optimisation Task

    NASA Technical Reports Server (NTRS)

    Boesser, T.

    1984-01-01

    One of the basic components of a discrete model of motor behavior and decision making, which describes tracking and supervisory control in unitary terms, is assumed to be a filtering mechanism which is tied to the representational principles of human memory for time-series information. In a series of experiments subjects used the time-series information with certain significant limitations: there is a range-effect; asymmetric distributions seem to be recognized, but it does not seem to be possible to optimize performance based on skewed distributions. Thus there is a transformation of the displayed data between the perceptual system and representation in memory involving a loss of information. This rules out a number of representational principles for time-series information in memory and fits very well into the framework of a comprehensive discrete model for control of complex systems, modelling continuous control (tracking), discrete responses, supervisory behavior and learning.

  6. Multiplexed LC-MS/MS analysis of horse plasma proteins to study doping in sport.

    PubMed

    Barton, Chris; Beck, Paul; Kay, Richard; Teale, Phil; Roberts, Jane

    2009-06-01

    The development of protein biomarkers for the indirect detection of doping in horse is a potential solution to doping threats such as gene and protein doping. A method for biomarker candidate discovery in horse plasma is presented using targeted analysis of proteotypic peptides from horse proteins. These peptides were first identified in a novel list of the abundant proteins in horse plasma. To monitor these peptides, an LC-MS/MS method using multiple reaction monitoring was developed to study the quantity of 49 proteins in horse plasma in a single run. The method was optimised and validated, and then applied to a population of race-horses to study protein variance within a population. The method was finally applied to longitudinal time courses of horse plasma collected after administration of an anabolic steroid to demonstrate utility for hypothesis-driven discovery of doping biomarker candidates.

  7. Light-induced atomic desorption in a compact system for ultracold atoms

    PubMed Central

    Torralbo-Campo, Lara; Bruce, Graham D.; Smirne, Giuseppe; Cassettari, Donatella

    2015-01-01

    In recent years, light-induced atomic desorption (LIAD) of alkali atoms from the inner surface of a vacuum chamber has been employed in cold atom experiments for the purpose of modulating the alkali background vapour. This is beneficial because larger trapped atom samples can be loaded from vapour at higher pressure, after which the pressure is reduced to increase the lifetime of the sample. We present an analysis, based on the case of rubidium atoms adsorbed on pyrex, of various aspects of LIAD that are useful for this application. Firstly, we study the intensity dependence of LIAD by fitting the experimental data with a rate-equation model, from which we extract a correct prediction for the increase in trapped atom number. Following this, we quantify a figure of merit for the utility of LIAD in cold atom experiments and we show how it can be optimised for realistic experimental parameters. PMID:26458325

  8. Process optimisation of microwave-assisted extraction of peony ( Paeonia suffruticosa Andr .) seed oil using hexane-ethanol mixture and its characterisation

    Treesearch

    Xiaoli Sun; Wengang Li; Jian Li; Yuangang Zu; Chung-Yun Hse; Jiulong Xie; Xiuhua Zhao

    2016-01-01

    A microwave-assisted extraction (MAE) method using an ethanol-hexane mixture as the extraction agent was used to extract peony (Paeonia suffruticosa Andr.) seed oil (PSO). The aim of the study was to optimise the extraction for both yield and energy consumption in mixed-agent MAE. The highest oil yield (34.49%) and lowest unit energy consumption (14 125.4 J g-1)...

  9. Optimizing Operational Physical Fitness (Optimisation de L’Aptitude Physique Operationnelle)

    DTIC Science & Technology

    2009-01-01

    RTO Technical Report TR-HFM-080, NATO Research and Technology Organisation, AC/323(HFM-080)TP/200 (www.rto.nato.int): Optimizing Operational Physical Fitness (Optimisation de l'aptitude physique opérationnelle). Final Report of Task Group 019.

  10. Summation-by-Parts operators with minimal dispersion error for coarse grid flow calculations

    NASA Astrophysics Data System (ADS)

    Linders, Viktor; Kupiainen, Marco; Nordström, Jan

    2017-07-01

    We present a procedure for constructing Summation-by-Parts operators with minimal dispersion error both near and far from numerical interfaces. Examples of such operators are constructed and compared with a higher order non-optimised Summation-by-Parts operator. Experiments show that the optimised operators are superior for wave propagation and turbulent flows involving large wavenumbers, long solution times and large ranges of resolution scales.
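
    For readers unfamiliar with the Summation-by-Parts property, the small check below builds the standard second-order first-derivative operator D = H^{-1}Q (not one of the dispersion-optimised operators of the paper) and verifies that Q + Q^T equals the boundary matrix diag(-1, 0, …, 0, 1), the structure that enables provably stable interface treatment.

    ```python
    """Verify the SBP property of the standard second-order first-derivative operator."""
    import numpy as np

    n, h = 11, 0.1
    H = h * np.diag([0.5] + [1.0] * (n - 2) + [0.5])      # SBP norm (quadrature) matrix
    Q = np.zeros((n, n))
    Q[0, 0], Q[0, 1] = -0.5, 0.5
    Q[-1, -2], Q[-1, -1] = -0.5, 0.5
    for i in range(1, n - 1):
        Q[i, i - 1], Q[i, i + 1] = -0.5, 0.5

    D = np.linalg.inv(H) @ Q                               # first-derivative approximation
    x = np.linspace(0.0, 1.0, n)
    print("max error on d/dx of x:", np.abs(D @ x - 1.0).max())   # exact for linear functions

    B = np.zeros((n, n))
    B[0, 0], B[-1, -1] = -1.0, 1.0
    print("SBP property Q + Q^T = B holds:", np.allclose(Q + Q.T, B))
    ```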

  11. Cost optimisation and minimisation of the environmental impact through life cycle analysis of the waste water treatment plant of Bree (Belgium).

    PubMed

    De Gussem, K; Wambecq, T; Roels, J; Fenu, A; De Gueldre, G; Van De Steene, B

    2011-01-01

    An ASM2da model of the full-scale waste water plant of Bree (Belgium) has been built. It showed very good correlation with reference operational data. This basic model has been extended to include an accurate calculation of environmental footprint and operational costs (energy consumption, dosing of chemicals and sludge treatment). Two optimisation strategies were compared: lowest cost meeting the effluent consent versus lowest environmental footprint. Six optimisation scenarios have been studied, namely (i) implementation of an online control system based on ammonium and nitrate sensors, (ii) implementation of a control on MLSS concentration, (iii) evaluation of internal recirculation flow, (iv) oxygen set point, (v) installation of mixing in the aeration tank, and (vi) evaluation of nitrate setpoint for post denitrification. Both the cost-based and the environmental-impact (Life Cycle Assessment, LCA) based optimisation approaches are able to significantly lower the cost and environmental footprint. However, the LCA approach has some advantages over cost minimisation of an existing full-scale plant. LCA tends to choose control settings that are more logical: it results in safer operation of the plant with less risk regarding the consents. It results in a better effluent at a slightly increased cost.

  12. Optimising design, operation and energy consumption of biological aerated filters (BAF) for nitrogen removal of municipal wastewater.

    PubMed

    Rother, E; Cornel, P

    2004-01-01

    The biofiltration process in wastewater treatment combines filtration and biological processes in one reactor. In Europe it is now an accepted technology in advanced wastewater treatment whenever space is scarce and an effluent virtually free of suspended solids is demanded. Although more than 500 plants are in operation world-wide, there is still a lack of published operational experience to help planners and operators identify potential for optimisation, e.g. of energy consumption or of vulnerability to peak loads. Examples from pilot trials are given of how nitrification and denitrification can be optimised. Nitrification can be quickly increased by adjusting the DO content of the water. Furthermore, carrier materials such as zeolites can store surplus ammonia during peak loads and release it afterwards. Pre-denitrification in biofilters is normally limited by the amount of easily degradable organic substrate, resulting in relatively high requirements for external carbon. The combination of pre-DN, N and post-DN filters is much more advisable for most municipal wastewaters, because the recycle rate can be reduced and external carbon can be saved. As an example, it is shown for a full-scale pre-anoxic-DN/N/post-anoxic-DN plant of 130,000 p.e. how 15% of the energy could be saved by optimising internal recycling and some control strategies.

  13. Optimised in vitro applicable loads for the simulation of lateral bending in the lumbar spine.

    PubMed

    Dreischarf, Marcel; Rohlmann, Antonius; Bergmann, Georg; Zander, Thomas

    2012-07-01

    In in vitro studies of the lumbar spine, simplified loading modes (compressive follower force, pure moment) are usually employed to simulate the standard load cases of flexion-extension, axial rotation and lateral bending of the upper body. However, the magnitudes of these loads vary widely in the literature; thus the results of current studies may be unrealistic and are hardly comparable. It is still unknown which load magnitudes lead to a realistic simulation of maximum lateral bending. A validated finite element model of the lumbar spine was used in an optimisation study to determine which magnitudes of the compressive follower force and bending moment deliver results that best fit averaged in vivo data. The best agreement with averaged in vivo measured data was found for a compressive follower force of 700 N and a lateral bending moment of 7.8 Nm. These results show that loading modes differing strongly from the optimised one may not realistically simulate maximum lateral bending. The simplified but in vitro applicable loading cannot perfectly mimic the in vivo situation. However, the optimised magnitudes are those that agree best with averaged in vivo measured data, and their consistent application would lead to better comparability between investigations. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.
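    A minimal sketch of the kind of optimisation described in this record is given below: a grid search over follower force and bending moment that minimises the deviation of a model prediction from an averaged in vivo target. The linear surrogate model, its coefficients and the target rotation are hypothetical; in the actual study a validated finite element model supplied the predictions.

    import numpy as np

    # Hypothetical surrogate: lateral-bending rotation (deg) as a linear function of
    # follower force F (N) and bending moment M (Nm). Placeholder coefficients only;
    # the study used a validated finite element model instead.
    def predicted_rotation(force_n, moment_nm):
        return 0.002 * force_n + 1.5 * moment_nm

    TARGET_ROTATION_DEG = 13.0  # hypothetical averaged in vivo value

    forces = np.arange(100.0, 1201.0, 50.0)   # candidate follower forces (N)
    moments = np.arange(1.0, 12.1, 0.2)       # candidate bending moments (Nm)

    best = None
    for f in forces:
        for m in moments:
            err = abs(predicted_rotation(f, m) - TARGET_ROTATION_DEG)
            if best is None or err < best[0]:
                best = (err, f, m)

    print("Best fit: F = %.0f N, M = %.1f Nm (|error| = %.3f deg)"
          % (best[1], best[2], best[0]))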

  14. Electroconvulsive therapy stimulus titration: Not all it seems.

    PubMed

    Rosenman, Stephen J

    2018-05-01

    To examine the provenance and implications of seizure threshold titration in electroconvulsive therapy. Titration of the seizure threshold has become a virtual standard for electroconvulsive therapy, justified as individualisation and optimisation of the balance between efficacy and unwanted effects. Present-day threshold estimation differs significantly from the 1960 studies of Cronholm and Ottosson that are its usual justification. The present form of threshold estimation is unstable and too uncertain for valid optimisation or individualisation of dose. Threshold stimulation (the lowest dose that produces a seizure) has proven therapeutically ineffective, and the multiples applied to the threshold to attain efficacy have never been properly investigated or standardised. The therapeutic outcomes of threshold estimation (or its multiples) have not been separated from simple dose effects. Threshold estimation does not optimise dose, owing to its own uncertainties and to differing short-term and long-term cognitive and memory effects. Potential harms of titration have not been examined. Seizure threshold titration in electroconvulsive therapy is not a proven technique of dose optimisation. It is widely held and practised, its benefit and harmlessness assumed but unproven. It is a prematurely settled answer to an unsettled question that discourages further enquiry, and an example of how practices assumed to be scientific enter medicine by obscure paths.

  15. Application of statistical experimental design for optimisation of bioinsecticides production by sporeless Bacillus thuringiensis strain on cheap medium.

    PubMed

    Ben Khedher, Saoussen; Jaoua, Samir; Zouari, Nabil

    2013-01-01

    In order to overproduce bioinsecticides with a sporeless Bacillus thuringiensis strain, an optimal composition of a cheap medium was defined using response surface methodology. In a first step, a Plackett-Burman design used to evaluate the effects of eight medium components on delta-endotoxin production showed that starch, soya bean and sodium chloride had significant effects on bioinsecticide production. In a second step, these parameters were selected for further optimisation by central composite design. The results revealed that the optimum culture medium for delta-endotoxin production consists of 30 g L(-1) starch, 30 g L(-1) soya bean and 9 g L(-1) sodium chloride. When compared to the basal production medium, an improvement in delta-endotoxin production of up to 50% was noted. Moreover, the relative toxin yield of sporeless Bacillus thuringiensis S22 was improved markedly by using the optimised cheap medium (148.5 mg delta-endotoxins per g starch) compared to the yield obtained in the basal medium (94.46 mg delta-endotoxins per g starch). Therefore, the optimised cheap culture medium appears to be a good alternative for low-cost production of sporeless Bacillus thuringiensis bioinsecticides at industrial scale, which is of great importance from a practical point of view.
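    The second optimisation step described above uses a central composite design in the three significant factors. The sketch below generates such a design in coded and real units; the face-centred variant, the factor ranges and the number of centre replicates are illustrative assumptions, not the study's actual design table.

    import itertools
    import numpy as np

    # Face-centred central composite design (alpha = 1) for three factors,
    # here labelled starch, soya bean and NaCl. Coded levels: -1, 0, +1.
    factorial_runs = [list(t) for t in itertools.product([-1.0, 1.0], repeat=3)]  # 8 corners

    axial_runs = []                                # 6 axial (star) runs
    for i in range(3):
        for a in (-1.0, 1.0):
            run = [0.0, 0.0, 0.0]
            run[i] = a
            axial_runs.append(run)

    center_runs = [[0.0, 0.0, 0.0]] * 3            # replicated centre runs (assumed 3)

    design_coded = np.array(factorial_runs + axial_runs + center_runs)

    # Map coded levels to real concentrations (g/L); ranges are illustrative guesses.
    low = np.array([10.0, 10.0, 3.0])    # starch, soya bean, NaCl
    high = np.array([40.0, 40.0, 12.0])
    design_real = low + (design_coded + 1.0) / 2.0 * (high - low)

    for coded, real in zip(design_coded, design_real):
        print(coded, "->", np.round(real, 1), "g/L")

    Measured delta-endotoxin yields at these runs would then be fitted with a quadratic response surface to locate the optimum composition.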

  16. Single tube genotyping of sickle cell anaemia using PCR-based SNP analysis

    PubMed Central

    Waterfall, Christy M.; Cobb, Benjamin D.

    2001-01-01

    Allele-specific amplification (ASA) is a generally applicable technique for the detection of known single nucleotide polymorphisms (SNPs), deletions, insertions and other sequence variations. Conventionally, two reactions are required to determine the zygosity of DNA in a two-allele system, along with significant upstream optimisation to define the specific test conditions. Here, we combine single tube bi-directional ASA with a ‘matrix-based’ optimisation strategy, speeding up the whole process in a reduced reaction set. We use sickle cell anaemia as our model SNP system, a genetic disease that is currently screened using ASA methods. Discriminatory conditions were rapidly optimised enabling the unambiguous identification of DNA from homozygous sickle cell patients (HbS/S), heterozygous carriers (HbA/S) or normal DNA in a single tube. Simple downstream mathematical analyses based on product yield across the optimisation set allow an insight into the important aspects of priming competition and component interactions in this competitive PCR. This strategy can be applied to any polymorphism, defining specific conditions using a multifactorial approach. The inherent simplicity and low cost of this PCR-based method validates bi-directional ASA as an effective tool in future clinical screening and pharmacogenomic research where more expensive fluorescence-based approaches may not be desirable. PMID:11726702
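    The 'matrix-based' optimisation strategy described above can be pictured as scoring product yield across a small grid of reaction conditions and picking the most discriminatory combination. The sketch below illustrates only the idea: the annealing temperatures, primer ratios and the scoring function are hypothetical stand-ins, not the published reaction set.

    import itertools

    # Hypothetical matrix of bi-directional ASA conditions: annealing temperature (C)
    # crossed with allele-specific/common primer ratio. The score is a placeholder for
    # measured band intensities of the two allele-specific products.
    temperatures = [58, 60, 62, 64]
    primer_ratios = [0.5, 1.0, 2.0]

    def yield_score(temp_c, ratio):
        # Placeholder scoring function: favours ~62 C and a balanced primer ratio.
        return max(0.0, 10.0 - abs(temp_c - 62) * 2.0 - abs(ratio - 1.0) * 3.0)

    matrix = {(t, r): yield_score(t, r)
              for t, r in itertools.product(temperatures, primer_ratios)}

    best_condition = max(matrix, key=matrix.get)
    print("Best discriminatory condition (hypothetical): Ta = %d C, ratio = %.1f, score = %.1f"
          % (best_condition[0], best_condition[1], matrix[best_condition]))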

  17. Single tube genotyping of sickle cell anaemia using PCR-based SNP analysis.

    PubMed

    Waterfall, C M; Cobb, B D

    2001-12-01

    Allele-specific amplification (ASA) is a generally applicable technique for the detection of known single nucleotide polymorphisms (SNPs), deletions, insertions and other sequence variations. Conventionally, two reactions are required to determine the zygosity of DNA in a two-allele system, along with significant upstream optimisation to define the specific test conditions. Here, we combine single tube bi-directional ASA with a 'matrix-based' optimisation strategy, speeding up the whole process in a reduced reaction set. We use sickle cell anaemia as our model SNP system, a genetic disease that is currently screened using ASA methods. Discriminatory conditions were rapidly optimised enabling the unambiguous identification of DNA from homozygous sickle cell patients (HbS/S), heterozygous carriers (HbA/S) or normal DNA in a single tube. Simple downstream mathematical analyses based on product yield across the optimisation set allow an insight into the important aspects of priming competition and component interactions in this competitive PCR. This strategy can be applied to any polymorphism, defining specific conditions using a multifactorial approach. The inherent simplicity and low cost of this PCR-based method validates bi-directional ASA as an effective tool in future clinical screening and pharmacogenomic research where more expensive fluorescence-based approaches may not be desirable.

  18. Statistical optimisation of diclofenac sustained release pellets coated with polymethacrylic films.

    PubMed

    Kramar, A; Turk, S; Vrecer, F

    2003-04-30

    The objective of the present study was to evaluate three formulation parameters for the application of polymethacrylic films from aqueous dispersions in order to obtain multiparticulate sustained release of diclofenac sodium. Film coating of pellet cores was performed in a laboratory fluid-bed apparatus. The chosen independent variables, i.e. the concentration of plasticizer (triethyl citrate), the methacrylate polymer ratio (Eudragit RS:Eudragit RL) and the quantity of coating dispersion, were optimised with a three-factor, three-level Box-Behnken design. The chosen dependent variables were the cumulative percentages of diclofenac dissolved in 3, 4 and 6 h. Based on the experimental design, different diclofenac release profiles were obtained. Response surface plots were used to relate the dependent and independent variables. The optimisation procedure generated an optimum of 40% release in 3 h; the corresponding levels of plasticizer concentration, quantity of coating dispersion and polymer ratio (Eudragit RS:Eudragit RL) were 25% w/w, 400 g and 3/1, respectively. The optimised formulation prepared according to the computer-determined levels provided a release profile that was close to the predicted values. Thermal and surface characteristics of the polymethacrylic films were also studied to understand the influence of plasticizer concentration on drug release from the pellets.
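    For illustration, a three-factor, three-level Box-Behnken design consists of the twelve edge-midpoint runs (each pair of factors at ±1 with the third at 0) plus centre replicates. The minimal sketch below generates the coded design matrix; the factor labels follow the abstract, while the number of centre points is an assumption.

    import itertools
    import numpy as np

    FACTORS = ["plasticizer_pct_ww", "coating_dispersion_g", "RS_RL_ratio"]

    def box_behnken_3factor(center_points=3):
        """Coded (-1, 0, +1) Box-Behnken design for three factors."""
        runs = []
        # For every pair of factors, take the 2x2 factorial at +/-1 with the third at 0.
        for i, j in itertools.combinations(range(3), 2):
            for a, b in itertools.product([-1, 1], repeat=2):
                run = [0, 0, 0]
                run[i], run[j] = a, b
                runs.append(run)
        runs.extend([[0, 0, 0]] * center_points)  # centre replicates
        return np.array(runs)

    design = box_behnken_3factor()
    print(FACTORS)
    print(design)  # 12 edge-midpoint runs + 3 centre runs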

  19. Application of statistical experimental design for optimisation of bioinsecticides production by sporeless Bacillus thuringiensis strain on cheap medium

    PubMed Central

    Ben Khedher, Saoussen; Jaoua, Samir; Zouari, Nabil

    2013-01-01

    In order to overproduce bioinsecticides with a sporeless Bacillus thuringiensis strain, an optimal composition of a cheap medium was defined using response surface methodology. In a first step, a Plackett-Burman design used to evaluate the effects of eight medium components on delta-endotoxin production showed that starch, soya bean and sodium chloride had significant effects on bioinsecticide production. In a second step, these parameters were selected for further optimisation by central composite design. The results revealed that the optimum culture medium for delta-endotoxin production consists of 30 g L−1 starch, 30 g L−1 soya bean and 9 g L−1 sodium chloride. When compared to the basal production medium, an improvement in delta-endotoxin production of up to 50% was noted. Moreover, the relative toxin yield of sporeless Bacillus thuringiensis S22 was improved markedly by using the optimised cheap medium (148.5 mg delta-endotoxins per g starch) compared to the yield obtained in the basal medium (94.46 mg delta-endotoxins per g starch). Therefore, the optimised cheap culture medium appears to be a good alternative for low-cost production of sporeless Bacillus thuringiensis bioinsecticides at industrial scale, which is of great importance from a practical point of view. PMID:24516462

  20. Optimising resolution for a preparative separation of Chinese herbal medicine using a surrogate model sample system.

    PubMed

    Ye, Haoyu; Ignatova, Svetlana; Peng, Aihua; Chen, Lijuan; Sutherland, Ian

    2009-06-26

    This paper builds on previous modelling research with short single-layer columns to develop rapid methods for optimising high-performance counter-current chromatography at constant stationary phase retention. Benzyl alcohol and p-cresol are used as model compounds to rapidly optimise the operating conditions, first flow rate and then rotational speed, at a preparative scale with long columns for a given phase system using a Dynamic Extractions Midi-DE centrifuge. The transfer to a high-value extract, the crude ethanol extract of the Chinese herbal medicine Millettia pachycarpa Benth., is then demonstrated and validated using the same phase system. The results show that constant stationary phase modelling of flow and speed with long multilayer columns works well as a cheap, quick and effective method of optimising operating conditions for the chosen phase system, hexane-ethyl acetate-methanol-water (1:0.8:1:0.6, v/v). Optimum conditions for resolution were a flow of 20 ml/min at a speed of 1200 rpm, but for throughput they were 80 ml/min at the same speed. At 80 ml/min the best throughputs were obtained for tephrosin (518 mg/h), pyranoisoflavone (47.2 mg/h) and dehydrodeguelin (10.4 mg/h), whereas for deguelin (100.5 mg/h) the best flow rate was 40 ml/min.
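    As background to the resolution criterion being optimised above, peak resolution in counter-current chromatography is conventionally computed from retention times and baseline peak widths. The helper below is a generic textbook calculation with made-up example values, not code or data from the study.

    def peak_resolution(t1, t2, w1, w2):
        """Conventional resolution between two peaks:
        Rs = 2 * (t2 - t1) / (w1 + w2),
        with retention times t1 < t2 and baseline peak widths w1, w2 in the same units."""
        return 2.0 * (t2 - t1) / (w1 + w2)

    # Hypothetical example: two partially resolved peaks.
    print(round(peak_resolution(t1=32.0, t2=38.0, w1=4.0, w2=5.0), 2))  # -> 1.33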
