NASA Astrophysics Data System (ADS)
Luo, Bin; Lin, Lin; Zhong, ShiSheng
2018-02-01
In this research, we propose a preference-guided optimisation algorithm for multi-criteria decision-making (MCDM) problems with interval-valued fuzzy preferences. First, the interval-valued fuzzy preferences are decomposed into a series of precise and evenly distributed preference-vectors (reference directions) over the objectives to be optimised, using a uniform design strategy. The preference information is then incorporated into the preference-vectors via the boundary intersection approach, and the MCDM problem with interval-valued fuzzy preferences is reformulated as a series of single-objective optimisation sub-problems (each sub-problem corresponding to a decomposed preference-vector). Finally, a preference-guided optimisation algorithm based on MOEA/D (multi-objective evolutionary algorithm based on decomposition) is proposed to solve the sub-problems in a single run. The proposed algorithm incorporates the preference-vectors within the optimisation process to guide the search towards the more promising subset of efficient solutions matching the interval-valued fuzzy preferences. A large set of test instances and an engineering application are employed to validate the performance of the proposed algorithm, and the results demonstrate its effectiveness and feasibility.
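The decomposition idea above can be illustrated in a few lines. The sketch below is an assumption-laden toy, not the authors' implementation: it spreads weight vectors evenly over a preference interval on the first objective and scalarises a toy bi-objective problem with a penalty boundary intersection (PBI) aggregation; the interval [0.3, 0.7], the test problem and the penalty value theta are all invented for illustration.

```python
# Minimal sketch (not the authors' code): decompose an interval-valued
# preference into evenly spread weight vectors and scalarise each sub-problem
# with a penalty boundary intersection (PBI) function, in the spirit of MOEA/D.
import numpy as np

def preference_vectors(lower, upper, n_vectors):
    """Evenly distributed 2-objective weight vectors whose first component
    is restricted to the preference interval [lower, upper]."""
    w1 = np.linspace(lower, upper, n_vectors)
    return np.stack([w1, 1.0 - w1], axis=1)

def pbi(f, weight, ideal, theta=5.0):
    """Penalty boundary intersection aggregation of an objective vector f."""
    w = weight / np.linalg.norm(weight)
    d1 = np.dot(f - ideal, w)                # distance along the reference direction
    d2 = np.linalg.norm(f - ideal - d1 * w)  # distance away from the direction
    return d1 + theta * d2

# Toy bi-objective problem: minimise f1 = x^2, f2 = (x - 2)^2 over x in [0, 2].
def objectives(x):
    return np.array([x**2, (x - 2.0)**2])

ideal = np.zeros(2)
for w in preference_vectors(0.3, 0.7, 5):    # interval preference on f1 (assumed)
    xs = np.linspace(0.0, 2.0, 2001)
    best = min(xs, key=lambda x: pbi(objectives(x), w, ideal))
    print(f"weights {w}, solution x = {best:.3f}, objectives = {objectives(best)}")
```

Each weight vector defines one single-objective sub-problem, which is what a MOEA/D-style solver would then optimise within a single run.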
Bonmati, Ester; Hu, Yipeng; Gibson, Eli; Uribarri, Laura; Keane, Geri; Gurusami, Kurinchi; Davidson, Brian; Pereira, Stephen P; Clarkson, Matthew J; Barratt, Dean C
2018-06-01
Navigation of endoscopic ultrasound (EUS)-guided procedures of the upper gastrointestinal (GI) system can be technically challenging due to the small fields-of-view of ultrasound and optical devices, as well as the anatomical variability and limited number of orienting landmarks during navigation. Co-registration of an EUS device and a pre-procedure 3D image can enhance the ability to navigate. However, the fidelity of this contextual information depends on the accuracy of registration. The purpose of this study was to develop and test the feasibility of a simulation-based planning method for pre-selecting patient-specific EUS-visible anatomical landmark locations to maximise the accuracy and robustness of a feature-based multimodality registration method. A registration approach was adopted in which landmarks are registered to anatomical structures segmented from the pre-procedure volume. The predicted target registration errors (TREs) of EUS-CT registration were estimated using simulated visible anatomical landmarks and a Monte Carlo simulation of landmark localisation error. The optimal planes were selected based on the 90th percentile of TREs, which provide a robust and more accurate EUS-CT registration initialisation. The method was evaluated by comparing the accuracy and robustness of registrations initialised using optimised planes versus non-optimised planes using manually segmented CT images and simulated ([Formula: see text]) or retrospective clinical ([Formula: see text]) EUS landmarks. The results show a lower 90th percentile TRE when registration is initialised using the optimised planes compared with a non-optimised initialisation approach (p value [Formula: see text]). The proposed simulation-based method to find optimised EUS planes and landmarks for EUS-guided procedures may have the potential to improve registration accuracy. Further work will investigate applying the technique in a clinical setting.
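For readers unfamiliar with how a predicted TRE is estimated, the following toy Monte Carlo (not the study's pipeline, which selects imaging planes and registers to segmented anatomy) perturbs a handful of hypothetical landmark positions with Gaussian localisation error, refits a rigid registration each time, and reports the 90th percentile of the resulting target registration error; coordinates, noise level and target location are invented.

```python
# Illustrative sketch only: estimate a target registration error (TRE)
# distribution by Monte Carlo perturbation of landmark positions.
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src points onto dst."""
    c_src, c_dst = src.mean(0), dst.mean(0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, c_dst - R @ c_src

rng = np.random.default_rng(0)
landmarks = np.array([[0., 0., 0.], [40., 0., 0.], [0., 30., 0.], [10., 10., 25.]])
target = np.array([20., 15., 40.])     # hypothetical clinical target (mm)
sigma = 2.0                            # assumed landmark localisation error, mm (1 s.d.)

tres = []
for _ in range(5000):
    noisy = landmarks + rng.normal(0.0, sigma, landmarks.shape)
    R, t = rigid_fit(noisy, landmarks)          # registration estimated from noisy picks
    tres.append(np.linalg.norm((R @ target + t) - target))
print("90th percentile TRE: %.2f mm" % np.percentile(tres, 90))
```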
Medicines optimisation: priorities and challenges.
Kaufman, Gerri
2016-03-23
Medicines optimisation is promoted in a guideline published in 2015 by the National Institute for Health and Care Excellence. Four guiding principles underpin medicines optimisation: aim to understand the patient's experience; ensure evidence-based choice of medicines; ensure medicines use is as safe as possible; and make medicines optimisation part of routine practice. Understanding the patient experience is important to improve adherence to medication regimens. This involves communication, shared decision making and respect for patient preferences. Evidence-based choice of medicines is important for clinical and cost effectiveness. Systems and processes for the reporting of medicines-related safety incidents have to be improved if medicines use is to be as safe as possible. Ensuring safe practice in medicines use when patients are transferred between organisations, and managing the complexities of polypharmacy are imperative. A medicines use review can help to ensure that medicines optimisation forms part of routine practice.
Kievit, Wietske; van Herwaarden, Noortje; van den Hoogen, Frank Hj; van Vollenhoven, Ronald F; Bijlsma, Johannes Wj; van den Bemt, Bart Jf; van der Maas, Aatke; den Broeder, Alfons A
2016-11-01
A disease activity-guided dose optimisation strategy of adalimumab or etanercept (tumour necrosis factor inhibitors, TNFi) has been shown to be non-inferior in maintaining disease control in patients with rheumatoid arthritis (RA) compared with usual care. However, the cost-effectiveness of this strategy is still unknown. This is a preplanned cost-effectiveness analysis of the Dose REduction Strategy of Subcutaneous TNF inhibitors (DRESS) study, a randomised controlled, open-label, non-inferiority trial performed in two Dutch rheumatology outpatient clinics. Patients with low disease activity using TNF inhibitors were included. Total healthcare costs were measured and quality adjusted life years (QALY) were based on EQ5D utility scores. Decremental cost-effectiveness analyses were performed using bootstrap analyses; incremental net monetary benefit (iNMB) was used to express cost-effectiveness. 180 patients were included: 121 were allocated to the dose optimisation strategy and 59 to control. The dose optimisation strategy resulted in a mean cost saving of -€12 280 (95th percentile -€10 502; -€14 104) per patient per 18 months. There is an 84% chance that the dose optimisation strategy results in a QALY loss, with a mean QALY loss of -0.02 (-0.07 to 0.02). The decremental cost-effectiveness ratio (DCER) was €390 493 (€5 085 184; dominant) of savings per QALY lost. The mean iNMB was €10 467 (€6553-€14 037). Sensitivity analyses using 30% and 50% lower TNFi prices showed the strategy remained cost-effective. Disease activity-guided dose optimisation of TNFi results in considerable cost savings, with no relevant loss of quality of life observed. When the minimal QALY loss is compensated with the upper limit of what society is willing to pay or accept in the Netherlands, the net savings are still high. NTR3216; Post-results.
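As a rough illustration of the reported economics, the sketch below bootstraps an incremental net monetary benefit, iNMB = λ·ΔQALY − Δcost, from synthetic patient-level data; the arm sizes echo the trial, but the cost and QALY distributions and the €80,000 threshold are assumptions, not trial data.

```python
# Minimal sketch (not the trial's analysis code) of a bootstrapped incremental
# net monetary benefit for a cost-saving strategy with a possible small QALY loss.
import numpy as np

rng = np.random.default_rng(1)
lam = 80_000.0                                   # EUR per QALY (assumed threshold)

# Synthetic per-patient 18-month costs (EUR) and QALYs for the two arms.
cost_opt = rng.normal(8_000, 3_000, 121);  qaly_opt = rng.normal(1.10, 0.15, 121)
cost_ctrl = rng.normal(20_000, 3_000, 59); qaly_ctrl = rng.normal(1.12, 0.15, 59)

def inmb(co, qo, cc, qc):
    return lam * (qo.mean() - qc.mean()) - (co.mean() - cc.mean())

boot = []
for _ in range(5000):                            # nonparametric bootstrap over patients
    io = rng.integers(0, len(cost_opt), len(cost_opt))
    ic = rng.integers(0, len(cost_ctrl), len(cost_ctrl))
    boot.append(inmb(cost_opt[io], qaly_opt[io], cost_ctrl[ic], qaly_ctrl[ic]))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"mean iNMB = {np.mean(boot):,.0f} EUR (95% CI {lo:,.0f} to {hi:,.0f})")
```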
Improving Vector Evaluated Particle Swarm Optimisation by Incorporating Nondominated Solutions
Lim, Kian Sheng; Ibrahim, Zuwairie; Buyamin, Salinda; Ahmad, Anita; Naim, Faradila; Ghazali, Kamarul Hawari; Mokhtar, Norrima
2013-01-01
The Vector Evaluated Particle Swarm Optimisation algorithm is widely used to solve multiobjective optimisation problems. This algorithm optimises one objective using a swarm of particles whose movements are guided by the best solution found by another swarm. However, the best solution of a swarm is only updated when a newly generated solution has better fitness than the best solution at the objective function optimised by that swarm, which can yield poor solutions to multiobjective optimisation problems. Thus, an improved Vector Evaluated Particle Swarm Optimisation algorithm is introduced that incorporates nondominated solutions as the guidance for a swarm rather than using the best solution from another swarm. In this paper, the performance of the improved Vector Evaluated Particle Swarm Optimisation algorithm is investigated using performance measures such as the number of nondominated solutions found, the generational distance, the spread, and the hypervolume. The results suggest that the improved Vector Evaluated Particle Swarm Optimisation algorithm has impressive performance compared with the conventional Vector Evaluated Particle Swarm Optimisation algorithm. PMID:23737718
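A compact, assumption-heavy sketch of the modification described above: two swarms each minimise one objective of a standard bi-objective test function, and each particle's social term points at a randomly chosen member of a shared nondominated archive instead of the other swarm's single best. All parameter values are illustrative.

```python
# Rough sketch, not the published algorithm: two-swarm VEPSO-style optimiser
# guided by a shared nondominated archive.
import numpy as np

rng = np.random.default_rng(2)

def objectives(x):                      # Schaffer N.1, a standard bi-objective test
    return np.array([x[0]**2, (x[0] - 2.0)**2])

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

def update_archive(archive, cand):
    """Keep only mutually nondominated (position, objective) pairs."""
    x, f = cand
    archive = [(xa, fa) for xa, fa in archive if not dominates(f, fa)]
    if not any(dominates(fa, f) for _, fa in archive):
        archive.append((x.copy(), f))
    return archive

dim, n, iters, lo, hi = 1, 20, 100, -5.0, 5.0
w, c1, c2 = 0.6, 1.5, 1.5
pos = [rng.uniform(lo, hi, (n, dim)) for _ in range(2)]    # one swarm per objective
vel = [np.zeros((n, dim)) for _ in range(2)]
pbest = [p.copy() for p in pos]
pbest_f = [np.array([objectives(x) for x in p]) for p in pos]
archive = []
for s in range(2):
    for i in range(n):
        archive = update_archive(archive, (pos[s][i], pbest_f[s][i]))

for _ in range(iters):
    for s in range(2):                                     # swarm s minimises objective s
        for i in range(n):
            guide = archive[rng.integers(len(archive))][0]  # nondominated guidance
            r1, r2 = rng.random(dim), rng.random(dim)
            vel[s][i] = (w * vel[s][i] + c1 * r1 * (pbest[s][i] - pos[s][i])
                         + c2 * r2 * (guide - pos[s][i]))
            pos[s][i] = np.clip(pos[s][i] + vel[s][i], lo, hi)
            f = objectives(pos[s][i])
            if f[s] < pbest_f[s][i][s]:                     # per-objective personal best
                pbest[s][i], pbest_f[s][i] = pos[s][i].copy(), f
            archive = update_archive(archive, (pos[s][i], f))

print("nondominated solutions found:", len(archive))
```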
NASA Astrophysics Data System (ADS)
Rebelo Kornmeier, Joana; Ostermann, Andreas; Hofmann, Michael; Gibmeier, Jens
2014-02-01
Neutron strain diffractometers usually use slits to define a gauge volume within engineering samples. In this study, a multi-channel parabolic neutron guide was developed to be used instead of the primary slit, to minimise the loss of intensity and of vertical definition of the gauge volume that occurs when slits are placed far away from the measurement position in bulky components. The major advantage of a focusing guide is that the maximum flux is not at the exit of the guide, as for a slit system, but at a focal point relatively far from the guide exit. Monte Carlo simulations were used to optimise the multi-channel parabolic guide with respect to the instrument characteristics of the STRESS-SPEC diffractometer at the FRM II neutron source. The simulations are also in excellent agreement with experimental measurements using the optimised multi-channel parabolic guide at the diffractometer. In addition, the performance of the guide was compared with the standard slit setup at STRESS-SPEC using a single-bead weld sample employed in earlier round robin tests for residual strain measurements.
NASA Astrophysics Data System (ADS)
Wang, Qianren; Chen, Xing; Yin, Yuehong; Lu, Jian
2017-08-01
With the increasing complexity of mechatronic products, traditional empirical or step-by-step design methods are facing great challenges, as various factors and different stages have inevitably become coupled during the design process. Management of massive information, or big data, as well as the efficient operation of information flow, is deeply involved in the coupled design process. Designers have to address increasingly sophisticated situations when coupled optimisation is also engaged. Aiming to overcome these difficulties in the design of the spindle box system of an ultra-precision optical grinding machine, this paper proposes a coupled optimisation design method based on state-space analysis, with the design knowledge represented by ontologies and their semantic networks. An electromechanical coupled model integrating the mechanical structure, control system and motor driving system is established, mainly concerning the stiffness matrices of the hydrostatic bearings, ball screw nut and rolling guide sliders. The effectiveness and precision of the method are validated by simulation results for the natural frequency and deformation of the spindle box when an impact force is applied to the grinding wheel.
Scarborough, Peter; Kaur, Asha; Cobiac, Linda; Owens, Paul; Parlesak, Alexandr; Sweeney, Kate; Rayner, Mike
2016-12-21
Objective: to model food group consumption and the price of a diet that achieves UK dietary recommendations while deviating as little as possible from the current UK diet, in order to support the redevelopment of the UK food-based dietary guidelines (now called the Eatwell Guide). Design: optimisation modelling, minimising an objective function of the difference between population mean modelled and current consumption of 125 food groups, with constraints representing nutrient and food-based recommendations. Setting: the UK. Participants: adults aged 19 years and above from the National Diet and Nutrition Survey 2008-2011. Main outcome measures: proportion of the diet consisting of major food groups and the price of the optimised diet. Results: the optimised diet has an increase in consumption of 'potatoes, bread, rice, pasta and other starchy carbohydrates' (+69%) and 'fruit and vegetables' (+54%) and reductions in consumption of 'beans, pulses, fish, eggs, meat and other proteins' (-24%), 'dairy and alternatives' (-21%) and 'foods high in fat and sugar' (-53%). Results within food groups show considerable variety (eg, +90% for beans and pulses, -78% for red meat). The modelled diet would cost £5.99 (£5.93 to £6.05) per adult per day, very similar to the cost of the current diet: £6.02 (£5.96 to £6.08). The optimised diet would result in increased consumption of n-3 fatty acids and most micronutrients (including iron and folate), but decreased consumption of zinc and small decreases in consumption of calcium and riboflavin. Conclusions: achieving the UK dietary recommendations would require large changes in the average diet of UK adults, including in food groups where current average consumption is well within the recommended range (eg, processed meat) or where there are no current recommendations (eg, dairy). These large changes in the diet would not lead to significant changes in its price.
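The modelling approach lends itself to a very small worked example. The sketch below is not the Eatwell optimisation model; it minimises the squared relative deviation from a made-up 'current' intake of four food groups subject to two invented linear nutrient constraints, just to show the structure of the problem.

```python
# Illustrative only: minimise deviation from current mean consumption of a few
# food groups subject to linear nutrient constraints.  All numbers are invented.
import numpy as np
from scipy.optimize import minimize

groups = ["starchy", "fruit_veg", "protein_foods", "high_fat_sugar"]
current = np.array([250.0, 280.0, 220.0, 180.0])         # g/day, illustrative

# Rows: fibre (g per 100 g), free sugars (g per 100 g) -- invented values.
nutrients = np.array([[2.5, 2.8, 1.0, 0.8],
                      [1.0, 6.0, 1.0, 30.0]]) / 100.0
fibre_min, sugar_max = 30.0, 60.0                         # illustrative daily targets

def deviation(x):                                         # objective: stay close to current diet
    return np.sum(((x - current) / current) ** 2)

cons = [{"type": "ineq", "fun": lambda x: nutrients[0] @ x - fibre_min},
        {"type": "ineq", "fun": lambda x: sugar_max - nutrients[1] @ x}]
res = minimize(deviation, current, constraints=cons,
               bounds=[(0.0, None)] * len(groups), method="SLSQP")

for g, old, new in zip(groups, current, res.x):
    print(f"{g:>15}: {old:6.0f} g/day -> {new:6.0f} g/day ({100*(new-old)/old:+.0f}%)")
```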
Improving Vector Evaluated Particle Swarm Optimisation Using Multiple Nondominated Leaders
Lim, Kian Sheng; Buyamin, Salinda; Ahmad, Anita; Shapiai, Mohd Ibrahim; Naim, Faradila; Mubin, Marizan; Kim, Dong Hwa
2014-01-01
The vector evaluated particle swarm optimisation (VEPSO) algorithm was previously improved by incorporating nondominated solutions for solving multiobjective optimisation problems. However, the obtained solutions did not converge close to the Pareto front and also did not distribute evenly over the Pareto front. Therefore, in this study, the concept of multiple nondominated leaders is incorporated to further improve the VEPSO algorithm. Hence, multiple nondominated solutions that are best at a respective objective function are used to guide particles in finding optimal solutions. The improved VEPSO is measured by the number of nondominated solutions found, generational distance, spread, and hypervolume. The results from the conducted experiments show that the proposed VEPSO significantly improved the existing VEPSO algorithms. PMID:24883386
Lévy flight artificial bee colony algorithm
NASA Astrophysics Data System (ADS)
Sharma, Harish; Bansal, Jagdish Chand; Arya, K. V.; Yang, Xin-She
2016-08-01
Artificial bee colony (ABC) optimisation is a relatively simple and recent population-based probabilistic approach for global optimisation. The solution search equation of ABC is significantly influenced by a random quantity, which helps exploration at the cost of exploitation of the search space. Because of its large step sizes, ABC has a high chance of skipping the true solution. In order to balance diversity and convergence in ABC, a Lévy flight inspired search strategy is proposed and integrated with ABC. The proposed strategy, named Lévy Flight ABC (LFABC), has both local and global search capability simultaneously, which is achieved by tuning the Lévy flight parameters and thus automatically adapting the step sizes. In the LFABC, new solutions are generated around the best solution, which helps to enhance the exploitation capability of ABC. Furthermore, to improve the exploration capability, the number of scout bees is increased. Experiments on 20 test problems of different complexities and five real-world engineering optimisation problems show that the proposed strategy outperforms the basic ABC and recent ABC variants, namely Gbest-guided ABC, best-so-far ABC and modified ABC, in most of the experiments.
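A minimal sketch of the Lévy-flight step idea is shown below (not the authors' LFABC code): step lengths are drawn with Mantegna's algorithm and used to perturb food sources towards the current best solution, with greedy selection as in the basic ABC; the sphere test function, colony size and β = 1.5 are illustrative choices.

```python
# Sketch under stated assumptions: Levy-flight steps (Mantegna's algorithm)
# driving an ABC-like exploitation phase around the best solution.
import numpy as np
from math import gamma, sin, pi

rng = np.random.default_rng(3)

def levy_step(beta, size):
    """Mantegna's algorithm for symmetric Levy-stable step lengths."""
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, size)
    v = rng.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)

def sphere(x):
    return float(np.sum(x ** 2))

dim, n_food, iters, beta = 5, 20, 500, 1.5
foods = rng.uniform(-5.0, 5.0, (n_food, dim))
fitness = np.array([sphere(x) for x in foods])

for _ in range(iters):
    best = foods[np.argmin(fitness)]
    for i in range(n_food):
        # Candidate generated around the best solution with a Levy step:
        # occasional long jumps keep exploration, small steps refine locally.
        cand = foods[i] + 0.01 * levy_step(beta, dim) * (best - foods[i])
        f = sphere(cand)
        if f < fitness[i]:                       # greedy selection as in ABC
            foods[i], fitness[i] = cand, f

print("best fitness after search: %.3e" % fitness.min())
```

Because the Lévy distribution is heavy-tailed, most steps are small (local refinement) while occasional long jumps preserve exploration, which is the balance the abstract describes.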
McStas event logger: Definition and applications
NASA Astrophysics Data System (ADS)
Bergbäck Knudsen, Erik; Bryndt Klinkby, Esben; Kjær Willendrup, Peter
2014-02-01
Functionality is added to the McStas neutron ray-tracing code, which allows individual neutron states before and after a scattering to be temporarily stored, and analysed. This logging mechanism has multiple uses, including studies of longitudinal intensity loss in neutron guides and guide coating design optimisations. Furthermore, the logging method enables the cold/thermal neutron induced gamma background along the guide to be calculated from the un-reflected neutron, using a recently developed MCNPX-McStas interface.
Optimisation of GaN LEDs and the reduction of efficiency droop using active machine learning
Rouet-Leduc, Bertrand; Barros, Kipton Marcos; Lookman, Turab; ...
2016-04-26
A fundamental challenge in the design of LEDs is to maximise electro-luminescence efficiency at high current densities. We simulate GaN-based LED structures that delay the onset of efficiency droop by spreading carrier concentrations evenly across the active region. Statistical analysis and machine learning effectively guide the selection of the next LED structure to be examined based upon its expected efficiency as well as model uncertainty. This active learning strategy rapidly constructs a model that predicts Poisson-Schrödinger simulations of devices, and that simultaneously produces structures with higher simulated efficiencies.
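The active-learning loop can be sketched generically as follows, assuming a Gaussian-process surrogate and an upper-confidence-bound acquisition rule; the one-dimensional stand-in "simulator", kernel and exploration weight are invented, whereas the real study drove Poisson-Schrödinger device simulations.

```python
# Illustrative sketch only: an active-learning loop in which a GP surrogate of
# an expensive simulator proposes the next design by trading off predicted
# efficiency against model uncertainty (upper confidence bound).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)

def simulator(x):                      # stand-in for an expensive device simulation
    return np.sin(3.0 * x) * (1.0 - np.tanh(x ** 2)) + 0.01 * rng.normal()

candidates = np.linspace(-2.0, 2.0, 401).reshape(-1, 1)    # design space (assumed)
X = rng.uniform(-2.0, 2.0, (3, 1))                         # small initial design
y = np.array([simulator(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-4)
for it in range(15):
    gp.fit(X, y)
    mu, std = gp.predict(candidates, return_std=True)
    acq = mu + 2.0 * std                                   # explore where uncertain
    x_next = candidates[np.argmax(acq)]
    X = np.vstack([X, x_next])
    y = np.append(y, simulator(x_next[0]))

print("best design found: x = %.3f, simulated efficiency = %.3f"
      % (float(X[np.argmax(y), 0]), float(y.max())))
```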
Deed, Gary; Barlow, John; Kawol, Dev; Kilov, Gary; Sharma, Anita; Hwa, Liew Yu
2015-05-01
Guidelines for the prevention and management of type 2 diabetes mellitus (T2DM) reinforce lifestyle management, yet advice to guide general practitioners on principles around dietary choices is needed. This article provides current evidence regarding the differing diets in diabetes prevention and management once T2DM arises, including the role in management of complications such as hypoglycaemia. Diets should incorporate weight maintenance or loss, while complementing changes in physical activity to optimise the metabolic effects of dietary advice. Using a structured, team-care approach supports pragmatic and sustainable individualised plans, while incorporating current evidence-based dietary approaches.
O'Neill, Taryn E; Li, Haoxin; Colquhoun, Caitlyn D; Johnson, John A; Webster, Duncan; Gray, Christopher A
2014-01-01
Because of increased resistance to current drugs, there is an urgent need to discover new anti-mycobacterial compounds for the development of novel anti-tuberculosis drugs. The microplate resazurin assay (MRA) is commonly used to evaluate natural products and synthetic compounds for anti-mycobacterial activity. However, the assay can be problematic and unreliable when screening methanolic phytochemical extracts. To optimise the MRA for the screening and bioassay-guided fractionation of phytochemical extracts using Mycobacterium tuberculosis H37Ra. The effects of varying assay duration, resazurin solution composition, solvent (dimethyl sulphoxide - DMSO) concentration and type of microtitre plate used on the results and reliability of the MRA were investigated. The optimal bioassay protocol was applied to methanolic extracts of medicinal plants that have been reported to possess anti-mycobacterial activity. The variables investigated were found to have significant effects on the results obtained with the MRA. A standardised procedure that can reliably quantify anti-mycobacterial activity of phytochemical extracts in as little as 48 h was identified. The optimised MRA uses 2% aqueous DMSO, with an indicator solution of 62.5 µg/mL resazurin in 5% aqueous Tween 80 over 96 h incubation. The study has identified an optimal procedure for the MRA when used with M. tuberculosis H37Ra that gives rapid, reliable and consistent results. The assay procedure has been used successfully for the screening and bioassay-guided fractionation of anti-mycobacterial compounds from methanol extracts of Canadian medicinal plants. Copyright © 2014 John Wiley & Sons, Ltd.
Pichardo, Samuel; Köhler, Max; Lee, Justin; Hynnyen, Kullervo
2014-12-01
In this in vivo study, the feasibility of performing hyperthermia treatments in the head and neck using magnetic resonance image-guided high intensity focused ultrasound (MRgHIFU) was established using a porcine acute model. Porcine specimens weighing between 17 and 18 kg were treated in the omohyoid muscle in the neck. Hyperthermia was applied with a target temperature of 41 °C for 30 min using a Sonalleve MRgHIFU system. MR-based thermometry was calculated using the water-proton resonance frequency shift and multi-baseline look-up tables indexed by peak-to-peak displacement (Dpp) measurements from a pencil-beam navigator. Three hyperthermia experiments were conducted at different Dpp values of 0.2, 1.0 and 3.0 mm. An optimisation study was carried out to establish the parameters controlling the multi-baseline method that minimise the spatial-average peak-to-peak temperature (TSA-pp) and the temperature direct-current bias (TSA-DC). The multi-baseline technique considerably reduced the noise in both TSA-pp and TSA-DC, and the noise reduction was greater when Dpp was higher. For Dpp = 3 mm the average (±standard deviation (SD)) of TSA-pp and TSA-DC was reduced from 4.5 (±2.5) and 2.5 (±0.6) °C, respectively, to 0.8 (±0.7) and 0.09 (±0.2) °C. This in vivo study showed the level of noise in PRFS-based thermometry introduced by respiratory motion in the context of MRgHIFU hyperthermia treatment for the head and neck, and the feasibility of reducing this noise using a multi-baseline technique.
Davies, Huw Ob; Popplewell, Matthew; Darvall, Katy; Bate, Gareth; Bradbury, Andrew W
2016-05-01
The last 10 years have seen the introduction into everyday clinical practice of a wide range of novel non-surgical treatments for varicose veins. In July 2013, the UK National Institute for Health and Care Excellence recommended the following treatment hierarchy for varicose veins: endothermal ablation, ultrasound-guided foam sclerotherapy, surgery and compression hosiery. The aim of this paper is to review the randomised controlled trials that have compared endothermal ablation and ultrasound-guided foam sclerotherapy to determine if the level 1 evidence base still supports an "endothermal ablation first" strategy for the treatment of varicose veins. A PubMed and OVID literature search (until 31 January 2015) was performed and randomised controlled trials comparing endothermal ablation and ultrasound-guided foam sclerotherapy were obtained. Although anatomical success appeared higher with endothermal ablation than ultrasound-guided foam sclerotherapy, clinical success and patient-reported outcome measures were similar. Morbidity and complication rates were very low and not significantly different between endothermal ablation and ultrasound-guided foam sclerotherapy. Ultrasound-guided foam sclerotherapy was consistently less expensive than endothermal ablation. All endovenous modalities appear to be successful and have a role in modern day practice. Although further work is required to optimise the ultrasound-guided foam sclerotherapy technique to maximise anatomical success and minimise retreatment, the present level 1 evidence base shows there is no significant difference in clinically important outcomes between ultrasound-guided foam sclerotherapy and endothermal ablation. As ultrasound-guided foam sclerotherapy is less expensive, it is likely to be a more cost-effective option in most patients in most healthcare settings. Strict adherence to the treatment hierarchy recommended by the National Institute for Health and Care Excellence seems unjustified. © The Author(s) 2015.
Advantages of Task-Specific Multi-Objective Optimisation in Evolutionary Robotics.
Trianni, Vito; López-Ibáñez, Manuel
2015-01-01
The application of multi-objective optimisation to evolutionary robotics is receiving increasing attention. A survey of the literature reveals the different possibilities it offers to improve the automatic design of efficient and adaptive robotic systems, and points to the successful demonstrations available for both task-specific and task-agnostic approaches (i.e., with or without reference to the specific design problem to be tackled). However, the advantages of multi-objective approaches over single-objective ones have not been clearly spelled out and experimentally demonstrated. This paper fills this gap for task-specific approaches: starting from well-known results in multi-objective optimisation, we discuss how to tackle commonly recognised problems in evolutionary robotics. In particular, we show that multi-objective optimisation (i) allows evolving a more varied set of behaviours by exploring multiple trade-offs of the objectives to optimise, (ii) supports the evolution of the desired behaviour through the introduction of objectives as proxies, (iii) avoids the premature convergence to local optima possibly introduced by multi-component fitness functions, and (iv) solves the bootstrap problem exploiting ancillary objectives to guide evolution in the early phases. We present an experimental demonstration of these benefits in three different case studies: maze navigation in a single robot domain, flocking in a swarm robotics context, and a strictly collaborative task in collective robotics.
Prosperi, Mattia C. F.; Rosen-Zvi, Michal; Altmann, André; Zazzi, Maurizio; Di Giambenedetto, Simona; Kaiser, Rolf; Schülter, Eugen; Struck, Daniel; Sloot, Peter; van de Vijver, David A.; Vandamme, Anne-Mieke; Sönnerborg, Anders
2010-01-01
Background Although genotypic resistance testing (GRT) is recommended to guide combination antiretroviral therapy (cART), funding and/or facilities to perform GRT may not be available in low to middle income countries. Since treatment history (TH) impacts response to subsequent therapy, we investigated a set of statistical learning models to optimise cART in the absence of GRT information. Methods and Findings The EuResist database was used to extract 8-week and 24-week treatment change episodes (TCE) with GRT and additional clinical, demographic and TH information. Random Forest (RF) classification was used to predict 8- and 24-week success, defined as undetectable HIV-1 RNA, comparing nested models including (i) GRT+TH and (ii) TH without GRT, using multiple cross-validation and area under the receiver operating characteristic curve (AUC). Virological success was achieved in 68.2% and 68.0% of TCE at 8- and 24-weeks (n = 2,831 and 2,579), respectively. RF (i) and (ii) showed comparable performances, with an average (st.dev.) AUC 0.77 (0.031) vs. 0.757 (0.035) at 8-weeks, 0.834 (0.027) vs. 0.821 (0.025) at 24-weeks. Sensitivity analyses, carried out on a data subset that included antiretroviral regimens commonly used in low to middle income countries, confirmed our findings. Training on subtype B and validation on non-B isolates resulted in a decline of performance for models (i) and (ii). Conclusions Treatment history-based RF prediction models are comparable to GRT-based for classification of virological outcome. These results may be relevant for therapy optimisation in areas where availability of GRT is limited. Further investigations are required in order to account for different demographics, subtypes and different therapy switching strategies. PMID:21060792
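The nested-model comparison can be mimicked on synthetic data as below (schematic only, not the EuResist pipeline): a random forest is scored by cross-validated AUC with treatment-history features alone and with genotype features added.

```python
# Schematic sketch: compare nested random-forest models (treatment history only
# vs. treatment history + genotype) by cross-validated AUC on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 2000
th = rng.normal(size=(n, 10))          # stand-in treatment-history covariates
grt = rng.normal(size=(n, 20))         # stand-in genotype (GRT) covariates
logit = th[:, :3].sum(1) + 0.5 * grt[:, :5].sum(1)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)   # virological success

rf = RandomForestClassifier(n_estimators=300, random_state=0)
auc_th = cross_val_score(rf, th, y, cv=5, scoring="roc_auc")
auc_full = cross_val_score(rf, np.hstack([th, grt]), y, cv=5, scoring="roc_auc")
print(f"TH only   AUC: {auc_th.mean():.3f} ({auc_th.std():.3f})")
print(f"TH + GRT  AUC: {auc_full.mean():.3f} ({auc_full.std():.3f})")
```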
Computer-based teaching module design: principles derived from learning theories.
Lau, K H Vincent
2014-03-01
The computer-based teaching module (CBTM), which has recently gained prominence in medical education, is a teaching format in which a multimedia program serves as a single source for knowledge acquisition rather than playing an adjunctive role as it does in computer-assisted learning (CAL). Despite empirical validation in the past decade, there is limited research into the optimisation of CBTM design. This review aims to summarise research in classic and modern multimedia-specific learning theories applied to computer learning, and to collapse the findings into a set of design principles to guide the development of CBTMs. Scopus was searched for: (i) studies of classic cognitivism, constructivism and behaviourism theories (search terms: 'cognitive theory' OR 'constructivism theory' OR 'behaviourism theory' AND 'e-learning' OR 'web-based learning') and their sub-theories applied to computer learning, and (ii) recent studies of modern learning theories applied to computer learning (search terms: 'learning theory' AND 'e-learning' OR 'web-based learning') for articles published between 1990 and 2012. The first search identified 29 studies, dominated in topic by the cognitive load, elaboration and scaffolding theories. The second search identified 139 studies, with diverse topics in connectivism, discovery and technical scaffolding. Based on their relative representation in the literature, the applications of these theories were collapsed into a list of CBTM design principles. Ten principles were identified and categorised into three levels of design: the global level (managing objectives, framing, minimising technical load); the rhetoric level (optimising modality, making modality explicit, scaffolding, elaboration, spaced repeating), and the detail level (managing text, managing devices). This review examined the literature in the application of learning theories to CAL to develop a set of principles that guide CBTM design. Further research will enable educators to take advantage of this unique teaching format as it gains increasing importance in medical education. © 2014 John Wiley & Sons Ltd.
KleinJan, Gijs H; van den Berg, Nynke S; Brouwer, Oscar R; de Jong, Jeroen; Acar, Cenk; Wit, Esther M; Vegt, Erik; van der Noort, Vincent; Valdés Olmos, Renato A; van Leeuwen, Fijs W B; van der Poel, Henk G
2014-12-01
The hybrid tracer was introduced to complement intraoperative radiotracing towards the sentinel nodes (SNs) with fluorescence guidance. Improve in vivo fluorescence-based SN identification for prostate cancer by optimising hybrid tracer preparation, injection technique, and fluorescence imaging hardware. Forty patients with a Briganti nomogram-based risk >10% of lymph node (LN) metastases were included. After intraprostatic tracer injection, SN mapping was performed (lymphoscintigraphy and single-photon emission computed tomography with computed tomography (SPECT-CT)). In groups 1 and 2, SNs were pursued intraoperatively using a laparoscopic gamma probe followed by fluorescence imaging (FI). In group 3, SNs were initially located via FI. Compared with group 1, in groups 2 and 3, a new tracer formulation was introduced that had a reduced total injected volume (2.0 ml vs. 3.2 ml) but increased particle concentration. For groups 1 and 2, the Tricam SLII with D-Light C laparoscopic FI (LFI) system was used. In group 3, the LFI system was upgraded to an Image 1 HUB HD with D-Light P system. Hybrid tracer-based SN biopsy, extended pelvic lymph node dissection, and robot-assisted radical prostatectomy. Number and location of the preoperatively identified SNs, in vivo fluorescence-based SN identification rate, tumour status of SNs and LNs, postoperative complications, and biochemical recurrence (BCR). Mean fluorescence-based SN identification improved from 63.7% (group 1) to 85.2% and 93.5% for groups 2 and 3, respectively (p=0.012). No differences in postoperative complications were found. BCR occurred in three pN0 patients. Stepwise optimisation of the hybrid tracer formulation and the LFI system led to a significant improvement in fluorescence-assisted SN identification. Preoperative SPECT-CT remained essential for guiding intraoperative SN localisation. Intraoperative fluorescence-based SN visualisation can be improved by enhancing the hybrid tracer formulation and laparoscopic fluorescence imaging system. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.
Lu, Jia-Yang; Cheung, Michael Lok-Man; Huang, Bao-Tian; Wu, Li-Li; Xie, Wen-Jia; Chen, Zhi-Jian; Li, De-Rui; Xie, Liang-Xi
2015-01-01
To assess the performance of a simple optimisation method for improving target coverage and organ-at-risk (OAR) sparing in intensity-modulated radiotherapy (IMRT) for cervical oesophageal cancer. For 20 selected patients, clinically acceptable original IMRT plans (Original plans) were created, and two optimisation methods were adopted to improve the plans: 1) a base dose function (BDF)-based method, in which the treatment plans were re-optimised based on the original plans, and 2) a dose-controlling structure (DCS)-based method, in which the original plans were re-optimised by assigning additional constraints for hot and cold spots. The Original, BDF-based and DCS-based plans were compared with regard to target dose homogeneity, conformity, OAR sparing, planning time and monitor units (MUs). Dosimetric verifications were performed and delivery times were recorded for the BDF-based and DCS-based plans. The BDF-based plans provided significantly superior dose homogeneity and conformity compared with both the DCS-based and Original plans. The BDF-based method further reduced the doses delivered to the OARs by approximately 1-3%. The re-optimisation time was reduced by approximately 28%, but the MUs and delivery time were slightly increased. All verification tests were passed and no significant differences were found. The BDF-based method for the optimisation of IMRT for cervical oesophageal cancer can achieve significantly better dose distributions with better planning efficiency at the expense of slightly more MUs.
Honeybee economics: optimisation of foraging in a variable world.
Stabentheiner, Anton; Kovac, Helmut
2016-06-20
In honeybees, fast and efficient exploitation of nectar and pollen sources is achieved by persistent endothermy throughout the foraging cycle, which entails extremely high energy costs. The need for food promotes maximisation of the intake rate, and the high costs call for energetic optimisation. Experiments on how honeybees resolve this conflict have to consider that foraging takes place in a variable environment with respect to microclimate and food quality and availability. Here we report, from simultaneous measurements of energy costs, gains, intake rate and efficiency, how honeybee foragers manage this challenge in their highly variable environment. If possible, during unlimited sucrose flow, they follow an 'investment-guided' ('time is honey') economic strategy promising increased returns. They maximise net intake rate by investing both their own heat production and solar heat to increase body temperature to a level which guarantees a high suction velocity. They switch to an 'economizing' ('save the honey') optimisation of energetic efficiency if the intake rate is restricted by the food source, when an increased body temperature would not guarantee a high intake rate. With this flexible and graded change between economic strategies, honeybees can both maximise colony intake rate and optimise foraging efficiency in reaction to environmental variation.
Gordon, G T; McCann, B P
2015-01-01
This paper describes the basis of a stakeholder-based sustainable optimisation indicator (SOI) system to be developed for small-to-medium sized activated sludge (AS) wastewater treatment plants (WwTPs) in the Republic of Ireland (ROI). Key technical publications relating to best practice plant operation, performance audits and optimisation, and indicator and benchmarking systems for wastewater services are identified. Optimisation studies were developed at a number of Irish AS WwTPs and key findings are presented. A national AS WwTP manager/operator survey was carried out to verify the applied operational findings and identify the key operator stakeholder requirements for this proposed SOI system. It was found that most plants require more consistent operational data-based decision-making, monitoring and communication structures to facilitate optimised, sustainable and continuous performance improvement. The applied optimisation and stakeholder consultation phases form the basis of the proposed stakeholder-based SOI system. This system will allow for continuous monitoring and rating of plant performance, facilitate optimised operation and encourage the prioritisation of performance improvement through tracking key operational metrics. Plant optimisation has become a major focus due to the transfer of all ROI water services to a national water utility from individual local authorities and the implementation of the EU Water Framework Directive.
NASA Astrophysics Data System (ADS)
Fritzsche, Matthias; Kittel, Konstantin; Blankenburg, Alexander; Vajna, Sándor
2012-08-01
The focus of this paper is to present a method of multidisciplinary design optimisation based on the autogenetic design theory (ADT), which provides methods that are partially implemented in the optimisation software described here. The main thesis of the ADT is that biological evolution and the process of developing products are largely similar, i.e. procedures from biological evolution can be transferred to product development. In order to fulfil requirements and boundary conditions of any kind (which may change at any time), both biological evolution and product development look for appropriate solution possibilities in a certain area and try to optimise those that appear promising by varying parameters and combinations of these solutions. As the time required for multidisciplinary design optimisation is a critical aspect of product development, distributing the optimisation process to make effective use of idle computing capacity can reduce the optimisation time drastically. Finally, a practical example shows how ADT methods and distributed optimisation are applied to improve a product.
Mutual information-based LPI optimisation for radar network
NASA Astrophysics Data System (ADS)
Shi, Chenguang; Zhou, Jianjiang; Wang, Fei; Chen, Jun
2015-07-01
A radar network can offer significant performance improvements for target detection and information extraction by exploiting spatial diversity. For a fixed number of radars, the achievable mutual information (MI) for estimating the target parameters may extend beyond a predefined threshold when transmitting at full power. In this paper, an effective low probability of intercept (LPI) optimisation algorithm is presented to improve the LPI performance of a radar network. Based on the radar network system model, we first adopt the Schleher intercept factor of the network as the optimisation metric for LPI performance. Then, a novel LPI optimisation algorithm is presented in which, for a predefined MI threshold, the Schleher intercept factor is minimised by optimising the transmission power allocation among the radars in the network, so that enhanced LPI performance can be achieved. A genetic algorithm based on nonlinear programming (GA-NP) is employed to solve the resulting nonconvex and nonlinear optimisation problem. Simulations demonstrate that the proposed algorithm is effective in improving the LPI performance of the radar network.
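Stripped of the GA-NP solver and the detailed radar equations, the core optimisation has the shape sketched below: minimise a simple LPI proxy (here total radiated power, standing in for the Schleher intercept factor) over the per-radar power allocation, subject to a minimum mutual-information constraint; channel gains, power limits and the MI threshold are invented.

```python
# Toy sketch of the underlying constrained allocation problem (not the paper's
# GA-NP solver): minimise an LPI proxy subject to sum_i log2(1 + g_i * p_i) >= MI_min.
import numpy as np
from scipy.optimize import minimize

g = np.array([0.8, 1.2, 0.5, 1.0])      # per-radar target-channel gains (assumed)
p_max, mi_min = 10.0, 6.0               # per-radar power limit, MI threshold (bits)

def mutual_information(p):
    return np.sum(np.log2(1.0 + g * p))

res = minimize(fun=lambda p: p.sum(),                   # LPI proxy to minimise
               x0=np.full(len(g), p_max),               # start from full power
               bounds=[(0.0, p_max)] * len(g),
               constraints=[{"type": "ineq",
                             "fun": lambda p: mutual_information(p) - mi_min}],
               method="SLSQP")

print("power allocation:", np.round(res.x, 3))
print("total power: %.3f, achieved MI: %.3f bits"
      % (res.x.sum(), mutual_information(res.x)))
```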
A supportive architecture for CFD-based design optimisation
NASA Astrophysics Data System (ADS)
Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong
2014-03-01
Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has led to a rise in their application in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, comparable to analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. It is therefore desirable to have an integrated architecture for CFD-based design optimisation. However, our review of existing work has found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing techniques and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or at the data level to fully utilise the capabilities of the different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided; the results show that the proposed architecture and the developed algorithms perform successfully and efficiently on a design optimisation problem with over 200 design variables.
Optimisation study of a vehicle bumper subsystem with fuzzy parameters
NASA Astrophysics Data System (ADS)
Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.
2012-10-01
This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. The automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and regarding the environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, this way facilitating the automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system level of failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for the uncertainty treatment in crash simulations. This process is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).
Design Optimisation of a Magnetic Field Based Soft Tactile Sensor
Raske, Nicholas; Kow, Junwai; Alazmani, Ali; Ghajari, Mazdak; Culmer, Peter; Hewson, Robert
2017-01-01
This paper investigates the design optimisation of a magnetic field based soft tactile sensor, comprising a magnet and a Hall effect module separated by an elastomer. The aim was to minimise sensitivity of the output force with respect to the input magnetic field; this was achieved by varying the geometry and material properties. Finite element simulations determined the magnetic field and structural behaviour under load. Genetic programming produced phenomenological expressions describing these responses. Optimisation studies constrained by a measurable force and stable loading conditions were conducted; these produced Pareto sets of designs from which the optimal sensor characteristics were selected. The optimisation demonstrated a compromise between sensitivity and the measurable force; a fabricated version of the optimised sensor validated the improvements made using this methodology. The approach presented can be applied generally to optimising soft tactile sensor designs over a range of applications and sensing modes. PMID:29099787
Optimisation of lateral car dynamics taking into account parameter uncertainties
NASA Astrophysics Data System (ADS)
Busch, Jochen; Bestle, Dieter
2014-02-01
Simulation studies on an active all-wheel-steering car show that disturbances of vehicle parameters have a strong influence on lateral car dynamics. This motivates the need for a design that is robust against such parameter uncertainties. A specific parametrisation is established combining deterministic, velocity-dependent steering control parameters with partly uncertain, velocity-independent vehicle parameters for simultaneous use in a numerical optimisation process. Model-based objectives are formulated and summarised in a multi-objective optimisation problem in which especially the lateral steady-state behaviour is improved by an adaptation strategy based on measurable uncertainties. The normally distributed uncertainties are generated by optimal Latin hypercube sampling, and a response-surface-based strategy helps to cut down time-consuming model evaluations, which makes it possible to use a genetic optimisation algorithm. Optimisation results are discussed in different criterion spaces, and the achieved improvements confirm the validity of the proposed procedure.
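The sampling-plus-surrogate workflow can be condensed into the sketch below (not the authors' code): Latin hypercube samples of two uncertain parameters feed a quadratic response surface, and a population-based optimiser (differential evolution, standing in for the genetic algorithm) then searches the cheap surrogate; the "vehicle model" and parameter ranges are invented.

```python
# Condensed sketch: Latin hypercube sampling -> quadratic response surface ->
# population-based optimisation on the surrogate.  All numbers are illustrative.
import numpy as np
from scipy.stats import qmc
from scipy.optimize import differential_evolution

def expensive_model(x):                 # stand-in for a full vehicle-dynamics run
    return (x[0] - 0.3) ** 2 + 2.0 * (x[1] + 0.1) ** 2 + 0.5 * x[0] * x[1]

bounds = [(-1.0, 1.0), (-1.0, 1.0)]
sampler = qmc.LatinHypercube(d=2, seed=6)
X = qmc.scale(sampler.random(40),
              [b[0] for b in bounds], [b[1] for b in bounds])   # 40 LHS samples
y = np.array([expensive_model(x) for x in X])

def features(X):                        # quadratic response-surface basis
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
surrogate = lambda x: (features(np.atleast_2d(x)) @ coef).item()

res = differential_evolution(surrogate, bounds, seed=6)
print("surrogate optimum:", np.round(res.x, 3),
      "| true model value there: %.4f" % expensive_model(res.x))
```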
Breaking free from chemical spreadsheets.
Segall, Matthew; Champness, Ed; Leeding, Chris; Chisholm, James; Hunt, Peter; Elliott, Alex; Garcia-Martinez, Hector; Foster, Nick; Dowling, Samuel
2015-09-01
Drug discovery scientists often consider compounds and data in terms of groups, such as chemical series, and relationships, representing similarity or structural transformations, to aid compound optimisation. This is often supported by chemoinformatics algorithms, for example clustering and matched molecular pair analysis. However, chemistry software packages commonly present these data as spreadsheets or form views that make it hard to find relevant patterns or compare related compounds conveniently. Here, we review common data visualisation and analysis methods used to extract information from chemistry data. We introduce a new framework that enables scientists to work flexibly with drug discovery data to reflect their thought processes and interact with the output of algorithms to identify key structure-activity relationships and guide further optimisation intuitively. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yadav, Naresh Kumar; Kumar, Mukesh; Gupta, S. K.
2017-03-01
General strategic bidding procedure has been formulated in the literature as a bi-level searching problem, in which the offer curve tends to minimise the market clearing function and to maximise the profit. Computationally, this is complex and hence, the researchers have adopted Karush-Kuhn-Tucker (KKT) optimality conditions to transform the model into a single-level maximisation problem. However, the profit maximisation problem with KKT optimality conditions poses great challenge to the classical optimisation algorithms. The problem has become more complex after the inclusion of transmission constraints. This paper simplifies the profit maximisation problem as a minimisation function, in which the transmission constraints, the operating limits and the ISO market clearing functions are considered with no KKT optimality conditions. The derived function is solved using group search optimiser (GSO), a robust population-based optimisation algorithm. Experimental investigation is carried out on IEEE 14 as well as IEEE 30 bus systems and the performance is compared against differential evolution-based strategic bidding, genetic algorithm-based strategic bidding and particle swarm optimisation-based strategic bidding methods. The simulation results demonstrate that the obtained profit maximisation through GSO-based bidding strategies is higher than the other three methods.
Sheridan, Juliette; Coe, Carol Ann; Doran, Peter; Egan, Laurence; Cullen, Garret; Kevans, David; Leyden, Jan; Galligan, Marie; O’Toole, Aoibhlinn; McCarthy, Jane; Doherty, Glen
2018-01-01
Introduction Ulcerative colitis (UC) is a chronic inflammatory bowel disease (IBD), often leading to an impaired quality of life in affected patients. Current treatment modalities include antitumour necrosis factor (anti-TNF) monoclonal antibodies (mABs) including infliximab, adalimumab and golimumab (GLM). Several recent retrospective and prospective studies have demonstrated that fixed dosing schedules of anti-TNF agents often fail to consistently achieve adequate circulating therapeutic drug levels (DL), with a consequent risk of immunogenicity, treatment failure and potential hospitalisation and colectomy in patients with UC. The design of GLM dose Optimisation to Adequate Levels to Achieve Response in Colitis aims to address the impact of dose escalation of GLM immediately following induction and during the subsequent maintenance phase in response to suboptimal DL or a persisting inflammatory burden as represented by raised faecal calprotectin (FCP). Aim The primary aim of the study is to ascertain whether monitoring of FCP and DL of GLM to guide dose optimisation (during maintenance) improves rates of continuous clinical response and reduces disease activity in UC. Methods and analysis A randomised, multicentre, two-arm trial studying the effect of dose optimisation of GLM based on FCP and DL versus treatment as per the SmPC. Eligible patients will be randomised in a 1:1 ratio to 1 of 2 treatment groups and shall be treated over a period of 46 weeks. Ethics and dissemination The study protocol was approved by the Research Ethics Committee of St. Vincent’s University Hospital. The results will be published in a peer-reviewed journal and shared with the worldwide medical community. Trial registration numbers EudraCT number: 2015-004724-62; Clinicaltrials.gov Identifier: NCT0268772; Pre-results. PMID:29379609
NASA Astrophysics Data System (ADS)
Chu, Xiaoyu; Zhang, Jingrui; Lu, Shan; Zhang, Yao; Sun, Yue
2016-11-01
This paper presents a trajectory planning algorithm to optimise the collision avoidance of a chasing spacecraft operating in an ultra-close proximity to a failed satellite. The complex configuration and the tumbling motion of the failed satellite are considered. The two-spacecraft rendezvous dynamics are formulated based on the target body frame, and the collision avoidance constraints are detailed, particularly concerning the uncertainties. An optimisation solution of the approaching problem is generated using the Gauss pseudospectral method. A closed-loop control is used to track the optimised trajectory. Numerical results are provided to demonstrate the effectiveness of the proposed algorithms.
Topology optimisation for natural convection problems
NASA Astrophysics Data System (ADS)
Alexandersen, Joe; Aage, Niels; Andreasen, Casper Schousboe; Sigmund, Ole
2014-12-01
This paper demonstrates the application of the density-based topology optimisation approach for the design of heat sinks and micropumps based on natural convection effects. The problems are modelled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equations coupled to the convection-diffusion equation through the Boussinesq approximation. In order to facilitate topology optimisation, the Brinkman approach is taken to penalise velocities inside the solid domain and the effective thermal conductivity is interpolated in order to accommodate differences in thermal conductivity of the solid and fluid phases. The governing equations are discretised using stabilised finite elements and topology optimisation is performed for two different problems using discrete adjoint sensitivity analysis. The study shows that topology optimisation is a viable approach for designing heat sink geometries cooled by natural convection and micropumps powered by natural convection.
Optimisation of the hybrid renewable energy system by HOMER, PSO and CPSO for the study area
NASA Astrophysics Data System (ADS)
Khare, Vikas; Nema, Savita; Baredar, Prashant
2017-04-01
This study is based on the simulation and optimisation of the renewable energy system of the police control room at Sagar in central India. To analyse this hybrid system, meteorological data on solar insolation and hourly wind speeds for Sagar in central India (longitude 78°45′ and latitude 23°50′) have been considered. The pattern of load consumption is studied and suitably modelled for optimisation of the hybrid energy system using the HOMER software. The results are compared with those of the particle swarm optimisation and the chaotic particle swarm optimisation algorithms. The use of these two algorithms to optimise the hybrid system leads to higher quality results with faster convergence. Based on the optimisation results, it has been found that replacing conventional energy sources with the solar-wind hybrid renewable energy system is a feasible solution for the distribution of electric power as a stand-alone application at the police control room. This system is more environmentally friendly than a conventional diesel generator, and fuel costs are reduced by approximately 70-80% compared with the conventional diesel generator.
NASA Astrophysics Data System (ADS)
Kaliszewski, M.; Mazuro, P.
2016-09-01
The simulated annealing method of optimisation is tested for the sealing piston ring geometry. The aim of the optimisation is to develop a ring geometry that exerts the demanded pressure on the cylinder simply by being bent to fit the cylinder. A method of FEM analysis of an arbitrary piston ring geometry is applied in the ANSYS software. The demanded pressure function (based on formulae presented by A. Iskra) as well as the objective function are introduced. A geometry definition constructed from polynomials in a radial coordinate system is presented and discussed. A possible application of the simulated annealing method to the piston ring optimisation task is proposed and visualised. Difficulties leading to a possible lack of convergence of the optimisation are presented. An example of an unsuccessful optimisation performed in APDL is discussed. A possible line of further improvement of the optimisation is proposed.
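The annealing scheme itself is generic and easy to sketch; the version below is a stand-in, not the APDL/ANSYS setup: it anneals a small vector of geometry parameters against a placeholder objective that mimics the mismatch between computed and demanded contact pressure, accepting uphill moves with probability exp(-Δ/T) under geometric cooling.

```python
# Generic simulated annealing sketch; the objective is a placeholder for the
# FEM-based pressure-mismatch evaluation used in the study.
import numpy as np

rng = np.random.default_rng(7)

def pressure_mismatch(params):          # placeholder for the FEM-based objective
    target = np.array([0.2, -0.4, 0.1])
    return float(np.sum((params - target) ** 2))

x = rng.uniform(-1.0, 1.0, 3)           # initial polynomial coefficients (assumed)
f = pressure_mismatch(x)
best_x, best_f = x.copy(), f
T, cooling = 1.0, 0.995

for step in range(4000):
    cand = x + rng.normal(0.0, 0.1, x.shape)        # random neighbour
    fc = pressure_mismatch(cand)
    if fc < f or rng.random() < np.exp(-(fc - f) / T):
        x, f = cand, fc                             # accept (possibly uphill) move
        if f < best_f:
            best_x, best_f = x.copy(), f
    T *= cooling                                    # geometric cooling schedule

print("best parameters:", np.round(best_x, 3), "objective: %.2e" % best_f)
```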
Optimisation in radiotherapy. III: Stochastic optimisation algorithms and conclusions.
Ebert, M
1997-12-01
This is the final article in a three-part examination of optimisation in radiotherapy. Previous articles have established the bases and form of the radiotherapy optimisation problem, and examined certain types of optimisation algorithm, namely those which perform some form of ordered search of the solution space (mathematical programming) and those which attempt to find the closest feasible solution to the inverse planning problem (deterministic inversion). The current paper examines algorithms which search the space of possible irradiation strategies by stochastic methods. The resulting iterative search methods move about the solution space by sampling random variates, which gradually become more constricted as the algorithm converges upon the optimal solution. This paper also discusses the implementation of optimisation in radiotherapy practice.
Schutyser, M A I; Straatsma, J; Keijzer, P M; Verschueren, M; De Jong, P
2008-11-30
In the framework of a cooperative EU research project (MILQ-QC-TOOL), a web-based modelling tool (WebSim-MILQ) was developed for the optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. It can be applied to existing products and processes but also to reduce time to market for new products. Important aspects of the tool are its user-friendliness and its specifications customised to the needs of small dairy companies. To challenge the web-based tool, it was applied to the optimisation of thermal treatments in 16 dairy companies producing yoghurt, fresh cream, chocolate milk and cheese. Optimisation with WebSim-MILQ resulted in concrete improvements with respect to risk of microbial contamination, cheese yield, fouling and production costs. In this paper we illustrate the use of WebSim-MILQ for the optimisation of a cheese milk pasteurisation process, where we could increase the cheese yield (1 extra cheese for each 100 cheeses produced from the same amount of milk) and reduce the risk of contamination of pasteurised cheese milk with thermoresistant streptococci from critical to negligible. In another case we demonstrate the advantage of changing from an indirect to a direct heating method for a UHT process, resulting in 80% less fouling while improving product quality and maintaining product safety.
Greco, Francesco; Cadeddu, Jeffrey A; Gill, Inderbir S; Kaouk, Jihad H; Remzi, Mesut; Thompson, R Houston; van Leeuwen, Fijs W B; van der Poel, Henk G; Fornara, Paolo; Rassweiler, Jens
2014-05-01
Molecular imaging (MI) entails the visualisation, characterisation, and measurement of biologic processes at the molecular and cellular levels in humans and other living systems. Translating this technology to interventions in real-time enables interventional MI/image-guided surgery, for example, by providing better detection of tumours and their dimensions. To summarise and critically analyse the available evidence on image-guided surgery for genitourinary (GU) oncologic diseases. A comprehensive literature review was performed using PubMed and the Thomson Reuters Web of Science. In the free-text protocol, the following terms were applied: molecular imaging, genitourinary oncologic surgery, surgical navigation, image-guided surgery, and augmented reality. Review articles, editorials, commentaries, and letters to the editor were included if deemed to contain relevant information. We selected 79 articles according to the search strategy based on the Preferred Reporting Items for Systematic Reviews and Meta-analysis criteria and the IDEAL method. MI techniques included optical imaging and fluorescent techniques, the augmented reality (AR) navigation system, magnetic resonance imaging spectroscopy, positron emission tomography, and single-photon emission computed tomography. Experimental studies on the AR navigation system were restricted to the detection and therapy of adrenal and renal malignancies and in the relatively infrequent cases of prostate cancer, whereas fluorescence techniques and optical imaging presented a wide application of intraoperative GU oncologic surgery. In most cases, image-guided surgery was shown to improve the surgical resectability of tumours. Based on the evidence to date, image-guided surgery has promise in the near future for multiple GU malignancies. Further optimisation of targeted imaging agents, along with the integration of imaging modalities, is necessary to further enhance intraoperative GU oncologic surgery. Copyright © 2013 European Association of Urology. Published by Elsevier B.V. All rights reserved.
Distributed optimisation problem with communication delay and external disturbance
NASA Astrophysics Data System (ADS)
Tran, Ngoc-Tu; Xiao, Jiang-Wen; Wang, Yan-Wu; Yang, Wu
2017-12-01
This paper investigates the distributed optimisation problem for multi-agent systems (MASs) with the simultaneous presence of external disturbance and communication delay. To solve this problem, a two-step design scheme is introduced. In the first step, based on the internal model principle, an internal model term is constructed to compensate the disturbance asymptotically. In the second step, a distributed optimisation algorithm is designed to solve the optimisation problem for MASs subject to both disturbance and communication delay. In the proposed algorithm, each agent interacts with its neighbours through the connected topology and the delay occurs during the information exchange. By utilising a Lyapunov-Krasovskii functional, delay-dependent conditions are derived for both slowly and fast time-varying delays to ensure the convergence of the algorithm to the optimal solution of the optimisation problem. Several numerical simulation examples are provided to illustrate the effectiveness of the theoretical results.
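A minimal sketch of consensus-based distributed optimisation with delayed neighbour information is given below; the graph, mixing weights, delay, step size and quadratic local costs are illustrative only, and the internal-model disturbance compensation from the paper is omitted.

```python
import numpy as np

# Four agents on a ring graph; local costs f_i(x) = 0.5 * (x - a_i)^2, so the
# global optimum of sum_i f_i(x) is mean(a) = 2.0.
a = np.array([1.0, 3.0, -2.0, 6.0])
W = np.array([[0.5, 0.25, 0.0, 0.25],              # doubly stochastic mixing weights
              [0.25, 0.5, 0.25, 0.0],
              [0.0, 0.25, 0.5, 0.25],
              [0.25, 0.0, 0.25, 0.5]])
W_off = W - np.diag(np.diag(W))                    # neighbours' contribution only

delay, step = 3, 0.05
history = [np.zeros(4)] * (delay + 1)              # buffer of past states

for k in range(600):
    x_now = history[-1]
    x_delayed = history[-1 - delay]                # neighbour states arrive 'delay' steps late
    grad = x_now - a                               # gradient of each local cost
    x_next = np.diag(W) * x_now + W_off @ x_delayed - step * grad
    history.append(x_next)

# With a constant step the agents settle in a neighbourhood of the optimum;
# exact consensus would require a diminishing step size.
print("agents:", np.round(history[-1], 2), "global optimum:", a.mean())
```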
Impact of field number and beam angle on functional image-guided lung cancer radiotherapy planning
NASA Astrophysics Data System (ADS)
Tahir, Bilal A.; Bragg, Chris M.; Wild, Jim M.; Swinscoe, James A.; Lawless, Sarah E.; Hart, Kerry A.; Hatton, Matthew Q.; Ireland, Rob H.
2017-09-01
To investigate the effect of beam angles and field number on functionally-guided intensity modulated radiotherapy (IMRT) normal lung avoidance treatment plans that incorporate hyperpolarised helium-3 magnetic resonance imaging (3He MRI) ventilation data. Eight non-small cell lung cancer patients had pre-treatment 3He MRI that was registered to inspiration breath-hold radiotherapy planning computed tomography. IMRT plans that minimised the volume of total lung receiving ⩾20 Gy (V20) were compared with plans that minimised 3He MRI defined functional lung receiving ⩾20 Gy (fV20). Coplanar IMRT plans using 5-field manually optimised beam angles and 9-field equidistant plans were also evaluated. For each pair of plans, the Wilcoxon signed ranks test was used to compare fV20 and the percentage of the planning target volume (PTV) receiving 90% of the prescription dose (PTV90). Incorporation of 3He MRI led to median reductions in fV20 of 1.3% (range: 0.2-9.3%; p = 0.04) and 0.2% (range: 0-4.1%; p = 0.012) for 5- and 9-field arrangements, respectively. There was no clinically significant difference in target coverage. Functionally-guided IMRT plans incorporating hyperpolarised 3He MRI information can reduce the dose received by ventilated lung without compromising PTV coverage. The effect was greater for optimised beam angles than for uniformly spaced fields.
Ding, N S; Hart, A; De Cruz, P
2016-01-01
Nonresponse and loss of response to anti-TNF therapies in Crohn's disease represent significant clinical problems for which clear management guidelines are lacking. To review the incidence, mechanisms and predictors of primary nonresponse and secondary loss of response, and to formulate practical clinical algorithms to guide management. Through a systematic literature review, 503 articles were identified which fit the inclusion criteria. Primary nonresponse to anti-TNF treatment affects 13-40% of patients. Secondary loss of response to anti-TNF occurs in 23-46% of patients when determined according to dose intensification, and 5-13% of patients when gauged by drug discontinuation rates. Recent evidence suggests that the mechanisms underlying primary nonresponse and secondary loss of response are multifactorial and include disease characteristics (phenotype, location, severity), drug-related factors (pharmacokinetics, pharmacodynamics or immunogenicity) and treatment strategy-related factors (dosing regimen). Clinical algorithms that employ therapeutic drug monitoring (using anti-TNF trough levels and anti-drug antibody levels) may be used to determine the underlying cause of primary nonresponse and secondary loss of response, respectively, and guide clinicians as to which patients are most likely to respond to anti-TNF therapy, and help optimise drug therapy for those who are losing response to anti-TNF therapy. Nonresponse or loss of response to anti-TNF occurs commonly in Crohn's disease. Clinical algorithms utilising therapeutic drug monitoring may establish the mechanisms for treatment failure and help guide the subsequent therapeutic approach. © 2015 John Wiley & Sons Ltd.
Optimisation of nano-silica modified self-compacting high-volume fly ash mortar
NASA Astrophysics Data System (ADS)
Achara, Bitrus Emmanuel; Mohammed, Bashar S.; Fadhil Nuruddin, Muhd
2017-05-01
The effects of nano-silica content and superplasticizer (SP) dosage on the compressive strength, porosity and slump flow of high-volume fly ash self-consolidating mortar were investigated. A multiobjective optimisation technique using Design-Expert software was applied to obtain a solution based on a desirability function that simultaneously optimises the variables and the responses. A desirability value of 0.811 gave the optimised solution. The experimental and predicted results showed minimal errors in all the measured responses.
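The desirability-function idea used above can be sketched as follows; the response targets, limits and predicted values in the code are hypothetical and are not the fitted models or data from the study.

```python
import numpy as np

def desirability_max(y, lo, hi):
    """Larger-is-better desirability (Derringer-Suich), clipped to [0, 1]."""
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

def desirability_min(y, lo, hi):
    """Smaller-is-better desirability."""
    return np.clip((hi - y) / (hi - lo), 0.0, 1.0)

# Hypothetical predicted responses for one candidate mix (not the study's data):
strength, porosity, slump_flow = 62.0, 9.5, 270.0

# Overall desirability is the geometric mean of the individual desirabilities.
D = (desirability_max(strength, 40.0, 70.0) *
     desirability_min(porosity, 8.0, 14.0) *
     desirability_max(slump_flow, 240.0, 300.0)) ** (1.0 / 3.0)

print("overall desirability:", round(D, 3))   # the study reports 0.811 for its optimum
```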
NASA Astrophysics Data System (ADS)
Sundaramoorthy, Kumaravel
2017-02-01
Hybrid energy system (HES)-based electricity generation has become a more attractive solution for rural electrification nowadays. Economically feasible and technically reliable HESs are solidly based on an optimisation stage. This article discusses an optimal unit-sizing model with the objective of minimising the total cost of the HES. Three typical rural sites from the southern part of India have been selected for the application of the developed optimisation methodology. Feasibility studies and sensitivity analysis on the optimal HES are discussed elaborately in this article. A comparison has been carried out with the Hybrid Optimization Model for Electric Renewable (HOMER) optimisation model for the three sites. The optimal HES is found to have a lower total net present cost and cost of energy compared with the existing method.
Stenholm, A; Göransson, U; Bohlin, L
2013-02-01
Selective extraction of plant materials is advantageous for obtaining extracts enriched with desired constituents, thereby reducing the need for subsequent chromatography purification. Such compounds include three cyclooxygenase-2 (COX-2) inhibitory substances in Plantago major L. targeted in this investigation: α-linolenic acid (α-LNA) (18:3 ω-3) and the triterpenic acids ursolic acid and oleanolic acid. To investigate the scope for tuning the selectivity of supercritical fluid extraction (SFE) using bioassay guidance, and Soxhlet extraction with dichloromethane as solvent as a reference technique, to optimise yields of these substances. Extraction parameters were varied to optimise extracts' COX-2/COX-1 inhibitory effect ratios. The crude extracts were purified initially using a solid phase extraction (SPE) clean-up procedure and the target compounds were identified with GC-MS, LC-ESI-MS and LC-ESI-MS² using GC-FID for quantification. α-LNA was preferentially extracted in dynamic mode using unmodified carbon dioxide at 40°C and 172 bar, at a 0.04% (w/w) yield with a COX-2/COX-1 inhibitory effect ratio of 1.5. Ursolic and oleanolic acids were dynamically extracted at 0.25% and 0.06% yields, respectively, with no traces of α-LNA and a COX-2/COX-1 inhibitory effect ratio of 1.1 using 10% (v/v) ethanol as polar modifier at 75°C and 483 bar. The Soxhlet extracts had ursolic acid, oleanolic acid and α-LNA yields up to 1.36%, 0.34% and 0.15%, respectively, with a COX-2/COX-1 inhibitory effect ratio of 1.2. The target substances can be extracted selectively by bioassay-guided optimisation of SFE conditions. Copyright © 2012 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Hadia, Sarman K.; Thakker, R. A.; Bhatt, Kirit R.
2016-05-01
The study proposes an application of evolutionary algorithms, specifically the artificial bee colony (ABC), a variant ABC and particle swarm optimisation (PSO), to extract the parameters of a metal-oxide-semiconductor field-effect transistor (MOSFET) model. These algorithms are applied to the MOSFET parameter extraction problem using a Pennsylvania surface potential model. MOSFET parameter extraction procedures involve reducing the error between measured and modelled data. This study shows that the ABC algorithm optimises the parameter values based on the intelligent activities of honey bee swarms. Some modifications have also been applied to the basic ABC algorithm. Particle swarm optimisation is a population-based stochastic optimisation method that is based on bird flocking activities. The performances of these algorithms are compared with respect to the quality of the solutions. The simulation results of this study show that the PSO algorithm performs better than the variant ABC and the basic ABC algorithm for the parameter extraction of the MOSFET model; also, the implementation of the ABC algorithm is shown to be simpler than that of the PSO algorithm.
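The extraction setup (minimising the mismatch between measured and modelled currents) can be sketched as below. A simple square-law drain-current model and a stock differential-evolution optimiser stand in for the Pennsylvania surface-potential model and the ABC/PSO solvers used in the study; all parameter values and bounds are illustrative.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Synthetic "measured" I-V data generated from a simple square-law MOSFET model.
def drain_current(vgs, vds, k, vth, lam):
    vov = np.maximum(vgs - vth, 0.0)
    i_sat = 0.5 * k * vov**2 * (1.0 + lam * vds)
    i_lin = k * (vov - 0.5 * vds) * vds * (1.0 + lam * vds)
    return np.where(vds < vov, i_lin, i_sat)

vgs, vds = np.meshgrid(np.linspace(1.0, 3.0, 5), np.linspace(0.1, 3.0, 15))
true_params = (2e-4, 0.7, 0.02)
i_meas = drain_current(vgs, vds, *true_params)

def rms_error(params):
    """Objective: RMS mismatch between measured and modelled currents."""
    return np.sqrt(np.mean((drain_current(vgs, vds, *params) - i_meas) ** 2))

res = differential_evolution(rms_error,
                             bounds=[(1e-5, 1e-3), (0.3, 1.2), (0.0, 0.1)],
                             seed=1)
print("extracted (k, Vth, lambda):", np.round(res.x, 4))
```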
NASA Astrophysics Data System (ADS)
Jian, Le; Cao, Wang; Jintao, Yang; Yinge, Wang
2018-04-01
This paper describes the design of a dynamic voltage restorer (DVR) that can simultaneously protect several sensitive loads from voltage sags in a region of an MV distribution network. A novel reference voltage calculation method based on zero-sequence voltage optimisation is proposed for this DVR to optimise cost-effectiveness in compensation of voltage sags with different characteristics in an ungrounded neutral system. Based on a detailed analysis of the characteristics of voltage sags caused by different types of faults and the effect of the wiring mode of the transformer on these characteristics, the optimisation target of the reference voltage calculation is presented with several constraints. The reference voltages under all types of voltage sags are calculated by optimising the zero-sequence component, which can reduce the degree of swell in the phase-to-ground voltage after compensation to the maximum extent and can improve the symmetry degree of the output voltages of the DVR, thereby effectively increasing the compensation ability. The validity and effectiveness of the proposed method are verified by simulation and experimental results.
Bryant, Maria; Burton, Wendy; Cundill, Bonnie; Farrin, Amanda J; Nixon, Jane; Stevens, June; Roberts, Kim; Foy, Robbie; Rutter, Harry; Hartley, Suzanne; Tubeuf, Sandy; Collinson, Michelle; Brown, Julia
2017-01-24
Family-based interventions to prevent childhood obesity depend upon parents' taking action to improve diet and other lifestyle behaviours in their families. Programmes that attract and retain high numbers of parents provide an enhanced opportunity to improve public health and are also likely to be more cost-effective than those that do not. We have developed a theory-informed optimisation intervention to promote parent engagement within an existing childhood obesity prevention group programme, HENRY (Health Exercise Nutrition for the Really Young). Here, we describe a proposal to evaluate the effectiveness of this optimisation intervention in regard to the engagement of parents and cost-effectiveness. The Optimising Family Engagement in HENRY (OFTEN) trial is a cluster randomised controlled trial being conducted across 24 local authorities (approximately 144 children's centres) which currently deliver HENRY programmes. The primary outcome will be parental enrolment and attendance at the HENRY programme, assessed using routinely collected process data. Cost-effectiveness will be presented in terms of primary outcomes using acceptability curves and through eliciting the willingness to pay for the optimisation from HENRY commissioners. Secondary outcomes include the longitudinal impact of the optimisation, parent-reported infant intake of fruits and vegetables (as a proxy to compliance) and other parent-reported family habits and lifestyle. This innovative trial will provide evidence on the implementation of a theory-informed optimisation intervention to promote parent engagement in HENRY, a community-based childhood obesity prevention programme. The findings will be generalisable to other interventions delivered to parents in other community-based environments. This research meets the expressed needs of commissioners, children's centres and parents to optimise the potential impact that HENRY has on obesity prevention. A subsequent cluster randomised controlled pilot trial is planned to determine the practicality of undertaking a definitive trial to robustly evaluate the effectiveness and cost-effectiveness of the optimised intervention on childhood obesity prevention. ClinicalTrials.gov identifier: NCT02675699 . Registered on 4 February 2016.
Natural guide-star processing for wide-field laser-assisted AO systems
NASA Astrophysics Data System (ADS)
Correia, Carlos M.; Neichel, Benoit; Conan, Jean-Marc; Petit, Cyril; Sauvage, Jean-Francois; Fusco, Thierry; Vernet, Joel D. R.; Thatte, Niranjan
2016-07-01
Sky-coverage in laser-assisted AO observations largely depends on the system's capability to guide on the faintest natural guide-stars possible. Here we give an up-to-date status of our natural guide-star processing tailored to the European-ELT's visible and near-infrared (0.47 to 2.45 μm) integral field spectrograph - Harmoni. We tour the processing of both the isoplanatic and anisoplanatic tilt modes using the spatio-angular approach whereby the wavefront is estimated directly in the pupil plane avoiding a cumbersome explicit layered estimation on the 35-layer profiles we're currently using. Taking the case of Harmoni, we cover the choice of wave-front sensors, the number and field location of guide-stars, the optimised algorithms to beat down angular anisoplanatism and the performance obtained with different temporal controllers under split high-order/low-order tomography or joint tomography. We consider both atmospheric and far greater telescope wind buffeting disturbances. In addition we provide the sky-coverage estimates thus obtained.
Leucht, Stefan; Winter-van Rossum, Inge; Heres, Stephan; Arango, Celso; Fleischhacker, W Wolfgang; Glenthøj, Birte; Leboyer, Marion; Leweke, F Markus; Lewis, Shôn; McGuire, Phillip; Meyer-Lindenberg, Andreas; Rujescu, Dan; Kapur, Shitij; Kahn, René S; Sommer, Iris E
2015-05-01
Most of the 13 542 trials contained in the Cochrane Schizophrenia Group's register just tested the general efficacy of pharmacological or psychosocial interventions. Studies on the subsequent treatment steps, which are essential to guide clinicians, are largely missing. This knowledge gap leaves important questions unanswered. For example, when a first antipsychotic failed, is switching to another drug effective? And when should we use clozapine? The aim of this article is to review the efficacy of switching antipsychotics in case of nonresponse. We also present the European Commission sponsored "Optimization of Treatment and Management of Schizophrenia in Europe" (OPTiMiSE) trial which aims to provide a treatment algorithm for patients with a first episode of schizophrenia. We searched Pubmed (October 29, 2014) for randomized controlled trials (RCTs) that examined switching the drug in nonresponders to another antipsychotic. We described important methodological choices of the OPTiMiSE trial. We found 10 RCTs on switching antipsychotic drugs. No trial was conclusive and none was concerned with first-episode schizophrenia. In OPTiMiSE, 500 first episode patients are treated with amisulpride for 4 weeks, followed by a 6-week double-blind RCT comparing continuation of amisulpride with switching to olanzapine and ultimately a 12-week clozapine treatment in nonremitters. A subsequent 1-year RCT validates psychosocial interventions to enhance adherence. Current literature fails to provide basic guidance for the pharmacological treatment of schizophrenia. The OPTiMiSE trial is expected to provide a basis for clinical guidelines to treat patients with a first episode of schizophrenia. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Capoccia, Massimo; Marconi, Silvia; Singh, Sanjeet Avtaar; Pisanelli, Domenico M; De Lazzari, Claudio
2018-05-02
Modelling and simulation may become clinically applicable tools for detailed evaluation of the cardiovascular system and clinical decision-making to guide therapeutic intervention. Models based on the pressure-volume relationship and zero-dimensional representation of the cardiovascular system may be a suitable choice given their simplicity and versatility. This approach has great potential for application in heart failure, where the impact of left ventricular assist devices has played a significant role as a bridge to transplant and, more recently, as a long-term solution for non-eligible candidates. We sought to investigate the value of simulation in the context of three heart failure patients with a view to predicting or guiding further management. CARDIOSIM© was the software used for this purpose. The study was based on retrospective analysis of haemodynamic data previously discussed at a multidisciplinary meeting. The outcome of the simulations addressed the value of a more quantitative approach in the clinical decision process. Although previous experience, co-morbidities and the risk of potentially fatal complications play a role in clinical decision-making, patient-specific modelling may become a daily approach for the selection and optimisation of device-based treatment for heart failure patients. Willingness to adopt this integrated approach may be the key to further progress.
Cultural-based particle swarm for dynamic optimisation problems
NASA Astrophysics Data System (ADS)
Daneshyari, Moayed; Yen, Gary G.
2012-07-01
Many practical optimisation problems involve uncertainties, and a significant number of them belong to the dynamic optimisation problem (DOP) category, in which the fitness function changes through time. In this study, we propose a cultural-based particle swarm optimisation (PSO) to solve DOPs. A cultural framework is adopted that incorporates the required information from the PSO into five sections of the belief space, namely situational, temporal, domain, normative and spatial knowledge. The stored information is used to detect changes in the environment and assists the response to change through a diversity-based repulsion among particles and migration among swarms in the population space; it also helps in selecting the leading particles at three different levels: personal, swarm and global. Comparison of the proposed heuristic over several difficult dynamic benchmark problems demonstrates better or equal performance with respect to most of the other selected state-of-the-art dynamic PSO heuristics.
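A much-simplified sketch of the change-detection-and-diversification idea is shown below: a stored best solution is re-evaluated each step, a shift in its fitness signals an environment change, and part of the swarm is then scattered around the stored best. The moving-peak problem, swarm update and all parameters are hypothetical; the full belief-space (situational, temporal, domain, normative, spatial) machinery of the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

def fitness(x, t):
    """Moving-peak toy DOP: the optimum drifts to (0.01*t, 0.01*t) over time."""
    return -np.sum((x - 0.01 * t) ** 2, axis=-1)

dim, n = 2, 20
swarm = rng.uniform(-1, 1, size=(n, dim))
memory = swarm[0].copy()                      # stored best solution (belief-space stand-in)
memory_fit = fitness(memory, 0)

for t in range(1, 200):
    # Change detection: re-evaluate the stored best; a shift in its fitness
    # signals that the environment has changed.
    if abs(fitness(memory, t) - memory_fit) > 1e-9:
        # Diversify: scatter half of the swarm around the stored best.
        swarm[n // 2:] = memory + rng.normal(0.0, 0.3, size=(n - n // 2, dim))

    fits = fitness(swarm, t)
    best = swarm[np.argmax(fits)].copy()
    swarm += 0.5 * rng.random((n, 1)) * (best - swarm)   # crude pull towards the best
    memory, memory_fit = best, float(np.max(fits))

print("tracked optimum estimate:", np.round(memory, 2), "true optimum:", 0.01 * 199)
```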
Multiobjective optimisation of bogie suspension to boost speed on curves
NASA Astrophysics Data System (ADS)
Milad Mousavi-Bideleh, Seyed; Berbyuk, Viktor
2016-01-01
To improve safety and maximum admissible speed in different operational scenarios, multiobjective optimisation of the bogie suspension components of a one-car railway vehicle model is considered. The vehicle model has 50 degrees of freedom and is developed in the multibody dynamics software SIMPACK. Track shift force, running stability, and risk of derailment are selected as safety objective functions. The improved maximum admissible speeds of the vehicle on curves are determined based on track plane accelerations up to 1.5 m/s². To reduce the number of design parameters for optimisation and improve the computational efficiency, a global sensitivity analysis is accomplished using the multiplicative dimensional reduction method (M-DRM). A multistep optimisation routine based on a genetic algorithm (GA) and MATLAB/SIMPACK co-simulation is executed at three levels. The bogie conventional secondary and primary suspension components are chosen as the design parameters in the first two steps, respectively. In the last step semi-active suspension is in focus. The input electrical current to magnetorheological yaw dampers is optimised to guarantee an appropriate safety level. Semi-active controllers are also applied and the respective effects on bogie dynamics are explored. The safety Pareto optimised results are compared with those associated with in-service values. The global sensitivity analysis and multistep approach significantly reduced the number of design parameters and improved the computational efficiency of the optimisation. Furthermore, using the optimised values of the design parameters makes it possible to run the vehicle up to 13% faster on curves while a satisfactory safety level is guaranteed. The results obtained can be used in Pareto optimisation and active bogie suspension design problems.
Boundary element based multiresolution shape optimisation in electrostatics
NASA Astrophysics Data System (ADS)
Bandara, Kosala; Cirak, Fehmi; Of, Günther; Steinbach, Olaf; Zapletal, Jan
2015-09-01
We consider the shape optimisation of high-voltage devices subject to electrostatic field equations by combining fast boundary elements with multiresolution subdivision surfaces. The geometry of the domain is described with subdivision surfaces and different resolutions of the same geometry are used for optimisation and analysis. The primal and adjoint problems are discretised with the boundary element method using a sufficiently fine control mesh. For shape optimisation the geometry is updated starting from the coarsest control mesh with increasingly finer control meshes. The multiresolution approach effectively prevents the appearance of non-physical geometry oscillations in the optimised shapes. Moreover, there is no need for mesh regeneration or smoothing during the optimisation due to the absence of a volume mesh. We present several numerical experiments and one industrial application to demonstrate the robustness and versatility of the developed approach.
Optimisation of sensing time and transmission time in cognitive radio-based smart grid networks
NASA Astrophysics Data System (ADS)
Yang, Chao; Fu, Yuli; Yang, Junjie
2016-07-01
Cognitive radio (CR)-based smart grid (SG) networks have been widely recognised as emerging communication paradigms in power grids. However, sufficient spectrum resources and reliability are two major challenges for real-time applications in CR-based SG networks. In this article, we study the traffic data collection problem. Based on the two-stage power pricing model, the power price is associated with the traffic data effectively received by a meter data management system (MDMS). In order to minimise the system power price, a wideband hybrid access strategy is proposed and analysed to share the spectrum between the SG nodes and CR networks. The sensing time and transmission time are jointly optimised, while both the interference to primary users and the spectrum opportunity loss of secondary users are considered. Two algorithms are proposed to solve the joint optimisation problem. Simulation results show that the proposed joint optimisation algorithms outperform the fixed-parameter (sensing time and transmission time) algorithms, and the power cost is reduced efficiently.
NASA Astrophysics Data System (ADS)
Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Sazli, M.; Yahya, Z. R.
2017-09-01
This study presents the application of an optimisation method to reduce the warpage of a side arm part. Autodesk Moldflow Insight software was integrated into this study to analyse the warpage. A Design of Experiments (DOE) for Response Surface Methodology (RSM) was constructed and, using the equation from the RSM, Particle Swarm Optimisation (PSO) was applied. The optimisation method results in optimised processing parameters with minimum warpage. Mould temperature, melt temperature, packing pressure, packing time and cooling time were selected as the variable parameters. Parameter selection was based on the most significant factors affecting warpage reported by previous researchers. The results show that warpage was improved by 28.16% for RSM and 28.17% for PSO; the additional improvement of PSO over RSM is only 0.01%. Thus, the optimisation using RSM is already sufficient to give the best combination of parameters and the optimum warpage value for the side arm part. The most significant parameter affecting warpage is packing pressure.
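A sketch of the RSM-plus-PSO step is given below: a hypothetical second-order response-surface model of warpage (the coefficients are illustrative, not those fitted in the study) is minimised by a compact particle swarm over the five coded process parameters.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical second-order RSM model of warpage in coded units [-1, 1] for
# (mould temp, melt temp, packing pressure, packing time, cooling time).
b0, b1, b2 = 0.65, np.array([0.02, 0.03, -0.08, -0.03, 0.01]), 0.04

def warpage(x):
    return b0 + x @ b1 + b2 * np.sum(x**2, axis=-1)

# Compact PSO over the five coded factors.
n, dim, w, c1, c2 = 30, 5, 0.7, 1.5, 1.5
x = rng.uniform(-1, 1, (n, dim)); v = np.zeros((n, dim))
pbest, pbest_f = x.copy(), warpage(x)
gbest = pbest[np.argmin(pbest_f)]

for _ in range(200):
    v = w * v + c1 * rng.random((n, dim)) * (pbest - x) + c2 * rng.random((n, dim)) * (gbest - x)
    x = np.clip(x + v, -1, 1)                        # keep particles in the coded design space
    f = warpage(x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]

print("optimum (coded):", np.round(gbest, 2),
      "predicted warpage:", round(float(warpage(gbest)), 4))
```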
Metaheuristic optimisation methods for approximate solving of singular boundary value problems
NASA Astrophysics Data System (ADS)
Sadollah, Ali; Yadav, Neha; Gao, Kaizhou; Su, Rong
2017-07-01
This paper presents a novel approximation technique based on metaheuristics and a weighted residual function (WRF) for tackling singular boundary value problems (BVPs) arising in engineering and science. With the aid of certain fundamental concepts of mathematics, Fourier series expansion, and metaheuristic optimisation algorithms, singular BVPs can be approximated as an optimisation problem with the boundary conditions as constraints. The target is to minimise the WRF (i.e. the error function) constructed in the approximation of the BVP. The scheme involves a generational distance metric for quality evaluation of the approximate solutions against exact solutions (i.e. an error-evaluator metric). Four test problems, including two linear and two non-linear singular BVPs, are considered in this paper to check the efficiency and accuracy of the proposed algorithm. The optimisation task is performed using three different optimisers: the particle swarm optimisation, the water cycle algorithm, and the harmony search algorithm. Optimisation results obtained show that the suggested technique can be successfully applied for the approximate solving of singular BVPs.
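The weighted-residual formulation can be sketched on a standard singular test problem (a Lane-Emden equation of index 5 with a known closed-form solution); a low-order polynomial trial function and a stock Nelder-Mead search stand in here for the paper's Fourier expansion and metaheuristic optimisers, purely for brevity.

```python
import numpy as np
from scipy.optimize import minimize

# Singular test BVP (Lane-Emden, index 5): y'' + (2/x) y' + y^5 = 0,
# y(0) = 1, y'(0) = 0, with exact solution y = (1 + x^2/3)^(-1/2).
# The trial function y = 1 + x^2 (c0 + c1 x + c2 x^2) satisfies both boundary
# conditions by construction, so no constraint penalty is needed.
xc = np.linspace(0.05, 1.0, 40)                     # collocation points (avoid x = 0)

def weighted_residual(c):
    c0, c1, c2 = c
    y   = 1 + c0 * xc**2 + c1 * xc**3 + c2 * xc**4
    dy  = 2 * c0 * xc + 3 * c1 * xc**2 + 4 * c2 * xc**3
    d2y = 2 * c0 + 6 * c1 * xc + 12 * c2 * xc**2
    res = d2y + (2.0 / xc) * dy + y**5               # residual of the ODE
    return np.sum(res**2)

sol = minimize(weighted_residual, x0=np.zeros(3), method="Nelder-Mead")
x_test = np.linspace(0, 1, 5)
approx = 1 + sol.x[0] * x_test**2 + sol.x[1] * x_test**3 + sol.x[2] * x_test**4
exact = (1 + x_test**2 / 3) ** -0.5
print("max abs error:", float(np.max(np.abs(approx - exact))))
```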
Infrastructure optimisation via MBR retrofit: a design guide.
Bagg, W K
2009-01-01
Wastewater management is continually evolving with the development and implementation of new, more efficient technologies. One of these is the Membrane Bioreactor (MBR). Although a relatively new technology in Australia, MBR wastewater treatment has been widely used elsewhere for over 20 years, with thousands of MBRs now in operation worldwide. Over the past 5 years, MBR technology has been enthusiastically embraced in Australia as a potential treatment upgrade option, and via retrofit typically offers two major benefits: (1) more capacity using mostly existing facilities, and (2) very high quality treated effluent. However, infrastructure optimisation via MBR retrofit is not a simple or low-cost solution and there are many factors which should be carefully evaluated before deciding on this method of plant upgrade. The paper reviews a range of design parameters which should be carefully evaluated when considering an MBR retrofit solution. Several actual and conceptual case studies are considered to demonstrate both advantages and disadvantages. Whilst optimising existing facilities and production of high quality water for reuse are powerful drivers, it is suggested that MBRs are perhaps not always the most sustainable Whole-of-Life solution for a wastewater treatment plant upgrade, especially by way of a retrofit.
A Bayesian Approach for Sensor Optimisation in Impact Identification
Mallardo, Vincenzo; Sharif Khodaei, Zahra; Aliabadi, Ferri M. H.
2016-01-01
This paper presents a Bayesian approach for optimising the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on the genetic algorithm is proposed to find the best sensor combination aimed at locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM) system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both uniform and non-uniform probabilities of impact occurrence. PMID:28774064
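A schematic of GA-based sensor-combination selection is sketched below. The coverage-style objective with independent sensor failures is only a stand-in for the paper's Bayesian meta-model performance indicator, and the geometry, probabilities and GA settings are all hypothetical.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(4)

# 30 candidate sensor sites and 100 possible impact locations on a unit panel.
sites = rng.uniform(0, 1, (30, 2))
impacts = rng.uniform(0, 1, (100, 2))
P_FAIL, RADIUS, K = 0.1, 0.35, 6            # failure probability, sensing radius, network size

def objective(idx):
    """Stand-in objective: expected fraction of impacts seen by at least
    3 *working* sensors, assuming independent sensor failures."""
    d = np.linalg.norm(impacts[:, None, :] - sites[idx][None, :, :], axis=2)
    score = 0.0
    for row in (d < RADIUS):
        m = int(row.sum())                   # sensors in range of this impact
        p_fewer = sum(comb(m, j) * (1 - P_FAIL)**j * P_FAIL**(m - j)
                      for j in range(min(3, m + 1)))
        score += 1.0 - p_fewer               # P(at least 3 of them are working)
    return score / len(impacts)

def ga(pop_size=24, gens=30, pmut=0.3):
    pop = [rng.choice(len(sites), K, replace=False) for _ in range(pop_size)]
    for _ in range(gens):
        fit = np.array([objective(ind) for ind in pop])
        pop = [pop[i] for i in np.argsort(fit)[::-1][:pop_size // 2]]        # truncation selection
        children = []
        while len(pop) + len(children) < pop_size:
            i, j = rng.choice(len(pop), 2, replace=False)
            child = rng.choice(np.union1d(pop[i], pop[j]), K, replace=False)  # subset crossover
            if rng.random() < pmut:                                           # swap-one mutation
                child[rng.integers(K)] = rng.choice(np.setdiff1d(np.arange(len(sites)), child))
            children.append(child)
        pop += children
    fit = np.array([objective(ind) for ind in pop])
    return pop[int(np.argmax(fit))], float(fit.max())

best, score = ga()
print("selected sites:", sorted(best.tolist()), "expected coverage:", round(score, 3))
```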
Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek; Lapkin, Alexei
2017-01-01
A recently described C(sp3)-H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the fewest experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled rapid generation of a process model.
Ribera, Esteban; Martínez-Sesmero, José Manuel; Sánchez-Rubio, Javier; Rubio, Rafael; Pasquau, Juan; Poveda, José Luis; Pérez-Mitru, Alejandro; Roldán, Celia; Hernández-Novoa, Beatriz
2018-03-01
The objective of this study is to estimate the economic impact associated with the optimisation of triple antiretroviral treatment (ART) in patients with undetectable viral load, according to the recommendations from the GeSIDA/PNS (2015) Consensus and their applicability in Spanish clinical practice. A pharmacoeconomic model was developed based on data from a National Hospital Prescription Survey on ART (2014) and the A-I evidence recommendations for the optimisation of ART from the GeSIDA/PNS (2015) Consensus. The optimisation model took into account the willingness to optimise a particular regimen and other assumptions, and the results were validated by an expert panel in HIV infection (Infectious Disease Specialists and Hospital Pharmacists). The analysis was conducted from the NHS perspective, considering the annual wholesale price and accounting for the deductions stated in RD-Law 8/2010 and VAT. The expert panel selected six optimisation strategies, and estimated that 10,863 (13.4%) of the 80,859 patients in Spain currently on triple ART would be candidates to optimise their ART, leading to savings of €15.9M/year (2.4% of the total triple ART drug cost). The most feasible strategies (>40% of patients who were candidates for optimisation; n=4,556) would be optimisations to ATV/r+3TC therapy. These would produce savings between €653 and €4,797 per patient per year, depending on the baseline triple ART. Implementation of the main optimisation strategies recommended in the GeSIDA/PNS (2015) Consensus into Spanish clinical practice would lead to considerable savings, especially those based on dual therapy with ATV/r+3TC, thus contributing to the control of pharmaceutical expenditure and NHS sustainability. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
NASA Astrophysics Data System (ADS)
Nickless, A.; Rayner, P. J.; Erni, B.; Scholes, R. J.
2018-05-01
The design of an optimal network of atmospheric monitoring stations for the observation of carbon dioxide (CO2) concentrations can be obtained by applying an optimisation algorithm to a cost function based on minimising the posterior uncertainty in the CO2 fluxes obtained from a Bayesian inverse modelling solution. Two candidate optimisation methods were assessed: an evolutionary algorithm, the genetic algorithm (GA), and a deterministic algorithm, the incremental optimisation (IO) routine. This paper assessed the ability of the IO routine, in comparison to the more computationally demanding GA routine, to optimise the placement of a five-member network of CO2 monitoring sites located in South Africa. The comparison considered the reduction in uncertainty of the overall flux estimate, the spatial similarity of solutions, and computational requirements. Although the IO routine failed to find the solution with the global maximum uncertainty reduction, the resulting solution had only fractionally lower uncertainty reduction compared with the GA, and at only a quarter of the computational resources used by the smallest specified GA configuration. The GA solution set showed more inconsistency if the number of iterations or population size was small, and more so for a complex prior flux covariance matrix. If the GA completed with a sub-optimal solution, these solutions were similar in fitness to the best available solution. Two additional scenarios were considered, with the objective of creating circumstances under which the GA might outperform the IO. The first scenario considered an established network, where the optimisation was required to add an additional five stations to an existing five-member network. In the second scenario the optimisation was based only on the uncertainty reduction within a subregion of the domain. The GA was able to find a better solution than the IO under both scenarios, but with only a marginal improvement in the uncertainty reduction. These results suggest that the best use of resources for the network design problem would be to improve the prior estimates of the flux uncertainties rather than to invest those resources in running a complex evolutionary optimisation algorithm. The authors recommend that, if time and computational resources allow, multiple optimisation techniques should be used as part of a comprehensive suite of sensitivity tests when performing such an optimisation exercise. This will provide a selection of best solutions which could be ranked based on their utility and practicality.
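The incremental (greedy) routine can be sketched directly from the Bayesian network-design formulation: each added station is the one that most reduces the trace of the posterior flux covariance. The prior covariance, sensitivity matrix and observation error below are synthetic and stand in for the real transport-model quantities.

```python
import numpy as np

rng = np.random.default_rng(5)

n_flux, n_candidates, n_pick = 40, 15, 5
B = np.diag(rng.uniform(0.5, 2.0, n_flux))          # prior flux error covariance (synthetic)
H_all = rng.normal(size=(n_candidates, n_flux))     # sensitivity of each candidate site to the fluxes
r_obs = 0.25                                        # observation error variance

def posterior_trace(selected):
    """Trace of the Bayesian posterior covariance for a given station set."""
    H = H_all[selected]
    A = np.linalg.inv(B) + H.T @ H / r_obs
    return np.trace(np.linalg.inv(A))

# Incremental optimisation: add, one at a time, the station that most
# reduces the posterior uncertainty.
selected = []
for _ in range(n_pick):
    remaining = [i for i in range(n_candidates) if i not in selected]
    best = min(remaining, key=lambda i: posterior_trace(selected + [i]))
    selected.append(best)

print("stations:", selected)
print("uncertainty reduction: %.1f%%" % (100 * (1 - posterior_trace(selected) / np.trace(B))))
```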
A practical guide for nurses in diluent selection for subcutaneous infusion using a syringe driver.
McLeod, Fiona; Flowers, Charne
2006-12-01
Appropriate diluent selection in continuous subcutaneous infusion optimises symptom management and client well-being. The responsibility for diluent selection commonly rests with the attending nurse. This paper was developed with the intention of providing nurses with practical instruction in diluent selection when preparing medications for subcutaneous administration using a syringe driver. A literature review was undertaken of published journal databases and published guideline sites. Recommendations regarding diluent choice were reviewed in two iterations by an expert panel of palliative care nurse clinicians. The principles for diluent selection are presented. They are based primarily on an expert-opinion level of evidence, given the lack of primary research evidence in the area of diluent selection. There is a pressing need for manufacturers' guidance on diluent selection and independent research to establish the impact of diluents on drugs and drug combinations when using syringe drivers. Until such time as this evidence is available to guide practice, clinicians need to be trained to inspect solutions and assess the effectiveness of the medication in controlling symptoms. The capacity of this paper to provide practical instruction has been limited by the lack of rigorous evidence available, and indeed, the process of developing this guide identified perhaps more questions than answers available at the present time.
NASA Astrophysics Data System (ADS)
Sur, Chiranjib; Shukla, Anupam
2018-03-01
The Bacteria Foraging Optimisation Algorithm is a collective behaviour-based meta-heuristic search that depends on the social influence of the bacterial co-agents in the search space of the problem. The algorithm faces tremendous hindrance in its application to discrete and graph-based problems due to its biased mathematical modelling and dynamic structure. This has been the key motivation to revive the algorithm and introduce a discrete form, the Discrete Bacteria Foraging Optimisation (DBFO) Algorithm, for discrete problems, which in real life outnumber the continuous-domain problems represented by mathematical and numerical equations. In this work, we have mainly simulated a graph-based road multi-objective optimisation problem and have discussed the prospects of its utilisation in other similar optimisation and graph-based problems. The various solution representations that can be handled by this DBFO have also been discussed. The implications and dynamics of the various parameters used in the DBFO are illustrated from the point of view of the problems, combining both exploration and exploitation. The results of DBFO have been compared with the Ant Colony Optimisation and Intelligent Water Drops algorithms. An important feature of DBFO is that the bacteria agents do not depend on local heuristic information but estimate new exploration schemes depending upon previous experience and covered-path analysis. This makes the algorithm better at combination generation for graph-based and NP-hard problems.
Van de Velde, Stijn; Roshanov, Pavel; Kortteisto, Tiina; Kunnamo, Ilkka; Aertgeerts, Bert; Vandvik, Per Olav; Flottorp, Signe
2016-03-05
A computerised clinical decision support system (CCDSS) is a technology that uses patient-specific data to provide relevant medical knowledge at the point of care. It is considered to be an important quality improvement intervention, and the implementation of CCDSS is growing substantially. However, the significant investments do not consistently result in value for money due to content, context, system and implementation issues. The Guideline Implementation with Decision Support (GUIDES) project aims to improve the impact of CCDSS through optimised implementation based on high-quality evidence-based recommendations. To achieve this, we will develop tools that address the factors that determine successful CCDSS implementation. We will develop the GUIDES tools in four steps, using the methods and results of the Tailored Implementation for Chronic Diseases (TICD) project as a starting point: (1) a review of research evidence and frameworks on the determinants of implementing recommendations using CCDSS; (2) a synthesis of a comprehensive framework for the identified determinants; (3) the development of tools for use of the framework and (4) pilot testing the utility of the tools through the development of a tailored CCDSS intervention in Norway, Belgium and Finland. We selected the conservative management of knee osteoarthritis as a prototype condition for the pilot. During the process, the authors will collaborate with an international expert group to provide input and feedback on the tools. This project will provide guidance and tools on methods of identifying implementation determinants and selecting strategies to implement evidence-based recommendations through CCDSS. We will make the GUIDES tools available to CCDSS developers, implementers, researchers, funders, clinicians, managers, educators, and policymakers internationally. The tools and recommendations will be generic, which makes them scalable to a large spectrum of conditions. Ultimately, the better implementation of CCDSS may lead to better-informed decisions and improved care and patient outcomes for a wide range of conditions. PROSPERO, CRD42016033738.
Optimisation of SOA-REAMs for hybrid DWDM-TDMA PON applications.
Naughton, Alan; Antony, Cleitus; Ossieur, Peter; Porto, Stefano; Talli, Giuseppe; Townsend, Paul D
2011-12-12
We demonstrate how loss-optimised, gain-saturated SOA-REAM based reflective modulators can reduce the burst to burst power variations due to differential access loss in the upstream path in carrier distributed passive optical networks by 18 dB compared to fixed linear gain modulators. We also show that the loss optimised device has a high tolerance to input power variations and can operate in deep saturation with minimal patterning penalties. Finally, we demonstrate that an optimised device can operate across the C-Band and also over a transmission distance of 80 km. © 2011 Optical Society of America
NASA Astrophysics Data System (ADS)
Hazwan, M. H. M.; Shayfull, Z.; Sharif, S.; Nasir, S. M.; Zainal, N.
2017-09-01
In the injection moulding process, quality and productivity are notably important and must be controlled for each product type produced. Quality is measured as the extent of warpage of the moulded parts, while productivity is measured as the duration of the moulding cycle time. To control quality, many researchers have introduced various optimisation approaches which have been proven to enhance the quality of the moulded part produced. In order to improve the productivity of the injection moulding process, some researchers have proposed the application of conformal cooling channels, which have been proven to reduce the duration of the moulding cycle time. Therefore, this paper presents an application of an alternative optimisation approach, Response Surface Methodology (RSM) with Glowworm Swarm Optimisation (GSO), to moulded parts with straight-drilled and conformal cooling channel moulds. This study examined the warpage condition of the moulded parts before and after the optimisation work for both cooling channel types. A front panel housing was selected as the specimen, and the performance of the proposed optimisation approach was analysed for the conventional straight-drilled cooling channels compared with the Milled Groove Square Shape (MGSS) conformal cooling channels by simulation analysis using Autodesk Moldflow Insight (AMI) 2013. Based on the results, for the straight-drilled cooling channels melt temperature is the most significant factor contributing to the warpage condition and warpage was improved by 39.1% after optimisation, while for the MGSS conformal cooling channels cooling time is the most significant factor contributing to the warpage condition and warpage was improved by 38.7% after optimisation. In addition, the findings show that applying the optimisation work to the conformal cooling channels offers better quality and productivity of the moulded part produced.
Shape Optimisation of Holes in Loaded Plates by Minimisation of Multiple Stress Peaks
2015-04-01
Witold Waldman and Manfred …
The method is based on minimising the peak tangential stresses on multiple segments around the boundary of a hole in a uniaxially-loaded or biaxially-loaded plate.
Hasler, B; Delabouglise, A; Babo Martins, S
2017-04-01
The primary role of animal health economics is to inform decision-making by determining optimal investments for animal health. Animal health surveillance produces information to guide interventions. Consequently, investments in surveillance and intervention must be evaluated together. This article explores the different theoretical frameworks and methods developed to assess and optimise the spending of resources in surveillance and intervention and their technical interdependence. The authors present frameworks that define the relationship between health investment and losses due to disease, and the relationship between surveillance and intervention resources. Surveillance and intervention are usually considered as technical substitutes, since increased investments in surveillance reduce the level of intervention resources required to reach the same benefit. The authors also discuss approaches used to quantify externalities and non-monetary impacts. Finally, they describe common economic evaluation types, including optimisation, acceptability and least-cost studies.
Magnetic resonance imaging-guided surgical design: can we optimise the Fontan operation?
Haggerty, Christopher M; Yoganathan, Ajit P; Fogel, Mark A
2013-12-01
The Fontan procedure, although an imperfect solution for children born with a single functional ventricle, is the only reconstruction at present short of transplantation. The haemodynamics associated with the total cavopulmonary connection, the modern approach to Fontan, are severely altered from the normal biventricular circulation and may contribute to the long-term complications that are frequently noted. Through recent technological advances, spearheaded by advances in medical imaging, it is now possible to virtually model these surgical procedures and evaluate the patient-specific haemodynamics as part of the pre-operative planning process. This is a novel paradigm with the potential to revolutionise the approach to Fontan surgery, help to optimise the haemodynamic results, and improve patient outcomes. This review provides a brief overview of these methods, presents preliminary results of their clinical usage, and offers insights into their potential future directions.
Suwannarangsee, Surisa; Bunterngsook, Benjarat; Arnthong, Jantima; Paemanee, Atchara; Thamchaipenet, Arinthip; Eurwilaichitr, Lily; Laosiripojana, Navadol; Champreda, Verawat
2012-09-01
A synergistic enzyme system for the hydrolysis of alkali-pretreated rice straw was optimised based on the synergy of crude fungal enzyme extracts with a commercial cellulase (Celluclast™). Among 13 enzyme extracts, the enzyme preparation from Aspergillus aculeatus BCC 199 exhibited the highest level of synergy with Celluclast™. This synergy was based on the complementary cellulolytic and hemicellulolytic activities of the BCC 199 enzyme extract. A mixture design was used to optimise the ternary enzyme complex based on the synergistic enzyme mixture with Bacillus subtilis expansin. Using the full cubic model, the optimal formulation of the enzyme mixture was predicted to be Celluclast™:BCC 199:expansin = 41.4:37.0:21.6 (%), which produced 769 mg reducing sugar/g biomass using 2.82 FPU/g of enzymes. This work demonstrated the use of a systematic approach for the design and optimisation of a synergistic enzyme mixture of fungal enzymes and expansin for lignocellulosic degradation. Copyright © 2012 Elsevier Ltd. All rights reserved.
Escalated convergent artificial bee colony
NASA Astrophysics Data System (ADS)
Jadon, Shimpi Singh; Bansal, Jagdish Chand; Tiwari, Ritu
2016-03-01
The artificial bee colony (ABC) optimisation algorithm is a recent, fast and easy-to-implement population-based meta-heuristic for optimisation. ABC has proved to be a rival to some popular swarm intelligence-based algorithms such as particle swarm optimisation, the firefly algorithm and ant colony optimisation. The solution search equation of ABC is influenced by a random quantity which helps its search process in exploration at the cost of exploitation. In order to obtain fast convergent behaviour of ABC while maintaining its exploitation capability, in this paper the basic ABC is modified in two ways. First, to improve exploitation capability, two local search strategies, namely a classical unidimensional local search and a levy-flight random-walk-based local search, are incorporated into ABC. Furthermore, a new solution search strategy, namely stochastic diffusion scout search, is proposed and incorporated into the scout bee phase to give an abandoned solution more chance to improve itself. The efficiency of the proposed algorithm is tested on 20 benchmark test functions of different complexities and characteristics. The results are very promising and prove it to be a competitive algorithm in the field of swarm intelligence-based algorithms.
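A minimal sketch of the basic ABC loop is given below, including the classical solution search equation and a plain scout phase on a standard benchmark function; the paper's unidimensional and levy-flight local searches and its stochastic diffusion scout search are not reproduced, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

def sphere(x):                       # benchmark objective to minimise
    return np.sum(x**2, axis=-1)

SN, dim, limit = 20, 5, 30           # food sources, dimension, abandonment limit
foods = rng.uniform(-5, 5, (SN, dim))
fit, trials = sphere(foods), np.zeros(SN)

for it in range(300):
    # Employed/onlooker-style update using the classical ABC search equation:
    # v_ij = x_ij + phi * (x_ij - x_kj), one random dimension j, random partner k.
    for i in range(SN):
        k = rng.choice([s for s in range(SN) if s != i])
        j = rng.integers(dim)
        v = foods[i].copy()
        v[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
        if sphere(v) < fit[i]:                     # greedy replacement
            foods[i], fit[i], trials[i] = v, sphere(v), 0
        else:
            trials[i] += 1
    # Scout phase: abandon exhausted sources (the paper replaces this with a
    # stochastic diffusion scout search; plain re-initialisation is used here).
    worn = trials > limit
    foods[worn] = rng.uniform(-5, 5, (int(worn.sum()), dim))
    fit[worn], trials[worn] = sphere(foods[worn]), 0

print("best objective:", float(fit.min()))
```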
Genetic algorithm-based improved DOA estimation using fourth-order cumulants
NASA Astrophysics Data System (ADS)
Ahmed, Ammar; Tufail, Muhammad
2017-05-01
Genetic algorithm (GA)-based direction of arrival (DOA) estimation is proposed using fourth-order cumulants (FOC) and the ESPRIT principle, resulting in the Multiple Invariance Cumulant ESPRIT algorithm. In the existing FOC ESPRIT formulations, only one invariance is utilised to estimate DOAs. The unused multiple invariances (MIs) must be exploited simultaneously in order to improve the estimation accuracy. In this paper, a fitness function based on a carefully designed cumulant matrix is developed which incorporates the MIs present in the sensor array. Better DOA estimation can be achieved by minimising this fitness function. Moreover, the effectiveness of Newton's method as well as the GA for this optimisation problem has been illustrated. Simulation results show that the proposed algorithm provides improved estimation accuracy compared to existing algorithms, especially in the case of low SNR, fewer snapshots, closely spaced sources and high signal and noise correlation. Moreover, it is observed that optimisation using Newton's method is more likely to converge to false local optima, resulting in erroneous results, whereas GA-based optimisation is attractive due to its global optimisation capability.
Algorithme intelligent d'optimisation d'un design structurel de grande envergure [Intelligent optimisation algorithm for a large-scale structural design]
NASA Astrophysics Data System (ADS)
Dominique, Stephane
The implementation of an automated decision support system in the field of design and structural optimisation can give a significant advantage to any industry working on mechanical designs. Indeed, by providing solution ideas to a designer or by upgrading existing design solutions while the designer is not at work, the system may reduce the project cycle time, or allow more time to produce a better design. This thesis presents a new approach to automate a design process based on Case-Based Reasoning (CBR), in combination with a new genetic algorithm named Genetic Algorithm with Territorial core Evolution (GATE). This approach was developed in order to reduce the operating cost of the process. However, as the system implementation cost is quite high, the approach is better suited to large-scale design problems, and particularly to design problems that the designer plans to solve for many different specification sets. First, the CBR process uses a databank filled with every known solution to similar design problems. Then, the closest solutions to the current problem in terms of specifications are selected. After this, during the adaptation phase, an artificial neural network (ANN) interpolates amongst known solutions to produce an additional solution to the current problem using the current specifications as inputs. Each solution produced and selected by the CBR is then used to initialise the population of an island of the genetic algorithm. The algorithm optimises the solution further during the refinement phase. Using progressive refinement, the algorithm starts using only the most important variables for the problem. Then, as the optimisation progresses, the remaining variables are gradually introduced, layer by layer. The genetic algorithm used is a new algorithm specifically created during this thesis to solve optimisation problems from the field of mechanical device structural design. The algorithm is named GATE, and is essentially a real-number genetic algorithm that prevents new individuals from being born too close to previously evaluated solutions. The restricted area becomes smaller or larger during the optimisation to allow global or local search when necessary. Also, a new search operator named the Substitution Operator is incorporated in GATE. This operator allows an ANN surrogate model to guide the algorithm toward the most promising areas of the design space. The suggested CBR approach and GATE were tested on several simple test problems, as well as on the industrial problem of designing a gas turbine engine rotor's disc. These results are compared to results obtained for the same problems by many other popular optimisation algorithms, such as (depending on the problem) gradient algorithms, a binary genetic algorithm, a real-number genetic algorithm, a genetic algorithm using multiple-parent crossovers, a differential evolution genetic algorithm, the Hooke & Jeeves generalised pattern search method and POINTER from the software I-SIGHT 3.5. Results show that GATE is quite competitive, giving the best results for 5 of the 6 constrained optimisation problems. GATE also provided the best results of all on the problem produced by a Maximum Set Gaussian landscape generator. Finally, GATE provided a disc 4.3% lighter than the best other tested algorithm (POINTER) for the gas turbine engine rotor's disc problem. One drawback of GATE is a lower efficiency for highly multimodal unconstrained problems, for which it gave quite poor results with respect to its implementation cost.
To conclude, according to the preliminary results obtained during this thesis, the suggested CBR process combined with GATE appears to be a very good candidate for automating and accelerating the structural design of mechanical devices, potentially reducing the cost of industrial preliminary design processes significantly.
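The territorial mechanism at the heart of GATE can be illustrated with a short sketch: a real-number genetic algorithm that rejects offspring falling inside an exclusion radius around previously evaluated points, and shrinks that radius when too many candidates are rejected so that local search becomes possible. The operators, parameter values and radius-adaptation rule below are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

def territorial_ga(f, bounds, pop_size=20, generations=100, radius=0.1, seed=0):
    """Minimal real-coded GA that keeps new individuals away from previously
    evaluated points (a GATE-like territorial restriction, illustrative only)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    evaluated = [p.copy() for p in pop]              # archive of visited points
    fitness = np.array([f(p) for p in pop])
    for _ in range(generations):
        children, rejected = [], 0
        while len(children) < pop_size:
            i, j = rng.choice(pop_size, 2, replace=False)
            alpha = rng.random()
            child = alpha * pop[i] + (1 - alpha) * pop[j]      # blend crossover
            child += rng.normal(0.0, 0.05 * (hi - lo))         # Gaussian mutation
            child = np.clip(child, lo, hi)
            dists = np.linalg.norm(np.array(evaluated) - child, axis=1)
            if dists.min() < radius:                           # inside a "territory": reject
                rejected += 1
                if rejected > 5 * pop_size:                    # ease off to allow local search
                    radius *= 0.5
                    rejected = 0
                continue
            children.append(child)
        children = np.array(children)
        child_fit = np.array([f(c) for c in children])
        evaluated.extend(children)
        all_pop = np.vstack([pop, children])                   # elitist replacement
        all_fit = np.concatenate([fitness, child_fit])
        best = np.argsort(all_fit)[:pop_size]
        pop, fitness = all_pop[best], all_fit[best]
    return pop[0], fitness[0]

# Example: minimise the sphere function in 2D
best_x, best_f = territorial_ga(lambda x: float(np.sum(x**2)),
                                np.array([[-5.0, 5.0], [-5.0, 5.0]]))
```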
Hearns, S; Shirley, P J
2006-01-01
Retrieval and transfer of critically ill and injured patients is a high risk activity. Risk can be minimised with robust safety and clinical governance systems in place. This article describes the various governance systems that can be employed to optimise safety and efficiency in retrieval services. These include operating procedure development, equipment management, communications procedures, crew resource management, significant event analysis, audit and training. PMID:17130608
Multi-Optimisation Consensus Clustering
NASA Astrophysics Data System (ADS)
Li, Jian; Swift, Stephen; Liu, Xiaohui
Ensemble Clustering has been developed to provide an alternative way of obtaining more stable and accurate clustering results. It aims to avoid the biases of individual clustering algorithms. However, it is still a challenge to develop an efficient and robust method for Ensemble Clustering. Based on an existing ensemble clustering method, Consensus Clustering (CC), this paper introduces an advanced Consensus Clustering algorithm called Multi-Optimisation Consensus Clustering (MOCC), which utilises an optimised Agreement Separation criterion and a Multi-Optimisation framework to improve the performance of CC. Fifteen different data sets are used for evaluating the performance of MOCC. The results reveal that MOCC can generate more accurate clustering results than the original CC algorithm.
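The consensus clustering scheme that MOCC builds on can be sketched as follows: accumulate a co-association (agreement) matrix over an ensemble of base clusterings and then cluster that matrix to obtain the consensus partition. The sketch below uses k-means as the base learner and average-linkage clustering of the agreement matrix; it illustrates generic consensus clustering only, not the optimised Agreement Separation criterion or the Multi-Optimisation framework of MOCC.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def consensus_cluster(X, n_clusters, n_runs=20, seed=0):
    """Build a co-association (agreement) matrix from an ensemble of k-means
    runs and cluster it to obtain a consensus partition (generic sketch)."""
    n = X.shape[0]
    agreement = np.zeros((n, n))
    for r in range(n_runs):
        labels = KMeans(n_clusters=n_clusters, n_init=1,
                        random_state=seed + r).fit_predict(X)
        agreement += (labels[:, None] == labels[None, :])
    agreement /= n_runs                       # fraction of runs agreeing on each pair
    d = 1.0 - agreement                       # disagreement acts as a distance
    Z = linkage(squareform(d, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# Example: two well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
labels = consensus_cluster(X, n_clusters=2)
```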
Frameworks for change in healthcare organisations: a formative evaluation of the NHS Change Model.
Martin, Graham P; Sutton, Elizabeth; Willars, Janet; Dixon-Woods, Mary
2013-08-01
Organisational change in complex healthcare systems is a multifaceted process. The English National Health Service recently introduced a 'Change Model' that seeks to offer an evidence-based framework for guiding change. We report findings from a formative evaluation of the NHS Change Model and make recommendations for those developing the Model and its users. The evaluation involved 28 interviews with managers and clinicians making use of the Change Model in relation to a variety of projects. Interviews were fully transcribed and were analysed using an approach based on the Framework method. Participants saw the Change Model as valuable and practically useful. Fidelity to core principles of the Model was variable: participants often altered the Model, especially when using it to orchestrate the work of others. In challenging organisational contexts, the Change Model was sometimes used to delegitimise opposition rather than identify shared purpose among different interest groups. Those guiding change may benefit from frameworks, guidance and toolkits to structure and inform their planning and activities. Participants' experiences suggested the Change Model has much potential. Further work on its design and on supporting materials may optimise the approach, but its utility rests in particular on organisational cultures that support faithful application. © The Author(s) 2013. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B; Schürmann, Felix; Segev, Idan; Markram, Henry
2016-01-01
At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases.
A novel global Harmony Search method based on Ant Colony Optimisation algorithm
NASA Astrophysics Data System (ADS)
Fouad, Allouani; Boukhetala, Djamel; Boudjema, Fares; Zenger, Kai; Gao, Xiao-Zhi
2016-03-01
The Global-best Harmony Search (GHS) is a recently developed stochastic optimisation algorithm that hybridises the Harmony Search (HS) method with the concept of swarm intelligence from particle swarm optimisation (PSO) to enhance its performance. In this article, a new optimisation algorithm called GHSACO is developed by incorporating the GHS with the Ant Colony Optimisation algorithm (ACO). Our method introduces a novel improvisation process, which differs from that of the GHS in the following aspects: (i) a modified harmony memory (HM) representation and conception; (ii) the use of a global random switching mechanism to monitor the choice between the ACO and GHS; and (iii) an additional memory consideration selection rule using the ACO random proportional transition rule with a pheromone trail update mechanism. The proposed GHSACO algorithm has been applied to various benchmark functions and constrained optimisation problems. Simulation results demonstrate that it can find significantly better solutions when compared with the original HS and some of its variants.
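For readers unfamiliar with the underlying improvisation step, the sketch below shows basic Harmony Search (memory consideration, pitch adjustment, random selection) on a continuous problem. It is a minimal baseline under assumed parameter values; the GHSACO contributions (modified HM representation, ACO/GHS switching, pheromone-guided memory selection) are not reproduced.

```python
import numpy as np

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=2000, seed=0):
    """Basic Harmony Search: memory consideration / pitch adjustment / random
    selection.  A minimal baseline only, not the GHSACO algorithm."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    hm = rng.uniform(lo, hi, size=(hms, dim))          # harmony memory
    cost = np.array([f(x) for x in hm])
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                    # memory consideration
                new[j] = hm[rng.integers(hms), j]
                if rng.random() < par:                 # pitch adjustment
                    new[j] += bw * (hi[j] - lo[j]) * rng.uniform(-1, 1)
            else:                                      # random selection
                new[j] = rng.uniform(lo[j], hi[j])
        new = np.clip(new, lo, hi)
        c = f(new)
        worst = int(np.argmax(cost))
        if c < cost[worst]:                            # replace the worst harmony
            hm[worst], cost[worst] = new, c
    best = int(np.argmin(cost))
    return hm[best], cost[best]

# Example: Rosenbrock function in 2D
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
x_best, f_best = harmony_search(rosen, np.array([[-2.0, 2.0], [-2.0, 2.0]]))
```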
Achieving optimal SERS through enhanced experimental design.
Fisk, Heidi; Westley, Chloe; Turner, Nicholas J; Goodacre, Royston
2016-01-01
One of the current limitations surrounding surface-enhanced Raman scattering (SERS) is the perceived lack of reproducibility. SERS is indeed challenging, and for analyte detection, it is vital that the analyte interacts with the metal surface. However, as this is analyte dependent, there is not a single set of SERS conditions that are universal. This means that experimental optimisation for optimum SERS response is vital. Most researchers optimise one factor at a time, where a single parameter is altered first before going onto optimise the next. This is a very inefficient way of searching the experimental landscape. In this review, we explore the use of more powerful multivariate approaches to SERS experimental optimisation based on design of experiments and evolutionary computational methods. We particularly focus on colloidal-based SERS rather than thin film preparations as a result of their popularity. © 2015 The Authors. Journal of Raman Spectroscopy published by John Wiley & Sons, Ltd.
Optimisation of cavity parameters for lasers based on AlGaInAsP/InP solid solutions (λ = 1470 nm)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veselov, D A; Ayusheva, K R; Shashkin, I S
2015-10-31
We have studied the effect of laser cavity parameters on the light–current characteristics of lasers based on the AlGaInAs/GaInAsP/InP solid solution system that emit in the spectral range 1400 – 1600 nm. It has been shown that optimisation of cavity parameters (chip length and front facet reflectivity) allows one to improve heat removal from the laser, without changing other laser characteristics. An increase in the maximum output optical power of the laser by 0.5 W has been demonstrated due to cavity design optimisation. (lasers)
Fractures in sport: Optimising their management and outcome
Robertson, Greg AJ; Wood, Alexander M
2015-01-01
Fractures in sport are a specialised cohort of fracture injuries, occurring in a high functioning population, in which the goals are rapid restoration of function and return to play with the minimal symptom profile possible. While the general principles of fracture management, namely accurate fracture reduction, appropriate immobilisation and timely rehabilitation, guide the treatment of these injuries, management of fractures in athletic populations can differ significantly from that in the general population, due to the need to facilitate a rapid return to high demand activities. However, despite fractures comprising up to 10% of all sporting injuries, dedicated research into the management and outcome of sport-related fractures is limited. In order to assess the optimal methods of treating such injuries, and so allow optimisation of their outcome, the evidence for the management of each specific sport-related fracture type requires assessment and analysis. We present and review the current evidence directing management of fractures in athletes with an aim to promote valid innovative methods and optimise the outcome of such injuries. From this, key recommendations are provided for the management of the common fracture types seen in the athlete. Six case reports are also presented to illustrate the management planning and application of sport-focussed fracture management in the clinical setting. PMID:26716081
Almén, Anja; Båth, Magnus
2016-06-01
The overall aim of the present work was to develop a conceptual framework for managing radiation dose in diagnostic radiology with the intention to support optimisation. An optimisation process was first derived. The framework for managing radiation dose, based on the derived optimisation process, was then outlined. The starting point of the optimisation process is four stages: providing equipment, establishing methodology, performing examinations and ensuring quality. The optimisation process comprises a series of activities and actions at these stages. The current system of diagnostic reference levels is an activity in the last stage, ensuring quality. The system is thus a reactive activity that only to a limited extent engages the core activity of the radiology department, performing examinations. Three reference dose levels-possible, expected and established-were assigned to the three stages in the optimisation process, excluding ensuring quality. A reasonably achievable dose range is also derived, indicating an acceptable deviation from the established dose level. A reasonable radiation dose for a single patient is within this range. The suggested framework for managing radiation dose should be regarded as one part of the optimisation process. The optimisation process comprises a variety of complementary activities, of which managing radiation dose is only one. This emphasises the need to take a holistic approach integrating the optimisation process in different clinical activities. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Synthesis of concentric circular antenna arrays using dragonfly algorithm
NASA Astrophysics Data System (ADS)
Babayigit, B.
2018-05-01
Due to the strong non-linear relationship between the array factor and the array elements, the concentric circular antenna array (CCAA) synthesis problem is challenging. Nature-inspired optimisation techniques have been playing an important role in solving array synthesis problems. The dragonfly algorithm (DA) is a novel nature-inspired optimisation technique based on the static and dynamic swarming behaviours of dragonflies in nature. This paper presents the design of CCAAs with low sidelobes using DA. The effectiveness of the proposed DA is investigated for two three-ring CCAA designs (with 4-, 6-, 8-element or 8-, 10-, 12-element rings), each considered with and without a centre element. The radiation pattern of each design case is obtained by finding the optimal excitation weights of the array elements using DA. Simulation results show that the proposed algorithm outperforms other state-of-the-art techniques (symbiotic organisms search, biogeography-based optimisation, sequential quadratic programming, opposition-based gravitational search algorithm, cat swarm optimisation, firefly algorithm, evolutionary programming) for all design cases. DA can be a promising technique for electromagnetic problems.
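The quantity such a metaheuristic scores is the array factor of the concentric circular array. A minimal sketch is given below, assuming isotropic elements, ring radii expressed in wavelengths and the phi = 0 pattern cut; the excitation weights would be the decision variables handed to DA (or any other optimiser), and the sidelobe level extracted from the resulting pattern would form the fitness.

```python
import numpy as np

def ccaa_array_factor(weights, radii, elements_per_ring, theta, centre=True):
    """Array factor of a concentric circular antenna array with isotropic
    elements, evaluated in the phi = 0 plane (common textbook form, used
    here only to illustrate the quantity an optimiser would score).
    `weights` holds the excitation amplitudes, ring by ring."""
    k = 2 * np.pi                                    # wavenumber for radii in wavelengths
    af = np.ones_like(theta, dtype=complex) if centre else np.zeros_like(theta, dtype=complex)
    idx = 0
    for r, n in zip(radii, elements_per_ring):
        phi_n = 2 * np.pi * np.arange(n) / n         # angular element positions on the ring
        for p in phi_n:
            af += weights[idx] * np.exp(1j * k * r * np.sin(theta) * np.cos(0.0 - p))
            idx += 1
    return np.abs(af)

# Example: uniform excitation, three rings with 4, 6 and 8 elements
theta = np.linspace(-np.pi / 2, np.pi / 2, 721)
radii = [0.5, 1.0, 1.5]                              # ring radii in wavelengths (assumed)
nel = [4, 6, 8]
w = np.ones(sum(nel))
af = ccaa_array_factor(w, radii, nel, theta)
af_db = 20 * np.log10(af / af.max())                 # normalised pattern in dB
```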
UAV path planning using artificial potential field method updated by optimal control theory
NASA Astrophysics Data System (ADS)
Chen, Yong-bo; Luo, Guan-chen; Mei, Yue-song; Yu, Jian-qiao; Su, Xiao-long
2016-04-01
The unmanned aerial vehicle (UAV) path planning problem is an important task in UAV mission planning. Starting from the artificial potential field (APF) UAV path planning method, the problem is reconstructed as a constrained optimisation problem by introducing an additional control force. The constrained optimisation problem is translated into an unconstrained optimisation problem with the help of slack variables in this paper. The functional optimisation method is applied to reformulate this problem as an optimal control problem. The whole transformation process is deduced in detail, based on a discrete UAV dynamic model. Then, the path planning problem is solved with the help of the optimal control method. A path following process based on a six-degrees-of-freedom simulation model of a quadrotor helicopter is introduced to verify the practicability of this method. Finally, the simulation results show that the improved method is more effective in path planning. In the planning space, the calculated path is shorter and smoother than that obtained using the traditional APF method. In addition, the improved method can solve the dead point problem effectively.
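The baseline the paper starts from, gradient descent on attractive and repulsive potentials, can be sketched as follows. The gains, step size and obstacle model are illustrative assumptions, and the paper's additional control force, slack variables and optimal-control solution are not included; note that this plain version is exactly the one that can stall at the dead points mentioned above.

```python
import numpy as np

def apf_path(start, goal, obstacles, k_att=1.0, k_rep=100.0, d0=2.0,
             step=0.05, max_iter=5000, tol=0.1):
    """Classic artificial potential field planner: follow the sum of an
    attractive force towards the goal and repulsive forces around obstacles.
    Baseline APF only; the paper's optimal-control reformulation is omitted."""
    p = np.asarray(start, dtype=float)
    goal = np.asarray(goal, dtype=float)
    path = [p.copy()]
    for _ in range(max_iter):
        f = k_att * (goal - p)                               # attractive force
        for ob in obstacles:
            d = np.linalg.norm(p - ob)
            if 1e-9 < d < d0:                                # repulsion acts only within range d0
                f += k_rep * (1.0 / d - 1.0 / d0) / d**2 * (p - ob) / d
        n = np.linalg.norm(f)
        if n > 1e-9:
            p = p + step * f / n                             # unit step along the total force
        path.append(p.copy())
        if np.linalg.norm(goal - p) < tol:
            break
    return np.array(path)

# Example: 2D plan around a single obstacle
path = apf_path(start=(0.0, 0.0), goal=(10.0, 10.0), obstacles=[np.array([5.0, 5.2])])
```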
Thermal buckling optimisation of composite plates using firefly algorithm
NASA Astrophysics Data System (ADS)
Kamarian, S.; Shakeri, M.; Yas, M. H.
2017-07-01
Composite plates play a very important role in engineering applications, especially in the aerospace industry. The thermal buckling of such components is of great importance and must be known to achieve an appropriate design. This paper deals with the stacking sequence optimisation of laminated composite plates for maximising the critical buckling temperature using a powerful meta-heuristic algorithm called the firefly algorithm (FA), which is based on the flashing behaviour of fireflies. The main objective of the present work was to show the ability of FA in the optimisation of composite structures. The performance of FA is compared with the results reported in previously published works using other algorithms, which shows the efficiency of FA in the stacking sequence optimisation of laminated composite structures.
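A minimal continuous version of the firefly algorithm is sketched below, with assumed values for the attractiveness, absorption and randomisation parameters. The thermal-buckling application would additionally map each position to a discrete stacking sequence of ply angles and evaluate the critical buckling temperature; that encoding and the buckling analysis are omitted here.

```python
import numpy as np

def firefly(f, bounds, n=25, iters=200, beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    """Standard firefly algorithm for maximisation of f (continuous version,
    illustrative parameter values)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, size=(n, len(lo)))
    light = np.array([f(xi) for xi in x])               # brightness = objective value
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] > light[i]:                  # move firefly i towards brighter j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(len(lo)) - 0.5) * (hi - lo)
                    x[i] = np.clip(x[i], lo, hi)
                    light[i] = f(x[i])
        alpha *= 0.97                                    # slowly reduce randomisation
    best = int(np.argmax(light))
    return x[best], light[best]

# Example: maximise a simple concave function in three variables
xb, fb = firefly(lambda v: -np.sum((v - 1.0) ** 2), np.array([[-5.0, 5.0]] * 3))
```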
Distributed convex optimisation with event-triggered communication in networked systems
NASA Astrophysics Data System (ADS)
Liu, Jiayun; Chen, Weisheng
2016-12-01
This paper studies the distributed convex optimisation problem over directed networks. Motivated by practical considerations, we propose a novel distributed zero-gradient-sum optimisation algorithm with event-triggered communication. Consequently, communication and control updates occur only at discrete instants, when a predefined condition is satisfied. Thus, compared with time-driven distributed optimisation algorithms, the proposed algorithm has the advantages of lower energy consumption and lower communication cost. Based on Lyapunov approaches, we show that the proposed algorithm makes the system states converge exponentially fast to the solution of the problem and that Zeno behaviour is excluded. Finally, a simulation example is given to illustrate the effectiveness of the proposed algorithm.
Single tube genotyping of sickle cell anaemia using PCR-based SNP analysis.
Waterfall, C M; Cobb, B D
2001-12-01
Allele-specific amplification (ASA) is a generally applicable technique for the detection of known single nucleotide polymorphisms (SNPs), deletions, insertions and other sequence variations. Conventionally, two reactions are required to determine the zygosity of DNA in a two-allele system, along with significant upstream optimisation to define the specific test conditions. Here, we combine single tube bi-directional ASA with a 'matrix-based' optimisation strategy, speeding up the whole process in a reduced reaction set. We use sickle cell anaemia as our model SNP system, a genetic disease that is currently screened using ASA methods. Discriminatory conditions were rapidly optimised enabling the unambiguous identification of DNA from homozygous sickle cell patients (HbS/S), heterozygous carriers (HbA/S) or normal DNA in a single tube. Simple downstream mathematical analyses based on product yield across the optimisation set allow an insight into the important aspects of priming competition and component interactions in this competitive PCR. This strategy can be applied to any polymorphism, defining specific conditions using a multifactorial approach. The inherent simplicity and low cost of this PCR-based method validates bi-directional ASA as an effective tool in future clinical screening and pharmacogenomic research where more expensive fluorescence-based approaches may not be desirable.
Person-centred medicines optimisation policy in England: an agenda for research on polypharmacy.
Heaton, Janet; Britten, Nicky; Krska, Janet; Reeve, Joanne
2017-01-01
Aim: To examine how patient perspectives and person-centred care values have been represented in documents on medicines optimisation policy in England. There has been growing support in England for a policy of medicines optimisation as a response to the rise of problematic polypharmacy. Conceptually, medicines optimisation differs from the medicines management model of prescribing in being based around the patient rather than processes and systems. This critical examination of current official and independent policy documents questions how central the patient is in them and whether relevant evidence has been utilised in their development. A documentary analysis was conducted of reports on medicines optimisation published by the Royal Pharmaceutical Society (RPS), The King's Fund and the National Institute for Health and Care Excellence since 2013. The analysis draws on a non-systematic review of research on patient experiences of using medicines. Findings: The reports varied in their inclusion of patient perspectives and person-centred care values, and in the extent to which they drew on evidence from research on patients' experiences of polypharmacy and medicines use. In the RPS report, medicines optimisation is represented as being a 'step change' from medicines management, in contrast to the other documents, which suggest that it is facilitated by the systems and processes that comprise the latter model. Only The King's Fund report considered evidence from qualitative studies of people's use of medicines. However, these studies are not without their limitations. We suggest five ways in which researchers could improve this evidence base and so inform the development of future policy: by facilitating reviews of existing research; conducting studies of patient experiences of polypharmacy and multimorbidity; evaluating medicines optimisation interventions; making better use of relevant theories, concepts and tools; and improving patient and public involvement in research and in guideline development.
NASA Astrophysics Data System (ADS)
Kharbouch, Yassine; Mimet, Abdelaziz; El Ganaoui, Mohammed; Ouhsaine, Lahoucine
2018-07-01
This study investigates the thermal energy potential and economic feasibility of an air-conditioned family household integrating phase change materials (PCMs), considering different climate zones in Morocco. A simulation-based optimisation was carried out in order to define the optimal design of a PCM-enhanced household envelope for thermal energy effectiveness and the cost-effectiveness of predefined candidate solutions. The optimisation methodology is based on coupling EnergyPlus® as a dynamic simulation tool and GenOpt® as an optimisation tool. Considering the obtained optimum design strategies, a thermal energy and economic analysis is carried out to investigate the feasibility of integrating PCMs in Moroccan constructions. The results show that the PCM-integrated household envelope reduces the cooling/heating thermal energy demand compared with a reference household without PCM. For the cost-effectiveness optimisation, it was deduced that the economic feasibility is still insufficient under current PCM market conditions. The optimal design parameter results are also analysed.
Evolving optimised decision rules for intrusion detection using particle swarm paradigm
NASA Astrophysics Data System (ADS)
Sivatha Sindhu, Siva S.; Geetha, S.; Kannan, A.
2012-12-01
The aim of this article is to construct a practical intrusion detection system (IDS) that properly analyses the statistics of network traffic patterns and classifies them as normal or anomalous. The objective of this article is to prove that the choice of effective network traffic features and a proficient machine-learning paradigm enhances the detection accuracy of an IDS. In this article, a rule-based approach with a family of six decision tree classifiers, namely the Decision Stump, C4.5, Naive Bayes Tree, Random Forest, Random Tree and Representative Tree models, is introduced to perform the detection of anomalous network patterns. In particular, the proposed swarm optimisation-based approach selects the instances that compose the training set, and an optimised decision tree operates over this training set, producing classification rules with improved coverage, classification capability and generalisation ability. Experiments with the Knowledge Discovery and Data Mining (KDD) data set, which contains information on traffic patterns during normal and intrusive behaviour, show that the proposed algorithm produces optimised decision rules and outperforms other machine-learning algorithms.
NASA Astrophysics Data System (ADS)
Grundmann, J.; Schütze, N.; Heck, V.
2014-09-01
Groundwater systems in arid coastal regions are particularly at risk due to limited potential for groundwater replenishment and increasing water demand, caused by a continuously growing population. For ensuring a sustainable management of those regions, we developed a new simulation-based integrated water management system. The management system unites process modelling with artificial intelligence tools and evolutionary optimisation techniques for managing both water quality and water quantity of a strongly coupled groundwater-agriculture system. Due to the large number of decision variables, a decomposition approach is applied to separate the original large optimisation problem into smaller, independent optimisation problems which finally allow for faster and more reliable solutions. It consists of an analytical inner optimisation loop to achieve a most profitable agricultural production for a given amount of water and an outer simulation-based optimisation loop to find the optimal groundwater abstraction pattern. Thereby, the behaviour of farms is described by crop-water-production functions and the aquifer response, including the seawater interface, is simulated by an artificial neural network. The methodology is applied exemplarily for the south Batinah region/Oman, which is affected by saltwater intrusion into a coastal aquifer system due to excessive groundwater withdrawal for irrigated agriculture. Due to contradicting objectives like profit-oriented agriculture vs aquifer sustainability, a multi-objective optimisation is performed which can provide sustainable solutions for water and agricultural management over long-term periods at farm and regional scales in respect of water resources, environment, and socio-economic development.
Nurse strategies for optimising patient participation in nursing care.
Sahlsten, Monika J M; Larsson, Inga E; Sjöström, Björn; Plos, Kaety A E
2009-09-01
THE STUDY'S RATIONALE: Patient participation is an essential factor in nursing care and medical treatment and a legal right in many countries. Despite this, patients have experienced insufficient participation, inattention and neglect regarding their problems and may respond with dependence, passivity or taciturnity. Accordingly, nurses' strategies for optimising patient participation in nursing care are an important question for the nursing profession. The aim was to explore Registered Nurses' strategies to stimulate and optimise patient participation in nursing care. The objective was to identify ward nurses' supporting practices. A qualitative research approach was applied. Three focus groups with experienced Registered Nurses providing inpatient somatic care (n = 16) were carried out. These nurses were recruited from three hospitals in West Sweden. The data were analysed using a content analysis technique. The ethics of scientific work was adhered to. According to national Swedish legislation, no formal permit from an ethics committee was required. The participants gave informed consent after verbal and written information. Nurse strategies for optimising patient participation in nursing care were identified as three categories: 'Building close co-operation', 'Getting to know the person' and 'Reinforcing self-care capacity', with their 10 subcategories. The strategies point to a process of emancipating the patient's potential by finding his/her own inherent knowledge, values, motivation and goals and linking these to actions. Nurses need to strive to guide the patient towards attaining meaningful experiences, discoveries, learning and development. The strategies are important and useful for balancing the asymmetry in the nurse-patient relationship in daily nursing practice, and also in quality assurance to evaluate and improve patient participation and in education. However, further verification of the findings is recommended by means of replication or other studies in different clinical settings. © 2009 The Authors. Journal compilation © 2009 Nordic College of Caring Science.
Echocardiography and cardiac resynchronisation therapy, friends or foes?
van Everdingen, W M; Schipper, J C; van 't Sant, J; Ramdat Misier, K; Meine, M; Cramer, M J
2016-01-01
Echocardiography is used in cardiac resynchronisation therapy (CRT) to assess cardiac function, and in particular left ventricular (LV) volumetric status, and prediction of response. Despite its widespread applicability, LV volumes determined by echocardiography have inherent measurement errors, interobserver and intraobserver variability, and discrepancies with the gold standard magnetic resonance imaging. Echocardiographic predictors of CRT response are based on mechanical dyssynchrony. However, parameters are mainly tested in single-centre studies or lack feasibility. Speckle tracking echocardiography can guide LV lead placement, improving volumetric response and clinical outcome by guiding lead positioning towards the latest contracting segment. Results on optimisation of CRT device settings using echocardiographic indices have so far been rather disappointing, as results suffer from noise. Defining response by echocardiography seems valid, although re-assessment after 6 months is advisable, as patients can show both continuous improvement as well as deterioration after the initial response. Three-dimensional echocardiography is interesting for future implications, as it can determine volume, dyssynchrony and viability in a single recording, although image quality needs to be adequate. Deformation patterns from the septum and the derived parameters are promising, although validation in a multicentre trial is required. We conclude that echocardiography has a pivotal role in CRT, although clinicians should know its shortcomings.
Optimisation of wire-cut EDM process parameter by Grey-based response surface methodology
NASA Astrophysics Data System (ADS)
Kumar, Amit; Soota, Tarun; Kumar, Jitendra
2018-03-01
Wire electric discharge machining (WEDM) is one of the advanced machining processes. Response surface methodology coupled with the Grey relational analysis method is proposed and used to optimise the machining parameters of WEDM. A face-centred cubic design is used for conducting experiments on high speed steel (HSS) M2 grade workpiece material. The regression model of significant factors, such as pulse-on time, pulse-off time, peak current and wire feed, is considered for optimising the response variables: material removal rate (MRR), surface roughness and kerf width. The optimal machining parameter settings were obtained using the Grey relational grade. ANOVA is applied to determine the significance of the input parameters with respect to the Grey relational grade.
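The Grey relational grade that drives this optimisation can be computed as sketched below: normalise each response (larger-the-better for MRR, smaller-the-better for surface roughness and kerf width), form Grey relational coefficients with a distinguishing coefficient zeta, and average them. The formulation is the generic textbook one with assumed defaults (zeta = 0.5, equal weights) and made-up response values, not the study's data.

```python
import numpy as np

def grey_relational_grade(responses, larger_better, zeta=0.5, weights=None):
    """Grey relational analysis: normalise each response column, compute Grey
    relational coefficients and average them into a single grade per
    experiment (generic formulation with illustrative defaults)."""
    r = np.asarray(responses, dtype=float)
    norm = np.empty_like(r)
    for k, lb in enumerate(larger_better):
        col = r[:, k]
        if lb:   # larger-the-better, e.g. material removal rate
            norm[:, k] = (col - col.min()) / (col.max() - col.min())
        else:    # smaller-the-better, e.g. surface roughness, kerf width
            norm[:, k] = (col.max() - col) / (col.max() - col.min())
    delta = 1.0 - norm                          # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    w = np.full(r.shape[1], 1.0 / r.shape[1]) if weights is None else np.asarray(weights)
    return coeff @ w                            # Grey relational grade per experiment

# Example: three experiments with (MRR, surface roughness, kerf width) - made-up values
resp = [[5.2, 2.8, 0.32], [6.1, 3.4, 0.35], [4.8, 2.5, 0.30]]
grade = grey_relational_grade(resp, larger_better=[True, False, False])
best_run = int(np.argmax(grade))
```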
Buwembo, William; Munabi, Ian G; Galukande, Moses; Kituuka, Olivia; Luboga, Samuel A
2014-01-01
The ever increasing demand for surgical services in sub-Saharan Africa is creating a need to increase the number of health workers able to provide surgical care. This calls for the optimisation of all available human resources to provide universal access to essential and emergency surgical services. One way of optimising already scarce human resources for health is by clarifying job descriptions to guide the scope of practice, measuring rewards/benefits for the health workers providing surgical care, and informing education and training for health professionals. This study set out to determine the scope of the mandate to perform surgical procedures in current job descriptions of surgical care health professionals in Uganda. A document review was conducted of job descriptions for the health professionals responsible for surgical service delivery in the Ugandan health care system. The job descriptions were extracted and subjected to a qualitative content analysis approach using the text-based RQDA package for the open-source R statistical computing software. No job description explicitly assigned the delivery of surgical services to a particular cadre. Instead, the bulk of direct patient-related care, including surgical attention, was assigned to the lower cadres, in particular the medical officer. Senior cadres were assigned predominantly advisory and managerial roles in the health care system. In addition, a no-cost opportunity to task-shift surgical service delivery to senior clinical officers was identified. There is a need to specifically assign the mandate to provide surgical care tasks, according to degree of complexity, to adequately trained cadres of health workers. Health professionals' current job descriptions are not explicit, and therefore do not adequately support proper training, deployment, defined scope of practice, and remuneration for equitable surgical service delivery in Uganda. Such deliberate assignment of mandates will provide a means of increasing surgical service delivery through further optimisation of the available human resources for health.
Garner, Alan A; van den Berg, Pieter L
2017-10-16
New South Wales (NSW), Australia has a network of multirole retrieval physician-staffed helicopter emergency medical services (HEMS), with seven bases servicing a jurisdiction whose population is concentrated along the eastern seaboard. The aim of this study was to estimate optimal HEMS base locations within NSW using advanced mathematical modelling techniques. We used high-resolution census population data for NSW from 2011, which divides the state into areas containing 200-800 people. Optimal HEMS base locations were estimated using the maximal covering location problem facility location optimisation model and the average response time model, exploring the number of bases needed to cover various fractions of the population for a 45 min response time threshold or minimising the overall average response time to all persons, both in greenfield scenarios and conditioning on the current base structure. We also developed a hybrid mathematical model in which average response time was optimised subject to minimum population coverage thresholds. Seven bases could cover 98% of the population within 45 min when optimised for coverage, or reach the entire population of the state within an average of 21 min if optimised for response time. Given the existing bases, adding two bases could either increase the 45 min coverage from 91% to 97% or decrease the average response time from 21 min to 19 min. Adding a single specialist prehospital rapid response HEMS to the area of greatest population concentration decreased the average state-wide response time by 4 min. The optimum seven-base hybrid model, which covered 97.75% of the population within 45 min and reached the entire population with an average response time of 18 min, included the rapid response HEMS base. HEMS base locations can be optimised based on either the percentage of the population covered or the average response time to the entire population. We have also demonstrated a hybrid technique that optimises response time for a given number of bases and a minimum defined threshold of population coverage. The addition of specialised rapid response HEMS services to a system of multirole retrieval HEMS may reduce overall average response times by improving access in large urban areas.
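The maximal covering location problem referred to above is normally solved exactly as an integer programme; a greedy sketch nonetheless conveys the idea of repeatedly adding the base that covers the most as-yet-uncovered population within the 45 min threshold. The data, candidate set and the option of conditioning on existing bases below are illustrative assumptions, not the study's model or inputs.

```python
import numpy as np

def greedy_mclp(pop, resp_min, candidates, n_bases, threshold_min=45.0, fixed=()):
    """Greedy approximation to the maximal covering location problem: add the
    candidate base covering the largest uncovered population within the
    response-time threshold.  `resp_min[i, j]` is the estimated response time
    (minutes) from candidate base j to population cell i.  Illustrative only;
    the study used an exact optimisation model."""
    covered = np.zeros(len(pop), dtype=bool)
    chosen = list(fixed)                         # optionally condition on existing bases
    for b in chosen:
        covered |= resp_min[:, b] <= threshold_min
    while len(chosen) < n_bases:
        gains = [pop[(resp_min[:, j] <= threshold_min) & ~covered].sum()
                 if j not in chosen else -1.0 for j in candidates]
        best = candidates[int(np.argmax(gains))]
        chosen.append(best)
        covered |= resp_min[:, best] <= threshold_min
        print(f"base {best}: coverage now {pop[covered].sum() / pop.sum():.1%}")
    return chosen

# Example with random data (hypothetical population cells and candidate sites)
rng = np.random.default_rng(1)
pop = rng.integers(200, 800, size=500).astype(float)      # population per census cell
resp_min = rng.uniform(5, 90, size=(500, 30))             # response times to 30 candidates
bases = greedy_mclp(pop, resp_min, candidates=list(range(30)), n_bases=7)
```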
Power law-based local search in spider monkey optimisation for lower order system modelling
NASA Astrophysics Data System (ADS)
Sharma, Ajay; Sharma, Harish; Bhargava, Annapurna; Sharma, Nirmala
2017-01-01
The nature-inspired algorithms (NIAs) have shown efficiency in solving many complex real-world optimisation problems. The efficiency of NIAs is measured by their ability to find adequate results within a reasonable amount of time, rather than an ability to guarantee the optimal solution. This paper presents a solution for lower order system modelling using the spider monkey optimisation (SMO) algorithm, obtaining lower order approximations that reflect almost all of the original higher order system's characteristics. Further, a local search strategy, namely power law-based local search, is incorporated into SMO. The proposed strategy is named power law-based local search in SMO (PLSMO). The efficiency, accuracy and reliability of the proposed algorithm are tested on 20 well-known benchmark functions. Then, the PLSMO algorithm is applied to solve the lower order system modelling problem.
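The general idea of a power law-based local search, heavy-tailed step sizes that mix many fine refining moves with occasional larger jumps around the incumbent, can be sketched as follows. The bounded power-law sampler, greedy acceptance rule and parameter values are illustrative assumptions rather than the exact PLSMO operator.

```python
import numpy as np

def power_law_local_search(f, x_best, lo, hi, trials=50, exponent=2.0,
                           s_min=1e-3, s_max=1.0, seed=0):
    """Local search around the incumbent using power-law distributed step sizes
    (density ~ s**-exponent on [s_min, s_max], exponent != 1): mostly small
    refining moves with occasional larger jumps.  Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    x = np.array(x_best, dtype=float)
    fx = f(x)
    for _ in range(trials):
        u = rng.random()
        a = 1.0 - exponent
        s = (s_min**a + u * (s_max**a - s_min**a)) ** (1.0 / a)   # inverse-CDF sampling
        direction = rng.normal(size=x.size)
        direction /= np.linalg.norm(direction)
        cand = np.clip(x + s * (hi - lo) * direction, lo, hi)
        fc = f(cand)
        if fc < fx:                               # greedy acceptance (minimisation)
            x, fx = cand, fc
    return x, fx

# Example: refine a rough solution of a quadratic
lo, hi = np.full(4, -10.0), np.full(4, 10.0)
x0 = np.array([3.0, -2.0, 1.0, 4.0])
x_ref, f_ref = power_law_local_search(lambda v: float(np.sum(v**2)), x0, lo, hi)
```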
Zarb, Francis; McEntee, Mark F; Rainford, Louise
2015-06-01
To evaluate visual grading characteristics (VGC) and ordinal regression analysis during head CT optimisation as a potential alternative to visual grading assessment (VGA), traditionally employed to score anatomical visualisation. Patient images (n = 66) were obtained using current and optimised imaging protocols from two CT suites: a 16-slice scanner at the national Maltese centre for trauma and a 64-slice scanner in a private centre. Local resident radiologists (n = 6) performed VGA followed by VGC and ordinal regression analysis. VGC alone indicated that optimised protocols had similar image quality as current protocols. Ordinal logistic regression analysis provided an in-depth evaluation, criterion by criterion allowing the selective implementation of the protocols. The local radiology review panel supported the implementation of optimised protocols for brain CT examinations (including trauma) in one centre, achieving radiation dose reductions ranging from 24 % to 36 %. In the second centre a 29 % reduction in radiation dose was achieved for follow-up cases. The combined use of VGC and ordinal logistic regression analysis led to clinical decisions being taken on the implementation of the optimised protocols. This improved method of image quality analysis provided the evidence to support imaging protocol optimisation, resulting in significant radiation dose savings. • There is need for scientifically based image quality evaluation during CT optimisation. • VGC and ordinal regression analysis in combination led to better informed clinical decisions. • VGC and ordinal regression analysis led to dose reductions without compromising diagnostic efficacy.
Melesina, Jelena; Robaa, Dina; Pierce, Raymond J; Romier, Christophe; Sippl, Wolfgang
2015-11-01
Histone deacetylases (HDACs) are promising epigenetic targets for the treatment of various diseases, including cancer and neurodegenerative disorders. There is evidence that they can also be addressed to treat parasitic infections. Recently, the first X-ray structure of a parasite HDAC was published, Schistosoma mansoni HDAC8, giving structural insights into its inhibition. However, most of the targets from parasites of interest still lack this structural information. Therefore, we prepared homology models of relevant parasitic HDACs and compared them to human and S. mansoni HDACs. The information about known S. mansoni HDAC8 inhibitors and compounds that affect the growth of Trypanosoma, Leishmania and Plasmodium species was used to validate the models by docking and molecular dynamics studies. Our results provide analysis of structural features of parasitic HDACs and should be helpful for selecting promising candidates for biological testing and for structure-based optimisation of parasite-specific inhibitors. Copyright © 2015 Elsevier Inc. All rights reserved.
Monte-Carlo-based uncertainty propagation with hierarchical models—a case study in dynamic torque
NASA Astrophysics Data System (ADS)
Klaus, Leonard; Eichstädt, Sascha
2018-04-01
For a dynamic calibration, a torque transducer is described by a mechanical model, and the corresponding model parameters are to be identified from measurement data. A measuring device for the primary calibration of dynamic torque, and a corresponding model-based calibration approach, have recently been developed at PTB. The complete mechanical model of the calibration set-up is very complex, and involves several calibration steps—making a straightforward implementation of a Monte Carlo uncertainty evaluation tedious. With this in mind, we here propose to separate the complete model into sub-models, with each sub-model being treated with individual experiments and analysis. The uncertainty evaluation for the overall model then has to combine the information from the sub-models in line with Supplement 2 of the Guide to the Expression of Uncertainty in Measurement. In this contribution, we demonstrate how to carry this out using the Monte Carlo method. The uncertainty evaluation involves various input quantities of different origin and the solution of a numerical optimisation problem.
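The principle of combining separately analysed sub-models by Monte Carlo, in the spirit of GUM Supplement 2, can be sketched as follows: draw samples of each sub-model's parameters from its own (assumed) distribution, push the joint draws through the combined model and summarise the output distribution. The samplers and the combined model below are placeholders, not PTB's torque-transducer model.

```python
import numpy as np

def monte_carlo_combine(sub_model_samplers, combined_model, n=200_000, seed=0):
    """Monte Carlo propagation through a model split into sub-models: each
    sampler returns draws of one sub-model's parameters (from its own
    experiment/analysis); the combined model maps a joint draw to the output
    quantity.  Distributions and model below are illustrative placeholders."""
    rng = np.random.default_rng(seed)
    draws = {name: sampler(rng, n) for name, sampler in sub_model_samplers.items()}
    y = combined_model(draws)
    return {"mean": float(np.mean(y)),
            "std": float(np.std(y, ddof=1)),
            "ci95": tuple(np.percentile(y, [2.5, 97.5]))}

# Hypothetical sub-models: stiffness, a damping factor and a moment of inertia,
# each with its own uncertainty, combined into a made-up frequency-type output.
samplers = {
    "k": lambda rng, n: rng.normal(1.8e3, 2.5e1, n),     # stiffness, N m/rad
    "d": lambda rng, n: rng.uniform(0.9, 1.1, n),        # damping correction factor
    "J": lambda rng, n: rng.normal(2.0e-3, 5.0e-5, n),   # moment of inertia, kg m^2
}
result = monte_carlo_combine(samplers,
                             lambda p: np.sqrt(p["k"] / p["J"]) * p["d"] / (2 * np.pi))
```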
Advanced treatment planning using direct 4D optimisation for pencil-beam scanned particle therapy
NASA Astrophysics Data System (ADS)
Bernatowicz, Kinga; Zhang, Ye; Perrin, Rosalind; Weber, Damien C.; Lomax, Antony J.
2017-08-01
We report on development of a new four-dimensional (4D) optimisation approach for scanned proton beams, which incorporates both irregular motion patterns and the delivery dynamics of the treatment machine into the plan optimiser. Furthermore, we assess the effectiveness of this technique to reduce dose to critical structures in proximity to moving targets, while maintaining effective target dose homogeneity and coverage. The proposed approach has been tested using both a simulated phantom and a clinical liver cancer case, and allows for realistic 4D calculations and optimisation using irregular breathing patterns extracted from e.g. 4DCT-MRI (4D computed tomography-magnetic resonance imaging). 4D dose distributions resulting from our 4D optimisation can achieve almost the same quality as static plans, independent of the studied geometry/anatomy or selected motion (regular and irregular). Additionally, current implementation of the 4D optimisation approach requires less than 3 min to find the solution for a single field planned on 4DCT of a liver cancer patient. Although 4D optimisation allows for realistic calculations using irregular breathing patterns, it is very sensitive to variations from the planned motion. Based on a sensitivity analysis, target dose homogeneity comparable to static plans (D5-D95 <5%) has been found only for differences in amplitude of up to 1 mm, for changes in respiratory phase <200 ms and for changes in the breathing period of <20 ms in comparison to the motions used during optimisation. As such, methods to robustly deliver 4D optimised plans employing 4D intensity-modulated delivery are discussed.
Basu, Anindya; Chen, Wei Ning; Leong, Susanna Su Jan
2011-04-01
The hepatitis B virus X (HBx) protein is well known for its role in hepatitis B virus infection that often leads to hepatocellular carcinoma. Despite the clinical importance of HBx, there is little progress in anti-HBx drug development strategies due to shortage of HBx from native sources. Consistent expression of HBx as insoluble inclusion bodies within various expression systems has largely hindered HBx manufacturing via economical biosynthesis routes. Confronted by this roadblock, this study aims to quantitatively understand HBx protein behaviour in solution that will guide the rational development of a refolding-based bioprocess for HBx production. Second virial coefficient (SVC) measurements were employed to study the effects of varying physicochemical parameters on HBx intermolecular protein interaction. The SVC results suggest that covalent HBx aggregates play a key role in protein destabilisation during refolding. The use of an SVC-optimised refolding environment yielded bioactive and soluble HBx proteins from the denatured-reduced inclusion body state. This study provides new knowledge on HBx solubility behaviour in vitro, which is important in structure-function elucidation behaviour of this hydrophobic protein. Importantly, a rational refolding-based Escherichia coli bioprocess that can deliver purified and soluble HBx at large scale is successfully developed, which opens the way for rapid preparation of soluble HBx for further clinical and characterisation studies.
Non-contact Pressure-based Sleep/Wake Discrimination
Walsh, Lorcan; McLoone, Seán; Ronda, Joseph; Duffy, Jeanne F.; Czeisler, Charles A.
2016-01-01
Poor sleep is increasingly being recognised as an important prognostic parameter of health. Patients with suspected sleep disorders are referred to sleep clinics, which guide treatment. However, sleep clinics are not always a viable option due to their high cost, a lack of experienced practitioners, lengthy waiting lists and an unrepresentative sleeping environment. A home-based non-contact sleep/wake monitoring system may be used as a guide for treatment, potentially stratifying patients by clinical need or highlighting longitudinal changes in sleep and nocturnal patterns. This paper presents the evaluation of an under-mattress sleep monitoring system for non-contact sleep/wake discrimination. A large dataset of sensor data with concomitant sleep/wake state was collected from both younger and older adults participating in a circadian sleep study. A thorough training/testing/validation procedure was configured, and optimised feature extraction and sleep/wake discrimination algorithms were evaluated both within and across the two cohorts. An accuracy, sensitivity and specificity of 74.3%, 95.5% and 53.2% are reported over all subjects using an external validation dataset (71.9%, 87.9% and 56%, and 77.5%, 98% and 57%, are reported for younger and older subjects, respectively). These results compare favourably with similar research; however, this system provides an ambient alternative suitable for long-term continuous sleep monitoring, particularly amongst vulnerable populations. PMID:27845651
NASA Astrophysics Data System (ADS)
Wang, Congsi; Wang, Yan; Wang, Zhihai; Wang, Meng; Yuan, Shuai; Wang, Weifeng
2018-04-01
It is well known that calculating and reducing the radar cross section (RCS) of an active phased array antenna (APAA) are both difficult and complicated, and balancing radiating and scattering performance while reducing the RCS remains an unresolved problem. Therefore, this paper develops a structure and scattering array factor coupling model of the APAA based on the phase errors of the radiated elements generated by structural distortion and installation errors of the array. To obtain the optimal radiating and scattering performance, an integrated optimisation model is built to optimise the installation height of all the radiated elements in the normal direction of the array, in which the particle swarm optimisation method is adopted and the gain loss and scattering array factor are selected as the fitness function. The simulation indicates that the proposed coupling model and integrated optimisation method can effectively decrease the RCS while simultaneously guaranteeing the necessary radiating performance, which demonstrates important application value in the engineering design and structural evaluation of APAAs.
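A generic particle swarm optimiser of the kind adopted here is sketched below. In the antenna application each particle would encode the installation heights of the radiating elements and the fitness would combine gain loss with the scattering array factor; the quadratic fitness in the example is a placeholder for that coupled structural-electromagnetic model, and all parameter values are assumptions.

```python
import numpy as np

def pso(fitness, dim, lo, hi, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic particle swarm optimisation (minimisation).  Illustrative
    parameters; the application-specific fitness is supplied by the caller."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    g = int(np.argmin(pbest_f))
    gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        if pbest_f.min() < gbest_f:
            g = int(np.argmin(pbest_f))
            gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    return gbest, gbest_f

# Placeholder fitness: weighted sum of a "gain loss" term and a "scattering"
# term, both hypothetical quadratics in the element installation heights.
fit = lambda h: 0.6 * np.sum((h - 0.2) ** 2) + 0.4 * np.sum(h ** 2)
best_h, best_f = pso(fit, dim=16, lo=-1.0, hi=1.0)
```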
Andrighetto, Luke M; Stevenson, Paul G; Pearson, James R; Henderson, Luke C; Conlan, Xavier A
2014-11-01
In-silico optimised two-dimensional high performance liquid chromatographic (2D-HPLC) separations of a model methamphetamine seizure sample are described, where an excellent match between simulated and real separations was observed. Targeted separation of model compounds was completed with significantly reduced method development time. This separation was completed in the heart-cutting mode of 2D-HPLC where C18 columns were used in both dimensions taking advantage of the selectivity difference of methanol and acetonitrile as the mobile phases. This method development protocol is most significant when optimising the separation of chemically similar chemical compounds as it eliminates potentially hours of trial and error injections to identify the optimised experimental conditions. After only four screening injections the gradient profile for both 2D-HPLC dimensions could be optimised via simulations, ensuring the baseline resolution of diastereomers (ephedrine and pseudoephedrine) in 9.7 min. Depending on which diastereomer is present the potential synthetic pathway can be categorized.
ICRP publication 121: radiological protection in paediatric diagnostic and interventional radiology.
Khong, P-L; Ringertz, H; Donoghue, V; Frush, D; Rehani, M; Appelgate, K; Sanchez, R
2013-04-01
Paediatric patients have a higher average risk of developing cancer compared with adults receiving the same dose. The longer life expectancy in children allows more time for any harmful effects of radiation to manifest, and developing organs and tissues are more sensitive to the effects of radiation. This publication aims to provide guiding principles of radiological protection for referring clinicians and clinical staff performing diagnostic imaging and interventional procedures for paediatric patients. It begins with a brief description of the basic concepts of radiological protection, followed by the general aspects of radiological protection, including principles of justification and optimisation. Guidelines and suggestions for radiological protection in specific modalities - radiography and fluoroscopy, interventional radiology, and computed tomography - are subsequently covered in depth. The report concludes with a summary and recommendations. The importance of rigorous justification of radiological procedures is emphasised for every procedure involving ionising radiation, and the use of imaging modalities that are non-ionising should always be considered. The basic aim of optimisation of radiological protection is to adjust imaging parameters and institute protective measures such that the required image is obtained with the lowest possible dose of radiation, and that net benefit is maximised to maintain sufficient quality for diagnostic interpretation. Special consideration should be given to the availability of dose reduction measures when purchasing new imaging equipment for paediatric use. One of the unique aspects of paediatric imaging is with regards to the wide range in patient size (and weight), therefore requiring special attention to optimisation and modification of equipment, technique, and imaging parameters. Examples of good radiographic and fluoroscopic technique include attention to patient positioning, field size and adequate collimation, use of protective shielding, optimisation of exposure factors, use of pulsed fluoroscopy, limiting fluoroscopy time, etc. Major paediatric interventional procedures should be performed by experienced paediatric interventional operators, and a second, specific level of training in radiological protection is desirable (in some countries, this is mandatory). For computed tomography, dose reduction should be optimised by the adjustment of scan parameters (such as mA, kVp, and pitch) according to patient weight or age, region scanned, and study indication (e.g. images with greater noise should be accepted if they are of sufficient diagnostic quality). Other strategies include restricting multiphase examination protocols, avoiding overlapping of scan regions, and only scanning the area in question. Up-to-date dose reduction technology such as tube current modulation, organ-based dose modulation, auto kV technology, and iterative reconstruction should be utilised when appropriate. It is anticipated that this publication will assist institutions in encouraging the standardisation of procedures, and that it may help increase awareness and ultimately improve practices for the benefit of patients. Copyright © 2012. Published by Elsevier Ltd.
Ceberio, Josu; Calvo, Borja; Mendiburu, Alexander; Lozano, Jose A
2018-02-15
In the last decade, many works in combinatorial optimisation have shown that, due to the advances in multi-objective optimisation, the algorithms from this field could be used for solving single-objective problems as well. In this sense, a number of papers have proposed multi-objectivising single-objective problems in order to use multi-objective algorithms in their optimisation. In this article, we follow up this idea by presenting a methodology for multi-objectivising combinatorial optimisation problems based on elementary landscape decompositions of their objective function. Under this framework, each of the elementary landscapes obtained from the decomposition is considered as an independent objective function to optimise. In order to illustrate this general methodology, we consider four problems from different domains: the quadratic assignment problem and the linear ordering problem (permutation domain), the 0-1 unconstrained quadratic optimisation problem (binary domain), and the frequency assignment problem (integer domain). We implemented two widely known multi-objective algorithms, NSGA-II and SPEA2, and compared their performance with that of a single-objective GA. The experiments conducted on a large benchmark of instances of the four problems show that the multi-objective algorithms clearly outperform the single-objective approaches. Furthermore, a discussion on the results suggests that the multi-objective space generated by this decomposition enhances the exploration ability, thus permitting NSGA-II and SPEA2 to obtain better results in the majority of the tested instances.
Automatic optimisation of gamma dose rate sensor networks: The DETECT Optimisation Tool
NASA Astrophysics Data System (ADS)
Helle, K. B.; Müller, T. O.; Astrup, P.; Dyve, J. E.
2014-05-01
Fast delivery of comprehensive information on the radiological situation is essential for decision-making in nuclear emergencies. Most national radiological agencies in Europe employ gamma dose rate sensor networks to monitor radioactive pollution of the atmosphere. Sensor locations were often chosen using regular grids or according to administrative constraints. Nowadays, however, the choice can be based on more realistic risk assessment, as it is possible to simulate potential radioactive plumes. To support sensor planning, we developed the DETECT Optimisation Tool (DOT) within the scope of the EU FP 7 project DETECT. It evaluates the gamma dose rates that a proposed set of sensors might measure in an emergency and uses this information to optimise the sensor locations. The gamma dose rates are taken from a comprehensive library of simulations of atmospheric radioactive plumes from 64 source locations. These simulations cover the whole European Union, so the DOT allows evaluation and optimisation of sensor networks for all EU countries, as well as evaluation of fencing sensors around possible sources. Users can choose from seven cost functions to evaluate the capability of a given monitoring network for early detection of radioactive plumes or for the creation of dose maps. The DOT is implemented as a stand-alone easy-to-use JAVA-based application with a graphical user interface and an R backend. Users can run evaluations and optimisations, and display, store and download the results. The DOT runs on a server and can be accessed via common web browsers; it can also be installed locally.
Alejo, L; Corredoira, E; Sánchez-Muñoz, F; Huerga, C; Aza, Z; Plaza-Núñez, R; Serrada, A; Bret-Zurita, M; Parrón, M; Prieto-Areyano, C; Garzón-Moll, G; Madero, R; Guibelalde, E
2018-04-09
Objective: The new 2013/59 EURATOM Directive (ED) demands dosimetric optimisation procedures without undue delay. The aim of this study was to optimise paediatric conventional radiology examinations by applying the ED without compromising clinical diagnosis. Automatic dose management software (ADMS) was used to analyse 2678 studies of children from birth to 5 years of age, obtaining local diagnostic reference levels (DRLs) in terms of entrance surface air kerma. As the local DRL for chest examinations in infants exceeded the European Commission (EC) DRL, an optimisation was performed by decreasing the kVp and applying automatic exposure control. To assess the image quality, an analysis of high-contrast spatial resolution (HCSR), signal-to-noise ratio (SNR) and figure of merit (FOM) was performed, as well as a blind test based on the generalised estimating equations method. For newborns and chest examinations, the local DRL exceeded the EC DRL by 113%. After the optimisation, a reduction of 54% was obtained. No significant differences were found in the image quality blind test. A decrease in SNR (-37%) and HCSR (-68%), and an increase in FOM (42%), were observed. ADMS allows the fast calculation of local DRLs and the performance of optimisation procedures in babies without delay. However, physical and clinical analyses of image quality are still needed to ensure diagnostic integrity after the optimisation process. Advances in knowledge: ADMS is useful for detecting radiation protection problems and performing optimisation procedures in paediatric conventional imaging without undue delay, as the ED requires.
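As a rough sketch of the dose-survey step that such software automates, local DRLs are conventionally taken as a percentile (commonly the 75th) of the dose-indicator distribution for a given examination and age group. The kerma values, reference level and figure-of-merit definition below are hypothetical placeholders, not the figures from this study.

```python
import numpy as np

# Hypothetical entrance surface air kerma (ESAK, µGy) values extracted by a
# dose-management system for chest radiographs of newborns.
esak = np.array([52, 61, 48, 75, 66, 58, 80, 55, 70, 63, 59, 68], dtype=float)

# Local DRLs are conventionally set at the 75th percentile of the distribution.
local_drl = np.percentile(esak, 75)

REFERENCE_DRL = 80.0  # placeholder reference value, µGy; not the actual EC figure

excess = 100.0 * (local_drl - REFERENCE_DRL) / REFERENCE_DRL
print(f"local DRL = {local_drl:.1f} µGy, {excess:+.0f}% relative to reference")

# One common figure of merit used in optimisation studies relates image
# quality to dose, e.g. FOM = SNR**2 / ESAK (assumed definition).
snr, dose = 38.0, 55.0
print(f"FOM = {snr**2 / dose:.1f}")
```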
In situ click chemistry: a powerful means for lead discovery.
Sharpless, K Barry; Manetsch, Roman
2006-11-01
Combinatorial chemistry and parallel synthesis are important and regularly applied tools for lead identification and optimisation, although they are often accompanied by challenges related to the efficiency of library synthesis and the purity of the compound library. In the last decade, novel lead discovery approaches have been investigated in which the biological target is actively involved in the synthesis of its own inhibitory compound. These fragment-based approaches, also termed target-guided synthesis (TGS), show great promise in lead discovery applications by combining the synthesis and screening of libraries of low molecular weight compounds in a single step. Of all the TGS methods, the kinetically controlled variant is the least well known, but it has the potential to emerge as a reliable lead discovery method. The kinetically controlled TGS approach, termed in situ click chemistry, is discussed in this article.
Posttraumatic growth in post-surgical coronary artery bypass graft patients
Waight, Catherine A; Sheridan, Judith; Tesar, Peter
2015-01-01
Recent research in posttraumatic growth has been applied to people with life-threatening illnesses to optimise recovery. There is a lack of research exploring posttraumatic growth in coronary artery bypass graft patients. This article describes the recovery experience of 14 coronary artery bypass graft patients (13 males and 1 female) at their first outpatient review post-surgery. Grounded theory analysis was used to develop a model of distinct and shared pathways to growth depending on whether patients were symptomatic or asymptomatic pre-coronary artery bypass graft. Outcomes of posttraumatic growth in this sample included action-based healthy lifestyle growth and two forms of cognitive growth: appreciation of life and new possibilities. The model of posttraumatic growth developed in this study may be helpful in guiding future research into promoting posttraumatic growth and behaviour change in coronary artery bypass graft patients. PMID:28070351
An improved design method based on polyphase components for digital FIR filters
NASA Astrophysics Data System (ADS)
Kumar, A.; Kuldeep, B.; Singh, G. K.; Lee, Heung No
2017-11-01
This paper presents an efficient design of digital finite impulse response (FIR) filters based on polyphase components and swarm optimisation techniques (SOTs). For this purpose, the design problem is formulated as the mean square error between the actual response and the ideal response in the frequency domain, using the polyphase components of a prototype filter. To achieve a more precise frequency response at some specified frequency, fractional derivative constraints (FDCs) have been applied, and optimal FDCs are computed using SOTs such as the cuckoo search and modified cuckoo search algorithms. A comparative study with well-established swarm optimisation techniques, namely particle swarm optimisation and the artificial bee colony algorithm, is made. The effectiveness of the proposed method is evaluated using several important attributes of a filter, and the comparative study evidences its merit for the effective design of FIR filters.
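The frequency-domain least-squares formulation mentioned above can be sketched as follows. The ideal low-pass response, filter length and frequency grid are assumptions, the polyphase decomposition and fractional derivative constraints are omitted, and SciPy's differential evolution stands in for the cuckoo-search optimisers used in the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution

N = 12                          # filter length (assumed)
w = np.linspace(0, np.pi, 256)  # frequency grid
wc = 0.4 * np.pi                # assumed low-pass cut-off

# Ideal (desired) magnitude response of a low-pass prototype filter.
H_ideal = (w <= wc).astype(float)


def mse(h):
    """Mean square error between actual and ideal frequency response."""
    # H(e^{jw}) = sum_n h[n] * exp(-j*w*n)
    n = np.arange(len(h))
    H = np.exp(-1j * np.outer(w, n)) @ h
    return np.mean((np.abs(H) - H_ideal) ** 2)


bounds = [(-1.0, 1.0)] * N
result = differential_evolution(mse, bounds, seed=1, maxiter=200, tol=1e-8)
print(f"final MSE: {result.fun:.4e}")
print("optimised coefficients:", np.round(result.x, 4))
```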
O'Brien, Rosaleen; Fitzpatrick, Bridie; Higgins, Maria; Guthrie, Bruce; Watt, Graham; Wyke, Sally
2016-01-01
Objectives To develop and optimise a primary care-based complex intervention (CARE Plus) to enhance the quality of life of patients with multimorbidity in deprived areas. Methods Six co-design discussion groups involving 32 participants were held separately with multimorbid patients from deprived areas, voluntary organisations, general practitioners and practice nurses working in deprived areas. This was followed by piloting in two practices and further optimisation based on interviews with 11 general practitioners, 2 practice nurses and 6 participating multimorbid patients. Results Participants endorsed the need for longer consultations, relational continuity and a holistic approach. All felt that training and support of the health care staff was important. Most participants welcomed the idea of additional self-management support, though some practitioners were dubious about whether patients would use it. The pilot study led to changes including a revised care plan, the inclusion of mindfulness-based stress reduction techniques in the support of practitioners and patients, and the streamlining of the written self-management support material for patients. Discussion We have co-designed and optimised an augmented primary care intervention involving a whole-system approach to enhance quality of life in multimorbid patients living in deprived areas. CARE Plus will next be tested in a phase 2 cluster randomised controlled trial. PMID:27068113
Optimising Service Delivery of AAC AT Devices and Compensating AT for Dyslexia.
Roentgen, Uta R; Hagedoren, Edith A V; Horions, Katrien D L; Dalemans, Ruth J P
2017-01-01
To promote successful use of Assistive Technology (AT) supporting Augmentative and Alternative Communication (AAC) and compensating for dyslexia, the last steps of their provision - delivery and instruction, use, maintenance and evaluation - were optimised. In co-creation with all stakeholders, and based on a list of requirements, an integral method and tools were developed.
Optimising the Blended Learning Environment: The Arab Open University Experience
ERIC Educational Resources Information Center
Hamdi, Tahrir; Abu Qudais, Mohammed
2018-01-01
This paper will offer some insights into possible ways to optimise the blended learning environment based on experience with this modality of teaching at Arab Open University/Jordan branch and also by reflecting upon the results of several meta-analytical studies, which have shown blended learning environments to be more effective than their face…
Automated model optimisation using the Cylc workflow engine (Cyclops v1.0)
NASA Astrophysics Data System (ADS)
Gorman, Richard M.; Oliver, Hilary J.
2018-06-01
Most geophysical models include many parameters that are not fully determined by theory, and can be tuned to improve the model's agreement with available data. We might attempt to automate this tuning process in an objective way by employing an optimisation algorithm to find the set of parameters that minimises a cost function derived from comparing model outputs with measurements. A number of algorithms are available for solving optimisation problems, in various programming languages, but interfacing such software to a complex geophysical model simulation presents certain challenges. To tackle this problem, we have developed an optimisation suite (Cyclops) based on the Cylc workflow engine that implements a wide selection of optimisation algorithms from the NLopt Python toolbox (Johnson, 2014). The Cyclops optimisation suite can be used to calibrate any modelling system that has itself been implemented as a (separate) Cylc model suite, provided it includes computation and output of the desired scalar cost function. A growing number of institutions are using Cylc to orchestrate complex distributed suites of interdependent cycling tasks within their operational forecast systems, and in such cases application of the optimisation suite is particularly straightforward. As a test case, we applied Cyclops to calibrate a global implementation of the WAVEWATCH III (v4.18) third-generation spectral wave model, forced by ERA-Interim input fields. This was calibrated over a 1-year period (1997), before applying the calibrated model to a full (1979-2016) wave hindcast. The chosen error metric was the spatial average of the root mean square error of hindcast significant wave height compared with collocated altimeter records. We describe the results of a calibration in which up to 19 parameters were optimised.
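Driving the NLopt Python toolbox referenced above takes only a few lines. In Cyclops each cost evaluation launches a full Cylc-orchestrated model run, which the stub function below merely stands in for; the two-parameter toy error surface and the choice of the Subplex algorithm are illustrative assumptions.

```python
import nlopt
import numpy as np


def cost(params, grad):
    """Stand-in for a full model run followed by comparison with observations.

    In Cyclops this would trigger a Cylc suite (e.g. a wave-model hindcast)
    and return a scalar error metric such as the RMSE of significant wave
    height against altimeter data.
    """
    a, b = params
    return float((a - 1.3) ** 2 + (b - 0.7) ** 2)  # toy error surface


opt = nlopt.opt(nlopt.LN_SBPLX, 2)      # derivative-free local algorithm
opt.set_lower_bounds([0.0, 0.0])
opt.set_upper_bounds([3.0, 3.0])
opt.set_min_objective(cost)
opt.set_xtol_rel(1e-4)

x_opt = opt.optimize(np.array([2.0, 2.0]))
print("optimised parameters:", x_opt, "cost:", opt.last_optimum_value())
```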
Heavy liquid metals: Research programs at PSI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takeda, Y.
1996-06-01
The author describes work at PSI on thermohydraulics, thermal shock, and material tests for mechanical properties. In the presentation, the focus is on two main programs. (1) SINQ LBE target: The phase II study program for SINQ is planned. A new LBE loop is being constructed. The study has the following three objectives: (a) Pump study - design work on an electromagnetic pump to be integrated into the target. (b) Heat pipe performance test - the use of heat pipes as an additional component of the target cooling system is being considered, and it may be a way to further decouple the liquid metal and water coolant loops. (c) Mixed convection experiment - in order to find an optimal configuration of the additional flow guide for window cooling, mixed convection around the window is to be studied. The experiment will be started using water and then with LBE. (2) ESS Mercury target: For the ESS target study, the following experimental studies are planned, some of which are exemplified by trial experiments: (a) Flow around the window: flow mapping around the hemi-cylindrical window will be made for optimising the flow channels and structures, (b) Geometry optimisation for minimizing a recirculation zone behind the edge of the flow separator, (c) Flow-induced vibration and buckling problems for an optimised structure of the flow separator, and (d) Gas-liquid two-phase flow will be studied by starting to establish the new experimental method of measuring various kinds of two-phase flow characteristics.
Multi-objective optimisation and decision-making of space station logistics strategies
NASA Astrophysics Data System (ADS)
Zhu, Yue-he; Luo, Ya-zhong
2016-10-01
Space station logistics strategy optimisation is a complex engineering problem with multiple objectives. Finding a decision-maker-preferred compromise solution becomes more significant when solving such a problem. However, the designer-preferred solution is not easy to determine using the traditional method. Thus, a hybrid approach that combines the multi-objective evolutionary algorithm, physical programming, and differential evolution (DE) algorithm is proposed to deal with the optimisation and decision-making of space station logistics strategies. A multi-objective evolutionary algorithm is used to acquire a Pareto frontier and help determine the range parameters of the physical programming. Physical programming is employed to convert the four-objective problem into a single-objective problem, and a DE algorithm is applied to solve the resulting physical programming-based optimisation problem. Five kinds of objective preference are simulated and compared. The simulation results indicate that the proposed approach can produce good compromise solutions corresponding to different decision-makers' preferences.
A shrinking hypersphere PSO for engineering optimisation problems
NASA Astrophysics Data System (ADS)
Yadav, Anupam; Deep, Kusum
2016-03-01
Many real-world and engineering design problems can be formulated as constrained optimisation problems (COPs). Swarm intelligence techniques are a good approach to solve COPs. In this paper an efficient shrinking hypersphere-based particle swarm optimisation (SHPSO) algorithm is proposed for constrained optimisation. The proposed SHPSO is designed in such a way that the particles move under the influence of shrinking hyperspheres. A parameter-free approach is used to handle the constraints. The performance of the SHPSO is compared against the state-of-the-art algorithms for a set of 24 benchmark problems. An exhaustive comparison of the results is provided statistically as well as graphically. Moreover, three engineering design problems, namely welded beam design, compression spring design and pressure vessel design, are solved using SHPSO and the results are compared with those of the state-of-the-art algorithms.
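A toy illustration of the general set-up (not of the shrinking-hypersphere mechanism itself, which is specific to SHPSO) is a plain global-best PSO combined with a parameter-free feasibility rule such as Deb's, shown here on an invented two-variable constrained problem.

```python
import numpy as np

rng = np.random.default_rng(2)


def objective(x):
    """Toy objective to be minimised."""
    return x[0] ** 2 + (x[1] - 2.0) ** 2


def violation(x):
    """Constraint violation for the single constraint x0 + x1 >= 1."""
    return max(0.0, 1.0 - (x[0] + x[1]))


def better(x, y):
    """Deb's parameter-free feasibility rule: feasible beats infeasible,
    feasible solutions compare by objective, infeasible by violation."""
    vx, vy = violation(x), violation(y)
    if vx == 0.0 and vy == 0.0:
        return objective(x) < objective(y)
    if (vx == 0.0) != (vy == 0.0):
        return vx == 0.0
    return vx < vy


def pso(n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    lo, hi = np.array([-5.0, -5.0]), np.array([5.0, 5.0])
    x = rng.uniform(lo, hi, size=(n_particles, 2))
    v = np.zeros_like(x)
    pbest = x.copy()
    gbest = min(pbest, key=lambda p: (violation(p), objective(p))).copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        for i in range(n_particles):
            if better(x[i], pbest[i]):
                pbest[i] = x[i].copy()
            if better(pbest[i], gbest):
                gbest = pbest[i].copy()
    return gbest


best = pso()
print("best point:", best, "objective value:", objective(best))
```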
Topology Optimisation of Wideband Coaxial-to-Waveguide Transitions
NASA Astrophysics Data System (ADS)
Hassan, Emadeldeen; Noreland, Daniel; Wadbro, Eddie; Berggren, Martin
2017-03-01
To maximize the matching between a coaxial cable and rectangular waveguides, we present a computational topology optimisation approach that decides for each point in a given domain whether to hold a good conductor or a good dielectric. The conductivity is determined by a gradient-based optimisation method that relies on finite-difference time-domain solutions to the 3D Maxwell's equations. Unlike previously reported results in the literature for this kind of problem, our design algorithm can efficiently handle tens of thousands of design variables, which allows novel conceptual waveguide designs. We demonstrate the effectiveness of the approach by presenting optimised transitions with reflection coefficients lower than -15 dB over more than a 60% bandwidth, both for right-angle and end-launcher configurations. The performance of the proposed transitions is cross-verified with commercial software, and one design case is validated experimentally.
Optimal design and operation of a photovoltaic-electrolyser system using particle swarm optimisation
NASA Astrophysics Data System (ADS)
Sayedin, Farid; Maroufmashat, Azadeh; Roshandel, Ramin; Khavas, Sourena Sattari
2016-07-01
In this study, hydrogen generation is maximised by optimising the size and the operating conditions of an electrolyser (EL) directly connected to a photovoltaic (PV) module at different irradiances. Due to the variations of the maximum power points of the PV module during a year and the complexity of the system, a nonlinear approach is considered. A mathematical model has been developed to determine the performance of the PV/EL system. The optimisation methodology presented here is based on the particle swarm optimisation algorithm. By this method, for the given number of PV modules, the optimal size and operating condition of a PV/EL system are achieved. The approach can be applied for different sizes of PV systems, various ambient temperatures and different locations with various climatic conditions. The results show that for the given location and the PV system, the energy transfer efficiency of the PV/EL system can reach up to 97.83%.
Crystal structure optimisation using an auxiliary equation of state
NASA Astrophysics Data System (ADS)
Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T.; Walsh, Aron
2015-11-01
Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy-volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other "beyond" density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.
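A small worked example of the energy-volume fitting step that the paper aims to shortcut: a third-order Birch-Murnaghan equation of state fitted to a handful of synthetic energy-volume points to estimate the equilibrium volume and bulk modulus. The data values are invented stand-ins for single-point electronic-structure calculations.

```python
import numpy as np
from scipy.optimize import curve_fit


def birch_murnaghan(V, E0, V0, B0, Bp):
    """Third-order Birch-Murnaghan energy-volume equation of state."""
    eta = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9.0 * V0 * B0 / 16.0 * (
        (eta - 1.0) ** 3 * Bp + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta)
    )


# Synthetic energy-volume data (volumes in A^3, energies in eV) standing in
# for the output of isochoric structure optimisations.
V = np.array([36.0, 38.0, 40.0, 42.0, 44.0, 46.0])
E = birch_murnaghan(V, -10.0, 41.3, 0.5, 4.5) \
    + 1e-4 * np.random.default_rng(0).standard_normal(V.size)

popt, _ = curve_fit(birch_murnaghan, V, E, p0=[E.min(), V.mean(), 1.0, 4.0])
E0, V0, B0, Bp = popt
print(f"equilibrium volume V0 = {V0:.2f} A^3, bulk modulus B0 = {B0:.3f} eV/A^3")
```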
Aungkulanon, Pasura; Luangpaiboon, Pongchanun
2016-01-01
Response surface methods via first- or second-order models are important in manufacturing processes. This study, however, proposes differently structured mechanisms of vertical transportation systems (VTS) embedded in a shuffled frog leaping-based approach. Three VTS scenarios are considered: a motion reaching the normal operating velocity, and motions that either reach or do not reach the transitional motion. These variants were used to simultaneously inspect multiple responses affected by machining parameters in multi-pass turning processes. The numerical results of two machining optimisation problems demonstrated the high performance of the proposed methods when compared to other optimisation algorithms for an actual deep cut design.
Reservoir optimisation using El Niño information. Case study of Daule Peripa (Ecuador)
NASA Astrophysics Data System (ADS)
Gelati, Emiliano; Madsen, Henrik; Rosbjerg, Dan
2010-05-01
The optimisation of water resources systems requires the ability to produce runoff scenarios that are consistent with available climatic information. We approach stochastic runoff modelling with a Markov-modulated autoregressive model with exogenous input, which belongs to the class of Markov-switching models. The model assumes runoff parameterisation to be conditioned on a hidden climatic state following a Markov chain, whose state transition probabilities depend on climatic information. This approach allows stochastic modeling of non-stationary runoff, as runoff anomalies are described by a mixture of autoregressive models with exogenous input, each one corresponding to a climate state. We calibrate the model on the inflows of the Daule Peripa reservoir located in western Ecuador, where the occurrence of El Niño leads to anomalously heavy rainfall caused by positive sea surface temperature anomalies along the coast. El Niño - Southern Oscillation (ENSO) information is used to condition the runoff parameterisation. Inflow predictions are realistic, especially at the occurrence of El Niño events. The Daule Peripa reservoir serves a hydropower plant and a downstream water supply facility. Using historical ENSO records, synthetic monthly inflow scenarios are generated for the period 1950-2007. These scenarios are used as input to perform stochastic optimisation of the reservoir rule curves with a multi-objective Genetic Algorithm (MOGA). The optimised rule curves are assumed to be the reservoir base policy. ENSO standard indices are currently forecasted at monthly time scale with nine-month lead time. These forecasts are used to perform stochastic optimisation of reservoir releases at each monthly time step according to the following procedure: (i) nine-month inflow forecast scenarios are generated using ENSO forecasts; (ii) a MOGA is set up to optimise the upcoming nine monthly releases; (iii) the optimisation is carried out by simulating the releases on the inflow forecasts, and by applying the base policy on a subsequent synthetic inflow scenario in order to account for long-term costs; (iv) the optimised release for the first month is implemented; (v) the state of the system is updated and (i), (ii), (iii), and (iv) are iterated for the following time step. The results highlight the advantages of using a climate-driven stochastic model to produce inflow scenarios and forecasts for reservoir optimisation, showing potential improvements with respect to the current management. Dynamic programming was used to find the best possible release time series given the inflow observations, in order to benchmark any possible operational improvement.
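The forecast-driven release optimisation (steps i-v) can be sketched as a receding-horizon loop. The synthetic inflow generator, single-reservoir mass balance, deficit penalty and brute-force search over constant releases below are simplifications standing in for the Markov-switching inflow model and the MOGA used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

CAPACITY, HORIZON, N_SCEN = 1000.0, 9, 50   # storage units, months, scenarios


def forecast_inflows(n_scen, horizon):
    """Stand-in for ENSO-conditioned Markov-switching inflow scenarios."""
    return rng.gamma(shape=2.0, scale=40.0, size=(n_scen, horizon))


def simulate(storage, inflows, release):
    """Toy mass balance: benefit from released water minus a deficit penalty."""
    benefit = 0.0
    for q in inflows:
        storage = min(CAPACITY, storage + q)
        actual = min(release, storage)
        benefit += actual - 10.0 * (release - actual)
        storage -= actual
    return benefit


def monthly_decision(storage):
    """Steps (i)-(iii): choose the release that maximises expected benefit
    over the forecast scenarios (a constant release stands in for the MOGA)."""
    scenarios = forecast_inflows(N_SCEN, HORIZON)
    candidates = np.linspace(0.0, 100.0, 21)
    scores = [np.mean([simulate(storage, s, r) for s in scenarios])
              for r in candidates]
    return candidates[int(np.argmax(scores))]


storage = 600.0
for month in range(12):                          # step (v): roll forward monthly
    release = monthly_decision(storage)          # steps (i)-(iii)
    inflow = float(forecast_inflows(1, 1)[0, 0])  # "observed" inflow, step (iv)
    available = min(CAPACITY, storage + inflow)
    storage = available - min(release, available)
    print(f"month {month + 1:2d}: release {release:6.1f}, storage {storage:7.1f}")
```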
NASA Astrophysics Data System (ADS)
Hoell, Simon; Omenzetter, Piotr
2018-02-01
To advance the concept of smart structures in large systems, such as wind turbines (WTs), it is desirable to be able to detect structural damage early while using minimal instrumentation. Data-driven vibration-based damage detection methods can be competitive in that respect because global vibrational responses encompass the entire structure. Multivariate damage sensitive features (DSFs) extracted from acceleration responses enable changes in a structure to be detected via statistical methods. However, even though such DSFs contain information about the structural state, they may not be optimised for the damage detection task. This paper addresses this shortcoming by exploring a DSF projection technique specialised for statistical structural damage detection. High-dimensional initial DSFs are projected onto a low-dimensional space for improved damage detection performance and a simultaneous reduction of the computational burden. The technique is based on sequential projection pursuit, where the projection vectors are optimised one by one using an advanced evolutionary strategy. The approach is applied to laboratory experiments with a small-scale WT blade under wind-like excitations. Autocorrelation function coefficients calculated from acceleration signals are employed as DSFs. The optimal numbers of projection vectors are identified with the help of a fast forward selection procedure. To benchmark the proposed method, selections of original DSFs as well as principal component analysis scores from these features are additionally investigated. The optimised DSFs are tested for damage detection on previously unseen data from the healthy state and a wide range of damage scenarios. It is demonstrated that using selected subsets of the initial and transformed DSFs improves damage detectability compared to the full set of features. Furthermore, superior results can be achieved by projecting autocorrelation coefficients onto just a single optimised projection vector.
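A brief sketch of the feature-extraction and projection steps follows: autocorrelation coefficients are computed from synthetic acceleration signals and projected onto a single vector. The dominant principal direction of the healthy features stands in for the evolutionary-strategy-optimised projection vector, and the signals and frequency shift are invented.

```python
import numpy as np

rng = np.random.default_rng(4)


def dsf(signal, n_lags=30):
    """Damage-sensitive feature vector: the first n_lags autocorrelation coefficients."""
    x = signal - signal.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    return acf[1:n_lags + 1] / acf[0]


def blade_response(n=2048, freq=12.0):
    """Stand-in for a measured acceleration response under wind-like excitation;
    a damage-induced stiffness change is mimicked by shifting the frequency."""
    t = np.arange(n) / 256.0
    return np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(n)


# Healthy training features and a surrogate projection vector. In the paper the
# vector is optimised by an evolutionary strategy for detection performance;
# here the dominant principal direction of the healthy features stands in.
healthy = np.array([dsf(blade_response()) for _ in range(50)])
mean = healthy.mean(axis=0)
_, _, vt = np.linalg.svd(healthy - mean, full_matrices=False)
proj = vt[0]

healthy_scores = (healthy - mean) @ proj
damaged_score = (dsf(blade_response(freq=13.0)) - mean) @ proj

print(f"healthy scores: min {healthy_scores.min():.3f}, max {healthy_scores.max():.3f}")
print(f"damaged-state score: {damaged_score:.3f}")
```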
Automation of route identification and optimisation based on data-mining and chemical intuition.
Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G
2017-09-21
Data-mining of Reaxys and network analysis of the combined literature and in-house reactions set were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of data provides a rich knowledge-base for generation of the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of the continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and generation of environmental and other performance indicators, such as cost indicators. However, the identified further challenge is to automate model generation to evolve optimal multi-step chemical routes and optimal process configurations.
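The route-generation step amounts to path-finding in a directed reaction network. The sketch below builds a toy network with networkx and enumerates candidate routes from feedstock to target; all intermediates, yields and sources are hypothetical placeholders rather than mined Reaxys data.

```python
import networkx as nx

# Toy reaction network: nodes are species, directed edges are literature or
# in-house transformations. Intermediates and yields are hypothetical.
G = nx.DiGraph()
reactions = [
    ("limonene", "intermediate_A", {"yield": 0.85, "source": "in-house"}),
    ("limonene", "intermediate_B", {"yield": 0.60, "source": "literature"}),
    ("intermediate_A", "intermediate_C", {"yield": 0.75, "source": "literature"}),
    ("intermediate_B", "intermediate_C", {"yield": 0.70, "source": "literature"}),
    ("intermediate_C", "paracetamol", {"yield": 0.65, "source": "literature"}),
]
for src, dst, attrs in reactions:
    G.add_edge(src, dst, **attrs)

# Enumerate candidate routes from feedstock to target, shortest first.
for path in nx.shortest_simple_paths(G, "limonene", "paracetamol"):
    overall = 1.0
    for u, v in zip(path, path[1:]):
        overall *= G[u][v]["yield"]
    print(" -> ".join(path), f"(overall yield ~ {overall:.2f})")
```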
NASA Astrophysics Data System (ADS)
Han, Ke-Zhen; Feng, Jian; Cui, Xiaohong
2017-10-01
This paper considers the fault-tolerant optimised tracking control (FTOTC) problem for an unknown discrete-time linear system. A research scheme is proposed on the basis of data-based parity space identification, reinforcement learning and residual compensation techniques. The main characteristic of this scheme lies in the parity-space-identification-based simultaneous tracking control and residual compensation. The technical approach consists of four main elements: a subspace-aided method is applied to design an observer-based residual generator; a reinforcement Q-learning approach is used to solve for the optimised tracking control policy; robust H∞ theory is relied upon to achieve noise attenuation; and fault estimation triggered by the residual generator is adopted to perform fault compensation. To clarify the design and implementation procedures, an integrated algorithm is further constructed to link up these four functional units. A detailed analysis and proof are subsequently given to explain the guaranteed FTOTC performance of the proposed scheme. Finally, a case simulation is provided to verify its effectiveness.
NASA Astrophysics Data System (ADS)
Huang, Guoqin; Zhang, Meiqin; Huang, Hui; Guo, Hua; Xu, Xipeng
2018-04-01
Circular sawing is an important method for the processing of natural stone. The ability to predict sawing power is important in the optimisation, monitoring and control of the sawing process. In this paper, a predictive model (PFD) of sawing power, which is based on the tangential force distribution at the sawing contact zone, was proposed, experimentally validated and modified. With regard to the influence of sawing speed on the tangential force distribution, the modified PFD (MPFD) performed with high predictive accuracy across a wide range of sawing parameters, including sawing speed. The mean maximum absolute error rate was within 6.78%, and the maximum absolute error rate was within 11.7%. The practicability of predicting sawing power with the MPFD from few initial experimental samples was proved in case studies. On the premise of high sample measurement accuracy, only two samples are required for a fixed sawing speed. The feasibility of applying the MPFD to optimise sawing parameters while lowering the energy consumption of the sawing system was also validated. The case study shows that energy use was reduced by 28% by optimising the sawing parameters. The MPFD model can be used to predict sawing power, optimise sawing parameters and control energy consumption.
Design of distributed PID-type dynamic matrix controller for fractional-order systems
NASA Astrophysics Data System (ADS)
Wang, Dawei; Zhang, Ridong
2018-01-01
With the continuous requirements for product quality and safe operation in industrial production, it is difficult to describe complex large-scale processes with integer-order differential equations. However, fractional differential equations may precisely represent the intrinsic characteristics of such systems. In this paper, a distributed PID-type dynamic matrix control method based on fractional-order systems is proposed. First, a high-order integer-order approximate model is obtained by utilising the Oustaloup method. Then, the step response model vectors of the plant are obtained on the basis of the high-order model, and the online optimisation for multivariable processes is transformed into the optimisation of each small-scale subsystem, which is regarded as a sub-plant controlled in the distributed framework. Furthermore, the PID operator is introduced into the performance index of each subsystem and the fractional-order PID-type dynamic matrix controller is designed based on a Nash optimisation strategy. The information exchange among the subsystems is realised through the distributed control structure so as to complete the optimisation task of the whole large-scale system. Finally, the control performance of the designed controller is verified by an example.
Breuer, Christian; Lucas, Martin; Schütze, Frank-Walter; Claus, Peter
2007-01-01
A multi-criteria optimisation procedure based on genetic algorithms is carried out in search of advanced heterogeneous catalysts for total oxidation. Simple but flexible software routines have been created to be applied within a search space of more than 150,000 individuals. The general catalyst design includes mono-, bi- and trimetallic compositions assembled out of 49 different metals and deposited on an Al2O3 support in up to nine amount levels. As an efficient tool for high-throughput screening and perfectly matched to the requirements of heterogeneous gas phase catalysis - especially for applications technically run in honeycomb structures - the multi-channel monolith reactor is implemented to evaluate the catalyst performances. Out of a multi-component feed-gas, the conversion rates of carbon monoxide (CO) and a model hydrocarbon (HC) are monitored in parallel. In combination with further restrictions on preparation and pre-treatment, a primary screening can be conducted, promising to provide results close to technically applied catalysts. Presented are the resulting performances of the optimisation process for the first catalyst generations and the prospect of its auto-adaptation to specified optimisation goals.
Modulation aware cluster size optimisation in wireless sensor networks
NASA Astrophysics Data System (ADS)
Sriram Naik, M.; Kumar, Vinay
2017-07-01
Wireless sensor networks (WSNs) play a great role because of their numerous advantages to mankind. The main challenge with WSNs is energy efficiency. In this paper, we have focused on energy minimisation with the help of cluster size optimisation, taking into account the effect of modulation when the nodes are not able to communicate using the baseband communication technique. Cluster size optimisation is an important technique to improve the performance of WSNs. It provides improvements in energy efficiency, network scalability, network lifetime and latency. We have proposed an analytical expression for cluster size optimisation using the traditional sensing model of nodes for a square sensing field, with consideration of modulation effects. Energy minimisation can be achieved by changing the modulation scheme, such as BPSK, 16-QAM, QPSK, 64-QAM, etc., so we consider the effect of different modulation techniques on cluster formation. The nodes in the sensing field are deployed randomly and uniformly. It is also observed that placing the base station at the centre of the scenario enables only a small number of modulation schemes to work in an energy-efficient manner, whereas placing the base station at the corner of the sensing field enables a large number of modulation schemes to work in an energy-efficient manner.
Sluggett, Janet K; Ilomäki, Jenni; Seaman, Karla L; Corlis, Megan; Bell, J Simon
2017-02-01
Eight percent of Australians aged 65 years and over receive residential aged care each year. Residents are increasingly older, frailer and have complex care needs on entry to residential aged care. Up to 63% of Australian residents of aged care facilities take nine or more medications regularly. Together, these factors place residents at high risk of adverse drug events. This paper reviews medication-related policies, practices and research in Australian residential aged care. Complex processes underpin prescribing, supply and administration of medications in aged care facilities. A broad range of policies and resources are available to assist health professionals, aged care facilities and residents to optimise medication management. These include national guiding principles, a standardised national medication chart, clinical medication reviews and facility accreditation standards. Recent Australian interventions have improved medication use in residential aged care facilities. Generating evidence for prescribing and deprescribing that is specific to residential aged care, health workforce reform, medication-related quality indicators and inter-professional education in aged care are important steps toward optimising medication use in this setting. Copyright © 2016 Elsevier Ltd. All rights reserved.
Optimisation of shape kernel and threshold in image-processing motion analysers.
Pedrocchi, A; Baroni, G; Sada, S; Marcon, E; Pedotti, A; Ferrigno, G
2001-09-01
The aim of the work is to optimise the image processing of a motion analyser. This is to improve accuracy, which is crucial for neurophysiological and rehabilitation applications. A new motion analyser, ELITE-S2, for installation on the International Space Station is described, with the focus on image processing. Important improvements are expected in the hardware of ELITE-S2 compared with ELITE and previous versions (ELITE-S and Kinelite). The core algorithm for marker recognition was based on the current ELITE version, using the cross-correlation technique. This technique was based on the matching of the expected marker shape, the so-called kernel, with image features. Optimisation of the kernel parameters was achieved using a genetic algorithm, taking into account noise rejection and accuracy. Optimisation was achieved by performing tests on six highly precise grids (with marker diameters ranging from 1.5 to 4 mm), representing all allowed marker image sizes, and on a noise image. The results of comparing the optimised kernels and the current ELITE version showed a great improvement in marker recognition accuracy, while noise rejection characteristics were preserved. An average increase in marker co-ordinate accuracy of +22% was achieved, corresponding to a mean accuracy of 0.11 pixel in comparison with 0.14 pixel, measured over all grids. An improvement of +37%, corresponding to an improvement from 0.22 pixel to 0.14 pixel, was observed over the grid with the biggest markers.
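The kernel-matching step can be sketched by correlating the image with a marker-shaped kernel and thresholding the response. The Gaussian kernel size, width and threshold below are exactly the kind of parameters the genetic algorithm tunes; the values used here are illustrative, not the optimised ones.

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(5)

# Synthetic 64x64 camera frame with two bright circular markers plus noise.
frame = 0.05 * rng.random((64, 64))
yy, xx = np.mgrid[0:64, 0:64]
for cy, cx in [(20, 18), (45, 40)]:
    frame += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 1.5 ** 2))

# Marker-shaped kernel (here Gaussian); its size and width are the kind of
# parameters the genetic algorithm would optimise.
k = np.arange(-4, 5)
ky, kx = np.meshgrid(k, k, indexing="ij")
kernel = np.exp(-(ky ** 2 + kx ** 2) / (2 * 1.5 ** 2))
kernel -= kernel.mean()                     # zero-mean for correlation

# Cross-correlation (convolution with the flipped kernel; symmetric here).
response = fftconvolve(frame, kernel[::-1, ::-1], mode="same")

# Candidate marker pixels are strong responses above a threshold.
threshold = 0.6 * response.max()
detections = np.argwhere(response > threshold)
print("candidate marker pixels (row, col):")
print(detections)
```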
NASA Astrophysics Data System (ADS)
Desnijder, Karel; Hanselaer, Peter; Meuret, Youri
2016-04-01
A key requirement to obtain a uniform luminance for a side-lit LED backlight is an optimised spatial pattern of the structures on the light guide that extract the light. The generation of such a scatter pattern is usually performed by applying an iterative approach. In each iteration, the luminance distribution of the backlight with a particular scatter pattern is analysed. This is typically performed with a brute-force ray-tracing algorithm, although this approach results in a time-consuming optimisation process. In this study, the Adding-Doubling method is explored as an alternative way of evaluating the luminance of a backlight. Due to the similarities between light propagating in a backlight with extraction structures and light scattering in a cloud of light scatterers, the Adding-Doubling method, which is used to model the latter, could also be used to model the light distribution in a backlight. The backlight problem is translated to a form upon which the Adding-Doubling method is directly applicable. The calculated luminance for a simple uniform extraction pattern with the Adding-Doubling method matches the luminance generated by a commercial ray tracer very well. Although successful, the method offers no clear computational advantage over ray tracers. However, the dynamics of light propagation in a light guide, as used in the Adding-Doubling method, also allow the efficiency of brute-force ray-tracing algorithms to be enhanced. The performance of this enhanced ray-tracing approach for the simulation of backlights is also evaluated against a typical brute-force ray-tracing approach.
How can clinical ethics guide the management of comorbidities in the child with Rett syndrome?
Downs, Jenny; Forbes, David; Johnson, Michael; Leonard, Helen
2016-08-01
Rett syndrome is a rare disorder caused by a mutation in the MECP2 gene. Those affected generally have severe functional impairments, and medical comorbidities such as scoliosis and poor growth are common. There is a paucity of information on the natural history of many rare disorders and an even greater deficit of evidence to guide best practice. The population-based and longitudinal Australian Rett Syndrome Database established in 1993 has supported investigations of the natural history of Rett syndrome and the effectiveness of treatments. This paper reviews the disorder Rett syndrome and the evidence for the management of scoliosis and poor growth within a clinical ethics framework. Compared with conservative management, we have shown that spinal fusion is associated with reduced mortality and better respiratory health. We have also shown that gastrostomy insertion is associated with subsequent weight gain. Family counselling for both procedures necessarily must include family perspectives and careful clinical attention to their needs and wishes. Vignettes describing family decision-making and experiences are presented to illustrate the principles of beneficence and autonomy in determining the best interests of the child and family. A blend of evidence-based practice with a strong clinical ethics framework has the capacity to build existing strengths in families and reduce the negative impacts of disability and, in so doing, optimise the health and wellbeing of those with Rett syndrome. © 2016 Paediatrics and Child Health Division (The Royal Australasian College of Physicians).
SLA-based optimisation of virtualised resource for multi-tier web applications in cloud data centres
NASA Astrophysics Data System (ADS)
Bi, Jing; Yuan, Haitao; Tie, Ming; Tan, Wei
2015-10-01
Dynamic virtualised resource allocation is the key to quality-of-service assurance for multi-tier web application services in cloud data centres. In this paper, we develop a self-management architecture of cloud data centres with a virtualisation mechanism for multi-tier web application services. Based on this architecture, we establish a flexible hybrid queueing model to determine the number of virtual machines for each tier of the virtualised application service environment. Besides, we propose a non-linear constrained optimisation problem with restrictions defined in the service level agreement. Furthermore, we develop a heuristic mixed optimisation algorithm to maximise the profit of cloud infrastructure providers, and to meet performance requirements from different clients as well. Finally, we compare the effectiveness of our dynamic allocation strategy with two other allocation strategies. The simulation results show that the proposed resource allocation method is efficient in improving the overall performance and reducing the resource energy cost.
Biomass supply chain optimisation for Organosolv-based biorefineries.
Giarola, Sara; Patel, Mayank; Shah, Nilay
2014-05-01
This work aims at providing a Mixed Integer Linear Programming modelling framework to help define planning strategies for the development of sustainable biorefineries. The up-scaling of an Organosolv biorefinery was addressed via optimisation of the whole system economics. Three real world case studies were addressed to show the high-level flexibility and wide applicability of the tool to model different biomass typologies (i.e. forest fellings, cereal residues and energy crops) and supply strategies. Model outcomes have revealed how supply chain optimisation techniques could help shed light on the development of sustainable biorefineries. Feedstock quality, quantity, temporal and geographical availability are crucial to determine biorefinery location and the cost-efficient way to supply the feedstock to the plant. Storage costs are relevant for biorefineries based on cereal stubble, while wood supply chains present dominant pretreatment operations costs. Copyright © 2014 Elsevier Ltd. All rights reserved.
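The siting and supply decisions such a framework optimises can be illustrated with a small mixed-integer program; the candidate locations, costs, availabilities and demand below are invented, and PuLP merely stands in for the authors' modelling environment.

```python
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum

# Hypothetical data: two candidate biorefinery locations, three biomass
# supply sites, annual demand in kilotonnes (kt).
locations = ["site_A", "site_B"]
supplies = {"farm_1": 40, "farm_2": 60, "forest_1": 80}     # availability, kt
transport = {                                               # cost, EUR/t
    ("farm_1", "site_A"): 12, ("farm_1", "site_B"): 20,
    ("farm_2", "site_A"): 18, ("farm_2", "site_B"): 10,
    ("forest_1", "site_A"): 25, ("forest_1", "site_B"): 15,
}
build_cost = {"site_A": 4.0e6, "site_B": 5.0e6}             # EUR
DEMAND = 120                                                 # kt/year

prob = LpProblem("biorefinery_siting", LpMinimize)
build = {l: LpVariable(f"build_{l}", cat=LpBinary) for l in locations}
ship = {(s, l): LpVariable(f"ship_{s}_{l}", lowBound=0)
        for s in supplies for l in locations}

# Objective: build cost plus transport cost (1 kt = 1000 t).
prob += lpSum(build_cost[l] * build[l] for l in locations) + \
        lpSum(1000 * transport[s, l] * ship[s, l]
              for s in supplies for l in locations)

# Feedstock availability, demand satisfaction, and linking constraints.
for s in supplies:
    prob += lpSum(ship[s, l] for l in locations) <= supplies[s]
prob += lpSum(ship[s, l] for s in supplies for l in locations) >= DEMAND
for s in supplies:
    for l in locations:
        prob += ship[s, l] <= supplies[s] * build[l]

prob.solve()
for l in locations:
    print(l, "built" if build[l].value() == 1 else "not built")
```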
A target recognition method for maritime surveillance radars based on hybrid ensemble selection
NASA Astrophysics Data System (ADS)
Fan, Xueman; Hu, Shengliang; He, Jingbo
2017-11-01
In order to improve the generalisation ability of the maritime surveillance radar, a novel ensemble selection technique, termed Optimisation and Dynamic Selection (ODS), is proposed. During the optimisation phase, the non-dominated sorting genetic algorithm II for multi-objective optimisation is used to find the Pareto front, i.e. a set of ensembles of classifiers representing different tradeoffs between the classification error and diversity. During the dynamic selection phase, the meta-learning method is used to predict whether a candidate ensemble is competent enough to classify a query instance based on three different aspects, namely, feature space, decision space and the extent of consensus. The classification performance and time complexity of ODS are compared against nine other ensemble methods using a self-built full polarimetric high resolution range profile data-set. The experimental results clearly show the effectiveness of ODS. In addition, the influence of the selection of diversity measures is studied concurrently.
On the dynamic rounding-off in analogue and RF optimal circuit sizing
NASA Astrophysics Data System (ADS)
Kotti, Mouna; Fakhfakh, Mourad; Fino, Maria Helena
2014-04-01
Frequently used approaches to solving discrete multivariable optimisation problems consist of computing solutions using a continuous optimisation technique. Then, using heuristics, the variables are rounded off to their nearest available discrete values to obtain a discrete solution. Indeed, in many engineering problems, and particularly in analogue circuit design, component values, such as the geometric dimensions of the transistors, the number of fingers in an integrated capacitor or the number of turns in an integrated inductor, cannot be chosen arbitrarily since they have to obey some technology sizing constraints. However, rounding off the variable values a posteriori can lead to infeasible solutions (solutions that are located too close to the feasible solution frontier) or to a degradation of the obtained results (expulsion from the neighbourhood of a 'sharp' optimum), depending on how the added perturbation affects the solution. Discrete optimisation techniques, such as the dynamic rounding-off (DRO) technique, are therefore needed to overcome the previously mentioned situation. In this paper, we deal with an improvement of the DRO technique. We propose a particle swarm optimisation (PSO)-based DRO technique, and we show, via some analogue and RF examples, the necessity of implementing such a routine within continuous optimisation algorithms.
Optimisation of active suspension control inputs for improved performance of active safety systems
NASA Astrophysics Data System (ADS)
Čorić, Mirko; Deur, Joško; Xu, Li; Tseng, H. Eric; Hrovat, Davor
2018-01-01
A collocation-type control variable optimisation method is used to investigate the extent to which the fully active suspension (FAS) can be applied to improve the vehicle electronic stability control (ESC) performance and reduce the braking distance. First, the optimisation approach is applied to the scenario of vehicle stabilisation during the sine-with-dwell manoeuvre. The results are used to provide insights into different FAS control mechanisms for vehicle performance improvements related to responsiveness and yaw rate error reduction indices. The FAS control performance is compared with the performances of the standard ESC system, an optimal active brake system and a combined FAS and ESC configuration. Second, the optimisation approach is applied to the task of FAS-based braking distance reduction for straight-line vehicle motion. Here, the scenarios of uniform and longitudinally or laterally non-uniform tyre-road friction coefficients are considered. The influences of limited anti-lock braking system (ABS) actuator bandwidth and limit-cycle ABS behaviour are also analysed. The optimisation results indicate that the FAS can provide competitive stabilisation performance and improved agility when compared with the ESC system, and that it can reduce the braking distance by up to 5% for distinctively non-uniform friction conditions.
Headlamps for light based driver assistance
NASA Astrophysics Data System (ADS)
Götz, M.; Kleinkes, M.
2008-04-01
Driving at night is dangerous. Although only 25% of all driving tasks are performed at night, nearly half of all fatal accidents happen during this time. In order to increase safety when driving under poor visibility conditions, automotive front lighting systems have undergone strong development over the last fifteen years. One important milestone was the introduction of Xenon headlamps in 1992, which provide more and brighter light for road illumination than ever before. Since then the paradigm of simply providing more light has changed towards providing optimised light distributions, which support the driver's perception. A first step in this direction was the introduction of dynamic bend lighting and cornering light in 2003. In 2006 the first full AFS headlamp (Adaptive Front Lighting System) allowed an optimised adaptation of the light distribution to the driving situation. These systems use information provided by vehicle sensors and an intelligent algorithm to guide light towards those areas where it is needed. Nowadays, even more information about the vehicle's environment is available. Image processing systems, for example, make it possible to detect other traffic participants, their speed and their driving directions. In future headlamp systems these data will be used to constantly regulate the reach of the light distribution, thus allowing a maximal reach without causing glare. Moreover, technologies that allow constant use of a high-beam light distribution are under development. These systems will illuminate the whole traffic area, excluding only other traffic participants. LED light sources will play a significant role in these scenarios, since they allow certain areas of the road to be illuminated precisely, while neighbouring parts are left in the dark.
PASTIS2 and CROCODILE: XYZ-wide angle polarisation analysis for thermal neutrons
NASA Astrophysics Data System (ADS)
Enderle, Mechthild; Jullien, David; Petoukhov, Alexander; Mouveau, Pascal; Andersen, Ken; Courtois, Pierre
2017-06-01
We present a wide-angle device for inelastic neutron scattering with XYZ-polarisation analysis (PASTIS2). PASTIS2 employs a banana-shaped Si-walled 3He-filter for the polarisation analysis and allows pillar-free neutron scattering for horizontal scattering angles 0-100°. The guide field direction at the sample can be chosen vertical or with 45° incremental steps in the horizontal scattering plane. When PASTIS2 is implemented on a polarised neutron beam, the incident neutron spin can be flipped with an easy-to-optimise broad-band adiabatic resonant flipper (CROCODILE) independent of the guide field direction at the sample position. We have tested the performance of this new device on the polarised thermal triple-axis spectrometer IN20 at the Institut Laue-Langevin, equipped with Heusler monochromator and the FlatCone multi-analyser, and discuss its potential for future instruments.
Systemic solutions for multi-benefit water and environmental management.
Everard, Mark; McInnes, Robert
2013-09-01
The environmental and financial costs of inputs to, and unintended consequences arising from narrow consideration of outputs from, water and environmental management technologies highlight the need for low-input solutions that optimise outcomes across multiple ecosystem services. Case studies examining the inputs and outputs associated with several ecosystem-based water and environmental management technologies reveal a range from those that differ little from conventional electro-mechanical engineering techniques through methods, such as integrated constructed wetlands (ICWs), designed explicitly as low-input systems optimising ecosystem service outcomes. All techniques present opportunities for further optimisation of outputs, and hence for greater cumulative public value. We define 'systemic solutions' as "…low-input technologies using natural processes to optimise benefits across the spectrum of ecosystem services and their beneficiaries". They contribute to sustainable development by averting unintended negative impacts and optimising benefits to all ecosystem service beneficiaries, increasing net economic value. Legacy legislation addressing issues in a fragmented way, associated 'ring-fenced' budgets and established management assumptions represent obstacles to implementing 'systemic solutions'. However, flexible implementation of legacy regulations recognising their primary purpose, rather than slavish adherence to detailed sub-clauses, may achieve greater overall public benefit through optimisation of outcomes across ecosystem services. Systemic solutions are not a panacea if applied merely as 'downstream' fixes, but are part of, and a means to accelerate, broader culture change towards more sustainable practice. This necessarily entails connecting a wider network of interests in the formulation and design of mutually-beneficial systemic solutions, including for example spatial planners, engineers, regulators, managers, farming and other businesses, and researchers working on ways to quantify and optimise delivery of ecosystem services. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Liu, Ming; Zhao, Lindu
2012-08-01
Demand for emergency resources is usually uncertain and varies quickly in an anti-bioterrorism system. Besides, emergency resources that have been allocated to the epidemic areas in the early rescue cycle will affect the demand later. In this article, an integrated and dynamic optimisation model with time-varying demand based on the epidemic diffusion rule is constructed. A heuristic algorithm coupled with the MATLAB mathematical programming solver is adopted to solve the optimisation model. In what follows, the application of the optimisation model, as well as a short sensitivity analysis of the key parameters in the time-varying demand forecast model, is presented. The results show that both the model and the solution algorithm are useful in practice, and that both objectives of inventory level and emergency rescue cost can be controlled effectively. Thus, the model can provide some guidelines for decision makers when coping with emergency rescue problems with uncertain demand, and offers an excellent reference for issues pertaining to bioterrorism.
De Gussem, K; Wambecq, T; Roels, J; Fenu, A; De Gueldre, G; Van De Steene, B
2011-01-01
An ASM2da model of the full-scale waste water plant of Bree (Belgium) has been made. It showed very good correlation with reference operational data. This basic model has been extended to include an accurate calculation of environmental footprint and operational costs (energy consumption, dosing of chemicals and sludge treatment). Two optimisation strategies were compared: lowest cost meeting the effluent consent versus lowest environmental footprint. Six optimisation scenarios have been studied, namely (i) implementation of an online control system based on ammonium and nitrate sensors, (ii) implementation of a control on MLSS concentration, (iii) evaluation of the internal recirculation flow, (iv) the oxygen set point, (v) installation of mixing in the aeration tank, and (vi) evaluation of the nitrate setpoint for post-denitrification. Both the cost-based and the environmental footprint or Life Cycle Assessment (LCA) based optimisation approaches are able to significantly lower the cost and environmental footprint. However, the LCA approach has some advantages over cost minimisation of an existing full-scale plant. LCA tends to choose control settings that are more logical: it results in a safer operation of the plant with fewer risks regarding the consents. It results in a better effluent at a slightly increased cost.
Robustness analysis of bogie suspension components Pareto optimised values
NASA Astrophysics Data System (ADS)
Mousavi Bideleh, Seyed Milad
2017-08-01
Bogie suspension systems of high-speed trains can significantly affect vehicle performance. Multiobjective optimisation problems are often formulated and solved to find the Pareto optimised values of the suspension components and improve cost efficiency in railway operations from different perspectives. Uncertainties in the design parameters of the suspension system can negatively influence the dynamic behaviour of railway vehicles. In this regard, a robustness analysis of the bogie dynamic response with respect to uncertainties in the suspension design parameters is considered. A one-car railway vehicle model with 50 degrees of freedom and wear/comfort Pareto optimised values of the bogie suspension components is chosen for the analysis. Longitudinal and lateral primary stiffnesses, longitudinal and vertical secondary stiffnesses, as well as yaw damping are considered as five design parameters. The effects of parameter uncertainties on wear, ride comfort, track shift force, stability, and risk of derailment are studied by varying the design parameters around their respective Pareto optimised values according to a lognormal distribution with different coefficients of variation (COVs). The robustness analysis is carried out based on the maximum entropy concept. The multiplicative dimensional reduction method is utilised to simplify the calculation of fractional moments and improve the computational efficiency. The results showed that the dynamic response of the vehicle with wear/comfort Pareto optimised values of the bogie suspension is robust against uncertainties in the design parameters, and that the probability of failure is small for parameter uncertainties with COVs up to 0.1.
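An illustrative sketch of the parameter-perturbation step: lognormal samples with a prescribed mean and coefficient of variation are drawn around nominal suspension values and propagated through a stub response function by plain Monte Carlo. The nominal values and the response are invented, and the study's maximum entropy and multiplicative dimensional reduction machinery is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(6)


def lognormal_around(mean, cov, size):
    """Lognormal samples with the requested mean and coefficient of variation."""
    sigma2 = np.log(1.0 + cov ** 2)
    mu = np.log(mean) - 0.5 * sigma2
    return rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=size)


# Nominal (Pareto-optimised) values; the numbers are illustrative (N/m, Ns/m).
nominal = {"k_long_primary": 3.0e7, "k_lat_primary": 6.0e6,
           "k_long_secondary": 2.0e5, "k_vert_secondary": 5.0e5,
           "c_yaw": 1.0e5}


def wear_index(params):
    """Stub for the vehicle-dynamics response; a real study would run a
    multibody simulation here."""
    return sum(np.log(v) for v in params.values())


for cov in (0.02, 0.05, 0.1):
    samples = {name: lognormal_around(val, cov, 2000)
               for name, val in nominal.items()}
    responses = np.array([wear_index({n: samples[n][i] for n in nominal})
                          for i in range(2000)])
    print(f"COV={cov}: response mean {responses.mean():.3f}, "
          f"std {responses.std():.3f}")
```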
Optimisation of flight dynamic control based on many-objectives meta-heuristic: a comparative study
NASA Astrophysics Data System (ADS)
Bureerat, Sujin; Pholdee, Nantiwat; Radpukdee, Thana
2018-05-01
The development of many-objective meta-heuristics (MnMHs) is currently a topic of interest, as they are suited to real-world optimisation problems, which usually involve many objectives. However, most MnMHs have been developed and tested on standard test functions, while their use in real applications is rare. Therefore, in this work, MnMHs are applied to the design optimisation of flight dynamic control. The design problem is posed to find control gains that minimise the control effort, the spiral root, the damping-in-roll root and the sideslip angle deviation, and maximise the damping ratio of the dutch-roll complex pair, the dutch-roll frequency, and the bank angle at pre-specified times of 1 second and 2.8 seconds, subject to several constraints based on the Military Specifications (1969) requirements. Several established MnMHs are used to solve the problem, and their performances are compared. With this research work, the performance of several MnMHs for flight control is investigated. The results obtained will be the baseline for future development of flight dynamics and control.
Speckle-based at-wavelength metrology of X-ray mirrors with super accuracy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kashyap, Yogesh; Wang, Hongchang; Sawhney, Kawal, E-mail: kawal.sawhney@diamond.ac.uk
2016-05-15
X-ray active mirrors, such as bimorph and mechanically bendable mirrors, are increasingly being used on beamlines at modern synchrotron source facilities to generate either focused or “tophat” beams. As well as optical tests in the metrology lab, it is becoming increasingly important to optimise and characterise active optics under actual beamline operating conditions. The recently developed X-ray speckle-based at-wavelength metrology technique has shown great potential. The technique has been established and further developed at the Diamond Light Source and is increasingly being used to optimise active mirrors. Details of the X-ray speckle-based at-wavelength metrology technique and an example of its applicability in characterising and optimising a micro-focusing bimorph X-ray mirror are presented. Importantly, an unprecedented angular sensitivity in the range of two nanoradians for measuring the slope error of an optical surface has been demonstrated. Such a super-precision metrology technique will be beneficial to the manufacturers of polished mirrors and also in the optimisation of beam shaping during experiments.
The use of surrogates for an optimal management of coupled groundwater-agriculture hydrosystems
NASA Astrophysics Data System (ADS)
Grundmann, J.; Schütze, N.; Brettschneider, M.; Schmitz, G. H.; Lennartz, F.
2012-04-01
To ensure optimal sustainable water resources management in arid coastal environments, we develop a new simulation-based integrated water management system. It aims at achieving the best possible solutions for groundwater withdrawals for agricultural and municipal water use, including saline water management, together with a substantial increase of the water use efficiency in irrigated agriculture. To achieve a robust and fast operation of the management system regarding water quality and water quantity, we develop appropriate surrogate models by combining physically based process modelling with methods of artificial intelligence. We use an artificial neural network, trained on a scenario database generated by a numerical density-dependent groundwater flow model, to model the aquifer response including the seawater interface. To simulate the behaviour of highly productive agricultural farms, crop water production functions are generated by means of soil-vegetation-atmosphere-transport (SVAT) models adapted to the regional climate conditions, together with a novel evolutionary optimisation algorithm for optimal irrigation scheduling and control. We apply both surrogates within a simulation-based optimisation environment using the characteristics of the south Batinah region in the Sultanate of Oman, which is affected by saltwater intrusion into the coastal aquifer due to excessive groundwater withdrawal for irrigated agriculture. We demonstrate the effectiveness of our methodology for the evaluation and optimisation of different irrigation practices, cropping patterns and resulting abstraction scenarios. Due to conflicting objectives, such as profit-oriented agriculture versus aquifer sustainability, a multi-criteria optimisation is performed.
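A minimal sketch of the surrogate idea, assuming a pre-computed scenario database: a small feed-forward network (scikit-learn's MLPRegressor) is fitted to map management decisions such as pumping rates to an aquifer response. The data below are synthetic placeholders rather than outputs of the density-dependent groundwater model used in the study.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Synthetic "scenario database": inputs are pumping rates at three well fields,
# the output is a made-up seawater interface position.
X = rng.uniform(0.0, 1.0, size=(500, 3))
y = 10.0 + 25.0 * X[:, 0] + 10.0 * X[:, 1] ** 2 - 5.0 * X[:, 2] + rng.normal(0.0, 0.5, 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=1)
surrogate.fit(X_train, y_train)
print("surrogate R^2 on held-out scenarios:", surrogate.score(X_test, y_test))

# The fitted surrogate can then stand in for the expensive groundwater model
# inside a simulation-based optimisation loop.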
NASA Astrophysics Data System (ADS)
Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.
2014-05-01
In this contribution a framework for the computer-aided planning and optimisation of functionally graded components is presented. The framework is divided into three modules - the "Component Description", the "Expert System" for the synthesis of several process chains and the "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model by a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating, cooling and forming processes. The dependencies between the component and the applied manufacturing processes, as well as between the processes themselves, need to be considered. The Expert System utilises an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information of the knowledge base via relations. The third module performs the evaluation of the generated process chains. To accomplish this, the parameters of each process are optimised with respect to the component specification, whereby the result of the best parameterisation is used as a representative value. Finally, the process chain which is capable of manufacturing a functionally graded component in an optimal way with respect to the property distributions of the component description is presented by means of a dedicated specification technique.
Haworth, Annette; Mears, Christopher; Betts, John M; Reynolds, Hayley M; Tack, Guido; Leo, Kevin; Williams, Scott; Ebert, Martin A
2016-01-07
Treatment plans for ten patients, initially treated with a conventional approach to low dose-rate brachytherapy (LDR, 145 Gy to the entire prostate), were compared with plans for the same patients created with an inverse-optimisation planning process utilising a biologically based objective. The 'biological optimisation' considered a non-uniform distribution of tumour cell density through the prostate based on known and expected locations of the tumour. Using dose-planning objectives derived from our previous biological-model validation study, the volume of the urethra receiving 125% of the conventional prescription (145 Gy) was reduced from a median value of 64% to less than 8%, whilst maintaining high values of tumour control probability (TCP). On average, the number of planned seeds was reduced from 85 to less than 75. The robustness of plans to random seed displacements needs to be carefully considered when using contemporary seed placement techniques. We conclude that an inverse planning approach to LDR treatments, based on a biological objective, has the potential to maintain high rates of tumour control whilst minimising dose to healthy tissue. In future, the radiobiological model will be informed using multi-parametric MRI to provide a personalised medicine approach.
NASA Astrophysics Data System (ADS)
Jin, Chenxia; Li, Fachao; Tsang, Eric C. C.; Bulysheva, Larissa; Kataev, Mikhail Yu
2017-01-01
In many real industrial applications, the integration of raw data with a methodology can support economically sound decision-making. Furthermore, most of these tasks involve complex optimisation problems, so seeking better solutions is critical. As an intelligent search optimisation algorithm, the genetic algorithm (GA) is an important technique for complex system optimisation, but it has internal drawbacks such as low computational efficiency and premature convergence. Improving the performance of GAs is therefore a vital topic in both academic and applied research. In this paper, a new real-coded crossover operator, called the compound arithmetic crossover operator (CAC), is proposed. CAC is used in conjunction with a uniform mutation operator to define a new genetic algorithm, CAC10-GA. This GA is compared with an existing genetic algorithm (AC10-GA) that comprises an arithmetic crossover operator and a uniform mutation operator. To judge the performance of CAC10-GA, two kinds of analysis are performed: first, the convergence of CAC10-GA is analysed using Markov chain theory; second, a pair-wise comparison is carried out between CAC10-GA and AC10-GA on two test problems available in the global optimisation literature. The overall comparative study shows that the CAC performs quite well and that the resulting CAC10-GA outperforms the AC10-GA.
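The abstract does not spell out the compound form of the CAC operator, so the sketch below only illustrates the baseline ingredients it builds on: a standard real-coded arithmetic crossover and a uniform mutation operator, as used in the reference AC10-GA. The bounds and probabilities are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(2)
LOWER, UPPER = -5.0, 5.0  # assumed search-space bounds

def arithmetic_crossover(p1, p2):
    # Children are convex combinations of the parents with a random weight alpha.
    alpha = rng.uniform()
    return alpha * p1 + (1.0 - alpha) * p2, alpha * p2 + (1.0 - alpha) * p1

def uniform_mutation(x, p_mut=0.1):
    # Each gene is replaced, with probability p_mut, by a uniform random value
    # drawn from the search-space bounds.
    x = x.copy()
    mask = rng.uniform(size=x.shape) < p_mut
    x[mask] = rng.uniform(LOWER, UPPER, size=mask.sum())
    return x

# Tiny demonstration on two 4-dimensional parents.
p1 = rng.uniform(LOWER, UPPER, 4)
p2 = rng.uniform(LOWER, UPPER, 4)
c1, c2 = arithmetic_crossover(p1, p2)
print(uniform_mutation(c1), uniform_mutation(c2))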
3D interlock design 100% PVDF piezoelectric to improve energy harvesting
NASA Astrophysics Data System (ADS)
Talbourdet, Anaëlle; Rault, François; Lemort, Guillaume; Cochrane, Cédric; Devaux, Eric; Campagne, Christine
2018-07-01
Piezoelectric textile structures based on 100% poly(vinylidene fluoride) (PVDF) were developed and characterised. Multifilaments of 246 tex were produced by melt spinning. The mechanical stretching during the process provides PVDF fibres with a piezoelectric β-phase content of up to 97%, as measured by FTIR experiments. Several studies have been carried out on piezoelectric PVDF-based flexible structures (films or textiles); the aim of this study is to investigate the differences between 2D and 3D woven fabrics made from 100% piezoelectric PVDF multifilament yarns with an optimised piezoelectric crystalline phase. The textile structures were poled after the weaving process, and a maximum output voltage of 2.3 V was observed for the 3D woven fabric under compression in DMA tests. Energy harvesting is optimised in the 3D interlock thanks to the stresses on the multifilaments through the thickness. The addition of a resistor makes it possible to measure an energy of 10.5 μJ.m‑2 over 10 compression cycles of 5 s each.
NASA Astrophysics Data System (ADS)
Ghasemy Yaghin, R.; Fatemi Ghomi, S. M. T.; Torabi, S. A.
2015-10-01
In most markets, price differentiation mechanisms enable manufacturers to offer different prices for their products or services in different customer segments; however, perfect price discrimination is usually impossible for manufacturers. The importance of accounting for uncertainty in such environments spurs an interest in developing appropriate decision-making tools to deal with uncertain and ill-defined parameters in joint pricing and lot-sizing problems. This paper proposes a hybrid bi-objective credibility-based fuzzy optimisation model, including both quantitative and qualitative objectives, to cope with these issues. Taking marketing and lot-sizing decisions into account simultaneously, the model aims to maximise the total profit of the manufacturer and to improve the service aspects of retailing while setting different prices with arbitrage considerations. After applying appropriate strategies to defuzzify the original model, the resulting non-linear multi-objective crisp model is solved by a fuzzy goal programming method. An efficient stochastic search procedure using particle swarm optimisation is also proposed to solve the non-linear crisp model.
NASA Astrophysics Data System (ADS)
Sheikhan, Mansour; Abbasnezhad Arabi, Mahdi; Gharavian, Davood
2015-10-01
Artificial neural networks are efficient models in pattern recognition applications, but their performance depends on employing a suitable structure and connection weights. This study used a hybrid method for obtaining the optimal weight set and architecture of a recurrent neural emotion classifier based on the gravitational search algorithm (GSA) and its binary version (BGSA), respectively. By considering features of the speech signal related to prosody, voice quality, and spectrum, a rich feature set was constructed. To select more efficient features, a fast feature selection method was employed. The performance of the proposed hybrid GSA-BGSA method was compared with similar hybrid methods based on the particle swarm optimisation (PSO) algorithm and its binary version, PSO and the discrete firefly algorithm, and a hybrid of error back-propagation and genetic algorithm that were used for optimisation. Experimental tests on the Berlin emotional database demonstrated the superior performance of the proposed method using a lighter network structure.
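For readers unfamiliar with GSA, the sketch below shows the textbook continuous update (fitness-derived masses, pairwise attraction, velocity and position updates) on a toy objective. It is only a schematic of the search principle, not the hybrid GSA/BGSA training of the recurrent emotion classifier, and all parameter values are assumptions.

import numpy as np

rng = np.random.default_rng(3)

def objective(x):
    # Toy objective standing in for the classifier training error.
    return np.sum(x**2, axis=1)

n_agents, dim, iters = 20, 5, 100
G0, alpha, eps = 100.0, 20.0, 1e-12
X = rng.uniform(-5.0, 5.0, (n_agents, dim))
V = np.zeros_like(X)
best_val = np.inf

for t in range(iters):
    fit = objective(X)
    best_val = min(best_val, fit.min())
    best, worst = fit.min(), fit.max()
    m = (fit - worst) / (best - worst + eps)      # lower objective -> larger mass
    M = m / (m.sum() + eps)
    G = G0 * np.exp(-alpha * t / iters)           # gravitational constant decays over time
    A = np.zeros_like(X)
    for i in range(n_agents):
        for j in range(n_agents):
            if i != j:
                diff = X[j] - X[i]
                R = np.linalg.norm(diff)
                # Randomly weighted acceleration of agent i towards agent j.
                A[i] += rng.uniform() * G * M[j] * diff / (R + eps)
    V = rng.uniform(size=V.shape) * V + A
    X = X + V

print("best objective value found:", best_val)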
Optimisation of substrate blends in anaerobic co-digestion using adaptive linear programming.
García-Gen, Santiago; Rodríguez, Jorge; Lema, Juan M
2014-12-01
Anaerobic co-digestion of multiple substrates has the potential to enhance biogas productivity by making use of the complementary characteristics of different substrates. A blending strategy based on a linear programming optimisation method is proposed, aiming at maximising COD conversion into methane while simultaneously maintaining digestate and biogas quality. The method incorporates experimental and heuristic information to define the objective function and the linear restrictions. The active constraints are continuously adapted (by relaxing the restriction boundaries) such that further optimisation in terms of methane productivity can be achieved. The feasibility of the blends calculated with this methodology was previously tested and accurately predicted with an ADM1-based co-digestion model. This was validated in a continuously operated pilot plant, treating different mixtures of glycerine, gelatine and pig manure for several months at organic loading rates from 1.50 to 4.93 gCOD/Ld and hydraulic retention times between 32 and 40 days under mesophilic conditions. Copyright © 2014 Elsevier Ltd. All rights reserved.
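A minimal sketch of the linear programming blending step, using scipy.optimize.linprog: maximise the methane production of the blend subject to linear restrictions on the feed. The yield coefficients, restriction matrix and bounds below are invented placeholders and not the experimental or heuristic values used in the study; the adaptive part would correspond to re-solving after relaxing the active entries of b_ub.

import numpy as np
from scipy.optimize import linprog

# Decision variables: daily feed of glycerine, gelatine and pig manure (arbitrary units).
methane_yield = np.array([8.0, 4.0, 1.5])   # hypothetical CH4 yield per unit of each substrate
c = -methane_yield                          # linprog minimises, so negate to maximise

# Hypothetical linear restrictions: caps on total COD load and nitrogen load.
A_ub = np.array([
    [3.0, 2.0, 1.0],    # COD contribution per unit of each substrate
    [0.1, 0.8, 0.4],    # nitrogen contribution per unit of each substrate
])
b_ub = np.array([50.0, 12.0])

# Availability bounds for each substrate.
bounds = [(0.0, 10.0), (0.0, 10.0), (0.0, 20.0)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("optimal blend:", res.x, "expected methane:", -res.fun)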
INDIVIDUAL-BASED MODELS: POWERFUL OR POWER STRUGGLE?
Willem, L; Stijven, S; Hens, N; Vladislavleva, E; Broeckhove, J; Beutels, P
2015-01-01
Individual-based models (IBMs) offer endless possibilities to explore various research questions but come with high model complexity and computational burden. Large-scale IBMs have become feasible, but the novel hardware architectures require adapted software. The increased model complexity also requires systematic exploration to gain thorough system understanding. We elaborate on the development of IBMs for vaccine-preventable infectious diseases and on model exploration with active learning. Investment in IBM simulator code can lead to significant runtime reductions. We found large performance differences due to data locality: sorting the population once reduced simulation time by a factor of two, and storing person attributes separately instead of using person objects also proved more efficient. Next, we improved model performance by up to 70% by structuring potential contacts based on health status before processing disease transmission. The active learning approach we present is based on iterative surrogate modelling and model-guided experimentation. Symbolic regression is used for nonlinear response surface modelling with automatic feature selection. We illustrate our approach using an IBM for influenza vaccination. After optimising the parameter space, we observed an inverse relationship between vaccination coverage and the clinical attack rate, reinforced by herd immunity. These insights can be used to focus and optimise research activities, and to reduce both dimensionality and decision uncertainty.
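To make the data-locality observation concrete, the following Python/NumPy sketch stores person attributes in separate contiguous arrays (instead of one object per person), sorts the population by health status once, and then runs a vectorised transmission step over the contiguous susceptible block. It is an illustration of the storage idea only, with made-up attributes and probabilities, not a port of the authors' simulator.

import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# Structure-of-arrays layout: one contiguous array per attribute.
age = rng.integers(0, 90, n)
health = rng.choice([0, 1, 2], size=n, p=[0.90, 0.05, 0.05])  # 0 susceptible, 1 infectious, 2 immune

# Sorting by health status once groups people with the same status into
# contiguous memory, which helps the transmission step below.
order = np.argsort(health, kind="stable")
age, health = age[order], health[order]

# Vectorised transmission step over the susceptible block only.
susceptible = np.flatnonzero(health == 0)
p_infection = 0.001  # hypothetical per-step infection probability
newly_infected = susceptible[rng.random(susceptible.size) < p_infection]
health[newly_infected] = 1
print("new infections this step:", newly_infected.size)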
NASA Astrophysics Data System (ADS)
Slot Thing, Rune; Bernchou, Uffe; Mainegra-Hing, Ernesto; Hansen, Olfred; Brink, Carsten
2016-08-01
A comprehensive artefact correction method for clinical cone beam CT (CBCT) images acquired for image-guided radiation therapy (IGRT) on a commercial system is presented. The method is demonstrated to reduce artefacts and recover CT-like Hounsfield units (HU) in reconstructed CBCT images of five lung cancer patients. Projection-image-based artefact corrections of image lag, detector scatter, body scatter and beam hardening are described and applied to CBCT images of five lung cancer patients. Image quality is evaluated through the visual appearance of the reconstructed images, HU correspondence with the planning CT images, and total volume HU error. Artefacts are reduced and CT-like HUs are recovered in the artefact-corrected CBCT images. Visual inspection confirms that artefacts are indeed suppressed by the proposed method, and the HU root mean square difference between reconstructed CBCTs and the reference CT images is reduced by 31% when using the artefact corrections compared with the standard clinical CBCT reconstruction. A versatile artefact correction method for clinical CBCT images acquired for IGRT has been developed. HU values are recovered in the corrected CBCT images. The proposed method relies on post-processing of clinical projection images and does not require patient-specific optimisation. It is thus a powerful tool for image quality improvement of large numbers of CBCT images.
Franks, Paul W; Poveda, Alaitz
2017-05-01
Precision diabetes medicine, the optimisation of therapy using patient-level biomarker data, has stimulated enormous interest throughout society, as it provides hope of more effective, less costly and safer ways of preventing, treating, and perhaps even curing the disease. While precision diabetes medicine is often framed in the context of pharmacotherapy, using biomarkers to personalise lifestyle recommendations, intended to lower type 2 diabetes risk or to slow progression, is also conceivable. There are at least four ways in which this might work: (1) by helping to predict a person's susceptibility to adverse lifestyle exposures; (2) by facilitating the stratification of type 2 diabetes into subclasses, some of which may be prevented or treated optimally with specific lifestyle interventions; (3) by aiding the discovery of prognostic biomarkers that help guide the timing and intensity of lifestyle interventions; and (4) by predicting treatment response. In this review we provide an overview of the rationale for precision diabetes medicine, specifically as it relates to lifestyle; we also scrutinise the existing evidence, discuss the barriers germane to research in this field and consider how this work is likely to proceed.
Optimisation of reconstruction-reprojection-based motion correction for cardiac SPECT.
Kangasmaa, Tuija S; Sohlberg, Antti O
2014-07-01
Cardiac motion is a challenging cause of image artefacts in myocardial perfusion SPECT. A wide range of motion correction methods have been developed over the years, and so far automatic algorithms based on the reconstruction-reprojection principle have proved to be the most effective. However, these methods have not been fully optimised in terms of their free parameters and implementation details. Two slightly different implementations of reconstruction-reprojection-based motion correction techniques were optimised for effective, good-quality motion correction and then compared with each other. The first of these methods (Method 1) was the traditional reconstruction-reprojection motion correction algorithm, where the motion correction is done in projection space, whereas the second algorithm (Method 2) performed motion correction in reconstruction space. The parameters that were optimised include the type of cost function (squared difference, normalised cross-correlation and mutual information) used to compare measured and reprojected projections, and the number of iterations needed. The methods were tested with motion-corrupt projection datasets, which were generated by adding three different types of motion (lateral shift, vertical shift and vertical creep) to motion-free cardiac perfusion SPECT studies. Method 2 performed slightly better overall than Method 1, but the difference between the two implementations was small. The execution time for Method 2 was much longer than for Method 1, which limits its clinical usefulness. The mutual information cost function clearly gave the best results for all three motion sets and both correction methods. Three iterations were sufficient for a good-quality correction using Method 1. The traditional reconstruction-reprojection-based method with three update iterations and a mutual information cost function is a good option for motion correction in clinical myocardial perfusion SPECT.
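As an illustration of the best-performing cost function, the sketch below computes the mutual information between two images from their joint histogram; in the correction loop the candidate motion shift that maximises this value between measured and reprojected projections would be retained. The arrays here are random placeholders rather than SPECT projections, and the bin count is an assumption.

import numpy as np

def mutual_information(a, b, bins=32):
    # Joint histogram of the two images, normalised to a joint probability table.
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    return np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz]))

rng = np.random.default_rng(5)
measured = rng.random((64, 64))
reprojected = measured + 0.05 * rng.random((64, 64))   # well-aligned case
shifted = np.roll(measured, 5, axis=0)                 # motion-corrupted case

print("MI, aligned:", mutual_information(measured, reprojected))
print("MI, shifted:", mutual_information(measured, shifted))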
Intelligent Internet-based information system optimises diabetes mellitus management in communities.
Wei, Xuejuan; Wu, Hao; Cui, Shuqi; Ge, Caiying; Wang, Li; Jia, Hongyan; Liang, Wannian
2018-05-01
To evaluate the effect of an intelligent Internet-based information system on optimising the management of patients diagnosed with type 2 diabetes mellitus (T2DM). In 2015, a T2DM information system was introduced to optimise the management of T2DM patients for 1 year in the Fangzhuang community of Beijing, China. A total of 602 T2DM patients who were registered in the health service centre of Fangzhuang community were enrolled based on an isometric sampling technique. The data from 587 patients were used in the final analysis. The intervention effect was subsequently assessed by statistically comparing multiple parameters, such as the prevalence of glycaemic control, standard health management and annual outpatient consultation visits per person, before and after the implementation of the T2DM information system. In 2015, a total of 1668 T2DM patients were newly registered in Fangzhuang community. The glycaemic control rate was 37.65% in 2014 and was significantly elevated to 62.35% in 2015 (p < 0.001). After application of the Internet-based information system, the rate of standard health management increased from 48.04% to 85.01% (p < 0.001). Among all registered T2DM patients, the annual outpatient consultation visits per person in Fangzhuang community were 24.88% in 2014 and decreased considerably to 22.84% in 2015 (p < 0.001), and declined from 14.59% to 13.66% in general hospitals (p < 0.05). Application of the T2DM information system optimised the management of T2DM patients in Fangzhuang community and decreased outpatient numbers in both community and general hospitals, playing a positive role in assisting T2DM patients and their healthcare providers to better manage this chronic illness.
A joint swarm intelligence algorithm for multi-user detection in MIMO-OFDM system
NASA Astrophysics Data System (ADS)
Hu, Fengye; Du, Dakun; Zhang, Peng; Wang, Zhijun
2014-11-01
In the multi-input multi-output orthogonal frequency division multiplexing (MIMO-OFDM) system, traditional multi-user detection (MUD) algorithms, which are usually used to suppress multiple-access interference, find it difficult to balance detection performance against algorithmic complexity. To solve this problem, this paper proposes a joint swarm intelligence algorithm called Ant Colony and Particle Swarm Optimisation (AC-PSO) by integrating the particle swarm optimisation (PSO) and ant colony optimisation (ACO) algorithms. Simulation results show that, with low computational complexity, the AC-PSO-based MUD for the MIMO-OFDM system achieves detection performance comparable to that of the maximum likelihood algorithm. Thus, the proposed AC-PSO algorithm provides a satisfactory trade-off between computational complexity and detection performance.
Albadr, Musatafa Abbas Abbood; Tiun, Sabrina; Al-Dhief, Fahad Taha; Sammour, Mahmoud A M
2018-01-01
Spoken Language Identification (LID) is the process of determining and classifying natural language from a given content and dataset. Typically, data must be processed to extract useful features to perform LID. Feature extraction for LID is, according to the literature, a mature process: standard features for LID have already been developed using Mel-Frequency Cepstral Coefficients (MFCC), Shifted Delta Cepstral (SDC) features, the Gaussian Mixture Model (GMM) and, most recently, the i-vector based framework. However, the process of learning based on the extracted features remains to be improved (i.e. optimised) to capture all the knowledge embedded in the extracted features. The Extreme Learning Machine (ELM) is an effective learning model used to perform classification and regression analysis and is extremely useful for training a single hidden layer neural network. Nevertheless, the learning process of this model is not entirely effective (i.e. optimised) due to the random selection of weights within the input hidden layer. In this study, the ELM is selected as a learning model for LID based on standard feature extraction. One of the optimisation approaches for ELM, the Self-Adjusting Extreme Learning Machine (SA-ELM), is selected as the benchmark and improved by altering the selection phase of the optimisation process. The selection process is performed incorporating both the Split-Ratio and K-Tournament methods, and the improved SA-ELM is named the Enhanced Self-Adjusting Extreme Learning Machine (ESA-ELM). The results are generated based on LID with datasets created from eight different languages. The results of the study showed the superior performance of the Enhanced Self-Adjusting Extreme Learning Machine LID (ESA-ELM LID) compared with the SA-ELM LID, with ESA-ELM LID achieving an accuracy of 96.25%, compared to the accuracy of SA-ELM LID of only 95.00%.
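A minimal sketch of the underlying ELM, assuming synthetic features: the hidden-layer weights are drawn at random and only the output weights are obtained by least squares. The SA-ELM/ESA-ELM selection mechanisms (Split-Ratio and K-Tournament) described above are not reproduced here, and the data are placeholders rather than the eight-language LID features.

import numpy as np

rng = np.random.default_rng(6)

# Synthetic 3-class problem standing in for the extracted LID features.
centers = rng.normal(scale=2.0, size=(3, 20))
y = rng.integers(0, 3, size=600)
X = centers[y] + rng.normal(size=(600, 20))
T = np.eye(3)[y]                              # one-hot targets

n_hidden = 100
W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (kept fixed)
b = rng.normal(size=n_hidden)                 # random biases (kept fixed)

H = np.tanh(X @ W + b)                        # hidden-layer activations
beta = np.linalg.pinv(H) @ T                  # output weights by least squares

pred = np.argmax(H @ beta, axis=1)
print("training accuracy:", np.mean(pred == y))

# SA-ELM and ESA-ELM replace the purely random choice of W and b with an
# optimisation loop over candidate hidden layers.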
Santonastaso, Giovanni Francesco; Bortone, Immacolata; Chianese, Simeone; Di Nardo, Armando; Di Natale, Michele; Erto, Alessandro; Karatza, Despina; Musmarra, Dino
2017-09-19
This paper presents a method to optimise a discontinuous permeable adsorptive barrier (PAB-D). The method is based on the comparison of different PAB-D configurations obtained by changing some of the main PAB-D design parameters. In particular, the well diameters, the distance between two consecutive passive wells and the distance between two consecutive well lines were varied, and a cost analysis for each configuration was carried out in order to define the best performing and most cost-effective PAB-D configuration. As a case study, a benzene-contaminated aquifer located in an urban area in the north of Naples (Italy) was considered. The PAB-D configuration with a well diameter of 0.8 m resulted in the best optimised layout in terms of performance and cost-effectiveness. Moreover, in order to identify the best configuration for the remediation of the aquifer studied, a comparison with a continuous permeable adsorptive barrier (PAB-C) was added. In particular, this showed a 40% reduction of the total remediation costs when using the optimised PAB-D.
NASA Astrophysics Data System (ADS)
Hsu, Chih-Ming
2014-12-01
Portfolio optimisation is an important issue in the field of investment/financial decision-making and has received considerable attention from both researchers and practitioners. However, besides portfolio optimisation, a complete investment procedure should also include the selection of profitable investment targets and the determination of the optimal timing for buying/selling the investment targets. In this study, an integrated procedure using data envelopment analysis (DEA), artificial bee colony (ABC) and genetic programming (GP) is proposed to resolve a portfolio optimisation problem. The proposed procedure is evaluated through a case study on investing in stocks in the semiconductor sub-section of the Taiwan stock market for 4 years. The potential average 6-month return on investment of 9.31% from 1 November 2007 to 31 October 2011 indicates that the proposed procedure can be considered a feasible and effective tool for making outstanding investment plans, and thus making profits in the Taiwan stock market. Moreover, it is a strategy that can help investors to make profits even when the overall stock market suffers a loss.
NASA Astrophysics Data System (ADS)
Dittmar, N.; Haberstroh, Ch.; Hesse, U.; Krzyzowski, M.
2016-04-01
The transfer of liquid helium (LHe) into mobile dewars or transport vessels is a common and unavoidable process at LHe decant stations. During this transfer appreciable amounts of LHe evaporate due to heat leak and pressure drop. The helium gas generated in this way needs to be collected and reliquefied, which requires a huge amount of electrical energy. Therefore, the design of transfer lines used at LHe decant stations has been optimised to establish an LHe transfer with minor evaporation losses, which increases the overall efficiency and capacity of LHe decant stations. This paper presents the experimental results achieved during the thermohydraulic optimisation of a flexible LHe transfer line. An extensive measurement campaign with a set of dedicated transfer lines equipped with pressure and temperature sensors led to unique experimental data on this specific transfer process. The experimental results cover the heat leak, the pressure drop, the transfer rate, the outlet quality, and the cool-down and warm-up behaviour of the examined transfer lines. Based on the obtained results, the design of the considered flexible transfer line has been optimised, featuring reduced heat leak and pressure drop.
NASA Astrophysics Data System (ADS)
Mallick, S.; Kar, R.; Mandal, D.; Ghoshal, S. P.
2016-07-01
This paper proposes a novel hybrid optimisation algorithm which combines the recently proposed evolutionary algorithm Backtracking Search Algorithm (BSA) with another widely accepted evolutionary algorithm, namely Differential Evolution (DE). The proposed algorithm, called BSA-DE, is employed for the optimal design of two commonly used analogue circuits, namely a Complementary Metal Oxide Semiconductor (CMOS) differential amplifier circuit with current mirror load and a CMOS two-stage operational amplifier (op-amp) circuit. BSA has a simple structure that is effective, fast and capable of solving multimodal problems. DE is a stochastic, population-based heuristic approach with the capability to solve global optimisation problems. In this paper, the transistor sizes are optimised using the proposed BSA-DE to minimise the areas occupied by the circuits and to improve the performance of the circuits. The simulation results justify the superiority of BSA-DE in global convergence properties and fine-tuning ability, and prove it to be a promising candidate for the optimal design of analogue CMOS amplifier circuits. The simulation results obtained for both amplifier circuits prove the effectiveness of the proposed BSA-DE-based approach over DE, harmony search (HS), artificial bee colony (ABC) and particle swarm optimisation (PSO) in terms of convergence speed, design specifications and design parameters. It is shown that the BSA-DE-based design technique for each amplifier circuit yields the smallest MOS transistor area, and each designed circuit is shown to have the best performance parameters, such as gain and power dissipation, compared with those of other recently reported work.
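For context on the DE half of the hybrid, the sketch below shows the standard DE/rand/1 mutation, binomial crossover and greedy selection step on a toy objective. It is not the BSA-DE hybrid itself, and the objective, bounds and control parameters are illustrative assumptions rather than the amplifier sizing problem.

import numpy as np

rng = np.random.default_rng(7)

def objective(x):
    # Toy objective standing in for the circuit area/performance cost.
    return np.sum(x**2)

n_pop, dim, F, CR, iters = 30, 8, 0.7, 0.9, 200
pop = rng.uniform(-5.0, 5.0, (n_pop, dim))
fit = np.array([objective(x) for x in pop])

for _ in range(iters):
    for i in range(n_pop):
        idx = [j for j in range(n_pop) if j != i]
        a, b, c = pop[rng.choice(idx, 3, replace=False)]
        mutant = a + F * (b - c)                  # DE/rand/1 mutation
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True           # ensure at least one mutant gene
        trial = np.where(cross, mutant, pop[i])   # binomial crossover
        f_trial = objective(trial)
        if f_trial < fit[i]:                      # greedy selection
            pop[i], fit[i] = trial, f_trial

print("best objective value:", fit.min())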
Conjugate gradient minimisation approach to generating holographic traps for ultracold atoms.
Harte, Tiffany; Bruce, Graham D; Keeling, Jonathan; Cassettari, Donatella
2014-11-03
Direct minimisation of a cost function can in principle provide a versatile and highly controllable route to computational hologram generation. Here we show that the careful design of cost functions, combined with numerically efficient conjugate gradient minimisation, establishes a practical method for the generation of holograms for a wide range of target light distributions. This results in a guided optimisation process, with a crucial advantage illustrated by the ability to circumvent optical vortex formation during hologram calculation. We demonstrate the implementation of the conjugate gradient method for both discrete and continuous intensity distributions and discuss its applicability to optical trapping of ultracold atoms.
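A rough sketch of the cost-minimisation idea, applied to a tiny phase-only hologram whose far field (computed with an FFT) should match a target intensity; SciPy's conjugate gradient minimiser is used with numerically approximated gradients. The grid size, cost function and target pattern are illustrative assumptions, and a practical implementation would supply analytic gradients and the vortex-avoiding cost terms discussed in the paper.

import numpy as np
from scipy.optimize import minimize

N = 16                                            # tiny grid so numerical gradients stay cheap
target = np.zeros((N, N))
target[N // 2, N // 2 - 2:N // 2 + 3] = 1.0       # target intensity: a short bright line
target *= N**2 / target.sum()                     # match the total far-field energy

def cost(phase_flat):
    phase = phase_flat.reshape(N, N)
    field = np.fft.fft2(np.exp(1j * phase)) / N   # far field of a phase-only hologram
    return np.sum((np.abs(field) ** 2 - target) ** 2)

x0 = np.random.default_rng(8).uniform(0.0, 2.0 * np.pi, N * N)
res = minimize(cost, x0, method="CG", options={"maxiter": 200})
print("initial cost:", cost(x0), "final cost:", res.fun)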
Design of an integrated team project as bachelor thesis in bioscience engineering
NASA Astrophysics Data System (ADS)
Peeters, Marie-Christine; Londers, Elsje; Van der Hoeven, Wouter
2014-11-01
Following the decision at the KU Leuven to implement the educational concept of guided independent learning and to encourage students to participate in scientific research, the Faculty of Bioscience Engineering decided to introduce a bachelor thesis. Competencies, such as communication, scientific research and teamwork, need to be present in the design of this thesis. Because of the high number of students and the multidisciplinary nature of the graduates, all research divisions of the faculty are asked to participate. The yearly surveys and hearings were used for further optimisation. The actual design of this bachelor thesis is presented and discussed in this paper.
NASA Astrophysics Data System (ADS)
Hill, Ian; White, Toby; Owen, Sarah
2014-05-01
Extraction and processing of rock materials to produce aggregates is carried out at some 20,000 quarries across the EU. All stages of the processing and transport of hard and dense materials inevitably consume high levels of energy and have consequent significant carbon footprints. The FP7 project "the Energy Efficient Quarry" (EE-Quarry) has been addressing this problem and has devised strategies, supported by modelling software, to assist the quarrying industry to assess and optimise its energy use, and to minimise its carbon footprint. Aggregate quarries across Europe vary enormously in the scale of the quarrying operations, the nature of the worked mineral, and the processing to produce a final market product. Nevertheless most quarries involve most or all of a series of essential stages; deposit assessment, drilling and blasting, loading and hauling, and crushing and screening. The process of determining the energy-efficiency of each stage is complex, but is broadly understood in principle and there are numerous sources of information and guidance available in the literature and on-line. More complex still is the interaction between each of these stages. For example, using a little more energy in blasting to increase fragmentation may save much greater energy in later crushing and screening, but also generate more fines material which is discarded as waste and the embedded energy in this material is lost. Thus the calculation of the embedded energy in the waste material becomes an input to the determination of the blasting strategy. Such feedback loops abound in the overall quarry optimisation. The project has involved research and demonstration operations at a number of quarries distributed across Europe carried out by all partners in the EE-Quarry project, working in collaboration with many of the major quarrying companies operating in the EU. The EE-Quarry project is developing a sophisticated modelling tool, the "EE-Quarry Model" available to the quarrying industry on a web-based platform. This tool guides quarry managers and operators through the complex, multi-layered, iterative, process of assessing the energy efficiency of their own quarry operation. They are able to evaluate the optimisation of the energy-efficiency of the overall quarry through examining both the individual stages of processing, and the interactions between them. The project is also developing on-line distance learning modules designed for Continuous Professional Development (CPD) activities for staff across the quarrying industry in the EU and beyond. The presentation will describe development of the model, and the format and scope of the resulting software tool and its user-support available to the quarrying industry.
NASA Astrophysics Data System (ADS)
Fouladi, Ehsan; Mojallali, Hamed
2018-01-01
In this paper, an adaptive backstepping controller is tuned to synchronise two chaotic Colpitts oscillators in a master-slave configuration. The parameters of the controller are determined using the shark smell optimisation (SSO) algorithm. Numerical results are presented and compared with those of the particle swarm optimisation (PSO) algorithm. Simulation results show better performance in terms of accuracy and convergence for the proposed optimised method compared with the PSO-optimised controller or any non-optimised backstepping controller.
Haering, Diane; Huchez, Aurore; Barbier, Franck; Holvoët, Patrice; Begon, Mickaël
2017-01-01
Introduction Teaching acrobatic skills with a minimal amount of repetition is a major challenge for coaches. Biomechanical, statistical or computer simulation tools can help them identify the most determinant factors of performance. Release parameters, change in moment of inertia and segmental momentum transfers were identified in the prediction of acrobatics success. The purpose of the present study was to evaluate the relative contribution of these parameters in performance throughout expertise or optimisation based improvements. The counter movement forward in flight (CMFIF) was chosen for its intrinsic dichotomy between the accessibility of its attempt and complexity of its mastery. Methods Three repetitions of the CMFIF performed by eight novice and eight advanced female gymnasts were recorded using a motion capture system. Optimal aerial techniques that maximise rotation potential at regrasp were also computed. A 14-segment-multibody-model defined through the Rigid Body Dynamics Library was used to compute recorded and optimal kinematics, and biomechanical parameters. A stepwise multiple linear regression was used to determine the relative contribution of these parameters in novice recorded, novice optimised, advanced recorded and advanced optimised trials. Finally, fixed effects of expertise and optimisation were tested through a mixed-effects analysis. Results and discussion Variation in release state only contributed to performances in novice recorded trials. Moment of inertia contribution to performance increased from novice recorded, to novice optimised, advanced recorded, and advanced optimised trials. Contribution to performance of momentum transfer to the trunk during the flight prevailed in all recorded trials. Although optimisation decreased transfer contribution, momentum transfer to the arms appeared. Conclusion Findings suggest that novices should be coached on both contact and aerial technique. Inversely, mainly improved aerial technique helped advanced gymnasts increase their performance. For both, reduction of the moment of inertia should be focused on. The method proposed in this article could be generalized to any aerial skill learning investigation. PMID:28422954
NASA Astrophysics Data System (ADS)
Milic, Vladimir; Kasac, Josip; Novakovic, Branko
2015-10-01
This paper is concerned with L2-gain optimisation of input-affine nonlinear systems controlled by an analytic fuzzy logic system. Unlike conventional fuzzy-based strategies, the non-conventional analytic fuzzy control method does not require an explicit fuzzy rule base. As the first contribution of this paper, we prove, by using the Stone-Weierstrass theorem, that the proposed fuzzy system without a rule base is a universal approximator. The second contribution of this paper is an algorithm for solving a finite-horizon minimax problem for L2-gain optimisation. The proposed algorithm consists of a recursive chain rule for first- and second-order derivatives, Newton's method, the multi-step Adams method and automatic differentiation. Finally, the results of this paper are evaluated on a second-order nonlinear system.
Giri, Anupam; Zelinkova, Zuzana; Wenzl, Thomas
2017-12-01
For the implementation of Regulation (EC) No 2065/2003 related to smoke flavourings used or intended for use in or on foods, a method based on solid-phase microextraction (SPME) GC/MS was developed for the characterisation of liquid smoke products. A statistically based experimental design (DoE) was used for method optimisation. The best general conditions to quantitatively analyse the liquid smoke compounds were obtained with a polydimethylsiloxane/divinylbenzene (PDMS/DVB) fibre, 60°C extraction temperature, 30 min extraction time, 250°C desorption temperature, 180 s desorption time, 15 s agitation time, and 250 rpm agitation speed. Under the optimised conditions, 119 wood pyrolysis products including furan/pyran derivatives, phenols, guaiacol, syringol, benzenediol and their derivatives, cyclic ketones, and several other heterocyclic compounds were identified. The proposed method was repeatable (RSD < 5%) and the calibration functions were linear for all compounds under study. Nine isotopically labelled internal standards were used to improve the quantification of analytes by compensating for matrix effects that might affect headspace equilibrium and the extractability of compounds. The optimised isotope dilution SPME-GC/MS based analytical method proved to be fit for purpose, allowing the rapid identification and quantification of volatile compounds in liquid smoke flavourings.
Koo, B K; O'Connell, P E
2006-04-01
The site-specific land use optimisation methodology, suggested by the authors in the first part of this two-part paper, has been applied to the River Kennet catchment at Marlborough, Wiltshire, UK, for a case study. The Marlborough catchment (143 km²) is an agriculture-dominated rural area over a deep chalk aquifer that is vulnerable to nitrate pollution from agricultural diffuse sources. For evaluation purposes, the catchment was discretised into a network of 1 km × 1 km grid cells. For each of the arable-land grid cells, seven land use alternatives (four arable-land alternatives and three grassland alternatives) were evaluated for their environmental and economic potential. For environmental evaluation, nitrate leaching rates of the land use alternatives were estimated using SHETRAN simulations and groundwater pollution potential was evaluated using the DRASTIC index. For economic evaluation, economic gross margins were estimated using a simple agronomic model based on nitrogen response functions and agricultural land classification grades. In order to see whether the site-specific optimisation is efficient at the catchment scale, land use optimisation was carried out for four optimisation schemes (i.e. using four sets of criterion weights). Consequently, four land use scenarios were generated and the site-specifically optimised land use scenario was evaluated as the best compromise solution between long-term nitrate pollution and agronomy at the catchment scale.
Optimisation of novel method for the extraction of steviosides from Stevia rebaudiana leaves.
Puri, Munish; Sharma, Deepika; Barrow, Colin J; Tiwary, A K
2012-06-01
Stevioside, a diterpene glycoside, is well known for its intense sweetness and is used as a non-caloric sweetener. Its potential widespread use requires an easy and effective extraction method. Enzymatic extraction of stevioside from Stevia rebaudiana leaves with cellulase, pectinase and hemicellulase was optimised over various parameters, such as enzyme concentration, incubation time and temperature. Hemicellulase was observed to give the highest stevioside yield (369.23±0.11 μg) in 1 h in comparison with cellulase (359±0.30 μg) and pectinase (333±0.55 μg). Extraction from leaves under optimised conditions showed a remarkable increase in yield (35 times) compared with a control experiment. The extraction conditions were further optimised using response surface methodology (RSM). A central composite design (CCD) was used for the experimental design and analysis of the results to obtain optimal extraction conditions. Based on the RSM analysis, a temperature of 51-54°C, a time of 36-45 min and a cocktail of pectinase, cellulase and hemicellulase, each at 2%, gave the best results. Under the optimised conditions, the experimental values were in close agreement with the prediction model and resulted in a three-fold enhancement of stevioside yield. The isolated stevioside was characterised by ¹H-NMR spectroscopy, by comparison with a stevioside standard. Copyright © 2011 Elsevier Ltd. All rights reserved.
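As a schematic illustration of the RSM step (fit a second-order model to the designed runs, then locate its optimum), the sketch below uses two coded factors and fabricated yields; none of the numbers correspond to the paper's data, which also involved a third factor (enzyme dose).

import numpy as np

# Coded levels of two factors (e.g. temperature and time) from a small central
# composite design, with hypothetical stevioside yields.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1.41, 0], [1.41, 0], [0, -1.41], [0, 1.41],
              [0, 0], [0, 0], [0, 0]], dtype=float)
y = np.array([300, 340, 320, 355, 310, 345, 315, 335, 360, 362, 358], dtype=float)

def design_matrix(X):
    # Full quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2.
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Locate the maximum of the fitted surface on a coarse grid of coded settings.
g = np.linspace(-1.5, 1.5, 61)
G1, G2 = np.meshgrid(g, g)
grid = np.column_stack([G1.ravel(), G2.ravel()])
pred = design_matrix(grid) @ coef
best = grid[np.argmax(pred)]
print("predicted optimum (coded units):", best, "predicted yield:", pred.max())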
Clayden, Jonathan D; Storkey, Amos J; Muñoz Maniega, Susana; Bastin, Mark E
2009-04-01
This work describes a reproducibility analysis of scalar water diffusion parameters measured within white matter tracts segmented using a probabilistic shape modelling method. In common with previously reported neighbourhood tractography (NT) work, the technique optimises seed point placement for fibre tracking by matching the tracts generated from a number of candidate points against a reference tract, which in the present study is derived from a white matter atlas. No direct constraints are applied to the fibre tracking results. An Expectation-Maximisation algorithm is used to fully automate the procedure and to make dramatically more efficient use of data than earlier NT methods. Within-subject and between-subject variances for fractional anisotropy and mean diffusivity within the tracts are then separated using a random effects model. We find test-retest coefficients of variation (CVs) similar to those reported in another study using landmark-guided single seed points, and subject-to-subject CVs similar to those of a constraint-based multiple-ROI method. We conclude that our approach is at least as effective as other methods for tract segmentation using tractography, whilst also having some additional benefits, such as its provision of a goodness-of-match measure for each segmentation.
Garnon, Julien; Koch, Guillaume; Ramamurthy, Nitin; Caudrelier, Jean; Rao, Pramod; Tsoumakidou, Georgia; Cazzato, Roberto Luigi; Gangi, Afshin
2016-09-01
To review our initial experience with percutaneous CT and fluoroscopy-guided screw fixation of pathological shoulder-girdle fractures. Between May 2014 and June 2015, three consecutive oncologic patients (mean age 65 years; range 57-75 years) with symptomatic pathological shoulder-girdle fractures unsuitable for surgery and radiotherapy underwent percutaneous image-guided screw fixation. Fractures occurred through metastases (n = 2) or a post-ablation cavity (n = 1). Mechanical properties of osteosynthesis were adjudged superior to stand-alone cementoplasty in each case. Cannulated screws were placed under combined CT and fluoroscopic guidance with complementary radiofrequency ablation or cementoplasty to optimise local palliation and secure screw fixation, respectively, in two cases. Follow-up was undertaken every few weeks until mortality or most recent appointment. Four pathological fractures were treated in three patients (2 acromion, 1 clavicular, 1 coracoid). Mean size of associated lesion was 2.6 cm (range 1-4.5 cm). Technical success was achieved in all cases (100 %), without complications. Good palliation and restoration of mobility were observed in two cases at 2-3 months; one case could not be followed due to early post-procedural oncologic mortality. Percutaneous image-guided shoulder-girdle osteosynthesis appears technically feasible with good short-term efficacy in this complex patient subset. Further studies are warranted to confirm these promising initial results.
NASA Astrophysics Data System (ADS)
Fu, Shihua; Li, Haitao; Zhao, Guodong
2018-05-01
This paper investigates the evolutionary dynamics and strategy optimisation for a kind of networked evolutionary game whose strategy updating rule incorporates a 'bankruptcy' mechanism, where a player goes bankrupt after continuously gaining low profits from the game in the preceding rounds. First, by using the semi-tensor product of matrices method, the evolutionary dynamics of this kind of game is expressed as a higher-order logical dynamic system and then converted into its algebraic form, based on which the evolutionary dynamics of the given games can be discussed. Second, the strategy optimisation problem is investigated, and free-type control sequences are designed to maximise the total payoff of the whole game. Finally, an illustrative example is given to show that the new results are effective.
NASA Astrophysics Data System (ADS)
Jia, Zhao-hong; Pei, Ming-li; Leung, Joseph Y.-T.
2017-12-01
In this paper, we investigate the batch-scheduling problem with rejection on parallel machines with non-identical job sizes and arbitrary job rejection weights. If a job is rejected, the corresponding penalty has to be paid. The objective is to minimise the makespan of the processed jobs and the total rejection cost of the rejected jobs. Based on the selected multi-objective optimisation approaches, two problems, P1 and P2, are considered. In P1, the two objectives are linearly combined into a single objective. In P2, the two objectives are simultaneously minimised and the Pareto non-dominated solution set is to be found. Based on ant colony optimisation (ACO), two algorithms, called LACO and PACO, are proposed to address the two problems, respectively. Two different objective-oriented pheromone matrices and corresponding heuristic information are designed. Additionally, a local optimisation algorithm is adopted to improve the solution quality. Finally, simulated experiments are conducted, and the comparative results verify the effectiveness and efficiency of the proposed algorithms, especially on large-scale instances.
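To make the ACO ingredients named above concrete, the sketch below builds job sequences from a pheromone matrix combined with heuristic information, then applies evaporation and a deposit along the best sequence found. It deliberately ignores batching, machine assignment and the rejection objective, and all parameter values are assumptions.

import numpy as np

rng = np.random.default_rng(9)

n_jobs, n_ants, iters = 8, 10, 50
proc = rng.uniform(1.0, 10.0, n_jobs)        # hypothetical processing times
tau = np.ones((n_jobs, n_jobs))              # pheromone on "job i followed by job j"
eta = 1.0 / proc                             # heuristic information: prefer short jobs
alpha, beta, rho = 1.0, 2.0, 0.1

def build_sequence():
    seq = [int(rng.integers(n_jobs))]
    while len(seq) < n_jobs:
        last = seq[-1]
        remaining = [j for j in range(n_jobs) if j not in seq]
        w = np.array([tau[last, j] ** alpha * eta[j] ** beta for j in remaining])
        seq.append(int(rng.choice(remaining, p=w / w.sum())))
    return seq

def total_completion_time(seq):
    # Simple stand-in objective; scheduling short jobs early reduces it.
    return np.cumsum(proc[seq]).sum()

best_seq, best_val = None, np.inf
for _ in range(iters):
    for s in (build_sequence() for _ in range(n_ants)):
        v = total_completion_time(s)
        if v < best_val:
            best_seq, best_val = s, v
    tau *= (1.0 - rho)                        # pheromone evaporation
    for a, b in zip(best_seq[:-1], best_seq[1:]):
        tau[a, b] += 1.0 / best_val           # deposit along the best sequence so far

print("best sequence:", best_seq, "objective:", best_val)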
Optimisation of logistics processes of energy grass collection
NASA Astrophysics Data System (ADS)
Bányai, Tamás.
2010-05-01
The collection of energy grass is a logistics-intensive process [1]. The optimal design and control of the transportation and collection subprocesses is a critical point of the supply chain. To avoid unsound decisions made purely on the basis of experience and intuition, the optimisation and analysis of collection processes based on mathematical models and methods is the scientifically defensible way forward. Within the frame of this work, the author focuses on the optimisation possibilities of the collection processes, especially from the point of view of transportation and related warehousing operations. The optimisation methods developed in the literature [2] take into account the harvesting processes, county-specific yields, transportation distances, erosion constraints, machinery specifications and other key variables, but the possibility of multiple collection points and multi-level collection was not taken into consideration. The possible areas of use of energy grass are very wide (energetic use, biogas and bio-alcohol production, the paper and textile industry, industrial fibre material, foddering purposes, biological soil protection [3], etc.), so not only a single-level but also a multi-level collection system with several collection and production facilities has to be taken into consideration. The input parameters of the optimisation problem are the following: the total amount of energy grass to be harvested in each region; specific facility costs of collection, warehousing and production units; specific costs of transportation resources; pre-scheduling of the harvesting process; specific transportation and warehousing costs; and pre-scheduling of the processing of energy grass at each facility (exclusive warehousing). The model takes into consideration the following assumptions: (1) cooperative relations among processing and production facilities; (2) capacity constraints are not ignored; (3) the cost function of transportation is non-linear; (4) the drivers' working conditions are ignored. The objective function of the optimisation is the maximisation of profit, i.e. the maximisation of the difference between revenue and cost. The objective function trades off the income of the assigned transportation demands against the logistics costs. The constraints are the following: (1) the free capacity of the assigned transportation resource is greater than the requested capacity of the transportation demand; (2) the calculated arrival time of the transportation resource at the harvesting place is not later than its requested arrival time; (3) the calculated arrival time of the transportation demand at the processing and production facility is not later than the requested arrival time; (4) one transportation demand is assigned to one transportation resource and one resource is assigned to one transportation demand. The decision variables of the optimisation problem are the set of scheduling variables and the assignment of resources to transportation demands. The evaluation parameters of the optimised system are the following: total costs of the collection process; utilisation of transportation resources and warehouses; and efficiency of production and/or processing facilities. The multidimensional heuristic optimisation method is based on a genetic algorithm, while the routing sequence is optimised by an ant colony algorithm.
The optimal routes are calculated with the aid of the ant colony algorithm as a subroutine of the global optimisation method, and the optimal assignment is given by the genetic algorithm. One important part of the mathematical method is the sensitivity analysis of the objective function, which shows the influence of the different input parameters. Acknowledgements: This research was implemented within the frame of the project entitled "Development and operation of the Technology and Knowledge Transfer Centre of the University of Miskolc", with support from the European Union and co-funding from the European Social Fund. References [1] P. R. Daniel: The Economics of Harvesting and Transporting Corn Stover for Conversion to Fuel Ethanol: A Case Study for Minnesota. University of Minnesota, Department of Applied Economics, 2006. http://ideas.repec.org/p/ags/umaesp/14213.html [2] T. G. Douglas, J. Brendan, D. Erin & V.-D. Becca: Energy and Chemicals from Native Grasses: Production, Transportation and Processing Technologies Considered in the Northern Great Plains. University of Minnesota, Department of Applied Economics, 2006. http://ideas.repec.org/p/ags/umaesp/13838.html [3] Homepage of energy grass: www.energiafu.hu
Pufulete, Maria; Maishman, Rachel; Dabner, Lucy; Mohiuddin, Syed; Hollingworth, William; Rogers, Chris A; Higgins, Julian; Dayer, Mark; Macleod, John; Purdy, Sarah; McDonagh, Theresa; Nightingale, Angus; Williams, Rachael; Reeves, Barnaby C
2017-01-01
BACKGROUND Heart failure (HF) affects around 500,000 people in the UK. HF medications are frequently underprescribed and B-type natriuretic peptide (BNP)-guided therapy may help to optimise treatment. OBJECTIVE To evaluate the clinical effectiveness and cost-effectiveness of BNP-guided therapy compared with symptom-guided therapy in HF patients. DESIGN Systematic review, cohort study and cost-effectiveness model. SETTING A literature review and usual care in the NHS. PARTICIPANTS (a) HF patients in randomised controlled trials (RCTs) of BNP-guided therapy; and (b) patients having usual care for HF in the NHS. INTERVENTIONS Systematic review: BNP-guided therapy or symptom-guided therapy in primary or secondary care. Cohort study: BNP monitored (≥ 6 months' follow-up and three or more BNP tests and two or more tests per year), BNP tested (≥ 1 tests but not BNP monitored) or never tested. Cost-effectiveness model: BNP-guided therapy in specialist clinics. MAIN OUTCOME MEASURES Mortality, hospital admission (all cause and HF related) and adverse events; and quality-adjusted life-years (QALYs) for the cost-effectiveness model. DATA SOURCES Systematic review: Individual participant or aggregate data from eligible RCTs. Cohort study: The Clinical Practice Research Datalink, Hospital Episode Statistics and National Heart Failure Audit (NHFA). REVIEW METHODS A systematic literature search (five databases, trial registries, grey literature and reference lists of publications) for published and unpublished RCTs. RESULTS Five RCTs contributed individual participant data (IPD) and eight RCTs contributed aggregate data (1536 participants were randomised to BNP-guided therapy and 1538 participants were randomised to symptom-guided therapy). For all-cause mortality, the hazard ratio (HR) for BNP-guided therapy was 0.87 [95% confidence interval (CI) 0.73 to 1.04]. Patients who were aged < 75 years or who had heart failure with a reduced ejection fraction (HFrEF) received the most benefit [interactions (p = 0.03): < 75 years vs. ≥ 75 years: HR 0.70 (95% CI 0.53 to 0.92) vs. 1.07 (95% CI 0.84 to 1.37); HFrEF vs. heart failure with a preserved ejection fraction (HFpEF): HR 0.83 (95% CI 0.68 to 1.01) vs. 1.33 (95% CI 0.83 to 2.11)]. In the cohort study, incident HF patients (1 April 2005-31 March 2013) were never tested (n = 13,632), BNP tested (n = 3392) or BNP monitored (n = 71). Median survival was 5 years; all-cause mortality was 141.5 out of 1000 person-years (95% CI 138.5 to 144.6 person-years). All-cause mortality and hospital admission rate were highest in the BNP-monitored group, and median survival among 130,433 NHFA patients (1 January 2007-1 March 2013) was 2.2 years. The admission rate was 1.1 patients per year (interquartile range 0.5-3.5 patients). In the cost-effectiveness model, in patients aged < 75 years with HFrEF or HFpEF, BNP-guided therapy improves median survival (7.98 vs. 6.46 years) with a small QALY gain (5.68 vs. 5.02) but higher lifetime costs (£64,777 vs. £58,139). BNP-guided therapy is cost-effective at a threshold of £20,000 per QALY. LIMITATIONS The limitations of the trial were a lack of IPD for most RCTs and heterogeneous interventions; the inability to identify BNP monitoring confidently, to determine medication doses or to distinguish between HFrEF and HFpEF; the use of a simplified two-state Markov model; a focus on health service costs and a paucity of data on HFpEF patients aged < 75 years and HFrEF patients aged ≥ 75 years. 
CONCLUSIONS The efficacy of BNP-guided therapy in specialist HF clinics is uncertain. If efficacious, it would be cost-effective for patients aged < 75 years with HFrEF. The evidence reviewed may not apply in the UK because care is delivered differently. FUTURE WORK Identify an optimal BNP-monitoring strategy and how to optimise HF management in accordance with guidelines; update the IPD meta-analysis to include the Guiding Evidence Based Therapy Using Biomarker Intensified Treatment (GUIDE-IT) RCT; collect routine long-term outcome data for completed and ongoing RCTs. TRIAL REGISTRATION Current Controlled Trials ISRCTN37248047 and PROSPERO CRD42013005335. FUNDING This project was funded by the NIHR Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 21, No. 40. See the NIHR Journals Library website for further project information. The British Heart Foundation paid for Chris A Rogers' and Maria Pufulete's time contributing to the study. Syed Mohiuddin's time is supported by the NIHR Collaboration for Leadership in Applied Health Research and Care West at University Hospitals Bristol NHS Foundation Trust. Rachel Maishman contributed to the study when she was in receipt of a NIHR Methodology Research Fellowship. PMID:28774374
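The cost-effectiveness figures above come from a simplified two-state (alive/dead) Markov cohort model. A minimal sketch of that kind of model is given below to show how discounted QALYs, costs and an incremental cost-effectiveness ratio are obtained; all transition probabilities, costs and utilities in the sketch are invented placeholders, not values from the study.

```python
# Minimal two-state (alive/dead) Markov cohort sketch for a BNP-guided vs
# symptom-guided comparison. All inputs are illustrative placeholders, not
# values from the study.

def markov_ce(p_death, cost_per_year, utility, horizon=40, discount=0.035):
    """Return (discounted QALYs, discounted costs) for one strategy."""
    alive = 1.0                     # proportion of the cohort still alive
    qalys = costs = 0.0
    for year in range(horizon):
        d = 1.0 / (1.0 + discount) ** year
        qalys += alive * utility * d
        costs += alive * cost_per_year * d
        alive *= (1.0 - p_death)    # transition: alive -> dead
    return qalys, costs

# Hypothetical inputs (placeholders only).
q_bnp, c_bnp = markov_ce(p_death=0.07, cost_per_year=4200, utility=0.72)
q_usual, c_usual = markov_ce(p_death=0.08, cost_per_year=3800, utility=0.70)

icer = (c_bnp - c_usual) / (q_bnp - q_usual)
print(f"Incremental cost per QALY: £{icer:,.0f} "
      f"({'below' if icer < 20000 else 'above'} the £20,000 threshold)")
```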
The 5C Concept and 5S Principles in Inflammatory Bowel Disease Management
Hibi, Toshifumi; Panaccione, Remo; Katafuchi, Miiko; Yokoyama, Kaoru; Watanabe, Kenji; Matsui, Toshiyuki; Matsumoto, Takayuki; Travis, Simon; Suzuki, Yasuo
2017-01-01
Background and Aims The international Inflammatory Bowel Disease [IBD] Expert Alliance initiative [2012–2015] served as a platform to define and support areas of best practice in IBD management to help improve outcomes for all patients with IBD. Methods During the programme, IBD specialists from around the world established by consensus two best practice charters: the 5S Principles and the 5C Concept. Results The 5S Principles were conceived to provide health care providers with key guidance for improving clinical practice based on best management approaches. They comprise the following categories: Stage the disease; Stratify patients; Set treatment goals; Select appropriate treatment; and Supervise therapy. Optimised management of patients with IBD based on the 5S Principles can be achieved most effectively within an optimised clinical care environment. Guidance on optimising the clinical care setting in IBD management is provided through the 5C Concept, which encompasses: Comprehensive IBD care; Collaboration; Communication; Clinical nurse specialists; and Care pathways. Together, the 5C Concept and 5S Principles provide structured recommendations on organising the clinical care setting and developing best-practice approaches in IBD management. Conclusions Consideration and application of these two dimensions could help health care providers optimise their IBD centres and collaborate more effectively with their multidisciplinary team colleagues and patients, to provide improved IBD care in daily clinical practice. Ultimately, this could lead to improved outcomes for patients with IBD. PMID:28981622
Collaborative development for setup, execution, sharing and analytics of complex NMR experiments.
Irvine, Alistair G; Slynko, Vadim; Nikolaev, Yaroslav; Senthamarai, Russell R P; Pervushin, Konstantin
2014-02-01
Factory settings of NMR pulse sequences are rarely ideal for every scenario in which they are utilised. The optimisation of NMR experiments has for many years been performed locally, with implementations often specific to an individual spectrometer. Furthermore, these optimised experiments are normally retained solely for the use of an individual laboratory, spectrometer or even single user. Here we introduce a web-based service that provides a database for the deposition, annotation and optimisation of NMR experiments. The application uses a Wiki environment to enable the collaborative development of pulse sequences. It also provides a flexible mechanism to automatically generate NMR experiments from deposited sequences. Multidimensional NMR experiments of proteins and other macromolecules consume significant resources, in terms of both spectrometer time and the effort required to analyse the results. Systematic analysis of simulated experiments can enable optimal allocation of NMR resources for structural analysis of proteins. Our web-based application (http://nmrplus.org) provides all the necessary information, including the auxiliaries (waveforms, decoupling sequences, etc.), for the analysis of experiments by accurate numerical simulation of multidimensional NMR experiments. The online database of NMR experiments, together with a systematic evaluation of their sensitivity, provides a framework for selection of the most efficient pulse sequences. The development of such a framework provides a basis for the collaborative optimisation of pulse sequences by the NMR community, with the benefits of this collective effort being available to the whole community. Copyright © 2013 Elsevier Inc. All rights reserved.
Satellite Vibration Testing: Angle optimisation method to Reduce Overtesting
NASA Astrophysics Data System (ADS)
Knight, Charly; Remedia, Marcello; Aglietti, Guglielmo S.; Richardson, Guy
2018-06-01
Spacecraft overtesting is a long-running problem, and the main focus of most attempts to reduce it has been to adjust the base vibration input (i.e. notching). Instead, this paper examines testing alternatives for secondary structures (equipment) coupled to the main structure (satellite) when they are tested separately. Even if the vibration source is applied along one of the orthogonal axes at the base of the coupled system (satellite plus equipment), the dynamics of the system and potentially the interface configuration mean that the vibration at the interface may not occur along a single axis, much less along the corresponding orthogonal axis of the base excitation. This paper proposes an alternative testing methodology in which the testing of a piece of equipment occurs at an offset angle. This Angle Optimisation method may involve multiple tests, each with an altered input direction, allowing the best match between all specified equipment responses and the coupled-system tests. An optimisation process compares the calculated equipment RMS values for a range of inputs with the maximum coupled-system RMS values and is used to find the optimal testing configuration for the given parameters. A case study was performed to find the best testing angles to match the acceleration responses of the centre of mass and the sum of interface forces for all three axes, as well as the von Mises stress for an element near a fastening point. The angle optimisation method resulted in RMS values and PSD responses that were much closer to those of the coupled system when compared with traditional testing, and the optimum testing configuration gave an overall average error significantly smaller than the traditional method. Crucially, this case study shows that the optimum test campaign could be a single equipment-level test as opposed to the traditional three orthogonal-direction tests.
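The Angle Optimisation step reduces to searching over the input direction for the test that best reproduces a set of coupled-system RMS targets. The sketch below grid-searches a single offset angle for a hypothetical two-response system; the random frequency-response data, the target RMS values and the relative-error metric are all invented stand-ins for the finite-element quantities used in the paper.

```python
import numpy as np

# Hypothetical equipment frequency-response data: maps a unit base input along
# a direction [cos(a), sin(a)] to two monitored responses at each frequency.
rng = np.random.default_rng(0)
freqs = np.linspace(5, 100, 200)
H = rng.normal(size=(len(freqs), 2, 2))          # stand-in FRF data

# Target RMS values taken (in the real method) from the coupled-system analysis.
target_rms = np.array([3.2, 1.7])                # invented targets

def equipment_rms(angle_deg):
    """RMS of each monitored response for a unit input at the given angle."""
    u = np.array([np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))])
    responses = H @ u                             # (n_freq, 2) response spectra
    return np.sqrt(np.mean(responses ** 2, axis=0))

def error(angle_deg):
    """Average relative mismatch between equipment and coupled-system RMS."""
    return np.mean(np.abs(equipment_rms(angle_deg) - target_rms) / target_rms)

angles = np.arange(0.0, 90.0, 0.5)
best = min(angles, key=error)
print(f"best test angle ≈ {best:.1f} deg, mean relative error {error(best):.3f}")
```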
Using Optimisation Techniques to Granulise Rough Set Partitions
NASA Astrophysics Data System (ADS)
Crossingham, Bodie; Marwala, Tshilidzi
2007-11-01
This paper presents an approach to optimise rough set partition sizes using various optimisation techniques. Three optimisation techniques are implemented to perform the granularisation process, namely genetic algorithm (GA), hill climbing (HC) and simulated annealing (SA). These optimisation methods maximise the classification accuracy of the rough sets. The proposed rough set partition method is tested on a set of demographic properties of individuals obtained from the South African antenatal survey. The three techniques are compared in terms of their computational time, accuracy and number of rules produced when applied to the Human Immunodeficiency Virus (HIV) data set. The results of the optimised methods are compared to a well-known non-optimised discretisation method, equal-width-bin partitioning (EWB). The accuracies achieved after optimising the partitions using GA, HC and SA are 66.89%, 65.84% and 65.48% respectively, compared to the EWB accuracy of 59.86%. In addition to providing the plausibilities of the estimated HIV status, the rough sets also provide the linguistic rules describing how the demographic parameters drive the risk of HIV.
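Of the three techniques compared above, simulated annealing is the simplest to sketch: partition cut points are perturbed and accepted with the Metropolis rule, with classification accuracy as the objective. In the illustration below the data are synthetic and a decision tree on the discretised attributes stands in for rough-set rule induction, so the accuracy figures are not comparable with those reported in the paper.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.random((300, 4))                         # toy continuous attributes
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)        # toy class labels

def discretise(X, cuts):
    """Map each attribute to a bin index using per-attribute cut points."""
    return np.stack([np.digitize(X[:, j], np.sort(c)) for j, c in enumerate(cuts)],
                    axis=1)

def accuracy(cuts):
    """Stand-in for rough-set rule accuracy: cross-validated accuracy on the
    discretised attributes."""
    return cross_val_score(DecisionTreeClassifier(max_depth=3),
                           discretise(X, cuts), y, cv=5).mean()

# Simulated annealing over the cut points (three cuts per attribute).
current = [rng.random(3) for _ in range(X.shape[1])]
cur_acc = accuracy(current)
best_acc, T = cur_acc, 0.05
for step in range(200):
    cand = [np.clip(c + rng.normal(0, 0.05, c.shape), 0, 1) for c in current]
    acc = accuracy(cand)
    if acc >= cur_acc or rng.random() < np.exp((acc - cur_acc) / T):
        current, cur_acc = cand, acc             # Metropolis acceptance rule
        best_acc = max(best_acc, cur_acc)
    T *= 0.98                                    # geometric cooling schedule
print(f"best partition accuracy found: {best_acc:.3f}")
```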
A New Computational Technique for the Generation of Optimised Aircraft Trajectories
NASA Astrophysics Data System (ADS)
Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto
2017-12-01
A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for a global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented to allow an efficient solution of problems in which two or more performance indices are to be minimised simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two simultaneous objectives. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
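The adaptive bisection scheme itself is specific to the paper, but the underlying ɛ-constraint reformulation (minimise one objective subject to a bound on the other, then sweep the bound to trace the Pareto front) can be illustrated on a toy bi-objective problem. The objective functions, starting point and SciPy solver below are placeholders rather than the aircraft dynamics models and pseudospectral transcription used in the study.

```python
import numpy as np
from scipy.optimize import minimize

# Toy bi-objective problem standing in for (e.g.) fuel burn vs flight time.
f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2
f2 = lambda x: x[0] ** 2 + (x[1] - 2.0) ** 2

def eps_constraint(eps):
    """Minimise f1 subject to f2(x) <= eps (the epsilon-constraint subproblem)."""
    cons = [{"type": "ineq", "fun": lambda x: eps - f2(x)}]
    res = minimize(f1, x0=[0.0, 0.0], method="SLSQP", constraints=cons)
    return res.x, f1(res.x), f2(res.x)

# Sweep epsilon between the two individual optima to trace the Pareto front.
for eps in np.linspace(0.5, 5.0, 6):
    x, v1, v2 = eps_constraint(eps)
    print(f"eps={eps:4.1f}  f1={v1:.3f}  f2={v2:.3f}  x={np.round(x, 3)}")
```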
Chatzistergos, Panagiotis E; Naemi, Roozbeh; Healy, Aoife; Gerth, Peter; Chockalingam, Nachiappan
2017-08-01
Current selection of cushioning materials for therapeutic footwear and orthoses is based on empirical and anecdotal evidence. The aim of this investigation is to assess the biomechanical properties of carefully selected cushioning materials and to establish the basis for patient-specific material optimisation. For this purpose, bespoke cushioning materials with qualitatively similar mechanical behaviour but different stiffness were produced. Healthy volunteers were asked to stand and walk on materials with varying stiffness and their capacity for pressure reduction was assessed. Mechanical testing using a surrogate heel model was employed to investigate the effect of loading on optimum stiffness. Results indicated that optimising the stiffness of cushioning materials improved pressure reduction during standing and walking by at least 16 and 19% respectively. Moreover, the optimum stiffness was strongly correlated to body mass (BM) and body mass index (BMI), with stiffer materials needed in the case of people with higher BM or BMI. Mechanical testing confirmed that optimum stiffness increases with the magnitude of compressive loading. For the first time, this study provides quantitative data to support the importance of stiffness optimisation in cushioning materials and sets the basis for methods to inform optimum material selection in the clinic.
Fuss, Franz Konstantin
2013-01-01
Standard methods for computing the fractal dimensions of time series are usually tested with continuous nowhere differentiable functions, but not benchmarked with actual signals. Therefore they can produce opposite results in extreme signals. These methods also use different scaling methods, that is, different amplitude multipliers, which makes it difficult to compare fractal dimensions obtained from different methods. The purpose of this research was to develop an optimisation method that computes the fractal dimension of a normalised (dimensionless) and modified time series signal with a robust algorithm and a running average method, and that maximises the difference between two fractal dimensions, for example, a minimum and a maximum one. The signal is modified by transforming its amplitude by a multiplier, which has a non-linear effect on the signal's time derivative. The optimisation method identifies the optimal multiplier of the normalised amplitude for targeted decision making based on fractal dimensions. The optimisation method provides an additional filter effect and makes the fractal dimensions less noisy. The method is exemplified by, and explained with, different signals, such as human movement, EEG, and acoustic signals. PMID:24151522
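The key idea is that the amplitude multiplier changes the estimated fractal dimension in a non-linear way, so it can be tuned to maximise the separation between two signals. The sketch below uses the Katz estimator (which, unlike Higuchi's, is sensitive to the amplitude scale) as a stand-in for the paper's robust running-average algorithm; the two synthetic signals and the multiplier grid are illustrative.

```python
import numpy as np

def katz_fd(y):
    """Katz fractal dimension of a time series (unit time step assumed).
    This estimator depends on the amplitude scale, which is exactly the
    effect the multiplier search exploits."""
    pts = np.column_stack([np.arange(len(y)), y])
    steps = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    L = steps.sum()                                   # total curve length
    d = np.linalg.norm(pts - pts[0], axis=1).max()    # extent from first point
    n = len(steps)
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

def normalise(y):
    return (y - y.min()) / (y.max() - y.min())        # dimensionless amplitude

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 2000)
smooth = np.sin(2 * np.pi * 5 * t)                    # low-complexity signal
noisy = smooth + 0.5 * rng.standard_normal(t.size)    # high-complexity signal

# Search the amplitude multiplier that maximises the FD difference between
# the two signals, mirroring the "targeted decision making" idea.
multipliers = np.logspace(-2, 3, 51)
diffs = [katz_fd(a * normalise(noisy)) - katz_fd(a * normalise(smooth))
         for a in multipliers]
a_opt = multipliers[int(np.argmax(diffs))]
print(f"optimal multiplier ≈ {a_opt:.3g}, FD difference = {max(diffs):.3f}")
```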
A new effective operator for the hybrid algorithm for solving global optimisation problems
NASA Astrophysics Data System (ADS)
Duc, Le Anh; Li, Kenli; Nguyen, Tien Trong; Yen, Vu Minh; Truong, Tung Khac
2018-04-01
Hybrid algorithms have been recently used to solve complex single-objective optimisation problems. The ultimate goal is to find an optimised global solution by using these algorithms. Based on the existing algorithms (HP_CRO, PSO, RCCRO), this study proposes a new hybrid algorithm called MPC (Mean-PSO-CRO), which utilises a new Mean-Search Operator. By employing this new operator, the proposed algorithm improves the search ability on areas of the solution space that the other operators of previous algorithms do not explore. Specifically, the Mean-Search Operator helps find the better solutions in comparison with other algorithms. Moreover, the authors have proposed two parameters for balancing local and global search and between various types of local search, as well. In addition, three versions of this operator, which use different constraints, are introduced. The experimental results on 23 benchmark functions, which are used in previous works, show that our framework can find better optimal or close-to-optimal solutions with faster convergence speed for most of the benchmark functions, especially the high-dimensional functions. Thus, the proposed algorithm is more effective in solving single-objective optimisation problems than the other existing algorithms.
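The abstract does not define the Mean-Search Operator in detail, so the sketch below only illustrates the general pattern of a mean-based operator inside a population search: a candidate is sampled around the mean of a few good solutions and replaces the worst member if it improves. The benchmark function, population size and step schedule are arbitrary choices, not those of MPC.

```python
import numpy as np

rng = np.random.default_rng(3)

def sphere(x):                          # toy single-objective benchmark
    return float(np.sum(x ** 2))

dim, pop_size = 10, 30
pop = rng.uniform(-5, 5, (pop_size, dim))
fit = np.array([sphere(x) for x in pop])

for it in range(300):
    # Mean-based operator (illustrative): sample around the mean of a few
    # randomly chosen good solutions to explore the region between them.
    elite = pop[np.argsort(fit)[:5]]
    subset = elite[rng.choice(len(elite), size=3, replace=False)]
    centre = subset.mean(axis=0)
    cand = centre + rng.normal(0, 0.5, dim) * (1.0 - it / 300)  # shrinking step
    f_cand = sphere(cand)
    worst = int(np.argmax(fit))
    if f_cand < fit[worst]:             # replace the worst member if improved
        pop[worst], fit[worst] = cand, f_cand

print(f"best value after mean-search iterations: {fit.min():.4g}")
```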
Optimisation of SIW bandpass filter with wide and sharp stopband using space mapping
NASA Astrophysics Data System (ADS)
Xu, Juan; Bi, Jun Jian; Li, Zhao Long; Chen, Ru shan
2016-12-01
This work presents a substrate integrated waveguide (SIW) bandpass filter with a wide and sharp stopband, which is different from filters with a direct input/output coupling structure. Higher modes in the SIW cavities are used to generate the finite transmission zeros for improved stopband performance. The design of SIW filters requires full-wave electromagnetic simulation and extensive optimisation. If a full-wave solver is used for optimisation, the design process is very time consuming. The space mapping (SM) approach has been called upon to alleviate this problem. In this case, the coarse model is optimised using an equivalent circuit model-based representation of the structure for fast computations. On the other hand, the verification of the design is completed with an accurate fine-model full-wave simulation. A fourth-order filter with a passband of 12.0-12.5 GHz is fabricated on a single-layer Rogers RT/Duroid 5880 substrate. The return loss is better than 17.4 dB in the passband and the rejection is more than 40 dB in the stopband. The stopband extends from 2 to 11 GHz and from 13.5 to 17.3 GHz, demonstrating wide and sharp stopband performance.
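The space-mapping loop can be sketched on a toy problem: optimise the cheap coarse model, evaluate the expensive fine model at that point, extract an input mapping that aligns the coarse response with the fine one, and map the coarse optimum back. The "responses" below are simple analytic functions standing in for the equivalent-circuit and full-wave simulations; none of the filter-specific details are reproduced.

```python
import numpy as np
from scipy.optimize import minimize

freqs = np.linspace(0.0, 1.0, 21)

def coarse_resp(x):
    """Cheap 'equivalent circuit' response over frequency (toy stand-in)."""
    return x[0] * np.sin(2 * np.pi * freqs) + x[1] * freqs ** 2

true_shift = np.array([0.3, -0.2])
def fine_resp(x):
    """Expensive 'full-wave' response: coarse model with a hidden misalignment."""
    return coarse_resp(x + true_shift)

target = fine_resp(np.array([1.0, 2.0]))          # design goal (toy target)

def coarse_cost(x):
    return np.sum((coarse_resp(x) - target) ** 2)

# Step 1: optimise the cheap coarse model once.
x_star = minimize(coarse_cost, x0=np.zeros(2), method="Nelder-Mead").x

# Step 2: input space mapping, x_c = x + p, refined over a few fine evaluations.
x = x_star.copy()
for k in range(4):
    f = fine_resp(x)                              # one expensive simulation
    # Parameter extraction: align the coarse response with the fine response at x.
    extract = lambda p: np.sum((coarse_resp(x + p) - f) ** 2)
    p = minimize(extract, x0=np.zeros(2), method="Nelder-Mead").x
    x = x_star - p                                # map the coarse optimum back
    print(f"iter {k}: x = {np.round(x, 4)}, "
          f"fine cost = {np.sum((fine_resp(x) - target) ** 2):.2e}")
```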
NASA Astrophysics Data System (ADS)
Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.
2016-06-01
The growing interest in research on technicians' workloads is probably associated with the recent surge in competition, prompted by unprecedented technological development that has triggered changes in customer tastes and preferences for industrial goods. In a quest for business improvement, this intense worldwide competition has stimulated theories and practical frameworks that seek to optimise performance in workplaces. In line with this drive, the present paper proposes an optimisation model that considers technicians' reliability as a complement to factory information on technicians' productivity and earned values, within a multi-objective modelling approach. Since technicians are expected to carry out both routine and stochastic maintenance work, these workloads are treated as constraints. The influence of training, fatigue and experiential knowledge of technicians on workload management was also considered. These workloads were combined with maintenance policy in optimising reliability, productivity and earned values using the goal programming approach. Practical datasets were used to study the applicability of the proposed model in practice. The model was able to generate information that practising maintenance engineers can apply in making more informed decisions on the management of technicians.
NASA Astrophysics Data System (ADS)
van Haveren, Rens; Ogryczak, Włodzimierz; Verduijn, Gerda M.; Keijzer, Marleen; Heijmen, Ben J. M.; Breedveld, Sebastiaan
2017-06-01
Previously, we have proposed Erasmus-iCycle, an algorithm for fully automated IMRT plan generation based on prioritised (lexicographic) multi-objective optimisation with the 2-phase ɛ-constraint (2pɛc) method. For each patient, the output of Erasmus-iCycle is a clinically favourable, Pareto optimal plan. The 2pɛc method uses a list of objective functions that are consecutively optimised, following a strict, user-defined prioritisation. The novel lexicographic reference point method (LRPM) is capable of solving multi-objective problems in a single optimisation, using a fuzzy prioritisation of the objectives. Trade-offs are made globally, aiming for large favourable gains for lower prioritised objectives at the cost of only slight degradations for higher prioritised objectives, or vice versa. In this study, the LRPM is validated for 15 head and neck cancer patients receiving bilateral neck irradiation. The generated plans using the LRPM are compared with the plans resulting from the 2pɛc method. Both methods were capable of automatically generating clinically relevant treatment plans for all patients. For some patients, the LRPM allowed large favourable gains in some treatment plan objectives at the cost of only small degradations for the others. Moreover, because of the applied single optimisation instead of multiple optimisations, the LRPM reduced the average computation time from 209.2 to 9.5 min, a speed-up factor of 22 relative to the 2pɛc method.
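A minimal sketch of the prioritised (lexicographic) idea behind the 2pɛc method is given below: the highest-priority objective is optimised first, and each lower-priority objective is then optimised subject to the previous ones staying within a small slack of their best values. The toy objectives, slack value and SciPy solver are placeholders for the treatment-planning objectives and solver used in Erasmus-iCycle.

```python
import numpy as np
from scipy.optimize import minimize

# Toy prioritised objectives (highest priority first).
objectives = [
    lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2,      # e.g. target-coverage surrogate
    lambda x: x[0] ** 2 + (x[1] - 2.0) ** 2,      # e.g. organ-at-risk dose surrogate
]
slack = 0.05                                       # allowed relative degradation

x0, cons = np.zeros(2), []
for i, obj in enumerate(objectives):
    res = minimize(obj, x0, method="SLSQP", constraints=cons)
    x0, best = res.x, obj(res.x)
    # Freeze this objective (up to a slack) before optimising lower priorities.
    bound = best * (1.0 + slack) + 1e-6
    cons = cons + [{"type": "ineq",
                    "fun": lambda x, o=obj, b=bound: b - o(x)}]
    print(f"priority {i + 1}: value = {best:.4f} at x = {np.round(res.x, 3)}")
```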
Optimisation of solar synoptic observations
NASA Astrophysics Data System (ADS)
Klvaña, Miroslav; Sobotka, Michal; Švanda, Michal
2012-09-01
The development of instrumental and computer technologies is accompanied by steadily increasing needs for archiving large data volumes. The current trend to meet this requirement relies on data compression and the growth of storage capacities. This approach, however, has technical and practical limits. A further reduction of the archived data volume can be achieved by optimising the archiving so that data are selected without losing useful information. We describe a method of optimised archiving of solar images, based on the selection of images that contain new information. The new information content is evaluated by analysing the changes detected in the images. We present characteristics of different kinds of image changes and divide them into fictitious changes, which have a disturbing effect, and real changes, which provide new information. Using block diagrams describing the selection and archiving, we demonstrate the influence of clouds, the recording of images during an active event on the Sun (including a period before the event onset), and the archiving of the long-term history of solar activity. The described optimisation technique is not suitable for helioseismology, because it does not preserve a uniform time step in the archived sequence and removes the information about solar oscillations. For long-term synoptic observations, the optimised archiving can save a large amount of storage capacity. The actual saving will depend on the setting of the change-detection sensitivity and on the capability to exclude fictitious changes.
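The selection rule can be reduced to: archive a frame only when a change metric relative to the last archived frame exceeds a threshold chosen so that fictitious changes are ignored. The sketch below uses a mean-absolute-difference metric on synthetic frames; the actual change detector and sensitivity settings in the paper are more elaborate.

```python
import numpy as np

rng = np.random.default_rng(4)

def changed(frame, reference, threshold=0.05):
    """Archive only if the mean absolute difference exceeds the threshold,
    so small fictitious changes (e.g. noise) are skipped."""
    return np.mean(np.abs(frame - reference)) > threshold

# Synthetic observation sequence: mostly static, with one "event" frame.
frames = [np.full((64, 64), 0.5) + 0.01 * rng.standard_normal((64, 64))
          for _ in range(10)]
frames[6] = frames[6] + 0.3                      # a real change (e.g. a flare)

archive = [frames[0]]                            # always keep the first image
for frame in frames[1:]:
    if changed(frame, archive[-1]):
        archive.append(frame)

print(f"kept {len(archive)} of {len(frames)} frames")
```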
ATLAS software configuration and build tool optimisation
NASA Astrophysics Data System (ADS)
Rybkin, Grigory; Atlas Collaboration
2014-06-01
The ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuration and building of this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of the CMT commands used for the build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches as many parallel build commands as there are processors. The other focus was on the optimisation of CMT commands in general, which made them approximately two times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation reduced software build time and environment setup time significantly (by several times), increased the efficiency of multi-core computing resource utilisation, and considerably improved the software developer and user experience.
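Package-level build parallelism of the kind described above can be sketched as a level-by-level scheduler: packages whose dependencies are already built are launched together. The dependency graph and the build function below are invented for illustration; this is not CMT's implementation.

```python
from concurrent.futures import ThreadPoolExecutor
import time

# Hypothetical package dependency graph (package -> packages it depends on).
deps = {
    "Core": [], "Utils": [], "Event": ["Core"],
    "Tracking": ["Core", "Utils"], "Analysis": ["Event", "Tracking"],
}

def build(pkg):
    time.sleep(0.1)                 # stand-in for compiling one package
    return pkg

built = set()
while len(built) < len(deps):
    # Every package whose dependencies are all built can be compiled in parallel.
    ready = [p for p in deps if p not in built and all(d in built for d in deps[p])]
    assert ready, "dependency cycle detected"
    with ThreadPoolExecutor() as pool:
        for pkg in pool.map(build, ready):
            built.add(pkg)
            print(f"built {pkg}")
```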
Cejudo-Bastante, María Jesús; Rodríguez Dodero, M Carmen; Durán Guerrero, Enrique; Castro Mejías, Remedios; Natera Marín, Ramón; García Barroso, Carmelo
2013-03-15
Despite the long history of sherry wine vinegar, new alternatives for its consumption are being developed with the aim of diversifying its market. Several new acetic-based fruit juices have been developed by optimising the amount of sherry wine vinegar added to different fruit juices: apple, peach, orange and pineapple. Once the concentrations of wine vinegar were optimised by an expert panel, the new acetic fruit juices were tasted by 86 consumers. Three different aspects were taken into account: habits of consumption of vinegar and fruit juices, gender and age. Based on the sensory analysis, 50 g kg(-1) was the optimal and preferred amount of wine vinegar to add to the apple, orange and peach juices, whereas 10 g kg(-1) was preferred for the pineapple juice. Based on the olfactory and gustatory impressions and 'purchase intent', the acetic beverages made from peach and pineapple juices were the most appreciated, followed by apple juice, while those obtained from orange juice were the least preferred by consumers. New opportunities for diversification of the oenological market could arise from this type of new product, which can easily be developed by any vinegar or fruit juice producer. © 2012 Society of Chemical Industry.
Tantalum pentoxide waveguides and microresonators for VECSEL based frequency combs
NASA Astrophysics Data System (ADS)
Chen Sverre, T.; Woods, J. R. C.; Shaw, E. A.; Hua, Ping; Apostolopoulos, V.; Wilkinson, J. S.; Tropper, A. C.
2018-02-01
Tantalum pentoxide (Ta2O5) is a promising material for mass-producible, multi-functional, integrated photonics circuits on silicon, exhibiting robust electrical, mechanical and thermal properties, as well as good CMOS compatibility. In addition, Ta2O5 has been reported to demonstrate a non-linear response comparable to that of chalcogenide glass, in the region of 3-6 times larger than that of materials such as silica (SiO2) or silicon nitride (Si3N4). In contrast to Si-based dielectrics, it will accept trivalent ytterbium and erbium dopant ions, opening the possibility of on-chip amplification. The high refractive index of Ta2O5 is consistent with a small guided-mode cross-sectional area, and allows the construction of micro-ring resonators. Propagation losses as low as 0.2 dB/cm have been reported. In this paper we describe the design of planar Ta2O5 waveguides optimised for the generation of a coherent continuum with near-infrared pulse trains at kW peak powers. The Pulse Repetition Frequency (PRF) of the VECSEL can be tuned to a sub-harmonic of the planar micro-ring and the optical pump power applied to the VECSEL can be adjusted so that mode-matching of the VECSEL pulse train with the micro-ring resonator can be achieved. We shall describe the fabrication of Ta2O5 guiding structures, and the characterisation of their nonlinear and other optical properties. Characterisation with conventional lasers will be used to assess the degree of coherent spectral broadening likely to be achievable using these devices when driven by mode-locked VECSELs operating near the current state of the art for pulse energy and duration.
The AOLI Non-Linear Curvature Wavefront Sensor: High sensitivity reconstruction for low-order AO
NASA Astrophysics Data System (ADS)
Crass, Jonathan; King, David; Mackay, Craig
2013-12-01
Many adaptive optics (AO) systems in use today require bright reference objects to determine the effects of atmospheric distortions on incoming wavefronts. This requirement arises because Shack-Hartmann wavefront sensors (SHWFS) distribute the incoming light from reference objects across a large number of sub-apertures. Bright natural reference objects occur infrequently across the sky, leading to the use of laser guide stars, which add complexity to wavefront measurement systems. The non-linear curvature wavefront sensor as described by Guyon et al. has been shown to offer a significant increase in sensitivity when compared to a SHWFS. This facilitates much greater sky coverage using natural guide stars alone. This paper describes the current status of the non-linear curvature wavefront sensor being developed as part of an adaptive optics system for the Adaptive Optics Lucky Imager (AOLI) project. The sensor comprises two photon-counting EMCCD detectors from E2V Technologies, recording intensity at four near-pupil planes. These images are used with a reconstruction algorithm to determine the phase correction to be applied by an ALPAO 241-element deformable mirror. The overall system is intended to provide low-order correction for a Lucky Imaging-based multi-CCD imaging camera. We present the current optical design of the instrument, including methods to minimise inherent optical effects, principally chromaticity. Wavefront reconstruction methods are discussed and strategies for their optimisation to run at the required real-time speeds are introduced. Finally, we discuss laboratory work with a demonstrator setup of the system.
[Antidotes: use guidelines and minimum stock in an emergency department].
García-Martín, A; Torres Santos-Olmos, R
2012-01-01
To develop a guide for antidotes and other medications used to counteract poisoning, and define the stock in an emergency department, as a safety priority for the part-time pharmacist assigned to the unit. A search of specialist databases and web portals of the Spanish Society of Toxicology and the British National Poisons Information Service, as well as toxicology databases, TOXICONET, information from other hospitals, tertiary sources, Micromedex and Medline. The Guide contains 42 active ingredients and is accessible to the Pharmacy and Emergency departments in electronic format. A minimum emergency stock was agreed based on the daily treatment of a 100 kg patient. This information, including updated expiry dates, is available at the emergency department antidote stock facilities and in electronic format. On a monthly basis, the pharmacist reviews the need to replace any drugs, due to their expiry date or lack of use. The lack of evidence from high-quality antidote studies, the variability due to the difficulties of updating sources and some geographical differences in their use mean that decision-making can be difficult. It would be useful to have minimum quantity recommendations from societies of toxicology, regulatory agencies and organisations such as the Joint Commission on the Accreditation of Healthcare Organisations. It would also be useful to have a suprahospital risk assessment to optimise management and ensure the availability of antidotes which are expensive, have a limited shelf life, or for which demand is difficult to forecast. Copyright © 2011 SEFH. Published by Elsevier España. All rights reserved.
Photonic simulation of entanglement growth and engineering after a spin chain quench.
Pitsios, Ioannis; Banchi, Leonardo; Rab, Adil S; Bentivegna, Marco; Caprara, Debora; Crespi, Andrea; Spagnolo, Nicolò; Bose, Sougato; Mataloni, Paolo; Osellame, Roberto; Sciarrino, Fabio
2017-11-17
The time evolution of quantum many-body systems is one of the most important processes for benchmarking quantum simulators. The most curious feature of such dynamics is the growth of quantum entanglement to an amount proportional to the system size (volume law) even when interactions are local. This phenomenon has great ramifications for fundamental aspects, while its optimisation clearly has an impact on technology (e.g., for on-chip quantum networking). Here we use an integrated photonic chip with a circuit-based approach to simulate the dynamics of a spin chain and maximise the entanglement generation. The resulting entanglement is certified by constructing a second chip, which measures the entanglement between multiple distant pairs of simulated spins, as well as the block entanglement entropy. This is the first photonic simulation and optimisation of the extensive growth of entanglement in a spin chain, and opens up the use of photonic circuits for optimising quantum devices.
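A numerical counterpart of the simulated physics is easy to reproduce for a small chain: evolve a Néel product state under a nearest-neighbour XX Hamiltonian and track the half-chain von Neumann entropy, which grows after the quench. The chain length, couplings and initial state below are generic choices, not the parameters implemented on the photonic chip.

```python
import numpy as np
from scipy.linalg import expm

N = 8                                             # number of spins (small, exact)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])

def op(single, site):
    """Embed a single-site operator at `site` in the N-spin Hilbert space."""
    mats = [np.eye(2, dtype=complex)] * N
    mats[site] = single
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# Nearest-neighbour XX Hamiltonian on an open chain.
H = sum(op(sx, i) @ op(sx, i + 1) + op(sy, i) @ op(sy, i + 1) for i in range(N - 1))

# Quench from the Néel product state |0101...>.
psi = np.zeros(2 ** N, dtype=complex)
psi[int("01" * (N // 2), 2)] = 1.0

def half_chain_entropy(state):
    """Von Neumann entropy of the left half of the chain (via Schmidt values)."""
    m = state.reshape(2 ** (N // 2), 2 ** (N // 2))
    s = np.linalg.svd(m, compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]
    return float(-(p * np.log(p)).sum())

for t in np.linspace(0.0, 2.0, 5):
    psi_t = expm(-1j * H * t) @ psi
    print(f"t = {t:.1f}  S_half = {half_chain_entropy(psi_t):.3f}")
```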
A support vector machine approach for classification of welding defects from ultrasonic signals
NASA Astrophysics Data System (ADS)
Chen, Yuan; Ma, Hong-Wei; Zhang, Guang-Ming
2014-07-01
Defect classification is an important issue in ultrasonic non-destructive evaluation. A layered multi-class support vector machine (LMSVM) classification system, which combines multiple SVM classifiers through a layered architecture, is proposed in this paper. The proposed LMSVM classification system is applied to the classification of welding defects from ultrasonic test signals. The measured ultrasonic defect echo signals are first decomposed into wavelet coefficients by the wavelet packet transform. The energy of the wavelet coefficients at different frequency channels are used to construct the feature vectors. The bees algorithm (BA) is then used for feature selection and SVM parameter optimisation for the LMSVM classification system. The BA-based feature selection optimises the energy feature vectors. The optimised feature vectors are input to the LMSVM classification system for training and testing. Experimental results of classifying welding defects demonstrate that the proposed technique is highly robust, precise and reliable for ultrasonic defect classification.
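The feature pipeline described above (wavelet-packet energies feeding an SVM) can be sketched with PyWavelets and scikit-learn. The synthetic "echo" signals, wavelet choice, decomposition level and SVM settings below are illustrative, and the bees-algorithm feature selection and the layered multi-class architecture are omitted.

```python
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)

def wp_energy_features(signal, wavelet="db4", level=3):
    """Energy of wavelet-packet coefficients in each terminal frequency band."""
    wp = pywt.WaveletPacket(signal, wavelet=wavelet, maxlevel=level)
    return np.array([np.sum(node.data ** 2) for node in wp.get_level(level, "freq")])

# Synthetic stand-ins for ultrasonic echoes from two defect classes.
def echo(f):
    t = np.linspace(0, 1, 512)
    return np.sin(2 * np.pi * f * t) * np.exp(-5 * t) + 0.2 * rng.standard_normal(t.size)

signals = [echo(20) for _ in range(40)] + [echo(60) for _ in range(40)]
labels = np.array([0] * 40 + [1] * 40)
X = np.array([wp_energy_features(s) for s in signals])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```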
Use of a genetic algorithm to improve the rail profile on Stockholm underground
NASA Astrophysics Data System (ADS)
Persson, Ingemar; Nilsson, Rickard; Bik, Ulf; Lundgren, Magnus; Iwnicki, Simon
2010-12-01
In this paper, a genetic algorithm optimisation method has been used to develop an improved rail profile for Stockholm underground. An inverted penalty index based on a number of key performance parameters was generated as a fitness function and vehicle dynamics simulations were carried out with the multibody simulation package Gensys. The effectiveness of each profile produced by the genetic algorithm was assessed using the roulette wheel method. The method has been applied to the rail profile on the Stockholm underground, where problems with rolling contact fatigue on wheels and rails are currently managed by grinding. From a starting point of the original BV50 and the UIC60 rail profiles, an optimised rail profile with some shoulder relief has been produced. The optimised profile seems similar to measured rail profiles on the Stockholm underground network and although initial grinding is required, maintenance of the profile will probably not require further grinding.
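The selection step described above (fitness taken as an inverted penalty index, parents drawn by roulette wheel) amounts to sampling with probability proportional to fitness. In the sketch below the penalty values are random placeholders for the Gensys-derived performance parameters.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical penalty indices from vehicle-dynamics simulations (lower is better).
penalty = rng.uniform(1.0, 10.0, size=20)
fitness = 1.0 / penalty                      # inverted penalty index as fitness

def roulette_wheel(fitness, n_parents, rng):
    """Sample parent indices with probability proportional to fitness."""
    probs = fitness / fitness.sum()
    return rng.choice(len(fitness), size=n_parents, p=probs)

parents = roulette_wheel(fitness, n_parents=10, rng=rng)
print("selected profile indices:", parents)
```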
Pre-operative optimisation of lung function
Azhar, Naheed
2015-01-01
The anaesthetic management of patients with pre-existing pulmonary disease is a challenging task. It is associated with increased morbidity in the form of post-operative pulmonary complications. Pre-operative optimisation of lung function helps to reduce these complications. Patients are advised to stop smoking for a period of 4–6 weeks. This reduces airway reactivity, improves mucociliary function and decreases carboxy-haemoglobin. The widely used incentive spirometry may be useful only when combined with other respiratory muscle exercises. Volume-based inspiratory devices have the best results. Pharmacotherapy of asthma and chronic obstructive pulmonary disease must be optimised before considering the patient for elective surgery. Beta-2 agonists, inhaled corticosteroids and systemic corticosteroids are the main drugs used for this purpose, and several other drugs play an adjunctive role in medical therapy. A graded approach has been suggested to manage these patients for elective surgery with the aim of achieving optimal pulmonary function. PMID:26556913
On the optimisation of the use of 3He in radiation portal monitors
NASA Astrophysics Data System (ADS)
Tomanin, Alice; Peerani, Paolo; Janssens-Maenhout, Greet
2013-02-01
Radiation Portal Monitors (RPMs) are used to detect illicit trafficking of nuclear or other radioactive material concealed in vehicles, cargo containers or people at strategic check points, such as borders, seaports and airports. Most of them include neutron detectors for the interception of potential plutonium smuggling. The most common technology used for neutron detection in RPMs is based on 3He proportional counters. The recent severe shortage of this rare and expensive gas has made it difficult for manufacturers to provide enough detectors to satisfy market demand. In this paper we analyse the design of typical commercial RPMs and try to optimise the detector parameters in order either to maximise the detection efficiency for a given amount of 3He, or to minimise the amount of gas needed to reach the same detection performance by reducing the gas volume or pressure in an optimised design.
Hybrid real-code ant colony optimisation for constrained mechanical design
NASA Astrophysics Data System (ADS)
Pholdee, Nantiwat; Bureerat, Sujin
2016-01-01
This paper proposes a hybrid meta-heuristic based on integrating a local search simplex downhill (SDH) method into the search procedure of real-code ant colony optimisation (ACOR). This hybridisation leads to five hybrid algorithms where a Monte Carlo technique, a Latin hypercube sampling technique (LHS) and a translational propagation Latin hypercube design (TPLHD) algorithm are used to generate an initial population. Also, two numerical schemes for selecting an initial simplex are investigated. The original ACOR and its hybrid versions along with a variety of established meta-heuristics are implemented to solve 17 constrained test problems where a fuzzy set theory penalty function technique is used to handle design constraints. The comparative results show that the hybrid algorithms are the top performers. Using the TPLHD technique gives better results than the other sampling techniques. The hybrid optimisers are a powerful design tool for constrained mechanical design problems.
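The hybridisation pattern (a population-based global search periodically refined by a Nelder-Mead simplex downhill local search) can be sketched as below. The Gaussian archive sampler is a much-simplified stand-in for ACOR, and the test function, archive size and refinement schedule are arbitrary; the initial-sampling variants (Monte Carlo, LHS, TPLHD) and the fuzzy penalty handling are omitted.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
rastrigin = lambda x: 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

dim, archive_size = 5, 20
archive = rng.uniform(-5.12, 5.12, (archive_size, dim))
scores = np.array([rastrigin(x) for x in archive])

for it in range(200):
    # Global step: sample around a good archive member (simplified ACOR-like move).
    guide = archive[np.argsort(scores)[rng.integers(0, 5)]]
    sigma = archive.std(axis=0) + 1e-9
    cand = guide + sigma * rng.standard_normal(dim)
    # Local step (every 20 iterations): refine the best solution with simplex downhill.
    if it % 20 == 0:
        cand = minimize(rastrigin, archive[np.argmin(scores)],
                        method="Nelder-Mead", options={"maxiter": 50}).x
    f = rastrigin(cand)
    worst = int(np.argmax(scores))
    if f < scores[worst]:
        archive[worst], scores[worst] = cand, f    # keep the archive elitist

print(f"best value found: {scores.min():.4f}")
```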
Sterckx, Femke L; Saison, Daan; Delvaux, Freddy R
2010-08-31
Monophenols are widespread compounds contributing to the flavour of many foods and beverages. They are most likely present in beer, but so far, little is known about their influence on beer flavour. To quantify these monophenols in beer, we optimised a headspace solid-phase microextraction method coupled to gas chromatography-mass spectrometry. To improve their isolation from the beer matrix and their chromatographic properties, the monophenols were acetylated using acetic anhydride and KHCO(3) as derivatising agent and base catalyst, respectively. Derivatisation conditions were optimised with particular attention to the pH of the reaction medium. Additionally, different parameters affecting extraction efficiency were optimised, including fibre coating, extraction time and temperature, and salt addition. Afterwards, we calibrated and validated the method successfully and applied it to the analysis of monophenols in beer samples. 2010 Elsevier B.V. All rights reserved.
Dwell time-based stabilisation of switched delay systems using free-weighting matrices
NASA Astrophysics Data System (ADS)
Koru, Ahmet Taha; Delibaşı, Akın; Özbay, Hitay
2018-01-01
In this paper, we present a quasi-convex optimisation method to minimise an upper bound of the dwell time for stability of switched delay systems. Piecewise Lyapunov-Krasovskii functionals are introduced, and the upper bound for the derivative of the Lyapunov functionals is estimated by the free-weighting matrices method to investigate the non-switching stability of each candidate subsystem. Then, a sufficient condition on the dwell time is derived to guarantee the asymptotic stability of the switched delay system. Once these conditions are represented by a set of linear matrix inequalities (LMIs), the dwell time optimisation problem can be formulated as a standard quasi-convex optimisation problem. Numerical examples are given to illustrate the improvements over previously obtained dwell time bounds. Using the results obtained in the stability case, we present a nonlinear minimisation algorithm to synthesise dwell-time-minimising controllers. The algorithm solves the problem by successive linearisation of the nonlinear conditions.
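Because feasibility is monotone in the dwell time, the quasi-convex problem can be solved by bisection over a feasibility oracle. The sketch below shows only that outer loop; lmi_feasible is a hypothetical placeholder for the free-weighting-matrix LMI test (in practice solved with an SDP solver), faked here with a threshold so the sketch runs.

```python
def lmi_feasible(dwell_time):
    """Hypothetical placeholder: return True if the piecewise Lyapunov-Krasovskii
    LMI conditions are feasible for this dwell time (normally checked with an
    SDP tool). Here it is faked with a threshold purely to make the sketch run."""
    return dwell_time >= 1.37

def min_dwell_time(lo=0.0, hi=10.0, tol=1e-4):
    """Bisection on the dwell time: feasibility is monotone, so the problem is
    quasi-convex and the smallest feasible dwell time is bracketed by [lo, hi]."""
    assert lmi_feasible(hi), "upper bound must be feasible"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if lmi_feasible(mid):
            hi = mid          # feasible: the minimum lies at or below mid
        else:
            lo = mid          # infeasible: the minimum lies above mid
    return hi

print(f"minimal dwell time ≈ {min_dwell_time():.4f}")
```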
Evolution of Force Sensing Technologies.
Shah, Dipen
2017-06-01
In order to improve the procedural success and long-term outcomes of catheter ablation techniques for atrial fibrillation (AF), an important unfulfilled requirement is to create durable electrophysiologically complete lesions. Measurement of contact force (CF) between the catheter tip and the target tissue can guide physicians to optimise both mapping and ablation procedures. Contact force can affect lesion size and clinical outcomes following catheter ablation of AF. Force sensing technologies have matured since their advent several years ago, and now allow the direct measurement of CF between the catheter tip and the target myocardium in real time. In order to obtain complete durable lesions, catheter tip spatial stability and stable contact force are important. Suboptimal energy delivery, lesion density/contiguity and/or excessive wall thickness of the pulmonary vein-left atrial (PV-LA) junction may result in conduction recovery at these sites. Lesion assessment tools may help predict and localise electrical weak points resulting in conduction recovery during and after ablation. There is increasing clinical evidence to show that optimal use of CF sensing during ablation can reduce acute PV re-conduction, although prospective randomised studies are desirable to confirm long-term favourable clinical outcomes. In combination with optimised lesion assessment tools, contact force sensing technology has the potential to become the standard of care for all patients undergoing AF catheter ablation.
Williams, Ruth M; Senanayake, Upeka; Artibani, Mara; Taylor, Gunes; Wells, Daniel; Ahmed, Ahmed Ashour; Sauka-Spengler, Tatjana
2018-02-23
CRISPR/Cas9 genome engineering has revolutionised all aspects of biological research, with epigenome engineering transforming gene regulation studies. Here, we present an optimised, adaptable toolkit enabling genome and epigenome engineering in the chicken embryo, and demonstrate its utility by probing gene regulatory interactions mediated by neural crest enhancers. First, we optimise novel efficient guide-RNA mini expression vectors utilising chick U6 promoters, provide a strategy for rapid somatic gene knockout and establish a protocol for evaluation of mutational penetrance by targeted next-generation sequencing. We show that CRISPR/Cas9-mediated disruption of transcription factors causes a reduction in their cognate enhancer-driven reporter activity. Next, we assess endogenous enhancer function using both enhancer deletion and nuclease-deficient Cas9 (dCas9) effector fusions to modulate enhancer chromatin landscape, thus providing the first report of epigenome engineering in a developing embryo. Finally, we use the synergistic activation mediator (SAM) system to activate an endogenous target promoter. The novel genome and epigenome engineering toolkit developed here enables manipulation of endogenous gene expression and enhancer activity in chicken embryos, facilitating high-resolution analysis of gene regulatory interactions in vivo. © 2018. Published by The Company of Biologists Ltd.
Roberts, Jason A; Joynt, Gavin M; Choi, Gordon Y S; Gomersall, Charles D; Lipman, Jeffrey
2012-03-01
Optimising antimicrobial dosing for critically ill patients is highly challenging and when it is not achieved can lead to worse patient outcomes. To this end, use of dosing regimens recommended in package inserts from drug manufacturers is frequently insufficient to guide dosing in these patients appropriately. Whilst the effect of critical illness pathophysiology on the pharmacokinetic (PK) behaviour of antimicrobials can be profound, the variability of these changes between patients is still being quantified. The PK effects of hypoproteinaemia, organ dysfunction and the presence of augmented renal clearance may lead to plasma antimicrobial concentrations that are difficult to predict at the bedside, which may result in excess toxicity or suboptimal bacterial killing. This paper outlines the factors that affect pharmacokinetics in critically ill patients and how knowledge of these factors can increase the likelihood of achieving optimal antimicrobial plasma concentrations. In selected settings, we advocate individualised dosing of renally cleared antimicrobials using physiological data such as measured creatinine clearance and published non-renal clearance data. Where such data do not exist, therapeutic drug monitoring may be a useful alternative and has been associated with significant clinical benefits, although it is not currently widely available. Copyright © 2011 Elsevier B.V. and the International Society of Chemotherapy. All rights reserved.
Mixing formula for tissue-mimicking silicone phantoms in the near infrared
NASA Astrophysics Data System (ADS)
Böcklin, C.; Baumann, D.; Stuker, F.; Fröhlich, Jürg
2015-03-01
The knowledge of accurate optical parameters of materials is paramount in biomedical optics applications and in numerical simulations of such systems. Phantom materials with variable but predefined parameters are needed to optimise these systems. An optimised integrating sphere measurement setup and reconstruction algorithm are presented in this work to determine the optical properties of silicone rubber-based phantoms whose absorption and scattering properties are altered with TiO2 and carbon black particles. A mixing formula for all constituents is derived, allowing phantoms with predefined optical properties to be created.
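In practice such a mixing formula is used in reverse: given target absorption and reduced-scattering coefficients, compute the required carbon black and TiO2 fractions. The linear model and all coefficient values below are invented placeholders, not the formula fitted in the paper.

```python
# Hypothetical linear mixing model (placeholder coefficients, not the paper's fit):
#   mu_a  = k_a * c_cb   + mu_a0      (carbon black controls absorption)
#   mu_s' = k_s * c_tio2 + mu_s0      (TiO2 controls reduced scattering)
k_a, mu_a0 = 2.5, 0.01      # mm^-1 per weight-% and background absorption
k_s, mu_s0 = 1.8, 0.05      # mm^-1 per weight-% and background scattering

def recipe(mu_a_target, mu_s_target):
    """Invert the mixing model to get particle concentrations (weight-%)."""
    c_cb = (mu_a_target - mu_a0) / k_a
    c_tio2 = (mu_s_target - mu_s0) / k_s
    if min(c_cb, c_tio2) < 0:
        raise ValueError("target optical properties below the plain-silicone baseline")
    return c_cb, c_tio2

# Example: tissue-like near-infrared targets (illustrative values only).
c_cb, c_tio2 = recipe(mu_a_target=0.02, mu_s_target=1.0)
print(f"carbon black: {c_cb:.4f} wt%, TiO2: {c_tio2:.3f} wt%")
```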
NASA Astrophysics Data System (ADS)
Andrade, P.; Fiorini, B.; Murphy, S.; Pigueiras, L.; Santos, M.
2015-12-01
Over the past two years, the operation of the CERN Data Centres went through significant changes with the introduction of new mechanisms for hardware procurement and new services for cloud provisioning and configuration management, among other improvements. These changes resulted in an increase of resources being operated in a more dynamic environment. Today, the CERN Data Centres provide over 11000 multi-core processor servers, 130 PB disk servers, 100 PB tape robots, and 150 high performance tape drives. To cope with these developments, an evolution of the data centre monitoring tools was also required. This modernisation was based on a number of guiding rules: sustain the increase of resources, adapt to the new dynamic nature of the data centres, make monitoring data easier to share, give more flexibility to Service Managers on how they publish and consume monitoring metrics and logs, establish a common repository of monitoring data, optimise the handling of monitoring notifications, and replace the previous toolset with new open source technologies with large adoption and community support. This contribution describes how these improvements were delivered, presents the architecture and technologies of the new monitoring tools, and reviews the experience of their production deployment.
Path integration mediated systematic search: a Bayesian model.
Vickerstaff, Robert J; Merkle, Tobias
2012-08-21
The systematic search behaviour is a backup system that increases the chances of desert ants finding their nest entrance after foraging when the path integrator has failed to guide them home accurately enough. Here we present a mathematical model of the systematic search that is based on extensive behavioural studies in North African desert ants Cataglyphis fortis. First, a simple search heuristic utilising Bayesian inference and a probability density function is developed. This model, which optimises the short-term nest detection probability, is then compared to three simpler search heuristics and to recorded search patterns of Cataglyphis ants. To compare the different searches a method to quantify search efficiency is established as well as an estimate of the error rate in the ants' path integrator. We demonstrate that the Bayesian search heuristic is able to automatically adapt to increasing levels of positional uncertainty to produce broader search patterns, just as desert ants do, and that it outperforms the three other search heuristics tested. The searches produced by it are also arguably the most similar in appearance to the ant's searches. Copyright © 2012 Elsevier Ltd. All rights reserved.
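The ingredients of such a model can be sketched on a grid: a Gaussian prior over nest position encodes path-integration uncertainty, each unsuccessful look down-weights the posterior near the visited point by a detection likelihood, and the next waypoint is chosen where the posterior (a proxy for short-term detection probability) is highest. The grid size, detection kernel and uncertainty parameters below are illustrative, not those fitted to Cataglyphis data.

```python
import numpy as np

grid = np.linspace(-5.0, 5.0, 101)
X, Y = np.meshgrid(grid, grid)

# Prior over nest position: Gaussian centred on the (erroneous) path-integration fix.
sigma_pi = 1.5
prior = np.exp(-((X - 0.5) ** 2 + (Y + 0.8) ** 2) / (2 * sigma_pi ** 2))
prior /= prior.sum()

def detection_prob(x0, y0, r_detect=0.4):
    """Probability of spotting the nest from (x0, y0), falling off with distance."""
    return np.exp(-((X - x0) ** 2 + (Y - y0) ** 2) / (2 * r_detect ** 2))

posterior = prior.copy()
path = []
for step in range(6):
    # Greedy choice: go where the posterior (short-term detection chance) is highest.
    i, j = np.unravel_index(np.argmax(posterior), posterior.shape)
    x0, y0 = grid[j], grid[i]
    path.append((x0, y0))
    # Bayesian update after an unsuccessful look: down-weight the searched area.
    posterior *= (1.0 - detection_prob(x0, y0))
    posterior /= posterior.sum()

print("search waypoints:", [(round(x, 2), round(y, 2)) for x, y in path])
```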
Human placental vasculature imaging using an LED-based photoacoustic/ultrasound imaging system
NASA Astrophysics Data System (ADS)
Maneas, Efthymios; Xia, Wenfeng; Kuniyil Ajith Singh, Mithun; Sato, Naoto; Agano, Toshitaka; Ourselin, Sebastien; West, Simeon J.; David, Anna L.; Vercauteren, Tom; Desjardins, Adrien E.
2018-02-01
Minimally invasive fetal interventions, such as those used for therapy of twin-to-twin transfusion syndrome (TTTS), require accurate image guidance to optimise patient outcomes. Currently, TTTS can be treated fetoscopically by identifying anastomosing vessels on the chorionic (fetal) placental surface, and then performing photocoagulation. Incomplete photocoagulation increases the risk of procedure failure. Photoacoustic imaging can provide contrast for both haemoglobin concentration and oxygenation, and in this study, it was hypothesised that it can resolve chorionic placental vessels. We imaged a term human placenta that was collected after caesarean section delivery using a photoacoustic/ultrasound system (AcousticX) that included light emitting diode (LED) arrays for excitation light and a linear-array ultrasound imaging probe. Two-dimensional (2D) co-registered photoacoustic and B-mode pulse-echo ultrasound images were acquired and displayed in real-time. Translation of the imaging probe enabled 3D imaging. This feasibility study demonstrated that photoacoustic imaging can be used to visualise chorionic placental vasculature, and that it has strong potential to guide minimally invasive fetal interventions.
Morton, Katherine; Band, Rebecca; van Woezik, Anne; Grist, Rebecca; McManus, Richard J.; Little, Paul; Yardley, Lucy
2018-01-01
Background For behaviour-change interventions to be successful they must be acceptable to users and overcome barriers to behaviour change. The Person-Based Approach can help to optimise interventions to maximise acceptability and engagement. This article presents a novel, efficient and systematic method that can be used as part of the Person-Based Approach to rapidly analyse data from development studies to inform intervention modifications. We describe how we used this approach to optimise a digital intervention for patients with hypertension (HOME BP), which aims to implement medication and lifestyle changes to optimise blood pressure control. Methods In study 1, hypertensive patients (N = 12) each participated in three think-aloud interviews, providing feedback on a prototype of HOME BP. In study 2 patients (N = 11) used HOME BP for three weeks and were then interviewed about their experiences. Studies 1 and 2 were used to identify detailed changes to the intervention content and potential barriers to engagement with HOME BP. In study 3 (N = 7) we interviewed hypertensive patients who were not interested in using an intervention like HOME BP to identify potential barriers to uptake, which informed modifications to our recruitment materials. Analysis in all three studies involved detailed tabulation of patient data and comparison to our modification criteria. Results Studies 1 and 2 indicated that the HOME BP procedures were generally viewed as acceptable and feasible, but also highlighted concerns about monitoring blood pressure correctly at home and making medication changes remotely. Patients in study 3 had additional concerns about the safety and security of the intervention. Modifications improved the acceptability of the intervention and recruitment materials. Conclusions This paper provides a detailed illustration of how to use the Person-Based Approach to refine a digital intervention for hypertension. The novel, efficient approach to analysis and criteria for deciding when to implement intervention modifications described here may be useful to others developing interventions. PMID:29723262
Zipfel, Stephan; Wild, Beate; Groß, Gaby; Friederich, Hans-Christoph; Teufel, Martin; Schellberg, Dieter; Giel, Katrin E; de Zwaan, Martina; Dinkel, Andreas; Herpertz, Stephan; Burgmer, Markus; Löwe, Bernd; Tagay, Sefik; von Wietersheim, Jörn; Zeeck, Almut; Schade-Brittinger, Carmen; Schauenburg, Henning; Herzog, Wolfgang
2014-01-11
Psychotherapy is the treatment of choice for patients with anorexia nervosa, although evidence of efficacy is weak. The Anorexia Nervosa Treatment of OutPatients (ANTOP) study aimed to assess the efficacy and safety of two manual-based outpatient treatments for anorexia nervosa (focal psychodynamic therapy and enhanced cognitive behaviour therapy) versus optimised treatment as usual. The ANTOP study is a multicentre, randomised controlled efficacy trial in adults with anorexia nervosa. We recruited patients from ten university hospitals in Germany. Participants were randomly allocated to 10 months of treatment with either focal psychodynamic therapy, enhanced cognitive behaviour therapy, or optimised treatment as usual (including outpatient psychotherapy and structured care from a family doctor). The primary outcome was weight gain, measured as increased body-mass index (BMI) at the end of treatment. A key secondary outcome was rate of recovery (based on a combination of weight gain and eating disorder-specific psychopathology). Analysis was by intention to treat. This trial is registered at http://isrctn.org, number ISRCTN72809357. Of 727 adults screened for inclusion, 242 underwent randomisation: 80 to focal psychodynamic therapy, 80 to enhanced cognitive behaviour therapy, and 82 to optimised treatment as usual. At the end of treatment, 54 patients (22%) were lost to follow-up, and at 12-month follow-up a total of 73 (30%) had dropped out. At the end of treatment, BMI had increased in all study groups (focal psychodynamic therapy 0·73 kg/m², enhanced cognitive behaviour therapy 0·93 kg/m², optimised treatment as usual 0·69 kg/m²); no differences were noted between groups (mean difference between focal psychodynamic therapy and enhanced cognitive behaviour therapy -0·45, 95% CI -0·96 to 0·07; focal psychodynamic therapy vs optimised treatment as usual -0·14, -0·68 to 0·39; enhanced cognitive behaviour therapy vs optimised treatment as usual 0·30, -0·22 to 0·83). At 12-month follow-up, the mean gain in BMI had risen further (1·64 kg/m², 1·30 kg/m², and 1·22 kg/m², respectively), but no differences between groups were recorded (0·10, -0·56 to 0·76; 0·25, -0·45 to 0·95; 0·15, -0·54 to 0·83, respectively). No serious adverse events attributable to weight loss or trial participation were recorded. Optimised treatment as usual, combining psychotherapy and structured care from a family doctor, should be regarded as solid baseline treatment for adult outpatients with anorexia nervosa. Focal psychodynamic therapy proved advantageous in terms of recovery at 12-month follow-up, and enhanced cognitive behaviour therapy was more effective with respect to speed of weight gain and improvements in eating disorder psychopathology. Long-term outcome data will be helpful to further adapt and improve these novel manual-based treatment approaches. German Federal Ministry of Education and Research (Bundesministerium für Bildung und Forschung, BMBF), German Eating Disorders Diagnostic and Treatment Network (EDNET). Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
du Feu, R. J.; Funke, S. W.; Kramer, S. C.; Hill, J.; Piggott, M. D.
2016-12-01
The installation of tidal turbines into the ocean will inevitably affect the environment around them. However, due to the relative infancy of this sector, the extent and severity of such effects are unknown. The layout of an array of turbines is an important factor in determining not only the array's final yield but also how it will influence regional hydrodynamics. This in turn could affect, for example, sediment transportation or habitat suitability. The two potentially competing objectives of extracting energy from the tidal current, and of limiting any environmental impact consequent to influencing that current, are investigated here. This relationship is posed as a multi-objective optimisation problem. OpenTidalFarm, an array layout optimisation tool, and MaxEnt, habitat suitability modelling software, are used to evaluate scenarios off the coast of the UK. MaxEnt is used to estimate the likelihood of finding a species in a given location based upon environmental input data and presence data of the species. Environmental features which are known to impact habitat, specifically those affected by the presence of an array, such as bed shear stress, are chosen as inputs. MaxEnt then uses a maximum-entropy modelling approach to estimate population distribution across the modelled area. OpenTidalFarm is used to maximise the power generated by an array, or multiple arrays, through adjusting the position and number of turbines within them. It uses a 2D shallow water model with turbine arrays represented as adjustable friction fields. It can also optimise user-created functionals that can be expressed mathematically. This work uses two functionals: the power extracted by the array and the suitability of habitat as predicted by MaxEnt. A gradient-based local optimisation is used to adjust the array layout at each iteration. This work presents arrays that are optimised for both yield and the viability of habitat for chosen species. In each scenario studied, a range of array formations is found expressing varying preferences for either functional. Further analyses then allow for the identification of trade-offs between the two key societal objectives of energy production and conservation. This in turn produces information valuable to stakeholders and policymakers when making decisions on array design.
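To make the two-functional trade-off concrete, the minimal Python sketch below scalarises a toy power functional against a toy habitat functional and adjusts turbine coordinates by finite-difference gradient ascent; the models, the 600 m domain, the eight-turbine layout and the weight sweep are illustrative assumptions and do not use OpenTidalFarm's or MaxEnt's actual APIs.

```python
import numpy as np

def power_functional(xy, flow_speed=2.5):
    """Toy array power: each turbine yields less when crowded by neighbours."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    wake_penalty = np.exp(-d.min(axis=1) / 50.0)       # crowding reduces yield
    return np.sum(flow_speed**3 * (1.0 - 0.5 * wake_penalty))

def habitat_functional(xy, habitat_centre=np.array([400.0, 300.0])):
    """Toy habitat suitability: lower when turbines sit near a sensitive patch."""
    d = np.linalg.norm(xy - habitat_centre, axis=1)
    return -np.sum(np.exp(-d / 100.0))

def scalarised_objective(xy, w):
    # Weight w expresses the preference between yield and habitat suitability.
    return w * power_functional(xy) + (1.0 - w) * habitat_functional(xy)

def gradient_ascent(xy0, w, step=5.0, iters=200, eps=1e-3):
    """Finite-difference gradient ascent on the turbine coordinates."""
    xy = xy0.copy()
    for _ in range(iters):
        grad = np.zeros_like(xy)
        base = scalarised_objective(xy, w)
        for idx in np.ndindex(*xy.shape):
            pert = xy.copy()
            pert[idx] += eps
            grad[idx] = (scalarised_objective(pert, w) - base) / eps
        xy += step * grad / (np.linalg.norm(grad) + 1e-12)
    return xy

rng = np.random.default_rng(0)
layout0 = rng.uniform(0, 600, size=(8, 2))             # 8 turbines in a 600 m box
for w in (0.2, 0.5, 0.8):                               # sweep preference weights
    layout = gradient_ascent(layout0, w)
    print(w, round(power_functional(layout), 1), round(habitat_functional(layout), 2))
```

Sweeping the weight traces out a set of layouts expressing different preferences for the two functionals, which is the kind of trade-off curve the abstract refers to.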
An Optimised System for Generating Multi-Resolution DTMs Using NASA MRO Datasets
NASA Astrophysics Data System (ADS)
Tao, Y.; Muller, J.-P.; Sidiropoulos, P.; Veitch-Michaelis, J.; Yershov, V.
2016-06-01
Within the EU FP-7 iMars project, a fully automated multi-resolution DTM processing chain, called Co-registration ASP-Gotcha Optimised (CASP-GO) has been developed, based on the open source NASA Ames Stereo Pipeline (ASP). CASP-GO includes tiepoint based multi-resolution image co-registration and an adaptive least squares correlation-based sub-pixel refinement method called Gotcha. The implemented system guarantees global geo-referencing compliance with respect to HRSC (and thence to MOLA), provides refined stereo matching completeness and accuracy based on the ASP normalised cross-correlation. We summarise issues discovered from experimenting with the use of the open-source ASP DTM processing chain and introduce our new working solutions. These issues include global co-registration accuracy, de-noising, dealing with failure in matching, matching confidence estimation, outlier definition and rejection scheme, various DTM artefacts, uncertainty estimation, and quality-efficiency trade-offs.
The 5C Concept and 5S Principles in Inflammatory Bowel Disease Management.
Hibi, Toshifumi; Panaccione, Remo; Katafuchi, Miiko; Yokoyama, Kaoru; Watanabe, Kenji; Matsui, Toshiyuki; Matsumoto, Takayuki; Travis, Simon; Suzuki, Yasuo
2017-10-27
The international Inflammatory Bowel Disease [IBD] Expert Alliance initiative [2012-2015] served as a platform to define and support areas of best practice in IBD management to help improve outcomes for all patients with IBD. During the programme, IBD specialists from around the world established by consensus two best practice charters: the 5S Principles and the 5C Concept. The 5S Principles were conceived to provide health care providers with key guidance for improving clinical practice based on best management approaches. They comprise the following categories: Stage the disease; Stratify patients; Set treatment goals; Select appropriate treatment; and Supervise therapy. Optimised management of patients with IBD based on the 5S Principles can be achieved most effectively within an optimised clinical care environment. Guidance on optimising the clinical care setting in IBD management is provided through the 5C Concept, which encompasses: Comprehensive IBD care; Collaboration; Communication; Clinical nurse specialists; and Care pathways. Together, the 5C Concept and 5S Principles provide structured recommendations on organising the clinical care setting and developing best-practice approaches in IBD management. Consideration and application of these two dimensions could help health care providers optimise their IBD centres and collaborate more effectively with their multidisciplinary team colleagues and patients, to provide improved IBD care in daily clinical practice. Ultimately, this could lead to improved outcomes for patients with IBD. Copyright © 2017 European Crohn’s and Colitis Organisation (ECCO). Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com
A new bio-inspired optimisation algorithm: Bird Swarm Algorithm
NASA Astrophysics Data System (ADS)
Meng, Xian-Bing; Gao, X. Z.; Lu, Lihua; Liu, Yu; Zhang, Hengzhen
2016-07-01
A new bio-inspired algorithm, the Bird Swarm Algorithm (BSA), is proposed for solving optimisation problems. BSA is based on the swarm intelligence extracted from the social behaviours and social interactions of bird swarms. Birds mainly exhibit three kinds of behaviour: foraging, vigilance and flight. Birds forage for food and escape from predators through social interactions, thereby increasing their chance of survival. By modelling these social behaviours, social interactions and the related swarm intelligence, four search strategies associated with five simplified rules are formulated in BSA. Simulations and comparisons based on eighteen benchmark problems demonstrate the effectiveness, superiority and stability of BSA. Some proposals for future research on BSA are also discussed.
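As an illustration only, the Python sketch below implements a deliberately simplified swarm with foraging, vigilance and flight behaviours on a benchmark function; the update rules, probabilities and parameter values are assumptions for demonstration and are not the published BSA equations.

```python
import numpy as np

def sphere(x):                        # benchmark objective to minimise
    return np.sum(x**2)

def simple_bird_swarm(f, dim=10, n_birds=30, iters=300, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_birds, dim))
    fit = np.apply_along_axis(f, 1, x)
    p_best, p_fit = x.copy(), fit.copy()             # personal memories
    for t in range(iters):
        g_best = p_best[np.argmin(p_fit)]            # swarm's best position
        mean_pos = p_best.mean(axis=0)
        for i in range(n_birds):
            r = rng.random()
            if t % 10 == 0:                          # occasional "flight": re-scatter
                x[i] = g_best + rng.normal(0, 0.5, dim) * rng.random()
            elif r < 0.7:                            # "foraging": move toward memories
                x[i] += rng.random(dim) * (p_best[i] - x[i]) \
                        + rng.random(dim) * (g_best - x[i])
            else:                                    # "vigilance": move toward swarm mean
                x[i] += rng.normal(0, 1, dim) * (mean_pos - x[i])
        fit = np.apply_along_axis(f, 1, x)
        improved = fit < p_fit
        p_best[improved], p_fit[improved] = x[improved], fit[improved]
    return p_best[np.argmin(p_fit)], p_fit.min()

best_x, best_f = simple_bird_swarm(sphere)
print("best objective found:", best_f)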
Miniature high-resolution guided-wave spectrometer for atmospheric remote sensing
NASA Astrophysics Data System (ADS)
Sloan, James; Kruzelecky, Roman; Wong, Brian; Zou, Jing; Jamroz, Wes; Haddad, Emile; Poirier, Michel
This paper describes the design and application of an innovative spectrometer in which a guided-wave integrated optical spectrometer (IOSPEC) has been coupled with a Fabry-Perot (FP) interferometer. This miniature spectrometer has a net mass under 3 kg, but is capable of broadband operation at spectral resolutions below 0.03 nm full width half maximum (FWHM). The tuneable FP filter provides very high spectral resolution combined with a large input aperture. The solid state guided-wave spectrometer is currently configured for a 512-channel array detector, which provides sub-nm coarse resolution. The ultimate resolution is determined by the FP filter, which is tuned across the desired spectral bands, thereby providing a signal-to-noise ratio (SNR) advantage over scanned spectrometer systems of the square root of the number of detector channels. The guided-wave optics provides robust, long-term optical alignment, while minimising the mechanical complexity. The miniaturisation of the FP-IOSPEC spectrometer allows multiple spectrometers to be accommodated on a single MicroSat. Each of these can be optimised for selected measurement tasks and views, thereby enabling more flexible data acquisition strategies with enhanced information content, while minimising the mission cost. The application of this innovative technology in the proposed Miniature Earth Observation Satellite (MEOS) mission will also be discussed. The MEOS mission, which is designed for the investigation of the carbon and water cycles, relies on multiple IOSPEC instruments for the simultaneous measurement of a range of atmospheric and surface properties important to climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garnon, Julien, E-mail: juliengarnon@gmail.com; Koch, Guillaume, E-mail: Guillaume.koch@gmail.com; Ramamurthy, Nitin, E-mail: Nitin-ramamurthy@hotmail.com
Objective: To review our initial experience with percutaneous CT and fluoroscopy-guided screw fixation of pathological shoulder-girdle fractures. Materials and Methods: Between May 2014 and June 2015, three consecutive oncologic patients (mean age 65 years; range 57–75 years) with symptomatic pathological shoulder-girdle fractures unsuitable for surgery and radiotherapy underwent percutaneous image-guided screw fixation. Fractures occurred through metastases (n = 2) or a post-ablation cavity (n = 1). Mechanical properties of osteosynthesis were adjudged superior to stand-alone cementoplasty in each case. Cannulated screws were placed under combined CT and fluoroscopic guidance with complementary radiofrequency ablation or cementoplasty to optimise local palliation and secure screw fixation, respectively, in two cases. Follow-up was undertaken every few weeks until mortality or most recent appointment. Results: Four pathological fractures were treated in three patients (2 acromion, 1 clavicular, 1 coracoid). Mean size of associated lesion was 2.6 cm (range 1–4.5 cm). Technical success was achieved in all cases (100 %), without complications. Good palliation and restoration of mobility were observed in two cases at 2–3 months; one case could not be followed due to early post-procedural oncologic mortality. Conclusion: Percutaneous image-guided shoulder-girdle osteosynthesis appears technically feasible with good short-term efficacy in this complex patient subset. Further studies are warranted to confirm these promising initial results.
Illias, Hazlee Azil; Zhao Liang, Wee
2018-01-01
Early detection of power transformer fault is important because it can reduce the maintenance cost of the transformer and it can ensure continuous electricity supply in power systems. Dissolved Gas Analysis (DGA) technique is commonly used to identify oil-filled power transformer fault type but utilisation of artificial intelligence method with optimisation methods has shown convincing results. In this work, a hybrid support vector machine (SVM) with modified evolutionary particle swarm optimisation (EPSO) algorithm was proposed to determine the transformer fault type. The superiority of the modified PSO technique with SVM was evaluated by comparing the results with the actual fault diagnosis, unoptimised SVM and previous reported works. Data reduction was also applied using stepwise regression prior to the training process of SVM to reduce the training time. It was found that the proposed hybrid SVM-Modified EPSO (MEPSO)-Time Varying Acceleration Coefficient (TVAC) technique results in the highest correct identification percentage of faults in a power transformer compared to other PSO algorithms. Thus, the proposed technique can be one of the potential solutions to identify the transformer fault type based on DGA data on site.
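The Python sketch below illustrates the general idea of PSO-tuned SVM hyperparameters on synthetic DGA-style features; the data, the plain global-best particle swarm and the search ranges are assumptions and do not reproduce the paper's modified EPSO-TVAC, its stepwise-regression data reduction, or its field dataset.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                    # stand-in for dissolved-gas ratios
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # stand-in fault labels
X = StandardScaler().fit_transform(X)

def fitness(params):
    """Cross-validated accuracy of an SVM at the given (log10 C, log10 gamma)."""
    log_c, log_g = params
    clf = SVC(C=10.0**log_c, gamma=10.0**log_g)
    return cross_val_score(clf, X, y, cv=5).mean()

n_particles, iters = 12, 20
pos = rng.uniform([-2, -4], [3, 1], size=(n_particles, 2))   # log10 C, log10 gamma
vel = np.zeros_like(pos)
p_best = pos.copy()
p_fit = np.array([fitness(p) for p in pos])
for _ in range(iters):
    g_best = p_best[np.argmax(p_fit)]
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (p_best - pos) + 1.5 * r2 * (g_best - pos)
    pos = np.clip(pos + vel, [-2, -4], [3, 1])
    fit = np.array([fitness(p) for p in pos])
    better = fit > p_fit
    p_best[better], p_fit[better] = pos[better], fit[better]

print("best CV accuracy:", p_fit.max(),
      "at log10(C), log10(gamma) =", p_best[np.argmax(p_fit)])
```

Replacing the synthetic features with measured gas ratios would let the same loop compare optimised and unoptimised SVMs in the way the abstract describes.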
Modelling the protocol stack in NCS with deterministic and stochastic petri net
NASA Astrophysics Data System (ADS)
Hui, Chen; Chunjie, Zhou; Weifeng, Zhu
2011-06-01
The protocol stack is the basis of networked control systems (NCS). Full or partial reconfiguration of the protocol stack offers both optimised communication service and system performance. At present, field testing is impractical for determining the performance of a reconfigurable protocol stack, and the Petri net formal description technique offers the best combination of intuitive representation, tool support and analytical capabilities. Traditionally, separation between the different layers of the OSI model has been common practice. Nevertheless, such a layered modelling and analysis framework of the protocol stack precludes global optimisation for protocol reconfiguration. In this article, we propose a general modelling and analysis framework for NCS based on the cross-layer concept, which establishes an efficient system scheduling model by abstracting the time constraint, task interrelation, processor and bus sub-models from the upper and lower layers (application, data link and physical layers). Cross-layer design can help to overcome this lack of global optimisation through information sharing between protocol layers. To illustrate the framework, we take the controller area network (CAN) as a case study. The simulation results of the deterministic and stochastic Petri net (DSPN) model can help us adjust the message scheduling scheme and obtain better system performance.
Acquisition of business intelligence from human experience in route planning
NASA Astrophysics Data System (ADS)
Bello Orgaz, Gema; Barrero, David F.; R-Moreno, María D.; Camacho, David
2015-04-01
The logistics sector raises a number of highly challenging problems. Probably one of the most important is shipping planning, i.e. planning the routes that the shippers have to follow to deliver the goods. In this article, we present an artificial intelligence-based solution designed to help a logistics company improve its route-planning process. In order to achieve this goal, the solution uses the knowledge acquired by the company drivers to propose optimised routes. Hence, the proposed solution gathers the experience of the drivers, processes it and optimises the delivery process. The solution uses data mining to extract knowledge from the company information systems and prepares it for analysis with a case-based reasoning (CBR) algorithm. The CBR obtains, from the drivers' experience, the critical business intelligence knowledge that is needed by the planner. The design of the routes is done by a genetic algorithm that, given the processed information, optimises the routes according to several objectives, such as minimising the distance or time. Experimentation shows that the proposed approach is able to find routes that improve, on average, upon the routes made by the human experts.
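A minimal Python sketch of the route-optimisation step is given below; the delivery coordinates, the order-crossover operator and the GA settings are illustrative assumptions, and the CBR-derived driver knowledge that guides the real planner is not modelled here.

```python
import numpy as np

rng = np.random.default_rng(3)
stops = rng.uniform(0, 100, size=(12, 2))            # 12 hypothetical delivery points

def route_length(order):
    pts = stops[np.asarray(order)]
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest in b's order."""
    i, j = sorted(rng.choice(len(a), 2, replace=False))
    child = [-1] * len(a)
    child[i:j] = a[i:j]
    rest = [g for g in b if g not in child]
    for k in range(len(a)):
        if child[k] == -1:
            child[k] = rest.pop(0)
    return child

def mutate(order, rate=0.2):
    """Occasionally swap two stops to keep diversity in the population."""
    if rng.random() < rate:
        i, j = rng.choice(len(order), 2, replace=False)
        order[i], order[j] = order[j], order[i]
    return order

pop = [list(rng.permutation(len(stops))) for _ in range(60)]
for _ in range(200):
    pop.sort(key=route_length)
    parents = pop[:20]                               # truncation selection
    children = [mutate(crossover(parents[rng.integers(20)], parents[rng.integers(20)]))
                for _ in range(40)]
    pop = parents + children

print("shortest route found:", round(min(route_length(r) for r in pop), 1))
```

In a planner like the one described, the fitness would combine distance, time and driver-derived constraints rather than distance alone.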
Computer-aided diagnosis of melanoma using border and wavelet-based texture analysis.
Garnavi, Rahil; Aldeen, Mohammad; Bailey, James
2012-11-01
This paper presents a novel computer-aided diagnosis system for melanoma. The novelty lies in the optimised selection and integration of features derived from textural, border-based and geometrical properties of the melanoma lesion. The texture features are derived using wavelet decomposition, the border features are derived by constructing a boundary-series model of the lesion border and analysing it in the spatial and frequency domains, and the geometry features are derived from shape indexes. The optimised selection of features is achieved using the Gain-Ratio method, which is shown to be computationally efficient for the melanoma diagnosis application. Classification is done through the use of four classifiers, namely Support Vector Machine, Random Forest, Logistic Model Tree and Hidden Naive Bayes. The proposed diagnostic system is applied to a set of 289 dermoscopy images (114 malignant, 175 benign) partitioned into train, validation and test image sets. The system achieves an accuracy of 91.26% and an AUC value of 0.937 when 23 features are used. Other important findings include (i) the clear advantage gained in complementing texture with border and geometry features, compared to using texture information only, and (ii) the higher contribution of texture features than border-based features in the optimised feature set.
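The sketch below illustrates the wavelet-energy texture step followed by feature selection and classification on synthetic images; PyWavelets is assumed to be available, mutual information is used as an accessible stand-in for the Gain-Ratio ranking, and neither the images nor the feature set reproduce those of the paper.

```python
import numpy as np
import pywt                                       # PyWavelets, assumed available
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.feature_selection import SelectKBest, mutual_info_classif

def wavelet_energy_features(img, wavelet="db2", level=3):
    """Energy of the approximation and detail sub-bands of a 2D wavelet transform."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    feats = [np.mean(coeffs[0] ** 2)]                     # approximation energy
    for cH, cV, cD in coeffs[1:]:                         # detail energies per level
        feats += [np.mean(cH ** 2), np.mean(cV ** 2), np.mean(cD ** 2)]
    return np.array(feats)

rng = np.random.default_rng(0)
n, size = 120, 64
labels = rng.integers(0, 2, n)
# Synthetic "lesion" patches whose texture roughness depends on the label.
images = [rng.normal(scale=1.0 + 0.5 * lab, size=(size, size)) for lab in labels]
X = np.array([wavelet_energy_features(im) for im in images])

X_sel = SelectKBest(mutual_info_classif, k=6).fit_transform(X, labels)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X_sel, labels, cv=5).mean())
```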
NASA Astrophysics Data System (ADS)
Shaw-Stewart, James; Mattle, Thomas; Lippert, Thomas; Nagel, Matthias; Nüesch, Frank; Wokaun, Alexander
2013-08-01
Laser-induced forward transfer (LIFT) has already been used to fabricate various types of organic light-emitting diodes (OLEDs), and the process itself has been optimised and refined considerably since OLED pixels were first demonstrated. In particular, a dynamic release layer (DRL) of triazene polymer has been used, the environmental pressure has been reduced down to a medium vacuum, and the donor-receiver gap has been controlled with the use of spacers. Insight into the LIFT process's effect upon OLED pixel performance is presented here, obtained through optimisation of three-colour polyfluorene-based OLEDs. A marked dependence of the pixel morphology quality on the cathode metal is observed, and the laser transfer fluence dependence is also analysed. The pixel device performances are compared to those of conventionally fabricated devices, and cathode effects have been examined in detail. The silver cathode pixels show more heterogeneous pixel morphologies and correspondingly poorer efficiency characteristics. The aluminium cathode pixels have greater green electroluminescent emission than both the silver cathode pixels and the conventionally fabricated aluminium devices, and the green emission has a fluence dependence for silver cathode pixels.
Optimisation of a propagation-based x-ray phase-contrast micro-CT system
NASA Astrophysics Data System (ADS)
Nesterets, Yakov I.; Gureyev, Timur E.; Dimmock, Matthew R.
2018-03-01
Micro-CT scanners find applications in many areas ranging from biomedical research to material sciences. In order to provide spatial resolution on a micron scale, these scanners are usually equipped with micro-focus, low-power x-ray sources and hence require long scanning times to produce high resolution 3D images of the object with acceptable contrast-to-noise. Propagation-based phase-contrast tomography (PB-PCT) has the potential to significantly improve the contrast-to-noise ratio (CNR) or, alternatively, reduce the image acquisition time while preserving the CNR and the spatial resolution. We propose a general approach for the optimisation of the PB-PCT imaging system. When applied to an imaging system with fixed parameters of the source and detector this approach requires optimisation of only two independent geometrical parameters of the imaging system, i.e. the source-to-object distance R1 and geometrical magnification M, in order to produce the best spatial resolution and CNR. If, in addition to R1 and M, the system parameter space also includes the source size and the anode potential this approach allows one to find a unique configuration of the imaging system that produces the required spatial resolution and the best CNR.
Gorjanc, Gregor; Hickey, John M
2018-05-02
AlphaMate is a flexible program that optimises selection, maintenance of genetic diversity, and mate allocation in breeding programs. It can be used in animal and cross- and self-pollinating plant populations. These populations can be subject to selective breeding or conservation management. The problem is formulated as a multi-objective optimisation of a valid mating plan that is solved with an evolutionary algorithm. A valid mating plan is defined by a combination of mating constraints (the number of matings, the maximal number of parents, the minimal/equal/maximal number of contributions per parent, or allowance for selfing) that are gender specific or generic. The optimisation can maximize genetic gain, minimize group coancestry, minimize inbreeding of individual matings, or maximize genetic gain for a given increase in group coancestry or inbreeding. Users provide a list of candidate individuals with associated gender and selection criteria information (if applicable) and a coancestry matrix. Selection criteria and coancestry matrix can be based on pedigree or genome-wide markers. Additional individual or mating specific information can be included to enrich optimisation objectives. An example of rapid recurrent genomic selection in wheat demonstrates how AlphaMate can double the efficiency of converting genetic diversity into genetic gain compared to truncation selection. Another example demonstrates the use of genome editing to expand the gain-diversity frontier. Executable versions of AlphaMate for Windows, Mac, and Linux platforms are available at http://www.AlphaGenes.roslin.ed.ac.uk/AlphaMate. gregor.gorjanc@roslin.ed.ac.uk.
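The Python sketch below illustrates only the underlying gain-versus-coancestry trade-off, using a penalised objective solved by a generic constrained optimiser; the breeding values, the coancestry-like matrix and the penalty weight are synthetic assumptions, and AlphaMate's evolutionary search over valid mating plans is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 20
ebv = rng.normal(size=n)                          # estimated breeding values
L = rng.normal(size=(n, n)) * 0.05
A = L @ L.T + np.eye(n) * 0.5                     # synthetic coancestry-like matrix

def objective(c, lam=2.0):
    """Penalised gain: reward c'a, penalise group coancestry 0.5 c'Ac."""
    gain = c @ ebv
    group_coancestry = 0.5 * c @ A @ c
    return -(gain - lam * group_coancestry)       # minimise the negative

c0 = np.full(n, 1.0 / n)                          # start from equal contributions
res = minimize(objective, c0, method="SLSQP",
               bounds=[(0.0, 1.0)] * n,
               constraints=[{"type": "eq", "fun": lambda c: c.sum() - 1.0}])

contrib = res.x
print("selected parents:", np.where(contrib > 0.01)[0])
print("expected gain:", round(contrib @ ebv, 3),
      "group coancestry:", round(0.5 * contrib @ A @ contrib, 4))
```

Varying the penalty weight sweeps out a gain-diversity frontier, which is the trade-off AlphaMate explores with its mating-plan constraints.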
Cahyaningrum, Fitrianna; Permadhi, Inge; Ansari, Muhammad Ridwan; Prafiantini, Erfi; Rachman, Purnawati Hustina; Agustina, Rina
2016-12-01
Diets with a specific omega-6/omega-3 fatty acid ratio have been reported to have favourable effects in controlling obesity in adults. However, the development of a locally based diet that considers the ratio of these fatty acids to improve the nutritional status of overweight and obese children is lacking. Therefore, using linear programming, we developed an affordable optimised diet focusing on the ratio of omega-6/omega-3 fatty acid intake for obese children aged 12-23 months. A cross-sectional study was conducted in two subdistricts of East Jakarta involving 42 normal-weight and 29 overweight and obese children, grouped on the basis of their body mass index-for-age Z scores and selected through multistage random sampling. A 24-h recall was performed for 3 nonconsecutive days to assess the children's dietary intake levels and food patterns. We conducted group and structured interviews as well as market surveys to identify food availability, accessibility and affordability. Three types of affordable optimised 7-day diet meal plans were developed on the basis of breastfeeding status. The optimised diet plan fulfilled energy and macronutrient intake requirements within the acceptable macronutrient distribution range. The omega-6/omega-3 fatty acid ratio in the children was between 4 and 10. Moreover, the micronutrient intake level was within the range of the recommended daily allowance or estimated average recommendation and tolerable upper intake level. The optimisation model used in this study provides a mathematical solution for economical diet meal plans that approximate the nutrient requirements for overweight and obese children.
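A minimal linear-programming sketch in the spirit of the study is shown below; the four foods, their nutrient contents, prices and intake floors are hypothetical placeholders rather than the study's Jakarta food list, and the 4-10 ratio requirement is linearised as two inequalities.

```python
import numpy as np
from scipy.optimize import linprog

foods = ["rice porridge", "tempeh", "fish", "spinach"]
energy = np.array([130, 190, 120, 25])        # kcal per serving (hypothetical)
protein = np.array([2.5, 18.0, 20.0, 3.0])    # g per serving
omega6 = np.array([0.10, 3.50, 0.30, 0.05])   # g per serving
omega3 = np.array([0.01, 0.30, 1.20, 0.10])   # g per serving
price = np.array([0.20, 0.35, 1.10, 0.15])    # cost per serving (arbitrary units)

# Decision variable: servings per day of each food. Minimise cost subject to
# energy/protein floors and the ratio constraint written as linear inequalities:
#   omega6 - 10*omega3 <= 0   and   4*omega3 - omega6 <= 0
A_ub = np.vstack([-energy, -protein, omega6 - 10 * omega3, 4 * omega3 - omega6])
b_ub = np.array([-900.0, -20.0, 0.0, 0.0])    # >=900 kcal, >=20 g protein per day

res = linprog(c=price, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 6)] * len(foods))
for name, servings in zip(foods, res.x):
    print(f"{name}: {servings:.1f} servings/day")
print("omega-6/omega-3 ratio:", round((res.x @ omega6) / (res.x @ omega3), 2))
```

Swapping in measured food-composition data, local prices and the full micronutrient constraints would turn the same structure into the kind of affordable meal-plan generator described above.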
Díaz-Dinamarca, Diego A; Jerias, José I; Soto, Daniel A; Soto, Jorge A; Díaz, Natalia V; Leyton, Yessica Y; Villegas, Rodrigo A; Kalergis, Alexis M; Vásquez, Abel E
2018-03-01
Group B Streptococcus (GBS) is the leading cause of neonatal meningitis and a common pathogen in livestock and aquaculture industries around the world. Conjugate polysaccharide and protein-based vaccines are under development. The surface immunogenic protein (SIP) is a conserved protein in all GBS serotypes and has been shown to be a good target for vaccine development. The expression of recombinant proteins in Escherichia coli cells has been shown to be useful in the development of vaccines, and protein purification is a factor affecting their immunogenicity. The response surface methodology (RSM) and Box-Behnken design can optimise the performance of recombinant protein expression. However, the biological effect in mice immunised with an immunogenic protein that is optimised by RSM and purified by low-affinity chromatography is unknown. In this study, we used RSM to optimise the expression of the rSIP, and we evaluated the SIP-specific humoral response and the ability to decrease GBS colonisation of the vaginal tract in female mice. It was observed by Ni-NTA chromatography that the RSM increases the yield of rSIP expression, generating a better purification process. This improvement in rSIP purification suggests a better induction of the IgG anti-SIP immune response and a positive effect on decreasing GBS intravaginal colonisation. RSM applied to optimise the expression of recombinant proteins with immunogenic capacity is an interesting alternative in the evaluation of vaccines in the preclinical phase, which could improve their immune response.
NASA Astrophysics Data System (ADS)
Li, Dewei; Li, Jiwei; Xi, Yugeng; Gao, Furong
2017-12-01
In practical applications, systems are always influenced by parameter uncertainties and external disturbances. Both the H2 performance and the H∞ performance are important for real applications. For a constrained system, previous designs of mixed H2/H∞ robust model predictive control (RMPC) optimise one performance with the other performance requirement as a constraint, but the two performances cannot be optimised at the same time. In this paper, an improved design of mixed H2/H∞ RMPC for polytopic uncertain systems with external disturbances is proposed to optimise them simultaneously. In the proposed design, the original uncertain system is decomposed into two subsystems using the additive character of linear systems. Two different Lyapunov functions are used to separately formulate the two performance indices for the two subsystems. The proposed RMPC is then designed to optimise both performances by the weighting method while satisfying the H∞ performance requirement. Meanwhile, to make the design more practical, a simplified design is also developed. The recursive feasibility conditions of the proposed RMPC are discussed and closed-loop input-to-state practical stability is proven. The numerical examples reflect the enlarged feasible region and the improved performance of the proposed design.
Statistical optimisation of diclofenac sustained release pellets coated with polymethacrylic films.
Kramar, A; Turk, S; Vrecer, F
2003-04-30
The objective of the present study was to evaluate three formulation parameters for the application of polymethacrylic films from aqueous dispersions in order to obtain multiparticulate sustained release of diclofenac sodium. Film coating of pellet cores was performed in a laboratory fluid bed apparatus. The chosen independent variables, i.e. the concentration of plasticizer (triethyl citrate), methacrylate polymers ratio (Eudragit RS:Eudragit RL) and the quantity of coating dispersion were optimised with a three-factor, three-level Box-Behnken design. The chosen dependent variables were cumulative percentage values of diclofenac dissolved in 3, 4 and 6 h. Based on the experimental design, different diclofenac release profiles were obtained. Response surface plots were used to relate the dependent and the independent variables. The optimisation procedure generated an optimum of 40% release in 3 h. The levels of plasticizer concentration, quantity of coating dispersion and polymer to polymer ratio (Eudragit RS:Eudragit RL) were 25% w/w, 400 g and 3/1, respectively. The optimised formulation prepared according to computer-determined levels provided a release profile, which was close to the predicted values. We also studied thermal and surface characteristics of the polymethacrylic films to understand the influence of plasticizer concentration on the drug release from the pellets.
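The sketch below illustrates the design-and-optimise workflow with a three-factor Box-Behnken design and a quadratic response surface; the simulated responses and factor effects are assumptions, not the study's dissolution data, and the coded factors stand for plasticiser level, coating quantity and RS:RL ratio only by analogy.

```python
import numpy as np
from itertools import product

def box_behnken_3(centre_runs=3):
    """Three-factor Box-Behnken design: +/-1 on two factors, 0 on the third."""
    pts = []
    for pair in [(0, 1), (0, 2), (1, 2)]:
        for a, b in product([-1, 1], repeat=2):
            p = [0.0, 0.0, 0.0]
            p[pair[0]], p[pair[1]] = a, b
            pts.append(p)
    pts += [[0.0, 0.0, 0.0]] * centre_runs          # centre-point replicates
    return np.array(pts)

def quad_terms(X):
    """Design matrix of a full quadratic response-surface model."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3, x1**2, x2**2, x3**2])

rng = np.random.default_rng(0)
X = box_behnken_3()
true_release = 45 - 6 * X[:, 0] - 4 * X[:, 1] + 8 * X[:, 2] + 3 * X[:, 2] ** 2
y = true_release + rng.normal(0, 1.5, len(X))       # simulated % released at 3 h

beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)

# Search a fine grid of coded settings for a predicted release closest to 40%.
grid = np.array(list(product(np.linspace(-1, 1, 21), repeat=3)))
pred = quad_terms(grid) @ beta
best = grid[np.argmin(np.abs(pred - 40.0))]
print("coded factor settings closest to 40% release at 3 h:", np.round(best, 2))
```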
NASA Astrophysics Data System (ADS)
Kluge, S.; Goodwillie, A. M.
2012-12-01
As STEM learning requirements enter the mainstream, there is benefit to providing the tools necessary for students to engage with research-quality geoscience data in a cutting-edge, easy-to-use map-based interface. Funded with an NSF GeoEd award, GeoMapApp Learning Activities ( http://serc.carleton.edu/geomapapp/collection.html ) are being created to help in that endeavour. GeoMapApp Learning Activities offer step-by-step instructions within a guided inquiry approach that enables students to dictate the pace of learning. Based upon GeoMapApp (http://www.geomapapp.org), a free, easy-to-use map-based data exploration and visualisation tool, each activity furnishes the educator with an efficient package of downloadable documents. This includes step-by-step student instructions and answer sheet; an educator's annotated worksheet containing teaching tips, additional content and suggestions for further work; and, quizzes for use before and after the activity to assess learning. Examples of activities so far created involve calculation and analysis of the rate of seafloor spreading; compilation of present-day evidence for huge ancient landslides on the seafloor around the Hawaiian islands; a study of radiometrically-dated volcanic rocks to help understand the concept of hotspots; and, the optimisation of contours as a means to aid visualisation of 3-D data sets on a computer screen. The activities are designed for students at the introductory undergraduate, community college and high school levels, and present a virtual lab-like environment to expose students to content and concepts typically found in those educational settings. The activities can be used in the classroom or out of class, and their guided nature means that the requirement for teacher intervention is reduced thus allowing students to spend more time analysing and understanding geoscience data, content and concepts. Each activity is freely available through the SERC-Carleton web site.
Martin, Colin J
2016-06-01
Doses to the eye lenses of clinicians undertaking fluoroscopically guided procedures can exceed the annual dose limit of 20 mSv, so optimisation of radiation protection is essential. Ceiling-suspended shields and disposable radiation absorbing pads can reduce eye dose by factors of 2-7. Lead glasses that shield against exposures from the side can lower doses by 2.5-4.5 times. Training in effective use of protective devices is an essential element in achieving good protection and acceptable eye doses. Effective methods for dose monitoring are required to identify protection issues. Dosemeters worn adjacent to the eye provide the better option for interventional clinicians, but an unprotected dosemeter worn at the neck will give an indication of eye dose that is adequate for most interventional staff. Potential requirements for protective devices and dose monitoring can be determined from risk assessments using generic values for dose linked to examination workload. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Joa, Eunhyek; Park, Kwanwoo; Koh, Youngil; Yi, Kyongsu; Kim, Kilsoo
2018-04-01
This paper presents a tyre slip-based integrated chassis control of front/rear traction distribution and four-wheel braking for enhanced performance from moderate driving to limit handling. The proposed algorithm adopts a hierarchical structure: supervisor, desired motion tracking controller and optimisation-based control allocation. In the supervisor, the desired vehicle motion is calculated by considering transient cornering characteristics. In the desired motion tracking controller, a virtual control input is determined in the manner of sliding mode control in order to track the desired vehicle motion. In the control allocation, the virtual control input is allocated to minimise a cost function. The cost function consists of two major parts. The first part is a slip-based quantification of tyre friction utilisation, which does not require tyre force estimation. The second part is an allocation guideline, which guides the optimally allocated inputs towards a predefined solution. The proposed algorithm has been investigated via simulation in scenarios ranging from moderate driving to limit handling. Compared to the Base and direct yaw moment control systems, the proposed algorithm can effectively reduce tyre dissipation energy in the moderate driving situation. Moreover, the proposed algorithm enhances limit handling performance compared to the Base and direct yaw moment control systems. In addition, the proposed algorithm has been compared with a control algorithm based on known tyre force information. The results show that the performance of the proposed algorithm is similar to that of the control algorithm with known tyre force information.
Swallow, Veronica M; Hall, Andrew G; Carolan, Ian; Santacroce, Sheila; Webb, Nicholas J A; Smith, Trish; Hanif, Noreen
2014-02-18
There is a lack of online, evidence-based information and resources to support home-based care of childhood CKD stages 3-5. Qualitative interviews were undertaken with parents, patients and professionals to explore their views on the content of the proposed online parent information and support (OPIS) web-application. Data were analysed using Framework Analysis, guided by the concept of Self-efficacy. 32 parents, 26 patients and 12 professionals were interviewed. All groups wanted an application that explains, demonstrates, and enables parental clinical care-giving, with condition-specific, continuously available, reliable, accessible material and a closed communication system to enable contact between families living with CKD. Professionals advocated a regularly updated application to empower parents to make informed health-care decisions. To address these requirements, key web-application components were defined as: (i) Clinical care-giving support (information on treatment regimens, video-learning tools, condition-specific cartoons/puzzles, and a question and answer area) and (ii) Psychosocial support for care-giving (social-networking, case studies, managing stress, and enhancing families' health-care experiences). Developing a web-application that meets parents' information and support needs will maximise its utility, thereby augmenting parents' self-efficacy for CKD caregiving, and optimising outcomes. Self-efficacy theory provides a schema for how parents' self-efficacy beliefs about management of their child's CKD could potentially be promoted by OPIS.
Aveling, Emma-Louise; Zegeye, Desalegn Tegabu; Silverman, Michael
2016-08-17
Access to safe surgical care represents a critical gap in healthcare delivery and development in many low- and middle-income countries, including Ethiopia. Quality improvement (QI) initiatives at hospital level may contribute to closing this gap. Many such quality improvement initiatives are carried out through international health partnerships. Better understanding of how to optimise quality improvement in low-income settings is needed, including through partnership-based approaches. Drawing on a process evaluation of an intervention to improve surgical services in an Ethiopian hospital, this paper offers lessons to help meet this need. We conducted a qualitative process evaluation of a quality improvement project which aimed to improve access to surgical services in an Ethiopian referral hospital through better management. Data was collected longitudinally and included: 66 in-depth interviews with surgical staff and project team members; observation (135 h) in the surgery department and of project meetings; project-related documentation. Thematic analysis, guided by theoretical constructs, focused on identifying obstacles to implementation. The project largely failed to achieve its goals. Key barriers related to project design, partnership working and the implementation context, and included: confusion over project objectives and project and partner roles and responsibilities; logistical challenges concerning overseas visits; difficulties in communication; gaps between the time and authority team members had and that needed to implement and engage other staff; limited strategies for addressing adaptive (as opposed to technical) challenges; effects of hierarchy and resource scarcity on QI efforts. While many of the obstacles identified are common to diverse settings, our findings highlight ways in which some features of low-income country contexts amplify these common challenges. We identify lessons for optimising the design and planning of quality improvement interventions within such challenging healthcare contexts, with specific reference to international partnership-based approaches. These include: the need for a funded lead-in phase to clarify and agree goals, roles, mutual expectations and communication strategies; explicitly incorporating adaptive, as well as technical, solutions; transparent management of resources and opportunities; leadership which takes account of both formal and informal power structures; and articulating links between project goals and wider organisational interests.
Hind, Daniel; Parkin, James; Whitworth, Victoria; Rex, Saleema; Young, Tracey; Hampson, Lisa; Sheehan, Jennie; Maguire, Chin; Cantrill, Hannah; Scott, Elaine; Epps, Heather; Main, Marion; Geary, Michelle; McMurchie, Heather; Pallant, Lindsey; Woods, Daniel; Freeman, Jennifer; Lee, Ellen; Eagle, Michelle; Willis, Tracey; Muntoni, Francesco; Baxter, Peter
2017-05-01
Duchenne muscular dystrophy (DMD) is a rare disease that causes the progressive loss of motor abilities such as walking. Standard treatment includes physiotherapy. No trial has evaluated whether or not adding aquatic therapy (AT) to land-based therapy (LBT) exercises helps to keep muscles strong and children independent. To assess the feasibility of recruiting boys with DMD to a randomised trial evaluating AT (primary objective) and to collect data from them; to assess how, and how well, the intervention and trial procedures work. Parallel-group, single-blind, randomised pilot trial with nested qualitative research. Six paediatric neuromuscular units. Children with DMD aged 7-16 years, established on corticosteroids, with a North Star Ambulatory Assessment (NSAA) score of 8-34 and able to complete a 10-m walk without aids/assistance. Exclusions: > 20% variation between baseline screens 4 weeks apart and contraindications. Participants were allocated on a 1 : 1 ratio to (1) optimised, manualised LBT (prescribed by specialist neuromuscular physiotherapists) or (2) the same plus manualised AT (30 minutes, twice weekly for 6 months: active assisted and/or passive stretching regime; simulated or real functional activities; submaximal exercise). Semistructured interviews with participants, parents ( n = 8) and professionals ( n = 8) were analysed using Framework analysis. An independent rater reviewed patient records to determine the extent to which treatment was optimised. A cost-impact analysis was performed. Quantitative and qualitative data were mixed using a triangulation exercise. Feasibility of recruiting 40 participants in 6 months, participant and therapist views on the acceptability of the intervention and research protocols, clinical outcomes including NSAA, independent assessment of treatment optimisation and intervention costs. Over 6 months, 348 children were screened - most lived too far from centres or were enrolled in other trials. Twelve (30% of target) were randomised to AT ( n = 8) or control ( n = 4). People in the AT ( n = 8) and control ( n = 2: attrition because of parental report) arms contributed outcome data. The mean change in NSAA score at 6 months was -5.5 [standard deviation (SD) 7.8] for LBT and -2.8 (SD 4.1) in the AT arm. One boy suffered pain and fatigue after AT, which resolved the same day. Physiotherapists and parents valued AT and believed that it should be delivered in community settings. The independent rater considered AT optimised for three out of eight children, with other children given programmes that were too extensive and insufficiently focused. The estimated NHS costs of 6-month service were between £1970 and £2734 per patient. The focus on delivery in hospitals limits generalisability. Neither a full-scale frequentist randomised controlled trial (RCT) recruiting in the UK alone nor a twice-weekly open-ended AT course delivered at tertiary centres is feasible. Further intervention development research is needed to identify how community-based pools can be accessed, and how families can link with each other and community physiotherapists to access tailored AT programmes guided by highly specialised physiotherapists. Bayesian RCTs may be feasible; otherwise, time series designs are recommended. Current Controlled Trials ISRCTN41002956. This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment ; Vol. 21, No. 27. 
See the NIHR Journals Library website for further project information.
Fogliata, Antonella; Nicolini, Giorgia; Clivio, Alessandro; Vanetti, Eugenio; Laksar, Sarbani; Tozzi, Angelo; Scorsetti, Marta; Cozzi, Luca
2015-10-31
To evaluate the performance of a broad scope model-based optimisation process for volumetric modulated arc therapy applied to esophageal cancer. A set of 70 previously treated patients from two different institutions was selected to train a model for the prediction of dose-volume constraints. The model was built with a broad-scope purpose, aiming to be effective for different dose prescriptions and tumour localisations. It was validated on three groups of patients from the same institution and from another clinic not providing patients for the training phase. The automated plans were compared against reference cases given by the clinically accepted plans. Quantitative improvements (statistically significant for the majority of the analysed dose-volume parameters) were observed between the benchmark and the test plans. Of 624 dose-volume objectives assessed for plan evaluation, in 21 cases (3.3 %) the reference plans failed to respect the constraints while the model-based plans succeeded. Only in 3 cases (<0.5 %) did the reference plans pass the criteria while the model-based plans failed. In 5.3 % of the cases both groups of plans failed and in the remaining cases both passed the tests. Plans were optimised using a broad scope knowledge-based model to determine the dose-volume constraints. The results showed dosimetric improvements when compared to the benchmark data. In particular, the plans optimised for patients from the third centre, which did not participate in the training, were of superior quality. The data suggest that the new engine is reliable and could encourage its application in clinical practice.
CAMELOT: Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox
NASA Astrophysics Data System (ADS)
Di Carlo, Marilena; Romero Martin, Juan Manuel; Vasile, Massimiliano
2018-03-01
Computational-Analytical Multi-fidElity Low-thrust Optimisation Toolbox (CAMELOT) is a toolbox for the fast preliminary design and optimisation of low-thrust trajectories. It solves highly complex combinatorial problems to plan multi-target missions characterised by long spirals including different perturbations. To do so, CAMELOT implements a novel multi-fidelity approach combining analytical surrogate modelling and accurate computational estimations of the mission cost. Decisions are then made using two optimisation engines included in the toolbox, a single-objective global optimiser, and a combinatorial optimisation algorithm. CAMELOT has been applied to a variety of case studies: from the design of interplanetary trajectories to the optimal de-orbiting of space debris and from the deployment of constellations to on-orbit servicing. In this paper, the main elements of CAMELOT are described and two examples, solved using the toolbox, are presented.
Tail mean and related robust solution concepts
NASA Astrophysics Data System (ADS)
Ogryczak, Włodzimierz
2014-01-01
Robust optimisation might be viewed as a multicriteria optimisation problem where objectives correspond to the scenarios although their probabilities are unknown or imprecise. The simplest robust solution concept represents a conservative approach focused on the worst-case scenario results optimisation. A softer concept allows one to optimise the tail mean, thus combining performances under multiple worst-case scenarios. We show that, when considering robust models that allow the probabilities to vary only within given intervals, the tail mean represents the robust solution only for upper-bounded probabilities. For arbitrary intervals of probabilities, the corresponding robust solution may be expressed as the optimisation of appropriately combined mean and tail mean criteria, thus remaining easily implementable with auxiliary linear inequalities. Moreover, we use the tail mean concept to develop linear-programming-implementable robust solution concepts related to risk-averse optimisation criteria.
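One standard way of writing the tail mean so that it stays linear-programming implementable is sketched below; the notation (scenario outcomes y_i(x) with probabilities p_i, tolerance beta, weight lambda) is introduced here for illustration and is consistent with, but not quoted from, the article.

```latex
% Tail beta-mean (mean over the worst beta-fraction of scenario outcomes),
% written with auxiliary variables d_i so that, for y_i(x) linear in x, the
% whole problem remains a linear program:
M_\beta(x) \;=\; \max_{t,\; d_i \ge 0} \Big\{\, t \;-\; \tfrac{1}{\beta}\sum_i p_i\, d_i
      \;:\; d_i \;\ge\; t - y_i(x) \;\; \forall i \,\Big\}

% Combined mean / tail-mean criterion, with mean \mu(x) = \sum_i p_i\, y_i(x)
% and a weight 0 \le \lambda \le 1 reflecting the width of the probability intervals:
\max_{x}\;\; \lambda\, \mu(x) \;+\; (1-\lambda)\, M_\beta(x)
```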
Refractive index dependence of L3 photonic crystal nano-cavities.
Adawi, A M; Chalcraft, A R; Whittaker, D M; Lidzey, D G
2007-10-29
We model the optical properties of L3 photonic crystal nano-cavities as a function of the photonic crystal membrane refractive index n using a guided mode expansion method. Band structure calculations revealed that a TE-like full band-gap exists for materials of refractive index as low as 1.6. The Q-factor of such cavities showed a super-linear increase with refractive index. By adjusting the relative position of the cavity side holes, the Q-factor was optimised as a function of the photonic crystal membrane refractive index n over the range 1.6 to 3.4. Q-factors in the range 3000-8000 were predicted from absorption free materials in the visible range with refractive index between 2.45 and 2.8.
Optimised mounting conditions for poly (ether sulfone) in radiation detection.
Nakamura, Hidehito; Shirakawa, Yoshiyuki; Sato, Nobuhiro; Yamada, Tatsuya; Kitamura, Hisashi; Takahashi, Sentaro
2014-09-01
Poly (ether sulfone) (PES) is a candidate for use as a scintillation material in radiation detection. Its characteristics, such as its emission spectrum and its effective refractive index (based on the emission spectrum), directly affect the propagation of light generated to external photodetectors. It is also important to examine the presence of background radiation sources in manufactured PES. Here, we optimise the optical coupling and surface treatment of the PES, and characterise its background. Optical grease was used to enhance the optical coupling between the PES and the photodetector; absorption by the grease of short-wavelength light emitted from PES was negligible. Diffuse reflection induced by surface roughening increased the light yield for PES, despite the high effective refractive index. Background radiation derived from the PES sample and its impurities was negligible above the ambient, natural level. Overall, these results serve to optimise the mounting conditions for PES in radiation detection. Copyright © 2014 Elsevier Ltd. All rights reserved.
Ławryńczuk, Maciej
2017-03-01
This paper details development of a Model Predictive Control (MPC) algorithm for a boiler-turbine unit, which is a nonlinear multiple-input multiple-output process. The control objective is to follow set-point changes imposed on two state (output) variables and to satisfy constraints imposed on three inputs and one output. In order to obtain a computationally efficient control scheme, the state-space model is successively linearised on-line for the current operating point and used for prediction. In consequence, the future control policy is easily calculated from a quadratic optimisation problem. For state estimation the extended Kalman filter is used. It is demonstrated that the MPC strategy based on constant linear models does not work satisfactorily for the boiler-turbine unit whereas the discussed algorithm with on-line successive model linearisation gives practically the same trajectories as the truly nonlinear MPC controller with nonlinear optimisation repeated at each sampling instant. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
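A compact Python sketch of the successive-linearisation idea is given below; the two-state plant, horizon, weights and the unconstrained least-squares solution are illustrative assumptions, and the boiler-turbine model, the extended Kalman filter and the input/output constraints (which require a QP solver) are not reproduced.

```python
import numpy as np

def f(x, u, dt=0.1):
    """Toy nonlinear plant used in place of the boiler-turbine unit."""
    dx = np.array([-x[0] ** 3 + u[0],
                   x[0] - 0.5 * x[1] + 0.2 * u[1]])
    return x + dt * dx

def linearise(x, u, eps=1e-5):
    """Finite-difference Jacobians of the discrete map f about (x, u)."""
    n, m = len(x), len(u)
    A, B, fx = np.zeros((n, n)), np.zeros((n, m)), f(x, u)
    for j in range(n):
        xp = x.copy(); xp[j] += eps
        A[:, j] = (f(xp, u) - fx) / eps
    for j in range(m):
        up = u.copy(); up[j] += eps
        B[:, j] = (f(x, up) - fx) / eps
    return A, B

def mpc_step(x, u_prev, x_ref, N=15, lam=0.05):
    A, B = linearise(x, u_prev)
    n, m = B.shape
    # Nominal rollout holding the previous input; input deviations act linearly.
    x_nom, xn = [], x.copy()
    for _ in range(N):
        xn = f(xn, u_prev)
        x_nom.append(xn)
    x_nom = np.concatenate(x_nom)
    Gamma = np.zeros((N * n, N * m))
    for i in range(N):
        for j in range(i + 1):
            Gamma[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
    err = np.tile(x_ref, N) - x_nom
    # Unconstrained least squares: minimise ||Gamma dU - err||^2 + lam ||dU||^2
    dU = np.linalg.solve(Gamma.T @ Gamma + lam * np.eye(N * m), Gamma.T @ err)
    return u_prev + dU[:m]            # receding horizon: apply the first move only

x, u, x_ref = np.array([0.5, 0.0]), np.zeros(2), np.array([1.0, 1.5])
for _ in range(100):
    u = mpc_step(x, u, x_ref)         # re-linearise and re-solve at every instant
    x = f(x, u)
print("state after 100 steps:", np.round(x, 3), " set-point:", x_ref)
```

Because the model is re-linearised at every sampling instant, the controller only ever solves a small quadratic (here, least-squares) problem, which is the computational advantage the abstract describes.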
NASA Astrophysics Data System (ADS)
Jiménez-Redondo, Noemi; Calle-Cordón, Alvaro; Kandler, Ute; Simroth, Axel; Morales, Francisco J.; Reyes, Antonio; Odelius, Johan; Thaduri, Aditya; Morgado, Joao; Duarte, Emmanuele
2017-09-01
The on-going H2020 project INFRALERT aims to increase rail and road infrastructure capacity in the current framework of increased transportation demand by developing and deploying solutions to optimise maintenance interventions planning. It includes two real pilots for road and railways infrastructure. INFRALERT develops an ICT platform (the expert-based Infrastructure Management System, eIMS) which follows a modular approach including several expert-based toolkits. This paper presents the methodologies and preliminary results of the toolkits for i) nowcasting and forecasting of asset condition, ii) alert generation, iii) RAMS & LCC analysis and iv) decision support. The results of these toolkits in a meshed road network in Portugal under the jurisdiction of Infraestruturas de Portugal (IP) are presented showing the capabilities of the approaches.
Kassem, Abdulsalam M; Ibrahim, Hany M; Samy, Ahmed M
2017-05-01
The objective of this study was to develop and optimise a self-nanoemulsifying drug delivery system (SNEDDS) of atorvastatin calcium (ATC) for improving dissolution rate and eventually oral bioavailability. Ternary phase diagrams were constructed on the basis of solubility and emulsification studies. The composition of ATC-SNEDDS was optimised using the Box-Behnken optimisation design. Optimised ATC-SNEDDS was characterised for various physicochemical properties. Pharmacokinetic, pharmacodynamic and histological studies were performed in rats. Optimised ATC-SNEDDS resulted in a droplet size of 5.66 nm, a zeta potential of -19.52 mV and a t90 of 5.43 min, and completely released ATC within 30 min irrespective of the pH of the medium. The area under the curve of optimised ATC-SNEDDS in rats was 2.34-fold higher than that of ATC suspension. Pharmacodynamic studies revealed significant reduction in serum lipids of rats with fatty liver. Photomicrographs showed improvement in hepatocyte structure. In this study, we confirmed that ATC-SNEDDS would be a promising approach for improving oral bioavailability of ATC.
NASA Astrophysics Data System (ADS)
Zhou, Changjiu; Meng, Qingchun; Guo, Zhongwen; Qu, Wiefen; Yin, Bo
2002-04-01
Robot learning in unstructured environments has proved to be an extremely challenging problem, mainly because of many uncertainties always present in the real world. Human beings, on the other hand, seem to cope very well with uncertain and unpredictable environments, often relying on perception-based information. Furthermore, human beings can also utilize perceptions to guide their learning on those parts of the perception-action space that are actually relevant to the task. Therefore, we conduct research aimed at improving robot learning through the incorporation of both perception-based and measurement-based information. For this reason, a fuzzy reinforcement learning (FRL) agent is proposed in this paper. Based on a neural-fuzzy architecture, different kinds of information can be incorporated into the FRL agent to initialise its action network, critic network and evaluation feedback module so as to accelerate its learning. By making use of the global optimisation capability of GAs (genetic algorithms), a GA-based FRL (GAFRL) agent is presented to solve the local minima problem in traditional actor-critic reinforcement learning. On the other hand, with the prediction capability of the critic network, GAs can perform a more effective global search. Different GAFRL agents are constructed and verified by using the simulation model of a physical biped robot. The simulation analysis shows that the biped learning rate for dynamic balance can be improved by incorporating perception-based information on biped balancing and walking evaluation. The biped robot can find its application in ocean exploration, detection or sea rescue activity, as well as military maritime activity.
Griffanti, Ludovica; Zamboni, Giovanna; Khan, Aamira; Li, Linxin; Bonifacio, Guendalina; Sundaresan, Vaanathi; Schulz, Ursula G; Kuker, Wilhelm; Battaglini, Marco; Rothwell, Peter M; Jenkinson, Mark
2016-11-01
Reliable quantification of white matter hyperintensities of presumed vascular origin (WMHs) is increasingly needed, given the presence of these MRI findings in patients with several neurological and vascular disorders, as well as in elderly healthy subjects. We present BIANCA (Brain Intensity AbNormality Classification Algorithm), a fully automated, supervised method for WMH detection, based on the k-nearest neighbour (k-NN) algorithm. Relative to previous k-NN based segmentation methods, BIANCA offers different options for weighting the spatial information, local spatial intensity averaging, and different options for the choice of the number and location of the training points. BIANCA is multimodal and highly flexible so that the user can adapt the tool to their protocol and specific needs. We optimised and validated BIANCA on two datasets with different MRI protocols and patient populations (a "predominantly neurodegenerative" and a "predominantly vascular" cohort). BIANCA was first optimised on a subset of images for each dataset in terms of overlap and volumetric agreement with a manually segmented WMH mask. The correlation between the volumes extracted with BIANCA (using the optimised set of options), the volumes extracted from the manual masks and visual ratings showed that BIANCA is a valid alternative to manual segmentation. The optimised set of options was then applied to the whole cohorts and the resulting WMH volume estimates showed good correlations with visual ratings and with age. Finally, we performed a reproducibility test, to evaluate the robustness of BIANCA, and compared BIANCA performance against existing methods. Our findings suggest that BIANCA, which will be freely available as part of the FSL package, is a reliable method for automated WMH segmentation in large cross-sectional cohort studies. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
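The sketch below illustrates the k-NN voxel-classification idea on synthetic two-modality volumes with spatially weighted coordinates; the toy data, the value of k and the spatial weight are assumptions and do not correspond to BIANCA's options, training-point strategies or file formats.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
shape = (40, 40, 20)
flair = rng.normal(0.0, 1.0, shape)                 # toy "FLAIR" volume
t1 = rng.normal(0.0, 1.0, shape)                    # toy "T1" volume
lesions = np.zeros(shape, dtype=int)
lesions[10:16, 10:16, 8:12] = 1                     # toy lesion mask
flair[lesions == 1] += 3.0                          # lesions are bright on FLAIR

# Feature vector per voxel: two intensities plus spatially weighted coordinates.
coords = np.stack(np.meshgrid(*[np.arange(s) for s in shape], indexing="ij"), -1)
spatial_weight = 0.05                               # how much location matters
features = np.column_stack([flair.ravel(), t1.ravel(),
                            coords.reshape(-1, 3) * spatial_weight])
labels = lesions.ravel()

# Train on a random subset of labelled voxels, predict on the remainder.
idx = rng.permutation(len(labels))
train, test = idx[:20000], idx[20000:]
knn = KNeighborsClassifier(n_neighbors=15).fit(features[train], labels[train])
pred = knn.predict(features[test])

tp = np.sum((pred == 1) & (labels[test] == 1))
dice = 2 * tp / (np.sum(pred == 1) + np.sum(labels[test] == 1))
print("Dice overlap on held-out voxels:", round(dice, 3))
```

In practice the training labels would come from manually segmented WMH masks and the spatial weighting, intensity averaging and training-point placement would be tuned per protocol, as the abstract describes.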
Devos, Olivier; Downey, Gerard; Duponchel, Ludovic
2014-04-01
Classification is an important task in chemometrics. For several years now, support vector machines (SVMs) have proven to be powerful for infrared spectral data classification. However, such methods require optimisation of parameters in order to control the risk of overfitting and the complexity of the boundary. Furthermore, it is established that the prediction ability of classification models can be improved using pre-processing in order to remove unwanted variance in the spectra. In this paper we propose a new methodology based on a genetic algorithm (GA) for the simultaneous optimisation of SVM parameters and pre-processing (GENOPT-SVM). The method has been tested for the discrimination of the geographical origin of Italian olive oil (Ligurian and non-Ligurian) on the basis of near infrared (NIR) or mid infrared (FTIR) spectra. Different classification models (PLS-DA, SVM with mean-centred data, GENOPT-SVM) have been tested and statistically compared using McNemar's statistical test. For the two datasets, SVM with optimised pre-processing gives models with higher accuracy than those obtained with PLS-DA on pre-processed data. In the case of the NIR dataset, most of this accuracy improvement (86.3% compared with 82.8% for PLS-DA) occurred using only a single pre-processing step. For the FTIR dataset, three optimised pre-processing steps were required to obtain an SVM model with a significant accuracy improvement (82.2%) compared to the one obtained with PLS-DA (78.6%). Furthermore, this study demonstrates that even SVM models have to be developed on the basis of well-corrected spectral data in order to obtain high classification rates. Copyright © 2013 Elsevier Ltd. All rights reserved.
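As a rough illustration of this kind of search (not the authors' GENOPT-SVM implementation), the sketch below couples a simplified, mutation-only genetic loop with scikit-learn's SVC, jointly choosing one of three candidate pre-processing steps (raw spectra, SNV, Savitzky-Golay first derivative) and the RBF hyper-parameters. The spectra matrix `X`, labels `y` and the candidate pre-processing list are assumptions for illustration.

```python
# Hypothetical sketch of GA-style joint selection of spectral pre-processing and SVM
# hyper-parameters, in the spirit of the GENOPT-SVM idea described above.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def snv(X):
    # Standard normal variate: centre and scale each spectrum individually.
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

PREPROCS = [
    lambda X: X,                                         # raw spectra
    snv,                                                 # SNV
    lambda X: savgol_filter(X, 15, 2, deriv=1, axis=1),  # Savitzky-Golay 1st derivative
]

def fitness(gene, X, y):
    # gene = (pre-processing index, log2 C, log2 gamma)
    p, log_c, log_g = int(gene[0]) % len(PREPROCS), gene[1], gene[2]
    clf = SVC(kernel="rbf", C=2.0**log_c, gamma=2.0**log_g)
    return cross_val_score(clf, PREPROCS[p](X), y, cv=5).mean()

def ga_optimise(X, y, pop_size=20, generations=30):
    pop = np.column_stack([
        rng.integers(0, len(PREPROCS), pop_size),  # pre-processing choice
        rng.uniform(-5, 15, pop_size),             # log2 C
        rng.uniform(-15, 3, pop_size),             # log2 gamma
    ])
    for _ in range(generations):
        scores = np.array([fitness(g, X, y) for g in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]              # keep the better half
        children = parents[rng.integers(0, len(parents), pop_size - len(parents))].copy()
        children[:, 1:] += rng.normal(0, 0.5, children[:, 1:].shape)    # Gaussian mutation
        children[:, 0] = rng.integers(0, len(PREPROCS), len(children))  # re-draw pre-processing
        pop = np.vstack([parents, children])
    scores = np.array([fitness(g, X, y) for g in pop])
    return pop[scores.argmax()], scores.max()
```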
Cheong, Vee San; Bull, Anthony M J
2015-12-16
The choice of coordinate system and alignment of bone will affect the quantification of mechanical properties obtained during in-vitro biomechanical testing. Where these are used in predictive models, such as finite element analysis, a faithful description of these properties is paramount. Currently, in bending and torsional tests, bones are aligned on a pre-defined fixed span based on the reference system marked out. However, large inter-specimen differences have been reported. This suggests a need for the development of a specimen-specific alignment system for use in experimental work. Eleven ovine tibiae were used in this study and three-dimensional surface meshes were constructed from micro-computed tomography scan images. A novel, semi-automated algorithm was developed and applied to the surface meshes to align the whole bone based on its calculated principal directions. Thereafter, the code isolates the optimised location and length of each bone for experimental testing. This resulted in a lowering of the second moment of area about the chosen bending axis in the central region. More importantly, the optimisation method decreases the irregularity of the shape of the cross-sectional slices, as the unbiased estimate of the population coefficient of variation of the second moment of area decreased from a range of (0.210-0.435) to (0.145-0.317) in the longitudinal direction, indicating a minimisation of the product moment, which causes eccentric loading. Thus, this methodology serves as an important pre-step to align the bone for mechanical tests or simulation work, is optimised for each specimen, ensures repeatability, and is general enough to be applied to any long bone. Copyright © 2015 Elsevier Ltd. All rights reserved.
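A minimal sketch of the alignment idea, assuming the mesh vertices are available as an (N, 3) NumPy array: the principal directions are taken as eigenvectors of the vertex covariance matrix, and a crude per-unit-area proxy for the second moment of area of a cross-sectional slice is evaluated about the bending axis. This is an illustration, not the authors' semi-automated algorithm.

```python
# Illustrative sketch (not the published algorithm): align a bone surface mesh to its
# principal directions and evaluate a proxy second moment of area for a slice.
import numpy as np

def align_to_principal_axes(vertices):
    """Rotate an (N, 3) vertex array so its principal directions map to x, y, z."""
    centred = vertices - vertices.mean(axis=0)
    # Eigenvectors of the covariance matrix give the principal directions.
    _, eigvecs = np.linalg.eigh(np.cov(centred.T))
    return centred @ eigvecs  # columns of eigvecs form the rotation matrix

def section_second_moment(slice_points):
    """Per-unit-area proxy for I_xx of a cross-section, computed as the variance of the
    slice point cloud about the bending (x) axis; a real implementation would integrate
    over the cross-sectional polygon."""
    y = slice_points[:, 1] - slice_points[:, 1].mean()
    return float(np.mean(y**2))
```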
Tsipa, Argyro; Koutinas, Michalis; Usaku, Chonlatep; Mantalaris, Athanasios
2018-05-02
Currently, design and optimisation of biotechnological bioprocesses is performed either through exhaustive experimentation and/or with the use of empirical, unstructured growth kinetics models. Although elaborate systems biology approaches have recently been explored, mixed-substrate utilisation is predominantly ignored despite its significance in enhancing bioprocess performance. Herein, bioprocess optimisation for an industrially relevant bioremediation process involving a mixture of highly toxic substrates, m-xylene and toluene, was achieved through application of a novel experimental-modelling gene regulatory network - growth kinetic (GRN-GK) hybrid framework. The GRN model described the TOL and ortho-cleavage pathways in Pseudomonas putida mt-2 and captured the transcriptional kinetics expression patterns of the promoters. The GRN model informed the formulation of the growth kinetics model, replacing empirical and unstructured Monod kinetics. The GRN-GK framework's predictive capability and potential as a systematic optimal bioprocess design tool were demonstrated by effectively predicting bioprocess performance in agreement with experimental values, whereas four commonly used models deviated significantly from the experimental values. Significantly, a fed-batch biodegradation process was designed and optimised through model-based control of TOL Pr promoter expression, resulting in 61% and 60% enhanced pollutant removal and biomass formation, respectively, compared to the batch process. This provides strong evidence of model-based bioprocess optimisation at the gene level, rendering the GRN-GK framework a novel and applicable approach to optimal bioprocess design. Finally, model analysis using global sensitivity analysis (GSA) suggests an alternative, systematic approach for model-driven strain modification for synthetic biology and metabolic engineering applications. Copyright © 2018. Published by Elsevier Inc.
Del Prado, A; Misselbrook, T; Chadwick, D; Hopkins, A; Dewhurst, R J; Davison, P; Butler, A; Schröder, J; Scholefield, D
2011-09-01
Multiple demands are placed on farming systems today. Society, national legislation and market forces seek what could be seen as conflicting outcomes from our agricultural systems, e.g. food quality, affordable prices, a healthy environment, consideration of animal welfare, biodiversity, etc. Many of these demands, or desirable outcomes, are interrelated, so reaching one goal may often compromise another and, importantly, pose a risk to the economic viability of the farm. SIMS(DAIRY), a farm-scale model, was used to explore this complexity for dairy farm systems. SIMS(DAIRY) integrates existing approaches to simulate the effect of interactions between farm management, climate and soil characteristics on losses of nitrogen, phosphorus and carbon. The effects on farm profitability and attributes of biodiversity, milk quality, soil quality and animal welfare are also included. SIMS(DAIRY) can also be used to optimise fertiliser N. In this paper we discuss some limitations and strengths of using SIMS(DAIRY) compared to other modelling approaches and propose some potential improvements. Using the model, we evaluated the sustainability of organic dairy systems compared with conventional dairy farms under non-optimised and optimised fertiliser N use. Model outputs showed, for example, that organic dairy systems based on grass-clover swards and maize silage resulted in much smaller total GHG emissions per litre of milk and slightly smaller NO(3) leaching losses and NO(x) emissions per litre of milk compared with the grassland/maize-based conventional systems. These differences arose essentially because the conventional systems rely on indirect energy use for 'fixing' N, compared with biological N fixation in the organic systems. SIMS(DAIRY) runs also showed some other potential benefits of the organic systems compared with conventional systems in terms of financial performance and soil quality and biodiversity scores. Optimisation of fertiliser N timings and rates also showed considerable scope to reduce GHG emissions per litre of milk. Copyright © 2011 Elsevier B.V. All rights reserved.
Sampling design optimisation for rainfall prediction using a non-stationary geostatistical model
NASA Astrophysics Data System (ADS)
Wadoux, Alexandre M. J.-C.; Brus, Dick J.; Rico-Ramirez, Miguel A.; Heuvelink, Gerard B. M.
2017-09-01
The accuracy of spatial predictions of rainfall by merging rain-gauge and radar data is partly determined by the sampling design of the rain-gauge network. Optimising the locations of the rain-gauges may increase the accuracy of the predictions. Existing spatial sampling design optimisation methods are based on minimisation of the spatially averaged prediction error variance under the assumption of intrinsic stationarity. Over the past years, substantial progress has been made to deal with non-stationary spatial processes in kriging. Various well-documented geostatistical models relax the assumption of stationarity in the mean, while recent studies show the importance of considering non-stationarity in the variance for environmental processes occurring in complex landscapes. We optimised the sampling locations of rain-gauges using an extension of the Kriging with External Drift (KED) model for prediction of rainfall fields. The model incorporates both non-stationarity in the mean and in the variance, which are modelled as functions of external covariates such as radar imagery, distance to radar station and radar beam blockage. Spatial predictions are made repeatedly over time, each time recalibrating the model. The space-time averaged KED variance was minimised by Spatial Simulated Annealing (SSA). The methodology was tested using a case study predicting daily rainfall in the north of England for a one-year period. Results show that (i) the proposed non-stationary variance model outperforms the stationary variance model, and (ii) a small but significant decrease of the rainfall prediction error variance is obtained with the optimised rain-gauge network. In particular, it pays off to place rain-gauges at locations where the radar imagery is inaccurate, while keeping the distribution over the study area sufficiently uniform.
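A minimal spatial simulated annealing (SSA) loop in the spirit of the approach described above, with a placeholder criterion (a nearest-gauge, exponential-variogram proxy) standing in for the space-time averaged KED variance; the gauge coordinates, prediction grid, bounds and annealing settings are assumed inputs, not values from the study.

```python
# Minimal spatial simulated annealing sketch: perturb one gauge location at a time and
# accept moves that lower a spatially averaged prediction-error criterion.
import numpy as np

rng = np.random.default_rng(1)

def mean_kriging_variance(gauges, grid, sill=1.0, corr_range=20.0):
    # Placeholder criterion: exponential-variogram variance at the nearest gauge,
    # averaged over the prediction grid (a crude proxy for the kriging variance).
    d = np.linalg.norm(grid[:, None, :] - gauges[None, :, :], axis=2).min(axis=1)
    return np.mean(sill * (1.0 - np.exp(-d / corr_range)))

def ssa(gauges, grid, bounds, n_iter=5000, t0=0.1, cooling=0.999):
    """gauges: (N, 2) coordinates; grid: (G, 2) prediction points; bounds: (lower, upper)."""
    cur, f_cur = gauges.copy(), mean_kriging_variance(gauges, grid)
    best, f_best = cur.copy(), f_cur
    temp = t0
    for _ in range(n_iter):
        cand = cur.copy()
        i = rng.integers(len(cand))
        cand[i] = np.clip(cand[i] + rng.normal(0, 5.0, 2), bounds[0], bounds[1])
        f_cand = mean_kriging_variance(cand, grid)
        # Metropolis acceptance: always take improvements, sometimes accept worse moves.
        if f_cand < f_cur or rng.random() < np.exp((f_cur - f_cand) / temp):
            cur, f_cur = cand, f_cand
            if f_cur < f_best:
                best, f_best = cur.copy(), f_cur
        temp *= cooling
    return best, f_best
```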
Application of Three Existing Stope Boundary Optimisation Methods in an Operating Underground Mine
NASA Astrophysics Data System (ADS)
Erdogan, Gamze; Yavuz, Mahmut
2017-12-01
The underground mine planning and design optimisation process has received little attention because of the complexity and variability of problems in underground mines. Although a number of optimisation studies and software tools are available, and some of them, in particular, have been implemented effectively to determine the ultimate-pit limits in an open pit mine, there is still a lack of studies on the optimisation of ultimate stope boundaries in underground mines. The proposed approaches for this purpose aim at maximising the economic profit by selecting the best possible layout under operational, technical and physical constraints. In this paper, three existing heuristic techniques, namely the Floating Stope Algorithm, the Maximum Value Algorithm and the Mineable Shape Optimiser (MSO), are examined for optimisation of the stope layout in a case study. Each technique is assessed in terms of applicability, algorithm capabilities and limitations, considering the underground mine planning challenges. Finally, the results are evaluated and compared.
Use of anti-TNF drug levels to optimise patient management
Papamichael, Konstantinos; Cheifetz, Adam S
2016-01-01
Anti-tumour necrosis factor (TNF) therapies, such as infliximab, adalimumab, certolizumab pegol and golimumab, have been proven to be effective for the treatment of patients with Crohn's disease and ulcerative colitis. However, 10%–30% of patients with inflammatory bowel disease (IBD) show no initial clinical benefit to anti-TNF therapy (primary non-response), and over 50% after an initial favourable outcome will lose response over time (secondary loss of response (SLR)). Numerous recent studies in IBD have revealed an exposure–response relationship suggesting a positive correlation between high serum anti-TNF concentrations and favourable therapeutic outcomes including clinical, biomarker and endoscopic remission, whereas antidrug antibodies have been associated with SLR and infusion reactions. Currently, therapeutic drug monitoring (TDM) is typically performed when treatment failure occurs either for SLR, drug intolerance (potential immune-mediated reaction) or infusion reaction (reactive TDM). Nevertheless, recent data demonstrate that proactive TDM and a treat-to-target (trough) therapeutic approach may more effectively optimise anti-TNF therapy efficacy, safety and cost. However, implementing TDM in real-life clinical practice is currently limited by the diversity in study design, therapeutic outcomes and assays used, which have hindered the identification of robust clinically relevant concentration thresholds. This review will focus mainly on the pharmacodynamic properties of anti-TNF therapy and the role of TDM in guiding therapeutic decisions in IBD. PMID:28839870
Crawford, Keith W; Ripin, David H Brown; Levin, Andrew D; Campbell, Jennifer R; Flexner, Charles
2012-07-01
It is expected that funding limitations for worldwide HIV treatment and prevention in resource-limited settings will continue, and, because the need for treatment scale-up is urgent, the emphasis on value for money has become an increasing priority. The Conference on Antiretroviral Drug Optimization--a collaborative project between the Clinton Health Access Initiative, the Johns Hopkins University School of Medicine, and the Bill & Melinda Gates Foundation--brought together process chemists, clinical pharmacologists, pharmaceutical scientists, physicians, pharmacists, and regulatory specialists to explore strategies for the reduction of antiretroviral drug costs. The antiretroviral drugs discussed were prioritised for consideration on the basis of their market impact, and the objectives of the conference were framed as discussion questions generated to guide scientific assessment of potential strategies. These strategies included modifications to the synthesis of the active pharmaceutical ingredient (API) and use of cheaper sources of raw materials in synthesis of these ingredients. Innovations in product formulation could improve bioavailability thus needing less API. For several antiretroviral drugs, studies show efficacy is maintained at doses below the approved dose (eg, efavirenz, lopinavir plus ritonavir, atazanavir, and darunavir). Optimising pharmacoenhancement and extending shelf life are additional strategies. The conference highlighted a range of interventions; optimum cost savings could be achieved through combining approaches. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.
2018-03-01
This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resources and time allocations. This concept offers a worthwhile point of departure for three adjustments to the literature model in terms of maintenance time, workforce performance and return on workforce investment, which frame the results reported here. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, we have successfully implemented fuzzy goal programming (FGP) and differential evolution (DE) techniques for the solution of an optimisation problem in the maintenance of a process plant for the first time. The results obtained in this work showed better solution quality from the DE algorithm than from the genetic algorithm and the particle swarm optimisation algorithm, demonstrating the superiority of the proposed procedure. Second, the analytical discourse, which was framed on stochastic theory and focused on a specific application to a process plant in Nigeria, is novel. The work provides more insight into maintenance workforce planning during overhaul rework and overtime maintenance activities in manufacturing systems and demonstrates the capacity to generate substantially helpful information for practice.
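As a generic illustration of the DE component only (the fuzzy goal programming layer and the authors' actual cost model are omitted), SciPy's differential_evolution can be pointed at a stand-in workforce cost function; the demand, wage and penalty figures below are invented for the sketch.

```python
# Generic differential evolution sketch using SciPy; the objective is a stand-in for a
# maintenance-workforce cost model, not the authors' FGP-DE formulation.
import numpy as np
from scipy.optimize import differential_evolution

def workforce_cost(x):
    workers, overtime_hours = x
    demand = 480.0                               # required maintenance man-hours (assumed)
    capacity = workers * 40.0 + overtime_hours   # regular plus overtime capacity
    shortfall = max(0.0, demand - capacity)
    # Hypothetical penalised cost: labour cost plus a penalty for unmet maintenance demand.
    return workers * 900.0 + overtime_hours * 35.0 + 50.0 * shortfall

result = differential_evolution(workforce_cost, bounds=[(1, 30), (0, 200)], seed=42)
print(result.x, result.fun)
```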
Experiences of giving and receiving care in traumatic brain injury: An integrative review.
Kivunja, Stephen; River, Jo; Gullick, Janice
2018-04-01
To synthesise the literature on the experiences of giving or receiving care for traumatic brain injury for people with traumatic brain injury, their family members and nurses in hospital and rehabilitation settings. Traumatic brain injury represents a major source of physical, social and economic burden. In the hospital setting, people with traumatic brain injury feel excluded from decision-making processes and perceive impatient care. Families describe inadequate information and support for psychological distress. Nurses find the care of people with traumatic brain injury challenging particularly when experiencing heavy workloads. To date, a contemporary synthesis of the literature on people with traumatic brain injury, family and nurse experiences of traumatic brain injury care has not been conducted. Integrative literature review. A systematic search strategy guided by the PRISMA statement was conducted in CINAHL, PubMed, Proquest, EMBASE and Google Scholar. Whittemore and Knafl's (Journal of Advanced Nursing, 52, 2005, 546) integrative review framework guided data reduction, data display, data comparison and conclusion verification. Across the three participant categories (people with traumatic brain injury/family members/nurses) and sixteen subcategories, six cross-cutting themes emerged: seeking personhood, navigating challenging behaviour, valuing skills and competence, struggling with changed family responsibilities, maintaining productive partnerships and reflecting on workplace culture. Traumatic brain injury creates changes in physical, cognitive and emotional function that challenge known ways of being in the world for people. This alters relationship dynamics within families and requires a specific skill set among nurses. Recommendations include the following: (i) formal inclusion of people with traumatic brain injury and families in care planning, (ii) routine risk screening for falls and challenging behaviour to ensure that controls are based on accurate assessment, (iii) formal orientation and training for novice nurses in the management of challenging behaviour, (iv) professional case management to guide access to services and funding and (v) personal skill development to optimise family functioning. © 2018 John Wiley & Sons Ltd.
de Knegt, Martina Chantal; Fuchs, A; Weeke, P; Møgelvang, R; Hassager, C; Kofoed, K F
2016-12-01
Current echocardiographic assessments of coronary vascular territories use the 17-segment model and are based on general assumptions of coronary vascular distribution. Fusion of 3D echocardiography (3DE) with multidetector computed tomography (MDCT) derived coronary anatomy may provide a more accurate assessment of left ventricular (LV) territorial function. We aimed to test the feasibility of MDCT and 3DE fusion and to compare territorial longitudinal strain (LS) using the 17-segment model and a MDCT-guided vascular model. 28 patients underwent 320-slice MDCT and transthoracic 3DE on the same day followed by invasive coronary angiography. MDCT (Aquilion ONE, ViSION Edition, Toshiba Medical Systems) and 3DE apical full-volume images (Artida, Toshiba Medical Systems) were fused offline using a dedicated workstation (prototype fusion software, Toshiba Medical Systems). 3DE/MDCT image alignment was assessed by 3 readers using a 4-point scale. Territorial LS was assessed using the 17-segment model and the MDCT-guided vascular model in territories supplied by significantly stenotic and non-significantly stenotic vessels. Successful 3DE/MDCT image alignment was obtained in 86 and 93 % of cases for reader one, and reader two and three, respectively. Fair agreement on the quality of automatic image alignment (intra-class correlation = 0.40) and the success of manual image alignment (Fleiss' Kappa = 0.40) among the readers was found. In territories supplied by non-significantly stenotic left circumflex arteries, LS was significantly higher in the MDCT-guided vascular model compared to the 17-segment model: -15.00 ± 7.17 (mean ± standard deviation) versus -11.87 ± 4.09 (p < 0.05). Fusion of MDCT and 3DE is feasible and provides physiologically meaningful displays of myocardial function.
Swartman, B; Frere, D; Wei, W; Schnetzke, M; Beisemann, N; Keil, H; Franke, J; Grützner, P A; Vetter, S Y
2017-10-01
A new software application can be used without fixed reference markers or a registration process in wire placement. The aim was to compare placement of Kirschner wires (K-wires) into the proximal femur with the software application versus the conventional method without guiding. As the study hypothesis, we assumed fewer placement attempts, shorter procedure time and shorter fluoroscopy time using the software, with the same precision inside a proximal femur bone model. The software detects a K-wire within the 2D fluoroscopic image. By evaluating its direction and tip location, it superimposes a trajectory on the image, visualising the intended direction of the K-wire. The K-wire was positioned in 20 artificial bones with the use of the software by one surgeon; 20 bones served as conventional controls. A brass thumb tack was placed into the femoral head and its tip targeted with the wire. The number of placement attempts, duration of the procedure, duration of fluoroscopy and distance to the target in a postoperative 3D scan were recorded. Compared with the conventional method, use of the application showed fewer attempts for optimal wire placement (p=0.026), shorter duration of surgery (p=0.004), shorter fluoroscopy time (p=0.024) and higher precision (p=0.018). Final wire position was achieved at the first attempt in 17 out of 20 cases with the software and in 9 out of 20 cases with the conventional method. The study hypothesis was confirmed. The new application optimised the process of K-wire placement in the proximal femur in an artificial bone model while also improving precision. Benefits lie especially in the reduction of placement attempts and of fluoroscopy time, under the aspect of radiation protection. The software runs on a conventional image intensifier and can therefore be easily integrated into the daily surgical routine. Copyright © 2017 Elsevier Ltd. All rights reserved.
Salvador-Carulla, L; Lukersmith, S; Sullivan, W
2017-04-01
Guideline methods to develop recommendations dedicate most effort to organising discovery and corroboration knowledge following the evidence-based medicine (EBM) framework. Guidelines typically use a single dimension of information, and generally discard contextual evidence, formal expert knowledge and consumers' experiences in the process. In recognition of the limitations of guidelines in complex cases, complex interventions and systems research, there has been significant effort to develop new tools, guides, resources and structures to use alongside EBM methods of guideline development. In addition to these advances, a new framework based on the philosophy of science is required. Guidelines should be defined as implementation decision support tools for improving the decision-making process in real-world practice and not only as a procedure to optimise the knowledge base of scientific discovery and corroboration. A shift from the model of the EBM pyramid of corroboration of evidence to the use of a broader multi-domain perspective, graphically depicted as a 'Greek temple', could be considered. This model takes into account the different stages of scientific knowledge (discovery, corroboration and implementation); the sources of knowledge relevant to guideline development (experimental, observational, contextual, expert-based and experiential); their underlying inference mechanisms (deduction, induction, abduction and means-end inferences); and a more precise definition of evidence and related terms. The applicability of this broader approach is presented for the development of the Canadian Consensus Guidelines for the Primary Care of People with Developmental Disabilities.
The multiple roles of computational chemistry in fragment-based drug design
NASA Astrophysics Data System (ADS)
Law, Richard; Barker, Oliver; Barker, John J.; Hesterkamp, Thomas; Godemann, Robert; Andersen, Ole; Fryatt, Tara; Courtney, Steve; Hallett, Dave; Whittaker, Mark
2009-08-01
Fragment-based drug discovery (FBDD) represents a change in strategy from the screening of molecules with higher molecular weights and physical properties more akin to fully drug-like compounds, to the screening of smaller, less complex molecules. This is because it has been recognised that fragment hit molecules can be efficiently grown and optimised into leads, particularly after the binding mode to the target protein has been first determined by 3D structural elucidation, e.g. by NMR or X-ray crystallography. Several studies have shown that medicinal chemistry optimisation of an already drug-like hit or lead compound can result in a final compound with too high molecular weight and lipophilicity. The evolution of a lower molecular weight fragment hit therefore represents an attractive alternative approach to optimisation as it allows better control of compound properties. Computational chemistry can play an important role both prior to a fragment screen, in producing a target focussed fragment library, and post-screening in the evolution of a drug-like molecule from a fragment hit, both with and without the available fragment-target co-complex structure. We will review many of the current developments in the area and illustrate with some recent examples from successful FBDD discovery projects that we have conducted.
NASA Astrophysics Data System (ADS)
Ben-Romdhane, Hajer; Krichen, Saoussen; Alba, Enrique
2017-05-01
Optimisation in changing environments is a challenging research topic since many real-world problems are inherently dynamic. Inspired by the natural evolution process, evolutionary algorithms (EAs) are among the most successful and promising approaches that have addressed dynamic optimisation problems. However, managing the exploration/exploitation trade-off in EAs is still a prevalent issue, due to the difficulties associated with the control and measurement of such behaviour. The proposal of this paper is to achieve a balance between exploration and exploitation in an explicit manner. The idea is to use two equally sized populations: the first one performs exploration while the second one is responsible for exploitation. These tasks are alternated from one generation to the next in a regular pattern, so as to obtain a balanced search engine. Besides, we reinforce the ability of our algorithm to adapt quickly after changes by means of a memory of past solutions. Such a combination aims to restrain premature convergence, to broaden the search area and to speed up the optimisation. We show through computational experiments, based on a series of dynamic problems and many performance measures, that our approach improves the performance of EAs and outperforms competing algorithms.
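A toy sketch of the two-population scheme described above, under several stated assumptions (a moving-optimum "sphere" objective, mutation-only reproduction, and an environment change assumed every 50 generations): the two populations swap exploration and exploitation roles each generation, and a bounded memory of past elites re-seeds the exploiter after a change.

```python
# Sketch of the dual-population idea: one population explores (large mutations), the
# other exploits (small mutations); roles alternate each generation and a memory of
# past elites is re-injected after an assumed environment change. Toy objective only.
import numpy as np

rng = np.random.default_rng(2)
DIM = 10

def dynamic_sphere(x, t):
    # Moving-optimum sphere function as a toy dynamic problem (higher is better).
    optimum = 2.0 * np.sin(0.05 * t) * np.ones(DIM)
    return -np.sum((x - optimum) ** 2)

def evolve(pop, sigma, t):
    fit = np.array([dynamic_sphere(x, t) for x in pop])
    parents = pop[np.argsort(fit)[-len(pop) // 2:]]           # keep the better half
    best = parents[-1].copy()
    children = parents[rng.integers(0, len(parents), len(pop) - len(parents))]
    children = children + rng.normal(0, sigma, children.shape)
    return np.vstack([parents, children]), best

pop_a = rng.uniform(-5, 5, (20, DIM))   # starts as the explorer
pop_b = rng.uniform(-5, 5, (20, DIM))   # starts as the exploiter
memory = []

for t in range(200):
    explore_first = (t % 2 == 0)        # roles alternate in a regular pattern
    pop_a, best_a = evolve(pop_a, 1.0 if explore_first else 0.1, t)
    pop_b, best_b = evolve(pop_b, 0.1 if explore_first else 1.0, t)
    memory = (memory + [best_a])[-10:]  # bounded memory of past solutions
    if t % 50 == 0 and memory:          # assume a change is detected every 50 steps
        pop_b[:len(memory)] = np.array(memory)   # re-seed the exploiter from memory
```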
Schmidt, Ronny; Cook, Elizabeth A; Kastelic, Damjana; Taussig, Michael J; Stoevesandt, Oda
2013-08-02
We have previously described a protein arraying process based on cell free expression from DNA template arrays (DNA Array to Protein Array, DAPA). Here, we have investigated the influence of different array support coatings (Ni-NTA, Epoxy, 3D-Epoxy and Polyethylene glycol methacrylate (PEGMA)). Their optimal combination yields an increased amount of detected protein and an optimised spot morphology on the resulting protein array compared to the previously published protocol. The specificity of protein capture was improved using a tag-specific capture antibody on a protein repellent surface coating. The conditions for protein expression were optimised to yield the maximum amount of protein or the best detection results using specific monoclonal antibodies or a scaffold binder against the expressed targets. The optimised DAPA system was able to increase by threefold the expression of a representative model protein while conserving recognition by a specific antibody. The amount of expressed protein in DAPA was comparable to those of classically spotted protein arrays. Reaction conditions can be tailored to suit the application of interest. DAPA represents a cost effective, easy and convenient way of producing protein arrays on demand. The reported work is expected to facilitate the application of DAPA for personalized medicine and screening purposes. Copyright © 2013 Elsevier B.V. All rights reserved.
Formulation and optimisation of raft-forming chewable tablets containing H2 antagonist
Prajapati, Shailesh T; Mehta, Anant P; Modhia, Ishan P; Patel, Chhagan N
2012-01-01
Purpose: The purpose of this research work was to formulate raft-forming chewable tablets of an H2 antagonist (Famotidine) using a raft-forming agent along with an antacid- and gas-generating agent. Materials and Methods: Tablets were prepared by wet granulation and evaluated for raft strength, acid neutralisation capacity, weight variation, % drug content, thickness, hardness, friability and in vitro drug release. Various raft-forming agents were used in preliminary screening. A 2³ full-factorial design was used in the present study for optimisation. The amounts of sodium alginate, calcium carbonate and sodium bicarbonate were selected as independent variables. Raft strength, acid neutralisation capacity and drug release at 30 min were selected as responses. Results: Tablets containing sodium alginate had the greatest raft strength compared with the other raft-forming agents. Acid neutralisation capacity and in vitro drug release of all factorial batches were found to be satisfactory. The F5 batch was optimised based on maximum raft strength and good acid neutralisation capacity. The drug–excipient compatibility study showed no interaction between the drug and excipients. The stability study of the optimised formulation showed that the tablets were stable under accelerated environmental conditions. Conclusion: It was concluded that raft-forming chewable tablets prepared using an optimum amount of sodium alginate, calcium carbonate and sodium bicarbonate could be an efficient dosage form in the treatment of gastro-oesophageal reflux disease. PMID:23580933
Formulation and optimisation of raft-forming chewable tablets containing H2 antagonist.
Prajapati, Shailesh T; Mehta, Anant P; Modhia, Ishan P; Patel, Chhagan N
2012-10-01
The purpose of this research work was to formulate raft-forming chewable tablets of an H2 antagonist (Famotidine) using a raft-forming agent along with an antacid- and gas-generating agent. Tablets were prepared by wet granulation and evaluated for raft strength, acid neutralisation capacity, weight variation, % drug content, thickness, hardness, friability and in vitro drug release. Various raft-forming agents were used in preliminary screening. A 2³ full-factorial design was used in the present study for optimisation. The amounts of sodium alginate, calcium carbonate and sodium bicarbonate were selected as independent variables. Raft strength, acid neutralisation capacity and drug release at 30 min were selected as responses. Tablets containing sodium alginate had the greatest raft strength compared with the other raft-forming agents. Acid neutralisation capacity and in vitro drug release of all factorial batches were found to be satisfactory. The F5 batch was optimised based on maximum raft strength and good acid neutralisation capacity. The drug-excipient compatibility study showed no interaction between the drug and excipients. The stability study of the optimised formulation showed that the tablets were stable under accelerated environmental conditions. It was concluded that raft-forming chewable tablets prepared using an optimum amount of sodium alginate, calcium carbonate and sodium bicarbonate could be an efficient dosage form in the treatment of gastro-oesophageal reflux disease.
Generic guide concepts for the European Spallation Source
NASA Astrophysics Data System (ADS)
Zendler, C.; Martin Rodriguez, D.; Bentley, P. M.
2015-12-01
The construction of the European Spallation Source (ESS) faces many challenges from the neutron beam transport point of view: the spallation source is specified as being driven by a 5 MW beam of protons, each with 2 GeV energy, and yet the requirements in instrument background suppression relative to measured signal vary between 10-6 and 10-8. The energetic particles, particularly above 20 MeV, which are expected to be produced in abundance in the target, have to be filtered in order to make the beamlines safe, operational and provide good quality measurements with low background. We present generic neutron guides of short and medium length instruments which are optimised for good performance at minimal cost. Direct line of sight to the source is avoided twice, with either the first point out of line of sight or both being inside the bunker (20 m) to minimise shielding costs. These guide geometries are regarded as a baseline to define standards for instruments to be constructed at ESS. They are used to find commonalities and develop principles and solutions for common problems. Lastly, we report the impact of employing the over-illumination concept to mitigate losses from random misalignment passively, and that over-illumination should be used sparingly in key locations to be effective. For more widespread alignment issues, a more direct, active approach is likely to be needed.
Optimisation of strain selection in evolutionary continuous culture
NASA Astrophysics Data System (ADS)
Bayen, T.; Mairet, F.
2017-12-01
In this work, we study a minimal time control problem for a perfectly mixed continuous culture with n ≥ 2 species and one limiting resource. The model that we consider includes a mutation factor for the microorganisms. Our aim is to provide optimal feedback control laws to optimise the selection of the species of interest. Thanks to Pontryagin's principle, we derive optimality conditions on the optimal controls and introduce a sub-optimal control law based on a most rapid approach to a singular arc that depends on the initial condition. Using adaptive dynamics theory, we also study a simplified version of this model which allows us to introduce a near-optimal strategy.
First on-sky results of a neural network based tomographic reconstructor: Carmen on Canary
NASA Astrophysics Data System (ADS)
Osborn, J.; Guzman, D.; de Cos Juez, F. J.; Basden, A. G.; Morris, T. J.; Gendron, É.; Butterley, T.; Myers, R. M.; Guesalaga, A.; Sanchez Lasheras, F.; Gomez Victoria, M.; Sánchez Rodríguez, M. L.; Gratadour, D.; Rousset, G.
2014-07-01
We present on-sky results obtained with Carmen, an artificial neural network tomographic reconstructor. It was tested during two nights in July 2013 on Canary, an AO demonstrator on the William Herschel Telescope. Carmen is trained during the day on the Canary calibration bench. This training regime ensures that Carmen is entirely flexible in terms of atmospheric turbulence profile, negating any need to re-optimise the reconstructor in changing atmospheric conditions. Carmen was run in short bursts, interlaced with an optimised Learn and Apply reconstructor. We found the performance of Carmen to be approximately 5% lower than that of Learn and Apply.
NASA Astrophysics Data System (ADS)
Parasyris, Antonios E.; Spanoudaki, Katerina; Kampanis, Nikolaos A.
2016-04-01
Groundwater level monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation for agricultural and domestic use. Given the high maintenance costs of these networks, development of tools which can be used by regulators for efficient network design is essential. In this work, a monitoring network optimisation tool is presented. The network optimisation tool couples geostatistical modelling based on the Spartan family variogram with a genetic algorithm method and is applied to the Mires basin in Crete, Greece, an area of high socioeconomic and agricultural interest which suffers from groundwater overexploitation leading to a dramatic decrease of groundwater levels. The purpose of the optimisation tool is to determine which wells to exclude from the monitoring network because they add little or no beneficial information to groundwater level mapping of the area. Unlike previous relevant investigations, the network optimisation tool presented here uses Ordinary Kriging with the recently established non-differentiable Spartan variogram for groundwater level mapping, which, based on a previous geostatistical study in the area, leads to optimal groundwater level mapping. Seventy boreholes operate in the area for groundwater abstraction and water level monitoring. The Spartan variogram gives overall the most accurate groundwater level estimates, followed closely by the power-law model. The geostatistical model is coupled to an integer genetic algorithm method programmed in MATLAB 2015a. The algorithm is used to find the set of wells whose removal leads to the minimum error between the original water level mapping using all the available wells in the network and the groundwater level mapping using the reduced well network (error is defined as the 2-norm of the difference between the original mapping matrix with 70 wells and the mapping matrix of the reduced well network). The solution to the optimisation problem (the best wells to retain in the monitoring network) depends on the total number of wells removed; this number is a management decision. The water level monitoring network of the Mires basin has been optimised six times by removing 5, 8, 12, 15, 20 and 25 wells from the original network. In order to achieve the optimum solution in the minimum possible computational time, a stall-generations criterion was set for each optimisation scenario. An improvement made to the classic genetic algorithm was to vary the mutation and crossover fractions according to the change in the mean fitness value. This introduces randomness into reproduction when the solution is converging, in order to avoid local minima, or more educated reproduction (a higher crossover ratio) when the mean fitness value is changing more strongly. The choice of the integer genetic algorithm in MATLAB 2015a imposes restrictions on the selection and crossover-mutation functions; therefore, custom population and crossover-mutation-selection functions were created to set the initial population type to custom and to allow the mutation and crossover probabilities to change according to the convergence of the genetic algorithm, thus achieving higher accuracy. The application of the network optimisation tool to the Mires basin indicates that 25 wells can be removed with a relatively small deterioration of the groundwater level map.
The results indicate the robustness of the network optimisation tool: Wells were removed from high well-density areas while preserving the spatial pattern of the original groundwater level map. Varouchakis, E. A. and D. T. Hristopulos (2013). "Improvement of groundwater level prediction in sparsely gauged basins using physical laws and local geographic features as auxiliary variables." Advances in Water Resources 52: 34-49.
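A simplified sketch of the well-removal search described above: inverse-distance weighting stands in for the Ordinary Kriging/Spartan-variogram mapping, and a mutation-only evolutionary loop stands in for MATLAB's integer GA. Well coordinates `xy`, water levels `z` and a prediction `grid` are assumed inputs; as in the study, the fitness is the 2-norm of the difference between the full-network and reduced-network maps.

```python
# Simplified sketch of the network-reduction idea: search for the k wells whose removal
# least changes the interpolated water-level map (not the study's MATLAB implementation).
import numpy as np

rng = np.random.default_rng(3)

def idw_map(xy, z, grid, power=2.0):
    # Inverse-distance weighting as a stand-in interpolator for Ordinary Kriging.
    d = np.linalg.norm(grid[:, None, :] - xy[None, :, :], axis=2) + 1e-9
    w = 1.0 / d**power
    return (w @ z) / w.sum(axis=1)

def map_error(keep_idx, xy, z, grid, full_map):
    reduced = idw_map(xy[keep_idx], z[keep_idx], grid)
    return np.linalg.norm(full_map - reduced)          # 2-norm of the map difference

def optimise_removal(xy, z, grid, k_remove=5, pop_size=30, generations=100):
    n = len(xy)
    full_map = idw_map(xy, z, grid)
    pop = [rng.choice(n, n - k_remove, replace=False) for _ in range(pop_size)]
    for _ in range(generations):
        errs = np.array([map_error(keep, xy, z, grid, full_map) for keep in pop])
        parents = [pop[i] for i in np.argsort(errs)[: pop_size // 2]]   # best half
        children = []
        for p in parents:
            child = p.copy()
            drop = rng.integers(len(child))             # mutate: swap one kept well
            child[drop] = rng.choice(np.setdiff1d(np.arange(n), child))
            children.append(child)
        pop = parents + children
    errs = np.array([map_error(keep, xy, z, grid, full_map) for keep in pop])
    return pop[int(np.argmin(errs))], errs.min()
```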
Systems Biology of Recombinant Protein Production in Bacillus megaterium
NASA Astrophysics Data System (ADS)
Biedendieck, Rebekka; Bunk, Boyke; Fürch, Tobias; Franco-Lara, Ezequiel; Jahn, Martina; Jahn, Dieter
Over the last two decades, the Gram-positive bacterium Bacillus megaterium has been systematically developed into a useful alternative protein production host. Multiple vector systems for high-yield intra- and extracellular protein production were constructed. Strong inducible promoters were combined with DNA sequences for optimised ribosome binding sites, various leader peptides for protein export, and N- as well as C-terminal affinity tags for affinity chromatographic purification of the desired protein. High cell density cultivation and recombinant protein production were successfully tested. For further systems biology based control and optimisation of the production process, the genomes of two B. megaterium strains were completely elucidated, DNA arrays designed, proteome, fluxome and metabolome analyses performed, and all data integrated using the bioinformatics platform MEGABAC. Now, solid theoretical and experimental bases for first modelling attempts of the production process are available.
Material model of pelvic bone based on modal analysis: a study on the composite bone.
Henyš, Petr; Čapek, Lukáš
2017-02-01
Digital models based on finite element (FE) analysis are widely used in orthopaedics to predict the stress or strain in the bone due to bone-implant interaction. The usability of the model depends strongly on the bone material description. The material model that is most commonly used is based on a constant Young's modulus or on the apparent density of bone obtained from computed tomography (CT) data. The Young's modulus of bone is described in many experimental works with large variations in the results. The concept of measuring and validating the material model of the pelvic bone based on modal analysis is introduced in this pilot study. The modal frequencies, damping, and shapes of the composite bone were measured precisely by an impact hammer at 239 points. An FE model was built using the data pertaining to the geometry and apparent density obtained from the CT of the composite bone. The isotropic homogeneous Young's modulus and Poisson's ratio of the cortical and trabecular bone were estimated from the optimisation procedure including Gaussian statistical properties. The performance of the updated model was investigated through the sensitivity analysis of the natural frequencies with respect to the material parameters. The maximal error between the numerical and experimental natural frequencies of the bone reached 1.74% in the first modal shape. Finally, the optimised parameters were matched with the data sheets of the composite bone. The maximal difference between the calibrated material properties and those obtained from the data sheet was 34%. The optimisation scheme of the FE model based on the modal analysis data provides extremely useful calibration of the FE models with the uncertainty bounds and without the influence of the boundary conditions.
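Purely to illustrate the model-updating idea (the study calibrates cortical and trabecular moduli of a full FE mesh), the sketch below tunes two stiffness parameters of a two-degree-of-freedom surrogate so its natural frequencies match assumed "measured" values; all numbers and the surrogate itself are invented for the example.

```python
# Illustrative model-updating sketch: tune two stiffness-like parameters so the natural
# frequencies of a simple 2-DOF surrogate match assumed measured modal frequencies.
import numpy as np
from scipy.optimize import minimize

f_measured = np.array([310.0, 890.0])   # assumed "measured" natural frequencies (Hz)
m1, m2 = 0.4, 0.3                       # assumed lumped masses (kg)

def natural_frequencies(k1, k2):
    K = np.array([[k1 + k2, -k2], [-k2, k2]])
    M = np.diag([m1, m2])
    w2 = np.linalg.eigvals(np.linalg.solve(M, K))   # eigenvalues are squared angular freqs
    return np.sort(np.sqrt(np.abs(w2))) / (2 * np.pi)

def objective(params):
    # Relative frequency mismatch between the model and the measurements.
    return np.sum(((natural_frequencies(*params) - f_measured) / f_measured) ** 2)

res = minimize(objective, x0=[1e6, 1e6], method="Nelder-Mead")
print("calibrated stiffnesses:", res.x, "frequencies:", natural_frequencies(*res.x))
```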
Bahia, Daljit; Cheung, Robert; Buchs, Mirjam; Geisse, Sabine; Hunt, Ian
2005-01-01
This report describes a method to culture insect cells in 24 deep-well blocks for the routine small-scale optimisation of baculovirus-mediated protein expression experiments. Miniaturisation of this process provides the necessary reduction in resource allocation, reagents and labour to allow extensive and rapid optimisation of expression conditions, with a concomitant reduction in lead time before commencement of large-scale bioreactor experiments. This greatly simplifies the optimisation process and allows the use of liquid-handling robotics in much of the initial optimisation stages, thereby greatly increasing the throughput of the laboratory. We present several examples of the use of deep-well block expression studies in the optimisation of therapeutically relevant protein targets. We also discuss how the enhanced throughput offered by this approach can be adapted to robotic handling systems and the implications this has for the capacity to conduct multi-parallel protein expression studies.
The development of a tournament preparation framework for competitive golf: A Delphi study.
Pilgrim, Jarred; Kremer, Peter; Robertson, Samuel
2018-05-09
Tournament preparation in golf is used by players to increase course knowledge, develop strategy, optimise playing conditions and facilitate self-regulation. It is not known whether specific behaviours in tournament preparation should be given priority in education and practice at different stages of competition. This study aimed to achieve consensus on the importance of specific tournament preparation behaviours or "items" to players of five competitive levels. A two-round Delphi study was used, including an expert panel of 36 coaches, high-performance staff, players and academics. Participants were asked to score the relative importance of 48 items to players using a 5-point Likert-type scale. For an item to achieve consensus, 67% agreement was required in two adjacent score categories. Consensus was reached for 46 items and these were used to develop a ranked framework for each competitive level. The developed framework provides consensus-based guidelines of the behaviours that are perceived as important in tournament preparation. This framework could be used by national sport organisations to guide the development of more comprehensive learning environments for players and coaches. It could also direct future studies examining the critical behaviours for golfers across different competitive levels.
Biometric templates selection and update using quality measures
NASA Astrophysics Data System (ADS)
Abboud, Ali J.; Jassim, Sabah A.
2012-06-01
To deal with severe variation in recording conditions, most biometric systems acquire multiple biometric samples for the same person at the enrolment stage, extract their individual biometric feature vectors and store them in the gallery in the form of biometric template(s) labelled with the person's identity. The number of samples/templates and the choice of the most appropriate templates influence the performance of the system. The desired biometric template selection technique must aim to control the run time and storage requirements while improving the recognition accuracy of the biometric system. This paper is devoted to elaborating on and discussing a new two-stage approach for biometric template selection and update. This approach uses quality-based clustering, followed by a special criterion for the selection of an ultimate set of biometric templates from the various clusters. The approach is developed to adaptively select a specific number of templates for each individual; the number of biometric templates depends mainly on the performance of each individual (i.e. the gallery size should be optimised to meet the needs of each target individual). Experiments were conducted on two face image databases and the results demonstrate the effectiveness of the proposed quality-guided approach.
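A minimal sketch of the two-stage idea, assuming precomputed feature vectors and quality scores per enrolment sample: samples are clustered on quality and the best sample of each cluster is retained as a template. This is an illustration only, not the authors' clustering criterion or selection rule.

```python
# Sketch of quality-guided template selection: cluster an individual's enrolment samples
# by a quality score, then keep the highest-quality sample from each cluster.
import numpy as np
from sklearn.cluster import KMeans

def select_templates(features, quality, n_clusters=3):
    """features: (N, D) biometric feature vectors; quality: (N,) quality scores."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(
        quality.reshape(-1, 1))
    chosen = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        if len(members):
            chosen.append(members[np.argmax(quality[members])])  # best sample per cluster
    return features[chosen], np.array(chosen)
```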
Ponsford, Ruth; Allen, Elizabeth; Campbell, Rona; Elbourne, Diana; Hadley, Alison; Lohan, Maria; Melendez-Torres, G J; Mercer, Catherine H; Morris, Steve; Young, Honor; Bonell, Chris
2018-01-01
Since the introduction of the Teenage Pregnancy Strategy (TPS), England's under-18 conception rate has fallen by 55%, but a continued focus on prevention is needed to maintain and accelerate progress. The teenage birth rate remains higher in the UK than in comparable Western European countries. Previous trials indicate that school-based social marketing interventions are a promising approach to addressing teenage pregnancy and improving sexual health. Such interventions are yet to be trialled in the UK. This study aims to optimise and establish the feasibility and acceptability of one such intervention: Positive Choices.

Design: Optimisation, feasibility testing and pilot cluster randomised trial.

Interventions: The Positive Choices intervention comprises a student needs survey, a student/staff-led School Health Promotion Council (SHPC), a classroom curriculum for year nine students covering social and emotional skills and sex education, student-led social marketing activities, parent information and a review of school sexual health services. Systematic optimisation of Positive Choices will be carried out with the National Children's Bureau Sex Education Forum (NCB SEF), one state secondary school in England and other youth and policy stakeholders. Feasibility testing will involve the same state secondary school and will assess progression criteria to advance to the pilot cluster RCT. The pilot cluster RCT with integral process evaluation will involve six different state secondary schools (four intervention and two control) and will assess the feasibility and utility of progressing to a full effectiveness trial.

The following outcome measures will be trialled as part of the pilot:
- self-reported pregnancy and unintended pregnancy (initiation of pregnancy for boys) and sexually transmitted infections;
- age of sexual debut, number of sexual partners, use of contraception at first and last sex, and non-volitional sex;
- educational attainment.

The feasibility of linking administrative data on births and terminations to self-report survey data to measure our primary outcome (unintended teenage pregnancy) will also be tested. This will be the first UK-based pilot trial of a school-wide social marketing intervention to reduce unintended teenage pregnancy and improve sexual health. If this study indicates feasibility and acceptability of the optimised Positive Choices intervention in English secondary schools, plans will be initiated for a phase III trial and economic evaluation of the intervention. ISRCTN registry (ISRCTN12524938, registered 03/07/2017).
FIBER AND INTEGRATED OPTICS: Bandgap modes in a coupled waveguide array
NASA Astrophysics Data System (ADS)
Usievich, B. A.; Nurligareev, D. Kh; Svetikov, V. V.; Sychugov, V. A.
2009-08-01
This work examines a waveguide array that consists of ten Nb2O5/SiO2 double layers and supports a 0.63-μm surface wave. The deposition of a Nb2O5 capping layer on top of the waveguide array enables a marked increase in the wave field intensity on its surface. The efficiency of surface-wave excitation in the Kretschmann configuration can be optimised by adjusting the number of double layers. We analyse the behaviour of the Bragg mode in relation to the thickness of the layer exposed to air and the transition of this mode from the second allowed band to the first through the bandgap of the system. In addition, the conventional leaky mode converts to a surface mode and then to a guided mode.
On the design and optimisation of new fractal antenna using PSO
NASA Astrophysics Data System (ADS)
Rani, Shweta; Singh, A. P.
2013-10-01
An optimisation technique for a newly shaped fractal structure, using particle swarm optimisation with curve fitting, is presented in this article. The aim of the particle swarm optimisation is to find the geometry of the antenna for the required user-defined frequency. To assess the effectiveness of the presented method, a set of representative numerical simulations has been performed and the results are compared with measurements from experimental prototypes built according to the design specifications coming from the optimisation procedure. The proposed fractal antenna resonates in the 5.8 GHz industrial, scientific and medical band, which is suitable for wireless telemedicine applications. The antenna characteristics have been studied using extensive numerical simulations and are experimentally verified. The antenna exhibits well-defined radiation patterns over the band.
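A generic particle swarm optimisation loop is sketched below; the closed-form "resonant frequency" surrogate is a placeholder for the curve-fitted model or full-wave simulation an antenna designer would actually use, and the two geometry parameters and their bounds are invented for illustration.

```python
# Generic particle swarm optimisation sketch with a placeholder antenna cost function.
import numpy as np

rng = np.random.default_rng(4)
TARGET_GHZ = 5.8

def cost(x):
    # Hypothetical surrogate: resonance falls with fractal scale and iteration depth.
    scale_mm, depth = x
    f_res = 12.0 / (scale_mm * (1.0 + 0.15 * depth))
    return abs(f_res - TARGET_GHZ)

def pso(n_particles=25, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    lo, hi = np.array([0.5, 1.0]), np.array([5.0, 4.0])     # assumed parameter bounds
    x = rng.uniform(lo, hi, (n_particles, 2))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = np.clip(x + v, lo, hi)
        f = np.array([cost(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()
```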
Baugreet, Sephora; Kerry, Joseph P; Brodkorb, André; Gomez, Carolina; Auty, Mark; Allen, Paul; Hamill, Ruth M
2018-08-01
With the goal of optimising a protein-enriched restructured beef steak targeted at the nutritional and chemosensory requirements of older adults, the technological performance of thirty formulations containing the plant-based ingredients pea protein isolate (PPI), rice protein (RP) and lentil flour (LF), with transglutaminase (TG) to enhance binding of meat pieces, was analysed. A maximal protein content of 28% in the cooked product was achieved with PPI, RP and LF. Binding strength was primarily affected by TG, while textural parameters were improved with LF inclusion. The optimal formulation (F) for a protein-enriched steak with the lowest hardness values was achieved with TG (2%), PPI (8%), RP (9.35%) and LF (4%). F, F1S (optimal formulation 1 with added seasoning) and control restructured products (containing neither plant proteins nor seasonings) were scored by 120 consumers aged over 65 years. Controls were the most preferred (P < .05), while F1S was least liked by the older consumers. Consumer testing suggests that further refinement and optimisation of restructured products with plant proteins should be undertaken. Copyright © 2018 Elsevier Ltd. All rights reserved.
Floating-to-Fixed-Point Conversion for Digital Signal Processors
NASA Astrophysics Data System (ADS)
Menard, Daniel; Chillet, Daniel; Sentieys, Olivier
2006-12-01
Digital signal processing applications are specified with floating-point data types but they are usually implemented in embedded systems with fixed-point arithmetic to minimise cost and power consumption. Thus, methodologies which establish automatically the fixed-point specification are required to reduce the application time-to-market. In this paper, a new methodology for the floating-to-fixed point conversion is proposed for software implementations. The aim of our approach is to determine the fixed-point specification which minimises the code execution time for a given accuracy constraint. Compared to previous methodologies, our approach takes into account the DSP architecture to optimise the fixed-point formats and the floating-to-fixed-point conversion process is coupled with the code generation process. The fixed-point data types and the position of the scaling operations are optimised to reduce the code execution time. To evaluate the fixed-point computation accuracy, an analytical approach is used to reduce the optimisation time compared to the existing methods based on simulation. The methodology stages are described and several experiment results are presented to underline the efficiency of this approach.
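A small illustration of the accuracy/word-length trade-off that such conversion methodologies explore automatically: quantise a signal to a 16-bit fixed-point format with a chosen number of fractional bits and measure the resulting signal-to-quantisation-noise ratio. This is a didactic sketch, not the paper's methodology or tool.

```python
# Minimal illustration of the float-to-fixed trade-off: quantise a signal to a Qm.n
# format and measure the accuracy loss as an SQNR in dB.
import numpy as np

def to_fixed(x, frac_bits, total_bits=16):
    """Quantise a float array to signed fixed point with `frac_bits` fractional bits."""
    scale = 2 ** frac_bits
    lo, hi = -2 ** (total_bits - 1), 2 ** (total_bits - 1) - 1
    return np.clip(np.round(x * scale), lo, hi).astype(np.int32)

def to_float(q, frac_bits):
    return q.astype(np.float64) / (2 ** frac_bits)

signal = np.sin(np.linspace(0, 2 * np.pi, 1000)) * 0.9
for frac_bits in (7, 11, 15):   # three candidate fractional word lengths in a 16-bit word
    err = signal - to_float(to_fixed(signal, frac_bits), frac_bits)
    sqnr_db = 10 * np.log10(np.mean(signal**2) / np.mean(err**2))
    print(f"{frac_bits} fractional bits: SQNR = {sqnr_db:.1f} dB")
```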
Sun, Jingcan; Yu, Bin; Curran, Philip; Liu, Shao-Quan
2012-12-15
Coconut cream and fusel oil, two low-cost natural substances, were used as starting materials for the biosynthesis of flavour-active octanoic acid esters (ethyl-, butyl-, isobutyl- and (iso)amyl octanoate) using lipase Palatase as the biocatalyst. The Taguchi design method was used for the first time to optimize the biosynthesis of esters by a lipase in an aqueous system of coconut cream and fusel oil. Temperature, time and enzyme amount were found to be statistically significant factors and the optimal conditions were determined to be as follows: temperature 30°C, fusel oil concentration 9% (v/w), reaction time 24h, pH 6.2 and enzyme amount 0.26 g. Under the optimised conditions, a yield of 14.25mg/g (based on cream weight) and signal-to-noise (S/N) ratio of 23.07 dB were obtained. The results indicate that the Taguchi design method was an efficient and systematic approach to the optimisation of lipase-catalysed biological processes. Copyright © 2012 Elsevier Ltd. All rights reserved.
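For readers unfamiliar with the Taguchi analysis step, the sketch below computes larger-the-better S/N ratios, -10·log10(mean(1/y²)), over an L9 orthogonal array and averages them per factor level; the factor-level assignment and yields are illustrative placeholders, not the study's measured data.

```python
# Sketch of the Taguchi analysis step: larger-the-better S/N ratios for each run of an
# L9 orthogonal array, averaged per factor level. All numbers are placeholders.
import numpy as np

# L9(3^3) orthogonal array: columns = temperature, time, enzyme amount (levels 0-2).
L9 = np.array([[0,0,0],[0,1,1],[0,2,2],[1,0,1],[1,1,2],[1,2,0],[2,0,2],[2,1,0],[2,2,1]])
yields = np.array([[8.1, 8.4], [9.0, 9.3], [7.5, 7.2], [11.8, 12.1], [10.2, 9.9],
                   [9.6, 9.8], [13.9, 14.3], [12.5, 12.0], [11.1, 11.4]])  # replicates

# Larger-the-better signal-to-noise ratio: S/N = -10 log10( mean(1 / y^2) )
sn = -10 * np.log10(np.mean(1.0 / yields**2, axis=1))

for factor, name in enumerate(["temperature", "time", "enzyme amount"]):
    level_means = [sn[L9[:, factor] == lvl].mean() for lvl in range(3)]
    print(name, "best level:", int(np.argmax(level_means)), level_means)
```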
NASA Astrophysics Data System (ADS)
Malko, Daniel; Lopes, Thiago; Ticianelli, Edson A.; Kucernak, Anthony
2016-08-01
The effect of the ionomer-to-carbon (I/C) ratio on the performance of single-cell polymer electrolyte fuel cells is investigated for three different types of non-precious metal cathodic catalysts. Polarisation curves as well as impedance spectra are recorded at different potentials in the presence of argon or oxygen at the cathode and hydrogen at the anode. It is found that an optimised ionomer content is a key factor for improving the performance of the catalyst. Non-optimal ionomer loading can be assessed by two different factors from the impedance spectra. Hence this observation could be used as a diagnostic element to determine the ideal ionomer content and distribution in newly developed catalyst electrodes. An electrode morphology based on the presence of an inhomogeneous resistance distribution within the porous structure is suggested to explain the observed phenomena. The effects of back-pressure and relative humidity on this feature are also investigated and support the above hypothesis. We give a simple flowchart to aid optimisation of electrodes with the minimum number of trials.
Dellson, P; Nilbert, M; Bendahl, P-O; Malmström, P; Carlsson, C
2011-07-01
Clinical trials are crucial to improve cancer treatment but recruitment is difficult. Optimised patient information has been recognised as a key issue. In line with the increasing focus on patients' perspectives in health care, we aimed to study patients' opinions about the written information used in three clinical trials for breast cancer. Primary data collection was done in focus group interviews with breast cancer patient advocates. Content analysis identified three major themes: comprehensibility, emotions and associations, and decision making. Based on the advocates' suggestions for improvements, 21 key issues were defined and validated through a questionnaire in an independent group of breast cancer patient advocates. Clear messages, emotionally neutral expressions, careful descriptions of side effects, clear comparisons between different treatment alternatives and information about the possibility to discontinue treatment were perceived as the most important issues. Patients' views of the information in clinical trials provide new insights and identify key issues to consider in optimising future written information and may improve recruitment to clinical cancer trials. © 2010 Blackwell Publishing Ltd.
Guillaume, Y C; Peyrin, E
2000-03-06
A chemometric methodology is proposed to study the separation of seven p-hydroxybenzoic esters in reversed phase liquid chromatography (RPLC). Fifteen experiments were found to be necessary to find a mathematical model which linked a novel chromatographic response function (CRF) with the column temperature, the water fraction in the mobile phase and its flow rate. The CRF optimum was determined using a new algorithm based on Glover's taboo search (TS). A flow-rate of 0.9 ml min(-1) with a water fraction of 0.64 in the ACN-water mixture and a column temperature of 10 degrees C gave the most efficient separation conditions. The usefulness of TS was compared with the pure random search (PRS) and simplex search (SS). As demonstrated by calculations, the algorithm avoids entrapment in local minima and continues the search to give a near-optimal final solution. Unlike other methods of global optimisation, this procedure is generally applicable, easy to implement, derivative free, conceptually simple and could be used in the future for much more complex optimisation problems.
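For readers unfamiliar with taboo (tabu) search, the sketch below applies a minimal version of the algorithm to a gridded factor space mirroring the three factors above (column temperature, water fraction, flow rate); the response surface crf() is a made-up stand-in for the authors' chromatographic response function, and the factor ranges are illustrative only.

    import itertools, random

    # Discretised factor levels (illustrative ranges, not the authors' design)
    temps = list(range(10, 41, 5))                              # column temperature, degrees C
    fracs = [round(0.55 + 0.01 * i, 2) for i in range(16)]      # water fraction in ACN-water
    flows = [round(0.5 + 0.1 * i, 1) for i in range(11)]        # flow rate, ml/min

    def crf(t, w, f):
        """Placeholder response surface standing in for the chromatographic CRF."""
        return -((t - 10) ** 2 / 400 + 50 * (w - 0.64) ** 2 + (f - 0.9) ** 2)

    def neighbours(state):
        t, w, f = state
        for dt, dw, df in itertools.product((-1, 0, 1), repeat=3):
            it, iw, jf = temps.index(t) + dt, fracs.index(w) + dw, flows.index(f) + df
            if (dt, dw, df) != (0, 0, 0) and 0 <= it < len(temps) and 0 <= iw < len(fracs) and 0 <= jf < len(flows):
                yield (temps[it], fracs[iw], flows[jf])

    def tabu_search(iters=200, tenure=10):
        current = best = (random.choice(temps), random.choice(fracs), random.choice(flows))
        tabu = []
        for _ in range(iters):
            # Aspiration criterion: a tabu move is allowed if it beats the best solution so far
            cands = [s for s in neighbours(current) if s not in tabu or crf(*s) > crf(*best)]
            if not cands:
                break
            current = max(cands, key=lambda s: crf(*s))
            tabu.append(current)
            tabu[:] = tabu[-tenure:]
            if crf(*current) > crf(*best):
                best = current
        return best

    print(tabu_search())  # should approach (10, 0.64, 0.9) for this toy surface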
Zeinali-Davarani, Shahrokh; Shirazi-Adl, Aboulfazl; Dariush, Behzad; Hemami, Hooshang; Parnianpour, Mohamad
2011-07-01
The effects of external resistance on the recruitment of trunk muscles in sagittal movements and the coactivation mechanism to maintain spinal stability were investigated using a simple computational model of iso-resistive spine sagittal movements. Neural excitation of muscles was attained based on an inverse dynamics approach along with a stability-based optimisation. The trunk flexion and extension movements between 60° flexion and the upright posture against various resistance levels were simulated. Incorporation of the stability constraint in the optimisation algorithm required higher antagonistic activities for all resistance levels, mostly close to the upright position. Extension movements showed higher coactivation with higher resistance, whereas flexion movements demonstrated lower coactivation, indicating a greater stability demand in backward extension movements against higher resistance in the neighbourhood of the upright posture. Optimal extension profiles based on minimum jerk, work and power had distinct kinematic profiles which led to recruitment patterns with different timing and amplitude of activation.
Anstey, Kaarin J; Bielak, Allison AM; Birrell, Carole L; Browning, Colette J; Burns, Richard A; Byles, Julie; Kiley, Kim M; Nepal, Binod; Ross, Lesley A; Steel, David; Windsor, Timothy D
2014-01-01
Aim: To describe the Dynamic Analyses to Optimise Ageing (DYNOPTA) project and illustrate its contributions to understanding ageing through innovative methodology, and investigations on outcomes based on the project themes. DYNOPTA provides a platform and technical expertise that may be used to combine other national and international datasets. Method: The DYNOPTA project has pooled and harmonized data from nine Australian longitudinal studies to create the largest available longitudinal dataset (N=50652) on ageing in Australia. Results: A range of findings have resulted from the study to date, including methodological advances, prevalence rates of disease and disability, and mapping trajectories of ageing with and without increasing morbidity. DYNOPTA also forms the basis of a microsimulation model that will provide projections of future costs of disease and disability for the baby boomer cohort. Conclusion: DYNOPTA contributes significantly to the Australian evidence-base on ageing to inform key social and health policy domains. PMID:22032767
A knowledge-based control system for air-scour optimisation in membrane bioreactors.
Ferrero, G; Monclús, H; Sancho, L; Garrido, J M; Comas, J; Rodríguez-Roda, I
2011-01-01
Although membrane bioreactor (MBR) technology is still a growing sector, its progressive implementation all over the world, together with great technical achievements, has allowed it to reach a degree of maturity comparable to that of more conventional wastewater treatment technologies. With current energy requirements around 0.6-1.1 kWh/m3 of treated wastewater and investment costs similar to conventional treatment plants, the main market niche for MBRs lies in areas with very restrictive discharge limits, where treatment plants have to be compact or where water reuse is necessary. Operational costs are higher than for conventional treatments; consequently there is still a need, and scope, for energy saving and optimisation. This paper presents the development of a knowledge-based decision support system (DSS) for the integrated operation and remote control of the biological and physical (filtration and backwashing or relaxation) processes in MBRs. The core of the DSS is a knowledge-based control module for air-scour consumption automation and energy consumption minimisation.
H2/H∞ control for grid-feeding converter considering system uncertainty
NASA Astrophysics Data System (ADS)
Li, Zhongwen; Zang, Chuanzhi; Zeng, Peng; Yu, Haibin; Li, Shuhui; Fu, Xingang
2017-05-01
Three-phase grid-feeding converters (GFCs) are key components to integrate distributed generation and renewable power sources into the power utility. Conventionally, proportional integral and proportional resonant-based control strategies are applied to control the output power or current of a GFC. However, those control strategies have poor transient performance and are not robust against uncertainties and volatilities in the system. This paper proposes an H2/H∞-based control strategy, which can mitigate the above restrictions. The uncertainty and disturbance are included to formulate the GFC system state-space model, making it more accurate in reflecting practical system conditions. The paper uses a convex optimisation method to design the H2/H∞-based optimal controller. Instead of using a guess-and-check method, the paper uses particle swarm optimisation to search for an H2/H∞-optimal controller. Several case studies implemented in both simulation and experiment verify the superiority of the proposed control strategy over traditional PI control methods, especially under dynamic and variable system conditions.
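The paper's controller synthesis details are not reproduced here, so the following is only a generic particle swarm optimisation sketch of the kind of search described: particles explore a bounded controller-parameter space and minimise a placeholder closed-loop cost, which stands in for (and is not) an H2/H∞ performance index.

    import random

    def cost(params):
        """Placeholder closed-loop performance index (stand-in for the H2/H-infinity objective)."""
        kp, ki = params
        return (kp - 2.0) ** 2 + (ki - 0.5) ** 2 + 0.1 * abs(kp * ki)

    def pso(dim=2, n_particles=20, iters=100, bounds=(0.0, 5.0), w=0.7, c1=1.5, c2=1.5):
        lo, hi = bounds
        pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        gbest = min(pbest, key=cost)[:]
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))  # clamp to bounds
                if cost(pos[i]) < cost(pbest[i]):
                    pbest[i] = pos[i][:]
                    if cost(pbest[i]) < cost(gbest):
                        gbest = pbest[i][:]
        return gbest

    print(pso())  # approaches the minimiser of the placeholder cost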
On the performance of energy detection-based CR with SC diversity over IG channel
NASA Astrophysics Data System (ADS)
Verma, Pappu Kumar; Soni, Sanjay Kumar; Jain, Priyanka
2017-12-01
Cognitive radio (CR) is a viable 5G technology to address the scarcity of the spectrum. Energy detection-based sensing is known to be the simplest method as far as hardware complexity is concerned. In this paper, the performance of the energy detection-based spectrum sensing technique in CR networks over an inverse Gaussian channel with the selection combining diversity technique is analysed. More specifically, accurate analytical expressions for the average detection probability under different detection scenarios, such as a single channel (no diversity) and diversity reception, are derived and evaluated. Further, the detection threshold parameter is optimised by minimising the probability of error over several diversity branches. The results clearly show the significant improvement in the probability of detection when the optimised threshold parameter is applied. The impact of shadowing parameters on the performance of the energy detector is studied in terms of the complementary receiver operating characteristic curve. To verify the correctness of our analysis, the derived analytical expressions are corroborated via exact results and Monte Carlo simulations.
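As a minimal illustration of energy detection, the sketch below estimates detection and false-alarm probabilities by Monte Carlo for a simple detector in additive white Gaussian noise; it omits the inverse Gaussian shadowing and selection combining that are the subject of the paper, and the sample size, SNR and threshold are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)

    def energy_detect(n_samples, snr_db, threshold, trials=100_000):
        """Monte Carlo estimate of (Pd, Pfa) for a simple energy detector in AWGN."""
        snr = 10 ** (snr_db / 10)
        noise = rng.normal(size=(trials, n_samples))          # unit-variance noise
        signal = np.sqrt(snr) * np.ones((trials, n_samples))  # deterministic pilot-like signal
        stat_h1 = np.sum((signal + noise) ** 2, axis=1)       # energy statistic under H1
        stat_h0 = np.sum(noise ** 2, axis=1)                  # energy statistic under H0
        return (stat_h1 > threshold).mean(), (stat_h0 > threshold).mean()

    pd, pfa = energy_detect(n_samples=20, snr_db=0.0, threshold=35.0)
    print(f"Pd ~ {pd:.3f}, Pfa ~ {pfa:.3f}")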
NASA Astrophysics Data System (ADS)
Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.
2017-09-01
This paper presents a systematic methodology to analyse the warpage of the side arm part using Autodesk Moldflow Insight software. Response Surface Methodology (RSM) was proposed to optimise the processing parameters and efficiently minimise the warpage of the side arm part. The variable parameters considered in this study were based on the most significant parameters affecting warpage reported by previous researchers, namely melt temperature, mould temperature and packing pressure, with packing time and cooling time added as these are commonly used parameters. The results show that warpage was improved by 10.15% and that the most significant parameter affecting warpage is packing pressure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rozhdestvensky, Yu V
The possibility of obtaining intense cold atomic beams by using the Renyi entropy to optimise the laser cooling process is studied. It is shown that, in the case of a Gaussian velocity distribution of atoms, the Renyi entropy coincides with the density of particles in the phase space. The optimisation procedure for cooling atoms by resonance optical radiation is described, which is based on the thermodynamic law of increasing Renyi entropy in time. Our method is compared with the known methods for increasing the laser cooling efficiency, such as the tuning of the laser frequency in time and a change of the atomic transition frequency in an inhomogeneous transverse field of a magnetic solenoid. (laser cooling)
Di Paolo Emilio, M; Festuccia, R; Palladino, L
2015-09-01
In this work, the X-ray emission generated from a plasma produced by focusing a Nd-YAG laser beam on Mylar and Yttrium targets will be characterised. The goal is to reach the best condition that optimises the X-ray conversion efficiency at 500 eV (pre-edge of the Oxygen K-shell), which is strongly absorbed by carbon-based structures. The characteristics of the microbeam optical system, the software/hardware control and the preliminary measurements of the X-ray fluence will be presented. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
A Cost-Utility Analysis of Prostate Cancer Screening in Australia.
Keller, Andrew; Gericke, Christian; Whitty, Jennifer A; Yaxley, John; Kua, Boon; Coughlin, Geoff; Gianduzzo, Troy
2017-02-01
The Göteborg randomised population-based prostate cancer screening trial demonstrated that prostate-specific antigen (PSA)-based screening reduces prostate cancer deaths compared with an age-matched control group. Utilising the prostate cancer detection rates from this study, we investigated the clinical and cost effectiveness of a similar PSA-based screening strategy for an Australian population of men aged 50-69 years. A decision model that incorporated Markov processes was developed from a health system perspective. The base-case scenario compared a population-based screening programme with current opportunistic screening practices. Costs, utility values, treatment patterns and background mortality rates were derived from Australian data. All costs were adjusted to reflect July 2015 Australian dollars (A$). An alternative scenario compared systematic with opportunistic screening but with optimisation of active surveillance (AS) uptake in both groups. A discount rate of 5 % for costs and benefits was utilised. Univariate and probabilistic sensitivity analyses were performed to assess the effect of variable uncertainty on model outcomes. Our model very closely replicated the number of deaths from both prostate cancer and background mortality in the Göteborg study. The incremental cost per quality-adjusted life-year (QALY) for PSA screening was A$147,528. However, for years of life gained (LYGs), PSA-based screening (A$45,890/LYG) appeared more favourable. Our alternative scenario with optimised AS improved cost utility to A$45,881/QALY, with screening becoming cost effective at a 92 % AS uptake rate. Both modelled scenarios were most sensitive to the utility of patients before and after intervention, and the discount rate used. PSA-based screening is not cost effective compared with Australia's assumed willingness-to-pay threshold of A$50,000/QALY. It appears more cost effective if LYGs are used as the relevant outcome, and is more cost effective than the established Australian breast cancer screening programme on this basis. Optimised utilisation of AS increases the cost effectiveness of prostate cancer screening dramatically.
NASA Astrophysics Data System (ADS)
Zimmerling, Clemens; Dörr, Dominik; Henning, Frank; Kärger, Luise
2018-05-01
Due to their high mechanical performance, continuous fibre reinforced plastics (CoFRP) are becoming increasingly important for load bearing structures. In many cases, manufacturing CoFRPs comprises a forming process of textiles. To predict and optimise the forming behaviour of a component, numerical simulations are applied. However, for maximum part quality, both the geometry and the process parameters must match in mutual regard, which in turn requires numerous numerically expensive optimisation iterations. In both textile and metal forming, a lot of research has focused on determining optimum process parameters, whilst regarding the geometry as invariable. In this work, a meta-model-based approach at component level is proposed that provides a rapid estimation of the formability of variable geometries based on pre-sampled, physics-based draping data. Initially, a geometry recognition algorithm scans the geometry and extracts a set of doubly-curved regions with relevant geometry parameters. If the relevant parameter space is not part of an underlying database, additional samples are drawn via finite-element draping simulations according to a suitable design table for computer experiments. Time-saving parallel runs of the physical simulations accelerate the data acquisition. Ultimately, a Gaussian regression meta-model is built from the database. The method is demonstrated on a box-shaped generic structure. The predicted results are in good agreement with physics-based draping simulations. Since evaluations of the established meta-model are numerically inexpensive, any further design exploration (e.g. robustness analysis or design optimisation) can be performed in a short time. It is expected that the proposed method also offers great potential for future applications along virtual process chains: for each process step along the chain, a meta-model can be set up to predict the impact of design variations on manufacturability and part performance. Thus, the method is considered to facilitate a lean and economic part and process design under consideration of manufacturing effects.
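A meta-model of the kind described can be sketched with Gaussian process regression; the example below fits scikit-learn's GaussianProcessRegressor to synthetic geometry-parameter/response pairs standing in for the pre-sampled draping data (the parameter names and response are invented, not the authors' variables).

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(1)

    # Synthetic samples: two geometry parameters (e.g. corner radius, depth) -> a draping response
    X = rng.uniform([5.0, 10.0], [50.0, 80.0], size=(40, 2))
    y = 0.4 * X[:, 1] - 0.2 * X[:, 0] + rng.normal(scale=1.0, size=40)  # stand-in shear-angle data

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=[10.0, 20.0]) + WhiteKernel(1.0),
                                  normalize_y=True).fit(X, y)

    # Cheap evaluation of a new geometry variant, with predictive uncertainty
    mean, std = gp.predict(np.array([[25.0, 45.0]]), return_std=True)
    print(f"predicted response ~ {mean[0]:.1f} +/- {std[0]:.1f}")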
Statistical optimisation techniques in fatigue signal editing problem
NASA Astrophysics Data System (ADS)
Nopiah, Z. M.; Osman, M. H.; Baharin, N.; Abdullah, S.
2015-02-01
Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelling segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.
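The constrained segment-selection idea can be illustrated with a minimal penalty-based genetic algorithm; in the sketch below the per-segment lengths and damage values are randomly generated stand-ins, only the fatigue-damage retention constraint is modelled (RMS and kurtosis are omitted), and the operators are deliberately simple.

    import random

    random.seed(42)
    # Stand-in per-segment attributes: (length in samples, fatigue damage contribution)
    segments = [(random.randint(50, 200), random.random()) for _ in range(40)]
    TOTAL_DAMAGE = sum(s[1] for s in segments)

    def fitness(chrom):
        """Minimise retained length, penalising loss of fatigue damage beyond 5%."""
        length = sum(s[0] for s, keep in zip(segments, chrom) if keep)
        damage = sum(s[1] for s, keep in zip(segments, chrom) if keep)
        penalty = 1e6 * max(0.0, (TOTAL_DAMAGE - damage) / TOTAL_DAMAGE - 0.05)
        return length + penalty

    def ga(pop_size=60, gens=200, p_mut=0.02):
        pop = [[random.random() < 0.5 for _ in segments] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness)                 # elitist truncation selection
            parents = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, len(segments))      # one-point crossover
                child = a[:cut] + b[cut:]
                child = [(not g) if random.random() < p_mut else g for g in child]
                children.append(child)
            pop = parents + children
        return min(pop, key=fitness)

    best = ga()
    print("segments kept:", sum(best), "of", len(segments))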
O'Hagan, Steve; Knowles, Joshua; Kell, Douglas B.
2012-01-01
Comparatively few studies have addressed directly the question of quantifying the benefits to be had from using molecular genetic markers in experimental breeding programmes (e.g. for improved crops and livestock), nor the question of which organisms should be mated with each other to best effect. We argue that this requires in silico modelling, an approach for which there is a large literature in the field of evolutionary computation (EC), but which has not really been applied in this way to experimental breeding programmes. EC seeks to optimise measurable outcomes (phenotypic fitnesses) by optimising in silico the mutation, recombination and selection regimes that are used. We review some of the approaches from EC, and compare experimentally, using a biologically relevant in silico landscape, some algorithms that have knowledge of where they are in the (genotypic) search space (G-algorithms) with some (albeit well-tuned ones) that do not (F-algorithms). For the present kinds of landscapes, F- and G-algorithms were broadly comparable in quality and effectiveness, although we recognise that the G-algorithms were not equipped with any ‘prior knowledge’ of epistatic pathway interactions. This use of algorithms based on machine learning has important implications for the optimisation of experimental breeding programmes in the post-genomic era when we shall potentially have access to the full genome sequence of every organism in a breeding population. The non-proprietary code that we have used is made freely available (via Supplementary information). PMID:23185279
Escher, Graziela Bragueto; Santos, Jânio Sousa; Rosso, Neiva Deliberali; Marques, Mariza Boscacci; Azevedo, Luciana; do Carmo, Mariana Araújo Vieira; Daguer, Heitor; Molognoni, Luciano; Prado-Silva, Leonardo do; Sant'Ana, Anderson S; da Silva, Marcia Cristina; Granato, Daniel
2018-05-19
This study aimed to optimise the experimental conditions of extraction of the phytochemical compounds and functional properties of Centaurea cyanus petals. The following parameters were determined: the chemical composition (LC-ESI-MS/MS), the effects of pH on the stability and antioxidant activity of anthocyanins, the inhibition of lipid peroxidation, antioxidant activity, anti-hemolytic activity, antimicrobial, anti-hypertensive, and cytotoxic/cytoprotective effect, and the measurements of intracellular reactive oxygen species. Results showed that the temperature and time influenced (p ≤ 0.05) the content of flavonoids, anthocyanins, and FRAP. Only the temperature influenced the total phenolic content, non-anthocyanin flavonoids, and antioxidant activity (DPPH). The statistical approach made it possible to obtain the optimised experimental extraction conditions to increase the level of bioactive compounds. Chlorogenic, caffeic, ferulic, and p-coumaric acids, isoquercitrin, and coumarin were identified as the major compounds in the optimised extract. The optimised extract presented anti-hemolytic and anti-hypertensive activity in vitro, in addition to showing stability and reversibility of anthocyanins and antioxidant activity with pH variation. The C. cyanus petals aqueous extract exhibited high IC50 and GI50 (>900 μg/mL) values for all cell lines, meaning low cytotoxicity. Based on the oxidative stress assay, the extract exhibited pro-oxidant action (10-100 μg/mL) but did not cause damage or cell death. Copyright © 2018 Elsevier Ltd. All rights reserved.
Johnston, Christopher; Douarre, Pierre E; Soulimane, Tewfik; Pletzer, Daniel; Weingart, Helge; MacSharry, John; Coffey, Aidan; Sleator, Roy D; O'Mahony, Jim
2013-06-01
Subunit and DNA-based vaccines against Mycobacterium avium ssp. paratuberculosis (MAP) attempt to overcome inherent issues associated with whole-cell formulations. However, these vaccines can be hampered by poor expression of recombinant antigens from a number of disparate hosts. The high G+C content of MAP invariably leads to a codon bias throughout gene expression. To investigate whether the codon bias affects recombinant MAP antigen expression, the open reading frame of the MAP-specific antigen MptD (MAP3733c) was codon optimised for expression in a Lactobacillus salivarius host. Of the total 209 codons which constitute MAP3733c, 172 were modified, reducing the G+C content from 61% for the native gene to 32.7% for the modified form. Both genes were placed under the transcriptional control of the PnisA promoter, allowing controlled heterologous expression in L. salivarius. Expression was monitored using fluorescence microscopy and microplate fluorometry via GFP tags translationally fused to the C-termini of the two MptD genes. A >37-fold increase in expression was observed for the codon-optimised MAP3733synth variant over the native gene. Due to the low cost and improved expression achieved, codon optimisation significantly improves the potential of L. salivarius as an oral vaccine stratagem against Johne's disease. © 2013 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.
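The G+C figures quoted above are simple sequence statistics; the toy sketch below computes them for an invented high-GC fragment and a synonymous low-GC recoding (neither sequence is the real MAP3733c gene).

    def gc_content(seq):
        """Fraction of G and C bases in a coding sequence."""
        seq = seq.upper()
        return (seq.count("G") + seq.count("C")) / len(seq)

    native = "ATGGCGGCCGCGCTGGGCCGC"      # toy high-GC fragment, not the real MAP3733c gene
    optimised = "ATGGCAGCAGCATTAGGTAGA"   # toy synonymous recoding with lower GC
    print(f"native GC = {gc_content(native):.1%}, optimised GC = {gc_content(optimised):.1%}")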
Chaves, N J; Cheng, A C; Runnegar, N; Kirschner, J; Lee, T; Buising, K
2014-06-01
Antimicrobial stewardship programmes aim to optimise the use of antibiotics and are now mandatory in all Australian hospitals. We aimed to identify barriers to and enablers of appropriate antimicrobial prescribing among hospital doctors. Two paper-based and one web-based surveys were administered at three Australian university teaching hospitals from March 2010 to May 2011. The 18-item questionnaire recorded doctors’ level of experience, their knowledge regarding the use of common antimicrobials and their attitudes regarding antimicrobial prescribing. Local survey modifications allowed inclusion of specific questions on: infections in intensive care unit patients, clinical microbiology and use of local guidelines. The respondents (n = 272) comprised 96 (35%) registrars, 67 (25%) residents, 57 (21%) interns and 47 (17%) consultant hospital doctors. Forty-one per cent were working in a medical specialty. Identified barriers included: gaps in antimicrobial prescribing knowledge (especially among interns), a lack of awareness about which antimicrobials were restricted and a reliance on senior colleagues to make antimicrobial prescribing decisions. Enablers of optimal prescribing included: an acknowledgement of the need for assistance in prescribing and a reported readiness to consult national prescribing guidelines. These results were used to help guide and prioritise interventions to improve prescribing practices. A transferable knowledge and attitudes survey tool can be used to highlight barriers and facilitators to optimal hospital antimicrobial prescribing in order to inform tailored antimicrobial stewardship interventions.
Refolding of proteins from inclusion bodies: rational design and recipes.
Basu, Anindya; Li, Xiang; Leong, Susanna Su Jan
2011-10-01
The need to develop protein biomanufacturing platforms that can deliver proteins quickly and cost-effectively is ever more pressing. The rapid rate at which genomes can now be sequenced demands efficient protein production platforms for gene function identification. There is a continued need for the biotech industry to deliver new and more effective protein-based drugs to address new diseases. Bacterial production platforms have the advantage of high expression yields, but insoluble expression of many proteins necessitates the development of diverse and optimised refolding-based processes. Strategies employed to eliminate insoluble expression are reviewed, where it is concluded that inclusion bodies are difficult to eliminate for various reasons. Rational design of refolding systems and recipes are therefore needed to expedite production of recombinant proteins. This review article discusses efforts towards rational design of refolding systems and recipes, which can be guided by the development of refolding screening platforms that yield both qualitative and quantitative information on the progression of a given refolding process. The new opportunities presented by light scattering technologies for developing rational protein refolding buffer systems which in turn can be used to develop new process designs armed with better monitoring and controlling functionalities are discussed. The coupling of dynamic and static light scattering methodologies for incorporation into future bioprocess designs to ensure delivery of high-quality refolded proteins at faster rates is also discussed.
Modern imaging techniques: applications in the management of acute aortic pathologies.
Nagpal, Prashant; Khandelwal, Ashish; Saboo, Sachin S; Bathla, Girish; Steigner, Michael L; Rybicki, Frank J
2015-08-01
Acute aortic pathologies include traumatic and non-traumatic life-threatening emergencies of the aorta. Since the clinical manifestation of these entities can be non-specific and may overlap with other conditions presenting with chest pain, non-invasive imaging plays a crucial role in their rapid and accurate evaluation. The early diagnosis and accurate radiological assessment of acute aortic diseases is essential for improved clinical outcomes. Multidetector CT is the imaging modality of choice for evaluation of acute aortic diseases with MRI playing more of a problem-solving role. The management can be medical, endovascular or surgical depending upon pathology, and imaging remains an indispensable management-guiding tool. It is important to understand the pathogenesis, natural history, and imaging principles of acute aortic diseases for appropriate use of advanced imaging modalities. This understanding helps to formulate a more appropriate management and follow-up plan for optimised care of these patients. Imaging reporting pearls for day-to-day radiology as well as treatment options based on latest multidisciplinary guidelines are discussed. With newer techniques of image acquisition and processing, we are hopeful that imaging would further help in predicting aortic disease progression and assessing the haemodynamic parameters based on which decisions on management can be made. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Sheikh Rashid, Marya; Leensen, Monique C J; de Laat, Jan A P M; Dreschler, Wouter A
2017-11-01
The "Occupational Earcheck" (OEC) is a Dutch online self-screening speech-in-noise test developed for the detection of occupational high-frequency hearing loss (HFHL). This study evaluates an optimised version of the test and determines the most appropriate masking noise. The original OEC was improved by homogenisation of the speech material, and shortening the test. A laboratory-based cross-sectional study was performed in which the optimised OEC in five alternative masking noise conditions was evaluated. The study was conducted on 18 normal-hearing (NH) adults, and 15 middle-aged listeners with HFHL. The OEC in a low-pass (LP) filtered stationary background noise (test version LP 3: with a cut-off frequency of 1.6 kHz, and a noise floor of -12 dB) was the most accurate version tested. The test showed a reasonable sensitivity (93%), and specificity (94%) and test reliability (intra-class correlation coefficient: 0.84, mean within-subject standard deviation: 1.5 dB SNR, slope of psychometric function: 13.1%/dB SNR). The improved OEC, with homogenous word material in a LP filtered noise, appears to be suitable for the discrimination between younger NH listeners and older listeners with HFHL. The appropriateness of the OEC for screening purposes in an occupational setting will be studied further.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knight, Stephen P, E-mail: stephen.knight@health.qld.gov.au
The aim of this review was to develop a radiographic optimisation strategy to make use of digital radiography (DR) and needle phosphor computerised radiography (CR) detectors, in order to lower radiation dose and improve image quality for paediatrics. This review was based on evidence-based practice, of which a component was a review of the relevant literature. The resulting exposure chart was developed with two distinct groups of exposure optimisation strategies – body exposures (for head, trunk, humerus, femur) and distal extremity exposures (elbow to finger, knee to toe). Exposure variables manipulated included kilovoltage peak (kVp), target detector exposure and milli-ampere-seconds (mAs), automatic exposure control (AEC), additional beam filtration, and use of antiscatter grid. Mean dose area product (DAP) reductions of up to 83% for anterior–posterior (AP)/posterior–anterior (PA) abdomen projections were recorded postoptimisation due to manipulation of multiple exposure variables. For body exposures, the target EI and detector exposure, and thus the required mAs, were typically 20% less postoptimisation. Image quality for some distal extremity exposures was improved by lowering kVp and increasing mAs around constant entrance skin dose. Purchasing digital X-ray equipment with high detective quantum efficiency detectors, and then optimising the exposure chart for use with these detectors, is recommended as being of high importance for sites performing paediatric imaging. Multiple exposure variables may need to be manipulated to achieve optimal outcomes.
Asselineau, Charles-Alexis; Zapata, Jose; Pye, John
2015-06-01
A stochastic optimisation method adapted to illumination and radiative heat transfer problems involving Monte-Carlo ray-tracing is presented. A solar receiver shape optimisation case study illustrates the advantages of the method and its potential: efficient receivers are identified using a moderate computational cost.
3D Reconstruction of human bones based on dictionary learning.
Zhang, Binkai; Wang, Xiang; Liang, Xiao; Zheng, Jinjin
2017-11-01
An effective method for reconstructing a 3D model of human bones from computed tomography (CT) image data based on dictionary learning is proposed. In this study, the dictionary comprises the vertices of triangular meshes, and the sparse coefficient matrix indicates the connectivity information. For better reconstruction performance, we proposed a balance coefficient between the approximation and regularisation terms and a method for optimisation. Moreover, we applied a local updating strategy and a mesh-optimisation method to update the dictionary and the sparse matrix, respectively. The two updating steps are iterated alternately until the objective function converges. Thus, a reconstructed mesh could be obtained with high accuracy and regularisation. The experimental results show that the proposed method has the potential to obtain high precision and high-quality triangular meshes for rapid prototyping, medical diagnosis, and tissue engineering. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
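As a generic illustration of the alternating update scheme described (and not the authors' mesh-specific formulation), the sketch below alternates a proximal-gradient sparse-coding step with a least-squares dictionary update on synthetic data.

    import numpy as np

    rng = np.random.default_rng(0)
    Y = rng.normal(size=(16, 100))        # toy data matrix (e.g. vertex/patch features)
    D = rng.normal(size=(16, 32))         # initial dictionary
    D /= np.linalg.norm(D, axis=0)
    lam, step = 0.1, 0.01                 # sparsity weight and (safely small) gradient step

    X = np.zeros((32, 100))
    for it in range(50):
        # Sparse-coding step: a few ISTA (proximal gradient) iterations on X
        for _ in range(20):
            grad = D.T @ (D @ X - Y)
            X = X - step * grad
            X = np.sign(X) * np.maximum(np.abs(X) - step * lam, 0.0)
        # Dictionary-update step: least-squares fit, then renormalise columns
        D = Y @ np.linalg.pinv(X)
        D /= np.linalg.norm(D, axis=0) + 1e-12

    print("reconstruction residual:", np.linalg.norm(Y - D @ X))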
Rickard, Annette C; Vassallo, James; Nutbeam, Tim; Lyttle, Mark D; Maconochie, Ian K; Enki, Doyo G; Smith, Jason E
2018-04-28
Paediatric traumatic cardiac arrest (TCA) is associated with low survival and poor outcomes. The mechanisms that underlie TCA are different from medical cardiac arrest; the approach to treatment of TCA may therefore also need to differ to optimise outcomes. The aim of this study was to explore the opinion of subject matter experts regarding the diagnosis and treatment of paediatric TCA, and to reach consensus on how best to manage this group of patients. An online Delphi study was conducted over three rounds, with the aim of achieving consensus (defined as 70% agreement) on statements related to the diagnosis and management of paediatric TCA. Participants were invited from paediatric and adult emergency medicine, paediatric anaesthetics, paediatric ICU and paediatric surgery, as well as Paediatric Major Trauma Centre leads and representatives from the Resuscitation Council UK. Statements were informed by literature reviews and were based on elements of APLS resuscitation algorithms as well as some concepts used in the management of adult TCA; they ranged from confirmation of cardiac arrest to the indications for thoracotomy. 73 experts completed all three rounds between June and November 2016. Consensus was reached on 14 statements regarding the diagnosis and management of paediatric TCA; oxygenation and ventilatory support, along with rapid volume replacement with warmed blood, improve survival. The duration of cardiac arrest and the lack of a response to intervention, along with cardiac standstill on ultrasound, help to guide the decision to terminate resuscitation. This study has given a consensus-based framework to guide protocol development in the management of paediatric TCA, though further work is required in other key areas including its acceptability to clinicians. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Design optimisation of a TOF-based collimated camera prototype for online hadrontherapy monitoring
NASA Astrophysics Data System (ADS)
Pinto, M.; Dauvergne, D.; Freud, N.; Krimmer, J.; Letang, J. M.; Ray, C.; Roellinghoff, F.; Testa, E.
2014-12-01
Hadrontherapy is an innovative radiation therapy modality for which one of the key advantages is the target conformality allowed by the physical properties of ion species. However, in order to maximise the exploitation of its potential, online monitoring is required to assess the treatment quality, namely by monitoring devices relying on the detection of secondary radiation. Herein is presented a method based on Monte Carlo simulations to optimise a multi-slit collimated camera employing time-of-flight selection of prompt-gamma rays to be used in a clinical scenario. In addition, an analytical tool is developed based on the Monte Carlo data to predict the expected precision for a given geometrical configuration. Such a method follows the clinical workflow requirements to simultaneously have a solution that is relatively accurate and fast. Two different camera designs are proposed, considering different endpoints based on the trade-off between camera detection efficiency and spatial resolution, to be used in a proton therapy treatment with active dose delivery and assuming a homogeneous target.
An illustration of new methods in machine condition monitoring, Part I: stochastic resonance
NASA Astrophysics Data System (ADS)
Worden, K.; Antoniadou, I.; Marchesiello, S.; Mba, C.; Garibaldi, L.
2017-05-01
There have been many recent developments in the application of data-based methods to machine condition monitoring. A powerful methodology based on machine learning has emerged, where diagnostics are based on a two-step procedure: extraction of damage-sensitive features, followed by unsupervised learning (novelty detection) or supervised learning (classification). The objective of the current pair of papers is simply to illustrate one state-of-the-art procedure for each step, using synthetic data representative of reality in terms of size and complexity. The first paper in the pair will deal with feature extraction. Although some papers have appeared in the recent past considering stochastic resonance as a means of amplifying damage information in signals, they have largely relied on ad hoc specifications of the resonator used. In contrast, the current paper will adopt a principled optimisation-based approach to the resonator design. The paper will also show that a discrete dynamical system can provide all the benefits of a continuous system, but also provide a considerable speed-up in terms of simulation time in order to facilitate the optimisation approach.
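A minimal stochastic-resonance demonstration, assuming the classic discretised bistable (double-well) system rather than the paper's optimised resonator: a weak sub-threshold sine wave plus noise drives the system, and the drive frequency is typically enhanced in the output spectrum relative to the local background.

    import numpy as np

    rng = np.random.default_rng(3)
    n, dt = 20000, 0.01
    t = np.arange(n) * dt
    drive = 0.3 * np.sin(2 * np.pi * 0.5 * t)      # weak (sub-threshold) periodic component
    sigma = 0.8                                     # noise strength (illustrative)

    # Euler-Maruyama discretisation of the bistable system dx/dt = a*x - b*x**3 + drive + noise
    a, b = 1.0, 1.0
    x = np.zeros(n)
    for k in range(n - 1):
        x[k + 1] = x[k] + dt * (a * x[k] - b * x[k] ** 3 + drive[k]) + sigma * np.sqrt(dt) * rng.normal()

    # Power spectrum of the response: compare the drive-frequency bin with its local background
    spec = np.abs(np.fft.rfft(x - x.mean())) ** 2
    freqs = np.fft.rfftfreq(n, dt)
    k0 = np.argmin(np.abs(freqs - 0.5))
    background = np.mean(np.r_[spec[k0 - 6:k0 - 1], spec[k0 + 2:k0 + 7]])
    print(f"spectral peak at drive frequency vs local background: {10 * np.log10(spec[k0] / background):.1f} dB")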
Beauchamp, Alison; Batterham, Roy W; Dodson, Sarity; Astbury, Brad; Elsworth, Gerald R; McPhee, Crystal; Jacobson, Jeanine; Buchbinder, Rachelle; Osborne, Richard H
2017-03-03
The need for healthcare strengthening to enhance equity is critical, requiring systematic approaches that focus on those experiencing lesser access and outcomes. This project developed and tested the Ophelia (OPtimising HEalth LIteracy and Access) approach for co-design of interventions to improve health literacy and equity of access. Eight principles guided this development: Outcomes focused; Equity driven, Needs diagnosis, Co-design, Driven by local wisdom, Sustainable, Responsive and Systematically applied. We report the application of the Ophelia process where proof-of-concept was defined as successful application of the principles. Nine sites were briefed on the aims of the project around health literacy, co-design and quality improvement. The sites were rural/metropolitan, small/large hospitals, community health centres or municipalities. Each site identified their own priorities for improvement; collected health literacy data using the Health Literacy Questionnaire (HLQ) within the identified priority groups; engaged staff in co-design workshops to generate ideas for improvement; developed program-logic models; and implemented their projects using Plan-Do-Study-Act (PDSA) cycles. Evaluation included assessment of impacts on organisations, practitioners and service users, and whether the principles were applied. Sites undertook co-design workshops involving discussion of service user needs informed by HLQ (n = 813) and interview data. Sites generated between 21 and 78 intervention ideas and then planned their selected interventions through program-logic models. Sites successfully implemented interventions and refined them progressively with PDSA cycles. Interventions generally involved one of four pathways: development of clinician skills and resources for health literacy, engagement of community volunteers to disseminate health promotion messages, direct impact on consumers' health literacy, and redesign of existing services. Evidence of application of the principles was found in all sites. The Ophelia approach guided identification of health literacy issues at each participating site and the development and implementation of locally appropriate solutions. The eight principles provided a framework that allowed flexible application of the Ophelia approach and generation of a diverse set of interventions. Changes were observed at organisational, staff, and community member levels. The Ophelia approach can be used to generate health service improvements that enhance health outcomes and address inequity of access to healthcare.
Bradley, Steven M; Strauss, Craig E; Ho, P Michael
2017-08-01
Healthcare value, defined as health outcomes achieved relative to the costs of care, has been proposed as a unifying approach to measure improvements in the quality and affordability of healthcare. Although value is of increasing interest to payers, many providers remain unfamiliar with how value differs from other approaches to the comparison of cost and outcomes (ie, cost-effectiveness analysis). While cost-effectiveness studies can be used by policy makers and payers to inform decisions about coverage and reimbursement for new therapies, the assessment of healthcare can guide improvements in the delivery of healthcare to achieve better outcomes at lower cost. Comparison on value allows for the identification of healthcare delivery organisations or care delivery settings where patient outcomes have been optimised at a lower cost. Gaps remain in the measurement of healthcare value, particularly as it relates to patient-reported health status (symptoms, functional status and health-related quality of life). The use of technology platforms that capture health status measures with minimal disruption to clinical workflow (ie, web portals, automated telephonic systems and tablets to facilitate capture outside of in-person clinical interaction) is facilitating use of health status measures to improve clinical care and optimise patient outcomes. Furthermore, the use of a value framework has catalysed quality improvement efforts and research to seek better patient outcomes at lower cost. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
ERIC Educational Resources Information Center
Mooij, Ton
2004-01-01
Specific combinations of educational and ICT conditions including computer use may optimise learning processes, particularly for learners at risk. This position paper asks which curricular, instructional, and ICT characteristics can be expected to optimise learning processes and outcomes, and how to best achieve this optimization. A theoretical…
Prior knowledge guided active modules identification: an integrated multi-objective approach.
Chen, Weiqi; Liu, Jing; He, Shan
2017-03-14
Active module, defined as an area in a biological network that shows striking changes in molecular activity or phenotypic signatures, is important to reveal dynamic and process-specific information that is correlated with cellular or disease states. A prior-information-guided active module identification approach is proposed to detect modules that are both active and enriched by prior knowledge. We formulate the active module identification problem as a multi-objective optimisation problem, which consists of two conflicting objective functions: maximising the coverage of known biological pathways and maximising the activity of the module. The network is constructed from a protein-protein interaction database. A beta-uniform-mixture model is used to estimate the distribution of p-values and generate scores for activity measurement from microarray data. A multi-objective evolutionary algorithm is used to search for Pareto optimal solutions. We also incorporate a novel constraint based on algebraic connectivity to ensure the connectedness of the identified active modules. Application of the proposed algorithm on a small yeast molecular network shows that it can identify modules with high activities and with more cross-talk nodes between related functional groups. The Pareto solutions generated by the algorithm provide solutions with different trade-offs between prior knowledge and novel information from data. The approach is then applied to microarray data from diclofenac-treated yeast cells to build a network and identify modules to elucidate the molecular mechanisms of diclofenac toxicity and resistance. Gene ontology analysis is applied to the identified modules for biological interpretation. Integrating knowledge of functional groups into the identification of active modules is an effective method and provides flexible control of the balance between a purely data-driven method and prior information guidance.
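The connectedness constraint mentioned above rests on the algebraic connectivity (the second-smallest eigenvalue of the graph Laplacian, positive exactly when the graph is connected); the toy check below, on a four-node module, is a sketch of that test and is not the authors' code.

    import numpy as np

    def algebraic_connectivity(adj):
        """Second-smallest eigenvalue of the graph Laplacian; > 0 iff the graph is connected."""
        adj = np.asarray(adj, dtype=float)
        laplacian = np.diag(adj.sum(axis=1)) - adj
        return np.sort(np.linalg.eigvalsh(laplacian))[1]

    # Toy candidate module: a path of four proteins (connected)
    path = [[0, 1, 0, 0],
            [1, 0, 1, 0],
            [0, 1, 0, 1],
            [0, 0, 1, 0]]
    # The same nodes with one interaction removed (disconnected)
    split = [[0, 1, 0, 0],
             [1, 0, 0, 0],
             [0, 0, 0, 1],
             [0, 0, 1, 0]]
    print(algebraic_connectivity(path) > 1e-9, algebraic_connectivity(split) > 1e-9)  # True False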
On structural health monitoring of aircraft adhesively bonded repairs
NASA Astrophysics Data System (ADS)
Pavlopoulou, Sofia
The recent interest in life extension of ageing aircraft and the need to address repair challenges in new-generation composite aircraft led to the investigation of new repair methodologies such as adhesively bonded repair patches. The present thesis focuses on structural health monitoring aspects of the repairs, evaluating their performance with guided ultrasonic waves, aiming to develop a monitoring strategy which would eliminate unscheduled maintenance and unnecessary inspection costs. To address the complex nature of the wave propagation phenomena, a finite element based model identified the existing challenges by exploring the interaction of the excitation waves with different levels of damage. The damage sensitivity of the first anti-symmetric mode was numerically investigated. An external bonded patch and a scarf repair were further tested in static and dynamic loading, and their performance was monitored with Lamb waves excited by surface-bonded piezoelectric transducers. The response was processed by means of advanced pattern recognition and data dimension reduction techniques such as novelty detection and principal component analysis. Optimisation of these tools enabled accurate damage detection under complex conditions. The phenomena of mode isolation and precise arrival time determination in a noisy environment, and the problem of inadequate training data, were investigated and solved through appropriate transducer arrangements and advanced signal processing, respectively. The applicability of the established techniques was demonstrated on an aluminium repaired helicopter tail stabilizer. Each case study utilised alternative non-destructive techniques for validation, such as 3D digital image correlation, X-ray radiography and thermography. Finally, a feature selection strategy was developed through the analysis of the instantaneous properties of guided waves for damage detection purposes.
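One common way to combine principal component analysis with novelty detection, sketched here on synthetic feature vectors rather than the thesis data: fit PCA on baseline (undamaged) features, score new measurements by the norm of their residual outside the retained subspace, and flag exceedances of a baseline-derived threshold.

    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic feature vectors (e.g. windowed wave-packet energies) for the baseline state
    baseline = rng.normal(size=(200, 12))
    mean = baseline.mean(axis=0)
    _, _, vt = np.linalg.svd(baseline - mean, full_matrices=False)
    components = vt[:3]                      # retain the first three principal components

    def novelty_index(x):
        """Norm of the part of x not explained by the baseline PCA subspace."""
        centred = x - mean
        return np.linalg.norm(centred - components.T @ (components @ centred))

    threshold = np.percentile([novelty_index(x) for x in baseline], 99)

    # A 'damaged' measurement: baseline-like features plus a shift in two channels
    shift = np.zeros(12)
    shift[2:4] = 4.0
    damaged = rng.normal(size=12) + shift
    print(novelty_index(damaged) > threshold)   # usually True for this synthetic shift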
Bourne, Richard S; Shulman, Rob; Tomlin, Mark; Borthwick, Mark; Berry, Will; Mills, Gary H
2017-04-01
To identify between and within profession-rater reliability of clinical impact grading for common critical care prescribing error and optimisation cases. To identify representative clinical impact grades for each individual case. Electronic questionnaire. 5 UK NHS Trusts. 30 Critical care healthcare professionals (doctors, pharmacists and nurses). Participants graded severity of clinical impact (5-point categorical scale) of 50 error and 55 optimisation cases. Case between and within profession-rater reliability and modal clinical impact grading. Between and within profession rater reliability analysis used linear mixed model and intraclass correlation, respectively. The majority of error and optimisation cases (both 76%) had a modal clinical severity grade of moderate or higher. Error cases: doctors graded clinical impact significantly lower than pharmacists (-0.25; P < 0.001) and nurses (-0.53; P < 0.001), with nurses significantly higher than pharmacists (0.28; P < 0.001). Optimisation cases: doctors graded clinical impact significantly lower than nurses and pharmacists (-0.39 and -0.5; P < 0.001, respectively). Within profession reliability grading was excellent for pharmacists (0.88 and 0.89; P < 0.001) and doctors (0.79 and 0.83; P < 0.001) but only fair to good for nurses (0.43 and 0.74; P < 0.001), for optimisation and error cases, respectively. Representative clinical impact grades for over 100 common prescribing error and optimisation cases are reported for potential clinical practice and research application. The between professional variability highlights the importance of multidisciplinary perspectives in assessment of medication error and optimisation cases in clinical practice and research. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
NASA Astrophysics Data System (ADS)
Xiao, Long; Liu, Xinggao; Ma, Liang; Zhang, Zeyin
2018-03-01
Dynamic optimisation problem with characteristic times, widely existing in many areas, is one of the frontiers and hotspots of dynamic optimisation researches. This paper considers a class of dynamic optimisation problems with constraints that depend on the interior points either fixed or variable, where a novel direct pseudospectral method using Legendre-Gauss (LG) collocation points for solving these problems is presented. The formula for the state at the terminal time of each subdomain is derived, which results in a linear combination of the state at the LG points in the subdomains so as to avoid the complex nonlinear integral. The sensitivities of the state at the collocation points with respect to the variable characteristic times are derived to improve the efficiency of the method. Three well-known characteristic time dynamic optimisation problems are solved and compared in detail among the reported literature methods. The research results show the effectiveness of the proposed method.
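The terminal-state relation described above amounts to writing the state at the end of a subdomain as the initial state plus a Legendre-Gauss quadrature of the dynamics, i.e. a linear combination of values at the LG points. The toy sketch below illustrates this on dx/dt = -x, using the known exact solution at the nodes instead of solving the collocation equations.

    import numpy as np

    # Toy dynamics dx/dt = f(x, t) = -x on [0, 2], x(0) = 1, exact solution exp(-t)
    t0, tf, x0 = 0.0, 2.0, 1.0
    f = lambda x, t: -x

    # Legendre-Gauss nodes/weights on [-1, 1], mapped to [t0, tf]
    tau, w = np.polynomial.legendre.leggauss(8)
    t_nodes = 0.5 * (tf - t0) * (tau + 1.0) + t0

    # x(tf) = x(t0) + ((tf - t0)/2) * sum_i w_i * f(x(t_i), t_i)
    x_nodes = np.exp(-t_nodes)                     # state at the LG points (exact here)
    x_tf = x0 + 0.5 * (tf - t0) * np.sum(w * f(x_nodes, t_nodes))
    print(x_tf, np.exp(-tf))                       # quadrature value vs exact terminal state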
Scobbie, Lesley; Duncan, Edward A; Brady, Marian C; Wyke, Sally
2015-01-01
We investigated the nature of services providing community-based stroke rehabilitation across the UK, and goal setting practice used within them, to inform evaluation of a goal setting and action planning (G-AP) framework. We designed, piloted and electronically distributed a survey to health professionals working in community-based stroke rehabilitation settings across the UK. We optimised recruitment using a multi-faceted strategy. Responses were analysed from 437 services. Service size, composition and input were highly variable; however, most were multi-disciplinary (82%; n = 335/407) and provided input to a mixed diagnostic group of patients (71%; n = 312/437). Ninety one percent of services (n = 358/395) reported setting goals with "all" or "most" stroke survivors. Seventeen percent (n = 65/380) reported that no methods were used to guide goal setting practice; 47% (n = 148/315) reported use of informal methods only. Goal setting practice varied, e.g. 98% of services (n = 362/369) reported routinely asking patients about goal priorities; 39% (n = 141/360) reported routinely providing patients with a copy of their goals. Goal setting is embedded within community-based stroke rehabilitation; however, practice varies and is potentially sub-optimal. Further evaluation of the G-AP framework is warranted to inform optimal practice. Evaluation design will take account of the diverse service models that exist. Implications for Rehabilitation Community-based stroke rehabilitation services across the UK are diverse and tend to see a mixed diagnostic group of patients. Goal setting is implemented routinely within community-based stroke rehabilitation services; however, practice is variable and potentially sub-optimal. Further evaluation of the G-AP framework is warranted to assess its effectiveness in practice.
Roethke, M C; Kuru, T H; Schultze, S; Tichy, D; Kopp-Schneider, A; Fenchel, M; Schlemmer, H-P; Hadaschik, B A
2014-02-01
To evaluate the Prostate Imaging Reporting and Data System (PI-RADS) proposed by the European Society of Urogenital Radiology (ESUR) for detection of prostate cancer (PCa) by multiparametric magnetic resonance imaging (mpMRI) in a consecutive cohort of patients with magnetic resonance/transrectal ultrasound (MR/TRUS) fusion-guided biopsy. Suspicious lesions on mpMRI at 3.0 T were scored according to the PI-RADS system before MR/TRUS fusion-guided biopsy and correlated to histopathology results. Statistical correlation was obtained by a Mann-Whitney U test. Receiver operating characteristics (ROC) and optimal thresholds were calculated. In 64 patients, 128/445 positive biopsy cores were obtained out of 95 suspicious regions of interest (ROIs). PCa was present in 27/64 (42%) of the patients. ROC results for the aggregated PI-RADS scores exhibited higher areas under the curve compared to those of the Likert score. Sensitivity/Specificity for the following thresholds were calculated: 85 %/73 % and 67 %/92 % for PI-RADS scores of 9 and 10, respectively; 85 %/60 % and 56 %/97 % for Likert scores of 3 and 4, respectively [corrected]. The standardised ESUR PI-RADS system is beneficial to indicate the likelihood of PCa of suspicious lesions on mpMRI. It is also valuable to identify locations to be targeted with biopsy. The aggregated PI-RADS score achieved better results compared to the single five-point Likert score. • The ESUR PI-RADS scoring system was evaluated using multiparametric 3.0-T MRI. • To investigate suspicious findings, transperineal MR/TRUS fusion-guided biopsy was used. • PI-RADS can guide biopsy locations and improve detection of clinically significant cancer. • Biopsy procedures can be optimised, reducing unnecessary negative biopsies for patients. • The PI-RADS scoring system may contribute to more effective prostate MRI.
2014-01-01
Background: The potential of clinical practice guidelines has not been realized due to inconsistent adoption in clinical practice. Optimising intrinsic characteristics of guidelines (e.g., their wording and format) that are associated with uptake (as perceived by their end users) may have potential. Using findings from a realist review on guideline uptake and consultation with experts in guideline development, we designed a conceptual version of a future tool called the Guideline Implementability Tool (GUIDE-IT). The tool will aim to involve family physicians in the guideline development process by providing a process to assess draft guideline recommendations. This feedback will then be given back to developers to consider when finalizing the recommendations. As guideline characteristics are best assessed by end users, the objectives of the current study were to explore how family physicians perceive guideline implementability, and to determine what components should comprise the final GUIDE-IT prototype. Methods: We conducted a qualitative study with family physicians in Toronto, Ontario. Two experienced investigators conducted one-hour interviews with family physicians using a semi-structured interview guide to 1) elicit feedback on perceptions of guideline implementability; 2) generate a discussion in response to three draft recommendations; and 3) obtain feedback on the conceptual GUIDE-IT. Sessions were audio taped and transcribed verbatim. Data collection and analysis were guided by content analyses. Results: 20 family physicians participated. They perceived guideline uptake according to facilitators and barriers across 6 categories of guideline implementability (format, content, language, usability, development, and the practice environment). Participants’ feedback on the 3 draft guideline recommendations was grouped according to guideline perception, cognition, and agreement. When asked to comment on GUIDE-IT, most respondents believed that the tool would be useful, but urged the involvement of “regular” or community family physicians in the process, and suggested that an online system would be the most efficient way to deliver it. Conclusions: Our study identified facilitators and barriers of guideline implementability from the perspective of community and academic family physicians that will be used to build our GUIDE-IT prototype. Our findings build on current knowledge by showing that family physicians perceive guideline uptake mostly according to factors that are in the control of guideline developers. PMID:24476491
Le, Van So; Do, Zoe Phuc-Hien; Le, Minh Khoi; Le, Vicki; Le, Natalie Nha-Truc
2014-06-10
Methods of increasing the performance of radionuclide generators used in nuclear medicine radiotherapy and SPECT/PET imaging were developed and detailed, taking 99Mo/99mTc and 68Ge/68Ga radionuclide generators as cases. Optimisation methods of the daughter nuclide build-up versus stand-by time and/or specific activity using mean progress functions were developed for increasing the performance of radionuclide generators. As a result of this optimisation, the separation of the daughter nuclide from its parent should be performed at a defined optimal time to avoid deterioration in the specific activity of the daughter nuclide and wasted stand-by time of the generator, while the daughter nuclide yield is maintained at a reasonably high level. A new characteristic parameter of the formation-decay kinetics of the parent/daughter nuclide system was found and effectively used in the practice of generator production and utilisation. A method of "early elution schedule" was also developed for increasing the daughter nuclide production yield and specific radioactivity, thus saving the cost of the generator and improving the quality of the daughter radionuclide solution. These newly developed optimisation methods, in combination with a recently developed integrated elution-purification-concentration system for radionuclide generators, are the most suitable way to operate the generator effectively, on the basis of economic use and improvement of the quality and specific activity of the produced daughter radionuclides. All these features benefit the economic use of the generator, the improved quality of labelling/scanning, and the lowered cost of nuclear medicine procedures. In addition, a new method of quality control (QC) protocol set-up for post-delivery testing of radionuclidic purity has been developed, based on the relationship between the gamma-ray spectrometric detection limit, the required limit of impure radionuclide activity and its measurement certainty, with respect to optimising the decay/measurement time and the product sample activity used for QC. The optimisation ensures certainty of measurement of the specific impure radionuclide and avoids wasting a useful amount of the valuable purified/concentrated daughter nuclide product. This process is important for the spectrometric measurement of very low-activity impure radionuclide contamination in radioisotope products of much higher activity used in medical imaging and targeted radiotherapy.
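The build-up versus stand-by-time trade-off can be illustrated with the standard parent-daughter (Bateman) relation; for a 99Mo/99mTc generator the daughter activity after a complete elution peaks near t = ln(λd/λp)/(λd − λp). The half-lives below are assumed nominal values and the branching fraction of 99Mo decay to 99mTc is ignored, so this is only a sketch of the principle.

    import math

    T_HALF_MO99 = 65.94   # h, parent half-life (assumed nominal value)
    T_HALF_TC99M = 6.01   # h, daughter half-life (assumed nominal value)
    lp = math.log(2) / T_HALF_MO99
    ld = math.log(2) / T_HALF_TC99M

    def daughter_activity(t, parent_a0=1.0):
        """Bateman relation: relative 99mTc activity t hours after a complete elution
        (branching fraction of the 99Mo decay ignored)."""
        return parent_a0 * ld / (ld - lp) * (math.exp(-lp * t) - math.exp(-ld * t))

    t_opt = math.log(ld / lp) / (ld - lp)
    print(f"optimal elution time ~ {t_opt:.1f} h, relative 99mTc activity {daughter_activity(t_opt):.2f}")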
NASA Astrophysics Data System (ADS)
Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.
2017-09-01
In this study, Computer Aided Engineering was used to simulate injection moulding. A Design of Experiments (DOE) approach was applied according to a Latin Square orthogonal array, and the relationships between the injection moulding parameters and warpage were identified from the experimental data. Response Surface Methodology (RSM) was used to validate the model accuracy. The RSM and genetic algorithm (GA) methods were then combined to determine the optimum injection moulding process parameters. The resulting optimisation of injection moulding is substantially improved, and the results show increased accuracy and reliability. The proposed combined RSM-GA method also contributes to minimising the occurrence of warpage.
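The sketch below illustrates the general RSM-plus-global-search idea described above: a reduced quadratic response surface is fitted to hypothetical DOE data and then searched with scipy's differential evolution, which stands in for the genetic algorithm. All parameter names, ranges and warpage values are invented for illustration.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical L9-style DOE results: columns = [melt temp (C), packing pressure (MPa),
# cooling time (s)], response y = simulated warpage (mm). All values are illustrative.
X = np.array([[200., 60., 10.], [200., 80., 15.], [200., 100., 20.],
              [220., 60., 15.], [220., 80., 20.], [220., 100., 10.],
              [240., 60., 20.], [240., 80., 10.], [240., 100., 15.]])
y = np.array([0.42, 0.38, 0.35, 0.36, 0.33, 0.31, 0.32, 0.30, 0.27])

def quad_features(x):
    x = np.atleast_2d(x)
    # reduced quadratic response surface: constant + linear + pure quadratic terms
    return np.hstack([np.ones((len(x), 1)), x, x**2])

beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)   # fit the RSM model

def predicted_warpage(x):
    return (quad_features(x) @ beta)[0]

# A GA-style global optimiser (differential evolution stands in for the GA here)
# searches the fitted surface for the parameter set minimising predicted warpage.
bounds = [(200, 240), (60, 100), (10, 20)]
result = differential_evolution(predicted_warpage, bounds, seed=1)
print("Predicted optimum parameters:", result.x, " warpage:", result.fun)
```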
On the analysis of using 3-coil wireless power transfer system in retinal prosthesis.
Bai, Shun; Skafidas, Stan
2014-01-01
Designing a wireless power transmission system (WPTS) using inductive coupling has been investigated extensively over the last decade. Depending on the configuration of the coupling system, various design methods have been proposed to optimise the power transmission efficiency, based on the tuning circuitry, quality-factor optimisation and geometrical configuration. Recently, a 3-coil WPTS was introduced in retinal prosthesis to overcome the low power transfer efficiency caused by the low coupling coefficient. Here we present a method to analyse this 3-coil WPTS using the S-parameters, to directly obtain the maximum achievable power transfer efficiency. Through electromagnetic simulation, we raise a question regarding the conditions under which a 3-coil WPTS actually improves the powering of retinal prostheses.
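A minimal sketch of the kind of S-parameter analysis mentioned above: the Rollett stability factor and the maximum available gain of a two-port, computed from textbook microwave relations. The S-parameter values are invented for illustration and are not the link analysed in the paper.

```python
import numpy as np

def max_available_gain(s11, s12, s21, s22):
    """Rollett stability factor K and maximum available power gain of a two-port
    described by its S-parameters (standard microwave relations; illustrative only)."""
    delta = s11 * s22 - s12 * s21
    k = (1 - abs(s11)**2 - abs(s22)**2 + abs(delta)**2) / (2 * abs(s12 * s21))
    if k <= 1:
        raise ValueError("Two-port not unconditionally stable; MAG is undefined here.")
    mag = abs(s21) / abs(s12) * (k - np.sqrt(k**2 - 1))
    return k, mag

# Illustrative (made-up) S-parameters of an inductive link reduced to a two-port
# at its operating frequency; they are not taken from the paper.
k, mag = max_available_gain(0.60 - 0.20j, 0.30 + 0.05j, 0.30 + 0.05j, 0.70 + 0.10j)
print(f"K = {k:.2f}, maximum achievable efficiency = {100 * mag:.1f} %")
```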
Optimised design for a 1 kJ diode-pumped solid-state laser system
NASA Astrophysics Data System (ADS)
Mason, Paul D.; Ertel, Klaus; Banerjee, Saumyabrata; Phillips, P. Jonathan; Hernandez-Gomez, Cristina; Collier, John L.
2011-06-01
A conceptual design for a kJ-class diode-pumped solid-state laser (DPSSL) system based on cryogenic gas-cooled multislab ceramic Yb:YAG amplifier technology has been developed at the STFC as a building block towards a MJ-class source for inertial fusion energy (IFE) projects such as HiPER. In this paper, we present an overview of an amplifier design optimised for efficient generation of 1 kJ nanosecond pulses at 10 Hz repetition rate. In order to confirm the viability of this technology, a prototype version of this amplifier scaled to deliver 10 J at 10 Hz, DiPOLE, is under development at the Central Laser Facility. A progress update on the status of this system is also presented.
Energy efficiency in membrane bioreactors.
Barillon, B; Martin Ruel, S; Langlais, C; Lazarova, V
2013-01-01
Energy consumption remains the key factor for the optimisation of the performance of membrane bioreactors (MBRs). This paper presents the results of the detailed energy audits of six full-scale MBRs operated by Suez Environnement in France, Spain and the USA based on on-site energy measurement and analysis of plant operation parameters and treatment performance. Specific energy consumption is compared for two different MBR configurations (flat sheet and hollow fibre membranes) and for plants with different design, loads and operation parameters. The aim of this project was to understand how the energy is consumed in MBR facilities and under which operating conditions, in order to finally provide guidelines and recommended practices for optimisation of MBR operation and design to reduce energy consumption and environmental impacts.
Sybil--efficient constraint-based modelling in R.
Gelius-Dietrich, Gabriel; Desouki, Abdelmoneim Amer; Fritzemeier, Claus Jonathan; Lercher, Martin J
2013-11-13
Constraint-based analyses of metabolic networks are widely used to simulate the properties of genome-scale metabolic networks. Publicly available implementations tend to be slow, impeding large scale analyses such as the genome-wide computation of pairwise gene knock-outs, or the automated search for model improvements. Furthermore, available implementations cannot easily be extended or adapted by users. Here, we present sybil, an open source software library for constraint-based analyses in R; R is a free, platform-independent environment for statistical computing and graphics that is widely used in bioinformatics. Among other functions, sybil currently provides efficient methods for flux-balance analysis (FBA), MOMA, and ROOM that are about ten times faster than previous implementations when calculating the effect of whole-genome single gene deletions in silico on a complete E. coli metabolic model. Due to the object-oriented architecture of sybil, users can easily build analysis pipelines in R or even implement their own constraint-based algorithms. Based on its highly efficient communication with different mathematical optimisation programs, sybil facilitates the exploration of high-dimensional optimisation problems on small time scales. Sybil and all its dependencies are open source. Sybil and its documentation are available for download from the comprehensive R archive network (CRAN).
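Flux-balance analysis reduces to a linear programme: maximise an objective flux subject to steady-state mass balance Sv = 0 and flux bounds. The sketch below solves a toy two-metabolite network with scipy's LP solver; it only illustrates the underlying optimisation problem and does not use sybil (an R package) or a genome-scale model.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network with metabolites A and B and four irreversible reactions:
# R1: -> A (uptake), R2: A -> B, R3: A -> B (alternative route), R4: B -> (export/biomass)
S = np.array([[1, -1, -1,  0],    # steady-state mass balance of A
              [0,  1,  1, -1]])   # steady-state mass balance of B
bounds = [(0, 10), (0, 5), (0, 5), (0, 1000)]   # flux bounds (uptake capped at 10)

c = np.zeros(4)
c[3] = -1.0                       # maximise flux through R4 (linprog minimises)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("Optimal export flux:", -res.fun)   # expect 10: both internal routes run at 5
print("Flux distribution:", res.x)
```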
NASA Astrophysics Data System (ADS)
Cobden, L. J.
2017-12-01
Mineral physics provides the essential link between seismic observations of the Earth's interior, and laboratory (or computer-simulated) measurements of rock properties. In this presentation I will outline the procedure for quantitative conversion from thermochemical structure to seismic structure (and vice versa) using the latest datasets from seismology and mineralogy. I will show examples of how this method can allow us to infer major chemical and dynamic properties of the deep mantle. I will also indicate where uncertainties and limitations in the data require us to exercise caution, in order not to "over-interpret" seismic observations. Understanding and modelling these uncertainties serves as a useful guide for mineralogists to ascertain which mineral parameters are most useful in seismic interpretation, and enables seismologists to optimise their data assembly and inversions for quantitative interpretations.
[Leadership strategies--the Bible as a guide to management in the health care system].
Kudlacek, Stefan; Meran, Johannes G
2006-06-01
Management and leadership are an integral part of any organisation, serving to optimise procedures and increase efficiency. Aims, ideals and structures first need to be defined for tasks to be carried out successfully, particularly in difficult times. Religion provides a good example of how communities can effectively and with conviction pass on their values and standpoints from generation to generation, grow in strength and also influence their surroundings. This paper focuses on leadership provided by charismatic personalities within the Jewish and Christian religions. Monasteries have run hospitals without governmental support ever since the Middle Ages. Leadership within today's health care system calls for a variety of strategies in the different phases of development. In times of limited resources and multifarious societies, leadership implies both a scientific as well as an ethical challenge.
NASA Astrophysics Data System (ADS)
Biscarros, D.; Cantenot, C.; Séronie-Vivien, J.; Schmidt, G.
AstroBus on-board software is customisable software for ERC32-based avionics implementing standard ESA Packet Utilization Standard (PUS) functions. Its architecture, based on generic design templates and relying on a library providing standard PUS TC, TM and event services, enhances its reusability across various programs. Finally, the AstroBus on-board software development and validation environment is based on latest-generation tools providing an optimised customisation environment.
ERIC Educational Resources Information Center
Redshaw, Clare H; Frampton, Ian
2014-01-01
As the value of multi-disciplinary working in the business and research worlds is becoming more recognised, the number of inter-disciplinary postgraduate environmental and health sciences courses is also increasing. Equally, the popularity of problem-based learning (PBL) is expected to grow and influence instructional approaches in many…
Energy and wear optimisation of train longitudinal dynamics and of traction and braking systems
NASA Astrophysics Data System (ADS)
Conti, R.; Galardi, E.; Meli, E.; Nocciolini, D.; Pugi, L.; Rindi, A.
2015-05-01
Traction and braking systems deeply affect longitudinal train dynamics, especially when an extensive blending phase among different pneumatic, electric and magnetic devices is required. The energy and wear optimisation of longitudinal vehicle dynamics has a crucial economic impact and involves several engineering problems such as wear of braking friction components, energy efficiency, thermal load on components, and the level of safety under degraded adhesion conditions (often constrained by the regulations currently in force on signalling or other safety-related subsystems). In fact, the application of energy storage systems can lead to an efficiency improvement of at least 10%, while, as regards wear reduction, the improvement due to distributed traction systems and to optimised traction devices can be quantified at about 50%. In this work, an innovative integrated procedure is proposed by the authors to optimise longitudinal train dynamics and traction and braking manoeuvres in terms of both energy and wear. The new approach has been applied to existing test cases and validated with experimental data provided by Breda; for some components and their homologation process, the experimental results derive from cooperation with relevant industrial partners such as Trenitalia and Italcertifer. In particular, the simulation results refer to tests performed on a high-speed train (Ansaldo Breda EMU V250) and on a tram (Ansaldo Breda Sirio Tram). The proposed approach is based on a modular simulation platform in which the sub-models corresponding to different subsystems can be easily customised, depending on the considered application, on the availability of technical data and on the homologation process of the different components.
Fabrication of Organic Radar Absorbing Materials: A Report on the TIF Project
2005-05-01
thickness, permittivity and permeability. The ability to measure the permittivity and permeability is an essential requirement for designing an optimised...absorber. And good optimisation codes are required in order to achieve the best possible absorber designs. In this report, the results from a...through measurement of their conductivity and permittivity at microwave frequencies. Methods were then developed for optimising the design of
NASA Astrophysics Data System (ADS)
Behera, Kishore Kumar; Pal, Snehanshu
2018-03-01
This paper describes a new approach towards optimum utilisation of the ferrochrome added during stainless steel making in an AOD converter. The objective of the optimisation is to enhance the end-blow chromium content of the steel and reduce the ferrochrome addition during refining. By developing a thermodynamics-based mathematical model, a study has been conducted to compute the optimum trade-off between ferrochrome addition and end-blow chromium content of stainless steel using a predator-prey genetic algorithm trained on 100 data sets, considering different input and output variables such as oxygen, argon and nitrogen blowing rates, duration of blowing, initial bath temperature, chromium and carbon content, and the weight of ferrochrome added during refining. Optimisation is performed within constraints imposed on the input parameters, whose values must fall within certain ranges. The analysis of the Pareto fronts is observed to generate a set of feasible optimal solutions between the two conflicting objectives that provides an effective guideline for better ferrochrome utilisation. It is found that, beyond a certain critical range, further addition of ferrochrome does not affect the chromium percentage of the steel. A single-variable response analysis is performed to study the variation and interaction of all individual input parameters on the output variables.
Rotational degree-of-freedom synthesis: An optimised finite difference method for non-exact data
NASA Astrophysics Data System (ADS)
Gibbons, T. J.; Öztürk, E.; Sims, N. D.
2018-01-01
Measuring the rotational dynamic behaviour of a structure is important for many areas of dynamics such as passive vibration control, acoustics, and model updating. Specialist and dedicated equipment is often needed, unless the rotational degree-of-freedom is synthesised based upon translational data. However, this involves numerically differentiating the translational mode shapes to approximate the rotational modes, for example using a finite difference algorithm. A key challenge with this approach is choosing the measurement spacing between the data points, an issue which has often been overlooked in the published literature. The present contribution will for the first time prove that the use of a finite difference approach can be unstable when using non-exact measured data and a small measurement spacing, for beam-like structures. Then, a generalised analytical error analysis is used to propose an optimised measurement spacing, which balances the numerical error of the finite difference equation with the propagation error from the perturbed data. The approach is demonstrated using both numerical and experimental investigations. It is shown that by obtaining a small number of test measurements it is possible to optimise the measurement accuracy, without any further assumptions on the boundary conditions of the structure.
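The trade-off described above, truncation error shrinking and noise amplification growing as the spacing decreases, can be illustrated with the generic error balance for a second-order central difference, E(h) ≈ h²M/6 + ε/h, whose minimiser is h* = (3ε/M)^(1/3). This is a textbook illustration under simple worst-case assumptions, not the generalised analysis or optimal spacing derived in the paper; the function, noise level and curvature bound below are synthetic.

```python
import numpy as np

def optimal_spacing(third_deriv_bound, noise_amp):
    """Spacing h minimising the generic worst-case error bound
    E(h) = h**2 * M / 6 (truncation) + eps / h (noise propagation)
    for a second-order central difference of the first derivative."""
    return (3.0 * noise_amp / third_deriv_bound) ** (1.0 / 3.0)

# Synthetic "mode shape" f(x) = sin(5x): |f'''| <= 125; measurement noise amplitude eps.
M, eps = 125.0, 1e-3
h_star = optimal_spacing(M, eps)

rng = np.random.default_rng(0)
x0, true_slope = 0.3, 5 * np.cos(5 * 0.3)
for h in (h_star / 10, h_star, 10 * h_star):
    noisy = np.sin(5 * (x0 + np.array([-h, h]))) + rng.normal(0, eps, 2)
    estimate = (noisy[1] - noisy[0]) / (2 * h)      # central difference from noisy samples
    print(f"h = {h:.4f}   |error| = {abs(estimate - true_slope):.4f}")
```

Spacings much smaller than h* amplify the measurement noise, while much larger spacings are dominated by truncation error, which is the behaviour the paper exploits to select the measurement spacing.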
A novel swarm intelligence algorithm for finding DNA motifs.
Lei, Chengwei; Ruan, Jianhua
2009-01-01
Discovering DNA motifs from co-expressed or co-regulated genes is an important step towards deciphering complex gene regulatory networks and understanding gene functions. Despite significant improvement in the last decade, it still remains one of the most challenging problems in computational molecular biology. In this work, we propose a novel motif finding algorithm that finds consensus patterns using a population-based stochastic optimisation technique called Particle Swarm Optimisation (PSO), which has been shown to be effective in optimising difficult multidimensional problems in continuous domains. We propose to use a word dissimilarity graph to remap the neighborhood structure of the solution space of DNA motifs, and propose a modification of the naive PSO algorithm to accommodate discrete variables. In order to improve efficiency, we also propose several strategies for escaping from local optima and for automatically determining the termination criteria. Experimental results on simulated challenge problems show that our method is both more efficient and more accurate than several existing algorithms. Applications to several sets of real promoter sequences also show that our approach is able to detect known transcription factor binding sites, and outperforms two of the most popular existing algorithms.
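For reference, the sketch below shows the standard continuous PSO velocity/position update that underlies the approach; the paper's contribution, remapping the discrete motif space through a word-dissimilarity graph and modifying the update accordingly, is not reproduced here, and the objective is a toy function.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal continuous PSO: inertia plus attraction to personal and global bests."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))            # particle positions
    v = np.zeros_like(x)                                  # particle velocities
    pbest = x.copy()
    pbest_val = np.apply_along_axis(objective, 1, x)
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, pbest_val.min()

best, score = pso(lambda z: np.sum(z**2), dim=4)          # toy objective for illustration
print("Best position:", np.round(best, 4), " score:", score)
```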
Prediction of road traffic death rate using neural networks optimised by genetic algorithm.
Jafari, Seyed Ali; Jahandideh, Sepideh; Jahandideh, Mina; Asadabadi, Ebrahim Barzegari
2015-01-01
Road traffic injuries (RTIs) are recognised as a main cause of public health problems at global, regional and national levels. Prediction of the road traffic death rate will therefore be helpful in its management. Based on this fact, we used an artificial neural network model optimised through a genetic algorithm to predict mortality. In this study, a five-fold cross-validation procedure on a data set containing a total of 178 countries was used to verify the performance of the models. The best-fit model was selected according to the root mean square error (RMSE). The genetic algorithm, a powerful model which has not previously been applied to the prediction of mortality to this extent, showed high performance. The lowest RMSE obtained was 0.0808. Such satisfactory results can be attributed to the use of the genetic algorithm as a powerful optimiser that selects the best input feature set to be fed into the neural networks. Seven factors were identified with high accuracy as the most influential on the road traffic mortality rate. The results show that our model is very promising and may play a useful role in developing a better method for assessing the influence of road traffic mortality risk factors.
Calibration of phoswich-based lung counting system using realistic chest phantom.
Manohari, M; Mathiyarasu, R; Rajagopal, V; Meenakshisundaram, V; Indira, R
2011-03-01
A phoswich detector, housed inside a low-background steel room and coupled with state-of-the-art pulse shape discrimination (PSD) electronics, was recently established at the Radiological Safety Division of IGCAR for in vivo monitoring of actinides. The various parameters of the PSD electronics were optimised to achieve efficient background reduction in the low-energy regions. The PSD with optimised parameters reduced the steel room background from 9.5 to 0.28 cps in the 17 keV region and from 5.8 to 0.3 cps in the 60 keV region. The figure of merit for the timing spectrum of the system is 3.0. The true signal loss due to PSD was found to be less than 2%. The phoswich system was calibrated with the Lawrence Livermore National Laboratory realistic chest phantom loaded with a (241)Am-tagged lung set. Calibration factors for varying chest wall composition and chest wall thickness, expressed in terms of muscle-equivalent chest wall thickness, were established. The (241)Am activity in the JAERI phantom, which was received as part of an IAEA inter-comparison exercise, was estimated. This paper presents the optimisation of the PSD electronics and the salient results of the calibration.
Application of the adjoint optimisation of shock control bump for ONERA-M6 wing
NASA Astrophysics Data System (ADS)
Nejati, A.; Mazaheri, K.
2017-11-01
This article is devoted to the numerical investigation of the shock wave/boundary layer interaction (SWBLI) as the main factor influencing the aerodynamic performance of transonic bumped airfoils and wings. The numerical analysis is conducted for the ONERA-M6 wing through a shock control bump (SCB) shape optimisation process using the adjoint optimisation method. SWBLI is analysed for both clean and bumped airfoils and wings, and it is shown how the modified wave structure originating upstream of the SCB reduces the wave drag by improving the boundary layer velocity profile downstream of the shock wave. The numerical simulation of the turbulent viscous flow and a gradient-based adjoint algorithm are used to find the optimum location and shape of the SCB for the ONERA-M6 airfoil and wing. Two different geometrical models are introduced for the 3D SCB, one with linear variations and another with periodic variations. Both configurations result in drag reduction and improvement in aerodynamic efficiency, but the periodic model is more effective. Although the three-dimensional flow structure involves many more complexities, the overall results are shown to be similar to the two-dimensional case.
Martens, Leon; Goode, Grahame; Wold, Johan F. H.; Beck, Lionel; Martin, Georgina; Perings, Christian; Stolt, Pelle; Baggerman, Lucas
2014-01-01
Aims To conduct a pilot study on the potential to optimise care pathways in syncope/Transient Loss of Consciousness management by using Lean Six Sigma methodology while maintaining compliance with ESC and/or NICE guidelines. Methods Five hospitals in four European countries took part. The Lean Six Sigma methodology consisted of 3 phases: 1) Assessment phase, in which baseline performance was mapped in each centre, processes were evaluated and a new operational model was developed with an improvement plan that included best practices and change management; 2) Improvement phase, in which optimisation pathways and standardised best practice tools and forms were developed and implemented. Staff were trained on new processes and change-management support provided; 3) Sustaining phase, which included support, refinement of tools and metrics. The impact of the implementation of new pathways was evaluated on number of tests performed, diagnostic yield, time to diagnosis and compliance with guidelines. One hospital with focus on geriatric populations was analysed separately from the other four. Results With the new pathways, there was a 59% reduction in the average time to diagnosis (p = 0.048) and a 75% increase in diagnostic yield (p = 0.007). There was a marked reduction in repetitions of diagnostic tests and improved prioritisation of indicated tests. Conclusions Applying a structured Lean Six Sigma based methodology to pathways for syncope management has the potential to improve time to diagnosis and diagnostic yield. PMID:24927475
Probabilistic Sizing and Verification of Space Ceramic Structures
NASA Astrophysics Data System (ADS)
Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit
2012-07-01
Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
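The core of the probabilistic sizing argument is the Weibull weakest-link model, in which the failure probability of a uniformly stressed part follows P_f = 1 - exp(-(V/V0)(sigma/sigma0)^m). The sketch below evaluates this relation for illustrative material parameters; the values are not Zerodur or SiC data, and proof-testing or multiaxial stress corrections are not included.

```python
import numpy as np

def weibull_failure_probability(stress, sigma0, m, volume_ratio=1.0):
    """Two-parameter Weibull probability of failure of a uniformly stressed brittle part:
    P_f = 1 - exp(-(V/V0) * (sigma/sigma0)**m)."""
    return 1.0 - np.exp(-volume_ratio * (np.asarray(stress, dtype=float) / sigma0) ** m)

# Illustrative material parameters (not Zerodur or SiC data): characteristic strength
# sigma0 in MPa and Weibull modulus m.
sigma0, m = 200.0, 10.0
for s in (50, 100, 150):
    print(f"sigma = {s:3d} MPa  ->  P_f = {weibull_failure_probability(s, sigma0, m):.2e}")
```

The steep dependence on the Weibull modulus m is what makes accurate knowledge of the flaw population, and risk-reduction measures such as proof testing, so important for ceramic parts.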
A Method for Decentralised Optimisation in Networks
NASA Astrophysics Data System (ADS)
Saramäki, Jari
2005-06-01
We outline a method for distributed Monte Carlo optimisation of computational problems in networks of agents, such as peer-to-peer networks of computers. The optimisation and messaging procedures are inspired by gossip protocols and epidemic data dissemination, and are decentralised, i.e. no central overseer is required. In the outlined method, each agent follows simple local rules and seeks better solutions to the optimisation problem by Monte Carlo trials, as well as by querying other agents in its local neighbourhood. With a proper network topology, good solutions spread rapidly through the network for further improvement. Furthermore, the system retains its functionality even in realistic settings where agents are randomly switched on and off.
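A minimal sketch of the scheme described above, under simplifying assumptions (a fixed ring topology, all agents always on, a toy continuous objective): each agent improves its own candidate by Monte Carlo trials and copies a neighbour's solution when the neighbour is doing better.

```python
import numpy as np

def decentralised_mc(objective, n_agents=50, dim=5, steps=300, seed=0):
    """Each agent holds one candidate solution, improves it by local Monte Carlo trials,
    and gossips with a random ring neighbour, copying that solution if it is better.
    No central overseer is involved."""
    rng = np.random.default_rng(seed)
    sols = rng.uniform(-5, 5, (n_agents, dim))
    vals = np.array([objective(s) for s in sols])
    for _ in range(steps):
        for i in range(n_agents):
            trial = sols[i] + rng.normal(0.0, 0.1, dim)       # local Monte Carlo trial
            trial_val = objective(trial)
            if trial_val < vals[i]:
                sols[i], vals[i] = trial, trial_val
            j = (i + rng.choice((-1, 1))) % n_agents          # gossip with a ring neighbour
            if vals[j] < vals[i]:
                sols[i], vals[i] = sols[j].copy(), vals[j]
    return sols[np.argmin(vals)], vals.min()

best, val = decentralised_mc(lambda z: np.sum(z**2))          # toy objective
print("Best objective value found:", val)
```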
Optimising operational amplifiers by evolutionary algorithms and gm/Id method
NASA Astrophysics Data System (ADS)
Tlelo-Cuautle, E.; Sanabria-Borbon, A. C.
2016-10-01
The evolutionary algorithm called the non-dominated sorting genetic algorithm (NSGA-II) is applied herein to the optimisation of operational transconductance amplifiers. NSGA-II is accelerated by applying the gm/Id method to estimate reduced search spaces associated with the widths (W) and lengths (L) of the metal-oxide-semiconductor field-effect transistors (MOSFETs), and to guarantee appropriate bias conditions. In addition, we introduce an integer encoding for the W/L sizes of the MOSFETs to avoid a post-processing step for rounding off their values to multiples of the integrated-circuit fabrication technology grid. Finally, from the feasible solutions generated by NSGA-II, we introduce a second optimisation stage to guarantee that the final feasible W/L solutions support process, voltage and temperature (PVT) variations. The optimisation results lead us to conclude that the gm/Id method and integer encoding are quite useful for accelerating the convergence of the evolutionary algorithm NSGA-II, while the second optimisation stage guarantees the robustness of the feasible solutions to PVT variations.
Optimisation of active suspension control inputs for improved vehicle handling performance
NASA Astrophysics Data System (ADS)
Čorić, Mirko; Deur, Joško; Kasać, Josip; Tseng, H. Eric; Hrovat, Davor
2016-11-01
Active suspension is commonly considered within the framework of vertical vehicle dynamics control aimed at improvements in ride comfort. This paper uses a collocation-type control variable optimisation tool to investigate to what extent the fully active suspension (FAS) application can be broadened to the task of vehicle handling/cornering control. The optimisation approach is first applied to FAS-only actuator configurations and three types of double lane-change manoeuvres. The obtained optimisation results are used to gain insights into the different control mechanisms that FAS uses to improve handling performance in terms of path-following error reduction. For the same manoeuvres, the FAS performance is compared with the performance of different active steering and active differential actuators. The optimisation study is finally extended to combined FAS and active front- and/or rear-steering configurations to investigate whether they can use their complementary control authorities (over the vertical and lateral vehicle dynamics, respectively) to further improve handling performance.
NASA Astrophysics Data System (ADS)
Harré, Michael S.
2013-02-01
Two aspects of modern economic theory have dominated the recent discussion on the state of the global economy: Crashes in financial markets and whether or not traditional notions of economic equilibrium have any validity. We have all seen the consequences of market crashes: plummeting share prices, businesses collapsing and considerable uncertainty throughout the global economy. This seems contrary to what might be expected of a system in equilibrium where growth dominates the relatively minor fluctuations in prices. Recent work from within economics as well as by physicists, psychologists and computational scientists has significantly improved our understanding of the more complex aspects of these systems. With this interdisciplinary approach in mind, a behavioural economics model of local optimisation is introduced and three general properties are proven. The first is that under very specific conditions local optimisation leads to a conventional macro-economic notion of a global equilibrium. The second is that if both global optimisation and economic growth are required then under very mild assumptions market catastrophes are an unavoidable consequence. Third, if only local optimisation and economic growth are required then there is sufficient parametric freedom for macro-economic policy makers to steer an economy around catastrophes without overtly disrupting local optimisation.
Optimisation techniques in vaginal cuff brachytherapy.
Tuncel, N; Garipagaoglu, M; Kizildag, A U; Andic, F; Toy, A
2009-11-01
The aim of this study was to explore whether an in-house dosimetry protocol and optimisation method are able to produce a homogeneous dose distribution in the target volume, and how often optimisation is required in vaginal cuff brachytherapy. Treatment planning was carried out for 109 fractions in 33 patients who underwent high dose rate iridium-192 (Ir(192)) brachytherapy using Fletcher ovoids. Dose prescription and normalisation were performed to catheter-oriented lateral dose points (dps) within a range of 90-110% of the prescribed dose. The in-house vaginal apex point (Vk), alternative vaginal apex point (Vk'), International Commission on Radiation Units and Measurements (ICRU) rectal point (Rg) and bladder point (Bl) doses were calculated. Time-position optimisations were made considering dps, Vk and Rg doses. Keeping the Vk dose higher than 95% and the Rg dose less than 85% of the prescribed dose was intended. Target dose homogeneity, optimisation frequency and the relationship between prescribed dose, Vk, Vk', Rg and ovoid diameter were investigated. The mean target dose was 99+/-7.4% of the prescription dose. Optimisation was required in 92 out of 109 (83%) fractions. Ovoid diameter had a significant effect on Rg (p = 0.002), Vk (p = 0.018), Vk' (p = 0.034), minimum dps (p = 0.021) and maximum dps (p<0.001). Rg, Vk and Vk' doses with 2.5 cm diameter ovoids were significantly higher than with 2 cm and 1.5 cm ovoids. Catheter-oriented dose point normalisation provided a homogeneous dose distribution with a 99+/-7.4% mean dose within the target volume, requiring time-position optimisation.
Holroyd, Kenneth A; Cottrell, Constance K; O'Donnell, Francis J; Cordingley, Gary E; Drew, Jana B; Carlson, Bruce W; Himawan, Lina
2010-09-29
To determine if the addition of preventive drug treatment (β blocker), brief behavioural migraine management, or their combination improves the outcome of optimised acute treatment in the management of frequent migraine. Randomised placebo controlled trial over 16 months from July 2001 to November 2005. Two outpatient sites in Ohio, USA. 232 adults (mean age 38 years; 79% female) with diagnosis of migraine with or without aura according to International Headache Society classification of headache disorders criteria, who recorded at least three migraines with disability per 30 days (mean 5.5 migraines/30 days), during an optimised run-in of acute treatment. Addition of one of four preventive treatments to optimised acute treatment: β blocker (n=53), matched placebo (n=55), behavioural migraine management plus placebo (n=55), or behavioural migraine management plus β blocker (n=69). The primary outcome was change in migraines/30 days; secondary outcomes included change in migraine days/30 days and change in migraine specific quality of life scores. Mixed model analysis showed statistically significant (P≤0.05) differences in outcomes among the four added treatments for both the primary outcome (migraines/30 days) and the two secondary outcomes (change in migraine days/30 days and change in migraine specific quality of life scores). The addition of combined β blocker and behavioural migraine management (-3.3 migraines/30 days, 95% confidence interval -3.2 to -3.5), but not the addition of β blocker alone (-2.1 migraines/30 days, -1.9 to -2.2) or behavioural migraine management alone (-2.2 migraines/30 days, -2.0 to -2.4), improved outcomes compared with optimised acute treatment alone (-2.1 migraines/30 days, -1.9 to -2.2). For a clinically significant (≥50%) reduction in migraines/30 days, the number needed to treat for optimised acute treatment plus combined β blocker and behavioural migraine management was 3.1 compared with optimised acute treatment alone, 2.6 compared with optimised acute treatment plus β blocker, and 3.1 compared with optimised acute treatment plus behavioural migraine management. Results were consistent for the two secondary outcomes, and at both month 10 (the primary endpoint) and month 16. The addition of combined β blocker plus behavioural migraine management, but not the addition of β blocker alone or behavioural migraine management alone, improved outcomes of optimised acute treatment. Combined β blocker treatment and behavioural migraine management may improve outcomes in the treatment of frequent migraine. Clinical trials NCT00910689.
Planet Formation Imager (PFI): science vision and key requirements
NASA Astrophysics Data System (ADS)
Kraus, Stefan; Monnier, John D.; Ireland, Michael J.; Duchêne, Gaspard; Espaillat, Catherine; Hönig, Sebastian; Juhasz, Attila; Mordasini, Chris; Olofsson, Johan; Paladini, Claudia; Stassun, Keivan; Turner, Neal; Vasisht, Gautam; Harries, Tim J.; Bate, Matthew R.; Gonzalez, Jean-François; Matter, Alexis; Zhu, Zhaohuan; Panic, Olja; Regaly, Zsolt; Morbidelli, Alessandro; Meru, Farzana; Wolf, Sebastian; Ilee, John; Berger, Jean-Philippe; Zhao, Ming; Kral, Quentin; Morlok, Andreas; Bonsor, Amy; Ciardi, David; Kane, Stephen R.; Kratter, Kaitlin; Laughlin, Greg; Pepper, Joshua; Raymond, Sean; Labadie, Lucas; Nelson, Richard P.; Weigelt, Gerd; ten Brummelaar, Theo; Pierens, Arnaud; Oudmaijer, Rene; Kley, Wilhelm; Pope, Benjamin; Jensen, Eric L. N.; Bayo, Amelia; Smith, Michael; Boyajian, Tabetha; Quiroga-Nuñez, Luis Henry; Millan-Gabet, Rafael; Chiavassa, Andrea; Gallenne, Alexandre; Reynolds, Mark; de Wit, Willem-Jan; Wittkowski, Markus; Millour, Florentin; Gandhi, Poshak; Ramos Almeida, Cristina; Alonso Herrero, Almudena; Packham, Chris; Kishimoto, Makoto; Tristram, Konrad R. W.; Pott, Jörg-Uwe; Surdej, Jean; Buscher, David; Haniff, Chris; Lacour, Sylvestre; Petrov, Romain; Ridgway, Steve; Tuthill, Peter; van Belle, Gerard; Armitage, Phil; Baruteau, Clement; Benisty, Myriam; Bitsch, Bertram; Paardekooper, Sijme-Jan; Pinte, Christophe; Masset, Frederic; Rosotti, Giovanni
2016-08-01
The Planet Formation Imager (PFI) project aims to provide a strong scientific vision for ground-based optical astronomy beyond the upcoming generation of Extremely Large Telescopes. We make the case that a breakthrough in angular resolution imaging capabilities is required in order to unravel the processes involved in planet formation. PFI will be optimised to provide a complete census of the protoplanet population at all stellocentric radii and over the age range from 0.1 to 100 Myr. Within this age period, planetary systems undergo dramatic changes and the final architecture of planetary systems is determined. Our goal is to study the planetary birth on the natural spatial scale where the material is assembled, which is the "Hill Sphere" of the forming planet, and to characterise the protoplanetary cores by measuring their masses and physical properties. Our science working group has investigated the observational characteristics of these young protoplanets as well as the migration mechanisms that might alter the system architecture. We simulated the imprints that the planets leave in the disk and studied how PFI could revolutionise areas ranging from exoplanet to extragalactic science. In this contribution we outline the key science drivers of PFI and discuss the requirements that will guide the technology choices, the site selection, and potential science/technology tradeoffs.
Pelvic re-irradiation using stereotactic ablative radiotherapy (SABR): A systematic review.
Murray, Louise Janet; Lilley, John; Hawkins, Maria A; Henry, Ann M; Dickinson, Peter; Sebag-Montefiore, David
2017-11-01
To perform a systematic review regarding the use of stereotactic ablative radiotherapy (SABR) for the re-irradiation of recurrent malignant disease within the pelvis, to guide the clinical implementation of this technique. A systematic search strategy was adopted using the MEDLINE, EMBASE and Cochrane Library databases. 195 articles were identified, of which 17 were appropriate for inclusion. Studies were small and data largely retrospective. In total, 205 patients are reported to have received pelvic SABR re-irradiation. Dose and fractionation schedules and re-irradiated volumes are highly variable. Little information is provided regarding organ at risk constraints adopted in the re-irradiation setting. Treatment appears well-tolerated overall, with nine grade 3 and six grade 4 toxicities amongst thirteen re-irradiated patients. Local control at one year ranged from 51% to 100%. Symptomatic improvements were also noted. For previously irradiated patients with recurrent pelvic disease, SABR re-irradiation could be a feasible intervention for those who otherwise have limited options. Evidence to support this technique is limited but shows initial promise. Based on the available literature, suggestions for a more formal SABR re-irradiation pathway are proposed. Prospective studies and a multidisciplinary approach are required to optimise future treatment. Copyright © 2017 Elsevier B.V. All rights reserved.
Debecker, Damien P; Gaigneaux, Eric M; Busca, Guido
2009-01-01
Basic catalysis! The basic properties of hydrotalcites (see picture) make them attractive for numerous catalytic applications. Probing the basicity of the catalysts is crucial to understand the base-catalysed processes and to optimise the catalyst preparation. Various parameters can be employed to tune the basic properties of hydrotalcite-based catalysts towards the basicity demanded by each target chemical reaction.Hydrotalcites offer unique basic properties that make them very attractive for catalytic applications. It is of primary interest to make use of accurate tools for probing the basicity of hydrotalcite-based catalysts for the purpose of 1) fundamental understanding of base-catalysed processes with hydrotalcites and 2) optimisation of the catalytic performance achieved in reactions of industrial interest. Techniques based on probe molecules, titration techniques and test reactions along with physicochemical characterisation are overviewed in the first part of this review. The aim is to provide the tools for understanding how series of parameters involved in the preparation of hydrotalcite-based catalytic materials can be employed to control and adapt the basic properties of the catalyst towards the basicity demanded by each target chemical reaction. An overview of recent and significant achievements in that perspective is presented in the second part of the paper.
NASA Astrophysics Data System (ADS)
Haworth, Annette; Mears, Christopher; Betts, John M.; Reynolds, Hayley M.; Tack, Guido; Leo, Kevin; Williams, Scott; Ebert, Martin A.
2016-01-01
Treatment plans for ten patients, initially treated with a conventional approach to low dose-rate brachytherapy (LDR, 145 Gy to entire prostate), were compared with plans for the same patients created with an inverse-optimisation planning process utilising a biologically-based objective. The ‘biological optimisation’ considered a non-uniform distribution of tumour cell density through the prostate based on known and expected locations of the tumour. Using dose planning-objectives derived from our previous biological-model validation study, the volume of the urethra receiving 125% of the conventional prescription (145 Gy) was reduced from a median value of 64% to less than 8% whilst maintaining high values of tumour control probability (TCP). On average, the number of planned seeds was reduced from 85 to less than 75. The robustness of plans to random seed displacements needs to be carefully considered when using contemporary seed placement techniques. We conclude that an inverse planning approach to LDR treatments, based on a biological objective, has the potential to maintain high rates of tumour control whilst minimising dose to healthy tissue. In future, the radiobiological model will be informed using multi-parametric MRI to provide a personalised medicine approach.
Xu, Xiangtao; Medvigy, David; Wright, Stuart Joseph; ...
2017-07-04
Leaf longevity (LL) varies more than 20-fold in tropical evergreen forests, but it remains unclear how to capture these variations using predictive models. Current theories of LL that are based on carbon optimisation principles are challenging to assess quantitatively because of uncertainty across species in the ‘ageing rate’: the rate at which leaf photosynthetic capacity declines with age. Here we present a meta-analysis of 49 species across temperate and tropical biomes, demonstrating that the ageing rate of photosynthetic capacity is positively correlated with the mass-based carboxylation rate of mature leaves. We assess an improved trait-driven carbon optimality model with in situ LL data for 105 species in two Panamanian forests. Additionally, we show that our model explains over 40% of the cross-species variation in LL under contrasting light environments. Collectively, our results reveal how variation in LL emerges from carbon optimisation constrained by both leaf structural traits and the abiotic environment.
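One classical way to formalise 'carbon optimisation' of leaf longevity, in the spirit of marginal-gain (Kikuzawa-type) models rather than the specific model assessed in the paper, is to choose the longevity that maximises lifetime-averaged net carbon gain when photosynthetic capacity declines with age. The sketch below uses hypothetical capacities, ageing rates and construction costs, and reproduces the qualitative pattern that faster-ageing, higher-capacity leaves favour shorter longevities.

```python
import numpy as np

def mean_daily_gain(L, A0, ageing_rate, cost):
    """Lifetime-averaged net carbon gain for a leaf kept for L days, assuming
    photosynthetic capacity declines linearly with age: p(t) = A0 * max(0, 1 - ageing_rate * t)."""
    Lc = min(L, 1.0 / ageing_rate)                       # no gain once capacity reaches zero
    gross = A0 * (Lc - 0.5 * ageing_rate * Lc**2)        # integral of p(t) from 0 to Lc
    return (gross - cost) / L                            # net gain averaged over the leaf's life

# Hypothetical traits: a high-capacity, fast-ageing leaf vs a low-capacity, slow-ageing leaf.
L_grid = np.arange(30, 2000)
for A0, rate in [(10.0, 1 / 200), (4.0, 1 / 800)]:       # capacity and ageing rate (per day)
    gains = [mean_daily_gain(L, A0, rate, cost=150.0) for L in L_grid]
    print(f"A0 = {A0:4.1f}, ageing rate = {rate:.4f}/day  ->  "
          f"optimal LL ~ {L_grid[np.argmax(gains)]} days")
```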
Storms, S M; Feltus, A; Barker, A R; Joly, M-A; Girard, M
2009-03-01
Measurement of somatropin charged variants by isoelectric focusing was replaced with capillary zone electrophoresis in the January 2006 European Pharmacopoeia Supplement 5.3, based on results from an interlaboratory collaborative study. Due to incompatibilities and method-robustness issues encountered prior to verification, a number of method parameters required optimisation. As the use of a diode array detector at 195 nm or 200 nm led to a loss of resolution, a variable wavelength detector using a 200 nm filter was employed. Improved injection repeatability was obtained by increasing the injection time and pressure, and changing the sample diluent from water to running buffer. Finally, definition of capillary pre-treatment and rinse procedures resulted in more consistent separations over time. Method verification data are presented demonstrating linearity, specificity, repeatability, intermediate precision, limit of quantitation, sample stability, solution stability, and robustness. Based on these experiments, several modifications to the current method have been recommended and incorporated into the European Pharmacopoeia to help improve method performance across laboratories globally.
Novel Approach on the Optimisation of Mid-Course Corrections Along Interplanetary Trajectories
NASA Astrophysics Data System (ADS)
Iorfida, Elisabetta; Palmer, Phil; Roberts, Mark
The primer vector theory, first proposed by Lawden, defines a set of necessary conditions to characterise whether an impulsive thrust trajectory is optimal with respect to propellant usage, within a two-body problem context. If the conditions are not satisfied, one or more intermediate impulses are performed along the transfer arc in order to lower the overall cost. The method is based on the propagation of the state transition matrix and on the solution of a boundary value problem, which leads to considerable mathematical and computational complexity. In this paper, a different approach is introduced. It is based on a polar-coordinate transformation of the primer vector which allows the decoupling of its in-plane and out-of-plane components. The out-of-plane component is solved analytically, while for the in-plane components a Hamiltonian approximation is made. The novel procedure reduces the mathematical complexity and the computational cost of Lawden's problem and also gives a different perspective on the optimisation of a transfer trajectory.
NASA Astrophysics Data System (ADS)
Li, Haifeng; Zhu, Qing; Yang, Xiaoxia; Xu, Linrong
2012-10-01
Typical characteristics of remote sensing applications are concurrent tasks, such as those found in disaster rapid response. The existing approach to composing geographical information processing service chains searches for an optimal solution for each chain in what can be deemed a "selfish" way. This leads to conflicts amongst concurrent tasks and decreases the performance of all service chains. In this study, a non-cooperative game-based mathematical model to analyse the competitive relationships between tasks is proposed. A best-response function is used to ensure that each task maintains utility optimisation by considering the composition strategies of other tasks and quantifying the conflicts between tasks. Based on this, an iterative algorithm that converges to a Nash equilibrium is presented, the aim being to provide good convergence and maximise the utilisation of all tasks under concurrent task conditions. Theoretical analyses and experiments showed that the newly proposed method, when compared to existing service composition methods, has better practical utility for all tasks.
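The best-response idea can be illustrated with a toy two-task congestion game: each task repeatedly switches to the option that minimises its own cost given the other task's current choice, and the iteration stops at a fixed point where neither task wants to deviate (a Nash equilibrium). The services, latencies and congestion penalty below are invented for illustration; the actual method handles many tasks and richer service-chain models.

```python
import numpy as np

# Toy congestion game: two concurrent tasks each pick one of three candidate services.
# A task's cost is the service's base latency plus a penalty if both tasks pick the same service.
base_latency = np.array([3.0, 4.0, 5.0])
congestion_penalty = 4.0

def cost(my_choice, other_choice):
    return base_latency[my_choice] + (congestion_penalty if my_choice == other_choice else 0.0)

def best_response(other_choice):
    return int(np.argmin([cost(c, other_choice) for c in range(len(base_latency))]))

choices = [0, 0]          # both tasks start by "selfishly" picking the fastest service
for _ in range(20):
    updated = False
    for i in (0, 1):      # sequential best-response updates
        br = best_response(choices[1 - i])
        if br != choices[i]:
            choices[i], updated = br, True
    if not updated:       # no task wants to deviate: a Nash equilibrium has been reached
        break

print("Equilibrium choices:", choices,
      " costs:", [cost(choices[0], choices[1]), cost(choices[1], choices[0])])
```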
Employing multi-GPU power for molecular dynamics simulation: an extension of GALAMOST
NASA Astrophysics Data System (ADS)
Zhu, You-Liang; Pan, Deng; Li, Zhan-Wei; Liu, Hong; Qian, Hu-Jun; Zhao, Yang; Lu, Zhong-Yuan; Sun, Zhao-Yan
2018-04-01
We describe an algorithm for employing multi-GPU power on the basis of Message Passing Interface (MPI) domain decomposition in a molecular dynamics code, GALAMOST, which is designed for the coarse-grained simulation of soft matter. The multi-GPU version of the code is developed based on our previous single-GPU version. In multi-GPU runs, each GPU takes charge of one domain and runs the single-GPU code path. The communication between neighbouring domains follows an algorithm similar to that of the CPU-based code LAMMPS, but is optimised specifically for GPUs. We employ a memory-saving design which enlarges the maximum system size achievable on the same device. An optimisation algorithm is employed to prolong the update period of the neighbour list. We demonstrate good performance of multi-GPU runs on a workstation for the simulation of a Lennard-Jones liquid, a dissipative particle dynamics liquid, a polymer-nanoparticle composite, and two-patch particles. Good scaling over many cluster nodes is presented for the two-patch particles.
Ubogagu, Edith; Harris, Dylan G
2012-12-01
Terminal haemorrhage is a rare and distressing emergency in palliative oncology. We present an algorithm for the management of terminal haemorrhage in patients likely to receive end-of-life care at home, based on a literature review of the management of terminal haemorrhage for patients with advanced cancer, where a DNAR (do not attempt resuscitation) order is in place and the patient wishes to die at home. A literature review was conducted to identify literature on the management of terminal haemorrhage in patients with advanced cancer who are no longer amenable to active interventional/invasive procedures. Electronic databases, the grey literature, local guidelines from hospitals and hospices, and online web portals were all searched systematically. The literature review was used to formulate a management algorithm. The evidence base is very limited. A three-step practical algorithm is suggested: preparing for the event, managing the event ('ABC') and 'aftercare'. Step 1 involves the identification and optimisation of risk factors. Step 2 (the event) consists of A (assure and re-assure the patient), B (be there - above all stay with the patient) and C (comfort, calm, consider dark towels and anxiolytics if possible). Step 3 (the aftercare) involves the provision of practical and psychological support to those involved including relatives and professionals. Terminal haemorrhage is a rare yet highly feared complication of advanced cancer, for which there is a limited evidence base to guide management. The suggested three-step approach to managing this situation gives professionals a logical framework within which to work.
Integration of PGD-virtual charts into an engineering design process
NASA Astrophysics Data System (ADS)
Courard, Amaury; Néron, David; Ladevèze, Pierre; Ballere, Ludovic
2016-04-01
This article deals with the efficient construction of approximations of fields and quantities of interest used in the geometric optimisation of complex shapes encountered in engineering structures. The strategy developed herein is based on the construction of virtual charts that, once computed offline, allow the structure to be optimised at a negligible online CPU cost. These virtual charts can be used as a powerful numerical decision-support tool during the design of industrial structures. They are built using the proper generalized decomposition (PGD), which offers a very convenient framework for solving parametrised problems. In this paper, particular attention has been paid to the integration of the procedure into a genuine engineering design process. In particular, a dedicated methodology is proposed to interface the PGD approach with commercial software.
Scientific Approach for Optimising Performance, Health and Safety in High-Altitude Observatories
NASA Astrophysics Data System (ADS)
Böcker, Michael; Vogy, Joachim; Nolle-Gösser, Tanja
2008-09-01
The ESO coordinated study “Optimising Performance, Health and Safety in High-Altitude Observatories” is based on a psychological approach using a questionnaire for data collection and assessment of high-altitude effects. During 2007 and 2008, data from 28 staff and visitors involved in APEX and ALMA were collected and analysed, and the first results of the study are summarised here. While there is a lot of information about biomedical changes at high altitude, relatively few studies have focussed on psychological changes, for example with respect to the performance of mental tasks, safety consciousness and emotions. Both biomedical and psychological changes are relevant factors in occupational safety and health. The results of the questionnaire on safety, health and performance issues demonstrate that the working conditions at high altitude are less detrimental than expected.
Optimisation and characterisation of tungsten thick coatings on copper based alloy substrates
NASA Astrophysics Data System (ADS)
Riccardi, B.; Montanari, R.; Casadei, M.; Costanza, G.; Filacchioni, G.; Moriani, A.
2006-06-01
Tungsten is a promising armour material for plasma facing components of nuclear fusion reactors because of its low sputter rate and favourable thermo-mechanical properties. Among all the techniques able to realise W armours, plasma spray looks particularly attractive owing to its simplicity and low cost. The present work concerns the optimisation of spraying parameters aimed at 4-5 mm thick W coating on copper-chromium-zirconium (Cu,Cr,Zr) alloy substrates. Characterisation of coatings was performed in order to assess microstructure, impurity content, density, tensile strength, adhesion strength, thermal conductivity and thermal expansion coefficient. The work performed has demonstrated the feasibility of thick W coatings on flat and curved geometries. These coatings appear as a reliable armour for medium heat flux plasma facing component.
Khairuddin Md Yusof, Ahmad
2013-01-01
Concerns about ionizing radiation during interventional cardiology have increased in recent years as a result of rapid growth in interventional procedure volumes and the high radiation doses associated with some procedures. Noncancer radiation risks, in terms of radiation-induced cataracts for cardiologists and medical staff and skin injuries for patients, appear to be clear potential consequences of interventional cardiology procedures, while the potential risk of radiation-induced cardiovascular effects remains less clear. This paper provides an overview of evidence-based reviews of concerns about the noncancer risks of radiation exposure in interventional cardiology. Strategies commonly undertaken to reduce radiation doses to both medical staff and patients during interventional cardiology procedures are discussed, and the optimisation of interventional cardiology procedures is highlighted. PMID:24027768
Close packing in curved space by simulated annealing
NASA Astrophysics Data System (ADS)
Wille, L. T.
1987-12-01
The problem of packing spheres of a maximum radius on the surface of a four-dimensional hypersphere is considered. It is shown how near-optimal solutions can be obtained by packing soft spheres, modelled as classical particles interacting under an inverse power potential, followed by a subsequent hardening of the interaction. In order to avoid trapping in high-lying local minima, the simulated annealing method is used to optimise the soft-sphere packing. Several improvements over other work (based on local optimisation of random initial configurations of hard spheres) have been found. The freezing behaviour of this system is discussed as a function of particle number, softness of the potential and cooling rate. Apart from their geometric interest, these results are useful in the study of topological frustration, metallic glasses and quasicrystals.
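A minimal sketch of the procedure described above, under simple assumptions (fixed inverse-power exponent, single-particle moves, geometric cooling): soft spheres on the unit 3-sphere are annealed with a Metropolis acceptance rule, and the smallest chord distance after cooling indicates the hard-sphere radius that the final configuration supports. This is not the paper's annealing schedule or hardening protocol; all parameters are illustrative.

```python
import numpy as np

def anneal_packing(n_points=24, power=12, steps=20000, t0=0.5, cooling=0.9995, seed=0):
    """Simulated annealing of soft spheres (inverse-power repulsion) on the 3-sphere in 4D."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=(n_points, 4))
    x /= np.linalg.norm(x, axis=1, keepdims=True)         # project onto the unit hypersphere

    def energy(pts):
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        iu = np.triu_indices(len(pts), k=1)
        return np.sum(d[iu] ** (-power))                  # pairwise inverse-power potential

    e, T = energy(x), t0
    for _ in range(steps):
        i = rng.integers(n_points)
        trial = x.copy()
        trial[i] += rng.normal(scale=0.05, size=4)
        trial[i] /= np.linalg.norm(trial[i])              # keep the particle on the sphere
        de = energy(trial) - e
        if de < 0 or rng.random() < np.exp(-de / T):      # Metropolis acceptance
            x, e = trial, e + de
        T *= cooling                                      # geometric cooling schedule
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return x, d[np.triu_indices(n_points, k=1)].min()

_, min_chord = anneal_packing()
print("Smallest chord distance after annealing:", round(min_chord, 3))
```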
A simple model clarifies the complicated relationships of complex networks
Zheng, Bojin; Wu, Hongrun; Kuang, Li; Qin, Jun; Du, Wenhua; Wang, Jianmin; Li, Deyi
2014-01-01
Real-world networks such as the Internet and the WWW have many common traits. Until now, hundreds of models have been proposed to characterise these traits in order to understand the networks. Because different models use very different mechanisms, it is widely believed that these traits originate from different causes. However, we find that a simple model based on optimisation can produce many traits, including scale-free, small-world, ultra-small-world, delta-distribution, compact, fractal, regular and random networks. Moreover, by revising the proposed model, community-structure networks are generated. With this model and its revised versions, the complicated relationships of complex networks are illustrated. The model brings a new universal perspective to the understanding of complex networks and provides a universal method for modelling complex networks from the viewpoint of optimisation. PMID:25160506
Hind, Daniel; Parkin, James; Whitworth, Victoria; Rex, Saleema; Young, Tracey; Hampson, Lisa; Sheehan, Jennie; Maguire, Chin; Cantrill, Hannah; Scott, Elaine; Epps, Heather; Main, Marion; Geary, Michelle; McMurchie, Heather; Pallant, Lindsey; Woods, Daniel; Freeman, Jennifer; Lee, Ellen; Eagle, Michelle; Willis, Tracey; Muntoni, Francesco; Baxter, Peter
2017-01-01
Standard treatment of Duchenne muscular dystrophy (DMD) includes regular physiotherapy. There are no data to show whether adding aquatic therapy (AT) to land-based exercises helps maintain motor function. We assessed the feasibility of recruiting and collecting data from boys with DMD in a parallel-group pilot randomised trial (primary objective), also assessing how intervention and trial procedures work. Ambulant boys with DMD aged 7-16 years established on steroids, with North Star Ambulatory Assessment (NSAA) score ≥8, who were able to complete a 10-m walk test without aids or assistance, were randomly allocated (1:1) to 6 months of either optimised land-based exercises 4 to 6 days/week, defined by local community physiotherapists, or the same 4 days/week plus AT 2 days/week. Those unable to commit to a programme, with >20% variation between NSAA scores 4 weeks apart, or contraindications to AT were excluded. The main outcome measures included feasibility of recruiting 40 participants in 6 months from six UK centres, clinical outcomes including NSAA, independent assessment of treatment optimisation, participant/therapist views on acceptability of intervention and research protocols, value of information (VoI) analysis and cost-impact analysis. Over 6 months, 348 boys were screened: most lived too far from centres or were enrolled in other trials; 12 (30% of the targets) were randomised to AT ( n = 8) or control ( n = 4). The mean change in NSAA at 6 months was -5.5 (SD 7.8) in the control arm and -2.8 (SD 4.1) in the AT arm. Harms included fatigue in two boys, pain in one. Physiotherapists and parents valued AT but believed it should be delivered in community settings. Randomisation was unattractive to families, who had already decided that AT was useful and who often preferred to enrol in drug studies. The AT prescription was considered to be optimised for three boys, with other boys given programmes that were too extensive and insufficiently focused. Recruitment was insufficient for VoI analysis. Neither a UK-based RCT of AT nor a twice weekly AT therapy delivered at tertiary centres is feasible. Our study will help in the optimisation of AT service provision and the design of future research. ISRCTN41002956.
Jointly learning word embeddings using a corpus and a knowledge base
Bollegala, Danushka; Maehara, Takanori; Kawarabayashi, Ken-ichi
2018-01-01
Methods for representing the meaning of words in vector spaces purely using the information distributed in text corpora have proved to be very valuable in various text mining and natural language processing (NLP) tasks. However, these methods still disregard the valuable semantic relational structure between words in co-occurring contexts. These beneficial semantic relational structures are contained in manually-created knowledge bases (KBs) such as ontologies and semantic lexicons, where the meanings of words are represented by defining the various relationships that exist among those words. We combine the knowledge in both a corpus and a KB to learn better word embeddings. Specifically, we propose a joint word representation learning method that uses the knowledge in the KBs, and simultaneously predicts the co-occurrences of two words in a corpus context. In particular, we use the corpus to define our objective function subject to the relational constraints derived from the KB. We further utilise the corpus co-occurrence statistics to propose two novel approaches, Nearest Neighbour Expansion (NNE) and Hedged Nearest Neighbour Expansion (HNE), that dynamically expand the KB and therefore derive more constraints that guide the optimisation process. Our experimental results over a wide range of benchmark tasks demonstrate that the proposed method statistically significantly improves the accuracy of the word embeddings learnt. It outperforms a corpus-only baseline and improves on a number of previously proposed methods that incorporate corpora and KBs in both semantic similarity prediction and word analogy detection tasks. PMID:29529052
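Since the abstract does not give the exact objective, the sketch below only illustrates the general idea of such a joint objective under stated assumptions: a GloVe-like least-squares corpus term plus a penalty pulling KB-related words together. The vocabulary, co-occurrence counts and `kb_pairs` are hypothetical placeholders, not the method's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = ["king", "queen", "man", "woman", "royal"]
V, D, LAM, LR = len(vocab), 8, 0.5, 0.05

# hypothetical corpus co-occurrence counts and KB relation pairs (by vocabulary index)
C = rng.poisson(3.0, size=(V, V)).astype(float)
C = (C + C.T) / 2
kb_pairs = [(0, 4), (1, 4)]          # e.g. (king, royal), (queen, royal)

W = 0.1 * rng.normal(size=(V, D))    # word embeddings to be learnt

for epoch in range(500):
    # corpus term: fit dot products to log(1 + co-occurrence), GloVe-like
    err = W @ W.T - np.log1p(C)
    grad = 2 * err @ W               # gradient of the corpus term (up to a constant factor)
    # KB term: quadratic penalty pulling related words towards each other
    for i, j in kb_pairs:
        grad[i] += 2 * LAM * (W[i] - W[j])
        grad[j] += 2 * LAM * (W[j] - W[i])
    W -= LR * grad / V

cos = W[0] @ W[4] / (np.linalg.norm(W[0]) * np.linalg.norm(W[4]))
print("similarity(king, royal):", round(float(cos), 3))
```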
Coxon, Kristy; Keay, Lisa
2015-12-09
Safe transport is important to well-being in later life, but balancing safety and independence for older drivers can be challenging. While self-regulation is a promising tool to promote road safety, more research is required to optimise programs. Qualitative research was used to inform the choice and adaptation of a safe-transport education program for older drivers. Three focus groups were conducted with older drivers living in northwest Sydney to explore four key areas related to driving in later life, including age-based licensing, stopping or limiting driving, barriers to driving cessation and alternative modes of transportation. Data were analysed using content analysis. Four categories emerged from the data: bad press for older drivers, COMPETENCE not age, call for fairness in licensing regulations, and hanging up the keys: it's complicated! Two key issues, namely that (1) older drivers wanted to drive for as long as possible but (2) were not prepared for driving cessation, guided the choice and adaptation of the Knowledge Enhances Your Safety (KEYS) program. This program was adapted for the Australian context, and the focus group findings highlighted the need for practical solutions, including transport alternatives, to be added. Targeted messages were developed from the data using the Precaution Adoption Process Model (PAPM), allowing the education to be tailored to the individual's stage of behaviour change. Adapting our program based on insights gained from community consultation should ensure the program is sensitive to the needs, skills and preferences of older drivers.
Bennett, A L; Buckton, S; Lawrance, I; Leong, R W; Moore, G; Andrews, J M
2015-12-01
Current models of care for ulcerative colitis (UC) across healthcare systems are inconsistent, with a paucity of existing guidelines or supportive tools for outpatient management. This study aimed to produce and evaluate evidence-based outpatient management tools for UC to guide primary care practitioners and patients in clinical decision-making. Three tools were developed after identifying current gaps in the provision of healthcare services for patients with UC at a Clinical Insights Meeting in 2013. Draft designs were further refined through consultation and consolidation of feedback by the steering committee. Final drafts were developed following feasibility testing by questionnaire in three key stakeholder groups (gastroenterologists, general practitioners and patients). The tools were officially launched into mainstream use in Australia in 2014. Three quarters of all respondents liked the layout and content of each tool. Minimal safety concerns were aired; these, along with pieces of information that were felt to have been omitted, were reviewed by the steering committee and incorporated into the final documents. The majority (over 80%) of respondents felt that the tools would be useful and would improve outpatient management of UC. Evidence-based outpatient clinical management tools for UC can be developed. The concept and end-product have been well received by all stakeholder groups. These tools should support non-specialist clinicians to optimise UC management and empower patients by enabling them to safely self-manage and to identify when medical support is needed. © 2015 Royal Australasian College of Physicians.
Evidence-informed primary health care workforce policy: are we asking the right questions?
Naccarella, Lucio; Buchan, Jim; Brooks, Peter
2010-01-01
Australia is facing a primary health care workforce shortage. To inform primary health care (PHC) workforce policy reforms, reflection is required on ways to strengthen the evidence base and its uptake into policy making. In 2008 the Australian Primary Health Care Research Institute funded the Australian Health Workforce Institute to host Professor James Buchan, Queen Margaret University, UK, an expert in health services policy research and health workforce planning. Professor Buchan's visit enabled over forty Australian PHC workforce mid-career and senior researchers and policy stakeholders to be involved in roundtable policy dialogue on issues influencing PHC workforce policy making. Six key thematic questions emerged. (1) What makes PHC workforce planning different? (2) Why does the PHC workforce need to be viewed in a global context? (3) What is the capacity of PHC workforce research? (4) What policy levers exist for PHC workforce planning? (5) What principles can guide PHC workforce planning? (6) What incentives exist to optimise the use of evidence in policy making? The emerging themes need to be discussed within the context of current PHC workforce policy reforms, which are focussed on increasing workforce supply (via education/training programs), changing the skill mix and extending the roles of health workers to meet patient needs. With the Australian government seeking to reform and strengthen the PHC workforce, key questions remain about ways to strengthen the PHC workforce evidence base and its uptake into PHC workforce policy making.
Optimising predictor domains for spatially coherent precipitation downscaling
NASA Astrophysics Data System (ADS)
Radanovics, S.; Vidal, J.-P.; Sauquet, E.; Ben Daoud, A.; Bontron, G.
2013-10-01
Statistical downscaling is widely used to overcome the scale gap between predictors from numerical weather prediction models or global circulation models and predictands like local precipitation, required for example for medium-term operational forecasts or climate change impact studies. The predictors are considered over a given spatial domain which is rarely optimised with respect to the target predictand location. In this study, an extended version of the growing rectangular domain algorithm is proposed to provide an ensemble of near-optimum predictor domains for a statistical downscaling method. This algorithm is applied to find five-member ensembles of near-optimum geopotential predictor domains for an analogue downscaling method for 608 individual target zones covering France. Results first show that very similar downscaling performances based on the continuous ranked probability score (CRPS) can be achieved by different predictor domains for any specific target zone, demonstrating the need for considering alternative domains in this context of high equifinality. A second result is the large diversity of optimised predictor domains over the country that questions the commonly made hypothesis of a common predictor domain for large areas. The domain centres are mainly distributed following the geographical location of the target location, but there are apparent differences between the windward and the lee side of mountain ridges. Moreover, domains for target zones located in southeastern France are centred more east and south than the ones for target locations on the same longitude. The size of the optimised domains tends to be larger in the southeastern part of the country, while domains with a very small meridional extent can be found in an east-west band around 47° N. Sensitivity experiments finally show that results are rather insensitive to the starting point of the optimisation algorithm except for zones located in the transition area north of this east-west band. Results also appear generally robust with respect to the archive length considered for the analogue method, except for zones with high interannual variability like in the Cévennes area. This study paves the way for defining regions with homogeneous geopotential predictor domains for precipitation downscaling over France, and therefore de facto ensuring the spatial coherence required for hydrological applications.
NASA Astrophysics Data System (ADS)
Munk, David J.; Kipouros, Timoleon; Vio, Gareth A.; Steven, Grant P.; Parks, Geoffrey T.
2017-11-01
Recently, the study of micro fluidic devices has gained much interest in various fields from biology to engineering. In the constant development cycle, the need to optimise the topology of the interior of these devices, where there are two or more optimality criteria, is always present. In this work, twin physical situations, whereby optimal fluid mixing in the form of vorticity maximisation is accompanied by the requirement that the casing in which the mixing takes place has the best structural performance in terms of the greatest specific stiffness, are considered. In the steady state of mixing this also means that the stresses in the casing are as uniform as possible, thus giving a desired operating life with minimum weight. The ultimate aim of this research is to couple two key disciplines, fluids and structures, into a topology optimisation framework, which shows fast convergence for multidisciplinary optimisation problems. This is achieved by developing a bi-directional evolutionary structural optimisation algorithm that is directly coupled to the Lattice Boltzmann method, used for simulating the flow in the micro fluidic device, for the objectives of minimum compliance and maximum vorticity. The needs for the exploration of larger design spaces and to produce innovative designs make meta-heuristic algorithms, such as genetic algorithms, particle swarms and Tabu Searches, less efficient for this task. The multidisciplinary topology optimisation framework presented in this article is shown to increase the stiffness of the structure from the datum case and produce physically acceptable designs. Furthermore, the topology optimisation method outperforms a Tabu Search algorithm in designing the baffle to maximise the mixing of the two fluids.
von Birgelen, C; Mintz, G; de Vrey, E A; Serruys, P; Kimura, T; Nobuyoshi, M; Popma, J; Leon, M; Erbel, R; de Feyter, P J
2000-01-01
AIMS—To classify atherosclerotic coronary lesions on the basis of adequate or inadequate compensatory vascular enlargement, and to examine changes in lumen, plaque, and vessel volumes during balloon optimised directional coronary atherectomy procedures in relation to the state of adaptive remodelling before the intervention. DESIGN—29 lesion segments in 29 patients were examined with intravascular ultrasound before and after successful balloon optimised directional coronary atherectomy procedures, and a validated volumetric intravascular ultrasound analysis was performed off-line to assess the atherosclerotic lesion remodelling and changes in plaque and vessel volumes that occurred during the intervention. Based on the intravascular ultrasound data, lesions were classified according to whether there was inadequate (group I) or adequate (group II) compensatory enlargement. RESULTS—There was no significant difference in patient and lesion characteristics between groups I and II (n = 10 and 19), including lesion length and details of the intervention. Quantitative coronary angiographic data were similar for both groups. However, plaque and vessel volumes were significantly smaller in group I than in II. In group I, 9 (4)% (mean (SD)) of the plaque volume was ablated, while in group II 16 (11)% was ablated (p = 0.01). This difference was reflected in a lower lumen volume gain in group I than in group II (46 (18) mm3 v 80 (49) mm3 (p < 0.02)). CONCLUSIONS—Preintervention lesion remodelling has an impact on the operative mechanisms of balloon optimised directional coronary atherectomy procedures. Plaque ablation was found to be particularly low in lesions with inadequate compensatory vascular enlargement. Keywords: intravascular ultrasound; ultrasonics; remodelling; coronary artery disease; atherectomy PMID:10648496
A support vector machine for predicting defibrillation outcomes from waveform metrics.
Howe, Andrew; Escalona, Omar J; Di Maio, Rebecca; Massot, Bertrand; Cromie, Nick A; Darragh, Karen M; Adgey, Jennifer; McEneaney, David J
2014-03-01
Algorithms to predict shock success based on VF waveform metrics could significantly enhance resuscitation by optimising the timing of defibrillation. The aim was to investigate robust methods of predicting defibrillation success in VF cardiac arrest patients using a support vector machine (SVM) optimisation approach. Frequency-domain (AMSA, dominant frequency and median frequency) and time-domain (slope and RMS amplitude) VF waveform metrics were calculated in a 4.1 s window prior to defibrillation. Conventional prediction test validity of each waveform parameter was assessed, and AUC>0.6 was used as the criterion for inclusion as a corroborative attribute processed by the SVM classification model. The latter used a Gaussian radial basis function (RBF) kernel, and the error penalty factor C was fixed to 1. A two-fold cross-validation resampling technique was employed. A total of 41 patients had 115 defibrillation instances. The AMSA, slope and RMS waveform metrics passed test validation with AUC>0.6 for predicting termination of VF and return to organised rhythm. Predictive accuracy of the optimised SVM design for termination of VF was 81.9% (± 1.24 SD); positive and negative predictivity were respectively 84.3% (± 1.98 SD) and 77.4% (± 1.24 SD); sensitivity and specificity were 87.6% (± 2.69 SD) and 71.6% (± 9.38 SD) respectively. AMSA, slope and RMS were the best VF waveform frequency-time predictors of termination of VF according to the test validity assessment. This a priori knowledge can be used for a simplified optimised SVM design that combines the predictive attributes of these VF waveform metrics for improved prediction accuracy and generalisation performance without requiring the definition of any threshold value on the waveform metrics. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
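Below is a sketch of such a classifier module (Gaussian RBF kernel, C fixed to 1, two-fold cross-validation) using scikit-learn; the three feature columns merely stand in for AMSA, slope and RMS amplitude and are synthetic, not the study's data.

```python
import numpy as np
from sklearn.model_selection import cross_validate
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# synthetic stand-ins for per-shock waveform metrics: [AMSA, slope, RMS amplitude]
n = 115
X = rng.normal(size=(n, 3))
y = (X @ np.array([1.0, 0.8, 0.6]) + 0.5 * rng.normal(size=n) > 0).astype(int)  # 1 = shock success

# Gaussian (RBF) kernel SVM with error penalty C fixed to 1, two-fold cross-validation
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_validate(model, X, y, cv=2, scoring=["accuracy", "precision", "recall"])
for key in ("test_accuracy", "test_precision", "test_recall"):
    print(key, scores[key].mean().round(3))
```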
Das, Anup Kumar; Mandal, Vivekananda; Mandal, Subhash C
2014-01-01
Extraction forms the very basic step in research on natural products for drug discovery. A poorly optimised and planned extraction methodology can jeopardise the entire mission. The aim was to provide a vivid picture of the different chemometric tools and the planning required for process optimisation and method development in the extraction of botanical material, with emphasis on microwave-assisted extraction (MAE). A review of studies involving the application of chemometric tools in combination with MAE of botanical materials was undertaken in order to discover what the significant extraction factors were. To optimise a response by fine-tuning these factors, experimental design, or statistical design of experiments (DoE), a core area of study in chemometrics, was then used for statistical analysis and interpretation. In this review, a brief explanation of the different aspects and methodologies related to MAE of botanical materials that were subjected to experimental design, along with some general chemometric tools and the steps involved in the practice of MAE, is presented. A detailed study of the various factors and responses involved in the optimisation is also presented. This article will assist in obtaining a better insight into the chemometric strategies of process optimisation and method development, which will in turn improve the decision-making process in selecting influential extraction parameters. Copyright © 2013 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Astley, R. J.; Sugimoto, R.; Mustafi, P.
2011-08-01
Novel techniques are presented to reduce noise from turbofan aircraft engines by optimising the acoustic treatment in engine ducts. The application of Computational Aero-Acoustics (CAA) to predict acoustic propagation and absorption in turbofan ducts is reviewed and a critical assessment of performance indicates that validated and accurate techniques are now available for realistic engine predictions. A procedure for integrating CAA methods with state of the art optimisation techniques is proposed in the remainder of the article. This is achieved by embedding advanced computational methods for noise prediction within automated and semi-automated optimisation schemes. Two different strategies are described and applied to realistic nacelle geometries and fan sources to demonstrate the feasibility of this approach for industry scale problems.
Skou, Soren T; Roos, Ewa M; Laursen, Mogens B; Rathleff, Michael S; Arendt-Nielsen, Lars; Simonsen, Ole H; Rasmussen, Sten
2012-05-09
There is a lack of high quality evidence concerning the efficacy of total knee arthroplasty (TKA). According to international evidence-based guidelines, treatment of knee osteoarthritis (KOA) should include patient education, exercise and weight loss. Insoles and pharmacological treatment can be included as supplementary treatments. If the combination of these non-surgical treatment modalities is ineffective, TKA may be indicated. The purpose of this randomised controlled trial is to examine whether TKA provides further improvement in pain, function and quality of life in addition to optimised non-surgical treatment in patients with KOA defined as definite radiographic OA and up to moderate pain. The study will be conducted in The North Denmark Region. 100 participants with radiographic KOA (K-L grade ≥2) and mean pain during the previous week of ≤ 60 mm (0-100, best to worst scale) who are considered eligible for TKA by an orthopaedic surgeon will be included. The treatment will consist of 12 weeks of optimised non-surgical treatment consisting of patient education, exercise, diet, insoles, analgesics and/or NSAIDs. Patients will be randomised to either receiving or not receiving a TKA in addition to the optimised non-surgical treatment. The primary outcome will be the change from baseline to 12 months on the Knee Injury and Osteoarthritis Outcome Score (KOOS)(4) defined as the average score for the subscale scores for pain, symptoms, activities of daily living, and quality of life. Secondary outcomes include the five individual KOOS subscale scores, EQ-5D, pain on a 100 mm Visual Analogue Scale, self-efficacy, pain pressure thresholds, and isometric knee flexion and knee extension strength. This is the first randomised controlled trial to investigate the efficacy of TKA as an adjunct treatment to optimised non-surgical treatment in patients with KOA. The results will significantly contribute to evidence-based recommendations for the treatment of patients with KOA. Clinicaltrials.gov reference: NCT01410409.
Xiao, Fuyuan; Aritsugi, Masayoshi; Wang, Qing; Zhang, Rong
2016-09-01
For efficient and sophisticated analysis of complex event patterns that appear in streams of big data from health care information systems, and to support decision-making, a triaxial hierarchical model is proposed in this paper. Our triaxial hierarchical model is developed by focusing on hierarchies among nested event pattern queries with an event concept hierarchy, thereby allowing us to identify the relationships among the expressions and sub-expressions of the queries extensively. We devise a cost-based heuristic by means of the triaxial hierarchical model to find an optimised query execution plan in terms of the costs of both the operators and the communications between them. According to the triaxial hierarchical model, we can also calculate how to reuse the results of the common sub-expressions in multiple queries. By integrating the optimised query execution plan with the reuse schemes, a multi-query optimisation strategy is developed to accomplish efficient processing of multiple nested event pattern queries. We present empirical studies in which the performance of the multi-query optimisation strategy was examined under various stream input rates and workloads. Specifically, the workloads of pattern queries can be used to support the monitoring of patients' conditions. On the other hand, experiments with varying input rates of streams can correspond to changes in the number of patients that a system should manage, whereas burst input rates can correspond to rushes of patients to be taken care of. The experimental results show that, under Workload 1, our proposal improves throughput by about 4 and 2 times compared with the two related works, respectively; under Workload 2, it improves throughput by about 3 and 2 times compared with the related works, respectively; and under Workload 3, it improves throughput by about 6 times compared with the related work. The experimental results demonstrated that our proposal was able to process complex queries efficiently, which can support health information systems and further decision-making. Copyright © 2016 Elsevier B.V. All rights reserved.
Integration of environmental aspects in modelling and optimisation of water supply chains.
Koleva, Mariya N; Calderón, Andrés J; Zhang, Di; Styan, Craig A; Papageorgiou, Lazaros G
2018-04-26
Climate change becomes increasingly more relevant in the context of water systems planning. Tools are necessary to provide the most economic investment option considering the reliability of the infrastructure from technical and environmental perspectives. Accordingly, in this work, an optimisation approach, formulated as a spatially-explicit multi-period Mixed Integer Linear Programming (MILP) model, is proposed for the design of water supply chains at regional and national scales. The optimisation framework encompasses decisions such as installation of new purification plants, capacity expansion, and raw water trading schemes. The objective is to minimise the total cost incurring from capital and operating expenditures. Assessment of available resources for withdrawal is performed based on hydrological balances, governmental rules and sustainable limits. In the light of the increasing importance of reliability of water supply, a second objective, seeking to maximise the reliability of the supply chains, is introduced. The epsilon-constraint method is used as a solution procedure for the multi-objective formulation. Nash bargaining approach is applied to investigate the fair trade-offs between the two objectives and find the Pareto optimality. The models' capability is addressed through a case study based on Australia. The impact of variability in key input parameters is tackled through the implementation of a rigorous global sensitivity analysis (GSA). The findings suggest that variations in water demand can be more disruptive for the water supply chain than scenarios in which rainfalls are reduced. The frameworks can facilitate governmental multi-aspect decision making processes for the adequate and strategic investments of regional water supply infrastructure. Copyright © 2018. Published by Elsevier B.V.
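As a toy illustration of the epsilon-constraint idea for a cost/reliability trade-off, the sketch below uses scipy's linprog on a made-up two-source supply problem; the variables, coefficients and bounds are hypothetical and far simpler than the paper's spatially-explicit MILP, and the Nash bargaining step is not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

# Toy problem: supply 100 ML/day from two hypothetical sources with different costs/reliabilities.
cost = np.array([1.0, 3.0])          # cost per ML for source 1 (surface) and source 2 (desalination)
reliab = np.array([0.6, 0.95])       # hypothetical reliability score contributed per ML supplied
demand = 100.0
cap = [(0, 80.0), (0, 100.0)]        # per-source capacity bounds (ML/day)

pareto = []
for eps in np.linspace(60, 95, 8):   # epsilon-constraint on total reliability-weighted supply
    res = linprog(
        c=cost,                                        # minimise total cost
        A_ub=[[-reliab[0], -reliab[1]]], b_ub=[-eps],  # reliability >= eps
        A_eq=[[1.0, 1.0]], b_eq=[demand],              # meet demand exactly
        bounds=cap, method="highs",
    )
    if res.success:
        pareto.append((eps, res.fun, res.x))

for eps, c, x in pareto:
    print(f"eps={eps:5.1f}  cost={c:7.2f}  allocation={np.round(x, 1)}")
```

Sweeping the epsilon bound on the secondary objective while minimising the primary one traces an approximation to the Pareto front, which is the essence of the method named in the abstract.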
Mghirbi, Oussama; LE Grusse, Philippe; Fabre, Jacques; Mandart, Elisabeth; Bord, Jean-Paul
2017-03-01
The health, environmental and socio-economic issues related to the massive use of plant protection products are a concern for all the stakeholders involved in the agricultural sector. These stakeholders, including farmers and territorial actors, have expressed a need for decision-support tools for the management of diffuse pollution related to plant protection practices and their impacts. To meet the needs expressed by the public authorities and the territorial actors for such decision-support tools, we have developed a technical-economic model "OptiPhy" for risk mitigation based on indicators of pesticide toxicity risk to applicator health (IRSA) and to the environment (IRTE), under the constraint of suitable economic outcomes. This technical-economic optimisation model is based on linear programming techniques and offers various scenarios to help the different actors in choosing plant protection products, depending on their different levels of constraints and aspirations. The health and environmental risk indicators can be broken down into sub-indicators so that management can be tailored to the context. This model for technical-economic optimisation and management of plant protection practices can analyse scenarios for the reduction of pesticide-related risks by proposing combinations of substitution PPPs, according to criteria of efficiency, economic performance and vulnerability of the natural environment. The results of the scenarios obtained on real ITKs in different cropping systems show that it is possible to reduce the PPP pressure (TFI) and reduce toxicity risks to applicator health (IRSA) and to the environment (IRTE) by up to approximately 50 %.
Gender differences in visuospatial planning: an eye movements study.
Cazzato, Valentina; Basso, Demis; Cutini, Simone; Bisiacchi, Patrizia
2010-01-20
Gender studies report a male advantage in several visuospatial abilities. Only a few studies, however, have evaluated differences in visuospatial planning behaviour with regard to gender. This study was aimed at exploring whether gender may affect the choice of cognitive strategies in a visuospatial planning task, and whether oculomotor measures could assist in disentangling the cognitive processes involved. A computerised task based on the travelling salesperson problem paradigm, the Maps test, was used to investigate these issues. Participants were required to optimise the time and space of a path travelling among a set of sub-goals in a spatially constrained environment. Behavioural results suggest that there are no gender differences in the initial visual processing of the stimuli, but rather during the execution of the plan, with males showing a shorter execution time and a higher path length optimisation than females. Males often showed changes of heuristics during the execution, while females seemed to prefer a constant strategy. Moreover, better performance in behavioural and oculomotor measures seemed to suggest that males are more able than females in either the optimisation of spatial features or the realisation of the planned scheme. Despite inconclusive findings, the results support previous research and provide insight into the level of cognitive processing involved in navigation and planning tasks, with regard to the influence of gender.
Adaptive neuro fuzzy inference system-based power estimation method for CMOS VLSI circuits
NASA Astrophysics Data System (ADS)
Vellingiri, Govindaraj; Jayabalan, Ramesh
2018-03-01
Recent advancements in very large scale integration (VLSI) technologies have made it feasible to integrate millions of transistors on a single chip. This greatly increases the circuit complexity, and hence there is a growing need for less tedious and low-cost power estimation techniques. The proposed work employs a Back-Propagation Neural Network (BPNN) and an Adaptive Neuro Fuzzy Inference System (ANFIS), which are capable of estimating the power precisely for complementary metal oxide semiconductor (CMOS) VLSI circuits without requiring any knowledge of circuit structure and interconnections. The application of ANFIS to power estimation is relatively new. Power estimation using ANFIS is carried out by creating initial FIS models using hybrid optimisation and back-propagation (BP) techniques employing constant and linear methods. It is inferred that ANFIS with the hybrid optimisation technique employing the linear method produces better results in terms of testing error, which varies from 0% to 0.86%, when compared to the BPNN, as it takes the initial fuzzy model and tunes it by means of a hybrid technique combining gradient descent BP and mean least-squares optimisation algorithms. ANFIS is best suited for the power estimation application, with a low RMSE of 0.0002075 and a high coefficient of determination (R) of 0.99961.
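ANFIS implementations are not part of the common Python scientific stack, so as a rough analogue of the BPNN baseline described above, the sketch below trains a back-propagation neural network regressor on synthetic circuit descriptors; the feature names and data are placeholders, not those of the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)

# hypothetical descriptors: [input switching activity, load capacitance, clock frequency] (normalised)
X = rng.uniform(0.0, 1.0, size=(500, 3))
power = 2.0 * X[:, 0] * X[:, 1] * X[:, 2] + 0.01 * rng.normal(size=500)  # toy dynamic-power-like target

X_tr, X_te, y_tr, y_te = train_test_split(X, power, test_size=0.2, random_state=0)
bpnn = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
bpnn.fit(X_tr, y_tr)
pred = bpnn.predict(X_te)

rmse = mean_squared_error(y_te, pred) ** 0.5
r = np.corrcoef(y_te, pred)[0, 1]
print(f"RMSE = {rmse:.4f}, correlation R = {r:.4f}")
```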
Airfoil Shape Optimization based on Surrogate Model
NASA Astrophysics Data System (ADS)
Mukesh, R.; Lingadurai, K.; Selvakumar, U.
2018-02-01
Engineering design problems always require an enormous number of real-time experiments and computational simulations in order to assess and ensure that the design objectives of the problem are met subject to various constraints. In most cases, the computational resources and time required per simulation are large. In cases such as sensitivity analysis and design optimisation, where thousands or millions of simulations have to be carried out, this becomes prohibitively difficult for designers. Nowadays approximation models, otherwise called surrogate models (SMs), are widely employed in order to reduce the computational resources and time required to analyse various engineering systems. Various approaches such as Kriging, neural networks, polynomials and Gaussian processes are used to construct the approximation models. The primary intention of this work is to employ the k-fold cross-validation approach to study and evaluate the influence of various theoretical variogram models on the accuracy of the surrogate model construction. Ordinary Kriging and design of experiments (DOE) approaches are used to construct the SMs by approximating panel and viscous solution algorithms, which are primarily used to solve the flow around airfoils and aircraft wings. The method of coupling the SMs with a suitable optimisation scheme to carry out an aerodynamic design optimisation process for airfoil shapes is also discussed.
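A sketch of surrogate construction with k-fold cross-validation is shown below under stated assumptions: scikit-learn's Gaussian process regressor stands in for ordinary Kriging, the kernel choice plays the role of the theoretical variogram model, and the response is a synthetic stand-in rather than a panel or viscous solution.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern
from sklearn.model_selection import KFold

rng = np.random.default_rng(4)

# design of experiments: 40 samples of two hypothetical shape parameters in [0, 1]^2
X = rng.uniform(size=(40, 2))
y = np.sin(6 * X[:, 0]) * np.cos(4 * X[:, 1]) + 0.02 * rng.normal(size=40)  # stand-in response

# compare covariance (variogram-like) models via k-fold cross-validation error
for name, kernel in [("squared-exponential", RBF(0.3)), ("Matern 3/2", Matern(0.3, nu=1.5))]:
    errs = []
    for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X[train], y[train])
        errs.append(np.mean((gp.predict(X[test]) - y[test]) ** 2))
    print(f"{name}: cross-validated mean squared error = {np.mean(errs):.4f}")
```

The kernel (covariance model) with the lowest cross-validated error would then be retained for the surrogate used inside the optimisation loop.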
NASA Astrophysics Data System (ADS)
Azadeh, A.; Foroozan, H.; Ashjari, B.; Motevali Haghighi, S.; Yazdanparast, R.; Saberi, M.; Torki Nejad, M.
2017-10-01
Information systems (ISs) and information technologies (ITs) play a critical role in large, complex gas corporations. Many factors, such as human, organisational and environmental factors, affect ISs in an organisation. Therefore, investigating IS success is considered to be a complex problem. Also, because of the competitive business environment and the high amount of information flow in organisations, new issues such as resilient ISs and successful customer relationship management (CRM) have emerged. A resilient IS will provide sustainable delivery of information to internal and external customers. This paper presents an integrated approach to enhance and optimise the performance of each component of a large IS based on CRM and resilience engineering (RE) in a gas company. The enhancement of performance can help ISs to perform business tasks efficiently. The data are collected from standard questionnaires. They are then analysed by data envelopment analysis, selecting the optimal mathematical programming approach. The selected model is validated and verified by the principal component analysis method. Finally, CRM and RE factors are identified as influential factors through sensitivity analysis for this particular case study. To the best of our knowledge, this is the first study of performance assessment and optimisation of a large IS by combined RE and CRM.
A new paradigm in personal dosimetry using LiF:Mg,Cu,P.
Cassata, J R; Moscovitch, M; Rotunda, J E; Velbeck, K J
2002-01-01
The United States Navy has been monitoring personnel for occupational exposure to ionising radiation since 1947. Film was used exclusively until 1973, when thermoluminescence dosemeters were introduced; these have been used up to the present time. In 1994, a joint research project between the Naval Dosimetry Center, Georgetown University, and Saint Gobain Crystals and Detectors (formerly Bicron RMP, formerly Harshaw TLD) began to develop a state-of-the-art thermoluminescent dosimetry system. The study was conducted from a large-scale dosimetry processor point of view with emphasis on a systems approach. Significant improvements were achieved by replacing the LiF:Mg,Ti with LiF:Mg,Cu,P TL elements due to the significant sensitivity increase, linearity, and negligible fading. Dosemeter filters were optimised for gamma and X ray energy discrimination using Monte Carlo modelling (MCNP), resulting in significant improvement in accuracy and precision. Further improvements were achieved through the use of neural-network based dose calculation algorithms. Both back propagation and functional link methods were implemented and the data compared, with essentially the same results. Several operational aspects of the system are discussed, including (1) background subtraction using control dosemeters, (2) selection criteria for control dosemeters, (3) optimisation of the TLD readers, (4) calibration methodology, and (5) optimisation of the heating profile.
NASA Astrophysics Data System (ADS)
Suja Priyadharsini, S.; Edward Rajan, S.; Femilin Sheniha, S.
2016-03-01
Electroencephalogram (EEG) is the recording of the electrical activity of the brain. It is contaminated by other biological signals, such as the cardiac signal (electrocardiogram), signals generated by eye movements/eye blinks (electrooculogram) and muscular signals (electromyogram), collectively called artefacts. Optimisation is an important tool for solving many real-world problems. In the proposed work, artefact removal based on the adaptive neuro-fuzzy inference system (ANFIS) is employed, with the parameters of the ANFIS optimised. The Artificial Immune System (AIS) algorithm is used to optimise the parameters of the ANFIS (ANFIS-AIS). Implementation results show that ANFIS-AIS is more effective in removing artefacts from the EEG signal than ANFIS alone. Furthermore, in the proposed work, an improved AIS (IAIS) is developed by including suitable selection processes in the AIS algorithm. The performance of the proposed IAIS method is compared with AIS and with a genetic algorithm (GA). Measures such as the signal-to-noise ratio, mean square error (MSE), correlation coefficient, power spectral density plot and convergence time are used for analysing the performance of the proposed method. From the results, it is found that the IAIS algorithm converges faster than AIS and performs better than both AIS and GA. Hence, the IAIS-tuned ANFIS (ANFIS-IAIS) is effective in removing artefacts from EEG signals.
An improved PSO-SVM model for online recognition of defects in eddy current testing
NASA Astrophysics Data System (ADS)
Liu, Baoling; Hou, Dibo; Huang, Pingjie; Liu, Banteng; Tang, Huayi; Zhang, Wubo; Chen, Peihua; Zhang, Guangxin
2013-12-01
Accurate and rapid recognition of defects is essential for structural integrity and health monitoring of in-service devices using eddy current (EC) non-destructive testing. This paper introduces a novel model-free method that includes three main modules: a signal pre-processing module, a classifier module and an optimisation module. In the signal pre-processing module, a two-stage differential structure is proposed to suppress the lift-off fluctuation that could contaminate the EC signal. In the classifier module, a multi-class support vector machine (SVM) based on the one-against-one strategy is utilised for its good accuracy. In the optimisation module, the optimal parameters of the classifier are obtained by an improved particle swarm optimisation (IPSO) algorithm. The proposed IPSO technique improves the convergence performance of the primary PSO through the following strategies: nonlinear processing of the inertia weight, and the introduction of a black hole model and a simulated annealing model with extremum disturbance. The good generalisation ability of the IPSO-SVM model has been validated by adding additional specimens to the testing set. Experiments show that the proposed algorithm can achieve higher recognition accuracy and efficiency than other well-known classifiers, and the advantages are more obvious with a smaller training set, which makes it well suited to online application.
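The sketch below illustrates only the nonlinear inertia-weight idea, one of the IPSO strategies listed above; the black-hole and simulated-annealing disturbance terms are omitted, and the objective is a standard test function rather than the SVM parameter-selection problem used in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

def sphere(x):                      # standard test objective (minimum at the origin)
    return np.sum(x ** 2, axis=-1)

N, DIM, ITERS = 30, 5, 200
C1 = C2 = 2.0
W_MAX, W_MIN = 0.9, 0.4

x = rng.uniform(-5, 5, size=(N, DIM))
v = np.zeros((N, DIM))
pbest, pbest_f = x.copy(), sphere(x)
gbest = pbest[np.argmin(pbest_f)]

for t in range(ITERS):
    # nonlinear (quadratic) decay of the inertia weight instead of the usual linear ramp
    w = W_MIN + (W_MAX - W_MIN) * (1 - t / ITERS) ** 2
    r1, r2 = rng.random((N, DIM)), rng.random((N, DIM))
    v = w * v + C1 * r1 * (pbest - x) + C2 * r2 * (gbest - x)
    x = np.clip(x + v, -5, 5)
    f = sphere(x)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]

print("best objective value found:", pbest_f.min())
```

In the paper's setting, the objective evaluated for each particle would instead be the cross-validated error of an SVM trained with the candidate parameters.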
Manogaran, Motharasan; Shukor, Mohd Yunus; Yasid, Nur Adeela; Khalil, Khalilah Abdul; Ahmad, Siti Aqlima
2018-02-01
The herbicide glyphosate is often used to control weeds in agricultural land. However, despite its ability to kill weeds effectively at low cost, health problems are still reported owing to its toxicity. The removal of glyphosate from the environment is usually achieved by microbiological processes, since chemical degradation is ineffective due to the presence of highly stable bonds. Therefore, finding glyphosate-degrading microorganisms in the soil of interest is crucial for remediating glyphosate. Burkholderia vietnamiensis strain AQ5-12 was found to have glyphosate-degrading ability. Optimisation of the biodegradation conditions was carried out utilising one-factor-at-a-time (OFAT) and response surface methodology (RSM) approaches. Five parameters, including the carbon and nitrogen sources, pH, temperature and glyphosate concentration, were optimised. Based on the OFAT results, glyphosate degradation was observed to be optimum at a fructose concentration of 6, 0.5 g/L ammonium sulphate, pH 6.5, a temperature of 32 °C and a glyphosate concentration of 100 ppm. Meanwhile, RSM resulted in better degradation, with 92.32% of 100 ppm glyphosate degraded, compared with OFAT. The bacterium was seen to tolerate up to 500 ppm glyphosate, while increasing concentrations resulted in reduced degradation and bacterial growth rates.
Using modified fruit fly optimisation algorithm to perform the function test and case studies
NASA Astrophysics Data System (ADS)
Pan, Wen-Tsao
2013-06-01
Evolutionary computation is a computing paradigm established by simulating natural evolutionary processes based on Darwinian theory, and it is a common research method. The main contribution of this paper is to reinforce the ability of the fruit fly optimisation algorithm (FOA) to search for the optimal solution, in order to avoid becoming trapped in local extrema. Evolutionary computation has grown to include the concepts of animal foraging behaviour and group behaviour. This study discusses three common evolutionary computation methods and compares them with the modified fruit fly optimisation algorithm (MFOA). It further investigates their ability to compute the extreme values of three mathematical functions, as well as the algorithm execution speed and the forecasting ability of the model built using the optimised general regression neural network (GRNN) parameters. The findings indicate that there is no obvious difference between particle swarm optimisation and the MFOA with regard to the ability to compute extreme values; however, both are better than the artificial fish swarm algorithm and the FOA. In addition, the MFOA performs better than particle swarm optimisation with regard to algorithm execution speed, and the forecasting ability of the model built using the MFOA's GRNN parameters is better than that of the other three forecasting models.
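For reference, a minimal sketch of the canonical FOA loop on a one-dimensional toy objective is given below; the modifications that distinguish the MFOA are not reproduced, and the population size, search radius and iteration count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(6)

def objective(s):
    # toy fitness function of the "smell concentration judgement value" s
    return (s - 3.0) ** 2 + 1.0

# initial swarm location of the fruit fly group
x_axis, y_axis = rng.uniform(0, 1, size=2)
best_smell = np.inf
POP, ITERS = 30, 200

for _ in range(ITERS):
    # osphresis stage: each fly searches a random direction and distance around the swarm
    x = x_axis + rng.uniform(-1, 1, POP)
    y = y_axis + rng.uniform(-1, 1, POP)
    dist = np.sqrt(x ** 2 + y ** 2)
    s = 1.0 / dist                      # smell concentration judgement value
    smell = objective(s)
    i = np.argmin(smell)
    if smell[i] < best_smell:           # vision stage: swarm flies to the best location found
        best_smell = smell[i]
        x_axis, y_axis = x[i], y[i]

print("best objective:", best_smell, "at s =", 1.0 / np.sqrt(x_axis**2 + y_axis**2))
```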
Dong, Xu-Yan; Kong, Fan-Pi; Yuan, Gang-You; Wei, Fang; Jiang, Mu-Lan; Li, Guang-Ming; Wang, Zhan; Zhao, Yuan-Di; Chen, Hong
2012-01-01
Phytosterol liposomes were prepared using the thin film method and used to encapsulate nattokinase (NK). In order to obtain a high encapsulation efficiency within the liposomes, an orthogonal experiment (L9(3^4)) was applied to optimise the preparation conditions. The molar ratio of lecithin to phytosterols, the NK activity and the mass ratio of mannite to lecithin were the main factors that influenced the encapsulation efficiency of the liposomes. Based on the results of a single-factor test, these three factors were chosen for this study. We determined the optimum preparation conditions to be as follows: a molar ratio of lecithin to phytosterol of 2:1, an NK activity of 2500 U mL⁻¹ and a mass ratio of mannite to lecithin of 3:1. Under these optimised conditions, an encapsulation efficiency of 65.25% was achieved, which agreed closely with the predicted result. Moreover, the zeta potential, size distribution and microstructure of the prepared liposomes were measured; the zeta potential was -51 ± 3 mV and the mean diameter was 194.1 nm. From the results of scanning electron microscopy, we observed that the phytosterol liposomes were round and regular in shape and showed no aggregation.
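The L9(3^4) array itself is standard; the response values in the sketch below are made-up encapsulation efficiencies used only to show how level means (the usual range analysis) would be computed, and are not the study's data.

```python
import numpy as np

# Standard L9(3^4) orthogonal array: 9 runs, 4 columns, 3 levels (coded 0, 1, 2)
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])
factors = ["lecithin:phytosterol", "NK activity", "mannite:lecithin", "(unused column)"]

# hypothetical encapsulation efficiencies (%) for the nine runs -- illustrative only
response = np.array([52.1, 58.4, 55.0, 61.2, 65.3, 57.8, 59.9, 63.1, 60.5])

# mean response at each level of each factor, plus the range used to rank factor influence
for f, name in enumerate(factors):
    means = [response[L9[:, f] == lvl].mean() for lvl in range(3)]
    print(f"{name:22s} level means: {np.round(means, 2)}  range: {max(means) - min(means):.2f}")
```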
Soh, Josephine Lay Peng; Grachet, Maud; Whitlock, Mark; Lukas, Timothy
2013-02-01
This is a study to fully assess a commercially available co-processed mannitol for its usefulness as an off-the-shelf excipient for developing orally disintegrating tablets (ODTs) by direct compression on a pilot scale (up to 4 kg). This work encompassed material characterization, formulation optimisation and process robustness. Overall, this co-processed mannitol possessed favourable physical attributes including low hygroscopicity and compactibility. Two design-of-experiments (DoEs) were used to screen and optimise the placebo formulation. Xylitol and crospovidone concentrations were found to have the most significant impact on disintegration time (p < 0.05). Higher xylitol concentrations retarded disintegration. Avicel PH102 promoted faster disintegration than PH101, at higher levels of xylitol. Without xylitol, higher crospovidone concentrations yielded faster disintegration and reduced tablet friability. Lubrication sensitivity studies were later conducted at two fill loads, three levels for lubricant concentration and number of blend rotations. Even at 75% fill load, the design space plot showed that 1.5% lubricant and 300 blend revolutions were sufficient to manufacture ODTs with ≤ 0.1% friability and disintegrated within 15 s. This study also describes results using a modified disintegration method based on the texture analyzer as an alternative to the USP method.
Rurality Index for Small Areas in Spain
ERIC Educational Resources Information Center
Ocana-Riola, Ricardo; Sanchez-Cantalejo, Carmen
2005-01-01
An operational definition for "rural area" is pivotal if proposals, policies and decisions aimed at optimising the distribution of resources, closing the gap on inequity between areas and raising standards of living for the least advantaged populations are to be put in place. The concept of rurality, however, is often based on…
Is ICRP guidance on the use of reference levels consistent?
Hedemann-Jensen, Per; McEwan, Andrew C
2011-12-01
In ICRP 103, which has replaced ICRP 60, it is stated that no fundamental changes have been introduced compared with ICRP 60. This is true except that the application of reference levels in emergency and existing exposure situations seems to be applied inconsistently, and also in the related publications ICRP 109 and ICRP 111. ICRP 103 emphasises that focus should be on the residual doses after the implementation of protection strategies in emergency and existing exposure situations. If possible, the result of an optimised protection strategy should bring the residual dose below the reference level. Thus the reference level represents the maximum acceptable residual dose after an optimised protection strategy has been implemented. It is not an 'off-the-shelf item' that can be set free of the prevailing situation. It should be determined as part of the process of optimising the protection strategy. If not, protection would be sub-optimised. However, in ICRP 103 some inconsistent concepts have been introduced, e.g. in paragraph 279 which states: 'All exposures above or below the reference level should be subject to optimisation of protection, and particular attention should be given to exposures above the reference level'. If, in fact, all exposures above and below reference levels are subject to the process of optimisation, reference levels appear superfluous. It could be considered that if optimisation of protection below a fixed reference level is necessary, then the reference level has been set too high at the outset. Up until the last phase of the preparation of ICRP 103 the concept of a dose constraint was recommended to constrain the optimisation of protection in all types of exposure situations. In the final phase, the term 'dose constraint' was changed to 'reference level' for emergency and existing exposure situations. However, it seems as if in ICRP 103 it was not fully recognised that dose constraints and reference levels are conceptually different. The use of reference levels in radiological protection is reviewed. It is concluded that the recommendations in ICRP 103 and related ICRP publications seem to be inconsistent regarding the use of reference levels in existing and emergency exposure situations.
hydroPSO: A Versatile Particle Swarm Optimisation R Package for Calibration of Environmental Models
NASA Astrophysics Data System (ADS)
Zambrano-Bigiarini, M.; Rojas, R.
2012-04-01
Particle Swarm Optimisation (PSO) is a recent and powerful population-based stochastic optimisation technique inspired by social behaviour of bird flocking, which shares similarities with other evolutionary techniques such as Genetic Algorithms (GA). In PSO, however, each individual of the population, known as particle in PSO terminology, adjusts its flying trajectory on the multi-dimensional search-space according to its own experience (best-known personal position) and the one of its neighbours in the swarm (best-known local position). PSO has recently received a surge of attention given its flexibility, ease of programming, low memory and CPU requirements, and efficiency. Despite these advantages, PSO may still get trapped into sub-optimal solutions, suffer from swarm explosion or premature convergence. Thus, the development of enhancements to the "canonical" PSO is an active area of research. To date, several modifications to the canonical PSO have been proposed in the literature, resulting into a large and dispersed collection of codes and algorithms which might well be used for similar if not identical purposes. In this work we present hydroPSO, a platform-independent R package implementing several enhancements to the canonical PSO that we consider of utmost importance to bring this technique to the attention of a broader community of scientists and practitioners. hydroPSO is model-independent, allowing the user to interface any model code with the calibration engine without having to invest considerable effort in customizing PSO to a new calibration problem. Some of the controlling options to fine-tune hydroPSO are: four alternative topologies, several types of inertia weight, time-variant acceleration coefficients, time-variant maximum velocity, regrouping of particles when premature convergence is detected, different types of boundary conditions and many others. Additionally, hydroPSO implements recent PSO variants such as: Improved Particle Swarm Optimisation (IPSO), Fully Informed Particle Swarm (FIPS), and weighted FIPS (wFIPS). Finally, an advanced sensitivity analysis using the Latin Hypercube One-At-a-Time (LH-OAT) method and user-friendly plotting summaries facilitate the interpretation and assessment of the calibration/optimisation results. We validate hydroPSO against the standard PSO algorithm (SPSO-2007) employing five test functions commonly used to assess the performance of optimisation algorithms. Additionally, we illustrate how the performance of the optimization/calibration engine is boosted by using several of the fine-tune options included in hydroPSO. Finally, we show how to interface SWAT-2005 with hydroPSO to calibrate a semi-distributed hydrological model for the Ega River basin in Spain, and how to interface MODFLOW-2000 and hydroPSO to calibrate a groundwater flow model for the regional aquifer of the Pampa del Tamarugal in Chile. We limit the applications of hydroPSO to study cases dealing with surface water and groundwater models as these two are the authors' areas of expertise. However, based on the flexibility of hydroPSO we believe this package can be implemented to any model code requiring some form of parameter estimation.
NASA Astrophysics Data System (ADS)
Dubey, M.; Chandra, H.; Kumar, Anil
2016-02-01
A thermal model for the performance evaluation of a gas turbine cogeneration system with reheat is presented in this paper. The Joule-Brayton cogeneration reheat cycle, formulated in terms of the total useful energy rate (TUER), has been optimised and the efficiency at the maximum TUER determined. The variation of the maximum dimensionless TUER and of the efficiency at maximum TUER with cycle temperature ratio has also been analysed. From the results, it has been found that the dimensionless maximum TUER and the corresponding thermal efficiency decrease with increasing power-to-heat ratio. The results also show that the inclusion of reheat significantly improves the overall performance of the cycle. From the thermodynamic performance point of view, this methodology may be quite useful in the selection and comparison of combined energy production systems.
Optimisation of a Generic Ionic Model of Cardiac Myocyte Electrical Activity
Guo, Tianruo; Al Abed, Amr; Lovell, Nigel H.; Dokos, Socrates
2013-01-01
A generic cardiomyocyte ionic model, whose complexity lies between a simple phenomenological formulation and a biophysically detailed ionic membrane current description, is presented. The model provides a user-defined number of ionic currents, employing two-gate Hodgkin-Huxley type kinetics. Its generic nature allows accurate reconstruction of action potential waveforms recorded experimentally from a range of cardiac myocytes. Using a multiobjective optimisation approach, the generic ionic model was optimised to accurately reproduce multiple action potential waveforms recorded from central and peripheral sinoatrial nodes and right atrial and left atrial myocytes from rabbit cardiac tissue preparations, under different electrical stimulus protocols and pharmacological conditions. When fitted simultaneously to multiple datasets, the time course of several physiologically realistic ionic currents could be reconstructed. Model behaviours tend to be well identified when extra experimental information is incorporated into the optimisation. PMID:23710254
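A minimal sketch of a single generic two-gate Hodgkin-Huxley-type current of the kind such a model composes is given below, simulated under voltage clamp; the conductance, reversal potential, gate midpoints, slopes and time constants are illustrative values, and the multi-current model and multiobjective fitting procedure are not reproduced.

```python
import numpy as np

# A generic ionic current I = g * m * h * (V - E_rev) with two Hodgkin-Huxley-type gates.
G, E_REV = 1.0, -85.0                      # illustrative conductance (nS) and reversal potential (mV)

def steady_state(v, v_half, k):
    """Boltzmann steady-state curve for a gating variable."""
    return 1.0 / (1.0 + np.exp((v_half - v) / k))

def simulate(v_clamp=0.0, dt=0.01, t_end=50.0):
    """Voltage-clamp response of the two-gate current (forward Euler integration)."""
    m, h = 0.0, 1.0                        # activation and inactivation gates
    trace = []
    for _ in np.arange(0.0, t_end, dt):
        m_inf = steady_state(v_clamp, -20.0, 6.0)     # activation midpoint/slope (illustrative)
        h_inf = steady_state(v_clamp, -60.0, -6.0)    # inactivation (negative slope factor)
        tau_m, tau_h = 1.0, 20.0                      # gate time constants in ms (illustrative)
        m += dt * (m_inf - m) / tau_m
        h += dt * (h_inf - h) / tau_h
        trace.append(G * m * h * (v_clamp - E_REV))
    return np.array(trace)

i = simulate()
print("peak current:", round(float(i.max()), 3), "late current:", round(float(i[-1]), 3))
```

Fitting such a model to multiple recorded action potential waveforms would then amount to adjusting the conductances, midpoints, slopes and time constants of each current, which is the optimisation problem the abstract describes.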
Load-sensitive dynamic workflow re-orchestration and optimisation for faster patient healthcare.
Meli, Christopher L; Khalil, Ibrahim; Tari, Zahir
2014-01-01
Hospital waiting times are considerably long, with no signs of reducing any time soon. A number of factors, including population growth, the ageing population and a lack of new infrastructure, are expected to further exacerbate waiting times in the near future. In this work, we show how healthcare services can be modelled as queueing nodes, together with healthcare service workflows, such that these workflows can be optimised during execution in order to reduce patient waiting times. Services such as X-ray, computed tomography and magnetic resonance imaging often form queues; thus, by taking into account the waiting times of each service, the workflow can be re-orchestrated and optimised. Experimental results indicate that average waiting time reductions are achievable by optimising workflows using dynamic re-orchestration. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.
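As a toy illustration of the queueing-node idea, the sketch below models imaging services as M/M/1 queues and routes a clinically interchangeable workflow step to the service with the shortest expected time in system; the arrival and service rates are hypothetical, and the paper's actual re-orchestration logic is not reproduced.

```python
# hypothetical imaging services modelled as M/M/1 queues: (arrival rate, service rate) per hour
services = {"X-ray": (5.0, 6.0), "CT": (3.0, 4.0), "MRI": (1.5, 2.0)}

def mm1_time_in_system(lam, mu):
    """Mean time in an M/M/1 queueing node (waiting + service): W = 1 / (mu - lambda)."""
    assert lam < mu, "queue must be stable (arrival rate below service rate)"
    return 1.0 / (mu - lam)

waits = {name: mm1_time_in_system(*rates) for name, rates in services.items()}
print("expected time in system (h):", {k: round(v, 2) for k, v in waits.items()})

# dynamic re-orchestration: if either CT or MRI would satisfy the next clinical step,
# route the patient to whichever currently has the shorter expected time in system
alternatives = ["CT", "MRI"]
print("route next step to:", min(alternatives, key=waits.get))
```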
Optimisation of Fabric Reinforced Polymer Composites Using a Variant of Genetic Algorithm
NASA Astrophysics Data System (ADS)
Axinte, Andrei; Taranu, Nicolae; Bejan, Liliana; Hudisteanu, Iuliana
2017-12-01
Fabric reinforced polymeric composites are high performance materials with a rather complex fabric geometry. Therefore, modelling this type of material is a cumbersome task, especially when efficient use is targeted. One of the most important issues in its design process is the optimisation of the individual laminae and of the laminated structure as a whole. To this end, a parametric model of the material has been defined, emphasising the many geometric variables that need to be correlated in the complex optimisation process. The input parameters involved in this work include the widths or heights of the tows and the laminate stacking sequence, which are discrete variables, while the gaps between adjacent tows and the height of the neat matrix are continuous variables. This work is one of the first attempts to use a Genetic Algorithm (GA) to optimise the geometrical parameters of satin reinforced multi-layer composites. Given the mixed type of the input parameters involved, an original software package called SOMGA (Satin Optimisation with a Modified Genetic Algorithm) has been conceived and utilised in this work. The main goal is to find the best possible solution to the problem of designing a composite material that is able to withstand a given set of external, in-plane loads. The optimisation has been performed using a fitness function that can analyse and compare the mechanical behaviour of different fabric reinforced composites, the results being correlated with the ultimate strains, which demonstrate the efficiency of the composite structure.
NASA Astrophysics Data System (ADS)
Perry, Anna-Kristina; Pavia, Giancarlo; Passmore, Martin
2016-11-01
As vehicle manufacturers work to reduce the energy consumption of all types of vehicles, external vehicle aerodynamics has become increasingly important. Whilst production vehicle shape optimisation methods are well developed, the need to make further advances requires a deeper understanding of the highly three-dimensional flow around bluff bodies. In this paper, the wake flow of a generic bluff body, the Windsor body, based on a square-back car geometry, was investigated by means of balance measurements, surface pressure measurements and 2D particle image velocimetry planes. Changes in the wake topology are triggered by the application of short tapers (4% of the model length) to the top and bottom edges of the base, representing a shape optimisation that is realistic for many modern production vehicles. The base drag is calculated and correlated with the aerodynamic drag data. The results not only show the effectiveness of such small devices in modifying the time-averaged topology of the wake but also shed some light on the effects produced by different levels of upwash and downwash on the bi-stable nature of the wake itself.
NASA Astrophysics Data System (ADS)
van der Kuur, J.; Gottardi, L. G.; Akamatsu, H.; van Leeuwen, B. J.; den Hartog, R.; Haas, D.; Kiviranta, M.; Jackson, B. J.
2016-07-01
Athena is a space-based X-ray observatory intended for exploration of the hot and energetic universe. One of the science instruments on Athena will be the X-ray Integral Field Unit (X-IFU), a cryogenic X-ray spectrometer based on a large cryogenic imaging array of Transition Edge Sensor (TES) microcalorimeters operating at a temperature of 100 mK. The imaging array consists of 3800 pixels providing 2.5 eV spectral resolution, and covers a field of view with a diameter of 5 arc minutes. Multiplexed readout of the cryogenic microcalorimeter array is essential to comply with the cooling power and complexity constraints on a spacecraft. Frequency domain multiplexing has been under development for the readout of TES-based detectors for this purpose, not only for the X-IFU detector arrays but also for TES-based bolometer arrays for the Safari instrument of the Japanese SPICA observatory. This paper discusses the design considerations applicable to optimising the multiplex factor within the boundary conditions set by the spacecraft. More specifically, the interplay between science requirements such as pixel dynamic range, pixel speed and cross-talk, and spacecraft requirements such as the power dissipation budget, available bandwidth and electromagnetic compatibility will be discussed.
Optimisation of the Management of Higher Activity Waste in the UK - 13537
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walsh, Ciara; Buckley, Matthew
2013-07-01
The Upstream Optioneering project was created in the Nuclear Decommissioning Authority (NDA, UK) to support the development and implementation of significant opportunities to optimise activities across all phases of the Higher Activity Waste management life cycle (i.e. retrieval, characterisation, conditioning, packaging, storage, transport and disposal). The objective of the Upstream Optioneering project is to work in conjunction with other functions within the NDA and the waste producers to identify and deliver solutions to optimise the management of higher activity waste. Historically, optimisation may have occurred on individual aspects of the waste life cycle (considered here to include retrieval, conditioning, treatment, packaging, interim storage and transport to the final end state, which may be geological disposal). By considering the waste life cycle as a whole, critical analysis of assumed constraints may lead to cost savings for the UK taxpayer. For example, it may be possible to challenge the requirements for packaging wastes for disposal to deliver an optimised waste life cycle. It is likely that the challenges faced in the UK are shared in other countries. It is therefore likely that the opportunities identified may also apply elsewhere, with the potential for sharing information to enable value to be shared. (authors)
Design optimisation of powers-of-two FIR filter using self-organising random immigrants GA
NASA Astrophysics Data System (ADS)
Chandra, Abhijit; Chattopadhyay, Sudipta
2015-01-01
In this communication, we propose a novel design strategy for a multiplier-less low-pass finite impulse response (FIR) filter with the aid of a recent evolutionary optimisation technique known as the self-organising random immigrants genetic algorithm. The individual impulse response coefficients of the proposed filter have been encoded as sums of signed powers-of-two. During the formulation of the cost function for the optimisation algorithm, both the frequency response characteristic and the hardware cost of the discrete-coefficient FIR filter have been considered. The effect of the crossover probability of the optimisation technique on the overall performance of the proposed strategy has been evaluated. For this purpose, the convergence characteristic of the optimisation technique has been included in the simulation results. In our analysis, two design examples with different specifications have been taken into account. In order to substantiate the efficiency of our proposed structure, a number of state-of-the-art design strategies for multiplier-less FIR filters have also been included in this article for the purpose of comparison. Critical analysis of the results unambiguously establishes the usefulness of our proposed approach for the hardware-efficient design of digital filters.
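The hardware saving comes from representing each coefficient as a sum of signed powers-of-two (SPT), so that multiplications reduce to shifts and additions and the hardware cost can be counted in adders. The sketch below is a simple greedy SPT approximation and adder count written only to illustrate this encoding; it is not the paper's genetic algorithm, its exact coefficient word length or its cost function, and the example tap values are invented.

```python
def spt_terms(coefficient, n_terms=4, n_bits=8):
    """Greedy signed powers-of-two approximation of a fractional coefficient.
    Returns a list of signed terms +/- 2^-k with k in 0..n_bits."""
    terms, residual = [], coefficient
    for _ in range(n_terms):
        # choose the signed power of two closest to the remaining residual
        best = min((s * 2.0 ** -k for k in range(n_bits + 1) for s in (1, -1)),
                   key=lambda t: abs(residual - t))
        if abs(residual - best) >= abs(residual):
            break                      # no further improvement possible
        terms.append(best)
        residual -= best
    return terms

def adder_cost(coeffs, **kw):
    """Hardware cost element: one adder per extra SPT term in each coefficient."""
    return sum(max(len(spt_terms(c, **kw)) - 1, 0) for c in coeffs)

h = [0.028, -0.054, 0.30, 0.53, 0.30, -0.054, 0.028]   # illustrative FIR taps
cost = adder_cost(h)
```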
A New Multiconstraint Method for Determining the Optimal Cable Stresses in Cable-Stayed Bridges
Asgari, B.; Osman, S. A.; Adnan, A.
2014-01-01
Cable-stayed bridges are one of the most popular types of long-span bridges. The structural behaviour of cable-stayed bridges is sensitive to the load distribution between the girder, pylons, and cables. The determination of pretensioning cable stresses is critical in the cable-stayed bridge design procedure. By finding the optimum stresses in the cables, the load and moment distribution of the bridge can be improved. In recent years, different research works have studied iterative and modern methods to find the optimum cable stresses. However, most of the proposed methods have limitations in optimising the structural performance of cable-stayed bridges. This paper presents a multiconstraint optimisation method to specify the optimum cable forces in cable-stayed bridges. The proposed optimisation method produces lower bending moments and stresses in the bridge members and requires shorter simulation time than other proposed methods. The results of the comparative study show that the proposed method is more successful in restricting the deck and pylon displacements and providing a uniform deck moment distribution than the unit load method (ULM). The final design of cable-stayed bridges can be optimised considerably through the proposed multiconstraint optimisation method. PMID:25050400
McEvoy, Eamon; Donegan, Sheila; Power, Joe; Altria, Kevin
2007-05-09
A rapid and efficient oil-in-water microemulsion liquid chromatographic (MELC) method has been optimised and validated for the analysis of paracetamol in a suppository formulation. Excellent linearity, accuracy, precision and assay results were obtained. Lengthy sample pre-treatment/extraction procedures were eliminated due to the solubilising power of the microemulsion, and rapid analysis times were achieved. The method was optimised to achieve rapid analysis times and relatively high peak efficiencies. A standard microemulsion composition of 33 g SDS, 66 g butan-1-ol and 8 g n-octane in 1 L of 0.05% TFA, modified with acetonitrile, has been shown to be suitable for the rapid analysis of paracetamol in highly hydrophobic preparations under isocratic conditions. Validated assay results and the overall analysis time of the optimised method were compared to British Pharmacopoeia reference methods. Sample preparation and analysis times for the MELC analysis of paracetamol in a suppository were extremely rapid compared to the reference method, and similar assay results were achieved. A gradient MELC method using the same microemulsion has been optimised for the resolution of paracetamol and five of its related substances in approximately 7 min.
Optimisation of the supercritical extraction of toxic elements in fish oil.
Hajeb, P; Jinap, S; Shakibazadeh, Sh; Afsah-Hejri, L; Mohebbi, G H; Zaidul, I S M
2014-01-01
This study aims to optimise the operating conditions for the supercritical fluid extraction (SFE) of toxic elements from fish oil. The SFE operating parameters of pressure, temperature, CO2 flow rate and extraction time were optimised using a central composite design (CCD) of response surface methodology (RSM). High coefficients of determination (R²) (0.897-0.988) for the predicted response surface models confirmed a satisfactory fit of the polynomial regression models to the operating conditions. The results showed that the linear and quadratic terms of pressure and temperature were the most significant (p < 0.05) variables affecting the overall responses. The optimum conditions for the simultaneous elimination of toxic elements comprised a pressure of 61 MPa, a temperature of 39.8 °C, a CO₂ flow rate of 3.7 ml min⁻¹ and an extraction time of 4 h. These optimised SFE conditions were able to produce fish oil with the contents of lead, cadmium, arsenic and mercury reduced by up to 98.3%, 96.1%, 94.9% and 93.7%, respectively. The fish oil extracted under the optimised SFE operating conditions was of good quality in terms of its fatty acid constituents.
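Response surface methodology of this kind fits a second-order polynomial to the responses measured at the central composite design points and then locates the optimum on that fitted surface. The sketch below fits such a quadratic model for two coded factors by ordinary least squares and computes R²; the design points and response values are made-up numbers for illustration, not the study's data, which involved four factors.

```python
import numpy as np

def quadratic_design_matrix(x1, x2):
    """Second-order RSM model terms: 1, x1, x2, x1*x2, x1^2, x2^2."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

# Made-up coded factor levels (e.g. pressure and temperature) and responses
x1 = np.array([-1, 1, -1, 1, 0, 0, -1.41, 1.41, 0, 0], dtype=float)
x2 = np.array([-1, -1, 1, 1, 0, 0, 0, 0, -1.41, 1.41], dtype=float)
y  = np.array([62, 75, 70, 88, 90, 91, 60, 85, 66, 83], dtype=float)

X = quadratic_design_matrix(x1, x2)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)        # fitted model coefficients
y_hat = X @ beta
r_squared = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```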
Devikanniga, D; Joshua Samuel Raj, R
2018-04-01
Osteoporosis is a life-threatening disease that commonly affects women, mostly after menopause. It primarily causes mild bone fractures, which at an advanced stage can lead to death. The diagnosis of osteoporosis is based on bone mineral density (BMD) values obtained through various clinical methods applied to various skeletal regions. The main objective of the authors' work is to develop a hybrid classifier model that discriminates osteoporotic patients from healthy persons based on BMD values. In this Letter, the authors propose a monarch butterfly optimisation-based artificial neural network classifier which helps in the earlier diagnosis and prevention of osteoporosis. The experiments were conducted using the 10-fold cross-validation method for two datasets, lumbar spine and femoral neck. The results were compared with other similar hybrid approaches. The proposed method achieved accuracy, specificity and sensitivity of 97.9% ± 0.14, 98.33% ± 0.03 and 95.24% ± 0.08, respectively, for the lumbar spine dataset and 99.3% ± 0.16, 99.2% ± 0.13 and 100%, respectively, for the femoral neck dataset. Further, its performance was compared using receiver operating characteristic analysis and the Wilcoxon signed-rank test. The results show that the proposed classifier is efficient and outperforms the other approaches in all cases.
Evolving aerodynamic airfoils for wind turbines through a genetic algorithm
NASA Astrophysics Data System (ADS)
Hernández, J. J.; Gómez, E.; Grageda, J. I.; Couder, C.; Solís, A.; Hanotel, C. L.; Ledesma, JI
2017-01-01
Nowadays, genetic algorithms stand out for airfoil optimisation, owing to the virtues of mutation and crossover techniques. In this work we propose a genetic algorithm with arithmetic crossover rules. The optimisation criteria are taken to be the maximisation of both aerodynamic efficiency and lift coefficient, while minimising the drag coefficient. The algorithm shows great improvements in computational cost, as well as high performance, obtaining airfoils optimised for Mexico City's specific wind conditions from airfoils of generic wind turbines designed for higher Reynolds numbers in a few iterations.
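Arithmetic crossover, the operator this GA relies on, blends two real-valued parent vectors into children lying on the segment between them. The sketch below shows only the operator itself; the airfoil parameterisation shown (a short vector of shape coefficients) and the parent values are hypothetical, and the aerodynamic fitness evaluation is outside its scope.

```python
import numpy as np

def arithmetic_crossover(parent_a, parent_b, rng):
    """Blend two real-coded parents: child = alpha*a + (1 - alpha)*b."""
    alpha = rng.random()                       # per-pair mixing coefficient
    child1 = alpha * parent_a + (1.0 - alpha) * parent_b
    child2 = (1.0 - alpha) * parent_a + alpha * parent_b
    return child1, child2

rng = np.random.default_rng(1)
# Hypothetical airfoil shape parameters (e.g. camber/thickness coefficients)
a = np.array([0.02, 0.40, 0.12, 0.30])
b = np.array([0.05, 0.35, 0.10, 0.25])
c1, c2 = arithmetic_crossover(a, b, rng)
```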
Examples of the use of optimisation techniques in the structural analysis of reactors
2003-03-01
Geometric optimisation (fixed architecture): unlike the automotive sector and aircraft manufacturers, most reactor components ... use nonlinear material constitutive laws as well as large-displacement assumptions. The optimisation study consists in minimising ... a simple disk, and it was decided to select three parameters that influence rupture: the thickness of the disk web E1, the height L3 and the ...
Microfluidic converging/diverging channels optimised for homogeneous extensional deformation.
Zografos, K; Pimenta, F; Alves, M A; Oliveira, M S N
2016-07-01
In this work, we optimise microfluidic converging/diverging geometries in order to produce constant strain-rates along the centreline of the flow, for performing studies under homogeneous extension. The design is examined for both two-dimensional and three-dimensional flows where the effects of aspect ratio and dimensionless contraction length are investigated. Initially, pressure driven flows of Newtonian fluids under creeping flow conditions are considered, which is a reasonable approximation in microfluidics, and the limits of the applicability of the design in terms of Reynolds numbers are investigated. The optimised geometry is then used for studying the flow of viscoelastic fluids and the practical limitations in terms of Weissenberg number are reported. Furthermore, the optimisation strategy is also applied for electro-osmotic driven flows, where the development of a plug-like velocity profile allows for a wider region of homogeneous extensional deformation in the flow field.
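The design target can be stated compactly: homogeneous extension along the centreline requires a constant strain rate, i.e. (using generic symbols rather than the paper's notation)

\dot{\varepsilon} = \frac{\mathrm{d}u}{\mathrm{d}x} = \text{const} \quad\Longrightarrow\quad u(x) = u_0 + \dot{\varepsilon}\,x,

so the wall shape is optimised until the centreline velocity grows linearly through the converging section; for a viscoelastic fluid with relaxation time \lambda, the Weissenberg number then follows as \mathrm{Wi} = \lambda\,\dot{\varepsilon}.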
NASA Astrophysics Data System (ADS)
Bhansali, Gaurav; Singh, Bhanu Pratap; Kumar, Rajesh
2016-09-01
In this paper, the problem of microgrid optimisation with storage is addressed in a comprehensive way rather than being confined to loss minimisation. Unitised regenerative fuel cell (URFC) systems have been studied and employed in microgrids to store energy and feed it back into the system when required. A value function dependent on line losses, URFC system operational cost and the stored energy at the end of the day is defined here. The function is highly complex, nonlinear and multi-dimensional in nature. Therefore, heuristic optimisation techniques in combination with load flow analysis are used here to resolve the network and time-domain complexity of the problem. Particle swarm optimisation with the forward/backward sweep algorithm ensures optimal operation of the microgrid, thereby minimising its operational cost. Results are shown and are found to improve consistently as the solution strategy evolves.
NASA Astrophysics Data System (ADS)
Böing, F.; Murmann, A.; Pellinger, C.; Bruckmeier, A.; Kern, T.; Mongin, T.
2018-02-01
The expansion of capacities in the German transmission grid is a necessity for the further integration of renewable energy sources into the electricity sector. In this paper, the grid optimisation measures ‘Overhead Line Monitoring’, ‘Power-to-Heat’ and ‘Demand Response in the Industry’ are evaluated and compared against conventional grid expansion for the year 2030. Initially, the methodological approach of the simulation model is presented and detailed descriptions of the grid model and the grid data used, which partly originate from open-source platforms, are provided. Further, this paper explains how ‘Curtailment’ and ‘Redispatch’ can be reduced by implementing grid optimisation measures and how the depreciation of economic costs can be determined considering construction costs. The simulations developed show that conventional grid expansion is more efficient and provides more grid-relieving effects than the evaluated grid optimisation measures.
VLSI Technology for Cognitive Radio
NASA Astrophysics Data System (ADS)
VIJAYALAKSHMI, B.; SIDDAIAH, P.
2017-08-01
One of the most challenging tasks in cognitive radio is efficient spectrum sensing to overcome the spectrum scarcity problem. The most popular and widely used spectrum sensing technique is the energy detection scheme, as it is very simple and does not require any prior information about the signal. We propose one such approach: an optimised spectrum sensing scheme with a reduced filter structure. The optimisation is carried out in terms of the area and power performance of the spectrum sensing hardware. The VLSI structure of the optimised flexible spectrum sensing scheme is simulated in Verilog using the Xilinx ISE software. Our method achieves a 13% reduction in area and a 66% reduction in power consumption in comparison to the flexible spectrum sensing scheme. All the results are tabulated and comparisons are made. Our model thus opens up a new scheme for optimised and effective spectrum sensing.
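Energy detection, the sensing principle behind the optimised hardware, simply compares the average received signal energy over a sensing window with a threshold. The Python sketch below illustrates only this decision rule, not the reduced-filter VLSI architecture of the paper; the window length, threshold and test signal are invented for illustration.

```python
import numpy as np

def energy_detect(samples, threshold):
    """Decide 'primary user present' if the average signal energy exceeds a threshold."""
    test_statistic = np.mean(np.abs(samples) ** 2)
    return test_statistic > threshold, test_statistic

rng = np.random.default_rng(0)
n = 1024                                                               # sensing window length (illustrative)
noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)    # unit-power complex noise
signal = 0.5 * np.exp(2j * np.pi * 0.1 * np.arange(n))                 # hypothetical primary-user tone

occupied, stat_occupied = energy_detect(signal + noise, threshold=1.2)
vacant, stat_vacant = energy_detect(noise, threshold=1.2)
```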
NASA Astrophysics Data System (ADS)
Grady, A.; Makarigakis, A.; Gersonius, B.
2015-09-01
This paper investigates how to optimise decentralisation for effective disaster risk reduction (DRR) in developing states. There is currently limited literature on empirical analysis of decentralisation for DRR. This paper evaluates decentralised governance for DRR in the case study of Indonesia and provides recommendations for its optimisation. Wider implications are drawn to optimise decentralisation for DRR in developing states more generally. A framework to evaluate the institutional and policy setting was developed which necessitated the use of a gap analysis, desk study and field investigation. Key challenges to decentralised DRR include capacity gaps at lower levels, low compliance with legislation, disconnected policies, issues in communication and coordination and inadequate resourcing. DRR authorities should lead coordination and advocacy on DRR. Sustainable multistakeholder platforms and civil society organisations should fill the capacity gap at lower levels. Dedicated and regulated resources for DRR should be compulsory.
A management and optimisation model for water supply planning in water deficit areas
NASA Astrophysics Data System (ADS)
Molinos-Senante, María; Hernández-Sancho, Francesc; Mocholí-Arce, Manuel; Sala-Garrido, Ramón
2014-07-01
The integrated water resources management approach has proven to be a suitable option for efficient, equitable and sustainable water management. In water-poor regions experiencing acute and/or chronic shortages, optimisation techniques are a useful tool for supporting the decision process of water allocation. In order to maximise the value of water use, an optimisation model was developed which involves multiple supply sources (conventional and non-conventional) and multiple users. Penalties, representing monetary losses in the event of an unfulfilled water demand, have been incorporated into the objective function. This model represents a novel approach which considers water distribution efficiency and the physical connections between water supply and demand points. Subsequent empirical testing using data from a Spanish Mediterranean river basin demonstrated the usefulness of the global optimisation model to solve existing water imbalances at the river basin level.
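A minimal version of such an allocation model can be posed as a linear programme that maximises the value of delivered water minus penalties for unmet demand, subject to source capacities. The sketch below, using scipy, is only a toy with two sources, two users and invented benefits, penalties, demands and capacities; the paper's model additionally represents distribution efficiencies and the physical connections between supply and demand points.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: x11, x12, x21, x22 (allocation source i -> user j) and s1, s2 (shortfalls)
benefit = np.array([3.0, 2.0])        # value per unit delivered to each user (invented)
penalty = np.array([10.0, 4.0])       # monetary loss per unit of unmet demand (invented)
demand = np.array([60.0, 40.0])
capacity = np.array([50.0, 30.0])     # e.g. a conventional and a non-conventional source

c = np.concatenate([-np.tile(benefit, 2), penalty])       # minimise -value + penalties
A_eq = np.array([[1, 0, 1, 0, 1, 0],                      # delivered + shortfall = demand (user 1)
                 [0, 1, 0, 1, 0, 1]], dtype=float)        # delivered + shortfall = demand (user 2)
A_ub = np.array([[1, 1, 0, 0, 0, 0],                      # source 1 capacity limit
                 [0, 0, 1, 1, 0, 0]], dtype=float)        # source 2 capacity limit
res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand, bounds=(0, None))
allocations, shortfalls = res.x[:4].reshape(2, 2), res.x[4:]
```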
Optimisation of confinement in a fusion reactor using a nonlinear turbulence model
NASA Astrophysics Data System (ADS)
Highcock, E. G.; Mandell, N. R.; Barnes, M.
2018-04-01
The confinement of heat in the core of a magnetic fusion reactor is optimised using a multidimensional optimisation algorithm. For the first time in such a study, the loss of heat due to turbulence is modelled at every stage using first-principles nonlinear simulations which accurately capture the turbulent cascade and large-scale zonal flows. The simulations utilise a novel approach, with gyrofluid treatment of the small-scale drift waves and gyrokinetic treatment of the large-scale zonal flows. A simple near-circular equilibrium with standard parameters is chosen as the initial condition. The figure of merit, fusion power per unit volume, is calculated, and then two control parameters, the elongation and triangularity of the outer flux surface, are varied, with the algorithm seeking to optimise the chosen figure of merit. A twofold increase in the plasma power per unit volume is achieved by moving to higher elongation and strongly negative triangularity.
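The outer optimisation loop is conceptually simple: the two shaping parameters are varied by a multidimensional optimiser and each candidate shape is scored by the fusion power per unit volume returned by the turbulence simulations. The sketch below shows such a loop with a stand-in analytic figure of merit (an assumption made purely so the example runs); in the study itself each evaluation requires the nonlinear gyrofluid/gyrokinetic simulations.

```python
from scipy.optimize import minimize

def fusion_power_density(shape):
    """Stand-in figure of merit; in the study this is obtained from
    first-principles turbulence simulations of the candidate equilibrium."""
    elongation, triangularity = shape
    return (elongation - 1.0) - 0.5 * (triangularity + 0.5) ** 2   # toy surrogate only

def objective(shape):
    return -fusion_power_density(shape)        # maximise by minimising the negative

result = minimize(objective, x0=[1.0, 0.0],    # near-circular starting equilibrium
                  method="L-BFGS-B",
                  bounds=[(1.0, 2.2), (-0.6, 0.6)])
best_elongation, best_triangularity = result.x
```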
A Strategy-Based Approach towards Optimising Research Output
ERIC Educational Resources Information Center
Lues, L.
2013-01-01
The South African higher education fraternity has experienced an outflow of senior research capacity during the past decade, resulting in a large influx of younger and less-published academics. More emphasis is therefore placed on the role of the central institution in ensuring research output. The Faculty of Economic and Management Sciences at a…
The Death of Socrates: Managerialism, Metrics and Bureaucratisation in Universities
ERIC Educational Resources Information Center
Orr, Yancey; Orr, Raymond
2016-01-01
Neoliberalism extols the ability of unregulated markets to optimise human relations. Yet, as David Graeber has recently illustrated, it is paradoxically built on rigorous systems of rules, metrics and managers. The potential transition to a market-based tuition and research-funding model for higher education in Australia has, not surprisingly,…
A Pilot Study of the Epistemological Beliefs of Students in Industrial-Technical Fields
ERIC Educational Resources Information Center
Zinn, Bernd
2012-01-01
An investigation of the epistemological beliefs of apprentices in the commercial engineering sector is of interest for vocational training, both from the point of view of optimising vocational didactic processes as well as in terms of communicating suitable knowledge based beliefs about principles and performance in the commercial engineering…
NASA Astrophysics Data System (ADS)
Lin, Yi-Kuei; Yeh, Cheng-Ta
2013-05-01
From the perspective of supply chain management, the selected carrier plays an important role in freight delivery. This article proposes a new criterion of multi-commodity reliability and optimises carrier selection based on this criterion for logistics networks with routes and nodes, over which multiple commodities are delivered. Carrier selection concerns the selection of exactly one carrier to deliver freight on each route. The capacity of each carrier has several available values associated with a probability distribution, since some of a carrier's capacity may be reserved for various orders. Therefore, the logistics network, given any carrier selection, is a multi-commodity multi-state logistics network. Multi-commodity reliability is defined as the probability that the logistics network can satisfy a customer's demand for various commodities, and is a performance indicator for freight delivery. To solve this problem, this study proposes an optimisation algorithm that integrates a genetic algorithm, minimal paths and the Recursive Sum of Disjoint Products. A practical example, in which multi-sized LCD monitors are delivered from China to Germany, is considered to illustrate the solution procedure.
Optimising value from the soft re-use of brownfield sites.
Bardos, R Paul; Jones, Sarah; Stephenson, Ian; Menger, Pierre; Beumer, Victor; Neonato, Francesca; Maring, Linda; Ferber, Uwe; Track, Thomas; Wendler, Katja
2016-09-01
Soft re-use of brownfields describes intended temporary or final re-uses of brownfield sites which are not based on built constructions or infrastructure ('hard' re-use). Examples of soft re-uses include the creation of public green space. These are essentially uses where the soil is not sealed. Often the case for soft re-use of brownfields has not been easy to demonstrate in strictly financial terms. The purpose of this paper is to describe a value-based approach to identify and optimise the services provided by the restoration of brownfields to soft re-uses, on a permanent or interim basis. A 'Brownfield Opportunity Matrix' is suggested as a means of identifying and discussing soft restoration opportunities. The use of 'sustainability linkages' is suggested as a means of understanding the sustainability of the services under consideration and providing a structure for the overall valuation of restoration work, for example as part of design or option appraisal processes, or to support the solicitation of interest in a project. Copyright © 2015 Elsevier B.V. All rights reserved.
Inauen, Alice; Rettke, Horst; Fridrich, Annemarie; Spirig, Rebecca; Bauer, Georg F
2017-01-01
Background: Due to scarce resources in health care, staff deployment has to meet the demands. To optimise the skill-grade mix, a Swiss university hospital initiated a project based on principles of Lean Management. The project team accompanied each participating nursing department and scientifically evaluated the results of the project. Aim: The aim of this qualitative sub-study was to identify critical success factors of this project. Method: In four focus groups, participants discussed their experience of the project. Participants were recruited from departments that retrospectively assessed the impact of the project as either positive or critical. In addition, the degree of direct involvement in the project served as a distinguishing criterion. Results: While the degree of direct involvement in the project was not decisive, conflicting opinions and experiences appeared between the groups with more positive or more critical project evaluations. Transparency, context and attitude proved critical for the project’s success. Conclusions: Project managers should ensure transparency of the project’s progress and the matching of the project structure with local conditions in order to support participants in their critical or positive attitude towards the project.
Radiation exposure in X-ray-based imaging techniques used in osteoporosis
Adams, Judith E.; Guglielmi, Giuseppe; Link, Thomas M.
2010-01-01
Recent advances in medical X-ray imaging have enabled the development of new techniques capable of assessing not only bone quantity but also structure. This article provides (a) a brief review of the current X-ray methods used for quantitative assessment of the skeleton, (b) data on the levels of radiation exposure associated with these methods and (c) information about radiation safety issues. Radiation doses associated with dual-energy X-ray absorptiometry are very low. However, as with any X-ray imaging technique, each particular examination must always be clinically justified. When an examination is justified, the emphasis must be on dose optimisation of imaging protocols. Dose optimisation is more important for paediatric examinations because children are more vulnerable to radiation than adults. Methods based on multi-detector CT (MDCT) are associated with higher radiation doses. New 3D volumetric hip and spine quantitative computed tomography (QCT) techniques and high-resolution MDCT for evaluation of bone structure deliver doses to patients from 1 to 3 mSv. Low-dose protocols are needed to reduce radiation exposure from these methods and minimise associated health risks. PMID:20559834
The role of predictive uncertainty in the operational management of reservoirs
NASA Astrophysics Data System (ADS)
Todini, E.
2014-09-01
The present work deals with the operational management of multi-purpose reservoirs, whose optimisation-based rules are derived, in the planning phase, via deterministic (linear and nonlinear programming, dynamic programming, etc.) or via stochastic (generally stochastic dynamic programming) approaches. In operation, the resulting deterministic or stochastic optimised operating rules are then triggered based on inflow predictions. In order to fully benefit from predictions, one must avoid using them as direct inputs to the reservoirs, but rather assess the "predictive knowledge" in terms of a predictive probability density to be operationally used in the decision making process for the estimation of expected benefits and/or expected losses. Using a theoretical and extremely simplified case, it will be shown why directly using model forecasts instead of the full predictive density leads to less robust reservoir management decisions. Moreover, the effectiveness and the tangible benefits for using the entire predictive probability density instead of the model predicted values will be demonstrated on the basis of the Lake Como management system, operational since 1997, as well as on the basis of a case study on the lake of Aswan.
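The distinction the paper draws can be written down directly. With f(q | forecast) the predictive density of the future inflow q and L(d, q) the loss incurred by release decision d when inflow q occurs (generic symbols, not the paper's notation), the density-based and point-forecast decisions are

d^{*}_{\text{density}} = \arg\min_{d} \int L(d, q)\, f(q \mid \text{forecast})\, \mathrm{d}q, \qquad d^{*}_{\text{point}} = \arg\min_{d} L(d, \hat{q}),

and because flood and drought losses are typically asymmetric and nonlinear in q, the two decisions generally differ, which is why feeding the point forecast directly into the optimised rules gives less robust management.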
Mitchell, P; Korobelnik, J-F; Lanzetta, P; Holz, F G; Prünte, C; Schmidt-Erfurth, U; Tano, Y; Wolf, S
2010-01-01
Neovascular age-related macular degeneration (AMD) has a poor prognosis if left untreated, frequently resulting in legal blindness. Ranibizumab is approved for treating neovascular AMD. However, further guidance is needed to assist ophthalmologists in clinical practice to optimise treatment outcomes. An international retina expert panel assessed evidence available from prospective, multicentre studies evaluating different ranibizumab treatment schedules (ANCHOR, MARINA, PIER, SAILOR, SUSTAIN and EXCITE) and a literature search to generate evidence-based and consensus recommendations for treatment indication and assessment, retreatment and monitoring. Ranibizumab is indicated for choroidal neovascular lesions with active disease, the clinical parameters of which are outlined. Treatment initiation with three consecutive monthly injections, followed by continued monthly injections, has provided the best visual-acuity outcomes in pivotal clinical trials. If continued monthly injections are not feasible after initiation, a flexible strategy appears viable, with monthly monitoring of lesion activity recommended. Initiation regimens of fewer than three injections have not been assessed. Continuous careful monitoring with flexible retreatment may help avoid vision loss recurring. Standardised biomarkers need to be determined. Evidence-based guidelines will help to optimise treatment outcomes with ranibizumab in neovascular AMD.
Coil optimisation for transcranial magnetic stimulation in realistic head geometry.
Koponen, Lari M; Nieminen, Jaakko O; Mutanen, Tuomas P; Stenroos, Matti; Ilmoniemi, Risto J
Transcranial magnetic stimulation (TMS) allows focal, non-invasive stimulation of the cortex. A TMS pulse is inherently weakly coupled to the cortex; thus, magnetic stimulation requires both high current and high voltage to reach sufficient intensity. These requirements limit, for example, the maximum repetition rate and the maximum number of consecutive pulses with the same coil due to the rise of its temperature. Our objective was to develop methods to optimise, design, and manufacture energy-efficient TMS coils in realistic head geometry with an arbitrary overall coil shape. We derive a semi-analytical integration scheme for computing the magnetic field energy of an arbitrary surface current distribution, compute the electric field induced by this distribution with a boundary element method, and optimise a TMS coil for focal stimulation. Additionally, we introduce a method for manufacturing such a coil by using Litz wire and a coil former machined from polyvinyl chloride. We designed, manufactured, and validated an optimised TMS coil and applied it to brain stimulation. Our simulations indicate that this coil requires less than half the power of a commercial figure-of-eight coil, with a 41% reduction due to the optimised winding geometry and a partial contribution due to our thinner coil former and reduced conductor height. With the optimised coil, the resting motor threshold of abductor pollicis brevis was reached with the capacitor voltage below 600 V and peak current below 3000 A. The described method allows designing practical TMS coils that have considerably higher efficiency than conventional figure-of-eight coils. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Hurford, Anthony; Harou, Julien
2014-05-01
Water-related ecosystem services are important to the livelihoods of the poorest sectors of society in developing countries. Degradation or loss of these services can increase people's vulnerability, decreasing their capacity to support themselves. New approaches are needed to help guide water resources management decisions which account for the non-market value of ecosystem goods and services. In case studies from Brazil and Kenya we demonstrate the capability of many-objective Pareto-optimal trade-off analysis to help decision makers balance economic and non-market benefits from the management of existing multi-reservoir systems. A multi-criteria search algorithm is coupled to a water resources management simulator of each basin to generate a set of Pareto-approximate trade-offs representing the best-case management decisions. In both cases, volume-dependent reservoir release rules are the management decisions being optimised. In the Kenyan case we further assess the impacts of proposed irrigation investments, and how the possibility of new investments affects the system's trade-offs. During the multi-criteria search (optimisation), the performance of different sets of management decisions (policies) is assessed against case-specific objective functions representing the provision of water supply and irrigation, hydropower generation and the maintenance of ecosystem services. Results are visualised as trade-off surfaces to help decision makers understand the impacts of different policies on a broad range of stakeholders and to assist in decision-making. These case studies show how the approach can reveal unexpected opportunities for win-win solutions, and quantify the trade-offs between investing to increase agricultural revenue and negative impacts on protected ecosystems which support rural livelihoods.
NASA Astrophysics Data System (ADS)
Artous, Sébastien; Zimmermann, Eric; Douissard, Paul-Antoine; Locatelli, Dominique; Motellier, Sylvie; Derrough, Samir
2015-05-01
The use of manufactured nanoparticles in many products is growing fast and raises new questions. For this purpose, the CEA - NanoSafety Platform is developing various research topics on health and safety, the environment and nanoparticle exposure in professional activities. Optimising containment to lower exposure, and then assessing exposure to nanoparticles, is a strategy for improving safety at the workplace and in the workspace. The lowering step consists of optimising the dynamic and static containment at the workplace and/or workspace. Generally, the exposure risk due to the presence of nanoparticle substances does not allow the containment parameters at the workplace and/or workspace to be modified directly. Therefore, gaseous or nanoparticulate tracers are used to evaluate the performance of the containment. Using a tracer allows the parameters of the dynamic containment (ventilation, flow, speed) to be modified safely and several configurations of static containment to be studied. Moreover, a tracer allows accidental or incidental situations to be simulated. As a result, a safety procedure can be written more easily in order to manage this type of situation. The aerosol measurement and characterisation step can then be used to assess exposure at the workplace and workspace. The case study presented in this paper concerns the potential emission of lead nanoparticles at the exhaust of a furnace in an epitaxy laboratory. The use of a helium tracer to evaluate the performance of the containment is studied first. Secondly, the exposure assessment is carried out in accordance with the French guide “recommendations for characterizing potential emissions and exposure to aerosols released from nanomaterials in workplace operations”. Thirdly, aerosols are sampled at several locations using collection membranes to detect traces of lead in the air.
NASA Astrophysics Data System (ADS)
Wang, Hui; Chen, Huansheng; Wu, Qizhong; Lin, Junmin; Chen, Xueshun; Xie, Xinwei; Wang, Rongrong; Tang, Xiao; Wang, Zifa
2017-08-01
The Global Nested Air Quality Prediction Modeling System (GNAQPMS) is the global version of the Nested Air Quality Prediction Modeling System (NAQPMS), which is a multi-scale chemical transport model used for air quality forecasting and atmospheric environmental research. In this study, we present the porting and optimisation of GNAQPMS on a second-generation Intel Xeon Phi processor, codenamed Knights Landing (KNL). Compared with the first-generation Xeon Phi coprocessor (codenamed Knights Corner, KNC), KNL has many new hardware features such as a bootable processor, high-performance in-package memory and ISA compatibility with Intel Xeon processors. In particular, we describe the five optimisations we applied to the key modules of GNAQPMS, including the CBM-Z gas-phase chemistry, advection, convection and wet deposition modules. These optimisations work well on both the KNL 7250 processor and the Intel Xeon E5-2697 v4 processor. They include (1) updating the pure Message Passing Interface (MPI) parallel mode to the hybrid parallel mode with MPI and OpenMP in the emission, advection, convection and gas-phase chemistry modules; (2) fully employing the 512-bit-wide vector processing units (VPUs) on the KNL platform; (3) reducing unnecessary memory access to improve cache efficiency; (4) reducing the thread local storage (TLS) in the CBM-Z gas-phase chemistry module to improve its OpenMP performance; and (5) changing the global communication from writing/reading interface files to MPI functions to improve the performance and the parallel scalability. These optimisations greatly improved the GNAQPMS performance. The same optimisations also work well for the Intel Xeon Broadwell processor, specifically the E5-2697 v4. Compared with the baseline version of GNAQPMS, the optimised version was 3.51× faster on KNL and 2.77× faster on the CPU. Moreover, the optimised version ran at 26% lower average power on KNL than on the CPU. With the combined performance and energy improvement, the KNL platform was 37.5% more efficient in power consumption compared with the CPU platform. The optimisations also enabled much further parallel scalability on both the CPU cluster and the KNL cluster, which scaled to 40 CPU nodes and 30 KNL nodes with parallel efficiencies of 70.4% and 42.2%, respectively.
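Optimisation (5), replacing file-based exchange of global data with direct MPI communication, can be illustrated with a small analogue. GNAQPMS itself is not written in Python; the mpi4py sketch below (with an invented field size) only shows the pattern of gathering distributed data in memory with a single collective call instead of writing and re-reading interface files.

```python
# Run with an MPI launcher, e.g.: mpirun -n 4 python gather_example.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

local_field = np.full(1000, float(rank))           # each rank's piece of a model field

# Slow pattern being replaced: every rank writes a file, one rank reads them all back.
# Fast pattern: gather the pieces directly in memory with one collective call.
global_field = None
if rank == 0:
    global_field = np.empty(1000 * size)
comm.Gather(local_field, global_field, root=0)     # a single MPI call replaces the file I/O
```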
Preece, Stephen J; Chapman, Jonathan D; Braunstein, Bjoern; Brüggemann, Gert-Peter; Nester, Christopher J
2017-01-01
Appropriate footwear for individuals with diabetes but no ulceration history could reduce the risk of first ulceration. However, individuals who deem themselves at low risk are unlikely to seek out bespoke, personalised footwear. Therefore, our primary aim was to investigate whether group-optimised footwear designs, which could be prefabricated and delivered in a retail setting, could achieve appropriate pressure reduction, or whether footwear selection must be made on a patient-by-patient basis. A second aim was to compare responses to footwear design between healthy participants and people with diabetes in order to understand the transferability of previous footwear research performed in healthy populations. Plantar pressures were recorded from 102 individuals with diabetes considered at low risk of ulceration. This cohort included 17 individuals with peripheral neuropathy. We also collected data from 66 healthy controls. Each participant walked in 8 rocker shoe designs (4 apex positions × 2 rocker angles). ANOVA was then used to understand the effect of the two design features, and descriptive statistics were used to identify the group-optimised design. Using 200 kPa as a target, this group-optimised design was then compared to the design identified as the best for each participant (using plantar pressure data). Peak plantar pressure increased significantly as the apex position was moved distally and the rocker angle was reduced (p < 0.001). The group-optimised design incorporated an apex at 52% of shoe length, a 20° rocker angle and an apex angle of 95°. With this design, 71-81% of peak pressures were below the 200 kPa threshold, both in the full cohort of individuals with diabetes and in the neuropathic subgroup. Importantly, only small increases (<5%) in this proportion were observed when participants wore footwear which was individually selected. In terms of optimised footwear designs, healthy participants demonstrated the same response as participants with diabetes, despite having lower plantar pressures. This is the first study demonstrating that a group-optimised, generic rocker shoe might perform almost as well as footwear selected on a patient-by-patient basis in a low-risk patient group. This work provides a starting point for clinical evaluation of generic versus personalised pressure-reducing footwear.
NASA Astrophysics Data System (ADS)
Moore, Craig S.; Wood, Tim J.; Saunderson, John R.; Beavis, Andrew W.
2017-09-01
The use of computer-simulated digital x-radiographs for optimisation purposes has become widespread in recent years. To make these optimisation investigations effective, it is vital that simulated radiographs contain accurate anatomical and system noise. Computer algorithms that simulate radiographs based solely on the incident detector x-ray intensity (‘dose’) have been reported extensively in the literature. However, while it has been established for digital mammography that x-ray beam quality is an important factor when modelling noise in simulated images, there are no such studies for diagnostic imaging of the chest, abdomen and pelvis. This study investigates the influence of beam quality on image noise in a digital radiography (DR) imaging system, and incorporates these effects into a digitally reconstructed radiograph (DRR) computer simulator. Image noise was measured on a real DR imaging system as a function of dose (absorbed energy) over a range of clinically relevant beam qualities. Simulated ‘absorbed energy’ and ‘beam quality’ DRRs were then created for each patient and tube voltage under investigation. Simulated noise images, corrected for dose and beam quality, were subsequently produced from the absorbed energy and beam quality DRRs, using the measured noise, absorbed energy and beam quality relationships. The noise images were superimposed onto the noiseless absorbed energy DRRs to create the final images. Signal-to-noise measurements in simulated chest, abdomen and spine images were within 10% of the corresponding measurements in real images. This compares favourably to our previous algorithm, in which images corrected for dose only were all within 20%.
Marsac, L; Chauvet, D; La Greca, R; Boch, A-L; Chaumoitre, K; Tanter, M; Aubry, J-F
2017-09-01
Transcranial brain therapy has recently emerged as a non-invasive strategy for the treatment of various neurological diseases, such as essential tremor or neurogenic pain. However, treatments require millimetre-scale accuracy. The use of high frequencies (typically ≥1 MHz) decreases the ultrasonic wavelength to the millimetre scale, thereby increasing the clinical accuracy and lowering the probability of cavitation, which improves the safety of the technique compared with the use of low-frequency devices that operate at 220 kHz. Nevertheless, the skull produces greater distortions of high-frequency waves relative to low-frequency waves. High-frequency waves require high-performance adaptive focusing techniques, based on modelling the wave propagation through the skull. This study sought to optimise the acoustical modelling of the skull based on computed tomography (CT) for a 1 MHz clinical brain therapy system. The best model tested in this article corresponded to a maximum speed of sound of 4000 m s⁻¹ in the skull bone, and it restored 86% of the optimal pressure amplitude on average in a collection of six human skulls. Compared with uncorrected focusing, the optimised non-invasive correction led to an average increase of 99% in the maximum pressure amplitude around the target and an average decrease of 48% in the distance between the peak pressure and the selected target. The attenuation through the skulls was also assessed within the bandwidth of the transducers, and it was found to vary in the range of 10 ± 3 dB at 800 kHz and 16 ± 3 dB at 1.3 MHz.
Comparisons of the utility of researcher-defined and participant-defined successful ageing.
Brown, Lynsey J; Bond, Malcolm J
2016-03-01
To investigate the impact of different approaches for measuring 'successful ageing', four alternative researcher and participant definitions were compared, including a novel measure informed by cluster analysis. Rates of successful ageing were explored, as were their relative associations with age and measures of successful adaptation, to assess construct validity. Participants, aged over 65, were recruited from community-based organisations. Questionnaires (assessing successful ageing, lifestyle activities and selective optimisation with compensation) were completed by 317 individuals. Successful ageing ranged from 11.4% to 87.4%, with higher rates evident from participant definitions. Though dependent upon the definition, successful agers were typically younger, reported greater engagement with lifestyle activities and more frequent optimisation. While the current study suggested an improved classification algorithm using a common research definition, future research should explore how subjective and objective aspects of successful ageing may be combined to derive a measure relevant to policy and practice. © 2016 AJA Inc.
Bock, I; Raveh-Amit, H; Losonczi, E; Carstea, A C; Feher, A; Mashayekhi, K; Matyas, S; Dinnyes, A; Pribenszky, C
2016-04-01
The efficiency of various assisted reproductive techniques can be improved by preconditioning the gametes and embryos with sublethal hydrostatic pressure treatment. However, the underlying molecular mechanism responsible for this protective effect remains unknown and requires further investigation. Here, we studied the effect of optimised hydrostatic pressure treatment on the global gene expression of mouse oocytes after embryonic genome activation. Based on a gene expression microarray analysis, a significant effect of treatment was observed in 4-cell embryos derived from treated oocytes, revealing a transcriptional footprint of hydrostatic pressure-affected genes. Functional analysis identified numerous genes involved in protein synthesis that were downregulated in 4-cell embryos in response to hydrostatic pressure treatment, suggesting that regulation of translation has a major role in optimised hydrostatic pressure-induced stress tolerance. We present a comprehensive microarray analysis and further delineate a potential mechanism responsible for the protective effect of hydrostatic pressure treatment.
Optimised Iteration in Coupled Monte Carlo - Thermal-Hydraulics Calculations
NASA Astrophysics Data System (ADS)
Hoogenboom, J. Eduard; Dufek, Jan
2014-06-01
This paper describes an optimised iteration scheme for the number of neutron histories and the relaxation factor in successive iterations of coupled Monte Carlo and thermal-hydraulic reactor calculations based on the stochastic iteration method. The scheme results in an increasing number of neutron histories for the Monte Carlo calculation in successive iteration steps and a decreasing relaxation factor for the spatial power distribution to be used as input to the thermal-hydraulics calculation. The theoretical basis is discussed in detail and practical consequences of the scheme are shown, among which is a nearly linear increase per iteration in the number of cycles in the Monte Carlo calculation. The scheme is demonstrated for a full PWR-type fuel assembly. Results are shown for the axial power distribution during several iteration steps. A few alternative iteration methods are also tested and it is concluded that the presented iteration method is near optimal.
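The flavour of the scheme can be conveyed with a short sketch: each step runs a Monte Carlo estimate of the power distribution with more histories than the previous step, relaxes it into the running solution with a shrinking relaxation factor, and passes the relaxed distribution to the thermal-hydraulics update. In the Python sketch below, both solver stubs, the history-growth factor and the 1/k relaxation rule are placeholders for illustration, not the scheme's actual prescriptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def monte_carlo_power(th_state, n_histories):
    """Stub for the Monte Carlo solver: noisy axial power shape, noise ~ 1/sqrt(N)."""
    true_shape = np.sin(np.linspace(0.1, np.pi - 0.1, 20)) * (1.0 + 0.1 * th_state)
    return true_shape + rng.normal(scale=1.0 / np.sqrt(n_histories), size=20)

def thermal_hydraulics(power):
    """Stub for the thermal-hydraulics solver: returns a feedback field."""
    return -0.5 * (power - power.mean())

power, th_state = np.ones(20), np.zeros(20)
n_histories = 1_000
for k in range(1, 11):
    estimate = monte_carlo_power(th_state, n_histories)
    alpha = 1.0 / k                             # decreasing relaxation factor (placeholder rule)
    power = (1.0 - alpha) * power + alpha * estimate
    th_state = thermal_hydraulics(power)
    n_histories = int(n_histories * 1.3)        # growing number of histories per step
```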
Optimisation of assembly scheduling in VCIM systems using genetic algorithm
NASA Astrophysics Data System (ADS)
Dao, Son Duy; Abhary, Kazem; Marian, Romeo
2017-09-01
Assembly plays an important role in any production system as it constitutes a significant portion of the lead time and cost of a product. The virtual computer-integrated manufacturing (VCIM) system is a modern production system being conceptually developed to extend the application of the traditional computer-integrated manufacturing (CIM) system to the global level. Assembly scheduling in VCIM systems is quite different from that in traditional production systems because of the difference in the working principles of the two systems. In this article, the assembly scheduling problem in VCIM systems is modelled and then an integrated approach based on a genetic algorithm (GA) is proposed to search for a globally optimised solution to the problem. Because of the dynamic nature of the scheduling problem, a novel GA with a unique chromosome representation and modified genetic operations is developed herein. The robustness of the proposed approach is verified by a numerical example.
Optimising, generalising and integrating educational practice using neuroscience
NASA Astrophysics Data System (ADS)
Colvin, Robert
2016-07-01
Practical collaboration at the intersection of education and neuroscience research is difficult because the combined discipline encompasses both the activity of microscopic neurons and the complex social interactions of teachers and students in a classroom. Taking a pragmatic view, this paper discusses three education objectives to which neuroscience can be effectively applied: optimising, generalising and integrating instructional techniques. These objectives are characterised by: (1) being of practical importance; (2) building on existing education and cognitive research; and (3) being infeasible to address based on behavioural experiments alone. The focus of the neuroscientific aspect of collaborative research should be on the activity of the brain before, during and after learning a task, as opposed to performance of a task. The objectives are informed by literature that highlights possible pitfalls with educational neuroscience research, and are described with respect to the static and dynamic aspects of brain physiology that can be measured by current technology.
Design and analysis of magneto rheological fluid brake for an all terrain vehicle
NASA Astrophysics Data System (ADS)
George, Luckachan K.; Tamilarasan, N.; Thirumalini, S.
2018-02-01
This work presents an optimised design for a magnetorheological fluid brake for all-terrain vehicles. The actuator consists of a disk immersed in the magnetorheological fluid and surrounded by an electromagnet. The braking torque is controlled by varying the DC current applied to the electromagnet. In the presence of a magnetic field, the magnetorheological fluid particles align in chain-like structures, thus increasing the viscosity. The shear stress generated causes friction on the surfaces of the rotating disk. Electromagnetic analysis of the proposed system is carried out using the finite-element-based COMSOL Multiphysics software, and the generated magnetic field is calculated with COMSOL. The geometry is optimised and the performance of the system in terms of braking torque is evaluated. The proposed design shows better braking-torque performance than designs in the existing literature.
Scale-up and economic analysis of biodiesel production from municipal primary sewage sludge.
Olkiewicz, Magdalena; Torres, Carmen M; Jiménez, Laureano; Font, Josep; Bengoa, Christophe
2016-08-01
Municipal wastewater sludge is a promising lipid feedstock for biodiesel production, but the need to eliminate the high water content before lipid extraction is the main limitation for scaling up. This study evaluates the economic feasibility of biodiesel production directly from liquid primary sludge based on experimental data at laboratory scale. Computational tools were used to model the process scale-up and the different configurations of lipid extraction in order to optimise this step, as it is the most expensive. The operational variables with a major influence on the cost were the extraction time and the amount of solvent. The optimised extraction process had a break-even biodiesel price of 1232 $/t, making it economically competitive with the current cost of fossil diesel. The proposed biodiesel production process from waste sludge eliminates the expensive step of sludge drying, lowering the biodiesel price. Copyright © 2016 Elsevier Ltd. All rights reserved.
Casemix Funding Optimisation: Working Together to Make the Most of Every Episode.
Uzkuraitis, Carly; Hastings, Karen; Torney, Belinda
2010-10-01
Eastern Health, a large public Victorian Healthcare network, conducted a WIES optimisation audit across the casemix-funded sites for separations in the 2009/2010 financial year. The audit was conducted using existing staff resources and resulted in a significant increase in casemix funding at a minimal cost. The audit showcased the skill set of existing staff and resulted in enormous benefits to the coding and casemix team by demonstrating the value of the combination of skills that makes clinical coders unique. The development of an internal web-based application allowed accurate and timely reporting of the audit results, providing the basis for a restructure of the coding and casemix service, along with approval for additional staffing resources and inclusion of a regular auditing program to focus on the creation of high quality data for research, health services management and financial reimbursement.
SASS Applied to Optimum Work Roll Profile Selection in the Hot Rolling of Wide Steel
NASA Astrophysics Data System (ADS)
Nolle, Lars
The quality of steel strip produced in a wide strip rolling mill depends heavily on the careful selection of initial ground work roll profiles for each of the mill stands in the finishing train. In the past, these profiles were determined by human experts, based on their knowledge and experience. In previous work, the profiles were successfully optimised using a self-organising migration algorithm (SOMA). In this research, SASS, a novel heuristic optimisation algorithm that has only one control parameter, has been used to find the optimum profiles for a simulated rolling mill. The resulting strip quality produced using the profiles found by SASS is compared with results from previous work and with the quality produced using the original profile specifications. The best set of profiles found by SASS clearly outperformed the original set and performed as well as SOMA, without the need to find a suitable set of control parameters.
Multi-terminal pipe routing by Steiner minimal tree and particle swarm optimisation
NASA Astrophysics Data System (ADS)
Liu, Qiang; Wang, Chengen
2012-08-01
Computer-aided design of pipe routing is of fundamental importance for the development of complex equipment. In this article, non-rectilinear branch pipe routing with multiple terminals, which can be formulated as a Euclidean Steiner Minimal Tree with Obstacles (ESMTO) problem, is studied in the context of aero-engine integrated design engineering. Unlike traditional methods that connect pipe terminals sequentially, this article presents a new branch pipe routing algorithm based on Steiner tree theory. The article begins with a new algorithm for solving the ESMTO problem using particle swarm optimisation (PSO), and then extends the method to surface cases by using geodesics to meet the requirements of routing non-rectilinear pipes on the surfaces of aero-engines. Subsequently, an adaptive region strategy and the basic visibility graph method are adopted to increase computational efficiency. Numerical computations show that the proposed routing algorithm can find satisfactory routing layouts while running in polynomial time.
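A minimal sketch of the core optimisation step, under the assumption that obstacles and surface (geodesic) constraints are ignored: particle swarm optimisation places candidate Steiner points so that the Euclidean minimum spanning tree over terminals plus Steiner points is as short as possible. The swarm coefficients and the number of Steiner points are illustrative choices, not the authors' settings.

import numpy as np

def mst_length(points):
    """Prim's algorithm on the complete Euclidean graph."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    n = len(points)
    in_tree = np.zeros(n, dtype=bool)
    in_tree[0] = True
    total = 0.0
    for _ in range(n - 1):
        masked = np.where(in_tree[:, None] & ~in_tree[None, :], d, np.inf)
        i, j = np.unravel_index(np.argmin(masked), masked.shape)
        total += d[i, j]
        in_tree[j] = True
    return total

def pso_steiner(terminals, n_steiner=2, n_particles=30, iters=200):
    dim = 2 * n_steiner
    lo, hi = terminals.min(), terminals.max()
    x = np.random.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    cost = lambda s: mst_length(np.vstack([terminals, s.reshape(-1, 2)]))
    pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_f)]
    for _ in range(iters):
        r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)   # inertia + cognitive + social terms
        x = x + v
        f = np.array([cost(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)]
    return g.reshape(-1, 2), pbest_f.min()

terminals = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0], [1.5, 1.2]])
steiner_pts, length = pso_steiner(terminals)
print("tree length ~", round(length, 3))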
Martin, S P; Lynch, J M; Reddy, S M
2002-09-01
The benzidines 3,3'-diaminobenzidine (DAB), 3,3'-dimethoxybenzidine (DMOB) and 3,3',5,5'-tetramethylbenzidine (TMB) were enzymatically oxidised to detect hydrogen peroxide using a quartz crystal sensor. The oxidised product mainly remains in suspension, resulting in a limited quartz sensor signal. We have used two non-ionic surfactants, Tween 80 and Triton X-100, to interact with the oxidised amphiphilic products and increase their solubility, surface activity and ability to adsorb to the crystal surface. Tween 80 gives an optimal response for DAB, DMOB and TMB at 0.012, 0.005 and 0.002% (v/v), respectively, whereas Triton X-100 is optimal at 0.1, 0.2 and 0.006% (v/v), respectively. As a result, we have improved the sensitivity of the quartz crystal sensor to peroxide. The use of Triton X-100 also gave an improved response time.
Battery Cell Balancing Optimisation for Battery Management System
NASA Astrophysics Data System (ADS)
Yusof, M. S.; Toha, S. F.; Kamisan, N. A.; Hashim, N. N. W. N.; Abdullah, M. A.
2017-03-01
Battery cell balancing in electrical systems such as home electronic equipment and electric vehicles is very important for extending battery run time, commonly referred to as battery life. The underlying aim is to equalise the cell voltages and states of charge (SOC) across the cells when they are fully charged. To control and extend battery life, the cell balancing is designed and tuned accordingly, and also serves to shorten the charging process. Active and passive cell balancing strategies enable the battery to be balanced in a high-performance configuration so that the charging process is faster. The experimental and simulation work covers an analysis of how quickly the battery can be balanced within a given time. The simulation-based analysis is conducted to verify the use of optimisation in active or passive cell balancing to extend battery life over long periods of time.
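To make the passive-balancing idea concrete, here is a toy simulation in which the highest cells are bled through a resistor until the pack equalises. The bleed rate, tolerance and initial states of charge are assumed values and the model ignores cell chemistry; it is not the paper's BMS design.

def passive_balance(soc, bleed_per_step=0.002, tol=0.005):
    """soc: list of per-cell states of charge in [0, 1]; returns (steps, balanced soc)."""
    steps = 0
    while max(soc) - min(soc) > tol:
        target = min(soc)
        # bleed only the cells that sit more than `tol` above the lowest cell
        soc = [s - bleed_per_step if s - target > tol else s for s in soc]
        steps += 1
    return steps, soc

steps, balanced = passive_balance([0.92, 0.88, 0.95, 0.90])
print(steps, [round(s, 3) for s in balanced])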
Petri-net-based 2D design of DNA walker circuits.
Gilbert, David; Heiner, Monika; Rohr, Christian
2018-01-01
We consider localised DNA computation, where a DNA strand walks along a binary decision graph to compute a binary function. One of the challenges in designing reliable walker circuits is leakage transitions, which occur when a walker jumps into another branch of the decision graph. We automatically identify leakage transitions, which allows for a detailed qualitative and quantitative assessment of circuit designs, design comparison and design optimisation. The ability to identify leakage transitions is an important step in the process of optimising DNA circuit layouts, where the aim is to minimise the computational error inherent in a circuit while minimising the area of the circuit. Our 2D modelling approach for DNA walker circuits relies on coloured stochastic Petri nets, which enable functionality, topology and dimensionality all to be integrated in one two-dimensional model. Our modelling and analysis approach can be easily extended to 3-dimensional walker systems.
Fluid Mechanics Optimising Organic Synthesis
NASA Astrophysics Data System (ADS)
Leivadarou, Evgenia; Dalziel, Stuart
2015-11-01
The Vortex Fluidic Device (VFD) is a new "green" approach to the synthesis of organic chemicals with many industrial applications in biodiesel generation, cosmetics, protein folding and pharmaceutical production. The VFD is a rapidly rotating tube that can operate with a jet feeding drops of liquid reactants to the base of the tube. The aim of this project is to explain the fluid mechanics of the VFD that influence the rate of reactions. The reaction rate is intimately related to the intense shearing that promotes collisions between reactant molecules. In the VFD, the highest shears are found at the bottom of the tube, in the Rayleigh and Ekman layers, and at the walls, in the Stewartson layers. As a step towards optimising the performance of the VFD, we present experiments conducted to establish the minimum drop volume and maximum rotation rate for maximum axisymmetric spreading without fingering instability.
Shape and energy consistent pseudopotentials for correlated electron systems
Needs, R. J.
2017-01-01
A method is developed for generating pseudopotentials for use in correlated-electron calculations. The paradigms of shape and energy consistency are combined and defined in terms of correlated-electron wave functions. The resulting energy-consistent correlated-electron pseudopotentials (eCEPPs) are constructed for H, Li–F, Sc–Fe, and Cu. Their accuracy is quantified by comparing the relaxed molecular geometries and dissociation energies which they provide with all-electron results, with all quantities evaluated using coupled cluster singles, doubles, and triples calculations. Errors inherent in the pseudopotentials are also compared with those arising from a number of approximations commonly used with pseudopotentials. The eCEPPs provide a significant improvement in optimised geometries and dissociation energies for small molecules, with errors for the latter being an order of magnitude smaller than for Hartree-Fock-based pseudopotentials available in the literature. Gaussian basis sets are optimised for use with these pseudopotentials. PMID:28571391
Hardware Design of the Energy Efficient Fall Detection Device
NASA Astrophysics Data System (ADS)
Skorodumovs, A.; Avots, E.; Hofmanis, J.; Korāts, G.
2016-04-01
Health issues in elderly people may lead to injuries sustained during simple activities of daily living. Potentially the most dangerous are unintentional falls, which may be critical or even lethal to some patients due to the risk of severe injury. In the project "Wireless Sensor Systems in Telecare Application for Elderly People", we have developed a robust fall detection algorithm for a wearable wireless sensor. To optimise the algorithm for hardware performance and test it in the field, we have designed an accelerometer-based wireless fall detector. Our main considerations were: a) functionality, so that the algorithm can run on the chosen hardware, and b) power efficiency, so that it can run for a very long time. We selected and tested the components, built a prototype, optimised the firmware for the lowest power consumption, tested the performance and measured the consumption parameters. In this paper, we discuss our design choices and present the results of our work.
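A common baseline for accelerometer-based fall detection, shown below purely for illustration, flags a fall when a near-free-fall interval is followed shortly by an impact spike in the acceleration magnitude. The thresholds and sampling rate are assumed values; the authors' optimised algorithm is not reproduced here.

import numpy as np

def detect_fall(acc, fs=50, free_fall_g=0.4, impact_g=2.5, window_s=1.0):
    """acc: (N, 3) accelerations in units of g sampled at fs Hz; returns True if a fall pattern is seen."""
    mag = np.linalg.norm(acc, axis=1)
    win = int(window_s * fs)
    free_idx = np.where(mag < free_fall_g)[0]          # candidate free-fall samples
    for i in free_idx:
        if np.any(mag[i:i + win] > impact_g):          # impact spike shortly afterwards
            return True
    return False

t = np.arange(0, 5, 1 / 50)
acc = np.tile([0.0, 0.0, 1.0], (len(t), 1))            # standing still, about 1 g
acc[120:130] = [0.0, 0.0, 0.1]                         # brief free-fall phase
acc[132] = [0.5, 0.5, 3.0]                             # impact
print("fall detected:", detect_fall(acc))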
NASA Astrophysics Data System (ADS)
Ferretti, S.; Amadori, K.; Boccalatte, A.; Alessandrini, M.; Freddi, A.; Persiani, F.; Poli, G.
2002-01-01
The UNIBO team, composed of students and professors of the University of Bologna along with technicians and engineers from Alenia Space Division and Siad Italargon Division, took part in the 3rd Student Parabolic Flight Campaign of the European Space Agency in 2000. It won the student competition and went on to take part in the Professional Parabolic Flight Campaign of May 2001. The experiment focused on "dendritic growth in aluminium alloy weldings" and investigated topics related to the welding process of aluminium in microgravity. The purpose of the research is to optimise the process and to define the areas of interest that could be improved by new conceptual designs. The team performed accurate tests in microgravity to determine which phenomena have the greatest impact on the quality of the weldings with respect to penetration, surface roughness and the microstructures formed during solidification. Various parameters were considered in the economic-technical optimisation, such as the type of electrode and its tip angle. Ground and space tests have determined the optimum chemical composition of the electrodes to offer the longest life while maintaining the shape of the point. Additionally, the power consumption has been optimised; this offers opportunities for promoting the product to the customer as well as being environmentally friendly. Tests performed on the Al-Li alloys showed a significant influence of physical phenomena such as the Marangoni effect and thermal diffusion; predictions have been made on the basis of observations of the thermal flux seen in the stereophotos. Space transportation is today a key element in the construction of space stations and future planetary bases, because the volumes available for launch to space are directly related to the payload capacity of rockets or the Space Shuttle. The research performed gives engineers the opportunity to consider completely new concepts for designing structures for space applications. In fact, once the optimised parameters are defined for welding in space, it could be possible to weld different parts directly in orbit to obtain much larger sizes and volumes, for example for space tourism habitation modules. The second relevant aspect is the technology transfer obtained by the optimisation of the TIG process on aluminium, which is often used in the automotive industry as well as in mass production markets.
Development of a simple algorithm to guide the effective management of traumatic cardiac arrest.
Lockey, David J; Lyon, Richard M; Davies, Gareth E
2013-06-01
Major trauma is the leading worldwide cause of death in young adults. Mortality from traumatic cardiac arrest remains high, but survival with good neurological outcome from cardiopulmonary arrest following major trauma has been regularly reported. Rapid, effective intervention is required to address the potentially reversible causes of traumatic cardiac arrest if the victim is to survive. Current ILCOR guidelines do not contain a standard algorithm for the management of traumatic cardiac arrest. We present a simple algorithm to manage the major trauma patient in actual or imminent cardiac arrest. We reviewed the published English-language literature on traumatic cardiac arrest and major trauma management. A treatment algorithm was developed based on this review and on the experience of treating more than a thousand traumatic cardiac arrests by a physician-paramedic pre-hospital trauma service. The algorithm addresses the need to treat the potentially reversible causes of traumatic cardiac arrest, including immediate resuscitative thoracotomy in cases of penetrating chest trauma, airway management, optimising oxygenation, correction of hypovolaemia and chest decompression to exclude tension pneumothorax. The requirement to rapidly address a number of potentially reversible pathologies in a short time period lends the management of traumatic cardiac arrest to a simple treatment algorithm. A standardised approach may prevent delays in diagnosis and treatment and improve current poor survival rates. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
The PROactive innovative conceptual framework on physical activity
Dobbels, Fabienne; de Jong, Corina; Drost, Ellen; Elberse, Janneke; Feridou, Chryssoula; Jacobs, Laura; Rabinovich, Roberto; Frei, Anja; Puhan, Milo A.; de Boer, Willem I.; van der Molen, Thys; Williams, Kate; Pinnock, Hillary; Troosters, Thierry; Karlsson, Niklas; Kulich, Karoly; Rüdell, Katja; Brindicci, Caterina; Higenbottam, Tim; Troosters, Thierry; Dobbels, Fabienne; Decramer, Marc; Tabberer, Margaret; Rabinovich, Roberto A; MacNee, William; Vogiatzis, Ioannis; Polkey, Michael; Hopkinson, Nick; Garcia-Aymerich, Judith; Puhan, Milo; Frei, Anja; van der Molen, Thys; de Jong, Corina; de Boer, Pim; Jarrod, Ian; McBride, Paul; Kamel, Nadia; Rudell, Katja; Wilson, Frederick J.; Ivanoff, Nathalie; Kulich, Karoly; Glendenning, Alistair; Karlsson, Niklas X.; Corriol-Rohou, Solange; Nikai, Enkeleida; Erzen, Damijan
2014-01-01
Although physical activity is considered an important therapeutic target in chronic obstructive pulmonary disease (COPD), what “physical activity” means to COPD patients and how their perspective is best measured is poorly understood. We designed a conceptual framework, guiding the development and content validation of two patient reported outcome (PRO) instruments on physical activity (PROactive PRO instruments). 116 patients from four European countries with diverse demographics and COPD phenotypes participated in three consecutive qualitative studies (63% male, age mean±sd 66±9 years, 35% Global Initiative for Chronic Obstructive Lung Disease stage III–IV). 23 interviews and eight focus groups (n = 54) identified the main themes and candidate items of the framework. 39 cognitive debriefings allowed the clarity of the items and instructions to be optimised. Three themes emerged, i.e. impact of COPD on amount of physical activity, symptoms experienced during physical activity, and adaptations made to facilitate physical activity. The themes were similar irrespective of country, demographic or disease characteristics. Iterative rounds of appraisal and refinement of candidate items resulted in 30 items with a daily recall period and 34 items with a 7-day recall period. For the first time, our approach provides comprehensive insight on physical activity from the COPD patients’ perspective. The PROactive PRO instruments’ content validity represents the pivotal basis for empirically based item reduction and validation. PMID:25034563
The dual role of fragments in fragment-assembly methods for de novo protein structure prediction
Handl, Julia; Knowles, Joshua; Vernon, Robert; Baker, David; Lovell, Simon C.
2013-01-01
In fragment-assembly techniques for protein structure prediction, models of protein structure are assembled from fragments of known protein structures. This process is typically guided by a knowledge-based energy function and uses a heuristic optimization method. The fragments play two important roles in this process: they define the set of structural parameters available, and they also assume the role of the main variation operators that are used by the optimiser. Previous analysis has typically focused on the first of these roles. In particular, the relationship between local amino acid sequence and local protein structure has been studied by a range of authors. The correlation between the two has been shown to vary with the window length considered, and the results of these analyses have informed directly the choice of fragment length in state-of-the-art prediction techniques. Here, we focus on the second role of fragments and aim to determine the effect of fragment length from an optimization perspective. We use theoretical analyses to reveal how the size and structure of the search space changes as a function of insertion length. Furthermore, empirical analyses are used to explore additional ways in which the size of the fragment insertion influences the search both in a simulation model and for the fragment-assembly technique, Rosetta. PMID:22095594
Principles of precision medicine in stroke.
Hinman, Jason D; Rost, Natalia S; Leung, Thomas W; Montaner, Joan; Muir, Keith W; Brown, Scott; Arenillas, Juan F; Feldmann, Edward; Liebeskind, David S
2017-01-01
The era of precision medicine has arrived and conveys tremendous potential, particularly for stroke neurology. The diagnosis of stroke, its underlying aetiology, theranostic strategies, recurrence risk and path to recovery are populated by a series of highly individualised questions. Moreover, the phenotypic complexity of a clinical diagnosis of stroke makes a simple genetic risk assessment only partially informative on an individual basis. The guiding principles of precision medicine in stroke underscore the need to identify, value, organise and analyse the multitude of variables obtained from each individual to generate a precise approach to optimise cerebrovascular health. Existing data may be leveraged with novel technologies, informatics and practical clinical paradigms to apply these principles in stroke and realise the promise of precision medicine. Importantly, precision medicine in stroke will only be realised once efforts to collect, value and synthesise the wealth of data collected in clinical trials and routine care start. Stroke theranostics, the ultimate vision of synchronising tailored therapeutic strategies based on specific diagnostic data, demands cerebrovascular expertise on big data approaches to clinically relevant paradigms. This review considers such challenges and delineates the principles of a roadmap for the rational application of precision medicine to stroke and cerebrovascular health. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Analysis and optimisation of the convergence behaviour of the single channel digital tanlock loop
NASA Astrophysics Data System (ADS)
Al-Kharji Al-Ali, Omar; Anani, Nader; Al-Araji, Saleh; Al-Qutayri, Mahmoud
2013-09-01
The mathematical analysis of the convergence behaviour of the first-order single channel digital tanlock loop (SC-DTL) is presented. This article also describes a novel technique that allows controlling the convergence speed of the loop, i.e. the time taken by the phase-error to reach its steady-state value, by using a specialised controller unit. The controller is used to adjust the convergence speed so as to selectively optimise a given performance parameter of the loop. For instance, the controller may be used to speed up the convergence in order to increase the lock range and improve the acquisition speed. However, since increasing the lock range can degrade the noise immunity of the system, in a noisy environment the controller can slow down the convergence speed until locking is achieved. Once the system is in lock, the convergence speed can be increased to improve the acquisition speed. The performance of the SC-DTL system was assessed against similar arctan-based loops and the results demonstrate the success of the controller in optimising the performance of the SC-DTL loop. The results of the system testing using MATLAB/Simulink simulation are presented. A prototype of the proposed system was implemented using a field programmable gate array module and the practical results are in good agreement with those obtained by simulation.
Modelling soil water retention using support vector machines with genetic algorithm optimisation.
Lamorski, Krzysztof; Sławiński, Cezary; Moreno, Felix; Barna, Gyöngyi; Skierucha, Wojciech; Arrue, José L
2014-01-01
This work presents point pedotransfer function (PTF) models of the soil water retention curve. The developed models allow estimation of the soil water content for specified soil water potentials (-0.98, -3.10, -9.81, -31.02, -491.66 and -1554.78 kPa) based on the following soil characteristics: soil granulometric composition, total porosity and bulk density. Support Vector Machine (SVM) methodology was used for model development, and a new methodology for elaborating retention function models is proposed. In contrast to previous attempts known from the literature, the ν-SVM method was used for model development and the results were compared with the previously used C-SVM method. Genetic algorithms were used as the optimisation framework for the model parameter search. A new form of the aim function used for the parameter search is proposed, which allowed models with better prediction capabilities to be developed; this new aim function avoids the overestimation of models that is typically encountered when the root mean squared error is used as the aim function. The developed models showed good agreement with measured soil water retention data, with coefficients of determination in the range 0.67-0.92. The studies demonstrated the usability of the ν-SVM methodology together with genetic algorithm optimisation for retention modelling, which gave better-performing models than the other tested approaches.
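A minimal sketch of the general workflow, not the authors' exact pipeline or aim function: a small genetic algorithm searches ν-SVR hyperparameters (nu, C, gamma) against cross-validated mean squared error on synthetic data standing in for the texture, porosity and bulk density inputs. scikit-learn's NuSVR is assumed as the ν-SVM implementation.

import numpy as np
from sklearn.svm import NuSVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(120, 4))                       # stand-in predictors
y = 0.4 * X[:, 0] - 0.2 * X[:, 1] + 0.1 * rng.normal(size=120)

def fitness(genome):
    """Negative cross-validated MSE of a nu-SVR with the encoded hyperparameters."""
    nu, log_c, log_gamma = genome
    model = NuSVR(nu=float(np.clip(nu, 0.05, 0.95)), C=10 ** log_c, gamma=10 ** log_gamma)
    return cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

pop = rng.uniform([0.05, -2, -3], [0.95, 2, 1], size=(20, 3))
for _ in range(30):
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)[-10:]]           # keep the best half
    children = parents[rng.integers(0, 10, 10)] + rng.normal(0, 0.1, (10, 3))  # mutation only
    pop = np.vstack([parents, children])
best = pop[np.argmax([fitness(g) for g in pop])]
print("best genome (nu, log10 C, log10 gamma):", np.round(best, 2))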
AllAboard: Visual Exploration of Cellphone Mobility Data to Optimise Public Transport.
Di Lorenzo, G; Sbodio, M; Calabrese, F; Berlingerio, M; Pinelli, F; Nair, R
2016-02-01
The deep penetration of mobile phones offers cities the ability to opportunistically monitor citizens' mobility and use data-driven insights to better plan and manage services. With large-scale data on mobility patterns, operators can move away from costly, mostly survey-based transportation planning processes to a more data-centric view that places the instrumented user at the centre of development. In this framework, using mobile phone data to perform transit analysis and optimisation represents a new frontier with significant societal impact, especially in developing countries. In this paper we present AllAboard, an intelligent tool that analyses cellphone data to help city authorities visually explore urban mobility and optimise public transport. This is performed within a self-contained tool, as opposed to current solutions which rely on a combination of several distinct tools for analysis, reporting, optimisation and planning. An interactive user interface allows transit operators to visually explore the travel demand in both space and time, correlate it with the transit network, and evaluate the quality of service that the transit network provides to citizens at a very fine grain. Operators can visually test scenarios for transit network improvements and compare the expected impact on the travellers' experience. The system has been tested using real telecommunication data for the city of Abidjan, Ivory Coast, and evaluated from a data mining, optimisation and user perspective.
Ghosh, Ranadhir; Yearwood, John; Ghosh, Moumita; Bagirov, Adil
2006-06-01
In this paper we investigate a hybrid model based on the Discrete Gradient method and an evolutionary strategy for determining the weights in a feed-forward artificial neural network, and we discuss different variants of such hybrid models. The Discrete Gradient method has the advantage of being able to jump over many local minima and find very deep local minima; however, earlier research has shown that a good starting point can improve the quality of the solution point. Evolutionary algorithms are well suited to global optimisation problems, but they suffer from long training times and are often unsuitable for real-world applications. For optimisation problems such as weight optimisation for ANNs in real-world applications, the dimensions are large and time complexity is critical, so a hybrid model is a suitable option. In this paper we propose different fusion strategies for hybrid models combining the evolutionary strategy with the Discrete Gradient method to obtain an optimal solution much more quickly. Three fusion strategies are discussed: a linear hybrid model, an iterative hybrid model and a restricted local search hybrid model. Comparative results on a range of standard datasets are provided for the different fusion hybrid models.
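A schematic sketch of the linear hybrid idea with assumed details: a (mu+lambda) evolution strategy supplies a starting point for the network weights, and a simple coordinate (pattern) search stands in for the Discrete Gradient local refinement, which is not reproduced here. The tiny 2-4-1 network and dataset are illustrative.

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

def mse(w):
    """Tiny 2-4-1 feed-forward net with a tanh hidden layer; w packs all 17 weights."""
    W1, b1 = w[:8].reshape(2, 4), w[8:12]
    W2, b2 = w[12:16], w[16]
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return np.mean((pred - y) ** 2)

def es_stage(dim=17, mu=10, lam=40, gens=60, sigma=0.5):
    """(mu+lambda) evolution strategy used only to find a good starting point."""
    pop = rng.normal(0, 1, (mu, dim))
    for _ in range(gens):
        children = pop[rng.integers(0, mu, lam)] + sigma * rng.normal(size=(lam, dim))
        both = np.vstack([pop, children])
        pop = both[np.argsort([mse(w) for w in both])[:mu]]
    return pop[0]

def coordinate_search(w, step=0.2, shrink=0.5, iters=200):
    """Derivative-free local refinement (stand-in for the Discrete Gradient stage)."""
    best = mse(w)
    for _ in range(iters):
        improved = False
        for i in range(len(w)):
            for d in (+step, -step):
                trial = w.copy(); trial[i] += d
                f = mse(trial)
                if f < best:
                    w, best, improved = trial, f, True
        if not improved:
            step *= shrink
    return w, best

w0 = es_stage()
w_star, err = coordinate_search(w0.copy())
print("ES mse:", round(mse(w0), 4), "-> hybrid mse:", round(err, 4))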
Mohamad, Nurhidayatul Asma; Mustafa, Shuhaimi; El Sheikha, Aly Farag; Khairil Mokhtar, Nur Fadhilah; Ismail, Amin; Ali, Md Eaqub
2016-05-01
Poor quality and quantity of DNA extracted from gelatin and gelatin capsules often causes failure in the determination of animal species using PCR. Gelatin, which is mainly derived from porcine and bovine sources, is a matter of concern among consumers seeking to fulfil religious obligations and as a safety precaution against several transmissible infectious diseases associated with bovine species. Thus, optimised DNA extraction from gelatin is very important for successful real-time PCR detection of gelatin species. In this work, the DNA extraction method was optimised in terms of the lysis incubation period and the inclusion of a pre-treatment pH modification of the samples. The yield of DNA extracted from porcine gelatin was significantly increased when the pH of the samples was adjusted to pH 8.5 prior to DNA precipitation with isopropanol. The optimal pH for DNA precipitation from bovine gelatin solution was the original pH range of the solution, pH 7.6 to 8. A DNA fragment of approximately 300 base pairs was available for PCR amplification. DNA extracted from gelatin and commercially available capsules was successfully used for species detection with a real-time PCR assay. However, significant adulteration with porcine and bovine material was detected in pure gelatin and capsules, which requires further analytical techniques for validation. © 2015 Society of Chemical Industry.
Jakschitz, Thomas A E; Huck, Christian W; Lubbad, Said; Bonn, Günther K
2007-04-13
In this paper the synthesis, optimisation and application of a silane-based monolithic copolymer for the rapid separation of proteins and oligonucleotides is described. The monolith was prepared by thermally initiated in situ copolymerisation of trimethylsilyl-4-methylstyrene (TMSiMS) and bis(4-vinylbenzyl)dimethylsilane (BVBDMSi) in a silanised 200 µm I.D. fused silica column. Different ratios of monomer and crosslinker, as well as different ratios of micro- (toluene) and macro-porogen (2-propanol), were used to optimise the physical properties of the stationary phase with regard to separation efficiency. The prepared monolithic stationary phases were characterised by measurement of permeability with different solvents and by determination of the pore size distribution using mercury intrusion porosimetry (MIP); the morphology was studied by scanning electron microscopy (SEM). Under optimised conditions, a mixture of five standard proteins (ribonuclease A, cytochrome c, α-lactalbumin, myoglobin and ovalbumin) was separated within 1 min by ion-pair reversed-phase liquid chromatography (IP-RPLC), with half-height peak widths between 1.8 and 2.4 s. Baseline separation of the oligonucleotides d(pT)(12-18) was achieved within 1.8 min, with half-height peak widths between 3.6 and 5.4 s. The results demonstrate the high potential of this stationary phase for the fast separation of high-molecular-weight biomolecules such as oligonucleotides and proteins.
O Connell, Malene Barfod; Jensen, Pia Søe; Andersen, Signe Lindgård; Fernbrant, Cecilia; Nørholm, Vibeke; Petersen, Helle Vendel
2018-02-01
To explore the barriers to nutritional care as perceived by nursing staff in an acute orthopaedic ward, with the aim of implementing evidence-based nutritional care. Previous studies indicate that nurses recognise nutritional care as important, but interventions are often lacking, and a range of barriers influence attempts to optimise nutritional care. Before implementing evidence-based nutritional care, we examined barriers to nutritional care among the nursing staff. Qualitative study. Four focus groups with thirteen members of the nursing staff were interviewed between October 2013 and June 2014. The interview guide was designed according to the Theoretical Domains Framework, and the interviews were analysed using qualitative content analysis. Three main categories emerged: lacking common practice, failing to initiate treatment, and struggling with existing resources. The nursing staff lacked both knowledge and a common practice regarding nutritional care. They felt they protected patient autonomy by accepting patients' reluctance to eat or to receive a feeding tube. The lack of nutritional focus from doctors decreased the nursing staff's focus, leading to suboptimal nutritional treatment. Competing priorities, the physical setting and limited nutritional supplements were believed to hinder nutritional care. The results suggest that nutritional care is in a transitional state from experience-based to evidence-based practice. Barriers to nutritional care are grounded in a lack of knowledge among nursing staff and insufficient collaboration between nursing staff and doctors. There is a need for nutritional education for the nursing staff and better support from the organisation to help nursing staff provide evidence-based nutritional care. This study contributes valuable knowledge ahead of the implementation of evidence-based nutritional care, providing an understanding of the barriers to nutritional care and explanations of why nutritional care has failed to become an integrated part of daily treatment and care. © 2017 John Wiley & Sons Ltd.
Keating, Dolores; McWilliams, Stephen; Schneider, Ian; Hynes, Caroline; Cousins, Gráinne; Strawbridge, Judith; Clarke, Mary
2017-01-01
Objectives Clinical practice guidelines (CPGs) support the translation of research evidence into clinical practice. Key health questions in CPGs ensure that recommendations will be applicable to the clinical context in which the guideline is used. The objectives of this study were to identify CPGs for the pharmacological treatment of first-episode schizophrenia; assess the quality of these guidelines using the Appraisal of Guidelines for Research and Evaluation II (AGREE II) instrument; and compare recommendations in relation to the key health questions that are relevant to the pharmacological treatment of first-episode schizophrenia. Methods A multidisciplinary group identified key health questions that are relevant to the pharmacological treatment of first-episode schizophrenia. The MEDLINE and EMBASE databases, websites of professional organisations and international guideline repositories, were searched for CPGs that met the inclusion criteria. The AGREE II instrument was applied by three raters and data were extracted from the guidelines in relation to the key health questions. Results In total, 3299 records were screened. 10 guidelines met the inclusion criteria. 3 guidelines scored well across all domains. Recommendations varied in specificity. Side effect concerns, rather than comparative efficacy benefits, were a key consideration in antipsychotic choice. Antipsychotic medication is recommended for maintenance of remission following a first episode of schizophrenia but there is a paucity of evidence to guide duration of treatment. Clozapine is universally regarded as the medication of choice for treatment resistance. There is less evidence to guide care for those who do not respond to clozapine. Conclusions An individual's experience of using antipsychotic medication for the initial treatment of first-episode schizophrenia may have implications for future engagement, adherence and outcome. While guidelines of good quality exist to assist in medicines optimisation, the evidence base required to answer key health questions relevant to the pharmacological treatment of first-episode schizophrenia is limited. PMID:28062471
Ahmed, Shaimaa; Vepuri, Suresh B; Kalhapure, Rahul S; Govender, Thirumala
2016-07-21
Dendrimers have emerged as novel and efficient materials that can be used as therapeutic agents/drugs or as drug delivery carriers to enhance therapeutic outcomes. Molecular dendrimer interactions are central to their applications and realising their potential. The molecular interactions of dendrimers with drugs or other materials in drug delivery systems or drug conjugates have been extensively reported in the literature. However, despite the growing application of dendrimers as biologically active materials, research focusing on the mechanistic analysis of dendrimer interactions with therapeutic biological targets is currently lacking in the literature. This comprehensive review on dendrimers over the last 15 years therefore attempts to identify the reasons behind the apparent lack of dendrimer-receptor research and proposes approaches to address this issue. The structure, hierarchy and applications of dendrimers are briefly highlighted, followed by a review of their various applications, specifically as biologically active materials, with a focus on their interactions at the target site. It concludes with a technical guide to assist researchers on how to employ various molecular modelling and computational approaches for research on dendrimer interactions with biological targets at a molecular level. This review highlights the impact of a mechanistic analysis of dendrimer interactions on a molecular level, serves to guide and optimise their discovery as medicinal agents, and hopes to stimulate multidisciplinary research between scientific, experimental and molecular modelling research teams.
Signal Separation of Helicopter Radar Returns Using Wavelet-Based Sparse Signal Optimisation
2016-10-01
RR-0436. A novel wavelet-based sparse signal representation technique is used to separate the main and tail rotor blade components of a … helicopter from the composite radar returns. The received signal consists of returns from the rotating main and tail rotor blades, the helicopter body … component signal comprising returns from the main body, the main and tail rotor hubs and blades. Temporal and Doppler characteristics of these …
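As a toy stand-in for the wavelet-based sparse separation described above (PyWavelets assumed as a dependency; the signal model and parameters are illustrative, not from the report): the coarse approximation reconstructs a smooth body-like return, while soft-thresholded detail coefficients retain the sparse blade-flash component.

import numpy as np
import pywt

def separate(signal, wavelet="db4", level=5, thresh=0.8):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # body: keep only the coarse approximation coefficients
    body_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    body = pywt.waverec(body_coeffs, wavelet)[: len(signal)]
    # flashes: soft-threshold the detail coefficients to keep the sparse spikes
    flash_coeffs = [np.zeros_like(coeffs[0])] + [
        pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]
    ]
    flashes = pywt.waverec(flash_coeffs, wavelet)[: len(signal)]
    return body, flashes

t = np.linspace(0.0, 1.0, 4096)
body = np.cos(2 * np.pi * 4 * t)                      # slowly varying body-like return
flashes = np.zeros_like(t); flashes[::512] = 3.0      # sparse periodic blade flashes
received = body + flashes + 0.05 * np.random.randn(len(t))
est_body, est_flashes = separate(received)
print("recovered flash samples above 1.0:", int((est_flashes > 1.0).sum()))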
Integrated optics ring-resonator chemical sensor with polymer transduction layer
NASA Technical Reports Server (NTRS)
Ksendzov, A.; Homer, M. L.; Manfreda, A. M.
2004-01-01
An integrated optics chemical sensor based on a ring resonator with an ethyl cellulose polymer coating has been demonstrated. The measured sensitivity to isopropanol in air is 50 ppm, a level immediately useful for health-related air quality monitoring. The resonator was fabricated using SiO2 and SixNy materials. The signal readout is based on tracking the wavelength of a resonance peak. The optimisation of the resonator layout for sensing applications is discussed.
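A small illustrative sketch of the readout principle, not the instrument's actual processing: the sensor output is the shift of a resonance dip, estimated here by a parabolic fit around the minimum of a simulated Lorentzian transmission spectrum. All spectral parameters are assumed values.

import numpy as np

def resonance_wavelength(wavelengths, transmission):
    """Estimate the dip position with parabolic interpolation around the minimum sample."""
    i = int(np.argmin(transmission))
    x, y = wavelengths[i - 1:i + 2], transmission[i - 1:i + 2]
    a, b, _ = np.polyfit(x, y, 2)
    return -b / (2 * a)

wl = np.linspace(1549.0, 1551.0, 2001)                           # wavelength grid [nm]
def spectrum(center):
    return 1.0 - 0.9 / (1.0 + ((wl - center) / 0.02) ** 2)       # Lorentzian dip

shift = resonance_wavelength(wl, spectrum(1550.13)) - resonance_wavelength(wl, spectrum(1550.00))
print(f"measured peak shift ~ {shift * 1000:.1f} pm")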
Higton, D M
2001-01-01
An improvement to the procedure for the rapid optimisation of mass spectrometry (PROMS), for the development of multiple reaction monitoring (MRM) methods for quantitative bioanalytical liquid chromatography/tandem mass spectrometry (LC/MS/MS), is presented. PROMS is an automated protocol that uses flow-injection analysis (FIA) and AppleScripts to create methods and acquire the data for optimisation. The protocol determines the optimum orifice potential and the MRM conditions for each compound, and finally creates the MRM methods needed for sample analysis. The sensitivities of the MRM methods created by PROMS approach those created manually. MRM method development using PROMS currently takes less than three minutes per compound, compared with at least fifteen minutes manually. To further enhance throughput, approaches to MRM optimisation using one injection per compound, two injections per pool of five compounds, and one injection per pool of five compounds have been investigated. No significant difference in the optimised instrumental parameters for the MRM methods was found between the original PROMS approach and these new methods, which are up to ten times faster. The time taken for an AppleScript to determine the optimum conditions and build the MRM methods is the same with all approaches. Copyright 2001 John Wiley & Sons, Ltd.
Optimised analytical models of the dielectric properties of biological tissue.
Salahuddin, Saqib; Porter, Emily; Krewer, Finn; O' Halloran, Martin
2017-05-01
The interaction of electromagnetic fields with the human body is quantified by the dielectric properties of biological tissues. These properties are incorporated into complex numerical simulations, using parametric models such as Debye and Cole-Cole, for the computational investigation of electromagnetic wave propagation within the body. The model parameters can be obtained with a variety of optimisation algorithms to achieve an accurate fit to measured data sets. A number of different optimisation techniques have been proposed, but these are often limited by the requirement for initial value estimates or by a large overall error (often up to several percentage points). In this work, a novel two-stage genetic algorithm proposed by the authors is applied to optimise the multi-pole Debye parameters for 54 types of human tissue. The performance of the two-stage genetic algorithm has been examined through a comparison with five other existing algorithms. The experimental results demonstrate that the two-stage genetic algorithm produces an accurate fit to a range of experimental data and efficiently outperforms all other optimisation algorithms under consideration. Accurate values of the three-pole Debye models for 54 types of human tissue, over 500 MHz to 20 GHz, are also presented for reference. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
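To illustrate the parametric form being optimised, the sketch below fits a two-pole Debye model (with a static-conductivity term) to synthetic complex permittivity data. SciPy's differential evolution is used as a stand-in for the authors' two-stage genetic algorithm, and all parameter values and bounds are assumed.

import numpy as np
from scipy.optimize import differential_evolution

EPS0 = 8.854e-12
f = np.logspace(8.7, 10.3, 80)                # roughly 0.5 to 20 GHz
w = 2 * np.pi * f

def debye(params, w):
    """Two-pole Debye model: eps_inf + sum_k d_k/(1 + j w tau_k) + sigma/(j w eps0)."""
    eps_inf, d1, t1, d2, t2, sigma = params
    return (eps_inf + d1 / (1 + 1j * w * t1) + d2 / (1 + 1j * w * t2)
            + sigma / (1j * w * EPS0))

true = [4.0, 35.0, 8e-12, 10.0, 120e-12, 0.7]          # assumed "measured" tissue parameters
data = debye(true, w) + 0.2 * (np.random.randn(len(w)) + 1j * np.random.randn(len(w)))

def cost(params):
    model = debye(params, w)
    return np.mean(np.abs(model - data) ** 2 / np.abs(data) ** 2)   # relative squared error

bounds = [(1, 10), (1, 60), (1e-12, 50e-12), (1, 60), (50e-12, 500e-12), (0, 2)]
fit = differential_evolution(cost, bounds, seed=0, tol=1e-8)
print("fitted parameters:", fit.x)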
Saikia, Sangeeta; Mahnot, Nikhil Kumar; Mahanta, Charu Lata
2015-03-15
Optimisation of the extraction of polyphenols from star fruit (Averrhoa carambola) pomace using response surface methodology was carried out. Two variables, temperature (°C) and ethanol concentration (%), each at five levels (-1.414, -1, 0, +1 and +1.414), were used to build the optimisation model with a central composite rotatable design, where -1.414 and +1.414 are the axial points, -1 and +1 the factorial points, and 0 the centre point of the design. A temperature of 40 °C and an ethanol concentration of 65% were the optimised conditions for the response variables of total phenolic content, ferric reducing antioxidant capacity and 2,2-diphenyl-1-picrylhydrazyl scavenging activity. The reverse-phase high-pressure liquid chromatography chromatogram of the polyphenol extract showed eight phenolic acids and ascorbic acid. The extract was then encapsulated with maltodextrin (⩽ DE 20) by spray and freeze drying methods at three different concentrations. The highest encapsulation efficiency was obtained with the freeze-dried encapsulates (78-97%). The optimised model can be used for polyphenol extraction from star fruit pomace, and the microencapsulates can be incorporated into different food systems to enhance their antioxidant properties. Copyright © 2014 Elsevier Ltd. All rights reserved.
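A minimal sketch of the response-surface step on synthetic data (the coefficients and noise are assumed, not the authors' measurements): a second-order model in coded temperature (x1) and ethanol concentration (x2) over a central composite rotatable design is fitted by least squares and its optimum located on a grid.

import numpy as np

# central composite rotatable design in coded units: factorial, axial and centre points
a = 1.414
design = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                   [-a, 0], [a, 0], [0, -a], [0, a],
                   [0, 0], [0, 0], [0, 0], [0, 0], [0, 0]])

def features(x):
    """Second-order model terms: intercept, linear, interaction and quadratic."""
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

rng = np.random.default_rng(3)
true_beta = np.array([50.0, 2.0, 3.0, -1.0, -4.0, -5.0])     # assumed response surface
y = features(design) @ true_beta + rng.normal(0, 0.5, len(design))

beta, *_ = np.linalg.lstsq(features(design), y, rcond=None)

grid = np.array([[u, v] for u in np.linspace(-a, a, 200) for v in np.linspace(-a, a, 200)])
best = grid[np.argmax(features(grid) @ beta)]
pred = float((features(best[None, :]) @ beta)[0])
print("optimum (coded units):", np.round(best, 2), "predicted response:", round(pred, 1))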
Analysis of power gating in different hierarchical levels of 2MB cache, considering variation
NASA Astrophysics Data System (ADS)
Jafari, Mohsen; Imani, Mohsen; Fathipour, Morteza
2015-09-01
This article reintroduces the power gating technique at different hierarchical levels of static random-access memory (SRAM) design, including the cell, row, bank and entire cache memory, in a 16 nm fin field-effect transistor (FinFET) technology. Different SRAM cell structures (6T, 8T, 9T and 10T) are used in the design of a 2MB cache memory. The power reduction of the entire cache memory employing cell-level optimisation is 99.7%, at the expense of area and stability overheads. The power saving of the cell-level optimisation is 3× (1.2×) higher than power gating at the cache (bank) level because of its superior selectivity. The access delay times are allowed to increase by 4% at the same energy-delay product to achieve the best power reduction for each supply voltage and optimisation level. The results show that row-level power gating is the best choice for optimising the power of the entire cache with the lowest drawbacks. Comparisons of the cells show that cells whose bodies have higher power consumption are the best candidates for the power gating technique in row-level optimisation. The technique has the lowest percentage of saving at the minimum energy point (MEP) of the design. Power gating also improves the power variation in all structures by at least 70%.
Badri, Alia; Crutzen, Rik; Eltayeb, Shahla; Van den Borne, H W
2013-03-26
Women are considered a group that is uniquely vulnerable in the context of war exposure. To effectively target the resources aimed at mitigating mental health consequences and to optimise and maximise the use of mental health provisions, culturally relevant war trauma counsellor training is required. The objectives of this study are to promote a new philosophy in Sudanese mental health care by introducing an integrative approach for targeted prevention and tailored treatment for the Darfuri person in a cost-effective way. Furthermore, the study provides evidence- and theory-based guidelines for developing a war trauma counsellor training programme in Sudan, based mainly on qualitative and quantitative studies among war-affected Darfuri female students. Cultural conceptualisations such as gender roles and religious expectations, as well as theories that emphasise resilience and other psychosocial adaptation skills, have been operationalised to reflect the totality of the Darfuri women's experiences. Furthermore, the results of four interrelated studies among war-traumatised, internally displaced undergraduate Darfuri women provide the basis for an outline for qualification development, capacity building and skills consolidation among Sudanese mental health care providers. Explicit war-related psychosocial needs assessment tools, specific war-related trauma counsellor training, and particular counsellor characteristics, qualities and awareness that strengthen the efficacy of Sudanese war trauma counsellors are recommended. The aim is to produce expertly trained war trauma counsellors working with war-affected Darfuri women in particular, and able to respond to the psychosocial needs of war-exposed Sudanese in general.
Thakkar, Jay; Karthikeyan, Ganesan; Purohit, Gaurav; Thakkar, Swetha; Sharma, Jitender; Verma, Sunilkumar; Parakh, Neeraj; Seth, Sandeep; Mishra, Sundeep; Yadav, Rakesh; Singh, Sandeep; Joshi, Rohina; Thiagalingam, Aravinda; Chow, Clara K; Redfern, Julie
2016-01-01
Background Coronary heart disease (CHD) is a leading cause of morbidity and mortality in India. Text message based prevention programs have demonstrated reduction in cardiovascular risk factors among patients with CHD in selected populations. Customisation is important, as behaviour change is influenced by culture and linguistic context. Objectives To customise a mobile phone text message program supporting behaviour and treatment adherence in CHD for delivery in North India. Methods We used an iterative process with mixed methods involving three phases: (1) initial translation; (2) review and incorporation of feedback, including review by cardiologists in India to assess alignment with local guidelines and by consumers on perceived utility and clarity; and (3) pilot testing of the message management software. Results Messages were translated in three ways: symmetrical translation, asymmetrical translation and substitution. Feedback from cardiologists and 25 patients was incorporated to develop the final bank. Patients reported that the Hinglish messages were easy to understand (93%) and useful (78%). The software, located in Australia, successfully delivered messages to participants based in the Delhi surrounds (India). Conclusions Our process for customisation of a text message program considered the cultural, linguistic and medical context of potential participants. This is important for optimising intervention fidelity and enables examination of the generalisability of text message programs across populations. We also demonstrated that the customised program was acceptable to patients in India and that a centralised cross-country delivery model was feasible. This process could be used as a guide for other groups seeking to customise their programs. Trial registration number TEXTMEDS Australia (Parent study)—ACTRN 12613000793718. PMID:27752288
Kumar, Arunaz; Nestel, Debra; Stoyles, Sally; East, Christine; Wallace, Euan M; White, Colleen
2016-02-01
Birth at home is a safe and appropriate choice for healthy women with a low-risk pregnancy. However, there is a small risk of emergencies requiring immediate, skilled management to optimise maternal and neonatal outcomes. We developed and implemented a simulation workshop designed to run in a home-based setting to assist with emergency training for midwives and paramedical staff. The workshop was evaluated by assessing participants' satisfaction and responses to key learning issues. Midwifery and emergency paramedical staff attending home births participated in a simulation workshop in which they were required to manage birth emergencies in real time with the limited resources available in that setting. They completed pre-test and post-test evaluation forms exploring the content and utility of the workshops, and content analysis was performed on the qualitative data regarding the most important learning from the simulation activity. A total of 73 participants attended the workshop (midwifery = 46, paramedical = 27), and 110 comments were made by 49 participants. The most frequently identified key learning elements related to communication (among midwives, paramedical and hospital staff, and with the woman's partner), followed by recognising the role of other health care professionals, developing an understanding of the process, and the importance of planning ahead. The home birth simulation workshop was found to be a useful tool by staff who provide care to women having a planned home birth. Developing clear communication and teamwork were found to be the key learning principles guiding their practice. Copyright © 2015 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.
New Trends in Forging Technologies
NASA Astrophysics Data System (ADS)
Behrens, B.-A.; Hagen, T.; Knigge, J.; Elgaly, I.; Hadifi, T.; Bouguecha, A.
2011-05-01
Limited natural resources increase the demand for highly efficient machinery and transportation means. New energy-saving mobility concepts call for design optimisation through downsizing of components and the choice of corrosion-resistant materials possessing high strength-to-density ratios. Component downsizing can be performed either by constructive structural optimisation or by substituting heavy materials with lighter high-strength ones. In this context, forging plays an important role in manufacturing load-optimised structural components. At the Institute of Metal Forming and Metal-Forming Machines (IFUM), various innovative forging technologies have been developed. With regard to structural optimisation, different strategies for the localised reinforcement of components were investigated. Locally induced strain hardening by means of cold forging under a superimposed hydrostatic pressure could be realised. In addition, controlled martensitic zones could be created through forming-induced phase conversion in metastable austenitic steels. Other research focused on the replacement of heavy steel parts with high-strength non-ferrous alloys or hybrid material compounds. Several forging processes for magnesium, aluminium and titanium alloys for different aeronautical and automotive applications were developed, considering the whole process chain from material characterisation via simulation-based process design to the production of the parts. The feasibility of forging complex-shaped geometries using these alloys was confirmed. In spite of the difficulties encountered due to machine noise and high temperature, the acoustic emission (AE) technique has been successfully applied for online monitoring of forging defects. A new AE analysis algorithm has been developed, so that different signal patterns due to various events such as product/die cracking or die wear can be detected and classified. Further, the feasibility of the mentioned forging technologies was proven by means of finite element analysis (FEA). For example, the integrity of forging dies with respect to crack initiation due to thermo-mechanical fatigue, as well as the ductile damage of forgings, was investigated with the help of cumulative damage models. In this paper some of these approaches are described.
Gladman, John; Buckell, John; Young, John; Smith, Andrew; Hulme, Clare; Saggu, Satti; Godfrey, Mary; Enderby, Pam; Teale, Elizabeth; Longo, Roberto; Gannon, Brenda; Holditch, Claire; Eardley, Heather; Tucker, Helen
2017-01-01
Introduction To understand the variation in performance between community hospitals, our objectives are: to measure the relative performance (cost efficiency) of rehabilitation services in community hospitals; to identify the characteristics of community hospital rehabilitation that optimise performance; to investigate the current impact of community hospital inpatient rehabilitation for older people on secondary care and the potential impact if community hospital rehabilitation was optimised to best practice nationally; to examine the relationship between the configuration of intermediate care and secondary care bed use; and to develop toolkits for commissioners and community hospital providers to optimise performance. Methods and analysis 4 linked studies will be performed. Study 1: cost efficiency modelling will apply econometric techniques to data sets from the National Health Service (NHS) Benchmarking Network surveys of community hospital and intermediate care. This will identify community hospitals' performance and estimate the gap between high and low performers. Analyses will determine the potential impact if the performance of all community hospitals nationally was optimised to best performance, and examine the association between community hospital configuration and secondary care bed use. Study 2: a national community hospital survey gathering detailed cost data and efficiency variables will be performed. Study 3: in-depth case studies of 3 community hospitals, 2 high and 1 low performing, will be undertaken. Case studies will gather routine hospital and local health economy data. Ward culture will be surveyed. Content and delivery of treatment will be observed. Patients and staff will be interviewed. Study 4: co-designed web-based quality improvement toolkits for commissioners and providers will be developed, including indicators of performance and the gap between local and best community hospitals performance. Ethics and dissemination Publications will be in peer-reviewed journals, reports will be distributed through stakeholder organisations. Ethical approval was obtained from the Bradford Research Ethics Committee (reference: 15/YH/0062). PMID:28242766
Temporal optimisation of fuel treatment design in blue gum (Eucalyptus globulus) plantations
Ana Martin; Brigite Botequim; Tiago M. Oliveira; Alan Ager; Francesco Pirotti
2016-01-01
This study was conducted to support fire and forest management planning in eucalypt plantations based on economic, ecological and fire prevention criteria, with a focus on strategic prioritisation of fuel treatments over time. The central objective was to strategically locate fuel treatments to minimise losses from wildfire while meeting budget constraints and demands...
ERIC Educational Resources Information Center
Nordrum, Lene; Evans, Katherine; Gustafsson, Magnus
2013-01-01
This study compares students' experiences of two types of criteria-based assessment: in-text commentary and rubric-articulated feedback, in an assessment design combining the two feedback channels. The main aim is to use students' responses to shed light on how feedback strategies for formative assessment can be optimised. Following action…
Moving towards Optimising Demand-Led Learning: The 2005-2007 ECUANET Leonardo Da Vinci Project
ERIC Educational Resources Information Center
Dealtry, Richard; Howard, Keith
2008-01-01
Purpose: The purpose of this paper is to present the key project learning points and outcomes as a guideline for the future quality management of demand-led learning and development. Design/methodology/approach: The research methodology was based upon a corporate university blueprint architecture and browser toolkit developed by a member of the…
Using Machine-Learning and Visualisation to Facilitate Learner Interpretation of Source Material
ERIC Educational Resources Information Center
Wolff, Annika; Mulholland, Paul; Zdrahal, Zdenek
2014-01-01
This paper describes an approach for supporting inquiry learning from source materials, realised and tested through a tool-kit. The approach is optimised for tasks that require a student to make interpretations across sets of resources, where opinions and justifications may be hard to articulate. We adopt a dialogue-based approach to learning…
Efficient methods for enol phosphate synthesis using carbon-centred magnesium bases.
Kerr, William J; Lindsay, David M; Patel, Vipulkumar K; Rajamanickam, Muralikrishnan
2015-10-28
Efficient conversion of ketones into kinetic enol phosphates under mild and accessible conditions has been realised using the developed methods with di-tert-butylmagnesium and bismesitylmagnesium. Optimisation of the quench protocol resulted in high yields of enol phosphates from a range of cyclohexanones and aryl methyl ketones, with tolerance of a range of additional functional units.
Richert, Laura; Doussau, Adélaïde; Lelièvre, Jean-Daniel; Arnold, Vincent; Rieux, Véronique; Bouakane, Amel; Lévy, Yves; Chêne, Geneviève; Thiébaut, Rodolphe
2014-02-26
Many candidate vaccine strategies against human immunodeficiency virus (HIV) infection are under study, but their clinical development is lengthy and iterative. To accelerate HIV vaccine development, optimised trial designs are needed. We propose a randomised multi-arm phase I/II design for early stage development of several vaccine strategies, aiming at rapidly discarding those that are unsafe or non-immunogenic. We explored early stage designs to evaluate both the safety and the immunogenicity of four heterologous prime-boost HIV vaccine strategies in parallel. One of the vaccines used as a prime and boost in the different strategies (vaccine 1) has yet to be tested in humans, thus requiring a phase I safety evaluation. However, its toxicity risk is considered minimal based on data from similar vaccines. We adapted a randomised phase II trial by integrating an early safety decision rule, emulating that of a phase I study. We evaluated the operating characteristics of the proposed design in simulation studies with either a fixed-sample frequentist or a continuous Bayesian safety decision rule, and projected timelines for the trial. We propose a randomised four-arm phase I/II design with two independent binary endpoints for safety and immunogenicity. Immunogenicity evaluation at trial end is based on a single-stage Fleming design per arm, comparing the observed proportion of responders in an immunogenicity screening assay to an unacceptably low proportion, without direct comparisons between arms. Randomisation limits heterogeneity in volunteer characteristics between arms. To avoid exposure of additional participants to an unsafe vaccine during the vaccine boost phase, an early safety decision rule is imposed on the arm starting with vaccine 1 injections. In simulations of the design with either decision rule, the risks of erroneous conclusions were controlled at below 15%. Flexibility in trial conduct is greater with the continuous Bayesian rule. A 12-month gain in timelines is expected with this optimised design. Other existing designs such as bivariate or seamless phase I/II designs did not offer a clear-cut alternative. By combining phase I and phase II evaluations in a multi-arm trial, the proposed optimised design allows for accelerating early stage clinical development of HIV vaccine strategies.
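A minimal simulation sketch of the kind of operating-characteristics check described here, assuming hypothetical sample sizes, a fixed-sample safety stopping rule and a single-stage responder cut-off; none of the numbers are those of the proposed trial.

```python
# Minimal simulation sketch of a per-arm single-stage (Fleming-type) decision on
# a binary immunogenicity endpoint plus a fixed-sample early safety rule.
# Sample sizes, thresholds and true rates are hypothetical, not the trial's.
import numpy as np

rng = np.random.default_rng(42)

n_per_arm = 30          # volunteers per arm (hypothetical)
n_safety = 10           # volunteers assessed before the boost phase
max_events = 2          # stop the arm early if > max_events serious reactions
p_unacceptable = 0.30   # unacceptably low responder proportion
min_responders = 14     # declare immunogenic if responders >= this cut-off

def simulate_arm(p_toxicity, p_response, n_sim=10_000):
    """Return (probability arm stops early for safety,
               probability arm is declared immunogenic)."""
    early_stop = declared = 0
    for _ in range(n_sim):
        if rng.binomial(n_safety, p_toxicity) > max_events:
            early_stop += 1
            continue
        if rng.binomial(n_per_arm, p_response) >= min_responders:
            declared += 1
    return early_stop / n_sim, declared / n_sim

# Operating characteristics under a "safe but non-immunogenic" scenario
# and under a "safe and immunogenic" scenario.
print(simulate_arm(p_toxicity=0.05, p_response=p_unacceptable))
print(simulate_arm(p_toxicity=0.05, p_response=0.60))
```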
Targeted flock/herd and individual ruminant treatment approaches.
Kenyon, F; Jackson, F
2012-05-04
In Europe, most nematodoses are subclinical, involving morbid rather than mortal effects, and control is largely achieved using anthelmintics. In cattle, the genera most associated with sub-optimal performance are Ostertagia and Cooperia, whereas in sheep and goats, subclinical losses are most often caused by Teladorsagia and Trichostrongylus. In some regions, at certain times, other species such as Nematodirus and Haemonchus also cause disease in sheep and goats. Unfortunately, anthelmintic resistance has now become an issue for European small ruminant producers. One of the key aims of the EU-funded PARASOL project was to identify low input and sustainable approaches to control nematode parasites in ruminants using refugia-based strategies. Two approaches to optimise anthelmintic treatments in sheep and cattle were studied: targeted treatments (TT) - whole-group treatments optimised on the basis of a marker of infection, e.g. faecal egg count (FEC), and targeted selective treatment (TST) - treatments given to identified individuals to provide epidemiological and/or production benefits. A number of indicators for TT and TST were assessed to define parasitological and production-system specific indicators for treatment that best suited the regions where the PARASOL studies were conducted. These included liveweight gain, production efficiency, FEC, body condition score and diarrhoea score in small ruminants, and pepsinogen levels and Ostertagia bulk milk tank ELISA in cattle. The PARASOL studies confirmed the value of monitoring FEC as a means of targeting whole-flock treatments in small ruminants. In cattle, bulk milk tank ELISA and serum pepsinogen assays could be used retrospectively to determine the levels of exposure and hence, in the next season, to optimise anthelmintic usage. TST approaches in sheep and goats examined production efficiency and liveweight gain as indicators for treatment and confirmed the value of this approach in maintaining performance and anthelmintic susceptibility in the predominant gastrointestinal nematodes. There is good evidence that the TST approach selected less heavily for the development of resistance in comparison to routine monthly treatments. Further research is required to optimise markers for TT and TST, but it is also crucial to encourage producers/advisors to adapt these refugia-based strategies to maintain drug-susceptible parasites in order to provide sustainable control. Copyright © 2011 Elsevier B.V. All rights reserved.
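A small illustrative sketch of how TT and TST triggers of the kind discussed above might be applied to flock data; the FEC and liveweight-gain thresholds are assumptions chosen for illustration, not PARASOL recommendations.

```python
# Illustrative decision sketch for targeted treatment (TT) of a whole flock on
# a mean faecal egg count (FEC) trigger, and targeted selective treatment (TST)
# of individuals on liveweight gain. Thresholds are hypothetical, not PARASOL's.
flock = [  # (animal id, FEC in eggs per gram, liveweight gain in g/day)
    ("ewe01", 150, 210),
    ("ewe02", 900, 120),
    ("ewe03", 450, 180),
    ("ewe04", 1200, 90),
    ("ewe05", 300, 200),
]

TT_MEAN_FEC_TRIGGER = 500    # epg; treat the whole flock above this mean
TST_GAIN_TRIGGER = 150       # g/day; treat individuals growing slower than this

mean_fec = sum(fec for _, fec, _ in flock) / len(flock)
treat_whole_flock = mean_fec > TT_MEAN_FEC_TRIGGER

tst_candidates = [aid for aid, _, gain in flock if gain < TST_GAIN_TRIGGER]

print(f"mean FEC = {mean_fec:.0f} epg; whole-flock TT: {treat_whole_flock}")
print("TST individuals:", tst_candidates)
```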
Optimisation of environmental remediation: how to select and use the reference levels.
Balonov, M; Chipiga, L; Kiselev, S; Sneve, M; Yankovich, T; Proehl, G
2018-06-01
A number of past industrial activities and accidents have resulted in the radioactive contamination of large areas at many sites around the world, giving rise to a need for remediation. According to the International Commission on Radiological Protection (ICRP) and International Atomic Energy Agency (IAEA), such situations should be managed as existing exposure situations (ExESs). Control of exposure to the public in ExESs is based on the application of appropriate reference levels (RLs) for residual doses. The implementation of this potentially fruitful concept for the optimisation of remediation in various regions is hampered by a lack of practical experience and relevant guidance. This paper suggests a generic methodology for the selection of numeric values of relevant RLs both in terms of residual annual effective dose and derived RLs (DRLs) based on an appropriate dose assessment. The value for an RL should be selected in the range of the annual residual effective dose of 1-20 mSv, depending on the prevailing circumstances for the exposure under consideration. Within this range, RL values should be chosen by the following assessment steps: (a) assessment of the projected dose, i.e. the dose to a representative person without remedial actions by means of a realistic model as opposed to a conservative model; (b) modelling of the residual dose to a representative person following application of feasible remedial actions; and (c) selection of an RL value between the projected and residual doses, taking account of the prevailing social and economic conditions. This paper also contains some recommendations for practical implementation of the selected RLs for the optimisation of public protection. The suggested methodology used for the selection of RLs (in terms of dose) and the calculation of DRLs (in terms of activity concentration in food, ambient dose rate, etc) has been illustrated by a retrospective analysis of post-Chernobyl monitoring and modelling data from the Bryansk region, Russia, 2001. From this example, it follows that analysis of real data leads to the selection of an RL from a relatively narrow annual dose range (in this case, about 2-3 mSv), from which relevant DRLs can be calculated and directly used for optimisation of the remediation programme.
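As a hedged illustration of step (c) and the subsequent DRL calculation, the sketch below converts an assumed RL into a derived reference level for a single foodstuff; the dose coefficient, consumption rate and pathway allocation are assumptions for the sketch, not values recommended in the paper.

```python
# Illustrative conversion of a selected reference level (RL, residual annual
# effective dose) into a derived reference level (DRL) for a foodstuff.
# The dose coefficient and consumption rate below are assumptions for the
# sketch, not values from the paper.
RL_mSv_per_year = 3.0                 # selected between projected and residual dose
dose_coeff_mSv_per_Bq = 1.3e-5        # assumed ingestion dose coefficient (Cs-137, adult)
consumption_kg_per_year = 100.0       # assumed annual intake of the foodstuff
fraction_of_RL_to_food = 0.5          # assumed share of the RL allocated to this pathway

drl_Bq_per_kg = (RL_mSv_per_year * fraction_of_RL_to_food) / (
    dose_coeff_mSv_per_Bq * consumption_kg_per_year
)
print(f"DRL ≈ {drl_Bq_per_kg:.0f} Bq/kg for the assumed foodstuff and pathway")
```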
Pandey, Sonia; Swamy, S M Vijayendra; Gupta, Arti; Koli, Akshay; Patel, Swagat; Maulvi, Furqan; Vyas, Bhavin
2018-04-29
To optimise Eudragit/Surelease®-coated pH-sensitive pellets for controlled and targeted drug delivery to the colon tissue, and to avoid the frequent high dosing and associated side effects which restrict its use in colorectal-cancer therapy. The pellets were prepared using an extrusion-spheronisation technique. Box-Behnken and 3² full factorial designs were applied to optimise the process parameters [extruder sieve size, spheroniser speed, and spheroniser time] and the coating levels [%w/v of Eudragit S100/Eudragit L100 and Surelease®], respectively, to achieve smooth pellets of optimised size with sustained drug delivery and no premature drug release in the upper gastrointestinal tract (GIT). The design proposed the optimised batch by selecting the independent variables at extruder sieve size (X₁ = 1 mm), spheroniser speed (X₂ = 900 revolutions per minute, rpm), and spheroniser time (X₃ = 15 min) to achieve a pellet size of 0.96 mm, an aspect ratio of 0.98, and a roundness of 97.42%. The 16%w/v coating strength of Surelease® and 13%w/v coating strength of Eudragit showed pH-dependent sustained release up to 22.35 h (t99%). The organ distribution study showed the absence of the drug in the upper part of the GIT tissue and the presence of a high level of capecitabine in the caecum and colon tissue. Thus, the outer Eudragit coat prevented the release of drug in the stomach and the inner Surelease® coat showed sustained drug release in the colon tissue. The study demonstrates the potential of the optimised Eudragit/Surelease®-coated capecitabine pellets as an effective colon-targeted delivery system to avoid frequent high dosing and the associated systemic side effects of the drug.
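A brief illustrative sketch of how a 3² full factorial over the two coating-level factors could be laid out and a quadratic response surface fitted; the response values are hypothetical, not the study's dissolution data.

```python
# Illustrative 3^2 full factorial layout over two coating-level factors and a
# least-squares quadratic (response-surface) fit; the response values are
# hypothetical, not the study's dissolution data.
import itertools
import numpy as np

levels = [-1.0, 0.0, 1.0]                                     # coded Eudragit / Surelease levels
design = np.array(list(itertools.product(levels, levels)))    # 9 runs
x1, x2 = design[:, 0], design[:, 1]

# Hypothetical measured response (e.g. t99% release time, h) for the 9 runs
y = np.array([14.2, 16.8, 18.1, 16.0, 19.5, 21.0, 17.4, 20.9, 22.4])

# Full quadratic model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones(9), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(coef, 2))
```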
Ali, Ziad A; Maehara, Akiko; Généreux, Philippe; Shlofmitz, Richard A; Fabbiocchi, Franco; Nazif, Tamim M; Guagliumi, Giulio; Meraj, Perwaiz M; Alfonso, Fernando; Samady, Habib; Akasaka, Takashi; Carlson, Eric B; Leesar, Massoud A; Matsumura, Mitsuaki; Ozan, Melek Ozgu; Mintz, Gary S; Ben-Yehuda, Ori; Stone, Gregg W
2016-11-26
Percutaneous coronary intervention (PCI) is most commonly guided by angiography alone. Intravascular ultrasound (IVUS) guidance has been shown to reduce major adverse cardiovascular events (MACE) after PCI, principally by resulting in a larger postprocedure lumen than with angiographic guidance. Optical coherence tomography (OCT) provides higher resolution imaging than does IVUS, although findings from some studies suggest that it might lead to smaller luminal diameters after stent implantation. We sought to establish whether or not a novel OCT-based stent sizing strategy would result in a minimum stent area similar to or better than that achieved with IVUS guidance and better than that achieved with angiography guidance alone. In this randomised controlled trial, we recruited patients aged 18 years or older undergoing PCI from 29 hospitals in eight countries. Eligible patients had one or more target lesions located in a native coronary artery with a visually estimated reference vessel diameter of 2·25-3·50 mm and a length of less than 40 mm. We excluded patients with left main or ostial right coronary artery stenoses, bypass graft stenoses, chronic total occlusions, planned two-stent bifurcations, and in-stent restenosis. Participants were randomly assigned (1:1:1; with use of an interactive web-based system in block sizes of three, stratified by site) to OCT guidance, IVUS guidance, or angiography-guided stent implantation. We did OCT-guided PCI using a specific protocol to establish stent length, diameter, and expansion according to reference segment external elastic lamina measurements. All patients underwent final OCT imaging (operators in the IVUS and angiography groups were masked to the OCT images). The primary efficacy endpoint was post-PCI minimum stent area, measured by OCT at a masked independent core laboratory at completion of enrolment, in all randomly allocated participants who had primary outcome data. The primary safety endpoint was procedural MACE. We tested non-inferiority of OCT guidance to IVUS guidance (with a non-inferiority margin of 1·0 mm²), superiority of OCT guidance to angiography guidance, and superiority of OCT guidance to IVUS guidance, in a hierarchical manner. This trial is registered with ClinicalTrials.gov, number NCT02471586. Between May 13, 2015, and April 5, 2016, we randomly allocated 450 patients (158 [35%] to OCT, 146 [32%] to IVUS, and 146 [32%] to angiography), with 415 final OCT acquisitions analysed for the primary endpoint (140 [34%] in the OCT group, 135 [33%] in the IVUS group, and 140 [34%] in the angiography group). The final median minimum stent area was 5·79 mm² (IQR 4·54-7·34) with OCT guidance, 5·89 mm² (4·67-7·80) with IVUS guidance, and 5·49 mm² (4·39-6·59) with angiography guidance. OCT guidance was non-inferior to IVUS guidance (one-sided 97·5% lower CI -0·70 mm²; p=0·001), but not superior (p=0·42). OCT guidance was also not superior to angiography guidance (p=0·12). We noted procedural MACE in four (3%) of 158 patients in the OCT group, one (1%) of 146 in the IVUS group, and one (1%) of 146 in the angiography group (OCT vs IVUS p=0·37; OCT vs angiography p=0·37). OCT-guided PCI using a specific reference segment external elastic lamina-based stent optimisation strategy was safe and resulted in similar minimum stent area to that of IVUS-guided PCI. These data warrant a large-scale randomised trial to establish whether or not OCT guidance results in superior clinical outcomes to angiography guidance. St Jude Medical.
Copyright © 2016 Elsevier Ltd. All rights reserved.
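A toy sketch of a one-sided non-inferiority comparison of mean minimum stent area with a 1.0 mm² margin, using a normal approximation and hypothetical summary statistics; it is not the trial's prespecified statistical analysis.

```python
# Toy sketch of a one-sided non-inferiority check on mean minimum stent area
# (margin 1.0 mm^2) using a normal approximation; the summary figures below are
# hypothetical and this is not the trial's prespecified analysis.
import math

def noninferior(mean_test, sd_test, n_test, mean_ref, sd_ref, n_ref,
                margin=1.0, z=1.96):
    """Return the one-sided 97.5% lower bound of (test - ref) and whether it
    exceeds -margin (i.e. non-inferiority is declared)."""
    diff = mean_test - mean_ref
    se = math.sqrt(sd_test**2 / n_test + sd_ref**2 / n_ref)
    lower = diff - z * se
    return lower, lower > -margin

# Hypothetical OCT vs IVUS summaries (mm^2)
lower, ok = noninferior(5.8, 2.1, 140, 5.9, 2.2, 135)
print(f"lower bound = {lower:.2f} mm^2; non-inferior at margin 1.0: {ok}")
```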
Cheng, Yu-Huei
2014-12-01
Specific primers play an important role in polymerase chain reaction (PCR) experiments, and therefore it is essential to find specific primers of outstanding quality. Unfortunately, many PCR constraints must be inspected simultaneously, which makes specific primer selection difficult and time-consuming. This paper introduces a novel computational intelligence-based method, Teaching-Learning-Based Optimisation, to select specific and feasible primers. Primer selection was performed for specified PCR product lengths of 150-300 bp and 500-800 bp using three melting temperature formulae: Wallace's formula, Bolton and McCarthy's formula and SantaLucia's formula. The authors calculated the optimal frequency to estimate the quality of primer selection, based on a total of 500 runs for 50 random nucleotide sequences of 'Homo species' retrieved from the National Center for Biotechnology Information. The method was then fairly compared with the genetic algorithm (GA) and memetic algorithm (MA) for primer selection in the literature. The results show that the method easily found suitable primers satisfying the specified primer constraints and performed better than the GA and the MA. Furthermore, the method was also compared with the commonly used Primer3 in terms of method type, primer presentation, parameter settings, speed and memory usage. In conclusion, it is an interesting primer selection method and a valuable tool for automatic high-throughput analysis. In future, the use of the selected primers in the wet lab needs to be validated carefully to increase the reliability of the method.
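As a small, hedged illustration of the kind of primer constraint screening involved, the snippet below applies the Wallace rule (Tm = 2(A+T) + 4(G+C)) and a GC-content window; it is only a constraint filter, not the authors' Teaching-Learning-Based Optimisation algorithm, and the thresholds are assumptions.

```python
# Small sketch of a primer constraint check: GC content and the Wallace-rule
# melting temperature Tm = 2*(A+T) + 4*(G+C). This is only a filter, not the
# authors' Teaching-Learning-Based Optimisation; thresholds are illustrative.
def wallace_tm(primer: str) -> int:
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

def passes_basic_constraints(primer: str,
                             tm_range=(52, 62),
                             gc_range=(0.40, 0.60)) -> bool:
    p = primer.upper()
    gc_frac = (p.count("G") + p.count("C")) / len(p)
    return (tm_range[0] <= wallace_tm(p) <= tm_range[1]
            and gc_range[0] <= gc_frac <= gc_range[1])

primer = "AGCGTACCTGATCGTAAGC"
print(wallace_tm(primer), passes_basic_constraints(primer))   # 58 True
```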
The GOSTT concept and hybrid mixed/virtual/augmented reality environment radioguided surgery.
Valdés Olmos, R A; Vidal-Sicart, S; Giammarile, F; Zaknun, J J; Van Leeuwen, F W; Mariani, G
2014-06-01
The popularity gained by the sentinel lymph node (SLN) procedure in the last two decades has increased the interest of the surgical disciplines in other applications of radioguided surgery. An example is the gamma-probe guided localization of occult or difficult-to-locate neoplastic lesions. Such guidance can be achieved by intralesional delivery (ultrasound, stereotaxis or CT) of a radiolabelled agent that remains accumulated at the site of the injection. Another possibility rests on the systemic administration of a tumour-seeking radiopharmaceutical with favourable tumour accumulation and retention. In addition, new intraoperative imaging devices for radioguided surgery in complex anatomical areas have become available. A few years ago, all of this led to the delineation of the concept of Guided intraOperative Scintigraphic Tumour Targeting (GOSTT), covering the whole spectrum of basic and advanced nuclear medicine procedures required to provide a roadmap that would optimise surgery. The introduction of allied signatures using, for example, hybrid tracers for simultaneous detection of radioactive and fluorescent signals has amplified the GOSTT concept: it is now possible to combine perioperative nuclear medicine imaging with the superior resolution of additional optical guidance in the operating room. This hybrid approach is currently in progress and will probably become an important model to follow in the coming years. A cornerstone of the GOSTT concept is constituted by diagnostic imaging technologies such as SPECT/CT. SPECT/CT was introduced halfway through the past decade and was immediately incorporated into the SLN procedure. Important reasons contributing to the success of SPECT/CT were its combination with lymphoscintigraphy and its ability to display SLNs in an anatomical environment. This latter aspect has been significantly improved in the new generation of SPECT/CT cameras and provides the basis for the novel mixed-reality protocols of image-guided surgery. In these protocols, the generated virtual SPECT/CT elements are visually superimposed on the body of the patient in the operating room to directly facilitate, by means of visualization on screen or using head-mounted devices, the localization of radioactive and/or fluorescent targets by minimally invasive approaches in areas of complex anatomy. All these technological advances will play an increasing role in the future extension and clinical impact of the GOSTT concept.
Optimisation of process parameters on thin shell part using response surface methodology (RSM)
NASA Astrophysics Data System (ADS)
Faiz, J. M.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.
2017-09-01
This study is carried out to optimise process parameters by simulation using Autodesk Moldflow Insight (AMI) software. The process parameters are taken as the input in order to analyse the warpage value, which is the output of this study. The significant parameters used are melt temperature, mould temperature, packing pressure, and cooling time. A plastic part made of polypropylene (PP) was selected as the study part. Optimisation of the process parameters is performed in Design Expert software with the aim of minimising the warpage value. Response Surface Methodology (RSM) is applied in this study, together with Analysis of Variance (ANOVA), in order to investigate the interactions between the parameters that significantly affect warpage. The optimised warpage value can thus be obtained from the RSM model, owing to its minimal error, and the study shows that the warpage value is improved by using RSM.
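A hedged sketch of the response-surface step described above: fit a quadratic model of warpage in two coded process factors and search the fitted surface for its minimum. The design points and warpage values are hypothetical, not the AMI simulation results.

```python
# Illustrative response-surface step for a warpage study: fit a quadratic model
# of warpage in two coded factors (e.g. melt temperature, packing pressure) and
# grid-search the fitted surface for its minimum. Data are hypothetical.
import itertools
import numpy as np

levels = [-1.0, 0.0, 1.0]
runs = np.array(list(itertools.product(levels, levels)))       # 3^2 face of the design
x1, x2 = runs[:, 0], runs[:, 1]
warpage = np.array([0.52, 0.47, 0.49, 0.45, 0.40, 0.43, 0.50, 0.44, 0.48])  # mm, hypothetical

# Quadratic model: w = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones(len(runs)), x1, x2, x1 * x2, x1**2, x2**2])
b, *_ = np.linalg.lstsq(X, warpage, rcond=None)

# Grid search the fitted surface inside the coded region [-1, 1] x [-1, 1]
grid = np.linspace(-1, 1, 101)
g1, g2 = np.meshgrid(grid, grid)
pred = b[0] + b[1]*g1 + b[2]*g2 + b[3]*g1*g2 + b[4]*g1**2 + b[5]*g2**2
i, j = np.unravel_index(pred.argmin(), pred.shape)
print(f"minimum predicted warpage {pred[i, j]:.3f} mm at coded ({g1[i, j]:.2f}, {g2[i, j]:.2f})")
```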