Sample records for hybrid watershed algorithm

  1. Evaluating water management strategies in watersheds by new hybrid Fuzzy Analytical Network Process (FANP) methods

    NASA Astrophysics Data System (ADS)

    RazaviToosi, S. L.; Samani, J. M. V.

    2016-03-01

    Watersheds are considered hydrological units. Their other important aspects, such as economic, social and environmental functions, play crucial roles in sustainable development. The objective of this work is to develop methodologies to prioritize watersheds by considering different development strategies in the environmental, social and economic sectors. Such a ranking can help managers identify the most critical watersheds, where applying water management strategies is expected to produce the greatest improvement in conditions. Due to the complex relations among the criteria, two new hybrid fuzzy ANP (Analytical Network Process) algorithms, one using fuzzy TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) and one using fuzzy max-min set methods, are used to provide a more flexible and accurate decision model. Five watersheds in Iran, named Oroomeyeh, Atrak, Sefidrood, Namak and Zayandehrood, are considered as alternatives. Based on long-term development goals, 38 water management strategies are defined as subcriteria in 10 clusters. The main advantage of the proposed methods is their ability to handle uncertainty, which is accomplished by using fuzzy numbers in all steps of the algorithms. To validate the proposed method, the final results were compared with those obtained from the ANP algorithm, and the Spearman rank correlation coefficient was applied to measure the similarity of the different rankings. Finally, a sensitivity analysis was conducted to investigate the influence of the cluster weights on the final ranking.
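
    As a minimal illustration of the validation step described in this record, the sketch below compares two rankings of the five watersheds with the Spearman rank correlation coefficient via SciPy; the rank values are invented for illustration and are not taken from the paper.

    ```python
    # Sketch: comparing two priority rankings of five watersheds with the
    # Spearman rank correlation coefficient, as in the validation step above.
    # The rank values below are invented for illustration only.
    from scipy.stats import spearmanr

    watersheds = ["Oroomeyeh", "Atrak", "Sefidrood", "Namak", "Zayandehrood"]
    rank_fuzzy_topsis = [1, 4, 2, 5, 3]   # hypothetical ranking from one hybrid method
    rank_plain_anp    = [2, 4, 1, 5, 3]   # hypothetical ranking from the baseline ANP

    rho, p_value = spearmanr(rank_fuzzy_topsis, rank_plain_anp)
    print(f"Spearman rho = {rho:.3f} (p = {p_value:.3f})")
    # A rho close to 1 indicates the two methods order the watersheds similarly.
    ```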

  2. Detection of bone disease by hybrid SST-watershed x-ray image segmentation

    NASA Astrophysics Data System (ADS)

    Sanei, Saeid; Azron, Mohammad; Heng, Ong Sim

    2001-07-01

    Detection of diagnostic features from X-ray images is favorable due to the low cost of these images. Accurate detection of the bone metastasis region greatly assists physicians in monitoring treatment and in removing the cancerous tissue by surgery. Here, a hybrid SST-watershed algorithm efficiently detects the boundary of the diseased regions. The Shortest Spanning Tree (SST), based on graph theory, is one of the most powerful tools in grey-level image segmentation. The method converts the image into arbitrarily shaped closed segments of distinct grey levels. To do so, the image is initially mapped to a tree; then, using the RSST algorithm, the image is segmented into a certain number of arbitrarily shaped regions. However, in fine segmentation, over-segmentation causes loss of objects of interest, while in coarse segmentation the SST-based method suffers from merging regions that belong to different objects. By applying the watershed algorithm, the large segments are divided into smaller regions based on the number of catchment basins in each segment. The process exploits a bi-level watershed concept to separate each multi-lobe region into a number of areas, each corresponding to an object (in our case, a cancerous region of the bone), disregarding their homogeneity in grey level.

  3. Comparative analysis of peak-detection techniques for comprehensive two-dimensional chromatography.

    PubMed

    Latha, Indu; Reichenbach, Stephen E; Tao, Qingping

    2011-09-23

    Comprehensive two-dimensional gas chromatography (GC×GC) is a powerful technology for separating complex samples. The typical goal of GC×GC peak detection is to aggregate data points of analyte peaks based on their retention times and intensities. Two techniques commonly used for two-dimensional peak detection are the two-step algorithm and the watershed algorithm. A recent study [4] compared the performance of the two-step and watershed algorithms for GC×GC data with retention-time shifts in the second-column separations. In that analysis, the peak retention-time shifts were corrected while applying the two-step algorithm but the watershed algorithm was applied without shift correction. The results indicated that the watershed algorithm has a higher probability of erroneously splitting a single two-dimensional peak than the two-step approach. This paper reconsiders the analysis by comparing peak-detection performance for resolved peaks after correcting retention-time shifts for both the two-step and watershed algorithms. Simulations with wide-ranging conditions indicate that when shift correction is employed with both algorithms, the watershed algorithm detects resolved peaks with greater accuracy than the two-step method. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Microscopic image analysis for reticulocyte based on watershed algorithm

    NASA Astrophysics Data System (ADS)

    Wang, J. Q.; Liu, G. F.; Liu, J. G.; Wang, G.

    2007-12-01

    We present a watershed-based algorithm for the analysis of light microscopic images of reticulocytes (RETs), to be used in an automated RET recognition system for peripheral blood. The original images, obtained by micrography, are segmented by a modified watershed algorithm and recognized in terms of gray entropy and the area of connected regions. In the watershed procedure, the judgment conditions are controlled according to the character of the image, and the segmentation is performed by morphological subtraction. The algorithm was simulated with MATLAB software. Automated and manual scoring gave similar results, with good correlation (r=0.956) between the two methods over 50 RET images. The results indicate that the algorithm is comparable to conventional manual scoring of peripheral blood RETs and is superior in objectivity. The algorithm avoids time-consuming calculations such as ultra-erosion and region growth, which consequently speeds up the computation.
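
    The record does not include the authors' MATLAB code. The Python sketch below shows a generic marker-controlled watershed of touching blob-like objects (a stand-in for cell images) using scikit-image and SciPy; it approximates only the segmentation idea, not the paper's gray-entropy recognition stage or its morphological-subtraction details.

    ```python
    # Sketch of generic marker-controlled watershed segmentation of touching
    # round objects (a stand-in for cell images). Uses scikit-image/SciPy;
    # this is not the authors' MATLAB implementation.
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.segmentation import watershed
    from skimage.feature import peak_local_max

    # Synthetic binary image: two overlapping disks.
    yy, xx = np.mgrid[0:120, 0:120]
    binary = ((xx - 45) ** 2 + (yy - 60) ** 2 < 28 ** 2) | \
             ((xx - 78) ** 2 + (yy - 60) ** 2 < 28 ** 2)

    # Markers: local maxima of the distance transform, roughly one per object.
    distance = ndi.distance_transform_edt(binary)
    coords = peak_local_max(distance, footprint=np.ones((15, 15)), labels=binary)
    marker_mask = np.zeros(distance.shape, dtype=bool)
    marker_mask[tuple(coords.T)] = True
    markers, _ = ndi.label(marker_mask)

    # Flood the negated distance map from the markers, restricted to the mask.
    labels = watershed(-distance, markers, mask=binary)
    print("objects found:", labels.max())  # typically 2 separated regions
    ```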

  5. A hybrid regional approach to model discharge at multiple sub-basins within the Calapooia Watershed, Oregon, USA

    EPA Science Inventory

    Modeling is a useful tool for quantifying ecosystem services and understanding their temporal dynamics. Here we describe a hybrid regional modeling approach for sub-basins of the Calapooia watershed that incorporates both a precipitation-runoff model and an indexed regression mo...

  6. Using Hybrid Techniques for Generating Watershed-scale Flood Models in an Integrated Modeling Framework

    NASA Astrophysics Data System (ADS)

    Saksena, S.; Merwade, V.; Singhofen, P.

    2017-12-01

    There is an increasing global trend towards developing large-scale flood models that account for spatial heterogeneity at watershed scales to drive future flood risk planning. Integrated surface water-groundwater modeling procedures can represent all the hydrologic processes acting during a flood event and thereby provide accurate flood outputs. Even though the advantages of integrated modeling are widely acknowledged, the complexity of integrated process representation, the computation time and the number of input parameters required have deterred its application to flood inundation mapping, especially for large watersheds. This study presents a faster approach for creating watershed-scale flood models using a hybrid design that breaks the watershed into multiple regions of variable spatial resolution by prioritizing higher-order streams. The methodology involves creating a hybrid model for the Upper Wabash River Basin in Indiana using Interconnected Channel and Pond Routing (ICPR) and comparing its performance with a fully integrated 2D hydrodynamic model. The hybrid approach involves simplification procedures such as 1D channel-2D floodplain coupling; hydrologic basin (HUC-12) integration with 2D groundwater for rainfall-runoff routing; and varying the spatial resolution of 2D overland flow based on stream order. The results for a 50-year return period storm event show that the hybrid model (NSE=0.87) performs similarly to the 2D integrated model (NSE=0.88) while the computational time is cut in half. The results suggest that significant computational efficiency can be obtained while maintaining model accuracy for large-scale flood models by using hybrid approaches for model creation.
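
    For reference, the Nash-Sutcliffe efficiency (NSE) quoted above is a standard goodness-of-fit metric for hydrologic models; a minimal implementation is sketched below with made-up discharge values.

    ```python
    # Sketch: Nash-Sutcliffe efficiency (NSE), the metric used above to compare
    # the hybrid and fully integrated flood models. Values below are made up.
    import numpy as np

    def nse(observed, simulated):
        """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
        observed = np.asarray(observed, dtype=float)
        simulated = np.asarray(simulated, dtype=float)
        return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
            (observed - observed.mean()) ** 2)

    obs = [120.0, 340.0, 810.0, 650.0, 300.0, 150.0]   # hypothetical hydrograph
    sim = [110.0, 360.0, 790.0, 700.0, 280.0, 160.0]
    print(f"NSE = {nse(obs, sim):.2f}")  # 1.0 is a perfect fit
    ```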

  7. A hierarchical network-based algorithm for multi-scale watershed delineation

    NASA Astrophysics Data System (ADS)

    Castronova, Anthony M.; Goodall, Jonathan L.

    2014-11-01

    Watershed delineation is a process for defining the land area that contributes surface water flow to a single outlet point. It is commonly used in water resources analysis to define the domain in which hydrologic process calculations are applied. There has been a growing effort over the past decade to improve surface elevation measurements in the U.S., which has had a significant impact on the accuracy of hydrologic calculations. Traditional watershed processing on these elevation rasters, however, becomes more burdensome as data resolution increases. As a result, processing of these datasets can be troublesome on standard desktop computers. This challenge has resulted in numerous works that aim to provide high performance computing solutions to large data, high resolution data, or both. This work proposes an efficient watershed delineation algorithm for use in desktop computing environments that leverages existing data, the U.S. Geological Survey (USGS) National Hydrography Dataset Plus (NHD+), and open source software tools to construct watershed boundaries. The approach makes use of U.S. national-level hydrography data that has been precomputed using raster processing algorithms coupled with quality control routines, and uses carefully arranged data and mathematical graph theory to traverse river networks and identify catchment boundaries. We demonstrate this new watershed delineation technique, compare its accuracy with traditional algorithms that derive watersheds solely from digital elevation models, and then extend the approach to address subwatershed delineation. Our findings suggest that the open-source hierarchical network-based delineation procedure presented in this work is a promising approach to watershed delineation that can be used to summarize publicly available datasets for hydrologic model input pre-processing. Through our analysis, we explore the benefits of reusing the NHD+ datasets for watershed delineation, and find that our technique offers greater flexibility and extensibility than traditional raster algorithms.
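
    A toy sketch of the core idea, assuming the precomputed hydrography can be represented as a directed catchment graph: delineating a watershed then reduces to collecting every node upstream of an outlet. The network below is invented and networkx stands in for the paper's own tooling.

    ```python
    # Sketch: delineating a watershed by traversing a precomputed catchment
    # network (the idea behind the NHD+-based approach above) instead of
    # processing a raster DEM. The toy network below is invented.
    import networkx as nx

    # Directed edges point downstream: upstream_catchment -> downstream_catchment.
    flow_network = nx.DiGraph([
        ("C1", "C3"), ("C2", "C3"),   # two headwater catchments drain into C3
        ("C3", "C5"), ("C4", "C5"),   # C3 and C4 drain into C5
        ("C5", "C6"),                 # C6 is the basin outlet catchment
    ])

    def delineate(outlet):
        """Return the set of catchments draining to (and including) the outlet."""
        return nx.ancestors(flow_network, outlet) | {outlet}

    print(sorted(delineate("C5")))  # ['C1', 'C2', 'C3', 'C4', 'C5']
    ```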

  8. Spatial segregation of spawning habitat limits hybridization between sympatric native Steelhead and Coastal Cutthroat Trout

    USGS Publications Warehouse

    Buehrens, T.W.; Glasgow, J.; Ostberg, Carl O.; Quinn, T.P.

    2013-01-01

    Native Coastal Cutthroat Trout Oncorhynchus clarkii clarkii and Coastal Steelhead O. mykiss irideus hybridize naturally in watersheds of the Pacific Northwest yet maintain species integrity. Partial reproductive isolation due to differences in spawning habitat may limit hybridization between these species, but this process is poorly understood. We used a riverscape approach to determine the spatial distribution of spawning habitats used by native Coastal Cutthroat Trout and Steelhead as evidenced by the distribution of recently emerged fry. Molecular genetic markers were used to classify individuals as pure species or hybrids, and individuals were assigned to age-classes based on length. Fish and physical habitat data were collected in a spatially continuous framework to assess the relationship between habitat and watershed features and the spatial distribution of parental species and hybrids. Sampling occurred in 35 reaches from tidewaters to headwaters in a small (20 km2) coastal watershed in Washington State. Cutthroat, Steelhead, and hybrid trout accounted for 35%, 42%, and 23% of the fish collected, respectively. Strong segregation of spawning areas between Coastal Cutthroat Trout and Steelhead was evidenced by the distribution of age-0 trout. Cutthroat Trout were located farther upstream and in smaller tributaries than Steelhead were. The best predictor of species occurrence at a site was the drainage area of the watershed that contributed to the site. This area was positively correlated with the occurrence of age-0 Steelhead and negatively with the presence of Cutthroat Trout, whereas hybrids were found in areas occupied by both parental species. A similar pattern was observed in older juveniles of both species but overlap was greater, suggesting substantial dispersal of trout after emergence. Our results offer support for spatial reproductive segregation as a factor limiting hybridization between Steelhead and Coastal Cutthroat Trout.

  9. Hybrid Multi-Objective Optimization of Folsom Reservoir Operation to Maximize Storage in Whole Watershed

    NASA Astrophysics Data System (ADS)

    Goharian, E.; Gailey, R.; Maples, S.; Azizipour, M.; Sandoval Solis, S.; Fogg, G. E.

    2017-12-01

    The drought incidents and growing water scarcity in California have a profound effect on human, agricultural, and environmental water needs. California has experienced multi-year droughts, which have caused groundwater overdraft, dropping groundwater levels, and dwindling storage in major reservoirs. These concerns call for a stringent evaluation of future water resources sustainability and security in the state. In response, the Sustainable Groundwater Management Act (SGMA) was passed in 2014 to ensure sustainable groundwater management in California by 2042. SGMA identifies managed aquifer recharge (MAR) as a key management option, especially in areas with high intra- and inter-annual variation in water availability, to secure the refill of underground water storage and the return of groundwater quality to a desirable condition. The hybrid optimization of an integrated water resources system provides an opportunity to adapt surface reservoir operations to enhance groundwater recharge. Here, to re-operate Folsom Reservoir, the objectives are to maximize the storage in the whole American-Cosumnes watershed and to maximize hydropower generation from Folsom Reservoir. A linear programming (LP) module maximizes total groundwater recharge by distributing and spreading water over suitable lands in the basin, while a genetic-algorithm layer above it, the Non-dominated Sorting Genetic Algorithm II (NSGA-II), controls releases from the reservoir to secure hydropower generation, carry-over storage in the reservoir, the water available for replenishment, and downstream water requirements. The preliminary results show additional releases from the reservoir for groundwater recharge during high-flow seasons. Moreover, the tradeoffs between the objectives show that the new operation performs satisfactorily in increasing storage in the basin, with insignificant effects on the other objectives.
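
    A heavily simplified sketch of the bi-level structure described above: an outer search over the reservoir release (a stand-in for NSGA-II) and an inner linear program that spreads the released water over suitable recharge lands. All numbers and the toy hydropower proxy are invented; this is not the authors' Folsom/American-Cosumnes model.

    ```python
    # Simplified bi-level sketch: outer search over reservoir release (stand-in
    # for NSGA-II), inner LP allocating released water to recharge lands.
    import numpy as np
    from scipy.optimize import linprog

    recharge_rate = np.array([0.9, 0.7, 0.5])     # recharge per unit water on each land unit
    capacity      = np.array([40.0, 60.0, 80.0])  # max water each land unit can take

    def inner_recharge_lp(water_available):
        # maximize sum(rate * x)  ==  minimize -rate @ x, subject to
        # sum(x) <= water_available and 0 <= x_i <= capacity_i.
        res = linprog(c=-recharge_rate,
                      A_ub=np.ones((1, 3)), b_ub=[water_available],
                      bounds=list(zip(np.zeros(3), capacity)), method="highs")
        return -res.fun  # total recharge achieved

    def hydropower(release):
        return 1.2 * release  # toy proxy: generation grows with release

    # Outer loop: enumerate candidate releases and report the two objectives.
    for release in [50.0, 100.0, 150.0, 200.0]:
        print(f"release={release:6.1f}  recharge={inner_recharge_lp(release):7.1f}"
              f"  hydropower={hydropower(release):7.1f}")
    ```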

  10. A modified approach combining FNEA and watershed algorithms for segmenting remotely-sensed optical images

    NASA Astrophysics Data System (ADS)

    Liu, Likun

    2018-01-01

    In the field of remote sensing image processing, image segmentation is a preliminary step for later analysis, including semi-automatic human interpretation and fully automatic machine recognition and learning. Since 2000, object-oriented remote sensing image processing and its basic ideas have prevailed. The core of the approach is the Fractal Net Evolution Approach (FNEA) multi-scale segmentation algorithm. This paper focuses on the study and improvement of that algorithm: it analyzes existing segmentation algorithms and selects the watershed algorithm as the optimal initialization. The algorithm is then modified by adjusting an area parameter, and further by combining the area parameter with a heterogeneity parameter. Several experiments are carried out to show that the modified FNEA algorithm achieves a better segmentation result than a traditional pixel-based method (an FCM algorithm based on neighborhood information) and the plain combination of FNEA and watershed.

  11. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8270, US Army Research Laboratory, January 2018. Author: Kwok F Tom. Reporting period: 1 October 2016–30 September 2017. Report title: An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform. Only the report front matter is available in this record.

  12. A micro-hydrology computation ordering algorithm

    NASA Astrophysics Data System (ADS)

    Croley, Thomas E.

    1980-11-01

    Discrete-distributed-parameter models are essential for watershed modelling where practical consideration of spatial variations in watershed properties and inputs is desired. Such modelling is necessary for analysis of detailed hydrologic impacts from management strategies and land-use effects. Trade-offs between model validity and model complexity exist in the resolution of the watershed. Once these are determined, the watershed is broken into sub-areas that each have essentially spatially-uniform properties. Lumped-parameter (micro-hydrology) models are applied to these sub-areas and their outputs are combined through the use of a computation ordering technique, as illustrated by many discrete-distributed-parameter hydrology models. Manual ordering of these computations requires forethought, and is tedious, error prone, sometimes storage intensive and least adaptable to changes in watershed resolution. A programmable algorithm for ordering micro-hydrology computations is presented that enables automatic ordering of computations within the computer via an easily understood and easily implemented "node" definition, numbering and coding scheme. This scheme and the algorithm are detailed in logic flow-charts and an example application is presented. Extensions and modifications of the algorithm are easily made for complex geometries or differing micro-hydrology models. The algorithm is shown to be superior to manual ordering techniques and has potential use in high-resolution studies.
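
    The paper's own node numbering and coding scheme is not reproduced in this record; the sketch below shows a modern equivalent, assuming the sub-areas form a directed acyclic drainage graph, where a topological sort yields a valid upstream-to-downstream computation order.

    ```python
    # Sketch: ordering micro-hydrology computations so every sub-area is run
    # only after all of its upstream contributors. A topological sort of the
    # sub-area graph gives one valid ordering; the network below is invented
    # and the paper's own node numbering/coding scheme is not reproduced.
    import networkx as nx

    subareas = nx.DiGraph([
        ("A", "C"), ("B", "C"),   # A and B drain into C
        ("C", "E"), ("D", "E"),   # C and D drain into E (the outlet)
    ])

    order = list(nx.topological_sort(subareas))
    print("computation order:", order)  # e.g. ['A', 'B', 'D', 'C', 'E']

    for node in order:
        upstream = list(subareas.predecessors(node))
        print(f"run lumped model for {node}, combining inflows from {upstream}")
    ```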

  13. The development of a 3D mesoscopic model of metallic foam based on an improved watershed algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Jinhua; Zhang, Yadong; Wang, Guikun; Fang, Qin

    2018-06-01

    The watershed algorithm has been used widely in x-ray computed tomography (XCT) image segmentation. It provides a transformation defined on a grayscale image and finds the lines that separate adjacent regions. However, distortion occurs when a mesoscopic model of metallic foam is developed from XCT image data: the cells are over-segmented in some cases when the traditional watershed algorithm is used. The improved watershed algorithm presented in this paper avoids over-segmentation and is composed of three steps. Firstly, it finds all of the connected cells and identifies the junctions of the corresponding cell walls. Secondly, image segmentation is conducted to separate the adjacent cells, generating the lost cell walls between them; optimization is then performed on the segmented image. Thirdly, the improved algorithm is validated by comparison with images of the metallic foam, which shows that it avoids the segmentation distortion. A mesoscopic model of metallic foam is thus formed based on the improved algorithm, and the mesoscopic characteristics of the metallic foam, such as cell size, volume and shape, are identified and analyzed.

  14. An auto-adaptive optimization approach for targeting nonpoint source pollution control practices.

    PubMed

    Chen, Lei; Wei, Guoyuan; Shen, Zhenyao

    2015-10-21

    To address the computationally intensive and technically complex control of nonpoint source pollution, the traditional genetic algorithm was modified into an auto-adaptive pattern, and a new framework was proposed by integrating this new algorithm with a watershed model and an economic module. Although conceptually simple and comprehensive, the proposed algorithm searches automatically for Pareto-optimal solutions without a complex calibration of optimization parameters. The model was applied in a case study in a typical watershed of the Three Gorges Reservoir area, China. The results indicated that the evolutionary process of optimization was improved by the incorporation of auto-adaptive parameters. In addition, the proposed algorithm outperformed state-of-the-art existing algorithms in terms of convergence ability and computational efficiency. At the same cost level, solutions with greater pollutant reductions could be identified. From a scientific viewpoint, the proposed algorithm could be extended to other watersheds to provide cost-effective configurations of BMPs.

  15. A novel hybrid meta-heuristic technique applied to the well-known benchmark optimization problems

    NASA Astrophysics Data System (ADS)

    Abtahi, Amir-Reza; Bijari, Afsane

    2017-03-01

    In this paper, a hybrid meta-heuristic algorithm, based on imperialistic competition algorithm (ICA), harmony search (HS), and simulated annealing (SA) is presented. The body of the proposed hybrid algorithm is based on ICA. The proposed hybrid algorithm inherits the advantages of the process of harmony creation in HS algorithm to improve the exploitation phase of the ICA algorithm. In addition, the proposed hybrid algorithm uses SA to make a balance between exploration and exploitation phases. The proposed hybrid algorithm is compared with several meta-heuristic methods, including genetic algorithm (GA), HS, and ICA on several well-known benchmark instances. The comprehensive experiments and statistical analysis on standard benchmark functions certify the superiority of the proposed method over the other algorithms. The efficacy of the proposed hybrid algorithm is promising and can be used in several real-life engineering and management problems.

  16. Hybrid cryptosystem RSA - CRT optimization and VMPC

    NASA Astrophysics Data System (ADS)

    Rahmadani, R.; Mawengkang, H.; Sutarman

    2018-03-01

    A hybrid cryptosystem combines symmetric and asymmetric algorithms: it exploits the encryption/decryption speed of symmetric algorithms and uses asymmetric algorithms to secure the symmetric keys. In this paper we propose a hybrid cryptosystem that combines the symmetric algorithm VMPC with an optimized asymmetric RSA-CRT algorithm. The RSA-CRT optimization speeds up the decryption process by obtaining the plaintext with the dp and p keys only, so there is no need to perform the full CRT computation. The VMPC algorithm is more efficient in software implementations and reduces known weaknesses in RC4 key generation. The results show that the hybrid cryptosystem combining optimized RSA-CRT and VMPC is faster than the hybrid cryptosystems RSA-VMPC and RSA-CRT-VMPC. Keywords: Cryptography, RSA, RSA-CRT, VMPC, Hybrid Cryptosystem.
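
    As background for the RSA-CRT speed-up mentioned above, the sketch below shows textbook CRT decryption with a tiny toy key; the paper's further optimization (recovering the plaintext with dp and p only) and the VMPC stream cipher are not reproduced.

    ```python
    # Sketch: textbook RSA decryption with the Chinese Remainder Theorem (CRT),
    # the speed-up that the hybrid cryptosystem above builds on. The tiny
    # textbook key below is for illustration only. Requires Python 3.8+ for
    # modular inverses via pow(x, -1, m).
    p, q, e = 61, 53, 17
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)                 # private exponent (2753 for this toy key)

    # Precomputed CRT parameters.
    dp, dq = d % (p - 1), d % (q - 1)
    q_inv = pow(q, -1, p)

    def decrypt_crt(c):
        m1 = pow(c, dp, p)              # two small exponentiations instead of
        m2 = pow(c, dq, q)              # one full-size pow(c, d, n)
        h = (q_inv * (m1 - m2)) % p
        return m2 + h * q

    message = 65
    cipher = pow(message, e, n)
    assert decrypt_crt(cipher) == pow(cipher, d, n) == message
    print("recovered plaintext:", decrypt_crt(cipher))
    ```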

  17. COST-EFFECTIVE ALLOCATION OF WATERSHED MANAGEMENT PRACTICES USING A GENETIC ALGORITHM

    EPA Science Inventory

    Implementation of conservation programs is perceived as being crucial for restoring and protecting waters and watersheds from non-point source pollution. Success of these programs depends to a great extent on planning tools that can assist the watershed management process. Here-...

  18. Real-time implementations of image segmentation algorithms on shared memory multicore architecture: a survey (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Akil, Mohamed

    2017-05-01

    Real-time processing is becoming more and more important in many image processing applications. Image segmentation is one of the most fundamental tasks in image analysis, and consequently many different approaches to image segmentation have been proposed. The watershed transform is a well-known image segmentation tool, but it is a very data-intensive task. To achieve acceleration and obtain real-time processing of watershed algorithms, parallel architectures and programming models for multicore computing have been developed. This paper surveys approaches for the parallel implementation of sequential watershed algorithms on multicore general-purpose CPUs: homogeneous multicore processors with shared memory. To achieve an efficient parallel implementation, it is necessary to explore different strategies (parallelization/distribution/distributed scheduling) combined with different acceleration and optimization techniques to enhance parallelism. In this paper, we compare various parallelizations of sequential watershed algorithms on shared memory multicore architectures. We analyze the performance measurements of each parallel implementation and the impact of the different sources of overhead on its performance. In this comparison study, we also discuss the advantages and disadvantages of the parallel programming models, comparing OpenMP (an application programming interface for multi-processing) with Pthreads (POSIX Threads) to illustrate the impact of each parallel programming model on the performance of the parallel implementations.

  19. Automatic extraction of planetary image features

    NASA Technical Reports Server (NTRS)

    LeMoigne-Stewart, Jacqueline J. (Inventor); Troglio, Giulia (Inventor); Benediktsson, Jon A. (Inventor); Serpico, Sebastiano B. (Inventor); Moser, Gabriele (Inventor)

    2013-01-01

    A method for the extraction of Lunar data and/or planetary features is provided. The feature extraction method can include one or more image processing techniques, including, but not limited to, watershed segmentation and/or the generalized Hough Transform. According to some embodiments, the feature extraction method can include extracting features such as small rocks. According to some embodiments, small rocks can be extracted by applying a watershed segmentation algorithm to the Canny gradient. According to some embodiments, applying a watershed segmentation algorithm to the Canny gradient can allow regions that appear as closed contours in the gradient to be segmented.

  20. Hybrid artificial bee colony algorithm for parameter optimization of five-parameter bidirectional reflectance distribution function model.

    PubMed

    Wang, Qianqian; Zhao, Jing; Gong, Yong; Hao, Qun; Peng, Zhong

    2017-11-20

    A hybrid artificial bee colony (ABC) algorithm inspired by the best-so-far solution and bacterial chemotaxis was introduced to optimize the parameters of the five-parameter bidirectional reflectance distribution function (BRDF) model. To verify the performance of the hybrid ABC algorithm, we measured BRDF of three kinds of samples and simulated the undetermined parameters of the five-parameter BRDF model using the hybrid ABC algorithm and the genetic algorithm, respectively. The experimental results demonstrate that the hybrid ABC algorithm outperforms the genetic algorithm in convergence speed, accuracy, and time efficiency under the same conditions.

  1. HYBRID FAST HANKEL TRANSFORM ALGORITHM FOR ELECTROMAGNETIC MODELING

    EPA Science Inventory

    A hybrid fast Hankel transform algorithm has been developed that uses several complementary features of two existing algorithms: Anderson's digital filtering or fast Hankel transform (FHT) algorithm and Chave's quadrature and continued fraction algorithm. A hybrid FHT subprogram ...

  2. Genetic Algorithms and Local Search

    NASA Technical Reports Server (NTRS)

    Whitley, Darrell

    1996-01-01

    The first part of this presentation is a tutorial level introduction to the principles of genetic search and models of simple genetic algorithms. The second half covers the combination of genetic algorithms with local search methods to produce hybrid genetic algorithms. Hybrid algorithms can be modeled within the existing theoretical framework developed for simple genetic algorithms. An application of a hybrid to geometric model matching is given. The hybrid algorithm yields results that improve on the current state-of-the-art for this problem.
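
    A minimal sketch of the hybrid (memetic) idea described in this tutorial record: a simple genetic algorithm whose offspring are refined by hill-climbing local search. The toy OneMax fitness is chosen for illustration; the geometric model matching application is not reproduced.

    ```python
    # Sketch of a generic hybrid ("memetic") genetic algorithm: a simple GA whose
    # offspring are refined by bit-flip hill climbing. Fitness is the toy OneMax
    # problem (maximize the number of 1 bits); this is not the presentation's
    # geometric model matching application.
    import random

    N, POP, GENS = 40, 30, 50
    fitness = lambda bits: sum(bits)               # OneMax

    def hill_climb(bits):
        """Local search: accept single-bit flips that improve fitness."""
        bits = bits[:]
        for i in random.sample(range(N), N):
            flipped = bits[:]
            flipped[i] ^= 1
            if fitness(flipped) > fitness(bits):
                bits = flipped
        return bits

    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(POP)]
    for _ in range(GENS):
        parents = [max(random.sample(pop, 3), key=fitness) for _ in range(POP)]
        children = []
        for a, b in zip(parents[::2], parents[1::2]):
            cut = random.randrange(1, N)
            child = a[:cut] + b[cut:]                      # one-point crossover
            if random.random() < 0.2:                      # mutation
                child[random.randrange(N)] ^= 1
            children.append(hill_climb(child))             # hybrid step
        pop = sorted(pop + children, key=fitness, reverse=True)[:POP]

    print("best fitness:", fitness(pop[0]), "of", N)
    ```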

  3. Adaptive striping watershed segmentation method for processing microscopic images of overlapping irregular-shaped and multicentre particles.

    PubMed

    Xiao, X; Bai, B; Xu, N; Wu, K

    2015-04-01

    Oversegmentation is a major drawback of the morphological watershed algorithm. Here, we show that oversegmentation arises not only from the familiar problem of irregularly shaped particle images, but also from particles, such as ellipses, that have more than one centre. A new parameter, the striping level, is introduced, and a criterion for the striping parameter is built to help find the right markers prior to segmentation. An adaptive striping watershed algorithm is established by applying a marker searching procedure to find the markers, which effectively suppresses the oversegmentation. The effectiveness of the proposed method is validated by analysing typical particle images, including images of gold nanorod ensembles. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.

  4. Bladder segmentation in MR images with watershed segmentation and graph cut algorithm

    NASA Astrophysics Data System (ADS)

    Blaffert, Thomas; Renisch, Steffen; Schadewaldt, Nicole; Schulz, Heinrich; Wiemker, Rafael

    2014-03-01

    Prostate and cervix cancer diagnosis and treatment planning based on MR images benefit from superior soft tissue contrast compared to CT images. For these images, an automatic delineation of the prostate or cervix and of organs at risk such as the bladder is highly desirable. This paper describes a method for bladder segmentation that is based on a watershed transform on high image gradient values and gray value valleys, together with the classification of watershed regions into bladder contents and tissue by a graph cut algorithm. The obtained results are superior to those of a simple region-after-region classification.

  5. CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests.

    PubMed

    Ma, Li; Fan, Suohai

    2017-03-14

    The random forests algorithm is a type of classifier with prominent universality, a wide application range, and robustness against overfitting. However, random forests still have some drawbacks. To improve their performance, this paper addresses imbalanced data processing, feature selection and parameter optimization. We propose the CURE-SMOTE algorithm for the imbalanced data classification problem. Experiments on imbalanced UCI data reveal that incorporating Clustering Using Representatives (CURE) effectively enhances the original synthetic minority oversampling technique (SMOTE), compared with the classification results obtained on the original data using random sampling, Borderline-SMOTE1, safe-level SMOTE, C-SMOTE, and k-means-SMOTE. Additionally, a hybrid RF (random forests) algorithm is proposed for feature selection and parameter optimization, using the minimum out-of-bag (OOB) data error as its objective function. Simulation results on binary and higher-dimensional data indicate that the proposed hybrid RF algorithms (the hybrid genetic-random forests, hybrid particle swarm-random forests and hybrid fish swarm-random forests algorithms) achieve the minimum OOB error and show the best generalization ability. The training set produced by the proposed CURE-SMOTE algorithm is closer to the original data distribution because it contains minimal noise, so this feasible and effective algorithm produces better classification results. Moreover, the hybrid algorithms' F-value, G-mean, AUC and OOB scores surpass those of the original RF algorithm. Hence, the hybrid algorithm provides a new way to perform feature selection and parameter optimization.
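
    The sketch below illustrates only the core SMOTE step referenced above, generating synthetic minority samples by interpolating toward nearest minority neighbours; the CURE clustering enhancement and the hybrid random-forest variants are not reproduced, and the data are random.

    ```python
    # Sketch: the core SMOTE idea, synthesizing minority-class samples by
    # interpolating between a minority point and one of its nearest minority
    # neighbours. The CURE enhancement and the RF hybrids are not reproduced.
    import numpy as np

    rng = np.random.default_rng(0)
    minority = rng.normal(loc=0.0, scale=1.0, size=(20, 2))   # toy minority class

    def smote_like(X, n_new, k=5):
        dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        np.fill_diagonal(dists, np.inf)
        neighbours = np.argsort(dists, axis=1)[:, :k]          # k nearest per sample
        synthetic = []
        for _ in range(n_new):
            i = rng.integers(len(X))
            j = neighbours[i, rng.integers(k)]
            gap = rng.random()
            synthetic.append(X[i] + gap * (X[j] - X[i]))       # point on the segment
        return np.array(synthetic)

    print(smote_like(minority, n_new=5).round(2))
    ```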

  6. Repeated trans-watershed hybridization among haplochromine cichlids (Cichlidae) was triggered by Neogene landscape evolution.

    PubMed

    Schwarzer, Julia; Swartz, Ernst Roelof; Vreven, Emmanuel; Snoeks, Jos; Cotterill, Fenton Peter David; Misof, Bernhard; Schliewen, Ulrich Kurt

    2012-11-07

    The megadiverse haplochromine cichlid radiations of the East African lakes, famous examples of explosive speciation and adaptive radiation, are according to recent studies, introgressed by different riverine lineages. This study is based on the first comprehensive mitochondrial and nuclear DNA dataset from extensive sampling of riverine haplochromine cichlids. It includes species from the lower River Congo and Angolan (River Kwanza) drainages. Reconstruction of phylogenetic hypotheses revealed the paradox of clearly discordant phylogenetic signals. Closely related mtDNA haplotypes are distributed thousands of kilometres apart and across major African watersheds, whereas some neighbouring species carry drastically divergent mtDNA haplotypes. At shallow and deep phylogenetic layers, strong signals of hybridization are attributed to the complex Late Miocene/Early Pliocene palaeohistory of African rivers. Hybridization of multiple lineages across changing watersheds shaped each of the major haplochromine radiations in lakes Tanganyika, Victoria, Malawi and the Kalahari Palaeolakes, as well as a miniature species flock in the Congo basin (River Fwa). On the basis of our results, introgression occurred not only on a spatially restricted scale, but massively over almost the whole range of the haplochromine distribution. This provides an alternative view on the origin and exceptional high diversity of this enigmatic vertebrate group.

  7. G/SPLINES: A hybrid of Friedman's Multivariate Adaptive Regression Splines (MARS) algorithm with Holland's genetic algorithm

    NASA Technical Reports Server (NTRS)

    Rogers, David

    1991-01-01

    G/SPLINES is a hybrid of Friedman's Multivariate Adaptive Regression Splines (MARS) algorithm with Holland's genetic algorithm. In this hybrid, the incremental search is replaced by a genetic search. The G/SPLINES algorithm exhibits performance comparable to that of the MARS algorithm, requires fewer least-squares computations, and allows significantly larger problems to be considered.

  8. Application of a New Integrated Decision Support Tool (i-DST) for Urban Water Infrastructure: Analyzing Water Quality Compliance Pathways for Three Los Angeles Watersheds

    NASA Astrophysics Data System (ADS)

    Gallo, E. M.; Hogue, T. S.; Bell, C. D.; Spahr, K.; McCray, J. E.

    2017-12-01

    Receiving streams and waterbodies in urban watersheds are increasingly polluted by stormwater runoff. The implementation of Green Infrastructure (GI), which includes Low Impact Developments (LIDs) and Best Management Practices (BMPs), within a watershed aims to mitigate the effects of urbanization by reducing pollutant loads, runoff volume, and storm peak flow. Stormwater modeling is generally used to assess the impact of GIs implemented within a watershed. These modeling tools are useful for determining the optimal suite of GIs to maximize pollutant load reduction and minimize cost. However, stormwater management for most resource managers and communities also includes the implementation of grey and hybrid stormwater infrastructure. An integrated decision support tool, called i-DST, that allows for the optimization and comprehensive life-cycle cost assessment of grey, green, and hybrid stormwater infrastructure is currently being developed. The i-DST tool will evaluate optimal stormwater runoff management by taking into account the diverse economic, environmental, and societal needs associated with watersheds across the United States. Three watersheds from southern California act as a test site and assist in the development and initial application of the i-DST tool. The Ballona Creek, Dominguez Channel, and Los Angeles River Watersheds are located in highly urbanized Los Angeles County, and the water quality of the river channels flowing through each is impaired by heavy metals, including copper, lead, and zinc. However, despite being adjacent to one another within the same county, modeling results using the EPA System for Urban Stormwater Treatment and Analysis INtegration (SUSTAIN) found that the optimal path to compliance in each watershed differs significantly. The differences include varied costs, suites of BMPs, and ancillary benefits. This research analyzes how the economic, physical, and hydrological differences between the three watersheds shape the optimal plan for stormwater management.

  9. Ensemble of hybrid genetic algorithm for two-dimensional phase unwrapping

    NASA Astrophysics Data System (ADS)

    Balakrishnan, D.; Quan, C.; Tay, C. J.

    2013-06-01

    Phase unwrapping is the final and trickiest step in any phase retrieval technique. Phase unwrapping by artificial intelligence methods (optimization algorithms) such as the hybrid genetic algorithm, reverse simulated annealing, particle swarm optimization, and minimum cost matching has shown better results than conventional phase unwrapping methods. In this paper, an ensemble of hybrid genetic algorithms with parallel populations is proposed to solve the branch-cut phase unwrapping problem. In a single-population hybrid genetic algorithm, the selection, crossover and mutation operators are applied to obtain a new population in every generation, and the parameters and choice of operators affect the performance of the algorithm. The ensemble of hybrid genetic algorithms allows different parameter sets and different choices of operators to be used simultaneously: each population uses its own set of parameters, and the offspring of each population compete against the offspring of all other populations, which use different parameter sets. The effectiveness of the proposed algorithm is demonstrated by phase unwrapping examples, and the advantages of the proposed method are discussed.

  10. Hybrid time-frequency domain equalization based on sign-sign joint decision multimodulus algorithm for 6 × 6 mode division multiplexing system

    NASA Astrophysics Data System (ADS)

    Li, Jiao; Hu, Guijun; Gong, Caili; Li, Li

    2018-02-01

    In this paper, we propose a hybrid time-frequency domain sign-sign joint decision multimodulus algorithm (Hybrid-SJDMMA) for mode-demultiplexing in a 6 × 6 mode division multiplexing (MDM) system with high-order QAM modulation. The equalization performance of Hybrid-SJDMMA was evaluated and compared with the frequency domain multimodulus algorithm (FD-MMA) and the hybrid time-frequency domain sign-sign multimodulus algorithm (Hybrid-SMMA). Simulation results revealed that Hybrid-SJDMMA exhibits a significantly lower computational complexity than FD-MMA, and its convergence speed is similar to that of FD-MMA. Additionally, the bit-error-rate performance of Hybrid-SJDMMA was obviously better than FD-MMA and Hybrid-SMMA for 16 QAM and 64 QAM.

  11. Mapping Mountain Front Recharge Areas in Arid Watersheds Based on a Digital Elevation Model and Land Cover Types

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, Esther E.; Hamada, Yuki; O’Connor, Ben L.

    Here, a recent assessment that quantified potential impacts of solar energy development on water resources in the southwestern United States necessitated the development of a methodology to identify locations of mountain front recharge (MFR) in order to guide land development decisions. A spatially explicit, slope-based algorithm was created to delineate MFR zones in 17 arid, mountainous watersheds using elevation and land cover data. Slopes were calculated from elevation data and grouped into 100 classes using iterative self-organizing classification. Candidate MFR zones were identified based on slope classes that were consistent with MFR. Land cover types that were inconsistent with groundwater recharge were excluded from the candidate areas to determine the final MFR zones. No MFR reference maps exist for comparison with the study’s results, so the reliability of the resulting MFR zone maps was evaluated qualitatively using slope, surficial geology, soil, and land cover datasets. MFR zones ranged from 74 km2 to 1,547 km2 and accounted for 40% of the total watershed area studied. Slopes and surficial geologic materials that were present in the MFR zones were consistent with conditions at the mountain front, while soils and land cover that were present would generally promote groundwater recharge. Visual inspection of the MFR zone maps also confirmed the presence of well-recognized alluvial fan features in several study watersheds. While qualitative evaluation suggested that the algorithm reliably delineated MFR zones in most watersheds overall, the algorithm was better suited for application in watersheds that had characteristic Basin and Range topography and relatively flat basin floors than areas without these characteristics. Because the algorithm performed well to reliably delineate the spatial distribution of MFR, it would allow researchers to quantify aspects of the hydrologic processes associated with MFR and help local land resource managers to consider protection of critical groundwater recharge regions in their development decisions.

  12. Mapping Mountain Front Recharge Areas in Arid Watersheds Based on a Digital Elevation Model and Land Cover Types

    DOE PAGES

    Bowen, Esther E.; Hamada, Yuki; O’Connor, Ben L.

    2014-06-01

    Here, a recent assessment that quantified potential impacts of solar energy development on water resources in the southwestern United States necessitated the development of a methodology to identify locations of mountain front recharge (MFR) in order to guide land development decisions. A spatially explicit, slope-based algorithm was created to delineate MFR zones in 17 arid, mountainous watersheds using elevation and land cover data. Slopes were calculated from elevation data and grouped into 100 classes using iterative self-organizing classification. Candidate MFR zones were identified based on slope classes that were consistent with MFR. Land cover types that were inconsistent with groundwater recharge were excluded from the candidate areas to determine the final MFR zones. No MFR reference maps exist for comparison with the study’s results, so the reliability of the resulting MFR zone maps was evaluated qualitatively using slope, surficial geology, soil, and land cover datasets. MFR zones ranged from 74 km2 to 1,547 km2 and accounted for 40% of the total watershed area studied. Slopes and surficial geologic materials that were present in the MFR zones were consistent with conditions at the mountain front, while soils and land cover that were present would generally promote groundwater recharge. Visual inspection of the MFR zone maps also confirmed the presence of well-recognized alluvial fan features in several study watersheds. While qualitative evaluation suggested that the algorithm reliably delineated MFR zones in most watersheds overall, the algorithm was better suited for application in watersheds that had characteristic Basin and Range topography and relatively flat basin floors than areas without these characteristics. Because the algorithm performed well to reliably delineate the spatial distribution of MFR, it would allow researchers to quantify aspects of the hydrologic processes associated with MFR and help local land resource managers to consider protection of critical groundwater recharge regions in their development decisions.

  13. A new algorithm for grid-based hydrologic analysis by incorporating stormwater infrastructure

    NASA Astrophysics Data System (ADS)

    Choi, Yosoon; Yi, Huiuk; Park, Hyeong-Dong

    2011-08-01

    We developed a new algorithm, the Adaptive Stormwater Infrastructure (ASI) algorithm, to incorporate ancillary data sets related to stormwater infrastructure into the grid-based hydrologic analysis. The algorithm simultaneously considers the effects of the surface stormwater collector network (e.g., diversions, roadside ditches, and canals) and underground stormwater conveyance systems (e.g., waterway tunnels, collector pipes, and culverts). The surface drainage flows controlled by the surface runoff collector network are superimposed onto the flow directions derived from a DEM. After examining the connections between inlets and outfalls in the underground stormwater conveyance system, the flow accumulation and delineation of watersheds are calculated based on recursive computations. Application of the algorithm to the Sangdong tailings dam in Korea revealed superior performance to that of a conventional D8 single-flow algorithm in terms of providing reasonable hydrologic information on watersheds with stormwater infrastructure.
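
    For context, the sketch below shows the plain D8 single-flow-direction step that the ASI algorithm modifies: each cell drains to its steepest-descent neighbour. The stormwater-infrastructure overrides themselves are not reproduced, and the tiny DEM is invented.

    ```python
    # Sketch: the plain D8 single-flow-direction step that the ASI algorithm
    # above builds on. Each cell drains to the steepest-descent neighbour;
    # the stormwater-infrastructure overrides are not reproduced.
    import numpy as np

    dem = np.array([[9., 8., 7.],
                    [8., 6., 5.],
                    [7., 5., 3.]])          # tiny synthetic DEM

    # Eight neighbour offsets and their distances (diagonals are sqrt(2) away).
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    dist = [np.hypot(dr, dc) for dr, dc in offsets]

    def d8_direction(r, c):
        """Index (0-7) of the neighbour with the steepest downhill slope, or None."""
        best, best_slope = None, 0.0
        for k, (dr, dc) in enumerate(offsets):
            rr, cc = r + dr, c + dc
            if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                slope = (dem[r, c] - dem[rr, cc]) / dist[k]
                if slope > best_slope:
                    best, best_slope = k, slope
        return best

    k = d8_direction(1, 1)
    print("cell (1,1) drains toward offset", offsets[k])   # expected: (1, 1), the low corner
    ```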

  14. Spatial multiobjective optimization of agricultural conservation practices using a SWAT model and an evolutionary algorithm.

    PubMed

    Rabotyagov, Sergey; Campbell, Todd; Valcu, Adriana; Gassman, Philip; Jha, Manoj; Schilling, Keith; Wolter, Calvin; Kling, Catherine

    2012-12-09

    Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context (e.g.,(5,12,20)) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires a development of a simulation-optimization framework where the model becomes an integral part of optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed, simultaneously minimizing the cost of conservation practices. A recent and expanding set of research is attempting to use similar methods and integrates water quality models with broadly defined evolutionary optimization methods(3,4,9,10,13-15,17-19,22,23,25). In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model(7) with a multiobjective evolutionary algorithm SPEA2(26), and user-specified set of conservation practices and their costs to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by the watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for a selection of watershed configurations achieving specified water quality improvement goals and a production of maps of optimized placement of conservation practices.
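
    A small sketch of the end product described above, the cost/water-quality tradeoff frontier: given candidate allocations scored on two minimized objectives, the non-dominated set is extracted. The candidate values are invented; the SWAT model and the SPEA2 search are not reproduced.

    ```python
    # Sketch: extracting the non-dominated cost/pollution tradeoff frontier from
    # a set of candidate conservation-practice allocations, the kind of frontier
    # the SWAT + SPEA2 framework above produces. Candidate values are invented.
    import numpy as np

    # Each row: (annual cost, nutrient load) for one candidate allocation (both minimized).
    candidates = np.array([[100., 9.0], [120., 7.5], [150., 7.6],
                           [200., 5.0], [260., 5.1], [300., 3.2]])

    def pareto_front(points):
        keep = []
        for i, p in enumerate(points):
            dominated = any(np.all(q <= p) and np.any(q < p)
                            for j, q in enumerate(points) if j != i)
            if not dominated:
                keep.append(i)
        return points[keep]

    print(pareto_front(candidates))
    # Dominated candidates (here [150, 7.6] and [260, 5.1]) drop out of the frontier.
    ```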

  15. Guided particle swarm optimization method to solve general nonlinear optimization problems

    NASA Astrophysics Data System (ADS)

    Abdelhalim, Alyaa; Nakata, Kazuhide; El-Alem, Mahmoud; Eltawil, Amr

    2018-04-01

    The development of hybrid algorithms is becoming an important topic in the global optimization research area. This article proposes a new technique in hybridizing the particle swarm optimization (PSO) algorithm and the Nelder-Mead (NM) simplex search algorithm to solve general nonlinear unconstrained optimization problems. Unlike traditional hybrid methods, the proposed method hybridizes the NM algorithm inside the PSO to improve the velocities and positions of the particles iteratively. The new hybridization considers the PSO algorithm and NM algorithm as one heuristic, not in a sequential or hierarchical manner. The NM algorithm is applied to improve the initial random solution of the PSO algorithm and iteratively in every step to improve the overall performance of the method. The performance of the proposed method was tested over 20 optimization test functions with varying dimensions. Comprehensive comparisons with other methods in the literature indicate that the proposed solution method is promising and competitive.
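
    A rough sketch of the hybridization idea, assuming a textbook PSO update with SciPy's Nelder-Mead polishing the global best each iteration; this is an analogue of the approach on a toy objective, not the authors' exact scheme.

    ```python
    # Rough sketch of the PSO + Nelder-Mead hybridization idea above: a minimal
    # textbook PSO whose global best is refined by SciPy's Nelder-Mead each
    # iteration, applied to a toy objective.
    import numpy as np
    from scipy.optimize import minimize

    def sphere(x):                      # toy objective: global minimum 0 at the origin
        return float(np.sum(x ** 2))

    rng = np.random.default_rng(1)
    dim, n_particles = 5, 20
    x = rng.uniform(-5, 5, (n_particles, dim)); v = np.zeros_like(x)
    pbest = x.copy(); pbest_val = np.array([sphere(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()

    for _ in range(30):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)   # PSO update
        x = x + v
        vals = np.array([sphere(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        # Hybrid step: polish the current global best with Nelder-Mead.
        res = minimize(sphere, pbest[pbest_val.argmin()], method="Nelder-Mead")
        gbest = res.x if res.fun < pbest_val.min() else pbest[pbest_val.argmin()].copy()

    print("best value found:", sphere(gbest))
    ```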

  16. Evaluation of hybrid algorithm for analysis of scattered light using ex vivo nuclear morphology measurements of cervical epithelium

    PubMed Central

    Ho, Derek; Drake, Tyler K.; Bentley, Rex C.; Valea, Fidel A.; Wax, Adam

    2015-01-01

    We evaluate a new hybrid algorithm for determining nuclear morphology using angle-resolved low coherence interferometry (a/LCI) measurements in ex vivo cervical tissue. The algorithm combines Mie theory based and continuous wavelet transform inverse light scattering analysis. The hybrid algorithm was validated and compared to traditional Mie theory based analysis using an ex vivo tissue data set. The hybrid algorithm achieved 100% agreement with pathology in distinguishing dysplastic and non-dysplastic biopsy sites in the pilot study. Significantly, the new algorithm performed over four times faster than traditional Mie theory based analysis. PMID:26309741

  17. Robust Global Image Registration Based on a Hybrid Algorithm Combining Fourier and Spatial Domain Techniques

    DTIC Science & Technology

    2012-09-01

    Authors: Peter N. Crabtree, Collin Seanor, ... Reporting period: 00-00-2012 to 00-00-2012. Abstract fragment: the results demonstrate performance of a hybrid algorithm and are from analysis of a set of images of an ISO 12233 [12] resolution chart captured in the ...

  18. Hybrid flower pollination algorithm strategies for t-way test suite generation.

    PubMed

    Nasser, Abdullah B; Zamli, Kamal Z; Alsewari, AbdulRahman A; Ahmed, Bestoun S

    2018-01-01

    The application of meta-heuristic algorithms for t-way testing has recently become prevalent. Consequently, many useful meta-heuristic algorithms have been developed on the basis of the implementation of t-way strategies (where t indicates the interaction strength). Mixed results have been reported in the literature, highlighting the fact that no single strategy appears to be superior to the other configurations. The hybridization of two or more algorithms can enhance the overall search capabilities, that is, by compensating for the limitations of one algorithm with the strengths of others. Thus, hybrid variants of the flower pollination algorithm (FPA) are proposed in the current work. Four hybrid variants of FPA are considered by combining FPA with other algorithmic components. The experimental results demonstrate that the FPA hybrids overcome the problem of slow convergence in the original FPA and offer statistically superior performance compared with existing t-way strategies in terms of test suite size.

  19. Hybrid flower pollination algorithm strategies for t-way test suite generation

    PubMed Central

    Zamli, Kamal Z.; Alsewari, AbdulRahman A.

    2018-01-01

    The application of meta-heuristic algorithms for t-way testing has recently become prevalent. Consequently, many useful meta-heuristic algorithms have been developed on the basis of the implementation of t-way strategies (where t indicates the interaction strength). Mixed results have been reported in the literature, highlighting the fact that no single strategy appears to be superior to the other configurations. The hybridization of two or more algorithms can enhance the overall search capabilities, that is, by compensating for the limitations of one algorithm with the strengths of others. Thus, hybrid variants of the flower pollination algorithm (FPA) are proposed in the current work. Four hybrid variants of FPA are considered by combining FPA with other algorithmic components. The experimental results demonstrate that the FPA hybrids overcome the problem of slow convergence in the original FPA and offer statistically superior performance compared with existing t-way strategies in terms of test suite size. PMID:29718918

  20. Spatio-Temporal Process Variability in Watershed Scale Wetland Restoration Planning

    NASA Astrophysics Data System (ADS)

    Evenson, G. R.

    2012-12-01

    Watershed scale restoration decision making processes are increasingly informed by quantitative methodologies providing site-specific restoration recommendations - sometimes referred to as "systematic planning." The more advanced of these methodologies are characterized by a coupling of search algorithms and ecological models to discover restoration plans that optimize environmental outcomes. Yet while these methods have exhibited clear utility as decision support toolsets, they may be critiqued for flawed evaluations of spatio-temporally variable processes fundamental to watershed scale restoration. Hydrologic and non-hydrologic mediated process connectivity along with post-restoration habitat dynamics, for example, are commonly ignored yet known to appreciably affect restoration outcomes. This talk will present a methodology to evaluate such spatio-temporally complex processes in the production of watershed scale wetland restoration plans. Using the Tuscarawas Watershed in Eastern Ohio as a case study, a genetic algorithm will be coupled with the Soil and Water Assessment Tool (SWAT) to reveal optimal wetland restoration plans as measured by their capacity to maximize nutrient reductions. Then, a so-called "graphical" representation of the optimization problem will be implemented in-parallel to promote hydrologic and non-hydrologic mediated connectivity amongst existing wetlands and sites selected for restoration. Further, various search algorithm mechanisms will be discussed as a means of accounting for temporal complexities such as post-restoration habitat dynamics. Finally, generalized patterns of restoration plan optimality will be discussed as an alternative and possibly superior decision support toolset given the complexity and stochastic nature of spatio-temporal process variability.

  1. Hybrid Particle Swarm Optimization for Hybrid Flowshop Scheduling Problem with Maintenance Activities

    PubMed Central

    Li, Jun-qing; Pan, Quan-ke; Mao, Kun

    2014-01-01

    A hybrid algorithm which combines particle swarm optimization (PSO) and iterated local search (ILS) is proposed for solving the hybrid flowshop scheduling (HFS) problem with preventive maintenance (PM) activities. In the proposed algorithm, different crossover and mutation operators are investigated. In addition, an efficient multiple-insert mutation operator is developed to enhance the searching ability of the algorithm. Furthermore, an ILS-based local search procedure is embedded in the algorithm to improve its exploitation ability. The parameters of the canonical PSO are tuned in detailed experiments. The proposed algorithm is tested on variants of 77 benchmark problems of Carlier and Néron. Detailed comparisons with existing efficient algorithms, including hGA, ILS, PSO, and IG, verify the efficiency and effectiveness of the proposed algorithm. PMID:24883414

  2. Operation management of daily economic dispatch using novel hybrid particle swarm optimization and gravitational search algorithm with hybrid mutation strategy

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Huang, Song; Ji, Zhicheng

    2017-07-01

    This paper presents a hybrid particle swarm optimization and gravitational search algorithm based on a hybrid mutation strategy (HGSAPSO-M) to optimize economic dispatch (ED) including distributed generation (DG) under market-based energy pricing. A daily ED model was formulated and a hybrid mutation strategy was adopted in HGSAPSO-M. The hybrid mutation strategy includes two mutation operators: chaotic mutation and Gaussian mutation. The proposed algorithm was tested on the IEEE 33-bus system, and the results show that the approach is effective for this problem.
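
    The abstract names chaotic and Gaussian mutation but does not give their exact form. The sketch below shows one common way such a hybrid mutation strategy can be written (a logistic-map chaotic perturbation and a Gaussian perturbation, chosen at random per candidate); the parameter values and bounds are illustrative assumptions only.

      import numpy as np

      rng = np.random.default_rng(42)

      def chaotic_mutation(x, lb, ub, z0=0.7, scale=0.1):
          # Perturb x using a logistic chaotic map z <- 4 z (1 - z), one common form
          # of "chaotic mutation" in PSO/GSA hybrids.
          z, pert = z0, np.empty_like(x)
          for i in range(x.size):
              z = 4.0 * z * (1.0 - z)                  # logistic map stays in (0, 1)
              pert[i] = (2.0 * z - 1.0) * scale * (ub[i] - lb[i])
          return np.clip(x + pert, lb, ub)

      def gaussian_mutation(x, lb, ub, sigma_frac=0.05):
          # Perturb x with zero-mean Gaussian noise scaled to the search range.
          return np.clip(x + rng.normal(0.0, sigma_frac * (ub - lb)), lb, ub)

      def hybrid_mutation(x, lb, ub, p_chaotic=0.5):
          # The "hybrid mutation strategy": pick one of the two operators at random.
          if rng.random() < p_chaotic:
              return chaotic_mutation(x, lb, ub)
          return gaussian_mutation(x, lb, ub)

      # Example: mutate one candidate dispatch vector (bounds are purely illustrative).
      lb, ub = np.zeros(6), np.full(6, 10.0)
      x = rng.uniform(lb, ub)
      print(hybrid_mutation(x, lb, ub))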

  3. A novel hybrid algorithm for the design of the phase diffractive optical elements for beam shaping

    NASA Astrophysics Data System (ADS)

    Jiang, Wenbo; Wang, Jun; Dong, Xiucheng

    2013-02-01

    In this paper, a novel hybrid algorithm for the design of phase diffractive optical elements (PDOE) is proposed. It combines the genetic algorithm (GA) with the transformable scale BFGS (Broyden, Fletcher, Goldfarb, Shanno) algorithm, and a penalty function is used in the definition of the cost function. The novel hybrid algorithm has the global search merits of the genetic algorithm as well as the local improvement capabilities of the transformable scale BFGS algorithm. We designed the PDOE using both the conventional simulated annealing algorithm and the novel hybrid algorithm. To compare the performance of the two algorithms, three indexes are considered in numerical simulation: diffractive efficiency, uniformity error and signal-to-noise ratio. The results show that the novel hybrid algorithm has good convergence and good stability. As an application example, the PDOE was used for Gaussian beam shaping; high diffractive efficiency, low uniformity error and a high signal-to-noise ratio were obtained. The PDOE can be used for high-quality beam shaping in applications such as inertial confinement fusion (ICF), excimer laser lithography, fiber coupling of laser diode arrays and laser welding, demonstrating its wide applicability.
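
    As an illustration of the GA-plus-BFGS pattern described (not the authors' PDOE design code), the sketch below refines the best GA individual each generation with SciPy's BFGS routine; the objective is a stand-in test function and every GA setting is an assumption.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(1)

      def objective(x):
          # Stand-in multimodal cost (Rastrigin); the real PDOE cost function would go here.
          return 10 * len(x) + float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

      def ga_bfgs(dim=8, pop_size=30, gens=60, lb=-5.12, ub=5.12):
          pop = rng.uniform(lb, ub, (pop_size, dim))
          for _ in range(gens):
              fit = np.array([objective(ind) for ind in pop])
              parents = pop[np.argsort(fit)[: pop_size // 2]]        # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = parents[rng.integers(len(parents), size=2)]
                  alpha = rng.random(dim)
                  child = alpha * a + (1 - alpha) * b                # blend crossover
                  child += rng.normal(0, 0.1, dim) * (rng.random(dim) < 0.1)  # sparse mutation
                  children.append(np.clip(child, lb, ub))
              pop = np.vstack([parents, children])
              # Memetic step: polish the current best individual with quasi-Newton BFGS.
              res = minimize(objective, pop[0], method="BFGS")
              pop[0] = np.clip(res.x, lb, ub)
          fit = np.array([objective(ind) for ind in pop])
          return pop[fit.argmin()], fit.min()

      best, fbest = ga_bfgs()
      print("best cost found:", fbest)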

  4. NONPOINT SOURCE MODEL CALIBRATION IN HONEY CREEK WATERSHED

    EPA Science Inventory

    The U.S. EPA Non-Point Source Model has been applied and calibrated to a fairly large (187 sq. mi.) agricultural watershed in the Lake Erie Drainage basin of north central Ohio. Hydrologic and chemical routing algorithms have been developed. The model is evaluated for suitability...

  5. Solving SAT Problem Based on Hybrid Differential Evolution Algorithm

    NASA Astrophysics Data System (ADS)

    Liu, Kunqi; Zhang, Jingmin; Liu, Gang; Kang, Lishan

    The satisfiability (SAT) problem is NP-complete. Building on an analysis of the problem, SAT is transformed into an equivalent optimization problem of minimizing an objective function, and a hybrid differential evolution algorithm is proposed to solve it. The hybrid makes full use of the strong local search capacity of the hill-climbing algorithm and the strong global search capability of the differential evolution algorithm, compensating for their respective disadvantages, improving efficiency and avoiding stagnation. The experimental results show that the hybrid algorithm is efficient in solving SAT problems.
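
    A toy sketch of the DE/hill-climbing hybridization described is given below, applied to a tiny hand-made CNF instance. The encoding (continuous vectors thresholded to booleans), the Lamarckian local search and all parameters are assumptions, not the authors' scheme.

      import numpy as np

      rng = np.random.default_rng(7)

      # Toy CNF instance: each clause is a list of signed literals
      # (+v means variable v is true, -v means variable v is false; variables are 1-indexed).
      clauses = [[1, -2, 3], [-1, 2], [2, 3, -4], [-3, 4], [1, 4]]
      n_vars = 4

      def unsat_count(bits):
          # Objective to minimize: number of unsatisfied clauses (0 means satisfied).
          return sum(not any((bits[abs(l) - 1] == 1) == (l > 0) for l in c) for c in clauses)

      def decode(x):
          # Map a continuous DE vector to a boolean assignment.
          return (x > 0.5).astype(int)

      def hill_climb(bits):
          # Greedy bit-flip local search: the "hill-climbing" half of the hybrid.
          best = unsat_count(bits)
          improved = True
          while improved and best > 0:
              improved = False
              for i in range(n_vars):
                  bits[i] ^= 1
                  cost = unsat_count(bits)
                  if cost < best:
                      best, improved = cost, True
                  else:
                      bits[i] ^= 1          # undo the flip
          return bits, best

      def de_sat(pop_size=20, gens=100, F=0.5, CR=0.9):
          pop = rng.random((pop_size, n_vars))
          cost = np.array([unsat_count(decode(x)) for x in pop])
          for _ in range(gens):
              for i in range(pop_size):
                  a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
                  trial = np.where(rng.random(n_vars) < CR, a + F * (b - c), pop[i])
                  bits, tcost = hill_climb(decode(trial))     # Lamarckian local search
                  if tcost <= cost[i]:
                      pop[i], cost[i] = bits.astype(float), tcost
              if cost.min() == 0:
                  break
          return decode(pop[cost.argmin()]), int(cost.min())

      assignment, remaining = de_sat()
      print("assignment:", assignment, "| unsatisfied clauses:", remaining)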

  6. A novel algorithm for delineating wetland depressions and mapping surface hydrologic flow pathways using LiDAR data

    EPA Science Inventory

    In traditional watershed delineation and topographic modeling, surface depressions are generally treated as spurious features and simply removed from a digital elevation model (DEM) to enforce flow continuity of water across the topographic surface to the watershed outlets. In re...

  7. More efficient evolutionary strategies for model calibration with watershed model for demonstration

    NASA Astrophysics Data System (ADS)

    Baggett, J. S.; Skahill, B. E.

    2008-12-01

    Evolutionary strategies allow automatic calibration of more complex models than traditional gradient-based approaches, but they are more computationally intensive. We present several efficiency enhancements for evolution strategies, many of which are not new, but when combined have been shown to dramatically decrease the number of model runs required for calibration of synthetic problems. To reduce the number of expensive model runs we employ a surrogate objective function for an adaptively determined fraction of the population at each generation (Kern et al., 2006). We demonstrate improvements to the adaptive ranking strategy that increase its efficiency while sacrificing little reliability and further reduce the number of model runs required in densely sampled parts of parameter space. Furthermore, we include a gradient individual in each generation that is usually not selected when the search is in a global phase or when the derivatives are poorly approximated, but when selected near a smooth local minimum can dramatically increase convergence speed (Tahk et al., 2007). Finally, the selection of the gradient individual is used to adapt the size of the population near local minima. We show, by incorporating these enhancements into the Covariance Matrix Adaptation Evolution Strategy (CMAES; Hansen, 2006), that their synergistic effect is greater than the sum of their individual parts. This hybrid evolutionary strategy exploits smooth structure when it is present but degrades to an ordinary evolutionary strategy, at worst, if smoothness is not present. Calibration of 2D-3D synthetic models with the modified CMAES requires approximately 10%-25% of the model runs of ordinary CMAES. Preliminary demonstration of this hybrid strategy will be shown for watershed model calibration problems. Hansen, N. (2006). The CMA Evolution Strategy: A Comparing Review. In J.A. Lozano, P. Larrañaga, I. Inza and E. Bengoetxea (Eds.). Towards a new evolutionary computation. Advances in estimation of distribution algorithms. pp. 75-102, Springer. Kern, S., N. Hansen and P. Koumoutsakos (2006). Local Meta-Models for Optimization Using Evolution Strategies. In Ninth International Conference on Parallel Problem Solving from Nature PPSN IX, Proceedings, pp. 939-948, Berlin: Springer. Tahk, M., Woo, H., and Park, M. (2007). A hybrid optimization of evolutionary and gradient search. Engineering Optimization, (39), 87-104.

  8. Enhanced nonlinearity interval mapping scheme for high-performance simulation-optimization of watershed-scale BMP placement

    NASA Astrophysics Data System (ADS)

    Zou, Rui; Riverson, John; Liu, Yong; Murphy, Ryan; Sim, Youn

    2015-03-01

    Integrated continuous simulation-optimization models can be effective predictors of process-based responses for cost-benefit optimization of best management practice (BMP) selection and placement. However, practical application of simulation-optimization models is computationally prohibitive for large-scale systems. This study proposes an enhanced Nonlinearity Interval Mapping Scheme (NIMS) to solve large-scale watershed simulation-optimization problems several orders of magnitude faster than other commonly used algorithms. An efficient interval response coefficient (IRC) derivation method was incorporated into the NIMS framework to overcome a computational bottleneck. The proposed algorithm was evaluated using a case study watershed in the Los Angeles County Flood Control District. Using a continuous simulation watershed/stream-transport model, Loading Simulation Program in C++ (LSPC), three nested in-stream compliance points (CPs), each with multiple Total Maximum Daily Load (TMDL) targets, were selected to derive optimal treatment levels for each of the 28 subwatersheds, so that the TMDL targets at all CPs were met at the lowest possible BMP implementation cost. A Genetic Algorithm (GA) and NIMS were both applied and compared. The results showed that NIMS took 11 iterations (about 11 min) to complete, with the resulting optimal solution having a total cost of 67.2 million, while each of the multiple GA executions took 21-38 days to reach near-optimal solutions. The best solution obtained among all the GA executions had a minimized cost of 67.7 million, marginally higher than, but approximately equal to, that of the NIMS solution. The results highlight the utility of the approach for decision making in large-scale watershed simulation-optimization formulations.

  9. Time-optimal trajectory planning for underactuated spacecraft using a hybrid particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Zhuang, Yufei; Huang, Haibin

    2014-02-01

    A hybrid algorithm combining the particle swarm optimization (PSO) algorithm with the Legendre pseudospectral method (LPM) is proposed for solving the time-optimal trajectory planning problem of underactuated spacecraft. In the initial phase of the search, an initialization generator is constructed with the PSO algorithm because of its strong global search ability and robustness to random initial values; however, the convergence of PSO around the global optimum is slow. Therefore, when the change in the fitness function becomes smaller than a predefined value, the search is switched to the LPM to accelerate the process. With the solutions obtained by the PSO algorithm as a set of proper initial guesses, the hybrid algorithm can find a global optimum more quickly and accurately. Results from 200 Monte Carlo simulations demonstrate that the proposed hybrid PSO-LPM algorithm has greater advantages in terms of global search capability and convergence rate than either the PSO algorithm or the LPM alone. Moreover, the PSO-LPM algorithm is also robust to random initial values.

  10. Development and Translation of Hybrid Optoacoustic/Ultrasonic Tomography for Early Breast Cancer Detection

    DTIC Science & Technology

    2014-09-01

    The goal of this research is to develop an optimized system design and associated image reconstruction algorithms for a hybrid three-dimensional (3D) breast imaging system. Reported progress includes (i) developed time-of-flight extraction algorithms to perform USCT, (ii) developing image reconstruction algorithms for USCT, (iii) developed ...

  11. Modeling urbanized watershed flood response changes with distributed hydrological model: key hydrological processes, parameterization and case studies

    NASA Astrophysics Data System (ADS)

    Chen, Y.

    2017-12-01

    Urbanization has been the global development trend over the past century, and developing countries have experienced much more rapid urbanization in recent decades. Urbanization brings many benefits to human beings but also causes negative impacts, such as increased flood risk. The impact of urbanization on flood response has long been observed, but quantifying this effect still faces great challenges. For example, setting up an appropriate hydrological model representing the changed flood responses and determining accurate model parameters are very difficult in urbanized or urbanizing watersheds. The Pearl River Delta area has seen some of the most rapid urbanization in China over the past decades, and dozens of highly urbanized watersheds have appeared there. In this study, a physically based distributed watershed hydrological model, the Liuxihe model, is employed and revised to simulate the hydrological processes of highly urbanized watershed floods in the Pearl River Delta area. A virtual soil type is defined in the terrain properties dataset, and its runoff production and routing algorithms are added to the Liuxihe model. Based on a parameter sensitivity analysis, the key hydrological processes of a highly urbanized watershed are identified, which provides insight into the hydrological processes and supports parameter optimization. Based on the above analysis, the model is set up in the Songmushan watershed, where observed hydrological data are available. A model parameter optimization and updating strategy is proposed based on remotely sensed land use/cover (LUC) types, which optimizes model parameters with the PSO algorithm and updates them when the LUC types change. The model parameters in the Songmushan watershed are regionalized to other Pearl River Delta watersheds based on their LUC types. A dozen watersheds in the highly urbanized area of Dongguan City in the Pearl River Delta were studied for flood response changes due to urbanization, and the results show that urbanization has a large impact on watershed flood responses. Peak flows increased severalfold after urbanization, a much larger change than previously reported.

  12. A Bayesian Uncertainty Framework for Conceptual Snowmelt and Hydrologic Models Applied to the Tenderfoot Creek Experimental Forest

    NASA Astrophysics Data System (ADS)

    Smith, T.; Marshall, L.

    2007-12-01

    In many mountainous regions, the single most important parameter in forecasting the controls on regional water resources is snowpack (Williams et al., 1999). In an effort to bridge the gap between theoretical understanding and functional modeling of snow-driven watersheds, a flexible hydrologic modeling framework is being developed. The aim is to create a suite of models that move from parsimonious structures, concentrated on aggregated watershed response, to those focused on representing finer scale processes and distributed response. This framework will operate as a tool to investigate the link between hydrologic model predictive performance, uncertainty, model complexity, and observable hydrologic processes. Bayesian methods, and particularly Markov chain Monte Carlo (MCMC) techniques, are extremely useful in uncertainty assessment and parameter estimation of hydrologic models. However, these methods have some difficulties in implementation. In a traditional Bayesian setting, it can be difficult to reconcile multiple data types, particularly those offering different spatial and temporal coverage, depending on the model type. These difficulties are exacerbated by the sensitivity of MCMC algorithms to model initialization and by complex parameter interdependencies. As a way of circumventing some of these computational complications, adaptive MCMC algorithms have been developed to take advantage of the information gained from each successive iteration. Two adaptive algorithms are compared in this study: the Adaptive Metropolis (AM) algorithm, developed by Haario et al. (2001), and the Delayed Rejection Adaptive Metropolis (DRAM) algorithm, developed by Haario et al. (2006). While neither algorithm is truly Markovian, each has been proven to satisfy the desired ergodicity and stationarity properties of Markov chains. Both algorithms were implemented as the uncertainty and parameter estimation framework for a conceptual rainfall-runoff model based on the Probability Distributed Model (PDM), developed by Moore (1985). We implement the modeling framework in the Stringer Creek watershed in the Tenderfoot Creek Experimental Forest (TCEF), Montana. The snowmelt-driven watershed offers the additional challenge of modeling snow accumulation and melt, and current efforts are aimed at developing a temperature- and radiation-index snowmelt model. Auxiliary data available from within the TCEF watersheds are used to support understanding of the value of information as it relates to predictive performance. Because the model is based on lumped parameters, auxiliary data are hard to incorporate directly. However, these additional data offer benefits through their ability to inform prior distributions of the lumped model parameters. By incorporating data offering different information into the uncertainty assessment process, a cross-validation technique is employed to better ensure that modeled results reflect real process complexity.
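
    The abstract compares the AM and DRAM samplers; as a hedged illustration of the simpler of the two, the sketch below implements the core Adaptive Metropolis step (Gaussian proposal covariance re-estimated from the chain history) on a toy two-dimensional Gaussian target rather than the PDM-based hydrologic model used in the study.

      import numpy as np

      def log_target(theta):
          # Toy log-posterior: a correlated 2-D Gaussian standing in for a model posterior.
          cov = np.array([[1.0, 0.8], [0.8, 1.0]])
          return -0.5 * theta @ np.linalg.solve(cov, theta)

      def adaptive_metropolis(log_p, theta0, n_iter=5000, adapt_start=200, eps=1e-6, seed=0):
          # Core of the Adaptive Metropolis idea (Haario et al., 2001): the proposal
          # covariance is periodically re-estimated from the chain history.
          rng = np.random.default_rng(seed)
          theta = np.asarray(theta0, dtype=float)
          d = theta.size
          s_d = 2.4 ** 2 / d                       # standard AM scaling factor
          lp = log_p(theta)
          cov = 0.1 * np.eye(d)                    # initial proposal covariance
          chain = np.empty((n_iter, d))
          for t in range(n_iter):
              prop = rng.multivariate_normal(theta, cov)
              lp_prop = log_p(prop)
              if np.log(rng.random()) < lp_prop - lp:     # Metropolis accept/reject
                  theta, lp = prop, lp_prop
              chain[t] = theta
              if t >= adapt_start:                 # adapt from the whole history so far
                  cov = s_d * (np.cov(chain[: t + 1].T) + eps * np.eye(d))
          return chain

      chain = adaptive_metropolis(log_target, [3.0, -3.0])
      print("posterior mean estimate:", chain[1000:].mean(axis=0))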

  13. A Theoretical Analysis of Why Hybrid Ensembles Work.

    PubMed

    Hsu, Kuo-Wei

    2017-01-01

    Inspired by the group decision making process, ensembles or combinations of classifiers have been found favorable in a wide variety of application domains. Some researchers propose using a mixture of two different types of classification algorithms to create a hybrid ensemble. Why does such an ensemble work? The question remains open. Following the concept of diversity, which is one of the fundamental elements of the success of ensembles, we conduct a theoretical analysis of why hybrid ensembles work, connecting the use of different algorithms to accuracy gain. We also conduct experiments on the classification performance of hybrid ensembles of classifiers created by the decision tree and naïve Bayes classification algorithms, each of which is a top data mining algorithm often used to create non-hybrid ensembles. Through this paper, we thereby provide a complement to the theoretical foundation of creating and using hybrid ensembles.
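
    As a minimal, hedged illustration of a hybrid ensemble that mixes decision trees and naive Bayes (the combination analysed in the paper), the sketch below compares single-algorithm bagged ensembles with a two-algorithm soft-voting ensemble in scikit-learn on a standard toy dataset; it is not the authors' experimental setup.

      from sklearn.datasets import load_breast_cancer
      from sklearn.ensemble import BaggingClassifier, VotingClassifier
      from sklearn.model_selection import cross_val_score
      from sklearn.naive_bayes import GaussianNB
      from sklearn.tree import DecisionTreeClassifier

      X, y = load_breast_cancer(return_X_y=True)

      # Non-hybrid baselines: ensembles built from a single algorithm each.
      tree_bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=20, random_state=0)
      nb_bag = BaggingClassifier(GaussianNB(), n_estimators=20, random_state=0)

      # Hybrid ensemble: a mixture of the two classifier types combined by soft voting.
      hybrid = VotingClassifier(
          estimators=[("tree", DecisionTreeClassifier(random_state=0)), ("nb", GaussianNB())],
          voting="soft",
      )

      for name, model in [("tree bagging", tree_bag), ("NB bagging", nb_bag), ("hybrid", hybrid)]:
          scores = cross_val_score(model, X, y, cv=5)
          print(f"{name}: mean accuracy = {scores.mean():.3f}")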

  14. A hybrid intelligent algorithm for portfolio selection problem with fuzzy returns

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Zhang, Yang; Wong, Hau-San; Qin, Zhongfeng

    2009-11-01

    Portfolio selection theory with fuzzy returns has been well developed and widely applied. Within the framework of credibility theory, several fuzzy portfolio selection models have been proposed, such as the mean-variance model, entropy optimization model, and chance-constrained programming model. In order to solve these nonlinear optimization models, a hybrid intelligent algorithm is designed by integrating a simulated annealing algorithm, a neural network and fuzzy simulation techniques, where the neural network is used to approximate the expected value and variance of fuzzy returns and fuzzy simulation is used to generate the training data for the neural network. Since these models are usually solved by genetic algorithms, comparisons between the hybrid intelligent algorithm and a genetic algorithm are given in terms of numerical examples, which show that the hybrid intelligent algorithm is robust and more effective. In particular, it reduces the running time significantly for large problems.

  15. A hybrid artificial bee colony algorithm and pattern search method for inversion of particle size distribution from spectral extinction data

    NASA Astrophysics Data System (ADS)

    Wang, Li; Li, Feng; Xing, Jian

    2017-10-01

    In this paper, a hybrid artificial bee colony (ABC) algorithm and pattern search (PS) method is proposed and applied to the recovery of particle size distributions (PSD) from spectral extinction data. To make the approach more useful and practical, the size distribution function is modelled as the general Johnson's ? function, which overcomes the difficulty of not knowing the exact distribution type beforehand, encountered in many real circumstances. The proposed hybrid algorithm is evaluated through simulated examples involving unimodal, bimodal and trimodal PSDs with different widths and mean particle diameters. For comparison, all examples are also validated with the single ABC algorithm. In addition, the performance of the proposed algorithm is further tested against actual extinction measurements of standard polystyrene samples immersed in water. Simulation and experimental results illustrate that the hybrid algorithm can be used as an effective technique to retrieve PSDs with high reliability and accuracy. Compared with the single ABC algorithm, the proposed algorithm produces more accurate and robust inversion results while taking almost the same CPU time as the ABC algorithm alone. The superiority of the ABC and PS hybridization strategy in reaching a better balance between estimation accuracy and computational effort increases its potential as an inversion technique for reliable and efficient measurement of PSDs.

  16. A Theoretical Analysis of Why Hybrid Ensembles Work

    PubMed Central

    2017-01-01

    Inspired by the group decision making process, ensembles or combinations of classifiers have been found favorable in a wide variety of application domains. Some researchers propose using a mixture of two different types of classification algorithms to create a hybrid ensemble. Why does such an ensemble work? The question remains open. Following the concept of diversity, which is one of the fundamental elements of the success of ensembles, we conduct a theoretical analysis of why hybrid ensembles work, connecting the use of different algorithms to accuracy gain. We also conduct experiments on the classification performance of hybrid ensembles of classifiers created by the decision tree and naïve Bayes classification algorithms, each of which is a top data mining algorithm often used to create non-hybrid ensembles. Through this paper, we thereby provide a complement to the theoretical foundation of creating and using hybrid ensembles. PMID:28255296

  17. The Rational Hybrid Monte Carlo algorithm

    NASA Astrophysics Data System (ADS)

    Clark, Michael

    2006-12-01

    The past few years have seen considerable progress in algorithmic development for the generation of gauge fields including the effects of dynamical fermions. The Rational Hybrid Monte Carlo (RHMC) algorithm, in which Hybrid Monte Carlo is performed using a rational approximation in place of the usual inverse quark matrix kernel, is one of these developments. This algorithm has been found to be extremely beneficial in many areas of lattice QCD (chiral fermions, finite temperature, Wilson fermions, etc.). We review the algorithm and some of these benefits, and we compare it against other recent algorithmic developments. We conclude with an update of the Berlin Wall plot comparing the costs of all popular fermion formulations.

  18. Interactive genetic algorithm for user-centered design of distributed conservation practices in a watershed: An examination of user preferences in objective space and user behavior

    NASA Astrophysics Data System (ADS)

    Piemonti, Adriana Debora; Babbar-Sebens, Meghna; Mukhopadhyay, Snehasis; Kleinberg, Austin

    2017-05-01

    Interactive Genetic Algorithms (IGA) are advanced human-in-the-loop optimization methods that enable humans to give feedback, based on their subjective and unquantified preferences and knowledge, during the algorithm's search process. While these methods are gaining popularity in multiple fields, there is a critical lack of data and analyses on (a) the nature of interactions of different humans with interfaces of decision support systems (DSS) that employ IGA in water resources planning problems and on (b) the effect of human feedback on the algorithm's ability to search for design alternatives desirable to end-users. In this paper, we present results and analyses of observational experiments in which different human participants (surrogates and stakeholders) interacted with an IGA-based, watershed DSS called WRESTORE to identify plans of conservation practices in a watershed. The main goal of this paper is to evaluate how the IGA adapts its search process in the objective space to a user's feedback, and identify whether any similarities exist in the objective space of plans found by different participants. Some participants focused on the entire watershed, while others focused only on specific local subbasins. Additionally, two different hydrology models were used to identify any potential differences in interactive search outcomes that could arise from differences in the numerical values of benefits displayed to participants. Results indicate that stakeholders, in comparison to their surrogates, were more likely to use multiple features of the DSS interface to collect information before giving feedback, and dissimilarities existed among participants in the objective space of design alternatives.

  19. An improved approach for the segmentation of starch granules in microscopic images

    PubMed Central

    2010-01-01

    Background: Starches are the main storage polysaccharides in plants and are distributed widely throughout plants, including in seeds, roots, tubers, leaves and stems. Currently, microscopic observation is one of the most important ways to investigate and analyze the structure of starches. The position, shape and size of the starch granules are the main measurements for quantitative analysis. In order to obtain these measurements, segmentation of starch granules from the background is very important. However, automatic segmentation of starch granules is still a challenging task because of the limitations of imaging conditions and the complex scenarios of overlapping granules. Results: We propose a novel method to segment starch granules in microscopic images. In the proposed method, we first separate starch granules from the background using automatic thresholding and then roughly segment the image using the watershed algorithm. In order to reduce oversegmentation by the watershed algorithm, we use the roundness of each segment and analyze the gradient vector field to find critical points and thereby identify oversegments. After oversegments are found, we extract their features, such as position and intensity, and use fuzzy c-means clustering to merge the oversegments into objects with similar features. Experimental results demonstrate that the proposed method successfully alleviates the oversegmentation of the watershed segmentation algorithm. Conclusions: We present a new scheme for starch granule segmentation. The proposed scheme aims to alleviate oversegmentation in the watershed algorithm. We use shape information and critical points of the gradient vector flow (GVF) of starch granules to identify oversegments, and use fuzzy c-means clustering based on prior knowledge to merge these oversegments into objects. Experimental results on twenty microscopic starch images demonstrate the effectiveness of the proposed scheme. PMID:21047380
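
    The pipeline in the abstract (thresholding, watershed, oversegment detection, fuzzy c-means merging) can only be partially sketched here. Assuming scikit-image, the snippet below shows the first steps and uses a simple roundness score to flag likely oversegments, in place of the paper's GVF critical-point analysis and fuzzy c-means merging; the sample image and the 0.6 thresholds are stand-ins.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage import data, filters, measure, segmentation

      # Stand-in image: skimage's coins sample plays the role of a starch-granule micrograph.
      image = data.coins()

      # 1) Separate foreground granules from background by automatic (Otsu) thresholding.
      binary = image > filters.threshold_otsu(image)

      # 2) Rough segmentation with a distance-transform watershed.
      distance = ndi.distance_transform_edt(binary)
      markers, _ = ndi.label(distance > 0.6 * distance.max())
      labels = segmentation.watershed(-distance, markers, mask=binary)

      # 3) Flag possible oversegments using roundness = 4*pi*area / perimeter^2;
      #    fragments of a split granule tend to be far from round.
      suspect = []
      for region in measure.regionprops(labels):
          if region.perimeter == 0:
              continue
          roundness = 4 * np.pi * region.area / region.perimeter ** 2
          if roundness < 0.6:                       # illustrative threshold
              suspect.append(region.label)

      print(f"{labels.max()} segments, {len(suspect)} flagged as possible oversegments")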

  20. AMLSA Algorithm for Hybrid Precoding in Millimeter Wave MIMO Systems

    NASA Astrophysics Data System (ADS)

    Liu, Fulai; Sun, Zhenxing; Du, Ruiyan; Bai, Xiaoyu

    2017-10-01

    In this paper, an effective algorithm is proposed for hybrid precoding in mmWave MIMO systems, referred to as the alternating minimization algorithm with least squares amendment (AMLSA algorithm). Specifically, for the fully-connected structure, the presented algorithm minimizes the classical objective function to obtain the hybrid precoding matrix. It introduces an orthogonality constraint on the digital precoding matrix, which is subsequently amended by least squares after the alternating minimization iterations. Simulation results confirm that the achievable spectral efficiency of the proposed algorithm is somewhat better than that of the existing algorithm without the least squares amendment. Furthermore, the number of iterations is reduced slightly by improving the initialization procedure.

  1. A hybrid algorithm for speckle noise reduction of ultrasound images.

    PubMed

    Singh, Karamjeet; Ranade, Sukhjeet Kaur; Singh, Chandan

    2017-09-01

    Medical images are contaminated by multiplicative speckle noise, which significantly reduces the contrast of ultrasound images and negatively affects various image interpretation tasks. In this paper, we propose a hybrid denoising approach that combines local and nonlocal information in an efficient manner. The proposed hybrid algorithm consists of three stages. In the first stage, local statistics in the form of a guided filter are used to reduce the effect of speckle noise. Then, an improved speckle reducing bilateral filter (SRBF) is developed to further reduce the speckle noise in the medical images. Finally, to reconstruct the diffused edges, we use an efficient post-processing technique that jointly considers the advantages of both the bilateral and nonlocal means (NLM) filters for attenuating speckle noise. The performance of the proposed hybrid algorithm is evaluated on synthetic, simulated and real ultrasound images. Experiments conducted on various test images demonstrate that our hybrid approach outperforms traditional speckle reduction approaches, including the recently proposed NLM and optimized Bayesian-based NLM. Quantitative and qualitative measures, together with visual inspection of denoised synthetic and real ultrasound images, show that the proposed hybrid algorithm has strong denoising capability and preserves fine image details, such as the edge of a lesion, better than previously developed methods for speckle noise reduction. The denoising and edge-preserving capability of the hybrid algorithm exceeds that of existing traditional and recently proposed speckle reduction (SR) filters. The proposed algorithm lays a foundation for developing hybrid algorithms for denoising ultrasound images. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. HRSSA - Efficient hybrid stochastic simulation for spatially homogeneous biochemical reaction networks

    NASA Astrophysics Data System (ADS)

    Marchetti, Luca; Priami, Corrado; Thanh, Vo Hong

    2016-07-01

    This paper introduces HRSSA (Hybrid Rejection-based Stochastic Simulation Algorithm), a new efficient hybrid stochastic simulation algorithm for spatially homogeneous biochemical reaction networks. HRSSA is built on top of RSSA, an exact stochastic simulation algorithm which relies on propensity bounds to select next reaction firings and to reduce the average number of reaction propensity updates needed during the simulation. HRSSA exploits the computational advantage of propensity bounds to manage time-varying transition propensities and to apply dynamic partitioning of reactions, which constitute the two most significant bottlenecks of hybrid simulation. A comprehensive set of simulation benchmarks is provided for evaluating performance and accuracy of HRSSA against other state of the art algorithms.

  3. File text security using Hybrid Cryptosystem with Playfair Cipher Algorithm and Knapsack Naccache-Stern Algorithm

    NASA Astrophysics Data System (ADS)

    Amalia; Budiman, M. A.; Sitepu, R.

    2018-03-01

    Cryptography is one of the best methods to keep information safe from security attacks by unauthorized people. Many studies have been carried out by previous researchers to produce more robust cryptographic algorithms that provide high security for data communication. One method to strengthen data security is the hybrid cryptosystem, which combines symmetric and asymmetric algorithms. In this study, we examine a hybrid cryptosystem that uses a modified 16x16 Playfair Cipher as the symmetric algorithm and the Knapsack Naccache-Stern cryptosystem as the asymmetric algorithm. We measured the running time of this hybrid scheme in a series of experiments with messages of 10, 100, 1000, 10000 and 100000 characters and with key lengths of 10, 20, 30 and 40. The results show that the processing time for encryption and decryption in each algorithm grows linearly with message length: the longer the message, the more time is needed to encrypt and decrypt it. The encryption of the Knapsack Naccache-Stern algorithm takes longer than its decryption, while the encryption of the modified 16x16 Playfair Cipher takes less time than its decryption.

  4. Five-minute, 1/2°, and 1° data sets of continental watersheds and river networks for use in regional and global hydrologic and climate system modeling studies

    NASA Astrophysics Data System (ADS)

    Graham, S. T.; Famiglietti, J. S.; Maidment, D. R.

    1999-02-01

    A major shortcoming of the land surface component in climate models is the absence of a river transport algorithm. This issue becomes particularly important in fully coupled climate system models (CSMs), where river transport is required to close and realistically represent the global water cycle. The development of a river transport algorithm requires knowledge of watersheds and river networks at a scale that is appropriate for use in CSMs. These data must be derived largely from global digital topographic information. The purpose of this paper is to describe a new data set of watersheds and river networks, which is derived primarily from the TerrainBase 5' Global DTM (digital terrain model) and the CIA World Data Bank II. These data serve as a base map for routing continental runoff to the appropriate coast and therefore into the appropriate ocean or inland sea. Using this data set, the runoff produced in any grid cell, when coupled with a routing algorithm, can easily be transported to the appropriate water body and distributed across that water body as desired. The data set includes watershed and flow direction information, as well as supporting hydrologic data at 5', 1/2°, and 1° resolutions globally. It will be useful in fully coupled land-ocean-atmosphere models, in terrestrial ecosystem models, or in stand-alone macroscale hydrologic-modeling studies.

  5. Smart markers for watershed-based cell segmentation.

    PubMed

    Koyuncu, Can Fahrettin; Arslan, Salim; Durmaz, Irem; Cetin-Atalay, Rengul; Gunduz-Demir, Cigdem

    2012-01-01

    Automated cell imaging systems facilitate fast and reliable analysis of biological events at the cellular level. In these systems, the first step is usually cell segmentation that greatly affects the success of the subsequent system steps. On the other hand, similar to other image segmentation problems, cell segmentation is an ill-posed problem that typically necessitates the use of domain-specific knowledge to obtain successful segmentations even by human subjects. The approaches that can incorporate this knowledge into their segmentation algorithms have potential to greatly improve segmentation results. In this work, we propose a new approach for the effective segmentation of live cells from phase contrast microscopy. This approach introduces a new set of "smart markers" for a marker-controlled watershed algorithm, for which the identification of its markers is critical. The proposed approach relies on using domain-specific knowledge, in the form of visual characteristics of the cells, to define the markers. We evaluate our approach on a total of 1,954 cells. The experimental results demonstrate that this approach, which uses the proposed definition of smart markers, is quite effective in identifying better markers compared to its counterparts. This will, in turn, be effective in improving the segmentation performance of a marker-controlled watershed algorithm.
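
    A hedged sketch of a marker-controlled watershed pipeline in OpenCV is shown below. The markers here are simply strong peaks of the distance transform on a synthetic blob image; the paper's contribution is to replace exactly that step with domain-specific "smart markers", which is not reproduced.

      import cv2
      import numpy as np

      # Synthetic stand-in for a phase-contrast micrograph: a few bright, touching disks.
      gray = np.zeros((200, 200), np.uint8)
      for cx, cy in [(70, 70), (110, 85), (140, 150)]:
          cv2.circle(gray, (cx, cy), 30, 200, -1)
      gray = cv2.GaussianBlur(gray, (7, 7), 0)

      # 1) Rough foreground/background split.
      _, fg_mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

      # 2) Markers: strong interior peaks of the distance transform (the step that the
      #    paper replaces with domain-specific visual cues).
      dist = cv2.distanceTransform(fg_mask, cv2.DIST_L2, 5)
      _, sure_fg = cv2.threshold(dist, 0.5 * dist.max(), 255, cv2.THRESH_BINARY)
      sure_fg = sure_fg.astype(np.uint8)
      sure_bg = cv2.dilate(fg_mask, np.ones((5, 5), np.uint8), iterations=3)
      unknown = cv2.subtract(sure_bg, sure_fg)

      n_markers, markers = cv2.connectedComponents(sure_fg)
      markers = markers + 1            # shift so the background marker is 1
      markers[unknown == 255] = 0      # unknown pixels are left for the watershed to label

      # 3) Marker-controlled watershed (OpenCV expects a 3-channel image).
      color = cv2.cvtColor(gray, cv2.COLOR_GRAY2BGR)
      markers = cv2.watershed(color, markers)
      print("objects found:", n_markers - 1, "| boundary pixels:", int((markers == -1).sum()))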

  6. Structure and weights optimisation of a modified Elman network emotion classifier using hybrid computational intelligence algorithms: a comparative study

    NASA Astrophysics Data System (ADS)

    Sheikhan, Mansour; Abbasnezhad Arabi, Mahdi; Gharavian, Davood

    2015-10-01

    Artificial neural networks are efficient models in pattern recognition applications, but their performance is dependent on employing suitable structure and connection weights. This study used a hybrid method for obtaining the optimal weight set and architecture of a recurrent neural emotion classifier based on gravitational search algorithm (GSA) and its binary version (BGSA), respectively. By considering the features of speech signal that were related to prosody, voice quality, and spectrum, a rich feature set was constructed. To select more efficient features, a fast feature selection method was employed. The performance of the proposed hybrid GSA-BGSA method was compared with similar hybrid methods based on particle swarm optimisation (PSO) algorithm and its binary version, PSO and discrete firefly algorithm, and hybrid of error back-propagation and genetic algorithm that were used for optimisation. Experimental tests on Berlin emotional database demonstrated the superior performance of the proposed method using a lighter network structure.

  7. Development of sub-daily erosion and sediment transport algorithms in SWAT

    USDA-ARS?s Scientific Manuscript database

    New Soil and Water Assessment Tool (SWAT) algorithms for simulation of stormwater best management practices (BMPs) such as detention basins, wet ponds, sedimentation filtration ponds, and retention irrigation systems are under development for modeling small/urban watersheds. Modeling stormwater BMPs...

  8. Applications of Hybrid Algorithm (Successive Over Relaxation and Inverse Distance Weighting) for Interpolating Rainfall Data Obtained from a Dense Network of Meteorological Stations in Metro Manila, Philippines

    NASA Astrophysics Data System (ADS)

    Yao, J. G.; Lagrosas, N.; Ampil, L. J. Y.; Lorenzo, G. R. H.; Simpas, J.

    2016-12-01

    A hybrid piecewise rainfall interpolation algorithm was formulated using the commonly known Inverse Distance Weighting (IDW) method and the Gauss-Seidel variant Successive Over-Relaxation (SOR) to interpolate rainfall values over Metro Manila, Philippines. Because SOR requires boundary values for its iteration to work, the IDW method is used to estimate rainfall values at the boundary. SOR iterations are then carried out within the defined boundaries to obtain the results corresponding to the lowest RMSE value. The hybrid method was applied to rainfall datasets obtained from a dense network of 30 stations in Metro Manila, which has been collecting meteorological data every 5 minutes since 2012. Using the Davis Vantage Pro 2 Plus weather monitoring system, each station sends data to a central server which can be accessed through the website metroweather.com.ph. The stations are spread over approximately 625 sq km, such that each station covers roughly 25 sq km. The locations of the stations, determined by the Metro Manila Development Authority (MMDA), are in critical sections of Metro Manila such as watersheds and flood-prone areas. Three cases were investigated in this study, one for each type of rainfall present in Metro Manila: monsoon-induced (8/20/13), typhoon (6/29/13), and thunderstorm (7/3/15 & 7/4/15). The area covered by the rainfall stations is divided such that large measured rainfall values are used as part of the boundaries for the SOR. Measured station values inside the area where SOR is implemented are compared with the interpolated values. Root mean square error (RMSE) and correlation between measured and interpolated results are quantified. RMSE values range from 0.25 to 2.46 mm for typhoons, 1.55 to 10.69 mm for monsoon-induced rain and 0.01 to 6.27 mm for thunderstorms. R2 values are 0.91, 0.89 and 0.76 for typhoons, monsoon-induced rain and thunderstorms, respectively. This study shows that the rainfall approximation method works and can be used for improved prediction, analysis and real-time flood map generation.
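
    The division of labor described (IDW supplies boundary values, SOR relaxes the interior) can be sketched as below with synthetic station data on a unit square; the grid size, relaxation factor and all data are assumptions, and the piecewise boundary construction around high-rainfall cells used in the study is not reproduced.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic station data: (x, y) positions in a unit square and rainfall in mm.
      stations = rng.random((30, 2))
      rain = 5.0 + 20.0 * rng.random(30)

      def idw(px, py, power=2.0):
          # Inverse Distance Weighting estimate at (px, py) from all stations.
          d = np.hypot(stations[:, 0] - px, stations[:, 1] - py)
          if d.min() < 1e-9:
              return float(rain[d.argmin()])
          w = 1.0 / d ** power
          return float(np.sum(w * rain) / np.sum(w))

      # Grid over the study area: IDW fixes the boundary values, SOR relaxes the interior.
      n = 40
      xs = ys = np.linspace(0.0, 1.0, n)
      grid = np.zeros((n, n))
      for i in range(n):
          for j in (0, n - 1):
              grid[i, j] = idw(xs[j], ys[i])       # left and right edges
              grid[j, i] = idw(xs[i], ys[j])       # top and bottom edges

      omega, tol = 1.8, 1e-4                       # SOR relaxation factor and tolerance
      for _ in range(2000):
          max_change = 0.0
          for i in range(1, n - 1):
              for j in range(1, n - 1):
                  new = (1 - omega) * grid[i, j] + omega * 0.25 * (
                      grid[i + 1, j] + grid[i - 1, j] + grid[i, j + 1] + grid[i, j - 1])
                  max_change = max(max_change, abs(new - grid[i, j]))
                  grid[i, j] = new
          if max_change < tol:
              break

      print("interpolated rainfall at the grid centre: %.2f mm" % grid[n // 2, n // 2])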

  9. An enhanced DWBA algorithm in hybrid WDM/TDM EPON networks with heterogeneous propagation delays

    NASA Astrophysics Data System (ADS)

    Li, Chengjun; Guo, Wei; Jin, Yaohui; Sun, Weiqiang; Hu, Weisheng

    2011-12-01

    An enhanced dynamic wavelength and bandwidth allocation (DWBA) algorithm in hybrid WDM/TDM PON is proposed and experimentally demonstrated. In addition to the fairness of bandwidth allocation, this algorithm also considers the varying propagation delays between ONUs and OLT. The simulation based on MATLAB indicates that the improved algorithm has a better performance compared with some other algorithms.

  10. Application of a single-objective, hybrid genetic algorithm approach to pharmacokinetic model building.

    PubMed

    Sherer, Eric A; Sale, Mark E; Pollock, Bruce G; Belani, Chandra P; Egorin, Merrill J; Ivy, Percy S; Lieberman, Jeffrey A; Manuck, Stephen B; Marder, Stephen R; Muldoon, Matthew F; Scher, Howard I; Solit, David B; Bies, Robert R

    2012-08-01

    A limitation in traditional stepwise population pharmacokinetic model building is the difficulty in handling interactions between model components. To address this issue, a method was previously introduced which couples NONMEM parameter estimation and model fitness evaluation to a single-objective, hybrid genetic algorithm for global optimization of the model structure. In this study, the generalizability of this approach for pharmacokinetic model building is evaluated by comparing (1) correct and spurious covariate relationships in a simulated dataset resulting from automated stepwise covariate modeling, Lasso methods, and single-objective hybrid genetic algorithm approaches to covariate identification and (2) information criteria values, model structures, convergence, and model parameter values resulting from manual stepwise versus single-objective, hybrid genetic algorithm approaches to model building for seven compounds. Both manual stepwise and single-objective, hybrid genetic algorithm approaches to model building were applied, blinded to the results of the other approach, for selection of the compartment structure as well as inclusion and model form of inter-individual and inter-occasion variability, residual error, and covariates from a common set of model options. For the simulated dataset, stepwise covariate modeling identified three of four true covariates and two spurious covariates; Lasso identified two of four true and 0 spurious covariates; and the single-objective, hybrid genetic algorithm identified three of four true covariates and one spurious covariate. For the clinical datasets, the Akaike information criterion was a median of 22.3 points lower (range of 470.5 point decrease to 0.1 point decrease) for the best single-objective hybrid genetic-algorithm candidate model versus the final manual stepwise model: the Akaike information criterion was lower by greater than 10 points for four compounds and differed by less than 10 points for three compounds. The root mean squared error and absolute mean prediction error of the best single-objective hybrid genetic algorithm candidates were a median of 0.2 points higher (range of 38.9 point decrease to 27.3 point increase) and 0.02 points lower (range of 0.98 point decrease to 0.74 point increase), respectively, than that of the final stepwise models. In addition, the best single-objective, hybrid genetic algorithm candidate models had successful convergence and covariance steps for each compound, used the same compartment structure as the manual stepwise approach for 6 of 7 (86 %) compounds, and identified 54 % (7 of 13) of covariates included by the manual stepwise approach and 16 covariate relationships not included by manual stepwise models. The model parameter values between the final manual stepwise and best single-objective, hybrid genetic algorithm models differed by a median of 26.7 % (q₁ = 4.9 % and q₃ = 57.1 %). Finally, the single-objective, hybrid genetic algorithm approach was able to identify models capable of estimating absorption rate parameters for four compounds that the manual stepwise approach did not identify. The single-objective, hybrid genetic algorithm represents a general pharmacokinetic model building methodology whose ability to rapidly search the feasible solution space leads to nearly equivalent or superior model fits to pharmacokinetic data.

  11. Cloud GIS Based Watershed Management

    NASA Astrophysics Data System (ADS)

    Bediroğlu, G.; Colak, H. E.

    2017-11-01

    In this study, we developed a cloud GIS based watershed management system using a cloud computing architecture. Cloud GIS is used as SaaS (Software as a Service) and DaaS (Data as a Service). We performed GIS analysis in the cloud to test SaaS and deployed GIS datasets to the cloud to test DaaS. We used a hybrid cloud computing model, making use of ready web-based mapping services hosted in the cloud (world topology, satellite imagery). After creating geodatabases covering hydrology (rivers, lakes), soil maps, climate maps, rain maps, geology and land use, we uploaded them to the system. The watershed of the study area was determined in the cloud using the ready-hosted topology maps. After uploading all the datasets to the system, we applied various GIS analyses and queries. The results show that cloud GIS technology brings speed and efficiency to watershed management studies. In addition, the system can easily be applied to similar land analysis and management studies.

  12. Parameter Estimation of Computationally Expensive Watershed Models Through Efficient Multi-objective Optimization and Interactive Decision Analytics

    NASA Astrophysics Data System (ADS)

    Akhtar, Taimoor; Shoemaker, Christine

    2016-04-01

    Watershed model calibration is inherently a multi-criteria problem. Conflicting trade-offs exist between different quantifiable calibration criteria, indicating the non-existence of a single optimal parameterization. Hence, many experts prefer a manual approach to calibration where the inherent multi-objective nature of the calibration problem is addressed through an interactive, subjective, time-intensive and complex decision making process. Multi-objective optimization can be used to efficiently identify multiple plausible calibration alternatives and assist calibration experts during the parameter estimation process. However, there are key challenges to the use of multi-objective optimization in the parameter estimation process, which include: 1) multi-objective optimization usually requires many model simulations, which is difficult for complex simulation models that are computationally expensive; and 2) selection of one from numerous calibration alternatives provided by multi-objective optimization is non-trivial. This study proposes a "Hybrid Automatic Manual Strategy" (HAMS) for watershed model calibration to specifically address the above-mentioned challenges. HAMS employs a 3-stage framework for parameter estimation. Stage 1 incorporates the use of an efficient surrogate multi-objective algorithm, GOMORS, for identification of numerous calibration alternatives within a limited simulation evaluation budget. The novelty of HAMS is embedded in Stages 2 and 3, where an interactive visual and metric based analytics framework is available as a decision support tool to choose a single calibration from the numerous alternatives identified in Stage 1. Stage 2 of HAMS provides a goodness-of-fit measure / metric based interactive framework for identification of a small subset (typically fewer than 10) of meaningful and diverse calibration alternatives from the numerous alternatives obtained in Stage 1. Stage 3 incorporates the use of an interactive visual analytics framework for decision support in selection of one parameter combination from the alternatives identified in Stage 2. HAMS is applied for calibration of flow parameters of a SWAT (Soil and Water Assessment Tool) model designed to simulate flow in the Cannonsville watershed in upstate New York. Results from the application of HAMS to Cannonsville indicate that efficient multi-objective optimization and interactive visual and metric based analytics can bridge the gap between the effective use of both automatic and manual strategies for parameter estimation of computationally expensive watershed models.

  13. HRSSA – Efficient hybrid stochastic simulation for spatially homogeneous biochemical reaction networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marchetti, Luca, E-mail: marchetti@cosbi.eu; Priami, Corrado, E-mail: priami@cosbi.eu; University of Trento, Department of Mathematics

    This paper introduces HRSSA (Hybrid Rejection-based Stochastic Simulation Algorithm), a new efficient hybrid stochastic simulation algorithm for spatially homogeneous biochemical reaction networks. HRSSA is built on top of RSSA, an exact stochastic simulation algorithm which relies on propensity bounds to select next reaction firings and to reduce the average number of reaction propensity updates needed during the simulation. HRSSA exploits the computational advantage of propensity bounds to manage time-varying transition propensities and to apply dynamic partitioning of reactions, which constitute the two most significant bottlenecks of hybrid simulation. A comprehensive set of simulation benchmarks is provided for evaluating performance and accuracy of HRSSA against other state of the art algorithms.

  14. An Effective Hybrid Evolutionary Algorithm for Solving the Numerical Optimization Problems

    NASA Astrophysics Data System (ADS)

    Qian, Xiaohong; Wang, Xumei; Su, Yonghong; He, Liu

    2018-04-01

    There are many different algorithms for solving complex optimization problems. Each algorithm has been applied successfully to some optimization problems but is not efficient on others. In this paper, Cauchy mutation and a multi-parent crossover operator are combined to propose a hybrid evolutionary algorithm based on communication between subpopulations (Mixed Evolutionary Algorithm based on Communication), hereinafter referred to as CMEA. The basic idea of CMEA is that the initial population is divided into two subpopulations. A Cauchy mutation operator and a multi-parent crossover operator are applied to the two subpopulations, which evolve in parallel until the stopping conditions are met. When the subpopulations are periodically reorganized, individuals and the information they carry are exchanged between them. The algorithm flow is given and the performance of the algorithm is compared on a number of standard test functions. Simulation results show that this algorithm converges significantly faster than the FEP (Fast Evolutionary Programming) algorithm, has good global convergence and stability, and is superior to the other compared algorithms.
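
    The abstract gives the overall structure (two subpopulations, Cauchy mutation versus multi-parent crossover, periodic exchange) but not the details. The sketch below is a hedged rendering of that structure on a toy objective, with greedy replacement and a best-individual swap standing in for whatever reorganization and communication scheme the authors actually use.

      import numpy as np

      rng = np.random.default_rng(5)

      def sphere(x):
          return float(np.sum(x ** 2))

      def cauchy_mutation(pop, lb, ub, scale=0.2):
          # Mutate every individual with heavy-tailed Cauchy noise.
          noise = scale * rng.standard_cauchy(pop.shape)
          return np.clip(pop + noise, lb, ub)

      def multi_parent_crossover(pop, n_parents=3):
          # Create offspring as random convex combinations of several parents.
          children = np.empty_like(pop)
          for k in range(len(pop)):
              idx = rng.choice(len(pop), n_parents, replace=False)
              w = rng.random(n_parents)
              w /= w.sum()
              children[k] = w @ pop[idx]
          return children

      def cmea(fobj=sphere, dim=10, sub_size=20, gens=200, exchange_every=10, lb=-5.0, ub=5.0):
          sub_a = rng.uniform(lb, ub, (sub_size, dim))   # evolved by Cauchy mutation
          sub_b = rng.uniform(lb, ub, (sub_size, dim))   # evolved by multi-parent crossover
          for g in range(gens):
              for sub, op in ((sub_a, cauchy_mutation), (sub_b, multi_parent_crossover)):
                  trial = op(sub, lb, ub) if op is cauchy_mutation else op(sub)
                  for i in range(sub_size):              # greedy replacement
                      if fobj(trial[i]) < fobj(sub[i]):
                          sub[i] = trial[i]
              if (g + 1) % exchange_every == 0:          # communication: swap best individuals
                  ia = min(range(sub_size), key=lambda i: fobj(sub_a[i]))
                  ib = min(range(sub_size), key=lambda i: fobj(sub_b[i]))
                  sub_a[ia], sub_b[ib] = sub_b[ib].copy(), sub_a[ia].copy()
          all_pop = np.vstack([sub_a, sub_b])
          best = min(all_pop, key=fobj)
          return best, fobj(best)

      best, fbest = cmea()
      print("best value:", fbest)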

  15. Hybrid algorithms for fuzzy reverse supply chain network design.

    PubMed

    Che, Z H; Chiang, Tzu-An; Kuo, Y C; Cui, Zhihua

    2014-01-01

    In consideration of capacity constraints, fuzzy defect ratio, and fuzzy transport loss ratio, this paper attempted to establish an optimized decision model for production planning and distribution of a multiphase, multiproduct reverse supply chain, which addresses defects returned to original manufacturers, and in addition, develops hybrid algorithms such as Particle Swarm Optimization-Genetic Algorithm (PSO-GA), Genetic Algorithm-Simulated Annealing (GA-SA), and Particle Swarm Optimization-Simulated Annealing (PSO-SA) for solving the optimized model. During a case study of a multi-phase, multi-product reverse supply chain network, this paper explained the suitability of the optimized decision model and the applicability of the algorithms. Finally, the hybrid algorithms showed excellent solving capability when compared with original GA and PSO methods.

  16. Hybrid Algorithms for Fuzzy Reverse Supply Chain Network Design

    PubMed Central

    Che, Z. H.; Chiang, Tzu-An; Kuo, Y. C.

    2014-01-01

    In consideration of capacity constraints, fuzzy defect ratio, and fuzzy transport loss ratio, this paper attempted to establish an optimized decision model for production planning and distribution of a multiphase, multiproduct reverse supply chain, which addresses defects returned to original manufacturers, and in addition, develops hybrid algorithms such as Particle Swarm Optimization-Genetic Algorithm (PSO-GA), Genetic Algorithm-Simulated Annealing (GA-SA), and Particle Swarm Optimization-Simulated Annealing (PSO-SA) for solving the optimized model. During a case study of a multi-phase, multi-product reverse supply chain network, this paper explained the suitability of the optimized decision model and the applicability of the algorithms. Finally, the hybrid algorithms showed excellent solving capability when compared with original GA and PSO methods. PMID:24892057

  17. A fast hybrid algorithm combining regularized motion tracking and predictive search for reducing the occurrence of large displacement errors.

    PubMed

    Jiang, Jingfeng; Hall, Timothy J

    2011-04-01

    A hybrid approach that inherits both the robustness of the regularized motion tracking approach and the efficiency of the predictive search approach is reported. The basic idea is to use regularized speckle tracking to obtain high-quality seeds in an explorative search that can be used in the subsequent intelligent predictive search. The performance of the hybrid speckle-tracking algorithm was compared with three published speckle-tracking methods using in vivo breast lesion data. We found that the hybrid algorithm provided higher displacement quality metric values, lower root mean squared errors compared with a locally smoothed displacement field, and higher improvement ratios compared with the classic block-matching algorithm. On the basis of these comparisons, we concluded that the hybrid method can further enhance the accuracy of speckle tracking compared with its real-time counterparts, at the expense of slightly higher computational demands. © 2011 IEEE

  18. A hybrid optimization algorithm to explore atomic configurations of TiO 2 nanoparticles

    DOE PAGES

    Inclan, Eric J.; Geohegan, David B.; Yoon, Mina

    2017-10-17

    In this paper we present a hybrid algorithm comprised of differential evolution coupled with the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton optimization algorithm, for the purpose of identifying a broad range of (meta)stable TinO2n nanoparticles, as an example system, described by a Buckingham interatomic potential. The potential and its gradient are modified to be piecewise continuous to enable use of these continuous-domain, unconstrained algorithms, thereby improving compatibility. To measure computational effectiveness, a regression on known structures is used. This approach defines effectiveness as the ability of an algorithm to produce a set of structures whose energy distribution follows the regression as the number of TinO2n units increases, such that the shape of the distribution is consistent with the algorithm's stated goals. Our calculations demonstrate that the hybrid algorithm finds global minimum configurations more effectively than the differential evolution algorithms widely employed in the field of materials science. Specifically, the hybrid algorithm reproduces the global minimum energy structures reported in the literature up to n = 5 and retains good agreement with the regression up to n = 25. For 25 < n < 100, where literature structures are unavailable, the hybrid algorithm effectively obtains structures with lower energies per TiO2 unit as the system size increases.
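
    The same DE-plus-quasi-Newton coupling can be expressed compactly with SciPy, whose differential_evolution finishes the best member with L-BFGS-B (a BFGS-family method) when polish=True. The objective below is a generic test function, not the modified Buckingham potential of the paper.

      import numpy as np
      from scipy.optimize import differential_evolution, minimize

      def energy(x):
          # Stand-in energy landscape (Rastrigin); the paper minimizes a Buckingham potential.
          return 10 * len(x) + float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

      bounds = [(-5.12, 5.12)] * 6

      # Global stage: differential evolution; polish=True finishes the best member with
      # L-BFGS-B, a quasi-Newton (BFGS-family) local optimizer.
      result = differential_evolution(energy, bounds, seed=0, maxiter=300, polish=True)
      print("DE + quasi-Newton minimum:", result.fun)

      # Equivalent explicit two-step hybrid: DE without polish, then a separate refinement.
      rough = differential_evolution(energy, bounds, seed=0, maxiter=300, polish=False)
      refined = minimize(energy, rough.x, method="L-BFGS-B", bounds=bounds)
      print("explicit two-step minimum:", refined.fun)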

  19. A hybrid multi-objective evolutionary algorithm for wind-turbine blade optimization

    NASA Astrophysics Data System (ADS)

    Sessarego, M.; Dixon, K. R.; Rival, D. E.; Wood, D. H.

    2015-08-01

    A concurrent-hybrid non-dominated sorting genetic algorithm (hybrid NSGA-II) has been developed and applied to the simultaneous optimization of the annual energy production, flapwise root-bending moment and mass of the NREL 5 MW wind-turbine blade. By hybridizing a multi-objective evolutionary algorithm (MOEA) with gradient-based local search, it is believed that the optimal set of blade designs could be achieved at lower computational cost than with a conventional MOEA. To measure the convergence between the hybrid and non-hybrid NSGA-II on a wind-turbine blade optimization problem, a computationally intensive case was performed using the non-hybrid NSGA-II. From this case, a three-dimensional surface representing the optimal trade-off between the annual energy production, flapwise root-bending moment and blade mass was obtained. The inclusion of local gradients in the blade optimization, however, shows no improvement in convergence for this three-objective problem.

  20. Novel bio-inspired smart control for hazard mitigation of civil structures

    NASA Astrophysics Data System (ADS)

    Kim, Yeesock; Kim, Changwon; Langari, Reza

    2010-11-01

    In this paper, a new bio-inspired controller is proposed for vibration mitigation of smart structures subjected to ground disturbances (i.e. earthquakes). The control system is developed through the integration of a brain emotional learning (BEL) algorithm with a proportional-integral-derivative (PID) controller and a semiactive inversion (Inv) algorithm. The BEL algorithm is based on the neurologically inspired computational model of the amygdala and the orbitofrontal cortex. To demonstrate the effectiveness of the proposed hybrid BEL-PID-Inv control algorithm, a seismically excited building structure equipped with a magnetorheological (MR) damper is investigated. The performance of the proposed hybrid BEL-PID-Inv control algorithm is compared with that of passive, PID, linear quadratic Gaussian (LQG), and BEL control systems. In the simulation, the robustness of the hybrid BEL-PID-Inv control algorithm in the presence of modeling uncertainties as well as external disturbances is investigated. It is shown that the proposed hybrid BEL-PID-Inv control algorithm is effective in improving the dynamic responses of seismically excited building structure-MR damper systems.

  1. Opposition-Based Memetic Algorithm and Hybrid Approach for Sorting Permutations by Reversals.

    PubMed

    Soncco-Álvarez, José Luis; Muñoz, Daniel M; Ayala-Rincón, Mauricio

    2018-02-21

    Sorting unsigned permutations by reversals is a difficult problem; indeed, it was proved to be NP-hard by Caprara (1997). Because of its high complexity, many approximation algorithms for computing the minimal reversal distance have been proposed, reaching the current best-known theoretical ratio of 1.375. In this article, two memetic algorithms for computing the reversal distance are proposed. The first uses the technique of opposition-based learning, leading to an opposition-based memetic algorithm (OBMA); the second improves the previous algorithm by applying the two-breakpoint-elimination heuristic, leading to a hybrid approach. Several experiments were performed with one hundred randomly generated permutations, single benchmark permutations, and biological permutations. The results show that the proposed OBMA and Hybrid-OBMA algorithms achieve the best results for practical cases, that is, for permutations of length up to 120. Hybrid-OBMA was also shown to improve the results of OBMA for permutations of length greater than or equal to 60. The applicability of our proposed algorithms was checked by processing permutations based on biological data, in which case OBMA gave the best average results for all instances.
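
    The sketch below illustrates only the opposition-based initialisation idea on permutations, with the breakpoint count used as a cheap proxy for the reversal distance; the mapping v -> n+1-v as the "opposite" permutation and all sizes are illustrative assumptions, not the authors' OBMA.

      import numpy as np

      def breakpoints(perm):
          """Count adjacent pairs that are not consecutive integers (permutation framed by 0 and n+1)."""
          ext = np.concatenate(([0], perm, [len(perm) + 1]))
          return int(np.sum(np.abs(np.diff(ext)) != 1))

      def opposite(perm):
          """Opposition-based counterpart: element v is mapped to n+1-v."""
          return len(perm) + 1 - perm

      rng = np.random.default_rng(0)
      n, pop_size = 20, 8
      population = np.array([rng.permutation(np.arange(1, n + 1)) for _ in range(pop_size)])
      opposites = np.array([opposite(p) for p in population])

      # Opposition-based initialisation: keep the best individuals among the population and its opposites,
      # using the breakpoint count as a cheap fitness proxy.
      merged = np.concatenate([population, opposites])
      scores = np.array([breakpoints(p) for p in merged])
      selected = merged[np.argsort(scores)[:pop_size]]
      print("selected breakpoint counts:", np.sort(scores)[:pop_size].tolist())
      print("best individual:", selected[0].tolist())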

  2. Supervisory Power Management Control Algorithms for Hybrid Electric Vehicles. A Survey

    DOE PAGES

    Malikopoulos, Andreas

    2014-03-31

    The growing necessity for environmentally benign hybrid propulsion systems has led to the development of advanced power management control algorithms to maximize fuel economy and minimize pollutant emissions. This paper surveys the control algorithms for hybrid electric vehicles (HEVs) and plug-in HEVs (PHEVs) that have been reported in the literature to date. The exposition covers parallel, series, and power-split HEVs and PHEVs, and includes a classification of the algorithms in terms of their implementation and the chronological order of their appearance. Remaining challenges and potential future research directions are also discussed.

  3. An Improved Iris Recognition Algorithm Based on Hybrid Feature and ELM

    NASA Astrophysics Data System (ADS)

    Wang, Juan

    2018-03-01

    Iris images are easily degraded by noise and uneven illumination. This paper proposes an improved extreme learning machine (ELM) based iris recognition algorithm with a hybrid feature. 2D Gabor filters and the grey-level co-occurrence matrix (GLCM) are employed to generate a multi-granularity hybrid feature vector; the Gabor filters capture low-to-intermediate frequency texture information, while the GLCM features capture high-frequency texture information. Finally, an extreme learning machine is used for iris recognition. Experimental results show that the proposed ELM-based multi-granularity iris recognition algorithm (ELM-MGIR) achieves a higher accuracy of 99.86% and a lower equal error rate (EER) of 0.12% while maintaining real-time performance. The proposed ELM-MGIR algorithm outperforms other mainstream iris recognition algorithms.

  4. Application of hybrid clustering using parallel k-means algorithm and DIANA algorithm

    NASA Astrophysics Data System (ADS)

    Umam, Khoirul; Bustamam, Alhadi; Lestari, Dian

    2017-03-01

    DNA is one of the carriers of genetic information in living organisms. Encoding, sequencing, and clustering DNA sequences have become key and routine tasks in molecular biology, in particular in bioinformatics applications. There are two types of clustering: hierarchical clustering and partitioning clustering. In this paper, we combine the two, K-Means (partitioning) and DIANA (hierarchical), into what we call hybrid clustering. The hybrid clustering, implemented with a parallel K-Means algorithm and the DIANA algorithm, is used to cluster DNA sequences of Human Papillomavirus (HPV). The clustering process starts by collecting HPV DNA sequences from NCBI (National Center for Biotechnology Information) and extracting characteristics of the sequences. The extracted characteristics are stored in matrix form, normalized with Min-Max normalization, and genetic distances are calculated using the Euclidean distance. The hybrid clustering is then applied using the parallel K-Means and DIANA implementations. The aim of the hybrid clustering is to obtain better clusters, and the Davies-Bouldin Index (DBI) is used to validate the resulting clusters and to determine the optimal number of clusters. In this study, parallel K-Means clustering alone grouped the data into 5 clusters with a minimal DBI value of 0.8741, whereas the hybrid clustering grouped the data into 13 sub-clusters with minimal DBI values of 0.8216, 0.6845, 0.3331, 0.1994 and 0.3952. Since the DBI values of the hybrid clustering are lower than the DBI value obtained by parallel K-Means alone at the first stage, the hybrid clustering gives better results for clustering the HPV DNA sequences than parallel K-Means clustering alone.
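
    Below is a minimal sketch of the two-stage idea (not the authors' code) using scikit-learn on synthetic feature vectors: k-means partitions the data, a hierarchical step then splits each partition, and the Davies-Bouldin index scores both results. AgglomerativeClustering stands in for DIANA (which is divisive), and the data, cluster counts and split rule are assumptions.

      import numpy as np
      from sklearn.cluster import KMeans, AgglomerativeClustering
      from sklearn.datasets import make_blobs
      from sklearn.metrics import davies_bouldin_score

      # Synthetic stand-in for the characteristic vectors extracted from DNA sequences.
      X, _ = make_blobs(n_samples=300, centers=5, n_features=8, random_state=0)
      X = (X - X.min(0)) / (X.max(0) - X.min(0))          # Min-Max normalisation

      # Stage 1: partitioning clustering (k-means).
      km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
      print("k-means DBI:", davies_bouldin_score(X, km.labels_))

      # Stage 2: hierarchical clustering inside each k-means cluster
      # (agglomerative clustering is used here as a stand-in for the divisive DIANA).
      sub_labels = np.empty(len(X), dtype=int)
      next_id = 0
      for c in range(5):
          idx = np.where(km.labels_ == c)[0]
          n_sub = 2 if len(idx) > 10 else 1
          if n_sub == 1:
              sub_labels[idx] = next_id
          else:
              agg = AgglomerativeClustering(n_clusters=n_sub).fit(X[idx])
              sub_labels[idx] = agg.labels_ + next_id
          next_id += n_sub

      print("hybrid DBI: ", davies_bouldin_score(X, sub_labels))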

  5. MIP models and hybrid algorithms for simultaneous job splitting and scheduling on unrelated parallel machines.

    PubMed

    Eroglu, Duygu Yilmaz; Ozmutlu, H Cenk

    2014-01-01

    We developed mixed integer programming (MIP) models and hybrid genetic-local search algorithms for the scheduling problem of unrelated parallel machines with job-sequence- and machine-dependent setup times and with a job splitting property. The first contribution of this paper is to introduce novel algorithms that perform splitting and scheduling simultaneously with a variable number of subjobs. We propose a simple chromosome structure constituted by random key numbers in the hybrid genetic-local search algorithm (GAspLA). Random key numbers are frequently used in genetic algorithms, but they create additional difficulty when local search is hybridized with the genetic algorithm. We developed algorithms that adapt the results of the local search back into the genetic algorithm with a minimum relocation of the genes' random key numbers; this is the second contribution of the paper. The third contribution is three new MIP models that perform splitting and scheduling simultaneously. The fourth contribution is the implementation of GAspLAMIP, which lets us verify the optimality of GAspLA for the studied combinations. The proposed methods are tested on a set of problems taken from the literature, and the results validate the effectiveness of the proposed algorithms.
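
    A minimal sketch of how a random-key chromosome can be decoded into a schedule (job order by sorting the sequencing keys, machine choice by the interval each key falls into); the job-splitting logic and the local-search adaptation described in the paper are omitted, and all names and sizes are illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      n_jobs, n_machines = 6, 2

      # One chromosome: a sequencing key and a machine-assignment key per job.
      sequence_keys = rng.random(n_jobs)
      machine_keys = rng.random(n_jobs)

      job_order = np.argsort(sequence_keys)                  # smaller key = scheduled earlier
      machines = (machine_keys * n_machines).astype(int)     # key interval selects the machine

      print("job order:         ", job_order.tolist())
      print("machine assignment:", machines.tolist())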

  6. Manifold absolute pressure estimation using neural network with hybrid training algorithm

    PubMed Central

    Selamat, Hazlina; Alimin, Ahmad Jais; Haniff, Mohamad Fadzli

    2017-01-01

    In a modern small gasoline engine fuel injection system, the load of the engine is estimated from the measurement of the manifold absolute pressure (MAP) sensor, which is located in the intake manifold. This paper presents a more economical approach to estimating the MAP using only measurements of the throttle position and engine speed, resulting in lower implementation cost. The estimation is done via a two-stage multilayer feed-forward neural network trained by combining the Levenberg-Marquardt (LM) algorithm, the Bayesian Regularization (BR) algorithm and the Particle Swarm Optimization (PSO) algorithm. Based on the results of 20 runs, the second variant of the hybrid algorithm yields better network performance than the first variant, LM, LM with BR, and PSO, estimating the MAP more closely to the simulated MAP values. Using valid experimental training data, the estimator network trained with the second variant of the hybrid algorithm showed the best performance among the algorithms when used in an actual retrofit fuel injection system (RFIS). The performance of the estimator was also validated in steady-state and transient conditions, showing MAP estimates closer to the actual values. PMID:29190779

  7. A hybrid CS-SA intelligent approach to solve uncertain dynamic facility layout problems considering dependency of demands

    NASA Astrophysics Data System (ADS)

    Moslemipour, Ghorbanali

    2018-07-01

    This paper proposes a quadratic assignment-based mathematical model to deal with the stochastic dynamic facility layout problem. In this problem, product demands are assumed to be dependent, normally distributed random variables with known probability density function and covariance that change from period to period at random. To solve the proposed model, a novel hybrid intelligent algorithm is proposed by combining the simulated annealing and clonal selection algorithms. The proposed model and the hybrid algorithm are verified and validated using design-of-experiment and benchmark methods. The results show that the hybrid algorithm performs outstandingly in terms of both solution quality and computational time. Moreover, the proposed model can be used in both stochastic and deterministic situations.

  8. A Hybrid Neural Network-Genetic Algorithm Technique for Aircraft Engine Performance Diagnostics

    NASA Technical Reports Server (NTRS)

    Kobayashi, Takahisa; Simon, Donald L.

    2001-01-01

    In this paper, a model-based diagnostic method, which utilizes Neural Networks and Genetic Algorithms, is investigated. Neural networks are applied to estimate the engine internal health, and Genetic Algorithms are applied for sensor bias detection and estimation. This hybrid approach takes advantage of the nonlinear estimation capability provided by neural networks while improving the robustness to measurement uncertainty through the application of Genetic Algorithms. The hybrid diagnostic technique also has the ability to rank multiple potential solutions for a given set of anomalous sensor measurements in order to reduce false alarms and missed detections. The performance of the hybrid diagnostic technique is evaluated through some case studies derived from a turbofan engine simulation. The results show this approach is promising for reliable diagnostics of aircraft engines.

  9. Study of parameter identification using hybrid neural-genetic algorithm in electro-hydraulic servo system

    NASA Astrophysics Data System (ADS)

    Moon, Byung-Young

    2005-12-01

    A hybrid neural-genetic multi-model parameter estimation algorithm is demonstrated that can be applied to structured system identification of an electro-hydraulic servo system. The algorithm consists of a recurrent incremental credit assignment (ICRA) neural network and a genetic algorithm: the ICRA neural network evaluates each member of a generation of models, and the genetic algorithm produces the next generation of models. To evaluate the proposed method, an electro-hydraulic servo system was designed and manufactured, and an experiment was carried out to assess the hybrid neural-genetic multi-model parameter estimation algorithm. As a result, the dynamic characteristics were obtained in terms of the parameters (mass, damping coefficient, bulk modulus, spring coefficient) that minimize the total squared error. The results of this study can be applied to hydraulic systems in industrial fields.

  10. A Benders based rolling horizon algorithm for a dynamic facility location problem

    DOE PAGES

    Marufuzzaman, Mohammad; Gedik, Ridvan; Roni, Mohammad S.

    2016-06-28

    This study addresses the well-known capacitated dynamic facility location problem (DFLP), which satisfies customer demand at minimum cost by determining the time period for opening, closing, or retaining an existing facility in a given location. To solve this challenging NP-hard problem, this paper develops a unique hybrid solution algorithm that combines a rolling horizon algorithm with an accelerated Benders decomposition algorithm. Extensive computational experiments are performed on benchmark test instances to evaluate the hybrid algorithm's efficiency and robustness in solving the DFLP. Computational results indicate that the hybrid Benders-based rolling horizon algorithm consistently offers high-quality feasible solutions in a much shorter computational time than the standalone rolling horizon and accelerated Benders decomposition algorithms in the experimental range.

  11. A Winner Determination Algorithm for Combinatorial Auctions Based on Hybrid Artificial Fish Swarm Algorithm

    NASA Astrophysics Data System (ADS)

    Zheng, Genrang; Lin, ZhengChun

    The winner determination problem in combinatorial auctions is a hot topic in electronic business and an NP-hard problem. A Hybrid Artificial Fish Swarm Algorithm (HAFSA), which combines the First Suite Heuristic Algorithm (FSHA) with the Artificial Fish Swarm Algorithm (AFSA), is proposed to solve the problem based on the theory of AFSA. Experimental results show that HAFSA is a fast and efficient algorithm for the winner determination problem. Compared with an Ant Colony Optimization algorithm, it performs well and has broad application prospects.

  12. Computing all hybridization networks for multiple binary phylogenetic input trees.

    PubMed

    Albrecht, Benjamin

    2015-07-30

    The computation of phylogenetic trees on the same set of species based on different orthologous genes can lead to incongruent trees. One possible explanation for this behavior is interspecific hybridization events recombining genes of different species. An important approach to analyzing such events is the computation of hybridization networks. This work presents the first algorithm computing the hybridization number as well as a set of representative hybridization networks for multiple binary phylogenetic input trees on the same set of taxa. To improve its practical runtime, we show how this algorithm can be parallelized. Moreover, we demonstrate the efficiency of the software Hybroscale, containing an implementation of our algorithm, by comparing it to PIRNv2.0, which is so far the best available software computing the exact hybridization number for multiple binary phylogenetic trees on the same set of taxa. The algorithm is part of the software Hybroscale, which was developed specifically for the investigation of hybridization networks, including their computation and visualization. Hybroscale is freely available(1) and runs on all three major operating systems. Our simulation study indicates that our approach is on average 100 times faster than PIRNv2.0. Moreover, we show how Hybroscale improves the interpretation of the reported hybridization networks by adding certain features to its graphical representation.

  13. 75 FR 36147 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Order Approving...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-24

    ..., as Modified by Amendment No. 1 Thereto, Related to the Hybrid Matching Algorithms June 17, 2010. On... Hybrid System. Each rule currently provides allocation algorithms the Exchange can utilize when executing incoming electronic orders, including the Ultimate Matching Algorithm (``UMA''), and price-time and pro...

  14. A Novel Hybrid Firefly Algorithm for Global Optimization.

    PubMed

    Zhang, Lina; Liu, Liqiang; Yang, Xin-She; Dai, Yuntao

    Global optimization is challenging to solve due to its nonlinearity and multimodality. Traditional algorithms such as the gradient-based methods often struggle to deal with such problems and one of the current trends is to use metaheuristic algorithms. In this paper, a novel hybrid population-based global optimization algorithm, called hybrid firefly algorithm (HFA), is proposed by combining the advantages of both the firefly algorithm (FA) and differential evolution (DE). FA and DE are executed in parallel to promote information sharing among the population and thus enhance searching efficiency. In order to evaluate the performance and efficiency of the proposed algorithm, a diverse set of selected benchmark functions are employed and these functions fall into two groups: unimodal and multimodal. The experimental results show better performance of the proposed algorithm compared to the original version of the firefly algorithm (FA), differential evolution (DE) and particle swarm optimization (PSO) in the sense of avoiding local minima and increasing the convergence rate.
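
    A compact sketch of the parallel FA/DE idea on the sphere function: one sub-population follows the textbook firefly moves, the other follows DE/rand/1/bin, and at the end of each iteration the two sub-populations exchange their current best individuals. The exchange rule and all parameter values are illustrative assumptions, not those of the paper.

      import numpy as np

      def sphere(x):
          return float(np.sum(x ** 2))

      rng = np.random.default_rng(0)
      dim, n_fa, n_de, iters = 10, 20, 20, 200
      lb, ub = -5.0, 5.0
      alpha, beta0, gamma = 0.2, 1.0, 1.0          # firefly parameters
      F, CR = 0.6, 0.9                             # DE parameters

      fa = rng.uniform(lb, ub, (n_fa, dim))        # firefly sub-population
      de = rng.uniform(lb, ub, (n_de, dim))        # differential-evolution sub-population

      for _ in range(iters):
          # Firefly moves: dimmer fireflies move toward brighter (lower-cost) ones.
          brightness = np.array([sphere(x) for x in fa])
          for i in range(n_fa):
              for j in range(n_fa):
                  if brightness[j] < brightness[i]:
                      r2 = np.sum((fa[i] - fa[j]) ** 2)
                      beta = beta0 * np.exp(-gamma * r2)
                      fa[i] = np.clip(fa[i] + beta * (fa[j] - fa[i]) + alpha * (rng.random(dim) - 0.5), lb, ub)
                      brightness[i] = sphere(fa[i])
          # DE/rand/1/bin on the second sub-population.
          for i in range(n_de):
              a, b, c = de[rng.choice([k for k in range(n_de) if k != i], 3, replace=False)]
              mutant = np.clip(a + F * (b - c), lb, ub)
              cross = rng.random(dim) < CR
              cross[rng.integers(dim)] = True
              trial = np.where(cross, mutant, de[i])
              if sphere(trial) <= sphere(de[i]):
                  de[i] = trial
          # Information sharing: each sub-population's worst member is replaced by the other's best.
          fa_cost = np.array([sphere(x) for x in fa])
          de_cost = np.array([sphere(x) for x in de])
          fa[np.argmax(fa_cost)], de[np.argmax(de_cost)] = de[np.argmin(de_cost)].copy(), fa[np.argmin(fa_cost)].copy()

      best = min(np.vstack([fa, de]), key=sphere)
      print("best value found:", sphere(best))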

  15. A Novel Hybrid Firefly Algorithm for Global Optimization

    PubMed Central

    Zhang, Lina; Liu, Liqiang; Yang, Xin-She; Dai, Yuntao

    2016-01-01

    Global optimization is challenging to solve due to its nonlinearity and multimodality. Traditional algorithms such as the gradient-based methods often struggle to deal with such problems and one of the current trends is to use metaheuristic algorithms. In this paper, a novel hybrid population-based global optimization algorithm, called hybrid firefly algorithm (HFA), is proposed by combining the advantages of both the firefly algorithm (FA) and differential evolution (DE). FA and DE are executed in parallel to promote information sharing among the population and thus enhance searching efficiency. In order to evaluate the performance and efficiency of the proposed algorithm, a diverse set of selected benchmark functions are employed and these functions fall into two groups: unimodal and multimodal. The experimental results show better performance of the proposed algorithm compared to the original version of the firefly algorithm (FA), differential evolution (DE) and particle swarm optimization (PSO) in the sense of avoiding local minima and increasing the convergence rate. PMID:27685869

  16. A hybrid genetic algorithm for resolving closely spaced objects

    NASA Technical Reports Server (NTRS)

    Abbott, R. J.; Lillo, W. E.; Schulenburg, N.

    1995-01-01

    A hybrid genetic algorithm is described for performing the difficult optimization task of resolving closely spaced objects appearing in space based and ground based surveillance data. This application of genetic algorithms is unusual in that it uses a powerful domain-specific operation as a genetic operator. Results of applying the algorithm to real data from telescopic observations of a star field are presented.

  17. Series Hybrid Electric Vehicle Power System Optimization Based on Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Zhu, Tianjun; Li, Bin; Zong, Changfu; Wu, Yang

    2017-09-01

    Hybrid electric vehicles (HEVs), compared with conventional vehicles, have complex structures and more component parameters. Optimizing over all of these parameters would increase the difficulty and hinder the convergence of the optimization program, so this paper selects only the parameters that have a major influence on vehicle fuel consumption so that the components operate at maximum efficiency. First, models of the HEV powertrain components are built. Second, taking a series (tandem) hybrid structure as an example, a genetic algorithm is used to optimize fuel consumption and emissions. Simulation results in ADVISOR verify the feasibility of the proposed genetic optimization algorithm.

  18. Hybrid real-code ant colony optimisation for constrained mechanical design

    NASA Astrophysics Data System (ADS)

    Pholdee, Nantiwat; Bureerat, Sujin

    2016-01-01

    This paper proposes a hybrid meta-heuristic based on integrating a local search simplex downhill (SDH) method into the search procedure of real-code ant colony optimisation (ACOR). This hybridisation leads to five hybrid algorithms where a Monte Carlo technique, a Latin hypercube sampling technique (LHS) and a translational propagation Latin hypercube design (TPLHD) algorithm are used to generate an initial population. Also, two numerical schemes for selecting an initial simplex are investigated. The original ACOR and its hybrid versions along with a variety of established meta-heuristics are implemented to solve 17 constrained test problems where a fuzzy set theory penalty function technique is used to handle design constraints. The comparative results show that the hybrid algorithms are the top performers. Using the TPLHD technique gives better results than the other sampling techniques. The hybrid optimisers are a powerful design tool for constrained mechanical design problems.

  19. A Methodology for the Hybridization Based in Active Components: The Case of cGA and Scatter Search.

    PubMed

    Villagra, Andrea; Alba, Enrique; Leguizamón, Guillermo

    2016-01-01

    This work presents the results of a new methodology for hybridizing metaheuristics. By first locating the active components (parts) of one algorithm and then inserting them into a second one, we can build efficient and accurate optimization, search, and learning algorithms. This gives a concrete way of constructing new techniques that contrasts with the widespread ad hoc way of hybridizing. In this paper, the enhanced algorithm is a Cellular Genetic Algorithm (cGA), which has been successfully used in the past to find solutions to hard optimization problems. In order to extend and corroborate the use of active components as an emerging hybridization methodology, we propose here the use of active components taken from Scatter Search (SS) to improve the cGA. The results obtained over a varied set of benchmarks are highly satisfactory in efficacy and efficiency when compared with a standard cGA. Moreover, the proposed hybrid approach (i.e., cGA+SS) has shown encouraging results with regard to earlier applications of our methodology.

  20. A Circuit-Based Neural Network with Hybrid Learning of Backpropagation and Random Weight Change Algorithms

    PubMed Central

    Yang, Changju; Kim, Hyongsuk; Adhikari, Shyam Prasad; Chua, Leon O.

    2016-01-01

    A hybrid learning method combining software-based backpropagation (BP) learning and hardware-based random weight change (RWC) learning is proposed for the development of circuit-based neural networks. Backpropagation is known as one of the most efficient learning algorithms, but its weak point is that its hardware implementation is extremely difficult. The RWC algorithm, which is very easy to implement in hardware, takes too many iterations to learn. The proposed learning algorithm is a hybrid of the two. The main learning is first performed with a software version of the BP algorithm, and the learned weights are then transplanted onto a hardware neural circuit. At the time of the weight transplantation, a significant amount of output error would occur due to the characteristic differences between the software and the hardware. In the proposed method, such error is reduced via a complementary learning stage using the RWC algorithm, which is implemented in simple hardware. The usefulness of the proposed hybrid learning system is verified via simulations on several classical learning problems. PMID:28025566

  1. Watershed modeling at the Savannah River Site.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vache, Kellie

    2015-04-29

    The overall goal of the work was the development of a watershed-scale model of hydrological function for application to the US Department of Energy's (DOE) Savannah River Site (SRS). The primary outcome is a grid-based hydrological modeling system that captures near-surface runoff as well as groundwater recharge and contributions of groundwater to streams. The model includes a physically based algorithm to capture both evaporation and transpiration from forestland.

  2. MIP Models and Hybrid Algorithms for Simultaneous Job Splitting and Scheduling on Unrelated Parallel Machines

    PubMed Central

    Ozmutlu, H. Cenk

    2014-01-01

    We developed mixed integer programming (MIP) models and hybrid genetic-local search algorithms for the scheduling problem of unrelated parallel machines with job-sequence- and machine-dependent setup times and with a job splitting property. The first contribution of this paper is to introduce novel algorithms that perform splitting and scheduling simultaneously with a variable number of subjobs. We propose a simple chromosome structure constituted by random key numbers in the hybrid genetic-local search algorithm (GAspLA). Random key numbers are frequently used in genetic algorithms, but they create additional difficulty when local search is hybridized with the genetic algorithm. We developed algorithms that adapt the results of the local search back into the genetic algorithm with a minimum relocation of the genes' random key numbers; this is the second contribution of the paper. The third contribution is three new MIP models that perform splitting and scheduling simultaneously. The fourth contribution is the implementation of GAspLAMIP, which lets us verify the optimality of GAspLA for the studied combinations. The proposed methods are tested on a set of problems taken from the literature, and the results validate the effectiveness of the proposed algorithms. PMID:24977204

  3. Hybrid Cryptosystem Using Tiny Encryption Algorithm and LUC Algorithm

    NASA Astrophysics Data System (ADS)

    Rachmawati, Dian; Sharif, Amer; Jaysilen; Andri Budiman, Mohammad

    2018-01-01

    Security is a very important issue in data transmission, and there are many methods for making files more secure; one of them is cryptography. Cryptography secures a file by encoding it so that anyone without the key cannot decrypt the hidden code to read the original file. Among the many cryptographic methods is the hybrid cryptosystem: a symmetric algorithm secures the file, and an asymmetric algorithm secures the symmetric algorithm's key. In this research, the TEA algorithm is used as the symmetric algorithm and the LUC algorithm as the asymmetric algorithm. The system is tested by encrypting and decrypting a file with the TEA algorithm and using the LUC algorithm to encrypt and decrypt the TEA key. The results show that, when the TEA algorithm encrypts the file, the ciphertext consists of ASCII (American Standard Code for Information Interchange) characters represented as hexadecimal numbers, and the ciphertext size increases by sixteen bytes for every eight additional plaintext characters.
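
    The sketch below illustrates only the hybrid pattern: a pure-Python TEA block cipher encrypts a data block, and a public-key step protects the 128-bit TEA key. Textbook RSA with toy parameters stands in for LUC purely for illustration; the parameters are far too small to be secure, and nothing here is the authors' implementation.

      DELTA, MASK = 0x9E3779B9, 0xFFFFFFFF

      def tea_encrypt(block, key):
          v0, v1 = block
          k0, k1, k2, k3 = key
          s = 0
          for _ in range(32):
              s = (s + DELTA) & MASK
              v0 = (v0 + (((v1 << 4) + k0) ^ (v1 + s) ^ ((v1 >> 5) + k1))) & MASK
              v1 = (v1 + (((v0 << 4) + k2) ^ (v0 + s) ^ ((v0 >> 5) + k3))) & MASK
          return v0, v1

      def tea_decrypt(block, key):
          v0, v1 = block
          k0, k1, k2, k3 = key
          s = (DELTA * 32) & MASK
          for _ in range(32):
              v1 = (v1 - (((v0 << 4) + k2) ^ (v0 + s) ^ ((v0 >> 5) + k3))) & MASK
              v0 = (v0 - (((v1 << 4) + k0) ^ (v1 + s) ^ ((v1 >> 5) + k1))) & MASK
              s = (s - DELTA) & MASK
          return v0, v1

      # Toy RSA key pair standing in for the asymmetric (LUC) step.
      p, q, exp = 61, 53, 17
      modulus = p * q
      priv = pow(exp, -1, (p - 1) * (q - 1))

      tea_key = (0x01234567, 0x89ABCDEF, 0xFEDCBA98, 0x76543210)
      plaintext_block = (0xDEADBEEF, 0x00C0FFEE)
      ciphertext = tea_encrypt(plaintext_block, tea_key)

      # Protect the symmetric key: encrypt each byte of the TEA key with the public key.
      key_bytes = b"".join(w.to_bytes(4, "big") for w in tea_key)
      encrypted_key = [pow(b, exp, modulus) for b in key_bytes]

      # Receiver: recover the TEA key with the private exponent, then decrypt the block.
      recovered = bytes(pow(c, priv, modulus) for c in encrypted_key)
      recovered_key = tuple(int.from_bytes(recovered[i:i + 4], "big") for i in range(0, 16, 4))
      print(tea_decrypt(ciphertext, recovered_key) == plaintext_block)   # True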

  4. A comparison between two algorithms for the retrieval of soil moisture using AMSR-E data

    USDA-ARS?s Scientific Manuscript database

    A comparison between two algorithms for estimating soil moisture with microwave satellite data was carried out by using the datasets collected on the four Agricultural Research Service (ARS) watershed sites in the US from 2002 to 2009. These sites collectively represent a wide range of ground condit...

  5. Genetic algorithm optimized rainfall-runoff fuzzy inference system for row crop watersheds with claypan soils

    USDA-ARS?s Scientific Manuscript database

    The fuzzy logic algorithm has the ability to describe knowledge in a descriptive human-like manner in the form of simple rules using linguistic variables, and provides a new way of modeling uncertain or naturally fuzzy hydrological processes like non-linear rainfall-runoff relationships. Fuzzy infe...

  6. Feature detection on 3D images of dental imprints

    NASA Astrophysics Data System (ADS)

    Mokhtari, Marielle; Laurendeau, Denis

    1994-09-01

    A computer vision approach for the extraction of feature points on 3D images of dental imprints is presented. The positions of the feature points are needed for the measurement of a set of parameters for automatic diagnosis of malocclusion problems in orthodontics. The system for the acquisition of the 3D profile of the imprint, the procedure for the detection of the interstices between teeth, and the approach for the identification of the type of tooth are described, as well as the algorithm for the reconstruction of the surface of each type of tooth. A new approach for the detection of feature points, called the watershed algorithm, is described in detail. The algorithm is a two-stage procedure which tracks the positions of local minima at four different scales and produces a final map of the positions of the minima. Experimental results of the application of the watershed algorithm on actual 3D images of dental imprints are presented for molars, premolars and canines. The segmentation approach for the analysis of the shape of incisors is also described in detail.
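
    A rough sketch of the multi-scale minima-tracking step on a synthetic range image: the surface is smoothed at four Gaussian scales, from coarse to fine, and strict local minima are collected at each scale. The synthetic surface, scale values and neighbourhood size are assumptions for illustration, not the authors' implementation.

      import numpy as np
      from scipy.ndimage import gaussian_filter, minimum_filter

      # Synthetic stand-in for a range image of a dental imprint (z = depth).
      y, x = np.mgrid[0:128, 0:128]
      z = np.sin(x / 9.0) * np.cos(y / 11.0) + 0.05 * np.random.default_rng(0).standard_normal((128, 128))

      def local_minima(surface, size=5):
          """Boolean mask of local minima within a (size x size) neighbourhood."""
          return surface == minimum_filter(surface, size=size)

      # Track candidate feature points (local minima) across four smoothing scales.
      for sigma in (8.0, 4.0, 2.0, 1.0):
          smoothed = gaussian_filter(z, sigma=sigma)
          rows, cols = np.nonzero(local_minima(smoothed))
          print(f"sigma={sigma:4.1f}: {len(rows)} candidate minima")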

  7. Flood predictions using the parallel version of distributed numerical physical rainfall-runoff model TOPKAPI

    NASA Astrophysics Data System (ADS)

    Boyko, Oleksiy; Zheleznyak, Mark

    2015-04-01

    The original numerical code TOPKAPI-IMMS of the distributed rainfall-runoff model TOPKAPI (Todini et al., 1996-2014) has been developed and implemented in Ukraine. A parallel version of the code has recently been developed for use on multiprocessor systems - multicore PCs and clusters. The algorithm is based on a binary-tree decomposition of the watershed to balance the computational load across processors/cores. The Message Passing Interface (MPI) is used as the parallel computing framework. The numerical efficiency of the parallelization is demonstrated in case studies of flood prediction for mountain watersheds of the Ukrainian Carpathian region. The modeling results are compared with predictions based on lumped-parameter models.

  8. LOADING SIMULATION PROGRAM C

    EPA Pesticide Factsheets

    LSPC is the Loading Simulation Program in C++, a watershed modeling system that includes streamlined Hydrologic Simulation Program Fortran (HSPF) algorithms for simulating hydrology, sediment, and general water quality

  9. Using the cloud to speed-up calibration of watershed-scale hydrologic models (Invited)

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Ercan, M. B.; Castronova, A. M.; Humphrey, M.; Beekwilder, N.; Steele, J.; Kim, I.

    2013-12-01

    This research focuses on using the cloud to address computational challenges associated with hydrologic modeling. One example is calibration of a watershed-scale hydrologic model, which can take days of execution time on typical computers. While parallel algorithms for model calibration exist and some researchers have used multi-core computers or clusters to run these algorithms, these solutions do not fully address the challenge because (i) calibration can still be too time consuming even on multicore personal computers and (ii) few in the community have the time and expertise needed to manage a compute cluster. Given this, another option for addressing this challenge that we are exploring through this work is the use of the cloud for speeding-up calibration of watershed-scale hydrologic models. The cloud used in this capacity provides a means for renting a specific number and type of machines for only the time needed to perform a calibration model run. The cloud allows one to precisely balance the duration of the calibration with the financial costs so that, if the budget allows, the calibration can be performed more quickly by renting more machines. Focusing specifically on the SWAT hydrologic model and a parallel version of the DDS calibration algorithm, we show significant speed-up time across a range of watershed sizes using up to 256 cores to perform a model calibration. The tool provides a simple web-based user interface and the ability to monitor the calibration job submission process during the calibration process. Finally this talk concludes with initial work to leverage the cloud for other tasks associated with hydrologic modeling including tasks related to preparing inputs for constructing place-based hydrologic models.

  10. Applicability of Hydrologic Landscapes for Model Calibration ...

    EPA Pesticide Factsheets

    The Pacific Northwest Hydrologic Landscapes (PNW HL) at the assessment unit scale has provided a solid conceptual classification framework to relate and transfer hydrologically meaningful information between watersheds without access to streamflow time series. A collection of techniques were applied to the HL assessment unit composition in watersheds across the Pacific Northwest to aggregate the hydrologic behavior of the Hydrologic Landscapes from the assessment unit scale to the watershed scale. This non-trivial solution both emphasizes HL classifications within the watershed that provide that majority of moisture surplus/deficit and considers the relative position (upstream vs. downstream) of these HL classifications. A clustering algorithm was applied to the HL-based characterization of assessment units within 185 watersheds to help organize watersheds into nine classes hypothesized to have similar hydrologic behavior. The HL-based classes were used to organize and describe hydrologic behavior information about watershed classes and both predictions and validations were independently performed with regard to the general magnitude of six hydroclimatic signature values. A second cluster analysis was then performed using the independently calculated signature values as similarity metrics, and it was found that the six signature clusters showed substantial overlap in watershed class membership to those in the HL-based classes. One hypothesis set forward from thi

  11. Watershed reliability, resilience and vulnerability analysis under uncertainty using water quality data.

    PubMed

    Hoque, Yamen M; Tripathi, Shivam; Hantush, Mohamed M; Govindaraju, Rao S

    2012-10-30

    A method for assessment of watershed health is developed by employing measures of reliability, resilience and vulnerability (R-R-V) using stream water quality data. Observed water quality data are usually sparse, so that a water quality time-series is often reconstructed using surrogate variables (streamflow). A Bayesian algorithm based on relevance vector machine (RVM) was employed to quantify the error in the reconstructed series, and a probabilistic assessment of watershed status was conducted based on established thresholds for various constituents. As an application example, observed water quality data for several constituents at different monitoring points within the Cedar Creek watershed in north-east Indiana (USA) were utilized. Considering uncertainty in the data for the period 2002-2007, the R-R-V analysis revealed that the Cedar Creek watershed tends to be in compliance with respect to selected pesticides, ammonia and total phosphorus. However, the watershed was found to be prone to violations of sediment standards. Ignoring uncertainty in the water quality time-series led to misleading results especially in the case of sediments. Results indicate that the methods presented in this study may be used for assessing the effects of different stressors over a watershed. The method shows promise as a management tool for assessing watershed health. Copyright © 2012 Elsevier Ltd. All rights reserved.
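
    The sketch below shows how the three R-R-V measures can be computed from a single water-quality series against a fixed threshold: reliability is the fraction of time in compliance, resilience the probability of returning to compliance after a violation, and vulnerability the mean exceedance during violations. The Bayesian reconstruction and uncertainty propagation of the paper are omitted, and the series and threshold are hypothetical.

      import numpy as np

      def rrv(series, threshold):
          """Reliability, resilience and vulnerability for a 'stay below threshold' criterion."""
          series = np.asarray(series, dtype=float)
          ok = series <= threshold                          # satisfactory state
          fail = ~ok
          reliability = ok.mean()
          recoveries = np.sum(fail[:-1] & ok[1:])           # failure followed by recovery
          resilience = recoveries / max(int(fail[:-1].sum()), 1)
          vulnerability = float(np.mean(series[fail] - threshold)) if fail.any() else 0.0
          return reliability, resilience, vulnerability

      rng = np.random.default_rng(1)
      concentration = rng.lognormal(mean=0.0, sigma=0.5, size=365)   # hypothetical daily series
      print(rrv(concentration, threshold=1.5))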

  12. SAR image change detection using watershed and spectral clustering

    NASA Astrophysics Data System (ADS)

    Niu, Ruican; Jiao, L. C.; Wang, Guiting; Feng, Jie

    2011-12-01

    A new method for change detection in SAR images based on spectral clustering is presented in this paper. Spectral clustering is employed to extract change information from a pair of images acquired over the same geographical area at different times. The watershed transform is applied first to segment the large image into non-overlapping local regions, which reduces the computational complexity. Experimental results and analysis confirm the effectiveness of the proposed algorithm.

  13. Classification of Medical Datasets Using SVMs with Hybrid Evolutionary Algorithms Based on Endocrine-Based Particle Swarm Optimization and Artificial Bee Colony Algorithms.

    PubMed

    Lin, Kuan-Cheng; Hsieh, Yi-Hsiu

    2015-10-01

    The classification and analysis of data is an important issue in today's research. Selecting a suitable set of features makes it possible to classify an enormous quantity of data quickly and efficiently. Feature selection is generally viewed as a problem of feature subset selection, such as combination optimization problems. Evolutionary algorithms using random search methods have proven highly effective in obtaining solutions to problems of optimization in a diversity of applications. In this study, we developed a hybrid evolutionary algorithm based on endocrine-based particle swarm optimization (EPSO) and artificial bee colony (ABC) algorithms in conjunction with a support vector machine (SVM) for the selection of optimal feature subsets for the classification of datasets. The results of experiments using specific UCI medical datasets demonstrate that the accuracy of the proposed hybrid evolutionary algorithm is superior to that of basic PSO, EPSO and ABC algorithms, with regard to classification accuracy using subsets with a reduced number of features.

  14. A hybrid monkey search algorithm for clustering analysis.

    PubMed

    Chen, Xin; Zhou, Yongquan; Luo, Qifang

    2014-01-01

    Clustering is a popular data analysis and data mining technique. The k-means clustering algorithm is one of the most commonly used methods; however, it depends strongly on the initial solution and easily falls into a local optimum. In view of these disadvantages of the k-means method, this paper proposes a hybrid monkey algorithm based on the search operator of the artificial bee colony algorithm for clustering analysis. Experiments on synthetic and real-life datasets show that the proposed algorithm performs better than the basic monkey algorithm for clustering analysis.

  15. Hybrid modeling approach for the northern Adriatic watershed management.

    PubMed

    Volf, Goran; Atanasova, Nataša; Škerjanec, Mateja; Ožanić, Nevenka

    2018-04-23

    Northern Adriatic (NA) is one of the most productive parts of the Mediterranean Sea due to vast nutrient discharges from the contributing watershed. To better understand the excess of nutrients as stressors to the state of the marine ecosystem, a hybrid modeling approach following the DPSIR framework and terminology was developed, linking: 1) the AVGWLF model for modeling the pressures, i.e. nutrients originating from the watershed caused by two major drivers (urbanization and agriculture), 2) the ML tool MTSMOTI for inducing a model tree connecting the pressures with the marine ecosystem state, and 3) the water quality index TRIX to evaluate the trophic state of the marine ecosystem. Data used for the modeling comprised GIS layers (i.e., digital terrain model, land use/cover data, soil map, locations of hydro-meteorological stations and WWTPs), time series data (i.e., hydro-meteorological data and nutrient concentrations), and statistical data (i.e., number of inhabitants, connections to wastewater treatment, livestock statistics, etc.), as well as physical, chemical and biological parameters measured at six marine water monitoring stations located between the Po River delta (Italy) and the city of Rovinj (west Istrian coast, Croatia). Using the model, seven watershed management scenarios related to wastewater treatment and agricultural activities were evaluated for their influence on the state of the NA marine ecosystem. According to the results, the gradual implementation of the UWWTD over the last 10 years contributed significantly to the preservation and improvement of the NA marine ecosystem state. However, despite the full implementation of the UWWTD, the state of the NA marine ecosystem could deteriorate in case of increased nutrient loads from agriculture. Since the UWWTD is already close to its full implementation, NA watershed management should focus on controlling agricultural activities in order to maintain the 'high' state of the NA marine ecosystem. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Hybrid employment recommendation algorithm based on Spark

    NASA Astrophysics Data System (ADS)

    Li, Zuoquan; Lin, Yubei; Zhang, Xingming

    2017-08-01

    Aiming at the real-time application of the collaborative filtering employment recommendation algorithm (CF), a clustering collaborative filtering recommendation algorithm (CCF) is developed, which applies hierarchical clustering to CF and narrows the query range of neighbour items. In addition, to solve the cold-start problem of the content-based recommendation algorithm (CB), a content-based algorithm with users' information (CBUI) is introduced for job recommendation. Furthermore, a hybrid recommendation algorithm (HRA) combining the CCF and CBUI algorithms is proposed and implemented on the Spark platform. The experimental results show that HRA can overcome the problems of cold start and data sparsity, and achieves good recommendation accuracy and scalability for employment recommendation.

  17. Parameter Estimation for a Hybrid Adaptive Flight Controller

    NASA Technical Reports Server (NTRS)

    Campbell, Stefan F.; Nguyen, Nhan T.; Kaneshige, John; Krishnakumar, Kalmanje

    2009-01-01

    This paper expands on the hybrid control architecture developed at the NASA Ames Research Center by addressing issues related to indirect adaptation using the recursive least squares (RLS) algorithm. Specifically, the hybrid control architecture is an adaptive flight controller that features both direct and indirect adaptation techniques. This paper will focus almost exclusively on the modifications necessary to achieve quality indirect adaptive control. Additionally this paper will present results that, using a full non-linear aircraft model, demonstrate the effectiveness of the hybrid control architecture given drastic changes in an aircraft's dynamics. Throughout the development of this topic, a thorough discussion of the RLS algorithm as a system identification technique will be provided along with results from seven well-known modifications to the popular RLS algorithm.
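
    As background for the indirect-adaptation step discussed above, the sketch below implements plain recursive least squares with exponential forgetting for a linear-in-parameters model; the two-parameter plant, forgetting factor and covariance initialisation are illustrative assumptions, and none of the seven RLS modifications mentioned in the paper are shown.

      import numpy as np

      def rls_identify(phi, y, lam=0.995, delta=1000.0):
          """Recursive least squares with forgetting factor lam; phi is (N, p), y is (N,)."""
          n, p = phi.shape
          theta = np.zeros(p)                    # parameter estimate
          P = delta * np.eye(p)                  # covariance matrix
          for k in range(n):
              x = phi[k]
              err = y[k] - x @ theta
              gain = P @ x / (lam + x @ P @ x)
              theta = theta + gain * err
              P = (P - np.outer(gain, x) @ P) / lam
          return theta

      # Hypothetical plant: y(k) = 1.8*u(k) - 0.7*u(k-1) + noise
      rng = np.random.default_rng(0)
      u = rng.standard_normal(500)
      phi = np.column_stack([u, np.roll(u, 1)])
      y = 1.8 * phi[:, 0] - 0.7 * phi[:, 1] + 0.05 * rng.standard_normal(500)
      print("estimated parameters:", np.round(rls_identify(phi, y), 3))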

  18. A Methodology for the Hybridization Based in Active Components: The Case of cGA and Scatter Search

    PubMed Central

    Alba, Enrique; Leguizamón, Guillermo

    2016-01-01

    This work presents the results of a new methodology for hybridizing metaheuristics. By first locating the active components (parts) of one algorithm and then inserting them into a second one, we can build efficient and accurate optimization, search, and learning algorithms. This gives a concrete way of constructing new techniques that contrasts with the widespread ad hoc way of hybridizing. In this paper, the enhanced algorithm is a Cellular Genetic Algorithm (cGA), which has been successfully used in the past to find solutions to hard optimization problems. In order to extend and corroborate the use of active components as an emerging hybridization methodology, we propose here the use of active components taken from Scatter Search (SS) to improve the cGA. The results obtained over a varied set of benchmarks are highly satisfactory in efficacy and efficiency when compared with a standard cGA. Moreover, the proposed hybrid approach (i.e., cGA+SS) has shown encouraging results with regard to earlier applications of our methodology. PMID:27403153

  19. Battery algorithm verification and development using hardware-in-the-loop testing

    NASA Astrophysics Data System (ADS)

    He, Yongsheng; Liu, Wei; Koch, Brain J.

    Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, which is a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO4) cathode, which is an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate future battery algorithm development and improvement, and to reduce hybrid/electric vehicle system development time and costs.

  20. A Hybrid Algorithm for Non-negative Matrix Factorization Based on Symmetric Information Divergence

    PubMed Central

    Devarajan, Karthik; Ebrahimi, Nader; Soofi, Ehsan

    2017-01-01

    The objective of this paper is to provide a hybrid algorithm for non-negative matrix factorization based on a symmetric version of Kullback-Leibler divergence, known as intrinsic information. The convergence of the proposed algorithm is shown for several members of the exponential family such as the Gaussian, Poisson, gamma and inverse Gaussian models. The speed of this algorithm is examined and its usefulness is illustrated through some applied problems. PMID:28868206
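
    For context, the sketch below implements the standard multiplicative-update NMF for the one-sided Kullback-Leibler divergence; the paper's hybrid updates for the symmetric (intrinsic information) divergence are not reproduced, and the data, rank and iteration count are arbitrary.

      import numpy as np

      def nmf_kl(V, rank, n_iter=200, eps=1e-9, seed=0):
          """Lee-Seung multiplicative updates for KL-divergence NMF: V ~ W @ H."""
          rng = np.random.default_rng(seed)
          n, m = V.shape
          W = rng.random((n, rank)) + eps
          H = rng.random((rank, m)) + eps
          for _ in range(n_iter):
              WH = W @ H + eps
              H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + eps)
              WH = W @ H + eps
              W *= ((V / WH) @ H.T) / (H.sum(axis=1)[None, :] + eps)
          return W, H

      V = np.random.default_rng(1).random((30, 20))
      W, H = nmf_kl(V, rank=4)
      WH = W @ H
      kl = np.sum(V * np.log((V + 1e-9) / (WH + 1e-9)) - V + WH)
      print("KL divergence after fitting:", round(float(kl), 4))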

  1. Differential evolution-simulated annealing for multiple sequence alignment

    NASA Astrophysics Data System (ADS)

    Addawe, R. C.; Addawe, J. M.; Sueño, M. R. K.; Magadia, J. C.

    2017-10-01

    Multiple sequence alignments (MSA) are used in the analysis of molecular evolution and sequence structure relationships. In this paper, a hybrid algorithm, Differential Evolution - Simulated Annealing (DESA) is applied in optimizing multiple sequence alignments (MSAs) based on structural information, non-gaps percentage and totally conserved columns. DESA is a robust algorithm characterized by self-organization, mutation, crossover, and SA-like selection scheme of the strategy parameters. Here, the MSA problem is treated as a multi-objective optimization problem of the hybrid evolutionary algorithm, DESA. Thus, we name the algorithm as DESA-MSA. Simulated sequences and alignments were generated to evaluate the accuracy and efficiency of DESA-MSA using different indel sizes, sequence lengths, deletion rates and insertion rates. The proposed hybrid algorithm obtained acceptable solutions particularly for the MSA problem evaluated based on the three objectives.

  2. An efficient hybrid method for stochastic reaction-diffusion biochemical systems with delay

    NASA Astrophysics Data System (ADS)

    Sayyidmousavi, Alireza; Ilie, Silvana

    2017-12-01

    Many chemical reactions, such as gene transcription and translation in living cells, need a certain time to finish once they are initiated. Simulating stochastic models of reaction-diffusion systems with delay can be computationally expensive. In the present paper, a novel hybrid algorithm is proposed to accelerate the stochastic simulation of delayed reaction-diffusion systems. The delayed reactions may be of consuming or non-consuming delay type. The algorithm is designed for moderately stiff systems in which the events can be partitioned into slow and fast subsets according to their propensities. The proposed algorithm is applied to three benchmark problems and the results are compared with those of the delayed Inhomogeneous Stochastic Simulation Algorithm. The numerical results show that the new hybrid algorithm achieves considerable speed-up in the run time and very good accuracy.

  3. A Network Selection Algorithm Considering Power Consumption in Hybrid Wireless Networks

    NASA Astrophysics Data System (ADS)

    Joe, Inwhee; Kim, Won-Tae; Hong, Seokjoon

    In this paper, we propose a novel network selection algorithm considering power consumption in hybrid wireless networks for vertical handover. CDMA, WiBro and WLAN networks are the candidate networks for this selection algorithm, which is composed of a power consumption prediction algorithm and a final network selection algorithm. The power consumption prediction algorithm estimates the expected lifetime of the mobile station based on the current battery level, traffic class and power consumption of each network interface card of the mobile station. If the expected lifetime of the mobile station in a certain network is not long enough compared with the handover delay, this network is removed from the candidate network list, thereby preventing unnecessary handovers in the preprocessing procedure. The final network selection algorithm consists of AHP (Analytic Hierarchy Process) and GRA (Grey Relational Analysis). The global factors of the network selection structure are QoS, cost and lifetime. If the user preference is lifetime, our selection algorithm selects the network that offers the longest service duration due to low power consumption. We also conduct simulations using the OPNET simulation tool. The simulation results show that the proposed algorithm provides longer lifetime in the hybrid wireless network environment.
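
    A small sketch of the AHP-plus-GRA selection step: the criteria weights come from the principal eigenvector of an assumed pairwise comparison matrix, and each candidate network is scored by its grey relational grade with respect to the ideal alternative. The candidate figures, judgement matrix and distinguishing coefficient are illustrative assumptions, not values from the paper.

      import numpy as np

      # Candidate networks (hypothetical figures): rows = CDMA, WiBro, WLAN;
      # columns = QoS score (benefit), monetary cost (cost), expected lifetime in hours (benefit).
      X = np.array([[0.6, 0.8, 5.0],
                    [0.8, 0.5, 3.5],
                    [0.9, 0.2, 2.0]])
      benefit = np.array([True, False, True])

      # AHP: criteria weights from the principal eigenvector of an assumed judgement matrix.
      A = np.array([[1.0, 3.0, 0.5],
                    [1/3, 1.0, 0.25],
                    [2.0, 4.0, 1.0]])
      vals, vecs = np.linalg.eig(A)
      w = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
      w = w / w.sum()

      # GRA: grey relational grade of each candidate against the ideal (all-ones) sequence.
      span = X.max(0) - X.min(0)
      norm = np.where(benefit, (X - X.min(0)) / span, (X.max(0) - X) / span)
      delta = np.abs(1.0 - norm)
      rho = 0.5                                    # distinguishing coefficient
      xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
      grade = xi @ w
      print("weights:", np.round(w, 3), "grades:", np.round(grade, 3),
            "-> select network index", int(np.argmax(grade)))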

  4. A new effective operator for the hybrid algorithm for solving global optimisation problems

    NASA Astrophysics Data System (ADS)

    Duc, Le Anh; Li, Kenli; Nguyen, Tien Trong; Yen, Vu Minh; Truong, Tung Khac

    2018-04-01

    Hybrid algorithms have recently been used to solve complex single-objective optimisation problems. The ultimate goal is to find an optimised global solution by using these algorithms. Based on the existing algorithms (HP_CRO, PSO, RCCRO), this study proposes a new hybrid algorithm called MPC (Mean-PSO-CRO), which utilises a new Mean-Search Operator. By employing this new operator, the proposed algorithm improves the search ability in areas of the solution space that the operators of previous algorithms do not explore. Specifically, the Mean-Search Operator helps find better solutions in comparison with other algorithms. Moreover, we propose two parameters for balancing local and global search, as well as for balancing between the various types of local search, and we introduce three versions of this operator that use different constraints. The experimental results on 23 benchmark functions used in previous works show that our framework can find better optimal or close-to-optimal solutions with faster convergence for most of the benchmark functions, especially the high-dimensional ones. Thus, the proposed algorithm is more effective in solving single-objective optimisation problems than the other existing algorithms.

  5. Inclusion of glacier processes for distributed hydrological modeling at basin scale with application to a watershed in Tianshan Mountains, northwest China

    USDA-ARS?s Scientific Manuscript database

    In this paper we proposed: (1) an algorithm of glacier melt, sublimation/evaporation, accumulation, mass balance and retreat; (2) a dynamic Hydrological Response Unit approach for incorporating the algorithm into the Soil and Water Assessment Tool (SWAT) model; and (3) simulated the transient glacie...

  6. Efficient hybrid evolutionary algorithm for optimization of a strip coiling process

    NASA Astrophysics Data System (ADS)

    Pholdee, Nantiwat; Park, Won-Woong; Kim, Dong-Kyu; Im, Yong-Taek; Bureerat, Sujin; Kwon, Hyuck-Cheol; Chun, Myung-Sik

    2015-04-01

    This article proposes an efficient metaheuristic based on hybridization of teaching-learning-based optimization and differential evolution for optimization to improve the flatness of a strip during a strip coiling process. Differential evolution operators were integrated into the teaching-learning-based optimization with a Latin hypercube sampling technique for generation of an initial population. The objective function was introduced to reduce axial inhomogeneity of the stress distribution and the maximum compressive stress calculated by Love's elastic solution within the thin strip, which may cause an irregular surface profile of the strip during the strip coiling process. The hybrid optimizer and several well-established evolutionary algorithms (EAs) were used to solve the optimization problem. The comparative studies show that the proposed hybrid algorithm outperformed other EAs in terms of convergence rate and consistency. It was found that the proposed hybrid approach was powerful for process optimization, especially with a large-scale design problem.

  7. Parameter estimation for chaotic systems using a hybrid adaptive cuckoo search with simulated annealing algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheng, Zheng, E-mail: 19994035@sina.com; Wang, Jun; Zhou, Bihua

    2014-03-15

    This paper introduces a novel hybrid optimization algorithm to estimate the parameters of chaotic systems. In order to deal with the weaknesses of the traditional cuckoo search algorithm, an adaptive cuckoo search with simulated annealing algorithm is presented, which incorporates an adaptive parameter-adjusting operation and a simulated annealing operation into the cuckoo search algorithm. Normally, the parameters of the cuckoo search algorithm are kept constant, which may decrease the efficiency of the algorithm. For the purpose of balancing and enhancing the accuracy and convergence rate of the cuckoo search algorithm, the adaptive operation is presented to tune the parameters properly. In addition, the local search capability of the cuckoo search algorithm is relatively weak, which may decrease the quality of the optimization, so the simulated annealing operation is merged into the cuckoo search algorithm to enhance the local search ability and improve the accuracy and reliability of the results. The functionality of the proposed hybrid algorithm is investigated through the Lorenz chaotic system under noiseless and noisy conditions, respectively. The numerical results demonstrate that the method can estimate parameters efficiently and accurately in both conditions. Finally, the results are compared with those of the traditional cuckoo search algorithm, a genetic algorithm, and a particle swarm optimization algorithm. Simulation results demonstrate the effectiveness and superior performance of the proposed algorithm.

  8. Shape classification of malignant lymphomas and leukemia by morphological watersheds and ARMA modeling

    NASA Astrophysics Data System (ADS)

    Celenk, Mehmet; Song, Yinglei; Ma, Limin; Zhou, Min

    2003-05-01

    A new algorithm that can automatically recognize and classify malignant lymphomas and leukemia is proposed in this paper. The algorithm utilizes the morphological watershed to extract boundaries of cells from their grey-level images. It generates a sequence of Euclidean distances by selecting pixels in clockwise order along the boundary of the cell and calculating the Euclidean distance of each selected pixel from the centroid of the cell. A feature vector for each cell is then obtained by applying an auto-regressive moving-average (ARMA) model to the generated sequence of Euclidean distances. The clustering measure J3 = trace{Sw^{-1} Sm}, involving the within-class (Sw) and mixed (Sm) class-scattering matrices, is computed for both cell classes to provide insight into the extent to which the different cell classes in the training data are separated. Our test results suggest that the algorithm is highly accurate for the development of an interactive, computer-assisted diagnosis (CAD) tool.
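
    The sketch below strings the main ingredients together on a synthetic two-cell image: a marker-based watershed on the distance transform separates touching cells, a clockwise centroid-to-boundary distance signature is sampled for each cell, and least-squares AR coefficients stand in for the ARMA feature vector. The synthetic image, sampling density and model order are assumptions, not the authors' setup.

      import numpy as np
      from scipy import ndimage
      from skimage.feature import peak_local_max
      from skimage.segmentation import watershed

      # Synthetic binary "cell" image: two overlapping discs.
      yy, xx = np.ogrid[:120, :120]
      img = ((yy - 60) ** 2 + (xx - 45) ** 2 < 28 ** 2) | ((yy - 60) ** 2 + (xx - 80) ** 2 < 24 ** 2)

      # Morphological watershed on the negated distance transform separates the touching cells.
      dist = ndimage.distance_transform_edt(img)
      peaks = peak_local_max(dist, min_distance=10, num_peaks=2)
      markers = np.zeros_like(dist, dtype=int)
      markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
      labels = watershed(-dist, markers, mask=img)

      def boundary_signature(mask, n_points=128):
          """Centroid-to-boundary Euclidean distances sampled clockwise (shape signature)."""
          cy, cx = ndimage.center_of_mass(mask)
          by, bx = np.nonzero(mask ^ ndimage.binary_erosion(mask))
          order = np.argsort(-np.arctan2(by - cy, bx - cx))        # clockwise order
          d = np.hypot(by - cy, bx - cx)[order]
          idx = np.linspace(0, len(d) - 1, n_points).astype(int)
          return d[idx]

      def ar_features(signal, order=6):
          """Least-squares AR(p) coefficients as a stand-in for the ARMA feature vector."""
          Xm = np.column_stack([signal[i:len(signal) - order + i] for i in range(order)])
          coef, *_ = np.linalg.lstsq(Xm, signal[order:], rcond=None)
          return coef

      for lbl in np.unique(labels)[1:]:
          sig = boundary_signature(labels == lbl)
          print(f"cell {lbl}: AR features {np.round(ar_features(sig), 3)}")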

  9. Hybrid dose calculation: a dose calculation algorithm for microbeam radiation therapy

    NASA Astrophysics Data System (ADS)

    Donzelli, Mattia; Bräuer-Krisch, Elke; Oelfke, Uwe; Wilkens, Jan J.; Bartzsch, Stefan

    2018-02-01

    Microbeam radiation therapy (MRT) is still a preclinical approach in radiation oncology that uses planar micrometre-wide beamlets with extremely high peak doses, separated by a few hundred micrometre wide low-dose regions. Abundant preclinical evidence demonstrates that MRT spares normal tissue more effectively than conventional radiation therapy, at equivalent tumour control. In order to launch first clinical trials, accurate and efficient dose calculation methods are an indispensable prerequisite. In this work a hybrid dose calculation approach is presented that is based on a combination of Monte Carlo and kernel based dose calculation. In various examples the performance of the algorithm is compared to purely Monte Carlo and purely kernel based dose calculations. The accuracy of the developed algorithm is comparable to conventional pure Monte Carlo calculations. In particular for inhomogeneous materials the hybrid dose calculation algorithm outperforms purely convolution based dose calculation approaches. It is demonstrated that the hybrid algorithm can efficiently calculate even complicated pencil beam and cross firing beam geometries. The required calculation times are substantially lower than for pure Monte Carlo calculations.

  10. Routing Algorithm based on Minimum Spanning Tree and Minimum Cost Flow for Hybrid Wireless-optical Broadband Access Network

    NASA Astrophysics Data System (ADS)

    Le, Zichun; Suo, Kaihua; Fu, Minglei; Jiang, Ling; Dong, Wen

    2012-03-01

    In order to minimize the average end-to-end delay for data transport in a hybrid wireless-optical broadband access network, a novel routing algorithm named MSTMCF (minimum spanning tree and minimum cost flow) is devised. The routing problem is described as a minimum spanning tree and minimum cost flow model, and the corresponding algorithm procedures are given. To verify the effectiveness of the MSTMCF algorithm, extensive simulations based on OWNS were carried out under different types of traffic sources.
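
    A minimal sketch of the two building blocks named in this record is given below, written in Python with the networkx library (an assumption; the paper's own simulations use OWNS). The toy topology, link weights, capacities, and traffic demands are made up for illustration: a minimum spanning tree supplies a low-delay backbone, and a minimum-cost-flow formulation then routes a demand across the network.

      import networkx as nx

      # Illustrative access-network topology; edge weights stand in for link delays.
      G = nx.Graph()
      G.add_weighted_edges_from([
          ("gw", "a", 2), ("gw", "b", 3), ("a", "b", 1),
          ("a", "c", 4), ("b", "c", 2), ("c", "d", 3),
      ])

      # Step 1: minimum spanning tree of the hybrid wireless-optical network.
      mst = nx.minimum_spanning_tree(G, weight="weight")
      print(sorted(mst.edges(data="weight")))

      # Step 2: minimum-cost flow on a directed copy with capacities and demands.
      D = nx.DiGraph()
      for u, v, w in G.edges(data="weight"):
          D.add_edge(u, v, weight=w, capacity=10)
          D.add_edge(v, u, weight=w, capacity=10)
      D.nodes["gw"]["demand"] = -5   # the gateway injects 5 units of traffic
      D.nodes["d"]["demand"] = 5     # the destination consumes 5 units
      flow = nx.min_cost_flow(D)     # nested dict: flow[u][v] = units routed on (u, v)
      print(flow)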

  11. A Comparison of Soil Moisture Retrieval Models Using SIR-C Measurements over the Little Washita River Watershed

    NASA Technical Reports Server (NTRS)

    Wang, J. R.; Hsu, A.; Shi, J. C.; ONeill, P. E.; Engman, E. T.

    1997-01-01

    Six SIR-C L-band measurements over the Little Washita River watershed in Chickasha, Oklahoma during 11-17 April 1994 have been analyzed for studying the change of soil moisture in the region. Two algorithms developed recently for estimation of moisture content in bare soil were applied to these measurements and the results were compared with those sampled on the ground. There is a good agreement between the values of soil moisture estimated by either one of the algorithms and those measured from ground sampling for bare or sparsely vegetated fields. The standard error from this comparison is on the order of 0.05-0.06 cu cm/cu cm, which is comparable to that expected from a regression between backscattering coefficients and measured soil moisture. Both algorithms provide a poor estimation of soil moisture or fail to give solutions to areas covered with moderate or dense vegetation. Even for bare soils the number of pixels that bear no numerical solution from the application of either one of the two algorithms to the data is not negligible. Results from using one of these algorithms indicate that the fraction of these pixels becomes larger as the bare soils become drier. The other algorithm generally gives a larger fraction of these pixels when the fields are vegetation-covered. The implication and impact of these features are discussed in this article.

  12. Hybrid sparse blind deconvolution: an implementation of SOOT algorithm to real data

    NASA Astrophysics Data System (ADS)

    Pakmanesh, Parvaneh; Goudarzi, Alireza; Kourki, Meisam

    2018-06-01

    Extracting information from seismic data depends on deconvolution as an important processing step; it provides the reflectivity series by signal compression. This compression can be obtained by removing the wavelet effects from the traces. Recently, blind deconvolution has provided reliable performance for sparse signal recovery. In this study, two deconvolution methods have been applied to seismic data; their combination provides a robust spiking deconvolution approach. This hybrid deconvolution is applied using the sparse deconvolution (MM algorithm) and the Smoothed-One-Over-Two (SOOT) algorithm in a chain. The MM algorithm is based on the minimization of a cost function defined by the l1 and l2 norms. After applying the two algorithms to the seismic data, the SOOT algorithm provided well-compressed data with a higher resolution than the MM algorithm. The SOOT algorithm requires initial values when applied to real data, such as the wavelet coefficients and the reflectivity series, which can be obtained through the MM algorithm. The computational cost of the hybrid method is high, so it should be applied to post-stack or pre-stack seismic data from regions with complex structure.

  13. Effective hybrid teaching-learning-based optimization algorithm for balancing two-sided assembly lines with multiple constraints

    NASA Astrophysics Data System (ADS)

    Tang, Qiuhua; Li, Zixiang; Zhang, Liping; Floudas, C. A.; Cao, Xiaojun

    2015-09-01

    Due to the NP-hardness of the two-sided assembly line balancing (TALB) problem, the multiple constraints that exist in real applications are less studied, especially when one task is involved in several constraints. In this paper, an effective hybrid algorithm is proposed to address the TALB problem with multiple constraints (TALB-MC). Considering the discrete nature of TALB-MC and the continuous nature of the standard teaching-learning-based optimization (TLBO) algorithm, the random-keys method is employed for the task permutation representation to bridge the gap between them. Subsequently, a special mechanism for handling multiple constraints is developed. In this mechanism, the direction constraint of each task is ensured by a direction check and adjustment. The zoning constraints and the synchronism constraints are satisfied by teasing out the hidden correlations among constraints. The positional constraint is allowed to be violated to some extent during decoding and is penalized in the cost function. Finally, with the TLBO seeking the global optimum, variable neighborhood search (VNS) is further hybridized to extend the local search space. The experimental results show that the proposed hybrid algorithm outperforms the late acceptance hill-climbing (LAHC) algorithm for TALB-MC in most cases, especially for large-size problems with multiple constraints, and demonstrates a good balance between exploration and exploitation. This research proposes an effective and efficient algorithm for solving the TALB-MC problem by hybridizing TLBO and VNS.
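
    The random-keys representation mentioned above is simple enough to show directly; the Python sketch below is a minimal illustration (not the authors' code) of how a continuous TLBO learner is decoded into a discrete task permutation by sorting its keys.

      import numpy as np

      def decode_random_keys(keys):
          # Task i carries the continuous key keys[i]; sorting the keys in
          # ascending order yields the order in which tasks are sequenced.
          return list(np.argsort(keys))

      # Example: a TLBO learner with five continuous keys in [0, 1].
      learner = np.array([0.71, 0.05, 0.92, 0.33, 0.48])
      print(decode_random_keys(learner))  # -> [1, 3, 4, 0, 2]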

  14. Incorporating uncertainty into the ranking of SPARROW model nutrient yields from Mississippi/Atchafalaya River basin watersheds

    USGS Publications Warehouse

    Robertson, Dale M.; Schwarz, Gregory E.; Saad, David A.; Alexander, Richard B.

    2009-01-01

    Excessive loads of nutrients transported by tributary rivers have been linked to hypoxia in the Gulf of Mexico. Management efforts to reduce the hypoxic zone in the Gulf of Mexico and improve the water quality of rivers and streams could benefit from targeting nutrient reductions toward watersheds with the highest nutrient yields delivered to sensitive downstream waters. One challenge is that most conventional watershed modeling approaches (e.g., mechanistic models) used in these management decisions do not consider uncertainties in the predictions of nutrient yields and their downstream delivery. The increasing use of parameter estimation procedures to statistically estimate model coefficients, however, allows uncertainties in these predictions to be reliably estimated. Here, we use a robust bootstrapping procedure applied to the results of a previous application of the hybrid statistical/mechanistic watershed model SPARROW (Spatially Referenced Regression On Watershed attributes) to develop a statistically reliable method for identifying “high priority” areas for management, based on a probabilistic ranking of delivered nutrient yields from watersheds throughout a basin. The method is designed to be used by managers to prioritize watersheds where additional stream monitoring and evaluations of nutrient-reduction strategies could be undertaken. Our ranking procedure incorporates information on the confidence intervals of model predictions and the corresponding watershed rankings of the delivered nutrient yields. From this quantified uncertainty, we estimate the probability that individual watersheds are among a collection of watersheds that have the highest delivered nutrient yields. We illustrate the application of the procedure to 818 eight-digit Hydrologic Unit Code watersheds in the Mississippi/Atchafalaya River basin by identifying 150 watersheds having the highest delivered nutrient yields to the Gulf of Mexico. Highest delivered yields were from watersheds in the Central Mississippi, Ohio, and Lower Mississippi River basins. With 90% confidence, only a few watersheds can be reliably placed into the highest 150 category; however, many more watersheds can be removed from consideration as not belonging to the highest 150 category. Results from this ranking procedure provide robust information on watershed nutrient yields that can benefit management efforts to reduce nutrient loadings to downstream coastal waters, such as the Gulf of Mexico, or to local receiving streams and reservoirs.
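
    The core of the probabilistic ranking can be illustrated with a short Python sketch; this is not the SPARROW bootstrap procedure itself, and the yields below are synthetic. It assumes an array of bootstrap replicates of delivered yields and estimates, for each watershed, the probability of belonging to the 150 highest-yield watersheds.

      import numpy as np

      def top_k_membership_probability(boot_yields, k):
          # boot_yields: (n_boot, n_watersheds) array of bootstrap replicates.
          # Returns, for each watershed, the fraction of replicates in which it
          # ranks among the k highest delivered yields.
          n_boot, n_ws = boot_yields.shape
          counts = np.zeros(n_ws)
          for rep in boot_yields:
              counts[np.argsort(rep)[-k:]] += 1
          return counts / n_boot

      # Synthetic example: 818 watersheds, 2000 bootstrap replicates, top-150 group.
      rng = np.random.default_rng(0)
      base = rng.lognormal(mean=0.0, sigma=1.0, size=818)
      boot = base * rng.lognormal(mean=0.0, sigma=0.3, size=(2000, 818))
      prob = top_k_membership_probability(boot, k=150)
      print("watersheds in the top 150 with >= 90% confidence:", int((prob >= 0.9).sum()))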

  15. A multi-characteristic based algorithm for classifying vegetation in a plateau area: Qinghai Lake watershed, northwestern China

    NASA Astrophysics Data System (ADS)

    Ma, Weiwei; Gong, Cailan; Hu, Yong; Li, Long; Meng, Peng

    2015-10-01

    Remote sensing technology has been broadly recognized for its convenience and efficiency in mapping vegetation, particularly in high-altitude and inaccessible areas that lack in-situ observations. In this study, Landsat Thematic Mapper (TM) images and Chinese environmental mitigation satellite CCD sensor (HJ-1 CCD) images, both at 30 m spatial resolution, were employed for identifying and monitoring vegetation types in an area of western China, the Qinghai Lake Watershed (QHLW). A decision classification tree (DCT) algorithm using multiple characteristics, including seasonal TM/HJ-1 CCD time series data combined with a digital elevation model (DEM) dataset, and a supervised maximum likelihood classification (MLC) algorithm using a single-date TM image were applied to vegetation classification. The accuracy of the two algorithms was assessed using field observation data. Based on the produced vegetation classification maps, the DCT using multi-season data and geomorphologic parameters was superior to the MLC algorithm using a single-date image, improving the overall accuracy by 11.86% at the second class level and significantly reducing the "salt and pepper" noise. The DCT algorithm applied to TM/HJ-1 CCD time series data and geomorphologic parameters appeared to be a valuable and reliable tool for monitoring vegetation at the first class level (5 vegetation classes) and the second class level (8 vegetation subclasses). The DCT algorithm using multiple characteristics may provide a theoretical basis and general approach to automatic extraction of vegetation types from remote sensing imagery over plateau areas.

  16. Improved hybridization of Fuzzy Analytic Hierarchy Process (FAHP) algorithm with Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW)

    NASA Astrophysics Data System (ADS)

    Zaiwani, B. E.; Zarlis, M.; Efendi, S.

    2018-03-01

    This research improves the hybridization of the Fuzzy Analytic Hierarchy Process (FAHP) algorithm with the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS), previously used to select the best bank chief inspector based on several qualitative and quantitative criteria with various priorities. To improve on that work, a hybridization of the FAHP algorithm with the Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW) algorithm was adopted, which applies the FAHP algorithm to the weighting process and SAW to the ranking process to determine employee promotion at a government institution. The improved average Efficiency Rate (ER) is 85.24%, compared with 77.82% in the previous research. Keywords: Ranking and Selection, Fuzzy AHP, Fuzzy TOPSIS, FMADM-SAW.
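
    The SAW ranking step is straightforward to illustrate. The Python sketch below assumes the criterion weights have already been produced (for example by FAHP, which is not reproduced here); the decision matrix, weights, and benefit/cost flags are hypothetical values for illustration only.

      import numpy as np

      def saw_rank(decision_matrix, weights, benefit):
          # Simple Additive Weighting: normalize each criterion column, then take a
          # weighted sum. benefit[j] is True when larger raw scores are better.
          X = np.asarray(decision_matrix, dtype=float)
          R = np.empty_like(X)
          for j in range(X.shape[1]):
              if benefit[j]:
                  R[:, j] = X[:, j] / X[:, j].max()   # benefit criterion
              else:
                  R[:, j] = X[:, j].min() / X[:, j]   # cost criterion
          scores = R @ np.asarray(weights, dtype=float)
          return scores, np.argsort(-scores)          # scores and ranking (best first)

      # Three hypothetical candidates scored on two benefit criteria and one cost criterion.
      scores, ranking = saw_rank([[80, 7, 12], [75, 9, 10], [90, 6, 15]],
                                 weights=[0.5, 0.3, 0.2],
                                 benefit=[True, True, False])
      print(scores, ranking)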

  17. Information filtering via a scaling-based function.

    PubMed

    Qiu, Tian; Zhang, Zi-Ke; Chen, Guang

    2013-01-01

    Finding a universal description of the algorithm optimization is one of the key challenges in personalized recommendation. In this article, for the first time, we introduce a scaling-based algorithm (SCL) independent of recommendation list length based on a hybrid algorithm of heat conduction and mass diffusion, by finding out the scaling function for the tunable parameter and object average degree. The optimal value of the tunable parameter can be abstracted from the scaling function, which is heterogeneous for the individual object. Experimental results obtained from three real datasets, Netflix, MovieLens and RYM, show that the SCL is highly accurate in recommendation. More importantly, compared with a number of excellent algorithms, including the mass diffusion method, the original hybrid method, and even an improved version of the hybrid method, the SCL algorithm remarkably promotes the personalized recommendation in three other aspects: solving the accuracy-diversity dilemma, presenting a high novelty, and solving the key challenge of cold start problem.

  18. Machining Parameters Optimization using Hybrid Firefly Algorithm and Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Farahlina Johari, Nur; Zain, Azlan Mohd; Haszlinna Mustaffa, Noorfa; Udin, Amirmudin

    2017-09-01

    The Firefly Algorithm (FA) is a metaheuristic algorithm inspired by the flashing behavior of fireflies and the phenomenon of bioluminescent communication; in this research it is used to optimize the machining parameters (feed rate, depth of cut, and spindle speed). The algorithm is hybridized with Particle Swarm Optimization (PSO) to discover better solutions when exploring the search space. The objective function of previous research is used to optimize the machining parameters in the turning operation. The optimal machining parameters estimated by FA that lead to a minimum surface roughness are validated using an ANOVA test.
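
    For reference, the canonical firefly position update that such hybrids build on can be written in a few lines; the Python sketch below is a generic illustration, and the parameter names, bounds, and machining-parameter vectors are assumptions rather than values used in this study.

      import numpy as np

      def firefly_move(x_i, x_j, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
          # Move firefly i toward a brighter firefly j: attractiveness decays with
          # the squared distance, plus a small random walk term.
          rng = rng or np.random.default_rng()
          r2 = np.sum((x_i - x_j) ** 2)
          beta = beta0 * np.exp(-gamma * r2)
          return x_i + beta * (x_j - x_i) + alpha * (rng.random(x_i.shape) - 0.5)

      # Illustrative machining-parameter vectors: [feed rate, depth of cut, spindle speed].
      lower = np.array([0.05, 0.2, 500.0])
      upper = np.array([0.30, 2.0, 3000.0])
      rng = np.random.default_rng(3)
      x_dim, x_bright = rng.uniform(lower, upper, (2, 3))
      print(np.clip(firefly_move(x_dim, x_bright, rng=rng), lower, upper))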

  19. A study of water balances over the Tigris-Euphrates watershed

    NASA Astrophysics Data System (ADS)

    Kavvas, M. L.; Chen, Z. Q.; Anderson, M. L.; Ohara, N.; Yoon, J. Y.; Xiang, Fu

    Tigris-Euphrates watershed was considered as one hydrologic unit, and a scientific assessment of its water resources was performed. Accordingly, (a) an inventory of land use/land cover, vegetation, soils, and existing hydraulic structures in the watershed was performed; (b) a regional hydroclimate model, RegHCM-TE, of the watershed was developed, and used to reconstruct historical precipitation data, to perform land hydrologic water balance computations for infiltration, soil water storage, actual evapotranspiration, direct runoff as input for streamflow computations, and to estimate irrigation water demands; and (c) a hydrologic model was developed to route streamflows within the river network of the watershed. Also, an algorithm for operating the reservoirs within the watershed was developed, and utilized to perform dynamic water balance studies under various water supply/demand scenarios to establish efficient utilization of the watershed’s water resources to meet the water demands of the riparian countries in the basin. Within this dynamic water balance framework, it is possible to assess and quantify the effect of sequential river flows on the chronologically sequential water balances over the watershed. The water balance study for the natural flow conditions prior to the development of large dams within TE basin, during the 1957-1969 critical period is presented.

  20. CPU-GPU hybrid accelerating the Zuker algorithm for RNA secondary structure prediction applications.

    PubMed

    Lei, Guoqing; Dou, Yong; Wan, Wen; Xia, Fei; Li, Rongchun; Ma, Meng; Zou, Dan

    2012-01-01

    Prediction of ribonucleic acid (RNA) secondary structure remains one of the most important research areas in bioinformatics. The Zuker algorithm is one of the most popular methods of free energy minimization for RNA secondary structure prediction. Thus far, few studies have been reported on the acceleration of the Zuker algorithm on general-purpose processors or on extra accelerators such as Field Programmable Gate-Array (FPGA) and Graphics Processing Units (GPU). To the best of our knowledge, no implementation combines both CPU and extra accelerators, such as GPUs, to accelerate the Zuker algorithm applications. In this paper, a CPU-GPU hybrid computing system that accelerates Zuker algorithm applications for RNA secondary structure prediction is proposed. The computing tasks are allocated between CPU and GPU for parallel cooperate execution. Performance differences between the CPU and the GPU in the task-allocation scheme are considered to obtain workload balance. To improve the hybrid system performance, the Zuker algorithm is optimally implemented with special methods for CPU and GPU architecture. Speedup of 15.93× over optimized multi-core SIMD CPU implementation and performance advantage of 16% over optimized GPU implementation are shown in the experimental results. More than 14% of the sequences are executed on CPU in the hybrid system. The system combining CPU and GPU to accelerate the Zuker algorithm is proven to be promising and can be applied to other bioinformatics applications.

  1. Three hybridization models based on local search scheme for job shop scheduling problem

    NASA Astrophysics Data System (ADS)

    Balbi Fraga, Tatiana

    2015-05-01

    This work presents three different hybridization models based on the general schema of Local Search Heuristics, named Hybrid Successive Application, Hybrid Neighborhood, and Hybrid Improved Neighborhood. Although similar approaches may already have been presented in the literature in other contexts, in this work these models are applied to analyze solutions of the job shop scheduling problem, with the heuristics Taboo Search and Particle Swarm Optimization. In addition, we investigate some aspects that must be considered in order to achieve better solutions than those obtained by the original heuristics. The results demonstrate that the algorithms derived from these three hybrid models are more robust than the original algorithms and are able to obtain better results than those found by Taboo Search alone.

  2. Evaluation of Residual Static Corrections by Hybrid Genetic Algorithm Steepest Ascent Autostatics Inversion: Application to southern Algerian fields

    NASA Astrophysics Data System (ADS)

    Eladj, Said; bansir, fateh; ouadfeul, sid Ali

    2016-04-01

    The application of a genetic algorithm starts with an initial population of chromosomes representing a "model space". Chromosome chains are preferentially reproduced based on their fitness relative to the total population, so a good chromosome has a greater opportunity to produce offspring than other chromosomes in the population. The advantage of the HGA/SAA combination is the use of a global search approach over a large population of local maxima, which significantly improves the performance of the method. To define the parameters of the Hybrid Genetic Algorithm Steepest Ascent Autostatics (HGA/SAA) job, we first evaluated, by testing the Steepest Ascent stage, the optimal parameters for the data used: (1) the number of hill-climbing iterations, set to 40, which defines the contribution of the SA algorithm to this hybrid approach; and (2) the minimum eigenvalue for SA, set to 0.8, which is linked to the data quality and the signal-to-noise ratio. To assess the performance of hybrid genetic algorithms in the inversion for residual static corrections, tests were performed to determine the number of generations for HGA/SAA. Using the values of residual static corrections already calculated by the SAA and CSAA approaches as learning input proved very effective in building the cross-correlation table. To determine the optimal number of generations, we conducted a series of tests ranging from 10 to 200 generations. The application to real seismic data from southern Algeria allowed us to judge the performance and capability of the inversion with this hybrid HGA/SAA method. These tests clarified the influence of the quality of the corrections estimated from SAA/CSAA and the optimal number of generations required for the hybrid genetic algorithm (HGA) to perform satisfactorily. Twenty (20) generations were enough to improve the continuity and resolution of seismic horizons, which will allow a more accurate structural interpretation. Keywords: hybrid genetic algorithm, number of generations, model space, local maxima, number of hill-climbing iterations, minimum eigenvalue, cross-correlation table.

  3. Long-Term Evaluation of the AMSR-E Soil Moisture Product Over the Walnut Gulch Watershed, AZ

    NASA Astrophysics Data System (ADS)

    Bolten, J. D.; Jackson, T. J.; Lakshmi, V.; Cosh, M. H.; Drusch, M.

    2005-12-01

    The Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) was launched aboard NASA's Aqua satellite on May 4th, 2002. Quantitative estimates of soil moisture using the data provided by AMSR-E have required routine radiometric calibration and validation based on comparisons of satellite observations, extended targets and field campaigns. The currently applied NASA EOS Aqua AMSR-E soil moisture algorithm is based on a change detection approach using polarization ratios (PR) of the calibrated AMSR-E channel brightness temperatures. To date, the accuracy of the soil moisture algorithm has been investigated on short time scales during field campaigns such as the Soil Moisture Experiments in 2004 (SMEX04). Results have indicated self-consistency and calibration stability of the observed brightness temperatures; however, the performance of the moisture retrieval algorithm has been poor. The primary objective of this study is to evaluate the quality of the current version of the AMSR-E soil moisture product for a three-year period over the Walnut Gulch Experimental Watershed (150 km2) near Tombstone, AZ, the northern study area of SMEX04. This watershed is equipped with hourly and daily recording of precipitation, soil moisture and temperature via a network of raingages and a USDA-NRCS Soil Climate Analysis Network (SCAN) site. Surface wetting and drying are easily distinguished in this area due to the moderately-vegetated terrain and seasonally intense precipitation events. Validation of AMSR-E-derived soil moisture is performed from June 2002 to June 2005 using watershed averages of precipitation, and soil moisture and temperature data from the SCAN site supported by a surface soil moisture network. Long-term assessment of soil moisture algorithm performance is investigated by comparing temporal variations of moisture estimates with seasonal changes and precipitation events. Further comparisons are made with a standard soil dataset from the European Centre for Medium-Range Weather Forecasts. The results of this research will contribute to a better characterization of the low biases and discrepancies currently observed in the AMSR-E soil moisture product.

  4. Training radial basis function networks for wind speed prediction using PSO enhanced differential search optimizer

    PubMed Central

    2018-01-01

    This paper presents an integrated hybrid optimization algorithm for training the radial basis function neural network (RBF NN). Training of neural networks is still a challenging exercise in the machine learning domain. Traditional training algorithms generally get trapped in local optima, leading to premature convergence, which makes them ineffective when applied to datasets with diverse features. Training algorithms based on evolutionary computation are becoming popular due to their robustness in overcoming the drawbacks of the traditional algorithms. Accordingly, this paper proposes a hybrid training procedure with the differential search (DS) algorithm functionally integrated with particle swarm optimization (PSO). To surmount local trapping of the search procedure, a new population initialization scheme is proposed using a logistic chaotic sequence, which enhances the population diversity and aids the search capability. To demonstrate the effectiveness of the proposed RBF hybrid training algorithm, experimental analysis on 7 publicly available benchmark datasets is performed. Subsequently, experiments were conducted on a practical application case of wind speed prediction to expound the superiority of the proposed RBF training algorithm in terms of prediction accuracy. PMID:29768463

  5. Training radial basis function networks for wind speed prediction using PSO enhanced differential search optimizer.

    PubMed

    Rani R, Hannah Jessie; Victoire T, Aruldoss Albert

    2018-01-01

    This paper presents an integrated hybrid optimization algorithm for training the radial basis function neural network (RBF NN). Training of neural networks is still a challenging exercise in the machine learning domain. Traditional training algorithms generally get trapped in local optima, leading to premature convergence, which makes them ineffective when applied to datasets with diverse features. Training algorithms based on evolutionary computation are becoming popular due to their robustness in overcoming the drawbacks of the traditional algorithms. Accordingly, this paper proposes a hybrid training procedure with the differential search (DS) algorithm functionally integrated with particle swarm optimization (PSO). To surmount local trapping of the search procedure, a new population initialization scheme is proposed using a logistic chaotic sequence, which enhances the population diversity and aids the search capability. To demonstrate the effectiveness of the proposed RBF hybrid training algorithm, experimental analysis on 7 publicly available benchmark datasets is performed. Subsequently, experiments were conducted on a practical application case of wind speed prediction to expound the superiority of the proposed RBF training algorithm in terms of prediction accuracy.
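
    The logistic chaotic initialization described above is easy to sketch. The Python fragment below is an illustration only; the map parameter r = 4, the seed, and the population shape are assumptions, and the resulting vectors would stand in for candidate RBF network parameters.

      import numpy as np

      def logistic_chaotic_population(pop_size, dim, lower, upper, r=4.0, x0=0.31):
          # Logistic map x_{n+1} = r * x_n * (1 - x_n), iterated in (0, 1) and then
          # rescaled to the search bounds, to spread the initial population.
          lower = np.asarray(lower, dtype=float)
          upper = np.asarray(upper, dtype=float)
          pop = np.empty((pop_size, dim))
          x = x0
          for i in range(pop_size):
              for j in range(dim):
                  x = r * x * (1.0 - x)
                  pop[i, j] = lower[j] + x * (upper[j] - lower[j])
          return pop

      # Example: 30 candidate parameter vectors of dimension 10 in [-1, 1]^10.
      print(logistic_chaotic_population(30, 10, [-1.0] * 10, [1.0] * 10)[:2])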

  6. A formally verified algorithm for interactive consistency under a hybrid fault model

    NASA Technical Reports Server (NTRS)

    Lincoln, Patrick; Rushby, John

    1993-01-01

    Consistent distribution of single-source data to replicated computing channels is a fundamental problem in fault-tolerant system design. The 'Oral Messages' (OM) algorithm solves this problem of Interactive Consistency (Byzantine Agreement) assuming that all faults are worst-case. Thambidurai and Park introduced a 'hybrid' fault model that distinguished three fault modes: asymmetric (Byzantine), symmetric, and benign; they also exhibited, along with an informal 'proof of correctness', a modified version of OM. Unfortunately, their algorithm is flawed. The discipline of mechanically checked formal verification eventually enabled us to develop a correct algorithm for Interactive Consistency under the hybrid fault model. This algorithm withstands $a$ asymmetric, $s$ symmetric, and $b$ benign faults simultaneously, using $m+1$ rounds, provided $n > 2a + 2s + b + m$ and $m \geq a$. We present this algorithm, discuss its subtle points, and describe its formal specification and verification in PVS. We argue that formal verification systems such as PVS are now sufficiently effective that their application to fault-tolerance algorithms should be considered routine.

  7. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    NASA Astrophysics Data System (ADS)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.

  8. A multiobjective hybrid genetic algorithm for the capacitated multipoint network design problem.

    PubMed

    Lo, C C; Chang, W H

    2000-01-01

    The capacitated multipoint network design problem (CMNDP) is NP-complete. In this paper, a hybrid genetic algorithm for CMNDP is proposed. The multiobjective hybrid genetic algorithm (MOHGA) differs from other genetic algorithms (GAs) mainly in its selection procedure. The concept of subpopulation is used in MOHGA. Four subpopulations are generated according to the elitism reservation strategy, the shifting Prufer vector, the stochastic universal sampling, and the complete random method, respectively. Mixing these four subpopulations produces the next generation population. The MOHGA can effectively search the feasible solution space due to population diversity. The MOHGA has been applied to CMNDP. By examining computational and analytical results, we notice that the MOHGA can find most nondominated solutions and is much more effective and efficient than other multiobjective GAs.

  9. River flow simulation using a multilayer perceptron-firefly algorithm model

    NASA Astrophysics Data System (ADS)

    Darbandi, Sabereh; Pourhosseini, Fatemeh Akhoni

    2018-06-01

    River flow estimation using records of past time series is important in water resources engineering and management and is required in hydrologic studies. In the past two decades, approaches based on artificial neural networks (ANN) have been developed. River flow modeling is a non-linear process and is highly affected by the inputs to the model. In this study, the best input combination of the models was identified using the Gamma test; then an MLP-ANN and a hybrid multilayer perceptron-firefly algorithm (MLP-FFA) model were used to forecast monthly river flow for a set of time intervals using observed data. Measurements from three gauges in the Ajichay watershed, East Azerbaijan, were used to train and test the models for the period from January 2004 to July 2016. Calibration and validation were performed within the same period for the MLP-ANN and MLP-FFA models after preparation of the required data. Two statistics, the root mean square error and the coefficient of determination, are used to compare the outputs of the MLP-ANN and MLP-FFA models. The results show that the MLP-FFA model is satisfactory for monthly river flow simulation in the study area.

  10. How will climate change affect watershed mercury export in a representative Coastal Plain watershed?

    NASA Astrophysics Data System (ADS)

    Golden, H. E.; Knightes, C. D.; Conrads, P. A.; Feaster, T.; Davis, G. M.; Benedict, S. T.; Bradley, P. M.

    2012-12-01

    Future climate change is expected to drive variations in watershed hydrological processes and water quality across a wide range of physiographic provinces, ecosystems, and spatial scales. How such shifts in climatic conditions will impact watershed mercury (Hg) dynamics and hydrologically-driven Hg transport is a significant concern. We simulate the responses of watershed hydrological and total Hg (HgT) fluxes and concentrations to a unified set of past and future climate change projections in a Coastal Plain basin using multiple watershed models. We use two statistically downscaled global precipitation and temperature models, ECHO, a hybrid of the ECHAM4 and HOPE-G models, and the Community Climate System Model (CCSM3), across two thirty-year simulations (1980 to 2010 and 2040 to 2070). We apply three watershed models to quantify and bracket potential changes in hydrologic and HgT fluxes, including the Visualizing Ecosystems for Land Management Assessment Model for Hg (VELMA-Hg), the Grid Based Mercury Model (GBMM), and TOPLOAD, a water quality constituent model linked to TOPMODEL hydrological simulations. We estimate a decrease in average annual HgT fluxes in response to climate change using the ECHO projections and an increase with the CCSM3 projections in the study watershed. Average monthly HgT fluxes increase under both climate change projections in the late spring (March through May), when HgT concentrations and flow are high. Results suggest that hydrological transport associated with changes in precipitation and temperature is the primary mechanism driving the HgT flux response to climate change. Our multiple model/multiple projection approach allows us to bracket the relative response of HgT fluxes to climate change, thereby illustrating the uncertainty associated with the projections. In addition, our approach allows us to examine potential variations in climate change-driven water and HgT export based on different conceptualizations of watershed HgT dynamics and the representative mathematical structures underpinning existing watershed Hg models.

  11. Threshold automatic selection hybrid phase unwrapping algorithm for digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Zhou, Meiling; Min, Junwei; Yao, Baoli; Yu, Xianghua; Lei, Ming; Yan, Shaohui; Yang, Yanlong; Dan, Dan

    2015-01-01

    The conventional quality-guided (QG) phase unwrapping algorithm is hard to apply to digital holographic microscopy because of its long execution time. In this paper, we present a threshold automatic selection hybrid phase unwrapping algorithm that combines the existing QG algorithm and the flood-fill (FF) algorithm to solve this problem. The original wrapped phase map is divided into high- and low-quality sub-maps by selecting a threshold automatically, and then the FF and QG unwrapping algorithms are used on the respective sub-maps to unwrap the phase. The feasibility of the proposed method is proved by experimental results, and the execution speed is shown to be much faster than that of the original QG unwrapping algorithm.

  12. Improved hybrid optimization algorithm for 3D protein structure prediction.

    PubMed

    Zhou, Changjun; Hou, Caixia; Wei, Xiaopeng; Zhang, Qiang

    2014-07-01

    A new improved hybrid optimization algorithm, the PGATS algorithm, based on the toy off-lattice model, is presented for dealing with three-dimensional protein structure prediction problems. The algorithm combines particle swarm optimization (PSO), the genetic algorithm (GA), and tabu search (TS). In addition, several improvement strategies are adopted: a stochastic disturbance factor is added to the particle swarm optimization to improve its search ability; the crossover and mutation operations of the genetic algorithm are replaced by a random linear method; and finally the tabu search algorithm is improved by appending a mutation operator. Through the combination of a variety of strategies and algorithms, protein structure prediction (PSP) in a 3D off-lattice model is achieved. The PSP problem is NP-hard, but it can be formulated as a global optimization problem with many extrema and many parameters. This is the theoretical principle behind the hybrid optimization algorithm proposed in this paper. The algorithm combines local search and global search, which overcomes the shortcomings of a single algorithm, giving full play to the advantages of each algorithm. The method is verified on the current universal standard sequences, Fibonacci sequences, and real protein sequences. Experiments show that the proposed new method outperforms single algorithms in the accuracy of calculating the protein sequence energy value, which proves it to be an effective way to predict the structure of proteins.

  13. Optimal Golomb Ruler Sequences Generation for Optical WDM Systems: A Novel Parallel Hybrid Multi-objective Bat Algorithm

    NASA Astrophysics Data System (ADS)

    Bansal, Shonak; Singh, Arun Kumar; Gupta, Neena

    2017-02-01

    Real-life multi-objective engineering design problems are tough and time-consuming optimization problems due to their high degree of nonlinearity, complexity and inhomogeneity. Nature-inspired multi-objective optimization algorithms are now becoming popular for solving such problems. This paper proposes an original multi-objective Bat algorithm (MOBA) and its extended form, a novel parallel hybrid multi-objective Bat algorithm (PHMOBA), to generate shortest-length Golomb rulers, called optimal Golomb ruler (OGR) sequences, in reasonable computation time. OGRs find application in optical wavelength division multiplexing (WDM) systems as a channel-allocation algorithm to reduce four-wave mixing (FWM) crosstalk. The performance of both proposed algorithms in generating OGRs for optical WDM channel allocation is compared with other existing classical computing and nature-inspired algorithms, including extended quadratic congruence (EQC), the search algorithm (SA), genetic algorithms (GAs), biogeography based optimization (BBO) and big bang-big crunch (BB-BC) optimization algorithms. Simulations conclude that the proposed parallel hybrid multi-objective Bat algorithm works more efficiently than the original multi-objective Bat algorithm and other existing algorithms in generating OGRs for optical WDM systems. The PHMOBA algorithm has a higher convergence and success rate than the original MOBA. For rulers of up to 20 marks, the efficiency improvement of the proposed PHMOBA in terms of ruler length and total optical channel bandwidth (TBW) is 100%, whereas for the original MOBA it is 85%. Finally, the implications for further research are also discussed.
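
    Independent of the Bat algorithm itself, the feasibility test at the heart of any OGR search is compact enough to show; the Python helper below simply checks the defining Golomb property that all pairwise differences between marks are distinct.

      from itertools import combinations

      def is_golomb_ruler(marks):
          # A ruler is Golomb when every pairwise difference between marks is unique.
          diffs = [b - a for a, b in combinations(sorted(marks), 2)]
          return len(diffs) == len(set(diffs))

      print(is_golomb_ruler([0, 1, 4, 9, 11]))  # True: a 5-mark ruler of length 11
      print(is_golomb_ruler([0, 1, 2, 4]))      # False: differences 1 and 2 repeat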

  14. Fuzzy-Based Hybrid Control Algorithm for the Stabilization of a Tri-Rotor UAV.

    PubMed

    Ali, Zain Anwar; Wang, Daobo; Aamir, Muhammad

    2016-05-09

    In this paper, a novel mathematical fuzzy hybrid scheme is proposed for the stabilization of a tri-rotor unmanned aerial vehicle (UAV). The fuzzy hybrid scheme consists of a fuzzy logic controller and a regulation pole-placement tracking (RST) controller with model reference adaptive control (MRAC), in which the adaptive gains of the RST controller are fine-tuned by the fuzzy logic controller. Brushless direct current (BLDC) motors are installed in the triangular frame of the tri-rotor UAV, which helps maintain control over its motion and over altitude and attitude changes, similar to other rotorcraft. An MRAC-based MIT rule is adopted for system stability. Moreover, the proposed hybrid controller is demonstrated on the nonlinear flight dynamics in the presence of translational and rotational velocity components. The performance of the proposed algorithm is evaluated via MATLAB simulations, in which the proposed fuzzy hybrid controller is compared with the existing adaptive RST controller. The results show that our proposed algorithm has better transient performance with zero steady-state error and fast convergence towards stability.

  15. Non-preconditioned conjugate gradient on cell and FPGA based hybrid supercomputer nodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubois, David H; Dubois, Andrew J; Boorman, Thomas M

    2009-01-01

    This work presents a detailed implementation of a double precision, non-preconditioned, Conjugate Gradient algorithm on a Roadrunner heterogeneous supercomputer node. These nodes utilize the Cell Broadband Engine Architecture in conjunction with x86 Opteron processors from AMD. We implement a common Conjugate Gradient algorithm, on a variety of systems, to compare and contrast performance. Implementation results are presented for the Roadrunner hybrid supercomputer, the SRC Computers, Inc. MAPStation SRC-6 FPGA-enhanced hybrid supercomputer, and AMD Opteron only. In all hybrid implementations wall clock time is measured, including all transfer overhead and compute timings.

  16. Non-preconditioned conjugate gradient on cell and FPGA-based hybrid supercomputer nodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubois, David H; Dubois, Andrew J; Boorman, Thomas M

    2009-03-10

    This work presents a detailed implementation of a double precision, non-preconditioned Conjugate Gradient algorithm on a Roadrunner heterogeneous supercomputer node. These nodes utilize the Cell Broadband Engine Architecture in conjunction with x86 Opteron processors from AMD. We implement a common Conjugate Gradient algorithm, on a variety of systems, to compare and contrast performance. Implementation results are presented for the Roadrunner hybrid supercomputer, the SRC Computers, Inc. MAPStation SRC-6 FPGA-enhanced hybrid supercomputer, and AMD Opteron only. In all hybrid implementations wall clock time is measured, including all transfer overhead and compute timings.
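
    For reference, the non-preconditioned Conjugate Gradient iteration benchmarked in the two records above is shown below as a plain NumPy sketch, without any Cell, FPGA, or GPU offloading; the test matrix is a small synthetic symmetric positive-definite system.

      import numpy as np

      def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
          # Non-preconditioned CG for a symmetric positive-definite system A x = b.
          x = np.zeros_like(b)
          r = b - A @ x              # residual
          p = r.copy()               # search direction
          rs_old = r @ r
          for _ in range(max_iter):
              Ap = A @ p
              alpha = rs_old / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              rs_new = r @ r
              if np.sqrt(rs_new) < tol:
                  break
              p = r + (rs_new / rs_old) * p
              rs_old = rs_new
          return x

      # Small synthetic SPD test system.
      rng = np.random.default_rng(0)
      M = rng.standard_normal((50, 50))
      A = M @ M.T + 50.0 * np.eye(50)
      b = rng.standard_normal(50)
      x = conjugate_gradient(A, b)
      print("residual norm:", np.linalg.norm(A @ x - b))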

  17. Image segmentation for biomedical applications based on alternating sequential filtering and watershed transformation

    NASA Astrophysics Data System (ADS)

    Gorpas, D.; Yova, D.

    2009-07-01

    One of the major challenges in biomedical imaging is the extraction of quantified information from the acquired images. Light-tissue interaction leads to the acquisition of images with inconsistent intensity profiles, so the accurate identification of the regions of interest is a rather complicated process. On the other hand, the complex geometries and tangent objects that are very often present in the acquired images lead either to false detections or to the merging, shrinkage or expansion of the regions of interest. In this paper an algorithm based on alternating sequential filtering and the watershed transformation is proposed for the segmentation of biomedical images. This algorithm has been tested on two applications, each based on a different acquisition system, and the results illustrate its accuracy in segmenting the regions of interest.

  18. Combining watershed and graph cuts methods to segment organs at risk in radiotherapy

    NASA Astrophysics Data System (ADS)

    Dolz, Jose; Kirisli, Hortense A.; Viard, Romain; Massoptier, Laurent

    2014-03-01

    Computer-aided segmentation of anatomical structures in medical images is a valuable tool for efficient radiation therapy planning (RTP). As delineation errors highly affect the radiation oncology treatment, it is crucial to delineate geometric structures accurately. In this paper, a semi-automatic segmentation approach for computed tomography (CT) images, based on watershed and graph-cuts methods, is presented. The watershed pre-segmentation groups small areas of similar intensities into homogeneous labels, which are subsequently used as input for the graph-cuts algorithm. This methodology does not require prior knowledge of the structure to be segmented; even so, it performs well with complex shapes and low intensities. The presented method also allows the user to add foreground and background strokes in any of the three standard orthogonal views - axial, sagittal or coronal - making the interaction with the algorithm easy and fast. The segmentation information is thus propagated within the whole volume, providing a spatially coherent result. The proposed algorithm has been evaluated on 9 CT volumes by comparing its segmentation performance over several organs - lungs, liver, spleen, heart and aorta - to manual delineations from experts. A Dice coefficient higher than 0.89 was achieved in every case, which demonstrates that the proposed approach works well for all the anatomical structures analyzed. Due to the quality of the results, the introduction of the proposed approach into the RTP process will be a helpful tool for organs at risk (OARs) segmentation.

  19. Iterative h-minima-based marker-controlled watershed for cell nucleus segmentation.

    PubMed

    Koyuncu, Can Fahrettin; Akhan, Ece; Ersahin, Tulin; Cetin-Atalay, Rengul; Gunduz-Demir, Cigdem

    2016-04-01

    Automated microscopy imaging systems facilitate high-throughput screening in molecular cellular biology research. The first step of these systems is cell nucleus segmentation, which has a great impact on the success of the overall system. The marker-controlled watershed is a technique commonly used by the previous studies for nucleus segmentation. These studies define their markers finding regional minima on the intensity/gradient and/or distance transform maps. They typically use the h-minima transform beforehand to suppress noise on these maps. The selection of the h value is critical; unnecessarily small values do not sufficiently suppress the noise, resulting in false and oversegmented markers, and unnecessarily large ones suppress too many pixels, causing missing and undersegmented markers. Because cell nuclei show different characteristics within an image, the same h value may not work to define correct markers for all the nuclei. To address this issue, in this work, we propose a new watershed algorithm that iteratively identifies its markers, considering a set of different h values. In each iteration, the proposed algorithm defines a set of candidates using a particular h value and selects the markers from those candidates provided that they fulfill the size requirement. Working with widefield fluorescence microscopy images, our experiments reveal that the use of multiple h values in our iterative algorithm leads to better segmentation results, compared to its counterparts. © 2016 International Society for Advancement of Cytometry.
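
    A simplified Python sketch of the iterative marker-selection idea is given below, using scipy and scikit-image (an assumption; the paper's exact candidate-selection and size rules are not reproduced). The h values and the minimum marker size are placeholders, and the surface to flood is taken to be the negated distance transform of a binary nucleus mask.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.morphology import h_minima
      from skimage.segmentation import watershed

      def iterative_hmin_markers(surface, h_values, min_size=50):
          # Scan several h values; at each level, keep candidate minima that are
          # large enough and do not overlap markers already accepted.
          markers = np.zeros(surface.shape, dtype=int)
          next_label = 1
          for h in h_values:
              candidates, n = ndi.label(h_minima(surface, h))
              for lab in range(1, n + 1):
                  region = candidates == lab
                  if region.sum() >= min_size and not markers[region].any():
                      markers[region] = next_label
                      next_label += 1
          return markers

      # Usage on a binary nucleus mask (not shown): flood the negated distance map.
      # dist = ndi.distance_transform_edt(mask)
      # markers = iterative_hmin_markers(-dist, h_values=[2, 4, 8])
      # labels = watershed(-dist, markers=markers, mask=mask)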

  20. A fast complex integer convolution using a hybrid transform

    NASA Technical Reports Server (NTRS)

    Reed, I. S.; K Truong, T.

    1978-01-01

    It is shown that the Winograd transform can be combined with a complex integer transform over the Galois field GF(q-squared) to yield a new algorithm for computing the discrete cyclic convolution of complex number points. By this means a fast method for accurately computing the cyclic convolution of a sequence of complex numbers for long convolution lengths can be obtained. This new hybrid algorithm requires fewer multiplications than previous algorithms.

  1. Genetic Algorithm Optimization of a Cost Competitive Hybrid Rocket Booster

    NASA Technical Reports Server (NTRS)

    Story, George

    2015-01-01

    Performance, reliability and cost have always been drivers in the rocket business. Hybrid rockets have been late entries into the launch business due to substantial early development work on liquid rockets and solid rockets. Slowly the technology readiness level of hybrids has been increasing due to various large scale testing and flight tests of hybrid rockets. One remaining issue is the cost of hybrids versus the existing launch propulsion systems. This paper will review the known state-of-the-art hybrid development work to date and incorporate it into a genetic algorithm to optimize the configuration based on various parameters. A cost module will be incorporated to the code based on the weights of the components. The design will be optimized on meeting the performance requirements at the lowest cost.

  2. Genetic Algorithm Optimization of a Cost Competitive Hybrid Rocket Booster

    NASA Technical Reports Server (NTRS)

    Story, George

    2014-01-01

    Performance, reliability and cost have always been drivers in the rocket business. Hybrid rockets have been late entries into the launch business due to substantial early development work on liquid rockets and later on solid rockets. Slowly the technology readiness level of hybrids has been increasing due to various large scale testing and flight tests of hybrid rockets. A remaining issue is the cost of hybrids vs the existing launch propulsion systems. This paper will review the known state of the art hybrid development work to date and incorporate it into a genetic algorithm to optimize the configuration based on various parameters. A cost module will be incorporated to the code based on the weights of the components. The design will be optimized on meeting the performance requirements at the lowest cost.

  3. Incorporating Green Infrastructure into Water Resources Management Plans to Address Water Quality Impairments

    NASA Astrophysics Data System (ADS)

    Piscopo, A. N.; Detenbeck, N. E.

    2017-12-01

    Managers of urban watersheds with excessive nutrient loads are more frequently turning to green infrastructure (GI) to manage their water quality impairments. The effectiveness of GI is dependent on a number of factors, including (1) the type and placement of GI within the watershed, (2) the specific nutrients to be treated, and (3) the uncertainty in future climates. Although many studies have investigated the effectiveness of individual GI units for different types of nutrients, relatively few have considered the effectiveness of GI on a watershed scale, the scale most relevant to management plans. At the watershed scale, endless combinations of GI type and location are possible, each with different effectiveness in reducing nutrient loads, minimizing costs, and maximizing co-benefits such as reducing runoff. To efficiently generate management plan options that balance the tradeoffs between these objectives, we simulate candidate options using EPA's Stormwater Management Model for multiple future climates and determine the Pareto optimal set of solution options using a multi-objective evolutionary algorithm. Our approach is demonstrated for an urban watershed in Rockville, Maryland.

  4. A Gradient Taguchi Method for Engineering Optimization

    NASA Astrophysics Data System (ADS)

    Hwang, Shun-Fa; Wu, Jen-Chih; He, Rong-Song

    2017-10-01

    To balance the robustness and the convergence speed of optimization, a novel hybrid algorithm consisting of the Taguchi method and the steepest descent method is proposed in this work. The Taguchi method, using orthogonal arrays, can quickly find the optimum combination of the levels of various factors, even when the number of levels and/or factors is quite large. The algorithm is applied to the inverse determination of the elastic constants of three composite plates by combining a numerical method with vibration testing. For these problems, the proposed algorithm finds better elastic constants at lower computational cost. Therefore, the proposed algorithm has good robustness and fast convergence speed compared to some hybrid genetic algorithms.

  5. A GIS and statistical approach to identify variables that control water quality in hydrothermally altered and mineralized watersheds, Silverton, Colorado, USA

    USGS Publications Warehouse

    Yager, Douglas B.; Johnson, Raymond H.; Rockwell, Barnaby W.; Caine, Jonathan S.; Smith, Kathleen S.

    2013-01-01

    Hydrothermally altered bedrock in the Silverton mining area, southwest Colorado, USA, contains sulfide minerals that weather to produce acidic and metal-rich leachate that is toxic to aquatic life. This study utilized a geographic information system (GIS) and statistical approach to identify watershed-scale geologic variables in the Silverton area that influence water quality. GIS analysis of mineral maps produced using remote sensing datasets including Landsat Thematic Mapper, advanced spaceborne thermal emission and reflection radiometer, and a hybrid airborne visible infrared imaging spectrometer and field-based product enabled areas of alteration to be quantified. Correlations between water quality signatures determined at watershed outlets, and alteration types intersecting both total watershed areas and GIS-buffered areas along streams were tested using linear regression analysis. Despite remote sensing datasets having varying watershed area coverage due to vegetation cover and differing mineral mapping capabilities, each dataset was useful for delineating acid-generating bedrock. Areas of quartz–sericite–pyrite mapped by AVIRIS have the highest correlations with acidic surface water and elevated iron and aluminum concentrations. Alkalinity was only correlated with area of acid neutralizing, propylitically altered bedrock containing calcite and chlorite mapped by AVIRIS. Total watershed area of acid-generating bedrock is more significantly correlated with acidic and metal-rich surface water when compared with acid-generating bedrock intersected by GIS-buffered areas along streams. This methodology could be useful in assessing the possible effects that alteration type area has in either generating or neutralizing acidity in unmined watersheds and in areas where new mining is planned.

  6. Selection and placement of best management practices used to reduce water quality degradation in Lincoln Lake watershed

    NASA Astrophysics Data System (ADS)

    Rodriguez, Hector German; Popp, Jennie; Maringanti, Chetan; Chaubey, Indrajeet

    2011-01-01

    An increased loss of agricultural nutrients is a growing concern for water quality in Arkansas. Several studies have shown that best management practices (BMPs) are effective in controlling water pollution. However, those affected by water quality issues need water management plans that take into consideration BMP selection, placement, and affordability. This study used a nondominated sorting genetic algorithm (NSGA-II), a multiobjective algorithm that selects and locates BMPs that minimize nutrient pollution cost-effectively by providing trade-off curves (optimal fronts) between pollutant reduction and total net cost increase. The usefulness of this optimization framework was evaluated in the Lincoln Lake watershed. The final NSGA-II optimization model generated a number of near-optimal solutions by selecting from 35 BMPs (combinations of pasture management, buffer zones, and poultry litter application practices). Selection and placement of BMPs were analyzed under various cost solutions. The NSGA-II provides multiple solutions that could fit the water management plan for the watershed. For instance, by implementing all the BMP combinations recommended in the lowest-cost solution, total phosphorus (TP) could be reduced by at least 76% while increasing cost by less than 2% in the entire watershed. This value represents an increase in cost of $5.49 ha-1 when compared to the baseline. Implementing all the BMP combinations proposed with the medium- and highest-cost solutions could decrease TP drastically but would increase cost by $24,282 (7%) and $82,306 (25%), respectively.
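
    NSGA-II itself is not reproduced here, but the non-dominated filtering that produces the trade-off curves (optimal fronts) mentioned above can be sketched in a few lines of Python; the BMP-plan costs and TP reductions below are synthetic values used only to illustrate the two objectives (minimize cost increase, maximize TP reduction).

      import numpy as np

      def pareto_front(costs, reductions):
          # Indices of non-dominated plans: minimize cost, maximize TP reduction.
          costs = np.asarray(costs, dtype=float)
          reductions = np.asarray(reductions, dtype=float)
          keep = []
          for i in range(len(costs)):
              dominated = np.any((costs <= costs[i]) & (reductions >= reductions[i]) &
                                 ((costs < costs[i]) | (reductions > reductions[i])))
              if not dominated:
                  keep.append(i)
          return keep

      # Synthetic BMP plans: (total net cost increase in dollars, TP reduction in percent).
      cost = [5000, 24000, 82000, 30000, 10000]
      tp_reduction = [76, 85, 93, 80, 74]
      print(pareto_front(cost, tp_reduction))  # -> [0, 1, 2], the plans on the optimal front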

  7. Kalman Filtered Bio Heat Transfer Model Based Self-adaptive Hybrid Magnetic Resonance Thermometry.

    PubMed

    Zhang, Yuxin; Chen, Shuo; Deng, Kexin; Chen, Bingyao; Wei, Xing; Yang, Jiafei; Wang, Shi; Ying, Kui

    2017-01-01

    The aim is to develop a self-adaptive and fast thermometry method by combining the original hybrid magnetic resonance thermometry method with the bio heat transfer equation (BHTE) model. The proposed Kalman filtered Bio Heat Transfer Model Based Self-adaptive Hybrid Magnetic Resonance Thermometry, abbreviated as the KalBHT hybrid method, introduces the BHTE model to synthesize a window on the regularization term of the hybrid algorithm, which leads to a regularization that is self-adaptive both spatially and temporally as the temperature changes. Further, to decrease the sensitivity to the accuracy of the BHTE model, a Kalman filter is utilized to update the window at each iteration. To investigate the effect of the proposed model, a computer heating simulation, a phantom microwave heating experiment, and dynamic in-vivo model validations of the liver and a thoracic tumor were conducted in this study. The heating simulation indicates that the KalBHT hybrid algorithm achieves more accurate results than the hybrid algorithm without requiring λ to be adjusted to a proper value. The results of the phantom heating experiment illustrate that the proposed model is able to follow temperature changes in the presence of motion, and the estimated temperature also shows less noise in the background and around the hot spot. The dynamic in-vivo model validation with heating simulation demonstrates that the proposed model has a higher convergence rate, more robustness to susceptibility problems around the hot spot, and more accurate temperature estimation. In the healthy liver experiment with heating simulation, the RMSE at the hot spot of the proposed model is reduced to about 50% of the RMSE of the original hybrid model, and the convergence time becomes only about one fifth of that of the hybrid model. The proposed model is able to improve the accuracy of the original hybrid algorithm and accelerate the convergence rate of MR temperature estimation.

  8. Hybrid-dual-Fourier tomographic algorithm for fast three-dimensional optical image reconstruction in turbid media

    NASA Technical Reports Server (NTRS)

    Alfano, Robert R. (Inventor); Cai, Wei (Inventor)

    2007-01-01

    A reconstruction technique for reducing the computational burden of 3D image reconstruction, wherein the reconstruction procedure comprises an inverse and a forward model. The inverse model uses a hybrid dual Fourier algorithm that combines a 2D Fourier inversion with a 1D matrix inversion to provide high-speed inverse computations. The inverse algorithm uses a hybrid transfer to provide fast Fourier inversion for data of multiple sources and multiple detectors. The forward model is based on an analytical cumulant solution of a radiative transfer equation. The accurate analytical form of the solution to the radiative transfer equation provides an efficient formalism for fast computation of the forward model.

  9. CPU-GPU hybrid accelerating the Zuker algorithm for RNA secondary structure prediction applications

    PubMed Central

    2012-01-01

    Background Prediction of ribonucleic acid (RNA) secondary structure remains one of the most important research areas in bioinformatics. The Zuker algorithm is one of the most popular methods of free energy minimization for RNA secondary structure prediction. Thus far, few studies have been reported on the acceleration of the Zuker algorithm on general-purpose processors or on extra accelerators such as Field Programmable Gate Arrays (FPGA) and Graphics Processing Units (GPU). To the best of our knowledge, no implementation combines both CPUs and extra accelerators, such as GPUs, to accelerate Zuker algorithm applications. Results In this paper, a CPU-GPU hybrid computing system that accelerates Zuker algorithm applications for RNA secondary structure prediction is proposed. The computing tasks are allocated between CPU and GPU for parallel cooperative execution. Performance differences between the CPU and the GPU in the task-allocation scheme are considered to obtain workload balance. To improve the hybrid system performance, the Zuker algorithm is optimally implemented with special methods for the CPU and GPU architectures. Conclusions A speedup of 15.93× over the optimized multi-core SIMD CPU implementation and a performance advantage of 16% over the optimized GPU implementation are shown in the experimental results. More than 14% of the sequences are executed on the CPU in the hybrid system. The system combining CPU and GPU to accelerate the Zuker algorithm is proven to be promising and can be applied to other bioinformatics applications. PMID:22369626

  10. Optimal implementation of best management practices to improve agricultural hydrology and water quality

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Engel, B.; Collingsworth, P.; Pijanowski, B. C.

    2017-12-01

    Nutrient loading from the Maumee River watershed is a significant driver of the harmful algal bloom (HAB) problem in Lake Erie. Strategies to reduce nutrient loading from agricultural areas in the Maumee River watershed need to be explored. Best management practices (BMPs) are popular approaches for improving hydrology and water quality. Various scenarios of BMP implementation were simulated in the AXL watershed (an agricultural watershed within the Maumee River watershed) using the Soil and Water Assessment Tool (SWAT) and a new BMP cost tool to explore the cost-effectiveness of the practices. BMPs of interest included vegetative filter strips, grassed waterways, blind inlets, grade stabilization structures, wetlands, no-till, nutrient management, residue management, and cover crops. The following environmental concerns were considered: streamflow, Total Phosphorus (TP), Dissolved Reactive Phosphorus (DRP), Total Kjeldahl Nitrogen (TKN), and Nitrate+Nitrite (NOx). To obtain maximum hydrological and water quality benefits with minimum cost, an optimization tool was developed to optimally select and place BMPs by connecting SWAT, the BMP cost tool, and optimization algorithms. The optimization tool was then applied in the AXL watershed to explore optimization focusing on critical areas (the top 25% of areas with the highest runoff volume/pollutant loads per area) vs. all areas of the watershed, optimization using weather data for spring (March to July, due to the goal of reducing spring phosphorus in the watershed management plan) vs. the full year, and optimization results of implementing BMPs to achieve the watershed management plan goal (reducing 2008 TP levels by 40%). The optimization tool and BMP optimization results can be used by watershed groups and communities to solve hydrology and water quality problems.

  11. Predicting DNA hybridization kinetics from sequence

    NASA Astrophysics Data System (ADS)

    Zhang, Jinny X.; Fang, John Z.; Duan, Wei; Wu, Lucia R.; Zhang, Angela W.; Dalchau, Neil; Yordanov, Boyan; Petersen, Rasmus; Phillips, Andrew; Zhang, David Yu

    2018-01-01

    Hybridization is a key molecular process in biology and biotechnology, but so far there is no predictive model for accurately determining hybridization rate constants based on sequence information. Here, we report a weighted neighbour voting (WNV) prediction algorithm, in which the hybridization rate constant of an unknown sequence is predicted based on its similarity to reactions with known rate constants. To construct this algorithm we first performed 210 fluorescence kinetics experiments to observe the hybridization kinetics of 100 different DNA target and probe pairs (36 nt sub-sequences of the CYCS and VEGF genes) at temperatures ranging from 28 to 55 °C. Automated feature selection and weighting optimization resulted in a final six-feature WNV model, which can predict hybridization rate constants of new sequences to within a factor of 3 with ∼91% accuracy, based on leave-one-out cross-validation. Accurate prediction of hybridization kinetics allows the design of efficient probe sequences for genomics research.
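
    As a rough illustration of the weighted-neighbour-voting idea, the sketch below predicts the log10 rate constant of a query sequence as a similarity-weighted average over reference sequences with known rate constants. The reference data are made up, and a simple shared k-mer similarity stands in for the paper's six learned sequence features.

```python
import numpy as np

# Minimal weighted-neighbour-voting sketch: predict the hybridization rate constant of
# a query sequence as a similarity-weighted average of log10 rate constants of reference
# sequences.  The reference data and the alignment-free similarity below are illustrative
# stand-ins for the paper's learned six-feature model.

def similarity(a, b, k=4):
    """Fraction of shared k-mers (a crude sequence-similarity proxy)."""
    ka = {a[i:i + k] for i in range(len(a) - k + 1)}
    kb = {b[i:i + k] for i in range(len(b) - k + 1)}
    return len(ka & kb) / max(1, len(ka | kb))

def predict_log_k(query, references):
    """references: list of (sequence, log10 rate constant) pairs."""
    weights = np.array([similarity(query, seq) for seq, _ in references])
    values = np.array([log_k for _, log_k in references])
    if weights.sum() == 0:
        return float(values.mean())      # no neighbour information: fall back to the mean
    return float(np.dot(weights, values) / weights.sum())

references = [                           # made-up 36-nt sequences and made-up rates
    ("ATGCGTACGTTAGCCTAGGATCCAGTTACGGATCGA", 6.2),
    ("ATGCGTACGTTAGCCTAGGATCCAGTTACGGTACGA", 6.0),
    ("TTACGGCATGCAGTCAGGCTTAACGGATCCATGCAA", 5.1),
]
query = "ATGCGTACGTTAGCCTAGGATCCAGTAACGGATCGA"
print(f"predicted log10(k) = {predict_log_k(query, references):.2f}")
```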

  12. Improving personalized link prediction by hybrid diffusion

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Hu; Zhu, Yu-Xiao; Zhou, Tao

    2016-04-01

    Inspired by traditional link prediction and by the problem of recommending friends in social networks, we introduce personalized link prediction in this paper, in which each individual receives an equal number of diversiform predictions. The performance of many classical algorithms is not satisfactory under this framework, so new algorithms are urgently needed. Motivated by previous research in other fields, we generalize the heat conduction process to the framework of personalized link prediction and find that this method outperforms many classical similarity-based algorithms, especially in terms of diversity. In addition, we demonstrate that adding one ground node that is supposed to connect all the nodes in the system greatly benefits the performance of heat conduction. Finally, better hybrid algorithms composed of local random walk and heat conduction are proposed. Numerical results show that the hybrid algorithms can outperform other algorithms simultaneously in all four adopted metrics: AUC, precision, recall and Hamming distance. In a word, this work may shed some light on the in-depth understanding of the effect of physical processes in personalized link prediction.
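
    A closely related hybrid of mass diffusion and heat conduction (the usual lambda-interpolated transfer matrix) can be sketched in a few lines for link prediction on a toy network. It is a simplified stand-in for the paper's local-random-walk plus heat-conduction combination, with the graph and the lambda value chosen arbitrarily.

```python
import numpy as np

# Sketch of a hybrid mass-diffusion / heat-conduction score for personalized link
# prediction on a small undirected network.  The lambda-interpolated transfer matrix
# follows the usual ProbS/HeatS hybrid form; the toy graph and lambda are illustrative
# and this is not the exact formulation used in the paper.

A = np.array([                    # adjacency matrix of a 6-node toy network
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 1, 0, 0],
    [1, 1, 0, 0, 1, 0],
    [0, 1, 0, 0, 1, 1],
    [0, 0, 1, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

k = A.sum(axis=1)                 # node degrees
lam = 0.5                         # 0 -> pure heat conduction, 1 -> pure mass diffusion

# Hybrid transfer matrix: W[i, j] = (1 / (k_i^(1-lam) * k_j^lam)) * sum_l A[i,l]*A[j,l]/k_l
W = (A / k) @ A.T / np.outer(k ** (1 - lam), k ** lam)

scores = A @ W.T                  # row x: resources from node x's neighbours, spread through W
scores[A > 0] = -np.inf           # ignore links that already exist
np.fill_diagonal(scores, -np.inf) # and self-links

node = 0
best = int(np.argmax(scores[node]))
print(f"top predicted new link for node {node}: node {best}")
```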

  13. A Hybrid Swarm Intelligence Algorithm for Intrusion Detection Using Significant Features.

    PubMed

    Amudha, P; Karthik, S; Sivakumari, S

    2015-01-01

    Intrusion detection has become a main part of network security due to the huge number of attacks which affect computers. This is due to the extensive growth of internet connectivity and accessibility to information systems worldwide. To deal with this problem, in this paper a hybrid algorithm is proposed that integrates a Modified Artificial Bee Colony (MABC) with Enhanced Particle Swarm Optimization (EPSO) to address the intrusion detection problem. The algorithms are combined to obtain better optimization results, and the classification accuracies are obtained by the 10-fold cross-validation method. The purpose of this paper is to select the most relevant features that can represent the pattern of the network traffic and test its effect on the success of the proposed hybrid classification algorithm. To investigate the performance of the proposed method, the intrusion detection KDDCup'99 benchmark dataset from the UCI Machine Learning Repository is used. The performance of the proposed method is compared with other machine learning algorithms and found to be significantly different.

  14. A Hybrid Swarm Intelligence Algorithm for Intrusion Detection Using Significant Features

    PubMed Central

    Amudha, P.; Karthik, S.; Sivakumari, S.

    2015-01-01

    Intrusion detection has become a main part of network security due to the huge number of attacks which affect computers. This is due to the extensive growth of internet connectivity and accessibility to information systems worldwide. To deal with this problem, in this paper a hybrid algorithm is proposed that integrates a Modified Artificial Bee Colony (MABC) with Enhanced Particle Swarm Optimization (EPSO) to address the intrusion detection problem. The algorithms are combined to obtain better optimization results, and the classification accuracies are obtained by the 10-fold cross-validation method. The purpose of this paper is to select the most relevant features that can represent the pattern of the network traffic and test its effect on the success of the proposed hybrid classification algorithm. To investigate the performance of the proposed method, the intrusion detection KDDCup'99 benchmark dataset from the UCI Machine Learning Repository is used. The performance of the proposed method is compared with other machine learning algorithms and found to be significantly different. PMID:26221625

  15. Hybrid-optimization strategy for the communication of large-scale Kinetic Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Wu, Baodong; Li, Shigang; Zhang, Yunquan; Nie, Ningming

    2017-02-01

    The parallel Kinetic Monte Carlo (KMC) algorithm based on domain decomposition has been widely used in large-scale physical simulations. However, the communication overhead of the parallel KMC algorithm is critical and severely degrades the overall performance and scalability. In this paper, we present a hybrid optimization strategy to reduce the communication overhead of parallel KMC simulations. We first propose a communication aggregation algorithm to reduce the total number of messages and eliminate communication redundancy. Then, we utilize shared memory to reduce the memory copy overhead of intra-node communication. Finally, we optimize the communication scheduling using neighborhood collective operations. We demonstrate the scalability and high performance of our hybrid optimization strategy by both theoretical and experimental analysis. Results show that the optimized KMC algorithm exhibits better performance and scalability than the well-known open-source library SPPARKS. On a 32-node Xeon E5-2680 cluster (640 cores in total), the optimized algorithm reduces the communication time by 24.8% compared with SPPARKS.

  16. Information Filtering via a Scaling-Based Function

    PubMed Central

    Qiu, Tian; Zhang, Zi-Ke; Chen, Guang

    2013-01-01

    Finding a universal description of algorithm optimization is one of the key challenges in personalized recommendation. In this article, for the first time, we introduce a scaling-based algorithm (SCL), independent of the recommendation list length, built on a hybrid algorithm of heat conduction and mass diffusion, by finding the scaling function relating the tunable parameter to the object average degree. The optimal value of the tunable parameter can be extracted from the scaling function, and is heterogeneous for individual objects. Experimental results obtained from three real datasets, Netflix, MovieLens and RYM, show that the SCL is highly accurate in recommendation. More importantly, compared with a number of excellent algorithms, including the mass diffusion method, the original hybrid method, and even an improved version of the hybrid method, the SCL algorithm remarkably improves personalized recommendation in three other aspects: solving the accuracy-diversity dilemma, presenting high novelty, and addressing the key challenge of the cold-start problem. PMID:23696829

  17. Finding minimum spanning trees more efficiently for tile-based phase unwrapping

    NASA Astrophysics Data System (ADS)

    Sawaf, Firas; Tatam, Ralph P.

    2006-06-01

    The tile-based phase unwrapping method employs an algorithm for finding the minimum spanning tree (MST) in each tile. We first examine the properties of a tile's representation from a graph theory viewpoint, observing that it is possible to make use of a more efficient class of MST algorithms. We then describe a novel linear time algorithm which reduces the size of the MST problem by half at the least, and solves it completely at best. We also show how this algorithm can be applied to a tile using a sliding window technique. Finally, we show how the reduction algorithm can be combined with any other standard MST algorithm to achieve a more efficient hybrid, using Prim's algorithm for empirical comparison and noting that the reduction algorithm takes only 0.1% of the time taken by the overall hybrid.
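
    For reference, the kind of standard MST routine such a reduction step would be paired with is Prim's algorithm; a compact priority-queue version on a small illustrative graph is sketched below.

```python
import heapq

# Minimal Prim's algorithm sketch for the MST of a weighted, undirected graph, i.e. the
# kind of standard MST routine the reduction algorithm above can be combined with.
# The example graph is illustrative.

def prim_mst(graph, start):
    """graph: {node: [(weight, neighbour), ...]}.  Returns MST edges as (u, v, weight)."""
    visited = {start}
    heap = [(w, start, v) for w, v in graph[start]]
    heapq.heapify(heap)
    mst = []
    while heap and len(visited) < len(graph):
        w, u, v = heapq.heappop(heap)       # lightest edge leaving the visited set
        if v in visited:
            continue
        visited.add(v)
        mst.append((u, v, w))
        for w2, v2 in graph[v]:
            if v2 not in visited:
                heapq.heappush(heap, (w2, v, v2))
    return mst

graph = {
    "a": [(2, "b"), (5, "c")],
    "b": [(2, "a"), (1, "c"), (4, "d")],
    "c": [(5, "a"), (1, "b"), (3, "d")],
    "d": [(4, "b"), (3, "c")],
}
print(prim_mst(graph, "a"))   # expected MST edges: a-b (2), b-c (1), c-d (3)
```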

  18. Hybrid approach of selecting hyperparameters of support vector machine for regression.

    PubMed

    Jeng, Jin-Tsong

    2006-06-01

    To select the hyperparameters of the support vector machine for regression (SVR), a hybrid approach is proposed to determine the kernel parameter of the Gaussian kernel function and the epsilon value of Vapnik's epsilon-insensitive loss function. The proposed hybrid approach includes a competitive agglomeration (CA) clustering algorithm and a repeated SVR (RSVR) approach. Since the CA clustering algorithm is used to find the nearly "optimal" number of clusters and the centers of clusters in the clustering process, the CA clustering algorithm is applied to select the Gaussian kernel parameter. Additionally, an RSVR approach that relies on the standard deviation of the training error is proposed to obtain the epsilon in the loss function. Finally, two functions, one real data set (i.e., a time series of the quarterly unemployment rate for West Germany), and the identification of a nonlinear plant are used to verify the usefulness of the hybrid approach.
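
    A much-simplified sketch of the repeated-SVR idea follows: fit an initial SVR, set epsilon from the standard deviation of the training residuals, and refit. A crude median-distance heuristic stands in for the CA clustering step that selects the Gaussian kernel width; both choices and the synthetic data are illustrative, not the paper's procedure.

```python
import numpy as np
from sklearn.svm import SVR

# Simplified sketch of the "repeated SVR" idea: fit an SVR, derive epsilon from the
# standard deviation of the training residuals, then refit with that epsilon.
# The CA clustering step for the Gaussian kernel width is replaced by a crude
# median-distance heuristic; data and constants are illustrative assumptions.

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sinc(X).ravel() + rng.normal(scale=0.1, size=200)

# Heuristic kernel width: gamma = 1 / (2 * median pairwise squared distance)
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
gamma = 1.0 / (2.0 * np.median(d2[d2 > 0]))

first = SVR(kernel="rbf", gamma=gamma, epsilon=0.1).fit(X, y)
residual_std = float(np.std(y - first.predict(X)))

refit = SVR(kernel="rbf", gamma=gamma, epsilon=residual_std).fit(X, y)
print(f"gamma={gamma:.3f}, epsilon from residual std={residual_std:.3f}, "
      f"support vectors: {len(first.support_)} -> {len(refit.support_)}")
```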

  19. Full Gradient Solution to Adaptive Hybrid Control

    NASA Technical Reports Server (NTRS)

    Bean, Jacob; Schiller, Noah H.; Fuller, Chris

    2017-01-01

    This paper focuses on the adaptation mechanisms in adaptive hybrid controllers. Most adaptive hybrid controllers update two filters individually according to the filtered reference least mean squares (FxLMS) algorithm. Because this algorithm was derived for feedforward control, it does not take into account the presence of a feedback loop in the gradient calculation. This paper provides a derivation of the proper weight vector gradient for hybrid (or feedback) controllers that takes into account the presence of feedback. In this formulation, a single weight vector is updated rather than two individually. An internal model structure is assumed for the feedback part of the controller. The full gradient is equivalent to that used in the standard FxLMS algorithm with the addition of a recursive term that is a function of the modeling error. Some simulations are provided to highlight the advantages of using the full gradient in the weight vector update rather than the approximation.
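
    For context, the feedforward baseline being extended is the single-channel FxLMS update; a synthetic sketch is given below. The primary and secondary paths and all signals are toy values, and the recursive feedback-aware term derived in the paper is deliberately not included.

```python
import numpy as np

# Minimal single-channel FxLMS sketch: the feedforward baseline whose gradient the paper
# extends with a recursive, feedback-aware term (not included here).  The primary and
# secondary path impulse responses and all signals are synthetic toy values.

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)                       # reference signal
P = np.array([0.0, 0.6, 0.3, 0.1])           # primary path impulse response (toy)
S = np.array([0.5, 0.25])                    # secondary path (assumed known/estimated)

d = np.convolve(x, P)[:n]                    # disturbance at the error sensor
w = np.zeros(8)                              # adaptive FIR control filter
mu = 0.01                                    # step size

xbuf = np.zeros_like(w)                      # recent reference samples
fxbuf = np.zeros_like(w)                     # recent filtered-reference samples
sbuf = np.zeros_like(S)                      # reference history for secondary-path filtering
ybuf = np.zeros_like(S)                      # control-output history through secondary path
errors = []

for i in range(n):
    xbuf = np.roll(xbuf, 1); xbuf[0] = x[i]
    y = w @ xbuf                             # control filter output
    ybuf = np.roll(ybuf, 1); ybuf[0] = y
    e = d[i] + S @ ybuf                      # error = disturbance + control through secondary path
    sbuf = np.roll(sbuf, 1); sbuf[0] = x[i]
    fx = S @ sbuf                            # filtered-reference sample
    fxbuf = np.roll(fxbuf, 1); fxbuf[0] = fx
    w -= mu * e * fxbuf                      # FxLMS weight update
    errors.append(e)

print(f"mean |e|, first 500 samples: {np.mean(np.abs(errors[:500])):.3f}, "
      f"last 500 samples: {np.mean(np.abs(errors[-500:])):.3f}")
```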

  20. Beam-column joint shear prediction using hybridized deep learning neural network with genetic algorithm

    NASA Astrophysics Data System (ADS)

    Mundher Yaseen, Zaher; Abdulmohsin Afan, Haitham; Tran, Minh-Tung

    2018-04-01

    It is scientifically evidenced that beam-column joints are critical points in reinforced concrete (RC) structures under fluctuating load effects. In this study, a novel hybrid data-intelligence model is developed to predict the joint shear behavior of exterior beam-column structural frames. The hybrid data-intelligence model is called the genetic algorithm integrated with deep learning neural network model (GA-DLNN). The genetic algorithm is used as a prior modelling phase for input approximation, whereas the DLNN predictive model is used for the prediction phase. To demonstrate this structural problem, experimental data were collected from the literature, defining the dimensional and specimen properties. The attained findings evidenced the effectiveness of the hybrid GA-DLNN in modelling the beam-column joint shear problem. In addition, accurate prediction was achieved with fewer input variables owing to the feasibility of the evolutionary phase.

  1. Full Gradient Solution to Adaptive Hybrid Control

    NASA Technical Reports Server (NTRS)

    Bean, Jacob; Schiller, Noah H.; Fuller, Chris

    2016-01-01

    This paper focuses on the adaptation mechanisms in adaptive hybrid controllers. Most adaptive hybrid controllers update two filters individually according to the filtered-reference least mean squares (FxLMS) algorithm. Because this algorithm was derived for feedforward control, it does not take into account the presence of a feedback loop in the gradient calculation. This paper provides a derivation of the proper weight vector gradient for hybrid (or feedback) controllers that takes into account the presence of feedback. In this formulation, a single weight vector is updated rather than two individually. An internal model structure is assumed for the feedback part of the controller. The full gradient is equivalent to that used in the standard FxLMS algorithm with the addition of a recursive term that is a function of the modeling error. Some simulations are provided to highlight the advantages of using the full gradient in the weight vector update rather than the approximation.

  2. A survey on evolutionary algorithm based hybrid intelligence in bioinformatics.

    PubMed

    Li, Shan; Kang, Liying; Zhao, Xing-Ming

    2014-01-01

    With the rapid advance in genomics, proteomics, metabolomics, and other types of omics technologies during the past decades, a tremendous amount of data related to molecular biology has been produced. It is becoming a big challenge for bioinformaticians to analyze and interpret these data with conventional intelligent techniques, for example, support vector machines. Recently, hybrid intelligent methods, which integrate several standard intelligent approaches, are becoming more and more popular due to their robustness and efficiency. Specifically, hybrid intelligent approaches based on evolutionary algorithms (EAs) are widely used in various fields due to the efficiency and robustness of EAs. In this review, we give an introduction to the applications of hybrid intelligent methods, in particular those based on evolutionary algorithms, in bioinformatics. We focus on their applications to three common problems that arise in bioinformatics, that is, feature selection, parameter estimation, and reconstruction of biological networks.

  3. Model-on-Demand Predictive Control for Nonlinear Hybrid Systems With Application to Adaptive Behavioral Interventions

    PubMed Central

    Nandola, Naresh N.; Rivera, Daniel E.

    2011-01-01

    This paper presents a data-centric modeling and predictive control approach for nonlinear hybrid systems. System identification of hybrid systems represents a challenging problem because model parameters depend on the mode or operating point of the system. The proposed algorithm applies Model-on-Demand (MoD) estimation to generate a local linear approximation of the nonlinear hybrid system at each time step, using a small subset of data selected by an adaptive bandwidth selector. The appeal of the MoD approach lies in the fact that model parameters are estimated based on a current operating point; hence estimation of locations or modes governed by autonomous discrete events is achieved automatically. The local MoD model is then converted into a mixed logical dynamical (MLD) system representation which can be used directly in a model predictive control (MPC) law for hybrid systems using multiple-degree-of-freedom tuning. The effectiveness of the proposed MoD predictive control algorithm for nonlinear hybrid systems is demonstrated on a hypothetical adaptive behavioral intervention problem inspired by Fast Track, a real-life preventive intervention for improving parental function and reducing conduct disorder in at-risk children. Simulation results demonstrate that the proposed algorithm can be useful for adaptive intervention problems exhibiting both nonlinear and hybrid character. PMID:21874087

  4. Hybridization properties of long nucleic acid probes for detection of variable target sequences, and development of a hybridization prediction algorithm

    PubMed Central

    Öhrmalm, Christina; Jobs, Magnus; Eriksson, Ronnie; Golbob, Sultan; Elfaitouri, Amal; Benachenhou, Farid; Strømme, Maria; Blomberg, Jonas

    2010-01-01

    One of the main problems in nucleic acid-based techniques for detection of infectious agents, such as influenza viruses, is that of nucleic acid sequence variation. DNA probes, 70-nt long, some including the nucleotide analog deoxyribose-Inosine (dInosine), were analyzed for hybridization tolerance to different amounts and distributions of mismatching bases, e.g. synonymous mutations, in target DNA. Microsphere-linked 70-mer probes were hybridized in 3M TMAC buffer to biotinylated single-stranded (ss) DNA for subsequent analysis in a Luminex® system. When mismatches interrupted contiguous matching stretches of 6 nt or longer, it had a strong impact on hybridization. Contiguous matching stretches are more important than the same number of matching nucleotides separated by mismatches into several regions. dInosine, but not 5-nitroindole, substitutions at mismatching positions stabilized hybridization remarkably well, comparable to N (4-fold) wobbles in the same positions. In contrast to shorter probes, 70-nt probes with judiciously placed dInosine substitutions and/or wobble positions were remarkably mismatch tolerant, with preserved specificity. An algorithm, NucZip, was constructed to model the nucleation and zipping phases of hybridization, integrating both local and distant binding contributions. It predicted hybridization more exactly than previous algorithms, and has the potential to guide the design of variation-tolerant yet specific probes. PMID:20864443

  5. Watershed model calibration framework developed using an influence coefficient algorithm and a genetic algorithm and analysis of pollutant discharge characteristics and load reduction in a TMDL planning area.

    PubMed

    Cho, Jae Heon; Lee, Jong Ho

    2015-11-01

    Manual calibration is common in rainfall-runoff model applications. However, rainfall-runoff models include several complicated parameters; thus, significant time and effort are required to manually calibrate the parameters individually and repeatedly. Automatic calibration has relative merit regarding time efficiency and objectivity but shortcomings regarding understanding of indigenous processes in the basin. In this study, a watershed model calibration framework was developed using an influence coefficient algorithm and a genetic algorithm (WMCIG) to automatically calibrate distributed models. The optimization problem used to minimize the sum of squares of the normalized residuals of the observed and predicted values was solved using a genetic algorithm (GA). The final model parameters were determined from the iteration with the smallest sum of squares of the normalized residuals of all iterations. The WMCIG was applied to the Gomakwoncheon watershed, located in an area subject to a total maximum daily load (TMDL) in Korea. The proportion of urbanized area in this watershed is low, and the diffuse pollution loads of nutrients such as phosphorus are greater than the point-source pollution loads because of the concentration of rainfall that occurs during the summer. The pollution discharges from the watershed were estimated for each land-use type, and the seasonal variations of the pollution loads were analyzed. Consecutive flow measurement gauges have not been installed in this area, and it is difficult to survey the flow and water quality in this area during the frequent heavy rainfall that occurs during the wet season. The Hydrological Simulation Program-Fortran (HSPF) model was used to calculate the runoff flow and water quality in this basin. Using the water quality results, a load duration curve was constructed for the basin, the exceedance frequency of the water quality standard (WQS) was calculated for each hydrologic condition class, and the percent reduction required to achieve the water quality standard was estimated. The R² value for the calibrated BOD5 was 0.60, which is a moderate result, and the R² value for the TP was 0.86, which is a good result. The percent differences obtained for the calibrated BOD5 and TP were very good; therefore, the calibration results using WMCIG were satisfactory. From the load duration curve analysis, the WQS exceedance frequencies of the BOD5 under dry conditions and low-flow conditions were 75.7% and 65%, respectively, and the exceedance frequencies under moist and mid-range conditions were higher than under other conditions. The exceedance frequencies of the TP for the high-flow, moist and mid-range conditions were high, and the exceedance rate for the high-flow condition was particularly high. Most of the data from the high-flow conditions exceeded the WQSs. Thus, nonpoint-source pollutants from storm-water runoff substantially affected the TP concentration in the Gomakwoncheon. Copyright © 2015 Elsevier Ltd. All rights reserved.
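
    The calibration loop itself reduces to minimizing the sum of squares of normalized residuals with a genetic algorithm. The sketch below does this for a hypothetical two-parameter runoff model using a simple real-coded GA; the model, data, and GA settings are stand-ins for HSPF and the WMCIG framework, purely for illustration.

```python
import numpy as np

# Minimal sketch of GA-based calibration against the sum of squared normalized residuals.
# The two-parameter "runoff model" and the simple real-coded GA are illustrative stand-ins
# for HSPF and the WMCIG framework.

rng = np.random.default_rng(3)

def toy_runoff(params, rainfall):
    """Hypothetical 2-parameter model: runoff coefficient and baseflow."""
    coeff, baseflow = params
    return coeff * rainfall + baseflow

rainfall = rng.gamma(shape=2.0, scale=5.0, size=60)
observed = toy_runoff([0.45, 1.2], rainfall) + rng.normal(scale=0.5, size=60)

def objective(params):
    """Sum of squares of residuals normalized by the observed values."""
    predicted = toy_runoff(params, rainfall)
    return float(np.sum(((observed - predicted) / observed) ** 2))

# Simple real-coded GA: tournament selection, blend crossover, Gaussian mutation, elitism.
bounds = np.array([[0.0, 1.0], [0.0, 5.0]])
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 2))
for generation in range(100):
    fitness = np.array([objective(ind) for ind in pop])
    new_pop = [pop[fitness.argmin()]]                      # keep the best individual
    while len(new_pop) < len(pop):
        i, j = rng.integers(0, len(pop), size=2)
        parent_a = pop[i] if fitness[i] < fitness[j] else pop[j]
        i, j = rng.integers(0, len(pop), size=2)
        parent_b = pop[i] if fitness[i] < fitness[j] else pop[j]
        alpha = rng.uniform(size=2)
        child = alpha * parent_a + (1 - alpha) * parent_b  # blend crossover
        child += rng.normal(scale=0.02, size=2)            # mutation
        new_pop.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
    pop = np.array(new_pop)

best = pop[np.argmin([objective(ind) for ind in pop])]
print(f"calibrated parameters: {best.round(3)}, objective: {objective(best):.3f}")
```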

  6. Evaluating the impact of field-scale management strategies on sediment transport to the watershed outlet.

    PubMed

    Sommerlot, Andrew R; Pouyan Nejadhashemi, A; Woznicki, Sean A; Prohaska, Michael D

    2013-10-15

    Non-point source pollution from agricultural lands is a significant contributor of sediment pollution in United States lakes and streams. Therefore, quantifying the impact of individual field management strategies at the watershed scale provides valuable information to watershed managers and conservation agencies to enhance decision-making. In this study, four methods employing some of the most cited models in field- and watershed-scale analysis were compared to find a practical yet accurate method for evaluating field management strategies at the watershed outlet. The models used in this study included a field-scale model (the Revised Universal Soil Loss Equation 2 - RUSLE2), a spatially explicit overland sediment delivery model (SEDMOD), and a watershed-scale model (the Soil and Water Assessment Tool - SWAT). These models were used to develop four modeling strategies (methods) for the River Raisin watershed: Method 1) predefined field-scale subbasin and reach layers were used in the SWAT model; Method 2) a subbasin-scale sediment delivery ratio was employed; Method 3) results obtained from the field-scale RUSLE2 model were incorporated as point source inputs to the SWAT watershed model; and Method 4) a hybrid solution combining analyses from the RUSLE2, SEDMOD, and SWAT models. Method 4 was selected as the most accurate among the studied methods. In addition, the effectiveness of six best management practices (BMPs) in terms of water quality improvement and associated cost was assessed. Economic analysis was performed using Method 4, and producer-requested prices for BMPs were compared with prices defined by the Environmental Quality Incentives Program (EQIP). On a per unit area basis, producers requested higher prices than EQIP in four of six BMP categories. Meanwhile, the true cost of sediment reduction at the field and watershed scales was greater than EQIP in five of six BMP categories according to producer-requested prices. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Genetic algorithms with memory- and elitism-based immigrants in dynamic environments.

    PubMed

    Yang, Shengxiang

    2008-01-01

    In recent years the genetic algorithm community has shown a growing interest in studying dynamic optimization problems. Several approaches have been devised. The random immigrants and memory schemes are two major ones. The random immigrants scheme addresses dynamic environments by maintaining the population diversity while the memory scheme aims to adapt genetic algorithms quickly to new environments by reusing historical information. This paper investigates a hybrid memory and random immigrants scheme, called memory-based immigrants, and a hybrid elitism and random immigrants scheme, called elitism-based immigrants, for genetic algorithms in dynamic environments. In these schemes, the best individual from memory or the elite from the previous generation is retrieved as the base to create immigrants into the population by mutation. This way, not only can diversity be maintained but it is done more efficiently to adapt genetic algorithms to the current environment. Based on a series of systematically constructed dynamic problems, experiments are carried out to compare genetic algorithms with the memory-based and elitism-based immigrants schemes against genetic algorithms with traditional memory and random immigrants schemes and a hybrid memory and multi-population scheme. The sensitivity analysis regarding some key parameters is also carried out. Experimental results show that the memory-based and elitism-based immigrants schemes efficiently improve the performance of genetic algorithms in dynamic environments.
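
    A minimal sketch of the elitism-based immigrants scheme is given below: each generation, mutated copies of the current elite replace the worst individuals in the population. The OneMax-style fitness, population size, and rates are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Minimal sketch of the elitism-based immigrants scheme: every generation, the elite
# individual is mutated to create immigrants that replace the worst members of the
# population.  The OneMax-style fitness and all rates are illustrative assumptions.

rng = np.random.default_rng(7)
GENES, POP, IMMIGRANTS, P_MUT, P_IMM_MUT = 30, 40, 6, 0.02, 0.1

def fitness(pop, target):
    return (pop == target).sum(axis=1)          # matches to a (possibly changing) target

def mutate(pop, rate):
    flips = rng.random(pop.shape) < rate
    return np.where(flips, 1 - pop, pop)

target = rng.integers(0, 2, GENES)              # the environment; would change over time
pop = rng.integers(0, 2, (POP, GENES))

for generation in range(60):
    fit = fitness(pop, target)
    elite = pop[fit.argmax()].copy()

    # Standard generation: tournament selection followed by bit-flip mutation
    a, b = rng.integers(0, POP, (2, POP))
    parents = np.where((fit[a] > fit[b])[:, None], pop[a], pop[b])
    pop = mutate(parents, P_MUT)

    # Elitism-based immigrants: mutated copies of the elite replace the worst individuals
    immigrants = mutate(np.tile(elite, (IMMIGRANTS, 1)), P_IMM_MUT)
    worst = np.argsort(fitness(pop, target))[:IMMIGRANTS]
    pop[worst] = immigrants

print("best fitness:", fitness(pop, target).max(), "of", GENES)
```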

  8. Application of hybrid artificial fish swarm algorithm based on similar fragments in VRP

    NASA Astrophysics Data System (ADS)

    Che, Jinnuo; Zhou, Kang; Zhang, Xueyu; Tong, Xin; Hou, Lingyun; Jia, Shiyu; Zhen, Yiting

    2018-03-01

    Focusing on the decrease of convergence speed and calculation precision at the end of the process in the Artificial Fish Swarm Algorithm (AFSA), as well as the instability of its results, a hybrid AFSA based on similar fragments is proposed. Traditional AFSA enjoys obvious advantages in solving complex optimization problems like the Vehicle Routing Problem (VRP), but it has a few limitations such as low convergence speed, low precision, and instability of results. In this paper, two improvements are introduced. On the one hand, the definition of the distance for artificial fish is changed and the vision field of the artificial fish is increased, which improves speed and precision when solving VRP. On the other hand, the artificial bee colony algorithm (ABC) is mixed into AFSA - the population of artificial fish is initialized by the ABC, which solves the problem of instability of results to some extent. The experimental results demonstrate that the optimal solution of the hybrid AFSA approaches the optimal solution of the standard database more easily than the other two algorithms. In conclusion, the hybrid algorithm can effectively address the instability of results and the decrease of convergence speed and calculation precision at the end of the process.

  9. Control algorithms for aerobraking in the Martian atmosphere

    NASA Technical Reports Server (NTRS)

    Ward, Donald T.; Shipley, Buford W., Jr.

    1991-01-01

    The Analytic Predictor Corrector (APC) and Energy Controller (EC) atmospheric guidance concepts were adapted to control an interplanetary vehicle aerobraking in the Martian atmosphere. Changes are made to the APC to improve its robustness to density variations. These changes include adaptation of a new exit phase algorithm, an adaptive transition velocity to initiate the exit phase, refinement of the reference dynamic pressure calculation and two improved density estimation techniques. The modified controller with the hybrid density estimation technique is called the Mars Hybrid Predictor Corrector (MHPC), while the modified controller with a polynomial density estimator is called the Mars Predictor Corrector (MPC). A Lyapunov Steepest Descent Controller (LSDC) is adapted to control the vehicle. The LSDC lacked robustness, so a Lyapunov tracking exit phase algorithm is developed to guide the vehicle along a reference trajectory. This algorithm, when using the hybrid density estimation technique to define the reference path, is called the Lyapunov Hybrid Tracking Controller (LHTC). With the polynomial density estimator used to define the reference trajectory, the algorithm is called the Lyapunov Tracking Controller (LTC). These four new controllers are tested using a six degree of freedom computer simulation to evaluate their robustness. The MHPC, MPC, LHTC, and LTC show dramatic improvements in robustness over the APC and EC.

  10. A hybrid algorithm for clustering of time series data based on affinity search technique.

    PubMed

    Aghabozorgi, Saeed; Ying Wah, Teh; Herawan, Tutut; Jalab, Hamid A; Shaygan, Mohammad Amin; Jalali, Alireza

    2014-01-01

    Time series clustering is an important solution to various problems in numerous fields of research, including business, medical science, and finance. However, conventional clustering algorithms are not practical for time series data because they are essentially designed for static data. This impracticality results in poor clustering accuracy in several systems. In this paper, a new hybrid clustering algorithm is proposed based on the similarity in shape of time series data. Time series data are first grouped as subclusters based on similarity in time. The subclusters are then merged using the k-Medoids algorithm based on similarity in shape. This model has two contributions: (1) it is more accurate than other conventional and hybrid approaches and (2) it determines the similarity in shape among time series data with a low complexity. To evaluate the accuracy of the proposed model, the model is tested extensively using syntactic and real-world time series datasets.

  11. Decentralized Feedback Controllers for Exponential Stabilization of Hybrid Periodic Orbits: Application to Robotic Walking.

    PubMed

    Hamed, Kaveh Akbari; Gregg, Robert D

    2016-07-01

    This paper presents a systematic algorithm to design time-invariant decentralized feedback controllers to exponentially stabilize periodic orbits for a class of hybrid dynamical systems arising from bipedal walking. The algorithm assumes a class of parameterized and nonlinear decentralized feedback controllers which coordinate lower-dimensional hybrid subsystems based on a common phasing variable. The exponential stabilization problem is translated into an iterative sequence of optimization problems involving bilinear and linear matrix inequalities, which can be easily solved with available software packages. A set of sufficient conditions for the convergence of the iterative algorithm to a stabilizing decentralized feedback control solution is presented. The power of the algorithm is demonstrated by designing a set of local nonlinear controllers that cooperatively produce stable walking for a 3D autonomous biped with 9 degrees of freedom, 3 degrees of underactuation, and a decentralization scheme motivated by amputee locomotion with a transpelvic prosthetic leg.

  12. Decentralized Feedback Controllers for Robust Stabilization of Periodic Orbits of Hybrid Systems: Application to Bipedal Walking.

    PubMed

    Hamed, Kaveh Akbari; Gregg, Robert D

    2017-07-01

    This paper presents a systematic algorithm to design time-invariant decentralized feedback controllers to exponentially and robustly stabilize periodic orbits for hybrid dynamical systems against possible uncertainties in discrete-time phases. The algorithm assumes a family of parameterized and decentralized nonlinear controllers to coordinate interconnected hybrid subsystems based on a common phasing variable. The exponential and H2 robust stabilization problems of periodic orbits are translated into an iterative sequence of optimization problems involving bilinear and linear matrix inequalities. By investigating the properties of the Poincaré map, some sufficient conditions for the convergence of the iterative algorithm are presented. The power of the algorithm is finally demonstrated through designing a set of robust stabilizing local nonlinear controllers for walking of an underactuated 3D autonomous bipedal robot with 9 degrees of freedom, impact model uncertainties, and a decentralization scheme motivated by amputee locomotion with a transpelvic prosthetic leg.

  13. Decentralized Feedback Controllers for Exponential Stabilization of Hybrid Periodic Orbits: Application to Robotic Walking*

    PubMed Central

    Hamed, Kaveh Akbari; Gregg, Robert D.

    2016-01-01

    This paper presents a systematic algorithm to design time-invariant decentralized feedback controllers to exponentially stabilize periodic orbits for a class of hybrid dynamical systems arising from bipedal walking. The algorithm assumes a class of parameterized and nonlinear decentralized feedback controllers which coordinate lower-dimensional hybrid subsystems based on a common phasing variable. The exponential stabilization problem is translated into an iterative sequence of optimization problems involving bilinear and linear matrix inequalities, which can be easily solved with available software packages. A set of sufficient conditions for the convergence of the iterative algorithm to a stabilizing decentralized feedback control solution is presented. The power of the algorithm is demonstrated by designing a set of local nonlinear controllers that cooperatively produce stable walking for a 3D autonomous biped with 9 degrees of freedom, 3 degrees of underactuation, and a decentralization scheme motivated by amputee locomotion with a transpelvic prosthetic leg. PMID:27990059

  14. Decentralized Feedback Controllers for Robust Stabilization of Periodic Orbits of Hybrid Systems: Application to Bipedal Walking

    PubMed Central

    Hamed, Kaveh Akbari; Gregg, Robert D.

    2016-01-01

    This paper presents a systematic algorithm to design time-invariant decentralized feedback controllers to exponentially and robustly stabilize periodic orbits for hybrid dynamical systems against possible uncertainties in discrete-time phases. The algorithm assumes a family of parameterized and decentralized nonlinear controllers to coordinate interconnected hybrid subsystems based on a common phasing variable. The exponential and H2 robust stabilization problems of periodic orbits are translated into an iterative sequence of optimization problems involving bilinear and linear matrix inequalities. By investigating the properties of the Poincaré map, some sufficient conditions for the convergence of the iterative algorithm are presented. The power of the algorithm is finally demonstrated through designing a set of robust stabilizing local nonlinear controllers for walking of an underactuated 3D autonomous bipedal robot with 9 degrees of freedom, impact model uncertainties, and a decentralization scheme motivated by amputee locomotion with a transpelvic prosthetic leg. PMID:28959117

  15. A Hybrid Algorithm for Clustering of Time Series Data Based on Affinity Search Technique

    PubMed Central

    Aghabozorgi, Saeed; Ying Wah, Teh; Herawan, Tutut; Jalab, Hamid A.; Shaygan, Mohammad Amin; Jalali, Alireza

    2014-01-01

    Time series clustering is an important solution to various problems in numerous fields of research, including business, medical science, and finance. However, conventional clustering algorithms are not practical for time series data because they are essentially designed for static data. This impracticality results in poor clustering accuracy in several systems. In this paper, a new hybrid clustering algorithm is proposed based on the similarity in shape of time series data. Time series data are first grouped as subclusters based on similarity in time. The subclusters are then merged using the k-Medoids algorithm based on similarity in shape. This model has two contributions: (1) it is more accurate than other conventional and hybrid approaches and (2) it determines the similarity in shape among time series data with a low complexity. To evaluate the accuracy of the proposed model, the model is tested extensively using syntactic and real-world time series datasets. PMID:24982966

  16. Optimal Integration of Departures and Arrivals in Terminal Airspace

    NASA Technical Reports Server (NTRS)

    Xue, Min; Zelinski, Shannon Jean

    2013-01-01

    Coordination of operations with spatially and temporally shared resources, such as route segments, fixes, and runways, improves the efficiency of terminal airspace management. Problems in this category are, in general, computationally difficult compared to conventional scheduling problems. This paper presents a fast-time algorithm formulation using a non-dominated sorting genetic algorithm (NSGA). It was first applied to a test problem introduced in existing literature, and the experiment showed that the new method can solve the 20-aircraft problem in fast time with a 65% (440 second) delay reduction using shared departure fixes. In order to test its application to a more realistic and complicated problem, the NSGA algorithm was applied to a problem in LAX terminal airspace, where interactions between 28% of LAX arrivals and 10% of LAX departures are resolved by spatial separation in current operations, which may introduce unnecessary delays. In this work, three types of separation - spatial, temporal, and hybrid - were formulated using the new algorithm. Hybrid separation combines both temporal and spatial separation. Results showed that although temporal separation achieved less delay than spatial separation with a small uncertainty buffer, spatial separation outperformed temporal separation when the uncertainty buffer was increased. Hybrid separation introduced much less delay than both the spatial and temporal approaches. For a total of 15 interacting departures and arrivals, when compared to spatial separation, the delay reduction of hybrid separation varied between 11% (3.1 minutes) and 64% (10.7 minutes) for uncertainty buffers from 0 to 60 seconds. Furthermore, as a comparison with the NSGA algorithm, a First-Come-First-Serve based heuristic method was implemented for hybrid separation. Experiments showed that the results from the NSGA algorithm have 9% to 42% less delay than the heuristic method across the varied uncertainty buffer sizes.

  17. Multimodal optimization by using hybrid of artificial bee colony algorithm and BFGS algorithm

    NASA Astrophysics Data System (ADS)

    Anam, S.

    2017-10-01

    Optimization has become one of the important fields in mathematics. Many problems in engineering and science can be formulated as optimization problems, and they may have many local optima. The optimization problem with many local optima, known as the multimodal optimization problem, is concerned with finding the global solution. Several metaheuristic methods have been proposed to solve multimodal optimization problems, such as Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), the Artificial Bee Colony (ABC) algorithm, etc. The performance of the ABC algorithm is better than or similar to that of other population-based algorithms, with the advantage of employing fewer control parameters. The ABC algorithm also has the advantages of strong robustness, fast convergence and high flexibility. However, it has the disadvantage of premature convergence in the later search period, and the accuracy of the optimal value sometimes cannot meet the requirements. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm is a good iterative method for finding a local optimum and compares favorably with other local optimization methods. Based on the advantages of the ABC algorithm and the BFGS algorithm, this paper proposes a hybrid of the artificial bee colony algorithm and the BFGS algorithm to solve the multimodal optimization problem. In the first step, the ABC algorithm is run to find a point; in the second step, the point obtained in the first step is used as the initial point of the BFGS algorithm. The results show that the hybrid method can overcome the problems of the basic ABC algorithm for almost all test functions. However, if the shape of the function is flat, the proposed method does not work well.
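
    The two-stage structure can be sketched compactly: a population-based global search (a crude stand-in here, not a faithful ABC implementation) explores a multimodal test function, and its best point seeds a local BFGS refinement via scipy.

```python
import numpy as np
from scipy.optimize import minimize

# Two-stage sketch of the hybrid idea: a crude random global search stands in for the
# ABC phase (it is not a faithful artificial bee colony), and its best point seeds a
# local BFGS refinement.  Rastrigin is a standard multimodal test function.

def rastrigin(x):
    x = np.asarray(x)
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

rng = np.random.default_rng(0)
dim, bounds = 4, (-5.12, 5.12)

# Stage 1: population-based global exploration (ABC stand-in)
best_x, best_f = None, np.inf
population = rng.uniform(*bounds, size=(50, dim))
for _ in range(200):
    trial = population + rng.normal(scale=0.3, size=population.shape)  # local moves
    trial = np.clip(trial, *bounds)
    improved = (np.array([rastrigin(t) for t in trial])
                < np.array([rastrigin(p) for p in population]))
    population[improved] = trial[improved]                             # greedy acceptance
    f = np.array([rastrigin(p) for p in population])
    if f.min() < best_f:
        best_f, best_x = f.min(), population[f.argmin()].copy()

# Stage 2: local refinement with BFGS, starting from the global search result
result = minimize(rastrigin, best_x, method="BFGS")
print(f"stage 1 best: {best_f:.4f}, after BFGS: {result.fun:.6f} at {result.x.round(3)}")
```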

  18. Real-Time Event Detection for Monitoring Natural and Source ...

    EPA Pesticide Factsheets

    The use of event detection systems in finished drinking water systems is increasing in order to monitor water quality in both operational and security contexts. Recent incidents involving harmful algal blooms and chemical spills into watersheds have increased interest in monitoring source water quality prior to treatment. This work highlights the use of the CANARY event detection software in detecting suspected illicit events in an actively monitored watershed in South Carolina. CANARY is an open source event detection software that was developed by USEPA and Sandia National Laboratories. The software works with any type of sensor, utilizes multiple detection algorithms and approaches, and can incorporate operational information as needed. Monitoring has been underway for several years to detect events related to intentional or unintentional dumping of materials into the monitored watershed. This work evaluates the feasibility of using CANARY to enhance the detection of events in this watershed. This presentation will describe the real-time monitoring approach used in this watershed, the selection of CANARY configuration parameters that optimize detection for this watershed and monitoring application, and the performance of CANARY during the time frame analyzed. Further, this work will highlight how rainfall events impacted analysis, and the innovative application of CANARY taken in order to effectively detect the suspected illicit events. This presentation d

  19. Hybrid massively parallel fast sweeping method for static Hamilton-Jacobi equations

    NASA Astrophysics Data System (ADS)

    Detrixhe, Miles; Gibou, Frédéric

    2016-10-01

    The fast sweeping method is a popular algorithm for solving a variety of static Hamilton-Jacobi equations. Fast sweeping algorithms for parallel computing have been developed, but are severely limited. In this work, we present a multilevel, hybrid parallel algorithm that combines the desirable traits of two distinct parallel methods. The fine and coarse grained components of the algorithm take advantage of heterogeneous computer architecture common in high performance computing facilities. We present the algorithm and demonstrate its effectiveness on a set of example problems including optimal control, dynamic games, and seismic wave propagation. We give results for convergence, parallel scaling, and show state-of-the-art speedup values for the fast sweeping method.
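
    For the simplest static Hamilton-Jacobi case, the eikonal equation |∇u| = 1 on a uniform 2D grid, a compact serial fast sweeping iteration looks like the sketch below; the hybrid parallel decomposition described in the paper is not attempted here, and the grid is illustrative.

```python
import numpy as np

# Compact serial fast sweeping sketch for the eikonal equation |grad u| = 1 on a uniform
# 2D grid (the simplest static Hamilton-Jacobi case).  The paper's hybrid parallel
# decomposition is not attempted here; grid size and source are illustrative.

def fast_sweep_eikonal(source, h=1.0, n_sweeps=2):
    n, m = source.shape
    u = np.where(source, 0.0, 1e10)                    # 0 at sources, "infinity" elsewhere
    orderings = [(range(n), range(m)),
                 (range(n - 1, -1, -1), range(m)),
                 (range(n), range(m - 1, -1, -1)),
                 (range(n - 1, -1, -1), range(m - 1, -1, -1))]
    for _ in range(n_sweeps):
        for rows, cols in orderings:                   # four alternating sweep directions
            for i in rows:
                for j in cols:
                    if source[i, j]:
                        continue
                    a = min(u[i - 1, j] if i > 0 else 1e10, u[i + 1, j] if i < n - 1 else 1e10)
                    b = min(u[i, j - 1] if j > 0 else 1e10, u[i, j + 1] if j < m - 1 else 1e10)
                    if abs(a - b) >= h:                # one-sided update
                        new = min(a, b) + h
                    else:                              # two-sided quadratic update
                        new = 0.5 * (a + b + np.sqrt(2 * h * h - (a - b) ** 2))
                    u[i, j] = min(u[i, j], new)
    return u

grid = np.zeros((50, 50), dtype=bool)
grid[25, 25] = True                                    # single point source
u = fast_sweep_eikonal(grid)
print(f"u at corner (0, 0): {u[0, 0]:.2f}  (Euclidean distance is {np.hypot(25, 25):.2f})")
```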

  20. Overview of existing algorithms for emotion classification. Uncertainties in evaluations of accuracies.

    NASA Astrophysics Data System (ADS)

    Avetisyan, H.; Bruna, O.; Holub, J.

    2016-11-01

    Numerous techniques and algorithms are dedicated to extracting emotions from input data. In our investigation, it was found that emotion-detection approaches can be classified into the following three types: keyword-based / lexicon-based, learning-based, and hybrid. The most commonly used techniques, such as the keyword-spotting method, Support Vector Machines, the Naïve Bayes Classifier, the Hidden Markov Model and hybrid algorithms, have impressive results in this sphere and can reach more than 90% classification accuracy.

  1. Improving Environmental Model Calibration and Prediction

    DTIC Science & Technology

    2011-01-18

    Final Report. First, we have continued to ... develop tools for efficient global optimization of environmental models. Our algorithms are hybrid algorithms that combine evolutionary strategies ... toward practical hybrid optimization tools for environmental models.

  2. On the Optimization of Aerospace Plane Ascent Trajectory

    NASA Astrophysics Data System (ADS)

    Al-Garni, Ahmed; Kassem, Ayman Hamdy

    A hybrid heuristic optimization technique based on genetic algorithms and particle swarm optimization has been developed and tested for trajectory optimization problems with multiple constraints and a multi-objective cost function. The technique is used to calculate control settings for two types of ascending trajectories (constant dynamic pressure, and minimum-fuel-minimum-heat) for a two-dimensional model of an aerospace plane. A thorough statistical analysis is performed on the hybrid technique to compare it with both basic genetic algorithms and particle swarm optimization with respect to convergence and execution time. Genetic algorithm optimization showed better execution time performance, while particle swarm optimization showed better convergence performance. The hybrid optimization technique, benefiting from both techniques, showed superior, robust performance, balancing convergence trends against execution time.

  3. Computing border bases using mutant strategies

    NASA Astrophysics Data System (ADS)

    Ullah, E.; Abbas Khan, S.

    2014-01-01

    Border bases, a generalization of Gröbner bases, have been actively studied in recent years due to their applicability to industrial problems. In cryptography and coding theory, a useful application of border bases is to solve zero-dimensional systems of polynomial equations over finite fields, which motivates us to develop optimizations of the algorithms that compute border bases. In 2006, Kehrein and Kreuzer formulated the Border Basis Algorithm (BBA), an algorithm which allows the computation of border bases that relate to a degree-compatible term ordering. In 2007, J. Ding et al. introduced mutant strategies based on finding special lower-degree polynomials in the ideal. The mutant strategies aim to distinguish special lower-degree polynomials (mutants) from the other polynomials and give them priority in the process of generating new polynomials in the ideal. In this paper we develop hybrid algorithms that use the ideas of J. Ding et al. involving the concept of mutants to optimize the Border Basis Algorithm for solving systems of polynomial equations over finite fields. In particular, we recall a version of the Border Basis Algorithm known as the Improved Border Basis Algorithm and propose two hybrid algorithms, called MBBA and IMBBA. The new mutant variants provide both space efficiency and time efficiency. The efficiency of these newly developed hybrid algorithms is discussed using standard cryptographic examples.

  4. Fuzzy-Based Hybrid Control Algorithm for the Stabilization of a Tri-Rotor UAV

    PubMed Central

    Ali, Zain Anwar; Wang, Daobo; Aamir, Muhammad

    2016-01-01

    In this paper, a novel mathematical fuzzy hybrid scheme is proposed for the stabilization of a tri-rotor unmanned aerial vehicle (UAV). The fuzzy hybrid scheme consists of a fuzzy logic controller and a regulation pole-placement tracking (RST) controller with model reference adaptive control (MRAC), in which the adaptive gains of the RST controller are fine-tuned by the fuzzy logic controller. Brushless direct current (BLDC) motors are installed in the triangular frame of the tri-rotor UAV, which helps maintain control over its motion and its altitude and attitude changes, similar to rotorcraft. An MRAC-based MIT rule is proposed for system stability. Moreover, the proposed hybrid controller with nonlinear flight dynamics is shown in the presence of translational and rotational velocity components. The performance of the proposed algorithm is demonstrated via MATLAB simulations, in which the proposed fuzzy hybrid controller is compared with the existing adaptive RST controller. It shows that our proposed algorithm has better transient performance, with zero steady-state error and fast convergence towards stability. PMID:27171084

  5. Control of equipment isolation system using wavelet-based hybrid sliding mode control

    NASA Astrophysics Data System (ADS)

    Huang, Shieh-Kung; Loh, Chin-Hsiung

    2017-04-01

    Critical non-structural equipment, including life-saving equipment in hospitals, circuit breakers, computers, high technology instrumentation, etc., is vulnerable to strong earthquakes, and the failure of such vibration-sensitive equipment can cause severe economic loss. In order to protect vibration-sensitive equipment or machinery against strong earthquakes, various innovative control algorithms have been developed to compensate for the internal forces to be applied. These new or improved control strategies, such as control algorithms based on optimal control theory and sliding mode control (SMC), are also developed for structural engineering as a key element in smart structure technology. Optimal control theory, one of the most common methodologies in feedback control, finds control forces by achieving a certain optimal criterion through minimizing a cost function. For example, the linear-quadratic regulator (LQR) has been the most popular control algorithm over the past three decades, and a number of modifications have been proposed to increase the efficiency of the classical LQR algorithm. However, apart from the advantages of simplicity and ease of implementation, LQR is susceptible to parameter uncertainty and modeling error due to the complex nature of civil structures. Different from LQR control, a robust and easy-to-implement control algorithm, SMC, has also been studied. SMC is a nonlinear control methodology that forces the structural system to slide along surfaces or boundaries; hence this control algorithm is naturally robust with respect to parametric uncertainties of a structure. Early attempts at protecting vibration-sensitive equipment were based on the use of the existing control algorithms described above. In recent years, however, researchers have tried to renew existing control algorithms or to develop new control algorithms adapted to the complex nature of civil structures, which includes the control of both structures and non-structural components. The aim of this paper is to develop a hybrid control algorithm for the control of both structures and equipment simultaneously, overcoming the limitations of classical feedback control by combining the advantages of classical LQR and SMC. To suppress vibrations whose frequency content differs from the natural frequencies of civil structures, a hybrid control algorithm integrated with a wavelet-based vibration control algorithm is developed. The performance of the classical, hybrid, and wavelet-based hybrid control algorithms, as well as the responses of the structure and non-structural components, are evaluated and discussed through numerical simulation in this study.
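
    As a small concrete piece of the classical side discussed above, the discrete-time LQR gain for a toy single-story structural model can be computed with scipy's Riccati solver, as sketched below; the matrices, weights, and sampling time are assumed values, and the wavelet-based hybrid sliding-mode scheme itself is not reproduced.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Minimal discrete-time LQR sketch for a toy single-story structure model.
# The mass, damping, stiffness, weights, and sampling time are illustrative assumptions;
# the wavelet-based hybrid sliding-mode scheme discussed above is not reproduced here.

dt, m, c, k = 0.01, 1.0, 0.4, 100.0        # sampling time, mass, damping, stiffness (toy)
Ac = np.array([[0.0, 1.0], [-k / m, -c / m]])
Bc = np.array([[0.0], [1.0 / m]])
A = np.eye(2) + dt * Ac                    # simple forward-Euler discretization
B = dt * Bc

Q = np.diag([100.0, 1.0])                  # penalize displacement more than velocity
R = np.array([[0.01]])                     # control-effort weight

P = solve_discrete_are(A, B, Q, R)         # discrete algebraic Riccati equation
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # LQR gain: u = -K x

# Closed-loop check: eigenvalues of (A - B K) should lie inside the unit circle
print("LQR gain K:", K.round(3))
print("closed-loop |eigenvalues|:", np.abs(np.linalg.eigvals(A - B @ K)).round(4))
```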

  6. A Hybrid Approach to Protect Palmprint Templates

    PubMed Central

    Sun, Dongmei; Xiong, Ke; Qiu, Zhengding

    2014-01-01

    Biometric template protection is indispensable to protect personal privacy in large-scale deployment of biometric systems. Accuracy, changeability, and security are three critical requirements for template protection algorithms. However, existing template protection algorithms cannot satisfy all these requirements well. In this paper, we propose a hybrid approach that combines random projection and fuzzy vault to improve performance on these three requirements. A heterogeneous space is designed for combining random projection and fuzzy vault properly in the hybrid scheme. A new chaff point generation method is also proposed to enhance the security of the heterogeneous vault. Theoretical analyses of the proposed hybrid approach in terms of accuracy, changeability, and security are given in this paper. Experimental results based on a palmprint database well support the theoretical analyses and demonstrate the effectiveness of the proposed hybrid approach. PMID:24982977

  7. A hybrid approach to protect palmprint templates.

    PubMed

    Liu, Hailun; Sun, Dongmei; Xiong, Ke; Qiu, Zhengding

    2014-01-01

    Biometric template protection is indispensable for protecting personal privacy in large-scale deployments of biometric systems. Accuracy, changeability, and security are three critical requirements for template protection algorithms. However, existing template protection algorithms cannot satisfy all these requirements well. In this paper, we propose a hybrid approach that combines random projection and fuzzy vault to improve performance on these three points. A heterogeneous space is designed for combining random projection and fuzzy vault properly in the hybrid scheme. A new chaff point generation method is also proposed to enhance the security of the heterogeneous vault. Theoretical analyses of the proposed hybrid approach in terms of accuracy, changeability, and security are given in this paper. Experimental results on a palmprint database support the theoretical analyses well and demonstrate the effectiveness of the proposed hybrid approach.
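
    The sketch below illustrates only the random-projection ingredient of the hybrid scheme, under the assumption that a user-specific random matrix maps a feature vector to a lower-dimensional, revocable template; the 1024-element feature vector is a random stand-in for real palmprint features.

```python
# Random-projection template sketch; all sizes and data are illustrative.
import numpy as np

user_key_rng = np.random.default_rng(42)            # user-specific seed acts as the key
features = np.random.default_rng(0).random(1024)    # hypothetical palmprint feature vector

R = user_key_rng.standard_normal((256, 1024)) / np.sqrt(256)  # random projection matrix
protected = R @ features                                       # transformed, changeable template
print(protected.shape)   # issuing a new R revokes the old template
```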

  8. Multiple sequence alignment using multi-objective based bacterial foraging optimization algorithm.

    PubMed

    Rani, R Ranjani; Ramyachitra, D

    2016-12-01

    Multiple sequence alignment (MSA) is a widespread approach in computational biology and bioinformatics. MSA deals with how sequences of nucleotides and amino acids are aligned with a minimum number of gaps between them, which reveals the functional, evolutionary and structural relationships among the sequences. Still, computing an MSA that provides accurate and statistically significant alignments is a challenging task. In this work, the Bacterial Foraging Optimization Algorithm was employed to align the biological sequences, which resulted in a non-dominated optimal solution. It employs multiple objectives: maximization of similarity, non-gap percentage and conserved blocks, and minimization of gap penalty. The BAliBASE 3.0 benchmark database was utilized to examine the proposed algorithm against other methods. In this paper, two algorithms have been proposed: a Hybrid Genetic Algorithm with Artificial Bee Colony (GA-ABC) and the Bacterial Foraging Optimization Algorithm. It was found that the Hybrid Genetic Algorithm with Artificial Bee Colony performed better than the existing optimization algorithms, but the conserved blocks were not obtained using GA-ABC. BFO was then used for the alignment and the conserved blocks were obtained. The proposed Multi-Objective Bacterial Foraging Optimization Algorithm (MO-BFO) was compared with the widely used MSA methods Clustal Omega, Kalign, MUSCLE, MAFFT, Genetic Algorithm (GA), Ant Colony Optimization (ACO), Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO) and Hybrid Genetic Algorithm with Artificial Bee Colony (GA-ABC). The final results show that the proposed MO-BFO algorithm yields better alignment than most widely used methods. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  9. A GSA-SVM Hybrid System for Classification of Binary Problems

    NASA Astrophysics Data System (ADS)

    Sarafrazi, Soroor; Nezamabadi-pour, Hossein; Barahman, Mojgan

    2011-06-01

    This paper hybridizes the gravitational search algorithm (GSA) with the support vector machine (SVM) to build a novel GSA-SVM hybrid system that improves classification accuracy in binary problems. GSA is a heuristic optimization tool used to optimize the value of the SVM kernel parameter (in this paper, the radial basis function (RBF) is chosen as the kernel function). The experimental results show that this new approach can achieve high classification accuracy and is comparable to or better than particle swarm optimization (PSO)-SVM and genetic algorithm (GA)-SVM, two other hybrid systems for classification.
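
    As a rough stand-in for the GSA-driven parameter search (GSA itself is not reproduced here), the sketch below tunes the RBF kernel parameter gamma and the penalty C of an SVM by random search scored with cross-validation; the breast-cancer dataset is only an illustrative binary problem.

```python
# Random-search stand-in for heuristic SVM kernel-parameter tuning.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)   # an illustrative binary problem
rng = np.random.default_rng(0)

best = (None, -np.inf)
for _ in range(30):                          # candidate parameter settings
    gamma = 10 ** rng.uniform(-5, 1)         # log-uniform search ranges (assumed)
    C = 10 ** rng.uniform(-2, 3)
    score = cross_val_score(SVC(kernel="rbf", gamma=gamma, C=C), X, y, cv=5).mean()
    if score > best[1]:
        best = ((gamma, C), score)
print("best (gamma, C):", best[0], "accuracy:", round(best[1], 3))
```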

  10. The Water, Energy, and Biogeochemical Model (WEBMOD): A TOPMODEL application developed within the Modular Modeling System

    NASA Astrophysics Data System (ADS)

    Webb, R. M.; Wolock, D. M.; Linard, J. I.; Wieczorek, M. E.

    2004-12-01

    Process-based flow and transport simulation models can help increase understanding of how hydrologic flow paths affect biogeochemical mixing and reactions in watersheds. This presentation describes the Water, Energy, and Biogeochemical Model (WEBMOD), a new model designed to simulate water and chemical transport in both pristine and agricultural watersheds. WEBMOD simulates streamflow using TOPMODEL algorithms and also simulates irrigation, canopy interception, snowpack, and tile-drain flow; these are important processes for successful multi-year simulations of agricultural watersheds. In addition, the hydrologic components of the model are linked to the U.S. Geological Survey's (USGS) geochemical model PHREEQC such that solute chemistry for the hillslopes and streams is also computed. Model development, execution, and calibration take place within the USGS Modular Modeling System. WEBMOD is being validated at ten research watersheds. Five of these watersheds are nearly pristine and comprise the USGS Water, Energy, and Biogeochemical Budget (WEBB) Program field sites: Loch Vale, Colorado; Trout Lake, Wisconsin; Sleepers River, Vermont; Panola Mountain, Georgia; and the Luquillo Experimental Forest, Puerto Rico. The remaining five watersheds contain intensely cultivated fields being studied by the USGS National Water Quality Assessment Program: Merced River, California; Granger Drain, Washington; Maple Creek, Nebraska; Sugar Creek, Indiana; and Morgan Creek, Delaware. Model calibration has improved understanding of observed variations in soil moisture, solute concentrations, and stream discharge at the five WEBB watersheds, and the model is now being set up to simulate processes at the five agricultural watersheds, which are ending their first year of data collection.

  11. Improved Fractal Space Filling Curves Hybrid Optimization Algorithm for Vehicle Routing Problem.

    PubMed

    Yue, Yi-xiang; Zhang, Tong; Yue, Qun-xing

    2015-01-01

    Vehicle Routing Problem (VRP) is one of the key issues in the optimization of modern logistics systems. In this paper, a modified VRP model with hard time windows is established and a Hybrid Optimization Algorithm (HOA) based on the Fractal Space Filling Curves (SFC) method and a Genetic Algorithm (GA) is introduced. In the proposed algorithm, the SFC method finds an initial feasible solution very quickly, and the GA is then used to improve that solution. Experimental software was developed and a large number of experimental computations on Solomon's benchmark have been studied. The experimental results demonstrate the feasibility and effectiveness of the HOA.

  12. Improved Fractal Space Filling Curves Hybrid Optimization Algorithm for Vehicle Routing Problem

    PubMed Central

    Yue, Yi-xiang; Zhang, Tong; Yue, Qun-xing

    2015-01-01

    Vehicle Routing Problem (VRP) is one of the key issues in the optimization of modern logistics systems. In this paper, a modified VRP model with hard time windows is established and a Hybrid Optimization Algorithm (HOA) based on the Fractal Space Filling Curves (SFC) method and a Genetic Algorithm (GA) is introduced. In the proposed algorithm, the SFC method finds an initial feasible solution very quickly, and the GA is then used to improve that solution. Experimental software was developed and a large number of experimental computations on Solomon's benchmark have been studied. The experimental results demonstrate the feasibility and effectiveness of the HOA. PMID:26167171

  13. A Comparison of Hybrid Approaches for Turbofan Engine Gas Path Fault Diagnosis

    NASA Astrophysics Data System (ADS)

    Lu, Feng; Wang, Yafan; Huang, Jinquan; Wang, Qihang

    2016-09-01

    A hybrid diagnostic method utilizing an Extended Kalman Filter (EKF) and an Adaptive Genetic Algorithm (AGA) is presented for performance degradation estimation and sensor anomaly detection in a turbofan engine. The EKF is used to estimate engine component performance degradation for gas path fault diagnosis. The AGA is introduced in the integrated architecture and applied for sensor bias detection. The contributions of this work are the comparisons of Kalman Filter (KF)-AGA algorithms and Neural Network (NN)-AGA algorithms within a unified framework for gas path fault diagnosis. The NN needs to be trained off-line with a large amount of prior fault-mode data, and when a new fault mode occurs, the estimation accuracy of the NN evidently decreases. The application of the Linearized Kalman Filter (LKF) and EKF is not restricted in such cases. The crossover factor and the mutation factor are adapted to the fitness function at each generation in the AGA, and it consumes less time to search for the optimal sensor bias value compared to a standard Genetic Algorithm (GA). In summary, we conclude that the hybrid EKF-AGA algorithm is the best choice for gas path fault diagnosis of turbofan engines among the algorithms discussed.

  14. Autumn Algorithm-Computation of Hybridization Networks for Realistic Phylogenetic Trees.

    PubMed

    Huson, Daniel H; Linz, Simone

    2018-01-01

    A minimum hybridization network is a rooted phylogenetic network that displays two given rooted phylogenetic trees using a minimum number of reticulations. Previous mathematical work on their calculation has usually assumed the input trees to be bifurcating, correctly rooted, or that they both contain the same taxa. These assumptions do not hold in biological studies and "realistic" trees have multifurcations, are difficult to root, and rarely contain the same taxa. We present a new algorithm for computing minimum hybridization networks for a given pair of "realistic" rooted phylogenetic trees. We also describe how the algorithm might be used to improve the rooting of the input trees. We introduce the concept of "autumn trees", a nice framework for the formulation of algorithms based on the mathematics of "maximum acyclic agreement forests". While the main computational problem is hard, the run-time depends mainly on how different the given input trees are. In biological studies, where the trees are reasonably similar, our parallel implementation performs well in practice. The algorithm is available in our open source program Dendroscope 3, providing a platform for biologists to explore rooted phylogenetic networks. We demonstrate the utility of the algorithm using several previously studied data sets.

  15. Comparison of optimization algorithms in intensity-modulated radiation therapy planning

    NASA Astrophysics Data System (ADS)

    Kendrick, Rachel

    Intensity-modulated radiation therapy is used to better conform the radiation dose to the target, which includes avoiding healthy tissue. Planning programs employ optimization methods to search for the best fluence of each photon beam, and therefore to create the best treatment plan. The Computational Environment for Radiotherapy Research (CERR), a program written in MATLAB, was used to examine some commonly-used algorithms for one 5-beam plan. Algorithms include the genetic algorithm, quadratic programming, pattern search, constrained nonlinear optimization, simulated annealing, the optimization method used in Varian Eclipse™, and some hybrids of these. Quadratic programming, simulated annealing, and a quadratic/simulated annealing hybrid were also separately compared using different prescription doses. The results of each dose-volume histogram as well as the visual dose color wash were used to compare the plans. CERR's built-in quadratic programming provided the best overall plan, but avoidance of the organ-at-risk was rivaled by other programs. Hybrids of quadratic programming with some of these algorithms seem to suggest the possibility of better planning programs, as shown by the improved quadratic/simulated annealing plan when compared to the simulated annealing algorithm alone. Further experimentation will be done to improve cost functions and computational time.

  16. Use of Sequent Peak Algorithm Drought Severity Index and Hydroclimatic Reconstructions from Tree-Rings to Inform Water Supply Reliability Planning

    NASA Astrophysics Data System (ADS)

    Bray, B. S.; Palhegyi, G.

    2015-12-01

    California is in the midst of a severe drought with below average runoff since WY 2012. Within this context, many water resource managers are scrutinizing water supply reliability assumptions for planning studies. Severe droughts represent a relatively rare phenomenon, occurring only a handful of times within our limited 100-year period of watershed runoff records. Furthermore, droughts may have different runoff magnitudes and durations that inherently present a challenge for direct comparisons of one drought with another. We use the sequent peak algorithm as a drought severity index (SPADSI) that accounts for both drought magnitude and duration relative to an assumed minimum release policy and fixed level-of-development (LOD) demand modeling framework. The SPADSI allows direct, quantitative evaluation of different policy options for lessening drought severity where, for example, layering a customer rationing policy onto model results reduced the SPADSI for the historical 1976-77 drought from 520 to 450 thousand acre-feet (TAF) and for the 1987-92 drought from 650 to 415 TAF at the 2015 LOD. A strong correlation (R2 = 0.96) between Mokelumne River watershed runoff and tree-ring hydroclimate reconstructions for the neighboring American and Stanislaus watersheds from Meko et al. (2014) was the basis for an extended 1100-year historical reconstruction of Mokelumne Watershed annual runoff. The reconstructed runoff timeseries is used to investigate extended historical drought durations for the Mokelumne Watershed, where shorter one- to three-year droughts are the most probable durations (>90%), whereas longer droughts lasting as long as 10 years, such as occurred in 1776-85, are also possible, though much less likely. Applying the SPADSI to the reconstructed runoff timeseries showed that recent droughts, e.g., 1929-34, 1976-77, and 1987-92, are all relatively severe within this millennial context, falling on the distribution tail of the extended SPADSI dataset. These findings are consistent with Meko et al. (2014) in their analysis of other watersheds in the region. These findings and other insights from the reconstructed runoff timeseries along with the SPADSI provide valuable information for water resource managers evaluating water supply reliability assumptions for future drought planning efforts.
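
    A minimal sketch of the sequent peak computation underlying the SPADSI is shown below: deficits of inflow relative to an assumed fixed demand are accumulated, and the largest accumulated deficit is reported. The runoff series and demand are illustrative values, not Mokelumne data.

```python
# Sequent-peak deficit sketch; inflow series and demand are illustrative.
import numpy as np

def sequent_peak_deficit(inflow, demand):
    """Maximum cumulative deficit (storage required) for a fixed demand."""
    deficit, worst = 0.0, 0.0
    for q in inflow:
        deficit = max(0.0, deficit + demand - q)  # reset when inflow catches up
        worst = max(worst, deficit)
    return worst

annual_runoff = np.array([900, 450, 300, 700, 1100, 380, 420, 950])  # TAF, toy values
print(sequent_peak_deficit(annual_runoff, demand=600.0))             # drought severity measure
```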

  17. Hybridization dynamics between Colorado's native cutthroat trout and introduced rainbow trout.

    PubMed

    Metcalf, Jessica L; Siegle, Matthew R; Martin, Andrew P

    2008-01-01

    Newly formed hybrid populations provide an opportunity to examine the initial consequences of secondary contact between species and identify genetic patterns that may be important early in the evolution of hybrid inviability. Widespread introductions of rainbow trout (Oncorhynchus mykiss) into watersheds with native cutthroat trout (Oncorhynchus clarkii) have resulted in hybridization. These introductions have contributed to the decline of native cutthroat trout populations. Here, we examine the pattern of hybridization between introduced rainbow trout and 2 populations of cutthroat trout native to Colorado. For this study, we utilized 7 diagnostic, codominant nuclear markers and a diagnostic mitochondrial marker to investigate hybridization in a population of greenback cutthroat trout (Oncorhynchus clarkii stomias) and a population of Colorado River cutthroat trout (Oncorhynchus clarkii pleuriticus). We infer that cutthroat-rainbow trout hybrid swarms have formed in both populations. Although a mixture of hybrid genotypes was present, not all genotype combinations were detected at expected frequencies. We found evidence that mitochondrial DNA introgression in hybrids is asymmetric and more likely from rainbow trout than from cutthroat trout. A difference in spawning time of the 2 species or differences in the fitness between the reciprocal crosses may explain the asymmetry. Additionally, the presence of intraspecific cytonuclear associations found in both populations is concordant with current hypotheses regarding coevolution of mitochondrial and nuclear genomes.

  18. Using Automatic Control Approach In Detention Storages For Storm Water Management In An Urban Watershed

    NASA Astrophysics Data System (ADS)

    Goyal, A.; Yadav, H.; Tyagi, H.; Gosain, A. K.; Khosa, R.

    2017-12-01

    Increased imperviousness due to rapid urbanization has changed the urban hydrological cycle. As watersheds are urbanized, infiltration and groundwater recharge decrease, and the surface runoff hydrograph shows a higher peak, indicating large volumes of surface runoff over shorter durations. The ultimate remedy is to reduce the peak of the hydrograph or increase the retention time of surface flow. SWMM is a widely used hydrologic and hydraulic software package that helps simulate urban storm water management, with the provision to apply different techniques to prevent flooding. A model was set up to simulate the surface runoff and channel flow in a small urban catchment; it provides temporal and spatial information on flooding in the catchment. Incorporating detention storages in the drainage network helps reduce flooding. Detention storages governed by predefined algorithms were provided for controlling pluvial flooding in urban watersheds. The algorithm, based on control theory, automated the functioning of the detention storages, ensuring that a storage becomes active when flooding occurs in the storm water drains and shuts down when the flooding is over. Detention storages can be implemented either at the source or at several downstream control points. The proposed work helps to mitigate the wastage of rainfall water, achieve desirable groundwater recharge, and attain a controlled urban storm water management system.

  19. Comparison of parameter-adapted segmentation methods for fluorescence micrographs.

    PubMed

    Held, Christian; Palmisano, Ralf; Häberle, Lothar; Hensel, Michael; Wittenberg, Thomas

    2011-11-01

    Interpreting images from fluorescence microscopy is often a time-consuming task with poor reproducibility. Various image processing routines that can help investigators evaluate the images are therefore useful. The critical aspect for a reliable automatic image analysis system is a robust segmentation algorithm that can perform accurate segmentation for different cell types. In this study, several image segmentation methods were therefore compared and evaluated in order to identify the most appropriate segmentation schemes that are usable with little new parameterization and that work robustly with different types of fluorescence-stained cells for various biological and biomedical tasks. The study investigated, compared, and enhanced four different methods for segmentation of cultured epithelial cells. The maximum-intensity linking (MIL) method, an improved MIL, a watershed method, and an improved watershed method based on morphological reconstruction were used. Three manually annotated datasets consisting of 261, 817, and 1,333 HeLa or L929 cells were used to compare the different algorithms. The comparisons and evaluations showed that the segmentation performance of methods based on the watershed transform was significantly superior to the performance of the MIL method. The results also indicate that using morphological opening by reconstruction can improve the segmentation of cells stained with a marker that gives the cell surface a dotted appearance. Copyright © 2011 International Society for Advancement of Cytometry.
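
    The sketch below illustrates the improved-watershed idea on a synthetic two-cell image using scikit-image: opening by reconstruction smooths the foreground, and a distance-transform, marker-controlled watershed then splits the touching objects. It is a toy example under those assumptions, not the authors' pipeline.

```python
# Opening-by-reconstruction plus marker-controlled watershed on synthetic data.
import numpy as np
from scipy import ndimage as ndi
from skimage import draw, feature, filters, morphology, segmentation

# synthetic image with two touching "cells"
img = np.zeros((128, 128))
rr, cc = draw.disk((50, 50), 30)
img[rr, cc] = 1.0
rr, cc = draw.disk((80, 80), 30)
img[rr, cc] = 1.0
img = filters.gaussian(img, 2) + 0.05 * np.random.default_rng(0).random(img.shape)

# opening by reconstruction: erode, then reconstruct under the original image
seed = morphology.erosion(img, morphology.disk(5))
opened = morphology.reconstruction(seed, img, method='dilation')

mask = opened > filters.threshold_otsu(opened)
distance = ndi.distance_transform_edt(mask)
lbl, _ = ndi.label(mask)
peaks = feature.peak_local_max(distance, labels=lbl, footprint=np.ones((15, 15)))
markers = np.zeros_like(lbl)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
labels = segmentation.watershed(-distance, markers, mask=mask)
print(labels.max(), "objects found")
```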

  20. Brain tissue segmentation in MR images based on a hybrid of MRF and social algorithms.

    PubMed

    Yousefi, Sahar; Azmi, Reza; Zahedi, Morteza

    2012-05-01

    Effective abnormality detection and diagnosis in Magnetic Resonance Images (MRIs) requires a robust segmentation strategy. Since manual segmentation is a time-consuming task that engages valuable human resources, automatic MRI segmentation has received an enormous amount of attention, and various techniques have been applied toward this goal. Markov Random Field (MRF) based algorithms have produced reasonable results in noisy images compared to other methods. An MRF seeks a label field that minimizes an energy function. The traditional minimization method, simulated annealing (SA), uses Monte Carlo simulation to reach the minimum solution, with a heavy computational burden; for this reason, MRFs are rarely used in real-time processing environments. This paper proposes a novel method based on an MRF and a hybrid of social algorithms, an ant colony optimization (ACO) and a Gossiping algorithm, which can be used for segmenting single and multispectral MRIs in real-time environments. Combining ACO with the Gossiping algorithm helps find a better path using neighborhood information, and this interaction causes the algorithm to converge to an optimum solution faster. Several experiments on phantom and real images were performed. Results indicate that the proposed algorithm outperforms the traditional MRF and the MRF-ACO hybrid in both speed and accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.
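
    To make the MRF energy concrete, the sketch below minimizes a simple data-plus-Potts energy on a noisy two-label image with iterated conditional modes, a much simpler (and less robust) minimizer than the SA or ACO-Gossiping hybrids discussed; the image, weights, and labels are illustrative.

```python
# Two-label MRF energy minimized by iterated conditional modes (ICM).
import numpy as np

rng = np.random.default_rng(0)
truth = np.zeros((40, 40), int)
truth[10:30, 10:30] = 1
noisy = truth + 0.8 * rng.standard_normal(truth.shape)   # observed intensities

labels = (noisy > 0.5).astype(int)                       # initial labeling
beta = 1.5                                               # smoothness weight (assumed)
for _ in range(5):                                       # ICM passes
    for i in range(40):
        for j in range(40):
            costs = []
            for lab in (0, 1):
                data = (noisy[i, j] - lab) ** 2          # data term
                nbrs = [labels[x, y] for x, y in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                        if 0 <= x < 40 and 0 <= y < 40]
                smooth = beta * sum(lab != n for n in nbrs)   # Potts smoothness term
                costs.append(data + smooth)
            labels[i, j] = int(np.argmin(costs))
print("pixel agreement with truth:", (labels == truth).mean())
```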

  1. Recourse-based facility-location problems in hybrid uncertain environment.

    PubMed

    Wang, Shuming; Watada, Junzo; Pedrycz, Witold

    2010-08-01

    The objective of this paper is to study facility-location problems in the presence of a hybrid uncertain environment involving both randomness and fuzziness. A two-stage fuzzy-random facility-location model with recourse (FR-FLMR) is developed in which both the demands and costs are assumed to be fuzzy-random variables. The bounds of the optimal objective value of the two-stage FR-FLMR are derived. As, in general, the fuzzy-random parameters of the FR-FLMR can be regarded as continuous fuzzy-random variables with an infinite number of realizations, the computation of the recourse requires solving infinite second-stage programming problems. Owing to this requirement, the recourse function cannot be determined analytically, and, hence, the model cannot benefit from the use of techniques of classical mathematical programming. In order to solve the location problems of this nature, we first develop a technique of fuzzy-random simulation to compute the recourse function. The convergence of such simulation scenarios is discussed. In the sequel, we propose a hybrid mutation-based binary ant-colony optimization (MBACO) approach to the two-stage FR-FLMR, which comprises the fuzzy-random simulation and the simplex algorithm. A numerical experiment illustrates the application of the hybrid MBACO algorithm. The comparison shows that the hybrid MBACO finds better solutions than the one using other discrete metaheuristic algorithms, such as binary particle-swarm optimization, genetic algorithm, and tabu search.

  2. Ant-cuckoo colony optimization for feature selection in digital mammogram.

    PubMed

    Jona, J B; Nagaveni, N

    2014-01-15

    Digital mammography is the only effective screening method for detecting breast cancer. Gray Level Co-occurrence Matrix (GLCM) textural features are extracted from the mammogram, but not all of these features are essential for classifying the mammogram; identifying the relevant features is therefore the aim of this work. Feature selection improves the classification rate and accuracy of any classifier. In this study, a new hybrid metaheuristic named Ant-Cuckoo Colony Optimization, a hybrid of Ant Colony Optimization (ACO) and Cuckoo Search (CS), is proposed for feature selection in digital mammograms. ACO is a good metaheuristic optimization technique, but its drawback is that the ants walk through paths where the pheromone density is high, which makes the whole process slow; hence CS is employed to carry out the local search of ACO. A Support Vector Machine (SVM) classifier with a Radial Basis Function (RBF) kernel is used along with the ACO to classify normal mammograms from abnormal ones. Experiments are conducted on the miniMIAS database. The performance of the new hybrid algorithm is compared with the ACO and PSO algorithms. The results show that the hybrid Ant-Cuckoo Colony Optimization algorithm is more accurate than the other techniques.

  3. Hybrid massively parallel fast sweeping method for static Hamilton–Jacobi equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detrixhe, Miles, E-mail: mdetrixhe@engineering.ucsb.edu; University of California Santa Barbara, Santa Barbara, CA, 93106; Gibou, Frédéric, E-mail: fgibou@engineering.ucsb.edu

    The fast sweeping method is a popular algorithm for solving a variety of static Hamilton–Jacobi equations. Fast sweeping algorithms for parallel computing have been developed, but are severely limited. In this work, we present a multilevel, hybrid parallel algorithm that combines the desirable traits of two distinct parallel methods. The fine and coarse grained components of the algorithm take advantage of heterogeneous computer architecture common in high performance computing facilities. We present the algorithm and demonstrate its effectiveness on a set of example problems including optimal control, dynamic games, and seismic wave propagation. We give results for convergence, parallel scaling, and show state-of-the-art speedup values for the fast sweeping method.
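
    For reference, a serial (non-parallel) sketch of the basic fast sweeping iteration for the eikonal equation |∇u| = 1/F is given below; it only shows the four Gauss-Seidel sweep orderings and the Godunov upwind update, not the hybrid multilevel parallel scheme of the paper.

```python
# Serial fast sweeping for the eikonal equation on a uniform 2D grid.
import numpy as np

def fast_sweep_eikonal(speed, source, h=1.0, n_iter=8):
    ny, nx = speed.shape
    u = np.full((ny, nx), np.inf)
    u[source] = 0.0
    orderings = [(range(ny), range(nx)),
                 (range(ny), range(nx - 1, -1, -1)),
                 (range(ny - 1, -1, -1), range(nx)),
                 (range(ny - 1, -1, -1), range(nx - 1, -1, -1))]
    for _ in range(n_iter):
        for ys, xs in orderings:                     # the four sweep orderings
            for i in ys:
                for j in xs:
                    if (i, j) == source:
                        continue
                    a = min(u[max(i - 1, 0), j], u[min(i + 1, ny - 1), j])
                    b = min(u[i, max(j - 1, 0)], u[i, min(j + 1, nx - 1)])
                    if np.isinf(a) and np.isinf(b):
                        continue
                    f = h / speed[i, j]
                    if abs(a - b) >= f:              # one-sided Godunov update
                        unew = min(a, b) + f
                    else:                            # two-sided (quadratic) update
                        unew = 0.5 * (a + b + np.sqrt(2 * f**2 - (a - b)**2))
                    u[i, j] = min(u[i, j], unew)
    return u

u = fast_sweep_eikonal(np.ones((50, 50)), source=(25, 25))
print(round(u[25, 49], 2))   # ~24.0 grid units from the source
```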

  4. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    DTIC Science & Technology

    2018-01-01

    …collected data. These statistical techniques are under the area of descriptive statistics, which is a methodology to condense the data in quantitative … (US Army Research Laboratory report ARL-TR-8270, January 2018).

  5. Development of a multiobjective optimization tool for the selection and placement of best management practices for nonpoint source pollution control

    NASA Astrophysics Data System (ADS)

    Maringanti, Chetan; Chaubey, Indrajeet; Popp, Jennie

    2009-06-01

    Best management practices (BMPs) are effective in reducing the transport of agricultural nonpoint source pollutants to receiving water bodies. However, selection of BMPs for placement in a watershed requires optimization of the available resources to obtain maximum possible pollution reduction. In this study, an optimization methodology is developed to select and place BMPs in a watershed to provide solutions that are both economically and ecologically effective. This novel approach develops and utilizes a BMP tool, a database that stores the pollution reduction and cost information of different BMPs under consideration. The BMP tool replaces the dynamic linkage of the distributed parameter watershed model during optimization and therefore reduces the computation time considerably. Total pollutant load from the watershed, and net cost increase from the baseline, were the two objective functions minimized during the optimization process. The optimization model, consisting of a multiobjective genetic algorithm (NSGA-II) in combination with a watershed simulation tool (Soil and Water Assessment Tool (SWAT)), was developed and tested for nonpoint source pollution control in the L'Anguille River watershed located in eastern Arkansas. The optimized solutions provided a trade-off between the two objective functions for sediment, phosphorus, and nitrogen reduction. The results indicated that buffer strips were very effective in preventing the nonpoint source pollutants from leaving the croplands. The optimized BMP plans resulted in potential reductions of 33%, 32%, and 13% in sediment, phosphorus, and nitrogen loads, respectively, from the watershed.
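
    The trade-off reported above rests on Pareto (non-dominated) filtering; the sketch below applies it to hypothetical BMP plans scored by (pollutant load, net cost), both minimized. The plan scores are invented for illustration.

```python
# Pareto (non-dominated) filtering of bi-objective BMP plans.
def pareto_front(plans):
    """Return the (load, cost) pairs not dominated by any other plan."""
    front = []
    for i, (load_i, cost_i) in enumerate(plans):
        dominated = any(load_j <= load_i and cost_j <= cost_i and
                        (load_j < load_i or cost_j < cost_i)
                        for j, (load_j, cost_j) in enumerate(plans) if j != i)
        if not dominated:
            front.append((load_i, cost_i))
    return sorted(front)

plans = [(120, 5.0), (100, 9.0), (90, 14.0), (110, 6.0), (95, 20.0), (130, 4.0)]
print(pareto_front(plans))   # the trade-off curve between load and cost
```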

  6. 3D Clumped Cell Segmentation Using Curvature Based Seeded Watershed.

    PubMed

    Atta-Fosu, Thomas; Guo, Weihong; Jeter, Dana; Mizutani, Claudia M; Stopczynski, Nathan; Sousa-Neves, Rui

    2016-12-01

    Image segmentation is an important process that separates objects from the background and also from each other. Applied to cells, the results can be used for cell counting, which is very important in medical diagnosis and treatment and in biological research carried out by scientists and medical practitioners. Segmenting 3D confocal microscopy images containing cells of different shapes and sizes is still challenging as the nuclei are closely packed. The watershed transform provides an efficient tool for segmenting such nuclei, provided a reasonable set of markers can be found in the image. In the presence of low-contrast variation or excessive noise in the given image, the watershed transform leads to over-segmentation (a single object is overly split into multiple objects). The traditional watershed uses the local minima of the input image and will characteristically find multiple minima in one object unless they are specified (marker-controlled watershed). An alternative to using the local minima is a supervised technique called the seeded watershed, which supplies single seeds to replace the minima for the objects. Consequently, the accuracy of a seeded watershed algorithm relies on the accuracy of the predefined seeds. In this paper, we present a segmentation approach based on the geometric morphological properties of the 'landscape' using curvatures. The curvatures are computed as the eigenvalues of the Shape matrix, producing accurate seeds that also inherit the original shape of their respective cells. We compare with some popular approaches and show the advantage of the proposed method.

  7. Algorithm refinement for stochastic partial differential equations: II. Correlated systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alexander, Francis J.; Garcia, Alejandro L.; Tartakovsky, Daniel M.

    2005-08-10

    We analyze a hybrid particle/continuum algorithm for a hydrodynamic system with long ranged correlations. Specifically, we consider the so-called train model for viscous transport in gases, which is based on a generalization of the random walk process for the diffusion of momentum. This discrete model is coupled with its continuous counterpart, given by a pair of stochastic partial differential equations. At the interface between the particle and continuum computations the coupling is by flux matching, giving exact mass and momentum conservation. This methodology is an extension of our stochastic Algorithm Refinement (AR) hybrid for simple diffusion [F. Alexander, A. Garcia, D. Tartakovsky, Algorithm refinement for stochastic partial differential equations: I. Linear diffusion, J. Comput. Phys. 182 (2002) 47-66]. Results from a variety of numerical experiments are presented for steady-state scenarios. In all cases the mean and variance of density and velocity are captured correctly by the stochastic hybrid algorithm. For a non-stochastic version (i.e., using only deterministic continuum fluxes) the long-range correlations of velocity fluctuations are qualitatively preserved but at reduced magnitude.

  8. A hybrid genetic algorithm for solving bi-objective traveling salesman problems

    NASA Astrophysics Data System (ADS)

    Ma, Mei; Li, Hecheng

    2017-08-01

    The traveling salesman problem (TSP) is a typical combinatorial optimization problem; in a traditional TSP, only the tour distance is taken as the single objective to be minimized. When more than one optimization objective arises, the problem is known as a multi-objective TSP. In the present paper, a bi-objective traveling salesman problem (BOTSP) is considered, where both the distance and the cost are taken as optimization objectives. In order to solve the problem efficiently, a hybrid genetic algorithm is proposed. Firstly, two satisfaction-degree indices are provided for each edge by considering the influences of the distance and the cost weights. The first satisfaction degree is used to select edges in a “rough” way, while the second satisfaction degree is applied for a more “refined” choice. Secondly, the two satisfaction degrees are also applied to generate new individuals in the iteration process. Finally, based on a genetic algorithm framework as well as a 2-opt selection strategy, the hybrid genetic algorithm is constructed. Simulation illustrates the efficiency of the proposed algorithm.

  9. A hybrid artificial bee colony algorithm for numerical function optimization

    NASA Astrophysics Data System (ADS)

    Alqattan, Zakaria N.; Abdullah, Rosni

    2015-02-01

    The Artificial Bee Colony (ABC) algorithm is one of the swarm intelligence algorithms; it was introduced by Karaboga in 2005. It is a meta-heuristic optimization search algorithm inspired by the intelligent foraging behavior of honey bees in nature. Its unique search process has made it competitive with other search algorithms in the area of optimization, such as the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). However, the performance of ABC's local search process and its bee-movement (solution improvement) equation still has some weaknesses. ABC is good at avoiding entrapment in local optima, but it spends its time searching around unpromising, randomly selected solutions. Inspired by PSO, we propose a Hybrid Particle-movement ABC algorithm called HPABC, which adapts the particle movement process to improve the exploration of the original ABC algorithm. Numerical benchmark functions were used to test the HPABC algorithm experimentally. The results illustrate that the HPABC algorithm can outperform the ABC algorithm in most of the experiments (75% better in accuracy and over 3 times faster).

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tumuluru, Jaya Shankar; McCulloch, Richard Chet James

    In this work a new hybrid genetic algorithm was developed which combines a rudimentary adaptive steepest ascent hill climbing algorithm with a sophisticated evolutionary algorithm in order to optimize complex multivariate design problems. By combining a highly stochastic algorithm (evolutionary) with a simple deterministic optimization algorithm (adaptive steepest ascent) computational resources are conserved and the solution converges rapidly when compared to either algorithm alone. In genetic algorithms natural selection is mimicked by random events such as breeding and mutation. In the adaptive steepest ascent algorithm each variable is perturbed by a small amount and the variable that caused the most improvement is incremented by a small step. If the direction of most benefit is exactly opposite of the previous direction with the most benefit then the step size is reduced by a factor of 2, thus the step size adapts to the terrain. A graphical user interface was created in MATLAB to provide an interface between the hybrid genetic algorithm and the user. Additional features such as bounding the solution space and weighting the objective functions individually are also built into the interface. The algorithm developed was tested to optimize the functions developed for a wood pelleting process. Using process variables (such as feedstock moisture content, die speed, and preheating temperature) pellet properties were appropriately optimized. Specifically, variables were found which maximized unit density, bulk density, tapped density, and durability while minimizing pellet moisture content and specific energy consumption. The time and computational resources required for the optimization were dramatically decreased using the hybrid genetic algorithm when compared to MATLAB's native evolutionary optimization tool.
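
    The sketch below is a minimal reading of the adaptive steepest-ascent step described above: perturb each variable, move along the most improving direction, and halve the step whenever the best direction reverses; the concave toy objective stands in for the pellet-quality response surfaces.

```python
# Adaptive steepest-ascent hill climbing with step halving on direction reversal.
import numpy as np

def adaptive_steepest_ascent(f, x0, step=0.5, iters=200):
    x = np.asarray(x0, dtype=float)
    last_dir = None
    for _ in range(iters):
        gains, dirs = [], []
        for i in range(len(x)):
            for sign in (+1, -1):
                trial = x.copy()
                trial[i] += sign * step
                gains.append(f(trial) - f(x))
                dirs.append((i, sign))
        k = int(np.argmax(gains))
        if gains[k] <= 0:                      # no improving direction: shrink step (simple safeguard)
            step *= 0.5
            continue
        i, sign = dirs[k]
        if last_dir == (i, -sign):             # best direction reversed: halve the step
            step *= 0.5
        x[i] += sign * step
        last_dir = (i, sign)
    return x

# maximize a concave toy surrogate; optimum at (2, -1)
peak = adaptive_steepest_ascent(lambda v: -(v[0] - 2) ** 2 - (v[1] + 1) ** 2, [0.0, 0.0])
print(np.round(peak, 2))
```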

  11. Vehicle routing problem with time windows using natural inspired algorithms

    NASA Astrophysics Data System (ADS)

    Pratiwi, A. B.; Pratama, A.; Sa’diyah, I.; Suprajitno, H.

    2018-03-01

    The process of distributing goods needs a strategy that minimizes the total cost spent on operational activities, but several constraints have to be satisfied, namely the capacity of the vehicles and the service times of the customers. This Vehicle Routing Problem with Time Windows (VRPTW) is a complex constrained problem. This paper proposes nature-inspired algorithms for dealing with the constraints of the VRPTW, involving the Bat Algorithm and Cat Swarm Optimization. The Bat Algorithm is hybridized with Simulated Annealing: the worst solution of the Bat Algorithm is replaced by the solution from Simulated Annealing. Cat Swarm Optimization, an algorithm based on the behavior of cats, is improved using the Crow Search Algorithm for simpler and faster convergence. From the computational results, these algorithms give good performance in finding the minimum total distance, and a larger population leads to better computational performance. The improved Cat Swarm Optimization with Crow Search performs better than the hybridization of the Bat Algorithm and Simulated Annealing when dealing with big data.

  12. A Hybrid Adaptive Routing Algorithm for Event-Driven Wireless Sensor Networks

    PubMed Central

    Figueiredo, Carlos M. S.; Nakamura, Eduardo F.; Loureiro, Antonio A. F.

    2009-01-01

    Routing is a basic function in wireless sensor networks (WSNs). For these networks, routing algorithms depend on the characteristics of the applications and, consequently, there is no self-contained algorithm suitable for every case. In some scenarios, the network behavior (traffic load) may vary a lot, such as an event-driven application, favoring different algorithms at different instants. This work presents a hybrid and adaptive algorithm for routing in WSNs, called Multi-MAF, that adapts its behavior autonomously in response to the variation of network conditions. In particular, the proposed algorithm applies both reactive and proactive strategies for routing infrastructure creation, and uses an event-detection estimation model to change between the strategies and save energy. To show the advantages of the proposed approach, it is evaluated through simulations. Comparisons with independent reactive and proactive algorithms show improvements on energy consumption. PMID:22423207

  13. A hybrid adaptive routing algorithm for event-driven wireless sensor networks.

    PubMed

    Figueiredo, Carlos M S; Nakamura, Eduardo F; Loureiro, Antonio A F

    2009-01-01

    Routing is a basic function in wireless sensor networks (WSNs). For these networks, routing algorithms depend on the characteristics of the applications and, consequently, there is no self-contained algorithm suitable for every case. In some scenarios, the network behavior (traffic load) may vary a lot, such as an event-driven application, favoring different algorithms at different instants. This work presents a hybrid and adaptive algorithm for routing in WSNs, called Multi-MAF, that adapts its behavior autonomously in response to the variation of network conditions. In particular, the proposed algorithm applies both reactive and proactive strategies for routing infrastructure creation, and uses an event-detection estimation model to change between the strategies and save energy. To show the advantages of the proposed approach, it is evaluated through simulations. Comparisons with independent reactive and proactive algorithms show improvements on energy consumption.

  14. A hybrid Jaya algorithm for reliability-redundancy allocation problems

    NASA Astrophysics Data System (ADS)

    Ghavidel, Sahand; Azizivahed, Ali; Li, Li

    2018-04-01

    This article proposes an efficient improved hybrid Jaya algorithm based on time-varying acceleration coefficients (TVACs) and the learning phase introduced in teaching-learning-based optimization (TLBO), named the LJaya-TVAC algorithm, for solving various types of nonlinear mixed-integer reliability-redundancy allocation problems (RRAPs) and standard real-parameter test functions. RRAPs include series, series-parallel, complex (bridge) and overspeed protection systems. The search power of the proposed LJaya-TVAC algorithm for finding the optimal solutions is first tested on the standard real-parameter unimodal and multi-modal functions with dimensions of 30-100, and then tested on various types of nonlinear mixed-integer RRAPs. The results are compared with the original Jaya algorithm and the best results reported in the recent literature. The optimal results obtained with the proposed LJaya-TVAC algorithm provide evidence for its better and acceptable optimization performance compared to the original Jaya algorithm and other reported optimal results.
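
    For context, the basic Jaya update that LJaya-TVAC extends can be sketched as below: each candidate moves toward the current best and away from the current worst with no algorithm-specific tuning parameters; the sphere function is a stand-in objective, and the TVAC and learning-phase modifications are not included.

```python
# Basic Jaya update rule on a toy continuous minimization problem.
import numpy as np

def jaya(f, bounds, pop=20, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    X = rng.uniform(lo, hi, size=(pop, len(lo)))
    for _ in range(iters):
        fit = np.apply_along_axis(f, 1, X)
        best, worst = X[fit.argmin()], X[fit.argmax()]
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        Xnew = X + r1 * (best - np.abs(X)) - r2 * (worst - np.abs(X))  # Jaya rule
        Xnew = np.clip(Xnew, lo, hi)
        improved = np.apply_along_axis(f, 1, Xnew) < fit               # greedy acceptance
        X[improved] = Xnew[improved]
    fit = np.apply_along_axis(f, 1, X)
    return X[fit.argmin()], fit.min()

x_best, f_best = jaya(lambda v: np.sum(v ** 2), ([-10] * 5, [10] * 5))
print(np.round(x_best, 3), round(f_best, 6))
```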

  15. Towards a robust framework for catchment classification

    NASA Astrophysics Data System (ADS)

    Deshmukh, A.; Samal, A.; Singh, R.

    2017-12-01

    Classification of catchments based on various measures of similarity has emerged as an important technique to understand regional scale hydrologic behavior. Classification of catchment characteristics and/or streamflow response has been used to reveal which characteristics are more likely to explain the observed variability of hydrologic response. However, numerous algorithms for supervised or unsupervised classification are available, making it hard to identify the algorithm most suitable for the dataset at hand. Consequently, existing catchment classification studies vary significantly in the classification algorithms employed, with no previous attempt at understanding the degree of uncertainty in classification due to this algorithmic choice. This hinders the generalizability of interpretations related to hydrologic behavior. Our goal is to develop a protocol that can be followed while classifying hydrologic datasets. We focus on a classification framework for unsupervised classification and provide a step-by-step classification procedure. The steps include testing the clusterability of the original dataset prior to classification, feature selection, validation of clustered data, and quantification of the similarity of two clusterings. We test several commonly available methods within this framework to understand the level of similarity of classification results across algorithms. We apply the proposed framework on recently developed datasets for India to analyze to what extent catchment properties can explain observed catchment response. Our testing dataset includes watershed characteristics for over 200 watersheds, which comprise both natural (physio-climatic) characteristics and socio-economic characteristics. This framework allows us to understand the controls on observed hydrologic variability across India.

  16. Hybrid water flow-like algorithm with Tabu search for traveling salesman problem

    NASA Astrophysics Data System (ADS)

    Bostamam, Jasmin M.; Othman, Zulaiha

    2016-08-01

    This paper presents a hybrid Water Flow-like Algorithm with Tabu Search for solving the travelling salesman problem (WFA-TS-TSP). WFA has proven its outstanding performance in solving the TSP, while TS is a conventional algorithm that has been used for decades to solve various combinatorial optimization problems, including the TSP. Hybridizing WFA with TS provides a better balance of exploration and exploitation, the key elements determining the performance of a metaheuristic. TS uses two different local searches, namely 2-opt and 3-opt, separately. The proposed WFA-TS-TSP is tested on 23 well-known benchmark symmetric TSP instances. The results show that the proposed WFA-TS-TSP obtains significantly better quality solutions than WFA, and that WFA-TS-TSP with 3-opt obtains the best quality solutions. From the results obtained, it can be concluded that WFA has the potential to be further improved through hybridization or better local search techniques.
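
    A minimal sketch of the 2-opt local search used as one of the two local searches in WFA-TS-TSP is shown below: any segment reversal that shortens the tour is accepted until no improvement remains; the city coordinates are random illustrative data.

```python
# 2-opt local search for a small symmetric TSP instance.
import numpy as np

def tour_length(tour, D):
    return sum(D[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour, D):
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                new = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # reverse a segment
                if tour_length(new, D) < tour_length(tour, D):
                    tour, improved = new, True
    return tour

rng = np.random.default_rng(1)
pts = rng.random((12, 2))
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)   # distance matrix
tour = list(range(12))
print(round(tour_length(two_opt(tour, D), D), 3))
```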

  17. Improving HybrID: How to best combine indirect and direct encoding in evolutionary algorithms.

    PubMed

    Helms, Lucas; Clune, Jeff

    2017-01-01

    Many challenging engineering problems are regular, meaning solutions to one part of a problem can be reused to solve other parts. Evolutionary algorithms with indirect encoding perform better on regular problems because they reuse genomic information to create regular phenotypes. However, on problems that are mostly regular, but contain some irregularities, which describes most real-world problems, indirect encodings struggle to handle the irregularities, hurting performance. Direct encodings are better at producing irregular phenotypes, but cannot exploit regularity. An algorithm called HybrID combines the best of both: it first evolves with indirect encoding to exploit problem regularity, then switches to direct encoding to handle problem irregularity. While HybrID has been shown to outperform both indirect and direct encoding, its initial implementation required the manual specification of when to switch from indirect to direct encoding. In this paper, we test two new methods to improve HybrID by eliminating the need to manually specify this parameter. Auto-Switch-HybrID automatically switches from indirect to direct encoding when fitness stagnates. Offset-HybrID simultaneously evolves an indirect encoding with directly encoded offsets, eliminating the need to switch. We compare the original HybrID to these alternatives on three different problems with adjustable regularity. The results show that both Auto-Switch-HybrID and Offset-HybrID outperform the original HybrID on different types of problems, and thus offer more tools for researchers to solve challenging problems. The Offset-HybrID algorithm is particularly interesting because it suggests a path forward for automatically and simultaneously combining the best traits of indirect and direct encoding.

  18. Hybrid Model Based on Genetic Algorithms and SVM Applied to Variable Selection within Fruit Juice Classification

    PubMed Central

    Fernandez-Lozano, C.; Canto, C.; Gestal, M.; Andrade-Garda, J. M.; Rabuñal, J. R.; Dorado, J.; Pazos, A.

    2013-01-01

    Given the background of the use of Neural Networks in problems of apple juice classification, this paper aims at implementing a newly developed method in the field of machine learning: the Support Vector Machine (SVM). A hybrid model that combines genetic algorithms and support vector machines is therefore suggested, in such a way that, by using the SVM as the fitness function of the Genetic Algorithm (GA), the most representative variables for a specific classification problem can be selected. PMID:24453933
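
    The wrapper idea above can be sketched as follows: individuals are binary masks over the variables, and the fitness of a mask is the cross-validated accuracy of an SVM trained on the selected variables. The wine dataset, GA operators, and settings are illustrative assumptions, not the authors' configuration.

```python
# GA-wrapped SVM feature selection on an illustrative dataset.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)
rng = np.random.default_rng(0)
n_feat, pop_size, n_gen = X.shape[1], 20, 15

def fitness(mask):
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(kernel="rbf", gamma="scale"), X[:, mask], y, cv=3).mean()

pop = rng.random((pop_size, n_feat)) < 0.5                       # random binary masks
for _ in range(n_gen):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]           # keep the best half
    cut = rng.integers(1, n_feat, size=pop_size // 2)
    children = np.array([np.concatenate((parents[i][:c], parents[-i - 1][c:]))
                         for i, c in enumerate(cut)])            # one-point crossover
    children ^= rng.random(children.shape) < 0.05                # bit-flip mutation
    pop = np.vstack((parents, children))

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected variables:", np.flatnonzero(best))
```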

  19. Sensitivity and Uncertainty Analysis for Streamflow Prediction Using Different Objective Functions and Optimization Algorithms: San Joaquin California

    NASA Astrophysics Data System (ADS)

    Paul, M.; Negahban-Azar, M.

    2017-12-01

    Hydrologic models usually need to be calibrated against observed streamflow at the outlet of a particular drainage area through careful model calibration. However, a large number of parameters must be fitted in the model because field measurements of them are unavailable, so it is difficult to calibrate the model over the many potentially uncertain parameters. This becomes even more challenging if the model is for a large watershed with multiple land uses and varied geophysical characteristics. Sensitivity analysis (SA) can be used as a tool to identify the most sensitive model parameters affecting calibrated model performance. Many different calibration and uncertainty analysis algorithms are available and can be run with different objective functions. By incorporating sensitive parameters in streamflow simulation, the effect of a suitable algorithm in improving model performance can be demonstrated with Soil and Water Assessment Tool (SWAT) modeling. In this study, SWAT was applied in the San Joaquin Watershed in California, covering 19,704 km2, to calibrate daily streamflow. Recently, severe water stress has been escalating in this watershed due to intensified climate variability, prolonged drought, and depleting groundwater for agricultural irrigation. It is therefore important to perform a proper uncertainty analysis, given the uncertainties inherent in hydrologic modeling, to predict the spatial and temporal variation of the hydrologic processes and to evaluate the impacts of different hydrologic variables. The purpose of this study was to evaluate the sensitivity and uncertainty of the calibrated parameters for predicting streamflow. To evaluate the sensitivity of the calibrated parameters, three different optimization algorithms (Sequential Uncertainty Fitting, SUFI-2; Generalized Likelihood Uncertainty Estimation, GLUE; and Parameter Solution, ParaSol) were used with four different objective functions (coefficient of determination, r2; Nash-Sutcliffe efficiency, NSE; percent bias, PBIAS; and Kling-Gupta efficiency, KGE). The preliminary results showed that using the SUFI-2 algorithm with the NSE and KGE objective functions significantly improved the calibration (e.g., R2 and NSE were found to be 0.52 and 0.47, respectively, for daily streamflow calibration).
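
    For reference, the objective functions compared above can be computed as below for a pair of hypothetical observed/simulated daily streamflow series (note that sign conventions for PBIAS vary between implementations).

```python
# Common streamflow calibration objective functions.
import numpy as np

def nse(obs, sim):
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    return 100 * np.sum(obs - sim) / np.sum(obs)   # sign convention assumed

def kge(obs, sim):
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()          # variability ratio
    beta = sim.mean() / obs.mean()         # bias ratio
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([12.0, 15.0, 30.0, 22.0, 18.0, 14.0, 11.0])   # hypothetical flows
sim = np.array([10.0, 16.0, 27.0, 25.0, 17.0, 15.0, 12.0])
print(round(nse(obs, sim), 2), round(pbias(obs, sim), 2), round(kge(obs, sim), 2))
```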

  20. Probabilistic inference using linear Gaussian importance sampling for hybrid Bayesian networks

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Chang, K. C.

    2005-05-01

    Probabilistic inference for Bayesian networks is in general NP-hard, using either exact algorithms or approximate methods. However, for very complex networks, only approximate methods such as stochastic sampling can provide a solution under a given time constraint. Several simulation methods are currently available. They include logic sampling (the first proposed stochastic method for Bayesian networks), the likelihood weighting algorithm (the most commonly used simulation method because of its simplicity and efficiency), the Markov blanket scoring method, and the importance sampling algorithm. In this paper, we first briefly review and compare these available simulation methods, then we propose an improved importance sampling algorithm called the linear Gaussian importance sampling algorithm for general hybrid models (LGIS). LGIS is aimed at hybrid Bayesian networks consisting of both discrete and continuous random variables with arbitrary distributions. It uses a linear function and Gaussian additive noise to approximate the true conditional probability distribution of a continuous variable given both its parents and evidence in a Bayesian network. One of the most important features of the newly developed method is that it can adaptively learn the optimal importance function from previous samples. We test the inference performance of LGIS using a 16-node linear Gaussian model and a 6-node general hybrid model. The performance comparison with other well-known methods such as the junction tree (JT) and likelihood weighting (LW) shows that LGIS is very promising.

  1. A novel algorithm for delineating wetland depressions and ...

    EPA Pesticide Factsheets

    In traditional watershed delineation and topographic modeling, surface depressions are generally treated as spurious features and simply removed from a digital elevation model (DEM) to enforce flow continuity of water across the topographic surface to the watershed outlets. In reality, however, many depressions in the DEM are actual wetland landscape features that are seldom fully filled with water. For instance, wetland depressions in the Prairie Pothole Region (PPR) are seasonally to permanently flooded wetlands characterized by nested hierarchical structures with dynamic filling-spilling-merging surface-water hydrological processes. The objectives of this study were to delineate hierarchical wetland catchments and model their hydrologic connectivity using high-resolution LiDAR data and aerial imagery. We proposed a novel algorithm to delineate the hierarchical wetland catchments and characterize their geometric and topological properties. Potential hydrologic connectivity between wetlands and streams was simulated using the least-cost path algorithm. The resulting flow network delineated putative temporary or seasonal flow paths connecting wetland depressions to each other or to the river network at scales finer than available through the National Hydrography Dataset. The results demonstrated that our proposed framework is promising for improving overland flow modeling and hydrologic connectivity analysis. Presentation at AWRA Spring Specialty Conference in Sn
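
    A minimal sketch of the least-cost path step, using scikit-image on a toy friction surface (in practice the cost surface would be derived from the LiDAR DEM), is given below; the wetland and stream cells are hypothetical coordinates.

```python
# Least-cost path between a wetland cell and a stream cell on a toy cost grid.
import numpy as np
from skimage.graph import route_through_array

rng = np.random.default_rng(0)
cost = rng.random((50, 50)) + 0.1          # toy friction surface; lower = easier flow
wetland = (5, 5)                           # hypothetical wetland depression cell
stream = (45, 40)                          # hypothetical stream network cell

path, total_cost = route_through_array(cost, wetland, stream,
                                       fully_connected=True, geometric=True)
print(len(path), "cells on the least-cost flow path, cost", round(total_cost, 2))
```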

  2. Managing Watersheds as Coupled Human-Natural Systems: A Review of Research Opportunities

    NASA Astrophysics Data System (ADS)

    Cai, X.

    2011-12-01

    Many watersheds around the world are impaired by severe social and environmental problems due to heavy anthropogenic stresses. Humans have transformed hydrological and biochemical processes in watersheds from a stationary to a non-stationary status through direct (e.g., water withdrawals) and indirect (e.g., altering vegetation and land cover) interferences. It has been found in many watersheds that socio-economic drivers, which have caused increasingly intensive alteration of natural processes, have even overcome natural variability to become the dominant factor affecting the behavior of watershed systems. Reversing this trend requires an understanding of the drivers of this intensification trajectory, and needs tremendous policy reform and investment. As stressed by several recent National Research Council (NRC) reports, watershed management will pose an enormous challenge in the coming decades. Correspondingly, the focus of research has begun to evolve from the management of reservoir, stormwater and aquifer systems to the management of integrated watershed systems, to which policy instruments designed to make more rational economic use of water resources are likely to be applied. To provide a few examples: reservoir operation studies have moved from a local to a watershed scale in order to consider upstream best management practices in soil conservation and erosion control and downstream ecological flow requirements and water rights; watersheds have been modeled as integrated hydrologic-economic systems with multidisciplinary modeling efforts, instead of traditional isolated physical systems. Today's watershed management calls for a re-definition of watersheds from isolated natural systems to coupled human-natural systems (CHNS), which are characterized by the interactions between human activities and natural processes, crossing various spatial and temporal scales within the context of a watershed. The importance of the conceptual innovation has been evidenced by 1) institutional innovation for integrated watershed management; 2) real-world management practices involving multidisciplinary expertise; 3) the growing role of economics in systems analysis; 4) enhanced research programs such as the CHNS program and Water, Sustainability and Climate (WSC) program at the US National Science Foundation (NSF). Furthermore, recent scientific and technological developments are expected to accommodate integrated watershed system analysis approaches, such as: 1) increasing availability of distributed digital datasets especially from remote sensing products (e.g. digital watersheds); 2) distributed and semi-distributed watershed hydrologic modeling; 3) enhanced hydroclimatic monitoring and forecast; 4) identified evidence of vulnerability and threshold behavior of watersheds; and 5) continuing improvements in computational and optimization algorithms. Managing watersheds as CHNS will be critical for watershed sustainability, which ensures that human societies will continue to benefit from the watershed through the development of harmonious relationships between human and natural systems. This presentation will provide a review of the research opportunities that take advantage of the concept of CHNS and associated scientific, technological and institutional innovations/developments.

  3. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem

    PubMed Central

    Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah

    2016-01-01

    The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of migration strategy of species to derive algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome the weakness of classical BBO algorithm to solve QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them. PMID:26819585

  4. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem.

    PubMed

    Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah

    2016-01-01

    The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive algorithms for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process often ruins the quality of solutions to the QAP. In this paper, we propose a hybrid technique that overcomes this weakness of the classical BBO algorithm for the QAP by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them.

  5. Revision of an automated microseismic location algorithm for DAS - 3C geophone hybrid array

    NASA Astrophysics Data System (ADS)

    Mizuno, T.; LeCalvez, J.; Raymer, D.

    2017-12-01

    Application of distributed acoustic sensing (DAS) has been studied in several areas of seismology. One of these areas is microseismic reservoir monitoring (e.g., Molteni et al., 2017, First Break). Considering the present limitations of DAS, which include relatively low signal-to-noise ratio (SNR) and no 3C polarization measurements, a DAS - 3C geophone hybrid array is a practical option when using a single monitoring well. Considering the large volume of data from distributed sensing, microseismic event detection and location using a source scanning type algorithm is a reasonable choice, especially for real-time monitoring. The algorithm must handle both strain rate along the borehole axis for DAS and particle velocity for 3C geophones. Only a small number of high-SNR events will be detected across the full aperture of the hybrid array; therefore, the aperture must be optimized dynamically to eliminate noisy channels for the majority of events. For such a hybrid array, coalescence microseismic mapping (CMM) (Drew et al., 2005, SPE) was revised. CMM forms a likelihood function of event location and origin time. At each receiver, a time function of event arrival likelihood is inferred using an SNR function, and it is migrated in time and space to determine the hypocenter and origin-time likelihood. This algorithm was revised to dynamically optimize such a hybrid array by identifying receivers where a microseismic signal is possibly detected and using only those receivers to compute the likelihood function. Currently, peak SNR is used to select receivers. To prevent false results due to a small aperture, a minimum aperture threshold is employed. The algorithm refines the location likelihood using 3C geophone polarization. We tested this algorithm using a ray-based synthetic dataset. The method of Leaney (2014, PhD thesis, UBC) is used to compute particle velocity at the receivers. Strain rate along the borehole axis is computed from particle velocity to form synthetic DAS microseismic data. The likelihood function formed from both DAS and geophone data behaves as expected, with the aperture dynamically selected depending on the SNR of the event. We conclude that this algorithm can be successfully applied to such hybrid arrays to monitor microseismic activity. A study using a recently acquired dataset is planned.

  6. An effective hybrid firefly algorithm with harmony search for global numerical optimization.

    PubMed

    Guo, Lihong; Wang, Gai-Ge; Wang, Heqi; Wang, Dinan

    2013-01-01

    A hybrid metaheuristic approach that combines harmony search (HS) and the firefly algorithm (FA), namely HS/FA, is proposed for function optimization. In HS/FA, the exploration of HS and the exploitation of FA are both exploited, so HS/FA has a faster convergence speed than HS and FA. Also, a top-fireflies scheme is introduced to reduce running time, and HS is used to mutate fireflies during the update step. The HS/FA method is verified on various benchmarks. The experiments show that HS/FA performs better than the standard FA and eight other optimization methods.

  7. Invasive hybridization in a threatened species is accelerated by climate change

    NASA Astrophysics Data System (ADS)

    Muhlfeld, Clint C.; Kovach, Ryan P.; Jones, Leslie A.; Al-Chokhachy, Robert; Boyer, Matthew C.; Leary, Robb F.; Lowe, Winsor H.; Luikart, Gordon; Allendorf, Fred W.

    2014-07-01

    Climate change will decrease worldwide biodiversity through a number of potential pathways, including invasive hybridization (cross-breeding between invasive and native species). How climate warming influences the spread of hybridization and loss of native genomes poses difficult ecological and evolutionary questions with little empirical information to guide conservation management decisions. Here we combine long-term genetic monitoring data with high-resolution climate and stream temperature predictions to evaluate how recent climate warming has influenced the spatio-temporal spread of human-mediated hybridization between threatened native westslope cutthroat trout (Oncorhynchus clarkii lewisi) and non-native rainbow trout (Oncorhynchus mykiss), the world's most widely introduced invasive fish. Despite widespread release of millions of rainbow trout over the past century within the Flathead River system, a large relatively pristine watershed in western North America, historical samples revealed that hybridization was prevalent only in one (source) population. During a subsequent 30-year period of accelerated warming, hybridization spread rapidly and was strongly linked to interactions between climatic drivers--precipitation and temperature--and distance to the source population. Specifically, decreases in spring precipitation and increases in summer stream temperature probably promoted upstream expansion of hybridization throughout the system. This study shows that rapid climate warming can exacerbate interactions between native and non-native species through invasive hybridization, which could spell genomic extinction for many species.

  8. Hybrid-optimization algorithm for the management of a conjunctive-use project and well field design

    USGS Publications Warehouse

    Chiu, Yung-Chia; Nishikawa, Tracy; Martin, Peter

    2012-01-01

    Hi-Desert Water District (HDWD), the primary water-management agency in the Warren Groundwater Basin, California, plans to construct a waste water treatment plant to reduce the amount of septic-tank effluent reaching the groundwater system in the future. The treated waste water will be reclaimed by recharging the groundwater basin via recharge ponds as part of a larger conjunctive-use strategy. HDWD wishes to identify the least-cost conjunctive-use strategies for managing imported surface water, reclaimed water, and local groundwater. As formulated, the mixed-integer nonlinear programming (MINLP) groundwater-management problem seeks to minimize water delivery costs subject to constraints including potential locations of the new pumping wells, California State regulations, groundwater-level constraints, water-supply demand, available imported water, and pump/recharge capacities. In this study, a hybrid-optimization algorithm, which couples a genetic algorithm and successive-linear programming, is developed to solve the MINLP problem. The algorithm was tested by comparing results to the enumerative solution for a simplified version of the HDWD groundwater-management problem. The results indicate that the hybrid-optimization algorithm can identify the global optimum. The hybrid-optimization algorithm is then applied to solve a complex groundwater-management problem. Sensitivity analyses were also performed to assess the impact of varying the new recharge pond orientation, varying the mixing ratio of reclaimed water and pumped water, and varying the amount of imported water available. The developed conjunctive management model can provide HDWD water managers with information that will improve their ability to manage their surface water, reclaimed water, and groundwater resources.

  9. A new hybrid meta-heuristic algorithm for optimal design of large-scale dome structures

    NASA Astrophysics Data System (ADS)

    Kaveh, A.; Ilchi Ghazaan, M.

    2018-02-01

    In this article a hybrid algorithm based on a vibrating particles system (VPS) algorithm, multi-design variable configuration (Multi-DVC) cascade optimization, and an upper bound strategy (UBS) is presented for global optimization of large-scale dome truss structures. The new algorithm is called MDVC-UVPS in which the VPS algorithm acts as the main engine of the algorithm. The VPS algorithm is one of the most recent multi-agent meta-heuristic algorithms mimicking the mechanisms of damped free vibration of single degree of freedom systems. In order to handle a large number of variables, cascade sizing optimization utilizing a series of DVCs is used. Moreover, the UBS is utilized to reduce the computational time. Various dome truss examples are studied to demonstrate the effectiveness and robustness of the proposed method, as compared to some existing structural optimization techniques. The results indicate that the MDVC-UVPS technique is a powerful search and optimization method for optimizing structural engineering problems.

  10. Automated red blood cells extraction from holographic images using fully convolutional neural networks.

    PubMed

    Yi, Faliu; Moon, Inkyu; Javidi, Bahram

    2017-10-01

    In this paper, we present two models for automatically extracting red blood cells (RBCs) from RBCs holographic images based on a deep learning fully convolutional neural network (FCN) algorithm. The first model, called FCN-1, only uses the FCN algorithm to carry out RBCs prediction, whereas the second model, called FCN-2, combines the FCN approach with the marker-controlled watershed transform segmentation scheme to achieve RBCs extraction. Both models achieve good segmentation accuracy. In addition, the second model has much better performance in terms of cell separation than traditional segmentation methods. In the proposed methods, the RBCs phase images are first numerically reconstructed from RBCs holograms recorded with off-axis digital holographic microscopy. Then, some RBCs phase images are manually segmented and used as training data to fine-tune the FCN. Finally, each pixel in new input RBCs phase images is predicted into either foreground or background using the trained FCN models. The RBCs prediction result from the first model is the final segmentation result, whereas the result from the second model is used as the internal markers of the marker-controlled transform algorithm for further segmentation. Experimental results show that the given schemes can automatically extract RBCs from RBCs phase images and much better RBCs separation results are obtained when the FCN technique is combined with the marker-controlled watershed segmentation algorithm.
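
    As a minimal sketch of the marker-controlled watershed step described above, the fragment below assumes the FCN has already produced a binary foreground mask and uses scipy and scikit-image (distance transform, peak_local_max, watershed) rather than the authors' pipeline; the toy image and the min_distance value are illustrative only.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.feature import peak_local_max
      from skimage.segmentation import watershed

      def split_touching_cells(mask):
          # Separate touching cells in a binary mask with a marker-controlled watershed.
          distance = ndi.distance_transform_edt(mask)
          # Local maxima of the distance map serve as internal markers (ideally one per cell).
          coords = peak_local_max(distance, min_distance=5, labels=mask)
          markers = np.zeros(mask.shape, dtype=int)
          for k, (r, c) in enumerate(coords, start=1):
              markers[r, c] = k
          # Flood the inverted distance map from the markers, restricted to the foreground.
          return watershed(-distance, markers, mask=mask)

      # Toy input: two overlapping discs stand in for an FCN foreground prediction.
      yy, xx = np.mgrid[0:80, 0:80]
      mask = ((yy - 40) ** 2 + (xx - 30) ** 2 < 15 ** 2) | ((yy - 40) ** 2 + (xx - 50) ** 2 < 15 ** 2)
      labels = split_touching_cells(mask)
      print("cells found:", labels.max())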

  11. Automated red blood cells extraction from holographic images using fully convolutional neural networks

    PubMed Central

    Yi, Faliu; Moon, Inkyu; Javidi, Bahram

    2017-01-01

    In this paper, we present two models for automatically extracting red blood cells (RBCs) from RBCs holographic images based on a deep learning fully convolutional neural network (FCN) algorithm. The first model, called FCN-1, only uses the FCN algorithm to carry out RBCs prediction, whereas the second model, called FCN-2, combines the FCN approach with the marker-controlled watershed transform segmentation scheme to achieve RBCs extraction. Both models achieve good segmentation accuracy. In addition, the second model has much better performance in terms of cell separation than traditional segmentation methods. In the proposed methods, the RBCs phase images are first numerically reconstructed from RBCs holograms recorded with off-axis digital holographic microscopy. Then, some RBCs phase images are manually segmented and used as training data to fine-tune the FCN. Finally, each pixel in new input RBCs phase images is predicted into either foreground or background using the trained FCN models. The RBCs prediction result from the first model is the final segmentation result, whereas the result from the second model is used as the internal markers of the marker-controlled transform algorithm for further segmentation. Experimental results show that the given schemes can automatically extract RBCs from RBCs phase images and much better RBCs separation results are obtained when the FCN technique is combined with the marker-controlled watershed segmentation algorithm. PMID:29082078

  12. Two Improved Algorithms for Envelope and Wavefront Reduction

    NASA Technical Reports Server (NTRS)

    Kumfert, Gary; Pothen, Alex

    1997-01-01

    Two algorithms for reordering sparse, symmetric matrices or undirected graphs to reduce envelope and wavefront are considered. The first is a combinatorial algorithm introduced by Sloan and further developed by Duff, Reid, and Scott; we describe enhancements to the Sloan algorithm that improve its quality and reduce its run time. Our test problems fall into two classes with differing asymptotic behavior of their envelope parameters as a function of the weights in the Sloan algorithm. We describe an efficient O(n log n + m) time implementation of the Sloan algorithm, where n is the number of rows (vertices), and m is the number of nonzeros (edges). On a collection of test problems, the improved Sloan algorithm required, on the average, only twice the time required by the simpler Reverse Cuthill-McKee algorithm while improving the mean square wavefront by a factor of three. The second algorithm is a hybrid that combines a spectral algorithm for envelope and wavefront reduction with a refinement step that uses a modified Sloan algorithm. The hybrid algorithm reduces the envelope size and mean square wavefront obtained from the Sloan algorithm at the cost of greater running times. We illustrate how these reductions translate into tangible benefits for frontal Cholesky factorization and incomplete factorization preconditioning.
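
    Neither the enhanced Sloan algorithm nor the spectral/hybrid method is available in standard Python libraries, so the sketch below only reproduces the Reverse Cuthill-McKee baseline mentioned above, using scipy.sparse.csgraph.reverse_cuthill_mckee, together with a simple envelope-style measure; the random test matrix and the envelope_size helper are illustrative assumptions, not the paper's test suite.

      from scipy import sparse
      from scipy.sparse.csgraph import reverse_cuthill_mckee

      def envelope_size(A):
          # Sum over rows of (i - first nonzero column index), a simple envelope measure.
          A = sparse.csr_matrix(A)
          total = 0
          for i in range(A.shape[0]):
              cols = A.indices[A.indptr[i]:A.indptr[i + 1]]
              if cols.size:
                  total += i - min(i, int(cols.min()))
          return total

      # Random symmetric sparsity pattern standing in for a test matrix.
      n = 200
      A = sparse.random(n, n, density=0.02, format="csr", random_state=1)
      A = sparse.csr_matrix(A + A.T + sparse.eye(n))
      A.data[:] = 1.0

      perm = reverse_cuthill_mckee(A, symmetric_mode=True)
      A_rcm = A[perm, :][:, perm]
      print("envelope before:", envelope_size(A), "after RCM:", envelope_size(A_rcm))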

  13. Neural network control of a parallel hybrid-electric propulsion system for a small unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Harmon, Frederick G.

    2005-11-01

    Parallel hybrid-electric propulsion systems would be beneficial for small unmanned aerial vehicles (UAVs) used for military, homeland security, and disaster-monitoring missions. The benefits, due to the hybrid and electric-only modes, include increased time-on-station and greater range as compared to electric-powered UAVs, and stealth modes not available with gasoline-powered UAVs. This dissertation contributes to the research fields of small unmanned aerial vehicles, hybrid-electric propulsion system control, and intelligent control. A conceptual design of a small UAV with a parallel hybrid-electric propulsion system is provided. The UAV is intended for intelligence, surveillance, and reconnaissance (ISR) missions. The conceptual design reveals the trade-offs that must be considered to take advantage of the hybrid-electric propulsion system. The resulting hybrid-electric propulsion system is a two-point design that includes an engine primarily sized for cruise speed and an electric motor and battery pack that are primarily sized for a slower endurance speed. The electric motor provides additional power for take-off, climbing, and acceleration and also serves as a generator during charge-sustaining operation or regeneration. The intelligent control of the hybrid-electric propulsion system is based on an instantaneous optimization algorithm that generates a hyper-plane from the nonlinear efficiency maps for the internal combustion engine, electric motor, and lithium-ion battery pack. The hyper-plane incorporates charge-depletion and charge-sustaining strategies. The optimization algorithm is flexible and allows the operator/user to assign relative importance between the use of gasoline, electricity, and recharging depending on the intended mission. A MATLAB/Simulink model was developed to test the control algorithms. The Cerebellar Model Arithmetic Computer (CMAC) associative memory neural network is applied to the control of the UAV's parallel hybrid-electric propulsion system. The CMAC neural network approximates the hyper-plane generated by the instantaneous optimization algorithm and produces torque commands for the internal combustion engine and electric motor. The CMAC neural network controller reduces the required memory by two orders of magnitude compared to a large look-up table. The CMAC controller also avoids the need to compute a hyper-plane or evaluate complex logic at every time step.

  14. Advancing computational methods for calibration of the Soil and Water Assessment Tool (SWAT): Application for modeling climate change impacts on water resources in the Upper Neuse Watershed of North Carolina

    NASA Astrophysics Data System (ADS)

    Ercan, Mehmet Bulent

    Watershed-scale hydrologic models are used for a variety of applications from flood prediction, to drought analysis, to water quality assessments. A particular challenge in applying these models is calibration of the model parameters, many of which are difficult to measure at the watershed-scale. A primary goal of this dissertation is to contribute new computational methods and tools for calibration of watershed-scale hydrologic models and the Soil and Water Assessment Tool (SWAT) model, in particular. SWAT is a physically-based, watershed-scale hydrologic model developed to predict the impact of land management practices on water quality and quantity. The dissertation follows a manuscript format meaning it is comprised of three separate but interrelated research studies. The first two research studies focus on SWAT model calibration, and the third research study presents an application of the new calibration methods and tools to study climate change impacts on water resources in the Upper Neuse Watershed of North Carolina using SWAT. The objective of the first two studies is to overcome computational challenges associated with calibration of SWAT models. The first study evaluates a parallel SWAT calibration tool built using the Windows Azure cloud environment and a parallel version of the Dynamically Dimensioned Search (DDS) calibration method modified to run in Azure. The calibration tool was tested for six model scenarios constructed using three watersheds of increasing size (the Eno, Upper Neuse, and Neuse) for both a 2 year and 10 year simulation duration. Leveraging the cloud as an on demand computing resource allowed for a significantly reduced calibration time such that calibration of the Neuse watershed went from taking 207 hours on a personal computer to only 3.4 hours using 256 cores in the Azure cloud. The second study aims at increasing SWAT model calibration efficiency by creating an open source, multi-objective calibration tool using the Non-Dominated Sorting Genetic Algorithm II (NSGA-II). This tool was demonstrated through an application for the Upper Neuse Watershed in North Carolina, USA. The objective functions used for the calibration were Nash-Sutcliffe (E) and Percent Bias (PB), and the objective sites were the Flat, Little, and Eno watershed outlets. The results show that the use of multi-objective calibration algorithms for SWAT calibration improved model performance especially in terms of minimizing PB compared to the single objective model calibration. The third study builds upon the first two studies by leveraging the new calibration methods and tools to study future climate impacts on the Upper Neuse watershed. Statistically downscaled outputs from eight Global Circulation Models (GCMs) were used for both low and high emission scenarios to drive a well calibrated SWAT model of the Upper Neuse watershed. The objective of the study was to understand the potential hydrologic response of the watershed, which serves as a public water supply for the growing Research Triangle Park region of North Carolina, under projected climate change scenarios. The future climate change scenarios, in general, indicate an increase in precipitation and temperature for the watershed in coming decades. The SWAT simulations using the future climate scenarios, in general, suggest an increase in soil water and water yield, and a decrease in evapotranspiration within the Upper Neuse watershed. 
In summary, this dissertation advances the field of watershed-scale hydrologic modeling by (i) providing some of the first work to apply cloud computing for the computationally-demanding task of model calibration; (ii) providing a new, open source library that can be used by SWAT modelers to perform multi-objective calibration of their models; and (iii) advancing understanding of climate change impacts on water resources for an important watershed in the Research Triangle Park region of North Carolina. The third study leveraged the methodological advances presented in the first two studies. Therefore, the dissertation contains three independent but interrelated studies that collectively advance the field of watershed-scale hydrologic modeling and analysis.
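
    The two calibration objectives named above, Nash-Sutcliffe efficiency (E) and percent bias (PB), can be computed independently of SWAT or NSGA-II; the sketch below shows one common formulation applied to a toy streamflow series. The sign convention for PB (positive values indicating underestimation) and the example numbers are assumptions of this sketch.

      import numpy as np

      def nash_sutcliffe(obs, sim):
          # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the observed mean.
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def percent_bias(obs, sim):
          # Percent bias; with this convention, positive values mean the model underestimates flow.
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 100.0 * np.sum(obs - sim) / np.sum(obs)

      # Toy daily streamflow series (m^3/s); a calibrator would push E toward 1 and PB toward 0.
      obs = np.array([3.2, 4.1, 9.8, 7.5, 5.0, 4.2, 3.9])
      sim = np.array([3.0, 4.5, 8.9, 8.1, 5.3, 4.0, 3.6])
      print("E  =", round(nash_sutcliffe(obs, sim), 3))
      print("PB =", round(percent_bias(obs, sim), 2), "%")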

  15. Water productivity of different land uses in watersheds assessed from satellite imagery Landsat 5 Thematic Mapper

    NASA Astrophysics Data System (ADS)

    Franco, Renato A. M.; Hernandez, Fernando B. T.; Teixeira, Antonio H. C.

    2014-10-01

    Water productivity (WP) for various classes of land use in watersheds was estimated using the SAFER (Simple Algorithm For Evapotranspiration Retrieving) algorithm together with the Monteith equation for biomass production (BIO). Monteith's equation is used to quantify the absorbed photosynthetically active radiation (APAR), and actual evapotranspiration (ET) was estimated with the SAFER algorithm. The objective of the research is to analyze the spatio-temporal water productivity in watersheds with different land uses and occupation over the period from 1996 to 2010, under drought conditions, using the Monteith model to estimate BIO and the SAFER model for ET. Results indicated an increase of 153.2% in ET over the period 1997-2010, showing that the irrigated areas were responsible for this increase in ET values. In September 2000, the image for day of year (DOY) 210 showed high values of BIO, with an average of 80.67 kg ha-1 d-1. In 2010 (DOY 177), the mean value of BIO was 62.90 kg ha-1 d-1, with a maximum value of 227.5 kg ha-1 d-1 in an irrigated area. The highest incremental values of BIO were observed from the onset of irrigation, in line with the ET values, because BIO and ET are related. The maximum water productivity (WP) value occurred in June 2001, at 3.08 kg m-3; the second highest value was in 2010 (DOY 177), at 2.97 kg m-3. Irrigated agriculture shows the highest WP, with a maximum value of 6.7 kg m-3. The lowest WP was obtained for DOY 267, because of the dry season and low soil moisture conditions.

  16. Morphological spot counting from stacked images for automated analysis of gene copy numbers by fluorescence in situ hybridization.

    PubMed

    Grigoryan, Artyom M; Dougherty, Edward R; Kononen, Juha; Bubendorf, Lukas; Hostetter, Galen; Kallioniemi, Olli

    2002-01-01

    Fluorescence in situ hybridization (FISH) is a molecular diagnostic technique in which a fluorescent labeled probe hybridizes to a target nucleotide sequence of deoxyribose nucleic acid. Upon excitation, each chromosome containing the target sequence produces a fluorescent signal (spot). Because fluorescent spot counting is tedious and often subjective, automated digital algorithms to count spots are desirable. New technology provides a stack of images on multiple focal planes throughout a tissue sample. Multiple-focal-plane imaging helps overcome the biases and imprecision inherent in single-focal-plane methods. This paper proposes an algorithm for global spot counting in stacked three-dimensional slice FISH images without the necessity of nuclei segmentation. It is designed to work in complex backgrounds, when there are agglomerated nuclei, and in the presence of illumination gradients. It is based on the morphological top-hat transform, which locates intensity spikes on irregular backgrounds. After finding signals in the slice images, the algorithm groups these together to form three-dimensional spots. Filters are employed to separate legitimate spots from fluorescent noise. The algorithm is set in a comprehensive toolbox that provides visualization and analytic facilities. It includes simulation software that allows examination of algorithm performance for various image and algorithm parameter settings, including signal size, signal density, and the number of slices.
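
    As a rough, single-slice sketch of the top-hat idea described above (isolating intensity spikes on an uneven background), the fragment below uses scikit-image's white top-hat followed by thresholding and connected-component labeling; the synthetic slice, the spot_radius and k parameters, and the omission of the 3-D grouping and noise filters are all simplifications relative to the toolbox described in the record.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.morphology import disk, white_tophat

      def count_spots_in_slice(img, spot_radius=3, k=6.0):
          # White top-hat removes the slowly varying background, keeping features
          # smaller than the structuring element (the candidate spots).
          residue = white_tophat(img, disk(spot_radius))
          threshold = residue.mean() + k * residue.std()   # crude global threshold
          _, n_spots = ndi.label(residue > threshold)
          return n_spots

      # Synthetic slice: an illumination gradient, Gaussian noise and three bright spots.
      rng = np.random.default_rng(0)
      img = np.linspace(0.0, 50.0, 128)[None, :] * np.ones((128, 1))
      img += rng.normal(0.0, 1.0, img.shape)
      for r, c in [(30, 40), (64, 90), (100, 20)]:
          img[r - 1:r + 2, c - 1:c + 2] += 40.0
      print("spots detected:", count_spots_in_slice(img))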

  17. An effective hybrid self-adapting differential evolution algorithm for the joint replenishment and location-inventory problem in a three-level supply chain.

    PubMed

    Wang, Lin; Qu, Hui; Chen, Tao; Yan, Fang-Ping

    2013-01-01

    Integrating different decisions in the supply chain is a growing trend, since it can avoid suboptimal decisions. In this paper, we provide an effective intelligent algorithm for a modified joint replenishment and location-inventory problem (JR-LIP). The problem of the JR-LIP is to determine the reasonable number and location of distribution centers (DCs), the assignment policy of customers, and the replenishment policy of DCs such that the overall cost is minimized. However, due to the JR-LIP's difficult mathematical properties, simple and effective solutions for this NP-hard problem have eluded researchers. To find an effective approach for the JR-LIP, a hybrid self-adapting differential evolution algorithm (HSDE) is designed. To verify the effectiveness of the HSDE, two intelligent algorithms that have proven effective for similar problems, the genetic algorithm (GA) and hybrid DE (HDE), are chosen for comparison. Comparative results on benchmark functions and randomly generated JR-LIPs show that HSDE outperforms GA and HDE. Moreover, a sensitivity analysis of cost parameters reveals useful managerial insights. All comparative results show that HSDE is more stable and robust in handling this complex problem, especially for large-scale problems.

  18. An Effective Hybrid Self-Adapting Differential Evolution Algorithm for the Joint Replenishment and Location-Inventory Problem in a Three-Level Supply Chain

    PubMed Central

    Chen, Tao; Yan, Fang-Ping

    2013-01-01

    Integrating different decisions in the supply chain is a growing trend, since it can avoid suboptimal decisions. In this paper, we provide an effective intelligent algorithm for a modified joint replenishment and location-inventory problem (JR-LIP). The problem of the JR-LIP is to determine the reasonable number and location of distribution centers (DCs), the assignment policy of customers, and the replenishment policy of DCs such that the overall cost is minimized. However, due to the JR-LIP's difficult mathematical properties, simple and effective solutions for this NP-hard problem have eluded researchers. To find an effective approach for the JR-LIP, a hybrid self-adapting differential evolution algorithm (HSDE) is designed. To verify the effectiveness of the HSDE, two intelligent algorithms that have proven effective for similar problems, the genetic algorithm (GA) and hybrid DE (HDE), are chosen for comparison. Comparative results on benchmark functions and randomly generated JR-LIPs show that HSDE outperforms GA and HDE. Moreover, a sensitivity analysis of cost parameters reveals useful managerial insights. All comparative results show that HSDE is more stable and robust in handling this complex problem, especially for large-scale problems. PMID:24453822

  19. Study on optimal configuration of the grid-connected wind-solar-battery hybrid power system

    NASA Astrophysics Data System (ADS)

    Ma, Gang; Xu, Guchao; Ju, Rong; Wu, Tiantian

    2017-08-01

    The capacity allocation of each energy unit in the grid-connected wind-solar-battery hybrid power system is a key step in system design. In this paper, taking power grid dispatching into account, the research priorities are as follows: (1) We establish the mathematical models of each energy unit in the hybrid power system. (2) Based on dispatching of the power grid, energy surplus rate, system energy volatility and total cost, we establish an evaluation system for the wind-solar-battery power system and use the number of devices as the constraint condition. (3) Based on an improved genetic algorithm, we put forward a multi-objective optimisation algorithm to solve the optimal configuration problem of the hybrid power system, so as to achieve high efficiency and economy for the grid-connected hybrid power system. The simulation results show that the grid-connected wind-solar-battery hybrid power system achieves good overall performance and that the optimal configuration method proposed in this paper is useful and reasonable.

  20. Hybrid simulated annealing and its application to optimization of hidden Markov models for visual speech recognition.

    PubMed

    Lee, Jong-Seok; Park, Cheol Hoon

    2010-08-01

    We propose a novel stochastic optimization algorithm, hybrid simulated annealing (SA), to train hidden Markov models (HMMs) for visual speech recognition. In our algorithm, SA is combined with a local optimization operator that substitutes a better solution for the current one to improve the convergence speed and the quality of solutions. We mathematically prove that the sequence of the objective values converges in probability to the global optimum in the algorithm. The algorithm is applied to train HMMs that are used as visual speech recognizers. While the popular training method of HMMs, the expectation-maximization algorithm, achieves only local optima in the parameter space, the proposed method can perform global optimization of the parameters of HMMs and thereby obtain solutions yielding improved recognition performance. The superiority of the proposed algorithm to the conventional ones is demonstrated via isolated word recognition experiments.
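
    The HMM training procedure itself is not reproduced here; the sketch below only illustrates the generic hybrid idea of combining simulated annealing moves with a greedy local-refinement operator, demonstrated on the Rastrigin test function. The test function and all parameter values (step, t0, alpha) are illustrative assumptions of this sketch.

      import math
      import random

      def hybrid_sa(f, x0, step=0.5, t0=1.0, alpha=0.995, iters=5000):
          # Simulated annealing whose proposals are polished by a small greedy local search.
          x, fx = list(x0), f(x0)
          best, fbest = list(x), fx
          t = t0
          for _ in range(iters):
              cand = [xi + random.uniform(-step, step) for xi in x]   # ordinary SA move
              for _ in range(5):                                      # local refinement operator
                  trial = [ci + 0.1 * random.uniform(-step, step) for ci in cand]
                  if f(trial) < f(cand):
                      cand = trial
              fc = f(cand)
              if fc < fx or random.random() < math.exp(-(fc - fx) / t):
                  x, fx = cand, fc
                  if fx < fbest:
                      best, fbest = list(x), fx
              t *= alpha
          return best, fbest

      def rastrigin(x):
          return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

      random.seed(3)
      print(hybrid_sa(rastrigin, [4.0, -3.0]))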

  1. Feed-Forward Neural Network Soft-Sensor Modeling of Flotation Process Based on Particle Swarm Optimization and Gravitational Search Algorithm

    PubMed Central

    Wang, Jie-Sheng; Han, Shuang

    2015-01-01

    For predicting the key technology indicators (concentrate grade and tailings recovery rate) of the flotation process, a feed-forward neural network (FNN) based soft-sensor model optimized by a hybrid algorithm combining the particle swarm optimization (PSO) algorithm and the gravitational search algorithm (GSA) is proposed. Although GSA has good optimization capability, it converges slowly and easily falls into local optima. So in this paper, the velocity vector and position vector of GSA are adjusted by the PSO algorithm in order to improve its convergence speed and prediction accuracy. Finally, the proposed hybrid algorithm is adopted to optimize the parameters of the FNN soft-sensor model. Simulation results show that the model has better generalization and prediction accuracy for the concentrate grade and tailings recovery rate and can meet the online soft-sensing requirements of real-time control in the flotation process. PMID:26583034

  2. The control of a parallel hybrid-electric propulsion system for a small unmanned aerial vehicle using a CMAC neural network.

    PubMed

    Harmon, Frederick G; Frank, Andrew A; Joshi, Sanjay S

    2005-01-01

    A Simulink model, a propulsion energy optimization algorithm, and a CMAC controller were developed for a small parallel hybrid-electric unmanned aerial vehicle (UAV). The hybrid-electric UAV is intended for military, homeland security, and disaster-monitoring missions involving intelligence, surveillance, and reconnaissance (ISR). The Simulink model is a forward-facing simulation program used to test different control strategies. The flexible energy optimization algorithm for the propulsion system allows relative importance to be assigned between the use of gasoline, electricity, and recharging. A cerebellar model arithmetic computer (CMAC) neural network approximates the energy optimization results and is used to control the parallel hybrid-electric propulsion system. The hybrid-electric UAV with the CMAC controller uses 67.3% less energy than a two-stroke gasoline-powered UAV during a 1-h ISR mission and 37.8% less energy during a longer 3-h ISR mission.

  3. Hybrid grammar-based approach to nonlinear dynamical system identification from biological time series

    NASA Astrophysics Data System (ADS)

    McKinney, B. A.; Crowe, J. E., Jr.; Voss, H. U.; Crooke, P. S.; Barney, N.; Moore, J. H.

    2006-02-01

    We introduce a grammar-based hybrid approach to reverse engineering nonlinear ordinary differential equation models from observed time series. This hybrid approach combines a genetic algorithm to search the space of model architectures with a Kalman filter to estimate the model parameters. Domain-specific knowledge is used in a context-free grammar to restrict the search space for the functional form of the target model. We find that the hybrid approach outperforms a pure evolutionary algorithm method, and we observe features in the evolution of the dynamical models that correspond with the emergence of favorable model components. We apply the hybrid method to both artificially generated time series and experimentally observed protein levels from subjects who received the smallpox vaccine. From the observed data, we infer a cytokine protein interaction network for an individual’s response to the smallpox vaccine.

  4. Hybrid Parallelism for Volume Rendering on Large-, Multi-, and Many-Core Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howison, Mark; Bethel, E. Wes; Childs, Hank

    2012-01-01

    With the computing industry trending towards multi- and many-core processors, we study how a standard visualization algorithm, ray-casting volume rendering, can benefit from a hybrid parallelism approach. Hybrid parallelism provides the best of both worlds: using distributed-memory parallelism across a large number of nodes increases available FLOPs and memory, while exploiting shared-memory parallelism among the cores within each node ensures that each node performs its portion of the larger calculation as efficiently as possible. We demonstrate results from weak and strong scaling studies, at levels of concurrency ranging up to 216,000, and with datasets as large as 12.2 trillion cells. The greatest benefit from hybrid parallelism lies in the communication portion of the algorithm, the dominant cost at higher levels of concurrency. We show that reducing the number of participants with a hybrid approach significantly improves performance.

  5. A Two-Phase Coverage-Enhancing Algorithm for Hybrid Wireless Sensor Networks.

    PubMed

    Zhang, Qingguo; Fok, Mable P

    2017-01-09

    Providing field coverage is a key task in many sensor network applications. In certain scenarios, the sensor field may have coverage holes due to random initial deployment of sensors; thus, the desired level of coverage cannot be achieved. A hybrid wireless sensor network is a cost-effective solution to this problem, which is achieved by repositioning a portion of the mobile sensors in the network to meet the network coverage requirement. This paper investigates how to redeploy mobile sensor nodes to improve network coverage in hybrid wireless sensor networks. We propose a two-phase coverage-enhancing algorithm for hybrid wireless sensor networks. In phase one, we use a differential evolution algorithm to compute candidate target positions for the mobile sensor nodes that could potentially improve coverage. In the second phase, we use an optimization scheme on the candidate target positions calculated in phase one to reduce the accumulated potential moving distance of mobile sensors, such that the exact mobile sensor nodes that need to be moved as well as their final target positions can be determined. Experimental results show that the proposed algorithm provided significant improvement in terms of area coverage rate, average moving distance, area coverage-distance rate and the number of moved mobile sensors, when compared with other approaches.
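
    A minimal sketch of phase one only, under the assumption that coverage can be approximated on a grid of sample points: scipy.optimize.differential_evolution searches for mobile-node target positions that minimize the uncovered fraction of the field. The field size, sensing radius and node counts are made up for illustration, and the phase-two distance-minimizing assignment is not shown.

      import numpy as np
      from scipy.optimize import differential_evolution

      FIELD = 100.0      # side of the square field (m)
      R_SENSE = 15.0     # sensing radius (m)
      N_MOBILE = 6       # mobile nodes to reposition

      rng = np.random.default_rng(7)
      static_nodes = rng.uniform(0.0, FIELD, size=(10, 2))   # randomly deployed static nodes

      # Grid sample points used to approximate area coverage.
      g = np.linspace(0.0, FIELD, 40)
      grid = np.stack(np.meshgrid(g, g), axis=-1).reshape(-1, 2)

      def uncovered_fraction(flat_positions):
          # Objective: fraction of grid points not within R_SENSE of any sensor.
          mobile = flat_positions.reshape(-1, 2)
          sensors = np.vstack([static_nodes, mobile])
          d2 = ((grid[:, None, :] - sensors[None, :, :]) ** 2).sum(axis=-1)
          return 1.0 - (d2 <= R_SENSE ** 2).any(axis=1).mean()

      bounds = [(0.0, FIELD)] * (2 * N_MOBILE)
      result = differential_evolution(uncovered_fraction, bounds, seed=7, maxiter=60, tol=1e-6)
      print("approximate coverage after phase one: %.1f%%" % (100.0 * (1.0 - result.fun)))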

  6. A Two-Phase Coverage-Enhancing Algorithm for Hybrid Wireless Sensor Networks

    PubMed Central

    Zhang, Qingguo; Fok, Mable P.

    2017-01-01

    Providing field coverage is a key task in many sensor network applications. In certain scenarios, the sensor field may have coverage holes due to random initial deployment of sensors; thus, the desired level of coverage cannot be achieved. A hybrid wireless sensor network is a cost-effective solution to this problem, which is achieved by repositioning a portion of the mobile sensors in the network to meet the network coverage requirement. This paper investigates how to redeploy mobile sensor nodes to improve network coverage in hybrid wireless sensor networks. We propose a two-phase coverage-enhancing algorithm for hybrid wireless sensor networks. In phase one, we use a differential evolution algorithm to compute candidate target positions for the mobile sensor nodes that could potentially improve coverage. In the second phase, we use an optimization scheme on the candidate target positions calculated in phase one to reduce the accumulated potential moving distance of mobile sensors, such that the exact mobile sensor nodes that need to be moved as well as their final target positions can be determined. Experimental results show that the proposed algorithm provided significant improvement in terms of area coverage rate, average moving distance, area coverage–distance rate and the number of moved mobile sensors, when compared with other approaches. PMID:28075365

  7. Improving HybrID: How to best combine indirect and direct encoding in evolutionary algorithms

    PubMed Central

    Helms, Lucas; Clune, Jeff

    2017-01-01

    Many challenging engineering problems are regular, meaning solutions to one part of a problem can be reused to solve other parts. Evolutionary algorithms with indirect encoding perform better on regular problems because they reuse genomic information to create regular phenotypes. However, on problems that are mostly regular, but contain some irregularities, which describes most real-world problems, indirect encodings struggle to handle the irregularities, hurting performance. Direct encodings are better at producing irregular phenotypes, but cannot exploit regularity. An algorithm called HybrID combines the best of both: it first evolves with indirect encoding to exploit problem regularity, then switches to direct encoding to handle problem irregularity. While HybrID has been shown to outperform both indirect and direct encoding, its initial implementation required the manual specification of when to switch from indirect to direct encoding. In this paper, we test two new methods to improve HybrID by eliminating the need to manually specify this parameter. Auto-Switch-HybrID automatically switches from indirect to direct encoding when fitness stagnates. Offset-HybrID simultaneously evolves an indirect encoding with directly encoded offsets, eliminating the need to switch. We compare the original HybrID to these alternatives on three different problems with adjustable regularity. The results show that both Auto-Switch-HybrID and Offset-HybrID outperform the original HybrID on different types of problems, and thus offer more tools for researchers to solve challenging problems. The Offset-HybrID algorithm is particularly interesting because it suggests a path forward for automatically and simultaneously combining the best traits of indirect and direct encoding. PMID:28334002

  8. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system.

    PubMed

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is, therefore, highly demanded to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform.

  9. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system

    NASA Astrophysics Data System (ADS)

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is, therefore, highly demanded to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform.

  10. Hybrid cryptosystem implementation using fast data encipherment algorithm (FEAL) and Goldwasser-Micali algorithm for file security

    NASA Astrophysics Data System (ADS)

    Rachmawati, D.; Budiman, M. A.; Siburian, W. S. E.

    2018-05-01

    In the process of exchanging files, security is indispensable to avoid data theft. Cryptography is one of the sciences used to secure data by encoding it. The Fast Data Encipherment Algorithm (FEAL) is a symmetric block-cipher cryptographic algorithm. Therefore, the file to be protected is encrypted and decrypted using the FEAL algorithm. To strengthen the security of the data, the session key used by FEAL is encoded with the Goldwasser-Micali algorithm, an asymmetric cryptographic algorithm based on a probabilistic concept. In the encryption process, the key is converted into binary form; the random selection of values causes the ciphertext of the key to differ for each binary value. The combination of symmetric and asymmetric algorithms is called a hybrid cryptosystem. The combined use of FEAL and Goldwasser-Micali restores the message to its original form, and for FEAL the time required for encryption and decryption is directly proportional to the length of the message. However, for the Goldwasser-Micali algorithm, the encryption and decryption time is not directly proportional to the length of the message.
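
    FEAL itself is not reproduced here. The sketch below only illustrates the asymmetric half of the hybrid scheme, encrypting and decrypting the bits of a session key with Goldwasser-Micali; the tiny primes are for demonstration only and offer no real security, and the helper names are this sketch's own.

      import math
      import random

      def gm_keygen(p, q):
          # Goldwasser-Micali keys from two odd primes (toy sizes, not secure).
          n = p * q
          x = 2
          # x must be a quadratic non-residue modulo both p and q.
          while not (pow(x, (p - 1) // 2, p) == p - 1 and pow(x, (q - 1) // 2, q) == q - 1):
              x += 1
          return (n, x), (p, q)

      def gm_encrypt_bit(bit, public_key, rng):
          n, x = public_key
          while True:
              y = rng.randrange(2, n)
              if math.gcd(y, n) == 1:
                  break
          return (y * y * pow(x, bit, n)) % n        # y^2 * x^bit mod n

      def gm_decrypt_bit(c, private_key):
          p, _ = private_key
          # c is a quadratic residue mod p exactly when the encrypted bit was 0.
          return 0 if pow(c, (p - 1) // 2, p) == 1 else 1

      public_key, private_key = gm_keygen(499, 547)
      rng = random.Random(11)
      session_key_bits = [1, 0, 1, 1, 0, 0, 1, 0]    # bits of a toy symmetric session key
      ciphertext = [gm_encrypt_bit(b, public_key, rng) for b in session_key_bits]
      recovered = [gm_decrypt_bit(c, private_key) for c in ciphertext]
      print(recovered == session_key_bits)           # True: the symmetric key is shared intact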

  11. Mining Input Data for Multivariate Probabilistic Modeling of Rainfall-Induced Landslide Hazard in the Lake ATITLÁN Watershed in Guatemala

    NASA Astrophysics Data System (ADS)

    Cobin, P. F.; Oommen, T.; Gierke, J. S.

    2013-12-01

    The Lake Atitlán watershed is home to approximately 200,000 people and is located in the western highlands of Guatemala. Steep slopes, highly susceptible to landslides during the rainy season, characterize the region. Typically these landslides occur during high-intensity precipitation events. Hurricane Stan hit Guatemala in October 2005; the resulting flooding and landslides devastated the region. Locations of landslide and non-landslide points were obtained from field observations and orthophotos taken following Hurricane Stan. Different datasets of landslide and non-landslide points across the watershed were used to compare model success at a small scale and regional scale. This study used data from multiple attributes: geology, geomorphology, distance to faults and streams, land use, slope, aspect, curvature, plan curvature, profile curvature and topographic wetness index. The open source software Weka was used for the data mining. Several attribute selection methods were applied to the data to predetermine the potential landslide causative influence. Different multivariate algorithms were then evaluated for their ability to predict landslide occurrence. The following statistical parameters were used to evaluate model accuracy: precision, recall, F measure and area under the receiver operating characteristic (ROC) curve. The attribute combinations of the most successful models were compared to the attribute evaluator results. The algorithm BayesNet yielded the most accurate model and was used to build a probability map of landslide initiation points for the regions selected in the watershed. The ultimate aim of this study is to share the methodology and results with municipal contacts from the author's time as a U.S. Peace Corps volunteer, to facilitate more effective future landslide hazard planning and mitigation.
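
    Weka's BayesNet classifier is not available in Python, so as a stand-in the sketch below evaluates a simple probabilistic classifier with the same statistics named above (precision, recall, F-measure and ROC AUC) on synthetic attribute data; the GaussianNB model and the synthetic dataset are assumptions of this sketch, not the study's actual model or data.

      from sklearn.datasets import make_classification
      from sklearn.metrics import f1_score, precision_score, recall_score, roc_auc_score
      from sklearn.model_selection import train_test_split
      from sklearn.naive_bayes import GaussianNB

      # Synthetic stand-in for terrain attributes (slope, curvature, wetness index, ...).
      X, y = make_classification(n_samples=600, n_features=10, n_informative=5,
                                 weights=[0.7, 0.3], random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      clf = GaussianNB().fit(X_tr, y_tr)        # probabilistic classifier in place of BayesNet
      pred = clf.predict(X_te)
      prob = clf.predict_proba(X_te)[:, 1]      # "landslide initiation" probability

      print("precision:", round(precision_score(y_te, pred), 3))
      print("recall   :", round(recall_score(y_te, pred), 3))
      print("F-measure:", round(f1_score(y_te, pred), 3))
      print("ROC AUC  :", round(roc_auc_score(y_te, prob), 3))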

  12. Bayesian estimation of realized stochastic volatility model by Hybrid Monte Carlo algorithm

    NASA Astrophysics Data System (ADS)

    Takaishi, Tetsuya

    2014-03-01

    The hybrid Monte Carlo algorithm (HMCA) is applied for Bayesian parameter estimation of the realized stochastic volatility (RSV) model. Using the 2nd order minimum norm integrator (2MNI) for the molecular dynamics (MD) simulation in the HMCA, we find that the 2MNI is more efficient than the conventional leapfrog integrator. We also find that the autocorrelation time of the volatility variables sampled by the HMCA is very short. Thus it is concluded that the HMCA with the 2MNI is an efficient algorithm for parameter estimations of the RSV model.
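
    The RSV model is not reproduced here; the sketch below is a generic hybrid (Hamiltonian) Monte Carlo sampler with the standard leapfrog integrator, i.e. the baseline against which the 2MNI is compared, applied to a simple Gaussian target. The step size, trajectory length and target density are illustrative choices of this sketch.

      import numpy as np

      def hmc_sample(logp_and_grad, x0, n_samples=2000, eps=0.2, n_leap=20, seed=0):
          # Hybrid (Hamiltonian) Monte Carlo with the standard leapfrog integrator.
          rng = np.random.default_rng(seed)
          x = np.asarray(x0, dtype=float)
          samples = []
          for _ in range(n_samples):
              p = rng.standard_normal(x.shape)
              x_new, p_new = x.copy(), p.copy()
              logp, grad = logp_and_grad(x_new)
              p_new += 0.5 * eps * grad                    # half step for momentum
              for i in range(n_leap):
                  x_new += eps * p_new                     # full step for position
                  logp, grad = logp_and_grad(x_new)
                  if i != n_leap - 1:
                      p_new += eps * grad                  # full step for momentum
              p_new += 0.5 * eps * grad                    # final half step
              logp0, _ = logp_and_grad(x)
              h0 = -logp0 + 0.5 * np.dot(p, p)             # initial total energy
              h1 = -logp + 0.5 * np.dot(p_new, p_new)      # proposed total energy
              if rng.uniform() < np.exp(min(0.0, h0 - h1)):   # Metropolis accept/reject
                  x = x_new
              samples.append(x.copy())
          return np.array(samples)

      # Target: standard 2-D Gaussian, log p(x) = -0.5 * |x|^2 up to a constant.
      target = lambda x: (-0.5 * float(np.dot(x, x)), -x)
      draws = hmc_sample(target, [3.0, -3.0])
      print("mean:", draws[500:].mean(axis=0), "variance:", draws[500:].var(axis=0))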

  13. A novel hybrid genetic algorithm for optimal design of IPM machines for electric vehicle

    NASA Astrophysics Data System (ADS)

    Wang, Aimeng; Guo, Jiayu

    2017-12-01

    A novel hybrid genetic algorithm (HGA) is proposed to optimize the rotor structure of an IPM machine used in an EV application. The finite element (FE) simulation results of the HGA design are compared with those of the genetic algorithm (GA) design and the design before optimization. It is shown that the performance of the IPMSM is effectively improved by employing the GA and HGA, especially the HGA. Moreover, higher flux-weakening capability and less magnet usage are also obtained. Therefore, the validity of the HGA method in IPMSM design optimization is verified.

  14. Hydrologic Impacts of Developing Forest-based Bioenergy Feedstock in Wisconsin, USA and Entre Rios, Argentina Watersheds

    NASA Astrophysics Data System (ADS)

    Heidari, A.; Mayer, A. S.; Watkins, D. W., Jr.

    2017-12-01

    Growing demand for biomass-derived fuels has resulted in an increase in bioenergy projects across the Americas in recent years, a trend that is expected to continue. However, the expansion of bioenergy feedstock production might cause unintended environmental consequences. Accordingly, the goal of this research is to investigate how forest-based bioenergy development across the Americas may affect hydrological systems on a watershed scale. This study focuses on biofuel feedstock production with hybrid poplar cultivation in a snow-dominated watershed in northern Wisconsin, USA, and eucalyptus cultivation in a warm and temperate watershed in Entre Rios, Argentina. The Soil and Water Assessment Tool (SWAT), calibrated and validated for the two watersheds, is used to evaluate the effects of land use change corresponding to a range of biofuel development scenarios. The land use change scenarios include rules for limiting the location of the biofuel feedstock, and rotation time. These variables in turn impact the magnitude and timing of runoff and evapotranspiration. In Wisconsin, long term daily streamflow simulations indicate that planting poplar will increase evapotranspiration and decrease water yield, primarily through reduced baseflow contributions to streamflow. Results are also presented in terms of changes in flow relative to biomass production, to understand the sensitivity of potential biofuel generation to hydrologic impacts, and vice versa. In the end, alternative management practices were evaluated to mitigate the impacts. Keywords: Biofuel; Soil and Water Assessment Tool; Poplar; Baseflow; Evapotranspiration

  15. Development of a WRF-RTFDDA-based high-resolution hybrid data-assimilation and forecasting system toward to operation in the Middle East

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Wu, W.; Zhang, Y.; Kucera, P. A.; Liu, Y.; Pan, L.

    2012-12-01

    Weather forecasting in the Middle East is challenging because of its complicated geography, including an extensive coastal area and heterogeneous land surfaces, and a sparse regional observational network. Strong air-land-sea interactions form multi-scale weather regimes in the area, which require a numerical weather prediction model capable of properly representing multi-scale atmospheric flow with appropriate initial conditions. The WRF-based Real-Time Four Dimensional Data Assimilation (RTFDDA) system is one of the advanced multi-scale weather analysis and forecasting facilities developed at the Research Applications Laboratory (RAL) of NCAR. The forecasting system is applied to the Middle East with a carefully chosen configuration. To overcome the limitation of the very sparsely available conventional observations in the region, we develop a hybrid data assimilation algorithm combining RTFDDA and WRF-3DVAR, which ingests remote sensing data from satellites and radar. This hybrid data assimilation blends Newtonian nudging FDDA and 3DVAR technology to effectively assimilate both conventional observations and remote sensing measurements and provide improved initial conditions for the forecasting system. For brevity, the forecasting system is called RTF3H (RTFDDA-3DVAR Hybrid). In this presentation, we will discuss the hybrid data assimilation algorithm, its implementation, and applications to high-impact weather events in the area. Sensitivity studies are conducted to understand the strengths and limitations of this hybrid data assimilation algorithm.
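
    The WRF/3DVAR machinery is far beyond a short sketch, but the Newtonian-nudging half of the hybrid can be written schematically as a relaxation term added to the model tendency, dx/dt = f(x) + G(obs - x). The toy scalar model, the nudging coefficient g and the observation schedule below are illustrative assumptions only.

      import numpy as np

      def nudged_run(model_tendency, x0, obs, g=1.0e-3, dt=60.0, t_end=86400.0):
          # Euler integration of dx/dt = f(x) + G*(obs - x), applying the nudging term
          # only at times when an observation is available.
          x, t = float(x0), 0.0
          trajectory = [(t, x)]
          while t < t_end:
              tendency = model_tendency(x)
              if t in obs:
                  tendency += g * (obs[t] - x)     # relax the state toward the observation
              x += dt * tendency
              t += dt
              trajectory.append((t, x))
          return np.array(trajectory)

      # Toy scalar "model": slow relaxation toward 280 K, with hourly observations near 290 K.
      f = lambda x: -1.0e-5 * (x - 280.0)
      observations = {3600.0 * k: 290.0 for k in range(1, 24)}
      traj = nudged_run(f, 300.0, observations)
      print("final nudged state:", round(traj[-1, 1], 2))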

  16. Scaling Watershed Models: Modern Approaches to Science Computation with MapReduce, Parallelization, and Cloud Optimization

    EPA Science Inventory

    Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...

  17. Multi-objective AGV scheduling in an FMS using a hybrid of genetic algorithm and particle swarm optimization.

    PubMed

    Mousavi, Maryam; Yap, Hwa Jen; Musa, Siti Nurmaya; Tahriri, Farzad; Md Dawal, Siti Zawiah

    2017-01-01

    Flexible manufacturing system (FMS) enhances the firm's flexibility and responsiveness to the ever-changing customer demand by providing a fast product diversification capability. Performance of an FMS is highly dependent upon the accuracy of scheduling policy for the components of the system, such as automated guided vehicles (AGVs). An AGV as a mobile robot provides remarkable industrial capabilities for material and goods transportation within a manufacturing facility or a warehouse. Allocating AGVs to tasks, while considering the cost and time of operations, defines the AGV scheduling process. Multi-objective scheduling of AGVs, unlike single objective practices, is a complex and combinatorial process. In the main part of the research, a mathematical model was developed and integrated with evolutionary algorithms (genetic algorithm (GA), particle swarm optimization (PSO), and hybrid GA-PSO) to optimize the task scheduling of AGVs with the objectives of minimizing makespan and number of AGVs while considering the AGVs' battery charge. Assessment of the numerical examples' scheduling before and after the optimization proved the applicability of all three algorithms in decreasing the makespan and the number of AGVs. The hybrid GA-PSO produced the optimum result and outperformed the other two algorithms, with the mean AGV operation efficiency found to be 69.4, 74, and 79.8 percent for PSO, GA, and hybrid GA-PSO, respectively. Evaluation and validation of the model were performed by simulation via Flexsim software.

  18. Multi-objective AGV scheduling in an FMS using a hybrid of genetic algorithm and particle swarm optimization

    PubMed Central

    Yap, Hwa Jen; Musa, Siti Nurmaya; Tahriri, Farzad; Md Dawal, Siti Zawiah

    2017-01-01

    Flexible manufacturing system (FMS) enhances the firm’s flexibility and responsiveness to the ever-changing customer demand by providing a fast product diversification capability. Performance of an FMS is highly dependent upon the accuracy of scheduling policy for the components of the system, such as automated guided vehicles (AGVs). An AGV as a mobile robot provides remarkable industrial capabilities for material and goods transportation within a manufacturing facility or a warehouse. Allocating AGVs to tasks, while considering the cost and time of operations, defines the AGV scheduling process. Multi-objective scheduling of AGVs, unlike single objective practices, is a complex and combinatorial process. In the main part of the research, a mathematical model was developed and integrated with evolutionary algorithms (genetic algorithm (GA), particle swarm optimization (PSO), and hybrid GA-PSO) to optimize the task scheduling of AGVs with the objectives of minimizing makespan and number of AGVs while considering the AGVs’ battery charge. Assessment of the numerical examples’ scheduling before and after the optimization proved the applicability of all three algorithms in decreasing the makespan and the number of AGVs. The hybrid GA-PSO produced the optimum result and outperformed the other two algorithms, with the mean AGV operation efficiency found to be 69.4, 74, and 79.8 percent for PSO, GA, and hybrid GA-PSO, respectively. Evaluation and validation of the model were performed by simulation via Flexsim software. PMID:28263994

  19. A flocking algorithm for multi-agent systems with connectivity preservation under hybrid metric-topological interactions.

    PubMed

    He, Chenlong; Feng, Zuren; Ren, Zhigang

    2018-01-01

    In this paper, we propose a connectivity-preserving flocking algorithm for multi-agent systems in which the neighbor set of each agent is determined by the hybrid metric-topological distance so that the interaction topology can be represented as the range-limited Delaunay graph, which combines the properties of the commonly used disk graph and Delaunay graph. As a result, the proposed flocking algorithm has the following advantages over the existing ones. First, range-limited Delaunay graph is sparser than the disk graph so that the information exchange among agents is reduced significantly. Second, some links irrelevant to the connectivity can be dynamically deleted during the evolution of the system. Thus, the proposed flocking algorithm is more flexible than existing algorithms, where links are not allowed to be disconnected once they are created. Finally, the multi-agent system spontaneously generates a regular quasi-lattice formation without imposing the constraint on the ratio of the sensing range of the agent to the desired distance between two adjacent agents. With the interaction topology induced by the hybrid distance, the proposed flocking algorithm can still be implemented in a distributed manner. We prove that the proposed flocking algorithm can steer the multi-agent system to a stable flocking motion, provided the initial interaction topology of multi-agent systems is connected and the hysteresis in link addition is smaller than a derived upper bound. The correctness and effectiveness of the proposed algorithm are verified by extensive numerical simulations, where the flocking algorithms based on the disk and Delaunay graph are compared.
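
    As a sketch of how the neighbor set induced by the hybrid metric-topological distance can be constructed in practice, the fragment below builds a Delaunay triangulation of the agent positions with scipy.spatial and removes edges longer than the sensing range; the flocking control law itself is not reproduced, and the positions and range are illustrative.

      import numpy as np
      from scipy.spatial import Delaunay

      def range_limited_delaunay(points, sensing_range):
          # Adjacency sets of the Delaunay graph with edges longer than the range removed.
          tri = Delaunay(points)
          neighbors = {i: set() for i in range(len(points))}
          for simplex in tri.simplices:            # each 2-D simplex is a triangle
              for a in simplex:
                  for b in simplex:
                      if a < b and np.linalg.norm(points[a] - points[b]) <= sensing_range:
                          neighbors[int(a)].add(int(b))
                          neighbors[int(b)].add(int(a))
          return neighbors

      rng = np.random.default_rng(2)
      agents = rng.uniform(0.0, 50.0, size=(20, 2))
      nbrs = range_limited_delaunay(agents, sensing_range=15.0)
      print("average degree:", sum(len(v) for v in nbrs.values()) / len(nbrs))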

  20. A flocking algorithm for multi-agent systems with connectivity preservation under hybrid metric-topological interactions

    PubMed Central

    Feng, Zuren; Ren, Zhigang

    2018-01-01

    In this paper, we propose a connectivity-preserving flocking algorithm for multi-agent systems in which the neighbor set of each agent is determined by the hybrid metric-topological distance, so that the interaction topology can be represented as the range-limited Delaunay graph, which combines the properties of the commonly used disk graph and Delaunay graph. As a result, the proposed flocking algorithm has the following advantages over existing ones. First, the range-limited Delaunay graph is sparser than the disk graph, so the information exchange among agents is reduced significantly. Second, some links irrelevant to the connectivity can be dynamically deleted during the evolution of the system. Thus, the proposed flocking algorithm is more flexible than existing algorithms, in which links are not allowed to be disconnected once they are created. Finally, the multi-agent system spontaneously generates a regular quasi-lattice formation without imposing a constraint on the ratio of the sensing range of the agent to the desired distance between two adjacent agents. With the interaction topology induced by the hybrid distance, the proposed flocking algorithm can still be implemented in a distributed manner. We prove that the proposed flocking algorithm can steer the multi-agent system to a stable flocking motion, provided the initial interaction topology of the multi-agent system is connected and the hysteresis in link addition is smaller than a derived upper bound. The correctness and effectiveness of the proposed algorithm are verified by extensive numerical simulations, in which the flocking algorithms based on the disk and Delaunay graphs are compared. PMID:29462217

  1. An Effective Hybrid Routing Algorithm in WSN: Ant Colony Optimization in combination with Hop Count Minimization.

    PubMed

    Jiang, Ailian; Zheng, Lihong

    2018-03-29

    Low cost, high reliability and easy maintenance are key criteria in the design of routing protocols for wireless sensor networks (WSNs). This paper investigates the existing ant colony optimization (ACO)-based WSN routing algorithms and the minimum hop count WSN routing algorithms by reviewing their strengths and weaknesses. We also consider the critical factors of WSNs, such as the energy constraint of sensor nodes, network load balancing and dynamic network topology. We then propose a hybrid routing algorithm that integrates ACO and a minimum hop count scheme. The proposed algorithm is able to find the optimal routing path with minimal total energy consumption and balanced energy consumption on each node. The algorithm offers clear advantages in searching for the optimal path, balancing the network load and maintaining the network topology. The WSN model and the proposed algorithm have been implemented using C++. Extensive simulation results show that our algorithm outperforms several other WSN routing algorithms in aspects including the rate of convergence, the success rate in finding the global optimal solution, and the network lifetime.
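
    A minimal sketch of how pheromone and hop-count information might be combined in a probabilistic next-hop rule is given below. The weighting exponents and the residual-energy term are assumptions for illustration, not the paper's exact transition rule.

        import random

        def choose_next_hop(current, candidates, pheromone, hop_count, energy,
                            alpha=1.0, beta=2.0, gamma=1.0):
            """Pick the next hop with probability ~ pheromone^alpha * (1/hops)^beta * energy^gamma."""
            weights = []
            for node in candidates:
                tau = pheromone[(current, node)] ** alpha
                eta = (1.0 / (1 + hop_count[node])) ** beta   # fewer hops to the sink -> larger eta
                e = energy[node] ** gamma                     # prefer nodes with more residual energy
                weights.append(tau * eta * e)
            total = sum(weights)
            r, acc = random.random() * total, 0.0
            for node, w in zip(candidates, weights):
                acc += w
                if r <= acc:
                    return node
            return candidates[-1]

        # usage with a toy 4-node neighborhood (all values hypothetical)
        pher = {(0, 1): 0.8, (0, 2): 0.5, (0, 3): 0.2}
        hops = {1: 2, 2: 1, 3: 4}
        energy = {1: 0.9, 2: 0.6, 3: 1.0}
        print(choose_next_hop(0, [1, 2, 3], pher, hops, energy))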

  2. An Effective Hybrid Routing Algorithm in WSN: Ant Colony Optimization in combination with Hop Count Minimization

    PubMed Central

    2018-01-01

    Low cost, high reliability and easy maintenance are key criteria in the design of routing protocols for wireless sensor networks (WSNs). This paper investigates the existing ant colony optimization (ACO)-based WSN routing algorithms and the minimum hop count WSN routing algorithms by reviewing their strengths and weaknesses. We also consider the critical factors of WSNs, such as the energy constraint of sensor nodes, network load balancing and dynamic network topology. We then propose a hybrid routing algorithm that integrates ACO and a minimum hop count scheme. The proposed algorithm is able to find the optimal routing path with minimal total energy consumption and balanced energy consumption on each node. The algorithm offers clear advantages in searching for the optimal path, balancing the network load and maintaining the network topology. The WSN model and the proposed algorithm have been implemented using C++. Extensive simulation results show that our algorithm outperforms several other WSN routing algorithms in aspects including the rate of convergence, the success rate in finding the global optimal solution, and the network lifetime. PMID:29596336

  3. Seasonal Phosphorus Sources and Loads to Upper Klamath Lake, Oregon, as Determined by a Dynamic SPARROW Model

    NASA Astrophysics Data System (ADS)

    Saleh, D.; Domagalski, J. L.; Smith, R. A.

    2016-12-01

    The SPARROW (SPAtially-Referenced Regression On Watershed Attributes) model, developed by the U.S. Geological Survey, has been used to identify and quantify the sources of nitrogen and phosphorus in watersheds and to predict their fluxes and concentration at specified locations downstream. Existing SPARROW models use a hybrid statistical approach to describe an annual average ("steady-state") relationship between sources and stream conditions based on long-term water quality monitoring data and spatially-referenced explanatory information. Although these annual models are useful for some management purposes, many water quality issues stem from intra- and inter-annual changes in constituent sources, hydrologic forcing, or other environmental conditions, which cause a lag between watershed inputs and stream water quality. We are developing a seasonal dynamic SPARROW model of sources, fluxes, and yields of phosphorus for the watershed (approximately 9,700 square kilometers) draining to Upper Klamath Lake, Oregon. The lake is hyper-eutrophic and various options are being considered for water quality improvement. The model was calibrated with 11 years of water quality data (2000 to 2010) and simulates seasonal loads and yields for a total of 44 seasons. Phosphorus sources to the watershed include animal manure, farm fertilizer, discharges of treated wastewater, and natural sources (soil and streambed sediment). The model predicts that phosphorus delivery to the lake is strongly affected by intra- and inter-annual changes in precipitation and by temporary seasonal storage of phosphorus in the watershed. The model can be used to predict how different management actions for mitigating phosphorus sources might affect phosphorus loading to the lake as well as the time required for any changes in loading to occur following implementation of the action.

  4. Swarm intelligence-based approach for optimal design of CMOS differential amplifier and comparator circuit using a hybrid salp swarm algorithm

    NASA Astrophysics Data System (ADS)

    Asaithambi, Sasikumar; Rajappa, Muthaiah

    2018-05-01

    In this paper, an automatic design method based on a swarm intelligence approach for CMOS analog integrated circuit (IC) design is presented. The hybrid meta-heuristic optimization technique, namely, the salp swarm algorithm (SSA), is applied to the optimal sizing of a CMOS differential amplifier and a comparator circuit. SSA is a nature-inspired optimization algorithm which mimics the navigating and hunting behavior of salps. The hybrid SSA is applied to optimize the circuit design parameters and to minimize the MOS transistor sizes. The proposed swarm intelligence approach was successfully implemented for automatic design and optimization of CMOS analog ICs using Generic Process Design Kit (GPDK) 180 nm technology. The circuit design parameters and design specifications are validated using a Simulation Program with Integrated Circuit Emphasis (SPICE) simulator. To investigate the efficiency of the proposed approach, comparisons have been carried out with other simulation-based circuit design methods. The performance of the hybrid SSA based CMOS analog IC designs is better than that of previously reported studies.
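
    The sketch below shows the core salp swarm position updates (a leader tracking the best solution found so far, with followers chaining behind it), assuming the standard SSA formulation. The circuit-specific encoding of transistor sizes and the SPICE-in-the-loop evaluation are omitted; the toy objective is a stand-in.

        import numpy as np

        def salp_swarm(objective, lb, ub, n_salps=30, n_iter=200, seed=0):
            """Core SSA updates: leader follows the food source, followers chain behind."""
            rng = np.random.default_rng(seed)
            dim = len(lb)
            x = rng.uniform(lb, ub, size=(n_salps, dim))
            fitness = np.array([objective(s) for s in x])
            food = x[np.argmin(fitness)].copy()              # best solution found so far
            food_f = fitness.min()

            for t in range(1, n_iter + 1):
                c1 = 2.0 * np.exp(-(4.0 * t / n_iter) ** 2)  # exploration/exploitation balance
                for i in range(n_salps):
                    if i == 0:                                # leader salp
                        c2, c3 = rng.random(dim), rng.random(dim)
                        step = c1 * ((ub - lb) * c2 + lb)
                        x[i] = np.where(c3 >= 0.5, food + step, food - step)
                    else:                                     # follower salps
                        x[i] = 0.5 * (x[i] + x[i - 1])
                    x[i] = np.clip(x[i], lb, ub)
                fitness = np.array([objective(s) for s in x])
                if fitness.min() < food_f:
                    food_f, food = fitness.min(), x[np.argmin(fitness)].copy()
            return food, food_f

        # usage: a placeholder objective (a penalized sizing error would replace this)
        best, err = salp_swarm(lambda w: np.sum((w - 1.5) ** 2), np.zeros(6), np.full(6, 10.0))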

  5. Swarm intelligence-based approach for optimal design of CMOS differential amplifier and comparator circuit using a hybrid salp swarm algorithm.

    PubMed

    Asaithambi, Sasikumar; Rajappa, Muthaiah

    2018-05-01

    In this paper, an automatic design method based on a swarm intelligence approach for CMOS analog integrated circuit (IC) design is presented. The hybrid meta-heuristic optimization technique, namely, the salp swarm algorithm (SSA), is applied to the optimal sizing of a CMOS differential amplifier and a comparator circuit. SSA is a nature-inspired optimization algorithm which mimics the navigating and hunting behavior of salps. The hybrid SSA is applied to optimize the circuit design parameters and to minimize the MOS transistor sizes. The proposed swarm intelligence approach was successfully implemented for automatic design and optimization of CMOS analog ICs using Generic Process Design Kit (GPDK) 180 nm technology. The circuit design parameters and design specifications are validated using a Simulation Program with Integrated Circuit Emphasis (SPICE) simulator. To investigate the efficiency of the proposed approach, comparisons have been carried out with other simulation-based circuit design methods. The performance of the hybrid SSA based CMOS analog IC designs is better than that of previously reported studies.

  6. A New Efficient Hybrid Intelligent Model for Biodegradation Process of DMP with Fuzzy Wavelet Neural Networks

    NASA Astrophysics Data System (ADS)

    Huang, Mingzhi; Zhang, Tao; Ruan, Jujun; Chen, Xiaohong

    2017-01-01

    A new efficient hybrid intelligent approach based on a fuzzy wavelet neural network (FWNN) was proposed for effectively modeling and simulating the biodegradation process of dimethyl phthalate (DMP) in an anaerobic/anoxic/oxic (AAO) wastewater treatment process. By combining the self-learning and memory abilities of neural networks (NN), the uncertainty-handling capacity of fuzzy logic (FL), the local-detail analysis of the wavelet transform (WT), and the global search of the genetic algorithm (GA), the proposed hybrid intelligent model can extract the dynamic behavior and complex interrelationships from various water quality variables. To find optimal values for the parameters of the proposed FWNN, a hybrid learning algorithm integrating an improved genetic optimization and a gradient descent algorithm is employed. The results show that, compared with an NN model (optimized by GA) and a kinetic model, the proposed FWNN model has quicker convergence, higher prediction performance, smaller RMSE (0.080), MSE (0.0064) and MAPE (1.8158), and a higher R2 (0.9851) value, which illustrates that the FWNN model simulates effluent DMP more accurately than the mechanistic model.

  7. A New Efficient Hybrid Intelligent Model for Biodegradation Process of DMP with Fuzzy Wavelet Neural Networks

    PubMed Central

    Huang, Mingzhi; Zhang, Tao; Ruan, Jujun; Chen, Xiaohong

    2017-01-01

    A new efficient hybrid intelligent approach based on a fuzzy wavelet neural network (FWNN) was proposed for effectively modeling and simulating the biodegradation process of dimethyl phthalate (DMP) in an anaerobic/anoxic/oxic (AAO) wastewater treatment process. By combining the self-learning and memory abilities of neural networks (NN), the uncertainty-handling capacity of fuzzy logic (FL), the local-detail analysis of the wavelet transform (WT), and the global search of the genetic algorithm (GA), the proposed hybrid intelligent model can extract the dynamic behavior and complex interrelationships from various water quality variables. To find optimal values for the parameters of the proposed FWNN, a hybrid learning algorithm integrating an improved genetic optimization and a gradient descent algorithm is employed. The results show that, compared with an NN model (optimized by GA) and a kinetic model, the proposed FWNN model has quicker convergence, higher prediction performance, smaller RMSE (0.080), MSE (0.0064) and MAPE (1.8158), and a higher R2 (0.9851) value, which illustrates that the FWNN model simulates effluent DMP more accurately than the mechanistic model. PMID:28120889

  8. Case for a field-programmable gate array multicore hybrid machine for an image-processing application

    NASA Astrophysics Data System (ADS)

    Rakvic, Ryan N.; Ives, Robert W.; Lira, Javier; Molina, Carlos

    2011-01-01

    General purpose computer designers have recently begun adding cores to their processors in order to increase performance. For example, Intel has adopted a homogeneous quad-core processor as a base for general purpose computing. PlayStation3 (PS3) game consoles contain a multicore heterogeneous processor known as the Cell, which is designed to perform complex image processing algorithms at a high level. Can modern image-processing algorithms utilize these additional cores? On the other hand, modern advancements in configurable hardware, most notably field-programmable gate arrays (FPGAs) have created an interesting question for general purpose computer designers. Is there a reason to combine FPGAs with multicore processors to create an FPGA multicore hybrid general purpose computer? Iris matching, a repeatedly executed portion of a modern iris-recognition algorithm, is parallelized on an Intel-based homogeneous multicore Xeon system, a heterogeneous multicore Cell system, and an FPGA multicore hybrid system. Surprisingly, the cheaper PS3 slightly outperforms the Intel-based multicore on a core-for-core basis. However, both multicore systems are beaten by the FPGA multicore hybrid system by >50%.

  9. A Short-Term and High-Resolution System Load Forecasting Approach Using Support Vector Regression with Hybrid Parameters Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang

    This work proposes an approach for distribution system load forecasting, which aims to provide highly accurate short-term load forecasting with high resolution utilizing a support vector regression (SVR) based forecaster and a two-step hybrid parameters optimization method. Specifically, because the load profiles in distribution systems contain abrupt deviations, a data normalization is designed as the pretreatment for the collected historical load data. Then an SVR model is trained by the load data to forecast the future load. For better performance of SVR, a two-step hybrid optimization algorithm is proposed to determine the best parameters. In the first step of the hybrid optimization algorithm, a designed grid traverse algorithm (GTA) is used to narrow the parameters searching area from a global to local space. In the second step, based on the result of the GTA, particle swarm optimization (PSO) is used to determine the best parameters in the local parameter space. After the best parameters are determined, the SVR model is used to forecast the short-term load deviation in the distribution system.
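
    A hedged sketch of the two-step idea follows: a coarse grid traverse over log-spaced (C, gamma) values to locate a promising region, then a small particle swarm confined to that region. The parameter ranges, swarm settings, and synthetic data are placeholders, not the paper's GTA/PSO implementation, and scikit-learn is assumed to be available.

        import numpy as np
        from sklearn.svm import SVR

        def validation_error(c, g, X_tr, y_tr, X_va, y_va):
            """Mean squared error of an SVR(C=c, gamma=g) on a held-out validation set."""
            model = SVR(C=c, gamma=g).fit(X_tr, y_tr)
            return np.mean((model.predict(X_va) - y_va) ** 2)

        def two_step_svr_tuning(X_tr, y_tr, X_va, y_va):
            # Step 1: coarse "grid traverse" over log-spaced C and gamma values
            Cs, gammas = np.logspace(-1, 3, 5), np.logspace(-3, 1, 5)
            best = min(((c, g) for c in Cs for g in gammas),
                       key=lambda p: validation_error(*p, X_tr, y_tr, X_va, y_va))

            # Step 2: a small particle swarm confined to the neighborhood of the grid winner
            rng = np.random.default_rng(0)
            lo, hi = np.log10(best) - 0.5, np.log10(best) + 0.5
            pos = rng.uniform(lo, hi, size=(10, 2))
            vel = np.zeros_like(pos)
            pbest = pos.copy()
            pbest_f = np.array([validation_error(10 ** p[0], 10 ** p[1], X_tr, y_tr, X_va, y_va)
                                for p in pos])
            gbest = pbest[np.argmin(pbest_f)]
            for _ in range(20):
                r1, r2 = rng.random((2, 10, 2))
                vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
                pos = np.clip(pos + vel, lo, hi)
                f = np.array([validation_error(10 ** p[0], 10 ** p[1], X_tr, y_tr, X_va, y_va)
                              for p in pos])
                improved = f < pbest_f
                pbest[improved], pbest_f[improved] = pos[improved], f[improved]
                gbest = pbest[np.argmin(pbest_f)]
            return 10 ** gbest                      # best (C, gamma) pair found

        # usage with synthetic data standing in for normalized historical load profiles
        rng = np.random.default_rng(1)
        X = rng.uniform(0, 1, size=(200, 3))
        y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
        C_opt, gamma_opt = two_step_svr_tuning(X[:150], y[:150], X[150:], y[150:])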

  10. Design and Implementation of Hybrid CORDIC Algorithm Based on Phase Rotation Estimation for NCO

    PubMed Central

    Zhang, Chaozhu; Han, Jinan; Li, Ke

    2014-01-01

    The numerically controlled oscillator has wide application in radar, digital receivers, and software radio systems. Firstly, this paper introduces the traditional CORDIC algorithm. Then, in order to improve computing speed and save resources, this paper proposes a hybrid CORDIC algorithm based on phase rotation estimation applied in a numerically controlled oscillator (NCO). By estimating the direction of part of the phase rotations, the algorithm reduces the number of phase rotations and add-subtract units, thereby decreasing delay. Furthermore, the paper simulates and implements the numerically controlled oscillator using Quartus II and ModelSim software. Finally, simulation results indicate that an improvement over the traditional CORDIC algorithm is achieved in terms of ease of computation, resource utilization, and computing speed/delay while maintaining the precision. It is suitable for high speed and high precision digital modulation and demodulation. PMID:25110750
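
    For orientation, the sketch below implements the conventional CORDIC rotation mode, which is the baseline that the hybrid phase-rotation-estimation scheme improves upon. It uses floating point as a stand-in for the fixed-point arithmetic of an NCO hardware implementation.

        import math

        def cordic_sin_cos(angle, n_iter=16):
            """Classic CORDIC rotation mode: rotate (1, 0) by 'angle' (radians, |angle| <= pi/2)."""
            # Pre-computed micro-rotation angles and the accumulated scaling gain
            angles = [math.atan(2.0 ** -i) for i in range(n_iter)]
            K = 1.0
            for i in range(n_iter):
                K *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))

            x, y, z = 1.0, 0.0, angle
            for i in range(n_iter):
                d = 1.0 if z >= 0 else -1.0           # rotation direction from the residual phase
                x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
                z -= d * angles[i]
            return K * y, K * x                        # (sin, cos)

        # usage: compare against the math library
        s, c = cordic_sin_cos(0.6)
        print(s - math.sin(0.6), c - math.cos(0.6))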

  11. Prioritising watersheds on the basis of regional flood susceptibility and vulnerability in mountainous areas through the use of indicators

    NASA Astrophysics Data System (ADS)

    Rogelis, Carolina; Werner, Micha

    2013-04-01

    Settlements in peri-urban areas of many cities in mountainous areas such as the Andes are susceptible to hazards such as flash floods and debris flows. Additionally, these settlements are in many cases informal and thus vulnerable to such hazards, resulting in significant risk. Such watersheds are often quite small, and generally there is little or no information from gauges to help characterise risk. To help identify watersheds in which flood management measures are to be targeted, a rapid assessment of risk is required. In this paper a novel approach is presented in which indicators of susceptibility and vulnerability to flash floods were used to prioritize 106 mountain watersheds in Bogotá (Colombia). Variables recognized in the literature to determine the dominant processes in both susceptibility and vulnerability to flash floods were used to construct the indicators. Susceptibility was considered to increase with flashiness and the possibility of debris flow events occurring. This was assessed through the use of an indicator composed of a morphometric indicator and a land use indicator. The former was constructed using morphological variables recognized in the literature to significantly influence flashiness and the occurrence of debris flows; the latter was constructed in terms of percentage of vegetation cover, urban area and bare soil. The morphometric indicator was compared with the results of a debris flow propagation algorithm to assess its capacity to identify the morphological conditions of a watershed that make it able to transport debris flows. Propagation was carried out through the use of the Modified Single Flow Direction algorithm, following previous identification of source areas by applying thresholds identified in the area-slope curve of the watersheds and empirical thresholds. Results show that the morphometric variables can be grouped into four categories: size, shape, hypsometry and energy, with energy being the component found to best explain the capability of the watershed to transport debris flows. The combination of the morphometric and land use indicators resulted in a susceptibility indicator that was compared with the available records of past floods in the area. This showed that the use of the land use indicator significantly improves the susceptibility assessment. Vulnerability was assessed in terms of indicators representing physical exposure, fragility of the socio-economic system and lack of resilience to cope and recover. Principal component analysis was subsequently applied to reduce the number of variables and provide a representation of each of their facets by a component. This resulted in a composite indicator of susceptibility and vulnerability for each of the 106 watersheds. The indicator was compared with the history of flash flood damage in the watersheds. Results show that the indicator is useful in applications at regional scales for preliminary assessment to differentiate the degree of flood susceptibility and vulnerability at a spatial level. This provides an initial and qualitative risk outlook in the study area and can be used for planning and prioritization of further, more detailed studies.
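
    A minimal sketch of how such a composite indicator might be assembled is shown below: each variable is min-max normalized across watersheds and then combined with weights. The variable names, values, and weights are purely illustrative and are not the indicators or weighting used in the study.

        import numpy as np

        def composite_indicator(variables, weights):
            """Min-max normalize each variable across watersheds, then take a weighted sum."""
            scores = np.zeros(len(next(iter(variables.values()))))
            for name, w in weights.items():
                v = np.asarray(variables[name], dtype=float)
                norm = (v - v.min()) / (v.max() - v.min())   # 0 = least, 1 = most susceptible
                scores += w * norm
            return scores

        # usage: three hypothetical watersheds with illustrative variables and weights
        variables = {
            "melton_ratio":       [0.45, 0.80, 0.30],        # morphometric "energy" proxy
            "drainage_density":   [2.1, 3.4, 1.8],
            "bare_soil_fraction": [0.10, 0.35, 0.05],
        }
        weights = {"melton_ratio": 0.5, "drainage_density": 0.3, "bare_soil_fraction": 0.2}
        ranking = np.argsort(-composite_indicator(variables, weights))  # most susceptible first
        print(ranking)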

  12. A hybrid frame concealment algorithm for H.264/AVC.

    PubMed

    Yan, Bo; Gharavi, Hamid

    2010-01-01

    In packet-based video transmissions, packet loss due to channel errors may result in the loss of a whole video frame. Recently, many error concealment algorithms have been proposed in order to combat channel errors; however, most of the existing algorithms can only deal with the loss of macroblocks and are not able to conceal a whole missing frame. In order to resolve this problem, in this paper we propose a new hybrid motion vector extrapolation (HMVE) algorithm to recover the whole missing frame, and it is able to provide more accurate estimation of the motion vectors of the missing frame than other conventional methods. Simulation results show that it is highly effective and significantly outperforms other existing frame recovery methods.

  13. A Novel Handwritten Letter Recognizer Using Enhanced Evolutionary Neural Network

    NASA Astrophysics Data System (ADS)

    Mahmoudi, Fariborz; Mirzashaeri, Mohsen; Shahamatnia, Ehsan; Faridnia, Saed

    This paper introduces a novel design for handwritten letter recognition that employs a hybrid back-propagation neural network with an enhanced evolutionary algorithm. The input to the neural network is produced by a new approach that is invariant to translation, rotation, and scaling of the input letters. The evolutionary algorithm is used for the global search of the search space, and the back-propagation algorithm is used for the local search. The results have been computed by implementing this approach for recognizing 26 English capital letters in the handwriting of different people. The computational results show that the neural network reaches very satisfying results with relatively scarce input data, and a promising improvement in the convergence of the hybrid evolutionary back-propagation algorithm is exhibited.

  14. Numerical investigation of field enhancement by metal nano-particles using a hybrid FDTD-PSTD algorithm.

    PubMed

    Pernice, W H; Payne, F P; Gallagher, D F

    2007-09-03

    We present a novel numerical scheme for the simulation of the field enhancement by metal nano-particles in the time domain. The algorithm is based on a combination of the finite-difference time-domain method and the pseudo-spectral time-domain method for dispersive materials. The hybrid solver leads to an efficient subgridding algorithm that does not suffer from spurious field spikes as do FDTD schemes. Simulation of the field enhancement by gold particles shows the expected exponential field profile. The enhancement factors are computed for single particles and particle arrays. Due to the geometry conforming mesh the algorithm is stable for long integration times and thus suitable for the simulation of resonance phenomena in coupled nano-particle structures.

  15. An Effective Hybrid Firefly Algorithm with Harmony Search for Global Numerical Optimization

    PubMed Central

    Guo, Lihong; Wang, Gai-Ge; Wang, Heqi; Wang, Dinan

    2013-01-01

    A hybrid metaheuristic approach combining harmony search (HS) and the firefly algorithm (FA), namely HS/FA, is proposed for function optimization. In HS/FA, the exploration of HS and the exploitation of FA are fully exerted, so HS/FA has a faster convergence speed than HS and FA. Also, a top-fireflies scheme is introduced to reduce running time, and HS is utilized to mutate fireflies during the update step. The HS/FA method is verified on various benchmarks. The experiments show that HS/FA performs better than the standard FA and eight other optimization methods. PMID:24348137
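
    The sketch below illustrates the two ingredients in a single iteration: a standard firefly attraction move followed by a harmony-search-style regeneration of the worst firefly. The constants and the exact interleaving of the two operators are assumptions for illustration, not the HS/FA settings of the paper.

        import numpy as np

        def hs_fa_step(pop, fitness, lb, ub, rng,
                       beta0=1.0, gamma=1.0, alpha=0.2, hmcr=0.9, par=0.3, bw=0.05):
            """One illustrative HS/FA iteration: firefly moves, then HS-style rebuild of the worst member."""
            n, dim = pop.shape
            new_pop = pop.copy()
            # Firefly attraction: each firefly moves toward every brighter (better) one
            for i in range(n):
                for j in range(n):
                    if fitness[j] < fitness[i]:
                        r2 = np.sum((pop[i] - pop[j]) ** 2)
                        beta = beta0 * np.exp(-gamma * r2)
                        new_pop[i] += beta * (pop[j] - pop[i]) + alpha * (rng.random(dim) - 0.5)
            # Harmony-search-style mutation of the worst firefly: each dimension is drawn
            # from the population memory (prob. HMCR) or at random, with occasional pitch adjustment
            worst = int(np.argmax(fitness))
            for d in range(dim):
                if rng.random() < hmcr:
                    new_pop[worst, d] = pop[rng.integers(n), d]
                    if rng.random() < par:
                        new_pop[worst, d] += bw * (ub[d] - lb[d]) * (rng.random() - 0.5)
                else:
                    new_pop[worst, d] = rng.uniform(lb[d], ub[d])
            return np.clip(new_pop, lb, ub)

        # usage: a few iterations on a 2-D sphere function
        rng = np.random.default_rng(0)
        lb, ub = np.array([-5.0, -5.0]), np.array([5.0, 5.0])
        pop = rng.uniform(lb, ub, size=(15, 2))
        for _ in range(50):
            fit = np.sum(pop ** 2, axis=1)
            pop = hs_fa_step(pop, fit, lb, ub, rng)
        print(np.min(np.sum(pop ** 2, axis=1)))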

  16. A multilevel ant colony optimization algorithm for classical and isothermic DNA sequencing by hybridization with multiplicity information available.

    PubMed

    Kwarciak, Kamil; Radom, Marcin; Formanowicz, Piotr

    2016-04-01

    Classical sequencing by hybridization takes into account binary information about sequence composition: a given element from an oligonucleotide library is or is not a part of the target sequence. However, DNA chip technology has developed to the point where it can provide partial information about the multiplicity of each oligonucleotide the analyzed sequence consists of. Currently, it is not possible to obtain exact data of this type, but even partial information should be very useful. Two realistic multiplicity information models are taken into consideration in this paper. The first one, called "one and many", assumes that it is possible to obtain information on whether a given oligonucleotide occurs in a reconstructed sequence once or more than once. According to the second model, called "one, two and many", one is able to learn from the biochemical experiment whether a given oligonucleotide is present in an analyzed sequence once, twice or at least three times. An ant colony optimization algorithm has been implemented to verify the above models and to compare them with existing algorithms for sequencing by hybridization which utilize the additional information. The proposed algorithm solves the problem with any kind of hybridization errors. Computational experiment results confirm that using even partial information about multiplicity leads to increased quality of the reconstructed sequences. Moreover, they also show that the more precise model enables better solutions to be obtained and that the ant colony optimization algorithm outperforms the existing ones. Test data sets and the proposed ant colony optimization algorithm are available at: http://bioserver.cs.put.poznan.pl/download/ACO4mSBH.zip. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Testing trivializing maps in the Hybrid Monte Carlo algorithm

    PubMed Central

    Engel, Georg P.; Schaefer, Stefan

    2011-01-01

    We test a recent proposal to use approximate trivializing maps in a field theory to speed up Hybrid Monte Carlo simulations. Simulating the CP^{N-1} model, we find a small improvement with the leading-order transformation, which is, however, compensated by the additional computational overhead. The scaling of the algorithm towards the continuum is not changed. In particular, the effect of the topological modes on the autocorrelation times is studied. PMID:21969733
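
    For context, the sketch below shows a plain Hybrid Monte Carlo update (momentum refresh, leapfrog trajectory, Metropolis accept/reject) for a generic action S(q). The trivializing-map change of variables studied in the paper, and the CP^{N-1} action itself, are not included; the Gaussian action in the usage example is only a stand-in target.

        import numpy as np

        def hmc_update(q, action, grad_action, rng, n_steps=20, step_size=0.05):
            """One Hybrid Monte Carlo trajectory with a leapfrog integrator."""
            p = rng.standard_normal(q.shape)                   # refresh momenta
            H_old = action(q) + 0.5 * np.sum(p ** 2)

            q_new = q.copy()
            p_new = p - 0.5 * step_size * grad_action(q)       # initial half-step for momentum
            for _ in range(n_steps - 1):
                q_new = q_new + step_size * p_new
                p_new = p_new - step_size * grad_action(q_new)
            q_new = q_new + step_size * p_new
            p_new = p_new - 0.5 * step_size * grad_action(q_new)  # final half-step

            H_new = action(q_new) + 0.5 * np.sum(p_new ** 2)
            if rng.random() < np.exp(H_old - H_new):           # Metropolis accept/reject
                return q_new, True
            return q, False

        # usage: free-field-like Gaussian action as a stand-in target
        rng = np.random.default_rng(0)
        action = lambda q: 0.5 * np.sum(q ** 2)
        grad = lambda q: q
        q = rng.standard_normal(16)
        for _ in range(100):
            q, accepted = hmc_update(q, action, grad, rng)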

  18. Batch Scheduling for Hybrid Assembly Differentiation Flow Shop to Minimize Total Actual Flow Time

    NASA Astrophysics Data System (ADS)

    Maulidya, R.; Suprayogi; Wangsaputra, R.; Halim, A. H.

    2018-03-01

    A hybrid assembly differentiation flow shop is a three-stage flow shop consisting of Machining, Assembly and Differentiation stages and producing different types of products. In the machining stage, parts are processed in batches on different (unrelated) machines. In the assembly stage, the different parts are assembled into an assembly product. Finally, the assembled products are further processed into different types of final products in the differentiation stage. In this paper, we develop a batch scheduling model for a hybrid assembly differentiation flow shop to minimize the total actual flow time, defined as the total time parts spend on the shop floor from their arrival times until their due dates. We also propose a heuristic algorithm for solving the problem. The proposed algorithm is tested using a set of hypothetical data. The solution shows that the algorithm can solve the problem effectively.

  19. Available Transfer Capability Determination Using Hybrid Evolutionary Algorithm

    NASA Astrophysics Data System (ADS)

    Jirapong, Peeraool; Ongsakul, Weerakorn

    2008-10-01

    This paper proposes a new hybrid evolutionary algorithm (HEA) based on evolutionary programming (EP), tabu search (TS), and simulated annealing (SA) to determine the available transfer capability (ATC) of power transactions between different control areas in deregulated power systems. The optimal power flow (OPF)-based ATC determination is used to evaluate the feasible maximum ATC value within real and reactive power generation limits, line thermal limits, voltage limits, and voltage and angle stability limits. The HEA approach simultaneously searches for the real power generations (except at the slack bus) in a source area, the real power loads in a sink area, and the generation bus voltages to solve the OPF-based ATC problem. Test results on the modified IEEE 24-bus reliability test system (RTS) indicate that ATC determination by the HEA could enhance ATC far more than the EP, TS, hybrid TS/SA, and improved EP (IEP) algorithms, leading to an efficient utilization of the existing transmission system.

  20. Performance of Geno-Fuzzy Model on rainfall-runoff predictions in claypan watersheds

    USDA-ARS?s Scientific Manuscript database

    Fuzzy logic provides a relatively simple approach to simulate complex hydrological systems while accounting for the uncertainty of environmental variables. The objective of this study was to develop a fuzzy inference system (FIS) with genetic algorithm (GA) optimization for membership functions (MF...

  1. Pitch-Learning Algorithm For Speech Encoders

    NASA Technical Reports Server (NTRS)

    Bhaskar, B. R. Udaya

    1988-01-01

    Adaptive algorithm detects and corrects errors in sequence of estimates of pitch period of speech. Algorithm operates in conjunction with techniques used to estimate pitch period. Used in such parametric and hybrid speech coders as linear predictive coders and adaptive predictive coders.

  2. Developing a Shuffled Complex-Self Adaptive Hybrid Evolution (SC-SAHEL) Framework for Water Resources Management and Water-Energy System Optimization

    NASA Astrophysics Data System (ADS)

    Rahnamay Naeini, M.; Sadegh, M.; AghaKouchak, A.; Hsu, K. L.; Sorooshian, S.; Yang, T.

    2017-12-01

    Meta-heuristic optimization algorithms have gained a great deal of attention in a wide variety of fields. The simplicity and flexibility of these algorithms, along with their robustness, make them attractive tools for solving optimization problems. Different optimization methods, however, have algorithm-specific strengths and limitations. The performance of each individual algorithm obeys the "No-Free-Lunch" theorem, which means that no single algorithm can consistently outperform all others across all possible optimization problems. From the user's perspective, it is a tedious process to compare, validate, and select the best-performing algorithm for a specific problem or a set of test cases. In this study, we introduce a new hybrid optimization framework, entitled Shuffled Complex-Self Adaptive Hybrid EvoLution (SC-SAHEL), which combines the strengths of different evolutionary algorithms (EAs) in a parallel computing scheme and allows users to select the most suitable algorithm tailored to the problem at hand. The concept of SC-SAHEL is to execute different EAs as separate parallel search cores and let all participating EAs compete during the course of the search. The newly developed SC-SAHEL algorithm is designed to automatically select the best-performing algorithm for the given optimization problem. The algorithm is effective in finding the global optimum for several strenuous benchmark test functions, and computationally efficient compared to individual EAs. We benchmark the proposed SC-SAHEL algorithm over 29 conceptual test functions and two real-world case studies - one hydropower reservoir model and one hydrological model (SAC-SMA). Results show that the proposed framework outperforms individual EAs in an absolute majority of the test problems, and can provide results competitive with the fittest EA while giving more comprehensive information during the search. The proposed framework is also flexible for merging additional EAs, boundary-handling techniques, and sampling schemes, and has good potential to be used in optimal operation and management of water-energy systems.

  3. A New Multi-Spectral Threshold Normalized Difference Water Index MST-NDWI Water Extraction Method - A Case Study in Yanhe Watershed

    NASA Astrophysics Data System (ADS)

    Zhou, Y.; Zhao, H.; Hao, H.; Wang, C.

    2018-05-01

    Accurate remote sensing water extraction is one of the primary tasks of watershed ecological environment studies. The Yanhe water system has the typical characteristics of a small water volume and narrow river channels, which makes conventional water extraction methods such as the Normalized Difference Water Index (NDWI) difficult to apply. A new Multi-Spectral Threshold segmentation of the NDWI (MST-NDWI) water extraction method is proposed to achieve accurate water extraction in the Yanhe watershed. In the MST-NDWI method, the spectral characteristics of water bodies and typical backgrounds on the Landsat/TM images are evaluated in the Yanhe watershed. Multi-spectral thresholds (TM1, TM4, TM5) based on maximum likelihood are applied before NDWI water extraction to separate built-up land and small linear rivers. With the proposed method, a water map is extracted from 2010 Landsat/TM images of the Yanhe watershed in China. An accuracy assessment is conducted to compare the proposed method with conventional water indexes such as NDWI, Modified NDWI (MNDWI), the Enhanced Water Index (EWI), and the Automated Water Extraction Index (AWEI). The result shows that the MST-NDWI method achieves better water extraction accuracy in the Yanhe watershed and can effectively suppress confusing background objects compared to the conventional water indexes. The MST-NDWI method integrates NDWI and multi-spectral threshold segmentation algorithms, yielding richer valuable information and remarkable results in accurate water extraction in the Yanhe watershed.
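
    A hedged sketch of the MST-NDWI idea is given below: per-band thresholds screen out built-up land and other confusers before an NDWI threshold is applied. The band threshold values here are placeholders; in the paper they are derived from a maximum-likelihood analysis of Landsat/TM training pixels, and the synthetic arrays stand in for real reflectance bands.

        import numpy as np

        def mst_ndwi_water_mask(tm1, tm2, tm4, tm5,
                                t1=0.15, t4=0.20, t5=0.18, ndwi_thresh=0.0):
            """Multi-spectral pre-screening followed by NDWI thresholding.

            tm1, tm2, tm4, tm5 are reflectance arrays of the same shape
            (Landsat/TM blue, green, NIR, SWIR). Threshold values are placeholders.
            """
            # Step 1: per-band screening to suppress built-up land and bare soil
            candidate = (tm1 > t1) & (tm4 < t4) & (tm5 < t5)

            # Step 2: NDWI = (green - NIR) / (green + NIR), thresholded on candidate pixels only
            ndwi = (tm2 - tm4) / np.clip(tm2 + tm4, 1e-6, None)
            return candidate & (ndwi > ndwi_thresh)

        # usage with random reflectance arrays standing in for a Landsat/TM scene
        rng = np.random.default_rng(0)
        bands = {b: rng.uniform(0, 0.5, size=(100, 100)) for b in ("tm1", "tm2", "tm4", "tm5")}
        mask = mst_ndwi_water_mask(**bands)
        print(mask.sum(), "water pixels")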

  4. From community preferences to design: Investigation of human-centered optimization algorithms in web-based, democratic planning of watershed restoration

    NASA Astrophysics Data System (ADS)

    Babbar-Sebens, M.; Mukhopadhyay, S.

    2014-12-01

    Web 2.0 technologies are useful resources for reaching out to larger stakeholder communities and involve them in policy making and planning efforts. While these technologies have been used in the past to support education and communication endeavors, we have developed a novel, web-based, interactive planning tool that involves the community in using science-based methods for the design of potential runoff management strategies on their landscape. The tool, Watershed REstoration using Spatio-Temporal Optimization of Resources (WRESTORE), uses a democratic voting process coupled with visualization interfaces, computational simulation and optimization models, and user modeling techniques to support a human-centered design approach. The tool can be used to engage diverse watershed stakeholders and landowners via the internet, thereby improving opportunities for outreach and collaborations. Users are able to (a) design multiple types of conservation practices at their field-scale catchment and at the entire watershed scale, (b) examine impacts and limitations of their decisions on their neighboring catchments and on the entire watershed, (c) compare alternatives via a cost-benefit analysis, (d) vote on their "favorite" designs based on their preferences and constraints, and (e) propose their "favorite" alternatives to policy makers and other stakeholders. In this presentation, we will demonstrate the effectiveness of WRESTORE for designing alternatives of conservation practices to reduce peak flows in a Midwestern watershed, present results on multiple approaches for engaging with larger communities, and discuss potential for future developments.

  5. Synergistic use of active and passive microwave in soil moisture estimation

    NASA Technical Reports Server (NTRS)

    O'Neill, P.; Chauhan, N.; Jackson, T.; Saatchi, S.

    1992-01-01

    Data gathered during the MACHYDRO experiment in central Pennsylvania in July 1990 have been utilized to study the synergistic use of active and passive microwave systems for estimating soil moisture. These data sets were obtained during an eleven-day period with NASA's Airborne Synthetic Aperture Radar (AIRSAR) and Push-Broom Microwave Radiometer (PBMR) over an instrumented watershed which included agricultural fields with a number of different crop covers. Simultaneous ground truth measurements were also made in order to characterize the state of vegetation and soil moisture under a variety of meteorological conditions. A combination algorithm is presented as applied to a representative corn field in the MACHYDRO watershed.

  6. Hybrid wavefront sensing and image correction algorithm for imaging through turbulent media

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Robertson Rzasa, John; Ko, Jonathan; Davis, Christopher C.

    2017-09-01

    It is well known that passive image correction of turbulence distortions often involves geometry-dependent deconvolution algorithms. On the other hand, active imaging techniques using adaptive optic correction should use the distorted wavefront information for guidance. Our work shows that a hybrid hardware-software approach makes it possible to obtain accurate and highly detailed images through turbulent media. The processing algorithm also takes far fewer iteration steps in comparison with conventional image processing algorithms. In our proposed approach, a plenoptic sensor is used as a wavefront sensor to guide post-stage image correction on a high-definition zoomable camera. Conversely, we show that given the ground truth of the highly detailed image and the plenoptic imaging result, we can generate an accurate prediction of the blurred image on a traditional zoomable camera. Similarly, the ground truth combined with the blurred image from the zoomable camera would provide the wavefront conditions. In application, our hybrid approach can be used as an effective way to conduct object recognition in a turbulent environment where the target has been significantly distorted or is even unrecognizable.

  7. Hybrid stochastic simplifications for multiscale gene networks.

    PubMed

    Crudu, Alina; Debussche, Arnaud; Radulescu, Ovidiu

    2009-09-07

    Stochastic simulation of gene networks by Markov processes has important applications in molecular biology. The complexity of exact simulation algorithms scales with the number of discrete jumps to be performed. Approximate schemes reduce the computational time by reducing the number of simulated discrete events. Also, answering important questions about the relation between network topology and intrinsic noise generation and propagation should be based on general mathematical results. These general results are difficult to obtain for exact models. We propose a unified framework for hybrid simplifications of Markov models of multiscale stochastic gene networks dynamics. We discuss several possible hybrid simplifications, and provide algorithms to obtain them from pure jump processes. In hybrid simplifications, some components are discrete and evolve by jumps, while other components are continuous. Hybrid simplifications are obtained by partial Kramers-Moyal expansion [1-3] which is equivalent to the application of the central limit theorem to a sub-model. By averaging and variable aggregation we drastically reduce simulation time and eliminate non-critical reactions. Hybrid and averaged simplifications can be used for more effective simulation algorithms and for obtaining general design principles relating noise to topology and time scales. The simplified models reproduce with good accuracy the stochastic properties of the gene networks, including waiting times in intermittence phenomena, fluctuation amplitudes and stationary distributions. The methods are illustrated on several gene network examples. Hybrid simplifications can be used for onion-like (multi-layered) approaches to multi-scale biochemical systems, in which various descriptions are used at various scales. Sets of discrete and continuous variables are treated with different methods and are coupled together in a physically justified approach.

  8. Invasive hybridization in a threatened species is accelerated by climate change

    USGS Publications Warehouse

    Muhlfeld, Clint C.; Kovach, Ryan P.; Jones, Leslie A.; Al-Chokhachy, Robert K.; Boyer, Matthew C.; Leary, Robb F.; Lowe, Winsor H.; Luikart, Gordon; Allendorf, Fred W.

    2014-01-01

    Climate change will decrease worldwide biodiversity through a number of potential pathways, including invasive hybridization (cross-breeding between invasive and native species). How climate warming influences the spread of hybridization and loss of native genomes poses difficult ecological and evolutionary questions with little empirical information to guide conservation management decisions. Here we combine long-term genetic monitoring data with high-resolution climate and stream temperature predictions to evaluate how recent climate warming has influenced the spatio-temporal spread of human-mediated hybridization between threatened native westslope cutthroat trout (Oncorhynchus clarkii lewisi) and non-native rainbow trout (Oncorhynchus mykiss), the world’s most widely introduced invasive fish. Despite widespread release of millions of rainbow trout over the past century within the Flathead River system, a large relatively pristine watershed in western North America, historical samples revealed that hybridization was prevalent only in one (source) population. During a subsequent 30-year period of accelerated warming, hybridization spread rapidly and was strongly linked to interactions between climatic drivers—precipitation and temperature—and distance to the source population. Specifically, decreases in spring precipitation and increases in summer stream temperature probably promoted upstream expansion of hybridization throughout the system. This study shows that rapid climate warming can exacerbate interactions between native and non-native species through invasive hybridization, which could spell genomic extinction for many species.

  9. A Novel Grid SINS/DVL Integrated Navigation Algorithm for Marine Application

    PubMed Central

    Kang, Yingyao; Zhao, Lin; Cheng, Jianhua; Fan, Xiaoliang

    2018-01-01

    Integrated navigation algorithms under the grid frame have been proposed based on the Kalman filter (KF) to solve the problem of navigation in some special regions. However, in the existing study of grid strapdown inertial navigation system (SINS)/Doppler velocity log (DVL) integrated navigation algorithms, the Earth models of the filter dynamic model and the SINS mechanization are not unified. Besides, traditional integrated systems with the KF based correction scheme are susceptible to measurement errors, which would decrease the accuracy and robustness of the system. In this paper, an adaptive robust Kalman filter (ARKF) based hybrid-correction grid SINS/DVL integrated navigation algorithm is designed with the unified reference ellipsoid Earth model to improve the navigation accuracy in middle-high latitude regions for marine application. Firstly, to unify the Earth models, the mechanization of grid SINS is introduced and the error equations are derived based on the same reference ellipsoid Earth model. Then, a more accurate grid SINS/DVL filter model is designed according to the new error equations. Finally, a hybrid-correction scheme based on the ARKF is proposed to resist the effect of measurement errors. Simulation and experiment results show that, compared with the traditional algorithms, the proposed navigation algorithm can effectively improve the navigation performance in middle-high latitude regions by the unified Earth models and the ARKF based hybrid-correction scheme. PMID:29373549

  10. SNBRFinder: A Sequence-Based Hybrid Algorithm for Enhanced Prediction of Nucleic Acid-Binding Residues.

    PubMed

    Yang, Xiaoxia; Wang, Jia; Sun, Jun; Liu, Rong

    2015-01-01

    Protein-nucleic acid interactions are central to various fundamental biological processes. Automated methods capable of reliably identifying DNA- and RNA-binding residues in protein sequence are assuming ever-increasing importance. The majority of current algorithms rely on feature-based prediction, but their accuracy remains to be further improved. Here we propose a sequence-based hybrid algorithm SNBRFinder (Sequence-based Nucleic acid-Binding Residue Finder) by merging a feature predictor SNBRFinderF and a template predictor SNBRFinderT. SNBRFinderF was established using the support vector machine whose inputs include sequence profile and other complementary sequence descriptors, while SNBRFinderT was implemented with the sequence alignment algorithm based on profile hidden Markov models to capture the weakly homologous template of query sequence. Experimental results show that SNBRFinderF was clearly superior to the commonly used sequence profile-based predictor and SNBRFinderT can achieve comparable performance to the structure-based template methods. Leveraging the complementary relationship between these two predictors, SNBRFinder reasonably improved the performance of both DNA- and RNA-binding residue predictions. More importantly, the sequence-based hybrid prediction reached competitive performance relative to our previous structure-based counterpart. Our extensive and stringent comparisons show that SNBRFinder has obvious advantages over the existing sequence-based prediction algorithms. The value of our algorithm is highlighted by establishing an easy-to-use web server that is freely accessible at http://ibi.hzau.edu.cn/SNBRFinder.

  11. A New Approach for Solving the Generalized Traveling Salesman Problem

    NASA Astrophysics Data System (ADS)

    Pop, P. C.; Matei, O.; Sabo, C.

    The generalized traveling salesman problem (GTSP) is an extension of the classical traveling salesman problem. The GTSP is known to be an NP-hard problem and has many interesting applications. In this paper we present a local-global approach for the generalized traveling salesman problem. Based on this approach we describe a novel hybrid metaheuristic algorithm for solving the problem using genetic algorithms. Computational results are reported for Euclidean TSPLIB instances and compared with existing ones. The obtained results point out that our hybrid algorithm is an appropriate method to explore the search space of this complex problem and leads to good solutions in a reasonable amount of time.

  12. A process-based algorithm for simulating terraces in SWAT

    USDA-ARS?s Scientific Manuscript database

    Terraces in crop fields are one of the most important soil and water conservation measures that affect runoff and erosion processes in a watershed. In large hydrological programs such as the Soil and Water Assessment Tool (SWAT), terrace effects are simulated by adjusting the slope length and the US...

  13. An Ant Colony Optimization and Hybrid Metaheuristics Algorithm to Solve the Split Delivery Vehicle Routing Problem

    DTIC Science & Technology

    2015-01-01

    Rajappa, Gautham

  14. Identification of piecewise affine systems based on fuzzy PCA-guided robust clustering technique

    NASA Astrophysics Data System (ADS)

    Khanmirza, Esmaeel; Nazarahari, Milad; Mousavi, Alireza

    2016-12-01

    Hybrid systems are a class of dynamical systems whose behaviors are based on the interaction between discrete and continuous dynamical behaviors. Since a general method for the analysis of hybrid systems is not available, some researchers have focused on specific types of hybrid systems. Piecewise affine (PWA) systems are one subset of hybrid systems. The identification of PWA systems includes the estimation of the parameters of the affine subsystems and the coefficients of the hyperplanes defining the partition of the state-input domain. In this paper, we propose a PWA identification approach based on a modified clustering technique. By using a fuzzy PCA-guided robust k-means clustering algorithm along with neighborhood outlier detection, the two main drawbacks of the well-known clustering algorithms, i.e., poor initialization and the presence of outliers, are eliminated. Furthermore, this modified clustering technique enables us to determine the number of subsystems without any prior knowledge about the system. In addition, exploiting the structure of the state-input domain, that is, considering the time sequence of input-output pairs, provides a more efficient clustering algorithm, which is the other novelty of this work. Finally, the proposed algorithm has been evaluated through parameter identification of an IGV servo actuator. Simulations together with experimental analysis have proved the effectiveness of the proposed method.

  15. Hybrid genetic algorithm in the Hopfield network for maximum 2-satisfiability problem

    NASA Astrophysics Data System (ADS)

    Kasihmuddin, Mohd Shareduwan Mohd; Sathasivam, Saratha; Mansor, Mohd. Asyraf

    2017-08-01

    Heuristic methods are designed to find optimal solutions more quickly than classical methods, which are often too complex to apply. In this study, a hybrid approach that utilizes the Hopfield network and a genetic algorithm for the maximum 2-satisfiability problem (MAX-2SAT) is proposed. The Hopfield neural network is used to minimize logical inconsistency in interpretations of logic clauses or programs. The genetic algorithm (GA) pioneered the implementation of methods that exploit the idea of recombination to reproduce better solutions. Simulations with and without the genetic algorithm were carried out using Microsoft Visual C++ 2013 Express. The performance of both search techniques on MAX-2SAT was evaluated based on the global minima ratio, the ratio of satisfied clauses and the computation time. The results obtained from the computer simulation demonstrate the effectiveness and acceleration features of the genetic algorithm for MAX-2SAT in the Hopfield network.

  16. Remote-sensing image encryption in hybrid domains

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoqiang; Zhu, Guiliang; Ma, Shilong

    2012-04-01

    Remote-sensing technology plays an important role in military and industrial fields. Remote-sensing images are the main means of acquiring information from satellites and often contain confidential information. To securely transmit and store remote-sensing images, we propose a new image encryption algorithm in hybrid domains. This algorithm makes full use of the advantages of image encryption in both the spatial domain and the transform domain. First, the low-pass subband coefficients of the image's DWT (discrete wavelet transform) decomposition are sorted by a PWLCM system in the transform domain. Second, the image after IDWT (inverse discrete wavelet transform) reconstruction is diffused with a 2D (two-dimensional) Logistic map and XOR operation in the spatial domain. The experimental results and algorithm analyses show that the new algorithm possesses a large key space and can resist brute-force, statistical and differential attacks. Meanwhile, the proposed algorithm has the desirable encryption efficiency to satisfy requirements in practice.

  17. Evolutionary Algorithms Approach to the Solution of Damage Detection Problems

    NASA Astrophysics Data System (ADS)

    Salazar Pinto, Pedro Yoajim; Begambre, Oscar

    2010-09-01

    In this work, a new Self-Configured Hybrid Algorithm is proposed by combining Particle Swarm Optimization (PSO) and a Genetic Algorithm (GA). The aim of the proposed strategy is to increase the stability and accuracy of the search. The central idea is the concept of the Guide Particle: this particle (the best PSO global in each generation) transmits its information to a particle of the following PSO generation, which is controlled by the GA. Thus, the proposed hybrid has an elitism feature that improves its performance and guarantees the convergence of the procedure. In different tests carried out on benchmark functions reported in the international literature, better performance in stability and accuracy was observed; therefore, the new algorithm was used to identify damage in a simply supported beam using modal data. Finally, it is worth noting that the algorithm is independent of the initial definition of the heuristic parameters.

  18. The Ordered Clustered Travelling Salesman Problem: A Hybrid Genetic Algorithm

    PubMed Central

    Ahmed, Zakir Hussain

    2014-01-01

    The ordered clustered travelling salesman problem is a variation of the usual travelling salesman problem in which a set of vertices (except the starting vertex) of the network is divided into some prespecified clusters. The objective is to find the least cost Hamiltonian tour in which vertices of any cluster are visited contiguously and the clusters are visited in the prespecified order. The problem is NP-hard, and it arises in practical transportation and sequencing problems. This paper develops a hybrid genetic algorithm using sequential constructive crossover, 2-opt search, and a local search for obtaining a heuristic solution to the problem. The efficiency of the algorithm has been examined against two existing algorithms for some asymmetric and symmetric TSPLIB instances of various sizes. The computational results show that the proposed algorithm is very effective in terms of solution quality and computational time. Finally, we present solutions to some more symmetric TSPLIB instances. PMID:24701148
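
    Of the components listed above, the 2-opt search is the easiest to show compactly; the sketch below implements a plain 2-opt improvement pass on a tour. The sequential constructive crossover and the cluster-ordering constraints of the ordered clustered problem are not shown, and the random Euclidean instance is only a stand-in.

        import numpy as np

        def tour_length(tour, dist):
            return sum(dist[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

        def two_opt(tour, dist):
            """Repeatedly reverse tour segments while doing so shortens the tour."""
            tour = list(tour)
            improved = True
            while improved:
                improved = False
                for i in range(1, len(tour) - 2):
                    for j in range(i + 1, len(tour) - 1):
                        # gain from replacing edges (i-1, i) and (j, j+1) with (i-1, j) and (i, j+1)
                        delta = (dist[tour[i - 1], tour[j]] + dist[tour[i], tour[j + 1]]
                                 - dist[tour[i - 1], tour[i]] - dist[tour[j], tour[j + 1]])
                        if delta < -1e-12:
                            tour[i:j + 1] = reversed(tour[i:j + 1])
                            improved = True
            return tour

        # usage on random points (symmetric Euclidean distances)
        rng = np.random.default_rng(0)
        pts = rng.uniform(0, 1, size=(30, 2))
        dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
        tour = list(range(30))
        print(tour_length(tour, dist), tour_length(two_opt(tour, dist), dist))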

  19. Understanding controls of hydrologic processes across two headwater monolithological catchments using model-data synthesis

    NASA Astrophysics Data System (ADS)

    Xiao, D.; Shi, Y.; Hoagland, B.; Del Vecchio, J.; Russo, T. A.; DiBiase, R. A.; Li, L.

    2017-12-01

    How do watershed hydrologic processes differ in catchments derived from different lithologies? This study compares two first-order, deciduous forest watersheds in Pennsylvania: a sandstone watershed, Garner Run (GR, 1.34 km2), and a shale-derived watershed, Shale Hills (SH, 0.08 km2). Both watersheds are simulated using a combination of national datasets and field measurements, and a physics-based land surface hydrologic model, Flux-PIHM. We aim to evaluate the effects of lithology on watershed hydrology and assess whether we can simulate a new watershed without intensive measurements, i.e., directly use calibration information from one watershed (SH) to reproduce the hydrologic dynamics of another watershed (GR). Without any calibration, the model at GR based on national datasets and calibration information from SH cannot capture some discharge peaks or the baseflow during dry periods. The model prediction agrees well with the GR field discharge and soil moisture after calibrating the soil hydraulic parameters using the uncertainty-based Hornberger-Spear-Young algorithm and the Latin Hypercube Sampling method. In agreement with the field observations and national datasets, the difference in parameter values shows that the sandstone watershed has a larger average soil pore diameter, greater water storage created by porosity, lower water retention ability, and greater preferential flow. The water budget calculation shows that the riparian zone and the colluvial valley serve as buffer zones that store water at GR. Using the same procedure, we compared Flux-PIHM simulations with and without a field-measured surface boulder map at GR. When the boulder map is used, the prediction of areal averaged soil moisture is improved without performing extra calibration. When calibrated separately, the cases with and without the boulder map yield different calibration values, but their hydrologic predictions are similar, showing equifinality. The calibrated soil hydraulic parameter values in the case with the boulder map are more physically plausible than in the case without it. We switched the topography and soil properties between GR and SH, and the results indicate that the hydrologic processes are more sensitive to changes in domain topography than to changes in the soil properties.

  20. Hybrid Image Fusion for Sharpness Enhancement of Multi-Spectral Lunar Images

    NASA Astrophysics Data System (ADS)

    Awumah, Anna; Mahanti, Prasun; Robinson, Mark

    2016-10-01

    Image fusion enhances the sharpness of a multi-spectral (MS) image by incorporating spatial details from a higher-resolution panchromatic (Pan) image [1,2]. Known applications of image fusion to planetary images are rare, although image fusion is well known for its applications to Earth-based remote sensing. In a recent work [3], six different image fusion algorithms were implemented and their performances were verified with images from the Lunar Reconnaissance Orbiter Camera (LROC). The image fusion procedure obtained a high-resolution multi-spectral (HRMS) product from the LRO Narrow Angle Camera (used as Pan) and LRO Wide Angle Camera (used as MS) images. The results showed that the Intensity-Hue-Saturation (IHS) algorithm yields a product of high spatial quality, while the wavelet-based image fusion algorithm best preserves spectral quality among all the algorithms. In this work we show the results of a hybrid IHS-wavelet image fusion algorithm applied to LROC MS images. The hybrid method provides the best HRMS product, both in terms of spatial resolution and preservation of spectral details. Results from hybrid image fusion can enable new science and increase the science return from existing LROC images. [1] Pohl, C., and John L. Van Genderen. "Review article: Multisensor image fusion in remote sensing: concepts, methods and applications." International Journal of Remote Sensing 19.5 (1998): 823-854. [2] Zhang, Yun. "Understanding image fusion." Photogramm. Eng. Remote Sens. 70.6 (2004): 657-661. [3] Mahanti, Prasun, et al. "Enhancement of spatial resolution of the LROC Wide Angle Camera images." ISPRS Archives, XXIII ISPRS Congress (2016).
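
    The sketch below shows the substitution step of a basic IHS-style fusion: compute an intensity component from the resampled MS bands and inject the Pan-minus-intensity detail into each band. The wavelet stage of the hybrid method and any LROC-specific preprocessing are omitted; the images and band weights are synthetic stand-ins.

        import numpy as np

        def ihs_pansharpen(ms, pan, weights=(1 / 3, 1 / 3, 1 / 3)):
            """Simple IHS-style fusion: add the Pan-minus-intensity detail to each MS band.

            ms  : (H, W, 3) multi-spectral image already resampled to the Pan grid
            pan : (H, W) panchromatic image
            """
            intensity = np.tensordot(ms, np.asarray(weights), axes=([2], [0]))
            detail = pan - intensity                  # high-frequency spatial detail
            return np.clip(ms + detail[..., None], 0.0, 1.0)

        # usage with synthetic images standing in for WAC (MS) and NAC (Pan) data
        rng = np.random.default_rng(0)
        ms = rng.uniform(0, 1, size=(64, 64, 3))
        pan = np.clip(ms.mean(axis=2) + 0.05 * rng.standard_normal((64, 64)), 0, 1)
        fused = ihs_pansharpen(ms, pan)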

  1. SU-F-J-88: Comparison of Two Deformable Image Registration Algorithms for CT-To-CT Contour Propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopal, A; Xu, H; Chen, S

    Purpose: To compare the contour propagation accuracy of two deformable image registration (DIR) algorithms in the Raystation treatment planning system – the “Hybrid” algorithm based on image intensities and anatomical information; and the “Biomechanical” algorithm based on linear anatomical elasticity and finite element modeling. Methods: Both DIR algorithms were used for CT-to-CT deformation for 20 lung radiation therapy patients that underwent treatment plan revisions. Deformation accuracy was evaluated using landmark tracking to measure the target registration error (TRE) and inverse consistency error (ICE). The deformed contours were also evaluated against physician drawn contours using Dice similarity coefficients (DSC). Contour propagation was qualitatively assessed using a visual quality score assigned by physicians, and a refinement quality score (0 0.9 for lungs, > 0.85 for heart, > 0.8 for liver) and similar qualitative assessments (VQS < 0.35, RQS > 0.75 for lungs). When anatomical structures were used to control the deformation, the DSC improved more significantly for the biomechanical DIR compared to the hybrid DIR, while the VQS and RQS improved only for the controlling structures. However, while the inclusion of controlling structures improved the TRE for the hybrid DIR, it increased the TRE for the biomechanical DIR. Conclusion: The hybrid DIR was found to perform slightly better than the biomechanical DIR based on lower TRE while the DSC, VQS, and RQS studies yielded comparable results for both. The use of controlling structures showed considerable improvement in the hybrid DIR results and is recommended for clinical use in contour propagation.
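
    The deformed contours are scored against physician-drawn contours with the Dice similarity coefficient. A minimal sketch of that overlap measure on boolean masks follows; the rectangular masks are synthetic stand-ins for contours.

      # Dice similarity coefficient between two binary masks: 2|A∩B| / (|A| + |B|).
      import numpy as np

      def dice(mask_a, mask_b):
          intersection = np.logical_and(mask_a, mask_b).sum()
          return 2.0 * intersection / (mask_a.sum() + mask_b.sum())

      a = np.zeros((64, 64), dtype=bool); a[10:40, 10:40] = True   # propagated contour
      b = np.zeros((64, 64), dtype=bool); b[15:45, 12:42] = True   # physician contour
      print("DSC:", round(dice(a, b), 3))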

  2. Hybrid optimization and Bayesian inference techniques for a non-smooth radiation detection problem

    DOE PAGES

    Stefanescu, Razvan; Schmidt, Kathleen; Hite, Jason; ...

    2016-12-12

    In this paper, we propose several algorithms to recover the location and intensity of a radiation source located in a simulated 250 × 180 m block of an urban center based on synthetic measurements. Radioactive decay and detection are Poisson random processes, so we employ likelihood functions based on this distribution. Owing to the domain geometry and the proposed response model, the negative logarithm of the likelihood is only piecewise continuously differentiable, and it has multiple local minima. To address these difficulties, we investigate three hybrid algorithms composed of mixed optimization techniques. For global optimization, we consider simulated annealing, particle swarm, and genetic algorithm, which rely solely on objective function evaluations; that is, they do not evaluate the gradient of the objective function. By employing early stopping criteria for the global optimization methods, a pseudo-optimum point is obtained. This is subsequently utilized as the initial value by the deterministic implicit filtering method, which is able to find local extrema in non-smooth functions, to finish the search in a narrow domain. These new hybrid techniques, combining global optimization and implicit filtering, address difficulties associated with the non-smooth response, and their performances are shown to significantly decrease the computational time over the global optimization methods. To quantify uncertainties associated with the source location and intensity, we employ the delayed rejection adaptive Metropolis and DiffeRential Evolution Adaptive Metropolis algorithms. Finally, marginal densities of the source properties are obtained, and the means of the chains compare accurately with the estimates produced by the hybrid algorithms.
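
    The two-stage idea can be sketched as a derivative-free global search over a Poisson negative log-likelihood followed by a local refinement. In the sketch below, the inverse-square response model and detector layout are illustrative, and scipy's dual_annealing and Nelder-Mead stand in for the paper's global optimizers and implicit filtering, respectively.

      # Hedged sketch: global search then local polish on a Poisson negative
      # log-likelihood for source location (x, y) and intensity.
      import numpy as np
      from scipy.optimize import dual_annealing, minimize

      detectors = np.array([[0.0, 0.0], [250.0, 0.0], [0.0, 180.0], [250.0, 180.0]])
      counts = np.array([120, 45, 60, 30])            # synthetic measured counts

      def expected_counts(x, y, intensity):
          d2 = np.sum((detectors - np.array([x, y])) ** 2, axis=1) + 1.0
          return intensity / d2 + 1e-9                # simple attenuation-free model

      def neg_log_likelihood(theta):
          lam = expected_counts(*theta)
          return np.sum(lam - counts * np.log(lam))   # Poisson NLL up to a constant

      bounds = [(0.0, 250.0), (0.0, 180.0), (1e2, 1e7)]
      coarse = dual_annealing(neg_log_likelihood, bounds, maxiter=200, seed=0)
      refined = minimize(neg_log_likelihood, coarse.x, method="Nelder-Mead")
      print("estimated (x, y, intensity):", refined.x)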

  3. Automatic generation of smart earthquake-resistant building system: Hybrid system of base-isolation and building-connection.

    PubMed

    Kasagi, M; Fujita, K; Tsuji, M; Takewaki, I

    2016-02-01

    A base-isolated building may sometimes exhibit an undesirable large response to a long-duration, long-period earthquake ground motion, and a connected building system without base-isolation may show a large response to a near-fault (rather high-frequency) earthquake ground motion. To overcome both deficiencies, a new hybrid control system of base-isolation and building-connection is proposed and investigated. In this new hybrid building system, a base-isolated building is connected to a stiffer free wall with oil dampers. It has been demonstrated in a preliminary study that the proposed hybrid system is effective both for near-fault (rather high-frequency) and long-duration, long-period earthquake ground motions and has sufficient redundancy and robustness for a broad range of earthquake ground motions. An automatic generation algorithm for this kind of smart base-isolation and building-connection hybrid system is presented in this paper. It is shown that, while the proposed algorithm does not work well in a building without the connecting-damper system, it works well in the proposed smart hybrid system with the connecting-damper system.

  4. Optimal Bi-Objective Redundancy Allocation for Systems Reliability and Risk Management.

    PubMed

    Govindan, Kannan; Jafarian, Ahmad; Azbari, Mostafa E; Choi, Tsan-Ming

    2016-08-01

    In the big data era, systems reliability is critical to effective systems risk management. In this paper, a novel multiobjective approach, hybridizing the well-known NSGA-II algorithm with an adaptive population-based simulated annealing (APBSA) method, is developed to solve systems reliability optimization problems. In the first step, to create a good algorithm, we use a coevolutionary strategy. Since the proposed algorithm is very sensitive to parameter values, the response surface method is employed to estimate the appropriate parameters of the algorithm. Moreover, to examine the performance of our proposed approach, several test problems are generated, and the proposed hybrid algorithm and other commonly known approaches (i.e., MOGA, NRGA, and NSGA-II) are compared with respect to four performance measures: 1) mean ideal distance; 2) diversification metric; 3) percentage of domination; and 4) data envelopment analysis. The computational studies have shown that the proposed algorithm is an effective approach for systems reliability and risk management.

  5. An Effective Hybrid Cuckoo Search Algorithm with Improved Shuffled Frog Leaping Algorithm for 0-1 Knapsack Problems

    PubMed Central

    Wang, Gai-Ge; Feng, Qingjiang; Zhao, Xiang-Jun

    2014-01-01

    An effective hybrid cuckoo search (CS) algorithm with an improved shuffled frog-leaping algorithm (ISFLA) is put forward for solving the 0-1 knapsack problem. First, within the framework of the SFLA, an improved frog-leap operator is designed that combines the influence of the global best on the frog leaping, information exchange between frog individuals, and genetic mutation applied with a small probability. Subsequently, in order to improve the convergence speed and enhance the exploitation ability, a novel CS model is proposed that exploits the specific advantages of Lévy flights and the frog-leap operator. Furthermore, the greedy transform method is used to repair infeasible solutions and optimize feasible solutions. Finally, numerical simulations are carried out on six different types of 0-1 knapsack instances, and the comparative results show the effectiveness of the proposed algorithm and its ability to achieve good quality solutions, outperforming the binary cuckoo search, the binary differential evolution, and the genetic algorithm. PMID:25404940
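
    A greedy repair-and-fill operator of the kind used to fix infeasible knapsack bit-strings can be sketched in a few lines. The value/weight data below are illustrative, and the drop/fill order by value-to-weight ratio is one common choice rather than the paper's exact greedy transform.

      # Greedy repair for 0-1 knapsack: drop worst-ratio items until feasible,
      # then greedily add best-ratio items that still fit.
      import numpy as np

      values = np.array([60, 100, 120, 30, 70])
      weights = np.array([10, 20, 30, 5, 15])
      capacity = 50

      def greedy_repair(x):
          x = x.copy()
          ratio_order = np.argsort(values / weights)          # worst ratio first
          for i in ratio_order:                               # drop phase
              if weights @ x <= capacity:
                  break
              x[i] = 0
          for i in ratio_order[::-1]:                         # fill phase
              if x[i] == 0 and weights @ x + weights[i] <= capacity:
                  x[i] = 1
          return x

      candidate = np.array([1, 1, 1, 1, 1])                   # infeasible bit-string
      repaired = greedy_repair(candidate)
      print(repaired, "value:", values @ repaired)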

  6. Accelerating k-NN Algorithm with Hybrid MPI and OpenSHMEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Jian; Hamidouche, Khaled; Zheng, Jie

    2015-08-05

    Machine Learning algorithms are benefiting from the continuous improvement of programming models, including MPI, MapReduce and PGAS. The k-Nearest Neighbors (k-NN) algorithm is a widely used machine learning algorithm, applied to supervised learning tasks such as classification. Several parallel implementations of k-NN have been proposed in the literature and in practice. However, on high-performance computing systems with high-speed interconnects, it is important to further accelerate existing designs of the k-NN algorithm by taking advantage of scalable programming models. To improve the performance of k-NN in large-scale environments with InfiniBand networks, this paper proposes several alternative hybrid MPI+OpenSHMEM designs and performs a systematic evaluation and analysis on typical workloads. The hybrid designs leverage one-sided memory access to better overlap communication with computation than the existing pure MPI design, and propose better schemes for efficient buffer management. The implementation based on the k-NN program from MaTEx with MVAPICH2-X (Unified MPI+PGAS Communication Runtime over InfiniBand) shows up to 9.0% time reduction for training the KDD Cup 2010 workload over 512 cores, and 27.6% time reduction for a small workload with balanced communication and computation. Experiments with a varied number of cores show that our design can maintain good scalability.

  7. A novel approach for dimension reduction of microarray.

    PubMed

    Aziz, Rabia; Verma, C K; Srivastava, Namita

    2017-12-01

    This paper proposes a new hybrid search technique for feature (gene) selection (FS) using Independent Component Analysis (ICA) and the Artificial Bee Colony (ABC) algorithm, called ICA+ABC, to select informative genes based on a Naïve Bayes (NB) classifier. An important trait of this technique is the optimization of the ICA feature vector using ABC. ICA+ABC is a hybrid search algorithm that combines the benefits of an extraction approach, to reduce the size of the data, and a wrapper approach, to optimize the reduced feature vectors. The hybrid search technique is evaluated on six standard gene expression classification datasets. Extensive experiments were conducted to compare the performance of ICA+ABC with the results obtained from the recently published Minimum Redundancy Maximum Relevance (mRMR)+ABC algorithm for the NB classifier. Also, to assess how ICA+ABC performs as a feature selection method with the NB classifier, the combination of ICA with popular filter techniques and with other similar bio-inspired algorithms, such as the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), was compared. The results show that ICA+ABC has a significant ability to generate small subsets of genes from the ICA feature vector that markedly improve the classification accuracy of the NB classifier compared to other previously suggested methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
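
    The extraction-plus-classification core of such a pipeline can be sketched with scikit-learn: FastICA reduces the expression matrix and a naive Bayes classifier is scored on the reduced features. The ABC wrapper search over the ICA components is omitted, and the data below are synthetic stand-ins for microarray profiles.

      # ICA feature extraction followed by naive Bayes classification (no ABC wrapper).
      import numpy as np
      from sklearn.decomposition import FastICA
      from sklearn.naive_bayes import GaussianNB
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 2000))            # 100 samples x 2000 genes (synthetic)
      y = rng.integers(0, 2, size=100)            # binary class labels (synthetic)

      ica = FastICA(n_components=20, random_state=0)
      X_reduced = ica.fit_transform(X)            # independent-component feature vector

      scores = cross_val_score(GaussianNB(), X_reduced, y, cv=5)
      print("mean CV accuracy on ICA features:", scores.mean())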

  8. Localization Algorithms of Underwater Wireless Sensor Networks: A Survey

    PubMed Central

    Han, Guangjie; Jiang, Jinfang; Shu, Lei; Xu, Yongjun; Wang, Feng

    2012-01-01

    In Underwater Wireless Sensor Networks (UWSNs), localization is one of the most important technologies, since it plays a critical role in many applications. Motivated by the widespread adoption of localization, in this paper we present a comprehensive survey of localization algorithms. First, we classify localization algorithms into three categories based on sensor nodes’ mobility: stationary localization algorithms, mobile localization algorithms, and hybrid localization algorithms. Moreover, we compare the localization algorithms in detail and analyze future research directions for localization algorithms in UWSNs. PMID:22438752

  9. Hybrid Resource Allocation Scheme with Proportional Fairness in OFDMA-Based Cognitive Radio Systems

    NASA Astrophysics Data System (ADS)

    Li, Li; Xu, Changqing; Fan, Pingzhi; He, Jian

    In this paper, the resource allocation problem for proportional fairness in hybrid Cognitive Radio (CR) systems is studied. In OFDMA-based CR systems, traditional resource allocation algorithms cannot guarantee proportional rates among CR users (CRUs) in each OFDM symbol because the number of available subchannels might be smaller than the number of CRUs in some OFDM symbols. To deal with this time-varying nature of the available spectrum resource, a hybrid CR scheme in which CRUs are allowed to use subchannels in both spectrum holes and primary user (PU) bands is adopted, and a resource allocation algorithm is proposed to guarantee proportional rates among CRUs with no undue interference to PUs.

  10. Hybrid stochastic simulation of reaction-diffusion systems with slow and fast dynamics.

    PubMed

    Strehl, Robert; Ilie, Silvana

    2015-12-21

    In this paper, we present a novel hybrid method to simulate discrete stochastic reaction-diffusion models arising in biochemical signaling pathways. We study moderately stiff systems, for which we can partition each reaction or diffusion channel into either a slow or fast subset, based on its propensity. Numerical approaches missing this distinction are often limited with respect to computational run time or approximation quality. We design an approximate scheme that remedies these pitfalls by using a new blending strategy of the well-established inhomogeneous stochastic simulation algorithm and the tau-leaping simulation method. The advantages of our hybrid simulation algorithm are demonstrated on three benchmarking systems, with special focus on approximation accuracy and efficiency.
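
    The exact stochastic simulation algorithm (SSA) is the component that such a hybrid scheme blends with tau-leaping for the slow channels. A minimal sketch of the Gillespie SSA on a simple birth-death system follows; the reaction network and rate constants are illustrative, and the spatial/diffusion part is omitted.

      # Exact Gillespie SSA for a birth-death process (the "slow" exact component).
      import numpy as np

      def ssa_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=50.0, seed=0):
          rng = np.random.default_rng(seed)
          t, x = 0.0, x0
          times, states = [t], [x]
          while t < t_end:
              a = np.array([k_birth, k_death * x])      # reaction propensities
              a0 = a.sum()
              if a0 == 0.0:
                  break
              t += rng.exponential(1.0 / a0)            # time to next reaction
              if rng.random() * a0 < a[0]:
                  x += 1                                # birth event
              else:
                  x -= 1                                # death event
              times.append(t)
              states.append(x)
          return np.array(times), np.array(states)

      times, states = ssa_birth_death()
      print("final copy number:", states[-1])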

  11. Localization for robotic capsule looped by axially magnetized permanent-magnet ring based on hybrid strategy.

    PubMed

    Yang, Wanan; Li, Yan; Qin, Fengqing

    2015-01-01

    To actively maneuver a robotic capsule for interactive diagnosis in the gastrointestinal tract, it is essential to visualize the accurate position and orientation of the capsule as it moves through the gastrointestinal tract. A method is proposed that encloses the circuits, batteries, imaging device, etc. in a capsule looped by an axially magnetized permanent-magnet ring. Based on the expression for the axially magnetized permanent-magnet ring's magnetic field, a localization and orientation model was established. An improved hybrid strategy that combines the advantages of particle-swarm optimization, the clone algorithm, and the Levenberg-Marquardt algorithm was developed to solve the model. Experiments showed that the hybrid strategy has good accuracy, convergence, and real-time performance.

  12. A hybrid linear/nonlinear training algorithm for feedforward neural networks.

    PubMed

    McLoone, S; Brown, M D; Irwin, G; Lightbody, A

    1998-01-01

    This paper presents a new hybrid optimization strategy for training feedforward neural networks. The algorithm combines gradient-based optimization of nonlinear weights with singular value decomposition (SVD) computation of linear weights in one integrated routine. It is described for the multilayer perceptron (MLP) and radial basis function (RBF) networks and then extended to the local model network (LMN), a new feedforward structure in which a global nonlinear model is constructed from a set of locally valid submodels. Simulation results are presented demonstrating the superiority of the new hybrid training scheme compared to second-order gradient methods. It is particularly effective for the LMN architecture where the linear to nonlinear parameter ratio is large.
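
    The core of the hybrid linear/nonlinear idea for an RBF-type network is that, for fixed nonlinear parameters (centres, widths), the linear output weights have a closed-form SVD/least-squares solution; the full routine alternates this solve with gradient updates of the nonlinear parameters. A minimal sketch of the linear solve is given below, with an illustrative dataset and network size; the gradient step on the centres is omitted.

      # Least-squares (SVD-based) solve of RBF output weights for fixed centres/width.
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.uniform(-3, 3, size=(200, 1))
      y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)

      centres = np.linspace(-3, 3, 10)            # nonlinear parameters (held fixed here)
      width = 0.8

      def design_matrix(X, centres, width):
          return np.exp(-((X - centres) ** 2) / (2 * width ** 2))

      Phi = design_matrix(X, centres, width)              # (200, 10) hidden activations
      w, *_ = np.linalg.lstsq(Phi, y, rcond=None)         # SVD-based linear solve

      y_hat = Phi @ w
      print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))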

  13. Improved classification accuracy by feature extraction using genetic algorithms

    NASA Astrophysics Data System (ADS)

    Patriarche, Julia; Manduca, Armando; Erickson, Bradley J.

    2003-05-01

    A feature extraction algorithm has been developed for the purposes of improving classification accuracy. The algorithm uses a genetic algorithm / hill-climber hybrid to generate a set of linearly recombined features, which may be of reduced dimensionality compared with the original set. The genetic algorithm performs the global exploration, and a hill climber explores local neighborhoods. Hybridizing the genetic algorithm with a hill climber improves both the rate of convergence, and the final overall cost function value; it also reduces the sensitivity of the genetic algorithm to parameter selection. The genetic algorithm includes the operators: crossover, mutation, and deletion / reactivation - the last of these effects dimensionality reduction. The feature extractor is supervised, and is capable of deriving a separate feature space for each tissue (which are reintegrated during classification). A non-anatomical digital phantom was developed as a gold standard for testing purposes. In tests with the phantom, and with images of multiple sclerosis patients, classification with feature extractor derived features yielded lower error rates than using standard pulse sequences, and with features derived using principal components analysis. Using the multiple sclerosis patient data, the algorithm resulted in a mean 31% reduction in classification error of pure tissues.

  14. Optimal Control of Hybrid Systems in Air Traffic Applications

    NASA Astrophysics Data System (ADS)

    Kamgarpour, Maryam

    Growing concerns over the scalability of air traffic operations, air transportation fuel emissions and prices, as well as the advent of communication and sensing technologies motivate improvements to the air traffic management system. To address such improvements, in this thesis a hybrid dynamical model as an abstraction of the air traffic system is considered. Wind and hazardous weather impacts are included using a stochastic model. This thesis focuses on the design of algorithms for verification and control of hybrid and stochastic dynamical systems and the application of these algorithms to air traffic management problems. In the deterministic setting, a numerically efficient algorithm for optimal control of hybrid systems is proposed based on extensions of classical optimal control techniques. This algorithm is applied to optimize the trajectory of an Airbus 320 aircraft in the presence of wind and storms. In the stochastic setting, the verification problem of reaching a target set while avoiding obstacles (reach-avoid) is formulated as a two-player game to account for external agents' influence on system dynamics. The solution approach is applied to air traffic conflict prediction in the presence of stochastic wind. Due to the uncertainty in forecasts of the hazardous weather, and hence the unsafe regions of airspace for aircraft flight, the reach-avoid framework is extended to account for stochastic target and safe sets. This methodology is used to maximize the probability of the safety of aircraft paths through hazardous weather. Finally, the problem of modeling and optimization of arrival air traffic and runway configuration in dense airspace subject to stochastic weather data is addressed. This problem is formulated as a hybrid optimal control problem and is solved with a hierarchical approach that decouples safety and performance. As illustrated with this problem, the large scale of air traffic operations motivates future work on the efficient implementation of the proposed algorithms.

  15. Exploring the correlation between annual precipitation and potential evaporation

    NASA Astrophysics Data System (ADS)

    Chen, X.; Buchberger, S. G.

    2017-12-01

    The interdependence between precipitation and potential evaporation is closely related to the classic Budyko framework. In this study, a systematic investigation of the correlation between precipitation and potential evaporation at the annual time step is conducted at both the point scale and the watershed scale. The point-scale precipitation and potential evaporation data over the period 1984-2015 are collected from 259 weather stations across the United States. The watershed-scale precipitation data of 203 watersheds across the United States are obtained from the Model Parameter Estimation Experiment (MOPEX) dataset from 1983 to 2002, and the potential evaporation data of these 203 watersheds in the same period are obtained from a remote-sensing algorithm. The results show that the majority of the weather stations (77%) and watersheds (79%) exhibit a statistically significant negative correlation between annual precipitation and annual potential evaporation. The aggregated data cloud of precipitation versus potential evaporation follows a curve based on the combination of the Budyko-type equation and Bouchet's complementary relationship. Our result suggests that annual precipitation and potential evaporation are not independent when both Budyko's hypothesis and Bouchet's hypothesis are valid. Furthermore, we find that the wet surface evaporation, which is controlled primarily by short-wave radiation as defined in Bouchet's hypothesis, exhibits less dependence on precipitation than the potential evaporation. As a result, we suggest that wet surface evaporation is a better representation of energy supply than potential evaporation in the Budyko framework.

  16. Identification of drought in Dhalai river watershed using MCDM and ANN models

    NASA Astrophysics Data System (ADS)

    Aher, Sainath; Shinde, Sambhaji; Guha, Shantamoy; Majumder, Mrinmoy

    2017-03-01

    An innovative approach for drought identification is developed using Multi-Criteria Decision Making (MCDM) and Artificial Neural Network (ANN) models from surveyed drought parameter data around the Dhalai river watershed in the Tripura hinterlands, India. A total of eight drought parameters, i.e., precipitation, soil moisture, evapotranspiration, vegetation canopy, cropping pattern, temperature, cultivated land, and groundwater level, were obtained from expert, literature and cultivator surveys. Then, the Analytic Hierarchy Process (AHP) and Analytic Network Process (ANP) were used for weighting the parameters and for Drought Index Identification (DII). Field data of the weighted parameters in the meso-scale Dhalai River watershed were collected and used to train the ANN model. The developed ANN model was used in the same watershed for identification of drought. Results indicate that the Limited-Memory Quasi-Newton algorithm was better than the commonly used training method. Results obtained from the ANN model show that the drought index developed for the study area ranges from 0.32 to 0.72. Overall analysis revealed that, with appropriate training, the ANN model can be used in the areas where the model is calibrated, or in other areas where the range of input parameters is similar to that of the calibrated region, for drought identification.
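
    The AHP weighting step reduces to taking priority weights from the principal eigenvector of a pairwise-comparison matrix. A minimal sketch follows; the 3×3 matrix is an illustrative example, not the study's expert judgements.

      # AHP priority weights from a reciprocal pairwise-comparison matrix.
      import numpy as np

      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])              # reciprocal pairwise comparisons

      eigvals, eigvecs = np.linalg.eig(A)
      principal = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, principal].real)
      weights /= weights.sum()                     # normalised priority weights

      # consistency index CI = (lambda_max - n) / (n - 1)
      consistency_index = (eigvals.real[principal] - len(A)) / (len(A) - 1)
      print("weights:", weights, "CI:", consistency_index)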

  17. Strategies to reduce the complexity of hydrologic data assimilation for high-dimensional models

    NASA Astrophysics Data System (ADS)

    Hernandez, F.; Liang, X.

    2017-12-01

    Probabilistic forecasts in the geosciences offer invaluable information by allowing to estimate the uncertainty of predicted conditions (including threats like floods and droughts). However, while forecast systems based on modern data assimilation algorithms are capable of producing multi-variate probability distributions of future conditions, the computational resources required to fully characterize the dependencies between the model's state variables render their applicability impractical for high-resolution cases. This occurs because of the quadratic space complexity of storing the covariance matrices that encode these dependencies and the cubic time complexity of performing inference operations with them. In this work we introduce two complementary strategies to reduce the size of the covariance matrices that are at the heart of Bayesian assimilation methods—like some variants of (ensemble) Kalman filters and of particle filters—and variational methods. The first strategy involves the optimized grouping of state variables by clustering individual cells of the model into "super-cells." A dynamic fuzzy clustering approach is used to take into account the states (e.g., soil moisture) and forcings (e.g., precipitation) of each cell at each time step. The second strategy consists in finding a compressed representation of the covariance matrix that still encodes the most relevant information but that can be more efficiently stored and processed. A learning and a belief-propagation inference algorithm are developed to take advantage of this modified low-rank representation. The two proposed strategies are incorporated into OPTIMISTS, a state-of-the-art hybrid Bayesian/variational data assimilation algorithm, and comparative streamflow forecasting tests are performed using two watersheds modeled with the Distributed Hydrology Soil Vegetation Model (DHSVM). Contrasts are made between the efficiency gains and forecast accuracy losses of each strategy used in isolation, and of those achieved through their coupling. We expect these developments to help catalyze improvements in the predictive accuracy of large-scale forecasting operations by lowering the costs of deploying advanced data assimilation techniques.
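
    The second strategy (compressing the covariance while keeping its most relevant information) can be illustrated with a truncated SVD of the ensemble anomalies, so the full state-by-state matrix is never formed. Ensemble size, state dimension, and rank below are illustrative assumptions, not values from OPTIMISTS or DHSVM.

      # Low-rank factorization of an ensemble-estimated covariance: P ≈ L @ L.T.
      import numpy as np

      rng = np.random.default_rng(0)
      n_state, n_members, rank = 500, 40, 10
      ensemble = rng.normal(size=(n_state, n_members))

      anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
      U, s, _ = np.linalg.svd(anomalies, full_matrices=False)

      U_r, s_r = U[:, :rank], s[:rank]
      L = U_r * (s_r / np.sqrt(n_members - 1))     # compressed factor, never the full matrix

      def apply_cov(v):
          """Apply the compressed covariance to a vector in O(n_state * rank)."""
          return L @ (L.T @ v)

      print("storage: full =", n_state * n_state, "values, compressed =", L.size)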

  18. Method for hyperspectral imagery exploitation and pixel spectral unmixing

    NASA Technical Reports Server (NTRS)

    Lin, Ching-Fang (Inventor)

    2003-01-01

    An efficient hybrid approach to exploit hyperspectral imagery and unmix spectral pixels. This hybrid approach uses a genetic algorithm to solve the abundance vector for the first pixel of a hyperspectral image cube. This abundance vector is used as the initial state in a robust filter to derive the abundance estimate for the next pixel. By using a Kalman filter, the abundance estimate for a pixel can be obtained in a one-iteration procedure, which is much faster than the genetic algorithm. The output of the robust filter is fed to the genetic algorithm again to derive an accurate abundance estimate for the current pixel. Using the robust filter solution as the starting point of the genetic algorithm speeds up the evolution of the genetic algorithm. After obtaining the accurate abundance estimate, the procedure moves to the next pixel and uses the output of the genetic algorithm as the previous state estimate to derive the abundance estimate for this pixel using the robust filter. The genetic algorithm is then used again to derive an accurate abundance estimate efficiently based on the robust filter solution. This iteration continues until the pixels in the hyperspectral image cube are exhausted.

  19. Hybrid services efficient provisioning over the network coding-enabled elastic optical networks

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Gu, Rentao; Ji, Yuefeng; Kavehrad, Mohsen

    2017-03-01

    As a variety of services have emerged, hybrid services have become more common in real optical networks. Although elastic spectrum resource optimization over elastic optical networks (EONs) has been widely investigated, little research has been carried out on routing and spectrum allocation (RSA) for hybrid services, especially over the network coding-enabled EON. We investigated the RSA for the unicast service and the network coding-based multicast service over the network coding-enabled EON with the constraints of time delay and transmission distance. To address this issue, a mathematical model was built to minimize the total spectrum consumption for the hybrid services over the network coding-enabled EON under the constraints of time delay and transmission distance. The model guarantees different routing constraints for different types of services. The intermediate nodes over the network coding-enabled EON are assumed to be capable of encoding the flows of different kinds of information. We proposed an efficient heuristic algorithm, the network coding-based adaptive routing and layered graph-based spectrum allocation algorithm (NCAR-LGSA). From the simulation results, NCAR-LGSA shows highly efficient performance in terms of spectrum resource utilization under different network scenarios compared with the benchmark algorithms.

  20. Superpixel-based segmentation of glottal area from videolaryngoscopy images

    NASA Astrophysics Data System (ADS)

    Turkmen, H. Irem; Albayrak, Abdulkadir; Karsligil, M. Elif; Kocak, Ismail

    2017-11-01

    Segmentation of the glottal area with high accuracy is one of the major challenges for the development of systems for computer-aided diagnosis of vocal-fold disorders. We propose a hybrid model combining conventional methods with a superpixel-based segmentation approach. We first employed a superpixel algorithm to reveal the glottal area by eliminating the local variances of pixels caused by bleedings, blood vessels, and light reflections from the mucosa. Then, the glottal area was detected by exploiting a seeded region-growing algorithm in a fully automatic manner. The experiments were conducted on videolaryngoscopy images obtained from both patients having pathologic vocal folds and healthy subjects. Finally, the proposed hybrid approach was compared with conventional region-growing and active-contour model-based glottal area segmentation algorithms. The performance of the proposed method was evaluated in terms of segmentation accuracy and elapsed time. The F-measure, true negative rate, and Dice coefficients of the hybrid method were calculated as 82%, 93%, and 82%, respectively, which are superior to state-of-the-art glottal-area segmentation methods. The proposed hybrid model achieved high success rates and robustness, making it suitable for developing a computer-aided diagnosis system that can be used in clinical routines.
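
    The two-stage idea (superpixels to suppress local pixel variance, then seeded region growing) can be sketched with scikit-image. The synthetic image, seed point, and tolerance below are illustrative, and this is not the authors' tuned pipeline; the SLIC channel_axis argument assumes a recent scikit-image version.

      # SLIC superpixels followed by a seeded flood (region-growing) step.
      import numpy as np
      from skimage.segmentation import slic
      from skimage.morphology import flood

      rng = np.random.default_rng(0)
      image = rng.uniform(0.6, 0.9, size=(128, 128))                 # bright "mucosa"
      image[40:90, 50:80] = rng.uniform(0.05, 0.2, size=(50, 30))    # dark "glottal" region

      labels = slic(image, n_segments=300, compactness=0.1, channel_axis=None)
      # replace every pixel by its superpixel mean to suppress local variance
      superpixel_mean = np.zeros_like(image)
      for lab in np.unique(labels):
          mask = labels == lab
          superpixel_mean[mask] = image[mask].mean()

      seed = (65, 65)                                # assumed seed inside the glottis
      glottal_mask = flood(superpixel_mean, seed, tolerance=0.15)
      print("segmented area (pixels):", int(glottal_mask.sum()))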

  1. A hybrid neural learning algorithm using evolutionary learning and derivative free local search method.

    PubMed

    Ghosh, Ranadhir; Yearwood, John; Ghosh, Moumita; Bagirov, Adil

    2006-06-01

    In this paper we investigate a hybrid model based on the Discrete Gradient method and an evolutionary strategy for determining the weights in a feed-forward artificial neural network. We also discuss different variants of hybrid models using the Discrete Gradient method and an evolutionary strategy for determining the weights in a feed-forward artificial neural network. The Discrete Gradient method has the advantage of being able to jump over many local minima and find very deep local minima. However, earlier research has shown that a good starting point for the Discrete Gradient method can improve the quality of the solution point. Evolutionary algorithms are best suited for global optimisation problems. Nevertheless, they suffer from longer training times and are often unsuitable for real-world applications. For optimisation problems such as weight optimisation for ANNs in real-world applications, the dimensions are large and time complexity is critical. Hence the idea of a hybrid model can be a suitable option. In this paper we propose different fusion strategies for hybrid models combining the evolutionary strategy with the Discrete Gradient method to obtain an optimal solution much more quickly. Three different fusion strategies are discussed: a linear hybrid model, an iterative hybrid model and a restricted local search hybrid model. Comparative results on a range of standard datasets are provided for the different fusion hybrid models.

  2. Hybrid stochastic simplifications for multiscale gene networks

    PubMed Central

    Crudu, Alina; Debussche, Arnaud; Radulescu, Ovidiu

    2009-01-01

    Background Stochastic simulation of gene networks by Markov processes has important applications in molecular biology. The complexity of exact simulation algorithms scales with the number of discrete jumps to be performed. Approximate schemes reduce the computational time by reducing the number of simulated discrete events. Also, answering important questions about the relation between network topology and intrinsic noise generation and propagation should be based on general mathematical results. These general results are difficult to obtain for exact models. Results We propose a unified framework for hybrid simplifications of Markov models of multiscale stochastic gene networks dynamics. We discuss several possible hybrid simplifications, and provide algorithms to obtain them from pure jump processes. In hybrid simplifications, some components are discrete and evolve by jumps, while other components are continuous. Hybrid simplifications are obtained by partial Kramers-Moyal expansion [1-3] which is equivalent to the application of the central limit theorem to a sub-model. By averaging and variable aggregation we drastically reduce simulation time and eliminate non-critical reactions. Hybrid and averaged simplifications can be used for more effective simulation algorithms and for obtaining general design principles relating noise to topology and time scales. The simplified models reproduce with good accuracy the stochastic properties of the gene networks, including waiting times in intermittence phenomena, fluctuation amplitudes and stationary distributions. The methods are illustrated on several gene network examples. Conclusion Hybrid simplifications can be used for onion-like (multi-layered) approaches to multi-scale biochemical systems, in which various descriptions are used at various scales. Sets of discrete and continuous variables are treated with different methods and are coupled together in a physically justified approach. PMID:19735554

  3. 3D segmentations of neuronal nuclei from confocal microscope image stacks

    PubMed Central

    LaTorre, Antonio; Alonso-Nanclares, Lidia; Muelas, Santiago; Peña, José-María; DeFelipe, Javier

    2013-01-01

    In this paper, we present an algorithm to create 3D segmentations of neuronal cells from stacks of previously segmented 2D images. The idea behind this proposal is to provide a general method to reconstruct 3D structures from 2D stacks, regardless of how these 2D stacks have been obtained. The algorithm not only reuses the information obtained in the 2D segmentation, but also attempts to correct some typical mistakes made by the 2D segmentation algorithms (for example, under segmentation of tightly-coupled clusters of cells). We have tested our algorithm in a real scenario—the segmentation of the neuronal nuclei in different layers of the rat cerebral cortex. Several representative images from different layers of the cerebral cortex have been considered and several 2D segmentation algorithms have been compared. Furthermore, the algorithm has also been compared with the traditional 3D Watershed algorithm and the results obtained here show better performance in terms of correctly identified neuronal nuclei. PMID:24409123

  4. 3D segmentations of neuronal nuclei from confocal microscope image stacks.

    PubMed

    Latorre, Antonio; Alonso-Nanclares, Lidia; Muelas, Santiago; Peña, José-María; Defelipe, Javier

    2013-01-01

    In this paper, we present an algorithm to create 3D segmentations of neuronal cells from stacks of previously segmented 2D images. The idea behind this proposal is to provide a general method to reconstruct 3D structures from 2D stacks, regardless of how these 2D stacks have been obtained. The algorithm not only reuses the information obtained in the 2D segmentation, but also attempts to correct some typical mistakes made by the 2D segmentation algorithms (for example, under segmentation of tightly-coupled clusters of cells). We have tested our algorithm in a real scenario-the segmentation of the neuronal nuclei in different layers of the rat cerebral cortex. Several representative images from different layers of the cerebral cortex have been considered and several 2D segmentation algorithms have been compared. Furthermore, the algorithm has also been compared with the traditional 3D Watershed algorithm and the results obtained here show better performance in terms of correctly identified neuronal nuclei.

  5. Thermodynamics of RNA structures by Wang–Landau sampling

    PubMed Central

    Lou, Feng; Clote, Peter

    2010-01-01

    Motivation: Thermodynamics-based dynamic programming RNA secondary structure algorithms have been of immense importance in molecular biology, where applications range from the detection of novel selenoproteins using expressed sequence tag (EST) data, to the determination of microRNA genes and their targets. Dynamic programming algorithms have been developed to compute the minimum free energy secondary structure and partition function of a given RNA sequence, the minimum free energy and partition function for the hybridization of two RNA molecules, etc. However, the applicability of dynamic programming methods depends on disallowing certain types of interactions (pseudoknots, zig-zags, etc.), as their inclusion renders structure prediction a nondeterministic polynomial time (NP)-complete problem. Nevertheless, such interactions have been observed in X-ray structures. Results: A non-Boltzmannian Monte Carlo algorithm was designed by Wang and Landau to estimate the density of states for complex systems, such as the Ising model, that exhibit a phase transition. In this article, we apply the Wang-Landau (WL) method to compute the density of states for secondary structures of a given RNA sequence, and for hybridizations of two RNA sequences. Our method is shown to be much faster than existing software, such as RNAsubopt. From the density of states, we compute the partition function over all secondary structures and over all pseudoknot-free hybridizations. The advantage of the WL method is that by adding a function to evaluate the free energy of arbitrary pseudoknotted structures and of arbitrary hybridizations, we can estimate thermodynamic parameters for situations known to be NP-complete. This extension to pseudoknots will be made in the sequel to this article; in contrast, the current article describes the WL algorithm applied to pseudoknot-free secondary structures and hybridizations. Availability: The WL RNA hybridization web server is under construction at http://bioinformatics.bc.edu/clotelab/. Contact: clote@bc.edu PMID:20529917
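
    The Wang-Landau random walk itself is simple to state: propose a move, accept it with probability min(1, g(E_old)/g(E_new)), add a modification factor to ln g(E) at the current energy, and halve the factor whenever the energy histogram is roughly flat. The sketch below runs this on a small 1D Ising ring as a toy stand-in for RNA structure energies; the flatness criterion and stopping factor are common conventions, not the paper's exact settings.

      # Wang-Landau density-of-states estimate for a 12-spin 1D Ising ring.
      import numpy as np

      rng = np.random.default_rng(0)
      N = 12
      spins = rng.choice([-1, 1], size=N)

      def energy(s):
          return int(-np.sum(s * np.roll(s, 1)))          # periodic 1D Ising energy

      log_g, hist = {}, {}                                # running ln g(E) and visit histogram
      ln_f, E, steps = 1.0, energy(spins), 0

      while ln_f > 1e-3:
          i = rng.integers(N)
          E_new = E + 2 * spins[i] * (spins[(i - 1) % N] + spins[(i + 1) % N])
          # acceptance min(1, g(E_old)/g(E_new)), evaluated in log form
          diff = log_g.get(E, 0.0) - log_g.get(E_new, 0.0)
          if diff >= 0 or rng.random() < np.exp(diff):
              spins[i] *= -1
              E = E_new
          log_g[E] = log_g.get(E, 0.0) + ln_f
          hist[E] = hist.get(E, 0) + 1
          steps += 1
          if steps % 10000 == 0:
              counts = np.array(list(hist.values()))
              if counts.min() > 0.8 * counts.mean():      # histogram roughly flat
                  hist = {k: 0 for k in hist}
                  ln_f /= 2.0

      print({e: round(g, 1) for e, g in sorted(log_g.items())})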

  6. Efficient hybrid non-equilibrium molecular dynamics--Monte Carlo simulations with symmetric momentum reversal.

    PubMed

    Chen, Yunjie; Roux, Benoît

    2014-09-21

    Hybrid schemes combining the strength of molecular dynamics (MD) and Metropolis Monte Carlo (MC) offer a promising avenue to improve the sampling efficiency of computer simulations of complex systems. A number of recently proposed hybrid methods consider new configurations generated by driving the system via a non-equilibrium MD (neMD) trajectory, which are subsequently treated as putative candidates for Metropolis MC acceptance or rejection. To obey microscopic detailed balance, it is necessary to alter the momentum of the system at the beginning and/or the end of the neMD trajectory. This strict rule then guarantees that the random walk in configurational space generated by such hybrid neMD-MC algorithm will yield the proper equilibrium Boltzmann distribution. While a number of different constructs are possible, the most commonly used prescription has been to simply reverse the momenta of all the particles at the end of the neMD trajectory ("one-end momentum reversal"). Surprisingly, it is shown here that the choice of momentum reversal prescription can have a considerable effect on the rate of convergence of the hybrid neMD-MC algorithm, with the simple one-end momentum reversal encountering particularly acute problems. In these neMD-MC simulations, different regions of configurational space end up being essentially isolated from one another due to a very small transition rate between regions. In the worst-case scenario, it is almost as if the configurational space does not constitute a single communicating class that can be sampled efficiently by the algorithm, and extremely long neMD-MC simulations are needed to obtain proper equilibrium probability distributions. To address this issue, a novel momentum reversal prescription, symmetrized with respect to both the beginning and the end of the neMD trajectory ("symmetric two-ends momentum reversal"), is introduced. Illustrative simulations demonstrate that the hybrid neMD-MC algorithm robustly yields a correct equilibrium probability distribution with this prescription.

  7. Efficient hybrid non-equilibrium molecular dynamics - Monte Carlo simulations with symmetric momentum reversal

    NASA Astrophysics Data System (ADS)

    Chen, Yunjie; Roux, Benoît

    2014-09-01

    Hybrid schemes combining the strength of molecular dynamics (MD) and Metropolis Monte Carlo (MC) offer a promising avenue to improve the sampling efficiency of computer simulations of complex systems. A number of recently proposed hybrid methods consider new configurations generated by driving the system via a non-equilibrium MD (neMD) trajectory, which are subsequently treated as putative candidates for Metropolis MC acceptance or rejection. To obey microscopic detailed balance, it is necessary to alter the momentum of the system at the beginning and/or the end of the neMD trajectory. This strict rule then guarantees that the random walk in configurational space generated by such hybrid neMD-MC algorithm will yield the proper equilibrium Boltzmann distribution. While a number of different constructs are possible, the most commonly used prescription has been to simply reverse the momenta of all the particles at the end of the neMD trajectory ("one-end momentum reversal"). Surprisingly, it is shown here that the choice of momentum reversal prescription can have a considerable effect on the rate of convergence of the hybrid neMD-MC algorithm, with the simple one-end momentum reversal encountering particularly acute problems. In these neMD-MC simulations, different regions of configurational space end up being essentially isolated from one another due to a very small transition rate between regions. In the worst-case scenario, it is almost as if the configurational space does not constitute a single communicating class that can be sampled efficiently by the algorithm, and extremely long neMD-MC simulations are needed to obtain proper equilibrium probability distributions. To address this issue, a novel momentum reversal prescription, symmetrized with respect to both the beginning and the end of the neMD trajectory ("symmetric two-ends momentum reversal"), is introduced. Illustrative simulations demonstrate that the hybrid neMD-MC algorithm robustly yields a correct equilibrium probability distribution with this prescription.

  8. Hybrid protection algorithms based on game theory in multi-domain optical networks

    NASA Astrophysics Data System (ADS)

    Guo, Lei; Wu, Jingjing; Hou, Weigang; Liu, Yejun; Zhang, Lincong; Li, Hongming

    2011-12-01

    With increasing network size, the optical backbone is divided into multiple domains, and each domain has its own network operator and management policy. At the same time, failures in an optical network may lead to huge data losses, since each wavelength carries a large amount of traffic. Therefore, survivability in multi-domain optical networks is very important. However, existing survivability algorithms can achieve only unilateral optimization of the profit of either users or network operators. Thus, they cannot find the double-win optimal solution that considers economic factors for both users and network operators. In this paper we therefore develop a multi-domain network model involving multiple Quality of Service (QoS) parameters. After presenting a link evaluation approach based on fuzzy mathematics, we propose a game model to find the optimal solution that maximizes the user's utility, the network operator's utility, and the joint utility of user and network operator. Since the problem of finding the double-win optimal solution is NP-complete, we propose two new hybrid protection algorithms, the Intra-domain Sub-path Protection (ISP) algorithm and the Inter-domain End-to-end Protection (IEP) algorithm. In ISP and IEP, hybrid protection means that an intelligent algorithm based on Bacterial Colony Optimization (BCO) and a heuristic algorithm are used to solve the survivability problem in intra-domain routing and inter-domain routing, respectively. Simulation results show that ISP and IEP have similar comprehensive utility. In addition, ISP has better resource utilization efficiency, lower blocking probability, and higher network operator's utility, while IEP has better user's utility.

  9. An evolutionary computation based algorithm for calculating solar differential rotation by automatic tracking of coronal bright points

    NASA Astrophysics Data System (ADS)

    Shahamatnia, Ehsan; Dorotovič, Ivan; Fonseca, Jose M.; Ribeiro, Rita A.

    2016-03-01

    Developing specialized software tools is essential to support studies of solar activity evolution. With new space missions such as Solar Dynamics Observatory (SDO), solar images are being produced in unprecedented volumes. To capitalize on that huge data availability, the scientific community needs a new generation of software tools for automatic and efficient data processing. In this paper a prototype of a modular framework for solar feature detection, characterization, and tracking is presented. To develop an efficient system capable of automatic solar feature tracking and measuring, a hybrid approach combining specialized image processing, evolutionary optimization, and soft computing algorithms is being followed. The specialized hybrid algorithm for tracking solar features allows automatic feature tracking while gathering characterization details about the tracked features. The hybrid algorithm takes advantages of the snake model, a specialized image processing algorithm widely used in applications such as boundary delineation, image segmentation, and object tracking. Further, it exploits the flexibility and efficiency of Particle Swarm Optimization (PSO), a stochastic population based optimization algorithm. PSO has been used successfully in a wide range of applications including combinatorial optimization, control, clustering, robotics, scheduling, and image processing and video analysis applications. The proposed tool, denoted PSO-Snake model, was already successfully tested in other works for tracking sunspots and coronal bright points. In this work, we discuss the application of the PSO-Snake algorithm for calculating the sidereal rotational angular velocity of the solar corona. To validate the results we compare them with published manual results performed by an expert.
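
    The stochastic optimizer at the heart of the PSO-Snake model is standard global-best particle swarm optimization. A minimal PSO sketch follows, minimizing the Rastrigin test function as a stand-in for the snake's image-energy term; swarm size, inertia, and acceleration weights are common defaults, not the authors' settings.

      # Global-best PSO with inertia on the 2-D Rastrigin test function.
      import numpy as np

      def rastrigin(x):
          return 10 * x.shape[-1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=-1)

      rng = np.random.default_rng(0)
      n_particles, dim, iters = 30, 2, 200
      w, c1, c2 = 0.7, 1.5, 1.5                       # inertia and acceleration weights

      pos = rng.uniform(-5.12, 5.12, size=(n_particles, dim))
      vel = np.zeros_like(pos)
      pbest, pbest_val = pos.copy(), rastrigin(pos)
      gbest = pbest[np.argmin(pbest_val)].copy()

      for _ in range(iters):
          r1, r2 = rng.random((2, n_particles, dim))
          vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
          pos = pos + vel
          val = rastrigin(pos)
          improved = val < pbest_val
          pbest[improved], pbest_val[improved] = pos[improved], val[improved]
          gbest = pbest[np.argmin(pbest_val)].copy()

      print("best value found:", pbest_val.min(), "at", gbest)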

  10. Fault-tolerant clock synchronization in distributed systems

    NASA Technical Reports Server (NTRS)

    Ramanathan, Parameswaran; Shin, Kang G.; Butler, Ricky W.

    1990-01-01

    Existing fault-tolerant clock synchronization algorithms are compared and contrasted. These include the following: software synchronization algorithms, such as convergence-averaging, convergence-nonaveraging, and consistency algorithms, as well as probabilistic synchronization; hardware synchronization algorithms; and hybrid synchronization. The worst-case clock skews guaranteed by representative algorithms are compared, along with other important aspects such as time, message, and cost overhead imposed by the algorithms. More recent developments such as hardware-assisted software synchronization and algorithms for synchronizing large, partially connected distributed systems are especially emphasized.
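
    A convergence-averaging correction of the kind surveyed here can be sketched in a few lines: each node discards the f largest and f smallest of the clock readings it collected and averages the rest, which tolerates up to f arbitrarily faulty clocks when more than 2f readings are available. The offsets below are illustrative.

      # Fault-tolerant (trimmed-mean) averaging of remote clock offsets.
      def fault_tolerant_average(readings, f):
          """Return a clock correction from remote offset readings (e.g., in ms)."""
          if len(readings) <= 2 * f:
              raise ValueError("need more than 2f readings to tolerate f faults")
          trimmed = sorted(readings)[f:len(readings) - f]   # drop f lowest and f highest
          return sum(trimmed) / len(trimmed)

      offsets = [1.2, 0.8, 1.1, 250.0, 0.9, -300.0, 1.0]    # two faulty clocks
      print("clock correction (ms):", fault_tolerant_average(offsets, f=2))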

  11. Mountain bicycle frame testing as an example of practical implementation of hybrid simulation using RTFEM

    NASA Astrophysics Data System (ADS)

    Mucha, Waldemar; Kuś, Wacław

    2018-01-01

    The paper presents a practical implementation of hybrid simulation using the Real Time Finite Element Method (RTFEM). Hybrid simulation is a technique for investigating dynamic material and structural properties of mechanical systems by performing a numerical analysis and an experiment at the same time. It applies to mechanical systems with elements that are too difficult or impossible to model numerically. These elements are tested experimentally, while the rest of the system is simulated numerically. Data between the experiment and the numerical simulation are exchanged in real time. The authors use the Finite Element Method to perform the numerical simulation. The paper presents the general algorithm for hybrid simulation using RTFEM and possible improvements of the algorithm for computation time reduction developed by the authors. The paper focuses on the practical implementation of the presented methods, which involves testing of a mountain bicycle frame, where the shock absorber is tested experimentally while the rest of the frame is simulated numerically.

  12. An adaptive deep-coupled GNSS/INS navigation system with hybrid pre-filter processing

    NASA Astrophysics Data System (ADS)

    Wu, Mouyan; Ding, Jicheng; Zhao, Lin; Kang, Yingyao; Luo, Zhibin

    2018-02-01

    The deep coupling of a global navigation satellite system (GNSS) with an inertial navigation system (INS) can provide accurate and reliable navigation information. There are several kinds of deeply-coupled structures. These can be divided mainly into coherent and non-coherent pre-filter based structures, which have their own advantages and disadvantages, especially in accuracy and robustness. In this paper, the existing pre-filters of the deeply-coupled structures are first analyzed and modified to improve them. Then, an adaptive GNSS/INS deeply-coupled algorithm with hybrid pre-filter processing is proposed to combine the advantages of the coherent and non-coherent structures. An adaptive hysteresis controller is designed to implement the hybrid pre-filter processing strategy. The simulation and vehicle test results show that the adaptive deeply-coupled algorithm with hybrid pre-filter processing can effectively improve navigation accuracy and robustness, especially in a GNSS-challenged environment.

  13. Hybrid cryptosystem for image file using elgamal and double playfair cipher algorithm

    NASA Astrophysics Data System (ADS)

    Hardi, S. M.; Tarigan, J. T.; Safrina, N.

    2018-03-01

    In this paper, we present an implementation of image file encryption using hybrid cryptography. We chose the ElGamal algorithm to perform the asymmetric encryption and Double Playfair for the symmetric encryption. Our objective is to show that these algorithms are capable of encrypting an image file with an acceptable running time and encrypted file size while maintaining the level of security. The application was built using the C# programming language and runs as a stand-alone desktop application under the Windows operating system. Our test shows that the system is capable of encrypting an image with a resolution of 500×500 to a size of 976 kilobytes with an acceptable running time.
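
    The asymmetric half of such a hybrid scheme is textbook ElGamal, typically used to wrap the symmetric key. A toy sketch over a small prime follows; the parameters are far too small for real security, the symmetric Double Playfair step is omitted, and this is not the paper's C# implementation.

      # Toy ElGamal key generation, encryption, and decryption over a small prime.
      import random

      p, g = 7919, 2                      # small prime and base (illustrative only)

      def keygen():
          x = random.randrange(2, p - 1)          # private key
          return x, pow(g, x, p)                  # (private, public)

      def encrypt(m, y):
          k = random.randrange(2, p - 1)          # ephemeral key
          return pow(g, k, p), (m * pow(y, k, p)) % p

      def decrypt(c1, c2, x):
          s = pow(c1, x, p)                       # shared secret
          return (c2 * pow(s, p - 2, p)) % p      # multiply by modular inverse of s

      private, public = keygen()
      message = 1234                              # e.g., an encoded symmetric key
      c1, c2 = encrypt(message, public)
      assert decrypt(c1, c2, private) == message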

  14. Self-tuning control of attitude and momentum management for the Space Station

    NASA Technical Reports Server (NTRS)

    Shieh, L. S.; Sunkel, J. W.; Yuan, Z. Z.; Zhao, X. M.

    1992-01-01

    This paper presents a hybrid state-space self-tuning design methodology using dual-rate sampling for suboptimal digital adaptive control of attitude and momentum management for the Space Station. This new hybrid adaptive control scheme combines an on-line recursive estimation algorithm for indirectly identifying the parameters of a continuous-time system from the available fast-rate sampled data of the inputs and states and a controller synthesis algorithm for indirectly finding the slow-rate suboptimal digital controller from the designed optimal analog controller. The proposed method enables the development of digitally implementable control algorithms for the robust control of Space Station Freedom with unknown environmental disturbances and slowly time-varying dynamics.

  15. The use of a MODIS band-ratio algorithm versus a new hybrid approach for estimating colored dissolved organic matter (CDOM)

    EPA Science Inventory

    Satellite remote sensing offers synoptic and frequent monitoring of optical water quality parameters, such as chlorophyll-a, turbidity, and colored dissolved organic matter (CDOM). While traditional satellite algorithms were developed for the open ocean, these algorithms often do...

  16. Uncertainty in nutrient loads from tile drained landscapes: Effect of sampling frequency, calculation algorithm, and compositing strategies

    USDA-ARS?s Scientific Manuscript database

    Accurate estimates of annual nutrient loads are required to evaluate trends in water quality following changes in land use or management and to calibrate and validate water quality models. While much emphasis has been placed on understanding the uncertainty of watershed-scale nutrient load estimates...

  17. Using High Resolution Spatial Data and Genetic Algorithms to Optimize Riparian Zone Condition and Impervious Cover Estimates in New England Watersheds

    EPA Science Inventory

    Under EPA’s Green Infrastructure Initiative, a variety of research activities are underway to evaluate the effectiveness of green infrastructure in mitigating the effects of urbanization and stormwater impacts on stream biota and habitat. One aspect of this is evaluating th...

  18. Creation of operation algorithms for combined operation of anti-lock braking system (ABS) and electric machine included in the combined power plant

    NASA Astrophysics Data System (ADS)

    Bakhmutov, S. V.; Ivanov, V. G.; Karpukhin, K. E.; Umnitsyn, A. A.

    2018-02-01

    The paper considers an Anti-lock Braking System (ABS) operation algorithm that enables the implementation of hybrid braking, i.e. a braking process combining the friction brake mechanisms and the e-machine (electric machine), which operates in the energy recovery mode. The provided materials focus only on the rectilinear motion of the vehicle. The ABS task consists in maintaining the target wheel slip ratio, which depends on the tyre-road adhesion coefficient. The tyre-road adhesion coefficient was determined based on the vehicle deceleration. In the course of computational studies, the following hybrid braking operation algorithm was determined. At an adhesion coefficient ≤0.1, driving-axle braking occurs only through the e-machine operating in the energy recovery mode. In other cases, depending on the adhesion coefficient, the e-machine provides a brake torque that changes from 35 to 100% of the maximum available brake torque. Virtual tests showed that the values of the wheel slip ratio are close to the required ones. Thus, this algorithm makes it possible to implement hybrid braking by means of the two sources creating the brake torque.
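
    The torque-split rule just described can be sketched as a small function: below an adhesion coefficient of 0.1 braking is purely regenerative, and above it the e-machine supplies between 35% and 100% of its available torque. The linear ramp between those published end points, and the numerical values in the usage example, are assumptions for illustration.

      # Hedged sketch of the hybrid brake-torque split on the driving axle.
      def split_brake_torque(demand_nm, mu, regen_max_nm):
          """Return (e-machine torque, friction-brake torque) in N*m."""
          if mu <= 0.1:
              regen = min(demand_nm, regen_max_nm)          # regeneration only
          else:
              # assumed linear ramp of the e-machine share from 35% to 100%
              fraction = min(1.0, 0.35 + 0.65 * (mu - 0.1) / 0.8)
              regen = min(demand_nm, fraction * regen_max_nm)
          return regen, demand_nm - regen

      print(split_brake_torque(demand_nm=800.0, mu=0.05, regen_max_nm=300.0))
      print(split_brake_torque(demand_nm=800.0, mu=0.6, regen_max_nm=300.0))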

  19. INS/GPS/LiDAR Integrated Navigation System for Urban and Indoor Environments Using Hybrid Scan Matching Algorithm

    PubMed Central

    Gao, Yanbin; Liu, Shifei; Atia, Mohamed M.; Noureldin, Aboelmagd

    2015-01-01

    This paper takes advantage of the complementary characteristics of Global Positioning System (GPS) and Light Detection and Ranging (LiDAR) to provide periodic corrections to Inertial Navigation System (INS) alternatively in different environmental conditions. In open sky, where GPS signals are available and LiDAR measurements are sparse, GPS is integrated with INS. Meanwhile, in confined outdoor environments and indoors, where GPS is unreliable or unavailable and LiDAR measurements are rich, LiDAR replaces GPS to integrate with INS. This paper also proposes an innovative hybrid scan matching algorithm that combines the feature-based scan matching method and Iterative Closest Point (ICP) based scan matching method. The algorithm can work and transit between two modes depending on the number of matched line features over two scans, thus achieving efficiency and robustness concurrently. Two integration schemes of INS and LiDAR with hybrid scan matching algorithm are implemented and compared. Real experiments are performed on an Unmanned Ground Vehicle (UGV) for both outdoor and indoor environments. Experimental results show that the multi-sensor integrated system can remain sub-meter navigation accuracy during the whole trajectory. PMID:26389906

  20. INS/GPS/LiDAR Integrated Navigation System for Urban and Indoor Environments Using Hybrid Scan Matching Algorithm.

    PubMed

    Gao, Yanbin; Liu, Shifei; Atia, Mohamed M; Noureldin, Aboelmagd

    2015-09-15

    This paper takes advantage of the complementary characteristics of Global Positioning System (GPS) and Light Detection and Ranging (LiDAR) to provide periodic corrections to Inertial Navigation System (INS) alternatively in different environmental conditions. In open sky, where GPS signals are available and LiDAR measurements are sparse, GPS is integrated with INS. Meanwhile, in confined outdoor environments and indoors, where GPS is unreliable or unavailable and LiDAR measurements are rich, LiDAR replaces GPS to integrate with INS. This paper also proposes an innovative hybrid scan matching algorithm that combines the feature-based scan matching method and Iterative Closest Point (ICP) based scan matching method. The algorithm can work and transit between two modes depending on the number of matched line features over two scans, thus achieving efficiency and robustness concurrently. Two integration schemes of INS and LiDAR with hybrid scan matching algorithm are implemented and compared. Real experiments are performed on an Unmanned Ground Vehicle (UGV) for both outdoor and indoor environments. Experimental results show that the multi-sensor integrated system can remain sub-meter navigation accuracy during the whole trajectory.

  1. A Temperature Compensation Method for Piezo-Resistive Pressure Sensor Utilizing Chaotic Ions Motion Algorithm Optimized Hybrid Kernel LSSVM.

    PubMed

    Li, Ji; Hu, Guoqing; Zhou, Yonghong; Zou, Chong; Peng, Wei; Alam Sm, Jahangir

    2016-10-14

    A piezo-resistive pressure sensor is made of silicon, whose behaviour is considerably influenced by ambient temperature. The effect of temperature should be eliminated during operation if a linear output is expected. To deal with this issue, an approach consisting of a hybrid kernel Least Squares Support Vector Machine (LSSVM) optimized by a chaotic ions motion algorithm is presented. To achieve good learning and generalization performance, a hybrid kernel function, constructed from a local kernel (a Radial Basis Function (RBF) kernel) and a global kernel (a polynomial kernel), is incorporated into the LSSVM. The chaotic ions motion algorithm is introduced to find the best hyper-parameters of the LSSVM. A calibration experiment is conducted, and the resulting temperature data are used to validate the proposed method. With attention to algorithm robustness and engineering applicability, the compensation results show that the proposed scheme outperforms the compared methods on several performance measures, such as the maximum absolute relative error, the minimum absolute relative error, and the mean and variance of the averaged value over fifty runs. Furthermore, the proposed temperature compensation approach lays a foundation for more extensive research.
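
    The hybrid kernel itself is easy to illustrate. The sketch below, written under the assumption of a simple weighted sum of the local RBF kernel and the global polynomial kernel, shows how such a kernel matrix could be assembled; the mixing weight and kernel hyper-parameters are exactly the quantities the chaotic ions motion algorithm would tune, and the values used here are placeholders, not the paper's.

    ```python
    import numpy as np

    def hybrid_kernel(X, Y, w=0.7, gamma=1.0, degree=2, coef0=1.0):
        """Weighted combination of a local RBF kernel and a global polynomial
        kernel (illustrative; the weight and hyper-parameters are placeholders)."""
        sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        k_rbf = np.exp(-gamma * sq_dists)         # local kernel
        k_poly = (X @ Y.T + coef0) ** degree      # global kernel
        return w * k_rbf + (1.0 - w) * k_poly

    X = np.random.rand(5, 2)
    K = hybrid_kernel(X, X)
    print(K.shape)  # (5, 5), symmetric positive semi-definite
    ```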

  2. Accurate modeling of switched reluctance machine based on hybrid trained WNN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Shoujun, E-mail: sunnyway@nwpu.edu.cn; Ge, Lefei; Ma, Shaojie

    2014-04-15

    Owing to the strongly nonlinear electromagnetic characteristics of the switched reluctance machine (SRM), a novel accurate modeling method is proposed based on a hybrid-trained wavelet neural network (WNN), which combines an improved genetic algorithm (GA) with the gradient descent (GD) method to train the network. In the novel method, the WNN is trained by the GD method starting from initial weights obtained by the improved GA optimization, so the global parallel searching capability of the stochastic algorithm and the local convergence speed of the deterministic algorithm are combined to enhance training accuracy, stability and speed. Based on the measured electromagnetic characteristics of a 3-phase 12/8-pole SRM, the nonlinear simulation model is built with the hybrid-trained WNN in Matlab. The phase current and mechanical characteristics from simulation under different working conditions agree well with those from experiments, which indicates the accuracy of the model for dynamic and static performance evaluation of the SRM and verifies the effectiveness of the proposed modeling method.
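
    The GA-then-GD training pattern described here generalizes beyond wavelet networks. The sketch below illustrates it on a deliberately tiny radial-basis surrogate rather than the paper's WNN or SRM data: a crude GA-style population search supplies the initial weights, and numerical gradient descent then refines them. All model choices and constants are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression target standing in for the measured SRM characteristics.
    X = rng.uniform(-1, 1, size=(200, 2))
    y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])

    def predict(w, X):
        """Tiny radial-basis model used as a stand-in for the WNN."""
        c, a, s = w[:6].reshape(3, 2), w[6:9], 0.5
        return sum(a[i] * np.exp(-((X - c[i]) ** 2).sum(1) / s) for i in range(3))

    def mse(w):
        return np.mean((predict(w, X) - y) ** 2)

    # Stage 1: GA-like global search (selection + mutation only, for brevity).
    pop = rng.normal(size=(40, 9))
    for _ in range(30):
        parents = pop[np.argsort([mse(p) for p in pop])[:10]]
        pop = parents[rng.integers(0, 10, 40)] + 0.1 * rng.normal(size=(40, 9))
    w = pop[np.argmin([mse(p) for p in pop])]

    # Stage 2: gradient descent (numerical gradient) from the GA's best weights.
    for _ in range(200):
        g = np.array([(mse(w + 1e-4 * e) - mse(w - 1e-4 * e)) / 2e-4
                      for e in np.eye(9)])
        w = w - 0.05 * g
    print(f"final MSE: {mse(w):.4f}")
    ```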

  3. A Hybrid alldifferent-Tabu Search Algorithm for Solving Sudoku Puzzles

    PubMed Central

    Crawford, Broderick; Paredes, Fernando; Norero, Enrique

    2015-01-01

    The Sudoku problem is a well-known logic-based puzzle of combinatorial number-placement. It consists in filling an n² × n² grid, composed of n columns, n rows, and n subgrids, each one containing distinct integers from 1 to n². Such a puzzle belongs to the NP-complete collection of problems, to which there exist diverse exact and approximate methods able to solve it. In this paper, we propose a new hybrid algorithm that smartly combines a classic tabu search procedure with the alldifferent global constraint from the constraint programming world. The alldifferent constraint is known to be efficient for domain filtering in the presence of constraints that must be pairwise different, which are exactly the kind of constraints that Sudokus own. This ability clearly alleviates the work of the tabu search, resulting in a faster and more robust approach for solving Sudokus. We illustrate interesting experimental results where our proposed algorithm outperforms the best results previously reported by hybrids and approximate methods. PMID:26078751

  4. An evolution based biosensor receptor DNA sequence generation algorithm.

    PubMed

    Kim, Eungyeong; Lee, Malrey; Gatton, Thomas M; Lee, Jaewan; Zang, Yupeng

    2010-01-01

    A biosensor is composed of a bioreceptor, an associated recognition molecule, and a signal transducer that can selectively detect target substances for analysis. DNA based biosensors utilize receptor molecules that allow hybridization with the target analyte. However, most DNA biosensor research uses oligonucleotides as the target analytes and does not address the potential problems of real samples. The identification of recognition molecules suitable for real target analyte samples is an important step towards further development of DNA biosensors. This study examines the characteristics of DNA used as bioreceptors and proposes a hybrid evolution-based DNA sequence generating algorithm, based on DNA computing, to identify suitable DNA bioreceptor recognition molecules for stable hybridization with real target substances. The Traveling Salesman Problem (TSP) approach is applied in the proposed algorithm to evaluate the safety and fitness of the generated DNA sequences. This approach improves efficiency and stability for enhanced and variable-length DNA sequence generation and allows extension to generation of variable-length DNA sequences with diverse receptor recognition requirements.

  5. A Hybrid alldifferent-Tabu Search Algorithm for Solving Sudoku Puzzles.

    PubMed

    Soto, Ricardo; Crawford, Broderick; Galleguillos, Cristian; Paredes, Fernando; Norero, Enrique

    2015-01-01

    The Sudoku problem is a well-known logic-based puzzle of combinatorial number-placement. It consists in filling an n² × n² grid, composed of n columns, n rows, and n subgrids, each one containing distinct integers from 1 to n². Such a puzzle belongs to the NP-complete collection of problems, to which there exist diverse exact and approximate methods able to solve it. In this paper, we propose a new hybrid algorithm that smartly combines a classic tabu search procedure with the alldifferent global constraint from the constraint programming world. The alldifferent constraint is known to be efficient for domain filtering in the presence of constraints that must be pairwise different, which are exactly the kind of constraints that Sudokus own. This ability clearly alleviates the work of the tabu search, resulting in a faster and more robust approach for solving Sudokus. We illustrate interesting experimental results where our proposed algorithm outperforms the best results previously reported by hybrids and approximate methods.

  6. A Wireless Sensor Network with Soft Computing Localization Techniques for Track Cycling Applications.

    PubMed

    Gharghan, Sadik Kamel; Nordin, Rosdiadee; Ismail, Mahamod

    2016-08-06

    In this paper, we propose two soft computing localization techniques for wireless sensor networks (WSNs). The two techniques, the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Artificial Neural Network (ANN), focus on a range-based localization method which relies on the measurement of the received signal strength indicator (RSSI) from the three ZigBee anchor nodes distributed throughout the track cycling field. The soft computing techniques aim to estimate the distance between bicycles moving on the cycle track for outdoor and indoor velodromes. In the first approach the ANFIS was considered, whereas in the second approach the ANN was hybridized individually with three optimization algorithms, namely Particle Swarm Optimization (PSO), the Gravitational Search Algorithm (GSA), and the Backtracking Search Algorithm (BSA). The results revealed that the hybrid GSA-ANN outperforms the other methods adopted in this paper in terms of localization accuracy and distance estimation accuracy. The hybrid GSA-ANN achieves a mean absolute distance estimation error of 0.02 m and 0.2 m for outdoor and indoor velodromes, respectively.

  7. A Wireless Sensor Network with Soft Computing Localization Techniques for Track Cycling Applications

    PubMed Central

    Gharghan, Sadik Kamel; Nordin, Rosdiadee; Ismail, Mahamod

    2016-01-01

    In this paper, we propose two soft computing localization techniques for wireless sensor networks (WSNs). The two techniques, the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Artificial Neural Network (ANN), focus on a range-based localization method which relies on the measurement of the received signal strength indicator (RSSI) from the three ZigBee anchor nodes distributed throughout the track cycling field. The soft computing techniques aim to estimate the distance between bicycles moving on the cycle track for outdoor and indoor velodromes. In the first approach the ANFIS was considered, whereas in the second approach the ANN was hybridized individually with three optimization algorithms, namely Particle Swarm Optimization (PSO), the Gravitational Search Algorithm (GSA), and the Backtracking Search Algorithm (BSA). The results revealed that the hybrid GSA-ANN outperforms the other methods adopted in this paper in terms of localization accuracy and distance estimation accuracy. The hybrid GSA-ANN achieves a mean absolute distance estimation error of 0.02 m and 0.2 m for outdoor and indoor velodromes, respectively. PMID:27509495

  8. Hybrid Nested Partitions and Math Programming Framework for Large-scale Combinatorial Optimization

    DTIC Science & Technology

    2010-03-31

    optimization problems: 1) exact algorithms and 2) metaheuristic algorithms. This project will integrate concepts from these two technologies to develop...optimal solutions within an acceptable amount of computation time, and 2) metaheuristic algorithms such as genetic algorithms, tabu search, and the...integer programming decomposition approaches, such as Dantzig-Wolfe decomposition and Lagrangian relaxation, and metaheuristics such as the Nested

  9. An item-oriented recommendation algorithm on cold-start problem

    NASA Astrophysics Data System (ADS)

    Qiu, Tian; Chen, Guang; Zhang, Zi-Ke; Zhou, Tao

    2011-09-01

    Based on a hybrid algorithm incorporating the heat conduction and probability spreading processes (Proc. Natl. Acad. Sci. U.S.A., 107 (2010) 4511), in this letter, we propose an improved method by introducing an item-oriented function, focusing on solving the dilemma of the recommendation accuracy between the cold and popular items. Differently from previous works, the present algorithm does not require any additional information (e.g., tags). Further experimental results obtained in three real datasets, RYM, Netflix and MovieLens, show that, compared with the original hybrid method, the proposed algorithm significantly enhances the recommendation accuracy of the cold items, while it keeps the recommendation accuracy of the overall and the popular items. This work might shed some light on both understanding and designing effective methods for long-tailed online applications of recommender systems.
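
    For context, the underlying heat-conduction/probability-spreading hybrid cited from PNAS 107 (2010) 4511 can be written compactly; the item-oriented modification proposed in this letter is not reproduced here. In the sketch, `lam` interpolates between pure heat conduction (lam = 0) and pure probability spreading (lam = 1), and the toy rating matrix is made up.

    ```python
    import numpy as np

    def hybrid_scores(A, lam=0.5):
        """Recommendation scores from the ProbS/HeatS hybrid.
        A is a user-item 0/1 matrix; rows of the result score items per user."""
        ku = A.sum(axis=1, keepdims=True)            # user degrees
        ki = A.sum(axis=0)                           # item degrees
        core = (A / np.clip(ku, 1, None)).T @ A      # sum_l a_la * a_lb / k(u_l)
        norm = np.outer(ki ** (1 - lam), ki ** lam)  # k(o_a)^(1-lam) * k(o_b)^lam
        W = core / np.clip(norm, 1e-12, None)
        return A @ W.T                               # already-collected items not masked

    A = np.array([[1, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 1, 1]], dtype=float)
    print(hybrid_scores(A, lam=0.5).round(3))
    ```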

  10. Hybrid radiosity-SP3 equation based bioluminescence tomography reconstruction for turbid medium with low- and non-scattering regions

    NASA Astrophysics Data System (ADS)

    Chen, Xueli; Zhang, Qitan; Yang, Defu; Liang, Jimin

    2014-01-01

    To address the specific problem of gastric cancer detection, in which low-scattering regions coexist with both non-scattering and high-scattering regions, a novel hybrid radiosity-SP3 equation based reconstruction algorithm for bioluminescence tomography was proposed in this paper. In the algorithm, the third-order simplified spherical harmonics approximation (SP3) was combined with the radiosity equation to describe the bioluminescent light propagation in tissues, which provided acceptable accuracy for the turbid medium with both low- and non-scattering regions. The performance of the algorithm was evaluated with digital mouse based simulations and an in situ experiment on a gastric cancer-bearing mouse. Primary results demonstrated the feasibility and superiority of the proposed algorithm for the turbid medium with low- and non-scattering regions.

  11. Scattering properties of electromagnetic waves from metal object in the lower terahertz region

    NASA Astrophysics Data System (ADS)

    Chen, Gang; Dang, H. X.; Hu, T. Y.; Su, Xiang; Lv, R. C.; Li, Hao; Tan, X. M.; Cui, T. J.

    2018-01-01

    An efficient hybrid algorithm is proposed to analyze the electromagnetic scattering properties of metal objects in the lower terahertz (THz) frequency range. A metal object can be viewed as a perfectly electrically conducting object with a slightly rough surface in the lower THz region. Hence the THz field scattered from the metal object can be divided into coherent and incoherent parts. The physical optics and truncated-wedge incremental-length diffraction coefficients methods are combined to compute the coherent part, while the small perturbation method is used for the incoherent part. With the Monte Carlo method, the radar cross section of the rough metal surface is computed by the multilevel fast multipole algorithm and the proposed hybrid algorithm, respectively. The numerical results show that the proposed algorithm has good accuracy and simulates the scattering properties rapidly in the lower THz region.

  12. PS-FW: A Hybrid Algorithm Based on Particle Swarm and Fireworks for Global Optimization

    PubMed Central

    Chen, Shuangqing; Wei, Lixin; Guan, Bing

    2018-01-01

    Particle swarm optimization (PSO) and fireworks algorithm (FWA) are two recently developed optimization methods which have been applied in various areas due to their simplicity and efficiency. However, when being applied to high-dimensional optimization problems, PSO algorithm may be trapped in the local optima owing to the lack of powerful global exploration capability, and fireworks algorithm is difficult to converge in some cases because of its relatively low local exploitation efficiency for noncore fireworks. In this paper, a hybrid algorithm called PS-FW is presented, in which the modified operators of FWA are embedded into the solving process of PSO. In the iteration process, the abandonment and supplement mechanism is adopted to balance the exploration and exploitation ability of PS-FW, and the modified explosion operator and the novel mutation operator are proposed to speed up the global convergence and to avoid prematurity. To verify the performance of the proposed PS-FW algorithm, 22 high-dimensional benchmark functions have been employed, and it is compared with PSO, FWA, stdPSO, CPSO, CLPSO, FIPS, Frankenstein, and ALWPSO algorithms. Results show that the PS-FW algorithm is an efficient, robust, and fast converging optimization method for solving global optimization problems. PMID:29675036

  13. Finite element model updating using the shadow hybrid Monte Carlo technique

    NASA Astrophysics Data System (ADS)

    Boulkaibet, I.; Mthembu, L.; Marwala, T.; Friswell, M. I.; Adhikari, S.

    2015-02-01

    Recent research in the field of finite element model (FEM) updating advocates the adoption of Bayesian analysis techniques to deal with the uncertainties associated with these models. However, Bayesian formulations require the evaluation of the Posterior Distribution Function, which may not be available in analytical form. This is the case in FEM updating. In such cases sampling methods can provide good approximations of the Posterior distribution when implemented in the Bayesian context. Markov Chain Monte Carlo (MCMC) algorithms are the most popular sampling tools used to sample probability distributions. However, the efficiency of these algorithms is affected by the complexity of the systems (the size of the parameter space). The Hybrid Monte Carlo (HMC) offers a very important MCMC approach for dealing with higher-dimensional complex problems. The HMC uses the molecular dynamics (MD) steps as the global Monte Carlo (MC) moves to reach areas of high probability, where the gradient of the log-density of the Posterior acts as a guide during the search process. However, the acceptance rate of HMC is sensitive to the system size as well as to the time step used to evaluate the MD trajectory. To overcome this limitation we propose the use of the Shadow Hybrid Monte Carlo (SHMC) algorithm. The SHMC algorithm is a modified version of HMC, designed to improve sampling for large system sizes and time steps. This is done by sampling from a modified Hamiltonian function instead of the normal Hamiltonian function. In this paper, the efficiency and accuracy of the SHMC method are tested on the updating of two real structures, an unsymmetrical H-shaped beam structure and a GARTEUR SM-AG19 structure, and compared to the application of the HMC algorithm to the same structures.

  14. A Hybrid Maximum Power Point Tracking Method for Automobile Exhaust Thermoelectric Generator

    NASA Astrophysics Data System (ADS)

    Quan, Rui; Zhou, Wei; Yang, Guangyou; Quan, Shuhai

    2017-05-01

    To make full use of the maximum output power of an automobile exhaust thermoelectric generator (AETEG) based on Bi2Te3 thermoelectric modules (TEMs), taking into account the advantages and disadvantages of existing maximum power point tracking methods, and according to the output characteristics of TEMs, a hybrid maximum power point tracking method combining the perturb and observe (P&O) algorithm, quadratic interpolation and constant voltage tracking was put forward in this paper. Firstly, it searches for the maximum power point with the P&O algorithm and quadratic interpolation; then it forces the AETEG to work at its maximum power point with constant voltage tracking. A synchronous buck converter and controller were implemented in the electric bus of the AETEG applied in a military sports utility vehicle, and the whole system was modeled and simulated in a MATLAB/Simulink environment. Simulation results demonstrate that the maximum output power of the AETEG based on the proposed hybrid method is increased by about 3.0% and 3.7% compared with that obtained using only the P&O algorithm and only the quadratic interpolation method, respectively. The tracking time is only 1.4 s, roughly half that of the P&O algorithm and of the quadratic interpolation method. The experimental results demonstrate that the maximum power tracked with the proposed hybrid method is approximately equal to the real value, that the method copes better with the voltage fluctuation of the AETEG than the P&O algorithm alone, and that it resolves the issue that the working point can barely be adjusted with constant voltage tracking alone when the operating conditions change.
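
    The three ingredients of the tracking method lend themselves to a short sketch. The snippet below shows a generic P&O step and the three-point quadratic-interpolation jump; the power curve, step size and sample voltages are invented for illustration, and the constant-voltage phase is simply holding the returned voltage.

    ```python
    def perturb_and_observe(v, p, v_prev, p_prev, step=0.1):
        """One P&O step: keep moving the operating voltage in the direction
        that increased output power on the last perturbation."""
        direction = 1.0 if (p - p_prev) * (v - v_prev) > 0 else -1.0
        return v + direction * step

    def quadratic_vertex(v1, p1, v2, p2, v3, p3):
        """Vertex of the parabola through three (voltage, power) samples,
        used to jump close to the maximum power point once it is bracketed."""
        num = (v1**2 - v2**2) * (p2 - p3) - (v2**2 - v3**2) * (p1 - p2)
        den = 2 * ((v1 - v2) * (p2 - p3) - (v2 - v3) * (p1 - p2))
        return num / den

    # Invented AETEG power curve with a maximum near 14.3 V.
    p = lambda v: v * (4.0 - 0.14 * v)
    v_mpp = quadratic_vertex(10, p(10), 13, p(13), 16, p(16))
    print(round(v_mpp, 2))   # ~14.29 V, then held by constant-voltage tracking
    ```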

  15. Improved Evolutionary Hybrids for Flexible Ligand Docking in Autodock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belew, R.K.; Hart, W.E.; Morris, G.M.

    1999-01-27

    In this paper we evaluate the design of the hybrid evolutionary algorithms (EAs) that are currently used to perform flexible ligand binding in the Autodock docking software. Hybrid EAs incorporate specialized operators that exploit domain-specific features to accelerate an EA's search. We consider hybrid EAs that use an integrated local search operator to refine individuals within each iteration of the search. We evaluate several factors that impact the efficacy of a hybrid EA, and we propose new hybrid EAs that provide more robust convergence to low-energy docking configurations than the methods currently available in Autodock.

  16. Novel hybrid GPU-CPU implementation of parallelized Monte Carlo parametric expectation maximization estimation method for population pharmacokinetic data analysis.

    PubMed

    Ng, C M

    2013-10-01

    The development of a population PK/PD model, an essential component of model-based drug development, is both time- and labor-intensive. Graphics processing unit (GPU) computing technology has been proposed and used to accelerate many scientific computations. The objective of this study was to develop a hybrid GPU-CPU implementation of the parallelized Monte Carlo parametric expectation maximization (MCPEM) estimation algorithm for population PK data analysis. A hybrid GPU-CPU implementation of the MCPEM algorithm (MCPEMGPU) and an identical algorithm designed for a single CPU (MCPEMCPU) were developed using MATLAB on a single computer equipped with dual Xeon 6-core E5690 CPUs and an NVIDIA Tesla C2070 GPU parallel computing card containing 448 stream processors. Two different PK models with rich/sparse sampling design schemes were used to simulate population data for assessing the performance of MCPEMCPU and MCPEMGPU. Results were analyzed by comparing the parameter estimates and model computation times. The speedup factor was used to assess the relative benefit of the parallelized MCPEMGPU over MCPEMCPU in shortening model computation time. The MCPEMGPU consistently achieved shorter computation times than the MCPEMCPU and can offer more than a 48-fold speedup using a single GPU card. The novel hybrid GPU-CPU implementation of the parallelized MCPEM algorithm developed in this study holds great promise as the core of the next generation of modeling software for population PK/PD analysis.

  17. Hybrid stochastic simulation of reaction-diffusion systems with slow and fast dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strehl, Robert; Ilie, Silvana, E-mail: silvana@ryerson.ca

    2015-12-21

    In this paper, we present a novel hybrid method to simulate discrete stochastic reaction-diffusion models arising in biochemical signaling pathways. We study moderately stiff systems, for which we can partition each reaction or diffusion channel into either a slow or fast subset, based on its propensity. Numerical approaches missing this distinction are often limited with respect to computational run time or approximation quality. We design an approximate scheme that remedies these pitfalls by using a new blending strategy of the well-established inhomogeneous stochastic simulation algorithm and the tau-leaping simulation method. The advantages of our hybrid simulation algorithm are demonstrated on three benchmarking systems, with special focus on approximation accuracy and efficiency.
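
    The slow/fast partitioning that drives the blending can be pictured with a trivial helper. The cut-off value and the single-threshold rule below are illustrative only; the paper's blending strategy is more elaborate than a fixed propensity threshold.

    ```python
    def partition_channels(propensities, threshold=10.0):
        """Split reaction/diffusion channels into a slow subset (to be handled
        by the exact inhomogeneous SSA) and a fast subset (to be advanced by
        tau-leaping), using a simple propensity cut-off."""
        slow, fast = [], []
        for idx, a in enumerate(propensities):
            (fast if a >= threshold else slow).append(idx)
        return slow, fast

    print(partition_channels([0.2, 45.0, 3.1, 120.0]))  # ([0, 2], [1, 3])
    ```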

  18. Implementation of a watershed algorithm on FPGAs

    NASA Astrophysics Data System (ADS)

    Zahirazami, Shahram; Akil, Mohamed

    1998-10-01

    In this article we present an implementation of a watershed algorithm on a multi-FPGA architecture. The implementation is based on a hierarchical FIFO (H-FIFO), with a separate FIFO for each gray level. The gray-scale value of a pixel is taken as the altitude of the point, so the image is viewed as a relief. We then proceed by a flooding step, as if the relief were immersed in a lake: the water rises, and when the waters of two different catchment basins meet, a separator or `watershed' is constructed. This approach is data dependent, hence the processing time differs from image to image. The H-FIFO guarantees the nature of the immersion, which requires two types of priority: all points of altitude `n' are processed before any point of altitude `n + 1', and within one altitude the water propagates with constant velocity in all directions from the source. The operator needs two input images: the original image (or its gradient) and a marker image. A classic way to construct the marker image is to build an image of minimal regions, each with its unique label. This label is the colour of the water and is used to detect when waters of two different colours touch each other. The algorithm first fills the hierarchical FIFO with the neighbours of all regions that are not yet coloured. It then fetches the first pixel from the first non-empty FIFO and processes it: the pixel takes the colour of its neighbour, and all its neighbours that are not already in the H-FIFO are put into their corresponding FIFOs. The process ends when the H-FIFO is empty. The result is a segmented and labeled image.
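
    A software rendition of the flooding scheme described above fits in a few dozen lines and makes the role of the hierarchical FIFO explicit. The sketch below follows the abstract's description (one FIFO per gray level, lowest level drained first, labels propagating from marker regions); the handling of watershed points and the 4-neighbourhood are simplifying choices of this sketch, not details of the FPGA implementation.

    ```python
    from collections import deque
    import numpy as np

    WSHED = -1  # label for separator (watershed) pixels

    def hfifo_watershed(gray, markers):
        """Flood a gray-level relief from labelled markers using one FIFO per
        gray level (a hierarchical FIFO). Unlabelled pixels take the colour of
        the basin that reaches them first; pixels reached by two different
        colours become watershed points. Simplified sketch."""
        labels = markers.copy()
        h, w = gray.shape
        fifos = {g: deque() for g in range(int(gray.max()) + 1)}
        queued = np.zeros(gray.shape, dtype=bool)

        def neighbours(y, x):
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    yield ny, nx

        def enqueue(y, x):
            if labels[y, x] == 0 and not queued[y, x]:
                queued[y, x] = True
                fifos[int(gray[y, x])].append((y, x))

        # Fill the hierarchical FIFO with the uncoloured neighbours of all markers.
        for y in range(h):
            for x in range(w):
                if labels[y, x] > 0:
                    for ny, nx in neighbours(y, x):
                        enqueue(ny, nx)

        # Flood: always fetch from the first (lowest) non-empty FIFO.
        while any(fifos.values()):
            level = min(g for g, q in fifos.items() if q)
            y, x = fifos[level].popleft()
            near = {int(labels[ny, nx]) for ny, nx in neighbours(y, x)
                    if labels[ny, nx] > 0}
            labels[y, x] = near.pop() if len(near) == 1 else WSHED
            if labels[y, x] != WSHED:
                for ny, nx in neighbours(y, x):
                    enqueue(ny, nx)
        return labels

    gray = np.array([[3, 3, 3, 3, 3],
                     [1, 2, 5, 2, 1],
                     [0, 1, 5, 1, 0],
                     [1, 2, 5, 2, 1],
                     [3, 3, 3, 3, 3]])
    markers = np.zeros_like(gray)
    markers[2, 0], markers[2, 4] = 1, 2   # two seed regions
    print(hfifo_watershed(gray, markers))
    ```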

  19. Application of genetic algorithm to land use optimization for non-point source pollution control based on CLUE-S and SWAT

    NASA Astrophysics Data System (ADS)

    Wang, Qingrui; Liu, Ruimin; Men, Cong; Guo, Lijia

    2018-05-01

    The genetic algorithm (GA) was combined with the Conversion of Land Use and its Effect at Small regional extent (CLUE-S) model to obtain an optimized land use pattern for controlling non-point source (NPS) pollution, and the performance of the combination was evaluated. The effect of the optimized land use pattern on NPS pollution control was estimated with the Soil and Water Assessment Tool (SWAT) model, and an assistant map was drawn to support future land use planning. The Xiangxi River watershed was selected as the study area. Two scenarios were used to simulate the land use change. Under the historical trend scenario (Markov chain prediction), the forest area decreased by 2035.06 ha and was mainly converted into paddy and dryland area. In contrast, under the optimized scenario (genetic algorithm (GA) prediction), up to 3370 ha of dryland area was converted into forest area. Spatially, the conversion of paddy and dryland into forest occurred mainly in the northwest and southeast of the watershed, where slope land occupies a large proportion. The organic and inorganic phosphorus loads decreased by 3.6% and 3.7%, respectively, in the optimized scenario compared to those in the historical trend scenario. GA showed better performance in optimized land use prediction. A comparison of the land use patterns in 2010 under the real situation and in 2020 under the optimized situation showed that Shennongjia and Shuiyuesi should convert 1201.76 ha and 1115.33 ha of dryland into forest areas, respectively, which represented the greatest changes among all regions in the watershed. The results of this study indicated that GA and the CLUE-S model can be used to optimize future land use patterns and that SWAT can be used to evaluate the effect of land use optimization on non-point source pollution control. These methods may provide support for land use planning.

  20. Development of seismic tomography software for hybrid supercomputers

    NASA Astrophysics Data System (ADS)

    Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton

    2015-04-01

    Seismic tomography is a technique used for computing velocity model of geologic structure from first arrival travel times of seismic waves. The technique is used in processing of regional and global seismic data, in seismic exploration for prospecting and exploration of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of development of seismic monitoring systems and increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for use in seismic tomography applications with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs, but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and software package for such systems, to be used in processing of large volumes of seismic data (hundreds of gigabytes and more). These algorithms and software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using the eikonal equation solver, arrival times of seismic waves are computed based on assumed velocity model of geologic structure being analyzed. In order to solve the linearized inverse problem, tomographic matrix is computed that connects model adjustments with travel time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on target architectures is considered. During the first stage of this work, algorithms were developed for execution on supercomputers using multicore CPUs only, with preliminary performance tests showing good parallel efficiency on large numerical grids. Porting of the algorithms to hybrid supercomputers is currently ongoing.

  1. Systematic Design of High-performance Hybrid Feedback Algorithms

    DTIC Science & Technology

    2015-06-24

    Automatic Control, vol. 59, no. 9, pp. 2426-2441, 2014. J6. Liberzon, D.; Nešić, D.; Teel, A.R., “Lyapunov-based small-gain theorems for hybrid...

  2. Energy-Saving Traffic Scheduling in Hybrid Software Defined Wireless Rechargeable Sensor Networks

    PubMed Central

    Wei, Yunkai; Ma, Xiaohui; Yang, Ning; Chen, Yijin

    2017-01-01

    Software Defined Wireless Rechargeable Sensor Networks (SDWRSNs) are an inexorable trend for Wireless Sensor Networks (WSNs), including Wireless Rechargeable Sensor Networks (WRSNs). However, the traditional network devices cannot be completely substituted in the short term, so hybrid SDWRSNs, where software defined devices and traditional devices coexist, will last for a long time. Hybrid SDWRSNs bring new challenges as well as opportunities for energy saving, which is still a key problem considering that the wireless chargers are also exhaustible, especially in harsh environments away from the mains supply. Numerous energy saving schemes for WSNs, or even some works for WRSNs, are no longer suitable for the new features of hybrid SDWRSNs. To solve this problem, this paper puts forward an Energy-saving Traffic Scheduling (ETS) algorithm. The ETS algorithm adequately considers the new characteristics of hybrid SDWRSNs, and takes advantage of the Software Defined Networking (SDN) controller’s direct control over SDN nodes and indirect control over normal nodes. The simulation results show that, compared with the traditional Minimum Transmission Energy (MTE) protocol, ETS can substantially improve the energy efficiency in hybrid SDWRSNs by up to 20–40% while ensuring feasible data delay. PMID:28914816

  3. Energy-Saving Traffic Scheduling in Hybrid Software Defined Wireless Rechargeable Sensor Networks.

    PubMed

    Wei, Yunkai; Ma, Xiaohui; Yang, Ning; Chen, Yijin

    2017-09-15

    Software Defined Wireless Rechargeable Sensor Networks (SDWRSNs) are an inexorable trend for Wireless Sensor Networks (WSNs), including Wireless Rechargeable Sensor Networks (WRSNs). However, the traditional network devices cannot be completely substituted in the short term, so hybrid SDWRSNs, where software defined devices and traditional devices coexist, will last for a long time. Hybrid SDWRSNs bring new challenges as well as opportunities for energy saving, which is still a key problem considering that the wireless chargers are also exhaustible, especially in harsh environments away from the mains supply. Numerous energy saving schemes for WSNs, or even some works for WRSNs, are no longer suitable for the new features of hybrid SDWRSNs. To solve this problem, this paper puts forward an Energy-saving Traffic Scheduling (ETS) algorithm. The ETS algorithm adequately considers the new characteristics of hybrid SDWRSNs, and takes advantage of the Software Defined Networking (SDN) controller's direct control over SDN nodes and indirect control over normal nodes. The simulation results show that, compared with the traditional Minimum Transmission Energy (MTE) protocol, ETS can substantially improve the energy efficiency in hybrid SDWRSNs by up to 20-40% while ensuring feasible data delay.

  4. Construction cost estimation of spherical storage tanks: artificial neural networks and hybrid regression—GA algorithms

    NASA Astrophysics Data System (ADS)

    Arabzadeh, Vida; Niaki, S. T. A.; Arabzadeh, Vahid

    2017-10-01

    One of the most important processes in the early stages of construction projects is to estimate the cost involved. This process involves a wide range of uncertainties, which make it a challenging task. Because of unknown issues, using the experience of experts or looking for similar cases are the conventional methods to deal with cost estimation. The current study presents data-driven methods for cost estimation based on the application of artificial neural network (ANN) and regression models. The learning algorithms of the ANN are Levenberg-Marquardt and Bayesian regularization. Moreover, the regression models are hybridized with a genetic algorithm to obtain better estimates of the coefficients. The methods are applied to a real case, where the input parameters of the models are assigned based on the key issues involved in a spherical tank construction. The results reveal that, while a high correlation between the estimated cost and the real cost exists, both ANNs perform better than the hybridized regression models. In addition, the ANN with the Levenberg-Marquardt learning algorithm (LMNN) obtains a better estimation than the ANN with the Bayesian regularization learning algorithm (BRNN). The correlation between real data and estimated values is over 90%, while the mean square error is around 0.4. The proposed LMNN model can be effective in reducing uncertainty and complexity in the early stages of a construction project.

  5. A hybrid personalized data recommendation approach for geoscience data sharing

    NASA Astrophysics Data System (ADS)

    WANG, M.; Wang, J.

    2016-12-01

    Recommender systems are effective tools that help Internet users overcome information overload. The two most widely used recommendation algorithms are collaborative filtering (CF) and content-based filtering (CBF). A number of recommender systems based on these two algorithms have been developed for multimedia, online sales, and other domains. Each of the two algorithms has its advantages and shortcomings, and hybrid approaches that combine the two are better choices in many cases. In the geoscience data sharing domain, where the items (datasets) are more informative (in space and time) and domain-specific, no recommender system is specialized for data users. This paper reports a dynamic weighted hybrid recommendation algorithm that combines CF and CBF for a geoscience data sharing portal. We first derive users' ratings on items from their historical visiting times using Jenks Natural Breaks. In the CBF part, we incorporate the space, time, and subject information of geoscience datasets to compute item similarity. Predicted ratings were computed with the k-NN method separately using CBF and CF, and then combined with weights. With a training dataset we attempted to find the best model describing the ideal weights as a function of users' co-rating numbers, and a logarithmic function was confirmed to be the best model. The model was then used to tune the weights of CF and CBF on a user-item basis with a test dataset. Evaluation results show that the dynamic weighted approach outperforms both the solo CF and the solo CBF approaches in terms of Precision and Recall.
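
    The dynamic weighting step can be written down directly. The blend below assumes a fitted logarithmic model for the CF weight as a function of the co-rating count, as the abstract reports; the coefficients `a` and `b` and the example scores are placeholders, not values from the paper.

    ```python
    import numpy as np

    def hybrid_predict(cf_scores, cbf_scores, co_rating_count, a=0.15, b=0.35):
        """Dynamic weighted blend of CF and CBF predictions: the CF weight
        grows logarithmically with the number of co-ratings (placeholder
        coefficients), so sparse users lean on CBF and dense users on CF."""
        w_cf = np.clip(a * np.log1p(co_rating_count) + b, 0.0, 1.0)
        return w_cf * cf_scores + (1.0 - w_cf) * cbf_scores

    print(hybrid_predict(np.array([4.2, 3.0]), np.array([3.6, 4.4]),
                         co_rating_count=np.array([2, 40])))
    ```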

  6. Resolution-Adaptive Hybrid MIMO Architectures for Millimeter Wave Communications

    NASA Astrophysics Data System (ADS)

    Choi, Jinseok; Evans, Brian L.; Gatherer, Alan

    2017-12-01

    In this paper, we propose a hybrid analog-digital beamforming architecture with resolution-adaptive ADCs for millimeter wave (mmWave) receivers with large antenna arrays. We adopt array response vectors for the analog combiners and derive ADC bit-allocation (BA) solutions in closed form. The BA solutions reveal that the optimal number of ADC bits is logarithmically proportional to the RF chain's signal-to-noise ratio raised to the 1/3 power. Using the solutions, two proposed BA algorithms minimize the mean square quantization error of received analog signals under a total ADC power constraint. Contributions of this paper include 1) ADC bit-allocation algorithms to improve communication performance of a hybrid MIMO receiver, 2) approximation of the capacity with the BA algorithm as a function of channels, and 3) a worst-case analysis of the ergodic rate of the proposed MIMO receiver that quantifies system tradeoffs and serves as the lower bound. Simulation results demonstrate that the BA algorithms outperform a fixed-ADC approach in both spectral and energy efficiency, and validate the capacity and ergodic rate formula. For a power constraint equivalent to that of fixed 4-bit ADCs, the revised BA algorithm makes the quantization error negligible while achieving 22% better energy efficiency. Having negligible quantization error allows existing state-of-the-art digital beamformers to be readily applied to the proposed system.
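
    The quoted scaling law invites a toy illustration. The allocation below reproduces only the log2(SNR^(1/3)) proportionality and spreads the remaining bit budget uniformly; the rounding, clipping and budget handling are not the paper's constraint treatment.

    ```python
    import numpy as np

    def allocate_adc_bits(snr_per_chain, total_bits):
        """Toy ADC bit allocation: each RF chain receives bits proportional to
        log2(SNR^(1/3)), shifted so the rounded allocation roughly meets a
        total-bit budget (illustrative only)."""
        base = np.log2(np.asarray(snr_per_chain, dtype=float) ** (1.0 / 3.0))
        offset = (total_bits - base.sum()) / len(base)
        return np.clip(np.round(base + offset), 1, None).astype(int)

    print(allocate_adc_bits([1.0, 10.0, 100.0, 1000.0], total_bits=16))
    # e.g. [2, 3, 5, 6]: higher-SNR chains earn more resolution
    ```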

  7. Application of Approximate Pattern Matching in Two Dimensional Spaces to Grid Layout for Biochemical Network Maps

    PubMed Central

    Inoue, Kentaro; Shimozono, Shinichi; Yoshida, Hideaki; Kurata, Hiroyuki

    2012-01-01

    Background: For visualizing large-scale biochemical network maps, it is important to calculate the coordinates of molecular nodes quickly and to enhance the understanding or traceability of them. The grid layout is effective in drawing compact, orderly, balanced network maps with node label spaces, but existing grid layout algorithms often require a high computational cost because they have to consider complicated positional constraints through the entire optimization process. Results: We propose a hybrid grid layout algorithm that consists of a non-grid, fast layout (preprocessor) algorithm and an approximate pattern matching algorithm that distributes the resultant preprocessed nodes on square grid points. To demonstrate the feasibility of the hybrid layout algorithm, it is characterized in terms of the calculation time, numbers of edge-edge and node-edge crossings, relative edge lengths, and F-measures. The proposed algorithm achieves outstanding performances compared with other existing grid layouts. Conclusions: Use of an approximate pattern matching algorithm quickly redistributes the laid-out nodes by fast, non-grid algorithms on the square grid points, while preserving the topological relationships among the nodes. The proposed algorithm is a novel use of the pattern matching, thereby providing a breakthrough for grid layout. This application program can be freely downloaded from http://www.cadlive.jp/hybridlayout/hybridlayout.html. PMID:22679486

  8. Induction as Knowledge Integration

    NASA Technical Reports Server (NTRS)

    Smith, Benjamin D.; Rosenbloom, Paul S.

    1996-01-01

    Two key issues for induction algorithms are the accuracy of the learned hypothesis and the computational resources consumed in inducing that hypothesis. One of the most promising ways to improve performance along both dimensions is to make use of additional knowledge. Multi-strategy learning algorithms tackle this problem by employing several strategies for handling different kinds of knowledge in different ways. However, integrating knowledge into an induction algorithm can be difficult when the new knowledge differs significantly from the knowledge the algorithm already uses. In many cases the algorithm must be rewritten. This paper presents the Knowledge Integration framework for Induction (KII), which provides a uniform mechanism for integrating knowledge into induction. In theory, arbitrary knowledge can be integrated with this mechanism, but in practice the knowledge representation language determines both the knowledge that can be integrated, and the costs of integration and induction. By instantiating KII with various set representations, algorithms can be generated at different trade-off points along these dimensions. One instantiation of KII, called RS-KII, is presented that can implement hybrid induction algorithms, depending on which knowledge it utilizes. RS-KII is demonstrated to implement AQ-11, as well as a hybrid algorithm that utilizes a domain theory and noisy examples. Other algorithms are also possible.

  9. A novel harmony search-K means hybrid algorithm for clustering gene expression data

    PubMed Central

    Nazeer, KA Abdul; Sebastian, MP; Kumar, SD Madhu

    2013-01-01

    Recent progress in bioinformatics research has led to the accumulation of huge quantities of biological data at various data sources. The DNA microarray technology makes it possible to simultaneously analyze large number of genes across different samples. Clustering of microarray data can reveal the hidden gene expression patterns from large quantities of expression data that in turn offers tremendous possibilities in functional genomics, comparative genomics, disease diagnosis and drug development. The k-means clustering algorithm is widely used for many practical applications. But the original k-means algorithm has several drawbacks. It is computationally expensive and generates locally optimal solutions based on the random choice of the initial centroids. Several methods have been proposed in the literature for improving the performance of the k-means algorithm. A meta-heuristic optimization algorithm named harmony search helps find out near-global optimal solutions by searching the entire solution space. Low clustering accuracy of the existing algorithms limits their use in many crucial applications of life sciences. In this paper we propose a novel Harmony Search-K means Hybrid (HSKH) algorithm for clustering the gene expression data. Experimental results show that the proposed algorithm produces clusters with better accuracy in comparison with the existing algorithms. PMID:23390351
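
    The two-stage idea, global harmony search followed by local k-means refinement, can be sketched compactly. Everything below is an illustrative reconstruction: the harmony-memory size, improvisation rates, iteration counts and the synthetic data are placeholders rather than settings from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def sse(X, C):
        """Sum of squared distances from each point to its nearest centroid."""
        return ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1).min(axis=1).sum()

    def hs_kmeans(X, k, hms=10, iters=200, hmcr=0.9, par=0.3):
        """Harmony search explores candidate centroid sets globally, then a few
        Lloyd iterations refine the best harmony locally (sketch)."""
        lo, hi = X.min(0), X.max(0)
        memory = [rng.uniform(lo, hi, size=(k, X.shape[1])) for _ in range(hms)]
        costs = [sse(X, C) for C in memory]
        for _ in range(iters):                      # improvise a new harmony
            new = np.empty((k, X.shape[1]))
            for j in range(k):
                if rng.random() < hmcr:             # take a centroid from memory...
                    new[j] = memory[rng.integers(hms)][j]
                    if rng.random() < par:          # ...with a small pitch adjustment
                        new[j] += 0.05 * (hi - lo) * rng.normal(size=X.shape[1])
                else:                               # ...or draw one at random
                    new[j] = rng.uniform(lo, hi)
            c, worst = sse(X, new), int(np.argmax(costs))
            if c < costs[worst]:
                memory[worst], costs[worst] = new, c
        C = memory[int(np.argmin(costs))]
        for _ in range(20):                         # k-means (Lloyd) refinement
            assign = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1).argmin(1)
            for j in range(k):
                if np.any(assign == j):
                    C[j] = X[assign == j].mean(0)
        return C, assign

    X = np.vstack([rng.normal(m, 0.3, size=(50, 2)) for m in ((0, 0), (3, 3), (0, 3))])
    centroids, labels = hs_kmeans(X, k=3)
    print(centroids.round(2))
    ```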

  10. A novel harmony search-K means hybrid algorithm for clustering gene expression data.

    PubMed

    Nazeer, Ka Abdul; Sebastian, Mp; Kumar, Sd Madhu

    2013-01-01

    Recent progress in bioinformatics research has led to the accumulation of huge quantities of biological data at various data sources. The DNA microarray technology makes it possible to simultaneously analyze large number of genes across different samples. Clustering of microarray data can reveal the hidden gene expression patterns from large quantities of expression data that in turn offers tremendous possibilities in functional genomics, comparative genomics, disease diagnosis and drug development. The k-means clustering algorithm is widely used for many practical applications. But the original k-means algorithm has several drawbacks. It is computationally expensive and generates locally optimal solutions based on the random choice of the initial centroids. Several methods have been proposed in the literature for improving the performance of the k-means algorithm. A meta-heuristic optimization algorithm named harmony search helps find out near-global optimal solutions by searching the entire solution space. Low clustering accuracy of the existing algorithms limits their use in many crucial applications of life sciences. In this paper we propose a novel Harmony Search-K means Hybrid (HSKH) algorithm for clustering the gene expression data. Experimental results show that the proposed algorithm produces clusters with better accuracy in comparison with the existing algorithms.

  11. Application of approximate pattern matching in two dimensional spaces to grid layout for biochemical network maps.

    PubMed

    Inoue, Kentaro; Shimozono, Shinichi; Yoshida, Hideaki; Kurata, Hiroyuki

    2012-01-01

    For visualizing large-scale biochemical network maps, it is important to calculate the coordinates of molecular nodes quickly and to enhance the understanding or traceability of them. The grid layout is effective in drawing compact, orderly, balanced network maps with node label spaces, but existing grid layout algorithms often require a high computational cost because they have to consider complicated positional constraints through the entire optimization process. We propose a hybrid grid layout algorithm that consists of a non-grid, fast layout (preprocessor) algorithm and an approximate pattern matching algorithm that distributes the resultant preprocessed nodes on square grid points. To demonstrate the feasibility of the hybrid layout algorithm, it is characterized in terms of the calculation time, numbers of edge-edge and node-edge crossings, relative edge lengths, and F-measures. The proposed algorithm achieves outstanding performances compared with other existing grid layouts. Use of an approximate pattern matching algorithm quickly redistributes the laid-out nodes by fast, non-grid algorithms on the square grid points, while preserving the topological relationships among the nodes. The proposed algorithm is a novel use of the pattern matching, thereby providing a breakthrough for grid layout. This application program can be freely downloaded from http://www.cadlive.jp/hybridlayout/hybridlayout.html.

  12. Network reliability maximization for stochastic-flow network subject to correlated failures using genetic algorithm and tabu search

    NASA Astrophysics Data System (ADS)

    Yeh, Cheng-Ta; Lin, Yi-Kuei; Yang, Jo-Yun

    2018-07-01

    Network reliability is an important performance index for many real-life systems, such as electric power systems, computer systems and transportation systems. These systems can be modelled as stochastic-flow networks (SFNs) composed of arcs and nodes. Most system supervisors pursue network reliability maximization by finding the optimal multi-state resource assignment, in which one resource is assigned to each arc. However, a disaster may cause correlated failures of the assigned resources, affecting the network reliability. This article focuses on determining the optimal resource assignment with maximal network reliability for SFNs. To solve the problem, this study proposes a hybrid algorithm integrating the genetic algorithm and tabu search to determine the optimal assignment, called the hybrid GA-TS algorithm (HGTA), and integrates minimal paths, the recursive sum of disjoint products and the correlated binomial distribution to calculate network reliability. Several practical numerical experiments are adopted to demonstrate that HGTA has better computational quality than several popular soft computing algorithms.

  13. PSO-Based Smart Grid Application for Sizing and Optimization of Hybrid Renewable Energy Systems

    PubMed Central

    Mohamed, Mohamed A.; Eltamaly, Ali M.; Alolah, Abdulrahman I.

    2016-01-01

    This paper introduces an optimal sizing algorithm for a hybrid renewable energy system using smart grid load management application based on the available generation. This algorithm aims to maximize the system energy production and meet the load demand with minimum cost and highest reliability. This system is formed by photovoltaic array, wind turbines, storage batteries, and diesel generator as a backup source of energy. Demand profile shaping as one of the smart grid applications is introduced in this paper using load shifting-based load priority. Particle swarm optimization is used in this algorithm to determine the optimum size of the system components. The results obtained from this algorithm are compared with those from the iterative optimization technique to assess the adequacy of the proposed algorithm. The study in this paper is performed in some of the remote areas in Saudi Arabia and can be expanded to any similar regions around the world. Numerous valuable results are extracted from this study that could help researchers and decision makers. PMID:27513000

  14. PSO-Based Smart Grid Application for Sizing and Optimization of Hybrid Renewable Energy Systems.

    PubMed

    Mohamed, Mohamed A; Eltamaly, Ali M; Alolah, Abdulrahman I

    2016-01-01

    This paper introduces an optimal sizing algorithm for a hybrid renewable energy system using smart grid load management application based on the available generation. This algorithm aims to maximize the system energy production and meet the load demand with minimum cost and highest reliability. This system is formed by photovoltaic array, wind turbines, storage batteries, and diesel generator as a backup source of energy. Demand profile shaping as one of the smart grid applications is introduced in this paper using load shifting-based load priority. Particle swarm optimization is used in this algorithm to determine the optimum size of the system components. The results obtained from this algorithm are compared with those from the iterative optimization technique to assess the adequacy of the proposed algorithm. The study in this paper is performed in some of the remote areas in Saudi Arabia and can be expanded to any similar regions around the world. Numerous valuable results are extracted from this study that could help researchers and decision makers.

  15. Design and implementation of intelligent electronic warfare decision making algorithm

    NASA Astrophysics Data System (ADS)

    Peng, Hsin-Hsien; Chen, Chang-Kuo; Hsueh, Chi-Shun

    2017-05-01

    Electromagnetic signals and the requirement for timely responses have grown rapidly in modern electronic warfare. Although jammers are limited resources, it is possible to achieve the best electronic warfare efficiency through tactical decisions. This paper proposes an intelligent electronic warfare decision support system. In this work, we develop a novel hybrid algorithm, Digital Pheromone Particle Swarm Optimization, based on Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO) and the Shuffled Frog Leaping Algorithm (SFLA). We use PSO to solve the problem and combine it with the concept of pheromones from ACO to accumulate more useful information during the spatial solving process and to speed up finding the optimal solution. The proposed algorithm finds the optimal solution in reasonable computation time by using the matrix conversion method of SFLA. The results indicated that jammer allocation was more effective. The system based on the hybrid algorithm provides electronic warfare commanders with critical information to assist them in effectively managing the complex electromagnetic battlefield.

  16. Diversified models for portfolio selection based on uncertain semivariance

    NASA Astrophysics Data System (ADS)

    Chen, Lin; Peng, Jin; Zhang, Bo; Rosyida, Isnaini

    2017-02-01

    Since the financial markets are complex, sometimes the future security returns are represented mainly by experts' estimations due to a lack of historical data. This paper proposes a semivariance method for diversified portfolio selection, in which the security returns are given subject to experts' estimations and depicted as uncertain variables. In the paper, three properties of the semivariance of uncertain variables are verified. Based on the concept of semivariance of uncertain variables, two types of mean-semivariance diversified models for uncertain portfolio selection are proposed. Since the models are complex, a hybrid intelligent algorithm based on the 99-method and a genetic algorithm is designed to solve them. In this hybrid intelligent algorithm, the 99-method is applied to compute the expected value and semivariance of uncertain variables, and the genetic algorithm is employed to seek the best allocation plan for portfolio selection. At last, several numerical examples are presented to illustrate the modelling idea and the effectiveness of the algorithm.

  17. Maximization Network Throughput Based on Improved Genetic Algorithm and Network Coding for Optical Multicast Networks

    NASA Astrophysics Data System (ADS)

    Wei, Chengying; Xiong, Cuilian; Liu, Huanlin

    2017-12-01

    Maximal multicast stream algorithms based on network coding (NC) can improve the throughput of wavelength-division multiplexing (WDM) networks, which, however, remains far below the network's theoretical maximum throughput. Moreover, the existing multicast stream algorithms do not provide the information distribution pattern and the routing at the same time. In this paper, an improved genetic algorithm is brought forward to maximize the optical multicast throughput by NC and to determine the multicast stream distribution by hybrid chromosome construction for multicast with a single source and multiple destinations. The proposed hybrid chromosomes are constructed from binary chromosomes and integer chromosomes, where the binary chromosomes represent the optical multicast routing and the integer chromosomes indicate the multicast stream distribution. A fitness function is designed to guarantee that each destination can receive the maximum number of decodable multicast streams. The simulation results show that the proposed method is far superior to typical maximal multicast stream algorithms based on NC in terms of network throughput in WDM networks.

  18. 75 FR 27850 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-18

    ... Change, as Modified by Amendment No. 1 Thereto, Related to the Hybrid Matching Algorithms May 12, 2010... allocation algorithms to choose from when executing incoming electronic orders. The menu format allows the Exchange to utilize different allocation algorithms on a class-by-class basis. The menu includes, among...

  19. Improvements to a five-phase ABS algorithm for experimental validation

    NASA Astrophysics Data System (ADS)

    Gerard, Mathieu; Pasillas-Lépine, William; de Vries, Edwin; Verhaegen, Michel

    2012-10-01

    The anti-lock braking system (ABS) is the most important active safety system for passenger cars. Unfortunately, the literature is not really precise about its description, stability and performance. This research improves a five-phase hybrid ABS control algorithm based on wheel deceleration [W. Pasillas-Lépine, Hybrid modeling and limit cycle analysis for a class of five-phase anti-lock brake algorithms, Veh. Syst. Dyn. 44 (2006), pp. 173-188] and validates it on a tyre-in-the-loop laboratory facility. Five relevant effects are modelled so that the simulation matches the reality: oscillations in measurements, wheel acceleration reconstruction, brake pressure dynamics, brake efficiency changes and tyre relaxation. The time delays in measurement and actuation have been identified as the main difficulty for the initial algorithm to work in practice. Three methods are proposed in order to deal with these delays. It is verified that the ABS limit cycles encircle the optimal braking point, without assuming any tyre parameter being a priori known. The ABS algorithm is compared with the commercial algorithm developed by Bosch.

  20. Image-processing algorithms for inspecting characteristics of hybrid rice seed

    NASA Astrophysics Data System (ADS)

    Cheng, Fang; Ying, Yibin

    2004-03-01

    Incompletely closed glumes, germ and disease are three characteristics of hybrid rice seed. Image-processing algorithms developed to detect these seed characteristics are presented in this paper. The rice seeds used for this study comprised five varieties: Jinyou402, Shanyou10, Zhongyou207, Jiayou and IIyou. The algorithms were tested on a 5*600 image set, a 4*400 image set and another 5*600 image set, respectively. The image sets included black-background images, white-background images and images of both sides of the rice seeds. Results show that the algorithm for inspecting seeds with incompletely closed glumes, based on the Radon transform, achieved an accuracy of 96% for normal seeds, 92% for seeds with fine fissures and 87% for seeds with unclosed glumes; the algorithm for inspecting germinated seeds on the panicle, based on PCA and an ANN, achieved an average accuracy of 98% for normal seeds and 88% for germinated seeds on the panicle; and the algorithm for inspecting diseased seeds, based on color features, achieved an accuracy of 92% for normal and healthy seeds, 95% for spot-diseased seeds and 83% for severely diseased seeds.

  1. A Hybrid Cellular Genetic Algorithm for Multi-objective Crew Scheduling Problem

    NASA Astrophysics Data System (ADS)

    Jolai, Fariborz; Assadipour, Ghazal

    Crew scheduling is one of the important problems of the airline industry. The problem is to assign crew members to a set of flights such that every flight is covered. In a robust schedule, the assignment should minimize total cost, delays, and unbalanced utilization. As the problem is NP-hard and the objectives are in conflict with each other, a multi-objective meta-heuristic called CellDE, which is a hybrid cellular genetic algorithm, is implemented as the optimization method. The proposed algorithm provides the decision maker with a set of non-dominated or Pareto-optimal solutions, and enables them to choose the best one according to their preferences. A set of problems of different sizes is generated and solved using the proposed algorithm. To evaluate the performance of the proposed algorithm, three metrics are suggested, and the diversity and the convergence of the achieved Pareto front are appraised. Finally a comparison is made between CellDE and PAES, another meta-heuristic algorithm. The results show the superiority of CellDE.

  2. A Hybrid CPU/GPU Pattern-Matching Algorithm for Deep Packet Inspection

    PubMed Central

    Chen, Yaw-Chung

    2015-01-01

    The large quantities of data now being transferred via high-speed networks have made deep packet inspection indispensable for security purposes. Scalable and low-cost signature-based network intrusion detection systems have been developed for deep packet inspection for various software platforms. Traditional approaches that only involve central processing units (CPUs) are now considered inadequate in terms of inspection speed. Graphic processing units (GPUs) have superior parallel processing power, but transmission bottlenecks can reduce optimal GPU efficiency. In this paper we describe our proposal for a hybrid CPU/GPU pattern-matching algorithm (HPMA) that divides and distributes the packet-inspecting workload between a CPU and GPU. All packets are initially inspected by the CPU and filtered using a simple pre-filtering algorithm, and packets that might contain malicious content are sent to the GPU for further inspection. Test results indicate that in terms of random payload traffic, the matching speed of our proposed algorithm was 3.4 times and 2.7 times faster than those of the AC-CPU and AC-GPU algorithms, respectively. Further, HPMA achieved higher energy efficiency than the other tested algorithms. PMID:26437335

  3. A Hybrid CPU/GPU Pattern-Matching Algorithm for Deep Packet Inspection.

    PubMed

    Lee, Chun-Liang; Lin, Yi-Shan; Chen, Yaw-Chung

    2015-01-01

    The large quantities of data now being transferred via high-speed networks have made deep packet inspection indispensable for security purposes. Scalable and low-cost signature-based network intrusion detection systems have been developed for deep packet inspection for various software platforms. Traditional approaches that only involve central processing units (CPUs) are now considered inadequate in terms of inspection speed. Graphic processing units (GPUs) have superior parallel processing power, but transmission bottlenecks can reduce optimal GPU efficiency. In this paper we describe our proposal for a hybrid CPU/GPU pattern-matching algorithm (HPMA) that divides and distributes the packet-inspecting workload between a CPU and GPU. All packets are initially inspected by the CPU and filtered using a simple pre-filtering algorithm, and packets that might contain malicious content are sent to the GPU for further inspection. Test results indicate that in terms of random payload traffic, the matching speed of our proposed algorithm was 3.4 times and 2.7 times faster than those of the AC-CPU and AC-GPU algorithms, respectively. Further, HPMA achieved higher energy efficiency than the other tested algorithms.
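    The division of labour described above (a cheap first pass over all packets, a heavier exact matcher only for flagged packets) can be sketched as follows. The signatures, the byte-set pre-filter, and the plain substring "deep" stage are illustrative stand-ins; this is not the authors' HPMA or an actual GPU kernel.

```python
# Minimal sketch of a two-stage inspection pipeline in the spirit of a CPU
# pre-filter plus a heavier exact matcher. Here the "GPU stage" is an ordinary
# exact multi-pattern check; signatures, packets, and rules are assumptions.

SIGNATURES = [b"evil-shell", b"DROP TABLE", b"\x90\x90\x90\x90"]

# Pre-filter: a packet can only match if it contains the first byte of some
# signature; deliberately cheap and over-approximate (may flag benign packets).
FIRST_BYTES = {sig[0] for sig in SIGNATURES}


def prefilter(packet: bytes) -> bool:
    return any(b in FIRST_BYTES for b in packet)


def deep_inspect(packet: bytes) -> bool:
    # Stand-in for the heavyweight (GPU) exact pattern matcher.
    return any(sig in packet for sig in SIGNATURES)


def inspect(packets):
    suspicious = [p for p in packets if prefilter(p)]    # stage 1 on all packets
    return [p for p in suspicious if deep_inspect(p)]    # stage 2 on the survivors


if __name__ == "__main__":
    traffic = [b"hello world", b"please DROP TABLE users;", b"benign payload 123"]
    print(inspect(traffic))
```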

  4. An Evolutionary Algorithm for Fast Intensity Based Image Matching Between Optical and SAR Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Fischer, Peter; Schuegraf, Philipp; Merkle, Nina; Storch, Tobias

    2018-04-01

    This paper presents a hybrid evolutionary algorithm for fast intensity-based matching between satellite imagery from SAR and very high-resolution (VHR) optical sensor systems. The precise and accurate co-registration of image time series and images of different sensors is a key task in multi-sensor image processing scenarios. The necessary preprocessing step of image matching and tie-point detection is divided into a search problem and a similarity measurement. Within this paper we evaluate the use of an evolutionary search strategy for establishing the spatial correspondence between satellite imagery of optical and radar sensors. The aim of the proposed algorithm is to decrease the computational costs during the search process by formulating the search as an optimization problem. Based upon the canonical evolutionary algorithm, the proposed algorithm is adapted for SAR/optical imagery intensity-based matching. Extensions are drawn using techniques like hybridization (e.g. local search) and others to lower the number of objective function calls and refine the result. The algorithm significantly decreases the computational costs whilst finding the optimal solution in a reliable way.
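    As a loose illustration of formulating matching as an evolutionary search, the sketch below runs a (1+lambda)-style strategy over integer translation offsets, scoring candidates by normalized cross-correlation between two synthetic images. The images, offsets, population size, and step schedule are assumptions; the paper's algorithm operates on real SAR/optical data with further hybrid extensions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "reference" and "moving" images; the true offset is (7, -4).
ref = rng.random((200, 200))
true_dy, true_dx = 7, -4
moving = np.roll(np.roll(ref, true_dy, axis=0), true_dx, axis=1)


def ncc(a, b):
    """Normalized cross-correlation of two equally sized images."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def similarity(offset):
    dy, dx = offset
    shifted = np.roll(np.roll(moving, -dy, axis=0), -dx, axis=1)
    return ncc(ref, shifted)


# (1 + lambda) evolution strategy over integer offsets: keep the best parent,
# generate mutated children each generation, shrink the mutation step over time.
parent, best = (0, 0), similarity((0, 0))
step = 16
for gen in range(40):
    children = [(parent[0] + int(rng.integers(-step, step + 1)),
                 parent[1] + int(rng.integers(-step, step + 1))) for _ in range(12)]
    for c in children:
        s = similarity(c)
        if s > best:
            parent, best = c, s
    step = max(1, step // 2) if gen % 5 == 4 else step

print("estimated offset:", parent, "ncc:", round(best, 3))
```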

  5. On the changing contribution of snow to the hydrology of the Fraser River Basin, Canada

    NASA Astrophysics Data System (ADS)

    Dery, S. J.; Kang, D.; Shi, X.; Gao, H.

    2013-12-01

    This talk will present an application of the Variable Infiltration Capacity (VIC) model to the Fraser River Basin (FRB) of British Columbia (BC), Canada over the latter half of the 20th century. The Fraser River is the longest waterway in BC and supports the world's most abundant Pacific Ocean salmon populations. Previous modeling and observational studies have demonstrated that the FRB is a snow-dominated system but with climate change it may evolve to a pluvial regime. Thus the goal of this study is to evaluate the changing contribution of snow to the hydrology of the watershed over the latter half of the 20th century. To this end, a 0.25° atmospheric forcing dataset is used to drive the VIC model from 1948 to 2006 at a daily time step over a domain covering the entire FRB. A model evaluation is first conducted over 11 major sub-watersheds of the FRB to quantitatively assess the spatial variations of snow water equivalent (SWE) and runoff. The ratio of the spatially averaged maximum SWE to runoff (RSR) is used to quantify the contribution of snow to the runoff in the 11 sub-watersheds of interest. From 1948 to 2006, RSR exhibits a significant decreasing trend in 9 of the 11 sub-watersheds (p < 0.05 according to the Mann-Kendall test). Changes in snow accumulation and melt lead to significant advances of the spring freshet throughout the basin. As the climate continues to warm, ecological processes and human usage of natural resources in the FRB may be substantially affected by its transition from a snow to a hybrid (nival/pluvial) and even a rain-dominated watershed.
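    For readers unfamiliar with the trend test mentioned above, the sketch below applies a bare-bones Mann-Kendall test (no tie correction) to a synthetic declining RSR series; the data are made up and only illustrate the kind of significance statement reported.

```python
import numpy as np
from scipy.stats import norm


def mann_kendall(x, alpha=0.05):
    """Two-sided Mann-Kendall trend test (no tie correction, for brevity)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()   # pairwise sign comparisons
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2.0 * (1.0 - norm.cdf(abs(z)))
    return z, p, p < alpha


# Synthetic declining RSR series (ratio of basin-averaged max SWE to runoff),
# one value per year for 1948-2006 -- illustrative data, not the study's.
rng = np.random.default_rng(1)
years = np.arange(1948, 2007)
rsr = 0.6 - 0.003 * (years - years[0]) + rng.normal(0.0, 0.03, size=years.size)

z, p, significant = mann_kendall(rsr)
print(f"Mann-Kendall z = {z:.2f}, p = {p:.4f}, significant decreasing trend: {significant and z < 0}")
```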

  6. An Extended Spectral-Spatial Classification Approach for Hyperspectral Data

    NASA Astrophysics Data System (ADS)

    Akbari, D.

    2017-11-01

    In this paper an extended classification approach for hyperspectral imagery based on both spectral and spatial information is proposed. The spatial information is obtained by an enhanced marker-based minimum spanning forest (MSF) algorithm. Three different methods of dimension reduction are first used to obtain the subspace of hyperspectral data: (1) unsupervised feature extraction methods including principal component analysis (PCA), independent component analysis (ICA), and minimum noise fraction (MNF); (2) supervised feature extraction including decision boundary feature extraction (DBFE), discriminant analysis feature extraction (DAFE), and nonparametric weighted feature extraction (NWFE); (3) genetic algorithm (GA). The spectral features obtained are then fed into the enhanced marker-based MSF classification algorithm. In the enhanced MSF algorithm, the markers are extracted from the classification maps obtained by both SVM and the watershed segmentation algorithm. To evaluate the proposed approach, it is tested on the Pavia University hyperspectral dataset. Experimental results show that the proposed approach using GA achieves approximately 8% higher overall accuracy than the original MSF-based algorithm.
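    A minimal spectral-only baseline for the pipeline described above (dimension reduction followed by SVM classification) can be sketched with scikit-learn as follows; the data are synthetic and the marker-based MSF spatial step is deliberately omitted.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for labelled hyperspectral pixels: 3 classes, 100 bands.
rng = np.random.default_rng(0)
n_per_class, n_bands = 200, 100
means = rng.normal(0, 1, size=(3, n_bands))
X = np.vstack([m + 0.5 * rng.normal(size=(n_per_class, n_bands)) for m in means])
y = np.repeat([0, 1, 2], n_per_class)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Spectral pipeline: standardize -> PCA to a small subspace -> RBF-SVM.
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf", C=10, gamma="scale"))
clf.fit(X_train, y_train)
print("overall accuracy:", round(clf.score(X_test, y_test), 3))
```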

  7. Stand-alone hybrid wind-photovoltaic power generation systems optimal sizing

    NASA Astrophysics Data System (ADS)

    Crǎciunescu, Aurelian; Popescu, Claudia; Popescu, Mihai; Florea, Leonard Marin

    2013-10-01

    Wind and photovoltaic energy resources have attracted energy sectors to generate power on a large scale. A drawback, common to these options, is their unpredictable nature and dependence on day time and meteorological conditions. Fortunately, the problems caused by the variable nature of these resources can be partially overcome by integrating the two resources in proper combination, using the strengths of one source to overcome the weakness of the other. Hybrid systems that combine wind and solar generating units with battery backup can attenuate their individual fluctuations and can match the power requirements of the beneficiaries. In order to utilize the hybrid energy system efficiently and economically, an optimal sizing method is necessary. The literature offers a variety of methods for multi-objective optimal design of hybrid wind/photovoltaic (WG/PV) generating systems, among the most recent being genetic algorithms (GA) and particle swarm optimization (PSO). In this paper, mathematical models of hybrid WG/PV components and a short description of recently proposed multi-objective optimization algorithms are given.

  8. Hybrid DFP-CG method for solving unconstrained optimization problems

    NASA Astrophysics Data System (ADS)

    Osman, Wan Farah Hanan Wan; Asrul Hery Ibrahim, Mohd; Mamat, Mustafa

    2017-09-01

    The conjugate gradient (CG) method and the quasi-Newton method are both well-known methods for solving unconstrained optimization problems. In this paper, we propose a new method that combines the search directions of the conjugate gradient and quasi-Newton methods, building on the BFGS-CG method developed by Ibrahim et al. The Davidon-Fletcher-Powell (DFP) update formula is used as an approximation of the Hessian for this new hybrid algorithm. Numerical results show that the new algorithm performs better than the ordinary DFP method and is proven to possess both the sufficient descent and global convergence properties.
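    The DFP update referred to above can be illustrated with a compact quasi-Newton loop on a small quadratic, where the approximate inverse Hessian supplies the search direction. This is a generic DFP sketch, not the hybrid DFP-CG direction proposed in the paper; the test function and line search are assumptions.

```python
import numpy as np

# Minimize f(x) = 0.5 x^T A x - b^T x with a DFP quasi-Newton iteration.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])

f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x = np.zeros(2)
H = np.eye(2)                      # approximation of the inverse Hessian
g = grad(x)
for _ in range(50):
    d = -H @ g                     # quasi-Newton search direction
    # Simple backtracking (Armijo) line search on f.
    t = 1.0
    while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
        t *= 0.5
    x_new = x + t * d
    g_new = grad(x_new)
    s, y_vec = x_new - x, g_new - g
    if abs(s @ y_vec) > 1e-12:
        # DFP update: H <- H + s s^T / (s^T y) - (H y)(H y)^T / (y^T H y)
        Hy = H @ y_vec
        H = H + np.outer(s, s) / (s @ y_vec) - np.outer(Hy, Hy) / (y_vec @ Hy)
    x, g = x_new, g_new
    if np.linalg.norm(g) < 1e-8:
        break

print("minimizer:", x, "expected:", np.linalg.solve(A, b))
```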

  9. Hybrid glowworm swarm optimization for task scheduling in the cloud environment

    NASA Astrophysics Data System (ADS)

    Zhou, Jing; Dong, Shoubin

    2018-06-01

    In recent years many heuristic algorithms have been proposed to solve task scheduling problems in the cloud environment owing to their optimization capability. This article proposes a hybrid glowworm swarm optimization (HGSO) based on glowworm swarm optimization (GSO), which uses a technique of evolutionary computation, a strategy of quantum behaviour based on the principle of neighbourhood, offspring production and random walk, to achieve more efficient scheduling with reasonable scheduling costs. The proposed HGSO reduces the redundant computation and the dependence on the initialization of GSO, accelerates the convergence and more easily escapes from local optima. The conducted experiments and statistical analysis showed that in most cases the proposed HGSO algorithm outperformed previous heuristic algorithms in dealing with independent tasks.

  10. Multidirectional hybrid algorithm for the split common fixed point problem and application to the split common null point problem.

    PubMed

    Li, Xia; Guo, Meifang; Su, Yongfu

    2016-01-01

    In this article, a new multidirectional monotone hybrid iteration algorithm for finding a solution to the split common fixed point problem is presented for two countable families of quasi-nonexpansive mappings in Banach spaces. Strong convergence theorems are proved. The result is applied to the split common null point problem of maximal monotone operators in Banach spaces, and strong convergence theorems for finding a solution of the split common null point problem are derived. This iteration algorithm can accelerate the convergence speed of the iterative sequence. The results of this paper improve and extend the recent results of Takahashi and Yao (Fixed Point Theory Appl 2015:87, 2015) and many others.

  11. Active mask segmentation of fluorescence microscope images.

    PubMed

    Srinivasa, Gowri; Fickus, Matthew C; Guo, Yusong; Linstedt, Adam D; Kovacević, Jelena

    2009-08-01

    We propose a new active mask algorithm for the segmentation of fluorescence microscope images of punctate patterns. It combines the (a) flexibility offered by active-contour methods, (b) speed offered by multiresolution methods, (c) smoothing offered by multiscale methods, and (d) statistical modeling offered by region-growing methods into a fast and accurate segmentation tool. The framework moves from the idea of the "contour" to that of "inside and outside," or masks, allowing for easy multidimensional segmentation. It adapts to the topology of the image through the use of multiple masks. The algorithm is almost invariant under initialization, allowing for random initialization, and uses a few easily tunable parameters. Experiments show that the active mask algorithm matches the ground truth well and outperforms seeded watershed, the algorithm widely used in fluorescence microscopy, both qualitatively and quantitatively.

  12. Air traffic surveillance and control using hybrid estimation and protocol-based conflict resolution

    NASA Astrophysics Data System (ADS)

    Hwang, Inseok

    The continued growth of air travel and recent advances in new technologies for navigation, surveillance, and communication have led to proposals by the Federal Aviation Administration (FAA) to provide reliable and efficient tools to aid Air Traffic Control (ATC) in performing their tasks. In this dissertation, we address four problems frequently encountered in air traffic surveillance and control: multiple target tracking and identity management, conflict detection, conflict resolution, and safety verification. We develop a set of algorithms and tools to aid ATC; these algorithms have the provable properties of safety, computational efficiency, and convergence. Firstly, we develop a multiple-maneuvering-target tracking and identity management algorithm which can keep track of maneuvering aircraft in noisy environments and of their identities. Secondly, we propose a hybrid probabilistic conflict detection algorithm between multiple aircraft which uses flight mode estimates as well as aircraft current state estimates. Our algorithm is based on hybrid models of aircraft, which incorporate both continuous dynamics and discrete mode switching. Thirdly, we develop an algorithm for multiple (greater than two) aircraft conflict avoidance that is based on a closed-form analytic solution and thus provides guarantees of safety. Finally, we consider the problem of safety verification of control laws for safety critical systems, with application to air traffic control systems. We approach safety verification through reachability analysis, which is a computationally expensive problem. We develop an over-approximate method for reachable set computation using polytopic approximation methods and dynamic optimization. These algorithms may be used either in a fully autonomous way, or as supporting tools to increase controllers' situational awareness and to reduce their work load.

  13. Evaluation of hybrid SART  +  OS  +  TV iterative reconstruction algorithm for optical-CT gel dosimeter imaging

    NASA Astrophysics Data System (ADS)

    Du, Yi; Wang, Xiangang; Xiang, Xincheng; Wei, Zhouping

    2016-12-01

    Optical computed tomography (optical-CT) is a high-resolution, fast, and easily accessible readout modality for gel dosimeters. This paper evaluates a hybrid iterative image reconstruction algorithm for optical-CT gel dosimeter imaging, namely, the simultaneous algebraic reconstruction technique (SART) integrated with ordered subsets (OS) iteration and total variation (TV) minimization regularization. The mathematical theory and implementation workflow of the algorithm are detailed. Experiments on two different optical-CT scanners were performed for cross-platform validation. For algorithm evaluation, the iterative convergence is first shown, and peak-to-noise-ratio (PNR) and contrast-to-noise ratio (CNR) results are given with the cone-beam filtered backprojection (FDK) algorithm and the FDK results followed by median filtering (mFDK) as reference. The effect on spatial gradients and reconstruction artefacts is also investigated. The PNR curve illustrates that the results of SART  +  OS  +  TV finally converge to those of FDK but with less noise, which implies that the dose-OD calibration method for FDK is also applicable to the proposed algorithm. The CNR in selected regions-of-interest (ROIs) of SART  +  OS  +  TV results is almost double that of FDK and 50% higher than that of mFDK. The artefacts in SART  +  OS  +  TV results are still visible, but have been much suppressed with little spatial gradient loss. Based on the assessment, we can conclude that this hybrid SART  +  OS  +  TV algorithm outperforms both FDK and mFDK in denoising, preserving spatial dose gradients and reducing artefacts, and its effectiveness and efficiency are platform independent.

  14. Hybridization of decomposition and local search for multiobjective optimization.

    PubMed

    Ke, Liangjun; Zhang, Qingfu; Battiti, Roberto

    2014-10-01

    Combining ideas from evolutionary algorithms, decomposition approaches, and Pareto local search, this paper suggests a simple yet efficient memetic algorithm for combinatorial multiobjective optimization problems: memetic algorithm based on decomposition (MOMAD). It decomposes a combinatorial multiobjective problem into a number of single objective optimization problems using an aggregation method. MOMAD evolves three populations: 1) population P(L) for recording the current solution to each subproblem; 2) population P(P) for storing starting solutions for Pareto local search; and 3) an external population P(E) for maintaining all the nondominated solutions found so far during the search. A problem-specific single objective heuristic can be applied to these subproblems to initialize the three populations. At each generation, a Pareto local search method is first applied to search a neighborhood of each solution in P(P) to update P(L) and P(E). Then a single objective local search is applied to each perturbed solution in P(L) for improving P(L) and P(E), and reinitializing P(P). The procedure is repeated until a stopping condition is met. MOMAD provides a generic hybrid multiobjective algorithmic framework in which problem specific knowledge, well developed single objective local search and heuristics and Pareto local search methods can be hybridized. It is a population based iterative method and thus an anytime algorithm. Extensive experiments have been conducted in this paper to study MOMAD and compare it with some other state-of-the-art algorithms on the multiobjective traveling salesman problem and the multiobjective knapsack problem. The experimental results show that our proposed algorithm outperforms or performs similarly to the best so far heuristics on these two problems.

  15. Fast-SG: an alignment-free algorithm for hybrid assembly.

    PubMed

    Di Genova, Alex; Ruz, Gonzalo A; Sagot, Marie-France; Maass, Alejandro

    2018-05-01

    Long-read sequencing technologies are the ultimate solution for genome repeats, allowing near reference-level reconstructions of large genomes. However, long-read de novo assembly pipelines are computationally intense and require a considerable amount of coverage, thereby hindering their broad application to the assembly of large genomes. Alternatively, hybrid assembly methods that combine short- and long-read sequencing technologies can reduce the time and cost required to produce de novo assemblies of large genomes. Here, we propose a new method, called Fast-SG, that uses a new ultrafast alignment-free algorithm specifically designed for constructing a scaffolding graph using light-weight data structures. Fast-SG can construct the graph from either short or long reads. This allows the reuse of efficient algorithms designed for short-read data and permits the definition of novel modular hybrid assembly pipelines. Using comprehensive standard datasets and benchmarks, we show how Fast-SG outperforms the state-of-the-art short-read aligners when building the scaffolding graph and can be used to extract linking information from either raw or error-corrected long reads. We also show how a hybrid assembly approach using Fast-SG with shallow long-read coverage (5X) and moderate computational resources can produce long-range and accurate reconstructions of the genomes of Arabidopsis thaliana (Ler-0) and human (NA12878). Fast-SG opens a door to achieve accurate hybrid long-range reconstructions of large genomes with low effort, high portability, and low cost.

  16. Image Reconstruction for Hybrid True-Color Micro-CT

    PubMed Central

    Xu, Qiong; Yu, Hengyong; Bennett, James; He, Peng; Zainon, Rafidah; Doesburg, Robert; Opie, Alex; Walsh, Mike; Shen, Haiou; Butler, Anthony; Butler, Phillip; Mou, Xuanqin; Wang, Ge

    2013-01-01

    X-ray micro-CT is an important imaging tool for biomedical researchers. Our group has recently proposed a hybrid “true-color” micro-CT system to improve contrast resolution with lower system cost and radiation dose. The system incorporates an energy-resolved photon-counting true-color detector into a conventional micro-CT configuration, and can be used for material decomposition. In this paper, we demonstrate an interior color-CT image reconstruction algorithm developed for this hybrid true-color micro-CT system. A compressive sensing-based statistical interior tomography method is employed to reconstruct each channel in the local spectral imaging chain, where the reconstructed global gray-scale image from the conventional imaging chain served as the initial guess. Principal component analysis was used to map the spectral reconstructions into the color space. The proposed algorithm was evaluated by numerical simulations, physical phantom experiments, and animal studies. The results confirm the merits of the proposed algorithm, and demonstrate the feasibility of the hybrid true-color micro-CT system. Additionally, a “color diffusion” phenomenon was observed whereby high-quality true-color images are produced not only inside the region of interest, but also in neighboring regions. It appears that harnessing this phenomenon could potentially reduce the color detector size for a given ROI, further reducing system cost and radiation dose. PMID:22481806

  17. A Hybrid Metaheuristic DE/CS Algorithm for UCAV Three-Dimension Path Planning

    PubMed Central

    Wang, Gaige; Guo, Lihong; Duan, Hong; Wang, Heqi; Liu, Luo; Shao, Mingzhen

    2012-01-01

    Three-dimension path planning for an uninhabited combat air vehicle (UCAV) is a complicated high-dimension optimization problem, which primarily centres on optimizing the flight route while considering different kinds of constraints in complicated battlefield environments. A new hybrid metaheuristic differential evolution (DE) and cuckoo search (CS) algorithm is proposed to solve the UCAV three-dimension path planning problem. DE is applied to optimize the process of selecting cuckoos of the improved CS model during cuckoo updating in the nest. The cuckoos act as agents in searching for the optimal UCAV path. The UCAV can then find a safe path by connecting the chosen coordinate nodes while avoiding the threat areas and minimizing fuel cost. This new approach can accelerate the global convergence speed while preserving the strong robustness of the basic CS. The realization procedure for this hybrid metaheuristic DE/CS approach is also presented. In order to make the optimized UCAV path more feasible, the B-Spline curve is adopted for smoothing the path. To prove the performance of the proposed hybrid metaheuristic method, it is compared with the basic CS algorithm. The experiments show that the proposed approach is more effective and feasible in UCAV three-dimension path planning than the basic CS model. PMID:23193383

  18. A hybrid metaheuristic DE/CS algorithm for UCAV three-dimension path planning.

    PubMed

    Wang, Gaige; Guo, Lihong; Duan, Hong; Wang, Heqi; Liu, Luo; Shao, Mingzhen

    2012-01-01

    Three-dimension path planning for an uninhabited combat air vehicle (UCAV) is a complicated high-dimension optimization problem, which primarily centres on optimizing the flight route while considering different kinds of constraints in complicated battlefield environments. A new hybrid metaheuristic differential evolution (DE) and cuckoo search (CS) algorithm is proposed to solve the UCAV three-dimension path planning problem. DE is applied to optimize the process of selecting cuckoos of the improved CS model during cuckoo updating in the nest. The cuckoos act as agents in searching for the optimal UCAV path. The UCAV can then find a safe path by connecting the chosen coordinate nodes while avoiding the threat areas and minimizing fuel cost. This new approach can accelerate the global convergence speed while preserving the strong robustness of the basic CS. The realization procedure for this hybrid metaheuristic DE/CS approach is also presented. In order to make the optimized UCAV path more feasible, the B-Spline curve is adopted for smoothing the path. To prove the performance of the proposed hybrid metaheuristic method, it is compared with the basic CS algorithm. The experiments show that the proposed approach is more effective and feasible in UCAV three-dimension path planning than the basic CS model.

  19. A short-term and high-resolution distribution system load forecasting approach using support vector regression with hybrid parameters optimization

    DOE PAGES

    Jiang, Huaiguang; Zhang, Yingchen; Muljadi, Eduard; ...

    2016-01-01

    This paper proposes an approach for distribution system load forecasting, which aims to provide highly accurate short-term load forecasting with high resolution utilizing a support vector regression (SVR) based forecaster and a two-step hybrid parameters optimization method. Specifically, because the load profiles in distribution systems contain abrupt deviations, a data normalization is designed as the pretreatment for the collected historical load data. Then an SVR model is trained by the load data to forecast the future load. For better performance of SVR, a two-step hybrid optimization algorithm is proposed to determine the best parameters. In the first step of the hybrid optimization algorithm, a designed grid traverse algorithm (GTA) is used to narrow the parameters searching area from a global to local space. In the second step, based on the result of the GTA, particle swarm optimization (PSO) is used to determine the best parameters in the local parameter space. After the best parameters are determined, the SVR model is used to forecast the short-term load deviation in the distribution system. The performance of the proposed approach is compared to some classic methods in later sections of the paper.
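    The two-step idea (a coarse grid pass to localize the parameter region, then particle swarm refinement inside it) can be sketched with scikit-learn's SVR on synthetic data as follows. The grid ranges, swarm settings, and toy signal are assumptions; this is not the authors' GTA/PSO implementation.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)     # synthetic "load" signal


def score(log_c, log_g):
    """3-fold CV score (negative MSE) of an SVR with the given log10 parameters."""
    model = SVR(C=10.0 ** log_c, gamma=10.0 ** log_g)
    return cross_val_score(model, X, y, cv=3, scoring="neg_mean_squared_error").mean()


# Step 1: coarse grid traverse over log10(C) and log10(gamma).
grid = [(lc, lg) for lc in np.linspace(-2, 3, 6) for lg in np.linspace(-3, 1, 5)]
best_lc, best_lg = max(grid, key=lambda p: score(*p))

# Step 2: small particle swarm refinement in a +/- 0.5 box around the grid best.
n_particles, iters = 8, 15
pos = np.column_stack([rng.uniform(best_lc - 0.5, best_lc + 0.5, n_particles),
                       rng.uniform(best_lg - 0.5, best_lg + 0.5, n_particles)])
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([score(*p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()
for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([score(*p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print("best log10(C), log10(gamma):", gbest)
```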

  20. A New DEM Generalization Method Based on Watershed and Tree Structure

    PubMed Central

    Chen, Yonggang; Ma, Tianwu; Chen, Xiaoyin; Chen, Zhende; Yang, Chunju; Lin, Chenzhi; Shan, Ligang

    2016-01-01

    DEM generalization is the basis of multi-dimensional observation and of expressing and analyzing terrain; it is also the core of building the Multi-Scale Geographic Database. Thus, many researchers have studied both the theory and the methods of DEM generalization. This paper proposes a new method of generalizing terrain that extracts feature points based on a tree-model construction which considers the nested relationship of watershed characteristics. The paper used the 5 m resolution DEM of the Jiuyuan gully watersheds in the Loess Plateau as the original data and extracted the feature points in every single watershed to reconstruct the DEM. Generalization from the 1:10000 DEM to a 1:50000 DEM was achieved by computing the best threshold, which is 0.06. In the last part of the paper, the height accuracy of the generalized DEM is analyzed by comparing it with some other classic methods, such as aggregation, resampling, and VIP, based on the original 1:50000 DEM. The outcome shows that the method performed well. The method can choose the best threshold according to the target generalization scale to decide the density of the feature points in the watershed. Meanwhile, this method can preserve the skeleton of the terrain, which can meet the needs of different levels of generalization. Additionally, through overlapped contour contrast, elevation statistical parameters, and slope and aspect analysis, we found that the W8D algorithm performed well and effectively in terrain representation. PMID:27517296

  1. Assessing Habitat Suitability at Multiple Scales: A Landscape-Level Approach

    Treesearch

    Kurt H. Riitters; R.V. O' Neill; K.B. Jones

    1997-01-01

    The distribution and abundance of many plants and animals are influenced by the spatial arrangement of suitable habitats across landscapes. We derived habitat maps from a digital land cover map of the ~178,000 km2 Chesapeake Bay Watershed by using a spatial filtering algorithm. The regional amounts and patterns of habitats were different for...

  2. Benchmark of Client and Server-Side Catchment Delineation Approaches on Web-Based Systems

    NASA Astrophysics Data System (ADS)

    Demir, I.; Sermet, M. Y.; Sit, M. A.

    2016-12-01

    Recent advances in internet and cyberinfrastructure technologies have provided the capability to acquire large scale spatial data from various gauges and sensor networks. The collection of environmental data increased demand for applications which are capable of managing and processing large-scale and high-resolution data sets. With the amount and resolution of data sets provided, one of the challenging tasks for organizing and customizing hydrological data sets is delineation of watersheds on demand. Watershed delineation is a process for creating a boundary that represents the contributing area for a specific control point or water outlet, with intent of characterization and analysis of portions of a study area. Although many GIS tools and software for watershed analysis are available on desktop systems, there is a need for web-based and client-side techniques for creating a dynamic and interactive environment for exploring hydrological data. In this project, we demonstrated several watershed delineation techniques on the web with various techniques implemented on the client-side using JavaScript and WebGL, and on the server-side using Python and C++. We also developed a client-side GPGPU (General Purpose Graphical Processing Unit) algorithm to analyze high-resolution terrain data for watershed delineation which allows parallelization using GPU. The web-based real-time analysis of watershed segmentation can be helpful for decision-makers and interested stakeholders while eliminating the need to install complex software packages and deal with large-scale data sets. Utilization of client-side hardware resources also reduces the need for servers due to its crowdsourcing nature. Our goal for future work is to improve other hydrologic analysis methods such as rain flow tracking by adapting the presented approaches.
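    A server-side flavour of the delineation step can be sketched in a few lines of NumPy: compute D8 flow directions on a DEM, then collect every cell whose flow path reaches a chosen outlet. The synthetic DEM and outlet are illustrative assumptions; the client-side WebGL/GPGPU versions described above parallelize the same idea.

```python
import numpy as np
from collections import deque

# Small synthetic DEM: a tilted plane with a superimposed valley (illustrative).
ny, nx = 40, 40
yy, xx = np.mgrid[0:ny, 0:nx]
dem = 0.05 * yy + 0.02 * (xx - nx / 2) ** 2 / nx   # drains toward row 0, centre column

# D8: for every cell, find the neighbour with the steepest downhill drop.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
flow_to = -np.ones((ny, nx, 2), dtype=int)   # (row, col) of receiver, or (-1, -1) for pits/outlets
for i in range(ny):
    for j in range(nx):
        best_drop, best = 0.0, None
        for di, dj in OFFSETS:
            ii, jj = i + di, j + dj
            if 0 <= ii < ny and 0 <= jj < nx:
                drop = (dem[i, j] - dem[ii, jj]) / np.hypot(di, dj)
                if drop > best_drop:
                    best_drop, best = drop, (ii, jj)
        if best is not None:
            flow_to[i, j] = best

# Invert the flow graph and collect every cell draining to the chosen outlet.
upstream = {}
for i in range(ny):
    for j in range(nx):
        r = tuple(flow_to[i, j])
        if r != (-1, -1):
            upstream.setdefault(r, []).append((i, j))

outlet = (0, nx // 2)
watershed = {outlet}
queue = deque([outlet])
while queue:
    cell = queue.popleft()
    for donor in upstream.get(cell, []):
        if donor not in watershed:
            watershed.add(donor)
            queue.append(donor)

print(f"cells draining to outlet {outlet}: {len(watershed)} of {ny * nx}")
```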

  3. Simulation of semi-arid hydrological processes at different spatial resolutions using the AgroEcoSystem-Watershed (AgES-W) model

    NASA Astrophysics Data System (ADS)

    Green, T. R.; Erksine, R. H.; David, O.; Ascough, J. C., II; Kipka, H.; Lloyd, W. J.; McMaster, G. S.

    2015-12-01

    Water movement and storage within a watershed may be simulated at different spatial resolutions of land areas or hydrological response units (HRUs). Here, effects of HRU size on simulated soil water and surface runoff are tested using the AgroEcoSystem-Watershed (AgES-W) model with three different resolutions of HRUs. We studied a 56-ha agricultural watershed in northern Colorado, USA farmed primarily under a wheat-fallow rotation. The delineation algorithm was based upon topography (surface flow paths), land use (crop management strips and native grass), and mapped soil units (three types), which produced HRUs that follow the land use and soil boundaries. AgES-W model parameters that control surface and subsurface hydrology were calibrated using simulated daily soil moisture at different landscape positions and depths where soil moisture was measured hourly and averaged up to daily values. Parameter sets were both uniform and spatially variable with depth and across the watershed (5 different calibration approaches). Although forward simulations were computationally efficient (less than 1 minute each), each calibration required thousands of model runs. Execution of such large jobs was facilitated by using the Object Modeling System with the Cloud Services Innovation Platform to manage four virtual machines on a commercial web service configured with a total of 64 computational cores and 120 GB of memory. Results show how spatially distributed and averaged soil moisture and runoff at the outlet vary with different HRU delineations. The results will help guide HRU delineation, spatial resolution and parameter estimation methods for improved hydrological simulations in this and other semi-arid agricultural watersheds.

  4. A constraint-based evolutionary learning approach to the expectation maximization for optimal estimation of the hidden Markov model for speech signal modeling.

    PubMed

    Huda, Shamsul; Yearwood, John; Togneri, Roberto

    2009-02-01

    This paper attempts to overcome the tendency of the expectation-maximization (EM) algorithm to locate a local rather than global maximum when applied to estimate the hidden Markov model (HMM) parameters in speech signal modeling. We propose a hybrid algorithm for estimation of the HMM in automatic speech recognition (ASR) using a constraint-based evolutionary algorithm (EA) and EM, the CEL-EM. The novelty of our hybrid algorithm (CEL-EM) is that it is applicable for estimation of the constraint-based models with many constraints and large numbers of parameters (which use EM) like HMM. Two constraint-based versions of the CEL-EM with different fusion strategies have been proposed using a constraint-based EA and the EM for better estimation of HMM in ASR. The first one uses a traditional constraint-handling mechanism of EA. The other version transforms a constrained optimization problem into an unconstrained problem using Lagrange multipliers. Fusion strategies for the CEL-EM use a staged-fusion approach where EM has been plugged with the EA periodically after the execution of EA for a specific period of time to maintain the global sampling capabilities of EA in the hybrid algorithm. A variable initialization approach (VIA) has been proposed using a variable segmentation to provide a better initialization for EA in the CEL-EM. Experimental results on the TIMIT speech corpus show that CEL-EM obtains higher recognition accuracies than the traditional EM algorithm as well as a top-standard EM (VIA-EM, constructed by applying the VIA to EM).

  5. Hybrid simulation combining two space-time discretization of the discrete-velocity Boltzmann equation

    NASA Astrophysics Data System (ADS)

    Horstmann, Jan Tobias; Le Garrec, Thomas; Mincu, Daniel-Ciprian; Lévêque, Emmanuel

    2017-11-01

    Despite the efficiency and low dissipation of the stream-collide scheme of the discrete-velocity Boltzmann equation, which is nowadays implemented in many lattice Boltzmann solvers, it has a major drawback compared with alternative discretization schemes, i.e. finite-volume or finite-difference methods: it is limited to uniform Cartesian grids. In this paper, an algorithm is presented that combines the positive features of each scheme in a hybrid lattice Boltzmann method. In particular, the node-based streaming of the distribution functions is coupled with a second-order finite-volume discretization of the advection term of the Boltzmann equation under the Bhatnagar-Gross-Krook approximation. The algorithm is established on a multi-domain configuration, with the individual schemes being solved on separate sub-domains and connected by an overlapping interface of at least 2 grid cells. A critical parameter in the coupling is the CFL number equal to unity, which is imposed by the stream-collide algorithm. Nevertheless, a semi-implicit treatment of the collision term in the finite-volume formulation allows us to obtain a stable solution for this condition. The algorithm is validated in the scope of three different test cases on a 2D periodic mesh. It is shown that the accuracy of the combined discretization schemes agrees with the order of each separate scheme involved. The overall numerical error of the hybrid algorithm in the macroscopic quantities is contained between the error of the two individual algorithms. Finally, we demonstrate how such a coupling can be used to adapt to anisotropic flows with some gradual mesh refinement in the FV domain.

  6. Identification of homogeneous regions for regionalization of watersheds by two-level self-organizing feature maps

    NASA Astrophysics Data System (ADS)

    Farsadnia, F.; Rostami Kamrood, M.; Moghaddam Nia, A.; Modarres, R.; Bray, M. T.; Han, D.; Sadatinejad, J.

    2014-02-01

    One of the several methods for estimating flood quantiles in ungauged or data-scarce watersheds is regional frequency analysis. Amongst the approaches to regional frequency analysis, different clustering techniques have been proposed in the literature to determine hydrologically homogeneous regions. Recently, the Self-Organizing feature Map (SOM), a modern hydroinformatic tool, has been applied in several studies for clustering watersheds. However, further studies are still needed with SOM on the interpretation of the SOM output map for identifying hydrologically homogeneous regions. In this study, two-level SOM and three clustering methods (fuzzy c-mean, K-mean, and Ward's agglomerative hierarchical clustering) are applied in an effort to identify hydrologically homogeneous regions in Mazandaran province watersheds in the north of Iran, and their results are compared with each other. First, the SOM is used to form a two-dimensional feature map. Next, the output nodes of the SOM are clustered by using the unified distance matrix algorithm and the three clustering methods to form regions for flood frequency analysis. The heterogeneity test indicates the four regions achieved by the two-level SOM and Ward approach after adjustments are sufficiently homogeneous. The results suggest that the combination of SOM and Ward is much better than the combination of either SOM and FCM or SOM and K-mean.

  7. Hybrid Architectures for Evolutionary Computing Algorithms

    DTIC Science & Technology

    2008-01-01

    Mapping genetic and other EC algorithms to FPGA cores (Burns, P1026/MAPLD 2005). Genetic algorithm hardware references include S. Scott, A. Samal, and S. Seth, "HGA: A Hardware Based Genetic Algorithm," and the Proceedings of the Symposium on Parallel and Distributed Processing (IPPS/SPDP 1998), pp. 316-320, IEEE Computer Society, 1998.

  8. Multi-period project portfolio selection under risk considerations and stochastic income

    NASA Astrophysics Data System (ADS)

    Tofighian, Ali Asghar; Moezzi, Hamid; Khakzar Barfuei, Morteza; Shafiee, Mahmood

    2018-02-01

    This paper deals with the multi-period project portfolio selection problem. In this problem, the available budget is invested in the best portfolio of projects in each period such that the net profit is maximized. We also consider more realistic assumptions to cover a wider range of applications than those reported in previous studies. A novel mathematical model is presented to solve the problem, considering risks, stochastic incomes, and the possibility of investing extra budget in each time period. Due to the complexity of the problem, an effective meta-heuristic method hybridized with a local search procedure is presented to solve the problem. The algorithm is based on a genetic algorithm (GA), which is a prominent method for this type of problem. The GA is enhanced by a new solution representation and well-selected operators. It is also hybridized with a local search mechanism to obtain better solutions in a shorter time. The performance of the proposed algorithm is then compared with well-known algorithms, such as the basic genetic algorithm (GA), particle swarm optimization (PSO), and the electromagnetism-like algorithm (EM-like), by means of some prominent indicators. The computation results show the superiority of the proposed algorithm in terms of accuracy, robustness and computation time. Finally, the proposed algorithm is combined with PSO to improve the computation time considerably.

  9. Load Frequency Control of a Two-Area Thermal-Hybrid Power System Using a Novel Quasi-Opposition Harmony Search Algorithm

    NASA Astrophysics Data System (ADS)

    Mahto, Tarkeshwar; Mukherjee, V.

    2016-09-01

    In the present work, a two-area thermal-hybrid interconnected power system, consisting of a thermal unit in one area and a hybrid wind-diesel unit in the other area, is considered. Capacitive energy storage (CES) and CES with a static synchronous series compensator (SSSC) are connected to the studied two-area model to compensate for varying load demand, intermittent output power and area frequency oscillation. A novel quasi-opposition harmony search (QOHS) algorithm is proposed and applied to tune the various tunable parameters of the studied power system model. The simulation study reveals that inclusion of a CES unit in both areas yields superb damping performance for frequency and tie-line power deviations. The simulation results further reveal that inclusion of the SSSC is not viable from either a technical or an economical point of view, as no considerable improvement in transient performance is observed when it is included in the tie-line of the studied power system model. The results presented in this paper demonstrate the potential of the proposed QOHS algorithm and show its effectiveness and robustness for solving frequency and power drift problems of the studied power systems. A binary-coded genetic algorithm is used for the sake of comparison.

  10. Hybrid light transport model based bioluminescence tomography reconstruction for early gastric cancer detection

    NASA Astrophysics Data System (ADS)

    Chen, Xueli; Liang, Jimin; Hu, Hao; Qu, Xiaochao; Yang, Defu; Chen, Duofang; Zhu, Shouping; Tian, Jie

    2012-03-01

    Gastric cancer is the second leading cause of cancer-related death in the world, and it remains difficult to cure because it is usually at a late stage by the time it is found. Early gastric cancer detection is therefore an effective approach to decrease gastric cancer mortality. Bioluminescence tomography (BLT) has been applied to detect early liver cancer and prostate cancer metastasis. However, gastric cancer commonly originates from the gastric mucosa and grows outwards, so the bioluminescent light passes through a non-scattering region formed by the gastric pouch as it is transported through the tissues. Thus, current BLT reconstruction algorithms based on approximation models of the radiative transfer equation are not well suited to this problem. To address this gastric cancer specific problem, this paper presents a novel reconstruction algorithm that uses a hybrid light transport model to describe bioluminescent light propagation in tissues. Radiosity theory is integrated with the diffusion equation to form the hybrid light transport model, which describes light propagation in the non-scattering region. After finite element discretization, the hybrid light transport model is converted into a minimization problem which includes an l1-norm regularization term to exploit the sparsity of the bioluminescent source distribution. The performance of the reconstruction algorithm is first demonstrated with a digital mouse based simulation, with a reconstruction error of less than 1 mm. An in situ gastric cancer-bearing nude mouse experiment is then conducted. The preliminary result demonstrates the ability of the novel BLT reconstruction algorithm for early gastric cancer detection.
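    The l1-regularized minimization mentioned above has the generic form min_x 0.5||Ax - b||^2 + lambda*||x||_1, which a simple ISTA loop can solve; the sketch below uses a random sensing matrix and a synthetic sparse source as stand-ins for the finite-element light-transport system, so it illustrates the optimization step only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inverse problem: A x = b with a sparse source vector x (stand-in for a
# sparse bioluminescent source distribution). A is a random sensing matrix.
m, n, k = 80, 200, 5
A = rng.normal(size=(m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.uniform(1, 3, size=k)
b = A @ x_true + 0.01 * rng.normal(size=m)


def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


# ISTA for min_x 0.5*||A x - b||^2 + lam*||x||_1
lam = 0.1
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - b)           # gradient of the data-fit term
    x = soft_threshold(x - grad / L, lam / L)

print("true support:     ", np.flatnonzero(x_true))
print("recovered support:", np.flatnonzero(np.abs(x) > 0.1))
```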

  11. Swarm Intelligence for Optimizing Hybridized Smoothing Filter in Image Edge Enhancement

    NASA Astrophysics Data System (ADS)

    Rao, B. Tirumala; Dehuri, S.; Dileep, M.; Vindhya, A.

    In this modern era, image transmission and processing play a major role. It would be impossible to retrieve information from satellite and medical images without the help of image processing techniques. Edge enhancement is an image processing step that enhances the edge contrast of an image or video in an attempt to improve its acutance. Edges are the representations of the discontinuities of image intensity functions. For processing these discontinuities in an image, a good edge enhancement technique is essential. The proposed work uses a new idea for edge enhancement based on hybridized smoothing filters and introduces a promising technique for obtaining the best hybrid filter using swarm algorithms (Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO)) to search for an optimal sequence of filters from among a set of rather simple, representative image processing filters. This paper deals with the analysis of the swarm intelligence techniques through the combination of hybrid filters generated by these algorithms for image edge enhancement.

  12. Classification of Two Class Motor Imagery Tasks Using Hybrid GA-PSO Based K-Means Clustering.

    PubMed

    Suraj; Tiwari, Purnendu; Ghosh, Subhojit; Sinha, Rakesh Kumar

    2015-01-01

    Transferring the brain computer interface (BCI) from laboratory conditions to real world applications requires BCI to be applied asynchronously without any time constraint. The highly dynamic nature of the electroencephalogram (EEG) signal motivates the use of evolutionary algorithms (EA). Motivated by these two facts, in this work a hybrid GA-PSO based K-means clustering technique has been used to distinguish two class motor imagery (MI) tasks. The proposed hybrid GA-PSO based K-means clustering is found to outperform genetic algorithm (GA) and particle swarm optimization (PSO) based K-means clustering techniques in terms of both accuracy and execution time. The lower execution time of the hybrid GA-PSO technique makes it suitable for real time BCI application. Time frequency representation (TFR) techniques have been used to extract the features of the signal under investigation. TFR-based features are extracted and, relying on the concept of event-related synchronization (ERS) and desynchronization (ERD), the feature vector is formed.

  13. Classification of Two Class Motor Imagery Tasks Using Hybrid GA-PSO Based K-Means Clustering

    PubMed Central

    Suraj; Tiwari, Purnendu; Ghosh, Subhojit; Sinha, Rakesh Kumar

    2015-01-01

    Transferring the brain computer interface (BCI) from laboratory conditions to real world applications requires BCI to be applied asynchronously without any time constraint. The highly dynamic nature of the electroencephalogram (EEG) signal motivates the use of evolutionary algorithms (EA). Motivated by these two facts, in this work a hybrid GA-PSO based K-means clustering technique has been used to distinguish two class motor imagery (MI) tasks. The proposed hybrid GA-PSO based K-means clustering is found to outperform genetic algorithm (GA) and particle swarm optimization (PSO) based K-means clustering techniques in terms of both accuracy and execution time. The lower execution time of the hybrid GA-PSO technique makes it suitable for real time BCI application. Time frequency representation (TFR) techniques have been used to extract the features of the signal under investigation. TFR-based features are extracted and, relying on the concept of event-related synchronization (ERS) and desynchronization (ERD), the feature vector is formed. PMID:25972896

  14. The regionalization of national-scale SPARROW models for stream nutrients

    USGS Publications Warehouse

    Schwarz, Gregory E.; Alexander, Richard B.; Smith, Richard A.; Preston, Stephen D.

    2011-01-01

    This analysis modifies the parsimonious specification of recently published total nitrogen (TN) and total phosphorus (TP) national-scale SPAtially Referenced Regressions On Watershed attributes models to allow each model coefficient to vary geographically among three major river basins of the conterminous United States. Regionalization of the national models reduces the standard errors in the prediction of TN and TP loads, expressed as a percentage of the predicted load, by about 6 and 7%. We develop and apply a method for combining national-scale and regional-scale information to estimate a hybrid model that imposes cross-region constraints that limit regional variation in model coefficients, effectively reducing the number of free model parameters as compared to a collection of independent regional models. The hybrid TN and TP regional models have improved model fit relative to the respective national models, reducing the standard error in the prediction of loads, expressed as a percentage of load, by about 5 and 4%. Only 19% of the TN hybrid model coefficients and just 2% of the TP hybrid model coefficients show evidence of substantial regional specificity (more than ±100% deviation from the national model estimate). The hybrid models have much greater precision in the estimated coefficients than do the unconstrained regional models, demonstrating the efficacy of pooling information across regions to improve regional models.

  15. Bands selection and classification of hyperspectral images based on hybrid kernels SVM by evolutionary algorithm

    NASA Astrophysics Data System (ADS)

    Hu, Yan-Yan; Li, Dong-Sheng

    2016-01-01

    Hyperspectral images (HSI) consist of many closely spaced bands carrying most of the object information. However, due to their high dimensionality and high data volume, it is hard to achieve satisfactory classification performance. In order to reduce the HSI data dimensionality in preparation for high classification accuracy, it is proposed to combine a band selection method based on artificial immune systems (AIS) with a hybrid-kernel support vector machine (SVM-HK) algorithm. After comparing different kernels for hyperspectral analysis, the approach mixes a radial basis function kernel (RBF-K) with a sigmoid kernel (Sig-K) and applies the optimized hybrid kernels in SVM classifiers. The SVM-HK algorithm is then used to guide the band selection of an improved version of AIS. The AIS is composed of clonal selection and elite antibody mutation, including an evaluation process with an optional index factor (OIF). Classification experiments were performed on a hyperspectral scene of the San Diego Naval Base acquired by AVIRIS; the results show that the method efficiently removes band redundancy while outperforming the traditional SVM classifier.
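    One way to realize a hybrid RBF/sigmoid kernel is to precompute a convex combination of the two Gram matrices and hand it to an SVM that accepts precomputed kernels, as sketched below with scikit-learn. The mixing weight, kernel parameters, and synthetic data are assumptions, and the AIS band-selection stage is not reproduced.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel, sigmoid_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for band-reduced hyperspectral pixels.
X, y = make_classification(n_samples=400, n_features=50, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

alpha, gamma = 0.7, 0.05          # mixing weight and RBF width (assumed values)


def hybrid_kernel(A, B):
    """Convex combination of an RBF and a sigmoid Gram matrix."""
    return alpha * rbf_kernel(A, B, gamma=gamma) + (1 - alpha) * sigmoid_kernel(A, B, gamma=0.01, coef0=0.0)


clf = SVC(kernel="precomputed", C=10)
clf.fit(hybrid_kernel(X_tr, X_tr), y_tr)                 # Gram matrix of training data
acc = clf.score(hybrid_kernel(X_te, X_tr), y_te)         # test-vs-train Gram matrix
print("hybrid-kernel SVM accuracy:", round(acc, 3))
```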

  16. An Improved Hybrid Encoding Cuckoo Search Algorithm for 0-1 Knapsack Problems

    PubMed Central

    Feng, Yanhong; Jia, Ke; He, Yichao

    2014-01-01

    Cuckoo search (CS) is a new robust swarm intelligence method that is based on the brood parasitism of some cuckoo species. In this paper, an improved hybrid encoding cuckoo search algorithm (ICS) with greedy strategy is put forward for solving 0-1 knapsack problems. First of all, for solving binary optimization problem with ICS, based on the idea of individual hybrid encoding, the cuckoo search over a continuous space is transformed into the synchronous evolution search over discrete space. Subsequently, the concept of confidence interval (CI) is introduced; hence, the new position updating is designed and genetic mutation with a small probability is introduced. The former enables the population to move towards the global best solution rapidly in every generation, and the latter can effectively prevent the ICS from trapping into the local optimum. Furthermore, the greedy transform method is used to repair the infeasible solution and optimize the feasible solution. Experiments with a large number of KP instances show the effectiveness of the proposed algorithm and its ability to achieve good quality solutions. PMID:24527026
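    The greedy transform used to repair infeasible solutions and improve feasible ones can be sketched as a drop phase (remove the least value-dense selected items until the capacity holds) followed by an add phase (insert the most value-dense items that still fit). The instance below is made up; the repair logic is a plausible reading of the greedy strategy, not the authors' exact operator.

```python
# Toy 0-1 knapsack instance (values, weights, capacity) -- illustrative only.
values = [10, 7, 4, 9, 6, 3]
weights = [5, 4, 2, 6, 3, 1]
capacity = 12
density_order = sorted(range(len(values)), key=lambda i: values[i] / weights[i])


def greedy_repair(solution):
    """Repair an infeasible binary solution, then greedily improve it."""
    sol = list(solution)
    weight = sum(w for w, s in zip(weights, sol) if s)
    # Drop phase: remove the least value-dense selected items until feasible.
    for i in density_order:                     # increasing density
        if weight <= capacity:
            break
        if sol[i]:
            sol[i], weight = 0, weight - weights[i]
    # Add phase: insert the most value-dense unselected items that still fit.
    for i in reversed(density_order):           # decreasing density
        if not sol[i] and weight + weights[i] <= capacity:
            sol[i], weight = 1, weight + weights[i]
    return sol


candidate = [1, 1, 1, 1, 1, 0]                  # infeasible (weight 20 > 12)
repaired = greedy_repair(candidate)
print("repaired:", repaired,
      "value:", sum(v for v, s in zip(values, repaired) if s),
      "weight:", sum(w for w, s in zip(weights, repaired) if s))
```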

  17. Aeon: Synthesizing Scheduling Algorithms from High-Level Models

    NASA Astrophysics Data System (ADS)

    Monette, Jean-Noël; Deville, Yves; van Hentenryck, Pascal

    This paper describes the Aeon system, whose aim is to synthesize scheduling algorithms from high-level models. Aeon, which is entirely written in Comet, receives as input a high-level model for a scheduling application, which is then analyzed to generate a dedicated scheduling algorithm exploiting the structure of the model. Aeon provides a variety of synthesizers for generating complete or heuristic algorithms. Moreover, synthesizers are compositional, making it possible to generate complex hybrid algorithms naturally. Preliminary experimental results indicate that this approach may be competitive with state-of-the-art search algorithms.

  18. A hybrid multiscale Monte Carlo algorithm (HyMSMC) to cope with disparity in time scales and species populations in intracellular networks.

    PubMed

    Samant, Asawari; Ogunnaike, Babatunde A; Vlachos, Dionisios G

    2007-05-24

    The fundamental role that intrinsic stochasticity plays in cellular functions has been shown via numerous computational and experimental studies. In the face of such evidence, it is important that intracellular networks are simulated with stochastic algorithms that can capture molecular fluctuations. However, separation of time scales and disparity in species population, two common features of intracellular networks, make stochastic simulation of such networks computationally prohibitive. While recent work has addressed each of these challenges separately, a generic algorithm that can simultaneously tackle disparity in time scales and population scales in stochastic systems is currently lacking. In this paper, we propose the hybrid, multiscale Monte Carlo (HyMSMC) method that fills in this void. The proposed HyMSMC method blends stochastic singular perturbation concepts, to deal with potential stiffness, with a hybrid of exact and coarse-grained stochastic algorithms, to cope with separation in population sizes. In addition, we introduce the computational singular perturbation (CSP) method as a means of systematically partitioning fast and slow networks and computing relaxation times for convergence. We also propose a new criterion for the convergence of fast networks to stochastic low-dimensional manifolds, which further accelerates the algorithm. We use several prototype and biological examples, including a gene expression model displaying bistability, to demonstrate the efficiency, accuracy and applicability of the HyMSMC method. Bistable models serve as stringent tests for the success of multiscale MC methods and illustrate limitations of some literature methods.

  19. Towards unbiased benchmarking of evolutionary and hybrid algorithms for real-valued optimisation

    NASA Astrophysics Data System (ADS)

    MacNish, Cara

    2007-12-01

    Randomised population-based algorithms, such as evolutionary, genetic and swarm-based algorithms, and their hybrids with traditional search techniques, have proven successful and robust on many difficult real-valued optimisation problems. This success, along with the readily applicable nature of these techniques, has led to an explosion in the number of algorithms and variants proposed. In order for the field to advance it is necessary to carry out effective comparative evaluations of these algorithms, and thereby better identify and understand those properties that lead to better performance. This paper discusses the difficulties of providing benchmarking of evolutionary and allied algorithms that is both meaningful and logistically viable. To be meaningful the benchmarking test must give a fair comparison that is free, as far as possible, from biases that favour one style of algorithm over another. To be logistically viable it must overcome the need for pairwise comparison between all the proposed algorithms. To address the first problem, we begin by attempting to identify the biases that are inherent in commonly used benchmarking functions. We then describe a suite of test problems, generated recursively as self-similar or fractal landscapes, designed to overcome these biases. For the second, we describe a server that uses web services to allow researchers to 'plug in' their algorithms, running on their local machines, to a central benchmarking repository.

  20. 21st century locomotive technology: quarterly technical status report 26

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lembit Salasoo; Ramu Chandra

    2009-08-24

    Parasitic losses due to hybrid sodium battery thermal management do not significantly reduce the fuel saving benefits of the hybrid locomotive. Optimal thermal management trajectories were converted into realizable algorithms, which were robust and gave excellent performance in limiting thermal excursions and maintaining fuel savings.

  1. Auto-Relevancy Baseline: A Hybrid System Without Human Feedback

    DTIC Science & Technology

    2010-11-01

    classical Bayes algorithm upon the pseudo-hybridization of SemanticA and Latent Semantic IndexingBC systems should smooth out historically high yet...black box emulated a machine learning topic expert. Similar to some Web methods, the initial topics within the legal document were expanded upon

  2. Nonlinear inversion of potential-field data using a hybrid-encoding genetic algorithm

    USGS Publications Warehouse

    Chen, C.; Xia, J.; Liu, J.; Feng, G.

    2006-01-01

    Using a genetic algorithm to solve an inverse problem of complex nonlinear geophysical equations is advantageous because it does not require computing gradients of models or "good" initial models. The multi-point search of a genetic algorithm makes it easier to find the globally optimal solution while avoiding falling into a local extremum. As is the case in other optimization approaches, the search efficiency for a genetic algorithm is vital in finding desired solutions successfully in a multi-dimensional model space. A binary-encoding genetic algorithm is hardly ever used to resolve an optimization problem such as a simple geophysical inversion with only three unknowns. The encoding mechanism, genetic operators, and population size of the genetic algorithm greatly affect search processes in the evolution. It is clear that improved operators and proper population size promote the convergence. Nevertheless, not all genetic operations perform perfectly while searching under either a uniform binary or a decimal encoding system. With the binary encoding mechanism, the crossover scheme may produce more new individuals than with the decimal encoding. On the other hand, the mutation scheme in a decimal encoding system will create new genes larger in scope than those in the binary encoding. This paper discusses approaches of exploiting the search potential of genetic operations in the two encoding systems and presents an approach with a hybrid-encoding mechanism, multi-point crossover, and dynamic population size for geophysical inversion. We present a method that is based on the routine in which the mutation operation is conducted in the decimal code and the multi-point crossover operation in the binary code. The mixed-encoding algorithm is called the hybrid-encoding genetic algorithm (HEGA). HEGA provides better genes with a higher probability by a mutation operator and improves genetic algorithms in resolving complicated geophysical inverse problems. Another significant result is that the final solution is determined by the average model derived from multiple trials instead of one computation, due to the randomness in a genetic algorithm procedure. These advantages were demonstrated by synthetic and real-world examples of inversion of potential-field data. © 2005 Elsevier Ltd. All rights reserved.
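
    To make the hybrid-encoding idea concrete, here is a toy sketch in which crossover operates on a binary encoding while mutation operates on the decoded decimal value; the objective function, bit length, and population settings are illustrative assumptions rather than the HEGA implementation.

    ```python
    # Toy sketch of a hybrid-encoding GA: multi-point crossover acts on a binary
    # encoding while mutation acts on the decimal (real-valued) encoding.
    # A simplified illustration of the idea only, not the authors' HEGA.
    import random

    BITS, LO, HI = 16, -5.0, 5.0

    def decode(bits):
        return LO + int("".join(map(str, bits)), 2) / (2**BITS - 1) * (HI - LO)

    def encode(x):
        n = round((x - LO) / (HI - LO) * (2**BITS - 1))
        return [int(b) for b in format(max(0, min(2**BITS - 1, n)), f"0{BITS}b")]

    def fitness(x):                      # minimize a simple 1-D test function
        return (x - 1.234) ** 2

    def crossover(a, b):                 # two-point crossover in binary code
        i, j = sorted(random.sample(range(BITS), 2))
        return a[:i] + b[i:j] + a[j:], b[:i] + a[i:j] + b[j:]

    def mutate(bits, sigma=0.1):         # Gaussian mutation in decimal code
        x = decode(bits) + random.gauss(0.0, sigma)
        return encode(max(LO, min(HI, x)))

    random.seed(1)
    pop = [encode(random.uniform(LO, HI)) for _ in range(30)]
    for _ in range(100):
        pop.sort(key=lambda b: fitness(decode(b)))
        parents = pop[:10]
        children = []
        while len(children) < len(pop) - len(parents):
            c1, c2 = crossover(*random.sample(parents, 2))
            children += [mutate(c1), mutate(c2)]
        pop = parents + children[: len(pop) - len(parents)]
    pop.sort(key=lambda b: fitness(decode(b)))
    print("best x ~", decode(pop[0]))
    ```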

  3. Scalable and Power Efficient Data Analytics for Hybrid Exascale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok; Samatova, Nagiza; Wu, Kesheng

    This project developed a generic and optimized set of core data analytics functions. These functions organically consolidate a broad constellation of high performance analytical pipelines. As the architectures of emerging HPC systems become inherently heterogeneous, there is a need to design algorithms for data analysis kernels accelerated on hybrid multi-node, multi-core HPC architectures composed of a mix of CPUs, GPUs, and SSDs. Furthermore, the power-aware trend drives the advances in our performance-energy tradeoff analysis framework, which enables our data-analysis kernel algorithms and software to be parameterized so that users can choose the right power-performance optimizations.

  4. Multicanonical hybrid Monte Carlo algorithm: Boosting simulations of compact QED

    NASA Astrophysics Data System (ADS)

    Arnold, G.; Schilling, K.; Lippert, Th.

    1999-03-01

    We demonstrate that substantial progress can be achieved in the study of the phase structure of four-dimensional compact QED by a joint use of hybrid Monte Carlo and multicanonical algorithms through an efficient parallel implementation. This is borne out by the observation of considerable speedup of tunnelling between the metastable states, close to the phase transition, on the Wilson line. We estimate that the creation of adequate samples (with order 100 flip-flops) becomes a matter of half a year's run time at 2 Gflops sustained performance for lattices of size up to 24^4.

  5. StreamCat and LakeCat: An overview of algorithms, data, and models developed at the US EPA Western Ecology Division to facilitate and advance watershed prediction in the conterminous US.

    EPA Science Inventory

    Geospatial data and techniques have long been critical to advancing the analysis and management of freshwater ecosystems. However, these data and techniques have often been limited to specific sample sites or regional analyses because of the difficulty associated with generating ...

  6. Development of a GIS interface for WEPP Model application to Great Lakes forested watersheds

    Treesearch

    J. R. Frankenberger; S. Dun; D. C. Flanagan; J. Q. Wu; W. J. Elliot

    2011-01-01

    This presentation will highlight efforts on development of a new online WEPP GIS interface, targeted toward application in forested regions bordering the Great Lakes. The key components and algorithms of the online GIS system will be outlined. The general procedures used to provide input to the WEPP model and to display model output will be demonstrated.

  7. Groupwise connectivity-based parcellation of the whole human cortical surface using watershed-driven dimension reduction.

    PubMed

    Lefranc, Sandrine; Roca, Pauline; Perrot, Matthieu; Poupon, Cyril; Le Bihan, Denis; Mangin, Jean-François; Rivière, Denis

    2016-05-01

    Segregating the human cortex into distinct areas based on structural connectivity criteria is of widespread interest in neuroscience. This paper presents a groupwise connectivity-based parcellation framework for the whole cortical surface using a new high quality diffusion dataset of 79 healthy subjects. Our approach performs gyrus by gyrus to parcellate the whole human cortex. The main originality of the method is to compress for each gyrus the connectivity profiles used for the clustering without any anatomical prior information. This step takes into account the interindividual cortical and connectivity variability. To this end, we consider intersubject high density connectivity areas extracted using a surface-based watershed algorithm. A wide validation study has led to a fully automatic pipeline which is robust to variations in data preprocessing (tracking type, cortical mesh characteristics and boundaries of initial gyri), data characteristics (including number of subjects), and the main algorithmic parameters. A remarkable reproducibility is achieved in parcellation results for the whole cortex, leading to clear and stable cortical patterns. This reproducibility has been tested across non-overlapping subgroups and the validation is presented mainly on the pre- and postcentral gyri. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  8. Optimal control of hybrid qubits: Implementing the quantum permutation algorithm

    NASA Astrophysics Data System (ADS)

    Rivera-Ruiz, C. M.; de Lima, E. F.; Fanchini, F. F.; Lopez-Richard, V.; Castelano, L. K.

    2018-03-01

    The optimal quantum control theory is employed to determine electric pulses capable of producing quantum gates with a fidelity higher than 0.9997, when noise is not taken into account. Particularly, these quantum gates were chosen to perform the permutation algorithm in hybrid qubits in double quantum dots (DQDs). The permutation algorithm is an oracle-based quantum algorithm that solves the problem of the permutation parity faster than a classical algorithm without the necessity of entanglement between particles. The only requirement for achieving the speedup is the use of a one-particle quantum system with at least three levels. The high fidelity found in our results is closely related to the quantum speed limit, which is a measure of how fast a quantum state can be manipulated. Furthermore, we model charge noise by considering an average over the optimal field centered at different values of the reference detuning, which follows a Gaussian distribution. When the Gaussian spread is of the order of 5 μeV (10% of the correct value), the fidelity is still higher than 0.95. Our scheme can also be used for the practical realization of different quantum algorithms in DQDs.

  9. Hybrid Artificial Root Foraging Optimizer Based Multilevel Threshold for Image Segmentation

    PubMed Central

    Liu, Yang; Liu, Junfei

    2016-01-01

    This paper proposes a new plant-inspired optimization algorithm for multilevel threshold image segmentation, namely, the hybrid artificial root foraging optimizer (HARFO), which essentially mimics iterative root foraging behaviors. In this algorithm, the new growth operators of branching, regrowing, and shrinkage are initially designed to optimize continuous space search by combining a root-to-root communication and coevolution mechanism. With the auxin-regulated scheme, various root growth operators are guided systematically. With root-to-root communication, individuals exchange information in different efficient topologies, which essentially improves the exploration ability. With the coevolution mechanism, the hierarchical spatial population driven by evolutionary pressure of multiple subpopulations is structured, which ensures that the diversity of the root population is well maintained. The comparative results on a suite of benchmarks show the superiority of the proposed algorithm. Finally, the proposed HARFO algorithm is applied to handle the complex image segmentation problem based on multilevel thresholds. Computational results of this approach on a set of test images show the outperformance of the proposed algorithm in terms of optimization accuracy and computation efficiency. PMID:27725826

  10. Hybrid Artificial Root Foraging Optimizer Based Multilevel Threshold for Image Segmentation.

    PubMed

    Liu, Yang; Liu, Junfei; Tian, Liwei; Ma, Lianbo

    2016-01-01

    This paper proposes a new plant-inspired optimization algorithm for multilevel threshold image segmentation, namely, the hybrid artificial root foraging optimizer (HARFO), which essentially mimics iterative root foraging behaviors. In this algorithm, the new growth operators of branching, regrowing, and shrinkage are initially designed to optimize continuous space search by combining a root-to-root communication and coevolution mechanism. With the auxin-regulated scheme, various root growth operators are guided systematically. With root-to-root communication, individuals exchange information in different efficient topologies, which essentially improves the exploration ability. With the coevolution mechanism, the hierarchical spatial population driven by evolutionary pressure of multiple subpopulations is structured, which ensures that the diversity of the root population is well maintained. The comparative results on a suite of benchmarks show the superiority of the proposed algorithm. Finally, the proposed HARFO algorithm is applied to handle the complex image segmentation problem based on multilevel thresholds. Computational results of this approach on a set of test images show the outperformance of the proposed algorithm in terms of optimization accuracy and computation efficiency.
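
    For comparison with metaheuristic threshold search such as HARFO, a conventional multilevel-threshold baseline can be computed with scikit-image's multi-Otsu routine; this sketch assumes a generic test image and is not the paper's method.

    ```python
    # Conventional multilevel-threshold baseline (multi-Otsu), useful as a point
    # of comparison for metaheuristic threshold search; a sketch, not HARFO.
    import numpy as np
    from skimage import data
    from skimage.filters import threshold_multiotsu

    image = data.camera()                                  # any 8-bit grayscale image
    thresholds = threshold_multiotsu(image, classes=4)     # 3 thresholds -> 4 classes
    segmented = np.digitize(image, bins=thresholds)        # label map in {0,1,2,3}
    print("thresholds:", thresholds, "labels:", np.unique(segmented))
    ```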

  11. Massively parallel algorithm and implementation of RI-MP2 energy calculation for peta-scale many-core supercomputers.

    PubMed

    Katouda, Michio; Naruse, Akira; Hirano, Yukihiko; Nakajima, Takahito

    2016-11-15

    A new parallel algorithm and its implementation for the RI-MP2 energy calculation utilizing peta-flop-class many-core supercomputers are presented. Some improvements from the previous algorithm (J. Chem. Theory Comput. 2013, 9, 5373) have been performed: (1) a dual-level hierarchical parallelization scheme that enables the use of more than 10,000 Message Passing Interface (MPI) processes and (2) a new data communication scheme that reduces network communication overhead. A multi-node and multi-GPU implementation of the present algorithm is presented for calculations on a central processing unit (CPU)/graphics processing unit (GPU) hybrid supercomputer. Benchmark results of the new algorithm and its implementation using the K computer (CPU clustering system) and TSUBAME 2.5 (CPU/GPU hybrid system) demonstrate high efficiency. The peak performance of 3.1 PFLOPS is attained using 80,199 nodes of the K computer. The peak performance of the multi-node and multi-GPU implementation is 514 TFLOPS using 1349 nodes and 4047 GPUs of TSUBAME 2.5. © 2016 Wiley Periodicals, Inc.

  12. Retinal vessel segmentation on SLO image

    PubMed Central

    Xu, Juan; Ishikawa, Hiroshi; Wollstein, Gadi; Schuman, Joel S.

    2010-01-01

    A scanning laser ophthalmoscopy (SLO) image, taken from optical coherence tomography (OCT), usually has lower global/local contrast and more noise compared to the traditional retinal photograph, which makes vessel segmentation a challenging task. A hybrid algorithm is proposed to efficiently solve these problems by fusing several designed methods, taking advantage of each method and reducing the error measurements. The algorithm consists of several steps: image preprocessing, a thresholding probe, and weighted fusion. Four different methods are first designed to transform the SLO image into feature response images by taking different combinations of matched filter, contrast enhancement and mathematical morphology operators. A thresholding probe algorithm is then applied on those response images to obtain four vessel maps. Weighted majority opinion is used to fuse these vessel maps and generate a final vessel map. The experimental results showed that the proposed hybrid algorithm could successfully segment the blood vessels on SLO images, detecting the major and small vessels while suppressing noise. The algorithm showed substantial potential in various clinical applications. The use of this method can also be extended to medical image registration based on blood vessel location. PMID:19163149
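
    The weighted-majority fusion step can be sketched as below; the candidate maps and weights are random placeholders standing in for the four matched-filter/morphology responses described in the paper.

    ```python
    # Sketch of the weighted-majority fusion step: combine several binary vessel
    # maps into one final map. The maps and weights here are placeholders, not
    # the paper's actual responses.
    import numpy as np

    def weighted_majority(maps, weights):
        """maps: list of HxW binary arrays; weights: per-map confidence."""
        maps = np.stack(maps).astype(float)
        w = np.asarray(weights, dtype=float)[:, None, None]
        score = (w * maps).sum(axis=0) / w.sum()
        return (score >= 0.5).astype(np.uint8)     # majority of weighted votes

    rng = np.random.default_rng(0)
    candidates = [(rng.random((64, 64)) > 0.5).astype(np.uint8) for _ in range(4)]
    fused = weighted_majority(candidates, weights=[0.3, 0.2, 0.3, 0.2])
    print(fused.shape, fused.dtype)
    ```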

  13. The theory of variational hybrid quantum-classical algorithms

    NASA Astrophysics Data System (ADS)

    McClean, Jarrod R.; Romero, Jonathan; Babbush, Ryan; Aspuru-Guzik, Alán

    2016-02-01

    Many quantum algorithms have daunting resource requirements when compared to what is available today. To address this discrepancy, a quantum-classical hybrid optimization scheme known as ‘the quantum variational eigensolver’ was developed (Peruzzo et al 2014 Nat. Commun. 5 4213) with the philosophy that even minimal quantum resources could be made useful when used in conjunction with classical routines. In this work we extend the general theory of this algorithm and suggest algorithmic improvements for practical implementations. Specifically, we develop a variational adiabatic ansatz and explore unitary coupled cluster where we establish a connection from second order unitary coupled cluster to universal gate sets through a relaxation of exponential operator splitting. We introduce the concept of quantum variational error suppression that allows some errors to be suppressed naturally in this algorithm on a pre-threshold quantum device. Additionally, we analyze truncation and correlated sampling in Hamiltonian averaging as ways to reduce the cost of this procedure. Finally, we show how the use of modern derivative free optimization techniques can offer dramatic computational savings of up to three orders of magnitude over previously used optimization techniques.
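
    A toy version of the hybrid quantum-classical loop is sketched below: a classical derivative-free optimizer tunes a one-parameter ansatz to minimize the expectation of a 2x2 Hamiltonian, with the "quantum" expectation computed exactly in numpy rather than sampled on hardware; the Hamiltonian and ansatz are illustrative assumptions.

    ```python
    # Toy illustration of the variational hybrid loop: a classical optimizer
    # tunes ansatz parameters to minimize an energy expectation. Here the
    # expectation is computed exactly; on hardware it would be estimated by
    # sampling (Hamiltonian averaging).
    import numpy as np
    from scipy.optimize import minimize

    X = np.array([[0, 1], [1, 0]], dtype=float)
    Z = np.array([[1, 0], [0, -1]], dtype=float)
    H = 0.5 * Z + 0.3 * X                        # toy single-qubit Hamiltonian

    def ansatz(theta):
        return np.array([np.cos(theta / 2), np.sin(theta / 2)])

    def energy(params):
        psi = ansatz(params[0])
        return float(psi @ H @ psi)              # <psi|H|psi>

    result = minimize(energy, x0=[0.1], method="COBYLA")   # derivative-free
    print("variational energy:", result.fun,
          "exact ground state:", np.linalg.eigvalsh(H)[0])
    ```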

  14. Load balancing prediction method of cloud storage based on analytic hierarchy process and hybrid hierarchical genetic algorithm.

    PubMed

    Zhou, Xiuze; Lin, Fan; Yang, Lvqing; Nie, Jing; Tan, Qian; Zeng, Wenhua; Zhang, Nian

    2016-01-01

    With the continuous expansion of the cloud computing platform scale and rapid growth of users and applications, how to efficiently use system resources to improve the overall performance of cloud computing has become a crucial issue. To address this issue, this paper proposes a method that uses an analytic hierarchy process group decision (AHPGD) to evaluate the load state of server nodes. Training was carried out by using a hybrid hierarchical genetic algorithm (HHGA) for optimizing a radial basis function neural network (RBFNN). The AHPGD produces an aggregate load indicator for the virtual machines in the cloud, which becomes the input to the RBFNN predictor. This paper also proposes a new dynamic load balancing scheduling algorithm combined with a weighted round-robin algorithm, which uses the periodically predicted load values of nodes based on the AHPGD and the RBFNN optimized by HHGA, then calculates the corresponding node weights and updates them continually. Meanwhile, it keeps the advantages and avoids the shortcomings of the static weighted round-robin algorithm.
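
    The dispatching side can be sketched as a weighted round-robin whose weights are refreshed from predicted node loads; the load predictor below is a stub standing in for the AHPGD/RBFNN component, and the node names and weight scaling are illustrative assumptions.

    ```python
    # Sketch of dynamic weighted round-robin: node weights are refreshed from
    # predicted load values (here a stub standing in for the RBFNN predictor).
    import itertools
    import random

    def predicted_load(node):        # placeholder for the AHPGD/RBFNN prediction
        return random.uniform(0.1, 1.0)

    def make_schedule(nodes):
        """Build one dispatch cycle; lighter predicted load -> larger weight."""
        loads = {n: predicted_load(n) for n in nodes}
        weights = {n: max(1, round(10 * (1.0 - loads[n]))) for n in nodes}
        cycle = [n for n in nodes for _ in range(weights[n])]
        random.shuffle(cycle)        # avoid sending bursts to a single node
        return cycle

    random.seed(42)
    nodes = ["node-a", "node-b", "node-c"]
    for request_id, node in zip(range(12), itertools.cycle(make_schedule(nodes))):
        print(f"request {request_id} -> {node}")
    ```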

  15. Optimisation of the hybrid renewable energy system by HOMER, PSO and CPSO for the study area

    NASA Astrophysics Data System (ADS)

    Khare, Vikas; Nema, Savita; Baredar, Prashant

    2017-04-01

    This study is based on simulation and optimisation of the renewable energy system of the police control room at Sagar in central India. To analyse this hybrid system, the meteorological data of solar insolation and hourly wind speeds of Sagar in central India (longitude 78°45′ and latitude 23°50′) have been considered. The pattern of load consumption is studied and suitably modelled for optimisation of the hybrid energy system using HOMER software. The results are compared with those of the particle swarm optimisation and the chaotic particle swarm optimisation algorithms. The use of these two algorithms to optimise the hybrid system leads to a higher quality result with faster convergence. Based on the optimisation result, it has been found that replacing conventional energy sources by the solar-wind hybrid renewable energy system will be a feasible solution for the distribution of electric power as a stand-alone application at the police control room. This system is more environmentally friendly than the conventional diesel generator. Fuel costs are reduced by approximately 70-80% compared with the conventional diesel generator.
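
    A generic particle swarm optimizer of the kind used in this study can be sketched as follows on a toy cost function; the swarm parameters and cost function are illustrative assumptions, not the study's HOMER sizing model.

    ```python
    # Generic particle swarm optimization on a toy cost function, illustrating
    # the optimizer used in the study; the hybrid-system sizing model itself is
    # not reproduced here.
    import numpy as np

    def cost(x):                                  # toy stand-in for system cost
        return np.sum((x - 3.0) ** 2, axis=1)

    def pso(dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        rng = np.random.default_rng(seed)
        pos = rng.uniform(-10, 10, (n_particles, dim))
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), cost(pos)
        gbest = pbest[pbest_val.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos += vel
            val = cost(pos)
            improved = val < pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], val[improved]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest, pbest_val.min()

    best, best_val = pso()
    print("best solution:", best, "cost:", best_val)
    ```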

  16. Primer reporte del híbrido intergenérico Vermivora chrysoptera x Vermivora pinus (“Brewsters Warbler”) en Nicaragua

    Treesearch

    W.J. Arendt; M.A. Tórrez

    2008-01-01

    This article is in Spanish. During an inventory to assess avian biodiversity among the critical watersheds that flow into the Pacific from Nicaragua, we captured a hybrid of Vermivora chrysoptera (“Golden-winged Warbler”) and V. pinus (“Blue-winged Warbler”) known as Brewster’s Warbler (V. leucobronchialis). The capture occurred on November 30, 2007 in dry forest (...

  17. Applied estimation for hybrid dynamical systems using perceptional information

    NASA Astrophysics Data System (ADS)

    Plotnik, Aaron M.

    This dissertation uses the motivating example of robotic tracking of mobile deep ocean animals to present innovations in robotic perception and estimation for hybrid dynamical systems. An approach to estimation for hybrid systems is presented that utilizes uncertain perceptional information about the system's mode to improve tracking of its mode and continuous states. This results in significant improvements in situations where previously reported methods of estimation for hybrid systems perform poorly due to poor distinguishability of the modes. The specific application that motivates this research is an automatic underwater robotic observation system that follows and films individual deep ocean animals. A first version of such a system has been developed jointly by the Stanford Aerospace Robotics Laboratory and Monterey Bay Aquarium Research Institute (MBARI). This robotic observation system is successfully fielded on MBARI's ROVs, but agile specimens often evade the system. When a human ROV pilot performs this task, one advantage that he has over the robotic observation system in these situations is the ability to use visual perceptional information about the target, immediately recognizing any changes in the specimen's behavior mode. With the approach of the human pilot in mind, a new version of the robotic observation system is proposed which is extended to (a) derive perceptional information (visual cues) about the behavior mode of the tracked specimen, and (b) merge this dissimilar, discrete and uncertain information with more traditional continuous noisy sensor data by extending existing algorithms for hybrid estimation. These performance enhancements are enabled by integrating techniques in hybrid estimation, computer vision and machine learning. First, real-time computer vision and classification algorithms extract a visual observation of the target's behavior mode. Existing hybrid estimation algorithms are extended to admit this uncertain but discrete observation, complementing the information available from more traditional sensors. State tracking is achieved using a new form of Rao-Blackwellized particle filter called the mode-observed Gaussian Particle Filter. Performance is demonstrated using data from simulation and data collected on actual specimens in the ocean. The framework for estimation using both traditional and perceptional information is easily extensible to other stochastic hybrid systems with mode-related perceptional observations available.

  18. A diagnostic algorithm for atypical spitzoid tumors: guidelines for immunohistochemical and molecular assessment.

    PubMed

    Cho-Vega, Jeong Hee

    2016-07-01

    Atypical spitzoid tumors are a morphologically diverse group of rare melanocytic lesions most frequently seen in children and young adults. As atypical spitzoid tumors bear striking resemblance to Spitz nevus and spitzoid melanomas clinically and histopathologically, it is crucial to determine their malignant potential and predict their clinical behavior. To date, many researchers have attempted to differentiate atypical spitzoid tumors from unequivocal melanomas based on morphological, immunohistochemical, and molecular diagnostic differences. A diagnostic algorithm is proposed here to assess the malignant potential of atypical spitzoid tumors by using a combination of immunohistochemical and cytogenetic/molecular tests. Together with classical morphological evaluation, this algorithm includes a set of immunohistochemistry assays (p16(Ink4a), a dual-color Ki67/MART-1, and HMB45), fluorescence in situ hybridization (FISH) with five probes (6p25, 8q24, 11q13, CEN9, and 9p21), and an array-based comparative genomic hybridization. This review discusses details of the algorithm, the rationale of each test used in the algorithm, and the utility of this algorithm in routine dermatopathology practice. This algorithmic approach will provide a comprehensive diagnostic tool that complements conventional histological criteria and will significantly contribute to improving the diagnosis and prediction of the clinical behavior of atypical spitzoid tumors.

  19. Hybridization of Strength Pareto Multiobjective Optimization with Modified Cuckoo Search Algorithm for Rectangular Array

    NASA Astrophysics Data System (ADS)

    Abdul Rani, Khairul Najmy; Abdulmalek, Mohamedfareq; A. Rahim, Hasliza; Siew Chin, Neoh; Abd Wahab, Alawiyah

    2017-04-01

    This research proposes several versions of the modified cuckoo search (MCS) metaheuristic algorithm deploying the strength Pareto evolutionary algorithm (SPEA) multiobjective (MO) optimization technique in rectangular array geometry synthesis. Precisely, the MCS algorithm is proposed by incorporating the Roulette wheel selection operator to choose the initial host nests (individuals) that give better results, adaptive inertia weight to control the position exploration of the potential best host nests (solutions), and a dynamic discovery rate to manage the fraction probability of finding the best host nests in the 3-dimensional search space. In addition, the MCS algorithm is hybridized with the particle swarm optimization (PSO) and hill climbing (HC) stochastic techniques along with the standard strength Pareto evolutionary algorithm (SPEA), forming the MCSPSOSPEA and MCSHCSPEA, respectively. All the proposed MCS-based algorithms are examined to perform MO optimization on Zitzler-Deb-Thiele’s (ZDT’s) test functions. Pareto optimum trade-offs are performed to generate a set of three non-dominated solutions, which are locations, excitation amplitudes, and excitation phases of array elements, respectively. Overall, simulations demonstrate that the proposed MCSPSOSPEA outperforms other comparable competitors, simultaneously attaining high antenna directivity, a small half-power beamwidth (HPBW), a low average side lobe level (SLL), and/or significant mitigation of predefined nulls.

  20. Evaluation of flash-flood discharge forecasts in complex terrain using precipitation

    USGS Publications Warehouse

    Yates, D.; Warner, T.T.; Brandes, E.A.; Leavesley, G.H.; Sun, Jielun; Mueller, C.K.

    2001-01-01

    Operational prediction of flash floods produced by thunderstorm (convective) precipitation in mountainous areas requires accurate estimates or predictions of the precipitation distribution in space and time. The details of the spatial distribution are especially critical in complex terrain because the watersheds are generally small in size, and small position errors in the forecast or observed placement of the precipitation can distribute the rain over the wrong watershed. In addition to the need for good precipitation estimates and predictions, accurate flood prediction requires a surface-hydrologic model that is capable of predicting stream or river discharge based on the precipitation-rate input data. Different techniques for the estimation and prediction of convective precipitation will be applied to the Buffalo Creek, Colorado flash flood of July 1996, where over 75 mm of rain from a thunderstorm fell on the watershed in less than 1 h. The hydrologic impact of the precipitation was exacerbated by the fact that a significant fraction of the watershed experienced a wildfire approximately two months prior to the rain event. Precipitation estimates from the National Weather Service's operational Weather Surveillance Radar-Doppler 1988 and the National Center for Atmospheric Research S-band, research, dual-polarization radar, colocated to the east of Denver, are compared. In addition, very short range forecasts from a convection-resolving dynamic model, which is initialized variationally using the radar reflectivity and Doppler winds, are compared with forecasts from an automated-algorithmic forecast system that also employs the radar data. The radar estimates of rain rate, and the two forecasting systems that employ the radar data, have degraded accuracy by virtue of the fact that they are applied in complex terrain. Nevertheless, the radar data and forecasts from the dynamic model and the automated algorithm could be operationally useful for input to surface-hydrologic models employed for flood warning. Precipitation data provided by these various techniques at short time scales and at fine spatial resolutions are employed as detailed input to a distributed-parameter hydrologic model for flash-flood prediction and analysis. With the radar-based precipitation estimates employed as input, the simulated flood discharge was similar to that observed. The dynamic-model precipitation forecast showed the most promise in providing a significant discharge-forecast lead time. The algorithmic system's precipitation forecast did not demonstrate as much skill, but the associated discharge forecast would still have been sufficient to have provided an alert of impending flood danger.

  1. Exploring storage and runoff generation processes for urban flooding through a physically based watershed model

    NASA Astrophysics Data System (ADS)

    Smith, B. K.; Smith, J. A.; Baeck, M. L.; Miller, A. J.

    2015-03-01

    A physically based model of the 14 km2 Dead Run watershed in Baltimore County, MD was created to test the impacts of detention basin storage and soil storage on the hydrologic response of a small urban watershed during flood events. The Dead Run model was created using the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) algorithms and validated using U.S. Geological Survey stream gaging observations for the Dead Run watershed and 5 subbasins over the largest 21 warm season flood events during 2008-2012. Removal of the model detention basins resulted in a median peak discharge increase of 11% and a detention efficiency of 0.5, which was defined as the percent decrease in peak discharge divided by percent detention controlled area. Detention efficiencies generally decreased with increasing basin size. We tested the efficiency of detention basin networks by focusing on the "drainage network order," akin to the stream order but including storm drains, streams, and culverts. The detention efficiency increased dramatically between first-order detention and second-order detention but was similar for second and third-order detention scenarios. Removal of the soil compacted layer, a common feature in urban soils, resulted in a 7% decrease in flood peak discharges. This decrease was statistically similar to the flood peak decrease caused by existing detention. Current soil storage within the Dead Run watershed decreased flood peak discharges by a median of 60%. Numerical experiment results suggested that detention basin storage and increased soil storage have the potential to substantially decrease flood peak discharges.

  2. How do Watershed Characteristics and Precipitation Influence Post-Wildfire Valley Sediment Storage and Delivery Over Time?

    NASA Astrophysics Data System (ADS)

    Brogan, D. J.; Nelson, P. A.; MacDonald, L. H.

    2016-12-01

    Considerable advances have been made in understanding post-wildfire runoff, erosion, and mass wasting at the hillslope and small watershed scale, but the larger-scale effects on flooding, water quality, and sedimentation are often the most significant impacts. The problem is that we have virtually no watershed-specific tools to quantify the proportion of eroded sediment that is stored or delivered from watersheds larger than about 2-5 km2. In this study we are quantifying how channel and valley bottom characteristics affect post-wildfire sediment storage and delivery. Our research is based on intensive monitoring of sediment storage over time in two 15 km2 watersheds (Skin Gulch and Hill Gulch) burned in the 2012 High Park Fire using repeated cross section and longitudinal surveys from fall 2012 through summer 2016, five airborne laser scanning (ALS) datasets from fall 2012 through summer 2015, and both radar and ground-based precipitation measurements. We have computed changes in sediment storage by differencing successive cross sections, and computed spatially explicit changes in successive ALS point clouds using the multiscale model to model cloud comparison (M3C2) algorithm. These channel changes are being related to potential morphometric controls, including valley width, valley slope, confinement, contributing area, valley expansion or contraction, topographic curvature (planform and profile), and estimated sediment inputs. We hypothesize that maximum rainfall intensity and lateral confinement will be the primary independent variables that describe observed patterns of erosion and deposition, and that the results can help predict post-wildfire sediment delivery and identify high priority areas for restoration.

  3. Optimization of green infrastructure network at semi-urbanized watersheds to manage stormwater volume, peak flow and life cycle cost: Case study of Dead Run watershed in Maryland

    NASA Astrophysics Data System (ADS)

    Heidari Haratmeh, B.; Rai, A.; Minsker, B. S.

    2016-12-01

    Green Infrastructure (GI) has become widely known as a sustainable solution for stormwater management in urban environments. Despite more recognition and acknowledgment, researchers and practitioners lack clear and explicit guidelines on how GI practices should be implemented in urban settings. This study develops a noise-aware, multi-objective, multi-scale genetic algorithm that determines optimal GI networks for environmental, economic and social objectives. The methodology accounts for uncertainty in modeling results and is designed to perform at sub-watershed as well as patch scale using two different simulation models, SWMM and RHESSys, in a Cloud-based implementation using a Web interface. As an initial case study, a semi-urbanized watershed, Dead Run 5, in Baltimore County, Maryland, is selected. The objectives of the study are to minimize life cycle cost, maximize human preference for well-being, and minimize the difference between pre-development hydrographs (generated from current rainfall events and design storms) and those that result from proposed GI scenarios. Initial results for the Dead Run 5 watershed suggest that placing GI in the proximity of the watershed outlet optimizes life cycle cost, stormwater volume, and peak flow capture. The framework can easily present outcomes of GI design scenarios to both designers and local stakeholders, and future plans include receiving feedback from users on candidate designs, and interactively updating optimal GI network designs in a crowd-sourced design process. This approach can also be helpful in deriving design guidelines that better meet stakeholder needs.

  4. Testing algorithms for critical slowing down

    NASA Astrophysics Data System (ADS)

    Cossu, Guido; Boyle, Peter; Christ, Norman; Jung, Chulwoo; Jüttner, Andreas; Sanfilippo, Francesco

    2018-03-01

    We present preliminary tests of two modifications of the Hybrid Monte Carlo (HMC) algorithm. Both algorithms are designed to travel much farther in the Hamiltonian phase space for each trajectory and reduce the autocorrelations among physical observables, thus tackling the critical slowing down towards the continuum limit. We compare the costs of the new algorithms with standard HMC evolution for pure gauge fields, studying the autocorrelation times for various quantities including the topological charge.
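
    For reference, a baseline HMC update with leapfrog integration on a one-dimensional Gaussian target is sketched below; it shows only the standard algorithm that the proposed modifications build on, not the lattice gauge implementation.

    ```python
    # Baseline Hybrid Monte Carlo with leapfrog integration on a 1-D Gaussian
    # target; a sketch of the standard HMC that the paper's modifications extend.
    import numpy as np

    def U(q):      return 0.5 * q**2          # potential = -log target density
    def grad_U(q): return q

    def hmc(n_samples=5000, eps=0.2, n_steps=20, seed=0):
        rng = np.random.default_rng(seed)
        q, samples = 0.0, []
        for _ in range(n_samples):
            p = rng.normal()                          # refresh momentum
            q_new, p_new = q, p
            p_new -= 0.5 * eps * grad_U(q_new)        # leapfrog half step
            for _ in range(n_steps - 1):
                q_new += eps * p_new
                p_new -= eps * grad_U(q_new)
            q_new += eps * p_new
            p_new -= 0.5 * eps * grad_U(q_new)        # final half step
            dH = (U(q_new) + 0.5 * p_new**2) - (U(q) + 0.5 * p**2)
            if np.log(rng.random()) < -dH:            # Metropolis accept/reject
                q = q_new
            samples.append(q)
        return np.array(samples)

    s = hmc()
    print("sample mean/var:", s.mean(), s.var())      # expect roughly 0 and 1
    ```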

  5. Implementation of hybrid clustering based on partitioning around medoids algorithm and divisive analysis on human Papillomavirus DNA

    NASA Astrophysics Data System (ADS)

    Arimbi, Mentari Dian; Bustamam, Alhadi; Lestari, Dian

    2017-03-01

    Data clustering can be executed through partition or hierarchical methods for many types of data, including DNA sequences. Both clustering methods can be combined by applying a partition algorithm at the first level and a hierarchical algorithm at the second level, an approach called hybrid clustering. In the partition phase, popular methods such as PAM, K-means, or Fuzzy c-means could be applied. In this study we selected partitioning around medoids (PAM) in our partition stage. Furthermore, following the partition algorithm, in the hierarchical stage we applied the divisive analysis algorithm (DIANA) in order to obtain more specific cluster and sub-cluster structures. The number of main clusters is determined using the Davies Bouldin Index (DBI) value; the optimal number of clusters is chosen to minimize the DBI value. In this work, we conduct the clustering on 1252 HPV DNA sequences from GenBank. Feature extraction is performed first, followed by normalization and genetic distance calculation using Euclidean distance. In our implementation, we used the hybrid PAM and DIANA approach in the R open source programming tool. In our results, we obtained 3 main clusters with an average DBI value of 0.979 using PAM in the first stage. After executing DIANA in the second stage, we obtained 4 sub clusters for Cluster-1, 9 sub clusters for Cluster-2 and 2 sub clusters for Cluster-3, with DBI values of 0.972, 0.771, and 0.768 for each main cluster, respectively. Since the second stage produces lower DBI values than the first stage, we conclude that this hybrid approach can improve the accuracy of our clustering results.
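
    The cluster-count selection by minimum DBI can be sketched as below; k-means stands in for PAM purely for illustration (PAM is not part of scikit-learn's core), and the synthetic data are assumptions, not the HPV sequence features.

    ```python
    # Choosing the number of clusters by minimizing the Davies-Bouldin index
    # (DBI), as in the paper's first stage; k-means stands in here for PAM
    # purely for illustration.
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.metrics import davies_bouldin_score

    X, _ = make_blobs(n_samples=300, centers=3, random_state=7)  # toy features
    scores = {}
    for k in range(2, 8):
        labels = KMeans(n_clusters=k, n_init=10, random_state=7).fit_predict(X)
        scores[k] = davies_bouldin_score(X, labels)
    best_k = min(scores, key=scores.get)
    print("DBI per k:", scores, "-> chosen k:", best_k)
    ```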

  6. Validating predictions from climate envelope models

    USGS Publications Warehouse

    Watling, J.; Bucklin, D.; Speroterra, C.; Brandt, L.; Cabal, C.; Romañach, Stephanie S.; Mazzotti, Frank J.

    2013-01-01

    Climate envelope models are a potentially important conservation tool, but their ability to accurately forecast species’ distributional shifts using independent survey data has not been fully evaluated. We created climate envelope models for 12 species of North American breeding birds previously shown to have experienced poleward range shifts. For each species, we evaluated three different approaches to climate envelope modeling that differed in the way they treated climate-induced range expansion and contraction, using random forests and maximum entropy modeling algorithms. All models were calibrated using occurrence data from 1967–1971 (t1) and evaluated using occurrence data from 1998–2002 (t2). Model sensitivity (the ability to correctly classify species presences) was greater using the maximum entropy algorithm than the random forest algorithm. Although sensitivity did not differ significantly among approaches, for many species, sensitivity was maximized using a hybrid approach that assumed range expansion, but not contraction, in t2. Species for which the hybrid approach resulted in the greatest improvement in sensitivity have been reported from more land cover types than species for which there was little difference in sensitivity between hybrid and dynamic approaches, suggesting that habitat generalists may be buffered somewhat against climate-induced range contractions. Specificity (the ability to correctly classify species absences) was maximized using the random forest algorithm and was lowest using the hybrid approach. Overall, our results suggest cautious optimism for the use of climate envelope models to forecast range shifts, but also underscore the importance of considering non-climate drivers of species range limits. The use of alternative climate envelope models that make different assumptions about range expansion and contraction is a new and potentially useful way to help inform our understanding of climate change effects on species.

  7. New Hybrid Algorithms for Estimating Tree Stem Diameters at Breast Height Using a Two Dimensional Terrestrial Laser Scanner

    PubMed Central

    Kong, Jianlei; Ding, Xiaokang; Liu, Jinhao; Yan, Lei; Wang, Jianli

    2015-01-01

    In this paper, a new algorithm to improve the accuracy of estimating diameter at breast height (DBH) for tree trunks in forest areas is proposed. First, the information is collected by a two-dimensional terrestrial laser scanner (2DTLS), which emits laser pulses to generate a point cloud. After extraction and filtration, the laser point clusters of the trunks are obtained, which are optimized by an arithmetic means method. Then, an algebraic circle fitting algorithm in polar form is non-linearly optimized by the Levenberg-Marquardt method to form a new hybrid algorithm, which is used to acquire the diameters and positions of the trees. Compared with previous works, this proposed method improves the accuracy of diameter estimation of trees significantly and effectively reduces the calculation time. Moreover, the experimental results indicate that this method is stable and suitable for the most challenging conditions, which has practical significance in improving the operating efficiency of forest harvester and reducing the risk of causing accidents. PMID:26147726
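
    The two-stage fitting idea can be sketched as an algebraic (Kasa) circle fit used to initialize a Levenberg-Marquardt refinement of the geometric residuals; the synthetic noisy arc below stands in for a trunk's laser point cluster and is not the paper's code.

    ```python
    # Two-stage circle fit: algebraic (Kasa) linear fit for an initial estimate,
    # refined by Levenberg-Marquardt on the geometric residuals. Synthetic noisy
    # points stand in for a trunk's laser point cluster.
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(3)
    theta = rng.uniform(0, np.pi, 80)                      # one-sided scan
    x = 2.0 + 0.35 * np.cos(theta) + rng.normal(0, 0.005, theta.size)
    y = 5.0 + 0.35 * np.sin(theta) + rng.normal(0, 0.005, theta.size)

    # Algebraic fit: solve x^2 + y^2 = 2*a*x + 2*b*y + c in least squares.
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    a, b, c = np.linalg.lstsq(A, x**2 + y**2, rcond=None)[0]
    r0 = np.sqrt(c + a**2 + b**2)

    def residuals(p):
        cx, cy, r = p
        return np.hypot(x - cx, y - cy) - r                # geometric distance

    fit = least_squares(residuals, x0=[a, b, r0], method="lm")
    cx, cy, r = fit.x
    print(f"center=({cx:.3f},{cy:.3f})  DBH estimate={2*r:.3f} m")
    ```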

  8. Exploration of the Performance of a Hybrid Closed Loop Insulin Delivery Algorithm That Includes Insulin Delivery Limits Designed to Protect Against Hypoglycemia.

    PubMed

    de Bock, Martin; Dart, Julie; Roy, Anirban; Davey, Raymond; Soon, Wayne; Berthold, Carolyn; Retterath, Adam; Grosman, Benyamin; Kurtz, Natalie; Davis, Elizabeth; Jones, Timothy

    2017-01-01

    Hypoglycemia remains a risk for closed loop insulin delivery particularly following exercise or if the glucose sensor is inaccurate. The aim of this study was to test whether an algorithm that includes a limit to insulin delivery is effective at protecting against hypoglycemia under those circumstances. An observational study on 8 participants with type 1 diabetes was conducted, where a hybrid closed loop system (HCL) (Medtronic™ 670G) was challenged with hypoglycemic stimuli: exercise and an overreading glucose sensor. There was no overnight or exercise-induced hypoglycemia during HCL insulin delivery. All daytime hypoglycemia was attributable to postmeal bolused insulin in those participants with a more aggressive carbohydrate factor. HCL systems rely on accurate carbohydrate ratios and carbohydrate counting to avoid hypoglycemia. The algorithm that was tested against moderate exercise and an overreading glucose sensor performed well in terms of hypoglycemia avoidance. Algorithm refinement continues in preparation for long-term outpatient trials.

  9. PSO/ACO algorithm-based risk assessment of human neural tube defects in Heshun County, China.

    PubMed

    Liao, Yi Lan; Wang, Jin Feng; Wu, Ji Lei; Wang, Jiao Jiao; Zheng, Xiao Ying

    2012-10-01

    To develop a new technique for assessing the risk of birth defects, which are a major cause of infant mortality and disability in many parts of the world. The region of interest in this study was Heshun County, the county in China with the highest rate of neural tube defects (NTDs). A hybrid particle swarm optimization/ant colony optimization (PSO/ACO) algorithm was used to quantify the probability of NTDs occurring at villages with no births. The hybrid PSO/ACO algorithm is a form of artificial intelligence adapted for hierarchical classification. It is a powerful technique for modeling complex problems involving impacts of causes. The algorithm was easy to apply, with the accuracy of the results being 69.5%±7.02% at the 95% confidence level. The proposed method is simple to apply, has acceptable fault tolerance, and greatly enhances the accuracy of calculations. Copyright © 2012 The Editorial Board of Biomedical and Environmental Sciences. Published by Elsevier B.V. All rights reserved.

  10. Traffic sharing algorithms for hybrid mobile networks

    NASA Technical Reports Server (NTRS)

    Arcand, S.; Murthy, K. M. S.; Hafez, R.

    1995-01-01

    In a hybrid (terrestrial + satellite) mobile personal communications network environment, a large satellite footprint (supercell) overlays a large number of smaller, contiguous terrestrial cells. We assume that the users have either a terrestrial-only single-mode terminal (SMT) or a terrestrial/satellite dual-mode terminal (DMT), and the ratio of DMTs to total terminals is defined as gamma. It is assumed that the call assignments to and handovers between terrestrial cells and satellite supercells take place in a dynamic fashion when necessary. The objectives of this paper are twofold: (1) to propose and define a class of traffic sharing algorithms to manage terrestrial and satellite network resources efficiently by handling call handovers dynamically, and (2) to analyze and evaluate the algorithms by maximizing the traffic load handling capability (defined in erl/cell) over a wide range of terminal ratios (gamma) given an acceptable range of blocking probabilities. Two of the algorithms (G & S) in the proposed class perform extremely well for a wide range of gamma.

  11. Study on Underwater Image Denoising Algorithm Based on Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Jian, Sun; Wen, Wang

    2017-02-01

    This paper analyzes the application of MATLAB in underwater image processing. The transmission characteristics of the underwater laser light signal and the kinds of underwater noise are described, and common noise suppression algorithms (the Wiener filter, median filter, and average filter) are presented. The advantages and disadvantages of each algorithm in terms of image sharpness and edge preservation are then compared. A hybrid filter algorithm based on the wavelet transform is proposed, which can be used for color image denoising. Finally, the PSNR and NMSE of each algorithm are reported to compare their denoising ability.
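
    A minimal wavelet soft-threshold denoising sketch in Python (PyWavelets) is shown below as an illustration of the wavelet-based component; the threshold value and wavelet choice are illustrative assumptions, and this is not the paper's MATLAB hybrid filter.

    ```python
    # Wavelet soft-threshold denoising of a noisy grayscale image with
    # PyWavelets; an illustrative sketch, not the paper's full hybrid filter.
    import numpy as np
    import pywt
    from skimage import data, util

    clean = data.camera() / 255.0
    noisy = util.random_noise(clean, mode="gaussian", var=0.01)

    coeffs = pywt.wavedec2(noisy, wavelet="db2", level=3)
    sigma = 0.1                                     # illustrative threshold scale
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(c, value=sigma, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    denoised = pywt.waverec2(new_coeffs, wavelet="db2")

    mse = np.mean((denoised[: clean.shape[0], : clean.shape[1]] - clean) ** 2)
    print("PSNR (dB):", 10 * np.log10(1.0 / mse))
    ```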

  12. A hybrid meta-heuristic algorithm for the vehicle routing problem with stochastic travel times considering the driver's satisfaction

    NASA Astrophysics Data System (ADS)

    Tavakkoli-Moghaddam, Reza; Alinaghian, Mehdi; Salamat-Bakhsh, Alireza; Norouzi, Narges

    2012-05-01

    The vehicle routing problem is a significant problem that has attracted great attention from researchers in recent years. The main objectives of the vehicle routing problem are to minimize the traveled distance, total traveling time, number of vehicles and cost function of transportation. Reducing these variables leads to decreasing the total cost and increasing the driver's satisfaction level. On the other hand, this satisfaction, which decreases as service time increases, is an important logistic concern for a company. Travel times governed by a probability distribution lead to variation in service time, an effect that is ignored in classical routing problems. This paper addresses increasing service time by using stochastic travel times for each tour, such that the total traveling time of the vehicles is kept within a specified limit with a defined probability. Since exact solution of the vehicle routing problem, which is NP-hard, is not practical at a large scale, a hybrid algorithm based on simulated annealing with genetic operators was proposed to obtain an efficient solution with reasonable computational cost and time. Finally, for some small cases, the results of the proposed algorithm were compared with results obtained by the Lingo 8 software. The obtained results indicate the efficiency of the proposed hybrid simulated annealing algorithm.

  13. Command and Control of Teams of Autonomous Units

    DTIC Science & Technology

    2012-06-01

    done by a hybrid genetic algorithm (GA) particle swarm optimization ( PSO ) algorithm called PIDGION-alternate. This training algorithm is an ANN ...human controller will recognize the behaviors as being safe and correct. As the HyperNEAT approach produces Artificial Neural Nets ( ANN ), we can...optimization technique that generates efficient ANN controls from simple environmental feedback. FALCONET has been tested showing that it can produce

  14. 75 FR 502 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Order Approving...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-05

    ... electronic matching algorithm from CBOE Rule 6.45B shall apply to SAL executions (e.g., pro-rata, price-time... entitlement when the pro-rata algorithm is in effect for SAL in selected Hybrid 3.0 classes as part of a pilot... what it would have been under the pre-pilot allocation algorithm. The Exchange will reduce the DPM/LMM...

  15. Scheduling of hybrid types of machines with two-machine flowshop as the first type and a single machine as the second type

    NASA Astrophysics Data System (ADS)

    Hsiao, Ming-Chih; Su, Ling-Huey

    2018-02-01

    This research addresses the problem of scheduling hybrid machine types, in which one type is a two-machine flowshop and another type is a single machine. A job is either processed on the two-machine flowshop or on the single machine. The objective is to determine a production schedule for all jobs so as to minimize the makespan. The problem is NP-hard since the two parallel machines problem was proved to be NP-hard. Simulated annealing (SA) algorithms are developed to solve the problem. A mixed integer programming (MIP) model is developed and used to evaluate the performance of the two SAs. Computational experiments demonstrate the efficiency of the simulated annealing algorithms; their solution quality is also reported.

  16. Optimal clustering of MGs based on droop controller for improving reliability using a hybrid of harmony search and genetic algorithms.

    PubMed

    Abedini, Mohammad; Moradi, Mohammad H; Hosseinian, S M

    2016-03-01

    This paper proposes a novel method to address reliability and technical problems of microgrids (MGs) based on designing a number of self-adequate autonomous sub-MGs via adopting MGs clustering thinking. In doing so, a multi-objective optimization problem is developed where power losses reduction, voltage profile improvement and reliability enhancement are considered as the objective functions. To solve the optimization problem a hybrid algorithm, named HS-GA, is provided, based on genetic and harmony search algorithms, and a load flow method is given to model different types of DGs as droop controller. The performance of the proposed method is evaluated in two case studies. The results provide support for the performance of the proposed method. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  17. Solving the vehicle routing problem by a hybrid meta-heuristic algorithm

    NASA Astrophysics Data System (ADS)

    Yousefikhoshbakht, Majid; Khorram, Esmaile

    2012-08-01

    The vehicle routing problem (VRP) is one of the most important combinational optimization problems that has nowadays received much attention because of its real application in industrial and service problems. The VRP involves routing a fleet of vehicles, each of them visiting a set of nodes such that every node is visited by exactly one vehicle only once. So, the objective is to minimize the total distance traveled by all the vehicles. This paper presents a hybrid two-phase algorithm called sweep algorithm (SW) + ant colony system (ACS) for the classical VRP. At the first stage, the VRP is solved by the SW, and at the second stage, the ACS and 3-opt local search are used for improving the solutions. Extensive computational tests on standard instances from the literature confirm the effectiveness of the presented approach.
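
    The sweep construction phase can be sketched as below: customers are sorted by polar angle around the depot and grouped into routes until vehicle capacity would be exceeded; the ACS and 3-opt improvement phase is not shown, and the coordinates, demands, and capacity are illustrative assumptions.

    ```python
    # Sweep construction phase for the VRP: customers are sorted by polar angle
    # around the depot and grouped into routes until vehicle capacity is
    # exceeded. The ACS + 3-opt improvement phase of the paper is not shown.
    import math
    import random

    random.seed(5)
    depot = (0.0, 0.0)
    customers = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(15)]
    demands = [random.randint(1, 5) for _ in customers]
    capacity = 12

    def sweep(customers, demands, capacity):
        order = sorted(range(len(customers)),
                       key=lambda i: math.atan2(customers[i][1] - depot[1],
                                                customers[i][0] - depot[0]))
        routes, current, load = [], [], 0
        for i in order:
            if load + demands[i] > capacity:       # start a new vehicle route
                routes.append(current)
                current, load = [], 0
            current.append(i)
            load += demands[i]
        if current:
            routes.append(current)
        return routes

    for k, route in enumerate(sweep(customers, demands, capacity)):
        print(f"vehicle {k}: customers {route}, demand {sum(demands[i] for i in route)}")
    ```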

  18. An artificial neural network controller based on MPSO-BFGS hybrid optimization for spherical flying robot

    NASA Astrophysics Data System (ADS)

    Liu, Xiaolin; Li, Lanfei; Sun, Hanxu

    2017-12-01

    A spherical flying robot can perform various tasks in complex and varied environments to reduce labor costs. However, it is difficult to guarantee the stability of the spherical flying robot in the case of strong coupling and time-varying disturbance. In this paper, an artificial neural network controller (ANNC) based on a hybrid MPSO-BFGS optimization algorithm is proposed. The MPSO algorithm is used to optimize the initial weights of the controller to avoid local optima. The BFGS algorithm is introduced to improve the convergence ability of the network. We use the Lyapunov method to analyze the stability of the ANNC. The controller is simulated under the condition of nonlinear coupling disturbance. The experimental results show that the proposed controller can reach the expected value in a shorter time compared with the other considered methods.
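
    The global-then-local pattern can be sketched as a coarse random search (standing in for MPSO) that supplies a starting point which BFGS then refines; the loss function is a toy stand-in, not the robot's neural controller.

    ```python
    # Sketch of the global-then-local pattern: a coarse stochastic search
    # (standing in for MPSO) picks a starting point, which BFGS then refines.
    # The spherical robot's neural controller itself is not reproduced.
    import numpy as np
    from scipy.optimize import minimize

    def loss(w):                       # toy stand-in for the controller's loss
        return np.sum(w**2) + 3 * np.sin(5 * w).sum()

    rng = np.random.default_rng(0)
    candidates = rng.uniform(-3, 3, size=(200, 4))          # crude global search
    w0 = min(candidates, key=loss)                           # best random start
    refined = minimize(loss, w0, method="BFGS")              # gradient-based polish
    print("start loss:", loss(w0), "-> refined loss:", refined.fun)
    ```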

  19. Evaluation of vertical coordinate and vertical mixing algorithms in the HYbrid-Coordinate Ocean Model (HYCOM)

    NASA Astrophysics Data System (ADS)

    Halliwell, George R.

    Vertical coordinate and vertical mixing algorithms included in the HYbrid Coordinate Ocean Model (HYCOM) are evaluated in low-resolution climatological simulations of the Atlantic Ocean. The hybrid vertical coordinates are isopycnic in the deep ocean interior, but smoothly transition to level (pressure) coordinates near the ocean surface, to sigma coordinates in shallow water regions, and back again to level coordinates in very shallow water. By comparing simulations to climatology, the best model performance is realized using hybrid coordinates in conjunction with one of the three available differential vertical mixing models: the nonlocal K-Profile Parameterization, the NASA GISS level 2 turbulence closure, and the Mellor-Yamada level 2.5 turbulence closure. Good performance is also achieved using the quasi-slab Price-Weller-Pinkel dynamical instability model. Differences among these simulations are too small relative to other errors and biases to identify the "best" vertical mixing model for low-resolution climate simulations. Model performance deteriorates slightly when the Kraus-Turner slab mixed layer model is used with hybrid coordinates. This deterioration is smallest when solar radiation penetrates beneath the mixed layer and when shear instability mixing is included. A simulation performed using isopycnic coordinates to emulate the Miami Isopycnic Coordinate Ocean Model (MICOM), which uses Kraus-Turner mixing without penetrating shortwave radiation and shear instability mixing, demonstrates that the advantages of switching from isopycnic to hybrid coordinates and including more sophisticated turbulence closures outweigh the negative numerical effects of maintaining hybrid vertical coordinates.

  20. Modeling nutrient retention at the watershed scale: Does small stream research apply to the whole river network?

    NASA Astrophysics Data System (ADS)

    Aguilera, Rosana; Marcé, Rafael; Sabater, Sergi

    2013-06-01

    Nutrients are conveyed from terrestrial and upstream sources through drainage networks. Streams and rivers help regulate the material exported downstream by means of transformation, storage, and removal of nutrients. It has been recently suggested that the efficiency of process rates relative to available nutrient concentration in streams eventually declines, following efficiency loss (EL) dynamics. However, most of these predictions are based on the reach scale in pristine streams, failing to describe the role of entire river networks. Models provide the means to study nutrient cycling from the stream network perspective by upscaling to the watershed scale the key mechanisms occurring at the reach scale. We applied a hybrid process-based and statistical model (SPARROW, Spatially Referenced Regression on Watershed Attributes) as a heuristic approach to describe in-stream nutrient processes in a highly impaired, high stream order watershed (the Llobregat River Basin, NE Spain). The in-stream decay specifications of the model were modified to include a partial saturation effect in uptake efficiency (expressed as a power law) and better capture biological nutrient retention in river systems under high anthropogenic stress. The stream decay coefficients were statistically significant in both nitrate and phosphate models, indicating the potential role of in-stream processing in limiting nutrient export. However, the EL concept did not reliably describe the patterns of nutrient uptake efficiency for the concentration gradient and streamflow values found in the Llobregat River basin, calling into question its general applicability for explaining nutrient retention processes in stream networks comprising highly impaired rivers.

  1. Mapping soil textural fractions across a large watershed in north-east Florida.

    PubMed

    Lamsal, S; Mishra, U

    2010-08-01

    Assessment of regional scale soil spatial variation and mapping their distribution is constrained by sparse data which are collected using field surveys that are labor intensive and cost prohibitive. We explored geostatistical (ordinary kriging-OK), regression (Regression Tree-RT), and hybrid methods (RT plus residual Sequential Gaussian Simulation-SGS) to map soil textural fractions across the Santa Fe River Watershed (3585 km(2)) in north-east Florida. Soil samples collected from four depths (L1: 0-30 cm, L2: 30-60 cm, L3: 60-120 cm, and L4: 120-180 cm) at 141 locations were analyzed for soil textural fractions (sand, silt and clay contents), and combined with textural data (15 profiles) assembled under the Florida Soil Characterization program. Textural fractions in L1 and L2 were autocorrelated, and spatially mapped across the watershed. OK performance was poor, which may be attributed to the sparse sampling. RT model structure varied among textural fractions, and the model explained variations ranged from 25% for L1 silt to 61% for L2 clay content. Regression residuals were simulated using SGS, and the average of simulated residuals were used to approximate regression residual distribution map, which were added to regression trend maps. Independent validation of the prediction maps showed that regression models performed slightly better than OK, and regression combined with average of simulated regression residuals improved predictions beyond the regression model. Sand content >90% in both 0-30 and 30-60 cm covered 80.6% of the watershed area. Copyright 2010 Elsevier Ltd. All rights reserved.
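
    The hybrid trend-plus-residual idea can be sketched as a regression tree fitted to covariates with a Gaussian process fitted to the spatial residuals (standing in for sequential Gaussian simulation of regression residuals); the coordinates, covariates, and sand values below are synthetic assumptions.

    ```python
    # Sketch of the hybrid trend + residual idea: a regression tree models the
    # trend and a Gaussian process (standing in for sequential Gaussian
    # simulation of residuals) models spatially structured residuals.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(1)
    coords = rng.uniform(0, 100, size=(200, 2))             # easting, northing
    covars = rng.uniform(0, 1, size=(200, 3))               # terrain covariates
    sand = 60 + 20 * covars[:, 0] + 5 * np.sin(coords[:, 0] / 10) + rng.normal(0, 2, 200)

    trend = DecisionTreeRegressor(max_depth=4).fit(covars, sand)
    residuals = sand - trend.predict(covars)

    gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(1.0),
                                  normalize_y=True).fit(coords, residuals)

    new_coords = rng.uniform(0, 100, size=(5, 2))
    new_covars = rng.uniform(0, 1, size=(5, 3))
    prediction = trend.predict(new_covars) + gp.predict(new_coords)
    print("predicted sand %:", np.round(prediction, 1))
    ```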

  2. Remotely-Sensed Urban Wet-Landscapes: An Indicator of Coupled Effects of Human Impact and Climate Change

    NASA Astrophysics Data System (ADS)

    Ji, Wei

    2016-06-01

    This study proposes the concept of urban wet-landscapes (loosely defined wetlands) as a counterpart to dry-landscapes (mainly impervious surfaces). It examines whether the dynamics of urban wet-landscapes are a sensitive indicator of the coupled effects of the two major driving forces of urban landscape change: human built-up impact and climate (precipitation) variation. Using a series of satellite images, the study was conducted in the Kansas City metropolitan area of the United States. A rule-based classification algorithm was developed to identify fine-scale, hidden wetlands that could not be appropriately detected from their spectral differentiability by a traditional image classification. Spatial analyses of wetland changes were carried out at the metropolitan, watershed, and sub-watershed scales, as well as by size of surface water body, in order to reveal urban wetland change trends in relation to the driving forces. The study found that wet-landscape dynamics varied in trend and magnitude from the metropolitan scale through watersheds to sub-watersheds. It also found that increased precipitation in the region over the past decades swelled larger wetlands in particular, while smaller wetlands decreased mainly due to human development activities. These findings suggest that wet-landscapes, in contrast to dry-landscapes, can be a more effective indicator of the coupled effects of human impact and climate change.

  3. Automatic segmentation and supervised learning-based selection of nuclei in cancer tissue images.

    PubMed

    Nandy, Kaustav; Gudla, Prabhakar R; Amundsen, Ryan; Meaburn, Karen J; Misteli, Tom; Lockett, Stephen J

    2012-09-01

    Analysis of preferential localization of certain genes within the cell nuclei is emerging as a new technique for the diagnosis of breast cancer. Quantitation requires accurate segmentation of 100-200 cell nuclei in each tissue section to draw a statistically significant result. Thus, for large-scale analysis, manual processing is too time consuming and subjective. Fortuitously, acquired images generally contain many more nuclei than are needed for analysis. Therefore, we developed an integrated workflow that selects, following automatic segmentation, a subpopulation of accurately delineated nuclei for positioning of fluorescence in situ hybridization-labeled genes of interest. Segmentation was performed by a multistage watershed-based algorithm and screening by an artificial neural network-based pattern recognition engine. The performance of the workflow was quantified in terms of the fraction of automatically selected nuclei that were visually confirmed as well segmented and by the boundary accuracy of the well-segmented nuclei relative to a 2D dynamic programming-based reference segmentation method. Application of the method was demonstrated for discriminating normal and cancerous breast tissue sections based on the differential positioning of the HES5 gene. Automatic results agreed with manual analysis in 11 out of 14 cancers, all four normal cases, and all five noncancerous breast disease cases, thus showing the accuracy and robustness of the proposed approach. Published 2012 Wiley Periodicals, Inc.
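
    The workflow above is segment-then-screen: a watershed-based stage proposes candidate nuclei and a learned classifier keeps only the well-segmented ones. A minimal sketch of that two-stage idea follows, using scikit-image's marker-controlled watershed on a synthetic image and a simple shape rule in place of the paper's multistage watershed and neural-network screening engine; all thresholds are illustrative.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.feature import peak_local_max
      from skimage.segmentation import watershed
      from skimage.measure import regionprops
      from skimage.draw import disk

      # Synthetic "nuclei" image: a few bright blobs on a dark background.
      img = np.zeros((128, 128))
      for c in [(30, 30), (60, 90), (95, 40)]:
          rr, cc = disk(c, 12)
          img[rr, cc] = 1.0

      # Stage 1: marker-controlled watershed on the distance transform
      # (a simplified stand-in for the paper's multistage watershed).
      mask = img > 0.5
      labeled, _ = ndi.label(mask)
      dist = ndi.distance_transform_edt(mask)
      peaks = peak_local_max(dist, labels=labeled, min_distance=10)
      markers = np.zeros_like(labeled)
      markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
      labels = watershed(-dist, markers, mask=mask)

      # Stage 2: screening. The paper uses a trained neural-network pattern
      # recognizer; a simple area/solidity rule stands in for that classifier.
      kept = [r.label for r in regionprops(labels)
              if r.area > 100 and r.solidity > 0.9]
      print("well-segmented candidates:", kept)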

  4. Identification of inelastic parameters based on deep drawing forming operations using a global-local hybrid Particle Swarm approach

    NASA Astrophysics Data System (ADS)

    Vaz, Miguel; Luersen, Marco A.; Muñoz-Rojas, Pablo A.; Trentin, Robson G.

    2016-04-01

    Application of optimization techniques to the identification of inelastic material parameters has increased substantially in recent years. The complex stress-strain paths and high nonlinearity typical of this class of problems require robust and efficient inverse-problem techniques able to cope with an irregular topography of the fitness surface. Within this framework, this work investigates the application of the gradient-based Sequential Quadratic Programming method, the Nelder-Mead downhill simplex algorithm, Particle Swarm Optimization (PSO), and a global-local PSO-Nelder-Mead hybrid scheme to the identification of inelastic parameters based on a deep drawing operation. The hybrid technique proved to be the best strategy, combining the ability of PSO to approach the basin of attraction of the global minimum with the efficiency of the Nelder-Mead algorithm in locating the minimum itself.
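
    The global-local division of labour described above is easy to express in code: a coarse swarm search hands its best point to a simplex refinement. The sketch below assumes a generic test objective (SciPy's Rosenbrock function) in place of the deep-drawing inverse problem, and all swarm settings are arbitrary.

      import numpy as np
      from scipy.optimize import minimize, rosen

      rng = np.random.default_rng(1)

      def pso(f, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
          # Bare-bones particle swarm: returns the best position found.
          lo, hi = np.array(bounds).T
          x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
          v = np.zeros_like(x)
          pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
          gbest = pbest[pbest_f.argmin()].copy()
          for _ in range(iters):
              r1, r2 = rng.random(x.shape), rng.random(x.shape)
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
              x = np.clip(x + v, lo, hi)
              fx = np.array([f(p) for p in x])
              improved = fx < pbest_f
              pbest[improved], pbest_f[improved] = x[improved], fx[improved]
              gbest = pbest[pbest_f.argmin()].copy()
          return gbest

      # Global stage (PSO) followed by local refinement (Nelder-Mead), the same
      # division of labour as the hybrid scheme described above; Rosenbrock is
      # only a placeholder for the deep-drawing identification problem.
      x0 = pso(rosen, bounds=[(-2, 2)] * 4)
      res = minimize(rosen, x0, method="Nelder-Mead")
      print(res.x.round(3), res.fun)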

  5. Impact of Noise on a Dynamical System: Prediction and Uncertainties from a Swarm-Optimized Neural Network

    PubMed Central

    López-Caraballo, C. H.; Lazzús, J. A.; Salfate, I.; Rojas, P.; Rivera, M.; Palma-Chilla, L.

    2015-01-01

    An artificial neural network (ANN) based on particle swarm optimization (PSO) was developed for time series prediction. The hybrid ANN+PSO algorithm was applied to the Mackey-Glass chaotic time series for short-term prediction of x(t + 6). The prediction performance was evaluated and compared with other studies available in the literature. We also examined properties of the dynamical system through the chaotic behaviour of the predicted time series. Next, the hybrid ANN+PSO algorithm was complemented with a Gaussian stochastic procedure (called stochastic hybrid ANN+PSO) to obtain a new estimator of the predictions, which also allowed us to compute the uncertainties of predictions for the noisy Mackey-Glass chaotic time series. We thus studied the impact of noise for several cases with a white noise level (σ N) from 0.01 to 0.1. PMID:26351449
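
    In the ANN+PSO scheme above, the swarm searches directly over the network's weights instead of using gradient training. The sketch below shows that mechanism on a toy one-step-ahead task: the Mackey-Glass series is replaced by a synthetic signal, and the network size and swarm settings are arbitrary assumptions.

      import numpy as np

      rng = np.random.default_rng(2)

      # Toy series standing in for Mackey-Glass; predict ahead from 4 past samples.
      t = np.arange(400)
      series = np.sin(0.08 * t) + 0.3 * np.sin(0.23 * t)
      X = np.array([series[i:i + 4] for i in range(len(series) - 5)])
      y = series[5:]

      def mlp(w, X, n_hidden=6):
          # Small 4-n_hidden-1 network; w packs all weights and biases.
          W1 = w[:4 * n_hidden].reshape(4, n_hidden)
          b1 = w[4 * n_hidden:5 * n_hidden]
          W2 = w[5 * n_hidden:6 * n_hidden]
          b2 = w[-1]
          return np.tanh(X @ W1 + b1) @ W2 + b2

      def mse(w):
          return np.mean((mlp(w, X) - y) ** 2)

      # Bare-bones PSO over the weight vector (the "ANN+PSO" idea; swarm size
      # and iteration count are arbitrary).
      dim = 4 * 6 + 6 + 6 + 1
      pos = rng.uniform(-1, 1, (40, dim))
      vel = np.zeros_like(pos)
      pbest, pbest_f = pos.copy(), np.array([mse(p) for p in pos])
      gbest = pbest[pbest_f.argmin()].copy()
      for _ in range(200):
          r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos += vel
          f = np.array([mse(p) for p in pos])
          better = f < pbest_f
          pbest[better], pbest_f[better] = pos[better], f[better]
          gbest = pbest[pbest_f.argmin()].copy()
      print("training MSE:", mse(gbest))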

  6. Impact of Noise on a Dynamical System: Prediction and Uncertainties from a Swarm-Optimized Neural Network.

    PubMed

    López-Caraballo, C H; Lazzús, J A; Salfate, I; Rojas, P; Rivera, M; Palma-Chilla, L

    2015-01-01

    An artificial neural network (ANN) based on particle swarm optimization (PSO) was developed for time series prediction. The hybrid ANN+PSO algorithm was applied to the Mackey-Glass chaotic time series for short-term prediction of x(t + 6). The prediction performance was evaluated and compared with other studies available in the literature. We also examined properties of the dynamical system through the chaotic behaviour of the predicted time series. Next, the hybrid ANN+PSO algorithm was complemented with a Gaussian stochastic procedure (called stochastic hybrid ANN+PSO) to obtain a new estimator of the predictions, which also allowed us to compute the uncertainties of predictions for the noisy Mackey-Glass chaotic time series. We thus studied the impact of noise for several cases with a white noise level (σ(N)) from 0.01 to 0.1.

  7. High speed corner and gap-seal computations using an LU-SGS scheme

    NASA Technical Reports Server (NTRS)

    Coirier, William J.

    1989-01-01

    The hybrid Lower-Upper Symmetric Gauss-Seidel (LU-SGS) algorithm was added to a widely used series of 2D/3D Euler/Navier-Stokes solvers and was demonstrated for a particular class of high-speed flows. A limited study was conducted to compare the hybrid LU-SGS for approximate Newton iteration and diagonalized Beam-Warming (DBW) schemes on a work and convergence history basis. The hybrid LU-SGS algorithm is more efficient and easier to implement than the DBW scheme originally present in the code for the cases considered. The code was validated for the hypersonic flow through two mutually perpendicular flat plates and then used to investigate the flow field in and around a simplified scramjet module gap seal configuration. Due to the similarities, the gap seal flow was compared to hypersonic corner flow at the same freestream conditions and Reynolds number.
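
    The LU-SGS name refers to alternating lower- and upper-triangular (forward and backward) Gauss-Seidel sweeps used as an approximate implicit solver. The sketch below shows only that sweep structure on a small diagonally dominant linear system; it is not the flow solver described above.

      import numpy as np

      def symmetric_gauss_seidel(A, b, x0=None, sweeps=50):
          # One symmetric sweep = a forward (lower-triangular) pass followed by a
          # backward (upper-triangular) pass, the structure the LU-SGS scheme
          # exploits in its approximate factorization.
          n = len(b)
          x = np.zeros(n) if x0 is None else x0.astype(float).copy()
          for _ in range(sweeps):
              for i in range(n):                      # forward sweep
                  x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
              for i in range(n - 1, -1, -1):          # backward sweep
                  x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
          return x

      # Diagonally dominant test system (illustrative only).
      A = np.array([[4.0, -1, 0], [-1, 4, -1], [0, -1, 4]])
      b = np.array([1.0, 2, 3])
      print(symmetric_gauss_seidel(A, b), np.linalg.solve(A, b))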

  8. H-RANSAC: A Hybrid Point Cloud Segmentation Combining 2D and 3D Data

    NASA Astrophysics Data System (ADS)

    Adam, A.; Chatzilari, E.; Nikolopoulos, S.; Kompatsiaris, I.

    2018-05-01

    In this paper, we present a novel 3D segmentation approach operating on point clouds generated from overlapping images. The aim of the proposed hybrid approach is to effectively segment co-planar objects by leveraging the structural information originating from the 3D point cloud and the visual information from the 2D images, without resorting to learning-based procedures. More specifically, the proposed hybrid approach, H-RANSAC, is an extension of the well-known RANSAC plane-fitting algorithm, incorporating an additional consistency criterion based on the results of 2D segmentation. Our expectation that the integration of 2D data into 3D segmentation will achieve more accurate results is validated experimentally in the domain of 3D city models. Results show that H-RANSAC can successfully delineate building components like main facades and windows, and provide more accurate segmentation results compared to the typical RANSAC plane-fitting algorithm.
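
    H-RANSAC builds on the standard RANSAC plane-fitting loop, adding a 2D consistency test before a hypothesis is accepted. The sketch below implements only that standard loop on synthetic points, with a placeholder hook (extra_check) marking where the paper's 2D-segmentation criterion would plug in; all names and tolerances are assumptions.

      import numpy as np

      rng = np.random.default_rng(3)

      def fit_plane(pts):
          # Least-squares plane through 3+ points: unit normal n and offset d
          # with n . x + d = 0.
          centroid = pts.mean(axis=0)
          _, _, vt = np.linalg.svd(pts - centroid)
          n = vt[-1]
          return n, -n @ centroid

      def ransac_plane(points, n_iter=500, tol=0.02, extra_check=None):
          best_inliers = np.array([], dtype=int)
          best_model = None
          for _ in range(n_iter):
              sample = points[rng.choice(len(points), 3, replace=False)]
              n, d = fit_plane(sample)
              dist = np.abs(points @ n + d)
              inliers = np.where(dist < tol)[0]
              # Hook where H-RANSAC would add its 2D-segmentation consistency test.
              if extra_check is not None and not extra_check(inliers):
                  continue
              if len(inliers) > len(best_inliers):
                  best_inliers, best_model = inliers, fit_plane(points[inliers])
          return best_model, best_inliers

      # Synthetic facade-like plane plus clutter.
      plane_pts = np.c_[rng.uniform(0, 1, (200, 2)), 0.01 * rng.normal(size=200)]
      clutter = rng.uniform(0, 1, (80, 3))
      model, inliers = ransac_plane(np.vstack([plane_pts, clutter]))
      print("plane normal:", np.round(model[0], 3), "inliers:", len(inliers))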

  9. State estimation of stochastic non-linear hybrid dynamic system using an interacting multiple model algorithm.

    PubMed

    Elenchezhiyan, M; Prakash, J

    2015-09-01

    In this work, state estimation schemes for non-linear hybrid dynamic systems subjected to stochastic state disturbances and random errors in measurements are formulated using interacting multiple-model (IMM) algorithms. In order to compute both the discrete modes and the continuous state estimates of a hybrid dynamic system, either an IMM extended Kalman filter (IMM-EKF) or an IMM-based derivative-free Kalman filter is proposed in this study. The efficacy of the proposed IMM-based state estimation schemes is demonstrated by conducting Monte-Carlo simulation studies on the two-tank hybrid system and a switched non-isothermal continuous stirred tank reactor system. Extensive simulation studies reveal that the proposed IMM-based state estimation schemes are able to generate fairly accurate continuous state estimates and discrete modes. In both the presence and absence of sensor bias, the simulation studies reveal that the proposed IMM unscented Kalman filter (IMM-UKF) based simultaneous state and parameter estimation scheme outperforms the multiple-model UKF (MM-UKF) based scheme. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
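
    An IMM filter runs one Kalman filter per discrete mode, mixes their estimates using Markov transition probabilities, and updates the mode probabilities from each filter's measurement likelihood. The sketch below shows that machinery for a deliberately simple scalar random-walk with two process-noise modes; the models, noise levels, and transition matrix are illustrative, not those of the systems studied above.

      import numpy as np

      rng = np.random.default_rng(4)

      Q = np.array([0.01, 1.0])                # process noise of the two modes
      R = 0.5                                  # measurement noise
      T = np.array([[0.95, 0.05],
                    [0.05, 0.95]])             # Markov mode-transition probabilities

      def imm_step(z, x, P, mu):
          c = T.T @ mu                                      # predicted mode weights
          w = (T * mu[:, None]) / c                         # mixing weights w[i, j]
          x_mix = w.T @ x
          P_mix = np.array([np.sum(w[:, j] * (P + (x - x_mix[j]) ** 2))
                            for j in range(2)])
          x_new, P_new, like = np.zeros(2), np.zeros(2), np.zeros(2)
          for j in range(2):                                # mode-matched Kalman filters
              xp, Pp = x_mix[j], P_mix[j] + Q[j]
              S = Pp + R
              K = Pp / S
              v = z - xp
              x_new[j], P_new[j] = xp + K * v, (1 - K) * Pp
              like[j] = np.exp(-0.5 * v * v / S) / np.sqrt(2 * np.pi * S)
          mu_new = c * like
          mu_new /= mu_new.sum()
          return x_new, P_new, mu_new, mu_new @ x_new       # combined estimate

      x, P, mu = np.zeros(2), np.ones(2), np.array([0.5, 0.5])
      truth = np.cumsum(rng.normal(0, 0.1, 50))
      for z in truth + rng.normal(0, np.sqrt(R), 50):
          x, P, mu, x_comb = imm_step(z, x, P, mu)
      print("final mode probabilities:", np.round(mu, 3))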

  10. Optimizing Thermal-Elastic Properties of C/C–SiC Composites Using a Hybrid Approach and PSO Algorithm

    PubMed Central

    Xu, Yingjie; Gao, Tian

    2016-01-01

    Carbon fiber-reinforced multi-layered pyrocarbon–silicon carbide matrix (C/C–SiC) composites are widely used in aerospace structures. The complicated spatial architecture and material heterogeneity of C/C–SiC composites constitute the challenge for tailoring their properties. Thus, discovering the intrinsic relations between the properties and the microstructures and sequentially optimizing the microstructures to obtain composites with the best performances becomes the key for practical applications. The objective of this work is to optimize the thermal-elastic properties of unidirectional C/C–SiC composites by controlling the multi-layered matrix thicknesses. A hybrid approach based on micromechanical modeling and back propagation (BP) neural network is proposed to predict the thermal-elastic properties of composites. Then, a particle swarm optimization (PSO) algorithm is interfaced with this hybrid model to achieve the optimal design for minimizing the coefficient of thermal expansion (CTE) of composites with the constraint of elastic modulus. Numerical examples demonstrate the effectiveness of the proposed hybrid model and optimization method. PMID:28773343

  11. Matrix Algebra for GPU and Multicore Architectures (MAGMA) for Large Petascale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dongarra, Jack J.; Tomov, Stanimire

    2014-03-24

    The goal of the MAGMA project is to create a new generation of linear algebra libraries that achieve the fastest possible time to an accurate solution on hybrid Multicore+GPU-based systems, using all the processing power that future high-end systems can make available within given energy constraints. Our efforts at the University of Tennessee achieved the goals set in all of the five areas identified in the proposal: 1. Communication optimal algorithms; 2. Autotuning for GPU and hybrid processors; 3. Scheduling and memory management techniques for heterogeneity and scale; 4. Fault tolerance and robustness for large scale systems; 5. Building energy efficiency into software foundations. The University of Tennessee’s main contributions, as proposed, were the research and software development of new algorithms for hybrid multi/many-core CPUs and GPUs, as related to two-sided factorizations and complete eigenproblem solvers, hybrid BLAS, and energy efficiency for dense, as well as sparse, operations. Furthermore, as proposed, we investigated and experimented with various techniques targeting the five main areas outlined.

  12. Optimizing coherent anti-Stokes Raman scattering by genetic algorithm controlled pulse shaping

    NASA Astrophysics Data System (ADS)

    Yang, Wenlong; Sokolov, Alexei

    2010-10-01

    Hybrid coherent anti-Stokes Raman scattering (CARS) has been successfully applied to fast, chemically sensitive detection. With the development of femtosecond pulse-shaping techniques, it is of great interest to find the optimum pulse shapes for CARS. The optimum pulse shapes should minimize the non-resonant four-wave mixing (NRFWM) background and maximize the CARS signal. A genetic algorithm (GA) is developed to perform a heuristic search for optimized pulse shapes that give the best signal-to-background ratio. The GA is shown to be able to rediscover the hybrid CARS scheme and to find optimized pulse shapes for customized applications on its own.
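
    A genetic algorithm of the kind described above evolves a population of candidate spectral phase masks against a fitness function. The sketch below keeps only that optimization loop; the CARS signal-to-background physics is replaced by a toy objective (distance to a pi-step mask), so the fitness, mask length, and GA settings are all placeholder assumptions.

      import numpy as np

      rng = np.random.default_rng(5)

      # Toy stand-in for the CARS signal-to-background objective: reward phase
      # masks close to a target pi-step shape. The real fitness would come from
      # measured or simulated spectra, not from this placeholder.
      N = 32
      target = np.where(np.arange(N) < N // 2, 0.0, np.pi)

      def fitness(phase):
          return -np.mean((phase - target) ** 2)

      def ga(pop_size=40, gens=150, mut_sigma=0.2):
          pop = rng.uniform(0, 2 * np.pi, (pop_size, N))
          for _ in range(gens):
              scores = np.array([fitness(p) for p in pop])
              parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]   # truncation selection
              children = []
              for _ in range(pop_size - len(parents)):
                  a, b = parents[rng.choice(len(parents), 2, replace=False)]
                  cut = rng.integers(1, N)                   # one-point crossover
                  child = np.r_[a[:cut], b[cut:]]
                  child += rng.normal(0, mut_sigma, N)       # Gaussian mutation
                  children.append(child)
              pop = np.vstack([parents, children])
          return pop[np.argmax([fitness(p) for p in pop])]

      best = ga()
      print("fitness of best mask:", round(fitness(best), 4))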

  13. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    PubMed Central

    Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft-computing prediction models in finance, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine its ability to forecast exchange rate values for a horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model against autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with a K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help reduce the risk of making bad decisions in the decision-making process. PMID:26977450

  14. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network.

    PubMed

    Falat, Lukas; Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft-computing prediction models in finance, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine its ability to forecast exchange rate values for a horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model against autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with a K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help reduce the risk of making bad decisions in the decision-making process.

  15. Multiple Time-Step Dual-Hamiltonian Hybrid Molecular Dynamics — Monte Carlo Canonical Propagation Algorithm

    PubMed Central

    Weare, Jonathan; Dinner, Aaron R.; Roux, Benoît

    2016-01-01

    A multiple time-step integrator based on a dual Hamiltonian and a hybrid method combining molecular dynamics (MD) and Monte Carlo (MC) is proposed to sample systems in the canonical ensemble. The Dual Hamiltonian Multiple Time-Step (DHMTS) algorithm is based on two similar Hamiltonians: a computationally expensive one that serves as a reference and a computationally inexpensive one to which the workload is shifted. The central assumption is that the difference between the two Hamiltonians is slowly varying. Earlier work has shown that such dual Hamiltonian multiple time-step schemes effectively precondition nonlinear differential equations for dynamics by reformulating them into a recursive root finding problem that can be solved by propagating a correction term through an internal loop, analogous to RESPA. Of special interest in the present context, a hybrid MD-MC version of the DHMTS algorithm is introduced to enforce detailed balance via a Metropolis acceptance criterion and ensure consistency with the Boltzmann distribution. The Metropolis criterion suppresses the discretization errors normally associated with the propagation according to the computationally inexpensive Hamiltonian, treating the discretization error as an external work. Illustrative tests are carried out to demonstrate the effectiveness of the method. PMID:26918826
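
    The key consistency step in the hybrid MD-MC scheme above is a Metropolis test evaluated with the expensive reference Hamiltonian after propagating with the inexpensive one. The sketch below shows only that accept/reject mechanism for a single 1D particle with toy potentials; it is not the DHMTS integrator itself, and all parameters are assumptions.

      import numpy as np

      rng = np.random.default_rng(6)
      beta = 1.0

      # Reference (expensive) and surrogate (cheap) potentials for a 1D particle;
      # both are toy functions chosen only to illustrate the acceptance step.
      U_ref   = lambda x: 0.5 * x**2 + 0.1 * x**4
      U_cheap = lambda x: 0.5 * x**2
      H = lambda U, x, p: U(x) + 0.5 * p**2

      def hybrid_mc_step(x, dt=0.1, n_md=10):
          p = rng.normal()                          # fresh momentum each trial
          x_new, p_new = x, p
          for _ in range(n_md):                     # velocity Verlet with the cheap Hamiltonian
              p_new -= 0.5 * dt * x_new             # force of U_cheap is -x
              x_new += dt * p_new
              p_new -= 0.5 * dt * x_new
          # Metropolis test on the *reference* Hamiltonian removes the bias
          # introduced by propagating with the inexpensive surrogate.
          dH = H(U_ref, x_new, p_new) - H(U_ref, x, p)
          return x_new if rng.random() < np.exp(-beta * dH) else x

      x, samples = 0.0, []
      for _ in range(5000):
          x = hybrid_mc_step(x)
          samples.append(x)
      print("sample mean/var:", round(np.mean(samples), 3), round(np.var(samples), 3))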

  16. Artificial Neural Network and Genetic Algorithm Hybrid Intelligence for Predicting Thai Stock Price Index Trend

    PubMed Central

    Boonjing, Veera; Intakosum, Sarun

    2016-01-01

    This study investigated the use of an Artificial Neural Network (ANN) and a Genetic Algorithm (GA) for prediction of the trend of Thailand's SET50 index. ANN is a widely accepted machine learning method that uses past data to predict future trends, while GA is an algorithm that can find better subsets of input variables for importing into the ANN, enabling more accurate prediction through efficient feature selection. The input data were technical indicators highly regarded by stock analysts, each represented by 4 input variables based on past time spans of 4 different lengths: 3-, 5-, 10-, and 15-day spans before the day of prediction. This generated a large set of diverse input variables, with an exponentially larger number of possible subsets, which the GA culled down to a manageable number of more effective ones. SET50 index data of the past 6 years, from 2009 to 2014, were used to evaluate the prediction accuracy of this hybrid intelligence, and its predictions were found to be more accurate than those made by a method using only one input variable for one fixed length of past time span. PMID:27974883
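
    In the hybrid above, each GA chromosome is a binary mask over candidate inputs and its fitness is the accuracy of an ANN trained on that subset. The sketch below shows that loop with scikit-learn on synthetic data; the dataset, network size, and GA settings are placeholder assumptions, not the SET50 indicators or the paper's configuration.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(7)

      # Synthetic stand-in for the technical-indicator inputs.
      X, y = make_classification(n_samples=300, n_features=16, n_informative=5,
                                 random_state=0)

      def fitness(mask):
          # Fitness = cross-validated accuracy of a small ANN on the selected subset.
          if mask.sum() == 0:
              return 0.0
          clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=300, random_state=0)
          return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

      # Bare-bones GA over binary feature masks: truncation selection, one-point
      # crossover, and bit-flip mutation.
      pop = rng.integers(0, 2, size=(12, X.shape[1]))
      for _ in range(5):
          scores = np.array([fitness(m) for m in pop])
          parents = pop[np.argsort(scores)[-6:]]
          children = []
          for _ in range(6):
              a, b = parents[rng.choice(6, 2, replace=False)]
              cut = rng.integers(1, X.shape[1])
              child = np.r_[a[:cut], b[cut:]]
              child[rng.random(X.shape[1]) < 0.05] ^= 1
              children.append(child)
          pop = np.vstack([parents, children])
      best = pop[np.argmax([fitness(m) for m in pop])]
      print("selected feature indices:", np.flatnonzero(best))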

  17. Artificial Neural Network and Genetic Algorithm Hybrid Intelligence for Predicting Thai Stock Price Index Trend.

    PubMed

    Inthachot, Montri; Boonjing, Veera; Intakosum, Sarun

    2016-01-01

    This study investigated the use of an Artificial Neural Network (ANN) and a Genetic Algorithm (GA) for prediction of the trend of Thailand's SET50 index. ANN is a widely accepted machine learning method that uses past data to predict future trends, while GA is an algorithm that can find better subsets of input variables for importing into the ANN, enabling more accurate prediction through efficient feature selection. The input data were technical indicators highly regarded by stock analysts, each represented by 4 input variables based on past time spans of 4 different lengths: 3-, 5-, 10-, and 15-day spans before the day of prediction. This generated a large set of diverse input variables, with an exponentially larger number of possible subsets, which the GA culled down to a manageable number of more effective ones. SET50 index data of the past 6 years, from 2009 to 2014, were used to evaluate the prediction accuracy of this hybrid intelligence, and its predictions were found to be more accurate than those made by a method using only one input variable for one fixed length of past time span.

  18. Highly noise-tolerant hybrid algorithm for phase retrieval from a single-shot spatial carrier fringe pattern

    NASA Astrophysics Data System (ADS)

    Dong, Zhichao; Cheng, Haobo

    2018-01-01

    A highly noise-tolerant hybrid algorithm (NTHA) is proposed in this study for phase retrieval from a single-shot spatial carrier fringe pattern (SCFP); it effectively combines the merits of the spatial carrier phase shift method and the two-dimensional continuous wavelet transform (2D-CWT). NTHA first extracts three phase-shifted fringe patterns from the SCFP with a one-pixel malposition; it then calculates phase gradients by subtracting the reference phase from the other two target phases, each retrieved from the phase-shifted fringe patterns by 2D-CWT; finally, it reconstructs the phase map by a least-squares gradient integration method. Its typical characteristics include, but are not limited to: (1) it does not require the spatial carrier to be constant; (2) the subtraction mitigates the edge errors of 2D-CWT; (3) it is highly noise-tolerant, because not only is 2D-CWT insensitive to noise, but the noise in the fringe pattern also does not directly take part in the phase reconstruction, as it does in the previous hybrid algorithm. Its feasibility and performance are validated extensively by simulations and by comparative experiments against the temporal phase shift, Fourier transform, and 2D-CWT methods.

  19. A real-time simulation evaluation of an advanced detection, isolation and accommodation algorithm for sensor failures in turbine engines

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.; Delaat, J. C.

    1986-01-01

    An advanced sensor failure detection, isolation, and accommodation (ADIA) algorithm has been developed for use with an aircraft turbofan engine control system. In a previous paper the authors described the ADIA algorithm and its real-time implementation. Subsequent improvements made to the algorithm and implementation are discussed, and the results of an evaluation are presented. The evaluation used a real-time, hybrid computer simulation of an F100 turbofan engine.

  20. Storm Identification and Tracking for Hydrologic Modeling Using Hourly Accumulated NEXRAD Precipitation Data

    NASA Astrophysics Data System (ADS)

    Olivera, F.; Choi, J.; Socolofsky, S.

    2006-12-01

    Watershed responses to storm events are strongly affected by the spatial and temporal patterns of rainfall, that is, the spatial distribution of the precipitation intensity and its evolution over time. Although real storms are moving entities with non-uniform intensities in both space and time, hydrological applications often synthesize these attributes by assuming storms that are uniformly distributed and have variable intensity according to a pre-defined hyetograph shape. As one considers watersheds of greater size, the non-uniformity of rainfall becomes more important, because a storm may not cover the watershed's entire area and may not stay in the watershed for its full duration. In order to incorporate parameters such as storm area, propagation velocity and direction, and intensity distribution in the definition of synthetic storms, it is necessary to determine these storm characteristics from spatially distributed precipitation data. To date, most algorithms for identifying and tracking storms have been applied to short time-step radar reflectivity data (i.e., 15 minutes or less), where storm features are captured in an effectively synoptic manner. For the entire United States, however, the most reliable distributed precipitation data are the one-hour accumulated 4 km × 4 km gridded NEXRAD data of the U.S. National Weather Service (NWS) (NWS 2005). The one-hour aggregation level of the data, though, makes it more difficult to identify and track storms than when using sequences of synoptic radar reflectivity data, because storms can traverse a number of NEXRAD cells and change size and shape appreciably between consecutive data maps. In this paper, we present a methodology to overcome these identification and tracking difficulties and to extract the characteristics of moving storms (e.g., size, propagation velocity and direction, and intensity distribution) from one-hour accumulated distributed rainfall data. The algorithm uses Gaussian Mixture Models (GMM) for storm identification and image processing for storm tracking. The method has been successfully applied to Brazos County in Texas using the 2003 Multi-sensor Precipitation Estimator (MPE) NEXRAD rainfall data.
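
    The identification step above treats each hourly rainfall map as a mixture of Gaussian "storm cells". The sketch below fits a two-component Gaussian mixture to the coordinates of raining grid cells on a synthetic map, replicating coordinates in proportion to rainfall depth as a simple weighting; the grid, threshold, and component count are assumptions, and the image-processing tracking step is not shown.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      # Synthetic hourly-accumulation grid with two "storm cells" on a 4 km mesh.
      grid = np.zeros((50, 50))
      yy, xx = np.mgrid[0:50, 0:50]
      for cy, cx, amp in [(15, 12, 20.0), (35, 38, 12.0)]:
          grid += amp * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 30.0)

      # Treat rainfall depth as a weight: replicate raining-cell coordinates in
      # proportion to intensity, then fit a two-component Gaussian mixture. The
      # component means and covariances summarize storm position and spatial extent.
      ys, xs = np.nonzero(grid > 1.0)
      weights = np.rint(grid[ys, xs]).astype(int)
      pts = np.repeat(np.c_[xs, ys], weights, axis=0)

      gmm = GaussianMixture(n_components=2, random_state=0).fit(pts)
      for mean, cov in zip(gmm.means_, gmm.covariances_):
          print("storm centre (cells):", np.round(mean, 1),
                "spread (cells):", np.round(np.sqrt(np.diag(cov)), 1))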
