Science.gov

Sample records for algorithm shows good

  1. Why is Boris algorithm so good?

    SciTech Connect

    Qin, Hong; Zhang, Shuangxi; Xiao, Jianyuan; Liu, Jian; Sun, Yajuan; Tang, William M.

    2013-08-15

    Due to its excellent long term accuracy, the Boris algorithm is the de facto standard for advancing a charged particle. Despite its popularity, up to now there has been no convincing explanation why the Boris algorithm has this advantageous feature. In this paper, we provide an answer to this question. We show that the Boris algorithm conserves phase space volume, even though it is not symplectic. The global bound on energy error typically associated with symplectic algorithms still holds for the Boris algorithm, making it an effective algorithm for the multi-scale dynamics of plasmas.
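
    As an illustration of the scheme the abstract discusses, a minimal non-relativistic Boris push can be written in a few lines of Python with NumPy. This is a generic sketch, not the authors' code; the charge q, mass m, time step dt, and field vectors E and B are assumed to be given.

      import numpy as np

      def boris_push(x, v, q, m, dt, E, B):
          # One Boris step: half electric kick, magnetic rotation, half kick.
          v_minus = v + (q * dt / (2.0 * m)) * E        # first half kick
          t = (q * dt / (2.0 * m)) * B                  # rotation vector
          s = 2.0 * t / (1.0 + np.dot(t, t))
          v_prime = v_minus + np.cross(v_minus, t)
          v_plus = v_minus + np.cross(v_prime, s)       # pure rotation: |v| preserved
          v_new = v_plus + (q * dt / (2.0 * m)) * E     # second half kick
          x_new = x + dt * v_new                        # leapfrog position update
          return x_new, v_new

    The rotation step changes only the direction of the velocity, never its magnitude, which is the discrete structure behind the volume preservation and bounded energy error described above.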

  2. Why is Boris Algorithm So Good?

    SciTech Connect

    Qin, Hong; et al.

    2013-03-03

    Due to its excellent long term accuracy, the Boris algorithm is the de facto standard for advancing a charged particle. Despite its popularity, up to now there has been no convincing explanation why the Boris algorithm has this advantageous feature. In this letter, we provide an answer to this question. We show that the Boris algorithm conserves phase space volume, even though it is not symplectic. The global bound on energy error typically associated with symplectic algorithms still holds for the Boris algorithm, making it an effective algorithm for the multi-scale dynamics of plasmas.

  3. Can You Show You Are a Good Lecturer?

    ERIC Educational Resources Information Center

    Wood, Leigh N.; Harding, Ansie

    2007-01-01

    Measurement of the quality of teaching activities is becoming increasingly important since universities are rewarding performance in terms of promotion, awards and bonuses and research is no longer the only key performance indicator. Good teaching is not easy to identify and measure. This paper specifically deals with the issue of good teaching in…

  4. Winners show the way to good management in health care.

    PubMed

    Schwefel, D; Pons, M C

    1994-01-01

    To stimulate resourcefulness in the health care services of the Philippines, the German Agency for Technical Cooperation (GTZ) organized a competition to discover and publicize examples of good management. The results provide a rich fund of new ideas. PMID:7999220

  5. Cationorm shows good tolerability on human HCE-2 corneal epithelial cell cultures.

    PubMed

    Kinnunen, Kati; Kauppinen, Anu; Piippo, Niina; Koistinen, Arto; Toropainen, Elisa; Kaarniranta, Kai

    2014-03-01

    mitochondrial metabolism to 73% with Cationorm and 53% with BAK from that of the control cells after 30 min exposure in MTT assay. BAK was the only test compound having clear adverse effects on the cell number and metabolism in CCK-8 assay. The activity of caspase-3 did not show significant differences between the groups. Inflammatory response after exposure to Cationorm was significantly lower than after exposure to BAK. There were no significant differences in NF-κB activity between the groups. Diluted Cationorm and Systane with polyquaternium-1/polidronium chloride 0.001% showed good tolerability on HCE-2 cells and thereby provide a clear improvement when compared to BAK-containing eye drop formulations.

  6. Cationorm shows good tolerability on human HCE-2 corneal epithelial cell cultures.

    PubMed

    Kinnunen, Kati; Kauppinen, Anu; Piippo, Niina; Koistinen, Arto; Toropainen, Elisa; Kaarniranta, Kai

    2014-03-01

    mitochondrial metabolism to 73% with Cationorm and 53% with BAK from that of the control cells after 30 min exposure in MTT assay. BAK was the only test compound having clear adverse effects on the cell number and metabolism in CCK-8 assay. The activity of caspase-3 did not show significant differences between the groups. Inflammatory response after exposure to Cationorm was significantly lower than after exposure to BAK. There were no significant differences in NF-κB activity between the groups. Diluted Cationorm and Systane with polyquaternium-1/polidronium chloride 0.001% showed good tolerability on HCE-2 cells and thereby provide a clear improvement when compared to BAK-containing eye drop formulations. PMID:24462278

  7. You Showed Your Whiteness: You Don't Get a "Good" White People's Medal

    ERIC Educational Resources Information Center

    Hayes, Cleveland; Juarez, Brenda G.

    2009-01-01

    The White liberal is a person who finds themselves defined as White, as an oppressor, in short, and retreats in horror from that designation. The desire to be and to be known as a good White person stems from the recognition that Whiteness is problematic, recognition that many White liberals try to escape by being demonstrably different from…

  8. Nonoperatively treated forearm shaft fractures in children show good long-term recovery

    PubMed Central

    Sinikumpu, Juha-Jaakko; Victorzon, Sarita; Antila, Eeva; Pokka, Tytti; Serlo, Willy

    2014-01-01

    Background and purpose — The incidence of forearm shaft fractures in children has increased and operative treatment has increased compared with nonoperative treatment in recent years. We analyzed the long-term results of nonoperative treatment. Patients and methods — We performed a population-based age- and sex-matched case-control study in Vaasa Central Hospital, concerning fractures treated in the period 1995–1999. There were 47 nonoperatively treated both-bone forearm shaft fractures, and the patients all participated in the study. 1 healthy control per case was randomly selected and evaluated for comparison. We analyzed clinical and radiographic outcomes of all fractures at a mean of 11 (9–14) years after the trauma. Results — The main outcome, pronosupination of the forearm, was not decreased in the long term. Grip strength was also as good as in the controls. Wrist mobility was similar in flexion (85°) and extension (83°) compared to the contralateral side. The patients were satisfied with the outcome, and pain-free. Radiographically, 4 cases had radio-carpal joint degeneration and 4 had a local bone deformity. Interpretation — The long-term outcome of nonoperatively treated both-bone forearm shaft fractures in children was excellent. PMID:25238437

  9. Using the Talk Show to "Talk Back" to O'Connor's "Good Country People."

    ERIC Educational Resources Information Center

    Schevera, Nicholas

    1998-01-01

    Describes how a teacher of a college introductory-literature course used role-playing, a talk-show format, and reader-audience participation to help students make collaborative meaning for, and to promote students' active engagement with a Flannery O'Connor short story. (SR)

  10. Oxygen isotopes in tree rings show good coherence between species and sites in Bolivia

    NASA Astrophysics Data System (ADS)

    Baker, Jessica C. A.; Hunt, Sarah F. P.; Clerici, Santiago J.; Newton, Robert J.; Bottrell, Simon H.; Leng, Melanie J.; Heaton, Timothy H. E.; Helle, Gerhard; Argollo, Jaime; Gloor, Manuel; Brienen, Roel J. W.

    2015-10-01

    A tree ring oxygen isotope (δ18OTR) chronology developed from one species (Cedrela odorata) growing in a single site has been shown to be a sensitive proxy for rainfall over the Amazon Basin, thus allowing reconstructions of precipitation in a region where meteorological records are short and scarce. Although these results suggest that there should be large-scale (> 100 km) spatial coherence of δ18OTR records in the Amazon, this has not been tested. Furthermore, it is of interest to investigate whether other, possibly longer-lived, species similarly record interannual variation of Amazon precipitation, and can be used to develop climate sensitive isotope chronologies. In this study, we measured δ18O in tree rings from seven lowland and one highland tree species from Bolivia. We found that cross-dating with δ18OTR gave more accurate tree ring dates than using ring width. Our "isotope cross-dating approach" is confirmed with radiocarbon "bomb-peak" dates, and has the potential to greatly facilitate development of δ18OTR records in the tropics, identify dating errors, and check annual ring formation in tropical trees. Six of the seven lowland species correlated significantly with C. odorata, showing that variation in δ18OTR has a coherent imprint across very different species, most likely arising from a dominant influence of source water δ18O on δ18OTR. In addition we show that δ18OTR series cohere over large distances, within and between species. Comparison of two C. odorata δ18OTR chronologies from sites several hundreds of kilometres apart showed a very strong correlation (r = 0.80, p < 0.001, 1901-2001), and a significant (but weaker) relationship was found between lowland C. odorata trees and a Polylepis tarapacana tree growing in the distant Altiplano (r = 0.39, p < 0.01, 1931-2001). This large-scale coherence of δ18OTR records is probably triggered by a strong spatial coherence in precipitation δ18O due to large-scale controls. These results

  11. When being narrow minded is a good thing: locally biased people show stronger contextual cueing.

    PubMed

    Bellaera, Lauren; von Mühlenen, Adrian; Watson, Derrick G

    2014-01-01

    Repeated contexts allow us to find relevant information more easily. Learning such contexts has been proposed to depend upon either global processing of the repeated contexts, or alternatively processing of the local region surrounding the target information. In this study, we measured the extent to which observers were by default biased to process towards a more global or local level. The findings showed that the ability to use context to help guide their search was strongly related to an observer's local/global processing bias. Locally biased people could use context to help improve their search better than globally biased people. The results suggest that the extent to which context can be used depends crucially on the observer's attentional bias and thus also to factors and influences that can change this bias. PMID:24313265

  12. Validity of reduced radiation dose for localized diffuse large B-cell lymphoma showing a good response to chemotherapy.

    PubMed

    Koiwai, Keiichiro; Sasaki, Shigeru; Yoshizawa, Eriko; Ina, Hironobu; Fukazawa, Ayumu; Sakai, Katsuya; Ozawa, Takesumi; Matsushita, Hirohide; Kadoya, Masumi

    2014-03-01

    To evaluate the validity of a decrease in the radiation dose for patients who were good responders to chemotherapy for localized diffuse large B-cell lymphoma (DLBCL), 91 patients with localized DLBCL who underwent radiotherapy after multi-agent chemotherapy from 1988-2008 were reviewed. Exclusion criteria were as follows: central nervous system or nasal cavity primary site, or Stage II with bulky tumor (≥10 cm). Of these patients, 62 were identified as good responders to chemotherapy. They were divided into two groups receiving either a higher or a lower radiation dose (32-50.4 Gy or 15-30.6 Gy, respectively). There were no statistically significant differences between the lower and higher dose groups in progression-free survival, locoregional progression-free survival or overall survival. Adaptation of decreased radiation dose may be valid for localized DLBCL patients who show a good response to chemotherapy. PMID:24187329

  13. Climatic associations of British species distributions show good transferability in time but low predictive accuracy for range change.

    PubMed

    Rapacciuolo, Giovanni; Roy, David B; Gillings, Simon; Fox, Richard; Walker, Kevin; Purvis, Andy

    2012-01-01

    Conservation planners often wish to predict how species distributions will change in response to environmental changes. Species distribution models (SDMs) are the primary tool for making such predictions. Many methods are widely used; however, they all make simplifying assumptions, and predictions can therefore be subject to high uncertainty. With global change well underway, field records of observed range shifts are increasingly being used for testing SDM transferability. We used an unprecedented distribution dataset documenting recent range changes of British vascular plants, birds, and butterflies to test whether correlative SDMs based on climate change provide useful approximations of potential distribution shifts. We modelled past species distributions from climate using nine single techniques and a consensus approach, and projected the geographical extent of these models to a more recent time period based on climate change; we then compared model predictions with recent observed distributions in order to estimate the temporal transferability and prediction accuracy of our models. We also evaluated the relative effect of methodological and taxonomic variation on the performance of SDMs. Models showed good transferability in time when assessed using widespread metrics of accuracy. However, models had low accuracy to predict where occupancy status changed between time periods, especially for declining species. Model performance varied greatly among species within major taxa, but there was also considerable variation among modelling frameworks. Past climatic associations of British species distributions retain a high explanatory power when transferred to recent time--due to their accuracy to predict large areas retained by species--but fail to capture relevant predictors of change. We strongly emphasize the need for caution when using SDMs to predict shifts in species distributions: high explanatory power on temporally-independent records--as assessed using

  14. Good Show by Today's Students

    ERIC Educational Resources Information Center

    Lowry, W. Kenneth

    1977-01-01

    Investigates whether today's students would score as well as students of the 1930-1950 era on achievement tests. Uses the Progressive Achievement Test, a test widely used in the 1930-1950 era as a barometer of student ability. (RK)

  15. Chia Seed Shows Good Protein Quality, Hypoglycemic Effect and Improves the Lipid Profile and Liver and Intestinal Morphology of Wistar Rats.

    PubMed

    da Silva, Bárbara Pereira; Dias, Desirrê Morais; de Castro Moreira, Maria Eliza; Toledo, Renata Celi Lopes; da Matta, Sérgio Luis Pinto; Lucia, Ceres Mattos Della; Martino, Hércia Stampini Duarte; Pinheiro-Sant'Ana, Helena Maria

    2016-09-01

    Chia is consumed worldwide due to its high fiber, lipid and protein content. The objective was to evaluate the protein quality of untreated (seed and flour) and heat-treated (90 °C/20 min) chia, and their influence on glucose and lipid homeostasis and on the integrity of liver and intestinal morphology in Wistar rats. Thirty-six weanling male rats were divided into six groups that received a control diet (casein), a protein-free diet (aproteic), or one of four test diets (chia seed; heat-treated chia seed; chia flour; heat-treated chia flour) for 14 days. The protein efficiency ratio (PER), net protein ratio (NPR) and true digestibility (TD) were evaluated. The biochemical variables and liver and intestinal morphologies of the animals were determined. The values of PER, NPR and TD did not differ among the animals fed chia and were lower than in the control group. The animals fed chia showed lower concentrations of glucose, triacylglycerides, low-density lipoprotein cholesterol and very low-density lipoprotein, and higher high-density lipoprotein cholesterol, than the control group. The liver weight of the animals fed chia was lower than in the control group. Crypt depth and the thickness of the intestinal muscle layers were higher in the groups fed chia. The consumption of chia showed good digestibility, a hypoglycemic effect, improved lipid and glycemic profiles and reduced fat deposition in the liver, and also promoted changes in intestinal tissue that enhanced its functionality. PMID:27193017

  16. Toxicity assessments of nonsteroidal anti-inflammatory drugs in isolated mitochondria, rat hepatocytes, and zebrafish show good concordance across chemical classes.

    PubMed

    Nadanaciva, Sashi; Aleo, Michael D; Strock, Christopher J; Stedman, Donald B; Wang, Huijun; Will, Yvonne

    2013-10-15

    To reduce costly late-stage compound attrition, there has been an increased focus on assessing compounds in in vitro assays that predict attributes of human safety liabilities, before preclinical in vivo studies are done. Relevant questions when choosing a panel of assays for predicting toxicity are (a) whether there is general concordance in the data among the assays, and (b) whether, in a retrospective analysis, the rank order of toxicity of compounds in the assays correlates with the known safety profile of the drugs in humans. The aim of our study was to answer these questions using nonsteroidal anti-inflammatory drugs (NSAIDs) as a test set since NSAIDs are generally associated with gastrointestinal injury, hepatotoxicity, and/or cardiovascular risk, with mitochondrial impairment and endoplasmic reticulum stress being possible contributing factors. Eleven NSAIDs, flufenamic acid, tolfenamic acid, mefenamic acid, diclofenac, meloxicam, sudoxicam, piroxicam, diflunisal, acetylsalicylic acid, nimesulide, and sulindac (and its two metabolites, sulindac sulfide and sulindac sulfone), were tested for their effects on (a) the respiration of rat liver mitochondria, (b) a panel of mechanistic endpoints in rat hepatocytes, and (c) the viability and organ morphology of zebrafish. We show good concordance for distinguishing among/between NSAID chemical classes in the observations among the three approaches. Furthermore, the assays were complementary and able to correctly identify "toxic" and "non-toxic" drugs in accordance with their human safety profile, with emphasis on hepatic and gastrointestinal safety. We recommend implementing our multi-assay approach in the drug discovery process to reduce compound attrition.

  17. Chia Seed Shows Good Protein Quality, Hypoglycemic Effect and Improves the Lipid Profile and Liver and Intestinal Morphology of Wistar Rats.

    PubMed

    da Silva, Bárbara Pereira; Dias, Desirrê Morais; de Castro Moreira, Maria Eliza; Toledo, Renata Celi Lopes; da Matta, Sérgio Luis Pinto; Lucia, Ceres Mattos Della; Martino, Hércia Stampini Duarte; Pinheiro-Sant'Ana, Helena Maria

    2016-09-01

    Chia is consumed worldwide due to its high fiber, lipid and protein content. The objective was to evaluate the protein quality of untreated (seed and flour) and heat-treated (90 °C/20 min) chia, and their influence on glucose and lipid homeostasis and on the integrity of liver and intestinal morphology in Wistar rats. Thirty-six weanling male rats were divided into six groups that received a control diet (casein), a protein-free diet (aproteic), or one of four test diets (chia seed; heat-treated chia seed; chia flour; heat-treated chia flour) for 14 days. The protein efficiency ratio (PER), net protein ratio (NPR) and true digestibility (TD) were evaluated. The biochemical variables and liver and intestinal morphologies of the animals were determined. The values of PER, NPR and TD did not differ among the animals fed chia and were lower than in the control group. The animals fed chia showed lower concentrations of glucose, triacylglycerides, low-density lipoprotein cholesterol and very low-density lipoprotein, and higher high-density lipoprotein cholesterol, than the control group. The liver weight of the animals fed chia was lower than in the control group. Crypt depth and the thickness of the intestinal muscle layers were higher in the groups fed chia. The consumption of chia showed good digestibility, a hypoglycemic effect, improved lipid and glycemic profiles and reduced fat deposition in the liver, and also promoted changes in intestinal tissue that enhanced its functionality.

  18. Toxicity assessments of nonsteroidal anti-inflammatory drugs in isolated mitochondria, rat hepatocytes, and zebrafish show good concordance across chemical classes

    SciTech Connect

    Nadanaciva, Sashi; Aleo, Michael D.; Strock, Christopher J.; Stedman, Donald B.; Wang, Huijun; Will, Yvonne

    2013-10-15

    To reduce costly late-stage compound attrition, there has been an increased focus on assessing compounds in in vitro assays that predict attributes of human safety liabilities, before preclinical in vivo studies are done. Relevant questions when choosing a panel of assays for predicting toxicity are (a) whether there is general concordance in the data among the assays, and (b) whether, in a retrospective analysis, the rank order of toxicity of compounds in the assays correlates with the known safety profile of the drugs in humans. The aim of our study was to answer these questions using nonsteroidal anti-inflammatory drugs (NSAIDs) as a test set since NSAIDs are generally associated with gastrointestinal injury, hepatotoxicity, and/or cardiovascular risk, with mitochondrial impairment and endoplasmic reticulum stress being possible contributing factors. Eleven NSAIDs, flufenamic acid, tolfenamic acid, mefenamic acid, diclofenac, meloxicam, sudoxicam, piroxicam, diflunisal, acetylsalicylic acid, nimesulide, and sulindac (and its two metabolites, sulindac sulfide and sulindac sulfone), were tested for their effects on (a) the respiration of rat liver mitochondria, (b) a panel of mechanistic endpoints in rat hepatocytes, and (c) the viability and organ morphology of zebrafish. We show good concordance for distinguishing among/between NSAID chemical classes in the observations among the three approaches. Furthermore, the assays were complementary and able to correctly identify “toxic” and “non-toxic” drugs in accordance with their human safety profile, with emphasis on hepatic and gastrointestinal safety. We recommend implementing our multi-assay approach in the drug discovery process to reduce compound attrition. - Highlights: • NSAIDs cause liver and GI toxicity. • Mitochondrial uncoupling contributes to NSAID liver toxicity. • ER stress is a mechanism that contributes to liver toxicity. • Zebrafish and cell based assays are complementary.

  19. Five Good Reasons to Show "Great Guy" (1936) in Our U.S. History and American Studies Classes (and the Challenges We'll Face)

    ERIC Educational Resources Information Center

    Allocco, Katherine

    2010-01-01

    One of the most versatile and multi-faceted films that an educator can use to illustrate urban America in the 1930s is "Great Guy," a relatively obscure film from 1936 directed by John G. Blystone and starring James Cagney and Mae Clarke. There are some simple practical considerations that make the film such a good fit for an American history or…

  20. Good Agreements Make Good Friends

    PubMed Central

    Han, The Anh; Pereira, Luís Moniz; Santos, Francisco C.; Lenaerts, Tom

    2013-01-01

    When starting a new collaborative endeavor, it pays to establish upfront how strongly your partner commits to the common goal and what compensation can be expected in case the collaboration is violated. Diverse examples in biological and social contexts have demonstrated the pervasiveness of making prior agreements on posterior compensations, suggesting that this behavior could have been shaped by natural selection. Here, we analyze the evolutionary relevance of such a commitment strategy and relate it to the costly punishment strategy, where no prior agreements are made. We show that when the cost of arranging a commitment deal lies within certain limits, substantial levels of cooperation can be achieved. Moreover, these levels are higher than that achieved by simple costly punishment, especially when one insists on sharing the arrangement cost. Not only do we show that good agreements make good friends, agreements based on shared costs result in even better outcomes. PMID:24045873

  1. Infrared sauna in patients with rheumatoid arthritis and ankylosing spondylitis. A pilot study showing good tolerance, short-term improvement of pain and stiffness, and a trend towards long-term beneficial effects.

    PubMed

    Oosterveld, Fredrikus G J; Rasker, Johannes J; Floors, Mark; Landkroon, Robert; van Rennes, Bob; Zwijnenberg, Jan; van de Laar, Mart A F J; Koel, Gerard J

    2009-01-01

    To study the effects of infrared (IR) sauna, a form of total-body hyperthermia, in patients with rheumatoid arthritis (RA) and ankylosing spondylitis (AS), patients were treated over a 4-week period with a series of eight IR treatments. Seventeen RA patients and 17 AS patients were studied. IR was well tolerated; no adverse effects and no exacerbation of disease were reported. Pain and stiffness decreased clinically, and improvements were statistically significant (p < 0.05 and p < 0.001 in RA and AS patients, respectively) during an IR session. Fatigue also decreased. Both RA and AS patients felt comfortable on average during and especially after treatment. In the RA and AS patients, pain, stiffness, and fatigue also showed clinical improvements during the 4-week treatment period, but these did not reach statistical significance. No relevant changes in disease activity scores were found, indicating no exacerbation of disease activity. In conclusion, infrared treatment has statistically significant short-term beneficial effects and clinically relevant period effects during treatment in RA and AS patients without enhancing disease activity. IR has good tolerability and no adverse effects.

  2. Applications and accuracy of the parallel diagonal dominant algorithm

    NASA Technical Reports Server (NTRS)

    Sun, Xian-He

    1993-01-01

    The Parallel Diagonal Dominant (PDD) algorithm is a highly efficient, ideally scalable tridiagonal solver. In this paper, a detailed study of the PDD algorithm is given. First the PDD algorithm is introduced. Then the algorithm is extended to solve periodic tridiagonal systems. A variant, the reduced PDD algorithm, is also proposed. Accuracy analysis is provided for a class of tridiagonal systems, the symmetric, and anti-symmetric Toeplitz tridiagonal systems. Implementation results show that the analysis gives a good bound on the relative error, and the algorithm is a good candidate for the emerging massively parallel machines.
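
    For context, the serial building block that each processor applies to its local sub-block is ordinary tridiagonal (Thomas) elimination, sketched below in Python; the small reduced interface system that couples the blocks in PDD is omitted. The array conventions (a[0] and c[-1] unused) are assumptions of this sketch, not the paper's notation.

      import numpy as np

      def thomas_solve(a, b, c, d):
          # Solve a tridiagonal system: sub-diagonal a, diagonal b,
          # super-diagonal c, right-hand side d.
          n = len(b)
          cp, dp = np.empty(n), np.empty(n)
          cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
          for i in range(1, n):                 # forward elimination
              denom = b[i] - a[i] * cp[i - 1]
              cp[i] = c[i] / denom if i < n - 1 else 0.0
              dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
          x = np.empty(n)
          x[-1] = dp[-1]
          for i in range(n - 2, -1, -1):        # back substitution
              x[i] = dp[i] - cp[i] * x[i + 1]
          return x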

  3. "The Show"

    ERIC Educational Resources Information Center

    Gehring, John

    2004-01-01

    For the past 16 years, the blue-collar city of Huntington, West Virginia, has rolled out the red carpet to welcome young wrestlers and their families as old friends. They have come to town chasing the same dream for a spot in what many of them call "The Show". For three days, under the lights of an arena packed with 5,000 fans, the state's best…

  4. Good Schools.

    ERIC Educational Resources Information Center

    Schoenheimer, Henry P.

    This book contains seventeen thumb-nail sketches of schools in Europe, the United States, Asia, Britain, and Australia, as they appeared in the eye of the author as a professional educator and a journalist while travelling around the world. The author considers the schools described to be good schools, and not necessarily the 17 best schools in…

  5. Cloud model bat algorithm.

    PubMed

    Zhou, Yongquan; Xie, Jian; Li, Liangliang; Ma, Mingzhi

    2014-01-01

    Bat algorithm (BA) is a novel stochastic global optimization algorithm. The cloud model is an effective tool for transforming between qualitative concepts and their quantitative representations. Based on the bat echolocation mechanism and the cloud model's strengths in representing uncertain knowledge, a new cloud model bat algorithm (CBA) is proposed. This paper focuses on remodeling the echolocation model based on the living and preying characteristics of bats, using the transformation theory of the cloud model to depict the qualitative concept "bats approach their prey." Furthermore, a Lévy flight mode and a population information communication mechanism are introduced to balance exploration and exploitation. The simulation results show that the cloud model bat algorithm performs well on function optimization. PMID:24967425
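
    For reference, the basic bat algorithm that CBA remodels can be sketched as follows; the cloud-model echolocation, Lévy flights, and communication mechanism of the paper are not reproduced here, and all parameter values are illustrative defaults.

      import numpy as np

      def bat_algorithm(obj, dim, n=30, iters=200, fmin=0.0, fmax=2.0,
                        lb=-5.0, ub=5.0, alpha=0.9, gamma=0.9, seed=0):
          # Basic bat algorithm (Yang 2010) for minimization.
          rng = np.random.default_rng(seed)
          x = rng.uniform(lb, ub, (n, dim))     # bat positions
          v = np.zeros((n, dim))                # velocities
          A = np.ones(n)                        # loudness
          r = np.full(n, 0.5)                   # pulse emission rate
          fit = np.array([obj(xi) for xi in x])
          b = fit.argmin()
          best, best_fit = x[b].copy(), fit[b]
          for t in range(1, iters + 1):
              for i in range(n):
                  f = fmin + (fmax - fmin) * rng.random()   # frequency
                  v[i] += (x[i] - best) * f
                  cand = np.clip(x[i] + v[i], lb, ub)
                  if rng.random() > r[i]:       # local random walk around the best bat
                      cand = np.clip(best + 0.01 * A.mean() * rng.standard_normal(dim),
                                     lb, ub)
                  fc = obj(cand)
                  if fc <= fit[i] and rng.random() < A[i]:  # accept; bat gets quieter
                      x[i], fit[i] = cand, fc
                      A[i] *= alpha
                      r[i] = 0.5 * (1.0 - np.exp(-gamma * t))
                  if fit[i] < best_fit:
                      best, best_fit = x[i].copy(), fit[i]
          return best, best_fit

    For example, bat_algorithm(lambda z: float((z ** 2).sum()), dim=10) should drive the population toward the origin.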

  6. "Good mothering" or "good citizenship"?

    PubMed

    Porter, Maree; Kerridge, Ian H; Jordens, Christopher F C

    2012-03-01

    Umbilical cord blood banking is one of many biomedical innovations that confront pregnant women with new choices about what they should do to secure their own and their child's best interests. Many mothers can now choose to donate their baby's umbilical cord blood (UCB) to a public cord blood bank or pay to store it in a private cord blood bank. Donation to a public bank is widely regarded as an altruistic act of civic responsibility. Paying to store UCB may be regarded as a "unique opportunity" to provide "insurance" for the child's future. This paper reports findings from a survey of Australian women that investigated the decision to either donate or store UCB. We conclude that mothers are faced with competing discourses that force them to choose between being a "good mother" and fulfilling their role as a "good citizen." We discuss this finding with reference to the concept of value pluralism. PMID:23180199

  7. "Good mothering" or "good citizenship"?

    PubMed

    Porter, Maree; Kerridge, Ian H; Jordens, Christopher F C

    2012-03-01

    Umbilical cord blood banking is one of many biomedical innovations that confront pregnant women with new choices about what they should do to secure their own and their child's best interests. Many mothers can now choose to donate their baby's umbilical cord blood (UCB) to a public cord blood bank or pay to store it in a private cord blood bank. Donation to a public bank is widely regarded as an altruistic act of civic responsibility. Paying to store UCB may be regarded as a "unique opportunity" to provide "insurance" for the child's future. This paper reports findings from a survey of Australian women that investigated the decision to either donate or store UCB. We conclude that mothers are faced with competing discourses that force them to choose between being a "good mother" and fulfilling their role as a "good citizen." We discuss this finding with reference to the concept of value pluralism.

  8. A Winner Determination Algorithm for Combinatorial Auctions Based on Hybrid Artificial Fish Swarm Algorithm

    NASA Astrophysics Data System (ADS)

    Zheng, Genrang; Lin, ZhengChun

    The problem of winner determination in combinatorial auctions is a hot topic in electronic business and an NP-hard problem. A Hybrid Artificial Fish Swarm Algorithm (HAFSA), which combines the First Suite Heuristic Algorithm (FSHA) with the Artificial Fish Swarm Algorithm (AFSA), is proposed to solve the problem after analyzing it on the basis of AFSA theory. Experimental results show that HAFSA is a fast and efficient algorithm for winner determination. Compared with an ant colony optimization algorithm, it performs well and has broad application prospects.

  9. Algorithms and Algorithmic Languages.

    ERIC Educational Resources Information Center

    Veselov, V. M.; Koprov, V. M.

    This paper is intended as an introduction to a number of problems connected with the description of algorithms and algorithmic languages, particularly the syntaxes and semantics of algorithmic languages. The terms "letter, word, alphabet" are defined and described. The concept of the algorithm is defined and the relation between the algorithm and…

  10. Comparing barrier algorithms

    NASA Technical Reports Server (NTRS)

    Arenstorf, Norbert S.; Jordan, Harry F.

    1987-01-01

    A barrier is a method for synchronizing a large number of concurrent computer processes. After considering some basic synchronization mechanisms, a collection of barrier algorithms with either linear or logarithmic depth are presented. A graphical model is described that profiles the execution of the barriers and other parallel programming constructs. This model shows how the interaction between the barrier algorithms and the work that they synchronize can impact their performance. One result is that logarithmic tree structured barriers show good performance when synchronizing fixed length work, while linear self-scheduled barriers show better performance when synchronizing fixed length work with an imbedded critical section. The linear barriers are better able to exploit the process skew associated with critical sections. Timing experiments, performed on an eighteen processor Flex/32 shared memory multiprocessor, that support these conclusions are detailed.
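
    To make the linear case concrete, here is a minimal central-counter (linear) barrier in Python: each arrival increments one shared counter, so the depth grows linearly in the number of processes. Python's built-in threading.Barrier provides the same service; this sketch only illustrates the structure being compared.

      import threading

      class CentralBarrier:
          # Linear barrier: one shared counter plus a generation number,
          # so the barrier can be reused across phases.
          def __init__(self, n):
              self.n, self.count, self.generation = n, 0, 0
              self.cond = threading.Condition()

          def wait(self):
              with self.cond:
                  gen = self.generation
                  self.count += 1
                  if self.count == self.n:      # last arrival releases everyone
                      self.count = 0
                      self.generation += 1
                      self.cond.notify_all()
                  else:
                      while gen == self.generation:
                          self.cond.wait()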

  11. Advanced optimization of permanent magnet wigglers using a genetic algorithm

    SciTech Connect

    Hajima, Ryoichi

    1995-12-31

    In permanent magnet wigglers, the magnetic imperfection of each magnet piece causes field error. This field error can be reduced or compensated by sorting the magnet pieces in a proper order. We showed that a genetic algorithm has good properties for this sorting scheme. In this paper, the optimization scheme is applied to the case of permanent magnets that have errors in the field direction. The result shows the genetic algorithm is superior to other algorithms.
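
    The sorting scheme lends itself to a compact sketch: represent a candidate ordering as a permutation of magnet indices and evolve a population of permutations. The cost function below is a hypothetical stand-in for the real field-error model, and this mutation-only loop omits the crossover a full GA would use.

      import random

      def field_error(order, strengths):
          # Hypothetical cost: penalize the worst running imbalance of signed
          # strength deviations along the wiggler.
          total, worst = 0.0, 0.0
          for idx in order:
              total += strengths[idx]
              worst = max(worst, abs(total))
          return worst

      def ga_sort(strengths, pop=50, gens=200, pmut=0.3, seed=1):
          # Evolutionary search over magnet orderings (permutations).
          rng = random.Random(seed)
          n = len(strengths)
          popl = [rng.sample(range(n), n) for _ in range(pop)]
          for _ in range(gens):
              popl.sort(key=lambda o: field_error(o, strengths))
              survivors = popl[: pop // 2]      # truncation selection
              children = []
              for parent in survivors:
                  child = parent[:]
                  if rng.random() < pmut:       # swap mutation keeps it a permutation
                      i, j = rng.randrange(n), rng.randrange(n)
                      child[i], child[j] = child[j], child[i]
                  children.append(child)
              popl = survivors + children
          return min(popl, key=lambda o: field_error(o, strengths))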

  12. A Revision of the NASA Team Sea Ice Algorithm

    NASA Technical Reports Server (NTRS)

    Markus, T.; Cavalieri, Donald J.

    1998-01-01

    In a recent paper, two operational algorithms to derive ice concentration from satellite multichannel passive microwave sensors have been compared. Although the results of these, known as the NASA Team algorithm and the Bootstrap algorithm, have been validated and are generally in good agreement, there are areas where the ice concentrations differ, by up to 30%. These differences can be explained by shortcomings in one or the other algorithm. Here, we present an algorithm which, in addition to the 19 and 37 GHz channels used by both the Bootstrap and NASA Team algorithms, makes use of the 85 GHz channels as well. Atmospheric effects particularly at 85 GHz are reduced by using a forward atmospheric radiative transfer model. Comparisons with the NASA Team and Bootstrap algorithm show that the individual shortcomings of these algorithms are not apparent in this new approach. The results further show better quantitative agreement with ice concentrations derived from NOAA AVHRR infrared data.

  13. Spectrum parameter estimation in Brillouin scattering distributed temperature sensor based on cuckoo search algorithm combined with the improved differential evolution algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Yanjun; Yu, Chunjuan; Fu, Xinghu; Liu, Wenzhe; Bi, Weihong

    2015-12-01

    In distributed optical fiber sensing systems based on Brillouin scattering, strain and temperature are the main measured parameters, which can be obtained by analyzing the Brillouin center frequency shift. A novel algorithm that combines the cuckoo search algorithm (CS) with an improved differential evolution (IDE) algorithm is proposed for Brillouin scattering parameter estimation. The CS-IDE algorithm is compared with the CS algorithm and analyzed in different situations. The results show that both the CS and CS-IDE algorithms have very good convergence. The analysis reveals that the CS-IDE algorithm can extract the scattering spectrum features with different linear weight ratios, linewidth combinations and SNRs. Moreover, a BOTDR temperature measuring system based on electron optical frequency shift is set up to verify the effectiveness of the CS-IDE algorithm. Experimental results show that there is a good linear relationship between the Brillouin center frequency shift and temperature changes.

  14. A hybrid monkey search algorithm for clustering analysis.

    PubMed

    Chen, Xin; Zhou, Yongquan; Luo, Qifang

    2014-01-01

    Clustering is a popular data analysis and data mining technique. The k-means clustering algorithm is one of the most commonly used methods. However, it depends strongly on the initial solution and easily falls into a local optimum. In view of these disadvantages of the k-means method, this paper proposes a hybrid monkey algorithm, based on the search operator of the artificial bee colony algorithm, for clustering analysis; experiments on synthetic and real-life datasets show that the algorithm performs better than the basic monkey algorithm for clustering analysis.
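
    The initialization sensitivity that motivates the hybrid is visible in a minimal Lloyd's-algorithm sketch: different seeds below can converge to different local optima, which is the weakness the monkey-search hybrid attacks.

      import numpy as np

      def kmeans(X, k, iters=100, seed=0):
          # Plain k-means: commits to a random initial set of centers.
          rng = np.random.default_rng(seed)
          centers = X[rng.choice(len(X), k, replace=False)]
          for _ in range(iters):
              # assign each point to its nearest center
              labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
              # recompute centers (keep old center if a cluster empties)
              new = np.array([X[labels == j].mean(0) if np.any(labels == j)
                              else centers[j] for j in range(k)])
              if np.allclose(new, centers):
                  break
              centers = new
          return centers, labels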

  15. Autonomous photogrammetric network design based on changing environment genetic algorithms

    NASA Astrophysics Data System (ADS)

    Yang, Jian; Lu, Nai-Guang; Dong, Mingli

    2008-10-01

    To achieve good accuracy, designers must consider where to place the cameras. Camera placement design is usually a multidimensional optimization problem, so genetic algorithms are commonly used to solve it. However, genetic algorithms can suffer from premature convergence: the search may settle into a local minimum or oscillate, yielding an inaccurate design. We therefore attack the problem with a changing-environment genetic algorithm, in which the species groups are given different environments at different stages to improve performance. Computer simulation results show faster convergence and a better ability to select good individuals. This approach could also be used in other applications.

  16. A scalable parallel algorithm for multiple objective linear programs

    NASA Technical Reports Server (NTRS)

    Wiecek, Malgorzata M.; Zhang, Hong

    1994-01-01

    This paper presents an ADBASE-based parallel algorithm for solving multiple objective linear programs (MOLP's). Job balance, speedup and scalability are of primary interest in evaluating efficiency of the new algorithm. Implementation results on Intel iPSC/2 and Paragon multiprocessors show that the algorithm significantly speeds up the process of solving MOLP's, which is understood as generating all or some efficient extreme points and unbounded efficient edges. The algorithm gives especially good results for large and very large problems. Motivation and justification for solving such large MOLP's are also included.

  17. Nios II hardware acceleration of the epsilon quadratic sieve algorithm

    NASA Astrophysics Data System (ADS)

    Meyer-Bäse, Uwe; Botella, Guillermo; Castillo, Encarnacion; García, Antonio

    2010-04-01

    The quadratic sieve (QS) algorithm is one of the most powerful algorithms for factoring the large composite numbers used in RSA cryptographic systems. The hardware structure of the QS algorithm seems to be a good fit for FPGA acceleration. Our new ɛ-QS algorithm further simplifies the hardware architecture, making it an even better candidate for C2H acceleration. This paper shows our design results in FPGA resources and performance when implementing very long arithmetic on the Nios microprocessor platform with C2H acceleration for different libraries (GMP, LIP, FLINT, NRMP) and QS architecture choices for factoring 32-2048 bit RSA numbers.

  18. Good Medicine and Good Healthcare Demand Good Information (Systems).

    PubMed

    Winter, A; Hilgers, R-D; Hofestädt, R; Hübner, U; Knaup-Gregori, P; Ose, C; Schmoor, C; Timmer, A; Wege, D

    2015-01-01

    The demand for evidence-based health informatics and benchmarking of 'good' information systems in health care gives an opportunity to continue reporting on recent papers in the German journal GMS Medical Informatics, Biometry and Epidemiology (MIBE) here. The publications in focus deal with a comparison of benchmarking initiatives in German-speaking countries, use of communication standards in telemonitoring scenarios, the estimation of national cancer incidence rates and modifications of parametric tests. Furthermore papers in this issue of MIM are introduced which originally have been presented at the Annual Conference of the German Society of Medical Informatics, Biometry and Epidemiology. They deal as well with evidence and evaluation of 'good' information systems but also with data harmonization, surveillance in obstetrics, adaptive designs and parametrical testing in statistical analysis, patient registries and signal processing. PMID:26395286

  19. An ant colony algorithm on continuous searching space

    NASA Astrophysics Data System (ADS)

    Xie, Jing; Cai, Chao

    2015-12-01

    The ant colony algorithm is a heuristic, bionic, parallel method. Because of its positive feedback, its parallelism, and the ease with which it cooperates with other methods, it is widely adopted for planning in discrete spaces, but it is still not good at planning in continuous spaces. After a basic introduction to the ant colony algorithm, we propose an ant colony algorithm for continuous search spaces. Our method relies on three techniques. It searches for the next nodes of the route with a fixed step size, guaranteeing the continuity of the solution. When storing pheromone, it discretizes the pheromone field, clusters states, and sums the pheromone values of those states. When updating pheromone, it lets good solutions, as measured by relative score functions, deposit more pheromone, so that the algorithm finds a sub-optimal solution in a shorter time. A simulated experiment shows that our ant colony algorithm can find a sub-optimal solution in a relatively short time.
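
    As a loose illustration only (not the paper's method), the three ingredients can be combined into a toy 2-D path planner: fixed-step moves, a discretized pheromone grid, and deposits proportional to route quality. All parameters and the goal-distance heuristic are assumptions of this sketch.

      import numpy as np

      def aco_continuous(start, goal, n_ants=20, n_iter=50, step=0.5,
                         grid=1.0, evap=0.1, max_moves=200, seed=0):
          rng = np.random.default_rng(seed)
          goal = np.asarray(goal, float)
          pher = {}                                   # grid cell -> pheromone
          cell = lambda p: (int(p[0] // grid), int(p[1] // grid))
          headings = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
          best, best_len = None, np.inf
          for _ in range(n_iter):
              for _ in range(n_ants):
                  pos, path = np.asarray(start, float), [np.asarray(start, float)]
                  for _ in range(max_moves):
                      # fixed-step candidate moves keep the route continuous
                      cands = [pos + step * np.array([np.cos(a), np.sin(a)])
                               for a in headings]
                      score = np.array([pher.get(cell(c), 0.1) +
                                        1.0 / (1e-6 + np.linalg.norm(c - goal))
                                        for c in cands])
                      pos = cands[rng.choice(len(cands), p=score / score.sum())]
                      path.append(pos)
                      if np.linalg.norm(pos - goal) < step:
                          break
                  length = sum(np.linalg.norm(b - a)
                               for a, b in zip(path, path[1:]))
                  if np.linalg.norm(path[-1] - goal) < step:
                      if length < best_len:
                          best, best_len = path, length
                      for p in path:                  # shorter routes deposit more
                          pher[cell(p)] = pher.get(cell(p), 0.0) + 1.0 / length
              for k in pher:                          # evaporation
                  pher[k] *= (1.0 - evap)
          return best, best_len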

  20. Good School Districts Require Good School Boards.

    ERIC Educational Resources Information Center

    Krysiak, Barbara H.

    2002-01-01

    Discusses issues surrounding superintendents' efforts to establish effective working relationships with their school boards. Includes topics such as role confusion, dealing with conflict, and responding to micromanaging board members. Other topics include elected versus appointed boards, good board members, and new board members. (Contains 14…

  1. Good Concrete Activity Is Good Mental Activity

    ERIC Educational Resources Information Center

    McDonough, Andrea

    2016-01-01

    Early years mathematics classrooms can be colourful, exciting, and challenging places of learning. Andrea McDonough and fellow teachers have noticed that some students make good decisions about using materials to assist their problem solving, but this is not always the case. These experiences lead her to ask the following questions: (1) Are…

  2. Asymmetric intimacy and algorithm for detecting communities in bipartite networks

    NASA Astrophysics Data System (ADS)

    Wang, Xingyuan; Qin, Xiaomeng

    2016-11-01

    In this paper, we propose an algorithm for choosing a good partition of a bipartite network. Bipartite networks have considerable theoretical significance and broad application prospects. In view of the distinctive structure of bipartite networks, our method defines two parameters to capture the relationships between nodes of the same type and between heterogeneous nodes, respectively. Moreover, our algorithm employs a new method for finding and expanding the core communities in bipartite networks. The two kinds of nodes are handled separately and then merged to obtain sub-communities, after which the target communities are found according to a merging rule. The proposed algorithm has been evaluated on real-world and artificial networks, and the results verify the accuracy and reliability of the intimacy parameters in our algorithm. Comparisons with similar algorithms show that the proposed algorithm has better performance.

  3. Covariance Structure Model Fit Testing under Missing Data: An Application of the Supplemented EM Algorithm

    ERIC Educational Resources Information Center

    Cai, Li; Lee, Taehun

    2009-01-01

    We apply the Supplemented EM algorithm (Meng & Rubin, 1991) to address a chronic problem with the "two-stage" fitting of covariance structure models in the presence of ignorable missing data: the lack of an asymptotically chi-square distributed goodness-of-fit statistic. We show that the Supplemented EM algorithm provides a convenient…

  4. [Multispectral image compression algorithms for color reproduction].

    PubMed

    Liang, Wei; Zeng, Ping; Luo, Xue-mei; Wang, Yi-feng; Xie, Kun

    2015-01-01

    In order to improve the compression efficiency of multispectral images and to facilitate their storage and transmission for applications such as color reproduction, in which high color accuracy is desired, the WF family of methods is proposed and the APWS_RA algorithm is designed. The WF_APWS_RA algorithm, which has low complexity, good illuminant stability, and support for consistent color reproduction across devices, is then presented. The conventional MSE-based wavelet embedded coding principle is first studied, and a color perception distortion criterion and a visual characteristic matrix W are proposed. Meanwhile, the APWS_RA algorithm is formed by optimizing the rate allocation strategy of APWS. Finally, combining the above technologies, a new coding method named WF_APWS_RA is designed. A colorimetric error criterion is used in the algorithm, and APWS_RA is applied to the visually weighted multispectral image. In WF_APWS_RA, affinity propagation clustering is utilized to exploit the spectral correlation of the weighted image, and a two-dimensional wavelet transform removes the spatial redundancy. Subsequently, an error compensation mechanism and rate pre-allocation are combined to accomplish the embedded wavelet coding. Experimental results show that, at the same bit rate, the WF-family algorithms preserve color better than classical coding algorithms. APWS_RA preserves the least spectral error, and the WF_APWS_RA algorithm shows a clear advantage in color accuracy.

  5. Image segmentation using an improved differential algorithm

    NASA Astrophysics Data System (ADS)

    Gao, Hao; Shi, Yujiao; Wu, Dongmei

    2014-10-01

    Among all existing segmentation techniques, thresholding is one of the most popular due to its simplicity, robustness, and accuracy (e.g. the maximum entropy method, Otsu's method, and K-means clustering). However, the computation time of these algorithms grows exponentially with the number of thresholds because of their exhaustive searching strategy. As a population-based optimization algorithm, differential evolution (DE) uses a population of potential solutions and decision-making processes. It has shown considerable success in solving complex optimization problems within a reasonable time limit, which makes it a good fit for speeding up segmentation. In this paper, we first propose a new differential evolution algorithm with a balance strategy, which seeks a balance between the exploration of new regions and the exploitation of already sampled regions. We then apply the new DE to the traditional Otsu method to shorten the computation time. Experimental results of the new algorithm on a variety of images show that, compared with EA-based thresholding methods, the proposed DE algorithm obtains more effective and efficient results. It also shortens the computation time of the traditional Otsu method.
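
    As a rough stand-in for the proposed combination, SciPy's stock differential evolution can drive Otsu's between-class-variance criterion over an image histogram; the balance strategy of the paper's improved DE is not reproduced here, and the parameter values are illustrative.

      import numpy as np
      from scipy.optimize import differential_evolution

      def neg_between_class_variance(thresholds, hist, levels=256):
          # Negative Otsu between-class variance for a set of thresholds
          # (negated because differential_evolution minimizes).
          p = hist / hist.sum()
          grey = np.arange(levels)
          mu_T = (p * grey).sum()
          cuts = [0] + sorted(int(t) for t in thresholds) + [levels]
          total = 0.0
          for lo, hi in zip(cuts[:-1], cuts[1:]):
              w = p[lo:hi].sum()
              if w > 0:
                  mu = (p[lo:hi] * grey[lo:hi]).sum() / w
                  total += w * (mu - mu_T) ** 2
          return -total

      # For an 8-bit image `img`, two-threshold segmentation:
      #   hist = np.bincount(img.ravel(), minlength=256)
      #   result = differential_evolution(neg_between_class_variance,
      #                                   bounds=[(1, 254)] * 2, args=(hist,), seed=0)
      #   thresholds = sorted(int(t) for t in result.x)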

  6. Genetic algorithms

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts and applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.

  7. What Are Good Universities?

    ERIC Educational Resources Information Center

    Connell, Raewyn

    2016-01-01

    This paper considers how we can arrive at a concept of the good university. It begins with ideas expressed by Australian Vice-Chancellors and in the "league tables" for universities, which essentially reproduce existing privilege. It then considers definitions of the good university via wish lists, classic texts, horror lists, structural…

  8. Making Good Tenure Decisions.

    ERIC Educational Resources Information Center

    Becker, Samuel L.; Galvin, Kathleen M.; Houston, Marsha; Friedrich, Gustav W.; Pearson, Judy C.; Seiler, William J.; Trent, Judith S.

    2001-01-01

    Presents criteria and procedures that can help to substantially increase the probability of a good tenure decision. Notes that the tenure procedures must be designed and followed in a way that ensures, to the degree possible, validity, fairness, and equity. Stresses the importance of maintaining good records and mentoring. (SG)

  9. A Pretty Good Fit

    ERIC Educational Resources Information Center

    Erickson, Tim

    2008-01-01

    We often look for a best-fit function to a set of data. This article describes how a "pretty good" fit might be better than a "best" fit when it comes to promoting conceptual understanding of functions. In a pretty good fit, students design the function themselves rather than choosing it from a menu; they use appropriate variable names; and they…

  10. Advice on Good Grooming.

    ERIC Educational Resources Information Center

    Tingey, Carol

    1987-01-01

    Suggestions are presented from parents on how to help children with disabilities (with particular focus on Downs Syndrome) learn good grooming habits in such areas as good health, exercise, cleanliness, teeth and hair care, skin care, glasses and other devices, and social behavior. (CB)

  11. "Good Citizen" Program.

    ERIC Educational Resources Information Center

    Placer Hills Union Elementary School District, Meadow Vista, CA.

    The "Good Citizen" Program was developed for many reasons: to keep the campus clean, to reward students for improvement, to reward students for good deeds, to improve the total school climate, to reward students for excellence, and to offer staff members a method of reward for positive…

  12. Cape of Good Hope

    Atmospheric Science Data Center

    2013-04-16

    ... that requires the terrain projected radiances to be a smooth function of angle, for each pixel. Any cloud above the surface will ... parallax and therefore the radiance vs. angle should not be smooth. But this algorithm fails for near-surface clouds and homogeneous, ...

  13. Flower pollination algorithm: A novel approach for multiobjective optimization

    NASA Astrophysics Data System (ADS)

    Yang, Xin-She; Karamanoglu, Mehmet; He, Xingshi

    2014-09-01

    Multiobjective design optimization problems require multiobjective optimization techniques to solve, and it is often very challenging to obtain high-quality Pareto fronts accurately. In this article, the recently developed flower pollination algorithm (FPA) is extended to solve multiobjective optimization problems. The proposed method is used to solve a set of multiobjective test functions and two bi-objective design benchmarks, and a comparison of the proposed algorithm with other algorithms has been made, which shows that the FPA is efficient with a good convergence rate. Finally, the importance for further parametric studies and theoretical analysis is highlighted and discussed.
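
    The paper extends the single-objective FPA to multiple objectives; for reference, the basic single-objective scheme (Lévy-flight global pollination plus local pollination) can be sketched as follows. Parameter values are illustrative defaults, not the paper's settings.

      import math
      import numpy as np

      def levy(dim, rng, beta=1.5):
          # Mantegna's algorithm for Levy-distributed step lengths
          sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
                   (math.gamma((1 + beta) / 2) * beta *
                    2 ** ((beta - 1) / 2))) ** (1 / beta)
          u = rng.standard_normal(dim) * sigma
          v = rng.standard_normal(dim)
          return u / np.abs(v) ** (1 / beta)

      def fpa(obj, dim, n=25, iters=500, p=0.8, lb=-5.0, ub=5.0, seed=0):
          rng = np.random.default_rng(seed)
          X = rng.uniform(lb, ub, (n, dim))
          fit = np.array([obj(x) for x in X])
          g, gfit = X[fit.argmin()].copy(), fit.min()
          for _ in range(iters):
              for i in range(n):
                  if rng.random() < p:    # global (biotic) pollination via Levy flight
                      cand = X[i] + levy(dim, rng) * (g - X[i])
                  else:                   # local (abiotic) pollination
                      j, k = rng.choice(n, size=2, replace=False)
                      cand = X[i] + rng.random() * (X[j] - X[k])
                  cand = np.clip(cand, lb, ub)
                  fc = obj(cand)
                  if fc < fit[i]:
                      X[i], fit[i] = cand, fc
                      if fc < gfit:
                          g, gfit = cand.copy(), fc
          return g, gfit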

  14. A novel image encryption algorithm based on DNA subsequence operation.

    PubMed

    Zhang, Qiang; Xue, Xianglian; Wei, Xiaopeng

    2012-01-01

    We present a novel image encryption algorithm based on DNA subsequence operations. Unlike traditional DNA encryption methods, our algorithm does not use complex biological operations; it simply uses DNA subsequence operations (such as elongation, truncation, and deletion) combined with the logistic chaotic map to scramble the locations and values of the pixels in the image. The experimental results and security analysis show that the proposed algorithm is easy to implement, achieves a good encryption effect, has a large key space and strong sensitivity to the secret key, and can resist exhaustive and statistical attacks.
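
    The chaotic-scrambling ingredient is easy to illustrate. The sketch below permutes pixel positions with a logistic-map keystream and omits the DNA subsequence operations on pixel values; the key values x0 and r are illustrative.

      import numpy as np

      def logistic_permutation(img, x0=0.3561, r=3.99):
          # Scramble pixel positions with a logistic-map keystream;
          # (x0, r) act as the secret key.
          flat = img.flatten()
          seq = np.empty(flat.size)
          x = x0
          for i in range(flat.size):
              x = r * x * (1.0 - x)       # logistic map iteration
              seq[i] = x
          perm = np.argsort(seq)          # chaotic sequence -> permutation
          return flat[perm].reshape(img.shape), perm

      def unscramble(out, perm, shape):
          # Invert the permutation to recover the original image.
          flat = np.empty(out.size, dtype=out.dtype)
          flat[perm] = out.flatten()
          return flat.reshape(shape)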

  15. Television Quiz Show Simulation

    ERIC Educational Resources Information Center

    Hill, Jonnie Lynn

    2007-01-01

    This article explores the simulation of four television quiz shows for students in China studying English as a foreign language (EFL). It discusses the adaptation and implementation of television quiz shows and how the students reacted to them.

  16. Scalable Nearest Neighbor Algorithms for High Dimensional Data.

    PubMed

    Muja, Marius; Lowe, David G

    2014-11-01

    For many computer vision and machine learning problems, large training sets are key for good performance. However, the most computationally expensive part of many computer vision and machine learning algorithms consists of finding nearest neighbor matches to high dimensional vectors that represent the training data. We propose new algorithms for approximate nearest neighbor matching and evaluate and compare them with previous algorithms. For matching high dimensional features, we find two algorithms to be the most efficient: the randomized k-d forest and a new algorithm proposed in this paper, the priority search k-means tree. We also propose a new algorithm for matching binary features by searching multiple hierarchical clustering trees and show it outperforms methods typically used in the literature. We show that the optimal nearest neighbor algorithm and its parameters depend on the data set characteristics and describe an automated configuration procedure for finding the best algorithm to search a particular data set. In order to scale to very large data sets that would otherwise not fit in the memory of a single machine, we propose a distributed nearest neighbor matching framework that can be used with any of the algorithms described in the paper. All this research has been released as an open source library called fast library for approximate nearest neighbors (FLANN), which has been incorporated into OpenCV and is now one of the most popular libraries for nearest neighbor matching. PMID:26353063
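
    As a usage note, FLANN is available through OpenCV; a typical invocation of the randomized k-d forest described in the paper, followed by Lowe's ratio test, looks like the sketch below. The descriptor arrays are random stand-ins, and the parameter values are common defaults rather than prescriptions.

      import numpy as np
      import cv2

      # Build a randomized k-d forest index over float descriptors.
      FLANN_INDEX_KDTREE = 1                    # randomized k-d trees
      index_params = dict(algorithm=FLANN_INDEX_KDTREE, trees=5)
      search_params = dict(checks=50)           # leaves to visit; speed/recall knob
      matcher = cv2.FlannBasedMatcher(index_params, search_params)

      query = np.random.rand(100, 128).astype(np.float32)    # stand-in descriptors
      train = np.random.rand(1000, 128).astype(np.float32)
      matches = matcher.knnMatch(query, train, k=2)

      # Lowe's ratio test keeps only unambiguous matches.
      good = [m for m, n in matches if m.distance < 0.7 * n.distance]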

  17. Scalable Nearest Neighbor Algorithms for High Dimensional Data.

    PubMed

    Muja, Marius; Lowe, David G

    2014-11-01

    For many computer vision and machine learning problems, large training sets are key for good performance. However, the most computationally expensive part of many computer vision and machine learning algorithms consists of finding nearest neighbor matches to high dimensional vectors that represent the training data. We propose new algorithms for approximate nearest neighbor matching and evaluate and compare them with previous algorithms. For matching high dimensional features, we find two algorithms to be the most efficient: the randomized k-d forest and a new algorithm proposed in this paper, the priority search k-means tree. We also propose a new algorithm for matching binary features by searching multiple hierarchical clustering trees and show it outperforms methods typically used in the literature. We show that the optimal nearest neighbor algorithm and its parameters depend on the data set characteristics and describe an automated configuration procedure for finding the best algorithm to search a particular data set. In order to scale to very large data sets that would otherwise not fit in the memory of a single machine, we propose a distributed nearest neighbor matching framework that can be used with any of the algorithms described in the paper. All this research has been released as an open source library called fast library for approximate nearest neighbors (FLANN), which has been incorporated into OpenCV and is now one of the most popular libraries for nearest neighbor matching.

  18. Fast Optimal Load Balancing Algorithms for 1D Partitioning

    SciTech Connect

    Pinar, Ali; Aykanat, Cevdet

    2002-12-09

    One-dimensional decomposition of nonuniform workload arrays for optimal load balancing is investigated. The problem has been studied in the literature as the "chains-on-chains partitioning" problem. Despite extensive research efforts, heuristics are still used in the parallel computing community with the "hope" of good decompositions and the "myth" that exact algorithms are hard to implement and not runtime efficient. The main objective of this paper is to show that using exact algorithms instead of heuristics yields significant load balance improvements with a negligible increase in preprocessing time. We provide detailed pseudocode of our algorithms so that our results can be easily reproduced. We start with a review of the literature on the chains-on-chains partitioning problem. We propose improvements to these algorithms as well as efficient implementation tips. We also introduce novel algorithms that are asymptotically and runtime efficient. We experimented with data sets from two different applications: sparse matrix computations and direct volume rendering. Experiments showed that the proposed algorithms are on average 100 times faster than a single sparse matrix-vector multiplication for 64-way decompositions. Experiments also verify that load balance can be significantly improved by using exact algorithms instead of heuristics. These two findings show that exact algorithms with the efficient implementations discussed in this paper can effectively replace heuristics.
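
    The probe-based idea behind many exact methods is simple enough to sketch: a greedy feasibility check asks whether a bottleneck value admits at most p consecutive parts, and bisection tightens the bottleneck. This is one elementary member of the family the paper studies, run here to a floating-point tolerance; the paper's specialized algorithms are exact over the discrete candidate set and faster.

      def probe(w, p, bottleneck):
          # Can weights w be split into at most p consecutive parts,
          # each summing to <= bottleneck? Greedy left-to-right check.
          parts, load = 1, 0.0
          for x in w:
              if x > bottleneck:
                  return False
              if load + x > bottleneck:
                  parts += 1
                  load = x
              else:
                  load += x
          return parts <= p

      def ccp_bisect(w, p, tol=1e-9):
          # Bisection on the bottleneck value; hi is always feasible.
          lo, hi = max(w), sum(w)
          while hi - lo > tol:
              mid = (lo + hi) / 2.0
              if probe(w, p, mid):
                  hi = mid
              else:
                  lo = mid
          return hi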

  19. Public goods and procreation.

    PubMed

    Anomaly, Jonathan

    2014-01-01

    Procreation is the ultimate public goods problem. Each new child affects the welfare of many other people, and some (but not all) children produce uncompensated value that future people will enjoy. This essay addresses challenges that arise if we think of procreation and parenting as public goods. These include whether individual choices are likely to lead to a socially desirable outcome, and whether changes in laws, social norms, or access to genetic engineering and embryo selection might improve the aggregate outcome of our reproductive choices.

  20. Public goods and procreation.

    PubMed

    Anomaly, Jonathan

    2014-01-01

    Procreation is the ultimate public goods problem. Each new child affects the welfare of many other people, and some (but not all) children produce uncompensated value that future people will enjoy. This essay addresses challenges that arise if we think of procreation and parenting as public goods. These include whether individual choices are likely to lead to a socially desirable outcome, and whether changes in laws, social norms, or access to genetic engineering and embryo selection might improve the aggregate outcome of our reproductive choices. PMID:25743046

  1. A Holographic Road Show.

    ERIC Educational Resources Information Center

    Kirkpatrick, Larry D.; Rugheimer, Mac

    1979-01-01

    Describes the viewing sessions and the holograms of a holographic road show. The traveling exhibits, believed to stimulate interest in physics, include a wide variety of holograms and demonstrate several physical principles. (GA)

  2. Reconsidering the "Good Divorce"

    ERIC Educational Resources Information Center

    Amato, Paul R.; Kane, Jennifer B.; James, Spencer

    2011-01-01

    This study attempted to assess the notion that a "good divorce" protects children from the potential negative consequences of marital dissolution. A cluster analysis of data on postdivorce parenting from 944 families resulted in three groups: cooperative coparenting, parallel parenting, and single parenting. Children in the cooperative coparenting…

  3. Good-Neighbor Policy

    ERIC Educational Resources Information Center

    Drozdowski, Mark J.

    2007-01-01

    In this article, the author draws on his experience as the director of the Fitchburg State College Foundation in Fitchburg, Massachusetts, to make a distinction between being a good neighbor to local non-profit organizations by sharing strategies and information, and creating conflicts of interest when both the college and its neighbor…

  4. Restructuring for Good Governance

    ERIC Educational Resources Information Center

    Robert, Stephen; Carey, Russell C.

    2006-01-01

    American higher education has never been more in need of good governance than it is right now. Yet much of the structure many boards have inherited or created tends to stall or impede timely, well-informed, and broadly supported decision making. At many institutions (ours included), layers of governance have been added with each passing year,…

  5. The Good Mentor.

    ERIC Educational Resources Information Center

    Rowley, James B.

    1999-01-01

    There are six basic qualities of good mentoring: commitment to the mentoring role, acceptance of beginning teachers, proficiency at providing instructional support, interpersonal effectiveness, skill at modeling continuous learning, and ability to communicate hope and optimism. A sidebar explains the Mentoring Leadership and Resource network. (10…

  6. Public Education, Public Good.

    ERIC Educational Resources Information Center

    Tomlinson, John

    1986-01-01

    Criticizes policies which would damage or destroy a public education system. Examines the relationship between government-provided education and democracy. Concludes that privatization of public education would emphasize self-interest and selfishness, further jeopardizing the altruism and civic mindedness necessary for the public good. (JDH)

  7. Doing good & doing well.

    PubMed

    Barnett, K; Pittman, M

    2001-01-01

    Leaders cannot make the "business case" for community benefit in the traditional sense of near-term financial returns on investment. The concept of returns must be expanded to encompass more long-term--yet concrete and measurable--benefits that may be accrued both by nonprofit hospitals and local communities. Hospitals can "do well" economically through a more strategic approach to "doing good."

  8. Good Laboratory Practice

    NASA Astrophysics Data System (ADS)

    Hadjicostas, Evsevios

    The principles of Good Laboratory Practice (GLP) in conjunction with the principles of Total Quality Management (see chapter 6) ensure the quality and reliability of the laboratory results, which in turn help to ensure the protection of the environment and human health and safety. A step further is the accreditation of laboratories to ISO 17025 (see chapter 2) to perform specified activities.

  9. Competing Sudakov veto algorithms

    NASA Astrophysics Data System (ADS)

    Kleiss, Ronald; Verheyen, Rob

    2016-07-01

    We present a formalism to analyze the distribution produced by a Monte Carlo algorithm. We perform these analyses on several versions of the Sudakov veto algorithm, adding a cutoff, a second variable and competition between emission channels. The formal analysis allows us to prove that multiple, seemingly different competition algorithms, including those that are currently implemented in most parton showers, lead to the same result. Finally, we test their performance in a semi-realistic setting and show that there are significantly faster alternatives to the commonly used algorithms.
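
    The basic Sudakov veto algorithm with a cutoff can be sketched in a few lines: the next-emission scale is drawn using an analytically invertible overestimate g(t) >= f(t), and proposals are accepted with probability f(t)/g(t). The specific kernels f and g below are illustrative choices, not taken from the paper.

    ```python
    # Basic Sudakov veto algorithm with a cutoff (illustrative f and g).
    # Draws the next-emission scale t from dP = f(t) * exp(-int_t^{t0} f) dt.
    import random

    T_MAX, T_CUT = 1.0, 1e-3

    def f(t):
        return 0.5 / t * (1.0 - t)     # target kernel, f(t) <= g(t) on (0, 1]

    def g(t):
        return 0.5 / t                 # overestimate with an invertible Sudakov

    def next_emission(t0, rng=random):
        """Return the next emission scale below t0, or None if below the cutoff."""
        t = t0
        while True:
            # Solve exp(-int_t^{t_prev} g) = R for t; int g = 0.5*ln(t_prev/t),
            # so t = t_prev * R**2.
            t = t * rng.random() ** 2.0
            if t < T_CUT:
                return None            # evolution stopped: no emission above cutoff
            if rng.random() < f(t) / g(t):
                return t               # accept; otherwise veto and continue from t

    scales = [next_emission(T_MAX) for _ in range(5)]
    print(scales)
    ```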

  10. The Superior Lambert Algorithm

    NASA Astrophysics Data System (ADS)

    der, G.

    2011-09-01

    Lambert algorithms are used extensively for initial orbit determination, mission planning, space debris correlation, and missile targeting, just to name a few applications. Due to the significance of the Lambert problem in Astrodynamics, Gauss, Battin, Godal, Lancaster, Gooding, Sun and many others (References 1 to 15) have provided numerous formulations leading to various analytic solutions and iterative methods. Most Lambert algorithms and their computer programs can only work within one revolution, break down or converge slowly when the transfer angle is near zero or 180 degrees, and their multi-revolution limitations are either ignored or barely addressed. Despite claims of robustness, many Lambert algorithms fail without notice, and the users seldom have a clue why. The DerAstrodynamics lambert2 algorithm, which is based on the analytic solution formulated by Sun, works for any number of revolutions and converges rapidly at any transfer angle. It provides significant capability enhancements over every other Lambert algorithm in use today. These include improved speed, accuracy, robustness, and multirevolution capabilities as well as implementation simplicity. Additionally, the lambert2 algorithm provides a powerful tool for solving the angles-only problem without artificial singularities (pointed out by Gooding in Reference 16), which involves 3 lines of sight captured by optical sensors, or systems such as the Air Force Space Surveillance System (AFSSS). The analytic solution is derived from the extended Godal’s time equation by Sun, while the iterative method of solution is that of Laguerre, modified for robustness. The Keplerian solution of a Lambert algorithm can be extended to include the non-Keplerian terms of the Vinti algorithm via a simple targeting technique (References 17 to 19). Accurate analytic non-Keplerian trajectories can be predicted for satellites and ballistic missiles, while performing at least 100 times faster in speed than most

  11. MLSD-OSEM reconstruction algorithm for cosmic ray muon radiography

    NASA Astrophysics Data System (ADS)

    Liu, Yuanyuan; Zhao, Ziran; Chen, Zhiqiang; Zhang, Li; Xing, Yuxiang

    2008-03-01

    Cosmic ray muon radiography, which has good penetrability and sensitivity to high-Z materials, is an effective way of detecting shielded nuclear materials. The reconstruction algorithm is the key point of this technique. Currently, there are two main algorithms: the Point of Closest Approach (POCA) reconstruction algorithm, which reconstructs from track information alone, and maximum likelihood estimation, such as the Maximum Likelihood Scattering (MLS) and Maximum Likelihood Scattering and Displacement (MLSD) reconstruction algorithms proposed by the Los Alamos National Laboratory (LANL). MLSD performs better than MLS because it uses both scattering and displacement information, whereas MLS uses scattering information only. To compute the maximum likelihood estimate, we propose in this paper to use the EM method (MLS-EM and MLSD-EM). Then, to save reconstruction time, we use the ordered-subsets (OS) technique to accelerate the MLS and MLSD reconstructions, with the initial value set to the result of the POCA reconstruction; this yields the Maximum Likelihood Scattering-OSEM (MLS-OSEM) and Maximum Likelihood Scattering and Displacement-OSEM (MLSD-OSEM) algorithms. Numerical simulations show that MLSD-OSEM is an effective algorithm and that its performance is better than that of MLS-OSEM.
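
    The POCA estimate mentioned above reduces to the point of closest approach between the incoming and outgoing straight-line muon tracks. A sketch follows, under the assumption that each track is given as a point plus a direction vector; the standard two-line closest-approach formulas do the work.

    ```python
    # Point of Closest Approach (POCA) between an incoming and outgoing muon track.
    # Each track is modeled as a 3D line: point p + s * direction d.
    import numpy as np

    def poca(p1, d1, p2, d2):
        """Midpoint of the shortest segment joining two 3D lines (the POCA point)."""
        d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
        w = p1 - p2
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2     # a = c = 1 after normalization
        denom = a * c - b * b
        if abs(denom) < 1e-12:                  # parallel tracks: no scattering point
            return None
        s = (b * (d2 @ w) - c * (d1 @ w)) / denom
        t = (a * (d2 @ w) - b * (d1 @ w)) / denom
        closest1, closest2 = p1 + s * d1, p2 + t * d2
        return 0.5 * (closest1 + closest2)

    # Incoming track from above; outgoing track deflected by a scatterer near z = 0.
    p_in, d_in = np.array([0., 0., 10.]), np.array([0., 0., -1.])
    p_out, d_out = np.array([0.5, 0., -10.]), np.array([0.05, 0., -1.])
    print(poca(p_in, d_in, p_out, d_out))       # ~ (0, 0, 0)
    ```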

  12. The Ozone Show.

    ERIC Educational Resources Information Center

    Mathieu, Aaron

    2000-01-01

    Uses a talk show activity for a final assessment tool for students to debate about the ozone hole. Students are assessed on five areas: (1) cooperative learning; (2) the written component; (3) content; (4) self-evaluation; and (5) peer evaluation. (SAH)

  13. Show What You Know

    ERIC Educational Resources Information Center

    Eccleston, Jeff

    2007-01-01

    Big things come in small packages. This saying came to the mind of the author after he created a simple math review activity for his fourth grade students. Though simple, it has proven to be extremely advantageous in reinforcing math concepts. He uses this activity, which he calls "Show What You Know," often. This activity provides the perfect…

  14. Showing What They Know

    ERIC Educational Resources Information Center

    Cech, Scott J.

    2008-01-01

    Having students show their skills in three dimensions, known as performance-based assessment, dates back at least to Socrates. Individual schools such as Barrington High School--located just outside of Providence--have been requiring students to actively demonstrate their knowledge for years. Rhode Island's high school graduating class became…

  15. Stage a Water Show

    ERIC Educational Resources Information Center

    Frasier, Debra

    2008-01-01

    In the author's book titled "The Incredible Water Show," the characters from "Miss Alaineus: A Vocabulary Disaster" used an ocean of information to stage an inventive performance about the water cycle. In this article, the author relates how she turned the story into hands-on science teaching for real-life fifth-grade students. The author also…

  16. What Do Maps Show?

    ERIC Educational Resources Information Center

    Geological Survey (Dept. of Interior), Reston, VA.

    This curriculum packet, appropriate for grades 4-8, features a teaching poster which shows different types of maps (different views of Salt Lake City, Utah), as well as three reproducible maps and reproducible activity sheets which complement the maps. The poster provides teacher background, including step-by-step lesson plans for four geography…

  17. Obesity in show cats.

    PubMed

    Corbee, R J

    2014-12-01

    Obesity is an important disease with a high prevalence in cats. Because obesity is related to several other diseases, it is important to identify the population at risk. Several risk factors for obesity have been described in the literature. A higher incidence of obesity in certain cat breeds has been suggested. The aim of this study was to determine whether obesity occurs more often in certain breeds. The second aim was to relate the increased prevalence of obesity in certain breeds to the official standards of that breed. To this end, 268 cats of 22 different breeds were investigated by determining their body condition score (BCS) on a nine-point scale by inspection and palpation, at two different cat shows. Overall, 45.5% of the show cats had a BCS > 5, and 4.5% of the show cats had a BCS > 7. There were significant differences between breeds, which could be related to the breed standards. Most overweight and obese cats were in the neutered group. These findings warrant firm discussions with breeders and cat show judges to come to different interpretations of the standards in order to prevent overweight conditions in certain breeds from being the standard of beauty. Neutering predisposes cats to obesity and requires early nutritional intervention to prevent obese conditions. PMID:24612018

  18. Show Me the Way

    ERIC Educational Resources Information Center

    Dicks, Matthew J.

    2005-01-01

    Because today's students have grown up steeped in video games and the Internet, most of them expect feedback, and usually gratification, very soon after they expend effort on a task. Teachers can get quick feedback to students by showing them videotapes of their learning performances. The author, a 3rd grade teacher, describes how the seemingly…

  19. The Art Show

    ERIC Educational Resources Information Center

    Scolarici, Alicia

    2004-01-01

    This article describes what once was thought to be impossible--a formal art show extravaganza at an elementary school with 1,000 students, a Department of Defense Dependent School (DODDS) located overseas, on RAF Lakenheath, England. The dream of this event involved the transformation of the school cafeteria into an elegant art show…

  20. Honored Teacher Shows Commitment.

    ERIC Educational Resources Information Center

    Ratte, Kathy

    1987-01-01

    Part of the acceptance speech of the 1985 National Council for the Social Studies Teacher of the Year, this article describes the censorship experience of this honored social studies teacher. The incident involved the showing of a videotape version of the feature film entitled "The Seduction of Joe Tynan." (JDH)

  1. Algorithms to detect multiprotein modularity conserved during evolution.

    PubMed

    Hodgkinson, Luqman; Karp, Richard M

    2012-01-01

    Detecting essential multiprotein modules that change infrequently during evolution is a challenging algorithmic task that is important for understanding the structure, function, and evolution of the biological cell. In this paper, we define a measure of modularity for interactomes and present a linear-time algorithm, Produles, for detecting multiprotein modularity conserved during evolution that improves on the running time of previous algorithms for related problems and offers desirable theoretical guarantees. We present a biologically motivated graph theoretic set of evaluation measures complementary to previous evaluation measures, demonstrate that Produles exhibits good performance by all measures, and describe certain recurrent anomalies in the performance of previous algorithms that are not detected by previous measures. Consideration of the newly defined measures and algorithm performance on these measures leads to useful insights on the nature of interactomics data and the goals of previous and current algorithms. Through randomization experiments, we demonstrate that conserved modularity is a defining characteristic of interactomes. Computational experiments on current experimentally derived interactomes for Homo sapiens and Drosophila melanogaster, combining results across algorithms, show that nearly 10 percent of current interactome proteins participate in multiprotein modules with good evidence in the protein interaction data of being conserved between human and Drosophila.

  2. Sparse Algorithms Are Not Stable: A No-Free-Lunch Theorem.

    PubMed

    Huan Xu; Caramanis, C; Mannor, S

    2012-01-01

    We consider two desired properties of learning algorithms: sparsity and algorithmic stability. Both properties are believed to lead to good generalization ability. We show that these two properties are fundamentally at odds with each other: A sparse algorithm cannot be stable and vice versa. Thus, one has to trade off sparsity and stability in designing a learning algorithm. In particular, our general result implies that ℓ1-regularized regression (Lasso) cannot be stable, while ℓ2-regularized regression is known to have strong stability properties and is therefore not sparse.
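
    The trade-off can be illustrated informally with scikit-learn: perturb a single training sample and compare how much the Lasso (sparse) and ridge (stable) coefficient vectors move. This is only a qualitative toy demo, not the paper's formal uniform-stability measure, and individual random draws may vary.

    ```python
    # Toy illustration of the sparsity/stability trade-off: perturb one sample
    # and compare coefficient movement for Lasso (sparse) vs Ridge (stable).
    # An informal demo, not the paper's formal uniform-stability argument.
    import numpy as np
    from sklearn.linear_model import Lasso, Ridge

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 40))
    y = X[:, :3] @ np.array([2.0, -1.5, 1.0]) + 0.1 * rng.normal(size=60)

    X2, y2 = X.copy(), y.copy()
    X2[0] += 0.5 * rng.normal(size=40)      # perturb a single training point

    for name, model in [("lasso", Lasso(alpha=0.1)), ("ridge", Ridge(alpha=0.1))]:
        w1 = model.fit(X, y).coef_.copy()
        w2 = model.fit(X2, y2).coef_
        print(name, "nonzeros:", int(np.sum(w1 != 0)),
              "coef shift:", np.linalg.norm(w1 - w2))
    ```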

  3. A scalable and practical one-pass clustering algorithm for recommender system

    NASA Astrophysics Data System (ADS)

    Khalid, Asra; Ghazanfar, Mustansar Ali; Azam, Awais; Alahmari, Saad Ali

    2015-12-01

    KMeans clustering-based recommendation algorithms have been proposed claiming to increase the scalability of recommender systems. One potential drawback of these algorithms is that they perform training offline and hence cannot accommodate incremental updates with the arrival of new data, making them unsuitable for dynamic environments. Following this line of research, a new clustering algorithm called One-Pass is proposed, which is simple, fast, and accurate. We show empirically that the proposed algorithm outperforms K-Means in terms of recommendation and training time while maintaining a good level of accuracy.
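
    A generic one-pass ("leader") clustering scheme makes a single scan over the data, assigning each point to the nearest existing centroid when it lies within a radius and opening a new cluster otherwise, with an incremental centroid update. The paper's One-Pass algorithm may differ in detail; the sketch below only shows the general single-scan idea.

    ```python
    # Generic one-pass ("leader") clustering sketch: one scan over the data,
    # incremental centroid updates, no offline retraining. The paper's One-Pass
    # algorithm may differ in detail.
    import numpy as np

    def one_pass_cluster(points, radius):
        centroids, counts, labels = [], [], []
        for x in points:
            if centroids:
                d = [np.linalg.norm(x - c) for c in centroids]
                j = int(np.argmin(d))
                if d[j] <= radius:
                    counts[j] += 1
                    centroids[j] += (x - centroids[j]) / counts[j]  # running mean
                    labels.append(j)
                    continue
            centroids.append(x.astype(float).copy())   # open a new cluster
            counts.append(1)
            labels.append(len(centroids) - 1)
        return np.array(centroids), labels

    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal(0, .3, (50, 2)), rng.normal(3, .3, (50, 2))])
    cents, labs = one_pass_cluster(data, radius=1.0)
    print(len(cents), "clusters")
    ```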

  4. Serving the public good.

    PubMed

    Greene, Jennifer C

    2010-05-01

    This discussion foregrounds four key issues engaged by the articles presented in this special issue: the unique challenges and opportunities of environmental education evaluation, how to think well about the evaluation approaches and purposes that best match this domain, evaluation capacity building in environmental education and action, and accountability and activist pressures on contemporary evaluation. Environmental education evaluators are encouraged to consider positioning their work in service of the public good.

  5. 'Good palliative care' orders.

    PubMed

    Maddocks, I

    1993-01-01

    A Select Committee of the Parliament of South Australia, considering revisions to legislation governing care of the dying, did not support allowing doctors to assist suicide. They recommended that no liability attach to the provision of reasonable palliative care which happens to shorten life. The Committee affirmed the suggestion that positive open orders to provide 'good palliative care' should replace 'do not resuscitate' orders. PMID:7506978

  6. Taking in a Show.

    PubMed

    Boden, Timothy W

    2016-01-01

    Many medical practices have cut back on education and staff development expenses, especially those costs associated with conventions and conferences. But there are hard-to-value returns on your investment in these live events--beyond the obvious benefits of acquired knowledge and skills. Major vendors still exhibit their services and wares at many events, and the exhibit hall is a treasure-house of information and resources for the savvy physician or administrator. Make and stick to a purposeful plan to exploit the trade show. You can compare products, gain new insights and ideas, and even negotiate better deals with representatives anxious to realize returns on their exhibition investments. PMID:27249887

  7. Taking in a Show.

    PubMed

    Boden, Timothy W

    2016-01-01

    Many medical practices have cut back on education and staff development expenses, especially those costs associated with conventions and conferences. But there are hard-to-value returns on your investment in these live events--beyond the obvious benefits of acquired knowledge and skills. Major vendors still exhibit their services and wares at many events, and the exhibit hall is a treasure-house of information and resources for the savvy physician or administrator. Make and stick to a purposeful plan to exploit the trade show. You can compare products, gain new insights and ideas, and even negotiate better deals with representatives anxious to realize returns on their exhibition investments.

  8. Obesity in show dogs.

    PubMed

    Corbee, R J

    2013-10-01

    Obesity is an important disease with a growing incidence. Because obesity is related to several other diseases, and decreases life span, it is important to identify the population at risk. Several risk factors for obesity have been described in the literature. A higher incidence of obesity in certain breeds is often suggested. The aim of this study was to determine whether obesity occurs more often in certain breeds. The second aim was to relate the increased prevalence of obesity in certain breeds to the official standards of that breed. To this end, we investigated 1379 dogs of 128 different breeds by determining their body condition score (BCS). Overall, 18.6% of the show dogs had a BCS > 5, and 1.1% of the show dogs had a BCS > 7. There were significant differences between breeds, which could be correlated to the breed standards. It warrants firm discussions with breeders and judges in order to come to different interpretations of the standards to prevent overweight conditions from being the standard of beauty. PMID:22882163

  9. Automatic design of decision-tree algorithms with evolutionary algorithms.

    PubMed

    Barros, Rodrigo C; Basgalupp, Márcio P; de Carvalho, André C P L F; Freitas, Alex A

    2013-01-01

    This study reports the empirical analysis of a hyper-heuristic evolutionary algorithm that is capable of automatically designing top-down decision-tree induction algorithms. Top-down decision-tree algorithms are of great importance, considering their ability to provide an intuitive and accurate knowledge representation for classification problems. The automatic design of these algorithms seems timely, given the large literature accumulated over more than 40 years of research in the manual design of decision-tree induction algorithms. The proposed hyper-heuristic evolutionary algorithm, HEAD-DT, is extensively tested using 20 public UCI datasets and 10 microarray gene expression datasets. The algorithms automatically designed by HEAD-DT are compared with traditional decision-tree induction algorithms, such as C4.5 and CART. Experimental results show that HEAD-DT is capable of generating algorithms which are significantly more accurate than C4.5 and CART.

  10. Not a "reality" show.

    PubMed

    Wrong, Terence; Baumgart, Erica

    2013-01-01

    The authors of the preceding articles raise legitimate questions about patient and staff rights and the unintended consequences of allowing ABC News to film inside teaching hospitals. We explain why we regard their fears as baseless and not supported by what we heard from individuals portrayed in the filming, our decade-long experience making medical documentaries, and the full un-aired context of the scenes shown in the broadcast. The authors don't and can't know what conversations we had, what documents we reviewed, and what protections we put in place in each televised scene. Finally, we hope to correct several misleading examples cited by the authors as well as their offhand mischaracterization of our program as a "reality" show. PMID:23631336

  11. Not a "reality" show.

    PubMed

    Wrong, Terence; Baumgart, Erica

    2013-01-01

    The authors of the preceding articles raise legitimate questions about patient and staff rights and the unintended consequences of allowing ABC News to film inside teaching hospitals. We explain why we regard their fears as baseless and not supported by what we heard from individuals portrayed in the filming, our decade-long experience making medical documentaries, and the full un-aired context of the scenes shown in the broadcast. The authors don't and can't know what conversations we had, what documents we reviewed, and what protections we put in place in each televised scene. Finally, we hope to correct several misleading examples cited by the authors as well as their offhand mischaracterization of our program as a "reality" show.

  12. Cooperation and the common good.

    PubMed

    Johnstone, Rufus A; Rodrigues, António M M

    2016-02-01

    In this paper, we draw the attention of biologists to a result from the economic literature, which suggests that when individuals are engaged in a communal activity of benefit to all, selection may favour cooperative sharing of resources even among non-relatives. Provided that group members all invest some resources in the public good, they should refrain from conflict over the division of these resources. The reason is that, given diminishing returns on investment in public and private goods, claiming (or ceding) a greater share of total resources only leads to the actor (or its competitors) investing more in the public good, such that the marginal costs and benefits of investment remain in balance. This cancels out any individual benefits of resource competition. We illustrate how this idea may be applied in the context of biparental care, using a sequential game in which parents first compete with one another over resources, and then choose how to allocate the resources they each obtain to care of their joint young (public good) versus their own survival and future reproductive success (private good). We show that when the two parents both invest in care to some extent, they should refrain from any conflict over the division of resources. The same effect can also support asymmetric outcomes in which one parent competes for resources and invests in care, whereas the other does not invest but refrains from competition. The fact that the caring parent gains higher fitness pay-offs at these equilibria suggests that abandoning a partner is not always to the latter's detriment, when the potential for resource competition is taken into account, but may instead be of benefit to the 'abandoned' mate. PMID:26729926

  13. Public medical shows.

    PubMed

    Walusinski, Olivier

    2014-01-01

    In the second half of the 19th century, Jean-Martin Charcot (1825-1893) became famous for the quality of his teaching and his innovative neurological discoveries, bringing many French and foreign students to Paris. A hunger for recognition, together with progressive and anticlerical ideals, led Charcot to invite writers, journalists, and politicians to his lessons, during which he presented the results of his work on hysteria. These events became public performances, for which physicians and patients were transformed into actors. Major newspapers ran accounts of these consultations, more like theatrical shows in some respects. The resultant enthusiasm prompted other physicians in Paris and throughout France to try and imitate them. We will compare the form and substance of Charcot's lessons with those given by Jules-Bernard Luys (1828-1897), Victor Dumontpallier (1826-1899), Ambroise-Auguste Liébault (1823-1904), Hippolyte Bernheim (1840-1919), Joseph Grasset (1849-1918), and Albert Pitres (1848-1928). We will also note their impact on contemporary cinema and theatre. PMID:25273491

  14. The Great Cometary Show

    NASA Astrophysics Data System (ADS)

    2007-01-01

    its high spatial and spectral resolution, it was possible to zoom into the very heart of this very massive star. In this innermost region, the observations are dominated by the extremely dense stellar wind that totally obscures the underlying central star. The AMBER observations show that this dense stellar wind is not spherically symmetric, but exhibits a clearly elongated structure. Overall, the AMBER observations confirm that the extremely high mass loss of Eta Carinae's massive central star is non-spherical and much stronger along the poles than in the equatorial plane. This is in agreement with theoretical models that predict such an enhanced polar mass-loss in the case of rapidly rotating stars. [ESO PR Photo 06c/07: RS Ophiuchi in Outburst] Several papers from this special feature focus on the later stages in a star's life. One looks at the binary system Gamma 2 Velorum, which contains the closest example of a star known as a Wolf-Rayet. A single AMBER observation allowed the astronomers to separate the spectra of the two components, offering new insights in the modeling of Wolf-Rayet stars, but made it also possible to measure the separation between the two stars. This led to a new determination of the distance of the system, showing that previous estimates were incorrect. The observations also revealed information on the region where the winds from the two stars collide. The famous binary system RS Ophiuchi, an example of a recurrent nova, was observed just 5 days after it was discovered to be in outburst on 12 February 2006, an event that has been expected for 21 years. AMBER was able to detect the extension of the expanding nova emission. These observations show a complex geometry and kinematics, far from the simple interpretation of a spherical fireball in extension. AMBER has detected a high velocity jet probably perpendicular to the orbital plane of the binary system, and allowed a precise and careful study of the wind and the shockwave

  15. Stretched View Showing 'Victoria'

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site] Stretched View Showing 'Victoria'

    This pair of images from the panoramic camera on NASA's Mars Exploration Rover Opportunity served as initial confirmation that the two-year-old rover is within sight of 'Victoria Crater,' which it has been approaching for more than a year. Engineers on the rover team were unsure whether Opportunity would make it as far as Victoria, but scientists hoped for the chance to study such a large crater with their roving geologist. Victoria Crater is 800 meters (nearly half a mile) in diameter, about six times wider than 'Endurance Crater,' where Opportunity spent several months in 2004 examining rock layers affected by ancient water.

    When scientists using orbital data calculated that they should be able to detect Victoria's rim in rover images, they scrutinized frames taken in the direction of the crater by the panoramic camera. To positively characterize the subtle horizon profile of the crater and some of the features leading up to it, researchers created a vertically-stretched image (top) from a mosaic of regular frames from the panoramic camera (bottom), taken on Opportunity's 804th Martian day (April 29, 2006).

    The stretched image makes mild nearby dunes look like more threatening peaks, but that is only a result of the exaggerated vertical dimension. This vertical stretch technique was first applied to Viking Lander 2 panoramas by Philip Stooke, of the University of Western Ontario, Canada, to help locate the lander with respect to orbiter images. Vertically stretching the image allows features to be more readily identified by the Mars Exploration Rover science team.

    The bright white dot near the horizon to the right of center (barely visible without labeling or zoom-in) is thought to be a light-toned outcrop on the far wall of the crater, suggesting that the rover can see over the low rim of Victoria. In figure 1, the northeast and southeast rims are labeled

  16. Multipartite entanglement in quantum algorithms

    SciTech Connect

    Bruss, D.; Macchiavello, C.

    2011-05-15

    We investigate the entanglement features of the quantum states employed in quantum algorithms. In particular, we analyze the multipartite entanglement properties in the Deutsch-Jozsa, Grover, and Simon algorithms. Our results show that for these algorithms most instances involve multipartite entanglement.

  17. On algorithmic rate-coded AER generation.

    PubMed

    Linares-Barranco, Alejandro; Jimenez-Moreno, Gabriel; Linares-Barranco, Bernabé; Civit-Balcells, Antón

    2006-05-01

    This paper addresses the problem of converting a conventional video stream based on sequences of frames into the spike event-based representation known as the address-event representation (AER). In this paper we concentrate on rate-coded AER. The problem is addressed as an algorithmic problem, in which different methods are proposed, implemented and tested through software algorithms. The proposed algorithms are comparatively evaluated according to different criteria. Emphasis is put on the potential of such algorithms for (a) performing the frame-based to event-based conversion in real time, and (b) producing event streams that resemble as closely as possible those generated naturally by rate-coded address-event VLSI chips, such as silicon AER retinae. It is found that simple and straightforward algorithms tend to have high potential for real time but produce event distributions that differ considerably from those obtained in AER VLSI chips. On the other hand, sophisticated algorithms that yield better event distributions are not efficient for real-time operation. The method based on linear-feedback-shift-register (LFSR) pseudorandom number generation is a good compromise: it is feasible in real time and yields reasonably well-distributed events in time. Our software experiments, on a 1.6-GHz Pentium IV, show that at 50% AER bus load the proposed algorithms require between 0.011 and 1.14 ms per 8-bit pixel per frame. One of the proposed LFSR methods is implemented in real-time hardware using a prototyping board that includes a VirtexE 300 FPGA. The demonstration hardware is capable of transforming frames of 64 x 64 pixels of 8-bit depth at a frame rate of 25 frames per second, producing spike events at a peak rate of 10^7 events per second. PMID:16722179
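
    The LFSR idea can be sketched in software: a maximal-length 16-bit Galois LFSR supplies pseudorandom thresholds that scatter each pixel's events across the frame period in proportion to its intensity. The slot count, address ordering, and scheduling policy below are illustrative assumptions, not the paper's FPGA implementation.

    ```python
    # Rate-coded AER generation sketch: a 16-bit Galois LFSR provides the
    # pseudorandom thresholds that spread each pixel's events over the frame
    # period in proportion to intensity. Slot count and scheduling are
    # illustrative assumptions, not the paper's hardware design.
    import numpy as np

    def lfsr16(state):
        """One step of a maximal-length 16-bit Galois LFSR (taps 0xB400)."""
        lsb = state & 1
        state >>= 1
        if lsb:
            state ^= 0xB400
        return state

    def frame_to_events(frame, slots=64, seed=0xACE1):
        """Convert an 8-bit frame into a list of (slot, y, x) spike events."""
        events, state = [], seed
        for slot in range(slots):
            for y in range(frame.shape[0]):
                for x in range(frame.shape[1]):
                    state = lfsr16(state)
                    # Emit with probability ~ intensity/256 in every slot.
                    if (state & 0xFF) < frame[y, x]:
                        events.append((slot, y, x))
        return events

    frame = np.zeros((8, 8), dtype=np.uint8)
    frame[2:6, 2:6] = 200                      # bright patch fires more events
    ev = frame_to_events(frame)
    print(len(ev), "events; first:", ev[0] if ev else None)
    ```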

  18. Hash based parallel algorithms for mining association rules

    SciTech Connect

    Shintani, Takahiko; Kitsuregawa, Masaru

    1996-12-31

    In this paper, we propose four parallel algorithms (NPA, SPA, HPA, and HPA-ELD) for mining association rules on shared-nothing parallel machines in order to improve performance. In NPA, candidate itemsets are simply copied to all the processors, which can lead to memory overflow for large transaction databases. The remaining three algorithms partition the candidate itemsets over the processors. If the itemsets are partitioned simply (SPA), transaction data have to be broadcast to all processors. HPA partitions the candidate itemsets using a hash function to eliminate this broadcasting, which also reduces the comparison workload significantly. HPA-ELD fully utilizes the available memory space by detecting extremely large itemsets and copying them, which is also very effective at flattening the load over the processors. We implemented these algorithms in a shared-nothing environment. Performance evaluations show that the best algorithm, HPA-ELD, attains good linearity in speedup ratio and is effective at handling skew.
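
    The routing idea behind HPA can be illustrated with a small in-process simulation: candidate itemsets are assigned to "processors" by a hash, so each candidate subset of a transaction is counted at exactly one owner instead of being broadcast everywhere. Data, processor count, and support threshold below are toy assumptions.

    ```python
    # Sketch of hash-partitioned candidate counting (the idea behind HPA):
    # each candidate itemset is owned by one "processor", selected by a hash,
    # so per-transaction subsets are routed instead of broadcast.
    from itertools import combinations
    from collections import Counter

    P = 4                                            # simulated processor count
    transactions = [{1, 2, 3}, {2, 3, 4}, {1, 2, 4}, {2, 3}, {1, 3, 4}]
    candidates = [frozenset(c) for c in combinations({1, 2, 3, 4}, 2)]

    owner = {c: hash(c) % P for c in candidates}     # hash-partition candidates
    counts = [Counter() for _ in range(P)]           # per-processor support counts

    for t in transactions:
        for pair in combinations(sorted(t), 2):      # candidate subsets of t
            c = frozenset(pair)
            counts[owner[c]][c] += 1                 # routed to exactly one owner

    support = {c: counts[owner[c]][c] for c in candidates}
    frequent = {c for c, s in support.items() if s >= 3}
    print(sorted(tuple(sorted(c)) for c in frequent))  # -> [(2, 3)]
    ```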

  19. A Parallel Prefix Algorithm for Almost Toeplitz Tridiagonal Systems

    NASA Technical Reports Server (NTRS)

    Sun, Xian-He; Joslin, Ronald D.

    1995-01-01

    A compact scheme is a discretization scheme that is advantageous in obtaining highly accurate solutions. However, the resulting systems from compact schemes are tridiagonal systems that are difficult to solve efficiently on parallel computers. Considering the almost symmetric Toeplitz structure, a parallel algorithm, simple parallel prefix (SPP), is proposed. The SPP algorithm requires less memory than the conventional LU decomposition and is efficient on parallel machines. It consists of a prefix communication pattern and AXPY operations. Both the computation and the communication can be truncated without degrading the accuracy when the system is diagonally dominant. A formal accuracy study has been conducted to provide a simple truncation formula. Experimental results have been measured on a MasPar MP-1 SIMD machine and on a Cray 2 vector machine. Experimental results show that the simple parallel prefix algorithm is a good algorithm for symmetric, almost symmetric Toeplitz tridiagonal systems and for the compact scheme on high-performance computers.

  20. Good Clinical Practice Training

    PubMed Central

    Arango, Jaime; Chuck, Tina; Ellenberg, Susan S.; Foltz, Bridget; Gorman, Colleen; Hinrichs, Heidi; McHale, Susan; Merchant, Kunal; Shapley, Stephanie; Wild, Gretchen

    2016-01-01

    Good Clinical Practice (GCP) is an international standard for the design, conduct, performance, monitoring, auditing, recording, analyses, and reporting of clinical trials. The goal of GCP is to ensure the protection of the rights, integrity, and confidentiality of clinical trial participants and to ensure the credibility and accuracy of data and reported results. In the United States, trial sponsors generally require investigators to complete GCP training prior to participating in each clinical trial to foster GCP and as a method to meet regulatory expectations (ie, sponsor’s responsibility to select qualified investigators per 21 CFR 312.50 and 312.53(a) for drugs and biologics and 21 CFR 812.40 and 812.43(a) for medical devices). This training requirement is often extended to investigative site staff, as deemed relevant by the sponsor, institution, or investigator. Those who participate in multiple clinical trials are often required by sponsors to complete repeated GCP training, which is unnecessarily burdensome. The Clinical Trials Transformation Initiative convened a multidisciplinary project team involving partners from academia, industry, other researchers and research staff, and government to develop recommendations for streamlining current GCP training practices. Recommendations drafted by the project team, including the minimum key training elements, frequency, format, and evidence of training completion, were presented to a broad group of experts to foster discussion of the current issues and to seek consensus on proposed solutions. PMID:27390628

  1. Eggs: good or bad?

    PubMed

    Griffin, Bruce A

    2016-08-01

    Eggs have one of the lowest energy to nutrient density ratios of any food, and contain a quality of protein that is superior to beef steak and similar to dairy. From a nutritional perspective, this must qualify eggs as 'good'. The greater burden of proof has been to establish that eggs are not 'bad', by increasing awareness of the difference between dietary and blood cholesterol, and accumulating sufficient evidence to exonerate eggs from their associations with CVD and diabetes. After 60 years of research, a general consensus has now been reached that dietary cholesterol, chiefly from eggs, exerts a relatively small effect on serum LDL-cholesterol and CVD risk, in comparison with other diet and lifestyle factors. While dietary guidelines have been revised worldwide to reflect this view, associations between egg intake and the incidence of diabetes, and increased CVD risk in diabetes, prevail. These associations may be explained, in part, by residual confounding produced by other dietary components. The strength of evidence that links egg intake to increased CVD risk in diabetes is also complicated by variation in the response of serum LDL-cholesterol to eggs and dietary cholesterol in types 1 and 2 diabetes. On balance, the answer to the question as to whether eggs are 'bad', is probably 'no', but we do need to gain a better understanding of the effects of dietary cholesterol and its association with CVD risk in diabetes.

  2. Firefly algorithm with chaos

    NASA Astrophysics Data System (ADS)

    Gandomi, A. H.; Yang, X.-S.; Talatahari, S.; Alavi, A. H.

    2013-01-01

    A recently developed metaheuristic optimization algorithm, firefly algorithm (FA), mimics the social behavior of fireflies based on the flashing and attraction characteristics of fireflies. In the present study, we will introduce chaos into FA so as to increase its global search mobility for robust global optimization. Detailed studies are carried out on benchmark problems with different chaotic maps. Here, 12 different chaotic maps are utilized to tune the attractive movement of the fireflies in the algorithm. The results show that some chaotic FAs can clearly outperform the standard FA.
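
    A compact version of the chaotic firefly idea follows: the standard FA movement rule, with a logistic map driving the randomization parameter alpha. The paper studies 12 different maps and also tunes the attractiveness chaotically, so this sketch shows only one such variant, on a toy sphere function.

    ```python
    # Compact firefly algorithm with a chaotic (logistic-map) randomization
    # parameter, minimizing a toy sphere function. Only one of the paper's
    # many chaotic-FA variants is shown here.
    import numpy as np

    def sphere(x):
        return float(np.sum(x ** 2))

    rng = np.random.default_rng(0)
    n, dim, iters = 20, 5, 200
    beta0, gamma = 1.0, 1.0
    X = rng.uniform(-5, 5, (n, dim))
    fit = np.array([sphere(x) for x in X])
    chaos = 0.7                                  # logistic-map state in (0, 1)

    for _ in range(iters):
        chaos = 4.0 * chaos * (1.0 - chaos)      # logistic map drives alpha
        alpha = 0.5 * chaos
        for i in range(n):
            for j in range(n):
                if fit[j] < fit[i]:              # move firefly i toward brighter j
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    X[i] += beta * (X[j] - X[i]) + alpha * (rng.random(dim) - 0.5)
                    fit[i] = sphere(X[i])

    print("best:", fit.min())
    ```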

  3. An improved filter-u least mean square vibration control algorithm for aircraft framework.

    PubMed

    Huang, Quanzhen; Luo, Jun; Gao, Zhiyuan; Zhu, Xiaojin; Li, Hengyu

    2014-09-01

    Active vibration control of aerospace vehicle structures is a hot research topic, and the filter-u least mean square (FULMS) algorithm is one of its key methods. For practical reasons and due to technical limitations, however, extracting the vibration reference signal has always been a difficult problem for the FULMS algorithm. To solve this problem, an improved FULMS vibration control algorithm is proposed in this paper. The reference signal is constructed from the controller structure and the data produced as the algorithm runs, using a vibration response residual signal extracted directly from the vibrating structure. To test the proposed algorithm, an aircraft frame model is built and an experimental platform is constructed. The simulation and experimental results show that the proposed algorithm is practical and achieves good vibration suppression performance.

  4. An improved filter-u least mean square vibration control algorithm for aircraft framework

    NASA Astrophysics Data System (ADS)

    Huang, Quanzhen; Luo, Jun; Gao, Zhiyuan; Zhu, Xiaojin; Li, Hengyu

    2014-09-01

    Active vibration control of aerospace vehicle structures is a hot research topic, and the filter-u least mean square (FULMS) algorithm is one of its key methods. For practical reasons and due to technical limitations, however, extracting the vibration reference signal has always been a difficult problem for the FULMS algorithm. To solve this problem, an improved FULMS vibration control algorithm is proposed in this paper. The reference signal is constructed from the controller structure and the data produced as the algorithm runs, using a vibration response residual signal extracted directly from the vibrating structure. To test the proposed algorithm, an aircraft frame model is built and an experimental platform is constructed. The simulation and experimental results show that the proposed algorithm is practical and achieves good vibration suppression performance.
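
    The filtered-reference idea underlying this family of controllers can be sketched with the FIR cousin of FULMS, the filtered-x LMS (FxLMS) algorithm: the reference is filtered through an estimate of the secondary path before it enters the weight update. The paths, controller length, and step size below are toy assumptions; the paper's FULMS additionally uses IIR (feedback) weights and builds its reference from the residual.

    ```python
    # Minimal filtered-x LMS (FxLMS) sketch -- the FIR cousin of filtered-u
    # (FULMS) -- showing the filtered-reference idea on synthetic paths.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 4000
    x = rng.normal(size=N)                      # reference signal
    P = np.array([0.0, 0.8, 0.4, -0.2])         # primary (disturbance) path
    S = np.array([0.6, 0.3])                    # secondary path; assume S_hat == S
    d = np.convolve(x, P)[:N]                   # disturbance at the error sensor

    L, mu = 8, 0.01                             # controller length, step size
    w = np.zeros(L)
    xbuf = np.zeros(L)                          # reference history (controller input)
    fxbuf = np.zeros(L)                         # filtered-reference history
    xhist = np.zeros(len(S))                    # raw history for filtering by S_hat
    ybuf = np.zeros(len(S))                     # controller-output history
    err = np.zeros(N)

    for n in range(N):
        xbuf = np.roll(xbuf, 1); xbuf[0] = x[n]
        xhist = np.roll(xhist, 1); xhist[0] = x[n]
        fxbuf = np.roll(fxbuf, 1); fxbuf[0] = S @ xhist   # x filtered through S_hat
        y = w @ xbuf                            # anti-noise output
        ybuf = np.roll(ybuf, 1); ybuf[0] = y
        err[n] = d[n] - S @ ybuf                # residual at the error sensor
        w += mu * err[n] * fxbuf                # filtered-x LMS update

    print("mean |e|: first 500 =", np.abs(err[:500]).mean(),
          " last 500 =", np.abs(err[-500:]).mean())
    ```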

  5. A comparison of various optimization algorithms of protein-ligand docking programs by fitness accuracy.

    PubMed

    Guo, Liyong; Yan, Zhiqiang; Zheng, Xiliang; Hu, Liang; Yang, Yongliang; Wang, Jin

    2014-07-01

    In protein-ligand docking, an optimization algorithm is used to find the best binding pose of a ligand against a protein target. This algorithm plays a vital role in determining the docking accuracy. To evaluate the relative performance of different optimization algorithms and provide guidance for real applications, we performed a comparative study on six efficient optimization algorithms, containing two evolutionary algorithm (EA)-based optimizers (LGA, DockDE) and four particle swarm optimization (PSO)-based optimizers (SODock, varCPSO, varCPSO-ls, FIPSDock), which were implemented into the protein-ligand docking program AutoDock. We unified the objective functions by applying the same scoring function, and built a new fitness accuracy as the evaluation criterion that incorporates optimization accuracy, robustness, and efficiency. The varCPSO and varCPSO-ls algorithms show high efficiency with fast convergence speed. However, their accuracy is not optimal, as they cannot reach very low energies. SODock has the highest accuracy and robustness. In addition, SODock shows good performance in efficiency when optimizing drug-like ligands with less than ten rotatable bonds. FIPSDock shows excellent robustness and is close to SODock in accuracy and efficiency. In general, the four PSO-based algorithms show superior performance than the two EA-based algorithms, especially for highly flexible ligands. Our method can be regarded as a reference for the validation of new optimization algorithms in protein-ligand docking.
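
    All four PSO-based optimizers in the study build on the canonical global-best particle swarm update. The sketch below is that baseline on a toy test function, with standard parameter values; none of the docking-specific modifications of SODock, varCPSO, or FIPSDock are included.

    ```python
    # Canonical global-best PSO on a toy function -- the baseline that SODock,
    # varCPSO, and FIPSDock refine; their specific modifications are not shown.
    import numpy as np

    def rastrigin(x):
        return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

    rng = np.random.default_rng(0)
    n, dim, iters = 30, 5, 300
    w_inertia, c1, c2 = 0.72, 1.49, 1.49        # standard constriction-like values
    X = rng.uniform(-5.12, 5.12, (n, dim))
    V = np.zeros((n, dim))
    pbest = X.copy()
    pbest_f = np.array([rastrigin(x) for x in X])
    g = pbest[np.argmin(pbest_f)].copy()

    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        V = w_inertia * V + c1 * r1 * (pbest - X) + c2 * r2 * (g - X)
        X = X + V
        f = np.array([rastrigin(x) for x in X])
        better = f < pbest_f
        pbest[better], pbest_f[better] = X[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()

    print("best value:", pbest_f.min())
    ```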

  6. Study of image matching algorithm and sub-pixel fitting algorithm in target tracking

    NASA Astrophysics Data System (ADS)

    Yang, Ming-dong; Jia, Jianjun; Qiang, Jia; Wang, Jian-yu

    2015-03-01

    Image correlation matching is a tracking method that searches for the region most similar to a target template, based on a correlation measure between two images. Because there is no need to segment the image, the computational cost of this method is low, and it is a basic method of target tracking. This paper mainly studies a gray-scale image matching algorithm whose precision is at the sub-pixel level. The matching algorithm used in this paper is the SAD (Sum of Absolute Differences) method, which excels in real-time systems because of its low computational complexity. The SAD method is introduced first, together with the most frequently used sub-pixel fitting algorithms. Those fitting algorithms are too complex for real-time systems, yet target tracking often requires high real-time performance. With this in mind, we put forward a fitting algorithm named the paraboloidal fitting algorithm, which is simple and easily realized in a real-time system. The result of this algorithm is compared with that of a surface fitting algorithm through image matching simulation. By comparison, the precision difference between these two algorithms is small, less than 0.01 pixel. To research the influence of target rotation on the precision of image matching, an experiment with camera rotation was carried out. The detector used in the camera is a CMOS detector. It was fixed to an arc pendulum table, and pictures were taken with the camera rotated to different angles. A subarea of the original picture was chosen as the template, and the best matching spot was searched for using the image matching algorithm mentioned above. The results show that the matching error grows as the target rotation angle increases, in an approximately linear relation. Finally, the influence of noise on matching precision was researched. Gaussian noise and salt-and-pepper noise were added to the image respectively, and the image
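
    SAD matching with a parabolic sub-pixel refinement can be sketched as follows: an integer-pixel SAD search, then a three-point parabola fitted through the SAD minimum and its neighbors along each axis. Fitting the axes independently is a common lightweight choice; the paper's paraboloidal fit may combine the axes differently.

    ```python
    # SAD block matching with a three-point parabolic sub-pixel fit per axis.
    # Per-axis fitting is a common lightweight choice; the paper's paraboloidal
    # fit may differ in how it combines the two axes.
    import numpy as np

    def sad_match(image, template):
        """Integer-pixel SAD surface over all template placements."""
        th, tw = template.shape
        H, W = image.shape
        sad = np.empty((H - th + 1, W - tw + 1))
        for i in range(sad.shape[0]):
            for j in range(sad.shape[1]):
                sad[i, j] = np.abs(image[i:i+th, j:j+tw].astype(float)
                                   - template).sum()
        return sad

    def subpixel(s, i, j):
        """Refine (i, j) with a 1-D parabola through three SAD values per axis."""
        di = dj = 0.0
        if 0 < i < s.shape[0] - 1:
            denom = s[i-1, j] - 2 * s[i, j] + s[i+1, j]
            if denom != 0:
                di = 0.5 * (s[i-1, j] - s[i+1, j]) / denom
        if 0 < j < s.shape[1] - 1:
            denom = s[i, j-1] - 2 * s[i, j] + s[i, j+1]
            if denom != 0:
                dj = 0.5 * (s[i, j-1] - s[i, j+1]) / denom
        return i + di, j + dj

    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, (64, 64)).astype(np.uint8)
    tpl = img[20:36, 30:46]
    s = sad_match(img, tpl)
    i, j = np.unravel_index(np.argmin(s), s.shape)
    print("integer:", (i, j), "sub-pixel:", subpixel(s, i, j))
    ```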

  7. Good foragers can also be good at detecting predators.

    PubMed Central

    Cresswell, W; Quinn, J L; Whittingham, M J; Butler, S

    2003-01-01

    The degree to which foraging and vigilance are mutually exclusive is crucial to understanding the management of the predation and starvation risk trade-off in animals. We tested whether wild-caught captive chaffinches that feed at a higher rate do so at the expense of their speed in responding to a model sparrowhawk flying nearby, and whether consistently good foragers will therefore tend to respond more slowly on average. First, we confirmed that the time taken to respond to the approaching predator depended on the rate of scanning: as head-up rate increased so chaffinches responded more quickly. However, against predictions, as peck rate increased so head-up rate increased and mean length of head-up and head-down periods decreased. Head-up rate was probably dependent on peck rate because almost every time a seed was found, a bird raised its head to handle it. Therefore chaffinches with higher peck rates responded more quickly. Individual chaffinches showed consistent durations of both their head-down and head-up periods and, therefore, individuals that were good foragers were also good detectors of predators. In relation to the broad range of species that have a similar foraging mode to chaffinches, our results have two major implications for predation/starvation risk trade-offs: (i) feeding rate can determine vigilance scanning patterns; and (ii) the best foragers can also be the best at detecting predators. We discuss how our results can be explained in mechanistic terms relating to fundamental differences in how the probabilities of detecting food rather than a predator are affected by time. In addition, our results offer a plausible explanation for the widely observed effect that vigilance continues to decline with group size even when there is no further benefit to reducing vigilance. PMID:12803897

  8. Shuttle Entry Air Data System (SEADS) - Optimization of preflight algorithms based on flight results

    NASA Technical Reports Server (NTRS)

    Wolf, H.; Henry, M. W.; Siemers, Paul M., III

    1988-01-01

    The SEADS pressure model algorithm results were tested against other sources of air data, in particular, the Shuttle Best Estimated Trajectory (BET). The algorithm basis was also tested through a comparison of flight-measured pressure distribution vs the wind tunnel database. It is concluded that the successful flight of SEADS and the subsequent analysis of the data shows good agreement between BET and SEADS air data.

  9. Good Enough to Eat

    ERIC Educational Resources Information Center

    Swinbank, Elizabeth

    2004-01-01

    This article shows how the physical testing of food ingredients and products can be used as a starting point for exploring aspects of physics. The three main areas addressed are: mechanical properties of solid materials; liquid flow; optical techniques for measuring sugar concentration. The activities described were originally developed for…

  10. Good nurse, bad nurse....

    PubMed

    Alavi, C; Cattoni, J

    1995-02-01

    The construction of the nursing subject is discussed. The paper takes a historical perspective, arguing that the range of speaking positions available to the nurse is limited by gender, class and education. It evaluates the position of nursing in the university, showing how this also has a propensity to limit the development of the nursing profession.

  11. Acoustic design of rotor blades using a genetic algorithm

    NASA Technical Reports Server (NTRS)

    Wells, V. L.; Han, A. Y.; Crossley, W. A.

    1995-01-01

    A genetic algorithm coupled with a simplified acoustic analysis was used to generate low-noise rotor blade designs. The model includes thickness, steady loading and blade-vortex interaction noise estimates. The paper presents solutions for several variations in the fitness function, including thickness noise only, loading noise only, and combinations of the noise types. Preliminary results indicate that the analysis provides reasonable assessments of the noise produced, and that genetic algorithm successfully searches for 'good' designs. The results show that, for a given required thrust coefficient, proper blade design can noticeably reduce the noise produced at some expense to the power requirements.

  12. Moving target detection algorithm based on Gaussian mixture model

    NASA Astrophysics Data System (ADS)

    Wang, Zhihua; Kai, Du; Zhang, Xiandong

    2013-07-01

    In a real-time video surveillance system, background noise and disturbances have a significant impact on the detection of moving objects. The traditional Gaussian mixture model (GMM) adapts well to complex backgrounds, but it converges slowly and is vulnerable to sudden illumination changes. This paper proposes an improved moving target detection algorithm based on the GMM that increases the convergence rate of the foreground-to-background model transformation by introducing changing factors, and solves the sudden-illumination problem with a three-frame differencing method. The results show that this algorithm improves the accuracy of moving object detection and has good stability and real-time performance.
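
    OpenCV ships a GMM-based background subtractor (MOG2) that can serve as the baseline such work improves on. A minimal sketch on synthetic frames follows; the paper's improvements (faster convergence, changing factors, three-frame differencing) are not included.

    ```python
    # Baseline GMM background subtraction with OpenCV's MOG2 on synthetic frames.
    # The paper's improvements over the standard GMM are not reproduced here.
    import numpy as np
    import cv2

    subtractor = cv2.createBackgroundSubtractorMOG2(history=100, varThreshold=16,
                                                    detectShadows=False)
    rng = np.random.default_rng(0)
    for t in range(120):
        frame = np.full((120, 160), 80, dtype=np.uint8)
        frame += rng.integers(0, 5, frame.shape, dtype=np.uint8)   # sensor noise
        x = 10 + t                                                 # moving square
        frame[50:70, x % 140:x % 140 + 20] = 200
        mask = subtractor.apply(frame)                             # 255 = foreground

    print("foreground pixels in last frame:", int(np.count_nonzero(mask)))
    ```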

  13. Optimization of multilayer cylindrical cloaks using genetic algorithms and NEWUOA

    NASA Astrophysics Data System (ADS)

    Sakr, Ahmed A.; Abdelmageed, Alaa K.

    2016-06-01

    The problem of minimizing the scattering from a multilayer cylindrical cloak is studied. Both TM and TE polarizations are considered. A two-stage optimization procedure using genetic algorithms and NEWUOA (new unconstrained optimization algorithm) is adopted for realizing the cloak using homogeneous isotropic layers. The layers are arranged such that they follow a repeated pattern of alternating DPS and DNG materials. The results show that a good level of invisibility can be realized using a reasonable number of layers. Maintaining the cloak performance over a finite range of frequencies without sacrificing the level of invisibility is achieved.

  14. Novel permutation measures for image encryption algorithms

    NASA Astrophysics Data System (ADS)

    Abd-El-Hafiz, Salwa K.; AbdElHaleem, Sherif H.; Radwan, Ahmed G.

    2016-10-01

    This paper proposes two measures for the evaluation of permutation techniques used in image encryption. First, a general mathematical framework for describing the permutation phase used in image encryption is presented. Using this framework, six different permutation techniques, based on chaotic and non-chaotic generators, are described. The two new measures are, then, introduced to evaluate the effectiveness of permutation techniques. These measures are (1) Percentage of Adjacent Pixels Count (PAPC) and (2) Distance Between Adjacent Pixels (DBAP). The proposed measures are used to evaluate and compare the six permutation techniques in different scenarios. The permutation techniques are applied on several standard images and the resulting scrambled images are analyzed. Moreover, the new measures are used to compare the permutation algorithms on different matrix sizes irrespective of the actual parameters used in each algorithm. The analysis results show that the proposed measures are good indicators of the effectiveness of the permutation technique.

  15. And the good news...?

    NASA Astrophysics Data System (ADS)

    1999-11-01

    Along with the increase in the number of young people applying to enter higher education announced back in July, the UK Department for Education and Employment noted that over a thousand more graduates had applied for postgraduate teacher training when compared with the same time in 1998. It appeared that the `Golden hello' programme for new mathematics and science teachers had succeeded in its aim of encouraging applicants in those subjects: an increase of 37% had been witnessed for maths teaching, 33% for physics and 27% for chemistry. Primary teacher training was also well on target with over five applicants seeking each available place. Statistics for UK schools released in August by the DfEE show that 62% of primary schools and 93% of secondary schools are now linked to the Internet (the corresponding figures were 17% and 83% in 1998). On average there is now one computer for every 13 pupils at primary school and one for every eight students in secondary school. The figures show continuing progress towards the Government's target of ensuring that all schools, colleges, universities, libraries and as many community centres as possible should be online (with access to the National Grid for Learning) by 2002.

  16. Optimization of composite structures by estimation of distribution algorithms

    NASA Astrophysics Data System (ADS)

    Grosset, Laurent

    The design of high performance composite laminates, such as those used in aerospace structures, leads to complex combinatorial optimization problems that cannot be addressed by conventional methods. These problems are typically solved by stochastic algorithms, such as evolutionary algorithms. This dissertation proposes a new evolutionary algorithm for composite laminate optimization, named Double-Distribution Optimization Algorithm (DDOA). DDOA belongs to the family of estimation of distributions algorithms (EDA) that build a statistical model of promising regions of the design space based on sets of good points, and use it to guide the search. A generic framework for introducing statistical variable dependencies by making use of the physics of the problem is proposed. The algorithm uses two distributions simultaneously: the marginal distributions of the design variables, complemented by the distribution of auxiliary variables. The combination of the two generates complex distributions at a low computational cost. The dissertation demonstrates the efficiency of DDOA for several laminate optimization problems where the design variables are the fiber angles and the auxiliary variables are the lamination parameters. The results show that its reliability in finding the optima is greater than that of a simple EDA and of a standard genetic algorithm, and that its advantage increases with the problem dimension. A continuous version of the algorithm is presented and applied to a constrained quadratic problem. Finally, a modification of the algorithm incorporating probabilistic and directional search mechanisms is proposed. The algorithm exhibits a faster convergence to the optimum and opens the way for a unified framework for stochastic and directional optimization.
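
    The EDA family that DDOA belongs to replaces crossover and mutation with sampling from a statistical model refitted to the best individuals each generation. A minimal univariate EDA (UMDA-style) on a bit-string toy problem illustrates the loop; DDOA's second, physics-informed distribution over lamination parameters is not modeled here.

    ```python
    # Minimal univariate EDA (UMDA-style) on OneMax, illustrating the EDA loop
    # DDOA builds on. DDOA's auxiliary (lamination-parameter) distribution is
    # not modeled in this sketch.
    import numpy as np

    rng = np.random.default_rng(0)
    n_bits, pop, elite, gens = 40, 100, 25, 60
    p = np.full(n_bits, 0.5)                   # marginal model: P(bit = 1)

    for _ in range(gens):
        X = (rng.random((pop, n_bits)) < p).astype(int)   # sample population
        fitness = X.sum(axis=1)                           # OneMax: count of ones
        best = X[np.argsort(fitness)[-elite:]]            # select promising points
        p = 0.7 * p + 0.3 * best.mean(axis=0)             # refit marginals (smoothed)
        p = np.clip(p, 0.02, 0.98)                        # keep exploration alive

    print("best fitness:", int(fitness.max()), "of", n_bits)
    ```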

  17. Good Grubbin': Impact of a TV Cooking Show for College Students Living off Campus

    ERIC Educational Resources Information Center

    Clifford, Dawn; Anderson, Jennifer; Auld, Garry; Champ, Joseph

    2009-01-01

    Objective: To determine if a series of 4 15-minute, theory-driven (Social Cognitive Theory) cooking programs aimed at college students living off campus improved cooking self-efficacy, knowledge, attitudes, and behaviors regarding fruit and vegetable intake. Design: A randomized controlled trial with pre-, post- and follow-up tests. Setting:…

  18. Good News for New Orleans: Early Evidence Shows Reforms Lifting Student Achievement

    ERIC Educational Resources Information Center

    Harris, Douglas N.

    2015-01-01

    What happened to the New Orleans public schools following the tragic levee breeches after Hurricane Katrina is truly unprecedented. Within the span of one year, all public-school employees were fired, the teacher contract expired and was not replaced, and most attendance zones were eliminated. The state took control of almost all public schools…

  19. A fast image encryption algorithm based on chaotic map

    NASA Astrophysics Data System (ADS)

    Liu, Wenhao; Sun, Kehui; Zhu, Congxu

    2016-09-01

    Derived from the Sine map and the iterative chaotic map with infinite collapse (ICMIC), a new two-dimensional Sine ICMIC modulation map (2D-SIMM) is proposed based on a closed-loop modulation coupling (CMC) model, and its chaotic performance is analyzed by means of phase diagrams, the Lyapunov exponent spectrum and complexity. The analysis shows that this map has good ergodicity, hyperchaotic behavior, a large maximum Lyapunov exponent and high complexity. Based on this map, a fast image encryption algorithm is proposed. In this algorithm, the confusion and diffusion processes are combined into one stage. A chaotic shift transform (CST) is proposed to efficiently change the image pixel positions, and row and column substitutions are applied to scramble the pixel values simultaneously. The simulation and analysis results show that this algorithm has high security, low time complexity, and the ability to resist statistical analysis, differential, brute-force, known-plaintext and chosen-plaintext attacks.
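
    The 2D-SIMM map and the chaotic shift transform are not specified in the abstract, so the sketch below illustrates only the generic pattern behind such schemes: iterate a chaotic map (here the classic one-dimensional Sine map as a stand-in), rank the resulting sequence to obtain a permutation of pixel positions, and invert it with the same key.

        import numpy as np

        def sine_map_sequence(n, x0=0.3, a=0.99):
            # classic Sine map x_{k+1} = a*sin(pi*x_k); a stand-in for 2D-SIMM
            seq, x = np.empty(n), x0
            for k in range(n):
                x = a * np.sin(np.pi * x)
                seq[k] = x
            return seq

        def permute_image(img, key=0.3):
            flat = img.ravel()
            order = np.argsort(sine_map_sequence(flat.size, x0=key))  # chaotic ranking
            return flat[order].reshape(img.shape), order

        def inverse_permute(scrambled, order):
            flat = np.empty(scrambled.size, dtype=scrambled.dtype)
            flat[order] = scrambled.ravel()
            return flat.reshape(scrambled.shape)

        img = np.arange(16, dtype=np.uint8).reshape(4, 4)
        scr, order = permute_image(img)
        assert np.array_equal(inverse_permute(scr, order), img)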

  20. A New Switching-Based Median Filtering Scheme and Algorithm for Removal of High-Density Salt and Pepper Noise in Images

    NASA Astrophysics Data System (ADS)

    Jayaraj, V.; Ebenezer, D.

    2010-12-01

    A new switching-based median filtering scheme for restoration of images that are highly corrupted by salt and pepper noise is proposed. An algorithm based on the scheme is developed. The new scheme introduces the concept of substitution of noisy pixels by linear prediction prior to estimation. A novel simplified linear predictor is developed for this purpose. The objective of the scheme and algorithm is the removal of high-density salt and pepper noise in images. The new algorithm shows significantly better image quality with good PSNR, reduced MSE, good edge preservation, and reduced streaking. The good performance is achieved with reduced computational complexity. A comparison of the performance is made with several existing algorithms in terms of visual and quantitative results. The performance of the proposed scheme and algorithm is demonstrated.
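
    A minimal sketch of the switching idea described above: only pixels flagged as salt or pepper are replaced, each using its uncorrupted neighbours. The paper's linear-prediction substitution and streaking-reduction details are omitted; a plain median of clean neighbours is used instead.

        import numpy as np

        def switching_median(img, noise_vals=(0, 255), w=1):
            # replace only pixels flagged as salt/pepper, using the median of
            # uncorrupted neighbours in a (2w+1)x(2w+1) window
            out = img.copy()
            noisy = np.isin(img, noise_vals)
            rows, cols = img.shape
            for r, c in zip(*np.nonzero(noisy)):
                r0, r1 = max(r - w, 0), min(r + w + 1, rows)
                c0, c1 = max(c - w, 0), min(c + w + 1, cols)
                window = img[r0:r1, c0:c1]
                clean = window[~np.isin(window, noise_vals)]
                if clean.size:                      # uncorrupted neighbours available
                    out[r, c] = np.median(clean)
            return out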

  1. A variational multi-symplectic particle-in-cell algorithm with smoothing functions for the Vlasov-Maxwell system

    SciTech Connect

    Xiao, Jianyuan; Liu, Jian; Qin, Hong; Yu, Zhi

    2013-10-15

    Smoothing functions are commonly used to reduce numerical noise arising from coarse sampling of particles in particle-in-cell (PIC) plasma simulations. When applying smoothing functions to symplectic algorithms, the conservation of symplectic structure should be guaranteed to preserve good conservation properties. In this paper, we show how to construct a variational multi-symplectic PIC algorithm with smoothing functions for the Vlasov-Maxwell system. The conservation of the multi-symplectic structure and the reduction of numerical noise make this algorithm specifically suitable for simulating long-term dynamics of plasmas, such as those in the steady-state operation or long-pulse discharge of a super-conducting tokamak. The algorithm has been implemented in a 6D large scale PIC code. Numerical examples are given to demonstrate the good conservation properties of the multi-symplectic algorithm and the reduction of the noise due to the application of smoothing function.
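
    As a hedged illustration of what a smoothing (particle shape) function does in a PIC code, the sketch below shows standard one-dimensional cloud-in-cell charge deposition, where each particle's charge is shared linearly between its two nearest grid points. This is generic PIC practice, not the paper's variational multi-symplectic construction.

        import numpy as np

        def deposit_cic(positions, charges, n_grid, dx):
            # cloud-in-cell (linear shape function) deposition: each particle's
            # charge is split between its two nearest grid points, smoothing the
            # sampling noise of a coarse particle ensemble
            rho = np.zeros(n_grid)
            g = positions / dx
            i = np.floor(g).astype(int) % n_grid     # periodic grid
            frac = g - np.floor(g)
            np.add.at(rho, i, charges * (1.0 - frac))
            np.add.at(rho, (i + 1) % n_grid, charges * frac)
            return rho / dx

        rng = np.random.default_rng(2)
        x = rng.uniform(0.0, 1.0, 10000)
        rho = deposit_cic(x, np.full(x.size, 1e-4), n_grid=64, dx=1.0 / 64)
        print(rho.sum() * (1.0 / 64))   # total deposited charge is conserved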

  2. Digital watermarking algorithm based on HVS in wavelet domain

    NASA Astrophysics Data System (ADS)

    Zhang, Qiuhong; Xia, Ping; Liu, Xiaomei

    2013-10-01

    As a new technique used to protect the copyright of digital productions, the digital watermarking technique has drawn extensive attention. A digital watermarking algorithm based on the discrete wavelet transform (DWT) was presented in the paper according to human visual properties. Some attack analyses were then given. Experimental results show that the watermarking scheme proposed in this paper is invisible and robust to cropping, and also has good robustness to compression, filtering, and noise addition.
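
    A minimal sketch of additive DWT-domain watermarking in the spirit described above, using the pywt package: the watermark is embedded into a detail subband and extracted non-blindly by comparison with the original. The HVS-adaptive embedding strengths of the paper are replaced by a single constant alpha for brevity.

        import numpy as np
        import pywt

        def embed_watermark(img, wm_bits, alpha=5.0):
            # one-level DWT; additively embed the watermark into the diagonal
            # detail band, where visual sensitivity is lowest
            cA, (cH, cV, cD) = pywt.dwt2(img.astype(float), 'haar')
            wm = wm_bits.reshape(cD.shape)           # watermark sized to the band
            cD_marked = cD + alpha * (2.0 * wm - 1.0)
            return pywt.idwt2((cA, (cH, cV, cD_marked)), 'haar')

        def extract_watermark(marked, original):
            _, (_, _, cD_m) = pywt.dwt2(marked.astype(float), 'haar')
            _, (_, _, cD_o) = pywt.dwt2(original.astype(float), 'haar')
            return (cD_m - cD_o > 0).astype(np.uint8)  # non-blind extraction

        img = np.random.default_rng(3).integers(0, 256, (64, 64)).astype(float)
        bits = np.random.default_rng(4).integers(0, 2, 32 * 32)
        marked = embed_watermark(img, bits)
        assert np.array_equal(extract_watermark(marked, img).ravel(), bits)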

  3. Algorithm development

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.; Lomax, Harvard

    1987-01-01

    The past decade has seen considerable activity in algorithm development for the Navier-Stokes equations. This has resulted in a wide variety of useful new techniques. Some examples for the numerical solution of the Navier-Stokes equations are presented, divided into two parts. One is devoted to the incompressible Navier-Stokes equations, and the other to the compressible form.

  4. Comparison of l₁-Norm SVR and Sparse Coding Algorithms for Linear Regression.

    PubMed

    Zhang, Qingtian; Hu, Xiaolin; Zhang, Bo

    2015-08-01

    Support vector regression (SVR) is a popular function estimation technique based on Vapnik's concept of support vector machine. Among many variants, the l1-norm SVR is known to be good at selecting useful features when the features are redundant. Sparse coding (SC) is a technique widely used in many areas and a number of efficient algorithms are available. Both l1-norm SVR and SC can be used for linear regression. In this brief, the close connection between the l1-norm SVR and SC is revealed and some typical algorithms are compared for linear regression. The results show that the SC algorithms outperform the Newton linear programming algorithm, an efficient l1-norm SVR algorithm, in efficiency. The algorithms are then used to design the radial basis function (RBF) neural networks. Experiments on some benchmark data sets demonstrate the high efficiency of the SC algorithms. In particular, one of the SC algorithms, the orthogonal matching pursuit is two orders of magnitude faster than a well-known RBF network designing algorithm, the orthogonal least squares algorithm.
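
    A small sketch comparing orthogonal matching pursuit with an l1-penalized baseline on a sparse linear regression task, using scikit-learn. Lasso stands in for the l1-norm SVR, which scikit-learn does not provide; the data are synthetic.

        import numpy as np
        from sklearn.linear_model import OrthogonalMatchingPursuit, Lasso

        rng = np.random.default_rng(5)
        X = rng.standard_normal((200, 100))
        w_true = np.zeros(100)
        w_true[:5] = rng.standard_normal(5)            # only 5 useful features
        y = X @ w_true + 0.01 * rng.standard_normal(200)

        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5).fit(X, y)
        lasso = Lasso(alpha=0.01).fit(X, y)            # l1 penalty, akin to l1-norm SVR

        print('OMP support  :', np.nonzero(omp.coef_)[0])
        print('Lasso support:', np.nonzero(np.abs(lasso.coef_) > 1e-6)[0])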

  5. Contact solution algorithms

    NASA Technical Reports Server (NTRS)

    Tielking, John T.

    1989-01-01

    Two algorithms for obtaining static contact solutions are described in this presentation. Although they were derived for contact problems involving specific structures (a tire and a solid rubber cylinder), they are sufficiently general to be applied to other shell-of-revolution and solid-body contact problems. The shell-of-revolution contact algorithm is a method of obtaining a point load influence coefficient matrix for the portion of shell surface that is expected to carry a contact load. If the shell is sufficiently linear with respect to contact loading, a single influence coefficient matrix can be used to obtain a good approximation of the contact pressure distribution. Otherwise, the matrix will be updated to reflect nonlinear load-deflection behavior. The solid-body contact algorithm utilizes a Lagrange multiplier to include the contact constraint in a potential energy functional. The solution is found by applying the principle of minimum potential energy. The Lagrange multiplier is identified as the contact load resultant for a specific deflection. At present, only frictionless contact solutions have been obtained with these algorithms. A sliding tread element has been developed to calculate friction shear force in the contact region of the rolling shell-of-revolution tire model.
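
    The Lagrange-multiplier idea is easy to see in one dimension. In the sketch below, a linear spring is pushed toward a rigid wall; minimizing the potential energy under the contact constraint yields either the free solution or the constrained one, with the multiplier equal to the contact load resultant, as stated above. This illustrates the principle only, not the tire or cylinder formulation.

        # Pi(x) = 0.5*k*x**2 - F*x with contact constraint x <= g.
        # Stationarity of L = Pi + lam*(x - g) gives k*x - F + lam = 0, so the
        # multiplier lam is exactly the contact force when the wall is active.
        def contact_1d(k, F, g):
            x_free = F / k                 # unconstrained minimiser
            if x_free <= g:                # no contact: constraint inactive
                return x_free, 0.0
            return g, F - k * g            # contact: lam = contact load resultant

        print(contact_1d(k=100.0, F=50.0, g=0.3))   # -> (0.3, 20.0)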

  6. Node Deployment Algorithm Based on Viscous Fluid Model for Wireless Sensor Networks

    PubMed Central

    Qian, Huanyan

    2014-01-01

    As the network scale expands, traditional deployment algorithms become increasingly complicated and are no longer well suited to sensor networks. In order to reduce the complexity, we propose a node deployment algorithm based on a viscous fluid model. In wireless sensor networks, sensor nodes are abstracted as fluid particles. Similar to the diffusion and self-propagation behavior of fluid particles, sensor nodes achieve deployment in an unknown region by following the motion rules of a fluid. Simulation results show that our algorithm achieves good coverage rate and homogeneity in large-scale sensor networks. PMID:25133222

  7. New color image encryption algorithm based on compound chaos mapping and hyperchaotic cellular neural network

    NASA Astrophysics Data System (ADS)

    Li, Jinqing; Bai, Fengming; Di, Xiaoqiang

    2013-01-01

    We propose an image encryption/decryption algorithm based on chaotic control parameters and a hyperchaotic system with a composite permutation-diffusion structure. Compound chaos mapping is used to generate control parameters in the permutation stage, and the high correlation between adjacent pixels is broken by shuffling. In the diffusion stage, compound chaos mapping with different initial conditions and control parameters generates the diffusion parameters, which are applied to hyperchaotic cellular neural networks. The diffusion key stream obtained by this process implements the diffusion of the pixels. Compared with existing methods, both simulation and statistical analysis of our proposed algorithm show that it performs well against attacks and meets the corresponding security level.

  8. A baseline algorithm for face detection and tracking in video

    NASA Astrophysics Data System (ADS)

    Manohar, Vasant; Soundararajan, Padmanabhan; Korzhova, Valentina; Boonstra, Matthew; Goldgof, Dmitry; Kasturi, Rangachar

    2007-10-01

    Establishing benchmark datasets, performance metrics and baseline algorithms have considerable research significance in gauging the progress in any application domain. These primarily allow both users and developers to compare the performance of various algorithms on a common platform. In our earlier works, we focused on developing performance metrics and establishing a substantial dataset with ground truth for object detection and tracking tasks (text and face) in two video domains -- broadcast news and meetings. In this paper, we present the results of a face detection and tracking algorithm on broadcast news videos with the objective of establishing a baseline performance for this task-domain pair. The detection algorithm uses a statistical approach that was originally developed by Viola and Jones and later extended by Lienhart. The algorithm uses a feature set that is Haar-like and a cascade of boosted decision tree classifiers as a statistical model. In this work, we used the Intel Open Source Computer Vision Library (OpenCV) implementation of the Haar face detection algorithm. The optimal values for the tunable parameters of this implementation were found through an experimental design strategy commonly used in statistical analyses of industrial processes. Tracking was accomplished as continuous detection with the detected objects in two frames mapped using a greedy algorithm based on the distances between the centroids of bounding boxes. Results on the evaluation set containing 50 sequences (~ 2.5 mins.) using the developed performance metrics show good performance of the algorithm reflecting the state-of-the-art which makes it an appropriate choice as the baseline algorithm for the problem.
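
    A sketch of the same pipeline using OpenCV's bundled Haar cascade and a greedy centroid-distance matcher for frame-to-frame linking. The cascade file and detection parameters shown are common defaults, not the tuned values found by the paper's experimental design.

        import cv2
        import numpy as np

        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')

        def detect_faces(frame):
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # tunable parameters; the paper selected them by experimental design
            return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

        def greedy_match(prev_boxes, cur_boxes):
            # tracking as continuous detection: greedily link boxes whose
            # centroids are closest between consecutive frames
            def centroid(b):
                return np.array([b[0] + b[2] / 2.0, b[1] + b[3] / 2.0])
            pairs, used = [], set()
            for i, p in enumerate(prev_boxes):
                dists = [(np.linalg.norm(centroid(p) - centroid(c)), j)
                         for j, c in enumerate(cur_boxes) if j not in used]
                if dists:
                    d, j = min(dists)
                    used.add(j)
                    pairs.append((i, j, d))
            return pairs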

  9. Digital watermarking algorithm research of color images based on quaternion Fourier transform

    NASA Astrophysics Data System (ADS)

    An, Mali; Wang, Weijiang; Zhao, Zhen

    2013-10-01

    A watermarking algorithm of color images based on the quaternion Fourier Transform (QFFT) and improved quantization index algorithm (QIM) is proposed in this paper. The original image is transformed by QFFT, the watermark image is processed by compression and quantization coding, and then the processed watermark image is embedded into the components of the transformed original image. It achieves embedding and blind extraction of the watermark image. The experimental results show that the watermarking algorithm based on the improved QIM algorithm with distortion compensation achieves a good tradeoff between invisibility and robustness, and better robustness for the attacks of Gaussian noises, salt and pepper noises, JPEG compression, cropping, filtering and image enhancement than the traditional QIM algorithm.
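
    Basic (uncompensated) QIM is easy to sketch: each bit selects one of two interleaved quantizer lattices for a transform coefficient, and extraction decides by which lattice lies closer. The paper's improved QIM adds distortion compensation on top of this; the step size delta below is an arbitrary choice.

        import numpy as np

        def qim_embed(coeffs, bits, delta=8.0):
            # shift each coefficient onto one of two interleaved lattices
            d = np.where(bits == 1, delta / 4.0, -delta / 4.0)
            return delta * np.round((coeffs - d) / delta) + d

        def qim_extract(coeffs, delta=8.0):
            # decide by which lattice each received coefficient lies closer to
            d1 = np.abs(coeffs - (delta * np.round((coeffs - delta / 4) / delta) + delta / 4))
            d0 = np.abs(coeffs - (delta * np.round((coeffs + delta / 4) / delta) - delta / 4))
            return (d1 < d0).astype(np.uint8)

        rng = np.random.default_rng(6)
        c = rng.uniform(-100, 100, 1000)
        b = rng.integers(0, 2, 1000)
        noisy = qim_embed(c, b) + rng.normal(0, 0.5, 1000)   # mild attack
        print('bit errors:', int(np.sum(qim_extract(noisy) != b)))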

  10. Parallel algorithms for unconstrained optimizations by multisplitting

    SciTech Connect

    He, Qing

    1994-12-31

    In this paper a new parallel iterative algorithm for unconstrained optimization using the idea of multisplitting is proposed. This algorithm uses the existing sequential algorithms without any parallelization. Some convergence and numerical results for this algorithm are presented. The experiments are performed on an Intel iPSC/860 Hyper Cube with 64 nodes. It is interesting that the sequential implementation on one node shows that if the problem is split properly, the algorithm converges much faster than one without splitting.

  11. A Hybrid Ant Colony Algorithm for Loading Pattern Optimization

    NASA Astrophysics Data System (ADS)

    Hoareau, F.

    2014-06-01

    Electricité de France (EDF) operates 58 nuclear power plants (NPPs) of the Pressurized Water Reactor (PWR) type. The loading pattern (LP) optimization of these NPPs is currently done by EDF expert engineers. Within this framework, EDF R&D has developed automatic optimization tools that assist the experts. The latter can resort, for instance, to loading pattern optimization software based on an ant colony algorithm. This paper presents an analysis of the search space of a few realistic loading pattern optimization problems. This analysis leads us to introduce a hybrid algorithm based on ant colony and a local search method. We then show that this new algorithm is able to generate loading patterns of good quality.

  12. Retrieval of Sea Surface Temperature Over Poteran Island Water of Indonesia with Landsat 8 Tirs Image: a Preliminary Algorithm

    NASA Astrophysics Data System (ADS)

    Syariz, M. A.; Jaelani, L. M.; Subehi, L.; Pamungkas, A.; Koenhardono, E. S.; Sulisetyono, A.

    2015-10-01

    Sea Surface Temperature (SST) can be retrieved from satellite data, which can provide SST records over long periods. Since algorithms for SST estimation using the Landsat 8 thermal bands are site-dependent, an algorithm applicable to Indonesian waters needs to be developed. The aim of this research was to develop SST algorithms for the North Java Island waters. The data used are in-situ data measured on April 22, 2015 and brightness temperature data estimated from the Landsat 8 thermal band image (band 10 and band 11). The algorithm was established using 45 data points by assessing the relation between the measured in-situ data and the estimated brightness temperature. The algorithm was then validated using another 40 points. The results showed good performance of the sea surface temperature algorithm, with a coefficient of determination (R2) and Root Mean Square Error (RMSE) of 0.912 and 0.028, respectively.
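
    The calibration step amounts to a regression of in-situ SST against estimated brightness temperature. The sketch below fits a single-band linear algorithm with numpy; the six data points are invented placeholders, not the paper's 45 calibration or 40 validation points.

        import numpy as np

        # band 10 brightness temperatures and in-situ SST (deg C); illustrative values
        bt10 = np.array([24.1, 24.6, 25.0, 25.3, 25.9, 26.4])
        sst  = np.array([28.9, 29.2, 29.5, 29.7, 30.2, 30.6])

        slope, intercept = np.polyfit(bt10, sst, 1)      # single-band linear algorithm
        pred = slope * bt10 + intercept

        ss_res = np.sum((sst - pred) ** 2)
        ss_tot = np.sum((sst - np.mean(sst)) ** 2)
        r2 = 1.0 - ss_res / ss_tot
        rmse = np.sqrt(np.mean((sst - pred) ** 2))
        print(f'SST = {slope:.3f}*BT + {intercept:.3f}, R2={r2:.3f}, RMSE={rmse:.3f}')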

  13. Dreaming, Stealing, Dancing, Showing Off.

    ERIC Educational Resources Information Center

    Lavender, Peter; Taylor, Chris

    2002-01-01

    Lessons learned from British projects to deliver literacy, numeracy, and English as a second language through community agencies included the following: (1) innovation and measured risks are required to attract hard-to-reach adults; (2) good practice needs to be shared; and (3) projects worked best when government funds were managed by community…

  14. Algorithm aversion: people erroneously avoid algorithms after seeing them err.

    PubMed

    Dietvorst, Berkeley J; Simmons, Joseph P; Massey, Cade

    2015-02-01

    Research shows that evidence-based algorithms more accurately predict the future than do human forecasters. Yet when forecasters are deciding whether to use a human forecaster or a statistical algorithm, they often choose the human forecaster. This phenomenon, which we call algorithm aversion, is costly, and it is important to understand its causes. We show that people are especially averse to algorithmic forecasters after seeing them perform, even when they see them outperform a human forecaster. This is because people more quickly lose confidence in algorithmic than human forecasters after seeing them make the same mistake. In 5 studies, participants either saw an algorithm make forecasts, a human make forecasts, both, or neither. They then decided whether to tie their incentives to the future predictions of the algorithm or the human. Participants who saw the algorithm perform were less confident in it, and less likely to choose it over an inferior human forecaster. This was true even among those who saw the algorithm outperform the human.

  15. Plan Showing Cross Bracing Under Upper Stringers, Typical Section Showing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Plan Showing Cross Bracing Under Upper Stringers, Typical Section Showing End Framing, Plan Showing Cross Bracing Under Lower Stringers, End Elevation - Covered Bridge, Spanning Contoocook River, Hopkinton, Merrimack County, NH

  16. Enjoyment and the Good Life.

    ERIC Educational Resources Information Center

    Estes, Cheryl; Henderson, Karla

    2003-01-01

    Presents information to update parks and recreation professionals about what recent research says in regard to enjoyment and the good life, noting what applications this research has for practitioners. The article focuses on: the good life and leisure services; happiness, subjective well-being, and intrinsic motivation; leisure, happiness, and…

  17. What Are Good Child Outcomes?

    ERIC Educational Resources Information Center

    Moore, Kristin Anderson; Evans, V. Jeffery; Brooks-Gunn, Jeanne; Roth, Jodie

    This paper considers the question "What are good child outcomes?" from the perspectives of developmental psychology, economics, and sociology. Section 1 of the paper examines good child outcomes as characteristics of stage-salient tasks of development. Section 2 emphasizes the acquisition of "human capital," the development of productive traits…

  18. Temperature Corrected Bootstrap Algorithm

    NASA Technical Reports Server (NTRS)

    Comiso, Joey C.; Zwally, H. Jay

    1997-01-01

    A temperature corrected Bootstrap Algorithm has been developed using Nimbus-7 Scanning Multichannel Microwave Radiometer data in preparation for the upcoming AMSR instrument aboard ADEOS and EOS-PM. The procedure first calculates the effective surface emissivity using emissivities of ice and water at 6 GHz and a mixing formulation that utilizes ice concentrations derived using the current Bootstrap algorithm, but using brightness temperatures from the 6 GHz and 37 GHz channels. These effective emissivities are then used to calculate surface ice temperatures, which in turn are used to convert the 18 GHz and 37 GHz brightness temperatures to emissivities. Ice concentrations are then derived using the same technique as the Bootstrap algorithm but using emissivities instead of brightness temperatures. The results show significant improvement in areas where the ice temperature is expected to vary considerably, such as near the continental areas in the Antarctic, where the ice temperature is colder than average, and in marginal ice zones.
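
    The conversion chain described above can be sketched directly from the text: mix ice and water emissivities at 6 GHz using an ice-concentration estimate, infer the effective surface temperature from Tb = e x Ts, and divide the higher-frequency brightness temperatures by that temperature. All numeric values below, including the 6 GHz emissivities, are illustrative placeholders.

        def emissivities_18_37(tb6, tb18, tb37, conc, e_ice6=0.92, e_water6=0.60):
            # mixing formulation at 6 GHz: effective emissivity from ice concentration
            e_eff6 = conc * e_ice6 + (1.0 - conc) * e_water6
            ts = tb6 / e_eff6                    # effective physical surface temperature
            return tb18 / ts, tb37 / ts          # Tb = e * Ts (Rayleigh-Jeans regime)

        tb6, tb18, tb37 = 240.0, 235.0, 228.0    # brightness temperatures, K (illustrative)
        conc = 0.85                              # first-guess ice concentration
        print(emissivities_18_37(tb6, tb18, tb37, conc))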

  19. Power spectral estimation algorithms

    NASA Technical Reports Server (NTRS)

    Bhatia, Manjit S.

    1989-01-01

    Algorithms to estimate the power spectrum using Maximum Entropy Methods were developed. These algorithms were coded in FORTRAN 77 and were implemented on the VAX 780. The important considerations in this analysis are: (1) resolution, i.e., how close in frequency two spectral components can be spaced and still be identified; (2) dynamic range, i.e., how small a spectral peak can be, relative to the largest, and still be observed in the spectra; and (3) variance, i.e., how accurately the estimated spectrum matches the actual spectrum. The application of the algorithms based on Maximum Entropy Methods to a variety of data shows that these criteria are met quite well. Additional work in this direction would help confirm the findings. All of the software developed was turned over to the technical monitor. A copy of a typical program is included. Some of the actual data and graphs used on this data are also included.
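
    A compact maximum-entropy (autoregressive) spectral estimate can be sketched via the Yule-Walker equations; resolving two closely spaced sinusoids, as in the resolution criterion above, is the classic test. This is a generic MEM formulation in numpy/scipy, not the original FORTRAN 77 code.

        import numpy as np
        from scipy.linalg import solve_toeplitz

        def mem_psd(x, order, n_freq=512):
            # maximum entropy (AR) spectrum via the Yule-Walker equations
            x = x - np.mean(x)
            r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(order + 1)]) / len(x)
            a = solve_toeplitz(r[:-1], r[1:])            # AR coefficients
            sigma2 = r[0] - np.dot(a, r[1:])             # prediction error power
            f = np.linspace(0.0, 0.5, n_freq)
            z = np.exp(-2j * np.pi * np.outer(f, np.arange(1, order + 1)))
            return f, sigma2 / np.abs(1.0 - z @ a) ** 2

        rng = np.random.default_rng(7)
        n = np.arange(1024)
        sig = np.sin(2 * np.pi * 0.1 * n) + 0.5 * np.sin(2 * np.pi * 0.11 * n)
        f, psd = mem_psd(sig + 0.1 * rng.standard_normal(n.size), order=32)
        print(f[np.argmax(psd)])   # strongest peak; both peaks visible in the spectrum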

  20. A test sheet generating algorithm based on intelligent genetic algorithm and hierarchical planning

    NASA Astrophysics Data System (ADS)

    Gu, Peipei; Niu, Zhendong; Chen, Xuting; Chen, Wei

    2013-03-01

    In recent years, computer-based testing has become an effective method to evaluate students' overall learning progress so that appropriate guiding strategies can be recommended. Research has been done to develop intelligent test assembling systems which can automatically generate test sheets based on given parameters of test items. A good multisubject test sheet depends on not only the quality of the test items but also the construction of the sheet. Effective and efficient construction of test sheets according to multiple subjects and criteria is a challenging problem. In this paper, a multi-subject test sheet generation problem is formulated and a test sheet generating approach based on intelligent genetic algorithm and hierarchical planning (GAHP) is proposed to tackle this problem. The proposed approach utilizes hierarchical planning to simplify the multi-subject testing problem and adopts genetic algorithm to process the layered criteria, enabling the construction of good test sheets according to multiple test item requirements. Experiments are conducted and the results show that the proposed approach is capable of effectively generating multi-subject test sheets that meet specified requirements and achieve good performance.
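
    A minimal genetic-algorithm sketch for the selection flavor of this problem: chromosomes are sets of item indices, crossover samples from the union of two parents, and the fitness shown scores a single criterion (a target mean difficulty). The layered multi-subject criteria and hierarchical planning of GAHP are omitted.

        import numpy as np

        rng = np.random.default_rng(8)
        n_items, sheet_len, pop, gens = 200, 20, 60, 100
        difficulty = rng.uniform(0.0, 1.0, n_items)       # stand-in item parameters

        def fitness(sheet):
            # one criterion for illustration: hit a target mean difficulty of 0.55
            return -abs(difficulty[sheet].mean() - 0.55)

        def mutate(sheet):
            # replace one item with a random unused item
            pool = np.setdiff1d(np.arange(n_items), sheet)
            sheet = sheet.copy()
            sheet[rng.integers(sheet_len)] = rng.choice(pool)
            return sheet

        population = [rng.choice(n_items, sheet_len, replace=False) for _ in range(pop)]
        for _ in range(gens):
            population.sort(key=fitness, reverse=True)
            parents = population[:pop // 2]
            children = []
            while len(children) < pop - len(parents):
                i, j = rng.choice(len(parents), 2, replace=False)
                genes = np.union1d(parents[i], parents[j])      # union crossover
                child = rng.choice(genes, sheet_len, replace=False)
                if rng.random() < 0.2:
                    child = mutate(child)
                children.append(child)
            population = parents + children
        print(fitness(max(population, key=fitness)))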

  2. Does good documentation equate to good nursing care?

    PubMed

    Bosek, Marcia Sue DeWolf; Ring, Marcia Ellen

    2010-01-01

    Good documentation does not necessarily equate to good care. This article explores the potential underpinnings of poor documentation from an ethical decision-making lens. Nursing standards of care related to documentation are reviewed. The internal and external constraints of moral distress are considered, as is moral residue. Finally, the roles of the nurse administrator as well as specific remedial and restorative measures are suggested.

  3. Ensembles of satellite aerosol retrievals based on three AATSR algorithms within aerosol_cci

    NASA Astrophysics Data System (ADS)

    Kosmale, Miriam; Popp, Thomas

    2016-04-01

    Ensemble techniques are widely used in the modelling community, combining different modelling results in order to reduce uncertainties. This approach can also be adapted to satellite measurements. Aerosol_cci is an ESA funded project in which most of the European aerosol retrieval groups work together. The different algorithms are homogenized as far as it makes sense, but remain essentially different. Datasets are compared with ground based measurements and with each other. Within this project, three AATSR algorithms (Swansea university aerosol retrieval, ADV aerosol retrieval by FMI and Oxford aerosol retrieval ORAC) provide 17 year global aerosol records. Each of these algorithms also provides uncertainty information at pixel level. In the presented work, an ensemble of the three AATSR algorithms is constructed. The advantage over each single algorithm is the higher spatial coverage due to more measurement pixels per gridbox. Validation against ground based AERONET measurements still shows a good correlation for the ensemble compared to the single algorithms. Annual mean maps show the global aerosol distribution, based on a combination of the three aerosol algorithms. In addition, the pixel level uncertainties of each algorithm are used to weight the contributions, in order to reduce the uncertainty of the ensemble. Results of different versions of the ensembles for aerosol optical depth will be presented and discussed. The results are validated against ground based AERONET measurements. Higher spatial coverage on a daily basis allows better results in annual mean maps. The benefit of using pixel level uncertainties is analysed.
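
    Weighting ensemble members by their pixel-level uncertainties naturally suggests an inverse-variance mean, which is one plausible realization of the weighting described above (the project's actual combination rule may differ). Missing retrievals are NaNs, so coverage grows with the number of members.

        import numpy as np

        def ensemble_aod(aod, sigma):
            # inverse-variance weighting of per-algorithm AOD retrievals
            w = 1.0 / sigma ** 2
            w = np.where(np.isnan(aod), 0.0, w)
            a = np.where(np.isnan(aod), 0.0, aod)
            wsum = w.sum(axis=0)
            safe = np.maximum(wsum, 1e-30)
            mean = np.where(wsum > 0, (w * a).sum(axis=0) / safe, np.nan)
            sd = np.where(wsum > 0, np.sqrt(1.0 / safe), np.nan)   # combined uncertainty
            return mean, sd

        aod = np.array([[0.21, np.nan], [0.25, 0.30], [0.19, 0.28]])  # 3 algorithms x 2 pixels
        sigma = np.array([[0.02, 0.05], [0.04, 0.03], [0.03, 0.06]])
        print(ensemble_aod(aod, sigma))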

  4. Bouc-Wen hysteresis model identification using Modified Firefly Algorithm

    NASA Astrophysics Data System (ADS)

    Zaman, Mohammad Asif; Sikder, Urmita

    2015-12-01

    The parameters of the Bouc-Wen hysteresis model are identified using a Modified Firefly Algorithm. The proposed algorithm uses dynamic process control parameters to improve its performance. The algorithm is used to find the model parameter values that result in the least error between a set of given data points and points obtained from the Bouc-Wen model. The performance of the algorithm is compared with that of the conventional Firefly Algorithm, a Genetic Algorithm and the Differential Evolution algorithm in terms of convergence rate and accuracy. Compared to the other three optimization algorithms, the proposed algorithm is found to have a good convergence rate with a high degree of accuracy in identifying Bouc-Wen model parameters. Finally, the proposed method is used to find the Bouc-Wen model parameters from experimental data. The obtained model is found to be in good agreement with the measured data.
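
    A sketch of the standard firefly update (attractiveness decaying with squared distance, plus a random step). The paper's modification makes the control parameters dynamic; here they are fixed, and a sphere function stands in for the Bouc-Wen fitting error, which would require integrating the hysteresis model.

        import numpy as np

        def firefly_minimize(obj, bounds, n=25, iters=200, beta0=1.0, gamma=1.0, alpha=0.2):
            rng = np.random.default_rng(9)
            lo, hi = bounds[:, 0], bounds[:, 1]
            x = rng.uniform(lo, hi, (n, len(lo)))
            f = np.apply_along_axis(obj, 1, x)
            for t in range(iters):
                for i in range(n):
                    for j in range(n):
                        if f[j] < f[i]:                  # move i toward brighter j
                            r2 = np.sum((x[i] - x[j]) ** 2)
                            beta = beta0 * np.exp(-gamma * r2)
                            x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(len(lo)) - 0.5)
                            x[i] = np.clip(x[i], lo, hi)
                            f[i] = obj(x[i])
            best = np.argmin(f)
            return x[best], f[best]

        # stand-in objective; for Bouc-Wen identification this would be the squared
        # error between measured hysteresis data and the simulated model response
        sphere = lambda p: float(np.sum(p ** 2))
        print(firefly_minimize(sphere, np.array([[-5.0, 5.0]] * 3)))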

  5. Ozone ensemble forecast with machine learning algorithms

    NASA Astrophysics Data System (ADS)

    Mallet, Vivien; Stoltz, Gilles; Mauricette, Boris

    2009-03-01

    We apply machine learning algorithms to perform sequential aggregation of ozone forecasts. The latter rely on a multimodel ensemble built for ozone forecasting with the modeling system Polyphemus. The ensemble simulations are obtained by changes in the physical parameterizations, the numerical schemes, and the input data to the models. The simulations are carried out for summer 2001 over western Europe in order to forecast ozone daily peaks and ozone hourly concentrations. On the basis of past observations and past model forecasts, the learning algorithms produce a weight for each model. A convex or linear combination of the model forecasts is then formed with these weights. This process is repeated for each round of forecasting and is therefore called sequential aggregation. The aggregated forecasts demonstrate good results; for instance, they always show better performance than the best model in the ensemble and they even compete against the best constant linear combination. In addition, the machine learning algorithms come with theoretical guarantees with respect to their performance, that hold for all possible sequences of observations, even nonstochastic ones. Our study also demonstrates the robustness of the methods. We therefore conclude that these aggregation methods are very relevant for operational forecasts.
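
    One classic sequential-aggregation rule of the type described above is the exponentially weighted average forecaster: each model's weight decays with its cumulative past loss, and the prediction is the resulting convex combination. The learning rate eta and the synthetic data below are placeholders.

        import numpy as np

        def aggregate(forecasts, observations, eta=0.5):
            # exponentially weighted average forecaster over a model ensemble
            n_models, n_steps = forecasts.shape
            loss = np.zeros(n_models)
            preds = np.empty(n_steps)
            for t in range(n_steps):
                w = np.exp(-eta * (loss - loss.min()))   # shift for numerical stability
                w /= w.sum()
                preds[t] = w @ forecasts[:, t]           # convex combination
                loss += (forecasts[:, t] - observations[t]) ** 2
            return preds

        rng = np.random.default_rng(10)
        truth = np.sin(np.linspace(0, 6, 100)) * 40 + 60     # "ozone peaks" stand-in
        models = truth + rng.normal(0, [[3], [8], [15]], (3, 100))
        agg = aggregate(models, truth)
        print(np.sqrt(np.mean((agg - truth) ** 2)))          # RMSE of aggregated forecast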

  6. Research on registration algorithm for check seal verification

    NASA Astrophysics Data System (ADS)

    Wang, Shuang; Liu, Tiegen

    2008-03-01

    Nowadays seals play an important role in China. With the development of the social economy, the traditional method of manual check seal identification can no longer meet the needs of banking transactions. This paper focuses on pre-processing and registration algorithms for check seal verification, using the theory of image processing and pattern recognition. First, the complex characteristics of check seals are analyzed. To eliminate differences in producing conditions and the disturbance caused by background and writing in the check image, many methods are used in the pre-processing stage, such as color component transformation, linear transformation to a gray-scale image, median filtering, Otsu thresholding, and the closing and labeling operations of mathematical morphology. After these processes, a clean binary seal image is obtained. On the basis of traditional registration algorithms, a two-level registration method comprising rough and precise registration is proposed. The precise registration resolves the deflection angle to 0.1°. The paper introduces the concepts of inner and outer difference and uses their percentages to judge whether a seal is genuine or fake. Experimental results on a large set of check seals are satisfactory. They show that the presented methods and algorithms are robust to noisy sealing conditions and tolerate within-class variation well.
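
    The pre-processing chain maps naturally onto OpenCV primitives. The sketch below isolates a red seal imprint via a colour-component transform, binarises it with Otsu's method, and cleans it with median filtering, morphological closing and connected-component labelling; the thresholds and kernel sizes are guesses, not the paper's values.

        import cv2
        import numpy as np

        def preprocess_seal(bgr):
            # isolate the (typically red) seal imprint, then binarise and clean it
            b, g, r = cv2.split(bgr)
            redness = cv2.subtract(r, cv2.max(b, g))         # colour component transform
            _, binary = cv2.threshold(redness, 0, 255,
                                      cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            binary = cv2.medianBlur(binary, 3)               # median filtering
            kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
            binary = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
            n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
            keep = stats[1:, cv2.CC_STAT_AREA] > 20          # drop small specks
            mask = np.isin(labels, np.nonzero(keep)[0] + 1)
            return np.where(mask, 255, 0).astype(np.uint8)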

  7. The Good Second Language Learner

    ERIC Educational Resources Information Center

    Naiman, N.; And Others

    1975-01-01

    Reports on a research study designed to find out more about the successful second language learner. By means of interviews, a list of the learning strategies of good language students was developed. (PMP)

  8. Good Health For the Holidays!

    MedlinePlus

    When the family comes together for the holidays, make sure everyone knows about MedlinePlus.gov —your ...

  9. Diet and good health (image)

    MedlinePlus

    ... for a person of any age. A healthy diet is especially important for children since a variety of food is needed for proper development. Other elements of good health include exercise, rest and avoidance of stimulants such ...

  10. Good customer service for patients.

    PubMed

    Foster, Sam

    2016-08-11

    Sam Foster, Chief Nurse at Heart of England NHS Foundation Trust, looks at what the NHS can learn about good customer service from the private sector, and how Always Events can improve patient care. PMID:27523767

  11. Good Practices for Hood Use.

    ERIC Educational Resources Information Center

    Mikell, William G.; Drinkard, William C.

    1984-01-01

    Describes safety practices for laboratory fume hoods based on certain assumptions of hood design and performance. Also discusses the procedures in preparing to work at a hood. A checklist of good hood practices is included. (JM)

  12. Do good actions inspire good actions in others?

    PubMed Central

    Capraro, Valerio; Marcelletti, Alessandra

    2014-01-01

    Actions such as sharing food and cooperating to reach a common goal have played a fundamental role in the evolution of human societies. Despite the importance of such good actions, little is known about if and how they can spread from person to person. For instance, does being the recipient of an altruistic act increase your probability of being cooperative with a third party? We have conducted an experiment on Amazon Mechanical Turk to test this mechanism using economic games. We have measured willingness to be cooperative through a standard Prisoner's dilemma and willingness to act altruistically using a binary Dictator game. In the baseline treatments, the endowments needed to play were given by the experimenters, as usual; in the control treatments, they came from a good action made by someone else. Across four different comparisons and a total of 572 subjects, we have never found a significant increase of cooperation or altruism when the endowment came from a good action. We conclude that good actions do not necessarily inspire good actions in others. While this is consistent with the theoretical prediction, it challenges the majority of other experimental studies. PMID:25502617

  13. Depreciation of public goods in spatial public goods games

    NASA Astrophysics Data System (ADS)

    Shi, Dong-Mei; Zhuang, Yong; Li, Yu-Jian; Wang, Bing-Hong

    2011-10-01

    In real situations, the value of public goods will be reduced or even lost because of external factors or for intrinsic reasons. In this work, we investigate the evolution of cooperation by considering the effect of depreciation of public goods in spatial public goods games on a square lattice. It is assumed that each individual gains full advantage if the number of the cooperators nc within a group centered on that individual equals or exceeds the critical mass (CM). Otherwise, there is depreciation of the public goods, which is realized by rescaling the multiplication factor r to (nc/CM)r. It is shown that the emergence of cooperation is remarkably promoted for CM > 1 even at small values of r, and a global cooperative level is achieved at an intermediate value of CM = 4 at a small r. We further study the effect of depreciation of public goods on different topologies of a regular lattice, and find that the system always reaches global cooperation at a moderate value of CM = G - 1 regardless of whether or not there exist overlapping triangle structures on the regular lattice, where G is the group size of the associated regular lattice.
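
    The depreciation rule is fully specified above, so the group payoff can be sketched directly: the multiplication factor r applies in full only when the cooperator count reaches the critical mass, and is otherwise rescaled to (nc/CM)r.

        def pgg_payoffs(strategies, r, cm, cost=1.0):
            # payoffs in one public goods group; 1 = cooperate, 0 = defect
            g = len(strategies)
            nc = sum(strategies)
            r_eff = r if nc >= cm else (nc / cm) * r     # depreciated multiplier
            share = r_eff * nc * cost / g                # everyone gets an equal share
            return [share - cost * s for s in strategies]

        print(pgg_payoffs([1, 1, 0, 0, 1], r=3.5, cm=4))  # nc=3 < CM -> depreciated pot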

  14. An innovative localisation algorithm for railway vehicles

    NASA Astrophysics Data System (ADS)

    Allotta, B.; D'Adamio, P.; Malvezzi, M.; Pugi, L.; Ridolfi, A.; Rindi, A.; Vettori, G.

    2014-11-01

    The estimation strategy performs well even under degraded adhesion conditions and could be put on board high-speed railway vehicles; it represents an accurate and reliable solution. The IMU board is tested via a dedicated Hardware in the Loop (HIL) test rig, which includes an industrial robot able to replicate the motion of the railway vehicle. The performance of the innovative localisation algorithm was evaluated from the generated experimental outputs: the HIL test rig permitted testing of the proposed algorithm while avoiding expensive (in terms of time and cost) on-track tests, with encouraging results. The preliminary results show a significant improvement in position and speed estimation performance compared to the SCMT algorithms currently in use on the Italian railway network.

  15. DNABIT Compress - Genome compression algorithm.

    PubMed

    Rajarajeswari, Pothuraju; Apparao, Allam

    2011-01-01

    Data compression is concerned with how information is organized in data. Efficient storage means removal of redundancy from the data being stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, "DNABIT Compress", for DNA sequences based on a novel scheme of assigning binary bits to smaller segments of DNA bases to compress both repetitive and non-repetitive DNA sequences. Our proposed algorithm achieves the best compression ratio for DNA sequences for larger genomes. Significantly better compression results show that "DNABIT Compress" is the best among the existing compression algorithms. While achieving the best compression ratios for DNA sequences (genomes), our new DNABIT Compress algorithm significantly improves on the running time of all previous DNA compression programs. Assigning binary bits (unique bit codes) to fragments of a DNA sequence (exact repeats, reverse repeats) is also a unique concept introduced in this algorithm for the first time in DNA compression. The proposed algorithm achieves a compression ratio as low as 1.58 bits/base, where the existing best methods could not achieve a ratio less than 1.72 bits/base.
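
    The baseline that any DNA bit-coding scheme starts from is fixed two-bits-per-base packing, sketched below. DNABIT Compress improves on this baseline by assigning special bit codes to exact and reverse repeats; those variable-length codes are what push the ratio toward 1.58 bits/base and are not reproduced here.

        # Baseline 2-bits-per-base packing: A=00, C=01, G=10, T=11.
        CODE = {'A': 0, 'C': 1, 'G': 2, 'T': 3}
        BASE = 'ACGT'

        def pack(seq):
            bits = 0
            for ch in seq:
                bits = (bits << 2) | CODE[ch]
            return bits.to_bytes((2 * len(seq) + 7) // 8, 'big'), len(seq)

        def unpack(data, n):
            bits = int.from_bytes(data, 'big')
            return ''.join(BASE[(bits >> (2 * (n - 1 - i))) & 3] for i in range(n))

        packed, n = pack('ACGTACGTGG')
        assert unpack(packed, n) == 'ACGTACGTGG'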

  16. Communicating for the Good of Your Child.

    ERIC Educational Resources Information Center

    Lerman, Saf

    1985-01-01

    Parents can help their children feel secure and have a good self-image by communicating these feelings through words and actions. Suggestions for showing respect, building self-esteem, fostering security and success, and talking to children in a positive way are discussed. (DF)

  17. A Good Teaching Technique: WebQuests

    ERIC Educational Resources Information Center

    Halat, Erdogan

    2008-01-01

    In this article, the author first introduces and describes a new teaching tool called WebQuests to practicing teachers. He then provides detailed information about the structure of a good WebQuest. Third, the author shows the strengths and weaknesses of using Web-Quests in teaching and learning. Last, he points out the challenges for practicing…

  18. SAGE II inversion algorithm. [Stratospheric Aerosol and Gas Experiment

    NASA Technical Reports Server (NTRS)

    Chu, W. P.; Mccormick, M. P.; Lenoble, J.; Brogniez, C.; Pruvost, P.

    1989-01-01

    The operational Stratospheric Aerosol and Gas Experiment II multichannel data inversion algorithm is described. Aerosol and ozone retrievals obtained with the algorithm are discussed. The algorithm is compared to an independently developed algorithm (Lenoble, 1989), showing that the inverted aerosol and ozone profiles from the two algorithms are similar within their respective uncertainties.

  19. 8. Detail showing concrete abutment, showing substructure of bridge, specifically ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Detail showing concrete abutment, showing substructure of bridge, specifically west side of arch and substructure. - Presumpscot Falls Bridge, Spanning Presumptscot River at Allen Avenue extension, 0.75 mile west of U.S. Interstate 95, Falmouth, Cumberland County, ME

  20. 28. MAP SHOWING LOCATION OF ARVFS FACILITY AS BUILT. SHOWS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. MAP SHOWING LOCATION OF ARVFS FACILITY AS BUILT. SHOWS LINCOLN BOULEVARD, BIG LOST RIVER, AND NAVAL REACTORS FACILITY. F.C. TORKELSON DRAWING NUMBER 842-ARVFS-101-2. DATED OCTOBER 12, 1965. INEL INDEX CODE NUMBER: 075 0101 851 151969. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID

  1. ICESat-2 / ATLAS Flight Science Receiver Algorithms

    NASA Astrophysics Data System (ADS)

    Mcgarry, J.; Carabajal, C. C.; Degnan, J. J.; Mallama, A.; Palm, S. P.; Ricklefs, R.; Saba, J. L.

    2013-12-01

    This Simulator makes it possible to check all logic paths that could be encountered by the Algorithms on orbit. In addition the NASA airborne instrument MABEL is collecting data with characteristics similar to what ATLAS will see. MABEL data is being used to test the ATLAS Receiver Algorithms. Further verification will be performed during Integration and Testing of the ATLAS instrument and during Environmental Testing on the full ATLAS instrument. Results from testing to date show the Receiver Algorithms have the ability to handle a wide range of signal and noise levels with a very good sensitivity at relatively low signal to noise ratios. In addition, preliminary tests have demonstrated, using the ICESat-2 Science Team's selected land ice and sea ice test cases, the capability of the Algorithms to successfully find and telemeter the surface echoes. In this presentation we will describe the ATLAS Flight Science Receiver Algorithms and the Software Simulator, and will present results of the testing to date. The onboard databases (DEM, DRM and the Surface Reference Mask) are being developed at the University of Texas at Austin as part of the ATLAS Flight Science Receiver Algorithms. Verification of the onboard databases is being performed by ATLAS Receiver Algorithms team members Claudia Carabajal and Jack Saba.

  2. Higher-order force gradient symplectic algorithms

    NASA Astrophysics Data System (ADS)

    Chin, Siu A.; Kidwell, Donald W.

    2000-12-01

    We show that a recently discovered fourth order symplectic algorithm, which requires one evaluation of force gradient in addition to three evaluations of the force, when iterated to higher order, yielded algorithms that are far superior to similarly iterated higher order algorithms based on the standard Forest-Ruth algorithm. We gauge the accuracy of each algorithm by comparing the step-size independent error functions associated with energy conservation and the rotation of the Laplace-Runge-Lenz vector when solving a highly eccentric Kepler problem. For orders 6, 8, 10, and 12, the new algorithms are approximately a factor of 10^3, 10^4, 10^4, and 10^5 better.
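
    For reference, the standard fourth-order Forest-Ruth splitting that the new force-gradient algorithms are compared against can be sketched in a few lines; a long eccentric Kepler integration then exhibits the bounded energy error typical of symplectic schemes. The coefficients use theta = 1/(2 - 2^(1/3)).

        import numpy as np

        THETA = 1.0 / (2.0 - 2.0 ** (1.0 / 3.0))
        C = np.array([THETA / 2, (1 - THETA) / 2, (1 - THETA) / 2, THETA / 2])
        D = np.array([THETA, 1 - 2 * THETA, THETA])

        def kepler_accel(q):
            return -q / np.linalg.norm(q) ** 3

        def forest_ruth_step(q, p, dt):
            for k in range(3):                  # alternate drifts and kicks
                q = q + C[k] * dt * p
                p = p + D[k] * dt * kepler_accel(q)
            return q + C[3] * dt * p, p

        # eccentric Kepler orbit; monitor energy drift over many periods
        q, p = np.array([1.0, 0.0]), np.array([0.0, 0.5])
        e0 = 0.5 * p @ p - 1.0 / np.linalg.norm(q)
        for _ in range(200000):
            q, p = forest_ruth_step(q, p, 1e-3)
        print(abs(0.5 * p @ p - 1.0 / np.linalg.norm(q) - e0))   # stays small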

  3. Good Practices in Free-energy Calculations

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew; Jarzynski, Christopher; Chipot, Christopher

    2013-01-01

    As access to computational resources continues to increase, free-energy calculations have emerged as a powerful tool that can play a predictive role in drug design. Yet, in a number of instances, the reliability of these calculations can be improved significantly if a number of precepts, or good practices are followed. For the most part, the theory upon which these good practices rely has been known for many years, but often overlooked, or simply ignored. In other cases, the theoretical developments are too recent for their potential to be fully grasped and merged into popular platforms for the computation of free-energy differences. The current best practices for carrying out free-energy calculations will be reviewed demonstrating that, at little to no additional cost, free-energy estimates could be markedly improved and bounded by meaningful error estimates. In free-energy perturbation and nonequilibrium work methods, monitoring the probability distributions that underlie the transformation between the states of interest, performing the calculation bidirectionally, stratifying the reaction pathway and choosing the most appropriate paradigms and algorithms for transforming between states offer significant gains in both accuracy and precision. In thermodynamic integration and probability distribution (histogramming) methods, properly designed adaptive techniques yield nearly uniform sampling of the relevant degrees of freedom and, by doing so, could markedly improve efficiency and accuracy of free energy calculations without incurring any additional computational expense.
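
    The free-energy perturbation identity underlying these practices is Zwanzig's exponential average, dF = -kT ln<exp(-dU/kT)>, and the bidirectional check mentioned above compares the forward and reverse estimates. The sketch below uses synthetic Gaussian work values, for which both directions agree near dF = 1.5 kT by construction.

        import numpy as np

        def fep_delta_f(du, kT=1.0):
            # Zwanzig estimator, evaluated stably via a log-sum-exp
            w = -np.asarray(du) / kT
            return -kT * (np.logaddexp.reduce(w) - np.log(len(w)))

        # good-practice check: run the transformation bidirectionally and compare;
        # a large gap between forward and -backward estimates signals poor overlap
        rng = np.random.default_rng(11)
        du_fwd = rng.normal(2.0, 1.0, 50000)    # U1-U0 sampled in state 0 (synthetic)
        du_bwd = rng.normal(-1.0, 1.0, 50000)   # U0-U1 sampled in state 1 (synthetic)
        print(fep_delta_f(du_fwd), -fep_delta_f(du_bwd))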

  5. An efficient parallel algorithm for multiple sequence similarities calculation using a low complexity method.

    PubMed

    Marucci, Evandro A; Zafalon, Geraldo F D; Momente, Julio C; Neves, Leandro A; Valêncio, Carlo R; Pinto, Alex R; Cansian, Adriano M; de Souza, Rogeria C G; Shiyou, Yang; Machado, José M

    2014-01-01

    With the advance of genomic research, the number of sequences involved in comparative methods has grown immensely. Among them are methods for similarity calculation, which are used by many bioinformatics applications. Due to the huge amount of data, combining low complexity methods with parallel computing is becoming desirable. The k-mers counting is a very efficient method with good biological results. In this work, the development of a parallel algorithm for multiple sequence similarities calculation using the k-mers counting method is proposed. Tests show that the algorithm presents very good scalability and a nearly linear speedup; for 14 nodes, a 12x speedup was obtained. This algorithm can be used in the parallelization of some multiple sequence alignment tools, such as MAFFT and MUSCLE. PMID:25140318
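
    The k-mers counting method itself is a few lines of Python: build a k-mer profile per sequence in O(n) and score pairs by shared k-mers. The normalization shown is one common choice, not necessarily the paper's; since each pair is independent, the pairwise loop is what gets parallelized.

        from collections import Counter
        from itertools import combinations

        def kmer_counts(seq, k=3):
            return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

        def similarity(a, b, k=3):
            # shared k-mer count normalised by the smaller profile
            ca, cb = kmer_counts(a, k), kmer_counts(b, k)
            shared = sum(min(ca[m], cb[m]) for m in ca.keys() & cb.keys())
            return shared / min(sum(ca.values()), sum(cb.values()))

        seqs = ['ACGTACGTAC', 'ACGTTCGTAC', 'TTTTGGGGCC']
        for (i, a), (j, b) in combinations(enumerate(seqs), 2):
            print(i, j, round(similarity(a, b), 3))
        # each pair is independent, so the loop parallelises trivially
        # (e.g. multiprocessing.Pool.starmap over the pair list)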

  6. A Pretty Good Paper about Pretty Good Privacy.

    ERIC Educational Resources Information Center

    McCollum, Roy

    With today's growth in the use of electronic information systems for e-mail, data development and research, and the relative ease of access to such resources, protecting one's data and correspondence has become a great concern. "Pretty Good Privacy" (PGP), an encryption program developed by Phil Zimmermann, may be the software tool that will…

  7. Adaptive Routing Algorithm in Wireless Communication Networks Using Evolutionary Algorithm

    NASA Astrophysics Data System (ADS)

    Yan, Xuesong; Wu, Qinghua; Cai, Zhihua

    At present, mobile communications traffic routing designs are complicated because more and more systems inter-connect with one another. For example, mobile communication in wireless communication networks has two routing design conditions to consider, i.e. circuit switching and packet switching. The challenges in packet-switching routing design are its use of high-speed transmission links and its dynamic routing nature. In this paper, evolutionary algorithms are used to determine the best solution and the shortest communication paths. We developed a genetic optimization process that helps network planners find the best routing-table paths in wireless communication networks easily and quickly. The experimental results show that the evolutionary algorithm not only finds good solutions, but also has a more predictable running time when compared to a sequential genetic algorithm.

  8. Comprehensive eye evaluation algorithm

    NASA Astrophysics Data System (ADS)

    Agurto, C.; Nemeth, S.; Zamora, G.; Vahtel, M.; Soliz, P.; Barriga, S.

    2016-03-01

    In recent years, several research groups have developed automatic algorithms to detect diabetic retinopathy (DR) in individuals with diabetes (DM), using digital retinal images. Studies have indicated that diabetics have 1.5 times the annual risk of developing primary open angle glaucoma (POAG) as do people without DM. Moreover, DM patients have 1.8 times the risk for age-related macular degeneration (AMD). Although numerous investigators are developing automatic DR detection algorithms, there have been few successful efforts to create an automatic algorithm that can detect other ocular diseases, such as POAG and AMD. Consequently, our aim in the current study was to develop a comprehensive eye evaluation algorithm that not only detects DR in retinal images, but also automatically identifies glaucoma suspects and AMD by integrating other personal medical information with the retinal features. The proposed system is fully automatic and provides the likelihood of each of the three eye diseases. The system was evaluated in two datasets of 104 and 88 diabetic cases. For each eye, we used two non-mydriatic digital color fundus photographs (macula and optic disc centered) and, when available, information about age, duration of diabetes, cataracts, hypertension, gender, and laboratory data. Our results show that the combination of multimodal features can increase the AUC by up to 5%, 7%, and 8% in the detection of AMD, DR, and glaucoma respectively. Marked improvement was achieved when laboratory results were combined with retinal image features.

  9. A Comparison of Three Algorithms for Orion Drogue Parachute Release

    NASA Technical Reports Server (NTRS)

    Matz, Daniel A.; Braun, Robert D.

    2015-01-01

    The Orion Multi-Purpose Crew Vehicle is susceptible to flipping apex forward between drogue parachute release and main parachute inflation. A smart drogue release algorithm is required to select a drogue release condition that will not result in an apex forward main parachute deployment. The baseline algorithm is simple and elegant, but does not perform as well as desired in drogue failure cases. A simple modification to the baseline algorithm can improve performance, but can also sometimes fail to identify a good release condition. A new algorithm employing simplified rotational dynamics and a numeric predictor to minimize a rotational energy metric is proposed. A Monte Carlo analysis of a drogue failure scenario is used to compare the performance of the algorithms. The numeric predictor prevents more of the cases from flipping apex forward, and also results in an improvement in the capsule attitude at main bag extraction. The sensitivity of the numeric predictor to aerodynamic dispersions, errors in the navigated state, and execution rate is investigated, showing little degradation in performance.

  10. Sort-Mid tasks scheduling algorithm in grid computing.

    PubMed

    Reda, Naglaa M; Tawfik, A; Marzok, Mohamed A; Khamis, Soheir M

    2015-11-01

    Scheduling tasks on heterogeneous resources distributed over a grid computing system is an NP-complete problem. The main aim of several researchers is to develop variant scheduling algorithms for achieving optimality, and such algorithms have shown good performance for task scheduling with regard to resource selection. However, using the full power of resources is still a challenge. In this paper, a new heuristic algorithm called Sort-Mid is proposed. It aims to maximize utilization and minimize the makespan. The new strategy of the Sort-Mid algorithm is to find appropriate resources. The base step is to sort the list of completion times of each task and compute the average value. Then, the maximum average is obtained. Finally, the task with the maximum average is allocated to the machine that has the minimum completion time. The allocated task is deleted and these steps are repeated until all tasks are allocated. Experimental tests show that the proposed algorithm outperforms almost all other algorithms in terms of resource utilization and makespan. PMID:26644937
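
    The steps described above translate almost line by line into code. The sketch below follows the description literally (sorting before averaging leaves the mean unchanged, but mirrors the stated procedure); machine-ready times are updated after each allocation, and tie-breaking details are guesses.

        import numpy as np

        def sort_mid(exec_time):
            # exec_time[i, j]: run time of task i on machine j
            n_tasks, n_machines = exec_time.shape
            ready = np.zeros(n_machines)                 # machine availability times
            unassigned = set(range(n_tasks))
            schedule = {}
            while unassigned:
                tasks = sorted(unassigned)
                completion = exec_time[tasks] + ready    # completion-time matrix
                averages = np.sort(completion, axis=1).mean(axis=1)
                pick = tasks[int(np.argmax(averages))]   # task with maximum average
                m = int(np.argmin(exec_time[pick] + ready))  # min-completion machine
                ready[m] += exec_time[pick, m]
                schedule[pick] = m
                unassigned.remove(pick)
            return schedule, ready.max()                 # assignment and makespan

        rng = np.random.default_rng(12)
        sched, makespan = sort_mid(rng.uniform(1, 10, (8, 3)))
        print(sched, round(makespan, 2))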

  13. Governing for the Common Good.

    PubMed

    Ruger, Jennifer Prah

    2015-12-01

    The proper object of global health governance (GHG) should be the common good, ensuring that all people have the opportunity to flourish. A well-organized global society that promotes the common good is to everyone's advantage. Enabling people to flourish includes enabling their ability to be healthy. Thus, we must assess health governance by its effectiveness in enhancing health capabilities. Current GHG fails to support human flourishing, diminishes health capabilities and thus does not serve the common good. The provincial globalism theory of health governance proposes a Global Health Constitution and an accompanying Global Institute of Health and Medicine that together propose to transform health governance. Multiple lines of empirical research suggest that these institutions would be effective, offering the most promising path to a healthier, more just world.

  15. Religiosity as a public good.

    PubMed

    Sherlock, Richard

    2008-09-01

    Public goods can be seen as one important way in which societies sustain themselves over time, and they are part of the puzzle of the development of political order. Public goods like the rule of law are non-subtractable and non-excludable. For economists, the classic textbook examples are national defense and police protection. In this paper I argue that religiosity can function like police protection: a means of sustaining order through fear of punishment from a transcendent source. As a means of reducing defection from social norms, it has a role to play as a public good. But religion cannot at the same time be seen as the source of such norms, or dissension will undermine the very order that punishment seems to reinforce.

  16. Algorithmic chemistry

    SciTech Connect

    Fontana, W.

    1990-12-13

    In this paper complex adaptive systems are defined by a self-referential loop in which objects encode functions that act back on these objects. A model for this loop is presented. It uses a simple recursive formal language, derived from the lambda-calculus, to provide a semantics that maps character strings into functions that manipulate symbols on strings. The interaction between two functions, or algorithms, is defined naturally within the language through function composition, and results in the production of a new function. An iterated map acting on sets of functions and a corresponding graph representation are defined. Their properties are useful to discuss the behavior of a fixed-size ensemble of randomly interacting functions. This "function gas", or "Turing gas", is studied under various conditions, and evolves cooperative interaction patterns of considerable intricacy. These patterns adapt under the influence of perturbations consisting of the addition of new random functions to the system. Different organizations emerge depending on the availability of self-replicators.
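
    A toy Python analogue of the iterated map (not the paper's lambda-calculus language): a fixed-size ensemble of unary string functions in which two randomly chosen members "collide" by composition and their product replaces a random member.

      import random

      PRIMITIVES = [
          lambda s: s + "a",      # append a symbol
          lambda s: s[::-1],      # reverse the string
          lambda s: s[1:] or s,   # drop the first symbol (keep non-empty)
      ]

      def compose(f, g):
          return lambda s: f(g(s))

      def step(gas):
          f, g = random.sample(gas, 2)                     # two functions interact
          gas[random.randrange(len(gas))] = compose(f, g)  # product replaces a member

      gas = [random.choice(PRIMITIVES) for _ in range(20)]
      for _ in range(100):
          step(gas)
      print(gas[0]("seed"))   # apply one evolved function to a test string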

  17. Improved artificial bee colony algorithm for wavefront sensor-less system in free space optical communication

    NASA Astrophysics Data System (ADS)

    Niu, Chaojun; Han, Xiang'e.

    2015-10-01

    Adaptive optics (AO) technology is an effective way to alleviate the effect of turbulence on free space optical communication (FSO). A new adaptive compensation method can be used without a wavefront sensor. The artificial bee colony algorithm (ABC) is a population-based heuristic evolutionary algorithm inspired by the intelligent foraging behaviour of the honeybee swarm, with the advantages of simplicity, a good convergence rate, robustness, and few parameters to set. In this paper, we simulate the application of the improved ABC to correct the distorted wavefront and prove its effectiveness. We then simulate the application of the ABC algorithm, the differential evolution (DE) algorithm and the stochastic parallel gradient descent (SPGD) algorithm to the FSO system and analyze their wavefront correction capabilities by comparing the coupling efficiency, the error rate and the intensity fluctuation in different turbulence conditions before and after correction. The results show that the ABC algorithm has a much faster correction speed than the DE algorithm and a better correction capability for strong turbulence than the SPGD algorithm. Intensity fluctuation can be effectively reduced in strong turbulence, but not as effectively in weak turbulence.
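
    A generic, minimal artificial bee colony minimizer (the plain ABC scheme, not the paper's improved variant); applying it to wavefront correction would amount to choosing f as a sharpness or coupling-efficiency metric of the corrected beam. All parameter values are illustrative.

      import numpy as np

      def abc_minimize(f, lo, hi, n_food=20, limit=20, iters=200, seed=0):
          rng = np.random.default_rng(seed)
          lo, hi = np.asarray(lo, float), np.asarray(hi, float)
          foods = rng.uniform(lo, hi, (n_food, lo.size))   # food sources = candidate solutions
          fit = np.apply_along_axis(f, 1, foods)
          trials = np.zeros(n_food, dtype=int)

          def try_move(i):
              k = rng.integers(n_food - 1)
              k += k >= i                                  # partner different from i
              j = rng.integers(lo.size)                    # perturb one dimension
              x = foods[i].copy()
              x[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
              x = np.clip(x, lo, hi)
              fx = f(x)
              if fx < fit[i]:                              # greedy selection
                  foods[i], fit[i], trials[i] = x, fx, 0
              else:
                  trials[i] += 1

          for _ in range(iters):
              for i in range(n_food):                      # employed bee phase
                  try_move(i)
              p = fit.max() - fit + 1e-12                  # onlookers prefer better sources
              for i in rng.choice(n_food, n_food, p=p / p.sum()):
                  try_move(i)                              # onlooker bee phase
              worst = trials.argmax()                      # scout phase: abandon a stale source
              if trials[worst] > limit:
                  foods[worst] = rng.uniform(lo, hi)
                  fit[worst] = f(foods[worst])
                  trials[worst] = 0
          best = fit.argmin()
          return foods[best], fit[best]

      # Example: minimize the 5-D sphere function.
      x, fx = abc_minimize(lambda v: float(np.sum(v ** 2)), np.full(5, -5.0), np.full(5, 5.0))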

  18. On Parallel Push-Relabel based Algorithms for Bipartite Maximum Matching

    SciTech Connect

    Langguth, Johannes; Azad, Md Ariful; Halappanavar, Mahantesh; Manne, Fredrik

    2014-07-01

    We study multithreaded push-relabel based algorithms for computing maximum cardinality matching in bipartite graphs. Matching is a fundamental combinatorial (graph) problem with applications in a wide variety of problems in science and engineering. We are motivated by its use in the context of sparse linear solvers for computing the maximum transversal of a matrix. We implement and test our algorithms on several multi-socket multicore systems and compare their performance to state-of-the-art augmenting-path-based serial and parallel algorithms using a test set composed of a wide range of real-world instances. Building on several heuristics for enhancing performance, we demonstrate good scaling for the parallel push-relabel algorithm. We show that it is comparable to the best augmenting-path-based algorithms for bipartite matching. To the best of our knowledge, this is the first extensive study of multithreaded push-relabel based algorithms. In addition to a direct impact on the applications using matching, the proposed algorithmic techniques can be extended to preflow-push based algorithms for computing maximum flow in graphs.
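
    For contrast with push-relabel, a compact augmenting-path (Kuhn's) algorithm, i.e. the kind of serial baseline such studies compare against; adj[u] lists the right-side neighbours of left vertex u.

      def max_bipartite_matching(adj, n_right):
          match_right = [-1] * n_right            # right vertex -> matched left vertex

          def try_augment(u, seen):
              for v in adj[u]:
                  if not seen[v]:
                      seen[v] = True
                      # v is free, or its partner can be re-matched elsewhere
                      if match_right[v] == -1 or try_augment(match_right[v], seen):
                          match_right[v] = u
                          return True
              return False

          matched = 0
          for u in range(len(adj)):
              if try_augment(u, [False] * n_right):
                  matched += 1
          return matched, match_right

      # Example: 3 left and 3 right vertices, maximum matching of size 3.
      size, pairing = max_bipartite_matching([[0, 1], [0], [1, 2]], 3)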

  19. A Space-Bounded Anytime Algorithm for the Multiple Longest Common Subsequence Problem

    PubMed Central

    Yang, Jiaoyun; Xu, Yun; Shang, Yi; Chen, Guoliang

    2014-01-01

    The multiple longest common subsequence (MLCS) problem, related to the identification of sequence similarity, is an important problem in many fields. As an NP-hard problem, its exact algorithms have difficulty in handling large-scale data, and time- and space-efficient algorithms are required in real-world applications. To deal with time constraints, anytime algorithms have been proposed to generate good solutions within a reasonable time. However, there exists little work on space-efficient MLCS algorithms. In this paper, we formulate the MLCS problem as a graph search problem and present two space-efficient anytime MLCS algorithms, SA-MLCS and SLA-MLCS. SA-MLCS uses an iterative beam-widening search strategy to reduce space usage during the iterative process of finding better solutions. Based on SA-MLCS, a space-bounded algorithm, SLA-MLCS, is developed to keep space usage from exceeding available memory. SLA-MLCS uses a replacing strategy when SA-MLCS reaches a given space bound. Experimental results show SA-MLCS and SLA-MLCS use an order of magnitude less space and time than the state-of-the-art approximate algorithm MLCS-APP while finding better solutions. Compared to the state-of-the-art anytime algorithm Pro-MLCS, SA-MLCS and SLA-MLCS can solve instances an order of magnitude larger. Furthermore, SLA-MLCS can find much better solutions than SA-MLCS on large instances. PMID:25400485

  20. A Space-Bounded Anytime Algorithm for the Multiple Longest Common Subsequence Problem.

    PubMed

    Yang, Jiaoyun; Xu, Yun; Shang, Yi; Chen, Guoliang

    2014-11-01

    The multiple longest common subsequence (MLCS) problem, related to the identification of sequence similarity, is an important problem in many fields. As an NP-hard problem, its exact algorithms have difficulty in handling large-scale data, and time- and space-efficient algorithms are required in real-world applications. To deal with time constraints, anytime algorithms have been proposed to generate good solutions within a reasonable time. However, there exists little work on space-efficient MLCS algorithms. In this paper, we formulate the MLCS problem as a graph search problem and present two space-efficient anytime MLCS algorithms, SA-MLCS and SLA-MLCS. SA-MLCS uses an iterative beam-widening search strategy to reduce space usage during the iterative process of finding better solutions. Based on SA-MLCS, a space-bounded algorithm, SLA-MLCS, is developed to keep space usage from exceeding available memory. SLA-MLCS uses a replacing strategy when SA-MLCS reaches a given space bound. Experimental results show SA-MLCS and SLA-MLCS use an order of magnitude less space and time than the state-of-the-art approximate algorithm MLCS-APP while finding better solutions. Compared to the state-of-the-art anytime algorithm Pro-MLCS, SA-MLCS and SLA-MLCS can solve instances an order of magnitude larger. Furthermore, SLA-MLCS can find much better solutions than SA-MLCS on large instances.

  1. Parameter estimation techniques based on optimizing goodness-of-fit statistics for structural reliability

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.

    1993-01-01

    New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated using failure data, and represents an approximation of the cumulative distribution function for the underlying population. Statistics (such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic) measure the discrepancy between the EDF and the cumulative distribution function (CDF). These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to obtain the optimum value of the EDF. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
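
    A sketch of the estimation idea: choose the three Weibull parameters (shape, location, scale) that minimize the Kolmogorov-Smirnov distance between the fitted CDF and the EDF, using Powell's method as in the paper. Function names and starting values are illustrative.

      import numpy as np
      from scipy import optimize, stats

      def ks_statistic(params, data):
          c, loc, scale = params
          if c <= 0 or scale <= 0 or loc >= data.min():
              return 2.0                          # infeasible: worse than any KS value
          x = np.sort(data)
          n = x.size
          cdf = stats.weibull_min.cdf(x, c, loc=loc, scale=scale)
          i = np.arange(1, n + 1)                 # EDF steps i/n and (i-1)/n
          return max(np.max(i / n - cdf), np.max(cdf - (i - 1) / n))

      data = stats.weibull_min.rvs(1.8, loc=2.0, scale=5.0, size=200, random_state=0)
      res = optimize.minimize(ks_statistic, x0=[1.0, 0.0, 1.0], args=(data,),
                              method="Powell")
      print(res.x, res.fun)                       # fitted (shape, loc, scale), KS distance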

  2. An adaptive gyroscope-based algorithm for temporal gait analysis.

    PubMed

    Greene, Barry R; McGrath, Denise; O'Neill, Ross; O'Donovan, Karol J; Burns, Adrian; Caulfield, Brian

    2010-12-01

    Body-worn kinematic sensors have been widely proposed as the optimal solution for portable, low cost, ambulatory monitoring of gait. This study aims to evaluate an adaptive gyroscope-based algorithm for automated temporal gait analysis using body-worn wireless gyroscopes. Gyroscope data from nine healthy adult subjects performing four walks at four different speeds were compared against data acquired simultaneously using two force plates and an optical motion capture system. Data from a poliomyelitis patient, exhibiting pathological gait and walking with and without the aid of a crutch, were also compared to the force plate. Results show that the mean true error between the adaptive gyroscope algorithm and the force plate was -4.5 ± 14.4 ms and 43.4 ± 6.0 ms for initial contact (IC) and terminal contact (TC) points, respectively, in healthy subjects. Similarly, the mean true error when data from the polio patient were compared against the force plate was -75.61 ± 27.53 ms and 99.20 ± 46.00 ms for IC and TC points, respectively. A comparison of the present algorithm against temporal gait parameters derived from an optical motion analysis system showed good agreement for nine healthy subjects at four speeds. These results show that the algorithm reported here could constitute the basis of a robust, portable, low-cost system for ambulatory monitoring of gait.

  3. An Algorithm for Testing the Efficient Market Hypothesis

    PubMed Central

    Boboc, Ioana-Andreea; Dinică, Mihai-Cristian

    2013-01-01

    The objective of this research is to examine the efficiency of the EUR/USD market through the application of a trading system. The system uses a genetic algorithm based on technical analysis indicators such as the Exponential Moving Average (EMA), Moving Average Convergence Divergence (MACD), Relative Strength Index (RSI) and Filter, which gives buying and selling recommendations to investors. The algorithm optimizes the strategies by dynamically searching for parameters that improve profitability in the training period. The best sets of rules are then applied to the testing period. The results show inconsistency in finding a set of trading rules that performs well in both periods. Strategies that achieve very good returns in the training period show difficulty in returning positive results in the testing period, this being consistent with the efficient market hypothesis (EMH). PMID:24205148

  4. An algorithm for testing the efficient market hypothesis.

    PubMed

    Boboc, Ioana-Andreea; Dinică, Mihai-Cristian

    2013-01-01

    The objective of this research is to examine the efficiency of the EUR/USD market through the application of a trading system. The system uses a genetic algorithm based on technical analysis indicators such as the Exponential Moving Average (EMA), Moving Average Convergence Divergence (MACD), Relative Strength Index (RSI) and Filter, which gives buying and selling recommendations to investors. The algorithm optimizes the strategies by dynamically searching for parameters that improve profitability in the training period. The best sets of rules are then applied to the testing period. The results show inconsistency in finding a set of trading rules that performs well in both periods. Strategies that achieve very good returns in the training period show difficulty in returning positive results in the testing period, this being consistent with the efficient market hypothesis (EMH).
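
    Minimal NumPy versions of the indicators named in the abstract. The 12/26/9 MACD and 14-period RSI settings are the conventional defaults, not necessarily the parameters found by the genetic algorithm, and this RSI uses a simple moving average rather than Wilder smoothing.

      import numpy as np

      def ema(prices, span):
          alpha = 2.0 / (span + 1)
          out = np.empty(len(prices))
          out[0] = prices[0]
          for t in range(1, len(prices)):
              out[t] = alpha * prices[t] + (1 - alpha) * out[t - 1]
          return out

      def macd(prices, fast=12, slow=26, signal=9):
          line = ema(prices, fast) - ema(prices, slow)
          return line, ema(line, signal)          # MACD line and its signal line

      def rsi(prices, period=14):
          delta = np.diff(prices)
          gain = np.where(delta > 0, delta, 0.0)
          loss = np.where(delta < 0, -delta, 0.0)
          avg_gain = np.convolve(gain, np.ones(period) / period, mode="valid")
          avg_loss = np.convolve(loss, np.ones(period) / period, mode="valid")
          rs = avg_gain / np.maximum(avg_loss, 1e-12)
          return 100.0 - 100.0 / (1.0 + rs)       # 0..100; >70 overbought, <30 oversold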

  5. Methods of information theory and algorithmic complexity for network biology.

    PubMed

    Zenil, Hector; Kiani, Narsis A; Tegnér, Jesper

    2016-03-01

    We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdös-Rényi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We provide exact theoretical calculations, numerical approximations and error estimations of entropy, algorithmic probability and Kolmogorov complexity for different types of graphs, characterizing their variant and invariant properties. We introduce formal definitions of complexity for both labeled and unlabeled graphs and prove that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity.
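
    A small illustration of one such descriptor (not the paper's code): the Shannon entropy of the degree distribution of an Erdös-Rényi random graph.

      import numpy as np

      def degree_entropy(adjacency):
          degrees = adjacency.sum(axis=1)
          _, counts = np.unique(degrees, return_counts=True)
          p = counts / counts.sum()               # empirical degree distribution
          return -np.sum(p * np.log2(p))          # entropy in bits

      # An Erdös-Rényi random graph G(n, p).
      rng = np.random.default_rng(0)
      n, prob = 100, 0.05
      upper = np.triu(rng.random((n, n)) < prob, 1)
      A = (upper | upper.T).astype(int)
      print(degree_entropy(A))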

  6. What Do Blood Tests Show?

    MedlinePlus

    ... shows the ranges for blood glucose levels after 8 to 12 hours of fasting (not eating). It shows the normal range and the abnormal ranges that are a sign of prediabetes or diabetes. [Table: Plasma Glucose Results (mg/dL) and the corresponding diagnosis; 70 to 99 is listed as the normal fasting range.]

  7. Linear Bregman algorithm implemented in parallel GPU

    NASA Astrophysics Data System (ADS)

    Li, Pengyan; Ke, Jue; Sui, Dong; Wei, Ping

    2015-08-01

    At present, most compressed sensing (CS) algorithms have poor convergence speed and are thus difficult to run on a PC. To deal with this issue, we use a parallel GPU to implement a broadly used compressed sensing algorithm, the linear Bregman algorithm. The linear iterative Bregman algorithm is a reconstruction algorithm proposed by Osher and Cai. Compared with other CS reconstruction algorithms, the linear Bregman algorithm only involves vector and matrix multiplication and a thresholding operation, and is simpler and more efficient to program. We use C as the development language and adopt CUDA (Compute Unified Device Architecture) as the parallel computing architecture. In this paper, we compare the parallel Bregman algorithm with a traditional CPU implementation of the Bregman algorithm. In addition, we also compare the parallel Bregman algorithm with other CS reconstruction algorithms, such as the OMP and TwIST algorithms. Compared with these two algorithms, the results of this paper show that the parallel Bregman algorithm needs less time, and thus is more convenient for real-time object reconstruction, which is important given the fast-growing demands of information technology.
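
    A serial NumPy sketch of the linear (linearized) Bregman iteration for basis pursuit (min ||x||_1 subject to Ax = b): each step is one matrix-vector product, an AXPY-style update, and a soft threshold, exactly the operation mix the abstract credits for easy GPU porting. The step size is a conservative, illustrative choice.

      import numpy as np

      def soft_threshold(v, mu):
          return np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)

      def linear_bregman(A, b, mu=2.0, steps=5000):
          delta = 1.0 / np.linalg.norm(A, 2) ** 2      # step size below 1/||A A^T||
          v = np.zeros(A.shape[1])
          x = np.zeros(A.shape[1])
          for _ in range(steps):
              x = delta * soft_threshold(v, mu)        # thresholding operation
              v += A.T @ (b - A @ x)                   # matrix-vector product + AXPY
          return x

      # Recover a sparse vector from underdetermined measurements.
      rng = np.random.default_rng(1)
      A = rng.standard_normal((50, 200))
      x_true = np.zeros(200)
      x_true[[3, 77, 150]] = [1.0, -2.0, 0.5]
      x_hat = linear_bregman(A, A @ x_true)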

  8. On mapping systolic algorithms onto the hypercube

    SciTech Connect

    Ibarra, O.H.; Sohn, S.M.

    1990-01-01

    Much effort has been devoted to developing efficient algorithms for systolic arrays. Here the authors consider the problem of mapping these algorithms into efficient algorithms for a fixed-size hypercube architecture. They describe in detail several optimal implementations of algorithms given for one-way one- and two-dimensional systolic arrays. Since interprocessor communication is many times slower than local computation in parallel computers built to date, the problem of efficient communication is specifically addressed for these mappings. In order to experimentally validate the technique, five systolic algorithms were mapped in various ways onto a 64-node NCUBE/7 MIMD hypercube machine. The algorithms are for the following problems: the shuffle scheduling problem, finite impulse response filtering, linear context-free language recognition, matrix multiplication, and computing the Boolean transitive closure. Experimental evidence indicates that good performance is obtained for the mappings.
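
    One classic tool for such mappings (not necessarily the authors' exact scheme) is the binary-reflected Gray code, which embeds a ring or one-dimensional systolic array into a hypercube so that logically adjacent cells land on physically adjacent nodes, i.e. node labels differing in exactly one bit.

      def gray(i):
          return i ^ (i >> 1)

      # On a 64-node hypercube (dimension 6), consecutive array cells map to
      # hypercube neighbours, including the wrap-around link of the ring.
      d = 6
      ring = [gray(i) for i in range(2 ** d)]
      assert all(bin(ring[i] ^ ring[(i + 1) % 2 ** d]).count("1") == 1
                 for i in range(2 ** d))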

  9. Fast training algorithms for multilayer neural nets.

    PubMed

    Brent, R P

    1991-01-01

    An algorithm that is faster than back-propagation and for which it is not necessary to specify the number of hidden units in advance is described. The relationship with other fast pattern-recognition algorithms, such as algorithms based on k-d trees, is discussed. The algorithm has been implemented and tested on artificial problems, such as the parity problem, and on real problems arising in speech recognition. Experimental results, including training times and recognition accuracy, are given. Generally, the algorithm achieves accuracy as good as or better than nets trained using back-propagation. Accuracy is comparable to that for the nearest-neighbor algorithm, which is slower and requires more storage space.

  10. Everyone Loves a Good Story

    ERIC Educational Resources Information Center

    Croxall, Kathy C.; Gubler, Rea R.

    2006-01-01

    Everyone loves a good story. Reading brings back pleasant memories of being read to by parents or others. Literacy is encouraged when students are continually exposed to stories and books. Teachers can encourage students to discover their parents' favorite stories and share them with the class. In this article, the authors recommend the use of…

  11. Gender Play and Good Governance

    ERIC Educational Resources Information Center

    Powell, Mark

    2008-01-01

    Like good government, thoughtful care of children requires those in power, whether teachers or parents, to recognize when it is appropriate for them to step back from day-to-day decision-making while still working behind the scenes to ensure an organizational structure that supports the independence and equitable development of those they serve.…

  12. Practicing Good Habits, Grade 2.

    ERIC Educational Resources Information Center

    Nguyen Van Quan; And Others

    This illustrated primer, designed for second grade students in Vietnam, consists of stories depicting rural family life in Vietnam. The book is divided into the following six chapters: (1) Practicing Good Habits (health, play, helpfulness); (2) Duties at Home (grandparents, father and mother, servants, the extended family); (3) Duties in School…

  13. Education for the Good Society

    ERIC Educational Resources Information Center

    Lawson, Neal; Spours, Ken

    2011-01-01

    The Left is facing a crisis of its approach to education highlighted by the "education revolution" of the Coalition Government. The authors argue that it is important to step back and present a positive vision of education based on the key pillars of the Good Society--fairness, democracy, sustainability and well-being. This values-led agenda,…

  14. Measuring Goodness of Story Narratives

    ERIC Educational Resources Information Center

    Le, Karen; Coelho, Carl; Mozeiko, Jennifer; Grafman, Jordan

    2011-01-01

    Purpose: The purpose of this article was to evaluate a new measure of story narrative performance: story completeness. It was hypothesized that by combining organizational (story grammar) and completeness measures, story "goodness" could be quantified. Method: Discourse samples from 46 typically developing adults were compared with those from 24…

  15. "Good Morning Boys and Girls"

    ERIC Educational Resources Information Center

    Bigler, Rebecca S.

    2005-01-01

    It happens every day across the nation: Teachers welcome their students to class by saying, "Good morning, boys and girls." It is one of countless ways teachers highlight gender with their speech and behavior. Unfortunately, teachers' use of gender to label students and organize the classroom can have negative consequences. New research in the…

  16. Good and Bad Public Prose.

    ERIC Educational Resources Information Center

    Cockburn, Stewart

    1969-01-01

    The basic requirements of all good prose are clarity, accuracy, brevity, and simplicity. Especially in public prose--in which the meaning is the crux of the article or speech--concise, vigorous English demands a minimum of adjectives, a maximum use of the active voice, nouns carefully chosen, a logical argument with no labored or obscure points,…

  17. Making the Common Good Common

    ERIC Educational Resources Information Center

    Chase, Barbara

    2011-01-01

    How are independent schools to be useful to the wider world? Beyond their common commitment to educate their students for meaningful lives in service of the greater good, can they educate a broader constituency and, thus, share their resources and skills more broadly? Their answers to this question will be shaped by their independence. Any…

  18. Edifying Theory: Serving the Good.

    ERIC Educational Resources Information Center

    Manen, Max van

    1982-01-01

    This article, concerning the importance of and need for educational theory, elucidates the etymology of the word "theory," describes the importance of ethnomethodology to educational principles, and relates the concerns of epistemology to curriculum theory. The question "What is the good of theory?" is debated in relation to the actual benefit of…

  19. Is New Work Good Work?

    ERIC Educational Resources Information Center

    Westwood, Andy

    Some new work is good work. Quality is ultimately defined by the individual. However, these perceptions are inevitably colored by the circumstances in which people find themselves, by the time, place, and wide range of motivations for having to do a particular job in the first place. One person's quality may be another's purgatory and vice versa.…

  20. What Good Are Conferences, Anyway?

    ERIC Educational Resources Information Center

    Pietro, David C.

    1996-01-01

    According to Frederick Herzberg's studies of employee motivation, humans are driven by motivating factors that allow them to grow psychologically and hygiene factors that help them meet physical needs. Good education conferences can enhance both factors by helping principals refocus their energies, exchange ideas with trusted colleagues, and view…

  1. Benchmarking monthly homogenization algorithms

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    Training was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that automatic algorithms can now perform as well as manual ones.

  2. "What's the Plan?": "Good Management Begins with Good People"

    ERIC Educational Resources Information Center

    Vicars, Dennis

    2008-01-01

    In order for a successful center/school to achieve all it can for its children, staff, and operator, a plan is critical. Good planning begins by looking into the future that one wants for his or her center/school. Be as descriptive as possible in writing down the details of what that future looks like. Next, walk backwards from that future to the…

  3. Combining algorithms in automatic detection of QRS complexes in ECG signals.

    PubMed

    Meyer, Carsten; Fernández Gavela, José; Harris, Matthew

    2006-07-01

    QRS complex and specifically R-peak detection is the crucial first step in every automatic electrocardiogram analysis. Much work has been carried out in this field, using various methods ranging from filtering and threshold methods, through wavelet methods, to neural networks and others. Performance is generally good, but each method has situations where it fails. In this paper, we suggest an approach to automatically combine different QRS complex detection algorithms, here the Pan-Tompkins and wavelet algorithms, to benefit from the strengths of both methods. In particular, we introduce parameters that allow balancing the contribution of the individual algorithms; these parameters are estimated in a data-driven way. Experimental results and analysis are provided on the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) Arrhythmia Database. We show that our combination approach outperforms both individual algorithms. PMID:16871713
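
    A hypothetical sketch of the combination idea: R-peak candidates from the two detectors are matched within a tolerance window and kept when the weighted agreement passes a threshold. The weights w1 and w2 stand in for the balancing parameters the paper estimates from data; all names and values are illustrative.

      def fuse_detections(peaks1, peaks2, w1=0.6, w2=0.4, tol=0.05, accept=0.5):
          """peaks1, peaks2: sorted R-peak times in seconds from the two detectors."""
          fused, matched2, j = [], set(), 0
          for t in peaks1:
              while j < len(peaks2) and peaks2[j] < t - tol:
                  j += 1                               # skip detector-2 peaks far behind t
              hit = j < len(peaks2) and abs(peaks2[j] - t) <= tol
              if hit:
                  matched2.add(j)
              if w1 + (w2 if hit else 0.0) >= accept:  # weighted agreement
                  fused.append(t)
          # detector-2-only candidates survive only if detector 2 alone is trusted
          fused += [t for k, t in enumerate(peaks2) if k not in matched2 and w2 >= accept]
          return sorted(fused)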

  4. A simple parallel prefix algorithm for compact finite-difference schemes

    NASA Technical Reports Server (NTRS)

    Sun, Xian-He; Joslin, Ronald D.

    1993-01-01

    A compact scheme is a discretization scheme that is advantageous in obtaining highly accurate solutions. However, the systems resulting from compact schemes are tridiagonal and are difficult to solve efficiently on parallel computers. Considering the almost symmetric Toeplitz structure, a parallel algorithm, simple parallel prefix (SPP), is proposed. The SPP algorithm requires less memory than conventional LU decomposition and is highly efficient on parallel machines. It consists of a prefix communication pattern and AXPY operations. Both the computation and the communication can be truncated without degrading the accuracy when the system is diagonally dominant. A formal accuracy study was conducted to provide a simple truncation formula. Experiments were run on a MasPar MP-1 SIMD machine and on a Cray 2 vector machine, and the results show that the simple parallel prefix algorithm is a good algorithm for compact schemes on high-performance computers.
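
    For reference, the serial Thomas algorithm these tridiagonal systems are conventionally solved with; the SPP algorithm replaces this inherently sequential recurrence with a prefix communication pattern plus AXPY operations. This baseline is a sketch, not the SPP algorithm itself.

      import numpy as np

      def thomas(a, b, c, d):
          """Solve a tridiagonal system; a, b, c are sub-, main and super-diagonals."""
          n = len(b)
          cp, dp = np.empty(n), np.empty(n)
          cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
          for i in range(1, n):                       # forward elimination
              m = b[i] - a[i] * cp[i - 1]
              cp[i] = c[i] / m if i < n - 1 else 0.0
              dp[i] = (d[i] - a[i] * dp[i - 1]) / m
          x = np.empty(n)
          x[-1] = dp[-1]
          for i in range(n - 2, -1, -1):              # back substitution
              x[i] = dp[i] - cp[i] * x[i + 1]
          return x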

  5. Satellite Movie Shows Erika Dissipate

    NASA Video Gallery

    This animation of visible and infrared imagery from NOAA's GOES-West satellite from Aug. 27 to 29 shows Tropical Storm Erika move through the Eastern Caribbean Sea and dissipate near eastern Cuba. ...

  6. Parameters Identification of Fluxgate Magnetic Core Adopting the Biogeography-Based Optimization Algorithm.

    PubMed

    Jiang, Wenjuan; Shi, Yunbo; Zhao, Wenjie; Wang, Xiangxin

    2016-01-01

    The main part of the magnetic fluxgate sensor is the magnetic core, whose hysteresis characteristic affects the performance of the sensor. When fluxgate sensors are modelled for design purposes, an accurate model of the hysteresis characteristic of the cores is necessary to achieve good agreement between modelled and experimental data. The Jiles-Atherton model is simple and reflects the hysteresis properties of magnetic materials precisely, which makes it widely used in hysteresis modelling and simulation of ferromagnetic materials. However, in practice it is difficult to determine the parameters accurately owing to their sensitivity. In this paper, the Biogeography-Based Optimization (BBO) algorithm is applied to identify the Jiles-Atherton model parameters. To enhance the performance of the BBO algorithm in global search capability, search accuracy and convergence rate, an improved Biogeography-Based Optimization (IBBO) algorithm is put forward by using an Arnold map and the mutation strategy of the Differential Evolution (DE) algorithm. Simulation results show that the IBBO algorithm is superior to the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Differential Evolution and BBO algorithms in identification accuracy and convergence rate. The IBBO algorithm is applied to identify the Jiles-Atherton model parameters of a selected permalloy. The simulated hysteresis loop is in close agreement with experimental data. Using the permalloy as the core of a fluxgate probe, the simulated output is consistent with the experimental output. The IBBO algorithm can identify the parameters of the Jiles-Atherton model accurately, which provides a basis for the precise analysis and design of instruments and equipment with magnetic cores. PMID:27347974

  7. Parameters Identification of Fluxgate Magnetic Core Adopting the Biogeography-Based Optimization Algorithm

    PubMed Central

    Jiang, Wenjuan; Shi, Yunbo; Zhao, Wenjie; Wang, Xiangxin

    2016-01-01

    The main part of the magnetic fluxgate sensor is the magnetic core, whose hysteresis characteristic affects the performance of the sensor. When fluxgate sensors are modelled for design purposes, an accurate model of the hysteresis characteristic of the cores is necessary to achieve good agreement between modelled and experimental data. The Jiles-Atherton model is simple and reflects the hysteresis properties of magnetic materials precisely, which makes it widely used in hysteresis modelling and simulation of ferromagnetic materials. However, in practice it is difficult to determine the parameters accurately owing to their sensitivity. In this paper, the Biogeography-Based Optimization (BBO) algorithm is applied to identify the Jiles-Atherton model parameters. To enhance the performance of the BBO algorithm in global search capability, search accuracy and convergence rate, an improved Biogeography-Based Optimization (IBBO) algorithm is put forward by using an Arnold map and the mutation strategy of the Differential Evolution (DE) algorithm. Simulation results show that the IBBO algorithm is superior to the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Differential Evolution and BBO algorithms in identification accuracy and convergence rate. The IBBO algorithm is applied to identify the Jiles-Atherton model parameters of a selected permalloy. The simulated hysteresis loop is in close agreement with experimental data. Using the permalloy as the core of a fluxgate probe, the simulated output is consistent with the experimental output. The IBBO algorithm can identify the parameters of the Jiles-Atherton model accurately, which provides a basis for the precise analysis and design of instruments and equipment with magnetic cores. PMID:27347974
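
    A minimal plain-BBO sketch (not the paper's IBBO variant with the Arnold map and DE mutation): habitats exchange features with rank-based immigration and emigration rates, then mutate. Fitting the Jiles-Atherton parameters would amount to choosing f as the misfit between measured and simulated hysteresis loops; all settings here are illustrative.

      import numpy as np

      def bbo_minimize(f, lo, hi, n=30, iters=200, p_mut=0.05, seed=0):
          rng = np.random.default_rng(seed)
          lo, hi = np.asarray(lo, float), np.asarray(hi, float)
          pop = rng.uniform(lo, hi, (n, lo.size))
          cost = np.apply_along_axis(f, 1, pop)
          for _ in range(iters):
              order = np.argsort(cost)                # best habitat first
              pop, cost = pop[order], cost[order]
              lam = (np.arange(n) + 1) / n            # immigration: worst immigrates most
              mu = 1.0 - lam                          # emigration: best emigrates most
              new = pop.copy()
              for i in range(1, n):                   # keep the elite habitat (i = 0)
                  for j in range(lo.size):
                      if rng.random() < lam[i]:       # immigrate feature j ...
                          src = rng.choice(n, p=mu / mu.sum())
                          new[i, j] = pop[src, j]     # ... from an emigrating habitat
                      if rng.random() < p_mut:
                          new[i, j] = rng.uniform(lo[j], hi[j])
              pop = new
              cost = np.apply_along_axis(f, 1, pop)
          best = cost.argmin()
          return pop[best], cost[best]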

  8. Multiprojection algorithms with generalized projections

    SciTech Connect

    Censor, J.; Elfving, T.

    1994-12-31

    Generalized distances give rise to generalized projections onto convex sets. An important question is whether or not one can use, within the same projection algorithm, different types of such generalized projections. This question has practical consequences in the areas of signal detection and image recovery, in situations that can be formulated mathematically as convex feasibility problems. We show here that a simultaneous multiprojection algorithmic scheme converges. Different specific multiprojection algorithms can be derived from our scheme by a judicious choice of the Bregman functions which govern the process. As a by-product of the investigation we also obtain block-iterative schemes for certain kinds of linearly constrained optimization problems.
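
    A sketch of one simultaneous projection step for a convex feasibility problem, using ordinary Euclidean projections (the Bregman projections generated by F(x) = ||x||^2 / 2) onto a halfspace and a ball; the equal weighting of the projections is an illustrative choice.

      import numpy as np

      def project_halfspace(x, a, b):                 # onto {x : a.x <= b}
          viol = a @ x - b
          return x if viol <= 0 else x - viol * a / (a @ a)

      def project_ball(x, c, r):                      # onto {x : ||x - c|| <= r}
          d = np.linalg.norm(x - c)
          return x if d <= r else c + r * (x - c) / d

      a, b = np.array([1.0, 1.0]), 1.0
      c, r = np.array([2.0, 0.0]), 1.5
      x = np.array([5.0, 5.0])
      for _ in range(100):
          # simultaneous step: average the projections onto all sets
          x = 0.5 * (project_halfspace(x, a, b) + project_ball(x, c, r))
      print(x)    # a point (approximately) in the intersection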

  9. The New and Computationally Efficient MIL-SOM Algorithm: Potential Benefits for Visualization and Analysis of a Large-Scale High-Dimensional Clinically Acquired Geographic Data

    PubMed Central

    Oyana, Tonny J.; Achenie, Luke E. K.; Heo, Joon

    2012-01-01

    The objective of this paper is to introduce an efficient algorithm, namely, the mathematically improved learning-self organizing map (MIL-SOM) algorithm, which speeds up the self-organizing map (SOM) training process. In the proposed MIL-SOM algorithm, the weights of Kohonen's SOM are updated based on a proportional-integral-derivative (PID) controller. Thus, in a typical SOM learning setting, this improvement translates to faster convergence. The basic idea is primarily motivated by the urgent need to develop algorithms with the competence to converge faster and more efficiently than conventional techniques. The MIL-SOM algorithm is tested on four training geographic datasets representing biomedical and disease informatics application domains. Experimental results show that the MIL-SOM algorithm provides a competitive, improved updating procedure and performance, good robustness, and runs faster than Kohonen's SOM. PMID:22481977
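
    For orientation, the core of a standard Kohonen SOM update step. Per the abstract, MIL-SOM replaces the plain proportional step below with a PID-style update on the error (x - w); the integral and derivative terms are not reproduced here, and all parameter values are illustrative.

      import numpy as np

      def som_step(weights, x, t, lr0=0.5, sigma0=3.0, tau=200.0):
          """weights: (rows, cols, dim) map; x: one input vector; t: iteration index."""
          rows, cols, _ = weights.shape
          dist = np.linalg.norm(weights - x, axis=2)
          bi, bj = np.unravel_index(dist.argmin(), (rows, cols))   # best matching unit
          lr = lr0 * np.exp(-t / tau)                              # decaying learning rate
          sigma = sigma0 * np.exp(-t / tau)                        # shrinking neighbourhood
          ii, jj = np.indices((rows, cols))
          h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
          weights += lr * h[..., None] * (x - weights)             # proportional update
          return weights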

  10. National Orange Show Photovoltaic Demonstration

    SciTech Connect

    Dan Jimenez; Sheri Raborn, CPA; Tom Baker

    2008-03-31

    The National Orange Show Photovoltaic Demonstration created a 400 kW photovoltaic self-generation plant at the National Orange Show Events Center (NOS). The NOS owns a 120-acre state fairground where it operates an events center and produces an annual citrus fair known as the Orange Show. The NOS governing board wanted to employ cost-saving programs for annual energy expenses. It is hoped the photovoltaic program will result in overall savings for the NOS, help reduce the State's energy demands as they relate to electrical power consumption, improve quality of life within the affected grid area, and increase the energy efficiency of buildings at the venue. In addition, the potential to reduce operational expenses would have a tremendous effect on the ability of the NOS to serve its community.

  11. Arches showing UV flaring activity

    NASA Technical Reports Server (NTRS)

    Fontenla, J. M.

    1988-01-01

    The UVSP data obtained in the previous activity-cycle maximum show the frequent appearance of flaring events in the UV. In many cases these flaring events are characterized by at least two footpoints which show compact, impulsive, non-simultaneous brightenings, and a fainter but clearly observed arch develops between the footpoints. These arches and footpoints are observed in lines corresponding to different temperatures, such as Lyman alpha, N V, and C IV, and when observed above the limb display large Doppler shifts at some stages. The size of the arches can be larger than 20 arcsec.

  12. Spectral Regularization Algorithms for Learning Large Incomplete Matrices.

    PubMed

    Mazumder, Rahul; Hastie, Trevor; Tibshirani, Robert

    2010-03-01

    We use convex relaxation techniques to provide a sequence of regularized low-rank solutions for large-scale matrix completion problems. Using the nuclear norm as a regularizer, we provide a simple and very efficient convex algorithm for minimizing the reconstruction error subject to a bound on the nuclear norm. Our algorithm Soft-Impute iteratively replaces the missing elements with those obtained from a soft-thresholded SVD. With warm starts this allows us to efficiently compute an entire regularization path of solutions on a grid of values of the regularization parameter. The computationally intensive part of our algorithm is in computing a low-rank SVD of a dense matrix. Exploiting the problem structure, we show that the task can be performed with a complexity linear in the matrix dimensions. Our semidefinite-programming algorithm is readily scalable to large matrices: for example, it can obtain a rank-80 approximation of a 10^6 × 10^6 incomplete matrix with 10^5 observed entries in 2.5 hours, and can fit a rank-40 approximation to the full Netflix training set in 6.6 hours. Our methods show very good performance in both training and test error when compared to other competitive state-of-the-art techniques.
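
    A compact dense-matrix sketch of the Soft-Impute iteration; the paper's scalable version computes the low-rank SVD cheaply by exploiting problem structure, which this illustration does not.

      import numpy as np

      def soft_impute(X, mask, lam=1.0, iters=100):
          """X: data matrix (any values where missing); mask: True where observed."""
          Z = np.zeros_like(X)
          for _ in range(iters):
              filled = np.where(mask, X, Z)            # observed data + current guesses
              U, s, Vt = np.linalg.svd(filled, full_matrices=False)
              s = np.maximum(s - lam, 0.0)             # soft-threshold singular values
              Z = (U * s) @ Vt
          return np.where(mask, X, Z)                  # keep observed entries exact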

  13. Switch for Good Community Program

    SciTech Connect

    Crawford, Tabitha; Amran, Martha

    2013-11-19

    Switch4Good is an energy-savings program that helps residents reduce consumption from behavior changes; it was co-developed by Balfour Beatty Military Housing Management (BB) and WattzOn in Phase I of this grant. The program was offered at 11 Navy bases. Three customer engagement strategies were evaluated, and it was found that Digital Nudges (a combination of monthly consumption statements with frequent messaging via text or email) was most cost-effective.

  14. Good pharmacovigilance practices: technology enabled.

    PubMed

    Nelson, Robert C; Palsulich, Bruce; Gogolak, Victor

    2002-01-01

    The assessment of spontaneous reports is most effective when it is conducted within a defined and rigorous process. The framework for good pharmacovigilance process (GPVP) is proposed as a subset of good postmarketing surveillance process (GPMSP), a functional structure for both a public health and corporate risk management strategy. GPVP has good practices that implement each step within a defined process. These practices are designed to efficiently and effectively detect and alert the drug safety professional to new and potentially important information on drug-associated adverse reactions. These practices are enabled by applied technology designed specifically for the review and assessment of spontaneous reports. Specific practices include rules-based triage, active query prompts for severe organ insults, contextual single case evaluation, statistical proportionality and correlational checks, case-series analyses, and templates for signal work-up and interpretation. These practices and the overall GPVP are supported by state-of-the-art web-based systems with powerful analytical engines, workflow and audit trails to allow validated systems support for valid drug safety signalling efforts. It is also important to understand that a process has a defined set of steps and any one cannot stand independently. Specifically, advanced use of technical alerting methods in isolation can mislead and allow one to misunderstand priorities and relative value. In the end, pharmacovigilance is a clinical art and a component process to the science of pharmacoepidemiology and risk management. PMID:12071777

  15. Statistical behaviour of adaptive multilevel splitting algorithms in simple models

    SciTech Connect

    Rolland, Joran; Simonnet, Eric

    2015-02-15

    Adaptive multilevel splitting algorithms have been introduced rather recently for estimating tail distributions in a fast and efficient way. In particular, they can be used for computing the so-called reactive trajectories corresponding to direct transitions from one metastable state to another. The algorithm is based on successive selection-mutation steps performed on the system in a controlled way. It has two intrinsic parameters, the number of particles/trajectories and the reaction coordinate used for discriminating good and bad trajectories. We first investigate the convergence in law of the algorithm as a function of the timestep for several simple stochastic models. Second, we consider the average duration of reactive trajectories, for which no theoretical predictions exist. The most important aspect of this work concerns some systems with two degrees of freedom. They are studied in detail as a function of the reaction coordinate in the asymptotic regime where the number of trajectories goes to infinity. We show that during phase transitions, the statistics of the algorithm deviate significantly from known theoretical results when non-optimal reaction coordinates are used. In this case, the variance of the algorithm peaks at the transition and the convergence of the algorithm can be much slower than the usually expected central limit behaviour. The duration of trajectories is affected as well. Moreover, reactive trajectories do not correspond to the most probable ones. Such behaviour disappears when using the optimal reaction coordinate, called the committor, as predicted by the theory. We finally investigate a three-state Markov chain which reproduces this phenomenon and show logarithmic convergence of the trajectory durations.
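
    A minimal adaptive multilevel splitting sketch under simple assumptions: the rare event is a short random walk with negative drift exceeding a level, and the reaction coordinate is the running maximum. The selection-mutation loop kills the lowest-scoring walkers and rebranches them from a survivor's first crossing of the killed level; all settings are illustrative.

      import numpy as np

      def walk(start, steps, rng):
          return np.concatenate(([start], start + np.cumsum(-0.1 + rng.standard_normal(steps))))

      def ams(target=3.0, n=100, steps=200, seed=0):
          rng = np.random.default_rng(seed)
          paths = [walk(0.0, steps, rng) for _ in range(n)]
          scores = np.array([p.max() for p in paths])
          prob = 1.0
          while True:
              z = scores.min()                         # current splitting level
              if z >= target:
                  return prob
              killed = np.flatnonzero(scores <= z)
              survivors = np.flatnonzero(scores > z)
              if survivors.size == 0:
                  return 0.0                           # extinction
              prob *= 1.0 - killed.size / n            # level-crossing factor
              for i in killed:                         # selection-mutation step
                  src = paths[rng.choice(survivors)]
                  cut = int(np.argmax(src > z))        # survivor's first crossing of z
                  paths[i] = np.concatenate((src[:cut], walk(src[cut], steps - cut, rng)))
                  scores[i] = paths[i].max()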

  16. Create a Polarized Light Show.

    ERIC Educational Resources Information Center

    Conrad, William H.

    1992-01-01

    Presents a lesson that introduces students to polarized light using a problem-solving approach. After illustrating the concept using a slinky and poster board with a vertical slot, students solve the problem of creating a polarized light show using Polya's problem-solving methods. (MDH)

  17. Pembrolizumab Shows Promise for NSCLC.

    PubMed

    2015-06-01

    Data from the KEYNOTE-001 trial show that pembrolizumab improves clinical outcomes for patients with advanced non-small cell lung cancer, and is well tolerated. PD-L1 expression in at least 50% of tumor cells correlated with improved efficacy.

  18. Spaceborne SAR Imaging Algorithm for Coherence Optimized.

    PubMed

    Qiu, Zhiwei; Yue, Jianping; Wang, Xueqin; Yue, Shun

    2016-01-01

    This paper proposes a SAR imaging algorithm with the largest coherence, based on existing SAR imaging algorithms. The basic idea of SAR imaging is that the output signal attains the maximum signal-to-noise ratio (SNR) when optimal imaging parameters are used. A traditional imaging algorithm achieves the best focusing but introduces decoherence in the subsequent interferometric processing. In the algorithm proposed in this paper, the SAR echoes are instead focused with consistent imaging parameters. Although the SNR of the output signal is reduced slightly, coherence is largely preserved, and a high-quality interferogram is finally obtained. In this paper, two scenes of Envisat ASAR data over Zhangbei are employed to conduct experiments with this algorithm. Compared with the interferogram from the traditional algorithm, the results show that this algorithm is more suitable for SAR interferometry (InSAR) research and applications. PMID:26871446

  19. Spaceborne SAR Imaging Algorithm for Coherence Optimized.

    PubMed

    Qiu, Zhiwei; Yue, Jianping; Wang, Xueqin; Yue, Shun

    2016-01-01

    This paper proposes a SAR imaging algorithm with the largest coherence, based on existing SAR imaging algorithms. The basic idea of SAR imaging is that the output signal attains the maximum signal-to-noise ratio (SNR) when optimal imaging parameters are used. A traditional imaging algorithm achieves the best focusing but introduces decoherence in the subsequent interferometric processing. In the algorithm proposed in this paper, the SAR echoes are instead focused with consistent imaging parameters. Although the SNR of the output signal is reduced slightly, coherence is largely preserved, and a high-quality interferogram is finally obtained. In this paper, two scenes of Envisat ASAR data over Zhangbei are employed to conduct experiments with this algorithm. Compared with the interferogram from the traditional algorithm, the results show that this algorithm is more suitable for SAR interferometry (InSAR) research and applications.

  20. Spaceborne SAR Imaging Algorithm for Coherence Optimized

    PubMed Central

    Qiu, Zhiwei; Yue, Jianping; Wang, Xueqin; Yue, Shun

    2016-01-01

    This paper proposes a SAR imaging algorithm with the largest coherence, based on existing SAR imaging algorithms. The basic idea of SAR imaging is that the output signal attains the maximum signal-to-noise ratio (SNR) when optimal imaging parameters are used. A traditional imaging algorithm achieves the best focusing but introduces decoherence in the subsequent interferometric processing. In the algorithm proposed in this paper, the SAR echoes are instead focused with consistent imaging parameters. Although the SNR of the output signal is reduced slightly, coherence is largely preserved, and a high-quality interferogram is finally obtained. In this paper, two scenes of Envisat ASAR data over Zhangbei are employed to conduct experiments with this algorithm. Compared with the interferogram from the traditional algorithm, the results show that this algorithm is more suitable for SAR interferometry (InSAR) research and applications. PMID:26871446
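
    For reference, the interferometric coherence magnitude that consistent-parameter focusing is designed to keep high, estimated over a sliding window from two co-registered complex SAR images (a standard estimator, not the paper's code).

      import numpy as np
      from scipy.signal import convolve2d

      def coherence(s1, s2, win=5):
          """s1, s2: complex 2D arrays; returns |gamma| in [0, 1] per pixel."""
          k = np.ones((win, win))
          box = lambda a: convolve2d(a, k, mode="same")      # windowed sum
          num = box(s1 * np.conj(s2))
          den = np.sqrt(box(np.abs(s1) ** 2) * box(np.abs(s2) ** 2))
          return np.abs(num) / np.maximum(den, 1e-12)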

  1. What is a Systolic Algorithm?

    NASA Astrophysics Data System (ADS)

    Rao, Sailesh K.; Kollath, T.

    1986-07-01

    In this paper, we show that every systolic array executes a Regular Iterative Algorithm with a strongly separating hyperplane and, conversely, that every such algorithm can be implemented on a systolic array. This characterization provides us with a unified framework for describing the contributions of other authors. It also exposes the relevance of many fundamental concepts that were introduced in the sixties by Hennie, Waite and Karp, Miller and Winograd, to the present-day concern of systolic arrays.

  2. Efficient spectral and pseudospectral algorithms for 3D simulations of whistler-mode waves in a plasma

    SciTech Connect

    Gumerov, Nail A.; Karavaev, Alexey V.; Surjalal Sharma, A.; Shao Xi; Papadopoulos, Konstantinos D.

    2011-04-01

    Efficient spectral and pseudospectral algorithms for simulation of linear and nonlinear 3D whistler waves in a cold electron plasma are developed. These algorithms are applied to the simulation of whistler waves generated by loop antennas and spheromak-like stationary waves of considerable amplitude. The algorithms are linearly stable and show good stability properties for computations of nonlinear waves over tens of thousands of time steps. Additional speedups by factors of 10-20 (comparing a single-core CPU and one GPU) are achieved by using graphics processors (GPUs), which enable efficient numerical simulation of the wave propagation on relatively high resolution meshes (tens of millions of nodes) in a personal computing environment. Comparisons of the numerical results with analytical solutions and experiments show good agreement. The limitations of the codes and the performance of the GPU computing are discussed.

  3. Virtual goods recommendations in virtual worlds.

    PubMed

    Chen, Kuan-Yu; Liao, Hsiu-Yu; Chen, Jyun-Hung; Liu, Duen-Ren

    2015-01-01

    Virtual worlds (VWs) are computer-simulated environments which allow users to create their own virtual character as an avatar. With the rapidly growing user volume in VWs, platform providers launch virtual goods in haste and stampede users in order to increase sales revenue. However, this rapid development yields unrelated virtual items that are difficult to remarket. It not only wastes virtual-world companies' intelligence resources, but also makes it difficult for users to find suitable virtual goods that fit their virtual home in daily virtual life. In the VWs, users decorate their houses, visit others' homes, create families, host parties, and so forth. Users establish their social life circles through these activities. This research proposes a novel virtual goods recommendation method based on these social interactions. The contact strength and contact influence that result from interactions with social neighbors affect users' buying intention. Our research highlights the importance of social interactions in virtual goods recommendation. The experiment's data were retrieved from an online VW platform, and the results show that the proposed method, considering social interactions and the social life circle, performs better than existing recommendation methods. PMID:25834837

  4. Virtual goods recommendations in virtual worlds.

    PubMed

    Chen, Kuan-Yu; Liao, Hsiu-Yu; Chen, Jyun-Hung; Liu, Duen-Ren

    2015-01-01

    Virtual worlds (VWs) are computer-simulated environments which allow users to create their own virtual character as an avatar. With the rapidly growing user volume in VWs, platform providers launch virtual goods in haste and stampede users in order to increase sales revenue. However, this rapid development yields unrelated virtual items that are difficult to remarket. It not only wastes virtual-world companies' intelligence resources, but also makes it difficult for users to find suitable virtual goods that fit their virtual home in daily virtual life. In the VWs, users decorate their houses, visit others' homes, create families, host parties, and so forth. Users establish their social life circles through these activities. This research proposes a novel virtual goods recommendation method based on these social interactions. The contact strength and contact influence that result from interactions with social neighbors affect users' buying intention. Our research highlights the importance of social interactions in virtual goods recommendation. The experiment's data were retrieved from an online VW platform, and the results show that the proposed method, considering social interactions and the social life circle, performs better than existing recommendation methods.

  5. Virtual Goods Recommendations in Virtual Worlds

    PubMed Central

    Chen, Kuan-Yu; Liao, Hsiu-Yu; Chen, Jyun-Hung; Liu, Duen-Ren

    2015-01-01

    Virtual worlds (VWs) are computer-simulated environments which allow users to create their own virtual character as an avatar. With the rapidly growing user volume in VWs, platform providers launch virtual goods in haste and stampede users in order to increase sales revenue. However, this rapid development yields unrelated virtual items that are difficult to remarket. It not only wastes virtual-world companies' intelligence resources, but also makes it difficult for users to find suitable virtual goods that fit their virtual home in daily virtual life. In the VWs, users decorate their houses, visit others' homes, create families, host parties, and so forth. Users establish their social life circles through these activities. This research proposes a novel virtual goods recommendation method based on these social interactions. The contact strength and contact influence that result from interactions with social neighbors affect users' buying intention. Our research highlights the importance of social interactions in virtual goods recommendation. The experiment's data were retrieved from an online VW platform, and the results show that the proposed method, considering social interactions and the social life circle, performs better than existing recommendation methods. PMID:25834837

  6. Spatial dynamics of ecological public goods.

    PubMed

    Wakano, Joe Yuichiro; Nowak, Martin A; Hauert, Christoph

    2009-05-12

    The production, consumption, and exploitation of common resources ranging from extracellular products in microorganisms to global issues of climate change refer to public goods interactions. Individuals can cooperate and sustain common resources at some cost or defect and exploit the resources without contributing. This generates a conflict of interest, which characterizes social dilemmas: Individual selection favors defectors, but for the community, it is best if everybody cooperates. Traditional models of public goods do not take into account that benefits of the common resource enable cooperators to maintain higher population densities. This leads to a natural feedback between population dynamics and interaction group sizes as captured by "ecological public goods." Here, we show that the spatial evolutionary dynamics of ecological public goods in "selection-diffusion" systems promotes cooperation based on different types of pattern formation processes. In spatial settings, individuals can migrate (diffuse) to populate new territories. Slow diffusion of cooperators fosters aggregation in highly productive patches (activation), whereas fast diffusion enables defectors to readily locate and exploit these patches (inhibition). These antagonistic forces promote coexistence of cooperators and defectors in static or dynamic patterns, including spatial chaos of ever-changing configurations. The local environment of cooperators and defectors is shaped by the production or consumption of common resources. Hence, diffusion-induced self-organization into spatial patterns not only enhances cooperation but also provides simple mechanisms for the spontaneous generation of habitat diversity, which denotes a crucial determinant of the viability of ecological systems.

  7. Magic Carpet Shows Its Colors

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The upper left image in this display is from the panoramic camera on the Mars Exploration Rover Spirit, showing the 'Magic Carpet' region near the rover at Gusev Crater, Mars, on Sol 7, the seventh martian day of its journey (Jan. 10, 2004). The lower image, also from the panoramic camera, is a monochrome (single filter) image of a rock in the 'Magic Carpet' area. Note that colored portions of the rock correlate with extracted spectra shown in the plot to the side. Four different types of materials are shown: the rock itself, the soil in front of the rock, some brighter soil on top of the rock, and some dust that has collected in small recesses on the rock face ('spots'). Each color on the spectra matches a line on the graph, showing how the panoramic camera's different colored filters are used to broadly assess the varying mineral compositions of martian rocks and soils.

  8. Analysis of the Dryden Wet Bulb GLobe Temperature Algorithm for White Sands Missile Range

    NASA Technical Reports Server (NTRS)

    LaQuay, Ryan Matthew

    2011-01-01

    In locations where the workforce is exposed to high relative humidity and light winds, heat stress is a significant concern. Such is the case at the White Sands Missile Range in New Mexico. Heat stress is quantified by the wet bulb globe temperature, which is the official measurement used by the American Conference of Governmental Industrial Hygienists. The wet bulb globe temperature is measured by an instrument that is designed to be portable but needs routine maintenance. As an alternative to this instrument, algorithms have been created to calculate the wet bulb globe temperature from basic meteorological observations. The algorithms are location dependent; therefore, a specific algorithm is usually not suitable for multiple locations. Due to climatological similarities, the algorithm developed for use at the Dryden Flight Research Center was applied to data from the White Sands Missile Range. A study was performed that compared a wet bulb globe instrument to data from two Surface Atmospheric Measurement Systems applied to the Dryden wet bulb globe temperature algorithm. The period of study was from June to September of 2009, with focus on 0900 to 1800 local time. Analysis showed that the algorithm worked well, with a few exceptions. The algorithm becomes less accurate when the dew point temperature is over 10 Celsius. Cloud cover also has a significant effect on the measured wet bulb globe temperature. The algorithm does not capture red and black heat stress flags well, due to the shorter time scales of such events. The results of this study show that it is plausible that the Dryden Flight Research Center wet bulb globe temperature algorithm is compatible with the White Sands Missile Range, except when there are elevated dew point temperatures, cloud cover or precipitation. During such occasions, the wet bulb globe temperature instrument would be the preferred method of measurement. Out of the 30
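
    For reference, the outdoor wet bulb globe temperature is a fixed weighting of three measurements; regression algorithms such as Dryden's estimate the hard-to-measure natural wet bulb and globe temperatures from standard meteorological observations before applying this formula.

      def wbgt_outdoor(t_nat_wet_bulb, t_globe, t_dry_bulb):
          """Standard outdoor WBGT in deg C: 0.7*Tnwb + 0.2*Tg + 0.1*Tdb."""
          return 0.7 * t_nat_wet_bulb + 0.2 * t_globe + 0.1 * t_dry_bulb

      # Example: a humid summer afternoon.
      print(wbgt_outdoor(t_nat_wet_bulb=24.0, t_globe=45.0, t_dry_bulb=33.0))  # 29.1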

  9. A 2D vector map watermarking algorithm resistant to simplification attack

    NASA Astrophysics Data System (ADS)

    Wang, Chuanjian; Liang, Bin; Zhao, Qingzhan; Qiu, Zuqi; Peng, Yuwei; Yu, Liang

    2009-12-01

    Vector maps are a valuable asset of data producers. How to protect the copyright of vector maps effectively using digital watermarking is a hot research issue. In this paper, we propose a new robust and blind watermarking algorithm resilient to simplification attack. We prove that the spatial topological relations between map objects have an important property of approximate invariance under simplification. We choose spatial topological relations as the watermark feature domain and embed watermarks by slightly modifying the spatial topological relations between map objects. Experiments show that our algorithm resists simplification attack well and achieves a good tradeoff between robustness and data fidelity.

  10. Improvement of characteristic statistic algorithm and its application on equilibrium cycle reloading optimization

    SciTech Connect

    Hu, Y.; Liu, Z.; Shi, X.; Wang, B.

    2006-07-01

    A brief introduction to the characteristic statistic algorithm (CSA), a new global optimization algorithm for the problem of PWR in-core fuel management optimization, is given in the paper. CSA is modified by the adoption of a back propagation neural network and fast local adjustment. The modified CSA is then applied to PWR equilibrium cycle reloading optimization, and the corresponding optimization code CSA-DYW is developed. CSA-DYW is used to optimize the 18-month equilibrium reloading cycle of Unit 1 of the Daya Bay nuclear plant. The results show that CSA-DYW has high efficiency and good global performance on PWR equilibrium cycle reloading optimization. (authors)

  11. Multi-objective Emergency Facility Location Problem Based on Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Zhao, Dan; Zhao, Yunsheng; Li, Zhenhua; Chen, Jin

    In recent years, emergent disasters have occurred frequently. This has attracted more attention to emergency management, especially the multi-objective emergency facility location problem (EFLP), an NP-hard problem. However, few algorithms solve the problem efficiently, so the application of a genetic algorithm (GA) is a good choice. This paper first introduces the mathematical models for this problem and transforms their complex constraints into simple ones by means of a penalty function. The solutions to the experiments are obtained by applying the GA. The experimental results show that the GA can solve such problems effectively.
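
    A minimal single-objective sketch of the penalty-plus-GA recipe described above, in Python; the problem data, Manhattan-distance objective, penalty weight, and GA settings are all hypothetical stand-ins rather than the paper's models.

      import random

      random.seed(0)
      demand = [(2, 3), (8, 1), (5, 9), (7, 7), (1, 8)]   # demand points
      sites = [(1, 1), (4, 5), (8, 8), (6, 2), (2, 7)]    # candidate sites
      K = 2                                               # facilities to open

      def fitness(chrom):
          """Negative total Manhattan distance, penalized when the
          number of opened facilities differs from K."""
          opened = [s for s, bit in zip(sites, chrom) if bit]
          if not opened:
              return float("-inf")
          total = sum(min(abs(dx - sx) + abs(dy - sy) for sx, sy in opened)
                      for dx, dy in demand)
          return -total - 100.0 * abs(sum(chrom) - K)     # penalty term

      def ga(pop_size=30, gens=200, pm=0.1):
          pop = [[random.randint(0, 1) for _ in sites] for _ in range(pop_size)]
          for _ in range(gens):
              pop.sort(key=fitness, reverse=True)
              elite = pop[: pop_size // 2]                # truncation selection
              children = []
              while len(elite) + len(children) < pop_size:
                  a, b = random.sample(elite, 2)
                  cut = random.randrange(1, len(sites))   # one-point crossover
                  child = [1 - g if random.random() < pm else g
                           for g in a[:cut] + b[cut:]]    # bit-flip mutation
                  children.append(child)
              pop = elite + children
          return max(pop, key=fitness)

      best = ga()
      print(best, fitness(best))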

  12. Infrared small target detection based on bilateral filtering algorithm with similarity judgments

    NASA Astrophysics Data System (ADS)

    Li, Yanbei; Li, Yan

    2014-11-01

    Infrared small target detection is one of the key technologies in infrared precision guidance and search-and-track systems. Because the distance between the infrared imaging system and the target is large, the target appears small, faint, and obscure, and interference from background clutter and system noise is intense. To solve the problem of infrared small target detection against a complex background, this paper proposes a bilateral filtering algorithm based on similarity judgments for infrared image background prediction. The algorithm introduces a gradient factor and a similarity judgment factor into traditional bilateral filtering; the two factors improve the accuracy of the algorithm in smooth regions. At the same time, the spatial proximity and gray similarity coefficients of the bilateral filter are each approximated by the first two terms of their Maclaurin expansions, which reduces the time overhead. Simulation results show that, compared with the improved bilateral filtering algorithm, the proposed algorithm effectively suppresses complex background clutter in infrared images, enhances the target signal, improves the signal-to-noise ratio (SNR) and contrast, and reduces computation time. In short, the algorithm has good background rejection performance.
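
    A loose Python sketch of the predict-background-then-subtract idea: a bilateral-style filter whose similarity term is judged against the local median (so a few bright target pixels are excluded from the predicted background), followed by residual peak detection. This is an illustrative stand-in, not the paper's exact filter with gradient and similarity-judgment factors.

      import numpy as np

      def predict_background(img, radius=5, sigma_s=3.0, sigma_r=20.0):
          """Bilateral-style smoother; similarity is judged against the
          local median, so a few bright target pixels get near-zero
          weight and the output tracks the background."""
          pad = np.pad(img, radius, mode="reflect")
          ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
          spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))
          out = np.empty_like(img)
          for i in range(img.shape[0]):
              for j in range(img.shape[1]):
                  patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                  ref = np.median(patch)
                  w = spatial * np.exp(-(patch - ref) ** 2 / (2 * sigma_r ** 2))
                  out[i, j] = (w * patch).sum() / w.sum()
          return out

      rng = np.random.default_rng(0)
      img = 50.0 + 2.0 * rng.standard_normal((64, 64))
      img[30:33, 40:43] += 60.0              # plant a 3x3 bright target
      residual = img - predict_background(img)
      print(np.unravel_index(residual.argmax(), residual.shape))  # ~(31, 41)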

  13. Establishing a Dynamic Self-Adaptation Learning Algorithm of the BP Neural Network and Its Applications

    NASA Astrophysics Data System (ADS)

    Li, Xiaofeng; Xiang, Suying; Zhu, Pengfei; Wu, Min

    2015-12-01

    In order to avoid the inherent deficiencies of the traditional BP neural network, such as slow convergence, easy trapping in local minima, poor generalization ability, and difficulty in determining the network structure, a dynamic self-adaptive learning algorithm for the BP neural network is put forward. The new algorithm combines the merits of principal component analysis, particle swarm optimization, correlation analysis, and a self-adaptive model, and hence can effectively solve the problems of selecting the structural parameters, initial connection weights, thresholds, and learning rates of the BP neural network. The new algorithm not only reduces human intervention, optimizes the topological structure of BP neural networks, and improves network generalization ability, but also accelerates convergence, avoids trapping in local minima, and enhances network adaptation and prediction ability. The dynamic self-adaptive learning algorithm is used to forecast the total retail sales of consumer goods of Sichuan Province, China. Empirical results indicate that the new algorithm is superior to the traditional BP algorithm in prediction accuracy and time consumption, which shows the feasibility and effectiveness of the new algorithm.
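
    A minimal Python sketch of one ingredient named above, dynamic self-adaptation of the learning rate during BP training (grow it while the error falls, cut it when the error rises); the PCA, PSO, and correlation-analysis stages of the full algorithm are omitted, and all sizes and rates are illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.uniform(-1, 1, (200, 2))
      y = (np.sin(X[:, 0]) + X[:, 1] ** 2).reshape(-1, 1)   # toy target

      W1, b1 = rng.normal(0, 0.5, (2, 8)), np.zeros(8)
      W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
      lr, prev_err = 0.05, np.inf

      for epoch in range(500):
          h = np.tanh(X @ W1 + b1)             # forward pass
          out = h @ W2 + b2
          err = float(((out - y) ** 2).mean())
          # dynamic self-adaptation of the learning rate
          lr = lr * 1.05 if err < prev_err else lr * 0.5
          prev_err = err
          g_out = 2 * (out - y) / len(X)       # backward pass
          g_h = (g_out @ W2.T) * (1 - h ** 2)
          W2 -= lr * h.T @ g_out; b2 -= lr * g_out.sum(0)
          W1 -= lr * X.T @ g_h;   b1 -= lr * g_h.sum(0)

      print(round(err, 4))   # MSE shrinks steadily on this toy problem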

  14. A symbol-map wavelet zero-tree image coding algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Liu, Wenyao; Peng, Xiang; Liu, Xiaoli

    2008-03-01

    An improved SPIHT image compression algorithm, called the symbol-map zero-tree coding algorithm (SMZTC), is proposed in this paper based on the wavelet transform. The SPIHT algorithm is a highly efficient wavelet coefficient coding method with a good image compression effect, but it is rather complex and needs too much memory. The algorithm presented in this paper utilizes two small symbol maps, Mark and FC, to store the status of coefficients and zero-tree sets during the coding procedure so as to reduce the memory requirement. With this strategy, the memory cost is reduced distinctly and the scanning speed of coefficients is improved. Comparison experiments on 512 by 512 images were carried out against other zero-tree coding algorithms, such as SPIHT and the NLS method, using the biorthogonal 9/7 lifting wavelet transform. The results of the coding experiments show that the codec speed of this algorithm is improved significantly, while its compression ratio is almost identical to that of the SPIHT algorithm.

  15. Parallelized dilate algorithm for remote sensing image.

    PubMed

    Zhang, Suli; Hu, Haoran; Pan, Xin

    2014-01-01

    As an important algorithm, the dilate algorithm can give us a more connected view of a remote sensing image that has broken lines or objects. However, with the technological progress of satellite sensors, the resolution of remote sensing images has been increasing and their data quantities have become very large. This leads to slow algorithm execution, or to no result at all within limited memory or time. To solve this problem, our research proposes a parallelized dilate algorithm for remote sensing images based on MPI and OpenMP. Experiments show that our method runs faster than the traditional single-process algorithm.
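
    The paper targets MPI/OpenMP in a compiled setting; the Python sketch below illustrates the same chunk-parallel idea with multiprocessing: split the image into row bands, dilate each band with a one-row halo so results match the serial output, and reassemble.

      import numpy as np
      from multiprocessing import Pool
      from scipy.ndimage import binary_dilation

      R = 1   # default 3x3 cross element reaches 1 row up/down => halo = 1

      def dilate_band(args):
          band, top, bottom = args             # halo rows on each side
          out = binary_dilation(band)          # default 3x3 cross element
          return out[top: out.shape[0] - bottom]

      def parallel_dilate(img, nproc=4):
          bounds = np.linspace(0, img.shape[0], nproc + 1, dtype=int)
          jobs = []
          for a, b in zip(bounds[:-1], bounds[1:]):
              top = min(R, a)                  # halo rows actually available
              bottom = min(R, img.shape[0] - b)
              jobs.append((img[a - top: b + bottom], top, bottom))
          with Pool(nproc) as pool:
              return np.vstack(pool.map(dilate_band, jobs))

      if __name__ == "__main__":
          img = np.zeros((1000, 1000), dtype=bool)
          img[500, ::50] = True                # a broken dotted line
          assert np.array_equal(parallel_dilate(img), binary_dilation(img))
          print("parallel result matches the serial dilation")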

  16. Alternative learning algorithms for feedforward neural networks

    SciTech Connect

    Vitela, J.E.

    1996-03-01

    The efficiency of the back propagation algorithm for training feedforward multilayer neural networks has given rise to the erroneous belief among many neural network users that this is the only possible way to obtain the gradient of the error in this type of network. The purpose of this paper is to show how alternative algorithms can be obtained within the framework of ordered partial derivatives. Two alternative forward-propagating algorithms are derived in this work which are mathematically equivalent to the BP algorithm. This systematic way of obtaining learning algorithms, illustrated here with this particular type of neural network, can also be used with other types, such as recurrent neural networks.

  17. Calibration of FRESIM for Singapore expressway using genetic algorithm

    SciTech Connect

    Cheu, R.L.; Jin, X.; Srinivasa, D.; Ng, K.C.; Ng, Y.L.

    1998-11-01

    FRESIM is a microscopic time-stepping simulation model for freeway corridor traffic operations. To enable FRESIM to realistically simulate expressway traffic flow in Singapore, parameters that govern the movement of vehicles needed to be recalibrated for local traffic conditions. This paper presents the application of a genetic algorithm as an optimization method for finding a suitable combination of FRESIM parameter values. The calibration is based on field data collected on weekdays over a 5.8 km segment of the Ayer Rajar Expressway. Independent calibrations have been made for evening peak and midday off-peak traffic. The results show that the genetic algorithm is able to search for two sets of parameter values that enable FRESIM to produce 30-s loop-detector volume and speed (averaged across all lanes) closely matching the field data under two different traffic conditions. The two sets of parameter values are found to produce a consistently good match for data collected in different days.

  18. Simple multiscale algorithm for layer detection with lidar.

    PubMed

    Mao, Feiyue; Gong, Wei; Zhu, Zhongmin

    2011-12-20

    Lidar is a powerful active remote sensing device used in the detection of the optical properties of aerosols and clouds. However, layer detection and classification remain difficult: many previous methods are too complex for large-dataset analysis or are limited to data with a high signal-to-noise ratio (SNR). In this study, a mechanism of multiscale detection and overdetection rejection is proposed based on a trend index function that we define. Finally, we classify layers from connected layers using a quantity we call the threshold of the peak-to-base ratio. We find good consistency between results retrieved with our method and visual analysis. Testing on synthetic signals shows that our algorithm performs well at SNRs higher than 4. The results demonstrate that our algorithm is simple, practical, and suited to large-dataset applications.

  19. An improved piecewise linear chaotic map based image encryption algorithm.

    PubMed

    Hu, Yuping; Zhu, Congxu; Wang, Zhijian

    2014-01-01

    An image encryption algorithm based on an improved piecewise linear chaotic map (MPWLCM) model is proposed. The algorithm uses the MPWLCM to permute and diffuse the plain image simultaneously. Owing to the sensitivity to initial key values and system parameters, and the ergodicity of the chaotic system, two pseudorandom sequences are designed and used in the permutation and diffusion processes. Pixels are processed not in index order but alternately from the beginning and the end of the image, and cipher feedback is introduced in the diffusion process. Test results and security analysis show that the scheme achieves good encryption results and that its key space is large enough to resist brute-force attack.
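
    A minimal Python sketch assuming the standard piecewise linear chaotic map (PWLCM) as the keystream source; the paper's modified map, alternating pixel order, and cipher feedback are simplified here to a basic permute-then-XOR scheme.

      import numpy as np

      def pwlcm_step(x, p):
          """One iterate of the piecewise linear chaotic map."""
          if x < p:
              return x / p
          if x < 0.5:
              return (x - p) / (0.5 - p)
          return pwlcm_step(1.0 - x, p)        # symmetric upper branch

      def keystream(x0, p, n):
          x, out = x0, np.empty(n)
          for i in range(n):
              x = pwlcm_step(x, p)
              out[i] = x
          return out

      def encrypt(img, key=(0.123456, 0.3)):
          ks = keystream(*key, img.size)
          perm = np.argsort(ks)                        # chaotic permutation
          mask = np.floor(ks * 256).astype(np.uint8)   # chaotic byte mask
          return (img.ravel()[perm] ^ mask).reshape(img.shape)

      def decrypt(cipher, key=(0.123456, 0.3)):
          ks = keystream(*key, cipher.size)
          mask = np.floor(ks * 256).astype(np.uint8)
          flat = np.empty(cipher.size, dtype=np.uint8)
          flat[np.argsort(ks)] = cipher.ravel() ^ mask  # undo XOR, then perm
          return flat.reshape(cipher.shape)

      img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
      assert np.array_equal(decrypt(encrypt(img)), img)
      print("round trip OK")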

  20. ShowMe3D

    SciTech Connect

    Sinclair, Michael B

    2012-01-05

    ShowMe3D is a data visualization graphical user interface specifically designed for use with hyperspectral images obtained from the Hyperspectral Confocal Microscope. The program allows the user to select and display any single image from a three dimensional hyperspectral image stack. By moving a slider control, the user can easily move between images of the stack. The user can zoom into any region of the image. The user can select any pixel or region from the displayed image and display the fluorescence spectrum associated with that pixel or region. The user can define up to 3 spectral filters to apply to the hyperspectral image and view the image as it would appear from a filter-based confocal microscope. The user can also obtain statistics such as intensity average and variance from selected regions.

  1. "Show me" bioethics and politics.

    PubMed

    Christopher, Myra J

    2007-10-01

    Missouri, the "Show Me State," has become the epicenter of several important national public policy debates, including abortion rights, the right to choose and refuse medical treatment, and, most recently, early stem cell research. In this environment, the Center for Practical Bioethics (formerly, Midwest Bioethics Center) emerged and grew. The Center's role in these "cultural wars" is not to advocate for a particular position but to provide well researched and objective information, perspective, and advocacy for the ethical justification of policy positions; and to serve as a neutral convener and provider of a public forum for discussion. In this article, the Center's work on early stem cell research is a case study through which to argue that not only the Center, but also the field of bioethics has a critical role in the politics of public health policy.

  2. ShowMe3D

    2012-01-05

    ShowMe3D is a data visualization graphical user interface specifically designed for use with hyperspectral images obtained from the Hyperspectral Confocal Microscope. The program allows the user to select and display any single image from a three dimensional hyperspectral image stack. By moving a slider control, the user can easily move between images of the stack. The user can zoom into any region of the image. The user can select any pixel or region from the displayed image and display the fluorescence spectrum associated with that pixel or region. The user can define up to 3 spectral filters to apply to the hyperspectral image and view the image as it would appear from a filter-based confocal microscope. The user can also obtain statistics such as intensity average and variance from selected regions.

  3. Goode Gym Energy Renovation Project

    SciTech Connect

    Coleman, Andrena

    2014-12-11

    The Ida H. Goode Gymnasium was constructed in 1964 to serve as a focal point for academics, student recreation, and health and wellness activities. This 38,000 SF building contains a gymnasium with a stage, swimming pool, eight classrooms, a weight room, six offices and auxiliary spaces for the athletic programs. The gym is located on a 4-acre greenfield, which is slated for improvement and enhancement of the future athletics program at Bennett College. The available funding for this project was used to weatherize the envelope of the gymnasium, install a new energy-efficient mechanical system, and retrofit the existing lighting systems in the building’s interior. The envelope weatherization was completed without disturbing the building’s historic preservation eligibility. The existing heating system was replaced with a new high-efficiency condensing system. The new heating system also includes a new Building Automation System, which provides additional monitoring; proper usage of this system will provide additional energy savings. Most of the existing interior lighting fixtures and bulbs were replaced with new LED and high-efficiency T-8 bulbs and fixtures, and occupancy sensors were installed in applicable areas. The Ida Goode Gymnasium should experience large electricity and natural gas savings as well as operational and maintenance efficiency increases. The aesthetics of the building were maintained and the overall safety was improved.

  4. One of the Good Guys

    SciTech Connect

    Wiley, H. S.

    2010-10-01

    I was talking with some younger colleagues at a meeting last month when the subject of career goals came up. These colleagues were successful in that they had recently received tenure at top research universities and had some grants and good students. Thus, the early career pressure to simply survive was gone. So now what motivated them? Solving challenging and significant scientific problems was at the top of their lists. Interestingly, they were also motivated by a desire to become one of the “good guys” in science. The fact that being an important contributor to the scientific community can be fulfilling should not come as a surprise to anyone. However, what I do consider surprising is how rarely this seems to be discussed with students and postdocs. What we do discuss are either those issues that are fundamental aspects of the job (get a grant, get tenure, do research in an important field) or those that are important to our institutions. Knowing how to do our jobs well is indeed essential for any kind of professional success. However, achieving the right balance in our ambitions is also important for our happiness.

  5. Going public: good scientific conduct.

    PubMed

    Meyer, Gitte; Sandøe, Peter

    2012-06-01

    The paper addresses issues of scientific conduct regarding relations between science and the media, relations between scientists and journalists, and attitudes towards the public at large. In the large and increasing body of literature on scientific conduct and misconduct, these issues seem underexposed as ethical challenges. Consequently, individual scientists here tend to be left alone with problems and dilemmas, with no guidance for good conduct. Ideas are presented about how to make up for this omission. Using a practical, ethical approach, the paper attempts to identify ways scientists might deal with ethical public relations issues, guided by a norm or maxim of openness. Drawing on and rethinking the CUDOS codification of the scientific ethos, as it was worked out by Robert K. Merton in 1942, we propose that this, which is echoed in current codifications of norms for good scientific conduct, contains a tacit maxim of openness which may naturally be extended to cover the public relations of science. Discussing openness as access, accountability, transparency and receptiveness, the argumentation concentrates on the possible prevention of misconduct with respect to, on the one hand, sins of omission-withholding important information from the public-and, on the other hand, abuses of the authority of science in order to gain publicity. Statements from interviews with scientists are used to illustrate how scientists might view the relevance of the issues raised.

  6. Good-enough linguistic representations and online cognitive equilibrium in language processing.

    PubMed

    Karimi, Hossein; Ferreira, Fernanda

    2016-01-01

    We review previous research showing that representations formed during language processing are sometimes just "good enough" for the task at hand and propose the "online cognitive equilibrium" hypothesis as the driving force behind the formation of good-enough representations in language processing. Based on this view, we assume that the language comprehension system by default prefers to achieve as early as possible and remain as long as possible in a state of cognitive equilibrium where linguistic representations are successfully incorporated with existing knowledge structures (i.e., schemata) so that a meaningful and coherent overall representation is formed, and uncertainty is resolved or at least minimized. We also argue that the online equilibrium hypothesis is consistent with current theories of language processing, which maintain that linguistic representations are formed through a complex interplay between simple heuristics and deep syntactic algorithms and also theories that hold that linguistic representations are often incomplete and lacking in detail. We also propose a model of language processing that makes use of both heuristic and algorithmic processing, is sensitive to online cognitive equilibrium, and, we argue, is capable of explaining the formation of underspecified representations. We review previous findings providing evidence for underspecification in relation to this hypothesis and the associated language processing model and argue that most of these findings are compatible with them.

  7. Library of Continuation Algorithms

    2005-03-01

    LOCA (Library of Continuation Algorithms) is scientific software written in C++ that provides advanced analysis tools for nonlinear systems. In particular, it provides parameter continuation algorithms, bifurcation tracking algorithms, and drivers for linear stability analysis. The algorithms are aimed at large-scale applications that use Newton’s method for their nonlinear solve.
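
    LOCA itself is C++; the Python sketch below only illustrates what the simplest parameter continuation does: trace a solution branch x(λ) of f(x, λ) = 0 by stepping the parameter and re-solving with Newton's method, warm-started from the previous solution.

      def newton(f, df, x, lam, tol=1e-10, max_it=50):
          """Solve f(x, lam) = 0 for x by Newton's method."""
          for _ in range(max_it):
              step = f(x, lam) / df(x, lam)
              x -= step
              if abs(step) < tol:
                  return x
          raise RuntimeError("Newton did not converge")

      # Example branch with a fold near lam ~ 0.385: f(x) = x^3 - x + lam.
      f = lambda x, lam: x**3 - x + lam
      df = lambda x, lam: 3 * x**2 - 1

      x, branch = 1.0, []
      for i in range(35):                      # natural continuation in lam
          lam = 0.01 * i
          x = newton(f, df, x, lam)            # previous solution as predictor
          branch.append((lam, x))
      print(branch[-1])                        # point on the upper branch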

  8. Algorithmic causets

    NASA Astrophysics Data System (ADS)

    Bolognesi, Tommaso

    2011-07-01

    In the context of quantum gravity theories, several researchers have proposed causal sets as appropriate discrete models of spacetime. We investigate families of causal sets obtained from two simple models of computation - 2D Turing machines and network mobile automata - that operate on 'high-dimensional' supports, namely 2D arrays of cells and planar graphs, respectively. We study a number of quantitative and qualitative emergent properties of these causal sets, including dimension, curvature and localized structures, or 'particles'. We show how the possibility to detect and separate particles from background space depends on the choice between a global or local view at the causal set. Finally, we spot very rare cases of pseudo-randomness, or deterministic chaos; these exhibit a spontaneous phenomenon of 'causal compartmentation' that appears as a prerequisite for the occurrence of anything of physical interest in the evolution of spacetime.

  9. GOES-West Shows U.S. West's Record Rainfall

    NASA Video Gallery

    A new time-lapse animation of data from NOAA's GOES-West satellite provides a good picture of why the U.S. West Coast continues to experience record rainfall. The new animation shows the movement o...

  10. Bunkhouse basement interior showing storage area and a conveyor belt ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Bunkhouse basement interior showing storage area and a conveyor belt (circa 1936) used to unload dry goods into the basement through an opening on the east side of the bunkhouse. - Sespe Ranch, Bunkhouse, 2896 Telegraph Road, Fillmore, Ventura County, CA

  11. Multikernel least mean square algorithm.

    PubMed

    Tobar, Felipe A; Kung, Sun-Yuan; Mandic, Danilo P

    2014-02-01

    The multikernel least-mean-square algorithm is introduced for adaptive estimation of vector-valued nonlinear and nonstationary signals. This is achieved by mapping the multivariate input data to a Hilbert space of time-varying vector-valued functions, whose inner products (kernels) are combined in an online fashion. The proposed algorithm is equipped with novel adaptive sparsification criteria ensuring a finite dictionary, and is computationally efficient and suitable for nonstationary environments. We also show the ability of the proposed vector-valued reproducing kernel Hilbert space to serve as a feature space for the class of multikernel least-squares algorithms. The benefits of adaptive multikernel (MK) estimation algorithms are illuminated in the nonlinear multivariate adaptive prediction setting. Simulations on nonlinear inertial body sensor signals and nonstationary real-world wind signals of low, medium, and high dynamic regimes support the approach. PMID:24807027
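
    A much-reduced Python sketch of the multikernel idea: a kernel LMS filter whose prediction combines two Gaussian kernels per dictionary atom, updated online from the instantaneous error. The paper's sparsification criteria and vector-valued RKHS machinery are omitted, and all widths and step sizes are illustrative.

      import numpy as np

      def gauss(x, c, s):
          return np.exp(-np.sum((x - c) ** 2) / (2 * s ** 2))

      def mklms(X, y, sigmas=(0.5, 2.0), eta=0.2):
          centers, alphas = [], []             # growing kernel dictionary
          pred = np.zeros(len(y))
          for n, x in enumerate(X):
              pred[n] = sum(a @ np.array([gauss(x, c, s) for s in sigmas])
                            for a, c in zip(alphas, centers))
              err = y[n] - pred[n]
              centers.append(x.copy())         # no sparsification in sketch
              alphas.append(eta * err * np.ones(len(sigmas)))
          return pred

      rng = np.random.default_rng(1)
      X = rng.uniform(-3, 3, (400, 1))
      y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(400)
      pred = mklms(X, y)
      print(np.mean((pred[300:] - y[300:]) ** 2))   # late-stage prediction MSE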

  12. The Origins of Counting Algorithms

    PubMed Central

    Cantlon, Jessica F.; Piantadosi, Steven T.; Ferrigno, Stephen; Hughes, Kelly D.; Barnard, Allison M.

    2015-01-01

    Humans’ ability to ‘count’ by verbally labeling discrete quantities is unique in animal cognition. The evolutionary origins of counting algorithms are not understood. We report that non-human primates exhibit a cognitive ability that is algorithmically and logically similar to human counting. Monkeys were given the task of choosing between two food caches. Monkeys saw one cache baited with some number of food items, one item at a time. Then, a second cache was baited with food items, one at a time. At the point when the second set approximately outnumbered the first set, monkeys spontaneously moved to choose the second set even before it was completely baited. Using a novel Bayesian analysis, we show that monkeys used an approximate counting algorithm to increment and compare quantities in sequence. This algorithm is structurally similar to formal counting in humans and thus may have been an important evolutionary precursor to human counting. PMID:25953949

  13. Intelligent perturbation algorithms for space scheduling optimization

    NASA Technical Reports Server (NTRS)

    Kurtzman, Clifford R.

    1991-01-01

    Intelligent perturbation algorithms for space scheduling optimization are presented in the form of the viewgraphs. The following subject areas are covered: optimization of planning, scheduling, and manifesting; searching a discrete configuration space; heuristic algorithms used for optimization; use of heuristic methods on a sample scheduling problem; intelligent perturbation algorithms are iterative refinement techniques; properties of a good iterative search operator; dispatching examples of intelligent perturbation algorithm and perturbation operator attributes; scheduling implementations using intelligent perturbation algorithms; major advances in scheduling capabilities; the prototype ISF (industrial Space Facility) experiment scheduler; optimized schedule (max revenue); multi-variable optimization; Space Station design reference mission scheduling; ISF-TDRSS command scheduling demonstration; and example task - communications check.

  14. Natural gradient learning algorithms for RBF networks.

    PubMed

    Zhao, Junsheng; Wei, Haikun; Zhang, Chi; Li, Weiling; Guo, Weili; Zhang, Kanjian

    2015-02-01

    Radial basis function (RBF) networks are one of the most widely used models for function approximation and classification. There are many strange behaviors in the learning process of RBF networks, such as slow learning speed and the existence of plateaus. The natural gradient learning method can overcome these disadvantages effectively: it can accelerate the dynamics of learning and avoid plateaus. In this letter, we assume that the probability density function (pdf) of the input and the activation function are Gaussian. First, we introduce natural gradient learning to RBF networks and give the explicit forms of the Fisher information matrix and its inverse. Second, since it is difficult to calculate the Fisher information matrix and its inverse when the number of hidden units and the dimension of the input are large, we introduce an adaptive method to the natural gradient learning algorithms. Finally, we give an explicit form of the adaptive natural gradient learning algorithm and compare it to the conventional gradient descent method. Simulations show that the proposed adaptive natural gradient method, which can avoid the plateaus effectively, has good performance when RBF networks are used for nonlinear function approximation. PMID:25380332

  15. Speech Enhancement based on Compressive Sensing Algorithm

    NASA Astrophysics Data System (ADS)

    Sulong, Amart; Gunawan, Teddy S.; Khalifa, Othman O.; Chebil, Jalel

    2013-12-01

    Various methods for speech enhancement have been proposed over the years; accurate designs focus mainly on quality and intelligibility. Compressive sensing (CS) is a new paradigm for acquiring signals, fundamentally different from uniform-rate digitization followed by compression, which is often used for transmission or storage. CS reduces the number of degrees of freedom of a sparse or compressible signal by permitting only certain configurations of large and zero/small coefficients and structured sparsity models. CS therefore provides a way of reconstructing a compressed version of the speech in the original signal from only a small number of linear, non-adaptive measurements. The performance of the overall algorithm is evaluated in terms of speech quality using informal listening tests and the Perceptual Evaluation of Speech Quality (PESQ). Experimental results show that the CS algorithm performs very well over a wide range of speech tests, giving good speech enhancement with better noise suppression than conventional approaches and no obvious degradation of speech quality.
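
    The abstract does not name the reconstruction algorithm; the Python sketch below uses orthogonal matching pursuit (OMP), a standard CS solver, to recover a sparse vector from a small number of random linear measurements, which is the recovery step any such enhancement scheme relies on.

      import numpy as np

      rng = np.random.default_rng(0)
      n, m, k = 256, 64, 5              # signal length, measurements, sparsity
      x = np.zeros(n)
      x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
      Phi = rng.standard_normal((m, n)) / np.sqrt(m)
      y = Phi @ x                       # small set of linear measurements

      def omp(Phi, y, k):
          residual, support = y.copy(), []
          for _ in range(k):
              j = int(np.argmax(np.abs(Phi.T @ residual)))   # best new atom
              support.append(j)
              coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
              residual = y - Phi[:, support] @ coef
          xhat = np.zeros(Phi.shape[1])
          xhat[support] = coef
          return xhat

      print(np.linalg.norm(omp(Phi, y, k) - x))   # ~0: exact recovery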

  16. Pea Plants Show Risk Sensitivity.

    PubMed

    Dener, Efrat; Kacelnik, Alex; Shemesh, Hagai

    2016-07-11

    Sensitivity to variability in resources has been documented in humans, primates, birds, and social insects, but the fit between empirical results and the predictions of risk sensitivity theory (RST), which aims to explain this sensitivity in adaptive terms, is weak [1]. RST predicts that agents should switch between risk proneness and risk aversion depending on state and circumstances, especially according to the richness of the least variable option [2]. Unrealistic assumptions about agents' information processing mechanisms and poor knowledge of the extent to which variability imposes specific selection in nature are strong candidates to explain the gap between theory and data. RST's rationale also applies to plants, where it has not hitherto been tested. Given the differences between animals' and plants' information processing mechanisms, such tests should help unravel the conflicts between theory and data. Measuring root growth allocation by split-root pea plants, we show that they favor variability when mean nutrient levels are low and the opposite when they are high, supporting the most widespread RST prediction. However, the combination of non-linear effects of nitrogen availability at local and systemic levels may explain some of these effects as a consequence of mechanisms not necessarily evolved to cope with variance [3, 4]. This resembles animal examples in which properties of perception and learning cause risk sensitivity even though they are not risk adaptations [5]. PMID:27374342

  17. Casimir experiments showing saturation effects

    SciTech Connect

    Sernelius, Bo E.

    2009-10-15

    We address several different Casimir experiments where theory and experiment disagree. First out is the classical Casimir force measurement between two metal half spaces; here both in the form of the torsion pendulum experiment by Lamoreaux and in the form of the Casimir pressure measurement between a gold sphere and a gold plate as performed by Decca et al.; theory predicts a large negative thermal correction, absent in the high precision experiments. The third experiment is the measurement of the Casimir force between a metal plate and a laser irradiated semiconductor membrane as performed by Chen et al.; the change in force with laser intensity is larger than predicted by theory. The fourth experiment is the measurement of the Casimir force between an atom and a wall in the form of the measurement by Obrecht et al. of the change in oscillation frequency of a {sup 87}Rb Bose-Einstein condensate trapped to a fused silica wall; the change is smaller than predicted by theory. We show that saturation effects can explain the discrepancies between theory and experiment observed in all these cases.

  18. A fast learning algorithm for deep belief nets.

    PubMed

    Hinton, Geoffrey E; Osindero, Simon; Teh, Yee-Whye

    2006-07-01

    We show how to use "complementary priors" to eliminate the explaining-away effects that make inference difficult in densely connected belief nets that have many hidden layers. Using complementary priors, we derive a fast, greedy algorithm that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory. The fast, greedy algorithm is used to initialize a slower learning procedure that fine-tunes the weights using a contrastive version of the wake-sleep algorithm. After fine-tuning, a network with three hidden layers forms a very good generative model of the joint distribution of handwritten digit images and their labels. This generative model gives better digit classification than the best discriminative learning algorithms. The low-dimensional manifolds on which the digits lie are modeled by long ravines in the free-energy landscape of the top-level associative memory, and it is easy to explore these ravines by using the directed connections to display what the associative memory has in mind.
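
    A minimal Python sketch of the layer-wise building block behind this algorithm: a restricted Boltzmann machine trained with one step of contrastive divergence (CD-1). A DBN greedily stacks such layers; the data and sizes below are toy stand-ins for the digit images.

      import numpy as np

      rng = np.random.default_rng(0)
      sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

      V, H, lr = 36, 16, 0.1                    # visible units, hidden units
      W = rng.normal(0, 0.01, (V, H))
      a, b = np.zeros(V), np.zeros(H)           # visible / hidden biases
      data = (rng.random((500, V)) < 0.2).astype(float)   # toy binary data

      for epoch in range(30):
          for v0 in data:
              ph0 = sigmoid(v0 @ W + b)                    # up
              h0 = (rng.random(H) < ph0).astype(float)     # sample hiddens
              pv1 = sigmoid(h0 @ W.T + a)                  # down: reconstruct
              ph1 = sigmoid(pv1 @ W + b)                   # up again
              W += lr * (np.outer(v0, ph0) - np.outer(pv1, ph1))
              a += lr * (v0 - pv1)
              b += lr * (ph0 - ph1)

      recon = sigmoid(sigmoid(data @ W + b) @ W.T + a)
      print(np.mean((data - recon) ** 2))       # reconstruction error is small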

  19. Effective Memetic Algorithms for VLSI design = Genetic Algorithms + local search + multi-level clustering.

    PubMed

    Areibi, Shawki; Yang, Zhen

    2004-01-01

    Combining global and local search is a strategy used by many successful hybrid optimization approaches. Memetic Algorithms (MAs) are Evolutionary Algorithms (EAs) that apply some sort of local search to further improve the fitness of individuals in the population. Memetic Algorithms have been shown to be very effective in solving many hard combinatorial optimization problems. This paper provides a forum for identifying and exploring the key issues that affect the design and application of Memetic Algorithms. The approach combines a hierarchical design technique, Genetic Algorithms, constructive techniques and advanced local search to solve VLSI circuit layout in the form of circuit partitioning and placement. Results obtained indicate that Memetic Algorithms based on local search, clustering and good initial solutions improve solution quality on average by 35% for the VLSI circuit partitioning problem and 54% for the VLSI standard cell placement problem. PMID:15355604
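
    A minimal Python sketch of the memetic recipe (GA plus local search): every offspring is improved by greedy bit-flip local search before joining a steady-state population. The toy objective stands in for the paper's VLSI partitioning and placement costs.

      import random

      random.seed(1)
      N = 24

      def score(s):        # toy objective standing in for a placement cost
          agree = sum(s[i] == s[(i + 1) % N] for i in range(N))
          return agree - 2 * abs(sum(s) - N // 2)          # balance penalty

      def local_search(s):
          best, improved = score(s), True
          while improved:
              improved = False
              for i in range(N):
                  s[i] ^= 1                                # try flipping bit i
                  if score(s) > best:
                      best, improved = score(s), True      # keep the flip
                  else:
                      s[i] ^= 1                            # revert
          return s

      def memetic(pop_size=20, gens=50):
          pop = [local_search([random.randint(0, 1) for _ in range(N)])
                 for _ in range(pop_size)]
          for _ in range(gens):
              a, b = random.sample(pop, 2)
              cut = random.randrange(1, N)
              child = a[:cut] + b[cut:]                    # crossover
              if random.random() < 0.3:
                  child[random.randrange(N)] ^= 1          # mutation
              child = local_search(child)                  # the memetic step
              worst = min(range(pop_size), key=lambda i: score(pop[i]))
              if score(child) > score(pop[worst]):
                  pop[worst] = child                       # steady-state replace
          return max(pop, key=score)

      best = memetic()
      print(score(best), best)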

  20. Effective Memetic Algorithms for VLSI design = Genetic Algorithms + local search + multi-level clustering.

    PubMed

    Areibi, Shawki; Yang, Zhen

    2004-01-01

    Combining global and local search is a strategy used by many successful hybrid optimization approaches. Memetic Algorithms (MAs) are Evolutionary Algorithms (EAs) that apply some sort of local search to further improve the fitness of individuals in the population. Memetic Algorithms have been shown to be very effective in solving many hard combinatorial optimization problems. This paper provides a forum for identifying and exploring the key issues that affect the design and application of Memetic Algorithms. The approach combines a hierarchical design technique, Genetic Algorithms, constructive techniques and advanced local search to solve VLSI circuit layout in the form of circuit partitioning and placement. Results obtained indicate that Memetic Algorithms based on local search, clustering and good initial solutions improve solution quality on average by 35% for the VLSI circuit partitioning problem and 54% for the VLSI standard cell placement problem.

  1. What are narratives good for?

    PubMed

    Beatty, John

    2016-08-01

    Narratives may be easy to come by, but not everything is worth narrating. What merits a narrative? Here, I follow the lead of narratologists and literary theorists, and focus on one particular proposal concerning the elements of a story that make it narrative-worthy. These elements correspond to features of the natural world addressed by the historical sciences, where narratives figure so prominently. What matters is contingency. Narratives are especially good for representing contingency and accounting for contingent outcomes. This will be squared with a common view that narratives leave no room for chance. On the contrary, I will argue, tracing one path through a maze of alternative possibilities, and alluding to those possibilities along the way, is what a narrative does particularly well.

  2. 'The good of the child'

    PubMed

    Warnock, Mary

    1987-04-01

    Warnock, chair of Britain's Committee of Inquiry into Human Fertilisation and Embryology, discusses the implications of the "artificial family" for children born through the use of reproductive technologies. She considers both treatment of infertility and the possible use of assisted reproduction to enable persons other than infertile couples, such as single persons and homosexuals, to have children. Warnock has found that emphasis has been placed on the wants and well-being of the adult(s) involved, and that the "good of the child" is a "wide and vague concept, widely invoked, not always plausibly." She is particularly concerned about children born as a result of the delayed implantation of frozen embryos, AID children who are deceived about their origins, and children born of surrogate pregnancies. She recommends that a detailed study of existing "artificial family" children be conducted to aid public policy decisions on assisted reproduction.

  3. Coordinating towards a Common Good

    NASA Astrophysics Data System (ADS)

    Santos, Francisco C.; Pacheco, Jorge M.

    2010-09-01

    Throughout their life, humans often engage in collective endeavors ranging from family related issues to global warming. In all cases, the tragedy of the commons threatens the possibility of reaching the optimal solution associated with global cooperation, a scenario predicted by theory and demonstrated by many experiments. Using the toolbox of evolutionary game theory, I will address two important aspects of evolutionary dynamics that have been neglected so far in the context of public goods games and evolution of cooperation. On one hand, the fact that often there is a threshold above which a public good is reached [1, 2]. On the other hand, the fact that individuals often participate in several games, related to the their social context and pattern of social ties, defined by a social network [3, 4, 5]. In the first case, the existence of a threshold above which collective action is materialized dictates a rich pattern of evolutionary dynamics where the direction of natural selection can be inverted compared to standard expectations. Scenarios of defector dominance, pure coordination or coexistence may arise simultaneously. Both finite and infinite population models are analyzed. In networked games, cooperation blooms whenever the act of contributing is more important than the effort contributed. In particular, the heterogeneous nature of social networks naturally induces a symmetry breaking of the dilemmas of cooperation, as contributions made by cooperators may become contingent on the social context in which the individual is embedded. This diversity in context provides an advantage to cooperators, which is particularly strong when both wealth and social ties follow a power-law distribution, providing clues on the self-organization of social communities. Finally, in both situations, it can be shown that individuals no longer play a defection dominance dilemma, but effectively engage in a general N-person coordination game. Even if locally defection may seem

  4. Dynamic Programming Algorithm vs. Genetic Algorithm: Which is Faster?

    NASA Astrophysics Data System (ADS)

    Petković, Dušan

    The article compares two different approaches to the optimization problem of large join queries (LJQs). Almost all commercial database systems use a form of the dynamic programming algorithm to solve the ordering of join operations for large join queries, i.e., joins with more than a dozen join operations. A property of the dynamic programming algorithm is that its execution time increases significantly when the number of join operations in a query is large. Genetic algorithms (GAs), as a data mining technique, have been shown to be promising for solving the ordering of join operations in LJQs. Using an existing implementation of a GA, we compare the dynamic programming algorithm implemented in commercial database systems with the corresponding GA module. Our results show that the use of a genetic algorithm is the better solution for the optimization of large join queries, i.e., that such a technique outperforms the implementations of the dynamic programming algorithm in conventional query optimization components for very large join queries.
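
    A minimal Python sketch of the dynamic programming side of the comparison: Selinger-style enumeration over subsets of relations with a toy cost model. Its exponential subset table is exactly what makes DP expensive for large join queries; the cardinalities and the 0.1 selectivity are illustrative.

      from itertools import combinations

      card = {"A": 1000, "B": 100, "C": 10, "D": 1}   # toy cardinalities

      def proper_splits(s):
          items = sorted(s)
          for r in range(1, len(items)):
              for left in combinations(items, r):
                  yield frozenset(left)

      def best_plan(rels):
          """best[S] = (cost, plan, output cardinality) for subset S."""
          best = {frozenset([r]): (0, r, card[r]) for r in rels}
          for size in range(2, len(rels) + 1):
              for sub in map(frozenset, combinations(sorted(rels), size)):
                  for left in proper_splits(sub):
                      lc, lp, ln = best[left]
                      rc, rp, rn = best[sub - left]
                      out = ln * rn // 10            # toy selectivity of 0.1
                      cost = lc + rc + out
                      if sub not in best or cost < best[sub][0]:
                          best[sub] = (cost, (lp, rp), out)
          return best[frozenset(rels)]

      cost, plan, _ = best_plan(card)
      print(cost, plan)   # cheapest join order under the toy model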

  5. Broadband and Broad-Angle Low-Scattering Metasurface Based on Hybrid Optimization Algorithm

    PubMed Central

    Wang, Ke; Zhao, Jie; Cheng, Qiang; Dong, Di Sha; Cui, Tie Jun

    2014-01-01

    A broadband and broad-angle low-scattering metasurface is designed, fabricated, and characterized. Based on the hybrid optimization algorithm and far-field scattering pattern analysis, we propose a rapid and efficient method to design metasurfaces that avoids a large number of time-consuming electromagnetic simulations. Full-wave simulation and measurement results show that the proposed metasurface is insensitive to the polarization of incident waves and presents good scattering-reduction properties for obliquely incident waves. PMID:25089367

  6. Mimas Showing False Colors #1

    NASA Technical Reports Server (NTRS)

    2005-01-01

    False color images of Saturn's moon, Mimas, reveal variation in either the composition or texture across its surface.

    During its approach to Mimas on Aug. 2, 2005, the Cassini spacecraft narrow-angle camera obtained multi-spectral views of the moon from a range of 228,000 kilometers (142,500 miles).

    The image at the left is a narrow angle clear-filter image, which was separately processed to enhance the contrast in brightness and sharpness of visible features. The image at the right is a color composite of narrow-angle ultraviolet, green, infrared and clear filter images, which have been specially processed to accentuate subtle changes in the spectral properties of Mimas' surface materials. To create this view, three color images (ultraviolet, green and infrared) were combined into a single black and white picture that isolates and maps regional color differences. This 'color map' was then superimposed over the clear-filter image at the left.

    The combination of color map and brightness image shows how the color differences across the Mimas surface materials are tied to geological features. Shades of blue and violet in the image at the right are used to identify surface materials that are bluer in color and have a weaker infrared brightness than average Mimas materials, which are represented by green.

    Herschel crater, a 140-kilometer-wide (88-mile) impact feature with a prominent central peak, is visible in the upper right of each image. The unusual bluer materials are seen to broadly surround Herschel crater. However, the bluer material is not uniformly distributed in and around the crater. Instead, it appears to be concentrated on the outside of the crater and more to the west than to the north or south. The origin of the color differences is not yet understood. It may represent ejecta material that was excavated from inside Mimas when the Herschel impact occurred. The bluer color of these materials may be caused by subtle differences in

  7. Toward a theoretically based measurement model of the good life.

    PubMed

    Cheung, C K

    1997-06-01

    A theoretically based conceptualization of the good life should differentiate 4 dimensions-the hedonist good life, the dialectical good life, the humanist good life, and the formalist good life. These 4 dimensions incorporate previous fragmentary measures, such as life satisfaction, depression, work alienation, and marital satisfaction, to produce an integrative view. In the present study, 276 Hong Kong Chinese husbands and wives responded to a survey of 13 indicators for these 4 good life dimensions. Confirmatory hierarchical factor analysis showed that these indicators identified the 4 dimensions of the good life, which in turn converged to identify a second-order factor of the overall good life. The model demonstrates discriminant validity in that the first-order factors had high loadings on the overall good life factor despite being linked by a social desirability factor. Analysis further showed that the second-order factor model applied equally well to husbands and wives. Thus, the conceptualization appears to be theoretically and empirically adequate in incorporating previous conceptualizations of the good life. PMID:9168589

  8. Maximizing Submodular Functions under Matroid Constraints by Evolutionary Algorithms.

    PubMed

    Friedrich, Tobias; Neumann, Frank

    2015-01-01

    Many combinatorial optimization problems have underlying goal functions that are submodular. The classical goal is to find a good solution for a given submodular function f under a given set of constraints. In this paper, we investigate the runtime of a simple single-objective evolutionary algorithm called the (1 + 1) EA and a multiobjective evolutionary algorithm called GSEMO until they have obtained a good approximation for submodular functions. For the case of monotone submodular functions and uniform cardinality constraints, we show that the GSEMO achieves a (1 - 1/e)-approximation in expected polynomial time. For the case of monotone functions where the constraints are given by the intersection of k ≥ 2 matroids, we show that the (1 + 1) EA achieves a (1/(k + δ))-approximation in expected polynomial time for any constant δ > 0. Turning to nonmonotone symmetric submodular functions with k ≥ 1 matroid intersection constraints, we show that the GSEMO achieves a 1/((k + 2)(1 + ε))-approximation in expected time O(n^(k+6) log(n)/ε).
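
    A minimal Python sketch of the (1 + 1) EA analyzed above, run on a monotone submodular function (set coverage) under a uniform cardinality constraint, with infeasible offspring rejected; the instance is illustrative.

      import random

      random.seed(0)
      universe = range(50)
      sets = [set(random.sample(universe, 12)) for _ in range(20)]
      B = 4                                     # uniform cardinality bound

      def coverage(bits):                       # monotone submodular f
          covered = set()
          for s, b in zip(sets, bits):
              if b:
                  covered |= s
          return len(covered)

      def one_plus_one_ea(iters=20000, n=len(sets)):
          x = [0] * n                           # feasible empty start
          for _ in range(iters):
              y = [b ^ (random.random() < 1.0 / n) for b in x]  # flip bits
              if sum(y) <= B and coverage(y) >= coverage(x):
                  x = y                         # accept non-worsening moves
          return x

      x = one_plus_one_ea()
      print(coverage(x), [i for i, b in enumerate(x) if b])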

  9. A new method for mesoscale eddy detection based on watershed segmentation algorithm

    NASA Astrophysics Data System (ADS)

    Qin, Lijuan; Dong, Qing; Xue, Cunjin; Hou, Xueyan; Song, Wanjiao

    2014-11-01

    Mesoscale eddies are widely found in the ocean. They play important roles in heat transport, momentum transport, ocean circulation and so on. The automatic detection of mesoscale eddies from satellite remote sensing images is an important research topic. Some image processing methods, such as the Canny operator and the Hough transform, have been applied to identify mesoscale eddies, but the detection accuracy was not ideal. This paper describes a new algorithm based on the watershed segmentation algorithm for the automatic detection of mesoscale eddies from sea level anomaly (SLA) images. The watershed segmentation algorithm has the disadvantage of over-segmentation, so it is important to select appropriate markers. In this study, markers were selected from the reconstructed SLA image and used to modify the gradient image. Then two parameters, the radius and amplitude of an eddy, were used to filter the segmentation results. The method was tested on the Northwest Pacific using TOPEX/Poseidon altimeter data. The results are encouraging, showing that this algorithm is applicable to mesoscale eddies and has good accuracy. The algorithm responds well to weak edges, and extracted eddies have complete and continuous boundaries that generally coincide with closed contours of SSH.
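
    A minimal Python sketch of marker-controlled watershed on a synthetic SLA field, with scikit-image's watershed standing in for the paper's implementation: minima of the smoothed field seed the markers, the gradient image is flooded, and a simple amplitude check stands in for the radius/amplitude filtering.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.segmentation import watershed

      y, x = np.mgrid[0:128, 0:128]
      sla = (-0.3 * np.exp(-((x - 40) ** 2 + (y - 60) ** 2) / 120)
             - 0.2 * np.exp(-((x - 90) ** 2 + (y - 30) ** 2) / 80)
             + 0.02 * np.random.default_rng(0).standard_normal((128, 128)))

      smooth = ndi.gaussian_filter(sla, 3)               # reconstructed SLA
      local_min = smooth == ndi.minimum_filter(smooth, size=15)
      markers, n = ndi.label(local_min & (smooth < -0.05))   # eddy seeds
      grad = ndi.gaussian_gradient_magnitude(sla, 3)
      labels = watershed(grad, markers)                  # flood gradient image

      for k in range(1, n + 1):
          amp = smooth[labels == k].min()
          if amp < -0.1:                                 # amplitude filter
              print(f"cyclonic eddy {k}: amplitude {amp:.2f} m")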

  10. A synthesized heuristic task scheduling algorithm.

    PubMed

    Dai, Yanyan; Zhang, Xiangli

    2014-01-01

    Aiming at static task scheduling problems in heterogeneous environments, a heuristic task scheduling algorithm named HCPPEFT is proposed. In the task prioritizing phase, the algorithm uses three levels of priority to choose tasks: critical tasks have the highest priority, then tasks with a longer path to the exit task are selected, and finally tasks with fewer predecessors are chosen for scheduling. In the resource selection phase, the algorithm uses task duplication to reduce the inter-resource communication cost; in addition, forecasting the impact of an assignment on all children of the current task permits better decisions to be made in selecting resources. The proposed algorithm is compared with the STDH, PEFT, and HEFT algorithms on randomly generated graphs and sets of task graphs. The experimental results show that the new algorithm can achieve better scheduling performance.
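
    A minimal Python sketch of list scheduling in the HEFT family that HCPPEFT builds on: upward-rank priorities followed by earliest-finish-time resource selection. HCPPEFT's three-level priorities, duplication, and look-ahead are not reproduced; the DAG, costs, and two processors are illustrative.

      succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}   # toy DAG
      cost = {"A": 3, "B": 4, "C": 2, "D": 3}      # baseline compute cost
      comm = 2                                     # uniform edge cost
      procs = {"p0": 1.0, "p1": 1.5}               # relative slowdown factors

      def upward_rank(t, memo={}):                 # priority phase
          if t not in memo:
              memo[t] = cost[t] + max(
                  (comm + upward_rank(s) for s in succ[t]), default=0)
          return memo[t]

      order = sorted(succ, key=upward_rank, reverse=True)
      pred = {t: [u for u in succ if t in succ[u]] for t in succ}
      finish, placed = {}, {}
      proc_free = {p: 0.0 for p in procs}

      for t in order:                              # resource selection phase
          best = None
          for p, slow in procs.items():
              ready = max((finish[u] + (0 if placed[u] == p else comm)
                           for u in pred[t]), default=0.0)
              eft = max(ready, proc_free[p]) + cost[t] * slow
              if best is None or eft < best[0]:
                  best = (eft, p)
          finish[t], placed[t] = best
          proc_free[placed[t]] = finish[t]

      print(finish["D"], placed)                   # makespan and placement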

  11. Trading public goods stabilizes interspecific mutualism.

    PubMed

    Archetti, Marco; Scheuring, István

    2013-02-01

    The existence of cooperation between species raises a fundamental problem for evolutionary theory. Why provide costly services to another species if the feedback of this provision also happens to benefit intra-specific competitors that provide no service? Rewarding cooperators and punishing defectors can help maintain mutualism; this is not possible, however, when one can only respond to the collective action of one's partners, which is likely to be the case in many common symbioses. We show how the theory of public goods can explain the stability of mutualism when discrimination between cooperators and defectors is not possible: if two groups of individuals trade goods that are non-linear, increasing functions of the number of contributions, their mutualistic interaction is maintained by the exchange of these public goods, even when it is not possible to punish defectors, which can persist at relatively high frequencies. This provides a theoretical justification and testable predictions for the evolution of mutualism in the absence of discrimination mechanisms.

  12. Software for portable laser light show system

    NASA Astrophysics Data System (ADS)

    Buruchin, Dmitrey J.; Leonov, Alexander F.

    1995-04-01

    The portable laser light show system LS-3500-10M is connected to the parallel port of an IBM PC/AT compatible computer. The computer outputs digital control data describing the images, and a specially designed control device converts the digital data coming from the parallel port into the analog signal driving the scanner. The capabilities of even a low-cost 286 computer are quite sufficient for laser graphics control. The scanning technology used in the laser graphics system LS-3500-10M differs essentially from widespread systems based on galvanometers with a mobile core or a mobile magnet, which work on the same principle as an electrically driven servo-mechanism. As the scanner we use an elastic system with hydraulically damped oscillations and an open loop. For most applications of laser graphics such a system provides satisfactory precision and speed of scanning. The LS-3500-10M software gives the user the ability to create laser graphics demonstrations on a PC and play them back. It is possible to render recognizable text and pictures using different styles, 3D and abstract animation. All types of demonstrations can be mixed in a slide-show, and time synchronization is supported. The software has the following features: (1) Different types of text output, with a built-in text editor for typing and editing textual information. Different fonts can be used to display text, and the user can create his own fonts using a specially developed font editor. (2) An editor of 3D animation with a library of predefined shapes. (3) Abstract animation provided by software routines. (4) Support of different graphics file formats (PCX or DXF). An original algorithm for raster image tracing was implemented. (5) A built-in slide-show editor.

  13. Learning dynamics in public goods games

    NASA Astrophysics Data System (ADS)

    Bladon, Alex J.; Galla, Tobias

    2011-10-01

    We extend recent analyses of stochastic effects in game dynamical learning to cases of multiplayer games and to games defined on networked structures. By means of an expansion in the noise strength we consider the weak-noise limit and present an analytical computation of spectral properties of fluctuations in multiplayer public goods games. This extends existing work on two-player games. In particular we show that coherent cycles may emerge driven by noise in the adaptation dynamics. These phenomena are not too dissimilar from cyclic strategy switching observed in experiments of behavioral game theory.

  14. Hybrid feature selection algorithm using symmetrical uncertainty and a harmony search algorithm

    NASA Astrophysics Data System (ADS)

    Salameh Shreem, Salam; Abdullah, Salwani; Nazri, Mohd Zakree Ahmad

    2016-04-01

    Microarray technology can be used as an efficient diagnostic system to recognise diseases such as tumours or to discriminate between different types of cancers in normal tissues. This technology has received increasing attention from the bioinformatics community because of its potential in designing powerful decision-making tools for cancer diagnosis. However, the presence of thousands or tens of thousands of genes affects the predictive accuracy of this technology from the perspective of classification. Thus, a key issue in microarray data is identifying or selecting the smallest possible set of genes from the input data that can achieve good predictive accuracy for classification. In this work, we propose a two-stage selection algorithm for gene selection problems in microarray data-sets called the symmetrical uncertainty filter and harmony search algorithm wrapper (SU-HSA). Experimental results show that the SU-HSA is better than HSA in isolation for all data-sets in terms of the accuracy and achieves a lower number of genes on 6 out of 10 instances. Furthermore, the comparison with state-of-the-art methods shows that our proposed approach is able to obtain 5 (out of 10) new best results in terms of the number of selected genes and competitive results in terms of the classification accuracy.
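
    A minimal Python sketch of the filter stage: rank genes by symmetrical uncertainty against the class label, SU(X, Y) = 2·I(X; Y)/(H(X) + H(Y)), and keep the top-ranked ones. The harmony search wrapper stage is omitted, and the data are random stand-ins for gene-expression values.

      import numpy as np
      from collections import Counter

      def entropy(values):
          p = np.array(list(Counter(values).values()), dtype=float)
          p /= p.sum()
          return -(p * np.log2(p)).sum()

      def symmetrical_uncertainty(x, y):
          hx, hy = entropy(x), entropy(y)
          hxy = entropy(list(zip(x, y)))           # joint entropy
          mi = hx + hy - hxy                       # mutual information
          return 2 * mi / (hx + hy) if hx + hy else 0.0

      rng = np.random.default_rng(0)
      y = rng.integers(0, 2, 100)                        # class labels
      genes = [rng.integers(0, 3, 100) for _ in range(50)]
      genes[7] = y + rng.integers(0, 2, 100)             # one informative gene

      ranked = sorted(range(len(genes)), reverse=True,
                      key=lambda i: symmetrical_uncertainty(genes[i], y))
      print(ranked[:5])    # gene 7 should surface at or near the top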

  15. Novel biomedical tetrahedral mesh methods: algorithms and applications

    NASA Astrophysics Data System (ADS)

    Yu, Xiao; Jin, Yanfeng; Chen, Weitao; Huang, Pengfei; Gu, Lixu

    2007-12-01

    Tetrahedral mesh generation algorithms, as a prerequisite of many soft tissue simulation methods, become very important in virtual surgery programs because of the real-time requirement. Aiming to speed up the computation in the simulation, we propose a revised Delaunay algorithm which strikes a good balance among tetrahedron quality, boundary preservation, and time complexity, using many improved methods. Another mesh algorithm, named Space-Disassembling, is also presented in this paper, and a comparison of Space-Disassembling, the traditional Delaunay algorithm, and the revised Delaunay algorithm is performed on clinical soft-tissue simulation projects, including craniofacial plastic surgery and breast reconstruction plastic surgery.
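
    For reference, the unrevised baseline is a one-liner with standard tools; the Python sketch below builds a plain 3D Delaunay tetrahedralization, the starting point that revised algorithms improve on for element quality and boundary preservation.

      import numpy as np
      from scipy.spatial import Delaunay

      pts = np.random.default_rng(0).random((200, 3))   # toy point cloud
      tet = Delaunay(pts)                               # 3D input => tetrahedra
      print(tet.simplices.shape)                        # (number_of_tets, 4)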

  16. An SMP soft classification algorithm for remote sensing

    NASA Astrophysics Data System (ADS)

    Phillips, Rhonda D.; Watson, Layne T.; Easterling, David R.; Wynne, Randolph H.

    2014-07-01

    This work introduces a symmetric multiprocessing (SMP) version of the continuous iterative guided spectral class rejection (CIGSCR) algorithm, a semiautomated classification algorithm for remote sensing (multispectral) images. The algorithm uses soft data clusters to produce a soft classification containing inherently more information than a comparable hard classification at an increased computational cost. Previous work suggests that similar algorithms achieve good parallel scalability, motivating the parallel algorithm development work here. Experimental results of applying parallel CIGSCR to an image with approximately 108 pixels and six bands demonstrate superlinear speedup. A soft two class classification is generated in just over 4 min using 32 processors.

  17. Good news for coffee addicts.

    PubMed

    Lee, Thomas H

    2009-06-01

    Whether it's a basic Mr. Coffee or a gadget that sports a snazzy device for grinding beans on demand, the office coffee machine offers a place for serendipitous encounters that can improve the social aspect of work and generate new ideas. What's more, a steaming cup of joe may be as good for your health as it is for the bottom line, says Lee, a professor of medicine at Harvard Medical School and the CEO of Partners Community HealthCare. Fears of coffee's carcinogenic effects now appear to be unfounded, and, in fact, the brew might even protect against some types of cancer. What's more, coffee may guard against Alzheimer's disease and other forms of dementia and somehow soften the blow of a heart attack. Of course, its role as a pick-me-up is well known. So there's no need to take your coffee with a dollop of guilt, especially if you ease up on the sugar, cream, double chocolate, and whipped-cream topping. PMID:19496470

  18. New algorithms for binary wavefront optimization

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaolong; Kner, Peter

    2015-03-01

    Binary amplitude modulation promises to allow rapid focusing through strongly scattering media with a large number of segments due to the faster update rates of digital micromirror devices (DMDs) compared to spatial light modulators (SLMs). While binary amplitude modulation has a lower theoretical enhancement than phase modulation, the faster update rate should more than compensate for the difference - a factor of π²/2. Here we present two new algorithms, a genetic algorithm and a transmission matrix algorithm, for optimizing the focus with binary amplitude modulation that achieve enhancements close to the theoretical maximum. Genetic algorithms have been shown to work well in noisy environments and we show that the genetic algorithm performs better than a stepwise algorithm. Transmission matrix algorithms allow complete characterization and control of the medium but require phase control either at the input or output. Here we introduce a transmission matrix algorithm that works with only binary amplitude control and intensity measurements. We apply these algorithms to binary amplitude modulation using a Texas Instruments Digital Micromirror Device. Here we report an enhancement of 152 with 1536 segments (9.90%×N) using a genetic algorithm with binary amplitude modulation and an enhancement of 136 with 1536 segments (8.9%×N) using an intensity-only transmission matrix algorithm.
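
    A toy Python stand-in for the genetic-algorithm experiment: a random complex transmission vector replaces the physical medium, and a GA evolves the binary on/off pattern of the segments to maximize intensity at one output mode. GA settings are illustrative, and the intensity-only transmission matrix algorithm is not shown.

      import numpy as np

      rng = np.random.default_rng(0)
      N = 256                                  # number of binary segments
      t = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2 * N)

      def intensity(mask):                     # focus intensity for a pattern
          return abs(t @ mask) ** 2

      pop = rng.integers(0, 2, (30, N))
      for gen in range(300):
          fit = np.array([intensity(m) for m in pop])
          pop = pop[np.argsort(fit)[::-1]]     # best patterns first
          for i in range(15, 30):              # rebuild the worse half
              a, b = pop[rng.integers(0, 15, 2)]
              child = np.where(rng.random(N) < 0.5, a, b)   # uniform crossover
              pop[i] = child ^ (rng.random(N) < 0.02)       # bit-flip mutation

      best = max(pop, key=intensity)
      baseline = np.mean([intensity(rng.integers(0, 2, N)) for _ in range(200)])
      print(intensity(best) / baseline)        # enhancement over random masks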

  19. The Common Good in Classical Political Philosophy

    ERIC Educational Resources Information Center

    Lewis, V. Bradley

    2006-01-01

    The term "common good" names the end (or final cause) of political and social life in the tradition of moral thought that owes its main substance to Aristotle and St. Thomas Aquinas. It names a genuine good ("bonum honestum") and not merely an instrumental or secondary good defeasible in the face of particular goods. However, at the same time, it…

  20. 19 CFR 102.12 - Fungible goods.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... RULES OF ORIGIN Rules of Origin § 102.12 Fungible goods. When fungible goods of different countries of origin are commingled the country of origin of the goods: (a) Is the countries of origin of those... the origin of the commingled good is not practical, the country or countries of origin may...

  1. Vertical Sextants give Good Sights

    NASA Astrophysics Data System (ADS)

    Dixon, Mark

    Many texts stress the need for marine sextants to be held precisely vertical at the instant that the altitude of a heavenly body is measured. Several authors lay particular emphasis on the technique of rocking the instrument in a small arc about the horizontal axis to obtain a good sight. Nobody, to the author's knowledge, however, has attempted to quantify the errors involved, so as to compare them with other errors inherent in determining celestial position lines. This paper sets out to address these issues and to pose the question: what level of accuracy of vertical alignment can reasonably be expected during marine sextant work at sea? When a heavenly body is brought to tangency with the visible horizon it is particularly important to ensure that the sextant is held in a truly vertical position. To this end the instrument is rocked gently about the horizontal so that the image of the body describes a small arc in the observer's field of vision. As Bruce Bauer points out, tangency with the horizon must be achieved during the process of rocking and not a second or so after rocking has been discontinued. The altitude is recorded for the instant that the body kisses the visible horizon at the lowest point of the rocking arc, as in Fig. 2. The only other visual clue as to whether the sextant is vertical is provided by the right angle made by the vertical edge of the horizon glass mirror with the horizon. There may also be some input from the observer's sense of balance and his hand orientation.

  2. Panniculitides, an algorithmic approach.

    PubMed

    Zelger, B

    2013-08-01

    The issue of inflammatory diseases of the subcutis and its mimics is generally considered a difficult field of dermatopathology. Yet, in my experience, with appropriate biopsies and good clinicopathological correlation, a specific diagnosis of panniculitides can usually be made. For this, knowledge of some basic anatomical and pathological issues is essential. Anatomically, the panniculus is differentiated into fatty lobules separated by fibrous septa. Pathologically, inflammation of the panniculus is defined and recognized by an inflammatory process which leads to tissue damage and necrosis. Several types of fat necrosis are observed: xanthomatized macrophages in lipophagic necrosis; granular fat necrosis and fat micropseudocysts in liquefactive fat necrosis; mummified adipocytes in "hyalinizing" fat necrosis with/without saponification and/or calcification; and lipomembranous membranes in membranous fat necrosis. In an algorithmic approach, an inflammatory process recognized by the features elaborated above is best analysed in three steps: first, recognition of the pattern; second, of the subpattern; and finally, of the presence and composition of inflammatory cells. Pattern differentiates a mostly septal from a mostly lobular distribution at scanning magnification. In the subpattern category one looks for the presence or absence of vasculitis and, if present, the size and nature of the involved blood vessel: arterioles and small arteries or veins; capillaries or postcapillary venules. The third step is to identify the nature of the cells present in the inflammatory infiltrate and, finally, to look for additional histopathologic features that allow a specific final diagnosis in the language of clinical dermatology of disease involving the subcutaneous fat.
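
    Purely as a structural illustration of the three-step triage described above (not a diagnostic tool), the decision flow can be written as a tiny routine; all labels and branches are illustrative.

        def triage_panniculitis(pattern, vessel=None, cells=()):
            # Step 1: pattern at scanning magnification (septal vs lobular).
            step1 = "mostly septal" if pattern == "septal" else "mostly lobular"
            # Step 2: subpattern - vasculitis present, and of which vessel type?
            step2 = f"vasculitis of {vessel}" if vessel else "no vasculitis"
            # Step 3: composition of the inflammatory infiltrate.
            step3 = "infiltrate: " + (", ".join(cells) if cells else "none noted")
            return " | ".join((step1, step2, step3))

        # Example: triage_panniculitis("lobular", None, ("lymphocytes", "macrophages"))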

  3. A family of algorithms for computing consensus about node state from network data.

    PubMed

    Brush, Eleanor R; Krakauer, David C; Flack, Jessica C

    2013-01-01

    Biological and social networks are composed of heterogeneous nodes that contribute differentially to network structure and function. A number of algorithms have been developed to measure this variation. These algorithms have proven useful for applications that require assigning scores to individual nodes, from ranking websites to determining critical species in ecosystems, yet the mechanistic basis for why they produce good rankings remains poorly understood. We show that a unifying property of these algorithms is that they quantify consensus in the network about a node's state or capacity to perform a function. The algorithms capture consensus either by taking into account the number of a target node's direct connections and, when the edges are weighted, the uniformity of its weighted in-degree distribution (breadth), or by measuring net flow into a target node (depth). Using data from communication, social, and biological networks we find that how an algorithm measures consensus, through breadth or depth, impacts its ability to correctly score nodes. We also observe variation in sensitivity to source biases in interaction/adjacency matrices: errors arising from systematic error at the node level or direct manipulation of network connectivity by nodes. Our results indicate that the breadth algorithms, which are derived from information theory, correctly score nodes (assessed using independent data) and are robust to errors. However, in cases where nodes "form opinions" about other nodes using indirect information, like reputation, depth algorithms, like Eigenvector Centrality, are required. One caveat is that Eigenvector Centrality is not robust to error unless the network is transitive or assortative. In these cases the network structure allows the depth algorithms to effectively capture breadth as well as depth. Finally, we discuss the algorithms' cognitive and computational demands. This is an important consideration in systems in which individuals use the
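
    The breadth/depth distinction above maps onto two very short computations; here is a generic numpy sketch (not the authors' exact estimators), where W[i, j] is the non-negative weight of the edge from node i to node j.

        import numpy as np

        def breadth_score(W):
            # Breadth: weighted in-degree, i.e. column sums of W.
            return W.sum(axis=0)

        def depth_score(W, iters=100):
            # Depth: eigenvector centrality by power iteration on W^T,
            # so that score flows to a node along its incoming edges.
            v = np.ones(W.shape[0])
            for _ in range(iters):
                v = W.T @ v
                v /= np.linalg.norm(v)
            return v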

  5. Algorithm Visualization: The State of the Field

    ERIC Educational Resources Information Center

    Shaffer, Clifford A.; Cooper, Matthew L.; Alon, Alexander Joel D.; Akbar, Monika; Stewart, Michael; Ponce, Sean; Edwards, Stephen H.

    2010-01-01

    We present findings regarding the state of the field of Algorithm Visualization (AV) based on our analysis of a collection of over 500 AVs. We examine how AVs are distributed among topics, who created them and when, their overall quality, and how they are disseminated. There does exist a cadre of good AVs and active developers. Unfortunately, we…

  6. Predictive Caching Using the TDAG Algorithm

    NASA Technical Reports Server (NTRS)

    Laird, Philip; Saul, Ronald

    1992-01-01

    We describe how the TDAG algorithm for learning to predict symbol sequences can be used to design a predictive cache store. A model of a two-level mass storage system is developed and used to calculate the performance of the cache under various conditions. Experimental simulations provide good confirmation of the model.
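
    TDAG itself grows a tree of variable-length contexts; as a hedged, first-order stand-in, the sketch below counts block-to-block transitions and prefetches the most likely successor. The class and method names are illustrative.

        from collections import defaultdict

        class FirstOrderPrefetcher:
            """First-order simplification of the TDAG idea: count
            block-to-block transitions, prefetch the likeliest successor."""

            def __init__(self):
                self.counts = defaultdict(lambda: defaultdict(int))
                self.last = None

            def access(self, block):
                # Record the observed transition last -> block.
                if self.last is not None:
                    self.counts[self.last][block] += 1
                self.last = block

            def predict(self):
                # Most frequent successor of the last accessed block, if any.
                succ = self.counts.get(self.last)
                return max(succ, key=succ.get) if succ else None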

  7. Algorithms for automated DNA assembly

    PubMed Central

    Densmore, Douglas; Hsiau, Timothy H.-C.; Kittleson, Joshua T.; DeLoache, Will; Batten, Christopher; Anderson, J. Christopher

    2010-01-01

    Generating a defined set of genetic constructs within a large combinatorial space provides a powerful method for engineering novel biological functions. However, the process of assembling more than a few specific DNA sequences can be costly, time consuming and error prone. Even if a correct theoretical construction scheme is developed manually, it is likely to be suboptimal by any number of cost metrics. Modular, robust and formal approaches are needed for exploring these vast design spaces. By automating the design of DNA fabrication schemes using computational algorithms, we can eliminate human error while reducing redundant operations, thus minimizing the time and cost required for conducting biological engineering experiments. Here, we provide algorithms that optimize the simultaneous assembly of a collection of related DNA sequences. We compare our algorithms to an exhaustive search on a small synthetic dataset and our results show that our algorithms can quickly find an optimal solution. Comparison with random search approaches on two real-world datasets show that our algorithms can also quickly find lower-cost solutions for large datasets. PMID:20335162

  8. An Intrusion Detection Algorithm Based On NFPA

    NASA Astrophysics Data System (ADS)

    Anming, Zhong

    A process-oriented intrusion detection algorithm based on a Probabilistic Automaton with No Final probabilities (NFPA) is introduced; the system call sequence of a process is used as the source data. By using information in the system call sequences of both normal and anomalous processes, anomaly detection and misuse detection are efficiently combined. Experiments show that our algorithm performs better than the classical algorithm in this field.
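
    The NFPA construction is not given in this record; a minimal Markov-chain stand-in conveys the idea of scoring a process by how probable its system-call transitions are under a model trained on normal traces (the function names and the probability floor are assumptions).

        import math
        from collections import defaultdict

        def train_transitions(traces):
            # Estimate P(next call | current call) from normal traces.
            counts = defaultdict(lambda: defaultdict(int))
            for trace in traces:
                for a, b in zip(trace, trace[1:]):
                    counts[a][b] += 1
            return {a: {b: n / sum(d.values()) for b, n in d.items()}
                    for a, d in counts.items()}

        def anomaly_score(trace, probs, floor=1e-6):
            # Average negative log-likelihood; unseen transitions get a floor.
            s = 0.0
            for a, b in zip(trace, trace[1:]):
                s -= math.log(probs.get(a, {}).get(b, floor))
            return s / max(len(trace) - 1, 1)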

  9. Reasoning about systolic algorithms

    SciTech Connect

    Purushothaman, S.

    1986-01-01

    Systolic algorithms are a class of parallel algorithms, with small grain concurrency, well suited for implementation in VLSI. They are intended to be implemented as high-performance, computation-bound back-end processors and are characterized by a tessellating interconnection of identical processing elements. This dissertation investigates the problem of proving correctness of systolic algorithms. The following are reported in this dissertation: (1) a methodology for verifying correctness of systolic algorithms based on representing an algorithm as recurrence equations and solving them. The methodology is demonstrated by proving the correctness of a systolic architecture for optimal parenthesization. (2) The implementation of mechanical proofs of correctness of two systolic algorithms, a convolution algorithm and an optimal parenthesization algorithm, using the Boyer-Moore theorem prover. (3) An induction principle for proving correctness of systolic arrays which are modular. Two attendant inference rules, weak equivalence and shift transformation, which capture equivalent behavior of systolic arrays, are also presented.

  10. Algorithm-development activities

    NASA Technical Reports Server (NTRS)

    Carder, Kendall L.

    1994-01-01

    Algorithm-development activities at USF continue. The algorithm for determining chlorophyll a concentration (Chl a) and the gelbstoff absorption coefficient for SeaWiFS and MODIS-N radiance data is our current priority.

  11. A Novel Zero Velocity Interval Detection Algorithm for Self-Contained Pedestrian Navigation System with Inertial Sensors.

    PubMed

    Tian, Xiaochun; Chen, Jiabin; Han, Yongqiang; Shang, Jianyu; Li, Nan

    2016-01-01

    Zero velocity update (ZUPT) plays an important role in pedestrian navigation algorithms, with the premise that the zero velocity interval (ZVI) must be detected accurately and effectively. A novel adaptive ZVI detection algorithm based on a smoothed pseudo Wigner-Ville distribution to remove multiple frequencies intelligently (SPWVD-RMFI) is proposed in this paper. The novel algorithm adopts the SPWVD-RMFI method to extract the pedestrian gait frequency and to calculate the optimal ZVI detection threshold in real time by establishing the functional relationships between the thresholds and the gait frequency; the thresholds thus adapt to the gait frequency, which improves the ZVI detection precision. To put this into practice, a ZVI detection experiment was carried out; the result shows that, compared with the traditional fixed-threshold ZVI detection method, the adaptive ZVI detection algorithm can effectively reduce the false and missed detection rates of ZVI, indicating that the novel algorithm has high detection precision and good robustness. Furthermore, pedestrian trajectory positioning experiments at different walking speeds were carried out to evaluate the influence of the novel algorithm on positioning precision. The results show that using the ZVI detected by the adaptive algorithm for pedestrian trajectory calculation achieves better performance. PMID:27669266
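
    For contrast with the adaptive scheme, a conventional fixed-threshold ZVI detector is only a few lines; the gravity constant, window size and threshold below are illustrative values, and the paper's contribution is precisely to adapt the threshold to the gait frequency instead.

        import numpy as np

        def zero_velocity_intervals(acc, g=9.81, win=10, thresh=0.5):
            # acc: (N, 3) accelerometer samples; flag samples whose local
            # variance of (|acc| - g) stays below a fixed threshold.
            mag = np.linalg.norm(acc, axis=1) - g
            var = np.array([mag[max(0, i - win):i + win].var()
                            for i in range(len(mag))])
            return var < thresh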

  12. Noise-enhanced clustering and competitive learning algorithms.

    PubMed

    Osoba, Osonde; Kosko, Bart

    2013-01-01

    Noise can provably speed up convergence in many centroid-based clustering algorithms. This includes the popular k-means clustering algorithm. The clustering noise benefit follows from the general noise benefit for the expectation-maximization algorithm because many clustering algorithms are special cases of the expectation-maximization algorithm. Simulations show that noise also speeds up convergence in stochastic unsupervised competitive learning, supervised competitive learning, and differential competitive learning.
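
    A minimal sketch of the noise benefit for k-means, assuming the simplest recipe of adding annealed Gaussian noise to the centroid updates (the provable conditions in the paper are more specific than this):

        import numpy as np

        def noisy_kmeans(X, k, iters=50, noise0=0.5, seed=0):
            rng = np.random.default_rng(seed)
            centers = X[rng.choice(len(X), k, replace=False)].astype(float)
            for t in range(iters):
                # Assign each point to its nearest centroid.
                labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
                for j in range(k):
                    pts = X[labels == j]
                    if len(pts):
                        centers[j] = pts.mean(axis=0)
                # Decaying noise injected into the centroid update.
                centers += rng.normal(0.0, noise0 / (t + 1), centers.shape)
            return centers, labels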

  13. Beyond Policy and Good Intentions

    ERIC Educational Resources Information Center

    Bevan-Brown, Jill

    2006-01-01

    This paper examines the situation of Maori learners with special needs in Aotearoa/New Zealand. Despite considerable legislation and official documentation supporting the provision of culturally appropriate special education services for Maori, research shows that these learners are often neglected, overlooked and sometimes even excluded. The main…

  14. INSENS classification algorithm report

    SciTech Connect

    Hernandez, J.E.; Frerking, C.J.; Myers, D.W.

    1993-07-28

    This report describes a new algorithm developed for the Immigration and Naturalization Service (INS) in support of the INSENS project for classifying vehicles and pedestrians using seismic data. This algorithm is less sensitive to nuisance alarms due to environmental events than the previous algorithm. Furthermore, the algorithm is simple enough that it can be implemented in the 8-bit microprocessor used in the INSENS system.

  15. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single step explicit methods, they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high order and high resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.

  16. Evolution of Cooperation in Public Goods Games

    NASA Astrophysics Data System (ADS)

    Xia, Cheng-Yi; Zhang, Juan-Juan; Wang, Yi-Ling; Wang, Jin-Song

    2011-10-01

    We investigate the evolution of cooperation with evolutionary public goods games based on finite populations, where four pure strategies are considered: cooperators, defectors, punishers, and loners who are unwilling to participate. By adopting approximate best response dynamics, we show that the magnitude of rationality not only quantitatively explains the experimental results in [Nature (London) 425 (2003) 390], but also heavily influences the evolution of cooperation. Compared with previous results for infinite populations, which yield two equilibria, we show that there exists only one special equilibrium, and that relatively high values of bounded rationality will sustain cooperation. In addition, we show that the loner's payoff plays an active role in the maintenance of cooperation, which is warranted only for low and moderate values of the loner's payoff. The effects of rationality and the loner's payoff thus jointly shape cooperation. Finally, we highlight the important result that the introduction of voluntary participation and punishment will facilitate cooperation greatly.

  17. Traumatic intraventricular hemorrhage with a good prognosis.

    PubMed

    Is, Merih; Gezen, Ferruh; Akgul, Mehmet; Dosoglu, Murat

    2011-01-01

    We report a 10-year-old girl with an isolated traumatic intraventricular hemorrhage following a traffic accident, who had a good prognosis. Her neurological examination upon arrival was normal and she had no complaint other than headache and vomiting. Computed tomography on admission showed a hemorrhage in the lateral and fourth ventricles. She had a Glasgow Coma Score of 15, and she was thus given only antiepileptic drugs for prophylaxis and followed. Computed tomography that was repeated 5 days after admission showed no blood and all ventricles were of normal size. There was no vascular pathology on magnetic resonance imaging and magnetic resonance angiography. The patient remains well 5 months after her accident. Intraventricular hemorrhage does not always have a poor prognosis.

  18. Detection of cracks in shafts with the Approximated Entropy algorithm

    NASA Astrophysics Data System (ADS)

    Sampaio, Diego Luchesi; Nicoletti, Rodrigo

    2016-05-01

    Approximate Entropy is a statistical measure used primarily in the fields of medicine, biology, and telecommunications for classifying and identifying complex signal data. In this work, an Approximate Entropy algorithm is used to detect cracks in a rotating shaft. The signals of the cracked shaft are obtained from numerical simulations of a de Laval rotor with breathing cracks modelled by fracture mechanics. In this case, the vertical displacements of the rotor during run-up transients were analysed. The results show the feasibility of detecting cracks from 5% depth, irrespective of the unbalance of the rotating system and crack orientation in the shaft. The results also show that the algorithm can differentiate the occurrence of crack only, misalignment only, and crack + misalignment in the system. However, the algorithm is sensitive to the intrinsic parameters p (number of data points in a sample vector) and f (fraction of the standard deviation that defines the minimum distance between two sample vectors), and good results are only obtained by appropriately choosing their values according to the sampling rate of the signal.
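
    Approximate Entropy itself is standard and compact; below is a plain numpy version using the abstract's notation, with p the sample-vector length and f the fraction of the standard deviation that sets the tolerance.

        import numpy as np

        def approximate_entropy(x, p=2, f=0.2):
            # ApEn(p, r) with tolerance r = f * std(x); Chebyshev distance.
            x = np.asarray(x, dtype=float)
            r = f * np.std(x)
            n = len(x)

            def phi(m):
                vecs = np.array([x[i:i + m] for i in range(n - m + 1)])
                dists = np.max(np.abs(vecs[:, None, :] - vecs[None, :, :]), axis=2)
                c = np.mean(dists <= r, axis=1)  # fraction of similar vectors
                return np.mean(np.log(c))

            return phi(p) - phi(p + 1)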

  19. Cognitive radio resource allocation based on coupled chaotic genetic algorithm

    NASA Astrophysics Data System (ADS)

    Zu, Yun-Xiao; Zhou, Jie; Zeng, Chang-Chang

    2010-11-01

    A coupled chaotic genetic algorithm for cognitive radio resource allocation, based on a genetic algorithm and a coupled logistic map, is proposed. A fitness function for cognitive radio resource allocation is provided. Simulations of cognitive radio resource allocation are conducted using the coupled chaotic genetic algorithm, a simple genetic algorithm and a dynamic allocation algorithm, respectively. The simulation results show that, compared with the simple genetic and dynamic allocation algorithms, the coupled chaotic genetic algorithm reduces the total transmission power and the bit error rate in the cognitive radio system, and converges faster.
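
    A coupled logistic map is typically used to generate a chaotic, well-spread initial population; a small sketch under that assumption (the mean-field coupling scheme here is one common choice, not necessarily the paper's):

        import numpy as np

        def coupled_logistic_population(pop, dim, mu=4.0, eps=0.1, warmup=100):
            # Iterate coupled logistic maps; values stay in [0, 1].
            x = np.random.rand(pop, dim)
            for _ in range(warmup):
                y = mu * x * (1.0 - x)                      # logistic map step
                x = (1.0 - eps) * y + eps * y.mean(axis=0)  # mean-field coupling
            return x  # use as normalized GA chromosomes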

  20. Interior search algorithm (ISA): a novel approach for global optimization.

    PubMed

    Gandomi, Amir H

    2014-07-01

    This paper presents the interior search algorithm (ISA) as a novel method for solving optimization tasks. The proposed ISA is inspired by interior design and decoration. The algorithm is different from other metaheuristic algorithms and provides new insight for global optimization. The proposed method is verified using some benchmark mathematical and engineering problems commonly used in the area of optimization. ISA results are further compared with well-known optimization algorithms. The results show that the ISA is efficiently capable of solving optimization problems. The proposed algorithm can outperform the other well-known algorithms. Further, the proposed algorithm is very simple and it only has one parameter to tune.

  1. A Short Survey of Document Structure Similarity Algorithms

    SciTech Connect

    Buttler, D

    2004-02-27

    This paper provides a brief survey of document structural similarity algorithms, including the optimal Tree Edit Distance algorithm and various approximation algorithms. The approximation algorithms include the simple weighted tag similarity algorithm, Fourier transforms of the structure, and a new application of the shingle technique to structural similarity. We show three surprising results. First, the Fourier transform technique proves to be the least accurate of the approximation algorithms, while also being the slowest. Second, optimal Tree Edit Distance algorithms may not be the best technique for clustering pages from different sites. Third, the simplest approximation to structure may be the most effective and efficient mechanism for many applications.
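
    The shingle technique mentioned above reduces a page's tag stream to a set of short tag sequences and compares those sets; a minimal sketch (k and the tag-stream representation are assumptions):

        def tag_shingles(tags, k=4):
            # All k-long windows of the document's tag sequence.
            return {tuple(tags[i:i + k]) for i in range(len(tags) - k + 1)}

        def structural_similarity(tags_a, tags_b, k=4):
            # Jaccard similarity of the two shingle sets.
            a, b = tag_shingles(tags_a, k), tag_shingles(tags_b, k)
            if not a and not b:
                return 1.0
            return len(a & b) / len(a | b)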

  2. Avoiding or restricting defectors in public goods games?

    PubMed

    Han, The Anh; Pereira, Luís Moniz; Lenaerts, Tom

    2015-02-01

    When creating a public good, strategies or mechanisms are required to handle defectors. We first show mathematically and numerically that prior agreements with posterior compensations provide a strategic solution that leads to substantial levels of cooperation in the context of public goods games, results that are corroborated by available experimental data. Notwithstanding this success, one cannot, as with other approaches, fully exclude the presence of defectors, raising the question of how they can be dealt with to avoid the demise of the common good. We show that both avoiding creation of the common good, whenever full agreement is not reached, and limiting the benefit that disagreeing defectors can acquire, using costly restriction mechanisms, are relevant choices. Nonetheless, restriction mechanisms are found to be the more favourable, especially in larger group interactions. Given decreasing restriction costs, introducing restraining measures to cope with public goods free-riding issues is the ultimate advantageous solution for all participants, rather than avoiding its creation.

  4. Avoiding or restricting defectors in public goods games?

    PubMed Central

    Han, The Anh; Pereira, Luís Moniz; Lenaerts, Tom

    2015-01-01

    When creating a public good, strategies or mechanisms are required to handle defectors. We first show mathematically and numerically that prior agreements with posterior compensations provide a strategic solution that leads to substantial levels of cooperation in the context of public goods games, results that are corroborated by available experimental data. Notwithstanding this success, one cannot, as with other approaches, fully exclude the presence of defectors, raising the question of how they can be dealt with to avoid the demise of the common good. We show that both avoiding creation of the common good, whenever full agreement is not reached, and limiting the benefit that disagreeing defectors can acquire, using costly restriction mechanisms, are relevant choices. Nonetheless, restriction mechanisms are found to be the more favourable, especially in larger group interactions. Given decreasing restriction costs, introducing restraining measures to cope with public goods free-riding issues is the ultimate advantageous solution for all participants, rather than avoiding its creation. PMID:25540240

  5. The Normalized-Rate Iterative Algorithm: A Practical Dynamic Spectrum Management Method for DSL

    NASA Astrophysics Data System (ADS)

    Statovci, Driton; Nordström, Tomas; Nilsson, Rickard

    2006-12-01

    We present a practical solution for dynamic spectrum management (DSM) in digital subscriber line systems: the normalized-rate iterative algorithm (NRIA). Supported by a novel optimization problem formulation, the NRIA is the only DSM algorithm that jointly addresses spectrum balancing for frequency division duplexing systems and power allocation for the users sharing a common cable bundle. With a focus on being implementable rather than obtaining the highest possible theoretical performance, the NRIA is designed to efficiently solve the DSM optimization problem with the operators' business models in mind. This is achieved with the help of two types of parameters: the desired network asymmetry and the desired user priorities. The NRIA is a centralized DSM algorithm based on the iterative water-filling algorithm (IWFA) for finding efficient power allocations, but extends the IWFA by finding the achievable bitrates and by optimizing the bandplan. It is compared with three other DSM proposals: the IWFA, the optimal spectrum balancing algorithm (OSBA), and the bidirectional IWFA (bi-IWFA). We show that the NRIA achieves better bitrate performance than the IWFA and the bi-IWFA. It can even achieve performance almost as good as the OSBA, but with dramatically lower requirements on complexity. Additionally, the NRIA can achieve bitrate combinations that cannot be supported by any other DSM algorithm.
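
    The IWFA building block that the NRIA extends is the classic water-filling power allocation; here is a generic bisection sketch (inverse gains standing in for per-channel noise-to-gain ratios; not the NRIA itself):

        import numpy as np

        def water_filling(gains, total_power):
            # Solve p_k = max(0, mu - 1/g_k) with sum(p_k) = total_power
            # by bisecting on the water level mu.
            inv = 1.0 / np.asarray(gains, dtype=float)
            lo, hi = inv.min(), inv.max() + total_power
            for _ in range(100):
                mu = 0.5 * (lo + hi)
                if np.clip(mu - inv, 0.0, None).sum() > total_power:
                    hi = mu
                else:
                    lo = mu
            return np.clip(lo - inv, 0.0, None)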

  6. 10. Ned Goode, Photographer April 1959 FIRST FLOOR, EAST ROOM, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. Ned Goode, Photographer April 1959 FIRST FLOOR, EAST ROOM, NORTHEAST CORNER, SHOWING CLOSET WITH WINDOW AND SEAT - Barnes-Brinton House, 630 Baltimore Pike (U.S. Route 1), Chadds Ford, Delaware County, PA

  7. 12. Ned Goode, Photographer April 1959 SECOND FLOOR, WEST ROOM, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. Ned Goode, Photographer April 1959 SECOND FLOOR, WEST ROOM, SOUTHWEST CORNER, SHOWING CLOSET AND GRILLWORK - Barnes-Brinton House, 630 Baltimore Pike (U.S. Route 1), Chadds Ford, Delaware County, PA

  8. [A Hyperspectral Imagery Anomaly Detection Algorithm Based on Gauss-Markov Model].

    PubMed

    Gao, Kun; Liu, Ying; Wang, Li-jing; Zhu, Zhen-yu; Cheng, Hao-bo

    2015-10-01

    With the development of spectral imaging technology, hyperspectral anomaly detection is increasingly widely used in remote sensing imagery processing. The traditional RX anomaly detection algorithm neglects the spatial correlation of images. Moreover, it does not effectively reduce the data dimensionality, so it is computationally expensive and performs poorly on hyperspectral data. Hyperspectral images follow a Gauss-Markov Random Field (GMRF) in the spatial and spectral dimensions. The inverse of the covariance matrix can be calculated directly from the Gauss-Markov parameters, which avoids heavy computation on the hyperspectral data. This paper proposes an improved RX anomaly detection algorithm based on a three-dimensional GMRF. The hyperspectral imagery data is simulated with the GMRF model, and the GMRF parameters are estimated with the approximated maximum likelihood method. The detection operator is constructed from the estimated GMRF parameters. The pixel under test is taken as the centre of a local optimization window, called the GMRF detection window. The abnormality degree is calculated from the mean vector and the inverse covariance matrix, both computed within the window. The image is processed pixel by pixel as the GMRF window moves. The traditional RX detection algorithm, the regional hypothesis detection algorithm based on GMRF, and the algorithm proposed in this paper are simulated with AVIRIS hyperspectral data. Simulation results show that the proposed anomaly detection method improves detection efficiency and reduces the false alarm rate. Running-time statistics for the three algorithms in the same computing environment show that the proposed algorithm reduces the running time by 45.2%, demonstrating good computational efficiency.

  10. 19 CFR 10.1021 - Goods classifiable as goods put up in sets.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 1 2014-04-01 2014-04-01 false Goods classifiable as goods put up in sets. 10... Free Trade Agreement Rules of Origin § 10.1021 Goods classifiable as goods put up in sets. Notwithstanding the specific rules set forth in General Note 33, HTSUS, goods classifiable as goods put up in...

  11. 19 CFR 10.3021 - Goods classifiable as goods put up in sets.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 1 2013-04-01 2013-04-01 false Goods classifiable as goods put up in sets. 10...-Colombia Trade Promotion Agreement Rules of Origin § 10.3021 Goods classifiable as goods put up in sets. Notwithstanding the specific rules set forth in General Note 34, HTSUS, goods classifiable as goods put up in...

  12. 19 CFR 10.2021 - Goods classifiable as goods put up in sets.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 1 2014-04-01 2014-04-01 false Goods classifiable as goods put up in sets. 10... Trade Promotion Agreement Rules of Origin § 10.2021 Goods classifiable as goods put up in sets. Notwithstanding the specific rules set forth in General Note 35, HTSUS, goods classifiable as goods put up in...

  13. 19 CFR 10.1021 - Goods classifiable as goods put up in sets.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 1 2012-04-01 2012-04-01 false Goods classifiable as goods put up in sets. 10... Free Trade Agreement Rules of Origin § 10.1021 Goods classifiable as goods put up in sets. Notwithstanding the specific rules set forth in General Note 33, HTSUS, goods classifiable as goods put up in...

  14. 19 CFR 10.921 - Goods classifiable as goods put up in sets.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 1 2012-04-01 2012-04-01 false Goods classifiable as goods put up in sets. 10.921... Trade Promotion Agreement Rules of Origin § 10.921 Goods classifiable as goods put up in sets. Notwithstanding the specific rules set forth in General Note 32(n), HTSUS, goods classifiable as goods put up...

  15. 19 CFR 10.921 - Goods classifiable as goods put up in sets.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 1 2014-04-01 2014-04-01 false Goods classifiable as goods put up in sets. 10.921... Trade Promotion Agreement Rules of Origin § 10.921 Goods classifiable as goods put up in sets. Notwithstanding the specific rules set forth in General Note 32(n), HTSUS, goods classifiable as goods put up...

  16. 19 CFR 10.1021 - Goods classifiable as goods put up in sets.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 1 2013-04-01 2013-04-01 false Goods classifiable as goods put up in sets. 10... Free Trade Agreement Rules of Origin § 10.1021 Goods classifiable as goods put up in sets. Notwithstanding the specific rules set forth in General Note 33, HTSUS, goods classifiable as goods put up in...

  17. 19 CFR 10.3021 - Goods classifiable as goods put up in sets.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 1 2014-04-01 2014-04-01 false Goods classifiable as goods put up in sets. 10...-Colombia Trade Promotion Agreement Rules of Origin § 10.3021 Goods classifiable as goods put up in sets. Notwithstanding the specific rules set forth in General Note 34, HTSUS, goods classifiable as goods put up in...

  18. 19 CFR 10.921 - Goods classifiable as goods put up in sets.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 1 2013-04-01 2013-04-01 false Goods classifiable as goods put up in sets. 10.921... Trade Promotion Agreement Rules of Origin § 10.921 Goods classifiable as goods put up in sets. Notwithstanding the specific rules set forth in General Note 32(n), HTSUS, goods classifiable as goods put up...

  19. Good news for sea turtles.

    PubMed

    Hays, Graeme C

    2004-07-01

    Following the overexploitation of sea turtle populations, conservation measures are now in place in many areas. However, the overall impact of these measures is often unknown because there are few long time-series showing trends in population sizes. In a recent paper, George Balazs and Milani Chaloupka chart the number of green turtles Chelonia mydas nesting in Hawaii over the past 30 years and reveal a remarkably quick increase in the size of this population following the instigation of conservation measures during the 1970s. Importantly, this work shows how even a small population of sea turtles can recover rapidly, suggesting that Allee effects do not impede conservation efforts in operation worldwide. PMID:16701283

  20. Lensless optical data hiding system based on phase encoding algorithm in the Fresnel domain.

    PubMed

    Chen, Yen-Yu; Wang, Jian-Hong; Lin, Cheng-Chung; Hwang, Hone-Ene

    2013-07-20

    A novel and efficient algorithm based on a modified Gerchberg-Saxton algorithm (MGSA) in the Fresnel domain is presented, together with its mathematical derivation, and two pure phase-only masks (POMs) are generated. The algorithm's application to data hiding is demonstrated by a simulation procedure, in which a hidden image/logo is encoded into phase forms. A hidden image/logo can be extracted by the proposed high-performance lensless optical data-hiding system. The reconstructed image shows good quality and the errors are close to zero. In addition, the robustness of our data-hiding technique is illustrated by simulation results. The position coordinates of the POMs as well as the wavelength are used as secure keys that can ensure sufficient information security and robustness. The main advantages of this proposed watermarking system are that it uses fewer iterative processes to produce the masks, and the image-hiding scheme is straightforward.
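
    The record describes a modified Gerchberg-Saxton algorithm in the Fresnel domain; for orientation, here is the classic Fourier-domain Gerchberg-Saxton loop that such methods build on (the Fresnel propagator and the second mask are not shown):

        import numpy as np

        def gerchberg_saxton(target_amp, iterations=200, seed=0):
            # Find a pure phase mask whose far-field intensity approximates
            # target_amp ** 2, alternating amplitude constraints per domain.
            rng = np.random.default_rng(seed)
            phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
            for _ in range(iterations):
                far = np.fft.fft2(np.exp(1j * phase))
                far = target_amp * np.exp(1j * np.angle(far))  # impose target
                phase = np.angle(np.fft.ifft2(far))            # keep phase only
            return phase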

  1. Classification of underground pipe scanned images using feature extraction and neuro-fuzzy algorithm.

    PubMed

    Sinha, S K; Karray, F

    2002-01-01

    Pipeline surface defects such as holes and cracks cause major problems for utility managers, particularly when the pipeline is buried under the ground. Manual inspection for surface defects in the pipeline has a number of drawbacks, including subjectivity, varying standards, and high costs. An automatic inspection system using image processing and artificial intelligence techniques can overcome many of these disadvantages and offer utility managers an opportunity to significantly improve quality and reduce costs. An approach to the recognition and classification of pipe cracks using image analysis and a neuro-fuzzy algorithm is proposed. In the preprocessing step, the scanned images of the pipe are analyzed and crack features are extracted. In the classification step, a neuro-fuzzy algorithm is developed that employs a fuzzy membership function and the error backpropagation algorithm. The idea behind the proposed approach is that the fuzzy membership function will absorb variation in feature values while the backpropagation network, with its learning ability, will provide good classification efficiency.

  2. An infrared small target detection algorithm based on high-speed local contrast method

    NASA Astrophysics Data System (ADS)

    Cui, Zheng; Yang, Jingli; Jiang, Shouda; Li, Junbao

    2016-05-01

    Small-target detection in infrared imagery with a complex background is always an important task in remote sensing fields. It is important to improve detection capabilities such as the detection rate, false alarm rate, and speed. However, current algorithms usually improve one or two of these capabilities while sacrificing the others. In this letter, a two-layer infrared (IR) small target detection algorithm inspired by the Human Visual System (HVS) is proposed to balance these detection capabilities. The first layer uses a high-speed simplified local contrast method to select significant information, and the second layer uses a machine-learning classifier to separate targets from background clutter. Experimental results show that the proposed algorithm achieves good performance in detection rate, false alarm rate and speed simultaneously.
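
    A simplified local contrast map of the kind the first layer uses can be sketched directly (the cell size and the squared-max form follow the common LCM formulation, which may differ in detail from the paper's high-speed variant):

        import numpy as np

        def local_contrast_map(img, cell=3):
            # Response = (max of centre cell)^2 / (largest neighbour-cell mean);
            # small bright targets stand out against smooth clutter.
            h, w = img.shape
            out = np.zeros((h, w))
            for y in range(cell, h - 2 * cell):
                for x in range(cell, w - 2 * cell):
                    centre = img[y:y + cell, x:x + cell]
                    means = [img[y + dy:y + dy + cell, x + dx:x + dx + cell].mean()
                             for dy in (-cell, 0, cell) for dx in (-cell, 0, cell)
                             if not (dy == 0 and dx == 0)]
                    out[y, x] = centre.max() ** 2 / max(max(means), 1e-9)
            return out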

  3. Autophagy: not good OR bad, but good AND bad.

    PubMed

    Altman, Brian J; Rathmell, Jeffrey C

    2009-05-01

    Autophagy is a well-established mechanism to degrade intracellular components and provide a nutrient source to promote survival of cells in metabolic distress. Such stress can be caused by a lack of available nutrients or by insufficient rates of nutrient uptake. Indeed, growth factor deprivation leads to internalization and degradation of nutrient transporters, leaving cells with limited means to access extracellular nutrients even when plentiful. This loss of growth factor signaling and extracellular nutrients ultimately leads to apoptosis, but also activates autophagy, which may degrade intracellular components and provide fuel for mitochondrial bioenergetics. The precise metabolic role of autophagy and how it intersects with the apoptotic pathways in growth factor withdrawal, however, has been uncertain. Our recent findings in growth factor-deprived hematopoietic cells show that autophagy can simultaneously contribute to cell metabolism and initiate a pathway to sensitize cells to apoptotic death. This pathway may promote tissue homeostasis by ensuring that only cells with high resistance to apoptosis may utilize autophagy as a survival mechanism when growth factors are limiting and nutrient uptake decreases. PMID:19398886

  4. A swaying object detection algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Shidong; Rong, Jianzhong; Zhou, Dechuang; Wang, Jian

    2013-07-01

    Moving object detection is one of the most important preliminary steps in video analysis. Some moving objects, such as spitting steam, fire and smoke, have a distinctive motion feature: the lower part stays essentially fixed while the upper part sways back and forth. Based on this motion feature, a swaying object detection algorithm is presented in this paper. Firstly, fuzzy integrals were adopted to integrate color features for extracting moving objects from video frames. Secondly, a swaying identification algorithm based on centroid calculation was used to distinguish swaying objects from other moving objects. Experiments show that the proposed method is effective at detecting swaying objects.
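
    The centroid-based swaying test lends itself to a small heuristic sketch: over a sequence of binary object masks, the lower half's horizontal centroid should barely move while the upper half's oscillates (the ratio threshold is an illustrative assumption, not the paper's criterion).

        import numpy as np

        def is_swaying(masks, sway_ratio=3.0):
            upper_x, lower_x = [], []
            for m in masks:                                # m: 2-D binary mask
                ys, xs = np.nonzero(m)
                if ys.size == 0:
                    continue
                mid = 0.5 * (ys.min() + ys.max())
                up, low = xs[ys < mid], xs[ys >= mid]      # smaller y = higher up
                if up.size and low.size:
                    upper_x.append(up.mean())
                    lower_x.append(low.mean())
            return np.std(upper_x) > sway_ratio * np.std(lower_x)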

  5. Motion Cueing Algorithm Development: Initial Investigation and Redesign of the Algorithms

    NASA Technical Reports Server (NTRS)

    Telban, Robert J.; Wu, Weimin; Cardullo, Frank M.; Houck, Jacob A. (Technical Monitor)

    2000-01-01

    In this project four motion cueing algorithms were initially investigated. The classical algorithm generated results with large distortion and delay and low magnitude. The NASA adaptive algorithm proved to be well tuned with satisfactory performance, while the UTIAS adaptive algorithm produced less desirable results. Modifications were made to the adaptive algorithms to reduce the magnitude of undesirable spikes. The optimal algorithm was found to have the potential for improved performance with further redesign. The center of simulator rotation was redefined. More terms were added to the cost function to enable more tuning flexibility. A new design approach using a Fortran/Matlab/Simulink setup was employed. A new semicircular canals model was incorporated in the algorithm. With these changes results show the optimal algorithm has some advantages over the NASA adaptive algorithm. Two general problems observed in the initial investigation required solutions. A nonlinear gain algorithm was developed that scales the aircraft inputs by a third-order polynomial, maximizing the motion cues while remaining within the operational limits of the motion system. A braking algorithm was developed to bring the simulator to a full stop at its motion limit and later release the brake to follow the cueing algorithm output.

  6. Monte Carlo algorithm for free energy calculation.

    PubMed

    Bi, Sheng; Tong, Ning-Hua

    2015-07-01

    We propose a Monte Carlo algorithm for the free energy calculation based on configuration space sampling. An upward or downward temperature scan can be used to produce F(T). We implement this algorithm for the Ising model on a square lattice and triangular lattice. Comparison with the exact free energy shows an excellent agreement. We analyze the properties of this algorithm and compare it with the Wang-Landau algorithm, which samples in energy space. This method is applicable to general classical statistical models. The possibility of extending it to quantum systems is discussed.
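
    The configuration-space sampling the method builds on is ordinary Metropolis Monte Carlo; a minimal 2-D Ising sampler is below (the free-energy estimator and the temperature-scan bookkeeping from the paper are not reproduced).

        import numpy as np

        def metropolis_ising(L=16, T=2.0, sweeps=5000, seed=0):
            rng = np.random.default_rng(seed)
            s = rng.choice([-1, 1], size=(L, L))
            energies = []
            for _ in range(sweeps):
                for _ in range(L * L):
                    i, j = rng.integers(L, size=2)
                    nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                          + s[i, (j + 1) % L] + s[i, (j - 1) % L])
                    dE = 2 * s[i, j] * nb           # cost of flipping s[i, j]
                    if dE <= 0 or rng.random() < np.exp(-dE / T):
                        s[i, j] *= -1
                # Energy with each bond counted once.
                energies.append(-np.sum(s * (np.roll(s, 1, 0) + np.roll(s, 1, 1))))
            return np.array(energies)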

  7. Thermostat algorithm for generating target ensembles.

    PubMed

    Bravetti, A; Tapias, D

    2016-02-01

    We present a deterministic algorithm called contact density dynamics that generates any prescribed target distribution in the physical phase space. Akin to the famous model of Nosé and Hoover, our algorithm is based on a non-Hamiltonian system in an extended phase space. However, the equations of motion in our case follow from contact geometry and we show that in general they have a similar form to those of the so-called density dynamics algorithm. As a prototypical example, we apply our algorithm to produce a Gibbs canonical distribution for a one-dimensional harmonic oscillator. PMID:26986320

  9. Thermostat algorithm for generating target ensembles

    NASA Astrophysics Data System (ADS)

    Bravetti, A.; Tapias, D.

    2016-02-01

    We present a deterministic algorithm called contact density dynamics that generates any prescribed target distribution in the physical phase space. Akin to the famous model of Nosé and Hoover, our algorithm is based on a non-Hamiltonian system in an extended phase space. However, the equations of motion in our case follow from contact geometry and we show that in general they have a similar form to those of the so-called density dynamics algorithm. As a prototypical example, we apply our algorithm to produce a Gibbs canonical distribution for a one-dimensional harmonic oscillator.
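
    For comparison with the contact-geometry construction, the Nosé-Hoover equations that the abstract invokes are easy to integrate for the same test case; a rough sketch (simple Euler stepping, adequate only for illustration):

        import numpy as np

        def nose_hoover_oscillator(T=1.0, Q=1.0, dt=0.005, steps=1_000_000):
            # 1-D harmonic oscillator (m = k = 1) coupled to a Nose-Hoover
            # thermostat; the histogram of q approximates exp(-q**2 / (2 T)).
            q, p, zeta = 1.0, 0.0, 0.0
            qs = np.empty(steps)
            for i in range(steps):
                q += p * dt
                p += (-q - zeta * p) * dt
                zeta += ((p * p - T) / Q) * dt
                qs[i] = q
            return qs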

  10. Exploration of new multivariate spectral calibration algorithms.

    SciTech Connect

    Van Benthem, Mark Hilary; Haaland, David Michael; Melgaard, David Kennett; Martin, Laura Elizabeth; Wehlburg, Christine Marie; Pell, Randy J.; Guenard, Robert D.

    2004-03-01

    A variety of multivariate calibration algorithms for quantitative spectral analyses were investigated and compared, and new algorithms were developed in the course of this Laboratory Directed Research and Development project. We were able to demonstrate the ability of the hybrid classical least squares/partial least squares (CLS/PLS) calibration algorithms to maintain calibrations in the presence of spectrometer drift and to transfer calibrations between spectrometers from the same or different manufacturers. These methods were found to be as good as or better than the commonly used partial least squares (PLS) method in prediction ability. We also present the theory for an entirely new class of algorithms labeled augmented classical least squares (ACLS) methods. New factor selection methods are developed and described for the ACLS algorithms. These factor selection methods are demonstrated using near-infrared spectra collected from a system of dilute aqueous solutions. The ACLS algorithm is also shown to provide improved ease of use and better prediction ability than PLS when transferring calibrations between near-infrared calibrations from the same manufacturer. Finally, simulations incorporating either ideal or realistic errors in the spectra were used to compare the prediction abilities of the new ACLS algorithm with that of PLS. We found that in the presence of realistic errors with non-uniform spectral error variance across spectral channels or with spectral errors correlated between frequency channels, ACLS methods generally out-performed the more commonly used PLS method. These results demonstrate the need for realistic error structure in simulations when the prediction abilities of various algorithms are compared. The combination of equal or superior prediction ability and the ease of use of the ACLS algorithms make the new ACLS methods the preferred algorithms to use for multivariate spectral calibrations.
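
    The classical least squares step at the core of these hybrids is compact; a generic sketch (C: calibration concentrations, A: measured spectra; this is the textbook CLS, not Sandia's ACLS code):

        import numpy as np

        def cls_calibrate(C, A):
            # Solve A ~ C @ K for the pure-component spectra K.
            K, *_ = np.linalg.lstsq(C, A, rcond=None)
            return K

        def cls_predict(K, A_new):
            # Predict concentrations: each new spectrum a ~ c @ K.
            C_hat, *_ = np.linalg.lstsq(K.T, A_new.T, rcond=None)
            return C_hat.T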

  11. Annealed Importance Sampling Reversible Jump MCMC algorithms

    SciTech Connect

    Karagiannis, Georgios; Andrieu, Christophe

    2013-03-20

    It will soon be 20 years since reversible jump Markov chain Monte Carlo (RJ-MCMC) algorithms were proposed. They have significantly extended the scope of Markov chain Monte Carlo simulation methods, offering the promise to be able to routinely tackle transdimensional sampling problems, as encountered in Bayesian model selection problems for example, in a principled and flexible fashion. Their practical efficient implementation, however, still remains a challenge. A particular difficulty encountered in practice is the choice of the dimension matching variables (both their nature and their distribution) and the reversible transformations which allow one to define the one-to-one mappings underpinning the design of these algorithms. Indeed, even seemingly sensible choices can lead to algorithms with very poor performance. The focus of this paper is the development and performance evaluation of a method, annealed importance sampling RJ-MCMC (aisRJ), which addresses this problem by mitigating the sensitivity of RJ-MCMC algorithms to the aforementioned poor design. As we shall see, the algorithm can be understood as an “exact approximation” of an idealized MCMC algorithm that would sample from the model probabilities directly in a model selection set-up. Such an idealized algorithm may have good theoretical convergence properties, but typically cannot be implemented, and our algorithms can approximate the performance of such idealized algorithms to an arbitrary degree while not introducing any bias for any degree of approximation. Our approach combines the dimension matching ideas of RJ-MCMC with annealed importance sampling and its Markov chain Monte Carlo implementation. We illustrate the performance of the algorithm with numerical simulations which indicate that, although the approach may at first appear computationally involved, it is in fact competitive.

  12. Geovisualization and analysis of the Good Country Index

    NASA Astrophysics Data System (ADS)

    Tan, C.; Dramowicz, K.

    2016-04-01

    The Good Country Index measures the contribution of a single country in the humanity and health aspects that are beneficial to the planet. Countries that are globally good for our planet do not necessarily have to be good for their own citizens. The Good Country Index is based on the following seven categories: science and technology, culture, international peace and security, world order, planet and climate, prosperity and equality, and health and well-being. The Good Country Index focuses on external effects, in contrast to other global indices (for example, the Human Development Index or the Social Progress Index) that show the level of development of a single country in benefiting its own citizens. The authors examine whether these global indices may serve as good proxies, both as potential predictors and as indicators of a country's 'goodness'. Non-spatial analysis included analyzing relationships between the overall Good Country Index and the seven contributing categories, as well as between the overall Good Country Index and other global indices. Data analytics was used for building various predictive models and selecting the most accurate model to predict the overall Good Country Index. The most important rules for high and low index values were identified. Spatial analysis included spatial autocorrelation to analyze the similarity of a country's index values in relation to its neighbors. Hot spot analysis was used to identify and map significant clusters of countries with high and low index values. Similar countries were grouped into geographically compact clusters and mapped.
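
    The spatial-autocorrelation step mentioned above is usually computed as Moran's I; a generic sketch (W is a neighbour weight matrix over countries, an assumption about the exact implementation):

        import numpy as np

        def morans_i(values, W):
            # Moran's I: (n / sum(W)) * (z' W z) / (z' z), z = centred values.
            z = np.asarray(values, dtype=float)
            z = z - z.mean()
            n = len(z)
            return (n / W.sum()) * (z @ W @ z) / (z @ z)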

  13. New analytical algorithm for overlay accuracy

    NASA Astrophysics Data System (ADS)

    Ham, Boo-Hyun; Yun, Sangho; Kwak, Min-Cheol; Ha, Soon Mok; Kim, Cheol-Hong; Nam, Suk-Woo

    2012-03-01

    The total overlay error is decomposed into two parts: the systematic error and the random error. We characterize both error components: the systematic error correlates well with the residual error for a given scanner condition, whereas the random error correlates well with the residual error as the process proceeds through successive steps. Furthermore, we demonstrate a practical use case of the proposed method, showing how the high-order method works through the systematic error. Our results show that characterizing overlay data in a way suitable for advanced technology nodes requires much more than just evaluating the conventional metrology metrics of TIS and TMU.

  14. Modified OMP Algorithm for Exponentially Decaying Signals

    PubMed Central

    Kazimierczuk, Krzysztof; Kasprzak, Paweł

    2015-01-01

    A group of signal reconstruction methods, referred to as compressed sensing (CS), has recently found a variety of applications in numerous branches of science and technology. However, the condition of the applicability of standard CS algorithms (e.g., orthogonal matching pursuit, OMP), i.e., the existence of the strictly sparse representation of a signal, is rarely met. Thus, dedicated algorithms for solving particular problems have to be developed. In this paper, we introduce a modification of OMP motivated by nuclear magnetic resonance (NMR) application of CS. The algorithm is based on the fact that the NMR spectrum consists of Lorentzian peaks and matches a single Lorentzian peak in each of its iterations. Thus, we propose the name Lorentzian peak matching pursuit (LPMP). We also consider certain modification of the algorithm by introducing the allowed positions of the Lorentzian peaks' centers. Our results show that the LPMP algorithm outperforms other CS algorithms when applied to exponentially decaying signals. PMID:25609044
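
    The baseline OMP that LPMP modifies admits a short generic implementation (D: dictionary with unit-norm columns, y: measured signal; this is the textbook algorithm, not the authors' Lorentzian variant):

        import numpy as np

        def omp(D, y, n_atoms):
            residual = y.astype(float).copy()
            support = []
            coef = np.zeros(0)
            for _ in range(n_atoms):
                # Atom most correlated with the current residual.
                k = int(np.argmax(np.abs(D.T @ residual)))
                if k not in support:
                    support.append(k)
                # Re-fit all selected coefficients by least squares.
                coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
                residual = y - D[:, support] @ coef
            x = np.zeros(D.shape[1])
            x[support] = coef
            return x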

  16. SDR Input Power Estimation Algorithms

    NASA Technical Reports Server (NTRS)

    Nappier, Jennifer M.; Briones, Janette C.

    2013-01-01

    The General Dynamics (GD) S-Band software defined radio (SDR) in the Space Communications and Navigation (SCAN) Testbed on the International Space Station (ISS) provides experimenters an opportunity to develop and demonstrate experimental waveforms in space. The SDR has an analog and a digital automatic gain control (AGC), and the response of the AGCs to changes in SDR input power and temperature was characterized prior to the launch and installation of the SCAN Testbed on the ISS. The AGCs were used to estimate the SDR input power and SNR of the received signal, and the characterization results showed a nonlinear response to SDR input power and temperature. In order to estimate the SDR input power from the AGCs, three algorithms were developed and implemented in the ground software of the SCAN Testbed: a linear straight-line estimator, which uses the digital AGC and the temperature to estimate the SDR input power over a narrower section of the input power range; a linear adaptive filter algorithm, which uses both AGCs and the temperature to estimate the SDR input power over a wide input power range; and an algorithm based on neural networks, designed to estimate the input power over a wide range. This paper describes the algorithms in detail and their associated performance in estimating the SDR input power.
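
    In the spirit of the linear straight-line estimator, a least-squares fit of input power against the digital AGC reading and temperature might look as follows; the affine model and variable names are illustrative assumptions (inputs as numpy arrays of characterization data), not the flight algorithm.

    ```python
    import numpy as np

    def fit_power_estimator(agc, temp, power_dbm):
        # Fit P_est = a*AGC + b*T + c by least squares from ground
        # characterization data, then return the estimator function.
        A = np.column_stack([agc, temp, np.ones_like(agc)])
        coef, *_ = np.linalg.lstsq(A, power_dbm, rcond=None)
        return lambda a, t: coef[0] * a + coef[1] * t + coef[2]
    ```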

  17. Ensemble algorithms in reinforcement learning.

    PubMed

    Wiering, Marco A; van Hasselt, Hado

    2008-08-01

    This paper describes several ensemble methods that combine multiple different reinforcement learning (RL) algorithms in a single agent. The aim is to enhance learning speed and final performance by combining the chosen actions or action probabilities of different RL algorithms. We designed and implemented four different ensemble methods combining the following five different RL algorithms: Q-learning, Sarsa, actor-critic (AC), QV-learning, and AC learning automaton. The intuitively designed ensemble methods, namely, majority voting (MV), rank voting, Boltzmann multiplication (BM), and Boltzmann addition, combine the policies derived from the value functions of the different RL algorithms, in contrast to previous work where ensemble methods have been used in RL for representing and learning a single value function. We show experiments on five maze problems of varying complexity; the first problem is simple, but the other four maze tasks are of a dynamic or partially observable nature. The results indicate that the BM and MV ensembles significantly outperform the single RL algorithms. PMID:18632380
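
    A minimal sketch of the majority-voting (MV) ensemble idea, assuming each constituent RL algorithm is summarized by a Q-table over states and actions; the paper combines five algorithms and also derives Boltzmann-weighted variants, which are omitted here.

    ```python
    import numpy as np

    def majority_vote_action(q_tables, state):
        # Each RL algorithm casts one vote for its greedy action in the
        # given state; the most voted action wins (ties broken randomly).
        n_actions = q_tables[0].shape[1]
        votes = np.zeros(n_actions)
        for q in q_tables:
            votes[int(np.argmax(q[state]))] += 1
        best = np.flatnonzero(votes == votes.max())
        return int(np.random.choice(best))
    ```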

  18. SDR input power estimation algorithms

    NASA Astrophysics Data System (ADS)

    Briones, J. C.; Nappier, J. M.

    The General Dynamics (GD) S-Band software defined radio (SDR) in the Space Communications and Navigation (SCAN) Testbed on the International Space Station (ISS) provides experimenters an opportunity to develop and demonstrate experimental waveforms in space. The SDR has an analog and a digital automatic gain control (AGC), and the response of the AGCs to changes in SDR input power and temperature was characterized prior to the launch and installation of the SCAN Testbed on the ISS. The AGCs were used to estimate the SDR input power and SNR of the received signal, and the characterization results showed a nonlinear response to SDR input power and temperature. In order to estimate the SDR input power from the AGCs, three algorithms were developed and implemented in the ground software of the SCAN Testbed: a linear straight-line estimator, which uses the digital AGC and the temperature to estimate the SDR input power over a narrower section of the input power range; a linear adaptive filter algorithm, which uses both AGCs and the temperature to estimate the SDR input power over a wide input power range; and an algorithm based on neural networks, designed to estimate the input power over a wide range. This paper describes the algorithms in detail and their associated performance in estimating the SDR input power.

  20. Evolution of cooperation in rotating indivisible goods game.

    PubMed

    Koike, Shimpei; Nakamaru, Mayuko; Tsujimoto, Masahiro

    2010-05-01

    Collective behavior is theoretically and experimentally studied through a public goods game in which players contribute resources or efforts to produce goods (or a pool), which are then divided equally among all players regardless of the amount of their contribution. However, if goods are indivisible, only one player can receive the goods. In this case, the problem is how to distribute indivisible goods, and we therefore propose a new game, namely the "rotating indivisible goods game." In this game, the goods are not divided but distributed by regular rotation. An example is rotating savings and credit associations (ROSCAs), which exist all over the world and serve as efficient and informal institutions for collecting savings for small investments. In a ROSCA, members regularly contribute money to produce goods and distribute them to each member on a regular rotation. It has been pointed out that ROSCA members are selected based on their reliability or reputation, and that defectors who stop contributing are excluded. We elucidate the mechanisms that sustain cooperation in the rotating indivisible goods game by means of evolutionary simulations. First, we investigate the effect of the peer selection rule, by which a group chooses members based on the players' reputation and players choose groups based on their reputation. Regardless of the peer selection rule, cooperation is not sustainable in the rotating indivisible goods game. Second, we introduce the forfeiture rule, which forbids a member who has not contributed earlier from receiving goods. These analyses show that employing the two rules together can sustain cooperation in the rotating indivisible goods game, although employing either one alone cannot. Finally, we prove that evolutionary simulation can be a tool for investigating institutional designs that promote cooperation. PMID:20064533

  1. Do humans make good decisions?

    PubMed

    Summerfield, Christopher; Tsetsos, Konstantinos

    2015-01-01

    Human performance on perceptual classification tasks approaches that of an ideal observer, but economic decisions are often inconsistent and intransitive, with preferences reversing according to the local context. We discuss the view that suboptimal choices may result from the efficient coding of decision-relevant information, a strategy that allows expected inputs to be processed with higher gain than unexpected inputs. Efficient coding leads to 'robust' decisions that depart from optimality but maximise the information transmitted by a limited-capacity system in a rapidly-changing world. We review recent work showing that when perceptual environments are variable or volatile, perceptual decisions exhibit the same suboptimal context-dependence as economic choices, and we propose a general computational framework that accounts for findings across the two domains. PMID:25488076

  2. Do humans make good decisions?

    PubMed Central

    Summerfield, Christopher; Tsetsos, Konstantinos

    2014-01-01

    Human performance on perceptual classification tasks approaches that of an ideal observer, but economic decisions are often inconsistent and intransitive, with preferences reversing according to the local context. We discuss the view that suboptimal choices may result from the efficient coding of decision-relevant information, a strategy that allows expected inputs to be processed with higher gain than unexpected inputs. Efficient coding leads to ‘robust’ decisions that depart from optimality but maximise the information transmitted by a limited-capacity system in a rapidly-changing world. We review recent work showing that when perceptual environments are variable or volatile, perceptual decisions exhibit the same suboptimal context-dependence as economic choices, and propose a general computational framework that accounts for findings across the two domains. PMID:25488076

  3. The Case of a "Giffen Good."

    ERIC Educational Resources Information Center

    Spiegel, Uriel

    1994-01-01

    Discusses the concept of "Giffen goods" as it applies to price theory in economics instruction. Provides a graphic presentation and analysis of Giffen goods and behavior. Includes examples of Giffen behavior that can be used to illustrate the concept. (CFR)

  4. Good Health Before Pregnancy: Preconception Care

    MedlinePlus

  5. A constraint consensus memetic algorithm for solving constrained optimization problems

    NASA Astrophysics Data System (ADS)

    Hamza, Noha M.; Sarker, Ruhul A.; Essam, Daryl L.; Deb, Kalyanmoy; Elsayed, Saber M.

    2014-11-01

    Constraint handling is an important aspect of evolutionary constrained optimization. Currently, the mechanism used for constraint handling with evolutionary algorithms mainly assists the selection process, but not the actual search process. In this article, first a genetic algorithm is combined with a class of search methods, known as constraint consensus methods, that assist infeasible individuals to move towards the feasible region. This approach is also integrated with a memetic algorithm. The proposed algorithm is tested and analysed by solving two sets of standard benchmark problems, and the results are compared with other state-of-the-art algorithms. The comparisons show that the proposed algorithm outperforms other similar algorithms. The algorithm has also been applied to solve a practical economic load dispatch problem, where it also shows superior performance over other algorithms.
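
    A sketch of a basic constraint consensus step, which averages Newton-like feasibility vectors of the violated constraints to move an infeasible point toward the feasible region; the article's memetic integration with a GA is not shown, and the `g(x) <= 0` feasibility convention (with numpy-returning constraint and gradient functions) is an assumption.

    ```python
    import numpy as np

    def constraint_consensus(x, constraints, grads, tol=1e-6):
        # One constraint-consensus step: for each violated constraint
        # g_i(x) <= 0, build the projected Newton-like feasibility vector
        # and move the point by the average of those vectors.
        vectors = []
        for g, dg in zip(constraints, grads):
            v = g(x)
            if v > tol:                          # constraint violated
                grad = dg(x)
                n2 = grad.dot(grad)
                if n2 > 0:
                    vectors.append(-v / n2 * grad)
        if not vectors:
            return x                             # already (near-)feasible
        return x + np.mean(vectors, axis=0)
    ```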

  6. Last-passage Monte Carlo algorithm for mutual capacitance.

    PubMed

    Hwang, Chi-Ok; Given, James A

    2006-08-01

    We develop and test the last-passage diffusion algorithm, a charge-based Monte Carlo algorithm, for the mutual capacitance of a system of conductors. The first-passage algorithm is highly efficient because it is charge based and incorporates importance sampling; it averages over the properties of Brownian paths that initiate outside the conductor and terminate on its surface. However, this algorithm does not seem to generalize to mutual capacitance problems. The last-passage algorithm, in a sense, is the time reversal of the first-passage algorithm; it involves averages over particles that initiate on an absorbing surface, leave that surface, and diffuse away to infinity. To validate this algorithm, we calculate the mutual capacitance matrix of the circular-disk parallel-plate capacitor and compare with the known numerical results. Good agreement is obtained.

  7. Two New PRP Conjugate Gradient Algorithms for Minimization Optimization Models

    PubMed Central

    Yuan, Gonglin; Duan, Xiabin; Liu, Wenjie; Wang, Xiaoliang; Cui, Zengru; Sheng, Zhou

    2015-01-01

    Two new PRP conjugate gradient algorithms are proposed in this paper based on two modified PRP conjugate gradient methods: the first algorithm is proposed for solving unconstrained optimization problems, and the second algorithm is proposed for solving nonlinear equations. The first method contains two aspects of information: function value and gradient value. Both methods possess some good properties: (1) βk ≥ 0; (2) the search direction has the trust-region property without the use of any line search method; and (3) the search direction has the sufficient descent property without the use of any line search method. Under some suitable conditions, we establish the global convergence of the two algorithms. We conduct numerical experiments to evaluate our algorithms. The numerical results indicate that the first algorithm is effective and competitive for solving unconstrained optimization problems and that the second algorithm is effective for solving large-scale nonlinear equations. PMID:26502409
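
    For orientation, a generic PRP+ conjugate gradient iteration with the nonnegativity safeguard βk ≥ 0 mentioned above; the fixed step size stands in for the paper's line-search-free direction constructions, which are more elaborate.

    ```python
    import numpy as np

    def prp_plus_cg(grad, x, iters=200, alpha=1e-2, eps=1e-12):
        # PRP+ conjugate gradient sketch: beta_k is the Polak-Ribiere-Polyak
        # coefficient clipped at zero, which guarantees beta_k >= 0.
        g = grad(x)
        d = -g
        for _ in range(iters):
            x_new = x + alpha * d                       # fixed step (sketch only)
            g_new = grad(x_new)
            beta = max(g_new.dot(g_new - g) / max(g.dot(g), eps), 0.0)
            d = -g_new + beta * d
            x, g = x_new, g_new
            if np.linalg.norm(g) < 1e-6:
                break
        return x
    ```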

  8. Application of hybrid evolutionary algorithms to low exhaust emission diesel engine design

    NASA Astrophysics Data System (ADS)

    Jeong, S.; Obayashi, S.; Minemura, Y.

    2008-01-01

    A hybrid evolutionary algorithm, consisting of a genetic algorithm (GA) and particle swarm optimization (PSO), is proposed. Generally, GAs maintain diverse solutions of good quality in multi-objective problems, while PSO shows fast convergence to the optimum solution. By coupling these algorithms, the GA compensates for the low diversity of PSO, while PSO compensates for the high computational costs of the GA. The hybrid algorithm was validated using standard test functions, and the results showed that it performs better than either a pure GA or pure PSO. The method was then applied to an engineering design problem: the geometry of a diesel engine combustion chamber was optimized to reduce exhaust emissions such as NOx, soot and CO. The results demonstrated the usefulness of the present method for this engineering design problem. To identify the relation between exhaust emissions and combustion chamber geometry, data mining was performed with a self-organising map (SOM). The results indicate that the volume near the lower central part of the combustion chamber has a large effect on exhaust emissions and that the optimum chamber geometry will vary depending on the fuel injection angle.
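
    A rough reconstruction of one hybrid generation, assuming the better half of the population moves by a simplified PSO rule while the worse half is produced by GA-style crossover and mutation; the coefficients and the division of labor are assumptions, not the authors' operators.

    ```python
    import random

    def hybrid_generation(pop, vel, fitness, w=0.7, c=1.5, mut=0.1):
        # One GA+PSO hybrid generation (minimization).  The better half
        # moves toward the current best individual by a PSO velocity rule
        # (personal bests omitted for brevity); the worse half is replaced
        # by crossover/mutation of parents drawn from the better half.
        ranked = sorted(range(len(pop)), key=lambda i: fitness(pop[i]))
        gbest = list(pop[ranked[0]])
        half = len(pop) // 2
        for i in ranked[:half]:                                  # PSO half
            vel[i] = [w * v + c * random.random() * (g - x)
                      for v, x, g in zip(vel[i], pop[i], gbest)]
            pop[i] = [x + v for x, v in zip(pop[i], vel[i])]
        for i in ranked[half:]:                                  # GA half
            a = pop[random.choice(ranked[:half])]
            b = pop[random.choice(ranked[:half])]
            pop[i] = [(x + y) / 2 + random.gauss(0, mut)         # crossover+mutation
                      for x, y in zip(a, b)]
        return pop, vel
    ```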

  9. The good body: when big is better.

    PubMed

    Cassidy, C M

    1991-09-01

    An important cultural question is, "What is a 'good'--desirable, beautiful, impressive--body?" The answers are legion; here I examine why bigger bodies represent survival skill, and how this power symbolism is embodied by behaviors that guide larger persons toward the top of the social hierarchy. Bigness is a complex concept comprising tallness, boniness, muscularity and fattiness. Data show that most people worldwide want to be big--both tall and fat. Those who achieve the ideal are disproportionately among the society's most socially powerful. In the food-secure West, fascination with power and the body has not waned but has been redefined such that thinness is desired. This apparent anomaly is resolved by realizing that thinness in the midst of abundance--as long as one is also tall and muscular--still projects the traditional message of power, and brings such social boons as upward mobility. PMID:1961102

  11. Benefits of tolerance in public goods games

    NASA Astrophysics Data System (ADS)

    Szolnoki, Attila; Chen, Xiaojie

    2015-10-01

    Leaving the joint enterprise when defection is unveiled is always a viable option to avoid being exploited. Although loner strategy helps the population not to be trapped into the tragedy of the commons state, it could offer only a modest income for nonparticipants. In this paper we demonstrate that showing some tolerance toward defectors could not only save cooperation in harsh environments but in fact results in a surprisingly high average payoff for group members in public goods games. Phase diagrams and the underlying spatial patterns reveal the high complexity of evolving states where cyclic dominant strategies or two-strategy alliances can characterize the final state of evolution. We identify microscopic mechanisms which are responsible for the superiority of global solutions containing tolerant players. This phenomenon is robust and can be observed both in well-mixed and in structured populations highlighting the importance of tolerance in our everyday life.

  12. Benefits of tolerance in public goods games.

    PubMed

    Szolnoki, Attila; Chen, Xiaojie

    2015-10-01

    Leaving the joint enterprise when defection is unveiled is always a viable option to avoid being exploited. Although loner strategy helps the population not to be trapped into the tragedy of the commons state, it could offer only a modest income for nonparticipants. In this paper we demonstrate that showing some tolerance toward defectors could not only save cooperation in harsh environments but in fact results in a surprisingly high average payoff for group members in public goods games. Phase diagrams and the underlying spatial patterns reveal the high complexity of evolving states where cyclic dominant strategies or two-strategy alliances can characterize the final state of evolution. We identify microscopic mechanisms which are responsible for the superiority of global solutions containing tolerant players. This phenomenon is robust and can be observed both in well-mixed and in structured populations highlighting the importance of tolerance in our everyday life. PMID:26565295

  13. When good news leads to bad choices.

    PubMed

    McDevitt, Margaret A; Dunn, Roger M; Spetch, Marcia L; Ludvig, Elliot A

    2016-01-01

    Pigeons and other animals sometimes deviate from optimal choice behavior when given informative signals for delayed outcomes. For example, when pigeons are given a choice between an alternative that always leads to food after a delay and an alternative that leads to food only half of the time after a delay, preference changes dramatically depending on whether the stimuli during the delays are correlated with (signal) the outcomes or not. With signaled outcomes, pigeons show a much greater preference for the suboptimal alternative than with unsignaled outcomes. Key variables and research findings related to this phenomenon are reviewed, including the effects of durations of the choice and delay periods, probability of reinforcement, and gaps in the signal. We interpret the available evidence as reflecting a preference induced by signals for good news in a context of uncertainty. Other explanations are briefly summarized and compared. PMID:26781050

  14. Semioptimal practicable algorithmic cooling

    NASA Astrophysics Data System (ADS)

    Elias, Yuval; Mor, Tal; Weinstein, Yossi

    2011-04-01

    Algorithmic cooling (AC) of spins applies entropy manipulation algorithms in open spin systems in order to cool spins far beyond Shannon’s entropy bound. Algorithmic cooling of nuclear spins was demonstrated experimentally and may contribute to nuclear magnetic resonance spectroscopy. Several cooling algorithms were suggested in recent years, including practicable algorithmic cooling (PAC) and exhaustive AC. Practicable algorithms have simple implementations, yet their level of cooling is far from optimal; exhaustive algorithms, on the other hand, cool much better, and some even reach (asymptotically) an optimal level of cooling, but they are not practicable. We introduce here semioptimal practicable AC (SOPAC), wherein a few cycles (typically two to six) are performed at each recursive level. Two classes of SOPAC algorithms are proposed and analyzed. Both attain cooling levels significantly better than PAC and are much more efficient than the exhaustive algorithms. These algorithms are shown to bridge the gap between PAC and exhaustive AC. In addition, we calculated the number of spins required by SOPAC in order to purify qubits for quantum computation. As few as 12 and 7 spins are required (in an ideal scenario) to yield a mildly pure spin (60% polarized) from initial polarizations of 1% and 10%, respectively. In the latter case, about five more spins are sufficient to produce a highly pure spin (99.99% polarized), which could be relevant for fault-tolerant quantum computing.

  15. Why good projects fail anyway.

    PubMed

    Matta, Nadim F; Ashkenas, Ronald N

    2003-09-01

    Big projects fail at an astonishing rate--more than half the time, by some estimates. It's not hard to understand why. Complicated long-term projects are customarily developed by a series of teams working along parallel tracks. If managers fail to anticipate everything that might fall through the cracks, those tracks will not converge successfully at the end to reach the goal. Take a companywide CRM project. Traditionally, one team might analyze customers, another select the software, a third develop training programs, and so forth. When the project's finally complete, though, it may turn out that the salespeople won't enter in the requisite data because they don't understand why they need to. This very problem has, in fact, derailed many CRM programs at major organizations. There is a way to uncover unanticipated problems while the project is still in development. The key is to inject into the overall plan a series of miniprojects, or "rapid-results initiatives," which each have as their goal a miniature version of the overall goal. In the CRM project, a single team might be charged with increasing the revenues of one sales group in one region by 25% within four months. To reach that goal, team members would have to draw on the work of all the parallel teams. But in just four months, they would discover the salespeople's resistance and probably other unforeseen issues, such as, perhaps, the need to divvy up commissions for joint-selling efforts. The World Bank has used rapid-results initiatives to great effect to keep a sweeping 16-year project on track and deliver visible results years ahead of schedule. In taking an in-depth look at this project, and others, the authors show why this approach is so effective and how the initiatives are managed in conjunction with more traditional project activities.

  16. 29 CFR 779.14 - Goods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Goods. 779.14 Section 779.14 Labor Regulations Relating to... INTERPRETATION NOT DIRECTLY RELATED TO REGULATIONS THE FAIR LABOR STANDARDS ACT AS APPLIED TO RETAILERS OF GOODS OR SERVICES General Some Basic Definitions § 779.14 Goods. The definition in section 3(i) of the...

  17. Scheduling with genetic algorithms

    NASA Technical Reports Server (NTRS)

    Fennel, Theron R.; Underbrink, A. J., Jr.; Williams, George P. W., Jr.

    1994-01-01

    In many domains, scheduling a sequence of jobs is an important function contributing to the overall efficiency of the operation. At Boeing, we develop schedules for many different domains, including assembly of military and commercial aircraft, weapons systems, and space vehicles. Boeing is under contract to develop scheduling systems for the Space Station Payload Planning System (PPS) and Payload Operations and Integration Center (POIC). These applications require that we respect certain sequencing restrictions among the jobs to be scheduled while at the same time assigning resources to the jobs. We call this general problem scheduling and resource allocation. Genetic algorithms (GAs) offer a search method that uses a population of solutions and benefits from intrinsic parallelism to search the problem space rapidly, producing near-optimal solutions. Good intermediate solutions are probabilistically recombined to produce better offspring (based upon some application-specific measure of solution fitness, e.g., minimum flowtime or schedule completeness). Also, at any point in the search, any intermediate solution can be accepted as a final solution; allowing the search to proceed longer usually produces a better solution, while terminating the search at virtually any time may still yield an acceptable solution. Many processes are constrained by restrictions of sequence among the individual jobs: for a specific job, other jobs must be completed beforehand. While there are obviously many other constraints on processes, it is these on which we focused for this research: how to allocate crews to jobs while satisfying job precedence requirements as well as personnel, tooling and fixture (or, more generally, resource) requirements.
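
    A toy sketch of the approach described above: chromosomes are priority vectors, and a decoder builds only precedence-feasible job sequences, so any individual can be evaluated (and accepted at any time). This is a generic illustration, not the PPS/POIC scheduler; `preds[j]` is assumed to be the set of predecessors of job j, and `fitness` an application-specific cost such as flowtime.

    ```python
    import random

    def decode(priorities, preds):
        # Build a precedence-feasible sequence: repeatedly schedule the
        # eligible job (all predecessors done) with the highest priority.
        n = len(priorities)
        done, order = set(), []
        while len(order) < n:
            eligible = [j for j in range(n)
                        if j not in done and preds[j] <= done]
            j = max(eligible, key=lambda k: priorities[k])
            order.append(j)
            done.add(j)
        return order

    def ga_schedule(preds, fitness, pop_size=40, gens=100, mut=0.1):
        n = len(preds)
        pop = [[random.random() for _ in range(n)] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=lambda ind: fitness(decode(ind, preds)))
            elite = pop[:pop_size // 2]
            children = []
            while len(elite) + len(children) < pop_size:
                a, b = random.sample(elite, 2)
                children.append([(x + y) / 2 + random.gauss(0, mut)
                                 for x, y in zip(a, b)])
            pop = elite + children
        best = min(pop, key=lambda ind: fitness(decode(ind, preds)))
        return decode(best, preds)
    ```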

  18. High Rate Pulse Processing Algorithms for Microcalorimeters

    NASA Astrophysics Data System (ADS)

    Tan, Hui; Breus, Dimitry; Hennig, Wolfgang; Sabourov, Konstantin; Collins, Jeffrey W.; Warburton, William K.; Bertrand Doriese, W.; Ullom, Joel N.; Bacrania, Minesh K.; Hoover, Andrew S.; Rabin, Michael W.

    2009-12-01

    It has been demonstrated that microcalorimeter spectrometers based on superconducting transition-edge-sensors can readily achieve sub-100 eV energy resolution near 100 keV. However, the active volume of a single microcalorimeter has to be small in order to maintain good energy resolution, and pulse decay times are normally on the order of milliseconds due to slow thermal relaxation. Therefore, spectrometers are typically built with an array of microcalorimeters to increase detection efficiency and count rate. For large arrays, however, as much pulse processing as possible must be performed at the front end of readout electronics to avoid transferring large amounts of waveform data to a host computer for post-processing. In this paper, we present digital filtering algorithms for processing microcalorimeter pulses in real time at high count rates. The goal for these algorithms, which are being implemented in readout electronics that we are also currently developing, is to achieve sufficiently good energy resolution for most applications while being: a) simple enough to be implemented in the readout electronics; and, b) capable of processing overlapping pulses, and thus achieving much higher output count rates than those achieved by existing algorithms. Details of our algorithms are presented, and their performance is compared to that of the "optimal filter" that is currently the predominantly used pulse processing algorithm in the cryogenic-detector community.
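
    By way of illustration, a standard moving-sum trapezoidal shaper, a common building block in digital pulse processing; the paper's real-time algorithms additionally handle pile-up of overlapping pulses and detector-specific corrections, which are not shown, so this is a generic sketch rather than the authors' filter.

    ```python
    import numpy as np

    def trapezoidal_shaper(x, rise, flat):
        # Difference of two moving sums separated by (rise + flat) samples:
        # a step-like pulse becomes a trapezoid with rise time `rise` and
        # flat top `flat`, whose height estimates the pulse amplitude.
        c = np.concatenate([[0.0], np.cumsum(x)])
        moving_sum = lambda start: c[start + rise] - c[start]
        n = len(x) - 2 * rise - flat
        return np.array([moving_sum(i + rise + flat) - moving_sum(i)
                         for i in range(n)])
    ```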

  19. Development of an Inverse Algorithm for Resonance Inspection

    SciTech Connect

    Lai, Canhai; Xu, Wei; Sun, Xin

    2012-10-01

    Resonance inspection (RI), which employs the natural frequency spectrum shift between good and anomalous part populations to detect defects, is a non-destructive evaluation (NDE) technique with many advantages, such as low inspection cost, high testing speed, and broad applicability to structures with complex geometry, compared to other contemporary NDE methods. It has already been widely used in the automobile industry for quality inspection of safety-critical parts. Unlike some conventionally used NDE methods, the current RI technology is unable to provide details, i.e., the location, dimensions, or type of the flaws in discrepant parts. This limitation severely hinders its widespread application and further development. In this study, an inverse RI algorithm based on a maximum correlation function is proposed to quantify the location and size of flaws in a discrepant part. Dog-bone shaped stainless steel samples with and without controlled flaws are used for algorithm development and validation. The results show that multiple flaws can be accurately pinpointed using the algorithms developed, and that the prediction accuracy decreases with increasing flaw numbers and decreasing distance between flaws.

  20. Implementation of several mathematical algorithms to breast tissue density classification

    NASA Astrophysics Data System (ADS)

    Quintana, C.; Redondo, M.; Tirao, G.

    2014-02-01

    The accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, since dense breast tissue can hide lesions, causing cancer to be detected at later stages. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. This paper presents the implementation and the performance of different mathematical algorithms designed to standardize the categorization of mammographic images according to the American College of Radiology classifications. These mathematical techniques are based on calculations of intrinsic properties and on comparison with an ideal homogeneous image (joint entropy, mutual information, normalized cross correlation and index Q) as categorization parameters. The algorithms were evaluated on 100 cases from the mammographic data sets provided by the Ministerio de Salud de la Provincia de Córdoba, Argentina—Programa de Prevención del Cáncer de Mama (Department of Public Health, Córdoba, Argentina, Breast Cancer Prevention Program). The obtained breast classifications were compared with the expert medical diagnoses, showing good performance. The implemented algorithms revealed a high potential to classify breasts into tissue density categories.
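
    Two of the categorization parameters named above, joint entropy and mutual information, can be computed from a joint histogram. A minimal sketch follows; the bin count is an arbitrary choice, and the comparison against an "ideal homogeneous image" is left to the caller.

    ```python
    import numpy as np

    def mi_and_joint_entropy(img_a, img_b, bins=64):
        # Joint histogram of two images -> joint entropy H(A,B) and
        # mutual information I(A;B) = H(A) + H(B) - H(A,B), in bits.
        h, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
        p = h / h.sum()
        px, py = p.sum(axis=1), p.sum(axis=0)
        nz = p > 0
        h_ab = -np.sum(p[nz] * np.log2(p[nz]))
        h_a = -np.sum(px[px > 0] * np.log2(px[px > 0]))
        h_b = -np.sum(py[py > 0] * np.log2(py[py > 0]))
        return h_a + h_b - h_ab, h_ab
    ```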

  1. RATE-ADJUSTMENT ALGORITHM FOR AGGREGATE TCP CONGESTION CONTROL

    SciTech Connect

    P. TINNAKORNSRISUPHAP, ET AL

    2000-09-01

    The TCP congestion-control mechanism is an algorithm designed to probe the available bandwidth of the network path that TCP packets traverse. However, it is well-known that the TCP congestion-control mechanism does not perform well on networks with a large bandwidth-delay product due to the slow dynamics in adapting its congestion window, especially for short-lived flows. One promising solution to the problem is to aggregate and share the path information among TCP connections that traverse the same bottleneck path, i.e., Aggregate TCP. However, this paper shows via a queueing analysis of a generalized processor-sharing (GPS) queue with regularly-varying service time that a simple aggregation of local TCP connections together into a single aggregate TCP connection can result in a severe performance degradation. To prevent such a degradation, we introduce a rate-adjustment algorithm. Our simulation confirms that by utilizing our rate-adjustment algorithm on aggregate TCP, connections which would normally receive poor service achieve significant performance improvements without penalizing connections which already receive good service.

  2. FITEST: A computer program for "exact chi-square" goodness-of-fit significance tests

    NASA Astrophysics Data System (ADS)

    Romesburg, H. Charles; Marshall, Kim; Mauk, Timothy P.

    FITEST, a FORTRAN IV computer program, performs what is termed an exact chi-square test (ECST) to assess the goodness-of-fit between an observed and a theoretical distribution. This test is an alternative to the chi-square and Kolmogorov-Smirnov goodness-of-fit tests. Because it is based on less restrictive assumptions, the ECST may be more appropriate. However, the test imposes a computational burden which, if not handled by an efficiently designed computer algorithm, makes it prohibitively expensive on all but trivial problems. FITEST, through an efficiently designed algorithm, makes an ECST possible for any problem at a reasonable cost.
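
    The computational burden mentioned above comes from enumerating every possible outcome table. A brute-force sketch of the exact multinomial goodness-of-fit p-value (feasible only for small samples and few categories, which is exactly why FITEST needed an efficient algorithm):

    ```python
    from math import factorial

    def exact_multinomial_test(observed, probs):
        # Exact goodness-of-fit p-value: sum the probabilities of all
        # outcome tables that are no more likely than the observed one.
        n, k = sum(observed), len(observed)

        def pmf(counts):
            coef = factorial(n)
            for c in counts:
                coef //= factorial(c)
            p = float(coef)
            for c, q in zip(counts, probs):
                p *= q ** c
            return p

        def compositions(total, parts):
            # All k-tuples of nonnegative counts summing to `total`.
            if parts == 1:
                yield (total,)
                return
            for first in range(total + 1):
                for rest in compositions(total - first, parts - 1):
                    yield (first,) + rest

        p_obs = pmf(observed)
        p_value = 0.0
        for counts in compositions(n, k):
            p = pmf(counts)
            if p <= p_obs + 1e-12:
                p_value += p
        return p_value
    ```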

  3. Gender Equality or Primacy of the Mother? Ambivalent Descriptions of Good Parents

    ERIC Educational Resources Information Center

    Perälä-Littunen, Satu

    2007-01-01

    The ideology of gender equality is accepted as the norm in the Nordic countries. When asked to describe what they thought was required to be a good mother and a good father, Finnish informants (N = 387) showed uneasiness in describing good parents separately, however, often describing only a good mother. This article aims to explore the ambivalent…

  4. Reasoning about systolic algorithms

    SciTech Connect

    Purushothaman, S.; Subrahmanyam, P.A.

    1988-12-01

    The authors present a methodology for verifying the correctness of systolic algorithms. The methodology is based on solving a set of Uniform Recurrence Equations obtained from a description of systolic algorithms as a set of recursive equations. They present an approach to mechanically verify the correctness of systolic algorithms using the Boyer-Moore theorem prover. A mechanical correctness proof of an example from the literature is also presented.

  5. Immunoscintigraphy with indium-111 labeled monoclonal antibodies: The importance of a good display method

    SciTech Connect

    Liehn, J.C.; Hannequin, P.; Nasca, S.; Lebrun, D.; Fernandez-Valoni, A.; Valeyre, J. )

    1989-03-01

    A major drawback of In-111-labeled monoclonal antibodies (MoAb) is the presence of intense nonspecific liver, renal, and bone marrow activity. This makes the display of the images hardly optimal and their visual interpretation difficult. In this study, the intrinsic color scale (which consists of selecting the limits of the color scale as the highest and the lowest pixel values of the image) was compared to a new, simple algorithm for determining the limits of the color scale. This algorithm was based on the count density in the iliac crest areas. OC-125 or anti-CEA In-111 MoAb F(ab')2 fragments were used in 32 patients with suspected recurrence of ovarian (19 patients) or colorectal cancer (13 patients). The final diagnosis was assessed by surgery (21 patients), biopsy (five patients), or follow-up (six patients). A 10-minute abdomino-pelvic anterior view was recorded two days after injection. These views were displayed using the two methods and interpreted by two observers. Using their responses in each quadrant of the pelvis, the authors calculated two ROC curves. The comparison of the ROC curves showed better performance for the new method. For example, at the same specificity (73%), the sensitivity of the new method was significantly better (78% versus 68%). This result confirms the importance of a good methodology for displaying immunoscintigraphic images.
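
    A sketch of the display idea, assuming the algorithm sets the upper color-scale limit from the mean count density in a reference region such as the iliac crest; the scaling factor `span` is a hypothetical parameter, not taken from the paper.

    ```python
    import numpy as np

    def window_by_reference(img, ref_mask, span=3.0):
        # Derive color-scale limits from a reference region instead of the
        # image min/max (the "intrinsic" scale), then clip and normalize
        # for display.  `ref_mask` is a boolean mask of the reference ROI;
        # `span` is an assumed tuning factor.
        ref = img[ref_mask].mean()
        lo, hi = 0.0, span * ref
        return np.clip((img - lo) / (hi - lo), 0.0, 1.0)
    ```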

  6. Good soldiers and good actors: prosocial and impression management motives as interactive predictors of affiliative citizenship behaviors.

    PubMed

    Grant, Adam M; Mayer, David M

    2009-07-01

    Researchers have discovered inconsistent relationships between prosocial motives and citizenship behaviors. We draw on impression management theory to propose that impression management motives strengthen the association between prosocial motives and affiliative citizenship by encouraging employees to express citizenship in ways that both "do good" and "look good." We report 2 studies that examine the interactions of prosocial and impression management motives as predictors of affiliative citizenship using multisource data from 2 different field samples. Across the 2 studies, we find positive interactions between prosocial and impression management motives as predictors of affiliative citizenship behaviors directed toward other people (helping and courtesy) and the organization (initiative). Study 2 also shows that only prosocial motives predict voice-a challenging citizenship behavior. Our results suggest that employees who are both good soldiers and good actors are most likely to emerge as good citizens in promoting the status quo.

  7. AerGOM, an improved algorithm for stratospheric aerosol extinction retrieval from GOMOS observations - Part 1: Algorithm description

    NASA Astrophysics Data System (ADS)

    Vanhellemont, Filip; Mateshvili, Nina; Blanot, Laurent; Étienne Robert, Charles; Bingen, Christine; Sofieva, Viktoria; Dalaudier, Francis; Tétard, Cédric; Fussen, Didier; Dekemper, Emmanuel; Kyrölä, Erkki; Laine, Marko; Tamminen, Johanna; Zehner, Claus

    2016-09-01

    The GOMOS instrument on Envisat has successfully demonstrated that a UV-Vis-NIR spaceborne stellar occultation instrument is capable of delivering quality data on the gaseous and particulate composition of Earth's atmosphere. Still, some problems related to data inversion remained to be examined. In the past, it was found that the aerosol extinction profile retrievals in the upper troposphere and stratosphere are of good quality at a reference wavelength of 500 nm but suffer from anomalous, retrieval-related perturbations at other wavelengths. Identification of algorithmic problems and subsequent improvement was therefore necessary. This work has been carried out; the resulting AerGOM Level 2 retrieval algorithm together with the first data version AerGOMv1.0 forms the subject of this paper. The AerGOM algorithm differs from the standard GOMOS IPF processor in a number of important ways: more accurate physical laws have been implemented, all retrieval-related covariances are taken into account, and the aerosol extinction spectral model is strongly improved. Retrieval examples demonstrate that the previously observed profile perturbations have disappeared, and the obtained extinction spectra look in general more consistent. We present a detailed validation study in a companion paper; here, to give a first idea of the data quality, a worst-case comparison at 386 nm shows SAGE II-AerGOM correlation coefficients that are up to 1 order of magnitude larger than the ones obtained with the GOMOS IPFv6.01 data set.

  8. The hierarchical algorithms--theory and applications

    NASA Astrophysics Data System (ADS)

    Su, Zheng-Yao

    Monte Carlo simulations are one of the most important numerical techniques for investigating statistical physical systems. Among these systems, spin models are a typical example, and they also play an essential role in constructing the abstract mechanism for various complex systems. Unfortunately, traditional Monte Carlo algorithms are afflicted with "critical slowing down" near continuous phase transitions, and the efficiency of the Monte Carlo simulation goes to zero as the size of the lattice is increased. To combat critical slowing down, a very different type of collective-mode algorithm, in contrast to the traditional single-spin-flip mode, was proposed by Swendsen and Wang in 1987 for Potts spin models. Since then, there has been an explosion of work attempting to understand, improve, or generalize it. In these so-called "cluster" algorithms, clusters of spins are regarded as one template and are updated at each step of the Monte Carlo procedure. In implementing these algorithms, the cluster labeling is a major time-consuming bottleneck; it is also isomorphic to the problem of computing connected components of an undirected graph, seen in other application areas such as pattern recognition. A number of cluster labeling algorithms for sequential computers have long existed. However, the dynamic irregular nature of clusters complicates the task of finding good parallel algorithms, and this is particularly true on SIMD (single-instruction-multiple-data) machines. Our design of the Hierarchical Cluster Labeling Algorithm aims at alleviating this problem by building a hierarchical structure on the problem domain and by incorporating local and nonlocal communication schemes. We present an estimate for the computational complexity of cluster labeling and prove the key features of this algorithm (such as lower computational complexity, data locality, and easy implementation) compared with the methods formerly known. In particular, this algorithm can be viewed as a generalized
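
    For reference, the standard sequential baseline that the hierarchical algorithm improves upon: union-find cluster labeling over the bonds produced by a cluster update, shown here as a minimal sketch (the hierarchical, SIMD-oriented scheme itself is not reproduced).

    ```python
    def find(parent, i):
        # Path-compressing find for union-find cluster labeling.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def label_clusters(bonds, n_sites):
        # Sequential connected-component labeling: `bonds` is a list of
        # (i, j) site pairs joined into one cluster; returns a root label
        # for each of the n_sites lattice sites.
        parent = list(range(n_sites))
        for i, j in bonds:
            ri, rj = find(parent, i), find(parent, j)
            if ri != rj:
                parent[rj] = ri
        return [find(parent, i) for i in range(n_sites)]
    ```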

  9. Model of wealth and goods dynamics in a closed market

    NASA Astrophysics Data System (ADS)

    Ausloos, Marcel; Pękalski, Andrzej

    2007-01-01

    A simple computer simulation model of a closed market on a fixed network with free flow of goods and money is introduced. The model contains only two variables, the amount of goods and of money, besides the size of the system. An initially flat distribution of both variables is presupposed. We show that under completely random rules, i.e., random choice of interacting agent pairs on the network and random exchange rules, the market stabilizes in time and shows diversification of money and goods. We also indicate that the difference between poor and rich agents increases for small markets, as well as for systems in which money is steadily drained from the market through taxation. It is also found that the price of goods decreases when taxes are introduced, likely due to the reduced availability of money.
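
    A toy version of such a closed-market simulation, assuming a fully mixed population and a fixed unit price; the paper works on a fixed network and its exchange rules differ in detail, so this is only a structural sketch.

    ```python
    import random

    def closed_market(n=100, steps=100_000, money=10.0, goods=10.0):
        # Random pairwise exchange in a closed market: at each step a
        # random buyer/seller pair trades one unit of goods for one unit
        # of money, provided both can afford the trade.  Totals of money
        # and goods are conserved (the market is closed).
        m = [money] * n
        g = [goods] * n
        for _ in range(steps):
            buyer, seller = random.sample(range(n), 2)
            price = 1.0                    # assumed fixed unit price
            if m[buyer] >= price and g[seller] >= 1.0:
                m[buyer] -= price; m[seller] += price
                g[seller] -= 1.0;  g[buyer] += 1.0
        return m, g
    ```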

  10. How to Set Focal Categories for Brief Implicit Association Test? "Good" Is Good, "Bad" Is Not So Good.

    PubMed

    Shi, Yuanyuan; Cai, Huajian; Shen, Yiqin Alicia; Yang, Jing

    2016-01-01

    Three studies were conducted to examine the validity of the four versions of BIATs that are supposed to measure the same construct but differ in shared focal category. Study 1 investigated the criterion validity of four BIATs measuring attitudes toward flower versus insect. Study 2 examined the experimental sensitivity of four BIATs by considering attitudes toward induced ingroup versus outgroup. Study 3 examined the predictive power of the four BIATs by investigating attitudes toward the commercial beverages Coke versus Sprite. The findings suggested that for the two attributes "good" and "bad," "good" rather than "bad" proved to be good as a shared focal category; for two targets, so long as they clearly differed in goodness or valence, the "good" rather than "bad" target emerged as good for a shared focal category. Beyond this case, either target worked well. These findings may facilitate the understanding of the BIAT and its future applications. PMID:26869948

  12. Parameter estimation and optimal scheduling algorithm for a mathematical model of intermittent androgen suppression therapy for prostate cancer

    NASA Astrophysics Data System (ADS)

    Guo, Qian; Lu, Zhichang; Hirata, Yoshito; Aihara, Kazuyuki

    2013-12-01

    We propose an algorithm based on cross-entropy to determine the parameters of a piecewise linear model, which describes intermittent androgen suppression therapy for prostate cancer. When compared with clinical data, the parameter estimation for the switched system shows good fitting accuracy and efficiency. We further optimize the switching time points of the piecewise linear model to obtain a feasible therapeutic schedule. The simulated therapeutic effects are superior to those of the previous strategy.
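
    A generic cross-entropy parameter search of the kind described, where `loss` would measure the mismatch between the piecewise linear model's output and clinical time series; the population sizes and the Gaussian sampling model are assumptions, not the paper's settings.

    ```python
    import numpy as np

    def cross_entropy_fit(loss, dim, iters=50, pop=200, elite=20, seed=0):
        # Cross-entropy method: sample candidate parameter vectors from a
        # Gaussian, keep the elite fraction with the lowest loss, and refit
        # the Gaussian to the elite until it concentrates on good fits.
        rng = np.random.default_rng(seed)
        mu, sigma = np.zeros(dim), np.ones(dim)
        for _ in range(iters):
            samples = rng.normal(mu, sigma, size=(pop, dim))
            scores = np.array([loss(s) for s in samples])
            best = samples[np.argsort(scores)[:elite]]
            mu, sigma = best.mean(axis=0), best.std(axis=0) + 1e-6
        return mu
    ```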

  13. Quantifying capital goods for waste incineration

    SciTech Connect

    Brogaard, L.K.; Riber, C.; Christensen, T.H.

    2013-06-15

    Highlights: • Materials and energy used for the construction of waste incinerators were quantified. • The data was collected from five incineration plants in Scandinavia. • Included were six main materials, electronic systems, cables and all transportation. • The capital goods contributed 2–3% compared to the direct emissions impact on GW. Abstract: Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000–240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used, amounting to 19,000–26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000–5000 MWh. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7–14 kg CO₂ per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2–3% with respect to kg CO₂ per tonne of waste combusted.

  14. TrackEye tracking algorithm characterization

    NASA Astrophysics Data System (ADS)

    Valley, Michael T.; Shields, Robert W.; Reed, Jack M.

    2004-10-01

    TrackEye is a film digitization and target tracking system that offers the potential for quantitatively measuring the dynamic state variables (e.g., absolute and relative position, orientation, linear and angular velocity/acceleration, spin rate, trajectory, angle of attack, etc.) for moving objects using captured single or dual view image sequences. At the heart of the system is a set of tracking algorithms that automatically find and quantify the location of user selected image details such as natural test article features or passive fiducials that have been applied to cooperative test articles. This image position data is converted into real world coordinates and rates with user specified information such as the image scale and frame rate. Though tracking methods such as correlation algorithms are typically robust by nature, the accuracy and suitability of each TrackEye tracking algorithm is in general unknown even under good imaging conditions. The challenges of optimal algorithm selection and algorithm performance/measurement uncertainty are even more significant for long range tracking of high-speed targets where temporally varying atmospheric effects degrade the imagery. This paper will present the preliminary results from a controlled test sequence used to characterize the performance of the TrackEye tracking algorithm suite.

  15. New formulations of monotonically convergent quantum control algorithms

    NASA Astrophysics Data System (ADS)

    Maday, Yvon; Turinici, Gabriel

    2003-05-01

    Most numerical simulations in quantum (bilinear) control have used one of the monotonically convergent algorithms of Krotov (introduced by Tannor et al.) or of Zhu and Rabitz. However, until now no explicit relationship has been revealed between the two algorithms that would explain their common properties. Within this framework, we propose in this paper a unified formulation that comprises both algorithms and that extends to a new class of monotonically convergent algorithms. Numerical results show that the newly derived algorithms behave as well as (and sometimes better than) the well-known algorithms cited above.

  16. Wire Detection Algorithms for Navigation

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Camps, Octavia I.

    2002-01-01

    In this research we addressed the problem of obstacle detection for low altitude rotorcraft flight. In particular, the problem of detecting thin wires in the presence of image clutter and noise was studied. Wires present a serious hazard to rotorcraft. Since they are very thin, detecting them early enough for the pilot to take evasive action is difficult, as their images can be less than one or two pixels wide. Two approaches were explored for this purpose. The first approach involved a technique for sub-pixel edge detection and subsequent post-processing in order to reduce false alarms. After reviewing the line detection literature, an algorithm for sub-pixel edge detection proposed by Steger was identified as having good potential to solve the considered task. The algorithm was tested using a set of images synthetically generated by combining real outdoor images with computer-generated wire images. The performance of the algorithm was evaluated both at the pixel and the wire levels. It was observed that the algorithm performs well, provided that the wires are not too thin (or distant) and that some post-processing is performed to remove false alarms due to clutter. The second approach involved the use of an example-based learning scheme, namely Support Vector Machines. The purpose of this approach was to explore the feasibility of an example-based learning approach for the task of detecting wires from their images. Support Vector Machines (SVMs) have emerged as a promising pattern classification tool and have been used in various applications. It was found that this approach is not suitable for very thin wires and, of course, not suitable at all for sub-pixel thick wires. High dimensionality of the data as such does not present a major problem for SVMs. However, it is desirable to have a large number of training examples, especially for high dimensional data. The main difficulty in using SVMs (or any other example-based learning

  17. Cape of Good Hope: Teacher Description and Project Plan.

    ERIC Educational Resources Information Center

    Moyo, Kimya

    1998-01-01

    Presents detailed information about the Cape of Good Hope project, in which pairs of students designed capes and cloaks out of garbage bags for a fashion show. Also describes student objectives, unit goals, group activities, products required, and the final show and presentation. (ASK)

  18. Algorithm That Synthesizes Other Algorithms for Hashing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2010-01-01

    An algorithm that includes a collection of several subalgorithms has been devised as a means of synthesizing still other algorithms (which could include computer code) that utilize hashing to determine whether an element (typically, a number or other datum) is a member of a set (typically, a list of numbers). Each subalgorithm synthesizes an algorithm (e.g., a block of code) that maps a static set of key hashes to a somewhat linear monotonically increasing sequence of integers. The goal in formulating this mapping is to cause the length of the sequence thus generated to be as close as practicable to the original length of the set and thus to minimize gaps between the elements. The advantage of the approach embodied in this algorithm is that it completely avoids the traditional approach of hash-key look-ups that involve either secondary hash generation and look-up or further searching of a hash table for a desired key in the event of collisions. This algorithm guarantees that it will never be necessary to perform a search or to generate a secondary key in order to determine whether an element is a member of a set. This algorithm further guarantees that any algorithm that it synthesizes can be executed in constant time. To enforce these guarantees, the subalgorithms are formulated to employ a set of techniques, each of which works very effectively covering a certain class of hash-key values. These subalgorithms are of two types, summarized as follows: Given a list of numbers, try to find one or more solutions in which, if each number is shifted to the right by a constant number of bits and then masked with a rotating mask that isolates a set of bits, a unique number is thereby generated. In a variant of the foregoing procedure, omit the masking. Try various combinations of shifting, masking, and/or offsets until the solutions are found. From the set of solutions, select the one that provides the greatest compression for the representation and is executable in the
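
    A sketch of the first class of subalgorithms described above: search for a shift and mask so that every key maps to a distinct value (offsets and mask rotation are omitted). The bounds `max_shift` and `max_bits` are illustrative assumptions.

    ```python
    def synthesize_shift_mask(keys, max_shift=32, max_bits=8):
        # Try shift/mask combinations until (k >> shift) & mask yields a
        # unique value for every key -- the collision-free mapping the
        # subalgorithms attempt to synthesize.
        for shift in range(max_shift):
            for bits in range(1, max_bits + 1):
                mask = (1 << bits) - 1
                mapped = {(k >> shift) & mask for k in keys}
                if len(mapped) == len(keys):
                    return shift, mask
        return None  # fall back to other subalgorithms (offsets, etc.)
    ```

    With such a pair in hand, a membership test reduces to one constant-time lookup: index a table of size mask+1 with the mapped value and compare against the stored key, with no secondary hashing or searching.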

  19. DACH1: Its Role as a Classifier of Long Term Good Prognosis in Luminal Breast Cancer

    PubMed Central

    Lemetre, Christophe; Allen, Tony; Habashy, Hany O.; Ellis, Ian O.; Rees, Robert; Ball, Graham R.

    2014-01-01

    Background Oestrogen receptor (ER) positive (luminal) tumours account for the largest proportion of females with breast cancer. Theirs is a heterogeneous disease that presents clinical challenges in managing treatment. Three main biological luminal groups have been identified, but clinically these can be distilled into two prognostic groups, in which Luminal A is accorded a good prognosis and Luminal B correlates with a poor prognosis. Further biomarkers are needed to attain classification consensus. Machine learning approaches like Artificial Neural Networks (ANNs) have been used for classification and identification of biomarkers in breast cancer using high throughput data. In this study, we have used an artificial neural network (ANN) approach to identify DACH1 as a candidate luminal marker, and its role in predicting clinical outcome in breast cancer is assessed. Materials and methods A reiterative ANN approach incorporating a network inferencing algorithm was used to identify ER-associated biomarkers in a publicly available cDNA microarray dataset. DACH1 was identified as having a strong influence on ER-associated markers and a positive association with ER. Its clinical relevance in predicting breast cancer specific survival was investigated by statistically assessing protein expression levels after immunohistochemistry in a series of unselected breast cancers, formatted as a tissue microarray. Results Strong nuclear DACH1 staining is more prevalent in tubular and lobular breast cancer. Its expression correlated with ER-alpha positive tumours expressing PgR, epithelial cytokeratins (CK)18/19 and ‘luminal-like’ markers of good prognosis including FOXA1 and RERG (p<0.05). DACH1 is increased in patients showing longer cancer specific survival and disease free interval and reduced metastasis formation (p<0.001). Nuclear DACH1 showed a negative association with markers of aggressive growth and poor prognosis. Conclusion Nuclear DACH1 expression appears to be a

  20. SWAT: a spiking neural network training algorithm for classification problems.

    PubMed

    Wade, John J; McDaid, Liam J; Santos, Jose A; Sayers, Heather M

    2010-11-01

    This paper presents a synaptic weight association training (SWAT) algorithm for spiking neural networks (SNNs). SWAT merges the Bienenstock-Cooper-Munro (BCM) learning rule with spike timing dependent plasticity (STDP). The STDP/BCM rule yields a unimodal weight distribution in which the height of the plasticity window associated with STDP is modulated, causing stability after a period of training. The SNN uses a single training neuron in the training phase, where data associated with all classes are passed to this neuron. The rule then maps weights to the classifying output neurons to reflect similarities in the data across the classes. The SNN also includes both excitatory and inhibitory facilitating synapses, which create a frequency-routing capability allowing the information presented to the network to be routed to different hidden layer neurons. A variable neuron threshold level simulates the refractory period. SWAT is initially benchmarked against the nonlinearly separable Iris and Wisconsin Breast Cancer datasets. The results show that the proposed training algorithm exhibits a convergence accuracy of 95.5% and 96.2% for the Iris and Wisconsin training sets, respectively, and 95.3% and 96.7% for the testing sets. Noise experiments show that SWAT has good generalization capability. SWAT is also benchmarked using an isolated-digit automatic speech recognition (ASR) system where a subset of the TI46 speech corpus is used. Results show that with SWAT as the classifier, the ASR system provides an accuracy of 98.875% for training and 95.25% for testing.

  1. The origins of counting algorithms.

    PubMed

    Cantlon, Jessica F; Piantadosi, Steven T; Ferrigno, Stephen; Hughes, Kelly D; Barnard, Allison M

    2015-06-01

    Humans' ability to count by verbally labeling discrete quantities is unique in animal cognition. The evolutionary origins of counting algorithms are not understood. We report that nonhuman primates exhibit a cognitive ability that is algorithmically and logically similar to human counting. Monkeys were given the task of choosing between two food caches. First, they saw one cache baited with some number of food items, one item at a time. Then, a second cache was baited with food items, one at a time. At the point when the second set was approximately equal to the first set, the monkeys spontaneously moved to choose the second set even before that cache was completely baited. Using a novel Bayesian analysis, we show that the monkeys used an approximate counting algorithm for comparing quantities in sequence that is incremental, iterative, and condition controlled. This proto-counting algorithm is structurally similar to formal counting in humans and thus may have been an important evolutionary precursor to human counting. PMID:25953949

  2. A hybrid artificial bee colony algorithm for numerical function optimization

    NASA Astrophysics Data System (ADS)

    Alqattan, Zakaria N.; Abdullah, Rosni

    2015-02-01

    The Artificial Bee Colony (ABC) algorithm is one of the swarm intelligence algorithms; it was introduced by Karaboga in 2005. It is a meta-heuristic optimization search algorithm inspired by the intelligent foraging behavior of honey bees in nature. Its unique search process has made it competitive with other search algorithms in the area of optimization, such as the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). However, the performance of ABC's local search process and its bee-movement (solution improvement) equation still has some weaknesses. ABC is good at avoiding entrapment at local optima, but it spends its time searching around unpromising, randomly selected solutions. Inspired by PSO, we propose a Hybrid Particle-movement ABC algorithm, called HPABC, which adapts the particle movement process to improve the exploration of the original ABC algorithm. Numerical benchmark functions were used to experimentally test the HPABC algorithm. The results illustrate that the HPABC algorithm can outperform the ABC algorithm in most of the experiments (75% better in accuracy and over 3 times faster).
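
    The abstract does not give the exact HPABC movement equation, so the following is only a hedged sketch of a PSO-inspired update applied to one ABC food source: the usual ABC neighbour term is kept and an attraction toward the best-so-far solution is added. All parameter names and values (w, c1) are illustrative assumptions:

        import random

        def hpabc_move(x, x_best, x_rand, w=0.7, c1=1.5):
            # Perturb a single randomly chosen dimension, as in standard
            # ABC, but blend the neighbour term with a pull toward the
            # global best, in the spirit of PSO.
            j = random.randrange(len(x))
            v = list(x)
            phi = random.uniform(-1.0, 1.0)
            v[j] = (x[j]
                    + w * phi * (x[j] - x_rand[j])                # ABC neighbour term
                    + c1 * random.random() * (x_best[j] - x[j]))  # PSO-style pull
            return v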

  3. A study on ionospheric TEC forecast using genetic algorithm and neural network

    NASA Astrophysics Data System (ADS)

    Huang, Zhi; Yuan, Hong

    Because the ionosphere is highly nonlinear and time-varying, with random variations, a back-propagation artificial neural network (ANN) augmented by a genetic algorithm (GA) is introduced in this paper to forecast ionospheric TEC from dual-frequency GPS measurements taken in years of low and high solar activity. First, the training performance of the network model is analyzed for different numbers of neurons in the hidden layer and different transfer and training functions, and the optimized network structure is determined. Ionospheric TEC values one hour ahead are then forecasted, and the prediction performance of the developed network model is evaluated against the given criteria. The results show that TEC predicted with the GA-improved BP neural network agrees well with observed data. In addition, the prediction errors are smaller at middle and high latitudes than at low latitudes, and smaller in low solar activity than in high solar activity. Compared with a three-layer BP network, the prediction precision of the network model optimized by the genetic algorithm is further improved. These results indicate that the proposed algorithm offers a powerful and reliable alternative for the design of ionospheric TEC forecast technologies and can inform regional ionospheric TEC maps. Key words: Neural network, Genetic algorithm, Ionospheric TEC, Forecast

  4. Fast algorithm for scaling analysis with higher-order detrending moving average method

    NASA Astrophysics Data System (ADS)

    Tsujimoto, Yutaka; Miki, Yuki; Shimatani, Satoshi; Kiyono, Ken

    2016-05-01

    Among scaling analysis methods based on the root-mean-square deviation from the estimated trend, centered detrending moving average (DMA) analysis with a simple moving average has been demonstrated to perform well when characterizing long-range correlation or fractal scaling behavior. Furthermore, higher-order DMA has also been proposed; it has better detrending capability than the original DMA, removing higher-order polynomial trends. However, a straightforward implementation of higher-order DMA requires a very high computational cost, which would prevent practical use of this method. To solve this issue, in this study, we introduce a fast algorithm for higher-order DMA, which consists of two techniques: (1) parallel translation of moving-averaging windows by a fixed interval; (2) recurrence formulas for the calculation of summations. Our algorithm can significantly reduce the computational cost. Monte Carlo experiments show that the computational time of our algorithm is approximately proportional to the data length, whereas that of the conventional algorithm is proportional to the square of the data length. The efficiency of our algorithm is also shown by a systematic study of the performance of higher-order DMA, such as the range of detectable scaling exponents and the detrending capability for removing polynomial trends. In addition, through the analysis of heart-rate variability time series, we discuss possible applications of higher-order DMA.
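
    For intuition, a minimal sketch of the recurrence idea in the zeroth-order (simple moving average) case: a cumulative sum lets every window reuse the previous window's total, so the DMA fluctuation function costs O(N) per scale instead of O(N·n). This illustrates only the summation trick, not the authors' higher-order algorithm:

        import numpy as np

        def centered_moving_average(x, n):
            # O(N) moving average via cumulative sums: each window reuses
            # the previous window's total instead of re-summing n samples.
            c = np.cumsum(np.insert(x, 0, 0.0))
            return (c[n:] - c[:-n]) / n

        def dma_fluctuation(x, n):
            # Root-mean-square deviation of the integrated profile from
            # its moving-average trend (zeroth-order DMA); n must be odd.
            y = np.cumsum(x - np.mean(x))
            trend = centered_moving_average(y, n)
            half = n // 2
            return np.sqrt(np.mean((y[half:len(y) - half] - trend) ** 2))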

  5. Seeker optimization algorithm for parameter estimation of time-delay chaotic systems

    NASA Astrophysics Data System (ADS)

    Dai, Chaohua; Chen, Weirong; Li, Lixiang; Zhu, Yunfang; Yang, Yixian

    2011-03-01

    Time-delay chaotic systems have some very interesting properties, and their parameter estimation has received increasing interest in recent years. It is well known that parameter estimation of a chaotic system is a nonlinear, multivariable, and multimodal optimization problem for which global optimization techniques are required in order to avoid local minima. In this work, a seeker-optimization-algorithm (SOA)-based method is proposed to address this issue. In the SOA, the search direction is based on empirical gradients obtained by evaluating the response to position changes, and the step length is based on uncertainty reasoning using a simple fuzzy rule. The performance of the algorithm is evaluated on two typical test systems. Moreover, two state-of-the-art algorithms (i.e., particle swarm optimization and differential evolution) are also considered for comparison. The simulation results show that the proposed algorithm is better than or at least as good as the other two algorithms and can effectively solve the parameter estimation problem of time-delay chaotic systems.

  6. Totally parallel multilevel algorithms

    NASA Technical Reports Server (NTRS)

    Frederickson, Paul O.

    1988-01-01

    Four totally parallel algorithms for the solution of a sparse linear system have common characteristics which become quite apparent when they are implemented on a highly parallel hypercube such as the CM2. These four algorithms are the Parallel Superconvergent Multigrid (PSMG) of Frederickson and McBryan, the Robust Multigrid (RMG) of Hackbusch, the FFT-based Spectral Algorithm, and Parallel Cyclic Reduction. In fact, all four can be formulated as particular cases of the same totally parallel multilevel algorithm, referred to as TPMA. In certain cases the spectral radius of TPMA is zero, and it is recognized to be a direct algorithm. In many other cases the spectral radius, although not zero, is small enough that a single iteration per timestep keeps the local error within the required tolerance.

  7. Motion Cueing Algorithm Development: Piloted Performance Testing of the Cueing Algorithms

    NASA Technical Reports Server (NTRS)

    Houck, Jacob A. (Technical Monitor); Telban, Robert J.; Cardullo, Frank M.; Kelly, Lon C.

    2005-01-01

    The relative effectiveness in simulating aircraft maneuvers with both current and newly developed motion cueing algorithms was assessed with an eleven-subject piloted performance evaluation conducted on the NASA Langley Visual Motion Simulator (VMS). In addition to the current NASA adaptive algorithm, two new cueing algorithms were evaluated: the optimal algorithm and the nonlinear algorithm. The test maneuvers included a straight-in approach with a rotating wind vector, an offset approach with severe turbulence and an on/off lateral gust that occurs as the aircraft approaches the runway threshold, and a takeoff both with and without engine failure after liftoff. The maneuvers were executed with each cueing algorithm with added visual display delay conditions ranging from zero to 200 msec. Two methods, the quasi-objective NASA Task Load Index (TLX) and power spectral density analysis of pilot control, were used to assess pilot workload. Piloted performance parameters for the approach maneuvers, the vertical velocity upon touchdown and the runway touchdown position, were also analyzed but did not show any noticeable difference among the cueing algorithms. TLX analysis reveals, in most cases, less workload and variation among pilots with the nonlinear algorithm. Control input analysis shows that pilot-induced oscillations on a straight-in approach were less prevalent than with the optimal algorithm. The augmented turbulence cues increased workload on an offset approach that the pilots deemed more realistic than with the NASA adaptive algorithm. The takeoff with engine failure showed the least roll activity for the nonlinear algorithm, and the least rudder pedal activity for the optimal algorithm.

  8. Efficient Homotopy Continuation Algorithms with Application to Computational Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Brown, David A.

    New homotopy continuation algorithms are developed and applied to a parallel implicit finite-difference Newton-Krylov-Schur external aerodynamic flow solver for the compressible Euler, Navier-Stokes, and Reynolds-averaged Navier-Stokes equations with the Spalart-Allmaras one-equation turbulence model. Many new analysis tools, calculations, and numerical algorithms are presented for the study and design of efficient and robust homotopy continuation algorithms applicable to solving very large and sparse nonlinear systems of equations. Several specific homotopies are presented and studied, and a methodology is given for assessing the suitability of specific homotopies for homotopy continuation. A new class of homotopy continuation algorithms, referred to as monolithic homotopy continuation algorithms, is developed. These algorithms differ from classical predictor-corrector algorithms by combining the predictor and corrector stages into a single update, significantly reducing the amount of computation and avoiding wasted computational effort resulting from over-solving in the corrector phase. The new algorithms are also simpler from a user perspective, with fewer input parameters, which also improves the user's ability to choose effective parameters on the first flow solve attempt. Conditional convergence is proved analytically and studied numerically for the new algorithms. The performance of a fully-implicit monolithic homotopy continuation algorithm is evaluated for several inviscid, laminar, and turbulent flows over NACA 0012 airfoils and ONERA M6 wings. The monolithic algorithm is demonstrated to be more efficient than the predictor-corrector algorithm for all applications investigated. It is also demonstrated to be more efficient than the widely-used pseudo-transient continuation algorithm for all inviscid and laminar cases investigated, and good performance scaling with grid refinement is demonstrated for the inviscid cases. Performance is also demonstrated

  9. Advancements to the planogram frequency–distance rebinning algorithm

    PubMed Central

    Champley, Kyle M; Raylman, Raymond R; Kinahan, Paul E

    2010-01-01

    In this paper we consider the task of image reconstruction in positron emission tomography (PET) with the planogram frequency–distance rebinning (PFDR) algorithm. The PFDR algorithm is a rebinning algorithm for PET systems with panel detectors. The algorithm is derived in the planogram coordinate system which is a native data format for PET systems with panel detectors. A rebinning algorithm averages over the redundant four-dimensional set of PET data to produce a three-dimensional set of data. Images can be reconstructed from this rebinned three-dimensional set of data. This process enables one to reconstruct PET images more quickly than reconstructing directly from the four-dimensional PET data. The PFDR algorithm is an approximate rebinning algorithm. We show that implementing the PFDR algorithm followed by the (ramp) filtered backprojection (FBP) algorithm in linogram coordinates from multiple views reconstructs a filtered version of our image. We develop an explicit formula for this filter which can be used to achieve exact reconstruction by means of a modified FBP algorithm applied to the stack of rebinned linograms and can also be used to quantify the errors introduced by the PFDR algorithm. This filter is similar to the filter in the planogram filtered backprojection algorithm derived by Brasse et al. The planogram filtered backprojection and exact reconstruction with the PFDR algorithm require complete projections which can be completed with a reprojection algorithm. The PFDR algorithm is similar to the rebinning algorithm developed by Kao et al. By expressing the PFDR algorithm in detector coordinates, we provide a comparative analysis between the two algorithms. Numerical experiments using both simulated data and measured data from a positron emission mammography/tomography (PEM/PET) system are performed. Images are reconstructed by PFDR+FBP (PFDR followed by 2D FBP reconstruction), PFDRX (PFDR followed by the modified FBP algorithm for exact

  10. Good veterinary governance: definition, measurement and challenges.

    PubMed

    Msellati, L; Commault, J; Dehove, A

    2012-08-01

    Good veterinary governance assumes the provision of veterinary services that are sustainably financed, universally available, and provided efficiently without waste or duplication, in a manner that is transparent and free of fraud or corruption. Good veterinary governance is a necessary condition for sustainable economic development insomuch as it promotes the effective delivery of services and improves the overall performance of animal health systems. This article defines governance in Veterinary Services and proposes a framework for its measurement. It also discusses the role of Veterinary Services and analyses the governance dimensions of the performance-assessment tools developed by the World Organisation for Animal Health (OIE). These tools (OIE PVS Tool and PVS Gap Analysis) track the performance of Veterinary Services across countries (a harmonised tool) and over time (the PVS Pathway). The article shows the usefulness of the OIE PVS Tool for measuring governance, but also points to two shortcomings, namely (i) the lack of clear outcome indicators, which is an impediment to a comprehensive assessment of the performance of Veterinary Services, and (ii) the lack of specific measures for assessing the extent of corruption within Veterinary Services and the extent to which demand for better governance is being strengthened within the animal health system. A discussion follows on the drivers of corruption and instruments for perception-based assessments of country governance and corruption. Similarly, the article introduces the concept of social accountability, which is an approach to enhancing government transparency and accountability, and shows how supply-side and demand-side mechanisms complement each other in improving the governance of service delivery. It further elaborates on two instruments--citizen report card surveys and grievance redress mechanisms--because of their wider relevance and their possible applications in many settings, including Veterinary

  12. 31 CFR 575.414 - Imports of Iraqi goods and purchases of goods from Iraq.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury. Imports of Iraqi goods and purchases of goods from Iraq. 575.414 Section 575.414 Money and Finance: Treasury Regulations Relating to Money... REGULATIONS Interpretations § 575.414 Imports of Iraqi goods and purchases of goods from Iraq....

  13. A set-covering based heuristic algorithm for the periodic vehicle routing problem.

    PubMed

    Cacchiani, V; Hemmelmayr, V C; Tricoire, F

    2014-01-30

    We present a hybrid optimization algorithm for mixed-integer linear programming, embedding both heuristic and exact components. In order to validate it we use the periodic vehicle routing problem (PVRP) as a case study. This problem consists of determining a set of minimum cost routes for each day of a given planning horizon, with the constraints that each customer must be visited a required number of times (chosen among a set of valid day combinations), must receive the required quantity of product at each visit, and that the number of routes per day (each respecting the capacity of the vehicle) does not exceed the total number of available vehicles. This is a generalization of the well-known vehicle routing problem (VRP). Our algorithm is based on the linear programming (LP) relaxation of a set-covering-like integer linear programming formulation of the problem, with additional constraints. The LP relaxation is solved by column generation, where columns are generated heuristically by an iterated local search algorithm. The whole solution method takes advantage of the LP solution and applies techniques of fixing and releasing of the columns as a local search, making use of a tabu list to avoid cycling. We report the results of the proposed algorithm on benchmark instances from the literature and compare them with state-of-the-art algorithms, demonstrating the effectiveness of our approach in producing good-quality solutions. In addition, we report the results on realistic instances of the PVRP introduced in Pacheco et al. (2011) [24] and on benchmark instances of the periodic traveling salesman problem (PTSP), showing the efficacy of the proposed algorithm on these as well. Finally, we report the new best known solutions found for all the tested problems.
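
    A minimal sketch of the set-covering LP relaxation at the core of this method, assuming candidate routes are given as customer subsets with costs; column generation would repeatedly add new routes with negative reduced cost, priced here by the iterated local search. It uses scipy's linprog; all names are illustrative and the PVRP's day-combination constraints are omitted:

        import numpy as np
        from scipy.optimize import linprog

        def set_cover_lp(columns, costs, n_customers):
            # min c^T x  s.t.  A x >= 1,  0 <= x <= 1,
            # where A[i, j] = 1 if candidate route j visits customer i.
            A = np.zeros((n_customers, len(columns)))
            for j, visited in enumerate(columns):
                for i in visited:
                    A[i, j] = 1.0
            res = linprog(c=costs, A_ub=-A, b_ub=-np.ones(n_customers),
                          bounds=[(0.0, 1.0)] * len(columns), method="highs")
            return res.x, res.fun    # fractional route selection, LP bound

        x, lb = set_cover_lp(columns=[{0, 1}, {1, 2}, {0, 2}],
                             costs=[3.0, 4.0, 3.5], n_customers=3)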

  14. Detection of Cheating by Decimation Algorithm

    NASA Astrophysics Data System (ADS)

    Yamanaka, Shogo; Ohzeki, Masayuki; Decelle, Aurélien

    2015-02-01

    We expand item response theory to study the case of "cheating students" in a set of exams, trying to detect them by applying a greedy inference algorithm. This extended model is closely related to Boltzmann machine learning. In this paper we aim to infer the correct biases and interactions of our model by considering a relatively small number of sets of training data. Nevertheless, the greedy algorithm that we employed in the present study exhibits good performance with a small number of training data. The key point is the sparseness of the interactions in our problem in the context of Boltzmann machine learning: the existence of cheating students is expected to be very rare (presumably also in the real world). We compare a standard approach for inferring sparse interactions in Boltzmann machine learning to our greedy algorithm and find the latter to be superior in several aspects.

  15. The fuzzy C spherical shells algorithm - A new approach

    NASA Technical Reports Server (NTRS)

    Krishnapuram, Raghu; Nasraoui, Olfa; Frigui, Hichem

    1992-01-01

    The fuzzy c spherical shells (FCSS) algorithm is specially designed to search for clusters that can be described by circular arcs or, more generally, by shells of hyperspheres. In this paper, a new approach to the FCSS algorithm is presented. This algorithm is computationally and implementationally simpler than other clustering algorithms that have been suggested for this purpose. An unsupervised algorithm which automatically finds the optimum number of clusters is also proposed. This algorithm can be used when the number of clusters is not known. It uses a cluster validity measure to identify good clusters, merges all compatible clusters, and eliminates spurious clusters to achieve the final result. Experimental results on several data sets are presented.

  16. Recent Advancements in Lightning Jump Algorithm Work

    NASA Technical Reports Server (NTRS)

    Schultz, Christopher J.; Petersen, Walter A.; Carey, Lawrence D.

    2010-01-01

    In the past year, the primary objectives were to show the usefulness of total lightning as compared to traditional cloud-to-ground (CG) networks, to test the lightning jump algorithm configurations in other regions of the country, to increase the number of thunderstorms within our thunderstorm database, and to pinpoint environments that could prove difficult for any lightning jump configuration. A total of 561 thunderstorms have been examined in the past year (409 non-severe, 152 severe) from four regions of the country (North Alabama, Washington D.C., High Plains of CO/KS, and Oklahoma). Results continue to indicate that the 2σ lightning jump algorithm configuration holds the most promise in terms of prospective operational lightning jump algorithms, with a probability of detection (POD) of 81%, a false alarm rate (FAR) of 45%, a critical success index (CSI) of 49%, and a Heidke Skill Score (HSS) of 0.66. The second best performing configuration was the Threshold 4 algorithm, which had a POD of 72%, a FAR of 51%, a CSI of 41%, and an HSS of 0.58. Because a more complex algorithm configuration shows the most promise as a prospective operational lightning jump algorithm, accurate thunderstorm cell tracking work must be undertaken to track lightning trends on an individual thunderstorm basis over time. While these numbers for the 2σ configuration are impressive, the algorithm does have its weaknesses. Specifically, low-topped and tropical cyclone thunderstorm environments are present issues for the 2σ lightning jump algorithm because of the impact of suppressed vertical depth on overall flash counts (i.e., a relative dearth of lightning). For example, in a sample of 120 thunderstorms from northern Alabama that contained 72 events missed by the 2σ algorithm, 36% of the misses were associated with these two environments (17 storms).

  17. Fourth Order Algorithms for Solving Diverse Many-Body Problems

    NASA Astrophysics Data System (ADS)

    Chin, Siu A.; Forbert, Harald A.; Chen, Chia-Rong; Kidwell, Donald W.; Ciftja, Orion

    2001-03-01

    We show that the method of factorizing an evolution operator of the form e^ɛ(A+B) to fourth order with purely positive coefficients yields new classes of symplectic algorithms for solving classical dynamical problems, unitary algorithms for solving the time-dependent Schrödinger equation, norm-preserving algorithms for solving the Langevin equation, and large-time-step convergent diffusion Monte Carlo algorithms. Results for each class of problems will be presented and discussed.
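
    For reference, one widely quoted fourth-order member of this family of forward (positive-coefficient) factorizations, often labelled scheme 4A in the related literature (this specific form is taken from that literature, not from the abstract above), is

        \[
          e^{\varepsilon(A+B)} \;\approx\;
          e^{\frac{1}{6}\varepsilon B}\,
          e^{\frac{1}{2}\varepsilon A}\,
          e^{\frac{2}{3}\varepsilon \tilde B}\,
          e^{\frac{1}{2}\varepsilon A}\,
          e^{\frac{1}{6}\varepsilon B},
          \qquad
          \tilde B = B + \frac{\varepsilon^{2}}{48}\,[B,[A,B]],
        \]

    where the commutator correction in the middle term is what allows all exponents to carry positive coefficients, in contrast to standard fourth-order splittings, which necessarily contain negative time steps.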

  18. Genetic Algorithm Tuned Fuzzy Logic for Gliding Return Trajectories

    NASA Technical Reports Server (NTRS)

    Burchett, Bradley T.

    2003-01-01

    The problem of designing and flying a trajectory for successful recovery of a reusable launch vehicle is tackled using fuzzy logic control with genetic algorithm optimization. The plant is approximated by a simplified three-degree-of-freedom nonlinear model. A baseline trajectory design and guidance algorithm consisting of several Mamdani-type fuzzy controllers is tuned using a simple genetic algorithm. Preliminary results show that the performance of the overall system improves with genetic algorithm tuning.

  19. An incremental clustering algorithm based on Mahalanobis distance

    NASA Astrophysics Data System (ADS)

    Aik, Lim Eng; Choon, Tan Wee

    2014-12-01

    The classical fuzzy c-means clustering algorithm is insufficient for clustering non-spherical or elliptically distributed datasets. This paper replaces the Euclidean distance of classical fuzzy c-means clustering with the Mahalanobis distance and, because of its merits, applies the Mahalanobis distance to incremental learning. A Mahalanobis-distance-based fuzzy incremental clustering learning algorithm is proposed. Experimental results show that the algorithm not only is an effective remedy for this defect of the fuzzy c-means algorithm but also increases training accuracy.
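
    A minimal sketch of the substitution, assuming cluster covariance matrices are available; the membership update is the standard fuzzy c-means rule with the squared Euclidean distance swapped for the squared Mahalanobis distance (function names are illustrative):

        import numpy as np

        def mahalanobis_sq(x, center, cov):
            # Squared Mahalanobis distance; reduces to the squared
            # Euclidean distance when cov is the identity matrix.
            d = x - center
            return float(d @ np.linalg.inv(cov) @ d)

        def fcm_memberships(x, centers, covs, m=2.0):
            # Standard fuzzy c-means membership update with the Euclidean
            # distance replaced by the Mahalanobis distance.
            d2 = np.array([mahalanobis_sq(x, c, S)
                           for c, S in zip(centers, covs)])
            d2 = np.maximum(d2, 1e-12)            # guard against d = 0
            inv = d2 ** (-1.0 / (m - 1.0))
            return inv / inv.sum()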

  1. Evaluation of prostate segmentation algorithms for MRI: the PROMISE12 challenge.

    PubMed

    Litjens, Geert; Toth, Robert; van de Ven, Wendy; Hoeks, Caroline; Kerkstra, Sjoerd; van Ginneken, Bram; Vincent, Graham; Guillard, Gwenael; Birbeck, Neil; Zhang, Jindang; Strand, Robin; Malmberg, Filip; Ou, Yangming; Davatzikos, Christos; Kirschner, Matthias; Jung, Florian; Yuan, Jing; Qiu, Wu; Gao, Qinquan; Edwards, Philip Eddie; Maan, Bianca; van der Heijden, Ferdinand; Ghose, Soumya; Mitra, Jhimli; Dowling, Jason; Barratt, Dean; Huisman, Henkjan; Madabhushi, Anant

    2014-02-01

    -atlas registration, both on accuracy and computation time. Although average algorithm performance was good to excellent and the Imorphics algorithm outperformed the second observer on average, we showed that algorithm combination might lead to further improvement, indicating that optimal performance for prostate segmentation is not yet obtained. All results are available online at http://promise12.grand-challenge.org/. PMID:24418598

  2. Fast proximity algorithm for MAP ECT reconstruction

    NASA Astrophysics Data System (ADS)

    Li, Si; Krol, Andrzej; Shen, Lixin; Xu, Yuesheng

    2012-03-01

    We arrived at a fixed-point formulation of the total variation maximum a posteriori (MAP) regularized emission computed tomography (ECT) reconstruction problem and proposed an iterative alternating scheme to numerically calculate the fixed point. We theoretically proved that our algorithm converges to unique solutions. Because the obtained algorithm exhibits slow convergence, we further developed a proximity algorithm in the transformed image space, i.e., the preconditioned proximity algorithm. We used the bias-noise curve method to select optimal regularization hyperparameters for both our algorithm and expectation maximization with total variation regularization (EM-TV). We showed in the numerical experiments that our proposed algorithms, with an appropriately selected preconditioner, outperformed the conventional EM-TV algorithm in many critical aspects, such as comparatively very low noise and bias for the Shepp-Logan phantom. This has major ramifications for nuclear medicine, because clinical implementation of our preconditioned fixed-point algorithms might result in very significant radiation dose reduction in the medical applications of emission tomography.

  3. Quantum Algorithm for Linear Programming Problems

    NASA Astrophysics Data System (ADS)

    Joag, Pramod; Mehendale, Dhananjay

    The quantum algorithm (PRL 103, 150502, 2009) solves a system of linear equations with exponential speedup over existing classical algorithms. We show that the above algorithm can be readily adopted in iterative algorithms for solving linear programming (LP) problems. The first iterative algorithm that we suggest for the LP problem follows from duality theory. It consists of finding a nonnegative solution of the equations for the duality condition, for the constraints imposed by the given primal problem, and for the constraints imposed by its corresponding dual problem. This is called the problem of nonnegative least squares, or simply the NNLS problem. We use a well-known method for solving the NNLS problem due to Lawson and Hanson. This algorithm essentially consists of solving a new system of linear equations in each iterative step. The other iterative algorithms that can be used are those based on interior point methods. The same technique can be adopted for solving network flow problems, as these problems can be readily formulated as LP problems. The suggested quantum algorithm can solve LP problems and network flow problems of very large size, involving millions of variables.
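
    A hedged sketch of the classical formulation the abstract builds on, for a symmetric-form LP (min c^T x subject to Ax >= b, x >= 0): the primal constraints, dual constraints, and duality condition are stacked into one linear system solved for a nonnegative vector with Lawson-Hanson NNLS (scipy's nnls). The variable layout and names are illustrative:

        import numpy as np
        from scipy.optimize import nnls

        def lp_via_nnls(A, b, c):
            # Find z = (x, u, y, s) >= 0 with
            #   A x - u       = b     (primal feasibility, u = surplus)
            #   A^T y + s     = c     (dual feasibility, s = slack)
            #   c^T x - b^T y = 0     (duality condition)
            m, n = A.shape
            M = np.zeros((m + n + 1, n + m + m + n))
            d = np.concatenate([b, c, [0.0]])
            M[:m, :n] = A
            M[:m, n:n + m] = -np.eye(m)
            M[m:m + n, n + m:n + m + m] = A.T
            M[m:m + n, n + m + m:] = np.eye(n)
            M[m + n, :n] = c
            M[m + n, n + m:n + m + m] = -b
            z, residual = nnls(M, d)
            return z[:n], residual    # primal solution, residual norm

        # Example: min x1 + x2  s.t.  x1 + x2 >= 1, x >= 0
        x, r = lp_via_nnls(np.array([[1.0, 1.0]]),
                           np.array([1.0]), np.array([1.0, 1.0]))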

  4. A novel algorithm for Bluetooth ECG.

    PubMed

    Pandya, Utpal T; Desai, Uday B

    2012-11-01

    In wireless transmission of ECG, data latency becomes significant when the battery power level and data transmission distance are not maintained. In applications like home monitoring or personalized care, a novel filtering strategy is required to overcome the combined effect of these wireless transmission issues and other ECG measurement noise. Here, a novel algorithm, termed the peak-rejection adaptive-sampling modified moving average (PRASMMA) algorithm for wireless ECG, is introduced. This algorithm first removes errors in the bit pattern of the received data, if any occurred during wireless transmission, and then removes baseline drift. Afterward, a modified moving average is applied everywhere except in the region of each QRS complex. The algorithm also sets its filtering parameters according to the sampling rate selected for signal acquisition. To demonstrate the work, a prototype Bluetooth-based ECG module is used to capture ECG at different sampling rates and with the patient in different positions. This module transmits the ECG wirelessly to Bluetooth-enabled devices, where the PRASMMA algorithm is applied to the captured ECG. The performance of the PRASMMA algorithm is compared with moving average and Savitzky-Golay algorithms, both visually and numerically. The results show that the PRASMMA algorithm can significantly improve ECG reconstruction by efficiently removing noise, and its use can be extended to any signals where peaks are important for diagnostic purposes.
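
    A hedged sketch of the key idea of skipping the QRS regions: smooth the signal with a moving average everywhere except inside a guard window around each detected QRS complex, so the sharp peaks are not flattened. The window sizes and names below are illustrative, not the paper's values:

        import numpy as np

        def masked_moving_average(ecg, qrs_centers, fs, win_ms=40, guard_ms=100):
            # Moving-average smoothing that leaves the samples around
            # each QRS complex untouched.
            n = max(1, int(fs * win_ms / 1000)) | 1     # odd window length
            kernel = np.ones(n) / n
            smooth = np.convolve(ecg, kernel, mode="same")
            out = smooth.copy()
            guard = int(fs * guard_ms / 1000)
            for c in qrs_centers:
                lo, hi = max(0, c - guard), min(len(ecg), c + guard)
                out[lo:hi] = ecg[lo:hi]                 # keep raw QRS samples
            return out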

  5. Online Planning Algorithms for POMDPs

    PubMed Central

    Ross, Stéphane; Pineau, Joelle; Paquet, Sébastien; Chaib-draa, Brahim

    2009-01-01

    Partially Observable Markov Decision Processes (POMDPs) provide a rich framework for sequential decision-making under uncertainty in stochastic domains. However, solving a POMDP is often intractable except for small problems due to their complexity. Here, we focus on online approaches that alleviate the computational complexity by computing good local policies at each decision step during the execution. Online algorithms generally consist of a lookahead search to find the best action to execute at each time step in an environment. Our objectives here are to survey the various existing online POMDP methods, analyze their properties and discuss their advantages and disadvantages; and to thoroughly evaluate these online approaches in different environments under various metrics (return, error bound reduction, lower bound improvement). Our experimental results indicate that state-of-the-art online heuristic search methods can handle large POMDP domains efficiently. PMID:19777080

  6. Radiation Hormesis: The Good, the Bad, and the Ugly

    PubMed Central

    Luckey, T.D.

    2006-01-01

    Three aspects of hormesis with low doses of ionizing radiation are presented: the good, the bad, and the ugly. The good is acceptance by France, Japan, and China of the thousands of studies showing stimulation and/or benefit, with no harm, from low dose irradiation. This includes thousands of people who live in good health with high background radiation. The bad is the nonacceptance of radiation hormesis by the U. S. and most other governments; their linear no threshold (LNT) concept promulgates fear of all radiation and produces laws which have no basis in mammalian physiology. The LNT concept leads to poor health, unreasonable medicine and oppressed industries. The ugly is decades of deception by medical and radiation committees which refuse to consider valid evidence of radiation hormesis in cancer, other diseases, and health. Specific examples are provided for the good, the bad, and the ugly in radiation hormesis. PMID:18648595

  7. Elderly Consumers and the Used Goods Economy.

    ERIC Educational Resources Information Center

    Dobbs, Ralph C.

    A study examined the used goods market as it affects older adults. A set of open-ended questions was administered to 100 respondents over sixty years of age who were either retired or near retirement, married or widowed, and suburban or rural. Interviews were conducted to determine the effects of the used goods market on the elderly consumer, to…

  8. How To Achieve Good Library Acoustics.

    ERIC Educational Resources Information Center

    Wiens, Janet

    2003-01-01

    Discusses how to create a good acoustical environment for college libraries, focusing on requirements related to the HVAC system and lighting, and noting the importance of good maintenance. A sidebar looks at how to design and achieve the most appropriate HVAC and lighting systems for optimum library acoustics. (SM)

  9. Paleolithic Counseling - The Good Old Days.

    ERIC Educational Resources Information Center

    King, Paul T.

    This paper outlines what clients were like in the "Good Ol' Days", as compared with what they are like now. Formerly clients appeared to come in with a plethora of ego energy, while now it seems more like a depletion. Explicit in our culture now is the idea that it is almost healthy and good to publicize one's private experience. Some of…

  10. Static and evolutionary quantum public goods games

    NASA Astrophysics Data System (ADS)

    Liao, Zeyang; Qin, Gan; Hu, Lingzhi; Li, Songjian; Xu, Nanyang; Du, Jiangfeng

    2008-05-01

    We apply the continuous-variable quantization scheme to quantize public goods game and find that new pure strategy Nash equilibria emerge in the static case. Furthermore, in the evolutionary public goods game, entanglement can also contribute to the persistence of cooperation under various population structures without altruism, voluntary participation, and punishment.

  11. 19 CFR 10.810 - Originating goods.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...; (2) The good is a new or different article of commerce, as defined in § 10.809(i) of this subpart, that has been grown, produced, or manufactured in the territory of one or both of the Parties, is...-originating materials used in the production of the good undergoes an applicable change in...

  12. 19 CFR 10.770 - Originating goods.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...; (2) The good is a new or different article of commerce, as defined in § 10.769(i) of this subpart, that has been grown, produced, or manufactured in the territory of one or both of the Parties, is...-originating materials used in the production of the good undergoes an applicable change in...

  13. Feedback after Good Trials Enhances Learning

    ERIC Educational Resources Information Center

    Chiviacowsky, Suzete; Wulf, Gabriele

    2007-01-01

    Recent studies (Chiviacowsky & Wulf, 2002, 2005) have shown that learners prefer to receive feedback after they believe they had a "good" rather than "poor" trial. The present study followed up on this finding and examined whether learning would benefit if individuals received feedback after good relative to poor trials. Participants practiced a…

  14. The Good Friends Volunteer Program Evaluation Report.

    ERIC Educational Resources Information Center

    Hooper, Richard

    This evaluation report relates data pertaining to the 1975-76 school year. The Good Friends Volunteer Program was established in 1974. During the 1975-76 school year, over 3,000 volunteers in 110 schools participated in the Good Friends program. Duties included giving individual attention to students; enriching programs in such areas as music,…

  15. Student View: What Do Good Teachers Do?

    ERIC Educational Resources Information Center

    Educational Horizons, 2012

    2012-01-01

    Students know what good teaching looks like--but educators rarely ask them. See what these high school students, who are members of the Future Educators Association[R] and want to be teachers themselves, said. FEA is a part of the PDK family of education associations, which includes Pi Lambda Theta. Get insider advice on good teaching from some…

  16. Toward an Aristotelian Conception of Good Listening

    ERIC Educational Resources Information Center

    Rice, Suzanne

    2011-01-01

    In this essay Suzanne Rice examines Aristotle's ideas about virtue, character, and education as elements in an Aristotelian conception of good listening. Rice begins by surveying several different contexts in which listening typically occurs, using this information to introduce the argument that what should count as "good listening" must be…

  17. Learning to Show You're Listening

    ERIC Educational Resources Information Center

    Ward, Nigel G.; Escalante, Rafael; Al Bayyari, Yaffa; Solorio, Thamar

    2007-01-01

    Good listeners generally produce back-channel feedback, that is, short utterances such as "uh-huh" which signal active listening. As the rules governing back-channeling vary from language to language, second-language learners may need help acquiring this skill. This paper is an initial exploration of how to provide this. It presents a training…

  18. Quantifying capital goods for waste incineration.

    PubMed

    Brogaard, L K; Riber, C; Christensen, T H

    2013-06-01

    Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000-240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used amounting to 19,000-26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000-5000 MW h. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7-14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2-3% with respect to kg CO2 per tonne of waste combusted.

  19. How to pick a good fight.

    PubMed

    Joni, Saj-nicole A; Beyer, Damon

    2009-12-01

    Peace and harmony are overrated. Though conflict-free teamwork is often held up as the be-all and end-all of organizational life, it actually can be the worst thing to ever happen to a company. Look at Lehman Brothers. When Dick Fuld took over, he transformed a notoriously contentious workplace into one of Wall Street's most harmonious firms. But his efforts backfired--directors and managers became too agreeable, afraid to rock the boat by pointing out that the firm was heading into a crisis. Research shows that the single greatest predictor of poor company performance is complacency, which is why every organization needs a healthy dose of dissent. Not all kinds of conflict are productive, of course -companies need to find the right balance of alignment and competition and make sure that people's energies are pointed in a positive direction. In this article, two seasoned business advisers lay down ground rules for the right kinds of fights. First, the stakes must be worthwhile: The issue should involve a noble purpose or create noticeable--preferably game-changing--value. Next, good fights focus on the future; they're never about placing blame for the past. And it's critical for leaders to keep fights sportsmanlike, allow informal give-and-take in the trenches, and help soften the blow for the losing parties. PMID:19968056

  20. Developing a game plan for good sportsmanship.

    PubMed

    Lodl, Kathleen

    2005-01-01

    It is widely believed in the United States that competition is beneficial for youngsters. However, the media are full of examples of players, fans, and coaches whose behavior veers out of control. There have been well-documented examples of youth in livestock competitions illegally medicating show animals to make them appear calmer, officials biasing their rulings toward a team that will take the most fans to a playoff game, and team rivalries that have become so caustic as to be dangerous for competitors and fans. A university extension and its partners created a program called "Great Fans. Great Sports." in order to teach the kinds of behaviors we wish to instill among all who are involved in competitions. It requires entire communities to develop and implement plans for enhancing sportsmanship in music, debate, drama, 4-H, and other arenas, as well as sports. The goal is to make good sportsmanship not the exception but the norm. The authors provide anecdotal evidence that "Great Fans. Great Sports." is having a positive impact on the attitudes and behaviors of competitors, fans, and communities.

  3. Algorithm for Autonomous Landing

    NASA Technical Reports Server (NTRS)

    Kuwata, Yoshiaki

    2011-01-01

    Because of their small size, high maneuverability, and easy deployment, micro aerial vehicles (MAVs) are used for a wide variety of both civilian and military missions. One of their current drawbacks is the vast array of sensors (such as GPS, altimeter, radar, and the like) required to make a landing. Due to the MAV's small payload size, this is a major concern. Replacing the imaging sensors with a single monocular camera is sufficient to land a MAV. By applying optical flow algorithms to images obtained from the camera, time-to-collision can be measured. This is a measurement of position and velocity (but not of absolute distance), and it can be used to avoid obstacles as well as to facilitate a landing on a flat surface given a set of initial conditions. The key to this approach is to calculate time-to-collision based on some image on the ground. By holding the angular velocity constant, horizontal speed decreases linearly with the height, resulting in a smooth landing. Mathematical proofs show that even with actuator saturation or modeling/measurement uncertainties, MAVs can land safely. Landings of this nature may have a higher velocity than is desirable, but this can be compensated for by a cushioning or dampening system, or by using a system of legs to grab onto a surface. Such a monocular camera system can increase vehicle payload size (or correspondingly reduce vehicle size), increase speed of descent, and guarantee a safe landing by directly correlating speed to height from the ground.
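
    A toy simulation of the landing law sketched above, assuming the controller holds the optical-flow rate (speed divided by height) constant so that speed falls in proportion to height; all parameter values are illustrative:

        import numpy as np

        def constant_flow_landing(h0=10.0, omega=0.5, dt=0.01, t_end=15.0):
            # Command descent speed proportional to height (constant
            # optical-flow rate), so height decays smoothly toward zero.
            heights, h = [], h0
            for _ in np.arange(0.0, t_end, dt):
                v = omega * h            # speed falls linearly with height
                h = max(0.0, h - v * dt)
                heights.append(h)
            return np.array(heights)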

  4. Cyclic cooling algorithm

    SciTech Connect

    Rempp, Florian; Mahler, Guenter; Michel, Mathias

    2007-09-15

    We introduce a scheme to perform the cooling algorithm, first presented by Boykin et al. in 2002, for an arbitrary number of times on the same set of qbits. We achieve this goal by adding an additional SWAP gate and a bath contact to the algorithm. This way one qbit may repeatedly be cooled without adding additional qbits to the system. By using a product Liouville space to model the bath contact we calculate the density matrix of the system after a given number of applications of the algorithm.

  5. Network-Control Algorithm

    NASA Technical Reports Server (NTRS)

    Chan, Hak-Wai; Yan, Tsun-Yee

    1989-01-01

    Algorithm developed for optimal routing of packets of data along links of multilink, multinode digital communication network. Algorithm iterative and converges to cost-optimal assignment independent of initial assignment. Each node connected to other nodes through links, each containing number of two-way channels. Algorithm assigns channels according to message traffic leaving and arriving at each node. Modified to take account of different priorities among packets belonging to different users by using different delay constraints or imposing additional penalties via cost function.

  6. New stereo matching algorithm

    NASA Astrophysics Data System (ADS)

    Ahmed, Yasser A.; Afifi, Hossam; Rubino, Gerardo

    1999-05-01

    This paper presents a new algorithm for stereo matching. The main idea is to decompose the original problem into independent, hierarchical, and more elementary problems that can be solved faster without any complicated mathematics, using BBD. To achieve that, we use a new image feature called the 'continuity feature' instead of classical noise. This feature can be extracted from any kind of image by a simple process and without using a searching technique. A new matching technique is proposed to match the continuity feature. The new algorithm resolves the main disadvantages of feature-based stereo matching algorithms.

  7. Algorithms for remote estimation of chlorophyll-a in coastal and inland waters using red and near infrared bands.

    PubMed

    Gilerson, Alexander A; Gitelson, Anatoly A; Zhou, Jing; Gurlin, Daniela; Moses, Wesley; Ioannou, Ioannis; Ahmed, Samir A

    2010-11-01

    Remote sensing algorithms that use red and NIR bands for the estimation of chlorophyll-a concentration [Chl] can be more effective in inland and coastal waters than algorithms that use blue and green bands. We tested such two-band and three-band red-NIR algorithms using comprehensive synthetic data sets of reflectance spectra and inherent optical properties related to various water parameters and a very consistent in situ data set from several lakes in Nebraska, USA. The two-band algorithms tested with MERIS bands were Rrs(708)/Rrs(665) and Rrs(753)/Rrs(665). The three-band algorithm with MERIS bands was in the form R3=[Rrs(-1)(665)-Rrs(-1)(708)]×Rrs(753). It is shown that the relationships of both Rrs(708)/Rrs(665) and R3 with [Chl] do not depend much on the absorption by CDOM and non-algal particles, or the backscattering properties of water constituents, and can be defined in terms of water absorption coefficients at the respective bands as well as the phytoplankton specific absorption coefficient at 665 nm. The relationship of the latter with [Chl] was established for [Chl]>1 mg/m3 and then further used to develop algorithms which showed a very good match with field data and should not require regional tuning. PMID:21164758
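
    The two indices from the abstract are simple to compute; a sketch follows. The final [Chl] retrieval additionally requires a calibration (e.g., a fit of index against measured [Chl]) whose coefficients are not given in this abstract:

        def two_band_index(rrs665, rrs708):
            # Two-band red/NIR ratio, Rrs(708)/Rrs(665).
            return rrs708 / rrs665

        def three_band_index(rrs665, rrs708, rrs753):
            # Three-band MERIS index from the abstract:
            # R3 = [Rrs^-1(665) - Rrs^-1(708)] * Rrs(753).
            return (1.0 / rrs665 - 1.0 / rrs708) * rrs753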

  8. A discrete artificial bee colony algorithm incorporating differential evolution for the flow-shop scheduling problem with blocking

    NASA Astrophysics Data System (ADS)

    Han, Yu-Yan; Gong, Dunwei; Sun, Xiaoyan

    2015-07-01

    A flow-shop scheduling problem with blocking has important applications in a variety of industrial systems but is underrepresented in the research literature. In this study, a novel discrete artificial bee colony (ABC) algorithm is presented to solve the above scheduling problem with a makespan criterion by incorporating the ABC with differential evolution (DE). The proposed algorithm (DE-ABC) contains three key operators. One is related to the employed bee operator (i.e. adopting mutation and crossover operators of discrete DE to generate solutions with good quality); the second is concerned with the onlooker bee operator, which modifies the selected solutions using insert or swap operators based on the self-adaptive strategy; and the last is for the local search, that is, the insert-neighbourhood-based local search with a small probability is adopted to improve the algorithm's capability in exploitation. The performance of the proposed DE-ABC algorithm is empirically evaluated by applying it to well-known benchmark problems. The experimental results show that the proposed algorithm is superior to the compared algorithms in minimizing the makespan criterion.
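
    A minimal sketch of the two neighbourhood moves the onlooker bees apply to a job permutation; the self-adaptive part (choosing between the moves based on recent success) is omitted:

        import random

        def insert_move(perm):
            # Insert neighbourhood: remove one job, reinsert it elsewhere.
            p = list(perm)
            i, j = random.sample(range(len(p)), 2)
            p.insert(j, p.pop(i))
            return p

        def swap_move(perm):
            # Swap neighbourhood: exchange the positions of two jobs.
            p = list(perm)
            i, j = random.sample(range(len(p)), 2)
            p[i], p[j] = p[j], p[i]
            return p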

  9. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations.

    PubMed

    Koch, Nicholas C; Newhauser, Wayne D

    2010-02-01

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.

  10. Data quality system using reference dictionaries and edit distance algorithms

    NASA Astrophysics Data System (ADS)

    Karbarz, Radosław; Mulawka, Jan

    2015-09-01

    In the art of management it is important to make smart decisions, which in most cases is not a trivial task. Those decisions may determine production levels, the allocation of funds for investments, etc. Most of the parameters in the decision-making process, such as interest rates, goods values or exchange rates, may change. It is well known that these decision-making parameters are based on the data contained in data marts or a data warehouse. If the information derived from the processed data sets is the basis for the most important management decisions, the data are required to be accurate, complete and current. In order to achieve high-quality data and to gain measurable business benefits from them, a data quality system should be used. The article describes the approach to the problem, shows the algorithms in detail, and explains their usage. Finally, the test results are provided. The results show the best algorithms (in terms of quality and quantity) for different parameters and data distributions.
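
    A minimal sketch of the dictionary-plus-edit-distance idea follows; the Levenshtein implementation is standard, while the reference dictionary and the distance threshold are illustrative.

        # Correct a dirty value against a reference dictionary using edit distance.
        def levenshtein(a: str, b: str) -> int:
            prev = list(range(len(b) + 1))
            for i, ca in enumerate(a, 1):
                cur = [i]
                for j, cb in enumerate(b, 1):
                    cur.append(min(prev[j] + 1,                  # deletion
                                   cur[j - 1] + 1,               # insertion
                                   prev[j - 1] + (ca != cb)))    # substitution
                prev = cur
            return prev[-1]

        def correct(value, dictionary, max_dist=2):
            best = min(dictionary, key=lambda ref: levenshtein(value, ref))
            return best if levenshtein(value, best) <= max_dist else value

        cities = ["Warsaw", "Krakow", "Gdansk"]      # illustrative reference dictionary
        print(correct("Warsw", cities))              # -> "Warsaw"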

  11. Design and implementation of parallel multigrid algorithms

    NASA Technical Reports Server (NTRS)

    Chan, Tony F.; Tuminaro, Ray S.

    1988-01-01

    Techniques for mapping multigrid algorithms to solve elliptic PDEs on hypercube parallel computers are described and demonstrated. The need for proper data mapping to minimize communication distances is stressed, and an execution-time model is developed to show how algorithm efficiency is affected by changes in the machine and algorithm parameters. Particular attention is then given to the case of coarse computational grids, which can lead to idle processors, load imbalances, and inefficient performance. It is shown that convergence can be improved by using idle processors to solve a new problem concurrently on the fine grid defined by a splitting.

  12. Quantum hyperparallel algorithm for matrix multiplication.

    PubMed

    Zhang, Xin-Ding; Zhang, Xiao-Ming; Xue, Zheng-Yuan

    2016-01-01

    Hyperentangled states, entangled states with more than one degree of freedom, are considered a promising resource in quantum computation. Here we present a hyperparallel quantum algorithm for matrix multiplication with time complexity O(N^2), which is better than the best known classical algorithm. In our scheme, an N-dimensional vector is mapped to the state of a single source, which is separated into N paths. With the assistance of hyperentangled states, the inner product of two vectors can be calculated with a time complexity independent of dimension N. Our algorithm shows that hyperparallel quantum computation may provide a useful tool in quantum machine learning and "big data" analysis. PMID:27125586

  13. Quantum hyperparallel algorithm for matrix multiplication

    NASA Astrophysics Data System (ADS)

    Zhang, Xin-Ding; Zhang, Xiao-Ming; Xue, Zheng-Yuan

    2016-04-01

    Hyperentangled states, entangled states with more than one degree of freedom, are considered a promising resource in quantum computation. Here we present a hyperparallel quantum algorithm for matrix multiplication with time complexity O(N^2), which is better than the best known classical algorithm. In our scheme, an N-dimensional vector is mapped to the state of a single source, which is separated into N paths. With the assistance of hyperentangled states, the inner product of two vectors can be calculated with a time complexity independent of dimension N. Our algorithm shows that hyperparallel quantum computation may provide a useful tool in quantum machine learning and “big data” analysis.

  14. Estimating the granularity coefficient of a Potts-Markov random field within a Markov chain Monte Carlo algorithm.

    PubMed

    Pereyra, Marcelo; Dobigeon, Nicolas; Batatia, Hadj; Tourneret, Jean-Yves

    2013-06-01

    This paper addresses the problem of estimating the Potts parameter β jointly with the unknown parameters of a Bayesian model within a Markov chain Monte Carlo (MCMC) algorithm. Standard MCMC methods cannot be applied to this problem because performing inference on β requires computing the intractable normalizing constant of the Potts model. In the proposed MCMC method, the estimation of β is conducted using a likelihood-free Metropolis-Hastings algorithm. Experimental results obtained for synthetic data show that estimating β jointly with the other unknown parameters leads to estimation results that are as good as those obtained with the actual value of β. On the other hand, choosing an incorrect value of β can degrade estimation performance significantly. To illustrate the interest of this method, the proposed algorithm is successfully applied to real bidimensional SAR and tridimensional ultrasound images.

  15. A novel kernel extreme learning machine algorithm based on self-adaptive artificial bee colony optimisation strategy

    NASA Astrophysics Data System (ADS)

    Ma, Chao; Ouyang, Jihong; Chen, Hui-Ling; Ji, Jin-Chao

    2016-04-01

    In this paper, we propose a novel learning algorithm, named SABC-MKELM, based on a kernel extreme learning machine (KELM) method for single-hidden-layer feedforward networks. In SABC-MKELM, a combination of Gaussian kernels is used as the activation function of the KELM instead of a single fixed kernel, and the kernel parameters and kernel weights can be optimised simultaneously by a novel self-adaptive artificial bee colony (SABC) approach. SABC-MKELM outperforms six other state-of-the-art approaches in general, as it can effectively determine solution-updating strategies and suitable parameters to produce a flexible kernel function within SABC. Simulations demonstrate that the proposed algorithm not only self-adaptively determines suitable parameters and solution-updating strategies by learning from previous experience, but also achieves better generalisation performance than several related methods, and the results show good stability of the proposed algorithm.
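
    A sketch of the combined-Gaussian-kernel idea follows: the kernel is a weighted sum of Gaussian kernels, and the KELM output weights then follow from a ridge-style linear solve. In the paper the kernel widths and weights are tuned by the self-adaptive ABC search, which is not reproduced here; the values below are placeholders.

        import numpy as np

        # Weighted combination of Gaussian kernels.
        def multi_kernel(X, Y, sigmas, weights):
            sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # squared distances
            K = np.zeros((X.shape[0], Y.shape[0]))
            for s, w in zip(sigmas, weights):
                K += w * np.exp(-sq / (2.0 * s ** 2))
            return K

        X = np.random.rand(5, 3)
        T = np.random.rand(5, 1)                                  # training targets
        K = multi_kernel(X, X, sigmas=[0.5, 1.0, 2.0], weights=[0.2, 0.5, 0.3])
        beta = np.linalg.solve(K + np.eye(5) / 10.0, T)           # KELM-style ridge solve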

  16. [The myth of the good savage].

    PubMed

    Yampey, N

    1994-09-01

    The conquest of the New World gave way to the myth of the Good Savage. For the Renaissance intellectuals, the ancient ideas about the Golden Age (an ideal society promising unending bliss) seemed to be brought back to life at last. Sharply contrasting with the exacerbated European unrest of the time, America stood for a redeeming hope, a symbol of a better future. The myth of the Good Savage assumes people to be naturally good, but civilization has led them into the realm of violence, hatred, and cruelty. Besides being naturally good, nice-minded people, "good savages" were also useful, obedient people, most likely to be easily exploited by Europeans--a source for the historical drama to come. On the verge of freeing itself from Spanish rule, Latin America--fighting its way toward independence--had three enlightened mentors: Voltaire, Rousseau, and Montesquieu. There, again, another deep contrast arose between Latin America's abstract aims at perfection and people's actual behaviors. The former "good savage" became the modern "Latin American", embodying a utopia as well as a hope in his eagerness to set up a plural and humanized culture. The myth of the Good Savage represents a deep longing for an objectivation of the ego-ideal: it has been used, so to speak, in collective mobilizations as well as dogmatic crystallizations, to escape from ignominious realities or to project alternatives for a better socially-shared life. PMID:7872031

  17. [The myth of the good savage].

    PubMed

    Yampey, N

    1994-09-01

    The conquest of the New World gave way to the myth of the Good Savage. For the Renaissance intellectuals, the ancient ideas about the Golden Age (an ideal society promising unending bliss) seemed to be brought back to life at last. Sharply contrasting with the exacerbated European unrest of the time, America stood for a redeeming hope, a symbol of a better future. The myth of the Good Savage assumes people to be naturally good, but civilization has led them into the realm of violence, hatred, and cruelty. Besides being naturally good, nice-minded people, "good savages" were also useful, obedient people, most likely to be easily exploited by Europeans--a source for the historical drama to come. On the verge of freeing itself from Spanish rule, Latin America--fighting its way toward independence--had three enlightened mentors: Voltaire, Rousseau, and Montesquieu. There, again, another deep contrast arose between Latin America's abstract aims at perfection and people's actual behaviors. The former "good savage" became the modern "Latin American", embodying a utopia as well as a hope in his eagerness to set up a plural and humanized culture. The myth of the Good Savage represents a deep longing for an objectivation of the ego-ideal: it has been used, so to speak, in collective mobilizations as well as dogmatic crystallizations, to escape from ignominious realities or to project alternatives for a better socially-shared life.

  18. Authenticated algorithms for Byzantine agreement

    SciTech Connect

    Dolev, D.; Strong, H.R.

    1983-11-01

    Reaching agreement in a distributed system in the presence of faulty processors is a central issue for reliable computer systems. Using an authentication protocol, one can limit the undetected behavior of faulty processors to a simple failure to relay messages to all intended targets. In this paper the authors show that, in spite of such an ability to limit faulty behavior, and no matter what message types or protocols are allowed, reaching (Byzantine) agreement requires at least t+1 phases or rounds of information exchange, where t is an upper bound on the number of faulty processors. They present algorithms for reaching agreement based on authentication that require a total number of messages sent by correctly operating processors that is polynomial in both t and the number of processors, n. The best algorithm uses only t+1 phases and O(nt) messages. 9 references.

  19. Modeling algorithm execution time on processor arrays

    NASA Technical Reports Server (NTRS)

    Adams, L. M.; Crockett, T. W.

    1984-01-01

    An approach to modelling the execution time of algorithms on parallel arrays is presented. This time is expressed as a function of the number of processors and system parameters. The resulting model has been applied to a parallel implementation of the conjugate-gradient algorithm on NASA's FEM. Results of experiments performed to compare the model predictions against actual behavior show that the floating-point arithmetic, communication, and synchronization components of the parallel algorithm execution time were correctly modelled. The results also show that the overhead caused by the interaction of the system software and the actual parallel hardware must be reflected in the model parameters. The model has been used to predict the performance of the conjugate gradient algorithm on a given problem as the number of processors and machine characteristics varied.
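
    A toy model of this kind, with the per-iteration time split into computation, communication and synchronization terms as a function of the number of processors p, might look as follows; the machine constants are illustrative, not those of NASA's FEM.

        # Toy execution-time model: T(p) = compute + communicate + synchronize.
        def exec_time(n_points, p, t_flop=1e-8, flops_per_point=10,
                      t_startup=1e-5, t_byte=1e-8, msg_bytes=1024, t_sync=2e-5):
            compute = (n_points / p) * flops_per_point * t_flop
            communicate = t_startup + msg_bytes * t_byte   # nearest-neighbour exchange
            synchronize = t_sync
            return compute + communicate + synchronize

        for p in (1, 4, 16, 64):
            print(p, exec_time(1_000_000, p))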

  20. Accelerating k-NN Algorithm with Hybrid MPI and OpenSHMEM

    SciTech Connect

    Lin, Jian; Hamidouche, Khaled; Zheng, Jie; Lu, Xiaoyi; Vishnu, Abhinav; Panda, Dhabaleswar

    2015-08-05

    Machine learning algorithms are benefiting from the continuous improvement of programming models, including MPI, MapReduce and PGAS. The k-Nearest Neighbors (k-NN) algorithm is a widely used machine learning algorithm, applied to supervised learning tasks such as classification. Several parallel implementations of k-NN have been proposed in the literature and in practice. However, on high-performance computing systems with high-speed interconnects, it is important to further accelerate existing designs of the k-NN algorithm by taking advantage of scalable programming models. To improve the performance of k-NN in large-scale environments with InfiniBand networks, this paper proposes several alternative hybrid MPI+OpenSHMEM designs and performs a systematic evaluation and analysis on typical workloads. The hybrid designs leverage one-sided memory access to better overlap communication with computation than the existing pure MPI design, and propose better schemes for efficient buffer management. The implementation, based on the k-NN program from MaTEx with MVAPICH2-X (Unified MPI+PGAS Communication Runtime over InfiniBand), shows up to a 9.0% time reduction for training the KDD Cup 2010 workload over 512 cores, and a 27.6% time reduction for a small workload with balanced communication and computation. Experiments with varied numbers of cores show that our design maintains good scalability.
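
    The serial kernel that such hybrid designs distribute is the distance computation and neighbour vote; a plain NumPy sketch is given below, with the MPI/OpenSHMEM communication layer out of scope. Each process would score its shard of the training set and the candidate neighbours would then be merged.

        import numpy as np

        # Brute-force k-NN classification for a batch of queries.
        def knn_predict(train_x, train_y, queries, k=5):
            d2 = ((queries[:, None, :] - train_x[None, :, :]) ** 2).sum(-1)
            idx = np.argpartition(d2, k, axis=1)[:, :k]    # k nearest per query
            votes = train_y[idx]
            return np.array([np.bincount(v).argmax() for v in votes])

        train_x = np.random.rand(1000, 16)
        train_y = np.random.randint(0, 3, 1000)
        print(knn_predict(train_x, train_y, np.random.rand(4, 16)))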

  1. Ancestral population genomics using coalescence hidden Markov models and heuristic optimisation algorithms.

    PubMed

    Cheng, Jade Yu; Mailund, Thomas

    2015-08-01

    With full genome data from several closely related species now readily available, we have the ultimate data for demographic inference. Exploiting these full genomes, however, requires models that can explicitly model recombination along alignments of full chromosomal length. Over the last decade a class of models, based on the sequential Markov coalescence model combined with hidden Markov models, has been developed and used to make inference in simple demographic scenarios. To move forward to more complex demographic modelling we need better and more automated ways of specifying these models and efficient optimisation algorithms for inferring the parameters in complex and often high-dimensional models. In this paper we present a framework for building such coalescence hidden Markov models for pairwise alignments and present results for using heuristic optimisation algorithms for parameter estimation. We show that we can build more complex demographic models than our previous frameworks and that we obtain more accurate parameter estimates using heuristic optimisation algorithms than when using our previous gradient based approaches. Our new framework provides a flexible way of constructing coalescence hidden Markov models almost automatically. While estimating parameters in more complex models is still challenging we show that using heuristic optimisation algorithms we still get a fairly good accuracy.

  2. VES/TEM 1D joint inversion by using Controlled Random Search (CRS) algorithm

    NASA Astrophysics Data System (ADS)

    Bortolozo, Cassiano Antonio; Porsani, Jorge Luís; Santos, Fernando Acácio Monteiro dos; Almeida, Emerson Rodrigo

    2015-01-01

    Electrical (DC) and Transient Electromagnetic (TEM) soundings are used in a great number of environmental, hydrological, and mining exploration studies. Usually, data interpretation is accomplished by individual 1D models, often resulting in ambiguous models. This fact can be explained by the way the two different methodologies sample the medium beneath the surface. Vertical Electrical Sounding (VES) is good at marking resistive structures, while Transient Electromagnetic sounding (TEM) is very sensitive to conductive structures. Another difference is that VES better detects shallow structures, while TEM soundings can reach deeper layers. A Matlab program for 1D joint inversion of VES and TEM soundings was developed aiming to exploit the best of both methods. The program uses the CRS - Controlled Random Search - algorithm for both single and 1D joint inversions. Inversion programs usually use Marquardt-type algorithms, but for electrical and electromagnetic methods these algorithms may find a local minimum or fail to converge. Initially, the algorithm was tested with synthetic data, and then it was used to invert experimental data from two places in the Paraná sedimentary basin (Bebedouro and Pirassununga cities), both located in São Paulo State, Brazil. The geoelectric model obtained from the 1D joint inversion of VES and TEM data is similar to the real geological conditions, and ambiguities were minimized. Results with synthetic and real data show that 1D VES/TEM joint inversion better recovers the simulated models and shows great potential in geological studies, especially hydrogeological studies.
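
    Price's Controlled Random Search can be sketched in a few lines: keep a population, reflect one random point through the centroid of a randomly chosen simplex, and replace the current worst point on improvement. The misfit function below is a stand-in for the VES/TEM data misfit.

        import numpy as np

        def crs(f, lower, upper, n_pop=50, iters=2000, rng=np.random.default_rng(0)):
            dim = len(lower)
            pop = rng.uniform(lower, upper, size=(n_pop, dim))
            vals = np.apply_along_axis(f, 1, pop)
            for _ in range(iters):
                idx = rng.choice(n_pop, dim + 1, replace=False)
                centroid = pop[idx[:-1]].mean(axis=0)
                trial = 2.0 * centroid - pop[idx[-1]]          # reflection step
                if np.all(trial >= lower) and np.all(trial <= upper):
                    ft, worst = f(trial), vals.argmax()
                    if ft < vals[worst]:                       # replace current worst
                        pop[worst], vals[worst] = trial, ft
            best = vals.argmin()
            return pop[best], vals[best]

        misfit = lambda m: ((m - np.array([1.5, 0.5, 2.0])) ** 2).sum()   # stand-in
        print(crs(misfit, np.zeros(3), 3.0 * np.ones(3)))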

  3. Vibrational molecular quantum computing: basis set independence and theoretical realization of the Deutsch-Jozsa algorithm.

    PubMed

    Tesch, Carmen M; de Vivie-Riedle, Regina

    2004-12-22

    The phase of quantum gates is one key issue for the implementation of quantum algorithms. In this paper we first investigate the phase evolution of global molecular quantum gates, which are realized by optimally shaped femtosecond laser pulses. The specific laser fields are calculated using the multitarget optimal control algorithm, our modification of optimal control theory relevant for application in quantum computing. As qubit system we use vibrational modes of polyatomic molecules, here the two IR-active modes of acetylene. As an example, we present our results for a Pi gate, which shows a strong dependence on the phase, leading to a significant decrease in quantum yield. To correct for this unwanted behavior we include a constraint on the quantum phase in our multitarget approach. In addition, the accuracy of these phase-corrected global quantum gates is enhanced. Furthermore, we show that in our molecular approach phase-corrected quantum gates and basis set independence are directly linked. Basis set independence is another property highly required for the performance of quantum algorithms. By realizing the Deutsch-Jozsa algorithm in our two-qubit molecular model system, we demonstrate the good performance of our phase-corrected and basis-set-independent quantum gates.
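
    For reference, the logic of the Deutsch-Jozsa algorithm for a single input qubit can be sketched with plain unitary matrices, as below; this simulation makes no attempt to capture the molecular realization with shaped laser pulses.

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

        def oracle(f):
            # U_f |x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix.
            U = np.zeros((4, 4))
            for x in (0, 1):
                for y in (0, 1):
                    U[2 * x + (y ^ f(x)), 2 * x + y] = 1.0
            return U

        def deutsch_jozsa(f):
            state = np.kron([1.0, 0.0], [0.0, 1.0])    # |0>|1>
            state = np.kron(H, H) @ state
            state = oracle(f) @ state
            state = np.kron(H, np.eye(2)) @ state
            p1 = state[2] ** 2 + state[3] ** 2         # P(first qubit reads 1)
            return "balanced" if p1 > 0.5 else "constant"

        print(deutsch_jozsa(lambda x: 0))   # constant f -> "constant"
        print(deutsch_jozsa(lambda x: x))   # balanced f -> "balanced"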

  4. A novel regularized edge-preserving super-resolution algorithm

    NASA Astrophysics Data System (ADS)

    Yu, Hui; Chen, Fu-sheng; Zhang, Zhi-jie; Wang, Chen-sheng

    2013-09-01

    Using super-resolution (SR) technology is a good approach to obtaining high-resolution infrared images. However, image super-resolution reconstruction is essentially an ill-posed problem, so it is important to design an effective regularization term (image prior). A Gaussian prior is widely used as the regularization term, but the reconstructed SR image becomes over-smooth. Here, a novel regularization term called the non-local means (NLM) term is derived, based on the assumption that natural image content is likely to repeat itself within some neighborhood. In the proposed framework, the estimated high-resolution image is obtained by minimizing a cost function. An iterative method is applied to solve the optimization problem. As the iteration progresses, the regularization term is adaptively updated. The proposed algorithm has been tested in several experiments. The experimental results show that the proposed approach is robust and can reconstruct higher-quality images both in quantitative terms and in perceptual effect.
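
    The NLM weights underlying the regularization term can be sketched as follows: each pixel is compared with the pixels in a search window through patch similarity, and the weights decay exponentially with the patch distance. The SR cost function built on these weights is not reproduced here; the patch, window and decay parameters are illustrative.

        import numpy as np

        # Normalized NLM weights for the pixel at (i, j).
        def nlm_weights(img, i, j, patch=3, search=7, h=0.1):
            r, s = patch // 2, search // 2
            pad = np.pad(img, r + s, mode="reflect")
            ci, cj = i + r + s, j + r + s              # centre in padded image
            ref = pad[ci - r:ci + r + 1, cj - r:cj + r + 1]
            weights = {}
            for di in range(-s, s + 1):
                for dj in range(-s, s + 1):
                    cand = pad[ci + di - r:ci + di + r + 1, cj + dj - r:cj + dj + r + 1]
                    d2 = ((ref - cand) ** 2).mean()    # patch distance
                    # keys near borders may fall outside the image; ignored in this sketch
                    weights[(i + di, j + dj)] = np.exp(-d2 / h ** 2)
            total = sum(weights.values())
            return {k: v / total for k, v in weights.items()}

        img = np.random.rand(32, 32)
        w = nlm_weights(img, 16, 16)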

  5. OpenEIS Algorithms

    2013-07-29

    The OpenEIS Algorithm package seeks to provide a low-risk path for building owners, service providers and managers to explore analytical methods for improving building control and operational efficiency. Users of this software can analyze building data, and learn how commercial implementations would provide long-term value. The code also serves as a reference implementation for developers who wish to adapt the algorithms for use in commercial tools or service offerings.

  6. Reconsidering the “Good Divorce”

    PubMed Central

    Amato, Paul R.; Kane, Jennifer B.; James, Spencer

    2011-01-01

    This study attempted to assess the notion that a “good divorce” protects children from the potential negative consequences of marital dissolution. A cluster analysis of data on postdivorce parenting from 944 families resulted in three groups: cooperative coparenting, parallel parenting, and single parenting. Children in the cooperative coparenting (good divorce) cluster had the smallest number of behavior problems and the closest ties to their fathers. Nevertheless, children in this cluster did not score significantly better than other children on 10 additional outcomes. These findings provide only modest support for the good divorce hypothesis. PMID:22125355

  7. Node Deployment Algorithm Based on Connected Tree for Underwater Sensor Networks

    PubMed Central

    Jiang, Peng; Wang, Xingmin; Jiang, Lurong

    2015-01-01

    Designing an efficient deployment method to guarantee optimal monitoring quality is one of the key topics in underwater sensor networks. At present, a realistic approach to deployment involves adjusting the depths of nodes in water. One of the typical algorithms used in this process is the self-deployment depth adjustment algorithm (SDDA). This algorithm mainly focuses on maximizing network coverage by constantly adjusting node depths to reduce coverage overlaps between two neighboring nodes, and thus achieves good performance. However, the connectivity performance of SDDA is not guaranteed. In this paper, we propose a depth adjustment algorithm based on connected tree (CTDA). In CTDA, the sink node is used as the first root node to start building a connected tree. Finally, the network can be organized as a forest to maintain network connectivity. Coverage overlaps between the parent node and the child node are then reduced within each sub-tree to optimize coverage. A hierarchical strategy is used to adjust the distance between the parent node and the child node to reduce node movement. Furthermore, a silent mode is adopted to reduce communication cost. Simulations show that, compared with SDDA, CTDA can achieve high connectivity with various communication ranges and different numbers of nodes. Moreover, it can realize coverage as high as that of SDDA with various sensing ranges and numbers of nodes, but with less energy consumption. Simulations under sparse environments show that the connectivity and energy consumption performances of CTDA are considerably better than those of SDDA. Meanwhile, the connectivity and coverage performances of CTDA are close to those of the depth adjustment algorithm based on connected dominating sets (CDA), which is similar to CTDA. However, the energy consumption of CTDA is less than that of CDA, particularly in sparse underwater environments. PMID:26184209
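
    The connected-tree construction can be sketched as a breadth-first growth from the sink over nodes within communication range, as below; the coordinates and range are illustrative, and the depth-adjustment and coverage-optimization steps are omitted.

        import collections, math

        # Grow a tree from the sink; returns {child: parent} for reachable nodes.
        def build_connected_tree(nodes, sink, comm_range):
            parent, queue = {sink: None}, collections.deque([sink])
            while queue:
                u = queue.popleft()
                for v in nodes:
                    if v not in parent and math.dist(nodes[u], nodes[v]) <= comm_range:
                        parent[v] = u
                        queue.append(v)
            return parent

        nodes = {0: (0, 0, 0), 1: (40, 0, 30), 2: (80, 10, 60), 3: (200, 0, 50)}
        print(build_connected_tree(nodes, sink=0, comm_range=60))   # node 3 stays unreachable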

  8. Node Deployment Algorithm Based on Connected Tree for Underwater Sensor Networks.

    PubMed

    Jiang, Peng; Wang, Xingmin; Jiang, Lurong

    2015-01-01

    Designing an efficient deployment method to guarantee optimal monitoring quality is one of the key topics in underwater sensor networks. At present, a realistic approach to deployment involves adjusting the depths of nodes in water. One of the typical algorithms used in this process is the self-deployment depth adjustment algorithm (SDDA). This algorithm mainly focuses on maximizing network coverage by constantly adjusting node depths to reduce coverage overlaps between two neighboring nodes, and thus achieves good performance. However, the connectivity performance of SDDA is not guaranteed. In this paper, we propose a depth adjustment algorithm based on connected tree (CTDA). In CTDA, the sink node is used as the first root node to start building a connected tree. Finally, the network can be organized as a forest to maintain network connectivity. Coverage overlaps between the parent node and the child node are then reduced within each sub-tree to optimize coverage. A hierarchical strategy is used to adjust the distance between the parent node and the child node to reduce node movement. Furthermore, a silent mode is adopted to reduce communication cost. Simulations show that, compared with SDDA, CTDA can achieve high connectivity with various communication ranges and different numbers of nodes. Moreover, it can realize coverage as high as that of SDDA with various sensing ranges and numbers of nodes, but with less energy consumption. Simulations under sparse environments show that the connectivity and energy consumption performances of CTDA are considerably better than those of SDDA. Meanwhile, the connectivity and coverage performances of CTDA are close to those of the depth adjustment algorithm based on connected dominating sets (CDA), which is similar to CTDA. However, the energy consumption of CTDA is less than that of CDA, particularly in sparse underwater environments. PMID:26184209

  9. Node Deployment Algorithm Based on Connected Tree for Underwater Sensor Networks.

    PubMed

    Jiang, Peng; Wang, Xingmin; Jiang, Lurong

    2015-07-10

    Designing an efficient deployment method to guarantee optimal monitoring quality is one of the key topics in underwater sensor networks. At present, a realistic approach to deployment involves adjusting the depths of nodes in water. One of the typical algorithms used in this process is the self-deployment depth adjustment algorithm (SDDA). This algorithm mainly focuses on maximizing network coverage by constantly adjusting node depths to reduce coverage overlaps between two neighboring nodes, and thus achieves good performance. However, the connectivity performance of SDDA is not guaranteed. In this paper, we propose a depth adjustment algorithm based on connected tree (CTDA). In CTDA, the sink node is used as the first root node to start building a connected tree. Finally, the network can be organized as a forest to maintain network connectivity. Coverage overlaps between the parent node and the child node are then reduced within each sub-tree to optimize coverage. A hierarchical strategy is used to adjust the distance between the parent node and the child node to reduce node movement. Furthermore, a silent mode is adopted to reduce communication cost. Simulations show that, compared with SDDA, CTDA can achieve high connectivity with various communication ranges and different numbers of nodes. Moreover, it can realize coverage as high as that of SDDA with various sensing ranges and numbers of nodes, but with less energy consumption. Simulations under sparse environments show that the connectivity and energy consumption performances of CTDA are considerably better than those of SDDA. Meanwhile, the connectivity and coverage performances of CTDA are close to those of the depth adjustment algorithm based on connected dominating sets (CDA), which is similar to CTDA. However, the energy consumption of CTDA is less than that of CDA, particularly in sparse underwater environments.

  10. [An Algorithm for Correcting Fetal Heart Rate Baseline].

    PubMed

    Li, Xiaodong; Lu, Yaosheng

    2015-10-01

    Fetal heart rate (FHR) baseline estimation is of significance for the computerized analysis of fetal heart rate and the assessment of fetal state. In our work, an FHR baseline correction algorithm was presented to make an existing baseline more accurate and better fit the tracings. Firstly, deviations of the existing FHR baseline were found and corrected. A new baseline was then obtained after treatment with some smoothing methods. To assess the performance of the FHR baseline correction algorithm, a new FHR baseline estimation algorithm that combined a baseline estimation algorithm with the baseline correction algorithm was compared with two existing FHR baseline estimation algorithms. The results showed that the new FHR baseline estimation algorithm performed well in both accuracy and efficiency. The results also proved the effectiveness of the FHR baseline correction algorithm.

  11. Improved Bat Algorithm Applied to Multilevel Image Thresholding

    PubMed Central

    2014-01-01

    Multilevel image thresholding is a very important image processing technique that is used as a basis for image segmentation and further higher level processing. However, the required computational time for exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adjusted one of the latest swarm intelligence algorithms, the bat algorithm, for the multilevel image thresholding problem. The results of testing on standard benchmark images show that the bat algorithm is comparable with other state-of-the-art algorithms. We improved standard bat algorithm, where our modifications add some elements from the differential evolution and from the artificial bee colony algorithm. Our new proposed improved bat algorithm proved to be better than five other state-of-the-art algorithms, improving quality of results in all cases and significantly improving convergence speed. PMID:25165733
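
    The core update of the standard bat algorithm that the paper improves can be sketched as follows; the differential evolution and artificial bee colony elements of the improved variant, as well as the multilevel thresholding objective, are omitted here.

        import numpy as np

        # One frequency/velocity/position update for a population of bats.
        def bat_step(x, v, best, fmin=0.0, fmax=2.0, rng=np.random.default_rng(1)):
            beta = rng.random(len(x))
            freq = fmin + (fmax - fmin) * beta          # one frequency per bat
            v = v + (x - best) * freq[:, None]
            return x + v, v

        n_bats, dim = 20, 4                             # dim = number of thresholds
        x = np.random.rand(n_bats, dim) * 255           # candidate threshold vectors
        v = np.zeros_like(x)
        best = x[0]                                     # stand-in for the best bat so far
        x, v = bat_step(x, v, best)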

  12. Improved hybrid optimization algorithm for 3D protein structure prediction.

    PubMed

    Zhou, Changjun; Hou, Caixia; Wei, Xiaopeng; Zhang, Qiang

    2014-07-01

    A new improved hybrid optimization algorithm - the PGATS algorithm, based on a toy off-lattice model - is presented for dealing with three-dimensional protein structure prediction problems. The algorithm combines particle swarm optimization (PSO), a genetic algorithm (GA), and tabu search (TS). In addition, we adopt several improvement strategies: a stochastic disturbance factor is introduced into the particle swarm optimization to improve its search ability; the crossover and mutation operations of the genetic algorithm are changed to a kind of random linear method; and finally, the tabu search algorithm is improved by appending a mutation operator. Through the combination of a variety of strategies and algorithms, protein structure prediction (PSP) in a 3D off-lattice model is achieved. The PSP problem is NP-hard, but it can be cast as a global optimization problem with many extrema and many parameters. This is the theoretical principle of the hybrid optimization algorithm proposed in this paper. The algorithm combines local search and global search, which overcomes the shortcomings of any single algorithm and gives full play to the advantages of each. The approach is validated on the current universal standard sequences, both Fibonacci sequences and real protein sequences. Experiments show that the proposed method outperforms single algorithms on the accuracy of calculating the protein sequence energy value, proving it to be an effective way to predict protein structure. PMID:25069136

  13. An improved localization algorithm based on genetic algorithm in wireless sensor networks.

    PubMed

    Peng, Bo; Li, Lei

    2015-04-01

    Wireless sensor networks (WSNs) are widely used in many applications. A WSN is a wireless, decentralized network comprised of nodes that autonomously set up a network. Node localization, i.e., determining the position of a node in the network, is an essential part of many sensor network operations and applications. Existing localization algorithms can be classified into two categories: range-based and range-free. Range-based localization algorithms impose requirements on hardware and are thus expensive to implement in practice. Range-free localization algorithms reduce the hardware cost. Because of the hardware limitations of WSN devices, range-free solutions are being pursued as a cost-effective alternative to more expensive range-based approaches. However, these techniques usually have higher localization error compared with range-based algorithms. DV-Hop is a typical range-free localization algorithm utilizing hop-distance estimation. In this paper, we propose an improved DV-Hop algorithm based on a genetic algorithm. Simulation results show that our proposed algorithm improves localization accuracy compared with previous algorithms.
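
    The plain DV-Hop steps that the genetic algorithm then refines can be sketched as below: each anchor converts its known distances and hop counts to the other anchors into an average hop size, and an unknown node is then located by linearized least squares. The hop counts here are given rather than obtained by flooding.

        import numpy as np

        anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
        hops_between_anchors = np.array([[0, 4, 4], [4, 0, 5], [4, 5, 0]])
        hops_to_node = np.array([2, 3, 3])              # hop counts from each anchor

        # Average hop size per anchor: sum of true distances / sum of hop counts.
        d = np.linalg.norm(anchors[:, None] - anchors[None, :], axis=-1)
        hop_size = d.sum(1) / hops_between_anchors.sum(1)
        est_dist = hop_size * hops_to_node

        # Linearized least squares: subtract the last anchor's circle equation.
        A = 2.0 * (anchors[-1] - anchors[:-1])
        b = (est_dist[:-1] ** 2 - est_dist[-1] ** 2
             + (anchors[-1] ** 2).sum() - (anchors[:-1] ** 2).sum(1))
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        print(pos)                                      # estimated node position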

  14. Combined string searching algorithm based on Knuth-Morris-Pratt and Boyer-Moore algorithms

    NASA Astrophysics Data System (ADS)

    Tsarev, R. Yu; Chernigovskiy, A. S.; Tsareva, E. A.; Brezitskaya, V. V.; Nikiforov, A. Yu; Smirnov, N. A.

    2016-04-01

    The string searching task can be classified as a classic information processing task. Users either encounter the solution of this task while working with text processors or browsers, employing standard built-in tools, or the task is solved unseen by the users while they are working with various computer programmes. Nowadays there are many algorithms for solving the string searching problem. The main criterion of these algorithms' effectiveness is searching speed. The larger the shift of the pattern relative to the string in case of a mismatch between pattern and string characters, the higher the algorithm's running speed. This article offers a combined algorithm, developed on the basis of the well-known Knuth-Morris-Pratt and Boyer-Moore string searching algorithms. These algorithms are based on two different basic principles of pattern matching: the Knuth-Morris-Pratt algorithm is based upon forward pattern matching, and Boyer-Moore is based upon backward pattern matching. By uniting these two algorithms, the combined algorithm acquires a larger shift in case of a mismatch between pattern and string characters. The article provides an example which illustrates the results of the Boyer-Moore and Knuth-Morris-Pratt algorithms and the combined algorithm's work, and shows the advantage of the latter in solving the string searching problem.

  15. Fixed-point error analysis of Winograd Fourier transform algorithms

    NASA Technical Reports Server (NTRS)

    Patterson, R. W.; Mcclellan, J. H.

    1978-01-01

    The quantization error introduced by the Winograd Fourier transform algorithm (WFTA) when implemented in fixed-point arithmetic is studied and compared with that of the fast Fourier transform (FFT). The effect of ordering the computational modules and the relative contributions of data quantization error and coefficient quantization error are determined. In addition, the quantization error introduced by the Good-Winograd (GW) algorithm, which uses Good's prime-factor decomposition for the discrete Fourier transform (DFT) together with Winograd's short length DFT algorithms, is studied. Error introduced by the WFTA is, in all cases, worse than that of the FFT. In general, the WFTA requires one or two more bits for data representation to give an error similar to that of the FFT. Error introduced by the GW algorithm is approximately the same as that of the FFT.

  16. A Bad Case of Good's Syndrome.

    PubMed

    Tachdjian, Raffi; Keller, Janet J; Pfeffer, Michael

    2014-12-01

    Good's syndrome is a relatively rare immunodeficiency condition that presents in the fourth or fifth decade of life and is defined by hypogammaglobulinemia in the setting of a thymoma. The humoral defect may be severe enough to cause an absence in B cells, with a consequent recurrence of sinopulmonary disease, chronic non-infectious diarrhea and opportunistic infections. The prognosis in patients with Good's syndrome appears to be worse than in those with X-linked agammaglobulinemia (XLA) and common variable immune deficiency (CVID). There have only been three cases of Good's syndrome associated with mycobacterium, and only one case with a cavitary lesion in the lungs. We present here a unique case of Good's syndrome with a non-mycobacterial cavitary lesion.

  17. Keys to Maintaining a Good Banking Relationship.

    ERIC Educational Resources Information Center

    Stephens, Keith

    1988-01-01

    Suggests strategies for finding an appropriate bank for a day care center, maintaining a good relationship with a bank once one has been selected, and obtaining and repaying a day care center loan. (SKC)

  18. 42 CFR 93.210 - Good faith.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... allegation or cooperation with a research misconduct proceeding is not in good faith if made with knowing or... are dishonest or influenced by personal, professional, or financial conflicts of interest with...

  19. The Goodness of Simultaneous Fits in ISIS

    NASA Astrophysics Data System (ADS)

    Kühnel, Matthias; Falkner, Sebastian; Grossberger, Christoph; Ballhausen, Ralf; Dauser, Thomas; Schwarm, Fritz-Walter; Kreykenbohm, Ingo; Nowak, Michael A.; Pottschmidt, Katja; Ferrigno, Carlo; Rothschild, Richard E.; Martínez-Núñez, Silvia; Torrejón, José Miguel; Fürst, Felix; Klochkov, Dmitry; Staubert, Rüdiger; Kretschmar, Peter; Wilms, Jörn

    2016-02-01

    In a previous work, we introduced a tool for analyzing multiple datasets simultaneously, which has been implemented in ISIS. This tool was used to fit many spectra of X-ray binaries. However, the large number of degrees of freedom and individual datasets raises the question of a good measure for the quality of a simultaneous fit. We present three ways to check the goodness of these fits: we investigate the goodness of each fit in all datasets, we define a combined goodness exploiting the logical structure of a simultaneous fit, and we stack the fit residuals of all datasets to detect weak features. These tools are applied to all RXTE spectra of GRO 1008-57, revealing calibration features that are not detected significantly in any single spectrum. Stacking the residuals from the best-fit model for the Vela X-1 and XTE J1859+083 data reveals fluorescent emission lines that would have gone undetected otherwise.

  20. The Goods Upstairs Car Innovative Design

    NASA Astrophysics Data System (ADS)

    Wang, Feng-Lan; Zhang, Bo; Gao, Bo; Liu, Yan-Xin; Gao, Bo

    2016-05-01

    The design is a new kind of cart used for carrying goods upstairs. The cart - very safe and convenient - consists of a body, a chassis, a base, wheels, an object stage, stair-climbing wheel trains, handles, a storage tank, a security fence, etc. The design, composed of the combination of these structures, achieves the purpose of carrying goods, and even some large potted plants, very smoothly when going upstairs or downstairs.

  1. Quantum Adiabatic Algorithms and Large Spin Tunnelling

    NASA Technical Reports Server (NTRS)

    Boulatov, A.; Smelyanskiy, V. N.

    2003-01-01

    We provide a theoretical study of the quantum adiabatic evolution algorithm with different evolution paths proposed in this paper. The algorithm is applied to a random binary optimization problem (a version of the 3-Satisfiability problem) where the n-bit cost function is symmetric with respect to the permutation of individual bits. The evolution paths are produced using the generic control Hamiltonians H(r) that preserve the bit symmetry of the underlying optimization problem. In the case where the ground state of H(0) coincides with the totally symmetric state of an n-qubit system, the algorithm dynamics is completely described in terms of the motion of a spin-n/2. We show that different control Hamiltonians can be parameterized by a set of independent parameters that are expansion coefficients of H(r) in a certain universal set of operators. Only one of these operators can be responsible for avoiding the tunnelling in the spin-n/2 system during the quantum adiabatic algorithm. We show that it is possible to select a coefficient for this operator that guarantees a polynomial complexity of the algorithm for all problem instances. We show that a successful evolution path of the algorithm always corresponds to the trajectory of a classical spin-n/2 and provide a complete characterization of such paths.

  2. Spatial dilemmas of diffusible public goods.

    PubMed

    Allen, Benjamin; Gore, Jeff; Nowak, Martin A

    2013-01-01

    The emergence of cooperation is a central question in evolutionary biology. Microorganisms often cooperate by producing a chemical resource (a public good) that benefits other cells. The sharing of public goods depends on their diffusion through space. Previous theory suggests that spatial structure can promote evolution of cooperation, but the diffusion of public goods introduces new phenomena that must be modeled explicitly. We develop an approach where colony geometry and public good diffusion are described by graphs. We find that the success of cooperation depends on a simple relation between the benefits and costs of the public good, the amount retained by a producer, and the average amount retained by each of the producer's neighbors. These quantities are derived as analytic functions of the graph topology and diffusion rate. In general, cooperation is favored for small diffusion rates, low colony dimensionality, and small rates of decay of the public good. DOI: http://dx.doi.org/10.7554/eLife.01169.001. PMID:24347543

  3. Making Good Teaching Great: Everyday Strategies for Teaching with Impact

    ERIC Educational Resources Information Center

    Breaux, Annette L.; Whitaker, Todd

    2012-01-01

    Every good teacher strives to be a great teacher--and this must-have book shows you how! It's filled with practical tips and strategies for connecting with your students in a meaningful and powerful way. Learn how to improve student learning with easy-to-implement daily activities designed to integrate seamlessly into any day of the school year.…

  4. An Iterative Image Registration Algorithm by Optimizing Similarity Measurement.

    PubMed

    Chu, Wei; Ma, Li; Song, John; Vorburger, Theodore

    2010-01-01

    A new registration algorithm based on Newton-Raphson iteration is proposed to align images with rigid body transformation. A set of transformation parameters consisting of translation in x and y and rotation angle around z is calculated by optimizing a specified similarity metric using the Newton-Raphson method. This algorithm has been tested by registering and correlating pairs of topography measurements of nominally identical NIST Standard Reference Material (SRM 2461) standard cartridge cases, and very good registration accuracy has been obtained.
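
    A Gauss-Newton flavoured sketch of such an iteration is given below: the rigid parameters (tx, ty, theta) are updated to minimize a sum-of-squared-differences metric, with the Jacobian formed by finite differences. Synthetic images stand in for the paper's topography measurements and its specific similarity metric.

        import numpy as np
        from scipy import ndimage

        def transform(img, p):
            tx, ty, theta = p
            c, s = np.cos(theta), np.sin(theta)
            matrix = np.array([[c, -s], [s, c]])
            center = (np.array(img.shape) - 1) / 2.0
            offset = center - matrix @ center + np.array([ty, tx])
            return ndimage.affine_transform(img, matrix, offset=offset, order=1)

        def register(fixed, moving, p0=(0.0, 0.0, 0.0), iters=25, eps=1e-3):
            p = np.asarray(p0, float)
            for _ in range(iters):
                warped = transform(moving, p).ravel()
                r = warped - fixed.ravel()                      # residual vector
                J = np.stack([(transform(moving, p + eps * e).ravel() - warped) / eps
                              for e in np.eye(3)], axis=1)      # numeric Jacobian
                step, *_ = np.linalg.lstsq(J, -r, rcond=None)   # Gauss-Newton step
                p += step
                if np.linalg.norm(step) < 1e-6:
                    break
            return p

        rng = np.random.default_rng(0)
        fixed = ndimage.gaussian_filter(rng.random((64, 64)), 2.0)
        moving = transform(fixed, (-1.5, 2.0, -0.05))           # misaligned copy
        print(register(fixed, moving))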

  5. LCD motion blur: modeling, analysis, and algorithm.

    PubMed

    Chan, Stanley H; Nguyen, Truong Q

    2011-08-01

    Liquid crystal display (LCD) devices are well known for their slow responses due to the physical limitations of liquid crystals. Therefore, fast moving objects in a scene are often perceived as blurred. This effect is known as the LCD motion blur. In order to reduce LCD motion blur, an accurate LCD model and an efficient deblurring algorithm are needed. However, existing LCD motion blur models are insufficient to reflect the limitation of human-eye-tracking system. Also, the spatiotemporal equivalence in LCD motion blur models has not been proven directly in the discrete 2-D spatial domain, although it is widely used. There are three main contributions of this paper: modeling, analysis, and algorithm. First, a comprehensive LCD motion blur model is presented, in which human-eye-tracking limits are taken into consideration. Second, a complete analysis of spatiotemporal equivalence is provided and verified using real video sequences. Third, an LCD motion blur reduction algorithm is proposed. The proposed algorithm solves an l1-norm regularized least-squares minimization problem using a subgradient projection method. Numerical results show that the proposed algorithm gives higher peak SNR, lower temporal error, and lower spatial error than motion-compensated inverse filtering and Lucy-Richardson deconvolution algorithm, which are two state-of-the-art LCD deblurring algorithms. PMID:21292596
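
    A generic sketch of the minimization step follows: subgradient descent on the l1-regularized least-squares cost, with projection onto valid intensities. The operator A below is a toy smearing operator, not the paper's LCD motion blur model.

        import numpy as np

        # Minimize ||Ax - b||^2 + lam * ||x||_1 by projected subgradient descent.
        def subgrad_deblur(A, b, lam=0.01, step=0.1, iters=2000):
            x = np.zeros(A.shape[1])
            for _ in range(iters):
                g = 2.0 * A.T @ (A @ x - b) + lam * np.sign(x)   # subgradient
                x = np.clip(x - step * g, 0.0, 1.0)              # project onto [0, 1]
            return x

        n = 64
        A = np.tril(np.ones((n, n))) / n          # toy smearing operator
        x_true = np.zeros(n)
        x_true[20:30] = 0.8
        b = A @ x_true
        print(np.abs(subgrad_deblur(A, b) - x_true).max())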

  6. Novel and efficient tag SNPs selection algorithms.

    PubMed

    Chen, Wen-Pei; Hung, Che-Lun; Tsai, Suh-Jen Jane; Lin, Yaw-Ling

    2014-01-01

    SNPs are the most abundant forms of genetic variation amongst species; association studies between complex diseases and SNPs or haplotypes have received great attention. However, these studies are restricted by the cost of genotyping all SNPs; thus, it is necessary to find smaller subsets, or tag SNPs, representing the rest of the SNPs. In fact, the existing tag SNP selection algorithms are notoriously time-consuming. An efficient algorithm for tag SNP selection is presented and applied to analyze the HapMap YRI data. The experimental results show that the proposed algorithm achieves better performance than the existing tag SNP selection algorithms; in most cases, it is at least ten times faster than the existing methods. In many cases, when the redundant ratio of the block is high, the proposed algorithm can even be thousands of times faster than the previously known methods. Tools and web services for haplotype block analysis, integrated via the Hadoop MapReduce framework, are also developed using the proposed algorithm as computation kernels. PMID:24212035

  7. Least significant qubit algorithm for quantum images

    NASA Astrophysics Data System (ADS)

    Sang, Jianzhi; Wang, Shen; Li, Qiong

    2016-08-01

    To study the feasibility of the classical image least significant bit (LSB) information hiding algorithm on a quantum computer, a least significant qubit (LSQb) information hiding algorithm for quantum images is proposed. In this paper, we focus on a novel quantum representation for color digital images (NCQI). Firstly, by designing a three-qubit comparator and unitary operators, the reasonableness and feasibility of LSQb based on NCQI are presented. Then, the concrete LSQb information hiding algorithm is proposed, which can embed the secret qubits into the least significant qubits of the RGB channels of a quantum cover image. The quantum circuit of the LSQb information hiding algorithm is also illustrated. Furthermore, the secret-extraction algorithm and circuit are illustrated, utilizing controlled-swap gates. The two merits of our algorithm are: (1) it is absolutely blind, and (2) when extracting secret binary qubits, it does not need any quantum measurement operation or any other help from a classical computer. Finally, simulation and comparative analysis show the performance of our algorithm.

  8. Algorithm for dynamic Speckle pattern processing

    NASA Astrophysics Data System (ADS)

    Cariñe, J.; Guzmán, R.; Torres-Ruiz, F. A.

    2016-07-01

    In this paper we present a new algorithm for determining surface activity by processing speckle pattern images recorded with a CCD camera. Surface activity can be produced by motility or small displacements, among other causes, and is manifested as a change in the pattern recorded in the camera with reference to a static background pattern. This intensity variation is considered to be a small perturbation compared with the mean intensity. Based on a perturbative method, we obtain an equation with which we can infer information about the dynamic behavior of the surface that generates the speckle pattern. We define an activity index based on our algorithm that can be easily compared with the outcomes of other algorithms. It is shown experimentally that this index evolves in time in the same way as the Inertia Moment method; however, our algorithm is based on direct processing of speckle patterns without the need for other kinds of post-processing (such as THSP and co-occurrence matrices), making it a viable real-time method. We also show how this algorithm compares with several other algorithms when applied to calibration experiments. From these results we conclude that our algorithm offers qualitative and quantitative advantages over current methods.

  9. A quantum algorithm for obtaining the lowest eigenstate of a Hamiltonian assisted with an ancillary qubit system

    NASA Astrophysics Data System (ADS)

    Bang, Jeongho; Lee, Seung-Woo; Lee, Chang-Woo; Jeong, Hyunseok

    2015-01-01

    We propose a quantum algorithm to obtain the lowest eigenstate of any Hamiltonian simulated by a quantum computer. The proposed algorithm begins with an arbitrary initial state of the simulated system. A finite series of transforms is iteratively applied to the initial state, assisted with an ancillary qubit. The fraction of the lowest eigenstate in the initial state is then amplified up to 1. We prove in a theoretical analysis that our algorithm can faithfully work for any arbitrary Hamiltonian. Numerical analyses are also carried out. We first provide a numerical proof-of-principle demonstration with a simple Hamiltonian in order to compare our scheme with the so-called "Demon-like algorithmic cooling (DLAC)", recently proposed in Xu (Nat Photonics 8:113, 2014). The result shows a good agreement with our theoretical analysis, exhibiting behavior comparable to the best 'cooling' with the DLAC method. We then consider a random Hamiltonian model for further analysis of our algorithm. By numerical simulations, we show that the total number of iterations is proportional to a function of the gap between the two lowest eigenvalues and of an error defined as the probability that the finally obtained system state is in an unexpected (i.e., not the lowest) eigenstate.

  10. Generic algorithms for high performance scalable geocomputing

    NASA Astrophysics Data System (ADS)

    de Jong, Kor; Schmitz, Oliver; Karssenberg, Derek

    2016-04-01

    During the last decade, the characteristics of computing hardware have changed a lot. For example, instead of a single general purpose CPU core, personal computers nowadays contain multiple cores per CPU and often general purpose accelerators, like GPUs. Additionally, compute nodes are often grouped together to form clusters or a supercomputer, providing enormous amounts of compute power. For existing earth simulation models to be able to use modern hardware platforms, their compute intensive parts must be rewritten. This can be a major undertaking and may involve many technical challenges. Compute tasks must be distributed over CPU cores, offloaded to hardware accelerators, or distributed to different compute nodes. And ideally, all of this should be done in such a way that the compute task scales well with the hardware resources. This presents two challenges: 1) how to make good use of all the compute resources and 2) how to make these compute resources available for developers of simulation models, who may not (want to) have the required technical background for distributing compute tasks. The first challenge requires the use of specialized technology (e.g.: threads, OpenMP, MPI, OpenCL, CUDA). The second challenge requires the abstraction of the logic handling the distribution of compute tasks from the model-specific logic, hiding the technical details from the model developer. To assist the model developer, we are developing a C++ software library (called Fern) containing algorithms that can use all CPU cores available in a single compute node (distributing tasks over multiple compute nodes will be done at a later stage). The algorithms are grid-based (finite difference) and include local and spatial operations such as convolution filters. The algorithms handle distribution of the compute tasks to CPU cores internally. In the resulting model, the low-level details of how this is done are separated from the model-specific logic representing the modeled system.
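
    The kind of algorithm such a library wraps can be sketched in a few lines: a convolution filter whose row blocks are dispatched to a pool of workers, hiding the partitioning from the model developer. The sketch below uses Python threads purely for illustration; Fern itself is a C++ library.

        import numpy as np
        from concurrent.futures import ThreadPoolExecutor

        # Convolve the given rows of a grid with a kernel (edge-padded).
        def convolve_rows(grid, kernel, rows):
            r = kernel.shape[0] // 2
            padded = np.pad(grid, r, mode="edge")
            out = np.empty((len(rows), grid.shape[1]))
            for k, i in enumerate(rows):
                for j in range(grid.shape[1]):
                    out[k, j] = (padded[i:i + 2 * r + 1, j:j + 2 * r + 1] * kernel).sum()
            return out

        # Split the grid into row blocks and convolve the blocks concurrently.
        def parallel_convolve(grid, kernel, workers=4):
            chunks = np.array_split(range(grid.shape[0]), workers)
            with ThreadPoolExecutor(workers) as pool:
                parts = pool.map(lambda rows: convolve_rows(grid, kernel, list(rows)), chunks)
            return np.vstack(list(parts))

        dem = np.random.rand(128, 128)                       # e.g. a digital elevation model
        smoothed = parallel_convolve(dem, np.ones((3, 3)) / 9.0)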

  11. Generic algorithms for high performance scalable geocomputing

    NASA Astrophysics Data System (ADS)

    de Jong, Kor; Schmitz, Oliver; Karssenberg, Derek

    2016-04-01

    During the last decade, the characteristics of computing hardware have changed a lot. For example, instead of a single general purpose CPU core, personal computers nowadays contain multiple cores per CPU and often general purpose accelerators, like GPUs. Additionally, compute nodes are often grouped together to form clusters or a supercomputer, providing enormous amounts of compute power. For existing earth simulation models to be able to use modern hardware platforms, their compute intensive parts must be rewritten. This can be a major undertaking and may involve many technical challenges. Compute tasks must be distributed over CPU cores, offloaded to hardware accelerators, or distributed to different compute nodes. And ideally, all of this should be done in such a way that the compute task scales well with the hardware resources. This presents two challenges: 1) how to make good use of all the compute resources and 2) how to make these compute resources available for developers of simulation models, who may not (want to) have the required technical background for distributing compute tasks. The first challenge requires the use of specialized technology (e.g.: threads, OpenMP, MPI, OpenCL, CUDA). The second challenge requires the abstraction of the logic handling the distribution of compute tasks from the model-specific logic, hiding the technical details from the model developer. To assist the model developer, we are developing a C++ software library (called Fern) containing algorithms that can use all CPU cores available in a single compute node (distributing tasks over multiple compute nodes will be done at a later stage). The algorithms are grid-based (finite difference) and include local and spatial operations such as convolution filters. The algorithms handle distribution of the compute tasks to CPU cores internally. In the resulting model, the low-level details of how this is done are separated from the model-specific logic representing the modeled system.

  12. Stoffenmanager exposure model: development of a quantitative algorithm.

    PubMed

    Tielemans, Erik; Noy, Dook; Schinkel, Jody; Heussen, Henri; Van Der Schaaf, Doeke; West, John; Fransman, Wouter

    2008-08-01

    In The Netherlands, the web-based tool called 'Stoffenmanager' was initially developed to assist small- and medium-sized enterprises to prioritize and control risks of handling chemical products in their workplaces. The aim of the present study was to explore the accuracy of the Stoffenmanager exposure algorithm. This was done by comparing its semi-quantitative exposure rankings for specific substances with exposure measurements collected from several occupational settings to derive a quantitative exposure algorithm. Exposure data were collected using two strategies. First, we conducted seven surveys specifically for validation of the Stoffenmanager. Second, existing occupational exposure data sets were collected from various sources. This resulted in 378 and 320 measurements for solid and liquid scenarios, respectively. The Spearman correlation coefficients between Stoffenmanager scores and exposure measurements appeared to be good for handling solids (r(s) = 0.80, N = 378, P < 0.0001) and liquid scenarios (r(s) = 0.83, N = 320, P < 0.0001). However, the correlation for liquid scenarios appeared to be lower when calculated separately for sets of volatile substances with a vapour pressure >10 Pa (r(s) = 0.56, N = 104, P < 0.0001) and non-volatile substances with a vapour pressure < or =10 Pa (r(s) = 0.53, N = 216, P < 0.0001). The mixed-effect regression models with natural log-transformed Stoffenmanager scores as independent parameter explained a substantial part of the total exposure variability (52% for solid scenarios and 76% for liquid scenarios). Notwithstanding the good correlation, the data show substantial variability in exposure measurements given a certain Stoffenmanager score. The overall performance increases our confidence in the use of the Stoffenmanager as a generic tool for risk assessment. The mixed-effect regression models presented in this paper may be used for assessment of so-called reasonable worst case exposures. This evaluation is

  13. Artifact removal algorithms for stroke detection using a multistatic MIST beamforming algorithm.

    PubMed

    Ricci, E; Di Domenico, S; Cianca, E; Rossi, T

    2015-01-01

    Microwave imaging (MWI) has recently been proven a promising imaging modality for low-complexity, low-cost and fast brain imaging tools, which could play a fundamental role in efficiently managing emergencies related to stroke and hemorrhages. This paper focuses on the UWB radar imaging approach and in particular on the processing algorithms for the backscattered signals. Assuming the use of the multistatic version of the MIST (Microwave Imaging Space-Time) beamforming algorithm, developed by Hagness et al. for the early detection of breast cancer, the paper proposes and compares two artifact removal algorithms. Artifact removal is an essential step of any UWB radar imaging system, and the artifact removal algorithms considered so far have been shown not to be effective in the specific scenario of brain imaging. First, the paper proposes modifications of a known artifact removal algorithm; these modifications are shown to be effective in achieving good localization accuracy and fewer false positives. However, the main contribution is the proposal of an artifact removal algorithm based on statistical methods, which achieves even better performance with much lower computational complexity.
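
    A common baseline for artifact removal in UWB radar imaging, which such algorithms refine, is average-trace subtraction: the early skin/skull reflection is nearly identical across antennas, so subtracting the ensemble mean from each channel suppresses it. The sketch below shows only this baseline, not the paper's modified or statistical variants.

        import numpy as np

        # Subtract the across-antenna mean trace from each channel.
        def remove_artifact(signals):
            # signals: (n_antennas, n_samples) matrix of backscattered traces
            return signals - signals.mean(axis=0, keepdims=True)

        n_ant, n_t = 16, 512
        t = np.arange(n_t, dtype=float)
        artifact = np.exp(-(t - 60.0) ** 2 / 50.0)           # common early reflection
        target = 0.05 * np.exp(-(t - 300.0) ** 2 / 20.0)     # weak target response
        signals = artifact + np.outer(np.random.rand(n_ant), target)
        cleaned = remove_artifact(signals)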

  14. Topology Organization in Peer-to-Peer Platform for Genetic Algorithm Environment

    NASA Astrophysics Data System (ADS)

    Yao, Hong; Yu, Linchen

    Genetic algorithms (GAs) are inherently parallel search methods. By exploiting the computing power of networked PCs, the GA computing environment can be shifted from a single machine to the Internet. Topology, the organization of the peers, together with the mechanisms for handling their dynamic changes and maintenance, is key to an efficient and stable structure. A new topology is proposed in this paper that creates a hybrid structure for a large number of peers. The structure is divided into two layers: the upper layer is composed of super nodes, while the lower layer is composed of ordinary nodes. Testing shows that this design keeps the platform stable and scalable.
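
    A minimal sketch of the two-layer organization described above; the registry, balancing rule and names are hypothetical, chosen only to illustrate how ordinary nodes attach to the super-node layer.

      class SuperNode:
          def __init__(self, node_id):
              self.node_id = node_id
              self.children = set()     # ordinary nodes attached to this super node
              self.neighbours = set()   # other super nodes in the upper layer

      def attach(ordinary_id, super_nodes):
          # Attach an ordinary node to the least-loaded super node,
          # which helps keep the structure balanced as peers join and leave.
          target = min(super_nodes, key=lambda s: len(s.children))
          target.children.add(ordinary_id)
          return target

      supers = [SuperNode(i) for i in range(3)]
      for peer in range(10):
          attach(f"peer-{peer}", supers)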

  15. Grooming of arbitrary traffic using improved genetic algorithms

    NASA Astrophysics Data System (ADS)

    Jiao, Yueguang; Xu, Zhengchun; Zhang, Hanyi

    2004-04-01

    A genetic algorithm with a permutation-based chromosome representation and roulette wheel selection is proposed to solve traffic grooming problems in WDM ring networks. The parameters of the algorithm are evaluated by computing a large number of traffic patterns under different conditions. Four methods were developed to improve the algorithm, and they can be used in combination with each other. Their effects on the algorithm are studied via computer simulations. The results show that each of them makes the algorithm more effective at reducing the number of add-drop multiplexers or wavelengths required in a network.
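
    A generic sketch of the two GA ingredients named above, under assumed details: a permutation chromosome (an ordering of traffic demands) and roulette wheel selection with probability proportional to fitness.

      import random

      def random_permutation_chromosome(n_demands):
          chrom = list(range(n_demands))
          random.shuffle(chrom)     # a permutation encodes a grooming order
          return chrom

      def roulette_wheel_select(population, fitnesses):
          pick = random.uniform(0.0, sum(fitnesses))
          acc = 0.0
          for individual, fit in zip(population, fitnesses):
              acc += fit
              if acc >= pick:
                  return individual
          return population[-1]

      pop = [random_permutation_chromosome(6) for _ in range(4)]
      fits = [10.0, 3.0, 7.0, 1.0]   # higher is better, e.g. fewer ADMs
      parent = roulette_wheel_select(pop, fits)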

  16. Solving SAT Problem Based on Hybrid Differential Evolution Algorithm

    NASA Astrophysics Data System (ADS)

    Liu, Kunqi; Zhang, Jingmin; Liu, Gang; Kang, Lishan

    The satisfiability (SAT) problem is an NP-complete problem. Based on an analysis of the problem, SAT is transformed into an equivalent optimization problem: minimizing an objective function. A hybrid differential evolution algorithm is proposed to solve the satisfiability problem. It makes full use of the strong local search capability of the hill-climbing algorithm and the strong global search capability of the differential evolution algorithm, which compensates for their respective disadvantages, improves the efficiency of the algorithm, and avoids stagnation. The experimental results show that the hybrid algorithm is efficient in solving SAT problems.
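
    A sketch of the reformulation described above: a CNF instance becomes an objective that counts unsatisfied clauses, so an assignment is a model exactly when the objective reaches its minimum of 0. A simple hill-climbing local search is also sketched; the differential evolution machinery is omitted.

      def unsatisfied(clauses, assignment):
          # clauses: DIMACS-style lists of signed ints, e.g. [1, -2] means (x1 OR NOT x2)
          # assignment: dict mapping variable index -> bool
          return sum(
              1 for clause in clauses
              if not any(assignment[abs(lit)] == (lit > 0) for lit in clause)
          )

      def hill_climb(clauses, assignment):
          best = unsatisfied(clauses, assignment)
          improved = True
          while improved and best > 0:
              improved = False
              for v in list(assignment):
                  assignment[v] = not assignment[v]          # tentative flip
                  score = unsatisfied(clauses, assignment)
                  if score < best:
                      best, improved = score, True           # keep the improving flip
                  else:
                      assignment[v] = not assignment[v]      # revert
          return assignment, best

      cnf = [[1, -2], [2, 3], [-1, -3]]
      model, cost = hill_climb(cnf, {1: False, 2: False, 3: False})
      print(model, cost)   # cost == 0 means all clauses are satisfied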

  17. A Unified Differential Evolution Algorithm for Global Optimization

    SciTech Connect

    Qiang, Ji; Mitchell, Chad

    2014-06-24

    In this paper, we propose a new unified differential evolution (uDE) algorithm for single-objective global optimization. Instead of selecting among multiple mutation strategies as in the conventional differential evolution algorithm, this algorithm employs a single equation as the mutation strategy. It has the virtue of mathematical simplicity and also provides users the flexibility for broader exploration of different mutation strategies. Numerical tests using twelve basic unimodal and multimodal functions show promising performance of the proposed algorithm in comparison to conventional differential evolution algorithms.
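
    For context, a sketch of the conventional DE/rand/1 mutation strategy that such unified formulations generalize; the paper's single unified equation is not reproduced here, so this only illustrates what a "mutation strategy" is.

      import numpy as np

      def de_rand_1_mutation(population: np.ndarray, i: int, F: float = 0.8):
          # population: (NP, D) array; build a donor for individual i from
          # three distinct randomly chosen individuals r1, r2, r3 != i.
          NP = population.shape[0]
          choices = [j for j in range(NP) if j != i]
          r1, r2, r3 = np.random.default_rng().choice(choices, size=3, replace=False)
          return population[r1] + F * (population[r2] - population[r3])

      pop = np.random.default_rng(1).uniform(-5, 5, size=(10, 3))
      donor = de_rand_1_mutation(pop, i=0)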

  18. Segmentation of MRI Brain Images with an Improved Harmony Searching Algorithm

    PubMed Central

    Yang, Zhang; Li, Guo; Weifeng, Ding

    2016-01-01

    The harmony searching (HS) algorithm is an optimization search algorithm currently applied in many practical problems. The HS algorithm constantly revises the variables in the harmony memory and the probabilities with which different values are used, so that the iterations converge towards the optimum. Accordingly, this study proposed a modified algorithm to improve the efficiency of the HS algorithm. First, a rough set algorithm was employed to improve the convergence and accuracy of the HS algorithm. Then, the optimal value was obtained using the improved HS algorithm. The optimal value at convergence was employed as the initial value of the fuzzy clustering algorithm for segmenting magnetic resonance imaging (MRI) brain images. Experimental results showed that the improved HS algorithm attained better convergence and more accurate results than the original HS algorithm. In our study, the MRI image segmentation performance of the improved algorithm was superior to that of the original fuzzy clustering method. PMID:27403428

  19. Segmentation of MRI Brain Images with an Improved Harmony Searching Algorithm.

    PubMed

    Yang, Zhang; Shufan, Ye; Li, Guo; Weifeng, Ding

    2016-01-01

    The harmony searching (HS) algorithm is an optimization search algorithm currently applied in many practical problems. The HS algorithm constantly revises the variables in the harmony memory and the probabilities with which different values are used, so that the iterations converge towards the optimum. Accordingly, this study proposed a modified algorithm to improve the efficiency of the HS algorithm. First, a rough set algorithm was employed to improve the convergence and accuracy of the HS algorithm. Then, the optimal value was obtained using the improved HS algorithm. The optimal value at convergence was employed as the initial value of the fuzzy clustering algorithm for segmenting magnetic resonance imaging (MRI) brain images. Experimental results showed that the improved HS algorithm attained better convergence and more accurate results than the original HS algorithm. In our study, the MRI image segmentation performance of the improved algorithm was superior to that of the original fuzzy clustering method. PMID:27403428
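
    As background, a sketch of the basic harmony search improvisation step that the study builds on, with the usual HS parameters (harmony memory considering rate hmcr, pitch adjusting rate par, bandwidth bw); the rough-set and fuzzy-clustering extensions are not shown.

      import random

      def improvise(harmony_memory, lower, upper, hmcr=0.9, par=0.3, bw=0.05):
          dim = len(harmony_memory[0])
          new = []
          for d in range(dim):
              if random.random() < hmcr:
                  # Memory consideration: reuse a value stored in the memory.
                  value = random.choice(harmony_memory)[d]
                  if random.random() < par:
                      # Pitch adjustment: perturb the chosen value slightly.
                      value += random.uniform(-bw, bw)
              else:
                  # Random consideration: draw a fresh value from the range.
                  value = random.uniform(lower[d], upper[d])
              new.append(min(max(value, lower[d]), upper[d]))
          return new

      hm = [[random.uniform(0, 1) for _ in range(2)] for _ in range(5)]
      candidate = improvise(hm, lower=[0, 0], upper=[1, 1])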

  1. Evolutionary pattern search algorithms

    SciTech Connect

    Hart, W.E.

    1995-09-19

    This paper defines a class of evolutionary algorithms called evolutionary pattern search algorithms (EPSAs) and analyzes their convergence properties. This class of algorithms is closely related to evolutionary programming, evolution strategies and real-coded genetic algorithms. EPSAs are self-adapting systems that modify the step size of the mutation operator in response to the success of previous optimization steps. The rule used to adapt the step size can be used to provide a stationary-point convergence theory for EPSAs on any continuous function. This convergence theory is based on an extension of the convergence theory for generalized pattern search methods. An experimental analysis of the performance of EPSAs demonstrates that these algorithms can perform a level of global search that is comparable to that of canonical EAs. We also describe a stopping rule for EPSAs, which reliably terminated near stationary points in our experiments. This is the first stopping rule for any class of EAs that can terminate at a given distance from stationary points.
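
    As an illustration of success-based step-size adaptation of the kind EPSAs use, here is a minimal sketch built on the classical one-fifth success rule; the rule and thresholds are assumptions for illustration, not the paper's adaptation rule.

      import random

      def adapt_step(step, successes, trials, expand=2.0, contract=0.5):
          # Expand the mutation step after a successful epoch, contract otherwise.
          return step * (expand if successes / trials > 0.2 else contract)

      def sphere(x):
          return sum(v * v for v in x)

      x, step = [2.0, -1.5], 0.5
      for epoch in range(20):
          wins = 0
          for _ in range(10):
              trial = [v + random.uniform(-step, step) for v in x]
              if sphere(trial) < sphere(x):   # mutation succeeded
                  x, wins = trial, wins + 1
          step = adapt_step(step, wins, 10)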

  2. Comparison of algorithms for out-of-plane artifacts removal in digital tomosynthesis reconstructions.

    PubMed

    Bliznakova, K; Bliznakov, Z; Buliev, I

    2012-07-01

    Digital tomosynthesis is a method of limited-angle reconstruction of tomographic images produced at variable heights, on the basis of a set of angular projections taken in an arc around the human anatomy. Tomograms reconstructed from unprocessed original projection images, however, are invariably affected by tomographic noise, such as blurred images of objects lying outside the plane of interest superimposed on the focused image of the fulcrum plane. The present work investigates the performance of two approaches for generating tomograms with reduced noise: a generalised post-processing method, based on constructing a noise mask from all planes in the reconstructed volume and subsequently subtracting it from the in-focus plane, and a filtered Multiple Projection Algorithm. The comparison between the two algorithms shows that the first method provides reconstructions of very good quality in the case of high-contrast features, especially those embedded in a heterogeneous background.
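
    A minimal sketch of the first approach as described, assuming the reconstructed stack is already available: a noise mask is built from all other planes and subtracted from the in-focus plane. The simple mean mask and the weighting factor alpha are assumed details.

      import numpy as np

      def denoise_plane(volume: np.ndarray, k: int, alpha: float = 1.0):
          """volume: (n_planes, H, W) reconstructed stack; k: in-focus plane index."""
          others = np.delete(volume, k, axis=0)
          mask = others.mean(axis=0)           # blur contributed by the other planes
          cleaned = volume[k] - alpha * mask   # subtract the noise mask
          return np.clip(cleaned, 0, None)

      vol = np.random.default_rng(0).random((11, 64, 64))
      plane5 = denoise_plane(vol, k=5)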

  3. Social-Stratification Probabilistic Routing Algorithm in Delay-Tolerant Network

    NASA Astrophysics Data System (ADS)

    Alnajjar, Fuad; Saadawi, Tarek

    Routing in mobile ad hoc networks (MANETs) is complicated by the fact that the network graph is only episodically connected. In a MANET, the topology changes rapidly because of weather, terrain and jamming. A key challenge is to create a mechanism that can provide good delivery performance and low end-to-end delay in an intermittent network graph where nodes may move freely. The Delay-Tolerant Networking (DTN) architecture is designed to provide communication in intermittently connected networks by moving messages towards the destination via a "store, carry and forward" technique that supports multiple routing algorithms for finding the best path towards the destination. In this paper, we propose the use of probabilistic routing in the DTN architecture using the concept of a social-stratification network. We use the Opportunistic Network Environment (ONE) simulator to compare the proposed Social-Stratification Probabilistic Routing Algorithm (SPRA) with common DTN-based protocols. Our results show that SPRA outperforms the other protocols.
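
    For background, the delivery-predictability updates of PRoPHET-style probabilistic DTN routing, on which approaches like SPRA build; the social-stratification weighting itself is not specified in the abstract and is not shown.

      # Standard PRoPHET constants: initialization, aging and transitivity factors.
      P_INIT, GAMMA, BETA = 0.75, 0.98, 0.25

      def on_encounter(P, a, b):
          # Direct update when nodes a and b meet.
          P[(a, b)] = P.get((a, b), 0.0) + (1 - P.get((a, b), 0.0)) * P_INIT

      def age(P, dt):
          # Predictabilities decay as time passes without encounters.
          for key in P:
              P[key] *= GAMMA ** dt

      def transitive(P, a, b, c):
          # If a meets b, and b often meets c, a gains predictability towards c.
          P[(a, c)] = P.get((a, c), 0.0) + (
              1 - P.get((a, c), 0.0)) * P.get((a, b), 0.0) * P.get((b, c), 0.0) * BETA

      P = {}
      on_encounter(P, "n1", "n2")
      on_encounter(P, "n2", "n3")
      transitive(P, "n1", "n2", "n3")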

  4. Heuristic algorithms for a storage location assignment problem in a chaotic warehouse

    NASA Astrophysics Data System (ADS)

    Quintanilla, Sacramento; Pérez, Ángeles; Ballestín, Francisco; Lino, Pilar

    2015-10-01

    The extensive application of emerging technologies is revolutionizing warehouse management. These technologies facilitate working with complex and powerful warehouse management models in which products do not have assigned fixed locations (random storage). Random storage allows the utilization of the available space to be optimized. In this context, and motivated by a real problem, this article presents a model that looks for the optimal allocation of goods in order to maximize the storage space availability within the restrictions of the warehouse. For the proposed model a construction method, a local search algorithm and different metaheuristics have been developed. The introduced algorithms can also be used for other purposes such as to assess when and how it is convenient to perform relocation of stored items to improve the current level of storage space availability. Computational tests performed on a set of randomly generated and real warehouse instances show the effectiveness of the proposed methods.
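
    The abstract does not detail the construction method, so the following is only a plausible minimal sketch of a greedy tightest-fit placement heuristic for random storage; the data model is entirely hypothetical.

      def assign(items, slots):
          # items: list of (item_id, size); slots: list of [slot_id, free_capacity]
          placement = {}
          for item_id, size in sorted(items, key=lambda it: -it[1]):   # big items first
              feasible = [s for s in slots if s[1] >= size]
              if not feasible:
                  placement[item_id] = None            # must wait or trigger relocation
                  continue
              slot = min(feasible, key=lambda s: s[1]) # tightest fit keeps big gaps free
              slot[1] -= size
              placement[item_id] = slot[0]
          return placement

      print(assign([("a", 3), ("b", 5), ("c", 2)], [["s1", 4], ["s2", 6]]))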

  5. Study on Ply Orientation Optimum Design for Composite Material Structure Based on Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Ma, Ai-Jun

    2016-05-01

    To find the optimum ply orientation design for a composite material structure, we propose a method based on a genetic algorithm and apply it to a composite frame case. First, we describe the structure, including the solid model and the mechanical properties of the material; we then create the finite element model of the composite frame and set up a static load step to obtain the displacement of the node of interest. Next, we formulate the optimization model and use a genetic algorithm to find the global optimum of the optimization problem, finally obtaining the best layer angles for the composite material case. The optimized ply orientation performed well: the results show that the objective function dropped by 16.6%. This case may provide a reference for the ply orientation optimum design of similar composite structures.

  6. Study on the classification algorithm of degree of arteriosclerosis based on fuzzy pattern recognition

    NASA Astrophysics Data System (ADS)

    Ding, Li; Zhou, Runjing; Liu, Guiying

    2010-08-01

    The pulse wave of the human body contains a large amount of physiological and pathological information, so a classification algorithm for the degree of arteriosclerosis based on fuzzy pattern recognition is studied in this paper. Taking the human pulse wave as the research object, we extract time- and frequency-domain characteristics of the pulse signal and select the parameters with the best clustering effect for arteriosclerosis identification. Moreover, the validity of the characteristic parameters is verified by the fuzzy ISODATA clustering method (FISOCM). Finally, the fuzzy pattern recognition system can quantitatively distinguish the degree of arteriosclerosis in patients. Testing on the 50 samples in the assembled pulse database, the experimental results show that the algorithm is practical and achieves good classification recognition.
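
    At the core of fuzzy ISODATA-style clustering are two alternating updates, sketched below for an arbitrary feature matrix; the paper's pulse-wave feature extraction is not shown, and X here is synthetic.

      import numpy as np

      def fcm_memberships(X, centers, m=2.0):
          # u[i, k] = membership of sample i in cluster k (rows sum to 1).
          d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
          inv = d ** (-2.0 / (m - 1.0))
          return inv / inv.sum(axis=1, keepdims=True)

      def fcm_centers(X, u, m=2.0):
          # Centers are membership-weighted means of the samples.
          w = u ** m
          return (w.T @ X) / w.sum(axis=0)[:, None]

      rng = np.random.default_rng(0)
      X = rng.random((50, 4))                   # 50 samples, 4 pulse-wave features
      centers = X[rng.choice(50, size=3, replace=False)]
      for _ in range(20):                        # alternate the two updates
          u = fcm_memberships(X, centers)
          centers = fcm_centers(X, u)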

  7. Segmentation algorithm via Cellular Neural/Nonlinear Network: implementation on Bio-inspired hardware platform

    NASA Astrophysics Data System (ADS)

    Karabiber, Fethullah; Vecchio, Pietro; Grassi, Giuseppe

    2011-12-01

    The Bio-inspired (Bi-i) Cellular Vision System is a computing platform consisting of sensing, array sensing-processing, and digital signal processing. The platform is based on the Cellular Neural/Nonlinear Network (CNN) paradigm. This article presents the implementation of a novel CNN-based segmentation algorithm on the Bi-i system. Each part of the algorithm, along with the corresponding implementation on the hardware platform, is carefully described throughout the article. The experimental results, carried out for the Foreman and Car-phone video sequences, highlight the feasibility of the approach, which provides a frame rate of about 26 frames/s. Comparisons with existing CNN-based methods show that the conceived approach is more accurate, thus representing a good trade-off between real-time requirements and accuracy.

  8. Constitutive Modeling and Algorithmic Implementation of a Plasticity-like Model for Trabecular Bone Structures

    NASA Astrophysics Data System (ADS)

    Gupta, Atul; Bayraktar, Harun H.; Fox, Julia C.; Keaveny, Tony M.; Papadopoulos, Panayiotis

    2007-06-01

    Trabecular bone is a highly porous orthotropic cellular solid material present inside human bones such as the femur (hip bone) and vertebra (spine). In this study, an infinitesimal plasticity-like model with isotropic/kinematic hardening is developed to describe yielding of trabecular bone at the continuum level. One of the unique features of this formulation is the development of the plasticity-like model in strain space for a yield envelope expressed in terms of principal strains having asymmetric yield behavior. An implicit return-mapping approach is adopted to obtain a symmetric algorithmic tangent modulus and a step-by-step procedure of algorithmic implementation is derived. To investigate the performance of this approach in a full-scale finite element simulation, the model is implemented in a non-linear finite element analysis program and several test problems including the simulation of loading of the human femur structures are analyzed. The results show good agreement with the experimental data.
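
    As a simplified illustration of the return-mapping idea referred to above, here is the classical one-dimensional elastic-predictor/plastic-corrector step with linear isotropic hardening and a symmetric yield stress; the paper's strain-space, asymmetric formulation for bone is more involved, and the material constants below are generic.

      def return_map(eps, eps_p, alpha, E=200e3, H=10e3, sigma_y=250.0):
          sigma_trial = E * (eps - eps_p)                 # elastic predictor
          f_trial = abs(sigma_trial) - (sigma_y + H * alpha)
          if f_trial <= 0.0:
              return sigma_trial, eps_p, alpha            # step stays elastic
          dgamma = f_trial / (E + H)                      # plastic corrector
          sign = 1.0 if sigma_trial > 0 else -1.0
          sigma = sigma_trial - E * dgamma * sign         # return to the yield surface
          return sigma, eps_p + dgamma * sign, alpha + dgamma

      sigma, eps_p, alpha = return_map(eps=0.003, eps_p=0.0, alpha=0.0)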

  9. Good Workers for Good Jobs: Improving Education and Workforce Systems in the US. Discussion Paper No. 1404-13

    ERIC Educational Resources Information Center

    Holzer, Harry J.

    2013-01-01

    Stagnant earnings and growing inequality in the US labor market reflect both a slowdown in the growth of worker skills and the growing matching of good-paying jobs to skilled workers. Improving the ties between colleges, workforce institutions, and employers would help more workers gain the needed skills. Evaluation evidence shows that training…

  10. Five-dimensional Janis-Newman algorithm

    NASA Astrophysics Data System (ADS)

    Erbin, Harold; Heurtier, Lucien

    2015-08-01

    The Janis-Newman algorithm has been shown to be successful in finding new stationary solutions of four-dimensional gravity. Generalizations to higher dimensions have already been found for the restricted cases with only one angular momentum. In this paper we propose an extension of this algorithm to five dimensions with two angular momenta, using the prescription of Giampieri, through two specific examples: the Myers-Perry and BMPV black holes. We also discuss possible extensions of our prescription to other dimensions and to the maximal number of angular momenta, and show how dimensions higher than six appear to be much more challenging to treat within this framework. Nonetheless, this general algorithm provides a unification of the formulation of the Janis-Newman algorithm in d = 3, 4, 5, from which several examples are presented, including the BTZ black hole.
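
    For orientation, the classical four-dimensional Janis-Newman prescription that the paper generalizes (standard textbook form, not the five-dimensional construction): starting from Schwarzschild in Eddington-Finkelstein coordinates, one performs the complex shift

      u \to u' = u - i a \cos\theta, \qquad
      r \to r' = r + i a \cos\theta, \qquad
      \frac{2M}{r} \to \frac{2M\,\mathrm{Re}(r')}{|r'|^{2}}
                    = \frac{2Mr}{r^{2} + a^{2}\cos^{2}\theta},

    with the reality of the metric handled by the Giampieri angular ansatz (not reproduced here), after which the Kerr solution is recovered.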

  11. Rigorous estimates for the relegation algorithm

    NASA Astrophysics Data System (ADS)

    Sansottera, Marco; Ceccaroni, Marta

    2016-07-01

    We revisit the relegation algorithm by Deprit et al. (Celest. Mech. Dyn. Astron. 79:157-182, 2001) in the light of rigorous Nekhoroshev-like theory. This relatively recent algorithm is nowadays widely used for implementing closed-form analytic perturbation theories, as it generalises the classical Birkhoff normalisation algorithm. The algorithm, here briefly explained by means of Lie transformations, has so far been introduced and used in a formal way, i.e. without providing any rigorous convergence or asymptotic estimates. The overall aim of this paper is to find such quantitative estimates and to show how the results about stability over exponentially long times can be recovered in a simple and effective way, at least in the non-resonant case.
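
    For reference, the Lie-transformation formalism the algorithm is phrased in (standard definitions only; the paper's estimates are not reproduced): the Lie operator of a generating function \chi acts through the Poisson bracket, and the normalised Hamiltonian is its exponential applied to H,

      L_\chi f = \{f, \chi\}, \qquad
      \exp(L_\chi) = \sum_{n \ge 0} \frac{1}{n!}\, L_\chi^{n}, \qquad
      H' = \exp(L_\chi)\, H .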

  12. Universal lossless compression algorithm for textual images

    NASA Astrophysics Data System (ADS)

    al Zahir, Saif

    2012-03-01

    In recent years, an unparalleled volume of textual information has been transported over the Internet via email, chatting, blogging, tweeting, digital libraries, and information retrieval systems. As the volume of text data has now exceeded 40% of the total volume of traffic on the Internet, compressing textual data becomes imperative. Many sophisticated algorithms have been introduced and employed for this purpose, including Huffman encoding, arithmetic encoding, the Ziv-Lempel family, Dynamic Markov Compression, and the Burrows-Wheeler Transform. My research presents a novel universal algorithm for compressing textual images. The algorithm comprises two parts: (1) a universal fixed-to-variable codebook; and (2) our row and column elimination coding scheme. Simulation results on a large number of Arabic, Persian, and Hebrew textual images show that this algorithm achieves a compression ratio of nearly 87%, which exceeds published results, including JBIG2.
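
    The paper does not spell out the row and column elimination scheme, so the following is only a minimal sketch of the general idea under assumed details: all-white rows and columns are dropped and their indices recorded so the decoder can reinsert them.

      import numpy as np

      def eliminate(img: np.ndarray):
          """img: 2D array of 0 (white) / 1 (ink)."""
          rows = np.flatnonzero(img.any(axis=1))   # rows containing ink
          cols = np.flatnonzero(img.any(axis=0))   # columns containing ink
          core = img[np.ix_(rows, cols)]           # compacted image to encode
          return core, rows, cols, img.shape

      def restore(core, rows, cols, shape):
          img = np.zeros(shape, dtype=core.dtype)
          img[np.ix_(rows, cols)] = core
          return img

      img = np.zeros((8, 8), dtype=np.uint8)
      img[2:4, 3:6] = 1
      core, r, c, shape = eliminate(img)
      assert np.array_equal(restore(core, r, c, shape), img)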

  13. A New Pivot Algorithm for Star Identification

    NASA Astrophysics Data System (ADS)

    Nah, Jakyoung; Yi, Yu; Kim, Yong Ha

    2014-09-01

    In this study, a star identification algorithm that utilizes pivot patterns instead of apparent magnitude information was developed. The new star identification algorithm consists of a two-step recognition process. In the first step, the brightest star in a sensor image is identified using the orientation of brightness between two stars as recognition information. In the second step, cell indexes, derived from the already identified brightest star, are used as new recognition information to identify dimmer stars. Using the cell index information, only a limited portion of the star catalogue database needs to be searched, which enables faster identification of dimmer stars. The new pivot algorithm does not require calibration of the apparent magnitude of a star, and it is robust to apparent magnitude errors compared to conventional pivot algorithms, which require apparent magnitude information.

  14. Machine Learning Algorithms for Automatic Classification of Marmoset Vocalizations

    PubMed Central

    Ribeiro, Sidarta; Pereira, Danillo R.; Papa, João P.; de Albuquerque, Victor Hugo C.

    2016-01-01

    Automatic classification of vocalization type could potentially become a useful tool for the acoustic monitoring of captive colonies of highly vocal primates. However, for classification to be useful in practice, a reliable algorithm that can be successfully trained on small datasets is necessary. In this work, we consider seven different classification algorithms with the goal of finding a robust classifier that can be successfully trained on small datasets. We found good classification performance (accuracy > 0.83 and F1-score > 0.84) using the Optimum Path Forest classifier. The dataset and algorithms are made publicly available. PMID:27654941

  15. Detecting cosmic strings in the CMB with the Canny algorithm

    SciTech Connect

    Amsel, Stephen; Brandenberger, Robert H; Berger, Joshua

    2008-04-15

    Line discontinuities in cosmic microwave background anisotropy maps are a distinctive prediction of models with cosmic strings. These signatures are visible in anisotropy maps with good angular resolution and should be identifiable using edge-detection algorithms. One such algorithm is the Canny algorithm. We study the potential of this algorithm to pick out the line discontinuities generated by cosmic strings. By applying the algorithm to small-scale microwave anisotropy maps generated from theoretical models with and without cosmic strings, we find that, given an angular resolution of several minutes of arc, cosmic strings can be detected down to a limit of the mass per unit length of the string which is one order of magnitude lower than the current upper bounds.
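
    A minimal sketch of running the Canny algorithm on a small-scale anisotropy map with scikit-image; the map here is synthetic noise, and the smoothing parameter is illustrative rather than the study's setting.

      import numpy as np
      from skimage.feature import canny

      rng = np.random.default_rng(0)
      cmb_map = rng.standard_normal((512, 512))   # stand-in anisotropy map

      # sigma controls the Gaussian smoothing applied before gradient
      # estimation; hysteresis thresholds are left at their defaults.
      edges = canny(cmb_map, sigma=2.0)
      print(f"edge pixels: {edges.sum()}")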

  16. Optical rate sensor algorithms

    NASA Astrophysics Data System (ADS)

    Uhde-Lacovara, Jo A.

    1989-12-01

    Optical sensors, in particular Charge Coupled Device (CCD) arrays, will be used on Space Station to track stars in order to provide inertial attitude reference. Algorithms are presented to derive attitude rate from the optical sensors. The first algorithm is a recursive differentiator. A variance reduction factor (VRF) of 0.0228 was achieved with a rise time of 10 samples; a VRF of 0.2522 gives a rise time of 4 samples. The second algorithm is based on direct manipulation of the pixel intensity outputs of the sensor. In 1-dimensional simulations, the derived rate was within 0.07 percent of the actual rate in the presence of additive Gaussian noise with a signal-to-noise ratio of 60 dB.

  17. Optical rate sensor algorithms

    NASA Technical Reports Server (NTRS)

    Uhde-Lacovara, Jo A.

    1989-01-01

    Optical sensors, in particular Charge Coupled Device (CCD) arrays, will be used on Space Station to track stars in order to provide inertial attitude reference. Algorithms are presented to derive attitude rate from the optical sensors. The first algorithm is a recursive differentiator. A variance reduction factor (VRF) of 0.0228 was achieved with a rise time of 10 samples; a VRF of 0.2522 gives a rise time of 4 samples. The second algorithm is based on direct manipulation of the pixel intensity outputs of the sensor. In 1-dimensional simulations, the derived rate was within 0.07 percent of the actual rate in the presence of additive Gaussian noise with a signal-to-noise ratio of 60 dB.
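
    A sketch of a simple recursive differentiator of the kind described: low-pass filtering the finite difference of successive position samples trades variance reduction against rise time. The filter structure and coefficient are illustrative assumptions, not the reported design.

      import random

      def recursive_differentiator(positions, dt, beta=0.7):
          rate_est, rates = 0.0, []
          for prev, curr in zip(positions, positions[1:]):
              raw = (curr - prev) / dt                          # finite-difference rate
              rate_est = beta * rate_est + (1 - beta) * raw     # recursive smoothing
              rates.append(rate_est)
          return rates

      # Constant true rate of 0.1 units/sample plus measurement noise.
      pos, x = [], 0.0
      for k in range(100):
          x += 0.1
          pos.append(x + random.gauss(0.0, 0.01))
      rates = recursive_differentiator(pos, dt=1.0)

    A beta closer to 1 yields a larger variance reduction at the cost of a longer rise time, which is the trade-off quantified in the abstract.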

  18. Is Good Fit Related to Good Behaviour? Goodness of Fit between Daycare Teacher-Child Relationships, Temperament, and Prosocial Behaviour

    ERIC Educational Resources Information Center

    Hipson, Will E.; Séguin, Daniel G.

    2016-01-01

    The Goodness-of-Fit model [Thomas, A., & Chess, S. (1977). Temperament and development. New York: Brunner/Mazel] proposes that a child's temperament interacts with the environment to influence child outcomes. In the past, researchers have shown how the association between the quality of the teacher-child relationship in daycare and child…

  19. New Effective Multithreaded Matching Algorithms

    SciTech Connect

    Manne, Fredrik; Halappanavar, Mahantesh

    2014-05-19

    Matching is an important combinatorial problem with a number of applications in areas such as community detection, sparse linear algebra, and network alignment. Since computing optimal matchings can be very time consuming, several fast approximation algorithms, both sequential and parallel, have been suggested. Common to the algorithms giving the best solutions is that they tend to be sequential by nature, while algorithms more suitable for parallel computation give solutions of lower quality. We present a new simple 1/2-approximation algorithm for the weighted matching problem. This algorithm is both faster than any other suggested sequential 1/2-approximation algorithm on almost all inputs and also scales better than previous multithreaded algorithms. We further extend this to a general scalable multithreaded algorithm that computes matchings of weight comparable with the best sequential algorithms. The performance of the suggested algorithms is documented through extensive experiments on different multithreaded architectures.
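
    For contrast, the classical greedy 1/2-approximation for weighted matching (a standard sequential baseline, not the paper's new algorithm): scan edges in decreasing weight order and take an edge whenever both endpoints are still free.

      def greedy_matching(edges):
          # edges: list of (weight, u, v)
          matched, matching = set(), []
          for w, u, v in sorted(edges, reverse=True):   # heaviest edges first
              if u not in matched and v not in matched:
                  matching.append((u, v, w))
                  matched.update((u, v))
          return matching

      edges = [(5.0, "a", "b"), (4.0, "b", "c"), (3.0, "c", "d"), (6.0, "a", "d")]
      print(greedy_matching(edges))

    Every edge skipped by the greedy scan shares an endpoint with a chosen edge of at least equal weight, which is why the total weight is at least half the optimum.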

  20. "No-Shows": A Vexing Problem.

    ERIC Educational Resources Information Center

    Holmes, John; And Others

    1980-01-01

    What can we learn about the applicants who do not show up on campus to register? This study suggests both a method for learning more about "no-shows" and the reasons why they change their mind. (Author)