Why is Boris Algorithm So Good?
Qin, Hong; et al.
2013-03-03
Due to its excellent long term accuracy, the Boris algorithm is the de facto standard for advancing a charged particle. Despite its popularity, up to now there has been no convincing explanation why the Boris algorithm has this advantageous feature. In this letter, we provide an answer to this question. We show that the Boris algorithm conserves phase space volume, even though it is not symplectic. The global bound on energy error typically associated with symplectic algorithms still holds for the Boris algorithm, making it an effective algorithm for the multi-scale dynamics of plasmas.
Why is Boris algorithm so good?
Qin, Hong; Plasma Physics Laboratory, Princeton University, Princeton, New Jersey 08543 ; Zhang, Shuangxi; Xiao, Jianyuan; Liu, Jian; Sun, Yajuan; Tang, William M.
2013-08-15
Due to its excellent long term accuracy, the Boris algorithm is the de facto standard for advancing a charged particle. Despite its popularity, up to now there has been no convincing explanation why the Boris algorithm has this advantageous feature. In this paper, we provide an answer to this question. We show that the Boris algorithm conserves phase space volume, even though it is not symplectic. The global bound on energy error typically associated with symplectic algorithms still holds for the Boris algorithm, making it an effective algorithm for the multi-scale dynamics of plasmas.
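The update rule the two abstracts above analyze is short enough to state in full. Below is a minimal non-relativistic Boris pusher sketch in Python; the field values, charge-to-mass ratio, and step size in the demonstration are illustrative choices, not taken from the paper:

```python
import numpy as np

def boris_push(x, v, E, B, q_m, dt):
    """One step of the Boris scheme: half electric kick,
    exact rotation about B, second half electric kick."""
    v_minus = v + 0.5 * q_m * E * dt           # first half kick
    t = 0.5 * q_m * B * dt                     # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)    # |v_plus| == |v_minus|
    v_new = v_plus + 0.5 * q_m * E * dt        # second half kick
    return x + v_new * dt, v_new

# Pure gyration: the speed (hence kinetic energy) shows no drift.
x, v = np.zeros(3), np.array([1.0, 0.0, 0.0])
E, B = np.zeros(3), np.array([0.0, 0.0, 1.0])
for _ in range(10000):
    x, v = boris_push(x, v, E, B, q_m=1.0, dt=0.1)
speed = np.linalg.norm(v)                      # still 1 up to rounding error
```

The magnetic rotation in the middle is norm-preserving by construction, which is the concrete face of the energy behavior the paper explains: with E = 0 the kinetic energy is conserved to rounding error over arbitrarily many steps.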
Can You Show You Are a Good Lecturer?
ERIC Educational Resources Information Center
Wood, Leigh N.; Harding, Ansie
2007-01-01
Measurement of the quality of teaching activities is becoming increasingly important since universities are rewarding performance in terms of promotion, awards and bonuses and research is no longer the only key performance indicator. Good teaching is not easy to identify and measure. This paper specifically deals with the issue of good teaching in…
Winners show the way to good management in health care.
Schwefel, D; Pons, M C
1994-01-01
To stimulate resourcefulness in the health care services of the Philippines, the German Agency for Technical Cooperation (GTZ) organized a competition to discover and publicize examples of good management. The results provide a rich fund of new ideas. PMID:7999220
Cationorm shows good tolerability on human HCE-2 corneal epithelial cell cultures.
Kinnunen, Kati; Kauppinen, Anu; Piippo, Niina; Koistinen, Arto; Toropainen, Elisa; Kaarniranta, Kai
2014-03-01
mitochondrial metabolism to 73% with Cationorm and 53% with BAK from that of the control cells after 30 min exposure in MTT assay. BAK was the only test compound having clear adverse effects on the cell number and metabolism in CCK-8 assay. The activity of caspase-3 did not show significant differences between the groups. Inflammatory response after exposure to Cationorm was significantly lower than after exposure to BAK. There were no significant differences in NF-κB activity between the groups. Diluted Cationorm and Systane with polyquaternium-1/polidronium chloride 0.001% showed good tolerability on HCE-2 cells and thereby provide a clear improvement when compared to BAK-containing eye drop formulations. PMID:24462278
You Showed Your Whiteness: You Don't Get a "Good" White People's Medal
ERIC Educational Resources Information Center
Hayes, Cleveland; Juarez, Brenda G.
2009-01-01
The White liberal is a person who finds themselves defined as White, as an oppressor, in short, and retreats in horror from that designation. The desire to be and to be known as a good White person stems from the recognition that Whiteness is problematic, recognition that many White liberals try to escape by being demonstrably different from…
Nonoperatively treated forearm shaft fractures in children show good long-term recovery
Sinikumpu, Juha-Jaakko; Victorzon, Sarita; Antila, Eeva; Pokka, Tytti; Serlo, Willy
2014-01-01
Background and purpose — The incidence of forearm shaft fractures in children has increased, and operative treatment has become more common relative to nonoperative treatment in recent years. We analyzed the long-term results of nonoperative treatment. Patients and methods — We performed a population-based age- and sex-matched case-control study in Vaasa Central Hospital, concerning fractures treated in the period 1995–1999. There were 47 nonoperatively treated both-bone forearm shaft fractures, and the patients all participated in the study. 1 healthy control per case was randomly selected and evaluated for comparison. We analyzed clinical and radiographic outcomes of all fractures at a mean of 11 (9–14) years after the trauma. Results — The main outcome, pronosupination of the forearm, was not decreased in the long term. Grip strength was as good as in the controls. Wrist mobility was similar in flexion (85°) and extension (83°) compared to the contralateral side. The patients were satisfied with the outcome, and pain-free. Radiographically, 4 cases had radio-carpal joint degeneration and 4 had a local bone deformity. Interpretation — The long-term outcome of nonoperatively treated both-bone forearm shaft fractures in children was excellent. PMID:25238437
Oxygen isotopes in tree rings show good coherence between species and sites in Bolivia
NASA Astrophysics Data System (ADS)
Baker, Jessica C. A.; Hunt, Sarah F. P.; Clerici, Santiago J.; Newton, Robert J.; Bottrell, Simon H.; Leng, Melanie J.; Heaton, Timothy H. E.; Helle, Gerhard; Argollo, Jaime; Gloor, Manuel; Brienen, Roel J. W.
2015-10-01
A tree ring oxygen isotope (δ18OTR) chronology developed from one species (Cedrela odorata) growing in a single site has been shown to be a sensitive proxy for rainfall over the Amazon Basin, thus allowing reconstructions of precipitation in a region where meteorological records are short and scarce. Although these results suggest that there should be large-scale (> 100 km) spatial coherence of δ18OTR records in the Amazon, this has not been tested. Furthermore, it is of interest to investigate whether other, possibly longer-lived, species similarly record interannual variation of Amazon precipitation, and can be used to develop climate sensitive isotope chronologies. In this study, we measured δ18O in tree rings from seven lowland and one highland tree species from Bolivia. We found that cross-dating with δ18OTR gave more accurate tree ring dates than using ring width. Our "isotope cross-dating approach" is confirmed with radiocarbon "bomb-peak" dates, and has the potential to greatly facilitate development of δ18OTR records in the tropics, identify dating errors, and check annual ring formation in tropical trees. Six of the seven lowland species correlated significantly with C. odorata, showing that variation in δ18OTR has a coherent imprint across very different species, most likely arising from a dominant influence of source water δ18O on δ18OTR. In addition we show that δ18OTR series cohere over large distances, within and between species. Comparison of two C. odorata δ18OTR chronologies from sites several hundreds of kilometres apart showed a very strong correlation (r = 0.80, p < 0.001, 1901-2001), and a significant (but weaker) relationship was found between lowland C. odorata trees and a Polylepis tarapacana tree growing in the distant Altiplano (r = 0.39, p < 0.01, 1931-2001). This large-scale coherence of δ18OTR records is probably triggered by a strong spatial coherence in precipitation δ18O due to large-scale controls. These results
Ehsan, Shoaib; Kanwal, Nadia; Clark, Adrian F; McDonald-Maier, Klaus D
2012-01-01
Speeded-Up Robust Features is a feature extraction algorithm designed for real-time execution, although this is rarely achievable on low-power hardware such as that in mobile robots. One way to reduce the computation is to discard some of the scale-space octaves, and previous research has simply discarded the higher octaves. This paper shows that this approach is not always the most sensible and presents an algorithm for choosing which octaves to discard based on the properties of the imagery. Results obtained with this best octaves algorithm show that it is able to achieve a significant reduction in computation without compromising matching performance. PMID:21712160
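The abstract above does not spell out its octave-selection criterion, but the general shape of such an approach can be sketched. The scoring rule below (counting strong discrete-Laplacian responses per octave) is purely an illustrative assumption, not the paper's actual method, and `score_octaves`, its threshold, and its parameters are all hypothetical names and values:

```python
import numpy as np

def score_octaves(img, n_octaves=4, keep=2, threshold=0.05):
    """Rank scale-space octaves by a crude blob-response count and
    return the indices of the `keep` highest-scoring octaves.
    The scoring rule is an illustrative stand-in only."""
    scores = []
    for o in range(n_octaves):
        level = img[::2 ** o, ::2 ** o]        # crude per-octave downsampling
        # discrete Laplacian as a cheap proxy for blob-response strength
        lap = (np.roll(level, 1, 0) + np.roll(level, -1, 0)
               + np.roll(level, 1, 1) + np.roll(level, -1, 1)
               - 4.0 * level)
        scores.append(int((np.abs(lap) > threshold).sum()))
    return sorted(range(n_octaves), key=lambda o: scores[o], reverse=True)[:keep]

# Fine-grained texture concentrates responses in the finest octave.
rng = np.random.default_rng(42)
selected = score_octaves(rng.random((64, 64)))
```

Under this toy criterion, noisy fine-scale imagery keeps octave 0 and discards the coarser octaves, while smooth imagery would shift the ranking upward — the same kind of image-dependent decision the paper argues for.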
A brief 5-item version of the Neck Disability Index shows good psychometric properties
2013-01-01
Background The purpose of this secondary analysis of clinical databases of people with neck pain was to use a mixed unique conceptual and statistical approach to develop a brief version of the Neck Disability Index (NDI). Methods An a priori framework of neck-related function based on the International Classification of Functioning, Disability and Health was used to identify items from the original 10-item NDI that do not conceptually fit. Remaining items were subject to Rasch analysis to identify items that did not statistically fit with axioms of quantitative measurement. Finally, approaches drawn from classical test theory were used to compare stability, responsiveness and concurrent validity of the original NDI, the new brief NDI and the linearly-transformed brief NDI. Results Conceptual analysis identified 3 items that did not fit with the construct of self-reported ability to perform activity: pain intensity, headache, and sleeping. These items were removed, and responses to the remaining 7 items drawn from an assembled database of 316 physiotherapy patients with neck pain were subject to Rasch analysis. Two items were removed due to either considerable differential item functioning (reading) or statistical redundancy (lifting). The remaining items were considered the NDI-5. Test-retest reliability, responsiveness, sensitivity to change, and concurrent validity were all comparable across the original NDI, NDI-5 and linearly-transformed NDI-5. Sensitivity to change over a 1-month period of physiotherapy was the notable exception, where the linearly-transformed NDI-5 showed superiority over the other two forms. Conclusions A shortened version of the NDI, the NDI-5, has been constructed that is conceptually and statistically sound. Implications for research and clinical practice are discussed. Comparison with the NDI-8 is provided that suggests overall similar function across the forms, although the latter may be more sensitive to change. PMID:23816395
Good-Enough Brain Model: Challenges, Algorithms, and Discoveries in Multisubject Experiments.
Papalexakis, Evangelos E; Fyshe, Alona; Sidiropoulos, Nicholas D; Talukdar, Partha Pratim; Mitchell, Tom M; Faloutsos, Christos
2014-12-01
Given a simple noun such as apple, and a question such as "Is it edible?," what processes take place in the human brain? More specifically, given the stimulus, what are the interactions between (groups of) neurons (also known as functional connectivity) and how can we automatically infer those interactions, given measurements of the brain activity? Furthermore, how does this connectivity differ across different human subjects? In this work, we show that this problem, even though originating from the field of neuroscience, can benefit from big data techniques; we present a simple, novel good-enough brain model, or GeBM in short, and a novel algorithm Sparse-SysId, which are able to effectively model the dynamics of the neuron interactions and infer the functional connectivity. Moreover, GeBM is able to simulate basic psychological phenomena such as habituation and priming (whose definition we provide in the main text). We evaluate GeBM by using real brain data. GeBM produces brain activity patterns that are strikingly similar to the real ones, where the inferred functional connectivity is able to provide neuroscientific insights toward a better understanding of the way that neurons interact with each other, as well as detect regularities and outliers in multisubject brain activity measurements. PMID:27442756
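The paper's Sparse-SysId algorithm is not reproduced here, but the underlying idea of inferring interactions from activity traces can be illustrated with a plain least-squares system-identification sketch. The linear model, the dimensions, and the name `fit_linear_dynamics` are assumptions for illustration, not the paper's formulation:

```python
import numpy as np

def fit_linear_dynamics(X):
    """Fit A in x_{t+1} ≈ A x_t from a trajectory X of shape (T, n).

    The recovered A plays the role of a connectivity matrix:
    entry A[i, j] is the inferred influence of unit j on unit i
    one time step later.
    """
    X0, X1 = X[:-1], X[1:]
    M, *_ = np.linalg.lstsq(X0, X1, rcond=None)  # solves X0 @ M ≈ X1
    return M.T                                   # since X1 = X0 @ A.T

# Recover a known interaction matrix from its own noiseless trajectory.
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.8]])
X = [np.array([1.0, 1.0])]
for _ in range(30):
    X.append(A_true @ X[-1])
A_hat = fit_linear_dynamics(np.array(X))
```

On noiseless data from a stable linear system this recovers the generating matrix exactly; the interesting part of the paper is precisely what this sketch omits (sparsity, hidden states, and multisubject structure).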
Koiwai, Keiichiro; Sasaki, Shigeru; Yoshizawa, Eriko; Ina, Hironobu; Fukazawa, Ayumu; Sakai, Katsuya; Ozawa, Takesumi; Matsushita, Hirohide; Kadoya, Masumi
2014-03-01
To evaluate the validity of a decrease in the radiation dose for patients who were good responders to chemotherapy for localized diffuse large B-cell lymphoma (DLBCL), 91 patients with localized DLBCL who underwent radiotherapy after multi-agent chemotherapy from 1988-2008 were reviewed. Exclusion criteria were as follows: central nervous system or nasal cavity primary site, or Stage II with bulky tumor (≥10 cm). Of these patients, 62 were identified as good responders to chemotherapy. They were divided into two groups receiving either a higher or a lower radiation dose (32-50.4 Gy or 15-30.6 Gy, respectively). There were no statistically significant differences between the lower and higher dose groups in progression-free survival, locoregional progression-free survival or overall survival. Adaptation of decreased radiation dose may be valid for localized DLBCL patients who show a good response to chemotherapy. PMID:24187329
Rapacciuolo, Giovanni; Roy, David B.; Gillings, Simon; Fox, Richard; Walker, Kevin; Purvis, Andy
2012-01-01
Conservation planners often wish to predict how species distributions will change in response to environmental changes. Species distribution models (SDMs) are the primary tool for making such predictions. Many methods are widely used; however, they all make simplifying assumptions, and predictions can therefore be subject to high uncertainty. With global change well underway, field records of observed range shifts are increasingly being used for testing SDM transferability. We used an unprecedented distribution dataset documenting recent range changes of British vascular plants, birds, and butterflies to test whether correlative SDMs based on climate change provide useful approximations of potential distribution shifts. We modelled past species distributions from climate using nine single techniques and a consensus approach, and projected the geographical extent of these models to a more recent time period based on climate change; we then compared model predictions with recent observed distributions in order to estimate the temporal transferability and prediction accuracy of our models. We also evaluated the relative effect of methodological and taxonomic variation on the performance of SDMs. Models showed good transferability in time when assessed using widespread metrics of accuracy. However, models had low accuracy to predict where occupancy status changed between time periods, especially for declining species. Model performance varied greatly among species within major taxa, but there was also considerable variation among modelling frameworks. Past climatic associations of British species distributions retain a high explanatory power when transferred to recent time – due to their accuracy to predict large areas retained by species – but fail to capture relevant predictors of change. We strongly emphasize the need for caution when using SDMs to predict shifts in species distributions: high explanatory power on temporally-independent records – as assessed
Is It that Difficult to Find a Good Preference Order for the Incremental Algorithm?
ERIC Educational Resources Information Center
Krahmer, Emiel; Koolen, Ruud; Theune, Mariet
2012-01-01
In a recent article published in this journal (van Deemter, Gatt, van der Sluis, & Power, 2012), the authors criticize the Incremental Algorithm (a well-known algorithm for the generation of referring expressions due to Dale & Reiter, 1995, also in this journal) because of its strong reliance on a pre-determined, domain-dependent Preference Order.…
Lanasa, M C; Allgood, S D; Slager, S L; Dave, S S; Love, C; Marti, G E; Kay, N E; Hanson, C A; Rabe, K G; Achenbach, S J; Goldin, L R; Camp, N J; Goodman, B K; Vachon, C M; Spector, L G; Rassenti, L Z; Leis, J F; Gockerman, J P; Strom, S S; Call, T G; Glenn, M; Cerhan, J R; Levesque, M C; Weinberg, J B; Caporaso, N E
2011-09-01
Monoclonal B-cell lymphocytosis (MBL) is a hematologic condition wherein small B-cell clones can be detected in the blood of asymptomatic individuals. Most MBL have an immunophenotype similar to chronic lymphocytic leukemia (CLL), and 'CLL-like' MBL is a precursor to CLL. We used flow cytometry to identify MBL from unaffected members of CLL kindreds. We identified 101 MBL cases from 622 study subjects; of these, 82 individuals with MBL were further characterized. In all, 91 unique MBL clones were detected: 73 CLL-like MBL (CD5(+)CD20(dim)sIg(dim)), 11 atypical MBL (CD5(+)CD20(+)sIg(+)) and 7 CD5(neg) MBL (CD5(neg)CD20(+)sIg(neg)). Extended immunophenotypic characterization of these MBL subtypes was performed, and significant differences in cell surface expression of CD23, CD49d, CD79b and FMC-7 were observed among the groups. Markers of risk in CLL such as CD38, ZAP70 and CD49d were infrequently expressed in CLL-like MBL, but were expressed in the majority of atypical MBL. Interphase cytogenetics was performed in 35 MBL cases, and del 13q14 was most common (22/30 CLL-like MBL cases). Gene expression analysis using oligonucleotide arrays was performed on seven CLL-like MBL, and showed activation of B-cell receptor associated pathways. Our findings underscore the diversity of MBL subtypes and further clarify the relationship between MBL and other lymphoproliferative disorders. PMID:21617698
Searching good strategies in evolutionary minority game using variable length genetic algorithm
NASA Astrophysics Data System (ADS)
Yang, Wei-Song; Wang, Bing-Hong; Wu, Yi-Lin; Xie, Yan-Bo
2004-08-01
We propose and study a new adaptive minority game for understanding the complex dynamical behavior that arises when interacting agents compete for a limited resource, as in many natural and social systems. We compare an agent's strategy in the model to a chromosome in biology. In our model, agents with poor performance during a certain time period may modify their strategies via a variable length genetic algorithm consisting of cut and splice operators, imitating similar processes in biology. The performance of the agents in our model is calculated for different parameter conditions and different evolution mechanisms. It is found that the system may evolve into a much more ideal equilibrium state, which implies much stronger cooperation among agents and much more effective utilization of the social resources. It is also found that the distribution of the strategies held by agents tends toward a state concentrated in the small-m region.
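The cut and splice operator mentioned above can be sketched directly. The bit-list strategy encoding and the function name are assumptions for illustration; the point is that independent cut points give variable-length offspring, unlike fixed-length crossover:

```python
import random

def cut_and_splice(a, b, rng=random):
    """Variable-length recombination: cut each parent at an
    independent random point, then splice the swapped tails on.
    Offspring lengths generally differ from the parents'."""
    i = rng.randint(0, len(a))   # cut point in parent a
    j = rng.randint(0, len(b))   # cut point in parent b
    return a[:i] + b[j:], b[:j] + a[i:]

p1 = [0, 1, 0, 1, 1, 0]
p2 = [1, 1, 1, 0]
c1, c2 = cut_and_splice(p1, p2)
# Total genetic material is conserved even though lengths change.
assert len(c1) + len(c2) == len(p1) + len(p2)
```

Applied to minority-game strategies, this lets the memory length encoded by a strategy drift under selection rather than being fixed in advance.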
ERIC Educational Resources Information Center
Lowry, W. Kenneth
1977-01-01
Investigates whether today's students would score as well as students of the 1930-1950 era on achievement tests. Uses the Progressive Achievement Test, a test widely used in the 1930-1950 era as a barometer of student ability. (RK)
Nadanaciva, Sashi; Aleo, Michael D.; Strock, Christopher J.; Stedman, Donald B.; Wang, Huijun; Will, Yvonne
2013-10-15
To reduce costly late-stage compound attrition, there has been an increased focus on assessing compounds in in vitro assays that predict attributes of human safety liabilities, before preclinical in vivo studies are done. Relevant questions when choosing a panel of assays for predicting toxicity are (a) whether there is general concordance in the data among the assays, and (b) whether, in a retrospective analysis, the rank order of toxicity of compounds in the assays correlates with the known safety profile of the drugs in humans. The aim of our study was to answer these questions using nonsteroidal anti-inflammatory drugs (NSAIDs) as a test set since NSAIDs are generally associated with gastrointestinal injury, hepatotoxicity, and/or cardiovascular risk, with mitochondrial impairment and endoplasmic reticulum stress being possible contributing factors. Eleven NSAIDs, flufenamic acid, tolfenamic acid, mefenamic acid, diclofenac, meloxicam, sudoxicam, piroxicam, diflunisal, acetylsalicylic acid, nimesulide, and sulindac (and its two metabolites, sulindac sulfide and sulindac sulfone), were tested for their effects on (a) the respiration of rat liver mitochondria, (b) a panel of mechanistic endpoints in rat hepatocytes, and (c) the viability and organ morphology of zebrafish. We show good concordance for distinguishing among/between NSAID chemical classes in the observations among the three approaches. Furthermore, the assays were complementary and able to correctly identify “toxic” and “non-toxic” drugs in accordance with their human safety profile, with emphasis on hepatic and gastrointestinal safety. We recommend implementing our multi-assay approach in the drug discovery process to reduce compound attrition.
Highlights:
• NSAIDs cause liver and GI toxicity.
• Mitochondrial uncoupling contributes to NSAID liver toxicity.
• ER stress is a mechanism that contributes to liver toxicity.
• Zebrafish and cell-based assays are complementary.
da Silva, Bárbara Pereira; Dias, Desirrê Morais; de Castro Moreira, Maria Eliza; Toledo, Renata Celi Lopes; da Matta, Sérgio Luis Pinto; Lucia, Ceres Mattos Della; Martino, Hércia Stampini Duarte; Pinheiro-Sant'Ana, Helena Maria
2016-09-01
Chia has been consumed by the world population due to its high fiber, lipid and protein content. The objective was to evaluate the protein quality of untreated (seed and flour) and heat-treated (90 °C/20 min) chia, its influence on glucose and lipid homeostasis, and the integrity of liver and intestinal morphology in Wistar rats. Thirty-six weanling male rats were divided into six groups that received a control diet (casein), a protein-free diet (aproteic), or one of four test diets (chia seed; heat-treated chia seed; chia flour; heat-treated chia flour) for 14 days. The protein efficiency ratio (PER), net protein ratio (NPR) and true digestibility (TD) were evaluated. The biochemical variables and liver and intestinal morphologies of the animals were determined. The values of PER, NPR and TD did not differ among the animals that were fed with chia and were lower than in the control group. The animals that were fed with chia showed lower concentrations of glucose, triacylglycerides, low-density lipoprotein cholesterol and very low-density lipoprotein, and higher high-density lipoprotein cholesterol, than the control group. The liver weight of animals that were fed with chia was lower than in the control group. Crypt depth and thickness of intestinal muscle layers were higher in groups that were fed with chia. The consumption of chia showed good digestibility and a hypoglycemic effect, improved lipid and glycemic profiles, reduced fat deposition in the liver of the animals, and also promoted changes in intestinal tissue that enhanced its functionality. PMID:27193017
ERIC Educational Resources Information Center
Allocco, Katherine
2010-01-01
One of the most versatile and multi-faceted films that an educator can use to illustrate urban America in the 1930s is "Great Guy," a relatively obscure film from 1936 directed by John G. Blystone and starring James Cagney and Mae Clarke. There are some simple practical considerations that make the film such a good fit for an American history or…
Kry, Stephen F.; Alvarez, Paola; Molineu, Andrea; Amador, Carrie; Galvin, James; Followill, David S.
2012-01-01
Purpose To determine the impact of treatment planning algorithm on the accuracy of heterogeneous dose calculations in the Radiological Physics Center (RPC) thorax phantom. Methods and Materials We retrospectively analyzed the results of 304 irradiations of the RPC thorax phantom at 221 different institutions as part of credentialing for RTOG clinical trials; the irradiations were all done using 6-MV beams. Treatment plans included those for intensity-modulated radiation therapy (IMRT) as well as 3D conformal therapy (3D CRT). Heterogeneous plans were developed using Monte Carlo (MC), convolution/superposition (CS) and the anisotropic analytic algorithm (AAA), as well as pencil beam (PB) algorithms. For each plan and delivery, the absolute dose measured in the center of a lung target was compared to the calculated dose, as was the planar dose in 3 orthogonal planes. The difference between measured and calculated dose was examined as a function of planning algorithm as well as use of IMRT. Results PB algorithms overestimated the dose delivered to the center of the target by 4.9% on average. Surprisingly, CS algorithms and AAA also showed a systematic overestimation of the dose to the center of the target, by 3.7% on average. In contrast, the MC algorithm dose calculations agreed with measurement within 0.6% on average. There was no difference observed between IMRT and 3D CRT calculation accuracy. Conclusion Unexpectedly, advanced treatment planning systems (those using CS and AAA algorithms) overestimated the dose that was delivered to the lung target. This issue requires attention in terms of heterogeneity calculations and potentially in terms of clinical practice. PMID:23237006
Wang, Rui-Rui; Yang, Qing-Hua; Luo, Rong-Hua; Peng, You-Mei; Dai, Shao-Xing; Zhang, Xing-Jie; Chen, Huan; Cui, Xue-Qing; Liu, Ya-Juan; Huang, Jing-Fei; Chang, Jun-Biao; Zheng, Yong-Tang
2014-01-01
Azvudine is a novel nucleoside reverse transcriptase inhibitor with antiviral activity against human immunodeficiency virus, hepatitis B virus and hepatitis C virus. Here we report the in vitro activity of azvudine against HIV-1 and HIV-2, used alone or in combination with other antiretroviral drugs, and its drug resistance features. Azvudine exerted highly potent inhibition of HIV-1 (EC50s ranging from 0.03 to 6.92 nM) and HIV-2 (EC50s ranging from 0.018 to 0.025 nM). It also showed synergism in combination with six approved anti-HIV drugs in both C8166 cells and PBMCs. In the combination assays, the concentrations of azvudine used were 1000- or 500-fold lower than those of the other drugs. Azvudine also showed potent inhibition of NRTI-resistant strains (L74V and T69N). Although M184V caused a 250-fold reduction in susceptibility, azvudine remained active in the nanomolar range. In an in vitro resistance induction assay, the frequency of the M184I mutation increased with induction time, which suggests M184I as the key mutation under azvudine treatment. As a control, lamivudine treatment resulted in a higher frequency of M184I/V given the same induction time, and a higher occurrence of M184V was found. Molecular modeling analysis suggests that steric hindrance is more pronounced in the M184I mutant than in M184V due to the azido group of azvudine. The present data demonstrate the potential of azvudine as a complementary drug to current anti-HIV drugs. M184I should be the key mutation; however, azvudine remains active against HIV-1LAI-M184V in the nanomolar range. PMID:25144636
Good Agreements Make Good Friends
Han, The Anh; Pereira, Luís Moniz; Santos, Francisco C.; Lenaerts, Tom
2013-01-01
When starting a new collaborative endeavor, it pays to establish upfront how strongly your partner commits to the common goal and what compensation can be expected in case the collaboration is violated. Diverse examples in biological and social contexts have demonstrated the pervasiveness of making prior agreements on posterior compensations, suggesting that this behavior could have been shaped by natural selection. Here, we analyze the evolutionary relevance of such a commitment strategy and relate it to the costly punishment strategy, where no prior agreements are made. We show that when the cost of arranging a commitment deal lies within certain limits, substantial levels of cooperation can be achieved. Moreover, these levels are higher than that achieved by simple costly punishment, especially when one insists on sharing the arrangement cost. Not only do we show that good agreements make good friends, agreements based on shared costs result in even better outcomes. PMID:24045873
Latifi, Kujtim; Oliver, Jasmine; Baker, Ryan; Dilling, Thomas J.; Stevens, Craig W.; Kim, Jongphil; Yue, Binglin; DeMarco, MaryLou; Zhang, Geoffrey G.; Moros, Eduardo G.; Feygelman, Vladimir
2014-04-01
Purpose: Pencil beam (PB) and collapsed cone convolution (CCC) dose calculation algorithms differ significantly when used in the thorax. However, such differences have seldom been previously directly correlated with outcomes of lung stereotactic ablative body radiation (SABR). Methods and Materials: Data for 201 non-small cell lung cancer patients treated with SABR were analyzed retrospectively. All patients were treated with 50 Gy in 5 fractions of 10 Gy each. The radiation prescription mandated that 95% of the planning target volume (PTV) receive the prescribed dose. One hundred sixteen patients were planned with BrainLab treatment planning software (TPS) with the PB algorithm and treated on a Novalis unit. The other 85 were planned on the Pinnacle TPS with the CCC algorithm and treated on a Varian linac. Treatment planning objectives were numerically identical for both groups. The median follow-up times were 24 and 17 months for the PB and CCC groups, respectively. The primary endpoint was local/marginal control of the irradiated lesion. Gray's competing risk method was used to determine the statistical differences in local/marginal control rates between the PB and CCC groups. Results: Twenty-five patients planned with the PB algorithm and 4 patients planned with the CCC algorithm to the same nominal doses experienced local recurrence. There was a statistically significant difference in recurrence rates between the PB and CCC groups (hazard ratio 3.4 [95% confidence interval: 1.18-9.83], Gray's test P=.019). The differences (Δ) between the 2 algorithms for target coverage were as follows: ΔD99(GITV) = 7.4 Gy, ΔD99(PTV) = 10.4 Gy, ΔV90(GITV) = 13.7%, ΔV90(PTV) = 37.6%, ΔD95(PTV) = 9.8 Gy, and ΔD(ISO) = 3.4 Gy, where GITV = gross internal tumor volume. Conclusions: Local control in patients who were planned to the same nominal dose with the PB and CCC algorithms was statistically significantly different. Possible alternative
Nakamura, Harumi; Tsuta, Koji; Nakagawa, Takashi; Hirai, Risen; Ota, Yasunori
2015-12-01
Primary effusion lymphoma (PEL) is a rare subtype of non-Hodgkin lymphoma that proliferates in body cavities without detectable masses. PEL is universally associated with human herpes virus-8 (HHV-8) infection and has an aggressive prognosis. Recently, an HHV-8-unrelated PEL-like lymphoma that usually occurs in elderly individuals and follows a more indolent prognosis has been reported, and it is treated as a disease distinct from PEL. However, its pathogenesis and prognostic factors have not been sufficiently clarified. In PEL-like lymphoma accompanied by Epstein-Barr virus (EBV) infection, latent infection types are not mentioned in the literature. Herein, we report the case of an 85-year-old Japanese man with pericardial PEL-like lymphoma who showed good improvement in condition for 24 months after pericardiocentesis without chemotherapy. Serological test results were positive for EBV capsid antigen and EBV nuclear antigen 2 (EBNA2), but negative for human immunodeficiency virus, hepatitis B virus, and hepatitis C virus. The disease phenotype and EBV infection mechanism were immunohistochemically investigated by the cellblock prepared from pericardial effusion. Atypical cells were positive for CD20, CD30, CD45, BCL2, MUM1, EBNA2, latent membrane protein 1, and EBV-encoded RNA (on in situ hybridization), but negative for CD3, CD5, CD10, CD138, cytokeratin AE1/AE3, and HHV-8. Accordingly, this case was considered to be a B-cell activated phenotype with a type III latent EBV infection. Type III latent EBV infection is unusual in PEL. PMID:26384578
Applications and accuracy of the parallel diagonal dominant algorithm
NASA Technical Reports Server (NTRS)
Sun, Xian-He
1993-01-01
The Parallel Diagonal Dominant (PDD) algorithm is a highly efficient, ideally scalable tridiagonal solver. In this paper, a detailed study of the PDD algorithm is given. First the PDD algorithm is introduced. Then the algorithm is extended to solve periodic tridiagonal systems. A variant, the reduced PDD algorithm, is also proposed. Accuracy analysis is provided for a class of tridiagonal systems, the symmetric, and anti-symmetric Toeplitz tridiagonal systems. Implementation results show that the analysis gives a good bound on the relative error, and the algorithm is a good candidate for the emerging massively parallel machines.
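The sequential building block inside each PDD partition is the classic Thomas algorithm; the sketch below shows that per-partition solve. The parallel correction step across partition boundaries, which is the heart of PDD, is omitted here:

```python
def thomas_solve(a, b, c, d):
    """Serial Thomas algorithm for a tridiagonal system.

    a: sub-diagonal (a[0] unused), b: main diagonal,
    c: super-diagonal (c[-1] unused), d: right-hand side.
    This is the solve PDD runs independently on each processor's
    sub-system before reconciling the partition interfaces.
    """
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Symmetric, diagonally dominant 3x3 system with known solution [1, 2, 3].
x = thomas_solve(a=[0.0, 1.0, 1.0], b=[2.0, 2.0, 2.0],
                 c=[1.0, 1.0, 0.0], d=[4.0, 8.0, 8.0])
```

Diagonal dominance is what lets PDD truncate the inter-partition coupling terms with a small, analyzable error, which is the source of the relative-error bound the abstract mentions.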
ERIC Educational Resources Information Center
Gehring, John
2004-01-01
For the past 16 years, the blue-collar city of Huntington, West Virginia, has rolled out the red carpet to welcome young wrestlers and their families as old friends. They have come to town chasing the same dream for a spot in what many of them call "The Show". For three days, under the lights of an arena packed with 5,000 fans, the state's best…
ERIC Educational Resources Information Center
Schoenheimer, Henry P.
This book contains seventeen thumb-nail sketches of schools in Europe, the United States, Asia, Britain, and Australia, as they appeared in the eye of the author as a professional educator and a journalist while travelling around the world. The author considers the schools described to be good schools, and not necessarily the 17 best schools in…
Zhou, Yongquan; Xie, Jian; Li, Liangliang; Ma, Mingzhi
2014-01-01
The bat algorithm (BA) is a novel stochastic global optimization algorithm. The cloud model is an effective tool for transforming between qualitative concepts and their quantitative representations. Based on the bat echolocation mechanism and the cloud model's strengths in representing uncertain knowledge, a new cloud model bat algorithm (CBA) is proposed. This paper focuses on remodeling the echolocation model based on the living and preying characteristics of bats, utilizing the transformation theory of the cloud model to depict the qualitative concept "bats approach their prey." Furthermore, a Lévy flight mode and a population information communication mechanism are introduced to balance exploration and exploitation. The simulation results show that the cloud model bat algorithm performs well on function optimization. PMID:24967425
"Good mothering" or "good citizenship"?
Porter, Maree; Kerridge, Ian H; Jordens, Christopher F C
2012-03-01
Umbilical cord blood banking is one of many biomedical innovations that confront pregnant women with new choices about what they should do to secure their own and their child's best interests. Many mothers can now choose to donate their baby's umbilical cord blood (UCB) to a public cord blood bank or pay to store it in a private cord blood bank. Donation to a public bank is widely regarded as an altruistic act of civic responsibility. Paying to store UCB may be regarded as a "unique opportunity" to provide "insurance" for the child's future. This paper reports findings from a survey of Australian women that investigated the decision to either donate or store UCB. We conclude that mothers are faced with competing discourses that force them to choose between being a "good mother" and fulfilling their role as a "good citizen." We discuss this finding with reference to the concept of value pluralism. PMID:23180199
Atmospheric Science Data Center
2016-08-24
article title: Aerosol retrieval over Cape of Good Hope (Enlargement) ... SpectroRadiometer (MISR) image is an enlargement of the aerosol retrieval over Cape of Good Hope, August 23, 2000 , showing a more ... the incoming energy, so MISR's contribution is not only the aerosol retrieval necessary to do the correction, but the multi-angular ...
NASA Astrophysics Data System (ADS)
Zheng, Genrang; Lin, ZhengChun
The winner determination problem in combinatorial auctions is a hot topic in electronic business and an NP-hard problem. A Hybrid Artificial Fish Swarm Algorithm (HAFSA), which combines the First Suite Heuristic Algorithm (FSHA) with the Artificial Fish Swarm Algorithm (AFSA), is proposed to solve the problem based on the theory of AFSA. Experiment results show that the HAFSA is a fast and efficient algorithm for winner determination. Compared with the Ant Colony Optimization algorithm, it performs well and has broad application prospects.
Algorithms and Algorithmic Languages.
ERIC Educational Resources Information Center
Veselov, V. M.; Koprov, V. M.
This paper is intended as an introduction to a number of problems connected with the description of algorithms and algorithmic languages, particularly the syntaxes and semantics of algorithmic languages. The terms "letter, word, alphabet" are defined and described. The concept of the algorithm is defined and the relation between the algorithm and…
NASA Technical Reports Server (NTRS)
Arenstorf, Norbert S.; Jordan, Harry F.
1987-01-01
A barrier is a method for synchronizing a large number of concurrent computer processes. After considering some basic synchronization mechanisms, a collection of barrier algorithms with either linear or logarithmic depth are presented. A graphical model is described that profiles the execution of the barriers and other parallel programming constructs. This model shows how the interaction between the barrier algorithms and the work that they synchronize can impact their performance. One result is that logarithmic tree structured barriers show good performance when synchronizing fixed length work, while linear self-scheduled barriers show better performance when synchronizing fixed length work with an imbedded critical section. The linear barriers are better able to exploit the process skew associated with critical sections. Timing experiments, performed on an eighteen processor Flex/32 shared memory multiprocessor, that support these conclusions are detailed.
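As a minimal illustration of the linear (central-counter) barrier style discussed above, here is a Python sketch using a shared counter and a condition variable (names are illustrative; the paper's implementations target the Flex/32, not Python threads):

```python
import threading

class CentralBarrier:
    """Linear barrier: every arriving thread increments a shared counter;
    the last arrival resets it and wakes the rest."""
    def __init__(self, n):
        self.n = n
        self.count = 0
        self.generation = 0  # lets the barrier be reused across phases
        self.cond = threading.Condition()

    def wait(self):
        with self.cond:
            gen = self.generation
            self.count += 1
            if self.count == self.n:
                # last thread to arrive: reset and release everyone
                self.count = 0
                self.generation += 1
                self.cond.notify_all()
            else:
                while gen == self.generation:
                    self.cond.wait()

def run_demo(n=4, phases=3):
    """All of phase p must be logged before any thread starts phase p+1."""
    barrier = CentralBarrier(n)
    log, lock = [], threading.Lock()

    def worker():
        for p in range(phases):
            with lock:
                log.append(p)
            barrier.wait()

    threads = [threading.Thread(target=worker) for _ in range(n)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return log
```

A logarithmic tree-structured barrier replaces the single counter with pairwise combining up a tree, trading this version's O(n) serialization for O(log n) depth.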
NASA Technical Reports Server (NTRS)
Arenstorf, Norbert S.; Jordan, Harry F.
1989-01-01
A barrier is a method for synchronizing a large number of concurrent computer processes. After considering some basic synchronization mechanisms, a collection of barrier algorithms with either linear or logarithmic depth are presented. A graphical model is described that profiles the execution of the barriers and other parallel programming constructs. This model shows how the interaction between the barrier algorithms and the work that they synchronize can impact their performance. One result is that logarithmic tree structured barriers show good performance when synchronizing fixed length work, while linear self-scheduled barriers show better performance when synchronizing fixed length work with an imbedded critical section. The linear barriers are better able to exploit the process skew associated with critical sections. Timing experiments, performed on an eighteen processor Flex/32 shared memory multiprocessor that support these conclusions, are detailed.
Advanced optimization of permanent magnet wigglers using a genetic algorithm
Hajima, Ryoichi
1995-12-31
In permanent magnet wigglers, magnetic imperfections in each magnet piece cause field errors. These field errors can be reduced or compensated by sorting the magnet pieces into the proper order. We showed that a genetic algorithm has good properties for this sorting scheme. In this paper, the optimization scheme is applied to the case of permanent magnets that have errors in the field direction. The result shows that the genetic algorithm is superior to other algorithms.
NASA Astrophysics Data System (ADS)
Abrams, Daniel S.
This thesis describes several new quantum algorithms. These include a polynomial time algorithm that uses a quantum fast Fourier transform to find eigenvalues and eigenvectors of a Hamiltonian operator, and that can be applied in cases (commonly found in ab initio physics and chemistry problems) for which all known classical algorithms require exponential time. Fast algorithms for simulating many body Fermi systems are also provided in both first and second quantized descriptions. An efficient quantum algorithm for anti-symmetrization is given as well as a detailed discussion of a simulation of the Hubbard model. In addition, quantum algorithms that calculate numerical integrals and various characteristics of stochastic processes are described. Two techniques are given, both of which obtain an exponential speed increase in comparison to the fastest known classical deterministic algorithms and a quadratic speed increase in comparison to classical Monte Carlo (probabilistic) methods. I derive a simpler and slightly faster version of Grover's mean algorithm, show how to apply quantum counting to the problem, develop some variations of these algorithms, and show how both (apparently distinct) approaches can be understood from the same unified framework. Finally, the relationship between physics and computation is explored in some more depth, and it is shown that computational complexity theory depends very sensitively on physical laws. In particular, it is shown that nonlinear quantum mechanics allows for the polynomial time solution of NP-complete and #P oracle problems. Using the Weinberg model as a simple example, the explicit construction of the necessary gates is derived from the underlying physics. Nonlinear quantum algorithms are also presented using Polchinski type nonlinearities which do not allow for superluminal communication. (Copies available exclusively from MIT Libraries, Rm. 14- 0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)
Sobel, E.; Lange, K.; O`Connell, J.R.
1996-12-31
Haplotyping is the logical process of inferring gene flow in a pedigree based on phenotyping results at a small number of genetic loci. This paper formalizes the haplotyping problem and suggests four algorithms for haplotype reconstruction. These algorithms range from exhaustive enumeration of all haplotype vectors to combinatorial optimization by simulated annealing. Application of the algorithms to published genetic analyses shows that manual haplotyping is often erroneous. Haplotyping is employed in screening pedigrees for phenotyping errors and in positional cloning of disease genes from conserved haplotypes in population isolates. 26 refs., 6 figs., 3 tabs.
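The abstract above mentions combinatorial optimization by simulated annealing. A generic annealing loop can be sketched as follows; the toy bit-vector problem standing in for the haplotype-vector search is purely illustrative:

```python
import math, random

def simulated_annealing(cost, neighbor, state, t0=1.0, cooling=0.995,
                        steps=2000, seed=0):
    """Generic simulated annealing: accept worse moves with probability
    exp(-delta/T), where the temperature T decays geometrically."""
    rng = random.Random(seed)
    t = t0
    cur, cur_cost = state, cost(state)
    best, best_cost = cur, cur_cost
    for _ in range(steps):
        cand = neighbor(cur, rng)
        cand_cost = cost(cand)
        delta = cand_cost - cur_cost
        if delta <= 0 or rng.random() < math.exp(-delta / max(t, 1e-12)):
            cur, cur_cost = cand, cand_cost
            if cur_cost < best_cost:
                best, best_cost = cur, cur_cost
        t *= cooling
    return best, best_cost

# toy problem: recover a hidden bit pattern from a mismatch count
target = [1, 0, 1, 1, 0, 0, 1, 0]
cost = lambda s: sum(a != b for a, b in zip(s, target))

def flip_one(s, rng):
    s = list(s)
    i = rng.randrange(len(s))
    s[i] = 1 - s[i]  # flip one randomly chosen bit
    return s
```

In the haplotyping setting the state would be a vector of haplotype assignments and the cost a likelihood-based penalty, but the acceptance loop is the same.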
A Revision of the NASA Team Sea Ice Algorithm
NASA Technical Reports Server (NTRS)
Markus, T.; Cavalieri, Donald J.
1998-01-01
In a recent paper, two operational algorithms to derive ice concentration from satellite multichannel passive microwave sensors were compared. Although the results of these, known as the NASA Team algorithm and the Bootstrap algorithm, have been validated and are generally in good agreement, there are areas where the ice concentrations differ by up to 30%. These differences can be explained by shortcomings in one or the other algorithm. Here, we present an algorithm which, in addition to the 19 and 37 GHz channels used by both the Bootstrap and NASA Team algorithms, makes use of the 85 GHz channels as well. Atmospheric effects, particularly at 85 GHz, are reduced by using a forward atmospheric radiative transfer model. Comparisons with the NASA Team and Bootstrap algorithms show that the individual shortcomings of these algorithms are not apparent in this new approach. The results further show better quantitative agreement with ice concentrations derived from NOAA AVHRR infrared data.
Runtime support for parallelizing data mining algorithms
NASA Astrophysics Data System (ADS)
Jin, Ruoming; Agrawal, Gagan
2002-03-01
With recent technological advances, shared memory parallel machines have become more scalable, and offer large main memories and high bus bandwidths. They are emerging as good platforms for data warehousing and data mining. In this paper, we focus on shared memory parallelization of data mining algorithms. We have developed a series of techniques for parallelization of data mining algorithms, including full replication, full locking, fixed locking, optimized full locking, and cache-sensitive locking. Unlike previous work on shared memory parallelization of specific data mining algorithms, all of our techniques apply to a large number of common data mining algorithms. In addition, we propose a reduction-object based interface for specifying a data mining algorithm. We show how our runtime system can apply any of the techniques we have developed starting from a common specification of the algorithm.
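Of the techniques listed, full replication is the simplest: each thread reduces into a private copy of the reduction object, and the copies are merged afterwards, so the hot path needs no locks. A minimal Python sketch on a frequency-counting kernel (illustrative; not the authors' runtime system):

```python
import threading
from collections import Counter

def parallel_count(items, nthreads=4):
    """'Full replication' reduction: each thread accumulates into its own
    private Counter, and the per-thread results are merged at the end."""
    chunks = [items[i::nthreads] for i in range(nthreads)]
    locals_ = [Counter() for _ in range(nthreads)]

    def worker(tid):
        for x in chunks[tid]:
            locals_[tid][x] += 1  # private update, no synchronization

    threads = [threading.Thread(target=worker, args=(t,)) for t in range(nthreads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    total = Counter()
    for c in locals_:  # sequential merge step
        total.update(c)
    return total
```

The locking variants in the paper instead share one reduction object and differ in how finely the locks cover it; full replication trades extra memory for zero contention.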
NASA Astrophysics Data System (ADS)
Zhang, Yanjun; Yu, Chunjuan; Fu, Xinghu; Liu, Wenzhe; Bi, Weihong
2015-12-01
In distributed optical fiber sensing systems based on Brillouin scattering, strain and temperature are the main measured parameters, and they can be obtained by analyzing the Brillouin center frequency shift. A novel algorithm that combines the cuckoo search (CS) algorithm with an improved differential evolution (IDE) algorithm is proposed for Brillouin scattering parameter estimation. The CS-IDE algorithm is compared with the CS algorithm and analyzed in different situations. The results show that both the CS and CS-IDE algorithms have very good convergence. The analysis reveals that the CS-IDE algorithm can extract the scattering spectrum features under different linear weight ratios, linewidth combinations, and SNRs. Moreover, a BOTDR temperature measurement system based on electro-optic frequency shifting is set up to verify the effectiveness of the CS-IDE algorithm. Experimental results show that there is a good linear relationship between the Brillouin center frequency shift and temperature changes.
NASA Astrophysics Data System (ADS)
Huang, Xiaobiao; Safranek, James
2014-09-01
Nonlinear dynamics optimization is carried out for a low emittance upgrade lattice of SPEAR3 in order to improve its dynamic aperture and Touschek lifetime. Two multi-objective optimization algorithms, a genetic algorithm and a particle swarm algorithm, are used for this study. The performance of the two algorithms is compared. The result shows that the particle swarm algorithm converges significantly faster to similar or better solutions than the genetic algorithm, and it does not require seeding of good solutions in the initial population. These advantages may make the particle swarm algorithm more suitable for many accelerator optimization applications.
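A minimal single-objective particle swarm optimizer, sketched in Python on a toy objective (the paper's application is multi-objective accelerator lattice optimization; everything below, including parameter values, is illustrative):

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, seed=1,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal particle swarm optimizer for minimizing a continuous f.
    Each particle is pulled toward its personal best and the global best."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]              # personal best positions
    pcost = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = P[g][:], pcost[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            c = f(X[i])
            if c < pcost[i]:
                P[i], pcost[i] = X[i][:], c
                if c < gcost:
                    gbest, gcost = X[i][:], c
    return gbest, gcost

sphere = lambda x: sum(v * v for v in x)  # toy objective
```

The multi-objective variants used in the paper replace the single global best with an archive of non-dominated solutions, but the velocity update above is the core of the method.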
A Hybrid Monkey Search Algorithm for Clustering Analysis
Chen, Xin; Zhou, Yongquan; Luo, Qifang
2014-01-01
Clustering is a popular data analysis and data mining technique. The k-means clustering algorithm is one of the most commonly used methods. However, it depends strongly on the initial solution and easily falls into a local optimum. In view of these disadvantages of the k-means method, this paper proposes a hybrid monkey algorithm based on the search operator of the artificial bee colony algorithm for clustering analysis; experiments on synthetic and real-life datasets show that it performs better than the basic monkey algorithm for clustering analysis. PMID:24772039
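For reference, the k-means (Lloyd's) baseline whose local-optimum weakness motivates the hybrid above can be sketched as follows (illustrative, not the authors' code):

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain k-means: assign each point to its nearest centroid, then
    move each centroid to the mean of its cluster, until stable."""
    rng = random.Random(seed)
    centroids = [list(p) for p in rng.sample(points, k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        # empty clusters keep their old centroid
        new = [[sum(dim) / len(cl) for dim in zip(*cl)] if cl else centroids[j]
               for j, cl in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    return centroids, clusters
```

Metaheuristics such as the monkey algorithm attack exactly the step this sketch leaves to chance: the choice of initial centroids.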
Lezama, José; Randall, Gregory; Morel, Jean-Michel; Grompone von Gioi, Rafael
2016-09-01
We propose a novel approach to the grouping of dot patterns by the good continuation law. Our model is based on local symmetries and the non-accidentalness principle to determine perceptually relevant configurations. A quantitative measure of non-accidentalness is proposed, showing a good correlation with the visibility of a curve of dots. A robust, unsupervised and scale-invariant algorithm for the detection of good continuation of dots is derived. The results of the proposed method are illustrated on various datasets, including data from classic psychophysical studies. An online demonstration of the algorithm allows the reader to directly evaluate the method. PMID:26408332
A Simple Calculator Algorithm.
ERIC Educational Resources Information Center
Cook, Lyle; McWilliam, James
1983-01-01
The problem of finding cube roots when limited to a calculator with only square root capability is discussed. An algorithm is demonstrated and explained which should always produce a good approximation within a few iterations. (MP)
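The abstract does not spell out the algorithm, but one classic square-root-only scheme iterates y -> sqrt(sqrt(x*y)), whose fixed point satisfies y**3 = x; the error shrinks by roughly a factor of four per press of the square-root key. A sketch (possibly not the exact method of the article):

```python
import math

def cube_root(x, iterations=30):
    """Approximate x**(1/3) for x > 0 using only multiplication and
    square roots: the map y -> sqrt(sqrt(x * y)) has fixed point
    y**4 = x*y, i.e. y**3 = x, and contracts with rate 1/4."""
    if x <= 0:
        raise ValueError("this sketch handles x > 0 only")
    y = x
    for _ in range(iterations):
        y = math.sqrt(math.sqrt(x * y))
    return y
```

On a calculator this is just: multiply the display by x, press the square-root key twice, and repeat until the display stops changing.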
Good Discipline, Good Kids. [Videotape with Guide].
ERIC Educational Resources Information Center
Squier, William; Simmons, Susan; Yannes, Michelle; Levine, Beth
Noting that good parental discipline provides positive, constructive ways to encourage cooperation and good behavior and gives children the skills to regulate themselves, this 42-minute videotape with facilitator's guide comprise a program intended to help parents get past daily power struggles by using effective disciplinary techniques and…
Nios II hardware acceleration of the epsilon quadratic sieve algorithm
NASA Astrophysics Data System (ADS)
Meyer-Bäse, Uwe; Botella, Guillermo; Castillo, Encarnacion; García, Antonio
2010-04-01
The quadratic sieve (QS) algorithm is one of the most powerful algorithms for factoring the large composite numbers used in RSA cryptographic systems. The hardware structure of the QS algorithm seems to be a good fit for FPGA acceleration. Our new ɛ-QS algorithm further simplifies the hardware architecture, making it an even better candidate for C2H acceleration. This paper shows our design results in FPGA resources and performance when implementing very long arithmetic on the Nios microprocessor platform with C2H acceleration for different libraries (GMP, LIP, FLINT, NRMP) and QS architecture choices for factoring 32-2048 bit RSA numbers.
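The QS rests on the congruence-of-squares idea: find x and y with x*x - y*y == n, so n factors as (x - y)*(x + y). Fermat's method below shows that core idea in its simplest form; the sieve itself, which assembles such squares far faster from many smooth relations, is much more involved and is not sketched here:

```python
import math

def fermat_factor(n):
    """Factor an odd composite n by searching for x with x*x - n a
    perfect square y*y, giving n == (x - y) * (x + y).
    Worst case is exponential; the quadratic sieve improves on this
    by combining many small 'smooth' relations instead."""
    x = math.isqrt(n)
    if x * x < n:
        x += 1
    while True:
        y2 = x * x - n
        y = math.isqrt(y2)
        if y * y == y2:
            return x - y, x + y
        x += 1
```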
Identification of Traceability Barcode Based on Phase Correlation Algorithm
NASA Astrophysics Data System (ADS)
Lang, Liying; Zhang, Xiaofang
In this paper, the phase correlation algorithm based on the Fourier transform, a widely used method of image registration, is applied to traceability barcode identification. A rotation-invariant phase correlation algorithm, which combines the polar coordinate transform with phase correlation, can recognize barcodes that are partly damaged or rotated. The paper provides analysis and simulation of the algorithm using Matlab; the results show that the algorithm offers good real-time behavior and high performance. It also improves the matching precision and reduces the computation by optimizing the rotation-invariant phase correlation.
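A one-dimensional sketch of phase correlation (the real method is 2D and FFT-based, and the rotation-invariant variant adds a polar transform; the naive O(n^2) DFT here is only for illustration):

```python
import cmath

def dft(x, inverse=False):
    """Naive discrete Fourier transform (O(n^2)), enough for a demo."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[t] * cmath.exp(sign * 2j * cmath.pi * u * t / n)
               for t in range(n))
           for u in range(n)]
    return [v / n for v in out] if inverse else out

def phase_correlation_shift(a, b):
    """Estimate the circular shift between signals a and b: the
    normalized cross-power spectrum inverse-transforms to a delta
    peaked at the displacement."""
    A, B = dft(a), dft(b)
    cross = []
    for u, v in zip(A, B):
        p = u.conjugate() * v
        # normalize to keep only the phase (skip near-zero bins)
        cross.append(p / abs(p) if abs(p) > 1e-12 else 0j)
    corr = dft(cross, inverse=True)
    return max(range(len(corr)), key=lambda i: corr[i].real)
```

Because only the phase is kept, the peak stays sharp even when the two images differ in brightness, which is what makes the method attractive for partly damaged barcodes.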
A scalable parallel algorithm for multiple objective linear programs
NASA Technical Reports Server (NTRS)
Wiecek, Malgorzata M.; Zhang, Hong
1994-01-01
This paper presents an ADBASE-based parallel algorithm for solving multiple objective linear programs (MOLP's). Job balance, speedup and scalability are of primary interest in evaluating efficiency of the new algorithm. Implementation results on Intel iPSC/2 and Paragon multiprocessors show that the algorithm significantly speeds up the process of solving MOLP's, which is understood as generating all or some efficient extreme points and unbounded efficient edges. The algorithm gives especially good results for large and very large problems. Motivation and justification for solving such large MOLP's are also included.
An ant colony algorithm on continuous searching space
NASA Astrophysics Data System (ADS)
Xie, Jing; Cai, Chao
2015-12-01
The ant colony algorithm is heuristic, bionic, and parallel. Because of its positive feedback, parallelism, and simplicity in cooperating with other methods, it is widely adopted for planning on discrete spaces, but it is still not good at planning on continuous spaces. After a basic introduction to the ant colony algorithm, we propose an ant colony algorithm on continuous space. Our method relies on the following three tricks. We search for the next nodes of the route with a fixed step to guarantee the continuity of the solution. When storing pheromone, it discretizes the pheromone field, clusters states, and sums up the pheromone values of these states. When updating pheromone, it lets good solutions, as measured by relative score functions, deposit more pheromone, so that the ant colony algorithm can find a sub-optimal solution in a shorter time. The simulated experiment shows that our ant colony algorithm can find a sub-optimal solution in relatively short time.
Atmospheric Science Data Center
2013-04-16
article title: Aerosol retrieval over Cape of Good Hope View larger JPEG image ... Imaging SpectroRadiometer (MISR) images of the Cape of Good Hope were acquired on August 23, 2000. This first of two image sets, ...
Good Concrete Activity Is Good Mental Activity
ERIC Educational Resources Information Center
McDonough, Andrea
2016-01-01
Early years mathematics classrooms can be colourful, exciting, and challenging places of learning. Andrea McDonough and fellow teachers have noticed that some students make good decisions about using materials to assist their problem solving, but this is not always the case. These experiences lead her to ask the following questions: (1) Are…
Good Teaching and Supervision.
ERIC Educational Resources Information Center
Zahorik, John A.
1992-01-01
Without a definition of good teaching, the supervisor's efforts to help teachers improve will probably be fragmented, and teacher improvement may not occur. This article examines three definitions of good teaching, presents and defends a certain definition, and suggests supervisory applications. Good teachers are proficient in the kinds of…
Education Is Not a Public Good.
ERIC Educational Resources Information Center
Pisciotta, John
The purpose of this essay is to show that education is not a public good, and that in contrast to a public good such as national defense, education can be provided through competitive suppliers in the private sector as well as through government enterprise. A public good differs from a private good in the nature of consumption. A public good…
ERIC Educational Resources Information Center
Cai, Li; Lee, Taehun
2009-01-01
We apply the Supplemented EM algorithm (Meng & Rubin, 1991) to address a chronic problem with the "two-stage" fitting of covariance structure models in the presence of ignorable missing data: the lack of an asymptotically chi-square distributed goodness-of-fit statistic. We show that the Supplemented EM algorithm provides a convenient…
NASA Technical Reports Server (NTRS)
Wang, Lui; Bayer, Steven E.
1991-01-01
Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts and applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
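A minimal generational genetic algorithm on bit strings, sketched in Python with tournament selection, one-point crossover, and per-bit mutation (illustrative; the project's software tool is not shown here):

```python
import random

def genetic_algorithm(fitness, length=20, pop_size=30, generations=60,
                      crossover_rate=0.9, mutation_rate=0.02, seed=0):
    """Minimal generational GA on bit strings; returns the best
    individual seen across all generations."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    best = max(pop, key=fitness)

    def tournament():
        a, b = rng.choice(pop), rng.choice(pop)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < crossover_rate:
                cut = rng.randrange(1, length)        # one-point crossover
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            child = [1 - g if rng.random() < mutation_rate else g
                     for g in child]                  # per-bit mutation
            nxt.append(child)
        pop = nxt
        gen_best = max(pop, key=fitness)
        if fitness(gen_best) > fitness(best):
            best = gen_best
    return best

# OneMax: fitness is simply the number of 1 bits
onemax = lambda bits: sum(bits)
```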
Image segmentation using an improved differential algorithm
NASA Astrophysics Data System (ADS)
Gao, Hao; Shi, Yujiao; Wu, Dongmei
2014-10-01
Among all the existing segmentation techniques, thresholding is one of the most popular due to its simplicity, robustness, and accuracy (e.g. the maximum entropy method, Otsu's method, and K-means clustering). However, the computation time of these algorithms grows exponentially with the number of thresholds due to their exhaustive searching strategy. As a population-based optimization algorithm, the differential evolution (DE) algorithm uses a population of potential solutions and decision-making processes. It has shown considerable success in solving complex optimization problems within a reasonable time limit, so applying it to segmentation is a good choice given its fast computation. In this paper, we first propose a new differential evolution algorithm with a balance strategy, which seeks a balance between the exploration of new regions and the exploitation of already sampled regions. Then, we apply the new DE to the traditional Otsu method to shorten the computation time. Experimental results of the new algorithm on a variety of images show that, compared with EA-based thresholding methods, the proposed DE algorithm yields more effective and efficient results. It also shortens the computation time of the traditional Otsu method.
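For context, the exhaustive single-threshold Otsu computation, whose cost explodes in the multilevel case and so motivates the DE search, can be sketched as (illustrative):

```python
def otsu_threshold(gray):
    """Exhaustive Otsu's method for a single threshold: pick t that
    maximizes the between-class variance of a list of 0-255 levels.
    Pixels <= t fall in class 0, the rest in class 1."""
    hist = [0] * 256
    for g in gray:
        hist[g] += 1
    total = len(gray)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0      # pixel count of class 0
    sum0 = 0.0  # intensity sum of class 0
    for t in range(256):
        w0 += hist[t]
        sum0 += t * hist[t]
        w1 = total - w0
        if w0 == 0:
            continue
        if w1 == 0:
            break
        m0 = sum0 / w0
        m1 = (total_sum - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

With k thresholds the exhaustive search scans roughly 256**k combinations, which is exactly the exponential blow-up that a population-based search like DE avoids.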
ERIC Educational Resources Information Center
Tingey, Carol
1987-01-01
Suggestions are presented from parents on how to help children with disabilities (with particular focus on Down syndrome) learn good grooming habits in such areas as good health, exercise, cleanliness, teeth and hair care, skin care, glasses and other devices, and social behavior. (CB)
ERIC Educational Resources Information Center
Placer Hills Union Elementary School District, Meadow Vista, CA.
THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: The "Good Citizen" Program was developed for many reasons: to keep the campus clean, to reward students for improvement, to reward students for good deeds, to improve the total school climate, to reward students for excellence, and to offer staff members a method of reward for positive…
Productivity and Capital Goods.
ERIC Educational Resources Information Center
Zicht, Barbara, Ed.; And Others
1981-01-01
Providing teacher background on the concepts of productivity and capital goods, this document presents 3 teaching units about these ideas for different grade levels. The grade K-2 unit, "How Do They Do It?," is designed to provide students with an understanding of how physical capital goods add to productivity. Activities include a field trip to…
ERIC Educational Resources Information Center
Dawkins, John
The punctuation system presented in this paper has explanatory power insofar as it explains how good writers punctuate. The paper notes that good writers have learned, through reading, the differences among a hierarchy of marks and acquired a sense of independent clauses that allows them to use the hierarchy, along with a reader-sensitive notion…
ERIC Educational Resources Information Center
Csikszentmihalyi, Mihaly
2003-01-01
Examines the working lives of geneticists and journalists to place into perspective what lies behind personal ethics and success. Defines "good work" as productive activity that is valued socially and loved by people engaged in it. Asserts that certain cultural values, social controls, and personal standards are necessary to maintain good work and…
Adaptive color image watermarking algorithm
NASA Astrophysics Data System (ADS)
Feng, Gui; Lin, Qiwei
2008-03-01
As a major method of intellectual property protection, digital watermarking techniques have been widely studied and used. However, due to problems of data volume and color shift, watermarking techniques for color images have been less widely studied, even though color images are the principal medium in multimedia applications. Considering the characteristics of the Human Visual System (HVS), an adaptive color image watermarking algorithm is proposed in this paper. In this algorithm, the HSI color model is adopted for both the host and watermark images, the DCT coefficients of the intensity component (I) of the host color image are used for watermark data embedding, and the number of embedded bits is adaptively changed with the complexity of the host image. As for the watermark image, preprocessing is applied first, in which the watermark image is decomposed by a two-level wavelet transform. At the same time, to enhance the anti-attack ability and security of the watermarking algorithm, the watermark image is scrambled. According to their significance, some watermark bits are selected and some are deleted to form the actual embedding data. The experimental results show that the proposed watermarking algorithm is robust to several common attacks while maintaining good perceptual quality.
Flower pollination algorithm: A novel approach for multiobjective optimization
NASA Astrophysics Data System (ADS)
Yang, Xin-She; Karamanoglu, Mehmet; He, Xingshi
2014-09-01
Multiobjective design optimization problems require multiobjective optimization techniques to solve, and it is often very challenging to obtain high-quality Pareto fronts accurately. In this article, the recently developed flower pollination algorithm (FPA) is extended to solve multiobjective optimization problems. The proposed method is used to solve a set of multiobjective test functions and two bi-objective design benchmarks, and a comparison of the proposed algorithm with other algorithms has been made, which shows that the FPA is efficient with a good convergence rate. Finally, the importance for further parametric studies and theoretical analysis is highlighted and discussed.
A Novel Image Encryption Algorithm Based on DNA Subsequence Operation
Zhang, Qiang; Xue, Xianglian; Wei, Xiaopeng
2012-01-01
We present a novel image encryption algorithm based on DNA subsequence operations. Different from traditional DNA encryption methods, our algorithm does not use complex biological operations; it just uses the idea of DNA subsequence operations (such as elongation, truncation, and deletion) combined with the logistic chaotic map to scramble the locations and values of the pixel points in the image. The experimental results and security analysis show that the proposed algorithm is easy to implement, achieves a good encryption effect, has a large secret key space and strong sensitivity to the secret key, and is able to resist exhaustive and statistical attacks. PMID:23093912
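The DNA subsequence operations are specific to the paper, but the logistic-map half can be sketched: the sort order of a chaotic sequence defines a key-dependent pixel permutation, where the key is the map's initial condition (illustrative; the DNA encoding step is omitted):

```python
def logistic_sequence(n, x0=0.3712, r=3.99):
    """Iterate the logistic map x -> r*x*(1-x) to get n chaotic values."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

def scramble(pixels, key=0.3712):
    """Permute pixel positions by the sort order of a logistic-map
    sequence seeded with the key; returns the scrambled pixels and
    the permutation needed to invert it."""
    seq = logistic_sequence(len(pixels), x0=key)
    order = sorted(range(len(pixels)), key=seq.__getitem__)
    return [pixels[i] for i in order], order

def unscramble(scrambled, order):
    """Invert the permutation produced by scramble()."""
    out = [0] * len(scrambled)
    for j, i in enumerate(order):
        out[i] = scrambled[j]
    return out
```

The key sensitivity claimed in the abstract comes from the map's chaos: a tiny change in the initial condition yields a completely different permutation after a few iterations.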
Changes to the COS Extraction Algorithm for Lifetime Position 3
NASA Astrophysics Data System (ADS)
Proffitt, Charles R.; Bostroem, K. Azalee; Ely, Justin; Foster, Deatrick; Hernandez, Svea; Hodge, Philip; Jedrzejewski, Robert I.; Lockwood, Sean A.; Massa, Derck; Peeples, Molly S.; Oliveira, Cristina M.; Penton, Steven V.; Plesha, Rachel; Roman-Duval, Julia; Sana, Hugues; Sahnow, David J.; Sonnentrucker, Paule; Taylor, Joanna M.
2015-09-01
The COS FUV Detector Lifetime Position 3 (LP3) has been placed only 2.5" below the original lifetime position (LP1). This is sufficiently close to gain-sagged regions at LP1 that a revised extraction algorithm is needed to ensure good spectral quality. We provide an overview of this new "TWOZONE" extraction algorithm, discuss its strengths and limitations, describe new output columns in the X1D files that show the boundaries of the new extraction regions, and provide some advice on how to manually tune the algorithm for specialized applications.
Television Quiz Show Simulation
ERIC Educational Resources Information Center
Hill, Jonnie Lynn
2007-01-01
This article explores the simulation of four television quiz shows for students in China studying English as a foreign language (EFL). It discusses the adaptation and implementation of television quiz shows and how the students reacted to them.
Scalable Nearest Neighbor Algorithms for High Dimensional Data.
Muja, Marius; Lowe, David G
2014-11-01
For many computer vision and machine learning problems, large training sets are key for good performance. However, the most computationally expensive part of many computer vision and machine learning algorithms consists of finding nearest neighbor matches to high dimensional vectors that represent the training data. We propose new algorithms for approximate nearest neighbor matching and evaluate and compare them with previous algorithms. For matching high dimensional features, we find two algorithms to be the most efficient: the randomized k-d forest and a new algorithm proposed in this paper, the priority search k-means tree. We also propose a new algorithm for matching binary features by searching multiple hierarchical clustering trees and show it outperforms methods typically used in the literature. We show that the optimal nearest neighbor algorithm and its parameters depend on the data set characteristics and describe an automated configuration procedure for finding the best algorithm to search a particular data set. In order to scale to very large data sets that would otherwise not fit in the memory of a single machine, we propose a distributed nearest neighbor matching framework that can be used with any of the algorithms described in the paper. All this research has been released as an open source library called fast library for approximate nearest neighbors (FLANN), which has been incorporated into OpenCV and is now one of the most popular libraries for nearest neighbor matching. PMID:26353063
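The exact linear-scan baseline that approximate methods such as the randomized k-d forest aim to beat at scale is worth stating, since its O(n*d) per-query cost is the whole motivation (illustrative sketch, not FLANN code):

```python
import heapq

def knn(query, data, k=1):
    """Exact k-nearest-neighbor search by brute-force linear scan.
    Returns the indices of the k closest points in `data`."""
    def dist2(p, q):
        # squared Euclidean distance; the O(d) inner loop dominates
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return heapq.nsmallest(k, range(len(data)),
                           key=lambda i: dist2(query, data[i]))
```

Tree-based approximate methods trade a little recall for visiting only a small fraction of the points this scan must touch, which is where the speedups reported for FLANN come from.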
NASA Astrophysics Data System (ADS)
Brown, Alan S.
1992-10-01
The paper examines the prevention and prediction of failures in avionics systems caused by persistent corrosion and vibration. Preventive maintenance of redundant avionics elements and subsystems is discussed in terms of corrosion initiated by water intrusion. Measures developed to mitigate electromagnetic interference can lead to corrosion such as the introduction of Al flakes into rubber gaskets. Hermetic seals are shown to be good for corrosion prevention under certain conditions, and the limitations of glues and rubbery organics are listed. Solder joints in avionics are shown to be vulnerable to accumulated vibration and shock, and techniques for force and temperature isolation can be used to extend the life of avionics. Simulations are described of flight vibration and shock demonstrating that resonance is a more serious problem than direct coupling, and vibration can also hasten the onset of overloads.
ERIC Educational Resources Information Center
Anderton, Alice
The Intertribal Wordpath Society is a nonprofit educational corporation formed to promote the teaching, status, awareness, and use of Oklahoma Indian languages. The Society produces "Wordpath," a weekly 30-minute public access television show about Oklahoma Indian languages and the people who are teaching and preserving them. The show aims to…
Fast Optimal Load Balancing Algorithms for 1D Partitioning
Pinar, Ali; Aykanat, Cevdet
2002-12-09
One-dimensional decomposition of nonuniform workload arrays for optimal load balancing is investigated. The problem has been studied in the literature as the ''chains-on-chains partitioning'' problem. Despite extensive research efforts, heuristics are still used in the parallel computing community in the ''hope'' of good decompositions, sustained by the ''myth'' that exact algorithms are hard to implement and not runtime efficient. The main objective of this paper is to show that using exact algorithms instead of heuristics yields significant load balance improvements with a negligible increase in preprocessing time. We provide detailed pseudocode for our algorithms so that our results can be easily reproduced. We start with a review of the literature on the chains-on-chains partitioning problem. We propose improvements to these algorithms as well as efficient implementation tips. We also introduce novel algorithms that are asymptotically and runtime efficient. We experimented with data sets from two different applications: sparse matrix computations and direct volume rendering. Experiments showed that the proposed algorithms are 100 times faster than a single sparse matrix-vector multiplication for 64-way decompositions on average. Experiments also verify that load balance can be significantly improved by using exact algorithms instead of heuristics. These two findings show that the exact algorithms with efficient implementations discussed in this paper can effectively replace heuristics.
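The flavor of an exact method is easy to convey. The sketch below is the textbook bisection-plus-probe scheme for chains-on-chains (minimize the maximum contiguous-chunk sum over P processors); it is a baseline illustration, not one of the paper's proposed algorithms, and the workload values are made up.

```python
def probe(loads, parts, bound):
    """Can `loads` be split into at most `parts` contiguous chunks, each <= bound?"""
    used, current = 1, 0
    for w in loads:
        if w > bound:
            return False
        if current + w > bound:      # close the current chunk, open a new one
            used += 1
            current = 0
        current += w
    return used <= parts

def optimal_bottleneck(loads, parts):
    """Binary-search the smallest feasible bottleneck value."""
    lo, hi = max(loads), sum(loads)
    while lo < hi:
        mid = (lo + hi) // 2
        if probe(loads, parts, mid):
            hi = mid
        else:
            lo = mid + 1
    return lo

loads = [3, 9, 7, 8, 2, 4, 6]
print(optimal_bottleneck(loads, 3))  # prints 15 for this workload
```

The probe is a single greedy scan, so the whole search costs O(n log(sum of loads)), which is the kind of cheap exactness the abstract argues should displace heuristics.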
Anomaly, Jonathan
2014-01-01
Procreation is the ultimate public goods problem. Each new child affects the welfare of many other people, and some (but not all) children produce uncompensated value that future people will enjoy. This essay addresses challenges that arise if we think of procreation and parenting as public goods. These include whether individual choices are likely to lead to a socially desirable outcome, and whether changes in laws, social norms, or access to genetic engineering and embryo selection might improve the aggregate outcome of our reproductive choices. PMID:25743046
ERIC Educational Resources Information Center
Kirkpatrick, Larry D.; Rugheimer, Mac
1979-01-01
Describes the viewing sessions and the holograms of a holographic road show. The traveling exhibits, believed to stimulate interest in physics, include a wide variety of holograms and demonstrate several physical principles. (GA)
Competing Sudakov veto algorithms
NASA Astrophysics Data System (ADS)
Kleiss, Ronald; Verheyen, Rob
2016-07-01
We present a formalism to analyze the distribution produced by a Monte Carlo algorithm. We perform these analyses on several versions of the Sudakov veto algorithm, adding a cutoff, a second variable and competition between emission channels. The formal analysis allows us to prove that multiple, seemingly different competition algorithms, including those that are currently implemented in most parton showers, lead to the same result. Finally, we test their performance in a semi-realistic setting and show that there are significantly faster alternatives to the commonly used algorithms.
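A single-channel version of the veto algorithm the paper analyzes can be sketched in a few lines: evolve downward in the scale t, propose emissions from an invertible overestimate g(t) >= f(t), and accept with probability f(t)/g(t). The concrete f, g, and bounds below are illustrative choices, not the paper's.

```python
import math
import random

def veto_sample(accept_prob, step_down, t_start, t_cut, rng):
    """Evolve downward from t_start; return the first accepted emission scale,
    or None if evolution drops below t_cut with no emission."""
    t = t_start
    while True:
        # Trial scale drawn from the overestimate's (invertible) Sudakov factor.
        t = step_down(t, 1.0 - rng.random())       # r in (0, 1]
        if t < t_cut:
            return None
        if rng.random() < accept_prob(t):          # accept with f(t)/g(t)
            return t

# Example: true rate f(t) = t on [0, 1], constant overestimate g(t) = 1,
# so the trial step solves exp(-(t_old - t_new)) = r, i.e. t_new = t_old + ln r.
accept = lambda t: t                               # f(t)/g(t) with g = 1
step = lambda t, r: t + math.log(r)
rng = random.Random(42)
samples = [veto_sample(accept, step, 1.0, 0.0, rng) for _ in range(20000)]
no_emission = sum(s is None for s in samples) / len(samples)
print(no_emission)  # analytic no-emission probability: exp(-1/2) ~ 0.607
```

The vetoed trials are what make the generated distribution follow the true Sudakov factor of f rather than that of the overestimate.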
The Superior Lambert Algorithm
NASA Astrophysics Data System (ADS)
der, G.
2011-09-01
Lambert algorithms are used extensively for initial orbit determination, mission planning, space debris correlation, and missile targeting, just to name a few applications. Due to the significance of the Lambert problem in Astrodynamics, Gauss, Battin, Godal, Lancaster, Gooding, Sun and many others (References 1 to 15) have provided numerous formulations leading to various analytic solutions and iterative methods. Most Lambert algorithms and their computer programs can only work within one revolution, break down or converge slowly when the transfer angle is near zero or 180 degrees, and their multi-revolution limitations are either ignored or barely addressed. Despite claims of robustness, many Lambert algorithms fail without notice, and the users seldom have a clue why. The DerAstrodynamics lambert2 algorithm, which is based on the analytic solution formulated by Sun, works for any number of revolutions and converges rapidly at any transfer angle. It provides significant capability enhancements over every other Lambert algorithm in use today. These include improved speed, accuracy, robustness, and multirevolution capabilities as well as implementation simplicity. Additionally, the lambert2 algorithm provides a powerful tool for solving the angles-only problem without artificial singularities (pointed out by Gooding in Reference 16), which involves 3 lines of sight captured by optical sensors, or systems such as the Air Force Space Surveillance System (AFSSS). The analytic solution is derived from the extended Godal’s time equation by Sun, while the iterative method of solution is that of Laguerre, modified for robustness. The Keplerian solution of a Lambert algorithm can be extended to include the non-Keplerian terms of the Vinti algorithm via a simple targeting technique (References 17 to 19). Accurate analytic non-Keplerian trajectories can be predicted for satellites and ballistic missiles, while performing at least 100 times faster in speed than most
Designing Good Educational Software.
ERIC Educational Resources Information Center
Kingman, James C.
1984-01-01
Describes eight characteristics of good educational software. They are: (1) educational soundness; (2) ease of use; (3) "bullet" proofing (preventing a program from coming to a premature halt); (4) clear instructions; (5) appropriate language; (6) appropriate frame size; (7) motivation; and (8) evaluation. (JN)
ERIC Educational Resources Information Center
Drozdowski, Mark J.
2007-01-01
In this article, the author draws on his experience as the director of the Fitchburg State College Foundation in Fitchburg, Massachusetts, to make a distinction between being a good neighbor to local non-profit organizations by sharing strategies and information, and creating conflicts of interest when both the college and its neighbor…
ERIC Educational Resources Information Center
Webber, Nancy
2004-01-01
Many art teachers use the Web as an information source. Overall, they look for good content that is clearly written, concise, accurate, and pertinent. A well-designed site gives users what they want quickly, efficiently, and logically, and does not ask them to assemble a puzzle to complete their search. How can websites with these qualities be…
Reconsidering the "Good Divorce"
ERIC Educational Resources Information Center
Amato, Paul R.; Kane, Jennifer B.; James, Spencer
2011-01-01
This study attempted to assess the notion that a "good divorce" protects children from the potential negative consequences of marital dissolution. A cluster analysis of data on postdivorce parenting from 944 families resulted in three groups: cooperative coparenting, parallel parenting, and single parenting. Children in the cooperative coparenting…
Restructuring for Good Governance
ERIC Educational Resources Information Center
Robert, Stephen; Carey, Russell C.
2006-01-01
American higher education has never been more in need of good governance than it is right now. Yet much of the structure many boards have inherited or created tends to stall or impede timely, well-informed, and broadly supported decision making. At many institutions (ours included), layers of governance have been added with each passing year,…
ERIC Educational Resources Information Center
Eccleston, Jeff
2007-01-01
Big things come in small packages. This saying came to the mind of the author after he created a simple math review activity for his fourth grade students. Though simple, it has proven to be extremely advantageous in reinforcing math concepts. He uses this activity, which he calls "Show What You Know," often. This activity provides the perfect…
ERIC Educational Resources Information Center
Mathieu, Aaron
2000-01-01
Uses a talk show activity for a final assessment tool for students to debate about the ozone hole. Students are assessed on five areas: (1) cooperative learning; (2) the written component; (3) content; (4) self-evaluation; and (5) peer evaluation. (SAH)
Honored Teacher Shows Commitment.
ERIC Educational Resources Information Center
Ratte, Kathy
1987-01-01
Part of the acceptance speech of the 1985 National Council for the Social Studies Teacher of the Year, this article describes the censorship experience of this honored social studies teacher. The incident involved the showing of a videotape version of the feature film entitled "The Seduction of Joe Tynan." (JDH)
ERIC Educational Resources Information Center
Moore, Mitzi Ruth
1992-01-01
Proposes having students perform skits in which they play the roles of the science concepts they are trying to understand. Provides the dialog for a skit in which hot and cold gas molecules are interviewed on a talk show to study how these properties affect wind, rain, and other weather phenomena. (MDH)
ERIC Educational Resources Information Center
Frasier, Debra
2008-01-01
In the author's book titled "The Incredible Water Show," the characters from "Miss Alaineus: A Vocabulary Disaster" used an ocean of information to stage an inventive performance about the water cycle. In this article, the author relates how she turned the story into hands-on science teaching for real-life fifth-grade students. The author also…
ERIC Educational Resources Information Center
Cech, Scott J.
2008-01-01
Having students show their skills in three dimensions, known as performance-based assessment, dates back at least to Socrates. Individual schools such as Barrington High School--located just outside of Providence--have been requiring students to actively demonstrate their knowledge for years. The Rhode Island's high school graduating class became…
'Good palliative care' orders.
Maddocks, I
1993-01-01
A Select Committee of the Parliament of South Australia, considering revisions to legislation governing care of the dying, did not support allowing doctors to assist suicide. They recommended that no liability attach to the provision of reasonable palliative care which happens to shorten life. The Committee affirmed the suggestion that positive open orders to provide 'good palliative care' should replace 'do not resuscitate' orders. PMID:7506978
Barnett, K; Pittman, M
2001-01-01
Leaders cannot make the "business case" for community benefit in the traditional sense of near-term financial returns on investment. The concept of returns must be expanded to encompass more long-term--yet concrete and measurable--benefits that may be accrued both by nonprofit hospitals and local communities. Hospitals can "do well" economically through a more strategic approach to "doing good." PMID:11372275
An Artificial Immune Univariate Marginal Distribution Algorithm
NASA Astrophysics Data System (ADS)
Zhang, Qingbin; Kang, Shuo; Gao, Junxiang; Wu, Song; Tian, Yanping
Hybridization is an extremely effective way of improving the performance of the Univariate Marginal Distribution Algorithm (UMDA). Owing to its diversity and memory mechanisms, the artificial immune algorithm has been widely used to construct hybrid algorithms with other optimization algorithms. This paper proposes a hybrid algorithm which combines the UMDA with the principles of the general artificial immune algorithm. Experimental results on a deceptive function of order 3 show that the proposed hybrid algorithm can obtain more building blocks (BBs) than the UMDA alone.
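For readers unfamiliar with the base algorithm, here is a bare-bones UMDA on the OneMax problem (maximize the number of 1 bits); the immune-inspired components of the proposed hybrid are omitted, and all parameter values are illustrative.

```python
import random

def umda_onemax(n_bits=20, pop_size=50, select=25, generations=60, seed=1):
    rng = random.Random(seed)
    p = [0.5] * n_bits                 # univariate marginal model, one p per bit
    best = 0
    for _ in range(generations):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
        pop.sort(key=sum, reverse=True)            # OneMax fitness = number of 1s
        best = max(best, sum(pop[0]))
        elite = pop[:select]
        # Re-estimate each marginal from the selected half, clamping so that
        # no bit value ever becomes unreachable.
        p = [min(0.95, max(0.05, sum(ind[i] for ind in elite) / select))
             for i in range(n_bits)]
    return best

print(umda_onemax())  # typically finds the optimum of 20
```

Each generation fits one independent Bernoulli marginal per bit from the selected individuals, which is exactly the "univariate marginal distribution" the name refers to; it is this independence assumption that deceptive functions exploit, motivating hybrids.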
A scalable and practical one-pass clustering algorithm for recommender system
NASA Astrophysics Data System (ADS)
Khalid, Asra; Ghazanfar, Mustansar Ali; Azam, Awais; Alahmari, Saad Ali
2015-12-01
KMeans clustering-based recommendation algorithms have been proposed to increase the scalability of recommender systems. One potential drawback of these algorithms is that they perform training offline and hence cannot accommodate incremental updates as new data arrives, making them unsuitable for dynamic environments. Following this line of research, a new clustering algorithm called One-Pass is proposed, which is simple, fast, and accurate. We show empirically that the proposed algorithm outperforms K-Means in terms of recommendation and training time while maintaining a good level of accuracy.
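The abstract does not spell One-Pass out, so the sketch below shows the classic "leader" style of one-pass clustering that such algorithms build on: a single scan, with each point either absorbed into the nearest centroid within a threshold (updating a running mean) or made the leader of a new cluster. The threshold and points are illustrative.

```python
def one_pass_cluster(points, threshold):
    """Single-scan clustering: assign to the nearest centroid within
    `threshold`, else start a new cluster. Returns (centroids, labels)."""
    centroids, counts, labels = [], [], []
    for x, y in points:
        best, best_d = None, threshold
        for i, (cx, cy) in enumerate(centroids):
            d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
            if d <= best_d:
                best, best_d = i, d
        if best is None:
            centroids.append((x, y))
            counts.append(1)
            labels.append(len(centroids) - 1)
        else:
            # Incrementally update the running mean of the matched cluster.
            n = counts[best] + 1
            cx, cy = centroids[best]
            centroids[best] = (cx + (x - cx) / n, cy + (y - cy) / n)
            counts[best] = n
            labels.append(best)
    return centroids, labels

pts = [(0, 0), (0.2, 0.1), (5, 5), (5.1, 4.9), (0.1, -0.1)]
centroids, labels = one_pass_cluster(pts, threshold=1.0)
print(labels)  # prints [0, 0, 1, 1, 0]
```

Because the centroid update is incremental, a newly arrived rating vector can be folded in without retraining, which is the property the abstract contrasts with offline K-Means.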
Boden, Timothy W
2016-01-01
Many medical practices have cut back on education and staff development expenses, especially those costs associated with conventions and conferences. But there are hard-to-value returns on your investment in these live events--beyond the obvious benefits of acquired knowledge and skills. Major vendors still exhibit their services and wares at many events, and the exhibit hall is a treasure-house of information and resources for the savvy physician or administrator. Make and stick to a purposeful plan to exploit the trade show. You can compare products, gain new insights and ideas, and even negotiate better deals with representatives anxious to realize returns on their exhibition investments. PMID:27249887
Wrong, Terence; Baumgart, Erica
2013-01-01
The authors of the preceding articles raise legitimate questions about patient and staff rights and the unintended consequences of allowing ABC News to film inside teaching hospitals. We explain why we regard their fears as baseless and not supported by what we heard from individuals portrayed in the filming, our decade-long experience making medical documentaries, and the full un-aired context of the scenes shown in the broadcast. The authors don't and can't know what conversations we had, what documents we reviewed, and what protections we put in place in each televised scene. Finally, we hope to correct several misleading examples cited by the authors as well as their offhand mischaracterization of our program as a "reality" show. PMID:23631336
Cooperation and the common good.
Johnstone, Rufus A; Rodrigues, António M M
2016-02-01
In this paper, we draw the attention of biologists to a result from the economic literature, which suggests that when individuals are engaged in a communal activity of benefit to all, selection may favour cooperative sharing of resources even among non-relatives. Provided that group members all invest some resources in the public good, they should refrain from conflict over the division of these resources. The reason is that, given diminishing returns on investment in public and private goods, claiming (or ceding) a greater share of total resources only leads to the actor (or its competitors) investing more in the public good, such that the marginal costs and benefits of investment remain in balance. This cancels out any individual benefits of resource competition. We illustrate how this idea may be applied in the context of biparental care, using a sequential game in which parents first compete with one another over resources, and then choose how to allocate the resources they each obtain to care of their joint young (public good) versus their own survival and future reproductive success (private good). We show that when the two parents both invest in care to some extent, they should refrain from any conflict over the division of resources. The same effect can also support asymmetric outcomes in which one parent competes for resources and invests in care, whereas the other does not invest but refrains from competition. The fact that the caring parent gains higher fitness pay-offs at these equilibria suggests that abandoning a partner is not always to the latter's detriment, when the potential for resource competition is taken into account, but may instead be of benefit to the 'abandoned' mate. PMID:26729926
Walusinski, Olivier
2014-01-01
In the second half of the 19th century, Jean-Martin Charcot (1825-1893) became famous for the quality of his teaching and his innovative neurological discoveries, bringing many French and foreign students to Paris. A hunger for recognition, together with progressive and anticlerical ideals, led Charcot to invite writers, journalists, and politicians to his lessons, during which he presented the results of his work on hysteria. These events became public performances, for which physicians and patients were transformed into actors. Major newspapers ran accounts of these consultations, more like theatrical shows in some respects. The resultant enthusiasm prompted other physicians in Paris and throughout France to try and imitate them. We will compare the form and substance of Charcot's lessons with those given by Jules-Bernard Luys (1828-1897), Victor Dumontpallier (1826-1899), Ambroise-Auguste Liébault (1823-1904), Hippolyte Bernheim (1840-1919), Joseph Grasset (1849-1918), and Albert Pitres (1848-1928). We will also note their impact on contemporary cinema and theatre. PMID:25273491
NASA Astrophysics Data System (ADS)
2007-01-01
its high spatial and spectral resolution, it was possible to zoom into the very heart of this very massive star. In this innermost region, the observations are dominated by the extremely dense stellar wind that totally obscures the underlying central star. The AMBER observations show that this dense stellar wind is not spherically symmetric, but exhibits a clearly elongated structure. Overall, the AMBER observations confirm that the extremely high mass loss of Eta Carinae's massive central star is non-spherical and much stronger along the poles than in the equatorial plane. This is in agreement with theoretical models that predict such an enhanced polar mass-loss in the case of rapidly rotating stars. [ESO PR Photo 06c/07: RS Ophiuchi in Outburst] Several papers from this special feature focus on the later stages in a star's life. One looks at the binary system Gamma 2 Velorum, which contains the closest example of a star known as a Wolf-Rayet. A single AMBER observation allowed the astronomers to separate the spectra of the two components, offering new insights in the modeling of Wolf-Rayet stars, but made it also possible to measure the separation between the two stars. This led to a new determination of the distance of the system, showing that previous estimates were incorrect. The observations also revealed information on the region where the winds from the two stars collide. The famous binary system RS Ophiuchi, an example of a recurrent nova, was observed just 5 days after it was discovered to be in outburst on 12 February 2006, an event that has been expected for 21 years. AMBER was able to detect the extension of the expanding nova emission. These observations show a complex geometry and kinematics, far from the simple interpretation of a spherical fireball in extension.
AMBER has detected a high velocity jet probably perpendicular to the orbital plane of the binary system, and allowed a precise and careful study of the wind and the shockwave
Two algorithms for compressing noise like signals
NASA Astrophysics Data System (ADS)
Agaian, Sos S.; Cherukuri, Ravindranath; Akopian, David
2005-05-01
Compression is a technique used to encode data so that it needs less storage/memory space. Compressing random data matters when we need to preserve data that has low redundancy and whose power spectrum is close to noise. The noise-like signals used in various data-hiding schemes have low redundancy and a low energy spectrum, so compressing them with lossy algorithms risks losing that low-energy spectrum. And since LSB-plane data has low redundancy, lossless compression algorithms such as run-length, Huffman, and arithmetic coding are ineffective at providing a good compression ratio. These problems motivated the development of a new class of compression algorithms for noise-like signals. In this paper, we introduce two new compression techniques that compress noise-like random data with reference to a known pseudo-noise sequence generated using a key. In addition, we develop a representation model for digital media based on pseudo-noise signals. In simulation, a comparison between our methods and existing techniques such as run-length coding shows that run-length coding cannot compress random data, whereas the proposed algorithms can. Furthermore, the proposed algorithms can be extended to all kinds of random data used in various applications.
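As I read the abstract, the key idea is that noise-like data is incompressible on its own, but its XOR difference from a key-regenerated pseudo-noise sequence is sparse and therefore compresses well. The sketch below illustrates that idea with run-length encoding; it is my reading of the principle, not the authors' algorithms.

```python
import random

def pn_sequence(key, n):
    """Regenerate a pseudo-noise bit sequence from a key."""
    rng = random.Random(key)
    return [rng.randrange(2) for _ in range(n)]

def run_length_encode(bits):
    """Encode a bit list as (value, run length) pairs."""
    runs, current, count = [], bits[0], 1
    for b in bits[1:]:
        if b == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = b, 1
    runs.append((current, count))
    return runs

key, n = 42, 4096
data = pn_sequence(key, n)
data[100] ^= 1                       # the data is a few bit flips away
data[2000] ^= 1                      # from the key's pseudo-noise sequence
diff = [d ^ p for d, p in zip(data, pn_sequence(key, n))]
print(len(run_length_encode(data)), len(run_length_encode(diff)))
# raw noise needs ~n/2 runs; the keyed difference needs only 5
```

Only the sparse difference (plus the key) needs to be stored, which is why the reference sequence makes otherwise incompressible data compressible.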
Efficient algorithms for survivable virtual network embedding
NASA Astrophysics Data System (ADS)
Sun, Gang; Yu, Hongfang; Li, Lemin; Anand, Vishal; di, Hao; Gao, Xiujiao
2010-12-01
Network virtualization technology is serving as an effective method for providing a flexible and highly adaptable shared substrate network to satisfy diverse demands. But the problem of efficiently embedding a Virtual Network (VN) onto the substrate network is intractable, since it is NP-hard, and efficiently guaranteeing the survivability of the embedding is another great challenge. In this paper, we investigate the Survivable Virtual Network Embedding (SVNE) problem and propose two efficient algorithms for solving it. First, we formulate a minimum-cost model of the survivable network virtualization problem as a Mixed Integer Linear Program (MILP). We then devise two efficient relaxation-based algorithms for solving the survivable virtual network embedding problem: (1) a Lagrangian relaxation based algorithm, called LR-SVNE in this paper; and (2) a decomposition based algorithm, called DSVNE in this paper. Simulation results show that both algorithms are time-efficient, and that LR-SVNE can additionally guarantee convergence to the optimal solution on small-scale substrate networks.
Multipartite entanglement in quantum algorithms
Bruss, D.; Macchiavello, C.
2011-05-15
We investigate the entanglement features of the quantum states employed in quantum algorithms. In particular, we analyze the multipartite entanglement properties in the Deutsch-Jozsa, Grover, and Simon algorithms. Our results show that for these algorithms most instances involve multipartite entanglement.
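The Deutsch-Jozsa interference pattern the paper builds on can be checked in a few lines. This toy statevector computation (phase-oracle form, without the ancilla register) evaluates the |0...0> amplitude after H^n, oracle, H^n: magnitude 1 for a constant f, 0 for a balanced one. It illustrates the circuit, not the paper's entanglement analysis.

```python
import math

def dj_zero_amplitude(f, n):
    """Amplitude of |0...0> after H^n, phase oracle (-1)^f(x), H^n."""
    N = 2 ** n
    # H^n |0> puts amplitude 1/sqrt(N) on every basis state;
    # the oracle flips the sign wherever f(x) = 1.
    amps = [(-1) ** f(x) / math.sqrt(N) for x in range(N)]
    # The |0...0> amplitude after the final H^n is the uniform projection.
    return sum(amps) / math.sqrt(N)

constant = lambda x: 0
balanced = lambda x: x & 1
print(dj_zero_amplitude(constant, 3))  # 1 up to rounding
print(dj_zero_amplitude(balanced, 3))  # 0
```

A single measurement of the input register therefore distinguishes the two cases, which is the quantum speedup whose entanglement resources the paper quantifies.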
What makes good image composition?
NASA Astrophysics Data System (ADS)
Banner, Ron
2011-03-01
Some people are born with an intuitive sense of good composition. They do not need to be taught composition, and their work is immediately perceived as well composed by other people. In an attempt to help others learn composition, art critics, scientists and psychologists have analyzed well-composed works in the hope of recognizing patterns and trends that anyone could employ to achieve similar results. Unfortunately, the identified patterns are by no means universal. Moreover, since a compositional rule is useful only as long as it enhances the idea that the artist is trying to express, there is no objective standard for judging whether a given composition is "good" or "bad". As a result, the study of composition seems to be full of contradictions. Nevertheless, there are several basic "low-level" rules, supported by physiological studies of visual perception, that artists and photographers intuitively obey. Regardless of image content, a prerequisite for all good images is that their composition be balanced. In a balanced composition, factors such as shape, direction, location and color are determined in a way that is pleasant to the eye. An unbalanced composition looks accidental and transitory, and its elements show a tendency to change place or shape in order to reach a state that better reflects the total structure. Under these conditions, the artistic statement becomes incomprehensible and confusing.
On algorithmic rate-coded AER generation.
Linares-Barranco, Alejandro; Jimenez-Moreno, Gabriel; Linares-Barranco, Bernabé; Civit-Balcells, Antón
2006-05-01
This paper addresses the problem of converting a conventional video stream based on sequences of frames into the spike event-based representation known as the address-event representation (AER). In this paper we concentrate on rate-coded AER. The problem is addressed as an algorithmic problem, in which different methods are proposed, implemented and tested through software algorithms. The proposed algorithms are comparatively evaluated according to different criteria. Emphasis is put on the potential of such algorithms for (a) performing the frame-based to event-based conversion in real time, and (b) producing event streams that resemble as much as possible those generated naturally by rate-coded address-event VLSI chips, such as silicon AER retinae. It is found that simple and straightforward algorithms tend to have high potential for real time but produce event distributions that differ considerably from those obtained in AER VLSI chips. On the other hand, sophisticated algorithms that yield better event distributions are not efficient for real-time operation. The method based on linear-feedback-shift-register (LFSR) pseudorandom number generation is a good compromise: it is feasible in real time and yields reasonably well-distributed events in time. Our software experiments, on a 1.6-GHz Pentium IV, show that at 50% AER bus load the proposed algorithms require between 0.011 and 1.14 ms per 8-bit pixel per frame. One of the proposed LFSR methods is implemented in real-time hardware using a prototyping board that includes a VirtexE 300 FPGA. The demonstration hardware is capable of transforming frames of 64 x 64 pixels of 8-bit depth at a frame rate of 25 frames per second, producing spike events at a peak rate of 10^7 events per second. PMID:16722179
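The LFSR compromise is simple to sketch in software. Below, a 16-bit maximal-length Fibonacci LFSR (the classic taps at 16, 14, 13, 11) supplies cheap pseudorandom bytes, and a pixel emits an address event whenever the draw falls below its intensity. Register width, taps, and the seed are illustrative, not the paper's hardware configuration.

```python
def lfsr16(state):
    """One step of a maximal-length 16-bit Fibonacci LFSR (taps 16, 14, 13, 11)."""
    bit = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
    return (state >> 1) | (bit << 15)

def rate_code_frame(pixels, slots=2000, seed=0xACE1):
    """Rate-code one frame: per-pixel event counts, counts/slots ~ intensity/256."""
    counts = [0] * len(pixels)
    state = seed
    for _ in range(slots):
        for addr, intensity in enumerate(pixels):
            state = lfsr16(state)
            if (state & 0xFF) < intensity:   # emit an event for this address
                counts[addr] += 1
    return counts

counts = rate_code_frame([0, 64, 128, 255])
print(counts)  # event counts roughly proportional to the intensities
```

The shift-and-XOR update is a handful of gates in hardware, which is why the LFSR scheme ports so directly to the FPGA implementation the abstract describes.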
A Parallel Prefix Algorithm for Almost Toeplitz Tridiagonal Systems
NASA Technical Reports Server (NTRS)
Sun, Xian-He; Joslin, Ronald D.
1995-01-01
A compact scheme is a discretization scheme that is advantageous in obtaining highly accurate solutions. However, the resulting systems from compact schemes are tridiagonal systems that are difficult to solve efficiently on parallel computers. Considering the almost symmetric Toeplitz structure, a parallel algorithm, simple parallel prefix (SPP), is proposed. The SPP algorithm requires less memory than the conventional LU decomposition and is efficient on parallel machines. It consists of a prefix communication pattern and AXPY operations. Both the computation and the communication can be truncated without degrading the accuracy when the system is diagonally dominant. A formal accuracy study has been conducted to provide a simple truncation formula. Experimental results have been measured on a MasPar MP-1 SIMD machine and on a Cray 2 vector machine. Experimental results show that the simple parallel prefix algorithm is a good algorithm for symmetric, almost symmetric Toeplitz tridiagonal systems and for the compact scheme on high-performance computers.
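For contrast with the parallel prefix approach, here is the conventional serial LU (Thomas) solve for the symmetric Toeplitz tridiagonal systems that compact schemes produce; it is the baseline being replaced, not the SPP algorithm, and the coefficients are illustrative.

```python
def thomas_toeplitz(a, b, d):
    """Solve the tridiagonal system with constant diagonals (sub = super = b,
    main = a) for right-hand side d, by forward elimination / back substitution."""
    n = len(d)
    c_prime, d_prime = [0.0] * n, [0.0] * n
    c_prime[0], d_prime[0] = b / a, d[0] / a
    for i in range(1, n):
        denom = a - b * c_prime[i - 1]     # nonzero when diagonally dominant
        c_prime[i] = b / denom
        d_prime[i] = (d[i] - b * d_prime[i - 1]) / denom
    x = [0.0] * n
    x[-1] = d_prime[-1]
    for i in range(n - 2, -1, -1):
        x[i] = d_prime[i] - c_prime[i] * x[i + 1]
    return x

# A diagonally dominant example with stencil [1, 4, 1].
x = thomas_toeplitz(4.0, 1.0, [6.0, 6.0, 6.0, 6.0])
```

Each elimination step depends on the previous one, which is exactly the serial recurrence that makes this solver awkward on parallel machines and motivates prefix-based reformulations like SPP.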
NASA Astrophysics Data System (ADS)
Gandomi, A. H.; Yang, X.-S.; Talatahari, S.; Alavi, A. H.
2013-01-01
A recently developed metaheuristic optimization algorithm, firefly algorithm (FA), mimics the social behavior of fireflies based on the flashing and attraction characteristics of fireflies. In the present study, we will introduce chaos into FA so as to increase its global search mobility for robust global optimization. Detailed studies are carried out on benchmark problems with different chaotic maps. Here, 12 different chaotic maps are utilized to tune the attractive movement of the fireflies in the algorithm. The results show that some chaotic FAs can clearly outperform the standard FA.
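A compact sketch makes the mechanism concrete: standard firefly attraction beta0 * exp(-gamma * r^2), with a logistic map driving the randomization term in place of a uniform draw, which is one simple way of "tuning the attractive movement" with chaos. All constants and the test function are illustrative, and this is not any specific variant from the paper.

```python
import math
import random

def chaotic_firefly(n=20, iters=120, seed=3):
    """Minimize a 2-D sphere function with a logistic-map-perturbed FA."""
    rng = random.Random(seed)
    f = lambda p: p[0] ** 2 + p[1] ** 2            # objective to minimize
    pos = [[rng.uniform(-5, 5), rng.uniform(-5, 5)] for _ in range(n)]
    chaos = 0.7                                    # logistic-map state
    alpha, beta0, gamma = 0.3, 1.0, 1.0
    for _ in range(iters):
        chaos = 3.99 * chaos * (1 - chaos)         # chaotic driver in (0, 1)
        alpha *= 0.98                              # gradually damp the jitter
        for i in range(n):
            for j in range(n):
                if f(pos[j]) < f(pos[i]):          # j is brighter: move i toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(pos[i], pos[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    pos[i] = [a + beta * (b - a) + alpha * (chaos - 0.5)
                              for a, b in zip(pos[i], pos[j])]
    return min(f(p) for p in pos)

best = chaotic_firefly()
print(best)  # best objective value found
```

The paper's variants differ in which FA parameter the chaotic map replaces; substituting the map for the random step, as here, is only the simplest of the twelve options it benchmarks.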
The Algorithm Selection Problem
NASA Technical Reports Server (NTRS)
Minton, Steve; Allen, John; Deiss, Ron (Technical Monitor)
1994-01-01
Work on NP-hard problems has shown that many instances of these theoretically computationally difficult problems are quite easy. The field has also shown that choosing the right algorithm for the problem can have a profound effect on the time needed to find a solution. However, to date there has been little work showing how to select the right algorithm for solving any particular problem. The paper refers to this as the algorithm selection problem. It describes some of the aspects that make this problem difficult, as well as proposes a technique for addressing it.
Good Clinical Practice Training
Arango, Jaime; Chuck, Tina; Ellenberg, Susan S.; Foltz, Bridget; Gorman, Colleen; Hinrichs, Heidi; McHale, Susan; Merchant, Kunal; Shapley, Stephanie; Wild, Gretchen
2016-01-01
Good Clinical Practice (GCP) is an international standard for the design, conduct, performance, monitoring, auditing, recording, analyses, and reporting of clinical trials. The goal of GCP is to ensure the protection of the rights, integrity, and confidentiality of clinical trial participants and to ensure the credibility and accuracy of data and reported results. In the United States, trial sponsors generally require investigators to complete GCP training prior to participating in each clinical trial to foster GCP and as a method to meet regulatory expectations (ie, sponsor’s responsibility to select qualified investigators per 21 CFR 312.50 and 312.53(a) for drugs and biologics and 21 CFR 812.40 and 812.43(a) for medical devices). This training requirement is often extended to investigative site staff, as deemed relevant by the sponsor, institution, or investigator. Those who participate in multiple clinical trials are often required by sponsors to complete repeated GCP training, which is unnecessarily burdensome. The Clinical Trials Transformation Initiative convened a multidisciplinary project team involving partners from academia, industry, other researchers and research staff, and government to develop recommendations for streamlining current GCP training practices. Recommendations drafted by the project team, including the minimum key training elements, frequency, format, and evidence of training completion, were presented to a broad group of experts to foster discussion of the current issues and to seek consensus on proposed solutions. PMID:27390628
Griffin, Bruce A
2016-08-01
Eggs have one of the lowest energy to nutrient density ratios of any food, and contain a quality of protein that is superior to beef steak and similar to dairy. From a nutritional perspective, this must qualify eggs as 'good'. The greater burden of proof has been to establish that eggs are not 'bad', by increasing awareness of the difference between dietary and blood cholesterol, and accumulating sufficient evidence to exonerate eggs from their associations with CVD and diabetes. After 60 years of research, a general consensus has now been reached that dietary cholesterol, chiefly from eggs, exerts a relatively small effect on serum LDL-cholesterol and CVD risk, in comparison with other diet and lifestyle factors. While dietary guidelines have been revised worldwide to reflect this view, associations between egg intake and the incidence of diabetes, and increased CVD risk in diabetes, prevail. These associations may be explained, in part, by residual confounding produced by other dietary components. The strength of evidence that links egg intake to increased CVD risk in diabetes is also complicated by variation in the response of serum LDL-cholesterol to eggs and dietary cholesterol in types 1 and 2 diabetes. On balance, the answer to the question as to whether eggs are 'bad', is probably 'no', but we do need to gain a better understanding of the effects of dietary cholesterol and its association with CVD risk in diabetes. PMID:27126575
An improved filter-u least mean square vibration control algorithm for aircraft framework.
Huang, Quanzhen; Luo, Jun; Gao, Zhiyuan; Zhu, Xiaojin; Li, Hengyu
2014-09-01
Active vibration control of aerospace vehicle structures is a hot research topic, and the filter-u least mean square (FULMS) algorithm is one of the key methods. For practical reasons and because of technical limitations, however, extracting a vibration reference signal has always been a difficult problem for the FULMS algorithm. To solve this problem, an improved FULMS vibration control algorithm is proposed in this paper. The reference signal is constructed from the controller structure and the data produced during the algorithm's execution, using a vibration response residual signal extracted directly from the vibrating structure. To test the proposed algorithm, an aircraft frame model is built and an experimental platform is constructed. The simulation and experimental results show that the proposed algorithm is more practical and achieves good vibration suppression performance. PMID:25273765
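The FULMS update at the heart of this record is a stochastic-gradient weight adaptation. As a rough illustration of that family (a plain LMS loop, not the paper's improved algorithm, and ignoring the secondary-path filtering that gives FULMS its name), the following sketch adapts FIR weights to cancel a disturbance that is a delayed, scaled copy of a synthetic reference tone:

```python
import math

# Illustrative LMS sketch: adapt weights w so the filter output cancels a
# disturbance d[n] that is a filtered copy of the reference. All signals
# and parameters are made up for the example.
N, L, mu = 2000, 8, 0.01
ref = [math.sin(0.2 * math.pi * n) for n in range(N)]        # reference signal
d = [0.7 * ref[n - 2] if n >= 2 else 0.0 for n in range(N)]  # disturbance (delayed, scaled)

w = [0.0] * L
errors = []
for n in range(L, N):
    x = ref[n - L + 1:n + 1][::-1]                   # most recent L reference samples
    y = sum(wi * xi for wi, xi in zip(w, x))         # adaptive filter output
    e = d[n] - y                                     # residual error
    w = [wi + mu * e * xi for wi, xi in zip(w, x)]   # LMS weight update
    errors.append(e)

early = sum(abs(e) for e in errors[:200]) / 200      # mean error, early iterations
late = sum(abs(e) for e in errors[-200:]) / 200      # mean error after convergence
```

The residual shrinks as the weights converge, which is the behavior the FULMS family exploits; the paper's contribution concerns how the reference signal itself is obtained.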
1993-01-01
Suggestions are made on how best to integrate sexually transmitted disease (STD) screening and education within family planning (FP) programs in the UK. FP programs are in a good position to advise about HIV infections and STDs because most clients are in a vulnerable age group (women aged 15-50 years) and because health personnel are experienced in discussing sexual issues. When FP clinics do not provide STD services, the options are to collaborate on joint referral and training efforts with STD clinics and to train staff to recognize and talk about STDs. Information about STDs can be clearly displayed in the clinics. Health personnel can talk about STD transmission to clients, explain the role of condoms in infection prevention, and demonstrate how to use condoms properly. Examples are given of integrated HIV and STD and FP programs in the US, Gambia, Zambia, and Mexico. In the US, Planned Parenthood of New York City trains staff in prevention and counseling skills and supervises staff until a level of comfort is reached. HIV and AIDS education and risk assessment are part of the initial and annual follow-up visits. The Gambia FP Association helps staff learn to counsel clients about the problems with sexual satisfaction between men and women and with communication between partners, impotence, painful intercourse from female circumcision, STDs and AIDS, infertility, and contraceptive side effects. In Zambia, a women's organization helps women prepare educational skits on condom use for males and helps women learn to talk with spouses about condom use without suffering rejection or charges of infidelity. The Ghana Planned Parenthood Association has a Daddy's Club where men learn about HIV and safe sex with condoms and meet for private counseling. Mexfam in Mexico provides education for female farm laborers on sex education, FP, reproductive health and pregnancy, child health, water and sanitation, and energy-saving methods. PMID:12287338
Study of image matching algorithm and sub-pixel fitting algorithm in target tracking
NASA Astrophysics Data System (ADS)
Yang, Ming-dong; Jia, Jianjun; Qiang, Jia; Wang, Jian-yu
2015-03-01
Image correlation matching is a tracking method that searches for the region most similar to a target template, based on a correlation measure between two images. Because there is no need to segment the image and the computational cost is low, image correlation matching is a basic method of target tracking. This paper mainly studies a grayscale image matching algorithm whose precision is at the sub-pixel level. The matching algorithm used in this paper is the SAD (Sum of Absolute Differences) method, which excels in real-time systems because of its low computational complexity. The SAD method is introduced first, along with the most frequently used sub-pixel fitting algorithms. These fitting algorithms cannot be used in real-time systems because they are too complex. However, since target tracking often requires high real-time performance, we put forward a fitting algorithm, named the paraboloidal fitting algorithm, based on the consideration above; this algorithm is simple and easily realized in real-time systems. The result of this algorithm is compared with that of a surface fitting algorithm through image matching simulation. By comparison, the precision difference between these two algorithms is small, less than 0.01 pixel. In order to research the influence of target rotation on the precision of image matching, a camera rotation experiment was carried out. The detector used in the camera is a CMOS detector. It was fixed to an arc pendulum table, and pictures were taken as the camera rotated through different angles. A subarea of the original picture was chosen as the template, and the best matching spot was searched for using the image matching algorithm mentioned above. The result shows that the matching error is larger when the target rotation angle is larger; the relation is approximately linear. Finally, the influence of noise on matching precision was researched. Gaussian noise and salt-and-pepper noise were added to the image respectively, and the image
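The two ingredients named in this abstract, an integer-pixel SAD search followed by a parabolic sub-pixel refinement, can be sketched in one dimension (the signal, template, and three-point parabolic fit below are illustrative; the paper's paraboloidal fit is the 2-D analogue):

```python
# Hedged 1-D sketch of SAD matching with parabolic sub-pixel refinement.
def sad(template, signal, offset):
    # sum of absolute differences at a given integer offset
    return sum(abs(t - signal[offset + i]) for i, t in enumerate(template))

def match_subpixel(template, signal):
    n = len(signal) - len(template) + 1
    costs = [sad(template, signal, k) for k in range(n)]
    k = min(range(n), key=costs.__getitem__)       # best integer offset
    if 0 < k < n - 1:
        # fit a parabola through the three costs around the minimum and
        # take its vertex as the sub-pixel offset
        c_l, c_0, c_r = costs[k - 1], costs[k], costs[k + 1]
        denom = c_l - 2 * c_0 + c_r
        if denom != 0:
            return k + 0.5 * (c_l - c_r) / denom
    return float(k)

signal = [0, 1, 4, 9, 4, 1, 0, 0, 0]   # made-up 1-D "image" row
template = [1, 4, 9, 4]                # template matches exactly at offset 1
shift = match_subpixel(template, signal)
```

The vertex formula is cheap enough for real-time use, which is the same motivation given for the paraboloidal fit.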
Shuttle Entry Air Data System (SEADS) - Optimization of preflight algorithms based on flight results
NASA Technical Reports Server (NTRS)
Wolf, H.; Henry, M. W.; Siemers, Paul M., III
1988-01-01
The SEADS pressure model algorithm results were tested against other sources of air data, in particular, the Shuttle Best Estimated Trajectory (BET). The algorithm basis was also tested through a comparison of flight-measured pressure distribution vs the wind tunnel database. It is concluded that the successful flight of SEADS and the subsequent analysis of the data shows good agreement between BET and SEADS air data.
ERIC Educational Resources Information Center
Swinbank, Elizabeth
2004-01-01
This article shows how the physical testing of food ingredients and products can be used as a starting point for exploring aspects of physics. The three main areas addressed are: mechanical properties of solid materials; liquid flow; optical techniques for measuring sugar concentration. The activities described were originally developed for…
A Rapid Convergent Low Complexity Interference Alignment Algorithm for Wireless Sensor Networks
Jiang, Lihui; Wu, Zhilu; Ren, Guanghui; Wang, Gangyi; Zhao, Nan
2015-01-01
Interference alignment (IA) is a novel technique that can effectively eliminate the interference and approach the sum capacity of wireless sensor networks (WSNs) when the signal-to-noise ratio (SNR) is high, by casting the desired signal and interference into different signal subspaces. The traditional alternating minimization interference leakage (AMIL) algorithm for IA shows good performance in high-SNR regimes; however, the complexity of the AMIL algorithm increases dramatically as the number of users and antennas increases, posing limits to its application in practical systems. In this paper, a novel IA algorithm, called the directional quartic optimal (DQO) algorithm, is proposed to minimize the interference leakage with rapid convergence and low complexity. The properties of the AMIL algorithm are investigated, and it is discovered that the difference between two consecutive iteration results of the AMIL algorithm will approximately point to the convergence solution when the precoding and decoding matrices obtained from the intermediate iterations are sufficiently close to their convergence values. Based on this important property, the proposed DQO algorithm employs a line search procedure so that it can converge to the destination directly. In addition, the optimal step size can be determined analytically by optimizing a quartic function. Numerical results show that the proposed DQO algorithm can suppress the interference leakage more rapidly than the traditional AMIL algorithm, and can achieve the same level of sum rate as the AMIL algorithm with far fewer iterations and less execution time. PMID:26230697
Acoustic design of rotor blades using a genetic algorithm
NASA Technical Reports Server (NTRS)
Wells, V. L.; Han, A. Y.; Crossley, W. A.
1995-01-01
A genetic algorithm coupled with a simplified acoustic analysis was used to generate low-noise rotor blade designs. The model includes thickness, steady loading, and blade-vortex interaction noise estimates. The paper presents solutions for several variations of the fitness function, including thickness noise only, loading noise only, and combinations of the noise types. Preliminary results indicate that the analysis provides reasonable assessments of the noise produced, and that the genetic algorithm successfully searches for 'good' designs. The results show that, for a given required thrust coefficient, proper blade design can noticeably reduce the noise produced, at some expense to the power requirements.
Optimization of multilayer cylindrical cloaks using genetic algorithms and NEWUOA
NASA Astrophysics Data System (ADS)
Sakr, Ahmed A.; Abdelmageed, Alaa K.
2016-06-01
The problem of minimizing the scattering from a multilayer cylindrical cloak is studied. Both TM and TE polarizations are considered. A two-stage optimization procedure using genetic algorithms and NEWUOA (new unconstrained optimization algorithm) is adopted for realizing the cloak using homogeneous isotropic layers. The layers are arranged such that they follow a repeated pattern of alternating DPS and DNG materials. The results show that a good level of invisibility can be realized using a reasonable number of layers. Maintaining the cloak performance over a finite range of frequencies without sacrificing the level of invisibility is achieved.
NASA Astrophysics Data System (ADS)
Shao, Xinxing; Dai, Xiangjun; He, Xiaoyuan
2015-08-01
The inverse compositional Gauss-Newton (IC-GN) algorithm is one of the most popular sub-pixel registration algorithms in digital image correlation (DIC). The IC-GN algorithm, compared with the traditional forward additive Newton-Raphson (FA-NR) algorithm, can achieve the same accuracy in less time. However, there are no clear results regarding the noise robustness of the IC-GN algorithm, and its computational efficiency is still in need of further improvement. In this paper, a theoretical model of the IC-GN algorithm was derived based on the sum of squared differences correlation criterion and linear interpolation. The model indicates that the IC-GN algorithm has better noise robustness than the FA-NR algorithm, and shows no noise-induced bias if the gray gradient operator is chosen properly. Both numerical simulations and experiments show good agreement with the theoretical predictions. Furthermore, a seed point-based parallel method is proposed to improve the calculation speed. Compared with the recently proposed path-independent method, our model is feasible and practical, and it can maximize the computing speed using an improved initial guess. Moreover, we compared the computational efficiency of our method with that of the reliability-guided method using a four-point bending experiment, and the results show that the computational efficiency is greatly improved. This proposed parallel IC-GN algorithm has good noise robustness and is expected to be a practical option for real-time DIC.
NASA Astrophysics Data System (ADS)
1999-11-01
Along with the increase in the number of young people applying to enter higher education announced back in July, the UK Department for Education and Employment noted that over a thousand more graduates had applied for postgraduate teacher training when compared with the same time in 1998. It appeared that the `Golden hello' programme for new mathematics and science teachers had succeeded in its aim of encouraging applicants in those subjects: an increase of 37% had been witnessed for maths teaching, 33% for physics and 27% for chemistry. Primary teacher training was also well on target with over five applicants seeking each available place. Statistics for UK schools released in August by the DfEE show that 62% of primary schools and 93% of secondary schools are now linked to the Internet (the corresponding figures were 17% and 83% in 1998). On average there is now one computer for every 13 pupils at primary school and one for every eight students in secondary school. The figures show continuing progress towards the Government's target of ensuring that all schools, colleges, universities, libraries and as many community centres as possible should be online (with access to the National Grid for Learning) by 2002.
Good Grubbin': Impact of a TV Cooking Show for College Students Living off Campus
ERIC Educational Resources Information Center
Clifford, Dawn; Anderson, Jennifer; Auld, Garry; Champ, Joseph
2009-01-01
Objective: To determine if a series of 4 15-minute, theory-driven (Social Cognitive Theory) cooking programs aimed at college students living off campus improved cooking self-efficacy, knowledge, attitudes, and behaviors regarding fruit and vegetable intake. Design: A randomized controlled trial with pre-, post- and follow-up tests. Setting:…
Good News for New Orleans: Early Evidence Shows Reforms Lifting Student Achievement
ERIC Educational Resources Information Center
Harris, Douglas N.
2015-01-01
What happened to the New Orleans public schools following the tragic levee breeches after Hurricane Katrina is truly unprecedented. Within the span of one year, all public-school employees were fired, the teacher contract expired and was not replaced, and most attendance zones were eliminated. The state took control of almost all public schools…
EDGA: A Population Evolution Direction-Guided Genetic Algorithm for Protein-Ligand Docking.
Guan, Boxin; Zhang, Changsheng; Ning, Jiaxu
2016-07-01
Protein-ligand docking can be formulated as a search algorithm associated with an accurate scoring function. However, most current search algorithms cannot show good performance in docking problems, especially for highly flexible docking. To overcome this drawback, this article presents a novel and robust optimization algorithm (EDGA) based on the Lamarckian genetic algorithm (LGA) for solving flexible protein-ligand docking problems. This method applies a population evolution direction-guided model of genetics, in which the search direction evolves toward the optimum solution. The method is more efficient at finding the lowest energy in protein-ligand docking. We consider four search methods (a traditional genetic algorithm, LGA, SODOCK, and EDGA) and compare their performance in docking on six protein-ligand docking problems. The results show that EDGA is the most stable, reliable, and successful. PMID:26895461
Optimization of composite structures by estimation of distribution algorithms
NASA Astrophysics Data System (ADS)
Grosset, Laurent
The design of high performance composite laminates, such as those used in aerospace structures, leads to complex combinatorial optimization problems that cannot be addressed by conventional methods. These problems are typically solved by stochastic algorithms, such as evolutionary algorithms. This dissertation proposes a new evolutionary algorithm for composite laminate optimization, named the Double-Distribution Optimization Algorithm (DDOA). DDOA belongs to the family of estimation of distribution algorithms (EDAs), which build a statistical model of promising regions of the design space based on sets of good points and use it to guide the search. A generic framework for introducing statistical variable dependencies by making use of the physics of the problem is proposed. The algorithm uses two distributions simultaneously: the marginal distributions of the design variables, complemented by the distribution of auxiliary variables. The combination of the two generates complex distributions at a low computational cost. The dissertation demonstrates the efficiency of DDOA for several laminate optimization problems where the design variables are the fiber angles and the auxiliary variables are the lamination parameters. The results show that its reliability in finding the optima is greater than that of a simple EDA and of a standard genetic algorithm, and that its advantage increases with the problem dimension. A continuous version of the algorithm is presented and applied to a constrained quadratic problem. Finally, a modification of the algorithm incorporating probabilistic and directional search mechanisms is proposed. The algorithm exhibits a faster convergence to the optimum and opens the way for a unified framework for stochastic and directional optimization.
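The "simple EDA" that DDOA is compared against can be illustrated by a univariate marginal distribution algorithm (UMDA) on a toy OneMax problem (maximize the number of 1-bits). This is a hedged sketch of the EDA idea only: DDOA's second, physics-informed distribution over auxiliary variables is omitted, and all sizes are illustrative choices, not the dissertation's settings.

```python
import random

# UMDA sketch: sample a population from per-bit marginals, select the
# best individuals, and re-estimate the marginals from them.
random.seed(0)
n_bits, pop_size, n_sel, gens = 20, 60, 20, 30
p = [0.5] * n_bits                      # marginal model: P(bit i = 1)

for _ in range(gens):
    pop = [[1 if random.random() < pi else 0 for pi in p]
           for _ in range(pop_size)]
    pop.sort(key=sum, reverse=True)     # fitness = number of ones
    elite = pop[:n_sel]                 # the "good points"
    # re-estimate each marginal from the selected points, clamped so no
    # probability fixates at exactly 0 or 1
    p = [min(max(sum(ind[i] for ind in elite) / n_sel, 0.05), 0.95)
         for i in range(n_bits)]

best = max(sum(ind) for ind in pop)     # best fitness in the final population
```

The model concentrates probability mass on good regions over generations; DDOA's refinement is to couple such marginals with a second distribution derived from the problem's physics.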
NASA Technical Reports Server (NTRS)
2006-01-01
This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows several small, dark sand dunes and a small crater (about 1 kilometer in diameter) within a much larger crater (not visible in this image). The floor of the larger crater is rough and has been eroded with time. The floor of the smaller crater contains windblown ripples. The steep faces of the dunes point to the east (right), indicating that the dominant winds blew from the west (left). This scene is located near 38.5 S, 347.1 W, and covers an area approximately 3 km (1.9 mi) wide. Sunlight illuminates the landscape from the upper left. This southern autumn image was acquired on 1 July 2006.
Benchmark graphs for testing community detection algorithms
NASA Astrophysics Data System (ADS)
Lancichinetti, Andrea; Fortunato, Santo; Radicchi, Filippo
2008-10-01
Community structure is one of the most important features of real networks and reveals the internal organization of the nodes. Many algorithms have been proposed but the crucial issue of testing, i.e., the question of how good an algorithm is, with respect to others, is still open. Standard tests include the analysis of simple artificial graphs with a built-in community structure, that the algorithm has to recover. However, the special graphs adopted in actual tests have a structure that does not reflect the real properties of nodes and communities found in real networks. Here we introduce a class of benchmark graphs, that account for the heterogeneity in the distributions of node degrees and of community sizes. We use this benchmark to test two popular methods of community detection, modularity optimization, and Potts model clustering. The results show that the benchmark poses a much more severe test to algorithms than standard benchmarks, revealing limits that may not be apparent at a first analysis.
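The "simple artificial graphs with a built-in community structure" used in standard tests can be as small as a planted partition with uniform intra- and inter-community edge probabilities. A minimal sketch (parameters are arbitrary examples, not the heterogeneous benchmark this paper introduces):

```python
import random

# Planted-partition sketch: two equal communities, intra-community edge
# probability p_in, inter-community probability p_out. A detection
# algorithm is judged by how well it recovers the planted labels.
random.seed(42)
n, p_in, p_out = 60, 0.3, 0.02
community = [i < n // 2 for i in range(n)]   # first half vs second half
edges = []
for i in range(n):
    for j in range(i + 1, n):
        p = p_in if community[i] == community[j] else p_out
        if random.random() < p:
            edges.append((i, j))

internal = sum(1 for i, j in edges if community[i] == community[j])
external = len(edges) - internal
```

With p_in far above p_out, internal edges dominate, which is exactly the homogeneity the paper criticizes: real networks have heterogeneous degree and community-size distributions that this construction misses.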
A graph spectrum based geometric biclustering algorithm.
Wang, Doris Z; Yan, Hong
2013-01-21
Biclustering is capable of performing simultaneous clustering on two dimensions of a data matrix and has many applications in pattern classification. For example, in microarray experiments, a subset of genes is co-expressed in a subset of conditions, and biclustering algorithms can be used to detect the coherent patterns in the data for further analysis of function. In this paper, we present a graph spectrum based geometric biclustering (GSGBC) algorithm. In the geometrical view, biclusters can be seen as different linear geometrical patterns in high dimensional spaces. Based on this, the modified Hough transform is used to find the Hough vector (HV) corresponding to sub-bicluster patterns in 2D spaces. A graph can be built regarding each HV as a node. The graph spectrum is utilized to identify the eigengroups in which the sub-biclusters are grouped naturally to produce larger biclusters. Through a comparative study, we find that the GSGBC achieves as good a result as GBC and outperforms other kinds of biclustering algorithms. Also, compared with the original geometrical biclustering algorithm, it reduces the computing time complexity significantly. We also show that biologically meaningful biclusters can be identified by our method from real microarray gene expression data. PMID:23079285
A fast image encryption algorithm based on chaotic map
NASA Astrophysics Data System (ADS)
Liu, Wenhao; Sun, Kehui; Zhu, Congxu
2016-09-01
Derived from the Sine map and the iterative chaotic map with infinite collapse (ICMIC), a new two-dimensional Sine ICMIC modulation map (2D-SIMM) is proposed based on a close-loop modulation coupling (CMC) model, and its chaotic performance is analyzed by means of the phase diagram, Lyapunov exponent spectrum, and complexity. The analysis shows that this map has good ergodicity, hyperchaotic behavior, a large maximum Lyapunov exponent, and high complexity. Based on this map, a fast image encryption algorithm is proposed, in which the confusion and diffusion processes are combined into one stage. A chaotic shift transform (CST) is proposed to efficiently change the image pixel positions, and row and column substitutions are applied to scramble the pixel values simultaneously. The simulation and analysis results show that this algorithm has high security, low time complexity, and the ability to resist statistical, differential, brute-force, known-plaintext, and chosen-plaintext attacks.
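The general pattern behind chaotic-map image ciphers can be sketched with a generic sine-map keystream (this is a hedged illustration only: the paper's 2D-SIMM map, CST permutation, and combined confusion-diffusion stage are not reproduced, and the key value is an arbitrary example):

```python
import math

# Toy chaotic-keystream cipher: iterate a sine map from a secret initial
# condition, quantize each state to a byte, and XOR it with the pixels.
def sine_map_stream(x0, length):
    x, out = x0, []
    for _ in range(length):
        x = math.sin(math.pi * x)       # chaotic sine-map iteration, x in (0, 1]
        out.append(int(x * 255))        # quantize state to a byte
    return out

def xor_cipher(pixels, key=0.3456):
    ks = sine_map_stream(key, len(pixels))
    return [p ^ k for p, k in zip(pixels, ks)]

plain = [10, 200, 33, 33, 90, 255, 0, 17]   # made-up pixel values
cipher = xor_cipher(plain)
restored = xor_cipher(cipher)               # XOR with the same keystream decrypts
```

Sensitivity to the initial condition is what gives such schemes their key space; the paper's contribution is a map with stronger chaotic properties plus an efficient permutation stage.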
NASA Technical Reports Server (NTRS)
Barth, Timothy J.; Lomax, Harvard
1987-01-01
The past decade has seen considerable activity in algorithm development for the Navier-Stokes equations. This has resulted in a wide variety of useful new techniques. Some examples for the numerical solution of the Navier-Stokes equations are presented, divided into two parts. One is devoted to the incompressible Navier-Stokes equations, and the other to the compressible form.
Digital watermarking algorithm based on HVS in wavelet domain
NASA Astrophysics Data System (ADS)
Zhang, Qiuhong; Xia, Ping; Liu, Xiaomei
2013-10-01
As a new technique used to protect the copyright of digital productions, the digital watermarking technique has drawn extensive attention. A digital watermarking algorithm based on the discrete wavelet transform (DWT) and human visual system (HVS) properties is presented in this paper, together with several attack analyses. Experimental results show that the proposed watermarking scheme is invisible and has good robustness to cropping, compression, filtering, and noise addition.
Good pitch memory is widespread.
Schellenberg, E Glenn; Trehub, Sandra E
2003-05-01
Here we show that good pitch memory is widespread among adults with no musical training. We tested unselected college students on their memory for the pitch level of instrumental soundtracks from familiar television programs. Participants heard 5-s excerpts either at the original pitch level or shifted upward or downward by 1 or 2 semitones. They successfully identified the original pitch levels. Other participants who heard comparable excerpts from unfamiliar recordings could not do so. These findings reveal that ordinary listeners retain fine-grained information about pitch level over extended periods. Adults' reportedly poor memory for pitch is likely to be a by-product of their inability to name isolated pitches. PMID:12741751
ETD: an extended time delay algorithm for ventricular fibrillation detection.
Kim, Jungyoon; Chu, Chao-Hsien
2014-01-01
Ventricular fibrillation (VF) is the most serious type of heart attack, requiring quick detection and first aid to improve patients' survival rates. To be most effective in using wearable devices for VF detection, it is vital that the detection algorithms be accurate, robust, reliable, and computationally efficient. Previous studies and our experiments both indicate that the time-delay (TD) algorithm has high reliability for separating sinus rhythm (SR) from VF and is resistant to variable factors, such as window size and filtering method. However, it fails to detect some VF cases. In this paper, we propose an extended time-delay (ETD) algorithm for VF detection and conduct experiments comparing the performance of ETD against five good VF detection algorithms, including TD, using the popular Creighton University (CU) database. Our study shows that (1) TD and ETD outperform the other four algorithms considered, and (2) with the same sensitivity setting, ETD improves upon TD in three other quality measures by up to 7.64%; in terms of aggregate accuracy, the ETD algorithm shows an improvement of 2.6% in the area under the curve (AUC) compared to TD. PMID:25571480
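The time-delay idea underlying both TD and ETD can be illustrated with a box-counting measure on a delay embedding: plot the signal against a delayed copy of itself and measure how much of a coarse grid the trajectory fills. A regular rhythm traces a thin loop (few boxes); a VF-like signal scatters widely (many boxes). The grid size, delay, and synthetic signals below are illustrative assumptions, not the ETD algorithm itself:

```python
import math
import random

def box_fill(signal, tau=5, grid=20):
    # fraction of grid cells visited by the points (x[n], x[n + tau])
    lo, hi = min(signal), max(signal)
    span = (hi - lo) or 1.0
    boxes = set()
    for a, b in zip(signal, signal[tau:]):
        i = min(int((a - lo) / span * grid), grid - 1)
        j = min(int((b - lo) / span * grid), grid - 1)
        boxes.add((i, j))
    return len(boxes) / (grid * grid)

regular = [math.sin(0.1 * n) for n in range(1000)]   # SR-like periodic loop
random.seed(1)
chaotic = [random.uniform(-1, 1) for _ in range(1000)]  # VF-like scatter
fill_regular = box_fill(regular)
fill_chaotic = box_fill(chaotic)
```

Thresholding this fill fraction is one way to separate SR from VF; the ETD extension adds further measures to catch the VF cases the plain box count misses.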
Substantial Goodness and Nascent Human Life.
Floyd, Shawn
2015-09-01
Many believe that moral value is--at least to some extent--dependent on the developmental states necessary for supporting rational activity. My paper rejects this view, but does not aim simply to register objections to it. Rather, my essay aims to answer the following question: if a human being's developmental state and occurrent capacities do not bequeath moral standing, what does? The question is intended to prompt careful consideration of what makes human beings objects of moral value, dignity, or (to employ my preferred term) goodness. Not only do I think we can answer this question, I think we can show that nascent human life possesses goodness of precisely this sort. I appeal to Aquinas's metaethics to establish the conclusion that the goodness of a human being--even if that being is an embryo or fetus--resides at the substratum of her existence. If she possesses goodness, it is because human existence is good. PMID:25633227
Xiao, Jianyuan; Liu, Jian; Qin, Hong; Plasma Physics Laboratory, Princeton University, Princeton, New Jersey 08543 ; Yu, Zhi
2013-10-15
Smoothing functions are commonly used to reduce numerical noise arising from coarse sampling of particles in particle-in-cell (PIC) plasma simulations. When applying smoothing functions to symplectic algorithms, the conservation of symplectic structure should be guaranteed to preserve good conservation properties. In this paper, we show how to construct a variational multi-symplectic PIC algorithm with smoothing functions for the Vlasov-Maxwell system. The conservation of the multi-symplectic structure and the reduction of numerical noise make this algorithm specifically suitable for simulating long-term dynamics of plasmas, such as those in the steady-state operation or long-pulse discharge of a super-conducting tokamak. The algorithm has been implemented in a 6D large scale PIC code. Numerical examples are given to demonstrate the good conservation properties of the multi-symplectic algorithm and the reduction of the noise due to the application of smoothing function.
Preconditioned quantum linear system algorithm.
Clader, B D; Jacobs, B C; Sprouse, C R
2013-06-21
We describe a quantum algorithm that generalizes the quantum linear system algorithm [Harrow et al., Phys. Rev. Lett. 103, 150502 (2009)] to arbitrary problem specifications. We develop a state preparation routine that can initialize generic states, show how simple ancilla measurements can be used to calculate many quantities of interest, and integrate a quantum-compatible preconditioner that greatly expands the number of problems that can achieve exponential speedup over classical linear systems solvers. To demonstrate the algorithm's applicability, we show how it can be used to compute the electromagnetic scattering cross section of an arbitrary target exponentially faster than the best classical algorithm. PMID:23829722
A novel method to design S-box based on chaotic map and genetic algorithm
NASA Astrophysics Data System (ADS)
Wang, Yong; Wong, Kwok-Wo; Li, Changbing; Li, Yang
2012-01-01
The substitution box (S-box) is an important component in block encryption algorithms. In this Letter, the problem of constructing an S-box is transformed into a Traveling Salesman Problem, and a method for designing S-boxes based on chaos and a genetic algorithm is proposed. Since the proposed method makes full use of the traits of the chaotic map and the evolution process, a stronger S-box is obtained. The results of performance tests show that the presented S-box has good cryptographic properties, which justifies that the proposed algorithm is effective in generating strong S-boxes.
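One common way to obtain a bijective S-box from a chaotic map, sketched below, is to iterate the map 256 times and use the ranking (argsort) of the orbit as a permutation of 0..255. This is a hedged illustration of the chaos ingredient only; the Letter's TSP formulation and genetic-algorithm refinement are omitted, and the map parameters are arbitrary examples.

```python
def chaotic_sbox(x0=0.7, r=3.99):
    # iterate the logistic map and record the orbit
    x, orbit = x0, []
    for _ in range(256):
        x = r * x * (1 - x)     # logistic-map iteration, x stays in (0, 1)
        orbit.append(x)
    # rank each orbit value: the argsort is a bijection on 0..255
    return sorted(range(256), key=orbit.__getitem__)

sbox = chaotic_sbox()
```

Bijectivity comes for free from the argsort; the cryptographic strength (nonlinearity, differential uniformity) is what the genetic-algorithm stage then optimizes.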
A baseline algorithm for face detection and tracking in video
NASA Astrophysics Data System (ADS)
Manohar, Vasant; Soundararajan, Padmanabhan; Korzhova, Valentina; Boonstra, Matthew; Goldgof, Dmitry; Kasturi, Rangachar
2007-10-01
Establishing benchmark datasets, performance metrics, and baseline algorithms has considerable research significance in gauging the progress in any application domain. These primarily allow both users and developers to compare the performance of various algorithms on a common platform. In our earlier works, we focused on developing performance metrics and establishing a substantial dataset with ground truth for object detection and tracking tasks (text and face) in two video domains: broadcast news and meetings. In this paper, we present the results of a face detection and tracking algorithm on broadcast news videos with the objective of establishing a baseline performance for this task-domain pair. The detection algorithm uses a statistical approach that was originally developed by Viola and Jones and later extended by Lienhart. The algorithm uses a Haar-like feature set and a cascade of boosted decision tree classifiers as a statistical model. In this work, we used the Intel Open Source Computer Vision Library (OpenCV) implementation of the Haar face detection algorithm. The optimal values for the tunable parameters of this implementation were found through an experimental design strategy commonly used in statistical analyses of industrial processes. Tracking was accomplished as continuous detection, with the detected objects in two frames mapped using a greedy algorithm based on the distances between the centroids of bounding boxes. Results on the evaluation set containing 50 sequences (~2.5 min) using the developed performance metrics show good performance of the algorithm, reflecting the state of the art, which makes it an appropriate choice as the baseline algorithm for this problem.
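The greedy frame-to-frame association step described here can be sketched in a few lines. The centroids, the Manhattan distance, and the two-frame example below are illustrative stand-ins (the paper's metric and detections come from its Haar detector, not from made-up points):

```python
# Greedy association sketch: repeatedly link the globally closest
# unmatched pair of centroids between two frames.
def greedy_match(prev, curr):
    # all candidate pairs, sorted by centroid distance
    pairs = sorted(
        (abs(p[0] - c[0]) + abs(p[1] - c[1]), i, j)
        for i, p in enumerate(prev) for j, c in enumerate(curr))
    used_p, used_c, links = set(), set(), {}
    for _, i, j in pairs:
        if i not in used_p and j not in used_c:
            links[i] = j                 # track i continues as detection j
            used_p.add(i)
            used_c.add(j)
    return links

prev = [(10, 10), (50, 50)]   # hypothetical box centroids in frame t
curr = [(52, 49), (11, 12)]   # hypothetical box centroids in frame t+1
links = greedy_match(prev, curr)
```

Greedy matching is not globally optimal (unlike the Hungarian algorithm), but for well-separated faces between consecutive frames it is a cheap and adequate baseline.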
Parallel algorithms for unconstrained optimizations by multisplitting
He, Qing
1994-12-31
In this paper a new parallel iterative algorithm for unconstrained optimization using the idea of multisplitting is proposed. This algorithm uses existing sequential algorithms without any parallelization. Some convergence and numerical results for this algorithm are presented. The experiments were performed on an Intel iPSC/860 hypercube with 64 nodes. Interestingly, the sequential implementation on one node shows that, if the problem is split properly, the algorithm converges much faster than one without splitting.
Digital watermarking algorithm research of color images based on quaternion Fourier transform
NASA Astrophysics Data System (ADS)
An, Mali; Wang, Weijiang; Zhao, Zhen
2013-10-01
A watermarking algorithm for color images based on the quaternion Fourier transform (QFFT) and an improved quantization index modulation (QIM) algorithm is proposed in this paper. The original image is transformed by the QFFT, the watermark image is processed by compression and quantization coding, and the processed watermark is then embedded into the components of the transformed original image, achieving embedding and blind extraction of the watermark. The experimental results show that the watermarking algorithm based on the improved QIM algorithm with distortion compensation achieves a good tradeoff between invisibility and robustness, and better robustness against Gaussian noise, salt-and-pepper noise, JPEG compression, cropping, filtering, and image enhancement attacks than the traditional QIM algorithm.
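For context, plain scalar QIM, the building block the paper improves with distortion compensation, embeds a bit by snapping a coefficient to one of two interleaved quantizer lattices. The step size and coefficient values below are illustrative, and the quaternion transform stage is omitted:

```python
# Scalar QIM sketch: bit 0 uses the lattice {k*DELTA}, bit 1 the shifted
# lattice {k*DELTA + DELTA/2}. Extraction picks the nearer lattice.
DELTA = 8.0   # quantization step (illustrative)

def qim_embed(coeff, bit):
    offset = DELTA / 2 if bit else 0.0
    return round((coeff - offset) / DELTA) * DELTA + offset

def qim_extract(coeff):
    d0 = abs(coeff - round(coeff / DELTA) * DELTA)
    d1 = abs(coeff - (round((coeff - DELTA / 2) / DELTA) * DELTA + DELTA / 2))
    return 1 if d1 < d0 else 0

marked = [qim_embed(c, b) for c, b in zip([13.2, 40.7, 91.1], [1, 0, 1])]
recovered = [qim_extract(c + 1.5) for c in marked]   # survives noise < DELTA/4
```

Larger DELTA buys robustness at the cost of distortion; distortion-compensated QIM, as used in the paper, softens exactly this tradeoff.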
Distributed Range-Free Localization Algorithm Based on Self-Organizing Maps
NASA Astrophysics Data System (ADS)
Tinh, Pham Doan; Kawai, Makoto
In Mobile Ad-Hoc Networks (MANETs), determining the physical location of nodes (localization) is very important for many network services and protocols. This paper proposes a new Distributed Range-free Localization Algorithm Based on Self-Organizing Maps (SOM) to deal with this issue. The proposed algorithm uses only connectivity information to determine the location of nodes. By utilizing the intersection areas between the radio coverage of neighboring nodes, the algorithm maximizes the correlation between neighboring nodes in the distributed implementation of SOM and reduces the SOM learning time. The algorithm was implemented in Network Simulator 2 (NS-2), with node mobility taken into account, to verify its performance. Results from our extensive simulations show that the proposed scheme achieves very good accuracy in most cases.
Algorithm for identifying and separating beats from arterial pulse records
Treo, Ernesto F; Herrera, Myriam C; Valentinuzzi, Max E
2005-01-01
Background This project was designed as an epidemiological aid-selecting tool for a small country health center with the general objective of screening out possible coronary patients. Peripheral artery function can be non-invasively evaluated by impedance plethysmography. Changes in these vessels appear as good predictors of future coronary behavior. Impedance plethysmography detects volume variations after simple occlusive maneuvers that may show indicative modifications in arterial/venous responses. Averaging of a series of pulses is needed and this, in turn, requires proper determination of the beginning and end of each beat. Thus, the objective here is to describe an algorithm to identify and separate out beats from a plethysmographic record. A secondary objective was to compare the output given by human operators against the algorithm. Methods The identification algorithm detected the beat's onset and end on the basis of the maximum rising phase, the choice of possible ventricular systolic starting points considering cardiac frequency, and the adjustment of some tolerance values to optimize the behavior. Out of 800 patients in the study, 40 occlusive records (supradiastolic-subsystolic) were randomly selected without any preliminary diagnosis. Radial impedance plethysmographic pulse and standard ECG were recorded, and the data were digitized and stored. Cardiac frequency was estimated with the Power Density Function; thereafter, the signal was differentiated twice, followed by binarization of the first derivative and rectification of the second derivative. The product of these two results led to a weighting signal from which the cycles' onsets and ends were established. Weighting and frequency filters are needed, along with the pre-establishment of their respective tolerances. Out of the 40 records, 30-second strands were randomly chosen to be analyzed by the algorithm and by two operators. Sensitivity and accuracy were calculated by means of the true/false and
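The derivative-based weighting signal described in the Methods (binarized first derivative multiplied by the rectified second derivative) can be sketched on a synthetic pulse train; the beat shape, threshold and grouping gap below are illustrative assumptions, not the study's plethysmographic data.

```python
import numpy as np

def beat_weighting(x):
    """Weighting signal: positive only where the signal is both rising
    (first derivative > 0) and accelerating (second derivative > 0)."""
    d1 = np.diff(x)                             # first derivative
    d2 = np.diff(x, 2)                          # second derivative
    gate = (d1[:len(d2)] > 0).astype(float)     # binarized first derivative
    return gate * np.maximum(d2, 0.0)           # rectified second derivative

def find_onsets(w, thresh, min_gap):
    """Group supra-threshold samples separated by less than min_gap
    into single beat-onset events."""
    idx = np.flatnonzero(w > thresh)
    onsets = []
    for i in idx:
        if not onsets or i - onsets[-1] > min_gap:
            onsets.append(int(i))
    return onsets
```

On real records the frequency filter mentioned in the abstract would additionally reject events inconsistent with the estimated cardiac frequency.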
Predicting the performance of a spatial gamut mapping algorithm
NASA Astrophysics Data System (ADS)
Bakke, Arne M.; Farup, Ivar; Hardeberg, Jon Y.
2009-01-01
Gamut mapping algorithms are currently being developed to take advantage of the spatial information in an image to improve the utilization of the destination gamut. These algorithms try to preserve the spatial information between neighboring pixels in the image, such as edges and gradients, without sacrificing global contrast. Experiments have shown that such algorithms can result in significantly improved reproduction of some images compared with non-spatial methods. However, due to the spatial processing of images, they introduce unwanted artifacts when used on certain types of images. In this paper we perform basic image analysis to predict whether a spatial algorithm is likely to perform better or worse than a good, non-spatial algorithm. Our approach starts by detecting the relative amount of areas in the image that are made up of uniformly colored pixels, as well as the amount of areas that contain details in out-of-gamut areas. A weighted difference is computed from these numbers, and we show that the result has a high correlation with the observed performance of the spatial algorithm in a previously conducted psychophysical experiment.
A new map-making algorithm for CMB polarization experiments
NASA Astrophysics Data System (ADS)
Wallis, Christopher G. R.; Bonaldi, A.; Brown, Michael L.; Battye, Richard A.
2015-10-01
With the temperature power spectrum of the cosmic microwave background (CMB) at least four orders of magnitude larger than the B-mode polarization power spectrum, any instrumental imperfections that couple temperature to polarization must be carefully controlled and/or removed. Here we present two new map-making algorithms that can create polarization maps that are clean of temperature-to-polarization leakage systematics due to differential gain and pointing between a detector pair. Where a half-wave plate is used, we show that the spin-2 systematic due to differential ellipticity can also be removed using our algorithms. The algorithms require no prior knowledge of the imperfections or temperature sky to remove the temperature leakage. Instead, they calculate the systematic and polarization maps in one step directly from the time-ordered data (TOD). The first algorithm is designed to work with scan strategies that have a good range of crossing angles for each map pixel and the second for scan strategies that have a limited range of crossing angles. The first algorithm can also be used to identify if systematic errors that have a particular spin are present in a TOD. We demonstrate the use of both algorithms and the ability to identify systematics with simulations of TOD with realistic scan strategies and instrumental noise.
Comparative analysis of PSO algorithms for PID controller tuning
NASA Astrophysics Data System (ADS)
Štimac, Goranka; Braut, Sanjin; Žigulić, Roberto
2014-09-01
The active magnetic bearing (AMB) suspends the rotating shaft and maintains it in a levitated position by applying controlled electromagnetic forces on the rotor in the radial and axial directions. Although the development of various control methods is rapid, PID control is still the most widely used strategy in many applications, including AMBs. In order to tune the PID controller, a particle swarm optimization (PSO) method is applied. A comparative analysis is carried out in which two PSO algorithms, namely (1) PSO with a linearly decreasing inertia weight (LDW-PSO) and (2) PSO with a constriction factor approach (CFA-PSO), are independently tested for different PID structures. The computer simulations are carried out with the aim of minimizing the objective function defined as the integral of time multiplied by the absolute value of error (ITAE). In order to validate the performance of the analyzed PSO algorithms, one-axis and two-axis radial rotor/active magnetic bearing systems are examined. The results show that PSO algorithms are effective and easily implemented methods, providing stable convergence and good computational efficiency of different PID structures for the rotor/AMB systems. Moreover, the PSO algorithms prove easy to use for controller tuning of both SISO and MIMO systems, taking into account the system delay and the interference between the horizontal and vertical rotor axes.
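The LDW-PSO variant (inertia weight decreasing linearly over the run) can be sketched as below. To keep the example self-contained, a simple quadratic objective stands in for the ITAE cost of the rotor/AMB simulation, and all swarm parameters are illustrative assumptions.

```python
import numpy as np

def ldw_pso(f, dim, bounds, n=20, iters=200,
            w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, seed=0):
    """PSO with linearly decreasing inertia weight (LDW-PSO)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest = x.copy()
    pcost = np.array([f(p) for p in x])
    g = pbest[np.argmin(pcost)].copy()
    gcost = pcost.min()
    for t in range(iters):
        # inertia weight decreases linearly from w_max to w_min
        w = w_max - (w_max - w_min) * t / max(iters - 1, 1)
        r1, r2 = rng.random((2, n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        cost = np.array([f(p) for p in x])
        imp = cost < pcost
        pbest[imp] = x[imp]
        pcost[imp] = cost[imp]
        if pcost.min() < gcost:
            gcost = pcost.min()
            g = pbest[np.argmin(pcost)].copy()
    return g, gcost
```

In the paper's setting, `f` would run a closed-loop simulation of the rotor/AMB plant with candidate PID gains and return the ITAE value; CFA-PSO would replace the inertia-weight update with a constriction factor on the whole velocity term.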
A Hybrid Ant Colony Algorithm for Loading Pattern Optimization
NASA Astrophysics Data System (ADS)
Hoareau, F.
2014-06-01
Electricité de France (EDF) operates 58 nuclear power plants (NPPs) of the Pressurized Water Reactor (PWR) type. The loading pattern (LP) optimization of these NPPs is currently done by EDF expert engineers. Within this framework, EDF R&D has developed automatic optimization tools that assist the experts. The latter can resort, for instance, to loading pattern optimization software based on an ant colony algorithm. This paper presents an analysis of the search space of a few realistic loading pattern optimization problems. This analysis leads us to introduce a hybrid algorithm based on ant colony optimization and a local search method. We then show that this new algorithm is able to generate loading patterns of good quality.
Scheduling Jobs with Genetic Algorithms
NASA Astrophysics Data System (ADS)
Ferrolho, António; Crisóstomo, Manuel
Most scheduling problems are NP-hard: the time required to solve them optimally increases exponentially with problem size. Scheduling problems have important applications, and a number of heuristic algorithms have been proposed to find relatively good solutions in polynomial time. Recently, genetic algorithms (GA) have been successfully applied to scheduling problems, as shown by a growing number of papers, and GA are known as among the most efficient algorithms for this purpose. However, when a GA is applied to scheduling problems, various crossover and mutation operators are applicable. This paper presents and examines a new concept of genetic operators for scheduling problems. A software tool called hybrid and flexible genetic algorithm (HybFlexGA) was developed to examine the performance of various crossover and mutation operators through computational simulations of job scheduling problems.
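A minimal permutation GA of the kind studied here — order crossover (OX) plus swap mutation — can be sketched on a single-machine total weighted completion time instance. The instance, operators and parameters below are illustrative assumptions, not the HybFlexGA tool itself.

```python
import random

def schedule_cost(order, p, w):
    """Total weighted completion time of jobs processed in the given order."""
    t, cost = 0, 0
    for j in order:
        t += p[j]
        cost += w[j] * t
    return cost

def ox_crossover(p1, p2, rng):
    """Order crossover (OX): keep a slice of p1, fill the rest in p2's order."""
    n = len(p1)
    i, j = sorted(rng.sample(range(n + 1), 2))
    middle = p1[i:j]
    rest = [g for g in p2 if g not in middle]
    return rest[:i] + middle + rest[i:]

def ga_schedule(p, w, pop_size=30, gens=100, mut_rate=0.2, seed=0):
    rng = random.Random(seed)
    n = len(p)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda o: schedule_cost(o, p, w))
        elite = pop[:pop_size // 2]            # elitist selection
        children = []
        while len(elite) + len(children) < pop_size:
            c = ox_crossover(rng.choice(elite), rng.choice(elite), rng)
            if rng.random() < mut_rate:        # swap mutation
                a, b = rng.sample(range(n), 2)
                c[a], c[b] = c[b], c[a]
            children.append(c)
        pop = elite + children
    return min(pop, key=lambda o: schedule_cost(o, p, w))
```

For this objective the weighted-shortest-processing-time rule gives the provable optimum, which makes a convenient correctness check for the GA on small instances.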
Dreaming, Stealing, Dancing, Showing Off.
ERIC Educational Resources Information Center
Lavender, Peter; Taylor, Chris
2002-01-01
Lessons learned from British projects to deliver literacy, numeracy, and English as a second language through community agencies included the following: (1) innovation and measured risks are required to attract hard-to-reach adults; (2) good practice needs to be shared; and (3) projects worked best when government funds were managed by community…
Temperature Corrected Bootstrap Algorithm
NASA Technical Reports Server (NTRS)
Comiso, Joey C.; Zwally, H. Jay
1997-01-01
A temperature corrected Bootstrap Algorithm has been developed using Nimbus-7 Scanning Multichannel Microwave Radiometer data in preparation for the upcoming AMSR instrument aboard ADEOS and EOS-PM. The procedure first calculates the effective surface emissivity using emissivities of ice and water at 6 GHz and a mixing formulation that utilizes ice concentrations derived using the current Bootstrap algorithm but using brightness temperatures from the 6 GHz and 37 GHz channels. These effective emissivities are then used to calculate surface ice temperatures, which in turn are used to convert the 18 GHz and 37 GHz brightness temperatures to emissivities. Ice concentrations are then derived using the same technique as in the Bootstrap algorithm, but using emissivities instead of brightness temperatures. The results show significant improvement in areas where the ice temperature is expected to vary considerably, such as near the continental areas in the Antarctic, where the ice temperature is colder than average, and in marginal ice zones.
Power spectral estimation algorithms
NASA Technical Reports Server (NTRS)
Bhatia, Manjit S.
1989-01-01
Algorithms to estimate the power spectrum using Maximum Entropy Methods were developed. These algorithms were coded in FORTRAN 77 and implemented on the VAX 780. The important considerations in this analysis are: (1) resolution, i.e., how close in frequency two spectral components can be spaced and still be identified; (2) dynamic range, i.e., how small a spectral peak can be, relative to the largest, and still be observed in the spectra; and (3) variance, i.e., how accurately the estimated spectrum matches the actual spectrum. The application of the algorithms based on Maximum Entropy Methods to a variety of data shows that these criteria are met quite well. Additional work in this direction would help confirm the findings. All of the software developed was turned over to the technical monitor. A copy of a typical program is included, as are some of the actual data and the graphs produced from them.
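A minimal Burg-recursion implementation of the maximum entropy method can illustrate the resolution property discussed above; the model order and test signal are illustrative assumptions, not the VAX-era FORTRAN code itself.

```python
import numpy as np

def burg(x, order):
    """Burg's method: reflection-coefficient recursion for AR coefficients."""
    x = np.asarray(x, dtype=float)
    f = x.copy()                    # forward prediction errors
    b = x.copy()                    # backward prediction errors
    a = np.array([1.0])             # AR polynomial, a[0] = 1
    e = np.dot(x, x) / len(x)       # prediction error power
    for _ in range(order):
        fp, bp = f[1:], b[:-1]
        k = -2.0 * np.dot(fp, bp) / (np.dot(fp, fp) + np.dot(bp, bp))
        a = np.concatenate([a, [0.0]])
        a = a + k * a[::-1]         # Levinson-style order update
        e *= 1.0 - k * k
        f, b = fp + k * bp, bp + k * fp
    return a, e

def mem_psd(a, e, freqs):
    """Maximum entropy PSD: e / |A(e^{j 2 pi f})|^2."""
    ks = np.arange(len(a))
    A = np.array([np.sum(a * np.exp(-2j * np.pi * f * ks)) for f in freqs])
    return e / np.abs(A) ** 2
```

The AR poles placed near the unit circle are what give MEM its sharp peaks, i.e. the resolution criterion (1) above, even for short records.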
NASA Astrophysics Data System (ADS)
Syariz, M. A.; Jaelani, L. M.; Subehi, L.; Pamungkas, A.; Koenhardono, E. S.; Sulisetyono, A.
2015-10-01
Sea Surface Temperature (SST) can be retrieved from satellite data, which makes it possible to provide SST records over long periods. Since algorithms that estimate SST from the Landsat 8 thermal bands are site-dependent, an algorithm applicable to Indonesian waters needs to be developed. The aim of this research was to develop SST algorithms for the North Java Island waters. The data used are in-situ data measured on April 22, 2015 and brightness temperature data estimated from the Landsat 8 thermal bands (band 10 and band 11). The algorithm was established using 45 data points by assessing the relation between the measured in-situ data and the estimated brightness temperature, and was then validated using another 40 points. The results showed good performance of the sea surface temperature algorithm, with a coefficient of determination (R2) of 0.912 and a Root Mean Square Error (RMSE) of 0.028.
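The regression form used in this kind of study — a least-squares fit of in-situ SST against satellite brightness temperature, scored by R2 and RMSE — can be sketched on synthetic numbers; the coefficients and noise level below are illustrative, not the paper's calibrated algorithm.

```python
import numpy as np

def fit_sst(bt, sst):
    """Least-squares linear SST model: sst ~ a * bt + b."""
    a, b = np.polyfit(bt, sst, 1)
    return a, b

def score(bt, sst, a, b):
    """Coefficient of determination (R2) and RMSE of the fitted model."""
    pred = a * bt + b
    resid = sst - pred
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((sst - sst.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean(resid ** 2))
    return r2, rmse
```

The same split as in the abstract (fit on one set of match-ups, score on a held-out set) guards against overfitting the site-specific coefficients.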
Defect recognition algorithm based on direction curve in x-ray photos
NASA Astrophysics Data System (ADS)
Li, Hua; Liu, Jianguo
1998-09-01
A new recognition algorithm using a composite operator and direction curves to automatically detect dreg defects in x-ray photos is proposed in this paper. The algorithm is simple, requires little computation, and offers fast detection with good real-time behavior, high detection accuracy and good adaptability. Experiments show that it is a good feature detection method.
ERIC Educational Resources Information Center
Moore, Kristin Anderson; Evans, V. Jeffery; Brooks-Gunn, Jeanne; Roth, Jodie
This paper considers the question "What are good child outcomes?" from the perspectives of developmental psychology, economics, and sociology. Section 1 of the paper examines good child outcomes as characteristics of stage-salient tasks of development. Section 2 emphasizes the acquisition of "human capital," the development of productive traits…
Feature: Healthcare Communication Straight Talk For Good Health Past Issues / Summer 2015 Table of Contents Straight talk with your healthcare provider is important. You and your medical team can then make better decisions for your good ...
ERIC Educational Resources Information Center
Estes, Cheryl; Henderson, Karla
2003-01-01
Presents information to update parks and recreation professionals about what recent research says in regard to enjoyment and the good life, noting what applications this research has for practitioners. The article focuses on: the good life and leisure services; happiness, subjective well-being, and intrinsic motivation; leisure, happiness, and…
Brown, Carolyn T
2013-01-01
This short memoir reflects on the experience of a "good daughter" caring for both parents through their late aging and deaths. The memoir contemplates their personalities as expressed in their aging and the "good daughter's" experience in the death room. Those on a similar journey, whether as travelers, guides, or witnesses, may draw comfort, perhaps reassurance, from this account. PMID:23159687
A test sheet generating algorithm based on intelligent genetic algorithm and hierarchical planning
NASA Astrophysics Data System (ADS)
Gu, Peipei; Niu, Zhendong; Chen, Xuting; Chen, Wei
2013-03-01
In recent years, computer-based testing has become an effective method to evaluate students' overall learning progress so that appropriate guiding strategies can be recommended. Research has been done to develop intelligent test assembling systems which can automatically generate test sheets based on given parameters of test items. A good multisubject test sheet depends on not only the quality of the test items but also the construction of the sheet. Effective and efficient construction of test sheets according to multiple subjects and criteria is a challenging problem. In this paper, a multi-subject test sheet generation problem is formulated and a test sheet generating approach based on intelligent genetic algorithm and hierarchical planning (GAHP) is proposed to tackle this problem. The proposed approach utilizes hierarchical planning to simplify the multi-subject testing problem and adopts genetic algorithm to process the layered criteria, enabling the construction of good test sheets according to multiple test item requirements. Experiments are conducted and the results show that the proposed approach is capable of effectively generating multi-subject test sheets that meet specified requirements and achieve good performance.
A test sheet generating algorithm based on intelligent genetic algorithm and hierarchical planning
NASA Astrophysics Data System (ADS)
Gu, Peipei; Niu, Zhendong; Chen, Xuting; Chen, Wei
2012-04-01
In recent years, computer-based testing has become an effective method to evaluate students' overall learning progress so that appropriate guiding strategies can be recommended. Research has been done to develop intelligent test assembling systems which can automatically generate test sheets based on given parameters of test items. A good multisubject test sheet depends on not only the quality of the test items but also the construction of the sheet. Effective and efficient construction of test sheets according to multiple subjects and criteria is a challenging problem. In this paper, a multi-subject test sheet generation problem is formulated and a test sheet generating approach based on intelligent genetic algorithm and hierarchical planning (GAHP) is proposed to tackle this problem. The proposed approach utilizes hierarchical planning to simplify the multi-subject testing problem and adopts genetic algorithm to process the layered criteria, enabling the construction of good test sheets according to multiple test item requirements. Experiments are conducted and the results show that the proposed approach is capable of effectively generating multi-subject test sheets that meet specified requirements and achieve good performance.
Alpuche Aviles, Jorge E; Pistorius, Stephen; Gordon, Richard; Elbakri, Idris A
2011-01-01
This work presents a first generation incoherent scatter CT (ISCT) hybrid (analytic-iterative) reconstruction algorithm for accurate ρe imaging of objects with clinically relevant sizes. The algorithm reconstructs quantitative images of ρe within a few iterations, avoiding the challenges of optimization-based reconstruction algorithms while addressing the limitations of current analytical algorithms. A 4π detector is conceptualized in order to address the issue of directional dependency and is then replaced with a ring of detectors which detect a constant fraction of the scattered photons. The ISCT algorithm corrects for the attenuation of photons using a limited number of iterations and filtered back projection (FBP) for image reconstruction. This results in a hybrid reconstruction algorithm that was tested with sinograms generated by Monte Carlo (MC) and analytical (AN) simulations. Results show that the ISCT algorithm is weakly dependent on the initial estimate of ρe. Simulation results show that the proposed algorithm reconstructs ρe images with a mean error of -1% ± 3% for the AN model and from -6% to -8% for the MC model. Finally, the algorithm is capable of reconstructing qualitatively good images even in the presence of multiple scatter. The proposed algorithm would be suitable for in-vivo medical imaging as long as practical limitations can be addressed. PMID:21422588
Ensembles of satellite aerosol retrievals based on three AATSR algorithms within aerosol_cci
NASA Astrophysics Data System (ADS)
Kosmale, Miriam; Popp, Thomas
2016-04-01
Ensemble techniques are widely used in the modelling community, combining different modelling results in order to reduce uncertainties. This approach can also be adapted to satellite measurements. Aerosol_cci is an ESA funded project in which most of the European aerosol retrieval groups work together. The different algorithms are homogenized as far as it makes sense, but remain essentially different. Datasets are compared with ground based measurements and with each other. Within this project, three AATSR algorithms (the Swansea University aerosol retrieval, the ADV aerosol retrieval by FMI and the Oxford aerosol retrieval ORAC) provide 17 year global aerosol records, each with uncertainty information at pixel level. In the presented work, an ensemble of the three AATSR algorithms is constructed. The advantage over each single algorithm is the higher spatial coverage due to more measurement pixels per gridbox. Validation against ground-based AERONET measurements still shows good correlation for the ensemble, compared to the single algorithms. Annual mean maps show the global aerosol distribution based on a combination of the three aerosol algorithms. In addition, the pixel level uncertainties of each algorithm are used to weight the contributions, in order to reduce the uncertainty of the ensemble. Results of different versions of the ensemble for aerosol optical depth will be presented and discussed, validated against ground-based AERONET measurements. A higher spatial coverage on a daily basis allows better annual mean maps. The benefit of using pixel level uncertainties is analysed.
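Weighting each algorithm's contribution by its pixel-level uncertainty, as described above, is essentially inverse-variance weighting; a per-gridbox sketch (with made-up AOD values and uncertainties) looks like this.

```python
import numpy as np

def ensemble_aod(values, sigmas):
    """Inverse-variance weighted mean of per-algorithm AOD retrievals,
    with the uncertainty of the combined estimate."""
    values = np.asarray(values, dtype=float)
    weights = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    mean = np.sum(weights * values) / np.sum(weights)
    sigma = np.sqrt(1.0 / np.sum(weights))   # assumes independent errors
    return mean, sigma
```

The combined uncertainty is never larger than the best single algorithm's, which is the formal version of the "reduce the uncertainty of the ensemble" claim; the independence assumption between retrieval errors is, of course, an idealization.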
Good Law, Good Practice, Good Sense: Using Legal Guidelines for Drafting Educational Policies.
ERIC Educational Resources Information Center
Bogotch, Ira E.
1988-01-01
Suggests how to use legal guidelines for drafting educational policies. Analyzes the political context in which present policymaking and governance initiatives exist. Two assumptions frame this article. First, good law makes for good administrative practice. Second, administrator policymaking is more important than the content of the policy…
The Chopthin Algorithm for Resampling
NASA Astrophysics Data System (ADS)
Gandy, Axel; Lau, F. Din-Houn
2016-08-01
Resampling is a standard step in particle filters and more generally in sequential Monte Carlo methods. We present an algorithm, called chopthin, for resampling weighted particles. In contrast to standard resampling methods, the algorithm does not produce a set of equally weighted particles; instead it merely enforces an upper bound on the ratio between the weights. Simulation studies show that the chopthin algorithm consistently outperforms standard resampling methods. The algorithm chops up particles with large weights and thins out particles with low weights, hence its name. It implicitly guarantees a lower bound on the effective sample size. The algorithm can be implemented efficiently, making it practically useful, and we show that the expected computational effort is linear in the number of particles. Implementations for C++, R (on CRAN), Python and Matlab are available.
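The chop/thin idea — split heavy particles and remove mass from light ones so the weight ratio stays bounded — can be illustrated with a deliberately simplified deterministic sketch. Note that this toy version merges light particles instead of stochastically resampling them, so unlike the actual chopthin algorithm it is not an unbiased resampling scheme; it only demonstrates the ratio-bound invariant.

```python
def chop_thin_toy(weights, bound=4.0, max_steps=1000):
    """Enforce max(w)/min(w) <= bound by splitting the heaviest particle
    (chop) and merging the two lightest (thin). Total weight is preserved."""
    w = sorted(float(v) for v in weights)
    for _ in range(max_steps):
        if w[-1] <= bound * w[0]:
            break
        heavy = w.pop()                   # chop: heaviest becomes two halves
        w += [heavy / 2.0, heavy / 2.0]
        w.sort()
        if len(w) > 2:                    # thin: two lightest merge into one
            w.append(w.pop(0) + w.pop(0))
            w.sort()
    return w
```

Bounding the weight ratio directly bounds the effective sample size from below, which is the property the paper exploits.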
VIEW SHOWING WEST ELEVATION, EAST SIDE OF MEYER AVENUE. SHOWS ...
VIEW SHOWING WEST ELEVATION, EAST SIDE OF MEYER AVENUE. SHOWS 499-501, MUNOZ HOUSE (AZ-73-37) ON FAR RIGHT - Antonio Bustamente House, 485-489 South Meyer Avenue & 186 West Kennedy Street, Tucson, Pima County, AZ
15. Detail showing lower chord pinconnected to vertical member, showing ...
15. Detail showing lower chord pin-connected to vertical member, showing floor beam riveted to extension of vertical member below pin-connection, and showing brackets supporting cantilevered sidewalk. View to southwest. - Selby Avenue Bridge, Spanning Short Line Railways track at Selby Avenue between Hamline & Snelling Avenues, Saint Paul, Ramsey County, MN
An algorithm on distributed mining association rules
NASA Astrophysics Data System (ADS)
Xu, Fan
2005-12-01
With the rapid development of the Internet/Intranet, distributed databases have become a broadly used environment in various areas, and it is a critical task to mine association rules in them. Algorithms for distributed mining of association rules can be divided into two classes: DD algorithms and CD algorithms. A DD algorithm focuses on optimizing the data partition so as to enhance efficiency. A CD algorithm, on the other hand, considers a setting where the data is arbitrarily partitioned horizontally among the parties to begin with, and focuses on parallelizing the communication. A DD algorithm is not always applicable, however: by the time the data is generated, it is often already partitioned, and in many cases it cannot be gathered and repartitioned for reasons of security and secrecy, transmission cost, or sheer efficiency. A CD algorithm may be a more appealing solution for systems which are naturally distributed over large expanses, such as stock exchange and credit card systems. The FDM algorithm is an enhancement of the CD algorithm, but both CD and FDM are based on a net structure and execute on non-shareable resources. In practical applications, however, distributed databases are often star-structured. This paper proposes an algorithm based on star-structured networks, which are more practical in application, have lower maintenance costs and are easier to construct. In addition, the algorithm provides high communication efficiency and good extensibility for parallel computation.
Research on registration algorithm for check seal verification
NASA Astrophysics Data System (ADS)
Wang, Shuang; Liu, Tiegen
2008-03-01
Nowadays seals play an important role in China. With the development of the social economy, the traditional method of manual check seal identification can no longer meet the needs of banking transactions. This paper focuses on pre-processing and registration algorithms for check seal verification, using the theory of image processing and pattern recognition. First of all, the complex characteristics of check seals are analyzed. To eliminate differences in producing conditions and the disturbance caused by background and writing in the check image, many methods are used in the pre-processing stage, such as color component transformation, linear transformation to a gray-scale image, median filtering, Otsu thresholding, closing operations and a labeling algorithm from mathematical morphology. After these steps, a good binary seal image is obtained. On the basis of the traditional registration algorithm, a double-level registration method combining rough and precise registration is proposed; the deflection angle of the precise registration method is accurate to 0.1°. This paper introduces the concepts of inside difference and outside difference, and uses their percentages to judge whether a seal is real or fake. The experimental results on a large set of check seals are satisfactory, showing that the presented methods and algorithms have good robustness to noisy sealing conditions and satisfactory tolerance of within-class differences.
Bouc-Wen hysteresis model identification using Modified Firefly Algorithm
NASA Astrophysics Data System (ADS)
Zaman, Mohammad Asif; Sikder, Urmita
2015-12-01
The parameters of the Bouc-Wen hysteresis model are identified using a Modified Firefly Algorithm. The proposed algorithm uses dynamic process control parameters to improve its performance. The algorithm is used to find the model parameter values that result in the least error between a set of given data points and points obtained from the Bouc-Wen model. The performance of the algorithm is compared with that of the conventional Firefly Algorithm, a Genetic Algorithm and the Differential Evolution algorithm in terms of convergence rate and accuracy. Compared to the other three optimization algorithms, the proposed algorithm is found to have a good convergence rate and a high degree of accuracy in identifying Bouc-Wen model parameters. Finally, the proposed method is used to find the Bouc-Wen model parameters from experimental data. The obtained model is found to be in good agreement with the measured data.
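The attraction-based update at the core of the firefly algorithm can be sketched as below. For a self-contained example it identifies the two parameters of a straight line from data rather than the Bouc-Wen model, and the decaying randomness weight is a simplified stand-in for the paper's dynamic process control parameters; all constants are illustrative assumptions.

```python
import numpy as np

def firefly_minimize(f, dim, bounds, n=25, iters=150,
                     beta0=1.0, gamma=0.01, alpha0=0.3, seed=1):
    """Minimal firefly algorithm: each firefly moves toward every brighter
    (lower-cost) firefly, with a random-walk term that decays over time."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n, dim))
    cost = np.array([f(p) for p in x])
    for t in range(iters):
        alpha = alpha0 * 0.97 ** t           # decaying randomness weight
        for i in range(n):
            for j in range(n):
                if cost[j] < cost[i]:        # move i toward brighter j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    step = beta * (x[j] - x[i]) + alpha * (rng.random(dim) - 0.5)
                    x[i] = np.clip(x[i] + step, lo, hi)
                    cost[i] = f(x[i])
    k = int(np.argmin(cost))
    return x[k], cost[k]
```

In the paper's setting, `f` would simulate the Bouc-Wen differential equation for a candidate parameter vector and return the error against the measured hysteresis loop.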
King, L A; Napa, C K
1998-07-01
Two studies examined folk concepts of the good life. Samples of college students (N = 104) and community adults (N = 264) were shown a career survey ostensibly completed by a person rating his or her occupation. After reading the survey, participants judged the desirability and moral goodness of the respondent's life, as a function of the amount of happiness, meaning in life, and wealth experienced. Results revealed significant effects of happiness and meaning on ratings of desirability and moral goodness. In the college sample, individuals high on all 3 independent variables were judged as likely to go to heaven. In the adult sample, wealth was also related to higher desirability. Results suggest a general perception that meaning in life and happiness are essential to the folk concept of the good life, whereas money is relatively unimportant. PMID:9686456
ERIC Educational Resources Information Center
Mikell, William G.; Drinkard, William C.
1984-01-01
Describes safety practices for laboratory fume hoods based on certain assumptions of hood design and performance. Also discusses the procedures in preparing to work at a hood. A checklist of good hood practices is included. (JM)
Good News About Childhood Cancer
Past Issues / Spring 2008 Table of Contents ... 85 percent for the most common form of childhood cancer (acute lymphoblastic leukemia or ALL). During the ...
An innovative localisation algorithm for railway vehicles
NASA Astrophysics Data System (ADS)
Allotta, B.; D'Adamio, P.; Malvezzi, M.; Pugi, L.; Ridolfi, A.; Rindi, A.; Vettori, G.
2014-11-01
The estimation strategy has good performance also under degraded adhesion conditions and could be put on board high-speed railway vehicles; it represents an accurate and reliable solution. The IMU board is tested via a dedicated Hardware in the Loop (HIL) test rig, which includes an industrial robot able to replicate the motion of the railway vehicle. Through the generated experimental outputs, the performance of the innovative localisation algorithm has been evaluated: the HIL test rig made it possible to test the proposed algorithm while avoiding expensive (in terms of time and cost) on-track tests, with encouraging results. In fact, the preliminary results show a significant improvement in the position and speed estimation performance compared to that obtained with the SCMT algorithms currently in use on the Italian railway network.
Communicating for the Good of Your Child.
ERIC Educational Resources Information Center
Lerman, Saf
1985-01-01
Parents can help their children feel secure and have a good self-image by communicating these feelings through words and actions. Suggestions for showing respect, building self-esteem, fostering security and success, and talking to children in a positive way are discussed. (DF)
Planning Good Change with Technology and Literacy.
ERIC Educational Resources Information Center
McKenzie, Jamie
This book describes strategies to put information literacy and student learning at the center of technology planning. Filled with stories of success and with models of good planning, the book shows how to clarify purpose, involve important stakeholders, and pace the change process to maximize the daily use of new technologies. The following…
A Good Teaching Technique: WebQuests
ERIC Educational Resources Information Center
Halat, Erdogan
2008-01-01
In this article, the author first introduces and describes a new teaching tool called WebQuests to practicing teachers. He then provides detailed information about the structure of a good WebQuest. Third, the author shows the strengths and weaknesses of using WebQuests in teaching and learning. Last, he points out the challenges for practicing…
Do good actions inspire good actions in others?
Capraro, Valerio; Marcelletti, Alessandra
2014-01-01
Actions such as sharing food and cooperating to reach a common goal have played a fundamental role in the evolution of human societies. Despite the importance of such good actions, little is known about if and how they can spread from person to person. For instance, does being the recipient of an altruistic act increase your probability of being cooperative with a third party? We have conducted an experiment on Amazon Mechanical Turk to test this mechanism using economic games. We have measured willingness to be cooperative through a standard Prisoner's dilemma and willingness to act altruistically using a binary Dictator game. In the baseline treatments, the endowments needed to play were given by the experimenters, as usual; in the control treatments, they came from a good action made by someone else. Across four different comparisons and a total of 572 subjects, we have never found a significant increase of cooperation or altruism when the endowment came from a good action. We conclude that good actions do not necessarily inspire good actions in others. While this is consistent with the theoretical prediction, it challenges the majority of other experimental studies. PMID:25502617
Depreciation of public goods in spatial public goods games
NASA Astrophysics Data System (ADS)
Shi, Dong-Mei; Zhuang, Yong; Li, Yu-Jian; Wang, Bing-Hong
2011-10-01
In real situations, the value of public goods will be reduced or even lost because of external factors or for intrinsic reasons. In this work, we investigate the evolution of cooperation by considering the effect of depreciation of public goods in spatial public goods games on a square lattice. It is assumed that each individual gains full advantage if the number of cooperators nc within a group centered on that individual equals or exceeds the critical mass (CM). Otherwise, there is depreciation of the public goods, which is realized by rescaling the multiplication factor r to (nc/CM)r. It is shown that the emergence of cooperation is remarkably promoted for CM > 1 even at small values of r, and a global cooperative level is achieved at an intermediate value of CM = 4 at a small r. We further study the effect of depreciation of public goods on different topologies of a regular lattice, and find that the system always reaches global cooperation at a moderate value of CM = G - 1 regardless of whether or not there exist overlapping triangle structures on the regular lattice, where G is the group size of the associated regular lattice.
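The depreciation rule described above (rescaling r to (nc/CM)r below the critical mass) can be sketched directly. This is a minimal illustration; the function name and the equal-split payoff convention are my own assumptions, not the paper's code:

```python
def group_payoffs(strategies, r, cm, cost=1.0):
    """Payoffs in one public-goods group with depreciation.

    strategies: list of 0/1 (1 = cooperator). Each cooperator pays `cost`
    into the pot. If the number of cooperators nc is below the critical
    mass `cm`, the multiplication factor r is rescaled to (nc / cm) * r,
    as in the abstract; otherwise the full r applies. The multiplied pot
    is shared equally among all group members.
    """
    n = len(strategies)
    nc = sum(strategies)
    r_eff = r if nc >= cm else (nc / cm) * r
    share = r_eff * nc * cost / n          # equal share of the multiplied pot
    return [share - cost * s for s in strategies]
```

With r = 3 and CM = 4, a group of five with only two cooperators uses r_eff = 1.5, so cooperation is penalized until the critical mass is reached.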
SAGE II inversion algorithm. [Stratospheric Aerosol and Gas Experiment
NASA Technical Reports Server (NTRS)
Chu, W. P.; Mccormick, M. P.; Lenoble, J.; Brogniez, C.; Pruvost, P.
1989-01-01
The operational Stratospheric Aerosol and Gas Experiment II multichannel data inversion algorithm is described. Aerosol and ozone retrievals obtained with the algorithm are discussed. The algorithm is compared to an independently developed algorithm (Lenoble, 1989), showing that the inverted aerosol and ozone profiles from the two algorithms are similar within their respective uncertainties.
[Drug flow. Good manufacturing practices, good clinical practices].
Dupin-Spriet, T; Spriet, A
1991-01-01
Worldwide, the drug development circuit in clinical trials is undergoing a general movement towards improvement that is sensitive to the degree of quality. The methods used to achieve this lie at the interface of Good Manufacturing Practices (GMP) and Good Clinical Practices (GCP). They are primarily of two types, for which examples are given here: strengthening of controls (verification by a "jury" of the resemblance of test drugs in double-blind comparisons, and computerized systems of drug accountability), and improvement of compliance with therapy at the investigation site (use of more "intelligent" drug packages and labels). PMID:2020929
Research on Routing Selection Algorithm Based on Genetic Algorithm
NASA Astrophysics Data System (ADS)
Gao, Guohong; Zhang, Baojian; Li, Xueyong; Lv, Jinna
The genetic algorithm is a random search and optimization method based on natural selection and the hereditary mechanism of living beings. In recent years, because of its potential for solving complicated problems and its successful application in industrial projects, the genetic algorithm has attracted wide attention from domestic and international scholars. Routing selection has been defined as a standard communication model of IP version 6. This paper proposes a service model for routing selection, and designs and implements a new routing selection algorithm based on a genetic algorithm. The experimental simulation results show that this algorithm finds better solutions in less time and yields a more balanced network load, which enhances the search ratio and the availability of network resources, and improves the quality of service.
Fontana, W.
1990-12-13
In this paper, complex adaptive systems are defined by a self-referential loop in which objects encode functions that act back on these objects. A model for this loop is presented. It uses a simple recursive formal language, derived from the lambda-calculus, to provide a semantics that maps character strings into functions that manipulate symbols on strings. The interaction between two functions, or algorithms, is defined naturally within the language through function composition, and results in the production of a new function. An iterated map acting on sets of functions and a corresponding graph representation are defined. Their properties are useful for discussing the behavior of a fixed-size ensemble of randomly interacting functions. This "function gas", or "Turing gas", is studied under various conditions, and evolves cooperative interaction patterns of considerable intricacy. These patterns adapt under the influence of perturbations consisting of the addition of new random functions to the system. Different organizations emerge depending on the availability of self-replicators.
28. MAP SHOWING LOCATION OF ARVFS FACILITY AS BUILT. SHOWS ...
28. MAP SHOWING LOCATION OF ARVFS FACILITY AS BUILT. SHOWS LINCOLN BOULEVARD, BIG LOST RIVER, AND NAVAL REACTORS FACILITY. F.C. TORKELSON DRAWING NUMBER 842-ARVFS-101-2. DATED OCTOBER 12, 1965. INEL INDEX CODE NUMBER: 075 0101 851 151969. - Idaho National Engineering Laboratory, Advanced Reentry Vehicle Fusing System, Scoville, Butte County, ID
8. Detail showing concrete abutment, showing substructure of bridge, specifically ...
8. Detail showing concrete abutment, showing substructure of bridge, specifically west side of arch and substructure. - Presumpscot Falls Bridge, Spanning Presumpscot River at Allen Avenue extension, 0.75 mile west of U.S. Interstate 95, Falmouth, Cumberland County, ME
ICESat-2 / ATLAS Flight Science Receiver Algorithms
NASA Astrophysics Data System (ADS)
Mcgarry, J.; Carabajal, C. C.; Degnan, J. J.; Mallama, A.; Palm, S. P.; Ricklefs, R.; Saba, J. L.
2013-12-01
This Simulator makes it possible to check all logic paths that could be encountered by the Algorithms on orbit. In addition, the NASA airborne instrument MABEL is collecting data with characteristics similar to what ATLAS will see. MABEL data is being used to test the ATLAS Receiver Algorithms. Further verification will be performed during Integration and Testing of the ATLAS instrument and during Environmental Testing on the full ATLAS instrument. Results from testing to date show the Receiver Algorithms have the ability to handle a wide range of signal and noise levels with very good sensitivity at relatively low signal to noise ratios. In addition, preliminary tests have demonstrated, using the ICESat-2 Science Team's selected land ice and sea ice test cases, the capability of the Algorithms to successfully find and telemeter the surface echoes. In this presentation we will describe the ATLAS Flight Science Receiver Algorithms and the Software Simulator, and will present results of the testing to date. The onboard databases (DEM, DRM and the Surface Reference Mask) are being developed at the University of Texas at Austin as part of the ATLAS Flight Science Receiver Algorithms. Verification of the onboard databases is being performed by ATLAS Receiver Algorithms team members Claudia Carabajal and Jack Saba.
8 CFR 274a.4 - Good faith defense.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 8 Aliens and Nationality 1 2014-01-01 2014-01-01 false Good faith defense. 274a.4 Section 274a.4 Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY IMMIGRATION REGULATIONS CONTROL OF EMPLOYMENT OF ALIENS Employer Requirements § 274a.4 Good faith defense. An employer or a recruiter or referrer for a fee for employment who shows good...
A Pretty Good Paper about Pretty Good Privacy.
ERIC Educational Resources Information Center
McCollum, Roy
With today's growth in the use of electronic information systems for e-mail, data development and research, and the relative ease of access to such resources, protecting one's data and correspondence has become a great concern. "Pretty Good Privacy" (PGP), an encryption program developed by Phil Zimmermann, may be the software tool that will…
Good Practices in Free-energy Calculations
NASA Technical Reports Server (NTRS)
Pohorille, Andrew; Jarzynski, Christopher; Chipot, Christopher
2013-01-01
As access to computational resources continues to increase, free-energy calculations have emerged as a powerful tool that can play a predictive role in drug design. Yet, in a number of instances, the reliability of these calculations can be improved significantly if a number of precepts, or good practices, are followed. For the most part, the theory upon which these good practices rely has been known for many years, but is often overlooked, or simply ignored. In other cases, the theoretical developments are too recent for their potential to be fully grasped and merged into popular platforms for the computation of free-energy differences. The current best practices for carrying out free-energy calculations are reviewed, demonstrating that, at little to no additional cost, free-energy estimates could be markedly improved and bounded by meaningful error estimates. In free-energy perturbation and nonequilibrium work methods, monitoring the probability distributions that underlie the transformation between the states of interest, performing the calculation bidirectionally, stratifying the reaction pathway, and choosing the most appropriate paradigms and algorithms for transforming between states offer significant gains in both accuracy and precision. In thermodynamic integration and probability distribution (histogramming) methods, properly designed adaptive techniques yield nearly uniform sampling of the relevant degrees of freedom and, by doing so, can markedly improve the efficiency and accuracy of free-energy calculations without incurring any additional computational expense.
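As one concrete example of the estimates being monitored, the simplest free-energy perturbation (Zwanzig/EXP) estimator can be sketched as follows; the function name and unit conventions are illustrative assumptions, and running it on both forward and reverse samples gives the bidirectional consistency check the abstract recommends:

```python
import math

def exp_free_energy(delta_u, kT=1.0):
    """Zwanzig exponential (EXP) estimator: dF = -kT ln <exp(-dU/kT)>.

    delta_u: samples of the energy difference dU = U_1 - U_0 evaluated
    on configurations drawn from state 0. Uses a log-sum-exp shift for
    numerical stability, since exp(-dU/kT) can under/overflow.
    """
    n = len(delta_u)
    m = min(delta_u)
    s = sum(math.exp(-(du - m) / kT) for du in delta_u)
    return m - kT * math.log(s / n)
```

By Jensen's inequality the estimate never exceeds the sample mean of dU, which is one quick sanity check on an implementation.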
Sort-Mid tasks scheduling algorithm in grid computing.
Reda, Naglaa M; Tawfik, A; Marzok, Mohamed A; Khamis, Soheir M
2015-11-01
Scheduling tasks on heterogeneous resources distributed over a grid computing system is an NP-complete problem. Many researchers have developed scheduling algorithm variants aimed at optimality, and these have shown good performance in resource selection for task scheduling. However, using the full power of the resources is still a challenge. In this paper, a new heuristic algorithm called Sort-Mid is proposed. It aims to maximize utilization and minimize makespan. The new strategy of the Sort-Mid algorithm is to find appropriate resources. The base step is to obtain each task's average value by sorting its list of completion times. Then, the maximum average is obtained. Finally, the task with the maximum average is allocated to the machine with the minimum completion time. The allocated task is deleted, and these steps are repeated until all tasks are allocated. Experimental tests show that the proposed algorithm outperforms almost all other algorithms in terms of resource utilization and makespan. PMID:26644937
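A minimal sketch of the Sort-Mid steps as described in the abstract (average each task's completion times, pick the task with the maximum average, assign it to the machine with the minimum completion time, repeat); the function name and the machine-load bookkeeping are my own reading of those steps, not the paper's code:

```python
def sort_mid(exec_times):
    """Sketch of the Sort-Mid heuristic as read from the abstract.

    exec_times[t][m] = execution time of task t on machine m.
    Repeatedly: compute each unscheduled task's completion times
    (machine ready time + execution time), average them, pick the task
    with the maximum average, and assign it to the machine that gives
    the minimum completion time. Returns (assignment, makespan).
    """
    n_machines = len(exec_times[0])
    ready = [0.0] * n_machines            # current load of each machine
    unscheduled = set(range(len(exec_times)))
    assignment = {}
    while unscheduled:
        def avg_ct(t):  # average completion time of a remaining task
            cts = [ready[m] + exec_times[t][m] for m in range(n_machines)]
            return sum(cts) / n_machines
        t = max(unscheduled, key=avg_ct)
        m = min(range(n_machines), key=lambda m: ready[m] + exec_times[t][m])
        ready[m] += exec_times[t][m]
        assignment[t] = m
        unscheduled.remove(t)
    return assignment, max(ready)
```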
Nonuniformity correction algorithm based on Gaussian mixture model
NASA Astrophysics Data System (ADS)
Mou, Xin-gang; Zhang, Gui-lin; Hu, Ruo-lan; Zhou, Xiao
2011-08-01
As an important tool for acquiring information about a target scene, infrared detectors are widely used in the imaging guidance field. Because of limits of materials and fabrication techniques, the performance of infrared imaging systems is strongly affected by spatial nonuniformity in the photoresponse of the detectors in the array. The temporal highpass filter (THPF) is a popular adaptive nonuniformity correction (NUC) algorithm because of its simplicity and effectiveness. However, such algorithms still suffer from ghosting artifacts caused by the blind update of parameters, and their performance degrades noticeably when applied to scenes lacking motion. To tackle this problem, a novel adaptive NUC algorithm based on a Gaussian mixture model (GMM) is put forward, building on the traditional THPF. The drift of the detectors is assumed to obey a single Gaussian distribution, and the parameters are updated selectively based on the scene. The GMM is applied in the new algorithm for background modeling, in which the background is updated selectively so as to avoid the influence of foreground targets on the background update, thus eliminating the ghosting artifact. The performance of the proposed algorithm is evaluated on infrared image sequences with simulated and real fixed-pattern noise. The results show more reliable fixed-pattern noise reduction, tracking of the parameter drift, and good adaptability to scene changes.
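For orientation, here is a minimal sketch of the classic THPF baseline the abstract builds on (not the paper's GMM-gated variant, which additionally gates the per-pixel update with a background model); the function name and time-constant parameter are illustrative assumptions:

```python
def thpf_nuc(frames, time_constant=32.0):
    """Minimal temporal high-pass filter (THPF) nonuniformity correction.

    Each pixel's low-frequency component f (fixed-pattern offset plus
    slowly varying scene) is tracked with a recursive low-pass filter,
    and the corrected output is the high-pass residual x - f. Blindly
    updating f is exactly what causes ghosting on static scenes, which
    is the problem the GMM variant addresses.

    frames: list of frames, each a list of rows of pixel values.
    """
    f = [row[:] for row in frames[0]]     # low-pass state, seeded with frame 0
    out = []
    for x in frames:
        corrected = [[x[i][j] - f[i][j] for j in range(len(x[0]))]
                     for i in range(len(x))]
        for i in range(len(x)):
            for j in range(len(x[0])):
                f[i][j] += (x[i][j] - f[i][j]) / time_constant
        out.append(corrected)
    return out
```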
A Comparison of Three Algorithms for Orion Drogue Parachute Release
NASA Technical Reports Server (NTRS)
Matz, Daniel A.; Braun, Robert D.
2015-01-01
The Orion Multi-Purpose Crew Vehicle is susceptible to flipping apex forward between drogue parachute release and main parachute inflation. A smart drogue release algorithm is required to select a drogue release condition that will not result in an apex forward main parachute deployment. The baseline algorithm is simple and elegant, but does not perform as well as desired in drogue failure cases. A simple modification to the baseline algorithm can improve performance, but can also sometimes fail to identify a good release condition. A new algorithm employing simplified rotational dynamics and a numeric predictor to minimize a rotational energy metric is proposed. A Monte Carlo analysis of a drogue failure scenario is used to compare the performance of the algorithms. The numeric predictor prevents more of the cases from flipping apex forward, and also results in an improvement in the capsule attitude at main bag extraction. The sensitivity of the numeric predictor to aerodynamic dispersions, errors in the navigated state, and execution rate is investigated, showing little degradation in performance.
NASA Technical Reports Server (NTRS)
Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.
1993-01-01
New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated using failure data, and represents an approximation of the cumulative distribution function for the underlying population. Statistics (such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic) measure the discrepancy between the EDF and the cumulative distribution function (CDF). These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to obtain the optimum value of the EDF. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
IR and visual image registration based on mutual information and PSO-Powell algorithm
NASA Astrophysics Data System (ADS)
Zhuang, Youwen; Gao, Kun; Miu, Xianghu
2014-11-01
Infrared and visual image registration has wide application in the fields of remote sensing and the military. Mutual information (MI) has proved effective and successful in the infrared and visual image registration process. To find the most appropriate registration parameters, optimization algorithms such as the Particle Swarm Optimization (PSO) algorithm or the Powell search method are often used. The PSO algorithm has strong global search ability and is fast at the beginning, but its search performance is low in the late search stage; in image registration, it often wastes much time on unproductive search and yields solutions of low precision. The Powell search method has strong local search ability, but its performance and running time are sensitive to initial values; in image registration, it is often trapped by local maxima and gets wrong results. In this paper, a novel hybrid algorithm combining the PSO algorithm and the Powell search method is proposed. It combines both advantages, avoiding entrapment in local maxima while achieving higher precision. First, the PSO algorithm is used to obtain registration parameters close to the global optimum; then, starting from this result, the Powell search method is used to find more precise registration parameters. The experimental results show that the algorithm can effectively correct the scale, rotation and translation difference of two images. It can be a good solution for registering infrared and visible images, and performs better in time and precision than the traditional optimization algorithms.
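The two-stage idea (a coarse global PSO pass, then local refinement from its result) can be sketched as follows. The refinement shown is a simple per-coordinate line search standing in for Powell's method (an assumption; the paper uses Powell proper), and all names and parameter values are illustrative:

```python
import random

def pso_then_refine(f, bounds, n_particles=20, iters=60, seed=0):
    """Minimize f with a coarse PSO pass, then refine the best point.

    f: objective; bounds: list of (lo, hi) per dimension. In registration,
    f would be the negative mutual information of the warped image pair.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):                      # stage 1: global PSO
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    step = 0.1
    while step > 1e-6:                          # stage 2: local refinement
        improved = False
        for d in range(dim):
            for s in (+step, -step):
                cand = gbest[:]
                cand[d] += s
                v = f(cand)
                if v < gbest_val:
                    gbest, gbest_val = cand, v
                    improved = True
        if not improved:
            step /= 2.0
    return gbest, gbest_val
```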
NASA Astrophysics Data System (ADS)
Niu, Chaojun; Han, Xiang'e.
2015-10-01
Adaptive optics (AO) technology is an effective way to alleviate the effect of turbulence on free space optical communication (FSO). A new adaptive compensation method can be used without a wave-front sensor. The artificial bee colony (ABC) algorithm is a population-based heuristic evolutionary algorithm inspired by the intelligent foraging behaviour of the honeybee swarm, with the advantages of simplicity, a good convergence rate, robustness, and few parameters to set. In this paper, we simulate the application of the improved ABC algorithm to correct the distorted wavefront and prove its effectiveness. We then simulate the application of the ABC algorithm, the differential evolution (DE) algorithm and the stochastic parallel gradient descent (SPGD) algorithm to the FSO system, and analyze their wavefront correction capabilities by comparing the coupling efficiency, the error rate and the intensity fluctuation under different turbulence conditions before and after correction. The results show that the ABC algorithm has a much faster correction speed than the DE algorithm and better correction ability for strong turbulence than the SPGD algorithm. Intensity fluctuation can be effectively reduced in strong turbulence, but less effectively in weak turbulence.
A Space-Bounded Anytime Algorithm for the Multiple Longest Common Subsequence Problem
Yang, Jiaoyun; Xu, Yun; Shang, Yi; Chen, Guoliang
2014-01-01
The multiple longest common subsequence (MLCS) problem, related to the identification of sequence similarity, is an important problem in many fields. As an NP-hard problem, its exact algorithms have difficulty in handling large-scale data and time- and space-efficient algorithms are required in real-world applications. To deal with time constraints, anytime algorithms have been proposed to generate good solutions with a reasonable time. However, there exists little work on space-efficient MLCS algorithms. In this paper, we formulate the MLCS problem into a graph search problem and present two space-efficient anytime MLCS algorithms, SA-MLCS and SLA-MLCS. SA-MLCS uses an iterative beam widening search strategy to reduce space usage during the iterative process of finding better solutions. Based on SA-MLCS, SLA-MLCS, a space-bounded algorithm, is developed to avoid space usage from exceeding available memory. SLA-MLCS uses a replacing strategy when SA-MLCS reaches a given space bound. Experimental results show SA-MLCS and SLA-MLCS use an order of magnitude less space and time than the state-of-the-art approximate algorithm MLCS-APP while finding better solutions. Compared to the state-of-the-art anytime algorithm Pro-MLCS, SA-MLCS and SLA-MLCS can solve an order of magnitude larger size instances. Furthermore, SLA-MLCS can find much better solutions than SA-MLCS on large size instances. PMID:25400485
On Parallel Push-Relabel based Algorithms for Bipartite Maximum Matching
Langguth, Johannes; Azad, Md Ariful; Halappanavar, Mahantesh; Manne, Fredrik
2014-07-01
We study multithreaded push-relabel based algorithms for computing maximum cardinality matching in bipartite graphs. Matching is a fundamental combinatorial (graph) problem with applications in a wide variety of problems in science and engineering. We are motivated by its use in the context of sparse linear solvers for computing maximum transversal of a matrix. We implement and test our algorithms on several multi-socket multicore systems and compare their performance to state-of-the-art augmenting path-based serial and parallel algorithms using a testset comprised of a wide range of real-world instances. Building on several heuristics for enhancing performance, we demonstrate good scaling for the parallel push-relabel algorithm. We show that it is comparable to the best augmenting path-based algorithms for bipartite matching. To the best of our knowledge, this is the first extensive study of multithreaded push-relabel based algorithms. In addition to a direct impact on the applications using matching, the proposed algorithmic techniques can be extended to preflow-push based algorithms for computing maximum flow in graphs.
An Algorithm for Testing the Efficient Market Hypothesis
Boboc, Ioana-Andreea; Dinică, Mihai-Cristian
2013-01-01
The objective of this research is to examine the efficiency of EUR/USD market through the application of a trading system. The system uses a genetic algorithm based on technical analysis indicators such as Exponential Moving Average (EMA), Moving Average Convergence Divergence (MACD), Relative Strength Index (RSI) and Filter that gives buying and selling recommendations to investors. The algorithm optimizes the strategies by dynamically searching for parameters that improve profitability in the training period. The best sets of rules are then applied on the testing period. The results show inconsistency in finding a set of trading rules that performs well in both periods. Strategies that achieve very good returns in the training period show difficulty in returning positive results in the testing period, this being consistent with the efficient market hypothesis (EMH). PMID:24205148
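Two of the indicators the genetic algorithm draws on (EMA and the MACD line) are standard and easy to sketch; the function names and default spans here are the conventional 12/26 choices, not the paper's GA-tuned parameters:

```python
def ema(prices, span):
    """Exponential moving average with smoothing alpha = 2 / (span + 1)."""
    alpha = 2.0 / (span + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

def macd_signal(prices, fast=12, slow=26):
    """MACD line (fast EMA - slow EMA); positive values lean bullish.
    A GA like the one in the abstract would search over spans such as
    `fast` and `slow` on the training period."""
    ef, es = ema(prices, fast), ema(prices, slow)
    return [a - b for a, b in zip(ef, es)]
```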
Projection Classification Based Iterative Algorithm
NASA Astrophysics Data System (ADS)
Zhang, Ruiqiu; Li, Chen; Gao, Wenhua
2015-05-01
Iterative algorithms perform well in 3D image reconstruction because they do not need complete projection data. They could be applied to inspection of BGA solder joints, which is usually performed with x-ray laminography and yields worse reconstructed images, but their convergence speed is low. This paper explores a projection-classification-based method that separates the object into three parts, i.e. solute, solution and air, and assumes that the reconstruction speed decreases linearly from the solution to the two other parts on both sides. The SART and CAV algorithms are then improved under the proposed idea. Simulation experiments with incomplete projection images indicate the fast convergence speed of the improved iterative algorithms and the effectiveness of the proposed method. The fewer the projection images, the greater the advantage.
Planning a Successful Tech Show
ERIC Educational Resources Information Center
Nikirk, Martin
2011-01-01
Tech shows are a great way to introduce prospective students, parents, and local business and industry to a technology and engineering or career and technical education program. In addition to showcasing instructional programs, a tech show allows students to demonstrate their professionalism and skills, practice public presentations, and interact…
Hey Teacher, Your Personality's Showing!
ERIC Educational Resources Information Center
Paulsen, James R.
1977-01-01
A study of 30 fourth, fifth, and sixth grade teachers and 300 of their students showed that a teacher's age, sex, and years of experience did not relate to students' mathematics achievement, but that more effective teachers showed greater "freedom from defensive behavior" than did less effective teachers. (DT)
... shows the ranges for blood glucose levels after 8 to 12 hours of fasting (not eating). It shows the normal range and the abnormal ranges that are a sign of prediabetes or diabetes. Plasma Glucose Results (mg/dL)* Diagnosis 70 to 99 ...
Computed laminography and reconstruction algorithm
NASA Astrophysics Data System (ADS)
Que, Jie-Min; Cao, Da-Quan; Zhao, Wei; Tang, Xiao; Sun, Cui-Li; Wang, Yan-Fang; Wei, Cun-Feng; Shi, Rong-Jian; Wei, Long; Yu, Zhong-Qiang; Yan, Yong-Lian
2012-08-01
Computed laminography (CL) is an alternative to computed tomography if large objects are to be inspected with high resolution. This is especially true for planar objects. In this paper, we set up a new scanning geometry for CL, and study the algebraic reconstruction technique (ART) for CL imaging. We compare the results of ART with different weighting functions by computer simulation with a digital phantom. This proves that the ART algorithm is a good choice for the CL system.
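A minimal sketch of the plain ART/Kaczmarz sweep behind this approach; the weighting functions compared in the paper would modify the row handling, and the function name and parameters are illustrative:

```python
import numpy as np

def art(A, b, n_sweeps=50, relax=1.0, x0=None):
    """Algebraic reconstruction technique (Kaczmarz sweeps).

    Each row a_i of the system matrix A encodes one ray; the estimate x
    is projected onto the hyperplane a_i . x = b_i, one ray at a time:
        x <- x + relax * (b_i - a_i . x) / ||a_i||^2 * a_i
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.zeros(A.shape[1]) if x0 is None else np.asarray(x0, dtype=float)
    row_norms = (A * A).sum(axis=1)
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            if row_norms[i] > 0:
                x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x
```

For a consistent system the sweeps converge to a solution of Ax = b, which is what makes ART usable with the incomplete projection data typical of CL.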
Linear Bregman algorithm implemented in parallel GPU
NASA Astrophysics Data System (ADS)
Li, Pengyan; Ke, Jue; Sui, Dong; Wei, Ping
2015-08-01
At present, most compressed sensing (CS) algorithms converge slowly and are thus difficult to run on a PC. To deal with this issue, we use a parallel GPU to implement a broadly used compressed sensing algorithm, the linear Bregman algorithm. The linear iterative Bregman algorithm is a reconstruction algorithm proposed by Osher and Cai. Compared with other CS reconstruction algorithms, the linear Bregman algorithm involves only vector and matrix multiplication and a thresholding operation, and is simpler and more efficient to program. We use C as the development language and adopt CUDA (Compute Unified Device Architecture) as the parallel computing architecture. In this paper, we compare the parallel Bregman algorithm with the traditional CPU implementation of the Bregman algorithm. In addition, we also compare the parallel Bregman algorithm with other CS reconstruction algorithms, such as the OMP and TwIST algorithms. Compared with these two algorithms, the results show that the parallel Bregman algorithm needs less time, and is thus more convenient for real-time object reconstruction, which is important given people's fast-growing demands on information technology.
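The abstract's point that the algorithm needs only matrix-vector products and soft-thresholding can be seen in a minimal CPU/NumPy sketch of the linearized Bregman iteration; the parameter choices here are common defaults, not the paper's:

```python
import numpy as np

def linearized_bregman(A, b, mu=5.0, delta=None, n_iter=2000):
    """Linearized Bregman iteration for min ||x||_1 s.t. Ax = b (sketch).

    Each step is just matrix-vector products plus soft-thresholding,
    which is what makes it map so well onto a GPU:
        v <- v + A^T (b - A x)
        x <- delta * shrink(v, mu)
    with shrink(v, mu) = sign(v) * max(|v| - mu, 0).
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    if delta is None:
        # step size of 1/||A||_2^2 for stability (a common choice)
        delta = 1.0 / np.linalg.norm(A, 2) ** 2
    v = np.zeros(A.shape[1])
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        v += A.T @ (b - A @ x)
        x = delta * np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)
    return x
```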
On mapping systolic algorithms onto the hypercube
Ibarra, O.H.; Sohn, S.M.
1990-01-01
Much effort has been devoted toward developing efficient algorithms for systolic arrays. Here the authors consider the problem of mapping these algorithms into efficient algorithms for a fixed-size hypercube architecture. They describe in detail several optimal implementations of algorithms given for one-way one- and two-dimensional systolic arrays. Since interprocessor communication is many times slower than local computation in parallel computers built to date, the problem of efficient communication is specifically addressed for these mappings. In order to experimentally validate the technique, five systolic algorithms were mapped in various ways onto a 64-node NCUBE/7 MIMD hypercube machine. The algorithms are for the following problems: the shuffle scheduling problem, finite impulse response filtering, linear context-free language recognition, matrix multiplication, and computing the Boolean transitive closure. Experimental evidence indicates that good performance is obtained for the mappings.
ERIC Educational Resources Information Center
Chase, Barbara
2011-01-01
How are independent schools to be useful to the wider world? Beyond their common commitment to educate their students for meaningful lives in service of the greater good, can they educate a broader constituency and, thus, share their resources and skills more broadly? Their answers to this question will be shaped by their independence. Any…
Gender Play and Good Governance
ERIC Educational Resources Information Center
Powell, Mark
2008-01-01
Like good government, thoughtful care of children requires those in power, whether teachers or parents, to recognize when it is appropriate for them to step back from day-to-day decision-making while still working behind the scenes to ensure an organizational structure that supports the independence and equitable development of those they serve.…
ERIC Educational Resources Information Center
Westwood, Andy
Some new work is good work. Quality is ultimately defined by the individual. However, these perceptions are inevitably colored by the circumstances in which people find themselves, by the time, place, and wide range of motivations for having to do a particular job in the first place. One person's quality may be another's purgatory and vice versa.…
Practicing Good Habits, Grade 2.
ERIC Educational Resources Information Center
Nguyen Van Quan; And Others
This illustrated primer, designed for second grade students in Vietnam, consists of stories depicting rural family life in Vietnam. The book is divided into the following six chapters: (1) Practicing Good Habits (health, play, helpfulness); (2) Duties at Home (grandparents, father and mother, servants, the extended family); (3) Duties in School…
Practicing Good Habits, Grade 1.
ERIC Educational Resources Information Center
Huynh Cong Tu; And Others
This primer, intended for use during the child's first year in elementary school in Vietnam, relates the story of the daily lives of Hong, age 10, and her brother Lac, age 7, at home and at school. The 64 lessons are divided into four chapters: (1) Good Habits (personal hygiene, grooming, dressing, obedience, truthfulness); (2) At Home: Father and…
Measuring Goodness of Story Narratives
ERIC Educational Resources Information Center
Le, Karen; Coelho, Carl; Mozeiko, Jennifer; Grafman, Jordan
2011-01-01
Purpose: The purpose of this article was to evaluate a new measure of story narrative performance: story completeness. It was hypothesized that by combining organizational (story grammar) and completeness measures, story "goodness" could be quantified. Method: Discourse samples from 46 typically developing adults were compared with those from 24…
ERIC Educational Resources Information Center
Bigler, Rebecca S.
2005-01-01
It happens every day across the nation: Teachers welcome their students to class by saying, "Good morning, boys and girls." It is one of countless ways teachers highlight gender with their speech and behavior. Unfortunately, teachers' use of gender to label students and organize the classroom can have negative consequences. New research in the…
ERIC Educational Resources Information Center
Croxall, Kathy C.; Gubler, Rea R.
2006-01-01
Everyone loves a good story. Reading brings back pleasant memories of being read to by parents or others. Literacy is encouraged when students are continually exposed to stories and books. Teachers can encourage students to discover their parents' favorite stories and share them with the class. In this article, the authors recommend the use of…
Metrics for Soft Goods Merchandising.
ERIC Educational Resources Information Center
Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.
Designed to meet the job-related metric measurement needs of students interested in soft goods merchandising, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…
Metrics for Hard Goods Merchandising.
ERIC Educational Resources Information Center
Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.
Designed to meet the job-related metric measurement needs of students interested in hard goods merchandising, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…
"What's the Plan?": "Good Management Begins with Good People"
ERIC Educational Resources Information Center
Vicars, Dennis
2008-01-01
In order for a successful center/school to achieve all it can for its children, staff, and operator, a plan is critical. Good planning begins by looking into the future that one wants for his or her center/school. Be as descriptive as possible in writing down the details of what that future looks like. Next, walk backwards from that future to the…
Satellite Movie Shows Erika Dissipate
This animation of visible and infrared imagery from NOAA's GOES-West satellite from Aug. 27 to 29 shows Tropical Storm Erika move through the Eastern Caribbean Sea and dissipate near eastern Cuba. ...
Performance Analysis of Selective Breeding Algorithm on One Dimensional Bin Packing Problems
NASA Astrophysics Data System (ADS)
Sriramya, P.; Parvathavarthini, B.
2012-12-01
The bin packing optimization problem packs a set of objects into a set of bins so that the amount of wasted space is minimized; the objective is to find a feasible assignment of all weights to bins that minimizes the total number of bins used. The bin packing problem has many important applications, modeling practical problems in areas as diverse as industrial control, computer systems, machine scheduling, and VLSI chip layout. The selective breeding algorithm (SBA) is an iterative procedure that borrows the ideas of artificial selection and the breeding process. By simulating artificial evolution in this way, SBA can solve complex problems with ease. One-dimensional bin packing benchmark problems are used to evaluate the performance of the SBA. The computational results show that SBA reaches the optimal solution for the tested benchmark problems. The proposed SBA is a good problem-solving technique for one-dimensional bin packing problems.
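The selection-and-breeding loop described above can be sketched in a few lines. The decoder, population size, and swap mutation below are illustrative assumptions, not the paper's exact SBA operators: candidate packings are permutations of the items, decoded with first-fit, and the best half of each generation breeds mutated copies.

```python
import random

def first_fit(order, weights, capacity):
    """Decode a permutation of item indices into bins via first-fit."""
    bins = []
    for i in order:
        for b in bins:
            if sum(b) + weights[i] <= capacity:
                b.append(weights[i])
                break
        else:
            bins.append([weights[i]])  # no bin fits: open a new one
    return len(bins)

def selective_breeding(weights, capacity, pop=20, gens=50, seed=0):
    """Evolve packing orders: keep the best half, breed mutated copies."""
    rng = random.Random(seed)
    population = [rng.sample(range(len(weights)), len(weights))
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda o: first_fit(o, weights, capacity))
        survivors = population[:pop // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.randrange(len(child)), rng.randrange(len(child))
            child[i], child[j] = child[j], child[i]  # swap mutation
            children.append(child)
        population = survivors + children
    best = min(population, key=lambda o: first_fit(o, weights, capacity))
    return first_fit(best, weights, capacity)
```

For six items of total weight 20 and capacity 10, the lower bound is two bins, and the search should find a packing at or near it.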
Combining algorithms in automatic detection of QRS complexes in ECG signals.
Meyer, Carsten; Fernández Gavela, José; Harris, Matthew
2006-07-01
QRS complex and specifically R-peak detection is the crucial first step in every automatic electrocardiogram analysis. Much work has been carried out in this field, using various methods ranging from filtering and threshold methods, through wavelet methods, to neural networks and others. Performance is generally good, but each method has situations where it fails. In this paper, we suggest an approach to automatically combine different QRS complex detection algorithms, here the Pan-Tompkins and wavelet algorithms, to benefit from the strengths of both methods. In particular, we introduce parameters that balance the contributions of the individual algorithms; these parameters are estimated in a data-driven way. Experimental results and analysis are provided on the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) Arrhythmia Database. We show that our combination approach outperforms both individual algorithms. PMID:16871713
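A minimal sketch of such a weighted combination, assuming each detector reports R-peak sample indices and carries a reliability weight (the weights, tolerance, and acceptance threshold here are illustrative placeholders, not the paper's data-driven estimates):

```python
def fuse_detections(peaks_a, peaks_b, w_a=0.6, w_b=0.4, tol=5, thresh=0.5):
    """Weighted fusion of two R-peak candidate lists (sample indices).

    A candidate is kept if the summed weight of the algorithms that
    detected a peak within `tol` samples exceeds `thresh`; candidates
    closer than `tol` to an accepted peak are merged into it."""
    candidates = sorted(set(peaks_a) | set(peaks_b))
    fused = []
    for c in candidates:
        score = 0.0
        if any(abs(c - p) <= tol for p in peaks_a):
            score += w_a
        if any(abs(c - p) <= tol for p in peaks_b):
            score += w_b
        if score > thresh and (not fused or c - fused[-1] > tol):
            fused.append(c)
    return fused
```

With these weights a peak seen only by the first detector survives, while one seen only by the second does not, mimicking an asymmetric trust in the two methods.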
Study on algorithm and real-time implementation of infrared image processing based on FPGA
NASA Astrophysics Data System (ADS)
Pang, Yulin; Ding, Ruijun; Liu, Shanshan; Chen, Zhe
2010-10-01
With the fast development of Infrared Focal Plane Array (IRFPA) detectors, high-quality real-time image processing becomes more important in infrared imaging systems. Facing the demand for better visual effect and good performance, we find the FPGA an ideal hardware choice for realizing image processing algorithms, taking full advantage of its high speed, high reliability, and ability to process large amounts of data in parallel. In this paper, a new dynamic linear extension algorithm is introduced, which automatically finds the proper extension range. This image enhancement algorithm is designed in Verilog HDL and realized on an FPGA. It runs faster than serial processing devices such as CPUs and DSPs. Experiments show that this hardware implementation of the dynamic linear extension algorithm effectively enhances the visual quality of infrared images.
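The automatic range-finding idea can be prototyped in software before committing it to Verilog. Here is a percentile-based sketch; the paper's exact range-selection rule is not given in the abstract, so the 1%/99% cut-offs are assumptions:

```python
import numpy as np

def dynamic_linear_stretch(img, low_pct=1.0, high_pct=99.0):
    """Linearly stretch intensities to 8 bits, picking the extension
    range automatically from histogram percentiles (robust to hot and
    dead pixels at the extremes of an IR frame)."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    if hi <= lo:                       # flat image: nothing to stretch
        return np.zeros_like(img, dtype=np.uint8)
    out = (img.astype(float) - lo) / (hi - lo)
    return (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8)
```

A 14-bit-style raw frame with a narrow dynamic range maps to the full 0-255 display range.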
A simple parallel prefix algorithm for compact finite-difference schemes
NASA Technical Reports Server (NTRS)
Sun, Xian-He; Joslin, Ronald D.
1993-01-01
A compact scheme is a discretization scheme that is advantageous in obtaining highly accurate solutions. However, the resulting systems from compact schemes are tridiagonal systems that are difficult to solve efficiently on parallel computers. Considering the almost symmetric Toeplitz structure, a parallel algorithm, simple parallel prefix (SPP), is proposed. The SPP algorithm requires less memory than the conventional LU decomposition and is highly efficient on parallel machines. It consists of a prefix communication pattern and AXPY operations. Both the computation and the communication can be truncated without degrading the accuracy when the system is diagonally dominant. A formal accuracy study was conducted to provide a simple truncation formula. Experimental results were measured on a MasPar MP-1 SIMD machine and on a Cray 2 vector machine. Experimental results show that the simple parallel prefix algorithm is a good algorithm for the compact scheme on high-performance computers.
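The prefix communication pattern at the heart of the method can be illustrated with a generic inclusive scan. This sketch shows only the logarithmic doubling structure executed sequentially; it is not the SPP algorithm's Toeplitz-specific arithmetic:

```python
def prefix_scan(values, op):
    """Inclusive prefix scan using the logarithmic doubling pattern.

    Executed sequentially here, but each pass of the outer loop maps
    to one parallel round in the Hillis-Steele scheme: element i
    combines with element i - step, and step doubles each round."""
    a = list(values)
    step = 1
    while step < len(a):
        a = [op(a[i - step], a[i]) if i >= step else a[i]
             for i in range(len(a))]
        step *= 2
    return a
```

For n elements the scan finishes in ceil(log2 n) rounds, which is why the prefix formulation pays off on parallel machines even though it does more total work than a serial sweep.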
An improved distributed routing algorithm for Benes based optical NoC
NASA Astrophysics Data System (ADS)
Zhang, Jing; Gu, Huaxi; Yang, Yintang
2010-08-01
Integrated optical interconnect is believed to be one of the main technologies to replace electrical wires. Optical Network-on-Chip (ONoC) has therefore attracted increasing attention. The Benes topology is a good choice for ONoC because of its rearrangeable non-blocking character, multistage feature, and easy scalability. The routing algorithm plays an important role in determining the performance of an ONoC. Because traditional routing algorithms for Benes networks are not suitable for ONoC communication, we developed a new distributed routing algorithm for Benes ONoC in this paper. Our algorithm selects the routing path dynamically according to network conditions and enables more path choices for messages traveling through the network. We used OPNET to evaluate the performance of our routing algorithm and compared it with a well-known bit-controlled routing algorithm. End-to-end (ETE) delay and throughput are shown for different packet lengths and network sizes. Simulation results show that our routing algorithm provides better performance for ONoC.
Stability of Bareiss algorithm
NASA Astrophysics Data System (ADS)
Bojanczyk, Adam W.; Brent, Richard P.; de Hoog, F. R.
1991-12-01
In this paper, we present a numerical stability analysis of the Bareiss algorithm for solving a symmetric positive definite Toeplitz system of linear equations. We also compare the Bareiss algorithm with the Levinson algorithm and conclude that the former has superior numerical properties.
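For reference, the Levinson recursion that the paper compares against can be sketched directly. This follows the textbook formulation for symmetric positive definite Toeplitz systems and is an illustration, not the authors' implementation; it solves T x = b in O(n^2) operations instead of the O(n^3) of a general dense solver:

```python
import numpy as np

def levinson(r, b):
    """Solve T x = b where T is symmetric positive definite Toeplitz
    with first row r (so T[i, j] = r[|i - j|] and r[0] > 0)."""
    r = np.asarray(r, dtype=float)
    b = np.asarray(b, dtype=float) / r[0]  # normalize so the diagonal is 1
    r = r / r[0]
    n = len(b)
    x = np.array([b[0]])
    if n == 1:
        return x
    y = np.array([-r[1]])                  # Yule-Walker subproblem solution
    alpha, beta = -r[1], 1.0
    for k in range(1, n):
        beta *= 1.0 - alpha * alpha
        mu = (b[k] - r[1:k + 1] @ x[::-1]) / beta
        x = np.append(x + mu * y[::-1], mu)
        if k < n - 1:
            alpha = -(r[k + 1] + r[1:k + 1] @ y[::-1]) / beta
            y = np.append(y + alpha * y[::-1], alpha)
    return x
```

A quick check against a dense solve on a small diagonally dominant (hence positive definite) Toeplitz matrix confirms the recursion.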
Jiang, Wenjuan; Shi, Yunbo; Zhao, Wenjie; Wang, Xiangxin
2016-01-01
The main part of the magnetic fluxgate sensor is the magnetic core, whose hysteresis characteristic affects the performance of the sensor. When fluxgate sensors are modelled for design purposes, an accurate model of the hysteresis characteristic of the core is necessary to achieve good agreement between modelled and experimental data. The Jiles-Atherton model is simple and reflects the hysteresis properties of magnetic materials precisely, which makes it widely used in hysteresis modelling and simulation of ferromagnetic materials. However, in practice, it is difficult to determine the parameters accurately owing to their sensitivity. In this paper, the Biogeography-Based Optimization (BBO) algorithm is applied to identify the Jiles-Atherton model parameters. To enhance the performance of the BBO algorithm in global search capability, search accuracy, and convergence rate, an improved Biogeography-Based Optimization (IBBO) algorithm is put forward using an Arnold map and the mutation strategy of the Differential Evolution (DE) algorithm. Simulation results show that the IBBO algorithm is superior to the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Differential Evolution, and BBO algorithms in identification accuracy and convergence rate. The IBBO algorithm is applied to identify the Jiles-Atherton model parameters of a selected permalloy. The simulated hysteresis loop is in close agreement with experimental data. Using the permalloy as the core of a fluxgate probe, the simulated output is consistent with the experimental output. The IBBO algorithm can identify the parameters of the Jiles-Atherton model accurately, which provides a basis for the precise analysis and design of instruments and equipment with magnetic cores. PMID:27347974
A Multigrid Algorithm for Immersed Interface Problems
NASA Technical Reports Server (NTRS)
Adams, Loyce
1996-01-01
Many physical problems involve interior interfaces across which the coefficients in the problem, the solution, its derivatives, the flux, or the source term may have jumps. These interior interfaces may or may not align with an underlying Cartesian grid. Zhilin Li, in his dissertation, showed how to discretize such elliptic problems to second-order accuracy using only a Cartesian grid and the known jump conditions. In this paper, we describe how to apply the full multigrid algorithm in this context. In particular, the restriction, interpolation, and coarse grid problem are described. Numerical results for several model problems demonstrate that good convergence rates can be obtained even when jumps in the coefficients are large and do not align with the grid.
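The smooth-restrict-solve-prolongate structure of a multigrid cycle can be sketched on a plain 1-D Poisson problem. Injection restriction, a direct coarse solve, and the absence of any interface are simplifying assumptions here; the paper's interface-aware transfer operators are considerably more involved:

```python
import numpy as np

def jacobi(u, f, h, sweeps=3, omega=2 / 3):
    """Weighted-Jacobi smoothing for -u'' = f with zero boundary values."""
    for _ in range(sweeps):
        u[1:-1] = ((1 - omega) * u[1:-1]
                   + omega * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1]))
    return u

def v_cycle(u, f, h):
    """One two-grid V-cycle: smooth, restrict the residual, solve the
    coarse problem, prolongate the correction, smooth again."""
    u = jacobi(u, f, h)                                    # pre-smoothing
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    rc = r[::2].copy()                                     # restriction (injection)
    nc = len(rc)
    A = (np.diag(2.0 * np.ones(nc - 2))
         - np.diag(np.ones(nc - 3), 1)
         - np.diag(np.ones(nc - 3), -1)) / (2 * h) ** 2    # coarse operator
    ec = np.zeros(nc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])                # direct coarse solve
    e = np.interp(np.arange(len(u)), np.arange(0, len(u), 2), ec)
    u = u + e                                              # coarse-grid correction
    return jacobi(u, f, h)                                 # post-smoothing
```

For f = pi^2 sin(pi x) the exact solution is sin(pi x), and a handful of cycles starting from zero drives the error down to near the discretization level.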
National Orange Show Photovoltaic Demonstration
Dan Jimenez; Sheri Raborn, CPA; Tom Baker
2008-03-31
National Orange Show Photovoltaic Demonstration created a 400 KW photovoltaic self-generation plant at the National Orange Show Events Center (NOS). The NOS owns a 120-acre state fairground where it operates an events center and produces an annual citrus fair known as the Orange Show. The NOS governing board wanted to employ cost-saving programs for annual energy expenses. It is hoped the photovoltaic program will result in overall savings for the NOS, help reduce the State's energy demands related to electrical power consumption, improve quality of life within the affected grid area, and increase the energy efficiency of buildings at the venue. In addition, the potential to reduce operational expenses would have a tremendous effect on the ability of the NOS to serve its community.
Spectral Regularization Algorithms for Learning Large Incomplete Matrices.
Mazumder, Rahul; Hastie, Trevor; Tibshirani, Robert
2010-03-01
We use convex relaxation techniques to provide a sequence of regularized low-rank solutions for large-scale matrix completion problems. Using the nuclear norm as a regularizer, we provide a simple and very efficient convex algorithm for minimizing the reconstruction error subject to a bound on the nuclear norm. Our algorithm Soft-Impute iteratively replaces the missing elements with those obtained from a soft-thresholded SVD. With warm starts this allows us to efficiently compute an entire regularization path of solutions on a grid of values of the regularization parameter. The computationally intensive part of our algorithm is in computing a low-rank SVD of a dense matrix. Exploiting the problem structure, we show that the task can be performed with a complexity linear in the matrix dimensions. Our semidefinite-programming algorithm is readily scalable to large matrices: for example, it can obtain a rank-80 approximation of a 10^6 × 10^6 incomplete matrix with 10^5 observed entries in 2.5 hours, and can fit a rank-40 approximation to the full Netflix training set in 6.6 hours. Our methods show very good performance in both training and test error when compared to other competitive state-of-the-art techniques. PMID:21552465
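The core iteration is easy to state: fill the missing entries from the current estimate, then soft-threshold the singular values of the completed matrix. A dense-matrix sketch follows; the paper's scalable version exploits the sparse-plus-low-rank structure and warm starts rather than computing a full SVD each pass:

```python
import numpy as np

def soft_impute(X, mask, lam, n_iter=100):
    """Soft-Impute sketch: repeatedly impute missing entries from the
    current low-rank estimate, then shrink the singular values by lam."""
    Z = np.zeros_like(X)
    for _ in range(n_iter):
        filled = np.where(mask, X, Z)           # keep observed, impute the rest
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        Z = (U * np.maximum(s - lam, 0.0)) @ Vt  # soft-thresholded SVD
    return Z
```

On a rank-one matrix with most entries observed, the estimate fits the observed entries far better than the zero-fill baseline.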
Public Good Diffusion Limits Microbial Mutualism
NASA Astrophysics Data System (ADS)
Menon, Rajita; Korolev, Kirill S.
2015-04-01
Standard game theory cannot describe microbial interactions mediated by diffusible molecules. Nevertheless, we show that one can still model microbial dynamics using game theory with parameters renormalized by diffusion. Contrary to expectations, greater sharing of metabolites reduces the strength of cooperation and leads to species extinction via a nonequilibrium phase transition. We report analytic results for the critical diffusivity and the length scale of species intermixing. A species that produces the public good more slowly is favored by selection when fitness saturates with nutrient concentration.
Switch for Good Community Program
Crawford, Tabitha; Amran, Martha
2013-11-19
Switch4Good is an energy-savings program that helps residents reduce consumption through behavior changes; it was co-developed by Balfour Beatty Military Housing Management (BB) and WattzOn in Phase I of this grant. The program was offered at 11 Navy bases. Three customer engagement strategies were evaluated, and Digital Nudges (a combination of monthly consumption statements with frequent messaging via text or email) was found to be the most cost-effective.
Statistical behaviour of adaptive multilevel splitting algorithms in simple models
Rolland, Joran; Simonnet, Eric
2015-02-15
Adaptive multilevel splitting algorithms have been introduced rather recently for estimating tail distributions in a fast and efficient way. In particular, they can be used for computing the so-called reactive trajectories corresponding to direct transitions from one metastable state to another. The algorithm is based on successive selection-mutation steps performed on the system in a controlled way. It has two intrinsic parameters: the number of particles/trajectories and the reaction coordinate used for discriminating between good and bad trajectories. We first investigate the convergence in law of the algorithm as a function of the timestep for several simple stochastic models. Second, we consider the average duration of reactive trajectories, for which no theoretical predictions exist. The most important aspect of this work concerns some systems with two degrees of freedom. They are studied in detail as a function of the reaction coordinate in the asymptotic regime where the number of trajectories goes to infinity. We show that during phase transitions, the statistics of the algorithm deviate significantly from known theoretical results when non-optimal reaction coordinates are used. In this case, the variance of the algorithm peaks at the transition and the convergence of the algorithm can be much slower than the usual expected central limit behaviour. The duration of trajectories is affected as well. Moreover, reactive trajectories do not correspond to the most probable ones. Such behaviour disappears when using the optimal reaction coordinate, called the committor, as predicted by the theory. We finally investigate a three-state Markov chain which reproduces this phenomenon and show logarithmic convergence of the trajectory durations.
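A minimal sketch of the selection-mutation loop, estimating the tail probability that a Gaussian random walk exceeds a level, with the running maximum as reaction coordinate. The model, particle count, and branching rule are illustrative choices; the paper studies more general stochastic dynamics and reaction coordinates:

```python
import random

def simulate_walk(rng, n_steps):
    """One Gaussian random walk path from the origin."""
    path, x = [0.0], 0.0
    for _ in range(n_steps):
        x += rng.gauss(0.0, 1.0)
        path.append(x)
    return path

def ams_estimate(level, n_particles=50, n_steps=30, seed=2):
    """Adaptive multilevel splitting estimate of P(max of walk > level).

    Each iteration kills the particle with the lowest running maximum
    (the reaction coordinate) and rebranches it from a better particle
    at the first point where that particle exceeds the killed score;
    every kill multiplies the estimate by (N - 1) / N."""
    rng = random.Random(seed)
    paths = [simulate_walk(rng, n_steps) for _ in range(n_particles)]
    prob = 1.0
    while True:
        scores = [max(p) for p in paths]
        worst = min(range(n_particles), key=scores.__getitem__)
        if scores[worst] >= level:
            break                           # every particle reached the level
        better = [i for i in range(n_particles) if scores[i] > scores[worst]]
        if not better:
            break                           # degenerate case: all scores equal
        prob *= (n_particles - 1) / n_particles
        donor = rng.choice(better)
        cut = next(i for i, v in enumerate(paths[donor]) if v > scores[worst])
        new, x = paths[donor][:cut + 1], paths[donor][cut]
        for _ in range(n_steps - cut):      # fresh randomness after the branch
            x += rng.gauss(0.0, 1.0)
            new.append(x)
        paths[worst] = new
    return prob
```

The estimate is a product of per-kill survival factors, so it stays strictly between 0 and 1 whenever at least one selection step occurs.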
Melo; Puga; Gentil; Brito; Alves; Ramos
2000-05-01
Molecular dynamics is a well-known technique widely used in the study of biomolecular systems. The trajectory files produced by molecular dynamics simulations are extensive, and classical lossless algorithms compress them poorly. In this work, a new specific algorithm, named byte structure variable length coding (BS-VLC), is introduced. Trajectory files, obtained by molecular dynamics applied to trypsin and a trypsin:pancreatic trypsin inhibitor complex, were compressed using four classical lossless algorithms (Huffman, adaptive Huffman, LZW, and LZ77) as well as the BS-VLC algorithm. The results show that BS-VLC nearly triples the compression efficiency of the best classical lossless algorithm while preserving near-lossless behavior. Compression efficiencies close to 50% can be obtained with a high degree of precision, and the maximum efficiency possible within this algorithm (75%) can be achieved with good precision. PMID:10850759
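The flavor of byte-oriented variable-length coding of trajectories can be shown with a generic delta + zigzag + varint scheme over quantized coordinates. This is a stand-in illustrating how frame-to-frame coherence shrinks the byte stream; the actual BS-VLC byte layout is not specified in the abstract:

```python
def encode(values, scale=1000):
    """Quantize floats, delta-code them, and emit zigzag varint bytes."""
    out, prev = bytearray(), 0
    for v in values:
        q = round(v * scale)
        d, prev = q - prev, q
        z = (d << 1) ^ (d >> 63)        # zigzag: small |d| -> small code
        while z >= 0x80:                # 7 payload bits per byte
            out.append((z & 0x7F) | 0x80)
            z >>= 7
        out.append(z)
    return bytes(out)

def decode(data, scale=1000):
    """Invert encode(): varint bytes -> zigzag -> deltas -> floats."""
    values, i, prev = [], 0, 0
    while i < len(data):
        z, shift = 0, 0
        while data[i] & 0x80:
            z |= (data[i] & 0x7F) << shift
            shift += 7
            i += 1
        z |= data[i] << shift
        i += 1
        prev += (z >> 1) ^ -(z & 1)     # undo zigzag, accumulate delta
        values.append(prev / scale)
    return values
```

Coordinates that change little between frames produce one- or two-byte codes, well under the eight bytes of a raw double, which is the near-lossless trade-off the abstract describes.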
Oyana, Tonny J.; Achenie, Luke E. K.; Heo, Joon
2012-01-01
The objective of this paper is to introduce an efficient algorithm, namely the mathematically improved learning-self organizing map (MIL-SOM) algorithm, which speeds up the self-organizing map (SOM) training process. In the proposed MIL-SOM algorithm, the weight updates of Kohonen's SOM are based on a proportional-integral-derivative (PID) controller. In a typical SOM learning setting, this improvement translates to faster convergence. The basic idea is primarily motivated by the need for algorithms that converge faster and more efficiently than conventional techniques. The MIL-SOM algorithm is tested on four training geographic datasets representing biomedical and disease informatics application domains. Experimental results show that the MIL-SOM algorithm provides a better updating procedure, competitive performance, and good robustness, and it runs faster than Kohonen's SOM. PMID:22481977
Creating Slide Show Book Reports.
ERIC Educational Resources Information Center
Taylor, Harriet G.; Stuhlmann, Janice M.
1995-01-01
Describes the use of "Kid Pix 2" software by fourth grade students to develop slide-show book reports. Highlights include collaboration with education majors from Louisiana State University, changes in attitudes of the education major students and elementary students, and problems with navigation and disk space. (LRW)
Producing Talent and Variety Shows.
ERIC Educational Resources Information Center
Szabo, Chuck
1995-01-01
Identifies key aspects of producing talent shows and outlines helpful hints for avoiding pitfalls and ensuring a smooth production. Presents suggestions concerning publicity, scheduling, and support personnel. Describes types of acts along with special needs and problems specific to each act. Includes a list of resources. (MJP)
Spaceborne SAR Imaging Algorithm for Coherence Optimized
Qiu, Zhiwei; Yue, Jianping; Wang, Xueqin; Yue, Shun
2016-01-01
This paper proposes a SAR imaging algorithm that maximizes coherence, building on existing SAR imaging algorithms. The basic idea of conventional SAR imaging is that the output signal attains maximum signal-to-noise ratio (SNR) when optimal imaging parameters are used. Such a traditional algorithm achieves the best focusing effect but introduces decoherence in the subsequent interferometric processing. The algorithm proposed in this paper instead applies consistent imaging parameters when focusing the SAR echoes. Although the SNR of the output signal is reduced slightly, coherence is largely preserved, and an interferogram of high quality is finally obtained. Two scenes of Envisat ASAR data over Zhangbei are used to test the algorithm. Compared with the interferogram from the traditional algorithm, the results show that this algorithm is more suitable for SAR interferometry (InSAR) research and application. PMID:26871446
Which fMRI clustering gives good brain parcellations?
Thirion, Bertrand; Varoquaux, Gaël; Dohmatob, Elvis; Poline, Jean-Baptiste
2014-01-01
Analysis and interpretation of neuroimaging data often require one to divide the brain into a number of regions, or parcels, with homogeneous characteristics, be these regions defined in the brain volume or on the cortical surface. While predefined brain atlases do not adapt to the signal in the individual subject images, parcellation approaches use brain activity (e.g., found in some functional contrasts of interest) and clustering techniques to define regions with some degree of signal homogeneity. In this work, we address the question of which clustering technique is appropriate and how to optimize the corresponding model. We use two principled criteria: goodness of fit (accuracy), and reproducibility of the parcellation across bootstrap samples. We study these criteria on both simulated and two task-based functional Magnetic Resonance Imaging datasets for the Ward, spectral and k-means clustering algorithms. We show that in general Ward’s clustering performs better than alternative methods with regard to reproducibility and accuracy and that the two criteria diverge regarding the preferred models (reproducibility leading to more conservative solutions), thus deferring the practical decision to a higher level alternative, namely the choice of a trade-off between accuracy and stability. PMID:25071425
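The reproducibility criterion can be scored by how often two parcellations agree on pairs of voxels, for example with a plain Rand index over bootstrap samples. This sketch is a generic agreement measure, not necessarily the exact statistic used in the paper:

```python
from itertools import combinations

def rand_index(labels_a, labels_b):
    """Fraction of point pairs on which two parcellations agree,
    i.e. both place the pair together or both place it apart."""
    agree = total = 0
    for i, j in combinations(range(len(labels_a)), 2):
        same_a = labels_a[i] == labels_a[j]
        same_b = labels_b[i] == labels_b[j]
        agree += (same_a == same_b)
        total += 1
    return agree / total
```

Note the index is invariant to label permutation: two clusterings with swapped label names score 1.0, which is exactly what a reproducibility comparison across bootstrap runs needs.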
NASA Astrophysics Data System (ADS)
Gumerov, Nail A.; Karavaev, Alexey V.; Surjalal Sharma, A.; Shao, Xi; Papadopoulos, Konstantinos D.
2011-04-01
Efficient spectral and pseudospectral algorithms for simulation of linear and nonlinear 3D whistler waves in a cold electron plasma are developed. These algorithms are applied to the simulation of whistler waves generated by loop antennas and spheromak-like stationary waves of considerable amplitude. The algorithms are linearly stable and show good stability properties for computations of nonlinear waves over tens of thousands of time steps. Additional speedups by factors of 10-20 (single-core CPU versus one GPU) are achieved by using graphics processors (GPUs), which enable efficient numerical simulation of the wave propagation on relatively high-resolution meshes (tens of millions of nodes) in a personal computing environment. Comparisons of the numerical results with analytical solutions and experiments show good agreement. The limitations of the codes and the performance of the GPU computing are discussed.
Virtual Goods Recommendations in Virtual Worlds
Chen, Kuan-Yu; Liao, Hsiu-Yu; Chen, Jyun-Hung; Liu, Duen-Ren
2015-01-01
Virtual worlds (VWs) are computer-simulated environments which allow users to create their own virtual character as an avatar. With the rapidly growing user volume in VWs, platform providers launch virtual goods in haste to increase sales revenue. However, this rapid development produces unrelated virtual items that are difficult to remarket. It not only wastes providers' resources, but also makes it difficult for users to find virtual goods that suit their virtual home in daily virtual life. In the VWs, users decorate their houses, visit others' homes, create families, host parties, and so forth. Users establish their social life circles through these activities. This research proposes a novel virtual goods recommendation method based on these social interactions. The contact strength and contact influence that result from interactions with social neighbors affect users' buying intention. Our research highlights the importance of social interactions in virtual goods recommendation. The experiment's data were retrieved from an online VW platform, and the results show that the proposed method, considering social interactions and social life circles, performs better than existing recommendation methods. PMID:25834837
NASA Technical Reports Server (NTRS)
2004-01-01
The upper left image in this display is from the panoramic camera on the Mars Exploration Rover Spirit, showing the 'Magic Carpet' region near the rover at Gusev Crater, Mars, on Sol 7, the seventh martian day of its journey (Jan. 10, 2004). The lower image, also from the panoramic camera, is a monochrome (single filter) image of a rock in the 'Magic Carpet' area. Note that colored portions of the rock correlate with extracted spectra shown in the plot to the side. Four different types of materials are shown: the rock itself, the soil in front of the rock, some brighter soil on top of the rock, and some dust that has collected in small recesses on the rock face ('spots'). Each color on the spectra matches a line on the graph, showing how the panoramic camera's different colored filters are used to broadly assess the varying mineral compositions of martian rocks and soils.
Stitching algorithm of the images acquired from different points of fixation
NASA Astrophysics Data System (ADS)
Semenishchev, E. A.; Voronin, V. V.; Marchuk, V. I.; Pismenskova, M. M.
2015-02-01
Image mosaicing is the act of combining two or more images, and it is used in many applications in computer vision, image processing, and computer graphics. It aims to combine images such that no obstructive boundaries exist around overlapped regions and to create a mosaic that exhibits as little distortion as possible from the original images. Most existing algorithms are computationally complex and do not always produce good results when the source images differ in scale, lighting, or viewpoint. In this paper we consider an algorithm which increases the processing speed when stitching high-resolution images. We reduce the computational complexity by using edge analysis and a saliency map restricted to highly detailed areas. For the detected areas, the rotation angles, scaling factors, color-correction coefficients, and the transformation matrix are determined. We find key points using the SURF detector and reject false correspondences based on correlation analysis. The proposed algorithm can combine images taken from free points of view with different color balances, shutter times, and scales. We perform a comparative study and show that, statistically, the new algorithm delivers good-quality results compared to existing algorithms.
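The rejection of false correspondences can be sketched for the pure-translation case: take the consensus shift between matched keypoints and discard matches that disagree with it. This is a deliberately simplified stand-in for the full rotation/scale/color estimation and the correlation analysis described above:

```python
import numpy as np

def translation_from_matches(pts_a, pts_b, tol=3.0):
    """Estimate the translation between two matched keypoint sets and
    reject correspondences that disagree with the consensus shift."""
    d = np.asarray(pts_b, dtype=float) - np.asarray(pts_a, dtype=float)
    shift = np.median(d, axis=0)                     # robust consensus shift
    inliers = np.linalg.norm(d - shift, axis=1) <= tol
    return d[inliers].mean(axis=0), inliers
```

The median makes the consensus immune to a minority of wild mismatches, after which the refined mean uses only the surviving inliers.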
Analysis of the Dryden Wet Bulb GLobe Temperature Algorithm for White Sands Missile Range
NASA Technical Reports Server (NTRS)
LaQuay, Ryan Matthew
2011-01-01
In locations where the workforce is exposed to high relative humidity and light winds, heat stress is a significant concern. Such is the case at the White Sands Missile Range in New Mexico. Heat stress is depicted by the wet bulb globe temperature, which is the official measurement used by the American Conference of Governmental Industrial Hygienists. The wet bulb globe temperature is measured by an instrument designed to be portable and needing routine maintenance. As an alternative, algorithms have been created to calculate the wet bulb globe temperature from basic meteorological observations. The algorithms are location dependent; therefore, a specific algorithm is usually not suitable for multiple locations. Due to climatological similarities, the algorithm developed for use at the Dryden Flight Research Center was applied to data from the White Sands Missile Range. A study compared a wet bulb globe instrument to data from two Surface Atmospheric Measurement Systems applied to the Dryden wet bulb globe temperature algorithm. The period of study was from June to September of 2009, with focus on 0900 to 1800 local time. Analysis showed that the algorithm worked well, with a few exceptions. The algorithm becomes less accurate when the dew point temperature is above 10 Celsius. Cloud cover also has a significant effect on the measured wet bulb globe temperature. The algorithm does not capture red and black heat stress flags well because such events occur on shorter time scales. The results of this study show that the Dryden Flight Research Center wet bulb globe temperature algorithm is plausibly compatible with the White Sands Missile Range, except during periods of increased dew point temperature and cloud cover or precipitation. During such occasions, the wet bulb globe temperature instrument would be the preferred method of measurement. Out of the 30
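For reference, the quantity both the instrument and the algorithm target is the standard outdoor WBGT combination of natural wet-bulb, black-globe, and dry-bulb temperatures. The sketch below shows that combination and simplified flag thresholds; the Dryden regression that estimates the wet-bulb and globe inputs from routine observations is not reproduced here, and operational flag cut-offs may differ:

```python
def wbgt_outdoor(t_wet_bulb, t_globe, t_dry_bulb):
    """Standard outdoor wet bulb globe temperature (deg C):
    0.7 * natural wet-bulb + 0.2 * black-globe + 0.1 * dry-bulb."""
    return 0.7 * t_wet_bulb + 0.2 * t_globe + 0.1 * t_dry_bulb

def heat_stress_flag(wbgt_c):
    """Simplified flag categories in deg C (illustrative thresholds,
    roughly 82/85/88/90 deg F); local range practice may differ."""
    if wbgt_c >= 32.2:
        return "black"
    if wbgt_c >= 31.1:
        return "red"
    if wbgt_c >= 29.4:
        return "yellow"
    if wbgt_c >= 27.8:
        return "green"
    return "none"
```

The 0.7 weight on the wet-bulb term is why humid, light-wind sites like White Sands are demanding: most of the index comes from the input that is hardest to estimate from routine observations.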
NASA Astrophysics Data System (ADS)
Bolognesi, Tommaso
2011-07-01
In the context of quantum gravity theories, several researchers have proposed causal sets as appropriate discrete models of spacetime. We investigate families of causal sets obtained from two simple models of computation - 2D Turing machines and network mobile automata - that operate on 'high-dimensional' supports, namely 2D arrays of cells and planar graphs, respectively. We study a number of quantitative and qualitative emergent properties of these causal sets, including dimension, curvature and localized structures, or 'particles'. We show how the possibility to detect and separate particles from background space depends on the choice between a global or local view of the causal set. Finally, we spot very rare cases of pseudo-randomness, or deterministic chaos; these exhibit a spontaneous phenomenon of 'causal compartmentation' that appears to be a prerequisite for the occurrence of anything of physical interest in the evolution of spacetime.
A 2D vector map watermarking algorithm resistant to simplification attack
NASA Astrophysics Data System (ADS)
Wang, Chuanjian; Liang, Bin; Zhao, Qingzhan; Qiu, Zuqi; Peng, Yuwei; Yu, Liang
2009-12-01
Vector maps are a valuable asset of data producers, and protecting their copyright effectively using digital watermarking is an active research issue. In this paper, we propose a new robust and blind watermarking algorithm resilient to simplification attack. We prove that the spatial topological relation between map objects has an important property of approximate simplification invariance. We therefore choose spatial topological relations as the watermark feature domain and embed watermarks by slightly modifying the spatial topological relations between map objects. Experiments show that our algorithm resists simplification attack well and achieves a good tradeoff between robustness and data fidelity.
Hu, Y.; Liu, Z.; Shi, X.; Wang, B.
2006-07-01
A brief introduction to the characteristic statistic algorithm (CSA), a new global optimization algorithm for the problem of PWR in-core fuel management optimization, is given in the paper. CSA is modified by the adoption of a back propagation neural network and fast local adjustment. The modified CSA is then applied to PWR equilibrium cycle reloading optimization, and the corresponding optimization code CSA-DYW is developed. CSA-DYW is used to optimize the 18-month equilibrium reloading cycle of the Daya Bay nuclear plant Unit 1 reactor. The results show that CSA-DYW has high efficiency and good global performance on PWR equilibrium cycle reloading optimization. (authors)
A new algorithm for detecting cloud height using OMPS/LP measurements
NASA Astrophysics Data System (ADS)
Chen, Zhong; DeLand, Matthew; Bhartia, Pawan K.
2016-03-01
The Ozone Mapping and Profiler Suite Limb Profiler (OMPS/LP) ozone product requires the determination of cloud height for each event to establish the lower boundary of the profile for the retrieval algorithm. We have created a revised cloud detection algorithm for LP measurements that uses the spectral dependence of the vertical gradient in radiance between two wavelengths in the visible and near-IR spectral regions. This approach provides better discrimination between clouds and aerosols than results obtained using a single wavelength. Observed LP cloud height values show good agreement with coincident Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) measurements.
ENVITEC shows off air technologies
McIlvaine, R.W.
1995-08-01
The ENVITEC International Trade Fair for Environmental Protection and Waste Management Technologies, held in June in Duesseldorf, Germany, is the largest air pollution exhibition in the world and may be the largest environmental technology show overall. Visitors saw thousands of environmental solutions from 1,318 companies representing 29 countries and occupying roughly 43,000 square meters of exhibit space. Many innovations were displayed under the category "thermal treatment of air pollutants." New technologies include the following: regenerative thermal oxidizers; wet systems for removing pollutants; biological scrubbers; electrostatic precipitators; selective adsorption systems; activated-coke adsorbers; optimization of scrubber systems; and air pollution monitors.
Detection algorithm of big bandwidth chirp signals based on STFT
NASA Astrophysics Data System (ADS)
Wang, Jinzhen; Wu, Juhong; Su, Shaoying; Chen, Zengping
2014-10-01
Aiming at the problem of detecting wideband chirp signals under low signal-to-noise ratio (SNR) conditions, an effective signal detection algorithm based on the Short-Time Fourier Transform (STFT) is proposed. Exploiting the dispersion of the noise spectrum and the concentration of the chirp spectrum, the STFT is performed on the chirp signal with a Gauss window at a fixed step, and the peak spectral frequency obtained from each STFT corresponds to the time of each stepped window. These frequencies are then binarized, and an approach similar to the mnk method in the time domain is used to detect the chirp pulse signal and determine coarse starting and ending times. Finally, the data segments containing the coarse starting and ending times are subdivided evenly into many smaller segments, on each of which the STFT is applied; from these, the precise starting and ending times are obtained. Simulations show that when the SNR is higher than -28 dB, the detection probability is at least 99% with zero false alarm probability, and good estimation accuracy of the starting and ending times is achieved. The algorithm is easy to realize and, when the STFT window width and step length are selected properly, requires less computation than an FFT-based approach, so it has good engineering value.
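The core observation, that the per-window peak frequency of an STFT traces the chirp's frequency ramp, can be sketched as follows (the window type, segment length, noise level, and test signal here are illustrative assumptions; the paper's binarization and mnk-style detection stage is not reproduced):

```python
import numpy as np
from scipy.signal import stft, chirp

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)

# A linear chirp sweeping 50 -> 300 Hz, buried in white noise
rng = np.random.default_rng(0)
sig = chirp(t, f0=50, f1=300, t1=1.0) + 0.5 * rng.standard_normal(t.size)

# Fixed-step windowed STFT; noise energy spreads across bins while
# the chirp's energy concentrates in one bin per window
f, tt, Z = stft(sig, fs=fs, window="hann", nperseg=128, noverlap=96)

# The per-window peak frequency rises roughly linearly over the pulse,
# which is the feature a detector can threshold and track
peak_freq = f[np.abs(Z).argmax(axis=0)]
```

A detector would then binarize these peak-frequency hits and look for a sustained run, the coarse starting and ending times falling at the run's edges.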
PACS model based on digital watermarking and its core algorithms
NASA Astrophysics Data System (ADS)
Que, Dashun; Wen, Xianlin; Chen, Bi
2009-10-01
A PACS model based on digital watermarking is proposed by analyzing medical image features and PACS requirements from the point of view of information security; its core is a digital watermarking server and the corresponding processing module. Two kinds of digital watermarking algorithm are studied: a non-region-of-interest (NROI) watermarking algorithm based on the wavelet domain and block means, and a reversible watermarking algorithm based on extended difference and a pseudo-random matrix. The former is a robust lossy watermark: embedding in the NROI via the wavelet transform protects the region of interest (ROI) of the image, and the block-mean approach enhances the anti-attack capability. The latter is a fragile lossless watermark: it is simple to implement, localizes tampering effectively, and its pseudo-random matrix enhances the correlation and security between pixels. Extensive experimental work is reported, including realization of the digital watermarking PACS model, the watermarking processing module and its anti-attack experiments, the digital watermarking server, and network transmission simulation experiments with medical images. Theoretical analysis and experimental results show that the designed PACS model can effectively ensure the confidentiality, authenticity, integrity, and security of medical image information.
Parallelizing a Symbolic Compositional Model-Checking Algorithm
NASA Astrophysics Data System (ADS)
Cohen, Ariel; Namjoshi, Kedar S.; Sa'Ar, Yaniv; Zuck, Lenore D.; Kisyova, Katya I.
We describe a parallel, symbolic, model-checking algorithm, built around a compositional reasoning method. The method constructs a collection of per-process (i.e., local) invariants, which together imply a desired global safety property. The local invariant computation is a simultaneous fixpoint evaluation, which easily lends itself to parallelization. Moreover, locality of reasoning helps limit both the frequency and the amount of cross-thread synchronization, leading to good parallel performance. Experimental results show that the parallelized computation can achieve substantial speed-up, with reasonably small memory overhead.
Beneficence: doing good for others.
Gillon, R
1985-07-01
Gillon's essay on beneficence is one in a series of British Medical Journal articles on philosophical medical ethics. The duty of beneficence, or doing good for others, figures more prominently in medicine than in most other professions. As important as beneficence is in the physician patient relationship, however, it must be tempered by respect for the patient's autonomy; by the duty of nonmaleficence, or of doing no harm; and by a concern for justice, especially in the allocation of scarce medical resources. PMID:3926060
Library of Continuation Algorithms
Energy Science and Technology Software Center (ESTSC)
2005-03-01
LOCA (Library of Continuation Algorithms) is scientific software written in C++ that provides advanced analysis tools for nonlinear systems. In particular, it provides parameter continuation algorithms, bifurcation tracking algorithms, and drivers for linear stability analysis. The algorithms are aimed at large-scale applications that use Newton's method for their nonlinear solve.
Parallelized Dilate Algorithm for Remote Sensing Image
Zhang, Suli; Hu, Haoran; Pan, Xin
2014-01-01
As an important morphological operation, the dilate algorithm can give a more connected view of a remote sensing image that contains broken lines or objects. However, with the technological progress of satellite sensors, the resolution of remote sensing images has been increasing and their data volumes have become very large. This slows the algorithm down, or makes it impossible to obtain a result within limited memory or time. To solve this problem, our research proposes a parallelized dilate algorithm for remote sensing images based on MPI and MP. Experiments show that our method runs faster than the traditional single-process algorithm. PMID:24955392
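The usual strategy for parallelizing dilation, splitting the image into strips and giving each strip a halo of border pixels so strip edges dilate correctly, can be sketched as below. This is a thread-based illustration, not the paper's MPI implementation; the strip count, 1-pixel halo (matching the default 3x3 cross structuring element), and use of scipy's serial dilation per strip are all assumptions:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor
from scipy.ndimage import binary_dilation

def _dilate_strip(args):
    strip, pad_top, pad_bot = args
    out = binary_dilation(strip)  # default 3x3 cross structuring element
    # Trim the halo rows that were added only for border correctness
    return out[pad_top:out.shape[0] - pad_bot]

def parallel_dilate(img, n_strips=4):
    """Binary dilation computed strip-by-strip with a 1-pixel halo."""
    h = img.shape[0]
    bounds = np.linspace(0, h, n_strips + 1, dtype=int)
    jobs = []
    for a, b in zip(bounds[:-1], bounds[1:]):
        top, bot = max(a - 1, 0), min(b + 1, h)  # extend strip by the halo
        jobs.append((img[top:bot], a - top, bot - b))
    with ThreadPoolExecutor() as pool:
        parts = list(pool.map(_dilate_strip, jobs))
    return np.vstack(parts)
```

Because the structuring element reaches only one pixel, a one-row halo is enough for the strip result to agree exactly with a whole-image dilation; a larger structuring element would need a correspondingly wider halo.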
An improved algorithm of mask image dodging for aerial image
NASA Astrophysics Data System (ADS)
Zhang, Zuxun; Zou, Songbai; Zuo, Zhiqi
2011-12-01
Mask image dodging based on the Fourier transform is a good algorithm for removing uneven luminance within a single image. At present, the difference method and the ratio method are in common use, but both have defects. The difference method keeps the brightness of the whole image uniform but is deficient in local contrast; the ratio method handles local contrast better but sometimes makes the dark areas of the original image too bright. To remove the defects of the two methods effectively, this paper proposes a balanced solution based on an analysis of both. Experiments show that the scheme combines the advantages of the difference method and the ratio method while avoiding the deficiencies of both.
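The two classical corrections being balanced can be sketched as follows. A Gaussian low-pass filter stands in here for the Fourier-domain "mask" background estimate, and the smoothing scale is illustrative; the paper's actual balance scheme between the two is not reproduced:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dodge_difference(img, sigma=50):
    """Difference method: subtract the low-frequency background,
    then restore the mean brightness."""
    bg = gaussian_filter(img.astype(float), sigma)
    return img - bg + bg.mean()

def dodge_ratio(img, sigma=50):
    """Ratio method: divide by the background and rescale to the
    mean brightness, boosting dark regions multiplicatively."""
    bg = gaussian_filter(img.astype(float), sigma)
    return img * (bg.mean() / np.maximum(bg, 1e-6))
```

The additive correction preserves global brightness uniformity but leaves local contrast proportional to the original; the multiplicative one amplifies contrast in dark regions, which is exactly why it can push shadow areas too bright, the defect the paper's balanced scheme targets.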
An improved piecewise linear chaotic map based image encryption algorithm.
Hu, Yuping; Zhu, Congxu; Wang, Zhijian
2014-01-01
An image encryption algorithm based on an improved piecewise linear chaotic map (MPWLCM) model is proposed. The algorithm uses the MPWLCM to permute and diffuse the plain image simultaneously. Owing to the sensitivity to initial key values and system parameters and the ergodicity of the chaotic system, two pseudorandom sequences are designed and used in the permutation and diffusion processes. Pixels are processed not in index order but alternately from the beginning and the end, and cipher feedback is introduced in the diffusion process. Test results and security analysis show that the scheme achieves good encryption results and that its key space is large enough to resist brute-force attack. PMID:24592159
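The permute-then-diffuse structure common to this family of ciphers can be sketched with the standard piecewise linear chaotic map. The paper's modified map (MPWLCM), its alternating scan order, and its cipher feedback are not reproduced; the parameters below are illustrative, and the permutation is returned alongside the ciphertext for simplicity, whereas in practice it would be regenerated from the secret key:

```python
import numpy as np

def pwlcm(x, p):
    """One iteration of the piecewise linear chaotic map, 0 < p < 0.5."""
    if x >= 0.5:
        x = 1.0 - x  # the map is symmetric about 0.5
    return x / p if x < p else (x - p) / (0.5 - p)

def chaotic_sequence(x0, p, n):
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = pwlcm(x, p)
        xs[i] = x
    return xs

def encrypt(img, x0=0.347, p=0.296):
    flat = img.ravel()
    # One chaotic orbit drives the permutation, another the keystream
    perm = np.argsort(chaotic_sequence(x0 / 2, p, flat.size))
    ks = (chaotic_sequence(x0, p, flat.size) * 255).astype(np.uint8)
    return (flat[perm] ^ ks).reshape(img.shape), perm

def decrypt(ct, perm, x0=0.347, p=0.296):
    flat = ct.ravel()
    ks = (chaotic_sequence(x0, p, flat.size) * 255).astype(np.uint8)
    inv = np.empty_like(perm)
    inv[perm] = np.arange(perm.size)  # invert the permutation
    return (flat ^ ks)[inv].reshape(ct.shape)
```

Sensitivity to `x0` and `p` is what makes the key space effective: orbits from nearby keys diverge quickly, so a slightly wrong key yields a completely different keystream and permutation.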