Why is Boris Algorithm So Good?
Qin, Hong; et al.
2013-03-03
Due to its excellent long-term accuracy, the Boris algorithm is the de facto standard for advancing a charged particle. Despite its popularity, up to now there has been no convincing explanation of why the Boris algorithm has this advantageous feature. In this letter, we provide an answer to this question. We show that the Boris algorithm conserves phase space volume, even though it is not symplectic. The global bound on energy error typically associated with symplectic algorithms still holds for the Boris algorithm, making it an effective algorithm for the multi-scale dynamics of plasmas.
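The particle push the letter analyzes can be sketched as follows; this is the standard non-relativistic textbook form of the Boris rotation (not code from the paper), with `q_m` denoting the charge-to-mass ratio q/m:

```python
import numpy as np

def boris_push(x, v, E, B, q_m, dt):
    """One Boris step for dx/dt = v, dv/dt = (q/m)(E + v x B):
    half electric kick, exact magnetic rotation, half electric kick."""
    v_minus = v + 0.5 * q_m * dt * E          # first half electric kick
    t = 0.5 * q_m * dt * B                    # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)   # rotation preserves |v| exactly
    v_new = v_plus + 0.5 * q_m * dt * E       # second half electric kick
    x_new = x + dt * v_new
    return x_new, v_new
```

With E = 0 the update is a pure rotation of the velocity, so kinetic energy is conserved to machine precision over arbitrarily many steps, which is the practical face of the volume-preservation result discussed above.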
Collagen-Immobilized Lipases Show Good Activity and Reusability for Butyl Butyrate Synthesis.
Song, Dewei; Chen, Min; Cheng, Haiming
2016-11-01
Candida rugosa lipases were immobilized onto collagen fibers through a glutaraldehyde cross-linking method. The immobilization process was optimized. Under the optimal immobilization conditions, the activity of the collagen-immobilized lipase reached 340 U/g. The activity recovery upon immobilization was 28.3%. The operational stability of the obtained collagen-immobilized lipase for hydrolysis of an olive oil emulsion was determined. The collagen-immobilized lipase showed good tolerance to temperature and pH variations in comparison with the free lipase. The collagen-immobilized lipase was also applied as a biocatalyst for the synthesis of butyl butyrate from butyric acid and 1-butanol in n-hexane. The conversion yield was 94% under the optimal conditions. Of its initial activity, 64% was retained after 5 cycles of synthesizing butyl butyrate in n-hexane.
Cationorm shows good tolerability on human HCE-2 corneal epithelial cell cultures.
Kinnunen, Kati; Kauppinen, Anu; Piippo, Niina; Koistinen, Arto; Toropainen, Elisa; Kaarniranta, Kai
2014-03-01
mitochondrial metabolism to 73% with Cationorm and 53% with BAK from that of the control cells after 30 min exposure in MTT assay. BAK was the only test compound having clear adverse effects on the cell number and metabolism in CCK-8 assay. The activity of caspase-3 did not show significant differences between the groups. Inflammatory response after exposure to Cationorm was significantly lower than after exposure to BAK. There were no significant differences in NF-κB activity between the groups. Diluted Cationorm and Systane with polyquaternium-1/polidronium chloride 0.001% showed good tolerability on HCE-2 cells and thereby provide a clear improvement when compared to BAK-containing eye drop formulations.
Katabami, Takuyuki; Ishii, Satoshi; Obi, Ryusei; Asai, Shiko; Tanaka, Yasushi
2016-12-30
Unilateral and/or predominant uptake on adrenocortical scintigraphy (ACS) may be related to autonomous cortisol overproduction in patients with subclinical Cushing's syndrome (SCS). However, there is no information regarding whether increased tracer uptake on the tumor side or decreased uptake on the contralateral side on ACS is more strongly associated with inappropriate cortisol production. Therefore, we evaluated the relationship between quantitative (131)I-6β-iodomethyl-norcholesterol ((131)I-NP-59) uptake in both adrenal glands and parameters of autonomous cortisol secretion and attempted to set a cut-off for SCS detection. The study included 90 patients with unilateral adrenal adenoma who fulfilled strict criteria. The diagnosis of SCS was based on serum cortisol ≥3.0 μg/dL after 1-mg dexamethasone suppression test (DST) with at least 1 other hypothalamus-pituitary-adrenal axis function abnormality. Twenty-two (27.7%) subjects were diagnosed with SCS. The uptake rate on the affected side in the SCS group was comparable to that in the non-functioning adenoma group. In contrast, the uptake rate on the contralateral side was lower and the laterality ratio significantly higher in the SCS group. The two ACS indices were correlated with serum cortisol levels after a 1-mg DST, but uptake on the tumor side was not. Tumor size was also important for the functional status of adrenal tumors and NP-59 imaging patterns. The best cut-off point for the laterality ratio to detect SCS was 3.07. These results clearly indicate that contralateral adrenal suppression in ACS is good evidence of subclinical cortisol overproduction.
You Showed Your Whiteness: You Don't Get a "Good" White People's Medal
ERIC Educational Resources Information Center
Hayes, Cleveland; Juarez, Brenda G.
2009-01-01
The White liberal is a person who finds themselves defined as White, as an oppressor, in short, and retreats in horror from that designation. The desire to be and to be known as a good White person stems from the recognition that Whiteness is problematic, recognition that many White liberals try to escape by being demonstrably different from…
More Research on Veteran Employment Would Show What’s Good for Business and for Veterans
2016-01-01
Perspective What’s Good for Business and for Veterans Although there have been many public- and private- sector initiatives to help veterans... business to improve veteran employment oppor- tunities, the rhetoric and themes in both policy and media forums are evolving to reflect current... business . In short, veterans are in demand, according to workshop participants. Companies that want the opportunity to hire veterans are encouraged
Nonoperatively treated forearm shaft fractures in children show good long-term recovery
Sinikumpu, Juha-Jaakko; Victorzon, Sarita; Antila, Eeva; Pokka, Tytti; Serlo, Willy
2014-01-01
Background and purpose — The incidence of forearm shaft fractures in children has increased and operative treatment has increased relative to nonoperative treatment in recent years. We analyzed the long-term results of nonoperative treatment. Patients and methods — We performed a population-based age- and sex-matched case-control study in Vaasa Central Hospital, concerning fractures treated in the period 1995–1999. There were 47 nonoperatively treated both-bone forearm shaft fractures, and the patients all participated in the study. 1 healthy control per case was randomly selected and evaluated for comparison. We analyzed clinical and radiographic outcomes of all fractures at a mean of 11 (9–14) years after the trauma. Results — The main outcome, pronosupination of the forearm, was not decreased in the long term. Grip strength was as good as in the controls. Wrist mobility was similar in flexion (85°) and extension (83°) compared to the contralateral side. The patients were satisfied with the outcome, and pain-free. Radiographically, 4 cases had radio-carpal joint degeneration and 4 had a local bone deformity. Interpretation — The long-term outcome of nonoperatively treated both-bone forearm shaft fractures in children was excellent. PMID:25238437
Good-Enough Brain Model: Challenges, Algorithms, and Discoveries in Multisubject Experiments.
Papalexakis, Evangelos E; Fyshe, Alona; Sidiropoulos, Nicholas D; Talukdar, Partha Pratim; Mitchell, Tom M; Faloutsos, Christos
2014-12-01
Given a simple noun such as apple, and a question such as "Is it edible?," what processes take place in the human brain? More specifically, given the stimulus, what are the interactions between (groups of) neurons (also known as functional connectivity) and how can we automatically infer those interactions, given measurements of the brain activity? Furthermore, how does this connectivity differ across different human subjects? In this work, we show that this problem, even though originating from the field of neuroscience, can benefit from big data techniques; we present a simple, novel good-enough brain model, or GeBM in short, and a novel algorithm Sparse-SysId, which are able to effectively model the dynamics of the neuron interactions and infer the functional connectivity. Moreover, GeBM is able to simulate basic psychological phenomena such as habituation and priming (whose definition we provide in the main text). We evaluate GeBM by using real brain data. GeBM produces brain activity patterns that are strikingly similar to the real ones, where the inferred functional connectivity is able to provide neuroscientific insights toward a better understanding of the way that neurons interact with each other, as well as detect regularities and outliers in multisubject brain activity measurements.
Is It that Difficult to Find a Good Preference Order for the Incremental Algorithm?
ERIC Educational Resources Information Center
Krahmer, Emiel; Koolen, Ruud; Theune, Mariet
2012-01-01
In a recent article published in this journal (van Deemter, Gatt, van der Sluis, & Power, 2012), the authors criticize the Incremental Algorithm (a well-known algorithm for the generation of referring expressions due to Dale & Reiter, 1995, also in this journal) because of its strong reliance on a pre-determined, domain-dependent Preference Order.…
Wilkinson, Craig; McPhillie, Martin J; Zhou, Ying; Woods, Stuart; Afanador, Gustavo A; Rawson, Shaun; Khaliq, Farzana; Prigge, Sean T; Roberts, Craig W; Rice, David W; McLeod, Rima; Fishwick, Colin W; Muench, Stephen P
2014-02-01
The enoyl acyl-carrier protein reductase (ENR) enzyme of the apicomplexan parasite family has been intensely studied for antiparasitic drug design for over a decade, with the most potent inhibitors targeting the NAD(+) bound form of the enzyme. However, the higher affinity for the NADH co-factor over NAD(+) and its availability in the natural environment makes the NADH complex form of ENR an attractive target. Herein, we have examined a benzimidazole family of inhibitors which target the NADH form of Francisella ENR, but despite good efficacy against Toxoplasma gondii, the IC50 for T. gondii ENR is poor, with no inhibitory activity at 1 μM. Moreover similar benzimidazole scaffolds are potent against fungi which lack the ENR enzyme and as such we believe that there may be significant off target effects for this family of inhibitors.
ERIC Educational Resources Information Center
Lowry, W. Kenneth
1977-01-01
Investigates whether today's students would score as well as students of the 1930-1950 era on achievement tests. Uses the Progressive Achievement Test, a test widely used in the 1930-1950 era as a barometer of student ability. (RK)
da Silva, Bárbara Pereira; Dias, Desirrê Morais; de Castro Moreira, Maria Eliza; Toledo, Renata Celi Lopes; da Matta, Sérgio Luis Pinto; Lucia, Ceres Mattos Della; Martino, Hércia Stampini Duarte; Pinheiro-Sant'Ana, Helena Maria
2016-09-01
Chia has been consumed worldwide due to its high fiber, lipid and protein content. The objective was to evaluate the protein quality of untreated (seed and flour) and heat-treated (90 °C/20 min) chia, its influence on glucose and lipid homeostasis, and the integrity of liver and intestinal morphology in Wistar rats. Thirty-six male weanling rats were divided into six groups, which received a control diet (casein), a protein-free diet (aproteic), or one of four test diets (chia seed; heat-treated chia seed; chia flour; heat-treated chia flour) for 14 days. The protein efficiency ratio (PER), net protein ratio (NPR) and true digestibility (TD) were evaluated. The biochemical variables and liver and intestinal morphologies of the animals were determined. The values of PER, NPR and TD did not differ among the animals fed chia and were lower than in the control group. The animals fed chia showed lower concentrations of glucose, triacylglycerides, low-density lipoprotein cholesterol and very-low-density lipoprotein, and higher high-density lipoprotein cholesterol, than the control group. The liver weight of animals fed chia was lower than in the control group. Crypt depth and the thickness of intestinal muscle layers were higher in the groups fed chia. The consumption of chia showed good digestibility, a hypoglycemic effect, improved lipid and glycemic profiles and reduced fat deposition in the liver of the animals, and also promoted changes in intestinal tissue that enhanced its functionality.
Nadanaciva, Sashi; Aleo, Michael D.; Strock, Christopher J.; Stedman, Donald B.; Wang, Huijun; Will, Yvonne
2013-10-15
To reduce costly late-stage compound attrition, there has been an increased focus on assessing compounds in in vitro assays that predict attributes of human safety liabilities, before preclinical in vivo studies are done. Relevant questions when choosing a panel of assays for predicting toxicity are (a) whether there is general concordance in the data among the assays, and (b) whether, in a retrospective analysis, the rank order of toxicity of compounds in the assays correlates with the known safety profile of the drugs in humans. The aim of our study was to answer these questions using nonsteroidal anti-inflammatory drugs (NSAIDs) as a test set since NSAIDs are generally associated with gastrointestinal injury, hepatotoxicity, and/or cardiovascular risk, with mitochondrial impairment and endoplasmic reticulum stress being possible contributing factors. Eleven NSAIDs, flufenamic acid, tolfenamic acid, mefenamic acid, diclofenac, meloxicam, sudoxicam, piroxicam, diflunisal, acetylsalicylic acid, nimesulide, and sulindac (and its two metabolites, sulindac sulfide and sulindac sulfone), were tested for their effects on (a) the respiration of rat liver mitochondria, (b) a panel of mechanistic endpoints in rat hepatocytes, and (c) the viability and organ morphology of zebrafish. We show good concordance for distinguishing among/between NSAID chemical classes in the observations among the three approaches. Furthermore, the assays were complementary and able to correctly identify “toxic” and “non-toxic” drugs in accordance with their human safety profile, with emphasis on hepatic and gastrointestinal safety. We recommend implementing our multi-assay approach in the drug discovery process to reduce compound attrition. - Highlights: • NSAIDs cause liver and GI toxicity. • Mitochondrial uncoupling contributes to NSAID liver toxicity. • ER stress is a mechanism that contributes to liver toxicity. • Zebrafish and cell-based assays are complementary.
ERIC Educational Resources Information Center
Allocco, Katherine
2010-01-01
One of the most versatile and multi-faceted films that an educator can use to illustrate urban America in the 1930s is "Great Guy," a relatively obscure film from 1936 directed by John G. Blystone and starring James Cagney and Mae Clarke. There are some simple practical considerations that make the film such a good fit for an American history or…
Yubero, D; Adin, A; Montero, R; Jou, C; Jiménez-Mallebrera, C; García-Cazorla, A; Nascimento, A; O'Callaghan, M M; Montoya, J; Gort, L; Navas, P; Ribes, A; Ugarte, M D; Artuch, R
2016-12-01
Laboratory data interpretation for the assessment of complex biological systems remains a great challenge, as occurs in mitochondrial function research studies. The classical biochemical interpretation of patient data versus reference values may be insufficient, and in fact the current classifications of mitochondrial patients are still made on the basis of probability criteria. We have developed and applied a mathematical agglomerative algorithm to search for correlations among the different biochemical variables of the mitochondrial respiratory chain, in order to identify populations displaying correlation coefficients >0.95. We demonstrated that coenzyme Q10 may be a better biomarker of mitochondrial respiratory chain enzyme activities than citrate synthase activity. Furthermore, the application of this algorithm may be useful for re-classifying mitochondrial patients or exploring associations among other biochemical variables from different biological systems.
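The agglomerative idea, grouping variables whose pairwise correlation exceeds 0.95, can be sketched as follows. This is an illustrative single-linkage/union-find version, not the authors' algorithm; the variable names in the usage are hypothetical:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def correlated_groups(variables, threshold=0.95):
    """Put two variables in the same group whenever their correlation
    exceeds `threshold`, via union-find (single-linkage agglomeration).
    `variables` maps a variable name to its measurements across patients."""
    names = list(variables)
    parent = {n: n for n in names}

    def find(a):                      # find root with path halving
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if pearson(variables[a], variables[b]) > threshold:
                parent[find(a)] = find(b)   # merge the two groups

    groups = {}
    for n in names:
        groups.setdefault(find(n), []).append(n)
    return list(groups.values())
```

For example, a hypothetical dataset in which `coq10` tracks `cs` almost linearly while `noise` does not would yield two groups: `{cs, coq10}` and `{noise}`.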
Wada, Yuko; Yanagihara, Chie; Nishimura, Yo; Oka, Nobuyuki
2007-01-01
A 45-year-old man with insulin-dependent diabetes mellitus developed progressive asymmetrical weakness and atrophy of both shoulder girdle muscles within 1 year. In the last month, he had also developed slight weakness of both thighs. Neuropathology of the sural nerve showed axonal degeneration and perivascular inflammation, and electromyography revealed neurogenic changes. Because of a diagnosis of suspected diabetic amyotrophy, intravenous immunoglobulin was administered. This treatment produced marked improvement. Physicians should take into account the possibility of diabetic amyotrophy in patients with diabetes mellitus showing primary involvement of shoulder girdle muscles marked by weakness and atrophy.
Feng, Yaya; Liu, Xiangyu; Duan, Linqiang; Yang, Qi; Wei, Qing; Xie, Gang; Chen, Sanping; Yang, Xuwu; Gao, Shengli
2015-02-07
A reticular 3D heterometallic metal-organic framework (MOF), [Cu4Na(Mtta)5(CH3CN)]n (N% = 40.08%), has been synthesized, using a 5-methyl tetrazole (Mtta) ligand formed in situ from acetonitrile and azide, and structurally characterized by X-ray single-crystal diffraction. The fluorescence spectra demonstrate that the compound undergoes an interesting structural transformation in aqueous solution, yielding [Cu4Na(Mtta)5H2O]n, as confirmed by (1)H NMR, IR and PXRD. Thermoanalysis showed that the compound possesses excellent thermostability up to 335 °C. The calculated detonation properties and the sensitivity test illustrate that the compound could be used as a potential explosive. In addition, the non-isothermal kinetics were studied using the Kissinger and Ozawa-Doyle methods. The enthalpy of formation was obtained from the determination of the constant-volume combustion energy.
WASHINGTON - The U.S. Environmental Protection Agency (EPA) today released the 2010 National Coastal Condition Assessment showing that more than half of the nation's coastal and Great Lakes nearshore waters are rated good for biological and sediment
Good Agreements Make Good Friends
Han, The Anh; Pereira, Luís Moniz; Santos, Francisco C.; Lenaerts, Tom
2013-01-01
When starting a new collaborative endeavor, it pays to establish upfront how strongly your partner commits to the common goal and what compensation can be expected in case the collaboration is violated. Diverse examples in biological and social contexts have demonstrated the pervasiveness of making prior agreements on posterior compensations, suggesting that this behavior could have been shaped by natural selection. Here, we analyze the evolutionary relevance of such a commitment strategy and relate it to the costly punishment strategy, where no prior agreements are made. We show that when the cost of arranging a commitment deal lies within certain limits, substantial levels of cooperation can be achieved. Moreover, these levels are higher than that achieved by simple costly punishment, especially when one insists on sharing the arrangement cost. Not only do we show that good agreements make good friends, agreements based on shared costs result in even better outcomes. PMID:24045873
Latifi, Kujtim; Oliver, Jasmine; Baker, Ryan; Dilling, Thomas J.; Stevens, Craig W.; Kim, Jongphil; Yue, Binglin; DeMarco, MaryLou; Zhang, Geoffrey G.; Moros, Eduardo G.; Feygelman, Vladimir
2014-04-01
Purpose: Pencil beam (PB) and collapsed cone convolution (CCC) dose calculation algorithms differ significantly when used in the thorax. However, such differences have seldom been previously directly correlated with outcomes of lung stereotactic ablative body radiation (SABR). Methods and Materials: Data for 201 non-small cell lung cancer patients treated with SABR were analyzed retrospectively. All patients were treated with 50 Gy in 5 fractions of 10 Gy each. The radiation prescription mandated that 95% of the planning target volume (PTV) receive the prescribed dose. One hundred sixteen patients were planned with BrainLab treatment planning software (TPS) with the PB algorithm and treated on a Novalis unit. The other 85 were planned on the Pinnacle TPS with the CCC algorithm and treated on a Varian linac. Treatment planning objectives were numerically identical for both groups. The median follow-up times were 24 and 17 months for the PB and CCC groups, respectively. The primary endpoint was local/marginal control of the irradiated lesion. Gray's competing risk method was used to determine the statistical differences in local/marginal control rates between the PB and CCC groups. Results: Twenty-five patients planned with PB and 4 patients planned with the CCC algorithm to the same nominal doses experienced local recurrence. There was a statistically significant difference in recurrence rates between the PB and CCC groups (hazard ratio 3.4 [95% confidence interval: 1.18-9.83], Gray's test P=.019). The differences (Δ) between the 2 algorithms for target coverage were as follows: ΔD99_GITV = 7.4 Gy, ΔD99_PTV = 10.4 Gy, ΔV90_GITV = 13.7%, ΔV90_PTV = 37.6%, ΔD95_PTV = 9.8 Gy, and ΔD_ISO = 3.4 Gy. GITV = gross internal tumor volume. Conclusions: Local control in patients who were planned to the same nominal dose with the PB and CCC algorithms was statistically significantly different. Possible alternative
Oosterveld, Fredrikus G J; Rasker, Johannes J; Floors, Mark; Landkroon, Robert; van Rennes, Bob; Zwijnenberg, Jan; van de Laar, Mart A F J; Koel, Gerard J
2009-01-01
To study the effects of infrared (IR) sauna, a form of total-body hyperthermia, in patients with rheumatoid arthritis (RA) and ankylosing spondylitis (AS), patients were treated over a 4-week period with a series of eight IR treatments. Seventeen RA patients and 17 AS patients were studied. IR was well tolerated, no adverse effects were reported, and there was no exacerbation of disease. Pain and stiffness decreased clinically, and improvements were statistically significant (p < 0.05 and p < 0.001 in RA and AS patients, respectively) during an IR session. Fatigue also decreased. Both RA and AS patients felt comfortable on average during and especially after treatment. In the RA and AS patients, pain, stiffness, and fatigue also showed clinical improvements during the 4-week treatment period, but these did not reach statistical significance. No relevant changes in disease activity scores were found, indicating no exacerbation of disease activity. In conclusion, infrared treatment has statistically significant short-term beneficial effects and clinically relevant period effects during treatment in RA and AS patients without enhancing disease activity. IR has good tolerability and no adverse effects.
Parallel Wolff Cluster Algorithms
NASA Astrophysics Data System (ADS)
Bae, S.; Ko, S. H.; Coddington, P. D.
The Wolff single-cluster algorithm is the most efficient method known for Monte Carlo simulation of many spin models. Due to the irregular size, shape and position of the Wolff clusters, this method does not easily lend itself to efficient parallel implementation, so that simulations using this method have thus far been confined to workstations and vector machines. Here we present two parallel implementations of this algorithm, and show that one gives fairly good performance on a MIMD parallel computer.
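For reference, a single Wolff cluster update for the 2D Ising model can be sketched as follows. This is a serial textbook version, not the parallel implementations the abstract discusses; the bond-activation probability 1 − exp(−2βJ) is the standard Wolff choice:

```python
import math
import random

def wolff_step(spins, L, beta, J=1.0):
    """One Wolff single-cluster update for the 2D Ising model on an L x L
    torus. `spins` maps (i, j) -> +1/-1; the grown cluster is flipped in
    place and its size is returned."""
    p_add = 1.0 - math.exp(-2.0 * beta * J)   # bond-activation probability
    seed = (random.randrange(L), random.randrange(L))
    s0 = spins[seed]
    cluster = {seed}
    stack = [seed]
    while stack:                               # grow the cluster by DFS
        i, j = stack.pop()
        for nbr in (((i + 1) % L, j), ((i - 1) % L, j),
                    (i, (j + 1) % L), (i, (j - 1) % L)):
            if nbr not in cluster and spins[nbr] == s0 and random.random() < p_add:
                cluster.add(nbr)
                stack.append(nbr)
    for site in cluster:                       # flip the whole cluster at once
        spins[site] = -s0
    return len(cluster)
```

The irregular, data-dependent growth of `cluster` is exactly what makes a parallel decomposition of this loop nontrivial.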
ERIC Educational Resources Information Center
Schoenheimer, Henry P.
This book contains seventeen thumb-nail sketches of schools in Europe, the United States, Asia, Britain, and Australia, as they appeared in the eye of the author as a professional educator and a journalist while travelling around the world. The author considers the schools described to be good schools, and not necessarily the 17 best schools in…
ERIC Educational Resources Information Center
Gehring, John
2004-01-01
For the past 16 years, the blue-collar city of Huntington, West Virginia, has rolled out the red carpet to welcome young wrestlers and their families as old friends. They have come to town chasing the same dream for a spot in what many of them call "The Show". For three days, under the lights of an arena packed with 5,000 fans, the…
Applications and accuracy of the parallel diagonal dominant algorithm
NASA Technical Reports Server (NTRS)
Sun, Xian-He
1993-01-01
The Parallel Diagonal Dominant (PDD) algorithm is a highly efficient, ideally scalable tridiagonal solver. In this paper, a detailed study of the PDD algorithm is given. First the PDD algorithm is introduced. Then the algorithm is extended to solve periodic tridiagonal systems. A variant, the reduced PDD algorithm, is also proposed. Accuracy analysis is provided for a class of tridiagonal systems, the symmetric, and anti-symmetric Toeplitz tridiagonal systems. Implementation results show that the analysis gives a good bound on the relative error, and the algorithm is a good candidate for the emerging massively parallel machines.
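The PDD algorithm partitions the tridiagonal system across processors; the serial kernel each partition runs is classical Thomas elimination. A minimal sketch of that serial building block (not the PDD partitioning itself), in Python for illustration:

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system by the Thomas algorithm (forward
    elimination + back substitution). a = sub-diagonal (a[0] unused),
    b = diagonal, c = super-diagonal (c[n-1] unused), d = right-hand side."""
    n = len(b)
    cp = [0.0] * n            # modified super-diagonal
    dp = [0.0] * n            # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):     # forward sweep
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):   # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

The diagonally dominant case the paper analyzes (e.g. symmetric Toeplitz systems) is exactly where this elimination is numerically stable without pivoting.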
Sun, Mingshuang; Zhu, Zhihong; Wang, Huixin; Jin, Shanshan; Yang, Xinggang; Han, Cuiyan; Pan, Weisan
2017-03-30
We constructed a dual ligand-modified nanostructured lipid carrier (NLC) called PAR-NLC, in which the epidermal growth factor receptor (EGFR)-targeted small peptide AEYLR was attached to the distal end of PEG2000 anchored on the NLC surface (named PEG-AEYLR), and poly-arginine (R8), a classic cell-penetrating peptide, was attached directly to the NLC surface. PAR-NLC particles were near-spherical with an average size of ∼50 nm and a zeta potential of +14.09 mV. The cellular uptake of PAR-NLC showed a synergistic effect of the two peptides, presenting significantly superior cellular uptake in the EGFR-positive cells NCI-H1299 and S180 over the EGFR-negative cell line K562. In the animal optical imaging study, 2 h after administration of the Dir-loaded PAR-NLC, the maximum Dir signal appeared in tumor tissue, indicating a prompt tumor-targeting effect; as time extended to 48 h, the Dir signal attenuated in the organs except the tumor, suggesting constant clearance from the body. In the in vivo antitumor study, at the same dose, paclitaxel-loaded PAR-NLC exhibited better antitumor efficacy and safety than Taxol: the body weight of the mice was more stable and the tumor size was smaller. In summary, PAR-NLC is a potential drug carrier to deliver anticancer drugs safely and effectively.
Atmospheric Science Data Center
2016-08-24
article title: Aerosol retrieval over Cape of Good Hope (Enlargement). This Multi-angle Imaging SpectroRadiometer (MISR) image is an enlargement of the aerosol retrieval over Cape of Good Hope, August 23, 2000, showing a more detailed … energy, so MISR's contribution is not only the aerosol retrieval necessary to do the correction, but the multi-angular integration.
NASA Astrophysics Data System (ADS)
Zheng, Genrang; Lin, ZhengChun
The problem of winner determination in combinatorial auctions is a hot topic in electronic commerce and an NP-hard problem. A Hybrid Artificial Fish Swarm Algorithm (HAFSA), combining a First Suite Heuristic Algorithm (FSHA) with the Artificial Fish Swarm Algorithm (AFSA), is proposed to solve the problem, building on the theory of AFSA. Experimental results show that HAFSA is a rapid and efficient algorithm for winner determination. Compared with Ant Colony Optimization, it performs well and has broad, promising applications.
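For scale, exact winner determination can be written as a brute-force search over bid subsets, which is feasible only for tiny instances; this exponential blow-up is what motivates heuristics such as HAFSA. The sketch below is illustrative, not the authors' algorithm:

```python
from itertools import combinations

def winner_determination(bids):
    """Exact winner determination by exhaustive search. `bids` is a list of
    (item_set, price) pairs; select a subset of bids sharing no item so
    that total revenue is maximized. O(2^n) in the number of bids."""
    best_value, best_subset = 0, ()
    for r in range(1, len(bids) + 1):
        for subset in combinations(range(len(bids)), r):
            used, value, feasible = set(), 0, True
            for i in subset:
                items, price = bids[i]
                if used & items:          # an item is already sold: conflict
                    feasible = False
                    break
                used |= items
                value += price
            if feasible and value > best_value:
                best_value, best_subset = value, subset
    return best_value, best_subset
```

For example, with bids {a,b}:3, {b,c}:4, {c}:2, {a}:1, the optimum revenue is 5 (either {a,b}+{c} or {b,c}+{a}).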
Algorithms and Algorithmic Languages.
ERIC Educational Resources Information Center
Veselov, V. M.; Koprov, V. M.
This paper is intended as an introduction to a number of problems connected with the description of algorithms and algorithmic languages, particularly the syntaxes and semantics of algorithmic languages. The terms "letter, word, alphabet" are defined and described. The concept of the algorithm is defined and the relation between the algorithm and…
Defining the Good Reading Teacher.
ERIC Educational Resources Information Center
Kupersmith, Judy; And Others
In the quest for a definition of the good reading teacher, a review of the literature shows that new or copious materials, one specific teaching method, and static teaching behaviors are not responsible for effective teaching. However, observations of five reading teachers, with good references and good reputations but with widely divergent…
Advanced optimization of permanent magnet wigglers using a genetic algorithm
Hajima, Ryoichi
1995-12-31
In permanent magnet wigglers, the magnetic imperfection of each magnet piece causes field error. This field error can be reduced or compensated by sorting the magnet pieces into a proper order. We showed that a genetic algorithm has good properties for this sorting scheme. In this paper, the optimization scheme is applied to the case of permanent magnets which have errors in the direction of the field. The result shows that the genetic algorithm is superior to other algorithms.
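The sorting idea can be illustrated with a minimal genetic algorithm over permutations. The cost function below, penalizing the accumulated block error along the wiggler, is a hypothetical stand-in for the real field-error figure of merit, and all names are illustrative:

```python
import random

def evolve_sorting(errors, pop_size=40, generations=200, seed=0):
    """Toy GA: order magnet blocks (given per-block errors) so that the
    running cumulative error stays small along the device."""
    rng = random.Random(seed)
    n = len(errors)

    def cost(perm):
        total, run = 0.0, 0.0
        for i in perm:
            run += errors[i]
            total += run * run      # penalize accumulated error at each block
        return total

    # initial population of random orderings
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]        # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            child = rng.choice(survivors)[:]
            i, j = rng.sample(range(n), 2)      # swap mutation keeps it a permutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    best = min(pop, key=cost)
    return best, cost(best)
```

Swap mutation is used instead of generic crossover because it preserves the permutation property, the same constraint a real magnet-sorting scheme must respect.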
NASA Astrophysics Data System (ADS)
Zhang, Yanjun; Yu, Chunjuan; Fu, Xinghu; Liu, Wenzhe; Bi, Weihong
2015-12-01
In distributed optical fiber sensing systems based on Brillouin scattering, strain and temperature are the main measured parameters, and they can be obtained by analyzing the Brillouin center frequency shift. A novel algorithm combining the cuckoo search (CS) algorithm with an improved differential evolution (IDE) algorithm is proposed for Brillouin scattering parameter estimation. The CS-IDE algorithm is compared with the CS algorithm and analyzed in different situations. The results show that both the CS and CS-IDE algorithms have very good convergence. The analysis reveals that the CS-IDE algorithm can extract the scattering spectrum features under different linear weight ratios, linewidth combinations and SNRs. Moreover, a BOTDR temperature-measuring system based on electron optical frequency shift was set up to verify the effectiveness of the CS-IDE algorithm. Experimental results show that there is a good linear relationship between the Brillouin center frequency shift and temperature changes.
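As a baseline for what such swarm fits estimate, the Brillouin center frequency can be read off a measured gain spectrum by peak picking with three-point parabolic refinement. This is an illustrative baseline, not the CS-IDE method; the Lorentzian model is the standard Brillouin gain shape:

```python
def lorentzian(f, f_b, gamma, g0):
    """Brillouin gain spectrum model: Lorentzian of FWHM `gamma`
    centered at the Brillouin frequency shift `f_b`."""
    return g0 / (1.0 + ((f - f_b) / (gamma / 2.0)) ** 2)

def brillouin_center(freqs, gains):
    """Estimate the Brillouin frequency shift as the spectrum peak,
    refined by parabolic interpolation through the three samples
    around the maximum."""
    i = max(range(len(gains)), key=gains.__getitem__)
    if 0 < i < len(gains) - 1:
        y0, y1, y2 = gains[i - 1], gains[i], gains[i + 1]
        denom = y0 - 2.0 * y1 + y2
        if denom != 0.0:
            di = 0.5 * (y0 - y2) / denom       # sub-sample offset in grid units
            return freqs[i] + di * (freqs[i + 1] - freqs[i])
    return freqs[i]
```

On clean synthetic data this recovers a center that falls between grid points; the value of the evolutionary fits above is robustness when the spectrum is noisy or the linewidth varies.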
Runtime support for parallelizing data mining algorithms
NASA Astrophysics Data System (ADS)
Jin, Ruoming; Agrawal, Gagan
2002-03-01
With recent technological advances, shared memory parallel machines have become more scalable, and offer large main memories and high bus bandwidths. They are emerging as good platforms for data warehousing and data mining. In this paper, we focus on shared memory parallelization of data mining algorithms. We have developed a series of techniques for parallelization of data mining algorithms, including full replication, full locking, fixed locking, optimized full locking, and cache-sensitive locking. Unlike previous work on shared memory parallelization of specific data mining algorithms, all of our techniques apply to a large number of common data mining algorithms. In addition, we propose a reduction-object based interface for specifying a data mining algorithm. We show how our runtime system can apply any of the techniques we have developed starting from a common specification of the algorithm.
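Two of the listed techniques, full locking and full replication, can be contrasted in a small sketch. Python threads are used purely for illustration (the paper's runtime targets compiled shared-memory code); the counting task is a hypothetical stand-in for a data mining reduction:

```python
import threading
from collections import Counter

def count_locked(items, num_threads=4):
    """'Full locking': one shared table guarded by a single lock."""
    counts = Counter()
    lock = threading.Lock()
    chunks = [items[i::num_threads] for i in range(num_threads)]

    def worker(tid):
        for x in chunks[tid]:
            with lock:                  # every update contends for the lock
                counts[x] += 1

    threads = [threading.Thread(target=worker, args=(t,)) for t in range(num_threads)]
    for t in threads: t.start()
    for t in threads: t.join()
    return counts

def count_replicated(items, num_threads=4):
    """'Full replication': per-thread private tables, merged once at the end.
    No synchronization during the scan, at the price of one copy per thread."""
    chunks = [items[i::num_threads] for i in range(num_threads)]
    tables = [Counter() for _ in range(num_threads)]

    def worker(tid):
        for x in chunks[tid]:
            tables[tid][x] += 1         # private state: no lock needed

    threads = [threading.Thread(target=worker, args=(t,)) for t in range(num_threads)]
    for t in threads: t.start()
    for t in threads: t.join()
    return sum(tables, Counter())
```

The memory-versus-contention trade-off between these two extremes is what the intermediate schemes (fixed, optimized, and cache-sensitive locking) aim to balance.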
Study of sequential optimal control algorithm smart isolation structure based on Simulink-S function
NASA Astrophysics Data System (ADS)
Liu, Xiaohuan; Liu, Yanhui
2017-01-01
This paper focuses on a smart isolation structure. A method for realizing structural vibration control by Simulink simulation is proposed, based on the proposed sequential optimal control algorithm. In the Simulink simulation environment, a smart isolation structure is used to compare the control effect of three algorithms, i.e., the classical optimal control algorithm, the linear quadratic Gaussian control algorithm, and the sequential optimal control algorithm, under the condition of sensors contaminated with noise. Simulation results show that this method can be applied to the simulation of the sequential optimal control algorithm, and that the proposed sequential optimal control algorithm has good noise resistance and better control efficiency.
A hybrid monkey search algorithm for clustering analysis.
Chen, Xin; Zhou, Yongquan; Luo, Qifang
2014-01-01
Clustering is a popular data analysis and data mining technique. The k-means clustering algorithm is one of the most commonly used methods. However, it depends highly on the initial solution and easily falls into a local optimum. In view of the disadvantages of the k-means method, this paper proposes a hybrid monkey algorithm, based on the search operator of the artificial bee colony algorithm, for clustering analysis; experiments on synthetic and real-life datasets show that the algorithm performs better than the basic monkey algorithm for clustering analysis.
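The baseline the hybrid targets is Lloyd's k-means iteration; a minimal 2-D sketch showing the dependence on the random initial centers (illustrative, not the paper's code):

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain k-means (Lloyd's algorithm) on 2-D points. The quality of the
    result depends on the randomly sampled initial centers, which is the
    weakness swarm-based hybrids try to fix."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)              # random initialization
    for _ in range(iters):
        # assignment step: each point joins its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[j].append(p)
        # update step: move each center to its cluster mean
        new_centers = []
        for j in range(k):
            if clusters[j]:
                n = len(clusters[j])
                new_centers.append((sum(p[0] for p in clusters[j]) / n,
                                    sum(p[1] for p in clusters[j]) / n))
            else:
                new_centers.append(centers[j])   # keep an empty cluster's center
        if new_centers == centers:               # converged to a fixed point
            break
        centers = new_centers
    return centers, clusters
```

On well-separated blobs any initialization converges to the blob means; on harder data different seeds give different local optima, which is the behavior global search operators are meant to escape.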
A connected component labeling algorithm for wheat root thinned image
NASA Astrophysics Data System (ADS)
Mu, ShaoMin; Zha, XuHeng; Du, HaiYang; Hao, QingBo; Chang, TengTeng
Measuring wheat root length by hand with a ruler wastes time and energy and has low precision. To address this problem, this paper presents a connected component labeling algorithm for thinned wheat root images. The algorithm is realized on the basis of the region-growing idea using a dynamic queue, and needs only one scan to finish the labeling process. Compared with three other algorithms on thinned wheat root images, the experimental results show that the algorithm is effective and well suited to connected component labeling of thinned wheat root images.
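The paper's queue management details are not given in the abstract, but the underlying idea (region growing driven by a dynamic queue, so a single raster scan suffices) can be sketched in a few lines of Python. This is an illustrative stand-in, not the authors' implementation; 4-connectivity and the function name `label_components` are assumptions.

```python
from collections import deque

def label_components(img):
    """Label 4-connected foreground components of a binary image.

    One raster scan over the pixels; whenever an unlabeled foreground
    pixel is found, a BFS from it (via a dynamic queue, mirroring the
    region-growing idea) labels its whole component before the scan
    continues, so no second relabeling pass is needed.
    """
    rows, cols = len(img), len(img[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if img[r][c] and not labels[r][c]:
                current += 1
                queue = deque([(r, c)])
                labels[r][c] = current
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and img[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current
```

The outer double loop visits each pixel once; the queue grows a whole component the first time any of its pixels is reached, which is what lets the labeling finish in a single scan.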
Sobel, E.; Lange, K.; O`Connell, J.R.
1996-12-31
Haplotyping is the logical process of inferring gene flow in a pedigree based on phenotyping results at a small number of genetic loci. This paper formalizes the haplotyping problem and suggests four algorithms for haplotype reconstruction. These algorithms range from exhaustive enumeration of all haplotype vectors to combinatorial optimization by simulated annealing. Application of the algorithms to published genetic analyses shows that manual haplotyping is often erroneous. Haplotyping is employed in screening pedigrees for phenotyping errors and in positional cloning of disease genes from conserved haplotypes in population isolates. 26 refs., 6 figs., 3 tabs.
Schulz, Andreas S.; Shmoys, David B.; Williamson, David P.
1997-01-01
Increasing global competition, rapidly changing markets, and greater consumer awareness have altered the way in which corporations do business. To become more efficient, many industries have sought to model some operational aspects by gigantic optimization problems. It is not atypical to encounter models that capture 10^6 separate “yes” or “no” decisions to be made. Although one could, in principle, try all 2^(10^6) possible solutions to find the optimal one, such a method would be impractically slow. Unfortunately, for most of these models, no algorithms are known that find optimal solutions with reasonable computation times. Typically, industry must rely on solutions of unguaranteed quality that are constructed in an ad hoc manner. Fortunately, for some of these models there are good approximation algorithms: algorithms that quickly produce solutions that are provably close to optimal. Over the past 6 years, there has been a sequence of major breakthroughs in our understanding of the design of approximation algorithms and of limits to obtaining such performance guarantees; this area has been one of the most flourishing areas of discrete mathematics and theoretical computer science. PMID:9370525
Good and not so good medical ethics.
Rhodes, Rosamond
2015-01-01
In this paper, I provide a brief sketch of the purposes that medical ethics serves and what makes for good medical ethics. Medical ethics can guide clinical practice and biomedical research, contribute to the education of clinicians, advance thinking in the field, and direct healthcare policy. Although these are distinct activities, they are alike in several critical respects. Good medical ethics is coherent, illuminating, accurate, reasonable, consistent, informed, and measured. After this overview, I provide specific examples to illustrate some of the ways in which medical ethics could go wrong as a caution and a reminder that taking on the role of an ethicist involves serious responsibilities that must be exercised with care.
Good Concrete Activity Is Good Mental Activity
ERIC Educational Resources Information Center
McDonough, Andrea
2016-01-01
Early years mathematics classrooms can be colourful, exciting, and challenging places of learning. Andrea McDonough and fellow teachers have noticed that some students make good decisions about using materials to assist their problem solving, but this is not always the case. These experiences led her to ask the following questions: (1) Are…
ERIC Educational Resources Information Center
Cruickshank, Donald R.
2000-01-01
Over time, variations in describing what constitutes good teaching have included ideal, analytic, effective, dutiful, competent, expert, reflective, satisfying, diversity-responsive, and respected. If good teaching could be observed and measured, the results would not indicate a one-size-fits-all model, but rather demonstrate that good teaching is…
NASA Astrophysics Data System (ADS)
Gong, Lina; Xu, Tao; Zhang, Wei; Li, Xuhong; Wang, Xia; Pan, Wenwen
2017-03-01
The traditional microblog recommendation algorithm suffers from low efficiency and modest effect in the era of big data. To solve these issues, this paper proposes a mixed recommendation algorithm with user clustering. The paper first introduces the state of the microblog marketing industry, then elaborates the user interest modeling process and the detailed advertisement recommendation methods, and finally compares the mixed recommendation algorithm with the traditional classification algorithm and with the mixed recommendation algorithm without user clustering. The results show that the mixed recommendation algorithm with user clustering achieves good accuracy and recall in microblog advertisement promotion.
An Image Encryption Algorithm Based on Information Hiding
NASA Astrophysics Data System (ADS)
Ge, Xin; Lu, Bin; Liu, Fenlin; Gong, Daofu
Aiming to resolve the conflict between security and efficiency in the design of chaotic image encryption algorithms, an image encryption algorithm based on information hiding is proposed, built on the “one-time pad” idea. A random parameter is introduced to ensure a different keystream for each encryption, giving the scheme the characteristics of a “one-time pad” and substantially improving its security without a significant increase in algorithm complexity. The random parameter is embedded into the ciphered image with information hiding technology, which avoids negotiating its transport and makes the algorithm easier to apply. Algorithm analysis and experiments show that the algorithm is secure against chosen-plaintext, differential, and divide-and-conquer attacks, and that ciphered images have good statistical properties.
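As a toy illustration of the abstract's central idea (a fresh random parameter perturbing a chaotic keystream for every encryption, with the parameter shipped alongside the ciphertext rather than negotiated), here is a minimal logistic-map XOR cipher in Python. Everything here, the logistic map with mu = 3.99, the way `r` perturbs the initial state, and the function names, is an assumption for illustration; the paper's actual cipher and its information-hiding embedding are more elaborate, and a toy XOR stream like this is not secure.

```python
import os

def logistic_keystream(x0, n, mu=3.99):
    """Byte keystream from the chaotic logistic map x <- mu*x*(1-x)."""
    x, out = x0, bytearray()
    for _ in range(n):
        x = mu * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def encrypt(plain, key_x0):
    # A fresh random parameter per message gives the "one-time pad" flavour;
    # here it simply perturbs the initial state (a toy stand-in for the
    # paper's hidden parameter).
    r = os.urandom(1)[0]
    x0 = (key_x0 + r / 1024.0) % 1 or 0.5
    stream = logistic_keystream(x0, len(plain))
    cipher = bytes(p ^ s for p, s in zip(plain, stream))
    return r, cipher   # in the paper, r is embedded via information hiding

def decrypt(r, cipher, key_x0):
    x0 = (key_x0 + r / 1024.0) % 1 or 0.5
    stream = logistic_keystream(x0, len(cipher))
    return bytes(c ^ s for c, s in zip(cipher, stream))
```

In the paper the random parameter is hidden inside the ciphered image itself; here it is simply returned so the round trip can be checked.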
Underwater Sensor Network Redeployment Algorithm Based on Wolf Search
Jiang, Peng; Feng, Yang; Wu, Feng
2016-01-01
This study addresses the optimization of node redeployment coverage in underwater wireless sensor networks. Given that nodes easily become invalid in a poor environment and that underwater wireless sensor networks are large in scale, an underwater sensor network redeployment algorithm based on wolf search was developed. The study applies the wolf search algorithm, combined with crowded-degree control, to the deployment of underwater wireless sensor networks. The proposed algorithm uses nodes to ensure coverage of events and avoids premature convergence of the nodes, giving good coverage effects. In addition, considering that obstacles exist in the underwater environment, nodes are prevented from becoming invalid by imitating the mechanism of avoiding predators, which reduces the energy consumption of the network. Comparative analysis shows that the algorithm is simple and effective for wireless sensor network deployment. Compared with the optimized artificial fish swarm algorithm, the proposed algorithm exhibits advantages in network coverage, energy conservation, and obstacle avoidance. PMID:27775659
A scalable parallel algorithm for multiple objective linear programs
NASA Technical Reports Server (NTRS)
Wiecek, Malgorzata M.; Zhang, Hong
1994-01-01
This paper presents an ADBASE-based parallel algorithm for solving multiple objective linear programs (MOLP's). Job balance, speedup and scalability are of primary interest in evaluating efficiency of the new algorithm. Implementation results on Intel iPSC/2 and Paragon multiprocessors show that the algorithm significantly speeds up the process of solving MOLP's, which is understood as generating all or some efficient extreme points and unbounded efficient edges. The algorithm gives especially good results for large and very large problems. Motivation and justification for solving such large MOLP's are also included.
ERIC Educational Resources Information Center
Fox, Sharon E.
1972-01-01
Discusses some good oral reading concepts that teachers should consider when developing new classroom practices, including: round robin reading, silent reading before oral reading, shared reading, etc. (NL)
NASA Astrophysics Data System (ADS)
Lu, Li; Sheng, Wen; Liu, Shihua; Zhang, Xianzhi
2014-10-01
Ballistic missile hyperspectral data from an imaging spectrometer on a near-space platform are generated by a numerical method. The characteristics of the ballistic missile hyperspectral data are extracted and matched with two different algorithms, called transverse counting and quantization coding. The simulation results show that both algorithms extract the characteristics of the ballistic missile adequately and accurately. The transverse counting algorithm has low complexity and can be implemented more easily than the quantization coding algorithm. It also shows good immunity to disturbance signals and speeds up the matching and recognition of subsequent targets.
Variable neighbourhood simulated annealing algorithm for capacitated vehicle routing problems
NASA Astrophysics Data System (ADS)
Xiao, Yiyong; Zhao, Qiuhong; Kaku, Ikou; Mladenovic, Nenad
2014-04-01
This article presents the variable neighbourhood simulated annealing (VNSA) algorithm, a variant of the variable neighbourhood search (VNS) combined with simulated annealing (SA), for efficiently solving capacitated vehicle routing problems (CVRPs). In the new algorithm, the deterministic 'Move or not' criterion of the original VNS algorithm regarding the incumbent replacement is replaced by an SA probability, and the neighbourhood shifting of the original VNS (from near to far by k← k+1) is replaced by a neighbourhood shaking procedure following a specified rule. The geographical neighbourhood structure is introduced in constructing the neighbourhood structures for the CVRP of the string model. The proposed algorithm is tested against 39 well-known benchmark CVRP instances of different scales (small/middle, large, very large). The results show that the VNSA algorithm outperforms most existing algorithms in terms of computational effectiveness and efficiency, showing good performance in solving large and very large CVRPs.
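The key change the article describes, replacing VNS's deterministic 'Move or not' test with a simulated-annealing probability, reduces to the standard Metropolis acceptance rule. A small sketch (illustrative; the function name and interface are assumptions, not the authors' code):

```python
import math, random

def sa_accept(delta, temperature, rng=random):
    """SA 'Move or not' rule: always accept an improvement (delta <= 0);
    accept a worsening move with probability exp(-delta / T)."""
    if delta <= 0:
        return True
    if temperature <= 0:
        return False   # fully cooled: plain descent
    return rng.random() < math.exp(-delta / temperature)
```

An improving move is always kept; a worsening move survives with probability exp(-delta/T), so early high temperatures allow escapes from local optima and the rule hardens into plain descent as T cools.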
An ant colony algorithm on continuous searching space
NASA Astrophysics Data System (ADS)
Xie, Jing; Cai, Chao
2015-12-01
The ant colony algorithm is heuristic, bionic, and parallel. Because of its positive feedback, parallelism, and simplicity of cooperation with other methods, it is widely adopted for planning on discrete spaces, but it is still not good at planning on continuous spaces. After a basic introduction to the ant colony algorithm, we propose an ant colony algorithm on continuous space. Our method makes use of three tricks. We search for the next nodes of the route with a fixed step to guarantee the continuity of the solution. When storing pheromone, the algorithm discretizes the pheromone field, clusters states, and sums the pheromone values of these states. When updating pheromone, it lets good solutions, measured by relative score functions, leave more pheromone, so that the ant colony algorithm can find a sub-optimal solution in a shorter time. The simulated experiment shows that our ant colony algorithm can find a sub-optimal solution in relatively shorter time.
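The three tricks (fixed-step moves for continuity, a discretized pheromone field, and score-weighted deposits) can be combined into a toy continuous ant search. This sketch illustrates those ideas under assumed details (bin count, evaporation rate, starting each ant at the incumbent best); it is not the authors' algorithm:

```python
import random

def aco_minimize(f, lo, hi, step=0.1, bins=100, ants=10, iters=200, seed=0):
    """Toy continuous ant search: ants move by a fixed step (keeping the
    path continuous); pheromone lives on a discretized grid of the search
    interval, and better (lower-scoring) solutions deposit more pheromone."""
    rng = random.Random(seed)
    tau = [1.0] * bins  # discretized pheromone field

    def bin_of(x):
        return min(bins - 1, max(0, int((x - lo) / (hi - lo) * bins)))

    best_x = lo + (hi - lo) * rng.random()
    best_f = f(best_x)
    for _ in range(iters):
        for _ in range(ants):
            x = best_x
            for _ in range(20):
                # choose direction, biased toward the pheromone-richer bin
                left, right = bin_of(x - step), bin_of(x + step)
                p_right = tau[right] / (tau[left] + tau[right])
                x += step if rng.random() < p_right else -step
                x = min(hi, max(lo, x))
            fx = f(x)
            if fx < best_f:
                best_x, best_f = x, fx
            tau[bin_of(x)] += 1.0 / (1.0 + fx)  # better scores deposit more
        tau = [0.95 * t for t in tau]           # evaporation
    return best_x, best_f
```

On a smooth test function such as f(x) = x^2, the pheromone bias steadily steers the fixed-step walks toward the minimum.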
Atmospheric Science Data Center
2013-04-16
article title: Aerosol retrieval over Cape of Good Hope ... Da image in the southern part of South Africa - the aerosol retrieval picks it up, and also the slightly clearer area in the middle. Also, ... MISR Science Teams Aug 23, 2000 - Aerosol retrieval over Cape of Good Hope. project: MISR ...
ERIC Educational Resources Information Center
Dawkins, John
The punctuation system presented in this paper has explanatory power insofar as it explains how good writers punctuate. The paper notes that good writers have learned, through reading, the differences among a hierarchy of marks and acquired a sense of independent clauses that allows them to use the hierarchy, along with a reader-sensitive notion…
ERIC Educational Resources Information Center
Csikszentmihalyi, Mihaly
2003-01-01
Examines the working lives of geneticists and journalists to place into perspective what lies behind personal ethics and success. Defines "good work" as productive activity that is valued socially and loved by people engaged in it. Asserts that certain cultural values, social controls, and personal standards are necessary to maintain good work and…
ERIC Educational Resources Information Center
Tingey, Carol
1987-01-01
Suggestions are presented from parents on how to help children with disabilities (with particular focus on Downs Syndrome) learn good grooming habits in such areas as good health, exercise, cleanliness, teeth and hair care, skin care, glasses and other devices, and social behavior. (CB)
ERIC Educational Resources Information Center
Connell, Raewyn
2016-01-01
This paper considers how we can arrive at a concept of the good university. It begins with ideas expressed by Australian Vice-Chancellors and in the "league tables" for universities, which essentially reproduce existing privilege. It then considers definitions of the good university via wish lists, classic texts, horror lists, structural…
Productivity and Capital Goods.
ERIC Educational Resources Information Center
Zicht, Barbara, Ed.; And Others
1981-01-01
Providing teacher background on the concepts of productivity and capital goods, this document presents 3 teaching units about these ideas for different grade levels. The grade K-2 unit, "How Do They Do It?," is designed to provide students with an understanding of how physical capital goods add to productivity. Activities include a field trip to…
Packard, R C; Andrasik, F; Weaver, R
1989-02-01
Occasionally patients with headache present with the complaint of "a really good one." This paper examines three cases of patients with migraine who often referred to their headaches as "good." When the patients were asked what made the headaches good, they immediately tried to clarify their terminology as "just a figure of speech" that really meant bad. Further exploration usually revealed the headache symptoms had indeed been "good" in a relative sense, in that it had somehow served to help the patient avoid a more unpleasant emotional situation. The headache may have allowed a "time out" or a forced period of rest in a hectic schedule, resolved a conflict for the patient in an acceptable way by becoming sick, or represented a suppressed or repressed affect, usually anger. When headaches are described as good, there may very well be something in the patient's life that is worse.
Reconsidering the "Good Divorce"
Amato, Paul R; Kane, Jennifer B; James, Spencer
2011-12-01
This study attempted to assess the notion that a "good divorce" protects children from the potential negative consequences of marital dissolution. A cluster analysis of data on postdivorce parenting from 944 families resulted in three groups: cooperative coparenting, parallel parenting, and single parenting. Children in the cooperative coparenting (good divorce) cluster had the smallest number of behavior problems and the closest ties to their fathers. Nevertheless, children in this cluster did not score significantly better than other children on 10 additional outcomes. These findings provide only modest support for the good divorce hypothesis.
Good vs. Bad Cholesterol. Updated: Apr 3, 2017. Cholesterol can't dissolve ... LDL (Bad) Cholesterol: LDL cholesterol is considered the “bad” cholesterol because ...
2011-10-26
Definitions of a good death often include being at home. Dying at home may be optimal for the patient but could place a significant burden on families and leave them with traumatic memories. Death in hospital should not mean that it is a 'bad death'. How someone dies is more important than where they die and nurses should be taught to provide good end of life care in all settings.
ERIC Educational Resources Information Center
Cai, Li; Lee, Taehun
2009-01-01
We apply the Supplemented EM algorithm (Meng & Rubin, 1991) to address a chronic problem with the "two-stage" fitting of covariance structure models in the presence of ignorable missing data: the lack of an asymptotically chi-square distributed goodness-of-fit statistic. We show that the Supplemented EM algorithm provides a…
Television Quiz Show Simulation
ERIC Educational Resources Information Center
Hill, Jonnie Lynn
2007-01-01
This article explores the simulation of four television quiz shows for students in China studying English as a foreign language (EFL). It discusses the adaptation and implementation of television quiz shows and how the students reacted to them.
ERIC Educational Resources Information Center
Gustafson, James W.
1971-01-01
Our value system relating to the natural sciences is examined for its acceptability and worthiness. Scrutinized are the cognitive meanings about values, validity of values, subjective and cultural relativism, the good of objective realities, and cooperation with natural forces and God. (BL)
ERIC Educational Resources Information Center
Macedo, Stephen
2004-01-01
Americans are rightly concerned that schools are not providing students with the knowledge and habits necessary to be good citizens. With the notable exception of volunteer activity, every form of civic engagement among the young has declined. As a response, increasing attention is being paid to civic education in the schools. But strangely, at a…
Restructuring for Good Governance
ERIC Educational Resources Information Center
Robert, Stephen; Carey, Russell C.
2006-01-01
American higher education has never been more in need of good governance than it is right now. Yet much of the structure many boards have inherited or created tends to stall or impede timely, well-informed, and broadly supported decision making. At many institutions (ours included), layers of governance have been added with each passing year,…
Reconsidering the "Good Divorce"
ERIC Educational Resources Information Center
Amato, Paul R.; Kane, Jennifer B.; James, Spencer
2011-01-01
This study attempted to assess the notion that a "good divorce" protects children from the potential negative consequences of marital dissolution. A cluster analysis of data on postdivorce parenting from 944 families resulted in three groups: cooperative coparenting, parallel parenting, and single parenting. Children in the cooperative coparenting…
The research on algorithms for optoelectronic tracking servo control systems
NASA Astrophysics Data System (ADS)
Zhu, Qi-Hai; Zhao, Chang-Ming; Zhu, Zheng; Li, Kun
2016-10-01
The photoelectric servo control system based on PC controllers is mainly used to control the speed and position of the load. This paper analyzed the mathematical modeling and the system identification of the servo system. In the aspect of the control algorithm, the IP regulator, the fuzzy PID, Active Disturbance Rejection Control (ADRC) and adaptive algorithms were compared and analyzed. The PI-P control algorithm proposed in this paper not only keeps the advantage of the PI regulator, which can saturate quickly, but also overcomes the shortcomings of the IP regulator. The control system has good starting performance and anti-load ability over a wide range. Experimental results show that the system performs well under the PI-P control algorithm.
NASA Technical Reports Server (NTRS)
Wang, Lui; Bayer, Steven E.
1991-01-01
Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithms concepts are introduced, genetic algorithm applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
A Novel Image Encryption Algorithm Based on DNA Subsequence Operation
Zhang, Qiang; Xue, Xianglian; Wei, Xiaopeng
2012-01-01
We present a novel image encryption algorithm based on DNA subsequence operation. Different from traditional DNA encryption methods, our algorithm does not use complex biological operations; it just uses the idea of DNA subsequence operations (such as elongation, truncation, and deletion), combined with the logistic chaotic map, to scramble the location and value of pixels in the image. The experimental results and security analysis show that the proposed algorithm is easy to implement, achieves a good encryption effect, has a large key space and strong sensitivity to the secret key, and can resist exhaustive and statistical attacks. PMID:23093912
Flower pollination algorithm: A novel approach for multiobjective optimization
NASA Astrophysics Data System (ADS)
Yang, Xin-She; Karamanoglu, Mehmet; He, Xingshi
2014-09-01
Multiobjective design optimization problems require multiobjective optimization techniques to solve, and it is often very challenging to obtain high-quality Pareto fronts accurately. In this article, the recently developed flower pollination algorithm (FPA) is extended to solve multiobjective optimization problems. The proposed method is used to solve a set of multiobjective test functions and two bi-objective design benchmarks, and a comparison of the proposed algorithm with other algorithms has been made, which shows that the FPA is efficient with a good convergence rate. Finally, the importance for further parametric studies and theoretical analysis is highlighted and discussed.
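For reference, the single-objective core of the FPA alternates Lévy-flight global pollination toward the current best flower with local pollination between two random flowers. A compact Python sketch under assumed parameter choices (switch probability p = 0.8, Mantegna's algorithm for the Lévy step); the multiobjective extension in the article adds Pareto handling on top of this loop:

```python
import math, random

def levy(rng, beta=1.5):
    """Mantegna's algorithm for a heavy-tailed Lévy step length."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.gauss(0, sigma)
    v = rng.gauss(0, 1) or 1e-12   # guard against a zero draw
    return u / abs(v) ** (1 / beta)

def fpa_minimize(f, lo, hi, n=20, iters=300, p=0.8, seed=0):
    rng = random.Random(seed)
    xs = [lo + (hi - lo) * rng.random() for _ in range(n)]
    best = min(xs, key=f)
    for _ in range(iters):
        for i in range(n):
            if rng.random() < p:   # global pollination via Lévy flight
                cand = xs[i] + levy(rng) * (best - xs[i])
            else:                  # local pollination between two flowers
                j, k = rng.randrange(n), rng.randrange(n)
                cand = xs[i] + rng.random() * (xs[j] - xs[k])
            cand = min(hi, max(lo, cand))
            if f(cand) < f(xs[i]):  # greedy replacement
                xs[i] = cand
        best = min(xs, key=f)
    return best, f(best)
```

Global moves take heavy-tailed jumps scaled by the distance to the best flower, while local moves make small steps along inter-flower differences; the mix of long exploratory jumps and fine local refinement is what gives the algorithm its good convergence rate.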
Changes to the COS Extraction Algorithm for Lifetime Position 3
NASA Astrophysics Data System (ADS)
Proffitt, Charles R.; Bostroem, K. Azalee; Ely, Justin; Foster, Deatrick; Hernandez, Svea; Hodge, Philip; Jedrzejewski, Robert I.; Lockwood, Sean A.; Massa, Derck; Peeples, Molly S.; Oliveira, Cristina M.; Penton, Steven V.; Plesha, Rachel; Roman-Duval, Julia; Sana, Hugues; Sahnow, David J.; Sonnentrucker, Paule; Taylor, Joanna M.
2015-09-01
The COS FUV Detector Lifetime Position 3 (LP3) has been placed only 2.5" below the original lifetime position (LP1). This is sufficiently close to gain-sagged regions at LP1 that a revised extraction algorithm is needed to ensure good spectral quality. We provide an overview of this new "TWOZONE" extraction algorithm, discuss its strengths and limitations, describe new output columns in the X1D files that show the boundaries of the new extraction regions, and provide some advice on how to manually tune the algorithm for specialized applications.
A Fast and Efficient Algorithm for Mining Top-k Nodes in Complex Networks
NASA Astrophysics Data System (ADS)
Liu, Dong; Jing, Yun; Zhao, Jing; Wang, Wenjun; Song, Guojie
2017-02-01
One of the key problems in social network analysis is influence maximization, which has great significance both in theory and in practical applications. Given a complex network and a positive integer k, the problem asks for the k nodes that trigger the largest expected number of activations among the remaining nodes. Mature algorithms mainly divide into propagation-based and topology-based algorithms. Propagation-based algorithms optimize the influence spread process, so their influence spread significantly outperforms that of topology-based algorithms, but they can still take days to complete on large networks. In contrast, topology-based algorithms rely on intuitive parameter statistics and static topological properties; their running times are extremely short, but their influence spread results are unstable. In this paper, we propose a novel topology-based algorithm based on local index rank (LIR). Its influence spread is close to, and sometimes exceeds, that of propagation-based algorithms, while its running time is millions of times shorter. Our experimental results show that the algorithm has good and stable performance in the IC and LT models.
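The abstract does not define the LIR index precisely. Under one common reading (a node's local index counts neighbors of strictly larger degree, so LIR-0 nodes are local degree peaks, which are then ranked by degree), the seed selection can be sketched as follows; treat the definition itself as an assumption rather than the paper's exact formula:

```python
def lir_scores(adj):
    """adj: dict node -> set of neighbors (undirected graph).
    Assumed LIR(v) = number of neighbors with a strictly larger degree;
    nodes with LIR 0 are local degree maxima (candidate seeds)."""
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    return {v: sum(1 for u in adj[v] if deg[u] > deg[v]) for v in adj}

def top_k_seeds(adj, k):
    """Rank local maxima (LIR == 0) by degree and take the top k."""
    lir = lir_scores(adj)
    leaders = sorted((v for v in adj if lir[v] == 0),
                     key=lambda v: -len(adj[v]))
    return leaders[:k]
```

Because each node's score needs only its immediate neighborhood, the whole ranking is a single pass over the edges, which is what makes topology-based methods of this kind orders of magnitude faster than simulating propagation.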
ERIC Educational Resources Information Center
Cech, Scott J.
2008-01-01
Having students show their skills in three dimensions, known as performance-based assessment, dates back at least to Socrates. Individual schools such as Barrington High School--located just outside of Providence--have been requiring students to actively demonstrate their knowledge for years. The Rhode Island's high school graduating class became…
ERIC Educational Resources Information Center
Mathieu, Aaron
2000-01-01
Uses a talk show activity for a final assessment tool for students to debate about the ozone hole. Students are assessed on five areas: (1) cooperative learning; (2) the written component; (3) content; (4) self-evaluation; and (5) peer evaluation. (SAH)
ERIC Educational Resources Information Center
Geological Survey (Dept. of Interior), Reston, VA.
This curriculum packet, appropriate for grades 4-8, features a teaching poster which shows different types of maps (different views of Salt Lake City, Utah), as well as three reproducible maps and reproducible activity sheets which complement the maps. The poster provides teacher background, including step-by-step lesson plans for four geography…
ERIC Educational Resources Information Center
Dicks, Matthew J.
2005-01-01
Because today's students have grown up steeped in video games and the Internet, most of them expect feedback, and usually gratification, very soon after they expend effort on a task. Teachers can get quick feedback to students by showing them videotapes of their learning performances. The author, a 3rd grade teacher describes how the seemingly…
NASA Astrophysics Data System (ADS)
Campbell, Susan; Muzyka, Jennifer
2002-04-01
We present a technological improvement to the use of game shows to help students review for tests. Our approach uses HTML files interpreted with a browser on a computer attached to an LCD projector. The HTML files can be easily modified for use of the game in a variety of courses.
Honored Teacher Shows Commitment.
ERIC Educational Resources Information Center
Ratte, Kathy
1987-01-01
Part of the acceptance speech of the 1985 National Council for the Social Studies Teacher of the Year, this article describes the censorship experience of this honored social studies teacher. The incident involved the showing of a videotape version of the feature film entitled "The Seduction of Joe Tynan." (JDH)
ERIC Educational Resources Information Center
Moore, Mitzi Ruth
1992-01-01
Proposes having students perform skits in which they play the roles of the science concepts they are trying to understand. Provides the dialog for a skit in which hot and cold gas molecules are interviewed on a talk show to study how these properties affect wind, rain, and other weather phenomena. (MDH)
ERIC Educational Resources Information Center
Frasier, Debra
2008-01-01
In the author's book titled "The Incredible Water Show," the characters from "Miss Alaineus: A Vocabulary Disaster" used an ocean of information to stage an inventive performance about the water cycle. In this article, the author relates how she turned the story into hands-on science teaching for real-life fifth-grade students. The author also…
Scalable Nearest Neighbor Algorithms for High Dimensional Data.
Muja, Marius; Lowe, David G
2014-11-01
For many computer vision and machine learning problems, large training sets are key for good performance. However, the most computationally expensive part of many computer vision and machine learning algorithms consists of finding nearest neighbor matches to high dimensional vectors that represent the training data. We propose new algorithms for approximate nearest neighbor matching and evaluate and compare them with previous algorithms. For matching high dimensional features, we find two algorithms to be the most efficient: the randomized k-d forest and a new algorithm proposed in this paper, the priority search k-means tree. We also propose a new algorithm for matching binary features by searching multiple hierarchical clustering trees and show it outperforms methods typically used in the literature. We show that the optimal nearest neighbor algorithm and its parameters depend on the data set characteristics and describe an automated configuration procedure for finding the best algorithm to search a particular data set. In order to scale to very large data sets that would otherwise not fit in the memory of a single machine, we propose a distributed nearest neighbor matching framework that can be used with any of the algorithms described in the paper. All this research has been released as an open source library called fast library for approximate nearest neighbors (FLANN), which has been incorporated into OpenCV and is now one of the most popular libraries for nearest neighbor matching.
Good clinical sense in diabetology.
Kalra, Sanjay; Gupta, Yashdeep
2015-08-01
This article defines and explains the concept of good clinical sense. It defines good clinical sense as "the presence of sensory faculties, their usage and interpretation, by which one is able to practice good clinical medicine". Good clinical sense differs from good clinical practice (GCP) and good clinical acumen. It encompasses all steps of the clinical, diagnostic and therapeutic process, and encourages diligent practice of clinical medicine. Good clinical sense is integral to the practice of diabetology.
Powell, Tia; Hulkower, Adira
2017-01-01
A good death is hard to find. Family members tell us that loved ones die in the wrong place, the hospital, and do not receive high-quality care at the end of life. This issue of the Hastings Center Report offers two articles from authors who strive to provide good end-of-life care and to prevent needless suffering. We agree with their goals, but we have substantial reservations about the approaches they recommend. Respect for the decisions of patients and their surrogates is a relatively new and still vulnerable aspect of medical care. For thousands of years, patients and surrogates had no say in medical decision-making. Today, standards support shared decision-making, but these articles both carve out exceptions to those standards, limiting the rights of patients and families in decisions about specific end-of-life treatments. As bioethics consultants in an acute care setting, we frequently confront conflicts similar to those described by Jeffrey Berger and by Ellen Robinson and colleagues. In such cases, our service emphasizes redoubled efforts at communication and mediation. Focusing on goals and values, rather than interventions, produces the best possible collaboration in health care decision-making. Cases in which we would overturn a surrogate's recommendations regarding palliative sedation or do-not-resuscitate orders are rare and require careful processes and clear evidence that the surrogate's choice is contrary to patient values.
Prescott, John E; Fresne, Julie A; Youngclaus, James A
2017-01-24
The authors reflect on the article in this issue entitled "Borrow or Serve? An Economic Analysis of Options for Financing a Medical School Education" by Marcu and colleagues, which makes a compelling case that a medical school education is a good investment, no matter what financing option students use, from federal service programs to federal loans. The lead author of this Commentary shares lessons learned from his own medical school education, which was funded by an Armed Forces Health Professions Scholarship, and from his current position interacting with medical students across the United States. Regardless of the financing path they choose, all students should understand basic financial concepts and the details of the various pathways that are available to pay for their medical school education, as well as how each could potentially impact their own future and that of their families. One underappreciated aspect of financing a medical school education is that federal repayment scenarios can link loan payments to income, rather than debt levels, which means that all physicians are able to afford their loan payments no matter what specialty they practice, what they are paid, or where they live. Medical education, while expensive, remains a good investment. An MD degree can lead to a lifetime of personal fulfillment and societal contributions. Everyone, with rare exceptions, accepted to a U.S. medical school will be able to finance their medical education via a path that aligns with their personal values and priorities.
Cooperation and the common good
Johnstone, Rufus A.; Rodrigues, António M. M.
2016-01-01
In this paper, we draw the attention of biologists to a result from the economic literature, which suggests that when individuals are engaged in a communal activity of benefit to all, selection may favour cooperative sharing of resources even among non-relatives. Provided that group members all invest some resources in the public good, they should refrain from conflict over the division of these resources. The reason is that, given diminishing returns on investment in public and private goods, claiming (or ceding) a greater share of total resources only leads to the actor (or its competitors) investing more in the public good, such that the marginal costs and benefits of investment remain in balance. This cancels out any individual benefits of resource competition. We illustrate how this idea may be applied in the context of biparental care, using a sequential game in which parents first compete with one another over resources, and then choose how to allocate the resources they each obtain to care of their joint young (public good) versus their own survival and future reproductive success (private good). We show that when the two parents both invest in care to some extent, they should refrain from any conflict over the division of resources. The same effect can also support asymmetric outcomes in which one parent competes for resources and invests in care, whereas the other does not invest but refrains from competition. The fact that the caring parent gains higher fitness pay-offs at these equilibria suggests that abandoning a partner is not always to the latter's detriment, when the potential for resource competition is taken into account, but may instead be of benefit to the ‘abandoned’ mate. PMID:26729926
Advancing x-ray scattering metrology using inverse genetic algorithms
NASA Astrophysics Data System (ADS)
Hannon, Adam F.; Sunday, Daniel F.; Windover, Donald; Joseph Kline, R.
2016-07-01
We compare the speed and effectiveness of two genetic optimization algorithms to the results of statistical sampling via a Markov chain Monte Carlo algorithm to find which is the most robust method for determining real-space structure in periodic gratings measured using critical dimension small-angle x-ray scattering. Both a covariance matrix adaptation evolutionary strategy and differential evolution algorithm are implemented and compared using various objective functions. The algorithms and objective functions are used to minimize differences between diffraction simulations and measured diffraction data. These simulations are parameterized with an electron density model known to roughly correspond to the real-space structure of our nanogratings. The study shows that for x-ray scattering data, the covariance matrix adaptation coupled with a mean-absolute error log objective function is the most efficient combination of algorithm and goodness of fit criterion for finding structures with little foreknowledge about the underlying fine scale structure features of the nanograting.
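For illustration, a minimal sketch of the differential evolution strategy mentioned above (the common DE/rand/1/bin variant, minimizing a toy sphere function; the population size and control parameters here are illustrative defaults, not the paper's settings):

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, gens=200, seed=0):
    # DE/rand/1/bin: mutate with a scaled difference vector, then binomial crossover.
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # at least one mutated coordinate
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                lo, hi = bounds[j]
                trial.append(min(max(v, lo), hi))  # clamp to the search box
            s = f(trial)
            if s <= scores[i]:  # greedy one-to-one replacement
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

sphere = lambda x: sum(v * v for v in x)
x_best, f_best = differential_evolution(sphere, [(-5, 5)] * 3)
assert f_best < 1e-3
```

In the paper's setting, `f` would be the difference between a scattering simulation and measured diffraction data rather than this toy objective.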
Wrong, Terence; Baumgart, Erica
2013-01-01
The authors of the preceding articles raise legitimate questions about patient and staff rights and the unintended consequences of allowing ABC News to film inside teaching hospitals. We explain why we regard their fears as baseless and not supported by what we heard from individuals portrayed in the filming, our decade-long experience making medical documentaries, and the full un-aired context of the scenes shown in the broadcast. The authors don't and can't know what conversations we had, what documents we reviewed, and what protections we put in place in each televised scene. Finally, we hope to correct several misleading examples cited by the authors as well as their offhand mischaracterization of our program as a "reality" show.
Monte Carlo simulations: Hidden errors from "good" random number generators
NASA Astrophysics Data System (ADS)
Ferrenberg, Alan M.; Landau, D. P.; Wong, Y. Joanna
1992-12-01
The Wolff algorithm is now accepted as the best cluster-flipping Monte Carlo algorithm for beating "critical slowing down." We show how this method can yield incorrect answers due to subtle correlations in "high quality" random number generators.
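A minimal Wolff cluster update for the 2D Ising model, using Python's Mersenne Twister, sketches the method in question (the paper's point is that certain "high quality" generators induce subtle correlations that bias exactly this kind of simulation):

```python
import math
import random

def wolff_step(spins, L, beta, rng):
    # Grow a cluster from a random seed site and flip it.
    # Bond-activation probability for the Ising model: p_add = 1 - exp(-2*beta*J), J = 1.
    p_add = 1.0 - math.exp(-2.0 * beta)
    seed = (rng.randrange(L), rng.randrange(L))
    s = spins[seed]
    cluster = {seed}
    stack = [seed]
    while stack:
        x, y = stack.pop()
        for site in ((x + 1) % L, y), ((x - 1) % L, y), (x, (y + 1) % L), (x, (y - 1) % L):
            if site not in cluster and spins[site] == s and rng.random() < p_add:
                cluster.add(site)
                stack.append(site)
    for site in cluster:
        spins[site] = -s  # flip the whole cluster at once
    return len(cluster)

L, beta = 16, 0.5
rng = random.Random(1)
spins = {(x, y): rng.choice((-1, 1)) for x in range(L) for y in range(L)}
for _ in range(200):
    size = wolff_step(spins, L, beta, rng)
    assert 1 <= size <= L * L
m = abs(sum(spins.values())) / (L * L)
assert 0.0 <= m <= 1.0
```

The reported errors arise not from this update rule but from correlations in the stream feeding `rng.random()`, which is why generator quality must be validated against exactly solvable cases.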
Walusinski, Olivier
2014-01-01
In the second half of the 19th century, Jean-Martin Charcot (1825-1893) became famous for the quality of his teaching and his innovative neurological discoveries, bringing many French and foreign students to Paris. A hunger for recognition, together with progressive and anticlerical ideals, led Charcot to invite writers, journalists, and politicians to his lessons, during which he presented the results of his work on hysteria. These events became public performances, for which physicians and patients were transformed into actors. Major newspapers ran accounts of these consultations, more like theatrical shows in some respects. The resultant enthusiasm prompted other physicians in Paris and throughout France to try and imitate them. We will compare the form and substance of Charcot's lessons with those given by Jules-Bernard Luys (1828-1897), Victor Dumontpallier (1826-1899), Ambroise-Auguste Liébault (1823-1904), Hippolyte Bernheim (1840-1919), Joseph Grasset (1849-1918), and Albert Pitres (1848-1928). We will also note their impact on contemporary cinema and theatre.
NASA Technical Reports Server (NTRS)
Abrams, D.; Williams, C.
1999-01-01
This thesis describes several new quantum algorithms. These include a polynomial-time algorithm that uses a quantum fast Fourier transform to find eigenvalues and eigenvectors of a Hamiltonian operator, and that can be applied in cases for which all known classical algorithms require exponential time.
NASA Astrophysics Data System (ADS)
2007-01-01
Thanks to its high spatial and spectral resolution, it was possible to zoom into the very heart of this very massive star. In this innermost region, the observations are dominated by the extremely dense stellar wind that totally obscures the underlying central star. The AMBER observations show that this dense stellar wind is not spherically symmetric, but exhibits a clearly elongated structure. Overall, the AMBER observations confirm that the extremely high mass loss of Eta Carinae's massive central star is non-spherical and much stronger along the poles than in the equatorial plane. This is in agreement with theoretical models that predict such an enhanced polar mass loss in the case of rapidly rotating stars. ESO PR Photo 06c/07: RS Ophiuchi in Outburst. Several papers from this special feature focus on the later stages in a star's life. One looks at the binary system Gamma 2 Velorum, which contains the closest example of a star known as a Wolf-Rayet. A single AMBER observation allowed the astronomers to separate the spectra of the two components, offering new insights into the modeling of Wolf-Rayet stars, and also made it possible to measure the separation between the two stars. This led to a new determination of the distance of the system, showing that previous estimates were incorrect. The observations also revealed information on the region where the winds from the two stars collide. The famous binary system RS Ophiuchi, an example of a recurrent nova, was observed just 5 days after it was discovered to be in outburst on 12 February 2006, an event that had been expected for 21 years. AMBER was able to detect the extension of the expanding nova emission. These observations show a complex geometry and kinematics, far from the simple interpretation of a spherical fireball in extension. AMBER has detected a high-velocity jet, probably perpendicular to the orbital plane of the binary system, and allowed a precise and careful study of the wind and the shockwave.
Stretched View Showing 'Victoria'
NASA Technical Reports Server (NTRS)
2006-01-01
This pair of images from the panoramic camera on NASA's Mars Exploration Rover Opportunity served as initial confirmation that the two-year-old rover is within sight of 'Victoria Crater,' which it has been approaching for more than a year. Engineers on the rover team were unsure whether Opportunity would make it as far as Victoria, but scientists hoped for the chance to study such a large crater with their roving geologist. Victoria Crater is 800 meters (nearly half a mile) in diameter, about six times wider than 'Endurance Crater,' where Opportunity spent several months in 2004 examining rock layers affected by ancient water.
When scientists using orbital data calculated that they should be able to detect Victoria's rim in rover images, they scrutinized frames taken in the direction of the crater by the panoramic camera. To positively characterize the subtle horizon profile of the crater and some of the features leading up to it, researchers created a vertically-stretched image (top) from a mosaic of regular frames from the panoramic camera (bottom), taken on Opportunity's 804th Martian day (April 29, 2006).
The stretched image makes mild nearby dunes look like more threatening peaks, but that is only a result of the exaggerated vertical dimension. This vertical stretch technique was first applied to Viking Lander 2 panoramas by Philip Stooke, of the University of Western Ontario, Canada, to help locate the lander with respect to orbiter images. Vertically stretching the image allows features to be more readily identified by the Mars Exploration Rover science team.
The bright white dot near the horizon to the right of center (barely visible without labeling or zoom-in) is thought to be a light-toned outcrop on the far wall of the crater, suggesting that the rover can see over the low rim of Victoria. In figure 1, the northeast and southeast rims are labeled.
The Superior Lambert Algorithm
NASA Astrophysics Data System (ADS)
der, G.
2011-09-01
Lambert algorithms are used extensively for initial orbit determination, mission planning, space debris correlation, and missile targeting, just to name a few applications. Due to the significance of the Lambert problem in Astrodynamics, Gauss, Battin, Godal, Lancaster, Gooding, Sun and many others (References 1 to 15) have provided numerous formulations leading to various analytic solutions and iterative methods. Most Lambert algorithms and their computer programs can only work within one revolution, break down or converge slowly when the transfer angle is near zero or 180 degrees, and their multi-revolution limitations are either ignored or barely addressed. Despite claims of robustness, many Lambert algorithms fail without notice, and the users seldom have a clue why. The DerAstrodynamics lambert2 algorithm, which is based on the analytic solution formulated by Sun, works for any number of revolutions and converges rapidly at any transfer angle. It provides significant capability enhancements over every other Lambert algorithm in use today. These include improved speed, accuracy, robustness, and multirevolution capabilities as well as implementation simplicity. Additionally, the lambert2 algorithm provides a powerful tool for solving the angles-only problem without artificial singularities (pointed out by Gooding in Reference 16), which involves 3 lines of sight captured by optical sensors, or systems such as the Air Force Space Surveillance System (AFSSS). The analytic solution is derived from the extended Godal’s time equation by Sun, while the iterative method of solution is that of Laguerre, modified for robustness. The Keplerian solution of a Lambert algorithm can be extended to include the non-Keplerian terms of the Vinti algorithm via a simple targeting technique (References 17 to 19). Accurate analytic non-Keplerian trajectories can be predicted for satellites and ballistic missiles, while performing at least 100 times faster in speed than most
Griffin, Bruce A
2016-08-01
Eggs have one of the lowest energy to nutrient density ratios of any food, and contain a quality of protein that is superior to beef steak and similar to dairy. From a nutritional perspective, this must qualify eggs as 'good'. The greater burden of proof has been to establish that eggs are not 'bad', by increasing awareness of the difference between dietary and blood cholesterol, and accumulating sufficient evidence to exonerate eggs from their associations with CVD and diabetes. After 60 years of research, a general consensus has now been reached that dietary cholesterol, chiefly from eggs, exerts a relatively small effect on serum LDL-cholesterol and CVD risk, in comparison with other diet and lifestyle factors. While dietary guidelines have been revised worldwide to reflect this view, associations between egg intake and the incidence of diabetes, and increased CVD risk in diabetes, prevail. These associations may be explained, in part, by residual confounding produced by other dietary components. The strength of evidence that links egg intake to increased CVD risk in diabetes is also complicated by variation in the response of serum LDL-cholesterol to eggs and dietary cholesterol in types 1 and 2 diabetes. On balance, the answer to the question as to whether eggs are 'bad', is probably 'no', but we do need to gain a better understanding of the effects of dietary cholesterol and its association with CVD risk in diabetes.
Good Clinical Practice Training
Arango, Jaime; Chuck, Tina; Ellenberg, Susan S.; Foltz, Bridget; Gorman, Colleen; Hinrichs, Heidi; McHale, Susan; Merchant, Kunal; Shapley, Stephanie; Wild, Gretchen
2016-01-01
Good Clinical Practice (GCP) is an international standard for the design, conduct, performance, monitoring, auditing, recording, analyses, and reporting of clinical trials. The goal of GCP is to ensure the protection of the rights, integrity, and confidentiality of clinical trial participants and to ensure the credibility and accuracy of data and reported results. In the United States, trial sponsors generally require investigators to complete GCP training prior to participating in each clinical trial to foster GCP and as a method to meet regulatory expectations (ie, sponsor’s responsibility to select qualified investigators per 21 CFR 312.50 and 312.53(a) for drugs and biologics and 21 CFR 812.40 and 812.43(a) for medical devices). This training requirement is often extended to investigative site staff, as deemed relevant by the sponsor, institution, or investigator. Those who participate in multiple clinical trials are often required by sponsors to complete repeated GCP training, which is unnecessarily burdensome. The Clinical Trials Transformation Initiative convened a multidisciplinary project team involving partners from academia, industry, other researchers and research staff, and government to develop recommendations for streamlining current GCP training practices. Recommendations drafted by the project team, including the minimum key training elements, frequency, format, and evidence of training completion, were presented to a broad group of experts to foster discussion of the current issues and to seek consensus on proposed solutions. PMID:27390628
ERIC Educational Resources Information Center
Swinbank, Elizabeth
2004-01-01
This article shows how the physical testing of food ingredients and products can be used as a starting point for exploring aspects of physics. The three main areas addressed are: mechanical properties of solid materials; liquid flow; optical techniques for measuring sugar concentration. The activities described were originally developed for…
Automatic design of decision-tree algorithms with evolutionary algorithms.
Barros, Rodrigo C; Basgalupp, Márcio P; de Carvalho, André C P L F; Freitas, Alex A
2013-01-01
This study reports the empirical analysis of a hyper-heuristic evolutionary algorithm that is capable of automatically designing top-down decision-tree induction algorithms. Top-down decision-tree algorithms are of great importance, considering their ability to provide an intuitive and accurate knowledge representation for classification problems. The automatic design of these algorithms seems timely, given the large literature accumulated over more than 40 years of research in the manual design of decision-tree induction algorithms. The proposed hyper-heuristic evolutionary algorithm, HEAD-DT, is extensively tested using 20 public UCI datasets and 10 microarray gene expression datasets. The algorithms automatically designed by HEAD-DT are compared with traditional decision-tree induction algorithms, such as C4.5 and CART. Experimental results show that HEAD-DT is capable of generating algorithms which are significantly more accurate than C4.5 and CART.
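HEAD-DT evolves the design of top-down inducers automatically; as a baseline for the kind of component it assembles, a hand-written top-down decision-tree inducer with greedy Gini splits (a simplified sketch, not HEAD-DT, C4.5, or CART itself) looks like:

```python
def gini(labels):
    # Gini impurity of a label multiset.
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(rows, labels):
    # Exhaustive greedy search over (feature, threshold) pairs.
    best = None
    base = gini(labels)
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows}):
            left = [l for r, l in zip(rows, labels) if r[f] <= t]
            right = [l for r, l in zip(rows, labels) if r[f] > t]
            if not left or not right:
                continue
            gain = base - (len(left) * gini(left) + len(right) * gini(right)) / len(rows)
            if best is None or gain > best[0]:
                best = (gain, f, t)
    return best

def build(rows, labels, depth=0, max_depth=3):
    # Top-down induction: split recursively until pure or depth-limited.
    if len(set(labels)) == 1 or depth == max_depth:
        return max(set(labels), key=labels.count)  # leaf: majority class
    split = best_split(rows, labels)
    if split is None:
        return max(set(labels), key=labels.count)
    _, f, t = split
    li = [i for i, r in enumerate(rows) if r[f] <= t]
    ri = [i for i, r in enumerate(rows) if r[f] > t]
    return (f, t,
            build([rows[i] for i in li], [labels[i] for i in li], depth + 1, max_depth),
            build([rows[i] for i in ri], [labels[i] for i in ri], depth + 1, max_depth))

def predict(tree, row):
    while isinstance(tree, tuple):
        f, t, left, right = tree
        tree = left if row[f] <= t else right
    return tree

# Toy data: class is 1 exactly when the first feature exceeds 5.
rows = [(x, x % 3) for x in range(10)]
labels = [1 if x > 5 else 0 for x in range(10)]
tree = build(rows, labels)
assert all(predict(tree, r) == l for r, l in zip(rows, labels))
```

HEAD-DT's hyper-heuristic searches over design choices such as the split criterion, stopping rule, and pruning strategy, rather than fixing them as this sketch does.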
NASA Astrophysics Data System (ADS)
1999-11-01
Along with the increase in the number of young people applying to enter higher education announced back in July, the UK Department for Education and Employment noted that over a thousand more graduates had applied for postgraduate teacher training when compared with the same time in 1998. It appeared that the `Golden hello' programme for new mathematics and science teachers had succeeded in its aim of encouraging applicants in those subjects: an increase of 37% had been witnessed for maths teaching, 33% for physics and 27% for chemistry. Primary teacher training was also well on target with over five applicants seeking each available place. Statistics for UK schools released in August by the DfEE show that 62% of primary schools and 93% of secondary schools are now linked to the Internet (the corresponding figures were 17% and 83% in 1998). On average there is now one computer for every 13 pupils at primary school and one for every eight students in secondary school. The figures show continuing progress towards the Government's target of ensuring that all schools, colleges, universities, libraries and as many community centres as possible should be online (with access to the National Grid for Learning) by 2002.
Good News for New Orleans: Early Evidence Shows Reforms Lifting Student Achievement
ERIC Educational Resources Information Center
Harris, Douglas N.
2015-01-01
What happened to the New Orleans public schools following the tragic levee breeches after Hurricane Katrina is truly unprecedented. Within the span of one year, all public-school employees were fired, the teacher contract expired and was not replaced, and most attendance zones were eliminated. The state took control of almost all public schools…
Good Grubbin': Impact of a TV Cooking Show for College Students Living off Campus
ERIC Educational Resources Information Center
Clifford, Dawn; Anderson, Jennifer; Auld, Garry; Champ, Joseph
2009-01-01
Objective: To determine if a series of 4 15-minute, theory-driven (Social Cognitive Theory) cooking programs aimed at college students living off campus improved cooking self-efficacy, knowledge, attitudes, and behaviors regarding fruit and vegetable intake. Design: A randomized controlled trial with pre-, post- and follow-up tests. Setting:…
Flexible ureteroscopy for renal stone without preoperative ureteral stenting shows good prognosis
Zhang, Jiaqiao; Xu, Chuou; He, Deng; Lu, Yuchao; Hu, Henglong; Qin, Baolong; Wang, Yufeng; Wang, Qing; Li, Cong; Liu, Jihong
2016-01-01
Purpose To clarify the outcome of flexible ureteroscopy (fURS) for management of renal calculi without preoperative stenting. Methods A total of 171 patients who received 176 fURS procedures for unilateral renal stones were reviewed. All procedures were divided into two groups depending on whether they received ureteral stenting preoperatively. Baseline characteristics of patients, stone burden, operation time, stone-free rates, and complications were compared between both groups. Results Successful primary access to the renal pelvis was achieved in 104 of 114 (91.2%) patients without preoperative stenting, while all procedures with preoperative stenting (n = 62) were successfully performed. A total of 156 procedures were included for further data analysis (56 procedures in the stenting group and 100 in the non-stenting group). No significant differences were found, regardless of preoperative stent placement, in terms of stone-free rate (73.2% with stenting vs. 71.0% without, P = 0.854) or operative time (70.4 ± 32.8 with stenting vs. 70.2 ± 32.1 without, P = 0.969). Conclusions fURS for management of renal stones without preoperative ureteral stenting is associated with good outcomes in short-term follow-up. Our study may help patients and doctors decide whether to place a stent preoperatively. PMID:27917317
A Parallel Prefix Algorithm for Almost Toeplitz Tridiagonal Systems
NASA Technical Reports Server (NTRS)
Sun, Xian-He; Joslin, Ronald D.
1995-01-01
A compact scheme is a discretization scheme that is advantageous in obtaining highly accurate solutions. However, the resulting systems from compact schemes are tridiagonal systems that are difficult to solve efficiently on parallel computers. Considering the almost symmetric Toeplitz structure, a parallel algorithm, simple parallel prefix (SPP), is proposed. The SPP algorithm requires less memory than the conventional LU decomposition and is efficient on parallel machines. It consists of a prefix communication pattern and AXPY operations. Both the computation and the communication can be truncated without degrading the accuracy when the system is diagonally dominant. A formal accuracy study has been conducted to provide a simple truncation formula. Experimental results have been measured on a MasPar MP-1 SIMD machine and on a Cray 2 vector machine. Experimental results show that the simple parallel prefix algorithm is a good algorithm for symmetric, almost symmetric Toeplitz tridiagonal systems and for the compact scheme on high-performance computers.
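The serial reference point for such solvers is the Thomas algorithm (tridiagonal LU-style forward elimination plus back substitution), which the parallel prefix formulation restructures for parallel machines. A sketch on a diagonally dominant Toeplitz system (this is the conventional baseline, not the SPP algorithm itself):

```python
def thomas(a, b, c, d):
    # Solve a tridiagonal system Ax = d, where
    # a = sub-diagonal (a[0] unused), b = diagonal, c = super-diagonal (c[-1] unused).
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):  # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):  # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Diagonally dominant Toeplitz system [1, 4, 1]; RHS chosen so x = [1, 1, ..., 1].
n = 8
a = [0.0] + [1.0] * (n - 1)
b = [4.0] * n
c = [1.0] * (n - 1) + [0.0]
x_true = [1.0] * n
d = [(a[i] * x_true[i - 1] if i > 0 else 0.0)
     + b[i] * x_true[i]
     + (c[i] * x_true[i + 1] if i < n - 1 else 0.0)
     for i in range(n)]
x = thomas(a, b, c, d)
assert all(abs(xi - 1.0) < 1e-9 for xi in x)
```

The sequential dependence in the elimination loop is precisely what makes this form ill-suited to parallel machines and motivates the prefix reformulation.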
NASA Astrophysics Data System (ADS)
Qin, Hong
2016-10-01
Littlejohn's introduction of the non-canonical symplectic structure for the gyrocenter dynamics revolutionized plasma kinetic theory. The discovery of the non-canonical symplectic algorithm for gyrocenters initiated the search for symplectic algorithms for the gyrokinetic system. This effort is reinforced by the recent discovery of canonical and non-canonical symplectic algorithms for the Vlasov-Maxwell (VM) system. However, symplectic algorithms for the gyrokinetic system remain elusive despite intense effort. In retrospect, the success of the symplectic algorithms for the VM system can be attributed to its global canonicalizability. Darboux's theorem ensures that any symplectic structure is locally canonicalizable, but not necessarily globally. Indeed, Littlejohn's gyrocenter is not globally canonicalizable. In this talk, I will show how to construct a different gyrocenter that is globally canonicalizable. It should be a good starting point for developing symplectic algorithms for the gyrokinetic system. Research supported by the U.S. Department of Energy (DE-AC02-09CH11466).
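The practical payoff of a symplectic algorithm is a globally bounded energy error rather than secular drift. A minimal illustration (a generic leapfrog integrator for a harmonic oscillator, assumed here purely as a demonstration; it is not a gyrocenter or gyrokinetic algorithm):

```python
def leapfrog(q, p, dt, steps, omega=1.0):
    # Kick-drift-kick leapfrog for H = p^2/2 + omega^2 q^2/2, a symplectic update.
    for _ in range(steps):
        p -= 0.5 * dt * omega ** 2 * q
        q += dt * p
        p -= 0.5 * dt * omega ** 2 * q
    return q, p

def energy(q, p, omega=1.0):
    return 0.5 * p * p + 0.5 * omega ** 2 * q * q

q, p, dt = 1.0, 0.0, 0.05
e0 = energy(q, p)
max_err = 0.0
for _ in range(200):  # integrate to t = 1000, checking energy along the way
    q, p = leapfrog(q, p, dt, 100)
    max_err = max(max_err, abs(energy(q, p) - e0))
assert max_err < 1e-2  # error oscillates but stays globally bounded
```

A non-symplectic scheme such as explicit Euler would show energy growing without bound over the same interval, which is the behavior symplectic gyrokinetic algorithms are meant to avoid in long multi-scale simulations.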
An improved filter-u least mean square vibration control algorithm for aircraft framework.
Huang, Quanzhen; Luo, Jun; Gao, Zhiyuan; Zhu, Xiaojin; Li, Hengyu
2014-09-01
Active vibration control of aerospace vehicle structures is a very active research area, in which the filter-u least mean square (FULMS) algorithm is one of the key methods. For practical reasons and technical limitations, however, extracting the vibration reference signal is a persistent difficulty for the FULMS algorithm. To solve this problem, an improved FULMS vibration control algorithm is proposed in this paper. The reference signal is constructed from the controller structure and the data generated during the algorithm's run, using a vibration response residual signal extracted directly from the vibrating structure. To test the proposed algorithm, an aircraft frame model is built and an experimental platform is constructed. The simulation and experimental results show that the proposed algorithm is practical and achieves good vibration suppression performance.
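The FULMS update extends the basic least-mean-square rule with a filtered reference path. As a hedged sketch of the underlying LMS adaptation only (system identification of a toy 3-tap plant; the filtered-x machinery and the paper's residual-based reference construction are omitted):

```python
import random

def lms_identify(x, d, taps, mu):
    # Standard LMS: w <- w + mu * e * x_vec. FULMS adds a filtered reference path
    # on top of this update; only the core rule is shown here.
    w = [0.0] * taps
    for n in range(taps, len(x)):
        xv = x[n - taps:n][::-1]  # most recent sample first
        y = sum(wi * xi for wi, xi in zip(w, xv))
        e = d[n] - y              # error against the desired signal
        w = [wi + mu * e * xi for wi, xi in zip(w, xv)]
    return w

rng = random.Random(0)
h = [0.5, -0.3, 0.2]  # unknown plant to identify
x = [rng.uniform(-1, 1) for _ in range(5000)]
d = [sum(h[k] * x[n - 1 - k] for k in range(len(h))) if n >= len(h) else 0.0
     for n in range(len(x))]
w = lms_identify(x, d, taps=3, mu=0.05)
assert all(abs(wi - hi) < 1e-2 for wi, hi in zip(w, h))
```

In active vibration control the "plant" is the structure's response path, and the difficulty the paper addresses is that the reference `x` is not directly measurable and must be reconstructed.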
NASA Astrophysics Data System (ADS)
Chen, Jiaoxuan; Zhang, Maomao; Liu, Yinyan; Chen, Jiaoliao; Li, Yi
2017-03-01
Electrical capacitance tomography (ECT) is a promising technique applied in many fields. However, the solutions for ECT are not unique and are highly sensitive to measurement noise. To preserve the shape of the reconstructed object while tolerating noisy data, a Rudin–Osher–Fatemi (ROF) model with total variation regularization is applied to image reconstruction in ECT. Two numerical methods, simplified augmented Lagrangian (SAL) and accelerated alternating direction method of multipliers (AADMM), are introduced to address these problems in ECT. The effects of the parameters, the number of iterations for the different algorithms, and the noise level in the capacitance data are discussed. Both simulation and experimental tests were carried out to validate the feasibility of the proposed algorithms, compared to the Landweber iteration (LI) algorithm. The results show that the SAL and AADMM algorithms can handle a high level of noise, and that the AADMM algorithm outperforms the others in identifying the object from its background.
NASA Astrophysics Data System (ADS)
Wolfe, William J.; Wood, David; Sorensen, Stephen E.
1996-12-01
This paper discusses automated scheduling as it applies to complex domains such as factories, transportation, and communications systems. The window-constrained packing problem is introduced as an idealized model of the scheduling trade-offs. Specific algorithms are compared in terms of simplicity, speed, and accuracy. In particular, dispatch, look-ahead, and genetic algorithms are statistically compared on randomly generated job sets. The conclusion is that dispatch methods are fast and fairly accurate, while modern algorithms, such as genetic algorithms and simulated annealing, have excessive run times and are too complex to be practical.
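As an illustration of the dispatch approach compared above, a greedy earliest-deadline rule over jobs with time windows (a toy job model, deliberately simpler than the window-constrained packing problem itself) might look like:

```python
def dispatch_schedule(jobs):
    # Greedy dispatch: sort jobs by window end (earliest deadline first),
    # then place each at the earliest feasible start inside its window.
    timeline = []  # (start, end) of placed jobs
    placed = {}
    for name, release, deadline, dur in sorted(jobs, key=lambda j: j[2]):
        t = release
        for s, e in sorted(timeline):
            if t + dur <= s:   # fits in the gap before this placed job
                break
            t = max(t, e)      # otherwise skip past it
        if t + dur <= deadline:
            timeline.append((t, t + dur))
            placed[name] = t
    return placed

jobs = [
    ("a", 0, 4, 2),  # (name, release, deadline, duration)
    ("b", 0, 2, 2),
    ("c", 3, 8, 3),
]
sched = dispatch_schedule(jobs)
# "b" has the tightest window, so it runs first; "a" follows; "c" fits after.
assert sched == {"b": 0, "a": 2, "c": 4}
```

Such a rule runs in near-linear time per job, which is why dispatch methods are fast; the trade-off is that a single greedy pass can miss packings a look-ahead or genetic search would find.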
A novel tree structure based watermarking algorithm
NASA Astrophysics Data System (ADS)
Lin, Qiwei; Feng, Gui
2008-03-01
In this paper, we propose a new blind watermarking algorithm for images based on a tree structure. The algorithm embeds the watermark in the wavelet transform domain, and the embedding positions are determined by the significant coefficient wavelet tree (SCWT) structure, which follows the same idea as the embedded zero-tree wavelet (EZW) compression technique. Following EZW concepts, we obtain coefficients that are related to each other by a tree structure. This relationship among the wavelet coefficients allows our technique to embed more watermark data. If the watermarked image is attacked such that the set of significant coefficients changes, the tree structure allows the correlation-based watermark detector to recover synchronization. The algorithm also uses a visually adaptive scheme to insert the watermark so as to minimize watermark perceptibility. In addition to the watermark, a template is inserted into the watermarked image at the same time. The template contains synchronization information, allowing the detector to determine the type of geometric transformation applied to the watermarked image. Experimental results show that the proposed watermarking algorithm is robust against most signal-processing attacks, such as JPEG compression, median filtering, sharpening, and rotation. It is also adaptive, performing well at finding the best areas in which to insert a stronger watermark.
Guo, Liyong; Yan, Zhiqiang; Zheng, Xiliang; Hu, Liang; Yang, Yongliang; Wang, Jin
2014-07-01
In protein-ligand docking, an optimization algorithm is used to find the best binding pose of a ligand against a protein target, and this algorithm plays a vital role in determining the docking accuracy. To evaluate the relative performance of different optimization algorithms and provide guidance for real applications, we performed a comparative study of six efficient optimization algorithms, comprising two evolutionary algorithm (EA)-based optimizers (LGA, DockDE) and four particle swarm optimization (PSO)-based optimizers (SODock, varCPSO, varCPSO-ls, FIPSDock), all implemented in the protein-ligand docking program AutoDock. We unified the objective functions by applying the same scoring function, and defined a new fitness-accuracy criterion that incorporates optimization accuracy, robustness, and efficiency. The varCPSO and varCPSO-ls algorithms show high efficiency with fast convergence speed; however, their accuracy is not optimal, as they cannot reach very low energies. SODock has the highest accuracy and robustness. In addition, SODock shows good efficiency when optimizing drug-like ligands with fewer than ten rotatable bonds. FIPSDock shows excellent robustness and is close to SODock in accuracy and efficiency. In general, the four PSO-based algorithms show performance superior to that of the two EA-based algorithms, especially for highly flexible ligands. Our method can be regarded as a reference for the validation of new optimization algorithms in protein-ligand docking.
The Algorithm Selection Problem
NASA Technical Reports Server (NTRS)
Minton, Steve; Allen, John; Deiss, Ron (Technical Monitor)
1994-01-01
Work on NP-hard problems has shown that many instances of these theoretically computationally difficult problems are quite easy. The field has also shown that choosing the right algorithm for the problem can have a profound effect on the time needed to find a solution. However, to date there has been little work showing how to select the right algorithm for solving any particular problem. The paper refers to this as the algorithm selection problem. It describes some of the aspects that make this problem difficult, as well as proposes a technique for addressing it.
Study of image matching algorithm and sub-pixel fitting algorithm in target tracking
NASA Astrophysics Data System (ADS)
Yang, Ming-dong; Jia, Jianjun; Qiang, Jia; Wang, Jian-yu
2015-03-01
Image correlation matching is a tracking method that searches for the region most similar to a target template, based on a correlation measure between two images. Because it requires no image segmentation and little computation, image correlation matching is a basic method of target tracking. This paper mainly studies a gray-scale image matching algorithm whose precision is at the sub-pixel level. The matching algorithm used in this paper is the SAD (Sum of Absolute Differences) method, which excels in real-time systems because of its low computational complexity. The SAD method is introduced first, along with the most frequently used sub-pixel fitting algorithms. Those fitting algorithms are too complex for real-time systems, yet target tracking often demands high real-time performance. With this in mind, we put forward a fitting algorithm named the paraboloidal fitting algorithm, which is simple and easily realized in a real-time system. Its result is compared with that of a surface fitting algorithm through image matching simulation; the precision difference between the two algorithms is small, less than 0.01 pixel. To study the influence of target rotation on image matching precision, a camera rotation experiment was carried out. The detector used in the camera, a CMOS detector, was fixed to an arc pendulum table, and pictures were taken with the camera rotated to different angles. A subarea of the original picture was chosen as the template, and the best matching spot was found using the image matching algorithm described above. The results show that the matching error grows as the target rotation angle increases, in an approximately linear relation. Finally, the influence of noise on matching precision was studied: Gaussian noise and salt-and-pepper noise were added to the image respectively, and the image
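The SAD search and parabolic sub-pixel refinement described above can be sketched as follows. This is a minimal NumPy illustration under stated assumptions: the function names and the exhaustive search are choices of the sketch, not the paper's code, and a one-dimensional parabola fit per axis stands in for the paraboloidal fit.

```python
import numpy as np

def sad_match(image, template):
    """Exhaustive SAD search: return the top-left offset with the minimum
    sum of absolute differences, plus the full SAD surface.
    Arrays are assumed float to avoid unsigned wrap-around."""
    ih, iw = image.shape
    th, tw = template.shape
    sad = np.empty((ih - th + 1, iw - tw + 1))
    for y in range(sad.shape[0]):
        for x in range(sad.shape[1]):
            sad[y, x] = np.abs(image[y:y+th, x:x+tw] - template).sum()
    return np.unravel_index(sad.argmin(), sad.shape), sad

def parabolic_subpixel(sad, peak):
    """Refine the integer minimum by fitting a parabola through the minimum
    and its two neighbours, independently along each axis."""
    y, x = peak
    dy = dx = 0.0
    if 0 < y < sad.shape[0] - 1:
        a, b, c = sad[y-1, x], sad[y, x], sad[y+1, x]
        denom = a - 2*b + c
        if denom != 0:
            dy = 0.5 * (a - c) / denom
    if 0 < x < sad.shape[1] - 1:
        a, b, c = sad[y, x-1], sad[y, x], sad[y, x+1]
        denom = a - 2*b + c
        if denom != 0:
            dx = 0.5 * (a - c) / denom
    return y + dy, x + dx
```

The parabola through three points (a, b, c) around the minimum has its vertex at offset 0.5(a - c)/(a - 2b + c), which is always within half a pixel of the integer minimum.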
A Rapid Convergent Low Complexity Interference Alignment Algorithm for Wireless Sensor Networks
Jiang, Lihui; Wu, Zhilu; Ren, Guanghui; Wang, Gangyi; Zhao, Nan
2015-01-01
Interference alignment (IA) is a novel technique that can effectively eliminate interference and approach the sum capacity of wireless sensor networks (WSNs) when the signal-to-noise ratio (SNR) is high, by casting the desired signal and the interference into different signal subspaces. The traditional alternating minimization interference leakage (AMIL) algorithm for IA shows good performance in high-SNR regimes; however, its complexity increases dramatically as the number of users and antennas grows, limiting its application in practical systems. In this paper, a novel IA algorithm, called the directional quartic optimal (DQO) algorithm, is proposed to minimize the interference leakage with rapid convergence and low complexity. The properties of the AMIL algorithm are investigated, and it is discovered that the difference between two consecutive iteration results of the AMIL algorithm approximately points to the convergence solution when the precoding and decoding matrices obtained from the intermediate iterations are sufficiently close to their convergence values. Based on this important property, the proposed DQO algorithm employs a line search procedure so that it can converge to the destination directly. In addition, the optimal step size can be determined analytically by optimizing a quartic function. Numerical results show that the proposed DQO algorithm suppresses the interference leakage more rapidly than the traditional AMIL algorithm, and achieves the same sum rate as the AMIL algorithm with far fewer iterations and less execution time. PMID:26230697
Acoustic design of rotor blades using a genetic algorithm
NASA Technical Reports Server (NTRS)
Wells, V. L.; Han, A. Y.; Crossley, W. A.
1995-01-01
A genetic algorithm coupled with a simplified acoustic analysis was used to generate low-noise rotor blade designs. The model includes thickness, steady loading and blade-vortex interaction noise estimates. The paper presents solutions for several variations in the fitness function, including thickness noise only, loading noise only, and combinations of the noise types. Preliminary results indicate that the analysis provides reasonable assessments of the noise produced, and that the genetic algorithm successfully searches for 'good' designs. The results show that, for a given required thrust coefficient, proper blade design can noticeably reduce the noise produced, at some expense to the power requirements.
Convergence properties of the softassign quadratic assignment algorithm.
Rangarajan, A; Yuille, A; Mjolsness, E
1999-08-15
The softassign quadratic assignment algorithm is a discrete-time, continuous-state, synchronous updating optimizing neural network. While its effectiveness has been shown in the traveling salesman problem, graph matching, and graph partitioning in thousands of simulations, its convergence properties have not been studied. Here, we construct discrete-time Lyapunov functions for the cases of exact and approximate doubly stochastic constraint satisfaction, which show convergence to a fixed point. The combination of good convergence properties and experimental success makes the softassign algorithm an excellent choice for neural quadratic assignment optimization.
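The doubly stochastic constraint satisfaction referred to above is typically implemented by Sinkhorn balancing inside a softmax ("softassign") step. The sketch below is a hedged illustration: the function names, the tolerance test, and the softmax step are assumptions of the sketch, not the authors' exact scheme.

```python
import numpy as np

def sinkhorn(M, n_iter=100, tol=1e-9):
    """Alternate row and column normalisation of a positive matrix.
    By Sinkhorn's theorem this converges to a doubly stochastic matrix."""
    M = np.asarray(M, dtype=float).copy()
    for _ in range(n_iter):
        M /= M.sum(axis=1, keepdims=True)   # rows sum to 1
        M /= M.sum(axis=0, keepdims=True)   # columns sum to 1
        if (np.abs(M.sum(axis=1) - 1.0) < tol).all():
            break
    return M

def softassign(Q, beta):
    """Exponentiate a benefit matrix at inverse temperature beta, then
    project onto the doubly stochastic matrices (soft assignment)."""
    return sinkhorn(np.exp(beta * (Q - Q.max())))
```

As beta grows, the soft assignment hardens toward a permutation matrix, which is what the annealing loop of the full algorithm exploits.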
Algorithm and implementation of GPS/VRS network RTK
NASA Astrophysics Data System (ADS)
Gao, Chengfa; Yuan, Benyin; Ke, Fuyang; Pan, Shuguo
2009-06-01
This paper presents a virtual reference station (VRS) method and its application. Details of how to generate GPS virtual phase observations are discussed in depth. The developed algorithms were successfully applied to an independently developed network digital land investigation system. Experiments carried out to investigate the system's performance show that the algorithms have good availability and stability. The accuracy of the VRS/RTK positioning was found to be within +/-3.3 cm in the horizontal component and +/-7.9 cm in the vertical component, which meets the requirements of precise digital land investigation.
Parallelized quantum Monte Carlo algorithm with nonlocal worm updates.
Masaki-Kato, Akiko; Suzuki, Takafumi; Harada, Kenji; Todo, Synge; Kawashima, Naoki
2014-04-11
Based on the worm algorithm in the path-integral representation, we propose a general quantum Monte Carlo algorithm suitable for parallelization on a distributed-memory computer by domain decomposition. Of particular importance is its application to large lattice systems of bosons and spins. A large number of worms are introduced, and their population is controlled by a fictitious transverse field. As a benchmark, we study the size dependence of the Bose-condensation order parameter of the hard-core Bose-Hubbard model with L×L×βt=10240×10240×16, using 3200 computing cores, which shows good parallelization efficiency.
A Color Image Edge Detection Algorithm Based on Color Difference
NASA Astrophysics Data System (ADS)
Zhuo, Li; Hu, Xiaochen; Jiang, Liying; Zhang, Jing
2016-12-01
Although image edge detection algorithms have been widely applied in image processing, the existing algorithms still face two important problems. On one hand, to restrain the interference of noise, smoothing filters are generally exploited in the existing algorithms, resulting in loss of significant edges. On the other hand, since the existing algorithms are sensitive to noise, many noisy edges are usually detected, which will disturb the subsequent processing. Therefore, a color image edge detection algorithm based on color difference is proposed in this paper. Firstly, a new operation called color separation is defined in this paper, which can reflect the information of color difference. Then, for the neighborhood of each pixel, color separations are calculated in four different directions to detect the edges. Experimental results on natural and synthetic images show that the proposed algorithm can remove a large number of noisy edges and be robust to the smoothing filters. Furthermore, the proposed edge detection algorithm is applied in road foreground segmentation and shadow removal, which achieves good performances.
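The abstract does not spell out the "color separation" operation, so the following is only a plain illustration of the general idea it gestures at: per-pixel color-difference magnitudes evaluated in four directions, with the maximum taken as the edge strength. All names here are assumptions of the sketch, not the paper's definitions.

```python
import numpy as np

def color_edge_map(img):
    """Edge strength from color differences: for each pixel, take the
    largest Euclidean RGB difference between opposite neighbours in four
    directions (horizontal, vertical, and the two diagonals)."""
    img = img.astype(float)
    h, w, _ = img.shape
    pad = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode='edge')
    offsets = [(0, 1), (1, 0), (1, 1), (1, -1)]
    edge = np.zeros((h, w))
    for dy, dx in offsets:
        a = pad[1+dy:1+dy+h, 1+dx:1+dx+w]   # neighbour in (+dy, +dx)
        b = pad[1-dy:1-dy+h, 1-dx:1-dx+w]   # neighbour in (-dy, -dx)
        diff = np.sqrt(((a - b) ** 2).sum(axis=2))
        edge = np.maximum(edge, diff)
    return edge
```

Working directly on color differences avoids the pre-smoothing step that, as the abstract notes, tends to wash out significant edges.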
Novel permutation measures for image encryption algorithms
NASA Astrophysics Data System (ADS)
Abd-El-Hafiz, Salwa K.; AbdElHaleem, Sherif H.; Radwan, Ahmed G.
2016-10-01
This paper proposes two measures for the evaluation of permutation techniques used in image encryption. First, a general mathematical framework for describing the permutation phase used in image encryption is presented. Using this framework, six different permutation techniques, based on chaotic and non-chaotic generators, are described. The two new measures are then introduced to evaluate the effectiveness of permutation techniques. These measures are (1) the Percentage of Adjacent Pixels Count (PAPC) and (2) the Distance Between Adjacent Pixels (DBAP). The proposed measures are used to evaluate and compare the six permutation techniques in different scenarios. The permutation techniques are applied to several standard images and the resulting scrambled images are analyzed. Moreover, the new measures are used to compare the permutation algorithms on different matrix sizes irrespective of the actual parameters used in each algorithm. The analysis results show that the proposed measures are good indicators of the effectiveness of a permutation technique.
EDGA: A Population Evolution Direction-Guided Genetic Algorithm for Protein-Ligand Docking.
Guan, Boxin; Zhang, Changsheng; Ning, Jiaxu
2016-07-01
Protein-ligand docking can be formulated as a search algorithm associated with an accurate scoring function. However, most current search algorithms cannot show good performance in docking problems, especially for highly flexible docking. To overcome this drawback, this article presents a novel and robust optimization algorithm (EDGA) based on the Lamarckian genetic algorithm (LGA) for solving flexible protein-ligand docking problems. The method applies a population evolution direction-guided model of genetics, in which the search direction evolves toward the optimum solution, making it more efficient at finding the lowest energy of protein-ligand docking. We consider four search methods-a traditional genetic algorithm, LGA, SODOCK, and EDGA-and compare their performance on six protein-ligand docking problems. The results show that EDGA is the most stable, reliable, and successful.
Optimization of composite structures by estimation of distribution algorithms
NASA Astrophysics Data System (ADS)
Grosset, Laurent
The design of high performance composite laminates, such as those used in aerospace structures, leads to complex combinatorial optimization problems that cannot be addressed by conventional methods. These problems are typically solved by stochastic algorithms, such as evolutionary algorithms. This dissertation proposes a new evolutionary algorithm for composite laminate optimization, named the Double-Distribution Optimization Algorithm (DDOA). DDOA belongs to the family of estimation of distribution algorithms (EDAs), which build a statistical model of promising regions of the design space based on sets of good points and use it to guide the search. A generic framework for introducing statistical variable dependencies by making use of the physics of the problem is proposed. The algorithm uses two distributions simultaneously: the marginal distributions of the design variables, complemented by the distribution of auxiliary variables. The combination of the two generates complex distributions at a low computational cost. The dissertation demonstrates the efficiency of DDOA for several laminate optimization problems where the design variables are the fiber angles and the auxiliary variables are the lamination parameters. The results show that its reliability in finding the optima is greater than that of a simple EDA and of a standard genetic algorithm, and that its advantage increases with the problem dimension. A continuous version of the algorithm is presented and applied to a constrained quadratic problem. Finally, a modification of the algorithm incorporating probabilistic and directional search mechanisms is proposed. The algorithm exhibits faster convergence to the optimum and opens the way for a unified framework for stochastic and directional optimization.
ERIC Educational Resources Information Center
Moore, Kristin Anderson; Evans, V. Jeffery; Brooks-Gunn, Jeanne; Roth, Jodie
This paper considers the question "What are good child outcomes?" from the perspectives of developmental psychology, economics, and sociology. Section 1 of the paper examines good child outcomes as characteristics of stage-salient tasks of development. Section 2 emphasizes the acquisition of "human capital," the development of productive traits…
ERIC Educational Resources Information Center
Estes, Cheryl; Henderson, Karla
2003-01-01
Presents information to update parks and recreation professionals about what recent research says in regard to enjoyment and the good life, noting what applications this research has for practitioners. The article focuses on: the good life and leisure services; happiness, subjective well-being, and intrinsic motivation; leisure, happiness, and…
A fast image encryption algorithm based on chaotic map
NASA Astrophysics Data System (ADS)
Liu, Wenhao; Sun, Kehui; Zhu, Congxu
2016-09-01
Derived from the Sine map and the iterative chaotic map with infinite collapse (ICMIC), a new two-dimensional Sine ICMIC modulation map (2D-SIMM) is proposed based on a closed-loop modulation coupling (CMC) model, and its chaotic performance is analyzed by means of phase diagrams, the Lyapunov exponent spectrum, and complexity measures. The analysis shows that this map has good ergodicity, hyperchaotic behavior, a large maximum Lyapunov exponent, and high complexity. Based on this map, a fast image encryption algorithm is proposed, in which the confusion and diffusion processes are combined into one stage. A chaotic shift transform (CST) is proposed to efficiently change the image pixel positions, while row and column substitutions scramble the pixel values simultaneously. The simulation and analysis results show that this algorithm has high security, low time complexity, and the ability to resist statistical-analysis, differential, brute-force, known-plaintext and chosen-plaintext attacks.
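The 2D-SIMM map itself is defined in the paper; as a generic illustration of the keystream side of such a scheme, the sketch below uses the ordinary logistic map to derive a byte stream and XORs it with the data. This shows only the diffusion idea (the CST permutation stage is omitted), and all names and parameter values are illustrative assumptions.

```python
import numpy as np

def logistic_keystream(x0, r, n):
    """Iterate the logistic map x -> r*x*(1-x) and quantise each state
    to a byte. For r near 4 the orbit stays in (0, 1)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return (xs * 256).astype(np.uint8)

def xor_cipher(data, key=(0.3579, 3.99)):
    """Symmetric XOR cipher keyed by the map's initial condition and
    parameter; applying it twice recovers the plaintext."""
    ks = logistic_keystream(key[0], key[1], len(data))
    return np.bitwise_xor(np.frombuffer(bytes(data), dtype=np.uint8), ks).tobytes()
```

Because the map is extremely sensitive to its initial condition, a tiny change in the key produces a completely different keystream, which is what such ciphers rely on.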
Xiao, Jianyuan; Liu, Jian; Qin, Hong; Yu, Zhi
2013-10-15
Smoothing functions are commonly used to reduce numerical noise arising from coarse sampling of particles in particle-in-cell (PIC) plasma simulations. When applying smoothing functions to symplectic algorithms, the conservation of symplectic structure should be guaranteed to preserve good conservation properties. In this paper, we show how to construct a variational multi-symplectic PIC algorithm with smoothing functions for the Vlasov-Maxwell system. The conservation of the multi-symplectic structure and the reduction of numerical noise make this algorithm specifically suitable for simulating long-term dynamics of plasmas, such as those in the steady-state operation or long-pulse discharge of a super-conducting tokamak. The algorithm has been implemented in a 6D large scale PIC code. Numerical examples are given to demonstrate the good conservation properties of the multi-symplectic algorithm and the reduction of the noise due to the application of smoothing function.
Dreaming, Stealing, Dancing, Showing Off.
ERIC Educational Resources Information Center
Lavender, Peter; Taylor, Chris
2002-01-01
Lessons learned from British projects to deliver literacy, numeracy, and English as a second language through community agencies included the following: (1) innovation and measured risks are required to attract hard-to-reach adults; (2) good practice needs to be shared; and (3) projects worked best when government funds were managed by community…
Comparison of l₁-Norm SVR and Sparse Coding Algorithms for Linear Regression.
Zhang, Qingtian; Hu, Xiaolin; Zhang, Bo
2015-08-01
Support vector regression (SVR) is a popular function estimation technique based on Vapnik's concept of support vector machine. Among many variants, the l1-norm SVR is known to be good at selecting useful features when the features are redundant. Sparse coding (SC) is a technique widely used in many areas and a number of efficient algorithms are available. Both l1-norm SVR and SC can be used for linear regression. In this brief, the close connection between the l1-norm SVR and SC is revealed and some typical algorithms are compared for linear regression. The results show that the SC algorithms outperform the Newton linear programming algorithm, an efficient l1-norm SVR algorithm, in efficiency. The algorithms are then used to design the radial basis function (RBF) neural networks. Experiments on some benchmark data sets demonstrate the high efficiency of the SC algorithms. In particular, one of the SC algorithms, the orthogonal matching pursuit is two orders of magnitude faster than a well-known RBF network designing algorithm, the orthogonal least squares algorithm.
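One of the SC algorithms singled out above, orthogonal matching pursuit, admits a compact sketch. This is a generic textbook version with illustrative names, not the brief's benchmarked implementation.

```python
import numpy as np

def omp(D, y, n_nonzero):
    """Orthogonal matching pursuit: greedily pick the dictionary atom most
    correlated with the current residual, then re-fit the coefficients by
    least squares on the selected support."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        j = int(np.abs(D.T @ residual).argmax())   # best-matching atom
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef        # orthogonal re-fit
    x[support] = coef
    return x
```

The orthogonal re-fit at every step is what distinguishes OMP from plain matching pursuit and gives it the fast, greedy convergence the brief's timings reflect.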
Good Law, Good Practice, Good Sense: Using Legal Guidelines for Drafting Educational Policies.
ERIC Educational Resources Information Center
Bogotch, Ira E.
1988-01-01
Suggests how to use legal guidelines for drafting educational policies. Analyzes the political context in which present policymaking and governance initiatives exist. Two assumptions frame this article. First, good law makes for good administrative practice. Second, administrator policymaking is more important than the content of the policy…
Nascov, Victor; Logofătu, Petre Cătălin
2009-08-01
We describe a fast computational algorithm able to evaluate the Rayleigh-Sommerfeld diffraction formula, based on a special formulation of the convolution theorem and the fast Fourier transform. What is new in our approach compared to other algorithms is the use of a more general type of convolution with a scale parameter, which allows for independent sampling intervals in the input and output computation windows. Comparison between calculations made using our algorithm and direct numerical integration shows very good agreement, while the computation speed is increased by orders of magnitude.
A parallel encryption algorithm for dual-core processor based on chaotic map
NASA Astrophysics Data System (ADS)
Liu, Jiahui; Song, Dahua; Xu, Yiqiu
2011-12-01
In this paper, we propose a parallel chaos-based encryption scheme that takes advantage of a dual-core processor. The chaos-based cryptosystem is generated from a combination of the logistic map and the Fibonacci sequence, with the Fibonacci sequence employed to convert the values of the logistic map to integer data. The parallel algorithm is designed with a master/slave communication model using the Message Passing Interface (MPI). The experimental results show that the chaotic cryptosystem possesses good statistical properties, and the parallel algorithm provides enhanced performance over the serial version of the algorithm. It is suitable for encrypting and decrypting large amounts of sensitive data or multimedia.
Good News About Childhood Cancer
... Home Current Issue Past Issues Good News About Childhood Cancer Past Issues / Spring 2008 Table of Contents ... 85 percent for the most common form of childhood cancer (acute lymphoblastic leukemia or ALL). During the ...
... for you in a new report. The key indicators include: sleeping at least 85 percent of the ... outlined research needed to identify and describe more indicators of good sleep quality among people of all ...
... age. A healthy diet is especially important for children since a variety of food is needed for proper development. Other elements of good health include exercise, rest and avoidance of stimulants such as sugar and caffeine.
NASA Technical Reports Server (NTRS)
Barth, Timothy J.; Lomax, Harvard
1987-01-01
The past decade has seen considerable activity in algorithm development for the Navier-Stokes equations. This has resulted in a wide variety of useful new techniques. Some examples for the numerical solution of the Navier-Stokes equations are presented, divided into two parts. One is devoted to the incompressible Navier-Stokes equations, and the other to the compressible form.
Depreciation of public goods in spatial public goods games
NASA Astrophysics Data System (ADS)
Shi, Dong-Mei; Zhuang, Yong; Li, Yu-Jian; Wang, Bing-Hong
2011-10-01
In real situations, the value of public goods will be reduced or even lost because of external factors or for intrinsic reasons. In this work, we investigate the evolution of cooperation by considering the effect of depreciation of public goods in spatial public goods games on a square lattice. It is assumed that each individual gains the full advantage if the number of cooperators nc within a group centered on that individual equals or exceeds the critical mass (CM). Otherwise, the public goods depreciate, which is realized by rescaling the multiplication factor r to (nc/CM)r. It is shown that the emergence of cooperation is remarkably promoted for CM > 1 even at small values of r, and a globally cooperative level is achieved at an intermediate value of CM = 4 for small r. We further study the effect of depreciation of public goods on different topologies of a regular lattice, and find that the system always reaches global cooperation at a moderate value of CM = G - 1 regardless of whether or not there exist overlapping triangle structures on the regular lattice, where G is the group size of the associated regular lattice.
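The depreciation rule stated above (rescaling r to (nc/CM)r below the critical mass) translates directly into a group payoff function. A minimal sketch with illustrative names and a unit contribution cost assumed:

```python
def group_payoff(contribs, r, CM, cost=1.0):
    """Payoffs in one public goods group with depreciation: if the number
    of cooperators nc falls short of the critical mass CM, the
    multiplication factor r is rescaled to (nc / CM) * r."""
    nc = sum(contribs)          # contribs[i] is 1 (cooperate) or 0 (defect)
    G = len(contribs)
    r_eff = r if nc >= CM else (nc / CM) * r
    share = r_eff * nc * cost / G      # equal share of the multiplied pool
    return [share - cost * c for c in contribs]
```

For example, in a five-member group with r = 3 and CM = 4, two cooperators yield an effective factor of 1.5, so cooperators lose while defectors still free-ride; with four cooperators the full factor applies.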
Preconditioned quantum linear system algorithm.
Clader, B D; Jacobs, B C; Sprouse, C R
2013-06-21
We describe a quantum algorithm that generalizes the quantum linear system algorithm [Harrow et al., Phys. Rev. Lett. 103, 150502 (2009)] to arbitrary problem specifications. We develop a state preparation routine that can initialize generic states, show how simple ancilla measurements can be used to calculate many quantities of interest, and integrate a quantum-compatible preconditioner that greatly expands the number of problems that can achieve exponential speedup over classical linear systems solvers. To demonstrate the algorithm's applicability, we show how it can be used to compute the electromagnetic scattering cross section of an arbitrary target exponentially faster than the best classical algorithm.
Planning Good Change with Technology and Literacy.
ERIC Educational Resources Information Center
McKenzie, Jamie
This book describes strategies to put information literacy and student learning at the center of technology planning. Filled with stories of success and with models of good planning, the book shows how to clarify purpose, involve important stakeholders, and pace the change process to maximize the daily use of new technologies. The following…
A Good Teaching Technique: WebQuests
ERIC Educational Resources Information Center
Halat, Erdogan
2008-01-01
In this article, the author first introduces and describes a new teaching tool called WebQuests to practicing teachers. He then provides detailed information about the structure of a good WebQuest. Third, the author shows the strengths and weaknesses of using WebQuests in teaching and learning. Last, he points out the challenges for practicing…
Robustness properties of hill-climbing algorithm based on Zernike modes for laser beam correction.
Liu, Ying; Ma, Jianqiang; Chen, Junjie; Li, Baoqing; Chu, Jiaru
2014-04-01
A modified hill-climbing algorithm based on Zernike modes is used for laser beam correction. The algorithm adopts the Zernike mode coefficients, instead of the deformable mirror actuators' voltages in a traditional hill-climbing algorithm, as the adjustable variables to optimize the object function. The effect of the mismatches between the laser beam and the deformable mirror both in the aperture size and the center position was analyzed numerically and experimentally to test the robustness of the algorithm. Both simulation and experimental results show that the mismatches have almost no influence on the laser beam correction, unless the laser beam exceeds the effective aperture of the deformable mirror, which indicates the good robustness of the algorithm.
Design and FPGA implementation of real-time automatic image enhancement algorithm
NASA Astrophysics Data System (ADS)
Dong, GuoWei; Hou, ZuoXun; Tang, Qi; Pan, Zheng; Li, Xin
2016-11-01
In order to improve image processing quality and boost the processing rate, this paper proposes a real-time automatic image enhancement algorithm. It is based on the histogram equalization algorithm and the piecewise linear enhancement algorithm, and it determines the relationship between the histogram and the piecewise linear function by analyzing the histogram distribution, for adaptive image enhancement. Furthermore, the corresponding FPGA processing modules are designed to implement the methods. In particular, high-performance parallel pipelined technology and the inherent parallel processing ability of the modules are emphasized to ensure the real-time processing ability of the complete system. Simulations and experiments show that the FPGA hardware implementation of the algorithm has low hardware cost, high real-time performance, and good processing performance in different scenes. The algorithm can effectively improve image quality and should have wide prospects in the image processing field.
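The histogram equalization half of the method, in its standard software form, can be sketched as below; the paper's contribution is the adaptive combination with piecewise linear enhancement and the FPGA pipeline, neither of which is shown here.

```python
import numpy as np

def equalize(img):
    """Classic histogram equalisation for an 8-bit grey image: map each
    grey level through the normalised cumulative histogram so that the
    output levels spread over the full 0..255 range."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[np.nonzero(cdf)][0]            # first populated bin
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255)
    return lut.astype(np.uint8)[img]             # apply the lookup table
```

On hardware, the same computation maps naturally onto a histogram-accumulation pass followed by a lookup-table pass, which is what makes it a good fit for a pipelined FPGA design.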
Digital watermarking algorithm research of color images based on quaternion Fourier transform
NASA Astrophysics Data System (ADS)
An, Mali; Wang, Weijiang; Zhao, Zhen
2013-10-01
A watermarking algorithm for color images based on the quaternion Fourier transform (QFFT) and an improved quantization index modulation (QIM) algorithm is proposed in this paper. The original image is transformed by the QFFT, the watermark image is processed by compression and quantization coding, and the processed watermark is then embedded into components of the transformed original image, achieving embedding and blind extraction of the watermark. The experimental results show that the watermarking algorithm based on the improved QIM algorithm with distortion compensation achieves a good tradeoff between invisibility and robustness, and better robustness than the traditional QIM algorithm against Gaussian noise, salt-and-pepper noise, JPEG compression, cropping, filtering, and image enhancement attacks.
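Plain QIM, without the paper's distortion compensation, quantizes a coefficient onto one of two interleaved lattices selected by the message bit, and extraction picks the nearer lattice. A hedged scalar sketch follows; the paper applies the idea to QFFT components, and the step size delta here is an illustrative choice.

```python
import numpy as np

def qim_embed(c, bit, delta=8.0):
    """Quantise coefficient c onto the lattice delta*Z (bit 0) or the
    shifted lattice delta*Z + delta/2 (bit 1)."""
    return delta * np.round((c - bit * delta / 2.0) / delta) + bit * delta / 2.0

def qim_extract(c, delta=8.0):
    """Decode by choosing the lattice whose nearest point is closer."""
    d0 = abs(c - qim_embed(c, 0, delta))
    d1 = abs(c - qim_embed(c, 1, delta))
    return 0 if d0 <= d1 else 1
```

Distortions smaller than delta/4 cannot flip the decoded bit, which is the basic robustness/invisibility trade-off that tuning delta controls.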
15. Detail showing lower chord pin-connected to vertical member, showing floor beam riveted to extension of vertical member below pin-connection, and showing brackets supporting cantilevered sidewalk. View to southwest. - Selby Avenue Bridge, Spanning Short Line Railways track at Selby Avenue between Hamline & Snelling Avenues, Saint Paul, Ramsey County, MN
A Pretty Good Paper about Pretty Good Privacy.
ERIC Educational Resources Information Center
McCollum, Roy
With today's growth in the use of electronic information systems for e-mail, data development and research, and the relative ease of access to such resources, protecting one's data and correspondence has become a great concern. "Pretty Good Privacy" (PGP), an encryption program developed by Phil Zimmermann, may be the software tool that…
Answering the Complex Question of "How Good Is Good Enough?"
ERIC Educational Resources Information Center
Suskie, Linda
2007-01-01
Imagine that a student--let's call him Michael--has earned a score of 55 on an examination. How well did he do? Was his score good enough for him to pass the examination, or pass the course, or be deemed at least minimally competent in whatever the exam was assessing? Michael's score alone, in the absence of any other information, cannot answer…
One improved LSB steganography algorithm
NASA Astrophysics Data System (ADS)
Song, Bing; Zhang, Zhi-hong
2013-03-01
Information hidden in digital images with the plain LSB algorithm is easily detected, with high accuracy, by X2 (chi-square) and RS steganalysis. We started by selecting the embedding locations and modifying the embedding method; combining this with a sub-affine transformation and the matrix coding method, we improved the LSB algorithm and propose a new LSB algorithm. Experimental results show that the improved algorithm can resist X2 and RS steganalysis effectively.
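For reference, the plain LSB baseline that such improvements harden can be sketched as below; the sub-affine location scrambling and matrix coding of the improved algorithm are not shown, and the sequential row-major embedding order is an assumption of the sketch.

```python
import numpy as np

def lsb_embed(cover, bits):
    """Hide a bit sequence in the least significant bits of the first
    len(bits) pixels of an 8-bit image, in row-major order."""
    flat = cover.ravel().copy()
    bits = np.asarray(bits, dtype=np.uint8)
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits   # clear LSB, set bit
    return flat.reshape(cover.shape)

def lsb_extract(stego, n):
    """Read back the first n embedded bits."""
    return stego.ravel()[:n] & 1
```

Each embedded bit changes a pixel value by at most 1, which is visually invisible but leaves exactly the pair-of-values statistics that chi-square and RS steganalysis exploit.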
Parallel algorithms for unconstrained optimizations by multisplitting
He, Qing
1994-12-31
In this paper a new parallel iterative algorithm for unconstrained optimization using the idea of multisplitting is proposed. This algorithm uses the existing sequential algorithms without any parallelization. Some convergence and numerical results for this algorithm are presented. The experiments are performed on an Intel iPSC/860 Hyper Cube with 64 nodes. It is interesting that the sequential implementation on one node shows that if the problem is split properly, the algorithm converges much faster than one without splitting.
Comparative analysis of PSO algorithms for PID controller tuning
NASA Astrophysics Data System (ADS)
Štimac, Goranka; Braut, Sanjin; Žigulić, Roberto
2014-09-01
The active magnetic bearing (AMB) suspends the rotating shaft and maintains it in a levitated position by applying controlled electromagnetic forces on the rotor in the radial and axial directions. Although the development of various control methods is rapid, the PID control strategy is still the most widely used in many applications, including AMBs. In order to tune the PID controller, a particle swarm optimization (PSO) method is applied. A comparative analysis of PSO algorithms is carried out, in which two variants, namely (1) PSO with linearly decreasing inertia weight (LDW-PSO) and (2) PSO with the constriction factor approach (CFA-PSO), are independently tested for different PID structures. The computer simulations are carried out with the aim of minimizing an objective function defined as the integral of time multiplied by the absolute value of error (ITAE). In order to validate the performance of the analyzed PSO algorithms, one-axis and two-axis radial rotor/active magnetic bearing systems are examined. The results show that PSO algorithms are effective and easily implemented methods, providing stable convergence and good computational efficiency for different PID structures in rotor/AMB systems. Moreover, the PSO algorithms prove easy to use for controller tuning in both SISO and MIMO systems, considering the system delay and the interference between the horizontal and vertical rotor axes.
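A minimal sketch of LDW-PSO, assuming the standard constants (inertia weight decreasing linearly from 0.9 to 0.4, c1 = c2 = 2.0, a velocity clamp); a shifted quadratic stands in for the ITAE cost, which the paper computes from closed-loop simulation of the rotor/AMB system:

```python
import random

def pso_ldw(f, dim, n_particles=20, iters=60, w_max=0.9, w_min=0.4,
            c1=2.0, c2=2.0, lo=-5.0, hi=5.0, seed=1):
    """Particle swarm optimization with linearly decreasing inertia weight."""
    rng = random.Random(seed)
    vmax = 0.5 * (hi - lo)  # velocity clamp to keep the swarm stable
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in x]
    pbest_f = [f(p) for p in x]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / (iters - 1)  # linear decrease
        for i in range(n_particles):
            for d in range(dim):
                vel = (w * v[i][d]
                       + c1 * rng.random() * (pbest[i][d] - x[i][d])
                       + c2 * rng.random() * (gbest[d] - x[i][d]))
                v[i][d] = max(-vmax, min(vmax, vel))
                x[i][d] += v[i][d]
            fi = f(x[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = x[i][:], fi
    return gbest, gbest_f

# Shifted quadratic standing in for the ITAE cost of a simulated PID loop;
# the minimizer plays the role of the optimal (Kp, Ki) gains.
best, val = pso_ldw(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2, dim=2)
```

Swapping in CFA-PSO amounts to replacing the inertia-weight update with a constant constriction factor applied to the whole velocity update.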
NASA Astrophysics Data System (ADS)
Syariz, M. A.; Jaelani, L. M.; Subehi, L.; Pamungkas, A.; Koenhardono, E. S.; Sulisetyono, A.
2015-10-01
Sea surface temperature (SST) retrieval from satellite data can provide SST records over long periods. Since the algorithms that estimate SST from the Landsat 8 thermal bands are site-dependent, an algorithm applicable to Indonesian waters needs to be developed. The aim of this research was to develop SST algorithms for the waters north of Java Island. The data used were in-situ measurements taken on April 22, 2015 and brightness temperatures estimated from the Landsat 8 thermal bands (band 10 and band 11). The algorithm was established using 45 data points by relating the measured in-situ data to the estimated brightness temperature, and was then validated on another 40 points. The results showed good performance of the sea surface temperature algorithm, with a coefficient of determination (R2) of 0.912 and a root mean square error (RMSE) of 0.028.
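The calibration step described (relating 45 in-situ points to brightness temperature, then reporting R2 and RMSE) amounts to a linear fit. A sketch on synthetic stand-in data, since the actual measurements are not available here and the linear relation and noise level are assumptions:

```python
import numpy as np

# Synthetic stand-in: Landsat band-10 brightness temperature (K) versus
# in-situ SST (K); slope, intercept, and noise level are illustrative.
rng = np.random.default_rng(0)
bt = rng.uniform(290.0, 300.0, 45)                 # 45 calibration points
sst = 1.02 * bt + 1.5 + rng.normal(0.0, 0.3, 45)   # "measured" in-situ SST

# Establish the linear SST algorithm from the calibration points.
a, b = np.polyfit(bt, sst, 1)
pred = a * bt + b

# Goodness of fit: coefficient of determination (R2) and RMSE.
ss_res = float(np.sum((sst - pred) ** 2))
ss_tot = float(np.sum((sst - sst.mean()) ** 2))
r2 = 1.0 - ss_res / ss_tot
rmse = float(np.sqrt(np.mean((sst - pred) ** 2)))
```

Validation on a held-out set, as the paper does with its 40 extra points, would reuse the fitted (a, b) on fresh data rather than refitting.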
Sherlock, Richard
2008-09-01
Public goods can be seen as one important way in which societies sustain themselves over time; they are part of the puzzle of the development of political order. Public goods like the rule of law are non-subtractable and non-excludable. For economists the classic textbook examples are national defense and police protection. In this paper I argue that religiosity can function like police protection: a means of sustaining order through fear of punishment from a transcendent source. As a means of reducing defection from social norms it has a role to play as a public good. But religion cannot at the same time be seen as the source of such norms, or dissension will undermine the very order that punishment seems to reinforce.
Method of stereo matching based on genetic algorithm
NASA Astrophysics Data System (ADS)
Lu, Chaohui; An, Ping; Zhang, Zhaoyang
2003-09-01
A new stereo matching scheme based on image edges and a genetic algorithm (GA) is presented in this paper to improve on conventional stereo matching methods. In order to extract robust edge features for stereo matching, an infinite symmetric exponential filter (ISEF) is first applied to remove image noise, and a nonlinear Laplace operator together with the local variance of intensity is then used to detect edges. Apart from the detected edges, the polarity of edge pixels is also obtained. As an efficient search method, the genetic algorithm is applied to find the best matching pairs, and some new ideas are developed for applying genetic algorithms to stereo matching. Experimental results show that the proposed methods are effective and obtain good results.
A novel speech watermarking algorithm by line spectrum pair modification
NASA Astrophysics Data System (ADS)
Zhang, Qian; Yang, Senbin; Chen, Guang; Zhou, Jun
2011-10-01
To explore digital watermarking specifically suited to the speech domain, this paper first experimentally investigates the properties of line spectrum pair (LSP) parameters. The results show that the differences between contiguous LSPs are robust against common signal processing operations, and that small modifications of LSPs are imperceptible to the human auditory system (HAS). Based on these conclusions, three contiguous LSPs of a speech frame are selected to embed one watermark bit. The middle LSP is slightly altered to modify the differences between these LSPs when embedding the watermark; correspondingly, the watermark is extracted by comparing these differences. The proposed algorithm's transparency is adjustable to meet the needs of different applications. The algorithm has good robustness against additive noise, quantization, amplitude scaling, and MP3 compression attacks, with a bit error rate (BER) of less than 5%. The algorithm's capacity is relatively low, at approximately 50 bps.
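A toy version of the embedding rule described, assuming the sign of the difference between the two adjacent LSP gaps carries the bit; the numeric LSP values are made up and the shift margin is an assumption, not the paper's tuned transparency parameter:

```python
def embed_bit(l1, l2, l3, bit, margin=0.001):
    """Move the middle of three contiguous LSPs so that the two adjacent
    differences encode one watermark bit:
    bit 1 -> (l2 - l1) > (l3 - l2); bit 0 -> the opposite."""
    mid = (l1 + l3) / 2.0
    return mid + margin if bit == 1 else mid - margin

def extract_bit(l1, l2, l3):
    """Recover the bit by comparing the two differences."""
    return 1 if (l2 - l1) > (l3 - l2) else 0

triple = (0.12, 0.15, 0.19)  # three contiguous LSPs of one frame (made up)
for bit in (0, 1):
    l2w = embed_bit(triple[0], triple[1], triple[2], bit)
    assert extract_bit(triple[0], l2w, triple[2]) == bit
    assert abs(l2w - triple[1]) < 0.01  # small shift, hence imperceptible
```

Enlarging `margin` trades transparency for robustness, which matches the adjustable-transparency claim in the abstract.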
A self-tuning phase-shifting algorithm for interferometry.
Estrada, Julio C; Servin, Manuel; Quiroga, Juan A
2010-02-01
In Phase Stepping Interferometry (PSI) an interferogram sequence having a known and constant phase shift between the interferograms is required. Here we take the case where this constant phase shift is unknown, and the only assumption is that the interferograms do have a temporal carrier. To recover the modulating phase from the interferograms, we propose a self-tuning phase-shifting algorithm. Our algorithm first estimates the temporal frequency, and then uses this knowledge to estimate the modulating phase of interest. Several well-known iterative schemes have been published before, but our approach has the unique advantage of being very fast: our new temporal-carrier and phase estimator obtains a very good approximation of the temporal carrier in a single iteration. Numerical experiments are given to show the performance of this simple yet powerful self-tuning phase-shifting algorithm.
Preliminary results from the ASF/GPS ice classification algorithm
NASA Technical Reports Server (NTRS)
Cunningham, G.; Kwok, R.; Holt, B.
1992-01-01
The European Space Agency Remote Sensing Satellite (ERS-1) satellite carried a C-band synthetic aperture radar (SAR) to study the earth's polar regions. The radar returns from sea ice can be used to infer properties of ice, including ice type. An algorithm has been developed for the Alaska SAR facility (ASF)/Geophysical Processor System (GPS) to infer ice type from the SAR observations over sea ice and open water. The algorithm utilizes look-up tables containing expected backscatter values from various ice types. An analysis has been made of two overlapping strips with 14 SAR images. The backscatter values of specific ice regions were sampled to study the backscatter characteristics of the ice in time and space. Results show both stability of the backscatter values in time and a good separation of multiyear and first-year ice signals, verifying the approach used in the classification algorithm.
A Hybrid Ant Colony Algorithm for Loading Pattern Optimization
NASA Astrophysics Data System (ADS)
Hoareau, F.
2014-06-01
Electricité de France (EDF) operates 58 nuclear power plants (NPPs) of the pressurized water reactor (PWR) type. The loading pattern (LP) optimization of these NPPs is currently done by EDF expert engineers. Within this framework, EDF R&D has developed automatic optimization tools that assist the experts, who can resort, for instance, to loading pattern optimization software based on an ant colony algorithm. This paper presents an analysis of the search space of a few realistic loading pattern optimization problems. This analysis leads us to introduce a hybrid algorithm based on ant colony optimization and a local search method. We then show that this new algorithm is able to generate loading patterns of good quality.
Good Practices in Free-energy Calculations
NASA Technical Reports Server (NTRS)
Pohorille, Andrew; Jarzynski, Christopher; Chipot, Christopher
2013-01-01
As access to computational resources continues to increase, free-energy calculations have emerged as a powerful tool that can play a predictive role in drug design. Yet, in a number of instances, the reliability of these calculations can be improved significantly if a number of precepts, or good practices, are followed. For the most part, the theory upon which these good practices rely has been known for many years, but it is often overlooked or simply ignored. In other cases, the theoretical developments are too recent for their potential to be fully grasped and merged into popular platforms for the computation of free-energy differences. The current best practices for carrying out free-energy calculations will be reviewed, demonstrating that, at little to no additional cost, free-energy estimates can be markedly improved and bounded by meaningful error estimates. In free-energy perturbation and nonequilibrium work methods, monitoring the probability distributions that underlie the transformation between the states of interest, performing the calculation bidirectionally, stratifying the reaction pathway, and choosing the most appropriate paradigms and algorithms for transforming between states offer significant gains in both accuracy and precision. In thermodynamic integration and probability distribution (histogramming) methods, properly designed adaptive techniques yield nearly uniform sampling of the relevant degrees of freedom and, by doing so, can markedly improve the efficiency and accuracy of free-energy calculations without incurring any additional computational expense.
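One practice named above, monitoring the estimator itself, can be illustrated with the free-energy perturbation (Zwanzig) identity on synthetic Gaussian work values, for which the exact answer is known analytically. This is a didactic sketch, not the protocol of the paper:

```python
import numpy as np

# Zwanzig free-energy perturbation: dF = -ln< exp(-W) >, in units of kT.
# For Gaussian work values W ~ N(mu, s^2) the exact answer is mu - s^2/2,
# so the estimator can be checked and its small-sample behavior observed.
rng = np.random.default_rng(42)
mu, s = 2.0, 1.0
exact = mu - s ** 2 / 2.0

def fep(work):
    """Forward free-energy-perturbation estimate from a set of work values."""
    return float(-np.log(np.mean(np.exp(-work))))

w_small = rng.normal(mu, s, 100)       # sparse sampling: noisy estimate
w_large = rng.normal(mu, s, 200_000)   # dense sampling: converges to `exact`
```

Comparing the small-sample and large-sample estimates against the analytic value is a simple instance of the recommended monitoring; in practice one would also run the reverse transformation and check the overlap of the two work distributions.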
8 CFR 274a.4 - Good faith defense.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Good faith defense. 274a.4 Section 274a.4... ALIENS Employer Requirements § 274a.4 Good faith defense. An employer or a recruiter or referrer for a fee for employment who shows good faith compliance with the employment verification requirements...
8 CFR 274a.4 - Good faith defense.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 8 Aliens and Nationality 1 2012-01-01 2012-01-01 false Good faith defense. 274a.4 Section 274a.4... ALIENS Employer Requirements § 274a.4 Good faith defense. An employer or a recruiter or referrer for a fee for employment who shows good faith compliance with the employment verification requirements...
8 CFR 274a.4 - Good faith defense.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 8 Aliens and Nationality 1 2014-01-01 2014-01-01 false Good faith defense. 274a.4 Section 274a.4... ALIENS Employer Requirements § 274a.4 Good faith defense. An employer or a recruiter or referrer for a fee for employment who shows good faith compliance with the employment verification requirements...
8 CFR 274a.4 - Good faith defense.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 8 Aliens and Nationality 1 2011-01-01 2011-01-01 false Good faith defense. 274a.4 Section 274a.4... ALIENS Employer Requirements § 274a.4 Good faith defense. An employer or a recruiter or referrer for a fee for employment who shows good faith compliance with the employment verification requirements...
8 CFR 274a.4 - Good faith defense.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 8 Aliens and Nationality 1 2013-01-01 2013-01-01 false Good faith defense. 274a.4 Section 274a.4... ALIENS Employer Requirements § 274a.4 Good faith defense. An employer or a recruiter or referrer for a fee for employment who shows good faith compliance with the employment verification requirements...
Algorithm aversion: people erroneously avoid algorithms after seeing them err.
Dietvorst, Berkeley J; Simmons, Joseph P; Massey, Cade
2015-02-01
Research shows that evidence-based algorithms more accurately predict the future than do human forecasters. Yet when forecasters are deciding whether to use a human forecaster or a statistical algorithm, they often choose the human forecaster. This phenomenon, which we call algorithm aversion, is costly, and it is important to understand its causes. We show that people are especially averse to algorithmic forecasters after seeing them perform, even when they see them outperform a human forecaster. This is because people more quickly lose confidence in algorithmic than human forecasters after seeing them make the same mistake. In 5 studies, participants either saw an algorithm make forecasts, a human make forecasts, both, or neither. They then decided whether to tie their incentives to the future predictions of the algorithm or the human. Participants who saw the algorithm perform were less confident in it, and less likely to choose it over an inferior human forecaster. This was true even among those who saw the algorithm outperform the human.
NASA Astrophysics Data System (ADS)
Wang, Mengmeng; Zhang, Zhaoming; He, Guojin; Wang, Guizhou; Long, Tengfei; Peng, Yan
2016-10-01
Land surface temperature (LST) is a critical parameter in the physics of Earth surface processes and is required for many applications related to ecology and environment. Landsat series satellites have provided more than 30 years of thermal information at medium spatial resolution. This paper proposes an enhanced single-channel algorithm (SCen) for retrieving LST from Landsat series data (Landsat 4 to Landsat 8). The SCen algorithm includes three atmospheric functions (AFs), and the latitude and acquisition month of the Landsat image were added to the AF models to improve LST retrieval. Performance of the SCen algorithm was assessed with both simulated and in situ data, and the accuracies of the three single-channel algorithms (SCen; the mono-window algorithm developed by Qin et al., SCQin; and the generalized single-channel algorithm developed by Jiménez-Muñoz and Sobrino, SCJ&S) were compared. The accuracy assessments with simulated data gave root-mean-square deviations (RMSDs) for the SCen, SCJ&S, and SCQin algorithms of 1.363 K, 1.858 K, and 2.509 K, respectively. Validation with in situ data showed RMSDs for the SCen and SCJ&S algorithms of 1.04 K and 1.49 K, respectively. It was concluded that the SCen algorithm is operationally practical, has good precision, and can be used to develop an LST product for Landsat series data.
ERIC Educational Resources Information Center
Ida, Zagyváné Szucs
2017-01-01
The introduction of the new Teacher Career Model, the School Inspectorate, and the Complex School Assessment raises the basic question "What makes a good teacher?" Scholars have focused on this issue for a long time. Recent studies target the beliefs of teachers, school principals, and teacher students.…
Measuring Goodness of Story Narratives
ERIC Educational Resources Information Center
Le, Karen; Coelho, Carl; Mozeiko, Jennifer; Grafman, Jordan
2011-01-01
Purpose: The purpose of this article was to evaluate a new measure of story narrative performance: story completeness. It was hypothesized that by combining organizational (story grammar) and completeness measures, story "goodness" could be quantified. Method: Discourse samples from 46 typically developing adults were compared with those from 24…
Practicing Good Habits, Grade 2.
ERIC Educational Resources Information Center
Nguyen Van Quan; And Others
This illustrated primer, designed for second grade students in Vietnam, consists of stories depicting rural family life in Vietnam. The book is divided into the following six chapters: (1) Practicing Good Habits (health, play, helpfulness); (2) Duties at Home (grandparents, father and mother, servants, the extended family); (3) Duties in School…
ERIC Educational Resources Information Center
Chase, Barbara
2011-01-01
How are independent schools to be useful to the wider world? Beyond their common commitment to educate their students for meaningful lives in service of the greater good, can they educate a broader constituency and, thus, share their resources and skills more broadly? Their answers to this question will be shaped by their independence. Any…
Metrics for Soft Goods Merchandising.
ERIC Educational Resources Information Center
Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.
Designed to meet the job-related metric measurement needs of students interested in soft goods merchandising, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…
Metrics for Hard Goods Merchandising.
ERIC Educational Resources Information Center
Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.
Designed to meet the job-related metric measurement needs of students interested in hard goods merchandising, this instructional package is one of five for the marketing and distribution cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational…
Gender Play and Good Governance
ERIC Educational Resources Information Center
Powell, Mark
2008-01-01
Like good government, thoughtful care of children requires those in power, whether teachers or parents, to recognize when it is appropriate for them to step back from day-to-day decision-making while still working behind the scenes to ensure an organizational structure that supports the independence and equitable development of those they serve.…
What Good Are Conferences, Anyway?
ERIC Educational Resources Information Center
Pietro, David C.
1996-01-01
According to Frederick Herzberg's studies of employee motivation, humans are driven by motivating factors that allow them to grow psychologically and hygiene factors that help them meet physical needs. Good education conferences can enhance both factors by helping principals refocus their energies, exchange ideas with trusted colleagues, and view…
ERIC Educational Resources Information Center
Cockburn, Stewart
1969-01-01
The basic requirements of all good prose are clarity, accuracy, brevity, and simplicity. Especially in public prose--in which the meaning is the crux of the article or speech--concise, vigorous English demands a minimum of adjectives, a maximum use of the active voice, nouns carefully chosen, a logical argument with no labored or obscure points,…
ERIC Educational Resources Information Center
Perlmutter, David D.
2008-01-01
Perhaps getting advice seems a clearer-cut task than giving it. But at a time when budding academics seem busier and more distracted than ever, it is all the more important to understand how to learn from a mentor, and that being a good protege has its own strategies, techniques and responsibilities. The mentor relationship is alive and well in…
Good Science, Good Sense and Good Sensibilities: The Three Ss of Carol Newton.
Smith, Adrian J; Hawkins, Penny
2016-11-11
The Three Rs principle of Replacement, Reduction and Refinement developed by William M. S. Russell and Rex L. Burch in the 1950s has achieved worldwide recognition as a means of reducing the impact of science on animals and improving their welfare. However, application of the Three Rs is still far from universal, and evidence-based methods to implement the Three Rs are still lacking in many areas of laboratory animal science. The purpose of this paper is to create interest in a less well-known but equally useful principle that complements the Three Rs, which was proposed by the American biomathematician Carol M. Newton in the 1970s: the Three Ss-Good Science, Good Sense and Good Sensibilities.
Good Science, Good Sense and Good Sensibilities: The Three Ss of Carol Newton
Smith, Adrian J.; Hawkins, Penny
2016-01-01
Simple Summary The use of animals in research is controversial and there are ongoing efforts to refine and limit their use to an absolute minimum. The Three Rs principle developed by William M. S. Russell and Rex L. Burch in the UK in the 1950s is widely used for this purpose: Replacement, Reduction and Refinement. Another useful principle which complements the Three Rs was proposed by Carol Newton in the 1970s. This principle is the Three Ss: Good Science, Good Sense and Good Sensibilities. Unlike the Three Rs, the Three Ss have not been described in detail in the literature. The purpose of this paper is to increase awareness of the Three Ss, which are a useful supplement to the Three Rs, improving animal welfare and leading to better science. Abstract The Three Rs principle of Replacement, Reduction and Refinement developed by William M. S. Russell and Rex L. Burch in the 1950s has achieved worldwide recognition as a means of reducing the impact of science on animals and improving their welfare. However, application of the Three Rs is still far from universal, and evidence-based methods to implement the Three Rs are still lacking in many areas of laboratory animal science. The purpose of this paper is to create interest in a less well-known but equally useful principle that complements the Three Rs, which was proposed by the American biomathematician Carol M. Newton in the 1970s: the Three Ss—Good Science, Good Sense and Good Sensibilities. PMID:27845707
Power spectral estimation algorithms
NASA Technical Reports Server (NTRS)
Bhatia, Manjit S.
1989-01-01
Algorithms to estimate the power spectrum using Maximum Entropy Methods were developed. These algorithms were coded in FORTRAN 77 and implemented on the VAX 780. The important considerations in this analysis are: (1) resolution, i.e., how closely spaced in frequency two spectral components can be and still be identified; (2) dynamic range, i.e., how small a spectral peak can be, relative to the largest, and still be observed in the spectra; and (3) variance, i.e., how close the estimated spectrum is to the actual spectrum. The application of the algorithms based on Maximum Entropy Methods to a variety of data shows that these criteria are met quite well. Additional work in this direction would help confirm the findings. All of the software developed was turned over to the technical monitor. A copy of a typical program is included, as are some of the actual data and the graphs produced from them.
Temperature Corrected Bootstrap Algorithm
NASA Technical Reports Server (NTRS)
Comiso, Joey C.; Zwally, H. Jay
1997-01-01
A temperature-corrected Bootstrap algorithm has been developed using Nimbus-7 Scanning Multichannel Microwave Radiometer data in preparation for the upcoming AMSR instrument aboard ADEOS and EOS-PM. The procedure first calculates the effective surface emissivity using the emissivities of ice and water at 6 GHz and a mixing formulation that utilizes ice concentrations derived with the current Bootstrap algorithm, but using brightness temperatures from the 6 GHz and 37 GHz channels. These effective emissivities are then used to calculate surface ice temperature, which in turn is used to convert the 18 GHz and 37 GHz brightness temperatures to emissivities. Ice concentrations are then derived using the same technique as the Bootstrap algorithm, but using emissivities instead of brightness temperatures. The results show significant improvement in areas where the ice temperature is expected to vary considerably, such as near the continental areas in the Antarctic, where the ice temperature is colder than average, and in marginal ice zones.
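The linear-mixing and conversion steps can be sketched with illustrative (not calibrated) emissivities. In this hedged toy model, brightness temperature is simply emissivity times physical surface temperature:

```python
# Linear-mixing step of a temperature-corrected concentration retrieval:
# the effective surface emissivity is a concentration-weighted mix of the
# ice and open-water emissivities, and a brightness temperature is
# converted to an emissivity via T_B = e * T_s. The values below are
# illustrative, not the calibrated channel values used by the algorithm.
E_ICE, E_WATER = 0.92, 0.45  # assumed emissivities at one channel

def effective_emissivity(ice_conc):
    return ice_conc * E_ICE + (1.0 - ice_conc) * E_WATER

def to_emissivity(brightness_temp, surface_temp):
    """Divide out the physical temperature whose variability biases
    concentration retrievals based on brightness temperature alone."""
    return brightness_temp / surface_temp

# Forward-model a pixel with 80% ice cover at T_s = 260 K, then invert.
e = effective_emissivity(0.8)
t_b = e * 260.0
conc = (to_emissivity(t_b, 260.0) - E_WATER) / (E_ICE - E_WATER)
```

Working in emissivity space makes the inversion independent of T_s, which is the point of the temperature correction in regions where ice temperature departs from the average.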
NASA Technical Reports Server (NTRS)
Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.
1993-01-01
New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated using failure data, and represents an approximation of the cumulative distribution function for the underlying population. Statistics (such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic) measure the discrepancy between the EDF and the cumulative distribution function (CDF). These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to obtain the optimum value of the EDF. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
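A minimal sketch of the approach, minimizing the Kolmogorov-Smirnov statistic between the EDF and a Weibull CDF. For brevity a two-parameter Weibull and a coarse grid search stand in for the three-parameter distribution and Powell's method used in the paper:

```python
import numpy as np

def weibull_cdf(x, shape, scale):
    return 1.0 - np.exp(-((x / scale) ** shape))

def ks_statistic(data, shape, scale):
    """Kolmogorov-Smirnov distance between the empirical distribution
    function of `data` and a Weibull CDF with the given parameters."""
    x = np.sort(data)
    n = len(x)
    cdf = weibull_cdf(x, shape, scale)
    d_plus = np.max(np.arange(1, n + 1) / n - cdf)   # EDF above the CDF
    d_minus = np.max(cdf - np.arange(0, n) / n)      # CDF above the EDF
    return max(d_plus, d_minus)

# Synthetic failure data drawn from a known Weibull (shape 2, scale 10).
rng = np.random.default_rng(3)
data = 10.0 * rng.weibull(2.0, 500)

# Minimize the KS statistic over the two parameters by grid search.
best = min((ks_statistic(data, sh, sc), sh, sc)
           for sh in np.linspace(1.0, 4.0, 61)
           for sc in np.linspace(5.0, 15.0, 101))
d_opt, shape_opt, scale_opt = best
```

Replacing the KS distance with the Anderson-Darling statistic, or the grid search with a derivative-free optimizer such as Powell's method, follows the same pattern.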
Ensembles of satellite aerosol retrievals based on three AATSR algorithms within aerosol_cci
NASA Astrophysics Data System (ADS)
Kosmale, Miriam; Popp, Thomas
2016-04-01
Ensemble techniques are widely used in the modelling community, combining different modelling results in order to reduce uncertainties. This approach can also be adapted to satellite measurements. Aerosol_cci is an ESA-funded project in which most of the European aerosol retrieval groups work together. The different algorithms are homogenized as far as it makes sense, but remain essentially different. Datasets are compared with ground-based measurements and with each other. Within this project, three AATSR algorithms (the Swansea University aerosol retrieval, the ADV aerosol retrieval by FMI, and the Oxford aerosol retrieval ORAC) provide 17-year global aerosol records. Each of these algorithms also provides uncertainty information at pixel level. In the presented work, an ensemble of the three AATSR algorithms is constructed. The advantage over each single algorithm is the higher spatial coverage due to more measurement pixels per gridbox. Validation against ground-based AERONET measurements shows that the ensemble still correlates well compared with the single algorithms. Annual mean maps show the global aerosol distribution based on a combination of the three aerosol algorithms. In addition, the pixel-level uncertainties of each algorithm are used to weight the contributions, in order to reduce the uncertainty of the ensemble. Results of different versions of the ensemble for aerosol optical depth will be presented, discussed, and validated against ground-based AERONET measurements. Higher spatial coverage on a daily basis allows better results in annual mean maps. The benefit of using pixel-level uncertainties is also analysed.
Hey Teacher, Your Personality's Showing!
ERIC Educational Resources Information Center
Paulsen, James R.
1977-01-01
A study of 30 fourth, fifth, and sixth grade teachers and 300 of their students showed that a teacher's age, sex, and years of experience did not relate to students' mathematics achievement, but that more effective teachers showed greater "freedom from defensive behavior" than did less effective teachers. (DT)
Planning a Successful Tech Show
ERIC Educational Resources Information Center
Nikirk, Martin
2011-01-01
Tech shows are a great way to introduce prospective students, parents, and local business and industry to a technology and engineering or career and technical education program. In addition to showcasing instructional programs, a tech show allows students to demonstrate their professionalism and skills, practice public presentations, and interact…
Switch for Good Community Program
Crawford, Tabitha; Amran, Martha
2013-11-19
Switch4Good is an energy-savings program that helps residents reduce consumption from behavior changes; it was co-developed by Balfour Beatty Military Housing Management (BB) and WattzOn in Phase I of this grant. The program was offered at 11 Navy bases. Three customer engagement strategies were evaluated, and it was found that Digital Nudges (a combination of monthly consumption statements with frequent messaging via text or email) was most cost-effective.
Research on registration algorithm for check seal verification
NASA Astrophysics Data System (ADS)
Wang, Shuang; Liu, Tiegen
2008-03-01
Nowadays seals play an important role in China, and with the development of the social economy, the traditional method of manual check seal identification can no longer meet the needs of banking transactions. This paper focuses on pre-processing and registration algorithms for check seal verification using image processing and pattern recognition theory. First, the complex characteristics of check seals are analyzed. To eliminate differences in producing conditions and the disturbance caused by background and writing in the check image, many methods are used in the pre-processing stage, such as color component transformation, linear transformation to a gray-scale image, median filtering, Otsu thresholding, morphological closing, and a labeling algorithm from mathematical morphology. After these processes, a clean binary seal image is obtained. On the basis of the traditional registration algorithm, a two-level registration method comprising rough and precise registration is proposed; the deflection angle of the precise registration step is accurate to 0.1°. This paper introduces the concepts of inside difference and outside difference and uses their percentages to judge whether a seal is real or fake. Experimental results on a large set of check seals are satisfactory, showing that the presented methods and algorithms are robust to noisy sealing conditions and tolerate within-class differences satisfactorily.
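One step of the pre-processing chain named above, Otsu thresholding, sketched in pure NumPy rather than an imaging library; the synthetic bimodal image is a stand-in for a real check scan:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: choose the threshold that maximizes the
    between-class variance of the gray-level histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (levels[:t] * prob[:t]).sum() / w0   # mean of the dark class
        mu1 = (levels[t:] * prob[t:]).sum() / w1   # mean of the bright class
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Bimodal stand-in for a scanned check: dark background/writing near
# gray level 50, bright seal pixels near 200.
rng = np.random.default_rng(7)
img = np.clip(np.concatenate([rng.normal(50.0, 10.0, 2000),
                              rng.normal(200.0, 10.0, 2000)]),
              0, 255).astype(np.uint8).reshape(40, 100)
t = otsu_threshold(img)
```

Thresholding at `t` yields the binary seal image on which the rough and precise registration steps would operate.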
An algorithm on distributed mining association rules
NASA Astrophysics Data System (ADS)
Xu, Fan
2005-12-01
With the rapid development of the Internet/intranet, distributed databases have become a broadly used environment in various areas, and mining association rules in distributed databases is a critical task. Algorithms for distributed mining of association rules can be divided into two classes: DD algorithms and CD algorithms. A DD algorithm focuses on data partition optimization so as to enhance efficiency. A CD algorithm, on the other hand, considers a setting where the data is arbitrarily partitioned horizontally among the parties to begin with, and focuses on parallelizing the communication. A DD algorithm is not always applicable, however; by the time the data is generated, it is often already partitioned, and in many cases it cannot be gathered and repartitioned for reasons of security and secrecy, transmission cost, or sheer efficiency. A CD algorithm may be a more appealing solution for systems which are naturally distributed over large expanses, such as stock exchange and credit card systems. The FDM algorithm enhances the CD algorithm; however, CD and FDM algorithms are both based on net structures and execute on non-shareable resources. In practical applications, distributed databases are often star-structured. This paper proposes an algorithm based on star-structured networks, which are more practical in application, have lower maintenance costs, and are easier to construct. In addition, the algorithm provides high efficiency in communication and good extensibility in parallel computation.
Clustering algorithm for determining community structure in large networks
NASA Astrophysics Data System (ADS)
Pujol, Josep M.; Béjar, Javier; Delgado, Jordi
2006-07-01
We propose an algorithm to find the community structure in complex networks based on the combination of spectral analysis and modularity optimization. The clustering produced by our algorithm is as accurate as the best algorithms in the literature on modularity optimization; however, the main asset of the algorithm is its efficiency. The best match for our algorithm is Newman's fast algorithm, which is the reference algorithm for clustering in large networks due to its efficiency. When both algorithms are compared, ours outperforms the fast algorithm in both efficiency and accuracy of the clustering, in terms of modularity. Thus, the results suggest that the proposed algorithm is a good choice for analyzing the community structure of medium and large networks in the range of tens to hundreds of thousands of vertices.
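Modularity, the quantity both algorithms optimize, can be computed directly from the adjacency matrix. A small sketch on a toy network of two triangles joined by a bridge edge (the spectral search itself is not reproduced here):

```python
import numpy as np

def modularity(adj, labels):
    """Newman modularity Q = (1/2m) * sum_ij (A_ij - k_i*k_j/2m) [c_i == c_j]."""
    adj = np.asarray(adj, dtype=float)
    two_m = adj.sum()              # 2m for an undirected adjacency matrix
    k = adj.sum(axis=1)            # node degrees
    same = np.equal.outer(labels, labels)
    return float(((adj - np.outer(k, k) / two_m) * same).sum() / two_m)

# Toy network: two triangles joined by a single bridge edge.
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

q_split = modularity(A, [0, 0, 0, 1, 1, 1])   # cut at the bridge
q_merged = modularity(A, [0, 0, 0, 0, 0, 0])  # everything in one community
```

The partition that cuts the bridge scores higher, which is exactly the signal a modularity-optimizing clustering algorithm searches for.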
Satellite Animation Shows California Storms
This animation of visible and infrared imagery from NOAA's GOES-West satellite shows a series of moisture-laden storms affecting California from Jan. 6 through Jan. 9, 2017. TRT: 00:36 Credit: NASA...
Satellite Movie Shows Erika Dissipate
This animation of visible and infrared imagery from NOAA's GOES-West satellite from Aug. 27 to 29 shows Tropical Storm Erika move through the Eastern Caribbean Sea and dissipate near eastern Cuba. ...
An innovative localisation algorithm for railway vehicles
NASA Astrophysics Data System (ADS)
Allotta, B.; D'Adamio, P.; Malvezzi, M.; Pugi, L.; Ridolfi, A.; Rindi, A.; Vettori, G.
2014-11-01
The estimation strategy performs well even under degraded adhesion conditions and could be installed on board high-speed railway vehicles; it represents an accurate and reliable solution. The IMU board is tested via a dedicated Hardware in the Loop (HIL) test rig, which includes an industrial robot able to replicate the motion of the railway vehicle. Through the generated experimental outputs, the performance of the innovative localisation algorithm has been evaluated: the HIL test rig made it possible to test the proposed algorithm while avoiding expensive (in terms of time and cost) on-track tests, with encouraging results. In fact, the preliminary results show a significant improvement in the position and speed estimation performance compared to that obtained with SCMT algorithms, currently in use on the Italian railway network.
Virtual Goods Recommendations in Virtual Worlds
Chen, Kuan-Yu; Liao, Hsiu-Yu; Chen, Jyun-Hung; Liu, Duen-Ren
2015-01-01
Virtual worlds (VWs) are computer-simulated environments which allow users to create their own virtual character as an avatar. With the rapidly growing user volume in VWs, platform providers hastily launch virtual goods and press them on users to increase sales revenue. However, the rapidity of development yields unrelated virtual items which are difficult to remarket. This not only wastes the companies' design and intelligence resources, but also makes it difficult for users to find suitable virtual goods that fit their virtual home in daily virtual life. In the VWs, users decorate their houses, visit others' homes, create families, host parties, and so forth. Users establish their social life circles through these activities. This research proposes a novel virtual goods recommendation method based on these social interactions. The contact strength and contact influence result from interactions with social neighbors and influence users' buying intention. Our research highlights the importance of social interactions in virtual goods recommendation. The experiment's data were retrieved from an online VW platform, and the results show that the proposed method, considering social interactions and social life circle, has better performance than existing recommendation methods. PMID:25834837
Spatial dynamics of ecological public goods
Wakano, Joe Yuichiro; Nowak, Martin A.; Hauert, Christoph
2009-01-01
The production, consumption, and exploitation of common resources ranging from extracellular products in microorganisms to global issues of climate change refer to public goods interactions. Individuals can cooperate and sustain common resources at some cost or defect and exploit the resources without contributing. This generates a conflict of interest, which characterizes social dilemmas: Individual selection favors defectors, but for the community, it is best if everybody cooperates. Traditional models of public goods do not take into account that benefits of the common resource enable cooperators to maintain higher population densities. This leads to a natural feedback between population dynamics and interaction group sizes as captured by “ecological public goods.” Here, we show that the spatial evolutionary dynamics of ecological public goods in “selection-diffusion” systems promotes cooperation based on different types of pattern formation processes. In spatial settings, individuals can migrate (diffuse) to populate new territories. Slow diffusion of cooperators fosters aggregation in highly productive patches (activation), whereas fast diffusion enables defectors to readily locate and exploit these patches (inhibition). These antagonistic forces promote coexistence of cooperators and defectors in static or dynamic patterns, including spatial chaos of ever-changing configurations. The local environment of cooperators and defectors is shaped by the production or consumption of common resources. Hence, diffusion-induced self-organization into spatial patterns not only enhances cooperation but also provides simple mechanisms for the spontaneous generation of habitat diversity, which denotes a crucial determinant of the viability of ecological systems. PMID:19416839
DNABIT Compress - Genome compression algorithm.
Rajarajeswari, Pothuraju; Apparao, Allam
2011-01-22
Data compression is concerned with how information is organized in data. Efficient storage means removing redundancy from the data being stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, "DNABIT Compress", for DNA sequences, based on a novel scheme of assigning binary bits to smaller segments of DNA bases to compress both repetitive and non-repetitive DNA sequences. Our proposed algorithm achieves the best compression ratio for DNA sequences, particularly for larger genomes. Significantly better compression results show that "DNABIT Compress" outperforms the other compression algorithms. While achieving the best compression ratios for DNA sequences (genomes), our new DNABIT Compress algorithm significantly improves on the running time of all previous DNA compression programs. Assigning binary bits (a unique bit code) to fragments of a DNA sequence (exact repeats and reverse repeats) is also a unique concept introduced in this algorithm for the first time in DNA compression. The proposed algorithm achieves a compression ratio as low as 1.58 bits/base, where the best existing methods could not achieve a ratio below 1.72 bits/base.
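The abstract does not give DNABIT's actual bit codes. As a baseline illustration of why DNA packs into so few bits per base, here is the plain fixed 2 bits/base encoding that such algorithms improve on (a sketch assuming sequences over A, C, G, T only; not the authors' scheme):

```python
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BASE = "ACGT"

def pack(seq):
    """Pack a DNA string into bytes at exactly 2 bits per base."""
    bits = 0
    for ch in seq:
        bits = (bits << 2) | CODE[ch]
    n = len(seq)
    return n, bits.to_bytes((2 * n + 7) // 8, "big")

def unpack(n, data):
    """Recover the original sequence from the packed form."""
    bits = int.from_bytes(data, "big")
    return "".join(BASE[(bits >> (2 * (n - 1 - i))) & 0b11] for i in range(n))

n, packed = pack("ACGTTGCA")
print(len(packed), unpack(n, packed))  # 8 bases fit in 2 bytes
```

Schemes like DNABIT get below this 2 bits/base floor by giving shorter codes to repeated fragments.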
SAGE II inversion algorithm. [Stratospheric Aerosol and Gas Experiment
NASA Technical Reports Server (NTRS)
Chu, W. P.; Mccormick, M. P.; Lenoble, J.; Brogniez, C.; Pruvost, P.
1989-01-01
The operational Stratospheric Aerosol and Gas Experiment II multichannel data inversion algorithm is described. Aerosol and ozone retrievals obtained with the algorithm are discussed. The algorithm is compared to an independently developed algorithm (Lenoble, 1989), showing that the inverted aerosol and ozone profiles from the two algorithms are similar within their respective uncertainties.
National Orange Show Photovoltaic Demonstration
Dan Jimenez; Sheri Raborn, CPA; Tom Baker
2008-03-31
National Orange Show Photovoltaic Demonstration created a 400KW Photovoltaic self-generation plant at the National Orange Show Events Center (NOS). The NOS owns a 120-acre state fairground where it operates an events center and produces an annual citrus fair known as the Orange Show. The NOS governing board wanted to employ cost-saving programs for annual energy expenses. It is hoped the Photovoltaic program will result in overall savings for the NOS, help reduce the State's energy demands as relating to electrical power consumption, improve quality of life within the affected grid area as well as increase the energy efficiency of buildings at our venue. In addition, the potential to reduce operational expenses would have a tremendous effect on the ability of the NOS to service its community.
Marucci, Evandro A.; Neves, Leandro A.; Valêncio, Carlo R.; Pinto, Alex R.; Cansian, Adriano M.; de Souza, Rogeria C. G.; Shiyou, Yang; Machado, José M.
2014-01-01
With the advance of genomic research, the number of sequences involved in comparative methods has grown immensely. Among them are methods for similarity calculation, which are used by many bioinformatics applications. Due to the huge amount of data, combining low-complexity methods with parallel computing is becoming desirable. The k-mers counting method is very efficient and yields good biological results. In this work, the development of a parallel algorithm for multiple sequence similarity calculation using the k-mers counting method is proposed. Tests show that the algorithm presents very good scalability and a nearly linear speedup; for 14 nodes, a 12x speedup was obtained. This algorithm can be used in the parallelization of some multiple sequence alignment tools, such as MAFFT and MUSCLE. PMID:25140318
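The parallel implementation is not shown in the abstract. A minimal serial sketch of a k-mers counting similarity is below, using cosine similarity between k-mer count vectors; the authors' exact measure may differ:

```python
from collections import Counter
from math import sqrt

def kmer_counts(seq, k=3):
    """Count all overlapping k-mers in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def kmer_similarity(a, b, k=3):
    """Cosine similarity between the k-mer count vectors of two sequences."""
    ca, cb = kmer_counts(a, k), kmer_counts(b, k)
    dot = sum(ca[m] * cb[m] for m in ca)
    norm = sqrt(sum(v * v for v in ca.values())) * sqrt(sum(v * v for v in cb.values()))
    return dot / norm if norm else 0.0

print(kmer_similarity("ACGTACGT", "ACGTACGT"))  # identical sequences score ~1
```

Parallelization is natural here: k-mer counting for each sequence, and each pairwise similarity, is independent work.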
NASA Astrophysics Data System (ADS)
Liu, Zhi; Zhou, Baotong; Zhang, Changnian
2017-03-01
Vehicle-mounted panoramic systems are important safety assistant equipment for driving. However, traditional systems only render a fixed top-down perspective view with a limited field of view, which may pose a potential safety hazard. In this paper, a texture mapping algorithm for a 3D vehicle-mounted panoramic system is introduced, and an implementation of the algorithm utilizing the OpenGL ES library on the Android smart platform is presented. Initial experimental results show that the proposed algorithm can render a good 3D panorama and allows the viewpoint to be changed freely.
NASA Astrophysics Data System (ADS)
Cowie, L. L.; Barger, A. J.; Hsu, L.-Y.; Chen, Chian-Chou; Owen, F. N.; Wang, W.-H.
2017-03-01
In this first paper in the SUPER GOODS series on powerfully star-forming galaxies in the two GOODS fields, we present a deep SCUBA-2 survey of the GOODS-N at both 850 and 450 μm (central rms noise of 0.28 mJy and 2.6 mJy, respectively). In the central region, the 850 μm observations cover the GOODS-N to near the confusion limit of ∼1.65 mJy, while over a wider 450 arcmin² region (well complemented by Herschel far-infrared imaging) they have a median 4σ limit of 3.5 mJy. We present ≥4σ catalogs of 186 850 μm and 31 450 μm selected sources. We use interferometric observations from the Submillimeter Array (SMA) and the Karl G. Jansky Very Large Array (VLA) to obtain precise positions for 114 SCUBA-2 sources (28 from the SMA, all of which are also VLA sources). We present new spectroscopic redshifts and include all existing spectroscopic or photometric redshifts. We also compare redshifts estimated using the 20 cm/850 μm and the 250 μm/850 μm flux ratios. We show that the redshift distribution increases with increasing flux, and we parameterize the dependence. We compute the star formation history and the star formation rate (SFR) density distribution functions in various redshift intervals, finding that they reach a peak at z = 2–3 before dropping to higher redshifts. We show that the number density per unit volume of SFR ≳ 500 M⊙ yr⁻¹ galaxies measured from the SCUBA-2 sample does not change much relative to that of lower SFR galaxies from UV selected samples over z = 2–5, suggesting that, apart from changes in the normalization, the shape of the number density as a function of SFR is invariant over this redshift interval.
Phyllodes tumor showing intraductal growth.
Makidono, Akari; Tsunoda, Hiroko; Mori, Miki; Yagata, Hiroshi; Onoda, Yui; Kikuchi, Mari; Nozaki, Taiki; Saida, Yukihisa; Nakamura, Seigo; Suzuki, Koyu
2013-07-01
Phyllodes tumor of the breast is a rare fibroepithelial lesion and particularly uncommon in adolescent girls. It is thought to arise from the periductal rather than intralobular stroma. Usually, it is seen as a well-defined mass. Phyllodes tumor showing intraductal growth is extremely rare. Here we report a girl who has a phyllodes tumor with intraductal growth.
ICESat-2 / ATLAS Flight Science Receiver Algorithms
NASA Astrophysics Data System (ADS)
Mcgarry, J.; Carabajal, C. C.; Degnan, J. J.; Mallama, A.; Palm, S. P.; Ricklefs, R.; Saba, J. L.
2013-12-01
This Simulator makes it possible to check all logic paths that could be encountered by the Algorithms on orbit. In addition, the NASA airborne instrument MABEL is collecting data with characteristics similar to what ATLAS will see. MABEL data are being used to test the ATLAS Receiver Algorithms. Further verification will be performed during Integration and Testing of the ATLAS instrument and during Environmental Testing on the full ATLAS instrument. Results from testing to date show that the Receiver Algorithms can handle a wide range of signal and noise levels with very good sensitivity at relatively low signal-to-noise ratios. In addition, preliminary tests have demonstrated, using the ICESat-2 Science Team's selected land ice and sea ice test cases, the capability of the Algorithms to successfully find and telemeter the surface echoes. In this presentation we will describe the ATLAS Flight Science Receiver Algorithms and the Software Simulator, and will present results of the testing to date. The onboard databases (DEM, DRM and the Surface Reference Mask) are being developed at the University of Texas at Austin as part of the ATLAS Flight Science Receiver Algorithms. Verification of the onboard databases is being performed by ATLAS Receiver Algorithms team members Claudia Carabajal and Jack Saba.
Wiley, H. S.
2010-10-01
I was talking with some younger colleagues at a meeting last month when the subject of career goals came up. These colleagues were successful in that they had recently received tenure at top research universities and had some grants and good students. Thus, the early career pressure to simply survive was gone. So now what motivated them? Solving challenging and significant scientific problems was at the top of their lists. Interestingly, they were also motivated by a desire to become one of the “good guys” in science. The fact that being an important contributor to the scientific community can be fulfilling should not come as a surprise to anyone. However, what I do consider surprising is how rarely this seems to be discussed with students and postdocs. What we do discuss are either those issues that are fundamental aspects of the job (get a grant, get tenure, do research in an important field) or those that are important to our institutions. Knowing how to do our jobs well is indeed essential for any kind of professional success. However, achieving the right balance in our ambitions is also important for our happiness.
Goode Gym Energy Renovation Project
Coleman, Andrena
2014-12-11
The Ida H. Goode Gymnasium was constructed in 1964 to serve as a focal point for academics, student recreation, and health and wellness activities. This 38,000 SF building contains a gymnasium with a stage, a swimming pool, eight classrooms, a weight room, six offices, and auxiliary spaces for the athletic programs. The gym is located on a 4-acre greenfield, which is slated for improvement and enhancement of the future athletics program at Bennett College. The available funding for this project was used to weatherize the envelope of the gymnasium, install a new energy-efficient mechanical system, and retrofit the existing lighting systems in the building's interior. The envelope weatherization was completed without disturbing the building's historic preservation eligibility. The existing heating system was replaced with a new high-efficiency condensing system. The new heating system also includes a new Building Automation System, which provides additional monitoring; proper usage of this system will provide additional energy savings. Most of the existing interior lighting fixtures and bulbs were replaced with new LED and high-efficiency T-8 bulbs and fixtures, and occupancy sensors were installed in applicable areas. The Ida Goode Gymnasium should experience high electricity and natural gas savings as well as operational and maintenance efficiency increases. The aesthetics of the building were maintained and the overall safety was improved.
Going public: good scientific conduct.
Meyer, Gitte; Sandøe, Peter
2012-06-01
The paper addresses issues of scientific conduct regarding relations between science and the media, relations between scientists and journalists, and attitudes towards the public at large. In the large and increasing body of literature on scientific conduct and misconduct, these issues seem underexposed as ethical challenges. Consequently, individual scientists here tend to be left alone with problems and dilemmas, with no guidance for good conduct. Ideas are presented about how to make up for this omission. Using a practical, ethical approach, the paper attempts to identify ways scientists might deal with ethical public relations issues, guided by a norm or maxim of openness. Drawing on and rethinking the CUDOS codification of the scientific ethos, as it was worked out by Robert K. Merton in 1942, we propose that this, which is echoed in current codifications of norms for good scientific conduct, contains a tacit maxim of openness which may naturally be extended to cover the public relations of science. Discussing openness as access, accountability, transparency and receptiveness, the argumentation concentrates on the possible prevention of misconduct with respect to, on the one hand, sins of omission (withholding important information from the public) and, on the other hand, abuses of the authority of science in order to gain publicity. Statements from interviews with scientists are used to illustrate how scientists might view the relevance of the issues raised.
Higher-order force gradient symplectic algorithms
NASA Astrophysics Data System (ADS)
Chin, Siu A.; Kidwell, Donald W.
2000-12-01
We show that a recently discovered fourth order symplectic algorithm, which requires one evaluation of the force gradient in addition to three evaluations of the force, when iterated to higher order, yields algorithms that are far superior to similarly iterated higher order algorithms based on the standard Forest-Ruth algorithm. We gauge the accuracy of each algorithm by comparing the step-size independent error functions associated with energy conservation and the rotation of the Laplace-Runge-Lenz vector when solving a highly eccentric Kepler problem. For orders 6, 8, 10, and 12, the new algorithms are better by approximate factors of 10³, 10⁴, 10⁴, and 10⁵, respectively.
Research on Routing Selection Algorithm Based on Genetic Algorithm
NASA Astrophysics Data System (ADS)
Gao, Guohong; Zhang, Baojian; Li, Xueyong; Lv, Jinna
The genetic algorithm is a stochastic search and optimization method based on the natural selection and genetic mechanisms of living beings. In recent years, because of its potential for solving complicated problems and its successful application in industrial projects, the genetic algorithm has attracted wide attention from scholars both at home and abroad. Routing selection has been defined as a standard communication model of IP version 6. This paper proposes a service model for routing selection communication, and designs and implements a new routing selection algorithm based on the genetic algorithm. The experimental simulation results show that this algorithm can obtain better solutions in less time with a more balanced network load, which enhances the search ratio and the availability of network resources, and improves the quality of service.
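The paper's route encoding is not described in the abstract. As a hedged sketch, here is the generic genetic algorithm loop (tournament selection, one-point crossover, bit-flip mutation) that such a routing algorithm builds on, with a toy one-max fitness standing in for a route-quality score; all names and parameters are assumptions:

```python
import random

def genetic_search(fitness, n_bits, pop_size=30, generations=60,
                   crossover_p=0.9, mutation_p=0.02, seed=1):
    """Minimal genetic algorithm over bit strings: tournament selection,
    one-point crossover, bit-flip mutation, keeping the best-ever solution."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)

    def pick():  # tournament of size 2
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick()[:], pick()[:]
            if rng.random() < crossover_p:
                cut = rng.randrange(1, n_bits)      # one-point crossover
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                for i in range(n_bits):
                    if rng.random() < mutation_p:   # bit-flip mutation
                        child[i] ^= 1
                nxt.append(child)
        pop = nxt[:pop_size]
        best = max(pop + [best], key=fitness)
    return best

# Toy fitness: count of ones ("one-max"); a routing GA would instead score
# decoded paths by delay, hop count, or load balance.
best = genetic_search(sum, n_bits=20)
print(sum(best))
```

A routing variant would replace the fitness with a decoder that maps each chromosome to a candidate path and penalizes infeasible ones.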
Quantum-based algorithm for optimizing artificial neural networks.
Tzyy-Chyang Lu; Gwo-Ruey Yu; Jyh-Ching Juang
2013-08-01
This paper presents a quantum-based algorithm for evolving artificial neural networks (ANNs). The aim is to design an ANN with few connections and high classification performance by simultaneously optimizing the network structure and the connection weights. Unlike most previous studies, the proposed algorithm uses quantum bit representation to codify the network. As a result, the connectivity bits do not indicate the actual links but the probability of the existence of the connections, thus alleviating mapping problems and reducing the risk of throwing away a potential candidate. In addition, in the proposed model, each weight space is decomposed into subspaces in terms of quantum bits. Thus, the algorithm performs a region-by-region exploration, and evolves gradually to find promising subspaces for further exploitation. This helps to provide a set of appropriate weights when evolving the network structure and to alleviate the noisy fitness evaluation problem. The proposed model is tested on four benchmark problems, namely the breast cancer, iris, heart, and diabetes problems. The experimental results show that the proposed algorithm can produce compact ANN structures with good generalization ability compared to other algorithms.
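The model's actual update rules are not reproduced here. The sketch below only illustrates the representational idea from the abstract: each potential connection is encoded as a quantum-inspired bit whose angle determines the probability that the link exists, and concrete topologies are sampled ("observed") from those probabilities. All names and the rotation step are assumptions:

```python
import math
import random

def observe(thetas, rng):
    """Collapse each connectivity qubit: link i exists with
    probability sin^2(theta_i)."""
    return [1 if rng.random() < math.sin(t) ** 2 else 0 for t in thetas]

def rotate_toward(thetas, target, step=0.05):
    """Rotate each qubit angle toward a good observed topology, raising or
    lowering its link probability; angles stay clamped to [0, pi/2]."""
    return [min(max(t + (step if b else -step), 0.0), math.pi / 2)
            for t, b in zip(thetas, target)]

rng = random.Random(0)
thetas = [math.pi / 4] * 6          # start: every link has probability 1/2
good_mask = [1, 0, 1, 1, 0, 0]      # a hypothetical well-performing topology
for _ in range(40):
    thetas = rotate_toward(thetas, good_mask)
probs = [math.sin(t) ** 2 for t in thetas]
sampled = observe(thetas, rng)
print(probs, sampled)
```

Because links are probabilities rather than hard bits, a promising connection is never irrevocably discarded by a single bad sample, which is the "not throwing away a potential candidate" property the abstract mentions.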
Harmony search algorithm: application to the redundancy optimization problem
NASA Astrophysics Data System (ADS)
Nahas, Nabil; Thien-My, Dao
2010-09-01
The redundancy optimization problem is a well known NP-hard problem which involves the selection of elements and redundancy levels to maximize system performance, given different system-level constraints. This article presents an efficient algorithm based on the harmony search algorithm (HSA) to solve this optimization problem. The HSA is a new nature-inspired algorithm which mimics the improvisation process of music players. Two kinds of problems are considered in testing the proposed algorithm, with the first limited to the binary series-parallel system, where the problem consists of a selection of elements and redundancy levels used to maximize the system reliability given various system-level constraints; the second problem for its part concerns the multi-state series-parallel systems with performance levels ranging from perfect operation to complete failure, and in which identical redundant elements are included in order to achieve a desirable level of availability. Numerical results for test problems from previous research are reported and compared. The results of HSA showed that this algorithm could provide very good solutions when compared to those obtained through other approaches.
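A minimal continuous-variable harmony search can be sketched with the standard HSA parameters (harmony memory size, memory consideration rate HMCR, pitch adjustment rate PAR, bandwidth). This toy version minimizes the sphere function rather than solving the paper's redundancy problem, so all constants here are illustrative:

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3,
                   bw=0.1, iters=500, seed=0):
    """Minimal harmony search: improvise new solutions from a memory of
    good ones, occasionally pitch-adjusting or drawing random notes."""
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                 # memory consideration
                x = rng.choice(memory)[j]
                if rng.random() < par:              # pitch adjustment
                    x += rng.uniform(-bw, bw)
            else:                                   # random selection
                x = rng.uniform(lo, hi)
            new.append(min(max(x, lo), hi))
        worst = max(memory, key=f)
        if f(new) < f(worst):                       # replace the worst harmony
            memory.remove(worst)
            memory.append(new)
    return min(memory, key=f)

# Toy objective: the sphere function; the paper instead maximizes system
# reliability/availability under constraints.
sphere = lambda v: sum(x * x for x in v)
best = harmony_search(sphere, bounds=[(-5, 5)] * 3)
print(sphere(best))
```

For the redundancy problem, the decision variables would be discrete element choices and redundancy levels, with constraint handling added to the acceptance step.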
A Comparison of Three Algorithms for Orion Drogue Parachute Release
NASA Technical Reports Server (NTRS)
Matz, Daniel A.; Braun, Robert D.
2015-01-01
The Orion Multi-Purpose Crew Vehicle is susceptible to flipping apex forward between drogue parachute release and main parachute inflation. A smart drogue release algorithm is required to select a drogue release condition that will not result in an apex forward main parachute deployment. The baseline algorithm is simple and elegant, but does not perform as well as desired in drogue failure cases. A simple modification to the baseline algorithm can improve performance, but can also sometimes fail to identify a good release condition. A new algorithm employing simplified rotational dynamics and a numeric predictor to minimize a rotational energy metric is proposed. A Monte Carlo analysis of a drogue failure scenario is used to compare the performance of the algorithms. The numeric predictor prevents more of the cases from flipping apex forward, and also results in an improvement in the capsule attitude at main bag extraction. The sensitivity of the numeric predictor to aerodynamic dispersions, errors in the navigated state, and execution rate is investigated, showing little degradation in performance.
Sort-Mid tasks scheduling algorithm in grid computing.
Reda, Naglaa M; Tawfik, A; Marzok, Mohamed A; Khamis, Soheir M
2015-11-01
Scheduling tasks on heterogeneous resources distributed over a grid computing system is an NP-complete problem. Many researchers have developed scheduling algorithms aimed at optimality, and these have shown good performance for task scheduling with regard to resource selection. However, using the full power of the resources is still a challenge. In this paper, a new heuristic algorithm called Sort-Mid is proposed. It aims to maximize utilization and minimize the makespan. The new strategy of the Sort-Mid algorithm is to find appropriate resources. The first step is to compute, for each task, the average of its sorted list of completion times. The task with the maximum average is then allocated to the machine with the minimum completion time. The allocated task is removed, and these steps are repeated until all tasks are allocated. Experimental tests show that the proposed algorithm outperforms most other algorithms in terms of resource utilization and makespan.
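The steps described above can be sketched directly. This is a toy reconstruction from the abstract; the input format (an expected-time-to-compute matrix), tie-breaking, and the ready-time bookkeeping are assumptions that may differ from the paper:

```python
def sort_mid(etc):
    """Sort-Mid-style scheduling as described in the abstract: repeatedly pick
    the task with the highest average completion time and map it to the machine
    where it finishes earliest.  etc[t][m] is the execution time of task t on
    machine m (an assumed input format)."""
    n_tasks, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines            # when each machine becomes free
    schedule = {}
    unassigned = set(range(n_tasks))
    while unassigned:
        # completion time of each unassigned task on each machine
        comp = {t: [ready[m] + etc[t][m] for m in range(n_machines)]
                for t in unassigned}
        # task with the maximum average of its sorted completion times
        task = max(unassigned, key=lambda t: sum(sorted(comp[t])) / n_machines)
        machine = min(range(n_machines), key=lambda m: comp[task][m])
        schedule[task] = machine
        ready[machine] += etc[task][machine]
        unassigned.remove(task)
    return schedule, max(ready)           # mapping and resulting makespan

etc = [[4, 8], [6, 3], [2, 9], [5, 5]]    # toy 4-task, 2-machine instance
schedule, makespan = sort_mid(etc)
print(schedule, makespan)
```

Picking the task with the largest average completion time first steers the tasks that are expensive "on average" toward their best machines before those machines fill up.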
NASA Astrophysics Data System (ADS)
Niu, Chaojun; Han, Xiang'e.
2015-10-01
Adaptive optics (AO) technology is an effective way to alleviate the effect of turbulence on free space optical communication (FSO). A new adaptive compensation method can be used without a wave-front sensor. The artificial bee colony (ABC) algorithm is a population-based heuristic evolutionary algorithm inspired by the intelligent foraging behaviour of the honeybee swarm, with the advantages of simplicity, a good convergence rate, robustness, and few parameters to set. In this paper, we simulate the application of the improved ABC algorithm to correct the distorted wavefront and prove its effectiveness. We then simulate the application of the ABC algorithm, the differential evolution (DE) algorithm, and the stochastic parallel gradient descent (SPGD) algorithm to the FSO system and analyze their wavefront correction capabilities by comparing the coupling efficiency, the error rate, and the intensity fluctuation in different turbulence conditions before and after correction. The results show that the ABC algorithm has a much faster correction speed than the DE algorithm and better correction ability for strong turbulence than the SPGD algorithm. Intensity fluctuation can be effectively reduced in strong turbulence, but not so effectively in weak turbulence.
On Parallel Push-Relabel based Algorithms for Bipartite Maximum Matching
Langguth, Johannes; Azad, Md Ariful; Halappanavar, Mahantesh; Manne, Fredrik
2014-07-01
We study multithreaded push-relabel based algorithms for computing maximum cardinality matching in bipartite graphs. Matching is a fundamental combinatorial (graph) problem with applications in a wide variety of problems in science and engineering. We are motivated by its use in the context of sparse linear solvers for computing the maximum transversal of a matrix. We implement and test our algorithms on several multi-socket multicore systems and compare their performance to state-of-the-art augmenting path-based serial and parallel algorithms using a test set composed of a wide range of real-world instances. Building on several heuristics for enhancing performance, we demonstrate good scaling for the parallel push-relabel algorithm. We show that it is comparable to the best augmenting path-based algorithms for bipartite matching. To the best of our knowledge, this is the first extensive study of multithreaded push-relabel based algorithms. In addition to a direct impact on the applications using matching, the proposed algorithmic techniques can be extended to preflow-push based algorithms for computing maximum flow in graphs.
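The multithreaded push-relabel code is not reproduced here, but the serial augmenting-path baseline that such algorithms are compared against can be sketched in a few lines (Kuhn's algorithm; the adjacency-list input format is an assumption):

```python
def max_bipartite_matching(adj, n_right):
    """Serial augmenting-path (Kuhn's) algorithm: for each unmatched left
    vertex, search for an augmenting path.  adj[u] lists the right vertices
    adjacent to left vertex u."""
    match_right = [-1] * n_right          # right vertex -> matched left vertex

    def try_augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                # v is free, or its current partner can be re-matched elsewhere
                if match_right[v] == -1 or try_augment(match_right[v], seen):
                    match_right[v] = u
                    return True
        return False

    matched = sum(try_augment(u, set()) for u in range(len(adj)))
    return matched, match_right

# Toy instance: 3 left vertices, 3 right vertices.
adj = [[0, 1], [0], [1, 2]]
size, match_right = max_bipartite_matching(adj, 3)
print(size, match_right)
```

Push-relabel replaces this global path search with local push and relabel operations on vertices, which is what makes it more amenable to multithreading.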
NASA Technical Reports Server (NTRS)
2004-01-01
The upper left image in this display is from the panoramic camera on the Mars Exploration Rover Spirit, showing the 'Magic Carpet' region near the rover at Gusev Crater, Mars, on Sol 7, the seventh martian day of its journey (Jan. 10, 2004). The lower image, also from the panoramic camera, is a monochrome (single filter) image of a rock in the 'Magic Carpet' area. Note that colored portions of the rock correlate with extracted spectra shown in the plot to the side. Four different types of materials are shown: the rock itself, the soil in front of the rock, some brighter soil on top of the rock, and some dust that has collected in small recesses on the rock face ('spots'). Each color on the spectra matches a line on the graph, showing how the panoramic camera's different colored filters are used to broadly assess the varying mineral compositions of martian rocks and soils.
An adaptive gyroscope-based algorithm for temporal gait analysis.
Greene, Barry R; McGrath, Denise; O'Neill, Ross; O'Donovan, Karol J; Burns, Adrian; Caulfield, Brian
2010-12-01
Body-worn kinematic sensors have been widely proposed as the optimal solution for portable, low cost, ambulatory monitoring of gait. This study aims to evaluate an adaptive gyroscope-based algorithm for automated temporal gait analysis using body-worn wireless gyroscopes. Gyroscope data from nine healthy adult subjects performing four walks at four different speeds were compared against data acquired simultaneously using two force plates and an optical motion capture system. Data from a poliomyelitis patient, exhibiting pathological gait and walking with and without the aid of a crutch, were also compared to the force plate. Results show that the mean true error between the adaptive gyroscope algorithm and the force plate was -4.5 ± 14.4 ms and 43.4 ± 6.0 ms for IC and TC points, respectively, in healthy subjects. Similarly, the mean true error when data from the polio patient were compared against the force plate was -75.61 ± 27.53 ms and 99.20 ± 46.00 ms for IC and TC points, respectively. A comparison of the present algorithm against temporal gait parameters derived from an optical motion analysis system showed good agreement for nine healthy subjects at four speeds. These results show that the algorithm reported here could constitute the basis of a robust, portable, low-cost system for ambulatory monitoring of gait.
Comprehensive eye evaluation algorithm
NASA Astrophysics Data System (ADS)
Agurto, C.; Nemeth, S.; Zamora, G.; Vahtel, M.; Soliz, P.; Barriga, S.
2016-03-01
In recent years, several research groups have developed automatic algorithms to detect diabetic retinopathy (DR) in individuals with diabetes (DM), using digital retinal images. Studies have indicated that diabetics have 1.5 times the annual risk of developing primary open angle glaucoma (POAG) as do people without DM. Moreover, DM patients have 1.8 times the risk for age-related macular degeneration (AMD). Although numerous investigators are developing automatic DR detection algorithms, there have been few successful efforts to create an automatic algorithm that can detect other ocular diseases, such as POAG and AMD. Consequently, our aim in the current study was to develop a comprehensive eye evaluation algorithm that not only detects DR in retinal images, but also automatically identifies glaucoma suspects and AMD by integrating other personal medical information with the retinal features. The proposed system is fully automatic and provides the likelihood of each of the three eye diseases. The system was evaluated in two datasets of 104 and 88 diabetic cases. For each eye, we used two non-mydriatic digital color fundus photographs (macula and optic disc centered) and, when available, information about age, duration of diabetes, cataracts, hypertension, gender, and laboratory data. Our results show that the combination of multimodal features can increase the AUC by up to 5%, 7%, and 8% in the detection of AMD, DR, and glaucoma respectively. Marked improvement was achieved when laboratory results were combined with retinal image features.
Methods of information theory and algorithmic complexity for network biology.
Zenil, Hector; Kiani, Narsis A; Tegnér, Jesper
2016-03-01
We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdös-Rényi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We provide exact theoretical calculations, numerical approximations and error estimations of entropy, algorithmic probability and Kolmogorov complexity for different types of graphs, characterizing their variant and invariant properties. We introduce formal definitions of complexity for both labeled and unlabeled graphs and prove that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity.
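One of the simplest quantities in that toolbox, the Shannon entropy of a graph's degree distribution, can be sketched with the standard library alone. This is an illustration of the concept, not the authors' code; the Erdős-Rényi sampler and all names are assumptions:

```python
import math
import random
from collections import Counter

def degree_entropy(edges, n):
    """Shannon entropy (in bits) of the degree distribution of a graph
    on n vertices, given as an edge list."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    counts = Counter(deg.get(i, 0) for i in range(n))
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def erdos_renyi(n, p, seed=0):
    """Sample a G(n, p) random graph as an edge list."""
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

# The complete graph is perfectly regular, so its degree distribution carries
# no information; a sparse random graph spreads mass over many degrees.
complete = [(i, j) for i in range(6) for j in range(i + 1, 6)]
print(degree_entropy(complete, 6))
print(degree_entropy(erdos_renyi(50, 0.1), 50))
```

Entropy-style measures like this capture only one local aspect of structure; the survey's point is to combine them with compressibility and algorithmic (Kolmogorov) complexity estimates.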
An Algorithm for Testing the Efficient Market Hypothesis
Boboc, Ioana-Andreea; Dinică, Mihai-Cristian
2013-01-01
The objective of this research is to examine the efficiency of EUR/USD market through the application of a trading system. The system uses a genetic algorithm based on technical analysis indicators such as Exponential Moving Average (EMA), Moving Average Convergence Divergence (MACD), Relative Strength Index (RSI) and Filter that gives buying and selling recommendations to investors. The algorithm optimizes the strategies by dynamically searching for parameters that improve profitability in the training period. The best sets of rules are then applied on the testing period. The results show inconsistency in finding a set of trading rules that performs well in both periods. Strategies that achieve very good returns in the training period show difficulty in returning positive results in the testing period, this being consistent with the efficient market hypothesis (EMH). PMID:24205148
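Two of the technical indicators fed to the genetic algorithm can be sketched directly. A minimal example of EMA and MACD on a hypothetical price series; the 12/26/9 settings are the conventional defaults, not necessarily those of the paper, and the GA's rule optimization itself is not shown:

```python
import numpy as np

def ema(prices, span):
    """Exponential moving average with smoothing factor 2/(span+1)."""
    alpha = 2.0 / (span + 1.0)
    out = np.empty_like(prices, dtype=float)
    out[0] = prices[0]
    for t in range(1, len(prices)):
        out[t] = alpha * prices[t] + (1 - alpha) * out[t - 1]
    return out

def macd(prices, fast=12, slow=26, signal=9):
    """MACD line, signal line and histogram (conventional 12/26/9)."""
    macd_line = ema(prices, fast) - ema(prices, slow)
    signal_line = ema(macd_line, signal)
    return macd_line, signal_line, macd_line - signal_line

prices = np.linspace(1.10, 1.20, 60)  # hypothetical EUR/USD closes
line, sig, hist = macd(prices)
# In a steady uptrend the fast EMA stays above the slow one,
# so the MACD line is positive at the end of the window.
```

A trading rule of the kind the paper optimizes would then compare `line` against `sig` to generate buy and sell signals.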
NASA Astrophysics Data System (ADS)
Hall, Lindy
It is a pleasure and privilege to add some words to this collection of writings. I think that I can claim to have known Jan longest of anyone in the world science culture, as we met in February, 1956 at Carnegie Institute of Technology in Pittsburgh, Pennsylvania. As a senior in physics, he was already taking advanced classes in that Department, so his decision that spring to transition into graduate school at CIT was an easy and natural one. The next five years were a golden time of fun and learning for us both. In our respective programs, there was a good mix of technical and humanistic specialties to study and lots of opportunities for practical application and problem solving. As we and others in our core group of students married and started families, a community of colleagues and friends developed with shared interests and interactions…
Beatty, John
2016-08-01
Narratives may be easy to come by, but not everything is worth narrating. What merits a narrative? Here, I follow the lead of narratologists and literary theorists, and focus on one particular proposal concerning the elements of a story that make it narrative-worthy. These elements correspond to features of the natural world addressed by the historical sciences, where narratives figure so prominently. What matters is contingency. Narratives are especially good for representing contingency and accounting for contingent outcomes. This will be squared with a common view that narratives leave no room for chance. On the contrary, I will argue, tracing one path through a maze of alternative possibilities, and alluding to those possibilities along the way, is what a narrative does particularly well.
Warnock, Mary
1987-04-01
Warnock, chair of Britain's Committee of Inquiry into Human Fertilisation and Embryology, discusses the implications of the "artificial family" for children born through the use of reproductive technologies. She considers both treatment of infertility and the possible use of assisted reproduction to enable persons other than infertile couples, such as single persons and homosexuals, to have children. Warnock has found that emphasis has been placed on the wants and well-being of the adult(s) involved, and that the "good of the child" is a "wide and vague concept, widely invoked, not always plausibly." She is particularly concerned about children born as a result of the delayed implantation of frozen embryos, AID children who are deceived about their origins, and children born of surrogate pregnancies. She recommends that a detailed study of existing "artificial family" children be conducted to aid public policy decisions on assisted reproduction.
ARPANET Routing Algorithm Improvements
1978-10-01
McQuillan, J. M.; Rosen, E. C. ...this problem may persist for a very long time, causing extremely bad performance throughout the whole network (for instance, if w' reports that one of...). ...the algorithm may naturally tend to oscillate between bad routing paths and become itself a major contributor to network congestion. These examples show...
2016-06-07
...XBT's sound speed values instead of temperature values. Studies show that the sound speed at the surface in a specific location varies less than... ...be entered at the terminal in metric or English temperatures or sound speeds. The algorithm automatically determines which form each data point was... Leroy's equation is used to derive sound speed from temperature or temperature from sound speed. The previous, current, and next months...
Coordinating towards a Common Good
NASA Astrophysics Data System (ADS)
Santos, Francisco C.; Pacheco, Jorge M.
2010-09-01
Throughout their life, humans often engage in collective endeavors ranging from family related issues to global warming. In all cases, the tragedy of the commons threatens the possibility of reaching the optimal solution associated with global cooperation, a scenario predicted by theory and demonstrated by many experiments. Using the toolbox of evolutionary game theory, I will address two important aspects of evolutionary dynamics that have been neglected so far in the context of public goods games and evolution of cooperation. On one hand, the fact that often there is a threshold above which a public good is reached [1, 2]. On the other hand, the fact that individuals often participate in several games, related to their social context and pattern of social ties, defined by a social network [3, 4, 5]. In the first case, the existence of a threshold above which collective action is materialized dictates a rich pattern of evolutionary dynamics where the direction of natural selection can be inverted compared to standard expectations. Scenarios of defector dominance, pure coordination or coexistence may arise simultaneously. Both finite and infinite population models are analyzed. In networked games, cooperation blooms whenever the act of contributing is more important than the effort contributed. In particular, the heterogeneous nature of social networks naturally induces a symmetry breaking of the dilemmas of cooperation, as contributions made by cooperators may become contingent on the social context in which the individual is embedded. This diversity in context provides an advantage to cooperators, which is particularly strong when both wealth and social ties follow a power-law distribution, providing clues on the self-organization of social communities. Finally, in both situations, it can be shown that individuals no longer play a defection dominance dilemma, but effectively engage in a general N-person coordination game. Even if locally defection may seem
"Medicine show." Alice in Doctorland.
1987-01-01
This is an excerpt from the script of a 1939 play provided to the Institute of Social Medicine and Community Health by the Library of Congress Federal Theater Project Collection at George Mason University Library, Fairfax, Virginia, pages 2-1-8 thru 2-1-14. The Federal Theatre Project (FTP) was part of the New Deal program for the arts 1935-1939. Funded by the Works Progress Administration (WPA) its goal was to employ theater professionals from the relief rolls. A number of FTP plays deal with aspects of medicine and public health. Pageants, puppet shows and documentary plays celebrated progress in medical science while examining social controversies in medical services and the public health movement. "Medicine Show" sharply contrasts technological wonders with social backwardness. The play was rehearsed by the FTP but never opened because funding ended. A revised version ran on Broadway in 1940. The preceding comments are adapted from an excellent, well-illustrated review of five of these plays by Barbara Melosh: "The New Deal's Federal Theatre Project," Medical Heritage, Vol. 2, No. 1 (Jan/Feb 1986), pp. 36-47.
Fontana, W.
1990-12-13
In this paper complex adaptive systems are defined by a self-referential loop in which objects encode functions that act back on these objects. A model for this loop is presented. It uses a simple recursive formal language, derived from the lambda-calculus, to provide a semantics that maps character strings into functions that manipulate symbols on strings. The interaction between two functions, or algorithms, is defined naturally within the language through function composition, and results in the production of a new function. An iterated map acting on sets of functions and a corresponding graph representation are defined. Their properties are useful to discuss the behavior of a fixed size ensemble of randomly interacting functions. This "function gas", or "Turing gas", is studied under various conditions, and evolves cooperative interaction patterns of considerable intricacy. These patterns adapt under the influence of perturbations consisting of the addition of new random functions to the system. Different organizations emerge depending on the availability of self-replicators.
Firmware algorithms for PINGU experiment
NASA Astrophysics Data System (ADS)
Pankova, Daria; Anderson, Tyler; IceCube Collaboration
2017-01-01
PINGU is a future low energy extension of the IceCube experiment. It will be implemented as several additional, more closely positioned strings of digital optical modules (DOMs) inside the main detector volume. PINGU would be able to register neutrinos with energies as low as a few GeV. One of the proposed designs for the new PINGU DOMs is an updated version of the IceCube DOM with newer electronic components, particularly a better, more modern FPGA. With those improvements it is desirable to run some waveform feature extraction directly on the DOM, thus decreasing the amount of data sent over the detector's bandwidth-limited cable. In order to use the existing feature extraction package for this purpose, the signal waveform needs to be prepared by subtracting a variable baseline from it. The baseline shape depends mostly on the environment temperature, which causes a long-term drift of the signal, and on the induction used in the signal readout electronics, which modifies the signal shape. Algorithms have been selected to counter those baseline variances, modeled and partly implemented in FPGA fabric. The simulation shows good agreement between the initial signal and the "corrected" version.
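The baseline-subtraction step described above can be sketched in software. This is a generic rolling-median baseline estimate, an assumption standing in for the FPGA algorithms the abstract refers to; the window length and the synthetic waveform are hypothetical:

```python
import numpy as np

def subtract_baseline(waveform, window=51):
    """Estimate a slowly varying baseline with a rolling median and
    subtract it, leaving short pulses intact (a software sketch of
    the idea; the actual correction runs in FPGA fabric)."""
    half = window // 2
    padded = np.pad(waveform, half, mode="edge")
    baseline = np.array([np.median(padded[i:i + window])
                         for i in range(len(waveform))])
    return waveform - baseline, baseline

t = np.arange(500)
drift = 0.02 * t                                    # slow thermal-like drift
pulse = np.where((t > 200) & (t < 210), 5.0, 0.0)   # short pulse to preserve
corrected, base = subtract_baseline(drift + pulse)
```

Because the pulse occupies far fewer samples than the median window, the baseline estimate tracks only the drift, and the corrected trace keeps the pulse on a flat background.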
An efficient approximation algorithm for finding a maximum clique using Hopfield network learning.
Wang, Rong Long; Tang, Zheng; Cao, Qi Ping
2003-07-01
In this article, we present a solution to the maximum clique problem using a gradient-ascent learning algorithm of the Hopfield neural network. This method provides a near-optimum parallel algorithm for finding a maximum clique. To do this, we use the Hopfield neural network to generate a near-maximum clique and then modify weights in a gradient-ascent direction to allow the network to escape from the state of near-maximum clique to maximum clique or better. The proposed parallel algorithm is tested on two types of random graphs and some benchmark graphs from the Center for Discrete Mathematics and Theoretical Computer Science (DIMACS). The simulation results show that the proposed learning algorithm can find good solutions in reasonable computation time.
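For orientation, the underlying problem can be illustrated with a much simpler method than the Hopfield learner described above: a randomized greedy baseline for maximum clique (plainly a different and far weaker technique than the paper's gradient-ascent learning algorithm):

```python
import random

def greedy_clique(adj, trials=200, seed=0):
    """Randomized greedy baseline for maximum clique: grow a clique
    along a random vertex order, keep the best clique found.
    adj: dict mapping each vertex to the set of its neighbours."""
    rng = random.Random(seed)
    best = []
    vertices = list(adj)
    for _ in range(trials):
        rng.shuffle(vertices)
        clique = []
        for v in vertices:
            if all(v in adj[u] for u in clique):  # v adjacent to all members
                clique.append(v)
        if len(clique) > len(best):
            best = list(clique)
    return best

# Toy graph: a triangle {0, 1, 2} plus a pendant vertex 3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(sorted(greedy_clique(adj)))  # -> [0, 1, 2]
```

On DIMACS-style benchmarks such a baseline is quickly trapped in local optima, which is exactly the failure mode the paper's weight-update escape mechanism targets.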
A simple parallel prefix algorithm for compact finite-difference schemes
NASA Technical Reports Server (NTRS)
Sun, Xian-He; Joslin, Ronald D.
1993-01-01
A compact scheme is a discretization scheme that is advantageous in obtaining highly accurate solutions. However, the resulting systems from compact schemes are tridiagonal systems that are difficult to solve efficiently on parallel computers. Considering the almost symmetric Toeplitz structure, a parallel algorithm, simple parallel prefix (SPP), is proposed. The SPP algorithm requires less memory than the conventional LU decomposition and is highly efficient on parallel machines. It consists of a prefix communication pattern and AXPY operations. Both the computation and the communication can be truncated without degrading the accuracy when the system is diagonally dominant. A formal accuracy study was conducted to provide a simple truncation formula. Experimental results were measured on a MasPar MP-1 SIMD machine and on a Cray 2 vector machine. Experimental results show that the simple parallel prefix algorithm is a good algorithm for the compact scheme on high-performance computers.
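The prefix communication pattern at the heart of SPP can be illustrated with an inclusive prefix sum in the Hillis-Steele style, where each of the log2(n) sweeps is fully data-parallel (a sketch of the scan pattern only, not the paper's tridiagonal solver):

```python
import numpy as np

def prefix_sum_scan(x):
    """Inclusive prefix sum via the Hillis-Steele pattern: log2(n)
    sweeps, each of which could execute fully in parallel (NumPy
    vector operations stand in for the parallel step here)."""
    x = np.array(x, dtype=float)
    d = 1
    while d < len(x):
        shifted = np.zeros_like(x)
        shifted[d:] = x[:-d]
        x = x + shifted          # one parallel AXPY-like sweep
        d *= 2
    return x

print(prefix_sum_scan([1, 2, 3, 4]))  # -> [ 1.  3.  6. 10.]
```

In SPP the same communication pattern carries the recurrence terms of the almost-Toeplitz system, and, as the abstract notes, sweeps can be truncated for diagonally dominant systems without degrading accuracy.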
Study on algorithm and real-time implementation of infrared image processing based on FPGA
NASA Astrophysics Data System (ADS)
Pang, Yulin; Ding, Ruijun; Liu, Shanshan; Chen, Zhe
2010-10-01
With the fast development of Infrared Focal Plane Array (IRFPA) detectors, high quality real-time image processing becomes more important in infrared imaging systems. Facing the demand for better visual effects and good performance, we find the FPGA an ideal hardware choice for realizing image processing algorithms, taking full advantage of its high speed, high reliability and ability to process large amounts of data in parallel. In this paper, a new dynamic linear extension algorithm is introduced, which automatically finds the proper extension range. This image enhancement algorithm is designed in Verilog HDL and realized on an FPGA. It runs at higher speed than serial processing devices such as CPUs and DSPs. Experiments show that this hardware unit for dynamic linear extension effectively enhances the visual quality of infrared images.
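The idea of a linear extension with an automatically chosen range can be sketched as a percentile-based contrast stretch. The percentile rule is an illustrative assumption; the paper's range-finding logic is implemented in Verilog and not reproduced here:

```python
import numpy as np

def dynamic_linear_extension(img, low_pct=1.0, high_pct=99.0):
    """Stretch the grey levels between two automatically chosen
    percentiles to the full 8-bit range (the percentile choice is an
    assumption standing in for the paper's range-finding step)."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    out = (img.astype(float) - lo) / max(hi - lo, 1e-9) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)

# A flat, low-contrast frame of the kind an IRFPA might produce.
raw = np.random.default_rng(0).integers(8000, 8200, size=(64, 64))
enhanced = dynamic_linear_extension(raw)
```

The narrow 200-count input range is mapped onto the full 0-255 display range, which is the visual-effect gain the abstract describes.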
Linear Bregman algorithm implemented in parallel GPU
NASA Astrophysics Data System (ADS)
Li, Pengyan; Ke, Jue; Sui, Dong; Wei, Ping
2015-08-01
At present, most compressed sensing (CS) algorithms have poor convergence speed and are thus difficult to run on a PC. To deal with this issue, we use a parallel GPU to implement a broadly used compressed sensing algorithm, the linear Bregman algorithm. The linear iterative Bregman algorithm is a reconstruction algorithm proposed by Osher and Cai. Compared with other CS reconstruction algorithms, the linear Bregman algorithm involves only vector and matrix multiplications and a thresholding operation, and is simpler and more efficient to program. We use C as the development language and adopt CUDA (Compute Unified Device Architecture) as the parallel computing architecture. In this paper, we compare the parallel Bregman algorithm with a traditional CPU implementation of the Bregman algorithm. In addition, we also compare it with other CS reconstruction algorithms, such as the OMP and TwIST algorithms. The results show that the parallel Bregman algorithm needs less time, and thus is more convenient for real-time object reconstruction, which is important given the fast-growing demand for information technology.
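The linearized Bregman iteration really does reduce to matrix-vector products plus a soft threshold, which is what makes it amenable to GPU parallelization. A serial NumPy sketch for sparse solutions of Au = b; the step size and threshold are common textbook choices, not the paper's settings:

```python
import numpy as np

def linearized_bregman(A, b, mu=5.0, delta=None, iters=2000):
    """Linearized Bregman iteration for sparse solutions of Au = b:
    each step is one residual back-projection (matrix-vector work,
    the GPU-friendly part) followed by a soft threshold."""
    if delta is None:
        delta = 1.0 / np.linalg.norm(A, 2) ** 2   # common step-size choice
    v = np.zeros(A.shape[1])
    u = np.zeros(A.shape[1])
    for _ in range(iters):
        v += A.T @ (b - A @ u)                    # accumulate residual
        u = delta * np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)
    return u

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[3, 17, 40]] = [2.0, -1.5, 1.0]            # sparse ground truth
b = A @ x_true
u = linearized_bregman(A, b)
```

Each iteration is embarrassingly parallel across vector components, so porting the loop body to CUDA kernels is straightforward, which is the point the abstract makes.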
An improved distributed routing algorithm for Benes based optical NoC
NASA Astrophysics Data System (ADS)
Zhang, Jing; Gu, Huaxi; Yang, Yintang
2010-08-01
Integrated optical interconnects are believed to be one of the main technologies to replace electrical wires. Optical Network-on-Chip (ONoC) has therefore attracted much attention. The Benes topology is a good choice for ONoC because of its rearrangeable non-blocking character, multistage feature and easy scalability. The routing algorithm plays an important role in determining the performance of an ONoC. Because traditional routing algorithms for Benes networks are not suitable for ONoC communication, we developed a new distributed routing algorithm for Benes ONoC. Our algorithm selects the routing path dynamically according to network conditions and enables more path choices for messages traveling through the network. We used OPNET to evaluate the performance of our routing algorithm and compared it with a well-known bit-controlled routing algorithm. End-to-end (ETE) delay and throughput were measured for different packet lengths and network sizes. Simulation results show that our routing algorithm provides better performance for ONoC.
"Show me" bioethics and politics.
Christopher, Myra J
2007-10-01
Missouri, the "Show Me State," has become the epicenter of several important national public policy debates, including abortion rights, the right to choose and refuse medical treatment, and, most recently, early stem cell research. In this environment, the Center for Practical Bioethics (formerly, Midwest Bioethics Center) emerged and grew. The Center's role in these "cultural wars" is not to advocate for a particular position but to provide well researched and objective information, perspective, and advocacy for the ethical justification of policy positions; and to serve as a neutral convener and provider of a public forum for discussion. In this article, the Center's work on early stem cell research is a case study through which to argue that not only the Center, but also the field of bioethics has a critical role in the politics of public health policy.
Phoenix Scoop Inverted Showing Rasp
NASA Technical Reports Server (NTRS)
2008-01-01
This image taken by the Surface Stereo Imager on Sol 49, or the 49th Martian day of the mission (July 14, 2008), shows the silver colored rasp protruding from NASA's Phoenix Mars Lander's Robotic Arm scoop. The scoop is inverted and the rasp is pointing up.
Shown with its forks pointing toward the ground is the thermal and electrical conductivity probe, at the lower right. The Robotic Arm Camera is pointed toward the ground.
The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is led by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.
Sinclair, Michael B
2012-01-05
ShowMe3D is a data visualization graphical user interface specifically designed for use with hyperspectral images obtained from the Hyperspectral Confocal Microscope. The program allows the user to select and display any single image from a three-dimensional hyperspectral image stack. By moving a slider control, the user can easily move between images of the stack. The user can zoom into any region of the image. The user can select any pixel or region from the displayed image and display the fluorescence spectrum associated with that pixel or region. The user can define up to 3 spectral filters to apply to the hyperspectral image and view the image as it would appear from a filter-based confocal microscope. The user can also obtain statistics such as intensity average and variance from selected regions.
Good-enough linguistic representations and online cognitive equilibrium in language processing.
Karimi, Hossein; Ferreira, Fernanda
2016-01-01
We review previous research showing that representations formed during language processing are sometimes just "good enough" for the task at hand and propose the "online cognitive equilibrium" hypothesis as the driving force behind the formation of good-enough representations in language processing. Based on this view, we assume that the language comprehension system by default prefers to achieve as early as possible and remain as long as possible in a state of cognitive equilibrium where linguistic representations are successfully incorporated with existing knowledge structures (i.e., schemata) so that a meaningful and coherent overall representation is formed, and uncertainty is resolved or at least minimized. We also argue that the online equilibrium hypothesis is consistent with current theories of language processing, which maintain that linguistic representations are formed through a complex interplay between simple heuristics and deep syntactic algorithms and also theories that hold that linguistic representations are often incomplete and lacking in detail. We also propose a model of language processing that makes use of both heuristic and algorithmic processing, is sensitive to online cognitive equilibrium, and, we argue, is capable of explaining the formation of underspecified representations. We review previous findings providing evidence for underspecification in relation to this hypothesis and the associated language processing model and argue that most of these findings are compatible with them.
Jiang, Wenjuan; Shi, Yunbo; Zhao, Wenjie; Wang, Xiangxin
2016-01-01
The main part of the magnetic fluxgate sensor is the magnetic core, the hysteresis characteristic of which affects the performance of the sensor. When the fluxgate sensors are modelled for design purposes, an accurate model of hysteresis characteristic of the cores is necessary to achieve good agreement between modelled and experimental data. The Jiles-Atherton model is simple and can reflect the hysteresis properties of the magnetic material precisely, which makes it widely used in hysteresis modelling and simulation of ferromagnetic materials. However, in practice, it is difficult to determine the parameters accurately owing to the sensitivity of the parameters. In this paper, the Biogeography-Based Optimization (BBO) algorithm is applied to identify the Jiles-Atherton model parameters. To enhance the performances of the BBO algorithm such as global search capability, search accuracy and convergence rate, an improved Biogeography-Based Optimization (IBBO) algorithm is put forward by using Arnold map and mutation strategy of Differential Evolution (DE) algorithm. Simulation results show that IBBO algorithm is superior to Genetic Algorithm (GA), Particle Swarm Optimization (PSO) algorithm, Differential Evolution algorithm and BBO algorithm in identification accuracy and convergence rate. The IBBO algorithm is applied to identify Jiles-Atherton model parameters of selected permalloy. The simulation hysteresis loop is in high agreement with experimental data. Using permalloy as core of fluxgate probe, the simulation output is consistent with experimental output. The IBBO algorithm can identify the parameters of Jiles-Atherton model accurately, which provides a basis for the precise analysis and design of instruments and equipment with magnetic core. PMID:27347974
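As a rough illustration of evolutionary parameter identification, here is a minimal DE/rand/1/bin loop fitting a deliberately simple stand-in model by least squares. The Jiles-Atherton equations, the Arnold map, and the paper's IBBO hybrid are not reproduced; the model, bounds, and hyperparameters are hypothetical:

```python
import numpy as np

def differential_evolution(loss, bounds, pop=30, gens=150, F=0.7, CR=0.9, seed=2):
    """Minimal DE/rand/1/bin minimizer: scaled difference-vector
    mutation, binomial crossover, greedy selection."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    P = rng.uniform(lo, hi, size=(pop, len(bounds)))
    fit = np.array([loss(p) for p in P])
    for _ in range(gens):
        for i in range(pop):
            a, b, c = P[rng.choice([j for j in range(pop) if j != i], 3,
                                   replace=False)]
            trial = np.where(rng.random(len(bounds)) < CR, a + F * (b - c), P[i])
            trial = np.clip(trial, lo, hi)
            f = loss(trial)
            if f <= fit[i]:              # greedy selection
                P[i], fit[i] = trial, f
    return P[fit.argmin()], fit.min()

# Hypothetical stand-in for the hysteresis model: recover the amplitude
# and decay rate of y = A * exp(-k * t) from noiseless synthetic data.
t = np.linspace(0, 5, 80)
data = 2.5 * np.exp(-0.8 * t)
loss = lambda p: float(np.mean((p[0] * np.exp(-p[1] * t) - data) ** 2))
params, best = differential_evolution(loss, [(0, 5), (0, 2)])
```

The same loop structure applies when `loss` compares a simulated Jiles-Atherton hysteresis curve with measured data; the paper's contribution lies in improving the search dynamics, not in changing this outer fitting loop.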
Benchmarking monthly homogenization algorithms
NASA Astrophysics Data System (ADS)
Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.
2011-08-01
Training was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference were shown to perform best. The study showed that automatic algorithms can now perform as well as manual ones.
Casimir experiments showing saturation effects
Sernelius, Bo E.
2009-10-15
We address several different Casimir experiments where theory and experiment disagree. The first is the classical Casimir force measurement between two metal half spaces; here both in the form of the torsion pendulum experiment by Lamoreaux and in the form of the Casimir pressure measurement between a gold sphere and a gold plate as performed by Decca et al.; theory predicts a large negative thermal correction, absent in the high precision experiments. The third experiment is the measurement of the Casimir force between a metal plate and a laser irradiated semiconductor membrane as performed by Chen et al.; the change in force with laser intensity is larger than predicted by theory. The fourth experiment is the measurement of the Casimir force between an atom and a wall in the form of the measurement by Obrecht et al. of the change in oscillation frequency of a ⁸⁷Rb Bose-Einstein condensate trapped to a fused silica wall; the change is smaller than predicted by theory. We show that saturation effects can explain the discrepancies between theory and experiment observed in all these cases.
GOES-West Shows U.S. West's Record Rainfall
A new time-lapse animation of data from NOAA's GOES-West satellite provides a good picture of why the U.S. West Coast continues to experience record rainfall. The new animation shows the movement o...
Bunkhouse basement interior showing storage area and a conveyor belt ...
Bunkhouse basement interior showing storage area and a conveyor belt (circa 1936) used to unload dry goods into the basement through an opening on the east side of the bunkhouse. - Sespe Ranch, Bunkhouse, 2896 Telegraph Road, Fillmore, Ventura County, CA
Lee, Thomas H
2009-06-01
Whether it's a basic Mr. Coffee or a gadget that sports a snazzy device for grinding beans on demand, the office coffee machine offers a place for serendipitous encounters that can improve the social aspect of work and generate new ideas. What's more, a steaming cup of joe may be as good for your health as it is for the bottom line, says Lee, a professor of medicine at Harvard Medical School and the CEO of Partners Community HealthCare. Fears of coffee's carcinogenic effects now appear to be unfounded, and, in fact, the brew might even protect against some types of cancer. What's more, coffee may guard against Alzheimer's disease and other forms of dementia and somehow soften the blow of a heart attack. Of course, its role as a pick-me-up is well known. So there's no need to take your coffee with a dollop of guilt, especially if you ease up on the sugar, cream, double chocolate, and whipped-cream topping.
Performance analysis of LVQ algorithms: a statistical physics approach.
Ghosh, Anarta; Biehl, Michael; Hammer, Barbara
2006-01-01
Learning vector quantization (LVQ) constitutes a powerful and intuitive method for adaptive nearest prototype classification. However, original LVQ has been introduced based on heuristics and numerous modifications exist to achieve better convergence and stability. Recently, a mathematical foundation by means of a cost function has been proposed which, as a limiting case, yields a learning rule similar to classical LVQ2.1. It also motivates a modification which shows better stability. However, the exact dynamics as well as the generalization ability of many LVQ algorithms have not been thoroughly investigated so far. Using concepts from statistical physics and the theory of on-line learning, we present a mathematical framework to analyse the performance of different LVQ algorithms in a typical scenario in terms of their dynamics, sensitivity to initial conditions, and generalization ability. Significant differences in the algorithmic stability and generalization ability can be found already for slightly different variants of LVQ. We study five LVQ algorithms in detail: Kohonen's original LVQ1, unsupervised vector quantization (VQ), a mixture of VQ and LVQ, LVQ2.1, and a variant of LVQ which is based on a cost function. Surprisingly, basic LVQ1 shows very good performance in terms of stability, asymptotic generalization ability, and robustness to initializations and model parameters which, in many cases, is superior to recent alternative proposals.
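Kohonen's LVQ1, the best performer in the study's stability comparison, has a very compact update rule: move the nearest prototype toward a correctly labeled sample and away from a mislabeled one. A minimal sketch on synthetic data; the learning rate and schedule are illustrative assumptions:

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, eta=0.05, epochs=20, seed=3):
    """Kohonen's LVQ1: pull the winning prototype toward a sample of
    the same class, push it away from a sample of a different class."""
    rng = np.random.default_rng(seed)
    W = prototypes.astype(float).copy()
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            j = np.argmin(((W - X[i]) ** 2).sum(axis=1))   # winner
            sign = 1.0 if proto_labels[j] == y[i] else -1.0
            W[j] += sign * eta * (X[i] - W[j])
    return W

# Two separable Gaussian blobs, one prototype per class.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W = lvq1_train(X, y, np.array([[0.5, 0.5], [1.5, 1.5]]), [0, 1])
```

After training, classification is nearest-prototype lookup; the paper's analysis concerns how variations of exactly this update affect stability and generalization.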
Spectral Regularization Algorithms for Learning Large Incomplete Matrices.
Mazumder, Rahul; Hastie, Trevor; Tibshirani, Robert
2010-03-01
We use convex relaxation techniques to provide a sequence of regularized low-rank solutions for large-scale matrix completion problems. Using the nuclear norm as a regularizer, we provide a simple and very efficient convex algorithm for minimizing the reconstruction error subject to a bound on the nuclear norm. Our algorithm Soft-Impute iteratively replaces the missing elements with those obtained from a soft-thresholded SVD. With warm starts this allows us to efficiently compute an entire regularization path of solutions on a grid of values of the regularization parameter. The computationally intensive part of our algorithm is in computing a low-rank SVD of a dense matrix. Exploiting the problem structure, we show that the task can be performed with a complexity linear in the matrix dimensions. Our semidefinite-programming algorithm is readily scalable to large matrices: for example it can obtain a rank-80 approximation of a 10^6 × 10^6 incomplete matrix with 10^5 observed entries in 2.5 hours, and can fit a rank-40 approximation to the full Netflix training set in 6.6 hours. Our methods show very good performance both in training and test error when compared to other competitive state-of-the-art techniques.
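The core Soft-Impute loop described above, fill the missing entries with the current estimate, soft-threshold the singular values of an SVD, and repeat, can be sketched in a few lines (a dense NumPy SVD here; the paper's scalability comes from exploiting low-rank structure, which this sketch omits):

```python
import numpy as np

def soft_impute(X, mask, lam=1.0, iters=100):
    """Soft-Impute sketch: alternate filling unobserved entries with
    the current estimate and soft-thresholding the singular values.
    X: matrix (values at unobserved positions are ignored);
    mask: True where an entry is observed."""
    Z = np.where(mask, X, 0.0)
    for _ in range(iters):
        filled = np.where(mask, X, Z)               # impute missing entries
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        s = np.maximum(s - lam, 0.0)                # nuclear-norm shrinkage
        Z = (U * s) @ Vt
    return Z

rng = np.random.default_rng(4)
truth = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 20))  # rank 3
mask = rng.random(truth.shape) < 0.6                # observe ~60% of entries
Z = soft_impute(truth, mask, lam=0.5)
```

Sweeping `lam` from large to small with warm starts traces the regularization path the paper describes.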
Denoising of gravitational wave signals via dictionary learning algorithms
NASA Astrophysics Data System (ADS)
Torres-Forné, Alejandro; Marquina, Antonio; Font, José A.; Ibáñez, José M.
2016-12-01
Gravitational wave astronomy has become a reality after the historical detections accomplished during the first observing run of the two advanced LIGO detectors. In the following years, the number of detections is expected to increase significantly with the full commissioning of the advanced LIGO, advanced Virgo and KAGRA detectors. The development of sophisticated data analysis techniques to improve the opportunities of detection for low signal-to-noise-ratio events is, hence, a most crucial effort. In this paper, we present one such technique, dictionary-learning algorithms, which have been extensively developed in the last few years and successfully applied mostly in the context of image processing. However, to the best of our knowledge, such algorithms have not yet been employed to denoise gravitational wave signals. By building dictionaries from numerical relativity templates of both binary black holes mergers and bursts of rotational core collapse, we show how machine-learning algorithms based on dictionaries can also be successfully applied for gravitational wave denoising. We use a subset of signals from both catalogs, embedded in nonwhite Gaussian noise, to assess our techniques with a large sample of tests and to find the best model parameters. The application of our method to the actual signal GW150914 shows promising results. Dictionary-learning algorithms could be a complementary addition to the gravitational wave data analysis toolkit. They may be used to extract signals from noise and to infer physical parameters if the data are in good enough agreement with the morphology of the dictionary atoms.
Persons as goods: response to Patrick Lee.
Chappell, T D J
2004-01-01
Developing a British perspective on the abortion debate, I take up some ideas from Patrick Lee's fine paper, and pursue, in particular, the idea of individual humans as goods in themselves. I argue that this notion helps us to avoid the familiar mistake of making moral value impersonal. It also shows us the way out of consequentalism. Since the most philosophically viable notion of the person, the individual human, is (as Lee argues) a notion of individual substance that is there from conception, the move has a third effect, which is to rule out abortion.
What is it to do good medical ethics? On the concepts of 'good' and 'goodness' in medical ethics.
Solbakk, Jan Helge
2015-01-01
In his book The Varieties of Goodness Georg Henrik von Wright advocates that a useful preliminary to the study of the word 'good' is to compile a list of familiar uses and try to group them under some main headings. The present paper aims at exploring the question, 'What is it to do good medical ethics?', and notably from the vantage point of everyday expressions of the word 'good' and von Wright's grouping of them into six different types of goodness.
Statistical behaviour of adaptive multilevel splitting algorithms in simple models
NASA Astrophysics Data System (ADS)
Rolland, Joran; Simonnet, Eric
2015-02-01
Adaptive multilevel splitting algorithms have been introduced rather recently for estimating tail distributions in a fast and efficient way. In particular, they can be used for computing the so-called reactive trajectories corresponding to direct transitions from one metastable state to another. The algorithm is based on successive selection-mutation steps performed on the system in a controlled way. It has two intrinsic parameters, the number of particles/trajectories and the reaction coordinate used for discriminating good or bad trajectories. We investigate first the convergence in law of the algorithm as a function of the timestep for several simple stochastic models. Second, we consider the average duration of reactive trajectories for which no theoretical predictions exist. The most important aspect of this work concerns some systems with two degrees of freedom. They are studied in detail as a function of the reaction coordinate in the asymptotic regime where the number of trajectories goes to infinity. We show that during phase transitions, the statistics of the algorithm deviate significantly from known theoretical results when using non-optimal reaction coordinates. In this case, the variance of the algorithm peaks at the transition and the convergence of the algorithm can be much slower than the usual expected central limit behaviour. The duration of trajectories is affected as well. Moreover, reactive trajectories do not correspond to the most probable ones. Such behaviour disappears when using the optimal reaction coordinate called the committor, as predicted by the theory. We finally investigate a three-state Markov chain which reproduces this phenomenon and show logarithmic convergence of the trajectory durations.
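The selection-mutation mechanism of adaptive multilevel splitting can be sketched on the simplest possible example, estimating a Gaussian tail probability. The Metropolis resampling move and all tuning constants are illustrative assumptions; the paper's reactive-trajectory setting is richer:

```python
import numpy as np

def ams_gaussian_tail(a=3.0, n=100, mcmc_steps=20, seed=5):
    """Adaptive multilevel splitting sketch for p = P(X > a), X ~ N(0,1):
    kill the lowest particle, clone a survivor, decorrelate the clone
    with a few Metropolis steps constrained above the killed level.
    Each kill multiplies the estimate by (1 - 1/n)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    p_hat = 1.0
    while x.min() < a:
        i = np.argmin(x)                       # selection: worst particle
        level = x[i]
        p_hat *= 1.0 - 1.0 / n
        j = (i + 1 + rng.integers(n - 1)) % n  # a survivor, never i itself
        x[i] = x[j]                            # cloning
        for _ in range(mcmc_steps):            # mutation: Metropolis > level
            prop = x[i] + 0.5 * rng.standard_normal()
            if prop > level and rng.random() < np.exp((x[i] ** 2 - prop ** 2) / 2):
                x[i] = prop
        if p_hat < 1e-12:                      # guard for this sketch only
            break
    return p_hat

p = ams_gaussian_tail()
# exact value for comparison: P(X > 3) is about 1.35e-3
```

The estimate reaches probabilities far below what naive Monte Carlo with the same budget could resolve, which is the "fast and efficient" tail estimation the abstract refers to; the choice of a non-optimal reaction coordinate in higher dimensions is what degrades this behaviour in the paper's study.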
NASA Technical Reports Server (NTRS)
2005-01-01
False color images of Saturn's moon, Mimas, reveal variation in either the composition or texture across its surface.
During its approach to Mimas on Aug. 2, 2005, the Cassini spacecraft narrow-angle camera obtained multi-spectral views of the moon from a range of 228,000 kilometers (142,500 miles).
The image at the left is a narrow angle clear-filter image, which was separately processed to enhance the contrast in brightness and sharpness of visible features. The image at the right is a color composite of narrow-angle ultraviolet, green, infrared and clear filter images, which have been specially processed to accentuate subtle changes in the spectral properties of Mimas' surface materials. To create this view, three color images (ultraviolet, green and infrared) were combined into a single black and white picture that isolates and maps regional color differences. This 'color map' was then superimposed over the clear-filter image at the left.
The combination of color map and brightness image shows how the color differences across the Mimas surface materials are tied to geological features. Shades of blue and violet in the image at the right are used to identify surface materials that are bluer in color and have a weaker infrared brightness than average Mimas materials, which are represented by green.
Herschel crater, a 140-kilometer-wide (88-mile) impact feature with a prominent central peak, is visible in the upper right of each image. The unusual bluer materials are seen to broadly surround Herschel crater. However, the bluer material is not uniformly distributed in and around the crater. Instead, it appears to be concentrated on the outside of the crater and more to the west than to the north or south. The origin of the color differences is not yet understood. It may represent ejecta material that was excavated from inside Mimas when the Herschel impact occurred. The bluer color of these materials may be caused by subtle differences in
Algorithm Animation with Galant.
Stallmann, Matthias F
2017-01-01
Although surveys suggest positive student attitudes toward the use of algorithm animations, it is not clear that they improve learning outcomes. The Graph Algorithm Animation Tool, or Galant, challenges and motivates students to engage more deeply with algorithm concepts, without distracting them with programming language details or GUIs. Even though Galant is specifically designed for graph algorithms, it has also been used to animate other algorithms, most notably sorting algorithms.
Surveys show support for green 'activities'.
Baillie, Jonathan
2012-03-01
Two independently conducted surveys on sustainability - one into the 'views and values' of NHS 'leaders', the other questioning the public about the importance of the 'green agenda' in the NHS and how the service might most effectively reduce its carbon footprint - form the basis of Sustainability in the NHS: Health Check 2012, a new NHS Sustainable Development Unit (NHS SDU) publication. As HEJ editor Jonathan Baillie reports, the new document also presents updated data on the size of the carbon footprint of the NHS in England. It shows that, although good work by a number of Trusts over the past two years has seen healthcare-generated carbon emissions start to 'level off', the biggest contributors to this have been the current health service spending review and the increased national availability of renewable energy.
Vertical Sextants give Good Sights
NASA Astrophysics Data System (ADS)
Dixon, Mark
Many texts stress the need for marine sextants to be held precisely vertical at the instant that the altitude of a heavenly body is measured. Several authors lay particular emphasis on the technique of rocking the instrument in a small arc about the horizontal axis to obtain a good sight. Nobody, to the author's knowledge, however, has attempted to quantify the errors involved, so as to compare them with other errors inherent in determining celestial position lines. This paper sets out to address these issues and to pose the question: what level of accuracy of vertical alignment can reasonably be expected during marine sextant work at sea? When a heavenly body is brought to tangency with the visible horizon it is particularly important to ensure that the sextant is held in a truly vertical position. To this end the instrument is rocked gently about the horizontal so that the image of the body describes a small arc in the observer's field of vision. As Bruce Bauer points out, tangency with the horizon must be achieved during the process of rocking and not a second or so after rocking has been discontinued. The altitude is recorded for the instant that the body kisses the visible horizon at the lowest point of the rocking arc, as in Fig. 2. The only other visual clue as to whether the sextant is vertical is provided by the right angle made by the vertical edge of the horizon glass mirror with the horizon. There may also be some input from the observer's sense of balance and his hand orientation.
Spaceborne SAR Imaging Algorithm for Coherence Optimized
Qiu, Zhiwei; Yue, Jianping; Wang, Xueqin; Yue, Shun
2016-01-01
This paper proposes a SAR imaging algorithm that maximizes coherence, building on existing SAR imaging algorithms. The basic idea of SAR imaging processing is that the output signal can attain the maximum signal-to-noise ratio (SNR) by using optimal imaging parameters. Traditional imaging algorithms achieve the best focusing effect but introduce decoherence in the subsequent interferometric processing. In the algorithm proposed here, the SAR echoes adopt consistent imaging parameters during focusing. Although the SNR of the output signal is reduced slightly, coherence is largely preserved, and an interferogram of high quality is finally obtained. Two scenes of Envisat ASAR data over Zhangbei are employed to test the algorithm. Compared with the interferogram from the traditional algorithm, the results show that this algorithm is more suitable for SAR interferometry (InSAR) research and application. PMID:26871446
Analysis of the Dryden Wet Bulb Globe Temperature Algorithm for White Sands Missile Range
NASA Technical Reports Server (NTRS)
LaQuay, Ryan Matthew
2011-01-01
In locations where the workforce is exposed to high relative humidity and light winds, heat stress is a significant concern. Such is the case at the White Sands Missile Range in New Mexico. Heat stress is depicted by the wet bulb globe temperature, which is the official measurement used by the American Conference of Governmental Industrial Hygienists. The wet bulb globe temperature is measured by an instrument that is designed to be portable and requires routine maintenance. As an alternative, algorithms have been created to calculate the wet bulb globe temperature from basic meteorological observations. These algorithms are location dependent, so a specific algorithm is usually not suitable for multiple locations. Owing to climatological similarities, the algorithm developed for use at the Dryden Flight Research Center was applied to data from the White Sands Missile Range. A study was performed that compared a wet bulb globe instrument to data from two Surface Atmospheric Measurement Systems applied to the Dryden wet bulb globe temperature algorithm. The period of study was June to September of 2009, with focus on 0900 to 1800 local time. Analysis showed that the algorithm worked well, with a few exceptions. The algorithm becomes less accurate when the dew point temperature is over 10 °C. Cloud cover also has a significant effect on the measured wet bulb globe temperature. The algorithm does not capture red and black heat stress flags well, owing to the shorter time scales of such events. The results of this study show that it is plausible that the Dryden Flight Research Center wet bulb globe temperature algorithm is compatible with the White Sands Missile Range, except when there are increased dew point temperatures, cloud cover or precipitation. During such occasions, the wet bulb globe temperature instrument would be the preferred method of measurement. Out of the 30
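For context, the quantity that any such algorithm approximates is the standard outdoor WBGT composite: a fixed weighting of three component temperatures. The sketch below shows only that standard definition; the Dryden regression that estimates the components from routine meteorological observations is not reproduced here.

```python
def wbgt_outdoor(t_natural_wet_bulb, t_globe, t_dry_bulb):
    """Standard outdoor wet bulb globe temperature (deg C).

    The natural wet bulb term dominates (70% weight), which is why
    accuracy degrades at high dew point temperatures, and the black
    globe term carries the solar load, which is why cloud cover
    matters."""
    return (0.7 * t_natural_wet_bulb
            + 0.2 * t_globe
            + 0.1 * t_dry_bulb)
```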
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 1 2013-10-01 2013-10-01 false Good faith. 93.210 Section 93.210 Public Health... MISCONDUCT Definitions § 93.210 Good faith. Good faith as applied to a complainant or witness, means having a... allegation or cooperation with a research misconduct proceeding is not in good faith if made with knowing...
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 1 2014-10-01 2014-10-01 false Good faith. 93.210 Section 93.210 Public Health... MISCONDUCT Definitions § 93.210 Good faith. Good faith as applied to a complainant or witness, means having a... allegation or cooperation with a research misconduct proceeding is not in good faith if made with knowing...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 1 2011-10-01 2011-10-01 false Good faith. 93.210 Section 93.210 Public Health... MISCONDUCT Definitions § 93.210 Good faith. Good faith as applied to a complainant or witness, means having a... allegation or cooperation with a research misconduct proceeding is not in good faith if made with knowing...
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 1 2012-10-01 2012-10-01 false Good faith. 93.210 Section 93.210 Public Health... MISCONDUCT Definitions § 93.210 Good faith. Good faith as applied to a complainant or witness, means having a... allegation or cooperation with a research misconduct proceeding is not in good faith if made with knowing...
The Common Good in Classical Political Philosophy
ERIC Educational Resources Information Center
Lewis, V. Bradley
2006-01-01
The term "common good" names the end (or final cause) of political and social life in the tradition of moral thought that owes its main substance to Aristotle and St. Thomas Aquinas. It names a genuine good ("bonum honestum") and not merely an instrumental or secondary good defeasible in the face of particular goods. However, at the same time, it…
NASA Astrophysics Data System (ADS)
Rao, Sailesh K.; Kollath, T.
1986-07-01
In this paper, we show that every systolic array executes a Regular Iterative Algorithm with a strongly separating hyperplane and conversely, that every such algorithm can be implemented on a systolic array. This characterization provides us with a unified framework for describing the contributions of other authors. It also exposes the relevance of many fundamental concepts that were introduced in the sixties by Hennie, Waite and Karp, Miller and Winograd, to the present day concern of systolic array
A New Algorithm for Detecting Cloud Height using OMPS/LP Measurements
NASA Technical Reports Server (NTRS)
Chen, Zhong; DeLand, Matthew; Bhartia, Pawan K.
2016-01-01
The Ozone Mapping and Profiler Suite Limb Profiler (OMPS/LP) ozone product requires the determination of cloud height for each event to establish the lower boundary of the profile for the retrieval algorithm. We have created a revised cloud detection algorithm for LP measurements that uses the spectral dependence of the vertical gradient in radiance between two wavelengths in the visible and near-IR spectral regions. This approach provides better discrimination between clouds and aerosols than results obtained using a single wavelength. Observed LP cloud height values show good agreement with coincident Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) measurements.
ERIC Educational Resources Information Center
Armstrong, Coleen
1994-01-01
Advises retiring administrators to exercise a bit of dignity and common sense in their remaining months on the job. Administrators should show consideration regarding retirement plans, fight laziness, conduct training sessions for other administrators, accept others' foolish behavior gracefully, and be generous with parting insights. (MLH)
Sorting on STAR. [CDC computer algorithm timing comparison
NASA Technical Reports Server (NTRS)
Stone, H. S.
1978-01-01
Timing comparisons are given for three sorting algorithms written for the CDC STAR computer. One algorithm is Hoare's (1962) Quicksort, which is the fastest or nearly the fastest sorting algorithm for most computers. A second algorithm is a vector version of Quicksort that takes advantage of the STAR's vector operations. The third algorithm is an adaptation of Batcher's (1968) sorting algorithm, which makes especially good use of vector operations but has a complexity of N(log N)-squared as compared with a complexity of N log N for the Quicksort algorithms. In spite of its worse complexity, Batcher's sorting algorithm is competitive with the serial version of Quicksort for vectors up to the largest that can be treated by STAR. Vector Quicksort outperforms the other two algorithms and is generally preferred. These results indicate that unusual instruction sets can introduce biases in program execution time that counter results predicted by worst-case asymptotic complexity analysis.
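As a sketch of why Batcher's method uses vector operations especially well: its compare-exchange sequence is fixed in advance (data-independent), so each stage can be executed as a vector operation on the STAR. The minimal comparator-network construction below is the textbook odd-even merge sort, illustrative only and not the STAR implementation; it uses O(N (log N)^2) comparators, matching the complexity quoted above.

```python
def oddeven_merge(lo, hi, r):
    """Comparators that merge two sorted halves of the wires lo..hi
    (hi inclusive); r is the current comparison distance."""
    step = r * 2
    if step < hi - lo:
        yield from oddeven_merge(lo, hi, step)
        yield from oddeven_merge(lo + r, hi, step)
        for i in range(lo + r, hi - r, step):
            yield (i, i + r)
    else:
        yield (lo, lo + r)

def oddeven_merge_sort(lo, hi):
    """Comparator sequence of Batcher's odd-even merge sort for wires
    lo..hi (inclusive); hi - lo + 1 must be a power of two."""
    if hi - lo >= 1:
        mid = lo + (hi - lo) // 2
        yield from oddeven_merge_sort(lo, mid)
        yield from oddeven_merge_sort(mid + 1, hi)
        yield from oddeven_merge(lo, hi, 1)

def apply_network(values, comparators):
    """Run the data-independent compare-exchange sequence; because the
    sequence never depends on the data, each stage maps naturally onto
    vector operations."""
    v = list(values)
    for i, j in comparators:
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v
```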
An Upwind Multigrid Algorithm for Calculating Flows on Unstructured Grids
NASA Technical Reports Server (NTRS)
Bonhaus, Daryl L.
1993-01-01
An algorithm is described that calculates inviscid, laminar, and turbulent flows on triangular meshes with an upwind discretization. A brief description of the base solver and the multigrid implementation is given, followed by results that consist mainly of convergence rates for inviscid and viscous flows over a NACA four-digit airfoil section. The results show that multigrid does accelerate convergence when the same relaxation parameters that yield good single-grid performance are used; however, larger gains in performance can be realized by doing less work in the relaxation scheme.
NASA Astrophysics Data System (ADS)
Huang, Sheng; Pang, Hong-Feng; Li, Ling-Xia
2013-03-01
In optical burst switching (OBS) networks, wavelength converters (WCs) at core nodes are used to decrease the burst loss rate. WCs are difficult to implement with current technology and their cost is high, so some core nodes may be configured with insufficient WCs to reduce cost. However, many data channel scheduling algorithms do not take the number of WCs into account, and their burst loss performance is poor when WCs are insufficient. To overcome this defect, two novel batch scheduling algorithms for nodes with insufficient WCs are proposed in this paper. The former improves the utilization of the WC resources to reduce the burst loss rate; the latter saves WC resources for incoming bursts so as to improve burst loss performance. The latter algorithm achieves a lower burst loss rate, for the same number of WCs, than the other scheduling algorithms. The simulation results show that the latter algorithm is more effective in reducing the burst loss rate with insufficient WCs.
Solving the depth of the repeated texture areas based on the clustering algorithm
NASA Astrophysics Data System (ADS)
Xiong, Zhang; Zhang, Jun; Tian, Jinwen
2015-12-01
Reconstruction of a 3D scene in monocular stereo vision requires the depth of the scene points in the image. Errors inevitably arise in image matching, however, especially when the images contain many repeated texture areas, which produce large numbers of mismatches. At present, the multiple-baseline stereo imaging algorithm is commonly used to eliminate matching errors in repeated texture areas. This algorithm can eliminate the ambiguity arising from common repeated textures, but it places restrictions on the baseline and is slow. In this paper, we put forward an algorithm for calculating the depth of matching points in repeated texture areas based on clustering. First, we preprocess the images with a Gaussian filter. Second, we segment the repeated texture regions into image blocks using a spectral clustering segmentation algorithm based on superpixels, and tag the blocks. We then match the two images and solve for the depth. Finally, the depth of each image block is taken as the median of all depth values calculated for points in that block, giving the depth of the repeated texture areas. Results on a large number of images show that our algorithm calculates the depth of repeated texture areas very well.
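The final median step can be made concrete. In the sketch below (illustrative names, not the authors' code), each segmented block is assigned the median of the per-point depth estimates inside it, which suppresses the outlier depths produced by mismatches in repeated-texture regions:

```python
import statistics

def block_depths(point_depths, block_labels):
    """Depth per segmented block: the median of the per-point depth
    estimates inside the block.  The median is robust to the large
    depth errors that mismatches in repeated textures produce."""
    groups = {}
    for depth, label in zip(point_depths, block_labels):
        groups.setdefault(label, []).append(depth)
    return {label: statistics.median(ds) for label, ds in groups.items()}
```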
NASA Astrophysics Data System (ADS)
Li, Xiaofeng; Xiang, Suying; Zhu, Pengfei; Wu, Min
2015-12-01
In order to avoid the inherent deficiencies of the traditional BP neural network, such as slow convergence, a tendency to fall into local minima, poor generalization ability and difficulty in determining the network structure, a dynamic self-adaptive learning algorithm for the BP neural network is put forward. The new algorithm combines the merits of principal component analysis, particle swarm optimization, correlation analysis and a self-adaptive model, and hence can effectively solve the problems of selecting the structural parameters, initial connection weights and thresholds, and learning rates of the BP neural network. The new algorithm not only reduces human intervention, optimizes the topological structure of the network and improves its generalization ability, but also accelerates convergence, avoids trapping in local minima, and enhances adaptation and prediction ability. The dynamic self-adaptive learning algorithm is used to forecast the total retail sales of consumer goods of Sichuan Province, China. Empirical results indicate that the new algorithm is superior to the traditional BP algorithm in prediction accuracy and time consumption, which shows its feasibility and effectiveness.
Evolution of Cooperation in Public Goods Games
NASA Astrophysics Data System (ADS)
Xia, Cheng-Yi; Zhang, Juan-Juan; Wang, Yi-Ling; Wang, Jin-Song
2011-10-01
We investigate the evolution of cooperation in evolutionary public goods games based on finite populations, considering four pure strategies: cooperators, defectors, punishers, and loners who are unwilling to participate. By adopting approximate best-response dynamics, we show that the magnitude of rationality not only quantitatively explains the experimental results in [Nature (London) 425 (2003) 390], but also heavily influences the evolution of cooperation. In contrast to previous results for infinite populations, which yield two equilibria, we show that there exists only a single special equilibrium, and that relatively high values of bounded rationality sustain cooperation. In addition, we show that the loner's payoff plays an active role in the maintenance of cooperation, which is warranted only for low and moderate values of that payoff. Both rationality and the loner's payoff thus influence cooperation. Finally, we highlight the important result that the introduction of voluntary participation and punishment greatly facilitates cooperation.
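The payoff structure behind such voluntary public goods games can be sketched as follows. The sketch uses generic textbook rules and illustrative parameter values (multiplication factor r, contribution c, loner payoff sigma); it is not the specific model of the cited paper and omits punishment.

```python
def pgg_payoffs(n_coop, n_defect, r=3.0, c=1.0, sigma=1.0):
    """Per-round payoffs (cooperator, defector, loner) in a public goods
    game with voluntary participation: each of the n_coop cooperators
    contributes c, the pot is multiplied by r and shared equally among
    the n_coop + n_defect participants; loners take the fixed payoff
    sigma.  If fewer than two players participate, no game takes place
    and everyone falls back to the loner payoff."""
    n = n_coop + n_defect
    if n < 2:
        return sigma, sigma, sigma
    share = r * c * n_coop / n
    return share - c, share, sigma
```

With a moderate sigma, loners earn more than the members of an all-defector group but less than cooperators in a cooperative group, which is the mechanism by which the loner's payoff sustains cooperation.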
What are single photons good for?
NASA Astrophysics Data System (ADS)
Sangouard, Nicolas; Zbinden, Hugo
2012-10-01
Photons are widely expected to play a central role in quantum technologies. But what precisely are sources that produce photons one by one good for? In opposition to what many suggest, we show that single-photon sources are not helpful for point-to-point quantum key distribution, because faint laser pulses do the job comfortably. However, there is no doubt about the usefulness of sources producing single photons for future quantum technologies. In particular, we show how single-photon sources could become the seed of a revolution in the framework of quantum communication, making the security of quantum key distribution device-independent or extending quantum communication over many hundreds of kilometers. Hopefully, these promising applications will provide a guideline for researchers to develop more and more efficient sources, producing narrowband, pure and indistinguishable photons at appropriate wavelengths.
An Affinity Propagation-Based DNA Motif Discovery Algorithm.
Sun, Chunxiao; Huo, Hongwei; Yu, Qiang; Guo, Haitao; Sun, Zhigang
2015-01-01
The planted (l, d) motif search (PMS) is one of the fundamental problems in bioinformatics, which plays an important role in locating transcription factor binding sites (TFBSs) in DNA sequences. Nowadays, identifying weak motifs and reducing the effect of local optimum are still important but challenging tasks for motif discovery. To solve the tasks, we propose a new algorithm, APMotif, which first applies the Affinity Propagation (AP) clustering in DNA sequences to produce informative and good candidate motifs and then employs Expectation Maximization (EM) refinement to obtain the optimal motifs from the candidate motifs. Experimental results both on simulated data sets and real biological data sets show that APMotif usually outperforms four other widely used algorithms in terms of high prediction accuracy.
Consultant-Guided Search Algorithms for the Quadratic Assignment Problem
NASA Astrophysics Data System (ADS)
Iordache, Serban
Consultant-Guided Search (CGS) is a recent swarm intelligence metaheuristic for combinatorial optimization problems, inspired by the way real people make decisions based on advice received from consultants. Until now, CGS has been successfully applied to the Traveling Salesman Problem. Because a good metaheuristic should be able to tackle efficiently a large variety of problems, it is important to see how CGS behaves when applied to other classes of problems. In this paper, we propose an algorithm for the Quadratic Assignment Problem (QAP), which hybridizes CGS with a local search procedure. Our experimental results show that CGS is able to compete in terms of solution quality with one of the best Ant Colony Optimization algorithms, the MAX-MIN Ant System.
Algorithm for precision subsample timing between Gaussian-like pulses.
Lerche, R A; Golick, B P; Holder, J P; Kalantar, D H
2010-10-01
Moderately priced oscilloscopes available for the NIF power sensors and target diagnostics have 6 GHz bandwidths at 20-25 Gsamples/s (40 ps sample spacing). Some NIF experiments require that cross timing between instruments be determined with accuracy better than 30 ps. A simple analysis algorithm for Gaussian-like pulses such as the 100-ps-wide NIF timing fiducial can achieve single-event cross-timing precision of 1 ps (1/50 of the sample spacing). The midpoint-timing algorithm is presented along with simulations that show why the technique produces good timing results. The optimum pulse width is found to be ∼2.5 times the sample spacing. Experimental measurements demonstrate use of the technique and highlight the conditions needed to obtain optimum timing performance.
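One plausible reading of a midpoint-style estimator can be sketched as follows (a sketch under that assumption, not the published NIF algorithm): linearly interpolate the times at which the pulse crosses half of its peak on the rising and falling edges, and report their midpoint. For a symmetric pulse the interpolation errors on the two edges largely cancel, which is consistent with subsample precision at 40 ps sample spacing.

```python
def midpoint_time(t, v, frac=0.5):
    """Subsample timing for a Gaussian-like pulse sampled at times t with
    values v: linearly interpolate the times where the pulse crosses
    `frac` of its peak on the rising and falling edges, and return their
    midpoint."""
    lo, hi = min(v), max(v)
    level = lo + frac * (hi - lo)
    peak = v.index(hi)
    # rising edge: last sample below the level before the peak
    r = max(i for i in range(peak) if v[i] < level)
    t_rise = t[r] + (t[r + 1] - t[r]) * (level - v[r]) / (v[r + 1] - v[r])
    # falling edge: first sample below the level after the peak
    f = min(i for i in range(peak, len(v)) if v[i] < level)
    t_fall = t[f - 1] + (t[f] - t[f - 1]) * (level - v[f - 1]) / (v[f] - v[f - 1])
    return 0.5 * (t_rise + t_fall)
```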
Avoiding or restricting defectors in public goods games?
Han, The Anh; Pereira, Luís Moniz; Lenaerts, Tom
2015-01-01
When creating a public good, strategies or mechanisms are required to handle defectors. We first show mathematically and numerically that prior agreements with posterior compensations provide a strategic solution that leads to substantial levels of cooperation in the context of public goods games, results that are corroborated by available experimental data. Notwithstanding this success, one cannot, as with other approaches, fully exclude the presence of defectors, raising the question of how they can be dealt with to avoid the demise of the common good. We show that both avoiding creation of the common good, whenever full agreement is not reached, and limiting the benefit that disagreeing defectors can acquire, using costly restriction mechanisms, are relevant choices. Nonetheless, restriction mechanisms are found to be the more favourable, especially in larger group interactions. Given decreasing restriction costs, introducing restraining measures to cope with public goods free-riding issues is the ultimate advantageous solution for all participants, rather than avoiding its creation. PMID:25540240
3. Historic American Buildings Survey Ned Goode, Photographer August ...
3. Historic American Buildings Survey Ned Goode, Photographer - August 1959 WEST OR REAR AND SOUTH SIDE OF HOUSE SHOWING BRICK SMOKEHOUSE IN THE RIGHT FOREGROUND - Buffum House, Main & Middle Streets, Walpole, Cheshire County, NH
1. Historic American Buildings Survey Ned Goode, Photographer August ...
1. Historic American Buildings Survey Ned Goode, Photographer - August 1959 EAST FRONT AND NORTH SIDE, SHOWING ARCHED WINDOWS AND COLUMNS - Howland-Schofield House, Elm & Pleasant Streets, Walpole, Cheshire County, NH
Efficient multiple-way graph partitioning algorithms
Dasdan, A.; Aykanat, C.
1995-12-01
Graph partitioning deals with evenly dividing a graph into two or more parts such that the total weight of the edges interconnecting these parts, i.e., the cutsize, is minimized. Graph partitioning has important applications in VLSI layout, mapping, and sparse Gaussian elimination. Since the graph partitioning problem is NP-hard, we must resort to polynomial-time heuristics to obtain a good, hopefully near-optimal, solution. Kernighan and Lin (KL) proposed a 2-way partitioning algorithm. Fiduccia and Mattheyses (FM) introduced a faster version of the KL algorithm. Sanchis (FMS) generalized the FM algorithm to multiple-way partitioning. Simulated Annealing (SA) is one of the most successful approaches that are not KL-based.
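The cutsize objective and the KL idea of improving a 2-way partition by swapping vertices can be sketched in a few lines. The sketch below is a simplified greedy variant for illustration: the real KL pass also explores sequences of temporarily worsening moves before committing to the best prefix.

```python
def cutsize(edges, part):
    """Total weight of edges whose endpoints lie in different parts;
    edges are (u, v, weight) triples, part maps vertex -> 0 or 1."""
    return sum(w for u, v, w in edges if part[u] != part[v])

def kl_swap_pass(edges, part):
    """Greedy KL-flavoured improvement: repeatedly swap the pair of
    vertices (one per side) that most reduces the cutsize, until no
    swap helps.  Swapping pairs keeps the two parts balanced."""
    improved = True
    while improved:
        improved = False
        base = cutsize(edges, part)
        best_gain, best_pair = 0, None
        left = [u for u in part if part[u] == 0]
        right = [u for u in part if part[u] == 1]
        for a in left:
            for b in right:
                part[a], part[b] = 1, 0          # tentative swap
                gain = base - cutsize(edges, part)
                part[a], part[b] = 0, 1          # undo
                if gain > best_gain:
                    best_gain, best_pair = gain, (a, b)
        if best_pair is not None:
            a, b = best_pair
            part[a], part[b] = 1, 0
            improved = True
    return part
```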
An Algorithm for Linearly Constrained Nonlinear Programming Problems.
Bazaraa, Mokhtar S.; Goode, Jamie J.
1980-01-01
In this paper an algorithm for solving linearly constrained nonlinear programming problems is presented. Minimum distance programming, as in the works of Bazaraa and Goode [2] and Wolfe [16], can be used for solving this problem.
Polarisation, key to good localisation.
van Beest, Moniek; Robben, Joris H; Savelkoul, Paul J M; Hendriks, Giel; Devonald, Mark A J; Konings, Irene B M; Lagendijk, Anne K; Karet, Fiona; Deen, Peter M T
2006-08-01
Polarisation of cells is crucial for vectorial transport of ions and solutes. In the literature, however, proteins specifically targeted to the apical or basolateral membrane are often studied in non-polarised cells. To investigate whether these data can be extrapolated to expression in polarised cells, we studied several membrane-specific proteins. In polarised MDCK cells, the Aquaporin-2 water channel resides in intracellular vesicles and the apical membrane, while the vasopressin type 2 receptor, anion-exchanger 1 (AE1) protein and E-Cadherin mainly localise to the basolateral membrane. In non-polarised MDCK cells, however, Aquaporin-2 localises, besides the plasma membrane, mainly in the Golgi complex, while the others show a dispersed staining throughout the cell. Moreover, while AQP2 mutants in dominant nephrogenic diabetes insipidus are missorted to different organelles in polarised cells, they all predominantly localise to the Golgi complex in non-polarised MDCK cells. Additionally, the maturation of V2R, and likely its missorting, is affected in transiently-transfected compared to stably-transfected cells. In conclusion, we show that the use of stably-transfected polarised cells is crucial in interpreting the processing and the localisation of membrane-targeted proteins.
An Efficient Globally Optimal Algorithm for Asymmetric Point Matching.
Lian, Wei; Zhang, Lei; Yang, Ming-Hsuan
2016-08-29
Although the robust point matching algorithm has been demonstrated to be effective for non-rigid registration, there are several issues with the adopted deterministic annealing optimization technique. First, it is not globally optimal, and regularization on the spatial transformation is needed for good matching results. Second, it tends to align the mass centers of the two point sets. To address these issues, we propose a globally optimal algorithm for the robust point matching problem where each model point has a counterpart in the scene set. By eliminating the transformation variables, we show that the original matching problem reduces to a concave quadratic assignment problem where the objective function has a low-rank Hessian matrix. This facilitates the use of large-scale global optimization techniques. We propose a branch-and-bound algorithm based on rectangular subdivision where, in each iteration, multiple rectangles are used to increase the chances of subdividing the one containing the global optimal solution. In addition, we present an efficient lower bounding scheme which has a linear assignment formulation and can be efficiently solved. Extensive experiments on synthetic and real datasets demonstrate that the proposed algorithm performs favorably against the state-of-the-art methods in terms of robustness to outliers, matching accuracy, and run-time.
Aerosol Retrieval and Atmospheric Correction Algorithms for EPIC
NASA Technical Reports Server (NTRS)
Wang, Yujie; Lyapustin, Alexei; Marshak, Alexander; Korkin, Sergey; Herman, Jay
2011-01-01
EPIC is a multi-spectral imager onboard the planned Deep Space Climate ObserVatoRy (DSCOVR), designed for observations of the full illuminated disk of the Earth with high temporal and coarse spatial resolution (10 km) from the Lagrangian L1 point. During the course of the day, EPIC will view the same Earth surface area over the full range of solar and view zenith angles at the equator, with a fixed scattering angle near the backscattering direction. This talk describes a new aerosol retrieval/atmospheric correction algorithm developed for EPIC and tested with EPIC Simulator data. The algorithm uses a time-series approach and consists of two stages: the first stage periodically re-initializes the surface spectral bidirectional reflectance (BRF) on stable, low aerosol optical depth (AOD) days. Such days can be selected based on matching measured reflectance between the morning and afternoon reciprocal view geometries of EPIC. In the second stage, the algorithm monitors the diurnal cycle of AOD and fine-mode fraction based on the known spectral surface BRF. Testing of the developed algorithm with simulated EPIC data over the continental USA showed good accuracy of AOD retrievals (10-20%) except over very bright surfaces.
Adaptive wavelet transform algorithm for lossy image compression
NASA Astrophysics Data System (ADS)
Pogrebnyak, Oleksiy B.; Ramirez, Pablo M.; Acevedo Mosqueda, Marco Antonio
2004-11-01
A new algorithm for a locally adaptive wavelet transform based on the modified lifting scheme is presented. It adapts the wavelet high-pass filter at the prediction stage to the local image data activity. The proposed algorithm uses the generalized framework for the lifting scheme, which permits different wavelet filter coefficients to be obtained easily in the case of the (~N, N) lifting. By changing the wavelet filter order and the control parameters, one can obtain the desired filter frequency response. We propose hard switching between different wavelet lifting filter outputs according to a local data activity estimate. The proposed adaptive transform possesses good energy compaction. The designed algorithm was tested on different images. The obtained simulation results show that the visual and quantitative quality of the restored images is high. The distortions are smaller in the vicinity of high spatial activity details compared to the non-adaptive transform, which introduces ringing artifacts. The designed algorithm can be used for lossy image compression and in noise suppression applications.
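The prediction-stage switching described above can be sketched in a few lines. The code below is a minimal one-level lifting step with hard switching between a two-neighbour average predictor (smooth regions) and a one-sided Haar-like predictor (high-activity regions); the predictors, the activity estimate, and the threshold are simplified stand-ins for the paper's (~N, N) lifting filters, not the authors' implementation.

```python
def adaptive_lifting_step(x, threshold=10.0):
    """One level of a lifting-scheme wavelet transform with hard
    switching at the prediction stage: odd samples are predicted from
    both even neighbours in smooth regions; where the local activity
    estimate is high, a one-sided predictor is used to limit ringing."""
    assert len(x) % 2 == 0
    even, odd = x[0::2], x[1::2]
    detail = []
    for i, o in enumerate(odd):
        left = even[i]
        right = even[i + 1] if i + 1 < len(even) else even[i]
        activity = abs(right - left)               # crude local activity
        pred = left if activity > threshold else (left + right) / 2.0
        detail.append(o - pred)
    # Update step: smooth the even samples with the detail coefficients.
    approx = [e + d / 2.0 for e, d in zip(even, detail)]
    return approx, detail

approx, detail = adaptive_lifting_step([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
```

On a linear ramp the interior detail coefficients vanish, which is the energy-compaction property the abstract refers to; only the boundary sample, where the one-sided fallback is used, produces a nonzero detail.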
Geist, G.A.; Howell, G.W.; Watkins, D.S.
1997-11-01
The BR algorithm, a new method for calculating the eigenvalues of an upper Hessenberg matrix, is introduced. It is a bulge-chasing algorithm like the QR algorithm, but, unlike the QR algorithm, it is well adapted to computing the eigenvalues of the narrowband, nearly tridiagonal matrices generated by the look-ahead Lanczos process. This paper describes the BR algorithm and gives numerical evidence that it works well in conjunction with the Lanczos process. On the biggest problems run so far, the BR algorithm beats the QR algorithm by a factor of 30--60 in computing time and a factor of over 100 in matrix storage space.
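For contrast with the BR method, the sketch below implements plain unshifted QR iteration, the baseline algorithm the abstract compares against: repeatedly factor A = QR and form RQ, a similarity transform that drives the matrix toward triangular form. This is an illustrative sketch only, with none of QR's practical shifts or BR's narrow-band optimizations.

```python
import numpy as np

def qr_eigenvalues(A, iters=300):
    """Unshifted QR iteration. For matrices with real, distinct
    eigenvalues the iterates converge to (quasi-)triangular form
    with the eigenvalues appearing on the diagonal."""
    A = np.array(A, dtype=float)
    for _ in range(iters):
        Q, R = np.linalg.qr(A)
        A = R @ Q          # similarity transform: Q^T A Q
    return np.sort(np.diag(A))

# A symmetric tridiagonal (hence Hessenberg) test matrix with known
# eigenvalues 2 - sqrt(2), 2, 2 + sqrt(2).
T = [[2.0, 1.0, 0.0],
     [1.0, 2.0, 1.0],
     [0.0, 1.0, 2.0]]
eigs = qr_eigenvalues(T)
```

The narrow-band matrices from the look-ahead Lanczos process are exactly the case where this dense iteration wastes the storage and work that BR is reported to save.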
Ramirez, I
1990-01-01
The preference humans and animals show for sweet solutions has been the subject of hundreds of publications. Nevertheless, the evolutionary origin of sweet preference remains enigmatic because of the relatively low nutritional value of sugars and the absence of specific tastes for other, more essential, nutrients. Moderate concentrations of sugars are found in most plant foods because sugars play an important role in plant physiology. Widespread occurrence of sugars in plants is paralleled by widespread preference for sugar solutions in mammals. These observations suggest that preference for sugars evolved because they are common in plants and easy to detect rather than because of any special nutritional merits they offer. Perception of sweetness cannot be used to accurately meter the metabolizable energy or nutritive value of a food.
Do humans make good decisions?
Summerfield, Christopher; Tsetsos, Konstantinos
2014-01-01
Human performance on perceptual classification tasks approaches that of an ideal observer, but economic decisions are often inconsistent and intransitive, with preferences reversing according to the local context. We discuss the view that suboptimal choices may result from the efficient coding of decision-relevant information, a strategy that allows expected inputs to be processed with higher gain than unexpected inputs. Efficient coding leads to ‘robust’ decisions that depart from optimality but maximise the information transmitted by a limited-capacity system in a rapidly-changing world. We review recent work showing that when perceptual environments are variable or volatile, perceptual decisions exhibit the same suboptimal context-dependence as economic choices, and propose a general computational framework that accounts for findings across the two domains. PMID:25488076
Broadband and Broad-Angle Low-Scattering Metasurface Based on Hybrid Optimization Algorithm
Wang, Ke; Zhao, Jie; Cheng, Qiang; Dong, Di Sha; Cui, Tie Jun
2014-01-01
A broadband and broad-angle low-scattering metasurface is designed, fabricated, and characterized. Based on the optimization algorithm and far-field scattering pattern analysis, we propose a rapid and efficient method to design metasurfaces, which avoids the large amount of time-consuming electromagnetic simulations. Full-wave simulation and measurement results show that the proposed metasurface is insensitive to the polarization of incident waves, and presents good scattering-reduction properties for oblique incident waves. PMID:25089367
Genetic algorithm for the pair distribution function of the electron gas.
Vericat, Fernando; Stoico, César O; Carlevaro, C Manuel; Renzi, Danilo G
2011-12-01
The pair distribution function of the electron gas is calculated using a parameterized generalization of the hypernetted chain approximation, with the parameters obtained by optimizing the system energy with a genetic algorithm. The functions so obtained are compared with variational and diffusion Monte Carlo simulations performed by other authors, showing very good agreement, especially with the diffusion Monte Carlo results.
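A minimal sketch of the optimizer class named above: a tiny real-coded genetic algorithm with tournament selection, blend crossover and Gaussian mutation. All parameters and the quadratic test "energy" are illustrative assumptions; the paper's energy functional over hypernetted-chain parameters is not reproduced here.

```python
import random

def genetic_minimize(f, lo, hi, pop_size=40, gens=80, seed=1):
    """Minimize f over [lo, hi] with a simple real-coded GA:
    binary tournament selection, midpoint (blend) crossover, and
    Gaussian mutation, with offspring clipped to the bounds."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        def tournament():
            a, b = rng.sample(pop, 2)
            return a if f(a) < f(b) else b
        pop = [min(max(0.5 * (tournament() + tournament())
                       + rng.gauss(0.0, 0.02 * (hi - lo)), lo), hi)
               for _ in range(pop_size)]
    return min(pop, key=f)

# Recover the minimum of a simple quadratic "energy" at x = 2.
best = genetic_minimize(lambda x: (x - 2.0) ** 2, -5.0, 5.0)
```

The same selection/crossover/mutation loop applies unchanged when the "individual" is a vector of variational parameters rather than a scalar.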
Geovisualization and analysis of the Good Country Index
NASA Astrophysics Data System (ADS)
Tan, C.; Dramowicz, K.
2016-04-01
The Good Country Index measures a single country's contribution to humanity and to the health of the planet. Countries which are globally good for our planet do not necessarily have to be good for their own citizens. The Good Country Index is based on the following seven categories: science and technology, culture, international peace and security, world order, planet and climate, prosperity and equality, and health and well-being. The Good Country Index focuses on external effects, in contrast to other global indices (for example, the Human Development Index or the Social Progress Index) that show the level of development of a single country in benefiting its own citizens. The authors verify whether these global indices may be good proxies as potential predictors, as well as indicators, of a country's ‘goodness’. Non-spatial analysis included analyzing relationships between the overall Good Country Index and the seven contributing categories, as well as between the overall Good Country Index and other global indices. Data analytics was used for building various predictive models and selecting the most accurate model to predict the overall Good Country Index. The most important rules for high and low index values were identified. Spatial analysis included spatial autocorrelation to analyze the similarity of a country's index values in relation to its neighbors. Hot spot analysis was used to identify and map significant clusters of countries with high and low index values. Similar countries were grouped into geographically compact clusters and mapped.
Maximizing Submodular Functions under Matroid Constraints by Evolutionary Algorithms.
Friedrich, Tobias; Neumann, Frank
2015-01-01
Many combinatorial optimization problems have underlying goal functions that are submodular. The classical goal is to find a good solution for a given submodular function f under a given set of constraints. In this paper, we investigate the runtime of a simple single-objective evolutionary algorithm called (1 + 1) EA and a multiobjective evolutionary algorithm called GSEMO until they have obtained a good approximation for submodular functions. For the case of monotone submodular functions and uniform cardinality constraints, we show that the GSEMO achieves a (1 - 1/e)-approximation in expected polynomial time. For the case of monotone functions where the constraints are given by the intersection of k ≥ 2 matroids, we show that the (1 + 1) EA achieves a 1/(k + δ)-approximation in expected polynomial time for any constant δ > 0. Turning to nonmonotone symmetric submodular functions with k ≥ 1 matroid intersection constraints, we show that the GSEMO achieves a 1/((k + 2)(1 + ε))-approximation in expected time O(n^(k+6) log(n)/ε).
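A minimal sketch of the (1 + 1) EA analysed above, run on maximum coverage, which is a monotone submodular function, under a uniform cardinality constraint. The toy instance, iteration budget, and feasibility-filtering acceptance rule are illustrative assumptions, not the paper's formal setup.

```python
import random

def one_plus_one_ea(sets, k, iters=3000, seed=0):
    """(1+1) EA for max coverage subject to |S| <= k: flip each bit
    with probability 1/n; accept the offspring if it is feasible and
    at least as good as the parent (plateau moves allowed)."""
    rng = random.Random(seed)
    n = len(sets)

    def value(bits):
        covered = set()
        for i, b in enumerate(bits):
            if b:
                covered |= sets[i]
        return len(covered)

    x = [0] * n
    for _ in range(iters):
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        if sum(y) <= k and value(y) >= value(x):
            x = y
    return x, value(x)

family = [{1, 2, 3}, {3, 4}, {4, 5}, {1, 5}]
best, covered = one_plus_one_ea(family, k=2)
```

On this instance the optimum picks {1, 2, 3} and {4, 5}, covering all five elements; accepting equally good offspring lets the EA drift across plateaus toward that solution.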
Evolution of cooperation in rotating indivisible goods game.
Koike, Shimpei; Nakamaru, Mayuko; Tsujimoto, Masahiro
2010-05-07
Collective behavior is theoretically and experimentally studied through the public goods game, in which players contribute resources or efforts to produce goods (or a pool), which are then divided equally among all players regardless of the amount of their contribution. However, if the goods are indivisible, only one player can receive them. In this case, the problem is how to distribute indivisible goods, and we therefore propose a new game, the "rotating indivisible goods game." In this game, the goods are not divided but distributed by regular rotation. An example is rotating savings and credit associations (ROSCAs), which exist all over the world and serve as efficient and informal institutions for collecting savings for small investments. In a ROSCA, members regularly contribute money to produce goods and distribute them to each member on a regular rotation. It has been pointed out that ROSCA members are selected based on their reliability or reputation, and that defectors who stop contributing are excluded. We elucidate mechanisms that sustain cooperation in rotating indivisible goods games by means of evolutionary simulations. First, we investigate the effect of the peer selection rule, by which the group chooses members based on the players' reputations and players choose groups based on the groups' reputations. Regardless of the peer selection rule, cooperation is not sustainable in the rotating indivisible goods game. Second, we introduce the forfeiture rule, which forbids a member who has not contributed earlier from receiving goods. These analyses show that employing both rules together can sustain cooperation in the rotating indivisible goods game, although employing either one alone cannot. Finally, we show that evolutionary simulation can be a tool for investigating institutional designs that promote cooperation.
NASA Technical Reports Server (NTRS)
Judge, Russell A.; Snell, Edward H.
1999-01-01
Part of the challenge of macromolecular crystal growth for structure determination is obtaining an appropriate number of crystals with a crystal volume suitable for X-ray analysis. In this respect, an understanding of the effect of solution conditions on macromolecule nucleation rates is advantageous. This study investigated the effects of solution conditions on the nucleation rate and final crystal size of two crystal systems: tetragonal lysozyme and glucose isomerase. Batch crystallization plates were prepared at given solution concentrations and incubated at set temperatures over one week. The number of crystals per well, with their size and axial ratios, were recorded and correlated with solution conditions. Duplicate experiments indicate the reproducibility of the technique. Results for each system showing the effect of supersaturation, incubation temperature and solution pH on nucleation rates will be presented and discussed. In the case of lysozyme, having optimized solution conditions to produce an appropriate number of crystals of a suitable size, a batch of crystals was prepared under exactly the same conditions. Fifty of these crystals were analyzed by X-ray techniques. The results indicate that even under the same crystallization conditions, a marked variation in crystal properties exists.
76 FR 52662 - Good Neighbor Environmental Board
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-23
... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Good Neighbor Environmental Board AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of... meeting of the Good Neighbor Environmental Board (Board). The Board usually meets three times...
76 FR 31328 - Good Neighbor Environmental Board
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-31
... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Good Neighbor Environmental Board AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of... meeting of the Good Neighbor Environmental Board (Board). The Board usually meets three times...
76 FR 76973 - Good Neighbor Environmental Board
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-09
... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Good Neighbor Environmental Board AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of... meeting of the Good Neighbor Environmental Board (Board). The Board usually meets three times...
76 FR 12731 - Good Neighbor Environmental Board
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-08
... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Good Neighbor Environmental Board AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of... meeting of the Good Neighbor Environmental Board (Board). The Board usually meets three times...
75 FR 8699 - Good Neighbor Environmental Board
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-25
... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Good Neighbor Environmental Board AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of... meeting of the Good Neighbor Environmental Board (Board). The Board usually meets three times...
76 FR 73631 - Good Neighbor Environmental Board
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-29
... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Good Neighbor Environmental Board AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of... meeting of the Good Neighbor Environmental Board (Board). The Board usually meets three times...
76 FR 7845 - Good Neighbor Environmental Board
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-11
... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY Good Neighbor Environmental Board AGENCY: Environmental Protection Agency (EPA). ACTION: Request for Nominations to the Good Neighbor Environmental Board. SUMMARY: The U.S. Environmental...
Good Health Before Pregnancy: Preconception Care
Code of Federal Regulations, 2014 CFR
2014-07-01
... states that goods, as used in the Act, means “goods (including ships and marine equipment), wares... the ultimate consumer thereof other than a producer, manufacturer, or processor thereof.”...
Code of Federal Regulations, 2010 CFR
2010-07-01
... states that goods, as used in the Act, means “goods (including ships and marine equipment), wares... the ultimate consumer thereof other than a producer, manufacturer, or processor thereof.”...
Code of Federal Regulations, 2012 CFR
2012-07-01
... states that goods, as used in the Act, means “goods (including ships and marine equipment), wares... the ultimate consumer thereof other than a producer, manufacturer, or processor thereof.”...
Code of Federal Regulations, 2013 CFR
2013-07-01
... states that goods, as used in the Act, means “goods (including ships and marine equipment), wares... the ultimate consumer thereof other than a producer, manufacturer, or processor thereof.”...
Code of Federal Regulations, 2011 CFR
2011-07-01
... states that goods, as used in the Act, means “goods (including ships and marine equipment), wares... the ultimate consumer thereof other than a producer, manufacturer, or processor thereof.”...
ERIC Educational Resources Information Center
Niesz, Tricia
2010-01-01
The last two decades have seen dramatic change in U.S. schooling as a response to high-stakes accountability and market-based reform movements. Critics cite a number of unfortunate consequences of these movements, especially for students in urban schools. This article explores the troubling ironies related to one strategy for survival in this…
Aerocapture Guidance Algorithm Comparison Campaign
NASA Technical Reports Server (NTRS)
Rousseau, Stephane; Perot, Etienne; Graves, Claude; Masciarelli, James P.; Queen, Eric
2002-01-01
Aerocapture is a promising technique for future human interplanetary missions. The Mars Sample Return was initially based on an insertion by aerocapture, and a CNES orbiter, Mars Premier, was developed to demonstrate this concept. Mainly due to budget constraints, aerocapture was cancelled for the French orbiter. Many studies were carried out during the last three years to develop and test different guidance algorithms (APC, EC, TPC, NPC). This work was shared between CNES and NASA, with a fruitful joint working group. To conclude this study, an evaluation campaign was performed to test the different algorithms. The objective was to assess the robustness, accuracy, capability to limit the load, and complexity of each algorithm. A simulation campaign was specified and performed by CNES, with a similar activity on the NASA side to confirm the CNES results. This evaluation demonstrated that the numerical guidance principle is not competitive compared to the analytical concepts. All the other algorithms are well adapted to guarantee the success of the aerocapture. The TPC appears to be the most robust, the APC the most accurate, and the EC a good compromise.
Is the Good Life the Easy Life?
ERIC Educational Resources Information Center
Scollon, Christie Napa; King, Laura A.
2004-01-01
Three studies examined folk concepts of the good life. Participants rated the desirability and moral goodness of a life as a function of the happiness, meaning, and effort experienced. Happiness and meaning were solid predictors of the good life, replicating King and Napa (1998). Study 1 (N = 381) included wealth as an additional factor. Results…
Benefits of tolerance in public goods games
NASA Astrophysics Data System (ADS)
Szolnoki, Attila; Chen, Xiaojie
2015-10-01
Leaving the joint enterprise when defection is unveiled is always a viable option to avoid being exploited. Although the loner strategy helps the population avoid being trapped in the tragedy-of-the-commons state, it offers only a modest income for nonparticipants. In this paper we demonstrate that showing some tolerance toward defectors can not only save cooperation in harsh environments but in fact result in a surprisingly high average payoff for group members in public goods games. Phase diagrams and the underlying spatial patterns reveal the high complexity of the evolving states, where cyclically dominant strategies or two-strategy alliances can characterize the final state of evolution. We identify microscopic mechanisms that are responsible for the superiority of global solutions containing tolerant players. This phenomenon is robust and can be observed both in well-mixed and in structured populations, highlighting the importance of tolerance in our everyday life.
When good news leads to bad choices.
McDevitt, Margaret A; Dunn, Roger M; Spetch, Marcia L; Ludvig, Elliot A
2016-01-01
Pigeons and other animals sometimes deviate from optimal choice behavior when given informative signals for delayed outcomes. For example, when pigeons are given a choice between an alternative that always leads to food after a delay and an alternative that leads to food only half of the time after a delay, preference changes dramatically depending on whether the stimuli during the delays are correlated with (signal) the outcomes or not. With signaled outcomes, pigeons show a much greater preference for the suboptimal alternative than with unsignaled outcomes. Key variables and research findings related to this phenomenon are reviewed, including the effects of durations of the choice and delay periods, probability of reinforcement, and gaps in the signal. We interpret the available evidence as reflecting a preference induced by signals for good news in a context of uncertainty. Other explanations are briefly summarized and compared.
Hybrid feature selection algorithm using symmetrical uncertainty and a harmony search algorithm
NASA Astrophysics Data System (ADS)
Salameh Shreem, Salam; Abdullah, Salwani; Nazri, Mohd Zakree Ahmad
2016-04-01
Microarray technology can be used as an efficient diagnostic system to recognise diseases such as tumours or to discriminate between different types of cancers in normal tissues. This technology has received increasing attention from the bioinformatics community because of its potential in designing powerful decision-making tools for cancer diagnosis. However, the presence of thousands or tens of thousands of genes affects the predictive accuracy of this technology from the perspective of classification. Thus, a key issue in microarray data is identifying or selecting the smallest possible set of genes from the input data that can achieve good predictive accuracy for classification. In this work, we propose a two-stage selection algorithm for gene selection problems in microarray data-sets called the symmetrical uncertainty filter and harmony search algorithm wrapper (SU-HSA). Experimental results show that the SU-HSA is better than HSA in isolation for all data-sets in terms of accuracy, and achieves a lower number of selected genes on 6 out of 10 instances. Furthermore, the comparison with state-of-the-art methods shows that our proposed approach obtains 5 (out of 10) new best results in terms of the number of selected genes, and competitive results in terms of classification accuracy.
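The filter criterion behind the first (SU) stage is easy to state concretely: SU(X, Y) = 2·I(X; Y) / (H(X) + H(Y)), a symmetric, [0, 1]-scaled form of mutual information. The sketch below computes it for discretised feature and label vectors; the tiny vectors are hypothetical, and the harmony-search wrapper stage is not shown.

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy (bits) of the empirical distribution of xs."""
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in Counter(xs).values())

def symmetrical_uncertainty(x, y):
    """SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)): 0 for independent
    variables, 1 when each variable fully determines the other."""
    hx, hy = entropy(x), entropy(y)
    hxy = entropy(list(zip(x, y)))          # joint entropy H(X, Y)
    mutual_information = hx + hy - hxy
    return 2.0 * mutual_information / (hx + hy) if hx + hy else 0.0

# A (hypothetical) discretised gene-expression column vs class labels.
gene, label = [0, 0, 1, 1], [0, 0, 1, 1]
su_dependent = symmetrical_uncertainty(gene, label)            # 1.0
su_independent = symmetrical_uncertainty([0, 1, 0, 1], label)  # 0.0
```

Ranking genes by this score and keeping the top fraction is the standard SU filter step; the wrapper then searches subsets of the survivors.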
Dynamic hybrid algorithms for MAP inference in discrete MRFs.
Alahari, Karteek; Kohli, Pushmeet; Torr, Philip H S
2010-10-01
In this paper, we present novel techniques that improve the computational and memory efficiency of algorithms for solving multilabel energy functions arising from discrete MRFs or CRFs. These methods are motivated by the observations that the performance of minimization algorithms depends on: 1) the initialization used for the primal and dual variables and 2) the number of primal variables involved in the energy function. Our first method (dynamic alpha-expansion) works by "recycling" results from previous problem instances. The second method simplifies the energy function by "reducing" the number of unknown variables present in the problem. Further, we show that it can also be used to generate a good initialization for the dynamic alpha-expansion algorithm by "reusing" dual variables. We test the performance of our methods on energy functions encountered in the problems of stereo matching and color and object-based segmentation. Experimental results show that our methods achieve a substantial improvement in the performance of alpha-expansion, as well as other popular algorithms such as sequential tree-reweighted message passing and max-product belief propagation. We also demonstrate the applicability of our schemes for certain higher order energy functions, such as the one described in [1], for interactive texture-based image and video segmentation. In most cases, we achieve a 10-15 times speed-up in the computation time. Our modified alpha-expansion algorithm provides similar performance to Fast-PD, but is conceptually much simpler. Both alpha-expansion and Fast-PD can be made orders of magnitude faster when used in conjunction with the "reduce" scheme proposed in this paper.
Novel biomedical tetrahedral mesh methods: algorithms and applications
NASA Astrophysics Data System (ADS)
Yu, Xiao; Jin, Yanfeng; Chen, Weitao; Huang, Pengfei; Gu, Lixu
2007-12-01
A tetrahedral mesh generation algorithm, as a prerequisite of many soft-tissue simulation methods, becomes very important in virtual surgery programs because of their real-time requirements. Aiming to speed up computation in the simulation, we propose a revised Delaunay algorithm which strikes a good balance among tetrahedron quality, boundary preservation, and time complexity, using several improved methods. Another mesh algorithm, named Space-Disassembling, is also presented in this paper, and a comparison of Space-Disassembling, the traditional Delaunay algorithm, and the revised Delaunay algorithm is performed on clinical soft-tissue simulation projects, including craniofacial plastic surgery and breast reconstruction plastic surgery.
Margolis, C Z
1983-02-04
The clinical algorithm (flow chart) is a text format that is specially suited for representing a sequence of clinical decisions, for teaching clinical decision making, and for guiding patient care. A representative clinical algorithm is described in detail; five steps for writing an algorithm and seven steps for writing a set of algorithms are outlined. Five clinical education and patient care uses of algorithms are then discussed, including a map for teaching clinical decision making and protocol charts for guiding step-by-step care of specific problems. Clinical algorithms are compared as to their clinical usefulness with decision analysis. Three objections to clinical algorithms are answered, including the one that they restrict thinking. It is concluded that methods should be sought for writing clinical algorithms that represent expert consensus. A clinical algorithm could then be written for any area of medical decision making that can be standardized. Medical practice could then be taught more effectively, monitored accurately, and understood better.
Rotational Invariant Dimensionality Reduction Algorithms.
Lai, Zhihui; Xu, Yong; Yang, Jian; Shen, Linlin; Zhang, David
2016-06-30
A common intrinsic limitation of traditional subspace learning methods is sensitivity to outliers and to image variations of the object, since they use the L₂ norm as the metric. In this paper, a series of methods based on the L₂,₁-norm is proposed for linear dimensionality reduction. Since the L₂,₁-norm-based objective function is robust to image variations, the proposed algorithms can perform robust image feature extraction for classification. We use different ideas to design different algorithms and obtain a unified rotational invariant (RI) dimensionality reduction framework, which extends the well-known graph embedding algorithm framework to a more generalized form. We provide comprehensive analyses to show the essential properties of the proposed algorithm framework. This paper shows that the optimization problems have globally optimal solutions when all the orthogonal projections of the data space are computed and used. Experimental results on popular image datasets indicate that the proposed RI dimensionality reduction algorithms can obtain competitive performance compared with previous L₂-norm-based subspace learning algorithms.
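The norm at the heart of the robustness claim is simple to compute: the L₂,₁ norm of a matrix is the sum of the Euclidean norms of its rows. The sketch below contrasts it with the squared Frobenius norm on a hypothetical two-row matrix; an outlier row contributes linearly through its length rather than quadratically.

```python
import math

def l21_norm(rows):
    """L2,1 matrix norm: sum over rows of the row's Euclidean norm."""
    return sum(math.sqrt(sum(v * v for v in row)) for row in rows)

def frobenius_sq(rows):
    """Squared Frobenius norm: sum of squared entries (the usual
    L2-style objective that the paper replaces)."""
    return sum(v * v for row in rows for v in row)

X = [[3.0, 4.0], [0.0, 5.0]]          # two well-behaved rows
outlier = [[3.0, 4.0], [0.0, 50.0]]   # second row is an outlier
```

Here l21_norm(X) = 5 + 5 = 10, while the outlier row moves the L₂,₁ value only to 55 but blows the squared Frobenius objective up to 2525, which is why L₂,₁ minimizers are far less dominated by outlying samples.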
Efficient generation of statistically good pseudonoise by linearly interconnected shift registers
NASA Technical Reports Server (NTRS)
Hurd, W. J.
1974-01-01
A number of efficient new algorithms for generating digital pseudonoise are presented. The algorithms are efficient because a large number of new pseudorandom bits are generated at each iteration of a computer implementation or at each clock pulse of a hardware implementation. The maximal-length shift register sequences obtained have good randomness properties. Some of the properties of p-n sequences are reviewed and the results of an extensive statistical evaluation of the pseudonoise are presented.
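The basic building block behind such generators can be sketched directly: a linear feedback shift register defined by a primitive feedback polynomial produces a maximal-length (m-) sequence. The bit-at-a-time recurrence below is an illustration of that principle only; the paper's contribution is generating many bits per iteration, which is not shown here.

```python
def pn_sequence(coeffs, seed, n):
    """Bits of the linear recurring sequence s_i = XOR of s_{i-j}
    over the tap offsets j in coeffs, seeded with the first k bits.
    With taps from a primitive polynomial the period is maximal:
    2**k - 1 for a degree-k register."""
    s = list(seed)
    while len(s) < n:
        s.append(sum(s[-j] for j in coeffs) % 2)
    return s[:n]

# x^4 + x^3 + 1 is primitive over GF(2): s_i = s_{i-1} XOR s_{i-4},
# so the period is 2**4 - 1 = 15 with exactly 8 ones per period.
bits = pn_sequence(coeffs=(1, 4), seed=(1, 0, 0, 1), n=30)
```

The balance property checked below (8 ones in each 15-bit period) is one of the standard randomness properties of m-sequences the abstract alludes to.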
The global Minmax k-means algorithm.
Wang, Xiaoyan; Bai, Yanping
2016-01-01
The global k-means algorithm is an incremental approach to clustering that dynamically adds one cluster center at a time through a deterministic global search procedure from suitable initial positions, and employs k-means to minimize the sum of the intra-cluster variances. However, the global k-means algorithm sometimes results in singleton clusters, and a bad initialization can easily lead the k-means algorithm to a poor local optimum. In this paper, we first modify the global k-means algorithm to eliminate singleton clusters, and then apply the MinMax k-means clustering error method to overcome the effect of bad initialization, yielding the global MinMax k-means algorithm. The proposed clustering method is tested on several popular data sets and compared to the k-means algorithm, the global k-means algorithm, and the MinMax k-means algorithm. The experimental results show that our proposed algorithm outperforms the other algorithms mentioned in the paper.
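The base procedure the paper modifies can be sketched on one-dimensional data: keep the (k-1)-cluster solution and try every data point as the initial position of the new centre, retaining the best run. This is an illustrative sketch of plain incremental global k-means under those assumptions; the singleton-elimination and MinMax weighting contributions of the paper are omitted.

```python
def dist2(a, b):
    return (a - b) ** 2

def kmeans_1d(xs, centers, iters=50):
    """Plain Lloyd iterations in 1-D; returns centers and the
    sum-of-squares clustering error."""
    centers = list(centers)
    for _ in range(iters):
        groups = [[] for _ in centers]
        for x in xs:
            groups[min(range(len(centers)),
                       key=lambda i: dist2(x, centers[i]))].append(x)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    error = sum(min(dist2(x, c) for c in centers) for x in xs)
    return centers, error

def global_kmeans_1d(xs, k):
    """Incremental global k-means: grow from 1 to k clusters, trying
    every data point as the new centre's initial position."""
    centers = [sum(xs) / len(xs)]
    for _ in range(2, k + 1):
        centers, _err = min((kmeans_1d(xs, centers + [x]) for x in xs),
                            key=lambda r: r[1])
    return sorted(centers)

data = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]
centers = global_kmeans_1d(data, 2)
```

On this two-blob toy set the deterministic search lands on the blob means (0.1 and 5.1) regardless of ordering, which is the initialization-free behaviour that motivates the global variant.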
A family of algorithms for computing consensus about node state from network data.
Brush, Eleanor R; Krakauer, David C; Flack, Jessica C
2013-01-01
Biological and social networks are composed of heterogeneous nodes that contribute differentially to network structure and function. A number of algorithms have been developed to measure this variation. These algorithms have proven useful for applications that require assigning scores to individual nodes, from ranking websites to determining critical species in ecosystems, yet the mechanistic basis for why they produce good rankings remains poorly understood. We show that a unifying property of these algorithms is that they quantify consensus in the network about a node's state or capacity to perform a function. The algorithms capture consensus either by taking into account the number of a target node's direct connections and, when the edges are weighted, the uniformity of its weighted in-degree distribution (breadth), or by measuring net flow into a target node (depth). Using data from communication, social, and biological networks, we find that how an algorithm measures consensus (through breadth or depth) impacts its ability to correctly score nodes. We also observe variation in sensitivity to source biases in interaction/adjacency matrices: errors arising from systematic error at the node level or from direct manipulation of network connectivity by nodes. Our results indicate that the breadth algorithms, which are derived from information theory, correctly score nodes (assessed using independent data) and are robust to errors. However, in cases where nodes "form opinions" about other nodes using indirect information, like reputation, depth algorithms, like Eigenvector Centrality, are required. One caveat is that Eigenvector Centrality is not robust to error unless the network is transitive or assortative. In these cases the network structure allows the depth algorithms to effectively capture breadth as well as depth. Finally, we discuss the algorithms' cognitive and computational demands. This is an important consideration in systems in which individuals use the
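The depth-style algorithm singled out above, Eigenvector Centrality, can be sketched with plain power iteration: a node scores highly when the nodes connected to it score highly. The small example graph is illustrative only, and the sketch assumes a connected, non-bipartite undirected graph so the iteration converges.

```python
def eigenvector_centrality(adj, iters=200):
    """Power iteration on the adjacency matrix: repeated
    multiplication followed by normalization converges to the
    principal eigenvector, whose entries are the centrality scores."""
    n = len(adj)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(adj[j][i] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Triangle 0-1-2 with a pendant node 3 attached to node 0.
adj = [[0, 1, 1, 1],
       [1, 0, 1, 0],
       [1, 1, 0, 0],
       [1, 0, 0, 0]]
scores = eigenvector_centrality(adj)
```

The hub (node 0) scores highest, the two symmetric triangle nodes tie, and the pendant node, whose only support is indirect, scores lowest: exactly the "consensus through depth" behaviour described in the abstract.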
Competitive Swarm Optimizer Based Gateway Deployment Algorithm in Cyber-Physical Systems
Huang, Shuqiang; Tao, Ming
2017-01-01
Wireless sensor network topology optimization is a highly important issue, and topology control through node selection can improve the efficiency of data forwarding while saving energy and prolonging the lifetime of the network. To address the problem of connecting a wireless sensor network to the Internet in cyber-physical systems, here we propose a geometric gateway deployment based on a competitive swarm optimizer algorithm. The particle swarm optimization (PSO) algorithm has a continuous search feature in the solution space, which makes it suitable for finding the geometric center of gateway deployment; however, its search mechanism is limited to the individual optimum (pbest) and the population optimum (gbest), so it easily falls into local optima. In order to improve the particle search mechanism and enhance the search efficiency of the algorithm, we introduce a new competitive swarm optimizer (CSO) algorithm. The CSO search algorithm is based on an inter-particle competition mechanism and can effectively prevent the population from falling into a local optimum. With the addition of an adaptive opposition-based search and the ability to adjust parameters dynamically, this algorithm can maintain the diversity of the entire swarm to solve geometric K-center gateway deployment problems. The simulation results show that this CSO algorithm has good global explorative ability as well as convergence speed and can improve the network quality of service (QoS) level of cyber-physical systems by obtaining a minimum network coverage radius. We also find that the CSO algorithm is more stable, robust and effective in solving the problem of geometric gateway deployment as compared to the PSO or K-medoids algorithms. PMID:28117735
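The pairwise competition mechanism described in this abstract can be sketched as follows (a minimal CSO for a generic minimization problem; the swarm size, social factor phi, and bound handling are illustrative assumptions, not the paper's exact settings):

```python
import numpy as np

def cso_minimize(f, dim, n=40, iters=200, lb=-5.0, ub=5.0, phi=0.1, seed=0):
    """Minimal competitive swarm optimizer (CSO) sketch.

    Particles are paired at random each generation; the loser of each
    pairwise fitness comparison learns from the winner and from the
    swarm mean position, while the winner passes through unchanged.
    n must be even so every particle gets a pairing.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(lb, ub, (n, dim))     # positions
    v = np.zeros((n, dim))                # velocities
    for _ in range(iters):
        fit = np.apply_along_axis(f, 1, x)
        mean = x.mean(axis=0)             # swarm mean position
        order = rng.permutation(n)
        for i in range(0, n, 2):
            a, b = order[i], order[i + 1]
            w, l = (a, b) if fit[a] <= fit[b] else (b, a)  # winner, loser
            r1, r2, r3 = rng.random((3, dim))
            # Only the loser is updated: pulled toward the winner
            # and (weakly, via phi) toward the swarm mean.
            v[l] = r1 * v[l] + r2 * (x[w] - x[l]) + phi * r3 * (mean - x[l])
            x[l] = np.clip(x[l] + v[l], lb, ub)
    fit = np.apply_along_axis(f, 1, x)
    return x[fit.argmin()], fit.min()

best_x, best_f = cso_minimize(lambda z: float(np.sum(z * z)), dim=10)
```

Only the loser of each pairing moves, so the current best particle is never perturbed; this is what distinguishes the CSO update from the pbest/gbest mechanism of standard PSO.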
Shi, Yuanyuan; Cai, Huajian; Shen, Yiqin Alicia; Yang, Jing
2016-01-01
Three studies were conducted to examine the validity of the four versions of BIATs that are supposed to measure the same construct but differ in the shared focal category. Study 1 investigated the criterion validity of four BIATs measuring attitudes toward flowers versus insects. Study 2 examined the experimental sensitivity of four BIATs by considering attitudes toward an induced ingroup versus outgroup. Study 3 examined the predictive power of the four BIATs by investigating attitudes toward the commercial beverages Coke versus Sprite. The findings suggested that of the two attributes "good" and "bad," "good" rather than "bad" proved to be the better shared focal category; for two targets, so long as they clearly differed in goodness or valence, the "good" rather than the "bad" target emerged as the better shared focal category. Beyond this case, either target worked well. These findings may facilitate the understanding of the BIAT and its future applications.
Algorithm Visualization: The State of the Field
ERIC Educational Resources Information Center
Shaffer, Clifford A.; Cooper, Matthew L.; Alon, Alexander Joel D.; Akbar, Monika; Stewart, Michael; Ponce, Sean; Edwards, Stephen H.
2010-01-01
We present findings regarding the state of the field of Algorithm Visualization (AV) based on our analysis of a collection of over 500 AVs. We examine how AVs are distributed among topics, who created them and when, their overall quality, and how they are disseminated. There does exist a cadre of good AVs and active developers. Unfortunately, we…
Predictive Caching Using the TDAG Algorithm
NASA Technical Reports Server (NTRS)
Laird, Philip; Saul, Ronald
1992-01-01
We describe how the TDAG algorithm for learning to predict symbol sequences can be used to design a predictive cache store. A model of a two-level mass storage system is developed and used to calculate the performance of the cache under various conditions. Experimental simulations provide good confirmation of the model.
Tian, Xiaochun; Chen, Jiabin; Han, Yongqiang; Shang, Jianyu; Li, Nan
2016-01-01
Zero velocity update (ZUPT) plays an important role in pedestrian navigation algorithms, with the premise that the zero velocity interval (ZVI) should be detected accurately and effectively. A novel adaptive ZVI detection algorithm based on a smoothed pseudo Wigner–Ville distribution to remove multiple frequencies intelligently (SPWVD-RMFI) is proposed in this paper. The novel algorithm adopts the SPWVD-RMFI method to extract the pedestrian gait frequency and to calculate the optimal ZVI detection threshold in real time by establishing functional relationships between the thresholds and the gait frequency; the thresholds thus adapt to the gait frequency, which improves the ZVI detection precision. To put it into practice, a ZVI detection experiment is carried out; the result shows that, compared with the traditional fixed-threshold ZVI detection method, the adaptive ZVI detection algorithm can effectively reduce the false and missed detection rates of the ZVI, indicating that the novel algorithm has high detection precision and good robustness. Furthermore, pedestrian trajectory positioning experiments at different walking speeds are carried out to evaluate the influence of the novel algorithm on positioning precision. The results show that the ZVI detected by the adaptive ZVI detection algorithm achieves better performance in pedestrian trajectory calculation. PMID:27669266
Model of wealth and goods dynamics in a closed market
NASA Astrophysics Data System (ADS)
Ausloos, Marcel; Pękalski, Andrzej
2007-01-01
A simple computer simulation model of a closed market on a fixed network with free flow of goods and money is introduced. The model contains only two variables, the amount of goods and of money, beside the size of the system. An initially flat distribution of both variables is presupposed. We show that under completely random rules, i.e. through the choice of interacting agent pairs on the network and of the exchange rules, the market stabilizes in time and shows diversification of money and goods. We also indicate that the difference between poor and rich agents increases for small markets, as well as for systems in which money is steadily deducted from the market through taxation. It is also found that the price of goods decreases when taxes are introduced, likely due to the reduced availability of money.
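A toy version of such a closed market is easy to simulate. The sketch below keeps only the model's two conserved quantities, money and goods, and uses random pairwise exchanges; the exchange rule and parameters are illustrative assumptions, not the paper's exact rules, and the fixed network and taxation are omitted:

```python
import random

def simulate_market(n_agents=100, steps=50000, m0=10.0, g0=10.0, seed=1):
    """Toy closed-market sketch: agents start with equal money m0 and
    goods g0; at each step a random pair trades a random fraction of the
    buyer's money against a random fraction of the seller's goods.
    Both totals are conserved, so the market is closed."""
    rng = random.Random(seed)
    money = [m0] * n_agents
    goods = [g0] * n_agents
    for _ in range(steps):
        i, j = rng.sample(range(n_agents), 2)   # buyer i, seller j
        pay = rng.random() * money[i]           # money i hands over
        qty = rng.random() * goods[j]           # goods j hands over
        money[i] -= pay; money[j] += pay
        goods[j] -= qty; goods[i] += qty
    return money, goods

money, goods = simulate_market()
```

Even with an initially flat distribution, the random exchanges alone produce a spread between poor and rich agents, which is the qualitative effect the abstract describes.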
Evolving evolutionary algorithms using linear genetic programming.
Oltean, Mihai
2005-01-01
A new model for evolving Evolutionary Algorithms is proposed in this paper. The model is based on the Linear Genetic Programming (LGP) technique. Every LGP chromosome encodes an EA which is used for solving a particular problem. Several Evolutionary Algorithms for function optimization, the Traveling Salesman Problem and the Quadratic Assignment Problem are evolved by using the considered model. Numerical experiments show that the evolved Evolutionary Algorithms perform similarly to, and sometimes even better than, standard approaches for several well-known benchmark problems.
A Traffic Motion Object Extraction Algorithm
NASA Astrophysics Data System (ADS)
Wu, Shaofei
2015-12-01
A motion object extraction algorithm based on the active contour model is proposed. Firstly, moving areas involving shadows are segmented with the classical background difference algorithm. Secondly, shadow detection and coarse removal are performed, and a grid method is used to extract initial contours. Finally, the active contour model approach is adopted to compute the contour of the real object by iteratively tuning the parameter of the model. Experiments show the algorithm can remove the shadow and keep the integrity of a moving object.
Genetic Algorithms Viewed as Anticipatory Systems
NASA Astrophysics Data System (ADS)
Mocanu, Irina; Kalisz, Eugenia; Negreanu, Lorina
2010-11-01
This paper proposes a new version of genetic algorithms: the anticipatory genetic algorithm AGA. The performance evaluation included in the paper shows that AGA is superior to the traditional genetic algorithm in both speed and accuracy. The paper also presents how this algorithm can be applied to solve a complex problem: image annotation, intended to be used in content-based image retrieval systems.
Detection of cracks in shafts with the Approximated Entropy algorithm
NASA Astrophysics Data System (ADS)
Sampaio, Diego Luchesi; Nicoletti, Rodrigo
2016-05-01
The Approximate Entropy is a statistical measure used primarily in the fields of Medicine, Biology, and Telecommunication for classifying and identifying complex signal data. In this work, an Approximate Entropy algorithm is used to detect cracks in a rotating shaft. The signals of the cracked shaft are obtained from numerical simulations of a de Laval rotor with breathing cracks modelled by Fracture Mechanics. In this case, the vertical displacements of the rotor during run-up transients were analysed. The results show the feasibility of detecting cracks from 5% depth, irrespective of the unbalance of the rotating system and the crack orientation in the shaft. The results also show that the algorithm can differentiate between the occurrence of crack only, misalignment only, and crack + misalignment in the system. However, the algorithm is sensitive to the intrinsic parameters p (number of data points in a sample vector) and f (fraction of the standard deviation that defines the minimum distance between two sample vectors), and good results are only obtained by appropriately choosing their values according to the sampling rate of the signal.
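Approximate Entropy itself is compact enough to state directly. The sketch below follows the standard formulation, with the tolerance written as a fraction of the signal's standard deviation (the abstract's parameter f); the window length m=2 and fraction 0.2 are common defaults used here as illustrative assumptions:

```python
import numpy as np

def approx_entropy(x, m=2, r_frac=0.2):
    """Approximate Entropy ApEn(m, r) of a 1-D signal, with tolerance
    r = r_frac * std(x).  Higher ApEn means a less regular signal."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()

    def phi(mm):
        n = len(x) - mm + 1
        # Embed the signal: n overlapping windows of length mm.
        emb = np.array([x[i:i + mm] for i in range(n)])
        # Chebyshev distance between every pair of windows.
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (d <= r).sum(axis=1) / n      # fraction of near neighbours
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# A periodic signal should be more regular (lower ApEn) than white noise.
rng = np.random.default_rng(0)
t = np.arange(300)
apen_sine = approx_entropy(np.sin(0.2 * t))
apen_noise = approx_entropy(rng.standard_normal(300))
```

In the crack-detection setting, a change in signal regularity between healthy and cracked shafts is what the ApEn value is meant to pick up; the sensitivity to p (here m) and f (here r_frac) noted in the abstract is visible if you vary those arguments.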
Algorithms for automated DNA assembly
Densmore, Douglas; Hsiau, Timothy H.-C.; Kittleson, Joshua T.; DeLoache, Will; Batten, Christopher; Anderson, J. Christopher
2010-01-01
Generating a defined set of genetic constructs within a large combinatorial space provides a powerful method for engineering novel biological functions. However, the process of assembling more than a few specific DNA sequences can be costly, time consuming and error prone. Even if a correct theoretical construction scheme is developed manually, it is likely to be suboptimal by any number of cost metrics. Modular, robust and formal approaches are needed for exploring these vast design spaces. By automating the design of DNA fabrication schemes using computational algorithms, we can eliminate human error while reducing redundant operations, thus minimizing the time and cost required for conducting biological engineering experiments. Here, we provide algorithms that optimize the simultaneous assembly of a collection of related DNA sequences. We compare our algorithms to an exhaustive search on a small synthetic dataset and our results show that our algorithms can quickly find an optimal solution. Comparison with random search approaches on two real-world datasets shows that our algorithms can also quickly find lower-cost solutions for large datasets. PMID:20335162
Quantifying capital goods for waste incineration
Brogaard, L.K.; Riber, C.; Christensen, T.H.
2013-06-15
Highlights: • Materials and energy used for the construction of waste incinerators were quantified. • The data was collected from five incineration plants in Scandinavia. • Included were six main materials, electronic systems, cables and all transportation. • The capital goods contributed 2–3% compared to the direct emissions impact on GW. - Abstract: Materials and energy used for the construction of modern waste incineration plants were quantified. The data was collected from five incineration plants (72,000–240,000 tonnes per year) built in Scandinavia (Norway, Finland and Denmark) between 2006 and 2012. Concrete for the buildings was the main material used amounting to 19,000–26,000 tonnes per plant. The quantification further included six main materials, electronic systems, cables and all transportation. The energy used for the actual on-site construction of the incinerators was in the range 4000–5000 MW h. In terms of the environmental burden of producing the materials used in the construction, steel for the building and the machinery contributed the most. The material and energy used for the construction corresponded to the emission of 7–14 kg CO2 per tonne of waste combusted throughout the lifetime of the incineration plant. The assessment showed that, compared to data reported in the literature on direct emissions from the operation of incinerators, the environmental impacts caused by the construction of buildings and machinery (capital goods) could amount to 2–3% with respect to kg CO2 per tonne of waste combusted.
Noise-enhanced clustering and competitive learning algorithms.
Osoba, Osonde; Kosko, Bart
2013-01-01
Noise can provably speed up convergence in many centroid-based clustering algorithms. This includes the popular k-means clustering algorithm. The clustering noise benefit follows from the general noise benefit for the expectation-maximization algorithm because many clustering algorithms are special cases of the expectation-maximization algorithm. Simulations show that noise also speeds up convergence in stochastic unsupervised competitive learning, supervised competitive learning, and differential competitive learning.
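A minimal illustration of the noise benefit for k-means, assuming a simple annealed-noise schedule (the paper's formal result is for the EM family of algorithms; the decay schedule and noise scale below are illustrative choices, not taken from the paper):

```python
import numpy as np

def noisy_kmeans(X, k, iters=50, noise0=0.5, seed=0):
    """k-means sketch with decaying zero-mean noise injected into the
    centroid update, illustrating the kind of noise injection whose
    convergence benefit the paper analyzes."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)].copy()
    for t in range(iters):
        # Assign each point to its nearest centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                # Mean update plus annealed (1/t-decaying) noise.
                noise = noise0 / (t + 1) * rng.standard_normal(X.shape[1])
                centroids[j] = pts.mean(axis=0) + noise
    return centroids, labels

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
centroids, labels = noisy_kmeans(X, k=2)
```

The key design point is that the noise must decay over iterations: early noise helps the centroids escape poor configurations, while the vanishing tail lets the algorithm settle.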
A Parallel Nonrigid Registration Algorithm Based on B-Spline for Medical Images
Du, Xiaogang; Dang, Jianwu; Wang, Yangping; Wang, Song; Lei, Tao
2016-01-01
The nonrigid registration algorithm based on B-spline Free-Form Deformation (FFD) plays a key role and is widely applied in medical image processing due to the good flexibility and robustness. However, it requires a tremendous amount of computing time to obtain more accurate registration results especially for a large amount of medical image data. To address the issue, a parallel nonrigid registration algorithm based on B-spline is proposed in this paper. First, the Logarithm Squared Difference (LSD) is considered as the similarity metric in the B-spline registration algorithm to improve registration precision. After that, we create a parallel computing strategy and lookup tables (LUTs) to reduce the complexity of the B-spline registration algorithm. As a result, the computing time of three time-consuming steps including B-splines interpolation, LSD computation, and the analytic gradient computation of LSD, is efficiently reduced, for the B-spline registration algorithm employs the Nonlinear Conjugate Gradient (NCG) optimization method. Experimental results of registration quality and execution efficiency on the large amount of medical images show that our algorithm achieves a better registration accuracy in terms of the differences between the best deformation fields and ground truth and a speedup of 17 times over the single-threaded CPU implementation due to the powerful parallel computing ability of Graphics Processing Unit (GPU). PMID:28053653
GenClust: A genetic algorithm for clustering gene expression data
Di Gesú, Vito; Giancarlo, Raffaele; Lo Bosco, Giosué; Raimondi, Alessandra; Scaturro, Davide
2005-01-01
Background Clustering is a key step in the analysis of gene expression data, and in fact, many classical clustering algorithms are used, or more innovative ones have been designed and validated for the task. Despite the widespread use of artificial intelligence techniques in bioinformatics and, more generally, data analysis, there are very few clustering algorithms based on the genetic paradigm, yet that paradigm has great potential in finding good heuristic solutions to a difficult optimization problem such as clustering. Results GenClust is a new genetic algorithm for clustering gene expression data. It has two key features: (a) a novel coding of the search space that is simple, compact and easy to update; (b) it can be used naturally in conjunction with data driven internal validation methods. We have experimented with the FOM methodology, specifically conceived for validating clusters of gene expression data. The validity of GenClust has been assessed experimentally on real data sets, both with the use of validation measures and in comparison with other algorithms, i.e., Average Link, Cast, Click and K-means. Conclusion Experiments show that none of the algorithms we have used is markedly superior to the others across data sets and validation measures; i.e., in many cases the observed differences between the worst and best performing algorithm may be statistically insignificant and they could be considered equivalent. However, there are cases in which an algorithm may be better than others and therefore worthwhile. In particular, experiments for GenClust show that, although simple in its data representation, it converges very rapidly to a local optimum and that its ability to identify meaningful clusters is comparable, and sometimes superior, to that of more sophisticated algorithms. In addition, it is well suited for use in conjunction with data driven internal validation measures and, in particular, the FOM methodology. PMID:16336639
[A Hyperspectral Imagery Anomaly Detection Algorithm Based on Gauss-Markov Model].
Gao, Kun; Liu, Ying; Wang, Li-jing; Zhu, Zhen-yu; Cheng, Hao-bo
2015-10-01
With the development of spectral imaging technology, hyperspectral anomaly detection is becoming more and more widely used in remote sensing imagery processing. The traditional RX anomaly detection algorithm neglects the spatial correlation of images. Besides, it does not validly reduce the data dimension, which costs too much processing time and shows low validity on hyperspectral data. Hyperspectral images follow a Gauss-Markov Random Field (GMRF) in the spatial and spectral dimensions. The inverse of the covariance matrix can be calculated directly by building the Gauss-Markov parameters, which avoids heavy computation on the hyperspectral data. This paper proposes an improved RX anomaly detection algorithm based on a three-dimensional GMRF. The hyperspectral imagery data is simulated with the GMRF model, and the GMRF parameters are estimated with the Approximated Maximum Likelihood method. The detection operator is constructed with the GMRF estimation parameters. The pixel being detected is taken as the centre of a local optimization window, called the GMRF detection window. The abnormality degree is calculated with the mean vector and the inverse covariance matrix, both computed within the window. The image is detected pixel by pixel as the GMRF window moves. The traditional RX detection algorithm, the regional hypothesis detection algorithm based on GMRF, and the algorithm proposed in this paper are simulated with AVIRIS hyperspectral data. Simulation results show that the proposed anomaly detection method is able to improve the detection efficiency and reduce the false alarm rate. We collected the operation time statistics of the three algorithms in the same computing environment. The results show that the proposed algorithm reduces the operation time by 45.2%, which shows good computing efficiency.
Software For Genetic Algorithms
NASA Technical Reports Server (NTRS)
Wang, Lui; Bayer, Steve E.
1992-01-01
SPLICER computer program is genetic-algorithm software tool used to solve search and optimization problems. Provides underlying framework and structure for building genetic-algorithm application program. Written in Think C.
Algorithm-development activities
NASA Technical Reports Server (NTRS)
Carder, Kendall L.
1994-01-01
The task of algorithm-development activities at USF continues. The algorithm for determining chlorophyll a concentration (Chl a) and the gelbstoff absorption coefficient for SeaWiFS and MODIS-N radiance data is our current priority.
A Good Image Model Eases Restoration
2002-02-06
algorithms, and various classical as well as unexpected new applications of the BV (bounded variation) image model, first introduced into image processing by Rudin, Osher, and Fatemi in 1992 (Physica D, 60:259-268).
Liu, Wenyuan; Wang, Chunlei; Wang, Baowen; Wang, Changwu
2014-02-01
Cancer gene expression data have the characteristics of high dimensionality and small sample size, so it is necessary to perform dimensionality reduction on the data. Traditional linear dimensionality reduction approaches cannot capture the nonlinear relationships between data points and yield poor dimensionality reduction results. Therefore, a multiple-weights locally linear embedding (LLE) algorithm with an improved distance is introduced to perform dimensionality reduction in this study. We adopted an improved distance to find the neighbors of each data point in this algorithm, then introduced multiple sets of linearly independent local weight vectors for each neighborhood, and obtained the embedding of the high-dimensional data in the low-dimensional space by minimizing the reconstruction error. Experimental results showed that the multiple-weights LLE algorithm with improved distance performs well for dimensionality reduction of cancer gene expression data.
An infrared small target detection algorithm based on high-speed local contrast method
NASA Astrophysics Data System (ADS)
Cui, Zheng; Yang, Jingli; Jiang, Shouda; Li, Junbao
2016-05-01
Small-target detection in infrared imagery with a complex background is always an important task in remote sensing fields. It is important to improve detection capabilities such as detection rate, false alarm rate, and speed. However, current algorithms usually improve one or two of these capabilities while sacrificing the others. In this letter, an infrared (IR) small target detection algorithm with two layers inspired by the Human Visual System (HVS) is proposed to balance those detection capabilities. The first layer uses a high-speed simplified local contrast method to select significant information, and the second layer uses a machine learning classifier to separate targets from background clutter. Experimental results show the proposed algorithm achieves good performance in detection rate, false alarm rate and speed simultaneously.
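The first-layer idea, scoring each pixel by how much its local cell stands out from its surroundings, can be sketched as follows (a naive, unoptimized local-contrast map; the cell size, the ratio-based score, and the dense loop are illustrative assumptions rather than the paper's high-speed formulation):

```python
import numpy as np

def local_contrast_map(img, cell=3):
    """Simplified local-contrast sketch in the spirit of HVS-based IR
    detectors: each pixel's score is the mean of its centre cell divided
    by the largest mean among the 8 surrounding cells.  Small bright
    targets stand out; extended structures do not."""
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    for y in range(cell, h - 2 * cell + 1):
        for x in range(cell, w - 2 * cell + 1):
            center = img[y:y + cell, x:x + cell].mean()
            around = []
            for dy in (-cell, 0, cell):
                for dx in (-cell, 0, cell):
                    if dy or dx:
                        around.append(img[y + dy:y + dy + cell,
                                          x + dx:x + dx + cell].mean())
            out[y, x] = center / (max(around) + 1e-6)
    return out

img = np.full((32, 32), 10.0)
img[15:17, 15:17] = 60.0                    # a bright 2x2 "target"
cmap = local_contrast_map(img)
```

Thresholding the map (or, as in the paper's second layer, feeding such features to a classifier) separates target candidates from background clutter.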
Dwell time algorithm for multi-mode optimization in manufacturing large optical mirrors
NASA Astrophysics Data System (ADS)
Liu, Zhenyu
2014-08-01
CCOS (Computer Controlled Optical Surfacing) is one of the most important methods for manufacturing optical surfaces. By controlling the dwell time of a polishing tool on the mirror, the desired material removal can be obtained. As optical surfaces become larger, the traditional CCOS method cannot meet the demand of manufacturing mirrors with higher efficiency and precision. This paper presents a new method using multi-mode optimization. By calculating the dwell time maps of different tools in one optimization cycle, the larger tool and the smaller one have complementary advantages, yielding a global optimization over multiple tools and multiple processing cycles. To calculate the dwell times of different tools at the same time, we use a multi-mode dwell time algorithm based on matrix calculation. With this algorithm we carried out simulation experiments; the results show that the multi-mode optimization algorithm can improve efficiency while maintaining good precision.
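The matrix formulation can be illustrated on a 1-D toy mirror: stack the per-position removal functions of every tool into one linear system and solve for all dwell times at once under a non-negativity constraint (the Gaussian influence functions, grid sizes, and the use of non-negative least squares are illustrative assumptions, not the paper's exact algorithm):

```python
import numpy as np
from scipy.optimize import nnls

def multi_tool_dwell_time(removal_maps, target):
    """Multi-mode dwell-time sketch: removal_maps is a list of
    (n_points, n_positions) matrices, one per tool, giving removal per
    unit dwell time.  Stacking them lets one solve for the dwell times
    of all tools in a single optimization cycle."""
    A = np.hstack(removal_maps)            # one combined linear system
    t, _ = nnls(A, target)                 # dwell times must be >= 0
    return t, A @ t

# 1-D toy mirror: Gaussian influence functions for a large and a small tool.
xs = np.linspace(0, 1, 60)                 # surface sample points
pos = np.linspace(0, 1, 20)                # tool dwell positions

def influence(width):
    return np.exp(-((xs[:, None] - pos[None, :]) / width) ** 2)

target = 1.0 + 0.3 * np.sin(6 * np.pi * xs)        # desired removal
t, achieved = multi_tool_dwell_time([influence(0.15), influence(0.04)],
                                    target)
```

The complementary-advantage effect shows up here directly: the wide tool fits the slowly varying part of the removal map, while the narrow tool picks up the residual detail.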
NASA Technical Reports Server (NTRS)
Strong, James P.
1987-01-01
A local area matching algorithm was developed on the Massively Parallel Processor (MPP). It is an iterative technique that first matches coarse or low resolution areas and at each iteration performs matches of higher resolution. Results so far show that when good matches are possible in the two images, the MPP algorithm matches corresponding areas as well as a human observer. To aid in developing this algorithm, a control or shell program was developed for the MPP that allows interactive experimentation with various parameters and procedures to be used in the matching process. (This would not be possible without the high speed of the MPP). With the system, optimal techniques can be developed for different types of matching problems.
A new dehazing algorithm based on overlapped sub-block homomorphic filtering
NASA Astrophysics Data System (ADS)
Yu, Lu; Liu, Xuebin; Liu, Guizhong
2015-12-01
Considering that images captured under hazy weather conditions are blurred, a new dehazing algorithm based on overlapped sub-block homomorphic filtering in HSV color space is proposed. Firstly, the hazy image is transformed from RGB to HSV color space. Secondly, the luminance component V is processed with the overlapped sub-block homomorphic filtering. Finally, the processed image is converted from HSV back to RGB color space, yielding the dehazed image. According to the established algorithm model, the dehazed images can be evaluated by six objective evaluation parameters: average value, standard deviation, entropy, average gradient, edge intensity and contrast. The experimental results show that this algorithm has a good dehazing effect: it not only mitigates the degradation of the image but also amplifies image details and effectively enhances the contrast of the image.
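The core filtering step can be sketched for a single channel as follows (a plain, whole-image homomorphic filter rather than the paper's overlapped sub-block variant; the Gaussian high-frequency-emphasis transfer function and its gains and cutoff are illustrative assumptions):

```python
import numpy as np

def homomorphic_filter(v, gamma_l=0.5, gamma_h=1.5, c=1.0, d0=10.0):
    """Homomorphic filtering of one channel (e.g. the V channel of an
    HSV image): work in the log domain, attenuate low frequencies
    (illumination/haze) by gamma_l and boost high frequencies (detail)
    by up to gamma_h, then exponentiate back."""
    h, w = v.shape
    logv = np.log1p(v.astype(float))          # log domain
    F = np.fft.fftshift(np.fft.fft2(logv))    # centred spectrum
    # Gaussian high-frequency-emphasis transfer function.
    y, x = np.ogrid[:h, :w]
    d2 = (y - h / 2) ** 2 + (x - w / 2) ** 2
    H = (gamma_h - gamma_l) * (1 - np.exp(-c * d2 / d0 ** 2)) + gamma_l
    out = np.fft.ifft2(np.fft.ifftshift(H * F)).real
    return np.expm1(out)                      # back from log domain

# Low-frequency illumination (haze-like) is attenuated, detail is kept.
v = np.ones((64, 64)) * 0.8
v[30:34, 30:34] = 0.2                         # a small dark detail
filtered = homomorphic_filter(v)
```

The sub-block version of the paper applies this per overlapping block, which adapts the filter to local haze density; the overlap avoids blocking artifacts at the seams.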
Memetic algorithm-based multi-objective coverage optimization for wireless sensor networks.
Chen, Zhi; Li, Shuai; Yue, Wenjing
2014-10-30
Maintaining effective coverage and extending the network lifetime as much as possible have become two of the most critical issues in the coverage of WSNs. In this paper, we propose a multi-objective coverage optimization algorithm for WSNs, namely MOCADMA, which models the coverage control of WSNs as a multi-objective optimization problem. MOCADMA uses a memetic algorithm with a dynamic local search strategy to optimize the coverage of WSNs and achieve objectives such as high network coverage, effective node utilization and more residual energy. In MOCADMA, the alternative solutions are represented as chromosomes in matrix form, and the optimal solutions are selected through numerous iterations of the evolution process, including selection, crossover, mutation, local enhancement, and fitness evaluation. The experimental and evaluation results show that MOCADMA maintains the sensing coverage well, achieves higher network coverage while improving energy efficiency, effectively prolongs the network lifetime, and offers a significant improvement over some existing algorithms.
Lensless optical data hiding system based on phase encoding algorithm in the Fresnel domain.
Chen, Yen-Yu; Wang, Jian-Hong; Lin, Cheng-Chung; Hwang, Hone-Ene
2013-07-20
A novel and efficient algorithm based on a modified Gerchberg-Saxton algorithm (MGSA) in the Fresnel domain is presented, together with mathematical derivation, and two pure phase-only masks (POMs) are generated. The algorithm's application to data hiding is demonstrated by a simulation procedure, in which a hidden image/logo is encoded into phase forms. A hidden image/logo can be extracted by the proposed high-performance lensless optical data-hiding system. The reconstructed image shows good quality and the errors are close to zero. In addition, the robustness of our data-hiding technique is illustrated by simulation results. The position coordinates of the POMs as well as the wavelength are used as secure keys that can ensure sufficient information security and robustness. The main advantages of this proposed watermarking system are that it uses fewer iterative processes to produce the masks, and the image-hiding scheme is straightforward.
A Short Survey of Document Structure Similarity Algorithms
Buttler, D
2004-02-27
This paper provides a brief survey of document structural similarity algorithms, including the optimal Tree Edit Distance algorithm and various approximation algorithms. The approximation algorithms include the simple weighted tag similarity algorithm, Fourier transforms of the structure, and a new application of the shingle technique to structural similarity. We show three surprising results. First, the Fourier transform technique proves to be the least accurate of the approximation algorithms, while also being the slowest. Second, optimal Tree Edit Distance algorithms may not be the best technique for clustering pages from different sites. Third, the simplest approximation to structure may be the most effective and efficient mechanism for many applications.
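The shingle idea applied to structure is simple to state: slide a fixed-length window over a document's tag sequence and compare the resulting multisets of windows. A minimal sketch (the tag sequences and the shingle length k=3 are illustrative; the paper's weighting scheme may differ):

```python
from collections import Counter

def tag_shingles(tags, k=3):
    """Multiset of length-k shingles over a document's tag sequence."""
    return Counter(tuple(tags[i:i + k]) for i in range(len(tags) - k + 1))

def shingle_similarity(tags_a, tags_b, k=3):
    """Structural similarity sketch: multiset-Jaccard overlap of
    tag-sequence shingles, one simple stand-in for the structural
    similarity measures the survey compares."""
    a, b = tag_shingles(tags_a, k), tag_shingles(tags_b, k)
    inter = sum((a & b).values())   # multiset intersection size
    union = sum((a | b).values())   # multiset union size
    return inter / union if union else 1.0

doc1 = ["html", "body", "div", "p", "p", "div", "p"]
doc2 = ["html", "body", "div", "p", "p", "div", "span"]
doc3 = ["html", "body", "table", "tr", "td", "tr", "td"]
```

Because shingles only require one linear pass and hashing, this family of approximations is far cheaper than optimal Tree Edit Distance, which is the trade-off the survey examines.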
Accurate Finite Difference Algorithms
NASA Technical Reports Server (NTRS)
Goodrich, John W.
1996-01-01
Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods; they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order and high-resolution algorithms can produce accurate results after O(10(exp 6)) periods of propagation with eight grid points per wavelength.
Quantum algorithms: an overview
NASA Astrophysics Data System (ADS)
Montanaro, Ashley
2016-01-01
Quantum computers are designed to outperform standard computers by running quantum algorithms. Areas in which quantum algorithms can be applied include cryptography, search and optimisation, simulation of quantum systems and solving large systems of linear equations. Here we briefly survey some known quantum algorithms, with an emphasis on a broad overview of their applications rather than their technical details. We include a discussion of recent developments and near-term applications of quantum algorithms.
INSENS classification algorithm report
Hernandez, J.E.; Frerking, C.J.; Myers, D.W.
1993-07-28
This report describes a new algorithm developed for the Immigration and Naturalization Service (INS) in support of the INSENS project for classifying vehicles and pedestrians using seismic data. This algorithm is less sensitive to nuisance alarms due to environmental events than the previous algorithm. Furthermore, the algorithm is simple enough that it can be implemented in the 8-bit microprocessor used in the INSENS system.
NASA Astrophysics Data System (ADS)
Graf, Norman A.
2001-07-01
An object-oriented framework for undertaking clustering algorithm studies has been developed. We present here the definitions for the abstract Cells and Clusters as well as the interface for the algorithm. We intend to use this framework to investigate the interplay between various clustering algorithms and the resulting jet reconstruction efficiency and energy resolutions to assist in the design of the calorimeter detector.
Listless zerotree image compression algorithm
NASA Astrophysics Data System (ADS)
Lian, Jing; Wang, Ke
2006-09-01
In this paper, an improved zerotree structure and a new coding procedure are adopted, which improve the reconstructed image quality. Moreover, the lists in SPIHT are replaced by flag maps, and a lifting scheme is adopted to realize the wavelet transform, which lowers the memory requirements and speeds up the coding process. Experimental results show that the algorithm is more effective and efficient compared with SPIHT.
Patients' narratives concerning good and bad caring.
Lövgren, G; Engström, B; Norberg, A
1996-01-01
Narratives from patients (n = 80) and patients' relatives (n = 12) were collected to illuminate experiences of good and bad caring episodes and to obtain descriptions of good caring. Narratives describing good caring included such task aspects as swift and correct assessment and access to information. Aspects mentioned less frequently were, for example, being given time, receiving pain relief and good food. Relationship aspects mentioned included having an interest shown in one's care, being taken seriously and being cared about. There are parallels regarding relationship aspects between the narratives concerning good and bad caring episodes; for example, what was praised in the good caring narratives was criticized in those describing bad caring. Such parallels were being/not being trusted, being/not being believed and being/not being respected. The narratives concerning bad caring were more specific and more vivid than those about good caring. The authors' interpretation was that the bad episodes were unexpected and very painful and therefore remained imprinted in the patients' memories. The descriptions of good caring included relationship aspects only in 34 cases, task aspects only in five cases, and a combination of both in 50 cases. The ultimate purpose of the study was to obtain a basis for the development of a policy for good caring founded on patients' experiences. It is desirable that further studies be undertaken within various clinical specialties which would also take into consideration medical, social and cultural perspectives.
Duan, Qian-Qian; Yang, Gen-Ke; Pan, Chang-Chun
2014-01-01
A hybrid optimization algorithm combining the finite state method (FSM) and a genetic algorithm (GA) is proposed to solve the crude oil scheduling problem. The FSM and GA are combined to take advantage of each method and compensate for the deficiencies of the individual methods. In the proposed algorithm, the finite state method makes up for the weakness of GA, whose local search ability is poor. The heuristic returned by the FSM can guide the GA towards good solutions. The idea behind this is that we can generate promising substructures or partial solutions by using the FSM. Furthermore, the FSM can guarantee that the entire solution space is uniformly covered. Therefore, the combination of the two algorithms has better global performance than either GA or FSM operating individually. Finally, a real-life crude oil scheduling problem from the literature is used for simulation. The experimental results validate that the proposed method outperforms the state-of-the-art GA method. PMID:24772031
CLADA: cortical longitudinal atrophy detection algorithm.
Nakamura, Kunio; Fox, Robert; Fisher, Elizabeth
2011-01-01
Measurement of changes in brain cortical thickness is useful for the assessment of regional gray matter atrophy in neurodegenerative conditions. A new longitudinal method, called CLADA (cortical longitudinal atrophy detection algorithm), has been developed for the measurement of changes in cortical thickness in magnetic resonance images (MRI) acquired over time. CLADA creates a subject-specific cortical model which is longitudinally deformed to match images from individual time points. The algorithm was designed to work reliably for lower resolution images, such as the MRIs with 1×1×5 mm³ voxels previously acquired for many clinical trials in multiple sclerosis (MS). CLADA was evaluated to determine reproducibility, accuracy, and sensitivity. Scan-rescan variability was 0.45% for images with 1 mm³ isotropic voxels and 0.77% for images with 1×1×5 mm³ voxels. The mean absolute accuracy error was 0.43 mm, as determined by comparison of CLADA measurements to cortical thickness measured directly in post-mortem tissue. CLADA's sensitivity for correctly detecting at least 0.1 mm change was 86% in a simulation study. A comparison to FreeSurfer showed good agreement (Pearson correlation = 0.73 for global mean thickness). CLADA was also applied to MRIs acquired over 18 months in secondary progressive MS patients who were imaged at two different resolutions. Cortical thinning was detected in this group in both the lower and higher resolution images. CLADA detected a higher rate of cortical thinning in MS patients compared to healthy controls over 2 years. These results show that CLADA can be used for reliable measurement of cortical atrophy in longitudinal studies, even in lower resolution images.
Good veterinary governance: definition, measurement and challenges.
Msellati, L; Commault, J; Dehove, A
2012-08-01
Good veterinary governance assumes the provision of veterinary services that are sustainably financed, universally available, and provided efficiently without waste or duplication, in a manner that is transparent and free of fraud or corruption. Good veterinary governance is a necessary condition for sustainable economic development inasmuch as it promotes the effective delivery of services and improves the overall performance of animal health systems. This article defines governance in Veterinary Services and proposes a framework for its measurement. It also discusses the role of Veterinary Services and analyses the governance dimensions of the performance-assessment tools developed by the World Organisation for Animal Health (OIE). These tools (OIE PVS Tool and PVS Gap Analysis) track the performance of Veterinary Services across countries (a harmonised tool) and over time (the PVS Pathway). The article shows the usefulness of the OIE PVS Tool for measuring governance, but also points to two shortcomings, namely (i) the lack of clear outcome indicators, which is an impediment to a comprehensive assessment of the performance of Veterinary Services, and (ii) the lack of specific measures for assessing the extent of corruption within Veterinary Services and the extent to which demand for better governance is being strengthened within the animal health system. A discussion follows on the drivers of corruption and instruments for perception-based assessments of country governance and corruption. Similarly, the article introduces the concept of social accountability, which is an approach to enhancing government transparency and accountability, and shows how supply-side and demand-side mechanisms complement each other in improving the governance of service delivery. It further elaborates on two instruments--citizen report card surveys and grievance redress mechanisms--because of their wider relevance and their possible applications in many settings, including Veterinary
Adaptive-feedback control algorithm.
Huang, Debin
2006-06-01
This paper is motivated by giving the detailed proofs of, and some interesting remarks on, the results the author obtained in a series of papers [Phys. Rev. Lett. 93, 214101 (2004); Phys. Rev. E 71, 037203 (2005); 69, 067201 (2004)], where an adaptive-feedback algorithm was proposed to effectively stabilize and synchronize chaotic systems. This note proves in detail the rigor of this algorithm from a mathematical viewpoint, and gives some interesting remarks on its potential applications to chaos control and synchronization. In addition, a significant comment on synchronization-based parameter estimation is given, which shows that some techniques proposed in the literature are less rigorous and ineffective in some cases.
Gossip algorithms in quantum networks
NASA Astrophysics Data System (ADS)
Siomau, Michael
2017-01-01
"Gossip algorithms" is a common term for protocols for unreliable information dissemination in natural networks, which are not optimally designed for efficient communication between network entities. We consider the application of gossip algorithms to quantum networks and show that any quantum network can be updated to the optimal configuration with local operations and classical communication. This allows the quantum information dissemination to be sped up, in the best case exponentially. Irrespective of the initial configuration of the quantum network, the update requires at most a polynomial number of local operations and classical communication.
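For readers unfamiliar with the classical notion the abstract builds on, a minimal sketch of pairwise-averaging gossip follows. This illustrates ordinary gossip dissemination on a classical network, not the paper's quantum protocol; the network and parameters are made up.

```python
import random

# Sketch of classical pairwise-averaging gossip: at each step a random edge is
# chosen and its two endpoints replace their values with the pair's average.
# On a connected graph, all values converge to the global mean.

def gossip_average(values, edges, rounds=2000, seed=0):
    rng = random.Random(seed)
    vals = list(values)
    for _ in range(rounds):
        i, j = rng.choice(edges)
        avg = (vals[i] + vals[j]) / 2.0
        vals[i] = vals[j] = avg
    return vals

# Made-up example: a ring of five nodes, one of which holds all the mass.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
vals = gossip_average([10.0, 0.0, 0.0, 0.0, 0.0], edges)
```

Each averaging step conserves the sum, so all five values drift toward the mean of 2.0.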
Exploration of new multivariate spectral calibration algorithms.
Van Benthem, Mark Hilary; Haaland, David Michael; Melgaard, David Kennett; Martin, Laura Elizabeth; Wehlburg, Christine Marie; Pell, Randy J.; Guenard, Robert D.
2004-03-01
A variety of multivariate calibration algorithms for quantitative spectral analyses were investigated and compared, and new algorithms were developed in the course of this Laboratory Directed Research and Development project. We were able to demonstrate the ability of the hybrid classical least squares/partial least squares (CLS/PLS) calibration algorithms to maintain calibrations in the presence of spectrometer drift and to transfer calibrations between spectrometers from the same or different manufacturers. These methods were found to be as good or better in prediction ability as the commonly used partial least squares (PLS) method. We also present the theory for an entirely new class of algorithms labeled augmented classical least squares (ACLS) methods. New factor selection methods are developed and described for the ACLS algorithms. These factor selection methods are demonstrated using near-infrared spectra collected from a system of dilute aqueous solutions. The ACLS algorithm is also shown to provide improved ease of use and better prediction ability than PLS when transferring calibrations between near-infrared calibrations from the same manufacturer. Finally, simulations incorporating either ideal or realistic errors in the spectra were used to compare the prediction abilities of the new ACLS algorithm with that of PLS. We found that in the presence of realistic errors with non-uniform spectral error variance across spectral channels or with spectral errors correlated between frequency channels, ACLS methods generally out-performed the more commonly used PLS method. These results demonstrate the need for realistic error structure in simulations when the prediction abilities of various algorithms are compared. The combination of equal or superior prediction ability and the ease of use of the ACLS algorithms make the new ACLS methods the preferred algorithms to use for multivariate spectral calibrations.
19 CFR 10.605 - Goods classifiable as goods put up in sets.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 1 2010-04-01 2010-04-01 false Goods classifiable as goods put up in sets. 10.605... put up in sets. Notwithstanding the specific rules set forth in General Note 29(n), HTSUS, goods classifiable as goods put up in sets for retail sale as provided for in General Rule of Interpretation 3,...
Motion Cueing Algorithm Development: Initial Investigation and Redesign of the Algorithms
NASA Technical Reports Server (NTRS)
Telban, Robert J.; Wu, Weimin; Cardullo, Frank M.; Houck, Jacob A. (Technical Monitor)
2000-01-01
In this project four motion cueing algorithms were initially investigated. The classical algorithm generated results with large distortion and delay and low magnitude. The NASA adaptive algorithm proved to be well tuned with satisfactory performance, while the UTIAS adaptive algorithm produced less desirable results. Modifications were made to the adaptive algorithms to reduce the magnitude of undesirable spikes. The optimal algorithm was found to have the potential for improved performance with further redesign. The center of simulator rotation was redefined. More terms were added to the cost function to enable more tuning flexibility. A new design approach using a Fortran/Matlab/Simulink setup was employed. A new semicircular canals model was incorporated in the algorithm. With these changes results show the optimal algorithm has some advantages over the NASA adaptive algorithm. Two general problems observed in the initial investigation required solutions. A nonlinear gain algorithm was developed that scales the aircraft inputs by a third-order polynomial, maximizing the motion cues while remaining within the operational limits of the motion system. A braking algorithm was developed to bring the simulator to a full stop at its motion limit and later release the brake to follow the cueing algorithm output.
Monte Carlo algorithm for free energy calculation.
Bi, Sheng; Tong, Ning-Hua
2015-07-01
We propose a Monte Carlo algorithm for the free energy calculation based on configuration space sampling. An upward or downward temperature scan can be used to produce F(T). We implement this algorithm for the Ising model on a square lattice and triangular lattice. Comparison with the exact free energy shows an excellent agreement. We analyze the properties of this algorithm and compare it with the Wang-Landau algorithm, which samples in energy space. This method is applicable to general classical statistical models. The possibility of extending it to quantum systems is discussed.
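The temperature-scan idea rests on the thermodynamic identity F(T)/T = -ln Z(T), whose derivative with respect to inverse temperature is the mean energy. A minimal sketch follows, assuming a tiny 2×2 periodic Ising lattice evaluated by exact enumeration rather than by the paper's Monte Carlo sampling, so that the scan can be checked against the exact free energy.

```python
import math
from itertools import product

# Tiny 2x2 periodic Ising lattice, enumerated exactly. (For L=2 the periodic
# wrap counts each bond twice, which is fine here: it only rescales the
# coupling.) This stands in for the configuration-space sampling of the paper.

def bond_energy(spins, L=2):
    E = 0
    for x in range(L):
        for y in range(L):
            s = spins[x * L + y]
            E -= s * spins[((x + 1) % L) * L + y]   # right neighbor
            E -= s * spins[x * L + ((y + 1) % L)]   # down neighbor
    return E

ENERGIES = [bond_energy(c) for c in product([-1, 1], repeat=4)]

def free_energy(T):
    return -T * math.log(sum(math.exp(-E / T) for E in ENERGIES))

def mean_energy(T):
    w = [math.exp(-E / T) for E in ENERGIES]
    return sum(E * wi for E, wi in zip(ENERGIES, w)) / sum(w)

# "Temperature scan": integrate <E> d(1/T) upward from T1 to T2, using
# F(T2)/T2 - F(T1)/T1 = integral of <E> d(beta) with beta = 1/T.
T1, T2, steps = 2.0, 4.0, 4000
acc = free_energy(T1) / T1
for k in range(steps):
    Ta = T1 + (T2 - T1) * k / steps
    Tb = T1 + (T2 - T1) * (k + 1) / steps
    acc += mean_energy((Ta + Tb) / 2) * (1.0 / Tb - 1.0 / Ta)
F_scan = acc * T2
```

The scanned value F_scan agrees with the directly computed free energy at T2, which is the consistency the paper's upward/downward scans rely on.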
Solving Maximal Clique Problem through Genetic Algorithm
NASA Astrophysics Data System (ADS)
Rajawat, Shalini; Hemrajani, Naveen; Menghani, Ekta
2010-11-01
Genetic algorithm is one of the most interesting heuristic search techniques. It depends basically on three operations: selection, crossover and mutation; the outcome of the three operations is a new population for the next generation. These operations are repeated until the termination condition is reached. All the operations in the algorithm are accessible with today's molecular biotechnology. The simulations show that with this new computing algorithm, it is possible to get a solution from a very small initial data pool, avoiding enumerating all candidate solutions. For randomly generated problems, the genetic algorithm can give a correct solution within a few cycles with high probability.
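The three operations can be sketched for the maximal clique problem itself. The authors' encoding is not given in the abstract, so this illustration uses a common vertex-subset representation with a greedy repair step; all names and parameters are illustrative.

```python
import random

# GA sketch for maximal clique: individuals are vertex subsets; a greedy
# repair step turns any subset into a clique, and fitness is clique size.

def repair(subset, adj):
    """Greedily drop vertices until the subset is a clique."""
    clique = []
    for v in sorted(subset):
        if all(u in adj[v] for u in clique):
            clique.append(v)
    return set(clique)

def ga_max_clique(adj, pop_size=30, gens=60, seed=1):
    rng = random.Random(seed)
    n = len(adj)
    pop = [{v for v in range(n) if rng.random() < 0.5} for _ in range(pop_size)]
    best = set()
    for _ in range(gens):
        pop = [repair(ind, adj) for ind in pop]
        pop.sort(key=len, reverse=True)
        if len(pop[0]) > len(best):
            best = set(pop[0])
        survivors = pop[:pop_size // 2]               # selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = {v for v in range(n)              # uniform crossover
                     if (v in a if rng.random() < 0.5 else v in b)}
            if rng.random() < 0.3:                    # mutation: flip one vertex
                child ^= {rng.randrange(n)}
            children.append(child)
        pop = survivors + children
    return best

# Made-up graph whose maximum clique is {0, 1, 2, 3}
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3), (3, 4), (4, 5)]
adj = {v: set() for v in range(6)}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)
best = ga_max_clique(adj)
```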
Thermostat algorithm for generating target ensembles.
Bravetti, A; Tapias, D
2016-02-01
We present a deterministic algorithm called contact density dynamics that generates any prescribed target distribution in the physical phase space. Akin to the famous model of Nosé and Hoover, our algorithm is based on a non-Hamiltonian system in an extended phase space. However, the equations of motion in our case follow from contact geometry and we show that in general they have a similar form to those of the so-called density dynamics algorithm. As a prototypical example, we apply our algorithm to produce a Gibbs canonical distribution for a one-dimensional harmonic oscillator.
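The extended-phase-space construction the abstract refers to can be illustrated with the Nosé-Hoover equations it generalizes, applied to the same one-dimensional harmonic oscillator. The contact density dynamics equations themselves are not reproduced here; note also that the single harmonic oscillator is a known hard case for Nosé-Hoover ergodicity, so this is only a sketch of the mechanism, with illustrative dt, Q, and T.

```python
import math

# Extended-phase-space thermostat sketch: Nose-Hoover equations for a 1D
# harmonic oscillator. The auxiliary friction variable xi pumps energy in or
# out so that the kinetic term p^2 is driven toward the target temperature T.

def nose_hoover_step(q, p, xi, dt=0.001, T=1.0, Q=1.0):
    # One explicit Euler step of: dq = p, dp = -q - xi*p, dxi = (p^2 - T)/Q
    return (q + dt * p,
            p + dt * (-q - xi * p),
            xi + dt * (p * p - T) / Q)

q, p, xi = 1.0, 0.0, 0.0
total, n = 0.0, 200000
for _ in range(n):
    q, p, xi = nose_hoover_step(q, p, xi)
    total += p * p
avg_p2 = total / n   # crude time average of p^2 along one trajectory
```

For an ergodic thermostat this time average would tend to T; here it merely stays bounded and of order one, which is exactly the shortcoming that motivates generalized constructions like the one in the abstract.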
Approximate learning algorithm in Boltzmann machines.
Yasuda, Muneki; Tanaka, Kazuyuki
2009-11-01
Boltzmann machines can be regarded as Markov random fields. For binary cases, they are equivalent to the Ising spin model in statistical mechanics. Learning in Boltzmann machines is an NP-hard problem, so in general we have to use approximate methods to construct practical learning algorithms in this context. In this letter, we propose new and practical learning algorithms for Boltzmann machines by using the belief propagation algorithm and the linear response approximation, which are often referred to as advanced mean-field methods. Finally, we show the validity of our algorithm using numerical experiments.
Learning with the ratchet algorithm.
Hush, D. R.; Scovel, James C.
2003-01-01
This paper presents a randomized algorithm called Ratchet that asymptotically minimizes (with probability 1) functions that satisfy a positive-linear-dependent (PLD) property. We establish the PLD property and a corresponding realization of Ratchet for a generalized loss criterion for both linear machines and linear classifiers. We describe several learning criteria that can be obtained as special cases of this generalized loss criterion, e.g. classification error, classification loss and weighted classification error. We also establish the PLD property and a corresponding realization of Ratchet for the Neyman-Pearson criterion for linear classifiers. Finally we show how, for linear classifiers, the Ratchet algorithm can be derived as a modification of the Pocket algorithm.
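Since Ratchet is presented as a modification of the Pocket algorithm, a minimal sketch of classic Pocket is useful context: run perceptron updates, but keep ("pocket") the best weight vector seen so far. The Ratchet-specific replacement conditions are not reproduced here, and the toy data below is made up.

```python
import random

# Pocket algorithm sketch: perceptron updates on randomly drawn examples,
# remembering the weights with the fewest training errors seen so far.

def errors(w, b, data):
    return sum(1 for x, y in data
               if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0)

def pocket(data, dim, steps=200, seed=0):
    rng = random.Random(seed)
    w, b = [0.0] * dim, 0.0
    best_w, best_b, best_err = list(w), b, errors(w, b, data)
    for _ in range(steps):
        x, y = rng.choice(data)
        if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:  # misclassified
            w = [wi + y * xi for wi, xi in zip(w, x)]            # perceptron step
            b += y
            e = errors(w, b, data)
            if e < best_err:                                     # pocket update
                best_w, best_b, best_err = list(w), b, e
    return best_w, best_b, best_err

# Made-up linearly separable data: label = sign(x0 - x1)
data = [((2.0, 0.0), 1), ((1.5, 0.5), 1), ((0.0, 2.0), -1), ((0.5, 1.5), -1)]
w, b, err = pocket(data, dim=2)
```

On separable data the pocket eventually holds a zero-error separator; on non-separable data it holds the best classifier encountered, which is the behavior Ratchet refines.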
SDR Input Power Estimation Algorithms
NASA Technical Reports Server (NTRS)
Nappier, Jennifer M.; Briones, Janette C.
2013-01-01
The General Dynamics (GD) S-Band software defined radio (SDR) in the Space Communications and Navigation (SCAN) Testbed on the International Space Station (ISS) provides experimenters an opportunity to develop and demonstrate experimental waveforms in space. The SDR has an analog and a digital automatic gain control (AGC), and the response of the AGCs to changes in SDR input power and temperature was characterized prior to the launch and installation of the SCAN Testbed on the ISS. The AGCs were used to estimate the SDR input power and SNR of the received signal, and the characterization results showed a nonlinear response to SDR input power and temperature. In order to estimate the SDR input power from the AGCs, three algorithms were developed and implemented in the ground software of the SCAN Testbed. The first is a linear straight-line estimator, which uses the digital AGC and the temperature to estimate the SDR input power over a narrower section of the SDR input power range. The second is a linear adaptive filter algorithm that uses both AGCs and the temperature to estimate the SDR input power over a wide input power range. Finally, an algorithm that uses neural networks was designed to estimate the input power over a wide range. This paper describes the algorithms in detail and their associated performance in estimating the SDR input power.
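The linear straight-line estimator described can be illustrated with an ordinary least-squares fit of input power against AGC reading and temperature. The actual SCAN Testbed calibration data and model form are not reproduced here; the numbers below are synthetic and the three-coefficient model is an assumption for illustration.

```python
# Least-squares sketch: fit power ~ a*agc + b*temp + c via normal equations,
# solved with Gaussian elimination (no external libraries).

def fit_linear(rows):
    """rows: list of (agc, temp, power). Returns coefficients [a, b, c]."""
    X = [[agc, temp, 1.0] for agc, temp, _ in rows]
    y = [p for _, _, p in rows]
    # Normal equations: (X^T X) w = X^T y
    A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(3)]
         for i in range(3)]
    b = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(3)]
    # Gaussian elimination with partial pivoting
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, 3):
            m = A[r][i] / A[i][i]
            for c in range(i, 3):
                A[r][c] -= m * A[i][c]
            b[r] -= m * b[i]
    w = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):   # back substitution
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, 3))) / A[i][i]
    return w

# Synthetic calibration data generated from power = 0.5*agc - 0.1*temp + 3
rows = [(a, t, 0.5 * a - 0.1 * t + 3.0)
        for a in (0, 10, 20, 30) for t in (10, 25, 40)]
w = fit_linear(rows)
```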
Parallel job-scheduling algorithms
Rodger, S.H.
1989-01-01
In this thesis, we consider solving job scheduling problems on the CREW PRAM model. We show how to adapt Cole's pipeline merge technique to yield several efficient parallel algorithms for a number of job scheduling problems and one optimal parallel algorithm for the following job scheduling problem: Given a set of n jobs defined by release times, deadlines and processing times, find a schedule that minimizes the maximum lateness of the jobs and allows preemption when the jobs are scheduled to run on one machine. In addition, we present the first NC algorithm for the following job scheduling problem: Given a set of n jobs defined by release times, deadlines and unit processing times, determine if there is a schedule of jobs on one machine, and calculate the schedule if it exists. We identify the notion of a canonical schedule, which is the type of schedule our algorithm computes if there is a schedule. Our algorithm runs in O((log n){sup 2}) time and uses O(n{sup 2}k{sup 2}) processors, where k is the minimum number of distinct offsets of release times or deadlines.
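The single-machine problem described, minimizing maximum lateness with release times and preemption, is solved optimally by the sequential earliest-deadline-first rule. A sketch of that sequential baseline (not the thesis's CREW PRAM algorithms) follows, with made-up example jobs.

```python
import heapq

# Preemptive EDF sketch: always run the released job with the earliest
# deadline, preempting when a newly released job has an earlier deadline.
# This rule minimizes maximum lateness on one machine.

def edf_max_lateness(jobs):
    """jobs: list of (release, deadline, processing). Returns max lateness."""
    jobs = sorted(jobs)                     # by release time
    t, i, heap = 0, 0, []                   # heap holds (deadline, remaining, id)
    lateness = float("-inf")
    n = len(jobs)
    while i < n or heap:
        if not heap and i < n:
            t = max(t, jobs[i][0])          # jump to the next release
        while i < n and jobs[i][0] <= t:    # release all available jobs
            r, d, p = jobs[i]
            heapq.heappush(heap, (d, p, i))
            i += 1
        d, p, j = heapq.heappop(heap)
        # run until the job finishes or the next release arrives
        run = p if i >= n else min(p, jobs[i][0] - t)
        t += run
        if run < p:
            heapq.heappush(heap, (d, p - run, j))   # preempted
        else:
            lateness = max(lateness, t - d)         # finished at time t
    return lateness

# Made-up instance: three jobs as (release, deadline, processing)
jobs = [(0, 10, 5), (1, 4, 2), (3, 6, 1)]
lmax = edf_max_lateness(jobs)
```

For the instance above every job meets its deadline, so the maximum lateness is negative (-1).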
Toward an Aristotelian Conception of Good Listening
ERIC Educational Resources Information Center
Rice, Suzanne
2011-01-01
In this essay Suzanne Rice examines Aristotle's ideas about virtue, character, and education as elements in an Aristotelian conception of good listening. Rice begins by surveying several different contexts in which listening typically occurs, using this information to introduce the argument that what should count as "good listening" must be…
Highlights of Good Manufacturing Practice in Japan.
Morita, K
1990-01-01
Good Manufacturing Practice (GMP) in the pharmaceutical industry originated in the United States. Japan, having absorbed many things from the U.S., is actively seeking to establish Good Manufacturing Practice to match the pharmaceutical manufacturing climate in Japan. Several of the themes which highlight Japanese GMP efforts are presented below.
How To Achieve Good Library Acoustics.
ERIC Educational Resources Information Center
Wiens, Janet
2003-01-01
Discusses how to create a good acoustical environment for college libraries, focusing on requirements related to the HVAC system and lighting, and noting the importance of good maintenance. A sidebar looks at how to design and achieve the most appropriate HVAC and lighting systems for optimum library acoustics. (SM)
Student View: What Do Good Teachers Do?
ERIC Educational Resources Information Center
Educational Horizons, 2012
2012-01-01
Students know what good teaching looks like--but educators rarely ask them. See what these high school students, who are members of the Future Educators Association[R] and want to be teachers themselves, said. FEA is a part of the PDK family of education associations, which includes Pi Lambda Theta. Get insider advice on good teaching from some…
Paleolithic Counseling - The Good Old Days.
ERIC Educational Resources Information Center
King, Paul T.
This paper outlines what clients were like in the "Good Ol' Days", as compared with what they are like now. Formerly clients appeared to come in with a plethora of ego energy, while now it seems more like a depletion. Explicit in our culture now is the idea that it is almost healthy and good to publicize one's private experience. Some of…
19 CFR 10.303 - Originating goods.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 1 2010-04-01 2010-04-01 false Originating goods. 10.303 Section 10.303 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... amended from time to time by the Customs Cooperation Council. (d) Articles of feather. The goods...
Feedback after Good Trials Enhances Learning
ERIC Educational Resources Information Center
Chiviacowsky, Suzete; Wulf, Gabriele
2007-01-01
Recent studies (Chiviacowsky & Wulf, 2002, 2005) have shown that learners prefer to receive feedback after they believe they had a "good" rather than "poor" trial. The present study followed up on this finding and examined whether learning would benefit if individuals received feedback after good relative to poor trials. Participants practiced a…
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 4 2010-01-01 2010-01-01 false Good cause. 276.6 Section 276.6 Agriculture... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM STATE AGENCY LIABILITIES AND FEDERAL SANCTIONS § 276.6 Good.../disallowance and injunctive relief provisions in §§ 276.4 and 276.5, FNS may determine that the State had...
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 4 2013-01-01 2013-01-01 false Good cause. 276.6 Section 276.6 Agriculture... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM STATE AGENCY LIABILITIES AND FEDERAL SANCTIONS § 276.6 Good.../disallowance and injunctive relief provisions in §§ 276.4 and 276.5, FNS may determine that the State had...
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 4 2014-01-01 2014-01-01 false Good cause. 276.6 Section 276.6 Agriculture... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM STATE AGENCY LIABILITIES AND FEDERAL SANCTIONS § 276.6 Good.../disallowance and injunctive relief provisions in §§ 276.4 and 276.5, FNS may determine that the State had...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 4 2012-01-01 2012-01-01 false Good cause. 276.6 Section 276.6 Agriculture... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM STATE AGENCY LIABILITIES AND FEDERAL SANCTIONS § 276.6 Good.../disallowance and injunctive relief provisions in §§ 276.4 and 276.5, FNS may determine that the State had...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 4 2011-01-01 2011-01-01 false Good cause. 276.6 Section 276.6 Agriculture... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM STATE AGENCY LIABILITIES AND FEDERAL SANCTIONS § 276.6 Good.../disallowance and injunctive relief provisions in §§ 276.4 and 276.5, FNS may determine that the State had...
A Good Teacher Can Teach Anything?
ERIC Educational Resources Information Center
Shimon, Jane M.; Brawdy, Paul
This paper examines issues fundamental to educating thoughtful, competent, intelligent teachers by highlighting arguments against out-of-field teaching in physical education teacher education programs. It discusses the inaccurate notion among some people that a good teacher can teach anything, noting what actually constitutes a good teacher in…
Two New PRP Conjugate Gradient Algorithms for Minimization Optimization Models.
Yuan, Gonglin; Duan, Xiabin; Liu, Wenjie; Wang, Xiaoliang; Cui, Zengru; Sheng, Zhou
2015-01-01
Two new PRP conjugate gradient algorithms are proposed in this paper based on two modified PRP conjugate gradient methods: the first algorithm is proposed for solving unconstrained optimization problems, and the second is proposed for solving nonlinear equations. The first method contains two aspects of information: the function value and the gradient value. The two methods both possess some good properties: (1) βk ≥ 0; (2) the search direction has the trust-region property without the use of any line search method; (3) the search direction has the sufficient descent property without the use of any line search method. Under some suitable conditions, we establish the global convergence of the two algorithms. We conduct numerical experiments to evaluate our algorithms. The numerical results indicate that the first algorithm is effective and competitive for solving unconstrained optimization problems and that the second algorithm is effective for solving large-scale nonlinear equations.
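The βk ≥ 0 property can be illustrated with a plain PRP+ iteration, where βk is the PRP formula truncated at zero. The authors' specific modified formulas and line-search-free analysis are not reproduced here; this sketch instead uses an Armijo backtracking line search, a steepest-descent safeguard, and a made-up smooth test function.

```python
# PRP+ conjugate gradient sketch: beta_k = max(0, PRP formula), so beta_k >= 0
# by construction. Test function and all parameters are illustrative.

def f(x):
    return (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2

def grad_f(x):
    return [2 * (x[0] - 1), 20 * (x[1] + 2)]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def prp_plus(x, iters=100):
    g = grad_f(x)
    d = [-gi for gi in g]
    for _ in range(iters):
        if dot(g, g) < 1e-18:
            break
        if dot(g, d) >= 0:                  # safeguard: restart with -g
            d = [-gi for gi in g]
        t = 1.0                             # Armijo backtracking line search
        while f([xi + t * di for xi, di in zip(x, d)]) > f(x) + 1e-4 * t * dot(g, d):
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad_f(x)
        # PRP+ update: beta = max(0, g_new . (g_new - g) / (g . g))
        beta = max(0.0, dot(g_new, [a - b for a, b in zip(g_new, g)]) / dot(g, g))
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x = prp_plus([5.0, 5.0])
```

The iterates converge to the minimizer (1, -2) of the test function; truncating at zero is what yields property (1) above.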
Radiation Hormesis: The Good, the Bad, and the Ugly
Luckey, T.D.
2006-01-01
Three aspects of hormesis with low doses of ionizing radiation are presented: the good, the bad, and the ugly. The good is acceptance by France, Japan, and China of the thousands of studies showing stimulation and/or benefit, with no harm, from low dose irradiation. This includes thousands of people who live in good health with high background radiation. The bad is the nonacceptance of radiation hormesis by the U. S. and most other governments; their linear no threshold (LNT) concept promulgates fear of all radiation and produces laws which have no basis in mammalian physiology. The LNT concept leads to poor health, unreasonable medicine and oppressed industries. The ugly is decades of deception by medical and radiation committees which refuse to consider valid evidence of radiation hormesis in cancer, other diseases, and health. Specific examples are provided for the good, the bad, and the ugly in radiation hormesis. PMID:18648595
Cooperation transition of spatial public goods games
NASA Astrophysics Data System (ADS)
Wang, Xu-Wen; Nie, Sen; Jiang, Luo-Luo; Wang, Bing-Hong
2016-12-01
In public goods games, players attempt to optimize their payoffs by following fair, generous, or extortionate rules, according to the profits they expect relative to their opponents. Under the first type of rule (the fair one), players seek profits equal to their opponents'; the generous rule leads to payoffs lower than those of the opposing players, while the extortionate rule leads to payoffs higher than those of the opponents. To model the three types of behavior, we introduce a conditional strategy with a parameter χ, which controls the conditions under which the three behaviors arise. Players may therefore contribute in one group but not contribute in another, depending on the conditions of their opponents. Simulation results show that a pure cooperation state exists when the parameter χ is moderate, even for a low multiplication factor. Because conditional players contribute cautiously, they can form compact clusters that prevent the invasion of defection and finally spread cooperation when defectors die out.
Joni, Saj-nicole A; Beyer, Damon
2009-12-01
Peace and harmony are overrated. Though conflict-free teamwork is often held up as the be-all and end-all of organizational life, it actually can be the worst thing ever to happen to a company. Look at Lehman Brothers. When Dick Fuld took over, he transformed a notoriously contentious workplace into one of Wall Street's most harmonious firms. But his efforts backfired--directors and managers became too agreeable, afraid to rock the boat by pointing out that the firm was heading into a crisis. Research shows that the single greatest predictor of poor company performance is complacency, which is why every organization needs a healthy dose of dissent. Not all kinds of conflict are productive, of course--companies need to find the right balance of alignment and competition and make sure that people's energies are pointed in a positive direction. In this article, two seasoned business advisers lay down ground rules for the right kinds of fights. First, the stakes must be worthwhile: The issue should involve a noble purpose or create noticeable--preferably game-changing--value. Next, good fights focus on the future; they're never about placing blame for the past. And it's critical for leaders to keep fights sportsmanlike, allow informal give-and-take in the trenches, and help soften the blow for the losing parties.
Developing a game plan for good sportsmanship.
Lodl, Kathleen
2005-01-01
It is widely believed in the United States that competition is beneficial for youngsters. However, the media are full of examples of players, fans, and coaches whose behavior veers out of control. There have been well-documented examples of youth in livestock competitions illegally medicating show animals to make them appear calmer, officials biasing their rulings toward a team that will take the most fans to a playoff game, and team rivalries that have become so caustic as to be dangerous for competitors and fans. A university extension and its partners created a program called "Great Fans. Great Sports." in order to teach the kinds of behaviors we wish to instill among all who are involved in competitions. It requires entire communities to develop and implement plans for enhancing sportsmanship in music, debate, drama, 4-H, and other arenas, as well as sports. The goal is to make good sportsmanship not the exception but the norm. The authors provide anecdotal evidence that "Great Fans. Great Sports." is having a positive impact on the attitudes and behaviors of competitors, fans, and communities.
Chaotic CDMA watermarking algorithm for digital image in FRFT domain
NASA Astrophysics Data System (ADS)
Liu, Weizhong; Yang, Wentao; Feng, Zhuoming; Zou, Xuecheng
2007-11-01
A digital image-watermarking algorithm based on the fractional Fourier transform (FRFT) domain is presented in this paper, utilizing a chaotic CDMA technique. As a popular and typical transmission technique, CDMA has many advantages such as privacy, anti-jamming, and low power spectral density, which can provide robustness against image distortions and malicious attempts to remove or tamper with the watermark. A super-hybrid chaotic map, with good auto-correlation and cross-correlation characteristics, is adopted to produce many quasi-orthogonal codes (QOC) that can replace the periodic PN-code used in a traditional CDMA system. The watermark data are divided into many segments, each corresponding to a different chaotic QOC; the segments are modulated into CDMA watermark data and embedded into the low-frequency amplitude coefficients of the FRFT domain of the cover image. During watermark detection, each chaotic QOC extracts its corresponding watermark segment by computing the correlation coefficient between the chaotic QOC and the watermarked data of the detected image. The CDMA technique not only enhances the robustness of the watermark but also compresses the data of the modulated watermark. Experimental results show that the watermarking algorithm performs well in three respects: imperceptibility, robustness against attacks, and security.
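The spread-spectrum embed/detect cycle can be sketched as follows. This is a hedged toy: the logistic map stands in for the paper's super-hybrid chaotic map, a plain coefficient vector stands in for the FRFT-domain coefficients, and the names, seeds, and embedding strength `alpha` are our assumptions:

```python
import numpy as np

def chaotic_code(seed, length):
    """Binarized logistic-map sequence used as a quasi-orthogonal
    spreading code (stand-in for the paper's super-hybrid chaotic map)."""
    x, out = seed, np.empty(length)
    for i in range(length):
        x = 4.0 * x * (1.0 - x)          # fully chaotic logistic map
        out[i] = 1.0 if x > 0.5 else -1.0
    return out

def embed(coeffs, bits, seeds, alpha=0.05):
    """Spread each watermark bit with its own chaotic code (CDMA-style)."""
    marked = coeffs.copy()
    for bit, seed in zip(bits, seeds):
        marked += alpha * (1 if bit else -1) * chaotic_code(seed, len(coeffs))
    return marked

def detect(marked, coeffs, seeds):
    """Recover bits by correlating the residual with each code."""
    resid = marked - coeffs
    return [int(chaotic_code(s, len(coeffs)) @ resid > 0) for s in seeds]
```

Because the codes are quasi-orthogonal, each bit's correlation peak (±alpha·N) dominates the cross-talk from the other codes, which grows only like √N.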
PCA-based artifact removal algorithm for stroke detection using UWB radar imaging.
Ricci, Elisa; di Domenico, Simone; Cianca, Ernestina; Rossi, Tommaso; Diomedi, Marina
2016-09-16
Stroke patients should be dispatched to the highest level of care available in the shortest time. In this context, a transportable system in specialized ambulances, able to evaluate the presence of an acute brain lesion in a short time interval (i.e., a few minutes), could shorten the delay before treatment. UWB radar imaging is an emerging diagnostic branch with great potential for the implementation of a transportable, low-cost device. Transportability, low cost, and short response time pose challenges to the signal processing algorithms for the backscattered signals, as they should guarantee good performance with a reasonably low number of antennas and low computational complexity, which is tightly related to the response time of the device. The paper shows that a PCA-based preprocessing algorithm can: (1) achieve good performance already with a computationally simple beamforming algorithm; (2) outperform state-of-the-art preprocessing algorithms; (3) enable a further improvement in performance (and/or a decrease in the number of antennas) by using a multistatic approach with only a modest increase in computational complexity. This is an important result toward the implementation of a diagnostic device that could play an important role in emergency scenarios.
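The core of such PCA preprocessing, removing the dominant components shared by all channels before beamforming, can be sketched with an SVD. This is a generic clutter-removal sketch, not the paper's exact pipeline; the matrix layout and component count are our assumptions:

```python
import numpy as np

def remove_artifact(X, n_components=1):
    """Subtract the strongest SVD component(s) from an (n_antennas,
    n_samples) matrix of radar traces. The early-time artifact (antenna
    coupling, skin reflection) is nearly identical in every channel, so
    it dominates the first principal component; removing it leaves the
    weaker, channel-dependent target response."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return X - (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
```

On synthetic traces where a strong common waveform masks a weak pulse in one channel, the residual after rank-1 removal is dominated by the pulse.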
Development of an Inverse Algorithm for Resonance Inspection
Lai, Canhai; Xu, Wei; Sun, Xin
2012-10-01
Resonance inspection (RI), which employs the shift in natural frequency spectra between good and anomalous part populations to detect defects, is a non-destructive evaluation (NDE) technique with many advantages, such as low inspection cost, high testing speed, and broad applicability to structures with complex geometry, compared to other contemporary NDE methods. It is already widely used in the automobile industry for quality inspection of safety-critical parts. Unlike some conventionally used NDE methods, however, current RI technology is unable to provide details, i.e., the location, dimensions, or type, of the flaws in discrepant parts. This limitation severely hinders its widespread application and further development. In this study, an inverse RI algorithm based on a maximum correlation function is proposed to quantify the location and size of flaws in a discrepant part. Dog-bone shaped stainless steel samples with and without controlled flaws are used for algorithm development and validation. The results show that multiple flaws can be accurately pinpointed using the algorithms developed, and that the prediction accuracy decreases with increasing flaw number and decreasing distance between flaws.
Two Fibonacci P-code based image scrambling algorithms
NASA Astrophysics Data System (ADS)
Zhou, Yicong; Agaian, Sos; Joyner, Valencia M.; Panetta, Karen
2008-02-01
Image scrambling is used to make images visually unrecognizable such that unauthorized users have difficulty decoding the scrambled image to access the original image. This article presents two new image scrambling algorithms based on the Fibonacci p-code, a parametric sequence. The first algorithm works in the spatial domain and the second in the frequency domain (including the JPEG domain). A parameter, p, is used as a security key and has many possible choices to guarantee the high security of the scrambled images. The presented algorithms can be implemented for encoding/decoding both in full and in partial image scrambling, and can be used in real-time applications such as image data hiding and encryption. Examples of image scrambling are provided. Computer simulations demonstrate that the presented methods also perform well under common image attacks such as cutting (data loss), compression, and noise. The new scrambling methods can be applied to grey-level images and to the three color components of color images. A new Lucas p-code is also introduced. Images scrambled with the Fibonacci p-code are also compared to the scrambling results of the classic Fibonacci numbers and the Lucas p-code. This demonstrates that the classic Fibonacci sequence is a special case of the Fibonacci p-code and shows the different scrambling results of the Fibonacci p-code and the Lucas p-code.
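The parametric sequence itself is easy to generate; for p = 1 it reduces to the classical Fibonacci numbers, which is the "special sequence" relationship mentioned above. The recurrence below is one common definition, and the paper's indexing conventions may differ:

```python
def fib_p(n_terms, p):
    """Fibonacci p-code: F(n) = 1 for n <= p,
    F(n) = F(n-1) + F(n-p-1) for n > p."""
    seq = []
    for n in range(n_terms):
        if n <= p:
            seq.append(1)
        else:
            seq.append(seq[n - 1] + seq[n - p - 1])
    return seq
```

For example, fib_p(8, 1) gives the classical Fibonacci numbers 1, 1, 2, 3, 5, 8, 13, 21, while fib_p(8, 2) gives 1, 1, 1, 2, 3, 4, 6, 9.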
RATE-ADJUSTMENT ALGORITHM FOR AGGREGATE TCP CONGESTION CONTROL
P. TINNAKORNSRISUPHAP, ET AL
2000-09-01
The TCP congestion-control mechanism is an algorithm designed to probe the available bandwidth of the network path that TCP packets traverse. However, it is well-known that the TCP congestion-control mechanism does not perform well on networks with a large bandwidth-delay product due to the slow dynamics in adapting its congestion window, especially for short-lived flows. One promising solution to the problem is to aggregate and share the path information among TCP connections that traverse the same bottleneck path, i.e., Aggregate TCP. However, this paper shows via a queueing analysis of a generalized processor-sharing (GPS) queue with regularly-varying service time that a simple aggregation of local TCP connections together into a single aggregate TCP connection can result in a severe performance degradation. To prevent such a degradation, we introduce a rate-adjustment algorithm. Our simulation confirms that by utilizing our rate-adjustment algorithm on aggregate TCP, connections which would normally receive poor service achieve significant performance improvements without penalizing connections which already receive good service.
An effective algorithm for quick fractal analysis of movement biosignals.
Ripoli, A; Belardinelli, A; Palagi, G; Franchi, D; Bedini, R
1999-01-01
The problem of numerically classifying patterns, of crucial importance in the biomedical field, is faced here by means of their fractal dimension. A new simple algorithm was developed to characterize biomedical one-dimensional signals, avoiding the computationally expensive methods generally required by the classical approach of fractal theory. The algorithm produces a number related to the geometric behaviour of the pattern, providing information on the studied phenomenon. The results are independent of the signal amplitude and exhibit a fractal measure ranging from 1 to 2 for monotonically forward-going one-dimensional curves, in accordance with theory. Accurate calibration and qualification were accomplished by analysing basic waveforms. Further studies concerned the biomedical field, with special reference to gait analysis: so far, well-controlled movements such as walking, going up and down stairs, and running have been investigated. Controlled conditions of the test environment guaranteed the necessary repeatability and accuracy of the practical experiments in setting up the methodology. The algorithm showed good performance in classifying the considered simple movements in the selected sample of normal subjects. The results obtained encourage us to use this technique for effective on-line correlation of movement with other long-term monitored variables such as blood pressure, ECG, etc.
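The abstract does not spell out the formula, but one computationally light, length-based estimator in the same spirit is Katz's fractal dimension, which is largely insensitive to amplitude through the d/L ratio and yields exactly 1 for a straight line; it is offered here only as an illustration of the class of estimator described:

```python
import numpy as np

def katz_fd(y):
    """Katz fractal dimension of a 1-D signal: D = log10(n) /
    (log10(n) + log10(d/L)), where L is the total curve length and d
    the maximum distance from the first point."""
    pts = np.column_stack([np.arange(len(y)), y])
    dists = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    L = dists.sum()                                   # total curve length
    d = np.linalg.norm(pts - pts[0], axis=1).max()    # planar extent
    n = len(y) - 1
    return np.log10(n) / (np.log10(n) + np.log10(d / L))
```

A monotone straight line gives 1 (d equals L), while an irregular signal, whose path length exceeds its extent, scores between 1 and 2.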
Adaptive primal-dual genetic algorithms in dynamic environments.
Wang, Hongfeng; Yang, Shengxiang; Ip, W H; Wang, Dingwei
2009-12-01
Recently, there has been an increasing interest in applying genetic algorithms (GAs) in dynamic environments. Inspired by the complementary and dominance mechanisms in nature, a primal-dual GA (PDGA) has been proposed for dynamic optimization problems (DOPs). In this paper, an important operator in PDGA, i.e., the primal-dual mapping (PDM) scheme, is further investigated to improve the robustness and adaptability of PDGA in dynamic environments. In the improved scheme, two different probability-based PDM operators, where the mapping probability of each allele in the chromosome string is calculated through the statistical information of the distribution of alleles in the corresponding gene locus over the population, are effectively combined according to an adaptive Lamarckian learning mechanism. In addition, an adaptive dominant replacement scheme, which can probabilistically accept inferior chromosomes, is also introduced into the proposed algorithm to enhance the diversity level of the population. Experimental results on a series of dynamic problems generated from several stationary benchmark problems show that the proposed algorithm is a good optimizer for DOPs.
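The flavor of a probability-based PDM operator can be sketched as follows. This is our own toy construction, not the paper's exact formula: the mapping probability of each allele is derived from the allele distribution at that locus over the population, so loci where a chromosome agrees with a stagnant majority are remapped most often:

```python
import numpy as np

def probability_pdm(chrom, population, rng):
    """Probability-based primal-dual mapping (illustrative sketch).
    chrom: binary array; population: (pop_size, n_loci) binary matrix.
    Allele j is flipped with probability equal to the frequency of that
    allele's value at locus j, pushing the population away from
    converged loci and restoring diversity after an environment change."""
    freq = population.mean(axis=0)                  # frequency of allele 1
    p_flip = np.where(chrom == 1, freq, 1.0 - freq)
    return np.where(rng.random(len(chrom)) < p_flip, 1 - chrom, chrom)
```

At the extremes the behavior is deterministic: a chromosome identical to a fully converged population is mapped to its complete dual (complement), while a chromosome opposite to the population is left untouched.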
Implementation of several mathematical algorithms to breast tissue density classification
NASA Astrophysics Data System (ADS)
Quintana, C.; Redondo, M.; Tirao, G.
2014-02-01
The accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, as dense breast tissue can hide lesions, causing cancer to be detected at later stages. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. This paper presents the implementation and performance of different mathematical algorithms designed to standardize the categorization of mammographic images according to the American College of Radiology classifications. These mathematical techniques are based on calculations of intrinsic properties and on comparison with an ideal homogeneous image (joint entropy, mutual information, normalized cross correlation, and index Q) as categorization parameters. The algorithms were evaluated on 100 cases from the mammographic data sets provided by the Ministerio de Salud de la Provincia de Córdoba, Argentina - Programa de Prevención del Cáncer de Mama (Department of Public Health, Córdoba, Argentina, Breast Cancer Prevention Program). The obtained breast classifications were compared with expert medical diagnoses, showing good performance. The implemented algorithms revealed high potential to classify breasts into tissue density categories.
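Two of the comparison measures named above, joint entropy and mutual information, can be computed from a 2-D histogram of the image pair; the bin count (32 here) is our choice, not the paper's:

```python
import numpy as np

def joint_entropy_mi(a, b, bins=32):
    """Joint entropy H(A,B) and mutual information I(A;B) of two images,
    estimated from their joint gray-level histogram."""
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = h / h.sum()
    nz = p > 0
    H_ab = -(p[nz] * np.log2(p[nz])).sum()          # joint entropy
    pa, pb = p.sum(axis=1), p.sum(axis=0)           # marginals
    H_a = -(pa[pa > 0] * np.log2(pa[pa > 0])).sum()
    H_b = -(pb[pb > 0] * np.log2(pb[pb > 0])).sum()
    return H_ab, H_a + H_b - H_ab                   # I(A;B) = H_a+H_b-H_ab
```

Comparing an image with itself gives I(A;A) = H(A), while two independent images give mutual information near zero, which is what makes these quantities usable as similarity scores against a reference image.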
High rate pulse processing algorithms for microcalorimeters
Rabin, Michael; Hoover, Andrew S; Bacrania, Mnesh K; Tan, Hui; Breus, Dimitry; Henning, Wolfgang; Sabourov, Konstantin; Collins, Jeff; Warburton, William K; Dorise, Bertrand; Ullom, Joel N
2009-01-01
It has been demonstrated that microcalorimeter spectrometers based on superconducting transition-edge sensors can readily achieve sub-100 eV energy resolution near 100 keV. However, the active volume of a single microcalorimeter has to be small to maintain good energy resolution, and pulse decay times are normally on the order of milliseconds due to slow thermal relaxation. Consequently, spectrometers are typically built with an array of microcalorimeters to increase detection efficiency and count rate. Large arrays, however, require as much pulse processing as possible to be performed at the front end of the readout electronics to avoid transferring large amounts of waveform data to a host computer for processing. In this paper, we present digital filtering algorithms for processing microcalorimeter pulses in real time at high count rates. The goal for these algorithms, which are being implemented in the readout electronics that we are also currently developing, is to achieve sufficiently good energy resolution for most applications while being (a) simple enough to be implemented in the readout electronics and (b) capable of processing overlapping pulses, thus achieving much higher output count rates than existing algorithms. Details of these algorithms are presented, and their performance is compared to that of the 'optimal filter' that is the dominant pulse processing algorithm in the cryogenic-detector community.
A Parallel Newton-Krylov-Schur Algorithm for the Reynolds-Averaged Navier-Stokes Equations
NASA Astrophysics Data System (ADS)
Osusky, Michal
Aerodynamic shape optimization and multidisciplinary optimization algorithms have the potential not only to improve conventional aircraft, but also to enable the design of novel configurations. By their very nature, these algorithms generate and analyze a large number of unique shapes, resulting in high computational costs. In order to improve their efficiency and enable their use in the early stages of the design process, a fast and robust flow solution algorithm is necessary. This thesis presents an efficient parallel Newton-Krylov-Schur flow solution algorithm for the three-dimensional Navier-Stokes equations coupled with the Spalart-Allmaras one-equation turbulence model. The algorithm employs second-order summation-by-parts (SBP) operators on multi-block structured grids with simultaneous approximation terms (SATs) to enforce block interface coupling and boundary conditions. The discrete equations are solved iteratively with an inexact-Newton method, while the linear system at each Newton iteration is solved using the flexible Krylov subspace iterative method GMRES with an approximate-Schur parallel preconditioner. The algorithm is thoroughly verified and validated, highlighting the correspondence of the current algorithm with several established flow solvers. The solution for a transonic flow over a wing on a mesh of medium density (15 million nodes) shows good agreement with experimental results. Using 128 processors, deep convergence is obtained in under 90 minutes. The solution of transonic flow over the Common Research Model wing-body geometry with grids with up to 150 million nodes exhibits the expected grid convergence behavior. This case was completed as part of the Fifth AIAA Drag Prediction Workshop, with the algorithm producing solutions that compare favourably with several widely used flow solvers. The algorithm is shown to scale well on over 6000 processors. The results demonstrate the effectiveness of the SBP-SAT spatial discretization.
Ferromagnetic Mass Localization in Check Point Configuration Using a Levenberg Marquardt Algorithm
Alimi, Roger; Geron, Nir; Weiss, Eyal; Ram-Cohen, Tsuriel
2009-01-01
A detection and tracking algorithm for ferromagnetic objects based on a two-stage Levenberg-Marquardt Algorithm (LMA) is presented. The procedure is applied to the localization and magnetic moment estimation of ferromagnetic objects moving in the vicinity of an array of two to four 3-axis magnetometers arranged in a check point configuration. The algorithm's first stage provides an estimate of the target trajectory and moment, which is further refined in a second iteration where only the position vector is taken as unknown. The whole procedure is fast enough to provide satisfactory results within a few seconds after the target has been detected. Tests were conducted at Soreq NRC assessing various check point scenarios and targets. The results obtained from this experiment show good localization performance and good robustness in a “noisy” environment. Small targets can be localized with good accuracy using either a vertical “doorway” configuration of two to four sensors or a ground-level configuration of two to four sensors. The calculated trajectory was not affected by nearby magnetic interference such as moving vehicles or a combat soldier inspecting the gateway. PMID:22291540
Scheduling with genetic algorithms
NASA Technical Reports Server (NTRS)
Fennel, Theron R.; Underbrink, A. J., Jr.; Williams, George P. W., Jr.
1994-01-01
In many domains, scheduling a sequence of jobs is an important function contributing to the overall efficiency of the operation. At Boeing, we develop schedules for many different domains, including assembly of military and commercial aircraft, weapons systems, and space vehicles. Boeing is under contract to develop scheduling systems for the Space Station Payload Planning System (PPS) and Payload Operations and Integration Center (POIC). These applications require that we respect certain sequencing restrictions among the jobs to be scheduled while at the same time assigning resources to the jobs. We call this general problem scheduling and resource allocation. Genetic algorithms (GAs) offer a search method that uses a population of solutions and benefits from intrinsic parallelism to search the problem space rapidly, producing near-optimal solutions. Good intermediate solutions are probabilistically recombined to produce better offspring (based upon some application-specific measure of solution fitness, e.g., minimum flowtime or schedule completeness). Also, at any point in the search, any intermediate solution can be accepted as a final solution; allowing the search to proceed longer usually produces a better solution, while terminating the search at virtually any time may still yield an acceptable solution. Many processes are constrained by restrictions of sequence among the individual jobs: for a specific job, other jobs must be completed beforehand. While there are obviously many other constraints on processes, it is these on which we focused for this research: how to allocate crews to jobs while satisfying job precedence requirements as well as personnel, tooling, and fixture (or, more generally, resource) requirements.
NASA Astrophysics Data System (ADS)
Zhou, Mandi; Shu, Jiong; Chen, Zhigang; Ji, Minhe
2012-11-01
Hyperspectral imagery has been widely used in terrain classification for its high spectral resolution. Urban vegetation, an essential part of the urban ecosystem, can be difficult to discern due to the high similarity of spectral signatures among some land-cover classes. In this paper, we investigate a hybrid approach, the genetic-algorithm tuned fuzzy support vector machine (GA-FSVM), and apply it to urban vegetation classification from aerial hyperspectral urban imagery. The approach adopts the genetic algorithm to optimize the parameters of the support vector machine, and employs the K-nearest neighbor algorithm to calculate the membership function for each fuzzy parameter, aiming to reduce the effects of isolated and noisy samples. Test data come from a push-broom hyperspectral imager (PHI) remote sensing image partially covering a corner of the Shanghai World Exposition Park; PHI is a hyperspectral sensor developed by the Shanghai Institute of Technical Physics. Experimental results show the GA-FSVM model achieves an overall accuracy of 71.2%, outperforming the maximum likelihood classifier with 49.4% accuracy and the artificial neural network method with 60.8% accuracy. This indicates that GA-FSVM is a promising model for vegetation classification from hyperspectral urban data, with a clear advantage in classification tasks involving abundant mixed pixels and small-sample problems.
NASA Astrophysics Data System (ADS)
Vanhellemont, Filip; Mateshvili, Nina; Blanot, Laurent; Étienne Robert, Charles; Bingen, Christine; Sofieva, Viktoria; Dalaudier, Francis; Tétard, Cédric; Fussen, Didier; Dekemper, Emmanuel; Kyrölä, Erkki; Laine, Marko; Tamminen, Johanna; Zehner, Claus
2016-09-01
The GOMOS instrument on Envisat has successfully demonstrated that a UV-Vis-NIR spaceborne stellar occultation instrument is capable of delivering quality data on the gaseous and particulate composition of Earth's atmosphere. Still, some problems related to data inversion remained to be examined. In the past, it was found that the aerosol extinction profile retrievals in the upper troposphere and stratosphere are of good quality at a reference wavelength of 500 nm but suffer from anomalous, retrieval-related perturbations at other wavelengths. Identification of algorithmic problems and subsequent improvement was therefore necessary. This work has been carried out; the resulting AerGOM Level 2 retrieval algorithm together with the first data version AerGOMv1.0 forms the subject of this paper. The AerGOM algorithm differs from the standard GOMOS IPF processor in a number of important ways: more accurate physical laws have been implemented, all retrieval-related covariances are taken into account, and the aerosol extinction spectral model is strongly improved. Retrieval examples demonstrate that the previously observed profile perturbations have disappeared, and the obtained extinction spectra look in general more consistent. We present a detailed validation study in a companion paper; here, to give a first idea of the data quality, a worst-case comparison at 386 nm shows SAGE II-AerGOM correlation coefficients that are up to 1 order of magnitude larger than the ones obtained with the GOMOS IPFv6.01 data set.
NASA Astrophysics Data System (ADS)
Guo, Qian; Lu, Zhichang; Hirata, Yoshito; Aihara, Kazuyuki
2013-12-01
We propose an algorithm based on cross-entropy to determine parameters of a piecewise linear model, which describes intermittent androgen suppression therapy for prostate cancer. By comparing with clinical data, the parameter estimation for the switched system shows good fitting accuracy and efficiency. We further optimize switching time points for the piecewise linear model to obtain a feasible therapeutic schedule. The simulation results of therapeutic effect are superior to those of previous strategy.
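The cross-entropy method at the core of such parameter estimation is simple to state: sample candidate parameter vectors from a Gaussian, keep the elite fraction with the lowest fitting error, refit the Gaussian, and repeat. The sketch below fits a toy linear model rather than the paper's piecewise linear switched system; population size, elite count, and names are our assumptions:

```python
import numpy as np

def cross_entropy_fit(loss, dim, iters=60, pop=200, elite=20, seed=0):
    """Cross-entropy method for parameter estimation: iteratively refit a
    Gaussian sampling distribution to the elite (lowest-loss) candidates."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.ones(dim) * 2.0
    for _ in range(iters):
        cand = rng.normal(mu, sigma, size=(pop, dim))
        best = cand[np.argsort([loss(c) for c in cand])[:elite]]
        mu, sigma = best.mean(axis=0), best.std(axis=0) + 1e-12
    return mu
```

Because it only needs loss evaluations, the same loop applies unchanged when the loss comes from simulating a switched model against clinical time series.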
A new frame-based registration algorithm
NASA Technical Reports Server (NTRS)
Yan, C. H.; Whalen, R. T.; Beaupre, G. S.; Sumanaweera, T. S.; Yen, S. Y.; Napel, S.
1998-01-01
This paper presents a new algorithm for frame registration. Our algorithm requires only that the frame be comprised of straight rods, as opposed to the N structures or an accurate frame model required by existing algorithms. The algorithm utilizes the full 3D information in the frame as well as a least squares weighting scheme to achieve highly accurate registration. We use simulated CT data to assess the accuracy of our algorithm. We compare the performance of the proposed algorithm to two commonly used algorithms. Simulation results show that the proposed algorithm is comparable to the best existing techniques with knowledge of the exact mathematical frame model. For CT data corrupted with an unknown in-plane rotation or translation, the proposed technique is also comparable to the best existing techniques. However, in situations where there is a discrepancy of more than 2 mm (0.7% of the frame dimension) between the frame and the mathematical model, the proposed technique is significantly better (p < or = 0.05) than the existing techniques. The proposed algorithm can be applied to any existing frame without modification. It provides better registration accuracy and is robust against model mis-match. It allows greater flexibility on the frame structure. Lastly, it reduces the frame construction cost as adherence to a concise model is not required.
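The basic geometric primitive behind registering a frame of straight rods, a least-squares 3-D line fit, can be sketched as follows; the full algorithm adds the weighting scheme and solves for the frame pose, so this is only the building block:

```python
import numpy as np

def fit_rod(points):
    """Least-squares 3-D line fit: returns (centroid, unit direction).
    The direction is the dominant right-singular vector of the centered
    points, which minimizes the sum of squared orthogonal distances."""
    c = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - c)
    return c, Vt[0]
```

Fitting each rod this way uses the full 3-D point cloud along the rod rather than a handful of fiducial intersections, which is one reason a straight-rod formulation can stay accurate without an exact frame model.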
Improved algorithm for hyperspectral data dimension determination
NASA Astrophysics Data System (ADS)
CHEN, Jie; DU, Lei; LI, Jing; HAN, Yachao; GAO, Zihong
2017-02-01
The correlation between adjacent bands of hyperspectral image data is relatively strong, but signal coexists with noise. The HySime (hyperspectral signal identification by minimum error) algorithm, which is based on the principle of least squares, is designed to estimate the noise and the signal correlation matrix. The algorithm is effective when the noise estimate is accurate, but ineffective when the noise estimate is obtained from spectral dimension reduction and de-correlation. This paper proposes an improved HySime algorithm based on a noise-whitening process: instead of removing noise pixel by pixel, it first performs noise whitening on the original data, obtains an accurate estimate of the noise covariance matrix, and then uses the HySime algorithm to calculate the signal correlation matrix, improving the precision of the results. Experiments with both simulated and real data show that: first, the improved HySime algorithm is more accurate and stable than the original HySime algorithm; second, its results are more consistent under different conditions than those of the classic noise subspace projection (NSP) algorithm; finally, the noise-whitening process improves its adaptability to non-white image noise.
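The whitening step itself is standard linear algebra: with an estimated noise covariance C_n = LLᵀ (Cholesky), transforming the data by L⁻¹ turns noise of covariance C_n into white noise with identity covariance. A minimal sketch, with the (bands × pixels) data layout as our assumption:

```python
import numpy as np

def whiten(X, noise_cov):
    """Noise whitening for (bands x pixels) data: factor the estimated
    noise covariance as C_n = L L^T and apply L^{-1}; noise with
    covariance C_n then has identity covariance, which is the
    precondition the whitened-HySime estimate relies on."""
    L = np.linalg.cholesky(noise_cov)
    return np.linalg.solve(L, X)
```

Applied to correlated (non-white) synthetic noise, the sample covariance of the output is close to the identity matrix.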
Finding Good Health Information on the Internet
Past Issues / Fall 2016. Stephanie ... conditions, medications, and wellness issues. Our site provides access to information produced by the National Library of ...
19 CFR 10.594 - Originating goods.
Code of Federal Regulations, 2010 CFR
2010-04-01
... entirely in the territory of one or more of the Parties and: (1) Each non-originating material used in the production of the good undergoes an applicable change in tariff classification specified in General Note...
19 CFR 10.531 - Originating goods.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the Parties; (b) The good is transformed in one or both of the Parties so that: (1) Each non-originating material undergoes an applicable change in tariff classification specified in General Note...
Tips for Getting a Good Night's Sleep
Feature: Sleep Disorders. Tips for Getting A Good Night's Sleep. Past Issues / ... in bed at night, you may have a sleep disorder. Your family healthcare provider or a sleep specialist ...
The Goodness of Simultaneous Fits in ISIS
NASA Astrophysics Data System (ADS)
Kühnel, Matthias; Falkner, Sebastian; Grossberger, Christoph; Ballhausen, Ralf; Dauser, Thomas; Schwarm, Fritz-Walter; Kreykenbohm, Ingo; Nowak, Michael A.; Pottschmidt, Katja; Ferrigno, Carlo; Rothschild, Richard E.; Martínez-Núñez, Silvia; Torrejón, José Miguel; Fürst, Felix; Klochkov, Dmitry; Staubert, Rüdiger; Kretschmar, Peter; Wilms, Jörn
2016-02-01
In a previous work, we introduced a tool for analyzing multiple datasets simultaneously, which has been implemented in ISIS. This tool was used to fit many spectra of X-ray binaries. However, the large number of degrees of freedom and individual datasets raises the question of a good measure for the quality of a simultaneous fit. We present three ways to check the goodness of these fits: we investigate the goodness of each fit in all datasets, we define a combined goodness exploiting the logical structure of a simultaneous fit, and we stack the fit residuals of all datasets to detect weak features. These tools are applied to all RXTE spectra of GRO J1008-57, revealing calibration features that are not detected significantly in any single spectrum. Stacking the residuals from the best-fit model for the Vela X-1 and XTE J1859+083 data reveals fluorescent emission lines that would have gone undetected otherwise.
Wisdom, technology, and the good life.
Markey, H T
1979-01-01
Wisdom lies in extraction of good from new and old. Wisdom alone produces a society of wise men unable to leave their caves. Technology alone produces a society ruled by cold, despotic facts. A proper combination of wisdom and technology can produce the good life. That requires recognition of our ambivalence toward technology, a move away from our superspecialization of technologists and nontechnologists and toward a clearer understanding of technology as a most important servant of man. PMID:396156
The Goods Upstairs Car Innovative Design
NASA Astrophysics Data System (ADS)
Wang, Feng-Lan; Zhang, Bo; Gao, Bo; Liu, Yan-Xin; Gao, Bo
2016-05-01
The design is a new kind of cart for carrying goods while climbing stairs. The cart, which is safe and convenient, consists of a body, chassis, base, wheels, a loading stage, stair-climbing wheel trains, handles, a storage tank, a security fence, etc. The combination of these structures allows goods, and even some large potted plants, to be carried smoothly while going up or down stairs.
License plate detection algorithm
NASA Astrophysics Data System (ADS)
Broitman, Michael; Klopovsky, Yuri; Silinskis, Normunds
2013-12-01
A novel algorithm for vehicle license plate localization is proposed. The algorithm is based on pixel intensity transition gradient analysis. Nearly 2500 natural-scene gray-level vehicle images with different backgrounds and ambient illumination were tested. The best set of algorithm parameters produces a detection rate of up to 0.94. Taking into account the abnormal camera locations during our tests, and the resulting geometrical distortion and occlusion by trees, this result can be considered acceptable. Correlations between source data, such as license plate dimensions and texture, camera location, and others, and the parameters of the algorithm were also defined.
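The core observation behind intensity-transition methods is that a plate region produces many strong horizontal gradients per row (dark characters on a light background). A minimal sketch of that row-scoring step follows; the thresholds and the synthetic test image are our own illustrative choices, not the paper's parameters.

```python
import numpy as np

def plate_candidate_rows(gray, grad_thresh=40, min_transitions=20):
    """Return image rows likely to intersect a license plate.

    Counts per-row horizontal intensity transitions whose absolute
    gradient exceeds `grad_thresh`; rows with at least `min_transitions`
    such transitions are plate candidates.
    """
    # int32 avoids uint8 wrap-around when differencing pixel values
    grad = np.abs(np.diff(gray.astype(np.int32), axis=1))
    counts = (grad > grad_thresh).sum(axis=1)
    return np.flatnonzero(counts >= min_transitions)

# Synthetic image: flat background with a striped "plate" band.
img = np.full((100, 200), 120, dtype=np.uint8)
img[40:50, 60:140:4] = 10          # character-like dark stripes every 4 px
rows = plate_candidate_rows(img)   # rows 40..49 intersect the band
```

Each dark stripe contributes two strong transitions (entering and leaving it), so the band rows accumulate about 40 transitions while background rows accumulate none.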
Distributed Minimum Hop Algorithms
1982-01-01
acknowledgement), node d starts iteration i+1; otherwise the algorithm terminates. A detailed description of the algorithm is given in pidgin algol. The precise behavior of the algorithm under these circumstances is described by the pidgin algol program in the appendix, which is executed by each node. An induction argument over the neighbors' counters bounds the values at each iteration, completing the proof, and is followed by the listing of Algorithm D1 in pidgin algol.
TrackEye tracking algorithm characterization
NASA Astrophysics Data System (ADS)
Valley, Michael T.; Shields, Robert W.; Reed, Jack M.
2004-10-01
TrackEye is a film digitization and target tracking system that offers the potential for quantitatively measuring the dynamic state variables (e.g., absolute and relative position, orientation, linear and angular velocity/acceleration, spin rate, trajectory, angle of attack, etc.) for moving objects using captured single or dual view image sequences. At the heart of the system is a set of tracking algorithms that automatically find and quantify the location of user selected image details such as natural test article features or passive fiducials that have been applied to cooperative test articles. This image position data is converted into real world coordinates and rates with user specified information such as the image scale and frame rate. Though tracking methods such as correlation algorithms are typically robust by nature, the accuracy and suitability of each TrackEye tracking algorithm is in general unknown even under good imaging conditions. The challenges of optimal algorithm selection and algorithm performance/measurement uncertainty are even more significant for long range tracking of high-speed targets where temporally varying atmospheric effects degrade the imagery. This paper will present the preliminary results from a controlled test sequence used to characterize the performance of the TrackEye tracking algorithm suite.
Making Good Teaching Great: Everyday Strategies for Teaching with Impact
ERIC Educational Resources Information Center
Breaux, Annette L.; Whitaker, Todd
2012-01-01
Every good teacher strives to be a great teacher--and this must-have book shows you how! It's filled with practical tips and strategies for connecting with your students in a meaningful and powerful way. Learn how to improve student learning with easy-to-implement daily activities designed to integrate seamlessly into any day of the school year.…
Effects of the Good Behavior Game across Classroom Contexts
ERIC Educational Resources Information Center
Pennington, Brittany; McComas, Jennifer J.
2017-01-01
The Good Behavior Game (GBG), a well-researched classroom group contingency, is typically played for brief periods of time, which raises questions about the effects on subsequent contexts. This study used a multiple baseline design and showed that when the GBG was implemented in one context, behavior improved in only that context. Behavior…
An improved conscan algorithm based on a Kalman filter
NASA Technical Reports Server (NTRS)
Eldred, D. B.
1994-01-01
Conscan is commonly used by DSN antennas to allow adaptive tracking of a target whose position is not precisely known. This article describes an algorithm that is based on a Kalman filter and is proposed to replace the existing fast Fourier transform based (FFT-based) algorithm for conscan. Advantages of this algorithm include better pointing accuracy, continuous update information, and accommodation of missing data. Additionally, a strategy for adaptive selection of the conscan radius is proposed. The performance of the algorithm is illustrated through computer simulations and compared to the FFT algorithm. The results show that the Kalman filter algorithm is consistently superior.
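Two of the claimed advantages, continuous update information and accommodation of missing data, can be seen in a minimal scalar Kalman filter sketch. This is a toy model of one pointing-offset component (the article's filter operates on the conscan power signal and is more elaborate); the noise variances q and r are illustrative.

```python
import numpy as np

def kalman_offset(measurements, q=1e-4, r=0.25):
    """Scalar Kalman filter tracking a slowly varying pointing offset.

    Unlike an FFT over a full conscan period, the estimate updates with
    every sample, and missing samples (None) are simply skipped: the
    prediction is carried forward with inflated uncertainty.
    """
    x, p = 0.0, 1.0            # state estimate and its variance
    estimates = []
    for z in measurements:
        p += q                 # predict: offset modeled as a random walk
        if z is not None:      # update only when a measurement exists
            k = p / (p + r)    # Kalman gain
            x += k * (z - x)
            p *= (1 - k)
        estimates.append(x)
    return estimates

rng = np.random.default_rng(1)
true_offset = 0.8
zs = [true_offset + rng.normal(0.0, 0.5) for _ in range(200)]
zs[50:60] = [None] * 10        # a gap of missing samples
est = kalman_offset(zs)
```

An FFT-based estimator would have to drop or interpolate the gap; here the filter just coasts through it and keeps converging afterwards.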
Wire Detection Algorithms for Navigation
NASA Technical Reports Server (NTRS)
Kasturi, Rangachar; Camps, Octavia I.
2002-01-01
In this research we addressed the problem of obstacle detection for low altitude rotorcraft flight. In particular, the problem of detecting thin wires in the presence of image clutter and noise was studied. Wires present a serious hazard to rotorcraft. Since they are very thin, detecting them early enough for the pilot to take evasive action is difficult, as their images can be less than one or two pixels wide. Two approaches were explored for this purpose. The first approach involved a technique for sub-pixel edge detection and subsequent post processing, in order to reduce false alarms. After reviewing the line detection literature, an algorithm for sub-pixel edge detection proposed by Steger was identified as having good potential to solve the considered task. The algorithm was tested using a set of images synthetically generated by combining real outdoor images with computer generated wire images. The performance of the algorithm was evaluated both at the pixel and the wire levels. It was observed that the algorithm performs well, provided that the wires are not too thin (or distant) and that some post processing is performed to remove false alarms due to clutter. The second approach involved the use of an example-based learning scheme, namely Support Vector Machines. The purpose of this approach was to explore the feasibility of an example-based learning approach for the task of detecting wires from their images. Support Vector Machines (SVMs) have emerged as a promising pattern classification tool and have been used in various applications. It was found that this approach is not suitable for very thin wires and, of course, not at all suitable for sub-pixel-thick wires. High dimensionality of the data as such does not present a major problem for SVMs. However it is desirable to have a large number of training examples, especially for high dimensional data. The main difficulty in using SVMs (or any other example-based learning
A Dual Frequency Carrier Phase Error Difference Checking Algorithm for the GNSS Compass
Liu, Shuo; Zhang, Lei; Li, Jian
2016-01-01
The performance of the Global Navigation Satellite System (GNSS) compass is related to the quality of carrier phase measurement. Processing the carrier phase error properly is important for improving GNSS compass accuracy. In this work, we propose a dual frequency carrier phase error difference checking algorithm for the GNSS compass. The algorithm aims at eliminating large carrier phase errors in dual frequency double differenced carrier phase measurements according to the error difference between the two frequencies. The advantage of the proposed algorithm is that it does not need additional environment information and performs well in the presence of multiple large errors compared with previous research. The core of the proposed algorithm is removing the geometrical distance from the dual frequency carrier phase measurements, after which the carrier phase error is separated and detectable. We generate the Double Differenced Geometry-Free (DDGF) measurement according to the characteristic that the different frequency carrier phase measurements contain the same geometrical distance. Then, we propose the DDGF detection to detect the large carrier phase error difference between the two frequencies. The theoretical performance of the proposed DDGF detection is analyzed. An open sky test, a man-made multipath test, and an urban vehicle test were carried out to evaluate the performance of the proposed algorithm. The results show that the proposed DDGF detection is able to detect large errors in dual frequency carrier phase measurements by checking the error difference between the two frequencies. After the DDGF detection, the accuracy of the baseline vector is improved in the GNSS compass. PMID:27886153
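The geometry-free idea rests on a simple identity: both frequencies observe the same double-differenced geometric range, so subtracting the two phase measurements (converted to metres, with ambiguities removed) cancels the geometry and leaves only the errors. The sketch below ignores ionospheric delay and noise modeling, which the full method must handle; the threshold and ambiguity values are illustrative.

```python
import numpy as np

LAM1, LAM2 = 0.1903, 0.2442    # approximate GPS L1/L2 wavelengths (m)

def ddgf_detect(phi1, n1, phi2, n2, thresh=0.05):
    """Flag epochs with a large dual-frequency geometry-free difference.

    phi1/phi2: double-differenced carrier phases in cycles;
    n1/n2: the corresponding integer ambiguities.  The combination
    d = lam1*(phi1 - n1) - lam2*(phi2 - n2) removes the shared
    geometric range, so |d| > thresh flags a large phase error.
    """
    d = LAM1 * (phi1 - n1) - LAM2 * (phi2 - n2)
    return np.abs(d) > thresh

g = np.array([5.0, 5.1, 5.2, 5.3])          # true DD ranges (m)
n1, n2 = 10.0, 7.0
phi1 = g / LAM1 + n1                        # clean L1 measurements
phi2 = g / LAM2 + n2                        # clean L2 measurements
phi2_bad = phi2.copy()
phi2_bad[2] += 1.0                          # a 1-cycle error at epoch 2
flags = ddgf_detect(phi1, n1, phi2_bad, n2)
```

For clean epochs the combination is exactly zero here; the injected one-cycle slip leaves a residual of one L2 wavelength (about 0.24 m), well above the threshold.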
Fast algorithm for scaling analysis with higher-order detrending moving average method
NASA Astrophysics Data System (ADS)
Tsujimoto, Yutaka; Miki, Yuki; Shimatani, Satoshi; Kiyono, Ken
2016-05-01
Among scaling analysis methods based on the root-mean-square deviation from the estimated trend, it has been demonstrated that centered detrending moving average (DMA) analysis with a simple moving average has good performance when characterizing long-range correlation or fractal scaling behavior. Furthermore, higher-order DMA has also been proposed; it is shown to have better detrending capability than the original DMA, removing higher-order polynomial trends. However, a straightforward implementation of higher-order DMA requires a very high computational cost, which would prevent practical use of this method. To solve this issue, in this study, we introduce a fast algorithm for higher-order DMA, which consists of two techniques: (1) parallel translation of moving-averaging windows by a fixed interval; (2) recurrence formulas for the calculation of summations. Our algorithm can significantly reduce computational cost. Monte Carlo experiments show that the computational time of our algorithm is approximately proportional to the data length, whereas that of the conventional algorithm is proportional to the square of the data length. The efficiency of our algorithm is also shown by a systematic study of the performance of higher-order DMA, such as the range of detectable scaling exponents and detrending capability for removing polynomial trends. In addition, through the analysis of heart-rate variability time series, we discuss possible applications of higher-order DMA.
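The recurrence-formula speedup can be illustrated in its simplest form, the zeroth-order (simple moving average) case: keeping a running cumulative sum turns each window sum into a single subtraction, reducing O(n·w) work to O(n). This is only a sketch of the idea; the paper's algorithm extends such recurrences to higher-order polynomial fits.

```python
import numpy as np

def moving_average_fast(x, w):
    """Centered moving average in O(n) via a cumulative-sum recurrence.

    c[i] holds the sum of x[:i], so any window sum is c[hi] - c[lo],
    one subtraction per output point regardless of the window size w.
    Windows are truncated at the boundaries.
    """
    c = np.concatenate(([0.0], np.cumsum(x)))
    half = w // 2
    n = len(x)
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out[i] = (c[hi] - c[lo]) / (hi - lo)
    return out

x = np.arange(10, dtype=float)
ma = moving_average_fast(x, 5)
```

On a linear trend the centered moving average reproduces the input exactly away from the boundaries, which makes a convenient correctness check.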
Wang, Xueyi
2011-01-01
The k-nearest neighbors (k-NN) algorithm is a widely used machine learning method that finds nearest neighbors of a test object in a feature space. We present a new exact k-NN algorithm called kMkNN (k-Means for k-Nearest Neighbors) that uses k-means clustering and the triangle inequality to accelerate the search for nearest neighbors in a high dimensional space. The kMkNN algorithm has two stages. In the buildup stage, instead of using complex tree structures such as metric trees, kd-trees, or ball-trees, kMkNN uses a simple k-means clustering method to preprocess the training dataset. In the searching stage, given a query object, kMkNN finds nearest training objects starting from the nearest cluster to the query object and uses the triangle inequality to reduce the distance calculations. Experiments show that the performance of kMkNN is surprisingly good compared to the traditional k-NN algorithm and tree-based k-NN algorithms such as kd-trees and ball-trees. On a collection of 20 datasets with up to 10^6 records and 10^4 dimensions, kMkNN shows a 2- to 80-fold reduction of distance calculations and a 2- to 60-fold speedup over the traditional k-NN algorithm for 16 datasets. Furthermore, kMkNN performs significantly better than a kd-tree based k-NN algorithm for all datasets and performs better than a ball-tree based k-NN algorithm for most datasets. The results show that kMkNN is effective for searching nearest neighbors in high dimensional spaces. PMID:22247818
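The two stages can be sketched for the 1-NN case. The pruning rule is the triangle inequality: for any training point p in cluster j, d(q, p) >= d(q, centroid_j) - radius_j, so if that lower bound already exceeds the current best distance, the whole cluster can be skipped while the result stays exact. This is a minimal illustration with a plain Lloyd loop, not the paper's implementation; all names and parameters are ours.

```python
import numpy as np

def kmknn_1nn(train, query, n_clusters=4, seed=0):
    """Exact 1-NN search using k-means clusters + triangle inequality."""
    rng = np.random.default_rng(seed)
    # Buildup stage: a few plain Lloyd iterations instead of a tree.
    cent = train[rng.choice(len(train), n_clusters, replace=False)].copy()
    for _ in range(10):
        lab = np.argmin(np.linalg.norm(train[:, None, :] - cent, axis=2), axis=1)
        for j in range(n_clusters):
            if np.any(lab == j):
                cent[j] = train[lab == j].mean(axis=0)
    radius = np.array([np.linalg.norm(train[lab == j] - cent[j], axis=1).max()
                       if np.any(lab == j) else 0.0
                       for j in range(n_clusters)])
    # Searching stage: visit clusters nearest-first, prune with the bound.
    dq = np.linalg.norm(cent - query, axis=1)
    best, best_i = np.inf, -1
    for j in np.argsort(dq):
        idx = np.flatnonzero(lab == j)
        if len(idx) == 0 or dq[j] - radius[j] >= best:
            continue            # no member of cluster j can beat `best`
        d = np.linalg.norm(train[idx] - query, axis=1)
        if d.min() < best:
            best, best_i = d.min(), int(idx[np.argmin(d)])
    return best_i, best

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 8))
q = X[17] + 0.01                 # query very close to training point 17
i, dist = kmknn_1nn(X, q)
```

Because the prune only discards clusters whose lower bound exceeds the incumbent, the answer always matches a brute-force scan.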
A hybrid artificial bee colony algorithm for numerical function optimization
NASA Astrophysics Data System (ADS)
Alqattan, Zakaria N.; Abdullah, Rosni
2015-02-01
The Artificial Bee Colony (ABC) algorithm is one of the swarm intelligence algorithms; it was introduced by Karaboga in 2005. It is a meta-heuristic optimization search algorithm inspired by the intelligent foraging behavior of honey bees in nature. Its unique search process has made it one of the most competitive algorithms compared with other search algorithms in the area of optimization, such as the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). However, the performance of the ABC's local search process, and its bee movement (solution improvement) equation, still has some weaknesses. The ABC is good at avoiding being trapped in a local optimum, but it spends its time searching around unpromising, randomly selected solutions. Inspired by the PSO, we propose a Hybrid Particle-movement ABC algorithm called HPABC, which adapts the particle movement process to improve the exploration of the original ABC algorithm. Numerical benchmark functions were used in order to experimentally test the HPABC algorithm. The results illustrate that the HPABC algorithm can outperform the ABC algorithm in most of the experiments (75% better in accuracy and over 3 times faster).
Algorithm That Synthesizes Other Algorithms for Hashing
NASA Technical Reports Server (NTRS)
James, Mark
2010-01-01
An algorithm that includes a collection of several subalgorithms has been devised as a means of synthesizing still other algorithms (which could include computer code) that utilize hashing to determine whether an element (typically, a number or other datum) is a member of a set (typically, a list of numbers). Each subalgorithm synthesizes an algorithm (e.g., a block of code) that maps a static set of key hashes to a somewhat linear monotonically increasing sequence of integers. The goal in formulating this mapping is to cause the length of the sequence thus generated to be as close as practicable to the original length of the set and thus to minimize gaps between the elements. The advantage of the approach embodied in this algorithm is that it completely avoids the traditional approach of hash-key look-ups that involve either secondary hash generation and look-up or further searching of a hash table for a desired key in the event of collisions. This algorithm guarantees that it will never be necessary to perform a search or to generate a secondary key in order to determine whether an element is a member of a set. This algorithm further guarantees that any algorithm that it synthesizes can be executed in constant time. To enforce these guarantees, the subalgorithms are formulated to employ a set of techniques, each of which works very effectively covering a certain class of hash-key values. These subalgorithms are of two types, summarized as follows: Given a list of numbers, try to find one or more solutions in which, if each number is shifted to the right by a constant number of bits and then masked with a rotating mask that isolates a set of bits, a unique number is thereby generated. In a variant of the foregoing procedure, omit the masking. Try various combinations of shifting, masking, and/or offsets until the solutions are found. From the set of solutions, select the one that provides the greatest compression for the representation and is executable in the
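The first type of subalgorithm described above, shifting each key right by a constant and masking off a set of bits until every key maps to a unique value, can be sketched as a small brute-force search. This toy version omits the rotating masks, offsets, and compression-quality selection of the full scheme; the function name and search bounds are illustrative.

```python
def find_shift_mask(keys, max_shift=32, max_bits=8):
    """Search for (shift, mask) so that (k >> shift) & mask is unique per key.

    Tries the narrowest masks first; a successful pair yields a
    collision-free, constant-time mapping from keys to small integers,
    with no secondary hashing or table search ever needed.
    """
    for bits in range(1, max_bits + 1):
        mask = (1 << bits) - 1
        for shift in range(max_shift):
            vals = {(k >> shift) & mask for k in keys}
            if len(vals) == len(keys):   # every key got a unique value
                return shift, mask
    return None                          # no pair found within the bounds

keys = [0x10, 0x23, 0x47, 0x81]
sm = find_shift_mask(keys)
```

Once a pair is found, membership testing is a shift, a mask, and one array lookup, which is the constant-time guarantee the abstract refers to.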
Efficient Homotopy Continuation Algorithms with Application to Computational Fluid Dynamics
NASA Astrophysics Data System (ADS)
Brown, David A.
New homotopy continuation algorithms are developed and applied to a parallel implicit finite-difference Newton-Krylov-Schur external aerodynamic flow solver for the compressible Euler, Navier-Stokes, and Reynolds-averaged Navier-Stokes equations with the Spalart-Allmaras one-equation turbulence model. Many new analysis tools, calculations, and numerical algorithms are presented for the study and design of efficient and robust homotopy continuation algorithms applicable to solving very large and sparse nonlinear systems of equations. Several specific homotopies are presented and studied, and a methodology is presented for assessing the suitability of specific homotopies for homotopy continuation. A new class of homotopy continuation algorithms, referred to as monolithic homotopy continuation algorithms, is developed. These algorithms differ from classical predictor-corrector algorithms by combining the predictor and corrector stages into a single update, significantly reducing the amount of computation and avoiding wasted computational effort resulting from over-solving in the corrector phase. The new algorithms are also simpler from a user perspective, with fewer input parameters, which also improves the user's ability to choose effective parameters on the first flow solve attempt. Conditional convergence is proved analytically and studied numerically for the new algorithms. The performance of a fully-implicit monolithic homotopy continuation algorithm is evaluated for several inviscid, laminar, and turbulent flows over NACA 0012 airfoils and ONERA M6 wings. The monolithic algorithm is demonstrated to be more efficient than the predictor-corrector algorithm for all applications investigated. It is also demonstrated to be more efficient than the widely-used pseudo-transient continuation algorithm for all inviscid and laminar cases investigated, and good performance scaling with grid refinement is demonstrated for the inviscid cases. Performance is also demonstrated
Transitional Division Algorithms.
ERIC Educational Resources Information Center
Laing, Robert A.; Meyer, Ruth Ann
1982-01-01
A survey of general mathematics students whose teachers were taking an inservice workshop revealed that they had not yet mastered division. More direct introduction of the standard division algorithm is favored in elementary grades, with instruction of transitional processes curtailed. Weaknesses in transitional algorithms appear to outweigh…
Ultrametric Hierarchical Clustering Algorithms.
ERIC Educational Resources Information Center
Milligan, Glenn W.
1979-01-01
Johnson has shown that the single linkage and complete linkage hierarchical clustering algorithms induce a metric on the data known as the ultrametric. Johnson's proof is extended to four other common clustering algorithms. Two additional methods also produce hierarchical structures which can violate the ultrametric inequality. (Author/CTM)
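The ultrametric property being discussed is the strengthened triangle inequality d(x,z) <= max(d(x,y), d(y,z)), which holds for the cophenetic (merge-height) distances induced by single linkage even when the input distances violate it. A minimal sketch, using a naive single-linkage implementation of our own (not Johnson's or Milligan's code):

```python
import numpy as np
from itertools import permutations

def single_linkage_cophenetic(D):
    """Cophenetic distances from naive single-linkage clustering.

    Repeatedly merge the two closest clusters (single link = minimum
    pointwise distance); the merge height becomes the cophenetic
    distance between every pair split across the merged clusters.
    """
    n = len(D)
    clusters = [{i} for i in range(n)]
    C = np.zeros((n, n))
    while len(clusters) > 1:
        best = (np.inf, 0, 1)
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(D[i, j] for i in clusters[a] for j in clusters[b])
                if d < best[0]:
                    best = (d, a, b)
        d, a, b = best
        for i in clusters[a]:
            for j in clusters[b]:
                C[i, j] = C[j, i] = d
        clusters[a] |= clusters[b]
        del clusters[b]
    return C

def is_ultrametric(C):
    """Check d(x,z) <= max(d(x,y), d(y,z)) for all triples."""
    n = len(C)
    return all(C[x, z] <= max(C[x, y], C[y, z]) + 1e-12
               for x, y, z in permutations(range(n), 3))

D = np.array([[0, 2, 9, 7],
              [2, 0, 8, 6],
              [9, 8, 0, 3],
              [7, 6, 3, 0]], dtype=float)
C = single_linkage_cophenetic(D)
```

Here D itself violates the ultrametric inequality (d(0,3) = 7 exceeds max(d(0,1), d(1,3)) = 6), yet the induced cophenetic matrix C satisfies it, which is the metric-inducing behavior Johnson proved.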
The Training Effectiveness Algorithm.
ERIC Educational Resources Information Center
Cantor, Jeffrey A.
1988-01-01
Describes the Training Effectiveness Algorithm, a systematic procedure for identifying the cause of reported training problems which was developed for use in the U.S. Navy. A two-step review by subject matter experts is explained, and applications of the algorithm to other organizations and training systems are discussed. (Author/LRW)
Prefiltering Model for Homology Detection Algorithms on GPU
Retamosa, Germán; de Pedro, Luis; González, Ivan; Tamames, Javier
2016-01-01
Homology detection has evolved over time from heavy algorithms based on dynamic programming approaches to lightweight alternatives based on different heuristic models. However, the main problem with these algorithms is that they use complex statistical models, which makes it difficult to achieve a relevant speedup and find exact matches with the original results. Thus, their acceleration is essential. The aim of this article was to prefilter a sequence database. To make this work, we have implemented a groundbreaking heuristic model based on NVIDIA's graphics processing units (GPUs) and multicore processors. Depending on the sensitivity settings, this makes it possible to quickly reduce the sequence database by factors between 50% and 95%, while rejecting no significant sequences. Furthermore, this prefiltering application can be used together with multiple homology detection algorithms as part of a next-generation sequencing system. Extensive performance and accuracy tests have been carried out at the Spanish National Centre for Biotechnology (NCB). The results show that GPU hardware can accelerate the execution times of former homology detection applications, such as the National Centre for Biotechnology Information (NCBI) Basic Local Alignment Search Tool for Proteins (BLASTP), by up to a factor of 4. KEY POINTS: Owing to the increasing size of current sequence datasets, filtering approaches and high-performance computing (HPC) techniques are the best solution for processing all this information in acceptable processing times. Graphics processing unit cards and their corresponding programming models are good options for carrying out these processing methods. The combination of filtration models with HPC techniques is able to offer new levels of performance and accuracy in homology detection algorithms such as the National Centre for Biotechnology Information Basic Local Alignment Search Tool. PMID:28008220
NASA Astrophysics Data System (ADS)
Nagao, Toshiyasu; Takeuchi, Akihiro; Nakamura, Kenji
2011-03-01
There are a number of reports on seismic quiescence phenomena before large earthquakes. The RTL algorithm is a weighted-coefficient statistical method that takes into account the magnitude, occurrence time, and location of earthquakes when investigating seismicity pattern changes before large earthquakes. However, we consider the original RTL algorithm to be overweighted on distance. In this paper, we introduce a modified RTL algorithm, called the RTM algorithm, and apply it to three large earthquakes in Japan, namely, the Hyogo-ken Nanbu earthquake in 1995 (M_JMA 7.3), the Noto Hanto earthquake in 2007 (M_JMA 6.9), and the Iwate-Miyagi Nairiku earthquake in 2008 (M_JMA 7.2), as test cases. Because this algorithm uses several parameters to characterize the weighted coefficients, multiparameter sets have to be prepared for the tests. The results show that the RTM algorithm is more sensitive than the RTL algorithm to seismic quiescence phenomena. This paper represents the first step in a series of future analyses of seismic quiescence phenomena using the RTM algorithm. At this moment, all surveyed parameters are empirically selected for use in the method. We have to consider the physical meaning of the "best fit" parameters, such as their relation to ΔCFS, among others, in future analyses.
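The weighted-coefficient idea behind RTL-family methods can be sketched in miniature: each prior event near the target point contributes a distance weight and a time weight, and the product of the summed weights scores the local seismicity. This toy sketch keeps only the R and T factors; the full RTL/RTM method also includes a rupture-length term L, background trend removal, and the modified distance weighting that distinguishes RTM from RTL. All constants and the function name here are illustrative.

```python
import numpy as np

def rt_score(events, x0, t0, r0=50.0, tau0=1.0, r_max=100.0, t_max=2.0):
    """Toy RTL-style score at location x0 (x, y) and time t0.

    Each event (x, y, t) within r_max km and t_max time units before t0
    adds exp(-r/r0) to the distance factor R and exp(-(t0-t)/tau0) to
    the time factor T; the score is the product R*T, so quiescence
    (few recent nearby events) drives the score toward zero.
    """
    R = T = 0.0
    for (x, y, t) in events:
        r = np.hypot(x - x0[0], y - x0[1])
        dt = t0 - t
        if 0 < dt <= t_max and r <= r_max:
            R += np.exp(-r / r0)
            T += np.exp(-dt / tau0)
    return R * T

events = [(10.0, 0.0, 0.5),    # nearby, recent: contributes
          (20.0, 5.0, 1.0),    # nearby, recent: contributes
          (300.0, 0.0, 0.2)]   # beyond r_max: excluded
score = rt_score(events, (0.0, 0.0), 2.0)
```

A drop in this score over time at a fixed location is the kind of quiescence signature the RTL and RTM algorithms are designed to detect.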
Motion Cueing Algorithm Development: Piloted Performance Testing of the Cueing Algorithms
NASA Technical Reports Server (NTRS)
Houck, Jacob A. (Technical Monitor); Telban, Robert J.; Cardullo, Frank M.; Kelly, Lon C.
2005-01-01
The relative effectiveness in simulating aircraft maneuvers with both current and newly developed motion cueing algorithms was assessed with an eleven-subject piloted performance evaluation conducted on the NASA Langley Visual Motion Simulator (VMS). In addition to the current NASA adaptive algorithm, two new cueing algorithms were evaluated: the optimal algorithm and the nonlinear algorithm. The test maneuvers included a straight-in approach with a rotating wind vector, an offset approach with severe turbulence and an on/off lateral gust that occurs as the aircraft approaches the runway threshold, and a takeoff both with and without engine failure after liftoff. The maneuvers were executed with each cueing algorithm with added visual display delay conditions ranging from zero to 200 msec. Two methods, the quasi-objective NASA Task Load Index (TLX), and power spectral density analysis of pilot control, were used to assess pilot workload. Piloted performance parameters for the approach maneuvers, the vertical velocity upon touchdown and the runway touchdown position, were also analyzed but did not show any noticeable difference among the cueing algorithms. TLX analysis reveals, in most cases, less workload and variation among pilots with the nonlinear algorithm. Control input analysis shows pilot-induced oscillations on a straight-in approach were less prevalent compared to the optimal algorithm. The augmented turbulence cues increased workload on an offset approach that the pilots deemed more realistic compared to the NASA adaptive algorithm. The takeoff with engine failure showed the least roll activity for the nonlinear algorithm, with the least rudder pedal activity for the optimal algorithm.
Ab initio study on (CO2)n clusters via electrostatics- and molecular tailoring-based algorithm
NASA Astrophysics Data System (ADS)
Jovan Jose, K. V.; Gadre, Shridhar R.
An algorithm based on molecular electrostatic potential (MESP) and the molecular tailoring approach (MTA) for building energetically favorable molecular clusters is presented. This algorithm is tested on prototype (CO2)n clusters with n = 13, 20, and 25 to explore their structure, energetics, and properties. The most stable clusters in this series are seen to show a greater number of triangular motifs. Many-body energy decomposition analysis performed on the most stable clusters reveals that the 2-body term is the major contributor (>96%) to the total interaction energy. Vibrational frequencies and molecular electrostatic potentials are also evaluated for these large clusters through MTA. The MTA-based MESPs of these clusters show remarkably good agreement with the corresponding actual ones. The most intense MTA-based normal mode frequencies are in fair agreement with the actual ones for smaller clusters. These calculated asymmetric stretching frequencies are blue-shifted with respect to the CO2 monomer.
A set-covering based heuristic algorithm for the periodic vehicle routing problem.
Cacchiani, V; Hemmelmayr, V C; Tricoire, F
2014-01-30
We present a hybrid optimization algorithm for mixed-integer linear programming, embedding both heuristic and exact components. In order to validate it we use the periodic vehicle routing problem (PVRP) as a case study. This problem consists of determining a set of minimum cost routes for each day of a given planning horizon, with the constraints that each customer must be visited a required number of times (chosen among a set of valid day combinations), must receive every time the required quantity of product, and that the number of routes per day (each respecting the capacity of the vehicle) does not exceed the total number of available vehicles. This is a generalization of the well-known vehicle routing problem (VRP). Our algorithm is based on the linear programming (LP) relaxation of a set-covering-like integer linear programming formulation of the problem, with additional constraints. The LP-relaxation is solved by column generation, where columns are generated heuristically by an iterated local search algorithm. The whole solution method takes advantage of the LP-solution and applies techniques of fixing and releasing of the columns as a local search, making use of a tabu list to avoid cycling. We show the results of the proposed algorithm on benchmark instances from the literature and compare them to the state-of-the-art algorithms, showing the effectiveness of our approach in producing good quality solutions. In addition, we report the results on realistic instances of the PVRP introduced in Pacheco et al. (2011) [24] and on benchmark instances of the periodic traveling salesman problem (PTSP), showing the efficacy of the proposed algorithm on these as well. Finally, we report the new best known solutions found for all the tested problems.
Advancements to the planogram frequency–distance rebinning algorithm
Champley, Kyle M; Raylman, Raymond R; Kinahan, Paul E
2010-01-01
In this paper we consider the task of image reconstruction in positron emission tomography (PET) with the planogram frequency–distance rebinning (PFDR) algorithm. The PFDR algorithm is a rebinning algorithm for PET systems with panel detectors. The algorithm is derived in the planogram coordinate system which is a native data format for PET systems with panel detectors. A rebinning algorithm averages over the redundant four-dimensional set of PET data to produce a three-dimensional set of data. Images can be reconstructed from this rebinned three-dimensional set of data. This process enables one to reconstruct PET images more quickly than reconstructing directly from the four-dimensional PET data. The PFDR algorithm is an approximate rebinning algorithm. We show that implementing the PFDR algorithm followed by the (ramp) filtered backprojection (FBP) algorithm in linogram coordinates from multiple views reconstructs a filtered version of our image. We develop an explicit formula for this filter which can be used to achieve exact reconstruction by means of a modified FBP algorithm applied to the stack of rebinned linograms and can also be used to quantify the errors introduced by the PFDR algorithm. This filter is similar to the filter in the planogram filtered backprojection algorithm derived by Brasse et al. The planogram filtered backprojection and exact reconstruction with the PFDR algorithm require complete projections which can be completed with a reprojection algorithm. The PFDR algorithm is similar to the rebinning algorithm developed by Kao et al. By expressing the PFDR algorithm in detector coordinates, we provide a comparative analysis between the two algorithms. Numerical experiments using both simulated data and measured data from a positron emission mammography/tomography (PEM/PET) system are performed. Images are reconstructed by PFDR+FBP (PFDR followed by 2D FBP reconstruction), PFDRX (PFDR followed by the modified FBP algorithm for exact
ERIC Educational Resources Information Center
Holzer, Harry J.
2013-01-01
Stagnant earnings and growing inequality in the US labor market reflect both a slowdown in the growth of worker skills and the growing matching of good-paying jobs to skilled workers. Improving the ties between colleges, workforce institutions, and employers would help more workers gain the needed skills. Evaluation evidence shows that training…
ERIC Educational Resources Information Center
Hipson, Will E.; Séguin, Daniel G.
2016-01-01
The Goodness-of-Fit model [Thomas, A., & Chess, S. (1977). Temperament and development. New York: Brunner/Mazel] proposes that a child's temperament interacts with the environment to influence child outcomes. In the past, researchers have shown how the association between the quality of the teacher-child relationship in daycare and child…
Totally parallel multilevel algorithms
NASA Technical Reports Server (NTRS)
Frederickson, Paul O.
1988-01-01
Four totally parallel algorithms for the solution of a sparse linear system have common characteristics which become quite apparent when they are implemented on a highly parallel hypercube such as the CM2. These four algorithms are Parallel Superconvergent Multigrid (PSMG) of Frederickson and McBryan, Robust Multigrid (RMG) of Hackbusch, the FFT based Spectral Algorithm, and Parallel Cyclic Reduction. In fact, all four can be formulated as particular cases of the same totally parallel multilevel algorithm, which is referred to as TPMA. In certain cases the spectral radius of TPMA is zero, and it is recognized to be a direct algorithm. In many other cases the spectral radius, although not zero, is small enough that a single iteration per timestep keeps the local error within the required tolerance.
Algorithms for Automated DNA Assembly
2010-01-01
…correct theoretical construction scheme is developed manually, it is likely to be suboptimal by any number of cost metrics. Modular, robust and…to an exhaustive search on a small synthetic dataset and our results show that our algorithms can quickly find an optimal solution. Comparison with
A Parallel Rendering Algorithm for MIMD Architectures
NASA Technical Reports Server (NTRS)
Crockett, Thomas W.; Orloff, Tobias
1991-01-01
Applications such as animation and scientific visualization demand high performance rendering of complex three dimensional scenes. To deliver the necessary rendering rates, highly parallel hardware architectures are required. The challenge is then to design algorithms and software which effectively use the hardware parallelism. A rendering algorithm targeted to distributed memory MIMD architectures is described. For maximum performance, the algorithm exploits both object-level and pixel-level parallelism. The behavior of the algorithm is examined both analytically and experimentally. Its performance for large numbers of processors is found to be limited primarily by communication overheads. An experimental implementation for the Intel iPSC/860 shows increasing performance from 1 to 128 processors across a wide range of scene complexities. It is shown that minimal modifications to the algorithm will adapt it for use on shared memory architectures as well.
An efficient quantum algorithm for spectral estimation
NASA Astrophysics Data System (ADS)
Steffens, Adrian; Rebentrost, Patrick; Marvian, Iman; Eisert, Jens; Lloyd, Seth
2017-03-01
We develop an efficient quantum implementation of an important signal processing algorithm for line spectral estimation: the matrix pencil method, which determines the frequencies and damping factors of signals consisting of finite sums of exponentially damped sinusoids. Our algorithm provides a quantum speedup in a natural regime where the sampling rate is much higher than the number of sinusoid components. Along the way, we develop techniques that are expected to be useful for other quantum algorithms as well—consecutive phase estimations to efficiently make products of asymmetric low rank matrices classically accessible and an alternative method to efficiently exponentiate non-Hermitian matrices. Our algorithm features an efficient quantum–classical division of labor: the time-critical steps are implemented in quantum superposition, while an interjacent step, requiring much fewer parameters, can operate classically. We show that frequencies and damping factors can be obtained in time logarithmic in the number of sampling points, exponentially faster than known classical algorithms.
A region labeling algorithm based on block
NASA Astrophysics Data System (ADS)
Wang, Jing
2009-10-01
The time performance of region labeling algorithms is important for image processing. However, common region labeling algorithms cannot meet the requirements of real-time image processing. In this paper, a technique that uses blocks to record connected areas is proposed. With this technique, connected components and the information related to each target can be computed in a single image scan. The algorithm records the coordinates of edge pixels, on both outer and inner edges, together with their labels, and from these it calculates each connected area's centroid, area, and gray level. Compared with other methods, this block-based region labeling algorithm is more efficient and meets the time requirements of real-time processing. Experimental results validate the correctness and efficiency of the algorithm and show that it can detect all connected areas in binary images containing a variety of complex patterns. The block labeling algorithm is now used in a real-time image processing program.
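The abstract does not detail the block data structure; as background, a minimal two-pass connected-component labeling with union-find (a common baseline that block-based methods improve on) can be sketched as follows. The function name and the per-region statistics returned are illustrative choices, not the paper's API.

```python
import numpy as np

def label_regions(binary):
    """Two-pass connected-component labeling (4-connectivity) with union-find.
    Returns a label image and {label: (area, (centroid_y, centroid_x))}."""
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=int)
    parent = [0]                       # parent[i] is the union-find parent of label i

    def find(x):
        while parent[x] != x:          # path halving keeps the forest shallow
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    next_label = 1
    for y in range(h):                 # first pass: provisional labels + equivalences
        for x in range(w):
            if not binary[y, x]:
                continue
            up = labels[y - 1, x] if y > 0 else 0
            left = labels[y, x - 1] if x > 0 else 0
            if up == 0 and left == 0:
                parent.append(next_label)
                labels[y, x] = next_label
                next_label += 1
            elif up and left:
                ru, rl = find(up), find(left)
                labels[y, x] = min(ru, rl)
                parent[max(ru, rl)] = min(ru, rl)   # record equivalence
            else:
                labels[y, x] = up or left
    # Second pass: resolve equivalences and accumulate area and centroid.
    stats = {}
    for y in range(h):
        for x in range(w):
            if labels[y, x]:
                r = find(labels[y, x])
                labels[y, x] = r
                n, sy, sx = stats.get(r, (0, 0.0, 0.0))
                stats[r] = (n + 1, sy + y, sx + x)
    return labels, {r: (n, (sy / n, sx / n)) for r, (n, sy, sx) in stats.items()}
```

The block technique in the paper replaces the pixel-by-pixel second pass with per-block bookkeeping, which is where its speed advantage comes from.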
A Fast parallel tridiagonal algorithm for a class of CFD applications
NASA Technical Reports Server (NTRS)
Moitra, Stuti; Sun, Xian-He
1996-01-01
The parallel diagonal dominant (PDD) algorithm is an efficient tridiagonal solver. This paper presents a variation of the PDD algorithm, the reduced PDD algorithm. The new algorithm maintains the minimum communication provided by the PDD algorithm, but has a reduced operation count. The PDD algorithm also has a smaller operation count than the conventional sequential algorithm for many applications. Accuracy analysis is provided for the reduced PDD algorithm for symmetric Toeplitz tridiagonal (STT) systems. Implementation results on Langley's Intel Paragon and IBM SP2 show that both the PDD and reduced PDD algorithms are efficient and scalable.
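The reduced PDD algorithm itself is not reproduced in the abstract; for reference, the conventional sequential solver it is compared against is the Thomas algorithm. A sketch, using the usual convention that the sub-diagonal entry `a[0]` and super-diagonal entry `c[n-1]` are unused:

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, main diagonal b,
    super-diagonal c, and right-hand side d (all 1-D arrays of length n)."""
    n = len(d)
    cp = np.zeros(n)
    dp = np.zeros(n)
    cp[0] = c[0] / b[0]                       # forward elimination
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)                           # back substitution
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

PDD-type solvers split such a system across processors and use the diagonal dominance assumed above to truncate the coupling between partitions, which is what keeps their communication minimal.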
Kidney-inspired algorithm for optimization problems
NASA Astrophysics Data System (ADS)
Jaddi, Najmeh Sadat; Alvankarian, Jafar; Abdullah, Salwani
2017-01-01
In this paper, a population-based algorithm inspired by the kidney process in the human body is proposed. In this algorithm, solutions are filtered at a rate calculated from the mean of the objective values of all solutions in the current population at each iteration. The filtered solutions, as the better solutions, are moved to the filtered blood, and the rest are transferred to the waste, representing the worse solutions; this simulates the glomerular filtration process in the kidney. A waste solution is reconsidered in later iterations if, after a defined movement operator is applied, it satisfies the filtration rate; otherwise it is expelled from the waste, simulating the kidney's reabsorption and excretion functions. In addition, a solution assigned to the filtered blood is secreted back if it is not better than the worst solutions, simulating the secretion process of blood in the kidney. After placement of all solutions in the population, the best of them is recorded, the waste and filtered blood are merged into a new population, and the filtration rate is updated. Filtration provides the required exploitation while generating new solutions, and reabsorption gives the algorithm the necessary exploration. The algorithm is assessed on eight well-known benchmark test functions, and its results are compared with other algorithms in the literature. The performance of the proposed algorithm is better on seven of the eight test functions than the most recent approaches in the literature, and it finds the global optimum with fewer function evaluations on six of the eight. A statistical analysis further confirms the ability of this algorithm to produce good-quality results.
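A minimal sketch of the filtration/reabsorption loop described above. The move operator, replacement rule, and all parameters below are illustrative assumptions; the authors' exact operators are not given in the abstract.

```python
import random

def kidney_optimize(f, dim, bounds, pop_size=30, iters=200, seed=0):
    """Toy kidney-inspired minimization: solutions at or below the population
    mean ('filtration rate') stay in filtered blood; waste solutions get one
    chance to improve via a random move, otherwise they are replaced."""
    rng = random.Random(seed)
    lo, hi = bounds
    rand_sol = lambda: [rng.uniform(lo, hi) for _ in range(dim)]
    move = lambda x: [min(hi, max(lo, xi + rng.gauss(0, 0.1 * (hi - lo))))
                      for xi in x]
    pop = [rand_sol() for _ in range(pop_size)]
    best = min(pop, key=f)
    for _ in range(iters):
        rate = sum(f(x) for x in pop) / len(pop)       # filtration rate
        filtered = [x for x in pop if f(x) <= rate]    # filtered blood
        waste = [x for x in pop if f(x) > rate]
        reabsorbed = []
        for x in waste:                                # reabsorption or excretion
            y = move(x)
            reabsorbed.append(y if f(y) <= rate else rand_sol())
        pop = filtered + reabsorbed                    # merge into new population
        cand = min(pop, key=f)
        if f(cand) < f(best):
            best = cand
    return best, f(best)
```

Keeping below-mean solutions untouched supplies the exploitation, while the random moves and replacements on the waste supply the exploration, mirroring the division of labor the abstract describes.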
Detection of Cheating by Decimation Algorithm
NASA Astrophysics Data System (ADS)
Yamanaka, Shogo; Ohzeki, Masayuki; Decelle, Aurélien
2015-02-01
We extend item response theory to the case of "cheating students" in a set of exams, and try to detect them by applying a greedy inference algorithm. This extended model is closely related to Boltzmann machine learning. In this paper we aim to infer the correct biases and interactions of our model from a relatively small number of training sets. Nevertheless, the greedy algorithm employed in the present study exhibits good performance even with few training data. The key point is the sparseness of the interactions in our problem in the context of Boltzmann machine learning: cheating students are expected to be very rare (possibly even in the real world). We compare a standard approach for inferring sparse interactions in Boltzmann machine learning with our greedy algorithm, and we find the latter to be superior in several aspects.
Evaluation of prostate segmentation algorithms for MRI: the PROMISE12 challenge.
Litjens, Geert; Toth, Robert; van de Ven, Wendy; Hoeks, Caroline; Kerkstra, Sjoerd; van Ginneken, Bram; Vincent, Graham; Guillard, Gwenael; Birbeck, Neil; Zhang, Jindang; Strand, Robin; Malmberg, Filip; Ou, Yangming; Davatzikos, Christos; Kirschner, Matthias; Jung, Florian; Yuan, Jing; Qiu, Wu; Gao, Qinquan; Edwards, Philip Eddie; Maan, Bianca; van der Heijden, Ferdinand; Ghose, Soumya; Mitra, Jhimli; Dowling, Jason; Barratt, Dean; Huisman, Henkjan; Madabhushi, Anant
2014-02-01
-atlas registration, both on accuracy and computation time. Although average algorithm performance was good to excellent and the Imorphics algorithm outperformed the second observer on average, we showed that algorithm combination might lead to further improvement, indicating that optimal performance for prostate segmentation is not yet obtained. All results are available online at http://promise12.grand-challenge.org/.
Paying for international environmental public goods.
Arriagada, Rodrigo; Perrings, Charles
2011-11-01
Supply of international environmental public goods must meet certain conditions to be socially efficient, and several reasons explain why they are currently undersupplied. Diagnosis of the public goods failure associated with particular ecosystem services is critical to the development of the appropriate international response. There are two categories of international environmental public goods that are most likely to be undersupplied. One has an additive supply technology and the other has a weakest link supply technology. The degree to which the collective response should be targeted depends on the importance of supply from any one country. In principle, the solution for the undersupply lies in payments designed to compensate local providers for the additional costs they incur in meeting global demand. Targeted support may take the form of direct investment in supply (the Global Environment Facility model) or of payments for the benefits of supply (the Payments for Ecosystem Services model).
Rainmakers: why bad weather means good productivity.
Lee, Jooa Julia; Gino, Francesca; Staats, Bradley R
2014-05-01
People believe that weather conditions influence their everyday work life, but to date, little is known about how weather affects individual productivity. Contrary to conventional wisdom, we predict and find that bad weather increases individual productivity and that it does so by eliminating potential cognitive distractions resulting from good weather. When the weather is bad, individuals appear to focus more on their work than on alternate outdoor activities. We investigate the proposed relationship between worse weather and higher productivity through 4 studies: (a) field data on employees' productivity from a bank in Japan, (b) 2 studies from an online labor market in the United States, and (c) a laboratory experiment. Our findings suggest that worker productivity is higher on bad-, rather than good-, weather days and that cognitive distractions associated with good weather may explain the relationship. We discuss the theoretical and practical implications of our research.
Terascale spectral element algorithms and implementations.
Fischer, P. F.; Tufo, H. M.
1999-08-17
We describe the development and implementation of an efficient spectral element code for multimillion gridpoint simulations of incompressible flows in general two- and three-dimensional domains. We review basic and recently developed algorithmic underpinnings that have resulted in good parallel and vector performance on a broad range of architectures, including the terascale computing systems now coming online at the DOE labs. Sustained performance of 219 GFLOPS has been recently achieved on 2048 nodes of the Intel ASCI-Red machine at Sandia.
Adaptive Algorithms for HF Antenna Arrays.
1987-07-01
HF Adaptive Arrays; HF Communications Systems. Although their heavy computational load renders them impractical for many applications, the advancements in cheap, fast digital hardware have…or digital form. For many applications, the LMS algorithm represents a good trade-off between speed of convergence and implementational…The speed of
Competition in Healthcare: Good, Bad or Ugly?
Goddard, Maria
2015-08-01
The role of competition in healthcare is much debated. Despite a wealth of international experience in relation to competition, evidence is mixed and contested and the debate about the potential role for competition is often polarised. This paper considers briefly some of the reasons for this, focusing on what is meant by "competition in healthcare" and why it is more valuable to think about the circumstances in which competition is more and less likely to be a good tool to achieve benefits, rather than whether or not it is "good" or "bad," per se.
[Can tobacco companies be good corporate citizens?].
Palazzo, G; Mena, S
2009-07-01
Tobacco companies have jumped on the corporate social responsibility (CSR) bandwagon in an attempt to be accepted by society as responsible actors and good corporate citizens. This is, however, not possible, for two reasons. First, the product they sell is lethal and thus incompatible with the precondition of doing no harm required of a good corporate citizen. Second, the behavior of tobacco firms is not responsible, as illustrated by four examples: the "junk science versus sound science" strategy, the seduction of young smokers, political lobbying, and the recruitment of customers in new markets. To conclude, three implications for regulating the activities of the tobacco industry are given.
Medical Image Encryption: An Application for Improved Padding Based GGH Encryption Algorithm.
Sokouti, Massoud; Zakerolhosseini, Ali; Sokouti, Babak
2016-01-01
Medical images are regarded as important and sensitive data in medical informatics systems. For transferring medical images over an insecure network, developing a secure encryption algorithm is necessary. Among the three main properties of security services (i.e., confidentiality, integrity, and availability), confidentiality is the most essential feature for exchanging medical images among physicians. The Goldreich Goldwasser Halevi (GGH) algorithm can be a good choice for encrypting medical images, as both the algorithm and the sensitive data are represented by numeric matrices. Additionally, the GGH algorithm does not increase the size of the image, and hence its complexity remains as simple as O(n²). However, one of the disadvantages of using the GGH algorithm is the chosen ciphertext attack. In our strategy, this shortcoming of the GGH algorithm has been taken into consideration and improved by applying padding (i.e., snail tour XORing) before the GGH encryption process. For evaluating their performances, three measurement criteria are considered: (i) Number of Pixels Change Rate (NPCR), (ii) Unified Average Changing Intensity (UACI), and (iii) the avalanche effect. The results on three different sizes of images showed that the padding GGH approach improved UACI, NPCR, and avalanche by almost 100%, 35%, and 45%, respectively, in comparison to the standard GGH algorithm. The outcomes also make padding GGH resistant to ciphertext, chosen-ciphertext, and statistical attacks. Furthermore, increasing the avalanche effect by more than 50% is a promising achievement relative to the added complexity of the proposed method in the encryption and decryption processes.
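The first two evaluation metrics are standard in image-encryption work and have closed-form definitions; a small sketch of NPCR and UACI for a pair of 8-bit cipher images (the avalanche test follows the same pattern on bit flips):

```python
import numpy as np

def npcr_uaci(c1, c2):
    """NPCR: percentage of pixel positions whose values differ between the
    two cipher images. UACI: mean absolute intensity difference, expressed
    as a percentage of the 255 gray-level range."""
    c1 = c1.astype(np.int64)          # avoid uint8 overflow in the difference
    c2 = c2.astype(np.int64)
    npcr = 100.0 * np.mean(c1 != c2)
    uaci = 100.0 * np.mean(np.abs(c1 - c2) / 255.0)
    return npcr, uaci
```

For a good cipher, encrypting two images that differ in a single pixel should give NPCR near 99.6% and UACI near 33.5%, which is the benchmark the reported improvements are measured against.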
Recent Advancements in Lightning Jump Algorithm Work
NASA Technical Reports Server (NTRS)
Schultz, Christopher J.; Petersen, Walter A.; Carey, Lawrence D.
2010-01-01
In the past year, the primary objectives were to show the usefulness of total lightning as compared to traditional cloud-to-ground (CG) networks, to test the lightning jump algorithm configurations in other regions of the country, to increase the number of thunderstorms within our thunderstorm database, and to pinpoint environments that could prove difficult for any lightning jump configuration. A total of 561 thunderstorms have been examined in the past year (409 non-severe, 152 severe) from four regions of the country (North Alabama, Washington D.C., High Plains of CO/KS, and Oklahoma). Results continue to indicate that the 2σ lightning jump algorithm configuration holds the most promise among prospective operational lightning jump algorithms, with a probability of detection (POD) of 81%, a false alarm rate (FAR) of 45%, a critical success index (CSI) of 49%, and a Heidke Skill Score (HSS) of 0.66. The second best performing configuration was the Threshold 4 algorithm, which had a POD of 72%, a FAR of 51%, a CSI of 41%, and an HSS of 0.58. Because a more complex algorithm configuration shows the most promise, accurate thunderstorm cell tracking must be undertaken to track lightning trends on an individual thunderstorm basis over time. While these numbers for the 2σ configuration are impressive, the algorithm does have its weaknesses. Specifically, low-topped and tropical cyclone thunderstorm environments present issues for the 2σ lightning jump algorithm because of the impact of suppressed vertical depth on overall flash counts (i.e., a relative dearth of lightning). For example, in a sample of 120 thunderstorms from northern Alabama that contained 72 events missed by the 2σ algorithm, 36% of the misses were associated with these two environments (17 storms).
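The verification scores quoted above come from a standard 2×2 contingency table of forecast events; a sketch using the usual definitions of POD, FAR, CSI, and HSS (the counts in the usage example are illustrative, not the study's data):

```python
def skill_scores(hits, false_alarms, misses, correct_nulls):
    """Standard 2x2 contingency-table verification scores: probability of
    detection, false alarm ratio, critical success index, Heidke Skill Score."""
    a, b, c, d = hits, false_alarms, misses, correct_nulls
    pod = a / (a + c)
    far = b / (a + b)
    csi = a / (a + b + c)
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return pod, far, csi, hss
```

Note that CSI ignores correct nulls entirely, while HSS rewards them, which is why the two can rank algorithm configurations differently.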
Moral Vitalism: Seeing Good and Evil as Real, Agentic Forces.
Bastian, Brock; Bain, Paul; Buhrmester, Michael D; Gómez, Ángel; Vázquez, Alexandra; Knight, Clinton G; Swann, William B
2015-08-01
Moral vitalism refers to a tendency to view good and evil as actual forces that can influence people and events. We introduce a scale designed to assess the belief in moral vitalism. High scorers on the scale endorse items such as "There are underlying forces of good and evil in this world." After establishing the reliability and criterion validity of the scale (Studies 1, 2a, and 2b), we examined the predictive validity of the moral vitalism scale, showing that "moral vitalists" worry about being possessed by evil (Study 3), being contaminated through contact with evil people (Study 4), and forfeiting their own mental purity (Study 5). We discuss the nature of moral vitalism and the implications of the construct for understanding the role of metaphysical lay theories about the nature of good and evil in moral reasoning.
Quantum chi-squared and goodness of fit testing
Temme, Kristan; Verstraete, Frank
2015-01-15
A quantum mechanical hypothesis test is presented for the hypothesis that a certain setup produces a given quantum state. Although the classical and quantum problems are closely related, the quantum problem is much richer due to the additional optimization over the measurement basis. A goodness of fit test for i.i.d. quantum states is developed and a max-min characterization of the optimal measurement is introduced. We find the quantum measurement which leads to both the maximal Pitman and the maximal Bahadur efficiencies, and determine the associated divergence rates. We discuss the relationship of the quantum goodness of fit test to the problem of estimating multiple parameters from a density matrix. These problems are found to be closely related, and we show that the largest error of an optimal strategy, determined by the smallest eigenvalue of the Fisher information matrix, is given by the divergence rate of the goodness of fit test.
Fourth Order Algorithms for Solving Diverse Many-Body Problems
NASA Astrophysics Data System (ADS)
Chin, Siu A.; Forbert, Harald A.; Chen, Chia-Rong; Kidwell, Donald W.; Ciftja, Orion
2001-03-01
We show that the method of factorizing an evolution operator of the form e^{ε(A+B)} to fourth order with purely positive coefficients yields new classes of symplectic algorithms for solving classical dynamical problems, unitary algorithms for solving the time-dependent Schrödinger equation, norm-preserving algorithms for solving the Langevin equation, and large-time-step convergent diffusion Monte Carlo algorithms. Results for each class of problems will be presented and discussed.
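A representative factorization of this kind is the forward fourth-order scheme attributed to Chin in the literature, quoted here as an illustration (the abstract does not state which factorization is used); every exponential carries a positive coefficient, at the cost of a modified operator containing a double commutator:

```latex
e^{\epsilon(T+V)} \;\approx\;
e^{\frac{1}{6}\epsilon V}\,
e^{\frac{1}{2}\epsilon T}\,
e^{\frac{2}{3}\epsilon \tilde V}\,
e^{\frac{1}{2}\epsilon T}\,
e^{\frac{1}{6}\epsilon V},
\qquad
\tilde V \;=\; V + \frac{\epsilon^{2}}{48}\,[V,[T,V]] .
```

The positivity of all coefficients is what makes such schemes usable for diffusion Monte Carlo, where negative time steps are not admissible.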
MU-MIMO Pairing Algorithm Using Received Power
NASA Astrophysics Data System (ADS)
Kim, Young-Joon; Lee, Jung-Seung; Baik, Doo-Kwon
In this letter, a new received power pairing scheduling (PPS) algorithm is proposed for multi-user multiple-input multiple-output (MU-MIMO) systems. In contrast to existing algorithms that manage complex orthogonality factors, the PPS algorithm simply utilizes the CINR to determine a MU-MIMO pair. Simulation results show that the PPS algorithm achieves up to 77% of the MU-MIMO gain of determinant pairing scheduling (DPS) with low complexity.
A genetic algorithm for solving supply chain network design model
NASA Astrophysics Data System (ADS)
Firoozi, Z.; Ismail, N.; Ariafar, S. H.; Tang, S. H.; Ariffin, M. K. M. A.
2013-09-01
Network design is by nature costly, and optimization models play a significant role in reducing the unnecessary cost components of a distribution network. This study proposes a genetic algorithm to solve a distribution network design model. The structure of the chromosome in the proposed algorithm is defined in a novel way that, in addition to producing feasible solutions, also reduces the computational complexity of the algorithm. Computational results are presented to show the algorithm's performance.
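The paper's chromosome encoding is not detailed in the abstract; as a generic illustration of a GA on a network design problem, here is a sketch for a toy facility-location model (hypothetical costs, bit-string chromosome, one-point crossover, bit-flip mutation), not the authors' encoding:

```python
import random

def genetic_network_design(fixed_costs, assign_costs, pop_size=40,
                           generations=100, seed=1):
    """Toy GA: a chromosome is a bit-string of open/closed facilities; each
    customer is served by its cheapest open facility; fitness is total cost."""
    rng = random.Random(seed)
    n_fac = len(fixed_costs)
    n_cust = len(assign_costs)

    def cost(chrom):
        open_idx = [i for i, g in enumerate(chrom) if g]
        if not open_idx:
            return float("inf")               # infeasible: no facility open
        total = sum(fixed_costs[i] for i in open_idx)
        for cst in range(n_cust):
            total += min(assign_costs[cst][i] for i in open_idx)
        return total

    pop = [[rng.randint(0, 1) for _ in range(n_fac)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        elite = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            cut = rng.randrange(1, n_fac) if n_fac > 1 else 0
            child = p1[:cut] + p2[cut:]       # one-point crossover
            if rng.random() < 0.2:            # bit-flip mutation
                j = rng.randrange(n_fac)
                child[j] ^= 1
            children.append(child)
        pop = elite + children
    best = min(pop, key=cost)
    return best, cost(best)
```

A carefully designed encoding, as in the paper, can guarantee feasibility by construction instead of penalizing infeasible chromosomes as this sketch does.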
Genetic Algorithm Tuned Fuzzy Logic for Gliding Return Trajectories
NASA Technical Reports Server (NTRS)
Burchett, Bradley T.
2003-01-01
The problem of designing and flying a trajectory for successful recovery of a reusable launch vehicle is tackled using fuzzy logic control with genetic algorithm optimization. The plant is approximated by a simplified three-degree-of-freedom nonlinear model. A baseline trajectory design and guidance algorithm, consisting of several Mamdani-type fuzzy controllers, is tuned using a simple genetic algorithm. Preliminary results show that the performance of the overall system improves with genetic algorithm tuning.
LDRD Report: Scheduling Irregular Algorithms
Boman, Erik G.
2014-10-01
This LDRD project was a campus exec fellowship to fund (in part) Donald Nguyen’s PhD research at UT-Austin. His work has focused on parallel programming models, and scheduling irregular algorithms on shared-memory systems using the Galois framework. Galois provides a simple but powerful way for users and applications to automatically obtain good parallel performance using certain supported data containers. The naïve user can write serial code, while advanced users can optimize performance by advanced features, such as specifying the scheduling policy. Galois was used to parallelize two sparse matrix reordering schemes: RCM and Sloan. Such reordering is important in high-performance computing to obtain better data locality and thus reduce run times.
Online Planning Algorithms for POMDPs
Ross, Stéphane; Pineau, Joelle; Paquet, Sébastien; Chaib-draa, Brahim
2009-01-01
Partially Observable Markov Decision Processes (POMDPs) provide a rich framework for sequential decision-making under uncertainty in stochastic domains. However, solving a POMDP is often intractable except for small problems due to their complexity. Here, we focus on online approaches that alleviate the computational complexity by computing good local policies at each decision step during the execution. Online algorithms generally consist of a lookahead search to find the best action to execute at each time step in an environment. Our objectives here are to survey the various existing online POMDP methods, analyze their properties and discuss their advantages and disadvantages; and to thoroughly evaluate these online approaches in different environments under various metrics (return, error bound reduction, lower bound improvement). Our experimental results indicate that state-of-the-art online heuristic search methods can handle large POMDP domains efficiently. PMID:19777080
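The lookahead search common to these online methods can be sketched on a tiny tiger-style POMDP. The model below is a standard textbook example, not one from the survey, and the leaf heuristic (immediate expected reward) is a deliberately crude bound:

```python
def belief_update(b, a, o, T, O):
    """Bayes filter: b'(s') ∝ O[a][s'][o] * Σ_s T[a][s][s'] * b(s).
    Returns the new belief and the observation probability P(o | b, a)."""
    nb = [O[a][s2][o] * sum(T[a][s][s2] * b[s] for s in range(len(b)))
          for s2 in range(len(b))]
    z = sum(nb)
    return [x / z for x in nb] if z > 0 else b, z

def lookahead(b, depth, A, Obs, T, O, R, gamma=0.95):
    """Depth-limited expectimax over beliefs; returns (value, best action)."""
    if depth == 0:                         # leaf heuristic: best immediate reward
        return max(sum(b[s] * R[a][s] for s in range(len(b))) for a in A), None
    best_v, best_a = float("-inf"), None
    for a in A:
        v = sum(b[s] * R[a][s] for s in range(len(b)))
        for o in Obs:
            nb, p_o = belief_update(b, a, o, T, O)
            if p_o > 0:
                v += gamma * p_o * lookahead(nb, depth - 1, A, Obs, T, O, R, gamma)[0]
        if v > best_v:
            best_v, best_a = v, a
    return best_v, best_a

# Tiny tiger-style example: states {0: tiger-left, 1: tiger-right},
# actions {0: listen, 1: open-left, 2: open-right}, observations {0, 1}.
A, Obs = [0, 1, 2], [0, 1]
T = {0: [[1, 0], [0, 1]],                  # listening leaves the state unchanged
     1: [[0.5, 0.5], [0.5, 0.5]],          # opening a door resets the problem
     2: [[0.5, 0.5], [0.5, 0.5]]}
O = {0: [[0.85, 0.15], [0.15, 0.85]],      # listening is 85% accurate
     1: [[0.5, 0.5], [0.5, 0.5]],
     2: [[0.5, 0.5], [0.5, 0.5]]}
R = {0: [-1, -1], 1: [-100, 10], 2: [10, -100]}
```

Real online planners tighten the leaf values with upper and lower bounds and expand only promising belief nodes, but the anytime structure is the same: search as deep as the decision-time budget allows, then act.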
Genome editing: Bioethics shows the way.
Neuhaus, Carolyn P; Caplan, Arthur L
2017-03-01
When some scientists hear the word "bioethics," they break out in intellectual hives. They shouldn't. Good bioethics is about enabling science to move forward. Bioethics pushes scientists to acknowledge that they operate not within a vacuum but within a society in which diverse perspectives and values must be engaged. Bioethicists give voice to those divergent perspectives and provide a framework to facilitate informed and inclusive discussions that spur progress, rather than stall it. The field is needed to advance cutting-edge biomedical research in domains in which the benefits to be had are enormous, such as genome editing, but ethical concerns persist.
Quantum Algorithm for Linear Programming Problems
NASA Astrophysics Data System (ADS)
Joag, Pramod; Mehendale, Dhananjay
The quantum algorithm of Harrow, Hassidim, and Lloyd (PRL 103, 150502, 2009) solves a system of linear equations with exponential speedup over existing classical algorithms. We show that this algorithm can be readily adopted in iterative algorithms for solving linear programming (LP) problems. The first iterative algorithm that we suggest for LP follows from duality theory. It consists of finding a nonnegative solution of the equations for the duality condition, for the constraints imposed by the given primal problem, and for the constraints imposed by its corresponding dual problem. This is the problem of nonnegative least squares, or simply the NNLS problem. We use the well-known method of Lawson and Hanson for solving the NNLS problem. This algorithm essentially consists of solving a new system of linear equations in each iterative step. The other iterative algorithms that can be used are those based on interior point methods. The same technique can be adopted for solving network flow problems, as these can be readily formulated as LP problems. The suggested quantum algorithm can solve LP and network flow problems of very large size, involving millions of variables.
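Classically, the NNLS subproblem can be solved by Lawson-Hanson's active-set method; as a simpler stand-in for illustration, projected gradient descent on the same objective min ‖Ax − b‖² subject to x ≥ 0 (the iteration count and step rule below are assumptions, not the abstract's method):

```python
import numpy as np

def nnls_pg(A, b, iters=5000):
    """Nonnegative least squares via projected gradient descent:
    gradient step on ||Ax - b||^2 / 2, then clip to the nonnegative orthant."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = np.maximum(0.0, x - grad / L)  # projection onto x >= 0
    return x
```

Stacking the primal constraints, dual constraints, and the duality condition c·x = b·y into one such nonnegative system is the reduction the abstract describes; each inner linear solve is where the quantum linear-systems algorithm would be substituted.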
A novel algorithm for Bluetooth ECG.
Pandya, Utpal T; Desai, Uday B
2012-11-01
In wireless transmission of ECG, data latency becomes significant when battery power level and data transmission distance are not maintained. In applications like home monitoring or personalized care, a novel filtering strategy is required to overcome the joint effect of these wireless transmission issues and other ECG measurement noises. Here, a novel algorithm, termed the peak rejection adaptive sampling modified moving average (PRASMMA) algorithm, for wireless ECG is introduced. The algorithm first removes errors in the bit pattern of the received data, if any occurred in wireless transmission, and then removes baseline drift. Afterward, a modified moving average is applied everywhere except in the region of each QRS complex. The algorithm also sets its filtering parameters according to the sampling rate selected for signal acquisition. To demonstrate the work, a prototype Bluetooth-based ECG module is used to capture ECG at different sampling rates and in different patient positions. This module transmits the ECG wirelessly to Bluetooth-enabled devices, where the PRASMMA algorithm is applied to the captured signal. The performance of the PRASMMA algorithm is compared with moving average and S-Golay algorithms, both visually and numerically. The results show that the PRASMMA algorithm can significantly improve ECG reconstruction by efficiently removing noise, and its use can be extended to any parameters where peaks are important for diagnostic purposes.
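The published PRASMMA algorithm is more involved (bit-error correction, baseline removal, sampling-rate-dependent parameters); the core idea of a moving average that spares the QRS region can be sketched as follows. The window and guard sizes are illustrative assumptions:

```python
import numpy as np

def masked_moving_average(x, peaks, window=5, guard=3):
    """Moving-average smoothing that skips samples within `guard` samples of
    any detected peak, so sharp features (e.g. QRS complexes) are not
    flattened. An illustration of the idea, not the published PRASMMA."""
    protect = np.zeros(len(x), dtype=bool)
    for p in peaks:
        protect[max(0, p - guard): p + guard + 1] = True
    kernel = np.ones(window) / window
    smoothed = np.convolve(x, kernel, mode="same")
    return np.where(protect, x, smoothed)      # keep originals near peaks
```

A plain moving average would reduce a QRS peak's amplitude by roughly the window factor, which is exactly the distortion the peak-rejection mask avoids.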
Solutions to the public goods dilemma in bacterial biofilms
Drescher, Knut; Nadell, Carey D.; Stone, Howard A.; Wingreen, Ned S.; Bassler, Bonnie L.
2014-01-01
Summary: Bacteria frequently live in densely populated surface-bound communities, termed biofilms [1-4]. Biofilm-dwelling cells rely on secretion of extracellular substances to construct their communities and to capture nutrients from the environment [5]. Some secreted factors behave as cooperative public goods: they can be exploited by non-producing cells [6-11]. The means by which public-good-producing bacteria avert exploitation in biofilm environments are largely unknown. Using experiments with Vibrio cholerae, which secretes extracellular enzymes to digest its primary food source, the solid polymer chitin, we show that the public goods dilemma may be solved by two very different mechanisms: cells can produce thick biofilms that confine the goods to producers, or fluid flow can remove soluble products of chitin digestion, denying access to non-producers. Both processes are unified by limiting the distance over which enzyme-secreting cells provide benefits to neighbors, resulting in preferential benefit to nearby clonemates and allowing kin selection to favor public good production. Our results demonstrate new mechanisms by which the physical conditions of natural habitats can interact with bacterial physiology to promote the evolution of cooperation. PMID:24332540
Solutions to the public goods dilemma in bacterial biofilms
NASA Astrophysics Data System (ADS)
Drescher, Knut; Nadell, Carey D.; Stone, Howard A.; Wingreen, Ned S.; Bassler, Bonnie L.
2014-03-01
Bacteria frequently live in densely populated surface-bound communities, termed biofilms. Biofilm-dwelling cells rely on secretion of extracellular substances to construct their communities and to capture nutrients from the environment. Some secreted factors behave as cooperative public goods: they can be exploited by non-producing cells. The means by which public-good-producing bacteria avert exploitation in biofilm environments are largely unknown. Using experiments with Vibrio cholerae, which secretes extracellular enzymes to digest its primary food source, the solid polymer chitin, we show that the public goods dilemma may be solved by two very different mechanisms: cells can produce thick biofilms that confine the goods to producers, or fluid flow can remove soluble products of chitin digestion, denying access to non-producers. Both processes are unified by limiting the distance over which enzyme-secreting cells provide benefits to neighbors, resulting in preferential benefit to nearby clonemates and allowing kin selection to favor public good production. Our results demonstrate new mechanisms by which the physical conditions of natural habitats can interact with bacterial physiology to promote the evolution of cooperation.
Conception of discrete systems decomposition algorithm using p-invariants and hypergraphs
NASA Astrophysics Data System (ADS)
Stefanowicz, Ł.
2016-09-01
In this article the author presents an idea for a decomposition algorithm for discrete systems described by Petri nets using p-invariants. The decomposition process is significant from the point of view of discrete system design, because it allows separation of smaller sequential parts. The proposed algorithm uses a modified Martinez-Silva method as well as the author's selection algorithm. The developed method is a good complement to classical decomposition algorithms using graphs and hypergraphs.
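For orientation, a p-invariant of a Petri net is a nonnegative integer vector y with yᵀC = 0, where C is the net's incidence matrix. The Martinez-Silva method builds such invariants incrementally; for a small net, the nullspace of Cᵀ already exposes the invariant subspace. The three-place, two-transition net below is a made-up example, not taken from the article.

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical incidence matrix C (rows = places, columns = transitions).
# p1 and p2 exchange a token; p3 is untouched by either transition.
C = np.array([[-1,  1],   # place p1
              [ 1, -1],   # place p2
              [ 0,  0]])  # place p3

# Columns of Y satisfy C^T y = 0, i.e. y^T C = 0: the invariant subspace.
# Martinez-Silva-style methods then extract nonnegative integer generators
# from this subspace (here spanned by (1,1,0) and (0,0,1)).
Y = null_space(C.T)
```

Each nonnegative integer generator corresponds to a set of places whose weighted token count is conserved, which is what makes the invariants usable as seams for cutting the net into smaller sequential parts.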
NASA Astrophysics Data System (ADS)
Alpatov, Boris; Babayan, Pavel; Ershov, Maksim; Strotov, Valery
2016-10-01
This paper describes the implementation of an orientation estimation algorithm in an FPGA-based vision system. An approach to estimate the orientation of objects lacking axial symmetry is proposed. The suggested algorithm is intended to estimate the orientation of a specific known 3D object based on its 3D model. The proposed orientation estimation algorithm consists of two stages: learning and estimation. The learning stage is devoted to exploring the studied object. Using the 3D model, we gather a set of training images by capturing the 3D model from viewpoints evenly distributed on a sphere. The sphere point distribution follows the geosphere principle. The gathered training image set is used for calculating descriptors, which are then used in the estimation stage of the algorithm. The estimation stage focuses on the matching process between an observed image descriptor and the training image descriptors. The experimental research was performed using a set of images of an Airbus A380. The proposed orientation estimation algorithm showed good accuracy in all case studies. The real-time performance of the algorithm in the FPGA-based vision system was demonstrated.
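The paper distributes training viewpoints by the geosphere principle (a subdivided icosahedron). As a simpler stand-in that achieves a similarly near-uniform spread, the sketch below uses a Fibonacci sphere; it illustrates the idea of even viewpoint sampling, not the authors' exact construction.

```python
import math

def fibonacci_sphere(n):
    """Return n near-uniformly spread points on the unit sphere."""
    golden = math.pi * (3.0 - math.sqrt(5.0))  # golden angle in radians
    points = []
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n          # z strata uniform in (-1, 1)
        r = math.sqrt(1.0 - z * z)             # circle radius at height z
        theta = golden * i                      # spiral around the axis
        points.append((r * math.cos(theta), r * math.sin(theta), z))
    return points
```

Each point would serve as a camera position from which the 3D model is rendered to produce one training image for the descriptor database.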
NASA Astrophysics Data System (ADS)
Ma, Chao; Ouyang, Jihong; Chen, Hui-Ling; Ji, Jin-Chao
2016-04-01
In this paper, we propose a novel learning algorithm, named SABC-MKELM, based on a kernel extreme learning machine (KELM) method for single-hidden-layer feedforward networks. In SABC-MKELM, a combination of Gaussian kernels is used as the activation function of the KELM instead of simple fixed kernel learning, where the related parameters of the kernels and the weights of the kernels can be optimised simultaneously by a novel self-adaptive artificial bee colony (SABC) approach. SABC-MKELM outperforms six other state-of-the-art approaches in general, as it can effectively determine solution updating strategies and suitable parameters to produce a flexible kernel function within SABC. Simulations have demonstrated that the proposed algorithm not only self-adaptively determines suitable parameters and solution updating strategies by learning from previous experience, but also achieves better generalisation performance than several related methods, and the results show good stability of the proposed algorithm.
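A minimal sketch of the building block SABC-MKELM tunes and combines: a KELM with a single Gaussian kernel, trained via the standard closed-form output weights β = (I/C + K)⁻¹T. The regularisation constant C and kernel width γ are exactly the kind of parameters the SABC search would optimise; the values and function names here are illustrative.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    """Pairwise Gaussian kernel matrix exp(-gamma * ||x - y||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kelm_fit(X, T, C=10.0, gamma=1.0):
    """Closed-form KELM output weights: beta = (I/C + K)^-1 T."""
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(np.eye(len(X)) / C + K, T)

def kelm_predict(X_train, beta, X_new, gamma=1.0):
    """Predict targets for X_new from the fitted output weights."""
    return gaussian_kernel(X_new, X_train, gamma) @ beta
```

The multi-kernel variant replaces the single Gaussian kernel matrix with a weighted sum of Gaussian kernels of different widths; the SABC loop then searches jointly over the kernel weights, widths, and C.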
Good Laboratory Practice. Part 1. An Introduction
ERIC Educational Resources Information Center
Wedlich, Richard C.; Libera, Agata E.; Pires, Amanda; Therrien, Matthew T.
2013-01-01
The Good Laboratory Practice (GLP) regulations were put into place in 1978. They establish a standard of practice to ensure that results from the nonclinical laboratory study reported to the U.S. Food and Drug Administration (FDA) are valid and that the study report accurately reflects the conduct of the study. While the GLP regulations promulgate…
Omega-3 fats: Good for your heart
... arteries - omega-3s; Coronary artery disease - omega-3s; Heart disease - omega-3s ... Omega-3s are good for your heart and blood vessels in several ways. They reduce triglycerides , a type of fat in your blood. They reduce the risk of an ...
Very Good Medicine: Indigenous Humor and Laughter
ERIC Educational Resources Information Center
Mala, Cynthia Lindquist
2016-01-01
Humor is not only instinctive and a basic human need, but it also is very good medicine. Laughter boosts the immune system, lowers blood pressure, reduces stress hormones, and is linked to healthy functioning organs. [This article was written with Mylo Redwater Smith.]
19 CFR 10.914 - Originating goods.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 19 Customs Duties 1 2012-04-01 2012-04-01 false Originating goods. 10.914 Section 10.914 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Peru Trade Promotion...
19 CFR 10.914 - Originating goods.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 19 Customs Duties 1 2013-04-01 2013-04-01 false Originating goods. 10.914 Section 10.914 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Peru Trade Promotion...
19 CFR 10.914 - Originating goods.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 19 Customs Duties 1 2014-04-01 2014-04-01 false Originating goods. 10.914 Section 10.914 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Peru Trade Promotion...
19 CFR 10.1014 - Originating goods.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 19 Customs Duties 1 2012-04-01 2012-04-01 false Originating goods. 10.1014 Section 10.1014 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Korea Free Trade Agreement...
19 CFR 10.1014 - Originating goods.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 19 Customs Duties 1 2014-04-01 2014-04-01 false Originating goods. 10.1014 Section 10.1014 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Korea Free Trade Agreement...
19 CFR 10.1014 - Originating goods.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 19 Customs Duties 1 2013-04-01 2013-04-01 false Originating goods. 10.1014 Section 10.1014 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Korea Free Trade Agreement...
Athletics: The Good It Should Do
ERIC Educational Resources Information Center
Stoll, Sharon Kay
2011-01-01
The purpose of this article is to discuss a coach's perspective of the good that sport should do. The author argues that the relationships developed through the coaching experience are powerful, formative, and exceptional. She discusses the important moral values of sharing and caring and how these are important in the moral development of the…
Planning Behaviour in Good and Poor Readers
ERIC Educational Resources Information Center
Mahapatra, Shamita
2016-01-01
A group of 50 good readers and a group of 50 poor readers of Grade 5 matched for age and intelligence and selected on the basis of their proficiency in reading comprehension were tested for their competence in word reading and the process of planning at three different levels, namely, perceptual, memory and conceptual in order to study the…
Globalizing Students Acting for the Common Good
ERIC Educational Resources Information Center
Bencze, Larry; Carter, Lyn
2011-01-01
It is apparent that many of us live in a hyper-economized world, in which personal identities and routine practices are significantly oriented towards production and consumption of for-profit goods and services. Extreme consumerism resulting from this orientation often is associated with many personal, social, and environmental problems.…
19 CFR 10.770 - Originating goods.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 1 2010-04-01 2010-04-01 false Originating goods. 10.770 Section 10.770 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Morocco Free Trade...
Suffering, compassion and 'doing good medical ethics'.
de Zulueta, Paquita C
2015-01-01
'Doing good medical ethics' involves attending to both the biomedical and existential aspects of illness. For this, we need to bring in a phenomenological perspective to the clinical encounter, adopt a virtue-based ethic and resolve to re-evaluate the goals of medicine, in particular the alleviation of suffering and the role of compassion in everyday ethics.
ERIC Educational Resources Information Center
Lester, David
2006-01-01
The issue of whether suicide can be a good death was separated into two different questions: (1) can suicide be an appropriate death, and (2) can suicide be a rational death? Several definitions of an "appropriate" death were proposed, and suicide was seen as potentially appropriate. Similarly, several criteria for rationality were proposed and…
19 CFR 10.810 - Originating goods.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., that has been grown, produced, or manufactured in the territory of one or both of the Parties, is... considered an originating good under the BFTA only if the sum of the value of materials produced in one or... having undergone simple combining or packaging operations, or mere dilution with water or...
Postwar Headteachers' Perspectives of "Good" Teachers
ERIC Educational Resources Information Center
Whitehead, Kay
2003-01-01
In this article, the author focuses on what counted as a "good" teacher on the cusp of a significant change in the profile of the teaching workforce from single to married women. As Jackie Blount notes, single women had dominated state school systems numerically for the century preceding the Second World War but by the late 1940s the…
Teacher Dispositions as Predictors of Good Teaching
ERIC Educational Resources Information Center
Helm, Carroll M.
2006-01-01
In this article, the author seeks to discern whether it is possible to identify people who are predisposed to become good teachers, as if it were a kind of calling in the same manner in which ministers are called to the ministry. The author cites a 2005 study by Wayda and Lund, who developed rubrics to address the student's suitability for the…
Higher Education for the Public Good
ERIC Educational Resources Information Center
Perez, Angel B.
2012-01-01
In his 1988 article, Nathaniel Jackson inspired NACAC with four objectives to meet the challenge of leading the charge in minority student access. In this article, the author discusses three more that he believes will challenge colleges in the decades ahead: (1) Focus on the Public Good; (2) Honor Transparency; and (3) Measure Collective Success.…
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 1 2010-10-01 2010-10-01 false Good faith. 93.210 Section 93.210 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH ASSESSMENTS AND HEALTH EFFECTS STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES PUBLIC HEALTH SERVICE POLICIES ON...
Spiritual Development as a Social Good
ERIC Educational Resources Information Center
Hicks, Mona; Tran-Parsons, Uyen
2013-01-01
The skill development of equanimity and empathy gained through spiritual growth equips students to examine solutions to complex problems in a diverse, global society. This chapter explores intentional multicultural initiatives designed to foster spiritual development and interfaith engagement as means to navigate difference and social good.
Good governance of national veterinary services.
Schneider, H
2011-04-01
The beginning of the 21st Century has been characterised by changed political and economic realities affecting the prevention, control and eradication of animal diseases and zoonoses and presenting new challenges to the veterinary profession. Veterinary Services (VS) need to have the capacity and capabilities to face these challenges and be able to detect, prevent, control and eradicate disease threats. Animal health and VS, being a public good, require global initiatives and collective international action to be able to implement global animal disease eradication. The application of the 'One World, One Health' strategy at the animal-human interface will strengthen veterinary capacity to meet this challenge. Good governance of VS at the national, regional and global level is at the heart of such a strategy. In this paper, the author lists the key elements comprising good veterinary governance and discusses the World Organisation for Animal Health (OIE) standards for the quality of VS. The OIE Tool for the Evaluation of the Performance of Veterinary Services (OIE PVS Tool) is introduced and its relevance in assessing compliance with OIE standards to prevent the spread of pathogens through trade is highlighted. A firm political commitment at the national, regional and international level, with provision of the necessary funding at all levels, is an absolute necessity in establishing good governance of VS to meet the ever-increasing threats posed by animal and human pathogens.
What is Good University Financial Management?
ERIC Educational Resources Information Center
Taylor, Mark P.
2013-01-01
In the current and foreseeable harsh UK higher education environment, aspiring to best-practice financial management will be key to ensuring the prosperity--and indeed the survival--of any university. In this article I argue that good university financial management should provide stability to the institution, allow for investment as well as…
Good Teaching: What Matters to University Students
ERIC Educational Resources Information Center
Lee, Hwee Hoon; Kim, Grace May Lin; Chan, Ling Ling
2015-01-01
Institutions assess teaching effectiveness in various ways, such as classroom observation, peer evaluation and self-assessment. In higher education, student feedback continues to be the main teaching evaluation tool. However, most of such forms include characteristics of good teaching that the institutions deem important and may not adequately…
19 CFR 10.770 - Originating goods.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 19 Customs Duties 1 2011-04-01 2011-04-01 false Originating goods. 10.770 Section 10.770 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Morocco Free Trade...
19 CFR 10.770 - Originating goods.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 19 Customs Duties 1 2014-04-01 2014-04-01 false Originating goods. 10.770 Section 10.770 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Morocco Free Trade...
19 CFR 10.770 - Originating goods.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 19 Customs Duties 1 2013-04-01 2013-04-01 false Originating goods. 10.770 Section 10.770 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Morocco Free Trade...
19 CFR 10.770 - Originating goods.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 19 Customs Duties 1 2012-04-01 2012-04-01 false Originating goods. 10.770 Section 10.770 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY ARTICLES CONDITIONALLY FREE, SUBJECT TO A REDUCED RATE, ETC. United States-Morocco Free Trade...
Good Intentions, Bad Advice for Bilingual Families
ERIC Educational Resources Information Center
Harlin, Rebecca; Paneque, Oneyda M.
2006-01-01
Quite often, educators tell families of children who are learning English as a second language to speak only English, and not their native language, at home. Although these educators may have good intentions, the authors argue that the educators' advice to families is misguided and stems from misunderstandings about the nature of bilingualism and…
Learners' Epistemic Criteria for Good Scientific Models
ERIC Educational Resources Information Center
Pluta, William J.; Chinn, Clark A.; Duncan, Ravit Golan
2011-01-01
Epistemic criteria are the standards used to evaluate scientific products (e.g., models, evidence, arguments). In this study, we analyzed epistemic criteria for good models generated by 324 middle-school students. After evaluating a range of scientific models, but before extensive instruction or experience with model-based reasoning practices,…
Alternative Pathways to Apprenticeships. Good Practice Guide
ERIC Educational Resources Information Center
National Centre for Vocational Education Research (NCVER), 2015
2015-01-01
Apprenticeships are changing. The increasing proportions of people entering apprenticeships at various levels of ability and backgrounds are stimulating demand for alternative pathways to completions. This good practice guide assembles the key findings for education practitioners and workplace supervisors from three related research reports on…