2014-08-01
consensus algorithm called randomized gossip is more suitable [7, 8]. In asynchronous randomized gossip algorithms, pairs of neighboring nodes exchange ... messages and perform updates in an asynchronous and unattended manner, and they also ... The class of broadcast gossip algorithms [9, 10, 11, 12] are ... dynamics [2] and asynchronous pairwise randomized gossip [7, 8], broadcast gossip algorithms do not require that nodes know the identities of their
Geographic Gossip: Efficient Averaging for Sensor Networks
NASA Astrophysics Data System (ADS)
Dimakis, Alexandros D. G.; Sarwate, Anand D.; Wainwright, Martin J.
Gossip algorithms for distributed computation are attractive due to their simplicity, distributed nature, and robustness in noisy and uncertain environments. However, using standard gossip algorithms can lead to a significant waste in energy by repeatedly recirculating redundant information. For realistic sensor network model topologies like grids and random geometric graphs, the inefficiency of gossip schemes is related to the slow mixing times of random walks on the communication graph. We propose and analyze an alternative gossiping scheme that exploits geographic information. By utilizing geographic routing combined with a simple resampling method, we demonstrate substantial gains over previously proposed gossip protocols. For regular graphs such as the ring or grid, our algorithm improves standard gossip by factors of $n$ and $\sqrt{n}$ respectively. For the more challenging case of random geometric graphs, our algorithm computes the true average to accuracy $\epsilon$ using $O(\frac{n^{1.5}}{\sqrt{\log n}} \log \epsilon^{-1})$ radio transmissions, which yields a $\sqrt{\frac{n}{\log n}}$ factor improvement over standard gossip algorithms. We illustrate these theoretical results with experimental comparisons between our algorithm and standard methods as applied to various classes of random fields.
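For reference, the standard randomized pairwise gossip baseline that this paper improves on can be sketched in a few lines. The ring topology, node count, and round count below are illustrative choices, not taken from the paper:

```python
import random

def pairwise_gossip(values, neighbors, rounds, seed=0):
    """Standard asynchronous pairwise gossip: at each tick a random node
    wakes up, picks a random neighbor, and both replace their values with
    the pairwise average. The global sum is preserved (up to rounding)."""
    rng = random.Random(seed)
    x = list(values)
    for _ in range(rounds):
        i = rng.randrange(len(x))
        j = rng.choice(neighbors[i])
        x[i] = x[j] = (x[i] + x[j]) / 2.0
    return x

# Ring of 8 nodes: the kind of slow-mixing topology discussed in the abstract.
n = 8
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
final = pairwise_gossip(list(range(n)), ring, rounds=200_000)
```

The large round count reflects the slow mixing on the ring that motivates geographic gossip: on such graphs the plain scheme needs many transmissions to reach the average.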
Greedy Gossip With Eavesdropping
NASA Astrophysics Data System (ADS)
Ustebay, Deniz; Oreshkin, Boris N.; Coates, Mark J.; Rabbat, Michael G.
2010-07-01
This paper presents greedy gossip with eavesdropping (GGE), a novel randomized gossip algorithm for distributed computation of the average consensus problem. In gossip algorithms, nodes in the network randomly communicate with their neighbors and exchange information iteratively. The algorithms are simple and decentralized, making them attractive for wireless network applications. In general, gossip algorithms are robust to unreliable wireless conditions and time varying network topologies. In this paper we introduce GGE and demonstrate that greedy updates lead to rapid convergence. We do not require nodes to have any location information. Instead, greedy updates are made possible by exploiting the broadcast nature of wireless communications. During the operation of GGE, when a node decides to gossip, instead of choosing one of its neighbors at random, it makes a greedy selection, choosing the node which has the value most different from its own. In order to make this selection, nodes need to know their neighbors' values. Therefore, we assume that all transmissions are wireless broadcasts and nodes keep track of their neighbors' values by eavesdropping on their communications. We show that the convergence of GGE is guaranteed for connected network topologies. We also study the rates of convergence and illustrate, through theoretical bounds and numerical simulations, that GGE consistently outperforms randomized gossip and performs comparably to geographic gossip on moderate-sized random geometric graph topologies.
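The greedy selection rule at the heart of GGE can be sketched as follows; the topology and round count are illustrative, and the eavesdropping mechanism is abstracted into direct reads of neighbors' values:

```python
import random

def greedy_gossip(values, neighbors, rounds, seed=0):
    """Sketch of GGE's update: the waking node averages with the neighbor
    whose (eavesdropped) value differs most from its own, rather than
    with a uniformly random neighbor."""
    rng = random.Random(seed)
    x = list(values)
    for _ in range(rounds):
        i = rng.randrange(len(x))
        # Greedy choice enabled by eavesdropping on neighbors' broadcasts.
        j = max(neighbors[i], key=lambda k: abs(x[k] - x[i]))
        x[i] = x[j] = (x[i] + x[j]) / 2.0
    return x

n = 8
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
gge = greedy_gossip(list(range(n)), ring, rounds=200_000)
```

The only change from uniform gossip is the `max` over neighbors, which is exactly why no location information is needed.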
NASA Astrophysics Data System (ADS)
Jakovetic, Dusan; Xavier, João; Moura, José M. F.
2011-08-01
We study distributed optimization in networked systems, where nodes cooperate to find the optimal quantity of common interest, x = x^\star. The objective function of the corresponding optimization problem is the sum of private (known only by a node), convex, nodes' objectives, and each node imposes a private convex constraint on the allowed values of x. We solve this problem for generic connected network topologies with asymmetric random link failures with a novel distributed, decentralized algorithm. We refer to this algorithm as AL-G (augmented Lagrangian gossiping), and to its variants as AL-MG (augmented Lagrangian multi-neighbor gossiping) and AL-BG (augmented Lagrangian broadcast gossiping). The AL-G algorithm is based on the augmented Lagrangian dual function. Dual variables are updated by the standard method of multipliers, at a slow time scale. To update the primal variables, we propose a novel Gauss-Seidel-type randomized algorithm, at a fast time scale. AL-G uses unidirectional gossip communication, only between immediate neighbors in the network, and is resilient to random link failures. For networks with reliable communication (i.e., no failures), the simplified AL-BG algorithm reduces communication, computation and data storage cost. We prove convergence for all proposed algorithms and demonstrate by simulations their effectiveness on two applications: l_1-regularized logistic regression for classification and cooperative spectrum sensing for cognitive radio networks.
Distributed Matrix Completion: Application to Cooperative Positioning in Noisy Environments
2013-12-11
positioning, and a gossip version of low-rank approximation were developed. A convex relaxation for positioning in the presence of noise was shown to ... of a large data matrix through gossip algorithms. A new algorithm is proposed that amounts to iteratively multiplying a vector by independent random ... sparsification of the original matrix and averaging the resulting normalized vectors. This can be viewed as a generalization of gossip algorithms for
Li, Bo; Li, Shuang; Wu, Junfeng; Qi, Hongsheng
2018-02-09
This paper establishes a framework of quantum clique gossiping by introducing local clique operations to networks of interconnected qubits. Cliques are local structures in complex networks, namely complete subgraphs, which can be used to accelerate classical gossip algorithms. Based on cyclic permutations, clique gossiping leads to collective multi-party qubit interactions. We show that at reduced states, these cliques have the same acceleration effects as their roles in accelerating classical gossip algorithms. For randomized selection of cliques, the improved rate of convergence is precisely characterized. On the other hand, the rate of convergence at the coherent states of the overall quantum network is proven to be determined by the spectrum of a mean-square error evolution matrix. Remarkably, the use of larger quantum cliques does not necessarily increase the speed of the network density aggregation, suggesting that quantum network dynamics are not entirely determined by the classical network topology.
Gossip-based solutions for discrete rendezvous in populations of communicating agents.
Hollander, Christopher D; Wu, Annie S
2014-01-01
The objective of the rendezvous problem is to construct a method that enables a population of agents to agree on a spatial (and possibly temporal) meeting location. We introduce the buffered gossip algorithm as a general solution to the rendezvous problem in a discrete domain with direct communication between decentralized agents. We compare the performance of the buffered gossip algorithm against the well known uniform gossip algorithm. We believe that a buffered solution is preferable to an unbuffered solution, such as the uniform gossip algorithm, because the use of a buffer allows an agent to use multiple information sources when determining its desired rendezvous point, and that access to multiple information sources may improve agent decision making by reinforcing or contradicting an initial choice. To show that the buffered gossip algorithm is an actual solution for the rendezvous problem, we construct a theoretical proof of convergence and derive the conditions under which the buffered gossip algorithm is guaranteed to produce a consensus on rendezvous location. We use these results to verify that the uniform gossip algorithm also solves the rendezvous problem. We then use a multi-agent simulation to conduct a series of simulation experiments to compare the performance between the buffered and uniform gossip algorithms. Our results suggest that the buffered gossip algorithm can solve the rendezvous problem faster than the uniform gossip algorithm; however, the relative performance between these two solutions depends on the specific constraints of the problem and the parameters of the buffered gossip algorithm.
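A hypothetical sketch of a buffered rule in the spirit of this paper follows; the buffer contents (neighbors' current choices plus the agent's own) and the tie-breaking rule are illustrative assumptions, not the paper's exact scheme:

```python
import random
from collections import Counter

def buffered_gossip(choices, neighbors, rounds=200, seed=0):
    """Sketch of a buffered rendezvous rule: an activated agent buffers
    its neighbors' current choices together with its own and adopts the
    most frequent one, breaking ties toward the smaller site label."""
    rng = random.Random(seed)
    x = list(choices)
    for _ in range(rounds):
        i = rng.randrange(len(x))
        buf = [x[j] for j in neighbors[i]] + [x[i]]   # multiple sources
        counts = Counter(buf)
        x[i] = max(counts, key=lambda c: (counts[c], -c))
    return x

# Five agents, fully connected, initially split between sites 0 and 1.
full = {i: [j for j in range(5) if j != i] for i in range(5)}
meeting = buffered_gossip([0, 0, 0, 1, 1], full)
```

With a full buffer the initial majority reinforces itself, illustrating how multiple information sources can confirm or override an agent's initial choice.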
Relaxation of Distributed Data Aggregation for Underwater Acoustic Sensor Networks
2014-03-31
3.1 Gossip algorithms for distributed averaging ... 3.2 Distributed particle filtering ... algorithm that had direct access to all of the measurements. We use gossip algorithms (discussed in Section 3.1) to diffuse information across the ... 3.1 Gossip algorithms for distributed averaging: We begin by discussing gossip algorithms, which we use to synchronize and spread information
Fast Decentralized Averaging via Multi-scale Gossip
NASA Astrophysics Data System (ADS)
Tsianos, Konstantinos I.; Rabbat, Michael G.
We are interested in the problem of computing the average consensus in a distributed fashion on random geometric graphs. We describe a new algorithm called Multi-scale Gossip which employs a hierarchical decomposition of the graph to partition the computation into tractable sub-problems. Using only pairwise messages of fixed size that travel at most O(n^{1/3}) hops, our algorithm is robust and has communication cost of O(n log log n · log ε^{-1}) transmissions, which is order-optimal up to the logarithmic factor in n. Simulated experiments verify the good expected performance on graphs of many thousands of nodes.
Distributed Matrix Completion: Applications to Cooperative Positioning in Noisy Environments
2013-12-11
positioning, and a gossip version of low-rank approximation were developed. A convex relaxation for positioning in the presence of noise was shown ... computing the leading eigenvectors of a large data matrix through gossip algorithms. A new algorithm is proposed that amounts to iteratively multiplying ... generalization of gossip algorithms for consensus. The algorithms outperform state-of-the-art methods in a communication-limited scenario. Positioning via
Quantized Average Consensus on Gossip Digraphs with Reduced Computation
NASA Astrophysics Data System (ADS)
Cai, Kai; Ishii, Hideaki
The authors have recently proposed a class of randomized gossip algorithms which solve the distributed averaging problem on directed graphs, with the constraint that each node has an integer-valued state. The essence of this algorithm is to maintain local records, called “surplus”, of individual state updates, thereby achieving quantized average consensus even though the state sum of all nodes is not preserved. In this paper we study a modified version of this algorithm, whose feature is primarily in reducing both computation and communication effort. Concretely, each node needs to update fewer local variables, and can transmit surplus by requiring only one bit. Under this modified algorithm we prove that reaching the average is ensured for arbitrary strongly connected graphs. The condition of arbitrary strong connection is less restrictive than those known in the literature for either real-valued or quantized states; in particular, it does not require the special structure on the network called balanced. Finally, we provide numerical examples to illustrate the convergence result, with emphasis on convergence time analysis.
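For intuition, the simpler undirected form of quantized gossip (without the surplus mechanism, which is what extends it to digraphs) can be sketched as follows; the ring topology and round count are illustrative:

```python
import random

def quantized_gossip(values, neighbors, rounds, seed=0):
    """Integer-valued pairwise gossip: a meeting pair re-splits its
    (integer) sum into floor and ceiling halves, so the global sum is
    preserved exactly and states settle within one unit of the average.
    The paper's surplus variables extend this idea to directed graphs,
    where the plain update would not preserve the sum."""
    rng = random.Random(seed)
    x = list(values)
    for _ in range(rounds):
        i = rng.randrange(len(x))
        j = rng.choice(neighbors[i])
        s = x[i] + x[j]
        x[i], x[j] = s // 2, s - s // 2   # integer re-split, sum unchanged
    return x

n = 8
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
q = quantized_gossip(list(range(n)), ring, rounds=50_000)
```

Because states stay integer, the network cannot reach the exact average 3.5; it settles on a mix of 3s and 4s, which is the quantized-consensus notion of agreement.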
Gossip algorithms in quantum networks
NASA Astrophysics Data System (ADS)
Siomau, Michael
2017-01-01
Gossip algorithms are a common class of protocols for unreliable information dissemination in natural networks, which are not optimally designed for efficient communication between network entities. We consider the application of gossip algorithms to quantum networks and show that any quantum network can be updated to an optimal configuration with local operations and classical communication. This allows the dissemination of quantum information to be sped up, in the best case exponentially. Irrespective of the initial configuration of the quantum network, the update requires at most a polynomial number of local operations and classical communication.
Asynchronous Gossip for Averaging and Spectral Ranking
NASA Astrophysics Data System (ADS)
Borkar, Vivek S.; Makhijani, Rahul; Sundaresan, Rajesh
2014-08-01
We consider two variants of the classical gossip algorithm. The first variant is a version of asynchronous stochastic approximation. We highlight a fundamental difficulty associated with the classical asynchronous gossip scheme, viz., that it may not converge to a desired average, and suggest an alternative scheme based on reinforcement learning that has guaranteed convergence to the desired average. We then discuss a potential application to a wireless network setting with simultaneous link activation constraints. The second variant is a gossip algorithm for distributed computation of the Perron-Frobenius eigenvector of a nonnegative matrix. While the first variant draws upon a reinforcement learning algorithm for an average cost controlled Markov decision problem, the second variant draws upon a reinforcement learning algorithm for risk-sensitive control. We then discuss potential applications of the second variant to ranking schemes, reputation networks, and principal component analysis.
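The quantity computed by the second variant can be illustrated with a small centralized power iteration; the matrix and iteration count are illustrative, and this is the target computation, not the gossip mechanism itself:

```python
def leading_eigvec(A, iters=100):
    """Centralized power iteration for the Perron-Frobenius eigenpair of
    a nonnegative matrix; the paper's second gossip variant computes this
    quantity in a distributed fashion via risk-sensitive reinforcement
    learning. Normalization uses the infinity norm."""
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(c) for c in w)    # current eigenvalue estimate
        v = [c / lam for c in w]
    return lam, v

# Nonnegative 2x2 matrix with eigenvalues 4 and -1; Perron vector ∝ (2, 3).
lam, v = leading_eigvec([[1.0, 2.0], [3.0, 2.0]])
```

The Perron root and eigenvector are exactly the objects behind the ranking and PCA applications mentioned in the abstract.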
NASA Astrophysics Data System (ADS)
Kar, Soummya; Moura, José M. F.
2011-08-01
The paper considers gossip distributed estimation of a (static) distributed random field (a.k.a. large-scale unknown parameter vector) observed by sparsely interconnected sensors, each of which only observes a small fraction of the field. We consider linear distributed estimators whose structure combines the information flow among sensors (the consensus term resulting from the local gossiping exchange among sensors when they are able to communicate) and the information gathering measured by the sensors (the sensing or innovations term). This leads to mixed time-scale algorithms: one time scale associated with the consensus and the other with the innovations. The paper establishes a distributed observability condition (global observability plus mean connectedness) under which the distributed estimates are consistent and asymptotically normal. We introduce the distributed notion equivalent to the (centralized) Fisher information rate, which is a bound on the mean square error reduction rate of any distributed estimator; we show that under the appropriate modeling and structural network communication conditions (gossip protocol) the distributed gossip estimator attains this distributed Fisher information rate, asymptotically achieving the performance of the optimal centralized estimator. Finally, we study the behavior of the distributed gossip estimator when the measurements fade (noise variance grows) with time; in particular, we characterize the maximum rate at which the noise variance can grow while the distributed estimator remains consistent, showing that, as long as the centralized estimator is consistent, the distributed estimator is consistent as well.
Brain tissue segmentation in MR images based on a hybrid of MRF and social algorithms.
Yousefi, Sahar; Azmi, Reza; Zahedi, Morteza
2012-05-01
Effective abnormality detection and diagnosis in Magnetic Resonance Images (MRIs) requires a robust segmentation strategy. Since manual segmentation is a time-consuming task which engages valuable human resources, automatic MRI segmentation has received an enormous amount of attention. For this goal, various techniques have been applied. However, Markov Random Field (MRF) based algorithms have produced reasonable results in noisy images compared to other methods. MRF seeks a label field which minimizes an energy function. The traditional minimization method, simulated annealing (SA), uses Monte Carlo simulation to reach the minimum solution, with a heavy computation burden. For this reason, MRFs are rarely used in real-time processing environments. This paper proposes a novel method based on MRF and a hybrid of social algorithms, containing an ant colony optimization (ACO) and a gossiping algorithm, which can be used for segmenting single and multispectral MRIs in real-time environments. Combining ACO with the gossiping algorithm helps find the better path using neighborhood information. This interaction therefore causes the algorithm to converge to an optimum solution faster. Several experiments on phantom and real images were performed. Results indicate that the proposed algorithm outperforms the traditional MRF and the hybrid of MRF-ACO in speed and accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.
A gossip based information fusion protocol for distributed frequent itemset mining
NASA Astrophysics Data System (ADS)
Sohrabi, Mohammad Karim
2018-07-01
The computational complexity, huge memory space requirement, and time-consuming nature of the frequent pattern mining process are the most important motivations for distributing and parallelizing this mining process. On the other hand, the emergence of distributed computational and operational environments, which causes the production and maintenance of data on different distributed data sources, makes the parallelization and distribution of the knowledge discovery process inevitable. In this paper, a gossip based distributed itemset mining (GDIM) algorithm is proposed to extract frequent itemsets, which are special types of frequent patterns, in a wireless sensor network environment. In this algorithm, local frequent itemsets of each sensor are extracted using a bit-wise horizontal approach (LHPM) from the nodes, which are clustered using a LEACH-based protocol. Heads of clusters exploit a gossip based protocol to communicate with each other and find the patterns whose global support is equal to or greater than the specified support threshold. Experimental results show that the proposed algorithm outperforms the best existing gossip based algorithm in terms of execution time.
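The aggregation step among cluster heads can be sketched with pairwise averaging of per-itemset counts; the overlay (fully connected heads), round count, and itemset names below are illustrative assumptions, not details from the paper:

```python
import random

def gossip_global_support(local_counts, rounds=2000, seed=0):
    """Hypothetical sketch of gossip aggregation in a GDIM-style scheme:
    cluster heads repeatedly average per-itemset counts pairwise, so each
    head's entry converges to the network-wide mean count; scaling by the
    number of heads recovers every itemset's global support locally."""
    rng = random.Random(seed)
    items = set().union(*(c.keys() for c in local_counts))
    state = [{it: float(c.get(it, 0)) for it in items} for c in local_counts]
    h = len(state)
    for _ in range(rounds):
        a, b = rng.sample(range(h), 2)   # two distinct heads gossip
        for it in items:
            avg = (state[a][it] + state[b][it]) / 2.0
            state[a][it] = state[b][it] = avg
    # Any head (here head 0) can now estimate global supports on its own.
    return {it: state[0][it] * h for it in items}

heads = [{"ab": 4, "c": 1}, {"ab": 0}, {"ab": 2, "c": 1}, {"ab": 2}]
support = gossip_global_support(heads)
```

After convergence every head can apply the support threshold locally, with no central collection of the raw counts.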
Convergence and Applications of a Gossip-Based Gauss-Newton Algorithm
NASA Astrophysics Data System (ADS)
Li, Xiao; Scaglione, Anna
2013-11-01
The Gauss-Newton algorithm is a popular and efficient centralized method for solving non-linear least squares problems. In this paper, we propose a multi-agent distributed version of this algorithm, named Gossip-based Gauss-Newton (GGN) algorithm, which can be applied in general problems with non-convex objectives. Furthermore, we analyze and present sufficient conditions for its convergence and show numerically that the GGN algorithm achieves performance comparable to the centralized algorithm, with graceful degradation in case of network failures. More importantly, the GGN algorithm provides significant performance gains compared to other distributed first order methods.
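The centralized update that GGN distributes can be illustrated on a toy scalar problem; the exponential model, data, and iteration count are illustrative, not from the paper:

```python
import math

def gauss_newton_1d(ts, ys, a0, iters=50):
    """Centralized Gauss-Newton for a scalar nonlinear least squares fit
    y ≈ exp(a·t): linearize the residual, then take the step
    a <- a - (JᵀJ)⁻¹ Jᵀ r. The GGN algorithm computes this same kind of
    step with the sums JᵀJ and Jᵀr aggregated by gossip."""
    a = a0
    for _ in range(iters):
        r = [math.exp(a * t) - y for t, y in zip(ts, ys)]   # residuals
        J = [t * math.exp(a * t) for t in ts]               # Jacobian entries
        num = sum(Jk * rk for Jk, rk in zip(J, r))          # Jᵀr
        den = sum(Jk * Jk for Jk in J)                      # JᵀJ (scalar)
        a -= num / den                                      # GN step
    return a

ts = [0.5, 1.0, 1.5, 2.0]
ys = [math.exp(0.5 * t) for t in ts]   # noiseless data, true a = 0.5
a_hat = gauss_newton_1d(ts, ys, a0=0.8)
```

Since Jᵀr and JᵀJ are sums over nodes' local terms, they are exactly the kind of quantity a gossip averaging scheme can supply to every node.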
NASA Astrophysics Data System (ADS)
Miller, Avery
Consider a set of prisoners that want to gossip with one another, and suppose that these prisoners are located at fixed locations (e.g., in jail cells) along a corridor. Each prisoner has a way to broadcast messages (e.g., by voice or contraband radio) with transmission radius R and interference radius R' ≥ R. We study synchronous algorithms for this problem (that is, prisoners are allowed to speak at regulated intervals), including two restricted subclasses. We prove exact upper and lower bounds on the gossiping completion time for all three classes. We demonstrate that each restriction placed on the algorithm results in decreasing performance.
Intelligent Distributed Systems
2015-10-23
periodic gossiping algorithms by using convex combination rules rather than standard averaging rules. On a ring graph, we have discovered how to sequence ... the gossips within a period to achieve the best possible convergence rate, and we have related this optimal value to the classic edge coloring problem ... consensus. There are three different approaches to distributed averaging: linear iterations, gossiping, and double linear iterations, which are also known as
Something to talk about: Gossip increases oxytocin levels in a near real-life situation.
Brondino, Natascia; Fusar-Poli, Laura; Politi, Pierluigi
2017-03-01
Gossip is a pervasive social behavior. Its evolutionary survival seems related to its social functions, such as establishing group rules, punishing trespassers, exercising social influence through reputational systems, and developing and strengthening social bonds. We aimed at evaluating the effect of gossip on hormones (oxytocin and cortisol) and at identifying potential mediators of hormonal response to gossip. Twenty-two female students were randomly assigned to a gossip conversation or to an emotional non-gossip conversation. Additionally, all participants underwent a neutral conversation on the second day of the study. Salivary oxytocin and cortisol levels were measured. Oxytocin increased significantly in the gossip compared to the emotional non-gossip conversation. A decrease in cortisol levels was observed in all three conditions (gossip, emotional non-gossip, neutral). Change in cortisol levels was similar across conditions. Psychological characteristics (e.g. empathy, autistic traits, perceived stress, envy) did not affect oxytocin rise in the gossip condition. Our findings suggest that oxytocin may represent a potential hormonal correlate of gossip behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.
Distributed Estimation using Bayesian Consensus Filtering
2014-06-06
Convergence rate analysis of distributed gossip (linear parameter) estimation: Fundamental limits and tradeoffs,” IEEE J. Sel. Topics Signal Process. ... Dimakis, S. Kar, J. Moura, M. Rabbat, and A. Scaglione, “Gossip algorithms for distributed signal processing,” Proc. of the IEEE, vol. 98, no. 11, pp.
NASA Astrophysics Data System (ADS)
Friedman, Roy; Kermarrec, Anne-Marie; Miranda, Hugo; Rodrigues, Luís
Gossip-based networking has emerged as a viable approach to disseminate information reliably and efficiently in large-scale systems. Initially introduced for database replication [222], the applicability of the approach extends much further now. For example, it has been applied for data aggregation [415], peer sampling [416] and publish/subscribe systems [845]. Gossip-based protocols rely on a periodic peer-wise exchange of information in wired systems. By changing the way each peer is selected for the gossip communication, and which data are exchanged and processed [451], gossip systems can be used to perform different distributed tasks, such as, among others: overlay maintenance, distributed computation, and information dissemination (a collection of papers on gossip can be found in [451]). In a wired setting, the peer sampling service, allowing for a random or specific peer selection, is often provided as an independent service, able to operate independently from other gossip-based services [416].
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katti, Amogh; Di Fatta, Giuseppe; Naughton III, Thomas J
Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a fault-tolerant failure detection and consensus algorithm. This paper presents and compares two novel failure detection and consensus algorithms. The proposed algorithms are based on Gossip protocols and are inherently fault-tolerant and scalable. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in both algorithms the number of Gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and a perfect synchronization in achieving global consensus.
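The epidemic phase of such a gossip-based consensus can be sketched as a union of locally detected failure sets; the push-only exchange rule, the uniform peer choice, and the cycle count are illustrative assumptions, not the paper's algorithms:

```python
import random

def epidemic_consensus(alive, detections, cycles=30, seed=0):
    """Sketch of epidemic agreement on a failed-process list: each alive
    process starts with the failures it detected locally; every cycle it
    pushes its suspect set to one random peer, and sets merge by union,
    so all processes converge on the same global failed-process list in
    O(log n) cycles with high probability."""
    rng = random.Random(seed)
    view = {p: set(detections.get(p, ())) for p in alive}
    for _ in range(cycles):
        for p in alive:
            q = rng.choice([r for r in alive if r != p])
            view[q] |= view[p]   # push-gossip: receiver merges sender's set
    return view

alive = list(range(12))
detections = {0: {"p12"}, 5: {"p13"}, 9: {"p14"}}   # local observations
views = epidemic_consensus(alive, detections)
```

Once every view is identical, an operation like MPI_Comm_shrink can exclude the agreed failed set consistently on all survivors.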
Self-* properties through gossiping.
Babaoglu, Ozalp; Jelasity, Márk
2008-10-28
As computer systems have become more complex, numerous competing approaches have been proposed for these systems to self-configure, self-manage, self-repair, etc. such that human intervention in their operation can be minimized. In ubiquitous systems, this has always been a central issue as well. In this paper, we overview techniques to implement self-* properties in large-scale, decentralized networks through bio-inspired techniques in general, and gossip-based algorithms in particular. We believe that gossip-based algorithms could be an important inspiration for solving problems in ubiquitous computing as well. As an example, we outline a novel approach to arrange large numbers of mobile agents (e.g. vehicles, rescue teams carrying mobile devices) into different formations in a totally decentralized manner. The approach is inspired by the biological mechanism of cell sorting via differential adhesion, as well as by our earlier work in self-organizing peer-to-peer overlay networks.
Epidemic failure detection and consensus for extreme parallelism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katti, Amogh; Di Fatta, Giuseppe; Naughton, Thomas; ...
2017-02-01
Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a failure detection and consensus algorithm. This paper presents three novel failure detection and consensus algorithms using gossiping. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in all algorithms the number of gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and a perfect synchronization in achieving global consensus. The third approach is a three-phase distributed failure detection and consensus algorithm and provides consistency guarantees even in very large and extreme-scale systems while at the same time being memory and bandwidth efficient.
NASA Astrophysics Data System (ADS)
Malarz, K.; Szvetelszky, Z.; Szekfü, B.; Kułakowski, K.
2006-11-01
We consider the average probability X of being informed of a gossip in a given social network. The network is modeled within the random graph theory of Erdős and Rényi. In this theory, a network is characterized by two parameters: the size N and the link probability p. Our experimental data suggest three levels of social inclusion of friendship. The critical value p_c, for which half of the agents are informed, scales with the system size as N^{-γ} with γ ≈ 0.68. Computer simulations show that the probability X varies with p as a sigmoidal curve. The influence of the correlations between neighbors is also evaluated: with increasing clustering coefficient C, X decreases.
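A toy version of the simulated quantity can be sketched by building G(N, p) and letting a gossip spread along edges; this ignores the paper's friendship-level thresholds and measures only the reachable fraction from one seed:

```python
import random
from collections import deque

def informed_fraction(N, p, seed=0):
    """Build an Erdős-Rényi graph G(N, p), start a gossip at node 0, and
    spread it along edges; the returned fraction is the share of agents
    the gossip can ever reach (node 0's connected component)."""
    rng = random.Random(seed)
    adj = {v: [] for v in range(N)}
    for u in range(N):
        for v in range(u + 1, N):
            if rng.random() < p:
                adj[u].append(v)
                adj[v].append(u)
    seen, queue = {0}, deque([0])
    while queue:                      # breadth-first spreading
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) / N
```

Averaging this fraction over many random graphs and seeds, as a function of p, reproduces the kind of sigmoidal X-versus-p curve the abstract describes.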
Calibration of Smartphone-Based Weather Measurements Using Pairwise Gossip.
Zamora, Jane Louie Fresco; Kashihara, Shigeru; Yamaguchi, Suguru
2015-01-01
Accurate and reliable daily global weather reports are necessary for weather forecasting and climate analysis. However, the availability of these reports continues to decline due to the lack of economic support and policies in maintaining ground weather measurement systems from where these reports are obtained. Thus, to mitigate data scarcity, it is required to utilize weather information from existing sensors and built-in smartphone sensors. However, as smartphone usage often varies according to human activity, it is difficult to obtain accurate measurement data. In this paper, we present a heuristic-based pairwise gossip algorithm that will calibrate smartphone-based pressure sensors with respect to fixed weather stations as our referential ground truth. Based on actual measurements, we have verified that smartphone-based readings are unstable when observed during movement. Using our calibration algorithm on actual smartphone-based pressure readings, the updated values were significantly closer to the ground truth values.
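A minimal sketch of such pairwise calibration follows; treating stations as non-updating reference nodes, with the pair-selection rule, mixing probability, and round count as illustrative assumptions rather than the paper's heuristic:

```python
import random

def calibrate(phones, stations, rounds=5000, seed=0):
    """Sketch of pairwise-gossip calibration: station readings are fixed
    ground truth and never update; whenever a phone gossips with a
    partner (another phone or a station) it moves toward the pairwise
    average, so phone readings are pulled onto the reference values."""
    rng = random.Random(seed)
    x = list(phones)
    for _ in range(rounds):
        i = rng.randrange(len(x))
        if rng.random() < 0.5:                  # gossip with a station
            ref = rng.choice(stations)
            x[i] = (x[i] + ref) / 2.0           # station keeps its value
        else:                                   # gossip with another phone
            j = rng.randrange(len(x))
            if j != i:
                x[i] = x[j] = (x[i] + x[j]) / 2.0
    return x

truth = 1013.25                                          # hPa, station pressure
phones = [truth + b for b in (4.0, -2.5, 1.2, 6.3, -0.8)]  # biased readings
calibrated = calibrate(phones, [truth])
```

Because the stations act as stubborn agents, the phones' biases are driven toward zero rather than toward the phones' own average.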
Gossip and Distributed Kalman Filtering: Weak Consensus Under Weak Detectability
NASA Astrophysics Data System (ADS)
Kar, Soummya; Moura, José M. F.
2011-04-01
The paper presents the gossip interactive Kalman filter (GIKF) for distributed Kalman filtering for networked systems and sensor networks, where inter-sensor communication and observations occur at the same time-scale. The communication among sensors is random; each sensor occasionally exchanges its filtering state information with a neighbor depending on the availability of the appropriate network link. We show that under a weak distributed detectability condition: (1) the GIKF error process remains stochastically bounded, irrespective of the instability properties of the random process dynamics; and (2) the network achieves "weak consensus," i.e., the conditional estimation error covariance at a (uniformly) randomly selected sensor converges in distribution to a unique invariant measure on the space of positive semi-definite matrices (independent of the initial state). To prove these results, we interpret the filtered states (estimates and error covariances) at each node in the GIKF as stochastic particles with local interactions. We analyze the asymptotic properties of the error process by studying as a random dynamical system the associated switched (random) Riccati equation, the switching being dictated by a non-stationary Markov chain on the network graph.
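A toy scalar sketch conveys the gossip-exchange idea (it is not the GIKF or its analysis; all parameters are illustrative): only one of two sensors measures, so the other is not detectable on its own, and the dynamics are unstable (a > 1), yet random swaps of filter state keep the non-observing sensor's error bounded.

```python
import random

def gikf_demo(steps=300, a=1.05, q_noise=0.1, r_noise=0.5,
              swap_prob=0.5, seed=1):
    """Toy gossip-Kalman sketch: two sensors track x_{k+1} = a*x_k + w,
    but only sensor 0 measures (weak detectability). Each step the
    sensors swap their (estimate, variance) filter state with
    probability swap_prob. Returns the worst late-time absolute error
    at the non-observing sensor 1."""
    rng = random.Random(seed)
    x = 0.0
    est = [0.0, 0.0]  # per-sensor state estimates
    var = [1.0, 1.0]  # per-sensor error variances
    late_errs = []
    for t in range(steps):
        x = a * x + rng.gauss(0.0, q_noise)
        for i in (0, 1):
            # Kalman predict step at every sensor.
            est[i] = a * est[i]
            var[i] = a * a * var[i] + q_noise ** 2
        # Only sensor 0 measures and runs the update step.
        y = x + rng.gauss(0.0, r_noise)
        k = var[0] / (var[0] + r_noise ** 2)
        est[0] += k * (y - est[0])
        var[0] *= (1.0 - k)
        # Random gossip: swap whole filter states with a neighbor.
        if rng.random() < swap_prob:
            est[0], est[1] = est[1], est[0]
            var[0], var[1] = var[1], var[0]
        if t >= steps // 2:
            late_errs.append(abs(x - est[1]))
    return max(late_errs)
```

Without the swap step, sensor 1's error would grow like a^t; with swapping, its state is never more than a few predict steps stale, which is the stochastic boundedness phenomenon the abstract describes.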
Psychological adaptations for assessing gossip veracity.
Hess, Nicole H; Hagen, Edward H
2006-09-01
Evolutionary models of human cooperation are increasingly emphasizing the role of reputation and the requisite truthful "gossiping" about reputation-relevant behavior. If resources were allocated among individuals according to their reputations, competition for resources via competition for "good" reputations would have created incentives for exaggerated or deceptive gossip about oneself and one's competitors in ancestral societies. Correspondingly, humans should have psychological adaptations to assess gossip veracity. Using social psychological methods, we explored cues of gossip veracity in four experiments. We found that simple reiteration increased gossip veracity, but only for those who found the gossip relatively uninteresting. Multiple sources of gossip increased its veracity, as did the independence of those sources. Information that suggested alternative, benign interpretations of gossip decreased its veracity. Competition between a gossiper and her target decreased gossip veracity. These results provide preliminary evidence for psychological adaptations for assessing gossip veracity, mechanisms that might be used to assess veracity in other domains involving social exchange of information.
Tell me the gossip: the self-evaluative function of receiving gossip about others.
Martinescu, Elena; Janssen, Onne; Nijstad, Bernard A
2014-12-01
We investigate the self-evaluative function of competence-related gossip for individuals who receive it. Using the Self-Concept Enhancing Tactician (SCENT) model, we propose that individuals use evaluative information about others (i.e., gossip) to improve, promote, and protect themselves. Results of a critical incident study and an experimental study showed that positive gossip had higher self-improvement value than negative gossip, whereas negative gossip had higher self-promotion value and raised higher self-protection concerns than positive gossip. Self-promotion mediated the relationship between gossip valence and pride, while self-protection mediated the relationship between gossip valence and fear, although the latter mediated relationship emerged for receivers with mastery goals rather than performance goals. These results suggest that gossip serves self-evaluative functions for gossip receivers and triggers self-conscious emotions.
Gossip as an alternative for direct observation in games of indirect reciprocity.
Sommerfeld, Ralf D; Krambeck, Hans-Jürgen; Semmann, Dirk; Milinski, Manfred
2007-10-30
Communication about social topics is abundant in human societies, and many functions have been attributed to such gossiping. One of these proposed functions is the management of reputations. Reputation by itself has been shown to have a strong influence on cooperation dynamics in games of indirect reciprocity, and this notion helps to explain the observed high level of cooperation in humans. Here we designed a game to test a widespread assumption that gossip functions as a vector for the transmission of social information. This empirical study (with 14 groups of nine students each) focuses on the composition of gossip, information transfer by gossip, and the behavior based on gossip information. We show that gossip has a strong influence on the resulting behavior even when participants have access to the original information (i.e., direct observation) as well as gossip about the same information. Thus, it is evident that gossip has a strong manipulative potential. Furthermore, gossip about cooperative individuals is more positive than gossip about uncooperative individuals, gossip comments transmit social information successfully, and cooperation levels are higher when people encounter positive compared with negative gossip.
Countering Center Gossip--Guidelines for Implementing an Anti-Gossip Policy.
ERIC Educational Resources Information Center
Copeland, Margaret Leitch; Bruno, Holly Elissa
2001-01-01
Discusses gossip in early childhood settings as a threat to professionalism. Identifies reasons for staff gossip, provides guidance for developing an anti-gossip program policy, and presents an activity to distinguish gossip and shared information. Discusses how directors can influence parents' discussions with staff and get staff to confront each…
Moving beyond assumptions of deviance: The reconceptualization and measurement of workplace gossip.
Brady, Daniel L; Brown, Douglas J; Liang, Lindie Hanyu
2017-01-01
Despite decades of research from other academic fields arguing that gossip is an important and potentially functional behavior, organizational research has largely assumed that gossip is malicious talk. This has resulted in the proliferation of gossip items in deviance scales, effectively subsuming workplace gossip research into deviance research. In this paper, the authors argue that organizational research has traditionally considered only a very narrow subset of workplace gossip, focusing almost exclusively on extreme negative cases which are not reflective of typical workplace gossip behavior. Instead of being primarily malicious, typical workplace gossip can be either positive or negative in nature and may serve important functions. It is therefore recommended that workplace gossip be studied on its own, independent of deviance. To facilitate this, the authors reconceptualize the workplace gossip construct and then develop a series of general-purpose English- and Chinese-language workplace gossip scales. Using 8 samples (including qualitative, multisource, multiwave, and multicultural data), the authors demonstrate the construct validity, reliability, cross-cultural measurement invariance, and acceptable psychometric properties of the workplace gossip scales. Relationships are demonstrated between workplace gossip and a variety of other organizational variables and processes, including uncertainty, emotion validation, self-esteem, norm enforcement, networking, influence, organizational justice, performance, deviance, and turnover. Future directions in workplace gossip research are discussed.
Peng, Xiaozhe; Li, You; Wang, Pengfei; Mo, Lei; Chen, Qi
2015-01-01
In contrast to abstract trait words which describe people's general personality, gossip is about personal affairs of others. Although neural correlates underlying processing self-related trait words have been well documented, it remains poorly understood how the human brain processes gossip. In the present fMRI study, participants were instructed to rate their online emotional states upon hearing positive and negative gossip about celebrities, themselves, and their best friends. Explicit behavioral ratings suggested that participants were happier to hear positive gossip and more annoyed to hear negative gossip about themselves than about celebrities and best friends. At the neural level, dissociated neural networks were involved in processing the positive gossip about self and the negative gossip about celebrities. On the one hand, the superior medial prefrontal cortex responded not only to self-related gossip but also to moral transgressions, and neural activity in the orbital prefrontal cortex increased linearly with pleasure ratings on positive gossip about self. On the other hand, although participants' ratings did not show they were particularly happy on hearing negative gossip about celebrities, the significantly enhanced neural activity in the reward system suggested that they were indeed amused. Moreover, via enhanced functional connectivity, the prefrontal executive control network was involved in regulating the reward system by giving explicit pleasure ratings according to social norm compliance, rather than natural true feelings.
Gossiping to the Top: Observed Differences in Popular Adolescents' Gossip
ERIC Educational Resources Information Center
Wargo Aikins, Julie; Collibee, Charlene; Cunningham, Jessica
2017-01-01
Despite its omnipresence, quantitative research examining both the nature and the function of adolescent gossip has been limited. The present study aimed to address this limitation in the literature by examining the nature and function of adolescent gossip; in particular, it aimed to explore observed differences between the gossip of those popular…
Adaptive Peer Sampling with Newscast
NASA Astrophysics Data System (ADS)
Tölgyesi, Norbert; Jelasity, Márk
The peer sampling service is a middleware service that provides random samples from a large decentralized network to support gossip-based applications such as multicast, data aggregation and overlay topology management. Lightweight gossip-based implementations of the peer sampling service have been shown to provide good quality random sampling while also being extremely robust to many failure scenarios, including node churn and catastrophic failure. We identify two problems with these approaches. The first problem is related to message drop failures: if a node experiences a higher-than-average message drop rate then the probability of sampling this node in the network will decrease. The second problem is that the application layer at different nodes might request random samples at very different rates, which can result in very poor random sampling, especially at nodes with high request rates. We propose solutions for both problems. We focus on Newscast, a robust implementation of the peer sampling service. Our solution is based on simple extensions of the protocol and an adaptive self-control mechanism for its parameters: without involving failure detectors, nodes passively monitor local protocol events and use them as feedback in a local control loop that self-tunes the protocol parameters. The proposed solution is evaluated by simulation experiments.
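The basic Newscast exchange that the extensions above build on can be sketched as follows. Cache size, the initial ring topology, and the shared round clock are simplifications of this sketch, not properties of the actual protocol:

```python
import random

def run_newscast(n=20, cache=4, rounds=20, seed=0):
    """Newscast peer-sampling sketch: each node holds `cache`
    (peer, timestamp) entries. Once per round, every node merges
    caches with one random cached peer, both sides adding fresh
    self-entries and keeping only the freshest `cache` entries.
    Returns the set of distinct peers node 0 ever held, i.e. its
    stream of peer samples."""
    rng = random.Random(seed)
    # Initial caches: each node knows its `cache` ring successors.
    caches = {i: [((i + k) % n, 0) for k in range(1, cache + 1)]
              for i in range(n)}
    seen_by_0 = set(p for p, _ in caches[0])
    for clock in range(1, rounds + 1):
        for node in range(n):
            peer = rng.choice([p for p, _ in caches[node]])
            # Merge both caches plus fresh self-entries, newest wins.
            merged = {}
            for p, ts in (caches[node] + caches[peer]
                          + [(node, clock), (peer, clock)]):
                merged[p] = max(merged.get(p, -1), ts)
            for side in (node, peer):
                entries = [(p, ts) for p, ts in merged.items()
                           if p != side]  # a node never caches itself
                entries.sort(key=lambda e: (-e[1], e[0]))
                caches[side] = entries[:cache]
        seen_by_0.update(p for p, _ in caches[0])
    return seen_by_0
```

The timestamp-based eviction is what makes entries for dead or unreachable nodes age out, which is also why message drop failures bias the sampling as the abstract notes.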
On coffee talk and break-room chatter: perceptions of women who gossip in the workplace.
Farley, Sally D; Timme, Diane R; Hart, Jason W
2010-01-01
The present study examined perceptions of female gossipers in the workplace. Male and female participants (N=129) were asked to think of a woman who either frequently or rarely contributed negative information about other people during conversation. Participants then completed ratings on the target using the six dimensions of the FIRO-B. As predicted, high gossipers were perceived as having a greater need to exert control of others, but less need for others to control them, than low gossipers. High gossipers were also perceived as less emotionally warm than low gossipers. The implications of these findings for gossip research are presented.
Distributed parameter estimation in unreliable sensor networks via broadcast gossip algorithms.
Wang, Huiwei; Liao, Xiaofeng; Wang, Zidong; Huang, Tingwen; Chen, Guo
2016-01-01
In this paper, we present an asynchronous algorithm to estimate the unknown parameter under an unreliable network which allows new sensors to join and old sensors to leave, and can tolerate link failures. Each sensor has access to partially informative measurements when it is awakened. In addition, the proposed algorithm can avoid the interference among messages and effectively reduce the accumulated measurement and quantization errors. Based on the theory of stochastic approximation, we prove that our proposed algorithm almost surely converges to the unknown parameter. Finally, we present a numerical example to assess the performance and the communication cost of the algorithm.
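A hedged sketch of the broadcast-gossip estimation idea, under stated assumptions (ring topology, scalar parameter, a particular decaying step size; none of the paper's join/leave, link-failure, or quantization handling): each awakened sensor takes a noisy measurement, applies a stochastic-approximation step, and broadcasts its estimate to its neighbors.

```python
import random

def broadcast_gossip_estimate(theta=3.0, n=10, steps=4000, seed=0):
    """Asynchronous broadcast-gossip parameter estimation sketch: at
    each tick one sensor wakes, measures the unknown theta with noise,
    moves toward the measurement with a decaying step size, and
    broadcasts its estimate to its two ring neighbors, who average it
    into their own. Returns the final per-sensor estimates."""
    rng = random.Random(seed)
    est = [0.0] * n
    for k in range(1, steps + 1):
        i = rng.randrange(n)                  # sensor i wakes up
        y = theta + rng.gauss(0.0, 0.5)       # noisy partial measurement
        gamma = 10.0 / (50.0 + k)             # decaying SA step size
        est[i] += gamma * (y - est[i])
        for j in ((i - 1) % n, (i + 1) % n):  # broadcast to neighbors
            est[j] = 0.5 * (est[j] + est[i])
    return est
```

The decaying step size is what trades early responsiveness for asymptotic noise suppression; the almost-sure convergence claimed in the abstract rests on exactly this kind of diminishing-gain stochastic approximation.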
Gossip and nurses: malady or remedy?
Thomas, Sarah A; Rozell, Elizabeth J
2007-01-01
Gossip is a natural part of every social setting and has a profound influence on organizational behaviors. As the primary care givers in the hospital setting, nurses are the front line in generating and controlling gossip. It is essential that management recognize this dynamic in the nursing workforce so they can be proactive in developing strategies to effectively control gossip. This article highlights the positive and negative aspects of gossip and provides strategies to help nursing professionals effectively manage this workplace issue. Unmanaged gossip can have a negative effect on the workplace by damaging relationships and reputations. Gossip that is managed effectively can have a positive effect on the workplace by building social bonds within the nursing unit.
ERIC Educational Resources Information Center
Turner, Monique Mitchell; Mazur, Michelle A.; Wendel, Nicole; Winslow, Robert
2003-01-01
Explores gossip's function as a social influence tool. Considers if gossip is untrustworthy, leading to relational demise, or whether gossip can lead to perceived liking, trust, and expertise. Indicates that for the undergraduate student subjects, both positive and negative gossip are perceived negatively for both friends and strangers. (SG)
Gossip spread in social network Models
NASA Astrophysics Data System (ADS)
Johansson, Tobias
2017-04-01
Gossip almost inevitably arises in real social networks. In this article we investigate the relationship between the number of friends of a person and limits on how far gossip about that person can spread in the network. How far gossip travels in a network depends on two sets of factors: (a) factors determining gossip transmission from one person to the next and (b) factors determining network topology. For a simple model where gossip is spread among people who know the victim it is known that a standard scale-free network model produces a non-monotonic relationship between number of friends and expected relative spread of gossip, a pattern that is also observed in real networks (Lind et al., 2007). Here, we study gossip spread in two social network models (Toivonen et al., 2006; Vázquez, 2003) by exploring the parameter space of both models and fitting them to a real Facebook data set. Both models can produce the non-monotonic relationship of real networks more accurately than a standard scale-free model while also exhibiting more realistic variability in gossip spread. Of the two models, the one given in Vázquez (2003) best captures both the expected values and variability of gossip spread.
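The friends-only spread rule referenced above (gossip about a victim travels only among people who know the victim) can be sketched directly. The adjacency representation and the random starting friend are simplifying assumptions of this sketch:

```python
import random
from collections import deque

def gossip_spread(adj, victim, rng):
    """Friends-only gossip-spread sketch: gossip about `victim` starts
    at one random friend and travels only along links between people
    who also know the victim. `adj` maps node -> neighbor list.
    Returns the fraction of the victim's friends eventually informed
    (the spread factor)."""
    friends = set(adj[victim])
    if not friends:
        return 0.0
    start = rng.choice(sorted(friends))
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v in friends and v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) / len(friends)

# On a complete graph every friend of the victim hears the gossip;
# on a path, gossip cannot jump between unacquainted friends.
```

Because spread depends on how well the victim's friends know each other, not just on how many there are, the relationship between degree and spread can be non-monotonic, as the article discusses.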
Seki, Motohide; Nakamaru, Mayuko
2016-10-21
Indirect reciprocity is considered to be important for explaining altruism among humans. The evolution of altruism has been modeled using several types of reputational scores, most of which were assumed to be updated immediately after each game session. In this study, we introduce gossip sessions held between game sessions to capture the spread of reputation and examine the effects of false information intentionally introduced by some players. Analytical and individual-based simulation results indicated that the frequent exchange of gossip favored the evolution of altruism when no players started false information. In contrast, intermediate repetitions of gossip sessions were favored when the population included liars or biased gossipers. In addition, we found that a gossip listener's strategy of incorporating any gossip regardless of speakers usually worked better than an alternative strategy of not believing gossip from untrustworthy players.
NASA Astrophysics Data System (ADS)
Xin, Qin; Yao, Xiaolan; Engelstad, Paal E.
2010-09-01
Wireless mesh networking is an emerging communication paradigm to enable resilient, cost-efficient and reliable services for future-generation wireless networks. We study here the minimum-latency communication primitive of gossiping (all-to-all communication) in multi-hop ad-hoc Wireless Mesh Networks (WMNs). Each mesh node in the WMN is initially given a message and the objective is to design a minimum-latency schedule such that each mesh node distributes its message to all other mesh nodes. The minimum-latency gossiping problem is well known to be NP-hard even for the scenario in which the topology of the WMN is known to all mesh nodes in advance. In this paper, we propose a new latency-efficient approximation scheme that can accomplish the gossiping task in a polynomial number of time units in any ad-hoc WMN under the assumption of a Large Interference Range (LIR), i.e., an interference range much larger than the transmission range. To the best of our knowledge, this is the first time such a scenario has been investigated in ad-hoc WMNs under LIR. Our algorithm also allows the labels (e.g., identifiers) of the mesh nodes to be polynomially large in the size of the WMN, a setting of large labels that likewise has not previously been considered in ad-hoc WMNs under LIR. Furthermore, our gossiping scheme can serve as a framework that extends readily to mobility-related scenarios, since we assume the mesh nodes have no knowledge of the network topology, not even of their neighboring mesh nodes.
Rethinking gossip and scandal in healthcare organizations.
Waddington, Kathryn
2016-09-19
Purpose: The purpose of this paper is to argue that gossip is a neglected aspect of organizational communication and knowledge, and an under-used management resource. Design/methodology/approach: The paper challenges mainstream managerial assumptions that gossip is trivial or tainted talk which should be discouraged in the workplace. Instead, gossip is re-framed at an organizational level of analysis, which provides the opportunity for relational knowledge about systemic failure and poor practice in healthcare to surface. Findings: Rather than simply viewing gossip as an individual behaviour and interpersonal process, it is claimed that organizational gossip is also a valuable early warning indicator of risk and failure in healthcare systems. There is potentially significant value in re-framing gossip as an aspect of organizational communication and knowledge. If attended to (rather than neglected or silenced), gossip can provide fresh insights into professional practice, decision making and relational leadership. Originality/value: This paper offers a provocative challenge to mainstream health organization and management thinking about gossip in the workplace. It offers new ways of thinking to promote patient safety, and prevent the scandals that have plagued healthcare organizations in recent years.
Multiple gossip statements and their effect on reputation and trustworthiness.
Sommerfeld, Ralf D; Krambeck, Hans-Jürgen; Milinski, Manfred
2008-11-07
Empirical and theoretical evidence from various disciplines indicates that reputation, reputation building and trust are important for human cooperation, social behaviour and economic progress. Recently, it has been shown that reputation gained in games of indirect reciprocity can be transmitted by gossip. But it has also been shown that gossiping has a strong manipulative potential. We propose that this manipulative potential is alleviated by the abundance of gossip. Multiple gossip statements give a better picture of the actual behaviour of a person, and thus inaccurate or fake gossip has little power as long as it is in the minority. In addition, we investigate the supposedly strong connection between reciprocity, reputation and trust. The results of this experimental study (with 11 groups of 12 students each) document that gossip quantity helps to direct cooperation towards cooperators. Moreover, reciprocity, trust and reputations transferred via gossip are positively correlated. This interrelation might have helped to reach the high levels of cooperation that can be observed in humans.
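The abundance argument, that inaccurate or fake gossip has little power as long as it is in the minority, can be illustrated with a simple majority-vote sketch. The liar fraction and the voting rule are illustrative assumptions, not the experimental design of the study:

```python
import random

def correct_judgment_rate(n_statements, liar_fraction=0.3,
                          trials=2000, seed=0):
    """Sketch of the abundance-of-gossip argument: a receiver judges a
    true cooperator from n_statements gossip statements. Honest
    gossipers (probability 1 - liar_fraction per statement) report
    'cooperative'; liars report the opposite. The receiver takes a
    strict majority vote (ties count as wrong). Returns the fraction
    of trials yielding a correct judgment."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        honest = sum(1 for _ in range(n_statements)
                     if rng.random() > liar_fraction)
        if honest > n_statements - honest:  # honest statements dominate
            correct += 1
    return correct / trials
```

With a 30% liar minority, a single statement is wrong 30% of the time, but nine statements push the correct-judgment rate above 90%: multiple gossip statements give a better picture of actual behaviour, exactly as proposed above.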
Familiarity with interest breeds gossip: contributions of emotion, expectation, and reputation.
Yao, Bo; Scott, Graham G; McAleer, Phil; O'Donnell, Patrick J; Sereno, Sara C
2014-01-01
Although gossip serves several important social functions, it has relatively infrequently been the topic of systematic investigation. In two experiments, we advance a cognitive-informational approach to gossip. Specifically, we sought to determine which informational components engender gossip. In Experiment 1, participants read brief passages about other people and indicated their likelihood to share this information. We manipulated target familiarity (celebrity, non-celebrity) and story interest (interesting, boring). While participants were more likely to gossip about celebrity than non-celebrity targets and interesting than boring stories, they were even more likely to gossip about celebrity targets embedded within interesting stories. In Experiment 2, we additionally probed participants' reactions to the stories concerning emotion, expectation, and reputation information conveyed. Analyses showed that while such information partially mediated target familiarity and story interest effects, only expectation and reputation accounted for the interactive pattern of gossip behavior. Our findings provide novel insights into the essential components and processing mechanisms of gossip.
Gossip, humor, and the art of becoming an intimate of Jesus.
Capps, Donald
2012-03-01
In Living Stories (Capps 1997) I addressed the rather broad consensus among clergy and laity alike that gossip is destructive of congregational life, a consensus based on the view that gossip invariably involves negatively critical conversations about other individuals and groups. However, this view is not supported by social scientific research and literary studies on gossip, which present a more complex picture of this form of human communication. On the other hand, the claim that gossip is trivial is more difficult to challenge, so I made a case for the importance of the trivial through consideration of the formal similarities between gossip and the narratives that comprise the Gospels, including the fact that both employ an "esthetic of surfaces" that focuses on specific personal particulars and that the stories that are told derive their power from the freedom that the participants in the conversation gain from entering imaginatively into the life of other persons. The present article furthers the exploration of the affinities between gossip and Gospel narratives by noting the role of humor in fostering good gossip and the mutually supportive role of gossip and humor in the art of becoming an intimate of Jesus.
Gossiping About Deviance: Evidence That Deviance Spurs the Gossip That Builds Bonds.
Peters, Kim; Jetten, Jolanda; Radova, Dagmar; Austin, Kacie
2017-11-01
We propose that the gossip that is triggered when people witness behaviors that deviate from social norms builds social bonds. To test this possibility, we showed dyads of unacquainted students a short video of everyday campus life that either did or did not include an incident of negative or positive deviance (dropping or cleaning up litter). Study 1 showed that participants in the deviance conditions reported having a greater understanding of campus social norms than those in the control condition; they also expressed a greater desire to gossip about the video. Study 2 found that when given the opportunity, participants did gossip about the deviance, and this gossip was associated with increased norm clarification and (indirectly) social cohesion. These findings suggest that gossip may be a mechanism through which deviance can have positive downstream social consequences.
Collaborative emitter tracking using Rao-Blackwellized random exchange diffusion particle filtering
NASA Astrophysics Data System (ADS)
Bruno, Marcelo G. S.; Dias, Stiven S.
2014-12-01
We introduce in this paper the fully distributed, random exchange diffusion particle filter (ReDif-PF) to track a moving emitter using multiple received signal strength (RSS) sensors. We consider scenarios with both known and unknown sensor model parameters. In the unknown parameter case, a Rao-Blackwellized (RB) version of the random exchange diffusion particle filter, referred to as the RB ReDif-PF, is introduced. In a simulated scenario with a partially connected network, the proposed ReDif-PF outperformed a PF tracker that assimilates local neighboring measurements only and also outperformed a linearized random exchange distributed extended Kalman filter (ReDif-EKF). Furthermore, the novel ReDif-PF matched the tracking error performance of alternative suboptimal distributed PFs based respectively on iterative Markov chain move steps and selective average gossiping with an inter-node communication cost that is roughly two orders of magnitude lower than the corresponding cost for the Markov chain and selective gossip filters. Compared to a broadcast-based filter which exactly mimics the optimal centralized tracker or its equivalent (exact) consensus-based implementations, ReDif-PF showed a degradation in steady-state error performance. However, compared to the optimal consensus-based trackers, ReDif-PF is better suited for real-time applications since it does not require iterative inter-node communication between measurement arrivals.
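A toy sketch conveys the random-exchange idea behind ReDif-PF. It assumes a scalar target, Gaussian position measurements at every node rather than RSS sensors, and one pairwise swap per step; it is not the paper's filter or its Rao-Blackwellized variant:

```python
import math
import random

def redif_pf_demo(steps=100, n_nodes=4, n_particles=200, seed=2):
    """Random-exchange diffusion particle filter sketch: each node
    runs a local bootstrap particle filter on its own noisy
    measurement of a scalar random-walk target, then one random pair
    of ring neighbors swaps whole particle sets, so information
    diffuses without iterative inter-node consensus. Returns node 0's
    mean absolute estimation error over the run."""
    rng = random.Random(seed)
    x = 0.0
    parts = [[rng.gauss(0.0, 1.0) for _ in range(n_particles)]
             for _ in range(n_nodes)]
    errs = []
    for _ in range(steps):
        x += rng.gauss(0.0, 0.3)                  # target motion
        for i in range(n_nodes):
            z = x + rng.gauss(0.0, 0.5)           # local measurement
            # Bootstrap PF: propagate, weight by likelihood, resample.
            prop = [p + rng.gauss(0.0, 0.3) for p in parts[i]]
            w = [math.exp(-0.5 * ((z - p) / 0.5) ** 2) for p in prop]
            total = sum(w)
            if total > 0.0:
                parts[i] = rng.choices(prop, weights=w, k=n_particles)
            else:                                 # degenerate weights
                parts[i] = prop
        # Random exchange: one node swaps particle sets with a random
        # ring neighbor (a single cheap message pair per step).
        i = rng.randrange(n_nodes)
        j = (i + rng.choice((-1, 1))) % n_nodes
        parts[i], parts[j] = parts[j], parts[i]
        errs.append(abs(x - sum(parts[0]) / n_particles))
    return sum(errs) / len(errs)
```

One swap per step is what keeps the communication cost low relative to consensus-based trackers, at the price of the steady-state degradation the abstract reports.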
Developmental and Gender Differences in Preadolescents' Judgments of the Veracity of Gossip.
ERIC Educational Resources Information Center
Kuttler, Ami Flam; Parker, Jeffrey G.; La Greca, Annette M.
2002-01-01
Used hypothetical vignettes to examine 384 preadolescents' understanding of gossip in varying circumstances. Found that children correctly labeled talk about nonpresent others as gossip and considered it inappropriate. Skepticism was higher for gossip than for firsthand information and was greatest with cues suggesting that speakers were…
Gossip: does it play a role in the socialization of nurses?
Laing, M
1993-01-01
Despite its generally negative reputation, gossip continues to be a significant genre of communication in every society. The initial purpose of this paper is to scrutinize gossip from historical, analytical and feminist perspectives. An extensive review of the literature suggests gossip serves three primary functions: information, influence or social control and entertainment. The second purpose is to explore how the functions of gossip may contribute to the socialization of nurses to their professional role and to their work culture.
Anderson, Eric; Siegel, Erika H; Bliss-Moreau, Eliza; Barrett, Lisa Feldman
2011-06-17
Gossip is a form of affective information about who is friend and who is foe. We show that gossip does not only influence how a face is evaluated; it affects whether a face is seen in the first place. In two experiments, neutral faces were paired with negative, positive, or neutral gossip and were then presented alone in a binocular rivalry paradigm (faces were presented to one eye, houses to the other). In both studies, faces previously paired with negative (but not positive or neutral) gossip dominated longer in visual consciousness. These findings demonstrate that gossip, as a potent form of social affective learning, can influence vision in a completely top-down manner, independent of the basic structural features of a face.
Gossip Versus Punishment: The Efficiency of Reputation to Promote and Maintain Cooperation.
Wu, Junhui; Balliet, Daniel; Van Lange, Paul A M
2016-04-04
Prior theory suggests that reputation spreading (e.g., gossip) and punishment are two key mechanisms to promote cooperation in groups, but no behavioral research has yet examined their relative effectiveness and efficiency in promoting and maintaining cooperation. To examine these issues, we observed participants interacting in a four-round public goods game (PGG) with or without gossip and punishment options, and a subsequent two-round trust game (TG). We manipulated gossip as the option to send notes about other group members to these members' future partners, and punishment as the option to assign deduction points to reduce other group members' outcomes with a fee-to-fine ratio of 1:3. Findings revealed that in the four-round PGG, the option to gossip increased both cooperation and individual earnings, whereas the option to punish had no overall effect on cooperation (but a positive effect on cooperation in the last two rounds of the PGG) and significantly decreased individual earnings. Importantly, the initial option to gossip made people more trusting and trustworthy in the subsequent TG when gossip was no longer possible, compared to the no-gossip condition. Thus, we provide some initial evidence that gossip may be more effective and efficient than punishment to promote and maintain cooperation.
Utilities of gossip across organizational levels : Multilevel selection, free-riders, and teams.
Kniffin, Kevin M; Wilson, David Sloan
2005-09-01
Gossip is a subject that has been studied by researchers from an array of disciplines with various foci and methods. We measured the content of language use by members of a competitive sports team across 18 months, integrating qualitative ethnographic methods with quantitative sampling and analysis. We hypothesized that the use of gossip will vary significantly depending on whether it is used for self-serving or group-serving purposes. Our results support a model of gossip derived from multilevel selection theory that expects gossip to serve group-beneficial rules when rewards are partitioned at the group level on a scale that permits mutual monitoring. We integrate our case study with earlier studies of gossip conducted by anthropologists, psychologists, and management researchers.
What's to be done when 'foul whisp'rings are abroad'? Gossip and rumour in health organisations.
O'Connor, Nick; Kotze, Beth; Storm, Victor
2018-02-01
This article explores the relevance of gossip and rumour to health organisations and presents what limited empirical research is available specific to the management of gossip and rumour in health organisations. The concept of a sentinel function for gossip and rumour in health organisations is proposed as a topic worthy of further research.
Gossip and Occupational Ideology
ERIC Educational Resources Information Center
Rysman, Alexander R.
1976-01-01
Defines the transmission of gossip as an essential social process reflecting a shared group membership and discusses the ways in which gossip supports ideologies held by members of a specific occupation. (MH)
Gossip, Drama, and Technology: How South Asian American Young Women Negotiate Gender On and Offline
ERIC Educational Resources Information Center
Subramanian, Mathangi
2013-01-01
Gossip, defined as evaluative talk about a third party, is a powerful tool for establishing in- and out-group norms and determining belonging. Drama, a form of gossip that is evolving in online spaces, is the process of fighting back against gossip and rumors designed to isolate and ostracise. While literature commonly portrays women as victims or…
Gossip and emotion in nursing and health-care organizations.
Waddington, Kathryn; Fletcher, Clive
2005-01-01
The purpose of this paper is to examine the relationship between gossip and emotion in health-care organizations. It draws on findings from empirical research exploring the characteristics and function of gossip which, to date, has been a relatively under-researched organizational phenomenon. A multidisciplinary approach was adopted, drawing on an eclectic range of discipline-based theories, skills, ideas and data. Methods included repertory grid technique, in-depth interviews and structured diary records of work-related gossip. The sample comprised 96 qualified nurses working in a range of practice areas and organizational settings in the UK. Template analysis was used to integrate findings across three phases of data collection. The findings revealed that gossip is used to express a range of emotions including care and concern about others, anger, annoyance and anxiety, with emotional outcomes that include feeling reassured and supported. It is the individual who gossips, while the organization provides the content, emotional context, triggers and opportunities. Nurses were chosen as an information-rich source of data, but the findings may simply reflect the professional culture and practice of nursing. Future research should take into account a wider range of health-care organizational roles and perspectives in order to capture the dynamics and detail of the emotions and relationships that initiate and sustain gossip. Because gossip makes people feel better it may serve to reinforce the "stress mask of professionalism", hiding issues of conflict, vulnerability and intense emotion. Managers need to consider what the emotions expressed through gossip might represent in terms of underlying issues relating to organizational health, communication and change. 
This paper makes a valuable contribution to the under-researched phenomenon of gossip in organizations and adds to the growing field of research into the role of emotion in health-care organizations and emotion work in nursing.
Evolution of gossip-based indirect reciprocity on a bipartite network
Giardini, Francesca; Vilone, Daniele
2016-01-01
Cooperation can be supported by indirect reciprocity via reputation. Thanks to gossip, reputations are built and circulated and humans can identify defectors and ostracise them. However, the evolutionary stability of gossip is allegedly undermined by the fact that it is more error-prone than direct observation, whereas ostracism could be ineffective if the partner selection mechanism is not robust. The aim of this work is to investigate the conditions under which the combination of gossip and ostracism might support cooperation in groups of different sizes. We are also interested in exploring the extent to which errors in transmission might undermine the reliability of gossip as a mechanism for identifying defectors. Our results show that a large quantity of gossip is necessary to support cooperation, and that group structure can mitigate the effects of errors in transmission. PMID:27885256
GOSSIP, a New VO Compliant Tool for SED Fitting
NASA Astrophysics Data System (ADS)
Franzetti, P.; Scodeggio, M.; Garilli, B.; Fumana, M.; Paioro, L.
2008-08-01
We present GOSSIP (Galaxy Observed-Simulated SED Interactive Program), a new tool developed to perform SED fitting in a simple, user-friendly and efficient way. GOSSIP automatically builds up the observed SED of an object (or a large sample of objects) by combining magnitudes in different bands and, where available, a spectrum; it then performs a χ^2-minimization fitting procedure against a set of synthetic models. The fitting results are used to estimate a number of physical parameters such as the star formation history, absolute magnitudes and stellar mass, together with their probability distribution functions. User-defined models can be used, but GOSSIP is also able to load models produced by the most commonly used population synthesis codes. GOSSIP can be used interactively with other visualization tools using the PLASTIC protocol for communications. Moreover, since it has been developed with large data sets in mind, it will be extended to operate within the Virtual Observatory framework. GOSSIP is distributed to the astronomical community from the PANDORA group web site (http://cosmos.iasf-milano.inaf.it/pandora/gossip.html).
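The χ² model-selection step that such a fitting procedure is built around can be sketched as follows (a minimal illustration only: it ignores the normalization fit and the PDF estimation GOSSIP performs, and all names and numbers are invented):

```python
import numpy as np

def chi2_best_fit(obs_mag, obs_err, model_grid):
    """Return the index of the model SED minimizing chi^2 against the
    observed magnitudes, plus that chi^2 value.
    model_grid: (n_models, n_bands) array of model magnitudes."""
    chi2 = np.sum(((model_grid - obs_mag) / obs_err) ** 2, axis=1)
    best = int(np.argmin(chi2))
    return best, float(chi2[best])

# toy observation in three bands with uniform 0.1 mag errors
obs = np.array([22.1, 21.5, 21.0])
err = np.array([0.1, 0.1, 0.1])
models = np.array([[22.0, 21.6, 21.1],   # close to the observations
                   [23.0, 22.5, 22.0]])  # far from them
best, chi2 = chi2_best_fit(obs, err, models)
# best == 0, chi2 == 3.0 (each band contributes one sigma squared)
```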
Staff gossip tells managers what's really going on.
2016-10-19
'Gossip helps keep patients safe and organisations healthy, study claims' was the headline in our most popular online story of the past week. Perhaps it's no surprise that an item on gossiping got our readers talking.
Rumors and gossip: a guide for the health care supervisor.
Dowd, S B; Davidhizar, R; Dowd, L P
1997-09-01
Rumor and gossip are long-standing means of communication among humans and are prevalent in health care settings in part due to the nature of the organization. Rumor and gossip may be negative or positive, and health care supervisors should monitor the grapevine and consider themselves personally responsible for transmitting accurate information whenever possible to ensure that rumor and gossip do not have a negative effect on the department or institution.
Deterrence and transmission as mechanisms ensuring reliability of gossip.
Giardini, Francesca
2012-10-01
Spreading information about the members of one's group is one of the most universal human behaviors. Thanks to gossip, individuals can acquire information about their peers without sustaining the burden of costly interactions with cheaters, and they can also create and revise social bonds. Gossip also has several positive functions at the group level, promoting cohesion and norm compliance. However, gossip can be unreliable, and can be used to damage others' reputation or to circulate false information, thus becoming detrimental to the people involved and useless for the group. In this work, we propose a theoretical model in which the reliability of gossip depends on the joint functioning of two distinct mechanisms. Thanks to the first, deterrence, individuals tend to avoid informational cheating because they fear punishment and the disruption of social bonds. The second, transmission, provides humans with the opportunity to reduce the consequences of cheating through manipulation of the source of gossip.
The proactive management of rumor and gossip.
Ribeiro, V E; Blakeley, J A
1995-06-01
Gossip and rumor are common forms of communication in the workplace. Consequently, nursing administrators are challenged to find ways to manage these grapevine activities. The authors present an analysis of gossip and rumor and discuss strategies for their prevention and control.
ERIC Educational Resources Information Center
Anderson, Bill
1995-01-01
Workplace gossip has consequences: stealing time, damaging morale, and hurting people. Gossip feeds on truth and falsehood, can become part of "corporate culture," and is inevitable. Administrators should plug information gaps, set a good example, promote a calm atmosphere, anticipate rumors, use informative staff newsletters, and teach…
The spread of gossip in American schools
NASA Astrophysics Data System (ADS)
Lind, P. G.; da Silva, L. R.; Andrade, J. S., Jr.; Herrmann, H. J.
2007-06-01
Gossip is defined as a rumor which specifically targets one individual and essentially propagates only within that individual's friendship connections. How fast and how far gossip can spread is assessed quantitatively for the first time in this study. For that purpose we introduce the "spread factor" and study it on empirical networks of school friendships as well as on various models for social connections. We discover that there exists an ideal number of friendship connections an individual should have to minimize the danger of gossip propagation.
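Under one plausible reading of the abstract (gossip about a target propagates only along friendships among the target's own friends), a spread factor can be estimated with a small breadth-first search. This is a sketch of that reading, not the paper's exact definition, and all names are invented:

```python
from collections import deque

def spread_factor(adj, victim, origin):
    """Gossip about `victim`, started by friend `origin`, propagates only
    along friendships between the victim's own friends; return the
    fraction of those friends eventually reached."""
    friends = set(adj[victim])
    if origin not in friends:
        return 0.0
    reached, queue = {origin}, deque([origin])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v in friends and v not in reached:
                reached.add(v)
                queue.append(v)
    return len(reached) / len(friends)

# victim 0 has friends 1, 2, 3; friends 1 and 2 know each other, 3 does not
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0]}
# gossip started by friend 1 reaches friend 2 but can never reach friend 3,
# so the spread factor is 2/3
```

The toy network already illustrates the paper's point: the victim's connectivity pattern, not just degree, controls how far gossip can travel.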
McGuigan, Nicola; Cubillo, Marcus
2013-01-01
The authors' aim was to use a highly novel open diffusion paradigm to investigate the transmission of social information (i.e., gossip) and general knowledge within 2 groups of 10- and 11-year-old children. Four children, 2 from each group, acted as a primed information source, selected on the basis of sex and dominance ranking (high or low) within the group. Each source received 1 piece of gossip and 1 piece of general knowledge from the experimenter during natural class interaction, and the information was allowed to diffuse naturally within the group. Results revealed that gossip was transmitted more frequently than knowledge, and that male sources were more likely to transmit gossip than female sources. The relationship between characteristics of the source and characteristics of the gossip recipient also appeared influential, with the dominant male source transmitting gossip exclusively to friends, and the nondominant male source transmitting to individuals of higher peer regard than themselves.
Smith, Eliot R
2014-11-01
Although person perception is central to virtually all human social behavior, it is ordinarily studied in isolated individual perceivers. Conceptualizing it as a socially distributed process opens up a variety of novel issues, which have been addressed in scattered literatures mostly outside of social psychology. This article examines some of these issues using a series of multiagent models. Perceivers can use gossip (information from others about social targets) to improve their ability to detect targets who perform rare negative behaviors. The model suggests that they can simultaneously protect themselves against being influenced by malicious gossip intended to defame specific targets. They can balance these potentially conflicting goals by using specific strategies including disregarding gossip that differs from a personally obtained impression. Multiagent modeling demonstrates the outcomes produced by different combinations of assumptions about gossip, and suggests directions for further research and theoretical development. © 2014 by the Society for Personality and Social Psychology, Inc.
Gans, Jerome S
2014-01-01
Although what transpires in group therapy is not gossip per se (except perhaps when absent or former members are discussed), listening to group interaction through an understanding of the dynamics of gossip can contribute to a greater appreciation of group dynamics and group leadership as well as enlarge therapeutic space. After examining the interpersonal dynamics of gossip, this paper discusses six ways in which an understanding of these dynamics can inform group leadership and shed light on group psychotherapy. Central features of gossip that appear in group interactions are explored: these include projection, displacement, self-esteem regulation, clarification of motivation, unself-consciousness, social comparison and bonding, avoidance of psychic pain, and making the ego-syntonic dystonic. The lively use of imagination in the mature phase of group therapy is conceived of as the time when the darker side of human nature (imagined gossip harnessed for therapeutic purposes) can be welcomed in and processed in a kind, playful, and compassionate manner.
Load Balancing in Structured P2P Networks
NASA Astrophysics Data System (ADS)
Zhu, Yingwu
In this chapter we start by addressing the importance and necessity of load balancing in structured P2P networks, which stems from three main reasons. First, structured P2P networks assume uniform peer capacities, while peer capacities are heterogeneous in deployed P2P networks. Second, reliance on the pseudo-uniformity of the hash function used to generate node IDs and data-item keys leads to an imbalanced overlay address space and item distribution. Lastly, the placement of data items cannot be randomized in some applications (e.g., range searching). We then present an overview of the load aggregation and dissemination techniques that many load balancing algorithms require. Two techniques are discussed, a tree-structure-based approach and a gossip-based approach; they make different tradeoffs between estimate/aggregate accuracy and failure resilience. To address the issue of load imbalance, three main solutions are described: the virtual server-based approach, the power of two choices, and address-space and item balancing. While different in their designs, they all aim to improve the balance of the address space and data item distribution. As a case study, the chapter discusses a virtual server-based load balancing algorithm that strives to ensure fair load distribution among nodes and to minimize the bandwidth cost of load balancing. Finally, the chapter concludes with future research directions and a summary.
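The gossip-based aggregation approach mentioned above can be illustrated with pairwise averaging, the simplest such scheme: repeated random pairwise exchanges drive every node's estimate toward the global average load, which each node can then compare to its own load. This is a generic sketch, not the chapter's specific algorithm, and all names are invented:

```python
import random

def gossip_average(loads, steps=2000, rng=random.Random(1)):
    """Pairwise-averaging gossip: at each step two random nodes replace
    their estimates with the mean of the two. The sum is conserved, so
    all estimates converge to the global average."""
    est = list(loads)
    n = len(est)
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        est[i] = est[j] = (est[i] + est[j]) / 2
    return est

# four nodes with heterogeneous loads; true average is 4.0
est = gossip_average([10.0, 0.0, 2.0, 4.0])
# after enough exchanges every local estimate is close to 4.0
```

Unlike a tree aggregate, no node plays a special role here, which is the failure-resilience side of the tradeoff the chapter describes.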
EZ and GOSSIP, two new VO compliant tools for spectral analysis
NASA Astrophysics Data System (ADS)
Franzetti, P.; Garilli, B.; Fumana, M.; Paioro, L.; Scodeggio, M.; Paltani, S.; Scaramella, R.
2008-10-01
We present EZ and GOSSIP, two new VO compliant tools dedicated to spectral analysis. EZ is a tool to perform automatic redshift measurement; GOSSIP is a tool created to perform the SED fitting procedure in a simple, user-friendly and efficient way. These two tools have been developed by the PANDORA Group at INAF-IASF (Milano); EZ has been developed in collaboration with Osservatorio Monte Porzio (Roma) and the Integral Science Data Center (Geneva). EZ is released to the astronomical community; GOSSIP is currently in beta-testing.
Gossip-Free Zones: Problem Solving to Prevent Power Struggles
ERIC Educational Resources Information Center
Bruno, Holly
2007-01-01
Gossiping staff in early childhood programs models destructive behavior and harms families' trust in the professionalism of the program. Bruno identifies some of the causes and motives for gossip and examines its occurrence in early care and education settings. She offers step-by-step strategies to help supervisors set and enact policies that…
Entering a Crack: An Encounter with Gossip
ERIC Educational Resources Information Center
Henderson, Linda
2014-01-01
In this paper, I enter a crack to think otherwise about the concept "gossip". Drawing on previous scholarship engaging with Deleuzian concepts to inform research methodologies, this paper builds on this body of work. Following Deleuze and Guattari, the paper undertakes a mapping of gossip, subsequent to an encounter with a crack.…
Analyzing Rumors, Gossip, and Urban Legends through Their Conversational Properties
ERIC Educational Resources Information Center
Guerin, Bernard; Miyazaki, Yoshihiko
2006-01-01
A conversational approach is developed to explain the ubiquitous presence of rumors, urban legends, and gossip as arising from their conversational properties rather than from side effects of cognitive processing or "effort after meaning." It is suggested that the primary function of telling rumors, gossip, and urban legends is not to impart…
The effects of collaboration on recall of social information.
Reysen, Matthew B; Talbert, Natalie G; Dominko, Mura; Jones, Amie N; Kelley, Matthew R
2011-08-01
Three experiments examined the effects of passage type on both individual and collaborative memory performance. In Experiment 1, both individuals and collaborative groups recalled more information from passages containing social information than non-social information. Furthermore, collaborative inhibition (CI) was observed for both types of passages. In Experiment 2, which included a social passage that did not contain gossip, significant main effects of both gossip (gossip > non-gossip) and sociability (explicit > implicit) were observed. As in Experiment 1, CI was observed across all conditions. Experiment 3 separately manipulated gossip and the interest level of the passages and both of these factors enhanced memory performance. Moreover, robust CI was again observed across all conditions. Taken together, the present results demonstrate a mnemonic benefit for social information in individuals and collaborative groups. ©2011 The British Psychological Society.
Stories and Gossip in English: The Macro-Structure of Casual Talk.
ERIC Educational Resources Information Center
Slade, Diana
1997-01-01
A discussion of two text-types commonly occurring in casual conversation, stories and gossip, (1) details four kinds of stories told in casual talk, (2) demonstrates that gossip is a culturally-determined process with a distinctive structure, and (3) considers implications for teaching English-as-a- Second-Language. Analysis is based on over three…
NASA Astrophysics Data System (ADS)
Frey, Davide; Guerraoui, Rachid; Kermarrec, Anne-Marie; Koldehofe, Boris; Mogensen, Martin; Monod, Maxime; Quéma, Vivien
Gossip-based information dissemination protocols are considered easy to deploy, scalable and resilient to network dynamics. Load-balancing is inherent in these protocols as the dissemination work is evenly spread among all nodes. Yet, large-scale distributed systems are usually heterogeneous with respect to network capabilities such as bandwidth. In practice, a blind load-balancing strategy might significantly hamper the performance of the gossip dissemination.
Gossip as an effective and low-cost form of punishment.
Feinberg, Matthew; Cheng, Joey T; Willer, Robb
2012-02-01
The spreading of reputational information about group members through gossip represents a widespread, efficient, and low-cost form of punishment. Research shows that negative arousal states motivate individuals to gossip about the transgressions of group members. By sharing information in this way groups are better able to promote cooperation and maintain social control and order.
How to Combat a Campus-Gossip Web Site (And Why You Shouldn't)
ERIC Educational Resources Information Center
Young, Jeffrey R.
2008-01-01
The author discusses the gossip Web site Juicy Campus. Although many students express concern that potential employers who see the site may decline to hire individuals after reading gossip-filled allegations, or that their social lives are in tatters over the mean-spirited, anonymous messages posted about them, because the site has no affiliation…
Rumors and gossip in radiology.
Dowd, S B; Davidhizar, R
1997-01-01
Rumors and gossip have long been popular topics in literature. Social scientists have even studied the topic and defined four main types of rumor: wish rumors; fear or bogey rumors; wedge-driving or aggressive rumors; and anticipatory rumors. In general, people believe rumor and gossip are synonymous. Rumormongering, the spreading of rumors, occurs among all cultures and types of people. Both men and women gossip, and women's gossip is not more vindictive than men's, as is often thought. With such new means of communication as the Internet, transmitting rumor is possible beyond the traditional oral and written forms. Rumor is spread in both the higher and lower levels of an organization. Typically, disproving a rumor is more difficult than proving one. The financial impact of a rumor must also be considered. If people believe, for example, that a radiology department does not have its act together or offers poor customer service, the department may lose revenue because people have lost confidence in it. Originally, the word gossip had positive implications. It referred to a family friend or the woman who delivered a child and announced the event to the community. Because well-intentioned gossip often turns into a damaging story, various approaches for stopping rumors have been identified. They include analyzing the grapevine, identifying the habitual spreaders of rumor and keeping employees informed. In most cases, a person of authority who provides facts can stop or at least slow down rumors spreading at the employee level.
Do online gossipers promote brands?
Okazaki, Shintaro; Rubio, Natalia; Campo, Sara
2013-02-01
Online gossip has been recognized as small talk on social networking sites (SNSs) that influences consumer behavior, but little attention has been paid to its role. This study makes three theoretical predictions: (a) propensity to gossip online leads to greater information value, entertainment value, and friendship value; (b) upon exposure to a high-involvement product, online gossipers are more willing to spread such information through electronic word-of-mouth (eWOM) in search of prestige or fame as a knowledge expert; and (c) this tendency will be more pronounced when they are connected with strong ties (rather than weak ties) and belong to a large network (rather than a small network). An experimental survey was conducted with a scenario method. In total, 818 general consumers participated in the survey. A multivariate analysis of variance (MANOVA) provides empirical support for prediction (a). With regard to predictions (b) and (c), a series of three-way and two-way between-subjects ANOVAs were performed. When a high-involvement product is promoted, gossipers, rather than nongossipers, are more willing to participate in eWOM on an SNS. Furthermore, a significant interaction effect indicates that online gossipers' willingness to participate in eWOM would be more pronounced if they belonged to a large network rather than a small network. However, when a low-involvement product is promoted, no interaction effect is found between online gossip and network size. In closing, theoretical and managerial implications are discussed, while important limitations are recognized.
Data dissemination using gossiping in wireless sensor networks
NASA Astrophysics Data System (ADS)
Medidi, Muralidhar; Ding, Jin; Medidi, Sirisha
2005-06-01
Disseminating data among sensors is a fundamental operation in energy-constrained wireless sensor networks. We present a gossip-based adaptive protocol for data dissemination that improves the energy efficiency of this operation. To overcome the data-implosion problems associated with dissemination, our protocol uses meta-data to name the data with high-level descriptors, and uses negotiation to eliminate redundant transmissions of duplicate data in the network. Further, we adapt gossiping to exploit data-aggregation opportunities in sensor networks. We simulated our data dissemination protocol and compared it to the SPIN protocol. We find that our protocol improves energy consumption by about 20% over comparable protocols, while significantly improving on the dissemination rate of plain gossiping.
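The meta-data negotiation the abstract describes resembles the SPIN family's three-way advertise/request/data handshake, combined here with a random gossip partner choice. The following is a minimal illustrative sketch, not the paper's protocol; all class and method names (`Node`, `advertise`, `disseminate`) are our own:

```python
import random

class Node:
    def __init__(self, name, neighbors=None):
        self.name = name
        self.neighbors = neighbors or []   # wired up after construction
        self.store = {}                    # meta-data descriptor -> data

    def advertise(self, meta):
        """ADV: offer a data descriptor to one random neighbor (gossip step)."""
        if self.neighbors:
            random.choice(self.neighbors).request(meta, self)

    def request(self, meta, sender):
        """REQ: only ask for data we have not seen, suppressing duplicates."""
        if meta not in self.store:
            sender.send(meta, self)

    def send(self, meta, peer):
        """DATA: transfer the payload and let the receiver gossip it onward."""
        peer.store[meta] = self.store[meta]
        peer.advertise(meta)

def disseminate(nodes, source, meta, data, rounds=200):
    """Seed the source, run gossip rounds, return how many nodes got the data."""
    source.store[meta] = data
    for _ in range(rounds):
        source.advertise(meta)
    return sum(1 for n in nodes if meta in n.store)
```

Because a node only requests descriptors it has not stored, duplicate payload transmissions (the "data implosion" problem) are avoided even though advertisements keep circulating.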
Memoryless cooperative graph search based on the simulated annealing algorithm
NASA Astrophysics Data System (ADS)
Hou, Jian; Yan, Gang-Feng; Fan, Zhen
2011-04-01
We have studied the problem of reaching a globally optimal segment in a graph-like environment with a single autonomous mobile agent or a group of them. First, two efficient simulated-annealing-like algorithms are given for a single agent to solve the problem in a partially known environment and in an unknown environment, respectively. We show that under both proposed control strategies the agent eventually converges to a globally optimal segment with probability 1. Second, we use multi-agent search to simultaneously reduce computational complexity and accelerate convergence, building on the single-agent algorithms. By exploiting graph partitioning, a gossip-consensus-based scheme is presented to update the key parameter, the radius of the graph, ensuring that the agents spend much less time finding a globally optimal segment.
NASA Astrophysics Data System (ADS)
Serbu, Sabina; Rivière, Étienne; Felber, Pascal
The emergence of large-scale distributed applications based on many-to-many communication models, e.g., broadcast and decentralized group communication, has an important impact on the underlying layers, notably the Internet routing infrastructure. To make effective use of network resources, protocols should both limit the stress (amount of messages) on each infrastructure entity, such as routers and links, and balance the load in the network as much as possible. Most protocols use application-level metrics such as delays to improve the efficiency of content dissemination or routing, but the extent to which such application-centric optimizations help reduce and balance the load imposed on the infrastructure is unclear. In this paper, we elaborate on the design of such network-friendly protocols and associated metrics. More specifically, we investigate random-based gossip dissemination. We propose and evaluate different ways of making this representative protocol network-friendly while keeping its desirable properties (robustness and low delays). Simulations of the proposed methods using synthetic and real network topologies demonstrate and compare their ability to reduce and balance the load while maintaining good performance.
The time course of indirect moral judgment in gossip processing modulated by different agents.
Peng, Xiaozhe; Jiao, Can; Cui, Fang; Chen, Qingfei; Li, Peng; Li, Hong
2017-10-01
Previous studies have investigated personal moral violations with different references (i.e., the protagonists in moral scenarios are the participants themselves or unknown other individuals). However, the roles of various agents in moral judgments have remained unclear. In the present study, ERPs were used to investigate moral judgments when the participants viewed gossip that described (im)moral behaviors committed by different agents (self, friend, celebrity). The results demonstrate that the P2 and late positive component (LPC) correspond to two successive processes of indirect moral judgment when individuals process gossip. Specifically, the P2 amplitude in the celebrity condition was more sensitive in distinguishing immoral behaviors from moral behaviors than that in the other two conditions, whereas the moral valence effect on the LPC was predominately driven by the self-reference. These findings expand our current understanding of moral judgments in a gossip evaluation task and demonstrate that the early processing of gossip depends on both the entertainment value of the agent and the salience of moral behaviors. Processing in the later stage reflects reactions to intensified affective stimuli, or reflects cognitive effort that was required to resolve the conflict between negative gossip about self and the self-positivity bias. © 2017 Society for Psychophysiological Research.
On Equivalence between Critical Probabilities of Dynamic Gossip Protocol and Static Site Percolation
NASA Astrophysics Data System (ADS)
Ishikawa, Tetsuya; Hayakawa, Tomohisa
The relationship between the critical probability of gossip protocol on the square lattice and the critical probability of site percolation on the square lattice is discussed. Specifically, these two critical probabilities are analytically shown to be equal to each other. Furthermore, we present a way of evaluating the critical probability of site percolation by approximating the saturation of gossip protocol. Finally, we provide numerical results which support the theoretical analysis.
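The site-percolation side of the equivalence above can be probed numerically: sample each site of an n-by-n square lattice as open with probability p and test whether an open cluster crosses from top to bottom. Crossing becomes likely only once p exceeds the known threshold of roughly 0.5927. This is a generic Monte Carlo sketch, not the paper's evaluation method, and the function names are ours:

```python
import random
from collections import deque

def spans(n, p, rng):
    """One site-percolation sample: does an open cluster cross top to bottom?"""
    open_site = [[rng.random() < p for _ in range(n)] for _ in range(n)]
    seen = set((0, c) for c in range(n) if open_site[0][c])
    queue = deque(seen)                     # BFS from open sites in the top row
    while queue:
        r, c = queue.popleft()
        if r == n - 1:                      # reached the bottom row
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and open_site[nr][nc] \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

def spanning_frequency(n, p, trials, seed=0):
    """Fraction of independent samples that contain a spanning cluster."""
    rng = random.Random(seed)
    return sum(spans(n, p, rng) for _ in range(trials)) / trials
```

Plotting `spanning_frequency` against p shows the sharp transition near the critical probability that the analysis relates to the saturation of the gossip protocol.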
FOG: Fighting the Achilles' Heel of Gossip Protocols with Fountain Codes
NASA Astrophysics Data System (ADS)
Champel, Mary-Luc; Kermarrec, Anne-Marie; Le Scouarnec, Nicolas
Gossip protocols are well known to provide reliable and robust dissemination in highly dynamic systems. Yet, they suffer from high redundancy in the last phase of dissemination. In this paper, we combine fountain codes (rateless erasure-correcting codes) with gossip protocols for robust and fast content dissemination in large-scale dynamic systems. The use of fountain codes eliminates the unnecessary redundancy of gossip protocols. We propose the design of FOG, which fully exploits the first, exponential-growth phase of gossip protocols (where the data is disseminated exponentially fast) while avoiding the need for the shrinking phase by using fountain codes. FOG voluntarily increases the number of disseminations but limits them to the exponential growth phase. In addition, FOG creates a split-graph overlay that divides peers into encoders and forwarders. Forwarder peers become encoders as soon as they have received the whole content. To benefit further and more quickly from encoders, FOG biases the dissemination towards the most advanced peers to make them complete earlier.
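The two phases FOG distinguishes, fast exponential growth followed by a slow shrinking phase, can be seen in a toy synchronous push-gossip simulation (illustrative only, not FOG itself; the function name is ours). Early on the informed set roughly doubles each round; late rounds waste most pushes on already-informed nodes:

```python
import random

def push_gossip_rounds(n, seed=0):
    """Synchronous push gossip: every informed node pushes the message to one
    uniformly random node per round.  Returns the informed count after each
    round, starting from a single informed node."""
    rng = random.Random(seed)
    informed = {0}
    history = [len(informed)]
    while len(informed) < n:
        # Each informed node picks one random target; duplicates and pushes
        # to already-informed nodes are the redundancy FOG aims to avoid.
        pushes = [rng.randrange(n) for _ in range(len(informed))]
        informed.update(pushes)
        history.append(len(informed))
    return history
```

Since each round adds at most one new node per informed node, the count can at most double per round; the long tail of rounds needed to reach the last few nodes is the shrinking phase that fountain coding sidesteps.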
De Backer, Charlotte J S; Nelissen, Mark; Vyncke, Patrick; Braeckman, Johan; McAndrew, Francis T
2007-12-01
In this paper we present two compatible hypotheses to explain interest in celebrity gossip. The Learning Hypothesis explains interest in celebrity gossip as a by-product of an evolved mechanism useful for acquiring fitness-relevant survival information. The Parasocial Hypothesis sees celebrity gossip as a diversion of this mechanism, which leads individuals to misperceive celebrities as people who are part of their social network. Using two preliminary studies, we tested our predictions. In a survey with 838 respondents and in-depth interviews with 103 individuals, we investigated how interest in celebrity gossip was related to several dimensions of the participants' social lives. In support of the Learning Hypothesis, age proved to be a strong predictor of interest in celebrities. In partial support of the Parasocial Hypothesis, media exposure, but not social isolation, was a strong predictor of interest in celebrities. The preliminary results support both theories, indicate that across our life span celebrities move from being teachers to being friends, and open up a list of future research opportunities.
Distributed consensus for metamorphic systems using a gossip algorithm for CAT(0) metric spaces
NASA Astrophysics Data System (ADS)
Bellachehab, Anass; Jakubowicz, Jérémie
2015-01-01
We present an application of distributed consensus algorithms to metamorphic systems. A metamorphic system is a set of identical units that can self-assemble to form a rigid structure. For instance, one can think of a robotic arm composed of multiple links connected by joints. The system can change its shape in order to adapt to different environments via reconfiguration of its constituting units. We assume in this work that several metamorphic systems form a network: two systems are connected whenever they are able to communicate with each other. The aim of this paper is to propose a distributed algorithm that synchronizes all the systems in the network. Synchronizing means that all the systems should end up having the same configuration. This aim is achieved in two steps: (i) we cast the problem as a consensus problem on a metric space and (ii) we use a recent distributed consensus algorithm that only makes use of metric notions.
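In the simplest CAT(0) space, a Euclidean space, the pairwise gossip step reduces to both chosen systems jumping to the geodesic midpoint of their configurations; the pairwise sum (and hence the mean) is preserved while the configurations contract toward consensus. A minimal sketch under that Euclidean assumption (general CAT(0) spaces would need their own midpoint map; all names are ours):

```python
import random

def midpoint(x, y):
    """Euclidean geodesic midpoint, standing in for a general CAT(0) midpoint."""
    return tuple((a + b) / 2 for a, b in zip(x, y))

def gossip_consensus(points, steps, seed=0):
    """Pairwise midpoint gossip: at each step a random pair of systems
    both move to the midpoint of their current configurations."""
    rng = random.Random(seed)
    pts = list(points)
    for _ in range(steps):
        i, j = rng.sample(range(len(pts)), 2)
        m = midpoint(pts[i], pts[j])
        pts[i] = pts[j] = m            # sum pts[i] + pts[j] is unchanged
    return pts

def spread(pts):
    """Largest coordinate-wise disagreement across all pairs."""
    return max(abs(a - b) for p in pts for q in pts for a, b in zip(p, q))
```

Each step strictly shrinks the disagreement between the chosen pair, so with connected (here: all-to-all) communication the configurations converge to a common point.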
Percival, Jennifer
2002-12-11
Mike, a charge nurse on a surgical day unit, told me about Janice, his ward clerk who was always gossiping and causing trouble between people in his team. I asked what he had tried to do about it. He said that when he noticed her gossiping he had asked her for something to try to distract her. Although this put an end to the immediate problem, Mike wanted a long-term solution, especially as a new team member had asked for advice because she felt uncomfortable hearing 'stories' about her colleagues from Janice.
Social curiosity and gossip: related but different drives of social functioning.
Hartung, Freda-Marie; Renner, Britta
2013-01-01
The present online-questionnaire study examined two fundamental social behaviors, social curiosity and gossip, and their interrelations in an English (n = 218) and a German sample (n = 152). Analyses showed that both samples believed that they are less gossipy but more curious than their peers. Multidimensional SEM of self and trait conceptions indicated that social curiosity and gossip are related constructs but with different patterns of social functions. Gossip appears to serve predominantly entertainment purposes whereas social curiosity appears to be more driven by a general interest in gathering information about how other people feel, think, and behave and the need to belong. Relationships to other personality traits (N, E, O) provided additional evidence for divergent validity. The needs for gathering and disseminating social information might represent two interlinked but different drives of cultural learning.
Developing Appreciation for Sarcasm and Sarcastic Gossip: It Depends on Perspective.
Glenwright, Melanie; Tapley, Brent; Rano, Jacqueline K S; Pexman, Penny M
2017-11-09
Speakers use sarcasm to criticize others and to be funny; the indirectness of sarcasm protects the addressee's face (Brown & Levinson, 1987). Thus, appreciation of sarcasm depends on the ability to consider perspectives. We investigated development of this ability from late childhood into adulthood and examined effects of interpretive perspective and parties present. We presented 9- to 10-year-olds, 13- to 14-year-olds, and adults with sarcastic and literal remarks in three parties-present conditions: private evaluation, public evaluation, and gossip. Participants interpreted the speaker's attitude and humor from the addressee's perspective and, when appropriate, from the bystander's perspective. Children showed no influence of interpretive perspective or parties present on appreciation of the speaker's attitude or humor. Adolescents and adults, however, shifted their interpretations, judging that addressees have less favorable views of criticisms than bystanders. Further, adolescents and adults differed in their perceptions of the social functions of gossip, with adolescents showing more positive attitudes than adults toward sarcastic gossip. We suggest that adults' disapproval of sarcastic gossip shows a deeper understanding of the utility of sarcasm's face-saving function. Thus, the ability to modulate appreciation of sarcasm according to interpretive perspective and parties present continues to develop in adolescence and into adulthood.
The virtues of gossip: reputational information sharing as prosocial behavior.
Feinberg, Matthew; Willer, Robb; Stellar, Jennifer; Keltner, Dacher
2012-05-01
Reputation systems promote cooperation and deter antisocial behavior in groups. Little is known, however, about how and why people share reputational information. Here, we seek to establish the existence and dynamics of prosocial gossip, the sharing of negative evaluative information about a target in a way that protects others from antisocial or exploitative behavior. We present a model of prosocial gossip and the results of 4 studies testing the model's claims. Results of Studies 1 through 3 demonstrate that (a) individuals who observe an antisocial act experience negative affect and are compelled to share information about the antisocial actor with a potentially vulnerable person, (b) sharing such information reduces negative affect created by observing the antisocial behavior, and (c) individuals possessing more prosocial orientations are the most motivated to engage in such gossip, even at a personal cost, and exhibit the greatest reduction in negative affect as a result. Study 4 demonstrates that prosocial gossip can effectively deter selfishness and promote cooperation. Taken together these results highlight the roles of prosocial motivations and negative affective reactions to injustice in maintaining reputational information sharing in groups. We conclude by discussing implications for reputational theories of the maintenance of cooperation in human groups.
The hot body issue: Weight and caption tone in celebrity gossip magazines.
McDonnell, Andrea; Lin, Linda
2016-09-01
While representations of bodies and weight have been studied in regards to fashion and fitness magazines, little research exists that examines such representations in celebrity gossip magazines. Using data collected through content analysis of 262 photo-caption units published in June 2015 issues of American celebrity gossip magazines, this study examines representations of bodies within the genre and the relationship between the gender, race, and body size of pictured celebrities and the tone of accompanying captions. Results indicate that celebrity gossip magazines critique the bodies of both female and male subjects, but that women are more likely to be the subject of negative comments than men. Underweight women and overweight men are especially targeted for criticism. Latinos are praised more often than other racial groups. The implications of these representations are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Fauji, Shantanu
We consider the problem of energy-efficient and fault-tolerant in-network aggregation for wireless sensor networks (WSNs). In-network aggregation is the process of aggregating data while collecting it from the sensors to the base station. This process should be energy efficient, due to the limited energy at the sensors, and tolerant to the high failure rates common in sensor networks. Tree-based in-network aggregation protocols, although energy efficient, are not robust to network failures. Multipath routing protocols are robust to failures to a certain degree but are not energy efficient, due to the overhead of maintaining multiple paths. We propose a new protocol for in-network aggregation in WSNs that is energy efficient, achieves high lifetime, and is robust to changes in the network topology. Our protocol, the gossip-based protocol for in-network aggregation (GPIA), is based on the spreading of information via gossip. GPIA is not only adaptive to failures and changes in the network topology, but also energy efficient. The energy efficiency of GPIA comes from all nodes being capable of selective message reception and from detecting convergence of the aggregation early. We experimentally show that GPIA provides significant improvement over competitors such as Ridesharing, Synopsis Diffusion, and pure gossip: it shows tenfold, fivefold, and twofold improvements in network lifetime over pure gossip, Synopsis Diffusion, and Ridesharing, respectively. Further, GPIA retains gossip's robustness to failures and improves upon the accuracy of Synopsis Diffusion and Ridesharing.
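Gossip-based aggregation of the kind GPIA builds on can be illustrated with the classic push-sum protocol for distributed averaging: each node halves its (sum, weight) pair each round and sends one half to a random peer, and every node's ratio sum/weight converges to the global average. This is a textbook sketch, not GPIA itself, and the names are ours:

```python
import random

def push_sum_average(values, rounds, seed=0):
    """Push-sum gossip averaging: node i keeps a running (s[i], w[i]) pair.
    Total sum and total weight are conserved, so s[i]/w[i] -> global average."""
    rng = random.Random(seed)
    n = len(values)
    s = [float(v) for v in values]   # running sums, seeded with local values
    w = [1.0] * n                    # running weights
    for _ in range(rounds):
        inbox = [(0.0, 0.0)] * n
        for i in range(n):
            j = rng.randrange(n)                 # random gossip target
            half_s, half_w = s[i] / 2, w[i] / 2
            s[i], w[i] = half_s, half_w          # keep one half locally
            acc_s, acc_w = inbox[j]
            inbox[j] = (acc_s + half_s, acc_w + half_w)
        for i in range(n):                       # deliver this round's messages
            s[i] += inbox[i][0]
            w[i] += inbox[i][1]
    return [s[i] / w[i] for i in range(n)]
```

Convergence detection, as in GPIA, can then be as simple as each node watching its estimate stop changing between rounds.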
GOSSIP: a method for fast and accurate global alignment of protein structures.
Kifer, I; Nussinov, R; Wolfson, H J
2011-04-01
The database of known protein structures (PDB) is increasing rapidly, resulting in a growing need for methods that can cope with the vast amount of structural data. To analyze the accumulating data, it is important to have a fast tool for identifying similar structures and clustering them by structural resemblance. Several excellent tools have been developed for the comparison of protein structures. These usually address the task of local structure alignment, an important yet computationally intensive problem, which makes it difficult to use such tools to compare a large number of structures to each other in a reasonable time. Here we present GOSSIP, a novel method for global all-against-all alignment of any set of protein structures. The method detects similarities between structures down to a certain cutoff (a parameter of the program), allowing it to detect similar structures at much higher speed than local structure alignment methods. GOSSIP compares many structures in times that are several orders of magnitude faster than well-known available structure alignment servers, and it is also faster than a database scanning method. We evaluate GOSSIP both on a dataset of short structural fragments and on two large sequence-diverse structural benchmarks. Our conclusions are that for a threshold of 0.6 and above, the speed of GOSSIP is obtained with no compromise in the accuracy of the alignments or in the number of detected global similarities. A server, as well as an executable for download, are available at http://bioinfo3d.cs.tau.ac.il/gossip/.
Marivaux and La Commere (The Gossip)
ERIC Educational Resources Information Center
Trapnall, William H., Jr.
1970-01-01
Compares two Marivaux works: the 1735 novel "Le Paysan parvenu" ("The Upstart Peasant") and the one-act comedy "La Commere" ("The Gossip"), not published until 1966. By analyzing the character portrayals, questions whether the discovered manuscript was penned by Marivaux. (DS)
Three Degrees of Inclusion: the Gossip-Effect in Human Networks
NASA Astrophysics Data System (ADS)
Szekfű, Balázs; Szvetelszky, Zsuzsanna
2005-06-01
Using a scientific definition of gossip, an ancient and ubiquitous phenomenon of social networks, we present a preliminary study of how to measure networks based on the dissemination of connections and information. We attempt to quantify the gossip-effects in networks with our hypothesis of "three degrees of inclusion". Our preliminary study brings to light a very important property of social networks. Observing the communication of closely knit social groups, we concluded that human networks are based on no more than three degrees of links. Taking strong human ties into account, our research indicates that whoever is at the fourth degree rarely counts as an in-group member. Our close friend's close friend's close friend is about the farthest, three steps, that our network can reach when it comes to telling a story or asking for a favor. Up to now, no investigations have examined whether the effects of gossip lead to a phase transition in the content of the network's self-organizing communication. Our conclusion is that the gossip-effect must be considered a prefactor of the diffusion and dynamics of news and opinions at the social level.
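The "three degrees of inclusion" boundary corresponds to the set of nodes within graph distance three of a person. A breadth-first sketch of that in-group computation (purely illustrative; the function name is ours):

```python
from collections import deque

def within_three_degrees(graph, start, max_hops=3):
    """Return the nodes reachable from `start` in 1..max_hops hops:
    the claimed in-group boundary of the gossip-effect."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if dist[node] == max_hops:     # do not expand past the boundary
            continue
        for peer in graph.get(node, []):
            if peer not in dist:
                dist[peer] = dist[node] + 1
                queue.append(peer)
    return {n for n, d in dist.items() if 0 < d <= max_hops}
```

On a chain a-b-c-d-e, the in-group of a is {b, c, d}: e sits at the fourth degree and falls outside the boundary.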
Gossip as a Communication Construct.
ERIC Educational Resources Information Center
Ting-Toomey, Stella
Although an important communication process, gossip has rarely been seriously studied. Distinct from rumor and self-disclosure, it can be defined as the communication process whereby information about another person's affairs or activities is disclosed and circulated in an exclusive manner in dyads. Some "functionalists" assert that…
NASA Astrophysics Data System (ADS)
Leitão, João; Pereira, José; Rodrigues, Luís
Gossip, or epidemic, protocols have emerged as a powerful strategy to implement highly scalable and resilient reliable broadcast primitives on large scale peer-to-peer networks. Epidemic protocols are scalable because they distribute the load among all nodes in the system and resilient because they have an intrinsic level of redundancy that masks node and network failures. This chapter provides an introduction to gossip-based broadcast on large-scale unstructured peer-to-peer overlay networks: it surveys the main results in the field, discusses techniques to build and maintain the overlays that support efficient dissemination strategies, and provides an in-depth discussion and experimental evaluation of two concrete protocols, named HyParView and Plumtree.
Cheating and Anti-Cheating in Gossip-Based Protocol: An Experimental Investigation
NASA Astrophysics Data System (ADS)
Xiao, Xin; Shi, Yuanchun; Tang, Yun; Zhang, Nan
During recent years, there has been rapid growth in the deployment of gossip-based protocols in many multicast applications. In a typical gossip-based protocol, each node acts in the dual roles of receiver and sender, independently exchanging data with its neighbors to facilitate scalability and resilience. However, most previous work in this literature has seldom considered the cheating behavior of end users, which is important given that mutual cooperation inherently determines overall system performance. In this paper, we investigate dishonest behaviors in decentralized gossip-based protocols through extensive experimental study. Our original contributions are two-fold. In the first part, a study of cheating, we analytically discuss two typical cheating strategies, intentionally increasing subscription requests and untruthfully calculating forwarding probability, and evaluate their negative impacts. The results indicate that more attention should be paid to defending against cheating behaviors in gossip-based protocols. In the second part, a study of anti-cheating, we propose a receiver-driven measurement mechanism, which evaluates individual forwarding traffic from the perspective of receivers and thus identifies cheating nodes with a high incoming/outgoing ratio. Furthermore, we extend our mechanism by introducing a reliability factor to further improve its accuracy. Experiments under various conditions show that it performs quite well in cases of serious cheating and achieves considerable performance in other cases.
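The core of a receiver-driven check like the one described is a ratio test over measured traffic: a node that receives far more than it forwards is a free-riding suspect. A minimal sketch (the threshold value, data shape, and function name are our own assumptions, not from the paper):

```python
def flag_cheaters(traffic, ratio_threshold=3.0):
    """Flag nodes that receive much more than they forward.
    `traffic` maps node -> (messages_received, messages_forwarded)."""
    suspects = set()
    for node, (received, forwarded) in traffic.items():
        ratio = received / max(forwarded, 1)   # guard against division by zero
        if ratio > ratio_threshold:
            suspects.add(node)
    return suspects
```

In practice the counts would come from receivers' own measurements of their neighbors, and a reliability weighting (as in the paper's extension) would discount noisy or short observation windows.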
THE NURSES' FORM OF ORGANIZATIONAL COMMUNICATION: WHAT IS THE ROLE OF GOSSIP?
Altuntaş, Serap; Altun, Ozlem Şahin; Akyil, Rahşan Çevik
2014-07-19
Abstract Background: It is important for managers to control gossip and to use it to create positive effects that help organizations attain their goals. Objectives/Aim: The study utilised a descriptive model to determine how nurses use gossip as an informal communication channel in organizational communication. Method: Nurses working in four hospitals within a city in the eastern part of Turkey form the population of the study, and nurses who agreed to participate form the sample. Among these hospitals, two serve under the Ministry of Health and two under a university; diagnosis, treatment and rehabilitation services in any field are provided in each. The researchers developed a questionnaire for data collection after examining the literature. The approval of the ethical committees and written official permissions were obtained for the study. Data were acquired from 264 out of 420 nurses in total, collected between June and September 2011; the response rate to the data collection tool was 62.8%. Data were analyzed by frequency and percentage distribution tests with SPSS for Windows 17.0. Results: This study determined that nurses use gossip most frequently about working conditions, to share information face-to-face, when they feel angry. Conclusion: The study concluded that nurses use gossip as an informal communication style in their institutions.
Gossip Revisited: A Game for Concept Review.
ERIC Educational Resources Information Center
Edwards, Barbara
1989-01-01
Describes a class activity based on the game of "Gossip" in which a group of students paraphrases a major concept in an instructional unit, then passes only the paraphrase to the next group. Notes that this activity encourages critical thinking and helps review and summarize key lesson concepts. (RS)
Information Withholding as a Manipulative and Collusive Strategy in Nukulaelae Gossip.
ERIC Educational Resources Information Center
Besnier, Niko
1989-01-01
Examines the organization and function of information-withholding sequences, a conversational strategy used by participants in gossip interaction on Nukulaelae, a Polynesian Central Pacific atoll. Withholding sequences illustrate how ambiguity and repairs can be exploited to meet the communicative demands of particular interactional contexts. (62…
Transforming the Whole Class into Gossiping Groups.
ERIC Educational Resources Information Center
Baw, San Shwe
2002-01-01
Explores a way to use gossip in the language classroom to provide language fluency practice. Shows how certain interpersonal exchanges can be encouraged by exploiting the natural proclivity for talking about people. Activities stress social aspects of learning and are intended to provide learners with opportunities to talk and listen. (Author/VWL)
Organizational Implications of Gossip and Rumor
ERIC Educational Resources Information Center
Houmanfar, Ramona; Johnson, Rebecca
2004-01-01
The challenge in designing organizational interventions lies in making explicit and available what is usually implicit. Accordingly, a contribution to the understanding of complex and implicit practices such as gossip and rumor, the conditions responsible for their origin, as well as the relation they sustain to the outcome of group survival,…
GoDisco: Selective Gossip Based Dissemination of Information in Social Community Based Overlays
NASA Astrophysics Data System (ADS)
Datta, Anwitaman; Sharma, Rajesh
We propose and investigate a gossip-based decentralized mechanism (GoDisco), inspired by social principles and behavior, to disseminate information in online social community networks, using exclusively social links and exploiting semantic context to keep the dissemination process selective to relevant nodes. Such a dissemination scheme, gossiping over an egocentric social network, is unique and arguably timely: it emulates word-of-mouth behavior and can have interesting applications such as probabilistic publish/subscribe, decentralized recommendation, and contextual advertisement systems, to name a few. Simulation-based experiments show that despite using only local knowledge and contacts, the system has good global coverage and behavior.
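The selective-forwarding idea can be sketched as gossip that only crosses a social link when the neighbor's interest profile overlaps the message's semantic tags. This is loosely inspired by GoDisco but is not its actual algorithm; the data shapes and names are our own assumptions:

```python
import random

def selective_gossip(graph, interests, source, tags, fanout=2, seed=0):
    """Spread a tagged message over social links, forwarding only to
    neighbors whose interest set overlaps the message tags.
    `graph` maps node -> list of social neighbors; `interests` maps
    node -> set of topic tags.  Returns the set of nodes reached."""
    rng = random.Random(seed)
    delivered = {source}
    frontier = [source]
    while frontier:
        nxt = []
        for node in frontier:
            # Semantic filter: only undelivered neighbors with matching interests.
            relevant = [p for p in graph[node]
                        if p not in delivered and interests[p] & tags]
            for peer in rng.sample(relevant, min(fanout, len(relevant))):
                delivered.add(peer)
                nxt.append(peer)
        frontier = nxt
    return delivered
```

The semantic filter keeps dissemination local to the relevant community, which is why coverage is measured against the set of interested nodes rather than the whole network.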
Mathematical Gossip: Relevance and Context in the Mathematics Classroom
ERIC Educational Resources Information Center
Callingham, Rosemary
2004-01-01
Using mathematical gossip in the classroom allows teachers to expand their students' horizons, and provide pathways to improvement of understanding. The expansion of a simple idea into another mathematical context can enrich a student's learning. In particular it may help to bridge the gap between purely procedural approaches and a conceptual…
ERIC Educational Resources Information Center
Menzer, Melissa M.; McDonald, Kristina L.; Rubin, Kenneth H.; Rose-Krasnor, Linda; Booth-LaForce, Cathryn; Schulz, Annie
2012-01-01
We evaluated whether gossip between best friends moderated the relation between anxious withdrawal and friendship quality in early adolescence, using an Actor-Partner Interdependence Model ("APIM," Kenny, Kashy, & Cook, 2006) approach. Participants (n = 256) were 5th and 6th grade young adolescents (actors) and their best friends…
Ranking the Top 10 Education Institutions Is Nothing More Than Idle Gossip.
ERIC Educational Resources Information Center
Witt, Peter A.
1982-01-01
Ranking the "top 10 universities" which train recreation professionals does little to enhance the goals and standards of the field of parks and recreation. While evaluation of faculty and educational institutions is considered important, it should be conducted in a manner which tends to be more professional than mere gossip. (JN)
Managing rumor and gossip in operating room settings.
Blakeley, J A; Ribeiro, V; Hughes, A
1996-07-01
The unique features of the operating room (OR) make it an ideal setting for the proliferation of gossip and rumor. Although not always negative, these "grapevine" communications can reduce productivity and work satisfaction. Hence, OR managers need to understand these forms of communication and prevent or control their negative consequences. The authors offer suggestions for undertaking this challenge.
Ex-King of Campus Gossip Turns to Saving Web Reputations
ERIC Educational Resources Information Center
Rice, Alexandra
2012-01-01
Matt Ivester became notorious on campuses across the country in 2007 for publishing gossip--not about celebrities but about students--on JuicyCampus, the Web site he created. The site was blocked by some colleges, banned by several student governments, and threatened with legal action by several students who claimed that defaming comments on the…
CLON: Overlay Networks and Gossip Protocols for Cloud Environments
NASA Astrophysics Data System (ADS)
Matos, Miguel; Sousa, António; Pereira, José; Oliveira, Rui; Deliot, Eric; Murray, Paul
Although epidemic or gossip-based multicast is a robust and scalable approach to reliable data dissemination, its inherent redundancy results in high resource consumption on both links and nodes. This problem is aggravated in settings with costlier or resource-constrained links, as happens in Cloud Computing infrastructures composed of several interconnected data centers across the globe.
Crossing boundaries: women's gossip, insults and violence in sixteenth-century France.
Lipscomb, Suzannah
2011-01-01
Using evidence from cases recorded in the registers of the consistories of southern France, the author investigates the way in which Languedocian women policed each other's behaviour, enforcing a collective morality through gossip, sexual insult and physical confrontation. In contrast to case studies by other historians, it is argued here that gossip does appear to have been a peculiarly female activity, but far more than simply being an outlet for malice or prurience, it gave women a distinctive social role in the town. No less evident is the involvement of women in physical violence both against each other and against men, violence which, though less extreme than its male counterpart, nonetheless occupies a significant role in the proceedings of the consistories.
Energy Efficient Probabilistic Broadcasting for Mobile Ad-Hoc Network
NASA Astrophysics Data System (ADS)
Kumar, Sumit; Mehfuz, Shabana
2017-06-01
In mobile ad-hoc networks (MANETs), flooding is used to broadcast route request (RREQ) packets from node to node for route discovery. This is the simplest broadcasting method, but it often results in the broadcast storm problem, causing packet collisions and congestion in the network. Probabilistic broadcasting is one of the most widely used broadcasting schemes for route discovery in MANETs and offers a solution to the broadcast storm problem, but it does not account for the limited battery energy of the nodes. In this paper, a new energy-efficient probabilistic broadcasting (EEPB) scheme is proposed in which the probability of broadcasting RREQs is calculated with respect to the remaining energy of nodes. Simulation results clearly indicate that an EEPB route discovery scheme in ad-hoc on-demand distance vector (AODV) routing can increase network lifetime while decreasing average power consumption and RREQ packet overhead. It also decreases the number of dropped packets in the network compared with other energy-aware schemes such as energy constraint gossip (ECG), energy aware gossip (EAG), energy based gossip (EBG) and network lifetime through energy efficient broadcast gossip (NEBG).
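The abstract does not give the exact EEPB formula, but the core idea of weighting the rebroadcast decision by a node's remaining energy can be sketched as follows. The function names, the linear scaling and the base probability of 0.65 are illustrative assumptions, not the paper's actual parameters:

```python
import random

def rebroadcast_probability(remaining_energy, initial_energy,
                            base_probability=0.65):
    """Scale a base rebroadcast probability by the node's remaining
    energy fraction (hypothetical linear form)."""
    energy_fraction = remaining_energy / initial_energy
    return base_probability * energy_fraction

def should_rebroadcast(remaining_energy, initial_energy,
                       rng=random.random):
    """Decide whether to forward an RREQ packet."""
    return rng() < rebroadcast_probability(remaining_energy, initial_energy)
```

Under this sketch, a node with half its battery left forwards an RREQ only about a third of the time, which is how such schemes trade discovery latency for network lifetime.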
Spreading gossip in social networks.
Lind, Pedro G; da Silva, Luciano R; Andrade, José S; Herrmann, Hans J
2007-09-01
We study a simple model of information propagation in social networks, in which two quantities are introduced: the spread factor, which measures the average maximal reachability of the neighbors of a given node that interchange information among each other, and the spreading time needed for the information to reach such a fraction of nodes. When the information refers to a particular node at which both quantities are measured, the model can be taken as a model for gossip propagation. In this context, we apply the model to real empirical networks of social acquaintances and compare the underlying spreading dynamics with different types of scale-free and small-world networks. We find that the number of friendship connections strongly influences the probability of being gossiped about. Finally, we discuss how the spread factor can be applied to other situations.
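One plausible reading of the spread factor is the fraction of the victim's neighbors that the gossip eventually reaches when it starts at one neighbor (the originator) and travels only along edges between the victim's neighbors. A minimal sketch under that assumption:

```python
from collections import deque

def spread_factor(adj, victim, originator):
    """Fraction of the victim's neighbors reached when gossip starts at
    one neighbor and hops only along edges between the victim's
    neighbors (a simplified reading of the model in the abstract)."""
    neighbors = set(adj[victim])
    if originator not in neighbors:
        raise ValueError("originator must be a neighbor of the victim")
    reached = {originator}
    frontier = deque([originator])
    while frontier:
        node = frontier.popleft()
        for nxt in adj[node]:
            if nxt in neighbors and nxt not in reached:
                reached.add(nxt)
                frontier.append(nxt)
    return len(reached) / len(neighbors)
```

On a fully connected neighborhood the factor is 1; isolated neighbors of the victim are never reached, lowering it.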
Preschoolers affect others' reputations through prosocial gossip.
Engelmann, Jan M; Herrmann, Esther; Tomasello, Michael
2016-09-01
Providing evaluative information to others about absent third parties helps them to identify cooperators and avoid cheaters. Here, we show that 5-year-olds, but not 3-year-olds, reliably engage in such prosocial gossip. In an experimental setting, 5-year-old children spontaneously offered relevant reputational information to guide a peer towards a cooperative partner. Three-year-old children offered such evaluative information only rarely, although they still showed a willingness to inform in a non-evaluative manner. A follow-up study revealed that one component involved in this age difference is children's developing ability to provide justifications. The current results extend previous work on young children's tendency to manage their own reputation by showing that preschoolers also influence others' reputations via gossip. © 2016 The British Psychological Society.
Angular resolution of the gaseous micro-pixel detector Gossip
NASA Astrophysics Data System (ADS)
Bilevych, Y.; Blanco Carballo, V.; van Dijk, M.; Fransen, M.; van der Graaf, H.; Hartjes, F.; Hessey, N.; Koppert, W.; Nauta, S.; Rogers, M.; Romaniouk, A.; Veenhof, R.
2011-06-01
Gossip is a gaseous micro-pixel detector with a very thin drift gap intended for high-rate environments like the pixel layers of ATLAS at the sLHC. The detector outputs not only the crossing point of a traversing MIP, but also the angle of the track, thus greatly simplifying track reconstruction. In this paper we describe a testbeam experiment to examine the angular resolution of the reconstructed track segments in Gossip. We used the low-diffusion gas mixture DME/CO2 50/50. An angular resolution of 20 mrad for perpendicular tracks could be obtained from a 1.5 mm thin drift volume. However, for the prototype detector used in the testbeam experiment, the resolution for slanting tracks was degraded by the poor time resolution of the pixel chip used.
Gossip and ostracism promote cooperation in groups.
Feinberg, Matthew; Willer, Robb; Schultz, Michael
2014-03-01
The widespread existence of cooperation is difficult to explain because individuals face strong incentives to exploit the cooperative tendencies of others. In the research reported here, we examined how the spread of reputational information through gossip promotes cooperation in mixed-motive settings. Results showed that individuals readily communicated reputational information about others, and recipients used this information to selectively interact with cooperative individuals and ostracize those who had behaved selfishly, which enabled group members to contribute to the public good with reduced threat of exploitation. Additionally, ostracized individuals responded to exclusion by subsequently cooperating at levels comparable to those who were not ostracized. These results suggest that the spread of reputational information through gossip can mitigate egoistic behavior by facilitating partner selection, thereby helping to solve the problem of cooperation even in noniterated interactions.
Three degrees of inclusion: the emergence of self-organizing social beliefs
NASA Astrophysics Data System (ADS)
Szvetelszky, Zsuzsanna; Szekfű, Balázs
2005-07-01
Using the scientific definition of gossip, an ancient and ubiquitous phenomenon of social networks, we present our preliminary study and its results on measuring networks based on the dissemination of connections and information. We attempt to calculate gossip effects in networks accurately with our hypothesis of "three degrees of inclusion". Our preliminary study on this subject brings to light a very important property of social networks. Observing the human communication of closely knit social groups, we came to the conclusion that human networks are based on no more than three degrees of links. Taking strong human ties into account, our research indicates that whoever is at the fourth degree rarely counts as an in-group member. Our close friend's close friend's close friend (three steps) is about the farthest our network can reach when it comes to telling a story or asking for a favor. Up to now, no investigations have been performed to see whether the effects of gossip lead to a phase transition in the content of the network's self-organizing communication. Our conclusion is that the gossip effect must be considered a prefactor of the diffusion and dynamics of news and opinions at the social level.
Data-Driven Packet Loss Estimation for Node Healthy Sensing in Decentralized Cluster.
Fan, Hangyu; Wang, Huandong; Li, Yong
2018-01-23
Decentralized clustering is widely adopted in various fields of modern information technology. One of the main reasons is high availability and failure tolerance, which prevent a failure at a single point from bringing down the entire system. Toolkits such as Akka are now commonly used to build this kind of cluster. However, clusters that use Gossip as their membership protocol and rely on link-failure detection cannot handle the scenario in which a node stochastically drops packets and corrupts the membership status of the cluster. In this paper, we formulate the problem as evaluating link quality and finding a maximum clique (an NP-complete problem) in the connectivity graph. We then propose an algorithm consisting of two models, driven by application-layer data, to solve these two problems respectively. Through simulations with statistical data and a real-world product, we demonstrate that our algorithm performs well.
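As a rough illustration of the two sub-problems the abstract names: link quality can be estimated from application-layer delivery counts, and the NP-complete max-clique step can be approximated greedily. Both functions are illustrative sketches, not the paper's actual models:

```python
def link_quality(delivered, sent):
    """Empirical delivery ratio of a link from application-layer counts."""
    return delivered / sent if sent else 0.0

def greedy_max_clique(adj):
    """Greedy heuristic for the (NP-complete) max-clique step: visit
    nodes in decreasing degree order and keep each node that is
    adjacent to every node already in the clique."""
    clique = []
    candidates = sorted(adj, key=lambda n: len(adj[n]), reverse=True)
    for node in candidates:
        if all(node in adj[c] for c in clique):
            clique.append(node)
    return clique
```

The greedy pass gives no optimality guarantee, but on a connectivity graph where healthy nodes form a dense core it typically recovers that core.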
Office gossip: a surprising source of liability.
Gregg, Robert E
2003-01-01
Rumors and gossip are inevitable ingredients of work life. Within limits, they may have some beneficial functions. Still, practitioners and managers must be aware of the dangers inherent in defamation of character and harassment. This article defines workplace comments and activities that should be avoided and the employer's legal liability when situations get out of hand. It also outlines the manager's responsibilities and lists privacy rights that are codified by state and federal laws.
Path planning in GPS-denied environments via collective intelligence of distributed sensor networks
NASA Astrophysics Data System (ADS)
Jha, Devesh K.; Chattopadhyay, Pritthi; Sarkar, Soumik; Ray, Asok
2016-05-01
This paper proposes a framework for reactive goal-directed navigation without global positioning facilities in unknown dynamic environments. A mobile sensor network is used for localising regions of interest for path planning of an autonomous mobile robot. The underlying theory is an extension of a generalised gossip algorithm that has been recently developed in a language-measure-theoretic setting. The algorithm has been used to propagate local decisions of target detection over a mobile sensor network and thus it generates a belief map for the detected target over the network. In this setting, an autonomous mobile robot may communicate only with a few mobile sensing nodes in its own neighbourhood and localise itself relative to the communicating nodes with bounded uncertainties. The robot makes use of the knowledge based on the belief of the mobile sensors to generate a sequence of way-points leading to a possible goal. The estimated way-points are used by a sampling-based motion planning algorithm to generate feasible trajectories for the robot. The proposed concept has been validated by numerical simulation on a mobile sensor network test-bed and a Dubins car-like robot.
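The language-measure-theoretic update is not spelled out in the abstract; a much simpler stand-in, one synchronous round of belief averaging over neighbours with local detections folded in, conveys the general shape of propagating detection decisions over a sensor network. The mixing weight and the max-with-detection rule are assumptions:

```python
def gossip_belief_update(beliefs, adj, weight=0.5, detections=None):
    """One synchronous round of a simple belief-averaging gossip: each
    sensor mixes its belief with its neighbours' mean, then folds in
    its own local detection (a stand-in for the paper's richer update)."""
    detections = detections or {}
    new = {}
    for node, b in beliefs.items():
        neigh = adj[node]
        neigh_mean = sum(beliefs[n] for n in neigh) / len(neigh) if neigh else b
        mixed = (1 - weight) * b + weight * neigh_mean
        if node in detections:
            mixed = max(mixed, detections[node])
        new[node] = mixed
    return new
```

Iterating this round spreads a detection made at one sensor outward, producing the kind of belief map a robot could follow toward the target region.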
On Adding Structure to Unstructured Overlay Networks
NASA Astrophysics Data System (ADS)
Leitão, João; Carvalho, Nuno A.; Pereira, José; Oliveira, Rui; Rodrigues, Luís
Unstructured peer-to-peer overlay networks are very resilient to churn and topology changes, while requiring little maintenance cost. They are therefore a suitable infrastructure for building highly scalable large-scale services in dynamic networks. Typically, the overlay topology is defined by a peer sampling service that aims at maintaining, in each process, a random partial view of peers in the system. The resulting random unstructured topology is suboptimal when a specific performance metric is considered. On the other hand, structured approaches (for instance, a spanning tree) may optimize a given target performance metric but are highly fragile. In fact, the cost of maintaining structures with strong constraints may easily become prohibitive in highly dynamic networks. This chapter discusses techniques that aim at combining the advantages of unstructured and structured networks. Namely, we focus on two distinct approaches, one based on optimizing the overlay and another based on optimizing the gossip mechanism itself.
Exchanging Peers to Establish P2P Networks
NASA Astrophysics Data System (ADS)
Akon, Mursalin; Islam, Mohammad Towhidul; Shen, Xuemin(Sherman); Singh, Ajit
Structure-wise, P2P networks can be divided into two major categories: (1) structured and (2) unstructured. In this chapter, we survey a group of unstructured P2P networks. These networks employ a gossip, or epidemic, protocol to maintain network membership; during a gossip exchange, peers swap a subset of their neighbors with each other. Such networks are reported to be scalable, robust, and resilient to severe network failures, while remaining very inexpensive to operate.
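A single neighbor-exchange step of the kind described, where two peers send random subsets of their views and merge what they receive, might look like this. The exchange size and the deduplication policy are assumptions:

```python
import random

def shuffle_views(view_a, view_b, exchange_size=3, rng=random):
    """One gossip exchange between two peers: each sends a random
    subset of its neighbor list and merges the subset it receives,
    keeping first occurrences (no view-size cap in this sketch)."""
    sent_a = rng.sample(view_a, min(exchange_size, len(view_a)))
    sent_b = rng.sample(view_b, min(exchange_size, len(view_b)))
    new_a = list(dict.fromkeys(view_a + sent_b))
    new_b = list(dict.fromkeys(view_b + sent_a))
    return new_a, new_b
```

Real shuffle protocols additionally bound the view size and evict old entries, which keeps the overlay close to a random graph under churn.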
A Method for Decentralised Optimisation in Networks
NASA Astrophysics Data System (ADS)
Saramäki, Jari
2005-06-01
We outline a method for distributed Monte Carlo optimisation of computational problems in networks of agents, such as peer-to-peer networks of computers. The optimisation and messaging procedures are inspired by gossip protocols and epidemic data dissemination, and are decentralised, i.e. no central overseer is required. In the outlined method, each agent follows simple local rules and seeks for better solutions to the optimisation problem by Monte Carlo trials, as well as by querying other agents in its local neighbourhood. With proper network topology, good solutions spread rapidly through the network for further improvement. Furthermore, the system retains its functionality even in realistic settings where agents are randomly switched on and off.
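Under the rules stated in the abstract (each agent improves its own candidate by Monte Carlo trials and queries neighbours for better solutions), a minimal version of the decentralised loop could be the following. The perturbation range, adoption rule and one-dimensional solutions are assumptions for illustration:

```python
import random

def gossip_optimise(cost, agents, neighbors, steps=100, rng=None):
    """Each agent holds a candidate solution (a float here), tries a
    random local perturbation, and adopts a random neighbour's solution
    when it is better. A minimal sketch; the paper's rules are richer."""
    rng = rng or random.Random(0)
    best = dict(agents)  # agent -> current best solution
    for _ in range(steps):
        for a in best:
            trial = best[a] + rng.uniform(-0.5, 0.5)  # Monte Carlo trial
            if cost(trial) < cost(best[a]):
                best[a] = trial
            peer = rng.choice(neighbors[a])            # gossip query
            if cost(best[peer]) < cost(best[a]):
                best[a] = best[peer]
    return best
```

No central overseer is involved: improvements spread epidemically because every adoption makes one more agent a source of the better solution.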
Community detection using preference networks
NASA Astrophysics Data System (ADS)
Tasgin, Mursel; Bingol, Haluk O.
2018-04-01
Community detection is the task of identifying clusters or groups of nodes in a network, where nodes within the same group are more connected with each other than with nodes in different groups. It has practical uses in identifying similar functions or roles of nodes in many biological, social and computer networks. With the availability of very large networks in recent years, the performance and scalability of community detection algorithms become crucial: if the time complexity of an algorithm is high, it cannot run on large networks. In this paper, we propose a new community detection algorithm that takes a local approach and is able to run on large networks. Its method is simple and effective: given a network, the algorithm constructs a preference network of nodes in which each node has a single outgoing edge pointing to the node it prefers to be in the same community with. In such a preference network, each connected component is a community. Selection of the preferred node is performed using similarity-based metrics that can be calculated within the 1-neighborhood of a node; we use two alternatives for this purpose, namely the number of common neighbors of the selector node and its neighbors, and the spread capability of neighbors around the selector node, calculated with the gossip algorithm of Lind et al. Our algorithm is tested on both computer-generated LFR networks and real-life networks with ground-truth community structure. It identifies communities accurately and quickly, and it is local, scalable and suitable for distributed execution on large networks.
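The preference-construction step can be sketched with the first of the two similarity metrics, common-neighbor counts; connected components of the resulting preference graph are then read off with union-find. The tie-breaking rule (first neighbor wins) is an assumption, as the abstract does not specify it:

```python
def preference_communities(adj):
    """Each node points to the neighbor it shares the most common
    neighbors with; weakly connected components of that preference
    graph are the communities (sketch of the abstract's approach)."""
    pref = {}
    for node, neigh in adj.items():
        if neigh:
            pref[node] = max(neigh,
                             key=lambda m: len(set(adj[m]) & set(neigh)))
    # union-find over the preference edges
    parent = {n: n for n in adj}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for a, b in pref.items():
        parent[find(a)] = find(b)
    groups = {}
    for n in adj:
        groups.setdefault(find(n), set()).add(n)
    return list(groups.values())
```

Because each node emits exactly one edge, the whole pass is linear in the number of edges inspected, which is what makes the local approach scale.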
NASA Astrophysics Data System (ADS)
Franzetti, Paolo; Scodeggio, Marco
2012-10-01
GOSSIP fits the electromagnetic emission of an object (the SED, Spectral Energy Distribution) against synthetic models to find the simulated one that best reproduces the observed data. It builds up the observed SED of an object (or a large sample of objects) by combining magnitudes in different bands and, optionally, a spectrum; it then performs a chi-square minimization fitting procedure against a set of synthetic models. The fitting results are used to estimate a number of physical parameters, such as the star formation history, absolute magnitudes and stellar mass, together with their Probability Distribution Functions.
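The chi-square minimization at the heart of such an SED fit reduces, in schematic form (ignoring model normalisation, upper limits and parameter marginalisation), to scanning a model grid for the lowest score:

```python
def chi_square(observed, model, errors):
    """Chi-square between an observed SED and one synthetic model,
    summed over photometric bands."""
    return sum(((o - m) / e) ** 2
               for o, m, e in zip(observed, model, errors))

def best_fit(observed, errors, models):
    """Index of the model grid entry minimising chi-square, as in a
    GOSSIP-style fit (schematic sketch, not the actual code)."""
    scores = [chi_square(observed, m, errors) for m in models]
    return min(range(len(models)), key=scores.__getitem__)
```

The distribution of chi-square values over the grid is also what yields the Probability Distribution Functions for the derived physical parameters.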
Charlton, Bruce G
2008-01-01
Crick and Watson gave complementary advice to the aspiring scientist based on the insight that to do your best work you need to make your greatest possible effort. Crick made the positive suggestion to work on the subject which most deeply interests you, the thing about which you spontaneously gossip - Crick termed this 'the gossip test'. Watson made the negative suggestion of avoiding topics and activities that bore you - which I have termed 'the boredom principle'. This is good advice because science is tough and the easy things have already been done. Solving the harder problems that remain requires a lot of effort. But in modern biomedical science individual effort does not necessarily correlate with career success as measured by salary, status, job security, etc. This is because Crick and Watson are talking about revolutionary science - using Thomas Kuhn's distinction between paradigm-shifting 'revolutionary' science and incremental 'normal' science. There are two main problems with pursuing a career in revolutionary science. The first is that revolutionary science is intrinsically riskier than normal science, the second that even revolutionary success in a scientific backwater may be less career-enhancing than mundane work in a trendy field. So, if you pick your scientific problem using the gossip test and the boredom principle, you might also be committing career suicide. This may explain why so few people follow Crick and Watson's advice. The best hope for future biomedical science is that it will evolve towards a greater convergence between individual effort and career success.
Scalable Multicast Protocols for Overlapped Groups in Broker-Based Sensor Networks
NASA Astrophysics Data System (ADS)
Kim, Chayoung; Ahn, Jinho
In sensor networks, there are many overlapped multicast groups because numerous subscribers, each with potentially varying specific interests, query every event published by sensors. Gossip-based communication protocols are promising as a potential solution for providing scalability in the publish/subscribe (P/S) paradigm in sensor networks. Moreover, despite the importance of both guaranteeing message delivery order and supporting overlapped multicast groups in sensor or P2P networks, little research exists on gossip-based protocols that satisfy all these requirements. In this paper, we present two causally ordered delivery protocols for overlapped multicast groups. One is based on sensor-brokers as delegates; the other is based on local views and delegates representing subscriber subgroups. In the sensor-broker-based protocol, the sensor-broker organizes overlapped multicast networks according to subscribers' interests. Message delivery order is guaranteed consistently, and all multicast messages are delivered to overlapped subscribers through gossip-based dissemination by the sensor-broker. These features make the sensor-broker-based protocol considerably more scalable than protocols based on hierarchical membership lists of dedicated groups, such as traditional committee protocols. The subscriber-delegate-based protocol, in turn, is more robust than fully decentralized protocols that guarantee causally ordered delivery using only local views, because delivery order is guaranteed consistently by all corresponding group members, including the delegates. This makes the subscriber-delegate protocol a hybrid approach that improves the inherent scalability of multicast through gossip-based techniques in all communications.
Kiss, Philippe; De Meester, Marc; Kristensen, Tage S; Braeckman, Lutgart
2014-11-01
This study aimed to explore the associations of organizational social capital (OSC) with the presence of gossip and slander, the presence of conflicts and quarrels, sick-leave prevalence, and the prevalence of poor work ability among frontline personnel of nursing homes. A total of 239 subjects (81% participation), working in 11 different nursing homes, took part in a cross-sectional questionnaire study. The following end points were considered: prevalence of gossip and slander, conflicts and quarrels, sick leave, and poor work ability. Associations with OSC were explored at the individual level (binomial log-linear regression analysis) and at the group level (Kendall's tau correlation coefficients). Significant associations were found between OSC and gossip and slander, sick leave, and poor work ability, in both the individual- and group-level analyses. The associations reached a higher significance level in the group-level analyses, with the strongest association found between the mean OSC of the workplace and the prevalence of poor work ability at the workplace (τ = -0.722; p = 0.002). This study demonstrated significant associations of OSC with three end points relevant to well-being at work in nursing homes. The results suggest that OSC should be treated as a characteristic of the entire workplace rather than as an individually experienced characteristic. The strikingly strong association between OSC and the prevalence of poor work ability points to an important role for OSC in maintaining work ability.
How Much Do Adolescents Cybergossip? Scale Development and Validation in Spain and Colombia.
Romera, Eva M; Herrera-López, Mauricio; Casas, José A; Ortega Ruiz, Rosario; Del Rey, Rosario
2018-01-01
Cybergossip is the act of two or more people making evaluative comments via digital devices about somebody who is not present. This cyberbehavior affects the social group in which it occurs and can either promote or hinder peer relationships. Scientific studies that assess the nature of this emerging and interactive behavior in the virtual world are limited. Some research on traditional gossip has identified it as an inherent and defining element of indirect relational aggression. This paper adopts and argues for a wider definition of gossip that includes positive comments and motivations. This work also suggests that cybergossip has to be measured independently from traditional gossip due to key differences when it occurs through ICT. This paper presents the Colombian and Spanish validation of the Cybergossip Questionnaire for Adolescents (CGQ-A), involving 3,747 high school students (M = 13.98 years old, SD = 1.69; 48.5% male), of which 1,931 were Colombian and 1,816 were Spanish. Test models derived from item response theory, confirmatory factor analysis, content validation, and multi-group analysis were run on the full sample and subsamples for each country and both genders. The obtained optimal fit and psychometric properties confirm the robustness and suitability of a one-dimensional structure for the cybergossip instrument. The multi-group analysis shows that the cybergossip construct is understood similarly in both countries and between girls and boys. The composite reliability ratifies the convergent and divergent validity of the scale. Descriptive results show that Colombian adolescents gossip less than their Spanish counterparts and that boys and girls use cybergossip to the same extent. In conclusion, this study confirms the relationship between cybergossip and cyberbullying, but it also supports a focus on positive cybergossip in psychoeducational interventions to build positive virtual relationships and prevent risky cyberbehaviors.
NASA Astrophysics Data System (ADS)
Mavromoustakis, Constandinos X.; Karatza, Helen D.
2010-06-01
When sharing resources, efficiency is substantially degraded by the scarcity of the requested resources in a multi-client setting. The problem is often aggravated by factors such as temporal constraints on availability or node flooding by requested replicated file chunks. Replicated file chunks should therefore be disseminated efficiently, so that resources are available on demand to mobile users. This work considers a cross-layered middleware support system for efficient delay-sensitive streaming that uses each device's connectivity and social interactions in a cross-layered manner. Collaborative streaming is achieved through an epidemic file-chunk replication policy that uses a transition-based chained model of an infectious disease with susceptible, infected, recovered and dead states. The gossip-based stateful model determines whether a mobile node should host a file chunk and, when a chunk is no longer needed, purge it. The proposed model is thoroughly evaluated through experimental simulation, measuring the effective throughput Eff as a function of the packet loss parameter against the effectiveness of the gossip-based replication policy.
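A hypothetical version of the chained susceptible/infected/recovered/dead transitions for one file chunk could look as follows. The trigger conditions are made up for illustration, since the abstract does not define them:

```python
# states a node can be in with respect to one replicated file chunk
SUSCEPTIBLE, INFECTED, RECOVERED, DEAD = "S", "I", "R", "D"

def next_state(state, wants_chunk, still_needed):
    """Hypothetical transition rule for the chained infectious-disease
    model: susceptible nodes that want the chunk host and spread it
    (infected), stop spreading when it is no longer needed (recovered),
    and finally purge it (dead)."""
    if state == SUSCEPTIBLE and wants_chunk:
        return INFECTED
    if state == INFECTED and not still_needed:
        return RECOVERED
    if state == RECOVERED and not still_needed:
        return DEAD
    return state
```

The chained structure matters: a node can only reach the purge state by passing through hosting and spreading, which is what keeps chunks replicated while demand lasts.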
Study on Dissemination Patterns in Location-Aware Gossiping Networks
NASA Astrophysics Data System (ADS)
Kami, Nobuharu; Baba, Teruyuki; Yoshikawa, Takashi; Morikawa, Hiroyuki
We study the properties of information dissemination over location-aware gossiping networks leveraging location-based real-time communication applications. Gossiping is a promising method for quickly disseminating messages in a large-scale system, but in its application to information dissemination for location-aware applications, it is important to consider the network topology and patterns of spatial dissemination over the network in order to achieve effective delivery of messages to potentially interested users. To this end, we propose a continuous-space network model extended from Kleinberg's small-world model applicable to actual location-based applications. Analytical and simulation-based study shows that the proposed network achieves high dissemination efficiency resulting from geographically neutral dissemination patterns as well as selective dissemination to proximate users. We have designed a highly scalable location management method capable of promptly updating the network topology in response to node movement and have implemented a distributed simulator to perform dynamic target pursuit experiments as one example of applications that are the most sensitive to message forwarding delay. The experimental results show that the proposed network surpasses other types of networks in pursuit efficiency and achieves the desirable dissemination patterns.
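The Kleinberg small-world rule that the proposed continuous-space model extends chooses long-range contacts with probability proportional to d(u, v)^-r. A direct sketch of that selection step, assuming distinct node positions and an illustrative exponent r = 2:

```python
import math
import random

def pick_long_range_contact(node, others, positions, r=2.0, rng=None):
    """Choose a long-range gossip contact with probability proportional
    to distance**-r, the Kleinberg small-world rule (sketch; the
    paper's continuous-space extension is more involved)."""
    rng = rng or random.Random(0)
    def dist(a, b):
        (ax, ay), (bx, by) = positions[a], positions[b]
        return math.hypot(ax - bx, ay - by)
    weights = [dist(node, v) ** -r for v in others]
    # roulette-wheel selection over the inverse-power weights
    u = rng.random() * sum(weights)
    acc = 0.0
    for v, w in zip(others, weights):
        acc += w
        if u <= acc:
            return v
    return others[-1]
```

With r near the dimension of the space, such links make greedy geographic forwarding efficient while still favouring nearby, likely-interested users.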
Negative Rumor: Contagion of a Psychiatric Department
McEwan, Stephanie; Bota, Robert G.
2014-01-01
Over the past few decades, a sizable body of literature on the effects of rumors and gossip has emerged. Addressing rumors in the workplace is an important subject, as rumors have a direct impact on the quality of the work environment and also on the productivity and creativity of the employees. To date, little has been written on the effect of rumors and gossip in psychiatric hospitals. This article presents case vignettes of rumors spread in psychiatric hospitals and the impact on team cohesion and morale among the staff implicated in these, too often, neglected occurrences. Dynamic aspects with particular focus on rumors in psychiatric units and suggestions for remedy and treatment are presented. PMID:25133051
Ma, Li; Fan, Suohai
2017-03-14
The random forests algorithm is a type of classifier with prominent universality, a wide application range, and robustness against overfitting. But there are still some drawbacks to random forests. Therefore, to improve the performance of random forests, this paper seeks to improve imbalanced data processing, feature selection and parameter optimization. We propose the CURE-SMOTE algorithm for the imbalanced data classification problem. Experiments on imbalanced UCI data reveal that combining Clustering Using Representatives (CURE) with the original synthetic minority oversampling technique (SMOTE) is effective compared with the classification results on the original data using random sampling, Borderline-SMOTE1, safe-level SMOTE, C-SMOTE, and k-means-SMOTE. Additionally, a hybrid RF (random forests) algorithm has been proposed for feature selection and parameter optimization, which uses the minimum out-of-bag (OOB) data error as its objective function. Simulation results on binary and higher-dimensional data indicate that the proposed hybrid RF algorithms, the hybrid genetic-random forests algorithm, the hybrid particle swarm-random forests algorithm and the hybrid fish swarm-random forests algorithm, can achieve the minimum OOB error and show the best generalization ability. The training set produced from the proposed CURE-SMOTE algorithm is closer to the original data distribution because it contains minimal noise. Thus, better classification results are produced from this feasible and effective algorithm. Moreover, the hybrid algorithms' F-value, G-mean, AUC and OOB scores demonstrate that they surpass the performance of the original RF algorithm. Hence, this hybrid algorithm provides a new way to perform feature selection and parameter optimization.
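The abstract builds on the classic SMOTE interpolation step, which can be sketched in a few lines. This is not the authors' CURE-SMOTE (the CURE clustering stage is omitted); it is a minimal, hedged illustration of the underlying oversampling idea: each synthetic minority sample is placed on the segment between a minority point and one of its k nearest minority neighbours.

```python
import math
import random

def smote(minority, n_new, k=5, seed=0):
    """Classic SMOTE interpolation step (sketch): synthesize minority
    samples by interpolating between a point and a random one of its
    k nearest minority-class neighbours."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # brute-force k nearest neighbours within the minority class
        neighbours = sorted((p for p in minority if p is not x),
                            key=lambda p: math.dist(x, p))[:k]
        nb = rng.choice(neighbours)
        t = rng.random()  # interpolation coefficient in [0, 1]
        synthetic.append(tuple(xi + t * (ni - xi) for xi, ni in zip(x, nb)))
    return synthetic

minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
new_points = smote(minority, n_new=6, k=2)
print(len(new_points))  # 6 synthetic minority samples
```

Because every synthetic point is a convex combination of two existing minority points, the new samples stay inside the minority region rather than duplicating points, which is what distinguishes SMOTE from plain random oversampling.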
Modeling Human Dynamics of Face-to-Face Interaction Networks
NASA Astrophysics Data System (ADS)
Starnini, Michele; Baronchelli, Andrea; Pastor-Satorras, Romualdo
2013-04-01
Face-to-face interaction networks describe social interactions in human gatherings, and are the substrate for processes such as epidemic spreading and gossip propagation. The bursty nature of human behavior characterizes many aspects of empirical data, such as the distribution of conversation lengths, of conversations per person, or of interconversation times. Despite several recent attempts, a general theoretical understanding of the global picture emerging from data is still lacking. Here we present a simple model that reproduces quantitatively most of the relevant features of empirical face-to-face interaction networks. The model describes agents that perform a random walk in a two-dimensional space and are characterized by an attractiveness whose effect is to slow down the motion of people around them. The proposed framework sheds light on the dynamics of human interactions and can improve the modeling of dynamical processes taking place on the ensuing dynamical social networks.
Rumor, gossip and blame: implications for HIV/AIDS prevention in the South African lowveld.
Stadler, Jonathan
2003-08-01
The HIV/AIDS epidemic provides fertile breeding ground for theories of the origin of HIV/AIDS, its mode of transmission, and the allocation of blame. Drawing on ethnographic research in the Bushbuckridge region of the South African lowveld, this article examines the articulation of AIDS through gossip and rumor. These oral forms create moral readings of behavior and shape folk discourses of AIDS that resist dominant epidemiological explanations. Significantly, constructions of AIDS are not uniform. Although elders claim AIDS as traditional and curable, younger men and women support theories of AIDS as a modern, foreign disease. Witchcraft beliefs are popular in explaining why certain people die and not others. At times, rumor may escalate into a moral panic. The implications of these findings for social responses to the AIDS epidemic and HIV/AIDS prevention are explored.
New approaches to model and study social networks
NASA Astrophysics Data System (ADS)
Lind, P. G.; Herrmann, H. J.
2007-07-01
We describe and develop three recent novelties in network research which are particularly useful for studying social systems. The first one concerns the discovery of some basic dynamical laws that enable the emergence of the fundamental features observed in social networks, namely the nontrivial clustering properties, the existence of positive degree correlations and the subdivision into communities. To reproduce all these features, we describe a simple model of mobile colliding agents, whose collisions define the connections between the agents, which are the nodes in the underlying network, and develop some analytical considerations. The second point addresses the particular feature of clustering and its relationship with global network measures, namely with the distribution of the size of cycles in the network. Since in social bipartite networks it is not possible to measure the clustering from standard procedures, we propose an alternative clustering coefficient that can be used to extract an improved normalized cycle distribution in any network. Finally, the third point addresses dynamical processes occurring on networks, namely when studying the propagation of information in them. In particular, we focus on the particular features of gossip propagation, which impose some restrictions on the propagation rules. To this end we introduce a quantity, the spread factor, which measures the average maximal fraction of nearest neighbours which get in contact with the gossip, and find the striking result that there is an optimal non-trivial number of friends for which the spread factor is minimized, decreasing the danger of being gossiped about.
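The spread factor described in the last point can be estimated by direct simulation. The sketch below assumes the usual gossip rule in this line of work: gossip about a victim is only of interest to, and only passed on by, the victim's own friends, so the spread is a BFS restricted to the victim's friendship circle. The graph and rule details here are illustrative, not the authors' exact model.

```python
import random

def spread_factor(adj, victim, seed=0):
    """Fraction of the victim's friends eventually reached by gossip that
    starts at a random friend and spreads only between the victim's
    friends (illustrative reconstruction, not the authors' code)."""
    rng = random.Random(seed)
    friends = set(adj[victim])
    if not friends:
        return 0.0
    start = rng.choice(sorted(friends))
    reached, frontier = {start}, [start]
    while frontier:  # BFS restricted to the victim's friendship circle
        node = frontier.pop()
        for nb in adj[node]:
            if nb in friends and nb not in reached:
                reached.add(nb)
                frontier.append(nb)
    return len(reached) / len(friends)

# Toy network: victim 0 has friends 1-4; 1-2 know each other, as do 3-4
adj = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1], 3: [0, 4], 4: [0, 3]}
print(spread_factor(adj, victim=0))  # 0.5: only one friendship cluster is reached
```

Averaging this quantity over many victims and originators gives the spread factor of the network, and makes it plain why the victim's friendship circle being fragmented (few mutual friendships) limits how far the gossip travels.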
Making sense of information in noisy networks: human communication, gossip, and distortion.
Laidre, Mark E; Lamb, Alex; Shultz, Susanne; Olsen, Megan
2013-01-21
Information from others can be unreliable. Humans nevertheless act on such information, including gossip, to make various social calculations, thus raising the question of whether individuals can sort through social information to identify what is, in fact, true. Inspired by empirical literature on people's decision-making when considering gossip, we built an agent-based simulation model to examine how well simple decision rules could make sense of information as it propagated through a network. Our simulations revealed that a minimalistic decision-rule 'Bit-wise mode' - which compared information from multiple sources and then sought a consensus majority for each component bit within the message - was consistently the most successful at converging upon the truth. This decision rule attained high relative fitness even in maximally noisy networks, composed entirely of nodes that distorted the message. The rule was also superior to other decision rules regardless of its frequency in the population. Simulations carried out with variable agent memory constraints, different numbers of observers who initiated information propagation, and a variety of network types suggested that the single most important factor in making sense of information was the number of independent sources that agents could consult. Broadly, our model suggests that despite the distortion information is subject to in the real world, it is nevertheless possible to make sense of it based on simple Darwinian computations that integrate multiple sources. Copyright © 2012 Elsevier Ltd. All rights reserved.
How hardwired is human behavior?
Nicholson, N
1998-01-01
Time and time again managers have tried to eliminate hierarchies, politics, and interorganizational rivalry--but to no avail. Why? Evolutionary psychologists would say that they are working against nature--emotional and behavioral "hardwiring" that is the legacy of our Stone Age ancestors. In this evolutionary psychology primer for executives, Nigel Nicholson explores many of the science's central tenets. Of course, evolutionary psychology is still an emerging discipline, and its strong connection with the theory of natural selection has sparked significant controversy. But, as Nicholson suggests, evolutionary psychology is now well established enough that its insights into human instinct will prove illuminating to anyone seeking to understand why people act the way they do in organizational settings. Take gossip. According to evolutionary psychology, our Stone Age ancestors needed this skill to survive the socially unpredictable conditions of the Savannah Plain. Thus, over time, the propensity to gossip became part of our mental programming. Executives trying to eradicate gossip at work might as well try to change their employees' musical tastes. Better to put one's energy into making sure the "rumor mill" avoids dishonesty or unkindness as much as possible. Evolutionary psychology also explores the dynamics of the human group. Clans on the Savannah Plain, for example, appear to have had no more than 150 members. The message for managers? People will likely be most effective in small organizational units. As every executive knows, it pays to be an insightful student of human nature. Evolutionary psychology adds another important chapter to consider.
Randomized Dynamic Mode Decomposition
NASA Astrophysics Data System (ADS)
Erichson, N. Benjamin; Brunton, Steven L.; Kutz, J. Nathan
2017-11-01
The dynamic mode decomposition (DMD) is an equation-free, data-driven matrix decomposition that is capable of providing accurate reconstructions of spatio-temporal coherent structures arising in dynamical systems. We present randomized algorithms to compute the near-optimal low-rank dynamic mode decomposition for massive datasets. Randomized algorithms are simple, accurate and able to ease the computational challenges arising with `big data'. Moreover, randomized algorithms are amenable to modern parallel and distributed computing. The idea is to derive a smaller matrix from the high-dimensional input data matrix using randomness as a computational strategy. Then, the dynamic modes and eigenvalues are accurately learned from this smaller representation of the data, whereby the approximation quality can be controlled via oversampling and power iterations. Here, we present randomized DMD algorithms that are categorized by how many passes the algorithm takes through the data. Specifically, the single-pass randomized DMD does not require data to be stored for subsequent passes. Thus, it is possible to approximately decompose massive fluid flows (stored out of core memory, or not stored at all) using single-pass algorithms, which is infeasible with traditional DMD algorithms.
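The core idea of the abstract, deriving a smaller matrix from the data with randomness and learning the dynamic modes from it, can be sketched as follows. This is a minimal illustration of a randomized range finder followed by exact DMD in the reduced space; the oversampling default, variable names, and toy data are my own, not the authors'.

```python
import numpy as np

def randomized_dmd(X, rank, p=5, seed=0):
    """Sketch of randomized DMD: compress the snapshot matrix with a random
    range finder (oversampled by p), run exact DMD on the small sketch,
    then lift the modes back to the full space."""
    rng = np.random.default_rng(seed)
    omega = rng.standard_normal((X.shape[1], rank + p))  # random test matrix
    Q, _ = np.linalg.qr(X @ omega)       # orthonormal basis for the range of X
    B = Q.T @ X                          # small sketch of the data
    X0, X1 = B[:, :-1], B[:, 1:]         # time-shifted snapshot pairs
    U, s, Vt = np.linalg.svd(X0, full_matrices=False)
    U, s, Vt = U[:, :rank], s[:rank], Vt[:rank]
    Atilde = U.conj().T @ X1 @ Vt.conj().T @ np.diag(1.0 / s)  # reduced operator
    eigvals, W = np.linalg.eig(Atilde)
    modes = Q @ (X1 @ Vt.conj().T @ np.diag(1.0 / s) @ W)      # lifted DMD modes
    return eigvals, modes

# Toy data: two spatial structures oscillating at a single frequency
space = np.linspace(-5, 5, 200)
t = np.linspace(0, 4 * np.pi, 100)
X = (np.outer(1 / np.cosh(space), np.cos(2 * t))
     + np.outer(np.tanh(space), np.sin(2 * t)))
eigvals, modes = randomized_dmd(X, rank=2)
print(np.abs(eigvals))  # magnitudes ~1: pure oscillation, no growth or decay
```

Because all the expensive linear algebra happens on the (rank + p)-row sketch `B` rather than on `X` itself, the cost scales with the target rank instead of the ambient dimension, which is the point of the randomized approach.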
Talking to Kids and Teens About Social Media and Sexting
... in the case of “sexting” (see below) or bullying. Remember to make a point of discouraging kids from gossiping, spreading rumors, bullying or damaging someone’s reputation using texting or other ...
NASA Astrophysics Data System (ADS)
Koffeman, E. N.
2007-12-01
Several years ago a revolutionary miniature TPC was developed using a pixel chip with a Micromegas foil spanned over it. To overcome the mechanical stability problems and improve the positioning accuracy while spanning a foil on top of a small readout chip a process has been developed in which a Micromegas-like grid is applied on a CMOS wafer in a post-processing step. This aluminum grid is supported on insulating pillars that are created by etching after the grid has been made. The energy resolution (measured on the absorption of the X-rays from a 55Fe source) was remarkably good. Several geometries have since been tested and we now believe that a 'Gas On Slimmed Silicon Pixel chip' (Gossip) may be realized. The drift region of such a gaseous pixel detector would be reduced to a millimeter. Such a detector is potentially very radiation hard (SLHC vertexing) but aging and sparking must be eliminated.
Rusu, Mihai S
2017-01-01
Cross-culturally, the dead are protected from posthumous negative evaluations by the universal "nil nisi bonum" precept that governs the ethics within the community of mourners. In this study, we set out to test the observance of this injunction against posthumous gossiping in the Romanian public deathscape. Obituaries and other posthumous articles (N = 1,148) were collected that covered the deaths of 63 celebrities who passed away between 2013 and 2016. Materials were gathered from the digital archives of three Romanian news sources (a news agency, a "quality" newspaper, and a tabloid), published one week after the moment of death. The findings show that 22% of the articles do contain negative evaluations of the deceased. The percentage rises to 36.4% if we restrict the sample to only those celebrities with a controversial anthumous reputation (19 of 63). These results indicate that celebrities are not spared from critical assessments after they pass away.
How social stigma sustains the HIV treatment gap for MSM in Mpumalanga, South Africa.
Maleke, Kabelo; Daniels, Joseph; Lane, Tim; Struthers, Helen; McIntyre, James; Coates, Thomas
2017-11-01
There are gaps in HIV care for men who have sex with men (MSM) in African settings, and HIV social stigma plays a significant role in sustaining these gaps. We conducted a three-year research project with 49 HIV-positive MSM in two districts in Mpumalanga Province, South Africa, to understand the factors that inform HIV care seeking behaviors. Semi-structured focus group discussions and interviews were conducted in IsiZulu, SiSwati, and some code-switching into English, and these were audio-recorded, transcribed, and translated into English. We used a constant comparison approach to analyze these data. HIV social stigma centered around gossip that sustained self-diagnosis and delayed clinical care with decisions to use traditional healers to mitigate the impact of gossip on their lives. More collaboration models are needed between traditional healers and health professionals to support the global goals for HIV testing and treatment.
Emergence of an optimal search strategy from a simple random walk
Sakiyama, Tomoko; Gunji, Yukio-Pegio
2013-01-01
In reports addressing animal foraging strategies, it has been stated that Lévy-like algorithms represent an optimal search strategy in an unknown environment, because of their super-diffusion properties and power-law-distributed step lengths. Here, starting with a simple random walk algorithm, which offers the agent a randomly determined direction at each time step with a fixed move length, we investigated how flexible exploration is achieved if an agent alters its randomly determined next step forward and the rule that controls its random movement based on its own directional moving experiences. We showed that our algorithm led to an effective food-searching performance compared with a simple random walk algorithm and exhibited super-diffusion properties, despite the uniform step lengths. Moreover, our algorithm exhibited a power-law distribution independent of uniform step lengths. PMID:23804445
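The paper's starting point, a forager that picks a uniformly random direction at each step and moves a fixed length, is easy to sketch. The block below is only that baseline; the authors' adaptive rule, in which the agent alters its next step from its own directional experience, is not reproduced here, and the target layout and parameters are illustrative.

```python
import math
import random

def random_walk_search(targets, steps=10000, step_len=1.0, radius=0.5, seed=1):
    """Baseline simple random walk forager: uniformly random heading and a
    fixed move length at every step; a target is 'found' when the agent
    passes within `radius` of it."""
    rng = random.Random(seed)
    x = y = 0.0
    remaining = set(targets)
    found = 0
    for _ in range(steps):
        theta = rng.uniform(0.0, 2.0 * math.pi)  # randomly determined direction
        x += step_len * math.cos(theta)          # fixed move length
        y += step_len * math.sin(theta)
        caught = {t for t in remaining if math.dist((x, y), t) <= radius}
        found += len(caught)
        remaining -= caught
    return found

rng = random.Random(7)
targets = [(rng.uniform(-20, 20), rng.uniform(-20, 20)) for _ in range(50)]
print(random_walk_search(targets))  # number of targets found in 10,000 steps
```

Against this baseline one would compare a walker whose step rule adapts to past headings: the paper's claim is that such adaptation yields super-diffusive, power-law-like exploration even though every individual step has the same length.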
A fast ergodic algorithm for generating ensembles of equilateral random polygons
NASA Astrophysics Data System (ADS)
Varela, R.; Hinson, K.; Arsuaga, J.; Diao, Y.
2009-03-01
Knotted structures are commonly found in circular DNA and along the backbone of certain proteins. In order to properly estimate properties of these three-dimensional structures it is often necessary to generate large ensembles of simulated closed chains (i.e. polygons) of equal edge lengths (such polygons are called equilateral random polygons). However finding efficient algorithms that properly sample the space of equilateral random polygons is a difficult problem. Currently there are no proven algorithms that generate equilateral random polygons according to their theoretical distribution. In this paper we propose a method that generates equilateral random polygons in a 'step-wise uniform' way. We prove that this method is ergodic in the sense that any given equilateral random polygon can be generated by this method and we show that the time needed to generate an equilateral random polygon of length n is linear in terms of n. These two properties make this algorithm a substantial improvement over the existing generating methods. Detailed numerical comparisons of our algorithm with other widely used algorithms are provided.
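Any sampler in this space must preserve two invariants: closure of the polygon and equal edge lengths. The classic 'crankshaft' move illustrates how a move can respect both; the sketch below is that generic MCMC move, not the authors' step-wise uniform algorithm, and the hexagon test case is my own.

```python
import numpy as np

def crankshaft(poly, rng):
    """One crankshaft move on a closed equilateral polygon: pick two
    vertices and rotate the subchain strictly between them about the
    chord through them by a random angle. Rotation about the chord is an
    isometry fixing both endpoints, so closure and all edge lengths are
    preserved."""
    n = len(poly)
    i, j = sorted(rng.choice(n, size=2, replace=False))
    if j - i < 2:
        return poly  # no interior vertices between i and j to rotate
    axis = poly[j] - poly[i]
    axis = axis / np.linalg.norm(axis)
    theta = rng.uniform(0.0, 2.0 * np.pi)
    # Rodrigues rotation matrix about `axis`
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    out = poly.copy()
    out[i + 1:j] = poly[i] + (poly[i + 1:j] - poly[i]) @ R.T
    return out

# Start from a planar regular hexagon; circumradius 1 gives unit edges
angles = np.arange(6) * np.pi / 3
poly = np.stack([np.cos(angles), np.sin(angles), np.zeros(6)], axis=1)
rng = np.random.default_rng(0)
for _ in range(100):
    poly = crankshaft(poly, rng)
edges = np.linalg.norm(np.roll(poly, -1, axis=0) - poly, axis=1)
print(edges.round(8))  # all six edge lengths remain 1 after the moves
```

The difficulty the paper addresses is not finding such moves but proving which stationary distribution a chain of them samples; crankshaft-style chains are ergodic on the configuration space, yet their equilibrium distribution is exactly the open question the abstract mentions.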
Charge amplitude distribution of the Gossip gaseous pixel detector
NASA Astrophysics Data System (ADS)
Blanco Carballo, V. M.; Chefdeville, M.; Colas, P.; Giomataris, Y.; van der Graaf, H.; Gromov, V.; Hartjes, F.; Kluit, R.; Koffeman, E.; Salm, C.; Schmitz, J.; Smits, S. M.; Timmermans, J.; Visschers, J. L.
2007-12-01
The Gossip gaseous pixel detector is being developed for the detection of charged particles in extreme high radiation environments as foreseen close to the interaction point of the proposed super LHC. The detecting medium is a thin layer of gas. Because of the low density of this medium, only a few primary electron/ion pairs are created by the traversing particle. To get a detectable signal, the electrons drift towards a perforated metal foil (Micromegas) whereafter they are multiplied in a gas avalanche to provide a detectable signal. The gas avalanche occurs in the high field between the Micromegas and the pixel readout chip (ROC). Compared to a silicon pixel detector, Gossip features a low material budget and a low cooling power. An experiment using X-rays has indicated a possible high radiation tolerance exceeding 10^16 hadrons/cm^2. The amplified charge signal has a broad amplitude distribution due to the limited statistics of the primary ionization and the statistical variation of the gas amplification. Therefore, some degree of inefficiency is inevitable. This study presents experimental results on the charge amplitude distribution for CO2/DME (dimethyl-ether) and Ar/iC4H10 mixtures. The measured curves were fitted with the outcome of a theoretical model. In the model, the physical Landau distribution is approximated by a Poisson distribution that is convoluted with the variation of the gas gain and the electronic noise. The value for the fraction of pedestal events is used for a direct calculation of the cluster density. For some gases, the measured cluster density is considerably lower than given in literature.
Modes of Discourse in Educational Administration: A Taxonomy.
ERIC Educational Resources Information Center
Vice, James W.
1983-01-01
Uses examples from higher education to present a taxonomy of discourse, classifying verbal interchanges into three minor modes (rote pronouncement, passing time, and gossip), a transitional mode (true conversation), and three major modes (rhetoric, dialectic, and deliberation). (JAC)
ERIC Educational Resources Information Center
Vail, Kathleen
1995-01-01
All new superintendents are in jeopardy, but their greatest vulnerability is ignorance of district history. Superintendents should trust no one for several months and be especially wary of disgruntled board members, sore losers, gossips, and sieves. New superintendents can build trust by being trustworthy, immediately firing or reassigning…
NASA Astrophysics Data System (ADS)
2002-09-01
CD-ROM REVIEWS (449): It's Physics; Furry Elephant: Electricity Explained. BOOK REVIEWS (450): What Are the Chances? Voodoo Deaths, Office Gossip and Other Adventures in Probability; Dictionary of Mechanics: A handbook for teachers and students; Intermediate 2 Physics. PLACES TO VISIT (452): Spaceguard Centre. WEB WATCH (455): Risk
ERIC Educational Resources Information Center
Foucher, Bernard; And Others
1981-01-01
Discusses the pedagogical implications, for a French class, of the following: (1) a puzzle-like game for text reconstruction; (2) use of gossip-column letters in advanced classes; (3) use of radio news and newspaper titles; and (4) classroom space utilization favoring spontaneous communication. (AMH)
Breach of Faith: The Abuse of Confidentiality in Faculty Rooms
ERIC Educational Resources Information Center
Lopiparo, Jerry
1977-01-01
Problems can arise from comments made in faculty rooms about students and their families. The author recommends bringing the issue of gossip into the open in each school district in order for individual teachers to realize its ramifications and destructiveness. (GC)
Of gossips, eavesdroppers, and peeping toms
Francis, Huw W S
1982-01-01
British accounts of medical ethics concentrate on confidentiality to the exclusion of wider questions of privacy. This paper argues for consideration of privacy within medical ethics, and illustrates through the television series `Hospital', what may go awry when this wider concept is forgotten. PMID:7131499
On factoring RSA modulus using random-restart hill-climbing algorithm and Pollard’s rho algorithm
NASA Astrophysics Data System (ADS)
Budiman, M. A.; Rachmawati, D.
2017-12-01
The security of the widely-used RSA public key cryptography algorithm depends on the difficulty of factoring a big integer into two large prime numbers. For many years, the integer factorization problem has been intensively and extensively studied in the field of number theory. As a result, a lot of deterministic algorithms such as Euler's algorithm, Kraitchik's, and variants of Pollard's algorithms have been researched comprehensively. Our study takes a rather uncommon approach: rather than making use of intensive number theories, we attempt to factorize RSA modulus n by using the random-restart hill-climbing algorithm, which belongs to the class of metaheuristic algorithms. The factorization time of RSA moduli with different lengths is recorded and compared with the factorization time of Pollard's rho algorithm, which is a deterministic algorithm. Our experimental results indicate that while the random-restart hill-climbing algorithm is an acceptable candidate to factorize smaller RSA moduli, the factorization speed is much slower than that of Pollard's rho algorithm.
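The deterministic baseline in the comparison, Pollard's rho, is compact enough to sketch. The iteration x -> x^2 + c (mod n) tends to enter a cycle modulo an unknown prime factor p much sooner than modulo n itself, and the gcd of the tortoise-hare difference with n exposes that factor. The toy modulus below is my own example, not from the study.

```python
import math

def pollard_rho(n, c=1):
    """Pollard's rho with Floyd cycle detection: returns a nontrivial
    factor of composite n. On failure (gcd hits n), retry with a
    different constant c in the iteration."""
    if n % 2 == 0:
        return 2
    x = y = 2
    d = 1
    while d == 1:
        x = (x * x + c) % n          # tortoise: one step
        y = (y * y + c) % n
        y = (y * y + c) % n          # hare: two steps
        d = math.gcd(abs(x - y), n)
    return d if d != n else pollard_rho(n, c + 1)

# Factor a small RSA-style modulus n = p * q
n = 10403  # 101 * 103
p = pollard_rho(n)
print(p, n // p)  # recovers the two prime factors
```

Rho's expected running time is roughly O(n^(1/4)) multiplications, which is why a fitness-guided hill climb over candidate factors, the paper's metaheuristic alternative, has trouble keeping up on anything but very small moduli.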
The evolutionary advantage of limited network knowledge.
Larson, Jennifer M
2016-06-07
Groups of individuals have social networks that structure interactions within the groups; evolutionary theory increasingly uses this fact to explain the emergence of cooperation (Eshel and Cavalli-Sforza, 1982; Boyd and Richerson, 1988, 1989; Ohtsuki et al., 2006; Nowak et al., 2010; Van Veelen et al., 2012). This approach has resulted in a number of important insights for the evolution of cooperation in the biological and social sciences, but omits a key function of social networks that has persisted throughout recent evolutionary history (Apicella et al., 2012): their role in transmitting gossip about behavior within a group. Accounting for this well-established role of social networks among rational agents in a setting of indirect reciprocity not only shows a new mechanism by which the structure of networks is fitness-relevant, but also reveals that knowledge of social networks can be fitness-relevant as well. When groups enforce cooperation by sanctioning peers whom gossip reveals to have deviated, individuals in certain peripheral network positions are tempting targets of uncooperative behavior because gossip they share about misbehavior spreads slowly through the network. The ability to identify these individuals creates incentives to behave uncooperatively. Consequently, groups comprised of individuals who knew precise information about their social networks would be at a fitness disadvantage relative to groups of individuals with a coarser knowledge of their networks. Empirical work has consistently shown that modern humans know little about the structure of their own social networks and perform poorly when tasked with learning new ones. This robust empirical regularity may be the product of natural selection in an environment of strong selective pressure at the group level. Imprecise views of networks make enforcing cooperation easier. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Li, Jiafu; Xiang, Shuiying; Wang, Haoning; Gong, Junkai; Wen, Aijun
2018-03-01
In this paper, a novel image encryption algorithm based on synchronization of physical random bits generated in a cascade-coupled semiconductor ring laser (CCSRL) system is proposed, and the security analysis is performed. In both transmitter and receiver parts, the CCSRL system is a master-slave configuration consisting of a master semiconductor ring laser (M-SRL) with cross-feedback and a solitary SRL (S-SRL). The proposed image encryption algorithm includes image preprocessing based on conventional chaotic maps, pixel confusion based on a control matrix extracted from the physical random bits, and pixel diffusion based on a random bit stream extracted from the physical random bits. Firstly, the preprocessing method is used to eliminate the correlation between adjacent pixels. Secondly, physical random bits with verified randomness are generated based on chaos in the CCSRL system, and are used to simultaneously generate the control matrix and random bit stream. Finally, the control matrix and random bit stream are used for the encryption algorithm in order to change the positions and the values of pixels, respectively. Simulation results and security analysis demonstrate that the proposed algorithm is effective and able to resist various typical attacks, and thus is an excellent candidate for secure image communication application.
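The confusion-then-diffusion structure described above can be sketched independently of the laser hardware. In the sketch below a seeded PRNG stands in for the CCSRL physical random bits (a deliberate simplification: the security of the actual scheme rests on those physical bits); confusion permutes pixel positions and diffusion XOR-chains pixel values so a one-pixel change propagates through the ciphertext.

```python
import random

def encrypt(pixels, key):
    """Confusion + diffusion sketch: a keyed stream first permutes pixel
    positions, then XOR-chains pixel values (each output byte mixes in
    the previous ciphertext byte)."""
    rng = random.Random(key)
    perm = list(range(len(pixels)))
    rng.shuffle(perm)                      # confusion: scramble positions
    shuffled = [pixels[i] for i in perm]
    out, prev = [], 0
    for p in shuffled:                     # diffusion: chained XOR stream
        prev = p ^ rng.randrange(256) ^ prev
        out.append(prev)
    return out

def decrypt(cipher, key):
    rng = random.Random(key)
    perm = list(range(len(cipher)))
    rng.shuffle(perm)                      # same key -> same permutation/stream
    shuffled, prev = [], 0
    for c in cipher:
        shuffled.append(c ^ rng.randrange(256) ^ prev)
        prev = c
    plain = [0] * len(cipher)
    for val, src in zip(shuffled, perm):
        plain[src] = val                   # undo the permutation
    return plain

img = [7, 42, 42, 42, 200, 0, 13, 255]
cipher = encrypt(img, key=1234)
print(decrypt(cipher, key=1234) == img)  # True: the round trip restores the image
```

Separating position scrambling from value mixing is the standard design split in chaos-based image ciphers: confusion destroys spatial correlation between adjacent pixels, while the chained diffusion gives avalanche behaviour across the whole image.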
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papantoni-Kazakos, P.; Paterakis, M.
1988-07-01
For many communication applications with time constraints (e.g., transmission of packetized voice messages), a critical performance measure is the percentage of messages transmitted within a given amount of time after their generation at the transmitting station. This report presents a random-access algorithm (RAA) suitable for time-constrained applications. Performance analysis demonstrates that significant message-delay improvement is attained at the expense of minimal traffic loss. Also considered is the case of noisy channels. The noise effect appears as erroneously observed channel feedback. Error sensitivity analysis shows that the proposed random-access algorithm is insensitive to feedback channel errors. Window Random-Access Algorithms (RAAs) are considered next. These algorithms constitute an important subclass of Multiple-Access Algorithms (MAAs); they are distributive, and they attain high throughput and low delays by controlling the number of simultaneously transmitting users.
Genetic algorithms with memory- and elitism-based immigrants in dynamic environments.
Yang, Shengxiang
2008-01-01
In recent years the genetic algorithm community has shown a growing interest in studying dynamic optimization problems. Several approaches have been devised. The random immigrants and memory schemes are two major ones. The random immigrants scheme addresses dynamic environments by maintaining the population diversity while the memory scheme aims to adapt genetic algorithms quickly to new environments by reusing historical information. This paper investigates a hybrid memory and random immigrants scheme, called memory-based immigrants, and a hybrid elitism and random immigrants scheme, called elitism-based immigrants, for genetic algorithms in dynamic environments. In these schemes, the best individual from memory or the elite from the previous generation is retrieved as the base to create immigrants into the population by mutation. This way, not only can diversity be maintained but it is done more efficiently to adapt genetic algorithms to the current environment. Based on a series of systematically constructed dynamic problems, experiments are carried out to compare genetic algorithms with the memory-based and elitism-based immigrants schemes against genetic algorithms with traditional memory and random immigrants schemes and a hybrid memory and multi-population scheme. The sensitivity analysis regarding some key parameters is also carried out. Experimental results show that the memory-based and elitism-based immigrants schemes efficiently improve the performance of genetic algorithms in dynamic environments.
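The elitism-based immigrants scheme described above, creating immigrants by mutating the previous generation's elite rather than generating them at random, can be sketched in a minimal GA. The population size, rates, and the OneMax test function below are illustrative choices, not the paper's experimental setup.

```python
import random

def elitism_immigrants_ga(fitness, n_bits=30, pop_size=40, ratio=0.2,
                          gens=100, pmut=0.05, seed=0):
    """Minimal GA with elitism-based immigrants: every generation, mutated
    copies of the current elite replace the worst individuals, injecting
    diversity that stays close to what already fits the environment."""
    rng = random.Random(seed)
    mutate = lambda ind: [b ^ (rng.random() < pmut) for b in ind]
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[0]
        n_imm = int(ratio * pop_size)
        pop[-n_imm:] = [mutate(elite) for _ in range(n_imm)]  # immigrants
        # generational step: keep the elite, fill the rest by binary
        # tournament selection followed by mutation
        parents = [max(rng.sample(pop, 2), key=fitness)
                   for _ in range(pop_size - 1)]
        pop = [elite] + [mutate(p) for p in parents]
    return max(pop, key=fitness)

best = elitism_immigrants_ga(sum)  # OneMax: fitness = number of ones
print(sum(best))
```

The contrast with plain random immigrants is the base used for mutation: random immigrants maximize diversity but ignore the current environment, whereas elite-derived immigrants trade some diversity for immediate relevance, which is why the paper finds them stronger when environmental changes are moderate.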
What Does the Academic Publisher Actually Do?
ERIC Educational Resources Information Center
Mendel, David
1991-01-01
A frustrated author recounts his own experiences and those of others in dealing with publishers. He concludes that academic publishers prefer exchanging ideas and academic gossip with authors to the basics of business, letting the books sell themselves to a captive audience of academic libraries. (MSE)
Goethe Gossips with Grass: Using Computer Chatting Software in an Introductory Literature Course.
ERIC Educational Resources Information Center
Fraser, Catherine C.
1999-01-01
Students in a third-year introduction to German literature course chatted over networked computers, using "FirstClass" software. A brief description of the course design is provided with detailed information on how the three chat sessions were organized. (Author/VWL)
The GOSSIP on the MCV V347 Pavonis
NASA Astrophysics Data System (ADS)
Potter, S. B.; Cropper, Mark; Hakala, P. J.
Modelling of the polarized cyclotron emission from magnetic cataclysmic variables (MCVs) has been a powerful technique for determining the structure of the accretion zones on the white dwarf. Until now, this has been achieved by constructing emission regions (for example arcs and spots) put in by hand, in order to recover the polarized emission. These models were all inferred indirectly from arguments based on polarization and X-ray light curves. Potter, Hakala & Cropper (1998) presented a technique (Stokes imaging) which objectively and analytically models the polarized emission to recover the structure of the cyclotron emission region(s) in MCVs. We demonstrate this technique with the aid of a test case, then we apply the technique to polarimetric observations of the AM Her system V347 Pav. As the system parameters of V347 Pav (for example its inclination) have not been well determined, we describe an extension to the Stokes imaging technique which also searches the system parameter space (GOSSIP).
Graphic matching based on shape contexts and reweighted random walks
NASA Astrophysics Data System (ADS)
Zhang, Mingxuan; Niu, Dongmei; Zhao, Xiuyang; Liu, Mingjun
2018-04-01
Graphic matching is a very critical issue in all aspects of computer vision. In this paper, a new graphic matching algorithm combining shape contexts and reweighted random walks is proposed. On the basis of the local descriptor, shape contexts, the reweighted random walks algorithm was modified to possess stronger robustness and correctness in the final result. Our main process is to use the shape context descriptor for the random walk iteration, whose purpose is to control the random walk probability matrix. We calculate a bias matrix using the descriptors and then, during the iteration, use it to enhance the accuracy of the random walks and random jumps; finally we obtain the one-to-one registration result by discretization of the matrix. The algorithm not only preserves the noise robustness of reweighted random walks but also possesses the rotation, translation, and scale invariance of shape contexts. Through extensive experiments, based on real images and random synthetic point sets, and comparisons with other algorithms, it is confirmed that this new method can produce excellent results in graphic matching.
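The iteration at the heart of the method, a random walk over the affinity matrix of candidate correspondences mixed with a jump toward a descriptor-derived bias, can be sketched as a personalized power iteration. In the sketch the `bias` vector stands in for the shape-context similarity term; the mixing constant, iteration count, and toy affinities are illustrative, not the paper's.

```python
import numpy as np

def reweighted_random_walk(M, bias, alpha=0.8, iters=100):
    """Sketch of the matching iteration: random walk on the candidate-match
    affinity matrix M, mixed with a jump toward the descriptor bias, with
    renormalization so the state stays a probability distribution."""
    x = np.full(M.shape[0], 1.0 / M.shape[0])
    bias = bias / bias.sum()
    for _ in range(iters):
        x = alpha * (M @ x) + (1.0 - alpha) * bias  # walk step + biased jump
        x = x / x.sum()                             # renormalize
    return x

# Toy affinity over four candidate matches: matches 0 and 2 are mutually
# consistent (high pairwise affinity 0.9), the others are not
M = np.array([[1.0, 0.1, 0.9, 0.1],
              [0.1, 1.0, 0.1, 0.2],
              [0.9, 0.1, 1.0, 0.1],
              [0.1, 0.2, 0.1, 1.0]])
bias = np.ones(4)  # uninformative descriptor prior
x = reweighted_random_walk(M, bias)
print(x)  # the mutually consistent matches 0 and 2 receive the most mass
```

In a full pipeline the final confidence vector `x` is discretized (e.g. by a greedy or Hungarian assignment) into the one-to-one registration the abstract mentions; the descriptor bias is what lets local shape evidence reweight the purely structural walk.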
Consciences, Compasses, Codes, and Common Principles. Teaching Strategy.
ERIC Educational Resources Information Center
Blum, Ann
1996-01-01
Describes a lesson plan that examines the needs and benefits of moral behavior standards that extend beyond the strictly "legal." Procedures include a guided discussion concerning legal but reprehensible actions (participating in destructive gossip). Informative handouts precede moving the discussion to a more global perspective. (MJP)
Sexual Orientation and Violations of Civil Liberties
ERIC Educational Resources Information Center
Adelman, Marcy R.
1977-01-01
This study determined that sexual orientation is frequently assumed rather than known. Bases for assumption include gossip and rumor, appearance and behavior, and association with others. Sexual orientation was most frequently assumed on the basis of appearance and behavior. Presented at the American Psychological Association Convention,…
Using Pop Culture to Teach Introductory Biology
ERIC Educational Resources Information Center
Pryor, Gregory S.
2008-01-01
Students are captivated by the characters, storylines, and gossip provided by pop culture (television, movies, magazines, books, sports, music, advertisements, and the Internet). They always seem more engaged when teachers incorporate examples and analogies from popular culture into their lectures. This seems especially true regarding non-majors…
Parameter identification using a creeping-random-search algorithm
NASA Technical Reports Server (NTRS)
Parrish, R. V.
1971-01-01
A creeping-random-search algorithm is applied to different types of problems in the field of parameter identification. The studies are intended to demonstrate that a random-search algorithm can be applied successfully to these various problems, which often cannot be handled by conventional deterministic methods, and, also, to introduce methods that speed convergence to an extremal of the problem under investigation. Six two-parameter identification problems with analytic solutions are solved, and two application problems are discussed in some detail. Results of the study show that a modified version of the basic creeping-random-search algorithm chosen does speed convergence in comparison with the unmodified version. The results also show that the algorithm can successfully solve problems that contain limits on state or control variables, inequality constraints (both independent and dependent, and linear and nonlinear), or stochastic models.
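A minimal sketch of a creeping random search, with a grow/shrink step-size rule standing in for the report's convergence-speeding modification; the rule, the seed and the two-parameter test problem below are illustrative assumptions, not the report's actual cases.

```python
import random

def creeping_random_search(f, x0, step=0.5, iters=20000, seed=1):
    """Perturb the current best point with small Gaussian steps and keep only
    improvements; the step size adapts (wider on success, tighter on failure)."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
            step *= 1.1                      # creep faster while improving
        else:
            step = max(step * 0.98, 1e-12)   # tighten around the best point
    return x, fx

# Two-parameter identification toy: recover (a, b) = (3, -2) of y = a*t + b
# from a sum-of-squares error with a known analytic solution.
ts = [0.1 * i for i in range(20)]
ys = [3.0 * t - 2.0 for t in ts]
err = lambda p: sum((p[0] * t + p[1] - y) ** 2 for t, y in zip(ts, ys))
p_best, e_best = creeping_random_search(err, [0.0, 0.0])
```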
A New Algorithm with Plane Waves and Wavelets for Random Velocity Fields with Many Spatial Scales
NASA Astrophysics Data System (ADS)
Elliott, Frank W.; Majda, Andrew J.
1995-03-01
A new Monte Carlo algorithm for constructing and sampling stationary isotropic Gaussian random fields with power-law energy spectrum, infrared divergence, and fractal self-similar scaling is developed here. The theoretical basis for this algorithm involves the fact that such a random field is well approximated by a superposition of random one-dimensional plane waves involving a fixed finite number of directions. In general each one-dimensional plane wave is the sum of a random shear layer and a random acoustical wave. These one-dimensional random plane waves are then simulated by a wavelet Monte Carlo method for a single space variable developed recently by the authors. The computational results reported in this paper demonstrate remarkably low variance and economical representation of such Gaussian random fields through this new algorithm. In particular, the velocity structure function for an incompressible isotropic Gaussian random field in two space dimensions with the Kolmogoroff spectrum can be simulated accurately over 12 decades with only 100 realizations of the algorithm, with the scaling exponent accurate to 1.1% and the constant prefactor accurate to 6%; in fact, the exponent of the velocity structure function can be computed over 12 decades within 3.3% with only 10 realizations. Furthermore, only 46,592 active computational elements are utilized in each realization to achieve these results for 12 decades of scaling behavior.
A high speed implementation of the random decrement algorithm
NASA Technical Reports Server (NTRS)
Kiraly, L. J.
1982-01-01
The algorithm is useful for measuring net system damping levels in stochastic processes and for the development of equivalent linearized system response models. The algorithm works by summing together all subrecords which occur after a predefined threshold level is crossed. The random decrement signature is normally developed by scanning stored data and adding subrecords together. The high speed implementation of the random decrement algorithm exploits the digital character of sampled data and uses fixed record lengths of 2^n samples to greatly speed up the process. The contribution of each data point to the random decrement signature is calculated only once and in the same sequence as the data were taken. A hardware implementation of the algorithm using random logic is diagrammed, and the process is shown to be limited only by the record size and the threshold crossing frequency of the sampled data. With a hardware cycle time of 200 ns and a 1024-point signature, a threshold crossing frequency of 5000 Hertz can be processed and a stably averaged signature presented in real time.
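The core of the basic method is a short loop: average every fixed-length subrecord that begins at an upward threshold crossing. The sketch below (plain Python; a noise-driven, lightly damped oscillator as stand-in data) illustrates the summation only, not the fixed-record hardware implementation.

```python
import random

def random_decrement(x, threshold, length):
    """Average all length-sample subrecords starting where the signal
    crosses the threshold upward; the average is the RD signature."""
    sig = [0.0] * length
    count = 0
    for i in range(1, len(x) - length):
        if x[i - 1] < threshold <= x[i]:        # upward threshold crossing
            for j in range(length):
                sig[j] += x[i + j]
            count += 1
    return [s / count for s in sig], count

# Illustrative stochastic response: a lightly damped oscillator driven by noise.
rng = random.Random(0)
y, v, data = 0.0, 0.0, []
for _ in range(20000):
    a = -0.02 * v - 0.05 * y + rng.gauss(0.0, 0.1)
    v += a
    y += v
    data.append(y)
sig, n = random_decrement(data, threshold=1.0, length=128)
```

The random forcing averages out across subrecords, so the signature that remains approximates the free decay from the threshold level, from which damping can be read off.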
Hallway gossip between Ras and PI3K pathways.
Emanuel, Peter D
2014-05-01
In this issue of Blood, Goodwin et al investigate the pathogenesis of juvenile myelomonocytic leukemia (JMML), demonstrating that mutant Shp2 induces granulocyte macrophage-colony-stimulating factor (GM-CSF) hypersensitivity and that the p110δ subunit of phosphatidylinositol 3-kinase (PI3K) further promotes this dysregulation.
Workplace Relations: Friendship Patterns and Consequences (According to Managers).
ERIC Educational Resources Information Center
Berman, Evan M.; West, Jonathan P.; Richter, Maurice N., Jr.
2002-01-01
A survey of 222 large-city administrators indicated that 76.7% approved of co-worker friendships but fewer supported friendships across the vertical hierarchy. Despite risks (gossip, romances, distraction), there was strong agreement on rewards: mutual support, improved work environment and communication, support for diversity. Ways to encourage…
Relational Aggression among Students
ERIC Educational Resources Information Center
Young, Ellie L.; Nelson, David A.; Hottle, America B.; Warburton, Brittney; Young, Bryan K.
2011-01-01
"Relational aggression" refers to harm within relationships caused by covert bullying or manipulative behavior. Examples include isolating a youth from his or her group of friends (social exclusion), threatening to stop talking to a friend (the silent treatment), or spreading gossip and rumors by email. This type of bullying tends to be…
Predictors of Satisfaction in Geographically Close and Long-Distance Relationships
ERIC Educational Resources Information Center
Lee, Ji-yeon; Pistole, M. Carole
2012-01-01
In this study, the authors examined geographically close (GCRs) and long-distance (LDRs) romantic relationship satisfaction as explained by insecure attachment, self-disclosure, gossip, and idealization. After college student participants (N = 536) completed a Web survey, structural equation modeling (SEM) multigroup analysis revealed that the GCR…
Relational Aggression: A Different Kind of Bullying.
ERIC Educational Resources Information Center
Mullin-Rindler, Nancy
2003-01-01
Relational aggression is a form of bullying that includes both overt name-calling and verbal attacks as well as such indirect strategies as spreading gossip and rumors, manipulating friendships, or intentionally excluding or isolating someone. Educators must become more attuned and vigilant in their responses to it. Offers successful strategies.…
Avoiding School Management Conflicts and Crisis through Formal Communication
ERIC Educational Resources Information Center
Nwogbaga, David M. E.; Nwankwo, Oliver U.; Onwa, Doris O.
2015-01-01
This paper examined how conflicts and crisis can be avoided through formal communication. It was necessitated by the observation that most of the conflicts and crisis which tend to mar school management today are functions of the inconsistencies arising from "grapevines, rumours, and gossips" generated through informal communication…
Cliché, Gossip, and Anecdote as Supervision Training
ERIC Educational Resources Information Center
Grealy, Liam
2016-01-01
This article expands on a co-authored project with Timothy Laurie on the practices and ethics of higher degree research (HDR) supervision (or advising): "What does good HDR supervision look like?" in contemporary universities. It connects that project with scholarship on the relevance of "common sense" to questions of…
A Random Forest-based ensemble method for activity recognition.
Feng, Zengtao; Mo, Lingfei; Li, Meng
2015-01-01
This paper presents a multi-sensor ensemble approach to human physical activity (PA) recognition using random forests. We designed an ensemble learning algorithm that integrates several independent Random Forest classifiers, each based on a different sensor feature set, to build a more stable, more accurate and faster classifier for human activity recognition. To evaluate the algorithm, PA data from PAMAP (Physical Activity Monitoring for Aging People), a standard, publicly available database, were used for training and testing. The experimental results show that the algorithm correctly recognizes 19 PA types with an accuracy of 93.44%, while training is faster than with other methods. The ensemble classifier system based on the Random Forest (RF) algorithm achieves high recognition accuracy and fast calculation.
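As an illustrative sketch only: bagged decision stumps with majority voting stand in for full Random Forest trees, and a made-up one-feature dataset stands in for the PAMAP sensor features. The structure (bootstrap sampling, independent randomized learners, vote aggregation) is the point.

```python
import random
from collections import Counter

def train_stump(data, rng):
    """One randomized learner: bootstrap-sample the data, pick a random
    feature and threshold, and label each side by its majority class."""
    sample = [rng.choice(data) for _ in data]            # bootstrap sample
    f = rng.randrange(len(sample[0][0]))
    thr = rng.choice(sample)[0][f]
    left = Counter(y for x, y in sample if x[f] <= thr)
    right = Counter(y for x, y in sample if x[f] > thr)
    l = left.most_common(1)[0][0]                        # left is never empty
    r = right.most_common(1)[0][0] if right else l
    return f, thr, l, r

def predict(ensemble, x):
    votes = Counter(l if x[f] <= thr else r for f, thr, l, r in ensemble)
    return votes.most_common(1)[0][0]

rng = random.Random(5)
# Toy separable data: class 0 has feature values 0..9, class 1 has 10..19.
data = [((float(i % 10 + (0 if i < 50 else 10)),), 0 if i < 50 else 1)
        for i in range(100)]
ensemble = [train_stump(data, rng) for _ in range(25)]
acc = sum(predict(ensemble, x) == y for x, y in data) / len(data)
```

In the paper's setting each constituent classifier would be a full Random Forest trained on one sensor's feature set, with the same vote-aggregation step on top.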
A random walk approach to quantum algorithms.
Kendon, Vivien M
2006-12-15
The development of quantum algorithms based on quantum versions of random walks is placed in the context of the emerging field of quantum computing. Constructing a suitable quantum version of a random walk is not trivial; pure quantum dynamics is deterministic, so randomness only enters during the measurement phase, i.e. when converting the quantum information into classical information. The outcome of a quantum random walk is very different from the corresponding classical random walk owing to the interference between the different possible paths. The upshot is that quantum walkers find themselves further from their starting point than a classical walker on average, and this forms the basis of a quantum speed up, which can be exploited to solve problems faster. Surprisingly, the effect of making the walk slightly less than perfectly quantum can optimize the properties of the quantum walk for algorithmic applications. Looking to the future, even with a small quantum computer available, the development of quantum walk algorithms might proceed more rapidly than it has, especially for solving real problems.
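The ballistic spreading can be checked numerically with the standard discrete-time Hadamard walk on a line (a textbook construction, not any specific algorithm from this review): after T steps the quantum walker's position spread grows linearly in T, versus the classical square root of T.

```python
import math

def hadamard_walk(steps):
    """Discrete-time quantum walk on the integer line: amplitudes are indexed
    by (position, coin), the Hadamard coin mixes the two coin states, and the
    shift moves coin 0 left and coin 1 right."""
    h = 1.0 / math.sqrt(2.0)
    amp = {(0, 0): h, (0, 1): 1j * h}        # symmetric initial coin state
    for _ in range(steps):
        new = {}
        for (x, c), a in amp.items():
            # Hadamard coin: |0> -> (|0>+|1>)/sqrt2, |1> -> (|0>-|1>)/sqrt2
            for c2, w in ((0, h), (1, h if c == 0 else -h)):
                x2 = x - 1 if c2 == 0 else x + 1
                new[(x2, c2)] = new.get((x2, c2), 0.0) + w * a
        amp = new
    prob = {}
    for (x, c), a in amp.items():            # measurement: square amplitudes
        prob[x] = prob.get(x, 0.0) + abs(a) ** 2
    mean = sum(x * p for x, p in prob.items())
    var = sum((x - mean) ** 2 * p for x, p in prob.items())
    return prob, math.sqrt(var)

prob, sigma = hadamard_walk(50)
```

The interference between paths is visible in the spread: after 50 steps sigma is roughly half the step count, far beyond the classical value of about 7.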
NASA Astrophysics Data System (ADS)
Ma, Tianren; Xia, Zhengyou
2017-05-01
Currently, with the rapid development of information technology, electronic media for social communication are becoming more and more popular. Discovery of communities is a very effective way to understand the properties of complex networks. However, traditional community detection algorithms consider only the structural characteristics of a social organization, wasting additional information about nodes and edges. Meanwhile, these algorithms do not consider each node on its own merits. The label propagation algorithm (LPA) is a near linear time algorithm which aims to find communities in a network, and it attracts many scholars owing to its high efficiency. In recent years, many improved algorithms based on LPA have been put forward. In this paper, an improved LPA based on random walks and node importance (NILPA) is proposed. First, a node-importance list is calculated, and the nodes in the network are sorted in descending order of importance. On the basis of random walks, a matrix is constructed to measure the similarity of nodes, which avoids the random choice in LPA. Second, a new metric, IAS (importance and similarity), is calculated from the node importance and the similarity matrix; we use it to avoid the random selection in the original LPA and improve the algorithm's stability. Finally, tests on real-world and synthetic networks are given. The results show that this algorithm has better performance than existing methods in finding community structure.
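For reference, the plain LPA that this work builds on fits in a few lines; the node-importance list and random-walk similarity matrix that NILPA uses to replace LPA's random choices are not reproduced here, and the two-clique toy graph is an illustrative assumption.

```python
import random
from collections import Counter

def label_propagation(adj, iters=50, seed=3):
    """Plain LPA: every node repeatedly adopts the most frequent label among
    its neighbours, with ties broken at random; communities emerge as label
    consensus.  (NILPA replaces these random choices with importance/similarity.)"""
    rng = random.Random(seed)
    labels = {v: v for v in adj}
    nodes = list(adj)
    for _ in range(iters):
        rng.shuffle(nodes)
        changed = False
        for v in nodes:
            counts = Counter(labels[u] for u in adj[v])
            best = max(counts.values())
            choice = rng.choice([l for l, c in counts.items() if c == best])
            if choice != labels[v]:
                labels[v], changed = choice, True
        if not changed:
            break
    return labels

# Two 4-cliques joined by one bridge edge (3-4).
adj = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4],
       4: [3, 5, 6, 7], 5: [4, 6, 7], 6: [4, 5, 7], 7: [4, 5, 6]}
labels = label_propagation(adj)
```

The random shuffle and random tie-breaking are exactly the instability that NILPA's importance ordering and IAS metric are designed to remove.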
NASA Astrophysics Data System (ADS)
Gong, Lihua; Deng, Chengzhi; Pan, Shumin; Zhou, Nanrun
2018-07-01
Based on a hyper-chaotic system and the discrete fractional random transform (DFrRT), an image compression-encryption algorithm is designed. The original image is first transformed into a spectrum by the discrete cosine transform, and the resulting spectrum is compressed by spectrum cutting. The random matrix of the DFrRT is controlled by a chaotic sequence originating from the high dimensional hyper-chaotic system. The compressed spectrum is then encrypted by the DFrRT. The order of the DFrRT and the parameters of the hyper-chaotic system are the main keys of this image compression and encryption algorithm. The proposed algorithm can compress and encrypt image signals and, in particular, can encrypt multiple images at once. To achieve the compression of multiple images, the images are transformed into spectra by the discrete cosine transform, and the spectra are then incised and spliced into a composite spectrum by Zigzag scanning. Simulation results demonstrate that the proposed image compression and encryption algorithm offers high security and good compression performance.
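A heavily simplified sketch of the keystream idea only: a one-dimensional logistic map stands in for the hyper-chaotic system, and a plain XOR stands in for the DCT compression and discrete fractional random transform stages, so this illustrates key sensitivity and reversibility, nothing more.

```python
def logistic_keystream(x0, r, n, burn=100):
    """Byte keystream from the logistic map x <- r*x*(1-x); x0 and r play
    the role of the secret key (toy stand-in for a hyper-chaotic sequence)."""
    x = x0
    for _ in range(burn):                 # discard the transient
        x = r * x * (1 - x)
    ks = []
    for _ in range(n):
        x = r * x * (1 - x)
        ks.append(int(x * 256) % 256)
    return ks

def xor_cipher(data, x0=0.387, r=3.99):
    """XOR with the chaotic keystream; applying it twice recovers the input."""
    return bytes(b ^ k for b, k in zip(data, logistic_keystream(x0, r, len(data))))

plain = b"stand-in for a compressed image spectrum"
cipher = xor_cipher(plain)
```

A tiny change in x0 yields an unrelated keystream after the burn-in, which is the sensitivity the main keys of such schemes rely on.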
Fast self contained exponential random deviate algorithm
NASA Astrophysics Data System (ADS)
Fernández, Julio F.
1997-03-01
An algorithm that generates random numbers with an exponential distribution and is about ten times faster than other well known algorithms has been reported before (J. F. Fernández and J. Rivero, Comput. Phys. 10, 83 (1996)). That algorithm requires input of uniform random deviates. We now report a new version of it that needs no input and is nearly as fast. The only limitation we predict thus far for the quality of the output is the amount of computer memory available. Performance results under various tests will be reported. The algorithm works in close analogy to the setup that is often used in statistical physics in order to obtain the Gibbs distribution. N numbers, stored in N registers, change with time according to the rules of the algorithm, keeping their sum constant. Further details will be given.
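For context, the textbook baseline such generators are measured against is inverse-transform sampling, which consumes one uniform deviate and one logarithm per draw (exactly the costs the reported algorithm avoids):

```python
import math
import random

def exponential_deviate(rng):
    """Inverse-transform method: if U ~ Uniform(0,1) then -ln(U) ~ Exp(1)."""
    u = rng.random()
    while u == 0.0:          # guard the zero that would break the log
        u = rng.random()
    return -math.log(u)

rng = random.Random(42)
sample = [exponential_deviate(rng) for _ in range(100000)]
mean = sum(sample) / len(sample)   # should be close to 1 for Exp(1)
```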
NASA Astrophysics Data System (ADS)
Voytishek, Anton V.; Shipilov, Nikolay M.
2017-11-01
In this paper, a systematization of numerical (computer-implemented) randomized functional algorithms for approximating the solution of a Fredholm integral equation of the second kind is carried out. Three types of such algorithms are distinguished: the projection, the mesh and the projection-mesh methods. The possibilities of using these algorithms to solve practically important problems are investigated in detail. A disadvantage of the mesh algorithms is identified: they require calculating the values of the kernels of the integral equations at fixed points. In practice, these kernels have integrable singularities, and calculation of their values there is impossible. Thus, for applied problems involving Fredholm integral equations of the second kind, it is expedient to use not the mesh but the projection and projection-mesh randomized algorithms.
Fast Constrained Spectral Clustering and Cluster Ensemble with Random Projection
Liu, Wenfen
2017-01-01
The constrained spectral clustering (CSC) method can greatly improve clustering accuracy by incorporating constraint information into spectral clustering, and it has thus received wide academic attention. In this paper, we propose a fast CSC algorithm that encodes landmark-based graph construction into a new CSC model and applies random sampling to decrease the data size after spectral embedding. Compared with the original model, the new algorithm yields similar results as its model size increases asymptotically; compared with the most efficient CSC algorithm known, the new algorithm runs faster and suits a wider range of data sets. Meanwhile, a scalable semisupervised cluster ensemble algorithm is also proposed by combining our fast CSC algorithm with random-projection dimensionality reduction in the spectral ensemble clustering process. We demonstrate, through theoretical analysis and empirical results, that the new cluster ensemble algorithm has advantages in terms of efficiency and effectiveness. Furthermore, the approximate preservation of clustering accuracy under random projection, proved in the consensus clustering stage, also holds for weighted k-means clustering, and thus gives a theoretical guarantee for this special kind of k-means clustering in which each point has its own weight. PMID:29312447
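The random-projection ingredient can be sketched on its own: a Gaussian matrix scaled by 1/sqrt(k) approximately preserves pairwise distances (the Johnson-Lindenstrauss property), which is what lets the ensemble stage work in far fewer dimensions. Sizes and seeds below are illustrative.

```python
import math
import random

def random_projection(points, k, seed=0):
    """Project d-dimensional points to k dimensions with a Gaussian matrix
    scaled by 1/sqrt(k), so pairwise distances are approximately preserved."""
    rng = random.Random(seed)
    d = len(points[0])
    R = [[rng.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(d)] for _ in range(k)]
    return [[sum(row[j] * p[j] for j in range(d)) for row in R] for p in points]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

rng = random.Random(1)
pts = [[rng.gauss(0.0, 1.0) for _ in range(200)] for _ in range(10)]
proj = random_projection(pts, 100)                  # 200 -> 100 dimensions
ratio = dist(proj[0], proj[1]) / dist(pts[0], pts[1])
```

Because distances survive the projection up to a small multiplicative distortion, distance-based clustering run on the projected points gives nearly the same partition at a fraction of the cost.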
Seismic waveform tomography with shot-encoding using a restarted L-BFGS algorithm.
Rao, Ying; Wang, Yanghua
2017-08-17
In seismic waveform tomography, or full-waveform inversion (FWI), one effective strategy used to reduce the computational cost is shot-encoding, which encodes all shots randomly and sums them into one super shot to significantly reduce the number of wavefield simulations in the inversion. However, this process will induce instability in the iterative inversion regardless of whether it uses a robust limited-memory BFGS (L-BFGS) algorithm. The restarted L-BFGS algorithm proposed here is both stable and efficient. This breakthrough ensures, for the first time, the applicability of advanced FWI methods to three-dimensional seismic field data. In a standard L-BFGS algorithm, if the shot-encoding remains unchanged, it will generate a crosstalk effect between different shots. This crosstalk effect can only be suppressed by employing sufficient randomness in the shot-encoding. Therefore, the implementation of the L-BFGS algorithm is restarted at every segment. Each segment consists of a number of iterations; the first few iterations use an invariant encoding, while the remainder use random re-coding. This restarted L-BFGS algorithm balances the computational efficiency of shot-encoding, the convergence stability of the L-BFGS algorithm, and the inversion quality characteristic of random encoding in FWI.
Sharing Teaching Ideas: Active Participation in the Classroom through Creative Problem Generation.
ERIC Educational Resources Information Center
Gonzales, Nancy A.; And Others
1996-01-01
Presents an activity to involve students in mathematical communication and creative thinking. The activity is similar to the "pass it along" gossip game in which each person in a chain adds a piece of information. The class analyzes the resulting mathematics problem using George Polya's problem-solving techniques. (MKR)
Developing Appreciation for Sarcasm and Sarcastic Gossip: It Depends on Perspective
ERIC Educational Resources Information Center
Glenwright, Melanie; Tapley, Brent; Rano, Jacqueline K. S.; Pexman, Penny M.
2017-01-01
Background: Speakers use sarcasm to criticize others and to be funny; the indirectness of sarcasm protects the addressee's face (Brown & Levinson, 1987). Thus, appreciation of sarcasm depends on the ability to consider perspectives. Purpose: We investigated development of this ability from late childhood into adulthood and examined effects of…
Catch It Low to Prevent It High: Countering Low-Level Verbal Abuse.
ERIC Educational Resources Information Center
Goldstein, Arnold P.
2000-01-01
Focuses on the low-level aggression of verbal abuse demonstrated by children and adolescents. Describes the teasing, cursing, gossip, and ostracism associated with verbal abuse. Provides strategies for working with youth who are verbally aggressive including, how to reduce verbal maltreatment, how to engage in constructive communication, and ways…
Puritan Day: A Social Science Simulation
ERIC Educational Resources Information Center
Schur, Joan Brodsky
2007-01-01
Most students assume that a thriving society runs smoothly because people abide by the laws. But there are various informal, as well as formal, means of social control such as gossip, ridicule, and shame that function even in complex societies to achieve social control, or conformity to group norms. Good teaching ideas have the potential to lead…
Gossypiboma versus Gossip-Boma.
Singh, Charanjeet; Gupta, Mamta
2011-01-01
Gossypiboma, or a retained surgical sponge, is a rare condition, and it can occur after any surgical intervention that requires use of internal swabs. A case of an eight-year-old girl is presented, who had right minithoracotomy for ASD closure. She was finally diagnosed to have a retained surgical sponge in the right pleural cavity.
ERIC Educational Resources Information Center
Young, Jeffrey R.
2009-01-01
Twitter is quickly becoming a global faculty lounge. Sure, it's easy to waste a lot of time on the Internet-based microblogging service reading mundane details about people's days. But one can also pick up some great higher-education gossip, track down colleagues to collaborate with, or get advice on how to improve one's teaching or research. In…
Media Literacy: 21st Century Learning
ERIC Educational Resources Information Center
Baker, Frank W.
2011-01-01
The media, for better or worse, deliver the news and the gossip; they entertain, educate and inform. The media have not always been in American classrooms. Yes, teachers teach with media, but rarely do they teach "about" the media. It's called media literacy. Most students are not receiving adequate media literacy instruction, mostly because their…
ERIC Educational Resources Information Center
Blake-Beard, Stacy D.
2001-01-01
Comparison of women in formal and informal mentoring relationships showed that formal mentoring often led to unrealistic expectations; unbalanced focus on proteges; difficulty managing relationships among supervisors, proteges, and mentors; and damage from gossip. Informal mentoring may provide psychosocial and career support without these…
A Cellular Automata Approach to Computer Vision and Image Processing.
1980-09-01
the ACM, vol. 15, no. 9, pp. 827-837. [Duda and Hart] R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis, Wiley, New York, 1973...Center TR-738, 1979. [Farley] Arthur M. Farley and Andrzej Proskurowski, "Gossiping in Grid Graphs", University of Oregon Computer Science Department CS-TR
Spellings Seeks to Cast Her Glow over NCLB
ERIC Educational Resources Information Center
Hoff, David J.
2008-01-01
In her three years as U.S. secretary of education, Margaret Spellings has been a celebrity contestant on "Jeopardy!," a guest on "The Daily Show With Jon Stewart," and an occasional subject of Washington's best-read gossip column. Above all, though, she's been the nation's leading spokeswoman for the No Child Left Behind Act…
Breaking Bad in Business Education: Impacts on Student Incivility and Academic Dishonesty
ERIC Educational Resources Information Center
Offstein, Evan H.; Chory, Rebecca M.
2017-01-01
The present study examines instructors' attempts to increase student satisfaction through what we predict to be destructive communication tactics. Results indicate that business majors reported being more likely to engage in incivility and academic dishonesty in courses taught by professors who attempted to gain student favor through gossiping,…
ERIC Educational Resources Information Center
Boulton, Michael J.; Chau, Cam; Whitehand, Caroline; Amataya, Kishori; Murray, Lindsay
2009-01-01
Background: Prior studies outside of the UK have shown that peer victimization is negatively associated with school adjustment. Aims: To examine concurrent and short-term longitudinal associations between peer victimization (physical, malicious teasing, deliberate social exclusion, and malicious gossiping) and two measures of school adjustment…
Strategising as a Complex Responsive Leadership Process
ERIC Educational Resources Information Center
Groot, Nol; Homan, Thijs H.
2012-01-01
This paper, based on a narrative of one of the authors, explores management reality where a chosen strategy developed into a different direction than expected. The authors offer an insight in a manager's daily struggle, where power, gossip and conflict can influence the strategising process. The plans and strategic ambitions chosen at the outset…
ERIC Educational Resources Information Center
O'Bannion, Colette Marie
2010-01-01
A reader might assume contemporary society has progressed beyond literary censorship. However, as recently as 2008, the "Gossip Girl" and "Twilight" young adult literature series both faced challenges in distinct sectors of United States society (American Library Association (ALA), 2009: Martindale, 2008). A number of concerned…
Value homophily benefits cooperation but motivates employing incorrect social information.
Rauwolf, Paul; Mitchell, Dominic; Bryson, Joanna J
2015-02-21
Individuals often judge others based on third-party gossip, rather than their own experience, despite the fact that gossip is error-prone. Rather than judging others on their merits, even when such knowledge is free, we judge based on the opinions of third parties. Here we seek to understand this observation in the context of the evolution of cooperation. If individuals are being judged on noisy social reputations rather than on merit, then agents might exploit this, eroding the sustainability of cooperation. We employ a version of the Prisoner's Dilemma, the Donation game, which has been used to simulate the evolution of cooperation through indirect reciprocity. First, we validate the proposition that adding homophily (the propensity to interact with others of similar beliefs) into a society increases the sustainability of cooperation. However, this creates an evolutionary conflict between the accurate signalling of ingroup status versus the veridical report of the behaviour of other agents. We find that conditions exist where signalling ingroup status outweighs honesty as the best method to ultimately spread cooperation. Copyright © 2014 Elsevier Ltd. All rights reserved.
An Extended Deterministic Dendritic Cell Algorithm for Dynamic Job Shop Scheduling
NASA Astrophysics Data System (ADS)
Qiu, X. N.; Lau, H. Y. K.
The problem of job shop scheduling in a dynamic environment, where random perturbations exist in the system, is studied. In this paper, an extended deterministic Dendritic Cell Algorithm (dDCA) is proposed to solve such a dynamic Job Shop Scheduling Problem (JSSP), in which unexpected events occur randomly. The algorithm is designed based on dDCA and improves on it by considering all types of signals and the magnitude of the output values. To evaluate the algorithm, ten benchmark problems are chosen and different kinds of disturbances are injected randomly. The results show that the algorithm performs competitively: it triggers the rescheduling process close to optimally, with much less run time needed to decide the rescheduling action. As such, the proposed algorithm is able to minimize the number of reschedulings under the defined objective and to keep the scheduling process stable and efficient.
Typical performance of approximation algorithms for NP-hard problems
NASA Astrophysics Data System (ADS)
Takabe, Satoshi; Hukushima, Koji
2016-11-01
Typical performance of approximation algorithms is studied for randomized minimum vertex cover problems. A wide class of random graph ensembles characterized by an arbitrary degree distribution is discussed with the presentation of a theoretical framework. Herein, three approximation algorithms are examined: linear-programming relaxation, loopy-belief propagation, and the leaf-removal algorithm. The former two algorithms are analyzed using a statistical-mechanical technique, whereas the average-case analysis of the last one is conducted using the generating function method. These algorithms have a threshold in the typical performance with increasing average degree of the random graph, below which they find true optimal solutions with high probability. Our study reveals that there exist only three cases, determined by the order of the typical performance thresholds. In addition, we provide some conditions for classification of the graph ensembles and demonstrate explicitly some examples for the difference in thresholds.
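Of the three, the leaf-removal algorithm is simple enough to sketch directly: while some vertex has degree one, put its unique neighbour into the cover and delete both; an empty leftover core certifies that the cover found is optimal, and the performance threshold is about when a large core first appears. The toy graphs here are illustrative, not the random ensembles analysed in the paper.

```python
def leaf_removal_cover(adj):
    """Leaf removal for minimum vertex cover.  Returns (cover, core): the
    vertices chosen so far and the leafless core left when no degree-one
    vertex remains (an empty core certifies an optimal cover)."""
    g = {v: set(ns) for v, ns in adj.items()}
    cover = set()
    changed = True
    while changed:
        changed = False
        for v in list(g):
            if v in g and len(g[v]) == 1:      # v is a leaf
                (u,) = g[v]                    # its unique neighbour
                cover.add(u)
                for w in list(g.get(u, ())):   # delete u and all its edges
                    g[w].discard(u)
                g.pop(u, None)
                g.pop(v, None)
                changed = True
        for v in [v for v in g if not g[v]]:   # drop isolated vertices
            g.pop(v)
    return cover, set(g)

path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
cover_p, core_p = leaf_removal_cover(path)     # tree: core empties out
tri = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
cover_t, core_t = leaf_removal_cover(tri)      # no leaves: everything is core
```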
An Efficient Randomized Algorithm for Real-Time Process Scheduling in PicOS Operating System
NASA Astrophysics Data System (ADS)
Helmy, Tarek; Fatai, Anifowose; Sallam, El-Sayed
PicOS is an event-driven operating environment designed for use with embedded networked sensors. More specifically, it is designed to support the concurrency-intensive operations required by networked sensors with minimal hardware requirements. The existing process scheduling algorithms of PicOS, a commercial tiny, low-footprint, real-time operating system, have associated drawbacks. An efficient alternative algorithm, based on a randomized selection policy, has been proposed, demonstrated, and confirmed to be efficient and fair on average, and is recommended for implementation in PicOS. Simulations were carried out, and performance measures such as Average Waiting Time (AWT) and Average Turn-around Time (ATT) were used to assess the efficiency of the proposed randomized version against the existing ones. The results show that the randomized algorithm is the most attractive for implementation in PicOS, since it is the fairest and has the least AWT and ATT on average among the non-preemptive scheduling algorithms implemented in this paper.
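The two measures are straightforward to reproduce for a toy non-preemptive workload; the burst times, seed and trial count below are invented for illustration, not taken from the paper's simulations.

```python
import random

def simulate(bursts, order):
    """Run jobs (all arriving at t = 0) non-preemptively in the given order;
    return (average waiting time, average turn-around time)."""
    t, wait, turn = 0, 0, 0
    for i in order:
        wait += t          # job i waits until everything before it finishes
        t += bursts[i]
        turn += t          # turn-around = completion time for t = 0 arrivals
    n = len(bursts)
    return wait / n, turn / n

def randomized_order(n, rng):
    order = list(range(n))
    rng.shuffle(order)     # the randomized selection policy
    return order

bursts = [8, 4, 9, 5, 2]
rng = random.Random(7)
awt_fcfs, att_fcfs = simulate(bursts, range(len(bursts)))
trials = [simulate(bursts, randomized_order(len(bursts), rng)) for _ in range(1000)]
awt_rand = sum(a for a, _ in trials) / len(trials)
```

Averaged over permutations, the randomized policy's expected AWT is (n-1)/(2n) times the total burst time (11.2 here), sitting between FCFS on this order (13.4) and the shortest-job-first optimum (7.6).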
A weighted belief-propagation algorithm for estimating volume-related properties of random polytopes
NASA Astrophysics Data System (ADS)
Font-Clos, Francesc; Massucci, Francesco Alessandro; Pérez Castillo, Isaac
2012-11-01
In this work we introduce a novel weighted message-passing algorithm based on the cavity method for estimating volume-related properties of random polytopes, properties which are relevant in research fields ranging from metabolic networks to neural networks to compressed sensing. Rather than adopting the usual approach of approximating the real-valued cavity marginal distributions by a few parameters, we propose an algorithm that faithfully represents the entire marginal distribution. We explain various alternatives for implementing the algorithm and benchmark the theoretical findings with concrete applications to random polytopes. The results obtained with our approach are found to be in very good agreement with the estimates produced by the Hit-and-Run algorithm, which is known to produce uniform samples.
Magnetic localization and orientation of the capsule endoscope based on a random complex algorithm.
He, Xiaoqi; Zheng, Zizhao; Hu, Chao
2015-01-01
The development of the capsule endoscope has made possible the examination of the whole gastrointestinal tract without much pain. However, some important problems remain to be solved, one of which is the localization of the capsule. Currently, magnetic positioning technology is a suitable method for capsule localization, and it depends on a reliable system and algorithm. In this paper, based on the magnetic dipole model and a magnetic sensor array, we propose a nonlinear optimization algorithm using a random complex algorithm, applied to the optimization calculation for the nonlinear function of the dipole, to determine the three-dimensional position parameters and two-dimensional direction parameters. The stability and the anti-noise ability of the algorithm are compared with those of the Levenberg-Marquardt algorithm. The simulation and experiment results show that, in terms of the error level of the initial guess of the magnet location, the random complex algorithm is more accurate and more stable, and has a higher "denoise" capacity, with a larger range of valid initial guess values.
NASA Astrophysics Data System (ADS)
Jia, Zhongxiao; Yang, Yanfei
2018-05-01
In this paper, we propose new randomization-based algorithms for large-scale linear discrete ill-posed problems with general-form regularization: min ||Lx||_2 subject to ||Ax - b||_2 = min, where L is a regularization matrix. Our algorithms are inspired by the modified truncated singular value decomposition (MTSVD) method, which suits only small to medium scale problems, and by randomized SVD (RSVD) algorithms that generate good low-rank approximations to A. We obtain rank-k truncated randomized SVD (TRSVD) approximations to A by truncating rank-(k + q) RSVD approximations to A, where q is an oversampling parameter. The resulting algorithms are called modified TRSVD (MTRSVD) methods. At every step, we use the LSQR algorithm to solve the resulting inner least squares problem, which is proved to become better conditioned as k increases, so that LSQR converges faster. We present sharp bounds for the approximation accuracy of the RSVDs and TRSVDs for severely, moderately and mildly ill-posed problems, and substantially improve a known basic bound for TRSVD approximations. We prove how to choose the stopping tolerance for LSQR in order to guarantee that the computed and exact best regularized solutions have the same accuracy. Numerical experiments illustrate that the best regularized solutions by MTRSVD are as accurate as those by the truncated generalized singular value decomposition (TGSVD) algorithm, and at least as accurate as those by some existing truncated randomized generalized singular value decomposition (TRGSVD) algorithms. This work was supported in part by the National Science Foundation of China (Nos. 11771249 and 11371219).
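The randomized SVD core that MTRSVD-style methods build on can be sketched in a few lines. This is a generic Gaussian range-finder with oversampling followed by a rank-k truncation, not the authors' MTRSVD itself; the function name and parameters are illustrative:

```python
import numpy as np

def rsvd(A, k, q=5, seed=0):
    """Basic randomized SVD: build a rank-(k+q) sketch, truncate to rank k.

    q is the oversampling parameter; a larger q gives a better estimate
    of the dominant range of A.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Sample the range of A with a Gaussian test matrix.
    Omega = rng.standard_normal((n, k + q))
    Q, _ = np.linalg.qr(A @ Omega)           # orthonormal basis for range(A @ Omega)
    B = Q.T @ A                              # small (k+q) x n projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    # Truncate the rank-(k+q) approximation to rank k (the "TRSVD" step).
    return U[:, :k], s[:k], Vt[:k, :]

# Example: a matrix of exact rank 3 is recovered accurately with k = 3.
rng = np.random.default_rng(1)
A = rng.standard_normal((60, 3)) @ rng.standard_normal((3, 40))
U, s, Vt = rsvd(A, k=3)
err = np.linalg.norm(A - U @ (s[:, None] * Vt)) / np.linalg.norm(A)
```

Because A here has exact rank 3, the sketch captures its range almost surely and the relative error is at the level of round-off.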
An Image Encryption Algorithm Based on Information Hiding
NASA Astrophysics Data System (ADS)
Ge, Xin; Lu, Bin; Liu, Fenlin; Gong, Daofu
To resolve the conflict between security and efficiency in the design of chaotic image encryption algorithms, an image encryption algorithm based on information hiding is proposed, following the "one-time pad" idea. A random parameter is introduced to ensure a different keystream for each encryption, which gives the scheme "one-time pad" characteristics and markedly improves the security of the algorithm without a significant increase in complexity. The random parameter is embedded into the ciphered image with information-hiding technology, which avoids negotiating its transport and makes the algorithm easier to apply. Algorithm analysis and experiments show that the algorithm is secure against chosen-plaintext attack, differential attack and divide-and-conquer attack, and has good statistical properties in ciphered images.
NASA Astrophysics Data System (ADS)
Bassa, Zaakirah; Bob, Urmilla; Szantoi, Zoltan; Ismail, Riyad
2016-01-01
In recent years, the popularity of tree-based ensemble methods for land cover classification has increased significantly. Using WorldView-2 image data, we evaluate the potential of the oblique random forest algorithm (oRF) to classify a highly heterogeneous protected area. In contrast to the random forest (RF) algorithm, the oRF algorithm builds multivariate trees by learning the optimal split using a supervised model. The oRF binary algorithm is adapted to a multiclass land cover and land use application using both the "one-against-one" and "one-against-all" combination approaches. Results show that the oRF algorithms are capable of achieving high classification accuracies (>80%). However, there was no statistical difference in classification accuracies obtained by the oRF algorithms and the more popular RF algorithm. For all the algorithms, user accuracies (UAs) and producer accuracies (PAs) >80% were recorded for most of the classes. Both the RF and oRF algorithms poorly classified the indigenous forest class as indicated by the low UAs and PAs. Finally, the results from this study advocate and support the utility of the oRF algorithm for land cover and land use mapping of protected areas using WorldView-2 image data.
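The "one-against-one" combination mentioned above reduces a multiclass problem to pairwise voting among binary classifiers. A minimal sketch with a hypothetical predictor interface (the class names and thresholds are invented for illustration, not taken from the study):

```python
def one_against_one_predict(binary_predictors, x):
    """Combine binary classifiers into a multiclass decision by pairwise voting.

    `binary_predictors` maps a class pair (a, b) to a function returning the
    winning class for sample x; the class with the most votes wins overall.
    """
    votes = {}
    for (a, b), predict in binary_predictors.items():
        winner = predict(x)
        votes[winner] = votes.get(winner, 0) + 1
    return max(votes, key=votes.get)

# Toy example with three land cover classes on a single 1-D feature.
predictors = {
    ("forest", "grass"): lambda x: "forest" if x < 0.5 else "grass",
    ("forest", "water"): lambda x: "forest" if x < 0.7 else "water",
    ("grass", "water"):  lambda x: "grass" if x < 0.9 else "water",
}
label = one_against_one_predict(predictors, 0.3)
```

"One-against-all" replaces the pairwise dictionary with one classifier per class and picks the class with the highest decision score.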
Comparison of genetic algorithm methods for fuel management optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeChaine, M.D.; Feltus, M.A.
1995-12-31
The CIGARO system was developed for genetic algorithm fuel management optimization. Tests were performed to find the best fuel-location swap mutation operator probability and to compare the genetic algorithm to a truly random search method. The tests showed that the fuel swap probability should be between 0% and 10%, and that a 50% probability definitely hampered the optimization. The genetic algorithm performed significantly better than the random search method, which did not even satisfy the peak normalized power constraint.
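A fuel-location swap mutation of the kind whose probability is tuned here can be sketched as follows; representing a core loading as a flat list of assembly IDs is an assumption for illustration:

```python
import random

def swap_mutation(loading, p_swap, rng=random):
    """With probability p_swap, exchange two randomly chosen fuel locations.

    `loading` is a list of assembly IDs indexed by core position; the tests
    summarized above suggest keeping p_swap below about 0.10.
    """
    child = list(loading)                       # never mutate the parent in place
    if rng.random() < p_swap:
        i, j = rng.sample(range(len(child)), 2)
        child[i], child[j] = child[j], child[i]
    return child

random.seed(42)
core = list("ABCDEFGH")
mutated = swap_mutation(core, p_swap=1.0)       # p_swap=1.0 forces a swap for the demo
```

The operator is permutation-preserving: it reorders assemblies but never duplicates or drops one, which is what makes it suitable for loading-pattern chromosomes.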
Random Bits Forest: a Strong Classifier/Regressor for Big Data
NASA Astrophysics Data System (ADS)
Wang, Yi; Li, Yi; Pu, Weilin; Wen, Kathryn; Shugart, Yin Yao; Xiong, Momiao; Jin, Li
2016-07-01
Efficiency, memory consumption, and robustness are common problems with many popular methods for data analysis. As a solution, we present Random Bits Forest (RBF), a classification and regression algorithm that integrates neural networks (for depth), boosting (for width), and random forests (for prediction accuracy). Through a gradient boosting scheme, it first generates and selects ~10,000 small, 3-layer random neural networks. These networks are then fed into a modified random forest algorithm to obtain predictions. Testing with datasets from the UCI (University of California, Irvine) Machine Learning Repository shows that RBF outperforms other popular methods in both accuracy and robustness, especially with large datasets (N > 1000). The algorithm also performed well in testing with an independent data set, a real psoriasis genome-wide association study (GWAS).
Fast generation of sparse random kernel graphs
Hagberg, Aric; Lemons, Nathan; Du, Wen-Bo
2015-09-10
The development of kernel-based inhomogeneous random graphs has provided models that are flexible enough to capture many observed characteristics of real networks, and that are also mathematically tractable. We specify a class of inhomogeneous random graph models, called random kernel graphs, that produces sparse graphs with tunable graph properties, and we develop an efficient generation algorithm to sample random instances from this model. As real-world networks are usually large, it is essential that the run-time of generation algorithms scales better than quadratically in the number of vertices n. We show that for many practical kernels our algorithm runs in time at most O(n(log n)²). As an example, we show how to generate samples of power-law degree distribution graphs with tunable assortativity.
Fault Detection of Aircraft System with Random Forest Algorithm and Similarity Measure
Park, Wookje; Jung, Sikhang
2014-01-01
A fault detection algorithm was developed using a similarity measure and the random forest algorithm, and was applied to an unmanned aerial vehicle (UAV) that we prepared. The similarity measure was designed with the help of distance information, and its usefulness was verified by proof. Fault decisions were made by calculating a weighted similarity measure. Twelve available coefficients from the healthy- and faulty-status data groups were used to determine the decision. The similarity measure weights were obtained through the random forest algorithm (RFA), which provides data priority. To obtain a fast decision response, a limited number of coefficients was also considered. The relation between detection rate and the amount of feature data was analyzed and illustrated. Through repeated trials of the similarity calculation, the useful amount of data was obtained. PMID:25057508
ERIC Educational Resources Information Center
Kelly, Courtney R.
2012-01-01
This article describes an after-school program in which immigrant and urban low-income middle school students collaborated to create social maps of their school and to produce a multilingual video against gossip. These literacy-based projects combined critical pedagogy and culturally relevant pedagogy to promote meaningful interactions between…
ERIC Educational Resources Information Center
Willer, Erin K.
2009-01-01
Social aggression, including behaviors such as gossip and friendship manipulation, can be damaging to girls' individual and relational well-being. As a result, the purpose of the present dissertation study was to test a narrative sense-making metaphor intervention with middle schools girls experiencing social aggression in order to facilitate…
Doing What Your Big Sister Does: Sex, Postfeminism and the YA Chick Lit Series
ERIC Educational Resources Information Center
Bullen, Elizabeth; Toffoletti, Kim; Parsons, Liz
2011-01-01
Mass-marketed teen chick lit has become a publishing phenomenon and has begun to attract critical interest among children's literature scholars. Much of this critical work, however, has shied away from robust critical assessment of the postfeminist conditions informing the production and reception of young adult series like Private, Gossip Girl…
ERIC Educational Resources Information Center
Low, Sabina; Frey, Karin S.; Brockman, Callie J.
2010-01-01
Relational forms of aggression are known to increase during the middle school years. To date, the majority of efficacy studies of elementary school-based programs have focused on the reduction of physical and direct verbal aggression, to the exclusion of effects on relational aggression. "Steps to Respect: A Bullying Prevention Program" is one…
Using Social Media to Engage Youth: Education, Social Justice, & Humanitarianism
ERIC Educational Resources Information Center
Liang, Belle; Commins, Meghan; Duffy, Nicole
2010-01-01
While youth typically turn to social media for gossip, photo sharing, and friendship building, can it also be used to inspire them toward greater goals? The creators of GenerationPulse.com explore how two theories salient to adolescent social development (positive youth development and relational health) were used to shape a social media website…
The Bystander's Dilemma: How Can We Turn Our Students into Upstanders?
ERIC Educational Resources Information Center
Woglom, Lauren; Pennington, Kim
2010-01-01
While bullying is often accepted as an integral aspect of "growing up," it can have detrimental and lasting effects on its victims. Bullying can occur in a variety of forms, including direct teasing and threatening, the use of physical violence, and in the spreading of malicious gossip and rumors. With the proliferation of new technology, bullying…
NASA Astrophysics Data System (ADS)
van der Graaf, Harry
2009-07-01
The Gossip detector, a GridPix TPC equipped with a thin layer of gas, is a promising alternative to Si tracking detectors. In addition, GridPix would be an interesting way to read out the gaseous-phase volume of bi-phase liquid Xe cryostats in neutrinoless double beta decay and rare-event (i.e., WIMP) search experiments.
1994-06-01
The big news this week is that prefabricated buildings are no longer viewed as a poor substitute for traditional buildings. That's what it says here, anyway. In a press release headed 'Future looking bright for prefabricated building industry', Rovacabin ('leading suppliers of modular and portable buildings') says average turnover in the prefab building industry rose by 20 per cent in the second half of 1993.
Lessons in the Conversation That We Are: Robert Frost's "Death of the Hired Man."
ERIC Educational Resources Information Center
Jost, Walter
1996-01-01
Looks at Robert Frost's "The Death of the Hired Man" as a "representative anecdote" for Frost's work, which, taken as a whole, shows readers how to lose themselves among the overlooked places and turnings, the topics and tropes, that make up Frost's rhetorical home, the place of everyday human talk and gossip. (TB)
ERIC Educational Resources Information Center
Ryan, Pat
Epideictic rhetoric, expression of praise or blame, animates much communication, from gossip to sermons, from commercial ads to love letters. Even when writing for purposes other than to judge, writers often frame their talk with implicit or explicit expressions of praise for individuals or groups or ideas considered "good." Epideictic…
The Original Handhelds: Magazines that Teens Can't Resist.
ERIC Educational Resources Information Center
Webber, Carlie
2009-01-01
In a world of instant messages, Twitter, and Facebook, what do magazines have to offer teens? Well, as it turns out, plenty. For starters, they feature celebrity gossip, humor, beauty tips, sports, and even manga. Some magazines offer online content that can only be accessed by using a special code that's available in the print edition. Recently,…
JuicyCampus: Gone, and Best Forgotten
ERIC Educational Resources Information Center
Hornbeck, J. Patrick, II
2009-01-01
Two years ago, a former student of the author was raped. That should have been awful enough. But a few months later, his student discovered that her personal horror was being openly discussed--or, more accurately, mocked--on the gossip Web site JuicyCampus, where some of her classmates told her, and anyone else who happened to read the site, that…
Tengo una Bomba: The Paralinguistic and Linguistic Conventions of the Oral Practice Chismeando.
ERIC Educational Resources Information Center
Hall, Joan Kelly
1993-01-01
This article offers a linguistic and paralinguistic explication of the oral practice of chismeando (gossiping) as engaged in by a group of women from the Dominican Republic. A culture-specific study of the structuring resources by which the participants construct, maintain, and/or modify their in-group identities in everyday oral practice is…
Teh, Seng Khoon; Zheng, Wei; Lau, David P; Huang, Zhiwei
2009-06-01
In this work, we evaluated the diagnostic ability of near-infrared (NIR) Raman spectroscopy combined with an ensemble recursive partitioning algorithm based on random forests for distinguishing cancer from normal tissue in the larynx. A rapid-acquisition NIR Raman system was utilized for tissue Raman measurements at 785 nm excitation, and 50 human laryngeal tissue specimens (20 normal; 30 malignant tumors) were used for the NIR Raman studies. The random forests method was introduced to develop effective diagnostic algorithms for classification of Raman spectra of different laryngeal tissues. High-quality Raman spectra in the range of 800-1800 cm(-1) can be acquired from laryngeal tissue within 5 seconds. Raman spectra differed significantly between normal and malignant laryngeal tissues. Classification results obtained from the random forests algorithm on tissue Raman spectra yielded a diagnostic sensitivity of 88.0% and specificity of 91.4% for laryngeal malignancy identification. The random forests technique also provided variable importance measures that facilitate correlation of significant Raman spectral features with cancer transformation. This study shows that NIR Raman spectroscopy in conjunction with the random forests algorithm has great potential for the rapid diagnosis and detection of malignant tumors in the larynx.
Computer methods for sampling from the gamma distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, M.E.; Tadikamalla, P.R.
1978-01-01
Considerable attention has recently been directed at developing ever faster algorithms for generating gamma random variates on digital computers. This paper surveys the current state of the art including the leading algorithms of Ahrens and Dieter, Atkinson, Cheng, Fishman, Marsaglia, Tadikamalla, and Wallace. General random variate generation techniques are explained with reference to these gamma algorithms. Computer simulation experiments on IBM and CDC computers are reported.
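As an illustration of the acceptance-rejection structure shared by the surveyed generators, here is a compact gamma sampler using only the Python standard library. It implements Marsaglia and Tsang's later squeeze method, which postdates this survey and is not one of the algorithms compared in it:

```python
import math
import random

def gamma_variate(shape, rng=random):
    """Marsaglia-Tsang rejection sampler for Gamma(shape, 1), shape > 0."""
    if shape < 1.0:
        # Boosting identity: Gamma(a) = Gamma(a + 1) * U^(1/a)
        return gamma_variate(shape + 1.0, rng) * rng.random() ** (1.0 / shape)
    d = shape - 1.0 / 3.0
    c = 1.0 / math.sqrt(9.0 * d)
    while True:
        x = rng.gauss(0.0, 1.0)
        v = (1.0 + c * x) ** 3
        if v <= 0.0:
            continue
        u = rng.random()
        # Cheap "squeeze" acceptance test first, exact log test as fallback.
        if u < 1.0 - 0.0331 * x ** 4:
            return d * v
        if math.log(u) < 0.5 * x * x + d * (1.0 - v + math.log(v)):
            return d * v

random.seed(0)
samples = [gamma_variate(2.5) for _ in range(20000)]
mean = sum(samples) / len(samples)   # should be close to the shape parameter, 2.5
```

The squeeze test accepts the vast majority of proposals without evaluating a logarithm, which is exactly the kind of speed trick the surveyed algorithms compete on.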
Simple-random-sampling-based multiclass text classification algorithm.
Liu, Wuying; Wang, Lin; Yi, Mianzhu
2014-01-01
Multiclass text classification (MTC) is a challenging issue and the corresponding MTC algorithms can be used in many applications. The space-time overhead of these algorithms is a serious concern in the era of big data. Through an investigation of the token frequency distribution in a Chinese web document collection, this paper reexamines the power law and proposes a simple-random-sampling-based MTC (SRSMTC) algorithm. Supported by a token-level memory to store labeled documents, the SRSMTC algorithm uses a text retrieval approach to solve text classification problems. Experimental results on the TanCorp data set show that the SRSMTC algorithm can achieve state-of-the-art performance at greatly reduced space-time requirements.
Eroglu, Duygu Yilmaz; Ozmutlu, H Cenk
2014-01-01
We developed mixed integer programming (MIP) models and hybrid genetic-local search algorithms for the scheduling problem of unrelated parallel machines with job-sequence- and machine-dependent setup times and with the job splitting property. The first contribution of this paper is to introduce novel algorithms that perform splitting and scheduling simultaneously with a variable number of subjobs. We propose a simple chromosome structure constituted by random key numbers in the hybrid genetic-local search algorithm (GAspLA). Random key numbers are used frequently in genetic algorithms, but they create additional difficulty when hybrid factors in local search are implemented. We developed algorithms that adapt the results of local search into the genetic algorithm with a minimum relocation operation of the genes' random key numbers; this is the second contribution of the paper. The third contribution is three new MIP models that perform splitting and scheduling simultaneously. The fourth contribution is the implementation of GAspLAMIP, which lets us verify the optimality of GAspLA for the studied combinations. The proposed methods are tested on a set of problems taken from the literature and the results validate the effectiveness of the proposed algorithms.
Predicting the random drift of MEMS gyroscope based on K-means clustering and OLS RBF Neural Network
NASA Astrophysics Data System (ADS)
Wang, Zhen-yu; Zhang, Li-jie
2017-10-01
Measurement error of a sensor can be effectively compensated with prediction. To address the large random drift error of the MEMS (Micro-Electro-Mechanical System) gyroscope, an improved learning algorithm for the Radial Basis Function (RBF) Neural Network (NN) based on K-means clustering and Orthogonal Least Squares (OLS) is proposed in this paper. The algorithm first selects typical samples as the initial cluster centers of the RBF NN, then computes candidate centers with the K-means algorithm, and finally optimizes the candidate centers with the OLS algorithm, which makes the network structure simpler and the prediction performance better. Experimental results show that the proposed K-means clustering OLS learning algorithm can predict the random drift of a MEMS gyroscope effectively, with a prediction error of 9.8019e-007°/s and a prediction time of 2.4169e-006 s.
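The first stages of such a network, choosing RBF centers by clustering and then solving the linear output weights, can be sketched as follows. This omits the OLS center-selection refinement and uses a fixed Gaussian width, both simplifications relative to the proposed algorithm:

```python
import numpy as np

def kmeans_centers(x, k, iters=20, seed=0):
    """Plain K-means on 1-D samples to pick k RBF centers."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)
    for _ in range(iters):
        # Assign each sample to its nearest center, then recompute the means.
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return np.sort(centers)

def fit_rbf(x, y, centers, width):
    """Solve the linear output weights of a Gaussian RBF network by least squares."""
    Phi = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w, Phi

# Toy regression: approximate sin(x) with 10 clustered Gaussian centers.
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x)
centers = kmeans_centers(x, k=10)
w, Phi = fit_rbf(x, y, centers, width=0.8)
rmse = np.sqrt(np.mean((Phi @ w - y) ** 2))
```

Because the hidden layer is fixed once the centers are chosen, training reduces to a linear least-squares solve, which is what makes RBF networks attractive for drift prediction.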
NASA Astrophysics Data System (ADS)
Polan, Daniel F.; Brady, Samuel L.; Kaufman, Robert A.
2016-09-01
There is a need for robust, fully automated whole-body organ segmentation for diagnostic CT. This study investigates and optimizes a Random Forest algorithm for automated organ segmentation; explores the limitations of a Random Forest algorithm applied to the CT environment; and demonstrates segmentation accuracy in a feasibility study of pediatric and adult patients. To the best of our knowledge, this is the first study to investigate a trainable Weka segmentation (TWS) implementation using Random Forest machine learning as a means to develop a fully automated tissue segmentation tool developed specifically for pediatric and adult examinations in a diagnostic CT environment. Current innovation in computed tomography (CT) is focused on radiomics, patient-specific radiation dose calculation, and image quality improvement using iterative reconstruction, all of which require specific knowledge of tissue and organ systems within a CT image. The purpose of this study was to develop a fully automated Random Forest classifier algorithm for segmentation of neck-chest-abdomen-pelvis CT examinations based on pediatric and adult CT protocols. Seven materials were classified: background, lung/internal air or gas, fat, muscle, solid organ parenchyma, blood/contrast-enhanced fluid, and bone tissue, using Matlab and the TWS plugin of FIJI. The following classifier feature filters of TWS were investigated: minimum, maximum, mean, and variance, evaluated over a voxel radius of 2^n (n from 0 to 4), along with noise-reduction and edge-preserving filters: Gaussian, bilateral, Kuwahara, and anisotropic diffusion. The Random Forest algorithm used 200 trees with 2 features randomly selected per node. The optimized auto-segmentation algorithm resulted in 16 image features including features derived from maximum, mean, variance, Gaussian and Kuwahara filters.
Dice similarity coefficient (DSC) calculations between manually segmented and Random Forest algorithm segmented images from 21 patient image sections, were analyzed. The automated algorithm produced segmentation of seven material classes with a median DSC of 0.86 ± 0.03 for pediatric patient protocols, and 0.85 ± 0.04 for adult patient protocols. Additionally, 100 randomly selected patient examinations were segmented and analyzed, and a mean sensitivity of 0.91 (range: 0.82-0.98), specificity of 0.89 (range: 0.70-0.98), and accuracy of 0.90 (range: 0.76-0.98) were demonstrated. In this study, we demonstrate that this fully automated segmentation tool was able to produce fast and accurate segmentation of the neck and trunk of the body over a wide range of patient habitus and scan parameters.
Novel image encryption algorithm based on multiple-parameter discrete fractional random transform
NASA Astrophysics Data System (ADS)
Zhou, Nanrun; Dong, Taiji; Wu, Jianhua
2010-08-01
A new method of digital image encryption is presented utilizing a new multiple-parameter discrete fractional random transform. Image encryption and decryption are performed based on the index additivity and multiple parameters of the multiple-parameter fractional random transform. The plaintext and ciphertext are respectively in the spatial domain and in the fractional domain determined by the encryption keys. The proposed algorithm can resist statistical analyses effectively. Computer simulation results show that the proposed encryption algorithm is sensitive to the multiple keys, and that it has considerable robustness, noise immunity and security.
Hopi Indian Witchcraft and Healing: On Good, Evil, and Gossip
ERIC Educational Resources Information Center
Geertz, Armin W.
2011-01-01
One of the abiding problems in the study of American Indians is that it is plagued by stereotyping and romanticism. In the history of ideas in Europe and the United States, negative as well as positive stereotyping has been called "primitivism." Much of the author's work has been an attempt to get beyond primitivism in order to get to…
ERIC Educational Resources Information Center
Glenn, Wendy
2008-01-01
This article employs critical discourse analysis methods to (a) apply Marxist and critical literacy theories to recently published young adult novels that feature wealthy New York teens whose privilege grants them lives of leisure and (b) discuss the implications of using these texts in the classroom to encourage students to read (and consume)…
The Relation between Bullying, Victimization, and Adolescents' Level of Hopelessness
ERIC Educational Resources Information Center
Siyahhan, Sinem; Aricak, O. Tolga; Cayirdag-Acar, Nur
2012-01-01
In this study, 419 Turkish middle school students (203 girls, 216 boys) were surveyed on their exposure to and engagement in bullying, and their level of hopelessness. Our findings suggest that girls were victims of indirect (e.g. gossiping) bullying more than boys. Boys reported being victims of physical (e.g. damaging property) and verbal (e.g.…
ERIC Educational Resources Information Center
Steinberg, Matthew P.; Allensworth, Elaine; Johnson, David W.
2011-01-01
In schools across the country, students routinely encounter a range of safety issues--from overt acts of violence and bullying to subtle intimidation and disrespect. Though extreme incidents such as school shootings tend to attract the most attention, day-to-day incidents such as gossip, hallway fights, and yelling matches between teachers and…
ERIC Educational Resources Information Center
Crookston, Shara L.
2009-01-01
Women in higher education and the consumer pressures they feel have implications for the quality of an education a woman receives. With college tuition rising, more and more college students are going into deeper financial debt than ever before (National Center for Educational Statistics, 2008). Popular culture influences on the college-going…
ERIC Educational Resources Information Center
Grote, Ellen
2005-01-01
Gossip has mainly been investigated as an oral discourse practice, one that serves as a mechanism to reaffirm relationships and to construct, monitor and maintain social norms and values within communities. This study investigates how a group of Aboriginal English speaking teenage girls constructed norms, values and identities in their email…
The Impact of Third-Party Information on Trust: Valence, Source, and Reliability
2016-01-01
Economic exchange between strangers happens extremely frequently due to the growing number of internet transactions. In trust situations like online transactions, a trustor usually does not know whether she encounters a trustworthy trustee. However, the trustor might form beliefs about the trustee's trustworthiness by relying on third-party information. Different kinds of third-party information can vary dramatically in their importance to the trustor. We ran a factorial design to study how the different characteristics of third-party information affect the trustor’s decision to trust. We systematically varied unregulated third-party information regarding the source (friend or a stranger), the reliability (gossip or experiences), and the valence (positive or negative) of the information. The results show that negative information is more salient for withholding trust than positive information is for placing trust. If third-party information is positive, experience of a friend has the strongest effect on trusting followed by friend’s gossip. Positive information from a stranger does not matter to the trustor. With respect to negative information, the data show that even the slightest hint of an untrustworthy trustee leads to significantly less placed trust irrespective of the source or the reliability of the information. PMID:26882013
Investigation of mechanisms linking media exposure to smoking in high school students.
Carson, Nicholas J; Rodriguez, Daniel; Audrain-McGovern, Janet
2005-08-01
Media exposure has been found to impact adolescent smoking, although the mechanisms of this relationship have not been thoroughly investigated. Drive for thinness and tobacco advertising receptivity, both shown to be associated with smoking, are two potential mediators. 967 twelfth grade students completed a self-report survey as part of a longitudinal study of biobehavioral predictors of smoking. Exposure to magazines and television, drive for thinness, tobacco advertisement receptivity, and twelfth grade smoking level were the primary variables of interest. Effects of gender, race, BMI, smoking exposure, and perceived physical appearance were controlled for in the model. Exposure to fashion, entertainment, and gossip magazines had indirect effects on smoking via drive for thinness and tobacco advertisement receptivity. There was a direct effect of health, fitness, and sports magazine reading on smoking. Television watching had no significant effects on smoking. Adolescents who read fashion, entertainment, and gossip magazines may be more likely to smoke, in part, because of a higher drive for thinness and greater receptivity to cigarette advertisements. Conversely, adolescents reading Health and Fitness magazines may be less likely to smoke. Drive for thinness and tobacco advertising receptivity are thus potential targets for adolescent smoking intervention.
Enhancing the Selection of Backoff Interval Using Fuzzy Logic over Wireless Ad Hoc Networks
Ranganathan, Radha; Kannan, Kathiravan
2015-01-01
IEEE 802.11 is the de facto standard for medium access over wireless ad hoc networks. The collision avoidance mechanism (i.e., random binary exponential backoff, BEB) of IEEE 802.11 DCF (distributed coordination function) is inefficient and unfair, especially under heavy load. In the literature, many algorithms have been proposed to tune the contention window (CW) size. However, these algorithms make every node select its backoff interval in [0, CW] in a random and uniform manner. This randomness is incorporated to avoid collisions among the nodes. But the random backoff interval can change the optimal order and frequency of channel access among competing nodes, which results in unfairness and increased delay. In this paper, we propose an algorithm that schedules medium access in a fair and effective manner. The algorithm enhances IEEE 802.11 DCF with an additional level of contention resolution that prioritizes the contending nodes according to queue length and waiting time. Each node computes its unique backoff interval using fuzzy logic based on input parameters collected from contending nodes through overhearing. We evaluate our algorithm against the IEEE 802.11 and GDCF (gentle distributed coordination function) protocols using the ns-2.35 simulator and show that our algorithm achieves good performance. PMID:25879066
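The uniform draw from a doubling contention window that the paper sets out to improve can be sketched as follows; the CW values are illustrative defaults, not the exact parameters of any particular 802.11 PHY:

```python
import random

def beb_backoff(collisions, cw_min=32, cw_max=1024, rng=random):
    """Binary exponential backoff in the style of IEEE 802.11 DCF.

    After each collision the contention window doubles (capped at cw_max),
    and the node draws its backoff slot uniformly from [0, CW - 1] -- the
    uniform random draw whose unfairness the fuzzy scheme aims to correct.
    """
    cw = min(cw_min * (2 ** collisions), cw_max)
    return rng.randrange(cw)

random.seed(7)
slots = [beb_backoff(c) for c in range(6)]   # CW grows: 32, 64, 128, 256, 512, 1024
```

Each successive collision spreads competing nodes over a larger slot range, reducing collision probability at the cost of longer and less predictable waits.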
NASA Astrophysics Data System (ADS)
Zhang, Leihong; Liang, Dong; Li, Bei; Kang, Yi; Pan, Zilan; Zhang, Dawei; Gao, Xiumin; Ma, Xiuhua
2016-07-01
Based on an analysis of the cosine light field with a determined analytic expression and the pseudo-inverse method, the object is illuminated by a preset light field with a determined discrete Fourier transform measurement matrix, and the object image is reconstructed by the pseudo-inverse method. The analytic expression of the computational ghost imaging algorithm based on the discrete Fourier transform measurement matrix is deduced theoretically and compared with the compressive computational ghost imaging algorithm based on a random measurement matrix. The reconstruction process and the reconstruction error are analyzed, and on this basis simulations verify the theoretical analysis. When the number of sampling measurements is similar to the number of object pixels and the rank of the discrete Fourier transform matrix is the same as that of the random measurement matrix, the PSNR of the images reconstructed by the FGI and PGI algorithms is similar, and the reconstruction error of the traditional CGI algorithm is lower than that of the images reconstructed by the FGI and PGI algorithms. As the number of sampling measurements decreases, the PSNR of the FGI reconstruction decreases slowly, while the PSNR of the PGI and CGI reconstructions decreases sharply. The reconstruction time of the FGI algorithm is lower than that of the other algorithms and is not affected by the number of sampling measurements. The FGI algorithm can effectively filter out random white noise through a low-pass filter and achieve denoising during reconstruction, with a higher denoising capability than the CGI algorithm. The FGI algorithm can thus improve both the reconstruction accuracy and the reconstruction speed of computational ghost imaging.
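The pseudo-inverse reconstruction step at the heart of these schemes can be sketched with a random Gaussian measurement matrix standing in for the preset light fields described above; with at least as many measurements as pixels, recovery is exact up to numerical error:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 80                               # object pixels, number of measurements
# Random measurement patterns, a stand-in for the preset illumination fields
# (the DFT-based cosine field of the abstract would play the same role).
Phi = rng.standard_normal((m, n))
x = np.zeros(n)
x[10:20] = 1.0                              # simple 1-D "object"
y = Phi @ x                                 # bucket-detector measurements
x_hat = np.linalg.pinv(Phi) @ y             # pseudo-inverse reconstruction
err = np.max(np.abs(x_hat - x))
```

With m >= n and a full-column-rank Phi, the pseudo-inverse returns the least-squares solution, which coincides with the true object; the interesting regimes in the abstract are precisely those where m drops below n and the algorithms start to differ.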
Focusing light through random scattering media by four-element division algorithm
NASA Astrophysics Data System (ADS)
Fang, Longjie; Zhang, Xicheng; Zuo, Haoyi; Pang, Lin
2018-01-01
The focusing of light through random scattering materials using wavefront shaping is studied in detail. We propose a new approach, the four-element division algorithm, to improve the average convergence rate and signal-to-noise ratio of focusing. Using 4096 independently controlled segments of light, the intensity at the target is enhanced 72-fold over the original intensity at the same position. The four-element division algorithm and existing phase-control algorithms for focusing through scattering media are compared in both numerical simulation and experiment. The four-element division algorithm proves particularly advantageous for improving the average convergence rate of focusing.
Random bits, true and unbiased, from atmospheric turbulence
Marangon, Davide G.; Vallone, Giuseppe; Villoresi, Paolo
2014-01-01
Random numbers are a fundamental ingredient for secure communications and numerical simulation, as well as for games and information science in general. Physical processes with intrinsic unpredictability may be exploited to generate genuine random numbers. Optical propagation through strong atmospheric turbulence is exploited here for this purpose, by observing a laser beam after a 143 km free-space path. In addition, we developed an algorithm to extract the randomness of the beam images at the receiver without post-processing. The numbers passed very selective randomness tests, qualifying them as genuine random numbers. The extraction algorithm can be easily generalized to random images generated by different physical processes. PMID:24976499
Cai, Tianxi; Karlson, Elizabeth W.
2013-01-01
Objectives To test whether data extracted from full text patient visit notes from an electronic medical record (EMR) would improve the classification of PsA compared to an algorithm based on codified data. Methods From the > 1,350,000 adults in a large academic EMR, all 2318 patients with a billing code for PsA were extracted and 550 were randomly selected for chart review and algorithm training. Using codified data and phrases extracted from narrative data using natural language processing, 31 predictors were extracted and three random forest algorithms trained using coded, narrative, and combined predictors. The receiver operator curve (ROC) was used to identify the optimal algorithm and a cut point was chosen to achieve the maximum sensitivity possible at a 90% positive predictive value (PPV). The algorithm was then used to classify the remaining 1768 charts and finally validated in a random sample of 300 cases predicted to have PsA. Results The PPV of a single PsA code was 57% (95%CI 55%–58%). Using a combination of coded data and NLP the random forest algorithm reached a PPV of 90% (95%CI 86%–93%) at sensitivity of 87% (95% CI 83% – 91%) in the training data. The PPV was 93% (95%CI 89%–96%) in the validation set. Adding NLP predictors to codified data increased the area under the ROC (p < 0.001). Conclusions Using NLP with text notes from electronic medical records improved the performance of the prediction algorithm significantly. Random forests were a useful tool to accurately classify psoriatic arthritis cases to enable epidemiological research. PMID:20701955
System Design under Uncertainty: Evolutionary Optimization of the Gravity Probe-B Spacecraft
NASA Technical Reports Server (NTRS)
Pullen, Samuel P.; Parkinson, Bradford W.
1994-01-01
This paper discusses the application of evolutionary random-search algorithms (Simulated Annealing and Genetic Algorithms) to the problem of spacecraft design under performance uncertainty. Traditionally, spacecraft performance uncertainty has been measured by reliability. Published algorithms for reliability optimization are seldom used in practice because they oversimplify reality. The algorithm developed here uses random-search optimization to model the problem more realistically. Monte Carlo simulations are used to evaluate the objective function for each trial design solution. These methods have been applied to the Gravity Probe-B (GP-B) spacecraft being developed at Stanford University for launch in 1999. Results of the algorithm developed here for GP-B are shown, and their implications for design optimization by evolutionary algorithms are discussed.
Recourse-based facility-location problems in hybrid uncertain environment.
Wang, Shuming; Watada, Junzo; Pedrycz, Witold
2010-08-01
The objective of this paper is to study facility-location problems in the presence of a hybrid uncertain environment involving both randomness and fuzziness. A two-stage fuzzy-random facility-location model with recourse (FR-FLMR) is developed in which both the demands and costs are assumed to be fuzzy-random variables. The bounds of the optimal objective value of the two-stage FR-FLMR are derived. As, in general, the fuzzy-random parameters of the FR-FLMR can be regarded as continuous fuzzy-random variables with an infinite number of realizations, the computation of the recourse requires solving infinite second-stage programming problems. Owing to this requirement, the recourse function cannot be determined analytically, and, hence, the model cannot benefit from the use of techniques of classical mathematical programming. In order to solve the location problems of this nature, we first develop a technique of fuzzy-random simulation to compute the recourse function. The convergence of such simulation scenarios is discussed. In the sequel, we propose a hybrid mutation-based binary ant-colony optimization (MBACO) approach to the two-stage FR-FLMR, which comprises the fuzzy-random simulation and the simplex algorithm. A numerical experiment illustrates the application of the hybrid MBACO algorithm. The comparison shows that the hybrid MBACO finds better solutions than the one using other discrete metaheuristic algorithms, such as binary particle-swarm optimization, genetic algorithm, and tabu search.
Random sequential adsorption of cubes
NASA Astrophysics Data System (ADS)
Cieśla, Michał; Kubala, Piotr
2018-01-01
Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Also, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as kinetics of packing growth. Microstructural properties of packings were analyzed using density autocorrelation function.
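The random sequential adsorption loop studied above can be sketched in a simplified form: axis-aligned squares in a 2-D unit box rather than oriented cubes in 3-D, with illustrative sizes and attempt counts. Particles arrive at uniformly random positions and are adsorbed only if they overlap no previously placed particle; the packing saturates when no further placements succeed.

```python
import random

random.seed(1)

def rsa_squares(side, attempts):
    """Random sequential adsorption of equal axis-aligned squares in the unit square.

    A 2-D, orientation-free simplification of the cube packings in the text:
    each trial square is accepted only if it overlaps no earlier square.
    """
    placed = []
    for _ in range(attempts):
        x, y = random.uniform(0, 1 - side), random.uniform(0, 1 - side)
        # two equal axis-aligned squares overlap iff both coordinate gaps < side
        if all(abs(x - px) >= side or abs(y - py) >= side for px, py in placed):
            placed.append((x, y))
    return placed

squares = rsa_squares(side=0.1, attempts=20000)
fraction = len(squares) * 0.1 ** 2  # packing fraction = covered area
print(len(squares), round(fraction, 3))
```

Tracking `fraction` as a function of attempt number gives the packing-growth kinetics the abstract refers to; the saturated value here is well below 1 because RSA never reaches a dense packing.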
An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution
NASA Technical Reports Server (NTRS)
Campbell, C. W.
1983-01-01
An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired value of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact and in practice its accuracy is limited only by the quality of the uniform distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualities aspects of the errors.
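The general technique can be sketched as follows: Box-Muller generates two independent standard normals from two uniforms, and a correlating transform then imposes the desired means, standard deviations, and correlation coefficient. This matches the paper's method only in spirit; the original FORTRAN routine is not reproduced here.

```python
import math
import random

random.seed(42)

def bivariate_normal(mu1, mu2, s1, s2, rho):
    """Draw one pair from a bivariate normal with given means, std devs, correlation."""
    u1, u2 = random.random(), random.random()
    # Box-Muller: two independent standard normals from two uniforms
    r = math.sqrt(-2.0 * math.log(1.0 - u1))
    z1 = r * math.cos(2 * math.pi * u2)
    z2 = r * math.sin(2 * math.pi * u2)
    # Correlate the second coordinate, then scale and shift both
    x = mu1 + s1 * z1
    y = mu2 + s2 * (rho * z1 + math.sqrt(1.0 - rho * rho) * z2)
    return x, y

pairs = [bivariate_normal(1.0, -2.0, 2.0, 0.5, 0.8) for _ in range(50000)]
mx = sum(p[0] for p in pairs) / len(pairs)
my = sum(p[1] for p in pairs) / len(pairs)
print(round(mx, 2), round(my, 2))  # sample means near 1.0 and -2.0
```

As in the paper, the construction is exact in theory; accuracy in practice is limited by the uniform generator and floating-point evaluation of `log`, `cos`, and `sin`.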
The PX-EM algorithm for fast stable fitting of Henderson's mixed model
Foulley, Jean-Louis; Van Dyk, David A
2000-01-01
This paper presents procedures for implementing the PX-EM algorithm of Liu, Rubin and Wu to compute REML estimates of variance-covariance components in Henderson's linear mixed models. The class of models considered encompasses several correlated random factors having the same vector length, e.g., as in random regression models for longitudinal data analysis and in sire-maternal grandsire models for genetic evaluation. Numerical examples are presented to illustrate the procedures. Much better convergence characteristics (number of iterations and time required for convergence) are obtained for PX-EM relative to the basic EM algorithm in the random regression setting. PMID:14736399
AUTOCLASSIFICATION OF THE VARIABLE 3XMM SOURCES USING THE RANDOM FOREST MACHINE LEARNING ALGORITHM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrell, Sean A.; Murphy, Tara; Lo, Kitty K., E-mail: s.farrell@physics.usyd.edu.au
In the current era of large surveys and massive data sets, autoclassification of astrophysical sources using intelligent algorithms is becoming increasingly important. In this paper we present the catalog of variable sources in the Third XMM-Newton Serendipitous Source catalog (3XMM) autoclassified using the Random Forest machine learning algorithm. We used a sample of manually classified variable sources from the second data release of the XMM-Newton catalogs (2XMMi-DR2) to train the classifier, obtaining an accuracy of ∼92%. We also evaluated the effectiveness of identifying spurious detections using a sample of spurious sources, achieving an accuracy of ∼95%. Manual investigation of a random sample of classified sources confirmed these accuracy levels and showed that the Random Forest machine learning algorithm is highly effective at automatically classifying 3XMM sources. Here we present the catalog of classified 3XMM variable sources. We also present three previously unidentified unusual sources that were flagged as outlier sources by the algorithm: a new candidate supergiant fast X-ray transient, a 400 s X-ray pulsar, and an eclipsing 5 hr binary system coincident with a known Cepheid.
NASA Astrophysics Data System (ADS)
Bera, Debajyoti
2015-06-01
One of the early achievements of quantum computing was demonstrated by Deutsch and Jozsa (Proc R Soc Lond A Math Phys Sci 439(1907):553, 1992) regarding classification of a particular type of Boolean functions. Their solution demonstrated an exponential speedup compared to classical approaches to the same problem; however, their solution was the only known quantum algorithm for that specific problem so far. This paper demonstrates another quantum algorithm for the same problem, with the same exponential advantage compared to classical algorithms. The novelty of this algorithm is the use of quantum amplitude amplification, a technique that is the key component of another celebrated quantum algorithm developed by Grover (Proceedings of the twenty-eighth annual ACM symposium on theory of computing, ACM Press, New York, 1996). A lower bound for randomized (classical) algorithms is also presented which establishes a sound gap between the effectiveness of our quantum algorithm and that of any randomized algorithm with similar efficiency.
Ozmutlu, H. Cenk
2014-01-01
We developed mixed integer programming (MIP) models and hybrid genetic-local search algorithms for the scheduling problem of unrelated parallel machines with job-sequence- and machine-dependent setup times and with the job-splitting property. The first contribution of this paper is to introduce novel algorithms that perform splitting and scheduling simultaneously with a variable number of subjobs. We propose a simple chromosome structure constituted by random key numbers in the hybrid genetic-local search algorithm (GAspLA). Random key numbers are used frequently in genetic algorithms, but they create additional difficulty when hybrid factors in local search are implemented. We developed algorithms that adapt the results of local search into the genetic algorithm with a minimum relocation operation on the genes' random key numbers; this is the second contribution of the paper. The third contribution is three new MIP models that perform splitting and scheduling simultaneously. The fourth contribution is the implementation of GAspLAMIP, which lets us verify the optimality of GAspLA for the studied combinations. The proposed methods are tested on a set of problems taken from the literature, and the results validate the effectiveness of the proposed algorithms. PMID:24977204
NASA Astrophysics Data System (ADS)
Adya Zizwan, Putra; Zarlis, Muhammad; Budhiarti Nababan, Erna
2017-12-01
The determination of centroids in the K-Means algorithm directly affects the quality of the clustering results, and determining centroids with random numbers has many weaknesses. The GenClust algorithm, which combines genetic algorithms and K-Means, uses a genetic algorithm to determine the centroid of each cluster, with 50% of chromosomes obtained through deterministic calculation and 50% generated from random numbers. This study modifies the GenClust algorithm so that 100% of the chromosomes are obtained through deterministic calculation. The result is a performance comparison, expressed as mean square error, of centroid determination in K-Means using the original GenClust method, the modified GenClust method, and classic K-Means.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soufi, M; Asl, A Kamali; Geramifar, P
2015-06-15
Purpose: The objective of this study was to find the best seed-localization parameters for application of the random walk algorithm to lung tumor delineation in Positron Emission Tomography (PET) images. Methods: PET images suffer from statistical noise, and tumor delineation in these images is therefore a challenging task. The random walk algorithm, a graph-based image segmentation technique, has reliable robustness to image noise; its fast computation and fast editing characteristics also make it powerful for clinical purposes. We implemented the random walk algorithm in MATLAB. The algorithm was validated and verified with a 4D-NCAT phantom with spherical lung lesions of diameters from 20 to 90 mm (in increments of 10 mm) and tumor-to-background ratios of 4:1 and 8:1. STIR (Software for Tomographic Image Reconstruction) was applied to reconstruct the phantom PET images with pixel sizes of 2×2×2 and 4×4×4 mm{sup 3}. For seed localization, we selected pixels at different maximum Standardized Uptake Value (SUVmax) percentages: at least (70%, 80%, 90% and 100%) SUVmax for foreground seeds and up to (20% to 55%, in 5% increments) SUVmax for background seeds. To investigate algorithm performance on clinical data, 19 patients with lung tumors were also studied. The contours produced by the algorithm were compared with manual contouring by a nuclear medicine expert as ground truth. Results: Phantom and clinical lesion segmentation showed that the best results were obtained by selecting pixels with at least 70% SUVmax as foreground seeds and pixels up to 30% SUVmax as background seeds. A mean Dice Similarity Coefficient of 94% ± 5% (83% ± 6%) and a mean Hausdorff Distance of 1 (2) pixels were obtained for the phantom (clinical) study.
Conclusion: The accurate results of the random walk algorithm in PET image segmentation support its application in radiation treatment planning and diagnosis.
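The reported best seed rule (foreground at ≥70% SUVmax, background at ≤30% SUVmax, everything between left for the random walker to label) can be sketched as follows. The flat list `profile` is hypothetical toy data, not PET output; the study worked on 3-D volumes.

```python
def select_seeds(suv, fg_frac=0.70, bg_frac=0.30):
    """Pick foreground/background seed indices for a random-walk segmentation.

    Pixels at or above fg_frac * SUVmax become foreground seeds; pixels at or
    below bg_frac * SUVmax become background seeds; pixels in between stay
    unlabeled and are assigned by the random walker.
    """
    suv_max = max(suv)
    fg = [i for i, v in enumerate(suv) if v >= fg_frac * suv_max]
    bg = [i for i, v in enumerate(suv) if v <= bg_frac * suv_max]
    return fg, bg

# Toy 1-D "profile" through a hot lesion on a warm background
profile = [0.1, 0.2, 0.3, 1.5, 3.8, 4.0, 3.9, 1.2, 0.3, 0.2]
fg, bg = select_seeds(profile)
print(fg, bg)  # → [4, 5, 6] [0, 1, 2, 7, 8, 9]
```

Only the lesion core is marked foreground and only clearly cold pixels background, which is what makes the walker robust to the statistical noise mentioned above.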
NASA Astrophysics Data System (ADS)
Zhuang, Yufei; Huang, Haibin
2014-02-01
A hybrid algorithm combining the particle swarm optimization (PSO) algorithm with the Legendre pseudospectral method (LPM) is proposed for solving the time-optimal trajectory planning problem of underactuated spacecraft. In the initial phase of the search, an initialization generator is constructed with the PSO algorithm, owing to its strong global search ability and robustness to random initial values; however, the PSO algorithm converges slowly near the global optimum. Therefore, when the change in the fitness function falls below a predefined value, the search switches to the LPM to accelerate convergence. With the solutions obtained by the PSO algorithm as a set of proper initial guesses, the hybrid algorithm can find a global optimum more quickly and accurately. Results of 200 Monte Carlo simulations demonstrate that the proposed hybrid PSO-LPM algorithm outperforms both the PSO algorithm and the LPM alone in terms of global search capability and convergence rate. Moreover, the PSO-LPM algorithm is also robust to random initial values.
NASA Astrophysics Data System (ADS)
Ayyad, Yassid; Mittig, Wolfgang; Bazin, Daniel; Beceiro-Novo, Saul; Cortesi, Marco
2018-02-01
The three-dimensional reconstruction of particle tracks in a time projection chamber is a challenging task that requires advanced classification and fitting algorithms. In this work, we have developed and implemented a novel algorithm based on the Random Sample Consensus Model (RANSAC). The RANSAC is used to classify tracks including pile-up, to remove uncorrelated noise hits, as well as to reconstruct the vertex of the reaction. The algorithm, developed within the Active Target Time Projection Chamber (AT-TPC) framework, was tested and validated by analyzing the 4He+4He reaction. Results, performance and quality of the proposed algorithm are presented and discussed in detail.
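The core RANSAC consensus loop can be sketched in 2-D. This toy line-fitting version is only illustrative: it omits the AT-TPC-specific track models and vertex reconstruction, but shows the same mechanism used to reject uncorrelated noise hits, namely sampling a minimal subset, fitting a model, and keeping the model with the largest inlier set.

```python
import random

random.seed(3)

def ransac_line(points, iters=200, tol=0.5):
    """Minimal RANSAC: fit a 2-D line y = a*x + b to points with outliers."""
    best_model, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = random.sample(points, 2)   # minimal sample: 2 points
        if x1 == x2:
            continue                                    # vertical: skip this sample
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        # consensus set: points within tol of the candidate line
        inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) < tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# Points on y = 2x + 1 plus three uncorrelated "noise hits"
track = [(x, 2 * x + 1) for x in range(10)]
noise = [(2.0, 9.0), (5.0, 0.0), (7.5, 3.0)]
(a, b), inliers = ransac_line(track + noise)
print(round(a, 2), round(b, 2), len(inliers))
```

The noise hits never enter the consensus set of the best model, which is how RANSAC removes them without any prior classification of the points.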
Wang, Rui; Zhou, Yongquan; Zhao, Chengyan; Wu, Haizhou
2015-01-01
Multi-threshold image segmentation is a powerful image processing technique that is used for the preprocessing of pattern recognition and computer vision. However, traditional multilevel thresholding methods are computationally expensive because they involve exhaustively searching the optimal thresholds to optimize the objective functions. To overcome this drawback, this paper proposes a flower pollination algorithm with a randomized location modification. The proposed algorithm is used to find optimal threshold values for maximizing Otsu's objective functions with regard to eight medical grayscale images. When benchmarked against other state-of-the-art evolutionary algorithms, the new algorithm proves itself to be robust and effective through numerical experimental results including Otsu's objective values and standard deviations.
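For a single threshold, Otsu's between-class variance can still be maximized by a direct scan; this sketch shows the objective that the flower pollination algorithm approximates when the number of thresholds grows and exhaustive search becomes too expensive. The 16-level histogram is a made-up example.

```python
def otsu_threshold(hist):
    """Exhaustive single-threshold Otsu: maximize the between-class variance.

    hist[i] is the pixel count at gray level i. Returns the level t such that
    the split {<= t, > t} maximizes w0 * w1 * (m0 - m1)^2.
    """
    total = sum(hist)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = s0 = 0.0
    for t in range(len(hist) - 1):
        w0 += hist[t]            # weight of the dark class
        s0 += t * hist[t]        # first moment of the dark class
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0, m1 = s0 / w0, (total_sum - s0) / w1
        var = w0 * w1 * (m0 - m1) ** 2   # between-class variance (unnormalized)
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# Bimodal toy histogram: dark peak near level 2, bright peak near level 12
hist = [1, 4, 9, 4, 1, 0, 0, 0, 0, 0, 1, 5, 10, 5, 1, 0]
print(otsu_threshold(hist))  # → 4
```

A metaheuristic such as the randomized flower pollination algorithm searches this same objective over vectors of several thresholds, where the scan above would cost exponentially more.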
Spin-the-bottle Sort and Annealing Sort: Oblivious Sorting via Round-robin Random Comparisons
Goodrich, Michael T.
2013-01-01
We study sorting algorithms based on randomized round-robin comparisons. Specifically, we study Spin-the-bottle sort, where comparisons are unrestricted, and Annealing sort, where comparisons are restricted to a distance bounded by a temperature parameter. Both algorithms are simple, randomized, data-oblivious sorting algorithms, which are useful in privacy-preserving computations, but, as we show, Annealing sort is much more efficient. We show that there is an input permutation that causes Spin-the-bottle sort to require Ω(n^2 log n) expected time in order to succeed, and that in O(n^2 log n) time this algorithm succeeds with high probability for any input. We also show there is a specification of Annealing sort that runs in O(n log n) time and succeeds with very high probability. PMID:24550575
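The unrestricted random compare-exchange idea behind Spin-the-bottle sort can be sketched as follows. The parameters are illustrative and this is not the paper's exact specification; the key property on display is data-obliviousness, since the sequence of compared positions never depends on the values.

```python
import random

random.seed(7)

def inversions(a):
    """Number of out-of-order pairs; 0 means sorted."""
    return sum(1 for i in range(len(a)) for j in range(i + 1, len(a)) if a[i] > a[j])

def spin_the_bottle_sort(a, rounds):
    """Round-robin random-comparison sort sketch.

    Each round performs n compare-exchanges between uniformly random index
    pairs i < j. A compare-exchange swaps toward sorted order, so every
    effective swap strictly decreases the inversion count.
    """
    n = len(a)
    for _ in range(rounds):
        for _ in range(n):
            i, j = sorted(random.sample(range(n), 2))
            if a[i] > a[j]:
                a[i], a[j] = a[j], a[i]
    return a

data = list(range(16))
random.shuffle(data)
inv0 = inversions(data)
result = spin_the_bottle_sort(data, rounds=256)  # ~n^2 log n comparisons in total
inv1 = inversions(result)
print(inv0, inv1)
```

Because a compare-exchange never increases the inversion count, progress is monotone; the paper's analysis shows that O(n^2 log n) such comparisons suffice with high probability, while Annealing sort achieves the same guarantee with far fewer, distance-restricted comparisons.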
USDA-ARS?s Scientific Manuscript database
Palmer amaranth (Amaranthus palmeri S. Wats.) invasion negatively impacts cotton (Gossypium hirsutum L.) production systems throughout the United States. The objective of this study was to evaluate canopy hyperspectral narrowband data as input into the random forest machine learning algorithm to dis...
"She Seems Nice": Teaching Evaluations and Gender Trouble
ERIC Educational Resources Information Center
Bartlett, Alison
2005-01-01
The aim of this paper is to work out what the author needs to do to pass with more success, and how students' evaluations of her teaching may or may not indicate this. In doing so, she draws on the literature of feminist pedagogy and on anecdote, or gossip, as a counter-discourse or a mode of talk that destabilizes the official versions of…
ERIC Educational Resources Information Center
Duncan, Judith; Bowden, Chris; Smith, Anne B.
2006-01-01
Parental support has been an increasingly essential part of New Zealand early childhood (EC) education services over the last 20 years. This support has taken many shapes and forms over this time period, and has depended on the differing philosophies of the EC education services. What this support "looks like" and how it is delivered is…
Life in an Unjust Community: A Hollywood View of High School Moral Life
ERIC Educational Resources Information Center
Resnick, David
2008-01-01
This article analyses the film "Mean girls" (2004) as a window on popular notions of the moral life of American high schools, which straddles Kohlberg's Stage 2 and 3. The film presents loyalty to peer group cliques as a key value, even as it offers an individualist, relativist critique of that loyalty. Gossip is the main transgression in this…
Quantum speedup of Monte Carlo methods.
Montanaro, Ashley
2015-09-08
Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently.
Ultrafast adiabatic quantum algorithm for the NP-complete exact cover problem
Wang, Hefeng; Wu, Lian-Ao
2016-01-01
An adiabatic quantum algorithm may lose quantumness such as quantum coherence entirely in its long runtime, and consequently the expected quantum speedup of the algorithm does not show up. Here we present a general ultrafast adiabatic quantum algorithm. We show that by applying a sequence of fast random or regular signals during evolution, the runtime can be reduced substantially, whereas advantages of the adiabatic algorithm remain intact. We also propose a randomized Trotter formula and show that the driving Hamiltonian and the proposed sequence of fast signals can be implemented simultaneously. We illustrate the algorithm by solving the NP-complete 3-bit exact cover problem (EC3), where NP stands for nondeterministic polynomial time, and put forward an approach to implementing the problem with trapped ions. PMID:26923834
Quantum speedup of Monte Carlo methods
Montanaro, Ashley
2015-01-01
Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently. PMID:26528079
NASA Astrophysics Data System (ADS)
Job, Joshua; Wang, Zhihui; Rønnow, Troels; Troyer, Matthias; Lidar, Daniel
2014-03-01
We report on experimental work benchmarking the performance of the D-Wave Two programmable annealer on its native Ising problem, and a comparison to available classical algorithms. In this talk we will focus on the comparison with an algorithm originally proposed and implemented by Alex Selby. This algorithm uses dynamic programming to repeatedly optimize over randomly selected maximal induced trees of the problem graph starting from a random initial state. If one is looking for a quantum advantage over classical algorithms, one should compare to classical algorithms which are designed and optimized to maximally take advantage of the structure of the type of problem one is using for the comparison. In that light, this classical algorithm should serve as a good gauge for any potential quantum speedup for the D-Wave Two.
Optimized random phase only holograms.
Zea, Alejandro Velez; Barrera Ramirez, John Fredy; Torroba, Roberto
2018-02-15
We propose a simple and efficient technique capable of generating Fourier phase only holograms with a reconstruction quality similar to the results obtained with the Gerchberg-Saxton (G-S) algorithm. Our proposal is to use the traditional G-S algorithm to optimize a random phase pattern for the resolution, pixel size, and target size of the general optical system without any specific amplitude data. This produces an optimized random phase (ORAP), which is used for fast generation of phase only holograms of arbitrary amplitude targets. This ORAP needs to be generated only once for a given optical system, avoiding the need for costly iterative algorithms for each new target. We show numerical and experimental results confirming the validity of the proposal.
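The underlying Gerchberg-Saxton iteration can be sketched as follows. Note that the paper's ORAP is optimized without any specific amplitude data, whereas this toy uses a hypothetical square target simply to make the loop concrete; sizes and iteration counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64

# Hypothetical target amplitude: a bright centered square (not from the paper)
target = np.zeros((N, N))
target[24:40, 24:40] = 1.0

# G-S loop: alternate between the hologram plane, where only phase is kept,
# and the reconstruction plane, where the target amplitude is imposed.
phase = np.exp(2j * np.pi * rng.random((N, N)))     # random starting phase
for _ in range(50):
    recon = np.fft.fft2(phase)                      # propagate to image plane
    recon = target * np.exp(1j * np.angle(recon))   # impose target amplitude
    back = np.fft.ifft2(recon)                      # propagate back
    phase = np.exp(1j * np.angle(back))             # keep phase only (hologram)

recon_amp = np.abs(np.fft.fft2(phase))
eff = (recon_amp[24:40, 24:40] ** 2).sum() / (recon_amp ** 2).sum()
print(round(float(eff), 3))   # fraction of reconstructed energy on the target
```

The proposal above amounts to running such a loop once for the optical system's resolution and pixel size to obtain the ORAP, then reusing that phase for new targets instead of iterating afresh each time.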
Bi-dimensional null model analysis of presence-absence binary matrices.
Strona, Giovanni; Ulrich, Werner; Gotelli, Nicholas J
2018-01-01
Comparing the structure of presence/absence (i.e., binary) matrices with those of randomized counterparts is a common practice in ecology. However, differences in the randomization procedures (null models) can affect the results of the comparisons, leading matrix structural patterns to appear either "random" or not. Subjectivity in the choice of one particular null model over another makes it often advisable to compare the results obtained using several different approaches. Yet, available algorithms to randomize binary matrices differ substantially in respect to the constraints they impose on the discrepancy between observed and randomized row and column marginal totals, which complicates the interpretation of contrasting patterns. This calls for new strategies both to explore intermediate scenarios of restrictiveness in-between extreme constraint assumptions, and to properly synthesize the resulting information. Here we introduce a new modeling framework based on a flexible matrix randomization algorithm (named the "Tuning Peg" algorithm) that addresses both issues. The algorithm consists of a modified swap procedure in which the discrepancy between the row and column marginal totals of the target matrix and those of its randomized counterpart can be "tuned" in a continuous way by two parameters (controlling, respectively, row and column discrepancy). We show how combining the Tuning Peg with a wise random walk procedure makes it possible to explore the complete null space embraced by existing algorithms. This exploration allows researchers to visualize matrix structural patterns in an innovative bi-dimensional landscape of significance/effect size. 
We demonstrate the rationale and potential of our approach with a set of simulated and real matrices, showing how the simultaneous investigation of a comprehensive and continuous portion of the null space can be extremely informative, and possibly key to resolving longstanding debates in the analysis of ecological matrices. © 2017 The Authors. Ecology, published by Wiley Periodicals, Inc., on behalf of the Ecological Society of America.
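The fully constrained end of the null-model spectrum (row and column totals held fixed) can be sketched with the classic checkerboard swap that the Tuning Peg algorithm generalizes; the presence/absence matrix below is a made-up example.

```python
import random

random.seed(11)

def swap_randomize(matrix, swaps):
    """Checkerboard-swap null model for a presence/absence (binary) matrix.

    Repeatedly pick two rows and two columns; if the 2x2 submatrix is a
    checkerboard (10/01 or 01/10), flip its entries. Every such swap
    preserves all row and column totals exactly -- the most restrictive
    constraint setting, which the Tuning Peg relaxes continuously.
    """
    m = [row[:] for row in matrix]
    nr, nc = len(m), len(m[0])
    for _ in range(swaps):
        r1, r2 = random.sample(range(nr), 2)
        c1, c2 = random.sample(range(nc), 2)
        if m[r1][c1] == m[r2][c2] and m[r1][c2] == m[r2][c1] and m[r1][c1] != m[r1][c2]:
            m[r1][c1], m[r1][c2] = m[r1][c2], m[r1][c1]
            m[r2][c1], m[r2][c2] = m[r2][c2], m[r2][c1]
    return m

obs = [[1, 1, 0, 0],
       [1, 0, 1, 0],
       [0, 1, 0, 1],
       [0, 0, 1, 1]]
null = swap_randomize(obs, swaps=1000)
row_tot = [sum(r) for r in null]
col_tot = [sum(c) for c in zip(*null)]
print(row_tot, col_tot)  # marginals identical to the observed matrix
```

The Tuning Peg's two parameters can be thought of as allowing controlled discrepancy in these row and column totals instead of the exact preservation enforced here.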
Stochastic Leader Gravitational Search Algorithm for Enhanced Adaptive Beamforming Technique
Darzi, Soodabeh; Islam, Mohammad Tariqul; Tiong, Sieh Kiong; Kibria, Salehin; Singh, Mandeep
2015-01-01
In this paper, a stochastic leader gravitational search algorithm (SL-GSA) based on randomized k is proposed. The standard GSA (SGSA) utilizes the best agents without any randomization, so it is prone to converging to suboptimal results. Initially, the new approach randomly chooses k agents from the set of all agents to improve the global search ability. Gradually, the set of agents is reduced by eliminating the agents with the poorest performance to allow rapid convergence. The performance of the SL-GSA was analyzed on six well-known benchmark functions, and the results are compared with SGSA and some of its variants. Furthermore, the SL-GSA is applied to the minimum variance distortionless response (MVDR) beamforming technique to ensure compatibility with real-world optimization problems. The proposed algorithm demonstrates a superior convergence rate and quality of solution for both real-world problems and benchmark functions compared to the original algorithm and other recent variants of SGSA. PMID:26552032
Accelerating IMRT optimization by voxel sampling
NASA Astrophysics Data System (ADS)
Martin, Benjamin C.; Bortfeld, Thomas R.; Castañon, David A.
2007-12-01
This paper presents a new method for accelerating intensity-modulated radiation therapy (IMRT) optimization using voxel sampling. Rather than calculating the dose to the entire patient at each step in the optimization, the dose is only calculated for some randomly selected voxels. Those voxels are then used to calculate estimates of the objective and gradient which are used in a randomized version of a steepest descent algorithm. By selecting different voxels on each step, we are able to find an optimal solution to the full problem. We also present an algorithm to automatically choose the best sampling rate for each structure within the patient during the optimization. Seeking further improvements, we experimented with several other gradient-based optimization algorithms and found that the delta-bar-delta algorithm performs well despite the randomness. Overall, we were able to achieve approximately an order of magnitude speedup on our test case as compared to steepest descent.
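The voxel-sampling idea can be sketched on a toy least-squares stand-in for the dose optimization problem; all matrices and constants below are invented for illustration, and the real method adds per-structure sampling rates and the delta-bar-delta update mentioned above.

```python
import random

random.seed(5)

# Toy stand-in for IMRT dose optimization (all data invented): choose beam
# weights w so the dose D @ w matches a prescription d in least squares.
n_vox, n_beams = 200, 5
D = [[random.random() for _ in range(n_beams)] for _ in range(n_vox)]
w_true = [1.0, 2.0, 0.5, 1.5, 1.0]
d = [sum(D[v][b] * w_true[b] for b in range(n_beams)) for v in range(n_vox)]

def sampled_descent(steps=4000, sample=20, lr=0.01):
    """Steepest descent on f(w) = sum_v (dose_v - d_v)^2 where each step
    estimates the gradient from a random subset of voxels instead of the
    whole volume, as in the voxel-sampling idea above."""
    w = [0.0] * n_beams
    for _ in range(steps):
        voxels = random.sample(range(n_vox), sample)   # random voxel subset
        grad = [0.0] * n_beams
        for v in voxels:
            resid = sum(D[v][b] * w[b] for b in range(n_beams)) - d[v]
            for b in range(n_beams):
                grad[b] += 2.0 * resid * D[v][b]
        # averaging over the subset gives an unbiased estimate of the
        # per-voxel mean gradient of the full objective
        w = [w[b] - lr * grad[b] / sample for b in range(n_beams)]
    return w

w = sampled_descent()
err = max(abs(w[b] - w_true[b]) for b in range(n_beams))
print([round(x, 2) for x in w], round(err, 4))
```

Each step touches only 10% of the voxels, yet the iterates converge to the full-problem optimum because fresh voxels are drawn every step, which is exactly the source of the speedup reported above.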
Parallel Algorithms for Switching Edges in Heterogeneous Graphs.
Bhuiyan, Hasanuzzaman; Khan, Maleq; Chen, Jiangzhuo; Marathe, Madhav
2017-06-01
An edge switch is an operation on a graph (or network) where two edges are selected randomly and one of their end vertices are swapped with each other. Edge switch operations have important applications in graph theory and network analysis, such as in generating random networks with a given degree sequence, modeling and analyzing dynamic networks, and in studying various dynamic phenomena over a network. The recent growth of real-world networks motivates the need for efficient parallel algorithms. The dependencies among successive edge switch operations and the requirement to keep the graph simple (i.e., no self-loops or parallel edges) as the edges are switched lead to significant challenges in designing a parallel algorithm. Addressing these challenges requires complex synchronization and communication among the processors leading to difficulties in achieving a good speedup by parallelization. In this paper, we present distributed memory parallel algorithms for switching edges in massive networks. These algorithms provide good speedup and scale well to a large number of processors. A harmonic mean speedup of 73.25 is achieved on eight different networks with 1024 processors. One of the steps in our edge switch algorithms requires the computation of multinomial random variables in parallel. This paper presents the first non-trivial parallel algorithm for the problem, achieving a speedup of 925 using 1024 processors.
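A sequential sketch of the edge-switch operation, including the simplicity checks discussed above (no self-loops, no parallel edges); the parallel algorithm itself, with its synchronization and multinomial-sampling machinery, is not reproduced.

```python
import random

random.seed(2)

def edge_switch(edges, attempts):
    """Sequential edge switching on a simple undirected graph.

    Two distinct edges (u, v) and (x, y) are chosen at random and rewired to
    (u, y) and (x, v), but only if the result keeps the graph simple.
    Degrees are preserved by construction.
    """
    edge_set = set(edges)
    edge_list = list(edges)
    for _ in range(attempts):
        i, j = random.sample(range(len(edge_list)), 2)
        (u, v), (x, y) = edge_list[i], edge_list[j]
        a, b = (u, y), (x, v)
        if u == y or x == v:
            continue                          # would create a self-loop
        if {a, (a[1], a[0]), b, (b[1], b[0])} & edge_set:
            continue                          # would create a parallel edge
        edge_set -= {edge_list[i], edge_list[j]}
        edge_set |= {a, b}
        edge_list[i], edge_list[j] = a, b
    return edge_list

# 6-cycle: every vertex has degree 2 both before and after switching
cycle = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]
switched = edge_switch(cycle, attempts=100)
deg = {}
for u, v in switched:
    deg[u] = deg.get(u, 0) + 1
    deg[v] = deg.get(v, 0) + 1
print(sorted(deg.items()))  # every vertex keeps degree 2
```

The dependency problem the paper addresses is visible here: the simplicity check for each switch consults the current edge set, so concurrent switches on different processors can conflict, which is what makes the parallelization non-trivial.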
Jeyasingh, Suganthi; Veluchamy, Malathi
2017-05-01
Early diagnosis of breast cancer is essential to save patients' lives. Medical datasets typically include a large variety of data that can lead to confusion during diagnosis. The Knowledge Discovery in Databases (KDD) process helps to improve efficiency; it requires the elimination of inappropriate and repeated data from the dataset before final diagnosis, which can be done using any of the feature selection algorithms available in data mining. Feature selection is considered a vital step in increasing classification accuracy. This paper proposes a Modified Bat Algorithm (MBA) for feature selection to eliminate irrelevant features from an original dataset. The Bat algorithm was modified using simple random sampling to select random instances from the dataset. Ranking against the global best features was used to recognize the predominant features in the dataset. The selected features are used to train a Random Forest (RF) classification algorithm. The MBA feature selection algorithm enhanced the classification accuracy of RF in identifying the occurrence of breast cancer. The Wisconsin Diagnostic Breast Cancer (WDBC) dataset was used for the performance analysis of the proposed MBA feature selection algorithm. The proposed algorithm achieved better performance in terms of Kappa statistic, Matthews Correlation Coefficient, Precision, F-measure, Recall, Mean Absolute Error (MAE), Root Mean Square Error (RMSE), Relative Absolute Error (RAE), and Root Relative Squared Error (RRSE).
Seismic noise attenuation using an online subspace tracking algorithm
NASA Astrophysics Data System (ADS)
Zhou, Yatong; Li, Shuhua; Zhang, Dong; Chen, Yangkang
2018-02-01
We propose a new low-rank based noise attenuation method using an efficient algorithm for tracking subspaces from highly corrupted seismic observations. The subspace tracking algorithm requires only basic linear algebraic manipulations and is derived by analysing incremental gradient descent on the Grassmannian manifold of subspaces. When the multidimensional seismic data are mapped to a low-rank space, the subspace tracking algorithm can be applied directly to the input low-rank matrix to estimate the useful signals. Since the subspace tracking algorithm is an online algorithm, it is more robust to random noise than traditional truncated singular value decomposition (TSVD) based subspace tracking algorithms. Compared with state-of-the-art algorithms, the proposed denoising method obtains better performance. More specifically, the proposed method outperforms the TSVD-based singular spectrum analysis method, leaving less residual noise while saving half of the computational cost. Several synthetic and field data examples with different levels of complexity demonstrate the effectiveness and robustness of the presented algorithm in rejecting different types of noise, including random noise, spiky noise, blending noise, and coherent noise.
Testing an earthquake prediction algorithm
Kossobokov, V.G.; Healy, J.H.; Dewey, J.W.
1997-01-01
A test to evaluate earthquake prediction algorithms is being applied to a Russian algorithm known as M8. The M8 algorithm makes intermediate term predictions for earthquakes to occur in a large circle, based on integral counts of transient seismicity in the circle. In a retroactive prediction for the period January 1, 1985 to July 1, 1991 the algorithm as configured for the forward test would have predicted eight of ten strong earthquakes in the test area. A null hypothesis, based on random assignment of predictions, predicts eight earthquakes in 2.87% of the trials. The forward test began July 1, 1991 and will run through December 31, 1997. As of July 1, 1995, the algorithm had forward predicted five out of nine earthquakes in the test area, which success ratio would have been achieved in 53% of random trials with the null hypothesis.
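As a hedged illustration of how such a null hypothesis can be evaluated: if each of n earthquakes is assumed, for simplicity, to fall in an alarmed region independently with probability p, the chance that a random predictor matches at least k events is a binomial tail. (The paper's actual null uses random assignment of predictions in space and time, so this simplification does not reproduce the 2.87% figure; p here is an assumed alarm fraction.)

```python
from math import comb

def tail_prob(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance a random predictor
    'predicts' at least k of n earthquakes when each is independently
    covered by an alarm with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
```

For example, with p = 0.5 the chance of matching at least 8 of 10 events is 56/1024, about 5.5%; the smaller the alarm fraction p, the more significant a given success ratio becomes.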
A distributed scheduling algorithm for heterogeneous real-time systems
NASA Technical Reports Server (NTRS)
Zeineldine, Osman; El-Toweissy, Mohamed; Mukkamala, Ravi
1991-01-01
Much of the previous work on load balancing and scheduling in distributed environments was concerned with homogeneous systems and homogeneous loads. Several of the results indicated that random policies are as effective as other, more complex load allocation policies. Here, the effects of heterogeneity on scheduling algorithms for hard real-time systems are examined. A distributed scheduler designed specifically to handle heterogeneities in both nodes and node traffic is proposed. The performance of the algorithm is measured in terms of the percentage of jobs discarded. While a random task allocation is very sensitive to heterogeneities, the proposed algorithm is shown to be robust to such non-uniformities in system components and load.
Studies of the DIII-D disruption database using Machine Learning algorithms
NASA Astrophysics Data System (ADS)
Rea, Cristina; Granetz, Robert; Meneghini, Orso
2017-10-01
A Random Forests Machine Learning algorithm, trained on a large database of both disruptive and non-disruptive DIII-D discharges, predicts disruptive behavior in DIII-D with about 90% accuracy. Several algorithms have been tested, and Random Forests was found superior in performance for this particular task. Over 40 plasma parameters are included in the database, with data for each of the parameters taken from 500k time slices. We focused on a subset of non-dimensional plasma parameters, deemed to be good predictors based on physics considerations. Both binary (disruptive/non-disruptive) and multi-label (label based on the elapsed time before disruption) classification problems are investigated. The Random Forests algorithm provides insight on the available dataset by ranking the relative importance of the input features. It is found that q95 and Greenwald density fraction (n/nG) are the most relevant parameters for discriminating between DIII-D disruptive and non-disruptive discharges. A comparison with the Gradient Boosted Trees algorithm is shown, and first results from the application of regression algorithms are presented. Work supported by the US Department of Energy under DE-FC02-04ER54698, DE-SC0014264 and DE-FG02-95ER54309.
PCA-LBG-based algorithms for VQ codebook generation
NASA Astrophysics Data System (ADS)
Tsai, Jinn-Tsong; Yang, Po-Yuan
2015-04-01
Vector quantisation (VQ) codebooks are generated by combining principal component analysis (PCA) algorithms with Linde-Buzo-Gray (LBG) algorithms. All training vectors are grouped according to the projected values of the principal components. The PCA-LBG-based algorithms include (1) PCA-LBG-Median, which selects the median vector of each group, (2) PCA-LBG-Centroid, which adopts the centroid vector of each group, and (3) PCA-LBG-Random, which randomly selects a vector of each group. The LBG algorithm then finds a codebook from the better initial vectors supplied by the PCA. The PCA performs an orthogonal transformation to convert a set of potentially correlated variables into a set of variables that are not linearly correlated. Because the orthogonal transformation efficiently distinguishes test image vectors, the proposed PCA-LBG-based algorithms are expected to outperform conventional algorithms in designing VQ codebooks. The experimental results confirm that the proposed PCA-LBG-based algorithms indeed obtain better results than existing methods reported in the literature.
Grebenkov, Denis S
2011-02-01
A new method for computing the signal attenuation due to restricted diffusion in a linear magnetic field gradient is proposed. A fast random walk (FRW) algorithm for simulating random trajectories of diffusing spin-bearing particles is combined with gradient encoding. As the random moves of an FRW are continuously adapted to the local geometrical length scales, the method is efficient for simulating pulsed-gradient spin-echo experiments in hierarchical or multiscale porous media such as concrete, sandstones, sedimentary rocks and, potentially, brain or lungs. Copyright © 2010 Elsevier Inc. All rights reserved.
Recursive Branching Simulated Annealing Algorithm
NASA Technical Reports Server (NTRS)
Bolcar, Matthew; Smith, J. Scott; Aronstein, David
2012-01-01
This innovation is a variation of a simulated-annealing optimization algorithm that uses a recursive-branching structure to parallelize the search of a parameter space for the globally optimal solution to an objective. The algorithm has been demonstrated to be more effective at searching a parameter space than traditional simulated-annealing methods for a particular problem of interest, and it can readily be applied to a wide variety of optimization problems, including those with a parameter space having both discrete-value parameters (combinatorial) and continuous-variable parameters. It can take the place of a conventional simulated-annealing, Monte-Carlo, or random-walk algorithm. In a conventional simulated-annealing (SA) algorithm, a starting configuration is randomly selected within the parameter space. The algorithm randomly selects another configuration from the parameter space and evaluates the objective function for that configuration. If the objective function value is better than the previous value, the new configuration is adopted as the new point of interest in the parameter space. If the objective function value is worse than the previous value, the new configuration may be adopted, with a probability determined by a temperature parameter, used in analogy to annealing in metals. As the optimization continues, the region of the parameter space from which new configurations can be selected shrinks, and in conjunction with lowering the annealing temperature (and thus lowering the probability for adopting configurations in parameter space with worse objective functions), the algorithm can converge on the globally optimal configuration.
The Recursive Branching Simulated Annealing (RBSA) algorithm shares some features with the SA algorithm, notably including the basic principles that a starting configuration is randomly selected from within the parameter space, the algorithm tests other configurations with the goal of finding the globally optimal solution, and the region from which new configurations can be selected shrinks as the search continues. The key difference between these algorithms is that in the SA algorithm, a single path, or trajectory, is taken in parameter space, from the starting point to the globally optimal solution, while in the RBSA algorithm, many trajectories are taken; by exploring multiple regions of the parameter space simultaneously, the algorithm has been shown to converge on the globally optimal solution about an order of magnitude faster than when using conventional algorithms. Novel features of the RBSA algorithm include: 1. More efficient searching of the parameter space due to the branching structure, in which multiple random configurations are generated and multiple promising regions of the parameter space are explored; 2. The implementation of a trust region for each parameter in the parameter space, which provides a natural way of enforcing upper- and lower-bound constraints on the parameters; and 3. The optional use of a constrained gradient-search optimization, performed on the continuous variables around each branch's configuration in parameter space to improve search efficiency by allowing for fast fine-tuning of the continuous variables within the trust region at that configuration point.
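A minimal sketch of the conventional SA loop described above, with a shrinking proposal region and temperature-controlled acceptance of worse moves (the function names, cooling schedule, and convex test objective are illustrative assumptions; the RBSA branching structure is not reproduced here):

```python
import math
import random

def simulated_annealing(f, lo, hi, steps=20000, t0=1.0, seed=0):
    """Conventional SA on [lo, hi]: random start, shrinking proposal radius
    and temperature; worse moves accepted with probability exp(-delta / T)."""
    rng = random.Random(seed)
    x = rng.uniform(lo, hi)
    fx = f(x)
    best, fbest = x, fx
    for k in range(steps):
        frac = 1.0 - k / steps
        T = t0 * frac + 1e-9                     # annealing temperature
        radius = (hi - lo) * frac * 0.5 + 1e-6   # shrinking search region
        y = min(hi, max(lo, x + rng.uniform(-radius, radius)))
        fy = f(y)
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest
```

RBSA would spawn several such trajectories recursively, each with its own trust region, instead of following the single path this sketch takes.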
Tom Cruise is dangerous and irresponsible
Neill, Ushma S.
2005-01-01
Yes, even the JCI can weigh in on celebrity gossip, but hopefully without becoming a tabloid. Rather, we want to shine a light on the reckless comments actor Tom Cruise has recently made that psychiatry is a “quack” field and his belief that postpartum depression cannot be treated pharmacologically. We can only hope that his influence as a celebrity does not hold back those in need of psychiatric treatment. PMID:16075033
The Sleep of the Saved and Thankful
2009-03-16
the media working with the BSC, including Henry Luce, publisher of Time, Life and Fortune magazines; Walter Lippmann, columnist for the Herald Tribune...Walter Winchell, whose syndicated gossip column was read by over 50 million people in more than 2,000 papers worldwide; Arthur H. Sulzberger...York Herald Tribune, the New York Post, and the Baltimore Sun, – the places where the BSC had the most success in finding sympathetic columnists and
Ingram, Gordon P D
2014-04-29
Adult humans are characterized by low rates of intra-group physical aggression. Since children tend to be more physically aggressive, an evolutionary developmental account shows promise for explaining how physical aggression is suppressed in adults. I argue that this is achieved partly through extended dominance hierarchies, based on indirect reciprocity and linguistic transmission of reputational information, mediated by indirectly aggressive competition. Reviewing the literature on indirect and related forms of aggression provides three pieces of evidence for the claim that evolutionarily old impulses towards physical aggression are socialized into indirect aggression in humans: (i) physical aggression falls in early childhood over the same age range at which indirect aggression increases; (ii) the same individuals engage in both direct and indirect aggression; and (iii) socially dominant individuals practice indirect aggression more frequently. Consideration of the developmental course of indirect aggression is complemented by analysis of similar developments in verbal behaviors that are not always thought of as aggressive, namely tattling and gossip. An important puzzle concerns why indirect aggression becomes more covert, and tattling more derogated, in preadolescence and adolescence. This may be due to the development of new strategies aimed at renegotiating social identity and friendship alliances in the peer group.
NASA Astrophysics Data System (ADS)
Zhou, Nanrun; Zhang, Aidi; Zheng, Fen; Gong, Lihua
2014-10-01
The existing ways to encrypt images based on compressive sensing usually treat the whole measurement matrix as the key, which renders the key too large to distribute and memorize or store. To solve this problem, a new image compression-encryption hybrid algorithm is proposed to realize compression and encryption simultaneously, where the key is easily distributed, stored or memorized. The input image is divided into 4 blocks to compress and encrypt, then the pixels of the two adjacent blocks are exchanged randomly by random matrices. The measurement matrices in compressive sensing are constructed by utilizing the circulant matrices and controlling the original row vectors of the circulant matrices with logistic map. And the random matrices used in random pixel exchanging are bound with the measurement matrices. Simulation results verify the effectiveness, security of the proposed algorithm and the acceptable compression performance.
Alam, M S; Bognar, J G; Cain, S; Yasuda, B J
1998-03-10
During the process of microscanning a controlled vibrating mirror typically is used to produce subpixel shifts in a sequence of forward-looking infrared (FLIR) images. If the FLIR is mounted on a moving platform, such as an aircraft, uncontrolled random vibrations associated with the platform can be used to generate the shifts. Iterative techniques such as the expectation-maximization (EM) approach by means of the maximum-likelihood algorithm can be used to generate high-resolution images from multiple randomly shifted aliased frames. In the maximum-likelihood approach the data are considered to be Poisson random variables and an EM algorithm is developed that iteratively estimates an unaliased image that is compensated for known imager-system blur while it simultaneously estimates the translational shifts. Although this algorithm yields high-resolution images from a sequence of randomly shifted frames, it requires significant computation time and cannot be implemented for real-time applications that use the currently available high-performance processors. The new image shifts are iteratively calculated by evaluation of a cost function that compares the shifted and interlaced data frames with the corresponding values in the algorithm's latest estimate of the high-resolution image. We present a registration algorithm that estimates the shifts in one step. The shift parameters provided by the new algorithm are accurate enough to eliminate the need for iterative recalculation of translational shifts. Using this shift information, we apply a simplified version of the EM algorithm to estimate a high-resolution image from a given sequence of video frames. The proposed modified EM algorithm has been found to reduce significantly the computational burden when compared with the original EM algorithm, thus making it more attractive for practical implementation. Both simulation and experimental results are presented to verify the effectiveness of the proposed technique.
Nagahama, Yuki; Shimobaba, Tomoyoshi; Kakue, Takashi; Masuda, Nobuyuki; Ito, Tomoyoshi
2017-05-01
A holographic projector utilizes holography techniques. However, there are several barriers to realizing holographic projections. One is deterioration of hologram image quality caused by speckle noise and ringing artifacts. The combination of the random phase-free method and the Gerchberg-Saxton (GS) algorithm has improved the image quality of holograms. However, the GS algorithm requires significant computation time. We propose faster methods for image quality improvement of random phase-free holograms using the characteristics of ringing artifacts.
Multi-label spacecraft electrical signal classification method based on DBN and random forest
Li, Ke; Yu, Nan; Li, Pengfei; Song, Shimin; Wu, Yalei; Li, Yang; Liu, Meng
2017-01-01
Spacecraft electrical signal characteristic data contain a large amount of high-dimensional data with a high degree of computational complexity and low identification rates, which causes great difficulty in fault diagnosis of spacecraft electronic load systems. This paper proposes a feature extraction method based on deep belief networks (DBN) and a classification method based on the random forest (RF) algorithm. The proposed algorithm mainly employs a multi-layer neural network to reduce the dimension of the original data, after which classification is applied. First, wavelet denoising is used to pre-process the data. Second, the deep belief network reduces the feature dimension and improves the classification rate for the electrical characteristics data. Finally, the random forest algorithm is used to classify the data and is compared with other algorithms. The experimental results show that, compared with other algorithms, the proposed method shows excellent performance in terms of accuracy, computational efficiency, and stability in addressing spacecraft electrical signal data. PMID:28486479
Scalable and fault tolerant orthogonalization based on randomized distributed data aggregation
Gansterer, Wilfried N.; Niederbrucker, Gerhard; Straková, Hana; Schulze Grotthoff, Stefan
2013-01-01
The construction of distributed algorithms for matrix computations built on top of distributed data aggregation algorithms with randomized communication schedules is investigated. For this purpose, a new aggregation algorithm for summing or averaging distributed values, the push-flow algorithm, is developed, which achieves superior resilience properties with respect to failures compared to existing aggregation methods. It is illustrated that on a hypercube topology it asymptotically requires the same number of iterations as the optimal all-to-all reduction operation and that it scales well with the number of nodes. Orthogonalization is studied as a prototypical matrix computation task. A new fault tolerant distributed orthogonalization method rdmGS, which can produce accurate results even in the presence of node failures, is built on top of distributed data aggregation algorithms. PMID:24748902
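The push-flow algorithm itself is not specified in the abstract above; as a rough stand-in, the classical push-sum gossip protocol for distributed averaging (the family of randomized aggregation methods this work builds on) can be sketched as follows, on a complete graph with synchronous rounds (the function name and all parameters are illustrative assumptions):

```python
import random

def push_sum_average(values, rounds=200, seed=0):
    """Push-sum gossip: each node keeps a pair (s, w), and each round sends
    half of both to a uniformly random node (complete-graph topology) while
    keeping the other half. Every ratio s/w converges to the global average."""
    rng = random.Random(seed)
    n = len(values)
    s = list(map(float, values))
    w = [1.0] * n
    for _ in range(rounds):
        new_s = [0.0] * n
        new_w = [0.0] * n
        for i in range(n):
            target = rng.randrange(n)   # random recipient (self-send allowed)
            for j, share in ((i, 0.5), (target, 0.5)):
                new_s[j] += s[i] * share
                new_w[j] += w[i] * share
        s, w = new_s, new_w
    return [si / wi for si, wi in zip(s, w)]
```

Because the total mass of s and w is conserved every round, the protocol tolerates message reordering; handling node failures without losing mass is exactly the gap the push-flow algorithm described above is designed to close.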
Random Walk Quantum Clustering Algorithm Based on Space
NASA Astrophysics Data System (ADS)
Xiao, Shufen; Dong, Yumin; Ma, Hongyang
2018-01-01
In the random quantum walk, which is a quantum simulation of the classical walk, data points interact when selecting the appropriate walk strategy by taking advantage of quantum-entanglement features; thus, the results obtained with the quantum walk differ from those obtained with the classical walk. A new quantum walk clustering algorithm based on space is proposed by applying the quantum walk to clustering analysis. In this algorithm, data points are viewed as walking participants, and similar data points are clustered using the walk function in the pay-off matrix according to a certain rule. The walk process is simplified by implementing a space-combining rule. The proposed algorithm is validated by a simulation test and proved superior to existing clustering algorithms, namely, Kmeans, PCA + Kmeans, and LDA-Km. The effects of some of the parameters in the proposed algorithm on its performance are also analyzed and discussed. Specific suggestions are provided.
An improved genetic algorithm and its application in the TSP problem
NASA Astrophysics Data System (ADS)
Li, Zheng; Qin, Jinlei
2011-12-01
The concept and current research status of genetic algorithms are introduced in detail. On this basis, the simple genetic algorithm and an improved algorithm are described and applied to an example of the TSP, where the advantage of genetic algorithms in solving this NP-hard problem is clearly shown. In addition, starting from the partial matching crossover operator, the crossover method is improved into an extended crossover operator in order to increase efficiency when solving the TSP. In the extended crossover method, crossover can be performed between random positions of two random individuals, without being restricted by chromosome position. Finally, the nine-city TSP is solved using the improved genetic algorithm with the extended crossover method; the solution process is much more efficient, and the optimal solution is found much faster.
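The partially matched crossover (PMX) that the extended operator builds on can be sketched as follows: a segment between two random cut points is copied from one parent, and conflicts outside it are resolved through the mapping induced by the segment, so the child remains a valid tour. (This is a textbook PMX sketch, not the paper's extended operator.)

```python
import random

def pmx(parent1, parent2, rng):
    """Partially matched crossover between two random cut points.
    The segment [a, b) is copied from parent1; genes outside it come from
    parent2, chained through the segment's value mapping to avoid duplicates."""
    n = len(parent1)
    a, b = sorted(rng.sample(range(n + 1), 2))
    child = [None] * n
    child[a:b] = parent1[a:b]
    mapping = {parent1[i]: parent2[i] for i in range(a, b)}
    for i in list(range(0, a)) + list(range(b, n)):
        gene = parent2[i]
        while gene in child[a:b]:   # conflict: follow the segment mapping
            gene = mapping[gene]
        child[i] = gene
    return child
```

The "extended" variant described above would additionally let the cut points fall at arbitrary, independently chosen positions in each of the two randomly selected individuals.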
Benchmarking protein classification algorithms via supervised cross-validation.
Kertész-Farkas, Attila; Dhir, Somdutta; Sonego, Paolo; Pacurar, Mircea; Netoteia, Sergiu; Nijveen, Harm; Kuzniar, Arnold; Leunissen, Jack A M; Kocsor, András; Pongor, Sándor
2008-04-24
Development and testing of protein classification algorithms are hampered by the fact that the protein universe is characterized by groups vastly different in the number of members, in average protein size, similarity within group, etc. Datasets based on traditional cross-validation (k-fold, leave-one-out, etc.) may not give reliable estimates on how an algorithm will generalize to novel, distantly related subtypes of the known protein classes. Supervised cross-validation, i.e., selection of test and train sets according to the known subtypes within a database has been successfully used earlier in conjunction with the SCOP database. Our goal was to extend this principle to other databases and to design standardized benchmark datasets for protein classification. Hierarchical classification trees of protein categories provide a simple and general framework for designing supervised cross-validation strategies for protein classification. Benchmark datasets can be designed at various levels of the concept hierarchy using a simple graph-theoretic distance. A combination of supervised and random sampling was selected to construct reduced size model datasets, suitable for algorithm comparison. Over 3000 new classification tasks were added to our recently established protein classification benchmark collection that currently includes protein sequence (including protein domains and entire proteins), protein structure and reading frame DNA sequence data. We carried out an extensive evaluation based on various machine-learning algorithms such as nearest neighbor, support vector machines, artificial neural networks, random forests and logistic regression, used in conjunction with comparison algorithms, BLAST, Smith-Waterman, Needleman-Wunsch, as well as 3D comparison methods DALI and PRIDE. The resulting datasets provide lower, and in our opinion more realistic estimates of the classifier performance than do random cross-validation schemes. 
NASA Technical Reports Server (NTRS)
Olson, William S.; Kummerow, Christian D.; Yang, Song; Petty, Grant W.; Tao, Wei-Kuo; Bell, Thomas L.; Braun, Scott A.; Wang, Yansen; Lang, Stephen E.; Johnson, Daniel E.
2004-01-01
A revised Bayesian algorithm for estimating surface rain rate, convective rain proportion, and latent heating/drying profiles from satellite-borne passive microwave radiometer observations over ocean backgrounds is described. The algorithm searches a large database of cloud-radiative model simulations to find cloud profiles that are radiatively consistent with a given set of microwave radiance measurements. The properties of these radiatively consistent profiles are then composited to obtain best estimates of the observed properties. The revised algorithm is supported by an expanded and more physically consistent database of cloud-radiative model simulations. The algorithm also features a better quantification of the convective and non-convective contributions to total rainfall, a new geographic database, and an improved representation of background radiances in rain-free regions. Bias and random error estimates are derived from applications of the algorithm to synthetic radiance data, based upon a subset of cloud resolving model simulations, and from the Bayesian formulation itself. Synthetic rain rate and latent heating estimates exhibit a trend of high (low) bias for low (high) retrieved values. The Bayesian estimates of random error are propagated to represent errors at coarser time and space resolutions, based upon applications of the algorithm to TRMM Microwave Imager (TMI) data. Errors in instantaneous rain rate estimates at 0.5 deg resolution range from approximately 50% at 1 mm/h to 20% at 14 mm/h. These errors represent about 70-90% of the mean random deviation between collocated passive microwave and spaceborne radar rain rate estimates. The cumulative algorithm error in TMI estimates at monthly, 2.5 deg resolution is relatively small (less than 6% at 5 mm/day) compared to the random error due to infrequent satellite temporal sampling (8-35% at the same rain rate).
Solving large test-day models by iteration on data and preconditioned conjugate gradient.
Lidauer, M; Strandén, I; Mäntysaari, E A; Pösö, J; Kettunen, A
1999-12-01
A preconditioned conjugate gradient method was implemented in an iteration-on-data program for the estimation of breeding values, and its convergence characteristics were studied. An algorithm was used as a reference in which one fixed effect was solved by the Gauss-Seidel method and the other effects by a second-order Jacobi method. Implementation of the preconditioned conjugate gradient required storing four vectors (of size equal to the number of unknowns in the mixed model equations) in random access memory and reading the data at each round of iteration. The preconditioner comprised diagonal blocks of the coefficient matrix. Comparison of the algorithms was based on solutions of mixed model equations obtained by a single-trait animal model and a single-trait random regression test-day model. Data sets for both models used milk yield records of primiparous Finnish dairy cows. The animal model data comprised 665,629 lactation milk yields, and the random regression test-day model data comprised 6,732,765 test-day milk yields. Both models included pedigree information on 1,099,622 animals. The animal model (random regression test-day model) required 122 (305) rounds of iteration to converge with the reference algorithm, but only 88 (149) with the preconditioned conjugate gradient. Solving the random regression test-day model with the preconditioned conjugate gradient required 237 megabytes of random access memory and took 14% of the computation time needed by the reference algorithm.
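The preconditioned conjugate gradient iteration can be sketched with a simple diagonal (Jacobi) preconditioner, a stand-in for the diagonal-block preconditioner described above; the small dense system is an illustrative assumption, whereas the actual program works matrix-free by re-reading the data each round:

```python
def pcg(A, b, tol=1e-10, max_iter=1000):
    """Conjugate gradient for SPD A with preconditioner M = diag(A):
    each iteration applies A once and M^{-1} once."""
    n = len(b)
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    x = [0.0] * n
    r = b[:]                                   # residual b - A x for x = 0
    z = [r[i] / A[i][i] for i in range(n)]     # z = M^{-1} r
    p = z[:]
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [r[i] / A[i][i] for i in range(n)]
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        beta = rz_new / rz
        p = [zi + beta * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x
```

Only four work vectors (x, r, z, p) of the size of the unknowns are kept, which matches the memory layout the abstract describes.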
NASA Astrophysics Data System (ADS)
Manzanares-Filho, N.; Albuquerque, R. B. F.; Sousa, B. S.; Santos, L. G. C.
2018-06-01
This article presents a comparative study of some versions of the controlled random search algorithm (CRSA) in global optimization problems. The basic CRSA, originally proposed by Price in 1977 and improved by Ali et al. in 1997, is taken as a starting point. Then, some new modifications are proposed to improve the efficiency and reliability of this global optimization technique. The performance of the algorithms is assessed using traditional benchmark test problems commonly invoked in the literature. This comparative study points out the key features of the modified algorithm. Finally, a comparison is also made in a practical engineering application, namely the inverse aerofoil shape design.
Quantum-inspired algorithm for estimating the permanent of positive semidefinite matrices
NASA Astrophysics Data System (ADS)
Chakhmakhchyan, L.; Cerf, N. J.; Garcia-Patron, R.
2017-08-01
We construct a quantum-inspired classical algorithm for computing the permanent of Hermitian positive semidefinite matrices by exploiting a connection between these mathematical structures and the boson sampling model. Specifically, the permanent of a Hermitian positive semidefinite matrix can be expressed in terms of the expected value of a random variable, which stands for a specific photon-counting probability when measuring a linear-optically evolved random multimode coherent state. Our algorithm then approximates the matrix permanent from the corresponding sample mean and is shown to run in polynomial time for various sets of Hermitian positive semidefinite matrices, achieving a precision that improves over known techniques. This work illustrates how quantum optics may benefit algorithm development.
Prime Numbers Comparison using Sieve of Eratosthenes and Sieve of Sundaram Algorithm
NASA Astrophysics Data System (ADS)
Abdullah, D.; Rahim, R.; Apdilah, D.; Efendi, S.; Tulus, T.; Suwilo, S.
2018-03-01
Prime numbers appeal to researchers because of their complexity, and many algorithms, ranging from simple to computationally complex, can be used to generate them. The Sieve of Eratosthenes and the Sieve of Sundaram are two algorithms that can generate prime numbers from randomly generated or sequential numbers. The testing in this study determines which algorithm is better suited for generating large primes in terms of time complexity. The tests were assisted by an application written in Java with code optimization and maximum memory usage, so that the testing processes could run simultaneously and the results obtained would be objective.
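The two sieves under comparison can be sketched as follows; these are straightforward textbook implementations (the study's Java-specific optimizations are not reproduced). Eratosthenes marks multiples of each prime; Sundaram marks numbers of the form i + j + 2ij, leaving indices of the odd primes.

```python
def sieve_eratosthenes(n):
    """All primes <= n by marking multiples of each prime."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]
    for i in range(2, int(n ** 0.5) + 1):
        if is_prime[i]:
            for j in range(i * i, n + 1, i):
                is_prime[j] = False
    return [i for i in range(2, n + 1) if is_prime[i]]

def sieve_sundaram(n):
    """All primes <= n; marking i + j + 2ij leaves i with 2i+1 prime."""
    if n < 2:
        return []
    k = (n - 1) // 2
    marked = [False] * (k + 1)
    for i in range(1, k + 1):
        j = i
        while i + j + 2 * i * j <= k:
            marked[i + j + 2 * i * j] = True
            j += 1
    # Sundaram yields only odd primes, so prepend 2
    return [2] + [2 * i + 1 for i in range(1, k + 1) if not marked[i]]
```

Both produce identical prime lists; the interesting comparison, as in the study, is their running time and memory behavior at large n.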
Selective epidemic vaccination under the performant routing algorithms
NASA Astrophysics Data System (ADS)
Bamaarouf, O.; Alweimine, A. Ould Baba; Rachadi, A.; EZ-Zahraouy, H.
2018-04-01
Despite the extensive research on traffic dynamics and epidemic spreading, the effect of routing algorithm strategies on traffic-driven epidemic spreading has not received adequate attention. It is well known that more performant routing algorithm strategies are used to overcome the congestion problem. However, our main result shows, unexpectedly, that these algorithms favor virus spreading more than the case where the shortest-path-based algorithm is used. In this work, we studied virus spreading in a complex network using the efficient path and the global dynamic routing algorithms as compared to the shortest path strategy. Some previous studies have tried to modify the routing rules to limit virus spreading, but at the expense of reducing traffic transport efficiency. This work proposes a solution to overcome this drawback by using a selective vaccination procedure instead of the random vaccination often used in the literature. We found that selective vaccination succeeds in eradicating the virus better than a purely random intervention for the performant routing algorithm strategies.
Seismic random noise attenuation method based on empirical mode decomposition of Hausdorff dimension
NASA Astrophysics Data System (ADS)
Yan, Z.; Luan, X.
2017-12-01
Introduction Empirical mode decomposition (EMD) is a noise-suppression algorithm based on wave-field separation, exploiting the scale differences between the effective signal and noise. However, since the complexity of the real seismic wave field results in serious mode aliasing, denoising with this method alone is neither ideal nor effective. Based on the multi-scale decomposition characteristics of the EMD algorithm, combined with Hausdorff dimension constraints, we propose a new method for seismic random noise attenuation. First, we apply the EMD algorithm to adaptively decompose the seismic data and obtain a series of intrinsic mode functions (IMFs) with different scales. Based on the difference in Hausdorff dimension between effective signals and random noise, we identify the IMF components mixed with random noise. Then we use threshold correlation filtering to separate the valid signal and random noise effectively. Compared with the traditional EMD method, the results show that the new method has a better suppression effect on seismic random noise. The implementation process The EMD algorithm is used to decompose seismic signals into IMF sets and analyze their spectra. Since most of the random noise is high-frequency noise, the IMF sets can be divided into three categories: the first is the effective wave composition at larger scales; the second is the noise part at smaller scales; the third is the IMF components containing random noise. The third kind of IMF component is then processed by the Hausdorff dimension algorithm, selecting an appropriate time window size, initial step, and increment to calculate the Hausdorff instantaneous dimension of each component. The dimension of the random noise lies between 1.0 and 1.05, while the dimension of the effective wave lies between 1.05 and 2.0.
On the basis of the previous steps, according to the dimension difference between random noise and effective signal, we extract the sample points whose fractal dimension is less than or equal to 1.05 from each IMF component to separate the residual noise. Using the IMF components after dimension filtering together with the effective-wave IMF components from the first selection for reconstruction, we obtain the de-noised result.
ERIC Educational Resources Information Center
Pawl, Andrew; Teodorescu, Raluca E.; Peterson, Joseph D.
2013-01-01
We have developed simple data-mining algorithms to assess the consistency and the randomness of student responses to problems consisting of multiple true or false statements. In this paper we describe the algorithms and use them to analyze data from introductory physics courses. We investigate statements that emerge as outliers because the class…
NASA Astrophysics Data System (ADS)
Perugini, G.; Ricci-Tersenghi, F.
2018-01-01
We first present an empirical study of the Belief Propagation (BP) algorithm, when run on the random field Ising model defined on random regular graphs in the zero temperature limit. We introduce the notion of extremal solutions for the BP equations, and we use them to fix a fraction of spins in their ground state configuration. At the phase transition point the fraction of unconstrained spins percolates and their number diverges with the system size. This in turn makes the associated optimization problem highly non trivial in the critical region. Using the bounds on the BP messages provided by the extremal solutions we design a new and very easy to implement BP scheme which is able to output a large number of stable fixed points. On one hand this new algorithm is able to provide the minimum energy configuration with high probability in a competitive time. On the other hand we found that the number of fixed points of the BP algorithm grows with the system size in the critical region. This unexpected feature poses new relevant questions about the physics of this class of models.
Selecting materialized views using random algorithm
NASA Astrophysics Data System (ADS)
Zhou, Lijuan; Hao, Zhongxiao; Liu, Chi
2007-04-01
The data warehouse is a repository of information collected from multiple, possibly heterogeneous, autonomous distributed databases. The information stored at the data warehouse is in the form of views, referred to as materialized views. The selection of materialized views is one of the most important decisions in designing a data warehouse, as they are stored for the purpose of efficiently implementing on-line analytical processing queries. The first issue for the user to consider is query response time. In this paper, we therefore develop algorithms to select a set of views to materialize in a data warehouse in order to minimize the total view maintenance cost under the constraint of a given query response time; we call this the query-cost view-selection problem. First, the cost graph and cost model of the query-cost view-selection problem are presented. Second, methods for selecting materialized views using randomized algorithms are presented. The genetic algorithm is applied to the materialized view selection problem, but as the genetic process evolves, producing legal solutions becomes increasingly difficult, so many solutions are eliminated and the time to generate solutions lengthens. We therefore present an improved algorithm that combines simulated annealing with the genetic algorithm to solve the query-cost view-selection problem. Finally, simulation experiments test the effectiveness and efficiency of our algorithms. The experiments show that the given methods provide near-optimal solutions in limited time and work well in practical cases. Randomized algorithms will become invaluable tools for data warehouse evolution.
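The simulated-annealing half of the hybrid approach can be sketched on a toy view-selection instance: choose which views to materialize so that maintenance cost is minimized while total query time stays under a limit. The cost model, penalty, and every number below are invented for illustration; they are not the paper's cost graph or data.

```python
import math, random

def sa_view_selection(maint, qsave, base_qtime, qlimit, seed=7,
                      t0=10.0, cooling=0.995, steps=5000):
    """Simulated annealing sketch for a toy query-cost view-selection
    problem. maint[i]: maintenance cost of view i; qsave[i]: query-time
    saving if view i is materialized."""
    rng = random.Random(seed)
    n = len(maint)

    def cost(sel):
        qtime = base_qtime - sum(q for q, s in zip(qsave, sel) if s)
        m = sum(c for c, s in zip(maint, sel) if s)
        penalty = 1e6 if qtime > qlimit else 0.0   # infeasible states
        return m + penalty

    cur = [rng.random() < 0.5 for _ in range(n)]
    cur_cost = cost(cur)
    best, best_cost = cur[:], cur_cost
    t = t0
    for _ in range(steps):
        cand = cur[:]
        cand[rng.randrange(n)] = not cand[rng.randrange(0, 1) or 0] if False else cand[0]
        # (see below: flip exactly one view in/out)
        cand = cur[:]
        i = rng.randrange(n)
        cand[i] = not cand[i]
        c = cost(cand)
        # Accept downhill moves always, uphill moves with Boltzmann prob.
        if c < cur_cost or rng.random() < math.exp((cur_cost - c) / t):
            cur, cur_cost = cand, c
            if c < best_cost:
                best, best_cost = cand[:], c
        t *= cooling
    return best, best_cost

maint = [5, 3, 8, 2, 6]            # hypothetical per-view maintenance costs
qsave = [10, 6, 14, 3, 9]          # hypothetical query-time savings
best, best_cost = sa_view_selection(maint, qsave, base_qtime=40, qlimit=20)
```

In the paper's hybrid, annealing-style acceptance is combined with genetic recombination to keep legal solutions flowing; the sketch above shows only the annealing acceptance rule.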
Nonconvergence of the Wang-Landau algorithms with multiple random walkers.
Belardinelli, R E; Pereyra, V D
2016-05-01
This paper discusses some convergence properties of entropic sampling Monte Carlo methods with multiple random walkers, particularly the Wang-Landau (WL) and 1/t algorithms. The classical algorithms are modified by the use of m independent random walkers in the energy landscape to calculate the density of states (DOS). The Ising model is used to show the convergence properties in the calculation of the DOS, as well as the critical temperature, while the calculation of the number π by multidimensional integration is used in the continuum approximation. In each case, the error is obtained separately for each walker at a fixed time t; then, the average over m walkers is performed. It is observed that the error goes as 1/√m. However, if the number of walkers increases above a certain critical value m > m_x, the error reaches a constant value (i.e., it saturates). This occurs for both algorithms; however, it is shown that for a given system, the 1/t algorithm is more efficient and accurate than the similar version of the WL algorithm. It follows that it makes no sense to increase the number of walkers above the critical value m_x, since doing so does not reduce the error in the calculation. Therefore, the number of walkers alone does not guarantee convergence.
Bayesian Analysis for Exponential Random Graph Models Using the Adaptive Exchange Sampler.
Jin, Ick Hoon; Yuan, Ying; Liang, Faming
2013-10-01
Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the intractable normalizing constant and model degeneracy. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the intractable normalizing constant and model degeneracy issues encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as a MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, molecule synthetic network, and dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency.
NASA Astrophysics Data System (ADS)
Xu, Jiuping; Zeng, Ziqiang; Han, Bernard; Lei, Xiao
2013-07-01
This article presents a dynamic programming-based particle swarm optimization (DP-based PSO) algorithm for solving an inventory management problem for large-scale construction projects under a fuzzy random environment. By taking into account the purchasing behaviour and strategy under rules of international bidding, a multi-objective fuzzy random dynamic programming model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform fuzzy random parameters into fuzzy variables that are subsequently defuzzified by using an expected value operator with optimistic-pessimistic index. The iterative nature of the authors' model motivates them to develop a DP-based PSO algorithm. More specifically, their approach treats the state variables as hidden parameters. This in turn eliminates many redundant feasibility checks during initialization and particle updates at each iteration. Results and sensitivity analysis are presented to highlight the performance of the authors' optimization method, which is very effective as compared to the standard PSO algorithm.
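For contrast with the authors' DP-based variant, here is a standard particle swarm optimization sketch on a simple test function; the inertia and acceleration parameters and the objective are illustrative, and none of the fuzzy random machinery of the article is reproduced.

```python
import random

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=3):
    """Standard PSO minimizing f over a box. Each particle tracks its
    personal best; the swarm tracks a global best."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=pbest_val.__getitem__)
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity: inertia + pull toward personal and global bests
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

sphere = lambda x: sum(v * v for v in x)
best, best_val = pso(sphere, [(-10, 10)] * 3)
```

The article's contribution is layering dynamic programming over this kind of update so that state variables are treated as hidden parameters, pruning redundant feasibility checks.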
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shenvi, Neil; Yang, Yang; Yang, Weitao
In recent years, interest in the random-phase approximation (RPA) has grown rapidly. At the same time, tensor hypercontraction has emerged as an intriguing method to reduce the computational cost of electronic structure algorithms. In this paper, we combine the particle-particle random phase approximation with tensor hypercontraction to produce the tensor-hypercontracted particle-particle RPA (THC-ppRPA) algorithm. Unlike previous implementations of ppRPA which scale as O(r^6), the THC-ppRPA algorithm scales asymptotically as only O(r^4), albeit with a much larger prefactor than the traditional algorithm. We apply THC-ppRPA to several model systems and show that it yields the same results as traditional ppRPA to within mH accuracy. Our method opens the door to the development of post-Kohn Sham functionals based on ppRPA without the excessive asymptotic cost of traditional ppRPA implementations.
NASA Astrophysics Data System (ADS)
Shenvi, Neil; van Aggelen, Helen; Yang, Yang; Yang, Weitao
2014-07-01
In recent years, interest in the random-phase approximation (RPA) has grown rapidly. At the same time, tensor hypercontraction has emerged as an intriguing method to reduce the computational cost of electronic structure algorithms. In this paper, we combine the particle-particle random phase approximation with tensor hypercontraction to produce the tensor-hypercontracted particle-particle RPA (THC-ppRPA) algorithm. Unlike previous implementations of ppRPA which scale as O(r6), the THC-ppRPA algorithm scales asymptotically as only O(r4), albeit with a much larger prefactor than the traditional algorithm. We apply THC-ppRPA to several model systems and show that it yields the same results as traditional ppRPA to within mH accuracy. Our method opens the door to the development of post-Kohn Sham functionals based on ppRPA without the excessive asymptotic cost of traditional ppRPA implementations.
Comparative analysis of used car price evaluation models
NASA Astrophysics Data System (ADS)
Chen, Chuancan; Hao, Lulu; Xu, Cong
2017-05-01
An accurate used car price evaluation is a catalyst for the healthy development of used car market. Data mining has been applied to predict used car price in several articles. However, little is studied on the comparison of using different algorithms in used car price estimation. This paper collects more than 100,000 used car dealing records throughout China to do empirical analysis on a thorough comparison of two algorithms: linear regression and random forest. These two algorithms are used to predict used car price in three different models: model for a certain car make, model for a certain car series and universal model. Results show that random forest has a stable but not ideal effect in price evaluation model for a certain car make, but it shows great advantage in the universal model compared with linear regression. This indicates that random forest is an optimal algorithm when handling complex models with a large number of variables and samples, yet it shows no obvious advantage when coping with simple models with less variables.
Precise algorithm to generate random sequential adsorption of hard polygons at saturation
NASA Astrophysics Data System (ADS)
Zhang, G.
2018-04-01
Random sequential adsorption (RSA) is a time-dependent packing process, in which particles of certain shapes are randomly and sequentially placed into an empty space without overlap. In the infinite-time limit, the density approaches a "saturation" limit. Although this limit has attracted particular research interest, the majority of past studies could only probe this limit by extrapolation. We have previously found an algorithm to reach this limit using finite computational time for spherical particles and could thus determine the saturation density of spheres with high accuracy. In this paper, we generalize this algorithm to generate saturated RSA packings of two-dimensional polygons. We also calculate the saturation density for regular polygons of three to ten sides and obtain results that are consistent with previous, extrapolation-based studies.
Precise algorithm to generate random sequential adsorption of hard polygons at saturation.
Zhang, G
2018-04-01
Random sequential adsorption (RSA) is a time-dependent packing process, in which particles of certain shapes are randomly and sequentially placed into an empty space without overlap. In the infinite-time limit, the density approaches a "saturation" limit. Although this limit has attracted particular research interest, the majority of past studies could only probe this limit by extrapolation. We have previously found an algorithm to reach this limit using finite computational time for spherical particles and could thus determine the saturation density of spheres with high accuracy. In this paper, we generalize this algorithm to generate saturated RSA packings of two-dimensional polygons. We also calculate the saturation density for regular polygons of three to ten sides and obtain results that are consistent with previous, extrapolation-based studies.
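The RSA process itself is easy to sketch with naive rejection sampling of equal disks in the unit square; unlike the algorithm of the paper, this cannot certify saturation, it merely approaches the jammed state as the attempt count grows. The radius and attempt count below are illustrative.

```python
import math, random

def rsa_disks(radius, n_attempts=20000, seed=42):
    """Naive random sequential adsorption of equal disks in the unit
    square: propose uniformly random centers, accept if the new disk
    overlaps no previously placed disk."""
    rng = random.Random(seed)
    centers = []
    min_d2 = (2 * radius) ** 2          # squared center-to-center minimum
    for _ in range(n_attempts):
        x, y = rng.random(), rng.random()
        if all((x - cx) ** 2 + (y - cy) ** 2 >= min_d2
               for cx, cy in centers):
            centers.append((x, y))
    return centers

centers = rsa_disks(0.05)
# Covered fraction (ignoring boundary overhang); the infinite-system
# saturation coverage for disks is roughly 0.547
density = len(centers) * math.pi * 0.05 ** 2
```

Extrapolation-based studies estimate the saturation limit from runs like this at increasing attempt counts; the paper's contribution is reaching that limit exactly in finite time, and extending the construction to polygons.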
Algorithm 971: An Implementation of a Randomized Algorithm for Principal Component Analysis
LI, HUAMIN; LINDERMAN, GEORGE C.; SZLAM, ARTHUR; STANTON, KELLY P.; KLUGER, YUVAL; TYGERT, MARK
2017-01-01
Recent years have witnessed intense development of randomized methods for low-rank approximation. These methods target principal component analysis and the calculation of truncated singular value decompositions. The present article presents an essentially black-box, foolproof implementation for Mathworks’ MATLAB, a popular software platform for numerical computation. As illustrated via several tests, the randomized algorithms for low-rank approximation outperform or at least match the classical deterministic techniques (such as Lanczos iterations run to convergence) in basically all respects: accuracy, computational efficiency (both speed and memory usage), ease-of-use, parallelizability, and reliability. However, the classical procedures remain the methods of choice for estimating spectral norms and are far superior for calculating the least singular values and corresponding singular vectors (or singular subspaces). PMID:28983138
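The flavor of these randomized methods can be shown with a basic Gaussian range-finder plus power iteration; this is a minimal Halko-Martinsson-Tropp-style sketch, not the safeguarded MATLAB implementation of Algorithm 971, and all sizes are illustrative.

```python
import numpy as np

def randomized_svd(A, k, oversample=10, n_iter=2, seed=0):
    """Randomized truncated SVD: sketch the range of A with a Gaussian
    test matrix, then solve a small dense problem."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], k + oversample))
    Y = A @ Omega
    for _ in range(n_iter):            # power iterations sharpen the spectrum
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)             # orthonormal basis for the range
    B = Q.T @ A                        # small (k+p) x n problem
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :k], s[:k], Vt[:k, :]

# Exactly low-rank test matrix, so the approximation should be near-exact
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 8)) @ rng.standard_normal((8, 80))
U, s, Vt = randomized_svd(A, 8)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A)
```

On genuinely low-rank (or rapidly decaying) spectra, a handful of power iterations with modest oversampling recovers the truncated SVD to near machine precision, which is why these methods compete with Lanczos-type solvers.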
Xiao, Li-Hong; Chen, Pei-Ran; Gou, Zhong-Ping; Li, Yong-Zhong; Li, Mei; Xiang, Liang-Cheng; Feng, Ping
2017-01-01
The aim of this study is to evaluate the ability of the random forest algorithm that combines data on transrectal ultrasound findings, age, and serum levels of prostate-specific antigen to predict prostate carcinoma. Clinico-demographic data were analyzed for 941 patients with prostate diseases treated at our hospital, including age, serum prostate-specific antigen levels, transrectal ultrasound findings, and pathology diagnosis based on ultrasound-guided needle biopsy of the prostate. These data were compared between patients with and without prostate cancer using the Chi-square test, and then entered into the random forest model to predict diagnosis. Patients with and without prostate cancer differed significantly in age and serum prostate-specific antigen levels (P < 0.001), as well as in all transrectal ultrasound characteristics (P < 0.05) except uneven echo (P = 0.609). The random forest model based on age, prostate-specific antigen and ultrasound predicted prostate cancer with an accuracy of 83.10%, sensitivity of 65.64%, and specificity of 93.83%. Positive predictive value was 86.72%, and negative predictive value was 81.64%. By integrating age, prostate-specific antigen levels and transrectal ultrasound findings, the random forest algorithm shows better diagnostic performance for prostate cancer than either diagnostic indicator on its own. This algorithm may help improve diagnosis of the disease by identifying patients at high risk for biopsy.
A novel strategy for load balancing of distributed medical applications.
Logeswaran, Rajasvaran; Chen, Li-Choo
2012-04-01
Current trends in medicine, specifically the electronic handling of medical applications, ranging from digital imaging, paperless hospital administration, electronic medical records and telemedicine to computer-aided diagnosis, create a burden on the network. Distributed Service Architectures, such as the Intelligent Network (IN), Telecommunication Information Networking Architecture (TINA) and Open Service Access (OSA), are able to meet this new challenge. Distribution enables computational tasks to be spread among multiple processors; hence, performance is an important issue. This paper proposes a novel load-balancing approach, the Random Sender Initiated Algorithm, for distributing tasks among several nodes sharing the same computational object (CO) instances in Distributed Service Architectures. Simulations illustrate that the proposed algorithm produces better network performance than the benchmark load-balancing algorithms, the Random Node Selection Algorithm and the Shortest Queue Algorithm, especially under medium and heavily loaded conditions.
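The two benchmark policies can be contrasted in a toy discrete-time simulation: tasks arrive each step and are assigned either to a uniformly random node or to the node with the shortest queue, while every node serves one task per step. The arrival/service model is invented for illustration and is far simpler than a Distributed Service Architecture.

```python
import random

def simulate(policy, n_nodes=5, steps=1000, arrivals_per_step=4, seed=0):
    """Toy load-balancing simulation. Returns the maximum queue length
    observed after service each step (lower is better)."""
    rng = random.Random(seed)
    queues = [0] * n_nodes
    worst = 0
    for _ in range(steps):
        for _ in range(arrivals_per_step):
            if policy == "random":
                i = rng.randrange(n_nodes)          # random node selection
            else:
                i = min(range(n_nodes), key=queues.__getitem__)  # shortest queue
            queues[i] += 1
        queues = [max(0, q - 1) for q in queues]    # each node serves one task
        worst = max(worst, max(queues))
    return worst

w_random = simulate("random")
w_shortest = simulate("shortest")
```

Even in this crude model the shortest-queue policy keeps backlogs flat while random assignment lets queues build up under load, which is the gap a sender-initiated scheme also tries to close without requiring global queue knowledge.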
Distributed multimodal data fusion for large scale wireless sensor networks
NASA Astrophysics Data System (ADS)
Ertin, Emre
2006-05-01
Sensor network technology has enabled new surveillance systems where sensor nodes equipped with processing and communication capabilities can collaboratively detect, classify and track targets of interest over a large surveillance area. In this paper we study distributed fusion of multimodal sensor data for extracting target information from a large scale sensor network. Optimal tracking, classification, and reporting of threat events require joint consideration of multiple sensor modalities. Multiple sensor modalities improve tracking by reducing the uncertainty in the track estimates as well as resolving track-sensor data association problems. Our approach to solving the fusion problem with large number of multimodal sensors is construction of likelihood maps. The likelihood maps provide a summary data for the solution of the detection, tracking and classification problem. The likelihood map presents the sensory information in an easy format for the decision makers to interpret and is suitable with fusion of spatial prior information such as maps, imaging data from stand-off imaging sensors. We follow a statistical approach to combine sensor data at different levels of uncertainty and resolution. The likelihood map transforms each sensor data stream to a spatio-temporal likelihood map ideally suitable for fusion with imaging sensor outputs and prior geographic information about the scene. We also discuss distributed computation of the likelihood map using a gossip based algorithm and present simulation results.
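The gossip-based distributed computation mentioned at the end rests on the pairwise randomized gossip primitive: a random node repeatedly averages its value with a random neighbor, driving every node to the global mean without any central coordinator. The grid topology and sensor readings below are hypothetical.

```python
import random

def pairwise_gossip(values, neighbors, rounds=4000, seed=0):
    """Asynchronous pairwise gossip: each tick, one random node averages
    with one random neighbor. The pairwise average preserves the sum,
    so all values converge to the global mean."""
    rng = random.Random(seed)
    x = list(values)
    for _ in range(rounds):
        i = rng.randrange(len(x))
        j = rng.choice(neighbors[i])
        x[i] = x[j] = (x[i] + x[j]) / 2
    return x

# 4x4 grid of sensor nodes with hypothetical local readings
n = 16
grid = {i: [j for j in (i - 1, i + 1, i - 4, i + 4)
            if 0 <= j < n
            and not (j == i - 1 and i % 4 == 0)
            and not (j == i + 1 and j % 4 == 0)]
        for i in range(n)}
readings = [float(i) for i in range(n)]
x = pairwise_gossip(readings, grid)
mean = sum(readings) / n
```

Summing per-node log-likelihood contributions by gossip in this way is one plausible reading of how a likelihood map could be fused in-network; the paper's specific protocol may differ.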
The Evolution of Random Number Generation in MUVES
2017-01-01
... MUVES, including the mathematical basis and statistical justification for algorithms used in the code. The working code provided produces results identical to the current ... questionable numerical and statistical properties. The development of the modern system is traced through software change requests, resulting in a random number ...
Optimizing Constrained Single Period Problem under Random Fuzzy Demand
NASA Astrophysics Data System (ADS)
Taleizadeh, Ata Allah; Shavandi, Hassan; Riazi, Afshin
2008-09-01
In this paper, we consider the multi-product, multi-constraint newsboy problem with random fuzzy demands and total discount. Product demand is often stochastic in the real world, but the parameters of the distribution function may be estimated in a fuzzy manner, so a random fuzzy variable is an appropriate way to model product demand. The objective of the proposed model is to maximize the expected profit of the newsboy. We consider constraints such as warehouse space, restrictions on order quantities for products, and a budget restriction, and we also consider batch sizes for product orders. Finally, we introduce a random fuzzy multi-product multi-constraint newsboy problem (RFM-PM-CNP), which is transformed into a multi-objective mixed integer nonlinear programming model. Furthermore, a hybrid intelligent algorithm based on genetic algorithms, Pareto and TOPSIS is presented for the developed model. An illustrative example shows the performance of the developed model and algorithm.
Approximate ground states of the random-field Potts model from graph cuts
NASA Astrophysics Data System (ADS)
Kumar, Manoj; Kumar, Ravinder; Weigel, Martin; Banerjee, Varsha; Janke, Wolfhard; Puri, Sanjay
2018-05-01
While the ground-state problem for the random-field Ising model is polynomial, and can be solved using a number of well-known algorithms for maximum flow or graph cut, the analog random-field Potts model corresponds to a multiterminal flow problem that is known to be NP-hard. Hence an efficient exact algorithm is very unlikely to exist. As we show here, it is nevertheless possible to use an embedding of binary degrees of freedom into the Potts spins in combination with graph-cut methods to solve the corresponding ground-state problem approximately in polynomial time. We benchmark this heuristic algorithm using a set of quasiexact ground states found for small systems from long parallel tempering runs. For a not-too-large number q of Potts states, the method based on graph cuts finds the same solutions in a fraction of the time. We employ the new technique to analyze the breakup length of the random-field Potts model in two dimensions.
Coverage-maximization in networks under resource constraints.
Nandi, Subrata; Brusch, Lutz; Deutsch, Andreas; Ganguly, Niloy
2010-06-01
Efficient coverage algorithms are essential for information search or dispersal in all kinds of networks. We define an extended coverage problem which accounts for constrained resources of consumed bandwidth B and time T . Our solution to the network challenge is here studied for regular grids only. Using methods from statistical mechanics, we develop a coverage algorithm with proliferating message packets and temporally modulated proliferation rate. The algorithm performs as efficiently as a single random walker but O(B(d-2)/d) times faster, resulting in significant service speed-up on a regular grid of dimension d . The algorithm is numerically compared to a class of generalized proliferating random walk strategies and on regular grids shown to perform best in terms of the product metric of speed and efficiency.
Prediction of Baseflow Index of Catchments using Machine Learning Algorithms
NASA Astrophysics Data System (ADS)
Yadav, B.; Hatfield, K.
2017-12-01
We present the results of eight machine learning techniques for predicting the baseflow index (BFI) of ungauged basins using a surrogate of catchment-scale climate and physiographic data. The tested algorithms include ordinary least squares, ridge regression, least absolute shrinkage and selection operator (lasso), elastic net, support vector machine, gradient boosted regression trees, random forests, and extremely randomized trees. Our work seeks to identify the dominant controls of BFI that can be readily obtained from ancillary geospatial databases and remote sensing measurements, such that the developed techniques can be extended to ungauged catchments. More than 800 gauged catchments spanning the continental United States were selected to develop the general methodology. The BFI calculation was based on the baseflow separated from the daily streamflow hydrograph using the HYSEP filter. The surrogate catchment attributes were compiled from multiple sources, including a digital elevation model, soil, land use, and climate data, and other publicly available ancillary and geospatial data. 80% of the catchments were used to train the ML algorithms, and the remaining 20% were used as an independent test set to measure the generalization performance of the fitted models. A k-fold cross-validation using exhaustive grid search was used to fit the hyperparameters of each model. Initial model development was based on 19 independent variables, but after variable selection and feature ranking, we generated revised sparse models of BFI prediction based on only six catchment attributes. These key predictive variables, selected after careful evaluation of the bias-variance tradeoff, include average catchment elevation, slope, fraction of sand, permeability, temperature, and precipitation.
The most promising algorithms exceeding an accuracy score (r-square) of 0.7 on test data include support vector machine, gradient boosted regression trees, random forests, and extremely randomized trees. Considering both the accuracy and the computational complexity of these algorithms, we identify the extremely randomized trees as the best performing algorithm for BFI prediction in ungauged basins.
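The distinguishing idea of the best-performing method, extremely randomized trees, is its split chooser: instead of exhaustively searching thresholds, it draws a few fully random (feature, threshold) candidates and keeps the one with the largest variance reduction. A minimal single-split sketch follows; the function name and tiny dataset are invented for illustration.

```python
import random

def extra_tree_split(X, y, n_candidates=20, seed=0):
    """Pick the best of a few fully random (feature, threshold) splits,
    scored by variance reduction of the regression target."""
    rng = random.Random(seed)

    def variance(vals):
        if not vals:
            return 0.0
        mu = sum(vals) / len(vals)
        return sum((v - mu) ** 2 for v in vals) / len(vals)

    n, d = len(X), len(X[0])
    parent_var = variance(y)
    best = None
    for _ in range(n_candidates):
        f = rng.randrange(d)                       # random feature
        col = [row[f] for row in X]
        thr = rng.uniform(min(col), max(col))      # random threshold
        left = [y[i] for i in range(n) if X[i][f] < thr]
        right = [y[i] for i in range(n) if X[i][f] >= thr]
        score = parent_var - (len(left) * variance(left)
                              + len(right) * variance(right)) / n
        if best is None or score > best[0]:
            best = (score, f, thr)
    return best  # (variance reduction, feature index, threshold)

# Hypothetical data: feature 0 cleanly separates low and high targets
X = [[0.0, 5.0], [0.1, 2.0], [0.2, 9.0], [0.9, 1.0], [1.0, 7.0], [1.1, 3.0]]
y = [0.0, 0.0, 0.0, 10.0, 10.0, 10.0]
score, feat, thr = extra_tree_split(X, y)
```

Trees grown with this rule and then averaged in an ensemble trade a little per-tree accuracy for speed and variance reduction, which matches the accuracy/complexity trade-off the study reports.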
Yale, Jean-François; Berard, Lori; Groleau, Mélanie; Javadi, Pasha; Stewart, John; Harris, Stewart B
2017-10-01
It was uncertain whether an algorithm that involves increasing insulin dosages by 1 unit/day may cause more hypoglycemia with the longer-acting insulin glargine 300 units/mL (GLA-300). The objective of this study was to compare safety and efficacy of 2 titration algorithms, INSIGHT and EDITION, for GLA-300 in people with uncontrolled type 2 diabetes mellitus, mainly in a primary care setting. This was a 12-week, open-label, randomized, multicentre pilot study. Participants were randomly assigned to 1 of 2 algorithms: they either increased their dosage by 1 unit/day (INSIGHT, n=108) or the dose was adjusted by the investigator at least once weekly, but no more often than every 3 days (EDITION, n=104). The target fasting self-monitored blood glucose was in the range of 4.4 to 5.6 mmol/L. The percentages of participants reaching the primary endpoint of fasting self-monitored blood glucose ≤5.6 mmol/L without nocturnal hypoglycemia were 19.4% (INSIGHT) and 18.3% (EDITION). At week 12, 26.9% (INSIGHT) and 28.8% (EDITION) of participants achieved a glycated hemoglobin value of ≤7%. No differences in the incidence of hypoglycemia of any category were noted between algorithms. Participants in both arms of the study were much more satisfied with their new treatment as assessed by the Diabetes Treatment Satisfaction Questionnaire. Most health-care professionals (86%) preferred the INSIGHT over the EDITION algorithm. The frequency of adverse events was similar between algorithms. A patient-driven titration algorithm of 1 unit/day with GLA-300 is effective and comparable to the previously tested EDITION algorithm and is preferred by health-care professionals. Copyright © 2017 Diabetes Canada. Published by Elsevier Inc. All rights reserved.
Data transmission system and method
NASA Technical Reports Server (NTRS)
Bruck, Jehoshua (Inventor); Langberg, Michael (Inventor); Sprintson, Alexander (Inventor)
2010-01-01
A method of transmitting data packets, where randomness is added to the schedule. Universal broadcast schedules using encoding and randomization techniques are also discussed, together with optimal randomized schedules and an approximation algorithm for finding near-optimal schedules.
Greedy algorithms in disordered systems
NASA Astrophysics Data System (ADS)
Duxbury, P. M.; Dobrin, R.
1999-08-01
We discuss search, minimal path and minimal spanning tree algorithms and their applications to disordered systems. Greedy algorithms solve these problems exactly, and are related to extremal dynamics in physics. Minimal cost path (Dijkstra) and minimal cost spanning tree (Prim) algorithms provide extremal dynamics for a polymer in a random medium (the KPZ universality class) and invasion percolation (without trapping) respectively.
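The minimal-cost-path (Dijkstra) dynamics described above can be sketched in a few lines. The grid of random site costs below is a hypothetical stand-in for the random medium; this is an illustration, not code from the paper:

```python
import heapq
import random

def min_cost_path(weights):
    """Dijkstra's algorithm on an n x m grid of non-negative site costs.

    Returns the minimal total cost of a path from the top-left to the
    bottom-right site, moving in the four axis directions. The site
    costs play the role of the random-medium energies in the polymer
    analogy.
    """
    n, m = len(weights), len(weights[0])
    dist = [[float("inf")] * m for _ in range(n)]
    dist[0][0] = weights[0][0]
    heap = [(dist[0][0], 0, 0)]
    while heap:
        d, i, j = heapq.heappop(heap)
        if d > dist[i][j]:
            continue  # stale heap entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < m:
                nd = d + weights[ni][nj]
                if nd < dist[ni][nj]:
                    dist[ni][nj] = nd
                    heapq.heappush(heap, (nd, ni, nj))
    return dist[n - 1][m - 1]

random.seed(0)
grid = [[random.random() for _ in range(20)] for _ in range(20)]
print(min_cost_path(grid))
```

Replacing the priority queue's accumulated-cost key with the maximum edge weight along the path turns the same loop into the extremal (invasion-percolation-like) dynamics the abstract associates with Prim's algorithm.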
Thermodynamic cost of computation, algorithmic complexity and the information metric
NASA Technical Reports Server (NTRS)
Zurek, W. H.
1989-01-01
Algorithmic complexity is discussed as a computational counterpart to the second law of thermodynamics. It is shown that algorithmic complexity, which is a measure of randomness, sets limits on the thermodynamic cost of computations and casts a new light on the limitations of Maxwell's demon. Algorithmic complexity can also be used to define distance between binary strings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitley, L. Darrell; Howe, Adele E.; Watson, Jean-Paul
2004-09-01
Tabu search is one of the most effective heuristics for locating high-quality solutions to a diverse array of NP-hard combinatorial optimization problems. Despite the widespread success of tabu search, researchers have a poor understanding of many key theoretical aspects of this algorithm, including models of the high-level run-time dynamics and identification of those search space features that influence problem difficulty. We consider these questions in the context of the job-shop scheduling problem (JSP), a domain where tabu search algorithms have been shown to be remarkably effective. Previously, we demonstrated that the mean distance between random local optima and the nearest optimal solution is highly correlated with problem difficulty for a well-known tabu search algorithm for the JSP introduced by Taillard. In this paper, we discuss various shortcomings of this measure and develop a new model of problem difficulty that corrects these deficiencies. We show that Taillard's algorithm can be modeled with high fidelity as a simple variant of a straightforward random walk. The random walk model accounts for nearly all of the variability in the cost required to locate both optimal and sub-optimal solutions to random JSPs, and provides an explanation for differences in the difficulty of random versus structured JSPs. Finally, we discuss and empirically substantiate two novel predictions regarding tabu search algorithm behavior. First, the method for constructing the initial solution is highly unlikely to impact the performance of tabu search. Second, tabu tenure should be selected to be as small as possible while simultaneously avoiding search stagnation; values larger than necessary lead to significant degradations in performance.
González-Recio, O; Jiménez-Montero, J A; Alenda, R
2013-01-01
In the next few years, with the advent of high-density single nucleotide polymorphism (SNP) arrays and genome sequencing, genomic evaluation methods will need to deal with a large number of genetic variants and an increasing sample size. The boosting algorithm is a machine-learning technique that may alleviate the drawbacks of dealing with such large data sets. This algorithm combines different predictors in a sequential manner with some shrinkage on them; each predictor is applied consecutively to the residuals from the committee formed by the previous ones to form a final prediction based on a subset of covariates. Here, a detailed description is provided and examples using a toy data set are included. A modification of the algorithm called "random boosting" was proposed to increase predictive ability and decrease computation time of genome-assisted evaluation in large data sets. Random boosting uses a random selection of markers to add a subsequent weak learner to the predictive model. These modifications were applied to a real data set composed of 1,797 bulls genotyped for 39,714 SNP. Deregressed proofs of 4 yield traits and 1 type trait from January 2009 routine evaluations were used as dependent variables. A 2-fold cross-validation scenario was implemented. Sires born before 2005 were used as a training sample (1,576 and 1,562 for production and type traits, respectively), whereas younger sires were used as a testing sample to evaluate predictive ability of the algorithm on yet-to-be-observed phenotypes. Comparison with the original algorithm was provided. The predictive ability of the algorithm was measured as Pearson correlations between observed and predicted responses. Further, estimated bias was computed as the average difference between observed and predicted phenotypes. 
The results showed that the modification of the original boosting algorithm could be run in 1% of the time used by the original algorithm, with negligible differences in accuracy and bias. This modification may be used to speed the computation of genome-assisted evaluation in large data sets such as those obtained from consortia. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
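The random-boosting idea described above can be illustrated with a toy sketch, assuming Python, centered univariate least-squares fits as the weak learners, and random column subsets standing in for random SNP selection. This is not the authors' implementation, only a minimal instance of the sequential-residual-fitting-with-shrinkage scheme:

```python
import random

def random_boosting(X, y, rounds=100, subset=2, shrink=0.2, seed=0):
    """Toy random boosting: each round samples a random subset of the
    columns (markers), fits a centered univariate least-squares learner
    to the current residuals on each, keeps the best one with shrinkage,
    and updates the residuals. Returns the list of terms and the mean."""
    rng = random.Random(seed)
    n, p = len(X), len(X[0])
    mean_y = sum(y) / n
    resid = [yi - mean_y for yi in y]
    terms = []
    for _ in range(rounds):
        cols = rng.sample(range(p), min(subset, p))
        best = None
        for j in cols:
            xj = [row[j] for row in X]
            mx = sum(xj) / n
            cx = [v - mx for v in xj]
            sxx = sum(v * v for v in cx)
            if sxx == 0:
                continue  # constant column carries no signal
            b = sum(v * r for v, r in zip(cx, resid)) / sxx
            sse = sum((r - b * v) ** 2 for r, v in zip(resid, cx))
            if best is None or sse < best[0]:
                best = (sse, j, mx, b)
        if best is None:
            break
        _, j, mx, b = best
        b *= shrink  # shrinkage on each weak learner
        terms.append((j, mx, b))
        resid = [r - b * (row[j] - mx) for r, row in zip(resid, X)]
    return terms, mean_y

def predict(terms, mean_y, row):
    return mean_y + sum(b * (row[j] - mx) for j, mx, b in terms)
```

Because each round only touches a small random subset of the columns, the per-round cost is independent of the total number of markers, which is the source of the speed-up the abstract reports.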
Precise algorithm to generate random sequential adsorption of hard polygons at saturation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, G.
2018-04-30
Random sequential adsorption (RSA) is a time-dependent packing process, in which particles of certain shapes are randomly and sequentially placed into an empty space without overlap. In the infinite-time limit, the density approaches a "saturation" limit. Although this limit has attracted particular research interest, the majority of past studies could only probe it by extrapolation. We have previously found an algorithm to reach this limit in finite computational time for spherical particles, and could thus determine the saturation density of spheres with high accuracy. In this paper, we generalize this algorithm to generate saturated RSA packings of two-dimensional polygons. We also calculate the saturation density for regular polygons of three to ten sides, and obtain results that are consistent with previous, extrapolation-based studies.
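The RSA process itself is easy to simulate naively; the sketch below (equal disks in the unit square, plain rejection sampling) is the conventional finite-time simulation that the abstract contrasts with the authors' exact-saturation algorithm, and its density only approaches the saturation limit as the number of attempts grows:

```python
import math
import random

def rsa_disks(radius, attempts=20000, seed=1):
    """Naive random sequential adsorption of equal disks in the unit
    square (no periodic boundaries): propose uniform random centers and
    accept each one that does not overlap any previously accepted disk.
    Returns the number of accepted disks and the covered area."""
    rng = random.Random(seed)
    centers = []
    d2 = (2 * radius) ** 2  # squared minimum center-to-center distance
    for _ in range(attempts):
        x, y = rng.random(), rng.random()
        if all((x - cx) ** 2 + (y - cy) ** 2 >= d2 for cx, cy in centers):
            centers.append((x, y))
    return len(centers), len(centers) * math.pi * radius ** 2
```

As proposals accumulate, acceptances become exponentially rare, which is exactly why extrapolation was needed before an exact-saturation algorithm existed.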
Lin, Nan; Jiang, Junhai; Guo, Shicheng; Xiong, Momiao
2015-01-01
Due to the advancement in sensor technology, the growing large medical image data have the ability to visualize the anatomical changes in biological tissues. As a consequence, the medical images have the potential to enhance the diagnosis of disease, the prediction of clinical outcomes and the characterization of disease progression. But in the meantime, the growing data dimensions pose great methodological and computational challenges for the representation and selection of features in image cluster analysis. To address these challenges, we first extend the functional principal component analysis (FPCA) from one dimension to two dimensions to fully capture the spatial variation of the image signals. The image signals contain a large number of redundant features which provide no additional information for clustering analysis. The widely used methods for removing the irrelevant features are sparse clustering algorithms using a lasso-type penalty to select the features. However, the accuracy of clustering using a lasso-type penalty depends on the selection of the penalty parameters and the threshold value. In practice, they are difficult to determine. Recently, randomized algorithms have received a great deal of attention in big data analysis. This paper presents a randomized algorithm for accurate feature selection in image clustering analysis. The proposed method is applied to both the liver and kidney cancer histology image data from the TCGA database. The results demonstrate that the randomized feature selection method coupled with functional principal component analysis substantially outperforms the current sparse clustering algorithms in image cluster analysis. PMID:26196383
1994-07-13
• Oh, where will it all end? Cash-strapped NHS hospitals and trusts are apparently coming up with ever more ingenious ways of making money. A Danish nurse, over here for the recent Euroquan conference, was charged £100 for a study visit around Stanmore Hospital, the national bone place. Good to know that in these days of European unity we still manage to rip off the foreigners. One can only hope the money will go to a good cause, like re-building the crumbling ruin.
JPRS Report, Near East & South Asia.
1988-02-11
the shop of Hasan al Houri, the butcher , and pamper themselves with gossip, between the hanging blocks of meat . JPRS-NEA-88-005 11 February 1988...efforts being made to reopen the PLO office in Beirut. [Answer] Things have not come to this yet. What we need to do first is to reshape relations...thought, and he made his choice. At a Nasirite celebration at the Lawyers Associ- ation in January 1984 he declared for the first time that "destiny
Predictors of satisfaction in geographically close and long-distance relationships.
Lee, Ji-yeon; Pistole, M Carole
2012-04-01
In this study, the authors examined geographically close (GCRs) and long-distance (LDRs) romantic relationship satisfaction as explained by insecure attachment, self-disclosure, gossip, and idealization. After college student participants (N = 536) completed a Web survey, structural equation modeling (SEM) multigroup analysis revealed that the GCR and LDR models were nonequivalent, as expected. Self-disclosure mediated the insecure attachment-idealization path differently in GCRs and in LDRs. Self-disclosure was positively associated with idealization in GCRs and negatively associated with idealization in LDRs, with the insecure attachment-idealization and the insecure attachment-satisfaction paths negative for both GCRs and LDRs. Furthermore, the insecure attachment-idealization path was stronger than the mediated path, especially for LDRs; the insecure attachment-satisfaction path was stronger than the mediation model for GCRs and LDRs. In other words, the GCR and LDR models differed despite some similarities. For both, with higher insecure (i.e., anxious and avoidant) attachment, the person discloses less to the partner, idealizes the partner less, and is less satisfied with the relationship. Also, people who idealize are more satisfied. In contrast, in LDRs only, with higher insecure attachment, the people tend to gossip more. With higher insecure attachment and with higher self-disclosure, people idealize more in GCRs but idealize less in LDRs. Overall, attachment insecurity explained more idealization and satisfaction in LDRs than in GCRs. Implications are discussed.
Evaluating progressive-rendering algorithms in appearance design tasks.
Jiawei Ou; Karlik, Ondrej; Křivánek, Jaroslav; Pellacini, Fabio
2013-01-01
Progressive rendering is becoming a popular alternative to precomputational approaches to appearance design. However, progressive algorithms create images exhibiting visual artifacts at early stages. A user study investigated these artifacts' effects on user performance in appearance design tasks. Novice and expert subjects performed lighting and material editing tasks with four algorithms: random path tracing, quasirandom path tracing, progressive photon mapping, and virtual-point-light rendering. Both the novices and experts strongly preferred path tracing to progressive photon mapping and virtual-point-light rendering. None of the participants preferred random path tracing to quasirandom path tracing or vice versa; the same situation held between progressive photon mapping and virtual-point-light rendering. The user workflow didn’t differ significantly with the four algorithms. The Web Extras include a video showing how four progressive-rendering algorithms converged (at http://youtu.be/ck-Gevl1e9s), the source code used, and other supplementary materials.
Escalated convergent artificial bee colony
NASA Astrophysics Data System (ADS)
Jadon, Shimpi Singh; Bansal, Jagdish Chand; Tiwari, Ritu
2016-03-01
Artificial bee colony (ABC) optimisation algorithm is a recent, fast and easy-to-implement population-based meta heuristic for optimisation. ABC has proved competitive with popular swarm intelligence-based algorithms such as particle swarm optimisation, the firefly algorithm and ant colony optimisation. The solution search equation of ABC is influenced by a random quantity which helps its search process in exploration at the cost of exploitation. In order to obtain a fast convergent behaviour of ABC while maintaining exploitation capability, in this paper basic ABC is modified in two ways. First, to improve exploitation capability, two local search strategies, namely classical unidimensional local search and levy flight random walk-based local search, are incorporated with ABC. Furthermore, a new solution search strategy, namely stochastic diffusion scout search, is proposed and incorporated into the scout bee phase to give abandoned solutions more chances to improve. Efficiency of the proposed algorithm is tested on 20 benchmark test functions of different complexities and characteristics. Results are very promising and prove it to be a competitive algorithm in the field of swarm intelligence-based algorithms.
Query construction, entropy, and generalization in neural-network models
NASA Astrophysics Data System (ADS)
Sollich, Peter
1994-05-01
We study query construction algorithms, which aim at improving the generalization ability of systems that learn from examples by choosing optimal, nonredundant training sets. We set up a general probabilistic framework for deriving such algorithms from the requirement of optimizing a suitable objective function; specifically, we consider the objective functions entropy (or information gain) and generalization error. For two learning scenarios, the high-low game and the linear perceptron, we evaluate the generalization performance obtained by applying the corresponding query construction algorithms and compare it to training on random examples. We find qualitative differences between the two scenarios due to the different structure of the underlying rules (nonlinear and "noninvertible" versus linear); in particular, for the linear perceptron, random examples lead to the same generalization ability as a sequence of queries in the limit of an infinite number of examples. We also investigate learning algorithms which are ill matched to the learning environment and find that, in this case, minimum entropy queries can in fact yield a lower generalization ability than random examples. Finally, we study the efficiency of single queries and its dependence on the learning history, i.e., on whether the previous training examples were generated randomly or by querying, and the difference between globally and locally optimal query construction.
Spectral turning bands for efficient Gaussian random fields generation on GPUs and accelerators
NASA Astrophysics Data System (ADS)
Hunger, L.; Cosenza, B.; Kimeswenger, S.; Fahringer, T.
2015-11-01
A random field (RF) is a set of correlated random variables associated with different spatial locations. RF generation algorithms are of crucial importance for many scientific areas, such as astrophysics, geostatistics, computer graphics, and many others. Current approaches commonly make use of 3D fast Fourier transform (FFT), which does not scale well for RF bigger than the available memory; they are also limited to regular rectilinear meshes. We introduce random field generation with the turning band method (RAFT), an RF generation algorithm based on the turning band method that is optimized for massively parallel hardware such as GPUs and accelerators. Our algorithm replaces the 3D FFT with a lower-order, one-dimensional FFT followed by a projection step and is further optimized with loop unrolling and blocking. RAFT can easily generate RF on non-regular (non-uniform) meshes and efficiently produce fields with mesh sizes bigger than the available device memory by using a streaming, out-of-core approach. Our algorithm generates RF with the correct statistical behavior and is tested on a variety of modern hardware, such as NVIDIA Tesla, AMD FirePro and Intel Phi. RAFT is faster than the traditional methods on regular meshes and has been successfully applied to two real case scenarios: planetary nebulae and cosmological simulations.
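For orientation, the FFT baseline that turning bands improves on can be sketched in one dimension: color complex white noise with the square root of a power spectrum and transform back. This is an illustrative sketch (squared-exponential spectrum assumed), not the RAFT algorithm itself, which replaces the full-dimensional FFT with many 1-D FFTs along bands plus a projection step:

```python
import numpy as np

def gaussian_field_1d(n, corr_len, seed=0):
    """Generate a stationary 1-D Gaussian random field on a periodic
    grid by the spectral (FFT) method: scale complex white noise by the
    square root of a squared-exponential power spectrum, then inverse
    transform. The spectrum is normalized so the field has unit
    variance."""
    rng = np.random.default_rng(seed)
    k = np.fft.fftfreq(n)                                  # cycles/sample
    spectrum = np.exp(-0.5 * (2 * np.pi * k * corr_len) ** 2)
    spectrum /= spectrum.sum()                             # unit variance
    noise = rng.normal(size=n) + 1j * rng.normal(size=n)   # white noise
    field = np.fft.ifft(np.sqrt(spectrum) * noise) * n
    return field.real
```

The memory limitation the abstract mentions is visible even here: the full-dimensional analogue of this method must hold the entire spectral array in memory at once, which is what the streaming, out-of-core turning-band approach avoids.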
Deception Undermines the Stability of Cooperation in Games of Indirect Reciprocity.
Számadó, Szabolcs; Szalai, Ferenc; Scheuring, István
2016-01-01
Indirect reciprocity is often claimed as one of the key mechanisms of human cooperation. It works only if there is a reputational score keeping and each individual can inform with high probability which other individuals were good or bad in the previous round. Gossip is often proposed as a mechanism that can maintain such coherence of reputations in the face of errors of transmission. Random errors, however, are not the only source of uncertainty in such situations. The possibility of deceptive communication, where the signallers aim to misinform the receiver cannot be excluded. While there is plenty of evidence for deceptive communication in humans the possibility of deception is not yet incorporated into models of indirect reciprocity. Here we show that when deceptive strategies are allowed in the population it will cause the collapse of the coherence of reputations and thus in turn it results the collapse of cooperation. This collapse is independent of the norms and the cost and benefit values. It is due to the fact that there is no selection for honest communication in the framework of indirect reciprocity. It follows that indirect reciprocity can be only proposed plausibly as a mechanism of human cooperation if additional mechanisms are specified in the model that maintains honesty.
Subjective randomness as statistical inference.
Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B
2018-06-01
Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
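A minimal instance of "randomness as statistical inference" can be written down directly for coin sequences: compare the likelihood under a fair coin against the Beta-Binomial marginal likelihood of a coin with unknown bias. The richer regular-process classes in the paper go well beyond this two-model comparison; the sketch only shows the inferential framing:

```python
import math

def randomness_score(seq):
    """Evidence (in bits) that a binary H/T sequence came from a fair
    coin rather than a coin of unknown bias (uniform prior on the bias):
    log2 P(seq | fair) - log2 P(seq | biased), where the biased-coin
    marginal likelihood is h! t! / (n + 1)! by the Beta-Binomial
    integral. Positive scores read as 'looks random'."""
    h = seq.count("H")
    t = len(seq) - h
    n = h + t
    log2_fair = -float(n)  # each flip has probability 1/2
    log2_biased = (math.lgamma(h + 1) + math.lgamma(t + 1)
                   - math.lgamma(n + 2)) / math.log(2)
    return log2_fair - log2_biased

print(randomness_score("HHHHHHHH"))  # negative: eight heads looks nonrandom
print(randomness_score("HHTHTTHT"))  # positive: consistent with a fair coin
```

This already reproduces the intuition in the abstract's opening example: a run of eight heads provides strong evidence against the fair-coin hypothesis, while a balanced sequence does not.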
Ensemble-type numerical uncertainty information from single model integrations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rauser, Florian, E-mail: florian.rauser@mpimet.mpg.de; Marotzke, Jochem; Korn, Peter
2015-07-01
We suggest an algorithm that quantifies the discretization error of time-dependent physical quantities of interest (goals) for numerical models of geophysical fluid dynamics. The goal discretization error is estimated using a sum of weighted local discretization errors. The key feature of our algorithm is that these local discretization errors are interpreted as realizations of a random process. The random process is determined by the model and the flow state. From a class of local error random processes we select a suitable specific random process by integrating the model over a short time interval at different resolutions. The weights of the influences of the local discretization errors on the goal are modeled as goal sensitivities, which are calculated via automatic differentiation. The integration of the weighted realizations of local error random processes yields a posterior ensemble of goal approximations from a single run of the numerical model. From the posterior ensemble we derive the uncertainty information of the goal discretization error. This algorithm bypasses the requirement of detailed knowledge about the model's discretization to generate numerical error estimates. The algorithm is evaluated for the spherical shallow-water equations. For two standard test cases we successfully estimate the error of regional potential energy, track its evolution, and compare it to standard ensemble techniques. The posterior ensemble shares linear-error-growth properties with ensembles of multiple model integrations when comparably perturbed. The posterior ensemble numerical error estimates are of comparable size as those of a stochastic physics ensemble.
Toward a Principled Sampling Theory for Quasi-Orders
Ünlü, Ali; Schrepp, Martin
2016-01-01
Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601
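For contrast with the principled sampler above, the naive approach is to draw a random relation edgewise and take its reflexive-transitive closure. The sketch below is exactly the kind of simple but biased baseline the abstract argues against (the closure step inflates dense relations), shown only to make the bias problem concrete:

```python
import random
from itertools import product

def random_quasi_order(n, p=0.3, seed=0):
    """Naive quasi-order sampler: include each off-diagonal pair with
    probability p, force reflexivity, then take the transitive closure
    with Warshall's algorithm. The result is always a quasi-order, but
    the sampling distribution over quasi-orders is biased."""
    rng = random.Random(seed)
    rel = [[i == j or rng.random() < p for j in range(n)]
           for i in range(n)]
    for k, i, j in product(range(n), repeat=3):  # Warshall: k outermost
        rel[i][j] = rel[i][j] or (rel[i][k] and rel[k][j])
    return rel

def is_quasi_order(rel):
    """Check reflexivity and transitivity of a boolean relation matrix."""
    n = len(rel)
    refl = all(rel[i][i] for i in range(n))
    trans = all(rel[i][j] or not (rel[i][k] and rel[k][j])
                for i, j, k in product(range(n), repeat=3))
    return refl and trans
```

Every output passes `is_quasi_order`, but the closure maps many different raw relations to the same dense quasi-order, which is the representativeness failure the paper's doubly inductive procedure is designed to avoid.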
Spiral bacterial foraging optimization method: Algorithm, evaluation and convergence analysis
NASA Astrophysics Data System (ADS)
Kasaiezadeh, Alireza; Khajepour, Amir; Waslander, Steven L.
2014-04-01
A biologically-inspired algorithm called Spiral Bacterial Foraging Optimization (SBFO) is investigated in this article. SBFO, previously proposed by the same authors, is a multi-agent, gradient-based algorithm that minimizes both the main objective function (local cost) and the distance between each agent and a temporary central point (global cost). A random jump is included normal to the connecting line of each agent to the central point, which produces a vortex around the temporary central point. This random jump is also suitable to cope with premature convergence, which is a feature of swarm-based optimization methods. The most important advantages of this algorithm are as follows: First, this algorithm involves a stochastic type of search with a deterministic convergence. Second, as gradient-based methods are employed, faster convergence is demonstrated over GA, DE, BFO, etc. Third, the algorithm can be implemented in a parallel fashion in order to decentralize large-scale computation. Fourth, the algorithm has a limited number of tunable parameters, and finally SBFO has a strong certainty of convergence which is rare in existing global optimization algorithms. A detailed convergence analysis of SBFO for continuously differentiable objective functions has also been investigated in this article.
NASA Astrophysics Data System (ADS)
Moslemipour, Ghorbanali
2018-07-01
This paper aims at proposing a quadratic assignment-based mathematical model to deal with the stochastic dynamic facility layout problem. In this problem, product demands are assumed to be dependent normally distributed random variables with known probability density function and covariance that change from period to period at random. To solve the proposed model, a novel hybrid intelligent algorithm is proposed by combining the simulated annealing and clonal selection algorithms. The proposed model and the hybrid algorithm are verified and validated using design of experiment and benchmark methods. The results show that the hybrid algorithm has an outstanding performance from both solution quality and computational time points of view. Besides, the proposed model can be used in both of the stochastic and deterministic situations.
NASA Astrophysics Data System (ADS)
Nurdiyanto, Heri; Rahim, Robbi; Wulan, Nur
2017-12-01
Symmetric-key cryptographic algorithms are known to have more weaknesses in the encryption process than asymmetric algorithms. A symmetric stream cipher works by XORing the plaintext with a key stream; to improve its security, we improvise on the basic scheme using a Triple Transposition Key, developed from the transposition cipher, and apply the Base64 algorithm as the final encryption step. Experiments show that the resulting ciphertext is of good quality and highly random.
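The abstract does not spell out the Triple Transposition Key construction, so the sketch below is an assumption-laden illustration only: a columnar transposition applied to the key three times, an XOR key stream, and Base64 as the final step. The `cols` parameter and the specific transposition layout are hypothetical, and a toy cipher like this is of course not secure for real use:

```python
import base64
from itertools import cycle

def transpose_key(key, cols):
    """One columnar transposition of the key bytes: read the key column
    by column as if written into `cols` columns row by row. (Hypothetical
    stand-in for one pass of the 'Triple Transposition Key' step.)"""
    return bytes(key[i] for c in range(cols)
                 for i in range(c, len(key), cols))

def _keystream_base(key, cols):
    # apply the transposition three times, per the "triple" in the name
    for _ in range(3):
        key = transpose_key(key, cols)
    return key

def encrypt(plaintext, key, cols=3):
    k = _keystream_base(key, cols)
    xored = bytes(p ^ s for p, s in zip(plaintext, cycle(k)))
    return base64.b64encode(xored)  # Base64 as the final encoding step

def decrypt(ciphertext, key, cols=3):
    k = _keystream_base(key, cols)
    xored = base64.b64decode(ciphertext)
    return bytes(c ^ s for c, s in zip(xored, cycle(k)))
```

Because XOR is its own inverse and Base64 is reversible, decryption simply reverses the two steps with the same derived key.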
NASA Astrophysics Data System (ADS)
Vatankhah, Saeed; Renaut, Rosemary A.; Ardestani, Vahid E.
2018-04-01
We present a fast algorithm for the total variation regularization of the 3-D gravity inverse problem. Through imposition of the total variation regularization, subsurface structures presenting with sharp discontinuities are preserved better than when using a conventional minimum-structure inversion. The associated problem formulation for the regularization is nonlinear but can be solved using an iteratively reweighted least-squares algorithm. For small-scale problems the regularized least-squares problem at each iteration can be solved using the generalized singular value decomposition. This is not feasible for large-scale, or even moderate-scale, problems. Instead we introduce the use of a randomized generalized singular value decomposition in order to reduce the dimensions of the problem and provide an effective and efficient solution technique. For further efficiency an alternating direction algorithm is used to implement the total variation weighting operator within the iteratively reweighted least-squares algorithm. Presented results for synthetic examples demonstrate that the novel randomized decomposition provides good accuracy for reduced computational and memory demands as compared to use of classical approaches.
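The dimension-reduction idea behind the randomized decomposition can be illustrated with a plain randomized SVD (Halko-Martinsson-Tropp style range finder). The paper uses a randomized *generalized* SVD inside an iteratively reweighted least-squares loop; this sketch only shows the sketching-and-projection step that makes large problems tractable:

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, seed=0):
    """Basic randomized SVD: sketch A with a Gaussian test matrix,
    orthonormalize the sketch to get a basis Q for the (approximate)
    range of A, then take an exact SVD of the small projected matrix
    B = Q^T A and lift it back."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.normal(size=(n, rank + oversample))  # Gaussian sketch
    Q, _ = np.linalg.qr(A @ Omega)   # orthonormal basis for range(A @ Omega)
    B = Q.T @ A                      # small (rank+oversample) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank]
```

All expensive dense linear algebra happens on matrices with only `rank + oversample` columns or rows, which is why the approach scales where the classical generalized SVD does not.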
Combinatorial approximation algorithms for MAXCUT using random walks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seshadhri, Comandur; Kale, Satyen
We give the first combinatorial approximation algorithm for MaxCut that beats the trivial 0.5 factor by a constant. The main partitioning procedure is very intuitive, natural, and easily described. It essentially performs a number of random walks and aggregates the information to provide the partition. We can control the running time to get an approximation factor-running time tradeoff. We show that for any constant b > 1.5, there is an Õ(n^b) algorithm that outputs a (0.5 + δ)-approximation for MaxCut, where δ = δ(b) is some positive constant. One of the components of our algorithm is a weak local graph partitioning procedure that may be of independent interest. Given a starting vertex i and a conductance parameter φ, unless a random walk of length ℓ = O(log n) starting from i mixes rapidly (in terms of φ and ℓ), we can find a cut of conductance at most φ close to the vertex. The work done per vertex found in the cut is sublinear in n.
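The trivial 0.5 baseline the abstract improves on is worth seeing concretely: a uniformly random assignment cuts half the edges in expectation, and greedy single-vertex flips then reach a local optimum. This sketch is that baseline only, not the paper's random-walk algorithm:

```python
import random

def maxcut_local(adj, seed=0):
    """Random assignment plus greedy single-vertex flips for MaxCut on
    an adjacency-list graph. The random start cuts half the edges in
    expectation (the trivial 0.5 factor); each flip of a vertex with
    more same-side than cross-side neighbors strictly increases the
    cut, so the loop terminates at a local optimum."""
    rng = random.Random(seed)
    n = len(adj)
    side = [rng.randrange(2) for _ in range(n)]
    improved = True
    while improved:
        improved = False
        for v in range(n):
            same = sum(1 for u in adj[v] if side[u] == side[v])
            cross = len(adj[v]) - same
            if same > cross:
                side[v] ^= 1  # flipping v gains (same - cross) cut edges
                improved = True
    cut = sum(1 for v in range(n) for u in adj[v]
              if u > v and side[u] != side[v])
    return side, cut
```

At a local optimum every vertex has at least as many cross-side as same-side neighbors, so at least half of all edges are cut; beating that guarantee by a constant, combinatorially, is precisely the paper's contribution.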
Conditional Random Field-Based Offline Map Matching for Indoor Environments
Bataineh, Safaa; Bahillo, Alfonso; Díez, Luis Enrique; Onieva, Enrique; Bataineh, Ikram
2016-01-01
In this paper, we present an offline map matching technique designed for indoor localization systems based on conditional random fields (CRF). The proposed algorithm can refine the results of existing indoor localization systems and match them with the map, using loose coupling between the existing localization system and the proposed map matching technique. The purpose of this research is to investigate the efficiency of using the CRF technique in offline map matching problems for different scenarios and parameters. The algorithm was applied to several real and simulated trajectories of different lengths. The results were then refined and matched with the map using the CRF algorithm. PMID:27537892
Robust local search for spacecraft operations using adaptive noise
NASA Technical Reports Server (NTRS)
Fukunaga, Alex S.; Rabideau, Gregg; Chien, Steve
2004-01-01
Randomization is a standard technique for improving the performance of local search algorithms for constraint satisfaction. However, it is well known that local search algorithms are sensitive to the noise values selected. We investigate the use of an adaptive noise mechanism in an iterative repair-based planner/scheduler for spacecraft operations. Preliminary results indicate that adaptive noise makes the use of randomized repair moves safe and robust; that is, using adaptive noise makes it possible to consistently achieve performance comparable with the best tuned noise setting, without the need to manually tune the noise parameter.
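A minimal sketch of the kind of noisy repair-based local search discussed above, using min-conflicts on n-queens as a stand-in problem (the planner/scheduler domain is not reproduced here); the `noise` argument is exactly the parameter the adaptive mechanism would tune.

```python
import random

def min_conflicts_queens(n, noise=0.1, max_steps=20000, seed=0):
    """Noisy min-conflicts local search for n-queens: repeatedly pick a
    conflicted queen and, with probability 'noise', repair it randomly,
    otherwise greedily (move to a minimum-conflict column). The noise
    escapes local minima but must normally be tuned by hand."""
    rng = random.Random(seed)
    cols = [rng.randrange(n) for _ in range(n)]

    def conflicts(r, c):
        return sum(1 for r2 in range(n) if r2 != r and
                   (cols[r2] == c or abs(cols[r2] - c) == abs(r2 - r)))

    for _ in range(max_steps):
        bad = [r for r in range(n) if conflicts(r, cols[r]) > 0]
        if not bad:
            return cols                    # solved
        r = rng.choice(bad)
        if rng.random() < noise:
            cols[r] = rng.randrange(n)     # random (noisy) repair move
        else:
            cols[r] = min(range(n), key=lambda c: conflicts(r, c))
    return None
```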
Adaptive Electronic Camouflage Using Texture Synthesis
2012-04-01
algorithm begins by computing the GLCMs, G_IN and G_OUT, of the input image (e.g., image of local environment) and output image (randomly generated)...respectively. The algorithm randomly selects a pixel from the output image and cycles its gray-level through all values. For each value, G_OUT is updated...The value of the selected pixel is permanently changed to the gray-level value that minimizes the error between G_IN and G_OUT. Without selecting a
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
1990-01-01
While chaos arises only in nonlinear systems, standard linear time series models are nevertheless useful for analyzing data from chaotic processes. This paper introduces such a model, the chaotic moving average. This time-domain model is based on the theorem that any chaotic process can be represented as the convolution of a linear filter with an uncorrelated process called the chaotic innovation. A technique, minimum phase-volume deconvolution, is introduced to estimate the filter and innovation. The algorithm measures the quality of a model using the volume covered by the phase-portrait of the innovation process. Experiments on synthetic data demonstrate that the algorithm accurately recovers the parameters of simple chaotic processes. Though tailored for chaos, the algorithm can detect both chaos and randomness, distinguish them from each other, and separate them if both are present. It can also recover nonminimum-delay pulse shapes in non-Gaussian processes, both random and chaotic.
A partially reflecting random walk on spheres algorithm for electrical impedance tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maire, Sylvain, E-mail: maire@univ-tln.fr; Simon, Martin, E-mail: simon@math.uni-mainz.de
2015-12-15
In this work, we develop a probabilistic estimator for the voltage-to-current map arising in electrical impedance tomography. This novel so-called partially reflecting random walk on spheres estimator enables Monte Carlo methods to compute the voltage-to-current map in an embarrassingly parallel manner, which is an important issue with regard to the corresponding inverse problem. Our method uses the well-known random walk on spheres algorithm inside subdomains where the diffusion coefficient is constant and employs replacement techniques motivated by finite difference discretization to deal with both mixed boundary conditions and interface transmission conditions. We analyze the global bias and the variance of the new estimator both theoretically and experimentally. Subsequently, the variance of the new estimator is considerably reduced via a novel control variate conditional sampling technique which yields a highly efficient hybrid forward solver coupling probabilistic and deterministic algorithms.
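The basic walk-on-spheres step is easy to illustrate. The toy below solves the Laplace equation on the unit disk only, with a plain absorbing boundary; the paper's partial reflection, mixed boundary conditions and interface handling are not reproduced.

```python
import math
import random

def walk_on_spheres(x, y, boundary, eps=1e-3, n_walks=4000, seed=0):
    """Walk-on-spheres estimate of the harmonic function with boundary
    values 'boundary' at interior point (x, y) of the unit disk: from the
    current point, jump to a uniform point on the largest circle centered
    there that stays inside the domain; stop within eps of the boundary
    and score the boundary value there. Each walk is independent, which
    is what makes the method embarrassingly parallel."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        px, py = x, y
        while True:
            d = 1.0 - math.hypot(px, py)   # distance to the unit circle
            if d < eps:
                break
            theta = rng.uniform(0.0, 2.0 * math.pi)
            px += d * math.cos(theta)
            py += d * math.sin(theta)
        r = math.hypot(px, py)             # project onto the boundary
        total += boundary(px / r, py / r)
    return total / n_walks
```

As a sanity check, u(x, y) = x is harmonic, so the estimate at (0.3, 0) should be close to 0.3.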
NASA Astrophysics Data System (ADS)
Shi, Jing; Shi, Yunli; Tan, Jian; Zhu, Lei; Li, Hu
2018-02-01
Traditional power forecasting models cannot efficiently take various factors into account, nor identify the relevant factors. In this paper, mutual information from information theory and the random forests machine learning algorithm are introduced into medium- and long-term electricity demand prediction. Mutual information can identify the highly related factors based on the value of average mutual information between a variety of variables and electricity demand; different industries may be highly associated with different variables. The random forests algorithm is used to build a forecasting model for each industry according to its correlated factors. The electricity consumption data of Jiangsu Province is taken as a practical example, and the above methods are compared with methods that disregard mutual information and industry segmentation. The simulation results show that the above method is scientific, effective, and provides higher prediction accuracy.
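The factor-ranking step can be sketched with a plug-in estimate of average mutual information between two discrete samples; the variable names are illustrative, not from the paper.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) = sum p(x,y) log2(p(x,y)/(p(x)p(y)))
    for two discrete samples of equal length: the quantity used to rank
    candidate factors against electricity demand. Higher means a more
    informative factor; 0 means the empirical distributions factorize."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        # c/n is p(x,y); px[x]/n and py[y]/n are the marginals
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi
```

A variable identical to the target yields its full entropy (1 bit for a balanced binary variable); an independent one yields 0.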
Complex Langevin simulation of a random matrix model at nonzero chemical potential
NASA Astrophysics Data System (ADS)
Bloch, J.; Glesaaen, J.; Verbaarschot, J. J. M.; Zafeiropoulos, S.
2018-03-01
In this paper we test the complex Langevin algorithm for numerical simulations of a random matrix model of QCD with a first order phase transition to a phase of finite baryon density. We observe that a naive implementation of the algorithm leads to phase quenched results, which were also derived analytically in this article. We test several fixes for the convergence issues of the algorithm, in particular the method of gauge cooling, the shifted representation, the deformation technique and reweighted complex Langevin, but only the latter method reproduces the correct analytical results in the region where the quark mass is inside the domain of the eigenvalues. In order to shed more light on the issues of the methods we also apply them to a similar random matrix model with a milder sign problem and no phase transition, and in that case gauge cooling solves the convergence problems as was shown before in the literature.
NASA Astrophysics Data System (ADS)
Berahmand, Kamal; Bouyer, Asgarali
2018-03-01
Community detection is an essential approach for analyzing the structural and functional properties of complex networks. Although many community detection algorithms have been presented recently, most of them are weak or limited in different ways. The Label Propagation Algorithm (LPA) is a well-known and efficient community detection technique characterized by nearly linear running time and easy implementation. However, LPA has some significant problems, such as instability, randomness, and monster community formation. In this paper, an algorithm named node's label influence policy for label propagation (LP-LPA) is proposed for detecting community structures efficiently. LP-LPA measures a link strength value for edges and a label influence value for nodes in a new label propagation strategy that prefers strong links for initial node selection, avoids random behavior in tiebreak states, and uses an efficient update order and update rule. These procedures resolve the randomness issue of the original LPA and stabilize the discovered communities across all runs on the same network. Experiments on synthetic networks and a wide range of real-world social networks indicate that the proposed method achieves significant accuracy and high stability. Indeed, it can effectively solve the monster community problem when detecting communities in networks.
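For orientation, here is a minimal deterministic label propagation sketch: a fixed update order and a deterministic tiebreak stand in for the random choices that LP-LPA is designed to remove (this is not LP-LPA itself, which also weighs link strength and label influence).

```python
from collections import Counter

def label_propagation(adj, max_iters=100):
    """Minimal asynchronous label propagation: each node adopts the most
    frequent label among its neighbors, with a fixed sorted update order
    and a deterministic largest-label tiebreak in place of LPA's usual
    random ones, so repeated runs give identical communities.
    'adj' maps node -> list of neighbors."""
    labels = {v: v for v in adj}
    for _ in range(max_iters):
        changed = False
        for v in sorted(adj):              # fixed update order
            counts = Counter(labels[u] for u in adj[v])
            best = max(counts.values())
            # deterministic tiebreak among the most frequent labels
            new = max(l for l, c in counts.items() if c == best)
            if new != labels[v]:
                labels[v] = new
                changed = True
        if not changed:
            break
    return labels
```

On two 4-cliques joined by a single bridge edge, the sketch recovers the two cliques as separate communities.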
Randomized algorithms for high quality treatment planning in volumetric modulated arc therapy
NASA Astrophysics Data System (ADS)
Yang, Yu; Dong, Bin; Wen, Zaiwen
2017-02-01
In recent years, volumetric modulated arc therapy (VMAT) has become an increasingly important radiation technique widely used in clinical cancer treatment. One of the key problems in VMAT is treatment plan optimization, which is complicated by the constraints imposed by the equipment involved. In this paper, we consider a model with four major constraints: the bound on the beam intensity, an upper bound on the rate of change of the beam intensity, the moving speed of the leaves of the multi-leaf collimator (MLC), and its directional convexity. We solve the model by a two-stage algorithm: performing minimization with respect to the aperture shapes and the beam intensities alternately. Specifically, the aperture shapes are obtained by a greedy algorithm whose performance is enhanced by random sampling in the leaf pairs with a decremental rate. The beam intensity is optimized using a gradient projection method with non-monotone line search. We further improve the proposed algorithm by an incremental random importance sampling of the voxels to reduce the computational cost of the energy functional. Numerical simulations on two clinical cancer data sets demonstrate that our method is highly competitive with the state-of-the-art algorithms in terms of both computational time and quality of treatment planning.
A Hybrid Search Algorithm for Swarm Robots Searching in an Unknown Environment
Li, Shoutao; Li, Lina; Lee, Gordon; Zhang, Hao
2014-01-01
This paper proposes a novel method to improve the efficiency of a swarm of robots searching in an unknown environment. The approach focuses on the process of feeding and individual coordination characteristics inspired by the foraging behavior in nature. A predatory strategy was used for searching; hence, this hybrid approach integrated a random search technique with a dynamic particle swarm optimization (DPSO) search algorithm. If a search robot could not find any target information, it used a random search algorithm for a global search. If the robot found any target information in a region, the DPSO search algorithm was used for a local search. This particle swarm optimization search algorithm is dynamic as all the parameters in the algorithm are refreshed synchronously through a communication mechanism until the robots find the target position, after which, the robots fall back to a random searching mode. Thus, in this searching strategy, the robots alternated between two searching algorithms until the whole area was covered. During the searching process, the robots used a local communication mechanism to share map information and DPSO parameters to reduce the communication burden and overcome hardware limitations. If the search area is very large, search efficiency may be greatly reduced if only one robot searches an entire region given the limited resources available and time constraints. In this research we divided the entire search area into several subregions, selected a target utility function to determine which subregion should be initially searched and thereby reduced the residence time of the target to improve search efficiency. PMID:25386855
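A hedged sketch of the local-search half of the hybrid strategy: one particle swarm update step minimizing distance to a sensed target. The particle layout and parameter values are illustrative; the paper's DPSO additionally refreshes parameters over the robots' communication mechanism.

```python
import random

def pso_step(swarm, fitness, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One particle swarm update: each particle's velocity blends inertia,
    attraction to its personal best, and attraction to the global best;
    positions then move by the new velocity. Each particle is a dict with
    'pos', 'vel' and 'best' (personal best). Returns the updated global
    best. In the paper's hybrid scheme a robot would run steps like this
    only after sensing target information, else fall back to random search."""
    rng = rng or random.Random(0)
    for p in swarm:
        for j in range(len(p['pos'])):
            r1, r2 = rng.random(), rng.random()
            p['vel'][j] = (w * p['vel'][j]
                           + c1 * r1 * (p['best'][j] - p['pos'][j])
                           + c2 * r2 * (gbest[j] - p['pos'][j]))
            p['pos'][j] += p['vel'][j]
        if fitness(p['pos']) < fitness(p['best']):
            p['best'] = list(p['pos'])
    return min((p['best'] for p in swarm), key=fitness)
```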
Astronomers gossip about the (cosmic) neighborhood.
Jayawardhana, R
1994-09-09
The Hague, Netherlands, last month welcomed 2000 astronomers from around the world for the 22nd General Assembly of the International Astronomical Union (IAU). From 15 to 27 August, they participated in symposia and discussions on topics ranging from the down-to-Earth issue of light and radio-frequency pollution to the creation of elements at the farthest reaches of time and space, in the big bang. Some of the most striking news, however, came in new findings from our galaxy and its immediate surroundings.
Confidential student information in nursing education.
Morgan, J E
2001-01-01
Nurse educators frequently know intriguing personal information about students and must decide whether to share such information with colleagues. While sharing with colleagues is sometimes necessary, often it is not. Discussing stories about students may be an effective stress-relieving strategy for faculty, but stress reduction must not be achieved at the expense of ethical behavior. The author explores the fine line between gossip and collegial discourse that focuses on educational goals, considers whether a separate code of ethics for nurse educators is needed, and offers recommendations for action.
Nidheesh, N; Abdul Nazeer, K A; Ameer, P M
2017-12-01
Clustering algorithms with steps involving randomness usually give different results on different executions for the same dataset. This non-deterministic nature of algorithms such as the K-Means clustering algorithm limits their applicability in areas such as cancer subtype prediction using gene expression data. It is hard to sensibly compare the results of such algorithms with those of other algorithms. The non-deterministic nature of K-Means is due to its random selection of data points as initial centroids. We propose an improved, density based version of K-Means, which involves a novel and systematic method for selecting initial centroids. The key idea of the algorithm is to select data points which belong to dense regions and which are adequately separated in feature space as the initial centroids. We compared the proposed algorithm to a set of eleven widely used single clustering algorithms and a prominent ensemble clustering algorithm which is being used for cancer data classification, based on the performances on a set of datasets comprising ten cancer gene expression datasets. The proposed algorithm has shown better overall performance than the others. There is a pressing need in the Biomedical domain for simple, easy-to-use and more accurate Machine Learning tools for cancer subtype prediction. The proposed algorithm is simple, easy-to-use and gives stable results. Moreover, it provides comparatively better predictions of cancer subtypes from gene expression data. Copyright © 2017 Elsevier Ltd. All rights reserved.
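The initialization idea can be sketched directly: rank points by local density and pick well-separated dense points as centroids. The neighborhood-radius rule below is an illustrative stand-in for the paper's exact density and separation criteria.

```python
import math

def density_init(points, k, radius):
    """Deterministic density-based centroid seeding for K-Means: count
    neighbors within 'radius' as each point's density, take the densest
    point as the first centroid, then repeatedly take the densest
    remaining point at least 'radius' away from every chosen centroid.
    Unlike random seeding, every run returns the same centroids."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    density = [sum(1 for q in points if dist(p, q) <= radius)
               for p in points]
    order = sorted(range(len(points)), key=lambda i: -density[i])
    centroids = [points[order[0]]]
    for i in order[1:]:
        if len(centroids) == k:
            break
        if all(dist(points[i], c) > radius for c in centroids):
            centroids.append(points[i])
    return centroids
```

Running standard K-Means from these seeds then inherits their determinism, which is the stability property the paper targets for cancer subtype prediction.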
Development of PET projection data correction algorithm
NASA Astrophysics Data System (ADS)
Bazhanov, P. V.; Kotina, E. D.
2017-12-01
Positron emission tomography is a modern nuclear medicine method used to examine metabolism and the function of internal organs. This method allows diseases to be diagnosed at their early stages. Mathematical algorithms are widely used not only for image reconstruction but also for PET data correction. In this paper, the implementation of random coincidence and scatter correction algorithms is considered, as well as an algorithm for modeling PET projection data acquisition to verify the corrections.
Hu, Chen; Steingrimsson, Jon Arni
2018-01-01
A crucial component of making individualized treatment decisions is to accurately predict each patient's disease risk. In clinical oncology, disease risks are often measured through time-to-event data, such as overall survival and progression/recurrence-free survival, and are often subject to censoring. Risk prediction models based on recursive partitioning methods are becoming increasingly popular largely due to their ability to handle nonlinear relationships, higher-order interactions, and/or high-dimensional covariates. The most popular recursive partitioning methods are versions of the Classification and Regression Tree (CART) algorithm, which builds a simple interpretable tree structured model. With the aim of increasing prediction accuracy, the random forest algorithm averages multiple CART trees, creating a flexible risk prediction model. Risk prediction models used in clinical oncology commonly use both traditional demographic and tumor pathological factors as well as high-dimensional genetic markers and treatment parameters from multimodality treatments. In this article, we describe the most commonly used extensions of the CART and random forest algorithms to right-censored outcomes. We focus on how they differ from the methods for noncensored outcomes, and how the different splitting rules and methods for cost-complexity pruning impact these algorithms. We demonstrate these algorithms by analyzing a randomized Phase III clinical trial of breast cancer. We also conduct Monte Carlo simulations to compare the prediction accuracy of survival forests with more commonly used regression models under various scenarios. These simulation studies aim to evaluate how sensitive the prediction accuracy is to the underlying model specifications, the choice of tuning parameters, and the degrees of missing covariates.
A Sustainable City Planning Algorithm Based on TLBO and Local Search
NASA Astrophysics Data System (ADS)
Zhang, Ke; Lin, Li; Huang, Xuanxuan; Liu, Yiming; Zhang, Yonggang
2017-09-01
Nowadays, designing a city with more sustainable features has become a central problem in the field of social development, and it has provided a broad stage for the application of artificial intelligence theories and methods. Because sustainable city design is essentially a constrained optimization problem, extensively studied swarm intelligence algorithms are natural candidates for solving it. TLBO (Teaching-Learning-Based Optimization) is a new swarm intelligence algorithm. Its inspiration comes from the "teaching" and "learning" behavior of a classroom. The evolution of the population is realized by simulating the teacher's "teaching" and the students "learning" from each other, with the advantages of few parameters, efficiency, conceptual simplicity, and ease of implementation. It has been successfully applied to scheduling, planning, configuration, and other fields, achieving good results and attracting growing attention from artificial intelligence researchers. Based on the classical TLBO algorithm, we propose TLBO_LS, an algorithm combined with local search. We design and implement the random generation algorithm and evaluation model for the urban planning problem. Experiments on small and medium-sized randomly generated problems show that our proposed algorithm has clear advantages over the DE algorithm and the classical TLBO algorithm in terms of convergence speed and solution quality.
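For reference, one generation of the classical TLBO that TLBO_LS builds on can be sketched as follows (textbook teacher and learner phases with greedy acceptance; the paper's local search extension is not included).

```python
import random

def tlbo_step(pop, fitness, rng):
    """One TLBO generation, minimizing 'fitness'. Teacher phase: every
    learner moves toward the best solution relative to the class mean,
    with a random teaching factor of 1 or 2. Learner phase: each learner
    moves toward (or away from) a randomly chosen classmate. Greedy
    acceptance keeps only improving moves, so the best never worsens."""
    n, d = len(pop), len(pop[0])
    teacher = min(pop, key=fitness)
    mean = [sum(p[j] for p in pop) / n for j in range(d)]
    new_pop = []
    for p in pop:                               # teacher phase
        tf = rng.choice([1, 2])                 # teaching factor
        cand = [p[j] + rng.random() * (teacher[j] - tf * mean[j])
                for j in range(d)]
        new_pop.append(cand if fitness(cand) < fitness(p) else p)
    pop = new_pop
    out = []
    for p in pop:                               # learner phase
        q = pop[rng.randrange(n)]
        sign = 1.0 if fitness(p) < fitness(q) else -1.0
        cand = [p[j] + rng.random() * sign * (p[j] - q[j])
                for j in range(d)]
        out.append(cand if fitness(cand) < fitness(p) else p)
    return out
```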
NASA Technical Reports Server (NTRS)
Molusis, J. A.
1982-01-01
An on-line technique is presented for the identification of rotor blade modal damping and frequency from rotorcraft random response test data. The identification technique is based upon a recursive maximum likelihood (RML) algorithm, which is demonstrated to have excellent convergence characteristics in the presence of random measurement noise and random excitation. The RML technique requires virtually no user interaction, provides accurate confidence bands on the parameter estimates, and can be used for continuous monitoring of modal damping during wind tunnel or flight testing. Results are presented from simulated random response data which quantify the identified parameter convergence behavior for various levels of random excitation. The data length required for acceptable parameter accuracy is shown to depend upon the amplitude of the random response and the modal damping level. Random response amplitudes of 1.25 degrees to 0.05 degrees are investigated. The RML technique is applied to hingeless rotor test data. The inplane lag regressing mode is identified at different rotor speeds. The identification from the test data is compared with the simulation results and with other available estimates of frequency and damping.
An algorithm of adaptive scale object tracking in occlusion
NASA Astrophysics Data System (ADS)
Zhao, Congmei
2017-05-01
Although correlation filter-based trackers achieve competitive results in both accuracy and robustness, there are still problems in handling scale variations, object occlusion, fast motion, and so on. In this paper, a multi-scale kernel correlation filter algorithm based on a random fern detector is proposed. The tracking task is decomposed into target scale estimation and translation estimation. At the same time, Color Names features and HOG features are fused at the response level to further improve the overall tracking performance of the algorithm. In addition, an online random fern classifier is trained to re-acquire the target after it is lost. Comparisons with algorithms such as KCF, DSST, TLD, MIL, CT and CSK show that the proposed approach estimates the object state accurately and handles object occlusion effectively.
Near-optimal matrix recovery from random linear measurements.
Romanov, Elad; Gavish, Matan
2018-06-25
In matrix recovery from random linear measurements, one is interested in recovering an unknown M-by-N matrix X0 from n measurements y_i = <A_i, X0>, where each A_i is an M-by-N measurement matrix with i.i.d. random entries and n < MN. We present a matrix recovery algorithm, based on approximate message passing, which iteratively applies an optimal singular-value shrinker, a nonconvex nonlinearity tailored specifically for matrix estimation. Our algorithm typically converges exponentially fast, offering a significant speedup over previously suggested matrix recovery algorithms, such as iterative solvers for nuclear norm minimization (NNM). It is well known that there is a recovery tradeoff between the information content of the object X0 to be recovered (specifically, its matrix rank r) and the number of linear measurements n from which recovery is to be attempted. The precise tradeoff between r and n, beyond which recovery by a given algorithm becomes possible, traces the so-called phase transition curve of that algorithm in the (r, n) plane. The phase transition curve of our algorithm is noticeably better than that of NNM. Interestingly, it is close to the information-theoretic lower bound for the minimal number of measurements needed for matrix recovery, making it not only state of the art in terms of convergence rate, but also near optimal in terms of the matrices it successfully recovers. Copyright © 2018 the Author(s). Published by PNAS.
Probability machines: consistent probability estimation using nonparametric learning machines.
Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A
2012-01-01
Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
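The core claim, that a consistent nonparametric regressor of the 0/1 response estimates the conditional probability, can be illustrated with the nearest-neighbor family the paper covers:

```python
def knn_probability(train, query, k):
    """Probability-machine sketch: regress the 0/1 label on the features
    with k-nearest neighbors; the regression value at 'query' is an
    estimate of P(Y=1 | X = query). 'train' is a list of
    (feature_vector, label) pairs with labels in {0, 1}."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    nearest = sorted(train, key=lambda t: dist2(t[0], query))[:k]
    return sum(label for _, label in nearest) / k
```

A random forest used in regression mode plays the same role; both output a probability rather than only a class.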
Optimal Quantum Spatial Search on Random Temporal Networks
NASA Astrophysics Data System (ADS)
Chakraborty, Shantanav; Novo, Leonardo; Di Giorgio, Serena; Omar, Yasser
2017-12-01
To investigate the performance of quantum information tasks on networks whose topology changes in time, we study the spatial search algorithm by continuous-time quantum walk to find a marked node on a random temporal network. We consider a network of n nodes constituted by a time-ordered sequence of Erdős-Rényi random graphs G(n, p), where p is the probability that any two given nodes are connected: after every time interval τ, a new graph G(n, p) replaces the previous one. We prove analytically that, for any given p, there is always a range of values of τ for which the running time of the algorithm is optimal, i.e., O(√n), even when search on the individual static graphs constituting the temporal network is suboptimal. On the other hand, there are regimes of τ where the algorithm is suboptimal even when each of the underlying static graphs is sufficiently connected to perform optimal search on it. From this first study of quantum spatial search on a time-dependent network, it emerges that the nontrivial interplay between temporality and connectivity is key to the algorithmic performance. Moreover, our work can be extended to establish high-fidelity qubit transfer between any two nodes of the network. Overall, our findings show that one can exploit temporality to achieve optimal quantum information tasks on dynamical random networks.
A random forest algorithm for nowcasting of intense precipitation events
NASA Astrophysics Data System (ADS)
Das, Saurabh; Chakraborty, Rohit; Maitra, Animesh
2017-09-01
Automatic nowcasting of convective initiation and thunderstorms has potential applications in several sectors, including aviation planning and disaster management. In this paper, a random forest based machine learning algorithm is tested for nowcasting of convective rain with a ground-based radiometer. Brightness temperatures measured at 14 frequencies (7 frequencies in the 22-31 GHz band and 7 frequencies in the 51-58 GHz band) are utilized as the inputs of the model. The lower frequency band is associated with water vapor absorption, whereas the upper frequency band relates to oxygen absorption; hence, they provide information on the temperature and humidity of the atmosphere. The synthetic minority over-sampling technique is used to balance the data set, and 10-fold cross validation is used to assess the performance of the model. Results indicate that the random forest algorithm with a fixed alarm generation time of 30 min or 60 min performs quite well (probability of detection of all types of weather condition ~90%) with low false alarms. It is, however, also observed that reducing the alarm generation time improves the threat score significantly and also decreases false alarms. The proposed model is found to be very sensitive to boundary layer instability, as indicated by the variable importance measure. The study shows the suitability of a random forest algorithm for nowcasting applications utilizing a large number of input parameters from diverse sources, and it can be utilized in other forecasting problems.
Cai, Jia; Tang, Yi
2018-02-01
Canonical correlation analysis (CCA) is a powerful statistical tool for detecting the linear relationship between two sets of multivariate variables. A kernel generalization, namely kernel CCA, has been proposed to describe nonlinear relationships between two sets of variables. Although kernel CCA can achieve dimensionality reduction for high-dimensional feature selection problems, it also suffers from over-fitting. In this paper, we consider a new kernel CCA algorithm via the randomized Kaczmarz method. The main contributions of the paper are: (1) a new kernel CCA algorithm is developed; (2) theoretical convergence of the proposed algorithm is addressed by means of the scaled condition number; (3) a lower bound on the minimum number of iterations is presented. We test on both a synthetic dataset and several real-world datasets in cross-language document retrieval and content-based image retrieval to demonstrate the effectiveness of the proposed algorithm. Numerical results imply the performance and efficiency of the new algorithm, which is competitive with several state-of-the-art kernel CCA methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
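The randomized Kaczmarz building block is standard and easy to state on a plain linear system (the kernel CCA embedding is not reproduced here); rows below are sampled uniformly rather than proportionally to squared row norm.

```python
import random

def randomized_kaczmarz(A, b, iters=2000, seed=0):
    """Randomized Kaczmarz for a consistent linear system Ax = b:
    repeatedly pick a row at random and orthogonally project the current
    iterate onto the hyperplane that row defines. Converges linearly in
    expectation for consistent systems."""
    rng = random.Random(seed)
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        i = rng.randrange(m)
        row = A[i]
        # projection: x <- x + (b_i - <row, x>) / ||row||^2 * row
        r = b[i] - sum(row[j] * x[j] for j in range(n))
        norm2 = sum(v * v for v in row)
        for j in range(n):
            x[j] += r / norm2 * row[j]
    return x
```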
Stochastic characterization of phase detection algorithms in phase-shifting interferometry
Munteanu, Florin
2016-11-01
Phase-shifting interferometry (PSI) is the preferred non-contact method for profiling sub-nanometer surfaces. Based on monochromatic light interference, the method computes the surface profile from a set of interferograms collected at separate stepping positions. Errors in the estimated profile are introduced when these positions are not located correctly. In order to cope with this problem, various algorithms that minimize the effects of certain types of stepping errors (linear, sinusoidal, etc.) have been developed. Despite the relatively large number of algorithms suggested in the literature, there is no unified way of characterizing their performance when additional unaccounted random errors are present. Here, we suggest a procedure for quantifying the expected behavior of each algorithm in the presence of independent and identically distributed (i.i.d.) random stepping errors, which can occur in addition to the systematic errors for which the algorithm has been designed. As a result, the usefulness of this method derives from the fact that it can guide the selection of the best algorithm for specific measurement situations.
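As a concrete example of the phase detection algorithms being characterized, here is the classic four-step algorithm with nominal 90 degree steps; it is exact when the steps are perfect, and stepping errors of the kind the paper studies bias exactly this estimate.

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Classic four-step phase-shifting algorithm. With intensity frames
    I_k = A + B*cos(phi + (k-1)*pi/2) for k = 1..4, the differences give
    I4 - I2 = 2B*sin(phi) and I1 - I3 = 2B*cos(phi), so
    phi = atan2(I4 - I2, I1 - I3)."""
    return math.atan2(i4 - i2, i1 - i3)
```

Perturbing the nominal pi/2 steps with i.i.d. random offsets and observing the spread of the recovered phi is a minimal version of the stochastic characterization the paper proposes.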
Modal identification of structures from the responses and random decrement signatures
NASA Technical Reports Server (NTRS)
Brahim, S. R.; Goglia, G. L.
1977-01-01
The theory and application of a method which utilizes the free response of a structure to determine its vibration parameters are described. The time-domain free response is digitized and used in a digital computer program to determine the number of modes excited, the natural frequencies, the damping factors, and the modal vectors. The technique is applied to a complex generalized payload model previously tested using the sine-sweep method and analyzed by NASTRAN. Ten modes of the payload model are identified. For cases where a free-decay response is not readily available, an algorithm is developed to obtain the free responses of a structure from its random responses, due to some unknown or known random input or inputs, using the random decrement technique without changing the time correlation between signals. The algorithm is tested using random responses from a generalized payload model and from the space shuttle model.
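The random decrement idea, triggering on level crossings of a random response and averaging the segments that follow so the random part cancels, can be sketched as follows (the signal model, trigger level, and segment length are illustrative assumptions, not the payload data of the paper):

```python
import numpy as np

def random_decrement(x, threshold, seg_len):
    """Estimate a free-decay-like signature from a random response:
    average all segments that start where the signal up-crosses the
    trigger level, so uncorrelated random content averages out."""
    starts = [i for i in range(1, len(x) - seg_len)
              if x[i - 1] < threshold <= x[i]]
    segs = np.array([x[i:i + seg_len] for i in starts])
    return segs.mean(axis=0), len(starts)

# Toy response: a slowly decaying 0.5 Hz oscillation buried in noise.
rng = np.random.default_rng(1)
t = np.arange(0.0, 400.0, 0.05)
response = (np.exp(-0.005 * t) * np.cos(2 * np.pi * 0.5 * t)
            + 0.3 * rng.standard_normal(t.size))
signature, nseg = random_decrement(response, threshold=0.4, seg_len=200)
```

The averaged signature can then be fed to a time-domain modal identification routine as if it were a measured free decay.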
Dynamic defense and network randomization for computer systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavez, Adrian R.; Stout, William M. S.; Hamlet, Jason R.
The various technologies presented herein relate to determining that a network attack is taking place, and further to adjusting one or more network parameters such that the network becomes dynamically configured. A plurality of machine learning algorithms are configured to recognize an active attack pattern. Notification of the attack can be generated, and knowledge gained from the detected attack pattern can be utilized to improve the algorithms' ability to detect subsequent attack vectors. Further, network settings and application communications can be dynamically randomized, wherein artificial diversity converts control systems into moving targets that help mitigate the early reconnaissance stages of an attack. An attack based upon a known static address of a critical infrastructure network device can be mitigated by the dynamic randomization. Network parameters that can be randomized include IP addresses, application port numbers, the paths data packets navigate through the network, application randomization, etc.
A weighted ℓ1-minimization approach for sparse polynomial chaos expansions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peng, Ji; Hampton, Jerrad; Doostan, Alireza, E-mail: alireza.doostan@colorado.edu
2014-06-15
This work proposes a method for sparse polynomial chaos (PC) approximation of high-dimensional stochastic functions based on non-adapted random sampling. We modify the standard ℓ1-minimization algorithm, originally proposed in the context of compressive sampling, using a priori information about the decay of the PC coefficients, when available, and refer to the resulting algorithm as weighted ℓ1-minimization. We provide conditions under which we may guarantee recovery using this weighted scheme. Numerical tests are used to compare the weighted and non-weighted methods for the recovery of solutions to two differential equations with high-dimensional random inputs: a boundary value problem with a random elliptic operator and a 2-D thermally driven cavity flow with random boundary condition.
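The weighted ℓ1 idea can be sketched with iterative soft-thresholding, where a priori information enters as per-coordinate weights that shrink likely-active coefficients less (a toy overdetermined example; the matrix, weights, and penalty are illustrative assumptions, not the PC recovery setup of the paper):

```python
import numpy as np

def weighted_ista(A, b, w, lam=0.02, iters=4000):
    """Minimize 0.5*||Ax - b||^2 + lam * sum_i w_i |x_i| by iterative
    shrinkage: a gradient step on the smooth part, then per-coordinate
    soft-thresholding with weighted thresholds."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L for the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - step * (A.T @ (A @ x - b))
        x = np.sign(g) * np.maximum(np.abs(g) - step * lam * w, 0.0)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 80))
x_true = np.zeros(80)
x_true[[3, 17, 42]] = [2.0, -1.5, 1.0]
b = A @ x_true
w = np.ones(80)
w[[3, 17, 42]] = 0.1   # a priori: penalize likely-active coordinates less
x_hat = weighted_ista(A, b, w)
```

Smaller weights on coordinates believed to be active mimic the paper's use of coefficient-decay information to bias recovery toward the right support.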
On the synchronizability and detectability of random PPM sequences
NASA Technical Reports Server (NTRS)
Georghiades, Costas N.; Lin, Shu
1987-01-01
The problem of synchronization and detection of random pulse-position-modulation (PPM) sequences is investigated under the assumption of perfect slot synchronization. Maximum-likelihood PPM symbol synchronization and receiver algorithms are derived that make decisions based on both soft and hard data; these algorithms are seen to be easily implementable. Bounds derived on the symbol error probability and on the probability of false synchronization indicate the existence of a rather severe performance floor, which can easily be the limiting factor in overall system performance. The performance floor is inherent in the PPM format and random data and becomes more serious as the PPM alphabet size Q is increased. A way to eliminate the performance floor by inserting special PPM symbols in the random data stream is suggested.
Cluster-Based Multipolling Sequencing Algorithm for Collecting RFID Data in Wireless LANs
NASA Astrophysics Data System (ADS)
Choi, Woo-Yong; Chatterjee, Mainak
2015-03-01
With the growing use of RFID (Radio Frequency Identification), it is becoming important to devise ways to read RFID tags in real time. Access points (APs) of IEEE 802.11-based wireless Local Area Networks (LANs) are being integrated with RFID networks so that real-time RFID data can be collected efficiently. Several schemes, such as multipolling methods based on the dynamic search algorithm and random sequencing, have been proposed. However, as the number of RFID readers associated with an AP increases, it becomes difficult for the dynamic search algorithm to derive the multipolling sequence in real time. Though multipolling methods can eliminate the polling overhead, the performance of multipolling methods based on random sequencing still needs to be enhanced. To that end, we propose a real-time cluster-based multipolling sequencing algorithm that eliminates more than 90% of the polling overhead, particularly when the dynamic search algorithm fails to derive the multipolling sequence in real time.
Panda, Rashmi; Puhan, N B; Panda, Ganapati
2018-02-01
Accurate optic disc (OD) segmentation is an important step in obtaining cup-to-disc ratio-based glaucoma screening using fundus imaging. It is a challenging task because of the subtle OD boundary, blood vessel occlusion, and intensity inhomogeneity. In this Letter, the authors propose an improved version of the random walk algorithm for OD segmentation to tackle these challenges. The algorithm incorporates mean curvature and Gabor texture energy features to define a new composite weight function for computing the edge weights. Unlike deformable model-based OD segmentation techniques, the proposed algorithm is unaffected by curve initialisation and the local energy minima problem. The effectiveness of the proposed method is verified on DRIVE, DIARETDB1, DRISHTI-GS and MESSIDOR database images using performance measures such as mean absolute distance, overlapping ratio, dice coefficient, sensitivity, specificity and precision. The obtained OD segmentation results and quantitative performance measures show the robustness and superiority of the proposed algorithm in handling the complex challenges in OD segmentation.
A novel global Harmony Search method based on Ant Colony Optimisation algorithm
NASA Astrophysics Data System (ADS)
Fouad, Allouani; Boukhetala, Djamel; Boudjema, Fares; Zenger, Kai; Gao, Xiao-Zhi
2016-03-01
The Global-best Harmony Search (GHS) is a recently developed stochastic optimisation algorithm which hybridises the Harmony Search (HS) method with the swarm-intelligence concept of particle swarm optimisation (PSO) to enhance its performance. In this article, a new optimisation algorithm called GHSACO is developed by incorporating the GHS with the Ant Colony Optimisation algorithm (ACO). Our method introduces a novel improvisation process, which differs from that of the GHS in the following aspects: (i) a modified harmony memory (HM) representation and conception; (ii) the use of a global random switching mechanism to monitor the choice between the ACO and GHS; and (iii) an additional memory consideration selection rule using the ACO random proportional transition rule with a pheromone trail update mechanism. The proposed GHSACO algorithm has been applied to various benchmark functions and constrained optimisation problems. Simulation results demonstrate that it can find significantly better solutions than the original HS and some of its variants.
Design and implementation of encrypted and decrypted file system based on USBKey and hardware code
NASA Astrophysics Data System (ADS)
Wu, Kehe; Zhang, Yakun; Cui, Wenchao; Jiang, Ting
2017-05-01
To protect the privacy of sensitive data, an encrypted and decrypted file system based on a USBKey and hardware code is designed and implemented in this paper. The system uses the USBKey and a hardware code to authenticate the user. A random key is used to encrypt each file with a symmetric encryption algorithm, and the USBKey encrypts this random key with an asymmetric encryption algorithm. At the same time, the MD5 algorithm is used to calculate a hash of the file to verify its integrity. Experimental results show that large files can be encrypted and decrypted in a very short time. The system has high efficiency and ensures the security of documents.
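The overall scheme, a random per-file session key, symmetric encryption of the file, asymmetric wrapping of the key, and an MD5 digest for integrity, can be sketched with standard-library pieces only. The XOR keystream below is a deliberately toy stand-in for a real symmetric cipher, and the asymmetric wrapping step is only indicated in a comment; all names are illustrative assumptions:

```python
import hashlib
import secrets

def toy_stream(key: bytes, n: int) -> bytes:
    """Toy keystream from iterated SHA-256 -- NOT a real cipher,
    it merely stands in for the symmetric algorithm in the scheme."""
    out, block = b"", key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def encrypt_file(data: bytes):
    session_key = secrets.token_bytes(32)        # fresh random per-file key
    keystream = toy_stream(session_key, len(data))
    cipher = bytes(a ^ b for a, b in zip(data, keystream))
    digest = hashlib.md5(data).hexdigest()       # integrity check, as in the paper
    # In the real system the session key would now be wrapped with the
    # USBKey's asymmetric public key; here we simply return it.
    return cipher, session_key, digest

def decrypt_file(cipher, session_key, digest):
    keystream = toy_stream(session_key, len(cipher))
    data = bytes(a ^ b for a, b in zip(cipher, keystream))
    assert hashlib.md5(data).hexdigest() == digest, "integrity check failed"
    return data

msg = b"sensitive document contents"
c, k, d = encrypt_file(msg)
```

The structure (random symmetric key per file, asymmetrically wrapped, plus a digest) is the standard hybrid-encryption pattern the abstract describes.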
Adolescents' Responses to an Unintended Pregnancy in Ghana: A Qualitative Study.
Aziato, Lydia; Hindin, Michelle J; Maya, Ernest Tei; Manu, Abubakar; Amuasi, Susan Ama; Lawerh, Rachel Mahoe; Ankomah, Augustine
2016-12-01
To investigate the experiences and perceptions of adolescents who have experienced a recent pregnancy and undergone a termination of pregnancy. A vignette-based focus group approach was used to have adolescents reflect on scenarios that happen to others during an unwanted pregnancy. The study was conducted in public health facilities in the 3 major urban areas of Ghana: Accra, Kumasi, and Tamale. Adolescents, aged 10-19 years, who had a recent termination of pregnancy were recruited from public health facilities in the 3 sites. Fifteen focus groups were conducted and digitally recorded in English, Twi, Ga, and Dagbani. Recordings were transcribed and translated, and thematic analysis was used for the analysis. Adolescents reported that the characters in the vignettes would feel sadness, depression, and regret from an unintended pregnancy and that some male partners would "deny" the pregnancy or suggest an abortion. They suggested some parents would "be angry" and "sack" their children for becoming pregnant, while others would "support" them. Parents might send the pregnant girl to a distant friend or to grandparents until she delivers, to avoid shame and gossip. Health professionals might encourage the pregnant girl or insult/gossip about her. Adolescent unintended pregnancies in Ghana are met with a range of reactions, and these reactions influence the choices young women make about continuing or terminating a pregnancy. Copyright © 2016 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.
Pseudo-random dynamic address configuration (PRDAC) algorithm for mobile ad hoc networks
NASA Astrophysics Data System (ADS)
Wu, Shaochuan; Tan, Xuezhi
2007-11-01
By analyzing existing address configuration algorithms, this paper proposes a new pseudo-random dynamic address configuration (PRDAC) algorithm for mobile ad hoc networks. In PRDAC, the first node that initializes the network randomly chooses a nonlinear shift register that generates an m-sequence. When another node joins the network, the initial node acts as an IP address configuration server: it computes an IP address from this shift register, allocates the address, and tells the new node the generator polynomial of the register. In this way, any node that has obtained an IP address can subsequently act as a server and allocate addresses to newly joining nodes. PRDAC avoids IP conflicts and handles network partitioning and merging in the same way as prophet address (PA) allocation and the dynamic configuration and distribution protocol (DCDP). Furthermore, PRDAC has lower algorithmic and computational complexity than PA and relies on milder assumptions. In addition, PRDAC radically avoids address conflicts and maximizes the utilization of the IP address space. Analysis and simulation results show that PRDAC converges rapidly, has low overhead, and is insensitive to the topological structure of the network.
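PRDAC's core primitive, a shift register generating an m-sequence so that every nonzero state (and hence every derived address) appears exactly once per period, can be sketched with a linear-feedback register (the paper uses a nonlinear register; the taps, width, and address mapping below are illustrative assumptions):

```python
def msequence_states(taps, nbits, seed=1):
    """Enumerate the 2^nbits - 1 nonzero states of a Fibonacci LFSR
    whose feedback taps correspond to a primitive polynomial, so every
    nonzero state appears exactly once before the sequence repeats."""
    state, seen = seed, []
    for _ in range((1 << nbits) - 1):
        seen.append(state)
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = ((state << 1) | fb) & ((1 << nbits) - 1)
    return seen

# Taps [8, 6, 5, 4] (here 0-indexed as [7, 5, 4, 3]) are a classic
# maximal-length choice for 8 bits: the register visits all 255 nonzero
# states, so each joining node can be handed a distinct host address.
states = msequence_states(taps=[7, 5, 4, 3], nbits=8)
addrs = ["10.0.0.%d" % s for s in states]
```

Because the full nonzero state space is enumerated without repetition, address conflicts cannot occur until the sequence wraps, which is the property PRDAC exploits.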
Randomness Testing of the Advanced Encryption Standard Finalist Candidates
2000-03-28
...Random Excursions Variant ... Rank ... Serial ... Spectral DFT ... Lempel-Ziv Compression ... Aperiodic Templates ... Linear Complexity ... 256 bits) for each of the algorithms, for a total of 80 different data sets. These data sets were selected based on the belief that they would be useful in evaluating the randomness of cryptographic algorithms. Table 2 lists the eight data types. For a description of the data types, see Appendix
An efficient genetic algorithm for maximum coverage deployment in wireless sensor networks.
Yoon, Yourim; Kim, Yong-Hyuk
2013-10-01
Sensor networks have many applications, such as battlefield surveillance, environmental monitoring, and industrial diagnostics. Coverage is one of the most important performance metrics for sensor networks since it reflects how well a sensor field is monitored. In this paper, we introduce the maximum coverage deployment problem in wireless sensor networks and analyze the properties of the problem and its solution space. Random deployment is the simplest way to deploy sensor nodes but may cause unbalanced deployment; therefore, we need a more intelligent way to deploy sensors. We found that, in a mathematical view, the phenotype space of the problem is a quotient space of the genotype space. Based on this property, we propose an efficient genetic algorithm using a novel normalization method. A Monte Carlo method is adopted to design an efficient evaluation function, and its computation time is decreased without loss of solution quality by starting from a small number of random samples and gradually increasing the number for subsequent generations. The proposed genetic algorithm could be further improved by combining it with a well-designed local search. The performance of the proposed genetic algorithm is shown in a comparative experimental study. When compared with random deployment and existing methods, our genetic algorithm was not only about twice as fast but also showed significant improvement in solution quality.
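The Monte Carlo evaluation function for coverage can be sketched as follows: sample points in the unit square and count the fraction that falls within sensing radius of some node (the radius, sample count, and deployments are illustrative assumptions):

```python
import numpy as np

def coverage(sensors, radius, n_samples=20000, seed=0):
    """Monte Carlo estimate of the fraction of the unit square covered
    by sensing disks of the given radius around each sensor."""
    rng = np.random.default_rng(seed)
    pts = rng.random((n_samples, 2))
    # Squared distance from every sample point to every sensor.
    d2 = ((pts[:, None, :] - sensors[None, :, :]) ** 2).sum(axis=2)
    return (d2.min(axis=1) <= radius ** 2).mean()

# A balanced 2x2 grid deployment versus a random deployment of 4 nodes.
grid = np.array([[x, y] for x in (0.25, 0.75) for y in (0.25, 0.75)])
rng = np.random.default_rng(3)
random_dep = rng.random((4, 2))
cov_grid = coverage(grid, 0.3)
cov_rand = coverage(random_dep, 0.3)
```

Increasing `n_samples` over generations, as the abstract describes, trades early evaluation speed for late-stage accuracy without changing this estimator.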
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wayne F. Boyer; Gurdeep S. Hura
2005-09-01
The problem of obtaining an optimal matching and scheduling of interdependent tasks in distributed heterogeneous computing (DHC) environments is well known to be NP-hard. In a DHC system, task execution time depends on the machine to which a task is assigned, and task precedence constraints are represented by a directed acyclic graph. Recent research in evolutionary techniques has shown that genetic algorithms usually obtain more efficient schedules than other known algorithms. We propose a non-evolutionary random scheduling (RS) algorithm for efficient matching and scheduling of interdependent tasks in a DHC system. RS is a succession of randomized task orderings and a heuristic mapping from task order to schedule. Randomized task ordering is effectively a topological sort where the outcome may be any possible task order for which the task precedence constraints are maintained. A detailed comparison to existing evolutionary techniques (GA and PSGA) shows the proposed algorithm is less complex than evolutionary techniques, computes schedules in less time, requires less memory, and has fewer tuning parameters. Simulation results show that the average schedules produced by RS are approximately as efficient as PSGA schedules for all cases studied and clearly more efficient than PSGA for certain cases. The standard formulation for the scheduling problem addressed in this paper is Rm|prec|Cmax.
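The randomized task ordering step can be sketched as Kahn's topological sort with a random tie-break, which can emit any precedence-respecting order (the example DAG is an illustrative assumption):

```python
import random

def random_topological_order(succ, seed=None):
    """Kahn's algorithm with a random choice among the currently ready
    tasks: every precedence-respecting order has positive probability."""
    rng = random.Random(seed)
    indeg = {u: 0 for u in succ}
    for u in succ:
        for v in succ[u]:
            indeg[v] += 1
    ready = [u for u in succ if indeg[u] == 0]
    order = []
    while ready:
        u = ready.pop(rng.randrange(len(ready)))  # random among ready tasks
        order.append(u)
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return order

dag = {"A": ["C"], "B": ["C", "D"], "C": ["E"], "D": ["E"], "E": []}
order = random_topological_order(dag, seed=42)
```

Each such order would then be mapped to a schedule by the heuristic, and the best schedule over many random orders is kept.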
NASA Technical Reports Server (NTRS)
Sen, Syamal K.; AliShaykhian, Gholam
2010-01-01
We present a simple multi-dimensional exhaustive search method to obtain, in a reasonable time, the optimal solution of a nonlinear programming problem. It is all the more relevant in the present-day non-mainframe computing scenario, where an estimated 95% of computing resources remain unutilized and computing speed touches petaflops. While processor speed is doubling every 18 months, bandwidth is doubling every 12 months, and hard disk space is doubling every 9 months. A randomized search algorithm or, equivalently, an evolutionary search method is often used instead of an exhaustive search algorithm. The reason is that a randomized approach is usually polynomial-time, i.e., fast, while an exhaustive search method is exponential-time, i.e., slow. We discuss the increasing importance of exhaustive search in optimization, given the steady increase of computing power, for solving many real-world problems of reasonable size. We also discuss the computational error and complexity of the search algorithm, noting that no measuring device can usually measure a quantity with an accuracy greater than 0.005%. We stress that the quality of solution of the exhaustive search, a deterministic method, is better than that of randomized search. In the 21st-century computing environment, exhaustive search cannot be dismissed out of hand, and it is not always exponential. We also describe a possible application of these algorithms in improving the efficiency of solar cells, a particularly timely topic in the current energy crisis. These algorithms could be excellent tools in the hands of experimentalists and could save not only a large amount of the time needed for experiments but also quickly validate theory against experimental results.
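The contrast drawn above can be illustrated on a toy bounded problem: an exhaustive grid scan is deterministic and exact up to the grid spacing, while random sampling is cheap but only approximate (the objective, grid spacing, and sample budget are illustrative assumptions):

```python
import itertools
import random

def f(x, y):
    """Toy smooth objective with its maximum at (0.3, -0.7)."""
    return -((x - 0.3) ** 2 + (y + 0.7) ** 2)

# Exhaustive search: scan a 0.01 grid over [-1, 1] x [-1, 1].
steps = [i / 100 - 1 for i in range(201)]
best_exh = max(itertools.product(steps, steps), key=lambda p: f(*p))

# Randomized search: 1000 uniform samples over the same box.
rng = random.Random(0)
samples = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(1000)]
best_rnd = max(samples, key=lambda p: f(*p))
```

Here the grid hits the optimum exactly because it lies on a grid point; in general the exhaustive answer is only as good as the grid resolution, but, unlike the random estimate, it carries a deterministic error bound.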
Representation of high frequency Space Shuttle data by ARMA algorithms and random response spectra
NASA Technical Reports Server (NTRS)
Spanos, P. D.; Mushung, L. J.
1990-01-01
High frequency Space Shuttle lift-off data are treated by autoregressive (AR) and autoregressive-moving-average (ARMA) digital algorithms. These algorithms provide useful information on the spectral densities of the data. Further, they yield spectral models which lend themselves to incorporation into the concept of the random response spectrum. This concept yields a reasonably smooth power spectrum for the design of structural and mechanical systems when the available data bank is limited. Due to the non-stationarity of the lift-off event, the pertinent data are split into three slices. Each slice is associated with a rather distinguishable phase of the lift-off event, where stationarity can be expected. The presented results are preliminary in nature; they aim to call attention to the availability of the discussed digital algorithms and to the need to augment the Space Shuttle data bank as more flights are completed.
A simplified analytical random walk model for proton dose calculation
NASA Astrophysics Data System (ADS)
Yao, Weiguang; Merchant, Thomas E.; Farr, Jonathan B.
2016-10-01
We propose an analytical random walk model for proton dose calculation in a laterally homogeneous medium. A formula for the spatial fluence distribution of primary protons is derived. The variance of the spatial distribution is in the form of a distance-squared law of the angular distribution. To improve the accuracy of dose calculation in the Bragg peak region, the energy spectrum of the protons is used. The accuracy is validated against Monte Carlo simulation in water phantoms with either air gaps or a slab of bone inserted. The algorithm accurately reflects the dose dependence on the depth of the bone and can deal with small-field dosimetry. We further applied the algorithm to patients’ cases in the highly heterogeneous head and pelvis sites and used a gamma test to show the reasonable accuracy of the algorithm in these sites. Our algorithm is fast for clinical use.
Tian, Xin; Xin, Mingyuan; Luo, Jian; Liu, Mingyao; Jiang, Zhenran
2017-02-01
The selection of relevant genes for breast cancer metastasis is critical for the treatment and prognosis of cancer patients. Although much effort has been devoted to gene selection procedures using different statistical analysis methods or computational techniques, the interpretation of the variables in the resulting survival models has so far been limited. This article proposes a new Random Forest (RF)-based algorithm to identify important variables highly related to breast cancer metastasis, based on the importance scores of two variable selection algorithms: the mean decrease Gini (MDG) criterion of Random Forest and the GeneRank algorithm with protein-protein interaction (PPI) information. We call the new gene selection algorithm PPIRF. The improved prediction accuracy illustrates the reliability and high interpretability of the gene list selected by the PPIRF approach.
Comparison of genetic algorithms with conjugate gradient methods
NASA Technical Reports Server (NTRS)
Bosworth, J. L.; Foo, N. Y.; Zeigler, B. P.
1972-01-01
Genetic algorithms for mathematical function optimization are modeled on search strategies employed in natural adaptation. Comparisons of genetic algorithms with conjugate gradient methods, which were made on an IBM 1800 digital computer, show that genetic algorithms display superior performance over gradient methods for functions which are poorly behaved mathematically, for multimodal functions, and for functions obscured by additive random noise. Genetic methods offer performance comparable to gradient methods for many of the standard functions.
Approximating the 0-1 Multiple Knapsack Problem with Agent Decomposition and Market Negotiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smolinski, B.
The 0-1 multiple knapsack problem appears in many domains, from financial portfolio management to cargo ship stowing. Methods for solving it range from approximate algorithms, such as greedy algorithms, to exact algorithms, such as branch and bound. Approximate algorithms have no bounds on how poorly they perform, and exact algorithms can suffer from exponential time and space complexity on large data sets. This paper introduces a market model based on agent decomposition and market auctions for approximating the 0-1 multiple knapsack problem, and an algorithm that implements the model (M(x)). M(x) traverses the solution space rather than getting caught in a local maximum, overcoming an inherent problem of many greedy algorithms. The use of agents ensures that infeasible solutions are not considered while traversing the solution space and that traversal of the solution space is not just random but also directed. M(x) is compared to a branch and bound algorithm (BB) and a simple greedy algorithm with a random shuffle (G(x)). The results suggest that M(x) is a good algorithm for approximating the 0-1 multiple knapsack problem. M(x) almost always found solutions that were close to optimal in a fraction of the time it took BB to run and with much less memory on large test data sets. M(x) usually performed better than G(x) on hard problems with correlated data.
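A minimal sketch of the G(x) baseline, a greedy first-fit packing repeated over random shuffles of the items, might look like this (the item data, capacities, and restart count are illustrative assumptions, not the paper's test sets):

```python
import random

def greedy_shuffle_mkp(values, weights, capacities, restarts=200, seed=0):
    """Repeatedly shuffle the items, pack each in turn into the first
    knapsack with room, and keep the best total value found."""
    rng = random.Random(seed)
    items = list(range(len(values)))
    best_value, best_assign = 0, None
    for _ in range(restarts):
        rng.shuffle(items)
        load = [0] * len(capacities)
        assign, total = {}, 0
        for i in items:
            for k, cap in enumerate(capacities):
                if load[k] + weights[i] <= cap:   # first knapsack with room
                    load[k] += weights[i]
                    assign[i] = k
                    total += values[i]
                    break
        if total > best_value:
            best_value, best_assign = total, assign
    return best_value, best_assign

values = [10, 7, 6, 4, 4, 3]
weights = [8, 6, 5, 4, 3, 2]
best, assign = greedy_shuffle_mkp(values, weights, capacities=[10, 9])
```

Like all greedy heuristics, this carries no approximation guarantee, which is the weakness the paper's agent/auction model M(x) is designed to address.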
Random Numbers and Monte Carlo Methods
NASA Astrophysics Data System (ADS)
Scherer, Philipp O. J.
Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages, Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by sampling preferentially the important configurations. Finally, the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
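The Metropolis step described above can be sketched for a single particle in a harmonic potential (a minimal sketch; the potential, step size, and burn-in length are illustrative assumptions):

```python
import math
import random

def metropolis(potential, steps=200_000, step_size=1.0, seed=0):
    """Metropolis sampling of the Boltzmann weight exp(-V(x)):
    propose a symmetric random move, accept with min(1, exp(-dV))."""
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(steps):
        x_new = x + rng.uniform(-step_size, step_size)
        dV = potential(x_new) - potential(x)
        if dV <= 0 or rng.random() < math.exp(-dV):
            x = x_new
        samples.append(x)
    return samples

samples = metropolis(lambda x: 0.5 * x * x)
# Discard a burn-in, then estimate the thermodynamic average <x^2>;
# for V(x) = x^2/2 at unit temperature the exact value is 1.
tail = samples[10_000:]
mean_x2 = sum(x * x for x in tail) / len(tail)
```

The same accept/reject rule, applied to tour rearrangements with tour length as the "energy", gives the traveling-salesman application mentioned in the abstract.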
NASA Technical Reports Server (NTRS)
Olson, William S.; Kummerow, Christian D.; Yang, Song; Petty, Grant W.; Tao, Wei-Kuo; Bell, Thomas L.; Braun, Scott A.; Wang, Yansen; Lang, Stephen E.; Johnson, Daniel E.;
2006-01-01
A revised Bayesian algorithm for estimating surface rain rate, convective rain proportion, and latent heating profiles from satellite-borne passive microwave radiometer observations over ocean backgrounds is described. The algorithm searches a large database of cloud-radiative model simulations to find cloud profiles that are radiatively consistent with a given set of microwave radiance measurements. The properties of these radiatively consistent profiles are then composited to obtain best estimates of the observed properties. The revised algorithm is supported by an expanded and more physically consistent database of cloud-radiative model simulations. The algorithm also features a better quantification of the convective and nonconvective contributions to total rainfall, a new geographic database, and an improved representation of background radiances in rain-free regions. Bias and random error estimates are derived from applications of the algorithm to synthetic radiance data, based upon a subset of cloud-resolving model simulations, and from the Bayesian formulation itself. Synthetic rain-rate and latent heating estimates exhibit a trend of high (low) bias for low (high) retrieved values. The Bayesian estimates of random error are propagated to represent errors at coarser time and space resolutions, based upon applications of the algorithm to TRMM Microwave Imager (TMI) data. Errors in TMI instantaneous rain-rate estimates at 0.5° resolution range from approximately 50% at 1 mm/h to 20% at 14 mm/h. Errors in collocated spaceborne radar rain-rate estimates are roughly 50%-80% of the TMI errors at this resolution. The estimated algorithm random error in TMI rain rates at monthly, 2.5° resolution is relatively small (less than 6% at 5 mm/day) in comparison with the random error resulting from infrequent satellite temporal sampling (8%-35% at the same rain rate).
Percentage errors resulting from sampling decrease with increasing rain rate, and sampling errors in latent heating rates follow the same trend. Averaging over 3 months reduces sampling errors in rain rates to 6%-15% at 5 mm/day, with proportionate reductions in latent heating sampling errors.
Complex Langevin simulation of a random matrix model at nonzero chemical potential
Bloch, Jacques; Glesaaen, Jonas; Verbaarschot, Jacobus J. M.; ...
2018-03-06
In this study we test the complex Langevin algorithm for numerical simulations of a random matrix model of QCD with a first order phase transition to a phase of finite baryon density. We observe that a naive implementation of the algorithm leads to phase quenched results, which were also derived analytically in this article. We test several fixes for the convergence issues of the algorithm, in particular the method of gauge cooling, the shifted representation, the deformation technique and reweighted complex Langevin, but only the latter method reproduces the correct analytical results in the region where the quark mass is inside the domain of the eigenvalues. In order to shed more light on the issues of the methods we also apply them to a similar random matrix model with a milder sign problem and no phase transition, and in that case gauge cooling solves the convergence problems as was shown before in the literature.
Do bioclimate variables improve performance of climate envelope models?
Watling, James I.; Romañach, Stephanie S.; Bucklin, David N.; Speroterra, Carolina; Brandt, Laura A.; Pearlstine, Leonard G.; Mazzotti, Frank J.
2012-01-01
Climate envelope models are widely used to forecast potential effects of climate change on species distributions. A key issue in climate envelope modeling is the selection of predictor variables that most directly influence species. To determine whether model performance and spatial predictions were related to the selection of predictor variables, we compared models using bioclimate variables with models constructed from monthly climate data for twelve terrestrial vertebrate species in the southeastern USA using two different algorithms (random forests or generalized linear models), and two model selection techniques (using uncorrelated predictors or a subset of user-defined biologically relevant predictor variables). There were no differences in performance between models created with bioclimate or monthly variables, but one metric of model performance was significantly greater using the random forest algorithm compared with generalized linear models. Spatial predictions between maps using bioclimate and monthly variables were very consistent using the random forest algorithm with uncorrelated predictors, whereas we observed greater variability in predictions using generalized linear models.
NASA Technical Reports Server (NTRS)
Chadwick, C.
1984-01-01
This paper describes the development and use of an algorithm to compute approximate statistics of the magnitude of a single random trajectory correction maneuver (TCM) Delta-v vector. The TCM Delta-v vector is modeled as a three-component Cartesian vector, each of whose components is a random variable having a normal (Gaussian) distribution with zero mean and possibly unequal standard deviations. The algorithm uses these standard deviations as input to produce approximations to (1) the mean and standard deviation of the magnitude of Delta-v, (2) points of the probability density function of the magnitude of Delta-v, and (3) points of the cumulative and inverse cumulative distribution functions of Delta-v. The approximations are based on Monte Carlo techniques developed in a previous paper by the author and extended here. The algorithm described is expected to be useful in both pre-flight planning and in-flight analysis of maneuver propellant requirements for space missions.
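The Monte Carlo approach the paper extends can be sketched directly: sample the three Gaussian components, take vector norms, and read off the statistics (the sigma values and sample count are illustrative assumptions; for equal sigmas the magnitude follows the Maxwell distribution, which gives an analytic check):

```python
import numpy as np

def delta_v_stats(sigmas, n=200_000, seed=0):
    """Monte Carlo statistics of ||dv|| where dv has independent
    zero-mean normal components with the given standard deviations."""
    rng = np.random.default_rng(seed)
    dv = rng.standard_normal((n, 3)) * np.asarray(sigmas)
    mag = np.linalg.norm(dv, axis=1)
    return mag.mean(), mag.std(), np.percentile(mag, [50, 95, 99])

mean, std, (p50, p95, p99) = delta_v_stats([1.0, 1.0, 1.0])
# Equal sigmas give the Maxwell distribution: mean = 2*sigma*sqrt(2/pi).
```

The percentiles are the points of the cumulative and inverse cumulative distributions that the abstract lists as outputs; unequal sigmas are handled by the same sampler without any analytic special-casing.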
Moisan, Frédéric; Gonzalez, Cleotilde
2017-01-01
Game Theory is a common approach used to understand attacker and defender motives, strategies, and allocation of limited security resources. For example, many defense algorithms are based on game-theoretic solutions that conclude that randomization of defense actions assures unpredictability, creating difficulties for a human attacker. However, many game-theoretic solutions often rely on idealized assumptions of decision making that underplay the role of human cognition and information uncertainty. The consequence is that we know little about how effective these algorithms are against human players. Using a simplified security game, we study the type of attack strategy and the uncertainty about an attacker's strategy in a laboratory experiment where participants play the role of defenders against a simulated attacker. Our goal is to compare a human defender's behavior in three levels of uncertainty (Information Level: Certain, Risky, Uncertain) and three types of attacker's strategy (Attacker's strategy: Minimax, Random, Adaptive) in a between-subjects experimental design. Best defense performance is achieved when defenders play against a minimax and a random attack strategy compared to an adaptive strategy. Furthermore, when payoffs are certain, defenders are as efficient against random attack strategy as they are against an adaptive strategy, but when payoffs are uncertain, defenders have most difficulties defending against an adaptive attacker compared to a random attacker. We conclude that given conditions of uncertainty in many security problems, defense algorithms would be more efficient if they are adaptive to the attacker actions, taking advantage of the attacker's human inefficiencies. PMID:28690557
On efficient randomized algorithms for finding the PageRank vector
NASA Astrophysics Data System (ADS)
Gasnikov, A. V.; Dmitriev, D. Yu.
2015-03-01
Two randomized methods are considered for finding the PageRank vector; in other words, the solution of the system p^T = p^T P with a stochastic n × n matrix P, where n ~ 10^7-10^9, is sought (in the class of probability distributions) with accuracy ɛ: ɛ ≫ n^-1. Thus, the possibility of brute-force multiplication of P by the column is ruled out in the case of dense objects. The first method is based on the idea of Markov chain Monte Carlo algorithms. This approach is efficient when the iterative process p_{t+1}^T = p_t^T P quickly reaches a steady state. Additionally, it takes into account another specific feature of P, namely, the nonzero off-diagonal elements of P are equal in rows (this property is used to organize a random walk over the graph with the matrix P). Based on modern concentration-of-measure inequalities, new bounds for the running time of this method are presented that take into account the specific features of P. In the second method, the search for a ranking vector is reduced to finding the equilibrium in an antagonistic matrix game, where S_n(1) is the unit simplex in R^n and I is the identity matrix. The arising problem is solved by applying a slightly modified Grigoriadis-Khachiyan algorithm (1995). This technique, like the Nazin-Polyak method (2009), is a randomized version of Nemirovski's mirror descent method. The difference is that randomization in the Grigoriadis-Khachiyan algorithm is used when the gradient is projected onto the simplex rather than when the stochastic gradient is computed. For sparse matrices P, the method proposed yields noticeably better results.
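The Markov chain Monte Carlo idea behind the first method can be illustrated with a toy random-walk PageRank estimator. This is an illustrative sketch only, not the algorithm analyzed in the paper: it follows out-links with a damping probability, teleports otherwise, and uses visit frequencies as the estimate of the stationary distribution p^T = p^T P.

```python
import random

def pagerank_random_walk(adj, n_steps=100_000, damping=0.85, seed=0):
    """Estimate the PageRank vector by one long random walk: with
    probability `damping` follow a random out-link, otherwise teleport
    to a uniformly random node.  Visit frequencies converge to the
    stationary distribution of the walk."""
    rng = random.Random(seed)
    nodes = list(adj)
    visits = {v: 0 for v in nodes}
    current = rng.choice(nodes)
    for _ in range(n_steps):
        visits[current] += 1
        if adj[current] and rng.random() < damping:
            current = rng.choice(adj[current])   # follow an out-link
        else:
            current = rng.choice(nodes)          # teleport (or dangling node)
    return {v: c / n_steps for v, c in visits.items()}
```

On a symmetric graph such as a directed cycle the estimate should be close to uniform, which makes a simple check of convergence.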
Ozçift, Akin
2011-05-01
Supervised classification algorithms are commonly used in the design of computer-aided diagnosis systems. In this study, we present a resampling strategy based Random Forests (RF) ensemble classifier to improve diagnosis of cardiac arrhythmia. Random forests is an ensemble classifier that consists of many decision trees and outputs the class that is the mode of the classes output by the individual trees. In this way, an RF ensemble classifier performs better than a single tree from a classification performance point of view. In general, multiclass datasets having unbalanced distribution of sample sizes are difficult to analyze in terms of class discrimination. Cardiac arrhythmia is such a dataset that has multiple classes with small sample sizes, and it is therefore well suited to testing our resampling based training strategy. The dataset contains 452 samples in fourteen types of arrhythmias, and eleven of these classes have sample sizes less than 15. Our diagnosis strategy consists of two parts: (i) a correlation based feature selection algorithm is used to select relevant features from the cardiac arrhythmia dataset. (ii) the RF machine learning algorithm is used to evaluate the performance of the selected features with and without simple random sampling to evaluate the efficiency of the proposed training strategy. The resultant accuracy of the classifier is found to be 90.0%, which is a quite high diagnosis performance for cardiac arrhythmia. Furthermore, three case studies, i.e., thyroid, cardiotocography and audiology, are used to benchmark the effectiveness of the proposed method. The results of experiments demonstrated the efficiency of the random sampling strategy in training the RF ensemble classification algorithm. Copyright © 2011 Elsevier Ltd. All rights reserved.
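One common way to resample an unbalanced multiclass dataset is random oversampling of the minority classes; the sketch below is illustrative and the study's own simple-random-sampling scheme may differ in detail.

```python
import random
from collections import Counter

def random_oversample(rows, labels, seed=0):
    """Random oversampling: resample each minority class with
    replacement until every class matches the majority class count."""
    rng = random.Random(seed)
    counts = Counter(labels)
    target = max(counts.values())
    out_rows, out_labels = list(rows), list(labels)
    for cls, cnt in counts.items():
        idx = [i for i, lab in enumerate(labels) if lab == cls]
        for _ in range(target - cnt):
            i = rng.choice(idx)          # draw a minority sample again
            out_rows.append(rows[i])
            out_labels.append(cls)
    return out_rows, out_labels
```

The balanced output can then be fed to any ensemble classifier in place of the raw, skewed training set.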
Underwater image enhancement through depth estimation based on random forest
NASA Astrophysics Data System (ADS)
Tai, Shen-Chuan; Tsai, Ting-Chou; Huang, Jyun-Han
2017-11-01
Light absorption and scattering in underwater environments can result in low-contrast images with a distinct color cast. This paper proposes a systematic framework for the enhancement of underwater images. Light transmission is estimated using the random forest algorithm. RGB values, luminance, color difference, blurriness, and the dark channel are treated as features in training and estimation. Transmission is calculated using an ensemble machine learning algorithm to deal with a variety of conditions encountered in underwater environments. A color compensation and contrast enhancement algorithm based on depth information was also developed with the aim of improving the visual quality of underwater images. Experimental results demonstrate that the proposed scheme outperforms existing methods with regard to subjective visual quality as well as objective measurements.
Optimizing event selection with the random grid search
NASA Astrophysics Data System (ADS)
Bhat, Pushpalatha C.; Prosper, Harrison B.; Sekmen, Sezen; Stewart, Chip
2018-07-01
The random grid search (RGS) is a simple, but efficient, stochastic algorithm to find optimal cuts that was developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and associated code, have been enhanced recently with the introduction of two new cut types, one of which has been successfully used in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics. One explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson and the other optimizes SUSY searches using boosted objects and the razor variables.
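The essence of RGS, using the signal events themselves as the grid of candidate cut points, fits in a few lines. Below is a toy sketch with illustrative names and a simple s/sqrt(s+b) figure of merit; the actual RGS code supports richer cut types than the simultaneous lower bounds shown here.

```python
import math
import random

def random_grid_search(signal, background, n_tries=None, seed=0):
    """Random grid search over rectangular cuts: each candidate cut is
    the coordinate vector of a signal event, applied as simultaneous
    lower bounds on every variable.  Returns the cut maximizing the
    significance s / sqrt(s + b)."""
    rng = random.Random(seed)
    candidates = signal if n_tries is None else rng.sample(signal, n_tries)
    best_cut, best_score = None, -1.0
    for cut in candidates:
        # count signal and background events passing all lower bounds
        s = sum(all(x >= c for x, c in zip(ev, cut)) for ev in signal)
        b = sum(all(x >= c for x, c in zip(ev, cut)) for ev in background)
        score = s / math.sqrt(s + b) if s + b > 0 else 0.0
        if score > best_score:
            best_cut, best_score = cut, score
    return best_cut, best_score
```

Because the candidate cuts are drawn from the signal sample itself, the search automatically concentrates on the region of phase space where signal actually lives.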
Mortality risk score prediction in an elderly population using machine learning.
Rose, Sherri
2013-03-01
Standard practice for prediction often relies on parametric regression methods. Interesting new methods from the machine learning literature have been introduced in epidemiologic studies, such as random forest and neural networks. However, a priori, an investigator will not know which algorithm to select and may wish to try several. Here I apply the super learner, an ensembling machine learning approach that combines multiple algorithms into a single algorithm and returns a prediction function with the best cross-validated mean squared error. Super learning is a generalization of stacking methods. I used super learning in the Study of Physical Performance and Age-Related Changes in Sonomans (SPPARCS) to predict death among 2,066 residents of Sonoma, California, aged 54 years or more during the period 1993-1999. The super learner for predicting death (risk score) improved upon all single algorithms in the collection of algorithms, although its performance was similar to that of several algorithms. Super learner outperformed the worst algorithm (neural networks) by 44% with respect to estimated cross-validated mean squared error and had an R2 value of 0.201. The improvement of super learner over random forest with respect to R2 was approximately 2-fold. Alternatives for risk score prediction include the super learner, which can provide improved performance.
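Super learning as described can be illustrated with a toy two-learner version that picks the convex combination minimizing cross-validated mean squared error. This is a sketch under simplifying assumptions (two base learners, a coarse weight grid); the actual super learner handles arbitrary libraries of algorithms.

```python
def cv_mse(learner, data, k=5):
    """k-fold cross-validated MSE of a learner, where a learner is a
    function: training set -> prediction function."""
    folds = [data[i::k] for i in range(k)]
    err, n = 0.0, 0
    for i in range(k):
        train = [row for j, f in enumerate(folds) if j != i for row in f]
        predict = learner(train)
        for x, y in folds[i]:
            err += (predict(x) - y) ** 2
            n += 1
    return err / n

def mean_learner(train):
    m = sum(y for _, y in train) / len(train)
    return lambda x: m

def linear_learner(train):
    # ordinary least squares fit y = a*x + b
    n = len(train)
    sx = sum(x for x, _ in train); sy = sum(y for _, y in train)
    sxx = sum(x * x for x, _ in train); sxy = sum(x * y for x, y in train)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda x: a * x + b

def super_learner(train, learners):
    """Toy super learner for two base learners: choose the convex
    combination weight (on a coarse grid) with the smallest
    cross-validated MSE, then refit on the full data."""
    best_w, best_err = 0.0, float("inf")
    for i in range(11):
        w = i / 10.0
        def combo(tr, w=w):
            f, g = learners[0](tr), learners[1](tr)
            return lambda x: w * f(x) + (1 - w) * g(x)
        err = cv_mse(combo, train)
        if err < best_err:
            best_w, best_err = w, err
    f, g = learners[0](train), learners[1](train)
    return (lambda x: best_w * f(x) + (1 - best_w) * g(x)), best_w
```

On data with a genuine linear trend, nearly all of the weight should land on the linear base learner, mirroring the paper's observation that the ensemble does at least as well as its best member.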
NASA Astrophysics Data System (ADS)
Domino, Krzysztof
2017-02-01
Cumulant analysis plays an important role in the analysis of non-Gaussian distributed data, and share price returns are a good example of such data. The purpose of this research is to develop a cumulant based algorithm and use it to determine eigenvectors that represent investment portfolios with low variability. The algorithm is based on the Alternating Least Square method and involves the simultaneous minimisation of the 2nd-6th cumulants of the multidimensional random variable (the percentage share returns of many companies). The algorithm was then tested during the recent crash on the Warsaw Stock Exchange. To detect the incoming crash and provide entry and exit signals for the investment strategy, the Hurst exponent was calculated using the local DFA. It was shown that the introduced algorithm is on average better than the benchmark and other portfolio determination methods, but only within the examination window determined by low values of the Hurst exponent. Note that the algorithm is based on cumulant tensors up to the 6th order calculated for a multidimensional random variable, which is the novel idea. It can be expected that the algorithm would be useful in financial data analysis on a worldwide scale as well as in the analysis of other types of non-Gaussian distributed data.
Algorithmic Approaches for Place Recognition in Featureless, Walled Environments
2015-01-01
inertial measurement unit; LIDAR: light detection and ranging; RANSAC: random sample consensus; SLAM: simultaneous localization and mapping; SUSAN: smallest... [Figure captions: typical input image for the general junction based algorithm; short exposure image of a hallway junction taken by LIDAR.] ...The discipline of simultaneous localization and mapping (SLAM) has been studied intensively over the past several years. Many technical approaches
Dynamics of Quantum Adiabatic Evolution Algorithm for Number Partitioning
NASA Technical Reports Server (NTRS)
Smelyanskiy, V. N.; Toussaint, U. V.; Timucin, D. A.
2002-01-01
We have developed a general technique to study the dynamics of the quantum adiabatic evolution algorithm applied to random combinatorial optimization problems in the asymptotic limit of large problem size n. We use as an example the NP-complete Number Partitioning problem and map the algorithm dynamics to that of an auxiliary quantum spin glass system with the slowly varying Hamiltonian. We use a Green function method to obtain the adiabatic eigenstates and the minimum excitation gap, g_min = O(n 2^(-n/2)), corresponding to the exponential complexity of the algorithm for Number Partitioning. The key element of the analysis is the conditional energy distribution computed for the set of all spin configurations generated from a given (ancestor) configuration by simultaneous flipping of a fixed number of spins. For the problem in question this distribution is shown to depend on the ancestor spin configuration only via a certain parameter related to the energy of the configuration. As a result, the algorithm dynamics can be described in terms of one-dimensional quantum diffusion in the energy space. This effect provides a general limitation of a quantum adiabatic computation in random optimization problems. Analytical results are in agreement with the numerical simulation of the algorithm.
Molecular Monte Carlo Simulations Using Graphics Processing Units: To Waste Recycle or Not?
Kim, Jihan; Rodgers, Jocelyn M; Athènes, Manuel; Smit, Berend
2011-10-11
In the waste recycling Monte Carlo (WRMC) algorithm, (1) multiple trial states may be simultaneously generated and utilized during Monte Carlo moves to improve the statistical accuracy of the simulations, suggesting that such an algorithm may be well posed for implementation in parallel on graphics processing units (GPUs). In this paper, we implement two waste recycling Monte Carlo algorithms in CUDA (Compute Unified Device Architecture) using uniformly distributed random trial states and trial states based on displacement random-walk steps, and we test the methods on a methane-zeolite MFI framework system to evaluate their utility. We discuss the specific implementation details of the waste recycling GPU algorithm and compare the methods to other parallel algorithms optimized for the framework system. We analyze the relationship between the statistical accuracy of our simulations and the CUDA block size to determine the efficient allocation of the GPU hardware resources. We make comparisons between the GPU and the serial CPU Monte Carlo implementations to assess speedup over conventional microprocessors. Finally, we apply our optimized GPU algorithms to the important problem of determining free energy landscapes, in this case for molecular motion through the zeolite LTA.
2010-11-01
TR 2010-188; Defence R&D Canada – Toronto; November 2010. Introduction or context: As a general rule, the intelligence analyst or... [Figure 4 (continued): profiles for famous names generated by subjects and the model.]
Gossip, stories and friendship: confidentiality in midwifery practice.
James, S
1995-12-01
Women often seek midwifery care as an alternative to the maternity services that are readily available within the insured health care system in Alberta. Some aspects of community-based, primary care midwifery in Alberta that characterize this alternative are the use of story-telling as a form of knowledge, the development of social connections among women seeking midwifery care, and nonauthoritarian relationships between midwives and women. In this paper, the concept of confidentiality, as it relates to these aspects of midwifery practice, is explored, using traditional, caring and feminist models of ethics.
Maintain workplace civility by sharing the vow of personal responsibility.
Chism, Marlene
2012-01-01
Office gossip, power struggles, employee burnout, and short fuses are becoming more the rule than the exception in running a medical practice. The difficult conversation avoided today can turn into the lawsuit 15 years later. Managers often find it hard to confront high performers and authority figures in the workplace. In order to deal with disruptive behavior and incivility before it ruins the medical practice, practice managers should institute the four steps outlined in this article plus the Vow of Personal Responsibility to improve clarity, teamwork, and personal performance.
JPRS Report, Soviet Union, Kommunist, No. 6, April 1989.
1989-07-13
Rolland's "Diary of the War Years, 1914-1919." As the note to the diaries mentions, in 1934 Rolland gave it for safekeeping to the library of the university in... Many of the entries were based on Rolland's talks with Henri Guilbeau and the Russian emigres who lived at that time in Switzerland—A.V... who, in his view, "is simply a gossip who knows how to talk without saying anything." 24 September 1918: "...A very
Trans-algorithmic nature of learning in biological systems.
Shimansky, Yury P
2018-05-02
Learning ability is a vitally important, distinctive property of biological systems, which provides dynamic stability in non-stationary environments. Although several different types of learning have been successfully modeled using a universal computer, in general, learning cannot be described by an algorithm. In other words, an algorithmic approach to describing the functioning of biological systems is not sufficient for an adequate grasp of what life is. Since biosystems are parts of the physical world, one might hope that adding some physical mechanisms and principles to the concept of algorithm could provide extra possibilities for describing learning in its full generality. However, a straightforward approach to that through the so-called physical hypercomputation so far has not been successful. Here an alternative approach is proposed. Biosystems are described as achieving enumeration of possible physical compositions through random incremental modifications inflicted on them by active operating resources (AORs) in the environment. Biosystems learn through algorithmic regulation of the intensity of the above modifications according to a specific optimality criterion. From the perspective of external observers, biosystems move in the space of different algorithms driven by random modifications imposed by the environmental AORs. A particular algorithm is only a snapshot of that motion, while the motion itself is essentially trans-algorithmic. In this conceptual framework, death of unfit members of a population, for example, is viewed as a trans-algorithmic modification made in the population as a biosystem by environmental AORs. Numerous examples of AOR utilization in biosystems of different complexity, from viruses to multicellular organisms, are provided.
Serang, Oliver
2012-01-01
Linear programming (LP) problems are commonly used in analysis and resource allocation, frequently surfacing as approximations to more difficult problems. Existing approaches to LP have been dominated by a small group of methods, and randomized algorithms have not enjoyed popularity in practice. This paper introduces a novel randomized method of solving LP problems by moving along the facets and within the interior of the polytope along rays randomly sampled from the polyhedral cones defined by the bounding constraints. This conic sampling method is then applied to randomly sampled LPs, and its runtime performance is shown to compare favorably to the simplex and primal affine-scaling algorithms, especially on polytopes with certain characteristics. The conic sampling method is then adapted and applied to solve a certain quadratic program, which computes a projection onto a polytope; the proposed method is shown to outperform the proprietary software Mathematica on large, sparse QP problems constructed from mass spectrometry-based proteomics. PMID:22952741
Removal of Stationary Sinusoidal Noise from Random Vibration Signals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brian; Cap, Jerome S.
In random vibration environments, sinusoidal line noise may appear in the vibration signal and can affect analysis of the resulting data. We studied two methods which remove stationary sine tones from random noise: a matrix inversion algorithm and a chirp-z transform algorithm. In addition, we developed new methods to determine the frequency of the tonal noise. The results show that both of the removal methods can eliminate sine tones in prefabricated random vibration data when the sine-to-random ratio is at least 0.25. For smaller ratios down to 0.02 only the matrix inversion technique can remove the tones, but the metrics to evaluate its effectiveness also degrade. We also found that using fast Fourier transforms best identified the tonal noise, and determined that band-pass-filtering the signals prior to the process improved sine removal. When applied to actual vibration test data, the methods were not as effective at removing harmonic tones, which we believe to be a result of mixed-phase sinusoidal noise.
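For a single tone of known frequency, a matrix-inversion removal method reduces to solving 2x2 normal equations for the sine and cosine amplitudes and subtracting the fitted tone. The sketch below works under that assumption and is illustrative, not the authors' code.

```python
import math

def remove_sine_tone(signal, freq, fs):
    """Remove a stationary sinusoid of known frequency from a sampled
    signal by least-squares fitting a sin/cos pair (a 2x2 matrix
    inversion) and subtracting the fit.  `fs` is the sample rate."""
    n = len(signal)
    s = [math.sin(2 * math.pi * freq * i / fs) for i in range(n)]
    c = [math.cos(2 * math.pi * freq * i / fs) for i in range(n)]
    # normal equations for [a, b] in  y ~ a*sin + b*cos
    ss = sum(x * x for x in s); cc = sum(x * x for x in c)
    sc = sum(x * y for x, y in zip(s, c))
    sy = sum(x * y for x, y in zip(s, signal))
    cy = sum(x * y for x, y in zip(c, signal))
    det = ss * cc - sc * sc
    a = (cc * sy - sc * cy) / det
    b = (ss * cy - sc * sy) / det
    return [y - a * si - b * ci for y, si, ci in zip(signal, s, c)]
```

Fitting both the sine and cosine terms handles an arbitrary tone phase, so no phase estimate is needed beforehand.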
Dimension from covariance matrices.
Carroll, T L; Byers, J M
2017-02-01
We describe a method to estimate embedding dimension from a time series. This method includes an estimate of the probability that the dimension estimate is valid. Such validity estimates are not common in algorithms for calculating the properties of dynamical systems. The algorithm described here compares the eigenvalues of covariance matrices created from an embedded signal to the eigenvalues for a covariance matrix of a Gaussian random process with the same dimension and number of points. A statistical test gives the probability that the eigenvalues for the embedded signal did not come from the Gaussian random process.
NASA Astrophysics Data System (ADS)
Apdilah, D.; Harahap, M. K.; Khairina, N.; Husein, A. M.; Harahap, M.
2018-04-01
The One Time Pad algorithm always requires pairing the key with the plaintext. If the key is shorter than the plaintext, the key is repeated until its length matches that of the plaintext. In this research, we use a Linear Congruential Generator and a Quadratic Congruential Generator for generating random numbers. One Time Pad uses a random number as the key for the encryption and decryption process. The key is paired with the plaintext starting from the first letter. We compare these two algorithms in terms of encryption speed, and the result is that the combination of OTP with LCG is faster than the combination of OTP with QCG.
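An OTP-style cipher keyed by an LCG stream can be sketched as below. The LCG constants are illustrative (glibc-style values), not necessarily those used by the authors, and an LCG keystream is of course not cryptographically secure, so this is a demonstration of the construction rather than a true one-time pad.

```python
def lcg_keystream(seed, length, a=1103515245, c=12345, m=2**31):
    """Linear Congruential Generator: x_{k+1} = (a*x_k + c) mod m.
    One key byte is taken from each successive state."""
    key, x = [], seed
    for _ in range(length):
        x = (a * x + c) % m
        key.append(x & 0xFF)
    return key

def otp_xor(data: bytes, seed: int) -> bytes:
    """XOR cipher keyed by the LCG stream; because XOR is its own
    inverse, the same call both encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, lcg_keystream(seed, len(data))))
```

A Quadratic Congruential Generator would differ only in the state update (x_{k+1} = (a*x_k^2 + b*x_k + c) mod m), which is the extra multiplication behind the speed gap the paper measures.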
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradonjic, Milan; Elsasser, Robert; Friedrich, Tobias
A Random Geometric Graph (RGG) is constructed by distributing n nodes uniformly at random in the unit square and connecting two nodes if their Euclidean distance is at most r, for some prescribed r. They analyze the following randomized broadcast algorithm on RGGs. At the beginning, there is only one informed node. Then in each round, each informed node chooses a neighbor uniformly at random and informs it. They prove that this algorithm informs every node in the largest component of a RGG in O(sqrt(n)/r) rounds with high probability. This holds for any value of r larger than the critical value for the emergence of a giant component. In particular, the result implies that the diameter of the giant component is Theta(sqrt(n)/r).
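The push protocol analyzed above is easy to simulate. Below is a minimal sketch on an arbitrary adjacency list rather than an RGG specifically; names are illustrative, and the graph passed in is assumed connected.

```python
import random

def push_broadcast_rounds(adj, start, seed=0):
    """Simulate randomized push broadcast: each round, every informed
    node picks one neighbor uniformly at random and informs it.
    Returns the number of rounds until every node is informed."""
    rng = random.Random(seed)
    informed = {start}
    everyone = set(adj)  # assumes the graph is connected
    rounds = 0
    while informed != everyone:
        newly = set()
        for v in informed:
            if adj[v]:
                newly.add(rng.choice(adj[v]))
        informed |= newly
        rounds += 1
    return rounds
```

Since the informed set can at most double per round, log2(n) is a hard lower bound on the round count for any graph, with the O(sqrt(n)/r) bound capturing the extra geometric cost on an RGG.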
NASA Astrophysics Data System (ADS)
Ha, Jeongmok; Jeong, Hong
2016-07-01
This study investigates the directed acyclic subgraph (DAS) algorithm, which is used to solve discrete labeling problems much more rapidly than other Markov-random-field-based inference methods but at a competitive accuracy. However, the mechanism by which the DAS algorithm simultaneously achieves competitive accuracy and fast execution speed has not been elucidated by a theoretical derivation. We analyze the DAS algorithm by comparing it with a message passing algorithm. Graphical models, inference methods, and energy-minimization frameworks are compared between DAS and message passing algorithms. Moreover, the performances of DAS and other message passing methods [sum-product belief propagation (BP), max-product BP, and tree-reweighted message passing] are experimentally compared.
Ye, Yalan; He, Wenwen; Cheng, Yunfei; Huang, Wenxia; Zhang, Zhilin
2017-02-16
The estimation of heart rate (HR) based on wearable devices is of interest in fitness. Photoplethysmography (PPG) is a promising approach to estimate HR due to low cost; however, it is easily corrupted by motion artifacts (MA). In this work, a robust approach based on random forest is proposed for accurately estimating HR from the photoplethysmography signal contaminated by intense motion artifacts, consisting of two stages. Stage 1 proposes a hybrid method to effectively remove MA with a low computation complexity, where two MA removal algorithms are combined by an accurate binary decision algorithm whose aim is to decide whether or not to adopt the second MA removal algorithm. Stage 2 proposes a random forest-based spectral peak-tracking algorithm, whose aim is to locate the spectral peak corresponding to HR, formulating the problem of spectral peak tracking into a pattern classification problem. Experiments on the PPG datasets including 22 subjects used in the 2015 IEEE Signal Processing Cup showed that the proposed approach achieved the average absolute error of 1.65 beats per minute (BPM) on the 22 PPG datasets. Compared to state-of-the-art approaches, the proposed approach has better accuracy and robustness to intense motion artifacts, indicating its potential use in wearable sensors for health monitoring and fitness tracking.
VNIR hyperspectral background characterization methods in adverse weather conditions
NASA Astrophysics Data System (ADS)
Romano, João M.; Rosario, Dalton; Roth, Luz
2009-05-01
Hyperspectral technology is currently being used by the military to detect regions of interest where potential targets may be located. Weather variability, however, may affect the ability of an algorithm to discriminate possible targets from background clutter. Nonetheless, different background characterization approaches may facilitate the ability of an algorithm to discriminate potential targets over a variety of weather conditions. In a previous paper, we introduced a new autonomous target size invariant background characterization process, the Autonomous Background Characterization (ABC), also known as the Parallel Random Sampling (PRS) method, which features a random sampling stage; a parallel process to mitigate the inclusion by chance of target samples into clutter background classes during random sampling; and a fusion of results at the end. In this paper, we demonstrate how different background characterization approaches are able to improve the performance of algorithms over a variety of challenging weather conditions. Using the Mahalanobis distance as the standard algorithm for this study, we compare the performance of different characterization methods, such as global information, two-stage global information, and our proposed method, ABC, using data that was collected under a variety of adverse weather conditions. For this study, we used ARDEC's Hyperspectral VNIR Adverse Weather data collection, comprised of heavy, light, and transitional fog, light and heavy rain, and low light conditions.
NASA Astrophysics Data System (ADS)
Morone, Flaviano; Min, Byungjoon; Bo, Lin; Mari, Romain; Makse, Hernán A.
2016-07-01
We elaborate on a linear-time implementation of Collective-Influence (CI) algorithm introduced by Morone, Makse, Nature 524, 65 (2015) to find the minimal set of influencers in networks via optimal percolation. The computational complexity of CI is O(N log N) when removing nodes one-by-one, made possible through an appropriate data structure to process CI. We introduce two Belief-Propagation (BP) variants of CI that consider global optimization via message-passing: CI propagation (CIP) and Collective-Immunization-Belief-Propagation algorithm (CIBP) based on optimal immunization. Both identify a slightly smaller fraction of influencers than CI and, remarkably, reproduce the exact analytical optimal percolation threshold obtained in Random Struct. Alg. 21, 397 (2002) for cubic random regular graphs, leaving little room for improvement for random graphs. However, the small augmented performance comes at the expense of increasing running time to O(N^2), rendering BP prohibitive for modern-day big-data. For instance, for big-data social networks of 200 million users (e.g., Twitter users sending 500 million tweets/day), CI finds influencers in 2.5 hours on a single CPU, while all BP algorithms (CIP, CIBP and BDP) would take more than 3,000 years to accomplish the same task.
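For intuition, the CI score itself is simple to compute on a small graph: CI_l(i) = (k_i - 1) * sum over the nodes j on the boundary of the radius-l ball around i of (k_j - 1). The sketch below is illustrative (a plain BFS; the paper's linear-time implementation relies on a more careful data structure for the adaptive node removal).

```python
from collections import deque

def collective_influence(adj, node, ell=2):
    """CI_l(i) = (k_i - 1) * sum_{j on boundary of Ball(i, l)} (k_j - 1),
    with the boundary found by breadth-first search to depth l."""
    dist = {node: 0}
    frontier = deque([node])
    boundary = []
    while frontier:
        v = frontier.popleft()
        if dist[v] == ell:
            boundary.append(v)   # exactly at distance l: on the boundary
            continue
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                frontier.append(w)
    ki = len(adj[node])
    return (ki - 1) * sum(len(adj[j]) - 1 for j in boundary)

def top_influencer(adj, ell=2):
    """One step of the greedy procedure: the node with the largest CI."""
    return max(adj, key=lambda v: collective_influence(adj, v, ell))
```

In the full algorithm the top node is removed, degrees are updated, and the scan repeats, which is where the O(N log N) bookkeeping matters.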
Analysis and Reduction of Complex Networks Under Uncertainty.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghanem, Roger G
2014-07-31
This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations, 2) methodology and algorithms to characterize probability measures on graph structures with random flows. This is an important problem in characterizing random demand (encountered in smart grid) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov Chains (with ubiquitous relevance!). 3) methodology and algorithms for treating inequalities in uncertain systems. This is an important problem in the context of models for material failure and network flows under uncertainty where conditions of failure or flow are described in the form of inequalities between the state variables.
Morone, Flaviano; Min, Byungjoon; Bo, Lin; Mari, Romain; Makse, Hernán A.
2016-01-01
We elaborate on a linear-time implementation of the Collective-Influence (CI) algorithm introduced by Morone, Makse, Nature 524, 65 (2015) to find the minimal set of influencers in networks via optimal percolation. The computational complexity of CI is O(N log N) when removing nodes one-by-one, made possible through an appropriate data structure to process CI. We introduce two Belief-Propagation (BP) variants of CI that consider global optimization via message-passing: CI propagation (CIP) and the Collective-Immunization-Belief-Propagation algorithm (CIBP) based on optimal immunization. Both identify a slightly smaller fraction of influencers than CI and, remarkably, reproduce the exact analytical optimal percolation threshold obtained in Random Struct. Alg. 21, 397 (2002) for cubic random regular graphs, leaving little room for improvement for random graphs. However, this small performance gain comes at the expense of increasing the running time to O(N²), rendering BP prohibitive for modern-day big-data. For instance, for big-data social networks of 200 million users (e.g., Twitter users sending 500 million tweets/day), CI finds influencers in 2.5 hours on a single CPU, while all BP algorithms (CIP, CIBP and BDP) would take more than 3,000 years to accomplish the same task. PMID:27455878
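The CI measure behind the abstract above can be sketched in a few lines. This is a minimal illustration assuming the standard definition CI_ℓ(i) = (k_i − 1) Σ_{j ∈ ∂Ball(i,ℓ)} (k_j − 1), computed by plain BFS on a toy adjacency list; it is not the paper's heap-based O(N log N) implementation, and the one-by-one removal loop is omitted.

```python
from collections import deque

def collective_influence(adj, i, ell):
    """CI_ell(i) = (k_i - 1) * sum of (k_j - 1) over nodes j at distance exactly ell."""
    # BFS outward from node i, collecting the frontier at distance ell
    dist = {i: 0}
    frontier = []
    q = deque([i])
    while q:
        u = q.popleft()
        if dist[u] == ell:
            frontier.append(u)  # on the ball boundary; do not expand further
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    ki = len(adj[i])
    return (ki - 1) * sum(len(adj[j]) - 1 for j in frontier)

# toy graph: star centered at 0 with leaves 1-4, plus a pendant chain 1-5
adj = {0: [1, 2, 3, 4], 1: [0, 5], 2: [0], 3: [0], 4: [0], 5: [1]}
ci_center = collective_influence(adj, 0, 1)  # hub: high CI
ci_leaf = collective_influence(adj, 5, 1)    # degree-1 node: CI is zero
```

The hub scores (4−1)·[(2−1)+(1−1)·3] = 3, while any degree-1 node scores 0, matching the intuition that influencers are removed in decreasing CI order.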
A Novel Color Image Encryption Algorithm Based on Quantum Chaos Sequence
NASA Astrophysics Data System (ADS)
Liu, Hui; Jin, Cong
2017-03-01
In this paper, a novel algorithm for image encryption based on quantum chaos is proposed. The keystreams are generated by the two-dimensional logistic map, with the keys serving as its initial conditions and parameters. A general Arnold scrambling algorithm with keys is then exploited to permute the pixels of the color components. In the diffusion process, a novel folding algorithm is proposed to modify the values of the diffused pixels. In order to achieve high randomness and complexity, the two-dimensional logistic map and the quantum chaotic map are coupled with nearest-neighboring coupled-map lattices. Theoretical analyses and computer simulations confirm that the proposed algorithm has a high level of security.
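The keystream-plus-diffusion idea can be sketched with the one-dimensional logistic map standing in for the paper's coupled two-dimensional/quantum maps; the parameter value, the byte quantization, and the XOR diffusion below are illustrative assumptions, not the paper's exact scheme.

```python
def logistic_keystream(x0, r, n):
    """Iterate x_{k+1} = r * x_k * (1 - x_k) and quantize each state to one byte."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)  # map (0, 1) onto 0..255
    return out

def xor_diffuse(pixels, key):
    """XOR each pixel with the matching keystream byte (self-inverse)."""
    return [p ^ k for p, k in zip(pixels, key)]

pixels = [10, 200, 33, 47]                       # a toy row of pixel values
key = logistic_keystream(0.3, 3.99, len(pixels))  # key = initial condition + parameter
cipher = xor_diffuse(pixels, key)
plain = xor_diffuse(cipher, key)                  # decryption reuses the same keystream
```

Because XOR is its own inverse, regenerating the keystream from the same key recovers the plaintext exactly.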
Fast Algorithms for Estimating Mixture Parameters
1989-08-30
The investigation is a two-year project with the first year sponsored by the Army Research Office and the second year by the National Science Foundation (Grant...). Numerical testing of the accelerated fixed-point method was completed. The work on relaxation methods will be done under the sponsorship of the National Science Foundation during the coming year. Keywords: Fast algorithms; Algorithms; Mixture Distribution; Random Variables. (KR)
Statistical Signal Models and Algorithms for Image Analysis
1984-10-25
In this report, two-dimensional stochastic linear models are used in developing algorithms for image analysis such as classification, segmentation, and object detection in images characterized by textured backgrounds. These models generate two-dimensional random processes as outputs to which statistical inference procedures can naturally be applied. A common thread throughout our algorithms is the interpretation of the inference procedures in terms of linear prediction
Zhang, Yiyan; Xin, Yi; Li, Qin; Ma, Jianshe; Li, Shuai; Lv, Xiaodan; Lv, Weiqi
2017-11-02
Various kinds of data mining algorithms are continually being proposed with the development of related disciplines. The applicable scopes and the performances of these algorithms differ. Hence, finding a suitable algorithm for a dataset is becoming an important emphasis for biomedical researchers to solve practical problems promptly. In this paper, seven kinds of sophisticated active algorithms, namely, C4.5, support vector machine, AdaBoost, k-nearest neighbor, naïve Bayes, random forest, and logistic regression, were selected as the research objects. The seven algorithms were applied to the 12 top-click UCI public datasets with the task of classification, and their performances were compared through induction and analysis. The sample size, number of attributes, number of missing values, sample size of each class, correlation coefficients between variables, class entropy of the task variable, and the ratio of the sample size of the largest class to the least class were calculated to characterize the 12 research datasets. The two ensemble algorithms reach high accuracy of classification on most datasets. Moreover, random forest performs better than AdaBoost on the unbalanced dataset of the multi-class task. Simple algorithms, such as the naïve Bayes and logistic regression model, are suitable for a small dataset with high correlation between the task and other non-task attribute variables. K-nearest neighbor and C4.5 decision tree algorithms perform well on binary- and multi-class task datasets. Support vector machine is more adept on the balanced small dataset of the binary-class task. No algorithm can maintain the best performance on all datasets. The applicability of the seven data mining algorithms on datasets with different characteristics was summarized to provide a reference for biomedical researchers or beginners in different fields.
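The benchmarking workflow described above (fit several classifiers, compare held-out accuracy per dataset) can be sketched without any ML library; here a 1-nearest-neighbor classifier and a majority-class baseline stand in for the seven algorithms, on an invented toy dataset.

```python
def knn_predict(train, query):
    """1-nearest-neighbor by squared Euclidean distance; train is [(features, label), ...]."""
    return min(train, key=lambda t: sum((a - b) ** 2 for a, b in zip(t[0], query)))[1]

def majority_predict(train):
    """Baseline: always predict the most common training label."""
    labels = [y for _, y in train]
    return max(set(labels), key=labels.count)

def accuracy(preds, truth):
    return sum(p == t for p, t in zip(preds, truth)) / len(truth)

# toy two-class dataset: class "a" near the origin, class "b" near (5, 5)
train = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((5, 6), "b")]
test_set = [((0.2, 0.1), "a"), ((5.1, 5.2), "b")]

truth = [y for _, y in test_set]
knn_acc = accuracy([knn_predict(train, x) for x, _ in test_set], truth)
base_acc = accuracy([majority_predict(train)] * len(test_set), truth)
```

Running each candidate algorithm through the same accuracy function on the same held-out split is the comparison pattern; the paper additionally characterizes each dataset (size, class balance, correlations) to explain which algorithm wins where.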
Optimizing event selection with the random grid search
Bhat, Pushpalatha C.; Prosper, Harrison B.; Sekmen, Sezen; ...
2018-02-27
The random grid search (RGS) is a simple but efficient stochastic algorithm for finding optimal cuts; it was developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and associated code, have been enhanced recently with the introduction of two new cut types, one of which has been successfully used in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics. One explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson and the other optimizes SUSY searches using boosted objects and the razor variables.
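The core RGS idea can be sketched in one variable: rather than scanning a fixed grid of thresholds, take the candidate cut values from the event sample itself and keep the one maximizing a figure of merit. The cut direction (x ≥ t) and the s/√(s+b) figure of merit below are common illustrative choices, not necessarily the paper's exact configuration.

```python
import math

def random_grid_search(signal, background):
    """Try each signal event's value as a threshold (keep x >= t); maximize s / sqrt(s + b)."""
    best_t, best_fom = None, -1.0
    for t in signal:  # the "grid" is the signal sample itself
        s = sum(x >= t for x in signal)      # signal events passing the cut
        b = sum(x >= t for x in background)  # background events passing the cut
        fom = s / math.sqrt(s + b) if s + b > 0 else 0.0
        if fom > best_fom:
            best_t, best_fom = t, fom
    return best_t, best_fom

signal = [4.0, 5.0, 6.0, 7.0]
background = [1.0, 2.0, 3.0, 4.5]
cut, fom = random_grid_search(signal, background)
```

Using event values as cut candidates automatically concentrates the search where the signal actually lives, which is what makes RGS efficient compared with a uniform grid.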
Determining the Number of Clusters in a Data Set Without Graphical Interpretation
NASA Technical Reports Server (NTRS)
Aguirre, Nathan S.; Davies, Misty D.
2011-01-01
Cluster analysis is a data mining technique that is meant to simplify the process of classifying data points. The basic clustering process requires an input of data points and the number of clusters wanted. The clustering algorithm will then pick C starting points for the clusters, which can be either random spatial points or random data points. It then assigns each data point to the nearest C point, where "nearest" usually means Euclidean distance, but some algorithms use another criterion. The next step is determining whether the clustering arrangement thus found is within a certain tolerance. If it falls within this tolerance, the process ends. Otherwise the C points are adjusted based on how many data points are in each cluster, and the steps repeat until the algorithm converges.
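The loop just described is essentially k-means; a minimal sketch follows, seeding the C points from the first k data points (one common deterministic choice) and stopping when the assignments no longer change.

```python
def kmeans(points, k, iters=100):
    """Plain k-means: nearest-center assignment, mean update, stop on stable assignment."""
    centers = list(points[:k])  # seed C points with the first k data points
    assign = None
    for _ in range(iters):
        # assign each point to the nearest center (squared Euclidean distance)
        new_assign = [min(range(k),
                          key=lambda c: sum((p[d] - centers[c][d]) ** 2
                                            for d in range(len(p))))
                      for p in points]
        if new_assign == assign:  # converged: no point changed cluster
            break
        assign = new_assign
        # move each center to the mean of the points assigned to it
        for c in range(k):
            members = [p for p, a in zip(points, assign) if a == c]
            if members:
                centers[c] = tuple(sum(x) / len(members) for x in zip(*members))
    return centers, assign

pts = [(0.0, 0.0), (0.0, 1.0), (10.0, 10.0), (10.0, 11.0)]
centers, labels = kmeans(pts, 2)
```

On this toy input the two well-separated pairs end up in separate clusters with centers at their means.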
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, Chengguang; Drinkwater, Bruce W.
In this paper the performance of the total focusing method is compared with the widely used time-reversal MUSIC super-resolution technique. The algorithms are tested with simulated and experimental ultrasonic array data, each containing different noise levels. The simulated time domain signals allow the effects of array geometry, frequency, scatterer location, scatterer size, scatterer separation and random noise to be carefully controlled. The performance of the imaging algorithms is evaluated in terms of resolution and sensitivity to random noise. It is shown that for the low noise situation, time-reversal MUSIC provides enhanced lateral resolution when compared to the total focusing method. However, for higher noise levels, the total focusing method shows robustness, whilst the performance of time-reversal MUSIC is significantly degraded.
Optimizing Event Selection with the Random Grid Search
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhat, Pushpalatha C.; Prosper, Harrison B.; Sekmen, Sezen
2017-06-29
The random grid search (RGS) is a simple, but efficient, stochastic algorithm to find optimal cuts that was developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and associated code, have been enhanced recently with the introduction of two new cut types, one of which has been successfully used in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics. One explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson and the other optimizes SUSY searches using boosted objects and the razor variables.
Bollestad, Marianne; Grude, Nils; Lindbaek, Morten
2015-01-01
Objective. To compare the clinical outcome of patients presenting with symptoms of uncomplicated cystitis who were seen by a doctor, with patients who were given treatment following a diagnostic algorithm. Design. Randomized controlled trial. Setting. Out-of-hours service, Oslo, Norway. Intervention. Women with typical symptoms of uncomplicated cystitis were included in the trial in the time period September 2010–November 2011. They were randomized into two groups. One group received standard treatment according to the diagnostic algorithm, the other group received treatment after a regular consultation by a doctor. Subjects. Women (n = 441) aged 16–55 years. Mean age in both groups 27 years. Main outcome measures. Number of days until symptomatic resolution. Results. No significant differences were found between the groups in the basic patient demographics, severity of symptoms, or percentage of urine samples with single culture growth. A median of three days until symptomatic resolution was found in both groups. By day four 79% in the algorithm group and 72% in the regular consultation group were free of symptoms (p = 0.09). The number of patients who contacted a doctor again in the follow-up period and received alternative antibiotic treatment was insignificantly higher (p = 0.08) after regular consultation than after treatment according to the diagnostic algorithm. There were no cases of severe pyelonephritis or hospital admissions during the follow-up period. Conclusion. Using a diagnostic algorithm is a safe and efficient method for treating women with symptoms of uncomplicated cystitis at an out-of-hours service. This simplification of treatment strategy can lead to a more rational use of consultation time and a stricter adherence to National Antibiotic Guidelines for a common disorder. PMID:25961367
Adli, Mazda; Wiethoff, Katja; Baghai, Thomas C; Fisher, Robert; Seemüller, Florian; Laakmann, Gregor; Brieger, Peter; Cordes, Joachim; Malevani, Jaroslav; Laux, Gerd; Hauth, Iris; Möller, Hans-Jürgen; Kronmüller, Klaus-Thomas; Smolka, Michael N; Schlattmann, Peter; Berger, Maximilian; Ricken, Roland; Stamm, Thomas J; Heinz, Andreas; Bauer, Michael
2017-01-01
Background: Treatment algorithms are considered key to improving outcomes by enhancing the quality of care. This is the first randomized controlled study to evaluate the clinical effect of algorithm-guided treatment in inpatients with major depressive disorder. Methods: Inpatients, aged 18 to 70 years, with major depressive disorder from 10 German psychiatric departments were randomized to 5 different treatment arms (from 2000 to 2005), 3 of which were standardized stepwise drug treatment algorithms (ALGO). The fourth arm proposed medications and provided less specific recommendations based on a computerized documentation and expert system (CDES); the fifth arm received treatment as usual (TAU). ALGO included 3 different second-step strategies: lithium augmentation (ALGO LA), antidepressant dose-escalation (ALGO DE), and switch to a different antidepressant (ALGO SW). Time to remission (21-item Hamilton Depression Rating Scale ≤9) was the primary outcome. Results: Time to remission was significantly shorter for ALGO DE (n=91) compared with both TAU (n=84) (HR=1.67; P=.014) and CDES (n=79) (HR=1.59; P=.031), and for ALGO SW (n=89) compared with both TAU (HR=1.64; P=.018) and CDES (HR=1.56; P=.038). For both ALGO LA (n=86) and ALGO DE, fewer antidepressant medications were needed to achieve remission than for CDES or TAU (P<.001). Remission rates at discharge differed across groups; ALGO DE had the highest (89.2%) and TAU the lowest rates (66.2%). Conclusions: A highly structured algorithm-guided treatment is associated with shorter times and fewer medication changes to achieve remission with depressed inpatients than treatment as usual or computerized medication choice guidance. PMID:28645191
Ascent guidance algorithm using lidar wind measurements
NASA Technical Reports Server (NTRS)
Cramer, Evin J.; Bradt, Jerre E.; Hardtla, John W.
1990-01-01
The formulation of a general nonlinear programming guidance algorithm that incorporates wind measurements in the computation of ascent guidance steering commands is discussed. A nonlinear programming (NLP) algorithm that is designed to solve a very general problem has the potential to address the diversity demanded by future launch systems. Using B-splines for the command functional form allows the NLP algorithm to adjust the shape of the command profile to achieve optimal performance. The algorithm flexibility is demonstrated by simulation of ascent with dynamic loading constraints through a set of random wind profiles with and without wind sensing capability.
Mutchler, Matt G; McDavitt, Bryce; Ghani, Mansur A; Nogg, Kelsey; Winder, Terrell J A; Soto, Juliana K
2015-09-01
Biomedical HIV prevention strategies, such as pre-exposure prophylaxis (PrEP) and post-exposure prophylaxis (PEP), represent new opportunities to reduce critically high HIV infection rates among young black men who have sex with men (YBMSM). We report results of 24 dyadic qualitative interviews (N=48), conducted in Los Angeles, CA, exploring how YBMSM and their friends view PrEP and PEP. Interviews were analyzed using a grounded theory approach. Participants had widely divergent levels of knowledge about these prevention methods. Misconceptions and mistrust regarding PrEP were common, and concerns were expressed about PrEP-related stigma and the potential for gossip among peers who might assume a person on PrEP was HIV-positive. Yet participants also framed PrEP and PEP as valuable new options within an expanded "tool kit" of HIV prevention strategies that created possibilities for preventing new HIV infections, dating men with a different HIV status, and decreased anxiety about exposure to HIV. We organized themes around four main areas: (1) information and misinformation about biomedical HIV prevention; (2) expectations about PrEP, sexual behavior, and stigma; (3) gossip, disclosure, and "spreading the word" about PrEP and PEP; and (4) the roles of PrEP and PEP in an expanded HIV prevention tool kit. The findings suggest a need for guidance in navigating the increasingly complex array of HIV-prevention options available to YBMSM. Such "prevention navigation" could counter misconceptions and address barriers, such as stigma and mistrust, while helping YBMSM make informed selections from among expanded HIV prevention options.
Arbitrary-step randomly delayed robust filter with application to boost phase tracking
NASA Astrophysics Data System (ADS)
Qin, Wutao; Wang, Xiaogang; Bai, Yuliang; Cui, Naigang
2018-04-01
Conventional filters such as the extended Kalman filter, unscented Kalman filter and cubature Kalman filter assume that the measurement is available in real-time and that the measurement noise is Gaussian white noise. In practice, both assumptions can be invalid. To solve this problem, a novel algorithm is proposed by taking the following four steps. First, the measurement model is modified using Bernoulli random variables to describe the random delay. Then, the expressions for the predicted measurement and covariance are reformulated, which removes the restriction that the maximum delay must be one or two steps and the assumption that the probabilities of the Bernoulli random variables taking the value one are equal. Next, the arbitrary-step randomly delayed high-degree cubature Kalman filter is derived based on the 5th-degree spherical-radial rule and the reformulated expressions. Finally, this filter is modified into the arbitrary-step randomly delayed high-degree cubature Huber-based filter using the Huber technique, which is essentially an M-estimator. The proposed filter is therefore robust not only to randomly delayed measurements but also to glint noise. The application to a boost phase tracking example demonstrates the superiority of the proposed algorithms.
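The Bernoulli delay model from the abstract, specialized to a one-step delay for illustration, says the filter receives z_k = (1 − γ_k) y_k + γ_k y_{k−1} with γ_k ~ Bernoulli(p). A minimal sketch of such a measurement stream (the filter reformulation itself is beyond a sketch, and the seeded RNG is an assumption for reproducibility):

```python
import random

def delayed_stream(measurements, p_delay, seed=0):
    """Emit y_k with probability 1 - p_delay, else the previous y_{k-1} (one-step delay).
    The first sample has no predecessor, so it is emitted as-is."""
    rng = random.Random(seed)
    out, prev = [], measurements[0]
    for y in measurements:
        gamma = rng.random() < p_delay  # gamma_k ~ Bernoulli(p_delay)
        out.append(prev if gamma else y)
        prev = y
    return out

ys = [1.0, 2.0, 3.0, 4.0]
zs_no_delay = delayed_stream(ys, 0.0)   # p = 0: the filter always sees y_k
zs_all_delay = delayed_stream(ys, 1.0)  # p = 1: the filter always sees y_{k-1}
```

The paper's contribution is precisely that the delay may be an arbitrary number of steps with unequal Bernoulli probabilities, rather than the fixed one- or two-step case sketched here.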
Simulation of the mechanical behavior of random fiber networks with different microstructure.
Hatami-Marbini, H
2018-05-24
Filamentous protein networks are broadly encountered in biological systems such as cytoskeleton and extracellular matrix. Many numerical studies have been conducted to better understand the fundamental mechanisms behind the striking mechanical properties of these networks. In most of these previous numerical models, the Mikado algorithm has been used to represent the network microstructure. Here, a different algorithm is used to create random fiber networks in order to investigate possible roles of architecture on the elastic behavior of filamentous networks. In particular, random fibrous structures are generated from the growth of individual fibers from random nucleation points. We use computer simulations to determine the mechanical behavior of these networks in terms of their model parameters. The findings are presented and discussed along with the response of Mikado fiber networks. We demonstrate that these alternative networks and Mikado networks show a qualitatively similar response. Nevertheless, the overall elasticity of Mikado networks is stiffer compared to that of the networks created using the alternative algorithm. We describe the effective elasticity of both network types as a function of their line density and of the material properties of the filaments. We also characterize the ratio of bending and axial energy and discuss the behavior of these networks in terms of their fiber density distribution and coordination number.
Longitudinal data analysis with non-ignorable missing data.
Tseng, Chi-hong; Elashoff, Robert; Li, Ning; Li, Gang
2016-02-01
A common problem in longitudinal data analysis is missing data. Two types of missing patterns are generally considered in the statistical literature: monotone and non-monotone missing data. Non-monotone missing data occur when study participants intermittently miss scheduled visits, while monotone missing data can arise from discontinued participation, loss to follow-up, and mortality. Although many novel statistical approaches have been developed to handle missing data in recent years, few methods are available to provide inferences that handle both types of missing data simultaneously. In this article, a latent random effects model is proposed to analyze longitudinal outcomes with both monotone and non-monotone missingness in the context of missing not at random. Another significant contribution of this article is a new computational algorithm for latent random effects models. To reduce the computational burden of the high-dimensional integration problem in latent random effects models, we develop a new computational algorithm that uses a new adaptive quadrature approach in conjunction with the Taylor series approximation for the likelihood function to simplify the E-step computation in the expectation-maximization algorithm. A simulation study is performed, and data from the scleroderma lung study are used to demonstrate the effectiveness of this method. © The Author(s) 2012.
Adaptive mechanism-based congestion control for networked systems
NASA Astrophysics Data System (ADS)
Liu, Zhi; Zhang, Yun; Chen, C. L. Philip
2013-03-01
In order to assure communication quality in network systems with heavy traffic and limited bandwidth, a new ATRED (adaptive thresholds random early detection) congestion control algorithm is proposed for the congestion avoidance and resource management of network systems. Unlike traditional AQM (active queue management) algorithms, the control parameters of ATRED are not configured statically but are dynamically adjusted by an adaptive mechanism. By integrating the adaptive strategy, ATRED alleviates the tuning difficulty of RED (random early detection), shows better control of queue management, and achieves more robust performance than RED under varying network conditions. Furthermore, a dynamic transmission control protocol-AQM control system using the ATRED controller is introduced for systematic analysis. It is proved that the stability of the network system can be guaranteed when the adaptive mechanism is finely designed. Simulation studies show that the proposed ATRED algorithm achieves good performance in varying network environments, is superior to the RED and Gentle-RED algorithms, and provides more reliable service under varying network conditions.
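Classic RED, which ATRED adapts, drops packets with a probability that ramps linearly between a minimum and maximum average-queue threshold. A minimal sketch of that drop rule follows; ATRED's contribution, tuning min_th/max_th/max_p online, is represented here only as parameters.

```python
def red_drop_probability(avg_queue, min_th, max_th, max_p):
    """Classic RED drop rule: no drops below min_th, linear ramp up to max_p
    as the average queue grows toward max_th, drop everything above max_th."""
    if avg_queue < min_th:
        return 0.0
    if avg_queue >= max_th:
        return 1.0
    return max_p * (avg_queue - min_th) / (max_th - min_th)

# example thresholds (illustrative values, not from the paper)
p_low = red_drop_probability(3, 5, 15, 0.1)    # below min_th: never drop
p_mid = red_drop_probability(10, 5, 15, 0.1)   # halfway up the ramp
p_high = red_drop_probability(20, 5, 15, 0.1)  # above max_th: always drop
```

RED's known tuning difficulty is exactly that a fixed (min_th, max_th, max_p) triple performs poorly when traffic varies, which motivates adapting those thresholds as ATRED does.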
Tan, Robin; Perkowski, Marek
2017-01-01
Electrocardiogram (ECG) signals sensed from mobile devices carry the potential for biometric identity recognition applicable in remote access control systems where enhanced data security is in demand. In this study, we propose a new algorithm that consists of a two-stage classifier combining random forest and a wavelet distance measure through a probabilistic threshold schema, to improve the effectiveness and robustness of a biometric recognition system using ECG data acquired from a biosensor integrated into mobile devices. The proposed algorithm is evaluated using a mixed dataset from 184 subjects under different health conditions. The proposed two-stage classifier achieves a total of 99.52% subject verification accuracy, better than the 98.33% accuracy from random forest alone and 96.31% accuracy from the wavelet distance measure algorithm alone. These results demonstrate the superiority of the proposed algorithm for biometric identification, hence supporting its practicality in areas such as cloud data security, cyber-security or remote healthcare systems. PMID:28230745
An efficient randomized algorithm for contact-based NMR backbone resonance assignment.
Kamisetty, Hetunandan; Bailey-Kellogg, Chris; Pandurangan, Gopal
2006-01-15
Backbone resonance assignment is a critical bottleneck in studies of protein structure, dynamics and interactions by nuclear magnetic resonance (NMR) spectroscopy. A minimalist approach to assignment, which we call 'contact-based', seeks to dramatically reduce experimental time and expense by replacing the standard suite of through-bond experiments with the through-space (nuclear Overhauser enhancement spectroscopy, NOESY) experiment. In the contact-based approach, spectral data are represented in a graph with vertices for putative residues (of unknown relation to the primary sequence) and edges for hypothesized NOESY interactions, such that observed spectral peaks could be explained if the residues were 'close enough'. Due to experimental ambiguity, several incorrect edges can be hypothesized for each spectral peak. An assignment is derived by identifying consistent patterns of edges (e.g. for alpha-helices and beta-sheets) within a graph and by mapping the vertices to the primary sequence. The key algorithmic challenge is to be able to uncover these patterns even when they are obscured by significant noise. This paper develops, analyzes and applies a novel algorithm for the identification of polytopes representing consistent patterns of edges in a corrupted NOESY graph. Our randomized algorithm aggregates simplices into polytopes and fixes inconsistencies with simple local modifications, called rotations, that maintain most of the structure already uncovered. In characterizing the effects of experimental noise, we employ an NMR-specific random graph model in proving that our algorithm gives optimal performance in expected polynomial time, even when the input graph is significantly corrupted. We confirm this analysis in simulation studies with graphs corrupted by up to 500% noise. Finally, we demonstrate the practical application of the algorithm on several experimental beta-sheet datasets. 
Our approach is able to eliminate a large majority of noise edges and to uncover large consistent sets of interactions. Our algorithm has been implemented in platform-independent Python code. The software can be freely obtained for academic use by request from the authors.
Ringed Seal Search for Global Optimization via a Sensitive Search Model.
Saadi, Younes; Yanto, Iwan Tri Riyadi; Herawan, Tutut; Balakrishnan, Vimala; Chiroma, Haruna; Risnumawan, Anhar
2016-01-01
The efficiency of a metaheuristic algorithm for global optimization is based on its ability to search for and find the global optimum. However, a good search often requires a balance between exploration and exploitation of the search space. In this paper, a new metaheuristic algorithm called Ringed Seal Search (RSS) is introduced. It is inspired by the natural behavior of the seal pup. This algorithm mimics the seal pup's movement behavior and its ability to search for and choose the best lair to escape predators. The scenario starts once the seal mother gives birth to a new pup in a birthing lair that is constructed for this purpose. The seal pup strategy consists of searching for and selecting the best lair by performing a random walk to find a new lair. Affected by the sensitivity of seals to external noise emitted by predators, the random walk of the seal pup takes two different search states, a normal state and an urgent state. In the normal state, the pup performs an intensive search between closely adjacent lairs; this movement is modeled via a Brownian walk. In the urgent state, the pup leaves the proximity area and performs an extensive search to find a new lair among sparse targets; this movement is modeled via a Levy walk. The switch between these two states is triggered by the random noise emitted by predators. The algorithm keeps switching between the normal and urgent states until the global optimum is reached. Tests and validations were performed using fifteen benchmark test functions to compare the performance of RSS with other baseline algorithms. The results show that RSS is more efficient than Genetic Algorithm, Particle Swarm Optimization and Cuckoo Search in terms of convergence rate to the global optimum. The RSS shows an improvement in terms of the balance between exploration (extensive) and exploitation (intensive) of the search space.
The RSS can efficiently mimic seal pup behavior to find the best lair and provides a new algorithm for use in global optimization problems.
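The two-state switching idea described above can be sketched in a few lines. This is an illustrative minimal version: the state-switch probability, step sizes, and Levy tail index below are made-up parameters, not the RSS update rules from the paper.

```python
import random
import math

def rss_sketch(f, dim=2, iters=2000, seed=0):
    """Two-state random-walk minimizer: short Brownian steps (normal
    state) vs. long heavy-tailed jumps (urgent state), switched at random."""
    rng = random.Random(seed)
    best = [rng.uniform(-5, 5) for _ in range(dim)]
    fbest = f(best)
    for _ in range(iters):
        if rng.random() < 0.8:
            # normal state: intensive local (Brownian) search
            step = [rng.gauss(0, 0.1) for _ in range(dim)]
        else:
            # urgent state: extensive Levy-like jump with heavy-tailed length
            scale = 0.1 / ((1.0 - rng.random()) ** (1 / 1.5))
            step = [rng.gauss(0, 1) for _ in range(dim)]
            norm = math.sqrt(sum(s * s for s in step)) or 1.0
            step = [scale * s / norm for s in step]
        cand = [a + b for a, b in zip(best, step)]
        fc = f(cand)
        if fc < fbest:  # keep the better "lair"
            best, fbest = cand, fc
    return best, fbest

sphere = lambda v: sum(t * t for t in v)
x, fx = rss_sketch(sphere)
```

On a convex test function such as the sphere, the greedy accept-if-better rule drives the walker toward the optimum; the heavy-tailed jumps only matter on multimodal landscapes.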
Inference from clustering with application to gene-expression microarrays.
Dougherty, Edward R; Barrera, Junior; Brun, Marcel; Kim, Seungchan; Cesar, Roberto M; Chen, Yidong; Bittner, Michael; Trent, Jeffrey M
2002-01-01
There are many algorithms to cluster sample data points based on nearness or a similarity measure. Often the implication is that points in different clusters come from different underlying classes, whereas those in the same cluster come from the same class. Stochastically, the underlying classes represent different random processes. The inference is that clusters represent a partition of the sample points according to which process they belong. This paper discusses a model-based clustering toolbox that evaluates cluster accuracy. Each random process is modeled as its mean plus independent noise, sample points are generated, the points are clustered, and the clustering error is the number of points clustered incorrectly according to the generating random processes. Various clustering algorithms are evaluated based on process variance and the key issue of the rate at which algorithmic performance improves with increasing numbers of experimental replications. The model means can be selected by hand to test the separability of expected types of biological expression patterns. Alternatively, the model can be seeded by real data to test the expected precision of that output or the extent of improvement in precision that replication could provide. In the latter case, a clustering algorithm is used to form clusters, and the model is seeded with the means and variances of these clusters. Other algorithms are then tested relative to the seeding algorithm. Results are averaged over various seeds. Output includes error tables and graphs, confusion matrices, principal-component plots, and validation measures. Five algorithms are studied in detail: K-means, fuzzy C-means, self-organizing maps, hierarchical Euclidean-distance-based and correlation-based clustering. The toolbox is applied to gene-expression clustering based on cDNA microarrays using real data. Expression profile graphics are generated and error analysis is displayed within the context of these profile graphics. 
A large amount of generated output is available over the web.
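The core evaluation loop described above (generate sample points as a class mean plus independent noise, cluster them, count misassignments) can be sketched as follows. The plain 2-means implementation and all means, variances, and sample sizes are illustrative choices, not the toolbox's actual code.

```python
import numpy as np

def kmeans2(X, iters=50, seed=0):
    """Plain 2-means (Lloyd's algorithm) with centers initialized at
    two randomly chosen data points."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), 2, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = []
        for k in (0, 1):
            pts = X[labels == k]
            new.append(pts.mean(axis=0) if len(pts) else centers[k])
        centers = np.array(new)
    return labels

def clustering_error(mean0, mean1, sigma, n=100, seed=1):
    """Generate points as class mean plus independent Gaussian noise,
    cluster them, and count misassigned points (up to label swap)."""
    rng = np.random.default_rng(seed)
    X = np.vstack([mean0 + sigma * rng.standard_normal((n, 2)),
                   mean1 + sigma * rng.standard_normal((n, 2))])
    truth = np.array([0] * n + [1] * n)
    labels = kmeans2(X)
    errs = min((labels != truth).sum(), (labels == truth).sum())
    return errs / (2 * n)

err = clustering_error(np.array([0.0, 0.0]), np.array([5.0, 5.0]), 0.5)
```

Sweeping `sigma` upward and averaging `err` over seeds reproduces the basic experiment: error as a function of process variance for a given clustering algorithm.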
2012-01-01
Background Hemorrhagic events are frequent in patients on treatment with antivitamin-K oral anticoagulants due to their narrow therapeutic margin. Studies performed with acenocoumarol have shown the relationship between demographic, clinical and genotypic variants and the response to these drugs. Once the influence of these genetic and clinical factors on the dose of acenocoumarol needed to maintain a stable international normalized ratio (INR) has been demonstrated, new strategies need to be developed to predict the appropriate doses of this drug. Several pharmacogenetic algorithms have been developed for warfarin, but only three have been developed for acenocoumarol. After the development of a pharmacogenetic algorithm, the obvious next step is to demonstrate its effectiveness and utility by means of a randomized controlled trial. The aim of this study is to evaluate the effectiveness and efficiency of an acenocoumarol dosing algorithm developed by our group which includes demographic, clinical and pharmacogenetic variables (VKORC1, CYP2C9, CYP4F2 and ApoE) in patients with venous thromboembolism (VTE). Methods and design This is a multicenter, single blind, randomized controlled clinical trial. The protocol has been approved by La Paz University Hospital Research Ethics Committee and by the Spanish Drug Agency. Two hundred and forty patients with VTE in whom oral anticoagulant therapy is indicated will be included. Randomization (case/control 1:1) will be stratified by center. The acenocoumarol dose in the control group will be scheduled and adjusted following common clinical practice; in the experimental arm, dosing will follow an individualized algorithm developed and validated by our group. Patients will be followed for three months.
The main endpoints are: 1) Percentage of patients with INR within the therapeutic range on day seven after initiation of oral anticoagulant therapy; 2) Time from the start of oral anticoagulant treatment to achievement of a stable INR within the therapeutic range; 3) Number of INR determinations within the therapeutic range in the first six weeks of treatment. Discussion To date, there are no clinical trials comparing pharmacogenetic acenocoumarol dosing algorithm versus routine clinical practice in VTE. Implementation of this pharmacogenetic algorithm in the clinical practice routine could reduce side effects and improve patient safety. Trial registration Eudra CT. Identifier: 2009-016643-18. PMID:23237631
Fractal Landscape Algorithms for Environmental Simulations
NASA Astrophysics Data System (ADS)
Mao, H.; Moran, S.
2014-12-01
Natural science and geographical research are now able to take advantage of environmental simulations that more accurately test experimental hypotheses, resulting in deeper understanding. Experiments affected by the natural environment can benefit from 3D landscape simulations capable of simulating a variety of terrains and environmental phenomena. Such simulations can employ random terrain generation algorithms that dynamically simulate environments to test specific models against a variety of factors. Through the use of noise functions such as Perlin noise and Simplex noise, together with the diamond-square algorithm, computers can generate simulations that model a variety of landscapes and ecosystems. This study shows how these algorithms work together to create realistic landscapes. By seeding values into the diamond-square algorithm, one can control the shape of the landscape. Perlin noise and Simplex noise are also used to simulate moisture and temperature. The smooth gradient created by coherent noise allows more realistic landscapes to be simulated. Terrain generation algorithms can be used in environmental studies and physics simulations. Potential studies that would benefit from simulations include the geophysical impact of flash floods or drought on a particular region and regional impacts on low-lying areas due to global warming and rising sea levels. Furthermore, terrain generation algorithms also serve as aesthetic tools to display landscapes (Google Earth) and simulate planetary landscapes. Hence, they can be used as tools to assist science education. Algorithms used to generate these natural phenomena provide scientists a different approach to analyzing our world. The random algorithms used in terrain generation not only contribute to generating the terrains themselves, but are also capable of simulating weather patterns.
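A minimal version of the diamond-square heightmap generation mentioned above is sketched below; the corner seeding and the halving roughness falloff are illustrative choices.

```python
import random

def diamond_square(n, roughness=1.0, seed=0):
    """Generate a (2**n + 1) x (2**n + 1) heightmap. Corner seeds control
    the overall shape; the noise amplitude halves at each scale."""
    size = 2 ** n + 1
    rng = random.Random(seed)
    h = [[0.0] * size for _ in range(size)]
    # seed the four corners (these values steer the landscape's shape)
    for r in (0, size - 1):
        for c in (0, size - 1):
            h[r][c] = rng.uniform(-1, 1)
    step, amp = size - 1, roughness
    while step > 1:
        half = step // 2
        # diamond step: center of each square = mean of its corners + noise
        for r in range(half, size, step):
            for c in range(half, size, step):
                avg = (h[r - half][c - half] + h[r - half][c + half] +
                       h[r + half][c - half] + h[r + half][c + half]) / 4
                h[r][c] = avg + rng.uniform(-amp, amp)
        # square step: edge midpoints = mean of in-bounds neighbors + noise
        for r in range(0, size, half):
            start = half if (r // half) % 2 == 0 else 0
            for c in range(start, size, step):
                nbrs = [h[r + dr][c + dc] for dr, dc in
                        ((-half, 0), (half, 0), (0, -half), (0, half))
                        if 0 <= r + dr < size and 0 <= c + dc < size]
                h[r][c] = sum(nbrs) / len(nbrs) + rng.uniform(-amp, amp)
        step, amp = half, amp / 2
    return h

terrain = diamond_square(4)
```

The same grid, fed through independent coherent-noise fields for moisture and temperature, gives the layered environmental simulation the abstract describes.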
Webb, Samuel J; Hanser, Thierry; Howlin, Brendan; Krause, Paul; Vessey, Jonathan D
2014-03-25
A new algorithm has been developed to enable the interpretation of black box models. The developed algorithm is agnostic to the learning algorithm and open to all structure-based descriptors such as fragments, keys and hashed fingerprints. The algorithm has provided meaningful interpretation of Ames mutagenicity predictions from both random forest and support vector machine models built on a variety of structural fingerprints. A fragmentation algorithm is utilised to investigate the model's behaviour on specific substructures present in the query. An output is formulated summarising causes of activation and deactivation. The algorithm is able to identify multiple causes of activation or deactivation, in addition to identifying localised deactivations where the prediction for the query is active overall. No loss in performance is seen, as there is no change in the prediction; the interpretation is produced directly from the model's behaviour for the specific query. Models have been built using multiple learning algorithms including support vector machine and random forest. The models were built on public Ames mutagenicity data and a variety of fingerprint descriptors were used. These models produced good performance in both internal and external validation, with accuracies around 82%. The models were used to evaluate the interpretation algorithm. The interpretation revealed links that correspond closely with understood mechanisms for Ames mutagenicity. This methodology allows for a greater utilisation of the predictions made by black box models and can expedite further study based on the output of a (quantitative) structure-activity model. Additionally, the algorithm could be utilised for chemical dataset investigation and knowledge extraction/human SAR development.
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Dieudonne, J. E.; Filippas, T. A.
1971-01-01
An algorithm employing a modified sequential random perturbation, or creeping random search, was applied to the problem of optimizing the parameters of a high-energy beam transport system. The stochastic solution of the mathematical model for first-order magnetic-field expansion allows the inclusion of state-variable constraints, and the method of applying the algorithm permits parameter constraints, eliminating the possibility of infeasible solutions. The mathematical model and the algorithm were programmed for a real-time simulation facility; thus, two important features are provided to the beam designer: (1) a strong degree of man-machine communication (even to the extent of bypassing the algorithm and applying analog-matching techniques), and (2) extensive graphics for displaying information concerning both algorithm operation and transport-system behavior. Chromatic aberration was also included in the mathematical model and in the optimization process. Results presented show this method yielding better solutions (in terms of resolution) to the particular problem than those of a standard analog program, as well as demonstrating the flexibility, in terms of elements, constraints, and chromatic aberration, allowed by user interaction with both the algorithm and the stochastic model. Examples of slit usage and a limited comparison of predicted results and actual results obtained with a 600 MeV cyclotron are given.
Design of nucleic acid sequences for DNA computing based on a thermodynamic approach
Tanaka, Fumiaki; Kameda, Atsushi; Yamamoto, Masahito; Ohuchi, Azuma
2005-01-01
We have developed an algorithm for designing multiple sequences of nucleic acids that have a uniform melting temperature between each sequence and its complement and that do not hybridize non-specifically with each other, based on the minimum free energy (ΔGmin). Sequences that satisfy these constraints can be utilized in computations, various engineering applications such as microarrays, and nano-fabrications. Our algorithm is a random generate-and-test algorithm: it generates a candidate sequence randomly and tests whether the sequence satisfies the constraints. The novelty of our algorithm is that the filtering method uses a greedy search to calculate ΔGmin. This effectively excludes inappropriate sequences before ΔGmin is calculated, thereby reducing computation time drastically compared with an algorithm without the filtering. Experimental results in silico showed the superiority of the greedy search over the traditional approach based on the Hamming distance. In addition, experimental results in vitro demonstrated that the experimental free energy (ΔGexp) of 126 sequences correlated better with ΔGmin (|R| = 0.90) than with the Hamming distance (|R| = 0.80). These results validate the rationality of a thermodynamic approach. We implemented our algorithm in a graphical user interface-based program written in Java. PMID:15701762
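The generate-and-test structure described above can be sketched as follows. Since computing ΔGmin requires a thermodynamic model not given in the abstract, this illustrative version substitutes a pairwise Hamming-distance filter (the baseline approach the paper compares against) as the accept/reject test; the sequence length and distance threshold are arbitrary.

```python
import random

BASES = "ACGT"

def hamming(a, b):
    """Number of mismatching positions between equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def design_sequences(n_seqs, length, min_dist, seed=0):
    """Random generate-and-test: draw a candidate, keep it only if it is
    at least min_dist mismatches away from every accepted sequence (and
    from their reverse complements, as a crude cross-hybridization check)."""
    comp = str.maketrans("ACGT", "TGCA")
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < n_seqs:
        cand = "".join(rng.choice(BASES) for _ in range(length))
        ok = all(hamming(cand, s) >= min_dist and
                 hamming(cand, s.translate(comp)[::-1]) >= min_dist
                 for s in accepted)
        if ok:
            accepted.append(cand)
    return accepted

seqs = design_sequences(5, 20, 8)
```

The paper's contribution is precisely to replace this distance test with a greedy ΔGmin filter, which the in vitro results show tracks actual hybridization free energy more faithfully.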
What a Difference a Parameter Makes: a Psychophysical Comparison of Random Dot Motion Algorithms
Pilly, Praveen K.; Seitz, Aaron R.
2009-01-01
Random dot motion (RDM) displays have emerged as one of the standard stimulus types employed in psychophysical and physiological studies of motion processing. RDMs are convenient because it is straightforward to manipulate the relative motion energy for a given motion direction in addition to stimulus parameters such as speed, contrast, duration, density, and aperture. However, as widely as RDMs are employed, so too do they vary in their details of implementation. As a result, it is often difficult to make direct comparisons across studies employing different RDM algorithms and parameters. Here, we systematically measure the ability of human subjects to estimate motion direction for four commonly used RDM algorithms under a range of parameters in order to understand how these different algorithms compare in their perceptibility. We find that parametric and algorithmic differences can produce dramatically different performances. These effects, while surprising, can be understood in relation to pertinent neurophysiological data regarding spatiotemporal displacement tuning properties of cells in area MT and how the tuning function changes with stimulus contrast and retinal eccentricity. These data provide a baseline by which different RDM algorithms can be compared, demonstrate a need for clearly reporting RDM details in the methods of papers, and also pose new constraints and challenges to models of motion direction processing. PMID:19336240
Tsuruta, S; Misztal, I; Strandén, I
2001-05-01
Utility of the preconditioned conjugate gradient algorithm with a diagonal preconditioner for solving mixed-model equations in animal breeding applications was evaluated with 16 test problems. The problems included single- and multiple-trait analyses, with data on beef, dairy, and swine ranging from small examples to national data sets. Multiple-trait models considered low and high genetic correlations. Convergence was based on relative differences between left- and right-hand sides. The ordering of equations was fixed effects followed by random effects, with no special ordering within random effects. The preconditioned conjugate gradient program implemented with double precision converged for all models. However, when implemented in single precision, the preconditioned conjugate gradient algorithm did not converge for seven large models. The preconditioned conjugate gradient and successive overrelaxation algorithms were subsequently compared for 13 of the test problems. The preconditioned conjugate gradient algorithm was easy to implement with the iteration on data for general models. However, successive overrelaxation requires specific programming for each set of models. On average, the preconditioned conjugate gradient algorithm converged in three times fewer rounds of iteration than successive overrelaxation. With straightforward implementations, programs using the preconditioned conjugate gradient algorithm may be two or more times faster than those using successive overrelaxation. However, programs using the preconditioned conjugate gradient algorithm would use more memory than would comparable implementations using successive overrelaxation. Extensive optimization of either algorithm can influence rankings. 
The preconditioned conjugate gradient implemented with iteration on data, a diagonal preconditioner, and in double precision may be the algorithm of choice for solving mixed-model equations when sufficient memory is available and ease of implementation is essential.
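A textbook sketch of conjugate gradient with a diagonal (Jacobi) preconditioner, the combination recommended above, is given below. The small random symmetric positive-definite test system is illustrative only, not one of the paper's animal-breeding mixed-model problems.

```python
import numpy as np

def pcg_diag(A, b, tol=1e-10, max_iter=500):
    """Conjugate gradient with a diagonal (Jacobi) preconditioner:
    M = diag(A), applied as a cheap elementwise division."""
    x = np.zeros_like(b)
    r = b - A @ x                 # initial residual
    Minv = 1.0 / np.diag(A)       # inverse of the diagonal preconditioner
    z = Minv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = Minv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # update search direction
        rz = rz_new
    return x

# small symmetric positive-definite test system (illustrative)
rng = np.random.default_rng(0)
B = rng.standard_normal((20, 20))
A = B @ B.T + 20 * np.eye(20)
b = rng.standard_normal(20)
x = pcg_diag(A, b)
```

Note the double-precision arithmetic throughout; as the abstract reports, single precision can prevent convergence on large, ill-conditioned mixed-model systems.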
Dinucleotide controlled null models for comparative RNA gene prediction.
Gesell, Tanja; Washietl, Stefan
2008-05-27
Comparative prediction of RNA structures can be used to identify functional noncoding RNAs in genomic screens. It was shown recently by Babak et al. [BMC Bioinformatics. 8:33] that RNA gene prediction programs can be biased by the genomic dinucleotide content, in particular those programs using a thermodynamic folding model including stacking energies. As a consequence, there is a need for dinucleotide-preserving control strategies to assess the significance of such predictions. While there have been randomization algorithms for single sequences for many years, the problem has remained challenging for multiple alignments and there is currently no algorithm available. We present a program called SISSIz that simulates multiple alignments of a given average dinucleotide content. Meeting additional requirements of an accurate null model, the randomized alignments are on average of the same sequence diversity and preserve local conservation and gap patterns. We make use of a phylogenetic substitution model that includes overlapping dependencies and site-specific rates. Using fast heuristics and a distance-based approach, a tree is estimated under this model which is used to guide the simulations. The new algorithm is tested on vertebrate genomic alignments and the effect on RNA structure predictions is studied. In addition, we directly combined the new null model with the RNAalifold consensus folding algorithm, giving a new variant of a thermodynamic-structure-based RNA gene finding program that is not biased by the dinucleotide content. SISSIz implements an efficient algorithm to randomize multiple alignments preserving dinucleotide content. It can be used to get more accurate estimates of false positive rates of existing programs, to produce negative controls for the training of machine learning based programs, or as a standalone RNA gene finding program. Other applications in comparative genomics that require randomization of multiple alignments can be considered.
SISSIz is available as open source C code that can be compiled for every major platform and downloaded here: http://sourceforge.net/projects/sissiz.
Operational Modal Analysis of Bridge Structures with Data from GNSS/Accelerometer Measurements.
Xiong, Chunbao; Lu, Huali; Zhu, Jinsong
2017-02-23
Real-time dynamic displacement and acceleration responses of the main span section of the Tianjin Fumin Bridge in China under ambient excitation were tested using a Global Navigation Satellite System (GNSS) dynamic deformation monitoring system and an acceleration sensor vibration test system. Considering the close relationship between the GNSS multipath errors and the measurement environment, in combination with the noise reduction characteristics of different filtering algorithms, the researchers proposed an AFEC mixed filtering algorithm, which is a combination of autocorrelation function-based empirical mode decomposition (EMD) and Chebyshev mixed filtering, to extract the real vibration displacement of the bridge structure after system error correction and filtering de-noising of signals collected by the GNSS. The proposed AFEC mixed filtering algorithm achieved high accuracy (1 mm) for the real displacement in the elevation direction. Next, the traditional random decrement technique (used mainly for stationary random processes) was expanded to non-stationary random processes. Combining the expanded random decrement technique (RDT) and the autoregressive moving average model (ARMA), the modal frequency of the bridge structural system was extracted using an expanded ARMA_RDT modal identification method, which was compared with the power spectrum analysis results of the acceleration signal and finite element analysis results. Identification results demonstrated that the proposed algorithm is applicable to analyzing the dynamic displacement monitoring data of real bridge structures under ambient excitation and could identify the first five orders of the inherent frequencies of the structural system accurately. The identification error of the inherent frequency was smaller than 6%, indicating the high identification accuracy of the proposed algorithm.
Furthermore, the GNSS dynamic deformation monitoring method can be used to monitor dynamic displacement and identify the modal parameters of bridge structures. The GNSS can monitor the working state of bridges effectively and accurately. Research results can provide references to evaluate the bearing capacity, safety performance, and durability of bridge structures during operation.
NASA Astrophysics Data System (ADS)
Adame, J.; Warzel, S.
2015-11-01
In this note, we use ideas of Farhi et al. [Int. J. Quantum. Inf. 6, 503 (2008) and Quantum Inf. Comput. 11, 840 (2011)] who link a lower bound on the run time of their quantum adiabatic search algorithm to an upper bound on the energy gap above the ground-state of the generators of this algorithm. We apply these ideas to the quantum random energy model (QREM). Our main result is a simple proof of the conjectured exponential vanishing of the energy gap of the QREM.
Security authentication using phase-encoded nanoparticle structures and polarized light.
Carnicer, Artur; Hassanfiroozi, Amir; Latorre-Carmona, Pedro; Huang, Yi-Pai; Javidi, Bahram
2015-01-15
Phase-encoded nanostructures such as quick response (QR) codes made of metallic nanoparticles are suggested for use in security and authentication applications. We present a polarimetric optical method able to authenticate random phase-encoded QR codes. The system is illuminated using polarized light, and the QR code is encoded using a phase-only random mask. Using classification algorithms, it is possible to validate the QR code from examination of the polarimetric signature of the speckle pattern. We used the Kolmogorov-Smirnov statistical test and Support Vector Machine algorithms to authenticate the phase-encoded QR codes using polarimetric signatures.
Random search optimization based on genetic algorithm and discriminant function
NASA Technical Reports Server (NTRS)
Kiciman, M. O.; Akgul, M.; Erarslanoglu, G.
1990-01-01
The general problem of optimization with arbitrary merit and constraint functions, which could be convex, concave, monotonic, or non-monotonic, is treated using stochastic methods. To improve the efficiency of the random search methods, a genetic algorithm for the search phase and a discriminant function for the constraint-control phase were utilized. The validity of the technique is demonstrated by comparing the results to published test problem results. Numerical experimentation indicated that for cases where a quick near-optimum solution is desired, a general, user-friendly optimization code can be developed without serious penalties in both total computer time and accuracy.
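A toy version of the genetic-algorithm search phase can be sketched with standard operators. The tournament selection, blend crossover, Gaussian mutation, and all parameters below are generic illustrative choices, since the abstract does not specify the paper's operators; the constraint-control discriminant function is omitted.

```python
import random

def ga_maximize(f, bounds, pop=30, gens=60, seed=0):
    """Toy genetic algorithm: elitism, tournament-of-3 selection,
    blend crossover, Gaussian mutation with bound clipping."""
    rng = random.Random(seed)
    lo, hi = bounds
    P = [[rng.uniform(lo, hi) for _ in range(2)] for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(P, key=f, reverse=True)
        nxt = scored[:2]                              # carry the two elites
        while len(nxt) < pop:
            # two parents, each the best of a random tournament of 3
            a, b = (max(rng.sample(scored, 3), key=f) for _ in range(2))
            w = rng.random()
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]
            # Gaussian mutation, clipped to the search bounds
            child = [min(hi, max(lo, c + rng.gauss(0, 0.1))) for c in child]
            nxt.append(child)
        P = nxt
    best = max(P, key=f)
    return best, f(best)

# illustrative concave merit function with maximum at (1, -2)
merit = lambda v: -(v[0] - 1) ** 2 - (v[1] + 2) ** 2
best, fbest = ga_maximize(merit, (-5, 5))
```

In the paper's scheme, a discriminant function would additionally steer candidates away from the infeasible region during this loop.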
Multisource passive acoustic tracking: an application of random finite set data fusion
NASA Astrophysics Data System (ADS)
Ali, Andreas M.; Hudson, Ralph E.; Lorenzelli, Flavio; Yao, Kung
2010-04-01
Multisource passive acoustic tracking is useful in animal bio-behavioral studies, replacing or enhancing human involvement during and after field data collection. Multiple simultaneous vocalizations are a common occurrence in a forest or a jungle, where many species are encountered. Given a set of nodes capable of producing multiple direction-of-arrival (DOA) estimates, such data need to be combined into meaningful estimates. The random finite set framework provides a probabilistic model suitable for analysis and for the synthesis of an optimal estimation algorithm. The proposed algorithm was verified using a simulation and a controlled test experiment.
Standard random number generation for MBASIC
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1976-01-01
A machine-independent algorithm is presented and analyzed for generating pseudorandom numbers suitable for the standard MBASIC system. The algorithm used is the polynomial congruential or linear recurrence modulo 2 method. Numbers, formed as nonoverlapping adjacent 28-bit words taken from the bit stream produced by the recurrence a(m+532) = a(m+37) + a(m) (modulo 2), do not repeat within the projected age of the solar system, show no ensemble correlation, exhibit uniform distribution of adjacent numbers up to 19 dimensions, and do not deviate from random runs-up and runs-down behavior.
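Assuming the recurrence is a(m+532) = a(m+37) + a(m) (mod 2), the word-packing scheme can be sketched as follows. The initial register fill below is an arbitrary nonzero pattern, not the seeding prescribed for MBASIC.

```python
def tausworthe_words(n_words, p=532, q=37, word_bits=28, seed=0b1011):
    """Bit stream from the GF(2) linear recurrence a[m+p] = a[m+q] + a[m]
    (mod 2), packed into nonoverlapping adjacent word_bits-bit words."""
    # initial register fill: any nonzero pattern of p bits (arbitrary here)
    state = [(seed >> (i % 4)) & 1 for i in range(p)]
    words = []
    for _ in range(n_words):
        w = 0
        for _ in range(word_bits):
            new = state[q] ^ state[0]   # a[m+p] = a[m+q] XOR a[m]
            bit = state.pop(0)          # emit a[m], shift the register
            state.append(new)
            w = (w << 1) | bit
        words.append(w)
    return words

ws = tausworthe_words(4)
```

Because the recurrence is linear over GF(2) with a primitive-trinomial characteristic polynomial, the bit stream's period is astronomically long, which is the source of the "does not repeat within the projected age of the solar system" claim.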
Using Chaotic System in Encryption
NASA Astrophysics Data System (ADS)
Findik, Oğuz; Kahramanli, Şirzat
In this paper, chaotic systems and the RSA encryption algorithm are combined in order to develop an encryption algorithm that meets modern standards. E. Lorenz's weather forecast equations, which are used to simulate non-linear systems, are utilized to create a chaotic map. These equations can be used to generate random numbers. In order to meet up-to-date standards and support both online and offline use, a new encryption technique that combines chaotic systems and the RSA encryption algorithm has been developed. The combination of the RSA algorithm and chaotic systems strengthens the encryption system.
A simple approach to nonlinear estimation of physical systems
Christakos, G.
1988-01-01
Recursive algorithms for estimating the states of nonlinear physical systems are developed. This requires some key hypotheses regarding the structure of the underlying processes. Members of this class of random processes have several desirable properties for the nonlinear estimation of random signals. An assumption is made about the form of the estimator, which may then take account of a wide range of applications. Under the above assumption, the estimation algorithm is mathematically suboptimal but effective and computationally attractive. It may be compared favorably to Taylor series-type filters, nonlinear filters which approximate the probability density by Edgeworth or Gram-Charlier series, as well as to conventional statistical linearization-type estimators. To link theory with practice, some numerical results for a simulated system are presented, in which the responses from the proposed and the extended Kalman algorithms are compared. © 1988.
Mining Distance Based Outliers in Near Linear Time with Randomization and a Simple Pruning Rule
NASA Technical Reports Server (NTRS)
Bay, Stephen D.; Schwabacher, Mark
2003-01-01
Defining outliers by their distance to neighboring examples is a popular approach to finding unusual examples in a data set. Recently, much work has been conducted with the goal of finding fast algorithms for this task. We show that a simple nested loop algorithm that in the worst case is quadratic can give near linear time performance when the data is in random order and a simple pruning rule is used. We test our algorithm on real high-dimensional data sets with millions of examples and show that the near linear scaling holds over several orders of magnitude. Our average case analysis suggests that much of the efficiency is because the time to process non-outliers, which are the majority of examples, does not depend on the size of the data set.
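The nested-loop scheme with randomization and the simple pruning rule can be sketched as follows. The parameters (neighborhood size k, number of outliers) and the test data are illustrative; the pruning is sound because a point's k-th-smallest distance seen so far can only shrink as more neighbors are examined.

```python
import random
import math

def top_outliers(points, k=3, n_out=2, seed=0):
    """Distance-based outlier search: score = distance to the k-th nearest
    neighbor. Randomize the scan order, and prune a point as soon as its
    running k-NN distance drops below the cutoff (the score of the weakest
    outlier found so far)."""
    rng = random.Random(seed)
    order = points[:]
    rng.shuffle(order)              # random order makes pruning fast on average
    top = []                        # (score, point) pairs, best first
    cutoff = 0.0
    for x in order:
        neigh = []                  # k smallest distances seen so far for x
        pruned = False
        for y in order:
            if y is x:
                continue
            neigh.append(math.dist(x, y))
            neigh.sort()
            del neigh[k:]           # keep only the k smallest
            if len(neigh) == k and neigh[-1] < cutoff:
                pruned = True       # x can no longer be a top outlier
                break
        if not pruned and len(neigh) == k:
            top.append((neigh[-1], x))
            top.sort(reverse=True)
            del top[n_out:]
            if len(top) == n_out:
                cutoff = top[-1][0]
    return [pt for _, pt in top]

# dense cluster along a line plus one planted outlier
pts = [(i * 0.1, 0.0) for i in range(20)] + [(5.0, 5.0)]
outs = top_outliers(pts, k=3, n_out=1)
```

Once the cutoff rises, most non-outliers are discarded after only a few distance computations, which is the source of the near linear average-case scaling reported above.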
Combined rule extraction and feature elimination in supervised classification.
Liu, Sheng; Patel, Ronak Y; Daga, Pankaj R; Liu, Haining; Fu, Gang; Doerksen, Robert J; Chen, Yixin; Wilkins, Dawn E
2012-09-01
There are a vast number of biology-related research problems involving a combination of multiple sources of data to achieve a better understanding of the underlying problems. It is important to select and interpret the most important information from these sources. Thus it is beneficial to have a good algorithm that simultaneously extracts rules and selects features for better interpretation of the predictive model. We propose an efficient algorithm, Combined Rule Extraction and Feature Elimination (CRF), based on 1-norm regularized random forests. CRF simultaneously extracts a small number of rules generated by random forests and selects important features. We applied CRF to several drug activity prediction and microarray data sets. CRF is capable of producing performance comparable with state-of-the-art prediction algorithms using a small number of decision rules. Some of the decision rules are biologically significant.
Bilayer segmentation of webcam videos using tree-based classifiers.
Yin, Pei; Criminisi, Antonio; Winn, John; Essa, Irfan
2011-01-01
This paper presents an automatic segmentation algorithm for video frames captured by a (monocular) webcam that closely approximates depth segmentation from a stereo camera. The frames are segmented into foreground and background layers that comprise a subject (participant) and other objects and individuals. The algorithm produces correct segmentations even in the presence of large background motion with a nearly stationary foreground. This research makes three key contributions: First, we introduce a novel motion representation, referred to as "motons," inspired by research in object recognition. Second, we propose estimating the segmentation likelihood from the spatial context of motion. The estimation is efficiently learned by random forests. Third, we introduce a general taxonomy of tree-based classifiers that facilitates both theoretical and experimental comparisons of several known classification algorithms and generates new ones. In our bilayer segmentation algorithm, diverse visual cues such as motion, motion context, color, contrast, and spatial priors are fused by means of a conditional random field (CRF) model. Segmentation is then achieved by binary min-cut. Experiments on many sequences of our videochat application demonstrate that our algorithm, which requires no initialization, is effective in a variety of scenes, and the segmentation results are comparable to those obtained by stereo systems.
Li, Yun; Wu, Wenqi; Jiang, Qingan; Wang, Jinling
2016-01-01
Based on stochastic modeling of Coriolis vibration gyros using the Allan variance technique, this paper discusses Angle Random Walk (ARW), Rate Random Walk (RRW), and Markov-process gyroscope noises, which have significant impacts on north-finding accuracy. A new continuous rotation alignment algorithm for a Coriolis vibration gyroscope Inertial Measurement Unit (IMU) is proposed, in which extended observation equations are used in the Kalman filter to enhance the estimation of gyro drift errors, thus improving north-finding accuracy. Theoretical and numerical comparisons between the proposed algorithm and the traditional ones are presented. The experimental results show that the new continuous rotation alignment algorithm using the extended observation equations in the Kalman filter is more efficient than the traditional two-position alignment method. Using Coriolis vibration gyros with bias instability of 0.1°/h, a north-finding accuracy of 0.1° (1σ) is achieved by the new continuous rotation alignment algorithm, compared with 0.6° (1σ) north-finding accuracy for the two-position alignment and 1° (1σ) for the fixed-position alignment. PMID:27983585
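The Allan variance used for this kind of gyro noise characterization can be sketched in a few lines. This is a generic non-overlapping estimator on simulated white rate noise, not the authors' processing chain: for pure ARW (white) noise the Allan variance falls roughly as 1/m with cluster size m.

```python
import random

def allan_variance(y, m):
    """Non-overlapping Allan variance of rate samples y for cluster size m:
    sigma^2 = 1/(2(K-1)) * sum_k (ybar_{k+1} - ybar_k)^2,
    where ybar_k is the average of the k-th cluster of m samples."""
    K = len(y) // m
    means = [sum(y[k * m:(k + 1) * m]) / m for k in range(K)]
    diffs = [(means[k + 1] - means[k]) ** 2 for k in range(K - 1)]
    return sum(diffs) / (2 * (K - 1))

random.seed(0)
# Simulated white rate noise: the ARW signature is a -1 slope of the
# Allan variance versus cluster size on a log-log plot.
y = [random.gauss(0.0, 1.0) for _ in range(100000)]
av1 = allan_variance(y, 10)    # short clusters
av2 = allan_variance(y, 100)   # long clusters: should be roughly 10x smaller
```

Plotting such estimates over a range of cluster sizes separates ARW, RRW, and Markov noise by their characteristic slopes.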
Batool, Nazre; Chellappa, Rama
2014-09-01
Facial retouching is widely used in the media and entertainment industry. Professional software usually requires a minimum level of user expertise to achieve the desirable results. In this paper, we present an algorithm to detect facial wrinkles/imperfections. We believe that any such algorithm would be amenable to facial retouching applications. The detection of wrinkles/imperfections can allow these skin features to be processed differently than the surrounding skin without much user interaction. For detection, Gabor filter responses along with a texture orientation field are used as image features. A bimodal Gaussian mixture model (GMM) represents the distributions of Gabor features of normal skin versus skin imperfections. Then, a Markov random field model is used to incorporate the spatial relationships among neighboring pixels for their GMM distributions and texture orientations. An expectation-maximization algorithm then classifies normal skin versus wrinkles/imperfections. Once detected automatically, wrinkles/imperfections are removed completely instead of being blended or blurred. We propose an exemplar-based constrained texture synthesis algorithm to inpaint the irregularly shaped gaps left by the removal of detected wrinkles/imperfections. We present results on images downloaded from the Internet to show the efficacy of our algorithms.
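The bimodal GMM fitted by expectation-maximization is the statistical core here. A minimal 1-D sketch of EM for a two-component mixture (standing in for "normal skin" vs. "imperfection" feature distributions; the actual method uses multidimensional Gabor features and an MRF prior, neither shown):

```python
import math, random

def em_gmm_1d(data, iters=50):
    """EM for a two-component 1-D Gaussian mixture (toy analogue of the
    bimodal GMM modeling normal skin vs. imperfections)."""
    mu = [min(data), max(data)]          # crude initialization
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: per-point responsibilities of each component.
        resp = []
        for x in data:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append((p[0] / s, p[1] / s))
        # M-step: re-estimate weights, means, variances.
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return mu, var, pi

random.seed(3)
data = ([random.gauss(0.0, 0.5) for _ in range(300)] +     # "normal skin"
        [random.gauss(4.0, 0.5) for _ in range(300)])      # "imperfections"
mu, var, pi = em_gmm_1d(data)
```

The MRF layer in the paper then smooths the per-pixel posteriors spatially before classification.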
Curve Set Feature-Based Robust and Fast Pose Estimation Algorithm
Hashimoto, Koichi
2017-01-01
Bin picking refers to picking randomly-piled objects from a bin for industrial production purposes, and robotic bin picking is widely used in automated assembly lines. In order to achieve higher productivity, a fast and robust pose estimation algorithm is necessary to recognize and localize the randomly-piled parts. This paper proposes a pose estimation algorithm for bin picking tasks using point cloud data. A novel descriptor, the Curve Set Feature (CSF), is proposed to describe a point by the surface fluctuation around it, and it is also capable of evaluating poses. The Rotation Match Feature (RMF) is proposed to match CSF efficiently. The matching process combines the 2-D matching idea of the original Point Pair Feature (PPF) algorithm with nearest neighbor search. A voxel-based pose verification method is introduced to evaluate the poses and is shown to be more than 30 times faster than kd-tree-based verification. Our algorithm is evaluated against a large number of synthetic and real scenes and proven to be robust to noise, able to detect metal parts, and both more accurate and more than 10 times faster than PPF and Oriented, Unique and Repeatable (OUR)-Clustered Viewpoint Feature Histogram (CVFH). PMID:28771216
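The speed of voxel-based pose verification comes from replacing nearest-neighbor queries with O(1) hash lookups. A toy sketch of the idea, with made-up geometry (the paper's actual scoring and voxel size are not specified here):

```python
def voxelize(points, size):
    """Hash points into a set of voxel indices for O(1) occupancy lookup."""
    return {(int(x // size), int(y // size), int(z // size))
            for x, y, z in points}

def verify_pose(model, scene_voxels, size, transform):
    """Score a pose hypothesis by the fraction of transformed model points
    that land in occupied scene voxels (the voxel-based verification idea)."""
    hits = 0
    for p in model:
        x, y, z = transform(p)
        if (int(x // size), int(y // size), int(z // size)) in scene_voxels:
            hits += 1
    return hits / len(model)

# Toy scene: points along a line; the model is a subset of the same line.
scene = [(i * 0.1, 0.0, 0.0) for i in range(100)]
model = [(i * 0.1, 0.0, 0.0) for i in range(50)]
vox = voxelize(scene, 0.2)
good = verify_pose(model, vox, 0.2, lambda p: p)                        # identity pose
bad = verify_pose(model, vox, 0.2, lambda p: (p[0], p[1] + 5.0, p[2]))  # wrong pose
```

Each candidate pose costs one set lookup per model point, versus a logarithmic-time kd-tree query, which is where the reported 30x speedup plausibly comes from.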
Automatic segmentation of psoriasis lesions
NASA Astrophysics Data System (ADS)
Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang
2014-10-01
The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the assessment of lesions. Current algorithms can only handle single erythema or only deal with scaling segmentation, whereas in practice scaling and erythema are often mixed together. In order to segment the lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied during imaging, exploiting the skin's Tyndall effect to eliminate reflection, and the Lab color space is used to match human perception. In the second step, a sliding window and its sub-windows are used to extract texture and color features; here a feature of image roughness is defined so that scaling can be easily separated from normal skin. Finally, random forests are used to ensure the generalization ability of the algorithm. The algorithm gives reliable segmentation results even when images have different lighting conditions and skin types. In the data set provided by Union Hospital, more than 90% of images can be segmented accurately.
SMERFS: Stochastic Markov Evaluation of Random Fields on the Sphere
NASA Astrophysics Data System (ADS)
Creasey, Peter; Lang, Annika
2018-04-01
SMERFS (Stochastic Markov Evaluation of Random Fields on the Sphere) creates large realizations of random fields on the sphere. It uses a fast algorithm based on Markov properties and 1-D fast Fourier transforms that generates samples on an n × n grid in O(n² log n) time and efficiently derives the necessary conditional covariance matrices.
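The FFT ingredient can be illustrated with the standard 1-D spectral-synthesis trick: colour white noise with the square root of a power spectrum. This is a minimal periodic analogue under an assumed toy spectrum, not the SMERFS algorithm itself (which additionally exploits Markov structure on the sphere):

```python
import numpy as np

def periodic_gaussian_field(n, power, seed=0):
    """Sample a stationary Gaussian field on a ring of n points by spectral
    synthesis: colour white noise with sqrt(power) in Fourier space."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n)
    spec = np.fft.fft(white) * np.sqrt(power)   # shape the spectrum
    return np.fft.ifft(spec).real               # back to the spatial domain

n = 256
k = np.fft.fftfreq(n) * n                       # integer wavenumbers
power = 1.0 / (1.0 + k ** 2)                    # toy spectrum (assumed)
field = periodic_gaussian_field(n, power)
```

The resulting covariance is the inverse FFT of the chosen power spectrum; the O(n² log n) cost quoted above comes from running such 1-D transforms once per grid row.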
Saxton, Michael J
2007-01-01
Modeling obstructed diffusion is essential to the understanding of diffusion-mediated processes in the crowded cellular environment. Simple Monte Carlo techniques for modeling obstructed random walks are explained and related to Brownian dynamics and more complicated Monte Carlo methods. Random number generation is reviewed in the context of random walk simulations. Programming techniques and event-driven algorithms are discussed as ways to speed simulations.
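A minimal version of the obstructed-random-walk Monte Carlo described here: walkers on a square lattice reject moves into randomly placed obstacle sites, and obstruction shows up as a reduced mean-squared displacement. Lattice size, obstacle fraction, and the blocked-move (rejection) rule are illustrative choices, not prescriptions from the text:

```python
import random

def obstructed_walk_msd(n_steps, obstacle_frac, size=50, n_walkers=200, seed=1):
    """Mean-squared displacement of lattice random walkers among randomly
    placed obstacles (periodic obstacle field; blocked moves are rejected)."""
    rng = random.Random(seed)
    n_obs = int(obstacle_frac * size * size)
    obstacles = set()
    while len(obstacles) < n_obs:
        obstacles.add((rng.randrange(size), rng.randrange(size)))
    total = 0.0
    for _ in range(n_walkers):
        x = y = 0
        while (x % size, y % size) in obstacles:     # start on a free site
            x, y = rng.randrange(size), rng.randrange(size)
        x0, y0 = x, y
        for _ in range(n_steps):
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            if ((x + dx) % size, (y + dy) % size) not in obstacles:
                x, y = x + dx, y + dy                # accept unobstructed moves
        total += (x - x0) ** 2 + (y - y0) ** 2
    return total / n_walkers

msd_free = obstructed_walk_msd(100, 0.0)   # unobstructed: MSD ~ n_steps
msd_obst = obstructed_walk_msd(100, 0.3)   # 30% obstacles: slowed diffusion
```

In free space the MSD grows as the number of steps; with obstacles it is anomalously suppressed, which is the signature of crowding that such simulations probe.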
Knowledge Guided Evolutionary Algorithms in Financial Investing
ERIC Educational Resources Information Center
Wimmer, Hayden
2013-01-01
A large body of literature exists on evolutionary computing, genetic algorithms, decision trees, codified knowledge, and knowledge management systems; however, the intersection of these computing topics has not been widely researched. Moving through the set of all possible solutions--or traversing the search space--at random exhibits no control…
Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed
NASA Technical Reports Server (NTRS)
Rakoczy, John; Steincamp, James; Taylor, Jaime
2003-01-01
A reduced surrogate, one point crossover genetic algorithm with random rank-based selection was used successfully to estimate the multiple phases of a segmented optical system modeled on the seven-mirror Systematic Image-Based Optical Alignment testbed located at NASA's Marshall Space Flight Center.
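Two of the ingredients named above, one-point crossover and rank-based selection, can be sketched on a toy maximization problem. This is a generic GA on OneMax, not the phase-retrieval objective or the reduced-surrogate variant used on the testbed:

```python
import random

def onemax_ga(n_bits=30, pop_size=40, generations=60, seed=2):
    """Toy genetic algorithm: one-point crossover, linear rank-based
    selection, bit-flip mutation, maximizing the number of 1-bits."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    fitness = lambda ind: sum(ind)

    def rank_select(ranked):
        # Rank-based selection: the i-th worst individual gets weight i+1,
        # so selection pressure depends on rank, not raw fitness values.
        weights = list(range(1, len(ranked) + 1))
        return rng.choices(ranked, weights=weights, k=1)[0]

    for _ in range(generations):
        ranked = sorted(pop, key=fitness)           # worst first
        nxt = [ranked[-1][:]]                       # elitism: keep the best
        while len(nxt) < pop_size:
            a, b = rank_select(ranked), rank_select(ranked)
            cut = rng.randrange(1, n_bits)          # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_bits):                 # bit-flip mutation
                if rng.random() < 1.0 / n_bits:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(sum(ind) for ind in pop)

best = onemax_ga()   # should approach the optimum of 30
```

In the phase-retrieval setting, each chromosome would instead encode candidate mirror-segment phases and fitness would score the resulting image sharpness.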
Automated brain tumor segmentation using spatial accuracy-weighted hidden Markov Random Field.
Nie, Jingxin; Xue, Zhong; Liu, Tianming; Young, Geoffrey S; Setayesh, Kian; Guo, Lei; Wong, Stephen T C
2009-09-01
A variety of algorithms have been proposed for brain tumor segmentation from multi-channel sequences; however, most of them require isotropic or pseudo-isotropic resolution of the MR images. Although co-registration and interpolation of low-resolution sequences, such as T2-weighted images, onto the space of the high-resolution image, such as the T1-weighted image, can be performed prior to segmentation, the results are usually limited by partial volume effects due to interpolation of low-resolution images. To improve the quality of tumor segmentation in clinical applications, where low-resolution sequences are commonly used together with high-resolution images, we propose an algorithm based on a Spatial accuracy-weighted Hidden Markov random field and Expectation maximization (SHE) approach for both automated tumor and enhanced-tumor segmentation. SHE incorporates the spatial interpolation accuracy of low-resolution images into the optimization procedure of the Hidden Markov Random Field (HMRF) to segment tumors using multi-channel MR images with different resolutions, e.g., high-resolution T1-weighted and low-resolution T2-weighted images. In experiments, we evaluated this algorithm using a set of simulated multi-channel brain MR images with known ground-truth tissue segmentation and also applied it to a dataset of MR images obtained during clinical trials of brain tumor chemotherapy. The results show that more accurate tumor segmentation is obtained compared with conventional multi-channel segmentation algorithms.
Warren, Kristen M; Harvey, Joshua R; Chon, Ki H; Mendelson, Yitzhak
2016-03-07
Photoplethysmographic (PPG) waveforms are used to acquire pulse rate (PR) measurements from pulsatile arterial blood volume. PPG waveforms are highly susceptible to motion artifacts (MA), limiting the implementation of PR measurements in mobile physiological monitoring devices. Previous studies have shown that multichannel photoplethysmograms can successfully acquire diverse signal information during simple, repetitive motion, leading to differences in motion tolerance across channels. In this paper, we investigate the performance of a custom-built multichannel forehead-mounted photoplethysmographic sensor under a variety of intense motion artifacts. We introduce an advanced multichannel template-matching algorithm that chooses the channel with the least motion artifact to calculate PR for each time instant. We show that for a wide variety of random motion, channels respond differently to motion artifacts, and the multichannel estimate outperforms single-channel estimates in terms of motion tolerance, signal quality, and PR errors. We have acquired 31 data sets consisting of PPG waveforms corrupted by random motion and show that the accuracy of PR measurements achieved was increased by up to 2.7 bpm when the multichannel-switching algorithm was compared to individual channels. The percentage of PR measurements with error ≤ 5 bpm during motion increased by 18.9% when the multichannel switching algorithm was compared to the mean PR from all channels. Moreover, our algorithm enables automatic selection of the best signal fidelity channel at each time point among the multichannel PPG data.
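The channel-switching step reduces to scoring each channel's current window against a pulse template and keeping the best match. A toy sketch using normalized cross-correlation as the (assumed) match score, with synthetic signals standing in for PPG channels:

```python
import math

def ncc(a, b):
    """Normalized cross-correlation between two equal-length windows."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def best_channel(channels, template):
    """Pick the channel whose window best matches the pulse template: the
    core of switching to the least motion-corrupted channel."""
    scores = [ncc(ch, template) for ch in channels]
    return max(range(len(channels)), key=lambda i: scores[i])

template = [math.sin(2 * math.pi * t / 20) for t in range(40)]  # clean pulse
clean = template[:]                                             # channel 1
noisy = [s + (0.8 if t % 7 == 0 else -0.3)                      # channel 0,
         for t, s in enumerate(template)]                       # motion-corrupted
idx = best_channel([noisy, clean], template)                    # picks channel 1
```

In the paper the template itself is adapted over time; here it is fixed for simplicity.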
Random Matrix Approach to Quantum Adiabatic Evolution Algorithms
NASA Technical Reports Server (NTRS)
Boulatov, Alexei; Smelyanskiy, Vadier N.
2004-01-01
We analyze the power of quantum adiabatic evolution algorithms (QAEA) for solving random NP-hard optimization problems within a theoretical framework based on random matrix theory (RMT). We present two types of driven RMT models. In the first model, the driving Hamiltonian is represented by Brownian motion in matrix space. We use the Brownian motion model to obtain a description of multiple avoided-crossing phenomena. We show that the failure mechanism of the QAEA is due to the interaction of the ground state with the "cloud" formed by all the excited states, confirming that in the driven RMT models the Landau-Zener mechanism of dissipation is not important. We show that the QAEA has a finite probability of success in a certain range of parameters, implying polynomial complexity of the algorithm. The second model corresponds to the standard QAEA with the problem Hamiltonian taken from the Gaussian Unitary Ensemble (GUE) of RMT. We show that the level dynamics in this model can be mapped onto the dynamics in the Brownian motion model. However, this driven RMT model always leads to exponential complexity of the algorithm due to the presence of long-range intertemporal correlations of the eigenvalues. Our results indicate that the weakness of effective transitions is the leading effect that can make the Markovian-type QAEA successful.
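The GUE ensemble from which the second model draws its problem Hamiltonian is easy to sample numerically. A minimal sketch (a generic GUE sampler, not the paper's driven dynamics):

```python
import numpy as np

def gue(n, seed=5):
    """Draw an n x n Hermitian matrix from the Gaussian Unitary Ensemble:
    H = (A + A^dagger)/2 with A having iid complex Gaussian entries."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (a + a.conj().T) / 2

H = gue(50)
evals = np.linalg.eigvalsh(H)   # real spectrum of the Hermitian matrix
```

Studying the gaps in `evals` along an interpolation path is the numerical counterpart of the avoided-crossing analysis described above.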
Super-resolution processing for multi-functional LPI waveforms
NASA Astrophysics Data System (ADS)
Li, Zhengzheng; Zhang, Yan; Wang, Shang; Cai, Jingxiao
2014-05-01
Super-resolution (SR) is a radar processing technique closely related to pulse compression (the correlation receiver). Many super-resolution algorithms have been developed for improved range resolution and reduced sidelobe contamination. Traditionally, the waveforms used for SR have been either phase-coded (such as the LKP3 code or Barker codes) or frequency modulated (chirp, or nonlinear frequency modulation). There is, however, an important class of waveforms which are either random in nature (such as random noise waveforms) or randomly modulated for multi-function operation (such as the ADS-B radar signals in [1]). These waveforms have the advantage of low probability of intercept (LPI). If the existing SR techniques can be applied to these waveforms, there will be much more flexibility for using them in actual sensing missions. SR also usually has the great advantage that the final output (as an estimate of the ground truth) is largely independent of the waveform. Such benefits are attractive to many important primary radar applications. In this paper, a general introduction to SR algorithms is provided first, and some implementation considerations are discussed. The selected algorithms are applied to typical LPI waveforms, and the results are discussed. It is observed that SR algorithms can be reliably used for LPI waveforms; on the other hand, practical considerations should be kept in mind in order to obtain optimal estimation results.
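The correlation receiver that SR builds on can be demonstrated with a random binary-phase code, the simplest of the LPI-style random waveforms mentioned above (code length, noise level, and delay are illustrative):

```python
import random

random.seed(4)
code = [random.choice([-1, 1]) for _ in range(64)]   # random binary-phase code

# Synthetic echo: the code buried in noise at a delay of 100 samples.
delay, N = 100, 300
echo = [random.gauss(0.0, 0.3) for _ in range(N)]
for i, c in enumerate(code):
    echo[delay + i] += c

# Correlation receiver (pulse compression): slide the code along the echo;
# the peak of the correlation marks the target delay.
corr = [sum(c * echo[k + i] for i, c in enumerate(code))
        for k in range(N - len(code) + 1)]
est_delay = max(range(len(corr)), key=lambda k: corr[k])
```

The random code's thumbtack-like autocorrelation concentrates energy at the true delay; SR algorithms then work on sharpening this correlation output and suppressing its sidelobes.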